
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

SparkTR

Member
PC Exclusive Crysis comes out, ultra-settings required a $2500 rig, everybody praised CryTek for making something that beautiful and a benchmark for years to come.

Ah, no they didn't. People were pretty bent out of shape due to the game being unoptimized. How true that was at the time I don't know, but there was a lot of negativity.
 

Koobion

Member
First time buying a video card?

No? What is that supposed to mean? Don't insult me because I called you out on your ignorant comment. You suggested that people complaining about the VRAM requirement have old cards, when the reality is that some people who have brand new cards are also amongst those complaining because they also don't have the necessary memory.
 

Dolor

Member
Finally something to make me want to upgrade my GPU. Nice to see actual software pushing the limits rather than just the hardware arms race of the last few years. I hope it looks as good as its requirements.
 
Did this thread really turn into a 15-page thread of people bitching about their old video cards becoming obsolete?

Come on PC guys, act like you've been there before.

If only it were that simple.

 

UnrealEck

Member
Looks great to me.

It does look quite nice. But the texture quality isn't better than many games I've seen running on 2GB of VRAM.
I'm guessing the quality in that screenshot would contribute to a total requirement of around 2GB. I'm sure those aren't the highest textures you'll get, and the highest ones are probably going to be best used at 4K.

This is why you don't upgrade at the early stages of a console gen.

Why's that?


Because not every graphics effect contributes as much to VRAM.

Never said it does. Or am I missing your point?
 

TheUsual

Gold Member
Things like this excite me because it's something I can't run right now, but will be able to in the future. It's like playing with a new toy.

Exactly. No problem adding an additional option like this.

Doesn't the Witcher 2 still have some settings that people can't fully max out? (I remember reading something about Uber settings or something along those lines.)

We're not losing anything if not all of us can play at the absolute maximum (especially when this additional setting is an optional download).
 
This is why you don't upgrade at the early stages of a console gen.

How so? We aren't suddenly going to make an absolutely huge jump in GPUs after the consoles are released.

Exactly. No problem adding an additional option like this.

Doesn't the Witcher 2 still have some settings that people can't fully max out? (I remember reading something about Uber settings or something along those lines.)

We're not losing anything if not all of us can play at the absolute maximum (especially when this additional setting is an optional download).

Ubersampling, it was called. And yeah, that was the case, but it was announced in advance that you weren't supposed to use it yet. And later they patched the option to show up in red. It can run on plenty of GPUs now.

It does look quite nice. But the texture quality isn't better than many games I've seen running on 2GB of VRAM.
I'm guessing the quality in that screenshot would contribute to a total requirement of around 2GB. I'm sure those aren't the highest textures you'll get, and the highest ones are probably going to be best used at 4K.



Why's that?




Never said it does. Or am I missing your point?

I think VRAM requirements usually go up quite a bit in open world games, but I am not sure.

But it is a bit of a bad screenshot for judging texture quality. You presented the screenshot as: "The game doesn't look so good graphically, why does it use so much VRAM?" But it isn't about the total picture.
Also, since this is an extra optional package, I doubt that screenshot includes those textures.
 

MaLDo

Member
Sad to see people who post in the PC screenshot thread crying about better texture options. Super sad that people would rather not have this option just because their PC can't handle it.

I'm judging you guys right now, smh.

The problem is not that the option exists. The problem is that the option likely implies they haven't spent time looking for the right balance between the resolution of every texture for every "texture preset" in the game. For example, I can imagine a third-person camera game where the main character's texture is 4Kx4K on consoles while the rest of the textures are 2Kx2K or 1Kx1K. On the other hand, the PC version of that game uses 2Kx2K textures for every asset in the high quality preset, filling 1.4 GB of VRAM, and 4Kx4K textures for every asset in the ultra quality preset, filling 3.2 GB of VRAM. So you will need a 4 GB GPU to match the console's main character texture quality, when a simple mod that put that single texture in the high quality preset could fill 1.405 GB for a far better experience.
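The preset arithmetic here can be sketched in a few lines; the asset count, BC1 compression rate, and mip-chain overhead below are illustrative assumptions, not figures from Mordor or any real game.

```python
# Rough sketch of the preset-balance math: per-preset VRAM cost when
# every asset shares one resolution versus mixing resolutions per asset.
# Asset count, BC1 compression (0.5 bytes/pixel) and the ~1.33x mip
# overhead are illustrative assumptions, not numbers from any real game.

def texture_mb(side, bytes_per_pixel=0.5, mip_overhead=4 / 3):
    """Approximate VRAM for one square block-compressed texture with mips."""
    return side * side * bytes_per_pixel * mip_overhead / 2**20

ASSETS = 500  # hypothetical number of textures resident in a scene

all_2k = ASSETS * texture_mb(2048)                          # "high" preset
all_4k = ASSETS * texture_mb(4096)                          # "ultra" preset
mixed = texture_mb(4096) + (ASSETS - 1) * texture_mb(2048)  # console-style mix

print(f"all 2K : {all_2k / 1024:.2f} GB")  # every asset at 2K
print(f"all 4K : {all_4k / 1024:.2f} GB")  # 4x the memory of all-2K
print(f"mixed  : {mixed / 1024:.2f} GB")   # barely more than all-2K
```

With these made-up numbers the mixed preset costs only one texture's worth (~8 MB) more than all-2K, while the uniform ultra preset quadruples the bill, which is exactly the imbalance being described.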

Another problem in games with development centered on consoles (probably not the case for Mordor) is that they use the console framerate for calculations. Watch Dogs (and Dead Rising 3) have streaming routines designed to work with 33 ms frametimes. The maximum number of shaders and textures streamed per frame in Watch Dogs is fine for 33 ms but in no way okay for a 16 ms frametime.
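The frametime half of this point can be shown with a toy calculation; the 4 ms per-frame streaming budget is a made-up number, not Watch Dogs' actual one.

```python
# Toy illustration of the frametime point: a fixed per-frame streaming
# budget tuned for a 30 fps console eats a much larger slice of a
# 60 fps frame. The 4 ms budget is a made-up number for illustration.

STREAM_BUDGET_MS = 4.0  # hypothetical upload/decode work done each frame

for fps, frame_ms in ((30, 100 / 3), (60, 50 / 3)):
    share = STREAM_BUDGET_MS / frame_ms
    print(f"{fps} fps ({frame_ms:.1f} ms frames): "
          f"streaming uses {share:.0%} of the frame")
```

The same fixed workload that hides comfortably inside a 33 ms frame doubles its share of a 16 ms frame, which is why console-tuned streamers can hitch at 60 fps.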
 

Corpsepyre

Banned
Yeah, I don't know about this. Will wait for DSOgaming's performance analysis before I get this. Shit seems outrageous.
 

Arcticfox

Member
Haven't there been reports that games use noticeably less VRAM on the 900 series Nvidia cards due to the new texture compression method they support? I would not be surprised if that 6GB recommendation is not actually necessary. I'll wait to see some tests.
 
Haven't there been reports that games use noticeably less VRAM on the 900 series Nvidia cards due to the new texture compression method they support? I would not be surprised if that 6GB recommendation is not actually necessary. I'll wait to see some tests.

No.

http://international.download.nvidi...nal/pdfs/GeForce_GTX_980_Whitepaper_FINAL.PDF Tiled Resources and related techs from page 23 on. With that you can take a 1.6GB VRAM scene and render it using just 156MB VRAM. That's going to require a game being programmed in DX11.3 or DX12, but when they arrive we're going to see a massive leap in visual detail, with far fewer repeated textures.

That hasn't happened yet as far as I'm aware.
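For what it's worth, the whitepaper's 1.6 GB to 156 MB example falls out of simple tile accounting; in this sketch the residency fraction is chosen to reproduce that figure and is otherwise an assumption.

```python
# Why tiled resources shrink residency: textures are committed in 64 KB
# tiles, and only the tiles actually sampled at the mip levels in use
# need physical VRAM backing. The residency fraction below is picked to
# reproduce the whitepaper's example and is otherwise an assumption.

TILE_KB = 64                # D3D tiled-resource tile size
full_scene_mb = 1600        # full mip chains for every texture in view
resident_fraction = 0.0975  # assumed share of tiles actually sampled

tiles_total = full_scene_mb * 1024 // TILE_KB
tiles_resident = round(tiles_total * resident_fraction)
resident_mb = tiles_resident * TILE_KB / 1024
print(f"{resident_mb:.0f} MB resident of {full_scene_mb} MB committed")
```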
 
Good on them for pushing the limits. Are people forgetting that Monolith made FEAR which was also using advanced tech at the time, and as a result still looked fantastic on the PC years later?

I have to imagine that the people having meltdowns because they won't be able to max this game are the summer children of PC gaming, who got in last gen when you could max everything on a $500 budget build and are now having to turn the sliders down for the first time.

This is how progress is made. Learn to live with 'only' high settings; these ultra-high-end options are just that: options for later down the line, when you have better hardware.
 

Papacheeks

Banned
I wish the people complaining about their PC setups understood that this is a ported game, with an engine that's obviously unoptimized for PC.

Case in point: Battlefield works fine for the most part on most PC setups, because the Frostbite 2 engine they are using is optimized for PC.

Monolith has become more of a console developer as of late, so it's understandable to see an unoptimized game from them on PC.

Especially when you compare this to other Next gen console games on PC.

The issue, I think, will be if this becomes a trend. But I think it's just going to take time for developers to work the kinks out of new software so they can have some sort of parity between console and PC, in terms of engine optimization.

It happened last generation; it took them a while to get their engines somewhat optimized for PC when they ported. We still get crap ports, but I feel most games now run fine on moderate hardware.

Also, who's to say they won't patch this later? Sleeping Dogs, I believe, had some performance issues on PC when it first came out; they patched it later down the line and it ran smooth as silk.
 
I kinda thought this would happen. Gonna stick with my 780 Ti Classy till the next set of cards come out next year around the time Oculus Rift CV1 comes out.
I would've done the same thing if I had your card. I upgraded to a GTX 970 from a pair of 6870s, and I'm very happy with my performance. I'm pretty confident that I can play this, and plenty of other games, at higher quality than the consoles. So I'm not worried by this news at all; I'm actually very happy that they did this.
 

UnrealEck

Member
But it is a bit of a bad screenshot for judging texture quality. You presented the screenshot as: "The game doesn't look so good graphically, why does it use so much VRAM?" But it isn't about the total picture.
Also, since this is an extra optional package, I doubt that screenshot includes those textures.

It's not about the total picture? What is it about, then? I presented a screenshot displaying graphics, and I presented my view on whether I think those graphics justify more than 2GB of VRAM.

I'm well aware that textures aren't all that's in VRAM. I said that a few pages back too.
I never said those textures are the ultra textures. I am saying those graphics in that screenshot don't appear to justify more than 2GB of VRAM. Meaning mostly that those are the sort of graphics I'd expect a 2GB card to run. That's the sort of texture quality I'd expect to see possible on a 2GB card.

Maybe 3GB as a requirement, but I'm still thinking that's a stretch.

Ultra textures are another story. That's not what my post was saying.
 

DieH@rd

Banned
Maybe Monolith is aiming to fill out the game with just a little better textures and then slap 8x MSAA on it. That would eat a ton of VRAM.
 
It's not about the total picture? What is it about, then? I presented a screenshot displaying graphics, and I presented my view on whether I think those graphics justify more than 2GB of VRAM.

I'm well aware that textures aren't all that's in VRAM. I said that a few pages back too.
I never said those textures are the ultra textures. I am saying those graphics in that screenshot don't appear to justify more than 2GB of VRAM. Meaning mostly that those are the sort of graphics I'd expect a 2GB card to run. That's the sort of texture quality I'd expect to see possible on a 2GB card.

Maybe 3GB as a requirement, but I'm still thinking that's a stretch.

Ultra textures are another story. That's not what my post was saying.

No, you can't look at a picture and think: "These graphics look like X VRAM."

There are many aspects of graphics that have a significantly lesser impact on VRAM. Texture quality, a very significant one, is not particularly well showcased in that screenshot.

I feel like selling my 780 Ti and getting one of those ATI 6GB video cards :(

Why would you want less performance in all other areas for possibly slightly better textures?

I wish the people complaining about their PC setups understood that this is a ported game, with an engine that's obviously unoptimized for PC.

Case in point: Battlefield works fine for the most part on most PC setups, because the Frostbite 2 engine they are using is optimized for PC.

Monolith has become more of a console developer as of late, so it's understandable to see an unoptimized game from them on PC.

Especially when you compare this to other Next gen console games on PC.

The issue, I think, will be if this becomes a trend. But I think it's just going to take time for developers to work the kinks out of new software so they can have some sort of parity between console and PC, in terms of engine optimization.

It happened last generation; it took them a while to get their engines somewhat optimized for PC when they ported. We still get crap ports, but I feel most games now run fine on moderate hardware.

Also, who's to say they won't patch this later? Sleeping Dogs, I believe, had some performance issues on PC when it first came out; they patched it later down the line and it ran smooth as silk.

How can you say whether the game is or isn't optimized well right now? How much hardware the highest settings require is a very poor indicator of whether or not a game is well-optimized.
 

TheUsual

Gold Member
Ubersampling, it was called. And yeah, that was the case, but it was announced in advance that you weren't supposed to use it yet. And later they patched the option to show up in red. It can run on plenty of GPUs now.

Ah, okay. Nice to hear there are plenty of GPUs that can run it now. So it's basically the same situation: they're telling us "Hey, if you want to use this setting, you're going to need this much VRAM". They aren't hiding anything from us.
 

UnrealEck

Member
I feel like selling my 780 Ti and getting one of those ATI 6GB video cards :(

I think that if you're contemplating that move based on this game (and what you think it might mean for future games), you should hold off. Wait to see what the game looks like without the texture pack, and ask yourself whether you really need textures of the pack's quality.
They're possibly factoring in running the game at something like 4K too.
 
The problem is not that the option exists. The problem is that the option likely implies they haven't spent time looking for the right balance between the resolution of every texture for every "texture preset" in the game. For example, I can imagine a third-person camera game where the main character's texture is 4Kx4K on consoles while the rest of the textures are 2Kx2K or 1Kx1K. On the other hand, the PC version of that game uses 2Kx2K textures for every asset in the high quality preset, filling 1.4 GB of VRAM, and 4Kx4K textures for every asset in the ultra quality preset, filling 3.2 GB of VRAM. So you will need a 4 GB GPU to match the console's main character texture quality, when a simple mod that put that single texture in the high quality preset could fill 1.405 GB for a far better experience.

Another problem in games with development centered on consoles (probably not the case for Mordor) is that they use the console framerate for calculations. Watch Dogs (and Dead Rising 3) have streaming routines designed to work with 33 ms frametimes. The maximum number of shaders and textures streamed per frame in Watch Dogs is fine for 33 ms but in no way okay for a 16 ms frametime.

Good point. Texture settings are not clear enough, or designed around really good customization for those with lower VRAM amounts. It's just a blanket, carpet-bomb approach to texture handling.

The second point is also spot on about game performance in a number of ports.
 
Or maybe none of us are saying it's better to buy the console version; maybe we're questioning whether we should invest money in expensive PC parts if this is the sort of effort we'll be getting from PC versions. Maybe we're saying that if companies aren't going to optimize the PC versions properly, we'd rather just buy the console version and accept the hit to graphics for better bang for the buck. There's no reason this game should have such insanely high requirements to max out.

When high-quality cards that have JUST been released can't even max out this game, despite the fact that those graphics cards alone have nearly as much VRAM as the consoles' total usable RAM, not to mention much more system RAM and much faster processors, people are going to be curious as to what is going on. What is this visually average game doing that requires not only 8GB of system RAM but 6GB of VRAM to max out? 14GB of total RAM to max this game out? Sorry, that's ridiculous.
If you just bought a brand new graphics card that alone costs more than an Xbox One or PS4, then I don't think you'd be wrong to expect more than what PC gamers have been given with some of these games, such as the Watch Dogs port and possibly this one. There is nothing this game is doing graphically that would justify a total requirement of 14GB of RAM (8GB system RAM + 6GB VRAM) to max out. If this were a Crysis situation, which it is not, it would be understandable and more than acceptable, since Crysis looked far beyond what consoles were providing at the time; it was optimized for PC and actually put that additional power to use to produce something mind-blowing. This, on the other hand, looks cross-gennish. What we seem to be getting with a lot of these games is highly optimized console versions alongside quick-and-dirty PC versions that let you get extra features like higher-res textures and framerates by brute-forcing your way through. Games like Battlefield 4 and Crysis 3 look better and require nowhere near the resources this is asking for.
Crysis was optimized for PC and blew everything out of the water for its time. Mordor looks cross-gennish and doesn't even look better than Battlefield 4 or Crysis 3, which take fewer resources to max out. The fact that a $700 graphics card (780 Ti) can't max it out has some people wondering what exactly is going on.
kevm is telling it like it is, agreed on everything.





On another note, is AMD coming out with any top-flight 20nm 6-8GB cards in 2015? I only buy AMD, including my CPU... I know, I know, but I'm an AMD/ATI fanboy. They've never done me wrong all these years.
 
I find this thread hilarious because people honestly believe the PS4 version is going to be using the 6GB ultra textures AND they are crapping on the Nvidia 900 series.

Sigh....
 

bj00rn_

Banned
Looking at how the game looks in general, I don't understand why it is so heavy on VRAM, even at High. Anyway, increasing memory usage isn't the first place I would start to improve this game's graphics, because I think it looks rather average when it comes to detail, geometry, and effects.
 

UnrealEck

Member
No, you can't look at a picture and think: "These graphics look like X VRAM."

There are many aspects of graphics that have a significantly lesser impact on VRAM. Texture quality, a very significant one, is not particularly well showcased in that screenshot.

There are plenty of other screenshots around. I can absolutely look at a game and question why it's using so much VRAM. I can absolutely say 'wait a minute, why are they asking for so much memory?' I can absolutely say 'I don't think this looks like it justifies such high requirements' or 'I don't see why the graphics shown in these screenshots will use more than X amount of VRAM'.

I'm not saying I know it shouldn't. I'm saying it doesn't seem like it should and that if it does, I'm going to be curious as to why.

As for the comment about various other aspects using VRAM, I thought I had covered that already.
 
kevm is telling it like it is, agreed on everything.





On another note, is AMD coming out with any top-flight 20nm 6-8GB cards in 2015? I only buy AMD, including my CPU... I know, I know, but I'm an AMD/ATI fanboy. They've never done me wrong all these years.

I don't agree with him at all. You'll likely still have a great experience with less VRAM, and this is only one detail you're paying attention to.

Yeah, they will come out with 20nm 6-8GB cards in 2015, although I doubt they will be preferable over their Nvidia counterparts.

There are? I did not know that.

Yes there are AMD cards with 6 GB.

There are plenty of other screenshots around. I can absolutely look at a game and question why it's using so much VRAM. I can absolutely say 'wait a minute, why are they asking for so much memory?' I can absolutely say 'I don't think this looks like it justifies such high requirements' or 'I don't see why the graphics shown in these screenshots will use more than X amount of VRAM'.

I'm not saying I know it shouldn't. I'm saying it doesn't seem like it should and that if it does, I'm going to be curious as to why.

As for the comment about various other aspects using VRAM, I thought I had covered that already.

I'm not trying to say that various other aspects use VRAM too; I'm just saying that texture quality is an important part of it, and that part is not showcased well in that screenshot. The textures you need more than 4GB of VRAM for are likely not even showcased in the screenshots out there now. With 2GB of VRAM the game will likely still look and run very similarly.
 
I don't agree with him at all. You'll likely still have a great experience with less VRAM, and this is only one detail you're paying attention to.

Yeah, they will come out with 20nm 6-8GB cards in 2015, although I doubt they will be preferable over their Nvidia counterparts.



Yes there are AMD cards with 6 GB.
Performance-wise they might be inferior, but in price/performance ratio AMD will probably be better again, right? That ratio is a lot more important to me than pure performance. I'll probably upgrade around Q3 2015.
 
No.



That hasn't happened yet as far as I'm aware.

I think you are talking about two different technologies. The memory compression is discussed on page 10. However, it only discusses the impact on memory bandwidth requirements, not actual memory usage, and I don't know nearly enough to say whether it impacts both.
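If that reading is right, the distinction looks like this; the numbers are illustrative, with the ~25% figure being the ballpark bandwidth saving NVIDIA cites for Maxwell, not a measured result.

```python
# Delta color compression reduces bytes moved across the memory bus,
# not bytes allocated: buffers stay full-size so any tile can fall back
# to uncompressed data. All numbers here are illustrative assumptions.

alloc_mb = 2048          # VRAM committed for buffers and textures
traffic_mb = 300         # raw bytes a frame would otherwise move
bandwidth_saving = 0.25  # ballpark of NVIDIA's ~25% Maxwell claim

effective_traffic = traffic_mb * (1 - bandwidth_saving)
print(f"VRAM allocated : {alloc_mb} MB (unchanged by compression)")
print(f"Bus traffic    : {effective_traffic:.0f} MB/frame instead of {traffic_mb}")
```

So it would lower the bandwidth a given scene needs without shrinking the 6GB VRAM footprint itself, which is why it may not help with this recommendation.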
 