
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

I'm going to subscribe to this thread, just so in 5 years' time I can look back at it and laugh at how we thought 6GB was a lot of VRAM. It's like looking at a PC magazine from 2000 that's all about "superfast 1GHz CPUs" :p

Totally. PC is not a static platform... it's what makes it PC. After 30 years of it being on the market, you'd think people would get it.
 

Ray Wonder

Founder of the Wounded Tagless Children
It has a 256-bit memory interface... so no. 8GB models are more realistic, but they won't be cheap.

That's Ok, my addiction will force me to buy one.

There is an 8GB version coming.
Still, I'm going to guess it won't even be needed.
You know what? I'll hazard a prediction and bet that 4GB will max the game despite these "recommended" specs.

Yeah, I guess I'll see. Gonna put my 3GB 780 to the test until I can grab the 980.
 

KKRT00

Member
Five years from now 16 GB of VRAM may even be the standard, but that doesn't mean that at ANY point in the coming years you're going to *need* them to match consoles.

Here's what everyone throwing around the optimization bullshit is missing: hardware that eclipses consoles today will STILL eclipse consoles in five years, no matter how much magic optimization you throw at them. What it won't necessarily do is "max out" PC games in five years, but that's because it's a very vague, arbitrary requirement to fulfill.

But, but fairy dust!
 

SerTapTap

Member
This thread
facepalm.gif


Devs offering these sort of settings is always a good thing in my eyes.

I'm a bit antsy about purchasing a card right now, but yeah, I am glad to see limits pushed again. PC performance being about "it only takes THREE of MY graphics cards to do 4K @ 60Hz with this" as the only differentiating factor vs 3-year-old cards is pretty boring. RAM usage and multicore had to go up some time. I refuse to run SLI for... well, anything, with all the bugs I hear about, so I'm hoping some 6-8 GB cards will show up for well under $1k sometime.
 

kevm3

Member
The agenda that you may be better off buying the console version. Which is horseshit of course, but some people may actually fall for it.

Or maybe none of us are saying it's better to buy the console version, but are questioning whether we should invest money into expensive PC parts if this is the sort of effort that we will be getting from the PC version of the game. Maybe we're saying that if companies aren't going to optimize the PC versions properly, we'd rather just buy the console version and accept the hit to graphics due to a better bang for the buck. There's no reason that this game should have such insanely high requirements to max out.

When high-quality cards that have JUST been released can't even max out this game, despite the fact that those graphics cards alone have nearly as much VRAM as the total usable RAM on consoles... not to mention much more system RAM and much faster processors, people are going to be curious as to what is going on. What is this visually average game doing that requires not only 8GB of system RAM, but 6GB of VRAM to max out? 14 GB of total RAM to max this game out? Sorry, that's ridiculous.
 
Exactly, but my point isn't about that. I just think the developer is going the wrong way here; if they're going to include a 6 GB VRAM option, they'd better show it off properly, both the textures themselves and the performance difference between a 6 GB card and a 2-3 GB card. Since this setting is clearly targeting the enthusiast market segment, and enthusiasts typically know their stuff, the developer should respect that.

You're talking like they're crowing about these ultra textures from the rooftops, when in reality, the first (and only) thing we know about them has come from this screenshot of the options screen.

This is a bonus for high-end and future PC gamers, and there are very few developers that would put this kind of option in their game. And honestly, with the way people have reacted to this, I doubt any developers are likely to do it in the future - best just to relabel 'high' detail as 'ultra' and make everyone happy.
 

Sentenza

Member
But I thought VRAM and PC RAM don't do quite the same tasks. That's why not many games use PC RAM as VRAM.
That... was my point?
On PC you have separate RAM pools for separate tasks, on consoles you have a unified one.
That's why comparing the VRAM on your GPU to, say, the total amount of GDDR5 in a PS4 is idiotic; it always was, it always will be.
 

Braag

Member
So it's like Watch Dogs, where most people had a system strong enough to run the game at 60fps easily but couldn't set the textures to Ultra because it needed 3GB of VRAM.
However this game needs double that...
ATI and Nvidia need to start making 8GB cards the norm, because that's where we are heading, it seems.
 

Henrar

Member
Why is it that some people think the only quality preset for PC gaming is Ultra?

If you don't have enough VRAM you can play on medium or even high. That's what PC gaming is about. When you buy a new GPU in the future you'll be able to return to the game and max it.
 
Or maybe none of us are saying it's better to buy the console version, but are questioning whether we should invest money into expensive PC parts if this is the sort of effort that we will be getting from the PC version of the game. Maybe we're saying that if companies aren't going to optimize the PC versions properly, we'd rather just buy the console version and accept the hit to graphics due to a better bang for the buck. There's no reason that this game should have such insanely high requirements to max out.

When high-quality cards that have JUST been released can't even max out this game, despite the fact that those graphics cards alone have nearly as much VRAM as the total usable RAM on consoles... not to mention much more system RAM and much faster processors, people are going to be curious as to what is going on. What is this visually average game doing that requires not only 8GB of system RAM, but 6GB of VRAM to max out? 14 GB of total RAM to max this game out? Sorry, that's ridiculous.

Let's go under the assumption that this is a setting for future-proofing the game. You have an issue with a developer pushing the limits of hardware even though they know it probably isn't worth their time because only 0.01% of PC owners can run it? Sometimes you can't max things out, even with top-end hardware. I just bought a GTX 980 but you don't see me getting all pissy about it. At least fucking wait until you can see the differences when the game is out.

Fuck, man. A former PC developer decides to make a good PC game and people flip their shit.
 

Henrar

Member
That... was my point?
On PC you have separate RAM pools for separate tasks, on consoles you have a unified one.
That's why comparing the VRAM on your GPU to, say, the total amount of GDDR5 in a PS4 is idiotic; it always was, it always will be.

On the PC, whenever something is loaded from the HDD it is first loaded into the normal RAM pool, then copied to VRAM if it's data needed for graphics processing. In other words, whatever you have in VRAM, you also have a copy of in normal RAM. In a unified memory space, such a thing is not needed.
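For what it's worth, here's a minimal sketch of that path, assuming a plain OpenGL renderer and the stb_image loader (not what Mordor actually uses, just an illustration; the function name is made up). The decoded pixels sit in a CPU-side buffer in system RAM before the driver copies them into VRAM.

```cpp
// Hedged sketch of the disk -> system RAM -> VRAM texture path.
#include <GL/gl.h>
#include "stb_image.h"

GLuint LoadTextureFromDisk(const char* path)
{
    int w = 0, h = 0, channels = 0;
    // 1) Disk -> system RAM: the decoded pixels live in a CPU-side buffer.
    unsigned char* pixels = stbi_load(path, &w, &h, &channels, 4);
    if (!pixels) return 0;

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // 2) System RAM -> VRAM: the driver copies the buffer into GPU memory.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    // 3) The CPU copy can be freed afterwards, though engines often keep it
    //    around for streaming/re-uploads, which is the duplication being
    //    discussed here. On a unified-memory console this staging copy
    //    isn't needed.
    stbi_image_free(pixels);
    return tex;
}
```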
 

Sentenza

Member
On the PC, whenever something is loaded from the HDD it is first loaded into the normal RAM pool, then copied to VRAM if it's data needed for graphics processing. In other words, whatever you have in VRAM, you also have a copy of in normal RAM. In a unified memory space, such a thing is not needed.
Way to oversimplify things.
Not to mention that there's a lot loaded in normal RAM that doesn't need to be passed on to VRAM at all.
 
Watch Dogs.
https://www.youtube.com/watch?v=KRveD-kzuME

And chances are 3GB+ will be necessary for the same texture quality as consoles. It's just the way it is; the common denominator has been raised.

If this is true, then it would stand to reason that games that are out on PC and PS4 would look better on my PS4, because I'm only running a 2GB GTX 670, especially in texture quality. But it's not the case at all. BF4 looks and performs much better on my PC than on consoles.

With Watch Dogs, I think the issue was all about poor optimization on the PC side, they obviously spent way more time optimizing for the unified pool of memory they had on the PS4. They didn't seem to really do any optimization on PC. Also the PS4 version renders at 900p and lacks AF and probably other subtle effects, so direct comparisons of texture quality don't really make sense.
 

iceatcs

Junior Member
That... was my point?
On PC you have separate RAM pools for separate tasks, on consoles you have a unified one.
That's why comparing the VRAM on your GPU to, say, the total amount of GDDR5 in a PS4 is idiotic; it always was, it always will be.
A shared pool lets devs split memory however they want, whereas the PC can't. I believe games don't spend as much on system RAM as on video RAM.

So there might not be much video memory saved by separating the tasks, i.e. a console game using 5GB of RAM might spend less than 1GB on system tasks, and the rest goes to graphics tasks.
 

Kezen

Banned
Why is it that some people think the only quality preset for PC gaming is Ultra?

If you don't have enough VRAM you can play on medium or even high. That's what PC gaming is about. When you buy a new GPU in the future you'll be able to return to the game and max it.

Again, anyone should have a look at this Durante thread:
http://www.neogaf.com/forum/showthread.php?t=885444

PC gaming isn't all about maximum settings, nor should it be, because it's obvious it will be very, very demanding regardless of what consoles are capable of.
I don't think you will ever need more than a 4GB 760 to match the PS4, for instance.

Watch Dogs' High setting is what consoles use. Not Very High.
As you can see the PS4 uses ultra textures. Nothing to be surprised by, consoles do not lack VRAM for now.

If this is true, then it would stand to reason that games that are out on PC and PS4 would look better on my PS4, because I'm only running a 2GB GTX 670, especially in texture quality. But it's not the case at all. BF4 looks and performs much better on my PC than on consoles.
With Watch Dogs, I think the issue was all about poor optimization on the PC side, they obviously spent way more time optimizing for the unified pool of memory they had on the PS4. They didn't seem to really do any optimization on PC. Also the PS4 version renders at 900p and lacks AF and probably other subtle effects, so direct comparisons of texture quality don't really make sense.
BF4 does not use that much VRAM, so that's why. In future games your 670 won't get you the same texture quality as consoles, but you could run higher settings.
Regarding Watch Dogs I'm aware of the differences between the PC and PS4 (played both) but in terms of textures the PS4 matches PC's ultra textures and does that without stuttering.
 
Watch Dogs' High setting is what consoles use. Not Very High.

Watch Dogs was also a horrible mess on PC and should never have been released in the state it was. It couldn't even achieve 60fps with SLI Titans on Ultra textures. Hopefully Mordor doesn't have such issues outside of the optional Ultra texture download.
 

Heigic

Member
Weren't people laughing at Crytek not too long ago for saying that RAM was already a limiting factor this gen? edit: this thread

Complaining about ultra texture packs is just going to mean devs won't release them in the future, as they're not worth the negative PR.
 

Henrar

Member
Way to oversimplify things.
Not to mention that there's a lot loaded in normal RAM that doesn't need to be passed on to VRAM at all.

Well, that's true (the oversimplification stuff). But the transfer of texture data happens, and it takes up space in normal RAM. GPUs on PCs don't have direct access to the HDD. Not yet, anyway.
 

R_Deckard

Member
No.. PC ultra settings do not equal console settings, they are well above them.


Like Watch Dogs' High texture (2 gig VRAM) setting is equal to the consoles', but then it gives an even higher texture setting strictly for 3+ gig GPUs.

Yeah this myth keeps going, IT IS NOT. Console Textures are Ultra.
 

pestul

Member
Guys.. The 8GB 970/980s are going to be at least $100 more you know. They will milk it for sure. Come on AMD! Make the 380/390s 8GB stock.
 

2San

Member
This thread
facepalm.gif


Devs offering these sort of settings is always a good thing in my eyes.
Yup it's a good thing. It becomes a problem when the game doesn't scale well with lower settings. Crysis 3 looks good even if you don't play it on extreme balls to the wall settings.

And people swore up and down in the "I need a new PC" thread that 2 gigs of VRAM would be enough. I went with a 780 3GB. Now I'm sad.
For a GTX 760/GTX 770 it is enough. Even now, at 1080p+ settings, the extra 2GB offers barely any improvement. Which is still true. I think the 780 will still be fine for quite a while as well. Only the crazies would argue that 2/3GB GPUs will last you a generation at max settings.
 

kevm3

Member
Watch Dogs was also a horrible mess on PC and should never have been released in the state it was. It couldn't even achieve 60fps with SLI Titans on Ultra textures. Hopefully Mordor doesn't have such issues outside of the optional Ultra texture download.

That's my biggest concern. Will developers actually take the time to optimize the PC versions? There is zero reason a single Titan, much less dual Titans, can't max out Watch Dogs. If we need all this horsepower to max out games now, especially games that look in between generations, what's going to happen when we get further into the generation?
 

MaLDo

Member
Wow, 6GB is a lot. I wonder why they do the extra work on Ultra textures for just 0.1% of PC gamers.

Isn't Maxwell 20-30% more effective with its VRAM because of the new color compression, or was that only for memory bandwidth? And MFAA saves VRAM compared to MSAA too.

Actually, what they did is not more work, it's less work.

Basically, for every developer, a good, straightforward way to work is to create textures without limitations and scale them down after that.

A good way to know the best size for a texture is to analyze how many texels of it will be visible on screen. But that's a good way, not the best way. The best way is to combine that info with how many seconds those texels will be visible over a complete run of the game. That info could be gathered with automatic tools, but I can't find a developer that takes it into account.

I can give a few examples using Watch Dogs. That game uses big textures for ad panels that sit a few meters above the streets. You can't reach those panels, so in normal gameplay you never see half the texels of that texture at a 1080p screen resolution. But the game uses the same resolution for grass textures that are used everywhere; you look at them frequently, and on top of that, grass textures are stretched across big zones of terrain, so the texel-to-pixel ratio is horrible. Same for roads. A road texture spread over 20 meters of terrain has the same resolution as a newspaper decal lying on a street corner.

Why does that happen?

Striking a good balance when sizing textures is a lot of work. Moreover, you need the memory requirements beforehand to start the calculation. It has to work from the premises I stated before: an ordered list from the most visible texture to the least visible, taking into account how many texels each will need at a fixed screen resolution.

As difficult as that task is, it's easiest on a closed platform. Watch Dogs on PS4 uses a clever mix of ultra/high/medium textures (not a perfect mix, btw). On PC they took the fast path: we have 1Kx1K textures for the environment and 4Kx4K textures for characters, so ultra texture quality is every texture at max resolution, high is every texture at half, and normal at one quarter. So the fucking dirt and leaves on the streets become a blurry mess, and never mind that the player will be looking at that texture for half of the game.

So, if the Mordor game has a few 4K textures for important assets in the console versions, I guess ultra quality on PC means every fucking texture is maxed; they don't care that most of them are not really necessary.
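To make that idea concrete, here's a hedged sketch (my own illustration, not any tool a studio actually ships): rank each texture by how many of its texels are visible on screen, weighted by how long it stays on screen over a playthrough, and pick its resolution from that instead of the "divide everything by the same factor per quality tier" fast path. All names and thresholds below are made up.

```cpp
#include <cstdint>

// Stats an automated playthrough could gather per texture (hypothetical).
struct TextureStats {
    uint32_t native_size;          // authored resolution, e.g. 4096
    double   peak_visible_texels;  // most texels ever on a 1920x1080 screen
    double   seconds_on_screen;    // total time visible over a full run
};

// The "fast path" the post criticises: every texture gets the same divisor
// per quality tier (0 = ultra/full, 1 = high/half, 2 = normal/quarter).
uint32_t FastPathSize(const TextureStats& t, int tier)
{
    return t.native_size >> tier;
}

// The balanced path: textures the player actually stares at (grass, roads)
// keep more resolution than rarely seen ones (ad panels out of reach).
// Thresholds are purely illustrative; a real tool would fit them to a
// target VRAM budget.
uint32_t BalancedSize(const TextureStats& t)
{
    const double importance = t.peak_visible_texels * t.seconds_on_screen;
    if (importance > 1e9) return t.native_size;      // full resolution
    if (importance > 1e7) return t.native_size / 2;  // half resolution
    return t.native_size / 4;                        // quarter resolution
}
```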
 

R_Deckard

Member
There has been no PC port of a console game that has needed more than a 2 gig card to run the same textures.

But you are looking at last gen or at launch titles.

I am at a loss at the naivety and overreaction in this thread. (This and the below are not aimed directly at you.)

Now that the systems have moved on, the baseline for devs is higher. Once the 8GB machines were announced, 2GB of VRAM was never going to cut it.

All we are seeing now is more and more games using that RAM. Consoles will run around 4GB of textures, maybe up to 5 nearer the end of the cycle. I have bought a 970 with its 4GB of RAM and I see it getting me through most of this gen. The issue is that high-end cards like the 780 etc. still have 2GB en masse, and this seems to annoy people who don't also game on console, as they think they should just "max" every game out.

There is more to a system than just one metric. 4GB will be the new baseline this gen on PC, and as expected, if you want to "keep up" then many will need to upgrade; simple enough really. But hey, it is PC, so you could just run Medium textures and be happy. Hardly a deal breaker?!!?
 

Kezen

Banned
That's my biggest concern. Will developers actually take the time to optimize the PC versions? There is zero reason a single Titan, much less dual Titans, can't max out Watch Dogs. If we need all this horsepower to max out games now, especially games that look in between generations, what's going to happen when we get further into the generation?
A Titan can max out WD but 60fps locked is not guaranteed.
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/21.html

Got a source for this? 'Cause, I fucking doubt it.
https://www.youtube.com/watch?v=KRveD-kzuME
 
And people swore up and down in the "I need a new PC" thread that 2 gigs of VRAM would be enough. I went with a 780 3GB. Now I'm sad.

It is enough. The Witcher 2 will make most PCs cry on its highest settings; that doesn't mean 3GB of VRAM won't be able to appropriately cover all your games. Expecting all games, especially those tailored around PC, to run with everything maxed is unrealistic. No consumer PC out there can handle new games at 4K/60fps yet; that doesn't mean the card is garbage.
 

pestul

Member
That's my biggest concern. Will developers actually take the time to optimize the PC versions? There is zero reason a single Titan, much less dual Titans, can't max out Watch Dogs. If we need all this horsepower to max out games now, especially games that look in between generations, what's going to happen when we get further into the generation?
I think it's going to become the new extreme tessellation. They'll all leave textures uncompressed just because... this move is going to make all the 'ultra or upgrade' enthusiasts really groan. I have no problem choosing High settings, but I wouldn't ever want to select Medium anything with brand-new hardware. We're all so stubborn.
 