
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

Anyone can confirm this: Watch Dogs does not ship with AF of any kind on PC.
You have to force 16x AF in your driver control panel; fortunately, it works perfectly.

Well, if you don't have it on either version, the difference isn't because of that. I didn't know whether the feature was in the PC version.
 

mdzapeer

Member
A lot of people don't play at 1080p anymore, but at higher resolutions.

These guys aren't trying to max their games all the time.

But say you pay £500/$750 for a new graphics card that is capable of powering the game at that resolution, yet because Nvidia wants to pace its product line, your card can't accommodate the game at the resolution you intended to play at.

That's being gimped. Not by the game company (unless they optimized it really poorly) but by the guys selling the cards, to get you to upgrade again in a few months/next year.

You hit the nail on the head.
 
I doubt even the ultra textures will use 6GB at 1080p, however; High uses 2.7GB

I'm a bit curious how you measure VRAM usage, because some of the usual applications aren't all that reliable in DX11 games. If you have the means to try it using Process Explorer, I'd appreciate it if you could check how much the game consumes under 'Committed GPU Memory'. MSI Afterburner and GPU-Z only report 'Dedicated GPU Memory', which can be very misleading.

Here's Watch Dogs, for example, with textures on 'High' and then 'Ultra' on a 2GB card. Ultra textures do need 3GB of VRAM, but the wrong metric will only ever show it using 1.6GB on a 2GB card. The overlay you see is how much it consumed on 'High', which is also more accurate than what the other metric would suggest.

[Screenshot: Watch Dogs VRAM usage overlay, 'High' vs 'Ultra' textures]
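For anyone who wants to poll that figure themselves, here is a minimal sketch using NVML via the pynvml Python package (the package choice is an assumption on my part, not something used in the thread). Note that NVML, like GPU-Z and Afterburner, only reports dedicated GPU memory, so it will understate usage compared with Process Explorer's 'Committed GPU Memory' column.

```python
# Minimal sketch: poll dedicated VRAM usage on an NVIDIA card via NVML.
# Assumes the pynvml package is installed; this mirrors the GPU-Z/Afterburner
# "Dedicated GPU Memory" reading, NOT the committed figure discussed above.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU in the system
info = pynvml.nvmlDeviceGetMemoryInfo(handle)       # .total / .used / .free, in bytes
print(f"Dedicated VRAM used: {info.used / 1024**3:.2f} GiB "
      f"of {info.total / 1024**3:.2f} GiB")
pynvml.nvmlShutdown()
```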
 

bj00rn_

Banned
0.8GB VRAM running at 1080p with 4xMSAA, and at 60fps (http://www.neogaf.com/forum/showpost.php?p=131720732&postcount=508)




The Vanishing of Ethan Carter.

Here's another maxing at 60 fps using ~1.4GB VRAM: https://www.youtube.com/watch?v=n2znh0ppBzQ

Of course there are no enemies on screen, but come on: 0.8-1.4GB VRAM (it will use more on different cards, though; it uses 3GB VRAM on mine, and maybe throw in some imprecise measuring methods as well, but it seems like people are able to run it like this on 2GB).

So what in the fuck's name is Shadow of Mordor doing with its VRAM usage?

TL;DR: There is more to graphics than texture usage.
 

Eusis

Member
I agree with almost everything you said except the "buy a console at launch, then a PC later" part. It was true during earlier console launches, but it is not true anymore. Not when there is affordable hardware out there that can handily outperform next-gen consoles even in poorly optimised games like Watch Dogs.
Pretty sure it was true then too (Deus Ex predated the PS2, but had to be cut down to work on it); the biggest change is that back then you usually needed a full PC overhaul if it was a few years old. Nowadays you can just upgrade the video card, so long as it wasn't TOO old or TOO cheap, and be stomping consoles handily. Well, and generally stability and reliability are better than they've ever been; it says something that SRII was one of a kind among bigger titles, and that Dark Souls, a port whose main flaw was being TOO faithful, was heralded by some as the "worst port EVER."

All of which might undermine my point, admittedly, but I'm also operating from a mentality of going for full-on overkill at a low-to-midrange cost, and just a bit more waiting (or a custom 970; it wouldn't be the first time we had special models with extra RAM bolted on) is still needed.
 
0.8GB VRAM running at 1080p with 4xMSAA, and at 60fps (http://www.neogaf.com/forum/showpost.php?p=131720732&postcount=508)





The Vanishing of Ethan Carter.

Here's another maxing at 60 fps using ~1.4GB VRAM: https://www.youtube.com/watch?v=n2znh0ppBzQ

Of course there are no enemies on screen, but come on: 0.8-1.4GB VRAM (it will use more on different cards, though; it uses 3GB VRAM on mine, and maybe throw in some imprecise measuring methods as well, but it seems like people are able to run it like this on 2GB).

So what in the fuck's name is Shadow of Mordor doing with its VRAM usage?

TL;DR: There is more to graphics than texture usage.

http://www.neogaf.com/forum/showpost.php?p=131816942&postcount=1102
 

bj00rn_

Banned

Exactly; Shadow of Mordor is fuck all optimized, and it doesn't even look that good. So why are we using it to decide the future of GPUs again? (rhetorical question, the answer is: we shouldn't, it's not a good sample for that)

Try reading the last page or so. The "Ultra" settings use downsampling.

Waitaminute, but that kind of changes the whole premise..
 

Jedi2016

Member
Yeah, but who actually just uses a preset anymore? I manually crank everything to maximum, and it nearly always shows "Custom" under the preset menu.

So, if you're running the game only at 1080p, without downsampling, then what kind of memory usage are we looking at with the texture pack installed?
 
Exactly; Shadow of Mordor is fuck all optimized, and it doesn't even look that good. So why are we using it to decide the future of GPUs again? (rhetorical question, the answer is: we shouldn't, it's not a good example for that)



Waitaminute, but that kind of changes the whole premise..

Uhm, what I said isn't anything like that. AT ALL.

It uses photogrammetry for the textures, which is not feasible for this LOTR game. That doesn't explain the difference in VRAM requirements, though, just why Mordor's textures aren't as good as that game's.
 

bj00rn_

Banned
Uhm, what I said isn't anything like that. AT ALL.

I know, I was trolling you, it was wrong, I feel guilty, I'm sorry :)

The game shouldn't even use 3GB of VRAM at "High" looking like it does, though, so in my super-pro expert (NOT) opinion it doesn't appear to be on par when it comes to optimization.

I just tested The Vanishing of Ethan Carter, downsampling it at 4K, and the VRAM usage in Afterburner went DOWN from 3GB+ to a little over 2GB. I'm confused, LOL. Could be because I'm a little tipsy.
 
Try reading the last page or so. The "Ultra" settings use downsampling.

Supersampling performance typically depends more on memory bandwidth than on memory size. It does need more memory, but we're only talking a few hundred megabytes.

Edit: Spoke a bit too soon; it's closer to a gigabyte in more modern games. BF4 goes from 1.6 to 2.5 gigabytes of VRAM use between 1080p and 4K.
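As a rough, back-of-the-envelope illustration of why the resolution-scaled part of VRAM use stays fairly modest (the buffer counts and formats below are assumptions for the sake of the example, not measurements from BF4):

```python
# Rough sketch: how much the resolution-scaled render targets alone grow when
# going from 1080p to 4K. The "5 targets at 4 bytes per pixel" layout is an
# assumption for illustration; real engines keep many more scaled buffers
# (HDR targets, depth, post-processing chains), which is where the ~1GB
# difference reported for games like BF4 comes from.
def render_target_bytes(width, height, bytes_per_pixel=4, num_targets=5):
    return width * height * bytes_per_pixel * num_targets

GIB = 1024 ** 3
for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{name}: {render_target_bytes(w, h) / GIB:.2f} GiB in render targets")
# ~0.04 GiB at 1080p vs ~0.15 GiB at 4K under these assumptions. The textures
# themselves do not change size with resolution, which is why total VRAM use
# grows by far less than 4x when supersampling.
```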
 

nbthedude

Member
Exactly; Shadow of Mordor is fuck all optimized, and it doesn't even look that good. So why are we using it to decide the future of GPUs again? (rhetorical question, the answer is: we shouldn't, it's not a good sample for that)



Waitaminute, but that kind of changes the whole premise..

If you actually read that linked post you quoted and conclude that it agrees with you that Mordor is "fuck all optimized" then you have a reading comprehension problem.
 
That might be enough to push it past 4GB. They aren't going to say it needs 4.5GB of VRAM.

Yep, it was stupid of me to not check first. I edited my post.

I just think it's weird because they state that it applies to a 1080p rendering resolution. You'd think they would be decoupling rendering resolution from texture resolution when making statements like that.
 
I know, I was trolling you, it was wrong, I feel guilty, I'm sorry :)

The game shouldn't even use 3GB of VRAM at "High" looking like it does, though, so in my super-pro expert (NOT) opinion it doesn't appear to be on par when it comes to optimization.

I just tested The Vanishing of Ethan Carter, downsampling it at 4K, and the VRAM usage in Afterburner went DOWN from 3GB+ to a little over 2GB. I'm confused, LOL. Could be because I'm a little tipsy.

Alrighty then, sorry.

If you actually read that linked post you quoted and conclude that it agrees with you that Mordor is "fuck all optimized" then you have a reading comprehension problem.

Yeah, turned out to be a joke.
 

UnrealEck

Member
That might be enough to push it past 4GB. They aren't going to say it needs 4.5GB of VRAM.

That's another good point. They'll be going by 1GB, 2GB, 3GB, 4GB and 6GB increments, since cards typically come with those amounts. It may not use the full 6GB, but it might still require a card with more than 4GB, and those only come in 6GB flavours.
 

Ragus

Banned
4gb card here, so high for me, but I wonder about one thing:

Will it be possible to run it at ultra textures, but capped at 30 fps? I wouldn't mind that if I could get them ultra textures.
 

UnrealEck

Member
4gb card here, so high for me, but I wonder about one thing:

Will it be possible to run it at ultra textures, but capped at 30 fps? I wouldn't mind that if I could get them ultra textures.

You can run it (unless they set a restriction), but if there's not enough memory it'll be swapping data in and out constantly and you'll get short pauses all the time.
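As a toy illustration of why that swapping turns into visible hitching (this is just a sketch with made-up sizes and a rough PCIe transfer rate, not how the game's streamer actually works):

```python
# Toy model of texture residency on a card that is over budget: every miss
# forces an eviction plus a PCIe upload in the middle of a frame. Texture
# sizes and the ~16GB/s transfer rate are illustrative assumptions.
from collections import OrderedDict

class VramCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()          # texture id -> size in MB, LRU order
        self.used_mb = 0

    def touch(self, tex_id, size_mb):
        """Return the stall (in ms) caused by touching this texture."""
        if tex_id in self.resident:            # already in VRAM: no upload needed
            self.resident.move_to_end(tex_id)
            return 0.0
        while self.used_mb + size_mb > self.budget_mb and self.resident:
            _, evicted = self.resident.popitem(last=False)   # evict least recently used
            self.used_mb -= evicted
        self.resident[tex_id] = size_mb
        self.used_mb += size_mb
        return size_mb / 16.0                  # MB / (~16 MB per ms over PCIe 3.0 x16)

cache = VramCache(budget_mb=4096)              # a 4GB card facing a ~6GB texture set
stall = sum(cache.touch(i % 96, 64) for i in range(200))   # 96 textures * 64MB ≈ 6GB
print(f"total stall over 200 texture touches: {stall:.0f} ms")
# The working set never fits, so every touch misses and stalls for a few ms --
# the "short pauses all the time" described above.
```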
 
4gb card here, so high for me, but I wonder about one thing:

Will it be possible to run it at ultra textures, but capped at 30 fps? I wouldn't mind that if I could get them ultra textures.

It depends on too many factors and we simply don't know yet. As Lactose_Intolerant pointed out, ultra textures might only need 4.5GB, or they could need the full 6 gigabytes of VRAM.

PCIe 3.0 has a bandwidth of ~16GB/sec or roughly half a gigabyte per frame at 30fps.
 

Kezen

Banned
It depends on too many factors and we simply don't know yet. As Lactose_Intolerant pointed out, ultra textures might only need 4.5GB, or they could need the full 6 gigabytes of VRAM.

PCIe 3.0 has a bandwidth of ~16GB/sec or roughly half a gigabyte per frame at 30fps.

I may be wrong, but PCI-E 3.0 has 32GB/s:
Following the tradition that goes back to AGP (remember that?), the bandwidth is again doubled from 500MB/sec or 4Gb/sec on PCI-Express 2.0 to 1GB/sec or 8Gb/sec on PCI-Express 3.0. That's per lane, in each direction. This means the total bandwidth for a 16x PCI-Express graphics slot goes up from 16GB/s to 32GB/s, so it should cope better with the future demands of high performance graphics cards.
http://www.bit-tech.net/hardware/2010/11/27/pci-express-3-0-explained/1
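Both figures are right depending on how you count, since 16GB/s is per direction and 32GB/s is the two directions added together. A quick back-of-the-envelope check (the numbers below just restate the PCIe 3.0 spec values quoted above):

```python
# PCIe 3.0 arithmetic: ~1GB/s per lane per direction (8 GT/s with 128b/130b
# encoding), so an x16 slot is ~16GB/s each way, ~32GB/s if you add both
# directions. The "half a gigabyte per frame at 30fps" figure earlier in the
# thread follows from the one-way number.
LANES = 16
GB_PER_S_PER_LANE = 8 * 128 / 130 / 8   # 8 GT/s * encoding efficiency / 8 bits per byte

one_way = LANES * GB_PER_S_PER_LANE     # ~15.8 GB/s in one direction
aggregate = 2 * one_way                 # ~31.5 GB/s counting both directions
per_frame_30fps = one_way / 30          # ~0.53 GB that could be streamed per frame

print(f"one-way: {one_way:.1f} GB/s, aggregate: {aggregate:.1f} GB/s, "
      f"per 30fps frame: {per_frame_30fps:.2f} GB")
```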
 

Larogue

Member
I'm assuming you're pulling this data out of your arse, as there's currently no confirmed date or announcement for 8GB cards.

Wait, do 8GB versions of 970/980 really come out in November? Source?

That'll be my time to upgrade, if true. Hadn't heard that!

According to OCUK, 8GB models of the GTX 970 and GTX 980 are expected somewhere between November and December.



http://www.guru3d.com/news-story/nv...o-have-4gb-initally-and-8gb-models-later.html
 

Kezen

Banned
Is last gen lower quality than "Low" settings? How high are the next-gen console settings? Digital Foundry, get in here.

I would be surprised if there is even a "360/PS3" equivalent setting. Judging by the minimum requirements the game does not seem to scale that low.
 

UnrealEck

Member
I'm somewhat glad I didn't scrape cash together for a GTX 900 series card just yet. I'll almost definitely be going for one of the 8GB versions if they materialise. Even a 6GB version would be nice.
 

BONKERS

Member
Couch gaming works great on PC. If you want the option, you have it. I know a lot of people here, myself included, have their PCs hooked up to their TV. I run a long HDMI cable down into the basement and back up behind my TV (something like a 50-foot run).

The Bluetooth adapter I have for my DS4 works without issue on my couch, and I can mirror the display back to my desk for the instances where using the trackpad on the DS4 as a mouse doesn't work. Combine that with Steam Big Picture mode, and you've got yourself an amazing setup.

Other options include putting a much less powerful media PC or steam box next to your TV and using Steam streaming, or buying something like the Alienware X51 to use more like a console and having it hooked up directly. Many other options exist.


I use my PC for TV gaming as well.

I run a DVI-to-HDMI cable either to the other side of the room or from my office to my bedroom (less than 35 ft), and then run a 32 ft active USB extension cable to a small unpowered hub for a wireless keyboard and any controllers.

Works like a charm.
 
Alright.

Ultra textures are not available yet, so the PC Gamer video only uses "High".

I doubt even the ultra textures will use 6GB at 1080p, however. High uses 2.7GB, but that is with downsampling from 2720x1700 to 1920x1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that, so instead of 1080p I get some really weird ones like 1927x1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option: you can set the internal rendering to 200% and effectively get 4K resolution while still outputting at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3GB. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4GB might just not be enough, making the jump to 6GB logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3GB running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, plus downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably do at least slightly better than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

You're doing the Lord's work.
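To put some numbers on the supersampling claims in the post above (assuming, as "effectively 4K while still being in 1080p" implies, that the 200% scale is applied per axis):

```python
# Quick sketch of the pixel counts behind the post above. Assumes the in-game
# "200%" supersampling scale applies to each axis, which is what makes
# 1080p output render internally at 4K.
def internal_resolution(base_w, base_h, scale_percent):
    s = scale_percent / 100
    return round(base_w * s), round(base_h * s)

base = 1920 * 1080
w, h = internal_resolution(1920, 1080, 200)
print(f"200% of 1080p renders internally at {w}x{h} "
      f"({w * h / base:.1f}x the pixels of native 1080p)")       # 3840x2160, 4.0x

# The poster's downsampling resolution sits in between:
print(f"2720x1700 is {2720 * 1700 / base:.1f}x the pixels of 1080p")  # ~2.2x
```

That ~2.2x figure is consistent with the 2.7GB "High" reading landing between a plain 1080p number and a full 4K one, which is why the 3GB-ish estimate for 1080p with 200% supersampling seems plausible.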
 
I'm asking because I've been meaning to ask for ages, not picking on you.

So, why is it considered a fail if a 1080p/60fps PS4 (not PC) title turns out to run at 60 with "some minor drops to ~50 fps"? I remember the absolute fascination with whether TLOU was going to have dips below 60 or not (it turned out it really was 60, to the second deviation or better).

On PC there's the implicit assumption that if you want higher frame rate you'll lower some settings a little. On PS4, if a game is shooting for a certain frame rate and fails, then it is considered a failure.

He's running it maxed at 2720x1700, not 1920x1080. Cut him some slack, lol.
 

IMACOMPUTA

Member
Alright.

Ultra textures are not available yet, so the PC Gamer video only uses "High".

I doubt even the ultra textures will use 6GB at 1080p, however. High uses 2.7GB, but that is with downsampling from 2720x1700 to 1920x1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that, so instead of 1080p I get some really weird ones like 1927x1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option: you can set the internal rendering to 200% and effectively get 4K resolution while still outputting at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3GB. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4GB might just not be enough, making the jump to 6GB logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3GB running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, plus downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably do at least slightly better than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

Thanks for that!
And LOL at this thread, given this info.
 

Lowmelody

Member
The wording of the description and the reaction it elicits is completely intended. It is an Nvidia sponsored game, after all. The wording is such to ensure the only way you know you're getting the best experience is if you buy Nvidia's up coming 6gb cards. More words between that and the player only muddies the message.

Yes expectations increase with time which in turn mandates increasingly powerful tech and yes that ebb and flow is and will always be an aspect of PC gaming and one that some enjoy, but that doesn't mean that it's not business as usual in other areas as well. This is marketing.
 

Lulubop

Member
Alright.

Ultra textures are not available yet, so the PC Gamer video only uses "High".

I doubt even the ultra textures will use 6GB at 1080p, however. High uses 2.7GB, but that is with downsampling from 2720x1700 to 1920x1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that, so instead of 1080p I get some really weird ones like 1927x1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option: you can set the internal rendering to 200% and effectively get 4K resolution while still outputting at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3GB. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4GB might just not be enough, making the jump to 6GB logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3GB running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, plus downsampling, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably do at least slightly better than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

So I don't have to trade in my SLI 980s for a PS4?

I can't wait for the next PC spec recommendation thread for Kinggi to overreact in, and for console warriors to rush in and say "I told you so" before any benchmarks are out.
 
I have a GTX 660 3GB, 8GB of RAM and a crappy AMD FX-4100 quad-core processor.

How would this game hypothetically run on my PC, or should I just hold out for the PS4 version? Even though I can get the PC version for about £17.
 
The wording of the description and the reaction it elicits is completely intended. It is an Nvidia sponsored game, after all. The wording is such to ensure the only way you know you're getting the best experience is if you buy Nvidia's up coming 6gb cards. More words between that and the player only muddies the message.

Yes expectations increase with time which in turn mandates increasingly powerful tech and yes that ebb and flow is and will always be an aspect of PC gaming and one that some enjoy, but that doesn't mean that it's not business as usual in other areas as well. This is marketing.

Yeah... I have my doubts about that. I am having a hard time deciding whether you are even serious.

First of all, I don't know if this game is Nvidia-sponsored in the first place; usually you get to know about Nvidia-exclusive effects and such, and I haven't been aware of any.

Secondly, the cards with 6GB aren't out yet, haven't officially been announced, and don't come from Nvidia themselves.

Third, AMD also has 6GB cards.

And last of all, it is incredibly stupid.

I have a GTX 660 3GB, 8GB of RAM and a crappy AMD FX-4100 quad-core processor.

How would this game hypothetically run on my PC, or should I just hold out for the PS4 version? Even though I can get the PC version for about £17.

Wait for benchmarks, your GPU will probably be fine, but I don't know how well the CPU will fare.
 

UnrealEck

Member
I have a GTX 660 3GB, 8GB of RAM and a crappy AMD FX-4100 quad-core processor.

How would this game hypothetically run on my PC, or should I just hold out for the PS4 version? Even though I can get the PC version for about £17.

The recommended graphics card is a GTX 660, I think.
Not sure about the processor.
Your main memory is plenty.
I'd say there's a good chance the PS4 will run it better.
 

BONKERS

Member
Show me this proof of it, then, other than a tweet?



I think you mean heterodoxy, and no, I posted images months ago. Looking at them on a screen with my own eyes, Ultra textures = PS4 textures, end of.


No, as I have stated above, this strawman witch hunt is comical and is derailing this thread, so I am done here.

The fact of the matter is, Developers say otherwise.

And no one has taken the time to make extensive 1:1 lossless PNG comparisons of either.

Digital Foundry's screenshots mostly compare cutscenes and the like, and their comparison gallery is quite small. Plus their screenshots are extremely compressed, making them null anyway.
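For what it's worth, the 1:1 lossless comparison the post asks for is not hard to script. Here's a minimal sketch using Pillow; the file names are placeholders, and it assumes both captures are uncompressed PNGs at the same resolution:

```python
# Minimal sketch of a 1:1 lossless screenshot comparison: diff two PNG
# captures pixel by pixel. Assumes Pillow is installed and both screenshots
# share the same resolution; the file names below are placeholders.
from PIL import Image, ImageChops

def compare_screenshots(path_a, path_b, diff_path="diff.png"):
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")
    diff = ImageChops.difference(a, b)                 # per-channel absolute difference
    diff.save(diff_path)                               # visual map of where they differ
    changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
    return changed / (a.width * a.height)              # fraction of pixels that differ

# e.g. print(compare_screenshots("pc_ultra.png", "ps4_capture.png"))
```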
 

BBboy20

Member

According to OCUK, 8GB models of the GTX 970 and GTX 980 are expected somewhere between November and December.



http://www.guru3d.com/news-story/nv...o-have-4gb-initally-and-8gb-models-later.html
Oh please let this be true.

Man, that Vanishing of Ethan Carter game looks incredible.
http://www.theastronauts.com/2014/03/visual-revolution-vanishing-ethan-carter/

Amazing that vets can go from making meathead shooters to creating cerebral entertainment.
 

Lowmelody

Member
Yeah... I have my doubts about that. I am having a hard time deciding whether you are even serious.

First of all, I don't know if this game is Nvidia-sponsored in the first place; usually you get to know about Nvidia-exclusive effects and such, and I haven't been aware of any.

Secondly, the cards with 6GB aren't out yet, haven't officially been announced, and don't come from Nvidia themselves.

Third, AMD also has 6GB cards.

And last of all, it is incredibly stupid.



Wait for benchmarks, your GPU will probably be fine, but I don't know how well the CPU will fare.

Aside from whatever it is you think you are aware of being completely irrelevant to anything and that I clearly said "up coming", I'm most of all puzzled as to what I could have possibly said to have warranted such a typical angry gamer post™. Moving on
 
Aside from whatever it is you think you are aware of being completely irrelevant to anything and that I clearly said "up coming", I'm most of all puzzled as to what I could have possibly said to have warranted such a typical angry gamer post™. Moving on

There are no 6GB Nvidia cards coming out; they will be either 4GB or 8GB. They just released the 4GB cards, so I don't know how this would be good for them.
 
Aside from whatever it is you think you are aware of being completely irrelevant to anything and that I clearly said "up coming", I'm most of all puzzled as to what I could have possibly said to have warranted such a typical angry gamer post™. Moving on

I don't think this is an Nvidia-sponsored game. There was some footage on Game24, but that's it. And I know you said upcoming, but it still doesn't make sense.

Also, I am not angry. I just don't agree, for many reasons, and you are literally the only person in this thread bringing this point up.

There are no 6GB Nvidia cards coming out; they will be either 4GB or 8GB. They just released the 4GB cards, so I don't know how this would be good for them.

Oh yeah, and I forgot this.
 