
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

So, the OT and silly W_D tangent aside, I love the fact that some developers are already going beyond the current console level not just in effects or IQ, but in actual asset quality.

I honestly didn't expect that before late 2015. Nice job Monolith.

While I'm not disagreeing with you here, is it really more than just nudging the compression slider a little in one direction when it comes to textures? (I actually have to wonder why they didn't do this with every game in the past.)
 

Durante

Member
While I'm not disagreeing with you here, is it really more than just nudging the compression slider a little in one direction when it comes to textures? (I actually have to wonder why they didn't do this with every game in the past.)
It really depends on the original resolution the textures were made at. During the later years of the previous console generation, textures were clearly made in much higher resolution/quality than what they were presented at in the console versions of games.

I'm just surprised that happened again so quickly, and that developers are actually putting in assets at a quality which only few PC gamers can make full use of right now. I find that admirable.

Show me this proof of it then, other than a tweet
If a tweet from the developer is not good enough for you compared to the full weight of evidence resting in your word, then I'm not going to put in any more effort. You're not worth it.
 

Brofist

Member
If a tweet from the developer is not good enough for you compared to the full weight of evidence resting in your word, then I'm not going to put in any more effort. You're not worth it.

What would the developer know about anything? Clearly eyeballing it is the only legit way.
 

Nethaniah

Member
So, the OT and silly W_D tangent aside, I love the fact that some developers are already going beyond the current console level not just in effects or IQ, but in actual asset quality.

I honestly didn't expect that before late 2015. Nice job Monolith.

As long as lowered settings (equal to console settings) provide me with better performance, considering I have better hardware, it's totally awesome.
 

irishcow

Member
Alright.

Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".

I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, and downsampling enabled, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably get slightly higher numbers than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)
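For anyone who wants to sanity-check that reasoning, here is a rough back-of-the-envelope sketch in Python. The per-pixel buffer cost and the fixed texture-pool size are illustrative guesses, not measurements from the game:

```python
# Back-of-the-envelope sketch: how much extra VRAM the resolution-dependent
# buffers might need as the internal rendering resolution goes up.
# Both constants below are pure guesses, used only for illustration.

ASSUMED_BYTES_PER_PIXEL = 64    # colour + depth + G-buffer targets, guessed
ASSUMED_TEXTURE_POOL_GB = 2.4   # guessed fixed cost of the High texture pool

def buffers_gb(width, height):
    """Estimated size of all resolution-dependent render targets."""
    return width * height * ASSUMED_BYTES_PER_PIXEL / 1024 ** 3

resolutions = [
    ("1080p native",               1920, 1080),
    ("2720 x 1700 downsampling",   2720, 1700),
    ("1080p @ 200% supersampling", 3840, 2160),  # 200% per axis = 4x the pixels
]

for label, w, h in resolutions:
    total = ASSUMED_TEXTURE_POOL_GB + buffers_gb(w, h)
    print(f"{label:28} -> ~{total:.1f} GB of VRAM")
```

With those assumed constants the output lands around 2.7 GB for the 1700p case and roughly 3 GB for the 200% supersampled one, but only because the guesses were tuned to match the figures above; the takeaway is simply that the resolution-dependent part grows with pixel count while the texture pool stays put.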

Thank you for this excellent post! Made me very excited to play with my overclocked R9 290!
 
Ah, PC gaming. No worries here. I'll probably open GeForce Experience, click optimize with my GTX 660, and call it a day. Feel free to flame me. Don't care. The game cost me $25 and I'll enjoy it even at low settings.
 

krizzx

Junior Member
This sounds ridiculous. I want to see some comparison vids between high and ultra to see if the difference is actually worth it. What could 6GB of VRAM possibly bring to a game that 4GB cannot? These textures had better look realer than real.

I'm betting on it being poor optimization, though.
 
Ah, PC gaming. No worries here. I'll probably open GeForce Experience, click optimize with my GTX 660, and call it a day. Feel free to flame me. Don't care. The game cost me $25 and I'll enjoy it even at low settings.

Why would it look bad on a GTX 660?

You can tweak individual settings and still make the game look great.

High, Low, Medium, Ultra are just presets.
 

Alasfree

Member
Holy shit, are they stupid if this is actually true.

[screenshot of the requirements text]


They wrote "at 1080p rendering resolution". If you use supersampling, internally the game isn't using that resolution, but 4K! There is a world of difference. This is why some people were shitting bricks - the whole discussion assumed these requirements were for native 1080p.
If this is the case, hopefully they fix that text in the final release.
So much panic for nothing, hehe.
 

R_Deckard

Member
If a tweet from the developer is not good enough for you compared to the full weight of evidence resting in your word, then I'm not going to put in any more effort. You're not worth it.
But if this tweet proves it, then the consoles run High on all textures, not a mixture of Medium/High and Ultra, right? As per the proof you put forward, I guess?
 
Why would it look bad on a GTX 660?

You can tweak individual settings and still make the game look great.

High, Low, Medium, Ultra are just presets.

Not saying that it would. I'm saying that I'm not going to obsess over texture quality to the point where I feel like I can't enjoy the game.
 

scitek

Member
So, the OT and silly W_D tangent aside, I love the fact that some developers are already going beyond the current console level not just in effects or IQ, but in actual asset quality.

I honestly didn't expect that before late 2015. Nice job Monolith.

Pretty sure you're supposed to be upset that most people won't get to max the game out.
 
Yes.

Generally you author textures to be max quality for the console platforms when seen up close, so if you want to load all those in at once on PC, you need shit tons of VRAM.
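As a purely illustrative sketch of why that gets expensive (the material count, maps per material, and sizes below are assumptions, not the game's actual asset budget):

```python
# Illustrative only: size of a single block-compressed texture with a full
# mip chain, and how quickly a few hundred resident maps add up.

def texture_mb(size, bytes_per_texel=1.0):
    """Square texture with a full mip chain.

    ~1 byte per texel matches BC7/DXT5 block compression; the mip chain
    adds roughly a third on top of the base level.
    """
    return size * size * bytes_per_texel * 4 / 3 / 1024 ** 2

ASSUMED_MATERIALS = 200   # unique materials kept resident (a guess)
MAPS_PER_MATERIAL = 3     # e.g. albedo + normal + specular (a guess)

for size in (1024, 2048, 4096):
    per_map = texture_mb(size)
    resident_gb = ASSUMED_MATERIALS * MAPS_PER_MATERIAL * per_map / 1024
    print(f"{size}x{size}: {per_map:5.1f} MB per map, "
          f"~{resident_gb:.1f} GB for {ASSUMED_MATERIALS * MAPS_PER_MATERIAL} resident maps")
```

Real engines stream textures and drop mip levels for distant objects, so the actual resident set is far smaller than this naive total, but doubling the source resolution still quadruples whatever does stay loaded - which is where the bigger VRAM recommendation comes from.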

I could very well see myself phasing out of PC gaming due to this. I work in IT so I'm on a computer all day, and consoles are a welcome break from that. Add in the extra cost / maintenance / time consumption and fuck it I'm just gonna run my current 6850/Phenom X4 gaming PC into the ground and roll w/ consoles from here on out.
 

Daante

Member
I wonder how much of an impact this information has on the potential overall sales for the first batch of GTX 980s and 970s (which "only" have 4GB of VRAM).
 
I guess Watch Dogs' VRAM scare proved to be a sign of things to come. Personally, I think this is hugely problematic. Requirements have suddenly shot up by a huge degree, requiring ludicrously priced cards to get the same performance you were getting a year ago. This is bullshit price gouging.

Don't even see the point of getting the 970 now and you better believe I'm not going to buy one of those absurdly priced uber cards. Seriously, fuck these guys.
 

pixlexic

Banned
I guess Watch Dogs' VRAM scare proved to be a sign of things to come. Personally, I think this is hugely problematic. Requirements have suddenly shot up by a huge degree, requiring ludicrously priced cards to get the same performance you were getting a year ago. This is bullshit price gouging.

Don't even see the point of getting the 970 now and you better believe I'm not going to buy one of those absurdly priced uber cards. Seriously, fuck these guys.

There are no signs that this will be standard; it's nothing but a cool extra.

I would rather devs add in options that I won't be able to hit for the next 3-4 years than have the game look older sooner.

Also, the only problem here is just from people with spec addiction. "I MUST MAX OUT ALL GAMES!!" That's not even possible anyway, since not all games are ported equally.
 

Nzyme32

Member
Alright.

Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".

I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, and downsampling enabled, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably get slightly higher numbers than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

Nice to see something meaningful come out of this thread, thanks for posting. Looking forward to checking it out when I can get my mitts on it
 
So, the OT and silly W_D tangent aside, I love the fact that some developers are already going beyond the current console level not just in effects or IQ, but in actual asset quality.

I honestly didn't expect that before late 2015. Nice job Monolith.

Going by the PC Gamer video of the Ultra settings, it doesn't look much different than the screens of the PS4 version that was posted in the OT yesterday.
 
There are no signs that this will be standard; it's nothing but a cool extra.

I would rather devs add in options that I won't be able to hit for the next 3-4 years than have the game look older sooner.

Also, the only problem here is just from people with spec addiction. "I MUST MAX OUT ALL GAMES!!" That's not even possible anyway, since not all games are ported equally.

It just seems to me that at this rate, 4GB of VRAM isn't going to do the job for very long. A year from now, maybe less, it will probably be like having a 2GB card now - somehow already antiquated.

It's obvious that starting to design for next-gen consoles has accelerated the leap in VRAM requirements, creating a really sharp, sudden price jump in order to keep up.
 
Alright.

Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".

I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, and downsampling enabled, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably get slightly higher numbers than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

So what is the total game size MINUS the ultra textures?
 
But if this tweet proves it, then the consoles run High on all textures, not a mixture of Medium/High and Ultra, right? As per the proof you put forward, I guess?

The tweet doesn't refer specifically to only textures. Obviously there can be a bit of mixing. Even then "high" corresponds more to a mixture of medium, high and ultra, than your ultra does.

But okay, here we go, I bothered to search for comparison pictures between the PS4 and ultra on the PC:

http://www.hardcoregamer.com/2014/05/27/watch-dogs-face-off-pc-vs-ps4/86291/

Sadly I can't link the full pictures. But here are some snippets:

[three pairs of cropped comparison screenshots, each labelled PC and PS4]

Both are rendered at 1080p. EDIT: the PC version is rendered at 1080p.
 

nbthedude

Member
Couch gaming works great on PC. If you want the option, you have it. I know a lot of people here, myself included, have their PCs hooked up to their TV. I run a long HDMI cable down into the basement and back up behind my TV (something like a 50 foot run).

The Bluetooth adapter I have for my DS4 works without issue from my couch, and I can mirror the display back to my desk for the instances where using the trackpad on the DS4 as a mouse doesn't work. Combine that with Steam Big Picture mode, and you've got yourself an amazing setup.

Other options include putting a much less powerful media PC or steam box next to your TV and using Steam streaming, or buying something like the Alienware X51 to use more like a console and having it hooked up directly. Many other options exist.


I did something similar. I ran 50 ft of both an HDMI and a USB cable (which I use for the Xbox 360 dongle or whatever else) through a wall. It was the best $50 I've ever spent gaming-wise. I unplug the HDMI from the TV and my PC defaults back to my cozy desk setup. Boot up Steam Big Picture, plug the HDMI back in, and it automatically defaults to the big-screen TV. No futzing with settings or anything. I can switch back and forth at my leisure.

Though honestly, I find I still LIKE being at my desk most of the time. But it's nice to have both options.
 
The tweet doesn't refer specifically to only textures. Obviously there can be a bit of mixing. Even then "high" corresponds more to a mixture of medium, high and ultra, than your ultra does.

But okay, here we go, I bothered to search for comparison pictures between the PS4 and ultra on the PC:

http://www.hardcoregamer.com/2014/05/27/watch-dogs-face-off-pc-vs-ps4/86291/

Sadly I can't link the full pictures. But here are some snippets:



Both are rendered at 1080p.

And with this, I believe the Watch Dogs discussion is officially over. The PC ultra textures are clearly of a higher quality.
 

Mrbob

Member
My 280X has 3GB
So, High textures for me
Who can play on Ultra? 1% of PC players?

Yeah, I wouldn't go above high. I have an R9 290 with 4GB, but I'll probably just play on high. I don't feel the need to max out everything all the time, and I'm sure the game will look great on high anyway.

Me and my mates all have our PCs connected to our TVs now. Comfy couch gaming is awesome.

Indeed. I built a home theater PC this summer and love it.
 
Oh sorry, I didn't know/forgot. Well, I don't feel that is responsible for the difference in textures we see here.

It's difficult to say. Some of the textures are close enough that the difference could easily be attributed to the lower resolution; others are clearly of lower quality. It seems that the game does indeed use a mix of variable-quality textures in the console version. Which honestly makes sense: it would be difficult to tell the difference while sitting at a distance from the TV, whereas playing on a monitor lets you see every little detail.
 

roMonster

Member
Alright.

Ultra-Textures are not available yet, so the PC Gamer Vid only uses "High".

I doubt even the ultra textures will use 6 gigs at 1080p, however - High uses 2.7 gigs, but with downsampling from 2720 x 1700 @ 1920 x 1200. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that - so instead of 1080p I'll get some really weird ones like 1927 x 1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still being at 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be around 3 gigs. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 gigs might just not be enough, making the jump to 6 gigs logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 gigs running this game with everything set to ultra, including textures, but with supersampling disabled.

It also runs very nicely. With everything set to ultra except textures, and downsampling enabled, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably get slightly higher numbers than this with a similarly performing GPU (GTX 780, 780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)

Reading this makes me feel a lot better.
 

UnrealEck

Member
Yep, I knew it. It had to be either 4K (which I guess was disproven by the screenshot of the settings) or supersampling. Guess it's with supersampling.
 

Skyzard

Banned
There are no signs that this will be standard; it's nothing but a cool extra.

I would rather devs add in options that I won't be able to hit for the next 3-4 years than have the game look older sooner.

Also, the only problem here is just from people with spec addiction. "I MUST MAX OUT ALL GAMES!!" That's not even possible anyway, since not all games are ported equally.

A lot of people don't play at 1080p anymore, but at higher resolutions.

These guys aren't trying to max their games all the time.

But you pay £500/$750 for a new graphics card that is capable of powering the game at that resolution, and yet, because Nvidia wants to pace itself with its products, your card can't accommodate the game at the resolution you intended to play at.

That's being gimped. Not by the game company (unless they optimized it really poorly) but by the guys selling the cards, to get you to upgrade again in a few months/next year.
 

SapientWolf

Trucker Sexologist
The tweet doesn't refer specifically to only textures. Obviously there can be a bit of mixing. Even then "high" corresponds more to a mixture of medium, high and ultra, than your ultra does.

But okay, here we go, I bothered to search for comparison pictures between the PS4 and ultra on the PC:

http://www.hardcoregamer.com/2014/05/27/watch-dogs-face-off-pc-vs-ps4/86291/

Sadly I can't link the full pictures. But here are some snippets:



Both are rendered at 1080p. EDIT: the PC version is rendered at 1080p.
Is some of that the lack of anisotropic filtering? Because yeesh.
 