
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

kevm3

Member
PC exclusive Crysis came out, ultra settings required a $2500 rig, and everybody praised Crytek for making something that beautiful and a benchmark for years to come.

Mordor comes around, ultra-level textures require a top-end GFX card, and suddenly it's not optimized and Monolith should be ashamed.

Last generation, PC gamers complained about being left behind in the multi-plat dev cycle: game settings were held to the inferior settings of the consoles, leaving most PCs running those games without flexing a muscle.

I, I'm so confused right now.

Crysis was optimized for PC and blew everything out of the water for its time. Mordor looks cross-gennish and doesn't even look better than Battlefield 4 or Crysis 3, which take less resources to max out. The fact that a $700 graphics card (780ti) can't max it out has some people wondering what exactly is going on.
 

FoneBone

Member
You'd probably barely notice them here too. The option is just there for people who want it. Like the "uber" texture settings in Witcher 2.

Turn them down and enjoy a great looking game with great performance. All these very high end bells-and-whistles settings really do two things:

1) assure a game's place in benchmarks for years to come

2) make it cool to revisit the game in a decade and crank everything at 4K.

They are neat "future proof" settings and not much more.

Yeah, I don't understand why people are upset about this. The game seems to be pretty scalable going by the released specs - it's not an Evil Within situation.
 

Kinthalis

Banned
People, running this game at Ultra everything and "high" textures >>>> what's going to be on a PS4.

Jeebus, what are people so upset about. Let's wait for performance benchmarks before we pull out the torches and pitchforks.
 
This is starting to feel like Doom 3's Ultra setting


Yeah, and did we have a 17-page thread about it in 2004? I don't think so. I have the feeling the discourse has changed over the years, and not for the best.
 
I'm not sure what you mean. My MSI 880M laptop runs Witcher 2 just fine on ultra. What I'm talking about is this new shift to large VRAM pools, which most people argued wouldn't happen until 2015. I'm not expecting my 780 3-gig to run at 4K; I'm not asking it to. I'm asking that my card, in a PC with much beefier specs, run a cross-platform game at its highest settings. It's not a crazy idea.

My point is that ultra settings aren't always planned for current hardware. I'd rather people get over their mental block of the currently available hardware not being able to handle the highest settings on a game rather than having developers strip away the option, depriving me of the ability to go back and play a game after a few years with a new card and playing it on higher settings. If you want to bring cross platform comparisons, I'd suggest waiting until the Digital Foundry article.

I'm also really doubting your laptop can run Witcher 2 with ubersampling aka "Highest settings" at a steady 60.
 
No game in history has required 6GB of VRAM for ultra textures at 1080p (unless you overloaded your Skyrim with mods). So either they did a really shitty job of porting this over, or it's complete BS.

Guess we'll have to wait and see.
 

UnrealEck

Member
For (too many) years, the argument has been made that consoles are holding the PC back. Now, with the current gen, we see a significant bump in quality...and people complain about that?

I don't think the consoles are the reason for the 6GB VRAM requirement to accommodate very high quality textures. The 6GB requirement is there to give PC gamers something more to spend their memory budgets on, should they have that much.
 

brobban

Member
No game in history has required 6GB of VRAM for ultra textures at 1080p (unless you overloaded your Skyrim with mods). So either they did a really shitty job of porting this over, or it's complete BS.

Guess we'll have to wait and see.

Or you know, they have really high res textures..
 

Gojeran

Member
Don't care much about ultra textures, but can you run high textures at 1440p with 4GB of VRAM? Apparently 3GB at 1440p is a no-go (not surprising, really).

Edit: I thought the game was out today because I'm so used to release-day review embargoes... didn't realize it isn't even out till Tuesday.
 

QaaQer

Member
No game in history has required 6GB of VRAM for ultra textures at 1080p (unless you overloaded your Skyrim with mods). So either they did a really shitty job of porting this over, or it's complete BS.

Guess we'll have to wait and see.

Yeah, they are lying or really really lazy/bad at their job. Why not both?
 

Kinthalis

Banned
No game in history has required 6GB of VRAM for ultra textures at 1080p (unless you overloaded your Skyrim with mods). So either they did a really shitty job of porting this over, or it's complete BS.

Guess we'll have to wait and see.

It's very hard to think of a way they could do that when it comes to VRAM. Performance? Sure. Issues with caching, OK. But just straight up using double the VRAM?
 

nbthedude

Member
No game in history has required 6GB of VRAM for ultra textures at 1080p (unless you overloaded your Skyrim with mods). So either they did a really shitty job of porting this over, or it's complete BS.

Guess we'll have to wait and see.
Either they did a shitty job or the option is bullshit... That sounds like a reasonable set of alternatives.
 
Sad to see people who post in the PC screenshot thread crying about better texture options. Super sad that people would rather not have this option just because their PC can't handle it.

I'm judging you guys right now, smh.
 

KaiserBecks

Member
a bump for who? consoles maybe, but the only thing that's changed for PC gamers is more unoptimized garbage; we haven't really seen anything push the envelope for PCs

Because there aren't that many multi platform next gen (well, current gen) games out yet.
 

scitek

Member
Things like this excite me because it's something I can't run right now, but will be able to in the future. It's like playing with a new toy.
 
Have they made any mention of texture resolution? Said they use little tiling or something? Shown any screenshots?

We've had super high res textures and photorealistic texture mods in games for ages using <1.5GB vram

So people saying "hurdur games require more resources, no longer held back by old consoles, why do you complain" really aren't getting where the scepticism comes from.

Titanfall had ugly as shit low res textures yet the game still required >2GB vram, clearly the vram requirement did not result in graphical fidelity.

Watch dogs was similar, unremarkable texture quality yet insane vram requirements.


So unless Shadow of Mordor's low textures already look on par with Mirror's Edge/Project Cars textures, medium looks as good or better than modded Skyrim textures, and high or ultra looks like something we've never seen before, people will wonder what the deal is...

More vram and higher requirements are supposed to translate into better graphics.

If this game is the next Crysis/Doom 3/Quake 3 of its time then great, no one will mind the requirements.
This is what PC gamers want, btw: games made for future hardware that can justify the system requirements with future-gen graphics.

If it's another Titanfall then they can stick it up their butts.

Personally, I'm cynical and expect it will be another Titanfall (lack of trust in the developer, especially considering the amount of hot air they've been blowing about their game), and I haven't forgotten all the inflated requirements other devs have boasted about (and lied about :p) just for publicity (CoD Ghosts' requirements).


Sad to see people who post in the PC screenshot thread crying about better texture options. Super sad that people would rather not have this option just because their PC can't handle it.

I'm judging you guys right now, smh.
More strawmen and lack of reading comprehension


tldr: http://accordingtoathena.com/wp-content/uploads/2012/11/receipts.gif

My money is on shadow of mordor low textures looking like bf3 low textures despite using 4x the vram
Why? because I'm a cynic
 

heyf00L

Member
This is a good thing!

It seems the controversy is over two misconceptions:

  1. PC Ultra setting = PS4 setting. Yeah, right. If they could compress these textures to fit in the 4.5 GB that the PS4 has, they would have for PC too. The PS4 doesn't have secret-sauce texture compression (no, unified RAM doesn't do this).
  2. The PC version is unoptimized. No, it just means they have optional super high res textures. And why not? Most game art is made at high resolution and then downscaled to save RAM. Why not release the original art for the people with the hardware to use it? More games should (and will) do this. 6 GB of VRAM will be standard in 3 years.
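Some back-of-the-envelope math backs this up. The sketch below (Python; the sizes and formats are illustrative assumptions, not anything published for this game) shows why shipping the original high-res art is so VRAM-hungry, and why dropping one resolution step cuts the cost to a quarter:

```python
# Rough VRAM math for uncompressed vs block-compressed textures.
# All numbers are illustrative, not taken from the game.

def texture_bytes(width, height, bytes_per_pixel, mip_chain=True):
    """Approximate size of a texture; a full mip chain adds ~1/3 on top."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mip_chain else int(base)

MB = 1024 * 1024

# A 4096x4096 RGBA8 source texture, uncompressed, with mips:
print(texture_bytes(4096, 4096, 4) / MB)    # ~85.3 MB
# The same texture block-compressed at 0.5 bytes/pixel (BC1-class):
print(texture_bytes(4096, 4096, 0.5) / MB)  # ~10.7 MB
# Downscaled to 2048x2048: a quarter of that again.
print(texture_bytes(2048, 2048, 0.5) / MB)  # ~2.7 MB
```

Point being: a few dozen uncompressed 4K source textures add up to gigabytes fast, while the downscaled, compressed versions most players actually see are comparatively tiny.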
 

derExperte

Member
There's more to it than just VRAM though; these new cards are leagues ahead of what's in the consoles. Let's wait for benchmarks before freaking out.

Also the 970/980s have better texture compression/management than older cards so maaaybeee they'll be able to use the 6GB option without problems. We'll know in less than a week.
 

mnannola

Member
Do people really think they will get a better experience on PS4 than if they have a 770 2GB card?

We need some comparisons between medium PC settings and PS4 stat!
 
Crysis was optimized for PC and blew everything out of the water for its time. Mordor looks cross-gennish and doesn't even look better than Battlefield 4 or Crysis 3, which take less resources to max out. The fact that a $700 graphics card (780ti) can't max it out has some people wondering what exactly is going on.

$700 wat, the card costs less than $500. And people are making a lot of assumptions on this game without seeing the game with Ultra textures in motion.
 
Or you know, they have really high res textures..

Yeeaaah... they'd have to be the highest resolution textures I've ever seen O_O

A shitty/lazy port job seems to be the likeliest explanation. I'm going in with that expectation. I'd like them to prove me wrong but... I'm not holding my breath.
 

Gbraga

Member
Crysis was optimized for PC and blew everything out of the water for its time. Mordor looks cross-gennish and doesn't even look better than Battlefield 4 or Crysis 3, which take less resources to max out. The fact that a $700 graphics card (780ti) can't max it out has some people wondering what exactly is going on.

If the $700 graphics card had 6GB of VRAM, it would easily max it out, I believe. So it's not really a fair comparison. If the $330 970 had 6GB it would also easily max it out. The Ultra Texture Settings just requires more VRAM, not more powerful GPUs.

Now, if your $700 card can't max it out aside from texture quality, then we might have something to talk about.
 

Stallion Free

Cock Encumbered
Do people really think they will get a better experience on PS4 than if they have a 770 2GB card?

We need some comparisons between medium PC settings and PS4 stat!
They will get a better experience because they won't have to deal with not being able to use ultra textures.
 
I'm not planning on getting the game but I'm not even mad, even though I ordered a 970. About time games start to go all out with that fancy stuff. I just wonder how big those "Ultra" textures are.
 

Stallion Free

Cock Encumbered
Yeeaaah... they'd have to be the highest resolution textures I've ever seen O_O

A shitty/lazy port job seems to be the likeliest explanation. I'm going in with that expectation. I'd like them to prove me wrong but... I'm not holding my breath.
...or they were simply over-generous with the requirement numbers.
 

ClearData

Member
A difference of an additional 2 GB from what I can get in a GTX 970 or 980 has laid bare all of my PC insecurities.

My machine has been found lacking? At 1080p settings?

No!

*weeps
 

Gbraga

Member
Yeeaaah... they'd have to be the highest resolution textures I've ever seen O_O

A shitty/lazy port job seems to be the likeliest explanation. I'm going in with that expectation. I'd like them to prove me wrong but... I'm not holding my breath.

Why not just a shitty Ultra Texture Quality setting then instead of the entire port? Judging from the requirements and amount of options (plus the fact that it's Monolith), the port seems to actually be well above average.
 

forrest

formerly nacire
Actually, what they did is not more work, it's less work.

Basically, the straightforward workflow for every developer is to create textures without limitations and scale them down afterwards.

A good way to know the best size for a texture is to analyze how many texels of it will be visible on screen. But that's a good way, not the best way. The best way is to combine that info with how many seconds those texels are visible over a complete run of the game. That info could be gathered with automatic tools, but I can't find a developer that takes it into account.

I can give a few examples using Watch Dogs. That game uses big textures for ad panels that sit a few meters above the streets. You can't reach those panels, so in normal gameplay you never see half the texels of those textures at 1080p. But the game uses the same resolution for grass textures that are used everywhere, that you look at constantly, and that on top of that get stretched over big zones of terrain, so the texel-to-pixel ratio is horrible. Same for roads: a road texture covering 20 meters of terrain has the same resolution as a newspaper decal on one street corner.

Why does that happen?

Doing a good job of balancing texture sizes is a lot of work, and moreover you need the memory requirements beforehand to start the calculation. You'd have to work from the premises above: an ordered list from the most visible texture to the least, accounting for how many texels each one needs at a fixed screen resolution.

As difficult as that task is, it's easier on a closed platform. Watch Dogs on PS4 uses a clever mix of ultra/high/medium textures (not a perfect mix, btw). On PC they took the fast path: we have 1Kx1K textures for the environment and 4Kx4K textures for characters, so ultra texture quality is every texture at max resolution, high is every texture at half, and normal at a quarter. So the fucking dirty leaves on the streets end up a blurry mess, and they don't care that the player will be looking at that texture half the game.

So, if the Mordor game has a few 4K textures for important assets in the console versions, I guess ultra quality on PC means every fucking texture is maxed; they don't care that most of them aren't really necessary.
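The texel-budget idea described above can be sketched in a few lines. This is a hypothetical toy, not how any particular engine works; the field names, numbers, and the greedy strategy are all invented for illustration. Rank textures by on-screen coverage times visible time, then halve the least-visible ones first until the set fits the memory budget:

```python
# Toy texel-budget allocator (all data and heuristics are invented).
# Halving a texture's resolution quarters its memory footprint, so we
# shrink the least-seen textures first until the budget is met.

def shrink_to_budget(assets, budget_mb, floor=256):
    # Least (coverage x visible-time) shrinks first.
    assets = sorted(assets, key=lambda a: a["coverage"] * a["seconds_visible"])
    total = lambda: sum(a["size_mb"] for a in assets)
    for a in assets:
        while a["res"] > floor and total() > budget_mb:
            a["res"] //= 2       # halve resolution...
            a["size_mb"] /= 4    # ...quarter the memory
    return assets

demo = shrink_to_budget([
    # Rarely seen up close, briefly visible:
    {"name": "ad_panel", "res": 4096, "size_mb": 80,
     "coverage": 0.02, "seconds_visible": 60},
    # On screen almost constantly, fills much of the frame:
    {"name": "road", "res": 4096, "size_mb": 80,
     "coverage": 0.5, "seconds_visible": 3600},
], budget_mb=100)

for a in demo:
    print(a["name"], a["res"], a["size_mb"])
# ad_panel 2048 20.0  <- the rarely-seen panel was downscaled
# road 4096 80        <- the ever-present road keeps full resolution
```

On the invented Watch Dogs-style data, the rarely-seen ad panel drops to 2048 while the ever-present road keeps full resolution, which is exactly the balance the quoted post says the "fast path" uniform scaling throws away.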


Quoting for all of the Chicken Littles on the new page.
 

heyf00L

Member
Yeeaaah... they'd have to be the highest resolution textures I've ever seen O_O

A shitty\lazy port job seems to be the likeliest explanation. I'm going in with that expectation. I'd like them to prove me wrong but... I'm not holding my breath.

I bet they will be pretty high res, but I'm guessing it's also lazy. The texture pack will be something like the original source textures, unoptimized. What I mean by unoptimized is including a 4096x4096 texture for a pebble.

Optimization is a process of cutting out and down things that aren't important. Ever optimized an animated gif? It's about seeing which frames you can drop and how few colors you can use without a big loss in quality. I'm guessing this texture pack is like saying "Whatever, give them 256 color gifs, 50 fps, full screen."

You'll still be able to use the stock, optimized textures.

All this complaining is nonsense.
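For what it's worth, the "stock vs. ultra" split usually comes down to mip levels. A minimal sketch of the idea (the tier names and skip counts are assumptions for illustration): each texture ships with a full mip chain, and a quality setting simply never loads the top N levels:

```python
# Sketch: quality tiers as "skip the top N mip levels".
# Tier names and skip counts are made up for illustration.

def mip_chain(base_res):
    """Resolutions of a square texture's mip chain, down to 1x1."""
    mips = []
    r = base_res
    while r >= 1:
        mips.append(r)
        r //= 2
    return mips

def loaded_mips(base_res, skip):
    """A quality tier loads the chain minus the top `skip` levels."""
    return mip_chain(base_res)[skip:]

print(loaded_mips(4096, 0)[:3])  # ultra:  [4096, 2048, 1024]
print(loaded_mips(4096, 1)[:3])  # high:   [2048, 1024, 512]
print(loaded_mips(4096, 2)[:3])  # medium: [1024, 512, 256]
```

So "high" can literally be "ultra" with the largest mip of each texture left on disk, and each skipped level cuts that texture's VRAM cost to roughly a quarter.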
 

Kinthalis

Banned
Do people really think they will get a better experience on PS4 than if they have a 770 2GB card?

We need some comparisons between medium PC settings and PS4 stat!

People are weird and stupid sometimes. A 770 is going to run the game better and with more effects than a PS4.

The only question is will it match the texture quality?

I would think it will be close enough, and personally, I'll take a slight hit on texture resolution over no AF, like most console games.

Blech.
 
I just bought a 970 4 GB and I have no problem with this Ultra setting. Means I can go back in a couple of years with a beefier card and a 4K monitor and crank that baby up. One of my favorite things about PC gaming is that with each upgrade, it's like your backlog is new again!

That said, I would love to see comparison screens at some point. =)
Yup. I'm downsampling the shit out of my back catalog right now.
 

Fractal

Banned
You're talking like they're crowing about these ultra textures from the rooftops, when in reality, the first (and only) thing we know about them has come from this screenshot of the options screen.

This is a bonus for high-end and future PC gamers, and there are very few developers that would put this kind of option in their game. And honestly, with the way people have reacted to this, I doubt any developers will do it in the future; best just to relabel 'high' detail as 'ultra' and make everyone happy.
Ah, you're right, I'm sorry, guess I got a bit caught up in the "hype" and didn't pay much attention to the OP!
But still, I stand behind my statement. Since these textures clearly aim at the enthusiast segment, I think it would've been respectful of the devs to show us some actual comparisons and explanations. Fact of the matter is, as of now, I haven't seen a single piece of footage that justifies such a hefty VRAM requirement. Now, if this really turns out to be a real hardware-pushing feature and makes a strong, noticeable improvement to the look of the game, I'll be very happy, and will gladly replay the game a few years down the line on a stronger PC, like I did with Crysis. But right now, I don't think that's likely. In any case, I hope we get some official word about this...
 

pixlexic

Banned
It was a comment about WD running only High Textures on console proving that PC does not need more than 2gb.



Yeah, 'cause DF never makes mistakes. Fact is, I have the game, posted shots, and looked at both versions, and PS4 has textures that look like Ultra with no visible difference in key areas. Whether you believe it or not is moot, and we are off topic, so that's the last I will say on it, but this post explains it perfectly.






Great input to the discussion. Have any thoughts yourself, or do you just Google .gifs all day?


Umm, your quote proves that they couldn't use the high res textures of the 3-gig ultra setting on PC.

The consoles will never use 6 gigs of VRAM for textures in a single scene; the GPUs just can't handle it. Even with mixed textures, WD is still 900p on PS4 with much lower mip levels.

The point, again, is that PC games on ultra settings do not equal the console version.
 