
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

IMACOMPUTA

Member
WE WANT FEWER OPTIONS!
CATER TO THE LOWEST COMMON DENOMINATOR!

Sounds like you wish PC were more like console. Why not just get a console?
 

Dr.Acula

Banned
Explain please, I'm not laughing, I'm very pissed off with this situation.

If you could magic open the gfx sliders on the PS4 they would all be set to "medium." You'll play the game on "high." People with thousand dollar video cards instead of your low/midrange card will run "ultra."

The end.
 

IMACOMPUTA

Member
If you could magic open the gfx sliders on the PS4 they would all be set to "medium." You'll play the game on "high." People with thousand dollar video cards instead of your low/midrange card will run "ultra."

The end.

And this is the beauty of PC gaming. In a couple of years when all gpu's have 6gb+, you can enjoy the game on a higher setting. What is there to be mad at?
 

leng jai

Member
Theoretically any PC dev could make textures as large as they want and make the VRAM requirement as high as they feel like right? Getting angry about an optional texture pack is an utter waste of time.
 

Rubius

Member
Why don't people like the option for elitists to get a better-looking game? If it runs perfectly at High and looks way better than consoles, why is it a problem for less than 1% of players to be able to play the game on Ultra right now, and for everyone else in 3-5 years when 6GB cards are more common?

It's like Crysis: nobody was able to max it except a super small number of people.
 

leng jai

Member
Why don't people like the option for elitists to get a better-looking game? If it runs perfectly at High and looks way better than consoles, why is it a problem for less than 1% of players to be able to play the game on Ultra right now, and for everyone else in 3-5 years when 6GB cards are more common?

It's like Crysis: nobody was able to max it except a super small number of people.

In principle I agree but this isn't really a Crysis situation since I highly doubt the visuals here will come anywhere near to justifying needing 6GB of VRAM. Most people still couldn't max out Crysis 6 years later, Crytek are just nuts.
 
In principle I agree but this isn't really a Crysis situation since I highly doubt the visuals here will come anywhere near to justifying needing 6GB of VRAM. Most people still couldn't max out Crysis 6 years later, Crytek are just nuts.

That says a lot more about the optimization than about the visuals. Although it does hold up quite well.

But you don't need to look at the visuals as a whole. We are talking about an optional texture download that you need VRAM for. The texture resolutions may well need 6 GB of VRAM, and then you have sharper textures. You don't think that's worth it? Well fine, then don't use the optional download.
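To put rough numbers on it, here's some back-of-the-envelope math on texture memory. The per-texel costs are the standard block-compression figures (DXT1/BC1 is 0.5 bytes per texel, DXT5/BC7 is 1 byte, raw RGBA8 is 4 bytes); the resolutions and the 6 GB budget split are just for illustration, not anything from the actual game:

```python
# Rough VRAM cost of one texture at different resolutions.
# A full mip chain adds roughly a third on top of the base level.

MIP_OVERHEAD = 4 / 3  # base level plus all mips ~= 1.33x the base level

def texture_mb(width, height, bytes_per_texel, mips=True):
    """VRAM footprint of one texture, in MB."""
    size = width * height * bytes_per_texel
    if mips:
        size *= MIP_OVERHEAD
    return size / (1024 * 1024)

for res in (1024, 2048, 4096):
    one = texture_mb(res, res, 1)   # 1 byte/texel, e.g. DXT5/BC7
    fits = int(6 * 1024 // one)     # how many fit in a 6 GB budget
    print(f"{res}x{res}: {one:5.1f} MB each, ~{fits} fit in 6 GB")
```

Each resolution step quadruples the cost, so bumping even a subset of textures from 2K to 4K can plausibly turn a 3 GB budget into a 6 GB one.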
 

dr_rus

Member
Yeah I added an edit to that post. Not sure if I'm still missing things though.
The Game24 keynote talk showing the pink texture-compression demo was related to bandwidth savings. Is that right?
Yes. That's one of the new things in Maxwell that allows it to beat GK110 while having a 50% narrower memory bus.

So is the quote from Andy saying "With that you can take a 1.6GB VRAM scene and render it using just 156MB VRAM" related to the amount of memory used? Doesn't this mean there will be savings in memory usage when this is put to use?
Well that depends on what you mean by "savings".

Tiled resources basically mean that you may "load" a texture which is several times bigger than the VRAM you have available. It'll be loaded in "tiles", depending on which ones are needed in each frame, so only parts of the texture reside in VRAM, hence the PRT (partially resident textures) name.

Basically, see id Tech 5 for a software solution of this. Were there a lot of "savings" in id Tech 5 games? I wouldn't say so. And I certainly don't expect any savings from tiled resources in the future, as their primary goal is the ability to use assets which wouldn't fit into VRAM otherwise. Meaning that we're talking about textures that are 4, 8, 16 GB in size.

It is likely that the SoM engine simply preloads a lot of assets into memory to reduce resource streaming during gameplay. Solving that has more to do with developing a proper streaming engine and less with hardware PRT/TR support.
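If it helps, here is a toy Python sketch of the PRT idea: a big virtual texture where only the tiles sampled recently are backed by VRAM. The 64 KB tile size is the actual granularity of D3D tiled resources; the access pattern is made up, and the only real numbers are the 1.6 GB / 156 MB figures from the quoted talk:

```python
# Toy model of partially resident textures (PRT / tiled resources):
# a large "virtual" texture is split into tiles, and only the tiles
# actually sampled recently are backed by VRAM.

TILE_BYTES = 64 * 1024               # D3D tiled resources use 64 KB tiles

class VirtualTexture:
    def __init__(self, virtual_mb, pool_mb):
        self.total_tiles = virtual_mb * 1024 * 1024 // TILE_BYTES
        self.pool_tiles = pool_mb * 1024 * 1024 // TILE_BYTES  # VRAM budget
        self.resident = []           # tile ids, oldest first (simple LRU)

    def touch(self, tile_id):
        """A frame sampled this tile; evict the oldest if over budget."""
        if tile_id in self.resident:
            self.resident.remove(tile_id)   # refresh LRU position
        elif len(self.resident) >= self.pool_tiles:
            self.resident.pop(0)            # evict least recently used
        self.resident.append(tile_id)

    def resident_mb(self):
        return len(self.resident) * TILE_BYTES / (1024 * 1024)

# The 1.6 GB scene / 156 MB pool numbers quoted from the talk:
vt = VirtualTexture(virtual_mb=1600, pool_mb=156)
for tile in range(3000):             # pretend the camera sweeps across tiles
    vt.touch(tile)
print(f"{vt.resident_mb():.0f} MB resident out of a "
      f"{vt.total_tiles * TILE_BYTES // 2**20} MB virtual texture")
```

Which is the point: PRT lets you sample something far bigger than your VRAM; it doesn't automatically shrink what a normal game keeps resident.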
 

Almighty

Member
Yeah, some of you need to take a chill pill and wait for benchmarks to come out before you start freaking out. I like the idea of developers pushing PC hardware to its limits, but there is also the chance that this is an exaggeration and PCs that don't meet the requirements the devs put out can still play on those settings no problem. It wouldn't be the first time a developer has done that.
 
I haven't said that :/

I said that if this tendency continues, that "superiority" at the end of the gen may not exist.
PC is a constantly evolving platform. I can guarantee you the superiority exists now just like it will at the end of the generation. Why would you even think that? It makes no sense. If you're trying to min/max everything, you are doing it wrong. You're looking at one strength of PC gaming (Ultra textures) while completely overlooking others (AA, framerate, downsampling, choice). Point is, if you got into PC gaming to max everything out AND only bought a 770 (which is an amazing card, btw), you obviously didn't care that much, as you would have picked up a Titan and called it a day.
 

Dr.Acula

Banned
Basically, see id Tech 5 for a software solution of this. Were there a lot of "savings" in id Tech 5 games? I wouldn't say so. And I certainly don't expect any savings from tiled resources in the future, as their primary goal is the ability to use assets which wouldn't fit into VRAM otherwise. Meaning that we're talking about textures that are 4, 8, 16 GB in size.

I played RAGE on a 4GB card and it ran great after I modded my .ini files. There was no pop-in or bizarre texture resizing as I moved across the landscape or whipped around. That was very noticeable on a 512 MB card.

But after playing RAGE for a dozen hours, the blurriness, the baked lighting, the low res just started grating on me. It was an interesting experiment, but it seemed really "hacky."

The megatexture concept, where you don't have to tile, works great on the ground and the walls and such. But once you get right up close it looks ugly. I do think there should be a future there; if it scales well, megatextures are amazing. It might be one of those things where it just gets exponentially more demanding, though, I don't know.
 

seph1roth

Member
PC is a constantly evolving platform. I can guarantee you the superiority exists now just like it will at the end of the generation. Why would you even think that? It makes no sense. If you're trying to min/max everything, you are doing it wrong. You're looking at one strength of PC gaming (Ultra textures) while completely overlooking others (AA, framerate, downsampling, choice). Point is, if you got into PC gaming to max everything out AND only bought a 770 (which is an amazing card, btw), you obviously didn't care that much, as you would have picked up a Titan and called it a day.

I must say, I have no idea about specs and configuration... so I recognize I'm talking without really knowing.

And by Christmas, I will buy a GTX 970 and give this card to my brother as a gift.

Thanks for the response ;)
 

UnrealEck

Member
I haven't said that :/

I said that if this tendency continues, that "superiority" at the end of the gen may not exist.

You said that you bought a 4GB GTX 770.
You then said you think things like this make you sad and that you are wary of poor optimisation continuing to happen and that your GTX 770 won't last until the end of this gen.
Then you gave examples of three games currently out which run fine (perhaps not exceptionally, but fine nonetheless) on your card compared to their console counterparts.
Lastly you said that people are lying when they're saying a GTX 770 4GB will run games better than a PS4.

I'm wondering what data you're using to conclude that a PS4 is going to overtake a GTX 770 4GB in terms of gaming performance.
I'm not saying your position is wrong. I can't, because I can't say for certain. It's highly unlikely, though, that a PS4's performance will overtake a GTX 770's in games. I'm wondering if you can convince me with the data you've used to convince yourself.
 

Skyzard

Banned
That says a lot more about the optimization than about the visuals. Although it does hold up quite well.

But you don't need to look at the visuals as a whole. We are talking about an optional texture download that you need VRAM for. The texture resolutions may well need 6 GB of VRAM, and then you have sharper textures. You don't think that's worth it? Well fine, then don't use the optional download.

The High texture setting is 3GB at 1080p.

And 1080p is the lowest native resolution of most monitors these days.
 
I must say, I have no idea about specs and configuration... so I recognize I'm talking without really knowing.

And by Christmas, I will buy a GTX 970 and give this card to my brother as a gift.

Thanks for the response ;)

It's nothing to be embarrassed about. It's a common misconception if you're (relatively) new to PC gaming or someone sold you on it the wrong way.

In terms of an ideal performance/price sweet spot, you'd buy a x60/x70 GPU every two to three years (four would be stretching it). These cards aren't supposed to last an entire generation like your one-time console purchase. Sure, it's more expensive than console gaming, but as you saw when you built your rig, you can easily spend over $1000 on a laptop that your gaming desktop blows out of the water.
 

nbthedude

Member
It's nothing to be embarrassed about. It's a common misconception if you're (relatively) new to PC gaming or someone sold you on it the wrong way.

In terms of an ideal performance/price sweet spot, you'd buy a x60/x70 GPU every two to three years (four would be stretching it). These cards aren't supposed to last an entire generation like your one-time console purchase. Sure, it's more expensive than console gaming, but as you saw when you built your rig, you can easily spend over $1000 on a laptop that your gaming desktop blows out of the water.

You forgot one important step. Sell your old card every 3-4 years and pay $150 + sold price of old card for upgraded card. That's what I do. In terms of cost it ends up being pretty much what you'd spend on Xbox Live or PSN on consoles in that same time period but you get a kick ass new toy that boosts your game performance instead of paying a ransom to play in a closed off sandbox. :p
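Rough numbers on that cycle, purely as an illustration (every price here is an assumption, not a quote):

```python
# Cost per year of the "sell the old card, buy the new one" cycle
# vs. a console online subscription. All prices are assumptions.

card_price = 330   # assumed price of a new x70-class card
resale = 180       # assumed resale value of the old card after ~3 years
cycle_years = 3

gpu_per_year = (card_price - resale) / cycle_years
sub_per_year = 50  # assumed PSN/XBL yearly subscription

print(f"GPU upgrade cycle: ${gpu_per_year:.0f}/yr vs subscription: ${sub_per_year}/yr")
```

With those (made-up) numbers the two come out at about $50 a year each.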
 

DarkoMaledictus

Tier Whore
Well, didn't Sleeping Dogs also have an optional texture DLC? I still remember the 290X with 4 gigs running like crap... also, the game didn't look that much different.

Don't worry and just play the game on High.
 

Deleted member 17706

Unconfirmed Member
Explain please, I'm not laughing, I'm very pissed off with this situation.

The answer is: you can.

Just because you can't max out settings does not mean you aren't running better than the current gen consoles. Take the settings as close to the console version as you can and you are guaranteed far superior performance unless you got a bunk port (and even then, probably).
 

DrPreston

Member
The answer is: you can.

Just because you can't max out settings does not mean you aren't running better than the current gen consoles. Take the settings as close to the console version as you can and you are guaranteed far superior performance unless you got a bunk port (and even then, probably).

The consoles have a lot more memory available for textures than most high end video cards do right now
 

Teremap

Banned
The consoles have a lot more memory available for textures than most high end video cards do right now
No, they don't.

The PS4 has 5.5GB that is addressable by game developers. IIRC Killzone: Shadow Fall used 3.5GB of it as VRAM (NOT all for textures, don't know the exact proportion), which is far from being more than "most high end video cards" (lol).
 

Eusis

Member
I can't believe that the just released GTX 980 is already obsolete. This is absurd.
The same exact thing happened in 2004 with Doom 3 having a graphical mode that was best with 512 MB VRAM. In that regard I wouldn't worry too much.

Personally though, this is why I feel it's better to get a console first, then upgrade a PC later. The console will be a reliable baseline if it turns out a game won't run well on your PC; later on you can upgrade your PC (or get a new one, whichever) and destroy consoles entirely, outside of some terrible ports. When we get video cards that laugh off Ultra here, I think we'll be fully in the clear.

EDIT: I'm actually more concerned about LOW being the setting recommended for 1 GB of VRAM. Not being able to rock Ultra with a few-years-old computer that's in the same ballpark as the consoles? No problem. Being unable to run above LOW? Uhhhh, well, shit.
 

DarkoMaledictus

Tier Whore
No, they don't.

The PS4 has 5.5GB that is addressable by game developers. IIRC Killzone: Shadow Fall used 3.5GB of it as VRAM (NOT all for textures, don't know the exact proportion), which is far from being more than "most high end video cards" (lol).

Well, arguably it is more than most video cards. Not very long ago the norm was 2 gigs or 3 gigs of VRAM; just recently, newer cards are coming out with 4 gigs. Pretty sure most people have 2 gigs, or maybe 3. Not everyone buys the latest cards; it's a very niche market.
 
You forgot one important step. Sell your old card every 3-4 years and pay $150 + sold price of old card for upgraded card. That's what I do. In terms of cost it ends up being pretty much what you'd spend on Xbox Live or PSN on consoles in that same time period but you get a kick ass new toy that boosts your game performance instead of paying a ransom to play in a closed off sandbox. :p

Oh, yeah, of course. lol
 

Trickster

Member
If you could magic open the gfx sliders on the PS4 they would all be set to "medium." You'll play the game on "high." People with thousand dollar video cards instead of your low/midrange card will run "ultra."

The end.

Shadow of Mordor on PS4 runs the equivalent of "high".

Watch Dogs runs a mix of medium/high/ultra, from what I understand.

I imagine you would be right if you were talking about PS3/360 games, though.
 

Teremap

Banned
Well, arguably it is more than most video cards. Not very long ago the norm was 2 gigs or 3 gigs of VRAM; just recently, newer cards are coming out with 4 gigs. Pretty sure most people have 2 gigs, or maybe 3. Not everyone buys the latest cards; it's a very niche market.
Most cards in general? Yes. But the poster I replied to specifically said most high end video cards, and the high-end video cards cleared the 2GB barrier some time ago (and I would argue that anyone who bought a 2GB card did so under bad advice).

I, personally, have two GTX 670s with 4GB of VRAM each, and I've had them for two years running now. I'm pretty well set for the moment.
 

SparkTR

Member
Shadow of Mordor on PS4 runs the equivalent of "high".

Watch Dogs runs a mix of medium/high/ultra, from what I understand.

I imagine you would be right if you were talking about PS3/360 games, though.

During the last years of the PS3/360 generation it was normal for multiplatform games to run at below minimum settings compared to the PC versions. I'm fully expecting this to run with a mix of low/med/high settings on PS4, like with Metro Redux.
 

Kieli

Member
The same exact thing happened in 2004 with Doom 3 having a graphical mode that was best with 512 MB VRAM. In that regard I wouldn't worry too much.

Personally though, this is why I feel it's better to get a console first, then upgrade a PC later. The console will be a reliable baseline if it turns out a game won't run well on your PC; later on you can upgrade your PC (or get a new one, whichever) and destroy consoles entirely, outside of some terrible ports. When we get video cards that laugh off Ultra here, I think we'll be fully in the clear.

The issue that frustrates me at the moment is that the GTX970 and GTX980 are more than amply powerful cards. I have no doubts they can run whatever the current-gen consoles can throw at them (barring exceptions such as Naughty God secret sauce, but companies like that are rare).

So I can understand to some extent why people are getting irate at nVidia because it's not the power that's the issue, it's the VRAM. That's something that can be easily remedied, if only they weren't so stingy with the VRAM. If I have to pay a premium, so be it.

I shouldn't be shelling out $400+ for an instantly obsolete card when it is actually not obsolete at all (in the power sense) just because nVidia skimped on VRAM.

Edit: I guess what I'm trying to say is that if the GTX 970 weren't powerful enough and didn't have enough VRAM (but especially the former) to play current-gen games, I'd be darned excited. It would mean GPU manufacturers are receiving the dearly needed kick to their behinds to begin moving PC gaming forward.

But it's not. At all. What we're seeing is a McLaren 1000+ HP engine placed into a Volkswagen Beetle to race on a 45 mph track...
 

BONKERS

Member
No, they don't.

The PS4 has 5.5GB that is addressable by game developers. IIRC Killzone: Shadow Fall used 3.5GB of it as VRAM (NOT all for textures, don't know the exact proportion), which is far from being more than "most high end video cards" (lol).

Exactly. Developers have to share that 5.5GB between the stuff that is normally kept in VRAM and everything else the game has to do.

Here's Killzone's breakdown for the demo:

[image: Killzone Shadow Fall memory budget breakdown]

~1,300 MB just for non-streaming textures.
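To make the shared-pool point concrete, here's an illustrative split of the 5.5GB. Only the non-streaming texture figure echoes the breakdown above; every other line is a made-up placeholder:

```python
# Illustrative split of a 5.5 GB unified console pool. Only the
# non-streaming texture figure (~1300 MB) echoes the Killzone
# breakdown above; the rest are placeholders.

budget_mb = {
    "non-streaming textures": 1300,
    "render targets / buffers": 800,    # placeholder
    "meshes / streaming pool": 900,     # placeholder
    "game logic / AI / audio": 1500,    # placeholder
    "misc / headroom": 1000,            # placeholder
}

total = sum(budget_mb.values())
print(f"total: {total} MB of {int(5.5 * 1024)} MB addressable")
for name, mb in sorted(budget_mb.items(), key=lambda kv: -kv[1]):
    print(f"  {name:25s} {mb:5d} MB")
```

Point being, "memory available for textures" is a lot smaller than the headline number once the whole game has to live in the same pool.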
 

dreamfall

Member
For those of you on 2GB cards who also have a PS4/XBO, knowing that we'll be running Medium texture settings, are you buying this on PC, or on console?

I'm conflicted.
 

Eusis

Member
The issue that frustrates me at the moment is that the GTX970 and GTX980 are more than amply powerful cards. I have no doubts they can run whatever the current-gen consoles can throw at them (barring exceptions such as Naughty God secret sauce, but companies like that are rare).

So I can understand to some extent why people are getting irate at nVidia because it's not the power that's the issue, it's the VRAM. That's something that can be easily remedied, if only they weren't so stingy with the VRAM. If I have to pay a premium, so be it.

I shouldn't be shelling out $400+ for an instantly obsolete card when it is actually not obsolete at all (in the power sense) just because nVidia skimped on VRAM.
I admittedly can't disagree with that (holding back on RAM is frustrating in every fucking thing, I swear, whether it's smartphones, consoles, or computer parts), and my angle there was mostly about when the consoles had just launched. We're about a year past that now, so we're reaching the point where the gains over consoles are enormous, similar to when the 8800 GT launched relative to the PS3. Hopefully, in another year it will be in another league entirely.
 

Kieli

Member
I'm also a little confused on something.

Right now, both consoles have a maximum of 5.5 GB of shared RAM to play with (barring further OS patches freeing up more RAM).

However, this RAM is shared, isn't it? Why does a game like The Evil Within need a minimum of 4GB of VRAM when that would mean the PS4/Xbone would only have 1.5GB of RAM left for everything else?
 

Teremap

Banned
I'm also a little confused on something.

Right now, both consoles have a maximum of 5.5 GB of shared RAM to play with (barring further OS patches freeing up more RAM).

However, this RAM is shared, isn't it? Why does a game like The Evil Within need a minimum of 4GB of VRAM when that would mean the PS4/Xbone would only have 1.5GB of RAM left for everything else?
False premise: The Evil Within does not require 4GB of VRAM minimum; that was the recommended spec (likely for maxing the game out entirely). The actual minimum requirements are much lower than that.
 
Seeing the posts here and in the Evil Within thread just shows how many people have gotten into PC gaming recently. The baseline has moved, just like in the past. You're not playing hi-res 360 games anymore.
 

Odrion

Banned
~*don't decide to build a computer right before the next generation happens*~
Seeing the posts here and in the Evil Within thread just shows how many people have gotten into PC gaming recently. The baseline has moved, just like in the past. You're not playing hi-res 360 games anymore.
bu-bu-but it was suppose to be different this time! these current consoles were only mid-range in power!!
 

DarkFlow

Banned
This is the beginning of the new gen.

I bought a PC because I'm very pissed off with the console companies, and things like this make me sad... I have a 4770K and a 770 4GB, and now I think that, thanks to the poor optimization from developers, I probably won't make it to the end of the gen. This is not an exception, this is the rule by now: first Wolfenstein, then Watch Dogs, Dead Rising 3, The Evil Within, this game... and more in the future.

And even more, I thought that with my PC I could run all the games better than the PS4 and One... lies... lies everywhere.

What's wrong with Wolfenstein? My 780 handles that game like a champ.
 