
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

Didn't a CoD recently say the same shit? Not saying it's not true, but I wanna see benchmarks @ 1080p with 780 Tis, 980s, and 290Xs before we toss all the current cards in the garbage.
 

Cyriades

Member
Didn't a CoD recently say the same shit? Not saying it's not true, but I wanna see benchmarks @ 1080p with 780 Tis, 980s, and 290Xs before we toss all the current cards in the garbage.

I think a lot of people are suckers for running out and getting 980/970s when they are only marginally better than a 780/780 Ti: http://www.anandtech.com/bench/product/1072?vs=1351

I'm more disappointed in the gimped VRAM. Now that consoles have more RAM, ports are asking for over 4GB on PCs.
 
I think a lot of people are suckers for running out and getting 980/970s when they are only marginally better than a 780/780 Ti

I'm more disappointed in the gimped VRAM. Now that consoles have more RAM, ports are asking for over 4GB on PCs.

If I'm not mistaken, the real issue is sloppy ports, since the PS4/Xbox One are both APUs and effectively only have VRAM. On PCs (most of them, anyway) we are dealing with both VRAM and system RAM.
 

Cyriades

Member
If I'm not mistaken, the real issue is sloppy ports, since the PS4/Xbox One are both APUs and effectively only have VRAM. On PCs (most of them, anyway) we are dealing with both VRAM and system RAM.

At any rate PC GPUs need more VRAM.

How can these card manufacturers speak of 4K with only 4GB of VRAM... turn AA off?
 
There are 280Xs and 780s with 6GB costing less than $400.

example http://www.newegg.com/Product/Product.aspx?Item=N82E16814127786

As you can see price isn't the issue here.

So what's the deal with this card?

This has 6GB of VRAM at about $375, which seems like a great deal.

Meanwhile, the GTX 970 has 4GB at $350.

And how about the GTX 980, which still only has 4GB of VRAM but is priced at $550?


I just built my PC in May, and right now I have a Radeon R7 260X, which has 2GB of VRAM. But I am considering an upgrade, potentially after I get my tax refund. Just trying to get an understanding of what I should be looking for.
 
So what's the deal with this card?

This has 6GB of VRAM at about $375, which seems like a great deal.

Meanwhile, the GTX 970 has 4GB at $350.

And how about the GTX 980, which still only has 4GB of VRAM but is priced at $550?


I just built my PC in May, and right now I have a Radeon R7 260X, which has 2GB of VRAM. But I am considering an upgrade, potentially after I get my tax refund. Just trying to get an understanding of what I should be looking for.
VRAM isn't all that big a factor until you start playing above 1080p, and even then, 4GB is enough. A lot of people are doubting the requirements for max settings in Shadow of Mordor, and rightly so.
 

hengyu

Member
There has been no PC port of a console game that has needed more than a 2GB card to run the same textures.


Watch Dogs.
https://www.youtube.com/watch?v=KRveD-kzuME

And chances are 3GB+ will be necessary for the same texture quality as consoles. It's just the way it is; the common denominator has been raised.

To be fair though, Watch Dogs is a donkey arse port and ran unbelievably poorly even on a card (GTX 780) that it was a bloody gift for...
 
I didn't say that :/

I said that if this trend continues, that "superiority" may not exist at the end of the gen.

It will exist. The PS4 will never catch up to your 770 no matter how many years go by, and your graphics card's performance will not deteriorate over time. What will happen, and this it seems is actually the case with Shadow of Mordor, is that developers will take advantage of newer and better hardware and push graphical quality far above the console versions. I repeat: the PS4 will never, ever, ever "catch up" to your graphics card. Anyone telling you otherwise is lying to you.
 

DarkFlow

Banned
Reading this thread has been fun. You can definitely tell who has never had to make a card last for a while by turning down settings, and just buys new ones every year.
 

Renekton

Member
It will exist. The PS4 will never catch up to your 770 no matter how many years go by, and your graphics card's performance will not deteriorate over time. What will happen, and this it seems is actually the case with Shadow of Mordor, is that developers will take advantage of newer and better hardware and push graphical quality far above the console versions. I repeat: the PS4 will never, ever, ever "catch up" to your graphics card. Anyone telling you otherwise is lying to you.
Unless it's 770 2GB :D oops
 

Mohonky

Member
Seeing posts here and in the Evil Within thread just shows how many people have gotten into PC gaming recently. The baseline has been moved, just like in the past. You're not playing hi-res 360 games anymore.

The problem I am having is that the games don't exactly look amazing. If they did, I would be more inclined to believe the big hoo-ha about needing so much VRAM.

VRAM was never really even a selling point for video cards; lower-tier cards just threw more of it in to make themselves more attractive, but having large amounts of VRAM was never actually a thing until very recently. Especially 4GB and 6GB, I mean, there is what, a single card out there with 6GB? Even the new cards coming don't have that. I wouldn't be surprised if even Nvidia wasn't expecting that sort of requirement. It's just nuts, especially when everything else about many GPUs with less RAM will wipe the floor with their console counterparts.
 

Eusis

Member
It will exist. The PS4 will never catch up to your 770 no matter how many years go by, and your graphics card's performance will not deteriorate over time. What will happen, and this it seems is actually the case with Shadow of Mordor, is that developers will take advantage of newer and better hardware and push graphical quality far above the console versions. I repeat: the PS4 will never, ever, ever "catch up" to your graphics card. Anyone telling you otherwise is lying to you.
What you're REALLY buying, performance-wise, with consoles is the peace of mind at an affordable price that whatever you put in will work, and generally at a playable level (though that can falter at both the start and the end of a generation, with rushed titles or those with minimal experience with the hardware early on and those pushing too hard at the end). It's why I felt at console launch it's better to choose the console over a new PC (waiting on BOTH is probably best, though), because you give it a little time and you can curbstomp a console very nicely without blowing TOO much money in about a year (hey there, 8800 GT and 970!) and for pretty cheap two or more years after (I recall the 9800 GT was a budget rebrand/tweak of the 8800 GT that hit in about 2008).

Never mind that, as games like Watch Dogs and Thief highlight, you can get some oddness where in theory your computer should perform just as well if not destroy consoles entirely, but in practice performance is uneven or the game just has weird issues that simply don't exist in the console version due to shoddy porting. That kind of stuff can at least partially be powered through with hardware that is leagues ahead; it's a shame to not be able to use Ultra textures for Shadow of Mordor with a 970, but you should still be able to readily smash the console versions otherwise. Hell, barring the game being dumb and going 'uhhh, you don't have the VRAM, I won't work, sorry', you can PROBABLY enable it and just have a bit of hitching every so often, if you feel that's a price worth paying for the texture quality. At least that was usually the case when overshooting the VRAM by a not-too-crazy amount.
 

Seanspeed

Banned
So what's the deal with this card?

This has 6GB of VRAM at about $375, which seems like a great deal.

Meanwhile, the GTX 970 has 4GB at $350.
That card will allow you to play with this optional Ultra texture setting, but it will probably be about 20% slower than a 970.

Not even close to a great deal.
 
How many textures can you display on screen at once in a single scene, and what resolution do they have to be before they won't look visibly better on a 1080p display? And how much memory would that take?

I know this question isn't really related to the topic, but I find it an interesting thought nonetheless... I don't really expect answers to it either, as I can't imagine someone would really know... maybe on a mathematical level.
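
For what it's worth, here's a rough back-of-the-envelope sketch (my own numbers, not anything from this thread or the game): it assumes BC1/DXT1 block compression at 0.5 bytes per texel and a full mip chain adding roughly a third on top of the base level, which is only one of many formats games actually use.

# Ballpark VRAM cost of block-compressed textures with full mip chains.
# Assumptions (illustrative): BC1/DXT1 at 0.5 bytes per texel; the mip
# chain costs about 4/3 of the base level in total.
def texture_mib(size, bytes_per_texel=0.5, mip_factor=4/3):
    return size * size * bytes_per_texel * mip_factor / (1024 ** 2)

for size in (1024, 2048, 4096):
    per_tex = texture_mib(size)
    print(f"{size}x{size}: ~{per_tex:.1f} MiB each, "
          f"~{int(4096 / per_tex)} fit in 4 GiB, "
          f"~{int(6144 / per_tex)} fit in 6 GiB")

# A 1080p frame is only ~2.07 million pixels, so extra texel density
# mostly pays off when the camera is close enough that one surface
# fills a big chunk of the screen.

By that math a 4K x 4K BC1 texture is only about 10-11 MiB, so a few hundred of them fit in 4GB; the big memory cost comes from keeping lots of them resident at full resolution, plus uncompressed formats, render targets, geometry and so on.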
 

BONKERS

Member
With consoles though, everything is pared back FOR YOU. By the developers. They don't make any easy way to do that for PCs because there are so god damned many different hardware combinations.

The problem I am having is that the games don't exactly look amazing. If they did, I would be more inclined to believe the big hoo-ha about needing so much VRAM.

And this is the inherent problem: the VRAM recommendation for TEW is ridiculous, but a given since it runs on shittech5 (quantity of variation vs. quality of what's there).


But we haven't actually seen anything of these new textures for Mordor. What we have seen are the textures in the screenshots released so far.
 

BONKERS

Member
How many textures can you display on screen at once in a single scene, and what resolution do they have to be before they won't look visibly better on a 1080p display? And how much memory would that take?

I know this question isn't really related to the topic, but I find it an interesting thought nonetheless... I don't really expect answers to it either, as I can't imagine someone would really know... maybe on a mathematical level.

At any marginal distance you'd run into problems with aliasing obscuring detail, at only ~2 million pixels, before the extra resolution makes a significant enough difference, except up close.
Which is the point, I'd have to imagine: they wouldn't use MIP 1 (1 being the 4K x 4K base texture) until you get close to begin with. Which is somewhat where I have to imagine that requirement is coming from.

Streaming in all the MIPs for a given texture at a given distance as the player travels.
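
A minimal sketch of that streaming idea (illustrative only, not any engine's actual code; it uses the common convention where mip 0 is the full-resolution level): estimate how many screen pixels a surface covers at the player's current distance, keep resident the coarsest mip that still has at least that many texels across, and only stream in the sharper mips as the player gets close.

import math

# Toy mip-streaming heuristic. A surface `world_size` metres tall, seen at
# `distance` metres through a camera with vertical FOV `fov_deg`, covers
# roughly `pixels_covered` pixels on a `screen_h`-pixel-tall display.
def resident_mip(base_res, world_size, distance, fov_deg=60.0, screen_h=1080):
    visible_height = 2 * distance * math.tan(math.radians(fov_deg) / 2)
    pixels_covered = max(world_size / visible_height * screen_h, 1.0)
    # mip 0 is the full base_res level; each mip halves the resolution
    mip = max(0, math.floor(math.log2(base_res / pixels_covered)))
    return min(mip, int(math.log2(base_res)))

for d in (2, 10, 50):
    m = resident_mip(4096, world_size=2.0, distance=d)
    print(f"at {d} m keep mip {m} ({4096 >> m}x{4096 >> m}) and coarser")

The finest mip level is by far the biggest (mip 0 alone is about three quarters of the whole chain's memory), which is why streaming only the mips you need for the current distance saves so much VRAM compared to keeping every 4K base level resident.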
 
Unless it's 770 2GB :D oops

I don't think you understand what catching up is supposed to mean in this context. We've known the XB1 and PS4 memory configurations for some time now, even before the 7-series graphics cards launched. The 770 had a memory disadvantage right from the start, while it has an advantage in everything else. It was always quite possible that a 770 owner would have to maybe drop texture quality a bit while using overall higher quality settings than the PS4. It was a fact then that the 770 2GB lacks in the VRAM department and is superior in everything else, it is a fact now, and it will be a fact five years from now. Nothing will change. The 770 won't magically grow more VRAM modules and the PS4 won't magically gain more teraflops of computing power.

What you're REALLY buying, performance-wise, with consoles is the peace of mind at an affordable price that whatever you put in will work, and generally at a playable level (though that can falter at both the start and the end of a generation, with rushed titles or those with minimal experience with the hardware early on and those pushing too hard at the end). It's why I felt at console launch it's better to choose the console over a new PC (waiting on BOTH is probably best, though), because you give it a little time and you can curbstomp a console very nicely without blowing TOO much money in about a year (hey there, 8800 GT and 970!) and for pretty cheap two or more years after (I recall the 9800 GT was a budget rebrand/tweak of the 8800 GT that hit in about 2008).

Never mind that, as games like Watch Dogs and Thief highlight, you can get some oddness where in theory your computer should perform just as well if not destroy consoles entirely, but in practice performance is uneven or the game just has weird issues that simply don't exist in the console version due to shoddy porting. That kind of stuff can at least partially be powered through with hardware that is leagues ahead; it's a shame to not be able to use Ultra textures for Shadow of Mordor with a 970, but you should still be able to readily smash the console versions otherwise. Hell, barring the game being dumb and going 'uhhh, you don't have the VRAM, I won't work, sorry', you can PROBABLY enable it and just have a bit of hitching every so often, if you feel that's a price worth paying for the texture quality. At least that was usually the case when overshooting the VRAM by a not-too-crazy amount.

I agree with almost everything you said except the "buy a console at launch, then a PC later" part. It was true during earlier console launches, but it is not true anymore. Not when there is affordable hardware out there that can handily outperform next-gen consoles even in poorly optimised games like Watch Dogs.
 

coastel

Member
I don't think you understand what catching up is supposed to mean in this context. We've known the XB1 and PS4 memory configurations for some time now, even before the 7-series graphics cards launched. The 770 had a memory disadvantage right from the start, while it has an advantage in everything else. It was always quite possible that a 770 owner would have to maybe drop texture quality a bit while using overall higher quality settings than the PS4. It was a fact then that the 770 2GB lacks in the VRAM department and is superior in everything else, it is a fact now, and it will be a fact five years from now. Nothing will change. The 770 won't magically grow more VRAM modules and the PS4 won't magically gain more teraflops of computing power.



I agree with almost everything you said except the "buy a console at launch, then a PC later" part. It was true during earlier console launches, but it is not true anymore. Not when there is affordable hardware out there that can handily outperform next-gen consoles even in poorly optimised games like Watch Dogs.

Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.
 

Seanspeed

Banned
Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.
I don't understand this response at all. I'm reading what you responded to and there's nothing of any of what you're saying there.
 
Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.



Eh... he never said that. I don't see any console hating in his post, man. He's stating that a GPU has more computational power than a PS4, which is true.
 

BONKERS

Member
Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.

wat-wat.jpg
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
VRAM isn't all that big a factor until you start playing above 1080p, and even then, 4GB is enough.

Folks said that about 2GB video cards.

To future-proof yourselves, folks really need to think about getting 6GB video cards if they don't want to compromise on texture quality with next-gen console ports.

I'm going to wait until the consumer version of the Oculus Rift comes out, to find out what the recommended GPU requirements will be and whether Nvidia has consumer-price-friendly models with at least 6GB of VRAM, before I buy my next GPU.
 

Nokterian

Member
I will hit the auto-config button and start working from there. In a lot of games I like to push that button and just fiddle with the other options to see what works and what doesn't.
 
As much crying as there has been (yes, some of you are literally sulking at the keyboard), this is good advertising for the game.

There may be a few PC gamers now refusing to buy the game because their card apparently can't run it at max settings flawlessly, but there will be more people who had no intention of buying now doing so, because it will be a benchmark and a new way to test their rigs.
 
Folks said that about 2GB video cards.

To future-proof yourselves, folks really need to think about getting 6GB video cards if they don't want to compromise on texture quality with next-gen console ports.

I'm going to wait until the consumer version of the Oculus Rift comes out, to find out what the recommended GPU requirements will be and whether Nvidia has consumer-price-friendly models with at least 6GB of VRAM, before I buy my next GPU.

I will do the same, but I'm hoping there will be an 8GB-or-more (lol) card next year. It also has to be reasonably priced (whatever that means) and geared for SLI VR; from what I've read about VR, it will up the requirements again. I'm really trying to get as close as possible to 4K stereoscopic 3D at 90Hz... shit will be expensive, I fear, but I'm willing to shell out for this.
 
As much crying as there has been (yes, some of you are literally sulking at the keyboard), this is good advertising for the game.

There may be a few PC gamers now refusing to buy the game because their card apparently can't run it at max settings flawlessly, but there will be more people who had no intention of buying now doing so, because it will be a benchmark and a new way to test their rigs.

Both sides seem dumb to me. Who the hell is going to test their rig with a game that doesn't look that great but has a feature that they recommend a lot of VRAM for?

OOOOOH YES GOTTA PUSH MY VRAM TO THE LIMIT.
 

Seanspeed

Banned
Both sides seem dumb to me. Who the hell is going to test their rig with a game that doesn't look that great but has a feature that they recommend a lot of VRAM for?

OOOOOH YES GOTTA PUSH MY VRAM TO THE LIMIT.
For real.

It's like the actual end result of what it looks like isn't even that important. It's about how hard you're pushing components, how many boxes you've checked, how many sliders you've cranked, etc., etc.

If we get some screenshots and these Ultra textures are like jaw-dropping compared to High textures, then maybe I'll feel a bit of envy, but I expect they'll be a very marginal improvement at best.

People can hold out and then spend more money so they can turn up the texture slider one notch if they want to, but eh, I'm not fretting about it.
 

R_Deckard

Member
Both sides seem dumb to me. Who the hell is going to test their rig with a game that doesn't look that great but has a feature that they recommend a lot of VRAM for?

OOOOOH YES GOTTA PUSH MY VRAM TO THE LIMIT.

Come on, this is PC gamers here; the first thing when a new card launches and comes out is running 3DMark and the like and comparing artificial scores.

I'm long past this stage and simply buy games and play them based on my own choices. This thread just proves the number of "new to PC" gamers here struggling with the fact that you need to upgrade hardware more often than you'd think if you want to stay with the curve.

Add in that buying just before or after a console launch is NEVER a good thing; hardware always lags behind, and this time (clear from the launch) VRAM was always going to be the issue, and as more of it gets handed to games later in the cycle, 4GB may still lack a little.

So many meltdowns, it's crazy!
 

Sentenza

Member
Unless it's 770 2GB :D oops
A "mere" 2GB 770 will still run circles around a XB1 or PS4.
I'm starting to doubt some of you people even know what you are talking about at this point.

Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.
wat_gif_by_ryla_sehn.gif
 
This is worse imo. 680s and 770s are kill

Not building High settings around 2GB cards is the right choice. It's too restrictive, and games shouldn't be held back by a combination of Nvidia's stubbornness and gamers making bad purchase decisions.

High settings shouldn't have worse textures than the console version and that's essentially what you're asking for if you want developers to "optimise" high settings around a 2GB baseline.

As soon as the PS4 was announced with 8GB GDDR5 it was obvious 2GB GPUs were DOA.
 
Come on, this is PC gamers here; the first thing when a new card launches and comes out is running 3DMark and the like and comparing artificial scores.

I'm long past this stage and simply buy games and play them based on my own choices. This thread just proves the number of "new to PC" gamers here struggling with the fact that you need to upgrade hardware more often than you'd think if you want to stay with the curve.

Add in that buying just before or after a console launch is NEVER a good thing; hardware always lags behind, and this time (clear from the launch) VRAM was always going to be the issue, and as more of it gets handed to games later in the cycle, 4GB may still lack a little.

So many meltdowns, it's crazy!

This response, that other response, this entire thread.

It just leaves me here going:
tumblr_lyxx3af2wz1r50g7wo1_500_large.gif
 

Stahsky

A passionate embrace, a beautiful memory lingers.
I'm rolling with a 770 2G, but have been considering going for the 970. Not just for this, but also for Witcher later down the road. Is this a good idea, or should I just stick it out and wait a while?
 

Denton

Member
I'm rolling with a 770 2G, but have been considering going for the 970. Not just for this, but also for Witcher later down the road. Is this a good idea, or should I just stick it out and wait a while?

I'd upgrade after The Witcher 3 is out, to see if it will really be necessary. It's what I plan to do with my 280X. In the meantime it handles everything perfectly anyway.
 
I'm rolling with a 770 2G, but have been considering going for the 970. Not just for this, but also for Witcher later down the road. Is this a good idea, or should I just stick it out and wait a while?

You'll need to upgrade at some point soon if you prefer to play on higher settings. The 970 is a decent jumping-off point, but as ever, the best time to upgrade a GPU is when you absolutely need to. If you're happy with the performance you currently have, then you won't lose anything by waiting.
 
Wrong, buy a PS4 at launch and never buy a PC... see how arrogant and stupid that sounds? You miss the point of consoles if you think it's about power, you got it wrong, which you have because you're a PC elitist so it's wasted explaining it to you. Oh, and for all that power there are not many games that look better overall than the first wave of next-gen games. I really like PC, some of the best games I played are on there, but you gotta chill with this console hating nonsense.
LOL
 