
Shadow of Mordor offers Ultra texture optional download, recommends 6GB VRAM @ 1080p

Sentenza

Member
I'm rolling with a 770 2G, but have been considering going for the 970. Not just for this, but also for Witcher later down the road. Is this a good idea, or should I just stick it out and wait a while?
Here are a few things a lot of people in this thread are failing to realize:

- The increase in VRAM requirements for upcoming games will not be "to match the console version", but to go beyond that. If matching textures on a PS4/XB1 is your standard of doing fine, your 2GB card will last you a while.

- "Maxing out" a game is not mandatory, and texture resolution is not necessarily the most meaningful metric for visual quality (i.e. if vanilla textures are good enough, you would probably care more about 120 fps than "ultra" textures).

- They keep saying "I told you so, you needed it from the beginning" and they don't even realize that the facts are proving them wrong.

"You will totally need that 4GB card now or you will regret it". Guess what? I bought a 2GB GPU *deliberately*, knowing it was just a stepping stone I would upgrade fairly early (once again, because I want to, not because I need to), and by the time I actually need and buy a 4GB GPU (I was considering a GTX 970 for February or later) I will have paid LESS for both purchases combined than I would have paid for the 4GB GPU back then.
 
The issue that frustrates me at the moment is that the GTX970 and GTX980 are more than amply powerful cards. I have no doubts they can run whatever the current-gen consoles can throw at them (barring exceptions such as Naughty God secret sauce, but companies like that are rare).

So I can understand to some extent why people are getting irate at nVidia because it's not the power that's the issue, it's the VRAM. That's something that can be easily remedied, if only they weren't so stingy with the VRAM. If I have to pay a premium, so be it.

I shouldn't be shelling out $400+ for an instantly obsolete card when it is actually not obsolete at all (in the power sense) just because nVidia skimped on VRAM.

Edit: I guess what I'm trying to say is that if the GTX 970 weren't powerful enough and didn't have enough VRAM (but especially the former) to play current-gen games, I'd be darned excited. It would mean GPU manufacturers are receiving the dearly needed kick in the behind to begin moving PC gaming forward.

But it's not. At all. What we're seeing is a McLaren 1000+ HP engine placed into a Volkswagen Beetle to race on a 45 mph track...

Nvidia rightly deserve a great deal of shit for being stingy with VRAM in the past couple generations but the 970 isn't the card to level that complaint at.

4GB is a good compromise to hit that level of performance at the $330 price point. For a single GPU setup aimed at 1080p it really is the sweet spot right now.

As long as 8GB cards turn up at $400 in due course for those who want to investigate 4K and SLI, I can have no complaints. The card only just launched, though, and they're having enough trouble meeting demand as it is. Holding off the eventual 8GB card for a couple of months is reasonable enough.
 

blaidd

Banned
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High".

I doubt even the ultra textures will use 6 GB at 1080p, however. High uses 2.7 GB, but with downsampling from 2720x1700 to a 1920x1200 output. The reason I didn't try 1080p is that Mordor recognizes my downsampling resolution as native and will only use percentages of that, so instead of 1080p I'll get some really weird ones like 1927x1187 or something like that (and I can't be arsed to fix it right now).

There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still outputting 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be at around 3 GB. So the recommendation seems plausible.

So with ultra textures taking somewhat more VRAM and supersampling enabled, 4 GB might just not be enough, making the jump to 6 GB logical. Remember: with supersampling you're effectively running 4K, not 1080p. Coming from there, I'd wager you'll be absolutely fine with 3 GB running this game with everything set to ultra, including textures, but supersampling disabled.
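A rough back-of-the-envelope sketch of why the resolution scale drives VRAM use (illustrative buffer counts and formats only, not measurements from the game):

```python
# Minimal sketch: how a resolution-scale slider changes the internal render
# resolution and (roughly) the memory taken by screen-sized render targets.
# The buffer count and bytes-per-pixel below are illustrative assumptions,
# not Mordor's actual render pipeline.

def internal_resolution(width, height, scale_percent):
    """Internal render resolution for a given output res and scale slider."""
    return int(width * scale_percent / 100), int(height * scale_percent / 100)

def render_target_mib(width, height, targets=6, bytes_per_pixel=8):
    """Very rough memory for 'targets' screen-sized buffers (MiB)."""
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

for scale in (100, 150, 200):
    w, h = internal_resolution(1920, 1080, scale)
    print(f"{scale:3d}% of 1080p -> {w}x{h}, "
          f"~{render_target_mib(w, h):.0f} MiB in screen-sized buffers")

# 200% of 1080p is 3840x2160, the same pixel count as native 4K, which is
# why the VRAM numbers behave like a 4K run even on a 1080p display.
```

Texture memory, by contrast, is largely set by the texture quality setting rather than by the resolution scale, which is why the ultra texture pack stacks on top of whatever the scale already costs.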

It also runs very nicely. With everything set to ultra except textures, and downsampling enabled, I get an average framerate of just over 60 with some minor drops to ~50. Setting this to 4K, I still get 45+ with an R9 290X. This is an Nvidia-sponsored game, so GeForce users will probably land at least slightly higher than this with a similarly performing GPU (GTX 780/780 Ti, Titan, GTX 970).
I'll test that more in-depth at some point next week, so we'll see.

So much for "not optimized".

So don't get your panties in a bunch ;-)
 
With everything set to ultra except textures, I get an average framerate of just over 60 with some minor drops to ~50.

I'm asking because I've been meaning to ask for ages, not picking on you.

So - why is it considered to be a fail if a 1080/60fps PS4 (not PC) title turns out to be 60 with "some minor drops to ~50 fps" ? I remember the absolute fascination with whether TLOU was going to have dips below 60 or not (turned out it really was 60, to the second deviation or better).
 
Here are a few things a lot of people in this thread are failing to realize:

- The increase in VRAM requirements for upcoming games will not be "to match the console version", but to go beyond that. If matching textures on a PS4/XB1 is your standard of doing fine, your 2GB card will last you a while.

2GB cards will suffer from worse textures than their console counterparts as this generation progresses. The fact that a launch title initially designed around the 4GB spec already dedicated 3GB of memory to "VRAM" (and over 2GB to texture assets alone) is proof enough of that.

4GB is what you want to comfortably match console textures at 1080p. Not 2GB. You will be dropping below console spec on memory-intensive settings on a 2GB card, and as console spec is usually a good parallel for PC minimum (as it should be), it's a matter of time before you drop below minimum spec altogether.

Waiting on an upgrade until absolutely necessary is always the best approach but giving owners of the 770 2GB a false hope at this point is just delaying the inevitable. They were always destined to sink, and the sooner we all accept this fate the sooner we can all move on.
 

Seanspeed

Banned
I'm asking because I've been meaning to ask for ages, not picking on you.

So - why is it considered to be a fail if a 1080/60fps PS4 (not PC) title turns out to be 60 with "some minor drops to ~50 fps" ? I remember the absolute fascination with whether TLOU was going to have dips below 60 or not (turned out it really was 60, to the second deviation or better).
Ask the people who said it was a failure.

Some people are picky about framerate consistency. I find 60 with dips to the 50s OK, but not ideal.
 

Sentenza

Member
2GB cards will suffer from worse textures than their console counterparts as this generation progresses.
It's not been true so far, I'm not sure what kind of voodoo you are expecting to make it true in the future.
Do you think consoles will come out with some new magic ways to reproduce textures that won't work on PC?
 

Seanspeed

Banned
read his post history, I'm pretty sure a console killed his dog. And nice intelligent responses.
Regardless of his post history, that response was completely unwarranted and was just you lashing out for no reason. That was a fair post by alexandros.

I can see from your post history that you seem to have a chip on your shoulder when it comes to PC gamers. I gave you the benefit of the doubt once after you assured me this wasn't the case, but it's very hard to do anymore.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
A "mere" 2GB 770 will still run circles around a XB1 or PS4.

In terms of pushing geometry and fill rate, yes. But a XB1/PS4 game that isn't pushing geometry that hard but chose to use lots of large texture will look better then a 2GB 770.
 

R_Deckard

Member
Here are a few things a lot of people in this thread are failing to realize: [...] by the time I actually need and buy a 4GB GPU (I was considering a GTX 970 for February or later) I will have paid LESS for both purchases combined than I would have paid for the 4GB GPU back then.

Are you trying to convince yourself here, or others?

The meltdowns continue...
 
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)

Thanks!
 

blaidd

Banned
I'm asking because I've been meaning to ask for ages, not picking on you.

So - why is it considered to be a fail if a 1080/60fps PS4 (not PC) title turns out to be 60 with "some minor drops to ~50 fps" ? I remember the absolute fascination with whether TLOU was going to have dips below 60 or not (turned out it really was 60, to the second deviation or better).

It's something mystical with strange numbers. Seriously: it's hype. All because with the new consoles it's become (somewhat) political. There might be people out there who can feel the difference in slight drops from 60 fps to, say, 55, but not the normal gamer. I can tell 45 from 60 and 80 from 60, but only if these are the average, not the minimum fps. The fascination with the PS4 dropping from 60 fps to somewhat lower is because often enough the Xbox One won't do 60, and some pissed-off people who in all probability don't even have a clue what's going on can point their fingers and say "Sony is cheating!!" Which gets the Sony guys pissed off and attacking back (all the while they probably don't have a clue either). There are always some drops. There is no constant 60 or 30 fps. It's the average (probably slightly rounded up to look nicer).

Regarding Mordor:
I don't consider that a fail. Quite the opposite: it's actually quite impressive with downsampling enabled. You could roughly add a third of that to get the 1080p figure, as long as the CPU isn't bottlenecking. So around 90 fps at 1080p, with minor drops to ~67. You can't really complain about that.

This is just an extremely rough estimate, so don't nail me on these numbers, but that's what I'd expect. I also expect that you can run the game with everything (including textures) set to ultra on a 2GB GPU. You might experience some slight stuttering here and there, but I'm pretty sure it won't be major or game- or immersion-breaking if you're not extremely sensitive.
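For what it's worth, the arithmetic behind that kind of estimate looks like this (a minimal sketch; perfect pixel-count scaling is an assumption real games rarely meet):

```python
# Sketch of the pixel-count arithmetic behind projecting 1080p performance
# from a 2720x1700 measurement. Real games rarely scale in proportion to
# pixel count (CPU limits, fixed per-frame costs), which is why the estimate
# above stays well below the naive GPU-bound projection.

measured_w, measured_h = 2720, 1700
measured_avg, measured_min = 60, 50
target_w, target_h = 1920, 1080

pixel_ratio = (measured_w * measured_h) / (target_w * target_h)
print(f"2720x1700 has {pixel_ratio:.2f}x the pixels of 1080p")          # ~2.23x

# Upper bound if performance scaled perfectly with pixel count:
print(f"Naive 1080p projection: ~{measured_avg * pixel_ratio:.0f} avg, "
      f"~{measured_min * pixel_ratio:.0f} min fps")
```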
 

MaLDo

Member
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)


Thank you. Have you seen hitches/pauses at loading zones? The PC gameplay trailer had a lot of them.
 

BONKERS

Member
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)

Repost for visibility.


But specifically
There is, however, an in-game supersampling option, so you can set the internal rendering to 200% and effectively get 4K resolution while still outputting 1080p. If I used that coming from 1080p instead of 1700p, I'd wager you'd be at around 3 GB. So the recommendation seems plausible.

This depends on the quality of the built-in supersampling. Straight OGSSAA by itself isn't very good, nor very efficient (unless combined with other solutions). And there is also the possibility that the technique they use completely bypasses things like render buffers that use fractional resolutions in order to make it faster.

I.e.: say a random effect is rendered at 768x432, a fractional 0.4x0.4 of 1920x1080. If the supersampling bypasses it, this effect would stay at the same resolution, which also saves additional render time. But with driver-based downsampling (or even DSR), if you played at 3840x2160 this effect would become 1536x864 instead, thus increasing quality.
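A quick sketch of that buffer-size arithmetic (the 0.4 scale factor and the effect buffer are the hypothetical example above, not a real engine setting):

```python
# Sketch: a fractional-resolution effect buffer under two supersampling paths.
# In-game resolution scaling that bypasses such buffers leaves them at the
# size derived from the output resolution; driver downsampling/DSR scales the
# whole pipeline, so the buffer grows with the internal resolution.

def effect_buffer(width, height, scale=0.4):
    """Size of a hypothetical effect buffer rendered at a fraction of the frame."""
    return int(width * scale), int(height * scale)

output_res = (1920, 1080)          # what the display gets
internal_4k = (3840, 2160)         # 200% scale or DSR/driver downsampling

print("Bypassed by in-game SS:", effect_buffer(*output_res))   # (768, 432)
print("Driver downsampling/DSR:", effect_buffer(*internal_4k)) # (1536, 864)
```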


However, knowing Monolith and their past with SSAA in games like Condemned and FEAR, I have a small margin of hope that it's as good as the SSAA in those games.

4xFSAA in Condemned is comparable to 4xSGSSAA, and even better performance-wise.
 

Sentenza

Member
In terms of pushing geometry and fill rate, yes. But a XB1/PS4 game that isn't pushing geometry that hard but chose to use lots of large texture will look better then a 2GB 770.
Well, so far that isn't happening.
There isn't a single multiplatform game I've played so far on this 770 that forced me to scale detail lower than on a PS4.

Are you trying to convince yourself here, or others?

The meltdowns continue...
Look, don't bother trying to play this childish "meltdown" bullshit with me.
It's a waste of time.
Keep telling everyone that the PS4 will use the Ultra textures so you can comfort yourself in your own delusion, but leave me out of this crap.
 
More than you it seems with "lol"
Fine, junior, I'll explain it to you.

You have completely missed the point of alexandros's post, and given your ill-formatted, pointless rant, I doubt you even read it.

He was merely explaining how a 770, and PCs in general down the line, perform compared to a PS4. He was not telling people not to buy one.
 

R_Deckard

Member
Look, don't bother trying to play this childish "meltdown" bullshit with me.
It's a waste of time.
Keep telling everyone that the PS4 will use the Ultra textures so you can comfort yourself in your own delusion, but leave me out of this crap.

Seriously man, you need to chill out. This hate of the PS4 is not healthy. I am sure the game will look great on PC and PS4; having played the PS4 version myself, I know it looks great and runs at a good ~60fps. I care little as to how much actual "RAM" is used for the textures, but whatever floats your boat I guess!
 
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)

Thanks! Maybe this should be in the OP, or is it too late now? :p
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Well, so far that isn't happening.
There isn't a single multiplatform game I've played so far on this 770 that forced me to scale detail lower than on a PS4.

Which is interesting. I think as we move further into the next generation and devs get used to the large pool of memory available to them, that will no longer be the case.
On the other hand, I really don't see why we would need a GPU with more than 4GB of VRAM if we choose to game at 1080p. That should be enough for the entire generation if folks are happy gaming with the same texture resolution as the XB1/PS4. Of course, some devs might choose to target PC ports higher.
 

Bl@de

Member
It's not been true so far, I'm not sure what kind of voodoo you are expecting to make it true in the future.
Do you think consoles will come out with some new magic ways to reproduce textures that won't work on PC?

Of course. You forgot the power of the almighty Cloud in the Sky. Oh, save us from it, Lord Gaben.
 

Sentenza

Member
Seriously man, you need to chill out.
And you need an actual argument, instead of keeping up this childish game of rebuttals.
This hate of the PS4 is not healthy
I don't have any "hate" of a PS4 and the mere fact that you perceive some factual statement as such is very telling of your "brand cheerleader" mindset, if anything.

I am sure the game will look great on PC and PS4; having played the PS4 version myself, I know it looks great and runs at a good ~60fps. I care little as to how much actual "RAM" is used for the textures, but whatever floats your boat I guess!
I'm... very happy for you? But I'm not sure why that should be relevant.
 
Seriously man, you need to chill out. This hate of the PS4 is not healthy. I am sure the game will look great on PC and PS4; having played the PS4 version myself, I know it looks great and runs at a good ~60fps. I care little as to how much actual "RAM" is used for the textures, but whatever floats your boat I guess!

I don't think there is any evidence of PS4 hate. It's just that this whole VRAM fear-mongering has spiralled completely out of control, mainly because of people who very obviously don't understand it.
 

Fractal

Banned
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)
Ah, finally a real report from someone who had the opportunity to try the game out for himself. What you said falls in line with my expectations, despite some of my previous posts in this thread which were more about a hypothetical worst case scenario. Since I have no plans to use SSAA here, I think I'll be okay on my 780 Ti.
 
Look, don't bother trying to play this "meltdown" childish bullshit with me.
It's a waste of time.
Keep telling everyone that the PS4 will use the Ultra textures so you can comfort yourself in your own delusion, but let me out of this crap.

No one (but you, it seems) is suggesting that the PS4 will be using ultra textures, but it won't be a great surprise if they're equivalent to high. High textures that, according to both the developer and a user with the game, probably won't run on a 2GB card without hiccups.
 

Kezen

Banned
Alright.

Ultra textures are not available yet, so the PC Gamer vid only uses "High". [...] So don't get your panties in a bunch ;-)

Thanks a lot for this.
 
Well, so far that isn't happening.
There isn't a single multiplatform game I've played so far on this 770 that forced me to scale detail lower than on a PS4.
Tuco do you have a 770 2GB? I forget.


While I never would have recommended a 2GB card to anyone (in fact, about 3 years ago, as the rumors about next-gen consoles were coming out, I told a friend that his 670 would perhaps need more VRAM as time went on), I find it hard to give devs the benefit of the doubt to make a good streaming solution.

Games have shown that good streaming produces high-quality textures; it's just that some devs fail to implement it. While I do imagine that on average 2GB will yield similar texel quality to the console counterparts, if a dev just throws the game onto PC without any streaming at all, or with a poor streamer (Titanfall, Watch Dogs), then seriously, expect the worst.
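For anyone wondering what a "streaming solution" actually does, here is a minimal sketch of a generic budget-based mip streamer (not any particular engine's implementation; the names, sizes and distance heuristic are made up):

```python
# Generic sketch of budget-based texture streaming: keep high-resolution mips
# for nearby textures and drop to lower mips when the VRAM budget is exceeded.
# Names, sizes and the distance thresholds are illustrative only.

from dataclasses import dataclass

@dataclass
class StreamedTexture:
    name: str
    distance: float          # rough distance from the camera
    mip_sizes_mb: tuple      # size of each mip level, largest first

    def desired_mip(self):
        # Farther objects can live with lower-resolution mips.
        if self.distance < 10:
            return 0
        if self.distance < 40:
            return 1
        return 2

def stream(textures, budget_mb):
    """Pick a mip per texture, coarsening the farthest textures to fit the budget."""
    chosen = {t.name: t.desired_mip() for t in textures}

    def used_mb():
        return sum(t.mip_sizes_mb[chosen[t.name]] for t in textures)

    # Evict detail from the farthest textures first until we fit the budget.
    for t in sorted(textures, key=lambda tex: -tex.distance):
        while used_mb() > budget_mb and chosen[t.name] < len(t.mip_sizes_mb) - 1:
            chosen[t.name] += 1
    return chosen, used_mb()

textures = [
    StreamedTexture("orc_armor", 5.0, (32, 8, 2)),
    StreamedTexture("cliff_face", 25.0, (64, 16, 4)),
    StreamedTexture("far_tower", 120.0, (64, 16, 4)),
]
print(stream(textures, budget_mb=50))
```

The point is that resident texture detail is bounded by the budget rather than by the raw asset set, which is why a well-implemented streamer can keep a smaller card closer to console texture quality than raw asset sizes suggest.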
 

blaidd

Banned
Repost for visibility. [...] This depends on the quality of the built-in supersampling. [...] 4xFSAA in Condemned is comparable to 4xSGSSAA, and even better performance-wise.

Yeah, this is just an in-game resolution slider, like in Battlefield 4 for example. Or downsampling (upsampling if you choose a lower-than-native res), or OGSSAA if you prefer. There seems to be some post-FX AA going on as well, so even though OGSSAA is not perfect, it looks rather smooth.

With deferred rendering, implementing SGSSAA doesn't seem worth the extra work; implementing OGSSAA should be extremely simple compared to that.

Also: OGSSAA will hit everything. Brutal and inefficient, but it will get everything. SGSSAA? Not necessarily. I always hated seeing a very nice, smoothed-out picture with just one or two edges still flickering violently. So there are some advantages to DS/OGSSAA.
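For reference, a bare-bones illustration of what an ordered-grid supersample resolve amounts to (a toy box filter assuming a clean 2x scale, not the game's actual resolve pass):

```python
# Toy ordered-grid supersampling resolve: render at 2x in each axis, then
# average each 2x2 block down to one output pixel. Every pixel of every
# effect goes through this averaging, which is why OGSSAA smooths everything.

import numpy as np

def box_downsample(frame, factor=2):
    """Average factor x factor blocks of a (H, W, 3) image."""
    h, w, c = frame.shape
    return frame.reshape(h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

# A fake 2160p frame resolved down to 1080p:
hi_res = np.random.rand(2160, 3840, 3).astype(np.float32)
lo_res = box_downsample(hi_res)
print(lo_res.shape)   # (1080, 1920, 3)
```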
 
I'm asking because I've been meaning to ask for ages, not picking on you.

So - why is it considered to be a fail if a 1080/60fps PS4 (not PC) title turns out to be 60 with "some minor drops to ~50 fps" ? I remember the absolute fascination with whether TLOU was going to have dips below 60 or not (turned out it really was 60, to the second deviation or better).

Probably partly because gaffers like to be outraged at something.

But also, if he wants a locked 60 he can lower settings; on a console you can't, so if you want a locked 60 fps you don't have that option.
 

Kezen

Banned
I'm with Dictator on this: while a class-leading streaming solution may ease the burden on 2GB cards, I'm not optimistic developers will bother with that at all. From a multiplatform developer's POV the cheaper path is usually the best one; they won't even "optimize" for PC. They're expecting us to blast our way through with the raw power of the hardware available (for a price, hey).

The best advice anyone could give to PC gamers at the moment is to overprovision. Don't resort to bean counting ("consoles have X amount of VRAM so I should buy Y for my PC") and buy the best hardware you can.
 

R_Deckard

Member
And you need an actual argument, instead of keeping up this childish game of rebuttals.

I don't have any "hate" of a PS4 and the mere fact that you perceive some factual statement as such is very telling of your "brand cheerleader" mindset, if anything.


I'm... very happy for you? But I'm not sure why that should be relevant.

Look, I offered up my FIRST-HAND experience of a game on my PC and PS4, which shows that the textures you can clearly see are of the same quality and composite construction as PC on Ultra. This is factual and I know it myself.

This was then met with "don't believe you", "can't be, because reasons", etc. It is not an argument or a fan side I am on. Look at your posts in this thread; you are wound up.

It is relevant that the PC and PS4 are close visually. The 6GB of RAM needed is a moot point and clearly not needed to equal PS4 textures, and that is not what I or others have said. But what is clear is that a 2GB card is going to have less VRAM available for textures than both the X1 and PS4. This is a FACT and not a POV; you can argue and use old launch and cross-gen games to defend this, but I am at a loss as to why. 3-4GB is about what consoles will use for textures; using these on a PC with less VRAM than that will result in system-RAM-to-card stutters and performance hits, or it won't run at all.

I don't think there is any evidence of PS4 hate. It's just that this whole VRAM fear-mongering has spiralled completely out of control, mainly because of people who very obviously don't understand it.

I think it just shows the amount of confusion and panic from many a new PC gamer at present. A 4GB card will be mostly fine this gen to get the same quality textures as the consoles (AF improvements aside), a 2GB card will not. Simple. I cannot understand the confusion and aggression here... I really can't?!
 

Sentenza

Member
I'm with Dictator on this: while a class-leading streaming solution may ease the burden on 2GB cards, I'm not optimistic developers will bother with that at all. From a multiplatform developer's POV the cheaper path is usually the best one; they won't even "optimize" for PC. They're expecting us to blast our way through with the raw power of the hardware available (for a price, hey).

The best advice anyone could give to PC gamers at the moment is to overprovision. Don't resort to bean counting ("consoles have X amount of VRAM so I should buy Y for my PC") and buy the best hardware you can.
The "best hardware you can" is rarely a cost-effective approach.
Typically, taking GPUs as an example, buying an average one today and an average one two years from now that will beat the best available today will cost you LESS than buying a top-of-the-line card right now.
That's the approach I would suggest to people, and I trust the principle so much that it's what I did myself.
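To put rough numbers on that reasoning (purely illustrative prices, not quotes for any real cards):

```python
# Illustrative comparison of two GPU-buying strategies over roughly two years.
# All prices are made up for the sake of the arithmetic, not real quotes.

flagship_now = 650                        # one top-of-the-line card today
midrange_now, midrange_in_two_years = 250, 330

two_step_total = midrange_now + midrange_in_two_years
print(f"Flagship once:      ${flagship_now}")
print(f"Two mid-range buys: ${two_step_total}")
print(f"Difference:         ${flagship_now - two_step_total} saved with the two-step path")
# ...and the second mid-range card will typically beat the old flagship anyway,
# which is the crux of the argument above.
```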

Of course, if money isn't an issue for you, that's another matter. In that case, go for overkill all you want. Feel free to overspend. Replace a top-of-the-line GPU when the new model that is barely 10% more powerful comes out, and so on.

Look, I offered up my FIRST-HAND experience of a game on my PC and PS4
Look, let's stop beating around the bush.
You made a very specific and VERY FALSE claim: that Watch Dogs was using on PS4 the equivalent of Ultra textures on PC.
The very source you quoted to prove your point proved you wrong. That was pages back.
And then I would be the rabid "hater" for stating a couple of facts.

You also keep trying to imply that Shadow of Mordor on PS4 will match the top textures on PC, which no one can prove or disprove yet, but it sounds incredibly unlikely.

I don't even know what we are trying to argue about at this point.
 

blaidd

Banned
Since I have no plans to use SSAA here, I think I'll be okay on my 780 Ti.

You could probably set the resolution scale to 150% (at 1080p) and run the game at around 60 fps. That's not too far off from the resolution I ran the game at (2880x1620 compared to 2720x1700), and the GTX 780 Ti would probably beat my R9 290X in Mordor. Also, with 3 GB of VRAM you shouldn't experience any stuttering when the game only used 2.7 GB with my configuration.

Again, this is just a rough estimate, but I wouldn't expect Mordor to run perceptibly worse.
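The pixel counts behind that comparison (plain arithmetic, nothing measured in-game):

```python
# 150% resolution scale at 1080p versus the 2720x1700 downsampling setup the
# earlier numbers came from; the pixel counts are nearly identical, which is
# why similar performance and VRAM use seem like a fair guess.

def pixels(w, h):
    return w * h

scaled_1080p = pixels(int(1920 * 1.5), int(1080 * 1.5))   # 2880x1620
measured_setup = pixels(2720, 1700)

print(f"2880x1620:  {scaled_1080p:,} pixels")    # 4,665,600
print(f"2720x1700:  {measured_setup:,} pixels")  # 4,624,000
print(f"Difference: {scaled_1080p / measured_setup - 1:.1%}")  # under 1%
```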
 

Kezen

Banned
The "best hardware you can" is rarely a cost-effective approach.
I don't think it's necessarily the case, but I will rephrase: buy the best hardware you can afford. That was what I meant. Don't cheap out just because you don't immediately see the point of a higher-priced GPU that you could buy. Mind you, I'm not even remotely interested in this mythical "future-proofing", but I want my purchase to last me at least a while without having to drop too many settings. I'm fine with upgrading yearly if the performance is there (and if I can resell my current card).

Typically, taking GPUs as an example, buying an average one today and an average one two years from now that will beat the best available today will cost you LESS than buying a top-of-the-line card right now.
That's the approach I would suggest to people, and I trust the principle so much that it's what I did myself.
I didn't see it that way, but that is true. It all depends on your budget. Speaking for myself, my sweet spot is $350-400.

Of course, if money isn't an issue for you, that's another matter. In that case, go for overkill all you want. Feel free to overspend. Replace a top-of-the-line GPU when the new model that is barely 10% more powerful comes out, and so on.
10% isn't enough for an upgrade; 25% is.

I'm not judging you or anything for your choices, but you have to know that there will be a time when 2GB will simply not cut it for console-like textures. If you're fine with dropping below that in this respect then carry on, but I think there is not much point buying a 3 TFLOPS GPU only to be bottlenecked by your VRAM.
 
The "best hardware you can" is rarely a cost-effective approach.
Typically, taking GPUs as an example, buying an average one today and and average one two years from now that will beat the best available today will cost you LESS than buying a top gamma right now.
That's the approach I would suggest to people and I trust the principle so much that's what I did myself.

Of course, if money aren't an issue for you, that's another matter. In that case, go for overkill all you want. Feel free to overspend. Change a top gamma GPU when the new model that is barely 10% more powerful comes out and so on.

I agree with this. Midrange is where the best value is at.
 

Fractal

Banned
You could probably set the resolution scale to 150% (at 1080p) and run the game at around 60 fps. That's not too far off from the resolution I ran the game at (2880x1620 compared to 2720x1700), and the GTX 780 Ti would probably beat my R9 290X in Mordor. Also, with 3 GB of VRAM you shouldn't experience any stuttering when the game only used 2.7 GB with my configuration.

Again, this is just a rough estimate, but I wouldn't expect Mordor to run perceptibly worse.
Yeah, but I play at 1440p, and personally, I don't think there's much need for SSAA there. Usually 2xMSAA gives me satisfying results, or sometimes even a good post-processing-based solution like SMAA.
 
I was considering getting this for PS4 because a long, repetitive-ish game like this would be better suited to my couch, but damn, the price difference.

70 bucks for a brand new console game, and I can get it on PC for like 25 from some people in the BST thread.

Yeah, PC it is.
 

R_Deckard

Member
Look, let's stop beating around the bush.
You made a very specific and VERY FALSE claim: that Watch Dogs was using on PS4 the equivalent of Ultra textures on PC.
The very source you quoted to prove your point proved you wrong. That was pages back.
And then I would be the rabid "hater" for stating a couple of facts.

You also keep trying to imply that Shadow of Mordor on PS4 will match the top textures on PC, which no one can prove or disprove yet, but it sounds incredibly unlikely.

I don't even know what we are trying to argue about at this point.

Ha ha! Stop hitting your keyboard so hard ;-)

I have no idea where or how deluded you are, but I can see a rational conversation is wasted on you. Fare thee well and keep fighting the good fight!
 

KungFucius

King Snowflake
I am a little confused. Are the ultra textures meant for >1080p rendering, so that if you use them at 1080p they are rendered at their native resolution and then downsampled?
 

Kezen

Banned
I'm confused now.

Different games: Watch Dogs on PS4 supposedly uses a mixture of high and ultra textures. PC took the brute-force approach and everything is very high-res... but 3GB GPUs have trouble keeping up in certain situations.

The second comment is about Mordor. I would be surprised if the PS4 version packs the ultra (6GB) textures. Most likely the high settings, which require 3GB on PC.

Ha ha! Stop hitting your keyboard so hard ;-)

I have no idea where or how deluded you are, but I can see a rational conversation is wasted on you. Fare thee well and keep fighting the good fight!
You aren't exactly helping with this kind of post.
 