
NVIDIA Pascal GPU to feature 17B transistors and 32GB HBM2, coming Q2 2016 (earliest)

paskowitz

Member
"...next high-resolution 4K and 8K gaming panels..."

As one of the very few people who has run 4K, 4K Surround, and now 5K, I simply cannot wait for 32GB of VRAM and 4-Way SLI (is 8-Way SLI possible?) with a beautiful 8K gaming monitor!

That is astoundingly jaw-dropping just to see in words! Can only imagine it in real life! :)

In the meantime, something to tide us over: 5K Gaming

Dude, that is a sweet system (or systems) you've got there. Out of curiosity, how do you find the two Classifieds compare to two (or more) Titan Xs at 4K? Also, you should look into putting your cards under water cooling if you aren't already.
 

Renekton

Member
From my limited exposure to it, 4K seems to make PS360-era games show their age even more. I guess mods are essential.
 
Have you ever played a game running at native 4K at a proper viewing distance? It really is not a waste...

It's not a huge jump over something like the TAA in the new Unreal Tournament preview build. But that's beside the point; he's talking about 5K and even 8K monitors. Pixels aren't nearly pretty enough to be spending that much rendering power on resolution; it's grossly inefficient. This constant race for higher resolutions might be part of the reason PC graphics have stagnated so heavily.
 

PC graphics stagnated? When? Also, blaming higher screen resolutions or people with multiple GPUs for stagnating game graphics seems odd when the obvious culprit is the low FLOPS of console GPUs.

Also, you are conflating two different concepts in your response: anti-aliasing and a higher base resolution. The main benefit of a higher base resolution is not that it decreases aliasing, but that you can resolve more detail. That is something no amount of TAA in any engine will be able to produce; in fact, TAA would destroy more detail.

Why not the best of both worlds? A higher-resolution screen with that same TAA...
 

They have absolutely stagnated, and I think resolution chasing is partly to blame. It's almost become a meme at this point for PC gamers to say "omg X looks so good at Y resolution", where X is any random game with mediocre graphics and Y is any resolution 4K or higher. The high-res PC screenshot thread is like a museum for this. Why would developers spend their time actually improving graphics fidelity when having an unlocked resolution requires almost no work and scales infinitely with GPU power? I mean, it's gotten to the point where the most bare-bones of console ports are now declared a generational leap just because you can play them at a higher resolution.

I'm not confusing the two. Current games don't have nearly enough detail for you to gain anything worthwhile from such insane resolutions, so what you're basically left with is less aliasing.
 
Um...
If a texture occupies less screen space than its native resolution (say 4096x4096, or even 2048x2048), then there is much more detail to be resolved by increasing the screen resolution.

This applies to almost any modern game.
 

You can probably count on your fingers the number of textures that are 4Kx4K in any game.

Infiltrator is locked to 1080p, and it resolves far more detail and looks vastly better than any of the super high-res screens being posted in the PC screenshot thread. Let's take Infiltrator as a baseline: either scale it up to 4K, or spend that same 4x increase in GPU power to improve the quality of rendering while remaining at 1080p. I know which I'd rather have, and it's not even close.
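
(Editor's aside: the "4x increase in GPU power" above is just the pixel-count ratio between the resolutions under discussion. A minimal sketch of that arithmetic, assuming the standard 16:9 panel resolutions.)

```python
# Pixel counts for the common 16:9 resolutions mentioned in the thread,
# relative to 1080p. Shading cost scales roughly with pixel count if
# everything else is held constant.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "5K":    (5120, 2880),
    "8K":    (7680, 4320),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:>10,} pixels ({w * h / base:.2f}x 1080p)")
# 4K is exactly 4x the pixels of 1080p, 5K is ~7.1x, and 8K is 16x.
```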
 

thuway

Member
I want this GPU to be priced in a sensible manner, but I feel stupid for even thinking about that. I'll stick with the price-dropped GPUs of yore whenever this drops, and pick up the value GPU in the second gen.
 
You can probably count on your fingers the number of textures that are 4Kx4K in any game.
So what? Not every game is Star Citizen and needs a number of 8Kx8K textures.

All you need is a 2K texture. For there to be NO appreciable increase in fidelity from upping the screen resolution, that 2048x2048 texture would have to be taking up that same amount of screen space, i.e. more pixels than a 1080p screen and more pixels than a 1440p screen. How often do single textures take up more screen real estate than a 1440p screen? Never, and they never will.

Increasing screen resolution helps resolve more texture detail in almost any game since 2010, and even in a number of games from before that.
Infiltrator is locked to 1080p, and it resolves far more detail and looks vastly better than any of the super high-res screens being posted in the PC screenshot thread. Let's take Infiltrator as a baseline: either scale it up to 4K, or spend that same 4x increase in GPU power to improve the quality of rendering while remaining at 1080p. I know which I'd rather have, and it's not even close.
And Infiltrator looks much better at 4K. You are acting like it is a linear trade-off that all devs make or something. The reason all games do not achieve the same asset and scene fidelity as Infiltrator is NOT that they assume PC gamers are all chasing some high-resolution pipe dream. It is that Infiltrator is a tech demo made for a GPU with more FLOPS than the PS4; it isn't even a game.

I cannot believe I am arguing such an obvious point.
 

aember

Member
I'm still on my dying 460v2. I wonder just how much longer I can stretch it out till upgrading to the latest and greatest..
 

tuxfool

Banned
So what? Not every game is Star Citizen and needs a number of 8Kx8K textures.

All you need is a 2K texture. For there to be NO appreciable increase in fidelity from upping the screen resolution, that 2048x2048 texture would have to be taking up that same amount of screen space, i.e. more pixels than a 1080p screen and more pixels than a 1440p screen. How often do single textures take up more screen real estate than a 1440p screen? Never, and they never will.

Increasing screen resolution helps resolve more texture detail in almost any game since 2010, and even in a number of games from before that.

And Infiltrator looks much better at 4K. You are acting like it is a linear trade-off that all devs make or something. The reason all games do not achieve the same asset and scene fidelity as Infiltrator is NOT that they assume PC gamers are all chasing some high-resolution pipe dream. It is that Infiltrator is a tech demo made for a GPU with more FLOPS than the PS4; it isn't even a game.

I cannot believe I am arguing such an obvious point.

This is especially true when you consider that in most 3D games textures are rarely (if ever) going to be exactly mapped to the pixels on screen, i.e. viewed at zero distance from the camera. Having a higher resolution allows games to push back LODs and use lower-level (more detailed) mips.
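
(Editor's aside: a rough illustration of the mip/LOD point. Hardware mip selection is driven by the texel-to-pixel ratio, approximately lod = log2(texels per screen pixel), so rendering at a higher resolution lowers that ratio and samples a more detailed mip level. The texture size and screen coverage below are hypothetical; this is a sketch, not engine code.)

```python
import math

def approx_mip_level(texture_texels, screen_pixels_covered):
    """Rough mip level the sampler would pick: lod 0 is the full-res texture,
    and each +1 halves the texture resolution."""
    texels_per_pixel = texture_texels / screen_pixels_covered
    return max(0.0, math.log2(texels_per_pixel))

# A hypothetical 2048x2048 texture spanning a quarter of the screen width.
for label, screen_width in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
    covered = screen_width / 4
    print(f"{label}: ~mip {approx_mip_level(2048, covered):.2f}")
# ~mip 2.1 at 1080p (roughly a 512x512 effective texture) vs ~mip 1.1 at 4K
# (roughly 1024x1024), so more of the source detail actually reaches the screen.
```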
 

I guess we disagree on how much of an improvement 4K is. If a game already has great AA, I find 4K to be a very small improvement, especially relative to the increase in GPU grunt required.
 

BriGuy

Member
I really wanted to get a semi-capable laptop this year, but it seems like the absolute worst time to do so. The best laptop cards are already a year old and it seems like it will be another year before there's something to replace them.
 

Azulsky

Member
My GTX 580 is dying. Will the version that comes out in Q2 be reasonably priced, like the 970?

I would expect them to keep similar price tiering. No one knows yet what the performance delta between Maxwell and Pascal will be.

I'm not expecting some crazy 6800 Ultra --> 7800 quantum leap ever again.
 

Daante

Member
I would expect them to keep similar price tiering. No one knows yet what the performance delta between Maxwell and Pascal will be.

I'm not expecting some crazy 6800 Ultra --> 7800 quantum leap ever again.

I'm guessing that the Pascal version of the present GTX 970 will perform slightly below a GTX 980 Ti (5-10 fps less in gaming benchmarks on average).

I'm guessing that the Pascal version of the present GTX 980 will perform better than a GTX 980 Ti (5-10 fps more in gaming benchmarks on average).

I think the Pascal cards will be at their best if you game above 2560x1440, and from there up to 4K the leap from Maxwell might be bigger than I'm guessing above.
 

The1Ski

Member
My question is what kind of game development costs are we going to see that take advantage of that much power? I just think there has to be some kind of ceiling to how much a developer will spend and still justify a $60 selling point on games.

I don't know a lot about game development, though. Is it just a matter of more powerful development tools?
 

Qassim

Member
Q4 2016 would be like 2 years after their 9xx series. That would be way too long.

That's completely normal though... new architectures come every two years. Releases used to land in Q1/Q2, but the cycle slipped later in the year with one of the recent generations, probably Maxwell. Tick-tock.

e.g:

Tesla - 2008
Fermi - 2010
Kepler - 2012
Maxwell - 2014

Then Pascal - 2016
 

Kezen

Banned
My question is what kind of game development costs are we going to see that take advantage of that much power? I just think there has to be some kind of ceiling to how much a developer will spend and still justify a $60 selling point on games.

I don't know a lot about game development, though. Is it just a matter of more powerful development tools?

4K and VR will most certainly tax this class of hardware, along with 60fps and render settings beyond those found on consoles.

Don't worry, scaling up to that level of hardware is not going to be difficult or too expensive. Assets will remain more or less identical across PC and consoles in multiplatform games.
 

Renekton

Member
That's completely normal though... new architectures come every two years. Tick-tock.

e.g:

Tesla - 2008
Fermi - 2010
Kepler - 2012
Maxwell - 2014

Then Pascal - 2016
Tick-tock was Intel's thing; GPUs tended to combine both.

But we're very likely facing double-tocks from now on.
 
So what? Not every game is Star Citizen and needs a number of 8Kx8K textures.

All you need is a 2K texture. For there to be NO appreciable increase in fidelity from upping the screen resolution, that 2048x2048 texture would have to be taking up that same amount of screen space, i.e. more pixels than a 1080p screen and more pixels than a 1440p screen. How often do single textures take up more screen real estate than a 1440p screen? Never, and they never will.

Increasing screen resolution helps resolve more texture detail in almost any game since 2010, and even in a number of games from before that.

And Infiltrator looks much better at 4K. You are acting like it is a linear trade-off that all devs make or something. The reason all games do not achieve the same asset and scene fidelity as Infiltrator is NOT that they assume PC gamers are all chasing some high-resolution pipe dream. It is that Infiltrator is a tech demo made for a GPU with more FLOPS than the PS4; it isn't even a game.

I cannot believe I am arguing such an obvious point.

You obviously don't realise who you're responding to lad.
 

Raticus79

Seek victory, not fairness
I guess we disagree on how much of an improvement 4K is. If a game already has great AA, I find 4K to be a very small improvement, especially relative to the increase in GPU grunt required.

Depends on the size of the 4K monitor, I'd say. Mine's 32" and it's just about on the edge. At 27" I'd feel like I was rendering detail I couldn't really see.
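
(Editor's aside: the pixel-density numbers behind that judgement, assuming standard 16:9 panels; whether the extra density is visible still depends on viewing distance, which is the crux of the disagreement.)

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} PPI')  # ~138 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92 PPI, for comparison
```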
 

Grief.exe

Member
My question is what kind of game development costs are we going to see that take advantage of that much power? I just think there has to be some kind of ceiling to how much a developer will spend and still justify a $60 selling point on games.

I don't know a lot about game development, though. Is it just a matter of more powerful development tools?

I would argue monitor/TV technology is going to be pushing the envelope, rather than game development. 4K TVs/monitors are set to hit the mainstream in 2015/2016, with content creators set to deliver 4K content to meet the new demand.
Consoles will really be the only medium lagging behind.


I want this GPU to be priced in a sensible manner, but I feel stupid for even thinking about that. I'll stick with the price-dropped GPUs of yore whenever this drops, and pick up the value GPU in the second gen.

I'll sell you my 970 when the time comes. Win/win.
 

Qassim

Member
Tick-tock was Intel's thing; GPUs tended to combine both.

But we're very likely facing double-tocks from now on.

I'm aware; I wasn't really applying Intel's methodology directly, more referring to the generic "major release -> refresh -> major release -> refresh" cycle the industry has adapted from it.
 
Depends on the size of the 4K monitor, I'd say. Mine's 32" and it's just about on the edge. At 27" I'd feel like I was rendering detail I couldn't really see.

I could see 4K being a much bigger deal for large TVs, but in the 24 to 32 inch range, which is typical for PC monitors, it's not nearly worth it IMO.
 

E-Cat

Member
Taking a wild guess here: whatever AMD's architecture after Arctic Islands / R-400 is, which is probably in R&D now and intended for PC in 2018 (the R-500 series), could very well be the basis for the GPUs in the XB4 and PS5 APUs by 2020.
Adding a little bit of data to your wild guess, the Radeon 5xxx Evergreen series (2009) outclasses the PS4 computationally--four years before its launch. Based on this precedent, a PS5 released in 2020 should not be more powerful than an Arctic Islands flagship GPU from 2016. It may indeed use a more modern architecture such as the Rx 500, however.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Adding a little bit of data to your wild guess, the Radeon 5xxx Evergreen series (2009) outclasses the PS4 computationally--four years before its launch. Based on this precedent, a PS5 released in 2020 should not be more powerful than an Arctic Islands flagship GPU from 2016. It may indeed use a more modern architecture such as the Rx 500, however.

You can't use FLOPS like that across architectures. An Evergreen GPU is going to get its ass kicked by Pitcairn, a Southern/Sea Islands GPU, easily, even at similar FLOPS numbers.
 
Adding a little bit of data to your wild guess, the Radeon 5xxx Evergreen series (2009) outclasses the PS4 computationally--four years before its launch. Based on this precedent, a PS5 released in 2020 should not be more powerful than an Arctic Islands flagship GPU from 2016. It may indeed use a more modern architecture such as the Rx 500, however.
That's like saying that a Radeon 5770 outclasses Xbone's GPU. It doesn't make any sense.

Architecture-wise, GCN is superior to VLIW4/5.
 

E-Cat

Member
You can't use FLOPS like that across architectures. An Evergreen GPU is going to get its ass kicked by Pitcairn, a Southern/Sea Islands GPU, easily, even at similar FLOPS numbers.
What are you arguing, exactly? That a newer architecture is going to outperform an older architecture with similar nominal FLOPS? I'd say that's pretty obvious.
 

tuxfool

Banned
Adding a little bit of data to your wild guess, the Radeon 5xxx Evergreen series (2009) outclasses the PS4 computationally--four years before its launch. Based on this precedent, a PS5 released in 2020 should not be more powerful than an Arctic Islands flagship GPU from 2016. It may indeed use a more modern architecture such as the Rx 500, however.

It could be, but both perf/W and perf/area are lower in the Evergreen series. This also ignores hardware features that allow better hardware utilization and enable graphical effects. None of these things are accounted for when only looking at FLOPS (single or double precision).
 

E-Cat

Member
That's like saying that a Radeon 5770 outclasses Xbone's GPU. It doesn't make any sense.

Architecture-wise, GCN is superior to VLIW4/5.
Yeah, but I'm sure it can be quantified. There must be a limit to how much more efficient GCN is 'per FLOPS'.

And no, saying that an HD 5870 outclasses the PS4's GPU is not like saying that a Radeon 5770 outclasses the Xbone's GPU (which it probably doesn't, seeing as the 5770 is only 3.81% faster "on paper" than the Xbone and has an older arch).

Whereas a 5870 is 47.58% faster than a PS4--again, "on paper". However, I seriously doubt GCN is so much more efficient that it would be a clear-cut victory.
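
(Editor's aside: the "on paper" percentages come from the usual peak-FLOPS formula, shader count x 2 ops per clock (FMA) x clock speed. A quick sketch using the commonly cited shader counts and clocks for these parts reproduces the figures above; as the replies point out, this is peak shader throughput only and says nothing about ROPs, TMUs, bandwidth, or achievable utilization.)

```python
def peak_gflops(shaders, clock_mhz):
    """Peak single-precision GFLOPS: each shader retires one FMA (2 ops) per clock."""
    return shaders * 2 * clock_mhz / 1000.0

gpus = {
    "HD 5870 (Evergreen)": (1600, 850),  # ~2720 GFLOPS
    "HD 5770 (Evergreen)": (800, 850),   # ~1360 GFLOPS
    "PS4 (GCN)":           (1152, 800),  # ~1843 GFLOPS
    "Xbox One (GCN)":      (768, 853),   # ~1310 GFLOPS
}

for name, (shaders, clock) in gpus.items():
    print(f"{name}: {peak_gflops(shaders, clock):.0f} GFLOPS")

print(peak_gflops(1600, 850) / peak_gflops(1152, 800) - 1)  # ~0.476 -> 5870 vs PS4
print(peak_gflops(800, 850) / peak_gflops(768, 853) - 1)    # ~0.038 -> 5770 vs Xbone
```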
 

tuxfool

Banned
Yeah, but I'm sure it can be quantified. There must be a limit to how much more efficient GCN is 'per FLOPS'.

And no, saying that an HD 5870 outclasses the PS4's GPU is not like saying that a Radeon 5770 outclasses the Xbone's GPU (which it probably does, seeing as the 5770 is only 3.81% faster "on paper" than the Xbone).

Whereas a 5870 is 47.58% faster than a PS4--again, "on paper". However, I seriously doubt GCN is so much more efficient that it would be a clear victory.

And people are telling you that FLOPS isn't the entire story.

It is also a peak theoretical figure, which won't be hit. Then there is the fact that Evergreen is a VLIW architecture while GCN is SIMD; as a result GCN is not only more flexible but should also allow better use of the hardware from the point of view of modern graphics algorithms.

Then we start talking about other hardware features like more advanced TMUs or ROPs which do not figure at all into FLOPS numbers.

Edit: looking at the max TDP of the 5870, 228W... yikes, it is around double the TDP of the entire PS4 APU.
 

E-Cat

Member
And people are telling you that FLOPS isn't the entire story.

It is also a peak theoretical figure, which won't be hit. Then there is the fact that Evergreen is a VLIW architecture while GCN is SIMD; as a result GCN is not only more flexible but should also allow better use of the hardware from the point of view of modern graphics algorithms.

Then we start talking about other hardware features like more advanced TMUs or ROPs which do not figure at all into FLOPS numbers.
All I'm saying is, FLOPS is a great way of comparing the relative performance of two GPUs on similar architectures--and a slightly less good, but still useful approximation between two GPUs w/ consecutive generation architectures (using some constant multiplier).

For example, I bet you I could use some ratio comparing Kepler and Maxwell that would quite accurately predict benchmark performance, adjusted for FLOPS - e.g., 1.2 x Maxwell = Kepler. Not saying this is the correct figure, btw.
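
(Editor's aside: a minimal sketch of the "constant multiplier" idea being described, using placeholder numbers rather than real benchmark results: divide measured performance by theoretical TFLOPS for each card and take the ratio.)

```python
def perf_per_tflop_ratio(fps_a, tflops_a, fps_b, tflops_b):
    """How much performance per theoretical TFLOP GPU A gets relative to GPU B."""
    return (fps_a / tflops_a) / (fps_b / tflops_b)

# Hypothetical example: one Maxwell card and one Kepler card in the same game.
maxwell_fps, maxwell_tflops = 60.0, 4.0  # placeholder values, not real data
kepler_fps,  kepler_tflops  = 55.0, 4.5  # placeholder values, not real data

k = perf_per_tflop_ratio(maxwell_fps, maxwell_tflops, kepler_fps, kepler_tflops)
print(f"Maxwell delivers ~{k:.2f}x the performance per theoretical FLOP here")
# Averaging this over many games would give the kind of fixed fudge factor
# E-Cat describes; the counter-argument below is that no single constant
# survives ROP/TMU/bandwidth limits and workload differences.
```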
 

hesido

Member
Hard to imagine a bigger waste of GPU resources.

Indeed, what a waste of GPU cycles. I guess devs will have to come up with creative ways to fill those displays with them colours if they want to use better shaders. Probably most will not bother and just do more of the same, but at 4K and 8K.
 

tuxfool

Banned
All I'm saying is, FLOPS is a great way of comparing the relative performance of two GPUs on similar architectures--and a slightly less good, but still a useful approximation between two GPUs w/ consecutive generation architectures (using some constant multiplier).

For example, I bet you I could use some ratio comparing Kepler and Maxwell that would quite accurately predict benchmark performance, adjusted for FLOPS - e.g., 1.2 x Maxwell = Kepler. Not saying this is the correct figure, btw.

Except that you can't. GCN is a completely different architecture from VLIW4/5, an entirely different paradigm: GCN is a SIMD architecture and VLIW4/5 are VLIW architectures. Kepler -> Maxwell is better because they are iterations (but still different enough that your idea isn't all that appropriate as a general measurement).

This is why benchmarks are done with multiple games and not simply a recounting of theoretical numbers (possibly achievable only in synthetic tests).

Also, you keep insisting that FLOPS are a good way of measuring the relative performance of the entire GPU. What happens when the ROPs and TMUs cannot sample and map textures as fast as those in newer architectures? The shaders in the GPU end up waiting. FLOPS are only good for measuring the peak theoretical throughput of fairly simple shading operations.

Also, VLIW architectures are a lot harder to work with and keep fed than SIMD architectures. They're a pain to employ in modern compute tasks, as they're more dependent on the compiler for optimization.
 

Black_Red

Member
The Pascal cards should be in a whole different price range compared to a GTX 950 and 960, right?

I don't want to feel bad when buying one of those soon.
 

E-Cat

Member
It could be, but both perf/W and perf/area are lower in the Evergreen series. This also ignores hardware features that allow better hardware utilization and enable graphical effects. None of these things are accounted for when only looking at FLOPS (single or double precision).
Do we even know the TDP of PS4's GPU?

Anyway, perf/W doesn't really matter, since discrete GPUs are usually over 200W, much more than a console.

I didn't "forget" better hardware usage. Obviously, a closed-system console is going to utilize its GPU better than the average PC.
 

Qassim

Member
The Pascal cards should be in a whole different price range compared to a GTX 950 and 960, right?

I don't want to feel bad when buying one of those soon.

There will be Pascal cards equivalent to NVIDIA's entire current range, including the 950 and 960 tiers; NVIDIA moves its whole line to each new architecture. But they will start with the high-end cards (the *70 and *80 cards); the *50 and *60 cards could come 4-6 months after that initial Pascal release.
 

Vuze

Member
The Pascal cards should be in a whole different price range compared to a GTX 950 and 960, right?

I don't want to feel bad when buying one of those soon.
There will probably be lower-end cards as well, but not close to launch if the 9xx series is anything to go by. No idea how they dealt with this in previous series; this is the first time I'm actively witnessing a full architecture shift.
 