
NVIDIA Pascal GPU to feature 17B transistors and 32GB HBM2, coming Q2 2016 (earliest)

efyu_lemonardo

May I have a cookie?
If/when Pascal or similar tech becomes widespread, are we going to see physics and other GPU-side work start to affect gameplay in more complex and meaningful ways? Possibly this is already happening and I haven't been keeping up...
 

magnumpy

Member
The Fury only has 4GB of HBM. I don't know what you're talking about with regard to "Nvidia better have a desktop card to answer it." They already do in performance, albeit not in that form factor.

I'm talking about HBM. Nvidia currently has no cards with HBM; AMD does. If Nvidia hopes to compete technologically, they need to have an answer.
 

xkramz

Member
One GTX 970 should be sufficient for me for at least 2 years. If anything I can SLI a second 970... will wait for price drops on that baby though.
 

Durante

Member
Sorry, got the names mixed up. I'm talking about this
That is an 8.9B transistor card, with 4GB of memory.

This is a desktop card. If this isn't outside the realm of possibility, then Nvidia better have a desktop card to answer it or they will be left behind.
Fury (X) didn't really put Nvidia in a position where they have to rush to counter anything (or even drop their prices, sadly):
[image: value-99th-nopcars.gif]
 

BennyBlanco

aka IMurRIVAL69
This is beautiful. Next-gen consoles shouldn't be gimped or half-stepped like the current ones are. At the very least they should be able to do 4K @ 60fps by 2019.

60 fps consoles just don't ever seem to be in the cards.

I'm gonna try to ride this 970 out until I get a 4k tv, but damn.
 

Ryoku

Member
This is just reminding me of how people went crazy over PS4's RAM architecture, but at the end of the day it didn't add up to much.

Well, this is talking about up to 1TB/s bandwidth with up to 32GB just for VRAM.

The PS4 has 176GB/s of bandwidth shared between GPU and CPU, with the CPU taking at most 20GB/s of it. Not to mention the 5-5.5GB of GDDR5 available to games is also shared between GPU and CPU. Although 8GB of GDDR5 was impressive and unexpected, there were already cards on the market with more than 4GB just for the GPU, with much faster memory to boot. The price was the most impressive part of the PS4, imo.
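To make those numbers concrete: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch, assuming the commonly cited specs (256-bit GDDR5 at 5.5 Gbps/pin for the PS4; four 1024-bit HBM2 stacks at up to 2.0 Gbps/pin), neither of which comes from this article:

```python
def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak theoretical bandwidth: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * gbps_per_pin

# PS4: 256-bit GDDR5 at 5.5 Gbps/pin, shared between CPU and GPU
print(peak_bandwidth_gb_s(256, 5.5))       # 176.0 GB/s

# HBM2: 4 stacks x 1024 bits at up to 2.0 Gbps/pin, VRAM only
print(peak_bandwidth_gb_s(4 * 1024, 2.0))  # 1024.0 GB/s, i.e. ~1 TB/s
```

So the quoted 1TB/s figure is simply four full-speed HBM2 stacks, roughly six times the PS4's entire shared bus.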
 

RaijinFY

Member
This is beautiful. Next-gen consoles shouldn't be gimped or half-stepped like the current ones are. At the very least they should be able to do 4K @ 60fps by 2019.

That's nonsense. Take a look at the recent UE4 demo, Kite. It is running on a Titan (12GB) at 1080p/30fps! That should give you an indication of where most devs will put their resources... most likely not into 4K resolution.
 

Painguy

Member
Am I gonna need a new mobo for this chip? Currently rocking a 580 and I really don't feel like taking my rig apart :/
 
Well, I'm glad we agree

No, we don't agree. There is no need for them to compete more, and right now they likely couldn't bring out HBM cards anyway, because there is not enough supply.

Also, the initial reply you were talking about was referring to the amount of memory, and HBM2 is different from HBM1.

Why does Nvidia need a new card with HBM memory right now, according to you?
 

SparkTR

Member
I'm a native console gamer so what does this mean for me in the future?

32 GB should be generation-proof. In what ways can a game justify allocating all of this memory?

A couple I can think of, looking at last generation: new/better effects brought on with DX12 and UE4 can be utilized. This one is pretty obvious, seeing how much stuff was seemingly removed/cut down from UE4. We saw this last generation with games like BF4, Sleeping Dogs and Hitman Absolution: looking back now, despite those games being based on last-generation consoles, the PC versions looked akin to next-generation games that were still a few years off.

4K resolutions will also become way more common in the future. Like last generation, when consoles were pushing sub-720p resolutions while 1080p had been the PC standard since around 2008, 4K will slowly become the standard resolution on PC while consoles output 1080p.
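For scale, the raw pixel counts behind those resolution steps (straightforward arithmetic, nothing from the article):

```python
pixels = {"720p": 1280 * 720, "1080p": 1920 * 1080, "4K": 3840 * 2160}
print(pixels["1080p"] / pixels["720p"])  # 2.25 -> the last-gen console-to-PC gap
print(pixels["4K"] / pixels["1080p"])    # 4.0  -> the jump 4K asks of a GPU
```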
 

squidyj

Member
Am I gonna need a new mobo for this chip? Currently rocking a 580 and I really don't feel like taking my rig apart :/

The only reason you'd need to upgrade your motherboard for a video card is if PCIe bandwidth became a bottleneck. Or because of form factor.
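For a rough sense of how rarely that bottleneck bites, here's the usual spec math for a PCIe 3.0 x16 slot (assumed figures: 8 GT/s per lane, 128b/130b encoding; this is standard spec arithmetic, not something from the thread):

```python
lanes = 16
transfers_per_s = 8e9      # PCIe 3.0: 8 GT/s per lane
encoding = 128 / 130       # 128b/130b line-coding overhead
gb_per_s = lanes * transfers_per_s * encoding / 8 / 1e9
print(gb_per_s)            # ~15.75 GB/s per direction
```

Rendering traffic stays on-card against VRAM at hundreds of GB/s; the slot mostly carries asset uploads, which is why even PCIe 2.0 x16 at half that rate rarely holds back a single GPU.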

Here's my question, the article only seems to state that 32GB is possible with HBM2 but it doesn't seem to clearly state that that much will actually be available on one of these cards.
 

Fractal

Banned
Looking forwards to it... sticking with my 780 Ti until then. I was tempted to upgrade to 980 Ti, but I'm still satisfied with the current one.
 

laxu

Member
I'm a native console gamer so what does this mean for me in the future?

32 GB should be generation-proof. In what ways can a game justify allocating all of this memory?

In no way at all. We won't see cards with that much RAM for years, because it's completely excessive for anything except scientific computing (or other similar purposes). In the next few years we will see cards with around 8 GB VRAM, maybe 16 for the next Titan. That will be plenty even for 4K.
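Back-of-envelope on why (illustrative numbers: 32-bit render targets and a five-target deferred G-buffer, not figures from any actual game):

```python
pixels = 3840 * 2160                  # one 4K frame
mb_per_target = pixels * 4 / 1e6      # 4 bytes per pixel (e.g. RGBA8)
print(mb_per_target)                  # ~33 MB per 32-bit 4K render target
print(mb_per_target * 7)              # ~232 MB: 5 G-buffer targets + color + depth
```

Render targets barely dent 8 GB even at 4K; the bulk of VRAM goes to textures and geometry, which scale with asset quality rather than resolution.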

The main limitation on a GPU is still, and always will be, its processing power, not the amount of VRAM it has. That, combined with consoles driving game development, means we're unlikely to see anything that would warrant shitloads of VRAM.

Next year we will most likely see somewhat faster replacements for the 970 and 980 at best, with 4 GB RAM unless HBM2 is out when they are released.
 

-MB-

Member
The only reason you'd need to upgrade your motherboard for a video card is if PCIe bandwidth became a bottleneck. Or because of form factor.

Here's my question, the article only seems to state that 32GB is possible with HBM2 but it doesn't seem to clearly state that that much will actually be available on one of these cards.

If it does, it'll be on the Titan version of the Pascal generation, not on the more "mainstream" cards.
 
This is gonna be a beast.

Gonna upgrade from a 670 to a 970 for 1080p soon just to keep up a bit, then jump to 4K once Pascal hits and HDMI 2.0a is common in devices (TVs, AVRs, graphics cards).
 

knitoe

Member
So far, the rumors make it look like a huge performance upgrade. Normally I upgrade CPU/MB one year, video cards the next, and repeat. Went from SLI Titans to SLI Titan Xs, but I might upgrade again if the performance rumors prove true.
 

Genio88

Member
Looks like a huge improvement, as I expected from 16nm. I'm definitely going to replace my GTX 980 next year with a Pascal x80. Unfortunately, all that power will only be used to achieve higher resolutions and VR (if we finally get some good games for those devices); consoles will keep holding back PC gaming, as we've seen recently with The Witcher 3 and Batman.
 

Man

Member
Sounds like something I will buy when Oculus Rift consumer version #2 comes around (probably late 2017).
For now I am keeping my EVGA 980 Ti Hybrid on back-order (planning to build a new rig in September).
 
Hopefully the "970" equivalent of the Pascal range will offer the same performance as my current 970 SLI cards; if so, I'll buy a couple of them.
 

Renekton

Member
Looks like a huge improvement, as I expected from 16nm. I'm definitely going to replace my GTX 980 next year with a Pascal x80. Unfortunately, all that power will only be used to achieve higher resolutions and VR (if we finally get some good games for those devices); consoles will keep holding back PC gaming, as we've seen recently with The Witcher 3 and Batman.
Batman?
 

Genio88

Member

Yes, in Arkham Knight all the effort went into the console versions while the PC port was handled by an outside developer, with the results we've all seen: the PC version looks worse than the console one, on top of the bugs, frame rate issues and crashes.
 

Renekton

Member
Yes, in Arkham Knight all the effort went into the console versions while the PC port was handled by an outside developer, with the results we've all seen: the PC version looks worse than the console one, on top of the bugs, frame rate issues and crashes.
PC AK problems are not due to console horsepower.

If Rocksteady's dev comfort zone is on fixed-spec devices, that's their decision.
 

Rodin

Member
Unless this is the new Titan, I call bullshit. Next series will be great though; between HBM2 and 16nm there's room for a lot of improvement over this gen.
 

aeolist

Banned
In no way at all. We won't see cards with that much RAM for years, because it's completely excessive for anything except scientific computing (or other similar purposes). In the next few years we will see cards with around 8 GB VRAM, maybe 16 for the next Titan. That will be plenty even for 4K.

The main limitation on a GPU is still, and always will be, its processing power, not the amount of VRAM it has. That, combined with consoles driving game development, means we're unlikely to see anything that would warrant shitloads of VRAM.

Next year we will most likely see somewhat faster replacements for the 970 and 980 at best, with 4 GB RAM unless HBM2 is out when they are released.

Widening the bus that much will have a big impact even without going to 32GB. Lots of modern shader effects, as well as AA and AF, are bandwidth-hungry.
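To put a number on that, a minimal sketch of the framebuffer traffic 4x MSAA generates at 4K (simplified: ignores compression, caching, and overdraw; assumes 32-bit color plus 32-bit depth/stencil per sample):

```python
width, height, fps = 3840, 2160, 60
samples_per_pixel = 4              # 4x MSAA
bytes_per_sample = 4 + 4           # RGBA8 color + 32-bit depth/stencil

# Writing every sample of a 4K 4x MSAA framebuffer once per frame:
gb_per_s = width * height * samples_per_pixel * bytes_per_sample * fps / 1e9
print(gb_per_s)                    # ~15.9 GB/s for a single pass
```

Real frames read and write those buffers many times (depth tests, blending, the MSAA resolve, post-processing), so effective traffic multiplies quickly, and that's before texture fetches. That's the kind of load a 1TB/s bus soaks up.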
 

OraleeWey

Member
When you've got a 970 or a 980, for example, what's the point of playing at 4k, when you've got most of the settings turned off?
 

Ce-Lin

Member
We are reaching the point where devs will really struggle to get the most out of the tech (apart from being able to crank up resolution, PhysX, AA and compute effects).
 

Durante

Member
When you've got a 970 or a 980, for example, what's the point of playing at 4k, when you've got most of the settings turned off?
Well, that depends on the game. There are a great many games that you can play at 4k on such a card at high settings -- far more really than you can't.

It's just the majority of the most recent AAA titles where you'll have to make some sacrifices. The nice thing about gaming on a PC is that you can decide the tradeoff between image quality, graphical complexity, effects and framerate for yourself on a per-game basis.
 

RE4PRR

Member
In no way at all. We won't see cards with that much RAM for years, because it's completely excessive for anything except scientific computing (or other similar purposes). In the next few years we will see cards with around 8 GB VRAM, maybe 16 for the next Titan. That will be plenty even for 4K.

The main limitation on a GPU is still, and always will be, its processing power, not the amount of VRAM it has. That, combined with consoles driving game development, means we're unlikely to see anything that would warrant shitloads of VRAM.

Next year we will most likely see somewhat faster replacements for the 970 and 980 at best, with 4 GB RAM unless HBM2 is out when they are released.

Next year is Pascal, no ifs or buts. Nvidia are using HBM2 on their next cards, which means their Titan will have at least 8GB. You can see just from AMD's Fury cards that 4GB doesn't appear to be enough, and most review sites were not happy with it.

It won't be their full-cut Pascal, since it's the first on 16nm, but it'll certainly be a beast.
 

Rosur

Member
Sounds pretty good for 4K gaming (or downsampling). Though I'm going to wait for the 2nd-gen HBM2 cards... or will they go straight to HBM3?
 

aeolist

Banned
Next year is Pascal, no ifs or buts. Nvidia are using HBM2 on their next cards, which means their Titan will have at least 8GB. You can see just from AMD's Fury cards that 4GB doesn't appear to be enough, and most review sites were not happy with it.

It won't be their full-cut Pascal, since it's the first on 16nm, but it'll certainly be a beast.

Titan will probably be 16GB; the ridiculous memory pool is the only way you can really justify those things.
 

mhayze

Member
16nm FinFET - finally! This will be exciting. Considering how far Nvidia and AMD have pushed performance on the same 28nm node these past several years, jumping up a full node (instead of the usual half-node) should allow for something truly special next gen.
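The back-of-envelope on that node jump, assuming ideal area scaling with the square of feature size (real processes fall short of this, and TSMC's 16FF notably reuses 20nm-class metal pitches, so roughly 2x is the more realistic density gain):

```python
print((28 / 16) ** 2)  # ~3.06x ideal transistor density vs 28nm
# Even at a realistic ~2x, 28nm GM200's ~8B transistors scale to ~16B
# in a similar die area, which is in the ballpark of the 17B rumor.
```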

That said, I'm not so sure we will all be dancing in this wonderland of cheap, plentiful 17-billion-transistor 1080 Ti cards in spring 2016, as that story implies.

1. The "full fat" 17-billion-transistor model is probably going to be in limited supply in 1H 2016, which means either Quadro/compute only, or a small supply of very expensive flagship consumer cards (e.g. a "Titan Y").

2. This limited supply will be compounded by the availability of HBM2. If Hynix continues to be the only supplier, this could be trouble. It is a JEDEC standard, so others should be able to make it, but Hynix is a co-developer and the only one with actual manufacturing experience, so I think they will be the major, if not only, supplier in 1H 2016. This means that unless Hynix is planning a major ramp-up, and starting it soon, Nvidia will be competing with AMD for supply in 1H 2016.

Just speculation at this point, but I hope there's another GPU in the wings, a "GP204", i.e. the "1070" equivalent of the 970, alongside what will probably be the "1080 Ti" model to replace the current 980 Ti. That's where we will most likely see the new enthusiast sweet spot from Nvidia.
 
Titan will probably be 16GB; the ridiculous memory pool is the only way you can really justify those things.

Or 12GB if it's not HBM2-ready. I think Titan RAM will continue increasing in increments of 6, and I'm expecting next year's Titan to be similar to the Titan Black, with the same amount of RAM as last year's model, launching early 2016. Then later in the year the mainstream cards will come equipped with 8GB. Early 2017 sees an HBM2 single-GPU Titan with 24GB of VRAM. Though there's probably going to be a €3,000 dual-GPU card like the Titan Z, foreshadowing what's to come a year later.
 

pottuvoi

Banned
That's definitely not true.
+1
It's important to remember that even a detail as small as one hundredth of a pixel can be hundreds of times brighter than the neighbouring area and perfectly hit a pixel center to create sparkle, which is especially noticeable in motion.

There are lots of different sources of aliasing and every single one of them is still bad at high resolutions.
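A minimal 1D sketch of that sparkle effect (a toy model, assuming one point sample per pixel center; the feature below is 1/100th of a pixel wide and 500x brighter than its surroundings):

```python
def point_sample(feature_pos, feature_width, brightness, num_pixels):
    """One sample at each pixel center (i + 0.5): the classic aliasing setup."""
    return [brightness if feature_pos <= i + 0.5 < feature_pos + feature_width
            else 0.0
            for i in range(num_pixels)]

# A 0.01-px-wide, 500x-bright feature drifting across pixel 2's center:
print(point_sample(2.495, 0.01, 500.0, 6))  # [0, 0, 500.0, 0, 0, 0] -> bright flash
print(point_sample(2.505, 0.01, 500.0, 6))  # [0, 0, 0, 0, 0, 0]    -> gone next frame

def supersample(feature_pos, feature_width, brightness, num_pixels, n=256):
    """Average n sub-samples per pixel instead of one center sample."""
    return [sum(brightness if feature_pos <= i + (s + 0.5) / n < feature_pos + feature_width
                else 0.0 for s in range(n)) / n
            for i in range(num_pixels)]

# Averaging converges toward the feature's true energy (500 * 0.01 = 5):
print(supersample(2.495, 0.01, 500.0, 6))  # [0, 0, ~3.9, 0, 0, 0]
print(supersample(2.505, 0.01, 500.0, 6))  # [0, 0, ~5.9, 0, 0, 0]
```

Point sampling either hits or misses no matter how high the resolution, so the pixel flickers between zero and full brightness as the feature moves; averaging many sub-samples converges to the feature's actual energy, which is what supersampling buys you.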
 

AP90

Member
Laptops are gonna be huge

This, but like a year-ish afterwards.

Also, I think Q2 2017 will be my next upgrade point, because I feel like that will be the "true" Pascal release. Hoping Intel will have made significant gains like this too.

As for now, my 980 Ti should hold me nicely till then. Hopefully my 2600K can make it another 2 years =]
 