
NVIDIA Pascal GPU to feature 17B transistors and 32GB HBM2, coming Q2 2016 (earliest)

knitoe

Member
It's important to remember that even a detail as small as one hundredth of a pixel can be hundreds of times brighter than the neighbouring area and perfectly hit a pixel center to create sparkle, which is especially noticeable in motion.

There are lots of different sources of aliasing and every single one of them is still bad at high resolutions.

I think it also depends on the display's pixel density. On my Acer 28" 4K G-Sync, I don't need to use any AA, but I can see how someone with 40" or greater would still need some form.
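To put it concretely (toy numbers, nothing from a real renderer): a speck narrower than a pixel gets point-sampled once per pixel, so it pops in and out from frame to frame as it drifts across the pixel center.

```python
# Toy sketch of subpixel "sparkle": a bright speck narrower than a pixel,
# point-sampled once at the pixel center. All values are made up.
def shade(x, speck_pos, speck_width=0.01, speck_val=500.0):
    """Radiance at sample position x: tiny bright feature on a dim field."""
    return speck_val if abs(x - speck_pos) < speck_width / 2 else 0.1

pixel_center = 10.5                # the single sample this pixel gets
for frame in range(7):
    pos = 10.47 + frame * 0.01     # speck drifting 1/100th of a pixel per frame
    print(f"frame {frame}: sample = {shade(pixel_center, pos)}")
# Prints 0.1 on every frame except frame 3 (500.0): the speck flashes for
# exactly one frame as it crosses the pixel center. That's the sparkle.
```

More resolution alone doesn't fix this; the feature just has to be brighter and smaller than whatever the sample grid can capture.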
 

OraleeWey

Member
Well, that depends on the game. There are a great many games that you can play at 4K on such a card at high settings -- far more, really, than you can't.

It's just the majority of the most recent AAA titles where you'll have to make some sacrifices. The nice thing about gaming on a PC is that you can decide the tradeoff between image quality, graphical complexity, effects and framerate for yourself on a per-game basis.

It's not always about the latest games.

There are older games to run at 4K.

Okay, that makes sense. I'm just perfectly happy with 1080p atm. But then again, I haven't seen native 4K gaming.
 
Guys, buy a card now, enjoy Pascal when it finally actually comes out (based on AMD competition versus the silicon taping out etc., i.e. the business has to make sense, not just the technology).

I wouldn't be surprised to see Pascal launch first as a high-precision card for the pro market, since the Titan X sacrificed that versus the prior Titan.

Been loving my Titan X playing Witcher 3 on my 1080p (3D Vision, 144Hz) monitor.

By the time Pascal comes out and makes sense, I'll play again at 1440p or 4K.
 

ZOONAMI

Junior Member
I think it also depends on the display's pixel density. On my Acer 28" 4K G-Sync, I don't need to use any AA, but I can see how someone with 40" or greater would still need some form.

Depends on how far you are from the display, too. I'd want AA in your scenario, but it's not really necessary on a 4K TV if you're sitting on a couch. In both cases adding AA helps, but it isn't a terrible idea to just turn it off if you're trying to boost fps to 60.
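You can put rough numbers on that, too. A quick sketch (display sizes and viewing distances below are just assumed for illustration):

```python
# Back-of-the-envelope angular resolution: pixels per degree of visual
# angle, given display size, resolution and viewing distance (all inches).
import math

def pixels_per_degree(diagonal_in, h_pixels, v_pixels, distance_in):
    aspect = h_pixels / v_pixels
    width_in = diagonal_in / math.sqrt(1 + (1 / aspect) ** 2)
    pixel_in = width_in / h_pixels
    # Visual angle subtended by one pixel, in degrees:
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / deg_per_pixel

# 28" 4K desktop monitor at arm's length vs. 40" 4K TV from the couch:
print(f'28" 4K at 24in: {pixels_per_degree(28, 3840, 2160, 24):.0f} ppd')
print(f'40" 4K at 96in: {pixels_per_degree(40, 3840, 2160, 96):.0f} ppd')
```

The desktop case lands around 66 pixels per degree, right at the edge of normal visual acuity, while the couch case is closer to 185, which is why leaving AA off is far less noticeable from across the room.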
 

Hasney

Member
No one has given a source for any of this information...

Nothing's a leap. Most of what's in there has been assumed for a while from what they've said in the past.

As others have noted in here, the only unexpected part was 32GB, but that'll be the pro version/Titan.

Guys, buy a card now, enjoy Pascal when it finally actually comes out (based on AMD competition versus the silicon taping out etc., i.e. the business has to make sense, not just the technology).

I wouldn't be surprised to see Pascal launch first as a high-precision card for the pro market, since the Titan X sacrificed that versus the prior Titan.

Been loving my Titan X playing Witcher 3 on my 1080p (3D Vision, 144Hz) monitor.

By the time Pascal comes out and makes sense, I'll play again at 1440p or 4K.

I would, but there's no point shelling out £200 now and then spending tons on the top-end Pascal cards. Even selling the previous one, it doesn't make sense when I can just play PS4 for a year.
 

dr_rus

Member
Sorry, got the names mixed up. I'm talking about this:

[image: v7k8CBM.jpg]


This is a desktop card. If this isn't outside the realm of possibility, then Nvidia better have a desktop card to answer it or they will be left behind.

This card is selling at a premium above its own performance league right now, mostly because it has HBM1 on it. But this isn't even important - what's important is that this card's GPU is using an old, proven, tested and relatively cheap 28nm production process.

The biggest cost concerns are in the new 16nm production process and not in HBM2 - which will probably cost more than GDDR5 as well, but the difference won't be as dramatic as in the case of 600mm^2 chips made on 28 versus 16 nanometers.
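Back-of-the-envelope version of that (wafer prices and defect densities below are pure guesses, just to show the shape of the problem):

```python
# Why a ~600 mm^2 die is so much pricier on a new node: fewer good dies
# per (more expensive) wafer. All numbers are illustrative assumptions.
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer, accounting for edge loss."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-D0 * A)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

area = 600  # mm^2, big-GPU class die
for node, wafer_cost, d0 in [("28nm (mature)", 5000, 0.10),
                             ("16nm (new)",    8000, 0.30)]:
    good = dies_per_wafer(area) * yield_rate(area, d0)
    print(f"{node}: ~{good:.0f} good dies, ~${wafer_cost / good:.0f} per die")
```

With those made-up inputs, the mature 28nm wafer yields a big die at around $100 while the same die on a fresh 16nm process comes out north of $500, and that gap dwarfs any plausible HBM2-versus-GDDR5 premium.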
 

pottuvoi

Banned
I think it also depends on the display's pixel density. On my Acer 28" 4K G-Sync, I don't need to use any AA, but I can see how someone with 40" or greater would still need some form.
Depends a lot on the content.
If you have an HDR buffer with contrast differences of an order of magnitude, you always need some sort of AA.
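A minimal sketch of why (toy Reinhard tonemap, made-up sample values): with one 500x-brighter speck among four subsamples, a plain box resolve in linear HDR space lets that single sample blow out the whole pixel.

```python
# Why HDR content defeats a naive MSAA resolve: one very bright subsample
# dominates the linear average. Sample values are made up for illustration.
samples = [0.1, 0.1, 0.1, 500.0]  # linear HDR radiance of 4 subsamples

def tonemap(x):
    """Simple Reinhard operator: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# Resolve in linear HDR space (what a plain box-filter resolve does):
linear_resolve = tonemap(sum(samples) / len(samples))
# Resolve after tonemapping each subsample first:
post_tonemap_resolve = sum(tonemap(s) for s in samples) / len(samples)

print(f"tonemap(average) = {linear_resolve:.3f}")        # ~0.992: blown out
print(f"average(tonemap) = {post_tonemap_resolve:.3f}")  # ~0.318: speck tamed
```

Either way, with order-of-magnitude contrast inside a pixel you need some form of AA or filtering, because a single sample can't represent that pixel honestly.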
 

AmyS

Member
TSMC Begins Volume Production of 16nm FinFET Process – Nvidia Pascal GP100 GPU Among the Products in Production

http://www.taipeitimes.com/News/biz/archives/2015/08/11/2003625049
http://wccftech.com/tsmc-begins-volume-production-16nm-finfet-nvidia-pascal-gp100-gpu/
The TSMC 16nm FinFET node is probably the most notable process of interest to PC enthusiasts. This is the node that will house Nvidia's next generation lineup of graphics cards (specifically the "16FF+" variant) and is one of the most authentic indicators of their time-frame. The Taipei Times, in accordance with everything we heard in the past, has confirmed that TSMC has (finally) started mass production of 16nm FinFET products. However, it is expected that the initial run will be dedicated to Apple SoCs.

TSMC and Nvidia have also confirmed on more than one occasion that the next generation (Pascal) GPUs will be produced on the 16nm FinFET+ node, with initial confirmation dating back approximately 9 months. AMD's next generation Radeon graphics processor, on the other hand, codenamed Arctic Islands, was not on the official list of products released by TSMC, so while its CEO has confirmed the use of a FinFET node (14/16), the exact specifics remain to be seen.

http://hexus.net/tech/news/industry/85565-tsmc-starts-volume-production-16nm-chips/
TSMC has also announced that it will ramp up "an enhanced version of 16nm chips, or 16 FinFET+ chips, in the third quarter and that production would reach a high volume in the same quarter," reports the Taipei Times. It is this process upon which Nvidia is going to depend for its next-generation GP100 Pascal GPU. Pascal reportedly has 17 billion transistors and up to 32GB of HBM2-based VRAM.

http://www.extremetech.com/computing/212221-tsmc-quickly-ramping-up-16nm-volume-production
Earlier rumors surrounding the node have suggested that 16nmFF+ might be better suited to higher power devices than 16nmFF, which would explain why Apple might be planning to tap it for the iPhone, while companies like AMD and Nvidia aren’t expected to release 16nm hardware until next year. AMD hasn’t confirmed that it will use TSMC for further GPU manufacturing, but both it and Team Green have historically done so. We’ve kicked around the idea of whether or not Samsung/GlobalFoundries might capture some of that business for the first time, but for now, it seems prudent to assume the cycle will continue. More details on Zen’s manufacturing should be available closer to its launch date.

As for how this will play out in the graphics card industry itself, we’d expect Nvidia to launch the next round of products (Pascal), followed by an AMD launch later in the year. Which company beats the other to the punch depends on a number of factors, but based on current product cadences, that’s the most likely scenario. A great deal is riding on how strong TSMC’s 16nmFF+ node actually is — if the foundry can’t deliver a substantially better product than Samsung has fielded, it could find itself locked out of substantial revenues at the 16/14nm node. This is part of why TSMC has been aggressively pushing 10nm. The company is used to leading at cutting-edge nodes and capturing the majority of the revenue from doing so. An aggressive 10nm ramp, if the company can pull it off, would put it neck-and-neck with Intel at that node.

[image: 3eUxgsC.jpg]
 

dr_rus

Member
28nm will still live for a couple of years from now. I'm not even sure that we'll have any GeForces on 16nm by this time next year - there is a reason why NV would push out a big Pascal first, but there's a slim chance of it coming out as even a Titan in 2016. We'd better hear about some GP104 soon, because that's likely the GPU which will launch in the gaming market segment in 1H16.
 

AmyS

Member
It's also interesting that Nvidia is skipping standard 16nm FinFET, as Pascal will use 16nm FinFET+.

Also, even assuming some delays with 10nm FF (FF+ for 10nm, I presume), by the time the Microsoft and Sony next-gen consoles appear (2019 earliest, 2020 most likely), 10nm will probably have been in volume production for a long time, and the most advanced processes like 7nm EUV may or may not be ramping up for other things.

A great deal is riding on how strong TSMC’s 16nmFF+ node actually is — if the foundry can’t deliver a substantially better product than Samsung has fielded, it could find itself locked out of substantial revenues at the 16/14nm node. This is part of why TSMC has been aggressively pushing 10nm. The company is used to leading at cutting-edge nodes and capturing the majority of the revenue from doing so. An aggressive 10nm ramp, if the company can pull it off, would put it neck-and-neck with Intel at that node.
 

Momentary

Banned
No.

And it's unlikely that the jump will be any bigger than between the GTX 580 and GTX 680.


Hopefully you're wrong. I hate these tiny-ass jumps.


Makes me wish Samsung would go ahead and buy AMD and get into the GPU game, just to create competition for NVIDIA. They hate NVIDIA enough to probably pull that move out of spite.
 
Yeah, I'll be shocked if NVIDIA pushes hard with Pascal, especially with AMD being such a non-factor. The second generation with Pascal's successor might be where it's at.
 
No.

And it's unlikely that the jump will be any bigger than between the GTX 580 and GTX 680.

I imagine it will be like this as well, then the big Pascal will come out and be Shiva, destroyer of worlds.
8800 GTX the sequel coming soon?

Obviously the 8800 GT is the card everyone remembers, but the 8800 GTX (and the Ultra specifically) was such a monstrous, forward-thinking beast. Due to its extra-large memory setup, it basically scaled super far into the future. In moments where the 8800 GT had the shading power but not enough VRAM, the 8800 GTX came in and stomped all over the place.

Awesome card.
 

gatti-man

Member
I imagine it will be like this as well, then the big Pascal will come out and be Shiva, destroyer of worlds.


Obviously the 8800 GT is the card everyone remembers, but the 8800 GTX (and the Ultra specifically) was such a monstrous, forward-thinking beast. Due to its extra-large memory setup, it basically scaled super far into the future. In moments where the 8800 GT had the shading power but not enough VRAM, the 8800 GTX came in and stomped all over the place.

Awesome card.
I had dual 8800 GTXs. Those things were monsters.
 

dr_rus

Member
Hopefully you're wrong. I hate these tiny-ass jumps.


Makes me wish Samsung would go ahead and buy AMD and get into the GPU game, just to create competition for NVIDIA. They hate NVIDIA enough to probably pull that move out of spite.

Last time I checked Samsung was limited by the same laws of physics as everyone else -)
 

Two Words

Member
What lovely news to read after buying a GTX 980 Ti :p

I'm sure this GPU will last me for years until 16 nm process GPUs are either a thing of the past or completely matured.
 
Such low expectations in this thread. You guys are like the beaten housewife who is grateful when her drunk husband doesn't throw his beer at her head.
 

Momentary

Banned
Last time I checked Samsung was limited by the same laws of physics as everyone else -)

Not when you've got deep pockets like they do. How is it a physics thing when the technology is already available, but only distributed to the top dogs first? It's a money thing right now. There have been smaller process nodes for years now, but they go to everyone except the discrete GPU market. They would probably also be able to put together a better R&D group than what AMD has now. Plus, we'd probably get better aesthetics instead of those vomit-inducing "GAMER" cards that AMD and their partners put out. Samsung has always had some classy-looking stuff.

Well, something's got to happen, because AMD's stock hasn't been this low since the 1970s. Which is even worse if you take inflation into account. $1.84 a share. Someone is waiting for it to get just low enough for them to swoop in and get a deal.
 
I am glad I bought a GTX 960 when I built my computer 3 months ago. I am just gaming on a 1080p 60Hz monitor right now. When these cards come out... boom, new video card and monitor.
 

kitch9

Banned
I feel like a pauper trying to enjoy a wealthy man's hobby.

PC gaming is the pits for this.

Saying that, I've not upgraded my processor for a long time; a 3770K that runs at 4.8GHz on low volts all day and a 780 6GB should see me through on a 2K G-Sync monitor until Pascal.
 

Xenus

Member
No.

And it's unlikely that the jump will be any bigger than between the GTX 580 and GTX 680.

You're likely quite wrong, if only because they skipped multiple in-between processes rather than just reaching for the next new node. True 28nm-to-16nm scaling would make chips roughly a third of the size. Power doesn't scale as well anymore, but it should be enough to give them significant headroom as well. Though I know different foundries have been playing with the numbers, so I'm not sure if it's true 16nm anymore at all. There is always the option that they take the destroyer-of-worlds chips and make them enterprise-only for huge margins, though, if AMD doesn't come out swinging as well.
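The naive arithmetic, taking the node names at face value (which, as noted above, they aren't in practice):

```python
# Naive geometric scaling between node names. Illustrative only: real
# foundry "16nm" density doesn't track the marketing number this cleanly.
old_nm, new_nm = 28.0, 16.0
linear_shrink = new_nm / old_nm       # feature-size ratio, ~0.57x
area_shrink = linear_shrink ** 2      # die-area ratio for the same design
print(f"linear: {linear_shrink:.2f}x, area: {area_shrink:.2f}x")  # 0.57x, 0.33x
# Same transistor count in about a third of the area, taken literally.
# TSMC 16FF+ actually reuses 20nm-class metal pitch, so the real density
# gain over 28nm is closer to ~2x, which is why "true 16nm" is a stretch.
```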
 

Sanctuary

Member
Glad I haven't upgraded my GTX 780 yet. The only game out now that I own that would have really benefited from something more powerful (not even that much, though, with Hairworks off) would be TW3. I was planning on upgrading next summer anyway, so we'll see how much of a leap this is for gaming.

That memory sounds like a wet dream for anyone who plays with a lot of high-quality texture mods, too.
 

Xenus

Member
PC gaming is the pits for this.

Saying that, I've not upgraded my processor for a long time; a 3770K that runs at 4.8GHz on low volts all day and a 780 6GB should see me through on a 2K G-Sync monitor until Pascal.

I'm waiting for the Skylake 6700K to finally become available in the US for my upgrade from my ancient Core 2 Q6700. I wanted to wait for Skylake-E, but it started hanging and now only boots once in every 10 tries, which is forcing the issue.
 

AmyS

Member
Volta is rumored to be due out in 2017 for HPC use, but not in consumer graphics cards until 2018. Totally unconfirmed, but for now I'll believe it for a couple of reasons.

First, Nvidia originally announced Volta in early 2013 as the next architecture after Maxwell. A year later they swapped out the name Volta for Pascal. This year, both Pascal and Volta appeared on the same roadmap.


Pascal looks like H1 2016; Volta looks like it could make Q4 2017. But again, the rumor was Volta for HPC in 2017; consumer GTX Volta cards certainly won't be seen until sometime in 2018.

Second, Nvidia should have plenty of headroom on 16nm FF+ after Pascal, having finally moved from 28nm. Volta could be to Pascal what Maxwell was to Kepler: a major refresh, not a completely new architecture. It's often been said that both Pascal and Volta will make use of NVLink.

Perhaps one difference between Pascal and Volta might be that Pascal uses HBM2, while Volta uses HBM3 or the Hybrid Memory Cube they originally planned for the Volta intended for 2016.

Pure speculation of course. Feel free to tell me where I'm wrong, as I probably am on some things.
or wrong on everything
 