
Nvidia announces 2014 CES Press Conference and Live Webcast - MAXWELL?

artist

Banned


SANTA CLARA, CA - NVIDIA (NASDAQ: NVDA) today announced that it will hold a press conference at the 2014 International Consumer Electronics Show (CES) on Sunday, Jan. 5, 2014, which will be webcast live.

The invitation-only event begins at 8 p.m. PT at the Cosmopolitan Hotel in Las Vegas. Interested parties can watch a live webcast, available on the NVIDIA blog, at http://blogs.nvidia.com. A replay will be available for seven days.

At CES, NVIDIA will showcase new NVIDIA® Tegra® mobile technologies, gaming innovations and advanced automotive display technologies. The company will have a presence at the Las Vegas Convention Center's South Hall 3, booth #30207.
http://nvidianews.nvidia.com/Releas...cast-Coverage-Dedicated-CES-Newsroom-a84.aspx

[Image: NVIDIA Maxwell]


According to SemiAccurate's sources, NVIDIA might reveal some details of the upcoming Maxwell architecture during CES in Las Vegas. Whether this will be a public announcement or a special meeting behind closed doors is yet to be confirmed.

As always take everything with a grain of salt.

Maxwell on 28nm process?
I've been hearing about Maxwell for a few months. The rumor was that NVIDIA plans two major GPUs: the GM107 processor, which would still be made on the 28nm process, and a new flagship (GM204) built on the 20nm process. There were rumors about a GM104, but apparently it was canceled.

Charlie claims that NVIDIA will stick to the 28nm process with the new architecture. He is basically saying there won't be any 20nm GPUs from NVIDIA in 2014. I find this highly unlikely; it would contradict all previous rumors and cause a huge disappointment.

One thing is certain though: there are no 20nm GPUs coming in the first or second quarter of 2014. The first Maxwell samples were made a few months ago; they are almost ready for mass production, and they are all on the 28nm fabrication process.

I could be wrong, but I have a feeling that the GM100-series parts (2014) are 28nm, while the GM200-series parts (late 2014/2015) are 20nm; we will verify this in the upcoming months.

NVIDIA Maxwell GM107 and GM204 release dates
According to S|A, the GM107 would be revealed in February. This is a mid-range GPU for cards like the GTX 860, maybe even the GTX 870. Big Maxwell, codenamed GM204, would arrive later, possibly in the fourth quarter. Of course, this would be the GTX 780 (Ti) replacement.

What is new in Maxwell?
The Maxwell architecture will be a derivative of Kepler, and the shaders will look similar to Kepler's. The new feature is coherent memory, and here we should mention Project Denver, which apparently is not yet ready to be implemented into Maxwell. Instead of using a separate CPU, NVIDIA will use a controller to take some load off the GPU. How is this going to affect performance? Let's wait and see.
http://videocardz.com/48610/nvidia-maxwell-details-revealed-ces-2014
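
To ground what "coherent memory" could mean in practice, here is a minimal sketch of the programming model it enables, using CUDA's managed-memory API (cudaMallocManaged, which arrived with CUDA 6 in the same timeframe). This illustrates the general idea only; it is not a description of how Maxwell actually implements coherence.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Double every element in place; runs on the GPU.
__global__ void doubleElements(float *data, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 2.0f;
}

int main()
{
    const int n = 1024;
    float *data = nullptr;

    // One allocation visible to both CPU and GPU -- no explicit
    // cudaMemcpy between separate host and device buffers.
    cudaMallocManaged(&data, n * sizeof(float));

    for (int i = 0; i < n; ++i) data[i] = (float)i;    // CPU writes

    doubleElements<<<(n + 255) / 256, 256>>>(data, n); // GPU updates in place
    cudaDeviceSynchronize();                           // make GPU writes visible again

    printf("data[10] = %f\n", data[10]);               // CPU reads the result directly
    cudaFree(data);
    return 0;
}
```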
 

Stallion Free

Cock Encumbered
No 20nm in 2014 would be a huge disappointment. It's hard enough waiting for Q4.

Expanded manufacturer support for G-sync before it even launches would make me slightly forgive them.
 

Bor

Neo Member
Looking forward to this. After I got three 280Xs and all of them had massive coil whine, I decided to wait and see what Nvidia has to offer next year. The Witcher 3 will probably force me to get a new card.
 

Joezie

Member
No 20nm would be a bummer, but so would an incomplete implementation of Project Denver.

These 2 days are going to be one hell of a ride.
 
Unlikely we'll get anything significant given the nature of CES. The best highlight we can expect is a raft of high-quality G-Sync monitor announcements, I would wager.
 

artist

Banned
Unlikely we'll get anything significant given the nature of CES. The best highlight we can expect is a raft of high-quality G-Sync monitor announcements, I would wager.
So they'll hold a livestream event twice for G-Sync? It sounds like more than that; it may not be Maxwell, but G-Sync and Tegra alone don't seem to warrant a livestream event.
 

Durante

Member
The best highlight we can expect is a raft of high-quality G-Sync monitor announcements, I would wager.
If we get a 1440p non-TN G-Sync monitor, that would actually be a bigger deal for me than Maxwell. But I don't think that will happen so soon.
 

Redmoon

Member
No 20nm would be disappointing, as that's when the "real" new line of GPUs starts.

Really hope my 680s will hold up till then, both performance-wise and lifespan-wise.

Still look forward to hearing anything new about Maxwell.
 

LiquidMetal14

hide your water-based mammals
Need to start seeing more on the G-Sync front, since having only one G-Sync-compatible display available is unacceptable. I already have a 27" IPS and want to get the ASUS 27" 144Hz monitor, but unlike its 24" counterpart it's mysteriously not G-Sync compatible. Stuff like this is giving me a headache.

I don't want to mix a 24" and a 27" together, as I'm going for dual-monitor gaming here.
 

Nachtmaer

Member
After reading so many articles and posts about how 20nm (the variant they use for GPUs, since there are different types of processes) won't be ready for H1 2014, or 2014 at all, I'd be surprised to see GM104/204 or even GM100 make it as early as nVidia hoped. I guess we'll see how things pan out.

GPU coil whine is sometimes caused by the framerate not matching the monitor's refresh rate.

From what I know, having coil whine is just a case of bad luck. It can happen to any brand. Even then it can depend on the actual card's manufacturer and whatnot.
 

pixlexic

Banned
Looking forward to this. After I got three 280Xs and all of them had massive coil whine, I decided to wait and see what Nvidia has to offer next year. The Witcher 3 will probably force me to get a new card.

GPU coil whine is sometimes caused by the framerate not matching the monitor's refresh rate.
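
If high uncapped framerates are indeed the trigger (game menus rendering at thousands of fps are the classic case), the usual workaround is vsync or a frame limiter. A hypothetical sketch of the latter, assuming a POSIX system; the function and names are mine:

```cuda
#include <time.h>

// Sleep out the remainder of a 1/60 s frame budget so the GPU idles
// instead of spinning at 1000+ fps. frame_ns: time the frame took to render.
static void cap_to_60fps(long frame_ns)
{
    const long budget_ns = 1000000000L / 60;          // ~16.7 ms per frame
    if (frame_ns < budget_ns) {
        struct timespec ts = { 0, budget_ns - frame_ns };
        nanosleep(&ts, NULL);                         // idle out the rest of the budget
    }
}
```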

 

riflen

Member
A Feb launch of a 28nm GPU is kind of strange on first inspection, but it makes complete sense if you've been following the state of 20nm GPU fabrication. It's going to be interesting to see where a GTX 860 would sit for price/performance.

I'll be disappointed if the integrated ARM CPU is not ready, as that was a really interesting idea to me.
 

GHG

Gold Member
If there's no flagship till Q4 then I'll probably look at getting a 780 ti soon.
 

McHuj

Member
If it's really on 28nm, the Maxwell architecture had better be amazing in terms of performance per watt, otherwise it will hardly be an upgrade.

IMO, they can't really push power consumption any more at the high end, and it's already too high at the mid-tier GPU level.
 

Nachtmaer

Member
If it's really on 28nm, the Maxwell architecture had better be amazing in terms of performance per watt, otherwise it will hardly be an upgrade.

IMO, they can't really push power consumption any more at the high end, and it's already too high at the mid-tier GPU level.

If they do go that route, my guess would be that GM104's die size would end up between GK104's and GK110's. They can toss out the compute stuff like they did with GK104, so they can achieve better gaming performance than GK110 without the chip being huge and soaking up a lot of power.
 

Bor

Neo Member
GPU coil whine is sometimes caused by the framerate not matching the monitor's refresh rate.

Well, the only difference the framerate in games made was the frequency of the noise, if that's what you mean. It went from this in games to a more high-pitched sound in menus. You could even hear it when I was watching a movie with two monitors connected, although it was quiet. As soon as some GPU load was applied, the thing started making noise, and I'm glad the shop took it back, because it would have driven me insane.
 

TronLight

Everybody is Mikkelsexual
If they do go that route, my guess would be that GM104's die size would end up between GK104's and GK110's. They can toss out the compute stuff like they did with GK104, so they can achieve better gaming performance than GK110 without the chip being huge and soaking up a lot of power.

Wouldn't that cripple performance when a game is using advanced PhysX stuff?
 
Bad optimization aside, I doubt there's much coming this year that will make my 680s sweat aside from The Witcher 3. G-Sync should make a 40 fps experience feel really solid for that, so I'll deal until the 20nm cards are out, although I'm kind of just waiting for 780 Tis to drop in price so I can go SLI on those, which should destroy everything until probably mid-gen at worst.
 

McHuj

Member
Wouldn't that cripple performance when a game is using advanced PhysX stuff?

It shouldn't. A lot of the compute in something like GK110 is targeted at scientific computing that requires 64-bit double-precision floats. You don't need that for a game engine (at least, I hope they're only doing it in single-precision floats).
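
To make the precision point concrete, here are two CUDA kernels doing the same multiply-add where only the data type differs; a sketch, not a benchmark. The rate comments reflect the commonly cited Kepler ratios (full GK110 does FP64 at 1/3 of FP32 rate; GeForce parts like GK104 are limited to around 1/24):

```cuda
#include <cuda_runtime.h>

// Single precision: what game engines overwhelmingly use.
__global__ void fma_fp32(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = fmaf(a[i], b[i], c[i]);  // full rate on all Kepler parts
}

// Double precision: what HPC/scientific workloads need.
__global__ void fma_fp64(const double *a, const double *b, double *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = fma(a[i], b[i], c[i]);   // heavily rate-limited on GeForce Kepler
}
```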
 
So they'll hold a livestream event twice for G-Sync? It sounds like more than that; it may not be Maxwell, but G-Sync and Tegra alone don't seem to warrant a livestream event.

See below your post. G-Sync is a massive thing and fits in with CES more than any Maxwell-type stuff does.
 

TronLight

Everybody is Mikkelsexual
It shouldn't. A lot of the compute in something like GK110 is targeted at scientific computing that requires 64-bit double-precision floats. You don't need that for a game engine (at least, I hope they're only doing it in single-precision floats).

I see. I asked because I heard that the main cause of TressFX's bad performance on the latest nVidia cards is the lack of proper compute support.
I know TressFX isn't a PhysX effect, but both require lots of compute power, right?
 

chaosblade

Unconfirmed Member
28nm would be unsurprising but still disappointing. No regrets buying a 760.

Interested in G-Sync though. Hopefully we see some monitors that aren't like $400, but I'm not counting on it.
 
Looks like I'm sticking with my 560 Ti 448-core until the 20nm stuff comes out or I find a good deal on a used/new card.
 
I see. I asked because I heard that the main cause of TressFX's bad performance on the latest nVidia cards is the lack of proper compute support.
I know TressFX isn't a PhysX effect, but both require lots of compute power, right?

It just happens that AMD cards are faster than Nvidia's in certain compute scenarios. This is why they absolutely mince Nvidia at mining (and quite a lot of other stuff, really).

Edit: http://www.extremetech.com/computing/153467-amd-destroys-nvidia-bitcoin-mining

The bottom half of this article explains why Nvidia lags in certain compute scenarios.
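
For a sense of the workload: Bitcoin hashing is double SHA-256, which is dominated by 32-bit integer rotates. A minimal sketch of that primitive (plain host code); the article credits much of AMD's lead to GCN executing the rotate in a single instruction, plus sheer ALU count:

```cuda
#include <stdint.h>

// 32-bit right-rotate, the core primitive of SHA-256.
static inline uint32_t rotr32(uint32_t x, unsigned n)
{
    return (x >> n) | (x << (32 - n));
}

// One of SHA-256's Sigma functions: three rotates and two XORs,
// executed billions of times per second while mining.
static inline uint32_t big_sigma0(uint32_t x)
{
    return rotr32(x, 2) ^ rotr32(x, 13) ^ rotr32(x, 22);
}
```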
 

Nachtmaer

Member
I see. I asked because I heard that the main cause of TressFX's bad performance on the latest nVidia cards is the lack of proper compute support.
I know TressFX isn't a PhysX effect, but both require lots of compute power, right?

From my understanding, TressFX uses DirectCompute, which is DirectX's implementation of compute (pretty self-explanatory and all). Somehow it ends up running faster on AMD/GCN cards than on nVidia's for reasons I don't remember. It depends on the type of workload, I guess.
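
For the curious, the shape of that workload looks roughly like this: one thread per hair vertex running an integration step. To be clear, this is a hypothetical CUDA analogue with names and simplifications of my own; the real TressFX ships as HLSL/DirectCompute shaders:

```cuda
#include <cuda_runtime.h>

// Hypothetical per-vertex Verlet integration step, constraints omitted.
__global__ void hairVerletStep(float3 *pos, float3 *prevPos, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float3 p = pos[i];
    float3 q = prevPos[i];
    prevPos[i] = p;  // current position becomes last frame's

    // new = 2*current - previous + gravity*dt^2
    pos[i].x = 2.0f * p.x - q.x;
    pos[i].y = 2.0f * p.y - q.y - 9.8f * dt * dt;
    pos[i].z = 2.0f * p.z - q.z;
}
```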
 

riflen

Member
If it's really on 28nm, the Maxwell architecture had better be amazing in terms of performance per watt, otherwise it will hardly be an upgrade.

IMO, they can't really push power consumption any more at the high end, and it's already too high at the mid-tier GPU level.

I've read that shrinking the process does not necessarily reduce power consumption overall. There is static (leakage) power and dynamic (switching) power, with leakage increasing as the process is shrunk. There are technologies designed to reduce the issue, but the problems are complex. I expect Maxwell to include some good design improvements over Kepler to make a 28nm Maxwell worth designing and manufacturing at all.

Great article here on the subject: http://www.extremetech.com/computing/123529-nvidia-deeply-unhappy-with-tsmc-claims-22nm-essentially-worthless
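
For reference, the first-order model behind the leakage/switching split (a textbook approximation, my addition, not from the article):

```latex
P_{\text{total}} \approx \underbrace{\alpha\, C\, V^{2} f}_{\text{dynamic (switching)}}
                 + \underbrace{V\, I_{\text{leak}}}_{\text{static (leakage)}}
```

where \alpha is the activity factor, C the switched capacitance, V the supply voltage, and f the clock. A shrink lowers C per transistor, but transistor counts rise and I_leak grows as gates get thinner, so total power does not automatically fall.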
 

McHuj

Member
I've read that shrinking the process does not necessarily reduce power consumption overall. There is static (leakage) power and dynamic (switching) power, with leakage increasing as the process is shrunk. There are technologies designed to reduce the issue, but the problems are complex. I expect Maxwell to include some good design improvements over Kepler to make a 28nm Maxwell worth designing and manufacturing at all.

Great article here on the subject: http://www.extremetech.com/computing/123529-nvidia-deeply-unhappy-with-tsmc-claims-22nm-essentially-worthless

Yes, that's true to an extent. Leakage is going up and up with shrinking nodes. It's really problematic in the mobile space, where it can dominate power consumption.

That's why Intel moved to FinFET transistors; they help a lot with leakage. TSMC's 16nm node is basically the 20nm node with FinFETs, so power should be better.

20nm should provide a power or performance boost. http://www.tsmc.com/english/dedicatedFoundry/technology/20nm.htm

TSMC is advertising a 25% power reduction or a 30% performance increase. Granted, that's marketing, but at least it will provide some gains. For desktop GPUs this should be OK; for laptop components it may be an issue.

My expectation is that we'll probably get boards that run at the same power consumption levels but with better performance (maybe ~1.25 GHz clocks).
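
Taking TSMC's marketing numbers at face value, the back-of-envelope math (my arithmetic, on an illustrative 250 W flagship board and a 1.0 GHz baseline clock):

```latex
\begin{aligned}
\text{iso-performance:}\quad & 250\ \text{W} \times (1 - 0.25) \approx 188\ \text{W}\\
\text{iso-power:}\quad       & 1.00\ \text{GHz} \times 1.30 = 1.30\ \text{GHz}
\end{aligned}
```

which is roughly consistent with the ~1.25 GHz guess above.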
 

Serandur

Member
I couldn't find anything online, so I assume Maxwell wasn't even mentioned at the event? That's massively disappointing. The Tegra K1 is pretty impressive-looking for a mobile chip, I guess, but meh. With the GTX 880M being a rebadge, Nvidia still releasing Kepler parts (Tegra K1 and GTX 790), and AMD having recently released their Volcanic Islands parts, I'm upset we might not be seeing any 20nm parts soon. :/
 