
Nintendo NX rumored to use Nvidia's Pascal GPU architecture


heidern

Junior Member
X1 peak consumption (512 GFLOPS single precision) is 10W, right? Let's say power consumption drops 30% because of 16nmFF (possibly going to 7W if the 10W I mentioned before is right). Would that be efficient enough to be put into the battery-powered portable console (let's leave the docked thing aside for a moment)?
What is the upper limit a portable console should consume?

16nmFF is 60% more power efficient. The upper limit depends on the size. If it's a handheld then supposedly it'll need to be closer to 2W, but if it's a tablet then it could be closer to 10W.
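For what it's worth, here's a quick back-of-the-envelope sketch in Python of the numbers being thrown around here. Every figure is an assumption from this exchange (the 10W X1 peak, the 30%/60% savings claims, and the ~2W handheld / ~10W tablet budgets), not measured data:

Code:
# Rough check of the figures above; every number is an assumption from this thread.
x1_peak_w = 10.0                              # assumed Tegra X1 peak draw (512 GFLOPS FP32)
budgets = {"handheld": 2.0, "tablet": 10.0}   # rough upper limits mentioned above

for saving in (0.30, 0.60):                   # 30% vs 60% claimed node power savings
    est_w = x1_peak_w * (1 - saving)
    fits = ", ".join(f"{name}: {'fits' if est_w <= cap else 'too hot'}"
                     for name, cap in budgets.items())
    print(f"{saving:.0%} saving -> ~{est_w:.0f} W ({fits})")

# 30% saving -> ~7 W (handheld: too hot, tablet: fits)
# 60% saving -> ~4 W (handheld: too hot, tablet: fits)

Either way, the chip would need to be clocked down (or docked) to fit a ~2W handheld budget.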
 

Eolz

Member
So with Gameblog getting Sony's date right, does this mean all of their NX rumors are real?

Some are likely real, some aren't. The only stuff they usually get right comes from Ubisoft and from retailers. They're usually far off on most third-party and power/technical-related rumors.
 

antonz

Member
So with Gameblog getting Sony's date right, does this mean all of their NX rumors are real?

Just means they heard from someone before Sony made it official. If they had called it weeks or months in advance, that would lend more weight. Saying "yo, it's going to be X" a day or two before isn't very special.
 

Kimawolf

Member
Can you please repost their rumors? I forgot them lol
http://www.neogaf.com/forum/showthread.php?t=1236809

These are my translations of some bits of information coming from Gameblog's own "sources". As usual, take with a grain of salt.

The NX was initially set to release in November 2016. Everything was ready for a launch in France, and worldwide.

According to our information, Nintendo had even reserved a lot of publicity space... before backing out. The machine was set to be accompanied by a line-up of 30 games. And then, the plans changed.

Even if they'll say otherwise, the arrival of new consoles from Sony and Microsoft has also spooked Nintendo a little.

This time, it was impossible to release, mid-gen, a system that's too far behind. The final specs of the NX will come close to the PS4 Neo, with components more "up to date" but not top of the line. We were told that Nintendo definitely wasn't part of that kind of race.

According to our sources, 3 major deals with third-party publishers also contributed to the delay... the games were simply not ready. Among the names mentioned, Capcom and Square Enix are part of Nintendo's allies in what is considered to be operation "reconquer". A large part of the titles from those publishers should release on NX, including a few exclusives.

The overall architecture of the system would be in large part tied to Android, with Miitomo playing a major part in the structure of the NX itself. Miitomo could even serve as the hub for games and evolve the interface over time. Finally, the system could allow up to 8 controller devices to be connected at once. Quite a departure from the single GamePad on Wii U.

Nintendo doesn't want to make the NX too pricey. They wish to stay the most affordable of the three. But that does not mean the system will be at a discounted price, either.
Our different sources claim the machine is "surprising and interesting". Aside from the reported problems linked to the delay, we haven't heard any negative impressions from developers.

Our sources point to September for the reveal of the NX. Nintendo indeed sounds interested in making an announcement timed with the Tokyo Game Show.
 

10k

Banned
They've really changed their tone since the Eurogamer post. Still super optimistic about it, but not talking about being close to Neo anymore (just being able to get PS4 ports easily), that kind of stuff.
(Also, "operation 'reconquer'" was a really bad translation, way too literal; it's closer to "winning back the hearts/market/etc. of".)
I'll carry the close-to-Neo dream for them.
 

fahr

Member
Let's assume the rumors are true, and the NX will be a portable tablet-style system using Nvidia Tegra tech to power it. What will this thing cost? What should it cost?

My poorly researched estimate is roughly $199-$249. This is based on a few factors. The Nvidia Shield would be a very similar device, and Nvidia, like Nintendo, likes to profit on their hardware, but I doubt Nintendo requires the margins Nvidia does since they also make money on software. The Shield costs $199 but has no screen, making me doubt Nintendo could hit a $199 price point with a built-in touch screen.

I can't imagine Nintendo wants this thing to cost more than $250. The 3DS was $250 and that was a massive blunder. But since Nintendo stated that they won't lose money on the hardware, I can't imagine it costing less than $199 with Tegra hardware and a touch panel.

For this reason I also expect the panel to be small. Like 3DS XL sized, not 8-10 inch tablet sized. Then again, I don't think the thermals of a Tegra would be OK inside a phone-sized casing.
 

LeleSocho

Banned
16nmFF is 60% more power efficient. The upper limit depends on the size. If it's a handheld then supposedly it'll need to be closer to 2W, but if it's a tablet then it could be closer to 10W.

Source? From what I remember from Intel's 22nm papers, FinFET provides 30% less power consumption, and since 16nm is only nominal (the die sizes are not actually smaller compared to planar 20nm), I think the only benefit comes from FinFET alone.
 

Kimawolf

Member
Let's assume the rumors are true, and the NX will be a portable tablet-style system using Nvidia Tegra tech to power it. What will this thing cost? What should it cost?

My poorly researched estimate is roughly $199-$249. This is based on a few factors. The Nvidia Shield would be a very similar device, and Nvidia, like Nintendo, likes to profit on their hardware, but I doubt Nintendo requires the margins Nvidia does since they also make money on software. The Shield costs $199 but has no screen, making me doubt Nintendo could hit a $199 price point with a built-in touch screen.

I can't imagine Nintendo wants this thing to cost more than $250. The 3DS was $250 and that was a massive blunder. But since Nintendo stated that they won't lose money on the hardware, I can't imagine it costing less than $199 with Tegra hardware and a touch panel.

For this reason I also expect the panel to be small. Like 3DS XL sized, not 8-10 inch tablet sized. Then again, I don't think the thermals of a Tegra would be OK inside a phone-sized casing.



I will be the one guy going out on a limb.

I will say this: the mobile NX device revealed next month will have a 7 inch screen, perhaps a 720p one, with a Pascal-based Tegra chip clocked higher than we all think.

The dock will charge the system, provide upscaling to 1080p, and allow the chip to run at full power. It'll be marketed as "Play your way."

Simple. Elegant marketing.
 
Looking at those old Gameblog rumours, they don't seem unbelievable, but they're also not that interesting.
The part about connecting 8 devices sounds sensible if they are introducing something like a revamped Wii Remote & Nunchuk but without the cord between them.
 

ggx2ac

Member
I really liked your vid also; I was even considering making a thread about it to help show folks what is fact vs speculation (like I did in my "What we know about NX so far with sources" threads). But I wasn't sure if you would've wanted that or not.

In my opinion, there wouldn't be much to discuss because there aren't many people on here who believe that Nintendo is using AMD with x86 etc. as the vendor for NX.

It all boils down to Nvidia not announcing a semi-custom win or announcing that they licensed out their tech; as someone pointed out, that's apparently what Nvidia mainly does: licensing out rather than doing semi-custom designs.

Then there's the fact that absence of evidence is not evidence of absence. This is with regards to the "AMD announced to investors 3 semi-custom design wins but Nvidia hasn't, therefore..." argument.

Also, game journalists have had info on Dwango, Osiris, Scorpio, and Neo. They'd surely have info on NX, but apparently the news that Nintendo is using Nvidia, as reported by Eurogamer, Kotaku, and WSJ (Weekly Shonen Jump :p), means that it's a "conspiracy theory!" according to a particular person. I'm sure that if NX really was using AMD as a vendor, another site like IGN or something equivalent would have claimed the current rumours to be untrue and given their own scoop to get clicks.

Finally, let's not forget we haven't had any insiders such as developers just waltz right into the Eurogamer NX rumour thread and just say, "No". We all know how entertaining that would have been.

The current NX rumours haven't been so easily debunked, and it's been a couple of weeks now. There was already an NX "leaked" controller and lots of Reddit and NeoGAF "insiders" who gave their insider information on NX but found themselves clashing with other insiders over particular rumours.

That's the summary of events, and hopefully things start to clear up before Nintendo even has to show off the NX. It's too bad we probably won't get to see, during Nvidia's webcast, an investor ask about NX and the CEO react like, "NX, NX? What you cooking?... *looks around, knocks jug of water over and runs off*"
 

antonz

Member
Source? From what I remember from Intel's 22nm papers, FinFET provides 30% less power consumption, and since 16nm is only nominal (the die sizes are not actually smaller compared to planar 20nm), I think the only benefit comes from FinFET alone.

They developed a more power efficient 16nmFF+, which has more or less taken over for 16nmFF. There is of course a lower-cost 16nmFFC.
 

ggx2ac

Member
Looking at those old Gameblog rumours, they don't seem unbelievable, but they're also not that interesting.
The part about connecting 8 devices sounds sensible if they are introducing something like a revamped Wii Remote & Nunchuk but without the cord between them.

But it doesn't sound impressive because the Wii U can connect up to 9 controllers.

Runbow uses a combination of the Wii U Gamepad and 8 Wii U Pro Controllers/Wii Remotes.
 
D

Deleted member 465307

Unconfirmed Member
Source? From what I remember from Intel's 22nm papers, FinFET provides 30% less power consumption, and since 16nm is only nominal (the die sizes are not actually smaller compared to planar 20nm), I think the only benefit comes from FinFET alone.

I saw the 60% number get passed around before as well, but I'm not sure where it came from. When I saw it previously, it was in reference to the move from the Maxwell X1 to the Pascal X2, but that seems suspiciously specific for a product we know basically nothing about.
 

Eradicate

Member
Thanks, here is my mockup based on the Wii U... food for thought

[image: controller mockup based on the Wii U]

Very cool!

They have that modular controller patent, and while it's just a patent, I could see Nintendo doing modular parts like this as long as those parts were useful! Like a LEGO controller!

Let's assume the rumors are true, and the NX will be a portable tablet-style system using Nvidia Tegra tech to power it. What will this thing cost? What should it cost?

My poorly researched estimate is roughly $199-$249. This is based on a few factors. The Nvidia Shield would be a very similar device, and Nvidia, like Nintendo, likes to profit on their hardware, but I doubt Nintendo requires the margins Nvidia does since they also make money on software. The Shield costs $199 but has no screen, making me doubt Nintendo could hit a $199 price point with a built-in touch screen.

I can't imagine Nintendo wants this thing to cost more than $250. The 3DS was $250 and that was a massive blunder. But since Nintendo stated that they won't lose money on the hardware, I can't imagine it costing less than $199 with Tegra hardware and a touch panel.

For this reason I also expect the panel to be small. Like 3DS XL sized, not 8-10 inch tablet sized. Then again, I don't think the thermals of a Tegra would be OK inside a phone-sized casing.

Good figuring! I'm betting $200 ($150 if it's "weakened" somehow internally), adding on $50 for a worthy bundle (special case and a game cartridge with a downloaded game on it... I'm an optimist!). Touch screens don't have to run up the cost as much as people figure, but it may also depend on whether the screen is free-form (as was originally said two years ago, that Sharp was doing this for Nintendo) or some random thing. Maybe it'll be bendable for all I know?!

I will be the one guy going out on a limb.

I will say this: the mobile NX device revealed next month will have a 7 inch screen, perhaps a 720p one, with a Pascal-based Tegra chip clocked higher than we all think.

The dock will charge the system, provide upscaling to 1080p, and allow the chip to run at full power. It'll be marketed as "Play your way."

Simple. Elegant marketing.

Another optimist! I'm thinking a 6-7 inch screen too, leaving open the "form factors" in the future for a smaller 3-5 inch one and a larger 9 inch one down the road. (Why not?!) The NPad.

In my opinion, there wouldn't be much to discuss because there aren't many people on here who believe that Nintendo is using AMD with x86 etc. as the vendor for NX.

It all boils down to Nvidia not announcing a semi-custom win or announcing that they licensed out their tech; as someone pointed out, that's apparently what Nvidia mainly does: licensing out rather than doing semi-custom designs.

Then there's the fact that absence of evidence is not evidence of absence. This is with regards to the "AMD announced to investors 3 semi-custom design wins but Nvidia hasn't, therefore..." argument.

Also, game journalists have had info on Dwango, Osiris, Scorpio, and Neo. They'd surely have info on NX, but apparently the news that Nintendo is using Nvidia, as reported by Eurogamer, Kotaku, and WSJ (Weekly Shonen Jump :p), means that it's a "conspiracy theory!" according to a particular person. I'm sure that if NX really was using AMD as a vendor, another site like IGN or something equivalent would have claimed the current rumours to be untrue and given their own scoop to get clicks.

Finally, let's not forget we haven't had any insiders such as developers just waltz right into the Eurogamer NX rumour thread and just say, "No". We all know how entertaining that would have been.

The current NX rumours haven't been so easily debunked, and it's been a couple of weeks now. There was already an NX "leaked" controller and lots of Reddit and NeoGAF "insiders" who gave their insider information on NX but found themselves clashing with other insiders over particular rumours.

That's the summary of events, and hopefully things start to clear up before Nintendo even has to show off the NX. It's too bad we probably won't get to see, during Nvidia's webcast, an investor ask about NX and the CEO react like, "NX, NX? What you cooking?... *looks around, knocks jug of water over and runs off*"

Good summary so far!

Also, I'd love to see some water jugs get knocked over!
 

maxcriden

Member
Regarding what Eradicate said above about a bundled game: I hadn't thought about it much before reading his post, but now I agree. I think it's a safe bet that Nintendo will include some kind of game with the system to show off its features. Wii had Wii Sports and Wii U had Nintendo Land, so I suspect NX, being positioned as a handheld and console, will come with a game of some kind also, presumably one that shows off the system's uniqueness.
 

ggx2ac

Member
It's from the official TSMC site; TSMC is the foundry in Taiwan that produces the chips.

Source: http://www.tsmc.com/english/dedicatedFoundry/technology/16nm.htm

Thanks for this, it's very interesting because there shouldn't be much issue with yields.

Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving. By leveraging the experience of 20SoC technology, TSMC 16FF+ shares the same metal backend process in order to quickly improve yield and demonstrate process maturity for time-to-market value.
 
Regarding what Eradicate said above about a bundled game: I hadn't thought about it much before reading his post, but now I agree. I think it's a safe bet that Nintendo will include some kind of game with the system to show off its features. Wii had Wii Sports and Wii U had Nintendo Land, so I suspect NX, being positioned as a handheld and console, will come with a game of some kind also, presumably one that shows off the system's uniqueness.
Xenoblade X: Good Face Edition (open-world gaming in the palm of your hands)
 

ozfunghi

Member
Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving. By leveraging the experience of 20SoC technology, TSMC 16FF+ shares the same metal backend process in order to quickly improve yield and demonstrate process maturity for time-to-market value.

Err... Does this mean that WHILE 40% faster, it is ALSO 60% more power efficient on TOP of that? As in, it will perform 40% better than the TX1 WHILE consuming 60% less power? Or is it either one or the other?
 

ggx2ac

Member
Err... Does this mean that WHILE 40% faster, it is ALSO 60% more power efficient on TOP of that? As in, it will perform 40% better than the TX1 WHILE consuming 60% less power? Or is it either one or the other?

Well, anyone who knows its current clock speed and wattage should know.

So we are estimating a 40% increase in clock speed and a 60% reduction in wattage.

If that makes any sense.

Edit: Referencing a random article:

The Tegra X1 GPU has a 1GHz clock speed; would it manage to get to 1.4GHz while decreasing the wattage by 40%?

Edit 2: Referencing this article http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3

They were testing the X1 and claim the GPU was running at 1.5W for their tests using "Manhattan"?

So if we just applied a 40% reduction to that as if it meant nothing,

then it would go down to 0.9W?

Of course this is just guesswork, so dispute it.
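To make that guesswork reproducible, here's the same arithmetic as a tiny sketch (the 1.5W figure is the GPU power AnandTech reported for the X1 in GFXBench "Manhattan"; the flat 40% cut is just this post's assumption):

Code:
measured_w = 1.5                       # AnandTech's reported X1 GPU draw in "Manhattan"
reduced_w = measured_w * (1 - 0.40)    # assume a flat 40% reduction from the node change
print(f"{reduced_w:.1f} W")            # -> 0.9 W, the figure above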
 

MDave

Member
Err... Does this mean that WHILE 40% faster, it is ALSO 60% more power efficient on TOP of that? As in, it will perform 40% better than the TX1 WHILE consuming 60% less power? Or is it either one or the other?

TSMC say:
TSMC's 16FF+ (FinFET Plus) technology can provide above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology. Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving. By leveraging the experience of 20SoC technology, TSMC 16FF+ shares the same metal backend process in order to quickly improve yield and demonstrate process maturity for time-to-market value.
 

ozfunghi

Member
Well, anyone who knows its current clock speed and wattage should know.

So we are estimating a 40% increase in clock speed and a 60% reduction in wattage.

If that makes any sense.

Sure, but it would be a much larger jump from the TX1 than I (and most of us) were expecting. That would theoretically mean it might push around 700 GFLOPS while drawing less than half the power? This could be huge, especially for "handheld" mode (if there even is a dock mode).

Maybe Thraktor or an equivalent nerd (<3 Thraktor) can see how close to full speed the TX2 could get in handheld mode if this is true? It might explain the lack of different "modes" (handheld/dock)? And might explain if (!) the TX1 in the devkits was actually overclocked.
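For anyone wondering where the ~700 GFLOPS figure comes from, it's just the standard FP32 throughput formula (2 ops per FMA x CUDA cores x clock) with the X1's 256 cores and the speculated +40% clock plugged in:

Code:
# FP32 throughput = 2 (an FMA counts as two ops) x cores x clock
cores = 256                       # Tegra X1 CUDA core count
for clock_ghz in (1.0, 1.4):      # stock clock vs the speculated +40%
    print(f"{clock_ghz} GHz -> {2 * cores * clock_ghz:.0f} GFLOPS FP32")

# 1.0 GHz -> 512 GFLOPS FP32
# 1.4 GHz -> 717 GFLOPS FP32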
 

ozfunghi

Member
TSMC say:
TSMC's 16FF+ (FinFET Plus) technology can provide above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology. Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving. By leveraging the experience of 20SoC technology, TSMC 16FF+ shares the same metal backend process in order to quickly improve yield and demonstrate process maturity for time-to-market value.

Yes, that's what I quoted. But their wording is interesting:

"above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology"

"Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving."

That's why I'm asking. I'm still thinking it's either 60% more efficient OR 40% faster?
 

ggx2ac

Member
Yes, that's what I quoted. But their wording is interesting:

"above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology"

It's probably saying "or" with respect to the part I bolded. I assume they're using density as a measurement and converting it to watts.

Edit: It's more dense, so it probably takes up less space (hence the die shrink), and possibly less energy is lost as heat, which would mean it's more energy efficient.
 

Kouriozan

Member
This sounds like such a big piece of crap I can't even understand why people give them credit.

"Operation reconquer", lol.
Reconquer (in French, reconquête) doesn't have as strong a meaning as in English; in this context it just means getting back some lost audience, not getting all gamers onto a Nintendo platform.
By the way, I hope nobody is expecting "PS4 Neo" power anymore, which invalidates this rumor, even more so given that it's supposedly a hybrid.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Reconquer (in French, reconquête) doesn't have as strong a meaning as in English; in this context it just means getting back some lost audience.
By the way, I hope nobody is expecting "PS4 Neo" power anymore, which invalidates this rumor, even more so given that it's supposedly a hybrid.

Haven't seen anyone aside from SMD expecting this. The vast majority of people in these threads have reasonable expectations.
 

MDave

Member
Yes, that's what I quoted. But their wording is interesting:

"above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology"

"Comparing with 20SoC technology, 16FF+ provides extra 40% higher speed and 60% power saving."

That's why I'm asking. I'm still thinking it's either 60% more efficient OR 40% faster?

Yes, I would agree with you on the "or" assumption. Take a look at their 20nm page; the wording they use there suggests the same reading, as they don't use the word "and":

TSMC's 20nm process technology can provide 30 percent higher speed, 1.9 times the density, or 25 percent less power than its 28nm technology.
 

ggx2ac

Member
Yes, I would agree with you on the "or" assumption. Take a look at their 20nm page; the wording they use there suggests the same reading, as they don't use the word "and":

Keep in mind, as I just pointed out, they keep saying "or" when they reference density.

It's more dense, so it takes up less space; hence less energy should be lost as heat when running, so it should be more power efficient.
 

G.ZZZ

Member
As I interpreted it, you can get up to 40% more speed, and you spend 60% less energy at the same speed. So for an X1, you can go at 1 GHz and consume 4W instead of 10W (same speed), or you can go up to 1.4 GHz, but at what wattage I don't know.
 

ggx2ac

Member
As I interpreted it, you can get up to 40% more speed, and you spend 60% less energy at the same speed. So for an X1, you can go at 1 GHz and consume 4W instead of 10W (same speed), or you can go up to 1.4 GHz, but at what wattage I don't know.

With yours it should be 1.4GHz at (Wrong wattage). However, I am assuming this 60% energy efficiency is linear until there is real-world performance to show whether it ends up using more watts, since it loses more energy as heat as it does more work.

Edit: I probably read it wrong.
It could be that it manages 1.4GHz at 6.4W.
 

G.ZZZ

Member
With yours it should be 1.4GHz at 10W. However, I am assuming this 60% energy efficiency is linear until there is real-world performance to show whether it ends up using more watts, since it loses more energy as heat as it does more work.

Edit: I probably read it wrong.
It could be that it manages 1.4GHz at 6.4W.

Power and speed are not linear. In general, the higher the frequency, the more you have to spend (in terms of watts) for each increase you want. That's why you can go to half power and keep 80% of the frequency, or something like that. Would need a graph.
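A crude model of why that's the case: dynamic power scales roughly with voltage squared times frequency, and voltage itself has to rise with frequency, so power grows roughly with the cube of the clock. This sketch is illustration only (real chips have voltage floors, leakage, and binning):

Code:
# Toy DVFS model: P ~ f * V^2, with V assumed proportional to f, so P ~ f^3.
def relative_power(f_ratio):
    return f_ratio ** 3

for f in (0.8, 1.0, 1.4):
    print(f"{f:.0%} clock -> {relative_power(f):.0%} power")

# 80% clock  -> 51% power   (the "half power for ~80% of the speed" rule of thumb)
# 100% clock -> 100% power
# 140% clock -> 274% power  (so the 4W @ 1 GHz example above would be ~11W @ 1.4 GHz)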
 

MuchoMalo

Banned
Thank you.
I knew about 16nmFF+ but sincerely didn't imagine it was that much more efficient than normal 16nmFF.
60% more efficiency from FF+, plus a little bit more from Pascal, and it can lead to very interesting scenarios...

Compared to 20nm Planar, not 16nmFF.

I should note that they say that:

TSMC's 16FF+ (FinFET Plus) technology can provide above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology.

Yet when comparing the current desktop Pascal cards to their Maxwell counterparts, efficiency is anywhere from 45-69% better depending on the card, game, and resolution, and the gap closes the lower the resolution gets. (Also, the density improvement is around 1.9x at best.) One big thing which should help NX over the Tegra X1, though, is if it has a 128-bit bus. 50 GB/s + Pascal compression should be a good boost over the X1 at similar clock speeds.
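As a sanity check on that bandwidth figure: ~50 GB/s is what a 128-bit bus gives at LPDDR4-3200 rates (the 3200 MT/s speed is my assumption; the post only mentions the bus width and the ~50 GB/s total):

Code:
bus_bits = 128                     # assumed memory bus width
transfer_rate = 3200e6             # assumed LPDDR4-3200, transfers per second
bandwidth = bus_bits / 8 * transfer_rate / 1e9
print(f"{bandwidth:.1f} GB/s")     # -> 51.2 GB/s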

With yours it should be 1.4GHz at (Wrong wattage). However, I am assuming this 60% energy efficiency is linear until there is real-world performance to show whether it ends up using more watts, since it loses more energy as heat as it does more work.

Edit: I probably read it wrong.
It could be that it manages 1.4GHz at 6.4W.

As said above, it's not linear. That's why you can get better efficiency from running more cores at lower clocks than fewer cores at higher clocks.
 

ggx2ac

Member
Power and speed are not linear. In general, the higher the frequency, the more you have to spend (in terms of watts) for each increase you want. That's why you can go to half power and keep 80% of the frequency, or something like that. Would need a graph.

That's why I said "assume": they made it sound good on paper, but I assume it would require more power to do work at that frequency because of energy lost as heat.

Thanks for confirming that, though.
 

ozfunghi

Member
So, basically, we'll get TX1 performance at 60% less power. Which is what I thought to begin with. An added 40% performance (or speed) bump would be too good to be true.
 

ggx2ac

Member
So, basically, we'll get TX1 performance at 60% less power. Which is what I thought to begin with. An added 40% performance (or speed) bump would be too good to be true.

Nooooo stop!

You're giving the dock people some ammo!

"It's the dock! The dock I tells you! The dock will give NX all the power!"

/jk
 
I know this thread has gone very off track, but I saw a post on Videocardz that I thought was relevant. The GTX 1070M is rumored to have more CUDA cores than the desktop variant. Why? They get better performance at a given thermal limit with more cores clocked lower than with fewer cores clocked higher.

Something very similar was discussed with regards to the number of SMs for the NX. Given that Nvidia seems to be doing this already for Pascal in the mobile environment, I don't think it's unreasonable to think the NX will also see a similar high-SM, low-clock strategy. 3-4 SMs with a relatively low clock in mobile mode and a significantly higher clock in dock mode would make a lot of sense as long as the cost is reasonable.
Intel did this with their iGPUs as well.
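Using the same crude P ~ f^3 scaling from earlier in the thread, you can see why wide-and-slow wins: doubling the SM count while halving the clock keeps throughput constant but cuts dynamic power to a quarter. A sketch only; it ignores static leakage, die cost, and voltage floors:

Code:
# Throughput ~ SMs x clock; dynamic power ~ SMs x clock^3 (V assumed ~ clock).
def perf(sms, f):  return sms * f
def power(sms, f): return sms * f ** 3

for name, sms, f in [("2 SMs @ 1.0", 2, 1.0), ("4 SMs @ 0.5", 4, 0.5)]:
    print(f"{name}: perf {perf(sms, f):.1f}, relative power {power(sms, f):.2f}")

# 2 SMs @ 1.0: perf 2.0, relative power 2.00
# 4 SMs @ 0.5: perf 2.0, relative power 0.50  <- same throughput, 1/4 the dynamic power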
 

orioto

Good Art™
So, basically, we'll get TX1 performance at 60% less power. Which is what I thought to begin with. An added 40% performance (or speed) bump would be too good to be true.

But less power means higher clock, no? So it's still more power in the end!
 

ozfunghi

Member
But less power means higher clock, no? So it's still more power in the end!

No, more power efficient means they "could" decide to clock it higher (because less power = less heat). So, it's either 40% more powerful at the same power draw, or it's equally powerful at 60% less power draw... if I'm understanding it correctly. They could of course choose to go for something a bit more powerful (+/-20%) while still drawing (+/-30%) less power.

But considering this is also for a handheld, I'm pretty sure Nintendo will go for the 60% less power draw.
 