
DF: Nintendo NX Powered By Nvidia Tegra! Initial Spec Analysis


blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I was thinking a bit about X2.

Nintendo was willing to work with Nvidia on the 3DS using Tegra, which wasn't exactly cheap at the time, and the rumours say it was dropped because it didn't meet the power consumption target.
With the Wii U they targeted power consumption/efficiency rather than CPU/GPU power, and cost didn't seem to be a top priority.

So using an X2 doesn't seem too far-fetched in my opinion, seeing as it's supposed to be more efficient than the X1. Or some kind of custom X1 in between.
The highest-probability scenario is that TN1 is something developed in parallel with T-Next, thus sharing a sufficient degree of tech with the latter, but not being T-Next per se.
 

KingSnake

The Birthday Skeleton
The highest-probability scenario is that TN1 is something developed in parallel with T-Next, thus sharing a sufficient degree of tech with the latter, but not being T-Next per se.

Yeah, that would make the timeline more realistic (unless Nvidia really shared info with Nintendo early on). The question would be how much it would share with T-Next.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Yeah, that would make the timeline more realistic (unless Nvidia really shared info with Nintendo early on). The question would be how much it would share with T-Next.
Same SM as a minimum, would be my largely-uneducated guess.
 

sfried

Member
So using an X2 doesn't seem too far-fetched in my opinion, seeing as it's supposed to be more efficient than the X1. Or some kind of custom X1 in between.
If you're going on the same thought, a custom X1 would seem the likeliest. But this is Nintendo we're talking about, so I'm not even sure if using something from Nvidia is a done deal or not. They might end up pulling off another custom chip a la PICA, but that wouldn't fit exactly with having industry-leading chips.
 

KingSnake

The Birthday Skeleton
If you're going on the same thought, a custom X1 would seem the likeliest. But this is Nintendo we're talking about, so I'm not even sure if using something from Nvidia is a done deal or not. They might end up pulling off another custom chip a la PICA, but that wouldn't fit exactly with having industry-leading chips.

I think it's much less likely they will switch to anything else now, because even the standard X1 is not that power-hungry, especially at lower clocks.
 
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.
 

sfried

Member
I think it's much less likely they will switch to anything else now, because even standard X1 is not that power hungry, especially at lower clocks.
Well, that's the thing: what's up with the mentioned "overclocked" Tegra X1 being in devkits? That seems to suggest it's going to be a custom chip from Nvidia (an X1.5?), if that's the case.
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.
I'd rather wait and see how this thing pans out before I make any judgements about whether to consider it a handheld or a hybrid.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.
That's quite an interesting statement to make, given we don't really know what 'this' is, so why would Sony make something 'a little faster'?
 

KrawlMan

Member
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.

Why do so many people assume these rumors are the whole picture? Assuming the dock is just HDMI out, literally nothing about this leak is novel or worthy of secrecy.

I may not expect Nintendo to make some super-powered Neo-grade home console, but assuming this leak tells the whole picture is obtuse. Whatever they're hiding, it's not some copycat idea.
 
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.

I agree that the NX is a handheld system, but you need some perspective before you call it 'average' in terms of power. Was the power of the PS4 'average' for a game console just because we anticipated its specs before it released?

Power is relative and usually has to do with precedent and frame of reference. For the mobile industry, the most powerful GPU is a chip that hasn't even been released. Aside from this chip, the X1 is the most powerful mobile chip currently available. The NX will be somewhere between these two frames of reference, even though there will be many handheld devices (i.e. smartphones) released in 2017 that will be less powerful than either of these options. I don't consider that to be an average amount of portable power.
 
That's quite an interesting statement to make, given we don't really know what 'this' is, so why would Sony make something 'a little faster'?

The Vita was about as fast as it conceivably could have been when it was announced, but the X1 is 18 months old already.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The Vita was about as fast as it conceivably could have been when it was announced, but the X1 is 18 months old already.
A quad Cortex-A9 at ~444MHz and a quad SGX54x was "as fast as it conceivably could have been" in December '11? Let's see...

iPad 2: dual Cortex-A9 @1GHz + dual SGX54x: March 11, 2011
iPad 3: dual Cortex-A9 @1GHz + quad SGX54x: March 16, 2012

https://en.wikipedia.org/wiki/IPad#iPad_series

Seems to me Sony were just using a top-bracket, industry-standard design.
 
That's a PC though. Games consoles just don't target CPU power, which tbh is why it being better or worse than X is a bit negligible.

The bottleneck in a handheld is going to be memory. This is just going to be a next-gen handheld imo, with the power implications of that. 3D will be dropped, which helps, but there is still going to be a big (imo) trade-off between visuals, resolution and frame rate.
The Pixel C is a tablet
 
Right, because the NX has launched with a TX1.

The post you're quoting is about the X1, because evidence is heavily weighted towards it being an X1. If you want to say "let's wait and see!" then fair enough, but you probably won't get much out of discussion in this thread.
 

KingSnake

The Birthday Skeleton
The post you're quoting is about the X1, because evidence is heavily weighted towards it being an X1. If you want to say "let's wait and see!" then fair enough, but you probably won't get much out of discussion in this thread.

I think you forgot which post you replied to when this discussion started.
 

gogogow

Member
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.
There's nothing standard about a 1 TFLOPS GPU for a portable device; that is twice the FLOPS of the Snapdragon 820's Adreno 530. This is top of the line. You don't know what you're talking about.
 
I think you forgot which post you replied to when this discussion started.

I replied to blu saying "you don't know what 'this' is" in response to a post saying that a hypothetical Vita 2 would be faster than a hypothetical X1-powered NX ("this").

I mean, if hypothetical discussion is worthless, why even have the thread!
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The post you're quoting is about the X1, because evidence is heavily weighted towards it being an X1. If you want to say "let's wait and see!" then fair enough, but you probably won't get much out of discussion in this thread.
If you actually read my post, that's exactly what I said. A devkit with an allegedly overclocked TX1 does not indicate in the slightest the actual product will be the same overclocked TX1.

I replied to blu saying "you don't know what 'this' is" in response to a post saying that a hypothetical Vita 2 would be faster than a hypothetical X1-powered NX ("this").

I mean, if hypothetical discussion is worthless, why even have the thread!
A "hypothetical vita2" made of the dreams of, erm, sony aficionados would be more powerful than anything with the slightest foothold in reality. That much we agree.
 
There's nothing standard about a 1 TFLOPS GPU for a portable device; that is twice the FLOPS of the Snapdragon 820's Adreno 530. This is top of the line. You don't know what you're talking about.

Will the NX support a 1 TFLOPS GPU? That's great for a handheld hybrid :)
 

KingSnake

The Birthday Skeleton
I replied to blu saying "you don't know what 'this' is" in response to a post saying that a hypothetical Vita 2 would be faster than a hypothetical X1-powered NX ("this").

First of all, you're trying to wage a hypothetical console war, which is outside the scope of this thread and usually not something that is encouraged around here.

Secondly, yes, you replied to a post stating "you don't know what it is" and then later complained that you can't get much of a discussion when blu said the exact same thing once again. You're practically trying to start a war out of thin air.
 

Jaagen

Member
Will the NX support a 1 TFLOPS GPU?

Probably not, but it's too early to tell. Maybe, if the chip is based on NVIDIA's latest Pascal architecture and gets a clock boost when connected to the dock. However, I wouldn't count on it.

Ok, with what we discussed before, it makes sense that the A57s would have the advantage due to the clock speed, but this is a lot more lopsided than I expected.

Assuming that LCGeek is accurate about the CPU difference, is it possible that NV is considering replacing the four little A53 cores inside the X1 with A57s for the NX?

It seems likely that it will be some sort of custom solution. I guess four A57s make sense (or perhaps even A72s!).
 
In 2017 a portable games console with an X1 is just that, a portable games console, nothing more or less. It is entirely the expected average performance of a handheld released in the present day, neither particularly "supercharged" nor underpowered, just standard.

If Sony were to release a Vita 2 for example, you might expect this or even a little faster.

Nintendo are not making a home console this generation, they are a handheld gaming company as of 2017.

HDMI out on an X1 does not = home console, no matter how much Nintendo may try to market it as such.


It's not standard in any way. Underclocked, it would be on par with the iPhone 6s Plus.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Ok, with what we discussed before, it makes sense that the A57s would have the advantage due to the clock speed, but this is a lot more lopsided than I expected.

Assuming that LCGeek is accurate about the CPU difference, is it possible that NV is considering replacing the four little A53 cores inside the X1 with A57s for the NX?
A far more likely scenario for NX would be to keep the big.LITTLE setup, so it could use only 4x A53s on the go, and then add to those 4x A72s (worst case 4x A57s) in the dock.
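Just to put rough numbers on that split: a minimal sketch, where the DMIPS/MHz values are rough public estimates for the cores and the clocks are made up for illustration; nothing below comes from the leaks.

```python
# Back-of-the-envelope CPU throughput for the cluster split described above.
# DMIPS/MHz values are rough public estimates (A53 ~2.3, A72 ~4.7) and the
# clock speeds are invented; none of these figures are from the leaks.

def cluster_dmips(cores, dmips_per_mhz, clock_mhz):
    """Aggregate integer throughput of one CPU cluster, in DMIPS."""
    return cores * dmips_per_mhz * clock_mhz

portable = cluster_dmips(4, 2.3, 1000)           # 4x A53 on the go
docked = portable + cluster_dmips(4, 4.7, 1500)  # plus 4x A72 in the dock

print(f"portable: {portable:,.0f} DMIPS")                           # 9,200
print(f"docked:   {docked:,.0f} DMIPS ({docked / portable:.1f}x)")  # 37,400 (4.1x)
```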
 

MuchoMalo

Banned
A far more likely scenario for NX would be to keep the big.LITTLE setup, so it could use only 4x A53s on the go, and then add to those 4x A72s (worst case 4x A57s) in the dock.

The problem is that Tegra X1 can't use both sets of cores simultaneously.
 
The Vita was about as fast as it conceivably could have been when it was announced, but the X1 is 18 months old already.


Not true. While it had great specs, it was clocked dramatically low.
Also, let's see about your claim:
https://imgtec.com/news/press-relea...-sgx543mp-multi-processor-graphics-ip-family/

February 2009. The Vita was announced in January 2011. Nice try.
By the way, the X1 is already 18 months old and it remains one of the fastest SoCs, for a good reason: we have yet to see it in a smaller form factor.
 

MuchoMalo

Banned
There's nothing standard about a 1 TFLOPS GPU for a portable device; that is twice the FLOPS of the Snapdragon 820's Adreno 530. This is top of the line. You don't know what you're talking about.

Will the NX support a 1 TFLOPS GPU? That's great for a handheld hybrid :)

Once again, the X1's GPU does not run at 1 TFLOPS; it's 0.5. The "1 TFLOPS" figure is half-precision.
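For anyone wondering where the 0.5 and 1 TFLOPS figures come from, here's a minimal sketch, assuming the commonly cited TX1 configuration of 256 Maxwell CUDA cores at a ~1GHz boost clock:

```python
# TX1 theoretical GPU throughput. Assumes 256 CUDA cores (2 SMs x 128)
# at ~1 GHz; one fused multiply-add (FMA) counts as 2 floating-point ops.

cuda_cores = 256
gpu_clock_hz = 1.0e9
ops_per_fma = 2

fp32 = cuda_cores * gpu_clock_hz * ops_per_fma  # single precision
fp16 = fp32 * 2                                 # Maxwell issues FP16 as 2-wide vectors

print(f"FP32: {fp32 / 1e12:.2f} TFLOPS")  # ~0.51
print(f"FP16: {fp16 / 1e12:.2f} TFLOPS")  # ~1.02
```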

If you actually read my post, that's exactly what I said. A devkit with an allegedly overclocked TX1 does not indicate in the slightest the actual product will be the same overclocked TX1.

It's likely a custom chip. I think we're close enough to launch that the dev kit shouldn't be significantly faster than the final version, so I'd imagine that the final chip will be as fast as or faster than this overclocked X1.

Not true. While it had great specs, it was clocked dramatically low.
Also, let's see about your claim:
https://imgtec.com/news/press-relea...-sgx543mp-multi-processor-graphics-ip-family/

February 2009. The Vita was announced in January 2011. Nice try.
By the way, the X1 is already 18 months old and it remains one of the fastest SoCs, for a good reason: we have yet to see it in a smaller form factor.

We have seen it in the Pixel C tablet. Considering that, I don't see it being less than 500MHz on the GPU in NX's portable mode.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The problem is that Tegra X1 can't use both sets of cores simultaneously.
Once again, TX1 problems are not necessarily TN1's problems. It's not like ARM don't offer off-the-shelf coherency fabric to link two ARMv8 clusters in concurrent mode.
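If the clusters really are linked coherently and both are visible to the OS (global task scheduling), software can place work on either cluster. A minimal Linux-only sketch; the core numbering is hypothetical, not anything known about the NX:

```python
# Moving a process between hypothetical little and big clusters on Linux.
# Core IDs are made up for illustration; real topology varies per SoC.

import os

LITTLE_CORES = {0, 1, 2, 3}  # assumed A53 cluster
BIG_CORES = {4, 5, 6, 7}     # assumed A57/A72 cluster

pid = os.getpid()
os.sched_setaffinity(pid, LITTLE_CORES)  # run on the little cluster
print("affinity:", sorted(os.sched_getaffinity(pid)))
os.sched_setaffinity(pid, BIG_CORES)     # migrate to the big cluster
print("affinity:", sorted(os.sched_getaffinity(pid)))
```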
 

Thraktor

Member
If you're going on the same thought, a custom X1 would seem the likeliest. But this is Nintendo we're talking about, so I'm not even sure if using something from Nvidia is a done deal or not. They might end up pulling off another custom chip a la PICA, but that wouldn't fit exactly with having industry-leading chips.

I think it's pretty safe to say that Nvidia is a done deal. We're far too late for Nintendo to switch it out, and if they've already done so then there wouldn't be any sense in leaving devs with Nvidia-powered dev kits.

the post you're quoting is about the X1, because evidence is heavily weighted towards it being an X1. if you want to say "let's wait and see!" then fair enough, but you probably won't get much out of discussion in this thread.

Actually, evidence is weighted towards it not being a TX1:


  1. Nintendo have never used an off the shelf CPU or GPU in any of their home consoles or handhelds. Even in their earlier days, when they used "close to stock" chips on occasion, they were always customised in some manner. Their hardware has only become further customised as they've gone along.
  2. The TX1 in the dev kits is actively cooled. Nintendo won't be releasing an actively cooled handheld.
  3. Dev kits given to third parties typically use the closest off the shelf chips to approximate the performance of the final device. For a custom Nvidia chip, the TX1 would be the closest off the shelf option for dev kits. Final hardware doesn't typically arrive with third parties until a few months before launch, so it would be normal for them to still have kits with approximate hardware.
  4. The TX1 is manufactured on 20nm, and every vendor who produced any mobile SoCs on 20nm (there weren't many) shifted their fabrication to 14nm/16nm extremely quickly. Mid-range chips are skipping 20nm altogether, moving straight from 28nm to 14nm/16nm. This suggests that there's no cost benefit to 20nm as opposed to the newer nodes, and as they offer a big jump in efficiency there's no reason for Nintendo to use a 20nm chip.
The only possible reason I could see for Nintendo to use a 20nm SoC would be if Nvidia had made a big wafer commitment and were willing to give Nintendo an obscenely good deal on it, but even then we'd be talking about a custom chip rather than TX1.
 
Sorry if this has been answered already, but how do the FLOPS in Tegra compare to the FLOPS in the Wii U GPU? As we know the Wii U has fewer than the 360, but they work better by being more modern, so are Tegra's FLOPS better still or about the same?
 

MuchoMalo

Banned
This is the type of thing Nintendo could modify. If they have the correct kernel, cache coherence, and scheduler, they could use the cores any way they want.

To my understanding, this is a hard limitation of the chip's hardware. Nintendo will most likely use a custom chip in the end, but I'm not sure if they can change that specific limitation without heavy, costly modifications. I'm also not sure that they have any actual incentive to do so.

Once again, TX1 problems are not necessarily TN1's problems. It's not like ARM don't offer off-the-shelf coherency fabric to link two ARMv8 clusters in concurrent mode.

But can it be supported by Tegra X1's IP? If by some miracle Nintendo is using a Parker-based chip they might be able to work some magic, but I feel that Nintendo would have told devs that they were if that were the plan.

Also, I don't believe that rumor from that forum post at all. Who ever heard of predicting leaks months in advance? And leaking price information? And being a developer who knows about a chip which no other dev seems to know about yet, considering that Eurogamer/DF have heard that X1 is all devs have heard about? Seriously, I don't understand why anyone is taking that seriously. It's such an obvious hoax that it's not even funny.

Half-precision FLOPS are still FLOPS. You can argue that they're less efficient, but that doesn't change the number of floating-point operations per second.

The issue is that people are comparing half-precision to single-precision and falling for a marketing trick. And none of it really matters in gaming either way!
 
Sorry if this has been answered already, but how do the FLOPS in Tegra compare to the FLOPS in the Wii U GPU? As we know the Wii U has fewer than the 360, but they work better by being more modern, so are Tegra's FLOPS better still or about the same?

Historically speaking, NVIDIA's FLOPS have been more efficient than AMD's FLOPS (the Wii U's GPU is an AMD chip).

Even if the Tegra X1 had the same amount of FLOPS as the Wii U GPU, it would likely outperform it.
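To illustrate (and only to illustrate) why raw FLOPS don't compare directly across vendors, weight each figure by how much of the hardware a real workload keeps busy. The utilisation factors below are made-up placeholders, not measurements:

```python
# Illustrative only: raw FLOPS weighted by an assumed utilisation factor.
# 176 GFLOPS is the commonly cited Wii U (Latte, AMD VLIW5) figure; 512
# GFLOPS is the TX1's FP32 peak at ~1 GHz. Utilisation factors are invented.

wii_u_gflops = 176
tx1_gflops = 512

vliw5_util = 0.6    # hypothetical: VLIW5 relies on 5-wide instruction packing
maxwell_util = 0.9  # hypothetical: scalar SIMT keeps ALUs busier

print(f"Wii U effective: {wii_u_gflops * vliw5_util:.0f} GFLOPS")  # ~106
print(f"TX1 effective:   {tx1_gflops * maxwell_util:.0f} GFLOPS")  # ~461
```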
 
The issue is that people are comparing half-precision to single-precision and falling for a marketing trick. And none of it really matters in gaming either way!

Actually, if you compare its half-precision performance with the half-precision performance of any other mobile chip, it's still just as impressive.
 

MuchoMalo

Banned
I think it's pretty safe to say that Nvidia is a done deal. We're far too late for Nintendo to switch it out, and if they've already done so then there wouldn't be any sense in leaving devs with Nvidia-powered dev kits.



Actually, evidence is weighted towards it not being a TX1:


  1. Nintendo have never used an off the shelf CPU or GPU in any of their home consoles or handhelds. Even in their earlier days, when they used "close to stock" chips on occasion, they were always customised in some manner. Their hardware has only become further customised as they've gone along.
  2. The TX1 in the dev kits is actively cooled. Nintendo won't be releasing an actively cooled handheld.
  3. Dev kits given to third parties typically use the closest off the shelf chips to approximate the performance of the final device. For a custom Nvidia chip, the TX1 would be the closest off the shelf option for dev kits. Final hardware doesn't typically arrive with third parties until a few months before launch, so it would be normal for them to still have kits with approximate hardware.
  4. The TX1 is manufactured on 20nm, and every vendor who produced any mobile SoCs on 20nm (there weren't many) shifted their fabrication to 14nm/16nm extremely quickly. Mid-range chips are skipping 20nm altogether, moving straight from 28nm to 14nm/16nm. This suggests that there's no cost benefit to 20nm as opposed to the newer nodes, and as they offer a big jump in efficiency there's no reason for Nintendo to use a 20nm chip.
The only possible reason I could see for Nintendo to use a 20nm SoC would be if Nvidia had made a big wafer commitment and were willing to give Nintendo an obscenely good deal on it, but even then we'd be talking about a custom chip rather than TX1.

Right, but if Nintendo is using 16nmFF+, shouldn't they have told devs by now that the final chip will be Pascal-based? (There's no point in putting Maxwell on 16FF+, since that's already what Pascal basically is.) I want to believe as well, but usually with Nintendo as of late dev kits get weaker as launch nears, not more powerful. Also, shouldn't Nvidia be able to send out Parker chips by now? It's probably just gonna be the X1 with a wider bus or some eSRAM.

Actually, if you compare its half-precision performance with the half-precision performance of any other mobile chip, it's still just as impressive.

No, it's half as impressive... :p
 
Right, but if Nintendo is using 16nmFF+, shouldn't they have told devs by now that the final chip will be Pascal-based? (There's no point in putting Maxwell on 16FF+, since that's already what Pascal basically is.) I want to believe as well, but usually with Nintendo as of late dev kits get weaker as launch nears, not more powerful. Also, shouldn't Nvidia be able to send out Parker chips by now? It's probably just gonna be the X1 with a wider bus or some eSRAM.

I would imagine that the number of people who have the target specs for the final console is far smaller than the number of people who get to see and interact with devkits, so the leaker(s) might not know exactly what the final target chip is.

I agree that Nintendo in the past has typically gone weaker than their devkits, since devkits need that extra overhead, so I wouldn't necessarily expect an improvement, but they also place a premium on power consumption, especially in a handheld. So it could go either way in my mind.

Edit:
You may have to elaborate on where you got that from. If you're talking about the Wii U, I believe that the (vague) specs listed in the initial document were different from the very first dev kits, but they did get progressively stronger after that.

Isn't it common for all console manufacturers to provide extra power in the devkits to allow all of the development software to run smoothly in addition to the games? Or am I making that up?

Edit2: Never mind, misunderstood Malo's post: I expect the final product to be slightly weaker than the final devkit, not necessarily the devkits themselves getting weaker as revisions are sent out.
 
Right, but if Nintendo is using 16nmFF+, shouldn't they have told devs by now that the final chip will be Pascal-based? (There's no point in putting Maxwell on 16FF+, since that's already what Pascal basically is.) I want to believe as well, but usually with Nintendo as of late dev kits get weaker as launch nears, not more powerful. Also, shouldn't Nvidia be able to send out Parker chips by now? It's probably just gonna be the X1 with a wider bus or some eSRAM.

You may have to elaborate on where you got that from. If you're talking about the Wii U, I believe that the (vague) specs listed in the initial document were different from the very first dev kits, but they did get progressively stronger after that.
 

Thraktor

Member
A far more likely scenario for NX would be to keep the big.LITTLE setup, so it could use only 4x A53s on the go, and then add to those 4x A72s (worst case 4x A57s) in the dock.

If you're going to dedicate the die space to A72s you might as well leave them enabled in handheld mode, even if they're running at 500-600MHz or so. I would also be surprised if there's a big difference in CPU performance between handheld and docked mode (if any). Games will have to work in both modes, and I can't imagine developers wanting to add a load of CPU-intensive functionality for docked mode only to have to rip it out while in handheld mode (without significantly altering how the game plays). It would seem a lot simpler to me to leave CPU performance the same (or at least close) in both modes, and use any increase in TDP in docked mode purely for bumped GPU clocks.
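A quick sketch of that trade-off, assuming a 2-SM (256-core) Maxwell part; the clocks below are invented for illustration, not leaked figures:

```python
# "Same CPU in both modes, spend the docked TDP on GPU clock" in numbers.
# 256 CUDA cores assumed; handheld/docked clocks are hypothetical.

CUDA_CORES = 256
OPS_PER_CORE_PER_CYCLE = 2  # one FMA per core per cycle

def gpu_gflops(clock_mhz):
    return CUDA_CORES * OPS_PER_CORE_PER_CYCLE * clock_mhz / 1000

handheld_mhz, docked_mhz = 400, 1000  # invented values
print(f"handheld: {gpu_gflops(handheld_mhz):.0f} GFLOPS FP32")  # ~205
print(f"docked:   {gpu_gflops(docked_mhz):.0f} GFLOPS FP32")    # ~512
```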

Right, but if Nintendo is using 16nmFF+, shouldn't they have told devs by now that the final chip will be Pascal-based? (There's no point in putting Maxwell on 16FF+, since that's already what Pascal basically is.) I want to believe as well, but usually with Nintendo as of late dev kits get weaker as launch nears, not more powerful. Also, shouldn't Nvidia be able to send out Parker chips by now? It's probably just gonna be the X1 with a wider bus or some eSRAM.

As you say, though, there's not much difference between Maxwell and Pascal aside from the manufacturing process (unless you're talking GP100 Pascal, which wouldn't be used in this scenario), and Nintendo may simply be waiting until final hardware is available before confirming all the final details with third parties. And if the NX's SoC is a 2xSM part, then the TX1 would actually be a better match for that than Parker (assuming the latter uses 3 or 4 SMs).
 
I would imagine that the number of people who have the target specs for the final console is far smaller than the number of people who get to see and interact with devkits, so the leaker(s) might not know exactly what the final target chip is.

I agree that Nintendo in the past has typically gone weaker than their devkits, since devkits need that extra overhead, so I wouldn't necessarily expect an improvement, but they also place a premium on power consumption, especially in a handheld. So it could go either way in my mind.

Edit:


Isn't it common for all console manufacturers to provide extra power in the devkits to allow all of the development software to run smoothly in addition to the games? Or am I making that up?

Edit2: Never mind, misunderstood Malo's post: I expect the final product to be slightly weaker than the final devkit, not necessarily the devkits themselves getting weaker as revisions are sent out.
Isn't the common practice with dev kits that they simply have more RAM?
 