
A Nintendo Switch has been taken apart

AmyS

Member
Nothing new, really. Basically it is still a battle between Eurogamer and Foxconn, and we will most likely only know the answer when someone scans the SOC.


Based on the pictures in the OP we have reasonably identified (maybe 90% sure) the RAM as two 2GB modules, for a total of 4GB with 25.6GB/s of bandwidth.
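
For anyone wondering where that 25.6GB/s figure comes from, it's just the standard bandwidth calculation, assuming the two packages are LPDDR4-3200 on a combined 64-bit bus (which is what that number implies); a quick back-of-envelope check:

# Back-of-envelope check of the quoted bandwidth (assumes LPDDR4-3200, two 32-bit packages).
transfer_rate_mts = 3200                         # mega-transfers per second per pin
bus_width_bits = 2 * 32                          # two 32-bit LPDDR4 packages
print(transfer_rate_mts * bus_width_bits / 8 / 1000, "GB/s")   # -> 25.6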

Beyond that we aren't really any closer to figuring out final specs. I think we determined the SoC in the OP is almost exactly the same size as a Tegra X1, though that doesn't really tell us anything beyond the fact that anything more than 256 CUDA cores is practically impossible.

This gives us no indication of 20nm vs 16nm and certainly doesn't tell us anything about CPU cores or clock speeds.

It's also possible that the unit in the OP is not a final retail unit. That's about it I think.

The latest specs are: we have no idea. It could be anywhere from 4x A57s (3 usable) at 1GHz with 256 CUDA cores at 768MHz docked, to 4x A72s at 1.78GHz with 256 CUDA cores at 921MHz docked. Or anywhere in between. Or the launch clocks are the former but could be raised after launch, potentially up to the latter.
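
For a rough sense of what that clock range means on the GPU side (back-of-envelope only, assuming a stock 256-core Maxwell layout at 2 FP32 FLOPs per core per clock):

# FP32 throughput at the two rumoured docked GPU clocks.
cuda_cores = 256
for clock_mhz in (768, 921):                     # Eurogamer docked vs Foxconn docked
    print(clock_mhz, "MHz ->", round(cuda_cores * 2 * clock_mhz / 1000), "GFLOPS FP32")
# roughly 393 vs 472 GFLOPS, i.e. only about a 20% gap on the GPU side; the CPU gap is the bigger unknown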

Thank you guys.

Trying to get caught up with all the Switch discussion while sipping on some hot tea :)
 
It suffered more from their devotion to low power consumption than from BC. The CPU core count and the low shader count are what made it a slow console.
The BC did cripple them, in that it meant using an outdated node for the GPU and a CPU that had long been outperformed in power efficiency, with crippled SIMD support.
Not only that, the in-house R&D was admittedly insane and the price of the package was estimated at $100 by Chipworks. What made the console so expensive wasn't the GamePad, it was the absurd innards it had. Not even the Xbone SoC was that expensive.
 
Cuningas de Häme said:
This. As always. Although I'd never buy a PS4 or XBox, I have a nice PC and most of the console games don't interest me in any way.

Switch has a very unique premise. I am eagerly waiting for it. It will be used so much when I am not able to be on my PC.

This is the major factor for the system, and why the key for Switch ports is not to run as well as their counterparts, but to run well enough. Portability grants a bonus to any Switch version of a game that the others simply will not have, and indeed for games that aren't so demanding that any of the systems would bottleneck, it'll arguably be the best value by default. Sure, you could play Sonic Mania on PC, PS4 and XBO, but if all versions run at 60 fps, then it's the Switch that comes out ahead with the additional incentive of being usable on the go.

I suppose that's why there's also a particular interest in trying to figure out just what this thing is capable of. More than just a specs comparison, it's whether or not it'll be enough to play games that have never been associated with portability, well, portably. This is why they came out with Skyrim as one of the first reveals for the system, because it is a pretty damned good example of that.
 

Hermii

Member
My Switch is being unboxed, docked and used exclusively as a home console, so any games that are superior on the PS4 Pro will be purchased for that console.

In other words, it's a first-party-only box for you.

This is the major factor for the system, and why the key for Switch ports is not to run as well as their counterparts, but to run well enough. Portability grants a bonus to any Switch version of a game that the others simply will not have, and indeed for games that aren't so demanding that any of the systems would bottleneck, it'll arguably be the best value by default. Sure, you could play Sonic Mania on PC, PS4 and XBO, but if all versions run at 60 fps, then it's the Switch that comes out ahead with the additional incentive of being usable on the go.

I suppose that's why there's also a particular interest in trying to figure out just what this thing is capable of. More than just a specs comparison, it's whether or not it'll be enough to play games that have never been associated with portability, well, portably. This is why they came out with Skyrim as one of the first reveals for the system, because it is a pretty damned good example of that.

Yea, the difference between Foxconn and Eurogamer GPU-wise is not that significant, but CPU-wise the difference is huge and could be the difference between a game getting ported or not. That's why people care.
 

AzaK

Member
That's just not an appropriate reading of the Switch. The system is full of impressive tech, and isn't really "cheap" in any way.

Could it have been more powerful? Yes. But that's true of every system ever released.

Sorry, I was talking specifically about the "power". Nintendo will cheap out on that the first chance they get to save money or, in this case, do something else. So long as they get their Joy-Cons in there (which, to be honest, will get almost no use out of their extra tech of any real significance) then they'll be happy to have the console be quite weak. To them, I imagine, Wii U levels of power is considered "great".
 

KtSlime

Member
Sorry, I was talking specifically about the "power". Nintendo will cheap out on that the first chance they get to save money or, in this case, do something else. So long as they get their Joy-Cons in there (which, to be honest, will get almost no use out of their extra tech of any real significance) then they'll be happy to have the console be quite weak. To them, I imagine, Wii U levels of power is considered "great".

What is a more powerful chip they could have used to make a hybrid console?
 

KingSnake

The Birthday Skeleton
I just can't believe that Nintendo would tell the developers in August-September "you need to work around these CPU limitations to port your game" only to come back 1-2 months later with "we changed our mind, unlimited CPU powaaa! Btw, we changed the CPU cores completely".
 

z0m3le

Banned
Sorry, I was talking specifically about the "power". Nintendo will cheap out on that the first chance they get to save money or, in this case, do something else. So long as they get their Joy-Cons in there (which, to be honest, will get almost no use out of their extra tech of any real significance) then they'll be happy to have the console be quite weak. To them, I imagine, Wii U levels of power is considered "great".

The expectation for this device was largely a Wii U on the go; it's more powerful than a Wii U even when undocked and around 4 times more powerful when docked. Nothing about the Switch seems cheap. 20nm required a ton of active cooling for instance, and if it's 16nm it still needs the active cooling to avoid throttling. The battery is as big as it could be, the screen resolution is 720p, and from major media sites we've heard it is the best screen ever on a handheld.
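
To put some rough numbers on the Wii U comparison (back-of-envelope, assuming a stock 256-core Maxwell GPU at Eurogamer's 307.2/768MHz clocks and the commonly cited ~176 GFLOPS figure for Wii U's Latte; treat the FP16 column as a best case, since not all code can use half precision):

# Very rough GPU throughput comparison vs Wii U (FP32, plus Maxwell's double-rate FP16).
wii_u_gflops = 176                               # commonly cited Latte figure
for label, clock_mhz in (("portable", 307.2), ("docked", 768)):
    fp32 = 256 * 2 * clock_mhz / 1000
    fp16 = fp32 * 2                              # TX1-class Maxwell doubles throughput at FP16
    print(label, round(fp32), "GFLOPS FP32 /", round(fp16), "GFLOPS FP16 /",
          round(fp16 / wii_u_gflops, 1), "x Wii U at FP16")
# -> portable ~157/315, docked ~393/786, which is where the "around 4x when docked" ballpark comes from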

I mean, if you don't like the hybrid concept that is fine, but you have to meet with reality at some point, a Nintendo console at $300 that was as powerful as a PS4 Pro would still lack 3rd party support, it would still have missed games like ME4 and have a tough climb with 3rd party western developers.

Nintendo made a tablet that can replace your 3DS and your Wii U; it's the most powerful handheld ever released, and if you have a problem with Nintendo's PR wording, why do you care? PR is PR and they have a lot of 3DS devices to sell. What IPs does 3DS get that you don't think Switch will? Because it isn't mainline Pokemon (confirmed by Eurogamer) or Fire Emblem.

I just can't believe that Nintendo would tell the developers in August-September "you need to work around these CPU limitations to port your game" only to come back 1-2 months later with "we changed our mind, unlimited CPU powaaa! Btw, we changed the CPU cores completely".

Prerelease development is a hard life. Wii U had the Green Hills compiler issue, and some developers had to work with 2 CPU cores @ 1GHz as the launch specs; there was a day 1 patch for Wii U to clean up the OS some, and the OS hogged a lot of CPU resources and loaded everything super slowly. Final Wii U specs were 3 CPU cores @ 1.24GHz, which is a similar increase to what Switch would be seeing. I agree it should be done better, but it doesn't indicate that it is wrong. Also, I am not sure that it would need A72s to hit Foxconn's clocks; looking at Samsung's similar 14nm, the SoC could draw as much as 4 watts and reach those clocks on A57s.
 

KingSnake

The Birthday Skeleton
Prerelease development is a hard life. Wii U had the Green Hills compiler issue, and some developers had to work with 2 CPU cores @ 1GHz as the launch specs; there was a day 1 patch for Wii U to clean up the OS some, and the OS hogged a lot of CPU resources and loaded everything super slowly. Final Wii U specs were 3 CPU cores @ 1.24GHz, which is a similar increase to what Switch would be seeing. I agree it should be done better, but it doesn't indicate that it is wrong. Also, I am not sure that it would need A72s to hit Foxconn's clocks; looking at Samsung's similar 14nm, the SoC could draw as much as 4 watts and reach those clocks on A57s.

I'm not saying it is impossible. Yes, it can happen. But I just can't see why something like this would happen. In the July docs the usable clocks were not yet communicated; then waiting 1-2 months to communicate a set of clocks, and then another 1-2 months to communicate a totally different set, sounds like an unfortunate series of events. Why not wait 3-4 months from July and communicate only once, if they already knew they were going for a different fabrication process (and possibly a different CPU)? This is not like on the Wii U, freeing up a core and increasing the frequency for everyone. This is going in a different direction altogether.

These things are not decided on the spot. To go into full production on 16nm in October means preparing everything (both the SoC and the console itself going through sample production, QA etc.).

If they didn't already know in August that they would be using a 16nm SoC in October, then everything was one hell of a crazy ride.
 

z0m3le

Banned
I'm not saying it is impossible. Yes, it can happen. But I just can't see why something like this would happen. In the July docs the usable clocks were not yet communicated; then waiting 1-2 months to communicate a set of clocks, and then another 1-2 months to communicate a totally different set, sounds like an unfortunate series of events. Why not wait 3-4 months from July and communicate only once, if they already knew they were going for a different fabrication process (and possibly a different CPU)? This is not like on the Wii U, freeing up a core and increasing the frequency for everyone. This is going in a different direction altogether.

These things are not decided on the spot. To go into full production on 16nm in October means preparing everything (both the SoC and the console itself going through sample production, QA etc.).

If they didn't already know in August that they would be using a 16nm SoC in October, then everything was one hell of a crazy ride.

Eurogamer's source is where the timeline you are working with comes from; it was brought up that July devkits were still being used by some developers as recently as December. If Eurogamer's source isn't a Nintendo developer or a very close Japanese developer, they might not have gotten final hardware/specs until much later.

Think about it this way, if you are an indie developer and you go through NoA, they don't know the same information as Nintendo internally does in Japan, so that updated doc could be old information when they got it.

Remember, it also said "clocks at launch". I don't see any 3rd party software that would push the console, and there are no traditional multiplatform ports, so who exactly are you worried about with these specs? The Bomberman team? Just look at these games:

[image: Switch launch line-up]

As for why they didn't wait, it is much the same as my questioning about the OP picture: if these SoCs were ready in July, why not put them in the July devkits?
 

Hermii

Member
Remember, it also said "clocks at launch". I don't see any 3rd party software that would push the console, and there are no traditional multiplatform ports, so who exactly are you worried about with these specs? The Bomberman team? Just look at these games:

The battery life for Zelda is 3 hours. Assuming that's at Eurogamer clocks, there isn't much room for upclocks.
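
Back-of-envelope on what that implies, assuming the ~4310mAh / 3.7V (roughly 16Wh) battery figure from the spec sheet is right:

# Implied average system power draw if Zelda really drains the battery in ~3 hours.
battery_wh = 4.31 * 3.7                          # ~16Wh pack (assumed)
print(round(battery_wh / 3, 1), "W average draw")    # -> ~5.3W for screen + SoC + everything else
# With the screen, wifi etc. sharing that ~5W budget, a big portable upclock would eat into it fast.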
 

z0m3le

Banned
The battery life for Zelda is 3 hours. Assuming that's at Eurogamer clocks, there isn't much room for upclocks.

You think Eurogamer's source is working on Zelda?!

The point of my post was that no 3rd party developers would need to worry about their launch games with Eurogamer's clocks. Internally, Nintendo could have still been testing final clocks and just offered a target clock for developers working inside the launch window.
 

Hermii

Member
You think Eurogamer's source is working on Zelda?!

The point of my post was that no 3rd party developers would need to worry about their launch games with Eurogamer's clocks. Internally, Nintendo could have still been testing final clocks and just offered a target clock for developers working inside the launch window.

You think Nintendo would ration out differently clocked dev kits based on the game?

edit: I guess what you are saying is a theoretical possibility.
 

KingSnake

The Birthday Skeleton
Eurogamer's source is where the timeline you are working with comes from; it was brought up that July devkits were still being used by some developers as recently as December. If Eurogamer's source isn't a Nintendo developer or a very close Japanese developer, they might not have gotten final hardware/specs until much later.

Think about it this way, if you are an indie developer and you go through NoA, they don't know the same information as Nintendo internally does in Japan, so that updated doc could be old information when they got it.

Remember, it also said "clocks at launch". I don't see any 3rd party software that would push the console, and there are no traditional multiplatform ports, so who exactly are you worried about with these specs? The Bomberman team? Just look at these games:

So you're assuming that Eurogamer's timeline is not true? Or Eurogamer's clocks didn't happen? We know from the July development docs that there were no recommended clocks yet defined at the end of July. So it happened sometime after.

And why are we talking about "at launch"? The retail units are either on 16nm or on 20nm. That can't be changed via a patch afterwards. That decision had to be taken way before October, when mass production started. And way before the SoC mass production started.

As for why they didn't wait, it is much the same with my questioning about the OP picture, if these SoCs were ready in july, why not put them in the july devkits?

We actually don't know if the SoC changed from July to October.
But assuming it did, the answer to your question is the same as the assumption behind my question: in a big company (and Nintendo is one, and we know from different stories over the years that the bureaucracy in NCL is quite slow) it is very difficult to plan, prototype, test, produce and deliver in a very short time. Having the SoC manufactured in mid-July and the final product already reaching the consumer (the dev in this case) within the same month is almost magic.
 

z0m3le

Banned
You think Nintendo would ration out differently clocked dev kits based on the game?

That isn't what I said. Back when that document was written, the July devkits were available but not the final devkits, at least if the timeline we are working with is correct. However, yes, indie devs didn't get final hardware until late. Eurogamer's source is likely a launch game source, and it could be Skylanders or, more likely, Just Dance (because Ubisoft), but even these games aren't demanding enough to need more performance, so Nintendo had no one at launch that needed more than this.

I am just speculating on how that information came about. Btw, does anyone know what specs were supposed to be inside the Nvidia portable device that they put through the FCC in July last year? Interesting timing...
 
I'm not saying it is impossible. Yes, it can happen. But I just can't see why something like this would happen. In the July docs the usable clocks were not yet communicated; then waiting 1-2 months to communicate a set of clocks, and then another 1-2 months to communicate a totally different set, sounds like an unfortunate series of events. Why not wait 3-4 months from July and communicate only once, if they already knew they were going for a different fabrication process (and possibly a different CPU)? This is not like on the Wii U, freeing up a core and increasing the frequency for everyone. This is going in a different direction altogether.

These things are not decided on the spot. To go into full production on 16nm in October means preparing everything (both the SoC and the console itself going through sample production, QA etc.).

If they didn't already know in August that they would be using a 16nm SoC in October, then everything was one hell of a crazy ride.

While you are making a few assumptions (1GHz could have been on 4 cores or A72s, EG leak could be based only on the July devkits) I do agree that it would be a pretty weird circumstance if they raised the CPU clocks by this much after explicitly limiting them after July. But I can see how it could make sense.

It could be possible that Nintendo initially had only 3 A57 cores available to developers, and did not limit the clock speeds then. After that, they had done sufficient work on the OS to unlock that fourth core and said, okay, now that we have 4 cores let's limit the clock speed to 1GHz for the time being. After that, they began developing 16nm devkits with A72s (possibly) and said, okay, at this same power envelope we can push 3 (or 4) A72s to 1.78GHz, so we'll use that for these newer devkits and the retail units.

Like you said it seems like a quite odd and confusing series of events, but I can see it being possible.

You think Nintendo would ration out differently clocked dev kits based on the game?

We heard from a few reliable sources that some developers were still working with old July devkits as recently as November/December, if I recall correctly. I'm sure that's not uncommon, with smaller partners like indies not being first in line for the most updated devkits.

EDIT:
We are working with rumors and speculation here... while Capcom themselves said they are okay and got what they needed from Nintendo. They said they want AAA experiences on Switch. That doesn't sound like a console that is lacking. Obviously we have to take a wait-and-see approach, but it's better than what Wii U got from developers, judging by their statements.

A standard TX1 with the Eurogamer clocks could be enough for Capcom's AAA efforts. A higher CPU clock should make ports a lot easier, but I don't think the EG specs and clocks would actually prevent any AAA game from running with downgrades.
 

KingSnake

The Birthday Skeleton
While you are making a few assumptions (1GHz could have been on 4 cores or A72s, EG leak could be based only on the July devkits) I do agree that it would be a pretty weird circumstance if they raised the CPU clocks by this much after explicitly limiting them after July. But I can see how it could make sense.

It could be possible that Nintendo initially had only 3 A57 cores available to developers, and did not limit the clock speeds then. After that, they had done sufficient work on the OS to unlock that fourth core and said, okay, now that we have 4 cores let's limit the clock speed to 1GHz for the time being. After that, they began developing 16nm devkits with A72s (possibly) and said, okay, at this same power envelope we can push 3 (or 4) A72s to 1.78GHz, so we'll use that for these newer devkits and the retail units.

Like you said it seems like a quite odd and confusing series of events, but I can see it being possible.

Sure it can be possible. But in my opinion, to start mass-production in October the decision about the fabrication node and customisation of the SoC must have been taken way before July. Which would make the subsequent set of messages for the devs quite a crazy ride.
 

Salex_

Member
The Switch isn't even out yet, so you can't really judge that it would fail right out of the gate. It seems pretty clear to me that Nintendo is rushing the Switch, and it shows from their lack of info on the OS (OS features, lack of web browser, Netflix), online content, and around 7 games at launch. Come November, Nintendo is going to have really, really strong support from Japanese devs. Hell, the 1st party line-up alone would blow away Wii U's first three years if everything comes out as planned from Nintendo's end (Splatoon 2, Super Mario Odyssey, Xenoblade Chronicles 2) and the leaks (Super Smash Bros 4, Pokemon Stars). Also, we haven't heard all of the 3rd party support for this fall/winter yet either. I feel fairly confident we'll hear something from Activision and Ubisoft. I would be legit surprised if we don't get an Assassin's Creed and Call of Duty game this year from them.

Another factor is that portability is pretty underrated. Not just because it combines handheld and home console install bases and frees up Nintendo devs' time by focusing mostly on one platform, but because a lot of people are interested in having a home console experience on the go. If the Switch is successful, you can expect Sony and Microsoft to emulate that success and make their own hybrid versions too, just like how they emulated Wii's success with motion controls.

If you look at all the PS3/PS4-Vita features, you can tell they wanted to have a hybrid system but the mobile tech wasn't ready. Maybe by the time the PS5 comes out they can have an optional hybrid SKU for those who want a handheld. It seems like common sense if we can have PS4 or PS4 Pro-ish graphics by then.
 

z0m3le

Banned
Sure it can be possible. But in my opinion, to start mass-production in October the decision about the fabrication node and customisation of the SoC must have been taken way before July. Which would make the subsequent set of messages for the devs quite a crazy ride.

Think about it in reverse... Before Eurogamer's clocks, developers were using 3 A57 cores at up to 2GHz and probably the full 1GHz GPU. Nintendo's target battery life was 5 to 8 hours, so maybe 16nm was always what they wanted to achieve and around July they had not completed testing of the SoC we see taped out in the OP, so they gave developers, who are launching basically Wii U ports and indie titles, the power envelope of the 20nm chip instead. After testing, they found the higher clocks they wanted to use and told developers with final hardware that their games would run perfectly. I mean, Zelda is the only Switch game with any frame rate drops afaik, and they could have been working with the higher clocks all along.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
They could have gone with Jaguar,
It really helps when discussing such matters to keep an eye on the prospective vendor's timeline, lest we sway into Wishful Thinking land. The first Jaguar APUs showed up in May 2013 (Kabini, Temash). Wii U was a late-2011-turned-2012 product. Moreover, there are no 40nm Jaguars. A Jaguar-based Wii U would've been pushed back yet another year, while using a bleeding-edge fab node on top of that. So in reality, for the relevant Wii U timeframe, AMD had Bobcats (cue the cries of '64-bit ALUs are not real SIMD!' from the "anonymous dev" crowd).

Was A15 capable enough? It would have given them up to a 2.5GHz quad core fairly easily without breaking the power consumption budget much. Even on a 32nm process it would have been a big deal for Wii U as well.
Sure, A15 was capable enough. Who'd have supplied that? And what about the GPU? - Who'd have supplied that? Definitely not AMD (not with this CPU), and NV's Tegra 4 (earliest A15 design) came out in Q2 2013. Same issue as with Jaguar - not matching the timeline.

NEC's embedded memory forced the GPU's process node; if they had gone without that, they could have gone for a 32nm shrink with the embedded E6760 (39W on 40nm)
They could have, if they targeted that TDP. Let's not pin that to BC.


They could have used an AMD APU from the time with a 6000 series GPU and 2GB GDDR5 for a healthy ~500GFLOP GPU with a quad-core x86 CPU, possibly enough to receive many current-gen games.
Edit: or as Zom3ie says, quad core Cortex A15 at high frequency, bigger GPU than what it ended up having.
See my reply to z0m3le.

[ed] Actually no, such a level of confusion warrants a proper answer.

Has it occurred to you that if Sony and MS - two vendors that traditionally go for higher TDPs - would not want to use a Piledriver APU in their consoles, neither would Nintendo?
 
Sure it can be possible. But in my opinion, to start mass-production in October the decision about the fabrication node and customisation of the SoC must have been taken way before July. Which would make the subsequent set of messages for the devs quite a crazy ride.

Well, we really don't know the timeline behind the SoC manufacturing here. It could be that the July devkits had SoCs made in March or April, which would make it more difficult to get those on 16nm. So the 20nm devkits were placeholders, and they had to use placeholder clocks which would match up power consumption wise to the final retail clocks on 16nm. So, developers who weren't able to get a new devkit are still limited to the clocks on the July devkits, which is why they were presented as final for launch.

The point is, the message could have been very different from dev to dev (and devkit to devkit)- maybe most AAA partners knew the final clocks all along, but those who could only get the old devkits had only the old clocks.

Writing that out though, I'm not sure how much sense that makes, as the developers would be targeting the final retail hardware, not what their devkits were capable of... Hmmm. Maybe the July devkits effectively maxed out at those clocks for some reason, so they couldn't run them at the retail specs?

Think about it in reverse... Before Eurogamer's clocks, developers were using 3 A57 cores at up to 2GHz and probably the full 1GHz GPU. Nintendo's target battery life was 5 to 8 hours, so maybe 16nm was always what they wanted to achieve and around July they had not completed testing of the SoC we see taped out in the OP, so they gave developers, who are launching basically Wii U ports and indie titles, the power envelope of the 20nm chip instead. After testing, they found the higher clocks they wanted to use and told developers with final hardware that their games would run perfectly. I mean, Zelda is the only Switch game with any frame rate drops afaik, and they could have been working with the higher clocks all along.

Yeah this is for sure possible. It's also possible that they opened up that 4th CPU core at some point, which would let them reduce the CPU clock speed a bit to maintain the same performance. I guess we'll find out either way within a few weeks.
 

Xanonano

Member
Think about it in reverse... Before Eurogamer's clocks, developers were using 3 A57 cores at up to 2GHz and probably the full 1GHz GPU. Nintendo's target battery life was 5 to 8 hours, so maybe 16nm was always what they wanted to achieve and around July they had not completed testing of the SoC we see taped out in the OP, so they gave developers, who are launching basically Wii U ports and indie titles, the power envelope of the 20nm chip instead. After testing, they found the higher clocks they wanted to use and told developers with final hardware that their games would run perfectly. I mean, Zelda is the only Switch game with any frame rate drops afaik, and they could have been working with the higher clocks all along.

Where's your source on the target battery life being 5 to 8 hours? Given that the leaked dev documents that you're getting the 3 cores and 2GHz from also state that the battery life is approximately three hours, it doesn't sound like that was ever a possibility.
 

z0m3le

Banned
It really helps when discussing such matters to keep an eye on the prospective vendor's timeline, lest we sway into Wishful Thinking land. The first Jaguar APUs showed up in May 2013 (Kabini, Temash). Wii U was a late-2011-turned-2012 product. Moreover, there are no 40nm Jaguars. A Jaguar-based Wii U would've been pushed back yet another year, while using a bleeding-edge fab node on top of that. So in reality, for the relevant Wii U timeframe, AMD had Bobcats (cue the cries of '64-bit ALUs are not real SIMD!' from the "anonymous dev" crowd).

Honestly, I always assumed those chips were Puma. I was thinking about the Bobcat chip, which was part of AMD's cat family of processors released in 2011; I thought Jaguar was an early 2012 update to Bobcat, but I guess AMD stuck with Bobcat for 2 years? I even have a Bobcat E-350 HP 11.6-inch netbook, guess it escaped me...

Sure, A15 was capable enough. Who'd have supplied that? And what about the GPU? - Who'd have supplied that? Definitely not AMD (not with this CPU), and NV's Tegra 4 (earliest A15 design) came out in Q2 2013. Same issue as with Jaguar - not matching the timeline.
Nintendo themselves produce ARM CPUs for some of their handhelds and anyone can get an ARM license; considering AMD and Nvidia did just that, there is no reason they couldn't do it for a custom APU, so I'm not sure where you thought the hang-up would be here. A15 came to market in "late 2012" so it could have made it to the Wii U on time considering they only needed 13 million units, they could have printed the entire run before launch jk
They could have, if they targeted that TDP. Let's not pin that to BC.
The GPU die size was ~157mm^2 or similar, right? Sticking with 45nm was an issue Nintendo had with space. The Wii U has a million design flaws; we could design it completely differently, but I'm trying to work inside the box. I mean, they could have made the PS4 with 2GHz Bobcat cores if they wanted to. The GPU is Pitcairn, right? Originally released March 2012 on 28nm? The 16 CU version is 130 watts, so a 150-200 watt console from Nintendo could completely replace the Wii U. My point is that if we aren't working inside of some limitations, we could get a bit crazy, and the original question asked about removing BC, not TDP.

Where's your source on the target battery life being 5 to 8 hours? Given that the leaked dev documents that you're getting the 3 cores and 2GHz from also state that the battery life is approximately three hours, it doesn't sound like that was ever a possibility.

Multiple sources gave us those numbers for target battery life on final hardware. The July devkits do not have final hardware; they don't even have target clocks, so the 3 hour battery life we see there comes from Eurogamer's clocks causing the system to expend its charge in that time frame, meaning that my speculation about why Eurogamer's clocks on 20nm would move to Foxconn's clocks on 16nm final hardware makes sense. Nintendo at some point was likely asked to change the clocks IF Foxconn's clocks are for retail, and beforehand Eurogamer's clocks probably gave ~5 to 8 hours of battery life before final clocks were changed/worked out on final hardware.
 

LordOfChaos

Member
It really helps when discussing such matters to keep an eye on the prospective vendor's timeline, lest we sway into Wishful Thinking land. The first Jaguar APUs showed up in May 2013 (Kabini, Temash). Wii U was a late-2011-turned-2012 product. Moreover, there are no 40nm Jaguars. A Jaguar-based Wii U would've been pushed back yet another year, while using a bleeding-edge fab node on top of that. So in reality, for the relevant Wii U timeframe, AMD had Bobcats (cue the cries of '64-bit ALUs are not real SIMD!' from the "anonymous dev" crowd).

Alternate wishful thinking land: a year's advantage gained them almost nothing in the scope of their generation, so in our what-if machine I would further put in the parameters "what if they had just waited until Jaguar; even with a lower end APU they would have been more easily compatible with the universe, if they forgot BC".


I'd further say, and I think you'd agree, that while the PPC750 stands up better to Bobcat than the common thinking may be, three cores was the nail in that bed. I've wondered a few times if all that space for one core having 2MB eDRAM was really worth it over a fourth core with the same cache as the others, maybe that's a further decision for BC.
 

z0m3le

Banned
Well, we really don't know the timeline behind the SoC manufacturing here. It could be that the July devkits had SoCs made in March or April, which would make it more difficult to get those on 16nm. So the 20nm devkits were placeholders, and they had to use placeholder clocks which would match up power consumption wise to the final retail clocks on 16nm. So, developers who weren't able to get a new devkit are still limited to the clocks on the July devkits, which is why they were presented as final for launch.

The point is, the message could have been very different from dev to dev (and devkit to devkit)- maybe most AAA partners knew the final clocks all along, but those who could only get the old devkits had only the old clocks.

Writing that out though, I'm not sure how much sense that makes, as the developers would be targeting the final retail hardware, not what their devkits were capable of... Hmmm. Maybe the July devkits effectively maxed out at those clocks for some reason, so they couldn't run them at the retail specs?

If they were using the form factor ones that look like the retail units, 20nm chips would throttle at the final hardware clocks, and if your launch game doesn't need more than 1GHz and 3 cores, why fix the problem? My point was just that no games in the launch line-up needed that, and final devkits would be out for everyone at a later date, so it was best to give developers what they could work with. If there were a Call of Duty or Mass Effect 4 at launch, they couldn't do this, but because the most demanding 3rd party game at launch is a Wii U port, 3 cores at 1GHz would be more than enough.

Yeah this is for sure possible. It's also possible that they opened up that 4th CPU core at some point, which would let them reduce the CPU clock speed a bit to maintain the same performance. I guess we'll find out either way within a few weeks.

I just want to be able to speculate without having trolls attack me on Twitter because they can't add.

Edit: Blu, what was bleeding edge about the 28nm process? AMD had already been using it for a year to make HD 7000 series cards. I personally think that with an A15 CPU, Nintendo could have grabbed the embedded AMD chip I mentioned on 32nm and had the entire console around a 40W TDP, which was apparently their original goal anyway. ARM made more sense for Nintendo at this point too, as they could have rapidly ported 3DS games and older titles, and it would have been easier to port games to the Switch as well now if Wii U had been running on ARMv7.
 
If they were using the form factor ones that look like the retail units, 20nm chips would throttle at the final hardware clocks, and if your launch game doesn't need more than 1GHz and 3 cores, why fix the problem? My point was just that no games in the launch line-up needed that, and final devkits would be out for everyone at a later date, so it was best to give developers what they could work with. If there were a Call of Duty or Mass Effect 4 at launch, they couldn't do this, but because the most demanding 3rd party game at launch is a Wii U port, 3 cores at 1GHz would be more than enough.

Yeah I guess it's possible that the July devkits literally maxed out at 1GHz CPU and 768MHz GPU due to heat in that form factor, especially with what we know about the Shield TV. That would definitely explain the discrepancy between "max clocks for launch" (when using that devkit) versus the "standard specs" identified by the Foxconn leaker.

And like I said, it's possible that the July devkits were actually fabbed as early as March or April, which would make it a fair bit harder to get 16nm I would imagine, even if 16nm was always the plan.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Nintendo themselves produce ARM CPUs for some of their handhelds and anyone can get an ARM license; considering AMD and Nvidia did just that, there is no reason they couldn't do it for a custom APU, so I'm not sure where you thought the hang-up would be here. A15 came to market in "late 2012" so it could have made it to the Wii U on time considering they only needed 13 million units, they could have printed the entire run before launch jk
While anyone, including you and me, could license an ARM design, given we pay the fees, not anyone could license an advanced CPU design *and* produce one in a timely fashion. It really helps to see who actually came up first with A15-based products:

NV - Q2 2013
TI - Q2 2013
Samsung - Q3 2012
(full list)

Of those, only Samsung had any chance of meeting the timeframe we're discussing, and that with an early Midgard GPU (read: ARM being the vendor of both IPs really helps with the time to market). So basically, in an alternative timeline where Nintendo decided to go A15, they'd have automatically gone for an ARM GPU as well. Would it have worked, wouldn't it have worked? Given Nintendo can't go with the Android software stacks, somebody would have had to produce the console GPU HAL as well.

The GPU die size was ~157mm^2 or similar, right? Sticking with 45nm was an issue Nintendo had with space. The Wii U has a million design flaws; we could design it completely differently, but I'm trying to work inside the box. I mean, they could have made the PS4 with 2GHz Bobcat cores if they wanted to. The GPU is Pitcairn, right? Originally released March 2012 on 28nm? The 16 CU version is 130 watts, so a 150-200 watt console from Nintendo could completely replace the Wii U. My point is that if we aren't working inside of some limitations, we could get a bit crazy, and the original question asked about removing BC, not TDP.
If we go that route (of overriding the actual TDP), they could've produced an 8-core Espresso and a 2x-4x Latte - it's not like either of them was hitting any inherent area or power limits - and this conversation would never have happened.

PS: there are no 2GHz Bobcat cores - the fastest Bobcat ever was 1.65GHz.

Alternate wishful thinking land: a year's advantage gained them almost nothing in the scope of their generation, so in our what-if machine I would further put in the parameters "what if they had just waited until Jaguar; even with a lower end APU they would have been more easily compatible with the universe, if they forgot BC".
Well, the more parameters we tweak the greater the solution space. None of us here knows why Nintendo targeted that launch period, or why they targeted that TDP. Had they waited, had they bumped up the TDP a bit - all valid hypotheticals that may never get validated.

I'd further say, and I think you'd agree, that while the PPC750 stands up better to Bobcat than the common thinking may be, three cores was the nail in that bed. I've wondered a few times if all that space for one core having 2MB eDRAM was really worth it over a fourth core with the same cache as the others, maybe that's a further decision for BC.
And indeed I agree with you there. The cores could have been more numerous, the core division of 'better-' and 'worse-equipped' for larger datasets could have been different or gone altogether. Heck, the GPU - the thing that scales up best - could have comprised more units. My point being, of all the things people (rightfully) complained about re the Wii U, the CPU architecture was the last of their actual problems.
 

Hermii

Member
Yeah I guess it's possible that the July devkits literally maxed out at 1GHz CPU and 768MHz GPU due to heat in that form factor, especially with what we know about the Shield TV. That would definitely explain the discrepancy between "max clocks for launch" (when using that devkit) versus the "standard specs" identified by the Foxconn leaker.

And like I said, it's possible that the July devkits were actually fabbed as early as March or April, which would make it a fair bit harder to get 16nm I would imagine, even if 16nm was always the plan.
Not saying this is 100% impossible, but my expectation is still a standard TX1 at Eurogamer clocks, perhaps with a slightly customized memory layout.
 
Not saying this is 100% impossible, but my expectation is still a standard TX1 at Eurogamer clocks, perhaps with a slightly customized memory layout.

Probably a very good base for expectations. I expect the same, and I'll be pleasantly surprised with any increases. Definitely think the Foxconn clocks and 16nm are very possible though.
 

Xanonano

Member
Multiple sources gave us those numbers for target battery life on final hardware. The July devkits do not have final hardware; they don't even have target clocks, so the 3 hour battery life we see there comes from Eurogamer's clocks causing the system to expend its charge in that time frame, meaning that my speculation about why Eurogamer's clocks on 20nm would move to Foxconn's clocks on 16nm final hardware makes sense. Nintendo at some point was likely asked to change the clocks IF Foxconn's clocks are for retail, and beforehand Eurogamer's clocks probably gave ~5 to 8 hours of battery life before final clocks were changed/worked out on final hardware.

But the specifications are clearly for the final hardware, not the devkits, and that doesn't answer the question of how these sources would know the 'true' battery life if Nintendo didn't even tell third-party developers that. Are we to believe that these sources were Nintendo or Nvidia employees involved in the design of the hardware? Those are the only people who would be in a position to know what Nintendo was planning.
 

z0m3le

Banned
blu said:

Thanks for the history lesson. Tbh this entire week has had me worn out from lack of sleep (9-month-old son at home) so I'm running on fumes and that time frame just sort of blurs together, I guess. I figured a multi-billion dollar company could launch an A15 CPU in Q4 of 2012; too bad, it would have been the perfect answer to Nintendo's console woes at the time.

Didn't know my Bobcat was pushing it; I remember overclocking it in the netbook to 1.85GHz without issues. Wonder what they could have done with a desktop/console design.

Probably a very good base for expectations. I expect the same, and I'll be pleasantly surprised with any increases. Definitely think the Foxconn clocks and 16nm are very possible though.

Honestly, I'd be leaning towards Eurogamer's clocks too, but there's been no real answer to Foxconn's clocks and all of that leak's checkable facts were pretty much proven. Now that I have a better understanding of the kind of testing that was going on (random sample testing), and that Eurogamer's information is older, I have to LEAN the other way, but I respect people who are of the opposite mind.

But the specifications are clearly for the final hardware, not the devkits, and that doesn't answer the question of how these sources would know the 'true' battery life if Nintendo didn't even tell third-party developers that. Are we to believe that these sources were Nintendo or Nvidia employees involved in the design of the hardware? Those are the only people who would be in a position to know what Nintendo was planning.

Maybe for some reason the developers who leaked that (I'm going to keep assuming it's Ubisoft, because it's Ubisoft) got final hardware and didn't have final clocks? Decided to run it at Eurogamer's clocks and found the battery lasted 5 to 8 hours? Or maybe someone just estimated that without knowing final hardware, but did know that the chip was going to be 16nm. It's probably most likely that the leaker got bad info, talked to another leaker and they both reported it.
 

Thraktor

Member
By BC you surely refer to the CPU, right? I'm curious to hear what you consider their alternatives were for that timeframe.

It's possible that the big pool of eDRAM was necessary to replicate the latency characteristics of Wii's 1T-SRAM, which obviously increased the size and cost of Latte by quite a bit. Using a CPU which could be fabricated on-die with the GPU, such as Bobcat or A9, would have also allowed them to ditch the MCM in favour of a much cheaper single-die package.

That's not to say that I would expect the Wii U to have been much more powerful had they abandoned Wii BC, but they likely could have hit similar performance levels at a significantly reduced cost.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
It's possible that the big pool of eDRAM was necessary to replicate the latency characteristics of Wii's 1T-SRAM, which obviously increased the size and cost of Latte by quite a bit. Using a CPU which could be fabricated on-die with the GPU, such as Bobcat or A9, would have also allowed them to ditch the MCM in favour of a much cheaper single-die package.

That's not to say that I would expect the Wii U to have been much more powerful had they abandoned Wii BC, but they likely could have hit similar performance levels at a significantly reduced cost.
I see your point, IFF one considers the eDRAM pool as a burden. I actually think it allowed the device to do things neither xb360 nor xb1 could do at such levels of efficiency - that properly sized, fast GPU memory pool with a fast data path from the CPU is something that sits well in a game console. And I'm not even considering the effect it had on the BOM (i.e. 800MHz ddr3 on a 64-bit bus).
 

Astral Dog

Member
They could have used an AMD APU from the time with a 6000 series GPU and 2GB GDDR5 for a healthy ~500GFLOP GPU with a quad-core x86 CPU, possibly enough to receive many current-gen games.
Edit: or as Zom3ie says, quad core Cortex A15 at high frequency, bigger GPU than what it ended up having.

They could have gone with Jaguar, or was A15 capable enough? It would have given them up to a 2.5GHz quad core fairly easily without breaking the power consumption budget much. Even on a 32nm process it would have been a big deal for Wii U as well.

NEC's embedded memory forced the GPU's process node; if they had gone without that, they could have gone for a 32nm shrink with the embedded E6760 (39W on 40nm)

Honestly they should have just gone with that GPU regardless. 570 GFLOPS would not have been great for 3rd parties working on the XB1 and PS4 in 2015+, but it would have at least been able to push 1080p versions of last-gen games, which was what everyone expected Nintendo to do with all their 360+ talk.
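
For reference, that figure drops straight out of the published E6760 spec (480 stream processors at 600MHz), assuming the usual 2 FLOPs per stream processor per clock:

# Where the ~570 GFLOPS figure for the embedded E6760 comes from.
stream_processors = 480                          # published shader count
clock_ghz = 0.6                                  # 600MHz
print(stream_processors * 2 * clock_ghz, "GFLOPS")   # -> 576.0, i.e. the ~570 GFLOPS quoted above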

Wii U in the end had a million problems, many of them major.
I think they didn't do this in part because they were planning to move to mobile and combine their teams eventually. Wii U was like a step in that direction.

A 500 GFLOPS GPU would have been better than what the Switch currently has, and the backlash would have been insane.
 

Thraktor

Member
I see your point, IFF one considers the eDRAM pool as a burden. I actually think it allowed the device to do things neither xb360 nor xb1 could do at such levels of efficiency - that properly sized, fast GPU memory pool with a fast data path from the CPU is something that sits well in a game console. And I'm not even considering the effect it had on the BOM (i.e. 800MHz ddr3 on a 64-bit bus).

I agree that, in theory, a pool of fast embedded memory plus a larger, slower main memory pool seems well suited to a game console, but I'm becoming less convinced that it's a smart choice in practice, when faced with a limited budget. Of course it's near impossible to say whether a, say, 128-bit GDDR5 pool would have been cheaper for Nintendo than the eDRAM + DDR3, but we do have a near-perfect case study of the two approaches in the PS4 and XBO.

Both had access to identical CPU/GPU architectures and a very similar BoM (judging by the sales price once MS dropped Kinect). Pretty much the only meaningful high-level distinction between the two designs was MS's team's decision to combine embedded memory with DDR3, and Sony's decision to go with a single GDDR5 pool. The results are quite obvious; the single pool was the better decision. That's not to say that MS's embedded memory approach didn't have its advantages, but they were obviously heavily outweighed by the extent to which they had to cut back on GPU logic in order to accommodate the memory pool on-die.

Granted, XBO's 32MB may be less than ideal for its target resolutions, and they used SRAM rather than Wii U's (presumably) cheaper eDRAM, but in the absence of other evidence I'd err on the side of a single fast memory pool being the better approach, either to maximise performance at a given cost, or minimise cost at a given level of performance.

This, of course, excludes the possibility of a tile-based GPU like Nintendo has moved to with Switch, which would obviously have a different set of trade-offs.
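
For the raw numbers behind that comparison (back-of-envelope, using the widely published figures; the ESRAM peak assumes simultaneous read+write, which real workloads rarely sustain):

# Main memory bandwidth of the three designs being compared.
def dram_bandwidth_gbs(mts, bus_bits):
    # transfers per second per pin * bus width in bits, divided by 8 bits per byte, in GB/s
    return mts * bus_bits / 8 / 1000

print("PS4 GDDR5 (5500MT/s, 256-bit):", dram_bandwidth_gbs(5500, 256), "GB/s")   # ~176 GB/s, one big pool
print("XBO DDR3 (2133MT/s, 256-bit): ", dram_bandwidth_gbs(2133, 256), "GB/s")   # ~68 GB/s + 32MB ESRAM (~102-204 GB/s peak)
print("WiiU DDR3 (1600MT/s, 64-bit): ", dram_bandwidth_gbs(1600, 64), "GB/s")    # ~12.8 GB/s + 32MB eDRAM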
 

z0m3le

Banned
I agree that, in theory, a pool of fast embedded memory plus a larger, slower main memory pool seems well suited to a game console, but I'm becoming less convinced that it's a smart choice in practice, when faced with a limited budget. Of course it's near impossible to say whether a, say, 128-bit GDDR5 pool would have been cheaper for Nintendo than the eDRAM + DDR3, but we do have a near-perfect case study of the two approaches in the PS4 and XBO.

Both had access to identical CPU/GPU architectures and a very similar BoM (judging by the sales price once MS dropped Kinect). Pretty much the only meaningful high-level distinction between the two designs was MS's team's decision to combine embedded memory with DDR3, and Sony's decision to go with a single GDDR5 pool. The results are quite obvious; the single pool was the better decision. That's not to say that MS's embedded memory approach didn't have its advantages, but they were obviously heavily outweighed by the extent to which they had to cut back on GPU logic in order to accommodate the memory pool on-die.

Granted, XBO's 32MB may be less than ideal for its target resolutions, and they used SRAM rather than Wii U's (presumably) cheaper eDRAM, but in the absence of other evidence I'd err on the side of a single fast memory pool being the better approach, either to maximise performance at a given cost, or minimise cost at a given level of performance.

This, of course, excludes the possibility of a tile-based GPU like Nintendo has moved to with Switch, which would obviously have a different set of trade-offs.

Where do you factor in the price of the Kinect and the rather large dedicated silicon with, what was it, 5 billion transistors in the Xbox One? The problem with Microsoft's approach was that it was a delivery system for an over-engineered Kinect 2.0 system with HDMI-in. I'm pretty sure that didn't come cheap. Without those components, I imagine Microsoft could have gotten a larger GPU.
 

Thraktor

Member
Where do you factor in the price of the Kinect and the rather large dedicated silicon with, what was it, 5 billion transistors in the Xbox One? The problem with Microsoft's approach was that it was a delivery system for an over-engineered Kinect 2.0 system with HDMI-in. I'm pretty sure that didn't come cheap. Without those components, I imagine Microsoft could have gotten a larger GPU.

I'm referring to the prices of the systems once MS dropped Kinect (both were $400 at that point). I don't believe there was much hardware (if any) in the XBO itself dedicated to Kinect.
 

LordOfChaos

Member
Where do you factor in the price of the Kinect and the rather large dedicated silicon with, what was it, 5 billion transistors in the Xbox One? The problem with Microsoft's approach was that it was a delivery system for an over-engineered Kinect 2.0 system with HDMI-in. I'm pretty sure that didn't come cheap. Without those components, I imagine Microsoft could have gotten a larger GPU.


I believe they were leaning right up against the interposer size limit with the APU at launch, regardless of the Kinect. The other issue was they weren't sure if they could secure enough GDDR5, so they played it safe while having to sacrifice GPU execution units, while Sony took the risk and it paid off.
 

Thraktor

Member
I believe they were leaning right up against the interposer size limit with the APU at launch, regardless of the Kinect. The other issue was they weren't sure if they could secure enough GDDR5, so they played it safe while having to sacrifice GPU execution units, while Sony took the risk and it paid off.

I think part of the decision was that they believed they could get a memory capacity advantage over Sony with 8GB of DDR3 vs 4GB of GDDR5. Sony bumped up to match them at 8GB quite late on, and presumably at that point countering with 16GB of DDR3 would have been either too expensive or simply not feasible given available parts.
 
I think part of the decision was that they believed they could get a memory capacity advantage over Sony with 8GB of DDR3 vs 4GB of GDDR5. Sony bumped up to match them at 8GB quite late on, and presumably at that point countering with 16GB of DDR3 would have been either too expensive or simply not feasible given available parts.

I actually think Sony got really lucky with the PS4. They wanted GDDR5, but at the time it wasn't available in the densities required to do 8GB, hence Microsoft decided to do DDR3 and ESRAM.

At the last minute they could get GDDR5 in the densities they needed, and it was a drop-in solution that didn't require them to re-engineer anything, so they did it. Developers didn't even know it had 8GB until Sony announced it to the public.
 

Hermii

Member
I believe they were leaning right up against the interposer size limit with the APU at launch, regardless of the Kinect. The other issue was they weren't sure if they could secure enough GDDR5, so they played it safe while having to sacrifice GPU execution units, while Sony took the risk and it paid off.
Wasn't Sony really, really lucky because GDDR5 prices dropped significantly, allowing them to have 8 gigs instead of 4?

Edit: just saw the post above me.
 
Side note...

8 CPU cores (4x ARM Cortex A57 + 4x ARM Cortex A53)

Do we know anything about the A53s, based on the teardowns or anything else?

Wasn't sure if this was discussed at all? Do we believe maybe the A53s are running the OS?
 

Pancake Mix

Copied someone else's pancake recipe
Wasn't Sony really, really lucky because GDDR5 prices dropped significantly, allowing them to have 8 gigs instead of 4?

Edit: just saw the post above me.

Yeah, but they still had to use 16 different 512 MB GDDR5 modules on the initial launch model. It was most likely still quite pricey in 2013, and there wasn't a graphics chip on the market that had a remotely comparable amount at that time.
 

LordOfChaos

Member
Side note...

8 CPU cores (4x ARM Cortex A57 + 4x ARM Cortex A53)

Do we know anything about the A53s, based on the teardowns or anything else?

Wasn't sure if this was discussed at all? Do we believe maybe the A53s are running the OS?

Discussed, tl;dr Most likely 3 cores are running game threads right now, and one A57 is for the OS.

The problem with running the OS on an A53 and the game on the A57 cores is that the CCI for the Tegra X1 only supported the first generation of big.LITTLE, cluster switching, where only one cluster is active at a time. Using a mix of A53 cores and A57 cores would mean updating that whole fabric.
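
As a rough illustration of what cluster switching means in practice, here's a minimal sketch for a generic Linux ARM board (nothing to do with the Switch's actual OS): under old-style cluster switching the kernel only ever exposes one core type at a time, whereas an HMP/GTS setup shows A53s and A57s together.

# Minimal sketch (generic Linux/ARM, not the Switch): list which core types are currently visible.
PART_NAMES = {"0xd03": "Cortex-A53", "0xd07": "Cortex-A57"}

def visible_core_types(cpuinfo_path="/proc/cpuinfo"):
    parts = set()
    with open(cpuinfo_path) as f:
        for line in f:
            if line.lower().startswith("cpu part"):      # one "CPU part" line per online core
                parts.add(line.split(":")[1].strip())
    return sorted(PART_NAMES.get(p, p) for p in parts)

print(visible_core_types())   # cluster switching: one core type listed; HMP: both A53 and A57 listed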


It will remain unclear whether the A53s even stayed on the die until we get a die shot.
 

z0m3le

Banned
Discussed, tl;dr Most likely 3 cores are running game threads right now, and one A57 is for the OS.

The problem with running the OS on an A53 and the game on the A57 cores is that the CCI for the Tegra X1 only supported the first generation of big.LITTLE, cluster switching, where only one cluster is active at a time. Using a mix of A53 cores and A57 cores would mean updating that whole fabric.


It will remain unclear whether the A53s even stayed on the die until we get a die shot.

For those who aren't following as closely, it's not so cut and dried, because customizations can be made. We don't know what those changes to the die are. One thing, for instance, that we don't know is whether the final hardware has embedded memory on the die, or, as in this post, whether the A53s have been freed up to run the OS, giving developers the 4th A57 core and possibly even a couple of A53 cores as well.

I will say, from reading the impressions, that it sounds like the Switch is very snappy in its OS, so it might in fact be a high performance core behind it.
 

LordOfChaos

Member
That alleged benchmark from earlier also showed three CPU threads, I think. Again, everything is unsure at this point. With a die shot we could at least see if the A53s were left in, and maaaybe if the interconnect is different from the TX1, but it would take a developer blinking at us a certain number of times to show how many cores they have access to.
 