
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Do console makers do this? Test a retail unit at far higher CPU/GPU clocks than what developers are allowed to use?

I've never heard of such a thing, at least not at those kinds of levels (at least on the CPU side). If you can run it for 8 days straight, no problem, at those speeds, then that's what you release it at.
 

Donnie

Member
So this is still going the way of insanity, I see.

So, about the whole GTX 1060 SCD:

* Even if the heavier unit in the leak was real (and I have no reason to doubt there was one), it was referenced as having a much bigger 200mm² SoC, not as having both the Switch SoC and another 200mm² chip, so all that "GTX 1060 is also 200mm², all checks out" is basically wrong speculation from the start.

The most logical explanation is that it was a test unit with a much bigger SoC with more cores, to run debugging software alongside the Switch hardware.

A SoC double the size of the Switch SoC just to enable debugging? Definitely not.
 

z0m3le

Banned
I've never heard of such a thing, at least not at those kinds of levels (at least on the CPU side). If you can run it for 8 days straight, no problem, at those speeds, then that's what you release it at.

This is pretty much why I was dismissive of it being a stress test for an SoC that would consume less than half that wattage; there's zero logic behind that assumption, IMO.
 
OK, thanks.

If there is a much more powerful dock coming later, I wish Nintendo had put all their cards on the table from the start, as a lot of day one Switch buyers will feel burned if they reveal this dock add-on at E3 or in late 2017.

I'm more excited about the potential clock boosts to the standard Switch console, especially the CPU boost. How much more powerful is the Switch CPU at 1GHz than the Wii U CPU?

They shouldn't feel burned if they were fine with the Switch where it is technically. I mean, most of them don't care about graphics or third-party support.

I'm all in for this thing if it means real third-party support.
 
This is pretty much why I was dismissive of it being a stress test for an SoC that would consume less than half that wattage; there's zero logic behind that assumption, IMO.

Well, Thraktor's theory was that some VC emulation (like GameCube) would use those CPU clock speeds but far lower GPU clock speeds, which could balance out the power consumption. That's why I also asked whether that's ever been done with a console for emulation. Like, does the XB1 use far higher CPU clock speeds than its normal max for XB360 emulation?
 

z0m3le

Banned
Well, Thraktor's theory was that some VC emulation (like GameCube) would use those CPU clock speeds but far lower GPU clock speeds, which could balance out the power consumption. That's why I also asked whether that's ever been done with a console for emulation. Like, does the XB1 use far higher CPU clock speeds than its normal max for XB360 emulation?

Why would they clock both the CPU and GPU higher for 8 days if that was the case? He's starting from a conclusion and trying to fit this into a narrative that supports it, which isn't sound logic IMO. Not saying Thraktor isn't smart, god no, he is incredibly intelligent, but if you take that approach you can morph any information to fit a narrative. His puzzle piece here just doesn't fit the leak properly: why would they be stress testing a Unity demo instead of the GameCube emulator that we know is currently being worked on, if Eurogamer's other rumors are to be believed? http://www.eurogamer.net/articles/2...ch-will-have-gamecube-virtual-console-support
 

Thraktor

Member
As for the CPU difference being substantial, I already pointed to the ~88% increase that developers saw with Wii U's CPU at launch, with their games stuck running on 2 cores.

The Wii U always had a 3-core CPU, though; it simply may have had one reserved for OS functions at a certain point. The clock speed increase was only 25%, whereas here we're talking about three times that. More importantly, though, and in conjunction with:

As a history lesson in Nintendo consoles' past, the Gamecube had a clockspeed change (CPU and GPU) four months prior to launch.

http://uk.ign.com/articles/2001/05/16/pre-e3-gamecube-specifications-revisions

Both Gamecube and Wii U had clock speed changes a few months before launch. However, there's a big difference between them and Switch, which is that they're purely stationary games consoles. There are only two factors for Nintendo to consider when deciding on clocks for a stationary device:


  • Can the PSU/power circuitry deliver sufficient power to maintain the clocks?
  • Can the cooling system properly dissipate the heat generated by the clocks?
If the answer to both of these is yes, then there's really no reason not to use the higher clocks, as it's just free bonus performance. (For Gamecube it was more a matter of re-allocating thermal/power budget between CPU and GPU).

For a portable device, though, Nintendo have to make a value judgement based on the impact on battery life that any given clock speed has, even if thermally there's no problem. There aren't any free lunches on handhelds when it comes to clock speeds. Now, that doesn't mean that it's impossible for them to make late changes. It may be that they decide to trade a little bit of battery life for a little bit of performance, or vice versa. From this point of view, a small change in clocks may be plausible. For example, clocking the GPU up from 307MHz to 384MHz might be a worthwhile trade-off for them, and that would allow them to up the docked GPU clock to 921MHz while keeping a reasonable ratio between the two.
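To put rough numbers on the "reasonable ratio" point, here's a quick sketch (307.2MHz/768MHz are Eurogamer's reported clocks, 921MHz is the docked GPU figure from the Foxconn leak, and the 384MHz portable figure is just the hypothetical bump above, not a reported spec):

Code:
# Portable/docked GPU clock ratios under the two sets of figures
eurogamer_portable, eurogamer_docked = 307.2, 768.0   # MHz, Eurogamer report
foxconn_docked = 921.0                                # MHz, Foxconn leak
bumped_portable = 384.0                               # MHz, hypothetical bump

print(eurogamer_docked / eurogamer_portable)  # 2.5x - the reported ratio
print(foxconn_docked / eurogamer_portable)    # ~3.0x - a much wider gap
print(foxconn_docked / bumped_portable)       # ~2.4x - back to a sane ratio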

Clocking the CPU up from 1GHz to 1.78GHz is a much bigger jump than I would expect from them at the last minute, though, because it would make such a large impact on battery life. We can actually calculate this, based on Switch's battery specs, a battery life of 3hrs for Zelda and Anandtech's power curves.

For quad-A57s on 20nm (pretty much worst case scenario), jumping from 1.02GHz to 1.78GHz would reduce battery life almost by half, from 3hrs to about 1hr and 35 minutes.

For quad-A72s on 16FF+ (pretty much best case scenario), moving from 1.02GHz to 1.78GHz would decrease battery life by almost 20% to about 2 and a half hours.

A 20% to 50% drop in battery life is a huge change to make at the last minute, particularly after you've already told developers that the clock speeds are final.
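For anyone who wants to play with those numbers themselves, here's a rough back-of-the-envelope version of the calculation. The ~16Wh battery figure and the extra-CPU-power numbers are my own illustrative assumptions (picked to roughly line up with Anandtech's published A57/A72 curves), not confirmed specs:

Code:
# Back-of-the-envelope battery life at the higher CPU clock
BATTERY_WH = 16.0         # assumed usable battery capacity
BASELINE_HOURS = 3.0      # reported Zelda battery life at 1.02GHz
baseline_power = BATTERY_WH / BASELINE_HOURS   # ~5.3W total system draw

# Assumed extra CPU power going from 1.02GHz to 1.78GHz (illustrative):
extra_cpu_power = {
    "quad A57 @ 20nm (worst case)": 4.8,   # watts
    "quad A72 @ 16FF+ (best case)": 1.1,   # watts
}

for label, extra in extra_cpu_power.items():
    hours = BATTERY_WH / (baseline_power + extra)
    drop = (1 - hours / BASELINE_HOURS) * 100
    print(f"{label}: ~{hours:.1f}h ({drop:.0f}% drop)")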

EDIT: Last thing Thraktor: in case you are right about an SCD, how do you feel about a quad-core A57 @ 1GHz partnering with 4.4 TFLOPs of GPU power? I assume not as good as you would feel about a quad-core A72 @ 1.7GHz. That is a big tell in this leak IMO: if the dock is what we think it is, the lacking CPU doesn't sound like Nintendo, especially not after we saw what they did with the New 3DS.

Firstly, I haven't discounted the possibility of A72 cores at all, and I'd find it rather odd for them to be using A57s when the A72 would have been available to them from the start of development. That said, I can't really rule anything out, so who knows.

On the subject of the SCD, it sort of depends on the intended use of the device. If it's a stand-alone device that plays games which can't run on Switch (let's say AAA Western third-party games, for the sake of argument), then yes, I'd find a quad-core clocked at 1GHz to be somewhat lacking, regardless of whether it's A57 or A72. However, as I mentioned in my original post, it's possible that Nintendo could disable the Switch's GPU while docked in the SCD in order to clock the CPU up higher in that scenario. Alternatively, the Switch SoC and GP106 could just be early stand-ins for a new SoC they have in development with newer/more/higher-clocked CPU cores.

The other possible use case of the device is simply to play exactly the same games as Switch, but at a higher resolution, in which case they don't really need anything much more powerful than Switch has on the CPU front, similar to the way PS4Pro and Scorpio are sticking with Jaguar cores.

So this is still going the way of insanity, I see.

So, about the whole GTX 1060 SCD:

* Even if the heavier unit in the leak was real (and I have no reason to doubt there was one), it was referenced as having a much bigger 200mm² SoC, not as having both the Switch SoC and another 200mm² chip, so all that "GTX 1060 is also 200mm², all checks out" is basically wrong speculation from the start.

Actually, the leak states that the 200mm² chip is in addition to the 100mm² one:

Reddit translation said:
Haven't seen such a huge core, and it's 16nm + 100mm2 main core

The leaker also speculates that the 200mm² chip only includes a GPU, which wouldn't make sense if there's no other CPU on there.

The most logical explanation is that it was a test unit with a much bigger SoC with more cores, to run debugging software alongside the Switch hardware.

This isn't a thing. Not only is it unnecessary (at most a debug unit would require extra RAM), but it would be massively, enormously expensive. They'd basically have to double their entire R&D spend to cover the cost of designing a separate debug SoC, and then tape out and fab perhaps just 20 or so wafers. The cost per chip would be astronomical.

Yeah, I don't know if that leak was referring to this devkit/debug unit, but I certainly think it's far too early to make any assumptions about an SCD, especially when all we "know" is the potential die size. Also keep in mind the SCD patent focuses heavily on wireless supplemental processing, so even if this leak is describing an SCD there is likely a lot more to it than we can know right now.

Yeah, there's potentially quite a bit more to it than just a powerful dock, but the applications could be quite broad, and there's no easy way to narrow it down much based on the information we've got. For example I had previously speculated that the SCD could be used for distributed low-latency game-streaming tech, but there's no real evidence as to whether they would actually be going this route or not.

On the other hand it definitely seems worthwhile to think about why the Foxconn leak detailed the clocks it did. Do console makers typically run stress tests at speeds far above where they intend the processors to perform?

I don't know, but they could. Thermal stress tests can be pretty over-the-top, such as covering a console in a blanket and running a test in a hot room for days on end. It may be that they decided a simpler option was just to test at higher than normal clock speeds. If the system can handle them, then it should be able to handle normal speeds in unusual operating environments. Or perhaps they're doing both, just to cover every possible eventuality.

Have any other consoles ever had a higher CPU clock solely for the purposes of emulation?

Not in exactly the same way, but n3DS sort of did (as the boosted CPU clock over 3DS allowed them to do SNES emulation). The Switch is kind of a unique device, though, both in its form-factor and in the range of possible systems it could emulate.

We have precedent with the Wii U and Gamecube of a late-in-the-game pre-launch clock speed boost, but do we know if those lower clock speeds were ever described as "final for launch"?

Not as far as I'm aware. Clock speeds change over the course of development for a lot of systems, and they're usually kept fluid until there's final hardware to test on. I just don't see why Nintendo would specify "final for launch" unless it actually is final for launch. They're hardly new to designing console hardware, and they'd know as well as anyone what kind of scope they have for clock speed changes when they're sending out specs to developers.
 
Why would they clock both the CPU and GPU higher for 8 days if that was the case? He's starting from a conclusion and trying to fit this into a narrative that supports it, which isn't sound logic IMO. Not saying Thraktor isn't smart, god no, he is incredibly intelligent, but if you take that approach you can morph any information to fit a narrative. His puzzle piece here just doesn't fit the leak properly: why would they be stress testing a Unity demo instead of the GameCube emulator that we know is currently being worked on, if Eurogamer's other rumors are to be believed? http://www.eurogamer.net/articles/2...ch-will-have-gamecube-virtual-console-support

The idea is that they would be testing the absolute maximum clock speeds the CPU and GPU could be running at, and since it's (presumably) connected to a power supply during this 8 day test, there's really no reason not to run both of the components at their max speed.

On the other hand, I do agree that this stress test wouldn't make sense if these clocks are never used. If they are ensuring that every Switch can run at those clocks simultaneously for 8 days, then the SoC would need to be capable of that, which would wind up disqualifying some number of the chips fabricated. It seems like it would be overly expensive to do this just for some thermal/battery headroom.

I don't know, but they could. Thermal stress tests can be pretty over-the-top, such as covering a console in a blanket and running a test in a hot room for days on end. It may be that they decided a simpler option was just to test at higher than normal clock speeds. If the system can handle them, then it should be able to handle normal speeds in unusual operating environments. Or perhaps they're doing both, just to cover every possible eventuality.

But, as I said above, doesn't this test indicate that all of these chips should be capable of these clock speeds at retail? Wouldn't that wind up affecting their fabrication yields and in the end costing them a bit more?
 

z0m3le

Banned
  • Can the PSU/power circuitry deliver sufficient power to maintain the clocks?
  • Can the cooling system properly dissipate the heat generated by the clocks?
If the answer to both of these is yes, then there's really no reason not to use the higher clocks, as it's just free bonus performance. (For Gamecube it was more a matter of re-allocating thermal/power budget between CPU and GPU).

For a portable device, though, Nintendo have to make a value judgement based on the impact on battery life that any given clock speed has, even if thermally there's no problem. There aren't any free lunches on handhelds when it comes to clock speeds. Now, that doesn't mean that it's impossible for them to make late changes. It may be that they decide to trade a little bit of battery life for a little bit of performance, or vice versa. From this point of view, a small change in clocks may be plausible. For example, clocking the GPU up from 307MHz to 384MHz might be a worthwhile trade-off for them, and that would allow them to up the docked GPU clock to 921MHz while keeping a reasonable ratio between the two.

Clocking the CPU up from 1GHz to 1.78GHz is a much bigger jump than I would expect from them at the last minute, though, because it would make such a large impact on battery life. We can actually calculate this, based on Switch's battery specs, a battery life of 3hrs for Zelda and Anandtech's power curves.

For quad-A57s on 20nm (pretty much worst case scenario), jumping from 1.02GHz to 1.78GHz would reduce battery life almost by half, from 3hrs to about 1hr and 35 minutes.

For quad-A72s on 16FF+ (pretty much best case scenario), moving from 1.02GHz to 1.78GHz would decrease battery life by almost 20% to about 2 and a half hours.

A 20% to 50% drop in battery life is a huge change to make at the last minute, particularly after you've already told developers that the clock speeds are final.

Except a move on the GPU from 20nm to 16nm while only increasing clocks by 25% would reduce the power consumption further, giving back even more time to the battery. The real problem, though, is that you are using Zelda's 3hrs as a point of reference, which could just as easily have been 3 and a half hours with the Eurogamer clocks and now be about 3 hours with the Foxconn clocks.

Also, I assume they didn't use A72 cores until the final hardware because there is some customization done to the SoC, which would mean that the X1 SoC, which we know was used throughout development until at least October, would be fine if it was used for launch software.
 

Mokujin

Member
The leaker also speculates that the 200mm² chip only includes a GPU, which wouldn't make sense if there's no other CPU on there.

Well, let's revisit the Reddit translation of the leak:

https://www.reddit.com/r/NintendoSw...r_someone_who_producing_switch_at_foxconn_is/

*Speculated provided the core is only include GPU, it would be even more powerful than PS4 pro

From my understanding, this quote means that "if" the SoC were all GPU it would be even more powerful than a PS4 Pro, not that the chip is GPU-only in itself.

Devkit version?
*A much powerful version, producing 2000x units for now *The core is 1x times bigger than the one above,200m㎡, looking it looks like 12x18 *2 extra ram, this version is 8GB *2x wifi, 1hdmi, 1x mini dp, 1x ethernet, 2x unidentified socket, 3x network led indicator, *Looks much more complex than the normal version, 6, 7 extra unidentified storage, different socket *

This part talks about one core, which I understand to mean the SoC; all the ports match the image mentioned above.

(22 Nov 2016 update) Confirmed it's a devkit, and Nintendo was coming to exam the devkit today *No dock for this version for now

This update confirms it's a devkit.

*Haven't seen such a huge core, and it's 16nm + 100mm2 main core

As for the quote you point out, I can't really see that it indicates two chips; I read it more as the core (SoC) being done on 16nm and being 100mm² bigger than the standard Switch SoC.

We could argue about what is lost in translation, but my understanding is that there is only one chip.

P.S. Oh, I linked an image from this blog post, which also has a Wii U dev kit photo that shares similarities with the "fat" devkit Switch.
 

Schnozberry

Member
I don't mean to rain on any parades, but it's very possible that the leaker was doing hardware testing prior to Nintendo's firmware and software being flashed onto the unit, at which point the clocks would be whatever Nintendo specified them to be. The clocks in the leak are well within the capability of the Tegra X1 and are multiples of the base frequency, and may have been used for testing purposes only to verify the SOC was within tolerances.
 
I don't mean to rain on any parades, but it's very possible that the leaker was doing hardware testing prior to Nintendo's firmware and software being flashed onto the unit, at which point the clocks would be whatever Nintendo specified them to be. The clocks in the leak are well within the capability of the Tegra X1 and are multiples of the base frequency, and may have been used for testing purposes only to verify the SOC was within tolerances.

I thought I recalled someone identifying the "millions of fish" software test demo (which is where this guy saw the clock speeds) as a Unity demo, which would naturally require the firmware and software to be on the machine, if true.
 

Schnozberry

Member
I thought I recalled someone identifying the "millions of fish" software test demo (which is where this guy saw the clock speeds) as a Unity demo, which would naturally require the firmware and software to be on the machine, if true.

Yes, it does require utilities for testing to be loaded, but that's not the final software image and kernel loaded onto the device for shipping. It would be quite the faux pas if a consumer received a unit that contained all of the quality control software
 

joesiv

Member
Yes, it does require utilities for testing to be loaded, but that's not the final software image and kernel loaded onto the device for shipping. It would be quite the faux pas if a consumer received a unit that contained all of the quality control software
That's a good point. Actually, all the Nintendo test scenes that I recall were custom, not some generic third-party job.

For example, the Gamecube's was a dolphin swimming around, where you could flip through some render modes.
 

joesiv

Member
All we know is that there are 2000 of these things, they are much more powerful than the Switch we will buy in March, and they come with a screen attached. In my opinion this is so they can target both the Switch's original specs and higher specs with this one device.
Count me on board with your speculation! I also think the dev kits are meant to target different versions of the Switch: undocked, docked and a future revision. This way developers can test their current software against future specs. It might not be finalized, but if Nintendo puts something beefy in there they can just update the dev kit to unlock or modify the config.
 

Donnie

Member
I don't mean to rain on any parades, but it's very possible that the leaker was doing hardware testing prior to Nintendo's firmware and software being flashed onto the unit, at which point the clocks would be whatever Nintendo specified them to be. The clocks in the leak are well within the capability of the Tegra X1 and are multiples of the base frequency, and may have been used for testing purposes only to verify the SOC was within tolerances.

We've yet to see an example of an X1 managing to run anywhere near those speeds without throttling down almost straight away. It may be able to run at that combined speed in a device with far better cooling capabilities than the Shield TV, but no way will Switch have that. It may be able to do so at 16nm, but then the question becomes: why drop the system's clocks so low at that node? I think it's pointless to test those speeds for tolerance if they are well above what's required for the device. What is the tester aiming for, exactly? I mean, Nintendo aren't going to want to up chip cost dramatically by throwing away chips that can't reach a clock speed well above the speed they require.
 
Yes, it does require utilities for testing to be loaded, but that's not the final software image and kernel loaded onto the device for shipping. It would be quite the faux pas if a consumer received a unit that contained all of the quality control software

But if the demo was specifically a Unity one, wouldn't that require the testing utilities to have compatibility with Unity? I admit I don't know the complexity of that testing software so it could have this compatibility. If the Unity info is wrong though then you're totally right.

We've yet to see an example of an X1 managing to run anywhere near those speeds without throttling down almost straight away. It may be able to run at that combined speed in a device with far better cooling capabilities than the Shield TV, but no way will Switch have that. It may be able to do so at 16nm, but then the question becomes: why drop the system's clocks so low at that node? I think it's pointless to test those speeds for tolerance if they are well above what's required for the device. What is the tester aiming for, exactly? I mean, Nintendo aren't going to want to up chip cost dramatically by throwing away chips that can't reach a clock speed well above the speed they require.

The only thing I can think of is if the unit the leaker was testing was opened up during testing, so that cooling wouldn't be an issue. But then that begs the question: what's the point of testing a unit in a non-retail configuration for 8 days? You would be disqualifying a good number of chips if you're requiring them to function at the clock speeds mentioned.

Maybe they just did this for a few chips, but not all? That wouldn't make any sense, right?
 

Donnie

Member
But if the demo was specifically a Unity one, wouldn't that require the testing utilities to have compatibility with Unity? I admit I don't know the complexity of that testing software so it could have this compatibility. If the Unity info is wrong though then you're totally right.



The only thing I can think of is if the unit the leaker was testing was opened up during testing, so that cooling wouldn't be an issue. But then that begs the question: what's the point of testing a unit in a non-retail configuration for 8 days? You would be disqualifying a good number of chips if you're requiring them to function at the clock speeds mentioned.

Maybe they just did this for a few chips, but not all? That wouldn't make any sense, right?

No, makes no sense to me. Also, if they were just testing the SoC as suggested (testing at max with no limits on cooling and power), why not just test at the full X1 spec of 1GHz GPU / 2GHz CPU? Why test these very specific clocks?
 
Plus, if you're just testing the SoC you do it before it's soldered onto a board. Once it's on a board and installed, it makes negative sense to test it for 8 days at speeds it's never going to run at. If the system can sustain 8 days of running at those speeds, it can be shipped at those speeds.

I'm not even going to attempt to guess whether these leaked speeds are correct or not, but the mental gymnastics required to somehow say that what the Foxconn leaker saw was correct but isn't what the chip is going to run at are kind of silly.
 

Thraktor

Member
But, as I said above, doesn't this test indicate that all of these chips should be capable of these clock speeds at retail? Wouldn't that wind up affecting their fabrication yields and in the end costing them a bit more?

I can't imagine it affecting yields all that much. Every TX1 die had to be able to hit 2GHz CPU and 1GHz GPU, so comparatively the clocks we're looking at are more conservative, and with two years of yield improvements since the TX1 started production I don't imagine they'd have many clock speed-influenced yield problems.

Except a move on the GPU from 20nm to 16nm while only increasing clocks by 25% would reduce the power consumption further, giving back even more time to the battery. The real problem, though, is that you are using Zelda's 3hrs as a point of reference, which could just as easily have been 3 and a half hours with the Eurogamer clocks and now be about 3 hours with the Foxconn clocks.

Also, I assume they didn't use A72 cores until the final hardware because there is some customization done to the SoC, which would mean that the X1 SoC, which we know was used throughout development until at least October, would be fine if it was used for launch software.

The bolded is a fair criticism, as the 3 hour battery life will be based on final clocks. The proportional drop is still the same (~20% drop from 3hr 45 to 3hr with A72s on 16nm as a best case), although in this case A57s on 20nm would be impossible, as at 1.78GHz they consume more power than the entire Switch system itself.

Regarding the rest of the post, even if Nintendo were using a TX1 in dev-kits in October as a temporary solution with a 16nm A72 based SoC being used for the final device, they still would have taken this into account when giving clock speeds to developers. If they told developers that "final clocks" are 1GHz for the CPU, then that would have been based on their own internal testing of the final 16nm chips, even if they weren't with devs yet. If Nintendo hadn't been able to test with final chips, then they wouldn't have called them final clocks.

Well, let's revisit the Reddit translation of the leak:

https://www.reddit.com/r/NintendoSw...r_someone_who_producing_switch_at_foxconn_is/



From my understanding, this quote means that "if" the SoC were all GPU it would be even more powerful than a PS4 Pro, not that the chip is GPU-only in itself.

Yes, but if it were the only large integrated circuit on the board then it would be a nonsensical thing to speculate on. If there isn't a separate CPU in the device then there's no possible way it could be a pure GPU.

This part talks about one core, which I understand to mean the SoC; all the ports match the image mentioned above.

It talks about a core, although it doesn't specify that it's the only one. (The wording implies one core, but it's hard to parse such implications without reading the original language).

This update confirms it's a devkit.

I don't disagree at all on this point, I just don't think it's a devkit for the standard Switch.

As for the quote you point out, I can't really see that it indicates two chips; I read it more as the core (SoC) being done on 16nm and being 100mm² bigger than the standard Switch SoC.

We could argue about what is lost in translation, but my understanding is that there is only one chip.

I'm reading "+ 100mm2 main core" as being "in addition to the 100mm² main core". The use of the phrase "main core" would also imply that the chip he's talking about is a secondary core. It is pretty difficult to read into something like this without a more specific translation, though. There are a few gaffers with a good understanding of Chinese who have helped us out with translations of things like this before, but unfortunately I haven't seen them pop their heads into this thread yet. Hopefully someone can provide a more clear translation of these parts, because until then it would seem very much open to interpretation.

The other problem I have with the idea that it's a single 200mm² SoC is that there's no plausible explanation for what that SoC actually is. Parker shouldn't be that big (it's basically a 16nm version of the TX1 with Denver cores), but it also makes zero sense for a Switch development kit. Xavier, meanwhile, is both a year away from even sampling, and larger again than our mystery chip (it's expected to be about 300mm²).

If it's a custom SoC, then if they've got engineering samples in late 2016 you'd expect the actual product using it to launch late 2017 at the latest, which would mean a Switch 2 only about 6 months after the first model. In my mind this is even less likely than the alternative of them using a Switch SoC + a GP106 as a stand-in for some kind of device to launch in 2018.

P.S. Oh, I linked an image from this blog post, which also has a Wii U dev kit photo that shares similarities with the "fat" devkit Switch.

I think that photo is likely of some kind of Switch development hardware, but I don't think it's the one we're talking about. For one, it predates this device even being manufactured by a couple of months, but it's also missing several ports from the leaked device (e.g. mini DisplayPort).

I thought I recalled someone identifying the "millions of fish" software test demo (which is where this guy saw the clock speeds) as a Unity demo, which would naturally require the firmware and software to be on the machine, if true.

If they're using any pre-existing demo then by far the most likely is Nvidia's Threaded Rendering Vulkan Sample.

Plus, if you're just testing the SoC you do it before it's soldered onto a board. Once it's on a board and installed, it makes negative sense to test it for 8 days at speeds it's never going to run at. If the system can sustain 8 days of running at those speeds, it can be shipped at those speeds.

I'm not even going to attempt to guess whether these leaked speeds are correct or not, but the mental gymnastics required to somehow say that what the Foxconn leaker saw was correct but isn't what the chip is going to run at are kind of silly.

In Switch's case, though, the limiting factor isn't the maximum clocks the chip can run at, but what battery life can be achieved with given clocks. In the case of the CPU, which is identically clocked in both portable and docked mode, this is a very direct and obvious relationship. Just because a CPU could clock at 1.8GHz doesn't mean it's a reasonable speed for a battery powered device. For the GPU, even though docked and portable will run at different clocks, Nintendo will still want to limit the gap between the two to a reasonable ratio, which means docked clocks will still be constrained by what's doable in the battery powered mode. If ~300MHz is as much as Nintendo feels they can clock the GPU in portable mode, then they may want to limit docked to 2.5x that, even if it can go higher, just to make software development less troublesome.
 

z0m3le

Banned
The original Chinese leak was wiped from the Internet, which is curious in itself. I'm not sure we could get a better translation than what exists, but if we do it will be because someone saved the original leak that everyone pretty much thought was fake at the time.

Thraktor, I agree with the CPU power consumption you have assumed, but did you take into account the power savings from the GPU? As the clock increase is smaller than what the move to 16nm allows for, there should be additional savings there, making the power consumption difference smaller.
 
In Switch's case, though, the limiting factor isn't the maximum clocks the chip can run at, but what battery life can be achieved with given clocks. In the case of the CPU, which is identically clocked in both portable and docked mode, this is a very direct and obvious relationship. Just because a CPU could clock at 1.8GHz doesn't mean it's a reasonable speed for a battery powered device. For the GPU, even though docked and portable will run at different clocks, Nintendo will still want to limit the gap between the two to a reasonable ratio, which means docked clocks will still be constrained by what's doable in the battery powered mode. If ~300MHz is as much as Nintendo feels they can clock the GPU in portable mode, then they may want to limit docked to 2.5x that, even if it can go higher, just to make software development less troublesome.

I could give you that on the CPU, but at the same time I have NEVER heard of anyone stress testing a chip (especially one in a finished product) at an almost 100% overclock while only overclocking the GPU 20%. Plus, that extra 20% on the GPU isn't going to make development between the two tiers go from super easy to OMG so hard.

All I'm saying is that the reasons you guys are suggesting for running the system at these speeds just aren't things that are done.

Again, I have no idea if these are the final clocks or not; all I'm saying is the reasons to test a system for 8 days at these speeds and not ship it at those speeds are zero. No one does that.
 
If they're using any pre-existing demo then by far the most likely is Nvidia's Threaded Rendering Vulkan Sample.

That makes perfect sense, yes. Not sure where I saw Unity, it could've been from the comments in that reddit post.

I could give you that on the CPU, but at the same time I have NEVER heard of anyone stress testing a chip (especially one in a finished product) at an almost 100% overclock while only overclocking the GPU 20%. Plus, that extra 20% on the GPU isn't going to make development between the two tiers go from super easy to OMG so hard.

All I'm saying is that the reasons you guys are suggesting for running the system at these speeds just aren't things that are done.

Again, I have no idea if these are the final clocks or not; all I'm saying is the reasons to test a system for 8 days at these speeds and not ship it at those speeds are zero. No one does that.

Do you know this from experience in the field? Not that I'm disagreeing or accusing you, just curious. This whole leak has made me very interested in the inner workings of console development, post-fabrication. I know way too much about the fabrication process and little about anything else!
 

Donnie

Member
Well, IMO testing at those speeds suggests that either they will run at those speeds or they plan to possibly increase clocks to those speeds in the future. I don't see any other reason.
 
Do you know this from experience in the field? Not that I'm disagreeing or accusing you, just curious. This whole leak has made me very interested in the inner workings of console development, post-fabrication. I know way too much about the fabrication process and little about anything else!

No, I can't say professionally; while I work on games, it's on PC. Though I've followed hardware (as in PC stuff) for decades now, not that I'm any kind of authority on it or anything.
 

Scoobie

Member
With all the speculation (which makes for interesting reading), once Switch is released, what's the likely timeline on knowing what the hardware actually is?

If there's a teardown and there are no markings on the SoC, are there people out there with scanning electron microscopes at home who can give the definitive answers???
 
With all the speculation (which makes for interesting reading), once Switch is released, what's the likely timeline on knowing what the hardware actually is?

If there's a teardown and there are no markings on the SoC, are there people out there with scanning electron microscopes at home who can give the definitive answers???

I think it'll have to go to Chipworks again, who got the Wii U die shots.

Unless someone here is quite wealthy and loves tech, I don't think anyone has an SEM at home! Those things are incredibly expensive.
 

Padinn

Member
Any thoughts on whether this late change might explain what appears to be small launch availability? I think they said two million units at launch. Do we know when they started mass production and what their daily capacity is?

If it's 20k units per day, it takes about 3.25 months to hit 2 million.
 

Rodin

Member
Since this thread contains a fair bit of fanfiction, I was wondering if the 1080p screen mentioned along with the larger GPU isn't simply a unit like the Wii U GamePad, to enable off-TV play for those who buy this (at the moment imaginary) high-end home console. You attach the Joy-Cons to its sides and you're good to go. It doesn't have an SoC inside, so it can be 1080p because the games are streamed from the console, like on Wii U.
 
Since this thread contains a fair bit of fanfiction, I was wondering if the 1080p screen mentioned along with the larger GPU isn't simply a unit like the Wii U GamePad, to enable off-TV play for those who buy this (at the moment imaginary) high-end home console. You attach the Joy-Cons to its sides and you're good to go. It doesn't have an SoC inside, so it can be 1080p because the games are streamed from the console, like on Wii U.

I'm pretty sure it was clarified that the "1080p screen" was a mistranslation of "1080p output". The rest of the devkit makes it seem just like the standard Switch devkit (or debug unit) that had pictures going around before the October reveal. I wouldn't read much at all into it.
 

Zedark

Member
Any thoughts on whether this late change might explain what appears to be small launch availability? I think they said two million units at launch. Do we know when they started mass production and what their daily capacity is?

If it's 20k units per day, it takes about 3.25 months to hit 2 million.

So, according to this leak, production started only in November, and from the start of November to March 3 is about 4 months. 20k units a day would mean 600k a month, so at launch they could theoretically have 2.4 million units, and 3 million at the end of March. That is probably what Kimishima meant by being able to raise production of Switch units: having 3 million, instead of 2 million, at the end of March.

Of course, this requires the Foxconn leak to be correct, but that seems quite reasonable given the amount of correct info it has provided. If not, then they would have been able to start production much earlier. This does invite a bit of a chicken-and-egg question: does Nintendo only ship 2 million at launch due to production restrictions, or did they only start production in November because they only wanted to ship 2 million units at launch?
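As a quick sanity check on that run-rate math (the 20k/day figure is just the assumption from the post above, and the November start comes from the leak), a rough sketch:

Code:
from datetime import date

UNITS_PER_DAY = 20_000                 # assumed daily capacity
start = date(2016, 11, 1)              # production start per the leak
launch = date(2017, 3, 3)
end_of_march = date(2017, 3, 31)

by_launch = (launch - start).days * UNITS_PER_DAY
by_march_end = (end_of_march - start).days * UNITS_PER_DAY

print(by_launch)     # ~2.44 million units by launch
print(by_march_end)  # ~3.0 million units by end of March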
 
So, according to this leak, production started only in November, and from the start of November to March 3 is about 4 months. 20k units a day would mean 600k a month, so at launch they could theoretically have 2.4 million units, and 3 million at the end of March. That is probably what Kimishima meant by being able to raise production of Switch units: having 3 million, instead of 2 million, at the end of March.

Of course, this requires the Foxconn leak to be correct, but that seems quite reasonable given the amount of correct info it has provided. If not, then they would have been able to start production much earlier. This does invite a bit of a chicken-and-egg question: does Nintendo only ship 2 million at launch due to production restrictions, or did they only start production in November because they only wanted to ship 2 million units at launch?

This also lines up with what we've heard from Japan Display about providing 3 million screens by the end of March.
 
I'm pretty sure it was clarified that the "1080p screen" was a mistranslation of "1080p output". The rest of the devkit makes it seem just like the standard Switch devkit (or debug unit) that had pictures going around before the October reveal. I wouldn't read much at all into it.

I wonder if the Switches with the black Joy-Cons/dark UI that a couple of devs posted pictures of are dev kits or just testing units?

[images: the dark-UI Switch units posted by developers]


For what it's worth, only the Wii U's dev kits had a dark UI

[image: Wii U dev kit with dark UI]
 

Xdrive05

Member
Hey folks, gaffer //ARCANUM posted new impressions in his recent thread saying he investigated but could not hear or feel the fan running on any Switch games demoed in any configuration, handheld *or* docked.

I know there has been a lot of speculation as to why they would even include a fan at the conservative clock speeds reported previously, and now in this thread's leak we are hearing about much higher clocks being tested in production.

Seems to me like all of this might suggest Nintendo is intending to upclock this thing at some point, so therefore they need a fan and the SoC to be able to run faster than the speeds it will ship with.

It's the only thing I see that makes sense of all the known facts as well as the credible sounding leaks. Just thought I'd put it here for the record. :)
 

z0m3le

Banned
Hey folks, gaffer //ARCANUM posted new impressions in his recent thread saying he investigated but could not hear or feel the fan running on any Switch games demoed in any configuration, handheld *or* docked.

I know there has been a lot of speculation as to why they would even include a fan at the conservative clock speeds reported previously, and now in this thread's leak we are hearing about much higher clocks being tested in production.

Seems to me like all of this might suggest Nintendo is intending to upclock this thing at some point, so therefore they need a fan and the SoC to be able to run faster than the speeds it will ship with.

It's the only thing I see that makes sense of all the known facts as well as the credible sounding leaks. Just thought I'd put it here for the record. :)

This falls in line with the thinking that the device is on a smaller node than the Shield TV, given that you can hear the fan in that device and it has a larger profile for cooling once the battery is taken into account. Though it could also be that none of the launch games are taxing the system.
 
This falls in line with the thinking that the device is on a smaller node than the Shield TV, given that you can hear the fan in that device and it has a larger profile for cooling once the battery is taken into account. Though it could also be that none of the launch games are taxing the system.

Or that they got rid of the fan for the retail unit and are just using the vents for convection cooling? Not likely but it's a possibility.
 

Padinn

Member
Well, assuming the hardware is prepped for it, they could increase clocks fairly easily if needed, similar to how they are adding USB 3.0 support after launch. The thing that stood out to me about Eurogamer is that they described the development doc as stating those were launch speeds.
 

OryoN

Member
As some of you are probably aware by now, several Japanese developers were asked about their thoughts on the Nintendo Switch presentation, and what things they found interesting about the device.

(If you haven't read it by now, hop on over to Nintendo Everything and give it a go.)

What caught my attention, though, was how some devs thought the console was cheap. Not just that, but on more than one occasion, it came in the context of them being especially surprised because they were aware of what the hardware brings to the table. This could be purely from a feature standpoint, or it could be processor related, or both. Either way, it's very interesting that some expressed their thoughts in this manner. For example:

Ganbarion
I was expecting the price to be ¥28,000, based on its performance, but I didn’t think the dock and Joy-Con grip and everything would be included. They’ve packed those in anyway, and still had the price at ¥29,980

Bandai Namco
The full thing is pretty cheap, and I say that as someone familiar with the inner workings of the system. I have no idea how they kept the price that low.

PlatinumGames
PlatinumGames has already come out saying we’re developing games for the Nintendo Switch, so we understand what the hardware is capable of. Which is why the price was surprising. This thing is seriously cheap.


These comments do seem to lean in favor of speculation based on the Foxconn leak. Or, to say it another way: I find it hard to believe that these developers were so impressed by four A57s and an underclocked X1, even to the point where the $299.99 price tag came as a complete surprise/shock. Am I wrong to believe that this wouldn't make much sense?

It's true that none of us know what's really under Switch's hood, but all the circumstantial evidence makes it hard - for me at least - to lean in favor of speculation heavily built upon Eurogamer's report. Thankfully, it's not too long before we'll be able to get some teardowns. In the meantime, how do you interpret those comments (especially with the two main opposing leaks in the current discussion)?
 
As some of you are probably aware by now, several Japanese developers were asked about their thoughts on the Nintendo Switch presentation, and what things they found interesting about the device.

(If you haven't read it by now, hop on over to Nintendo Everything and give it a go.)

What caught my attention, though, was how some devs thought the console was cheap. Not just that, but on more than one occasion, it came in the context of them being especially surprised because they were aware of what the hardware brings to the table. This could be purely from a feature standpoint, or it could be processor related, or both. Either way, it's very interesting that some expressed their thoughts in this manner. For example:

Ganbarion


Bandai Namco


PlatinumGames



These comments do seem to lean in favor of speculation based on the Foxconn leak. Or, to say it another way: I find it hard to believe that these developers were so impressed by four A57s and an underclocked X1, even to the point where the $299.99 price tag came as a complete surprise/shock. Am I wrong to believe that this wouldn't make much sense?

It's true that none of us know what's really under Switch's hood, but all the circumstantial evidence makes it hard - for me at least - to lean in favor of speculation heavily built upon Eurogamer's report. Thankfully, it's not too long before we'll be able to get some teardowns. In the meantime, how do you interpret those comments (especially with the two main opposing leaks in the current discussion)?

Not a popular opinion, but I think it is kind of impressive.

We have to remember that Nvidia sells the Shield TV for $200, and that comes with 16GB of storage while not including a screen, battery, or Joy-Cons (I'm not counting the dock or grip, because the developer who was impressed by the hardware was impressed even excluding those two items). They also condensed it all into a tablet form factor. I'm sure Nvidia sells the Shield TV at a good profit, but regardless, $100 extra for a screen, 16GB of extra storage, Joy-Cons (with motion, haptic feedback, and NFC), a battery, and everything slimmed down to a tablet form factor seems about right to me... and this is before even counting the dock.

It would be super cool to find out that they brought it down to a 16nm node, allowing for future clock increases, but I'm just not expecting it.
 
As some of you are probably aware by now, several Japanese developers were asked about their thoughts on the Nintendo Switch presentation, and what things they found interesting about the device.

(If you haven't read it by now, hop on over to Nintendo Everything and give it a go.)

What caught my attention, though, was how some devs thought the console was cheap. Not just that, but on more than one occasion, it came in the context of them being especially surprised because they were aware of what the hardware brings to the table. This could be purely from a feature standpoint, or it could be processor related, or both. Either way, it's very interesting that some expressed their thoughts in this manner. For example:

Ganbarion


Bandai Namco


PlatinumGames



These comments do seem to lean in favor of speculation based on the Foxconn leak. Or, to say it another way: I find it hard to believe that these developers were so impressed by four A57s and an underclocked X1, even to the point where the $299.99 price tag came as a complete surprise/shock. Am I wrong to believe that this wouldn't make much sense?

It's true that none of us know what's really under Switch's hood, but all the circumstantial evidence makes it hard - for me at least - to lean in favor of speculation heavily built upon Eurogamer's report. Thankfully, it's not too long before we'll be able to get some teardowns. In the meantime, how do you interpret those comments (especially with the two main opposing leaks in the current discussion)?
Well, from MDave's tests, the GPU speed in docked mode is possibly the average clock frequency for the TX1, due to the chip throttling when the system reaches a certain thermal level. The CPU reaching full speeds would be an issue because that would drain the battery fast in portable mode if the chipset was on 20nm. Even with that, the system is WAY above the 3DS in power (which still retails for $200), has a good screen, and includes all of the tech in the Joy-Cons.

Switch may be using 16nm after all, though, since the system appears to be running very cool, with the fan not being noticeable even while it's docked. Did more than one poster vouch for that?
 


In response to Ganbarion, of course the dock would be included, lol. I don't think the dock is worth $90 standalone. I think Nintendo will make some decent profit with the controllers and accessories sold standalone.

Well, from MDave's tests, the GPU speed in docked mode is possibly the average clock frequency for the TX1, due to the chip throttling when the system reaches a certain thermal level. The CPU reaching full speeds would be an issue because that would drain the battery fast in portable mode if the chipset was on 20nm. Even with that, the system is WAY above the 3DS in power (which still retails for $200), has a good screen, and includes all of the tech in the Joy-Cons.

Switch may be using 16nm after all, though, since the system appears to be running very cool, with the fan not being noticeable even while it's docked. Did more than one poster vouch for that?

I thought it was already established in the Eurogamer spec thread (which has been closed since the Switch presentation) that with Eurogamer's clock speeds, the Switch wouldn't need to use a fan in handheld mode?
 

z0m3le

Banned
Or that they got rid of the fan for the retail unit and are just using the vents for convection cooling? Not likely but it's a possibility.

Passively cooling the X1 chip on 20nm at a 768MHz GPU clock, in a device smaller than the Shield TV once the (charging) battery is taken into account, is unlikely, and it's downright impossible for it to run at the Foxconn leak clocks for 8 days straight without active cooling, and they were already mass-producing that unit. From everything we know about the chips and clocks, it's really impossible for this device to use passive cooling; it's far too thin IMO.
 