There is a lot of weird dismissal of these clocks from luka.
First, I'd like to comment on Thraktor's last post: your estimates only apply to the Maxwell-to-Pascal power consumption comparison. The SoC estimation is as good as we can get, and even if your chart is off, the difference is going to be counted in hundredths of a watt, not whole watts.
The chart could definitely be off by quite a large margin, particularly at low clock speeds (there's a reason I didn't extend the chart any further left). Just to clarify the data points I used for the chart, in decreasing order of likely accuracy:
- Nvidia's power measurement of the Tesla P4 HPC GPU (GP104) at 1060MHz: 36W (GPU only)
- Third-party power measurements of the GTX 1080 (GP104) at both stock and overclocked speeds, with estimated RAM power consumption subtracted
- Nvidia's 1.5W power consumption claim for TX1 GPU (2xSM) at ~500MHz, adjusted based on TSMC's claims of 16FF+ improvement over 20nm.
The other point I need to make in interpreting the graph is that it's making a very simple assumption that all non-SM GPU logic will scale proportionally with the number of SMs (which isn't really the case, as I'll clarify below).
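To make that assumption concrete, here's a rough sketch (Python, purely illustrative) of how a curve like this gets built: anchor the per-SM figure to the P4 data point above and scale it by f·V². The voltage/frequency relationship below is entirely made up for the example, and static leakage is ignored, which is a big part of why the low end of any such curve is shaky.

```python
# Illustrative only: the 36W / 20 SM / ~1060MHz anchor is the Tesla P4
# figure quoted above; the V/f numbers are guesses, not measurements.

def est_voltage(mhz, v_ref=0.85, mhz_ref=1060.0, slope=0.0004):
    """Hypothetical linear V/f curve (both the 0.85V anchor and the slope
    are assumptions)."""
    return v_ref + slope * (mhz - mhz_ref)

def per_sm_watts(mhz, anchor_w=36.0, anchor_sms=20, anchor_mhz=1060.0):
    """Scale the per-SM share of the anchor point by f*V^2 (the classic
    dynamic-power approximation, ignoring static leakage)."""
    scale = (mhz / anchor_mhz) * (est_voltage(mhz) / est_voltage(anchor_mhz)) ** 2
    return (anchor_w / anchor_sms) * scale

for mhz in (1060, 921, 768, 500, 400, 300):
    print(f"{mhz:>4} MHz: ~{per_sm_watts(mhz):.2f} W per SM")
```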
The only pure data point I have for Pascal power consumption is from Nvidia's P4 slide, as it's a precise GPU-only measurement without having to extrapolate anything. However, it's a far larger GPU than what's in Switch, and there's no guarantee that power consumption would scale linearly. Most importantly, though, we know that non-SM power consumption won't scale linearly. It's a 20 SM GPU, which is 10 times what we're expecting in Switch, but at 64 ROPs it only has 4x the quantity I'd expect in Switch's GPU (based on the TX1 having 16). The front-end is also likely to see limited benefits from scaling, so even if we know GP104's power draw at 1.06GHz, it's hard to say that a much smaller Pascal GPU would have a similar per-SM power consumption, even at that frequency.
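Just to put illustrative numbers on that (the split below is an assumption for the sake of the example, not a measurement): if, say, a quarter of the P4's 36W went to ROPs, front-end and other non-SM logic, and that part only shrank by the 4x ROP ratio rather than the 10x SM ratio, a 2-SM GPU would land noticeably above what naive per-SM scaling suggests.

```python
# Illustrative only: 36W / 20 SMs is the P4 figure from above; the 25%
# non-SM share is a pure guess, and 16/64 is the ROP ratio mentioned above.

P4_WATTS, P4_SMS = 36.0, 20
SWITCH_SMS = 2            # TX1-style 2-SM GPU assumed for Switch
NON_SM_SHARE = 0.25       # hypothetical fraction of power in ROPs/front-end
NON_SM_SCALE = 16 / 64    # ROPs scale ~4x down, not 10x

naive = P4_WATTS * SWITCH_SMS / P4_SMS
split = (P4_WATTS * (1 - NON_SM_SHARE) * SWITCH_SMS / P4_SMS
         + P4_WATTS * NON_SM_SHARE * NON_SM_SCALE)

print(f"naive per-SM scaling      : {naive:.2f} W")   # ~3.6 W
print(f"with fixed-ish non-SM part: {split:.2f} W")   # ~4.95 W
```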
The GTX 1080 power measurements should be reasonably accurate, but they're at far higher clocks than we're looking at (1.5GHz+) and are also based on the much larger GP104 GPU, so the same questions about scaling down to smaller GPUs apply.
The last one, which is the closest to the size and clocks of the Switch GPU, is also the most likely to be inaccurate. It's based on Nvidia's claim that, while matching the Apple A8X's performance, the TX1's GPU draws 1.5W. Then, it relies on TSMC's claims of the power efficiency improvements of 16FF+. This presents several possible sources of error:
- We don't actually know the clock speed. The TX1 at "full clocks" got about twice the performance of the A8X, so I had assumed, given a 1GHz max clock for the TX1, that it would be running at about 500MHz to match the A8X. However, we now know that even in the actively cooled Shield TV the TX1 can throttle down to 768MHz, so it's possible it was throttling in that benchmark too, and the A8X-matching clock could therefore be lower still (perhaps around 400MHz; see the quick arithmetic after this list).
- The 1.5W figure was given by Nvidia themselves, but they could be measuring in a way which is favourable to them, or they may have taken the best-performing TX1 die they could find for the test, with the real-world average die consuming more power.
- TSMC's claims for the power consumption benefits of 16FF+ are likely to be as favourable as possible (it's an advertising claim after all). There are probably certain chips at certain clock speeds which see these improvements, but there's no particular guarantee that Switch's SoC is one of those, and the real-world savings could be lower.
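For what it's worth, here's the quick arithmetic behind the first and third points. The performance-scales-with-clock assumption and the 40% savings figure are placeholders (the latter standing in for whatever TSMC's marketing number actually is), not things I'm asserting.

```python
# The 1.5W TX1 figure and the "twice the A8X" claim come from Nvidia's
# own material; everything else here is an assumption for illustration.

A8X_PERF_RATIO = 0.5          # A8X ~ half of TX1 at "full clocks"

for full_clock_mhz in (1000, 768):   # nominal max vs. observed throttled clock
    match_clock = full_clock_mhz * A8X_PERF_RATIO   # assumes perf scales ~linearly with clock
    print(f"'full clocks' = {full_clock_mhz} MHz -> A8X-matching clock ~ {match_clock:.0f} MHz")

TX1_GPU_WATTS_20NM = 1.5      # Nvidia's figure, measured however suited them
FF16_POWER_SAVINGS = 0.4      # placeholder for TSMC's claimed 16FF+ saving

adjusted = TX1_GPU_WATTS_20NM * (1 - FF16_POWER_SAVINGS)
print(f"16FF+ estimate: ~{adjusted:.2f} W (only as good as the savings figure)")
```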
The high end of the graph is probably reasonably accurate (at least for large Pascal GPUs). The low end of the graph, though, could easily be off by a factor of two or three. It's very difficult to estimate where a power curve like this will bottom out without precise measurements at those clocks, and as I only had vague extrapolations to work off of, the numbers given should be taken with a healthy dose of salt.
Next, the Wii U saw a very similar CPU performance upgrade for developers working on launch software.
Launch software developers thought the Wii U had 2 CPU cores at 1GHz, and up until a few months before launch this was the case. Later on they got access to 3 CPU cores at 1.24GHz. That is substantial, nearly a doubling of total CPU performance months before launch, and launch titles ran without that extra performance. On top of that, the GPU performance change was even greater than the one suggested by this Foxconn leak, so I don't see your point being a strong one.
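Putting rough numbers on the "nearly a doubling" bit (using the core counts and clocks quoted above, and assuming total CPU throughput scales with cores × clock):

```python
# Back-of-the-envelope only: assumes throughput ~ cores * clock.
early = 2 * 1.00    # 2 cores @ 1.0 GHz (early Wii U devkits, per the post)
final = 3 * 1.24    # 3 cores @ 1.24 GHz (final spec, per the post)
print(f"throughput ratio: {final / early:.2f}x")   # ~1.86x, i.e. "nearly a doubling"
```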
Lastly, Eurogamer answered someone about this; I believe it was Hermii who got the response and posted it in this thread. The post has since been altered, which is weird, but it originally said that the clocks might have changed after looking at this Foxconn leak. That doesn't sound too final to me, IMO. And with some developers still using the July devkits up to 3 or 4 weeks ago, and possibly even now, we don't know what changes were made in the final hardware; even Eurogamer made that same comment.
PS: I believe the Eurogamer rumor 100%; I simply think that final hardware allowed Nintendo to make these changes, resulting in the Foxconn leak, which is as solid a piece of info as we've ever gotten on Switch, IMO.
I don't think it's impossible for Nintendo to have made some small last-minute clock speed changes, and something like increasing the GPU clocks to 921MHz in docked mode would seem somewhat plausible, if they realised on final testing that the cooling system could easily handle it. I don't see them increasing the CPU clock to 1.78GHz for games, though. It's simply far too big a jump for them to make on a last-minute basis. Had they known that there was a change in store from the specs given to Eurogamer's source (either a more efficient CPU or a move to 16nm) then they would have taken that into account and wouldn't have described the specs as final.
The leak of the titles came from 4chan, and yeah, the real leaks always seem to get dismissed, it seems to me. But the reason no sites are reporting on the Foxconn leak might be that NDAs around final devkits could be much more serious, and not every dev has them. I'd also speculate that Nintendo has only given this info to wave 2 titles, since launch titles should all be targeting the older specs regardless, and I think even Nintendo has come out and said that Zelda BotW isn't using the Switch to its fullest.
Well, there are probably a few things discouraging sites from reporting on this leak. Firstly, on the surface the clock speeds contradict those from reliable sources (even though they are most likely just thermal stress test clocks and bear no relation to in-game speeds). Secondly, the latter half of the leak sounds very out-there, and it's unlikely they can find any sources to corroborate it (as whatever device it is is likely only just making its way out to developers).
I suppose the fact that the only discussion of it seems to be here doesn't help, what with crackpots like me talking about a GTX 1060-powered super-dock and so forth.
So after all the speculation does the evidence lean towards Switch being 16nm Finfet?
My money's still on 20nm.
We'll know in a month and a week. Someone will tear down the Switch; we'll measure the SoC, epoxy it, and sand it all the way down to the exposed die.
An amateur tear-down or die-photo is unlikely to tell us the manufacturing process, as 16nm and 20nm have pretty much the same transistor density. You'd really need a cross-section of the die (and professional imaging equipment) to be able to tell the difference, but hopefully Chipworks will come through for us again on that front.
I am so very late to this party, but I have to assume that SCD probably exists inside the hardware division of Nintendo and was tabled for not being effective (either as a function of cost or suitability for a good experience).
If it was the former and not the latter, then making a 4K dock could happen at some point, but that presents a different set of issues (is more RAM or CPU needed in addition to the GPU?).
Perhaps the next Switch targets 1080p in portable mode and 4K on the dock, but can run the 720p render target as well.
Will be interesting to see.
It's quite likely Nintendo have a lot of crazy prototypes locked away in their hardware labs, but if they're manufacturing 2000 units of a devkit with Foxconn then that suggests that they're moving into full-scale software production for the device. It's really just puzzling as to exactly what the device is, and what Nintendo's plans are for it.
The one thing holding me back from believing this 100% (or even 80%) was Eurogamer explicitly describing those clocks as final, like you said.
You don't think it's possible though that Nintendo had been investigating the possibility of 16nm for a while and finally determined they could go that route around August - October? Maybe they had to await reports of yield issues? And then come October they sent out those 16nm devkits which would line up with LKD's report of October devkits being more powerful. Again, Eurogamer describing those clocks as final sorta contradicts all of this, including LKD's October devkit leak.
I suppose if the numbers are very rough then it seems a lot less convincing than I previously thought. Power consumption matching up identically would have been an enormous indicator, but like you said since we don't have much hard data to work from that can't really be determined now.
Nintendo would have locked down the manufacturing process long before August. Ditto with choice of CPU core. Those decisions would have been made a long time ago, and by the time they told devs that the Eurogamer clocks were final they would have had actual silicon coming off the production lines to test (which is precisely why they would be confident enough to say that they're final clocks, because they're testing them on final hardware).
That would be very interesting news if it was coming as early as Holiday 2017 or something... It sure seems like the Switch is a "soft" launch, as some users have been calling it, with an incomplete OS, a trial online period, and relatively few games. They could be throwing us a huge curveball a year from now, though I really do doubt they want to muddy development even further with 1-2 more development targets.
Nintendo is nothing if not unpredictable though.
Well, that's just console launches for you. There are always "few games" and missing OS features in the first couple of months.
I don't know how the SCD would fit into their business plans, though, to be honest. Perhaps it's a last-minute thing, and they're trying to use it as a way to get western third-parties on board. I actually don't think an extra development target is that big of a deal (depending on the gap), assuming that Switch's development toolset/APIs/etc. are already based around the idea of having multiple performance/resolution targets and making it simple for developers to accommodate them.