
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Passively cooling the X1 chip on 20nm at a 768MHz clock, in a device smaller than the Shield TV once the (charging) battery is taken into account, is unlikely. And it's downright impossible for it to run passively at the Foxconn leak clocks for 8 days straight without active cooling, and they were already mass producing that unit. From everything we know about the chips and clocks, this device really can't be using passive cooling; it's far too thin imo.

Yeah like I said it was unlikely, but it's certainly possible if it's a 16nm chip at the Eurogamer clock speeds. But then the battery life should be a good amount better, not to mention it doesn't really make much sense to test the CPU at 1.78GHz and then lower it to 1GHz if those speeds are doable in a handheld. Although removing the fan could be an incentive to lower the clock speeds...

But on the other hand, I just can't really understand the reasoning behind locking the launch clock speeds to a certain rate and then quickly raising them right after. Are these demo units just old 20nm units? If they could produce 20k a day starting in November then surely they should have enough of the final retail units to demo, right? Why handicap their launch games in this case? Maybe final devkits only were available October and later?

I dunno guys, the Foxconn leak is obviously legit, but there are just too many unanswered questions to determine what the final hardware will be. Hopefully someone can do a teardown in 5 weeks!


EDIT: This is the first impression I've heard so far of the unit getting warm when playing anything (Zelda in this case):

I thought it was very good, too. Bright, colourful, and very crisp. And big, like a large phablet. The back did seem to grow pretty warm when I was playing Zelda, though. Speaking of games, we tried around six or seven of them. Which were your favourite?

From this article.
 

z0m3le

Banned
Yeah like I said it was unlikely, but it's certainly possible if it's a 16nm chip at the Eurogamer clock speeds. But then the battery life should be a good amount better, not to mention it doesn't really make much sense to test the CPU at 1.78GHz and then lower it to 1GHz if those speeds are doable in a handheld. Although removing the fan could be an incentive to lower the clock speeds...

But on the other hand, I just can't really understand the reasoning behind locking the launch clock speeds to a certain rate and then quickly raising them right after. Are these demo units just old 20nm units? If they could produce 20k a day starting in November then surely they should have enough of the final retail units to demo, right? Why handicap their launch games in this case? Maybe final devkits only were available October and later?

I dunno guys, the Foxconn leak is obviously legit, but there are just too many unanswered questions to determine what the final hardware will be. Hopefully someone can do a teardown in 5 weeks!

I think they were having a fair amount of trouble getting launch software out for last holiday, so they delayed it until March and decided to wait on final hardware to set final clocks. Remember, the Tegra 2 was supposed to be used in the 3DS and Nintendo skipped it because Nvidia promised them something they couldn't deliver, so using safe clocks for the launch software makes perfect sense imo when you look at that lineup. There is nothing there that requires a huge amount of performance from the device.

There was also the rumor that Nintendo was being very selective about which 3rd party games could come at launch; they might not have wanted to paint the Switch in the light that the Eurogamer clocks would set, if the Foxconn clocks are indeed accurate.
 

Thraktor

Member
The original Chinese leak was wiped from the Internet, which is curious in itself. I'm not sure we could get a better translation than what exists, but if we do it will be because someone saved the original leak that pretty much everyone thought was fake at the time.

That's disappointing to hear, I feel like we would have clarified quite a few things with a better translation (particularly whether the 200mm² chip is in addition to, or a replacement for, the 100mm² one).

Thraktor, I agree with the CPU power consumption you've assumed, but did you take into account the power savings on the GPU? As its clock increase is smaller than the gains that moving to 16nm allows, there should be additional savings there, making the power consumption difference smaller.

My point, though, was that if Switch is actually using a 16nm chip, but back in October dev kits were still using the 20nm TX1, Nintendo still would have had engineering samples of the 16nm chips at the point they were telling Eurogamer's source about the "final" clocks. So both CPU and GPU clocks at the time would have been based on Nintendo's own internal testing of the final chip, even if third parties didn't actually have dev kits using them yet. Now, it's possible that between the engineering samples in Sept/Oct and the final chips in Nov/Dec there was enough of an improvement that Nintendo decided they could afford to bring clocks up a bit, but that decision would have been a comparison of 16nm (engineering sample) to 16nm (final product), not a comparison from 20nm to 16nm. There's certainly scope for clock increases from ES to final chips, if manufacturing goes well, but I would expect something on the order of 10-20% perhaps, not 75%. The GPU clock of 921MHz, for example, is something I wouldn't rule out, but unless they dropped the 1:1 link in CPU clock speed between handheld and docked mode I can't imagine things improving enough to clock the CPU up that high while running games.

I could give you that on the CPU, but at the same time I have NEVER heard of anyone stress testing a chip (especially one in a finished product) at an almost 100% overclock while only overclocking the GPU 20%. Plus, that extra 20% on the GPU isn't going to take development between the two tiers from super easy to OMG so hard.

All I'm saying is that the reasons you guys are suggesting for running the system at these speeds just aren't things that are done.

Again, I have no idea if these are the final clocks or not; all I'm saying is that there are zero reasons to test a system for 8 days at these speeds and then not ship it at those speeds. No one does that.

I absolutely agree that stress-testing a CPU at 75% higher clock speeds than you plan to run it at is crazy, but I think that telling developers that "final clocks" are 1GHz only a couple of months before launch and then suddenly jumping to 1.78GHz is equally crazy.

The best explanation I could give would be if the CPU does clock up that high, but just not during (normal) games. They could use a dynamic clock while browsing the OS, eShop, etc., similar to smartphones, where it would jump between 600MHz (or similar) and 1.78GHz depending on usage, to ensure a smooth user experience without too much power draw. The other possibility is that they clock just one or two cores up that high for Gamecube or Wii emulation, with the GPU clocked down to accommodate.
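To make the idea concrete, here's a minimal sketch of the kind of on-demand clocking being speculated about. The frequency steps are hypothetical (600MHz idle from this post, 1020MHz from Eurogamer's report, a 1785MHz burst from the Foxconn leak), not confirmed Switch behaviour:

```python
# Sketch of an on-demand style governor, as speculated above.
# All frequency steps are hypothetical, pulled from the thread's rumors.
FREQ_STEPS_MHZ = [600, 1020, 1785]

def pick_clock(cpu_load: float) -> int:
    """Map a load in [0.0, 1.0] to a clock step: burst briefly under
    heavy OS/eShop activity, drop to the lowest step when idle."""
    if cpu_load > 0.80:
        return FREQ_STEPS_MHZ[-1]   # short burst to cut UI latency
    if cpu_load > 0.30:
        return FREQ_STEPS_MHZ[1]
    return FREQ_STEPS_MHZ[0]        # idle: save power

print(pick_clock(0.95))  # 1785
print(pick_clock(0.10))  # 600
```

A real governor would also ramp down after a timeout and cap burst duration thermally; this only illustrates why a high clock ceiling can coexist with low sustained game clocks.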

A less likely, but somewhat possible explanation would be that they've given developers an extra performance profile in games to run a smaller number of cores at a higher frequency. So, they could use 4 cores at 1GHz, 2 at 1.4GHz or 1 at 1.8GHz, or something like that. Some developers may prefer to work in a more lightly parallelised environment like that.
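Treating the profile numbers above as purely illustrative, a quick core-GHz tally shows the trade-off: aggregate throughput drops as cores are traded for single-thread speed:

```python
# Hypothetical performance profiles from the post: (cores, GHz per core).
profiles = {
    "4 cores @ 1.0 GHz": (4, 1.0),
    "2 cores @ 1.4 GHz": (2, 1.4),
    "1 core  @ 1.8 GHz": (1, 1.8),
}

for name, (cores, ghz) in profiles.items():
    # "core-GHz" = cores x clock: a crude proxy for total throughput.
    print(f"{name}: {cores * ghz:.1f} core-GHz total, {ghz:.1f} GHz per thread")
```

So the single-core profile would sacrifice more than half the aggregate throughput for an 80% single-thread gain, which is why it would only suit lightly threaded engines.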

As some of you are probably aware by now, several Japanese developers were asked about their thoughts on the Nintendo Switch presentation and what they found interesting about the device.

(If you haven't read it by now, hop on over to Nintendo Everything and give it a go.)

What caught my attention, though, was how some devs thought the console was cheap. Not just that, but on more than one occasion this came in the context of them being especially surprised because they were aware of what the hardware brings to the table. This could be purely from a feature standpoint, or it could be processor related, or both. Either way, it's very interesting that some expressed their thoughts in this manner. For example:

Ganbarion


Bandai Namco


PlatinumGames



These comments do seem to lean in favor of speculation based on the Foxconn leak. Or to put it another way: I find it hard to believe that these developers were so impressed by four A57s and an underclocked X1, even to the point where the $299.99 price tag came as a complete surprise/shock. Am I wrong to believe that this wouldn't make much sense?

It's true that none of us know what's really under Switch's hood, but all the circumstantial evidence makes it hard - for me at least - to lean in favor of speculation heavily built upon Eurogamer's report. Thankfully, it won't be too long before we're able to get some teardowns. In the meantime, how do you interpret those comments (especially with the two main opposing leaks in the current discussion)?

Higher clocks don't actually (directly) cost any more money though, and A72s are actually smaller (i.e. cheaper) than A57 cores. The Foxconn leak also mentions that the SoC in Switch is about the size of the TX1, so it would rule out something like a larger GPU (i.e. more SMs).

My guess is they're talking about the HD rumble. Although we don't have a full breakdown of the technology, it seems to be using linear actuators similar to what's used in the iPhone (hardly a cheap piece of hardware), but Nintendo are using at least two, possibly more (if there's also one in the main body for touch feedback), and may well be using more advanced models than Apple are using (given the more diverse applications for it in Switch). This is something we'll get more info on once we have a teardown, but I'd expect it to be pretty high up the list of component costs in the system.
 

Rodin

Member
The best explanation I could give would be if the CPU does clock up that high, but just not during (normal) games. They could use a dynamic clock while browsing the OS, eShop, etc., similar to smartphones, where it would jump between 600MHz (or similar) and 1.78GHz depending on usage, to ensure a smooth user experience without too much power draw.

Phones use higher clocks when they need to run heavy applications though, like... games.

You don't need A57 cores (or better) at 1.78GHz to run a bare-bones OS and the eShop.

My guess is they're talking about the HD rumble. Although we don't have a full breakdown of the technology, it seems to be using linear actuators similar to what's used in the iPhone (hardly a cheap piece of hardware), but Nintendo are using at least two, possibly more (if there's also one in the main body for touch feedback), and may well be using more advanced models than Apple are using (given the more diverse applications for it in Switch). This is something we'll get more info on once we have a teardown, but I'd expect it to be pretty high up the list of component costs in the system.

The guy from Ganbarion specifically said that he expected the machine to be ¥28,000 based on performance alone; there's no reason to assume that other developers were referring to something else.
 
The guy from Ganbarion explicitly said that he expected the machine to be ¥28,000 based on performance alone; there's no reason to assume that other developers were referring to something else.

Yeah, that specifically is a pretty telling quote. However, I'd still advise caution, because it still doesn't really make much sense from a reasoning perspective, at least to me. Why would they tell developers those were the final clocks if they actually knew what the final hardware would be, which they likely did by whenever they decided on a March release? It can't just be down to final devkits not being available until late, because developers could still use TX1 hardware to simulate the final specs.

I'm just stuck at trying to find out why this would happen I guess. And the annoying part is we may never know.
 
So I've not really been following this. Could someone just tell me whether we think this Foxconn leak is true and, if it is, what that means for the performance of the system?
 
So I've not really been following this. Could someone just tell me whether we think this Foxconn leak is true and, if it is, what that means for the performance of the system?

It's very, very likely that at least a majority of the leak is true, as there are some things that would have been absolutely impossible to guess, like the battery capacity, the colored Joy-Cons, the weights of both the Joy-Cons and the Switch tablet, etc...

Further, the clock speeds would have been very hard to fake in a leak like this because they line up precisely with what clock speeds are possible on the Tegra X1, which is not at all common knowledge. But it's possible that the leaker made those up (though very unlikely as I said above). It's also possible the clock speeds he saw were simply for stress testing, or some other sort of testing and won't be used for the final hardware.
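One way to see why the GPU numbers in particular are hard to fake: the Tegra X1's GPU frequency table steps in 76.8MHz increments (the CPU uses a different step table, so it's omitted here), and every commonly cited GPU clock lands exactly on that grid. A quick check, assuming the figures discussed in this thread:

```python
# Rumored Switch GPU clocks (MHz): Eurogamer's portable/docked figures
# plus the Foxconn leak's test clock. Tegra X1 GPU frequencies step in
# 76.8 MHz increments, so random made-up numbers would rarely fit.
STEP_MHZ = 76.8
rumored_gpu_clocks = [307.2, 768.0, 921.6]

for clk in rumored_gpu_clocks:
    print(f"{clk:6.1f} MHz = {clk / STEP_MHZ:4.1f} x {STEP_MHZ} MHz")
```

All three are whole multiples (4x, 10x and 12x), which is exactly what you'd expect from someone reading real X1 test output rather than inventing numbers.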

However, regarding the SoC itself, the leaker is making an educated guess that it's been made on a 16nm node and that the CPU cores are A72/3 just because of the clock speeds and battery capacity I believe. So this info is certainly not guaranteed to be true, even if the clock speeds he claimed are 100% accurate. But many in this thread have made some persuasive points that back up the leaker's info.

If this is all true it won't mean much for graphics, as the GPU is getting a 20% performance bump in docked mode (we don't know for sure about handheld mode, but 20% there would make sense too). However, it's almost a 100% increase in CPU performance, which would put the Switch CPU on par with (or above) the XB1/PS4 CPUs for most tasks. That would be a fairly big deal, especially for porting AAA titles.
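For reference, the percentage gaps between the two sets of leaked clocks are easy to check directly (1020MHz CPU / 768MHz docked GPU from Eurogamer vs 1785MHz / 921.6MHz from the Foxconn test figures); in raw clock terms the CPU jump works out to 75%:

```python
def pct_gain(old_mhz: float, new_mhz: float) -> float:
    """Percentage increase going from the old clock to the new one."""
    return (new_mhz / old_mhz - 1) * 100

# Eurogamer's reported clocks vs the Foxconn stress-test clocks.
print(f"CPU: +{pct_gain(1020, 1785):.0f}%")           # +75%
print(f"GPU (docked): +{pct_gain(768, 921.6):.0f}%")  # +20%
```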
 

Malus

Member
Oh shit didn't catch that confirmation of Platinum working on games for the Switch.

They were probably included in that 3rd party developer list from a while back, but it's nice to see them actually say something.
 

Donnie

Member
I absolutely agree that stress-testing a CPU at 75% higher clock speeds than you plan to run it at is crazy, but I think that telling developers that "final clocks" are 1GHz only a couple of months before launch and then suddenly jumping to 1.78GHz is equally crazy.

Wasn't the Eurogamer info from 6 or 7 months ago? It could be possible that they gave developers conservative clocks to aim for at launch, as at that point they didn't know what performance the SoC would be able to hit (hence the tests at Foxconn months later).

The best explanation I could give would be if the CPU does clock up that high, but just not during (normal) games. They could use a dynamic clock while browsing the OS, eShop, etc., similar to smartphones, where it would jump between 600MHz (or similar) and 1.78GHz depending on usage, to ensure a smooth user experience without too much power draw. The other possibility is that they clock just one or two cores up that high for Gamecube or Wii emulation, with the GPU clocked down to accommodate.

But if you're testing just to make sure the CPU can clock up to 1.78GHz for the OS, you don't also have the GPU running at full whack (or in this case above the reported clock). You'd end up throwing away chips because they couldn't hit a combined performance level that will never be used.


Higher clocks don't actually (directly) cost any more money though, and A72s are actually smaller (i.e. cheaper) than A57 cores. The Foxconn leak also mentions that the SoC in Switch is about the size of the TX1, so it would rule out something like a larger GPU (i.e. more SMs).

Could using the smaller A72s and removing some non-gaming functions negate the bigger GPU? Not saying I believe there are more SMs, I actually don't. But it's maybe possible to have additions to the GPU without increasing SoC size in comparison to the X1.
 

Hermii

Member
Wasn't the Eurogamer info from 6 or 7 months ago? It could be possible that they gave developers conservative clocks to aim for at launch, as at that point they didn't know what performance the SoC would be able to hit (hence the tests at Foxconn months later).

No, the information is from "fall"; it's most likely the October devkit.
 

z0m3le

Banned
No, the information is from "fall"; it's most likely the October devkit.

I disagree. The video leak stated that the source gave him the information months ago and that he sat on it, and that the trigger for the Eurogamer article wasn't new information but VentureBeat's article. While his information lines up perfectly with the July kits, it doesn't line up with the final hardware's testing, and there's the vague comment about "fall" alongside the statement, back in December when they revealed the clocks, that his information was a few months old.

The real issue with the info being from the October devkit is that final devkits came in late October, and Eurogamer had no information about the final hardware beyond the target launch clocks, which existed back in July.

Meanwhile, we know the testing that doesn't fit with Eurogamer's article happened in November and is newer information anyway.
 
I think it's really strange that Nintendo hasn't announced the RAM for Switch yet. The Wii U had its storage and RAM announced about 3 months before launch.

I'm expecting 4GB as rumored. Not keeping my hopes up for an increase to 6GB or more... Though the more RAM the better for future-proofing it for when we get an SCD. If it's really going to be 50% as powerful as the Xbone while docked, per the Foxconn specs, it especially needs all it can get for porting.
 

z0m3le

Banned
I think it's really strange that Nintendo hasn't announced the RAM for Switch yet. The Wii U had its storage and RAM announced about 3 months before launch.

I'm expecting 4GB as rumored. Not keeping my hopes up for an increase to 6GB or more... Though the more RAM the better for future-proofing it for when we get an SCD. If it's really going to be 50% as powerful as the Xbone while docked, per the Foxconn specs, it especially needs all it can get for porting.
The SCD can have its own vram. If the switch has 4GB and uses 1GB for 720p, the SCD can have 4GB or even 8GB and target 4k.
 
The SCD can have its own vram. If the switch has 4GB and uses 1GB for 720p, the SCD can have 4GB or even 8GB and target 4k.

I wish Nintendo had just future-proofed the thing's RAM instead of waiting 2 years. 6GB of RAM and more bandwidth would go a long way in terms of scaling for 3rd party ports.

Totally out of our hands, though, of course. Just a thought. Though I still find it weird that Nintendo hasn't announced the RAM yet.
 

z0m3le

Banned
I wish Nintendo had just future-proofed the thing's RAM instead of waiting 2 years. 6GB of RAM and more bandwidth would go a long way in terms of scaling for 3rd party ports.

Totally out of our hands, though, of course. Just a thought. Though I still find it weird that Nintendo hasn't announced the RAM yet.
3.2GB vs the 5GB (6GB for the PS4 Pro) available to games is plenty of "future proofing" for a 720p device.
 

Thraktor

Member
Phones use higher clocks when they need to run heavy applications though, like... games.

You don't need A57 cores (or better) at 1.78GHZ to run a barebone OS and the eShop.

Phones are rarely able to maintain high clocks for long, though, whereas Switch needs to be able to keep consistent clock speeds for as long as someone is playing a game. Mobile games are also rarely reliant on more than one or two threads, while many Switch games would be expected to use all 4(?) cores at pretty much full load.

You don't need A57s or A72s at 1.78GHz to run an OS, but if you have them, it can certainly help. OS smoothness is all about latency, and if you can clock up to 1.78GHz for a fraction of a second to bring the latency of a given action down from (say) 500ms to 300ms, then it's well worth doing.
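The latency claim is just inverse clock scaling: for a fully CPU-bound action, duration shrinks in proportion to the burst clock. Real OS actions are only partly CPU-bound, so this is the best case, and the 500ms figure is the post's hypothetical:

```python
def burst_latency_ms(latency_ms: float, base_mhz: float, burst_mhz: float) -> float:
    """Best-case latency of a CPU-bound task after bursting the clock up."""
    return latency_ms * base_mhz / burst_mhz

# 500 ms at a 1.0 GHz base clock, bursting to the leaked 1.785 GHz:
print(round(burst_latency_ms(500, 1000, 1785)))  # 280
```

So a ~75% burst would cut a CPU-bound 500ms action to roughly 280ms, in the same ballpark as the 300ms figure above.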

The guy from Ganbarion specifically said that he expected the machine to be ¥28,000 based on performance alone; there's no reason to assume that other developers were referring to something else.

Sorry, I missed that, I should have read a bit more closely!

I still don't think it's necessarily any reason to expect it to be any more powerful than we already know. As a handheld (which is how I'd expect most Japanese devs to view it), even "just" a 1GHz quad-A57 and 300/768MHz 2SM Maxwell GPU is still an extremely impressive piece of kit. It's at least as ambitious as Vita was (which launched at ¥25K/¥30K), plus a larger screen, detachable controllers, HD rumble and the ability to dock and clock up for 1080p TV gaming. The Vita and n3DS LL even still sell for around ¥20K in Japan now, so a ¥30K ~150Gflop handheld would give Japanese gamers orders of magnitude more performance per yen than they get from existing portables.
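The "~150Gflop" handheld figure follows from the standard FP32 peak formula for a 2-SM Maxwell GPU (128 CUDA cores per SM, each retiring one fused multiply-add, i.e. 2 FLOPs, per cycle) at the Eurogamer clocks:

```python
def fp32_peak_gflops(sms: int, clock_mhz: float) -> float:
    """Peak FP32 throughput of a Maxwell-style GPU.

    128 CUDA cores per SM, each doing one FMA (2 FLOPs) per cycle.
    """
    cores_per_sm, flops_per_core = 128, 2
    return sms * cores_per_sm * flops_per_core * clock_mhz / 1000

print(f"portable (307.2 MHz): {fp32_peak_gflops(2, 307.2):.0f} Gflops")  # 157
print(f"docked   (768.0 MHz): {fp32_peak_gflops(2, 768.0):.0f} Gflops")  # 393
```

Both numbers assume the 2-SM configuration and Eurogamer clocks discussed in this thread; the Foxconn docked test clock of 921.6MHz would push the second figure to about 472 Gflops by the same formula.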

Wasn't the Eurogamer info from 6 or 7 months ago? It could be possible that they gave developers conservative clocks to aim for at launch, as at that point they didn't know what performance the SoC would be able to hit (hence the tests at Foxconn months later).

As far as I'm aware Eurogamer clarified to someone that the specs were from around October (although I can't find the source for this, so take it with a grain of salt). I don't see why Nintendo would tell a developer that given clock speeds are final unless they are at least pretty confident that they've been nailed down, though. Varying clock speeds are pretty standard for pre-launch software development, and I can't imagine developers having any issue with Nintendo listing clocks as "subject to change" if they don't have near-final hardware to test on yet.

But if you're testing just to make sure the CPU can clock up to 1.78GHz for the OS, you don't also have the GPU running at full whack (or in this case above the reported clock). You'd end up throwing away chips because they couldn't hit a combined performance level that will never be used.

It does seem strange, but perhaps there are circumstances where the system is clocked to 1.78GHz and stressing the GPU. For example, GC/Wii BC rendered at 1080p. Or perhaps testing of the final production run at TSMC has shown that all the working dies coming off the line can hit those clocks, so they're testing the cooling system with them just in case they might add any functionality down the line which requires it.

Could using the smaller A72s and removing some non-gaming functions negate the bigger GPU? Not saying I believe there are more SMs, I actually don't. But it's maybe possible to have additions to the GPU without increasing SoC size in comparison to the X1.

Yeah, there's definitely scope for savings compared to the TX1 die, such as the video codec block (TX1 is capable of encoding 4K h265 at 60fps, which is obviously rather redundant for Switch). I'm not sure how much actual die space these kinds of things would save, but I wouldn't be surprised if they've given Nintendo some scope to add to or customise some aspects of the GPU. One relatively obvious example would be increasing the GPU L2 cache, but there may be other aspects where Nintendo would want to make changes. I wouldn't expect any major changes in performance, though, just removing (or minimising) some bottlenecks which Nintendo may have identified as being important to them.
 

Donnie

Member
As far as I'm aware Eurogamer clarified to someone that the specs were from around October (although I can't find the source for this, so take it with a grain of salt). I don't see why Nintendo would tell a developer that given clock speeds are final unless they are at least pretty confident that they've been nailed down, though. Varying clock speeds are pretty standard for pre-launch software development, and I can't imagine developers having any issue with Nintendo listing clocks as "subject to change" if they don't have near-final hardware to test on yet.

They were listed as final for launch, so I've got little doubt that launch clocks are those Eurogamer frequencies (or certainly very close). But IMO if they're doing these tests they're at least strongly considering upping the clocks at some point.

It does seem strange, but perhaps there are circumstances where the system is clocked to 1.78GHz and stressing the GPU. For example, GC/Wii BC rendered at 1080p. Or perhaps testing of the final production run at TSMC has shown that all the working dies coming off the line can hit those clocks, so they're testing the cooling system with them just in case they might add any functionality down the line which requires it.

Well the question then becomes why run so much faster for high res backwards compatibility but limit things so much for native gaming. I think a possible increase down the line is most likely, and I'd think that would be across the board.
 
^^I don't think they will upclock handheld mode, because I think they will want to save as much battery as possible. I seriously wonder about the upclocking of the CPU, though. Does it have to be the same in both handheld and docked mode?

And I wonder if Nintendo will go the PS4 Pro route for developers and let them choose what they want to increase in fidelity, instead of it just being a resolution increase like the Eurogamer rumor has claimed. It only takes 2.25x the power to go from 720p to 1080p, and then they still have a bit more power left over (2.5x for docked). And this is assuming Nintendo will always have 1080p games docked and 720p in handheld, which likely won't always be the case. Heck, BotW on Switch is only 900p docked and 720p in handheld. It would feel like a waste if they didn't squeeze out everything in docked. And if Foxconn ends up being true, we could get a 20% boost in power over the Eurogamer clock speeds.
 

Polygonal_Sprite

Gold Member
You were able to squeeze quite a lot out of the Wii U with Fast Racing Neo. How would you say Switch compares to that?

On Wii U it was tough to get to the visuals we wanted at 60fps. We had to use a lot of tricks to make it happen at all. For instance, on Wii U we were able to pull off two-player split-screen racing at 720p and 60fps. Not bad at all, but with three or four players we had to scale down to 30fps. Well, most racing games today don't even feature split-screen at all, so we didn't feel too bad about it. However, on the Switch we didn't have that limit. On Nintendo Switch in TV mode we now have four-player split-screen, at 1080p and 60fps, with much more detail enabled. For worst-case scenarios we enable dynamic resolution scaling to make sure the game keeps 60fps in unforeseen situations, though that's usually not something you can spot.

Very interesting quote from Shinen about the Switch version of Fast Racing Neo. So that's 720p/30fps on Wii U to 1080p/60fps with better graphics on Switch in 4-player mode. Would that be possible with Eurogamer's leaked clocks?
 
Very interesting quote from Shinen about the Switch version of Fast Racing Neo. So that's 720p/30fps on Wii U to 1080p/60fps with better graphics on Switch in 4-player mode. Would that be possible with Eurogamer's leaked clocks?



If our deduction is right that the Switch is 4x as powerful as the Wii U when docked, with the Eurogamer clock speeds + 2 SMs for the GPU, then I think it should be. The GPU is 4x as much, we get 3x more RAM for gaming, and the CPU is a lot better. It takes 2.25x the GPU power to go from 720p to 1080p. I don't know how much processing power it takes to go from 30fps to 60fps, but the CPU and RAM are also significant factors in frame rate stability, as is how well the game/engine is optimized.
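On the resolution side of that deduction, the 2.25x figure is just the ratio of pixels per frame, and going from 720p30 to 1080p60 also doubles the frame rate, so raw pixel throughput rises 4.5x (a rough proxy; shading cost doesn't scale perfectly linearly with resolution):

```python
# Pixels per frame: 1080p vs 720p.
res_ratio = (1920 * 1080) / (1280 * 720)
print(res_ratio)  # 2.25

# Pixels per second: 720p at 30fps (Wii U) -> 1080p at 60fps (Switch).
throughput_ratio = res_ratio * (60 / 30)
print(throughput_ratio)  # 4.5
```

Which is why a ~4x GPU uplift is right at the edge of what Shinen's jump would need, before counting their added detail.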

https://www.lifewire.com/optimizing-video-game-frame-rates-811784

I hope Nintendo is listening. Shinen just made them look bad, considering 3-4 players will still be 30fps in MK8 Deluxe, while they have it at 60fps at the same 1080p resolution. Granted, you have items and arguably more players to deal with in the Mario Kart series.

I wish they had ended up using dynamic resolution while maintaining 60fps for 3-4 players like Shinen did. /:
 
VC uses more CPU power to emulate, so the GPU could be clocked down in that mode to reduce overall power. Still, in order for the system to run the CPU that high, 16nm will be needed, or else it would have less battery life than the "regular" mode.

^^I don't think they will upclock handheld mode, because I think they will want to save as much battery as possible. I seriously wonder about the upclocking of the CPU, though. Does it have to be the same in both handheld and docked mode?

And I wonder if Nintendo will go the PS4 Pro route for developers and let them choose what they want to increase in fidelity, instead of it just being a resolution increase like the Eurogamer rumor has claimed. It only takes 2.25x the power to go from 720p to 1080p, and then they still have a bit more power left over (2.5x for docked). And this is assuming Nintendo will always have 1080p games docked and 720p in handheld, which likely won't always be the case. Heck, BotW on Switch is only 900p docked and 720p in handheld. It would feel like a waste if they didn't squeeze out everything in docked. And if Foxconn ends up being true, we could get a 20% boost in power over the Eurogamer clock speeds.

The Eurogamer leak implies that the speed boost is not mandated at all.


Very interesting quote from Shinen about the Switch version of Fast Racing Neo. So that's 720p / 30fps on Wii U to 1080p / 60fps with better graphics on Switch in 4 player mode. Would that be possible with Eurogamers leaked clocks ?

Even with the Eurogamer clockspeeds, the Switch is several times stronger than the Wii U all around with a more modern GPU. Perhaps this does bring more confidence about the Switch's memory setup, though.
 

KingSnake

The Birthday Skeleton
The demo that Todd Howard was talking about was 1-2-Switch. It was said in the Q&A session yesterday. Switch has so much power that you don't even need to look at the screen, you can feel the power in the palm of your hands.
 
The demo that Todd Howard was talking about was 1-2-Switch. It was said in the Q&A session yesterday. Switch has so much power that you don't even need to look at the screen, you can feel the power in the palm of your hands.

Aha! That was the basis for the theory I presented that Skyrim will utilize HD rumble. Where was this Q&A? From Nintendo's meeting? Or a Bethesda meeting?
 

Damn I can't google translate that. I'll wait until the english is up before I revive my Skyrim theory thread haha

Anyway, I think it was clear after the presentation that the Todd Howard "demo" had nothing to do with graphics. So I don't really see what that has to do with this Foxconn leak anyway, unless some were trying to connect his quote to the possible SCD devkit (which I still think is nothing). A 16nm SoC with a 20% improved GPU wouldn't really blow anyone away anyway, but the better CPU would definitely help with ports.
 

Zedark

Member

That citation from Miyamoto from the other thread, about the one-year porting time, is also in here:
Miyamoto said:
Switch 向けにゲームを移植しようとすると1年以内には移植できるほどの開発環境が整っていま [roughly: "When trying to port a game to Switch, the development environment is now in place to the point that a port can be done within a year"]
I dumped it into Google Translate, but the translation doesn't flow, so I can't say for certain what it means. It feels, however, like Miyamoto is referring to porting engines over rather than games, though I absolutely can't say for sure. Here is the Google Translate blurb:
When we try to port games for Switch, development environment that can be transplanted is prepared within one year.
Could someone who can read Japanese give us a translation for this? It seems quite a vital quote for Switch development, so we need to have a correct translation for it. Thanks!
 

KingSnake

The Birthday Skeleton
Damn I can't google translate that. I'll wait until the english is up before I revive my Skyrim theory thread haha

Anyway, I think it was clear after the presentation that the Todd Howard "demo" had nothing to do with graphics. So I don't really see what that has to do with this Foxconn leak anyway, unless some were trying to connect his quote to the possible SCD devkit (which I still think is nothing). A 16nm SoC with a 20% improved GPU wouldn't really blow anyone away anyway, but the better CPU would definitely help with ports.

This is the Google Translation of that part:

Since last year we introduced Nintendo Switch to various development companies using software such as "1-2-Switch", but at that time, many people are interested in new ways of playing Todd Howard of Bethesda Game Studio, who was developed "Skyrim", already showed favorable reactions according to the video message streamed in the Nintendo Switch presentation
 
This doesn't actually say it was the only software shown. Though I have been pretty sure all along that Todd Howard was talking mostly about HD rumble and maybe about the hybrid concept.
 

KingSnake

The Birthday Skeleton
If I read "this doesn't actually say" one more time I think I will bail out of GAF. Subtlety and context are dead; we live only in absolutes.
 
If I read "this doesn't actually say" one more time I think I will bail out of GAF. Subtlety and context are dead; we live only in absolutes.
Eh? In either case, the demo that Todd Howard saw would show at most minor differences in graphical potential between Eurogamer's reported specs and the rumored ones. It makes sense that whatever wowed him has little to do with graphical power, because the system will not be a powerhouse either way.

It seems that HD Rumble has intrigued more people than most expected, but that is definitely OT.
 
Eh? In either case, the demo that Todd Howard saw would show at most minor differences in graphical potential between Eurogamer's reported specs and the rumored ones. It makes sense that whatever wowed him has little to do with graphical power, because the system will not be a powerhouse either way.

It seems that HD Rumble has intrigued more people than most expected, but that is definitely OT.

Actually, I wonder if the HD rumble has any sort of effect on CPU use. Based on what we know so far, I think we can pretty safely assume HD rumble is built with several physics libraries in mind in order to get sensations like marbles rolling in a box, so it could be that HD rumble enabled games have a bit of a larger CPU requirement than other games.

Though I kinda doubt that will be big enough of a CPU need. The physics programmed for HD rumble are likely very simple in that they don't really need to interact with any other complex systems like the physics in a game like BotW do.

Anyway, I still don't know why the CPU might have a max clock rate of 1.78GHz if its max clock rate for games is just 1GHz. Emulation likely wouldn't require 75% more CPU power would it?
 

Thraktor

Member
They were listed as final for launch, so I've got little doubt that launch clocks are those Eurogamer frequencies (or certainly very close). But IMO if they're doing these tests they're at least strongly considering upping the clocks at some point.

Yeah, that's a possibility, but it would seem very strange of Nintendo, given that it's not something they've ever done before. They'd also have to tank the battery life by quite a bit to make these kinds of changes to portable clocks, which I'm not sure they'd be willing to do.

Well the question then becomes why run so much faster for high res backwards compatibility but limit things so much for native gaming. I think a possible increase down the line is most likely, and I'd think that would be across the board.

Perhaps the clocks required to emulate Wii are too high for portable mode and they're limiting Wii BC to docked mode only (possibly with the excuse that you need to detach the joy-cons for wiimote functionality anyway). If (hypothetically) Wii BC required a 1.78GHz clock on the CPU, then GC BC should be feasible at around 1.2GHz, which would seem within the bounds of reason for handheld mode (particularly if only two cores are being used and the GPU isn't stressed).
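As a sanity check on that hypothetical, scaling by the actual guest CPU clocks gives roughly the same figure. This is a crude sketch assuming emulation cost scales linearly with guest clock speed (real emulator performance depends on far more than that):

```python
# Real guest-hardware clocks: GameCube "Gekko" CPU is ~486 MHz,
# Wii "Broadway" CPU is ~729 MHz.
GEKKO_MHZ = 486
BROADWAY_MHZ = 729

def gc_bc_clock_estimate(wii_bc_clock_mhz):
    """If Wii emulation needed `wii_bc_clock_mhz` on the host CPU, estimate
    the GameCube requirement by scaling with the guest clock ratio
    (a naive linear assumption, for illustration only)."""
    return wii_bc_clock_mhz * (GEKKO_MHZ / BROADWAY_MHZ)

print(round(gc_bc_clock_estimate(1780)))  # ~1187 MHz, i.e. about 1.2 GHz
```

Since Broadway is exactly 1.5x Gekko's clock, a hypothetical 1.78GHz Wii BC requirement scales down to roughly the 1.2GHz figure above.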

Actually, I wonder if the HD rumble has any sort of effect on CPU use. Based on what we know so far, I think we can pretty safely assume HD rumble is built with several physics libraries in mind in order to get sensations like marbles rolling in a box, so it could be that HD rumble enabled games have a bit of a larger CPU requirement than other games.

Though I kinda doubt that will be big enough of a CPU need. The physics programmed for HD rumble are likely very simple in that they don't really need to interact with any other complex systems like the physics in a game like BotW do.

It shouldn't be particularly computationally expensive. If, as seems likely, they're using linear actuators, then the output to them will be a series of frequencies over time. That is, effectively an audio signal, but with likely much lower precision required. It's quite likely that the "mixing" could actually be done on an audio DSP, for that matter.

The physics shouldn't be any more complicated than any other game physics, and in certain circumstances I'm sure they can reduce the model into two dimensions (or even one dimension) and still get a good enough effect. I'd imagine the "revamped physics engine" that Nvidia talked about is actually related to this, effectively a version of PhysX extended specifically to give developers an easy way to make use of the HD rumble functionality.
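To illustrate the "effectively an audio signal" idea (this is purely an illustrative sketch, not Nintendo's or Immersion's actual API): an HD rumble output can be modeled as a low-rate sample buffer mixed from a few frequency components, exactly the way an audio DSP mixes channels.

```python
import math

SAMPLE_RATE = 1000  # Hz; haptics need far less precision than 48 kHz audio

def mix_rumble(components, duration_s):
    """Mix (frequency_hz, amplitude) pairs into one sample buffer,
    much like an audio DSP mixing channels, then clip to DAC range."""
    n = int(SAMPLE_RATE * duration_s)
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        s = sum(a * math.sin(2 * math.pi * f * t) for f, a in components)
        samples.append(max(-1.0, min(1.0, s)))  # clip like a DAC would
    return samples

# e.g. a 160 Hz buzz with a weaker 320 Hz overtone, lasting 50 ms
buf = mix_rumble([(160, 0.6), (320, 0.2)], 0.05)
```

The sample rate and frequencies here are made-up placeholders; the point is just that "mixing" a few sine components at this rate is trivially cheap compared to game logic.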

Anyway, I still don't know why the CPU might have a max clock rate of 1.78GHz if its max clock rate for games is just 1GHz. Emulation likely wouldn't require 75% more CPU power would it?

It depends on a great many factors. What system is being emulated? What CPU are you emulating it on? How high/low level is your emulator? How well coded is your emulator? etc. etc.

It's an extremely difficult question to answer in isolation. We can say that, for example, Dolphin runs reasonably well on the Shield TV (which is ostensibly 2GHz A57 but likely throttles), but doesn't quite achieve locked 60fps for every game. On the other hand, one would expect that a Nintendo developed native emulator should outperform a community developed one (which runs on Android/OpenGL, at that), but to what extent is difficult to say.
 

Charamiwa

Banned
If i read one more time "this doesn't actually say" I think I will bail out of GAF, Subtlety and context are dead, we live only in absolutes.

What's the reasoning behind this? I would think not fully trusting a google translated sentence would be the logical thing to do in general. Especially when said sentence is already vague as is.

Here's what I'm reading here: they've been showing demos of titles like 1-2 Switch to a lot of developers. Those responses were positive, as it is shown in the Todd Howard prerecorded segment during the presentation. It's basically just PR speak. Just like the Todd Howard interview.
 
It shouldn't be particularly computationally expensive. If, as seems likely, they're using linear actuators, then the output to them will be a series of frequencies over time. That is, effectively an audio signal, but with likely much lower precision required. It's quite likely that the "mixing" could actually be done on an audio DSP, for that matter.

The physics shouldn't be any more complicated than any other game physics, and in certain circumstances I'm sure they can reduce the model into two dimensions (or even one dimension) and still get a good enough effect. I'd imagine the "revamped physics engine" that Nvidia talked about is actually related to this, effectively a version of PhysX extended specifically to give developers an easy way to make use of the HD rumble functionality.

Yeah I figured it shouldn't be too CPU intensive, it's just something that does take a bit of the CPU away from the main game. And that's a very interesting thought about the revamped physics engine... I wonder how much input Nvidia had with the HD rumble. I would have thought Immersion's licensed software would have most of that work done. Maybe Nvidia just helped merge that with their own NVN API.

It depends on a great many factors. What system is being emulated? What CPU are you emulating it on? How high/low level is your emulator? How well coded is your emulator? etc. etc.

It's an extremely difficult question to answer in isolation. We can say that, for example, Dolphin runs reasonably well on the Shield TV (which is ostensibly 2GHz A57 but likely throttles), but doesn't quite achieve locked 60fps for every game. On the other hand, one would expect that a Nintendo developed native emulator should outperform a community developed one (which runs on Android/OpenGL, at that), but to what extent is difficult to say.

I would just say that, given what we know, it seems unlikely (though possible) that the 1.78GHz CPU speed is solely for emulation. It still wouldn't explain how you can get four A57s at 1.78GHz and a 20nm 2SM Maxwell GPU at 921MHz running continuously for 8 days when a Shield TV can't do that without throttling, despite having a much larger volume and thus much better heat dissipation.

Add to that the fact that no one from any of the Switch events has felt air coming out of the vents even when docked playing Zelda, and it looks quite possible that they have built this on a 16nm node.
 

Pokemaniac

Member
Perhaps the clocks required to emulate Wii are too high for portable mode and they're limiting Wii BC to docked mode only (possibly with the excuse that you need to detach the joy-cons for wiimote functionality anyway). If (hypothetically) Wii BC required a 1.78GHz clock on the CPU, then GC BC should be feasible at around 1.2GHz, which would seem within the bounds of reason for handheld mode (particularly if only two cores are being used and the GPU isn't stressed).

Even if it's docked-only, I find it pretty unlikely that they'd make clocks available for VC but not native games. The only time they've really done something like that before was relaxing the NX bit enforcement so VC stuff could use JIT compilers, and that was likely more for security than anything else.
 

KingSnake

The Birthday Skeleton
What's the reasoning behind this? I would think not fully trusting a google translated sentence would be the logical thing to do in general. Especially when said sentence is already vague as is.

Just letting some frustration out. Too many people are trying to kill discussions they don't like using the argument "yes, it states that the grass is green, but it doesn't specifically say it can't be blue".
 

optimiss

Junior Member
Add to that the fact that no one from any of the Switch events has felt air coming out of the vents even when docked playing Zelda, and it looks quite possible that they have built this on a 16nm node.

I felt air. It was very slight and I had to hold the device up to my lips to make sure (tons of nerve endings). I'm sure it looked like I was trying to smell it lol.
 
I felt air. It was very slight and I had to hold the device up to my lips to make sure (tons of nerve endings). I'm sure it looked like I was trying to smell it lol.

Oh really? Was that when docked or undocked? This is the first I've heard of that.

Also did you happen to feel if the device got warm at all?
 

optimiss

Junior Member
Oh really? Was that when docked or undocked? This is the first I've heard of that.

Also did you happen to feel if the device got warm at all?

I had just pulled it from the dock during the Zelda demo to play in handheld mode. The housing was slightly warm and the air was warm as well. I could feel the warmth above the vent with my hand, but not the movement of air, which is why I tried my lips and found there was, without question, a very soft stream of warm air.

Edit: haha this reads like gamer erotica
 
I had just pulled it from the dock during the Zelda demo to play in handheld mode. The housing was slightly warm and the air was warm as well. I could feel the warmth above the vent with my hand, but not the movement of air, which is why I tried my lips and found there was, without question, a very soft stream of warm air.

Edit: haha this reads like gamer erotica

Haha I can see why you say that but I totally get why you put it up to your mouth. But thank you for this, this seems to be the first impression I've read where anyone has felt air coming through the vents.

One of the points of speculation (which was always very unlikely) is that they managed to build the console without a fan due to a die shrink and just used the vents for passive convection cooling, and that seems to be debunked now. Like I said it was always unlikely though.

I'm not sure if this adds any evidence to or against the Foxconn clocks, but it's at least a good thing to know!

EDIT:
Advanced haptics work very similarly to audio. It's just a bunch of vibrations after all. The haptics in the iPhone actually run through a second DAC. So the overhead would basically be the same as a few more audio channels.

The physics calculations would be something else entirely and would need to be done in the traditional way. From there, they would generate waveforms and pass that info to the haptics system.

Right, I was specifically referring to the physics calculations that need to be fed to the HD rumble actuators. It's likely very simple physics software, though, which barely uses much of the CPU.
 
Actually, I wonder if the HD rumble has any sort of effect on CPU use. Based on what we know so far, I think we can pretty safely assume HD rumble is built with several physics libraries in mind in order to get sensations like marbles rolling in a box, so it could be that HD rumble enabled games have a bit of a larger CPU requirement than other games.

Though I kinda doubt that will be big enough of a CPU need. The physics programmed for HD rumble are likely very simple in that they don't really need to interact with any other complex systems like the physics in a game like BotW do.

Anyway, I still don't know why the CPU might have a max clock rate of 1.78GHz if its max clock rate for games is just 1GHz. Emulation likely wouldn't require 75% more CPU power would it?

Advanced haptics work very similarly to audio. It's just a bunch of vibrations after all. The haptics in the iPhone actually run through a second DAC. So the overhead would basically be the same as a few more audio channels.

The physics calculations would be something else entirely and would need to be done in the traditional way. From there, they would generate waveforms and pass that info to the haptics system.
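As a toy example of that hand-off (hypothetical code, not any real SDK): a discrete physics event, say a marble hitting the side of the box, gets turned into a short damped-sine burst that is then passed to the haptics system like any other waveform.

```python
import math

SAMPLE_RATE = 1000  # Hz; made-up rate for illustration

def impulse_to_waveform(strength, freq_hz=180.0, decay=30.0, duration_s=0.1):
    """Turn one physics impulse into a damped-sine haptic burst:
    a sine at freq_hz under an exponentially decaying envelope."""
    n = int(SAMPLE_RATE * duration_s)
    return [
        strength
        * math.exp(-decay * i / SAMPLE_RATE)  # decay envelope
        * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
        for i in range(n)
    ]

# A firmer collision just scales the same burst shape up.
burst = impulse_to_waveform(strength=0.8)
```

The frequency and decay constants are placeholders; the point is that the physics side only needs to emit an impulse strength, and the waveform generation is cheap enough to live on an audio-style DSP path.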
 

optimiss

Junior Member
Haha I can see why you say that but I totally get why you put it up to your mouth. But thank you for this, this seems to be the first impression I've read where anyone has felt air coming through the vents.

One of the points of speculation (which was always very unlikely) is that they managed to build the console without a fan due to a die shrink and just used the vents for passive convection cooling, and that seems to be debunked now. Like I said it was always unlikely though.

I'm not sure if this adds any evidence to or against the Foxconn clocks, but it's at least a good thing to know!

Glad I could contribute!
 

z0m3le

Banned
Haha I can see why you say that but I totally get why you put it up to your mouth. But thank you for this, this seems to be the first impression I've read where anyone has felt air coming through the vents.

One of the points of speculation (which was always very unlikely) is that they managed to build the console without a fan due to a die shrink and just used the vents for passive convection cooling, and that seems to be debunked now. Like I said it was always unlikely though.

I'm not sure if this adds any evidence to or against the Foxconn clocks, but it's at least a good thing to know!

The Shield TV after 2 hours would reach about 34°C, so not too hot. It's really not a fair comparison since it's twice as thick and not as small, but the Foxconn clocks would be on a smaller node with A72/A73 cores rather than A57, so the thermals should be pretty much identical, especially judging from impressions like this. The difference between these clock sets is also so small that even FAST RMX doesn't tell us more than 4-5 times Wii U, which is within the range of the clocks discussed here.

We simply won't know until the device is hacked or opened up. Until then we just have mysterious clocks, and we can only make unlikely guesses at what they mean beyond retail clock speeds. We can't say much with confidence because the one source we have for Switch hardware still has no idea what the final SoC is, and is vague about when they received their information (still just "fall"). At least we know that 3rd-party launch games targeted 1GHz CPU clocks.
 