
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

So, after all the speculation, does the evidence lean towards Switch being 16nm FinFET?

We really don't know. It could go either way in my opinion... The Foxconn leak is incredibly convincing but the fact that Eurogamer's clocks were described by Nintendo as "final"...

It's entirely possible that Eurogamer's clocks are correct and Foxconn's hardware specs are also correct, with Nintendo preparing a clock increase somewhere along the line. In that case it would be 16nm, yes. Eurogamer never claimed to know the final hardware configuration, just the clocks.

We are unlikely to find out until someone can do a teardown.
 

z0m3le

Banned
Remember that Emily (or LKD) leak about Mario getting better performance around October/November? There sure seem to be a lot of coincidences around this.

That was a Eurogamer rumor. I just had a conversation with Emily about this, and she disagrees with that rumor/speculation.
 

foltzie1

Member
I am so very late to this party, but I have to assume that SCD probably exists inside the hardware division of Nintendo and was tabled for not being effective (either as a function of cost or suitability for a good experience).

If it was the former and not the latter, then making a 4K dock could happen at some point, but that presents a different set of issues (is more RAM or CPU needed in addition to the GPU?).

Perhaps the next Switch targets 1080p in portable mode and 4K on the dock, but can run the 720p render target as well.

Will be interesting to see.
 
I am so very late to this party, but I have to assume that SCD probably exists inside the hardware division of Nintendo and was tabled for not being effective (either as a function of cost or suitability for a good experience).

If it was the former and not the latter, then making a 4K dock could happen at some point, but that presents a different set of issues (is more RAM or CPU needed in addition to the GPU?).

Perhaps the next Switch targets 1080p in portable mode and 4K on the dock, but can run the 720p render target as well.

Will be interesting to see.
I wonder how 10nm will perform. Do we have any info on when NVIDIA will share that?
 

jonno394

Member
Touchscreen input looks nice. It must have to be enabled by devs, hence why Mario Kart 8 didn't use it.

Bodes well for DS VC in handheld mode.
 

Thraktor

Member
There is a lot of weird dismissal of these clocks from the leak.

First I'd like to comment on Thraktor's last post: your estimates only apply to the Maxwell-to-Pascal power consumption. The SoC estimation is as good as we can get, and even if your chart is off, the difference is going to be counted in hundredths of watts, not whole watts.

The chart could definitely be off by quite a large margin, particularly at low clock speeds (there's a reason I didn't extend the chart any further left). Just to clarify the data points I used for the chart, in decreasing order of likely accuracy:


  • Nvidia's power measurement of the Tesla P4 HPC GPU (GP104) at 1060MHz: 36W (GPU only)
  • Third-party power measurements of the GTX 1080 (GP104) at both stock and overclocked speeds, with estimated RAM power consumption subtracted
  • Nvidia's 1.5W power consumption claim for the TX1 GPU (2 SMs) at ~500MHz, adjusted based on TSMC's claims of 16FF+ improvement over 20nm
The other point I need to make in interpreting the graph is that it's making a very simple assumption that all non-SM GPU logic will scale proportionally with the number of SMs (which isn't really the case, as I'll clarify below).

The only pure data point I have for Pascal power consumption is from Nvidia's P4 slide, as it's a precise GPU-only measurement without having to extrapolate anything. However, it's a far larger GPU than what's in Switch, and there's no guarantee that power consumption would scale linearly. Most importantly, though, we know that non-SM power consumption won't scale linearly. It's a 20 SM GPU, which is 10 times what we're expecting in Switch, but at 64 ROPs it only has 4x the quantity I'd expect in Switch's GPU (based on TX1 having 16). The front-end is also likely to see limited benefits from scaling, so even if we know GP104's power draw at 1.06GHz, it's hard to say, even at that frequency, that a much smaller Pascal GPU would have a similar per-SM power consumption.

The GTX 1080 power measurements should be reasonably accurate, but they are at far higher clocks than we're looking at (1.5GHz+) and are also based on the much larger GP104 GPU, so the same scaling questions apply when extrapolating to smaller GPUs.

The last one, which is the closest to the size and clocks of the Switch GPU, is also the most likely to be inaccurate. It's based on Nvidia's claim that, while matching the Apple A8X's performance, the TX1's GPU draws 1.5W. Then, it relies on TSMC's claims of the power efficiency improvements of 16FF+. This presents several possible sources of error:


  • We don't actually know the clock speed. TX1 at "full clocks" got about twice the performance of the A8X, so I had assumed, given a 1GHz max clock for TX1, that to match the A8X it would be running at about 500MHz. However, we now know that, even in the actively cooled Shield TV, the TX1 can throttle down to 768MHz, so it's possible that it was throttling in the main test as well, and the A8X-matching clock could likewise be lower (perhaps around 400MHz).
  • The 1.5W figure was given by Nvidia themselves, but they could be measuring in a way which is favourable to them, or they may have taken the best-performing TX1 die they could find for the test, with the real-world average die consuming more power.
  • TSMC's claims for the power consumption benefits of 16FF+ are likely to be as favourable as possible (it's an advertising claim after all). There are probably certain chips at certain clock speeds which see these improvements, but there's no particular guarantee that Switch's SoC is one of those, and the real-world savings could be lower.
The high end of the graph is probably reasonably accurate (at least for large Pascal GPUs). The low end of the graph, though, could easily be off by a factor of two or three. It's very difficult to estimate where a power curve like this will bottom out without precise measurements at those clocks, and as I only had vague extrapolations to work off of, the numbers given should be taken with a healthy dose of salt.
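To make those assumptions concrete, here's the arithmetic behind the two endpoints of the chart as a small Python sketch. The numbers are just the data points listed above, and the 0.5 factor for 16FF+ is an assumption based on TSMC's marketing claim of roughly half the power at the same clocks versus 20nm, so treat it as illustrative rather than authoritative:

# Per-SM power at the two ends of the chart, using the data points above.
# The 16FF+ factor (0.5) reflects TSMC's marketing claim, not a measurement.
p4_w, p4_sms, p4_mhz = 36.0, 20, 1060   # Tesla P4 (GP104), Nvidia's slide
tx1_w, tx1_sms, tx1_mhz = 1.5, 2, 500   # TX1 GPU on 20nm, ~A8X-matching clock

print(f"High end: {p4_w / p4_sms:.2f} W/SM at {p4_mhz}MHz")           # 1.80 W/SM
print(f"Low end:  {tx1_w * 0.5 / tx1_sms:.2f} W/SM at {tx1_mhz}MHz")  # 0.38 W/SM

# Scaling either figure to a 2 SM Switch GPU assumes non-SM logic (ROPs,
# front-end) shrinks in proportion to SM count, which it doesn't: GP104 has
# 10x the SMs expected in Switch but only 4x the ROPs (64 vs 16), so the
# low end of the curve could easily be off by a factor of two or three.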

Next, Wii U saw a very similar CPU performance upgrade for developers working on launch software.

Launch software developers thought the Wii U had 2 CPU cores at 1GHz, and up until a few months before launch this was the case. Later on they got access to 3 CPU cores at 1.24GHz. That is substantial, nearly a doubling of total CPU performance months before launch, and launch titles ran without that extra performance. On top of that, the Wii U's GPU performance change was even greater than the one in this Foxconn leak, so I don't see your point being a strong one.

Lastly, Eurogamer answered someone (I believe it was Hermii who got the response and posted it in this thread). The post has since been altered, which is weird, but it originally said that the clocks might have changed. After looking at this Foxconn leak, that doesn't sound too final to me. And with some developers still using the July devkits up to 3 or 4 weeks ago, and possibly even now, we don't know what changes were made in the final hardware; even Eurogamer made that same comment.

P.S. I believe the Eurogamer rumor 100%; I simply think that final hardware allowed Nintendo to make these changes, resulting in Foxconn's leak, which is as solid as any info we've ever gotten on Switch IMO.

I don't think it's impossible for Nintendo to have made some small last-minute clock speed changes, and something like increasing the GPU clocks to 921MHz in docked mode would seem somewhat plausible, if they realised on final testing that the cooling system could easily handle it. I don't see them increasing the CPU clock to 1.78GHz for games, though. It's simply far too big a jump for them to make on a last-minute basis. Had they known that there was a change in store from the specs given to Eurogamer's source (either a more efficient CPU or a move to 16nm) then they would have taken that into account and wouldn't have described the specs as final.

The leak of the titles came from 4chan, and yeah, the real leaks always seem to get dismissed. But the reason no sites are reporting on the Foxconn leak might be that NDAs around final devkits could be much more serious, and not every dev has them. I'd also speculate that Nintendo has only given this info to wave 2 titles, since launch titles should all be targeting the older specs regardless, and I think even Nintendo has come out and said that Zelda BotW isn't using the Switch to its fullest.

Well, there are probably a few things discouraging sites from reporting on this leak. Firstly, on the surface the clock speeds contradict those from reliable sources (even though they are most likely just thermal stress test clocks and bear no relation to in-game speeds). Secondly, the latter half of the leak sounds very out-there, and it's unlikely they can find any sources to corroborate it (as whatever device it is is likely only just making its way out to developers).

I suppose the fact that the only discussion of it seems to be here doesn't help, what with crackpots like me talking about a GTX 1060 powered super-dock and so forth :p

So, after all the speculation, does the evidence lean towards Switch being 16nm FinFET?

My money's still on 20nm.

We'll know in 1 month and a week. Someone will tear down the Switch; we'll measure the SoC, epoxy it, and sand it all the way down to the exposed die.

An amateur tear-down or die-photo is unlikely to tell us the manufacturing process, as 16nm and 20nm have pretty much the same transistor density. You'd really need a cross-section of the die (and professional imaging equipment) to be able to tell the difference, but hopefully Chipworks will come through for us again on that front.

I am so very late to this party, but I have to assume that SCD probably exists inside the hardware division of Nintendo and was tabled for not being effective (either as a function of cost or suitability for a good experience).

If it was the former and not the latter, then making a 4K dock could happen at some point, but that presents a different set of issues (is more RAM or CPU needed in addition to the GPU?).

Perhaps the next Switch targets 1080p in portable mode and 4K on the dock, but can run the 720p render target as well.

Will be interesting to see.

It's quite likely Nintendo have a lot of crazy prototypes locked away in their hardware labs, but if they're manufacturing 2000 units of a devkit with Foxconn, that suggests they're moving into full-scale software production for the device. It's really just puzzling exactly what the device is, and what Nintendo's plans are for it.

The one thing holding me back from believing this 100% (or even 80%) was Eurogamer explicitly describing those clocks as final, like you said.

You don't think it's possible, though, that Nintendo had been investigating the possibility of 16nm for a while and finally determined they could go that route around August to October? Maybe they had to await reports of yield issues? Then come October they sent out those 16nm devkits, which would line up with LKD's report of October devkits being more powerful. Again, Eurogamer describing those clocks as final sorta contradicts all of this, including LKD's October devkit leak.

I suppose if the numbers are very rough then it seems a lot less convincing than I previously thought. Power consumption matching up identically would have been an enormous indicator, but like you said, since we don't have much hard data to work from, that can't really be determined now.

Nintendo would have locked down the manufacturing process long before August. Ditto with choice of CPU core. Those decisions would have been made a long time ago, and by the time they told devs that the Eurogamer clocks were final they would have had actual silicon coming off the production lines to test (which is precisely why they would be confident enough to say that they're final clocks, because they're testing them on final hardware).

That would be very interesting news if it was coming as early as Holiday 2017 or something... It sure seems like the Switch is a "soft" launch, as some users have been calling it, with an incomplete OS, a trial online period, and relatively few games. They could be throwing us a huge curveball a year from now, though I really do doubt they want to muddy development even further with 1-2 more development targets.

Nintendo is nothing if not unpredictable though.

Well, that's just console launches for you. There are always "few games" and missing OS features in the first couple of months.

I don't know how the SCD would fit into their business plans, though, to be honest. Perhaps it's a last-minute thing, and they're trying to use it as a way to get western third-parties on board. I actually don't think an extra development target is that big of a deal (depending on the gap), assuming that Switch's development toolset/APIs/etc are already based around the idea of having multiple performance/resolution targets and making it simple for developers to accommodate them.
 

z0m3le

Banned
We don't really need to make assumptions anymore; there is a likelihood that the Foxconn clocks are true. With XB1S and PS4 Pro moving to 16nm, and those being much more expensive shrinks that demand more from the process than tiny 2SM Nvidia GPUs, I think it's well within reason that Nintendo went with 16nm, especially when you consider just how big the heat sink is for the Switch; a 12cm L-shaped heat sink hints at higher clocks IMO.

Also, as you pointed out in another thread about the battery, it's likely taking up nearly half the internal space of the device, leaving the Switch with less internal room than the Shield TV, so I'm still not sure that the space can reasonably be cooled at the docked clocks without a move to 16nm, especially because that battery is going to create heat inside the device.

I don't really care what any individual thinks at this point, including myself; it's absolutely a wait-and-see situation. We should both stop worrying about possibly giving false information with solid "facts" about the device's clocks, as even Eurogamer came out to say that the clocks might have changed. So as a source, while 100% legit at the time, they admitted that changes might have happened, i.e. take the final clocks with a grain of salt.

As for the CPU difference being substantial, I already pointed to the ~88% increase that developers saw with Wii U's CPU at launch, with their games stuck running on 2 cores. Your need to dismiss the CPU clocks as unlikely is misplaced IMO. When you have a demo running at substantially higher clocks like this for 8 days without dropping a frame, that is a stability test: not just a stress test to see if you are going to overheat the chip, but a performance test at those clocks, and it passed with flying colors considering 0 frame drops after 8 days at full tilt.
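For anyone who wants to check that Wii U figure, it's just core count times clock, ignoring IPC; a quick sketch in Python:

# Wii U launch-window CPU change: 2 cores @ 1.00GHz -> 3 cores @ 1.24GHz.
# Crude aggregate throughput = cores x clock; ignores IPC and scaling losses.
before = 2 * 1.00
after = 3 * 1.24
print(f'+{(after / before - 1) * 100:.0f}%')  # -> +86%, ballpark of the ~88% cited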

It's just over a month away; maybe we can figure out whether it is 20nm or 16nm once it's torn down. I've already preordered the device, and honestly the performance difference isn't some drastic change, so I don't care much either way; it's just interesting to me that such a change seems to fit perfectly with what we can gather about these different architectures and process nodes. Also, that 55% figure was on first-run 16nm; is there no advancement on the process node since then, or do you really think this is a best-case scenario?

EDIT: Last thing, Thraktor: in case you are right about an SCD, how do you feel about a quad-core A57 @ 1GHz partnering with 4.4 TFLOPs of GPU power? I assume not as good as you would feel about a quad-core A72 @ 1.7GHz. That is a big tell in this leak IMO; if the dock is what we think it is, the lacking CPU doesn't sound like Nintendo, especially not after we saw what they did with New 3DS.
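(For reference, 4.4 TFLOPs is right around GTX 1060 paper math. A rough check, assuming the stock ~1.7GHz boost clock and 2 FP32 FLOPs per CUDA core per cycle:)

# GTX 1060 FP32 throughput = CUDA cores x clock x 2 FLOPs per cycle.
cores, boost_ghz = 1280, 1.7
print(f'~{cores * boost_ghz * 2 / 1000:.2f} TFLOPs')  # -> ~4.35 TFLOPs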
 

Donnie

Member
As far as the "final clocks" from Eurogamer go, it always struck me that the documentation said they were the final clocks apps could use at launch. It made me wonder if those clocks could possibly change for games not being readied for launch.
 
As far as the "final clocks" from Eurogamer go, it always struck me that the documentation said they were the final clocks apps could use at launch. It made me wonder if those clocks could possibly change for games not being readied for launch.

Yeah, that could fall in line with a late-in-the-game hardware change, and LKD's leak about more powerful devkits in October. Maybe Nintendo always planned to go 16nm but, due to yield issues, couldn't get a sufficient number of chips for devkits until sometime between August and October.

As a history lesson from Nintendo's console past, the GameCube had a clock speed change (CPU and GPU) four months prior to launch.

http://uk.ign.com/articles/2001/05/16/pre-e3-gamecube-specifications-revisions

Very interesting! 4 months before March would be November, which is when this leak happened.
 

Lexxism

Member
When do we expect to hear details on what's inside the system? A week or two after launch? Will we find out what CPU, GPU and RAM are in it?
 

Rodin

Member
I'm wondering if it's possible that for some reason they were working with A57 at 1.02GHz and then they switched to A72 at 1.78GHz, but the A57 cores were already at 16nm.
I mean, this interview about the efficiency of ARM A57-A53 at 16nm is from October 2014

Pete Hutton said:
"This silicon proof point with ARM Cortex-A57 and Cortex-A53 processors demonstrates the additional benefits in performance and power efficiency that 16nm FinFET technology delivers to big.LITTLE implementations. The joint effort of ARM, TSMC, and TSMC's OIP design ecosystem partners will transform end-user experiences across the next generation of consumer devices and enterprise infrastructure."

which means that they could've used A57 at 16nm since late 2014/early 2015. I wonder if simply going from A57 to A72/A73 is enough to bump the CPU clock speed by that much though, and also whether using these smaller and more efficient cores brought enough benefits that they could even bump the GPU clock from 768MHz to 921MHz.

As far as the "final clocks" from Eurogamer go. It always struck me how the documentation said they were the final clocks apps could use at launch. Made me wonder if those clocks could possibly change for games not being readied for launch.
Yeah, the wording seems very specific. If they really changed the clocks 4-5 months prior to launch it makes sense that launch games were made with the clocks reported by EG.

It would also mean that Zelda was made with those specs.

As a history lesson from Nintendo's console past, the GameCube had a clock speed change (CPU and GPU) four months prior to launch.

http://uk.ign.com/articles/2001/05/16/pre-e3-gamecube-specifications-revisions
Nice find, thanks.
 

Polygonal_Sprite

Gold Member
Does this quote sound like a 1GHz CPU / 200 GFLOP GPU device?

"I think Nintendo Switch will put Nintendo at the forefront of the game industry once again. Their approach is quite different from anything they've done in the past – they've listened to EA, Activision, and other companies since the beginning of the Switch's development, so we've been involved throughout the whole process. They teamed up with us because they wanted to guarantee the console would be successful." - EA Executive vice president Patrick Soderlund.

It just doesn't make sense. Nor does the massive list of third-party developers that are on board creating games for Switch. Would companies like From Software, Square and Bethesda even go near such a low-powered device when they already have their hands full with a development pipeline that consists of PC, Xbox One, PS4 and PS4 Pro?

We also have comments from gaf users Matt and OsirisBlack saying PS4 third party games would have no issues running on Switch. The Japanese business analyst (apologies for forgetting his name) also said he's heard that many PS4 third party projects will be ported to Switch.

This Foxconn leak would give Switch what, an 80% CPU and 20% GPU performance boost? Sounds like it would get it much closer to the power needed for third parties to be interested.
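(A quick ratio check using the reported clocks, Eurogamer's 1020MHz CPU / 768MHz docked GPU versus the leak's 1785MHz / 921MHz, and assuming comparable architectures at both speeds, puts the CPU gain closer to +75%:)

# Clock ratios only; assumes comparable CPU/GPU architectures at both clocks.
eg_cpu, eg_gpu = 1020, 768     # MHz, Eurogamer's reported docked clocks
fox_cpu, fox_gpu = 1785, 921   # MHz, Foxconn leak clocks
print(f'CPU: +{(fox_cpu / eg_cpu - 1) * 100:.0f}%')  # -> +75%
print(f'GPU: +{(fox_gpu / eg_gpu - 1) * 100:.0f}%')  # -> +20%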
 

z0m3le

Banned
I am starting to think that Nintendo just didn't know what they could target with final hardware without extensive testing (which seems to be what they did in this leak).

Speculation: Nintendo simply used the devkits they had with the power consumption target they were shooting for, got those Eurogamer clocks, and gave them to developers for launch, much like they did during the Wii U and GameCube launches. Then they tested higher clocks with final hardware in the same power envelope, giving us the Foxconn clocks.

The Foxconn leak said 2000 of these devkit units were made; I have to imagine that they went out to developers, as there's not much else they could do with these things... I can't see the A57 1GHz quad core making sense for such a device IMO. The other thing is that the leaker mentions just 1 chip that seems to fit the GTX 1060 GPU, as Thraktor mentioned. This all really does line up with the original WSJ leak, which mentioned a console and at least 1 portable device you can take with you; a 4K dock plus the Switch matches up well to that IMO.
 

ggx2ac

Member
I am starting to think that Nintendo just didn't know what they could target with final hardware without extensive testing (which seems to be what they did in this leak).

Speculation: Nintendo simply used the devkits they had with the power consumption target they were shooting for, got those Eurogamer clocks, and gave them to developers for launch, much like they did during the Wii U and GameCube launches. Then they tested higher clocks with final hardware in the same power envelope, giving us the Foxconn clocks.

Anyone know how legit this guy is? https://youtu.be/LLsYMI55RlM?t=1m26s (saw this posted in the Foxconn leak's Reddit comments)

The Foxconn leak said 2000 of these devkit units were made; I have to imagine that they went out to developers, as there's not much else they could do with these things... I can't see the A57 1GHz quad core making sense for such a device IMO. The other thing is that the leaker mentions just 1 chip that seems to fit the GTX 1060 GPU, as Thraktor mentioned. This all really does line up with the original WSJ leak, which mentioned a console and at least 1 portable device you can take with you; a 4K dock plus the Switch matches up well to that IMO.

Are you certain?

http://www.wsj.com/articles/nintendo...orm-1444996588

The exact shape of the NX hardware isn’t yet clear. People familiar with the development plans said Nintendo would likely include both a console and at least one mobile unit that could either be used in conjunction with the console or taken on the road for separate use. They also said Nintendo would aim to put industry-leading chips in the NX devices, after criticism that the Wii U’s capabilities didn’t match those of competitors.

http://www.wsj.com/articles/nintendo...oss-1469604309

A person familiar with the matter said NX would be a handheld-console hybrid that would be compatible with its own smartphone games.
A Nintendo spokesman declined to comment on the details of NX.

At first, it wasn't clear what the Switch was in relation to devkits; then, in a later article, Takashi Mochizuki called it a hybrid.

I highly doubt the article was in reference to any SCD, because no other outlet has reported on it. If it was possible for one outlet to leak it back then, then it should have been leaked by another outlet later on, like with Eurogamer.
 

z0m3le

Banned
Are you certain?

http://www.wsj.com/articles/nintendo...orm-1444996588



http://www.wsj.com/articles/nintendo...oss-1469604309



At first, it wasn't clear what the Switch was in relation to devkits; then, in a later article, Takashi Mochizuki called it a hybrid.

I highly doubt the article was in reference to any SCD, because no other outlet has reported on it. If it was possible for one outlet to leak it back then, then it should have been leaked by another outlet later on, like with Eurogamer.

The original leak sounds like a concept leak to me, but the SCD in this leak is pretty strong evidence that the thing exists. Maybe people will start talking about the more powerful devkit soon, but right now it seems like everyone is just offering hot takes and not leaking anything else atm.
 

ggx2ac

Member
The original leak sounds like a concept leak to me, but the SCD in this leak is pretty strong evidence that the thing exists. Maybe people will start talking about the more powerful devkit soon, but right now it seems like everyone is just offering hot takes and not leaking anything else atm.

The original leak is very unclear. If the console unit is the SCD then it would suggest that the SCD can work standalone and that you wouldn't need a Switch.

I don't think linking the SCD to the WSJ article makes sense.
 

Meesh

Member
This has been the most amazing Switch thread ever! Totally falling down the rabbit hole with this one; fascinating, I think. I'm no techie but it interests me greatly. Thanks! :)
 

z0m3le

Banned
The original leak is very unclear. If the console unit is the SCD then it would suggest that the SCD can work standalone and that you wouldn't need a Switch.

I don't think linking the SCD to the WSJ article makes sense.

I'm OK with a difference of opinion. Iwata originally said that Switch wouldn't be 1 device but might instead be 3 or 4 devices, Reggie called NX a platform, and now we are seeing a devkit in a pretty substantial leak. I think the original WSJ article fits into these topics, but frankly it is too early to clearly know what they are doing there. All we know is that there are 2000 of these things, that they are much more powerful than the Switch we will buy in March, and that they come with a screen attached; in my opinion this is so they can target both Switch's original specs and higher specs with this one device.
 
The original leak sounds like a concept leak to me, but the SCD in this leak is pretty strong evidence that the thing exists. Maybe people will start talking about the more powerful devkit soon, but right now it seems like everyone is just offering hot takes and not leaking anything else atm.

Those links got taken down... hmmm
 

z0m3le

Banned
Those links got taken down... hmmm

Yeah, I heard that. The speculation is because of the devkit we are talking about; it just wasn't supposed to be known yet. I've heard whispers of the device in developers' hands (the source is not directly from an insider, just someone on NeoGAF I trust; they mentioned the insider, but if that person isn't saying anything publicly yet, then maybe it is all in an early testing phase). Still, making 2k of these leads me to believe that full development of software has already begun. I think if we don't hear substantial rumors by E3, it is probably scrapped or a long way off though.
 

Mokujin

Member
So this is still going the way of insanity, I see.

So, about the whole GTX 1060 SCD:

* Even if the heavier unit in the leak was real (and I have no reason to doubt there was one), it was referenced as having a much bigger 200mm² SoC, not as having both the Switch SoC and another 200mm² chip, so all the "GTX 1060 is also 200mm², it all checks out" speculation is basically wrong from the start.

The most logical explanation is that it was a test unit with a much bigger SoC with more cores, able to run debugging software alongside the Switch hardware. Everything that was said matches the image leaked just before the unveiling.

[image: img_1029.jpg, the leaked debug unit photo]

As stated in the leak:

* DC input instead of a battery.
* No dock; HDMI in the main unit.
 

optimiss

Junior Member
So this is still going the way of insanity, I see.

So, about the whole GTX 1060 SCD:

* Even if the heavier unit in the leak was real (and I have no reason to doubt there was one), it was referenced as having a much bigger 200mm² SoC, not as having both the Switch SoC and another 200mm² chip, so all the "GTX 1060 is also 200mm², it all checks out" speculation is basically wrong from the start.

The most logical explanation is that it was a test unit with a much bigger SoC with more cores, able to run debugging software alongside the Switch hardware. Everything that was said matches the image leaked just before the unveiling.



As stated in the leak:

* DC input instead of a battery.
* No dock; HDMI in the main unit.

I thought the leak said there is no screen on it either. Maybe I am remembering incorrectly.
 
So this is still going the way of insanity, I see.

So, about the whole GTX 1060 SCD:

* Even if the heavier unit in the leak was real (and I have no reason to doubt there was one), it was referenced as having a much bigger 200mm² SoC, not as having both the Switch SoC and another 200mm² chip, so all the "GTX 1060 is also 200mm², it all checks out" speculation is basically wrong from the start.

The most logical explanation is that it was a test unit with a much bigger SoC with more cores, able to run debugging software alongside the Switch hardware. Everything that was said matches the image leaked just before the unveiling.

As stated in the leak:

* DC input instead of a battery.
* No dock; HDMI in the main unit.

Yeah, I don't know if that leak was referring to this devkit/debug unit, but I certainly think it's far too early to make any assumptions about an SCD, especially when all we "know" is the potential die size. Also keep in mind that the SCD patent focuses heavily on wireless supplemental processing, so even if this leak is describing an SCD, there is likely a lot more to it than we can know right now.

On the other hand, it definitely seems worthwhile to think about why the Foxconn leak detailed the clocks it did. Do console makers typically run stress tests at speeds far above where they intend the processors to perform? Have any other consoles ever had a higher CPU clock solely for the purposes of emulation? We have precedent with the Wii U and GameCube of a late pre-launch clock speed boost, but do we know if those lower clock speeds were ever described as "final for launch"?
 

Mokujin

Member
Yeah, I don't know if that leak was referring to this devkit/debug unit, but I certainly think it's far too early to make any assumptions about an SCD, especially when all we "know" is the potential die size. Also keep in mind that the SCD patent focuses heavily on wireless supplemental processing, so even if this leak is describing an SCD, there is likely a lot more to it than we can know right now.

On the other hand, it definitely seems worthwhile to think about why the Foxconn leak detailed the clocks it did. Do console makers typically run stress tests at speeds far above where they intend the processors to perform? Have any other consoles ever had a higher CPU clock solely for the purposes of emulation? We have precedent with the Wii U and GameCube of a late pre-launch clock speed boost, but do we know if those lower clock speeds were ever described as "final for launch"?

I don't think Nintendo is doing anything right now with their SCD concept; it makes no sense when they still don't know if Switch is going to take off.

While my stance is that the CPU is A57-based, I don't completely rule out changes in the CPU setup or even a slight clock bump, but I just can't see cores running at 1.78GHz.

What I find annoying is people dismissing A57 cores and saying A72 is much better. A72 is better, but only about 10% better at the same clocks, and A73 performance is about the same as A72 (though it is a more efficient architecture). Overall, A57 cores are still in a good spot compared to the newer ARM designs.
 

z0m3le

Banned
I don't think Nintendo is doing anything right now with their SCD concept; it makes no sense when they still don't know if Switch is going to take off.

While my stance is that the CPU is A57-based, I don't completely rule out changes in the CPU setup or even a slight clock bump, but I just can't see cores running at 1.78GHz.

What I find annoying is people dismissing A57 cores and saying A72 is much better. A72 is better, but only about 10% better at the same clocks, and A73 performance is about the same as A72 (though it is a more efficient architecture). Overall, A57 cores are still in a good spot compared to the newer ARM designs.

Because A57 cores at that speed wouldn't make sense; they would simply draw too much power for a handheld. If the clocks are true, then it has to be A72 (or, less likely, A73) on 16nm, as A57 at that clock would draw something like 3 to 4 watts by itself.

Also, you don't make 2k of these units and not send them out to developers; that doesn't make sense. And because they have screens, they pretty much could only be devkits.
 
Can this handheld power the latest Frostbite engine? Games like Battlefield 1 or ME: Andromeda.

Frostbite is very scalable; I think many phones can handle it, so I'm sure this can. It's a matter of whether or not EA wants to spend the time/money to actually port the engine.
 

z0m3le

Banned
Frostbite is very scalable; I think many phones can handle it, so I'm sure this can. It's a matter of whether or not EA wants to spend the time/money to actually port the engine.

I assume FIFA will be running Frostbite. The assets could possibly be from the 360 version, but the engine itself is likely the reason they are bringing the game over at all; it makes it easier to "jump on board" the Switch if it takes off, since they can more quickly move their titles over.
 

Mr Swine

Banned
I assume FIFA will be running Frostbite. The assets could possibly be from the 360 version, but the engine itself is likely the reason they are bringing the game over at all; it makes it easier to "jump on board" the Switch if it takes off, since they can more quickly move their titles over.

The easiest port would use the Frostbite engine, but we know that EA doesn't want to use that engine for Switch, so it will be the FIFA 360 engine instead.
 

Mokujin

Member
Because A57 cores at that speed wouldn't make sense; they would simply draw too much power for a handheld. If the clocks are true, then it has to be A72 (or, less likely, A73) on 16nm, as A57 at that clock would draw something like 3 to 4 watts by itself.

Also, you don't make 2k of these units and not send them out to developers; that doesn't make sense. And because they have screens, they pretty much could only be devkits.

The problem is that your reasoning starts by believing the 1.78GHz clock and goes from there, instead of considering the possibility that something smells fishy given how different it is from EG's clocks.

Why can't those units simply be Switch test units (which is much more likely), rather than a more powerful evolution being tested? I can picture all those units being in developers' hands as Switch test/debugging units already.
 

z0m3le

Banned
The problem is that your reasoning starts by believing the 1.78GHz clock and goes from there, instead of considering the possibility that something smells fishy given how different it is from EG's clocks.

Why can't those units simply be Switch test units (which is much more likely), rather than a more powerful evolution being tested? I can picture all those units being in developers' hands as Switch test/debugging units already.

Yes, if the clocks are real, then it is A72; that is my reasoning. I'm not saying it is 100% A72 at 1.78GHz, but that is what the leak leads us to believe. Also, they aren't test units, because he describes this same unit being mass-produced in the leak, 20k of them a day; they are whatever is shipping.
 

Hermii

Member
Frostbite is very scaleable, I think many phones can handle it. So I'm sure this can. It's a matter of whether or not EA wants to spend the time/money to actually port the engine.
The fact that it can run the engine does not mean it can run every game on the engine. Battlefield 4 runs on Frostbite, but it was designed for last-gen hardware; that doesn't mean last-gen hardware can run BF1.
 
Why can't those units be simply Switch test units (which is much more likely) and instead think they are a more powerful evolution being tested? I can picture all those units being in developers hands as Switch test/debugging units already.

Do console makers do this? Test a retail unit at far higher CPU/GPU clocks than developers are allowed to use?

The fact that it can run the engine does not mean it can run every game on the engine. Battlefield 4 runs on Frostbite, but it was designed for last-gen hardware; that doesn't mean last-gen hardware can run BF1.

I never claimed it can run all of those games; I was just answering the question about the engine itself.
 

Polygonal_Sprite

Gold Member
Yeah, I heard that. The speculation is because of the devkit we are talking about; it just wasn't supposed to be known yet. I've heard whispers of the device in developers' hands (the source is not directly from an insider, just someone on NeoGAF I trust; they mentioned the insider, but if that person isn't saying anything publicly yet, then maybe it is all in an early testing phase). Still, making 2k of these leads me to believe that full development of software has already begun. I think if we don't hear substantial rumors by E3, it is probably scrapped or a long way off though.

What did they say about it: potential power, upgrade dock, standalone console? Feel free to PM me if you don't want to make it public.
 

z0m3le

Banned
What did they say about it: potential power, upgrade dock, standalone console? Feel free to PM me if you don't want to make it public.
"A dock that offers a power boost, thanks to its own hardware inside" the user speculated it was the SCD, and while I trust the user, it is second hand and we should take it with a grain of salt, I'm only even talking about it because of this leak's seemingly confirmation of such a device.
 

Polygonal_Sprite

Gold Member
"A dock that offers a power boost, thanks to its own hardware inside" the user speculated it was the SCD, and while I trust the user, it is second hand and we should take it with a grain of salt, I'm only even talking about it because of this leak's seemingly confirmation of such a device.

OK, thanks.

If there is a much more powerful dock coming later, I wish Nintendo would have put all their cards on the table from the start, as a lot of day-one Switch buyers will feel burned if they reveal this dock add-on at E3 or in late 2017.

I'm more excited about the potential clock boosts to the standard Switch console, especially the CPU boost. How much more powerful is the Switch CPU at 1GHz than the Wii U CPU?
 
OK, thanks.

If there is a much more powerful dock coming later, I wish Nintendo would have put all their cards on the table from the start, as a lot of day-one Switch buyers will feel burned if they reveal this dock add-on at E3 or in late 2017.

I'm more excited about the potential clock boosts to the standard Switch console, especially the CPU boost. How much more powerful is the Switch CPU at 1GHz than the Wii U CPU?

Why would they feel burned? You'd still need to buy the base Switch device if the dock is simply an add-on.
 

Polygonal_Sprite

Gold Member
Why would they feel burned? You'd still need to buy the base Switch device if the dock is simply an add-on.

Because people will feel swindled into buying something one month, only to be told a few months later that they need to spend another $100-$200 to get the "true experience". I have no issues with it personally and would buy a more powerful dock the day it released, but we all saw the reactions to PS4 Pro and Scorpio, and those devices came three and four years after the launch of the base hardware.

As much as I would like it, I really can't see a more powerful Switch dock add-on being real. Current Nintendo don't strike me as the sort of company that chases graphical fidelity to the point of releasing an expensive add-on to achieve it. They would not want to split their customer base (an add-on would confuse a lot of people, especially casual customers), and you would get third parties wanting to make dock-exclusive games, since they would see the base hardware as too weak but the dock add-on as powerful enough. That would lead to people buying games that don't work on the base hardware they bought.

It all just leads to overcomplication IMO. If Nintendo wanted this sort of thing, they would have released a standalone console, a dedicated handheld, and a hybrid, and presented them all at the same time, so that it was not only clear what they were going for but also showed they weren't being anti-consumer by hiding expensive add-ons until a few months after the launch of the base hardware.

I would love to be wrong on the dock and I'm hopeful the higher clocks of the base Switch are real.
 

Instro

Member
OK, thanks.

If there is a much more powerful dock coming later, I wish Nintendo would have put all their cards on the table from the start, as a lot of day-one Switch buyers will feel burned if they reveal this dock add-on at E3 or in late 2017.

I'm more excited about the potential clock boosts to the standard Switch console, especially the CPU boost. How much more powerful is the Switch CPU at 1GHz than the Wii U CPU?

I'd be excited about both, but the CPU improvement in particular strikes me as a big deal. Hopefully we'll know more soon. The only thing that would make me feel burned is knowing that I wasted money on the system with the normal dock rather than being able to buy it with the 4K dock from the start.

I suppose it's possible that Nintendo is not saying anything about specs until they have something to show on the supposed SCD front. The whole thing strikes me as odd, though. Why even bother with the dock initially if you have a supercharged version coming soon? Engines are scalable, of course, but theoretically you now have too many scenarios to develop for:

Portable
Normal docked
"Pro" dock @1080p
"Pro" dock above 1080p

I suppose the first two scenarios could be one and the same if the device runs at full speed regardless of whether it's docked; I'm not sure that has been confirmed one way or another though... otherwise it just seems pointless to have the regular dock at all.
 