
The Curious Case of the Switch Foxconn Leak (Now a hardware fanfiction thread)

Good point with the logo. A 1080p Switch does make the most sense for an iteration, and we should even know the GPU performance of such a device: either it keeps the 472 GFLOPS the original Switch has when docked, or it goes to 1062-1180 GFLOPS for a new 1080p target, meaning the current Switch would run at full clock and target 720p with future titles once the "new" Switch is released.

That would seemingly complicate matters quite a bit if an SCD dock is being prepared for fall 2018 at the latest, as this leak suggests. You'd go from three performance targets (base handheld, base dock, super dock) to five performance targets (base handheld, XL handheld, base dock, XL dock, super dock)...

So I'd guess in this case it's either one or the other- an SCD dock or a 1080p handheld- in a few years. Not both.
 

z0m3le

Banned
That would seemingly complicate matters quite a bit if an SCD dock is being prepared for fall 2018 at the latest, as this leak suggests. You'd go from three performance targets (base handheld, base dock, super dock) to five performance targets (base handheld, XL handheld, base dock, XL dock, super dock)...

So I'd guess in this case it's either one or the other- an SCD dock or a 1080p handheld- in a few years. Not both.

You'd go from 2 to 3, and they should all be as easy to handle as loading different assets and rendering at a higher resolution, possibly with changed settings. That's exactly how a PC does it, and it's driven by the developer-friendly Nvidia.

You just have the Switch display 720p on the TV and on the go, and increase the GPU clock while on the go. (Which is easier to do with these new clocks, by the way, since a 16nm GPU would draw 3 W at 921 MHz rather than the 4.3 W a 20nm part needs at 768 MHz; the SoC at full clocks would be 5.05 W and 6.13 W respectively. Both would be bad, though: roughly 1h20m vs 1h45m minimum play times.)
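As a back-of-envelope check on those play-time figures, here's the simple battery-divided-by-draw arithmetic. The ~16 Wh battery capacity and the ~6 W rest-of-system draw (screen, wireless, conversion losses) are my assumptions for illustration, not figures from the leak, so this only lands in the same ballpark as the quoted times:

```python
# Rough minimum-play-time estimate: battery energy divided by total draw.
# BATTERY_WH and REST_OF_SYSTEM_W are assumed values, not from the leak.
BATTERY_WH = 16.0        # assumed battery capacity, Wh
REST_OF_SYSTEM_W = 6.0   # assumed screen/wireless/losses on top of the SoC

def min_play_hours(soc_watts):
    return BATTERY_WH / (soc_watts + REST_OF_SYSTEM_W)

for label, soc_w in [("16nm SoC @ full clocks", 5.05),
                     ("20nm SoC @ full clocks", 6.13)]:
    print(f"{label}: {soc_w} W SoC -> {min_play_hours(soc_w):.2f} h minimum")
```

With these guesses the heavier-draw case works out to about 1.3 hours, close to the ~1h20m figure above; the exact numbers depend entirely on the assumed non-SoC draw.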

The "new" Switch would just target 1080p with 2.25-2.5x the performance of the current Switch (472 x 2.25 = 1062 GFLOPS).

And finally you'd just need the 4K dock to do 4x that, or 4248 GFLOPS (which is a hair under the GTX 1060 people are assuming is in the devkits).
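Spelling out that GFLOPS arithmetic (the 2-SM, 128-core layout is the stock TX1 Maxwell configuration; the 2.25x and 4x multipliers are the post's own scaling assumptions):

```python
# FP32 GFLOPS for a Maxwell GPU: SMs x cores/SM x 2 flops (FMA) x clock.
SMS, CORES_PER_SM, FLOPS_PER_CORE_PER_CLOCK = 2, 128, 2  # stock TX1 layout

def gflops(clock_mhz):
    return SMS * CORES_PER_SM * FLOPS_PER_CORE_PER_CLOCK * clock_mhz / 1000

docked = gflops(921)        # ~471.6, the "472 GFLOPS" docked figure
new_switch = docked * 2.25  # ~1061 (the post's 1062 rounds 472 first)
super_dock = new_switch * 4 # ~4244, the 4K-dock figure
print(round(docked, 1), round(new_switch), round(super_dock))  # -> 471.6 1061 4244
```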
 

Fafalada

Fafracer forever
Durante said:
But what we are discussing here is why the resolution doesn't scale more between the docked and undocked modes. "Primitive throughput" seems a very unlikely answer to that, given that (a) as you say it doesn't scale with resolution, and (b) it actually does increase proportionally to the clock increase when docked.
Yea you're right - I was thinking of something else when I wrote that. If primitives were bottlenecked at the higher clock, it'd inevitably be a bandwidth limit again.
 

usmanusb

Member
The "new" Switch would just target 1080p with 2.25-2.5x the performance of the current Switch (472 x 2.25 = 1062 GFLOPS).

And finally you'd just need the 4K dock to do 4x that, or 4248 GFLOPS (which is a hair under the GTX 1060 people are assuming is in the devkits).

Why would Nintendo go for a GTX 1060 in a product releasing a few years down the road, by which time there will be a few more GTX generations?
 

Thraktor

Member
My point was that even if the inaccuracy of the person measuring (and removing bits Nintendo doesn't want shown) were accounted for, the extra RAM might take up extra space which we're not really sure is accounted for, especially since we can't speculate on its size without knowing the exact customisations Nintendo's going with.

Oh, I agree, and I don't think there's any evidence for extra memory pools/increased cache/etc., but I also don't think there's really any evidence against it. It's unlikely to be something we could really determine without a die photo (and preferably another die photo of the TX1 to compare against).

Wii U's 35MB of embedded RAM is also coming into play here, or am I missing something? I mean, if Wii U is using all available embedded RAM, doesn't that mean it can push more alpha textures than its GPU has any right to? Especially the 3MB of very fast 1T-SRAM, carried over from the legacy GameCube design, on die?

It's also reckless to use this metric when we've seen Wii U with the same effects causing dips (alpha textures, fire, etc.), and that is the comparison we are making.

I wasn't really comparing to Wii U (which would be difficult, as the different rasterisation paradigms and memory system setups would make differences very situational). I was responding (indirectly) to a post asking, to paraphrase, "If a game runs at 720p in handheld then why can't it run at 1080p docked?".

Depth of field effects are particularly bandwidth-intensive, but most importantly for a tile-based rendering GPU like Switch's, they can't be tiled (not even with the best possible implementation of Vulkan's render passes). Hence, frame rate drops which occur during DoF effects are a pretty good indication that a game is bandwidth limited.
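A toy calculation of why a wide DoF gather defeats tiling (the tile size and blur radius here are illustrative assumptions, not Switch's actual values): most of the samples a pixel needs live in neighbouring tiles, so the pass has to read a fully resolved buffer out of main memory rather than staying on-chip.

```python
# Toy model: fraction of a DoF gather kernel that falls outside the
# pixel's own tile. TILE and RADIUS are illustrative assumptions.
TILE = 16      # on-chip tile is TILE x TILE pixels (assumed)
RADIUS = 24    # circle-of-confusion radius in pixels (assumed)

window = (2 * RADIUS + 1) ** 2      # samples gathered per pixel
inside = min(TILE * TILE, window)   # at best, the whole tile is on-chip
outside_fraction = 1 - inside / window
print(f"{outside_fraction:.0%} of samples lie outside the tile")
# prints "89% of samples lie outside the tile"
```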

I expect Nintendo to iron out the frame rate by the time the game releases, as their internal teams have a very good track record with frame rates (and the demo is likely based on a version of the game which is several months old by this point), but it's still a useful indicator at this point of where their technical challenges lie. I'd also expect them to improve some other graphical features in docked mode (e.g. draw distance), because if you're bandwidth-bound and have hardware idling you may as well use that up in some way.

I thought the same during the NX rumors, but the problem I have now is that the "Switch" name and logo start to fall apart when you don't revolve around the single core device like an SCD would.

Yeah, I had thought the same thing a while ago, and was actually considering making a thread on it, but to be honest I've come to the conclusion it's not that big of an issue. For now, Switch is a word which we associate with certain features of the hardware, but in a year or so it will just be a brand name we associate with Nintendo's device (or devices). When you hear Playstation you don't really think about the meaning of the words play or station, you just think of one of Sony's consoles. Ditto with pretty much any other brand name which is based on real words, it's only very early on in the brand's existence that you associate it with those words, and after that it's just a brand.

I also don't think this SCD would have to be that expensive sold standalone or optionally bundled at a discount, but I think it's important to note that the current dock despite doing essentially nothing has an inflated $90 price, possibly preparing consumers for whatever comes next.

Back to the name and logo, a dedicated home console would immediately lose "switching" to handheld, and is only left with "switching" the joy-cons between the grip and individual motion controllers, and that's if it didn't just come bundled with a Pro controller instead. Add in smaller handhelds and the joy-cons can't be attached, so either you lug them around separately or build some of the functions into the device but lose some game compatibility too which I doubt they want. The 3DS XL already sells better than the smaller unit anyways, right?

Yes, the 3DS XL sells better than the 3DS, but that's why they're starting off with a large handheld that doubles as a home console. That doesn't mean there's zero interest in anything smaller, though (and the hypothetical 5" Switch Pocket would be closer to the XL than the regular 3DS anyway). More importantly, though, a portable-only version of Switch with fixed controls could be a lot cheaper:

- No active cooling
- No dock (yes, it's overpriced at $90, but there's still some cost to it)
- No joy-con batteries
- No joy-con wireless
- No joy-con rails and physical interface
- Likely fewer linear actuators required for HD rumble
- Smaller screen
- Smaller battery (potentially)
- Less powerful charger
- No IR camera (maybe)

Having a cheaper option would be very valuable to Nintendo, and there are plenty of people who would prefer something a bit more portable than the current Switch anyway.

As for game compatibility, as far as we've seen so far the only game which wouldn't be compatible with a handheld with fixed controls is 1, 2, Switch, and it should be fairly obvious to customers that it requires joy-cons (and with Bluetooth it would in theory be possible to connect joy-cons to a "Switch Pocket", anyway). Even ARMS, a game that's very obviously built around motion controls, supports traditional controls as well, so it seems like Nintendo is at the very least keeping their options open for future form-factors.

A 1080p upgrade down the line with full backwards compatibility is what I would most expect, just like the DSi and NN3DS, with or without the SCD being true. Until this happens I would not expect any of the games being exclusive to better hardware ("some games will only play on the second device"), just like the PS4 Pro will not have exclusives.

I would have expected them to avoid the "some games only work on one of the Switch devices" too, by making a home console that plays the same games in higher resolutions. They could do so with a relatively affordable ~1.6TFlop machine, going by our current understanding of Switch's performance, and in theory get games running at up to 4K.

The problem is two-fold. Firstly, if they were actually to release a dock/SCD/standalone console using GP106-class hardware, then it would be far too powerful for this approach, even clocked down and with SMs disabled. A game running at native 4K on this kind of device would probably need to run at about 480p on Switch in portable mode, and games that run at sub-4K on it would be pushing 3DS-level resolutions in portable mode. Developers could certainly do more than just alter resolution, with improved effects, lighting, etc. to use up the extra horsepower, but with such a large performance gap this would be quite a burden on some developers.
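Rough numbers behind the "about 480p" claim, using the thread's own estimates (472 GFLOPS docked, the 2.5x docked-to-portable gap, and the ~4248 GFLOPS GP106-class figure from earlier in the thread). Treating performance as scaling linearly with pixel count is of course a simplification:

```python
# If a game exactly fills a ~4.25 TFLOPS box at native 4K, what resolution
# does the same per-pixel cost allow on the portable Switch? All figures
# are the thread's estimates; linear pixels-per-flop scaling is assumed.
GP106_CLASS_GFLOPS = 4248
DOCKED_GFLOPS = 472
PORTABLE_GFLOPS = DOCKED_GFLOPS / 2.5          # ~189, using the 2.5x gap

pixels_4k = 3840 * 2160                        # 8,294,400 pixels
ratio = GP106_CLASS_GFLOPS / PORTABLE_GFLOPS   # ~22.5x performance gap
portable_pixels = pixels_4k / ratio            # ~369k pixels
print(f"{ratio:.1f}x gap -> ~{portable_pixels / 1e3:.0f}k pixels "
      f"(854x480 is {854 * 480 / 1e3:.0f}k)")
```

So a 4K-filling title would have roughly a 480p-class pixel budget in portable mode under these assumptions, which is the burden described above.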

The second issue is western third parties. If Nintendo wants to use a device like this to bring western AAA games onto their hardware, then forcing them to also bring their games to the portable would only hamper this effort, as with a couple of exceptions western publishers/developers don't seem to have any interest in supporting the current hardware. And if it is to be based on GP106, then you'd be going from saying "hey developers, make games for this console, it'll easily be able to handle any of your games with little effort" to saying "hey developers, make games for this console which can handle easy ports, but you also have to put a load of effort into making them work on much less powerful hardware too!".

I meant more "likely". Anyway, I'll repost this quote from Manfred Linzner about Wii U memory bandwidth. I still remember people saying that it was bandwidth starved "cuz 12.8GB/s", and none of them pointed at the CPU's rather large cache or those 32MB of eDRAM being part of a more complex but balanced memory subsystem, even though literally 0 devs ever complained about bandwidth despite shitting on the CPU at every turn. Now you're saying there are other possibilities, like having more external bandwidth on Switch, which certainly makes more sense to me than "25GB/s, period". Still, I don't see any major challenge in using a 128-bit bus in a device this size, and it's certainly not a cheap one either.

Hopefully marcan will test this d1 if we don't get other leaks.

We don't really need marcan or any kind of hacking to figure out the memory bandwidth, it'll just be a matter of checking the codes on the RAM modules once there's a teardown.

You're right in that Wii U wasn't bandwidth starved by the 12.8GB/s of DDR3 memory bandwidth by virtue of having an eDRAM pool with (I believe) about 70GB/s of bandwidth where most buffer accesses would go, but in a certain sense the same is true of Switch. There may be 25/50GB/s of main memory bandwidth, but like Wii U, most buffer accesses won't hit that, but will rather stay on the GPU's L2 (so long as they're tiled). That L2 bandwidth is likely to be pretty high, potentially higher than Wii U's eDRAM. The GM107's L2 cache reportedly has a bandwidth of 512 bytes per clock, and even if Switch's is only one quarter that it would put it at almost 100GB/s.

On a related note, Switch should also have a quite substantial pixel fill-rate advantage over Wii U. If they keep the 16 ROP configuration from TX1 then we're looking at 12.3 GP/s, compared to a reported 4.4 GP/s for Wii U.
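The two back-of-envelope figures above, spelled out. The 512 B/clock GM107 L2 number and the "one quarter" scaling are the post's assumptions; 768 MHz is the Eurogamer docked GPU clock that both results implicitly use:

```python
# L2 bandwidth and pixel fill rate at the Eurogamer docked clock.
# The quarter-of-GM107 L2 width is the post's assumption, not a spec.
GPU_CLOCK_HZ = 768e6

l2_bytes_per_clock = 512 / 4                  # quarter of GM107's reported L2
l2_gbps = l2_bytes_per_clock * GPU_CLOCK_HZ / 1e9
print(f"L2 bandwidth: ~{l2_gbps:.1f} GB/s")   # ~98.3 GB/s

rops = 16
fill_gpps = rops * GPU_CLOCK_HZ / 1e9         # one pixel per ROP per clock
print(f"Pixel fill rate: ~{fill_gpps:.1f} GP/s")  # ~12.3 GP/s
```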
 
I would say that we really know far too little to make guesses about the "SCD" devkit from the leak. The fact that no leakers have mentioned anything like it means that it's fairly far off anyway.

I'm curious though if any insiders have commented anywhere (like on Twitter) about the veracity of the leak in the context of the increased GPU and CPU speeds (and the upgrade to 16nm/A72).
 

Rodin

Member
You're right in that Wii U wasn't bandwidth starved by the 12.8GB/s of DDR3 memory bandwidth by virtue of having an eDRAM pool with (I believe) about 70GB/s of bandwidth where most buffer accesses would go, but in a certain sense the same is true of Switch. There may be 25/50GB/s of main memory bandwidth, but like Wii U, most buffer accesses won't hit that, but will rather stay on the GPU's L2 (so long as they're tiled). That L2 bandwidth is likely to be pretty high, potentially higher than Wii U's eDRAM. The GM107's L2 cache reportedly has a bandwidth of 512 bytes per clock, and even if Switch's is only one quarter that it would put it at almost 100GB/s.

On a related note, Switch should also have a quite substantial pixel fill-rate advantage over Wii U. If they keep the 16 ROP configuration from TX1 then we're looking at 12.3 GP/s, compared to a reported 4.4 GP/s for Wii U.

So i wasn't entirely wrong when i suggested that the 900p limitation was possibly due to Nintendo not being able to fully take advantage of TBR with this port. Aonuma said that the development of the Switch version started last spring, do you think it would've been possible to reach 1080p with other bells and whistles if the game was made from the beginning (or at least since late 2013-2014) with the Switch in mind? I mean, GPU power hardly seems to be the issue here, and CPU even less so, especially if the foxconn leaked clocks are correct.

I would say that we really know far too little to make guesses about the "SCD" devkit from the leak. The fact that no leakers have mentioned anything like it means that it's fairly far off anyway.

I'm curious though if any insiders have commented anywhere (like on Twitter) about the veracity of the leak in the context of the increased GPU and CPU speeds (and the upgrade to 16nm/A72).
If (and that's a giant if) that GTX 1060-like device is even a thing, i think that it would be a premium "home console only", not an SCD. So they can have on the market a standalone Switch for ~$200 (the portable system with the two joycons attached and none of the home-related accessories), a Switch hybrid (console+dock+"home console" accessories) for $299, which is the device they're currently releasing, and a Switch home premium (new box+Switch Pro controller) for $399. All would play the same games with different res/fidelity, and this would fall in line with Iwata's statements about "consoles like brothers in a family of systems" and Nintendo's target to avoid "software droughts with new platform releases". It would even make more sense to release the hybrid first, so that Nintendo can have all of their teams working on it in order to build a large library of software forward compatible with any new piece of hardware they release, even if that new hardware is simply a revision of the Switch.

Still, if this console even exists and isn't something else entirely, i wouldn't expect it any time soon.
 

Thraktor

Member
So i wasn't entirely wrong when i suggested that the 900p limitation was possibly due to Nintendo not being able to fully take advantage of TBR with this port. Aonuma said that the development of the Switch version started last spring, do you think it would've been possible to reach 1080p with other bells and whistles if the game was made from the beginning (or at least since late 2013-2014) with the Switch in mind? I mean, GPU power hardly seems to be the issue here, and CPU even less so, especially if the foxconn leaked clocks are correct.

It's very difficult to say, as it likely all comes down to the extent to which the rendering pipeline can be tiled or not. As mentioned above, the depth of field effects are an obvious case where tiling isn't possible, so that pass has to run over the slower main memory rather than the faster cache, which appears to be causing slowdown. It's hard to say if there are other effects which can't be tiled, and might also be contributing to pressure on the LPDDR4 bandwidth. One problem from my point of view is that, as far as possible, I've been maintaining a media black-out on Breath of the Wild, so I can only really infer from second-hand sources.

Mario Kart 8 Deluxe and FAST Racing RMX both running at 1080p/60fps seem to bode pretty well for Switch's TBR, though. In MK8D's case, as Digital Foundry pointed out in their analysis of the game, it appears to use a deferred renderer. Deferred rendering typically works pretty poorly on a tile-based renderer, as the usual way of handling G-buffers (render to texture) can't be tiled. If (as I suspect) Switch is using Vulkan as the primary graphics API and if (as I also suspect) Vulkan's use of render passes (and more specifically attachments within render passes) allows g-buffers to be tiled, then you'd expect a deferred renderer to work very well on Switch. Mario Kart 8 seems to back this up, as I can't imagine a deferred renderer hitting a rock-solid 1080p/60fps on Switch if large parts of its rendering pipeline had to squeeze through an LPDDR4 bus.
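For a sense of scale, here's what an untiled 128-bit G-buffer (the size Shin'en quotes for FAST Racing Neo's Wii U renderer) would cost in external bandwidth at 1080p60 if every pixel were written once in the geometry pass and read once in the lighting pass. Overdraw, depth traffic and the final colour buffer are deliberately ignored, so this is a lower bound:

```python
# External-memory traffic for an untiled 128-bit G-buffer at 1080p60,
# assuming one write (geometry pass) + one read (lighting pass) per pixel.
# Overdraw, depth and the final colour buffer are ignored for simplicity.
WIDTH, HEIGHT, FPS = 1920, 1080, 60
GBUFFER_BYTES_PER_PIXEL = 128 // 8           # 128-bit G-buffer = 16 B/px

per_frame_mb = WIDTH * HEIGHT * GBUFFER_BYTES_PER_PIXEL * 2 / 1e6
per_second_gb = per_frame_mb * FPS / 1e3
print(f"{per_frame_mb:.1f} MB/frame -> {per_second_gb:.2f} GB/s of a "
      f"~25 GB/s LPDDR4 budget, before overdraw")
```

Even this lower bound eats a noticeable slice of the handheld-mode bandwidth, and realistic overdraw multiplies it, which is why keeping the G-buffer on the L2 via tiled render passes matters so much here.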

FAST Racing RMX is perhaps even more impressive, given the game ran at sub-720p on Wii U. I'm not sure if they're using a deferred renderer, but there are some effects in there which aren't easily tiled (such as motion blur and water droplets on the camera), so it's pretty impressive from a bandwidth-optimisation point of view, although I suppose that's not unexpected from Shinen. I'd love to see an interview with them go in-depth into the optimisations involved, but I doubt Nintendo would allow it.

If (and that's a giant if) that GTX 1060-like device is even a thing, i think that it would be a premium "home console only", not an SCD. So they can have on the market a standalone Switch for ~$200 (the portable system with the two joycons attached and none of the home-related accessories), a Switch hybrid (console+dock+"home console" accessories) for $299, which is the device they're currently releasing, and a Switch home premium (new box+Switch Pro controller) for $399. All would play the same games with different res/fidelity, and this would fall in line with Iwata's statements about "consoles like brothers in a family of systems" and Nintendo's target to avoid "software droughts with new platform releases". It would even make more sense to release the hybrid first, so that Nintendo can have all of their teams working on it in order to build a large library of software forward compatible with any new piece of hardware they release, even if that new hardware is simply a revision of the Switch.

Still, if this console even exists and isn't something else entirely, i wouldn't expect it any time soon.

I would agree with you, and it seems to me the most sensible approach is one handheld, one hybrid and one home console, all playing the same games. The problem is, though, if we're to take this device as genuine, why does it have a screen? There wouldn't be any reason to have a screen on a devkit for a purely stationary device.

The reason I put forward the theory that it's an SCD or super-powered dock or whatever you want to call it is that the screen would allow developers to simulate un-docking and re-docking the Switch (perhaps with a mechanical toggle on the kit, or through software). It's likely just too early for them to have worked out the entire mechanical and electronic docking procedure for the device, and is hence simpler to manufacture the entire Switch + dock as a single unit. It would also explain why there's both a Switch SoC and a separate GPU in there, rather than a single chip. Granted that could be the case temporarily if they're designing a custom SoC with a GP106-class GPU, but that still leaves the question of the screen.

What I hope is the case is that it's effectively a standalone device that can also act as a dock or SCD. It would explain the description of the device while also being more sensible from a business standpoint than a pure add-on for existing Switch owners.
 

Rodin

Member
It's very difficult to say, as it likely all comes down to the extent to which the rendering pipeline can be tiled or not. As mentioned above, the depth of field effects are an obvious case where tiling isn't possible, so that pass has to run over the slower main memory rather than the faster cache, which appears to be causing slowdown. It's hard to say if there are other effects which can't be tiled, and might also be contributing to pressure on the LPDDR4 bandwidth. One problem from my point of view is that, as far as possible, I've been maintaining a media black-out on Breath of the Wild, so I can only really infer from second-hand sources.

Mario Kart 8 Deluxe and FAST Racing RMX both running at 1080p/60fps seem to bode pretty well for Switch's TBR, though. In MK8D's case, as Digital Foundry pointed out in their analysis of the game, it appears to use a deferred renderer. Deferred rendering typically works pretty poorly on a tile-based renderer, as the usual way of handling G-buffers (render to texture) can't be tiled. If (as I suspect) Switch is using Vulkan as the primary graphics API and if (as I also suspect) Vulkan's use of render passes (and more specifically attachments within render passes) allows g-buffers to be tiled, then you'd expect a deferred renderer to work very well on Switch. Mario Kart 8 seems to back this up, as I can't imagine a deferred renderer hitting a rock-solid 1080p/60fps on Switch if large parts of its rendering pipeline had to squeeze through an LPDDR4 bus.

FAST Racing RMX is perhaps even more impressive, given the game ran at sub-720p on Wii U. I'm not sure if they're using a deferred renderer, but there are some effects in there which aren't easily tiled (such as motion blur and water droplets on the camera), so it's pretty impressive from a bandwidth-optimisation point of view, although I suppose that's not unexpected from Shinen. I'd love to see an interview with them go in-depth into the optimisations involved, but I doubt Nintendo would allow it.
About Fast Neo:
At its core, we have a feature-rich deferred renderer designed to take advantage of the Wii U's eDRAM memory bandwidth - the 128-bit G-buffer fits comfortably within the console's 32MB.
http://www.eurogamer.net/articles/digitalfoundry-2015-vs-fast-racing-neo

I'm sure that Manfred Linzner confirmed in an interview that Shin'en moved from the forward renderer of Nano Assault Neo to a deferred renderer for this game, but i can't find it for some reason.

Anyway, i think it's likely that Fast isn't full 1080p on Switch, but uses temporal reconstruction just like on Wii U. It's still impressive though, as it's basically making the same jump as Mario Kart 8.
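A minimal sketch of what column-interleaved temporal reconstruction looks like, in the spirit of Shin'en's Wii U approach (the exact scheme they use is not public, so the interleaving details here are assumed): each frame renders only half the columns, and the missing half is filled in from the previous frame.

```python
# Minimal sketch of column-interleaved temporal reconstruction.
# Each frame renders alternate columns at half horizontal resolution;
# the other half is reused from the previous frame. Details assumed.
def reconstruct(curr_half, prev_half, curr_is_even):
    """curr_half/prev_half: lists of columns at half horizontal res."""
    out = [None] * (len(curr_half) * 2)
    for i, col in enumerate(curr_half):       # freshly rendered columns
        out[2 * i + (0 if curr_is_even else 1)] = col
    for i, col in enumerate(prev_half):       # reprojected from last frame
        out[2 * i + (1 if curr_is_even else 0)] = col
    return out

full = reconstruct(["e0", "e1"], ["o0", "o1"], curr_is_even=True)
print(full)  # ['e0', 'o0', 'e1', 'o1']
```

A real implementation would also reproject the previous frame's columns by per-pixel motion vectors before merging; this sketch only shows the interleave.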

I would agree with you, and it seems to me the most sensible approach is one handheld, one hybrid and one home console, all playing the same games. The problem is, though, if we're to take this device as genuine, why does it have a screen? There wouldn't be any reason to have a screen on a devkit for a purely stationary device.

The reason I put forward the theory that it's an SCD or super-powered dock or whatever you want to call it is that the screen would allow developers to simulate un-docking and re-docking the Switch (perhaps with a mechanical toggle on the kit, or through software). It's likely just too early for them to have worked out the entire mechanical and electronic docking procedure for the device, and is hence simpler to manufacture the entire Switch + dock as a single unit. It would also explain why there's both a Switch SoC and a separate GPU in there, rather than a single chip. Granted that could be the case temporarily if they're designing a custom SoC with a GP106-class GPU, but that still leaves the question of the screen.

What I hope is the case is that it's effectively a standalone device that can also act as a dock or SCD. It would explain the description of the device while also being more sensible from a business standpoint than a pure add-on for existing Switch owners.

Yeah, i'm not sure how to explain that screen. It was a huge red flag for me, but like you said, we can't completely dismiss it considering the rest of his leak.
 
If the Switch is really 4-5x as powerful as the Wii U when docked, then I'm expecting all 720p Wii U ports to be 1080p. It's not just the GPU that's 4x; the CPU is going up by a similar scale. The only thing that isn't is the RAM (4GB), and the bandwidth is unknown.

So Mario Kart going from 720p to 1080p doesn't wow me, but that's what was expected considering the hardware differences, so that's satisfactory/passing for me. Would be nice if they could have held 60fps at 3 and 4 player or increased the graphical fidelity, if the system was capable, but w/e. BotW not going to 1080p is a bit disappointing, though I don't know the full reason behind it, like whether it's really a bandwidth issue or not, considering the only graphical difference confirmed by Nintendo at least is the resolution.

I know that these Wii U ports were made for the Wii U, so I'm expecting new games to look good on the Switch as they will be better optimized.
 
If the Switch is really 4-5x as powerful as the Wii U when docked, then I'm expecting all 720p Wii U ports to be 1080p. It's not just the GPU that's 4x; the CPU is going up by a similar scale. The only thing that isn't is the RAM (4GB), and the bandwidth is unknown.

So Mario Kart going from 720p to 1080p doesn't wow me, but that's what was expected considering the hardware differences, so that's satisfactory/passing for me. Would be nice if they could have held 60fps at 3 and 4 player or increased the graphical fidelity, if the system was capable, but w/e. BotW not going to 1080p is a bit disappointing, though I don't know the full reason behind it, like whether it's really a bandwidth issue or not, considering the only graphical difference confirmed by Nintendo at least is the resolution.

I know that these Wii U ports were made for the Wii U, so I'm expecting new games to look good on the Switch as they will be better optimized.

I think this is exactly the problem with the BotW port. They began porting it last Spring so it likely just took a while to optimize, and they have (for whatever reason) decided on 900p with certain effects like draw distance enhanced. Games built from the ground up (like ARMS) look fantastic on the Switch. I think it'll be easier to judge its power when we see more games like ARMS coming out.
 
Has there been any new information in the past few days on whether it's 20nm or 16nm FinFET?

I'm wondering this too, but I don't think any more info has come out. It sure looks like the Foxconn numbers are right, and logic says they wouldn't be running it at these clocks for 8 days straight if they weren't preparing to launch with those clocks.

But I don't think Eurogamer or Digital Foundry has acknowledged this or given it any credence, so really, we have nothing new to report I guess.
 

McMilhouse

Neo Member
I'm wondering this too, but I don't think any more info has come out. It sure looks like the Foxconn numbers are right, and logic says they wouldn't be running it at these clocks for 8 days straight if they weren't preparing to launch with those clocks.

But I don't think Eurogamer or Digital Foundry has acknowledged this or given it any credence, so really, we have nothing new to report I guess.

Yeah. If it's 20nm, it would be a shame: if Zelda only runs 2.5-3 hours, FinFET could have given it more portable hours, imo.
 

Malakai

Member
That Eurogamer X1 claim doesn't make sense to me at all. When they leaked the X1 during the summer of 2016, they said that devkits then were most likely using overclocked X1s, due to the loud fan noise. If one set of devkits was using the overclocked X1 during the summer, why would Nintendo downclock that same X1 and ship it in a retail product?
 

guek

Banned
If the Switch is really 4-5x as powerful as the Wii U when docked, then I'm expecting all 720p Wii U ports to be 1080p. It's not just the GPU that's 4x; the CPU is going up by a similar scale. The only thing that isn't is the RAM (4GB), and the bandwidth is unknown.

So Mario Kart going from 720p to 1080p doesn't wow me, but that's what was expected considering the hardware differences, so that's satisfactory/passing for me. Would be nice if they could have held 60fps at 3 and 4 player or increased the graphical fidelity, if the system was capable, but w/e. BotW not going to 1080p is a bit disappointing, though I don't know the full reason behind it, like whether it's really a bandwidth issue or not, considering the only graphical difference confirmed by Nintendo at least is the resolution.

I know that these Wii U ports were made for the Wii U, so I'm expecting new games to look good on the Switch as they will be better optimized.

My totally layman's guess as to why Zelda is only 900p instead of 1080p on Switch is that it's an open-world game and they can't just brute force it like they could with MK8. I'm curious though what Splatoon 2 will eventually be rendered at while docked. Based on hands-on reports, it was 720p/60fps both handheld and docked, which makes me think the demo simply isn't set to output any higher when docked.
 

z0m3le

Banned
My totally layman's guess as to why Zelda is only 900p instead of 1080p on Switch is that it's an open-world game and they can't just brute force it like they could with MK8. I'm curious though what Splatoon 2 will eventually be rendered at while docked. Based on hands-on reports, it was 720p/60fps both handheld and docked, which makes me think the demo simply isn't set to output any higher when docked.

Zelda has been in development for less than a year on unfinished hardware. The game looks better on Switch and has a greater draw distance, but beyond that, the Wii U version of the game, according to Reggie, isn't as smooth. So yes, it's 900p, but that really doesn't represent the Switch's total performance, although I do think it might indeed be lacking memory bandwidth for effects on top of everything else it is doing in a tile-based renderer. (Thanks Thraktor, didn't know you couldn't tile effects like DoF.)
 

nynt9

Member
That Eurogamer X1 claim doesn't make sense to me at all. When they leaked the X1 during the summer of 2016, they said that devkits then were most likely using overclocked X1s, due to the loud fan noise. If one set of devkits was using the overclocked X1 during the summer, why would Nintendo downclock that same X1 and ship it in a retail product?

Because dev kits need more power than retail hardware? They can overclock devkits to run games in debug mode, but the overclocking might not be good for the battery life in a retail unit.
 
Because dev kits need more power than retail hardware? They can overclock devkits to run games in debug mode, but the overclocking might not be good for the battery life in a retail unit.

Apparently this hasn't been true for a fairly long time. Devkits sometimes have a bit more RAM and I think that's it.
 

Donnie

Member
I'm a total luddite, so forgive me. Is this the latest estimate and is there veracity to it? People are talking like Switch is literally 1.5 Wii Us.

In handheld mode it may be 1.5x Wii U (worst-case scenario of Eurogamer clocks and a basic X1 SoC). In docked mode it's a minimum of 2.5x handheld mode, or about 4x Wii U. Of course that's still with good optimisation for the hardware (though it's based off the worst-case spec for Switch). You won't get 4x Wii U from that hardware if you just do quick ports.
 
In handheld mode it may be 1.5x Wii U (worst-case scenario of Eurogamer clocks and a basic X1 SoC). In docked mode it's a minimum of 2.5x handheld mode, or about 4x Wii U. Of course that's still with good optimisation for the hardware (though it's based off the worst-case spec for Switch). You won't get 4x Wii U from that hardware if you just do quick ports.

And you're just talking about the GPU. The CPU is loads better than Wii U's, and we have 3.2x more RAM available for games. Overall the console can probably be considered 3-5x Wii U, averaged across handheld and docked modes.

Also it sure looks like these Foxconn clocks are accurate and more recent than Eurogamer's...
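As a sanity check on the multipliers in this exchange, the arithmetic can be sketched in a few lines. All inputs are the rumored figures quoted in the thread, not confirmed specs:

```python
# Rumored multipliers from the discussion above -- illustrative, not official.
wiiu_handheld_ratio = 1.5   # Switch handheld vs Wii U (worst-case reading)
dock_vs_handheld = 2.5      # docked mode vs handheld mode (stated minimum)

docked_vs_wiiu = wiiu_handheld_ratio * dock_vs_handheld
print(f"Docked vs Wii U: ~{docked_vs_wiiu:.2f}x")  # ~3.75x, i.e. "about 4x"
```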
 
Have I missed anything new on those Foxconn clocks being newer than Eurogamers btw?

I haven't heard anything from insiders or anything like that, but if you read the last few pages you'll see that the power consumption of the specs and clocks detailed in the Foxconn leak matches up perfectly with the clock speeds leaked by Eurogamer for a TX1.

Edit: assuming the calculations are right.

Which would strongly suggest Nintendo was able to squeeze out more performance by moving to a smaller 16nm node after the initial clocks and specs were set. It would be an astronomical coincidence otherwise.
 

Speely

Banned
I haven't heard anything from insiders or anything like that, but if you read the last few pages you'll see that the power consumption of the specs and clocks detailed in the Foxconn leak matches up perfectly with the clock speeds leaked by Eurogamer for a TX1.

Which would strongly suggest Nintendo was able to squeeze out more performance by moving to a smaller 16nm node after the initial clocks and specs were set. It would be an astronomical coincidence otherwise.

I am hopeful.
 
Apparently this hasn't been true for a fairly long time. Devkits sometimes have a bit more RAM and I think that's it.

Speaking of RAM, I don't think Nintendo has confirmed how much RAM the Switch will have at launch. I went for a Google search and couldn't find anything outside of rumors from October. It's weird that they would confirm the storage, but not the RAM, this close to release. I thought Nintendo confirmed the Wii U had 2GB of RAM sometime in August 2012.

I'm not expecting last-minute changes, so 4GB of RAM, as rumored by Laura, Emily Rogers, etc.
 

McMilhouse

Neo Member
Speaking of RAM, I don't think Nintendo has confirmed how much RAM the Switch will have at launch. I went for a Google search and couldn't find anything outside of rumors from October. It's weird that they would confirm the storage, but not the RAM, this close to release. I thought Nintendo confirmed the Wii U had 2GB of RAM sometime in August 2012.

I'm not expecting last-minute changes, so 4GB of RAM, as rumored by Laura, Emily Rogers, etc.

Is it all DDR4?
 

Vic

Please help me with my bad english
So the CPU+GPU clocks included in the Foxconn leak seem more realistic than we expected, huh. Given that most of the leaked info was precisely right, I wouldn't be surprised if the Eurogamer clocks were actually taken from an older dev kit.

Is it all DDR4?
Yes, LPDDR4 (LP for low power).
 

Thraktor

Member
I haven't heard anything from insiders or anything like that, but if you read the last few pages you'll see that the power consumption of the specs and clocks detailed in the Foxconn leak matches up perfectly with the clock speeds leaked by Eurogamer for a TX1.

Edit: assuming the calculations are right.

Which would strongly suggest Nintendo was able to squeeze out more performance by moving to a smaller 16nm node after the initial clocks and specs were set. It would be an astronomical coincidence otherwise.

I'd chalk it up to coincidence. The deduction seems to be made on the basis of a rough estimate I made on the power consumption of Pascal, but it's a very rough estimate and it's extremely unlikely to be accurate to 100mW, let alone 10mW. (Hell, I'd be happy if it was accurate to 1W).

Besides, decisions like the use of a particular ARM core or manufacturing process would have been locked down over a year ago, there's no scope for them to have made such a change only a couple of months before launch. Furthermore, they specifically stated that the Eurogamer leaks were final clock speeds. Had they known that the final chip was 16nm or used more power-efficient CPU cores or whatever else that would give them scope for higher clock speeds, then there's no way they would have described the Eurogamer clocks as final.

There's a small chance that, after thermal and battery testing, Nintendo decided at the last minute to bump up clock speeds a little (which happened on Wii U), but I don't see any reason to believe that's what's happened here, as a CPU clock jump from 1GHz to 1.78GHz is far too large to be a last-minute adjustment.

About Fast Neo:

http://www.eurogamer.net/articles/digitalfoundry-2015-vs-fast-racing-neo

I'm sure that Manfred Linzner confirmed in an interview that Shin'en moved from the forward renderer of Nano Assault Neo to a deferred renderer for this game, but I can't find it for some reason.

Anyway, I think it's likely that Fast isn't full 1080p on Switch, but uses temporal reconstruction just like on Wii U. It's still impressive though, as it's basically making the same jump as Mario Kart 8.

Good to hear they're using deferred rendering too, thanks for the confirmation. Screenshots of Fast RMX seem pretty sharp at 1080p, although if I'm not mistaken Fast NEO used dynamic resolution, so perhaps they're going that route too.

Yeah, I'm not sure how to explain that screen. It was a huge red flag for me, but like you said, we can't completely dismiss it considering the rest of his leak.

It's certainly strange. The other way of looking at it is as a direct successor to Switch, but even if they were using GP106 as a stand-in for a much less powerful Pascal GPU (say 4 SMs), it would still seem far too early to have dev kits for Switch 2 before the first model had even released. You'd be looking at releasing a new Switch perhaps only a year after the first one, and even though they've talked about going the iOS route I can't imagine they would take it to that kind of extreme. Especially when there's options for new hardware releases (dedicated handheld, dedicated console) which they could bring out and still give the original Switch time to breathe.

It does remind me of Jen-Hsun Huang's comments about Nintendo after Nvidia's last quarterly reports:

Jen-Hsun Huang said:
I guess you could also say that Nintendo contributed a fair amount to that growth. And over the next – as you know, the Nintendo architecture and the company tends to stick with an architecture for a very long time.

He cuts off after saying "over the next", and if you listen to the audio of the earnings call (the question is right at the end) you can hear him hesitate when he says it, as if he's realising "oh, wait, I'm not supposed to talk about this", and then changing the subject. It would certainly seem like Nvidia already has more stuff in the works for Nintendo, and I don't think he's talking about a Switch Pocket (as that would likely just use the existing Switch SoC). At the time I had assumed that whatever they're working on would be a good few years off, but perhaps it isn't. Nintendo might be going heavier into the "family of systems" idea than we had thought.
 
I'd chalk it up to coincidence. The deduction seems to be made on the basis of a rough estimate I made on the power consumption of Pascal, but it's a very rough estimate and it's extremely unlikely to be accurate to 100mW, let alone 10mW. (Hell, I'd be happy if it was accurate to 1W).

Besides, decisions like the use of a particular ARM core or manufacturing process would have been locked down over a year ago, there's no scope for them to have made such a change only a couple of months before launch. Furthermore, they specifically stated that the Eurogamer leaks were final clock speeds. Had they known that the final chip was 16nm or used more power-efficient CPU cores or whatever else that would give them scope for higher clock speeds, then there's no way they would have described the Eurogamer clocks as final.

There's a small chance that, after thermal and battery testing, Nintendo decided at the last minute to bump up clock speeds a little (which happened on Wii U), but I don't see any reason to believe that's what's happened here, as a CPU clock jump from 1GHz to 1.78GHz is far too large to be a last-minute adjustment.

The one thing holding me back from believing this 100% (or even 80%) was Eurogamer explicitly describing those clocks as final, like you said.

You don't think it's possible though that Nintendo had been investigating the possibility of 16nm for a while and finally determined they could go that route around August - October? Maybe they had to await reports of yield issues? And then come October they sent out those 16nm devkits which would line up with LKD's report of October devkits being more powerful. Again, Eurogamer describing those clocks as final sorta contradicts all of this, including LKD's October devkit leak.

I suppose if the numbers are very rough then it seems a lot less convincing than I previously thought. Power consumption matching up identically would have been an enormous indicator, but like you said since we don't have much hard data to work from that can't really be determined now.

Good to hear they're using deferred rendering too, thanks for the confirmation. Screenshots of Fast RMX seem pretty sharp at 1080p, although if I'm not mistaken Fast NEO used dynamic resolution, so perhaps they're going that route too.

It's certainly strange. The other way of looking at it is as a direct successor to Switch, but even if they were using GP106 as a stand-in for a much less powerful Pascal GPU (say 4 SMs), it would still seem far too early to have dev kits for Switch 2 before the first model had even released. You'd be looking at releasing a new Switch perhaps only a year after the first one, and even though they've talked about going the iOS route I can't imagine they would take it to that kind of extreme. Especially when there's options for new hardware releases (dedicated handheld, dedicated console) which they could bring out and still give the original Switch time to breathe.

It does remind me of Jen-Hsun Huang's comments about Nintendo after Nvidia's last quarterly reports:

He cuts off after saying "over the next", and if you listen to the audio of the earnings call (the question is right at the end) you can hear him hesitate when he says it, as if he's realising "oh, wait, I'm not supposed to talk about this", and then changing the subject. It would certainly seem like Nvidia already has more stuff in the works for Nintendo, and I don't think he's talking about a Switch Pocket (as that would likely just use the existing Switch SoC). At the time I had assumed that whatever they're working on would be a good few years off, but perhaps it isn't. Nintendo might be going heavier into the "family of systems" idea than we had thought.

That would be very interesting news if it was coming as early as Holiday 2017 or something... It sure seems like the Switch is a "soft" launch, as some users have been calling it, with an incomplete OS, a trial online period, and relatively few games. They could be throwing us a huge curveball a year from now, though I really do doubt they want to muddy development even further with 1-2 more development targets.

Nintendo is nothing if not unpredictable though.

If it was 16nm FinFET, wouldn't most of these titles be running at 720p/60fps or 1080p/30fps? Something doesn't add up.

16nm means nothing for performance when taken alone. It allows you to raise clock speeds without raising power consumption too much, which could account for the 20% higher GPU clock. But no, that's not something which would be all that noticeable in games.
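To illustrate why a node shrink enables a clock bump without more power, here is a toy dynamic-power sketch using the textbook relation P ≈ C·V²·f. The capacitance and voltage figures are invented for illustration only, not actual TX1 or 16nm numbers:

```python
# Toy dynamic-power model: P is proportional to C * V^2 * f.
def dynamic_power(cap, volts, freq_mhz):
    return cap * volts ** 2 * freq_mhz

# Baseline: rumored 20nm docked GPU clock (relative cap/voltage units).
p_20nm = dynamic_power(cap=1.00, volts=1.00, freq_mhz=768)
# Suppose (assumption) the shrink cuts switched capacitance ~20% and
# voltage ~10%; then a ~20% higher clock still draws less power.
p_16nm = dynamic_power(cap=0.80, volts=0.90, freq_mhz=921)
print(p_16nm / p_20nm)  # below 1.0 under these assumed figures
```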
 

McMilhouse

Neo Member
It would be annoying if the initial batch are 20nm leftovers, and then once TSMC's foundry moves to 16nm they switch it quietly and the battery life gets better. (Like Apple did with the iPhone 4S and many other products, for example.)
 

Chronos24

Member
It would be annoying if the initial batch are 20nm leftovers, and then once TSMC's foundry moves to 16nm they switch it quietly and the battery life gets better. (Like Apple did with the iPhone 4S and many other products, for example.)

That would upset me, but such is the life of early adopters like myself haha.
 

ggx2ac

Member
I was thinking about that foxconn leak with the 2000 dev-kits having the more powerful GPU, more RAM and a display screen but no battery.

I am so bored... Okay, Hardware Fanfiction: it's for the SCD, and the reason it's not a standalone console is that they're testing the waters for a more powerful console, for two reasons.

1) To see if there's an audience for a powerful Nintendo home console.
2) To test the waters for VR; they already have HD Rumble, they just need a VR headset.

Since we are talking about the SCD, it could be for anything. The fact that it's a peripheral means that if it fails to sell well, they can just stop selling it, as opposed to trying to keep a dead console like the Wii U alive.

All speculation, and I don't care if I'm wrong; I'm just figuring out what the dev kits are for.
 
It would be annoying if the initial batch are 20nm leftovers, and then once TSMC's foundry moves to 16nm they switch it quietly and the battery life gets better. (Like Apple did with the iPhone 4S and many other products, for example.)

They wouldn't do that "quietly". 16nm Pascal is 60% more energy efficient than 20nm. So if we have 20nm at launch, then 16nm would likely come as part of a Switch Lite revision.
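A back-of-envelope reading of that 60% figure, taking it as a perf-per-watt gain and using the rumored 4.3 W 20nm GPU draw discussed earlier in the thread (both are assumptions, not official numbers):

```python
# Rumored 20nm GPU power draw at docked clocks (from this thread, unverified).
p_20nm_gpu = 4.3
# "60% more energy efficient" read as a 1.6x perf-per-watt improvement.
perf_per_watt_gain = 1.6

p_16nm_same_perf = p_20nm_gpu / perf_per_watt_gain
print(f"~{p_16nm_same_perf:.1f} W for the same GPU work")  # ~2.7 W
```

Under those assumptions the same GPU workload drops from roughly 4.3 W to roughly 2.7 W, which is why a quiet node swap would show up clearly in battery life.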
 

Theonik

Member
They wouldn't do that "quietly". 16nm Pascal is 60% more energy efficient than 20nm. So if we have 20nm at launch, then 16nm would likely come as part of a Switch Lite revision.
Besides, Nintendo is probably cheap enough that by the time it happens we'll be on 10nm, but the 16nm chips might be a bargain.
 

z0m3le

Banned
There is a lot of weird dismissal of these clocks from the luka.

First I'd like to comment on Thraktor's last post: your estimates only apply to the Maxwell-to-Pascal power consumption. The SoC estimation is as good as we can get, and even if your chart is off, the difference is going to be counted in hundredths of watts, not whole watts.

Next, the Wii U saw a very similar CPU performance upgrade for developers working on launch software.

Launch software developers thought the Wii U had 2 CPU cores at 1GHz, and up until a few months before launch, this was the case. Later on they got access to 3 CPU cores at 1.24GHz. That is substantial: nearly a doubling of total CPU performance months before launch, and launch titles ran without that extra performance. The GPU performance change was even greater than in this Foxconn leak, on top of that, so I don't see your point as a strong one.

Lastly, Eurogamer answered someone in this thread; I believe it was Hermii who got the response and posted it here. The post has since been altered, which is weird, but it originally said that the clocks might have changed. After looking at this Foxconn leak, that doesn't sound too final to me, IMO. And with some developers still using the July dev kits up to 3 or 4 weeks ago, and possibly even now, we don't know what changes were made in the final hardware; even Eurogamer made that same comment.

PS: I believe the Eurogamer rumor 100%; I simply think that final hardware allowed Nintendo to make these changes, resulting in the Foxconn leak, which is as solid as any info we've ever gotten on Switch, IMO.
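The Wii U CPU claim above can be checked with simple cores-times-clock arithmetic (a simplification, since real throughput rarely scales linearly with either cores or clock):

```python
# Reported early vs final Wii U dev kit CPU configurations (as stated above).
early = 2 * 1.00   # core-GHz: 2 cores at 1.0 GHz
final = 3 * 1.24   # core-GHz: 3 cores at 1.24 GHz

print(f"{final / early:.2f}x")  # 1.86x: close to, but short of, a doubling
```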
 
^

That's all certainly possible, but the Eurogamer leak basically had Nintendo saying to developers: "here are the final clocks available for applications". Did something like that happen with the lower Wii U clocks? If not, it just seems odd that they would describe the clocks as "final" and then go ahead and change them a couple of months later.

Again, I'm keeping an open mind here and the Foxconn leak is very persuasive, it's just not terribly easy to reconcile with the wording in that Digital Foundry article.
 

z0m3le

Banned
^

That's all certainly possible, but the Eurogamer leak basically had Nintendo saying to developers: "here are the final clocks available for applications". Did something like that happen with the lower Wii U clocks? If not, it just seems odd that they would describe the clocks as "final" and then go ahead and change them a couple of months later.

Again, I'm keeping an open mind here and the Foxconn leak is very persuasive, it's just not terribly easy to reconcile with the wording in that Digital Foundry article.

We did see this happen with the PSP, where launch clocks were what, 222MHz? And when God of War came out alongside the PSP-2000, they moved the clocks to 333MHz.

The thing about Nintendo is that they hold all the cards, and it's pretty obvious that they aren't launching any software that pushes the system at launch. This could be because they weren't sure what direction they were going in until final hardware was finished, and all of those pieces of software required more time. The rumor behind Mario, for instance, was that it wasn't running well in October (Emily or LKD said this), but in early November we heard that had drastically changed. Right now we are hearing that Mario runs 720p docked or undocked, so while it might be "finished" according to some leakers, it could be that it needs time to be optimized and perform better, given that final hardware has only been around for 2 months or so.
 
There is a lot of weird dismissal of these clocks from the luka.

First I'd like to comment on Thraktor's last post: your estimates only apply to the Maxwell-to-Pascal power consumption. The SoC estimation is as good as we can get, and even if your chart is off, the difference is going to be counted in hundredths of watts, not whole watts.

Next, the Wii U saw a very similar CPU performance upgrade for developers working on launch software.

Launch software developers thought the Wii U had 2 CPU cores at 1GHz, and up until a few months before launch, this was the case. Later on they got access to 3 CPU cores at 1.24GHz. That is substantial: nearly a doubling of total CPU performance months before launch, and launch titles ran without that extra performance. The GPU performance change was even greater than in this Foxconn leak, on top of that, so I don't see your point as a strong one.

Lastly, Eurogamer answered someone in this thread; I believe it was Hermii who got the response and posted it here. The post has since been altered, which is weird, but it originally said that the clocks might have changed. After looking at this Foxconn leak, that doesn't sound too final to me, IMO. And with some developers still using the July dev kits up to 3 or 4 weeks ago, and possibly even now, we don't know what changes were made in the final hardware; even Eurogamer made that same comment.

PS: I believe the Eurogamer rumor 100%; I simply think that final hardware allowed Nintendo to make these changes, resulting in the Foxconn leak, which is as solid as any info we've ever gotten on Switch, IMO.

The Wii U's bizarre issue with devs not seeing the third core of the CPU was only reported by one poster (maybe Ideaman?), and I believe it was said to be a glitch. Those early Wii U dev kits were reported to be consistently bad, but I don't know if we can conclude that this particular glitch was widespread.

In any case, Nintendo having bad dev kits is not the same as what may be going on with the Switch's kits. I am curious about Eurogamer's response to the other person; their response to me was basically dismissal.
 
We did see this happen with the PSP, where launch clocks were what, 222MHz? And when God of War came out alongside the PSP-2000, they moved the clocks to 333MHz.

The thing about Nintendo is that they hold all the cards, and it's pretty obvious that they aren't launching any software that pushes the system at launch. This could be because they weren't sure what direction they were going in until final hardware was finished, and all of those pieces of software required more time. The rumor behind Mario, for instance, was that it wasn't running well in October (Emily or LKD said this), but in early November we heard that had drastically changed. Right now we are hearing that Mario runs 720p docked or undocked, so while it might be "finished" according to some leakers, it could be that it needs time to be optimized and perform better, given that final hardware has only been around for 2 months or so.

I actually recall a comment by Tom Phillips of Eurogamer that the Switch reveal video was delayed until they could get Mario running smoothly, which apparently happened towards the end of October. I don't think Laura or Emily said anything to that effect, though I could have easily missed it.

Other than that, your points make sense. All of this could be possible. It's strange that no one really seems to care about (or comment on) this leak, although I just saw SpawnWave put up a video about it.
 

z0m3le

Banned
I actually recall a comment by Tom Phillips of Eurogamer that the Switch reveal video was delayed until they could get Mario running smoothly, which apparently happened towards the end of October. I don't think Laura or Emily said anything to that effect, though I could have easily missed it.

Other than that, your points make sense. All of this could be possible. It's strange that no one really seems to care about (or comment on) this leak, although I just saw SpawnWave put up a video about it.

You know, Zelda in the reveal video looked like it was having horrible frame drops too, nothing like what I've seen this last week from the Switch build. I'm not sure, but I feel it corroborates the Mario information.
 
You know, Zelda in the reveal video looked like it was having horrible frame drops too, nothing like what I've seen this last week from the Switch build. I'm not sure, but I feel it corroborates the Mario information.

Well, remember that nothing in that video was running on the Switch hardware shown. People here figured out that the Zelda footage was at a constant 19 or so FPS because they didn't have enough footage to fill the amount of time it was supposed to be on screen, so the editor had to stretch it. Which is a ridiculous thing to happen, but it seems pretty clear that was the case.

Saw the video in my YouTube feed but didn't watch it. Who is this guy? Anyway, I think it's too easy to dismiss the Foxconn leak based on anything Eurogamer. Wasn't there a Reddit software leak, like the day before the announcement, that everyone disregarded... but turned out to be true? We will only know if this is true when we get the system teardown... and it might just be that Foxconn was right. With that being said, I think there is a reason we aren't hearing anything from developers: I think the system does what it is supposed to do. Imagine developers having to develop through the Wii and Wii U generations of consoles. Now they have a machine that can run modern engines and, if rumor is to be believed, is the easiest of all major platforms to develop for. I think the games announced and shown at E3 will speak for themselves.

SpawnWave has done a bunch of videos about Switch/NX hardware rumors; they talk about him a lot on Reddit from what I've seen. And yeah, you're right that I don't expect to hear anything from developers, but the fact that no news outlets like Eurogamer/Kotaku/Polygon seem to be commenting on this leak is a bit strange to me too. Considering how much (and the kind of things) the leak got right, it's hard not to take it seriously.
 

z0m3le

Banned
Well, remember that nothing in that video was running on the Switch hardware shown. People here figured out that the Zelda footage was at a constant 19 or so FPS because they didn't have enough footage to fill the amount of time it was supposed to be on screen, so the editor had to stretch it. Which is a ridiculous thing to happen, but it seems pretty clear that was the case.



SpawnWave has done a bunch of videos about Switch/NX hardware rumors; they talk about him a lot on Reddit from what I've seen. And yeah, you're right that I don't expect to hear anything from developers, but the fact that no news outlets like Eurogamer/Kotaku/Polygon seem to be commenting on this leak is a bit strange to me too. Considering how much (and the kind of things) the leak got right, it's hard not to take it seriously.

lol, I missed that, absolutely ridiculous. So how did we figure out that they stretched the video, rather than it being bad performance?

SpawnWave deals in speculation as much as anyone. I mean, it was a nice shout-out and I am subbed to him, but really it is just his take on this leak.

The leak of the titles came from 4chan, and yeah, the real leaks always seem to get dismissed. But the reason no sites are reporting on the Foxconn leak might be that NDAs around final dev kits could be much more serious, and not every dev has them. I'd also speculate that Nintendo has only given this info to wave 2 titles, since launch titles should all be targeting the older specs regardless. And I think even Nintendo has come out and said that Zelda BotW isn't using the Switch to its fullest.

It's strange to me, but then all rumors seem to have dried up around the Switch. I think all publishers are in wait-and-see mode, so any title being worked on for the Switch in private might never see the light of day, depending on how the Switch sells this first year.
 

z0m3le

Banned
The thing is, most of these sites want to be right. They want to build their "credibility" for leaks. I don't expect them to comment if the guy has specs right that contradict the info they put out. I have no allegiance to anyone in the "rumor mill" department. The fact is, the Foxconn person got a lot of info right that other sites didn't. It would be absurd to dismiss his info because someone else said something different. Again, for me it's a moot point, because we aren't going to know until we get a teardown of the system. So we can speculate, but anyone could be right. We will see in the end.

Yeah, I honestly think treating Eurogamer's clocks as some sort of solid fact now seems ignorant of this leak at best, and willfully so at worst. We just won't know anymore, and either could end up being the case.
 