
A Nintendo Switch has been taken apart

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Yes, that's one of the general problems with low-level architecture-dependent optimizations encoded in user-level code, and doesn't apply only to games or ESRAM: it (a) obfuscates the actual algorithm, making it harder to understand and maintain, (b) greatly reduces performance portability, if not actual portability, and (c) reduces malleability, which makes it harder to experiment with and test algorithmic improvements that might be more meaningful than low-level architecture-specific tweaking.

Given all these factors, for general applications an architecture which doesn't impose hardware-specific user-level optimization concerns to achieve "good enough" performance seems to have a market advantage, even if it is less optimal from a pure hardware perspective.
I think we're veering off from the subject. It's not that everybody needs to optimise their apps up the wazoo. It's about what one does need to do to hit a performance target in a scenario where a naive port, whatever that entails, misses that. And lo and behold, consoles do tend to be underpowered compared to workstations. So unless you're doing a Tetris* you'd likely have to think about optimisations, whether that obfuscates the algorithm or not. We live in a reality where people buy (and generally use) products based on how they perform, not based on how pure their algorithmic implementation is.

* Fun anecdote: I have a ray-traced Tetris whose CPU implementation runs best on multi-socket Xeons.
 

Durante

Member
I think we're veering off from the subject. It's not that everybody needs to optimise their apps up the wazoo. It's about what one does need to do to hit a performance target in a scenario where a naive port, whatever that entails, misses that.
Exactly. And reducing that amount of effort to get acceptable performance for most games seems like an important hardware design goal even for consoles these days -- much more so than it used to be.

Just look at the platform development from PS2 and PS3 to PS4.
 

Hermii

Member
Exactly. And reducing that amount of effort to get acceptable performance for most games seems like an important hardware design goal even for consoles these days -- much more so than it used to be.

Just look at the platform development from PS2 and PS3 to PS4.
Haven't Nintendo had this goal since they fucked up with the N64? GameCube was maybe the most developer friendly system of that gen, Wii was basically a GameCube, Wii U was fairly developer friendly once the tools were finished, and with Switch they chose to work with Nvidia to create a developer environment that may be above and beyond any other console.
 

Durante

Member
Haven't Nintendo had this goal since they fucked up with the N64? GameCube was maybe the most developer friendly system of that gen, Wii was basically a GameCube, Wii U was fairly developer friendly once the tools were finished, and with Switch they chose to work with Nvidia to create a developer environment that may be above and beyond any other console.
Sure, you can see that as another example of the same overall development. I chose PS3 to PS4 since the contrast is particularly stark and the transition is recent.
 
Haven't Nintendo had this goal since they fucked up with the N64? GameCube was maybe the most developer friendly system of that gen, Wii was basically a GameCube, Wii U was fairly developer friendly once the tools were finished, and with Switch they chose to work with Nvidia to create a developer environment that may be above and beyond any other console.
Yeah, those proprietary mini discs with less space than standard DVDs were really dev friendly!
 

Mameshiba

Neo Member
I don't see the LPDDR4 memory included in the power draw calculation, or did I miss it?

Edit: I found this table, but it's from 2014, so I don't know if things have evolved since then:

It's not in the calculation, because I don't know how much power LPDDR4 draws.
The table is certainly an interesting find, but I am not sure how to read it.
K4F6E304HB-MGCH would put us in the "two-channel LPDDR4 at 3200 Mbits/s" column, but how do we know how much energy a bit needs?
I hope it's not the bottom row :>

Was there any mention of the Vdd & Vddq voltages for the table?
 

madmackem

Member
I wouldn't be surprised if Nintendo didn't increase GPU clock speeds for the Switch at all in docked mode for BotW. It could explain the framerate dropping down to the low 20s at times in docked mode at 900p, while handheld mode is a stable 30 at 720p.

I mean it's possible that it could be a bug, and we will indeed have a patch on day 1. But knowing how Nintendo treated TP on Wii, which was exactly like the Cube version minus the controls, it doesn't seem like Nintendo took advantage of the Switch's hardware at all, because they were rushing it for launch and maybe didn't want to alienate the gamers who were looking forward to the game on the original console (the Wii U). Certainly not in docked mode.
I'm sure Peter Brown at GameSpot said he dropped the res to 720p while docked and still had the framerate drops in the same places.
 

beril

Member
Huh? How would he do that? I doubt Switch has a setting for that (or did I miss something?).

It has an option for output resolution, but that doesn't necessarily mean the game changes render resolution; it's especially unlikely when a game uses a non-native internal resolution like Zelda's 900p.
 
Huh? How would he do that? I doubt Switch has a setting for that (or did I miss something?).

It does indeed have a setting to manually adjust resolution.

That said, given the very nature of the Switch's hybrid function, I doubt simply changing the output resolution is the same as the system thinking 'welp, handheld time'. Other settings may still be active that wouldn't be in handheld mode.
 

Hermii

Member
Huh? How would he do that? I doubt Switch has a setting for that (or did I miss something?).
It's probably the output resolution. It's still rendering at 900p, I think. You won't get an improved framerate on PS4 (or any console) if you change the output to 720p.
 

Zedark

Member
It's probably the output resolution. It's still rendering at 900p, I think. You won't get an improved framerate on PS4 (or any console) if you change the output to 720p.

Right, it being just an output resolution makes sense. I didn't think about that setting. Also explains why there is no change in performance.
 

Donnie

Member
It's not in the calculation, because I don't know how much power LPDDR4 draws.
The table is certainly an interesting find, but I am not sure how to read it.
K4F6E304HB-MGCH would put us in the "two-channel LPDDR4 at 3200 Mbits/s" column, but how do we know how much energy a bit needs?
I hope it's not the bottom row :>

Was there any mention of the Vdd & Vddq voltages for the table?

Definitely not, otherwise you'd never get any decent battery life out of Switch even with settings turned right down in less demanding games, as the RAM would constantly be using a third of Switch's total battery power. Plus you've got phones out there that only use 2.5W total under full load with 4GB of LPDDR4. I'm talking about high-spec phones with 5-6 inch screens and top-end SoCs as well. So I'd think that the RAM is only in the mW power draw range.
 
Haven't Nintendo had this goal since they fucked up with the N64? GameCube was maybe the most developer friendly system of that gen, Wii was basically a GameCube, Wii U was fairly developer friendly once the tools were finished, and with Switch they chose to work with Nvidia to create a developer environment that may be above and beyond any other console.

This is a forest-for-the-trees misapprehension, though. In the context of wider multi-platform development, having a contemporaneous console design whose power is out of line with competitors' can be very developer-unfriendly, even if the generation-to-generation portability is more straightforward. Think art reduction costs, re-design of gameplay systems for divergent CPU capabilities, etc.

NB: I do not speak for the company, etc.
 

KingSnake

The Birthday Skeleton
It's not in the calculation, because I don't know how much power LPDDR4 draws.
The table is certainly an interesting find, but I am not sure how to read it.
K4F6E304HB-MGCH would put us in the "two-channel LPDDR4 at 3200 Mbits/s" column, but how do we know how much energy a bit needs?
I hope it's not the bottom row :>

Was there any mention of the Vdd & Vddq voltages for the table?

Vdd & Vddq for LPDDR4 are 1.8V / 1.1V.

If I understand correctly what I read about it, it can be managed and can be capped.

Here's the accompanying text for the table:

Low-power enhancements and a narrower address bus will reduce the energy required per bit (see the table). Transferring more bits, though, means that the overall power consumption may be higher than an LPDDR3 implementation at the highest speeds of operation. Because LPDDR4 will find a place in the highest-performance, most complex systems, power management becomes even more critical.

To avoid thermal issues, system designers can use strategies such as monitoring the SDRAM's internal die temperature, increasing the refresh rate, and throttling back the SDRAM clock when it is detected that the die is about to overheat. This may only be necessary when the device is used during extremely compute-intensive tasks such as real-time gaming, which draws more power and increases the heat dissipated.

http://electronicdesign.com/power/lpddr4-dram-meets-mobile-power-and-performance-demands

Definitely not, otherwise you'd never get any decent battery life out of Switch even with settings turned right down in less demanding games, as the RAM would constantly be using a third of Switch's total battery power. Plus you've got phones out there that only use 2.5W total under full load with 4GB of LPDDR4. I'm talking about high-spec phones with 5-6 inch screens and top-end SoCs as well. So I'd think that the RAM is only in the mW power draw range.

Smartphones might not need the same kind of performance from the memory in their normal use as a gaming device would.

Still, in handheld mode the memory is probably running in a power-saving mode, but even that mode might be more power hungry than a smartphone's normal use.
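
For anyone wanting to sanity-check the numbers, here's a rough back-of-envelope sketch of how an energy-per-bit figure turns into watts. The pJ/bit value below is an assumed ballpark for LPDDR4, not a number taken from the table being discussed, so treat the result as an order-of-magnitude estimate only.

```python
# Rough estimate of LPDDR4 power draw from an assumed energy-per-bit figure.
# The pJ/bit value is an assumption (typical LPDDR4 ballpark), not from the table above.

BYTES_PER_GB = 1e9

def dram_power_watts(bandwidth_gb_s: float, energy_pj_per_bit: float) -> float:
    """Power = bits transferred per second * energy per bit."""
    bits_per_second = bandwidth_gb_s * BYTES_PER_GB * 8
    return bits_per_second * energy_pj_per_bit * 1e-12  # pJ -> J

peak_bw = 25.6          # GB/s, the reported peak bandwidth (64-bit bus at 3200 MT/s)
energy_per_bit = 5.0    # pJ/bit, assumed ballpark figure

print(f"~{dram_power_watts(peak_bw, energy_per_bit):.2f} W at full bus utilisation")
# -> ~1.02 W, and real games rarely saturate the bus, so sustained draw
#    would typically be a fraction of that (hundreds of mW).
```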
 
It does indeed have a setting to manually adjust resolution.

That said, given the very nature of the Switch's hybrid function, I doubt simply changing the output resolution is the same as the system thinking 'welp, handheld time'. Other settings may still be active that wouldn't be in handheld mode.
No.

Zelda does not have a resolution option. The Switch has an output resolution setting, but that will probably have no effect on the game's internal resolution, which is probably still 900p.
 
No.

Zelda does not have a resolution option. The Switch has an output resolution setting, but that will probably have no effect on the game's internal resolution, which is probably still 900p.

...I was referring to the Switch having the option to change the resolution output. Because the person I was quoting was referring to the Switch, not Breath of the Wild itself.
 

Donnie

Member
Smartphones might not need the same kind of performance from the memory in their normal use as a gaming device would.

Still, in handheld mode the memory is probably running in a power-saving mode, but even that mode might be more power hungry than a smartphone's normal use.

It was only an example to put the idea of 4GB of LPDDR4 possibly using 2W into perspective.

My example, BTW (2.5W total system power draw), was a top-end phone with 4GB of LPDDR4 running the Manhattan benchmark constantly; while the SoC may throttle, I doubt the RAM will. Like I say, it's just an example to put things into perspective, not any kind of definitive info for comparison. I still believe the RAM in Switch will be in the mW range, though.
 

KingSnake

The Birthday Skeleton
It was only an example to put the idea of 4GB of LPDDR4 possibly using 2W into perspective.

My example, BTW (2.5W total system power draw), was a top-end phone with 4GB of LPDDR4 running the Manhattan benchmark constantly; while the SoC may throttle, I doubt the RAM will. Like I say, it's just an example to put things into perspective, not any kind of definitive info for comparison. I still believe the RAM in Switch will be in the mW range, though.

Yes, but 410mW could very well throw the A72-on-20nm scenario out of the maximum range, for example.
 
I dunno if everyone saw this; it was from a Shin'en interview from yesterday on Nintendo Life:

The hardware was already quite advanced when we started working on it. It made a very solid impression right from the start. When we got the final devkits (which are very close to the retail version) it felt simply perfect to play our game on them. From this day we knew the Switch would win the masses. It was simply so satisfying to use all the control options and to switch seamlessly between console and mobile mode.

http://www.nintendolife.com/news/2017/02/feature_fast_rmx_-_the_price_modes_and_performance_of_switchs_futuristic_racer

I dunno if this information adds anything to the Foxconn leak or the Eurogamer frequencies discussion, but I found it interesting.
 

Padinn

Member
That certainly seems to indicate the final dev kits had improved hardware. They also state it took only a year for them to port.
 

Polygonal_Sprite

Gold Member
It looks to me like the framerate drops when Link nears the fire, then again around environmental mist and a final time when you get a look out at the vast draw distance.

Aren't the first two tied to the GPU (alpha effects), which are pretty expensive?

In a console, isn't 20% more GPU power pretty significant? If BotW is GPU limited then I don't think we should discount the possibility that the frame drops are down to the EG clocks being the final clock speeds.

Hopefully it's improved in a patch, but have Nintendo ever had a day one patch that improved framerate before?
 
It looks to me like the framerate drops when Link nears the fire, then again around environmental mist and a final time when you get a look out at the vast draw distance.

Aren't the first two tied to the GPU (alpha effects), which are pretty expensive?

In a console, isn't 20% more GPU power pretty significant? If BotW is GPU limited then I don't think we should discount the possibility that the frame drops are down to the EG clocks being the final clock speeds.

Hopefully it's improved in a patch, but have Nintendo ever had a day one patch that improved framerate before?

Depends. I mean, the suggestion has been tossed about that it might be that even Nintendo aren't fully utilising their hardware here, owing to having to port the game from the Wii U to the Switch. While I'm wary of such a claim given the amount of time taken to develop the game in question (after all, it had originally been scheduled for 2015, and it's not like Nintendo just sat on their asses), without clear knowledge of how the hardware is performing - and what that hardware even is - we're just not gonna know.
 

Pasedo

Member
Question, guys. Using BotW as an example, what percentage of the Switch's total graphics potential would you say it's using? Is BotW pretty much maxing out the Switch hardware?
 

KingSnake

The Birthday Skeleton
I dunno if everyone saw this; it was from a Shin'en interview from yesterday on Nintendo Life:



http://www.nintendolife.com/news/2017/02/feature_fast_rmx_-_the_price_modes_and_performance_of_switchs_futuristic_racer

I dunno if this information adds anything to the Foxconn leak or the Eurogamer frequencies discussion, but I found it interesting.

That talks about the control options and the overall feel of the Switch and docking and undocking. It's quite a stretch to make it about power, not that it surprises me.
 

Hermii

Member
I dunno if everyone saw this; it was from a Shin'en interview from yesterday on Nintendo Life:



http://www.nintendolife.com/news/2017/02/feature_fast_rmx_-_the_price_modes_and_performance_of_switchs_futuristic_racer

I dunno if this information adds anything to the Foxconn leak or the Eurogamer frequencies discussion, but I found it interesting.

"Quite advanced" means nothing.

Question, guys. Using BotW as an example, what percentage of the Switch's total graphics potential would you say it's using? Is BotW pretty much maxing out the Switch hardware?

It's pointless to give a percentage, but BotW is a port from a much older and less capable architecture. The Switch probably has plenty of tricks up its sleeve that would be visible in a game made for it from the ground up.
 
Question, guys. Using BotW as an example, what percentage of the Switch's total graphics potential would you say it's using? Is BotW pretty much maxing out the Switch hardware?

Ports from the last generation don't indicate the power of the new console.

PS3 ports to PS4 are a lot shittier than these newer PS4 games.
 

ksamedi

Member
It's possible even Nintendo didn't know what the final spec would be until the last minute. Maybe devs requested more GPU power and Nintendo just went ahead with it. I don't think first-generation games are indicative of the power.
 

newbong95

Member
Upclock in portable mode (new SDK): 384MHz GPU, and only one memory clock mode, 1331.2MHz, per Digital Foundry.

The key addition is a new mode seemingly designed to beef up handheld performance. Developers can opt for a 384MHz GPU clock - a straight 25 per cent uplift in compute power compared to the default 307.2MHz option. Both frequencies are available to developers in what it calls 'normal mode' operation - and to be clear, users will not be able to choose between them. Additionally, adjustments have been made to available memory bandwidth. In our prior story, we revealed that in undocked mode, developers could choose between running the LPDDR4 memory at either 1600MHz or 1331.2MHz. The 1600MHz option is now only available in 'boost mode' - when Switch is docked - while 1600MHz support in mobile mode is deprecated. As before, developers can opt to run handheld modes while in the dock too, and to be clear, the documentation has no new modes for docked performance. On top of that, we should stress that not all games will use the 384MHz GPU mobile mode - game-makers will choose the best fit for their projects, and 307.2MHz remains the default option.

http://www.eurogamer.net/articles/digitalfoundry-2017-new-performance-mode-boosts-handheld-switch-clocks-by-25-per-cent

So new portable mode: 384 x 2 x 256 = 196.6 GFLOPS
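
For reference, a minimal sketch of where those GFLOPS figures come from: FP32 throughput = clock x 2 ops per FMA x number of CUDA cores. The 768MHz docked clock used below is the widely reported Eurogamer figure and is an assumption here, since it isn't stated in the quoted passage.

```python
# FP32 throughput = clock (Hz) * 2 ops per FMA * number of CUDA cores
def gflops(clock_mhz: float, cuda_cores: int = 256) -> float:
    return clock_mhz * 1e6 * cuda_cores * 2 / 1e9

print(gflops(307.2))  # default portable clock -> ~157.3 GFLOPS
print(gflops(384.0))  # new portable option    -> ~196.6 GFLOPS
print(gflops(768.0))  # docked clock (assumed, widely reported) -> ~393.2 GFLOPS
```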
 

KingSnake

The Birthday Skeleton
http://www.eurogamer.net/articles/d...-boosts-handheld-switch-clocks-by-25-per-cent

However, as much as we want the Foxconn clocks to be real, the weight of evidence is stacking up against this aspect of the leak. To maintain meaningful battery life with those clocks, we'd need to be looking at a 16nm FinFET chip and maybe even a new revision of the ARM CPU cores, and the Chinese teardown of the processor confirms that the physical size of the chip is seemingly unchanged from existing 20nm Tegra X1 SoC.

The difference between 16nm and 20nm isn't actually about transistor size, but more about the 3D 'FinFET' transistors on the lower node. A 16nm SoC would be approximately the same size as the existing 20nm Tegra X1, but the difference here is that the teardown reveals a processor with seemingly identical dimensions. Also interesting is that the processor is surrounded by the same surface-mounted arrangement of what are likely to be decoupling capacitors, there to reduce noise on the power lines. The initial conclusion we have is that we are looking at a more lightly modified X1, still on the 20nm process, which ties in more closely with the clocks we reported - and indeed every non-Foxconn spec leak seen to date.
 

Pasedo

Member
In usual Nintendo fashion I can see where they're going with it. You'd get some amazing games that perform at 60fps using the typical Nintendo art style. Problem is that 3rd party games go for the more realistic approach, and I've been telling myself that it's never going to be for that. When I tried it at a demo booth it felt like an arcade machine. Intense, fun, vibrant visuals, but not super realistic graphics. I feel if it succeeds, 3rd party devs will take a different approach: not release their serious realistic games on it, and maybe even recreate simpler, more basic games dedicated to the platform. It will also be where indies will truly flourish. So basically, if a game doesn't have a similar art direction, which means simpler, less intensive graphics, it won't release on the Switch. For example, Borderlands will go on Switch but Battlefield will not, etc. Sorry, late here and probably ranting and derailing this a bit. Lol.
 
Upclock in portable mode (new SDK): 384MHz GPU, and only one memory clock mode, 1331.2MHz, per Digital Foundry.

So new portable mode: 384 x 2 x 256 = 196.6 GFLOPS

So officially >Wii U in pure flops.

I like their theory that Nintendo decided Zelda essentially needed this boost, that seems likely.
 

Zedark

Member
So officially >Wii U in pure flops.

I like their theory that Nintendo decided Zelda essentially needed this boost, that seems likely.

But now the docked mode seems to lag behind for the straight 720p to 1080p conversion. I would have expected this increase to be combined with an increase in docked speed as well.
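
A quick sketch of the mismatch being described: going from 720p to 1080p multiplies the pixel count by 2.25x, while the new 384MHz portable option against the widely reported 768MHz docked clock (an assumption here, since the docked figure isn't in the article quoted above) is only a 2x jump.

```python
portable_pixels = 1280 * 720   # 720p handheld target
docked_pixels = 1920 * 1080    # 1080p docked target

pixel_ratio = docked_pixels / portable_pixels   # 2.25x more pixels
clock_ratio = 768 / 384                         # 2.0x more GPU clock (768MHz assumed)

print(f"pixels: {pixel_ratio:.2f}x, clock: {clock_ratio:.2f}x")
# If a game treats the 384MHz portable option as its baseline, docked mode gets
# proportionally less GPU time per pixel, which is the "lagging behind" above.
```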
 
But now the docked mode seems to lag behind for the straight 720p to 1080p conversion. I would have expected this increase to be combined with an increase in docked speed as well.

Which could explain why Zelda is unstable at 900p.

Especially if they have upped the landscape effects/draw distance over portable mode (which makes sense to me, as the portable screen is smaller).
 
Question, guys. Using BotW as an example, what percentage of the Switch's total graphics potential would you say it's using? Is BotW pretty much maxing out the Switch hardware?

Some desperately want to believe that, but No. Not at all. It's a "from-the-ground-up" Wii U title, and Nintendo told everybody that it would be the same experience on the Switch. So, the resolution, etc., are design choices, and those playing the Wii U version aren't shortchanged. That would be like saying Mario Kart 8 Deluxe Edition pushed the Switch to its limits.
 

Zedark

Member
Which could explain why Zelda is unstable at 900p.

Especially if they have upped the landscape effects/draw distance over portable mode (which makes sense to me, as the portable screen is smaller).

Might be; the big screen definitely needs higher settings for those effects. I wonder if it is enough to explain the instability of the frame rate, though, since the change isn't that drastic, and they aren't pushing it at 1080p like Mario Kart, but rather at 900p.

Also, what about titles like Mario Kart? Will they just underutilise the handheld mode to keep the 1080p docked mode stable? The new clock speed is a suggested spec, so that would be a reasonable assumption, right?
 

Oregano

Member
Might be; the big screen definitely needs higher settings for those effects. I wonder if it is enough to explain the instability of the frame rate, though, since the change isn't that drastic, and they aren't pushing it at 1080p like Mario Kart, but rather at 900p.

Also, what about titles like Mario Kart? Will they just underutilise the handheld mode to keep the 1080p docked mode stable? The new clock speed is a suggested spec, so that would be a reasonable assumption, right?

Sure, why not?

Both 3DS and Vita (and PSP) had the ability to use more system resources, which a minority of games used.
 

Hmm. I still think Nintendo has got a customized 16nm Tegra in the final retail unit. Otherwise, it really won't make much sense and isn't in line with what Nintendo wants to achieve with this device.

This device combines handheld and console. A processor based on new 16nm technology allows for higher clocks than 20nm, and therefore more power at the same power consumption.

For the future of the device and its concept it's absolutely necessary to have a Tegra based on the 16nm process, because this is the technology that further reduces the gap between home console and mobile device. This applies to a 128-bit memory bus as well; 64-bit and 25 GB/s memory bandwidth wouldn't be enough for a home console these days (remember, Nintendo is marketing it as a home console on the go), even with the magic sauce some people here are dreaming of.

If it really is still based on the old 20nm process, the concept of the Switch is not utilized to its full potential and I can't help but feel a little bit bad about paying 320€ for older hardware.

This does not mean that the Switch isn't an amazing handheld or that it's a waste of money, but Nintendo and Nvidia could have done more to fulfill the concept of building a handheld console hybrid if the Tegra really is based on the 20nm process.
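
For context on the 25 GB/s figure in the post above, here's a minimal sketch of how peak bandwidth follows from bus width and transfer rate, assuming LPDDR4 at 3200 MT/s (and 2662.4 MT/s for the 1331.2MHz handheld memory clock mentioned earlier in the thread):

```python
# Peak DRAM bandwidth = transfers per second * bytes moved per transfer
def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: float) -> float:
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(64, 3200))     # -> 25.6 GB/s, the 64-bit bus in the post
print(peak_bandwidth_gb_s(64, 2662.4))   # -> ~21.3 GB/s at the 1331.2MHz handheld clock
print(peak_bandwidth_gb_s(128, 3200))    # -> 51.2 GB/s, the hoped-for 128-bit bus
```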
 
This seems to be a sort of confirmation of a 16nm chip being in the thing: with the new process node they were able to up the clocks based on the same multiplier, with a relatively similar ratio (around 2.4 this time) between docked and portable mode, which obviously can't be exactly the same.

Zelda might be running at 900p for any number of reasons; I'd wait until launch and a day one patch.

Also, we don't know what's inside the new Shield TV, and even if it (probably) is on 20nm, the outside of the chip combined with identical transistor densities doesn't let us tell whether the chips are different or not.
 

Zedark

Member
Hmm. I still think Nintendo has got a customized 16nm Tegra in the final retail unit. Otherwise, it really won't make much sense and isn't in line with what Nintendo wants to achieve with this device.

This device combines handheld and console. A processor based on new 16nm technology allows for higher clocks than 20nm, and therefore more power at the same power consumption.

For the future of the device and its concept it's absolutely necessary to have a Tegra based on the 16nm process, because this is the technology that further reduces the gap between home console and mobile device. This applies to a 128-bit memory bus as well; 64-bit and 25 GB/s memory bandwidth wouldn't be enough for a home console these days (remember, Nintendo is marketing it as a home console on the go), even with the magic sauce some people here are dreaming of.

If it really is still based on the old 20nm process, the concept of the Switch is not utilized to its full potential and I can't help but feel a little bit bad about paying 320€ for older hardware.

This does not mean that the Switch isn't an amazing handheld or that it's a waste of money, but Nintendo and Nvidia could have done more to fulfill the concept of building a handheld console hybrid if the Tegra really is based on the 20nm process.

Yeah, the Eurogamer article does a bad job of explaining why they think it is 20nm, basically saying 'because it looks like a Tegra X1 SoC', even though that isn't a sufficient condition. They could also increase the bandwidth of the system by adding modifications; Nintendo has been doing that for a while. Neither of those things is (sufficiently) contradicted by this new info.
 
So Eurogamer was basically explaining the teardown from last week, plus got some info from a reliable source that there is a 25% performance boost for the portable GPU mode.

So err... we have an actual teardown of a retail Switch, right?
 

KingSnake

The Birthday Skeleton
So Eurogamer was basically explaining the teardown from last week, plus got some info from a reliable source that there is a 25% performance boost for the portable GPU mode.

So err... we have an actual teardown of a retail Switch, right?

Eurogamer practically takes it that the teardown is valid for what's in a retail Switch, yeah.
 

KingSnake

The Birthday Skeleton
Hmm. I still think Nintendo has got a customized 16nm Tegra in the final retail unit. Otherwise, it really won't make much sense and isn't in line with what Nintendo wants to achieve with this device.

This device combines handheld and console. A processor based on new 16nm technology allows for higher clocks than 20nm, and therefore more power at the same power consumption.

For the future of the device and its concept it's absolutely necessary to have a Tegra based on the 16nm process, because this is the technology that further reduces the gap between home console and mobile device. This applies to a 128-bit memory bus as well; 64-bit and 25 GB/s memory bandwidth wouldn't be enough for a home console these days (remember, Nintendo is marketing it as a home console on the go), even with the magic sauce some people here are dreaming of.

If it really is still based on the old 20nm process, the concept of the Switch is not utilized to its full potential and I can't help but feel a little bit bad about paying 320€ for older hardware.

This does not mean that the Switch isn't an amazing handheld or that it's a waste of money, but Nintendo and Nvidia could have done more to fulfill the concept of building a handheld console hybrid if the Tegra really is based on the 20nm process.

There might not be a usable Tegra SoC at 16nm, plain and simple. Nvidia themselves seem to be skipping Parker and quickly moving to Xavier.
 

KtSlime

Member
It doesn't bother me one way or another if it is 20nm or 16nm, and I have no technical evidence, but in my gut I want to say it is 16nm. Why else would Nintendo choose this launch timing and wait to deliver the final dev machines, if they weren't waiting on a particular change in tech?

Zelda has been done for a while now, and I am of the opinion that Pokemon Sun and Moon were back-ported to 3DS from the work they did on a Switch game just so they wouldn't miss their anniversary. If Nintendo was waiting to launch the Switch until they got more third-party support, they would have gotten the final dev machines out sooner.

Are there any alternative theories as to why the Switch's design was finalized so late?
 