
Digital Foundry: Nintendo Switch CPU and GPU clock speeds revealed


ty_hot

Member
Is there any speculation about the OS? It seems to me the system has improved a lot more in the RAM department than in the other parts (CPU/GPU), so maybe they're preparing an OS with some better features?
 

Hermii

Member
Technically, yes it's superior to the Wii U. Practically, that doesn't mean shit because the Wii U CPU is inferior to average smartphones these days. At the end of the day, they may as well have not even bothered; they could have used an 8-core Zen and it still wouldn't compensate for the weaksauce GPU and terrible bandwidth bottlenecks of this thing. It reeks of classic Nintendo cheapness.
We still don't know much about the memory subsystem. If there is one positive thing you can say about Nintendo hardware since the GameCube, it's that it's not bandwidth or memory starved, and I don't believe Switch will be either. Weak as fuck, sure, but with adequate memory.
 
It's much more than Wii U at 1080p, though. For example, can you imagine what Monolith can do with 3.2x the RAM, a more modern shader system, and a much stronger CPU?

Hopefully eliminate pop-in and get Xenoblade X at 1080p docked. Too bad docked mode is there just to increase resolution, so whatever they can do, they have to manage it with a GPU 1.5x as powerful. Maybe improved textures... maybe. God, I hope they add some kind of legit online multiplayer mode.
 

Two Words

Member
All that sacrifice just so you can play Zelda on the go. I just don't see this device taking off. Nintendo is basically taking on a very mature mobile electronics industry at this point.
 
We still don't know much about the memory subsystem. If there is one positive thing you can say about Nintendo hardware since the GameCube, it's that it's not bandwidth or memory starved, and I don't believe Switch will be either. Weak as fuck, sure, but with adequate memory.

It's funny how our standards have lowered over the months... from PS4 level, to Xbone, to half an Xbone, to... this. lol

Yeah, as you said, memory bandwidth isn't confirmed. Yet.
 
We still don't know much about the memory subsystem. If there is one positive thing you can say about Nintendo hardware since the GameCube, it's that it's not bandwidth or memory starved, and I don't believe Switch will be either. Weak as fuck, sure, but with adequate memory.

It allegedly has 25GB/s of memory bandwidth; how can that not be an impossible bottleneck for 1080p? That's Xbox 360 levels of memory bandwidth we're talking about here...
 
Hopefully eliminate pop-in and get Xenoblade X at 1080p docked. Too bad docked mode is there just to increase resolution, so whatever they can do, they have to manage it with a GPU 1.5x as powerful. Maybe improved textures... maybe. God, I hope they add some kind of legit online multiplayer mode.
The CPU and RAM don't change, so advantages like AI, physics, and multiplayer elements will still be there in handheld mode.
It allegedly has 25GB/s of memory bandwidth; how can that not be an impossible bottleneck for 1080p? That's Xbox 360 levels of memory bandwidth we're talking about here...
In other words, Nintendo may have customized the memory setup like they have always done since the GCN. Maxwell/Pascal also already has an "automagic" tile rendering system to assist with those things.
 

NateDrake

Member
I would still love to play Bloodstained: Ritual of the Night on Switch 😃

Amen to that.
 
The CPU and RAM don't change, so advantages like AI, physics, and multiplayer elements will still be there in handheld mode.

In other words, Nintendo may have customized the memory setup like they have always done since the GCN. Maxwell/Pascal also already has an "automagic" tile rendering system to assist with those things.

In other words... it's going to be GCN --> Wii specs,

or maybe what the New 3DS could have been if the GPU was 1.5x better than the Wii instead of being around GCN specs for graphics (while having the RAM and CPU boosts).
 
In other words... it's going to be GCN --> Wii specs,

or maybe what the New 3DS could have been if the GPU was 1.5x better than the Wii instead of being around GCN specs for graphics (while having the RAM and CPU boosts).
I would probably compare it more to XB1 --> Scorpio due to the architectural change.
 

Fox_Mulder

Rockefellers. Skull and Bones. Microsoft. Al Qaeda. A Cabal of Bankers. The melting point of steel. What do these things have in common? Wake up sheeple, the landfill wasn't even REAL!
Can we expect Switch games to also be announced for Wii U (based on the similar power)?
 

Putty

Member
I simply can't get excited for this... the whole design is simply not me... and I'm all for "powah", which these specs clearly lack. I fear for it, truth be told...
 

crinale

Member
It's also worth pointing out that even if Nintendo found some sweet deal and could buy kickass batteries at a reasonable price, the heat dissipation problem can never be avoided.
Say you get a magical battery that can dish out 100 watts for 6 hours, and you design the handheld to use all of that potential: that 100W ultimately ends up heating the device and its surroundings, 100% of it.
Even if you could invent an ideal fan that makes zero noise, that never means the device produces less heat.
So I won't be surprised if Nintendo understood this limitation and deliberately decided how much processing power they could go with, given today's semiconductor fab standards.
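To put rough numbers on that point (the 100W/6h device is the hypothetical from above, not a real Switch figure):

```python
# Back-of-the-envelope: every watt drawn from the battery ends up as
# heat, regardless of battery quality or fan noise. The 100 W / 6 h
# device is the hypothetical from the post above, not a real spec.

power_w = 100        # hypothetical sustained draw
runtime_h = 6        # hypothetical battery life target

battery_wh = power_w * runtime_h
print(f"battery required: {battery_wh} Wh")  # 600 Wh, roughly 10x a large laptop pack

heat_kj = battery_wh * 3.6                   # 1 Wh = 3.6 kJ
print(f"heat released over {runtime_h} h: {heat_kj:.0f} kJ")  # 2160 kJ
```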
 

Tyaren

Member
Sooo to sum this up, the Switch has a minor power boost over the Wii U when docked, and is about PS3/Xbox 360 level when used as a handheld?
 
So how much more powerful than a Vita is it?

I would market the Switch as a portable system. If Nintendo compared the specs of a 3DS to a Switch, it would sound much more impressive.
 

Mato

Member
I think the power is fine. Depending on the price point it can still prove to offer a great value when you consider all the different ways you can play this thing, plus all the high quality Nintendo games.
 
The same way the Wii U and 3DS aren't: embedded RAM.

In other words, Nintendo may have customized the memory setup like they have always done since the GCN. Maxwell/Pascal also already has an "automagic" tile rendering system to assist with those things.

Nintendo hasn't done hardware magic with the Wii U, and eDRAM/eSRAM is not a magic bullet, as seen in the 360/Bone. The reason why bandwidth hasn't been a glaring issue with the Wii U is that everything else was worse. Bringing the GCN into the argument is, in my opinion, kind of irrelevant - it's the product of a completely different era with completely different bottlenecks.
 
All that sacrifice just so you can play Zelda on the go. I just don't see this device taking off. Nintendo is basically taking on a very mature mobile electronics industry at this point.

Lol, might as well call laptops bullshit because they typically need to sacrifice GPU power just so you can work on the go.
You simply don't get that there is a non-vanishing handheld market, of which you just aren't a part.
This device will still easily outsell the Xbox One + Xbox Scorpio in 2017 and onward.
 
Trying to make sense of all the comparisons over the most recent 20 pages, lol. So what is it, then?

A little above Wii U portable and comparable to a Wii U Pro when docked

Enlighten us.


Edit: saw the post above... but that's pretty much what he said?

It's not a minor power boost docked and it's definitely not PS3 level when portable?

Thanks. Was I really that far off the mark?

PS3 level is pretty low for what it can do, I think. It kinda surprised me, since I had yet to see that comparison in this thread.
 
I would probably compare it more to XB1 --> Scorpio due to the architectural change.

That's pretty big. Scorpio is almost 4.5x as powerful as the base Xbone in GPU, though. If Switch were around 500 GFLOPS, it would be closer, I think.
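For reference, the ratios with public figures (the Wii U number is the commonly cited community estimate, not an official spec):

```python
# GPU FP32 throughput ratios. Xbone/Scorpio figures are the announced
# ones; the Wii U figure is the usual community estimate (160 shaders
# @ 550 MHz), not an official spec.
xbone_gf, scorpio_gf, wiiu_gf = 1310, 6000, 176

print(f"Xbone -> Scorpio: {scorpio_gf / xbone_gf:.1f}x")   # ~4.6x
print(f"Wii U -> 500 GF Switch: {500 / wiiu_gf:.1f}x")     # ~2.8x
```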

Speaking of architectures, how different is the architecture of the Wii U GPU vs. the Xbone and PS4? By that I mean, do they perform similarly flop for flop? When people talk about Wii U vs. Xbone/PS4, everyone has always just compared paper specs. Obviously the latter are more modern... but I wonder if the gap is as pronounced as Nvidia apparently performing significantly better per flop than AMD.

How exactly did you get the number that Switch is 1.5x more than Wii U's GPU, btw? You said architectural differences, but where did you get that Switch's Nvidia GPU is 1.5x better per flop than Wii U's GPU?
 

Hermii

Member
Nintendo hasn't done hardware magic with the Wii U, and eDRAM/eSRAM is not a magic bullet, as seen in the 360/Bone. The reason why bandwidth hasn't been a glaring issue with the Wii U is that everything else was worse. Bringing the GCN into the argument is, in my opinion, kind of irrelevant - it's the product of a completely different era with completely different bottlenecks.
I think the main issue with the Xbone's eSRAM is that there isn't enough of it. The Wii U has 3 times the embedded memory the 360 has, at about the same power level. The Xbone should have had 64MB.
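For a rough sense of why 32MB gets tight at 1080p, a back-of-the-envelope with typical buffer formats (my assumptions, not confirmed specs for any of these machines):

```python
# Rough framebuffer budget at 1080p, to show why 32 MB of embedded
# memory gets tight. Buffer formats are typical choices, not specs.
width, height = 1920, 1080
mb = 1024 * 1024

color = width * height * 4 / mb   # 32-bit RGBA color target, ~7.9 MB
depth = width * height * 4 / mb   # 32-bit depth/stencil, ~7.9 MB
print(f"color + depth: {color + depth:.1f} MB")  # ~15.8 MB of 32 MB

# A deferred G-buffer (say 3 extra 32-bit render targets) blows past 32 MB:
gbuffer = 3 * width * height * 4 / mb
print(f"with G-buffer: {color + depth + gbuffer:.1f} MB")  # ~39.6 MB > 32 MB
```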
 
So how much more powerful than a Vita is it?

I would market the Switch as a portable system. If Nintendo compared the specs of a 3DS to a Switch, it would sound much more impressive.
It IS a portable system. It just has a dock. Anyone thinking otherwise is fooling themselves. Nintendo has left the home console market.
 
So this thing will cost 99 USD to produce, right?

Considering the iPhone 7 is much more powerful, uses a much better screen, and costs like 230 USD
 
For the thick layman, just how good is this thing?
No one knows until January?
At the very least the most kickass handheld ever; more than a Wii U in portable mode, not an Xbox One when docked.

It IS a portable system. It just has a dock. Anyone thinking otherwise is fooling themselves. Nintendo has left the home console market.

Anyone thinking the categories of handheld versus home console gaming mean anything is fooling themselves. Games are games.
 

daffy

Banned
Hopefully the games will be $50 lol

You guys are gonna be in for a rude awakening when Diddy Kong and Samus Returns are announced in January
 

EVH

Member
What about Unreal Engine 4 support? I mean, the Wii U could only handle Unreal Engine 3...

UE4 is a highly scalable engine that was already ported to the Tegra SoC, so basically they had to do very little work to support Switch. It's not a question of power - in this case - it's a question of architecture and of work already being done.
 
Technically, yes it's superior to the Wii U. Practically, that doesn't mean shit because the Wii U CPU is a POS 1.2GHz 32-bit triple-core PPC akin to what you find in an Xbox 360 and effectively inferior to budget smartphone CPUs these days. It's not a generation old but in fact two generations old. At the end of the day, they may as well have not even bothered; they could have used an 8-core Zen and it still wouldn't compensate for the weaksauce GPU (2 SMs? REALLY?) and terrible bandwidth (Xbox 360 levels at 25GB/s - WTF?!?!?!) bottlenecks of this thing. It reeks of classic Nintendo cheapness.

The goal is to be cheap though.

The target will be $250 with a decent battery that doesn't overheat.

You can't release a unit that cheap in 2017 that lasts more than 2 hours and doesn't melt in your hands, unless you're willing to sell a unit which is much more expensive than that.

Regardless of what Nintendo says, this is a portable system with a dock so you can play the games on the TV.

Realistically, you don't release a portable system with the same power as home consoles of the same generation. That has never happened and never will.
 

The_Lump

Banned
Technically, yes it's superior to the Wii U. Practically, that doesn't mean shit because the Wii U CPU is a POS 1.2GHz 32-bit triple-core PPC akin to what you find in an Xbox 360 and effectively inferior to budget smartphone CPUs these days. It's not a generation old but in fact two generations old. At the end of the day, they may as well have not even bothered; they could have used an 8-core Zen and it still wouldn't compensate for the weaksauce GPU (2 SMs? REALLY?) and terrible bandwidth (Xbox 360 levels at 25GB/s - WTF?!?!?!) bottlenecks of this thing. It reeks of classic Nintendo cheapness.

You're extrapolating far too much from just the clock speeds. We are still missing some key info here.

3rd time I've quoted it, but have a read of Thraktor's post.

I haven't had time to read through every response here, so I'm probably repeating what others have already said, but here are my thoughts on the matter, anyway:

CPU Clock

This isn't really surprising, given that (as predicted) CPU clocks stay the same between portable and docked mode to make sure games don't suddenly become CPU-limited when running in portable mode.

The overall performance really depends on the core configuration. An octo-core A72 setup at 1GHz would be pretty damn close to PS4's 1.6GHz 8-core Jaguar CPU. I don't necessarily expect that, but a 4x A72 + 4x A53 @ 1GHz should certainly be able to provide "good enough" performance for ports, and wouldn't be at all unreasonable to expect.

Memory Clock

This is also pretty much as expected, as 1.6GHz is the standard LPDDR4 clock speed (which I guess confirms LPDDR4, not that there was a huge amount of doubt). Clocking down in portable mode is sensible, as lower resolution means smaller framebuffers, which means less bandwidth needed, so they can squeeze out a bit of extra battery life by cutting it down.

Again, though, the clock speed is only one factor. There are two other things that can come into play here. The second factor, obviously enough, is the bus width of the memory. Basically, you're either looking at a 64 bit bus, for 25.6GB/s, or a 128 bit bus, for 51.2GB/s of bandwidth. The third is any embedded memory pools or cache that are on-die with the CPU and GPU. Nintendo hasn't shied away from large embedded memory pools or cache before (just look at the Wii U's CPU, its GPU, the 3DS SoC, the n3DS SoC, etc., etc.), so it would be quite out of character for them to avoid such customisations this time around. Nvidia's GPU architectures from Maxwell onwards use tile-based rendering, which allows them to use on-die caches to reduce main memory bandwidth consumption, which ties in quite well with Nintendo's habits in this regard. Something like a 4MB L3 victim cache (similar to what Apple uses on their A-series SoCs) could potentially reduce bandwidth requirements by quite a lot, although it's extremely difficult to quantify the precise benefit.
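To make the bus-width arithmetic explicit (LPDDR4 transfers twice per clock):

```python
# Bandwidth from the reported 1.6 GHz memory clock. LPDDR4 is double
# data rate, so 1600 MHz -> 3200 MT/s; bus width is the open question.
clock_hz = 1600e6
transfers_per_s = clock_hz * 2        # DDR: two transfers per clock

for bus_bits in (64, 128):
    bw = transfers_per_s * (bus_bits / 8) / 1e9
    print(f"{bus_bits}-bit bus: {bw:.1f} GB/s")
# 64-bit bus: 25.6 GB/s
# 128-bit bus: 51.2 GB/s
```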

GPU Clock

This is where things get a lot more interesting. To start off, the relationship between the two clock speeds is pretty much as expected. With a target of 1080p in docked mode and 720p in undocked mode, there's a 2.25x difference in pixels to be rendered, so a 2.5x difference in clock speeds would give developers a roughly equivalent amount of GPU performance per pixel in both modes.
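The arithmetic, using the reported clocks:

```python
# Pixel-count ratio vs clock ratio, per the Eurogamer-reported clocks.
docked_px = 1920 * 1080
portable_px = 1280 * 720
px_ratio = docked_px / portable_px        # 2.25x the pixels at 1080p

docked_mhz, portable_mhz = 768.0, 307.2
clk_ratio = docked_mhz / portable_mhz     # 2.5x the GPU clock

print(f"{px_ratio:.2f}x pixels, {clk_ratio:.2f}x clock")        # 2.25x, 2.50x
print(f"docked per-pixel budget: {clk_ratio / px_ratio:.2f}x")  # ~1.11x
```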

Once more, though, and perhaps most importantly in this case, any interpretation of the clock speeds themselves is entirely dependent on the configuration of the GPU, namely the number of SMs (also ROPs, front-end blocks, etc, but we'll assume that they're kept in sensible ratios).
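For reference, the arithmetic behind the case figures below: a Maxwell SM has 128 CUDA cores, each doing 2 FLOPs per cycle (fused multiply-add), and Tegra-class Maxwell runs FP16 at double rate. Note that the case figures work out to rounded 750MHz/300MHz clocks; the reported 768MHz/307.2MHz clocks land slightly higher.

```python
# GFLOPS = SMs * 128 cores * 2 FLOPs/cycle * clock. FP16 is double
# rate on Tegra-class Maxwell (2-wide vector ops).
def gflops(sms: int, clock_mhz: float) -> float:
    return sms * 128 * 2 * clock_mhz / 1000   # FP32 GFLOPS

for sms in (2, 3, 4):
    docked, portable = gflops(sms, 750), gflops(sms, 300)
    print(f"{sms} SMs: {docked:.1f} GF FP32 docked / {portable:.1f} portable "
          f"(FP16: {2 * docked:.1f} / {2 * portable:.1f})")

# With the reported 768/307.2 MHz clocks instead, 2 SMs gives
# ~393.2 GF docked / ~157.3 GF portable FP32.
```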

Case 1: 2 SMs - Docked: 384 GF FP32 / 768 GF FP16 - Portable: 153.6 GF FP32 / 307.2 GF FP16

I had generally been assuming that 2 SMs was the most likely configuration (as, I believe, had most people), simply on the basis of allowing for the smallest possible SoC which could meet Nintendo's performance goals. I'm not quite so sure now, for a number of reasons.

Firstly, if Nintendo were to use these clocks with a 2 SM configuration (assuming 20nm), then why bother with active cooling? The Pixel C runs a passively cooled TX1, and although people will be quick to point out that Pixel C throttles its GPU clocks while running for a prolonged time due to heat output, there are a few things to be aware of with Pixel C. Firstly, there's a quad-core A57 CPU cluster at 1.9GHz running alongside it, which on 20nm will consume a whopping 7.39W when fully clocked. Switch's CPU might be expected to only consume around 1.5W, by comparison. Secondly, although I haven't been able to find any decent analysis of Pixel C's GPU throttling, the mentions of it I have found indicate that, although it does throttle, the drop in performance is relatively small, and as it's clocked about 100MHz above Switch to begin with it may only be throttling down to a 750MHz clock or so even under prolonged workloads. There is of course the fact that Pixel C has an aluminium body to allow for easier thermal dissipation, but it likely would have been cheaper (and mechanically much simpler) for Nintendo to adopt the same approach, rather than active cooling.

Alternatively, we can think of it a different way. If Switch has active cooling, then why clock so low? Again assuming 20nm, we know that a full 1GHz clock shouldn't be a problem for active cooling, even with a very small quiet fan, given the Shield TV (which, again, uses a much more power-hungry CPU than Switch). Furthermore, if they wanted a 2.5x ratio between the two clock speeds, that would give a 400MHz clock in portable mode. We know that the TX1, with 2 SMs on 20nm, consumes 1.51W (GPU only) when clocked at about 500MHz. Even assuming that that's a favourable demo for the TX1, at 20% lower clock speed I would be surprised if a 400MHz 2 SM GPU would consume any more than 1.5W. That's obviously well within the bounds for passive cooling, but even being very conservative with battery consumption it shouldn't be an issue. The savings from going from 400MHz to 300MHz would perhaps only increase battery life by about 5-10% tops, which makes it puzzling why they'd turn down the extra performance.
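The "5-10% tops" claim is easy to sanity-check with rough numbers (all of these are illustrative assumptions, not measurements):

```python
# Battery-life delta from dropping the GPU from 400 MHz to 300 MHz.
# Every figure here is an assumption for illustration only.
gpu_400 = 1.5      # W, upper bound at 400 MHz (from the TX1 data point above)
gpu_300 = 1.1      # W, assuming roughly linear scaling at a fixed voltage

system_rest = 4.0  # W, assumed screen + CPU + RAM + radios + losses

total_400 = system_rest + gpu_400   # 5.5 W
total_300 = system_rest + gpu_300   # 5.1 W

# Battery life scales inversely with total draw:
gain = total_400 / total_300 - 1
print(f"battery life gain from the downclock: ~{gain:.0%}")  # ~8%
```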

Finally, the recently published Switch patent application actually explicitly talks about running the fan at a lower RPM while in portable mode, and doesn't even mention the possibility of turning it off while running in portable mode. A 2 SM 20nm Maxwell GPU at ~300MHz shouldn't require a fan at all, and although it's possible that they've changed their mind since filing the patent in June, it begs the question of why they would even consider running the fan in portable mode if their target performance was anywhere near this.

Case 2: 3 SMs - Docked: 576 GF FP32 / 1,152 GF FP16 - Portable: 230.4 GF FP32 / 460.8 GF FP16

This is a bit closer to the performance level we've been led to expect, and it does make a little bit of sense from the perspective of giving a little bit over TX1 performance at lower power consumption. (It also matches reports of overclocked TX1s in early dev kits, as you'd need to clock a bit over the standard 1GHz to reach docked performance here.) Active cooling while docked makes sense for a 3 SM GPU at 768MHz, although it wouldn't be needed in portable mode. It still leaves the question of why not use 1GHz/400MHz clocks, as even with 3 SMs they should be able to get by with passive cooling at 400MHz, and battery consumption shouldn't be that much of an issue.

Case 3: 4 SMs - Docked: 768 GF FP32 / 1,536 GF FP16 - Portable: 307.2 GF FP32 / 614.4 GF FP16

This would be on the upper limit of what's been expected, performance-wise, and the clock speeds start to make more sense at this point, as portable power consumption for the GPU would be around the 2W mark, so further clock increases may start to affect battery life a bit too much (not that 400-500MHz would be impossible from that point of view, though). Active cooling would be necessary in docked mode, but still shouldn't be needed in portable mode (except perhaps if they go with a beefier CPU config than expected).

Case 4: More than 4 SMs

I'd consider this pretty unlikely, but just from the point of view of "what would you have to do to actually need active cooling in portable mode at these clocks", something like 6 SMs would probably do it (1.15 TF FP32/2.3 TF FP16 docked, 460 GF FP32/920 GF FP16 portable), but I wouldn't count on that. For one, it's well beyond the performance levels that reliable-so-far journalists have told us to expect, but it would also require a much larger die than would be typical for a portable device like this (still much smaller than PS4/XBO SoCs, but that's a very different situation).

TL;DR

Each of these numbers is only a single variable in the equation, and we need to know things like CPU configuration, memory bus width, embedded memory pools, number of GPU SMs, etc. to actually fill out the rest of those equations and get the relevant info. Even on the worst end of the spectrum, we're still getting by far the most ambitious portable that Nintendo's ever released, which also doubles as a home console that's noticeably higher performing than Wii U, which is fine by me.
 

AgeEighty

Member
The most discouraging part of the report is this:

Eurogamer said:
One developer source likens this to creating two different versions of the same game - almost like producing a PS4 game and a PS4 Pro variant. At the very least, QA will require titles to be tested thoroughly in both configurations, plus a lot of thought will be going into exactly how to utilise GPU power in each mode.

Because it means they're effectively creating a development bottleneck for game porting that's not likely to be too attractive to developers. I don't know precisely how large that bottleneck will be, but any bottleneck is bad.

The most encouraging part is this:

Eurogamer said:
We should also remember that Nvidia has produced a bespoke software layer that should allow developers to get much, much more from the processor compared to what we've seen Tegra achieve in the Android-powered Shield console.

Hopefully the actual output we see won't be as bad as these specs seem to suggest.

Whatever faith people may or may not have in Nintendo's hardware, I'm pretty confident that Nvidia can spin gold out of straw.

Each of these numbers is only a single variable in the equation, and we need to know things like CPU configuration, memory bus width, embedded memory pools, number of GPU SMs, etc. to actually fill out the rest of those equations and get the relevant info. Even on the worst end of the spectrum, we're still getting by far the most ambitious portable that Nintendo's ever released, which also doubles as a home console that's noticeably higher performing than Wii U, which is fine by me.

Yeah, honestly that sounds pretty good to me.
 
For the thick layman, just how good is this thing?

It's basically a fairly modest home console (trying to be generous here) but also the best handheld console hardware available (which isn't saying much, because there are no competitors here, and adjacent segments already have devices like the Shield out at twice the performance without active cooling). Overall, it seems like a very confused and inadequate piece of hardware that is overreliant on a gimmick. If that reminds you of the Wii U, well, it probably should.
 