
Digital Foundry: Nintendo Switch CPU and GPU clock speeds revealed

Status
Not open for further replies.
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.
 

KingSnake

The Birthday Skeleton
Because I'd imagine if their target rendercost is 32ms and after tweaking they hit that on PS4, but with the same values it's 32.4ms on X1, reducing render resolution in that equation by 1% is trivial, because the difference between 1920x1080 and a scaled 1920x900 is barely discernible to an average consumer.

the difference between 1280x720 and 1280x540 is fucking noticeable

So your most important argument is that the devs will do it just because it's noticeable, although they have cheaper ways to reach the same result.

Also, what about this:

Deferred rendering hits GPU bandwidth hard. The lighting pass fetches anywhere from 3 to 5+ textures, for every pixel on the screen, for every light. That's a lot of bandwidth.

This hurts mobile GPUs, an increasingly important segment, more than others. Yes, they're low-power chips, but they still have some shader power behind them. Deferred rendering on such platforms is going to hurt. This is particularly true for PowerVR-based platforms (the current most popular mobile GPUs), as their tile-based rendering system already gets many of the advantages of deferred rendering.

We already know that the TX1's memory bandwidth is 25GB/s. How does this fit in your magic?
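For what it's worth, the bandwidth concern in the quoted post can be put in rough numbers. This is a back-of-envelope sketch, not a measured figure: the G-buffer texture count, bytes per texel, light count and framerate are all illustrative assumptions.

```python
def lighting_pass_bandwidth_gb(width, height, textures, bytes_per_texel,
                               lights, fps):
    """Approximate G-buffer read traffic for a deferred lighting pass, in GB/s.

    Assumes every light re-reads every G-buffer texture for every pixel,
    i.e. the worst case described in the quoted post.
    """
    bytes_per_frame = width * height * textures * bytes_per_texel * lights
    return bytes_per_frame * fps / 1e9

# 720p, 4 G-buffer textures at 4 bytes/texel, 8 lights, 30 fps:
# ~3.5 GB/s for the lighting reads alone, against a 25 GB/s budget.
print(lighting_pass_bandwidth_gb(1280, 720, 4, 4, 8, 30))
```

Real renderers cut this down with light culling and tiling, which is exactly why the tile-based mobile GPUs mentioned above fare better than the naive numbers suggest.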
 

jerry5278

Neo Member
I think people are getting too worked up about this. We don't even know if it's true. Do you really think third parties would be very positive about this system, especially Todd Howard, if the Switch really had this kind of performance? Absolutely not. Every week we learn about another game coming to the Switch. LKD leaked that From Software has Dark Souls 3 running at a level they are happy with on the Switch. If that is the case, then I highly doubt the Switch is really gonna have these clock speeds. Dark Souls 3 in particular would not be able to run well with speeds like this. Now everyone just take a deep breath and relax. The full reveal is only a few weeks away.
 

Manoko

Member
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.

I'll play it on the Wii U, it saves me €250+.
 

bomblord1

Banned
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.

It is not possible for the Wii U version to outperform the Switch version based on specs.

An incredibly bad porting job could do that, but based on what we've seen on Fallon that's not the case either.
 

Shikamaru Ninja

任天堂 の 忍者
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.

Wouldn't it be important for BOTW to run at 1080p on the docked Switch, with all the first-party titles being vanguards of the 720p mobile / 1080p home ideal?
 

lyrick

Member
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.

There isn't anything in this report/rumor that suggests that it's less powerful than the Wii U.

Seems like a lot of people are simply comparing numbers that shouldn't be directly compared, then spreading their comprehension deficit further downstream.
 
We already know that the TX1's memory bandwidth is 25GB/s. How does this fit in your magic?

Mario Kart 8, Splatoon, and Need for Speed: Most Wanted U use deferred rendering on Wii U, so I'm guessing Nintendo must be using something to free up memory bandwidth bottlenecks on Switch.

The post you mention is definitely on the money regarding mobile GPUs, though: the Vita version of NFS: Most Wanted uses a forward renderer and renders at 1/4 of the resolution of PS3/360/Wii U (360p) while also seeing noticeable visual cutbacks elsewhere (poly count, reflections, shadowing, lighting, framerate).
 

GAMETA

Banned
Are folks being dramatic about Switch vs the Wii U? Honest question. I seem to recall Nintendo saying the Switch version of Zelda will have enhanced visuals, but is it at all possible the Wii U version is better if not a negligible difference? I might opt to play it on the Wii U for the gamepad menu if that's the case.

If your question's anything related to the GIF, I was just kidding.

Just the fact that the Switch runs Wii U games while undocked clearly means it's more powerful. :)

The only chance of the Wii U version running better is if it's running on CEMU.
 
Looking at the Switch's supposed memory bandwidth, I wouldn't even touch this topic.
IIRC, the XB1's memory setup is worse than the Wii U's in terms of what it needed to do, because of the way it bottlenecked devs from getting to 1080p. The Wii U was good for its power level.
 

-MB-

Member
I think people are exaggerating this. The only reason they said it was a console first was because they wanted to sell 3DSes and Pokémon Sun and Moon this holiday season, and nothing more.

Or they say it because consumers are more willing to spend the likely $250 on something called a home console than they would if it were labeled a handheld first. Case in point: the $250 Wii sold massively, while the $250 3DS had to get an emergency price cut to save its tanking sales.
 

aBarreras

Member
The last Wii U game I'll buy; not buying the Switch. I'm all in on PC right now.

congratulations?

Or they say it because consumers are more willing to spend the likely $250 on something called a home console than they would if it were labeled a handheld first. Case in point: the $250 Wii sold massively, while the $250 3DS had to get an emergency price cut to save its tanking sales.

In the end, it's marketing shenanigans.
 

LordOfChaos

Member
I'll say it again, but the lessons we should have learned with GameCube clearly weren't learned.

GameCube CPU: 485MHz, FPU 1.9GFLOPs
GameCube GPU: 162MHz
GameCube RAM: 43MB

PS2 CPU: 294MHz, FPU 6.2 GFLOPs
PS2 GPU: 147MHz
PS2 RAM: 32MB

Xbox CPU: 733MHz, FPU performance unknown
Xbox GPU: 233MHz
Xbox RAM: 64MB

At the time of their comparison in 2001, the GameCube was labeled "garbage-tier" compared to the Xbox and just barely better than the PS2, with its floating-point performance being regularly singled out.

And we all remember how things panned out that generation: PS2 was the weakest, naturally, but Xbox wasn't this massive unparalleled technology leap compared to any of them. How every component works with the total package in real-world performance is the only way to measure a console.

Nintendo clearly demonstrated its design philosophy, a philosophy that always gets overlooked because it's not something you can use as bait when trolling: Optimal RAM and cache for fewer wasted CPU/GPU cycles. I don't expect Switch to be any different in that regard. How optimized the design is as a whole will be the question, but as always, we'll have to wait until January to know for sure.


Console architectures were very distinct from each other back then. There were more GPU and CPU companies duking it out.

Nowadays, architectures have broadly undergone an evolutionary convergence. Can the Switch be more efficient than bog-standard Maxwell 2? Sure, I don't rule out the possibility. But I do rule out the possibility that in a mere 500 man-years by their own admission (that's 250 people working two years), they /massively/ outdid Nvidia's architectures, which took the most R&D of any GPU to date.
 

LordRaptor

Member
So your most important argument is that the devs will do it just because it's noticeable, although they have cheaper ways to reach the same result.

No, my most important argument is that devs aren't going to take their PS4 code and go "Well, that's fine for a handheld, just quarter the resolution", jfc

Also, what about this:
words without context


We already know that the TX1's memory bandwidth is 25GB/s. How does this fit in your magic?

Okay, go and read up on what deferred rendering is, because it's literally used in every modern 3D game and I really can't be arsed trying to explain it to you in such a way that will convince you that it's not "magic all up in this bitch" and that the rendercost per frame is calculated in a completely fucking different way to how it is calculated using forward rendering.

Or are you going to pretend some unsourced quote about mobile chips means the Switch won't support deferred rendering?
 

KingSnake

The Birthday Skeleton
Okay, go and read up on what deferred rendering is, because it's literally used in every modern 3D game and I really can't be arsed trying to explain it to you in such a way that will convince you that it's not "magic all up in this bitch" and that the rendercost per frame is calculated in a completely fucking different way to how it is calculated using forward rendering.

That quote was for a question about bandwidth, but fine, you don't want to touch this.

I'm not saying that deferred rendering is magic. I'm saying that the way you make it make up for a huge difference in processing power between two GPUs is almost touching magic. We're talking here about two GPUs that both use deferred rendering and are based on modern architectures. It's not one GPU using forward rendering and the other deferred rendering.

I'll go and read more about it, for sure.

You still haven't provided specifics about what exactly can be cut beyond [huge list of bells and whistles]. Do you cut out the lighting completely, or what?

If the game renders at 720p in handheld mode then it should be able to render at 1080p in docked mode. So you're saying that the Switch can do what the Xbone can't do with a third of its power. How isn't that magic? It's a fucking Christmas miracle.

Edit: I'm aware that it could be made to render at the same resolution, but at that point it wouldn't be the same game anymore. It would be a sketch of the game.
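The handheld/docked arithmetic behind this argument is easy to check. A small sketch using the GPU clock figures from the Digital Foundry report this thread is about; the assumption that render cost scales linearly with pixel count is a simplification.

```python
# GPU clocks from the DF report: 307.2 MHz portable, 768 MHz docked.
portable_mhz, docked_mhz = 307.2, 768.0

pixels_720p = 1280 * 720      # handheld target
pixels_1080p = 1920 * 1080    # docked target

clock_ratio = docked_mhz / portable_mhz   # 2.5x GPU throughput when docked
pixel_ratio = pixels_1080p / pixels_720p  # 2.25x pixels at 1080p

# Under the (simplistic) assumption that cost scales linearly with pixels,
# the docked clock bump slightly exceeds the extra cost of 1080p.
print(clock_ratio, pixel_ratio)  # 2.5 2.25
```

That near-match between 2.5x clocks and 2.25x pixels is presumably not a coincidence in Nintendo's chosen clock ratio.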
 

Breakage

Member
I think Nintendo would have a better chance of success if they just made a straight up, no-bullshit-gimmicks home console with a good pad i.e. a spiritual successor to the GC. It doesn't have to be the most powerful console just something in the same ballpark as the base PS4 and X1. Imagine what Nintendo could do with a powerful machine that is also attractive to third parties.

Nintendo is still drunk on Wii's success and thinks the "new way to play" strategy is the way to go. So it's going to be another era of compromises made to accommodate a bunch of gimmicks.
I don't see the Switch being successful in the way Nintendo thinks it will be. It looks too complicated for casuals and I don't think lots of people will be lugging around a Switch unit for local play sessions.
 

Durante

Member
Okay, go and read up on what deferred rendering is, because it's literally used in every modern 3D game and I really can't be arsed trying to explain it to you in such a way that will convince you that it's not "magic all up in this bitch" and that the rendercost per frame is calculated in a completely fucking different way to how it is calculated using forward rendering.

Or are you going to pretend some unsourced quote about mobile chips means the Switch won't support deferred rendering?
The Switch will obviously support deferred rendering.

That doesn't mean that it can't be true at the same time that deferred rendering is quite bandwidth-intensive. Which it is.

Of course, then you have to consider that it is also a good fit for the tiled rasterization recent NV GPUs perform. Which will in turn reduce effective external bandwidth requirements again.

Even so, at <200 GFlops you are not going to get away with just post-processing or even minor shading quality reductions in most high-end games, if you want to render them at 720p.
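To put Durante's point in concrete terms, here's a rough flops-per-pixel budget. The GFLOPs figures are the commonly cited ballpark numbers (roughly 157 GFLOPs FP32 for the Switch GPU at the leaked 307.2 MHz portable clock, roughly 1840 GFLOPs for the PS4), used only for order-of-magnitude comparison.

```python
def flops_per_pixel(gflops, width, height, fps):
    """Arithmetic budget per pixel per frame, in FLOPs."""
    return gflops * 1e9 / (width * height * fps)

switch_720p = flops_per_pixel(157, 1280, 720, 30)
ps4_1080p = flops_per_pixel(1840, 1920, 1080, 30)

# Even after dropping from 1080p to 720p, the portable Switch has roughly
# a fifth of the per-pixel arithmetic a PS4 enjoys at 1080p, which is why
# trimming post-processing alone won't close the gap.
print(round(switch_720p), round(ps4_1080p))
```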
 

Neoxon

Junior Member
I think Nintendo would have a better chance of success if they just made a straight up, no-bullshit-gimmicks home console with a good pad i.e. a spiritual successor to the GC. It doesn't have to be the most powerful console just something in the same ballpark as the base PS4 and X1. Imagine what Nintendo could do with a powerful machine that is also attractive to third parties. Nintendo is still drunk on Wii's success and thinks the "new way to play" strategy is the way to go. So it's going to be another era of compromises made to accommodate a bunch of gimmicks.
I don't see the Switch being successful in the way Nintendo thinks it will be. It looks too complicated for casuals and I don't think lots of people will be lugging around a Switch unit for local play sessions.
Given how established the PS4 & XB1 are in the current market, it's too late in the generation for Nintendo to jump in with another system similar to the competition. If this was the start of the generation, maybe you'd have somewhat of a point. But even then, third parties likely wouldn't give a shit due to the Nintendo audience either not caring or having systems that they get third party games on.
 
I think there is only one fan in the Switch (in accordance with the recent patent diagrams). While it's odd that the fan reportedly operates in portable mode, we don't have official confirmation of how often and how fast it runs.

I've posted this before, but the Pixel C is a large device with an aluminum back. The Switch's setup may be a Nintendo compromise. Why stick a fan in the dock when it doesn't work as well as it does inside the Switch itself? Why spend money on an aluminum case when you already need to spend money on a fan? I can see Nintendo's engineers thinking of this as a clever solution.
 

Breakage

Member
Given how established the PS4 & XB1 are in the current market, it's too late in the generation for Nintendo to jump in with another system similar to the competition. If this was the start of the generation, maybe you'd have somewhat of a point. But even then, third parties likely wouldn't give a shit due to the Nintendo audience either not caring or having systems that they get third party games on.

I get that. Nintendo isn't going to win this gen. I just think they'd have a better chance of moving more units with a no bs home console than this hybrid adventure.
 

LordRaptor

Member
You still haven't provided specifics about what exactly can be cut beyond [huge list of bells and whistles]. Do you cut out the lighting completely, or what?

No, it is calculated in a completely different way, as I tried to explain before in the most abstract way possible.

Forward is: (G * (L+1) + A) = R
where G = geometry, L = number of lights in a scene, and A is a forward-rendering-based anti-aliasing solution.
R is the final rendertime, likely to be 16ms or 32ms depending on the game.

L is a fixed cost of at least 1. A is probably the cheapest option available at the time.

so to hit a given value of R where L and A are fixed, you can extrapolate G linearly.

Deferred Rendering is vastly more complicated to arrive at R, because almost none of the individual variables involved are fixed, and almost none scale linearly.
G is fixed, but G in a modern pipeline is almost nothing.

A modern game's render frame components would be something like:
(0.1ms Geometry) + (0.4ms lighting prepass) + (0.5ms Z-buffer) + (2ms lighting quality 1st pass) + (1ms edge AA) + (2ms lighting quality second pass) + (2ms Bokeh DOF blur) + (2ms high quality bloom) + (2ms lighting quality third pass) + (1ms temporal AA) + (2ms SSAO) + (1ms colour grading)

of which only the underlined parts of the effects chain are required, and where all of the rendercosts are also variables, where you can literally have a slider between 'speed' and 'quality' (or turn them off completely).

e:
Even so, at <200 GFlops you are not going to get away with just post-processing or even minor shading quality reductions in most high-end games, if you want to render them at 720p.

No, which is why I don't think any developers supporting the Switch are going to take "X1 / PS4 console" settings then dial down resolution until it runs docked.
It has a completely different performance metric. It would require a commensurate level of porting effort.

e: I mean, for starters, if your target is a 720p handheld, there's no point using any textures over 512x512.
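LordRaptor's example frame budget above can be written out as a toy model. The pass costs are the ones from the post; which passes count as 'required' is my guess, since the underlining that marked them didn't survive the quote.

```python
# (cost in ms, required?) per pass; the required/optional split is assumed.
passes = {
    "geometry":           (0.1, True),
    "lighting prepass":   (0.4, True),
    "z-buffer":           (0.5, True),
    "lighting 1st pass":  (2.0, True),
    "edge AA":            (1.0, False),
    "lighting 2nd pass":  (2.0, False),
    "bokeh DOF blur":     (2.0, False),
    "high quality bloom": (2.0, False),
    "lighting 3rd pass":  (2.0, False),
    "temporal AA":        (1.0, False),
    "SSAO":               (2.0, False),
    "colour grading":     (1.0, False),
}

full_chain = sum(cost for cost, _ in passes.values())
required_only = sum(cost for cost, req in passes.values() if req)

# The full chain fills a 16 ms (60 fps) frame; stripping the optional
# passes leaves a fraction of that, which is the headroom a port can trade.
print(full_chain, required_only)  # 16.0 3.0
```

The gap between those two sums, rather than resolution alone, is the knob LordRaptor is arguing ports will turn.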
 

Buggy Loop

Member
I get that. Nintendo isn't going to win this gen. I just think they'd have a better chance of moving more units with a no bs home console than this hybrid adventure.

The wannabe-PC market is already shrinking, barely supporting two consoles with the PS4 & Xbox One, let alone a third one.

Whatever the future holds for Nintendo's decision, it can never be worse than going for the same market as Sony & Microsoft.
 

KingSnake

The Birthday Skeleton
Deferred Rendering is vastly more complicated to arrive at R, because almost none of the individual variables involved are fixed, and almost none scale linearly.
G is fixed, but G in a modern pipeline is almost nothing.

A modern game's render frame components would be something like:
(0.1ms Geometry) + (0.4ms lighting prepass) + (0.5ms Z-buffer) + (2ms lighting quality 1st pass) + (1ms edge AA) + (2ms lighting quality second pass) + (2ms Bokeh DOF blur) + (2ms high quality bloom) + (2ms lighting quality third pass) + (1ms temporal AA) + (2ms SSAO) + (1ms colour grading)

of which only the underlined parts of the effects chain are required, and where all of the rendercosts are also variables, where you can literally have a slider between 'speed' and 'quality' (or turn them off completely).

e:


No, which is why I don't think any developers supporting the Switch are going to take "X1 / PS4 console" settings then dial down resolution until it runs docked.
It has a completely different performance metric. It would require a commensurate level of porting effort.

So practically you're assuming that 3rd parties will more or less completely re-work their games for Switch just to achieve the native resolution in handheld mode.
 

LordRaptor

Member
So practically you're assuming that 3rd parties will more or less completely re-work their games for Switch just to achieve the native resolution in handheld mode.

If by "completely rework their games" you mean "have the technical artist set up a Switch specific effects-chain" yes.
 

KingSnake

The Birthday Skeleton
If by "completely rework their games" you mean "have the technical artist set up a Switch specific effects-chain" yes.

So if a game won't render at 720p on Switch we can say "lazy technical artist!".

How many man-days do you estimate that this takes for a game like COD?
 

M-PG71C

Member
The wannabe-PC market is already shrinking, barely supporting two consoles with the PS4 & Xbox One, let alone a third one.

Whatever the future holds for Nintendo's decision, it can never be worse than going for the same market as Sony & Microsoft.

This. Nintendo's best chance (and honestly, it is a good chance) is to forge their own section of the market. The Switch is poised to do just that. Them competing directly with Sony and MS means going after an already limited market share and, frankly, a market that is shrinking.
 
So if a game won't render at 720p on Switch we can say "lazy technical artist!".

How many man-days do you estimate that this takes for a game like COD?

I think he's saying that it's a fairly trivial thing to remove certain post-processing effects like AA, AF, AO, and that you gain a lot of breathing room by doing that alone. That's not to say other things like resolution won't need to change, but a lot of the difference can come from there.

Developers typically don't do this for XB1 games because they can actually afford to keep these effects around, and possibly in part due to the strange RAM setup talked about in the past few pages (if that hasn't been debunked).
 

LordRaptor

Member
So if a game won't render at 720p on Switch we can say "lazy technical artist!".

How many man-days do you estimate that this takes for a game like COD?

No, you can say an Art Director sacrificed resolution and/or framerate for aesthetics, like is already happening on console games.

What do you want to hear?
"IT TAKES SO MUCH EFFORT THIRD PARTIES WILL NEVER DO IT!!!!"?
"IT TAKES ONE DUDE 10 MINUTES TO MAKE A ONE-OFF PRESET THAT WILL LAST FOREVER, SO SWITCH WILL DEFINITELY GET EVERY GAME EVER!!!!!"

Neither of those statements is true. Neither of those statements is unique to the Switch with regard to multi-platform development.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
GAF insiders gave us too high expectations in the first place. There won't be technical barriers to porting games to the Switch, right Matt?

OsirisBlack, the one who broke all the PS4 Pro news, said the same, and the specs in the OP don't really go against it yet.
 

KingSnake

The Birthday Skeleton
I think he's saying that it's a fairly trivial thing to remove certain post-processing effects like AA, AF, AO, and that you gain a lot of breathing room by doing that alone. That's not to say other things like resolution won't need to change, but a lot of the difference can come from there.

My understanding is that he's talking about more than just AA, AF, AO. But maybe I'm misunderstanding again.

I don't know of any game where turning off AA, AF and AO and cutting the resolution to half the pixels frees up 90% of the GPU (Switch in handheld mode is 1/10 of a PS4). But fine, I'm ready to believe it if I see it.
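A quick sanity check on that claim, under loudly stated assumptions: the share of the frame spent on removable post effects (40% here) is purely illustrative, and the 1/10-of-PS4 figure is the one from the post.

```python
frame_cost = 1.0                    # whole PS4-class frame, normalised
post_fx_share = 0.40                # assumed share spent on removable effects
pixel_bound = frame_cost - post_fx_share

# Strip the optional effects entirely and halve the pixel count of the rest.
remaining = pixel_bound * 0.5
saved = frame_cost - remaining

# ~70% freed; a GPU with 1/10th the power would need ~90%, so something
# beyond AA/AF/AO cuts and a resolution drop has to give.
print(round(saved, 2))
```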

No, you can say an Art Director sacrificed resolution and/or framerate for aesthetics, like is already happening on console games.

But this is exactly what started this discussion: you saying that the developers will not sacrifice the native resolution for anything.
 

Easy_D

never left the stone age
So if a game won't render at 720p on Switch we can say "lazy technical artist!".

How many man-days do you estimate that this takes for a game like COD?

I'd wager a guess that it was harder to get 360 CoD games on the Wii than it will be to get PS4 CoD games on the Switch. Not what you asked at all, but worth thinking about. But I agree with your general point that blaming devs for being lazy is never the answer; that rests with the publisher and the budget they're willing to set for a project. And that's exactly what'll be the limit for Switch ports as well: is it economically sound or not to bring games over if it means a lot of work?
 

LordRaptor

Member
But fine, I'm ready to believe it if I see it.

No, again, you're trying to make this into an "everything will just run fine on Switch, just turn off some shaders" type of argument that I am not making.
I am saying developers will get better results targeting 720p handheld mode and then taking a 'free' resolution bump when docked, than taking a PS4/X1 preset, setting resolution to 540p as a docked-mode target and, I dunno, telling handheld users to ESAD.
 

PSGames

Junior Member
I think most people expecting the majority of current-generation ports will be disappointed, but the Switch will have a fairly large library despite the low-spec hardware. People keep referring to the Switch as a 'portable Wii U' with 'Wii U ports', but this thing has the potential to get huge amounts of last-gen ports/remasters and introduce them to an all-new audience. Also, with the added portability appeal plus improved graphics/framerates, I can see many people revisiting or trying out games they may have missed the first go-around. That in itself will be amazing for a portable device.
 