
A Nintendo Switch has been taken apart

LordOfChaos

Member
You guys may not have seen this from the other thread, but there's a new Jetson TX2, which is a Pascal-based unit with A57s (and Denver cores) rather than A72s:



It would be interesting to see what size this die is.

16nm, A57?

I thought Denver 2 cores would have been interesting in the Switch (maybe only on when docked, with the A57s only when mobile), but that's almost certainly not what's in it, so this is mildly interesting but not directly related.
 
So it seems a lot of Switch titles are using double buffering. Is this a hardware thing, perhaps? It's odd that both DQH and Zelda use it.
 

Mokujin

Member
16nm, A57?

I thought Denver 2 cores would have been interesting in the Switch (maybe only on when docked, with the A57s only when mobile), but that's almost certainly not what's in it, so this is mildly interesting but not directly related.

I have commented on this many times, but the A72 is not a huge improvement over the A57 at the same node; it's about 10% better with similar area use, which is nice but not a game changer.

Also, A72 licensing is probably more expensive than A57, and Nvidia probably has a multi-year A57 license, so it's not that big of a mystery in my eyes.

Where is the other thread?
 
Where is the other thread?

Sorry, it was the Foxconn thread, follow the link in the quote that was in my post.
 

LordOfChaos

Member
I have commented on this many times, but the A72 is not a huge improvement over the A57 at the same node; it's about 10% better with similar area use, which is nice but not a game changer.

Also, A72 licensing is probably more expensive than A57, and Nvidia probably has a multi-year A57 license, so it's not that big of a mystery in my eyes.

Where is the other thread?


It may be the licensing cost, but otherwise it's more a case of "why not": same area, faster, lower power. There's no area where the A72 is a regression from the A57.


Btw, according to AnandTech's reporting, the Denver cores are only enabled in 15W mode. Pretty much a nonstarter for this discussion.

Also, apparently the A57s are there because the design was frozen in 2013, and then the 16nm delay happened. The Tegra roadmap that Nvidia showed off in 2013 went straight from 28nm Logan to 16nm Parker, with Parker meant to ship in 2015. So we're getting a 2015 design in 2017.

The familiar 20nm part with no Denver was also rushed out for this reason.
 
Nvidia has had Parker in mass production for months, and now they release a product based on it.
Coincidentally, it has a power mode that matches Foxconn's frequencies and the Switch's power consumption in docked mode (accounting for the Switch not having the Denver cores).
It also has Nvidia's own uncore, which would allow the Switch to use low-power cores for its OS.
I don't see a single reason why Nintendo would use the old TX1 beyond old devkits and the like (since, in the eyes of the programmers, the two things would be the same, just with different frequencies).
Edit: from what I understand, the allegedly huge power efficiency advantage of the A72 over the A57 comes from the A72 usually being on 16nm and the A57 on the poor 20nm node?
 

Hermii

Member
The Joy-cons have their own battery though, so they shouldn't draw any power from the main unit battery, unless the tests were done with depleted Joy-con batteries.

I even read somewhere (maybe Reddit?) that once the main console runs low on battery, the console starts sipping power from the Joy-con batteries if they have enough charge left. As the Joy-cons supposedly last ~20 hours, that should be the norm if both are fully charged, therefore further increasing the available energy for the main console from 16 Wh to ~19 Wh.

In general I would be really interested in trying to calculate the power consumption of all the parts based on the Zelda runtime, so we can make a better guess at the actual hardware and clock rates of the SoC.

For the display, the closest one with power measurements I could find is the Mate 8's display: http://www.anandtech.com/show/9878/the-huawei-mate-8-review/6
Only 6" instead of 6.2", but 1080p instead of 720p, so power consumption should be fairly close. I haven't seen a Switch in person yet, but from reading a lot of first impressions I would guess that the Switch is brighter than the 3DS and Wii U displays but doesn't come close to modern smartphones. So somewhere between 200 and 300 nits sounds reasonable, which would correspond to between 0.8 and 1 W.
(For comparison, the 3DS got around 150 nits, while the New 3DS managed 161 nits according to https://www.welt.de/wirtschaft/webw...eue-Nintendo-3DS-hat-den-Zocker-im-Blick.html)

The SoC calculations were already done earlier in this thread (assuming 4 cores); iirc that resulted in:

~2.5 W: Eurogamer clocks, A57 @ 20nm (1.83 + ~0.7)
~1.5 W: Eurogamer clocks, A57 @ 16nm (1.16 + ~0.4)
~6.5 W: Foxconn clocks, A57 @ 20nm (5.46 + ~1)
~4.3 W: Foxconn clocks, A72 @ 20nm (~3.3 + ~1)
~4.2 W: Foxconn clocks, A57 @ 16nm (3.65 + ~0.6)
~2.8 W: Foxconn clocks, A72 @ 16nm (2.2 + ~0.6)

All the values with a ~ are estimates, mostly based on http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview/3
The other values come from:
http://www.anandtech.com/show/9330/exynos-7420-deep-dive/5
http://www.anandtech.com/show/8718/the-samsung-galaxy-note-4-exynos-review/6
http://www.anandtech.com/show/9878/the-huawei-mate-8-review/3

I think those numbers might be accurate to within ±20% of the actual values.

WLAN shouldn't be active while playing Zelda, or at least its power draw should be negligible.
Joy-cons have their own battery.
Do we have any information about the speakers?
No idea about the RAM; that might be one of the bigger power consumers while playing.
NAND and microSD should also be negligible.

Did I forget anything?

So let's look at this table again, with Marcan's readings in mind: 10.9 W docked and 7.1 W mobile (min brightness) for the system when not charging.

Foxconn still wins. But then there is the still-unknown memory power consumption.
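As a rough sanity check on that comparison, here's a minimal back-of-envelope script using only the numbers quoted in this thread: Marcan's 10.9 W docked reading (screen off) and the SoC estimates from the table above. The residual is whatever would be left for RAM, WLAN, fan, and conversion losses. Treat it as a sketch: the table's figures are themselves ±20% estimates, and it's an assumption that they map onto docked clocks.

```python
# Back-of-envelope: measured wall power minus estimated SoC power.
# All numbers come from this thread; everything here is an estimate.

MEASURED_DOCKED_W = 10.9  # Marcan's reading, docked, not charging

# Scenario -> estimated SoC (CPU + GPU) watts, from the table above
soc_scenarios = {
    "Eurogamer clocks, A57 @ 20nm": 2.5,
    "Eurogamer clocks, A57 @ 16nm": 1.5,
    "Foxconn clocks, A57 @ 20nm": 6.5,
    "Foxconn clocks, A72 @ 20nm": 4.3,
    "Foxconn clocks, A57 @ 16nm": 4.2,
    "Foxconn clocks, A72 @ 16nm": 2.8,
}

for label, soc_w in soc_scenarios.items():
    residual = MEASURED_DOCKED_W - soc_w  # RAM, WLAN, fan, regulators...
    print(f"{label}: SoC ~{soc_w} W, ~{residual:.1f} W left for everything else")
```

The Eurogamer rows leave ~8-9 W unaccounted for, which is exactly the oddity the next reply picks up on.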
 

Mameshiba

Neo Member
Nvidia has had Parker in mass production for months, and now they release a product based on it.
Coincidentally, it has a power mode that matches Foxconn's frequencies and the Switch's power consumption in docked mode (accounting for the Switch not having the Denver cores).
It also has Nvidia's own uncore, which would allow the Switch to use low-power cores for its OS.
I don't see a single reason why Nintendo would use the old TX1 beyond old devkits and the like (since, in the eyes of the programmers, the two things would be the same, just with different frequencies).
Edit: from what I understand, the allegedly huge power efficiency advantage of the A72 over the A57 comes from the A72 usually being on 16nm and the A57 on the poor 20nm node?

The A72 is a big improvement over the A57 even on the same node: it's ~10% smaller, consumes ~20% less power, and is quite a bit faster depending on the workload. Memory bandwidth in particular is massively improved.
 

Goodlife

Member
So let's look at this table again, with Marcan's readings in mind: 10.9 W docked and 7.1 W mobile (min brightness) for the system when not charging.

Foxconn still wins. But then there is the still-unknown memory power consumption.

It seems a bit crazy if RAM power consumption were six or seven times the SoC's, though, doesn't it?
 
With one of the games being Zelda, it's a stretch to say the devs were lazy.

So why else would Nintendo use double-buffered vsync? The Switch should have ample memory for another buffer; it's got way more usable memory than the Wii U.

Nintendo aren't infallible programming geniuses.

We saw plenty of early games on PS4 that didn't support AF and that was nothing to do with the console.
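On the double-buffering question: under double-buffered vsync on a 60 Hz panel, a frame that misses its budget has to wait for the next vblank, so frame intervals quantize to multiples of ~16.7 ms and a 30 fps target degrades straight to 20 fps. Here's a minimal illustrative sketch of that quantization in Python; it is not how the Switch actually schedules frames, just the arithmetic behind the complaint:

```python
import math

VBLANK_MS = 1000 / 60  # ~16.7 ms per refresh on a 60 Hz panel

def display_interval_ms(render_ms, target_vblanks=2):
    """Frame-to-frame interval under double-buffered vsync.

    target_vblanks=2 models a 30 fps cap; a frame that overruns its
    budget slips to the next vblank boundary instead of tearing.
    """
    return VBLANK_MS * max(target_vblanks, math.ceil(render_ms / VBLANK_MS))

for render_ms in (30.0, 34.0, 45.0):
    interval = display_interval_ms(render_ms)
    print(f"{render_ms:.0f} ms of GPU work -> a new frame every "
          f"{interval:.1f} ms ({1000 / interval:.0f} fps)")

# 30 ms -> 33.3 ms (30 fps)
# 34 ms -> 50.0 ms (20 fps): barely missing budget costs a full vblank
# 45 ms -> 50.0 ms (20 fps)
```

With triple buffering the GPU can start the next frame in a third buffer instead of stalling, smoothing those drops at the cost of extra memory and potentially a frame of latency, which is why its apparent absence in DQH and Zelda is worth asking about.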
 

Hermii

Member
So why else would Nintendo use double-buffered vsync? The Switch should have ample memory for another buffer; it's got way more usable memory than the Wii U.

Nintendo aren't infallible programming geniuses.

We saw plenty of early games on PS4 that didn't support AF and that was nothing to do with the console.

Infallible programming geniuses, maybe not, but lazy is not a word that comes to mind when thinking of Nintendo's internal teams.
 

ultrazilla

Member
I love Zelda and almost exclusively play it in docked mode on my 50" TV. The game deserves a big screen for all its beauty. That said, the horrendous frame rate jumps and hiccups are really starting to bother me. I mean, Nintendo has to do something about the performance of the game in docked mode. If it means lowering docked output to 720p to lock in a stable 30 fps, I'm totally fine with that. The game will still look gorgeous.

I'd really like to know what is going on with the Switch GPU. I mean, we were basically told to expect 2-5x the performance of a Wii U, and having owned a Wii U, I'm not presently seeing anything that would lead me to believe the Switch is more powerful than a Wii U. Yeah, handheld Zelda is the "best experience" for Zelda, but Nintendo is marketing the Switch as a HOME CONSOLE. And yes, I installed the day 1 system update.

I'm just really concerned this is more of a small step up from Wii U at this point. :(
 
I love Zelda and almost exclusively play it in docked mode on my 50" TV. The game deserves a big screen for all its beauty. That said, the horrendous frame rate jumps and hiccups are really starting to bother me. I mean, Nintendo has to do something about the performance of the game in docked mode. If it means lowering docked output to 720p to lock in a stable 30 fps, I'm totally fine with that. The game will still look gorgeous.

I'd really like to know what is going on with the Switch GPU. I mean, we were basically told to expect 2-5x the performance of a Wii U, and having owned a Wii U, I'm not presently seeing anything that would lead me to believe the Switch is more powerful than a Wii U. Yeah, handheld Zelda is the "best experience" for Zelda, but Nintendo is marketing the Switch as a HOME CONSOLE. And yes, I installed the day 1 system update.

I'm just really concerned this is more of a small step up from Wii U at this point. :(

It's been said before, but it's highly unlikely that the frame rate issues have to do with limitations of the Switch hardware. It's far more likely that, in porting from the Wii U (in less than a year), which has a vastly different architecture and memory setup, not everything could be optimized properly.

For instance, the majority of times I've seen a hiccup is when killing a Moblin: it doesn't happen for Bokoblins, Lizalfos, or anything else, just Moblins. This is not a hardware issue; this is a problem with optimization of the software.


Yeah, we talked about it on the last page. It would be very interesting if the die sizes were the same, as it sort of seems from this image.
 

ultrazilla

Member
It's been said before, but it's highly unlikely that the frame rate issues have to do with limitations of the Switch hardware. It's far more likely that, in porting from the Wii U (in less than a year), which has a vastly different architecture and memory setup, not everything could be optimized properly.

For instance, the majority of times I've seen a hiccup is when killing a Moblin: it doesn't happen for Bokoblins, Lizalfos, or anything else, just Moblins. This is not a hardware issue; this is a problem with optimization of the software.



Yeah, we talked about it on the last page. It would be very interesting if the die sizes were the same, as it sort of seems from this image.

I am getting a ton of frame rate issues when I'm using the 3D camera to look around, especially in the Great Plateau areas.

I'll hold out hope that we'll see better performance from whatever chip it's using down the road, once devs have gotten used to programming for it. I realize with all new hardware there's a learning curve. I just hope we start seeing better performance. I would think 1080p 60/30 fps would be the norm for the machine in docked mode. Is that asking more than what the suspected GPU is capable of? Anyway, off to play Zelda. What an amazing experience!
 

LordOfChaos

Member

From the interviews going on, it sounds like they started converting it in spring of last year, so while the performance and resolution are a letdown, at least the former may be because the architecture it's being ported from was so different.

R700-series GPU with eDRAM to a Maxwell 2 GPU with TBR, three PowerPC 750 cores to three Cortex-A57s, PowerPC to ARM... Hard to tell where the performance hits come from. Though since it works much better in undocked/720p mode, the CPU is probably not the issue.




Last page, and likely not related at all to the Switch. The two Denver cores come on only if you set the TDP higher than 15W...
 
I am getting a ton of frame rate issues when I'm using the 3D camera to look around, especially in the Great Plateau areas.

I'll hold out hope that we'll see better performance from whatever chip it's using down the road, once devs have gotten used to programming for it. I realize with all new hardware there's a learning curve. I just hope we start seeing better performance. I would think 1080p 60/30 fps would be the norm for the machine in docked mode. Is that asking more than what the suspected GPU is capable of? Anyway, off to play Zelda. What an amazing experience!

Yeah, there are definitely some slowdowns that happen fairly often, but I'm talking about the hiccups that sort of freeze the gameplay for half a second or so. When a hiccup like that is (apparently) only tied to a specific enemy, you know it's 100% a software optimization problem.

I think, as some people were saying in the past few pages here, it's probably an issue of going from the Wii U memory arrangement, which has a higher effective bandwidth, to a lower-bandwidth, tile-based memory arrangement in the Switch. When a game is built from the ground up for one system with one way of programming and then ported to another with an entirely different way of programming in less than a year, poor optimization is entirely expected, especially for a game as massive as BotW.

Anyway, like you said, there is certainly a learning curve here, and I'd expect games built from the ground up for Switch to perform quite a bit better.

Last page, and likely not related at all to the Switch. The two Denver cores come on only if you set the TDP higher than 15W...

It may not be related to the Switch, but looking at the die size could actually give us some very interesting information. If the die size of the TX2 is also 121mm^2 (and it appears quite similar to the TX1 in those pictures), then that would make Digital Foundry's primary argument in favor of 20nm kind of moot.
 

Rodin

Member
Is the Switch slightly more powerful than a Wii U? I was watching the IGN review and they stated that.
IGN also said that the power difference with the Xbox One is like Wii vs. 360. Don't even look at that "review".

From the interviews going on, it sounds like they started converting it in spring of last year, so while the performance and resolution are a letdown, at least the former may be because the architecture it's being ported from was so different.

R700-series GPU with eDRAM to a Maxwell 2 GPU with TBR, three PowerPC 750 cores to three Cortex-A57s, PowerPC to ARM... Hard to tell where the performance hits come from.


Yeah, there are definitely some slowdowns that happen fairly often, but I'm talking about the hiccups that sort of freeze the gameplay for half a second or so. When a hiccup like that is (apparently) only tied to a specific enemy, you know it's 100% a software optimization problem.

I think, as some people were saying in the past few pages here, it's probably an issue of going from the Wii U memory arrangement, which has a higher effective bandwidth, to a lower-bandwidth, tile-based memory arrangement in the Switch. When a game is built from the ground up for one system with one way of programming and then ported to another with an entirely different way of programming in less than a year, poor optimization is entirely expected, especially for a game as massive as BotW.

Anyway, like you said, there is certainly a learning curve here, and I'd expect games built from the ground up for Switch to perform quite a bit better.

I don't think the Switch has lower bandwidth than the Wii U. The problem, imho, is that they could fit some more stuff into the large 32MB pool of eDRAM on the Wii U, but they couldn't do that with every effect on Switch because some of them can't be tiled. If you're still at 720p you have twice the bandwidth for the main RAM, probably more bandwidth for the SRAM pool, plus Maxwell/Pascal color compression; but when you increase the pixel count by 56%, you may need to find workarounds for some parts of the code, and they probably didn't have enough time for that considering how late in the development process Nintendo decided they wanted this port. I'm pretty confident that if this game had been built for Switch, it could've run at 1080p 30fps with the same performance they have now on Switch undocked, at the very least.
 
I am getting a ton of frame rate issues when I'm using the 3D camera to look around, especially in the Great Plateau areas.

I'll hold out hope that we'll see better performance from whatever chip it's using down the road, once devs have gotten used to programming for it. I realize with all new hardware there's a learning curve. I just hope we start seeing better performance. I would think 1080p 60/30 fps would be the norm for the machine in docked mode. Is that asking more than what the suspected GPU is capable of? Anyway, off to play Zelda. What an amazing experience!
As with everything, it will depend on the game.
Something like ARMS, Mario Kart 8, or Mario Odyssey will run at 1080p60 with no hiccups; multiplatform ports may have to run at 720p, checkerboard 1080p, or 900p with reduced graphics settings compared to the twins, or any combination of those.
For portable mode, maybe some games will run at 20 fps, like MGS Peace Walker. The smaller screen reduces the perceptual difference between one frame and the next and sort of helps you keep up with what's going on on screen.
 

btrboyev

Member
IGN is probably going by launch titles. Like every console ever, it's going to take a while for it to get maxed out.

Well, to say slightly more powerful than a Wii U is probably the most accurate at this point. It definitely has a hardware advantage, but it's not a generational leap either. You can only expect so much from something the size of a tablet.
 

Rodin

Member
As with everything, it will depend on the game.
Something like ARMS, Mario Kart 8, or Mario Odyssey will run at 1080p60 with no hiccups; multiplatform ports may have to run at 720p, checkerboard 1080p, or 900p with reduced graphics settings compared to the twins, or any combination of those.

I thought ARMS ran at 900p 60fps docked? Pointing this out because many people are saying that the game runs at 1080p.

For portable mode, maybe some games will run at 20 fps, like MGS Peace Walker. The smaller screen reduces the perceptual difference between one frame and the next and sort of helps you keep up with what's going on on screen.
Jeez, I hope not.
 
I just took a look at a PNG screenshot of ARMS and indeed it appears to be 900p... a shame, but it still looks very good.
About the 20 fps: everybody played Ocarina of Time like that, and if that's what it takes to get, say, an Assassin's Creed running in portable mode, I'd take it tbh.
 

Hermii

Member
Well, to say slightly more powerful than a Wii U is probably the most accurate at this point. It definitely has a hardware advantage, but it's not a generational leap either. You can only expect so much from something the size of a tablet.
We are going from 33 W to 11 W docked, 7 W mobile (min brightness), so it won't be a generational leap, but I still trust Nintendo to create impressive titles considering the hardware, and I don't think launch titles are close to the ceiling of what it can do.
 

LordOfChaos

Member
It may not be related to the Switch, but looking at the die size could actually give us some very interesting information. If the die size of the TX2 is also 121mm^2 (and it appears quite similar to the TX1 in those pictures), then that would make Digital Foundry's primary argument in favor of 20nm kind of moot.


Their primary argument was the clock speed, since 16nm FF and 20nm are technically the same transistor density. They've staked their reputation on those clock speeds a few times, especially with the updated 25% boost article. 16nm should have allowed higher speeds at the same power draw we're seeing.


(Image: Cell-SizeComparison.png, TSMC cell size comparison)


heeey, wait just a second tsmc!


I'm pretty confident that if this game had been built for Switch, it could've run at 1080p 30fps with the same performance they have now on Switch undocked, at the very least.


Theoretically, resolution should have been the most scalable thing. Early 7th-gen games often just looked like HD 6th-gen games, but at least the resolution was a lot higher. Resolutions did not increase as the generation went on; they decreased below 720p as games got more complex. I suspect even 5 years down the line it's not going to be a 1080p box most of the time, depending on game complexity.
 

Drain You

Member
I just took a look at a PNG screenshot of ARMS and indeed it appears to be 900p... a shame, but it still looks very good.
About the 20 fps: everybody played Ocarina of Time like that, and if that's what it takes to get, say, an Assassin's Creed running in portable mode, I'd take it tbh.

I'd have a really hard time picking up the Switch if games were running at 20 fps. In fact, given the choice, I'd rather they lower the resolution of effects to achieve 30 fps.
 

Schnozberry

Member
I just took a look at a PNG screenshot of ARMS and indeed it appears to be 900p... a shame, but it still looks very good.
About the 20 fps: everybody played Ocarina of Time like that, and if that's what it takes to get, say, an Assassin's Creed running in portable mode, I'd take it tbh.

You're more likely to get sub-HD resolution rather than 20fps these days. I wouldn't get too worked up about ARMS being 900p at the preview events. According to Shinen, those builds were delivered in early January, and ARMS would have had months of development left.
 

Hermii

Member
I just took a look at a PNG screenshot of ARMS and indeed it appears to be 900p... a shame, but it still looks very good.
About the 20 fps: everybody played Ocarina of Time like that, and if that's what it takes to get, say, an Assassin's Creed running in portable mode, I'd take it tbh.
OoT was acceptable back then because I was a kid who didn't know what frame rate was, and 3D console gaming was in its infancy. Now, I would rather take nothing than AC at 20fps.
 

Rodin

Member
depending on game complexity.
Exactly. Breath of the Wild may be the most technically impressive Wii U game by far, to the point that sometimes I wonder how the fuck they managed to make it run, but it's still a Wii U game.

OoT was acceptable back then because I was a kid who didn't know what frame rate was, and 3D console gaming was in its infancy. Now, I would rather take nothing than AC at 20fps.
OoT was acceptable because every animation was keyframed to 20fps. It looked "normal" back then.
 
Their primary argument was the clock speed, since 16nm FF and 20nm are technically the same transistor density. They've staked their reputation on those clock speeds a few times, especially with the updated 25% boost article. 16nm should have allowed higher speeds at the same power draw we're seeing.

(Image: Cell-SizeComparison.png, TSMC cell size comparison)


heeey, wait just a second tsmc!

Their only evidence for 20nm was the die size though, from the article:

However, as much as we want the Foxconn clocks to be real, the weight of evidence is stacking up against this aspect of the leak. To maintain meaningful battery life with those clocks, we'd need to be looking at a 16nm FinFET chip and maybe even a new revision of the ARM CPU cores, and the Chinese teardown of the processor confirms that the physical size of the chip is seemingly unchanged from existing 20nm Tegra X1 SoC.

The difference between 16nm and 20nm isn't actually about transistor size, but more about the 3D 'FinFET' transistors on the lower node. A 16nm SoC would be approximately the same size as the existing 20nm Tegra X1, but the difference here is that the teardown reveals a processor with seemingly identical dimensions. Also interesting is that the processor is surrounded by the same surface-mounted arrangement of what are likely to be decoupling capacitors, there to reduce noise on the power lines. The initial conclusion we have is that we are looking at a more lightly modified X1, still on the 20nm process, which ties in more closely with the clocks we reported - and indeed every non-Foxconn spec leak seen to date.

They still may be right, but I think if the TX2 is the exact same size as the TX1, that sorta throws a wrench in this reasoning of theirs.
 

LordOfChaos

Member
They still may be right, but I think if the TX2 is the exact same size as the TX1, that sorta throws a wrench in this reasoning of theirs.

Why would we think it's the same size, though? A 256-core Pascal GPU + 4 A57s + 2 Denver 2 cores, on a process with the same transistor density as 20nm (just far more efficient). Denver cores are rather large; iirc, two of them were about the size of the entire four-core A57 complex.

They also reasoned in your quoted block that the power delivery was the same; if it were on 16nm, one would expect a shifted voltage curve that might call for a new power delivery layout.

Someone, not sure who, had shown the pictures of Jen-Hsun holding up Parker and TX1 as proof they were the same die size, but that was before Parker had even been taped out, so it was definitely a dummy board, because Jen-Hsun likes to hold things up like Jobs.

Xavier was something like 300mm^2 (that has the above, plus double the GPU cores at 512).
 
We don't know the exact size of the die in the Switch, we don't know the size of the TX2, and it doesn't matter. Eurogamer's info on this is sketchy and incomplete, and it's not very clear when they obtained it or how complete it is.
The Foxconn leak got much more information than Eurogamer, all of it right, and at least it gives concrete information on how the frequencies were observed.
 

Donnie

Member
Well, to say slightly more powerful than a Wii U is probably the most accurate at this point. It definitely has a hardware advantage, but it's not a generational leap either. You can only expect so much from something the size of a tablet.

It would be fair enough to call the Switch in handheld mode slightly more powerful than the Wii U (it's more than that, but I can understand the wording). But in docked mode you then have a doubling of GPU performance plus extra bandwidth.
 

McMilhouse

Neo Member
It would be fair enough to call the Switch in handheld mode slightly more powerful than the Wii U (it's more than that, but I can understand the wording). But in docked mode you then have a doubling of GPU performance plus extra bandwidth.

Doubling? Zelda chugs at 900p
 

Donnie

Member
Doubling? Zelda chugs at 900p

So you're going to disregard the fact (and it is a fact) that the graphics chip runs at least twice as fast in docked mode because a certain game isn't running at twice the resolution? That makes absolutely no sense.

Many things can limit performance in a particular game, not just GPU power.

It chugs in some parts of the game; plenty of the time it's a solid 30fps. The reason for those drops certainly isn't GPU power; it could partly be bandwidth. But most likely it's simply not enough time spent on optimisation when taking a game from one architecture to a very different one.
 
Doubling? Zelda chugs at 900p

It's one game, and it's a port of a Wii U game. Not to mention it's a launch game. I don't think we can judge the Switch hardware based entirely on Zelda. There is no guarantee that it's perfectly optimised for the Switch hardware. I'm sure Nintendo will produce better looking games as the generation goes on.
 

Oregano

Member
The better point is that Zelda runs worse on Wii U at 720p than on Switch at 900p. Saying Zelda chugs at 900p is meaningless in a vacuum.
 
Doubling? Zelda chugs at 900p
Because it's a rushed port that was built from the ground up for the Wii U architecture.

If it had been built and fully optimized for the Switch and wasn't rushed, I'd expect 1080p with a solid 30fps and better draw distance and/or better polygons/textures/lighting/shadows. The gap between Wii U and Switch (4-5x) should be about twice as large as the gap between Switch and PS4 (at least 2x), if we go by Eurogamer's clock speed leaks. 720p to 1080p takes 2.25x the power alone, but that still leaves a good amount of headroom for frame rate and better effects. This is assuming it's optimized for the Switch's architecture and there's no large bottleneck like bandwidth.
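For reference, the resolution multipliers used throughout this thread (the "56%" a few posts up, the "2.25x" here) fall straight out of the pixel counts. Pixel count is only a rough proxy for GPU cost, but it's the arithmetic these posts are doing:

```python
# Pixel counts behind the 720p -> 900p (+56%) and 720p -> 1080p (2.25x)
# multipliers quoted in this thread.
RESOLUTIONS = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

base_pixels = 1280 * 720  # 720p reference
for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {w * h:>9,} pixels, {w * h / base_pixels:.4f}x 720p")

# 720p:    921,600 pixels, 1.0000x 720p
# 900p:  1,440,000 pixels, 1.5625x 720p
# 1080p: 2,073,600 pixels, 2.2500x 720p
```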
 

Polygonal_Sprite

Gold Member
As with everything, it will depend on the game.
Something like ARMS, Mario Kart 8, or Mario Odyssey will run at 1080p60 with no hiccups; multiplatform ports may have to run at 720p, checkerboard 1080p, or 900p with reduced graphics settings compared to the twins, or any combination of those.
For portable mode, maybe some games will run at 20 fps, like MGS Peace Walker. The smaller screen reduces the perceptual difference between one frame and the next and sort of helps you keep up with what's going on on screen.

How can you possibly say that? ARMS is a very limited 1v1 fighting game, and MK8 Deluxe is a port of a Wii U game, which is much less complex than BotW's huge seamless world with its amazing grass- and foliage-heavy areas (which already seem to be pushing the Switch's memory bandwidth to breaking point).

I think the people expecting every Switch game to go from 720p in handheld mode to 1080p when docked are in for a huge disappointment. Different developers will go for different things. I think the best compromise between resolution and improved visuals over Wii U will be found at 900p.

It will be interesting to see what resolution the Splatoon 2 test fire runs at. I just can't see it going from 720p to 1080p at release; the same goes for Mario. There has to be a reason they were shown at 720p in the first place.
 

gamerMan

Member
IGN also said that the power difference with the Xbox One is like Wii vs. 360. Don't even look at that "review".
I'm pretty confident that if this game had been built for Switch, it could've run at 1080p 30fps with the same performance they have now on Switch undocked, at the very least.

I agree. Part of the problem is that the game was optimized for the Wii U. You don't just flip a "switch." At some point, I wish they had just cut the Wii U version and moved development over to the Switch.

Since the game runs pretty well at 720p in portable mode, I do wish Nintendo had used a variable resolution to keep the frame rate constant. This looks like just a resolution bump with very little optimization.
 

guek

Banned
It will be interesting to see what resolution the Splatoon 2 test fire runs at. I just can't see it going from 720p to 1080p at release; the same goes for Mario. There has to be a reason they were shown at 720p in the first place.
It was a demo that was running exclusively in handheld mode, even when docked. That's pretty much all there is to it.
 