
Nintendo Switch Dev Kit Stats Leaked? Cortex A57, 4GB RAM, 32GB Storage, Multi-Touch.


Mokujin

Member
Where in the world is a GTX 1060 40% faster than an RX 480?

While the whole discussion is kind of absurd (Switch games have to work within the lower GFLOPS profile), and I hate talking about Nvidia flops vs. AMD flops, it really is a thing.

A GTX 1060's 4.4 TFLOPS outperform an RX 480's 5.84 TFLOPS, so there is truth behind that.
 
Let's change the topic: what do you think is up with the dev kit? They had barely edited the TX1 docs, but now we're getting these clock frequency numbers, and it seems like it was downgraded over the course of its development.

The dev kit seems like they just took the closest SoC to the target and started the ball rolling. And the leak of the devkit was accurate in that it described the physical makeup of the kit, which was not meant to replicate the final hardware set up, but to stand in its place as best as possible. That's my take anyways.
 
The dev kit seems like they just took the closest SoC to the target and started the ball rolling. And the leak of the devkit was accurate in that it described the physical makeup of the kit, which was not meant to replicate the final hardware set up, but to stand in its place as best as possible. That's my take anyways.
That part makes sense, but do dev kits usually state the capped clock speed instead of the max? My thought is that wouldn't happen unless they were going for max-range performance in the first place, but the reported clock numbers are weird unless the chipset has been heavily customized.
 

Luigiv

Member
Accompanied by what CPU?

This specific build uses a Core i3 4130, which is not exactly a monster CPU. Anyway, it's irrelevant, because even when a PS4 title is CPU-bound, devs aren't just leaving 25% of the GPU unutilised; they turn up a few GPU-specific settings to fill the gap.

Yeah, using 1-2 CPU cores and 2-3 GB RAM is like no overhead at all.

As far as the games are concerned, that is effectively no overhead at all, as the OS is entirely contained within those 2 cores and 3GB while games have the remaining 6 cores and 5GB entirely to themselves. On a PC, games must dynamically share CPU resources and system memory with the OS and other background tasks, making system overhead a much bigger concern.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
That part makes sense, but do dev kits usually state the capped clock speed instead of the max? My thought is that wouldn't happen unless they were going for max-range performance in the first place, but the reported clock numbers are weird unless the chipset has been heavily customized.
The Jetson TX1 is runnable at a plethora of power levels. They could easily have run it at sub-max levels from the very start.
 
45% of XBO? Which stage of grief is this now? Delusion? Must be an extra stage for Nintendo diehards. lolz


What a brilliant and clever post, sir. You're free to disagree; I actually disagree with that statement myself. But do you feel it's necessary to be condescending? That's just lame, and you're ruining an otherwise interesting discussion.
 

z0m3le

Banned
The Jetson TX1 is runnable at a plethora of power levels. They could easily have run it at sub-max levels from the very start.

My guess is they knew their power consumption envelope for the handheld and worked it out from there.

I really don't think 768MHz is bad for the GPU, but I was caught off guard by the 1GHz CPU. I always thought they would go with a 1.2GHz clock or better.
 
While the whole discussion is kind of absurd (Switch games have to work within the lower GFLOPS profile), and I hate talking about Nvidia flops vs. AMD flops, it really is a thing.

A GTX 1060's 4.4 TFLOPS outperform an RX 480's 5.84 TFLOPS, so there is truth behind that.
They don't (and it's 5.4TF)
 

Hermii

Member
My guess is they knew their power consumption envelope for the handheld and worked it out from there.

I really don't think 768MHz is bad for the GPU, but I was caught off guard by the 1GHz CPU. I always thought they would go with a 1.2GHz clock or better.
There's always the possibility they are going super conservative for launch and upclock later like the Vita.
 

z0m3le

Banned
They don't (and it's 5.4TF)

The GTX 1060 has 1280 CUDA cores @ 1708MHz: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1060/specifications

That is 1280 × 2 × 1.708GHz = 4372.48 GFLOPS (~4.37 TFLOPS).

The RX 480 is 2304 × 2 × 1.266GHz = 5833.728 GFLOPS (~5.83 TFLOPS).

The GTX 1060 beats the RX 480 by about ~7% on average IIRC, putting the per-flop difference at greater than a 40% advantage.
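As a rough sketch of the arithmetic behind those numbers (core counts and boost clocks are the ones quoted above; real game performance obviously varies with drivers and workload), here's a minimal Python version:

```python
# Theoretical FP32 throughput: shader cores * 2 FLOPs per cycle (FMA) * clock in GHz = GFLOPS.
# Core counts and boost clocks are the figures quoted in the post above.
def fp32_gflops(cores, clock_ghz):
    return cores * 2 * clock_ghz

gtx_1060 = fp32_gflops(1280, 1.708)  # ~4372 GFLOPS (~4.37 TFLOPS)
rx_480 = fp32_gflops(2304, 1.266)    # ~5834 GFLOPS (~5.83 TFLOPS)

print(f"GTX 1060: {gtx_1060:.0f} GFLOPS")
print(f"RX 480:   {rx_480:.0f} GFLOPS")
# The RX 480 has ~33% more theoretical FLOPS, so matching or beating it in games
# implies a large per-flop advantage for Pascal.
print(f"RX 480 advantage on paper: {rx_480 / gtx_1060 - 1:.0%}")
```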

Holy shit, that thread escalated quickly. Gotdam. Learned my lesson. Do not start Switch threads. Got it.

You came at it from the wrong angle; it should have been about perspectives, not trying to tell people what they can and can't talk about.
 
Your thread was thinly veiled flame bait, which is why it got locked.

How was that flame bait? Because I blamed the fan base for some of the negativity? They need to take some responsibility for the Switch doom and gloom. There is nothing wrong with the leaked specs. It's a perfectly fine system as revealed. The only thing that can ruin the system as we know it thus far is a pricing mistake by Nintendo.
 

Shaii

Member
The GTX 1060 has 1280 CUDA cores @ 1708MHz: http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-1060/specifications

That is 1280 × 2 × 1.708GHz = 4372.48 GFLOPS (~4.37 TFLOPS).

The RX 480 is 2304 × 2 × 1.266GHz = 5833.728 GFLOPS (~5.83 TFLOPS).

The GTX 1060 beats the RX 480 by about ~7% on average IIRC, putting the per-flop difference at greater than a 40% advantage.

You also need to account for the effect of drivers and architecture. The Pascal cards are extremely strong on single-threaded performance, whereas the Polaris cards are stronger at async compute. This is evident from DX11 vs. DX12 game benchmarks.

The overall performance gap has shifted in favor of Polaris, over the last 3 months, due to driver updates.

http://www.hardwarecanucks.com/foru.../73945-gtx-1060-vs-rx-480-updated-review.html
 

saskuatch

Member

There will be no facts in a few weeks; Nintendo never releases detailed technical specs. They will probably give you the names of the GPU and CPU, which we already know, and nothing else. We'll have to rely on speculation and "insiders" until a post-launch teardown.
 

Mr Swine

Banned
There's always the possibility they are going super conservative for launch and upclock later like the Vita.


This is what I hope they are doing: get good battery life now, and a year later upclock it so that it becomes 25% more powerful.
 
How was that flame bait? Because I blamed the fan base for some of the negativity? They need to take some responsibility for the Switch doom and gloom. There is nothing wrong with the leaked specs. It's a perfectly fine system as revealed. The only thing that can ruin the system as we know it thus far is a pricing mistake by Nintendo.

Incendiary title, failure to engage in discussion, posting yet another unnecessary Switch thread. If I had to guess those are the reasons it got locked.

This is getting off topic now so I'll drop it.
 

DynamicG

Member
How was that flame bait? Because I blamed the fan base for some of the negativity? They need to take some responsibility for the Switch doom and gloom. There is nothing wrong with the leaked specs. It's a perfectly fine system as revealed. The only thing that can ruin the system as we know it thus far is a pricing mistake by Nintendo.

Objectively you are right. However, this is an enthusiast forum that has had a negative reaction to the console specs for the past several days. So you should have known your audience. Also, the people who buy Nintendo hardware are a varied lot and you shouldn't generalize.

For the folks in this thread, you made this comment:

45% of XBO? Which stage of grief is this now? Delusion? Must be an extra stage for Nintendo diehards. lolz

So it's pretty easy to infer that you may have been trying to flame or troll.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Holy shit, that thread escalated quickly. Gotdam. Learned my lesson. Do not start Switch threads. Got it.

Shogmaster
Not genuinely interested in rational debate.
(Today, 12:56 PM)
 
I don't think it's all that controversial to claim that newer architectures will effectively give certain chips "flop for flop" advantages over older chips. The Wii U is kinda proof of this. I very much doubt we can be certain of the exact percentage in a console environment (compared to on PC), but when we don't know anything more, it's not a bad place to start comparisons.

As for the 3SM dream, I'm starting to think there's really no chance of that. What would Nintendo truly gain by increasing their production costs by that much? An extra ~20-40% GPU power? Is that really something they're all that interested in? Especially when GPU functions are as scaleable as they are?

I think it's far more important that they've customized the CPU configuration to improve that part of the SoC. Getting an extra ~40% out of the CPU would be a much better improvement, as that can definitely be a barrier to even being able to run certain games.

How about everyone relax and have a nice cup of refreshing orange juice. :)

Y'all got a couple weeks left.

But I just brushed my teeth...


EDIT: Also regarding Takashi's tweets a few pages back, he may be talking about newly published Japanese patent applications. I don't know why the patent application we saw the other day would say anything about open-world capabilities or screen resolution, but a newly published JP patent might have certain embodiments where the screen resolution is 1080p and the console is capable of 1440p or whatnot. Just like with the US patent from the other day, these would just represent potential embodiments and not be any indicator of what the Switch product winds up being.
 

z0m3le

Banned
You also need to account for the effect of drivers and architecture. The Pascal cards are extremely strong on single-threaded performance, whereas the Polaris cards are stronger at async compute. This is evident from DX11 vs. DX12 game benchmarks.

The overall performance gap has shifted in favor of Polaris, over the last 3 months, due to driver updates.

http://www.hardwarecanucks.com/foru.../73945-gtx-1060-vs-rx-480-updated-review.html

Yes, it has moved in Polaris's direction, but a flop-for-flop comparison is still far in favor of Pascal. Even losing to Polaris by 6% (the best result Polaris has there), the flops tell a different story: it should lose by 25% if per-flop performance were about equal. There is a large gulf here, and what you aren't taking into account is that DX12 is based on Mantle, which is an AMD API, so pushing AMD's strengths is expected.

A 6% loss to RX 480 is still a ~27% flops performance advantage.
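To sanity-check that claim, here's a minimal sketch of the per-flop comparison, reusing the theoretical GFLOPS figures from earlier in the thread and assuming a hypothetical benchmark where the 1060 trails the 480 by 6%; the arithmetic lands around 25%, in the same ballpark as the figure above:

```python
# Theoretical FP32 GFLOPS quoted earlier in the thread.
gtx_1060_gflops = 1280 * 2 * 1.708  # ~4372
rx_480_gflops = 2304 * 2 * 1.266    # ~5834

# Assume a benchmark where the GTX 1060 delivers 94% of the RX 480's frame rate.
relative_perf_1060 = 0.94  # RX 480 normalised to 1.0

# Performance delivered per theoretical GFLOP, 1060 relative to 480.
per_flop_advantage = (relative_perf_1060 / gtx_1060_gflops) / (1.0 / rx_480_gflops) - 1
print(f"Implied per-flop advantage for the 1060: {per_flop_advantage:.0%}")  # roughly 25%
```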
 
I'm in a bizarro world where I'm defending a Nintendo device from Nintendo fans being unreasonable about it. My head is spinning...
 

BGBW

Maturity, bitches.
There will be no facts in a few weeks; Nintendo never releases detailed technical specs. They will probably give you the names of the GPU and CPU, which we already know, and nothing else. We'll have to rely on speculation and "insiders" until a post-launch teardown.

In the end people buy a system for the games not the specs.
 
You also need to account for the effect of drivers and architecture. The Pascal cards are extremely strong on single-threaded performance, whereas the Polaris cards are stronger at async compute. This is evident from DX11 vs. DX12 game benchmarks.

The overall performance gap has shifted in favor of Polaris, over the last 3 months, due to driver updates.

http://www.hardwarecanucks.com/foru.../73945-gtx-1060-vs-rx-480-updated-review.html

Also, AMD cards tend to age better. A freaking HD 7970 can still hold its own in a lot of modern titles. You won't hit 60fps on Ultra, but it's no slideshow either.
 

Shahadan

Member
I'm in a bizarro world where I'm defending a Nintendo device from Nintendo fans being unreasonable about it. My head is spinning...

Even if your thread had been worded differently, there is no reasoning with people. We'll get the world's most powerful handheld, which will perform better than devices three times more expensive and more powerful, use less battery, and have the option to display games on the TV.

But all people will have to say is "but the specs should have been this, but Nintendo is dumb and lazy."
 

Doc Holliday

SPOILER: Columbus finds America
I'm in a bizarro world where I'm defending a Nintendo device from Nintendo fans being unreasonable about it. My head is spinning...

The problem is not that fans are expecting unreasonable performance from a hybrid system. The issue that many Nintendo fans, including myself, have is that they didn't want a hybrid system to begin with.

It's a dumb solution to the real problem of Nintendo not having enough internal resources to support a portable and a console system by themselves.

The console is probably weak, and it's bulky as a portable. They should have just made two systems running the same OS, with compatible chips and one unified account with cloud saves.

Once again they are making third-party ports harder than they should be; it makes no sense.
 

LordKano

Member
Chances are it'll be even worse because we're supposed to see mostly ports or games looking similar to Wii U.

Come on. Like they would hold a conference for ports. We'll see huge games made for the Switch and some third party games that could give us some indications about what to expect.
 
Even if your thread had been worded differently, there is no reasoning with people. We'll get the world's most powerful handheld, which will perform better than devices three times more expensive and more powerful, use less battery, and have the option to display games on the TV.

But all people will have to say is "but the specs should have been this, but Nintendo is dumb and lazy."

Why do you keep thinking of the Switch as a handheld? The North American website literally calls it their next home console. If anything, it's a standalone system with the option to go into a portable mode, not the other way around. I guess we will have to wait a little while longer to find out definitively.
 

BGBW

Maturity, bitches.
The problem is not that fans are expecting unreasonable performance from a hybrid system. The issue that many Nintendo fans, including myself, have is that they didn't want a hybrid system to begin with.

It's a dumb solution to the real problem of Nintendo not having enough internal resources to support a portable and a console system by themselves.

The console is probably weak, and it's bulky as a portable. They should have just made two systems running the same OS, with compatible chips and one unified account with cloud saves.

Once again they are making third-party ports harder than they should be; it makes no sense.

I don't know, instead of buying two bits of hardware for Nintendo games, I only need one.

OH GOD, I SOUND LIKE ONE OF THOSE AWFUL 'NINTENDO SHOULD GO THIRD PARTY'ERS.
 

Shaii

Member
Yes, it has moved in Polaris's direction, but a flop-for-flop comparison is still far in favor of Pascal. Even losing to Polaris by 6% (the best result Polaris has there), the flops tell a different story: it should lose by 25% if per-flop performance were about equal. There is a large gulf here, and what you aren't taking into account is that DX12 is based on Mantle, which is an AMD API, so pushing AMD's strengths is expected.

A 6% loss to RX 480 is still a ~27% flops performance advantage.

Mantle was an AMD API, and it was succeeded by the Vulkan API, which is a collaboration between several industry-leading technology companies. Calling DX12 (a Microsoft API, mind you) an AMD standard is plain wrong.

Going async is a natural evolution of software as well as hardware. It wasn't supported at all in DX11. AMD has been working on their GCN architecture for years, where it's supported natively. This, in other words, means that their cards have been under-performing due to their weaker single-threaded performance. Until very recently, nearly all games were developed with DX11. Nvidia went in another direction, focusing on the API that was primarily used, and thus most of their work has gone into optimizing their brute single-threaded strength.
 
I'm in a bizarro world where I'm defending a Nintendo device from Nintendo fans being unreasonable about it. My head is spinning...

The real problem with expectations actually came from Eurogamer to begin with. I don't understand why it's unreasonable to expect that the maximum (docked) performance would be around the same as a normal Tegra X1 (512 GFLOPS), which is what most of us expected. The fact that even that has been reduced by ~25% is of course disappointing. The CPU clocks being reduced by 50% is even worse.

I don't really understand why basing our expectations on a standard Tegra X1, which Eurogamer reported was the devkit, was somehow us being delusional.
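For what it's worth, here's a minimal sketch of where those percentages come from, assuming a stock 2-SM Tegra X1 baseline (256 CUDA cores at ~1GHz GPU, A57 cluster at ~1.9GHz as in the Shield TV; those baseline clocks are my assumption, not something confirmed in the thread) against the rumoured Switch clocks:

```python
# GPU: FP32 GFLOPS = CUDA cores * 2 FLOPs per cycle (FMA) * clock in GHz.
# Baseline clocks for a stock Tegra X1 are assumptions; Switch clocks are the rumoured figures.
tx1_gpu_gflops = 256 * 2 * 1.000     # ~512 GFLOPS, the figure quoted above
switch_gpu_gflops = 256 * 2 * 0.768  # ~393 GFLOPS at the rumoured 768MHz docked clock
print(f"GPU reduction: {1 - switch_gpu_gflops / tx1_gpu_gflops:.0%}")  # ~23%, roughly a quarter

# CPU: assumed stock A57 clock vs. the rumoured ~1GHz.
tx1_cpu_ghz, switch_cpu_ghz = 1.9, 1.0
print(f"CPU clock reduction: {1 - switch_cpu_ghz / tx1_cpu_ghz:.0%}")  # ~47%, close to the 50% mentioned
```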
 
I don't know, instead of buying two bits of hardware for Nintendo games, I only need one.

Also, Nintendo is smart to stop sinking money into consoles where they keep taking the L. Smart to concentrate on their mobile successes. And IMO, as a mobile device, the Switch is just as well thought out as the 3DS, if not more so (I say WAY more so).
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
The real problem with expectations actually came from Eurogamer to begin with. I don't understand why it's unreasonable to expect that the maximum (docked) performance would be around the same as a normal Tegra X1 (512 GFLOPS), which is what most of us expected. The fact that even that has been reduced by ~25% is of course disappointing. The CPU clocks being reduced by 50% is even worse.

I don't really understand why basing our expectations on a standard Tegra X1, which Eurogamer reported was the devkit, was somehow us being delusional.

This. Expecting PS4-level power was delusional, but I rarely saw any of that. Expecting an X1 wasn't delusional at all.
 

Donnie

Member
As for the 3SM dream, I'm starting to think there's really no chance of that. What would Nintendo truly gain by increasing their production costs by that much? An extra ~20-40% GPU power? Is that really something they're all that interested in? Especially when GPU functions are as scaleable as they are?

A 3SM design would offer 50% more GPU performance. Who knows what Nintendo are interested in. But I think a 3SM Tegra clocked at 768MHz lines up better with everything we've heard from devs, insiders and various rumours than a 2SM design at that clock speed (as does the need for a fan, of course).
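As a quick sketch of what 2 SMs vs. 3 SMs works out to at that clock (assuming the Maxwell-style 128 CUDA cores per SM of the Tegra X1; purely theoretical FP32 numbers), this is also where the "extra ~200GFLOPS" mentioned a bit further down comes from:

```python
# 2-SM vs. 3-SM Maxwell-style GPU at the 768MHz docked clock under discussion.
# Assumes 128 CUDA cores per SM (as on the Tegra X1); purely theoretical FP32 throughput.
CORES_PER_SM = 128
CLOCK_GHZ = 0.768

def gflops(sm_count):
    return sm_count * CORES_PER_SM * 2 * CLOCK_GHZ

two_sm, three_sm = gflops(2), gflops(3)
print(f"2 SM: {two_sm:.0f} GFLOPS, 3 SM: {three_sm:.0f} GFLOPS")  # ~393 vs ~590
print(f"3 SM adds {three_sm - two_sm:.0f} GFLOPS ({three_sm / two_sm - 1:.0%} more)")  # ~197 GFLOPS, +50%
```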
 
The real problem with expectations actually came from Eurogamer to begin with. I don't understand why it's unreasonable to expect that the maximum (docked) performance would be around the same as a normal Tegra X1 (512 GFLOPS), which is what most of us expected. The fact that even that has been reduced by ~25% is of course disappointing. The CPU clocks being reduced by 50% is even worse.

I don't really understand why basing our expectations on a standard Tegra X1, which Eurogamer reported was the devkit, was somehow us being delusional.

I do blame Eurogamer for thinking that the 20W X1 in the Shield TV would be used as-is in that tiny chassis. That's fucking nuts, even to my non-expert eyes.
 

Shahadan

Member
Why do you keep thinking of the Switch as a handheld? The North American website literally calls it their next home console. If anything, it's a standalone system with the option to go into a portable mode, not the other way around. I guess we will have to wait a little while longer to find out definitively.

It's a handheld simply because the possibility of using it as a handheld implied sacrifices (and goals to achieve that), and those were the main focus.
Being a handheld meant it had to be tiny, cheap, power-efficient, and cool, and it needed a screen (which means even more problems). So specs aren't going to be the main focus. The home console side isn't going to be the main focus either; home consoles deal with those problems differently.

It can only have been designed as a handheld first. They can call it a dildo on their website if they want to; that doesn't change anything.
It will serve as a home console since it can display shit on a TV, but it was not made as what people here call a "home console".
 
A 3SM design would offer 50% more GPU performance. Who knows what Nintendo are interested in. But I think a 3SM Tegra clocked at 768MHz lines up better with everything we've heard from devs, insiders and various rumours than a 2SM design at that clock speed (as does the need for a fan, of course).

I don't know if an extra 200GFLOPS is all that important to even most developers though. All that does is reduce the level at which you have to downgrade all your settings/effects. It shouldn't really fundamentally change how ports are done.

On the other hand, CPU performance would drastically change how some games have to be designed, right?

I do blame Eurogamer for thinking that the 20W X1 in the Shield TV would be used as-is in that tiny chassis. That's fucking nuts, even to my non-expert eyes.

The idea that they may be going 16nm also came from that Eurogamer/DF report, which could explain how they get TX1 performance into a tiny portable. It seemed like the likeliest outcome from there. Why would we expect Nintendo to give developers a devkit with ~50% more performance than the final product will have? It just wouldn't have made sense to assume a massive downclock back then.
 
Come on. Like they would hold a conference for ports. We'll see huge games made for the Switch and some third party games that could give us some indications about what to expect.


[image]
 