
Nintendo NX rumored to use Nvidia's Pascal GPU architecture


MuchoMalo

Banned
I'm not sure the X1 in the dev kits is overclocked; it needs active cooling to hit full clock speed (or near it) in mobile form. With Pascal, I reckon Nintendo will try to get as close to 500 GFLOPS as possible in mobile form, but even if they only get half that (250 GFLOPS), it's well above the Wii U (176 GFLOPS). I reckon it will be 350 GFLOPS at the lowest and 500 GFLOPS at best in mobile mode.

I just want to commend this post for actually being reasonable and based on logic, rather than feelings or assumptions.

It would also explain why some outlets are saying that it'll use an X1 when Nintendo would obviously use a custom chip: devs asked if the final hardware would be faster than the dev kit, Nintendo said no, devs assumed they were looking at the final chip.
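For anyone who wants to sanity-check the GFLOPS figures being thrown around, the arithmetic is just cores x 2 ops per cycle (an FMA counts as two) x clock speed. A minimal sketch, assuming the publicly listed 256-core TX1 configuration; the clock values are illustrative, not leaked specs:

```python
# Peak FP32 throughput: CUDA cores * 2 ops/cycle (FMA) * clock in GHz = GFLOPS.
# 256 cores is the public TX1 spec; the clocks below are illustrative guesses.
def gflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz

TX1_CORES = 256

print(gflops(TX1_CORES, 1.00))  # 512.0  -- the "0.5 TF" full-clock figure
print(gflops(TX1_CORES, 0.50))  # 256.0  -- roughly the 250 GFLOPS floor above
print(gflops(TX1_CORES, 0.69))  # 353.28 -- near the 350 GFLOPS guess
```

Every number in the 250-500 range quoted above is just that formula evaluated at a different clock.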
 

ozfunghi

Member
I'm not sure the X1 in the dev kits is overclocked; it needs active cooling to hit full clock speed (or near it) in mobile form. With Pascal, I reckon Nintendo will try to get as close to 500 GFLOPS as possible in mobile form, but even if they only get half that (250 GFLOPS), it's well above the Wii U (176 GFLOPS). I reckon it will be 350 GFLOPS at the lowest and 500 GFLOPS at best in mobile mode.

Well, let's say it's just running at 100% and not overclocked. Given that it reaches 0.5 TF, that Nvidia gets more performance than AMD out of those GFLOPS (due to less reliance on compute, as was once explained to me), that it's a much more modern chip than the AMD part inside the 360, and the statement that Zelda NX will look better than Zelda U... I'm failing to see why we shouldn't be expecting something that's a noticeable step up from the 360.
 

Chittagong

Gold Member
I thought dev kits need to be more powerful, and also that the dev kits could be for docked mode, where the system overclocks.

Anyways, you are probably right, but I'm just being cautious and keeping my expectations low. Either way, I should be happy as long as we can get a game that looks like Zelda BotW running at 540p in portable mode. I just watched a gameplay video of Zelda on my Vita and it looks amazing on the small screen, so to even match Wii U graphics at 540p on a portable would be insane for me.

This is the thing really, and why complaints about power are pointless to me: we will get an epic new mainline open-world Zelda on the go, and a mainline Pokemon at home. Just letting that sink in makes me have to go lie down.
 

Mr Swine

Banned
Didn't one of the leaks mention Nintendo going for low-end graphics and that it will be cheaper than all the other consoles?

I expect less than Wii U performance in portable mode, where the system is severely underclocked. However, due to the low-res 540p screen, games would look near-identical; for example, Zelda running at 540p would look like the Wii U version, which runs at 720p.

Once docked, the system will overclock to give it 2x performance. It may also have additional VRAM in the dock to help with the upscaling, but no way is there another GPU, as SLI is way too expensive. The dock will allow games to be upscaled from 720p to 1080p natively. But the performance of games in docked mode is basically 2x better than a Wii U, so just Zelda running at 1080p with no other graphics enhancements, or running at 720p with better graphics.

People mentioning this will be anywhere near Xbox One levels have got it all wrong.

So you basically think the NX performance will be a Vita but with a very modern architecture?
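For context on the resolution jumps being floated, the raw pixel counts are worth having on hand. A back-of-the-envelope only, since fill rate is far from the only cost:

```python
# Pixels per frame at the resolutions being discussed in this thread.
resolutions = {"540p": (960, 540), "720p": (1280, 720), "1080p": (1920, 1080)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(pixels["720p"] / pixels["540p"])   # ~1.78x more pixels from 540p to 720p
print(pixels["1080p"] / pixels["720p"])  # 2.25x more pixels from 720p to 1080p
```

Note that a flat 2x performance bump actually falls slightly short of the 2.25x pixel jump from 720p to 1080p, which is why "same graphics, higher resolution" is the natural trade-off.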
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Well, let's say it's just running at 100% and not overclocked. Given that it reaches 0.5 TF, that Nvidia gets more performance than AMD out of those GFLOPS (due to less reliance on compute, as was once explained to me), that it's a much more modern chip than the AMD part inside the 360, and the statement that Zelda NX will look better than Zelda U... I'm failing to see why we shouldn't be expecting something that's a noticeable step up from the 360.

I still have yet to see the statement that Zelda NX will "look better" than the Wii U version. Do you have a link?
 

KAL2006

Banned
Well, let's say it's just running at 100% and not overclocked. Given that it reaches 0.5 TF, that Nvidia gets more performance than AMD out of those GFLOPS (due to less reliance on compute, as was once explained to me), that it's a much more modern chip than the AMD part inside the 360, and the statement that Zelda NX will look better than Zelda U... I'm failing to see why we shouldn't be expecting something that's a noticeable step up from the 360.

Can I have a source for the statement that Zelda will look better on NX?
 

Rodin

Member
So you basically think the NX performance will be a Vita but with a very modern architecture?

The Vita can run Breath of the Wild at 540p? Lol

Not that I think what he said will happen, but the Vita is nowhere near that.

Anyway, again, these posts are ignoring osirisblack's hint (and lcgeek's). Not that I expect a powerhouse, but thinking that this will be closer to Wii U than Xbone sounds like a stretch to me at the moment. Even Emily said that the jump from Wii U is huge.

Emily Rogers said:
But everything that I’ve heard (so far) indicates that NX isn’t going to blow away any of the consoles on the market today…except for Wii U.
 

Ganondolf

Member
Well, let's say it's just running at 100% and not overclocked. Given that it reaches 0.5 TF, that Nvidia gets more performance than AMD out of those GFLOPS (due to less reliance on compute, as was once explained to me), that it's a much more modern chip than the AMD part inside the 360, and the statement that Zelda NX will look better than Zelda U... I'm failing to see why we shouldn't be expecting something that's a noticeable step up from the 360.

In my opinion, the Wii U's graphics are already about Xbox 360 level, and this should be about twice the Wii U's GPU in the worst case. It would also be more modern silicon, plus, like you said, Nvidia performs better than AMD's GPUs. It's worth noting that there is a lot more than just the GPU: the CPU will be better, and the RAM will be at least twice the size of the Wii U's, possibly even 4x the size.

And then on top of that it will be displaying in handheld mode on a screen that's 540p or at best 720p.

The jump in graphics from Wii u will be very noticeable.
 

KAL2006

Banned
We also have to remember that whether it's X1 or X2, it will likely be custom, which could mean fewer cores and a lower clock rate. If the NVIDIA Shield TV weren't on mains power and ran on a battery while playing high-performing games like Zelda, the battery would be gone within 39 minutes. So making direct comparisons is quite pointless.
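A rough way to sanity-check battery claims like that: runtime is just battery energy divided by average draw. The wattage and capacity numbers below are illustrative guesses on my part, not leaked specs:

```python
# Runtime in minutes = battery energy (Wh) / average system draw (W) * 60.
# Both inputs are illustrative assumptions, not known NX or Shield figures.
def runtime_min(battery_wh: float, draw_watts: float) -> float:
    return battery_wh / draw_watts * 60

print(runtime_min(15, 23))  # ~39 min at full-tilt, console-class draw
print(runtime_min(15, 5))   # 180 min if heavily downclocked for portable mode
```

Which is exactly why a big underclock in portable mode is the expected design.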
 
This is the thing really, and why complaints about power are pointless to me: we will get an epic new mainline open-world Zelda on the go, and a mainline Pokemon at home. Just letting that sink in makes me have to go lie down.

It's possible for someone to be very excited about both portable and graphically superior Nintendo games and also be hopeful that the NX reaches a certain power level making it capable of receiving AAA ports (not that the power guarantees the ports).

Not mutually exclusive.
 
Since someone brought it up and it's relevant, here's a video showing how Maxwell and Pascal stack up when scaled to the same FLOPS. The results may surprise you... unless you've been reading this thread, of course.



It has to if it's using a Pascal-based Tegra.

Yeah, that should be well known by everyone who's paid attention to Pascal performance. It's only a tweaked version of Maxwell on a smaller fab; it's the higher clocks at the same power usage / better power efficiency that would lead to the performance increase.
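To put rough numbers on "same architecture, better node": if the process change really does cut power at iso-performance, the same power budget buys proportionally higher clocks. A crude sketch that treats power as roughly linear in clock at fixed voltage, which is a simplifying assumption that only holds near the efficient part of the curve:

```python
# If the new node needs only `power_factor` of the old node's power at the
# same clock, the clock headroom at the same power budget is ~1/power_factor.
# Assumes power scales ~linearly with clock at fixed voltage -- a rough
# approximation, not a process-engineering claim.
def iso_power_clock_gain(power_factor: float) -> float:
    return 1.0 / power_factor

print(iso_power_clock_gain(0.6))  # ~1.67x clocks from a 40% power saving
print(iso_power_clock_gain(0.4))  # 2.5x from the "~60% less" figure cited later
```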
 

Hoo-doo

Banned
We also have to remember that whether it's X1 or X2, it will likely be custom, which could mean fewer cores and a lower clock rate. If the NVIDIA Shield TV weren't on mains power and ran on a battery while playing high-performing games like Zelda, the battery would be gone within 39 minutes. So making direct comparisons is quite pointless.

Seriously? I thought the Shield had quite a respectable battery life?

Damn, I hope Nintendo stuffs that thing with a 10,000mAh battery or something.
 

Ganondolf

Member
Seriously? I thought the Shield had quite a respectable battery life?

Damn, I hope Nintendo stuffs that thing with a 10,000mAh battery or something.

The Shield TV is the set-top box, not the handheld. It runs on mains power only.

This is why Pascal is important to reduce power draw and heat.
 
I thought dev kits need to be more powerful, and also that the dev kits could be for docked mode, where the system overclocks.

Anyways, you are probably right, but I'm just being cautious and keeping my expectations low. Either way, I should be happy as long as we can get a game that looks like Zelda BotW running at 540p in portable mode. I just watched a gameplay video of Zelda on my Vita and it looks amazing on the small screen, so to even match Wii U graphics at 540p on a portable would be insane for me.

Dev kits usually have more memory, but they don't need to be more powerful.
 
In my opinion, the Wii U's graphics are already about Xbox 360 level, and this should be about twice the Wii U's GPU in the worst case. It would also be more modern silicon, plus, like you said, Nvidia performs better than AMD's GPUs. It's worth noting that there is a lot more than just the GPU: the CPU will be better, and the RAM will be at least twice the size of the Wii U's, possibly even 4x the size.

And then on top of that it will be displaying in handheld mode on a screen that's 540p or at best 720p.

The jump in graphics from Wii u will be very noticeable.

Hell, I thought we'd be lucky if we could get Wii U-ish performance on the next handheld.

Not because I'm one of the weird people who insist Nintendo never uses the latest chips, but because of the battery life.
 

KingSnake

The Birthday Skeleton
Hell, I thought we'd be lucky if we could get Wii U-ish performance on the next handheld.

Really. Months before this leak (before the SemiAccurate leak), most people thought Wii U performance in a handheld was quite optimistic, more like the maximum expectation to have; now it would mean a Tegra gimped almost to the max. No matter how one tries to spin it, this is good news.
 

Eolz

Member
Really. Months before this leak (before the SemiAccurate leak), most people thought Wii U performance in a handheld was quite optimistic, more like the maximum expectation to have; now it would mean a Tegra gimped almost to the max. No matter how one tries to spin it, this is good news.

To be fair on this, at that point everyone thought there'd be two separate platforms, and that Nvidia would never be in a handheld again after the 3DS fiasco.
 

G.ZZZ

Member
There is no way the chipset is overclocked in any mode, for various reasons:

- First, overclocking drastically increases heat and power consumption for marginal gains. The performance/wattage curve rises quickly and then flattens out; the "nominal" speed is where the curve begins to flatten. Pushing the speed past that point comes at extremely high thermal and power costs. As this has to fit in a tablet-sized device, there's no way it will be overclocked.

- Second, not all chips can be reliably overclocked; only higher-quality ones can. Making every chip overclock reliably in docked mode would increase costs significantly, since it would decrease yields.

- Third, the chipset may come close to its thermal limits in a small form factor even before full clock (I think they'll go for a pretty bulky handheld, but 5" is obviously possible too). That said, if the X1 is a devkit and this is using Pascal, there should be absolutely no problem hitting X1 performance without heating issues.

My personal assumption is that this is extremely similar to the X1 (i.e. a 2-core GPU and a 4-core A57/A53/whatever CPU), but with Pascal instead of Maxwell. That means a 2-core GPU would easily perform above a full-clock X1, even at much lower wattage and heat targets (keep in mind the 16nm process makes it consume about 60% less power than 20nm for the same level of performance).

I don't see them using 1 core, because we'd get:

- a significantly worse GPU compared to the devkits, even with Pascal
- a different GPU configuration compared to an X1, which means porting the code of games in development from a devkit to the new chip wouldn't be as trivial as it would be with an identical configuration (Pascal is basically the same as Maxwell but on a smaller node)

I don't see them using 3 cores either, because of cost and heat considerations; plus there's no way you could emulate the performance of a 3-core Pascal with an X1, even overclocked.
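To illustrate the shape of that curve: dynamic power scales roughly with V^2 x f, and past the nominal clock the voltage has to rise with frequency, so power grows much faster than performance. A toy model, where every constant is made up purely for illustration:

```python
# Toy model of the performance/wattage curve: power ~ V^2 * f, and voltage
# must climb once you push past the nominal clock. All constants invented.
NOMINAL_F = 1.0  # nominal clock, normalized

def voltage(f: float) -> float:
    # Flat up to the nominal clock, then rising with frequency (made-up slope).
    return 1.0 if f <= NOMINAL_F else 1.0 + 0.5 * (f - NOMINAL_F)

def power(f: float) -> float:
    return voltage(f) ** 2 * f  # capacitance folded into the units

for f in (0.5, 1.0, 1.25, 1.5):
    print(f"clock {f:.2f}x -> perf {f:.2f}x, power {power(f) / power(1.0):.2f}x")
# clock 1.50x -> perf 1.50x, power 2.34x: +50% speed for ~2.3x the power draw
```

That super-linear tail is the whole argument against overclocking in a tablet-sized shell.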
 
Anyways, you are probably right, but I'm just being cautious and keeping my expectations low. Either way, I should be happy as long as we can get a game that looks like Zelda BotW running at 540p in portable mode. I just watched a gameplay video of Zelda on my Vita and it looks amazing on the small screen, so to even match Wii U graphics at 540p on a portable would be insane for me.

I just did the same, and it is glorious. It's almost inconceivable that Nintendo may be on the verge of releasing a portable with those capabilities at minimum. I'm still hopeful for healthy graphical improvements, but I'd be thrilled either way.
 
There is no way the chipset is overclocked in any mode, for various reasons:

- First, overclocking drastically increases heat and power consumption for marginal gains. The performance/wattage curve rises quickly and then flattens out; the "nominal" speed is where the curve begins to flatten. Pushing the speed past that point comes at extremely high thermal and power costs. As this has to fit in a tablet-sized device, there's no way it will be overclocked.

- Second, not all chips can be reliably overclocked; only higher-quality ones can. Making every chip overclock reliably in docked mode would increase costs significantly, since it would decrease yields.

- Third, the chipset may come close to its thermal limits in a small form factor even before full clock (I think they'll go for a pretty bulky handheld, but 5" is obviously possible too). That said, if the X1 is a devkit and this is using Pascal, there should be absolutely no problem hitting X1 performance without heating issues.

My personal assumption is that this is extremely similar to the X1 (i.e. a 2-core GPU and a 4-core A57/A53/whatever CPU), but with Pascal instead of Maxwell. That means a 2-core GPU would easily perform above a full-clock X1, even at much lower wattage and heat targets (keep in mind the 16nm process makes it consume about 60% less power than 20nm for the same level of performance).

I don't see them using 1 core, because we'd get:

- a significantly worse GPU compared to the devkits, even with Pascal
- a different GPU configuration compared to an X1, which means porting the code of games in development from a devkit to the new chip wouldn't be as trivial as it would be with an identical configuration (Pascal is basically the same as Maxwell but on a smaller node)

I don't see them using 3 cores either, because of cost and heat considerations; plus there's no way you could emulate the performance of a 3-core Pascal with an X1, even overclocked.

You're thinking too hard about the term... "overclocking" in this case could simply mean running it higher than its default state. The Vita, for instance, runs at 333MHz with a 444MHz "boost" state; that could loosely be defined as overclocking.
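In that looser sense, docked vs. portable is just two clock profiles on the same chip, exactly like the Vita's two states. A minimal sketch of the idea; the clock numbers are the Vita's, not NX guesses:

```python
# Two power states on one chip, Vita-style: the higher profile is a mode
# switch within rated limits, not overclocking past what the silicon allows.
CLOCK_PROFILES_MHZ = {
    "portable": 333,  # the Vita's default GPU clock
    "boost": 444,     # the Vita's opt-in higher state
}

boost_gain = CLOCK_PROFILES_MHZ["boost"] / CLOCK_PROFILES_MHZ["portable"]
print(boost_gain)  # ~1.33x from a profile change, no "overclock" required
```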
 

MuchoMalo

Banned
There is no way the chipset is overclocked in any mode, for various reasons:

- First, overclocking drastically increases heat and power consumption for marginal gains. The performance/wattage curve rises quickly and then flattens out; the "nominal" speed is where the curve begins to flatten. Pushing the speed past that point comes at extremely high thermal and power costs. As this has to fit in a tablet-sized device, there's no way it will be overclocked.

- Second, not all chips can be reliably overclocked; only higher-quality ones can. Making every chip overclock reliably in docked mode would increase costs significantly, since it would decrease yields.

- Third, the chipset may come close to its thermal limits in a small form factor even before full clock (I think they'll go for a pretty bulky handheld, but 5" is obviously possible too). That said, if the X1 is a devkit and this is using Pascal, there should be absolutely no problem hitting X1 performance without heating issues.

My personal assumption is that this is extremely similar to the X1 (i.e. a 2-core GPU and a 4-core A57/A53/whatever CPU), but with Pascal instead of Maxwell. That means a 2-core GPU would easily perform above a full-clock X1, even at much lower wattage and heat targets (keep in mind the 16nm process makes it consume about 60% less power than 20nm for the same level of performance).

I don't see them using 1 core, because we'd get:

- a significantly worse GPU compared to the devkits, even with Pascal
- a different GPU configuration compared to an X1, which means porting the code of games in development from a devkit to the new chip wouldn't be as trivial as it would be with an identical configuration (Pascal is basically the same as Maxwell but on a smaller node)

I don't see them using 3 cores either, because of cost and heat considerations; plus there's no way you could emulate the performance of a 3-core Pascal with an X1, even overclocked.

The idea is more that it would be underclocked in mobile mode than that it would be overclocked in docked mode.
 

Rodin

Member
There is no way the chipset is overclocked in any mode, for various reasons:

- First, overclocking drastically increases heat and power consumption for marginal gains. The performance/wattage curve rises quickly and then flattens out; the "nominal" speed is where the curve begins to flatten. Pushing the speed past that point comes at extremely high thermal and power costs. As this has to fit in a tablet-sized device, there's no way it will be overclocked.

- Second, not all chips can be reliably overclocked; only higher-quality ones can. Making every chip overclock reliably in docked mode would increase costs significantly, since it would decrease yields.

- Third, the chipset may come close to its thermal limits in a small form factor even before full clock (I think they'll go for a pretty bulky handheld, but 5" is obviously possible too). That said, if the X1 is a devkit and this is using Pascal, there should be absolutely no problem hitting X1 performance without heating issues.

My personal assumption is that this is extremely similar to the X1 (i.e. a 2-core GPU and a 4-core A57/A53/whatever CPU), but with Pascal instead of Maxwell. That means a 2-core GPU would easily perform above a full-clock X1, even at much lower wattage and heat targets (keep in mind the 16nm process makes it consume about 60% less power than 20nm for the same level of performance).

I don't see them using 1 core, because we'd get:

- a significantly worse GPU compared to the devkits, even with Pascal
- a different GPU configuration compared to an X1, which means porting the code of games in development from a devkit to the new chip wouldn't be as trivial as it would be with an identical configuration (Pascal is basically the same as Maxwell but on a smaller node)

I don't see them using 3 cores either, because of cost and heat considerations; plus there's no way you could emulate the performance of a 3-core Pascal with an X1, even overclocked.

The "dock and portable mode" argument isn't about components getting an oc in dock mode, but about running at full speed when docked and downlocked when in portable mode, to save battery life and avoid overheating.

The overclock argument is about the X1 in devkits supposedly running at a higher clock to reach the performances of the chip that will be actually used (and that we now know is based on Pascal) because the cooling solution is very noisy, which shouldn't be the case with TX1.
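Following that logic, you can estimate the clock an X1 devkit would need in order to stand in for a faster Pascal target. The target figure below is purely hypothetical, just to show the arithmetic:

```python
# Clock (GHz) a 256-core X1 needs to hit a given FP32 target.
# The 768 GFLOPS target is a hypothetical, not a leaked spec.
def required_clock_ghz(target_gflops: float, cuda_cores: int = 256) -> float:
    return target_gflops / (cuda_cores * 2)

print(required_clock_ghz(768))  # 1.5 GHz -- which would explain a loud fan
```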
 

KAL2006

Banned
I expect the NX to be the same as an NVIDIA Shield TV, but on the Pascal architecture. I also expect it to have a custom chip with fewer cores than standard Pascal chips. Finally, I don't expect it to overclock with fans or cooling: docked would be standard clock and portable would be underclocked.

So no way do I expect anything near XB1.

0.5x more powerful than a Wii U docked
Slightly less powerful than a Wii U undocked, with decent battery life and a 540p screen

Cheap price point of $250

People need to keep their expectations in check
 

Oregano

Member
I expect the NX to be the same as an NVIDIA Shield TV, but on the Pascal architecture. I also expect it to have a custom chip with fewer cores than standard Pascal chips. Finally, I don't expect it to overclock with fans or cooling: docked would be standard clock and portable would be underclocked.

So no way do I expect anything near XB1.

0.5x more powerful than a Wii U docked
Slightly less powerful than a Wii U undocked, with decent battery life and a 540p screen

Cheap price point of $250

People need to keep their expectations in check

It's already been said, but that would mean the X1 in the dev kits is wholly unrepresentative of the hardware, which would make it useless.

Also $250 for that wouldn't be cheap.
 
I expect the NX to be the same as an NVIDIA Shield TV, but on the Pascal architecture. I also expect it to have a custom chip with fewer cores than standard Pascal chips. Finally, I don't expect it to overclock with fans or cooling: docked would be standard clock and portable would be underclocked.

So no way do I expect anything near XB1.

0.5x more powerful than a Wii U docked
Slightly less powerful than a Wii U undocked, with decent battery life and a 540p screen

Cheap price point of $250

People need to keep their expectations in check


If you expect it to be 512 GFLOPS when docked, then there's no way it's only 1.5x Wii U; 512/176 is roughly 2.9x on paper.
 

G.ZZZ

Member
I wrote that because I saw some people talking about "overclock" when there is a near-zero chance this is actually OC'd, even in docked mode. It's full clock or near full clock at best, even in docked mode.
 
I wrote that because I saw some people talking about "overclock" when there is a near-zero chance this is actually OC'd, even in docked mode. It's full clock or near full clock at best, even in docked mode.

Again, you're getting hung up on the word without taking in the context everyone is using it in.
 

G.ZZZ

Member
I expect the NX to be the same as an NVIDIA Shield TV, but on the Pascal architecture. I also expect it to have a custom chip with fewer cores than standard Pascal chips. Finally, I don't expect it to overclock with fans or cooling: docked would be standard clock and portable would be underclocked.

So no way do I expect anything near XB1.

0.5x more powerful than a Wii U docked
Slightly less powerful than a Wii U undocked, with decent battery life and a 540p screen

Cheap price point of $250

People need to keep their expectations in check

1.5 times a Wii U is a 1-core Pascal clocked at less than half its full nominal clock, or less than half an X1. Why would they give out devkits that much faster than the actual console? Do they want people to trash all the code they're writing on it?
 

KAL2006

Banned
Remember the NVIDIA Shield TV is quite large, has a fan inside, and runs on mains power because the architecture inside draws a lot of power. Like I said, they will not only need to use Pascal for this to work as a portable, they will also need a custom, less powerful chip so it doesn't drain the battery and can go fanless.

All I am saying is we should not treat it as an X1 equivalent and assume that's the power. A custom X2 could potentially mean it may not be as powerful as one assumes.

Just an idea: what if inside the NX there are 2 cores, but 1 switches off in portable mode? Then when docked, both turn on for double the resolution. Do any technical people know if this would be feasible?
 

MuchoMalo

Banned
I expect the NX to be the same as an NVIDIA Shield TV, but on the Pascal architecture. I also expect it to have a custom chip with fewer cores than standard Pascal chips. Finally, I don't expect it to overclock with fans or cooling: docked would be standard clock and portable would be underclocked.

So no way do I expect anything near XB1.

0.5x more powerful than a Wii U docked
Slightly less powerful than a Wii U undocked, with decent battery life and a 540p screen

Cheap price point of $250

People need to keep their expectations in check

Well, the GPU only has two "cores," so you're expecting that the dev kits are around 4-5x more powerful than the actual hardware. Why would that be necessary?

People need to learn that there's a difference between keeping expectations in check and just being flat-out pessimistic, especially if you're expecting $250 for that kind of hardware and calling it cheap...
 

G.ZZZ

Member
Remember the NVIDIA Shield TV is quite large, has a fan inside, and runs on mains power because the architecture inside draws a lot of power. Like I said, they will not only need to use Pascal for this to work as a portable, they will also need a custom, less powerful chip so it doesn't drain the battery and can go fanless.

All I am saying is we should not treat it as an X1 equivalent and assume that's the power. A custom X2 could potentially mean it may not be as powerful as one assumes.

Just an idea: what if inside the NX there are 2 cores, but 1 switches off in portable mode? Then when docked, both turn on for double the resolution. Do any technical people know if this would be feasible?

Why would they give devkits with an X1 and then go for half or a third of that?

Also, $250 for a 1-core Pascal GPU and some ARM CPU isn't cheap at all. The Shield came out at $200, and it's a limited product with the X1, which was the fastest and newest Tegra at the time.

I'm not saying it's impossible; I'm saying that based on the two leaks/rumors we had (the devkit is an X1, but the actual console uses Pascal instead), the logical conclusion is not that the final console is a third the speed of the devkits and costs $250.

The Wii U devkits were the same as the final console, but with 3 GB of RAM instead of 2 and slightly slower clocks, IIRC.
 

wildfire

Banned
The idea is more that it would be underclocked in mobile mode than that it would be overclocked in docked mode.

I personally try to avoid that mistake. I don't downclock a 4770K to 2.0 GHz and call reverting it back to its base 4.0 GHz overclocking.

I hope more people are mindful that overclocking isn't about getting some higher number.

G.ZZZ already said it but I'll be even more succinct.

Overclocking means drastically increasing heat and breaking down your chip faster (I'm exaggerating to be simple) in order to get some nice performance over standard (base) clocks.

Overclocking is like going from working 30 hours a week to 60 hours. It's a lot of stress for arguably questionable gains.
 

KAL2006

Banned
Again, why would they give devkits with an X1 and then go for half or a third of that?

We don't know the full details of the dev kits.

It could easily be overclocked X1 dev kits to match Pascal's power efficiency, with the details of it being custom simply not leaked properly. Also, aren't dev kits more powerful anyway? And this devkit could also represent the system when docked.
 

kami_sama

Member
I personally try to avoid that mistake. I don't downclock a 4770K to 2.0 GHz and call reverting it back to its base 4.0 GHz overclocking.

I hope more people are mindful that overclocking isn't about getting some higher number.

G.ZZZ already said it but I'll be even more succinct.

Overclocking means drastically increasing heat and breaking down your chip faster (I'm exaggerating to be simple) in order to get some nice performance over standard (base) clocks.

Overclocking is like going from working 30 hours a week to 60 hours. It's a lot of stress for arguably questionable gains.

I don't know if simply overclocking can diminish the life of the chip. Overvolting can, but I don't think they will do that.
 

MuchoMalo

Banned
Remember the NVIDIA Shield TV is quite large, has a fan inside, and runs on mains power because the architecture inside draws a lot of power. Like I said, they will not only need to use Pascal for this to work as a portable, they will also need a custom, less powerful chip so it doesn't drain the battery and can go fanless.

All I am saying is we should not treat it as an X1 equivalent and assume that's the power. A custom X2 could potentially mean it may not be as powerful as one assumes.

Just an idea: what if inside the NX there are 2 cores, but 1 switches off in portable mode? Then when docked, both turn on for double the resolution. Do any technical people know if this would be feasible?

Well first of all, if there are two then it's already going to be way faster than your guess. Second, it would actually be easier and would reduce heat more if they cut the clock in half than if they disabled an SM.
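To see why: at half the clock you can also drop the voltage, so power falls faster than performance, whereas gating one of two SMs halves compute but leaves the remaining hardware at full clock and voltage. A rough comparison; the voltage figure is a made-up illustration:

```python
# Two ways to halve GPU throughput, compared on idealized dynamic power.
# Power ~ active_SMs * V^2 * f; a lower clock permits a lower voltage.
def power(active_sms: int, f: float, v: float) -> float:
    return active_sms * v**2 * f

full = power(active_sms=2, f=1.0, v=1.0)

print(power(1, 1.0, 1.0) / full)  # 0.50x power: one SM off, full clock/voltage
print(power(2, 0.5, 0.8) / full)  # 0.32x power: both SMs, half clock, -20% V
```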

We don't know the full details of the dev kits.

It could easily be overclocked X1 dev kits to match Pascal's power efficiency, with the details of it being custom simply not leaked properly. Also, aren't dev kits more powerful anyway? And this devkit could also represent the system when docked.

Early dev kits can be more powerful, but not this close to launch and certainly not around 4x as powerful. There's also no reason for the dev kits to have a fan unless the chip is running at at least full speed. Also, "overclocked to match efficiency" is an oxymoron. It makes no sense.
 
Remember the NVIDIA Shield TV is quite large, has a fan inside, and runs on mains power because the architecture inside draws a lot of power. Like I said, they will not only need to use Pascal for this to work as a portable, they will also need a custom, less powerful chip so it doesn't drain the battery and can go fanless.

All I am saying is we should not treat it as an X1 equivalent and assume that's the power. A custom X2 could potentially mean it may not be as powerful as one assumes.

Just an idea: what if inside the NX there are 2 cores, but 1 switches off in portable mode? Then when docked, both turn on for double the resolution. Do any technical people know if this would be feasible?

It's perfectly fine to have low expectations, but you should accept that based on the Eurogamer report, we should expect at least around TX1 performance either in docked or handheld mode, which is quite a bit more powerful than Wii U performance. It's realistic to expect this, not pie in the sky crazy.

Why would they give devkits with an X1 and then go for half or a third of that?

Also, $250 for a 1-core Pascal GPU and some ARM CPU isn't cheap at all. The Shield came out at $200, and it's a limited product with the X1, which was the fastest and newest Tegra at the time.

I'm not saying it's impossible; I'm saying that based on the two leaks/rumors we had (the devkit is an X1, but the actual console uses Pascal instead), the logical conclusion is not that the final console is a third the speed of the devkits and costs $250.

Yeah this is another interesting point- nVidia needs pretty high margins on their hardware since they don't get any licensing revenue from Shield games, right? In that case Nintendo would likely be able to offer a hypothetical Shield clone for cheaper than $200, since they get the majority of their money from licensing and software, and can just about break even on hardware.

Not sure how this speaks to the NX pricing but it's something to keep in mind.
 
We don't know the full details of the dev kits.

It could easily be overclocked X1 dev kits to match Pascal's power efficiency, with the details of it being custom simply not leaked properly. Also, aren't dev kits more powerful anyway? And this devkit could also represent the system when docked.



You don't overclock a product to match a power efficiency; that's nonsense.
 

Oregano

Member
We don't know the full details of the dev kits.

It could easily be overclocked X1 dev kits to match Pascal's power efficiency, with the details of it being custom simply not leaked properly. Also, aren't dev kits more powerful anyway? And this devkit could also represent the system when docked.

But they wouldn't have to overclock X1 to match Pascal if they were going for something less powerful than a standard X1. They also wouldn't customise the X1 for dev kits. If they were aiming at such a small amount of power they would just throw a K1 in the dev kits.
 
We don't know the full details of the dev kits.

It could easily be overclocked X1 dev kits to match Pascal's power efficiency, with the details of it being custom simply not leaked properly. Also, aren't dev kits more powerful anyway? And this devkit could also represent the system when docked.

Dev kits usually have more memory, but are not usually more powerful.
 

G.ZZZ

Member
What is this "Wii U is 176 GFLOPS" stuff? That is a fair bit lower than the 360's and PS3's GPU ratings.

Not that much tbh:

[image: Tegra K1 vs. PS3 comparison chart]
 

Oregano

Member
It's perfectly fine to have low expectations, but you should accept that based on the Eurogamer report, we should expect at least around TX1 performance either in docked or handheld mode, which is quite a bit more powerful than Wii U performance. It's realistic to expect this, not pie in the sky crazy.



Yeah this is another interesting point- nVidia needs pretty high margins on their hardware since they don't get any licensing revenue from Shield games, right? In that case Nintendo would likely be able to offer a hypothetical Shield clone for cheaper than $200, since they get the majority of their money from licensing and software, and can just about break even on hardware.

Not sure how this speaks to the NX but it's something to keep in mind.

There is also an incentive for Nvidia to give Nintendo a good deal, because powering the NX will likely be their biggest Tegra win, and considering it took Nintendo 15 years to switch from the GameCube architecture, they'd probably have business going forward for at least 10 years.
 

G.ZZZ

Member
But they wouldn't have to overclock X1 to match Pascal if they were going for something less powerful than a standard X1. They also wouldn't customise the X1 for dev kits. If they were aiming at such a small amount of power they would just throw a K1 in the dev kits.

That too; a K1 is still 364 GFLOPS, or more than twice the Wii U. If the console had to be considerably worse than an X1 (which is 512), there's no reason for the devkits to use an X1 instead of a cheaper K1 that's closer to the actual performance.

I guess it was the consensus that they are aiming for more power than the Tegra X1... that is why they chose Pascal.

It's possible that it just costs less, as the 20nm node is being abandoned, on top of consuming much less power, which helps immensely for a mobile unit.
 