
Rumor: Wii U final specs

i-Lo

Member
Well something I've learned from Nintendo fans is that if a comment is positive, then we should pay close attention to what that developer is saying. If something is negative, then clearly there is an ulterior motive.

Well yea.. because it's uhh... true.
 
Well something I've learned from Nintendo fans is that if a comment is positive, then we should pay close attention to what that developer is saying. If something is negative, then clearly there is an ulterior motive.
It's not a Nintendo fan exclusive phenomenon.

Very pronounced right now because of reality being crushed for some of them though.
 

Hawk269

Member
I am curious what they can do with that much memory available for games, though. No system before it has had that much RAM; yes, it is slow and all, but I am sure some talented developer should be able to do something amazing.

It is too bad that Nintendo did not have something more from 1st party to show what the system can do.
 

TheD

The Detective
Vastly different is a step too far, but I'm not sure how you can reconcile very similar with the slower RAM and CPU.

Obviously if the latter is true (it is) there is a different focus in core system design. Leading to different restrictions and design imperatives to achieve parity.

Speed does not have anything to do with how similar the architectures are to each other.


Very mature lol...

WiiU uses a GPGPU setup according to Mr Iwata. Most of the PS360 games are very CPU intensive, so throwing PS360 code at it without enough time to tweak and optimize the code was always going to lead to issues with launch ports.

The 360 also has a GPGPU. GPGPU setups only work well for a small amount of code, and they take GPU time away from graphics.


I've mentioned the slower RAM, but 'slow CPU'? I didn't realise they had released official specs or teardown specs of the CPU. Care to provide a link, or are you going off the Metro developer's comments about the very first devkit from 2011?

Simple math: the measured size of the WiiU CPU is about 1/3 the size of the 360 CPU, and they both use the same process node.

It physically cannot be as fast without an extremely high clock speed, but that would make a huge amount of heat and use a huge amount of power (both things the WiiU clearly does not do).
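That "simple math" can be sketched as a back-of-the-envelope calculation. The die areas below are hypothetical placeholders (nobody in the thread has exact measurements); the sketch just assumes transistor count scales roughly with area on the same process node:

```python
# Back-of-the-envelope: on the same process node, transistor budget
# scales roughly with die area. Areas here are hypothetical placeholders.

def relative_transistor_budget(area_a_mm2: float, area_b_mm2: float) -> float:
    """Ratio of transistor budgets, assuming equal transistor density."""
    return area_a_mm2 / area_b_mm2

# Assumed: WiiU CPU ~33 mm^2 vs. a 360 CPU ~100 mm^2 on the same node.
ratio = relative_transistor_budget(33.0, 100.0)
print(f"WiiU CPU transistor budget ~ {ratio:.0%} of the 360 CPU's")
```

With roughly a third of the transistor budget, matching Xenon's throughput would need a much higher clock, which is exactly the heat/power contradiction the post points out.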
 

clem84

Gold Member
I'm wondering what kind of games we can expect from this system. Modest CPU, lots of RAM, an advanced GPU. I mean, what kind of games will this produce for developers who take the time to dig deep into this hardware? Great big game worlds with limited AI and limited physics?

I'm wondering if they ran into problems with Pikmin 3 (obviously very AI intensive).
 
Speed does not have anything to do with how similar the architectures are to each other.




The 360 also has a GPGPU. GPGPU setups only work well for a small amount of code, and they take GPU time away from graphics.




Simple math: the measured size of the WiiU CPU is about 1/3 the size of the 360 CPU, and they both use the same process node.

It physically cannot be as fast without an extremely high clock speed, but that would make a huge amount of heat and use a huge amount of power (both things the WiiU clearly does not do).

If it's true about the 360 using GPGPU tech, then fair enough, my bad.

I never said the CPU was as fast as Xenon or Cell; I actually said I would guess 1.6 - 2Ghz, if you read my posts.
 

TheD

The Detective
If it's true about the 360 using GPGPU tech, then fair enough, my bad.

I never said the CPU was as fast as Xenon or Cell; I actually said I would guess 1.6 - 2Ghz, if you read my posts.

We are not talking about the clock speed, we are talking about how fast the CPU is in processing power.
 
Speed does not have anything to do with how similar the architectures are to each other.




The 360 also has a GPGPU. GPGPU setups only work well for a small amount of code, and they take GPU time away from graphics.




Simple math: the measured size of the WiiU CPU is about 1/3 the size of the 360 CPU, and they both use the same process node.

It physically cannot be as fast without an extremely high clock speed, but that would make a huge amount of heat and use a huge amount of power (both things the WiiU clearly does not do).

Do not forget that both the CPU AND the GPU are on that same die, thus adding to the size.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Do not forget that both the CPU AND the GPU are on that same die, thus adding to the size.

No, they are not on the same die, they are separate dies on an MCM, a different thing entirely.

(In the linked picture, the large die is the GPU, and the smaller one the CPU)
 
Speed does not have anything to do with how similar the architectures are to each other.
So a low-clocked, tri-core, single-threaded OoOE CPU versus high-clocked, FP-heavy, in-order CPUs are similar architectures?

Or how about the low bandwidth memory? Or 32 MB scratchpad on the GPU?

They definitely aren't the same. For one thing the CPU completely sucks on one setup vs the other two. Any code requiring that class of CPU is not going to find an analog in WiiU. Similar overall system capability, but differing ways of reaching it.

That isn't any kind of excuse for the first round of games. Nor is it "blaming the devs"; they worked with what they had in the time they had. The 360 and PS3 were much closer analogs and that rarely worked out (mainly in the PS3's case). Throw another esoteric design into the mix, of similar capability, with no userbase, sloppy documentation, and a history of the platform manufacturer cannibalizing software sales, and you've got a recipe for "iffy" software.
 
So a low-clocked, tri-core, single-threaded OoOE CPU versus high-clocked, FP-heavy, in-order CPUs are similar architectures?

Or how about the low bandwidth memory? Or 32 MB scratchpad on the GPU?

They definitely aren't the same. For one thing the CPU completely sucks on one setup vs the other two. Any code requiring that class of CPU is not going to find an analog in WiiU. Similar overall system capability, but differing ways of reaching it.

That isn't any kind of excuse for the first round of games. Nor is it "blaming the devs"; they worked with what they had in the time they had. The 360 and PS3 were much closer analogs and that rarely worked out (mainly in the PS3's case). Throw another esoteric design into the mix, of similar capability, with no userbase, sloppy documentation, and a history of the platform manufacturer cannibalizing software sales, and you've got a recipe for "iffy" software.

Good point about the CPU being OoO; I completely forgot about that. Another reason why the CPU wouldn't have to be clocked anywhere near 3Ghz.
 
The 360 also has a GPGPU. GPGPU setups only work well for a small amount of code, and they take GPU time away from graphics.

More modern GPUs are designed to handle GPGPU tasks more efficiently, and as of now we don't have a lot of information about the customizations done to Wii U's GPU.
 
Good point about the CPU being OoO; I completely forgot about that. Another reason why the CPU wouldn't have to be clocked anywhere near 3Ghz.
We're not looking at anything close to feasible parity in regards to the CPU though.

A clusterfuck of bad design decisions that are potentially very limiting. But at least we're getting close to an answer on why some of these physics engines don't run on the system.
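The clock-versus-IPC trade-off being argued over here can be illustrated with a toy throughput model (performance ~ cores × clock × instructions per cycle). The IPC numbers below are pure guesses for illustration, not benchmarks:

```python
# Toy model: sustained throughput ~ cores * clock * IPC.
# IPC figures here are illustrative guesses, not measured values.

def rough_throughput(cores: int, clock_ghz: float, ipc: float) -> float:
    return cores * clock_ghz * ipc

# Hypothetical: a low-clocked OoO tri-core vs. a high-clocked in-order tri-core.
espresso_like = rough_throughput(cores=3, clock_ghz=1.6, ipc=2.0)  # OoO lifts IPC
xenon_like    = rough_throughput(cores=3, clock_ghz=3.2, ipc=0.8)  # in-order stalls

print(f"OoO @ 1.6GHz: {espresso_like:.1f} vs in-order @ 3.2GHz: {xenon_like:.1f}")
```

On this crude integer-code model the two come out close, but the model ignores vector/FP throughput entirely, which is exactly where Xenon's VMX units (the "FP heavy" point above) dominate.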
 

wsippel

Banned
Speed does not have anything to do with how similar the architectures are to each other.
Yeah, but how similar are they? The similarities are mostly superficial. The CPUs are all PowerPC? Maybe. Wii U uses an asymmetric, three core, out-of-order ppc32 with per-core L2, no vector units, and the CPU relies largely on paired singles for floating point math. Xenon is a symmetric, three core, in-order ppc64 with shared L2 that relies heavily on vector units and doesn't support paired singles. Code written for one CPU will perform like ass on the other - if it even runs in the first place, which is extremely unlikely for anything more complex than "Hello World". And that's just the CPU.
 

NBtoaster

Member
More modern GPUs are designed to handle GPGPU tasks more efficiently, and as of now we don't have a lot of information about the customizations done to Wii U's GPU.

I would guess most customisations to the GPU involved getting power usage low rather than tacking on additional capability.
 

TheD

The Detective
So a low-clocked, tri-core, single-threaded OoOE CPU versus high-clocked, FP-heavy, in-order CPUs are similar architectures?

Or how about the low bandwidth memory? Or 32 MB scratchpad on the GPU?

They definitely aren't the same. For one thing the CPU completely sucks on one setup vs the other two. Any code requiring that class of CPU is not going to find an analog in WiiU. Similar overall system capability, but differing ways of reaching it.

Vastly different is PS2 vs Xbox 1: one uses fast vector coprocessors on the CPU for graphics work and a very simple GPU with some very fast eDRAM; the other is like a PC, with a CPU that only handles CPU-type code and a GPU with a programmable pipeline that handles all the graphics.

Not a console with an in-order PPC CPU vs a console with an OoO PPC CPU, or faster RAM in one, or one having a bit more eDRAM.

That isn't any kind of excuse for the first round of games. Nor is it "blaming the devs"; they worked with what they had in the time they had. The 360 and PS3 were much closer analogs and that rarely worked out (mainly in the PS3's case). Throw another esoteric design into the mix, of similar capability, with no userbase, sloppy documentation, and a history of the platform manufacturer cannibalizing software sales, and you've got a recipe for "iffy" software.

Most of the power in the PS3 is in the Cell; it is far more unlike the 360 than the WiiU is.
 
We're not looking at anything close to feasible parity in regards to the CPU though.

A clusterfuck of bad design decisions that are potentially very limiting. But at least we're getting close to an answer on why some of these physics engines don't run on the system.

So a tri-core, 1.5 - 2Ghz 2011 CPU that is OoO, uses the GPU for some of its calculations, and has a separate audio chip can in no way compete with the CPU performance of Cell / Xenon from 2004 / 2005?

I find it extremely hard to believe Nintendo would not even hit current gen standards with their next gen console.

If that were the case, why didn't they just stick a 250 GFLOP GPU with DX9-like features and 512MB of RAM in it, forget about the eDRAM and the separate audio chip, and sell it for $200?

There is more to this system than meets the eye imo. Nintendo are by no means cutting edge, but they would be beyond mental to release a console that will have to last until 2018 with performance worse than PS360.
 
Vastly different is PS2 vs Xbox 1: one uses fast vector coprocessors on the CPU for graphics work and a very simple GPU with some very fast eDRAM; the other is like a PC, with a CPU that only handles CPU-type code and a GPU with a programmable pipeline that handles all the graphics.

Not a console with an in-order PPC CPU vs a console with an OoO PPC CPU, or faster RAM in one, or one having a bit more eDRAM.
You're talking about a really low-clocked, potentially single-threaded OoOE PPC, paired with a beastly (in comparison) GPU hamstrung by piss-poor bandwidth.

Most of the power in the PS3 is in the Cell; it is far more unlike the 360 than the WiiU is.
This I can agree with. Just with about as lopsided a focus as the PS3 (in this case a focus on the GPU) all the while constraining it because of the lousy memory bandwidth.

Anything FP heavy on the CPU side is not going to find a friend in the WiiU.
 

TheD

The Detective
So a tri-core, 1.5 - 2Ghz 2011 CPU that is OoO, uses the GPU for some of its calculations, and has a separate audio chip can in no way compete with the CPU performance of Cell / Xenon from 2004 / 2005?

Not at the size and power usage we have in the WiiU.
 
Eurogamer's unnamed sources did comment on the slow RAM actually iirc.
Found it:

Mystery surrounds the fast eDRAM attached to the GPU too. There's 32MB of it, compared to the 10MB in the Xbox 360. This should be awesome for 1080p visuals or high levels of anti-aliasing, but we've yet to see any kind of evidence of either in 3D applications. There have been several reports that the 1GB of available RAM for developers is rather slow, so it may well be the case that the much faster eDRAM is used for much more than just the framebuffer.
Source: http://www.eurogamer.net/articles/2012-08-30-how-powerful-is-the-wii-u-really

Article is from August 30.
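The eDRAM sizes quoted there are easy to sanity-check against framebuffer sizes. A rough sketch, assuming 32-bit color and 32-bit depth per pixel:

```python
# Framebuffer footprint vs. eDRAM capacity (assumes 4 bytes/pixel per buffer).

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int, buffers: int) -> float:
    """Size in MiB of `buffers` full-resolution buffers (e.g. color + depth)."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

# 1080p color + depth fits comfortably in the WiiU's 32 MiB
print(f"1080p color+depth: {framebuffer_mib(1920, 1080, 4, 2):.1f} MiB")

# 720p color + depth with 4x MSAA overflows the 360's 10 MiB (hence its tiling)
print(f"720p 4xMSAA: {framebuffer_mib(1280, 720, 4, 2) * 4:.1f} MiB")
```

A 1080p color+depth pair is only ~15.8 MiB, leaving roughly half the 32 MiB free, which fits the article's speculation that the eDRAM is used for much more than just the framebuffer.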
 
So a tri-core, 1.5 - 2Ghz 2011 CPU that is OoO, uses the GPU for some of its calculations, and has a separate audio chip can in no way compete with the CPU performance of Cell / Xenon from 2004 / 2005?

I find it extremely hard to believe Nintendo would not even hit current gen standards with their next gen console.

If that were the case, why didn't they just stick a 250 GFLOP GPU with DX9-like features and 512MB of RAM in it, forget about the eDRAM and the separate audio chip, and sell it for $200?

There is more to this system than meets the eye imo. Nintendo are by no means cutting edge, but they would be beyond mental to release a console that will have to last until 2018 with performance worse than PS360.
They are mental.

If you're talking about modern game design, we're looking at a system that may always struggle running PS3/360 caliber games. Concessions were made in a range of areas that may prove insurmountable. I have no doubt Nintendo's games, and the games of interested 3rd parties, will look spectacular.

But at this point I feel safe saying that range will always be well within the PS3/360 mold. With the occasional sharper texture.
 
They are mental.

If you're talking about modern game design, we're looking at a system that may always struggle running PS3/360 caliber games. Concessions were made in a range of areas that may prove insurmountable. I have no doubt Nintendo's games, and the games of interested 3rd parties, will look spectacular.

But at this point I feel safe saying that range will always be well within the PS3/360 mold. With the occasional sharper texture.

Unbelievable if it's true lol...

I remember suggesting after E3 2012 that Nintendo may have downgraded the hardware for cost reasons after the Bird / Zelda demos from E3 2011. Do you think that might have been what happened?

It's an interesting discussion, but my WiiU is for Nintendo games only. It's the guys who have the WiiU as their only console for next gen that I feel sorry for; sounds like Nintendo royally screwed them over...

Cheers for all the info, both of you (TheD).
 
Unbelievable if it's true lol...

I remember suggesting after E3 2012 that Nintendo may have downgraded the hardware for cost reasons after the Bird / Zelda demos from E3 2011. Do you think that might have been what happened?

It's an interesting discussion, but my WiiU is for Nintendo games only. It's the guys who have the WiiU as their only console for next gen that I feel sorry for; sounds like Nintendo royally screwed them over...

Cheers for all the info, both of you (TheD).
I've pretty much done a 180 since the thing got opened and we started getting actual model numbers for the memory.

I figured the bandwidth peak would be in the realm of 17GB/s. A deficit versus the PS3 and 360 but not insurmountable with that 32MB scratchpad.

The situation is actually worse than that, and both the CPU and GPU feed off it. It shouldn't act as much of a limiter on the CPU, because that thing doesn't have much it can do, but it really limits the GPU and anything feeding into it.
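The bandwidth figures being compared come straight from the DDR3 speed grade: peak bandwidth = transfer rate × bus width. The 64-bit bus (four 16-bit chips) below is an assumption drawn from the teardown chatter, not a confirmed spec:

```python
# Peak DRAM bandwidth: megatransfers/s * bus width in bytes.
# The 64-bit bus (four 16-bit chips) is assumed from the teardown talk.

def peak_bandwidth_gbs(mt_per_s: int, bus_width_bits: int) -> float:
    return mt_per_s * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gbs(1600, 64))  # DDR3-1600 on a 64-bit bus: 12.8 GB/s
print(peak_bandwidth_gbs(2133, 64))  # DDR3-2133 would be needed for ~17 GB/s
```

That gap between 12.8 GB/s and the ~17 GB/s expected above is the "worse than that" being described.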
 
So a tri-core, 1.5 - 2Ghz 2011 CPU that is OoO, uses the GPU for some of its calculations, and has a separate audio chip can in no way compete with the CPU performance of Cell / Xenon from 2004 / 2005?

I find it extremely hard to believe Nintendo would not even hit current gen standards with their next gen console.

If that were the case, why didn't they just stick a 250 GFLOP GPU with DX9-like features and 512MB of RAM in it, forget about the eDRAM and the separate audio chip, and sell it for $200?

There is more to this system than meets the eye imo. Nintendo are by no means cutting edge, but they would be beyond mental to release a console that will have to last until 2018 with performance worse than PS360.

It's a system-on-chip design that draws 35-40W on load and still produces visuals about on par with 100W monsters, despite being a bit weaker overall. We're looking at some great efficiency. It's not what people want in their consoles, though.
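The efficiency claim can be put in rough numbers. The GFLOPS and wattage figures below are rumored/illustrative values from the thread, not confirmed specs:

```python
# Performance-per-watt sketch. All GFLOPS and wattage figures are
# assumed/rumored values, for illustration only.

def gflops_per_watt(gflops: float, watts: float) -> float:
    return gflops / watts

wiiu_eff = gflops_per_watt(350, 37)   # assumed ~350 GFLOPS at ~37 W load
x360_eff = gflops_per_watt(240, 100)  # Xenos often cited ~240 GFLOPS, ~100 W console

print(f"WiiU ~{wiiu_eff:.1f} GFLOPS/W vs 360 ~{x360_eff:.1f} GFLOPS/W")
```

Even with generous error bars on those guesses, the perf-per-watt gap is severalfold, which is the "great efficiency" point, just not the raw power people wanted.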
 

FLAguy954

Junior Member
Well something I've learned from Nintendo fans is that if a comment is positive, then we should pay close attention to what that developer is saying. If something is negative, then clearly there is an ulterior motive.

To be fair to his point though, the Metro developer didn't say anything about the ram either :p.
 
It's a system-on-chip design that draws 35-40W on load and still produces visuals about on par with 100W monsters, despite being a bit weaker overall. We're looking at some great efficiency. It's not what people want in their consoles, though.

I wonder how much more expensive 1GB of GDDR5 would have been? They could have gone with 4 chips and gotten much better performance, it seems.

Did they really place such a high priority on the web browser that having a whole gig for applications was necessary? Who knows; they seem to want to sell this as a bit more than just a game console, with Netflix, TVii, etc. Actually, will Nintendo TVii work while a game is suspended? It's in the menu there. Perhaps that is the true culprit.
 
I wonder how much more expensive 1GB of GDDR5 would have been? They could have gone with 4 chips and gotten much better performance, it seems.

Did they really place such a high priority on the web browser that having a whole gig for applications was necessary? Who knows; they seem to want to sell this as a bit more than just a game console, with Netflix, TVii, etc. Actually, will Nintendo TVii work while a game is suspended? It's in the menu there. Perhaps that is the true culprit.

I don't really see what faster memory would do if the CPU is such a bottleneck.
 

Log4Girlz

Member
So whether the Wii U is in any way more capable than the current gen now relies solely on what kind of GPU it has. Everything else is sub-par.
 
I wonder how much more expensive 1GB of GDDR5 would have been? They could have gone with 4 chips and gotten much better performance, it seems.

Did they really place such a high priority on the web browser that having a whole gig for applications was necessary? Who knows; they seem to want to sell this as a bit more than just a game console, with Netflix, TVii, etc. Actually, will Nintendo TVii work while a game is suspended? It's in the menu there. Perhaps that is the true culprit.

We still don't know how weak the GPU is yet. We know it's on the low end of the R700 spectrum, but we don't know how far down it is. Faster memory would no doubt help either way, but only marginally if it's a 710-level part.

I'm guessing Nintendo reserved some of the RAM for multitasking, and another part as a sort of cache or buffer to make up for the lack of a HDD. There's really no telling what that 1GB is reserved for as of yet.

I don't really see what faster memory would do if the CPU is such a bottleneck.

If the GPU handled all the graphical work, it would be significant. E.g. fighting games would look much better, even if the system isn't capable of Battlefields. Certain games aren't usually too taxing on the CPU.
 
So whether the Wii U is in any way more capable than the current gen now relies solely on what kind of GPU it has. Everything else is sub-par.

Not really. Even if the CPU is 1.5Ghz, there is still far more to a CPU than just its clock speed, as I have been told many times on here. The dedicated audio chip, the fact that it's an OoO CPU, the eDRAM, and the fact that the GPU does some of its calculations could well make up the difference between Espresso and Cell / Xenon.

The GPU is a big deal tho. As much as I loved BG's enthusiasm, I can't really see it being 600 GFLOPs, but I don't see why it can't be around 500 GFLOPs. A GPU of that power which supports a 2011 feature set, along with the extra overall system RAM, would mean a leap over current gen; a small leap, but a leap none the less.

We just need to wait on more details rather than condemning the console just because a few people looked up serial numbers of RAM on Google and some developer thought the CPU was slow because he saw an early version of a WiiU devkit...

As Ideaman has pointed out several times, none of the developers he talked to complained about the CPU or RAM. Prob just 'Gaf being Gaf' as usual.
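For R700-class parts, GFLOPs guesses like the ones above follow directly from shader count and clock: each ALU can issue a multiply-add (2 ops) per cycle. The ALU counts and the 550MHz clock below are speculative thread-style guesses, not confirmed specs:

```python
# R700-style shader throughput: GFLOPS = ALUs * 2 ops (multiply-add) * clock.
# ALU counts and the 550 MHz clock are speculative, not confirmed specs.

def shader_gflops(alus: int, clock_mhz: float) -> float:
    return alus * 2 * clock_mhz / 1000

for alus in (160, 320, 480):
    print(f"{alus} ALUs @ 550 MHz -> {shader_gflops(alus, 550):.0f} GFLOPS")
```

A hypothetical ~480-ALU part at 550MHz lands at ~528 GFLOPS, the ballpark of the 500 GFLOPs guess, while ~160 ALUs would put it below the 360's Xenos (~240 GFLOPS).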
 

pestul

Member
The CPU and RAM might be weak, but I'm impressed with its ability to play 1080p YouTube videos from within the browser. Pretty weak point, but it is something, if a small tidbit...
 

ozfunghi

Member
Well something I've learned from Nintendo fans is that if a comment is positive, then we should pay close attention to what that developer is saying. If something is negative, then clearly there is an ulterior motive.

Seriously? There is a developer that hardly bothered with the hardware calling the CPU horrible, and you have a topic of dozens of pages, with people trolling the WiiU as if it is set in stone, as if there are no possible loopholes, and as if the CPU specs are actually revealed. Statements by devs that ARE actually working on the hardware, such as Shin'en and Gearbox, are conveniently brushed under the rug. But yeah, it's the Nintendo fans like always that are unreasonable. lol
 
Not really. Even if the CPU is 1.5Ghz, there is still far more to a CPU than just its clock speed, as I have been told many times on here. The dedicated audio chip, the fact that it's an OoO CPU, the eDRAM, and the fact that the GPU does some of its calculations could well make up the difference between Espresso and Cell / Xenon.

The GPU is a big deal tho. As much as I loved BG's enthusiasm, I can't really see it being 600 GFLOPs, but I don't see why it can't be around 500 GFLOPs. A GPU of that power which supports a 2011 feature set, along with the extra overall system RAM, would mean a leap over current gen; a small leap, but a leap none the less.

We just need to wait on more details rather than condemning the console just because a few people looked up serial numbers of RAM on Google and some developer thought the CPU was slow because he saw an early version of a WiiU devkit...

Are you joking? If so we thought we contained that in the Metro thread.

If not: just looking at the size of the CPU die and the GPU die, and taking into consideration the things that would have to be on those dies to ensure Wii mode compatibility, there's no magic pixie dust Nintendo can solder onto boards to make this a reality. The GPU will be capable of more grunt work than the 360's in practice, though, but it's chopped off at the knees. On simpler games ("Nintendo style") you will be able to see that the Wii U is special. That's all that counts.
 
Seriously? There is a developer that hardly bothered with the hardware calling the CPU horrible, and you have a topic of dozens of pages, with people trolling the WiiU as if it is set in stone, as if there are no possible loopholes, and as if the CPU specs are actually revealed. Statements by devs that ARE actually working on the hardware, such as Shin'en and Gearbox, are conveniently brushed under the rug. But yeah, it's the Nintendo fans like always that are unreasonable. lol

It will be interesting to see, when PS4 / 720 launch, whether there will be the same leeway of allowing the whole front page of the forum to be taken up with negative threads about one system. I'm sure by then we will have returned to 'super threads' so their downfalls are all hidden away for no one to see ;).
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
It will be interesting to see, when PS4 / 720 launch, whether there will be the same leeway of allowing the whole front page of the forum to be taken up with negative threads about one system. I'm sure by then we will have returned to 'super threads' so their downfalls are all hidden away for no one to see ;).

So now the mods/admins here are biased against Nintendo?

Man, Shark Jumping, Not just for Batman anymore.
 
Are you joking? If so we thought we contained that in the Metro thread.

If not: just looking at the size of the CPU die and the GPU die, and taking into consideration the things that would have to be on those dies to ensure Wii mode compatibility, there's no magic pixie dust Nintendo can solder onto boards to make this a reality. The GPU will be capable of more grunt work than the 360's in practice, though, but it's chopped off at the knees. On simpler games ("Nintendo style") you will be able to see that the Wii U is special. That's all that counts.

Not to be rude, but it's a bunch of trolls jumping on the bandwagon whenever anything negative is said about the console. We all knew it wasn't going to be some sort of 5x power leap beast...

I will reserve judgement on the console until one of the big teardown sites gives out a full list of ALL the system specs, instead of just looking at the RAM and running with the 'WiiU is doomed' party line.

I've never seen so many rabid fanboys want something to fail so hard before in my life; it's pretty sad.

A thread is made about something positive on WiiU and it gets two pages; a thread is made about the tiniest little thing that's wrong with the system and it's on the front page for days.

Haters gonna hate.

Is it as bad when Sony or MS bring out consoles?
 

Gravijah

Member
It will be interesting to see, when PS4 / 720 launch, whether there will be the same leeway of allowing the whole front page of the forum to be taken up with negative threads about one system. I'm sure by then we will have returned to 'super threads' so their downfalls are all hidden away for no one to see ;).

i hear that they are gonna have mods watching and anytime anything negative is posted about ps4/720 they will delete the thread.

keep this on the down low.
 
I will reserve judgement on the console until one of the big teardown sites gives out a full list of ALL the system specs, instead of just looking at the RAM and running with the 'WiiU is doomed' party line.

Because they're going by thermal design limits / Moore's Law. The expectation of an R700-design chip running at a low <30W power draw and fitting in at under 90mm^2, yet still reaching and at times exceeding the PS3 and 360, is reasonable. GPU tech has indeed made this feasible.

This isn't quite as true for IBM's low power CPU tech.
 

TAS

Member
Not to be rude, but it's a bunch of trolls jumping on the bandwagon whenever anything negative is said about the console. We all knew it wasn't going to be some sort of 5x power leap beast...

I will reserve judgement on the console until one of the big teardown sites gives out a full list of ALL the system specs, instead of just looking at the RAM and running with the 'WiiU is doomed' party line.

I've never seen so many rabid fanboys want something to fail so hard before in my life; it's pretty sad.

A thread is made about something positive on WiiU and it gets two pages; a thread is made about the tiniest little thing that's wrong with the system and it's on the front page for days.

Haters gonna hate.

Is it as bad when Sony or MS bring out consoles?

Couldn't have said it better myself brother. In the end, I have no doubts that 2nd and 3rd generation software from EAD and Retro will outclass anything on PS360.
 
Because they're going by thermal design limits / Moore's Law. The expectation of an R700-design chip running at a low <30W power draw and fitting in at under 90mm^2, yet still reaching and at times exceeding the PS3 and 360, is reasonable. GPU tech has indeed made this feasible.

This isn't quite as true for IBM's low power CPU tech.

Did Nintendo ever confirm that it was an R700-series GPU? As far as I'm aware, all they ever confirmed was that it was a Radeon HD GPU.
 