
Rumor: Wii U final specs

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
800MHz for the RAM
1200MHz for the CPU
400MHz for the GPU
200MHz for the DSP

A little over a 1.5× increase in power over the Wii,
which was itself a 1.5× increase over the GameCube.

Interesting. He wouldn't be the only person putting forward 1.2GHz as a likely speed for the CPU either, although the methods used to reach those conclusions are a little tenuous, to say the least.
 
This might be the wrong thread for this, but I was watching Extra Credits and they said something which quite possibly could be the reason why the big 3 aren't going totally discless:

Quite bluntly: the world is running out of bandwidth.
 

DonMigs85

Member
This might be the wrong thread for this, but I was watching Extra Credits and they said something which quite possibly could be the reason why the big 3 aren't going totally discless:

Quite bluntly: the world is running out of bandwidth.

That's possible, and I'd really rather not be downloading gigs and gigs of files. Anything over 10GB I would prefer to have on disc.
 
800MHz for the RAM
1200MHz for the CPU
400MHz for the GPU
200MHz for the DSP

A little over a 1.5× increase in power over the Wii,
which was itself a 1.5× increase over the GameCube.

How many GFLOPS would a GPU produce running at 400MHz?

A 1.2GHz CPU in 2012? Not even Nintend... would they? lol!
 

MDX

Member
Can't 2011 components do more with a smaller power draw than 2005 components, though?

I think people who automatically equate low power draw with weak sauce are being a bit negative.

How many watts did the GameCube draw? Look at how small yet powerful that thing was compared to the other systems of its generation.


Nintendo has stated emphatically that the Wii U was designed to consume as little power as possible:

One of the aims with Wii U is shared with Wii – low power consumption, but high performance. (Versus high power to achieve high performance.) This is a strategy they’ve embraced since the GameCube.

The use of a multi-core CPU helped them lower power consumption. They’re also embedding it onto an MCM alongside the GPU in order to reduce cost and speed up the exchange of data while minimizing overhead. (In simple terms, it’s cheaper, faster, and wastes less processing power.) With GameCube and Wii, the CPU and GPU were separate. The MCM also takes up less space on the main board.

Because, like the Wii, they want the machine always on, and for it to be reliable.
If they didn't go through all this trouble, the Wii U would obviously run hotter and draw more current. Would that mean it would be more powerful?
 

DonMigs85

Member
Depends on the ALU/SPU count.

The number of SPUs, times the frequency in MHz, times 2, divided by 1000.

For instance: 480*400*2/1000 = 384 GFLOPS.

You can't use the same computation for Nvidia GPUs though, since AMD's shaders are five-wide (VLIW5). It takes roughly 3-5 AMD SPUs to equal one Nvidia SPU.
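The rule of thumb above can be sketched in a few lines of Python. The factor of 2 is an assumption carried over from the post: one multiply-add (two floating-point ops) per SPU per cycle.

```python
def gflops(spus: int, clock_mhz: float) -> float:
    """Peak GFLOPS estimate: SPUs x clock (MHz) x 2 ops/cycle / 1000."""
    # The factor of 2 assumes each SPU issues a fused multiply-add
    # (2 floating-point ops) every cycle, as in AMD's VLIW designs.
    return spus * clock_mhz * 2 / 1000

# The example from the post: 480 SPUs at 400 MHz
print(gflops(480, 400))  # 384.0
```

This is a peak theoretical figure; real-world throughput depends on how well the compiler fills those five-wide VLIW slots.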
 
Luckily the Wii U is using an AMD chip. I think he wanted to know how to calculate GFLOPS for the Wii U.

Yeah, I did.

So this console could very well have a 1.2GHz CPU, 1GB of RAM for games (but clocked at half the speed of the 512MB of RAM inside the PS3/360), and a 300GFLOP GPU, in 2012, for $350!

If that's true, slow clap for Nintendo. Going after core gamers, my ass...
 

MDX

Member
How many GFLOPS would a GPU produce running at 400MHz?

A 1.2GHz CPU in 2012? Not even Nintend... would they? lol!

Well, there is a reason why the 360 and PS3 were expensive (and still sold at a loss), ran hot, were designed to last for 10 years, have gone through several configurations, and are still a bit expensive depending on the model you get.
 

Raistlin

Post Count: 9999
Can't 2011 components do more with a smaller power draw than 2005 components, though?

I think people who automatically equate low power draw with weak sauce are being a bit negative.

How many watts did the GameCube draw? Look at how small yet powerful that thing was compared to the other systems of its generation.
Efficiency isn't some mystical, unknowable thing, though.

With some systems/architecture knowledge, you can combine the power draw range, core counts, processor families, and die sizes to get a rough estimate of clock speeds and GFLOPS.
 

ozfunghi

Member
Well, I'm going with these multipliers:

DSP: 266MHz (base)
GPU: 533MHz (2X DSP)
RAM: 800MHz (1.5X GPU)
CPU: 1.6GHz (2X RAM)

DSP seems high, no? I think 133 would be closer. Then the rest of your numbers can stay :)

EDIT: Though I wouldn't rule out the GPU running slower (400MHz). I know Matt said "a little slower" in response to 600MHz, but maybe that "little" wasn't supposed to be taken quite so literally. A slower clock is also a better match for the heat and power budget. That would mean they needed some extra SPUs to compensate... which would also make sense if they really want the GPGPU feature put to use. I've been guessing 480 SPUs at 480MHz (460 GFLOPS), but it could also be 400MHz and 640 SPUs (512 GFLOPS), for instance.
 

Meelow

Banned
Yeah, I did.

So this console could very well have a 1.2GHz CPU, 1GB of RAM for games (but clocked at half the speed of the 512MB of RAM inside the PS3/360), and a 300GFLOP GPU, in 2012, for $350!

If that's true, slow clap for Nintendo. Going after core gamers, my ass...

BG says he still feels the Wii U has a 600GFLOPS GPU, so I'm going to stick with his statement until proven otherwise.
 
BG says he still feels the Wii U has a 600GFLOPS GPU, so I'm going to stick with his statement until proven otherwise.

Why? He has no insider info. Why are people using him as a source... he's not a dev, he's not in the press, he's a guy who reads B3D. His 600GFLOPS prediction should be proof he has no info.
 

TheD

The Detective
Nothing else, EVERYTHING else is speculation, including the 'leaked' RAM speed. I really don't think googling a part number tells the full story; Nintendo could quite easily have asked the companies to clock it at a different speed for their own needs.

That is borderline nuts.

If Nintendo wanted different-speed RAM, they would have used different chips!

The max RAM speed is KNOWN!


Well, I'm goin with these multipliers:

DSP: 266MHz (base)
GPU: 533MHz (2X DSP)
RAM: 800MHz (1.5X GPU)
CPU: 1.6GHz (2X RAM)

Nothing forces the system to work like that; they could pick any clock speeds they want.
 

xbhaskarx

Member
In the end, I have no doubts that 2nd and 3rd generation software from EAD and Retro will outclass anything on PS360.
So how many years from now will software from those two surpass what's on PS360, and how long will that be after the PS360's replacements have launched?
 

tenchir

Member
That is borderline nuts.

If Nintendo wanted different-speed RAM, they would have used different chips!

The max RAM speed is KNOWN!




Nothing forces the system to work like that; they could pick any clock speeds they want.

To maintain BC, you're going to need multipliers to lock the GPU/CPU/memory to the emulated console's speeds for maximum compatibility. The bus speed seems to be a multiple of 162MHz.

GC:
CPU 486MHz (162x3)
GPU 162MHz (162x1)
24MB 1T-SRAM 324MHz (162x2) and 16MB DRAM 81MHz (162x0.5)
DSP 81MHz (162x0.5)

Wii:
CPU 729MHz (162x4.5)
GPU 243MHz (162x1.5)
DSP 81MHz (162x0.5) <- Integrated into the Hollywood MCM. Couldn't find more info on the speed, so this is an assumption.
ARM 243MHz (162x1.5) <- Part of the Hollywood MCM. Handles I/O functions and other stuff.

edit:
My guess for Wii U would be:
CPU 1620MHz (162x10)
GPU 486MHz (162x3)
RAM 810MHz (162x5) <- I'm unsure of this because of the bus clock vs. memory clock distinction.
DSP 162-243MHz <- Could be anything; there's very little info on the earlier consoles' DSP clocks.
ARM 243MHz (162x1.5) <- You would need this for Wii compatibility even if it doesn't do anything else on the Wii U.
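The multiplier scheme above is easy to sketch in Python. Note that both the 162MHz base clock and the Wii U multiples are the post's speculation, not confirmed hardware data:

```python
BASE_MHZ = 162  # common base clock in GC/Wii designs, per the post above

# Speculative Wii U multipliers from the post (guesses, not confirmed specs)
wii_u_guess = {"CPU": 10, "GPU": 3, "RAM": 5, "ARM": 1.5}

for part, mult in wii_u_guess.items():
    # e.g. "CPU: 1620 MHz (162x10)"
    print(f"{part}: {BASE_MHZ * mult:g} MHz ({BASE_MHZ}x{mult:g})")
```

The appeal of fixed multipliers is backwards compatibility: in "Wii mode" the hardware can drop back to exact Wii clocks simply by switching divisors off the same base.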
 

disap.ed

Member
My guess for Wii U would be:
CPU 1620MHz (162x10)
GPU 486MHz (162x3)

This^ or this:

CPU: 1458 MHz
GPU: 486 MHz
RAM: 729 MHz
DSP: 121.5 MHz

(Everything based on the 243/729 MHz Wii GPU/CPU clocks.)

The GPU probably has 400 or 480 shader units judging by the die size, so this would mean between 390 and 460 GFLOPS. 460 fits the "a little less than 600" comment better, so that's my guess.
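For what it's worth, the 390-460 GFLOPS range follows directly from the usual shaders x clock x 2 estimate at a 486 MHz GPU clock; a back-of-the-envelope check, not insider data:

```python
def gflops(shaders: int, clock_mhz: float) -> float:
    # Peak estimate: shaders x clock (MHz) x 2 ops/cycle, in GFLOPS
    return shaders * clock_mhz * 2 / 1000

GPU_CLOCK_MHZ = 486  # the speculated 162x3 clock from the posts above

for shaders in (400, 480):
    print(f"{shaders} shaders @ {GPU_CLOCK_MHZ} MHz "
          f"-> {gflops(shaders, GPU_CLOCK_MHZ):.1f} GFLOPS")
# 400 shaders -> 388.8 (~390), 480 shaders -> 466.6 (~460)
```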
 

Xdrive05

Member
Do we know if the U-CPU has more cache than the Wii and GC single-core variants? Would it just be 3 times the cache since there are 3 times the cores? Or did they bother to add more? (Not counting the eDRAM, that is.)
 

ozfunghi

Member
Do we know if the U-CPU has more cache than the Wii and GC single-core variants? Would it just be 3 times the cache since there are 3 times the cores? Or did they bother to add more? (Not counting the eDRAM, that is.)

All the info we know for a fact is stated in the other topic (something something "serious discussions welcome" by Blu). The cores have an asymmetrical cache layout, with one core getting 2MB and two cores getting 512KB... I believe.

EDIT: here you go: http://www.neogaf.com/forum/showthread.php?t=500466
 

MDX

Member
I'm just wondering:

Obviously Sony and MS will try to double the speed of their GPUs, from 500MHz up to around 1GHz, for their next consoles.

But what are they going to do with their CPUs? They are currently clocked over 3GHz. How far can they go with that? Simply add extra cores?

If the Wii U's CPU is clocked under 2GHz, you can see that Nintendo has left themselves room to grow. Even the Wii U's successor may not surpass the clock numbers the 360 and PS3 have been boasting.

It's going to be interesting how this all plays out.
 
I'm just wondering:

Obviously Sony and MS will try to double the speed of their GPUs, from 500MHz up to around 1GHz, for their next consoles.

But what are they going to do with their CPUs? They are currently clocked over 3GHz. How far can they go with that? Simply add extra cores?

They don't need higher clock rates so much as more performance per clock. They might even lower the clock rates. It's likely that there will be more cores too, of course.
In any case, even Jaguar cores, for example, will be clearly superior to what we have in the 360 and PS3.
 

StevieP

Banned
Why go with 8GB of RAM and use slow RAM? Almost all of it would only ever be used as cache. It just doesn't make any sense.

Even the fastest DDR3 would only let you touch around 200MB per frame. Allowing another 1.5GB for cache, you are left with at least 5GB that are pretty much useless.

ITT, people still think they're getting a large amount (8GB, for example) of memory that's blazing fast.

For reference, the kits have 12GB of DDR3.
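The 200MB-per-frame figure quoted above is roughly peak memory bandwidth divided by frame rate. A quick sketch, assuming for illustration a single 64-bit channel of DDR3-1600 (~12.8 GB/s peak; the exact configuration is not known from the thread):

```python
peak_bandwidth_gb_s = 12.8  # assumed: one 64-bit channel of DDR3-1600, peak
fps = 60                    # target frame rate

# Upper bound on data the system could read in a single frame at 60 fps
mb_per_frame = peak_bandwidth_gb_s * 1000 / fps
print(f"~{mb_per_frame:.0f} MB per frame")  # ~213 MB per frame
```

The point of the argument: no matter how much slow RAM you install, bandwidth caps how much of it a game can actually stream through per frame, so the surplus mostly serves as a disk cache.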
 

pestul

Member
Nothing forces the system to work like that; they could pick any clock speeds they want.

Nintendo loves multipliers. I'm not quite sure the calculations we're doing are as simple for the GPU, though. This thing should be orders of magnitude faster and more complex than the Wii's. Having said that, the custom modifications to the GPU by AMD probably enable it to enter 'Wii mode' rather than adding any DX11 or Shader Model 5 feature sets. :(
 

Raist

Banned
Talked with BG briefly the other day via email and he had this to say about the RAM bandwidth. I realise he can't be part of the discussion, but I thought I'd pass it on (he's OK with that).

PS3 XDR - 35ns (taken from a PS3 wiki)

I really, really doubt that figure. They chose XDR because it has insanely low latency, far less than GDDR (well below a double digit figure).
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
I really, really doubt that figure. They chose XDR because it has insanely low latency, far less than GDDR (well below a double digit figure).

Don't worry about BG, there's a reason he fled the forum just before the facts started coming in and now chooses to post via emails to 3rd parties instead. :p
 

StevieP

Banned
Don't worry about BG, there's a reason he fled the forum just before the facts started coming in and now chooses to post via emails to 3rd parties instead. :p

Much of bg's info actually came from people that many on this forum evangelize, people who are considered (and confirmed) actual industry insiders. I won't name names.

I'll say this: nobody knows everything, and some things are lost in translation - especially where third hand technical information is concerned.
 

Van Owen

Banned
I'm just glad I don't have egg on my face.

I always expected that for me Wii U would be just another system for Nintendo games, and I know 720 and PS4 will be huge jumps that I want out of new consoles (and I have a good PC), but some of the people in the speculation threads were just nuts.
 
I'm just glad I don't have egg on my face.

I always expected that for me Wii U would be just another system for Nintendo games, and I know 720 and PS4 will be huge jumps that I want out of new consoles (and I have a good PC), but some of the people in the speculation threads were just nuts.

What was nuts, exactly? Even from the early WUSTs, people were saying the console would have at most 2GB of RAM, a slower-than-PS360 CPU, and maybe a 1TF GPU. As time went on, the GPU figure came down to around the 600 GFLOPS range, which is where BG settled.

The console doesn't look far off what people predicted, tbh.

We will see if he was right, but no one ever said the Wii U was some kind of 10x-power-leap beast of a console with a 2TF GPU, 4GB of RAM, and a faster CPU than PS360. We all knew Nintendo doesn't compete in the tech arms race anymore, nor should they, imo.

Make up whatever you want to make yourself feel smug though lol...
 
Happy Native-American Genocide Day! To celebrate, we shall continue to fling shit at each other just because one person's opinions differ from another person's!

Cheers!
 