
Rumor: Wii U final specs

don't all of you apologize at one time now.

You've made reference to this three times now and no one has responded, and with good reason. This is a rumor, not fact, and some of the specs completely go against some of the other rumors we've heard that have more weight to them.

Either way, people complaining about storage in here are redic. We've known you can BYO memory for over a year now, and that's not an issue. Of course, it's Nintendo. Gotta have something to complain about.

Also, 32MB is not for the OS. I don't know how many times it has to be said.
 

Ydahs

Member
Xenos is a 2005 CPU. And 90% of rumors say it's equivalent to a Xenos CPU. Including this. So....
You mean Xenon. Xenos is a GPU.

From my understanding, most rumours have stated it's more powerful than the Xenon, not equivalent. Either way, I can't see how you can determine that its CPU is "just as powerful as a 2005 CPU".
 

MadOdorMachine

No additional functions
Here's the sad fact.

That's never going to happen, even if it were another warm body alongside the PS4 and Durango. The software wouldn't sell on the Nintendo system, third parties would stop putting AAA (or, coming soon, AAAA) titles on there, and it would be the same situation it's always been.

Within these expectations, Wii U will probably garner more support than the Wii did because it's simply not as hard to make a Wii U game as it is to make a Wii one, in terms of having people that know how to develop for the system.

But it was never going to leapfrog this generation. I've said this a hundred times. If it were released in 2005, it would probably be king of the hill, but even then it would likely not be getting a whole hell of a lot of support.

To what extent this is going to be another Wii situation, both negatively and positively, is going to be determined by a lot more factors than just the processor.

I disagree. Had Wii been more comparable to 360 & PS3 in performance, it would have destroyed them in third party support. There's no way publishers would have skipped out on it.
 

Fularu

Banned
When was the last time the most powerful console won the "console war"?

I'm honestly asking because as far as I can remember, that never, ever, happened.
 

thefro

Member
I'm guessing they got a custom chip from IBM that has BC with Broadway & accepts those instructions, but that doesn't mean they just overclocked a bunch of Wii processors.
 

Pocks

Member
Wow, I don't know what to think, could somebody please find Ja Rule so I can make sense of all this!

Just wanted to let you know that I got the reference, and I thought it was funny.

Also, these specs aren't surprising but they sure are disappointing. I was at least hoping for more RAM. Wii U will still have great Nintendo IP, so of course it's a year one purchase nonetheless.
 

Diablos54

Member
The whole CPU info sounds... dodgy to me. Overclocking an already overclocked chip, and then putting three of them together? Something sounds off to me.

The rest of the stuff sounds about right, though.
 

hachi

Banned
The specs that matter much more to me are not on this kind of list:

- range of the gamepad
- integration of the OS / speed of boot-up or wake, etc
- details on TV remote functionality
- etc
 

Somnid

Member
So um. How exactly will they get PS4/720 ports with these specs?

Same way 360 has been getting PC ports for the last 6 years. Very few people want to continue moving the bar because Wii proved it doesn't matter as much anymore and it's really expensive.
 
The problem with DX11 is that most of its nicest features are only viable on:

- High-end GPUs of the previous DX11 generation.
- Mid-to-high-end GPUs of the current DX11 generation.

Just look at PC DX11 GPU benchmarks to see the bloodbath, compared to DX9/10 games.

The most suitable DX11 GPU they could fit in the targeted form factor and price ($250) would suck horribly at running actual DX11-exclusive features, and silicon would be wasted, unused by most games. With DX10.1 the GPU has a better chance of being used to its fullest.

It's not that simple. DX11 is not that much about new graphical features but more about efficiency (SM 4.0 shaders are already freely programmable, so you can basically achieve any effect you like). In other words, you can achieve the same results with less raw power. At the cost of more transistors on the GPU of course.
Most PC titles with DX11 support just don't use it in a very meaningful way.
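For what that split looks like from the engine side, here's a minimal sketch (assuming the standard PC Direct3D 11 runtime; the fallback choice is just illustrative): the renderer asks for the best feature level the GPU supports and only turns on the DX11-exclusive paths (SM 5.0, compute, tessellation) when it actually gets 11_0, otherwise it stays on the SM 4.x pipeline that DX10.1-class hardware runs well.

// Minimal sketch: probe the supported Direct3D feature level and fall back
// from 11_0 to 10_1-class rendering paths if the hardware can't do more.
#include <d3d11.h>
#include <cstdio>

#pragma comment(lib, "d3d11.lib")

int main()
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // SM 5.0, compute shaders, tessellation
        D3D_FEATURE_LEVEL_10_1,   // SM 4.1 -- roughly the class discussed above
        D3D_FEATURE_LEVEL_10_0,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got     = D3D_FEATURE_LEVEL_10_0;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &got, &context);

    if (FAILED(hr)) {
        std::printf("No D3D11-capable device found.\n");
        return 1;
    }

    // A real engine would branch here: tessellation/compute-heavy paths only
    // on 11_0 hardware, the plain SM 4.x pipeline everywhere else.
    std::printf("Best supported feature level: 0x%04x\n",
                static_cast<unsigned>(got));

    context->Release();
    device->Release();
    return 0;
}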
 
I disagree. Had Wii been more comparable to 360 & PS3 in performance, it would have destroyed them in third party support. There's no way publishers would have skipped out on it.

Useless speculation.

More comparable = More expensive = A lot less sales.


Same way 360 has been getting PC ports for the last 6 years. Very few people want to continue moving the bar because Wii proved it doesn't matter as much anymore and it's really expensive.

PC has been getting 360 ports for the last 6 years more like.
 

NickMitch

Member
"well, it will suffice for now, but in 2-4 years it will SUCK"

That's the basic stuff being said in this thread, which is rather fascinating for a discussion regarding tech of any kind.
 

ivysaur12

Banned
On a scale of Superman 64 to Half-Life Episode 3, what would you say we have here?
I was an English major, I have no fucking idea what this shit means.
 

hyduK

Banned
When was the last time the most powerful console won the "console war"?

I'm honestly asking because as far as I can remember, that never, ever, happened.

In terms of sales? Who knows. But as a consumer, the only stock I put in sales is that they sell enough to keep making games/consoles.
 

WiiU spec threads remind me of those wives who get beaten but never leave the husband in question. As if they enjoyed it.
 

Hoodbury

Member
As someone who doesn't buy digital games or eShop games, will I even need to get an external hard drive or an SD card? Or will 8 gigs be enough for me and my game saves and whatever else I need a hard drive for?

Or will DLC start eating up that 8 gigs pretty fast?
 

Shion

Member
The same way the PS2 got downports from the GameCube and Xbox?
- PS2 was in the same ballpark as GC/XBOX. By 2013, Wii U is going to be a generation behind (in terms of specs).

- PS2 didn’t get downgraded ports from GC/XBOX, PS2 was the lead development platform of that generation. GC/XBOX got ports from PS2.
 
I disagree. Had Wii been more comparable to 360 & PS3 in performance, it would have destroyed them in third party support. There's no way publishers would have skipped out on it.

If the Wii had been comparable in performance, it would not have cost $250 in 2006 and would not have a 60 million+ userbase.

The Gamecube was similar in performance to the PS2 and XBox. After the initial test multiplats, most third party support was in the form of contractual obligations that immediately got thrown aside in favor of ports when it was possible to do so.
 

Ydahs

Member
Slight off topic, but did anyone else get redirected to another page while browsing this thread?

I think it may be from an ad. I copied the link it redirected me to.
 

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Not really. These are just god-awful.

Sad thing is they'll probably be farther behind in online and other features.

For GAF. These are awful for GAF.

Now, the majority of people who actually purchase these consoles? We'll see.
 

Drkirby

Corporate Apologist
Is eDRAM faster than the 1T-SRAM Nintendo has been using for a while? Or do I have my facts messed up, and 1T-SRAM is a type of eDRAM?
 
alternatively current Wii devs might like it as it provides continuity and tools might be similar.
A lot of people cited this as an advantage on the Wii if they had GameCube experience. Hell, I think Nintendo might have said that a couple of times. Not 100% sure about my second statement, though. That didn't really encourage third parties to develop Wii software. Don't think it'll work in this case either.
1. Denial

2. Anger

3. Bargaining

4. Depression

5. Acceptance

Which stage are you guys at? Because it's all a broken record.
The cycle doesn't end until the generation does. The games keep us in a perpetual state of turmoil.
 
Disappointing. No Shader Model 5.0, no DirectX 11, no OpenGL 4.3, weak CPU, only 1 GB RAM. I don't think we will see many ports from PS4/Xbox3. Will buy when it reaches 150 Euro.
 
I'll just have a look on Thursday. If the ports from this gen have a little extra, like better resolution and AA (I hate jaggies) and maybe a tad better textures and lighting, then I'm fine.
If the ports are worse than 360 and PS3, then I'll wait for a truly amazing Wii U game to come out. I want Pikmin 3 bad, but I'm not buying the Wii U for one game (that I can still play later on).

I'm confident first party titles will look great and honestly, I'm never ever going to buy multiplats on Wii U anymore once PS4 and Xbox 720 are out. I bought zero multiplats on Wii.

So... best they show the good stuff Thursday, so that people will buy a Wii U in the first year.
 

Mithos

Member
So that means a Broadway chip was roughly equivalent to one of the 360 cores?

That's kind of crazy.

I just have to get this in here (from an emu developer (Libretro/RetroArch)).

I believe if you program only against one main CPU (like we do for pretty much most emus), you would find that the PS3/Xenon CPUs in practice are only about 20% faster than the Wii CPU.

I've ported the same code over to enough platforms by now to state this with confidence - the PS3 and 360 at 3.2GHz are only (at best - I would stress) 20% faster than the 729MHz out-of-order Wii CPU without multithreading (and multithreading isn't a be-all end-all solution and isn't a 'one size fits all' magic wand either). That's pretty pathetic considering the vast differences in clock speed, the increase in L2/L1 cache and other things considered - even for in-order CPUs, they shouldn't be this abysmally slow and should be totally leaving the Wii in the dust by at least 50/70% difference - but they don't.

BTW - if you search around on some of the game development forums you can hear game developers talking amongst themselves about how crap the 360/PS3 CPUs were to begin with. They were crap from the very first minute the systems were launched. You're essentially looking at the equivalent of Pentium 4-spec consoles that have to be helped along by lots of vector CPUs (SPUs) and/or reasonably mid-specced, highly programmable GPUs (which the Wii admittedly lacks). Without utilizing multithreading - you're essentially looking at Pentium 4 2.4GHz-esque performance.

So a 3-core "enhanced Broadway" could be better, or much better, than the current consoles' CPUs, I take it.

Haven't linked to the source because I dunno yet if that site is allowed.
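To put rough numbers on that claim: 3.2GHz vs 729MHz is about a 4.4x clock advantage, so if the overall single-threaded speed-up really is only ~1.2x, the in-order cores are doing roughly a quarter of the per-clock work of Broadway. Below is a hypothetical sketch (my own, not from the quoted developer) of the kind of single-threaded, branch-heavy interpreter loop emulators spend their time in - the data-dependent branches and the serial dependency chain are exactly what in-order cores without multithreading stall on.

// Rough illustrative microbenchmark, not a real emulator core: a single-threaded,
// branch-heavy dispatch loop of the sort the quote is talking about. In-order CPUs
// stall on the unpredictable branches and the serial dependency on 'acc'.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <vector>

int main()
{
    // Fake "program": a stream of pseudo-random opcodes (LCG just adds variety).
    std::vector<std::uint8_t> ops(1 << 20);
    std::uint32_t seed = 0x2545F491u;
    for (auto& op : ops) {
        seed = seed * 1664525u + 1013904223u;
        op = static_cast<std::uint8_t>(seed >> 24) & 7;
    }

    std::uint64_t acc = 1;
    auto t0 = std::chrono::steady_clock::now();

    // Interpreter-style dispatch: a data-dependent branch every iteration and
    // very little instruction-level parallelism for the hardware to extract.
    for (int pass = 0; pass < 100; ++pass) {
        for (std::uint8_t op : ops) {
            switch (op) {
                case 0:  acc += 1;                        break;
                case 1:  acc ^= acc >> 3;                 break;
                case 2:  acc *= 7;                        break;
                case 3:  acc -= 13;                       break;
                default: acc = (acc << 1) | (acc >> 63);  break;
            }
        }
    }

    auto t1 = std::chrono::steady_clock::now();
    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
    std::printf("acc=%llu, %lld ms\n",
                static_cast<unsigned long long>(acc),
                static_cast<long long>(ms));
    return 0;
}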
 

Alexios

Cores, shaders and BIOS oh my!
Gemüsepizza said:
Disappointing. No Shader Model 5.0, no DirectX 11, no OpenGL 4.3, weak CPU, only 1 GB RAM. I don't think we will see many ports from PS4/Xbox3. Will buy when it reaches 150 Euro.
I like how you say the same thing three times. I'm sure it's for emphasis, not because you don't really know what you're hastily repeating! Also, what's the gap to PS4/Xbox3 that makes you say that? I must have missed that spec thread.
 
I know the comparison is unwarranted since we have yet to see output of the titles, but in terms of raw specs, can someone rank the current consoles against the Wii U? I can't fathom what "Expresso" and the “GPU7” AMD Radeon-based High Definition GPU are.
 