
Rumor: Wii U final specs

why does this thread exist?

Nintendo's stated strategy - they've openly said it - is taking "old, withered technology" and finding new uses for it.

They were never going to compete on tech... and one look at the Wii suggests why.

Most of the thread isn't even a tech discussion, it's people bitching because they kidded themselves or don't actually know very much about Nintendo.
 

tnaden

Member
Ah, flops. Can't believe the importance people kept putting in gflops and tflops.

You know Sony isn't going to use a weaker CPU, so what are you trying to accomplish?

They are most certainly going to use a CPU weaker than the Cell, putting more money into a really nice GPU. The Cell was a mistake.
When the PS3 was released, games used a lot of CPU and a bit less GPU. It's the other way around now, with more and more functions being moved to the GPU. It wouldn't make sense to have the same power ratio between the two in the PS4.

EDIT: Beaten..
 

StevieP

Banned
Ah, flops. Can't believe the importance people kept putting in gflops and tflops.

You know Sony isn't going to use a weaker CPU, so what are you trying to accomplish?

It depends on the metric. The PPE was really really good at some things, and bloody awful at others.
Edit: beaten by GhostTrick
 
Ah, flops. Can't believe the importance people kept putting in gflops and tflops.

You know Sony isn't going to use a weaker CPU, so what are you trying to accomplish?

Well, I would really like an explanation from you as to why an A10 from AMD is not slower than the Cell.

And HW doesn't sell consoles anyhow.

And by the way, you post an awful lot in Wii U threads for someone so hateful towards it. If you don't like it, why are you even in here?

I read an awful lot of... let's say... absolutely non-contributing posts from you...
 
Fixed-function stuff could be there for GameCube/Wii software to use.

It would be pointless though. Shaders can take care of all those instructions and more. In other words, from what I have gathered, the Wii environment can basically be reproduced without TEV or any of those old parts. There is such a jump from that architecture that it's not an issue. The CPU, on the other hand, seems to be a closer match to the older Wii parts. Sounds kind of like how Sony did BC on the 80 GB PS3 (except they actually included the old CPU, since they had Cell), but probably even closer to 100%.
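To illustrate (my own rough sketch, not anything confirmed about the hardware): a TEV stage is essentially one small, fixed arithmetic expression per pixel, which is exactly the kind of thing a handful of pixel shader instructions can reproduce. Something like:

```python
def tev_stage(a, b, c, d, bias=0.0, scale=1.0, subtract=False):
    """Rough model of one GameCube/Wii TEV color-combine stage (per channel).

    Illustrative only: the real hardware works on clamped fixed-point RGB
    values and chains up to 16 of these stages together.
    """
    blend = a * (1.0 - c) + b * c            # lerp between inputs a and b by c
    result = (d - blend) if subtract else (d + blend)
    result = (result + bias) * scale
    return max(0.0, min(1.0, result))        # clamp to [0, 1]

# A modern pixel shader can evaluate this (and far more) in a few ALU
# instructions, so no dedicated TEV silicon is needed for emulation.
print(round(tev_stage(a=0.8, b=0.2, c=0.5, d=0.1), 3))  # 0.6
```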
 

beril

Member
Seems like exactly that - rumors. Nintendo has focused on lighting in their own titles (which use relatively simple geometry and textures) and there are probably at least enough SPUs on the GPU to add some decent lighting on top of existing 360 engines if devs so choose. But I think the whole "fixed function" thing was bogus.

Yeah, that part never made much sense at all. People just know Nintendo likes fixed-function systems and went crazy with speculation, with very little grounds or thought about how it would actually work. Exactly how would a fixed-function lighting pass go together with a programmable pixel pipeline? You could fit it in between the vertex shader and the fragment shader, so it gives you the fully lit diffuse/specular colors as inputs for the pixel shaders. But then you'd only be able to use vertex normals. You could perhaps have an option to use normalised normals, or even assign a normal map with some special registers. And what about shadows? Those would also have to be done during that pass. It would very quickly become very complex and less flexible than a fully programmable system. And of course, none of this would really work with deferred rendering.
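To make the vertex-normal limitation concrete, here's a toy sketch (mine, purely illustrative): a fixed pass wedged between the vertex and fragment stages lights once per vertex and then only interpolates colors, while a programmable fragment shader can renormalize the interpolated normal (or fetch one from a normal map) and light every pixel individually:

```python
import math

def diffuse(normal, light_dir):
    """Lambertian diffuse term: N . L, clamped at zero."""
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

light = (0.0, 0.0, 1.0)
vertex_normals = [(0.0, 0.0, 1.0), (0.7, 0.0, 0.7), (-0.7, 0.0, 0.7)]

# Fixed-function pass: light per *vertex*, then interpolate only the colors.
per_vertex = [diffuse(n, light) for n in vertex_normals]
color_at_center = sum(per_vertex) / 3                   # ~0.8

# Programmable fragment shader: interpolate and renormalize the *normal*,
# then light per pixel.
center = [sum(c) / 3 for c in zip(*vertex_normals)]
length = math.sqrt(sum(c * c for c in center))
per_pixel = diffuse([c / length for c in center], light)  # 1.0

print(round(color_at_center, 3), per_pixel)  # the two generally differ
```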
 

MDX

Member
My new guess for the specs:

CPU: 1620 MHz (162 x 10)
GPU: 486 MHz (162 x 3)
RAM: 810 MHz (162 x 5) <- I am unsure of this one because of bus clock vs. memory clock.
DSP: 162 MHz
ARM: 243 MHz (162 x 1.5) <- You would need this for Wii compatibility reasons, if it doesn't do anything else for the Wii U.
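(All of those are just multiples of an assumed 162 MHz base clock; a quick sanity check of the arithmetic, nothing more:)

```python
base = 162  # MHz, the assumed common base clock in the guess above
for part, mult in {"CPU": 10, "GPU": 3, "RAM": 5, "DSP": 1, "ARM": 1.5}.items():
    print(f"{part}: {base * mult:g} MHz")
# CPU: 1620 MHz, GPU: 486 MHz, RAM: 810 MHz, DSP: 162 MHz, ARM: 243 MHz
```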

We should do some kind of "Price is Right" thread for these speculations.
 

v1oz

Member
It would be pointless though. Shaders can take care of all those instructions and more. In other words, from what I have gathered, the Wii environment can basically be reproduced without TEV or any of those old parts. There is such a jump from that architecture that it's not an issue. The CPU, on the other hand, seems to be a closer match to the older Wii parts. Sounds kind of like how Sony did BC on the 80 GB PS3 (except they actually included the old CPU, since they had Cell), but probably even closer to 100%.
Are you 100% certain the TEV functionality could be done with shaders? Nintendo tends to prefer hardware emulation over software.
 
Whatever CPU Sony uses next in their VIDEOGAME CONSOLE, it will definitely be faster for VIDEOGAMES.

But not faster than Cell, a 7-year-old CPU!!11!!!

Not so fond of info like that when it's aimed at Sony instead of Nintendo, are you? lol...

Sony had better get the PS4 right; it looks like it may well be their last console.
 
It would be pointless though. Shaders can take care of all those instructions and more. In other words, from what I have gathered, the Wii environment can basically be reproduced without TEV or any of those old parts. There is such a jump from that architecture that it's not an issue. The CPU, on the other hand, seems to be a closer match to the older Wii parts. Sounds kind of like how Sony did BC on the 80 GB PS3 (except they actually included the old CPU, since they had Cell), but probably even closer to 100%.

Wouldn't that require some sort of emulation layer? And if some GC/Wii games were using TEVs in more exotic ways, that would make the emulation difficult. Well, maybe not too difficult, but difficult enough to favour a nice simple hw solution over a sw one...
 

Van Owen

Banned
But not faster than Cell, a 7-year-old CPU!!11!!!

Not so fond of info like that when it's aimed at Sony instead of Nintendo, are you? lol...

Sony had better get the PS4 right; it looks like it may well be their last console.

Yeah, it will be much better for general performance.

Cell might be better at folding proteins or whatever though. My god, Sony is doomed.
 

Lothars

Member
But not faster than Cell, a 7-year-old CPU!!11!!!

Not so fond of info like that when it's aimed at Sony instead of Nintendo, are you? lol...

Sony had better get the PS4 right; it looks like it may well be their last console.
Are all of your posts this clueless? Who cares if the processor isn't as fast in some ways as the Cell? If it makes it easier for developers to make games compared to what the Cell did, then I think that's a good thing for the system. I bet games will still look amazing on the next Xbox and PS4.
 
Yeah, it will be much better for general performance.

Cell might be better at folding proteins or whatever though. My god, Sony is doomed.


Well, the Wii U CPU might be better for general-purpose work too, because Xenon and Cell weren't. And ports are based on consoles relying on CPUs with high frequencies, lots of threads and FP.
 

ozfunghi

Member
Arkam was basically hounded out of the forum by a vast percentage of the WUST. It was shameful behaviour. Just because he was telling the truth and most of WUST didn't want to hear it. Yeah. Lots of hypocrites.

Well, I remember giving him a hard time as well. I'm not ashamed to admit it, and I'm not ashamed that I did, either. I remember I had a hard time with the way he brought the message, and not so much with the message itself. It was very hit and run. Trolling, really. The fact that it might have been true doesn't change that.

Also, no need for you to try and act holier than the pope, point the finger or pass judgement. After all, there was a reason you got banned, was there not?

Well, the Wii U CPU might be better for general-purpose work too, because Xenon and Cell weren't. And ports are based on consoles relying on CPUs with high frequencies, lots of threads and FP.

Get.Your.Filthy.Logic... OUT OF HERE!
 
Haha, fans of other companies don't like the idea of the 'true' next-gen consoles' CPUs being worse at certain things than their 7-year-old CPUs lol ;).
 

QaaQer

Member
Well, I remember giving him a hard time as well. I'm not ashamed to admit it, and I'm not ashamed that I did, either. I remember I had a hard time with the way he brought the message, and not so much with the message itself. It was very hit and run. Trolling, really. The fact that it might have been true doesn't change that.


Is it trolling if it is the truth?
 

Meelow

Banned
I doubt the Metro dev would have an issue if that were even remotely true.

If you read everything THQ/4A said about Metro, it sounds more like they want to do it but don't have the time and money to do it. They said "we might make a Wii U version at a later date" and they also said "even the PS3 is pushing us".

So it really seems it has nothing to do with the Wii U CPU and more to do with THQ's money and time situation.

But that's what I think.
 

Van Owen

Banned
If you read everything THQ/4A said about Metro, it sounds more like they want to do it but don't have the time and money to do it. They said "we might make a Wii U version at a later date" and they also said "even the PS3 is pushing us".

So it really seems it has nothing to do with the Wii U CPU and more to do with THQ's money and time situation.

But that's what I think.

They also said it could run on an iPad. It would just be gimped, like if it were on Wii U.
 
If you read everything THQ/4A said about Metro, it sounds more like they want to do it but don't have the time and money to do it. They said "we might make a Wii U version at a later date" and they also said "even the PS3 is pushing us".

So it really seems it has nothing to do with the Wii U CPU and more to do with THQ's money and time situation.

But that's what I think.
"The Wii U has a horrible and slow CPU"

Does the guy have to get it tattooed on his ass for you? He's the Chief Technical Officer to boot. That's about as clear a statement as you can get, regardless of what other things are said by a PR guy the next day.
 

MDX

Member
Here is the thing people are forgetting:

Nintendo found a way to create an incredibly lag-free signal between the gamepad, console, and TV. Who is to say that Nintendo and partners have not come up with something special for the memory?


For example:

"The performance problem of hardware nowadays is not clock speed but ram latency. Fortunately Nintendo took great efforts to ensure developers can really work around that typical bottleneck on Wii U.

"They put a lot of thought on how CPU, GPU, caches and memory controllers work together to amplify your code speed.

Renesas has a patent for memory controllers, for high-speed access to memory:
http://patents.com/us-8255622.html

Is similar technology being used?
 

Meelow

Banned
"The Wii U has a horrible and slow CPU"

Does the guy have to get it tattooed on his ass for you? That's about as clear a statement as you can get, regardless of what other things are said by a PR guy the next day.

I'm not saying that it's not "slow", because I believe it is. I'm just saying that it seems THQ wants to put it on the Wii U but doesn't have the time and money. And I also remember that devs said this about the PS3 in its first few years.

That's just my opinion though.
 

Van Owen

Banned
Here is the thing people are forgetting:

Nintendo found a way to create an incredibly lag-free signal between the gamepad, console, and TV. Who is to say that Nintendo and partners have not come up with something special for the memory?


For example:



Renesas has a patent for memory controllers, for high-speed access to memory:
http://patents.com/us-8255622.html

Is similar technology being used?

The gamepad video is just streamed over existing 5 GHz WiFi.
 

MDX

Member
"The Wii U has a horrible and slow CPU"

Does the guy have to get it tattooed on his ass for you? He's the Chief Technical Officer to boot. That's about as clear a statement as you can get, regardless of what other things are said by a PR guy the next day.


Did he actually work on it or just look at the numbers?
 
Sony PS3's Cell CPU -> 230.4 GFLOPS

Source: http://en.wikipedia.org/wiki/PlayStation_3_hardware


Rumor has it the PS4 will use an A10 APU from AMD ->


Fastest A10 to date -> 121.6 GFLOPS (@ 100 W already, CPU FLOPS only)

Source: http://www.rage3d.com/reviews/fusion/amd_a10_5800k_launch_review/

That's because Piledriver only has one floating-point unit per module, since FPUs aren't very useful for general processing. FLOPS have no place in measuring the power of a general-purpose processor.
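For what it's worth, both quoted numbers are theoretical single-precision peaks of the clock x SIMD-width x FLOPs-per-cycle variety. Reconstructing the Cell figure under the usual assumptions (3.2 GHz, 4-wide single-precision FMA, all 8 SPEs counted even though PS3 games only ever saw 6):

```python
clock_mhz = 3200        # Cell clock in the PS3
flops_per_cycle = 8     # 4-wide single-precision SIMD, FMA = 2 FLOPs per lane
units = 1 + 8           # PPE vector unit + 8 SPEs (marketing counts all 8,
                        # even though PS3 games only ever got 6)

total_mflops = clock_mhz * flops_per_cycle * units
print(total_mflops / 1000)  # 230.4 GFLOPS, the figure quoted above
```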
 

beril

Member
That's because Piledriver only has one floating-point unit per module, since FPUs aren't very useful for general processing. FLOPS have no place in measuring the power of a general-purpose processor.

That didn't stop Sony from hyping up the Cell as the most powerful CPU ever created, while most developers were fairly clueless about what to even put on the SPEs.
 

Durante

Member
Nintendo looks like they emphasized GPU over CPU; why do you think Sony wouldn't?
Even if they emphasize the GPU more (which they likely will), that doesn't mean that their CPU will be lackluster.

I'm almost certain that Sony is not building a 35 Watt system.


Here is the thing people are forgetting,

Nintendo found a way to basically create an incredible lag free signal between the gamepad, console and TV. Who is to say that Nintendo and partners have not come up with something special for the memory?
You know what says that? Logic. The chips they are using are readily available on the market and their specifications are well known. If, by some remote chance, someone had come up with a magical hardware way to get more performance out of DDR3, a Nintendo console wouldn't be the first place you'd hear about it.


But not faster than Cell, a 7-year-old CPU!!11!!!

Not so fond of info like that when it's aimed at Sony instead of Nintendo, are you? lol...

Sony had better get the PS4 right; it looks like it may well be their last console.
Your posts are embarrassing.


That didn't stop Sony from hyping up the Cell as the most powerful CPU ever created, while most developers were fairly clueless about what to even put on the SPEs.
Cell is a very intelligently designed, extremely efficient architecture. How forward-looking it was can be seen in the fact that heterogeneous accelerator-based computing is now the main driving force in HPC. It's just based mostly on GPUs now, but the principle is extremely similar.
It's harder to program, but going with a hard-to-program but efficient architecture paid off for Sony with the PS2. With the PS3, they learned that they cannot count on that when they are not the market leader. Which leads to the (rumoured) PS4 decisions.
 

tenchir

Member
You know what says that? Logic. The chips they are using are readily available on the market and their specifications are well known. If, by some remote chance, someone had come up with a magical hardware way to get more performance out of DDR3, a Nintendo console wouldn't be the first place you'd hear about it.

You could get more performance out of DDR3 by using a multi-channel memory architecture, like dual-channel DDR3. An Intel chipset can get 25.6 GB/s from 1066 MHz DDR3 using triple channel. I don't think Nintendo implemented anything like this, though, because there are issues (latency) with using these architectures.

Edit: Our PCs have been using DDR2/3 like this for years now, for example.
 
Even if they emphasize the GPU more (which they likely will), that doesn't mean that their CPU will be lackluster.

I'm almost certain that Sony is not building a 35 Watt system.

Agreed.


You know what says that? Logic. The chips they are using are readily available on the market and their specifications are well known. If, by some remote chance, someone had come up with a magical hardware way to get more performance out of DDR3, a Nintendo console wouldn't be the first place you'd hear about it.

This isn't true. Just like how USB 3.0 is faster than Firewire on "paper" but still loses to Firewire in real-world benchmarks. To say Nintendo couldn't have thought up something to make the memory bandwidth more efficient using a smaller pipe is closed-minded. We already have technology out there that delivers more data using a smaller pipe. Hell, AMD proved this back during the socket 939 days.


You are an embarrassment.

The guy you're replying to has a point. Sony is losing a lot of money. They can't afford to create an uber-powerful console that they'd lose a ton of money on. In this economy (although improving), people just don't want to spend a ton of money on an expensive console. If the initial 3DS sales weren't an indicator of that, then I don't know what is.

Cell is a very intelligently designed, extremely efficient architecture. How forward-looking it was can be seen in the fact that heterogeneous accelerator-based computing is now the main driving force in HPC. It's just based mostly on GPUs now, but the principle is extremely similar.
It's harder to program, but going with a hard-to-program but efficient architecture paid off for Sony with the PS2. With the PS3, they learned that they cannot count on that when they are not the market leader. Which leads to the (rumoured) PS4 decisions.

Do you work for Sony? You're starting to sound like their PR team.
 

Durante

Member
You could get more performance out of DDR3 by using a multi-channel memory architecture, like dual-channel DDR3. An Intel chipset can get 25.6 GB/s from 1066 MHz DDR3 using triple channel. I don't think Nintendo implemented anything like this, though, because there are issues with using these architectures.
Wii U is already using the maximum bus width afforded by its memory chips.


This isn't true. Just like how USB 3.0 is faster than Firewire on "paper", but still loses to Firewire in real world benchmarks. To say Nintendo couldn't have thought up something to make the memory bandwidth more efficient using a smaller pipe is close-minded. We already have technology out there that delivers more data using a smaller pipe. Hell, AMD proved this back during the socket 939 days.
Are you seriously arguing that Nintendo developed a new signalling standard for DDR3 memory that works with off-the-shelf chips and delivers higher performance? If so, I'll just drop this.


Do you work for Sony? You're starting to sound like their PR team.
No. I work at a university, I've programmed Cell (though it was two of them in an IBM blade, and more accurately it was PowerXCell 8i), and I think it's a beautiful architecture.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
After all, there was a reason you got banned, was there not?

For telling the Wii U fanboys, the ones who said it was going to be an uber console and gave folks like me and Arkam a hard time, exactly what I thought of them in a most ungentlemanly fashion once the truth finally started to come out about this 'on par' system.

And it was absolutely worth it.
 

tenchir

Member
Wii U is already using the maximum bus width afforded by its memory chips.

That's not how multi-channel memory works, though; each memory chip has its own channel to the memory controller, instead of a single channel limited by its maximum bus width. In multi-channel (dual, for example) you are not getting a single channel for all the chips (4x512MB) that provides 12.8 GB/s, you are getting one channel per 2x512MB of chips, which can give up to 25.6 GB/s. I am not really good at explaining this, so here's a link.

http://en.wikipedia.org/wiki/Multi-channel_memory_architecture

edit: I am not saying that Nintendo is using this, just that it's possible to get more bandwidth from DDR3 without increasing its speed. It just depends on the memory controller in the CPU and the memory chip configuration.
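The arithmetic behind these numbers is straightforward to sketch (illustrative configurations only; peak bandwidth is just channels x bus width x transfer rate, real-world efficiency aside):

```python
def peak_bandwidth_gbs(channels, bus_bits, megatransfers):
    """Theoretical peak in GB/s: channels x bus bytes x transfers per second."""
    return channels * (bus_bits / 8) * megatransfers / 1000

print(peak_bandwidth_gbs(1, 64, 1600))  # 12.8   single-channel DDR3-1600
print(peak_bandwidth_gbs(2, 64, 1600))  # 25.6   dual-channel DDR3-1600
print(peak_bandwidth_gbs(3, 64, 1066))  # ~25.6  Core i7 triple-channel DDR3-1066
```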
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
That's not how multi-channel memory works, though; each memory chip has its own channel to the memory controller, instead of a single channel limited by its maximum bus width. In multi-channel (dual, for example) you are not getting a single channel for all the chips (4x512MB) that provides 12.8 GB/s, you are getting one channel per 2x512MB of chips, which can give up to 25.6 GB/s. I am not really good at explaining this, so here's a link.

http://en.wikipedia.org/wiki/Multi-channel_memory_architecture

I don't know much about memory architecture, but this is on Wikipedia:

According to Intel, a Core i7 with DDR3 operating at 1066 MHz will offer peak data transfer rates of 25.6 GB/s when operating in triple-channel interleaved mode.

And that's a Core i7 with triple channel running at 1066 MHz. I can't see the Wii U getting anywhere near that.
 

tenchir

Member
I don't know much about memory architecture, but this is on Wikipedia:


And that's a Core i7 with triple channel running at 1066 MHz. I can't see the Wii U getting anywhere near that.

Triple channel just means that it's more likely to reach the peak 25.6 GB/s than dual channel. Dual channel can reach 25.6 GB/s too, but that depends on the controller and memory being at 100% efficiency or something. I didn't say that the Wii U can reach 25.6 GB/s; all I am saying is that you can get more bandwidth from DDR3 @ 800 MHz without increasing its clock speed.

The dual-channel configuration alleviates the problem by doubling the amount of available memory bandwidth. Instead of a single memory channel, a second parallel channel is added. With two channels working simultaneously, the bottleneck is reduced. Rather than wait for memory technology to improve, dual-channel architecture simply takes the existing RAM technology and improves the method in which it is handled. While the actual implementation differs between Intel and AMD motherboards, the basic theory stands.
 

beril

Member
That's not how multi-channel memory works, though; each memory chip has its own channel to the memory controller, instead of a single channel limited by its maximum bus width. In multi-channel (dual, for example) you are not getting a single channel for all the chips (4x512MB) that provides 12.8 GB/s, you are getting one channel per 2x512MB of chips, which can give up to 25.6 GB/s. I am not really good at explaining this, so here's a link.

http://en.wikipedia.org/wiki/Multi-channel_memory_architecture

edit: I am not saying that Nintendo is using this, just that it's possible to get more bandwidth from DDR3 without increasing its speed. It just depends on the memory controller in the CPU and the memory chip configuration.

That's just about adding more modules together for a bigger bus. In most cases on PC it's two 64-bit modules for a 128-bit bus. The Wii U has four 16-bit chips, adding up to a 64-bit bus.
 

ozfunghi

Member
For telling the Wii U fanboys, the ones who said it was going to be an uber console and gave folks like me and Arkam a hard time, exactly what I thought of them in a most ungentlemanly fashion once the truth finally started to come out about this 'on par' system.

And it was absolutely worth it.

People like you and Arkam? Exactly what insider knowledge did you bring to the table to justify your constant bashing? And "on par" system? lol, ok. We'll get back to that later on.
 

Durante

Member
That's not how multi-channel memory works, though; each memory chip has its own channel to the memory controller, instead of a single channel limited by its maximum bus width. In multi-channel (dual, for example) you are not getting a single channel for all the chips (4x512MB) that provides 12.8 GB/s, you are getting one channel per 2x512MB of chips, which can give up to 25.6 GB/s. I am not really good at explaining this, so here's a link.

http://en.wikipedia.org/wiki/Multi-channel_memory_architecture

edit: I am not saying that Nintendo is using this, just that it's possible to get more bandwidth from DDR3 without increasing its speed. It just depends on the memory controller in the CPU and the memory chip configuration.
You are not sufficiently understanding the technical issue here. There are 4 chips in the Wii U. Each has a 16-bit-wide connection. It's not possible for it to be anything more than a 64-bit memory controller.
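Put in numbers (assuming the commonly reported 4 x 16-bit DDR3-1600 chips; the channel layout can't add width the chips don't physically have):

```python
chips = 4
bits_per_chip = 16      # each DDR3 chip exposes a 16-bit interface
megatransfers = 1600    # DDR3-1600 transfer rate

total_bus_bits = chips * bits_per_chip               # 64 bits, full stop
peak_gbs = (total_bus_bits / 8) * megatransfers / 1000
print(total_bus_bits, peak_gbs)                      # 64 12.8
```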
 
Alright, alright, we've been hearing this all week. Why not call out individual posters instead of the WUST as a whole? Let's see some quotes! To me, it's no different than people calling out the "GAF hivemind." It's an oversimplification and implies that all Nintendo fans/Wii U speculators think exactly alike. Half the Arkam bashers are probably banned by now anyway.
You don't really need to quote anyone. The whole community was an echo chamber.
 
They should have gone with double the chips at a lower capacity while they waited for Micron's Twindie chips to be ready.

As it is, they had bad luck because those weren't available, but they surely knew, as we do, that buying time (and wasting a little more money and energy per console) was an option. And they should have decided on it in a heartbeat.
 