
Wii U clock speeds are found by marcan

Kai Dracon

Writing a dinosaur space opera symphony
The only thing I don't understand behind this decision is this:

Nintendo has been stating all along that the WiiU was aimed at recapturing the core market by making it an attractive system for third party developers.

The CPU has been giving many devs issues (with current gen ports!!)

There seems to be a clear mismatch of what Iwata said about the goals of the system and what they actually delivered.

Sure companies say crap all the time, but this was about the vision for the console, not just a simple PR statement.

I think this is where the rubber hits the road with the bind Nintendo is in.

On one hand, I tend to think Nintendo is sincere when they feel that pure, red-ocean competition with the other guys isn't going to work for them. What other people call "gimmicks" are Nintendo differentiating the experience their stuff offers. Those "gimmicks" also allow their own developers to create content that has some unique edge you'll never see elsewhere.

They also don't want to price themselves out of the mass market at launch. In case nobody has noticed, 7 years into this generation and the PS360 aren't $99 (not without a subscription trick on Microsoft's part). They apparently are even selling Wii U at a slight loss, something they don't like to do, ever.

Therefore, the budget of any Nintendo hardware generally can't be balanced in a way that equals parting out a gaming PC the way many folks approach it. Their priorities are going to be different.

On the other hand, Nintendo seems to realize after the Wii, that they've got to balance their platforms to meet certain demographics half way. So they're on a precarious rail. As I remarked upthread, the fact that a rushed port of something like ACIII looks and runs as well as it does on Wii U should be a big hint that Nintendo's strategy for designing the system isn't as "illogical" as some would claim. We've already got people freaking believing that the Zelda demo will never happen in a real game, because now that they've got one number to attach to the system, it's like literally a Wii that outputs 720p or something. In spite of the launch games right in front of them. For example: look at the scale and complexity of ACIII. Now imagine EAD and internal developers making a Zelda game, fully optimized and designed for the hardware.

I'm not panicking about the console's future potential there.

But it does remain that Nintendo is in a very uncomfortable position. The harsh reality is that they have to serve many masters. A lot of people don't seem to recognize or accept the consequences of that, merely characterizing them as entirely stupid and oblivious for not "competing" directly with companies that have different aims, and are in different situations.

And it's like nobody is looking at how dangerous that route is. Sony sure is doing great these days as a company, for instance.
 

Raist

Banned
For posters who keep bashing the WUST thread and the expectations, I'll say this again:

How can you blame ANYONE in those threads for expecting a CPU less than today's standards? It's illogical. Unfortunately, Nintendo went with the illogical method.

Just don't bash the people who actually expected something logical to happen. It's asinine.

It doesn't matter if it's logical or not, it's Nintendo. It was a given they'd go that way. I guess people bought the "We made this for teh hardcore1!!!" bullshit from Reggie too easily.
 

McLovin

Member
I think the controller was the main reason they went with that slow CPU/ram combo. They wanted to be profitable and this is the result.
 

Kerub

Banned
persistance+of+memory.jpg


ah, a Dali classic. Truly a timeless artist; such irony in the painting and the artist.

Are the clocks supposed to represent the Wii U CPU cores?
 
Is this why the OS is slow as hell to load anything off the menu?

No. It just doesn't make much sense for this to be the case.
I'm still going with my 'it keeps reconnecting to the Nintendo Network' theory.

People say Wii U is on par with current gen consoles. And the rebuttal is pointing out ports that perform on par with current gen consoles? Ok...

Read the last sentence. It wasn't even a rebuttal to anyone.
What are you on about?
 

Raist

Banned
I think the controller was the main reason they went with that processor and slow ram. They wanted to be profitable and this is the result.

In terms of cost, yeah that makes sense. But you have to wonder why they'd use awfully outdated specs on a console which in addition has to feed a second screen. It's insane.
 

Vestal

Gold Member
To be fair, regarding the GHz discussion, my current rMBP has a 2.6GHz CPU that completely smokes my previous MBP's 2.33GHz C2D CPU. Not all CPU clock speeds are created equal.

As has been stated various times in this thread, yes that premise is correct. However, it is also important to point out that your new CPU is probably not based on the same architecture as your old one. In the case of the Wii U, they apparently are using the same base architecture that the GameCube and Wii used, which is why people are very surprised at the clock speeds. There is only so much you can tweak and enhance on a 12+ year old architecture.
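To put rough numbers on that: single-thread performance is approximately IPC × clock, so a newer core at a similar clock can still be far faster. A minimal sketch, with made-up IPC figures purely for illustration (not measured values for any real chip):

```python
# Illustrative only: the IPC numbers below are placeholders, not benchmarks.
def relative_perf(ipc, clock_ghz):
    """Very rough single-thread throughput: instructions retired per nanosecond."""
    return ipc * clock_ghz

old_core = relative_perf(ipc=1.0, clock_ghz=2.33)  # hypothetical older core (e.g. a C2D-era chip)
new_core = relative_perf(ipc=1.6, clock_ghz=2.6)   # hypothetical newer core at a similar clock

print(new_core / old_core)  # ~1.8x faster despite only a ~12% clock bump
```

The same arithmetic cuts the other way for Espresso: even with tweaks, a 12-year-old core design caps how much IPC you can wring out at 1.24GHz.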
 

Meelow

Banned
It doesn't matter if it's logical or not, it's Nintendo. It was a given they'd go that way. I guess people bought the "We made this for teh hardcore1!!!" bullshit from Reggie too easily.

This is what confuses me: I want to meet these "hardcore" gamers who play games for tech.
 

ASIS

Member
Then it likely would've gone up in cost. And people still would've said it's underpowered.

A few more graphical whizbangs isn't going to make me like a game more. The Wii U will succeed or fail in my eyes based on software that I find compelling, whether it's on par with the PS4/720 or not. My purchase of the PS4/720 will be made for the same reason: compelling software, not number of gigaflops or sparks or whatever.

Specs directly affect the type of game you will likely get, just like sales and costs. These are all important factors. To say that you only care about the game is fairly vague, we ALL care about the games at the end. But because we care about the games we look at external elements that could, in one way or the other, hamper our enjoyment of the product.
 
clock speeds don't matter. they're only relevant to folks who like to make weird disingenuous tables on the front page of anandtech. the ONLY points of concern re: the cpu are the shitty simd support, the amount of legacy ppc 750 cruft still in the system (which is unknown), and the small number of available hardware threads for increasingly parallelized models. i mean, more clocks would help, but trivially.
 

dark10x

Digital Foundry pixel pusher
The UI is not choking the CPU. My own theory is that the network is gimped; it seems the system is recontacting the Nintendo Network for every new app and almost re-logging you in for everything you want to access.
If that were indeed the case, why would the issue persist even when disconnected from the internet? I don't think it has anything to do with Nintendo Network.
 
the wii u *is* more appealing to third-party developers in that ARCHITECTURALLY it is moderately easy to port from a 360 codebase and achieve reasonably similar performance :teehee

hey, that's three more years of relatively easy ports until durango and/or orbis hit stride!

I think that may be the nicest thing you've ever said.
 
There's more factors to it than that. A core i7 is relatively highly clocked and has better IPC than anything else out there by a country instruction.
I'd say it's more like "low power, high IPC, high clock, cheap" -- chose 2, or 3 if you're Intel.

A Core i7 is modern technology and pretty much in its own class. It's the exception, not the rule; pretty much all of their previous processors followed this rule. And of course there's more to it than that: ISA, ALU counts, pipeline width, pipeline complexity, software, etc.
 

F#A#Oo

Banned
How much extra would it have cost Nintendo to put a more modern CPU & GPU in the same system? $15 per system?

I doubt it's a price barrier.

Nintendo had form factor and power consumption at the top of the list. The main bulk of R&D went into the GamePad. Just check out the Iwata Asks; it's very clear on the philosophy/path Nintendo is on.

Essentially it boils down to wanting low power plus above-average performance in the power department.
 

McLovin

Member
Can't wait until people get shocked at PS4 and Xbox720 CPU's being clocked at less than 2ghz.

Jaguar cores are capped at 2GHZ... :)
How would they be slower than PS3/360?
As has been stated various times in this thread, yes that premise is correct. However, it is also important to point out that your new CPU is probably not based on the same architecture as your old one. In the case of the Wii U, they apparently are using the same base architecture that the GameCube and Wii used, which is why people are very surprised at the clock speeds. There is only so much you can tweak and enhance on a 12+ year old architecture.
I know, just trying to figure out why they would do that. Didn't say it was a good idea.
 
Lower than expected on the CPU side, although now I guess we really know what Espresso on Beyond3D meant when he claimed it was clocked only a "little higher" than Broadway. I think this also trounces the rumors of any huge upgrades to the cores besides the cache and SMP. Seems like Broadway through and through.
 
Alright. Decided to upgrade GPU with 6770.

CPU: 3.2GHz Triple Core AMD Athlon II X3 450 - $65
GPU: XFX Radeon HD 6770 GPU with 1GB of VRAM - $100
Motherboard: BIOSTAR A780L3B Micro ATX - $45
RAM: 2GB of Kingston DDR3 - $10
Power Supply: Athena Power 400W PSU - $20
Hard Drive: 250GB WD Caviar Blue 7200RPM - $50

TOTAL: $290.

Any ideas on what this wouldn't play? I would love a gaming PC but this is more in line with my budget.

Sorry for off-topic :)
 

Vestal

Gold Member
Can't wait until people get shocked at PS4 and Xbox720 CPU's being clocked at less than 2ghz.

Jaguar cores are capped at 2GHZ... :)

However, they will be new architectures, which makes all the difference in the world. Just like the AMD Athlon processors that, clocked 1GHz lower than Intel's, would smoke them in comparisons back in the day.
 
Who cares if the CPU is slower? Doesn't mean it won't have some of the best games of the generation. I couldn't care less about the hardware power, so long as the software is good.
 
clock speeds don't matter. they're only relevant to folks who like to make weird disingenuous tables on the front page of anandtech. the ONLY points of concern re: the cpu are the shitty simd support, the amount of legacy ppc 750 cruft still in the system (which is unknown), and the small number of available hardware threads for increasingly parallelized models. i mean, more clocks would help, but trivially.
What's going on here? Your last few posts have been a bit too reasonable.

Has GAF's security been compromised???
 

Linkhero1

Member
This is what confuses me: I want to meet these "hardcore" gamers who play games for tech.

A majority of these people don't give a shit about the Wii U or the games. It's not like they would buy third party ports on the Wii U even if it had a great CPU and GPU when they can get them on consoles they own and, in the future, for the PS4 or Xbox 720.
 

RM8

Member
The car analogies are so inaccurate.

Wii is a kart, PS3 is a Ferrari. Can I play a 50-times-as-good Super Mario Galaxy on my PS3, please? I mean, it should do everything my Wii does, but way better as it's the case with a low-end car vs. a high-end car.
 
The problem Nintendo has is their market aims. It seems to be a conscious choice.

To the enthusiast gamer, everything that's not as powerful as current technology allows is "gimped". It's a bit like an auto enthusiast who rails that a mass market consumer minivan is a piece of crap because somewhere in the world, there are Ferraris. It's true that the minivan is no Ferrari, so the enthusiast has a point - but he's also missing the point that the minivan's job is not to be a Ferrari. And in some ways, it's better... like cargo capacity, fuel efficiency, etc.

It's hard for me even with this news to see the Wii U's hardware as "Nintendo cheeps out on j00 suckers". Because Nintendo's goal was not to make a $500 console that was way more powerful than PS360 plus included an iPad. Again, their self-chosen path and problem is that they deal with the mass market. A $300 console (the base model) sounds like their absolute upper limit for MSRP, to not scare away the authentic mainstream audience. Within that price, their concept for the system included an expensive-to-develop and not-cheap-to-produce touch-screen/motion-sensing interface device.

Nintendo doesn't seem to have "gimped" on anything within the price range they had to remain within, considering the total components that make up the system. If the Xbox 360 had had a cheaper CPU, it could have had more RAM, for example. But there were specific priorities and they were followed. Wii U was designed with specific priorities and this is what we got.

The joke with the FUD being spread is that you still have ports like ACIII at launch, made in a rush, that effectively look and run about like the PS3 version of the same game. If people stopped and thought for a moment, they'd see that clearly, something in Nintendo's design strategy for the console is working. Otherwise that game would not exist on Wii U and if it did, never with that kind of port parity.

Edit: I would add that the most questionable thing in the entire matter IMO is Nintendo's very obvious entreaties to 3rd parties about Wii U being friendly towards them from a development and power standpoint. Obviously, working on the console involves some major strategic shifts and while that doesn't mean the hardware is bad, it probably does make Nintendo's official PR line sound like damage control. But then we have all those months and months of some 3rd parties saying the hardware is great, some griping it sucks, etc etc. Opinions, woohah!

Except the Wii U isn't the minivan or the Golf/Polo of Europe. That's the 360 and PS3.

The 360 is the commuter car that plays everything and gets you from point A to point B.

The Wii U and the Wii are much more of a novelty than anything else.
 

10k

Banned
Wow, I guess that rumor of three Broadways clocked at 1.24GHz wasn't far off. I hope to god the rumours of the GPU being a GPGPU are true, or else next-gen ports are gonna suck on the Wii U.
 

kinggroin

Banned
If that were indeed the case, why would the issue persist even when disconnected from the internet? I don't think it has anything to do with Nintendo Network.

maybe the checks are still there, but it'll eventually time out

I don't agree with him either
 
Can't wait until people get shocked at PS4 and Xbox720 CPU's being clocked at less than 2ghz.

Jaguar cores are capped at 2GHZ... :)

Indeed.

PS3 / 360 were CPU centric consoles.

WiiU / PS4 / 720 are GPU centric consoles.

The time and need for super fast, very, very hot CPU's is now gone.

That's a nice little console Nintendo have gotten to retail for $300, esp if you factor in the cost of the Tablet controller.

First Party games are going to look unreal and next gen third party ports should be possible with down ports if the publishers feel there is a market for them on WiiU.

Well done Nintendo ! :D.
 
Any ideas on what this wouldn't play? I would love a gaming PC but this is more in line with my budget.

Sorry for off-topic :)

My original spec should play last year's titles easily on default settings at 1080p.
The one you quoted should play current and future titles at 1080p and default settings (or higher) with no problem.

Someone correct me if I'm wrong.

P.S. There is an amazing PC gaming thread which provides the ideal tested solution for your budget: http://www.neogaf.com/forum/showthread.php?t=493301

Remember though, the build I posted doesn't include optical drive, casing, operating system, monitor, KB/M, speakers, etc. Things may vary but yeah... read Hazaro's thread.
 

LeleSocho

Banned
There is only so much you can tweak, and enhance on a 12+ year old architecture.

An Espresso clocked the same as Xenon would totally destroy it. We're not talking about CPU X clocked at 3.0GHz having the same power as CPU Y clocked at 3.2GHz; it's more like CPU X clocked at 1.5GHz having the same power as CPU Y clocked at 3.0GHz.
 

McLovin

Member
The car analogies are so inaccurate.

Wii is a kart, PS3 is a Ferrari. Can I play a 50-times-as-good Super Mario Galaxy on my PS3, please? I mean, it should do everything my Wii does, but way better as it's the case with a low-end car vs. a high-end car.
If the PS3 is a Ferrari then so is the Wii U, but they took the engine out and replaced it with a Honda Civic's.
 

onQ123

Member
Just think, I probably would have been banned if I had made a thread when I posted the rumor of the Wii U CPU being a tri-core Wii CPU with more memory and a higher clock speed, and it turned out to be true.
 
So I guess the rumours are true....

The Wii-U CPU really is just three Broadway CPUs duct-taped together. Do they have this man on their console R&D team?

redgreen.jpg




Just for the fuck of it, what's the rough estimate of what these parts would cost?



CPU: 3.2GHz Triple Core AMD Athlon II X3 450 = $59.99 (lowest)
GPU: XFX Radeon HD 4850 GPU with 1GB of VRAM = can't find a price anywhere. Let's just say under $100 USD.
Motherboard: BIOSTAR A780L3L Micro ATX = roughly $44.99, looking around.
RAM: 2GBs of Kingston DDR3 = $9.99
Power Supply: Rosewill RV350 ATX 1.3 = $29.99
Hard Drive: 80GB WD Caviar Blue 7200RPM = $19.99

It's hard to come to an exact price because I can't find the cost of the GPU anywhere (it's been off the market for a while now). But if we just take a guess and say $90.00, then it would come to about $255.00.
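A quick sanity check of that total, using the listed prices and the $90 GPU guess (the GPU figure is an assumption from the post, not a real quote):

```python
# Part prices from the post above; the HD 4850 is a $90 guess since it's off the market.
parts = {
    "CPU (Athlon II X3 450)": 59.99,
    "GPU (Radeon HD 4850, guessed)": 90.00,
    "Motherboard (BIOSTAR A780L3L)": 44.99,
    "RAM (2GB DDR3)": 9.99,
    "PSU (Rosewill RV350)": 29.99,
    "HDD (80GB Caviar Blue)": 19.99,
}
total = sum(parts.values())
print(f"${total:.2f}")  # → $254.95
```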

But comparing six different retail parts to the components of a game console is a pretty poor comparison to make. You can't compare a fully manufactured 4850 GPU on a printed circuit board, with 1GB of GDDR RAM, display ports, and a cooler, to just the manufacturing cost of a single silicon GPU chip. The motherboard in the Wii-U probably cost as much to manufacture as that Radeon HD 4850 itself.

A slightly more accurate comparison would probably be something like this, I think:

BIOSTAR A68I-350 Deluxe AMD Fusion APU 350D AMD A68 Mini ITX Motherboard/CPU Combo
 
The only thing I don't understand behind this decision is this:

Nintendo has been stating all along that the WiiU was aimed at recapturing the core market by making it an attractive system for third party developers.

The CPU has been giving many devs issues (with current gen ports!!)

There seems to be a clear mismatch of what Iwata said about the goals of the system and what they actually delivered.

Sure companies say crap all the time, but this was about the vision for the console, not just a simple PR statement.

Well, the early results are in: they have the biggest third-party titles released this year as launch releases, and they're at least in the same ballpark as the X360 and PS3 versions. Surely that means the Wii U has achieved what the Wii never did: Ninty now has a budding userbase for the biggest third-party releases, with some initial experience under those developers' belts, while mostly offering significant upgrades and/or differentiation to that experience. As the system matures, so will the understanding of what it takes to match or better the ports that come each year, just as it has been evolving between X360 and PS3, year after year. I don't see any problem that isn't just speculative or theoretical, even coming from a developer with hands-on Wii U experience; they will find a way even if it's not one-for-one. I don't see a mismatch at all, just a realignment of expectations showing that Ninty is focused on the here and now, not so much on whatever happens in the so-far unproven and uncertain future for their competition from MS and Sony.
 
all of amd's gpu offerings from the R7XXX line onward are "gpgpu". iwata's just parroting a bullet point. the REAL question is: how many GFLOPS will the gpu be able to spare? :teehee
 

dwu8991

Banned
Indeed.

PS3 / 360 were CPU centric consoles.

WiiU / PS4 / 720 are GPU centric consoles.

The time and need for super fast, very, very hot CPU's is now gone.

That's a nice little console Nintendo have gotten to retail for $300, esp if you factor in the cost of the Tablet controller.

First Party games are going to look unreal and next gen third party ports should be possible with down ports if the publishers feel there is a market for them on WiiU.

Well done Nintendo ! :D.

Yeah, it's the GPU that's the star of the show !!!
 

quest

Not Banned from OT
Can't wait until people get shocked at PS4 and Xbox720 CPU's being clocked at less than 2ghz.

Jaguar cores are capped at 2GHZ... :)

Except the Jaguar cores will have much higher IPC than the modified GameCube CPU. If Jaguar is anything like Bobcat, it will be an amazing CPU. They are also beefing up the FPU on Jaguar. If Nintendo wanted to use such an old CPU, they should have used a GPU based on the latest AMD GPUs. Much better compute performance.
 