CPU: “Espresso” CPU on the Wii U has three enhanced Broadway cores
GPU: “GPU7” AMD Radeon™-based High Definition GPU. Unique API = GX2, which supports Shader Model 4.0 (DirectX 10.1 and OpenGL 3.3 equivalent functionality)
Storage: Internal 8 GB with support for SD Cards (SD Cards up to 2GB/ SDHC Cards up to 32GB) and External USB Connected Hard Drives
We would hear a lot of concern from devs if they really used 'overclocked' Broadway cores...
Disappointed that they never used a DX11-or-equivalent GPU, but I can understand that they're trying to keep the price down because of the cost of the controller.
I guess we'll see on Thursday whether or not it will be worth it price-wise. Charging $300 for a DX10.1 console with 1.5GB of RAM and an 8GB flash drive, when the PS4 / 720 will almost certainly be DX11-capable consoles with 2GB+ of RAM, likely much faster CPUs, and larger HDDs for $400 a year to 18 months later, is a bit of a joke tbh. But like everyone else who buys one, I'll buy it for Nintendo's first-party output if they price it accordingly.
Going from those specs it should be no more than $250 / £200 imo.
Huh? What's seen on the screen is processed and rendered on the console. The only difference is screen resolution.

Looks to be the right specs. No need to put too much power into it if the tablet controller thingy can't display the same kind of power. It would be annoying looking at a gorgeous game on my television and then looking down at my tablet and seeing the same stuff not looking as good.
The PS3 doesn't use DirectX, and nor will the PS4. UE4 has OpenGL support, so I imagine that's what the PS4 and Wii U will use (assuming they port the engine over).

Yeah, that could be a big problem for the Wii U after the PS4/nextbox launch, because I think Unreal Engine 4 (not the mobile one) and Luminous are based on DX11 tech? Not that I think the Wii U will have good third-party support, but still.
Someone translate this into DBZ levels of power plz
Source? They don't say much other than the basic architecture here, even if it's true.
1. Denial
2. Anger
3. Bargaining
4. Depression
5. Acceptance
Which stage are you guys at? Because it's all sounding like a broken record.
Frankly, it's simply embarrassing to use a 12-13 year old CPU; no surprise Nintendo doesn't want to share the specs. No matter how many cores or how high the clock speed, it's an architecture from 12-13 years ago, limited by default. It redefines the word "cheap".
What I don't get are people who pile on Nintendo for their specs but game primarily on PS360 rather than PC.
I still think $250 with a Nintendo Land pack-in is the perfect price point.
Xenon is a 2005 CPU. And 90% of rumors say it's equivalent to the Xenon CPU. Including this one. So....
You don't have to with the other consoles.

I don't get the complaints about the 8GB internal storage. You can use SD or an external HDD, unlike any other console.
The quote you responded to:
I don't think you answered it. Nintendo is, after all, all about exclusives.
2010 would have been a better time to ride the hype of the Wii. The casuals may well have left at this point, and the more hardcore gamers are going to be more interested in the more powerful Xbox or PlayStation coming out next.
Isn't Broadway (Wii) just an overclocked Gekko (GC) CPU? If this is true, that means they're still using tech from the late 1990s. I don't even know what to say.
DX11 is a superset of DX10/10.1.

Yeah, that could be a big problem for the Wii U after the PS4/nextbox launch, because I think Unreal Engine 4 (not the mobile one) and Luminous are based on DX11 tech? Not that I think the Wii U will have good third-party support, but still.
I've been at stage 5 since the beginning of Wii U since it has games I want to play.
I personally think that to believe everything with Nintendo's strategy is hunky-dory because "it worked for the Wii" is equally naive, and disregards the ways the market has changed in the last 7 years.
By this point in the lead-up to the Wii's launch, the hype for it was contagious. Everyone was talking about it and wowed by the never-before-seen controller concept. I don't think the Wii U has so far gotten anything comparable to that kind of hype. Without it, my fear is that, with specs this low, there's little to no incentive for developers to push quality Wii U software over the alternative: continuing to make games for the two HD consoles already on the market, with 60+ million install bases and specs good enough that any benefit from the Wii U's added power won't be immediately visible to the layman.
But we'll see soon enough...:-/
Same here. There are a lot of Nintendo fanboys who owe me an apology, you fucktards.
Wasn't the Wii U thread speculating more toward the 1.5-2GB range for RAM?
Memory: Mem1 = 32MB, Mem2 = 1GB (that applications can use)
Let me know when they show up, because nothing announced has convinced me that the games even look marginally better than, or even on par with, the best current-gen offerings. Albeit these are launch games I've seen. Beyond the novel fun some of the games provide, I'm not impressed.
I'm waiting on the next big Mario that isn't an NSMB game. Something like a Galaxy sequel, but up to date with new gameplay mechanics. I get the feeling that won't be until next year or later, though.
At least it seems to really be on par with current-gen specs, so I'm not really worried about modern-day ports. It's the stuff down the road being built specifically for PS4/720 that has me worried for those expecting the Wii U to fulfill all their gaming needs. You'd better saddle up and get a PC or one of the next-gen platforms, just to be safe.
So um. How exactly will they get PS4/720 ports with these specs?
That link doesn't say anything of the sort.

http://gamenmotion.blogspot.com.es/2012/06/rumor-wii-u-cpu-is-3-wii-cpu-cores.html
The B3D "insider" says it's equivalent to a Xenon.
The same way the PS2 got downports from the GameCube and Xbox?
So that means a Broadway chip was roughly equivalent to one of the 360 cores?
That's kind of crazy.
8GB internal storage, with a max of 8 plus 32GB, is extremely disappointing and archaic from a downloadable-game perspective. Most smartphones will have 32GB basically from this year on.
You'd better not, because it's a pain in the ass, or expensive if you want to.

You don't have to with the other consoles.
Frankly, it's simply embarrassing to use a 12-13 year old CPU; no surprise Nintendo doesn't want to share the specs. No matter how many cores or how high the clock speed, it's an architecture from 12-13 years ago, limited by default. It redefines the word "cheap".
That's from a hard tech viewpoint. I have no doubt that Nintendo could have huge success selling toasters and I don't need to point the excellence of their development teams that will surely do wonders with this. But there is a line between being conservative and being cheap and Nintendo is crossing it.
Networking: 802.11 b/g/n Wi-Fi