
Rumor: Wii U final specs

Or the update could simply be there to make sure the press don't get distracted... also to keep them from spoiling the fact that Wii BC is going to be another disappointing mark...

well, I think it's highly unfair for people to expect anything more than 99.9% compatibility. that's far beyond what the 360 managed, and only really matched by what the PS3 did with its launch models.

launching into Wii mode is far from elegant, I admit, but if it gives me 100% compatibility with all my downloaded Wii software and retail Wii games, I'll gladly take it. yes, Wii U Gamepad screen support would have been amazing, but I can understand why that ultimately couldn't happen. it's going to be annoying seeing them get dinged for almost perfect BC.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Honestly, I think this is going to largely depend on game budgets. If the developer doesn't have, say, $50-60 million, they won't be able to produce an "AAA" game, and studios like Rockstar will continue to push $100 million games. Next gen the scope of those games will require more money; it honestly wouldn't surprise me if we saw some $150 million budgets from them for similar games.

The question of financing is a tricky one because ballooning game budgets tend to be eaten up by games of ridiculously huge scope, crazy 'Hollywood' production values, and global marketing campaigns. It's not like better graphics/more impressive effects = more expensive. You can achieve a noticeably impressive boost in rendering effects via an improved lighting/shadow engine, quality SSAO/HBAO, dynamic DOF and bokeh, and more, all of which is encoded into the engine back end and available to most everybody.
 

z0m3le

Banned
The question of financing is a tricky one because ballooning game budgets tend to be eaten up by games of ridiculously huge scope, crazy 'Hollywood' production values, and global marketing campaigns. It's not like better graphics/more impressive effects = more expensive. You can achieve a noticeably impressive boost in rendering effects via an improved lighting/shadow engine, quality SSAO/HBAO, dynamic DOF and bokeh, and more, all of which is encoded into the engine back end and available to most everybody.

Sure, but assets also scale with budgets: better textures, higher-poly models, more detailed environments... art isn't free. Not everything relates to scope, but yes, scope is what will balloon a budget. $30 million for your average AAA title should increase to ~$50 million.

Also, what you are talking about with better lighting and shadow engines would allow you to keep budgets down, maybe even at $30 million, but these games will become the new AA, as games with that extra $20 million will make use of more detailed environments, models, etc., which should end up being the average AAA titles of next gen.
 

Phazon

Member
The question of financing is a tricky one because ballooning game budgets tend to be eaten up by games of ridiculously huge scope, crazy 'Hollywood' production values, and global marketing campaigns. It's not like better graphics/more impressive effects = more expensive. You can achieve a noticeably impressive boost in rendering effects via an improved lighting/shadow engine, quality SSAO/HBAO, dynamic DOF and bokeh, and more, all of which is encoded into the engine back end and available to most everybody.

Sure, but assets also scale with budgets: better textures, higher-poly models, more detailed environments... art isn't free. Not everything relates to scope, but yes, scope is what will balloon a budget. $30 million for your average AAA title should increase to ~$50 million.

I think the speed of development and the size of the dev team also play a big part in the costs.

For AC and COD, they need hundreds of people working on it, while Retro Studios is a studio with around 80 (not sure) people. That's a big difference, I think.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Well, if you can only use e-DRAM as a framebuffer and you cannot use a part of it for off-screen textures/FBOs, then you are back to rendering to e-DRAM, exporting framebuffer content to main RAM, and then reading it back as a texture from main RAM like you do on Xbox 360, which does consume more memory bandwidth..
But that is a fixed cost which should not affect your fb BW, ergo it should not spike at high fillrate moments. Also, such resolve operations are normally nicely pipelineable.

..than you would on PS2's GS or Flipper/Hollywood (I was under the assumption that you could only use the texture cache as read-only from the GPU side, but apparently I was wrong).
I still share your old assumption. Care to elaborate on what changed your mind?
 

FyreWulff

Member
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?

 

gofreak

GAF's Bob Woodward
I think some people missed this:



So?

It falls under shader processing. How a game's scene will be lit depends on software - not any particular part of hardware marked 'lighting' - so how 'good' lighting can potentially be on Wii U will depend on how much shader processing power there is.
 

Durante

Member
I think some people missed this:



So?
As gofreak explained, "lighting" these days is basically pure FLOPS. Since your lighting model (whichever you are using) is implemented more or less 100% in software, what you can do with it depends on which tradeoffs you are willing to make and the baseline performance available in hardware. It's no longer like with fixed-function GPU pipelines where individual hardware features dictated the availability of lighting methods.
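To put that in concrete terms, here's a minimal, purely illustrative sketch (plain C, made-up example vectors, nothing Wii U specific) of a Blinn-Phong style calculation for a single pixel. It's just multiplies, adds and a pow, repeated per pixel and per light, which is why the budget is shader FLOPS rather than any dedicated "lighting" block:

/* Illustrative sketch only: one pixel's worth of Blinn-Phong lighting written
   as plain arithmetic, to show that "lighting" is just FLOPs once
   fixed-function hardware is out of the picture. Example values are made up. */
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

static float dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static vec3 normalize(vec3 v)
{
    float len = sqrtf(dot(v, v));
    vec3 r = { v.x/len, v.y/len, v.z/len };
    return r;
}

int main(void)
{
    vec3 n = normalize((vec3){ 0.2f, 1.0f, 0.1f });          /* surface normal  */
    vec3 l = normalize((vec3){ 0.5f, 0.8f, 0.3f });          /* light direction */
    vec3 v = normalize((vec3){ 0.0f, 0.0f, 1.0f });          /* view direction  */
    vec3 h = normalize((vec3){ l.x+v.x, l.y+v.y, l.z+v.z }); /* half vector     */

    float diffuse  = fmaxf(dot(n, l), 0.0f);                  /* Lambert term     */
    float specular = powf(fmaxf(dot(n, h), 0.0f), 32.0f);     /* Blinn-Phong term */

    /* Every shaded pixel of every light repeats this handful of mul/add/pow ops,
       so total lighting cost scales with available shader throughput. */
    printf("diffuse %.3f, specular %.3f\n", diffuse, specular);
    return 0;
}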
 

gatti-man

Member
Sure, but assets also scale with budgets, better textures, polygon models, more detailed environments... art isn't free. Not everything relates to scope, but yes scope is what will balloon a budget. $30million for your average AAA title should increase to ~$50million.

Also what you are talking about with better lighting and shadow engines, that would allow you to keep budgets down, maybe even at $30million, but these games will become the new AA, as games with that extra $20million will make use with higher detailed environments, models, ect. which should end up being the average AAA titles of next gen.

No. Think about what you are saying. Higher res art does not cost an extra 20 million. I'm not a believer in next gen costs being much higher.
 

PrimeRib_

Member
I think it's more of an assumption than a given, that increasing art assets (texture quality) equates to additional development cost. If time = cost, then perhaps, but it's not exponentially going to increase (double) and the cost isn't always going to be the same or easily calculated. If art assets were pizza, there would be plenty ways to make a better tasting pizza while keeping costs in check.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
As gofreak explained, "lighting" these days is basically pure FLOPS.
And BW, if we take deferred shading into consideration. Actually, one could say that DS effectively trades FLOPs for BW.
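As a rough, hypothetical illustration of that trade (generic 720p G-buffer numbers, not Wii U figures), the back-of-the-envelope arithmetic looks like this:

/* Back-of-the-envelope sketch of why deferred shading leans on bandwidth.
   All numbers are generic illustrations, not Wii U figures. */
#include <stdio.h>

int main(void)
{
    const double width = 1280.0, height = 720.0;
    const double mrt_count        = 4.0;  /* e.g. albedo, normals, depth, misc */
    const double bytes_per_target = 4.0;  /* 32 bits per render target         */
    const double fps              = 30.0;

    double gbuffer_bytes = width * height * mrt_count * bytes_per_target;

    /* Written once in the geometry pass, then read back in the lighting pass. */
    double traffic_per_frame = gbuffer_bytes * 2.0;
    double traffic_per_sec   = traffic_per_frame * fps;

    printf("G-buffer:        %.1f MB per frame\n", gbuffer_bytes / 1e6);
    printf("Write + re-read: %.2f GB/s at %.0f fps\n", traffic_per_sec / 1e9, fps);
    return 0;
}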
 
Very insane ; )

Haha! I had a feeling it was! From what I've read, transformation is now done in the vertex shaders and then the result spat back into the pipeline for assembly? Times have definitely changed from 2001 when it comes to rendering graphics. I was actually thinking about it more last night, and decided that it was probably more likely that Nintendo just added in some custom instructions to the different parts of the R700 they used as a foundation. With what we've seen of the BC, do you still think they are largely emulating Hollywood? I know we talked about the eDRAM previously, but mapping the instructions for all the other parts of that chip to their modern equivalents should be doable with a little software trickery?
 

Snakeyes

Member
As gofreak explained, "lighting" these days is basically pure FLOPS. Since your lighting model (whichever you are using) is implemented more or less 100% in software, what you can do with it depends on which tradeoffs you are willing to make and the baseline performance available in hardware. It's no longer like with fixed-function GPU pipelines where individual hardware features dictated the availability of lighting methods.

So it basically means that more "next-gen" lighting can be implemented if the developer decides to scale back in other graphical areas. That's not too bad I guess.
 
I think it's more of an assumption than a given, that increasing art assets (texture quality) equates to additional development cost. If time = cost, then perhaps, but it's not exponentially going to increase (double) and the cost isn't always going to be the same or easily calculated. If art assets were pizza, there would be plenty ways to make a better tasting pizza while keeping costs in check.

I think the creators of the Unreal Engine (who've already said they expect costs to double over the coming generation) probably have more insight on this than most people on the forum.
 

PrimeRib_

Member
I think the creators of the Unreal Engine (who've already said they expect costs to double over the coming generation) probably have more insight on this than most people on the forum.

There are plenty of ways for devs to exploit new tech without it automatically equating to increased costs - this is what I was referring to. It's only natural for Epic to try to convince the gaming press and consumers that an increase in costs is inevitable with higher tech, because their experience is heavy on exploiting tech, which requires additional brainpower and dev resources. They're in a position where they'd have no other choice but to defend a price increase if games jump from $69.99 to $79.99 (NA), but the methods they use to increase costs are not an industry standard, per se.
 

D-e-f-

Banned
Wait... I just thought about something: since the Wii U needs a day-one update to add a Wii emulator, could that mean the CPU is... not based on Broadway?

The update doesn't add a "Wii emulator" ... it adds all the online functions, which include access to the Wii-system transfer option. Wii mode is already in there, as far as I know.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Haha! I had a feeling it was! From what I've read, transformation is now done in the vertex shaders and then the result spat back into the pipeline for assembly? Times have definitely changed from 2001 when it comes to rendering graphics. I was actually thinking about it more last night, and decided that it was probably more likely that Nintendo just added in some custom instructions to the different parts of the R700 they used as a foundation. With what we've seen of the BC, do you still think they are largely emulating Hollywood? I know we talked about the eDRAM previously, but mapping the instructions for all the other parts of that chip to their modern equivalents should be doable with a little software trickery?
Ok, here's what I think. I think U-GPU actually has edram load/store access ops.
 
Maybe not compared to this generation, but to act like 4x more GPU power or 4x more memory is not huge is disingenuous. That GPU difference is the difference between 720p/30FPS and 1080p/60FPS (without even factoring in the advantage of newer GPU features). Or it could be the difference between being able to run that UE4 demo at 720p/30FPS and not being able to run it at all. A 4x memory difference can also mean something being possible to do vs. not being possible (or "possible" but looking like crap in comparison).

For starters, my point WAS that the difference next gen would be nothing like this gen. Also, the PS4 / 720 rumoured specs put them at 3x, not 4x, WiiU - a massive, massive comedown from the power gaps between Wii and PS360, not just from a pure numbers perspective but also from the point of view that WiiU will be able to actually handle the same engines, because its architecture is from the same time frame as PS4 / 720.

*If* WiiU gets next gen third party ports, think of them running at 720p / 30 fps on low graphics settings with PS4 / 720 running games at 1080p / 30 fps or 720p / 60 fps on High graphics settings.
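For what that resolution / frame rate talk implies in raw pixel terms, here's a quick sanity-check calculation (pixel throughput only, ignoring shader load, bandwidth and graphics settings, so treat the multipliers loosely):

/* Rough pixel-throughput arithmetic behind the "720p/30 vs 1080p/60" talk.
   It only counts pixels per second, so it's a sanity check, not a spec comparison. */
#include <stdio.h>

int main(void)
{
    double px_720p30  = 1280.0 * 720.0  * 30.0;
    double px_720p60  = 1280.0 * 720.0  * 60.0;
    double px_1080p30 = 1920.0 * 1080.0 * 30.0;
    double px_1080p60 = 1920.0 * 1080.0 * 60.0;

    printf("1080p/60 vs 720p/30: %.2fx the pixels per second\n", px_1080p60 / px_720p30);
    printf("1080p/30 vs 720p/30: %.2fx\n", px_1080p30 / px_720p30);
    printf("720p/60  vs 720p/30: %.2fx\n", px_720p60  / px_720p30);
    return 0;
}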

UE4 is possible on WiiU; someone from here who has access to the Warioworld development site confirmed that a few months ago.

Can't remember if it was BG or Ideaman that said it months ago but it should work out about the same as WiiU = PS2, PS4 & 720 = Xbox from that console generation.

People seem to have gotten so used to Nintendo not getting ANY ports of big multi platform games, that they can't seem to accept the possibility of WiiU getting even 'down scaled' version of multi platform games from late 2013 / 2014 onwards...
 

StevieP

Banned
No. Think about what you are saying. Higher res art does not cost an extra 20 million. I'm not a believer in next gen costs being much higher.

I think it's more of an assumption than a given, that increasing art assets (texture quality) equates to additional development cost. If time = cost, then perhaps, but it's not exponentially going to increase (double) and the cost isn't always going to be the same or easily calculated. If art assets were pizza, there would be plenty ways to make a better tasting pizza while keeping costs in check.

Costs have risen every generation in history, and this won't be the first where they don't. And if you plot the last couple of generations on a graph and follow the line beyond 2013, the "doubling" described by Sweeney may be understating it.

For starters, my point WAS that the difference next gen would be nothing like this gen. Also, the PS4 / 720 rumoured specs put them at 3x, not 4x, WiiU - a massive, massive comedown from the power gaps between Wii and PS360, not just from a pure numbers perspective but also from the point of view that WiiU will be able to actually handle the same engines, because its architecture is from the same time frame as PS4 / 720.

*If* WiiU gets next gen third party ports, think of them running at 720p / 30 fps on low graphics settings with PS4 / 720 running games at 1080p / 30 fps or 720p / 60 fps on High graphics settings.

UE4 is possible on WiiU; someone from here who has access to the Warioworld development site confirmed that a few months ago.

Can't remember if it was BG or Ideaman that said it months ago but it should work out about the same as WiiU = PS2, PS4 & 720 = Xbox from that console generation.

People seem to have gotten so used to Nintendo not getting ANY ports of big multi platform games, that they can't seem to accept the possibility of WiiU getting even 'down scaled' version of multi platform games from late 2013 / 2014 onwards...

While multipliers don't help anyone's case (because some aspects of the other next gen consoles are only a couple of times more powerful, and some much more than that, such as memory), I will agree that this time it's more on the publishers to port the games than the architecture. Unfortunately, if you ask folks like ShockingAlberto, they'll tell you that most publishers aren't really including the Wii U in their publishing plans for next gen games.
 
I still say this comparison is flawed.

It's more an M2 in an era in which an N64 pulls off Beyond.

I think the distinction they were trying to make was that WiiU is powerful enough to run next gen ports, but they would be the worst of the three versions available, like PS2 was during the PS2 / Gamecube / Xbox generation.

I'm sure anyone buying a WiiU as their main system for the next 6 years would bite your hand off if you could guarantee them that, though. I think we all know WiiU won't get every single next gen multi platform game available, but I think it will be more down to the publishers thinking it won't sell on the system rather than the system not being powerful enough to run a down-ported version of the engine at 720p / 30 fps; they could even avoid using the tablet altogether and go with sub-HD resolutions to get it running on the system if needed.
 
While multipliers don't help anyone's case (because some aspects of the other next gen consoles are only a couple of times more powerful, and some much more than that, such as memory), I will agree that this time it's more on the publishers to port the games than the architecture. Unfortunately, if you ask folks like ShockingAlberto, they'll tell you that most publishers aren't really including the Wii U in their publishing plans for next gen games.

If WiiU shifts 20 million consoles by the end of 2013, that will change fast imo; companies cannot afford to ignore a massive audience like that these days with the increasing cost of especially AA / AAA multi platform game development (Max Payne 3 and Darksiders II are two recent instances that failed to even break even).
 

japtor

Member
I still say this comparison is flawed.

It's more an M2 in an era in which an N64 pulls off Beyond.
I'm thinking in technical terms: even if the difference ends up on the wide end of the range, it won't be as easy for the visuals to show the technical gap as in previous generations (whether due to art direction, assets, budget, targets, etc.).
 
I'm thinking in technical terms: even if the difference ends up on the wide end of the range, it won't be as easy for the visuals to show the technical gap as in previous generations (whether due to art direction, assets, budget, targets, etc.).

Which is pretty much what I believe. But I put the actual difference in technical capability at larger than most.

Meaning entirely different things now than it did before.
 
ITT: 184 pages of spec discussion with no specs.

Like this past generation, we're lacking the fine detail with Nintendo hardware: ROPs, clocks, shaders, poly counts. I'd even prefer their GCN-era "this is the minimum you can expect in-game" figures, which tell you nothing of theoretical limits, versus what we have.

Which is just a sheet of manufacturers with the occasional number attached.

Thank God for insiders though. That's the only reason we know it has a feature-set more advanced (to a degree) than the PS3/360's, with similar brute capability.
 
This has been bugging me for a while now. The image that IBM posted last year at E3 (2011) of the eDRAM chip for Wii U's CPU: how much RAM is actually on the bloody chip? Is there someone out there who can look at an image of a RAM LSI and say "Yep, this has X amount of eDRAM"?
 

z0m3le

Banned
This has been bugging me for a while now. The image that IBM posted last year at E3 (2011) of the eDRAM chip for Wii U's CPU: how much RAM is actually on the bloody chip? Is there someone out there who can look at an image of a RAM LSI and say "Yep, this has X amount of eDRAM"?

That would only reflect the eDRAM for the CPU (which is 3MB if devs on NeoGAF have been telling the truth; that means 2MB for one core and 512KB for each of the others, I believe, as it has an asymmetric split).

The GPU also has 32MB of eDRAM according to developers on Beyond3D, and I believe it was a member here who posted it there originally anyway. 32MB of eDRAM makes perfect sense if you wanted to emulate the Wii's 1T-SRAM, which was below the 32MB that the GPU would have access to.
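A quick tally against the commonly cited Wii memory figures (from memory, so treat them as assumptions: 24MB of 1T-SRAM plus roughly 3MB embedded on Hollywood, with the 64MB of GDDR3 presumably covered by Wii U's main RAM instead):

/* Simple tally of the Wii's fast memory pools against a 32 MB eDRAM block.
   The Wii figures are commonly cited numbers recalled from memory, so treat
   them as assumptions rather than confirmed specs. */
#include <stdio.h>

int main(void)
{
    const int mem1_1t_sram       = 24;  /* MB: Wii MEM1 (1T-SRAM)              */
    const int hollywood_embedded = 3;   /* MB: ~2 MB framebuffer + 1 MB texture cache */
    const int wiiu_gpu_edram     = 32;  /* MB: per the developer reports above */

    int wii_fast_pools = mem1_1t_sram + hollywood_embedded;
    printf("Wii fast pools: %d MB, Wii U GPU eDRAM: %d MB (headroom: %d MB)\n",
           wii_fast_pools, wiiu_gpu_edram, wiiu_gpu_edram - wii_fast_pools);
    return 0;
}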
 
If WiiU shifts 20 million consoles by the end of 2013, that will change fast imo; companies cannot afford to ignore a massive audience like that these days with the increasing cost of especially AA / AAA multi platform game development (Max Payne 3 and Darksiders II are two recent instances that failed to even break even).

You would certainly think so, but I've found Publishers are amazing at making horrible business decisions.
 
That would only reflect the eDRAM for the CPU (which is 3MB if devs on NeoGAF have been telling the truth; that means 2MB for one core and 512KB for each of the others, I believe, as it has an asymmetric split).

The GPU also has 32MB of eDRAM according to developers on Beyond3D, and I believe it was a member here who posted it there originally anyway. 32MB of eDRAM makes perfect sense if you wanted to emulate the Wii's 1T-SRAM, which was below the 32MB that the GPU would have access to.

[image: ibm-watson-wii-u-dram.jpg]


Then would that mean that each little "diamond" is 128KB? Also, HOW BLOODY SMALL WOULD THAT HAVE TO BE FOR IT TO FIT ON THAT SMALL-ASS DIE, ANYWAY?!? I mean, the only way that I could think of it fitting would be if it was a dual-layered die and the CPU was on top and the eDRAM was on the bott-... wait. First off, is that even possible? Second off, if it is, has anyone considered it?
 

Thraktor

Member
[image: ibm-watson-wii-u-dram.jpg]


Then would that mean that each little "diamond" is 128KB?

The image is of an (early prototype) eDRAM macro, of which there would be a bunch on the Wii U's CPU. Off the top of my head, I think it's a 1Mbit (128Kbyte) macro, but it's not really relevant, as we have it from multiple sources that the CPU has 3MB eDRAM L2 cache.

Edit: It's minuscule, probably in the range of a fraction of a square millimetre.
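If that 1Mbit figure is right, the arithmetic for the reported 3MB L2 is simple (illustrative only, and it rests on the off-the-top-of-the-head macro size above):

/* Macro-count arithmetic for a 3 MB L2 built from 1 Mbit (128 KB) eDRAM macros.
   The macro size is an assumption taken from the post above. */
#include <stdio.h>

int main(void)
{
    const int l2_kb    = 3 * 1024;  /* 3 MB L2 cache, per the thread      */
    const int macro_kb = 128;       /* assumed 1 Mbit eDRAM macro = 128 KB */

    printf("%d KB / %d KB = %d macros\n", l2_kb, macro_kb, l2_kb / macro_kb);
    return 0;
}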
 

wsippel

Banned
[image: ibm-watson-wii-u-dram.jpg]


Then would that mean that each little "diamond" is 128KB? Also, HOW BLOODY SMALL WOULD THAT HAVE TO BE FOR IT TO FIT ON THAT SMALL-ASS DIE, ANYWAY?!? I mean, the only way that I could think of it fitting would be if it was a dual-layered die and the CPU was on top and the eDRAM was on the bott-... wait. First off, is that even possible? Second off, if it is, has anyone considered it?
That's a die shot of some old test chip and not really related to the Wii U. Also, eDRAM is much smaller than SRAM, which is basically the point: you don't need stacking to have lots of cache on a small die.
 

Earendil

Member
So you're saying it requires 1.21 jigowatts of power to run?

Of course not. It's about 1 billionth the size of the original flux capacitor (that was 1985 man, come on!) so it only requires 1.21 watts. Unfortunately it is only capable of sending the Wii U through time, and not a DeLorean. The good news though is that when you get a Wii U, you can send it back in time to yourself, so that's a plus.
 

BDGAME

Member
Of course not. It's about 1 billionth the size of the original flux capacitor (that was 1985 man, come on!) so it only requires 1.21 watts. Unfortunately it is only capable of sending the Wii U through time, and not a DeLorean. The good news though is that when you get a Wii U, you can send it back in time to yourself, so that's a plus.
This is heavy.
 

Panajev2001a

GAF's Pleasant Genius
But that is a fixed cost which should not affect your fb BW, ergo it should not spike at high fillrate moments. Also, such resolve operations are normally nicely pipelineable.

It is a pipeline bubble you can schedule work around and thus manage, but it still remains a pipeline bubble for the GPU (rendering to a texture and then using said texture takes a much longer amount of time, and thus GPU clock cycles, so that is more latency you have to hide with math ops), and every resolve you do to main RAM does reduce the bandwidth available to the CPU.

It is still a worthy compromise between the very nice approach the GS used and the original Xbox with its single memory pool.
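For a feel of what one such resolve costs, here is a rough, illustrative calculation (720p, 32-bit colour, 60fps, one resolve plus one read-back per frame - assumed numbers, not measurements of any actual console):

/* Rough cost of the "render to eDRAM, resolve to main RAM, read back as a
   texture" round trip being discussed. Illustrative numbers only. */
#include <stdio.h>

int main(void)
{
    const double width = 1280.0, height = 720.0;
    const double bytes_per_pixel = 4.0;   /* 32-bit colour */
    const double fps = 60.0;

    double resolve_bytes  = width * height * bytes_per_pixel;  /* copied out to main RAM   */
    double readback_bytes = resolve_bytes;                     /* sampled back as a texture */
    double main_ram_traffic = (resolve_bytes + readback_bytes) * fps;

    printf("Per frame: %.1f MB out + %.1f MB back\n",
           resolve_bytes / 1e6, readback_bytes / 1e6);
    printf("Main RAM traffic: ~%.0f MB/s competing with the CPU\n",
           main_ram_traffic / 1e6);
    return 0;
}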

I still share your old assumption. Care to elaborate what changed you mind?

Well, I could say ... because it makes sense :), but I will have to leave that to "birds".
 

gatti-man

Member
The more I think about this, the more it's complete bull. PC games have high-res textures and their costs are low. Poor management and poor asset planning create high costs. Adding more artists won't even add 500k to dev costs, and that's assuming they will even need additional artists.
 