
Rumor: Wii U final specs

beril

Member
I've been trying to follow all these different threads regarding the RAM specs, and just curious if I have this right.

Better Bandwidth = better textures and frame rates?

Better latency = better load times?

Neither will affect load times much since that's nearly always bottlenecked by the storage media.

If I'm not mistaken it's more like this
better latency = better for many small memory transfers
better bandwidth = better for big memory transfers
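To put toy numbers on that (a minimal sketch; the latency and bandwidth figures below are made up for illustration, nothing Wii U specific):

# toy model: time to move data = fixed latency + size / bandwidth
def transfer_time_us(size_bytes, latency_us=0.1, bandwidth_gb_s=12.8):
    return latency_us + size_bytes / (bandwidth_gb_s * 1e3)  # 1 GB/s is roughly 1000 bytes per microsecond

many_small = 100_000 * transfer_time_us(64)   # 100k scattered 64-byte reads
one_big = transfer_time_us(100_000 * 64)      # the same 6.4 MB in one streamed read
print(many_small, one_big)                    # ~10500 us vs ~500 us

Same total bytes, wildly different cost: the scattered case is dominated by latency, the streamed case by bandwidth.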
 
Neither will affect load times much since that's nearly always bottlenecked by the storage media.

If I'm not mistaken it's more like this
better latency = better for many small memory transfers
better bandwidth = better for big memory transfers
Kudos, that's a way better explanation than mine.
 
His biggest problem was the way he presented his information. It may or may not have been meant to be inflammatory, but it ended up being that way anyway. I think he could have presented his info in a much better way.
This is my biggest issue with the vast majority of GAFfers actually.

TheD is quite the smart mother fucker. But damn does he grate on someone that knows he's, more often than not, right. Civil discourse is just unbelievably difficult to have around here. Well on the internet in general.

Part of the reason Durante, Fafa, AlStrong, and Panajev will always get love from me. Unless someone is being completely dismissive of them, they tend to be badasses that will explain why and where someone has made a mistake. A skill I'd love to see TheD pickup. Because he really is a smart motherfucker.
 

Lord Error

Insane For Sony
I don't know, God of War 3 felt sluggish to me regarding how fast it answers my inputs; not the framerate itself; it just seemed jetlagged to me.

I always attributed it to the MLAA and wanted to turn it off if it was an option, but if there's another explanation/reason I'll take it. Perhaps the game is just slower.

I felt it was unresponsive compared to God of War 1 and 2 on the PS2, and since I wouldn't attribute it to the framerate, I concluded it was probably because the frames were taking too long to finish rendering.
I don't have numbers for GOW3, but KZ3 uses the same MLAA and has one of the lower measured input lags of 30FPS games.

As for GOW3, IMO the game has slower/weightier animation than GOW1 and 2, but I don't know what the actual input lag is. If it was larger, it must be coming from something else. If you're really curious about it, the E3 demo of the game was using 2xMSAA, so you can try playing it and see if it feels any different.

Speaking more on topic, newer GPUs are very good at performing these kinds of AA without any CPU involvement, so Wii U should be getting its own implementations pretty easily. SMAA especially is really impressive, and since Wii U has a newer and beefier GPU it should be able to pull that off with relative ease. It will just take someone doing it, as Nintendo probably won't have that ready in their libraries.
 
I don't have numbers for GOW3, but KZ3 uses the same MLAA and has one of the lower measured input lags of 30FPS games.

As for GOW3, IMO the game has slower/weightier animation than GOW1 and 2, but I don't know what the actual input lag is. If it was larger, it must be coming from something else. If you're really curious about it, the E3 demo of the game was using 2xMSAA, so you can try playing it and see if it feels any different.
That's probably it.

As for the E3 demo I didn't know I could get it, I'll look into it when I have the time.
Speaking more on topic, newer GPUs are very good at performing these kinds of AA, so Wii U should be getting its own implementations pretty easily. SMAA especially is really impressive, and I think with a newer GPU it should be able to pull that off with relative ease. It will just take someone doing it, as Nintendo probably won't have that ready in their libraries.
Yes, those kinds of AA are really preferable, seeing as they take fewer passes and produce results comparable to MSAA (and often come cheaper for the hardware, or use some otherwise idle overhead anyway). Text-heavy games tend to suffer a lot if the HUD isn't rendered separately; but having an extra pass for the HUD might defeat the purpose.
 
The GamePad is just displayed over existing 5GHz Wi-Fi.

Van Owen... and other Wii U naysayers...

First of all... WHERE on NeoGAF did you guys read that Nintendo fans think Wii U would be a powerhouse? For over a year now(!!) NintenGAF thought about 3x 360 for Wii U's power level. Seeing that Blops 2 and ME3 are already outperforming PS3 and are close to 360, I'd say that's a very good point to start at. Remember, graphics do improve over time. I seriously hope none of you think that Wii U is maxed out on day 1, lol.

Now for something else...

Wii U's hardware power is not all that important!

Let's get real here. Wii U will NEVER get that kind of 3rd party support. Devs who got burned by Nintendo in the past won't support their platforms. So why should Nintendo build a $599 beast and then get rejected by 50% or more of 3rd parties? That does not compute! And NO, having strong hardware didn't help them in the past and it won't help them now! If a 3rd party wants to develop for Nintendo, then they're welcome. They get a dev kit and all the licensed software they need. If some devs don't want to... their loss. A lot of gamers, like me, are not casual or hardcore gamers. I don't like to play stuff like Cut the Rope or Farmville. I also don't like CoD, MoH and GTA4. But I'll buy Wii U for Zelda, Mario, F-Zero, Smash, etc... But when a 3rd party drops an awesome game... Hell yeah, DO WANT!

I expect new experiences from a new console nowadays and not just "more of the same but hey, now it's 1080p for realz!!!" If it's that you seek, that's fine with me. To each his own. But PLEASE could you guys finally stop RUINING EVERY Wii U thread? We get it, you don't like the Wii U; could you please stop raining on our parade already? Just accept that Wii U is not the thing for you. The consoles you like will come...

You know which games this gen impressed me the most visually? Mario Galaxy 1/2. They're not HD, yet they were glorious when I first saw them running on my TV. That unmatched art style paired with rock-solid 60fps was just amazing. And it ran on the Wii!

But hey, continue complaining. Continue making fun of Nintendo. Heck, you'll probably try to anger me by making fun of this very post. I don't care. If you feel like it's appropriate to act like a 12-year-old on a message board, go ahead. I won't stop you or continue to argue with you!

Oh, and you know what? I get the Wii U in 7 days and I will have a frickin' awesome time with it. Even if it is obviously not up to "hardcore gamer" standards.

Thanks to Miiverse, the eShop, an amazing browser and all the video apps, and of course Nintendo's games paired with what Wii U supporters deliver, Wii U will find its audience. Will they sell nearly 100 million like with the Wii? Unlikely, but no matter how much they sell, Nintendo will walk away with a nice profit!

So in conclusion, Wii U's "lack" of hardware power is not the huge issue people make it out to be. If you don't like Wii U for its "weak" hardware, that's fine, but PLEASE stop ruining every Wii U thread.
 

z0m3le

Banned
Wii U's hardware power is not all that important!

Let's get real here. Wii U will NEVER get that kind of 3rd party support. Devs who got burned by Nintendo in the past won't support their platforms. So why should Nintendo build a $599 beast and then get rejected by 50% or more of 3rd parties?

I expect new experiences from a new console nowadays and not just "more of the same but hey, now it's 1080p for realz!!!"

So in conclusion, Wii U's "lack" of hardware power is not the huge issue people make it out to be. If you don't like Wii U for its "weak" hardware, that's fine, but PLEASE stop ruining every Wii U thread.

These are good points.

Last night my best friend, who hasn't bought a Nintendo console since the N64, bought a Wii U. He has a top-end PC (HD6950 x2 in Crossfire with an i5 2500K @ 4.8GHz and 8GB of triple-channel DDR3). He fought with the Wii U for a good hour and a half thanks to the update and slow OS during setup. Once we popped in the first game (Nintendo Land) he was first surprised by the disc's rounded edges; he thought it was weird but nice, something that felt new.

Monita annoyed us (he calls her the Jar Jar Binks of Nintendo Land). Seriously, she is annoying and I'm a huge Nintendo fan; she should be skippable, or at least mutable with fast text speeds.

Once we got into the games, though, we had a lot of fun. He also bought NSMBU. BTW, what really surprised me was that he boxed up his 360; he doesn't feel the need to use it anymore. The guy plays a ton of games, and uses his PC for FPS games mostly. The thing here is, I am not sure he will go back to the 360. Once we got Netflix set up on Wii U we watched some Top Gear; the entire console feels new, and he knows that a new Zelda and Smash are coming even without me telling him.

I'm positive Nintendo did right with the system, and considering this is the first generation that won't increase resolutions, I highly doubt Wii U will look dated before Nintendo is ready for a successor, especially when his computer is already more powerful than the other future consoles releasing inside the next 2 years.

A ~5GHz CPU w/ >4.5 TFLOPs worth of GPU power + 8GB DDR3. His PC is where he will go for graphics, and the next couple of consoles on the market won't change that, but he will come to the Wii U for his console experience, and this will be true for him until Forza 5 comes out, most likely.

TL;DR: Wii U is powerful enough to fuel the next 5 years of console gaming, but will probably start feeling a bit outdated.

To this point, considering the 45nm and 40nm nature of the CPU and GPU, is it possible to remove the disc drive and relaunch Wii U as a handheld in ~2017-2018? We're looking at 11nm and 10nm chips available in 2016, so mass production shouldn't be far behind that, and considering the reduction in power, it seems they could get it into a fairly small form factor with low enough power usage to drive Wii U onwards as a "DSU" and extend its life for another 4 to 6 years, while releasing another console in 2018-2020. Is this feasible, or am I just oversimplifying the reductions? Because we are talking about extremely big reductions here, right?
 

DCharlie

And even i am moderately surprised
Oh wow - I feel honoured to be following on a Zombie/Coldblooder double header, but to be honest I have NO IDEA what to tell you guys...

Wii U has slid in as the -maybe- best current gen machine; it's also being (mis?)represented as the ultimate 5-year-lifespan next gen machine whilst it thrashes its arms around in a sea of ThisGen-ness - so - rather than fight it -

1/ where do you put the Wii U final specs seeing as you most certainly both must know with your assertions?

2/ where do you expect X8/PSOrb in comparison?

IF spec isn't everything - then why take Wii U over X360/PS3? Because the spec is better??? the insane amazing possibilities the controller offers? the ports? the 1st party games?

If spec isn't everything.... then $ per Game wise... get an X360/PS3. right? please posit the inevitable pro Wii-U argument ... i have 4-5 years for you to formulate it
 
You know, crazy new experiences like Zelda, Mario, Smash....




I'll stop you right there. If you're the type of guy claiming "lol always same Zelda and Mario", you should probably stop here. Because you can't tell me that Wind Waker, Twilight Princess, Majora's Mask or Skyward Sword are the same.
And you can't tell me that Super Mario Sunshine and Super Mario Galaxy are the same either.
Those are clearly completely different experiences.
 

ThatObviousUser

ὁ αἴσχιστος παῖς εἶ
Zelda 1 to Zelda 2 was a completely different experience.

Sunshine to Galaxy was heavy refinement. Still a 3D platformer very much in the vein of Mario 64. SM3DL is actually more of a new experience, even if it's also heavily based on the Galaxy games.

Skyward Sword is a pastiche of Wind Waker and Twilight Princess.

Smash is always the same game with some extra features, characters, etc.

I love these series more than most others but let's be real here.

Additionally, there is no single game in the Wii U launch lineup or even announced lineup that is using the GamePad in a "crazy new" way as a core game mechanic. Minigame collections don't count.
 
Zelda 1 to Zelda 2 was a completely different experience.

Sunshine to Galaxy was heavy refinement. Still a 3D platformer very much in the vein of Mario 64. SM3DL is actually more of a new experience, even if it's also heavily based on the Galaxy games.

Skyward Sword is a pastiche of Wind Waker and Twilight Princess.

Smash is always the same game with some extra features, characters, etc.

I love these series more than most others but let's be real here.

Additionally, there is no single game in the Wii U launch lineup or even announced lineup that is using the GamePad in a "crazy new" way as a core game mechanic. Minigame collections don't count.



Skyward Sword using cel shading and adult Link doesn't make it a pastiche. Those are 3 completely different games. Except for combat and some usual mechanics like dungeons, those are 3 different experiences.
Also, it seems like ZombiU is kind of a new experience.
About Smash Bros, sure, it's kind of the same experience, but when I quoted that person Smash Bros wasn't mentioned.
 

z0m3le

Banned
Oh wow - i feel honoured to be following on a Zombie/Coldblooder double header but to be honest i have NO IDEA what to tell you guys...

Wii U has slid in as the -maybe- best current gen machine, it's also being (mis?)represented as the ultimate 5 year life span next gen machine whilst it thrashes it's arms around in a sea of ThisGen-ness - so - rather than fight it -

OK, first, Wii U isn't the ultimate 5-year console machine for the now-current gen, but it's also impossible to call it underpowered without knowing how powerful it is. The idea that it's outdated on release is wrong; it's for all intents and purposes a dedicated gaming machine that does stuff quite a bit differently. Read lostinblue's posts; he clearly explains where this console's hardware design falls short (it's not very good at moving data around) and how every piece of silicon is dedicated to processing different information. So having said that, let me answer your questions.
1/ where do you put the Wii U final specs seeing as you most certainly both must know with your assertions?

2/ where do you expect X8/PSOrb in comparison?

1. I actually think it's pretty close to PS3 in flops numbers, it's just that most of those flops can be found on the GPU. I do think the CPU can't process physics, AI and other game logic anywhere near as well as Xenon, but it wasn't supposed to, and neither are Microsoft's and Sony's next consoles; both physics and some AI can be moved to the GPU.

2. If they are using Jaguar cores, I expect the CPUs to be 4 cores @ ~2GHz (I think Wii U's CPU is 1.6GHz, maybe 1.8GHz). XB3 will likely have a stronger CPU, so they might use an A10 if they really are going AMD, but that means 1 thread per core, because AMD doesn't have SMT options at all (so 4 threads max for these other consoles).

IF spec isn't everything - then why take Wii U over X360/PS3? Because the spec is better??? the insane amazing possibilities the controller offers? the ports? the 1st party games?
Yes, all of those things, but the real reason is PS3 and X360 are ending, winding down and being replaced in a year. Anyone looking for a console for the next 5 years does in fact have to overlook these, and most people who want to just buy a game console already own at least one of them.

If spec isn't everything.... then $ per Game wise... get an X360/PS3. right? please posit the inevitable pro Wii-U argument ... i have 4-5 years for you to formulate it

At the moment, you can get 90% of the games from 360 on a PC, and you will get better frame rates; it will be more future-proof (since 360 is ending); also, all PC games launch $10 cheaper and quickly drop in price thanks to Steam sales. So $-wise, PC is the best thing you can spend money on.

Also I'm not here as some mindless PR for Nintendo, I just don't think the Wii U will feel outdated for the next 5 years.

(BTW, I do think the other consoles coming will have ~4x-5x the flops of the Wii U, which puts them around 2 TFLOPs, and if Wii U were launching in the same year as those consoles it wouldn't gain traction, but it's released right now, and lots of core gamers have paid attention. GAF might feel that it's underpowered for the majority of users, but they are still buying it.)
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Wii Us hardware power is not all that important!

Not important to you, but important to me. My Wii U arrives next Friday, and although I'm looking forward to it, I'm pretty disappointed relative to the console's power. Sure, the Nintendo first party games will look nice and 3rd party devs will eventually get the hang of the hardware such that ports are okay. But I was hoping for a bridging system between this HD gen and the next HD gen. A machine that would be able to have decent down-ports of true next gen machines. Lots of speculation in the WUST pointed to that exact scenario.

Sadly it's not going to happen.

To say that power is not important in a gaming machine is complete rubbish.
 

StevieP

Banned
Not important to you, but important to me. My Wii U arrives next Friday, and although I'm looking forward to it, I'm pretty disappointed relative to the console's power. Sure, the Nintendo first party games will look nice and 3rd party devs will eventually get the hang of the hardware such that ports are okay. But I was hoping for a bridging system between this HD gen and the next HD gen. A machine that would be able to have decent down-ports of true next gen machines. Lots of speculation in the WUST pointed to that exact scenario.

Sadly it's not going to happen.

That's the mandate of the publishers, who use terms like "return on investment" (for time and effort) and "demographics" more than anything else.

For example, Battlefield 4 is (I think) launching on PS360, PS4/720 and PC, but not Wii U. You will see that a lot, I think. It is not necessarily a function of hardware power, nor some industry-wide conspiracy.

Think of it this way: would a movie director propose a sex scene and violence in a Disney/Pixar cartoon?
 

Instro

Member
Zelda 1 to Zelda 2 was a completely different experience.

Sunshine to Galaxy was heavy refinement. Still a 3D platformer very much in the vein of Mario 64. SM3DL is actually more of a new experience, even if it's also heavily based on the Galaxy games.

Skyward Sword is a pastiche of Wind Waker and Twilight Princess.

Smash is always the same game with some extra features, characters, etc.

I love these series more than most others but let's be real here.

Additionally, there is no single game in the Wii U launch lineup or even announced lineup that is using the GamePad in a "crazy new" way as a core game mechanic. Minigame collections don't count.

Mario 64 and Galaxy are vastly different when it comes to platforming style and level design. I'm not sure how one could see them as being particularly similar other than that they occupy the same genre with the same character.
 

gofreak

GAF's Bob Woodward
That's the mandate of the publishers, who use terms like "return on investment" (for time and effort) and "demographics" more than anything else.

For example, Battlefield 4 is (I think) launching on PS360, PS4/720 and PC, but not Wii U. You will see that a lot, I think. It is not necessarily a function of hardware power, nor some industry-wide conspiracy.

Think of it this way: would a movie director propose a sex scene and violence in a Disney/Pixar cartoon?

So why is EA putting an M-rated game on Wii U at launch? IIRC they even showed a Battlefield game for Wii U a couple of E3s ago, rumoured to have been shelved in favour of a BF4 port. I'm not sure EA has some big deal-breaking attitude about 'mature' games on Wii U.

Hardware power has a lot to do with where content goes. If a system is a cheap port away, pubs will take a punt on it with titles at the start at least, even if there is a certain degree of demographic-related skepticism. If a cheap BF4 port can be done for Wii U, I think it will be. If it doesn't happen, I would wager it was because porting was troublesome.
 

pottuvoi

Banned
If it's 3D you have to take a z-buffer into account.

As for maths:

If it's 3D:

1280x720x64 = 7.0313 MB/frame (x30fps=210.94 MB/s; x60fps=421.88 MB/s) <- 3D without AA

1280x720x112 = 12.3047 MB/frame (x30fps= 369.14 MB/s; x60fps= 738.28 MB/s) <- 3D with MSAA 2x

1280x720x160 = 17.5781 MB/frame (x30fps= 527.34 MB/s; x60fps= 1054.69 MB/s) <- 3D with MSAA 4x
Not entirely sure where you get the 112 bits and 160 bits... The framebuffer and Z-buffer both get all the MSAA samples (and if you're using a 24-bit Z-buffer you have 8 bits of stencil as well).

So for 2xMSAA it would be 128 bits and for 4xMSAA 256 bits (on a standard 32-bit framebuffer + 32-bit Z-buffer (24-bit Z + 8-bit stencil)).

It's also a lot easier to calculate these things straight in bytes, so:
(((1280*720*4)*2)*4) = 29,491,200 bytes / 1024^2 = 28.125 MB; at 60fps that's 1687.5 MB/s.

That's the 32-bit framebuffer in bytes, plus Z, times the 4x MSAA samples.

The part where I added the Z-buffer can be thought of as how many 4-byte buffers we have. In the case of a 64-bit framebuffer and normal Z one would use 3, or in the case of many G-buffers, like Crysis 3 on PS3, it would be 6.
(((1280*720*4)*6)*4) = 88,473,600 bytes = 84.375 MB

At 60fps that's 5062.5 MB/s, but you never read the scene just once; you read and write quite a bit more.
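If it helps, the same arithmetic as a tiny calculator (this just restates the numbers above, assuming 32-bit colour + 32-bit Z per sample and MB = bytes / 1024^2):

# per-frame buffer size in MB for a given resolution, bytes per sample and MSAA level
def frame_mb(width, height, bytes_per_sample, samples):
    return width * height * bytes_per_sample * samples / 1024**2

print(frame_mb(1280, 720, 8, 1))       # 7.03125 MB, no AA (32-bit colour + 32-bit Z)
print(frame_mb(1280, 720, 8, 2))       # 14.0625 MB, 2xMSAA (128 bits per pixel)
print(frame_mb(1280, 720, 8, 4))       # 28.125 MB, 4xMSAA (256 bits per pixel)
print(frame_mb(1280, 720, 4 * 6, 4))   # 84.375 MB, six 4-byte buffers with 4xMSAA (the Crysis 3 style case)
print(frame_mb(1280, 720, 8, 4) * 60)  # 1687.5 MB touched per second at 60fps, before re-reads and re-writes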

As for PC GPUs, it probably has to do with the fact that they have to be able to drive way higher resolutions (I mean, every integrated card these days has to support two 2560x1440 screens, and then there's this); in a way, having bandwidth to spare is a means to still manage to perform in these extreme situations. Add textures for games running at more than 1920x1080 and stuff like the 16K megatexture supported by Rage; it's kind of future-proofing, even if by the time games take that throughput for granted they won't be focused on that hardware anymore.
Texture resolution doesn't really have anything to do with screen resolution; you can use a 2048x2048 texture with multiple channels on a candle which never covers more than 64 pixels on screen (if you haven't authored your content properly).

Also, Rage has several 128k megatextures and a lot of smaller ones; when people talk about 8k textures in Rage they're talking about the texture atlas size, which is a temporary buffer for megatexture tiles.
 

ThatObviousUser

ὁ αἴσχιστος παῖς εἶ
Mario 64 and Galaxy are vastly different when it comes to platforming style and level design. I'm not sure how one could see them as being particularly similar other than that they occupy the same genre with the same character.

Obviously level design is different. But it still has quite a few leftovers from Mario 64. 120 star count, health bar instead of normal Mario health mechanics, a subset of the acrobatic controls in Mario 64, etc. It's quite clearly an evolution, but it's also a cross-generational sequel that really puts the extra hardware and years of solid 3D game design techniques to use.
 

jerd

Member
Obviously level design is different. But it still has quite a few leftovers from Mario 64. 120 star count, health bar instead of normal Mario health mechanics, a subset of the acrobatic controls in Mario 64, etc. It's quite clearly an evolution, but it's also a cross-generational sequel that really puts the extra hardware and years of solid 3D game design techniques to use.

I'm willing to bet that if Galaxy was the same game with a different main character nobody would say they were similar in many aspects besides their genre (3D platformer)
 

ThatObviousUser

ὁ αἴσχιστος παῖς εἶ
I'm willing to bet that if Galaxy was the same game with a different main character nobody would say they were similar in many aspects besides their genre (3D platformer)

How does that change any of what I said?

Listen

Galaxy is probably my favorite game

I don't rank Mario 64 that highly in the series

But they're very obviously connected. Come on now.
 

jerd

Member
How does that change any of what I said?

Listen

Galaxy is probably my favorite game

I don't rank Mario 64 that highly in the series

But they're very obviously connected. Come on now.

Well of course they are connected, it is the same series. But the argument was whether or not they are different experiences, not whether or not they are similar. I'm not sure how you can say they are not.
 

ThatObviousUser

ὁ αἴσχιστος παῖς εἶ
Well of course they are connected, it is the same series. But the argument was whether or not they are different experiences, not whether or not they are similar. I'm not sure how you can say they are not.

The argument was actually whether Galaxy was a "completely different experience" compared to Sunshine, and I can't say that it is.
 

jerd

Member
The argument was actually whether Galaxy was a "completely different experience" compared to Sunshine, and I can't say that it is.

Lol, I feel like you're being a little nitpicky. The NSMB games I can see, but 3D Mario has always been pretty innovative, especially in comparison to a lot of other franchises out there. I think you'll be hard-pressed to find another franchise that changes its formula as much as the 3D Mario games have in the past 3 gens. But I guess we'll have to agree to disagree.
 

Septimius

Junior Member
How does that change any of what I said?

Listen

Galaxy is probably my favorite game

I don't rank Mario 64 that highly in the series

But they're very obviously connected. Come on now.

It changes the fact because the idea is that if this was Rayman Galaxies, and you could do the ol' propeller hair jump instead of a triple jump, and it had the glove power-up instead of a super mushroom, you would probably call this "similar to previous Rayman games" rather than "similar to previous Mario games".

If character, jump animations and health representation are what make this a clear refinement of Mario 64, it's merely confirmation bias making you crowbar your assessment into lacking arguments. If you could replace all the Toads with those small creatures from Rayman, and Bowser with the villain from Rayman 1, and suddenly it's "a completely different game", then that should show you just how small the things you call games similar over really are.
 

Stewox

Banned
The GamePad is just displayed over existing 5GHz Wi-Fi.

There's much more to this story than just that, so stop spreading such BS. This has been discussed since the speculation threads earlier this year.

The signal carrier is IEEE 802.11n and works at 5.2 GHz; this is probably the only thing similar to WLAN/Wi-Fi, and everything else is where the magic happens.
 

Instro

Member
Obviously level design is different. But it still has quite a few leftovers from Mario 64. 120 star count, health bar instead of normal Mario health mechanics, a subset of the acrobatic controls in Mario 64, etc. It's quite clearly an evolution, but it's also a cross-generational sequel that really puts the extra hardware and years of solid 3D game design techniques to use.

Those are all fluff compared to the core mechanics and design. Mario 64 is designed around large open areas and (relatively speaking) sloppy platforming mechanics that allow the player to explore and collect in a non-linear manner. Galaxy replaces that with a very focused, structured, and more pure platforming experience with only a few levels that are truly open. I would call it a complete design shift rather than an evolution, as the most important aspects of what made Mario 64 what it was were replaced.

Of course this whole discussion is a pretty big tangent to the original topic which was about the WiiU and new experiences. In the grand scheme of things, you are correct. Most of their first party properties essentially go through evolutionary steps with each iteration. Some make more changes than others, say comparing Zelda to Smash, but they are not always bringing something completely new to the table. Metroid is one of the few that has really gone through constant extreme changes over the years, and will likely continue to do so. For this reason though Nintendo needs to keep creating new IPs, hopefully ones that people like ourselves would actually care about rather than more casual minigame type junk.
 
Not entirely sure where you get the 112 bits and 160 bits... The framebuffer and Z-buffer both get all the MSAA samples (and if you're using a 24-bit Z-buffer you have 8 bits of stencil as well).

So for 2xMSAA it would be 128 bits and for 4xMSAA 256 bits (on a standard 32-bit framebuffer + 32-bit Z-buffer (24-bit Z + 8-bit stencil)).

It's also a lot easier to calculate these things straight in bytes, so:
(((1280*720*4)*2)*4) = 29,491,200 bytes / 1024^2 = 28.125 MB; at 60fps that's 1687.5 MB/s.

That's the 32-bit framebuffer in bytes, plus Z, times the 4x MSAA samples.

The part where I added the Z-buffer can be thought of as how many 4-byte buffers we have. In the case of a 64-bit framebuffer and normal Z one would use 3, or in the case of many G-buffers, like Crysis 3 on PS3, it would be 6.
(((1280*720*4)*6)*4) = 88,473,600 bytes = 84.375 MB

At 60fps that's 5062.5 MB/s, but you never read the scene just once; you read and write quite a bit more.
Sorry, I'm not gonna say you don't know what you're talking about, but I think you're wrong.

Every up-sample for MSAA costs 24 bits on the X360, and while you can do it at 32-bit precision, it doesn't seem like there's much sense in doing that for an extra pass meant for adding AA.

regular framebuffer + Z-buffer = 64 bits

64 bits + 48 bits = 112 for 2x MSAA
64 bits + 96 bits = 160 for 4x MSAA

I'm not the only guy doing it like this on the internet; it has been done like this for years, and I researched it pretty heavily a few months back. If one doesn't take some basis as confirmed then one can't really ever do anything, and the numbers I posted were meant to be indicative, not absolute, since we know there have been rendering-resolution tricks on consoles since the beginning of time; they're still valid. It's only part of the equation, but I did my homework right.

As for how it gets done on PC, it's probably 32 bits, because you have the overhead not to care; but I'm doing plausible math for consoles, and for a console that is far from rivaling a gaming PC's power.
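Just to lay the two counting schemes being argued over side by side (this is pure arithmetic, no claim here about how the 360's hardware actually stores the samples):

# per-pixel bit cost under each counting scheme, converted to MB per 720p frame
def mb_per_frame(bits_per_pixel, w=1280, h=720):
    return w * h * bits_per_pixel / 8 / 1024**2

def bits_24bit_per_sample(msaa):   # 32-bit colour + 32-bit Z base, plus 24 bits per MSAA sample (the method quoted above)
    return 64 + (24 * msaa if msaa > 1 else 0)

def bits_full_samples(msaa):       # every MSAA sample carries a full 32-bit colour + 32-bit Z
    return 64 * msaa

for msaa in (1, 2, 4):
    print(msaa, mb_per_frame(bits_24bit_per_sample(msaa)), mb_per_frame(bits_full_samples(msaa)))
# 1x: ~7.03 vs ~7.03 MB   2x: ~12.30 vs ~14.06 MB   4x: ~17.58 vs ~28.13 MB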
Texture resolution doesn't really have anything to do with screen resolution; you can use a 2048x2048 texture with multiple channels on a candle which never covers more than 64 pixels on screen (if you haven't authored your content properly).

Also, Rage has several 128k megatextures and a lot of smaller ones; when people talk about 8k textures in Rage they're talking about the texture atlas size, which is a temporary buffer for megatexture tiles.
But I wasn't implying there was...

As for Rage, I know it's not a big improvement with 16k; I was just saying that it does support it, and those modes (and up) will be reserved for PC GPUs because they're the ones with the overhead for that.

I was only speaking of overhead here, not the framebuffer being hindered by texture resolution (hell no).
 

pottuvoi

Banned
Every up-sample for MSAA costs 24 bits on the X360, and while you can do it at 32-bit precision, it doesn't seem like there's much sense in doing that for an extra pass meant for adding AA.

regular framebuffer + Z-buffer = 64 bits

64 bits + 48 bits = 112 for 2x MSAA
64 bits + 96 bits = 160 for 4x MSAA
That certainly is an interesting way to calculate it.
Basically you add only the additional Z samples to the calculation, but you fail to see that there is already a 24-bit Z and most likely 8 bits of stencil in there (in the end you have 5 Z samples, one stencil and one colour, unless you meant it as 5 colour, 1 stencil and 1 Z).

I certainly have not heard that you can have a 24-bit Z/framebuffer without 8 padding bits on the X360.
Also, you've missed the whole idea of how MSAA works: you must fill the buffer before the resolve and keep every colour and Z sample up to that point.

The resolve happens directly to main memory (so MSAA doesn't add any main memory usage).
Here's a small MSDN bit about predicated tiling; it should clear this up:
http://msdn.microsoft.com/en-us/library/bb464139.aspx
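To make the tiling point concrete, a rough back-of-envelope (assuming the commonly cited 10 MB of eDRAM on the 360 and a full 64 bits per sample before resolve):

import math

EDRAM_MB = 10.0  # commonly cited X360 eDRAM size

def tiles_needed(w, h, bytes_per_sample, msaa):
    # full colour + Z kept for every sample until resolve; the resolve then writes to main RAM
    buffer_mb = w * h * bytes_per_sample * msaa / 1024**2
    return buffer_mb, math.ceil(buffer_mb / EDRAM_MB)

print(tiles_needed(1280, 720, 8, 1))  # (~7.03 MB, 1 tile) - fits as-is
print(tiles_needed(1280, 720, 8, 2))  # (~14.06 MB, 2 tiles)
print(tiles_needed(1280, 720, 8, 4))  # (~28.13 MB, 3 tiles)

Which is why 720p with MSAA on the 360 has to be rendered in tiles in the first place.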
 
So in conclusion, Wii U's "lack" of hardware power is not the huge issue people make it out to be. If you don't like Wii U for its "weak" hardware, that's fine, but PLEASE stop ruining every Wii U thread.

Oh man, I kinda missed your meltdown posts while you were banned.
"Guys, stop talking about how Wii U isn't all that great in this Wii U hardware thread! You're ruining the thread, it's all about fun!"
 
That certainly is an interesting way to calculate it.
Basically you add only the additional Z samples to the calculation, but you fail to see that there is already a 24-bit Z and most likely 8 bits of stencil in there (in the end you have 5 Z samples, one stencil and one colour, unless you meant it as 5 colour, 1 stencil and 1 Z).

I certainly have not heard that you can have a 24-bit Z/framebuffer without 8 padding bits on the X360.
Also, you've missed the whole idea of how MSAA works: you must fill the buffer before the resolve and keep every colour and Z sample up to that point.

The resolve happens directly to main memory (so MSAA doesn't add any main memory usage).
Here's a small MSDN bit about predicated tiling; it should clear this up:
http://msdn.microsoft.com/en-us/library/bb464139.aspx
I'll have to be brief because I'm in between work (and I really find no need to re-research it), but as I said, it's not my invention. I'm pretty sure I first read it on B3D threads a few years back, as well as in some tech data.

A quick search gives me this:

straight up framebuffer is res*32 bits.
When you add 2xMSAA it adds a further 48 bits into the mix so you can sample it correctly (2x 24 bits).

So effectively it's res*80 bits (2xMSAA).
You always have a z-buffer (depth) at res*32 bits.
Although some use 24 bits.

So your total is res*112 bits.

The rest is simply division.

divide by 8 = bytes
divide by 1024 = KB
divide by 1024 again = MB

On the MSAA every up-sample adds 24 bits to the mix.

so 2x = +48 bits
4x = +96 bits.
Source: http://www.psu.com/forums/showthrea...ed-360-screenshots/page7?p=4797685&viewfull=1

That dude didn't invent the method either, surely. And no, I'm not saying that proves my point; I'm saying I picked it up and certainly checked its accuracy back when I researched these shenanigans (and if I didn't, or found that I'd made a mistake, I'd have no problem admitting it; but I don't think I did).

As for stencil, no, I'm not taking that into account (not all games have to use it); like I said, it's supposed to be an indicative real-world scenario, I'm not dumping the full load here.

If you look at it... it's a mix. The Z-buffer is 32 bits, which I could shave down to 24 bits; for the MSAA samples, I actually read that because predicated tiling is never ideal they usually don't use the full 32-bit precision (that and other real-world performance issues, I'm sure), and it actually makes sense. So it's a mix. In a way it's not the smallest framebuffer you can get (you could also skip rendering some horizontal lines, as I've pointed out above), but it's also not the largest; and I'm not saying it's this and not a byte more, nor less.

Thinking developers are not gonna optimize there when they're limited is, IMO, also wrong.


Can we drop this?
 
1/ where do you put the Wii U final specs seeing as you most certainly both must know with your assertions?

2/ where do you expect X8/PSOrb in comparison?

1) I think that we haven't seen what the Wii U is capable of yet. I think the early efforts by Nintendo are too stylized to show it, and the 3rd party efforts are too hamstrung by their lack of experience with the hardware. Most specifically, I think that anyone trying to recompile PS360 code expecting it to run at full speed without optimizing for the Wii U's big eDRAM bridge and OOO processor is going to be in for a disappointment. Additionally, if they're not adjusting their SIMD routines to run on the GPU instead of the CPU, they're also going to run into problems.

I think that in the end we're talking about a machine that is going to put out graphics that will be noticeably better than what we see on the PS360, but it might come down to comparisons. We may have games that just have better textures and run at a smoother clip.

2) I think that they'll have 4 or 8 gigs of RAM, and run at 140 watts under load. I think they'll be of a more modern architecture, but it doesn't sound like they'll still be PowerPC. That means that BC is out the window in a lot of cases. The real question is: if they try and compile 360 code for Durango, will it run well after being patched for architectural differences? I think the answer is yes. I think that they'll be able to take a sub-720p game and recompile it so that it runs 1080i 60fps. I think that 1st gen games for them will be able to look significantly better in a lot of ways than anything that could be done on the PS360.

All of that said, I'm not sure there's another console generation in there for Sony. Not that their game division doesn't have the will, but I expect that they'll be undergoing significant reorganization as a corporation in the next two years, some divisions will be chopped entirely, and some will be told to sit down and behave. Orbis may be a victim of that - whether it means something lower spec, more affordable, and more immediately profitable, or if it means stick with the PS3 for another 5 years and let Microsoft be the odd man out. I don't know. Or maybe their games division will be left alone.
 
I do not know how anybody can say what the Wii U's true power is. I keep seeing all different things on what the Wii U has. I have seen some sites say it's 3 Wii cores put together, others say it's one big core with 2 smaller ones, and I have seen people say the eDRAM or whatever it is makes up for the slow DDR3 RAM.

Until an actual spec sheet is posted I do not see how anyone can say the Wii U is underpowered or not.

People also have to keep in mind you cannot compare the 360 to Nintendo's consoles, because both companies think differently when it comes to developing games.

Nintendo has stated it develops its consoles mainly around how its internal developers want to program games. Microsoft usually designs its consoles for broad capability for all 3rd party programmers.

So if you get a PC dev who wants to develop a game the same way he does PC games, he would look at the console and say wow, it's underpowered.
 

Thraktor

Member
All of that said, I'm not sure there's another console generation in there for Sony. Not that their game division doesn't have the will, but I expect that they'll be undergoing significant reorganization as a corporation in the next two years, some divisions will be chopped entirely, and some will be told to sit down and behave. Orbis may be a victim of that - whether it means something lower spec, more affordable, and more immediately profitable, or if it means stick with the PS3 for another 5 years and let Microsoft be the odd man out. I don't know. Or maybe their games division will be left alone.

I think there's at least one more generation in it for Sony, as the Playstation brand is simply too valuable for them not to try to take advantage of it (although if PS4 doesn't go well, they could well repurpose the brand as a streaming service). If you look at the PS4's rumoured specs, it already looks like something made by a very different company than the PS3. The CPU is a pretty basic, almost off-the-shelf model, the GPU is a mid-level, almost off-the-shelf model, they're using standard RAM and there aren't any extravagances in there like the Blu-Ray drive on the PS3. It definitely looks like a console that's designed to sell at about $400 and break-even as soon as possible.
 

MDX

Member
All of that said, I'm not sure there's another console generation in there for Sony. Not that their game division doesn't have the will, but I expect that they'll be undergoing significant reorganization as a corporation in the next two years, some divisions will be chopped entirely, and some will be told to sit down and behave. Orbis may be a victim of that - whether it means something lower spec, more affordable, and more immediately profitable, or if it means stick with the PS3 for another 5 years and let Microsoft be the odd man out. I don't know. Or maybe their games division will be left alone.


I've always felt that the best move for Sony is to focus on the PS3 and forget about the PS4 for a few more years. By that time it could launch early against Nintendo's next console. If they launch now, they are going to be battling Microsoft in a deep red ocean. But I get the feeling the console makers think that the gaming community pie is small, and that whoever gets to them now will win long term.
 

MDX

Member
Havok on Wii U has “specific advantages” over other platforms

“Wii U has its own unique features, and its own challenges. When we come across any new particular platform, we optimize specifically for the advantages that those platforms offer over other platforms — the Wii U has specific advantages that no other platform has.”

Anybody know anything more about this? Any games currently making use of Havok and its special features?
 

Thraktor

Member
Havok on Wii U has “specific advantages” over other platforms

“Wii U has its own unique features, and its own challenges. When we come across any new particular platform, we optimize specifically for the advantages that those platforms offer over other platforms — the Wii U has specific advantages that no other platform has.”

Anybody know anything more about this? Any games currently making use of Havok and its special features?

It runs on the GPU.

It's an advantage over PS360, anyway, although of course PC and future consoles can do the same.

Fake edit: They could also be talking about Havok AI (pathfinding) which should run quite well on the Wii U's CPU, given the significantly larger cache.

Real edit: When I posted, I'd accidentally typed GPU instead of CPU in my fake edit. It occurred to me, though, that the same could be said for the GPU. By caching pertinent sections of the navigation mesh in the eDRAM, the reduction in latency would significantly improve the performance of pathfinding on the GPU, if that's the way they're going.

Real edit 2: Come to think of it, caching parts of the navigation mesh in the GPU eDRAM would also improve performance if you're doing the pathfinding on the CPU.
 

nikatapi

Member
Havok on Wii U has “specific advantages” over other platforms

“Wii U has its own unique features, and its own challenges. When we come across any new particular platform, we optimize specifically for the advantages that those platforms offer over other platforms — the Wii U has specific advantages that no other platform has.”

Anybody know anything more about this? Any games currently making use of Havok and its special features?

Probably GPGPU stuff, for doing calculations on the GPU instead of the CPU?
 

mckmas8808

Banned
Yes. I remember those heady days. Days when hope was alive. The rumoured specs pointed to a machine that was a big jump from current gen. Easily doing current gen ports. Of course the Wii U version would always be the best looking version.

And maybe even powerful enough to be within reach of the PS720. And then Arkam came along and basically said it's just a 720p machine and not that powerful, and the WUST ate him alive. It was just not possible that the machine could be that gimped. Loads of folks said it was technically impossible that in some critical areas it would underperform an Xbox 360. Impossible. Absolutely no way.

In hindsight, I can understand it. But they were still dark, dark days.

At least you have the integrity to admit it. I applaud you.

What does WUST stand for?
 
Oh man, I kinda missed your meltdown posts while you were banned.
"Guys, stop talking about how Wii U isn't all that great in this Wii U hardware thread! You're ruining the thread, it's all about fun!"

Sorry to disappoint. It's just sad that there are people on this board who don't understand "Wii U is NOT for U!" and just shit all over the threads thinking they're doing something cool...

Wii U will be here next week and I'm going to enjoy the heck out of it EVEN if the CPU is slow or the RAM is 43% slower than on PS360... But some people probably won't understand that you can have fun with Wii U despite that...

What does WUST stand for?

Wii U Speculation Thread. There were 6 of them.
 
Oh man, I kinda missed your meltdown posts while you were banned.
"Guys, stop talking about how Wii U isn't all that great in this Wii U hardware thread! You're ruining the thread, it's all about fun!"




I don't think that's a thread to determine if Wii U is great or not. Just a thread to talk about what's behind Wii U's hardware.
 
Not important to you, but important to me. My Wii U arrives next Friday, and although I'm looking forward to it, I'm pretty disappointed relative to the console's power. Sure, the Nintendo first party games will look nice and 3rd party devs will eventually get the hang of the hardware such that ports are okay. But I was hoping for a bridging system between this HD gen and the next HD gen. A machine that would be able to have decent down-ports of true next gen machines. Lots of speculation in the WUST pointed to that exact scenario.

Sadly it's not going to happen.

To say that power is not important in a gaming machine is complete rubbish.

Wrong

3x Xbox 360 and "downports possible from PS4/720" is what we concluded, btw, not "close to PS4/720"...

"Oh btw, PS3 has a really shitty CPU." That's what devs said about the PS3 in the first year/s. I think we know better now, don't we?

And may I ask why you keep judging Wii U by launch software?

"True" next gen machines... Get a halfway decent PC today. Then you can have "true next gen" visuals today already!

Oh well, whatever. Now excuse me, I still have to prepare stuff for next weekend. Big things in the works.
 

AzaK

Member
It runs on the GPU.

It's an advantage over PS360, anyway, although of course PC and future consoles can do the same.

Fake edit: They could also be talking about Havok AI (pathfinding) which should run quite well on the Wii U's CPU, given the significantly larger cache.

Real edit: When I posted, I'd accidentally typed GPU instead of CPU in my fake edit. It occurred to me, though, that the same could be said for the GPU. By caching pertinent sections of the navigation mesh in the eDRAM, the reduction in latency would significantly improve the performance of pathfinding on the GPU, if that's the way they're going.

Real edit 2: Come to think of it, caching parts of the navigation mesh in the GPU eDRAM would also improve performance if you're doing the pathfinding on the CPU.

Given "slow" RAM and CPU, I can't help but think a really beefy 64MB EDRAM would have been nice.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
3x Xbox 360 and "downports possible from PS4/720" is what we concluded, btw, not "close to PS4/720"...
Who's "we"? You mean you. And you are wrong.

"Oh btw, PS3 has a really shitty CPU." That's what devs said about the PS3 in the first year/s. I think we know better now, don't we?
Rubbish. Sony touted the Cell as an amazing processor. The hype was everywhere.

And may I ask why you keep judging Wii U by launch software?
And what else should we judge it by? Batshit insane drivelling from folks like you? Every next gen launch has a few games that make you think that next gen has arrived. Not so with Wii U. The proof is in the pudding.

"True" next gen machines... Get a halfway decent PC today. Then you can have "true next gen" visuals today already!
Thanks. I've already got a decent gaming PC. Still looking forward to true next gen when the new HD twins arrive.

Oh well, whatever. Now excuse me, I still have to prepare stuff for next weekend. Big things in the works.
I too am looking forward to playing with my Wii U. Looking forward to gaming on the GamePad screen.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I'm curious, how did GAF react to the Wii specs, reveal and launch? I completely ignored the chaos back then lol

It was awesome. The rumour mill was in full swing. Some type of virtual reality headset with performance being a huge leap over Xbox or PS2 was one scenario.

When it finally landed there was a severe case of WTF.
 

Easy_D

never left the stone age
I'm curious, how did GAF react to the Wii specs, reveal and launch? I completely ignored the chaos back then lol

Wii had a totally new rendering method using NURBS which would make it blow the other consoles out of the water.

Iwata's famous quote "When you see Wii graphics, you will say 'wow!'." made the actual reveal pretty funny.
 