Interesting. That's pretty close to my expectations (1460/490).
who is Marcan?
Well-known hardware hacker and former member of Team Twiizers, the guys responsible for the Wii Homebrew Channel.
marcan said:
@digitalfoundry sorry, I'd rather not talk about how I got that yet. It doesn't involve leaks, it involves Wii U hacks.
Wow. That was quick.
1.25 GHz CPU clock with larger caches is about what I expected (though I'd hoped for ~1.5 GHz). Certainly not what you'd call powerful, but at least significantly more powerful than Broadway.
Sadly, the 550 MHz GPU clock doesn't tell us much.
so it was 3x Broadway.
Very heavily modified. The 750CL is inherently unsuited for SMP.
Thanks for the link. If that is true, Matt and Lherre weren't kidding when they said not to think too much about multipliers.
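For anyone who wants to check the multiplier math themselves, here's a quick sketch using Marcan's reported clocks and the commonly cited Wii figures (729 MHz Broadway CPU, 243 MHz Hollywood GPU):

```python
# Quick multiplier check: Marcan's reported Wii U clocks vs. the
# commonly cited Wii clocks (729 MHz Broadway, 243 MHz Hollywood).
wiiu_cpu_mhz = 1243.125    # reported Espresso clock
wiiu_gpu_mhz = 549.999755  # reported GPU clock

wii_cpu_mhz = 729.0
wii_gpu_mhz = 243.0

print(f"CPU: {wiiu_cpu_mhz / wii_cpu_mhz:.3f}x Broadway")   # ~1.705x
print(f"GPU: {wiiu_gpu_mhz / wii_gpu_mhz:.3f}x Hollywood")  # ~2.263x
# Neither ratio is a tidy integer, which fits the "don't think
# too much about multipliers" advice.
```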
So three x Wii?
But how does it work for PC games, which usually have various settings for detail, textures, resolution and so on?
I think people believe that a downport is like pressing a magic button in the engine workflow that magically converts your game to lower resolution, fewer graphical effects, etc.
3 cores, so 3x Wii if you're going off that. And you have a separate DSP to handle audio, which a core on the 360 is often busy handling.
Wii had a DSP as well.
Also, is there something I am missing or does the CPU sound terribly underpowered? Didn't the 360 use a tri-core 3.2 GHz CPU? I'm aware that GHz are far from everything, but just a third of the 360's clock speed doesn't sound good at all.
Is the GHz number this meaningless or is the Wii U CPU just a terribly underpowered piece of hardware?
Depends on the CPU, but it is extremely unlikely that it will have better floating-point power per clock than the 360 CPU (the Wii cores had half the power per clock), so anything that requires, for example, 3D computations will be slow (part of the AI, animations, and physics); for other things it will be more or less the same.
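To put rough numbers on the "float power per clock" point, here's a back-of-envelope sketch. The per-cycle figures are theoretical peaks, and the Espresso one is an assumption (that it keeps Broadway's paired-singles FPU), so treat the totals as illustrative only:

```python
# Theoretical peak single-precision throughput, purely illustrative.
# Assumptions, not confirmed specs: Espresso keeps Broadway's
# paired-singles FPU (2-wide fused multiply-add = 4 flops/cycle/core);
# Xenon's VMX128 does a 4-wide FMA (8 flops/cycle/core, ignoring the
# dot-product counting behind the oft-quoted 115 GFLOPS figure).
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

espresso = peak_gflops(cores=3, ghz=1.243125, flops_per_cycle=4)
xenon    = peak_gflops(cores=3, ghz=3.2, flops_per_cycle=8)

print(f"Espresso (assumed): ~{espresso:.1f} GFLOPS")  # ~14.9
print(f"Xenon:              ~{xenon:.1f} GFLOPS")     # ~76.8
```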
Everything Arkham said has been spot-on so far. Funny how no one wanted to believe him back in the day.
Are you serious? Lol, aren't you comparing Nano Assault to this? Or am I mistaken?
I'm comparing a whole range of games. I wouldn't use Nano Assault for this though (when it comes to shadows). Now here's my idea of a game that does shadows better.
The only thing it loses against LOU is they're not soft shadows. But every object has a shadow. Zoom in and there are even self shadows (the windows). All while pushing extensive motion blur and having subtle depth of field near the edge of each level.
phosphor112 said:
Grass doesn't seem to show shadows in the game (though this is "beta" footage) so it might be added later. But with all the polygons and shaders they are pushing, the lighting effects (umbra/penumbra/soft shadowing) cannot be ignored.
I agree, it shouldn't be ignored. But remember what you were replying to.
http://www.youtube.com/watch?v=qguDnY0C_Sg
When every object in a scene has them, and at high resolution (i.e., no artifacts).
So The Last of Us is a corridor, and Nano Assault or Nintendo Land have huge environments to compare... right?
I haven't played Nano Assault on Wii U, but if the 3DS version is a precedent, I doubt they made the levels uber-compact like Naughty Dog did. Nintendo Land is almost the same deal. The main plaza is quite spacious. Some of the minigames too, I guess (Mario Chase and Animal Crossing: Sweet Day).
I'm sure some will say this is confirmation that Wii U isn't as powerful as PS360. I still find that hard to believe with all things considered, and the first round of ports is close to PS360 and in some cases equal.
So: why would you put 2 GB of RAM in a system that's running on all the other specs we know about? Working backwards, can this tell us anything?
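One way to work backwards, purely as speculation: if the 2 GB is commodity DDR3-1600 on a 64-bit bus (which is what early teardowns suggest, not a confirmed spec), the main-memory bandwidth would be:

```python
# Hypothetical main-memory bandwidth if the 2 GB is DDR3-1600 on a
# 64-bit bus (what early teardowns suggest; not a confirmed spec).
bus_bits = 64
mega_transfers_per_sec = 1600  # DDR3-1600
bandwidth_gb_s = (bus_bits / 8) * mega_transfers_per_sec / 1000
print(f"~{bandwidth_gb_s:.1f} GB/s")  # ~12.8 GB/s
```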
The Wii U's been on the market a week, and they already have some success at hacking it? That was quick.
Let's see if Nintendo's OS is yet again so badly designed that they are unable to patch it in a way that makes hacks pointless.
2006 was all about Shader Model 3 and going from 256 x 256 textures to 512 x 512.
This is 2012, where the Wii U has a more advanced API and a lot more RAM to do things that weren't possible six years ago.
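For scale, here's what that texture jump costs in memory (uncompressed 32-bit RGBA, just to illustrate the quadrupling):

```python
# Memory cost of one uncompressed 32-bit RGBA texture at each size:
# doubling the dimensions quadruples the footprint.
for dim in (256, 512):
    kb = dim * dim * 4 / 1024
    print(f"{dim}x{dim}: {kb:.0f} KB")  # 256 KB vs. 1024 KB
```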
Of course not. Thing is: Shin'en said they used features in Nano Assault that are simply not available on PS3 and 360 hardware. And there's really no reason not to believe them. It's not about how the game looks, it does things previous consoles couldn't do. But we don't really see it. Because 3D graphics is largely smoke and mirrors to begin with, and almost everything can be at least faked with varying levels of success and efficiency on almost any hardware. It just tends to take more effort to fake stuff - which isn't a problem for a multimillion dollar AAA game with a team of several hundred developers and years of development time.
Are you even sure that the environment shadows are real-time?
I remember seeing a video where a lot of the environment is destructible. I have to see it again.
BTW do you even know about the soft indirect shadowing in The Last of Us? That's not simply "soft shadowing" and obviously a lot more advanced than self shadowing. I think you are trying too hard.
I'm talking about the effects you mentioned; we've already seen them in games around 2006. Dead Rising had impressive object-based motion blur in 2006 + DoF.
I'm not saying those effects are new.
Star Wars 1313
So yes, the Wii U CPU is nothing to write home about, but don't compare it clock per clock with a 360 and claim it's much worse. It isn't.
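A toy illustration of why clock alone misleads; the IPC numbers here are invented purely for the example, not measurements of either chip:

```python
# Toy model: useful throughput ~= clock x average IPC.
# The IPC values are invented placeholders to illustrate the point
# (short-pipeline out-of-order 750 vs. long-pipeline in-order Xenon),
# not measurements of either CPU.
def relative_perf(ghz, avg_ipc):
    return ghz * avg_ipc

print(f"Espresso core (hypothetical IPC 1.0): {relative_perf(1.243125, 1.0):.2f}")
print(f"Xenon core    (hypothetical IPC 0.5): {relative_perf(3.2, 0.5):.2f}")
# With these made-up numbers the per-core gap is ~1.3x, not the 2.6x
# the raw clocks suggest.
```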
I don't know if this will suffice as an answer:
http://forum.beyond3d.com/showpost.php?p=1682111&postcount=3476
http://forum.beyond3d.com/showpost.php?p=1682115&postcount=3478
Is it really the case that nothing that's been learned over the past 7 years can be applied to Wii U game development?
So how many GFLOPS can a 2011-era 550 MHz GPU realistically push?
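Nobody has confirmed the shader count, but for a VLIW-era AMD part the peak is just ALUs x 2 ops (multiply-add) x clock. Here's a sketch with a few speculated counts (the 320 figure is forum speculation, not a spec):

```python
# Peak shader throughput for a 550 MHz AMD VLIW-style GPU:
# ALUs x 2 flops (multiply-add) x clock. ALU counts below are
# forum speculation, not confirmed specs.
def gpu_peak_gflops(alus, mhz):
    return alus * 2 * mhz / 1000

for alus in (160, 320, 480):
    print(f"{alus} ALUs @ 550 MHz: ~{gpu_peak_gflops(alus, 550):.0f} GFLOPS")
# 160 -> ~176, 320 -> ~352, 480 -> ~528
```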
Good news about the CPU drawing so little power; all the more for the GPU to use.
Still waiting on the eDRAM speed though; shouldn't the guy who hacked it be able to get that info too?