I'm going by the current crop of console ports.
Well of course they're going to be DX9-level, because they're being ported from DX9-level consoles. But the fact is the Wii U's GPU is at minimum DX10.1-level, and almost certainly above that in some ways.
If we're lucky, it'll be kinda like comparing the PS1 version of Tony Hawk 2 to the Dreamcast one. Much uglier, but same core gameplay.
And that's probably what the Wii U versions of next-gen games will be like.
The point is that, unlike the current generation, developers will be able to create cut-down builds of these games for Nintendo's console without having to recode everything from scratch.
I don't think it'll be that bad. Here are shots of the PS and DC versions.
It's doubtful that we'll see such a meaningful jump in geometry, texture quality and aliasing in one generational leap.
Agreed. The Dreamcast could push about 20x as many polygons as the PS1 and had over 6x the RAM, and the PS1 didn't even have perspective correction or bilinear filtering. No way will the gap between the Wii U and any other next-gen console even approach that kind of difference.
I don't think they were really comparing the power levels. Just using Tony Hawk as an example of games that were identical in gameplay, but one version looked clearly better than the other. That probably will be the case with some Wii U and PS4/720 multiplatform titles.

Wait. Now people think the difference between Wii U and PS4/720 is going to be the same as DC to PS1?
Holy shit. Just say PS4 is going to run Avatar at 5000fps and call it a damn day. There is no intelligent discussion left.
There are going to be a lot of hurt people come the reveals of the PS4/720, imo. People have unrealistic expectations.
If we look at it like this, there really isn't a huge difference compared to this generation:
CPU -
Wii U - IBM tri-core OoO CPU with separate audio DSP.
PS4 - 4-core AMD CPU.
720 - 4-core AMD CPU.
GPU -
Wii U - ~500 GFLOPS, DirectX 10.1-like feature set.
PS4 - ~1.8 TFLOPS, DirectX 11-like feature set.
720 - ~1.5 TFLOPS, DirectX 11 feature set.
RAM -
Wii U - 2GB; 1GB for games, 1GB for OS.
PS4 - 4GB; 3GB for games, 1GB for OS.
720 - 6GB; 4GB for games, 2GB for OS.
Nothing like the power gap of the Wii / PS3 / 360 generation. People need to get over it...
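Taking the rumored figures in that list at face value (none of them are confirmed hardware specs), the raw ratios are easy to check:

```python
# Back-of-the-envelope ratios from the rumored spec list above.
# All numbers are unconfirmed rumors, not official hardware specs.
specs = {
    "Wii U": {"gflops": 500, "game_ram_gb": 1},
    "PS4": {"gflops": 1800, "game_ram_gb": 3},
    "720": {"gflops": 1500, "game_ram_gb": 4},
}

base = specs["Wii U"]
for name, s in specs.items():
    gpu = s["gflops"] / base["gflops"]
    ram = s["game_ram_gb"] / base["game_ram_gb"]
    print(f"{name}: {gpu:.1f}x GPU, {ram:.0f}x game RAM")
```

So the rumored gap is roughly 3-4x in GPU power and game RAM, a far smaller multiplier than Wii vs. the HD twins.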
The rumor of Battlefield 4 running at 60fps on PS4/720 is good enough for me. That is a blatantly obvious jump from the current gen of consoles.
CoD titles usually run at or around 60fps on the HD twins.
It all depends on how far and how well they push the hardware.
They'll both certainly be noticeably more powerful than the Wii U, but some people are expecting far too much out of them.
Has anyone seen this latest clip of NintendoLand's plaza at night time?
http://www.youtube.com/watch?v=zoAyJd7Ra5k
Sorry if it's old or slightly unrelated (couldn't find a proper thread to post it in).
Somewhat on topic: I'm really digging the visual style of this game. The image quality, lighting and textures are really well done, but it's the way everything comes together. At times there's a decent amount of stuff in the scene, and yet none of the objects lack proper lighting/texturing, conveying a CG-like look. The (plaza) scenes appear very rich with effects and everything is animated beautifully, yet the console seems to have no real problem rendering it at a decent framerate (isn't NintendoLand confirmed to be 60fps, btw?). Some parts of the video give me old Saturday-morning 3D cartoon vibes. Can't wait to see what Nintendo's 2nd/3rd-gen games look like, given their unique art styles.
Because, frankly, they aren't the final say on what makes a game good/bad. This is a problem Sony's had: they'd block TYPES of games entirely because they figured they were "no good" or "not game enough", and so we missed stuff that could've been localized or perhaps developed locally. Hell, the NES policies weren't really even stopping crappy games so much as preventing the market from being flooded like with the Atari 2600: there was no shortage of shitty games, it's just that many companies that wanted to pump out a lot had to run a separate label to get around it, and we had good (Konami) and bad (Acclaim) companies doing just that.

I'm not ready to accept Nintendo is that powerless.
What's the point of being a manufacturer (or holding any top business position) if your subordinates rule you?
Edit: Even offering incentives is still some form of control, and something they could totally do.
I don't think they were really comparing the power levels. Just using Tony Hawk as an example of games that were identical in gameplay, but one version looked clearly better than the other. That probably will be the case with some Wii U and PS4/720 multiplatform titles.
Yes this.
But COD4 on Wii versus the HD versions... Man that was ugly as sin and had scaled-down firefights too.
Maybe not compared to this generation, but to act like 4x more GPU power or 4x more memory is not huge is disingenuous. That GPU difference is the difference between 720p/30FPS and 1080p/60FPS (without even factoring in the advantage of newer GPU features). Or it could be the difference between being able to run that UE4 demo at 720p/30FPS and not being able to run it at all. A 4x memory difference can also mean something being possible to do vs. not being possible (or "possible" but looking like crap in comparison).
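The 720p/30 vs 1080p/60 comparison does work out to roughly that 4x figure if you count raw pixels per second (an idealized model that assumes cost scales linearly with pixels drawn and ignores fixed per-frame work):

```python
# Idealized pixel throughput: assumes rendering cost scales linearly
# with pixels drawn per second, ignoring fixed per-frame costs.
def pixels_per_second(width, height, fps):
    return width * height * fps

low = pixels_per_second(1280, 720, 30)    # 720p at 30FPS
high = pixels_per_second(1920, 1080, 60)  # 1080p at 60FPS
print(high / low)  # 4.5
```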
CoD titles usually run at or around 60fps on the sub-HD twins.
Isn't BLOPS2 sub-HD?
Fixed.
Seriously, running triple-A titles at a stable framerate of 30+ is going to be a huge improvement. I am beyond tired of sub-20 FPS drops in games that barely run at 720p. I don't care what the PS4 or next Xbox has for specs as long as it gives stability and attempts to push full 1080p in most games.
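For reference, those framerate targets translate into per-frame time budgets like so:

```python
# Milliseconds of render time available per frame at each framerate.
for fps in (20, 30, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
```

Dropping from a 30fps target to sub-20 means the renderer is blowing through its 33.3ms budget by 50% or more every frame.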
The thing is that something looking like "crap" is already subjective, and diminishing returns will be a factor in how much detail can be reduced or cut before most viewers are able to recognize it as a significant difference. Epic themselves hit a bit of that when people compared the UE3 Samaritan video vs the UE4 demo. Epic said that UE4 will make the Samaritan video look like crap, but I'm sure many would consider that a major exaggeration so far.
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?
Because the games are generally designed with the consoles in mind and they rarely scale up any assets other than maybe textures. So the extra horsepower goes to higher resolutions, framerates, lighting and maybe DX11 features.
However, even PC exclusives are not much different in terms of graphics. From my experience, the PC gives you textures that look a lot better, and some noticeable extra lighting and shading effects. Anyway, graphics on multiplatform games are pretty much in the same ballpark.
They still have to accommodate lower-end systems, I guess.
Well, it's kind of the same thing mentioned before: it takes however much extra power to run higher resolutions, framerates, texture detail, more/better effects, etc. Beyond that you get into other stuff like hardware targets (PC games need to be scalable too), budgets, and to an extent diminishing returns.
Very insane ; )

I've also been revisiting the rumor of some type of fixed-function capabilities, which the 320 programmable ALUs would, in effect, supplement. What if Nintendo added in some sort of T&L unit similar to Flipper's but souped up? Say it could do some basic calculations for things like lighting and normal maps so that the programmable shaders could be freed up for some of the crazier effects. Can anyone with some technical know-how tell me if that idea is completely insane or not?
http://www.theinquirer.net/img/1606/PS3-memory-bandwidth-table.jpg?1241331976

The XBox360 has 512MB of GDDR3 on a 128-bit bus running at 700MHz. The PS3 has 256MB of XDR and 256MB of GDDR3, but I'm not sure of the bus width or speed on either.
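The 360 figure can be derived from the bus numbers quoted; a rough sketch, assuming GDDR3's double data rate (two transfers per clock):

```python
# Peak memory bandwidth = (bus width in bytes) * (transfers per second).
# GDDR3 is double data rate, so effective rate = 2 transfers per clock.
def peak_bandwidth_gb_s(bus_bits, clock_mhz, transfers_per_clock=2):
    return (bus_bits / 8) * clock_mhz * transfers_per_clock / 1000

print(peak_bandwidth_gb_s(128, 700))  # ~22.4 GB/s for the 360's GDDR3
```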
I sincerely doubt that a system with an embedded framebuffer and a separate RAM pool would have BW issues related to transparencies when running a PS360 port. Any slow-downs exhibited in translucency-rich scenes would most likely be related to sheer fillrate (ROPs). Alternatively, such scenes could be trisetup-bottlenecked, but AMD GPUs have not had a drop in the trisetup rate since Xenos that I'm aware of.

High bandwidth is needed for shuffling and changing around textures, and using a lot of transparencies eats it for lunch.
It's also a big limiting factor for super and multi-sampling anti-aliasing.
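To put rough numbers on why transparencies eat bandwidth: alpha blending is a read-modify-write against the framebuffer, so every overlapping transparent layer re-touches each pixel it covers. A sketch of the per-frame traffic (hypothetical layer count, 32-bit color only, texture reads excluded):

```python
# Framebuffer traffic from alpha blending: each transparent layer reads
# and writes the destination color for every pixel it covers.
def blend_traffic_mb(width, height, layers, bytes_per_pixel=4):
    # read + write of the destination per layer, in megabytes
    return width * height * layers * bytes_per_pixel * 2 / 1e6

# e.g. 8 full-screen transparent layers at 720p:
print(blend_traffic_mb(1280, 720, 8))  # ~59 MB per frame
```

At 30fps that's roughly 1.8 GB/s of blend traffic alone, which is why an embedded framebuffer keeps that load off main RAM entirely.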
http://www.theinquirer.net/img/1606/PS3-memory-bandwidth-table.jpg?1241331976
(local refers to the GPU GDDR)
The same could be said for Wii. The Wiimote might make a multiplatform title better on Wii than on PS3 and 360. But how many times did that even happen? So few, because the Wii wasn't even equipped to handle most of the games developed on PS360. History will repeat itself here.
Hard as it might be to picture now if you're not a high-end PC gamer, there will be all sorts of PS4/720 games coming in the next few years that the Wii U will simply not be capable of running. Even if they can make a greatly downgraded version on Wii U, will the gamepad-screen gimmick they tack on make it the better version? I doubt you'll think so in 2014, 2015, etc.
I agree with this, except for calling the improvement "minor". Almost every complex game on consoles this gen is basically destroyed by IQ issues, especially in motion.
The sad part is that I'm not even sure the PS4/720 will fix that. Developers may still focus on who can create the best-looking bullshots, with aliasing, temporal image stability and framerate be damned.
1) Games are generally built with the 360/PS3 as the lead SKU, and scaled up for PC.
2) Even 'enhanced' PC games are, 99% of the time, built knowing that console ports are likely, and are thus limited.
3) Most current generation engines were founded on 360/PS3 era architecture.
4) Something something diminishing returns, subjectivity, and so on.
There are a few things to take note of, though. Firstly, even if you're not seeing a 'generation leap', these cards are still processing significantly more data thanks to games running at 1080p+ (occasionally higher), have very impressive and demanding AA techniques implemented (SSAA, MSAA, SGSSAA, etc.), scale assets and shader quality beyond current-generation systems, occasionally include their own exclusive effects, and do all of this while maintaining a very solid framerate. Given that the majority of console games stick to 720p, little to no AA, scaled-down asset quality and missing effects, and still run on average at lower framerates, we are definitely seeing a massive performance gain from current-generation GPUs.
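The AA techniques listed differ mainly in how many samples they shade and store; a quick comparison counting shaded samples per frame (SSAA shades every sample, while MSAA shades roughly once per pixel and only stores extra coverage samples):

```python
# Shaded samples per frame for a few of the AA modes mentioned above.
pixels_1080p = 1920 * 1080

print(pixels_1080p)      # no AA: ~2.07M shaded samples
print(pixels_1080p * 4)  # 4x SSAA: ~8.3M shaded samples
# 4x MSAA shades ~2.07M samples but stores 4 color/depth samples per
# pixel, which is why it leans on bandwidth rather than shader work.
```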
But we won't really see how far they can be pushed until developers/publishers have a benchmark to work with, and PC is sadly not that. Not entirely anyway.
There's also the subjective nature of 'graphics' as people see them. I always bring it up, but people rate Mario Galaxy very high on the "games that look gorgeous" charts. And people froth over Dolphin shots at 1080p with a boatload of AA. Sure, you can see the technical drawbacks, but I also believe a lot of people feel that side-by-side with many Xbox 360 and PlayStation 3 games Mario Galaxy looks very, very good.
For me, looking at the lighting and SSAO quality, along with tessellation, in something like Crysis 3 DX11 does indeed look like a generation leap in rendering quality. Sometimes you really need to sit down and play the games on your computer, then look at how they appear on a console, to see how much cleaner and nicer current generation games look on PC even with minor improvements.
It's true, it's subjective, but it's hard to argue that even just using extra GPU power to get 1080p/60FPS with good AA vs. 720p/30FPS with no AA is not a major advantage. Then you also have to consider: if you use that extra GPU power to render something at 720p/30FPS, how are you going to present that on a 4x weaker machine? Render it in SD at 15FPS?
It would need an emulator regardless, because the GPU and most other components are different.
Wait...I just thought about something: Since Wii U needs a day-one updated to add a Wii emulator, could mean that the CPU is...not based on Broadway?
Or the update could simply be there to make sure the press don't get distracted... and also to keep them from spoiling the fact that Wii BC is going to be another disappointing mark...