
Rumor: Wii U final specs

Donnie

Member
I'm going by the current crop of console ports.

Well of course they're going to be DX9 level, because they're being ported from DX9 level consoles. But the fact is the Wii U's GPU is at minimum DX10.1 level, and almost certainly above that in some ways.
 

Snakeyes

Member
If we're lucky, it'll be kinda like comparing the PS1 version of Tony Hawk 2 to the Dreamcast one. Much uglier, but same core gameplay.

I don't think it'll be that bad. Here are some shots of the PS and DC versions:

[Image: Tony Hawk's Pro Skater 2, PS1 vs Dreamcast comparison shot]


It's doubtful that we'll see such a meaningful jump in geometry, texture quality and aliasing in one generational transition again. Today's consoles are already powerful enough to render almost anything at a high degree of realism. Sure, the higher-end systems will be able to render the small details on character clothing, the textures will be sharper and the lighting will be more realistic, but the games will still look pretty similar provided that developers make an effort to optimize the Wii U version.

Speaking of lighting, how does the Wii U fare in that department? I think that will play a huge part in how acceptable the games will look compared to the competition.
 
And that's probably what the Wii U versions of next-gen games will be like.

The point is that, unlike the current generation, developers will be able to create cut-down builds of these games for Nintendo's console without having to recode everything from scratch.

This ^^.

Interesting to read rumours that a lot of the late 2013 multiplatform games will be 'cross generational' releases as well. Games like AC 3: side story, Fifa 14, Madden 14, WWE 14, UFC 14, MW4, BF4, Watch Dogs and MGS Ground Zeroes will all appear on PS360 / Wii U / PS4 / 720 imo.

Third party publishers will not abandon an install base of over 140 million just because two new consoles arrive with a zero install base.

There will of course be a few third party exclusives only available on PS4 / 720, but the way the industry is going these are going to be few and far between. Publishers want titles out on as many machines as can play them, and if that means scaling them back a bit to run on Wii U then they absolutely will.
 

Donnie

Member
I don't think it'll be that bad. Here are shots of the PS and DC versions.





It's doubtful that we'll see such a meaningful jump in geometry, texture quality and aliasing in one generational transition again.

Agreed, Dreamcast could push about 20x as many polygons as PS1 and had over 6x the RAM. Not to mention PS1 didn't even have perspective correction or bilinear filtering. No way will the gap between Wii U and any other next gen console even approach that kind of difference.
 
I'm going by the current crop of console ports.

Don't, these games were cheap, fast ports, rushed out for launch.

Well as they say, there's only one way to find out. Crytek has to put Crysis 3 on Wii U to test the power.

EA, like a lot of third parties, are waiting to see how the console sells at launch before committing a lot of projects to it. If it's sold 5 million by January 1st 2013, expect to hear news of almost every big late 2012 and early 2013 multiplatform game in development for Wii U; these third party publishers need the money now more than ever before.

Tomb Raider, Splinter Cell and Watch Dogs are already heavily rumoured to appear on Wii U in 2013.

Along with exclusives like Rayman Legends, Lego City Undercover, Bayonetta 2, Wonderful 101, Pikmin 3 and Monster Hunter 3 Ultimate, 2013 is already looking fantastic for the system :).
 

beril

Member
Agreed, Dreamcast could push about 20x as many polygons as PS1 and had over 6x the RAM. Not to mention PS1 didn't even have perspective correction or bilinear filtering. No way will the gap between Wii U and any other next gen console even approach that kind of difference.

Not to mention running at 4x the resolution. That was a much bigger jump than going HD this gen.
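
Rough pixel-count math, for anyone curious (assuming typical output resolutions; actual modes varied per game):

```python
# Back-of-envelope pixel counts -- typical output resolutions, not exact for every title.
ps1_pixels = 320 * 240        # common PS1 output
dc_pixels  = 640 * 480        # Dreamcast VGA-class output
sd_pixels  = 640 * 480        # last gen's SD output
hd_pixels  = 1280 * 720       # the common "HD" target this gen

print(dc_pixels / ps1_pixels)  # 4.0  -> the "4x the resolution" jump
print(hd_pixels / sd_pixels)   # 3.0  -> SD to 720p was a smaller pixel-count jump
```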
 

BocoDragon

or, How I Learned to Stop Worrying and Realize This Assgrab is Delicious
Wait. Now people think the difference between Wii U and PS4/720 is going to be the same as DC to PS1?

Holy shit. Just say PS4 is going to run Avatar at 5000fps and call it a damn day. There is no intelligent discussion left.
I don't think they were really comparing the power levels. Just using Tony Hawk as an example of games that were identical in gameplay, but one version looked clearly better than the other. That probably will be the case with some Wii U and PS4/720 multiplatform titles.
 
Wait. Now people think the difference between Wii U and PS4/720 is going to be the same as DC to PS1?

Holy shit. Just say PS4 is going to run Avatar at 5000fps and call it a damn day. There is no intelligent discussion left.

There are going to be a lot of hurt people come the reveals of PS4 / 720 imo; people have unreal expectations.

If we look at it like this there really isn't a huge difference compared to this generation -

CPU -

Wii U - IBM tri-core OoO CPU with separate audio DSP.
PS4 - 4-core AMD CPU.
720 - 4-core AMD CPU.

GPU -

Wii U - ~500 GFLOPS, DirectX 10.1-like feature set chip.
PS4 - ~1.8 TFLOPS, DirectX 11-like feature set chip.
720 - ~1.5 TFLOPS, DirectX 11 feature set chip.

RAM -

Wii U - 2GB, 1GB for games, 1GB for OS.
PS4 - 4GB, 3GB for games, 1GB for OS.
720 - 6GB, 4GB for games, 2GB for OS.

Nothing like the power gap of the Wii / PS3 / 360 generation; people need to get over it...
 

Reiko

Banned
There are going to be a lot of hurt people come the reveals of PS4 / 720 imo; people have unreal expectations.

If we look at it like this there really isn't a huge difference compared to this generation -

CPU -

Wii U - IBM tri-core OoO CPU with separate audio DSP.
PS4 - 4-core AMD CPU.
720 - 4-core AMD CPU.

GPU -

Wii U - ~500 GFLOPS, DirectX 10.1-like feature set chip.
PS4 - ~1.8 TFLOPS, DirectX 11-like feature set chip.
720 - ~1.5 TFLOPS, DirectX 11 feature set chip.

RAM -

Wii U - 2GB, 1GB for games, 1GB for OS.
PS4 - 4GB, 3GB for games, 1GB for OS.
720 - 6GB, 4GB for games, 2GB for OS.

Nothing like the power gap of the Wii / PS3 / 360 generation; people need to get over it...

The rumor of Battlefield 4 running at 60fps on PS4/720 is good enough for me. That is a blatantly obvious jump from last gen on consoles.
 

Sheroking

Member
The rumor of Battlefield 4 running at 60fps on PS4/720 is good enough for me. That is a blatantly obvious jump from last gen on consoles.

CoD titles usually run at or around 60fps on the HD twins.

It all depends on how far and how well they push the hardware.
 

OryoN

Member
Has anyone seen this latest clip of Nintendo Land's plaza... night time?

http://www.youtube.com/watch?v=zoAyJd7Ra5k

Sorry if old or slightly unrelated (couldn't find a proper thread to post it).

Somewhat on topic: I'm really digging the visual style of this game. The image quality, lighting and textures are really well done, but it's the way everything comes together. At times, there's a decent amount of stuff in the scene, and yet none of the objects lack proper lighting/texturing, conveying a CG-like look. The (plaza) scenes appear very rich with effects and everything is animated beautifully, yet the console seems to have no real problem rendering it at a decent framerate (isn't Nintendo Land confirmed to be 60fps, btw?). Some parts of the video give me those (old) Saturday 3D cartoon vibes. Can't wait to see what Nintendo's 2nd/3rd gen games look like, given their unique art styles.
 
Has anyone seen this latest clip of Nintendo Land's plaza... night time?

http://www.youtube.com/watch?v=zoAyJd7Ra5k

Sorry if old or slightly unrelated (couldn't find a proper thread to post it).

Somewhat on topic: I'm really digging the visual style of this game. The image quality, lighting and textures are really well done, but it's the way everything comes together. At times, there's a decent amount of stuff in the scene, and yet none of the objects lack proper lighting/texturing, conveying a CG-like look. The (plaza) scenes appear very rich with effects and everything is animated beautifully, yet the console seems to have no real problem rendering it at a decent framerate (isn't Nintendo Land confirmed to be 60fps, btw?). Some parts of the video give me those (old) Saturday 3D cartoon vibes. Can't wait to see what Nintendo's 2nd/3rd gen games look like, given their unique art styles.

I was thinking that Nintendo Land in the daytime looked pretty nice.
 

MThanded

I Was There! Official L Receiver 2/12/2016
Has anyone seen this latest clip of Nintendo Land's plaza... night time?

http://www.youtube.com/watch?v=zoAyJd7Ra5k

Sorry if old or slightly unrelated (couldn't find a proper thread to post it).

Somewhat on topic: I'm really digging the visual style of this game. The image quality, lighting and textures are really well done, but it's the way everything comes together. At times, there's a decent amount of stuff in the scene, and yet none of the objects lack proper lighting/texturing, conveying a CG-like look. The (plaza) scenes appear very rich with effects and everything is animated beautifully, yet the console seems to have no real problem rendering it at a decent framerate (isn't Nintendo Land confirmed to be 60fps, btw?). Some parts of the video give me those (old) Saturday 3D cartoon vibes. Can't wait to see what Nintendo's 2nd/3rd gen games look like, given their unique art styles.

I really like the way that looks.
 

Eusis

Member
I'm not ready to accept Nintendo is that powerless.

What's the point of being a manufacturer (or any top business position) if subordinates rule you?

Edit: Even offering incentives is still some form of control and something they could totally do.
Because, frankly, they aren't the final say on what makes a game good or bad. This is a problem Sony's had: they'd block TYPES of games entirely because they figured they were "no good" or "not game enough", and so we missed stuff that could've been localized or perhaps developed locally. Hell, the NES policies weren't really even stopping crappy games so much as preventing the market from being flooded like the Atari 2600's was: there was no shortage of shitty games, it's just that many companies that wanted to pump out a lot had to run a separate label to get around it, and we had good (Konami) and bad (Acclaim) publishers doing just that.
 

DonMigs85

Member
I don't think they were really comparing the power levels. Just using Tony Hawk as an example of games that were identical in gameplay, but one version looked clearly better than the other. That probably will be the case with some Wii U and PS4/720 multiplatform titles.

Yes this.
But COD4 on Wii versus the HD versions... Man that was ugly as sin and had scaled-down firefights too.
 

FyreWulff

Member
Yes this.
But COD4 on Wii versus the HD versions... Man that was ugly as sin and had scaled-down firefights too.

CoD4 on Wii was also ported by, like, 3 people. (I'm exaggerating, but CoD3 on Wii was, in fact, ported by two guys given a DVD of PS2 assets.)

The Wii was probably the last console where you could get away with micro-small teams on a retail title.


edit: and to make this comment slightly more relevant, Tony Hawk was ported to the N64 with a tiny team as well :p
 

Lord Error

Insane For Sony
If we look at it like this there really isn't a huge difference compared to this generation -
Maybe not compared to this generation, but to act like 4x more GPU power or 4x more memory is not huge is disingenuous. That GPU difference is the difference between 720p/30FPS and 1080p/60FPS (without even factoring in the advantage of newer GPU features). Or it could be the difference between being able to run that UE4 demo at 720p/30FPS and not being able to run it at all. A 4x memory difference can also mean something being possible to do vs. not being possible (or "possible" but looking like crap in comparison).
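
To put rough numbers on that, using the rumored figures quoted earlier in the thread (none of which are confirmed specs):

```python
# Rumored figures only -- nothing here is confirmed hardware.
wiiu_gflops, ps4_gflops = 500, 1800
wiiu_game_ram_gb, ps4_game_ram_gb = 1, 3   # RAM said to be available to games

print(ps4_gflops / wiiu_gflops)            # ~3.6x raw shader throughput
print(ps4_game_ram_gb / wiiu_game_ram_gb)  # 3x game-visible RAM

# Pixel throughput needed for 1080p/60 vs 720p/30:
print((1920 * 1080 * 60) / (1280 * 720 * 30))  # 4.5x -- same ballpark as the FLOP gap
```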
 

PrimeRib_

Member
Isn't BLOPS2 sub-HD?

CoD titles usually run at or around 60fps on the sub-HD twins.

Fixed.

Seriously, running triple-A titles with a stable framerate of 30+ is going to be a huge improvement. I am beyond tired of sub-20 FPS drops, with games that barely run at 720p. I don't care what the PS4 or next Xbox has for specs as long as it gives stability and attempts to push full 1080p in most games.
 
Isn't BLOPS2 sub-HD?



Fixed.

Seriously, running triple-A titles with a stable framerate of 30+ is going to be a huge improvement. I am beyond tired of sub-20 FPS drops, with games that barely run at 720p. I don't care what the PS4 or next Xbox has for specs as long as it gives stability and attempts to push full 1080p in most games.

You poor naive fool... It's not that the PS360 can't run HD games at 30fps; it's that they have to be the prettiest and shiniest games, so they run sub-HD at low framerates, and that will happen again next gen unless Microsoft and Sony tell developers they aren't going to get published without meeting those requirements.

I fully expect that when the systems first launch we're going to see a lot of PS360 games up-ported to full 1080p and 60fps, but by the time we're a couple of years in, they'll drop back down to 30fps, and then to 720p, all in the name of adding just a few more types of pixel shading and post-processing effects.
 
Maybe not compared to this generation, but to act like 4x more GPU power or 4x more memory is not huge is disingenuous. That GPU difference is the difference between 720p/30FPS and 1080p/60FPS (without even factoring in the advantage of newer GPU features). Or it could be the difference between being able to run that UE4 demo at 720p/30FPS and not being able to run it at all. A 4x memory difference can also mean something being possible to do vs. not being possible (or "possible" but looking like crap in comparison).
The thing is that something looking like "crap" is already subjective, and diminishing returns will be a factor in how much detail can be reduced or cut before most viewers are able to recognize it as a significant difference. Epic themselves hit a bit of that when people compared the UE3 Samaritan video vs the UE4 demo. Epic said that UE4 will make the Samaritan video look like crap, but I'm sure that many would consider that a major exaggeration so far.
 

Pittree

Member
The thing is that something looking like "crap" is already subjective, and diminishing returns will be a factor in how much detail can be reduced or cut before most viewers are able to recognize it as a significant difference. Epic themselves hit a bit of that when people compared the UE3 Samaritan video vs the UE4 demo. Epic said that UE4 will make the Samaritan video look like crap, but I'm sure that many would consider that a major exaggeration so far.

This.

As a lot of people here have brought up, graphics are a matter of perception. If Boco's and DonMigs' perception is, for example, that the Samaritan demo looks like crap compared to the infamous UE4 demo, then for them, yes, Wii U versions are going to look like crap or even worse. However, for the vast majority of regular people, and even for a lot of avid gamers, the differences could be negligible. Now, on Boco's original point: I think that if he can already see the value in Nintendo IPs and is so concerned about graphics, I can't see a reason not to buy a Wii U now and enjoy superior graphics experiences for a whole year (at minimum) while also enjoying first-party experiences. Hell, he could even trade his Wii U later for the brand new console of his preference.
 

DonMigs85

Member
Even though it's relatively low-poly and didn't make much use of fancy shaders and lighting, I think Ridge Racer 7 still looks fairly good. Definitely better than RR6. And that's a PS3 launch title running at a full 1920x1080 and 60FPS!
 

Oblivion

Fetishing muscular manly men in skintight hosery
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?
 

DonMigs85

Member
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?

Because the games are generally designed with the consoles in mind and they rarely scale up any assets other than maybe textures. So the extra horsepower goes to higher resolutions, framerates, lighting and maybe DX11 features.
 

Pittree

Member
Because the games are generally designed with the consoles in mind and they rarely scale up any assets other than maybe textures. So the extra horsepower goes to higher resolutions, framerates, lighting and maybe DX11 features.

However, even PC exclusives are not much different in terms of graphics. From my experience PC gives you textures that look a lot better, and some noticeable extra lighting and shading effects. Anyway, graphics on multiplatform games are pretty much in the same ballpark.
 

DonMigs85

Member
However, even PC exclusives are not much different in terms of graphics. From my experience PC gives you textures that look a lot better, and some noticeable extra lighting and shading effects. Anyway, graphics on multiplatform games are pretty much in the same ballpark.

They still have to accommodate lower-end systems, I guess.
 

guek

Banned
Has anyone seen this latest clip of Nintendo Land's plaza... night time?

http://www.youtube.com/watch?v=zoAyJd7Ra5k

Sorry if old or slightly unrelated (couldn't find a proper thread to post it).

Somewhat on topic: I'm really digging the visual style of this game. The image quality, lighting and textures are really well done, but it's the way everything comes together. At times, there's a decent amount of stuff in the scene, and yet none of the objects lack proper lighting/texturing, conveying a CG-like look. The (plaza) scenes appear very rich with effects and everything is animated beautifully, yet the console seems to have no real problem rendering it at a decent framerate (isn't Nintendo Land confirmed to be 60fps, btw?). Some parts of the video give me those (old) Saturday 3D cartoon vibes. Can't wait to see what Nintendo's 2nd/3rd gen games look like, given their unique art styles.

Having just watched Wreck-It Ralph and already feeling all warm and fuzzy inside, this clip stirs up some major hype for me that I definitely haven't felt for NL up to now. Whoo! Can't wait to explore!
 

japtor

Member
However, even PC exclusives are not much different in terms of graphics. From my experience PC gives you textures that look a lot better, and some noticeable extra lighting and shading effects. Anyway, graphics on multiplatform games are pretty much in the same ballpark.
Well, it's kind of the same thing mentioned before: it takes however much extra power to run higher resolutions, framerates, texture detail, more/better effects, etc. Beyond that you get into the other stuff like hardware targets (PC games need to be scalable too), budgets, and to an extent diminishing returns.

That last one depends a lot on the subject, art, and a lot of other factors though. Racing sims* can look pretty insane on PC while the console versions still look pretty damn good, whereas stuff with humans is still a ways off regardless of the hardware. A game might be able to be scaled down significantly from a technical standpoint but still be a good enough approximation of the original experience. Art could make the technical stuff moot (like Mario Galaxy looking decent despite the hardware), or be a linchpin of the experience, like if the style of a game is predicated on something only possible with newer hardware.

*There's non-visual stuff to them too though, like complex physics models that can be too much for the consoles to handle. That's still kind of a thing that can be approximated well enough for many users though.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
I've also been revisiting the rumor of some type of fixed function capabilities, which the 320 programmable ALUs would, in effect, supplement. What if Nintendo added in some sort of T&L unit similar to Flipper's but souped up? Say it could do some basic calculations for things like lighting and normal maps so that the programmable shaders could be freed up for some of the crazier effects. Can anyone with some technical know-how tell me if that idea is completely insane or not?
Very insane ; )
 

neoanarch

Member
PC games are sort of hampered by the lack of improvement in displays over the last decade. 1080p is ridiculous as a standard for anything bigger than 17 inches. You can't really appreciate the improvements that come with the PC when the max resolution is the same as the weaker consoles'.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
The XBox360 has 512MB of GDDR3 on a 128-bit bus running at 700MHz. The PS3 has 256MB of XDR and 256MB of GDDR3, but I'm not sure of the bus width or speed on either.
http://www.theinquirer.net/img/1606/PS3-memory-bandwidth-table.jpg?1241331976

(local refers to the GPU GDDR)

High bandwidth is needed for shuffling and changing around textures, and using a lot of transparencies eats it for lunch.
It's also a big limiting factor for super and multi-sampling anti-aliasing.
I sincerely doubt that a system with an embedded framebuffer and a separate RAM pool would have BW issues related to transparencies when running a PS360 port. Any slow-downs exhibited in translucency-rich scenes would most likely be related to sheer fillrate (ROPs). Alternatively, such scenes could be trisetup-bottlenecked, but AMD GPUs have not had a drop in the trisetup rate since Xenos that I'm aware of.
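
A rough sketch of the blend-traffic math behind that (illustrative numbers only; the actual eDRAM bandwidth and ROP count aren't public):

```python
# Alpha blending needs a read-modify-write of the colour buffer for every
# blended layer, so overdraw multiplies the traffic.
width, height, bpp = 1280, 720, 4      # 720p, 32-bit colour
overdraw, fps = 4, 60                  # hypothetical translucency-heavy scene

bytes_per_frame = width * height * bpp * overdraw * 2   # read + write per layer
print(bytes_per_frame * fps / 1e9)     # ~1.8 GB/s -- trivial for on-die eDRAM,
                                       # so the limit becomes how many pixels per
                                       # clock the ROPs can actually blend.
```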
 

Rootbeer

Banned
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?

It's a question of budgets, not hardware limitations, IMO. Sad but true. I have a beast of a computer but it'll probably never be fully utilized, because no developer out there is trying to max out today's PC hardware; it would take so many resources that they either couldn't afford it or realize that so few people would be able to utilize it that it's not worth attempting.
 

Panajev2001a

GAF's Pleasant Genius
http://www.theinquirer.net/img/1606/PS3-memory-bandwidth-table.jpg?1241331976

(local refers to the GPU GDDR)


I sincerely doubt that a system with an embedded framebuffer and a separate RAM pool would have BW issues related to transparencies when running a PS360 port. Any slow-downs exhibited in translucency-rich scenes would most likely be related to sheer fillrate (ROPs). Alternatively, such scenes could be trisetup-bottlenecked, but AMD GPUs have not had a drop in the trisetup rate since Xenos that I'm aware of.

Well, if you can only use the eDRAM as a framebuffer and cannot use part of it for off-screen textures/FBOs, then you are back to rendering to eDRAM, exporting the framebuffer contents to main RAM, and then reading them back as a texture from main RAM like you do on Xbox 360, which consumes more main memory bandwidth than you would on PS2's GS or Flipper/Hollywood (I was under the assumption that you could only use the texture cache as read-only from the GPU side, but apparently I was wrong).
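
A minimal sketch of that round trip, with purely illustrative numbers (720p 32-bit buffer; real pass counts and formats vary per game):

```python
# Each full-screen pass that samples the framebuffer as a texture costs a
# resolve out of eDRAM into main RAM plus a read back from main RAM.
width, height, bpp = 1280, 720, 4
buffer_bytes = width * height * bpp            # ~3.7 MB per copy

passes, fps = 3, 30                            # hypothetical post-process pass count
print(buffer_bytes * 2 * passes * fps / 1e9)   # ~0.66 GB/s of main-RAM traffic that a
                                               # GS/Flipper-style design keeps on-die
```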
 

Donnie

Member
The same could be said for Wii. The Wiimote might make a multiplatform title better on Wii than on PS3 and 360. But how many times did that even happen? So few, because the Wii wasn't even equipped to handle most of the games developed on PS360. History will repeat itself here.

Hard as it might be to picture now if you're not a high-end PC gamer, there will be all sorts of PS4/720 games coming in the next few years that the Wii U will simply not be capable of running. Even if they can make a greatly downgraded version on Wii U, will the GamePad screen gimmick they tack on make it the better version? I doubt you'll think so in 2014, 2015, etc etc.

Wii got so few multiplatform games for three main reasons: its control method wasn't seen as suitable, its user base was considered the wrong demographic, and its architecture was incompatible with the way the engines for those games were designed. That meant developers had to use a separate engine and redevelop the game for it rather than being able to port/downscale the same engine. Despite that, Wii did still get some multiplatform games that could be considered the best on any console, not many but some, which is why I said better graphics meaning a better game is not always a given even if it is often the case.

As far as Wii U following Wii goes: Wii U has the option to use the same traditional control method as any other next gen console, and its architecture will most definitely be compatible, which will allow next gen engines to be ported and downscaled. So there's no reason for history to repeat itself on that score. The only question is whether developers decide that the user base isn't suited to their games. Either way, I'll bet right now that Wii U gets more multiplatform games than Wii did.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
Curious. There are PC graphics cards out there that can utterly obliterate the GPUs used in the PS360. But I don't see any PC games out right now that show a whole generational leap. Is there a reason for that?

1) Games are generally built with the 360/PS3 as the lead SKU, and scaled up for PC.
2) Even 'enhanced' PC games are, 99% of the time, built knowing that console ports are likely, and are thus limited.
3) Most current generation engines were founded on 360/PS3 era architecture.
4) Something something diminishing returns, subjectivity, and so on.

There are a few things to take note of, though. Firstly, even if you're not seeing a 'generation leap', these cards are still processing significantly more data thanks to games running at 1080p (occasionally higher), have very impressive and demanding AA techniques implemented (SSAA, MSAA, SGSSAA, etc), scale assets and shader quality beyond current generation systems, occasionally include their own exclusive effects, and do all of this while maintaining a very solid framerate. Given that the majority of console games stick to 720p, little-to-no AA, scaled-down asset quality, missing effects, and still run on average at lower framerates, we are definitely seeing a massive performance gain from current generation GPUs.
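
Some rough math on how much extra work those settings imply (nominal sample counts; the real cost of each AA mode varies by implementation):

```python
# Shaded-pixel counts relative to a typical 720p/no-AA console target.
console_target  = 1280 * 720             # ~0.9 Mpixels, no AA
pc_1080p        = 1920 * 1080            # ~2.1 Mpixels
pc_1080p_4xssaa = pc_1080p * 4           # SSAA shades every sub-sample

print(pc_1080p / console_target)         # 2.25x from resolution alone
print(pc_1080p_4xssaa / console_target)  # 9x with 4x supersampling on top
```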

But we won't really see how far they can be pushed until developers/publishers have a benchmark to work with, and PC is sadly not that. Not entirely anyway.

There's also the subjective nature of 'graphics' as people see them. I always bring it up, but people rate Mario Galaxy very high on the "games that look gorgeous" charts. And people froth over Dolphin shots at 1080p with a boatload of AA. Sure, you can see the technical drawbacks, but I also believe a lot of people feel that side-by-side with many Xbox 360 and PlayStation 3 games Mario Galaxy looks very, very good.

For me, looking at the lighting and SSAO quality, along with tessellation, in something like Crysis 3 DX11 does indeed look like a generation leap in rendering quality. Sometimes you really need to sit down and play the games on your computer, then look at how they appear on a console, to see how much cleaner and nicer current generation games look on PC even with minor improvements.
 

Durante

Member
For me, looking at the lighting and SSAO quality, along with tessellation, in something like Crysis 3 DX11 does indeed look like a generation leap in rendering quality. Sometimes you really need to sit down and play the games on your computer, then look at how they appear on a console, to see how much cleaner and nicer current generation games look on PC even with minor improvements.
I agree with this, except for calling the improvement "minor". Almost every complex game on consoles this gen is basically destroyed by IQ issues, especially in motion.

The sad part is that I'm not even sure PS4/720 will fix that. Developers may still focus on who can create the best-looking bullshots; damn the aliasing, temporal image stability and framerate.
 

EatChildren

Currently polling second in Australia's federal election (first in the Gold Coast), this feral may one day be your Bogan King.
I agree with this, except for calling the improvement "minor". Almost every complex game on consoles this gen is basically destroyed by IQ issues, especially in motion.

The sad part is that I'm not even sure PS4/720 will fix that. Developers may still focus on who can create the best-looking bullshots; damn the aliasing, temporal image stability and framerate.

By minor I meant in terms of asset quality and such, as in fairly straight console ports. I agree that the actual end result is still significantly better, even if not much other than the AA/AF improves.

And yeah, I don't expect the PS4/720 to improve in IQ. But they should lay a foundation for better engines that take good advantage of the processing power in modern GPUs. PC will, once again, clean it all up :p.
 
1) Games are generally built with the 360/PS3 as the lead SKU, and scaled up for PC.
2) Even 'enhanced' PC games are, 99% of the time, built knowing that console ports are likely, and are thus limited.
3) Most current generation engines were founded on 360/PS3 era architecture.
4) Something something diminishing returns, subjectivity, and so on.

There are a few things to take note of, though. Firstly, even if you're not seeing a 'generation leap', these cards are still processing significantly more data thanks to games running at 1080p (occasionally higher), have very impressive and demanding AA techniques implemented (SSAA, MSAA, SGSSAA, etc), scale assets and shader quality beyond current generation systems, occasionally include their own exclusive effects, and do all of this while maintaining a very solid framerate. Given that the majority of console games stick to 720p, little-to-no AA, scaled-down asset quality, missing effects, and still run on average at lower framerates, we are definitely seeing a massive performance gain from current generation GPUs.

But we won't really see how far they can be pushed until developers/publishers have a benchmark to work with, and PC is sadly not that. Not entirely anyway.

There's also the subjective nature of 'graphics' as people see them. I always bring it up, but people rate Mario Galaxy very high on the "games that look gorgeous" charts. And people froth over Dolphin shots at 1080p with a boatload of AA. Sure, you can see the technical drawbacks, but I also believe a lot of people feel that side-by-side with many Xbox 360 and PlayStation 3 games Mario Galaxy looks very, very good.

For me, looking at the lighting and SSAO quality, along with tessellation, in something like Crysis 3 DX11 does indeed look like a generation leap in rendering quality. Sometimes you really need to sit down and play the games on your computer, then look at how they appear on a console, to see how much cleaner and nicer current generation games look on PC even with minor improvements.

Yep. Sometimes I feel like I'm being inconsistent or conflicted, spending boatloads of money on my PC to get 'teh bettar graphics' and then turning around and saying I'll be perfectly happy with something like Mario Galaxy in 720p.

I honestly feel that Crysis 2 with the better textures and all the DX11 features turned on, in stereoscopic 3D at 60 fps, is a generational leap over anything on the consoles. It still blows me away, but ultimately what I care about most is IQ and framerate. Textures are way more important to me than poly counts.

But then I can still go back and fire up AVP classic and enjoy it nearly as much as when I first played it. So I really don't know. I'll do just about anything for better IQ, smoother framerates and 3D, but a thirteen-year-old game can still draw me right in and scare me after all this time.

I do know that Wii games look worse on my newer HDTV than they did on my previous one, and that it stopped me being able to play anything that Dolphin couldn't run well on my PC until I got a CRT to hook the Wii up to. I'm so grateful for Dolphin because I literally couldn't have played Skyward Sword without it.

But show me a game with a new tier of lighting and I'm drooling all over the place.
 
Wait... I just thought about something: since the Wii U needs a day-one update to add a Wii emulator, could that mean the CPU is... not based on Broadway?
 

Lord Error

Insane For Sony
The thing is that something looking like "crap" is already subjective, and diminishing returns will be a factor in how much detail can be reduced or cut before most viewers are able to recognize it as a significant difference. Epic themselves hit a bit of that when people compared the UE3 Samaritan video vs the UE4 demo. Epic said that UE4 will make the Samaritan video look like crap, but I'm sure that many would consider that a major exaggeration so far.
It's true, it's subjective, but it's hard to argue that even just using extra GPU power to get 1080p@60FPS with good AA vs. 720p@30FPS with no AA is not a major advantage. Then you also have to consider: if you use that extra GPU power to render something at 720p/30, how are you going to present that on a 4x weaker machine? Render it in SD at 15FPS?
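
Quick sanity check on that last bit (purely illustrative):

```python
# If a scene is budgeted for 1280x720 at 30fps, a machine with ~1/4 of the
# GPU throughput has roughly this much pixel budget left per second:
budget = 1280 * 720 * 30 / 4
print(budget / (640 * 480))   # ~22.5 -> roughly SD resolution in the low-20s fps
```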

Wait... I just thought about something: since the Wii U needs a day-one update to add a Wii emulator, could that mean the CPU is... not based on Broadway?
It would need an emulator regardless, because the GPU and most other components are different.
 
Wait... I just thought about something: since the Wii U needs a day-one update to add a Wii emulator, could that mean the CPU is... not based on Broadway?

Or the update could simply be there to make sure the press don't get distracted... also to keep them from spoiling the fact that Wii BC is going to be another disappointing mark...
 

z0m3le

Banned
I agree with this, except for calling the improvement "minor". Almost every complex game on consoles this gen is basically destroyed by IQ issues, especially in motion.

The sad part is that I'm not even sure PS4/720 will fix that. Developers may still focus on who can create the best-looking bullshots; damn the aliasing, temporal image stability and framerate.

Honestly, I think this is going to largely depend on game budgets. If a developer doesn't have, say, $50-60 million, they won't be able to produce an "AAA" game. Studios like Rockstar will continue to push $100 million games, but next gen the scope of those games will require more money; it honestly wouldn't surprise me if we saw some $150 million budgets from them for similar games.

Obviously not every studio can do this, and $30 million is the budget for the average AAA game this gen, so you have studios right now that can't even do that. I'm 100% sure we will have some amazing-looking games that will be very clearly a generation ahead (Crysis 3 on PC and beyond that), but by and large those games will not pop up as often as they did this gen. We will likely get a lot more games that look better than Two Souls or The Last of Us, though (better IQ, lighting, geometry). Honestly, I feel that all three games I've just mentioned are valid targets for next gen when those improvements are added and budgets come into play.

Or the update could simply be there to make sure the press don't get distracted... also to keep them from spoiling the fact that Wii BC is going to be another disappointing mark...

Considering the PS4 has a less than 1% chance of having BC with PS3, and XB3 is similarly implausible, Wii U having 1:1 Wii BC is in fact... a +. The downside is that you can't play Wii games, and thus Wii VC games, on the GamePad. THAT is the -; still, at least it can play the Wii library and VC library you've collected over the past 6 years.
 