
Manfred Linzer of Shin'en speaks of the Wii U's tech (eDRAM, etc) in great detail

This actually speaks more for the xbone as well.
Well, it is a way to fill in gaps, but just like the Wii U you still have to deal with a sizable capability gap, and the PS4 has an ease-of-use advantage in its favor as well.

But there's no doubt that if they can get it working within reasonable parameters MS will have the most technically advanced and costly on-die eDRAM approach of any console yet released.
 

Mlatador

Banned
Thanks for sharing. Much respect for Shin'en. They definitely know what they are talking about. I'm really looking forward to their next project.

Wii U is definitely capable of some awesome-looking visuals. And if a small team like them (even though they're tech wizards) can get good performance out of the Wii U with ease, other developers should definitely be able to do that as well.
 

Nirolak

Mrgrgr
So why haven't you been an active member of these threads before?

You are being very concise and accurate with your depictions here, my friend. Much better to talk to than the norm in tech-oriented threads. Or maybe I just rarely am on at the same time as you and don't tend to notice your posts?

Anyway, good write-up. Easy to understand while still being accurate. Good show.

Thanks. I decided to step out when things were still mostly speculative since tempers tended to run very high. I feel they're much calmer now that most things are known.

Thanks guys. Been a while since I kept up on the GPU side of things.

On this note, even on the PS4 we see a high profile title like The Order 1886 rendering in 1920x800 in favor of graphics, so while there are definitely hardware elements to this, developer decisions often have a notable impact on render resolution as well.
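To put some rough numbers on that trade-off, here's a quick back-of-the-envelope sketch (the 1080p comparison point is just an assumed baseline):

Code:
#include <cstdio>

// Quick pixel-count comparison between a 1920x800 letterboxed frame and a
// full 1920x1080 frame (the 1080p baseline is assumed purely for comparison).
int main() {
    const double fullHd    = 1920.0 * 1080.0;
    const double letterbox = 1920.0 * 800.0;
    std::printf("1920x800 shades %.0f%% of the pixels of 1920x1080,\n"
                "leaving roughly a quarter of the per-frame shading work free for effects.\n",
                100.0 * letterbox / fullHd);
    return 0;
}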

Awesomely enough, some game engines are still actively supporting MSAA in combination with non-blurry TAA and PPAA.

CryEngine 3, for example.

Yes, Frostbite 2/3 will also run MSAA, but it's very expensive when it's used with a deferred renderer.
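To give a feel for why it gets expensive, here's a rough sketch of how G-buffer storage scales with sample count in a deferred renderer (the ~20 bytes per pixel layout is an assumption for illustration, not Frostbite's actual layout):

Code:
#include <cstdio>

// Rough sketch of why MSAA hurts a deferred renderer: the whole G-buffer has
// to be stored (and read back during lighting) per sample. The 20 bytes/pixel
// layout below is assumed, e.g. 4x RGBA8 targets plus a 32-bit depth buffer.
int main() {
    const double width  = 1920.0;
    const double height = 1080.0;
    const double gbufferBytesPerPixel = 20.0;
    const int sampleCounts[] = { 1, 2, 4 };

    for (int samples : sampleCounts) {
        double mb = width * height * gbufferBytesPerPixel * samples / (1024.0 * 1024.0);
        std::printf("%dx MSAA: ~%.0f MB of G-buffer to write, then lighting runs per-sample on edge pixels\n",
                    samples, mb);
    }
    // A forward renderer only multiplies a single color + depth target,
    // which is why MSAA is comparatively cheap there.
    return 0;
}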
 
Thanks for sharing. Much respect for Shin'en. They definitely know what they are talking about. I'm really looking forward to their next project.

Wii U is definitely capable of some awesome-looking visuals. And if a small team like them (even though they're tech wizards) can get good performance out of the Wii U with ease, other developers should definitely be able to do that as well.

I think most devs knew this, but ports weren't optimized well enough to make sure that bandwidth-heavy bits of code would never get sent to the 12.8 GB/s main memory pool. Let's not forget the possibility that devs programmed for a single core, which seems to have been ironed out in Black Ops 2.
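For a rough sense of scale on that bandwidth point, a back-of-the-envelope sketch (the pass count and buffer format are assumptions, purely illustrative):

Code:
#include <cstdio>

// Back-of-the-envelope: what raw framebuffer traffic alone would do to a
// 12.8 GB/s main-memory bus at 720p60. Pass count and RGBA8 format are
// assumed values for illustration only.
int main() {
    const double pixels      = 1280.0 * 720.0;
    const double bytesPerPix = 4.0;   // RGBA8
    const double passes      = 4.0;   // assumed full-screen passes, each read + write
    const double fps         = 60.0;

    double gbPerSecond = pixels * bytesPerPix * 2.0 * passes * fps / 1e9;
    std::printf("~%.2f GB/s of pure framebuffer traffic against a 12.8 GB/s bus\n", gbPerSecond);
    std::printf("Total per-frame budget at 60 fps: ~%.0f MB of traffic\n", 12.8e9 / 60.0 / 1e6);
    // Keeping those hot buffers in the on-die eDRAM leaves the slow pool free
    // for texture streaming, geometry and CPU data instead.
    return 0;
}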
 
Thanks. I decided to step out when things were still mostly speculative since tempers tended to run very high. I feel they're much calmer now that most things are known.
I can understand that.

I just kind of like it when it's all high-flying unrealistic expectations and "visions" of how far we've come... only for reality to set in, and that's usually when they start to get touchy. Then there's a cooling-off period. When they see what's being achieved, they usually end up at "Well... okay, maybe that is good enough."

Like clockwork every generation.

And please keep it up man. The need for even a basic understanding cannot be overstated in these threads. And that's not a knock against anyone. Not knowing or not quite understanding is not stupidity. But don't feel slighted if someone corrects you. You may even be lucky enough to have a Faf or Durante bitchslap you with knowledge. I've been on the receiving end of that glove. I came out better for it.
 

Nirolak

Mrgrgr
I can understand that.

I just kind of like it when it's all high-flying unrealistic expectations and "visions" of how far we've come... only for reality to set in, and that's usually when they start to get touchy. Then there's a cooling-off period. When they see what's being achieved, they usually end up at "Well... okay, maybe that is good enough."

Like clockwork every generation.

Yeah, and usually the systems turn out better than people would expect in terms of actual capabilities.

It would be incredibly hard to play Xbox 360 games in 2005 and then imagine something like Assassin's Creed 3, Battlefield 4, or GTA V being capable of running on the system.
 
Thanks. I decided to step out when things were still mostly speculative since tempers tended to run very high. I feel they're much calmer now that most things are known.



On this note, even on the PS4 we see a high profile title like The Order 1886 rendering in 1920x800 in favor of graphics, so while there are definitely hardware elements to this, developer decisions often have a notable impact on render resolution as well.



Yes, Frostbite 2/3 will also run MSAA, but it's very expensive when it's used with a deferred renderer.

So we should expect PS4/XB1 to at least reach 4X MSAA in some games, and WiiU 2X MSAA.
 

Nirolak

Mrgrgr
So we should expect PS4/XB1 to at least reach 4X MSAA in some games, and WiiU 2X MSAA.

It's not that you couldn't, but I'd be really surprised if most devs (or even many devs) decided to:

[Image: AMD anti-aliasing benchmark at 1920 resolution showing the performance cost of MSAA]


I suspect most solutions will revolve around trying to make post processing AA better.
 

Colonel

Neo Member
Why don't they use the DSP for audio and free up more of the CPU for their games? You would think they would be one dev to use the DSP.
 
Yeah, and usually the systems turn out better than people would expect in terms of actual capabilities.

It would be incredibly hard to play Xbox 360 games in 2005 and then imagine something like Assassin's Creed 3 or Battlefield 4 being capable of running on the system.

Both the PS3 and 360 have absolutely floored me with what they have been capable of, especially since the move to deferred rendering. The 360's design was not suited for it; it was designed literally for tile-based rendering. Not exclusively, obviously, but that's where its strengths are.

Devs, when given the time and budget, really amaze me. No dev has been given as much of either as Rockstar, and man is it showing in GTA V. All of that fitting into 512 MB of freaking memory. All of those unique textures, even streaming them in chunks, with fairly high-fidelity shadowing and some really nice lighting. All of that working within the confines of 512 MB and running on GPUs designed in 2005, for two distinctly different hardware designs.

I understand why there's such a tech pull in this industry, when they're achieving such amazing bounds in the arena of realtime rendering while also making strides into the very concept of interactive worlds: creating huge worlds that become more and more believable as the tech improves.

Give devs 8 years on PS4, One, even WiiU and we will see things that should not be possible given what we currently know of all of them.

It might take a hit to image quality to make it happen. But I know truly real-time global illumination and genuinely fluid tessellation are just a few insights away. Maybe on any platform with the capability.

I'm ranting now.

I just always have so much fun with this.
 

Nachtmaer

Member
I suspect most solutions will revolve around trying to make post processing AA better.

Isn't that what most people are expecting to happen? FXAA or MLAA tend to be "good enough", especially from a greater distance, and they have a way smaller impact on performance than MSAA (as that benchmark shows). Although they're a tad more blurry.
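For anyone curious why these filters are so cheap, here's a toy sketch of the core idea (the test image and threshold are made up; real FXAA/MLAA do directional edge searches and sub-pixel blending on top of this):

Code:
#include <algorithm>
#include <cstdio>
#include <vector>

// Toy sketch of the core idea behind post-process AA (FXAA/MLAA style):
// find high-contrast pixels with a few neighbour taps and blend only there.
// A handful of reads per pixel, no extra samples stored, which is why the
// cost is so low compared to MSAA.
int main() {
    const int w = 4, h = 4;
    std::vector<float> img = { // grayscale test image with a hard vertical edge
        0, 0, 1, 1,
        0, 0, 1, 1,
        0, 0, 1, 1,
        0, 0, 1, 1
    };
    std::vector<float> out = img;

    for (int y = 1; y < h - 1; ++y) {
        for (int x = 1; x < w - 1; ++x) {
            float c     = img[y * w + x];
            float up    = img[(y - 1) * w + x], down  = img[(y + 1) * w + x];
            float left  = img[y * w + x - 1],   right = img[y * w + x + 1];
            float contrast = std::max({c, up, down, left, right}) -
                             std::min({c, up, down, left, right});
            if (contrast > 0.25f)                                       // arbitrary edge threshold
                out[y * w + x] = (c + up + down + left + right) / 5.0f; // blend across the edge
        }
    }
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) std::printf("%.2f ", out[y * w + x]);
        std::printf("\n");
    }
    return 0;
}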
 

StevieP

Banned
Isn't that what most people are expecting to happen? FXAA or MLAA tend to be "good enough", especially from a greater distance, and they have a way smaller impact on performance than MSAA (as that benchmark shows). Although they're a tad more blurry.

There is nothing good about the majority of post AA, aside from its relatively low cost. It's usually quite a bit more than "a tad" unless you're extremely selective on where you apply it.
 

Nachtmaer

Member
There is nothing good about the majority of post AA, aside from its relatively low cost. It's usually quite a bit more than "a tad" unless you're extremely selective on where you apply it.

Well, yeah. I guess it also depends on personal preference. Some people are okay with how current games look from their couch, but if you have your PS3 hooked up to a second monitor (like I have), the jaggies are really an eyesore sometimes.

Overall, I am not that picky since FXAA is usually fine, which is one of the reasons why I'm still on an HD6850. If I do have the spare power to run a game smoothly, I crank up the MSAA.
 

wsippel

Banned
The eDRAM isn't the system RAM (the thing people refer to as being low bandwidth), it's the video RAM.

The eDRAM is inarguably larger than the 360's eDRAM pool, and thus you don't have to deal with tiling.

Edit: Well, assuming you don't have a ton of buffers to resolve that is.
The eDRAM is not video RAM; it's unified. Wii U's (and Wii's) RAM is split into two pools, but both pools are unified. There's no main RAM and VRAM split.
 

Datschge

Member
Why don't they use the DSP for audio and free up more of the CPU for their games? You would think they would be one dev to use the DSP.

DSPs don't independently handle everything sound-related; the data management still needs to be done by the CPU (the only exception was the APU in the SNES, which had its own processor to do just that).

Aside from that, the audio toolchain by Shin'en is likely even more impressive than their visual efforts, and they offer it as a service to other developers. Their audio engine on the GBA is very impressive: live-generated music without samples (everything procedurally generated to save space and keep the highest possible resolution) at higher sound quality, while still using less processing power than the widespread standard of 8-bit software sample mixing. It may well be the case that looking at the DSP would have been more effort than just porting their tried-and-true audio engine code (though it's nowhere said that the DSP isn't used, just that a core is used for the audio thread).
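As a toy illustration of the "generated music without samples" idea (this is not Shin'en's engine; the note list and 32768 Hz output rate are just assumptions for the example):

Code:
#include <cmath>
#include <cstdint>
#include <cstdio>
#include <vector>

// Synthesize a short square-wave arpeggio directly from note data instead of
// playing back stored waveforms. A few bytes of note data expand into seconds
// of audio, which is the space saving the post above describes.
int main() {
    const int    sampleRate = 32768;                          // assumed output rate
    const double notesHz[]  = { 261.6, 329.6, 392.0, 523.3 }; // C4, E4, G4, C5
    std::vector<int8_t> pcm;

    for (double freq : notesHz) {
        for (int i = 0; i < sampleRate / 4; ++i) {            // a quarter second per note
            double phase = std::fmod(i * freq / sampleRate, 1.0);
            pcm.push_back(phase < 0.5 ? 60 : -60);            // signed 8-bit square wave
        }
    }
    std::printf("Generated %zu samples (~%.2f s of audio) from a few bytes of note data\n",
                pcm.size(), pcm.size() / double(sampleRate));
    return 0;
}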
 
Yeah, we're probably just mistaken as to what it means to use the DSP. It makes sense that the CPU would have to take at least a small part in the process, which is what it sounds like based on the dev comment.
 