
Resolution-gate: Infinity Ward responds to Eurogamer

To me, the most "accurate" simplification of the situation so far seems like the split lands (as others have pointed out) on forward vs deferred renderers.

i.e. we will see some 1080p Xbox One titles using a forward-rendered setup, but due to back-buffer sizes and that ESRAM, probably more 720p deferred-rendered titles. And sadly, deferred is where all the real eye candy is going.

I believe this to be 100% correct. With a forward renderer, there's just about enough grunt there to push 1080p60 on non-demanding titles. With a deferred renderer, though, the sums at 1080p don't add up: you'll need to start copying back and forth between ESRAM and DDR3, and that will just kill your framerate. Everything comes back to that paltry 32MB.

As it happens, that may in a perverse way help the XBone long term. Given the prevalence of deferred renderers, we should expect 1080p to be out of the equation (sorry, the maths just don't add up), so there should be some power left over in the GPU. Well, you can use that to help match what the PS4 is doing with its "spare" compute cycles. So you can match, more or less, the GPGPU stuff, and stay at the 720/900 resolutions we're expecting for the console.

Maybe it really is balanced after all, lol.
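The sums behind the post above can be sketched quickly. The 20-bytes-per-pixel G-buffer layout used here (four 32-bit render targets plus a 32-bit depth buffer) is an assumed, illustrative figure; real engines vary.

```python
# Rough check: does a deferred G-buffer fit in 32 MB of ESRAM?
# 20 bytes/pixel (four 32-bit render targets + 32-bit depth) is an
# assumed layout for illustration only; real engines vary.

ESRAM_MB = 32

def gbuffer_mb(width, height, bytes_per_pixel=20):
    """G-buffer size in MB (mebibytes)."""
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1920, 1080), (1600, 900), (1280, 720)]:
    size = gbuffer_mb(w, h)
    verdict = "fits" if size <= ESRAM_MB else "does NOT fit"
    print(f"{w}x{h}: {size:5.1f} MB -> {verdict} in {ESRAM_MB} MB ESRAM")
```

Under these assumptions a 1080p G-buffer (~39.6 MB) blows past 32 MB while 720p (~17.6 MB) fits comfortably, which is exactly the split being argued above.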
 

That would suggest to me that IW might have trouble ever hitting 1080p60 on a COD this gen on XB1, as I highly doubt they'll use a different rendering method for one console over the other.
 

Phades

Member
No point in getting a gaming PC in 2013 without also getting an SSD.
Says who? If they are taking that position, then get ready to RAID 0 (or 0+1) two of them together as a mandatory configuration....

My HD is loud and slow and only 300 GB, and I wouldn't be surprised if it gave out entirely sometime soon.

I think I will get a new gaming PC at some point in the future, it's just that right now it doesn't make sense for me to go down that road.
Well, failure is always a legitimate concern (one that has to be addressed regardless of upgrade considerations), but are you sure the sound you're hearing is from the HDD and not one of your case, aux, or CPU fan motors or bearings giving out? A properly screwed-down HDD doesn't vibrate that much and typically isn't louder than the fans blowing full tilt.

Hybrid drives are also a decent alternative, depending on what else you use your machine for. I wouldn't go exclusively SSD on a new build; I'd keep a standard drive for backup/storage and put only the most-used programs on the SSD, to keep the writes down and ensure longevity. There are also options to line the case with foam to absorb vibration noise, if silent gaming is your concern. Fan noise is hard to avoid, though, since the air has to come from/go somewhere in order to be useful.

Potentially waiting it out for the next CPU architecture could be a smart move, and then worry about GPU options at that point in time. The X79 stuff is rather nice, though, if you can't wait and the price point is acceptable. And given that the AMD architecture out now is what the consoles are based on, a solid choice from them won't do you wrong either. There really isn't a one-size-fits-all answer here.
 

Metfanant

Member
did anyone else read between the lines and get the hint sorta confirming some of the stuff mort has been talking about regarding the Xbone's OS??...

Rubin is very PC about it...but making those comments about waiting on things from the OEMs on the OS front....the resources required...specifically mentioning voice chat...

also, as for those suggesting optimization is going to pull the gap closer: has there ever been a situation on consoles where, from Game A to its sequel (on the same platform), resolution was literally doubled??...i certainly can't think of one...

anything that frees up the resources for that kind of improvement on the Xbone is going to help the PS4 as well considering architecture...if the Xbone makes the jump to 1080/60 next time around, i can only imagine the assets for the PS4 version will be even better...
 
That would suggest to me that IW might have trouble ever hitting 1080p60fps on a COD this gen on XB1 as I highly doubt they'll utilize a different rendering method for one console over the other

That's it. Unless devs start doing fundamentally different renderers for the two consoles, which isn't likely to happen as the games would look very different. You can't get beyond the 32MB limitation without moving to DDR3, and no fancy software improvements from MS will ever change that simple fact: 32MB is not enough for a Full HD deferred renderer.

Renderers change, and there may be a new technique waiting in the wings we don't know about. But the general rule is that as the generation goes on we expect more eye candy, so I can't see a step backwards occurring.

What a difference 64MB of EDRAM would have made.

EDIT - actually... I'd need to do the math, but if you were to tile part of a 1080p deferred frame buffer with, say, 2xAA, how much would that eat into the DDR3 bandwidth? Significantly less than moving the whole frame back and forth. That may, MAY, be a viable option. You'd still be eating into DDR3 bandwidth, but maybe only a third as much as doing the full frame. That may be the only way around the 1080p issue (and it's not just 1080p60; devs will have trouble with 1080p30 too, you simply have more bandwidth to "waste" on copying framebuffers around at 30fps).
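A minimal sketch of that back-of-envelope tiling estimate, assuming a 20-bytes-per-pixel G-buffer and counting one write plus one read of the spilled portion per frame. Both figures are assumptions; real traffic depends on how many passes touch the buffer.

```python
# Tiling estimate: keep what fits of a 1080p G-buffer in ESRAM, and
# cost out the DDR3 traffic for the part that spills over.
# 20 bytes/pixel and "one write + one read per frame" are assumptions.

ESRAM_BYTES = 32 * 1024 * 1024
W, H, BPP = 1920, 1080, 20   # assumed G-buffer layout
FPS = 30

frame_bytes = W * H * BPP
spill_bytes = max(0, frame_bytes - ESRAM_BYTES)

full_copy_gbs = frame_bytes * 2 * FPS / 1e9   # whole frame back and forth
spill_gbs = spill_bytes * 2 * FPS / 1e9       # only the spilled tiles

print(f"G-buffer: {frame_bytes / 2**20:.1f} MB, "
      f"spill: {spill_bytes / 2**20:.1f} MB")
print(f"DDR3 traffic at {FPS} fps: full copy ~{full_copy_gbs:.2f} GB/s, "
      f"tiled spill ~{spill_gbs:.2f} GB/s")
```

Under these (optimistic, single-pass) assumptions, moving only the spilled tiles costs a fraction of copying the whole frame, which is the intuition behind the tiling idea; multiple passes per frame would scale both numbers up.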
 
A single frame at 1080p takes 7.91MB of the ESRAM (1920x1080x32 bits). Add in prefetched data, the additional maps needed for shadows and other effects, and AA, and the ESRAM gets exhausted quickly. 720p is the only answer.

Given that effects are going to get more abundant and complex, it is actually LESS likely we will see 1080p games on the Xbox One in the future.

People need to start facing reality.

Uhhh... the math on that is wrong. The idea you're trying to get at is correct, but the math is wrong.

1080p buffer with 12 bytes per pixel (Bpp) = 23.7MB
1080p buffer with 16 Bpp = 31.6MB
1080p buffer with 20 Bpp = 39.5MB
1080p buffer with 24 Bpp = 47.5MB (BF3 and KZ:SF both use this)
Etc.
The formula is ((Xres x Yres x bytes per pixel) / 1024) / 1024 = MB required for the frame buffer
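The table above in code form. Note the quoted sizes only come out right if "bpp" is read as bytes per pixel: a deferred G-buffer stacks several render targets, each typically 4 bytes per pixel.

```python
def framebuffer_mb(width, height, bytes_per_pixel):
    """Frame buffer size in MB (mebibytes): (X * Y * Bpp) / 1024 / 1024."""
    return width * height * bytes_per_pixel / 1024 / 1024

# Reproduce the figures quoted above (12/16/20/24 bytes per pixel).
for bpp in (12, 16, 20, 24):
    print(f"1080p at {bpp} bytes/pixel: "
          f"{framebuffer_mb(1920, 1080, bpp):.1f} MB")
```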
 

velociraptor

Junior Member
A single frame at 1080p takes 7.91MB of the ESRAM (1920x1080x32 bits). Add in prefetched data, the additional maps needed for shadows and other effects, and AA, and the ESRAM gets exhausted quickly. 720p is the only answer.

Given that effects are going to get more abundant and complex, it is actually LESS likely we will see 1080p games on the Xbox One in the future.

People need to start facing reality.
The fact the Xbox One only has 16 ROPs adds further misery to its resolution woes.
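For context on the ROP point, a rough peak fill-rate comparison. The ROP counts and clock speeds are the commonly reported figures for the two consoles; peak fill rate alone does not determine real-world resolution, since bandwidth, overdraw, MRT count, and AA all eat into it.

```python
# Peak fill-rate arithmetic, assuming one pixel per ROP per clock.
# 16 ROPs @ 853 MHz and 32 ROPs @ 800 MHz are the commonly reported
# console figures; real throughput is bandwidth-limited well below peak.

def peak_fill_gpix(rops, clock_mhz):
    """Peak fill rate in Gpixels/s."""
    return rops * clock_mhz * 1e6 / 1e9

xb1 = peak_fill_gpix(16, 853)   # Xbox One (reported figures)
ps4 = peak_fill_gpix(32, 800)   # PS4 (reported figures)

target = 1920 * 1080 * 60 / 1e9  # raw 1080p60 pixel rate, no overdraw
print(f"XB1 ~{xb1:.1f} Gpix/s, PS4 ~{ps4:.1f} Gpix/s; "
      f"1080p60 baseline {target:.3f} Gpix/s (before overdraw, MRTs, AA)")
```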
 

Timurse

Banned
Uhhh... the math on that is wrong. The idea you're trying to get at is correct, but the math is wrong.

1080p buffer with 12 bytes per pixel (Bpp) = 23.7MB
1080p buffer with 16 Bpp = 31.6MB
1080p buffer with 20 Bpp = 39.5MB
1080p buffer with 24 Bpp = 47.5MB (BF3 and KZ:SF both use this)
Etc.
The formula is ((Xres x Yres x bytes per pixel) / 1024) / 1024 = MB required for the frame buffer

Do I take it correctly that, as you use bits per pixel, your overall results are also measured in Mbits? So in order to compare these frame buffer sizes we have to go with /8, and 1080p@24bpp becomes 47.5/8 = 5.9 megabytes?
 

Winternet

Banned
Do I take it correctly that, as you use bits per pixel, your overall results are also measured in Mbits? So in order to compare these frame buffer sizes we have to go with /8, and 1080p@24bpp becomes 47.5/8 = 5.9 megabytes?
No.
Read his post again and slowly this time.
 
Uhhh... the Math on that is wrong. The idea you're trying to get at is correct, but the math is wrong.

1080p buffer with 12bits per pixel (bpp) = 23.7MB
1080p buffer with 16bpp = 31.6MB
1080p buffer with 20bpp 39.5MB
1080p buffer with 24bpp = 47.5MB (BF3 and KZ:SF both use this)
Etc.
The formula is ((Xres x Yres x BPP) / 1024) / 1024 / 8 = MB/s required for frame buffer

Are you using MSAA? You should be adding an extra division by 8 when going from bits to bytes, btw :p

Are we talking about one buffer, or a fat G-buffer for deferred rendering?
1080p = 1920 * 1080 * 4 = 8,294,400 bytes, which is 8,100 KB, which is about 7.91 MB for a 1080p buffer.

Not sure how you calculate it; I'm using 32 bits per pixel.
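Both calculations in this exchange are internally consistent; they just measure different things. One 32-bit render target is 4 bytes per pixel (~7.91 MB at 1080p), while a deferred G-buffer stacks several such targets. The six-target layout below is an assumed example chosen to match the 24-bytes-per-pixel figure quoted earlier.

```python
# One 32-bit render target vs a stacked deferred G-buffer.
# The six-target layout is an assumption for illustration.

def buffer_mb(width, height, bytes_per_pixel):
    """Buffer size in MB (mebibytes)."""
    return width * height * bytes_per_pixel / 1024 / 1024

single_rt = buffer_mb(1920, 1080, 4)      # one 32-bit target
gbuffer = buffer_mb(1920, 1080, 4 * 6)    # six 32-bit targets (assumed)

print(f"Single 32-bit 1080p buffer: {single_rt:.2f} MB")
print(f"Six-target 1080p G-buffer:  {gbuffer:.1f} MB")
```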
 

Spazznid

Member
Resolution >>> Framerate

Resolutions should be at least in the four digits.

In this day and age, you're absolutely right.

Like Sterling says, 4K is imminent, and our current/next gen having trouble with simple 1080p is pathetic. Use an adequate, up-to-date engine and you'd get better results, both framerate-wise and visually...
 

Zing

Banned
I think people would need to be willing to pay more for the console to get consistent frame rates at high resolution. It's mind boggling what they are giving us for $400 already.
 