Doom's lead graphics programmer thinks 4K isn't a good use of Xbox Scorpio's power

BobLoblaw

Banned
He's right. 1080P and downsampling should be more than enough. Those plus better lighting and frame-rates should be the defining features.
 

Spladam

Member
Just stopped by to say I think he's right; I've been saying it since the announcement. How many people really have 4K displays? Wouldn't it be better if they just used the extra power to add frames and better effects to a 1080p experience? 1080p is beautiful, easy on the eyes, and smooth on the frames with that kind of horsepower. I've been waiting for this to become a discussion; I think Microsoft's focus on 4K was more of a buzzword PR move than a realistic roadmap.
 
Think of it this way: if it came down to it, would you prefer these visuals at 1080p:
[image: gzgdKvL.jpg]

Or these visuals at 4K:
[image: w1duufg.png]


I think it's fairly clear that the first image is far preferable, even if it sports a less sharp IQ.
 

pottuvoi

Banned
4K textures aren't a thing, unless you're literally saying every texture has a 4K resolution, which is silly.
Indeed, especially as 2K and 4K textures have been widely used in games for a long time.
Texture resolution is independent of the framebuffer size. (The good old "use a 4K texture for an arrow" case.)
One important thing that screenshots don't show is the huge increase in motion stability. Having way less pixel shimmering with movement is a big deal and adds to an overall sharp and clean looking IQ.
Yes.

Funnily enough, with a 4K buffer you can get roughly 1080p worth of stable detail in a moving picture. (The smallest detail ends up being about 2 pixels in the 4K buffer; that's an estimate, but it gives a lot better temporal stability than having the smallest detail be a single pixel.)


My hope is that some developers will try variable-rate shading via the MSAA trick, or similar methods, to keep shading cost near 1080p while outputting something close to a 4K buffer, where the sharpest regions would retain almost all of the detail of native 4K.
On top of that, games going for an HDR, filmic look could use a proper gaussian resolve for AA to get the most temporally stable image possible. (That reduces spatial resolution a bit, but it would look a lot better than normal 1080p.)

I'd also love to see a "native" 4K game that uses virtual texturing and texture-space shading to get really high quality results, especially if the game goes for a very static lighting environment, perhaps a dusty, low-specular world in an adventure game (e.g. The Dig).
With a slow-moving camera this could allow near-perfect image quality for HDR 4K. (A lot of the shading could be reused for a long time without changes, so the per-frame shading rate could drop well below 1080p.)
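To make the gaussian-resolve idea concrete, here's a minimal sketch in plain numpy/scipy, not anyone's engine code; the function name, the 2x downscale factor and the sigma value are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_resolve(buffer_hi: np.ndarray, sigma: float = 0.8) -> np.ndarray:
    """Gaussian-weighted resolve: low-pass the high-res buffer, then take every
    2nd sample, instead of the usual box average of each 2x2 block.
    Trades a little spatial sharpness for much less shimmer under motion."""
    blurred = gaussian_filter(buffer_hi, sigma=(sigma, sigma, 0))  # filter H and W, leave RGB alone
    return blurred[::2, ::2]  # e.g. 2160x3840x3 -> 1080x1920x3
```

The same idea applies whether the extra samples come from MSAA or from a higher-resolution buffer: a slightly wider, gaussian-weighted resolve kernel gives up a touch of sharpness for better temporal stability.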
 

AllenShrz

Member
4K and 60fps is not a matter of choosing, in my case.

I can't go back to 1080p; it doesn't compare to 4K.

Also, if I could, I would trade my current 4K TV for a 4K OLED. Those are glorious.
 
You are saying it hides blur with motion blur? ok.

But TSSAA does not cause blurring in motion; it can cause artifacting like dithering when in motion, but not blurring, and it certainly doesn't "hide it with motion blur"... there's no such thing as hiding something with motion blur. That particular statement has always bugged me. And again, 30fps has lower temporal resolution; unless you use motion blur, the gaps between frames are so easily visible that I can point them out individually.

Whatever Doom uses causes blurring when you move the camera. Same goes for The Division, Driveclub, Quantum Break and World of Tanks. Perhaps the temporal aspect doesn't have enough time to gather enough frames, or the sharpening filter doesn't kick in, but the image doesn't become sharp until it's perfectly still. Maybe you don't want to call it blurring, but at the very least the image does not remain sharp in motion, hence why good old higher resolution is very much applicable.

Motion blur hides artifacts and other graphical effects' shortcomings because of the strength of the effect. It deliberately smears the image to the point of you not being able to tell the difference between the effect itself and something else.
 
Indeed, especially as 2K and 4K textures have been widely used in games for a long time.
Texture resolution is independent of the framebuffer size. (The good old "use a 4K texture for an arrow" case.)

Yes.

Funnily enough, with a 4K buffer you can get roughly 1080p worth of stable detail in a moving picture. (The smallest detail ends up being about 2 pixels in the 4K buffer; that's an estimate, but it gives a lot better temporal stability than having the smallest detail be a single pixel.)


My hope is that some developers will try variable-rate shading via the MSAA trick, or similar methods, to keep shading cost near 1080p while outputting something close to a 4K buffer, where the sharpest regions would retain almost all of the detail of native 4K.
On top of that, games going for an HDR, filmic look could use a proper gaussian resolve for AA to get the most temporally stable image possible. (That reduces spatial resolution a bit, but it would look a lot better than normal 1080p.)

I'd also love to see a "native" 4K game that uses virtual texturing and texture-space shading to get really high quality results, especially if the game goes for a very static lighting environment, perhaps a dusty, low-specular world in an adventure game (e.g. The Dig).
With a slow-moving camera this could allow near-perfect image quality for HDR 4K. (A lot of the shading could be reused for a long time without changes, so the per-frame shading rate could drop well below 1080p.)

AoS uses texture space shading. It's not very bandwidth efficient.
 

Zaki2407

Member
Hmm... it makes me wonder: in a different universe where the only available TVs are CRTs with a 480p maximum resolution, but game consoles still have 6+ TFLOPs, what could devs do with all that power?
 

Jimrpg

Member
As someone running a 1440p/144hz screen... 1440p is taxing enough, and I honestly should have just bought a 1080p screen. A GTX 970 isn't really cutting it at this resolution, at least for things like The Witcher 3 and The Division, both of which I'm playing on medium settings and getting 60fps.

The Scorpio will be like a GTX 1070, but even then that will only get you 1440p/60 maxed, so be prepared for 4k games at medium/low settings.

It's going to take the GTX 1170, 1270 series to do 4k/60 maxed... that's still 1.5-3 years away, and at least one more console generation away. 2019 for proper 4k/60... by then people will be asking for 8k.
 
Consumer preference. Raising FPS is still more demanding than raising to 4k, especially at higher resolutions.

No, most definitely isn't. 1080p 60fps is easier on something as powerful as Scorpio than 4K 30fps. Far easier. I'm very happy to hear developers not falling into the 4K trap. This console's power will be far better utilized by not boxing your game into a 4K resolution requirement.
 
That image doesn't have to worry about things like aliasing. Because it's a photo of real life and real life doesn't have aliasing.

Aliasing comes in all forms. Something you need to take into account is the fact that real life also doesn't need to worry about LODs, mipmaps, etc. All of those trees, houses, and so on in the distance are still perfectly detailed. If that were a video game, those distant objects would all be very complex for something so small, and with their maximum-resolution textures you'd get some really nasty shimmering or "pixel crawl." The methods to reduce that are very expensive on the GPU (unless you want a blurry image). So if you want that sharp image, you either downsample from a larger resolution, or simply display the larger resolution natively.
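For anyone unfamiliar, the cheap end of that trade-off (the "blurry image" option) is mipmapping. A toy mip-chain builder, assuming a square power-of-two RGB image just for illustration, looks roughly like this; real pipelines do it on the GPU, often with better filters:

```python
import numpy as np

def build_mip_chain(image: np.ndarray) -> list:
    """Each level averages 2x2 texel blocks of the previous one, so distant,
    minified surfaces sample a pre-filtered texture instead of shimmering."""
    mips = [image.astype(np.float32)]
    while mips[-1].shape[0] > 1:
        prev = mips[-1]
        h, w = prev.shape[0] // 2, prev.shape[1] // 2
        next_mip = prev.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))
        mips.append(next_mip)
    return mips
```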

I agree.
But my question was: why the need for a 4K *display*? I would rather that power be thrown at downsampling, better AA, and a dozen other things including frame rate, or some mix of these.
Simply using it all to fill 4K at 60fps seems to miss an opportunity to increase quality for the 95% of current TV sets.
 

Jimrpg

Member
It stuck out to me that Andrew House didn't use 4K as a selling point in his last statement about the Neo, but instead said that games will "play an awful lot prettier."

It's possible he just didn't want to unveil a marketing pitch early, but that might have also been the feedback they got from developers once they got their hands on the hardware.



One thing worth considering about sticking with 1080p is that 4K is exactly four times as many pixels, so the scaling is much, much cleaner than going from 720p -> 1080p or some other uneven multiple. I suspect that's why the TV industry jumped on this resolution as well, since they knew most of the content would be 1080p, but wanted it to still look clean and good on new sets.

720p to 1080p is 2.25 times the number of pixels.

But 1080p to 1440p is 1.5 times, I believe.

Also, Andrew House absolutely knows that PS Neo won't do 4K; it's an R9 280X/380X-equivalent console. It's a jump from a 750 Ti to a 280X, so 1080p/30fps to 1080p/60fps... it's definitely not a jump to 4K.
 
No, most definitely isn't. 1080p 60fps is easier on something as powerful as Scorpio than 4K 30fps. Far easier. I'm very happy to hear developers not falling into the 4K trap. This console's power will be far better utilized by not boxing your game into a 4K resolution requirement.

1080@120 ~= 4k@30
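That equivalence checks out on raw pixel throughput, at least, ignoring per-frame costs that don't scale with resolution (geometry, CPU work, etc.):

```python
# Pixels pushed per second for each configuration.
print(1920 * 1080 * 120)  # 248,832,000 for 1080p @ 120fps
print(3840 * 2160 * 30)   # 248,832,000 for 4K @ 30fps
```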
 
Yeah, stick to 1080p. It's been kind of sad to see the XBO, and in some cases the PS4, struggling to play games at a resolution that's pretty much been a standard for almost a decade. That said, 4K TVs aren't that far from where 1080p sets are now in affordability, so why not just wait a few more years and release the "true 4K" consoles? With how things are going, that's probably coming by 2019/2020, with 8K then in the place 4K is now and 1080p was before it.
720p to 1080p is 2.25 times the number of pixels.

But 1080p to 1440p is 1.5 times, I believe.

Also, Andrew House absolutely knows that PS Neo won't do 4K; it's an R9 280X/380X-equivalent console. It's a jump from a 750 Ti to a 280X, so 1080p/30fps to 1080p/60fps... it's definitely not a jump to 4K.


1.8x (rounded up)
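For reference, the actual pixel-count ratios behind the figures being thrown around:

```python
res = {"720p": 1280 * 720, "1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
print(res["1080p"] / res["720p"])   # 2.25
print(res["1440p"] / res["1080p"])  # ~1.78, i.e. the "1.8x (rounded up)"
print(res["4K"] / res["1080p"])     # 4.0 exactly
```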
 
As someone running a 1440p/144hz screen... 1440p is taxing enough, and I honestly should have just bought a 1080p screen. A GTX 970 isn't really cutting it at this resolution, at least for things like The Witcher 3 and The Division, both of which I'm playing on medium settings and getting 60fps.

The Scorpio will be like a GTX 1070, but even then that will only get you 1440p/60 maxed, so be prepared for 4k games at medium/low settings.

It's going to take the GTX 1170, 1270 series to do 4k/60 maxed... that's still 1.5-3 years away, and at least one more console generation away. 2019 for proper 4k/60... by then people will be asking for 8k.

The Scorpio is the 480, which is about 40-50% slower than the 1070. It's a 1080p/60fps higher-settings card, or a 4K/30 card at medium/high.
 

mrklaw

MrArseFace
The Scorpio is the 480, which is about 40-50% slower than the 1070. It's a 1080p/60fps higher-settings card, or a 4K/30 card at medium/high.

Yeah. It's easy to get mixed up with AMD vs Nvidia flops. They use different methods to measure. Don't know why really.
 
Think of it this way: if it came down to it, would you prefer these visuals at 1080p:
[image]
Or these visuals at 4K:
[image]
I think it's fairly clear that the first image is far preferable, even if it sports a less sharp IQ.

it's not about still screenshots

high res makes things look much better in motion by making aliasing/flickering/etc. a lot less noticeable/distracting
 

artsi

Member
it's not about still screenshots

high res makes things look much better in motion by making aliasing/flickering/etc. a lot less noticeable/distracting

That's still just IQ though; some of us prefer things like screen-space reflections and other effects, nice lighting, good LOD without pop-in, etc. before better IQ/resolution than 1080p + AA.
 

Paz

Member
Makes sense not to target 4K when most people have a 1080P TV, and those that do have a 4K TV aren't guaranteed to have a setup where they sit at a reasonable distance and have a large enough size to get the full benefit (though this equation is far less specific and distinct than those completely bullshit image comparisons that get posted).

As someone with a 75" 4K TV that I sit a moderate distance from this makes me sad though, people who haven't run modern games on a setup like this at native 4K really have no idea just how big of an impact that resolution jump can make to both the IQ and fine detail that is invisible at 1080p, it's not at all what you would think from simple screen shot comparisons when in motion.

No right answer really, except to say that I think upping the resolution (of both effects like shadows/particles and native rendering) is the easiest thing for developers to do when dealing with multiple SKU's that feature largely different GPU performance so I expect that's what most games will feature, instead of being designed for 1080p with higher quality everything. The games will still have to run well on a 1.31 tflop XBox One after all.

Edit - Also, on the "Do want" post: I've never met an engine programmer who wasn't utterly thrilled at the prospect of having access to new and faster hardware; it's what they live for :) Sometimes, in the age of esoteric console hardware designs, they would live to regret that desire, but these days it's a pretty safe bet that a graphics programmer is gonna enjoy working on the engine for a 6 TFLOP x86 machine powered by an AMD APU.
 
Yeah. It's easy to get mixed up with AMD vs Nvidia flops. They use different methods to measure. Don't know why really.

Well, they're really not different in reality. A teraflop is a measure of theoretical computational performance; AMD flops and Nvidia flops are the same unit. Nvidia cards, at least on DX11, simply performed much better, so Nvidia flops were considered to be "worth more" than AMD flops, at least for DX11. At least, that's my understanding.
 

martino

Member
All this seems binary... why are most people thinking yes or no, and can't imagine all the variations possible between the two?
 

AzaK

Member
Makes sense not to target 4K when most people have a 1080P TV, and those that do have a 4K TV aren't guaranteed to have a setup where they sit at a reasonable distance and have a large enough size to get the full benefit (though this equation is far less specific and distinct than those completely bullshit image comparisons that get posted).

As someone with a 75" 4K TV that I sit a moderate distance from this makes me sad though, people who haven't run modern games on a setup like this at native 4K really have no idea just how big of an impact that resolution jump can make to both the IQ and fine detail that is invisible at 1080p, it's not at all what you would think from simple screen shot comparisons when in motion.

No right answer really, except to say that I think upping the resolution (of both effects and native rendering) is the easiest thing for developers to do when dealing with multiple SKU's that feature largely different GPU performance so I expect that's what most games will feature, instead of being designed for 1080p with higher quality everything. The games will still have to run well on a 1.31 tflop XBox One after all.



Sure, undoubtedly there's a difference; it's just that when you get 4x the power, do you want to waste it all on 4x the res? I would have thought that a higher poly count, higher-res textures, better AA and shaders would give a better result.

For years we watched movies on DVD at a "poxy" 480/576p. They looked amazing (if obviously not as sharp as HD), and that was because of the realism, i.e. the shaders, the poly count meaning smoother surfaces, and all that jazz.

If you say to me "I'll give you the Witcher as it stands in 4k" or "I'll give you a photorealistic Witcher at 1080 (Or even 480)" I know what I'd go for.
 

hesido

Member
All this seems binary... why are most people thinking yes or no, and can't imagine all the variations possible between the two?

This is a good way of thinking. I think the 1080p 4x MSAA-to-4K trick gives the best of both worlds; mix that up with a temporal component and you're smooth sailing with regards to IQ at 1080p or 4K.
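For what it's worth, the sample counts in that trick line up exactly; the difference is in how many of those samples actually get shaded, which is also the core of the objection raised later in the thread:

```python
# 1080p with 4x MSAA lays down as many coverage/depth samples as native 4K,
# but only shades a quarter as many pixels.
print(1920 * 1080 * 4)  # 8,294,400 MSAA samples at 1080p
print(3840 * 2160)      # 8,294,400 shaded pixels at native 4K
print(1920 * 1080)      # 2,073,600 shaded pixels at 1080p
```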
 

Paz

Member
Sure, undoubtedly there's a difference; it's just that when you get 4x the power, do you want to waste it all on 4x the res? I would have thought that a higher poly count, higher-res textures, better AA and shaders would give a better result.

For years we watched movies on DVD at a "poxy" 480/576p. They looked amazing (if obviously not as sharp as HD), and that was because of the realism, i.e. the shaders, the poly count meaning smoother surfaces, and all that jazz.

If you say to me "I'll give you the Witcher as it stands in 4k" or "I'll give you a photorealistic Witcher at 1080 (Or even 480)" I know what I'd go for.

I don't think it's that simple. If you play Uncharted 4 at 480p right now, you'll notice it doesn't look nearly as impressive as it does at 1080p, despite the ridiculously high quality assets/shaders/etc. at work. The same is true for 1080p > 4K, where a lot of things (particularly surfaces) can really come to life at the higher resolution.

Definitely a possibility that we're too early for that jump though, given both the hardware limitations we face and also the limited prevalence of 4K displays and viewing setups. It'll be interesting to see how developers trend over the next few years, I honestly don't know which way it will go (Higher detail 1080p vs trying to get closer to 4K native rendering).
 

LCGeek

formerly sane
No, most definitely isn't. 1080p 60fps is easier on something as powerful as Scorpio than 4K 30fps. Far easier. I'm very happy to hear developers not falling into the 4K trap. This console's power will be far better utilized by not boxing your game into a 4K resolution requirement.

Except if you use my standards, or use Lightboost (which needs 100fps minimum), 4K is absolutely not doable.

I was talking about fps and spending resources on them, and why pixels get preferred over 60fps: whether it's 4K/100fps or 4K/60fps, you aren't hitting those anywhere near as easily as 4K/30fps, which is certainly attainable.
 
It largely depends on how big a normalised pixel is on your 1080p display, and your larger 4k display.

Ultimately 4k probably won't be required unless you're sitting close to your huge TV, but at the same time 4k TVs tend to be larger than 1080p screens, so you'd probably want 1440p + great subpixel rendering in order to not be able to resolve any individual pixels in-game.
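A rough way to put numbers on that "depends on pixel size and distance" point, assuming the commonly quoted ~60 pixels-per-degree figure for 20/20 acuity as the threshold; the 55-inch / 8-foot setup below is just an example, and the helper is hypothetical:

```python
import math

def pixels_per_degree(h_res, diag_inches, dist_inches, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat 16:9 screen."""
    width = diag_inches * aspect[0] / math.hypot(*aspect)
    h_fov = 2 * math.degrees(math.atan(width / (2 * dist_inches)))
    return h_res / h_fov

print(pixels_per_degree(3840, 55, 96))  # 55" 4K set at ~8 feet: ~137 ppd
print(pixels_per_degree(1920, 55, 96))  # same seat at 1080p: ~68 ppd
```

Sit closer or go bigger and the 1080p number falls under 60 first, which is where a 4K panel actually starts earning its keep.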
 

pottuvoi

Banned
AoS uses texture space shading. It's not very bandwidth efficient.
Interestingly, they use it without virtual texturing and might have to do a lot of extra work because of that.
Also, the game doesn't really fit my description of a slow-moving camera, and it has huge amounts of fast-moving objects.

If a game has the right look for it, texture-space shading could be a big win, especially when rendering at 4K. (The rasterization pass is cheap, and the shading reuse is substantial; in the best case, background objects are shaded only once for the whole scene.)
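As a toy illustration of that reuse argument (nothing like real engine code, just the caching idea, with made-up names and a stand-in shading function), shading keyed to texels rather than screen pixels might look like:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def shade_texel(u, v, lighting_version):
    # Stand-in for an expensive lighting/BRDF evaluation, keyed by texel and a
    # version counter that only bumps when the (mostly static) lighting changes.
    return ((u * 31 + v * 17 + lighting_version) % 255) / 255.0

def draw_frame(visible_texels, lighting_version):
    # Per frame this is mostly cheap cache lookups; with a slow camera and
    # static lights, per-frame shading work can fall well below one evaluation
    # per output pixel, even when rasterizing to a 4K target.
    return [shade_texel(u, v, lighting_version) for (u, v) in visible_texels]
```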
 
I agree.
But my question was: why the need for a 4K *display*? I would rather that power be thrown at downsampling, better AA, and a dozen other things including frame rate, or some mix of these.
Simply using it all to fill 4K at 60fps seems to miss an opportunity to increase quality for the 95% of current TV sets.

Well downsampling is rendering the game at a higher resolution and then resizing to fit on whatever display you're looking at. It would be the same performance cost to output that resolution natively. But I'm assuming you mean downsampling from a lower res than 4K.

Who knows how many 4K displays there will be a year and a half from now. That's a long ways away. I still think having a 4K TV is silly since there's still a huge lack of 4K content and I'm not about to go replace my Blu-ray library with 4K versions just yet. But maybe I'll change my mind before 2018.

This is a good way of thinking. I think the 1080p 4x MSAA-to-4K trick gives the best of both worlds; mix that up with a temporal component and you're smooth sailing with regards to IQ at 1080p or 4K.

Why are people saying this? 1080p + 4x MSAA is nowhere near the quality of 4K and doesn't cover nearly enough types of aliasing (no shader, transparency, or texture aliasing coverage). It's also very expensive in a deferred rendering engine, so you may as well increase the resolution instead.
 

DOWN

Banned
The 4K spin from Microsoft came off as shitty buzz word marketing desperately trying to recover from their image regarding resolution this gen
 
I really struggle to take the pro 4K people seriously when they're almost certainly playing on a sample-and-hold display with atrocious motion resolution. Don't get me started on 2160p30.
 

enemy2k

Member
I am glad MS is giving the developers freedom to choose what they want to do with all dat extra power. Hope to see some amazing stuff next year.
 
I really struggle to take the pro 4K people seriously when they're almost certainly playing on a sample-and-hold display with atrocious motion resolution. Don't get me started on 2160p30.

I'm equally wary of people who've never seen native 4K in their life (usually people who think downsampling compares) and yet downplay the very obvious improvement in detail and clarity.

Alas, while many console developers will always opt for pushing prettier pixels at the cost of framerate, there are always a few who are willing to fight the good fight.

I think the best of both worlds lies in dynamic frame buffers (on console where you sit much farther away from the screen), and would like to see more developers focusing on solid performance instead.
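A dynamic frame buffer in that sense is essentially a small feedback loop on the render-target scale; a hedged sketch of the idea, where the gain and clamp values are made-up illustration numbers rather than anyone's shipping tuning:

```python
def update_render_scale(scale, frame_ms, target_ms=16.7):
    """Nudge the render-target scale so frame time tracks the 60fps budget."""
    error = (target_ms - frame_ms) / target_ms   # positive means headroom
    scale += 0.05 * error                        # small proportional step
    return max(0.5, min(1.0, scale))             # clamp between 50% and native res
```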
 

tapedeck

Do I win a prize for talking about my penis on the Internet???
I totally agree with him, and that was my first thought when I saw the 6TF figure... it's a big jump, but nowhere near enough to run 4K games with next-gen graphical upgrades and 60fps. Imagining what devs could do with that power at 1080p though... oh man.
 

III-V

Member
I think there is roughly a 0% chance devs will be targeting 4k native resolution with the upcoming consoles. They will, however, be able to output in 4K, and I expect the push will be scaling 1080P to 4K output.
 

leeh

Member
GPU Tflops change dependent on the clock speed. The base model of the 480 on PC is 1266mhz for instance, which means it's a 5.8 Tflop card non OC'd. So the Scorpio IMO is likely using the 480 level GPU in its APU at ~1300mhz.
From what we've seen of the 480, it's not a good overclocker. I'm presuming it's not the 480.
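The clock-to-teraflop arithmetic being quoted is easy to check, since GCN does two FP32 ops per clock per stream processor and the 480 has 2304 of them:

```python
def tflops(stream_processors, clock_mhz):
    # 2 FP32 ops (one fused multiply-add) per stream processor per clock.
    return stream_processors * 2 * clock_mhz / 1e6

print(tflops(2304, 1266))  # ~5.83 TF: stock RX 480
print(tflops(2304, 1300))  # ~5.99 TF: roughly the 6 TF figure at ~1300 MHz
```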
 

KageMaru

Member
Well, they're really not different in reality. A teraflop is a measure of theoretical computational performance; AMD flops and Nvidia flops are the same unit. Nvidia cards, at least on DX11, simply performed much better, so Nvidia flops were considered to be "worth more" than AMD flops, at least for DX11. At least, that's my understanding.

While true for DX11, that hasn't generally been the case for DX12. IIRC, DX12 actually shares a lot in common with the XBO APIs and tools.

GPU Tflops change dependent on the clock speed. The base model of the 480 on PC is 1266mhz for instance, which means it's a 5.8 Tflop card non OC'd. So the Scorpio IMO is likely using the 480 level GPU in its APU at ~1300mhz.

If MS doesn't use Vega and is just using an OC'd 480, they're stupid for waiting until fall of next year to launch.
 