
Doom's lead graphics programmer thinks 4K isn't a good use of Xbox Scorpio's power

While true for DX11, that hasn't generally been the case for DX12. IIRC, DX12 actually shares a lot in common with the XBO APIs and tools.



If MS doesn't use Vega and just uses an OC'd 480, they're stupid for waiting until fall of next year to launch.

Why? When is Neo planned to launch? That's the only competition they have to worry about really.
 

Timedog

good credit (by proxy)
So does everyone else, except that one guy at work who keeps referencing his TV any time anything even tangentially technology-related is brought up.
 

P44

Member
I think games like UC4 and Ratchet have shown that we can do 1080p quite well, it's just that 3rd party developers stink at optimization tbh lol.

Nah, there's so much room to grow even in the 1080p space - whilst 4K is cool, it's nothing like the 360p -> 1080p jump, not even close.
 
4K makes sense if:

A. You have literally nothing left to put GPU resources into, e.g. an older or less demanding game/console port.

B. You have the motion resolution to back up the huge static resolution, i.e. not a standard LCD and/or 30Hz.
 
4K makes sense if:

A. You have literally nothing left to put GPU resources into, e.g. an older or less demanding game/console port.

B. You have the motion resolution to back up the huge static resolution, i.e. not a standard LCD and/or 30Hz.

Now B I can fully agree with.

Motion resolution (and just how shitbad LCD screens are at displaying motion, especially at lower framerates) is often overlooked.

Kudos for mentioning it
 

bidguy

Banned
GPU TFLOPS change depending on the clock speed. The base model of the 480 on PC is 1266 MHz for instance, which means it's a 5.8 TFLOP card non-OC'd. So Scorpio IMO is likely using the 480-level GPU in its APU at ~1300 MHz.

It's not gonna be a 480, even DF said as much.

You think they'll deal with the issues of an OC'd GPU? They think it's gonna be some form of R9 Fury.
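
For what it's worth, the TFLOPS figure in that quote is just simple arithmetic: shader cores x 2 FLOPs per clock (FMA) x clock speed. A quick Python sketch of the math; the 2304-core count is the stock RX 480, and the ~1300 MHz clock is only the poster's assumption:

[CODE]
# Rough single-precision TFLOPS estimate: shader cores x 2 FLOPs per clock (FMA) x clock speed.
def tflops(shader_cores, clock_mhz):
    return shader_cores * 2 * clock_mhz * 1e6 / 1e12

print(round(tflops(2304, 1266), 2))  # stock RX 480 -> ~5.83 TFLOPS
print(round(tflops(2304, 1300), 2))  # assumed ~1300 MHz clock -> ~5.99 TFLOPS, roughly the 6 TFLOPS quoted for Scorpio
[/CODE]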
 

dogen

Member
Why are people saying this? 1080p + 4x MSAA is nowhere near the quality of 4K and doesn't cover nearly enough types of aliasing (no shader, no transparency, no texture). It's also very expensive in a deferred rendering engine so you may as well be increasing the resolution instead.

He's talking about sebbbi's (aka sebaaltonen in that Twitter thread) MSAA trick.
 
D

Deleted member 752119

Unconfirmed Member
Agreed. I couldn't care less about 4K personally. I'd rather the power go to effects, AI, etc.
 

drexplora

Member
Resolution makes a big difference to the entire image; it's not just better AA.
You're able to make out more of the finer details in the game, some that weren't even visible at 1080p. The LOD also seems to improve; you get a lot more detail in the distance that normally wouldn't make the cut.

I'm sure once we get used to 4k whenever that time comes, going back to 1080p will be an eyesore.

It makes a BIG difference.
 

hesido

Member
Why are people saying this? 1080p + 4x MSAA is nowhere near the quality of 4K and doesn't cover nearly enough types of aliasing (no shader, no transparency, no texture). It's also very expensive in a deferred rendering engine so you may as well be increasing the resolution instead.

We're saying this because it's an insanely beautiful way of utilizing the hardware. The trick in question is actually the 540p + 8xMSAA rendering trick, which gives the equivalent of 2xMSAA at 1080p rendering but with massive bandwidth reductions.

When used with 4xMSAA at 1080p, you could have native 4K renderings, and the AA could come from temporal tricks and/or in-frame post-processing.

The result would be almost indistinguishable from a native render, as Sebbbi says, as the shading and lighting is done at native resolution anyway.
sebbbi said:
Our MSAA trick does shading (lighting and post processing) at full resolution. It only saves performance on G-buffer overdraw. But for cases of heavy overdraw (such as foliage and trees) the G-buffer savings can be huge (up to 4x theoretical reduction in pixel shader invocations and back buffer bandwidth).

It just makes "business sense" to employ this technique for 4K.

Here's the post, and you can follow the links on that thread for more information:
GPU-driven rendering (SIGGRAPH 2015 follow up)
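
To make the sample-count arithmetic concrete (this is just my own illustration of the numbers, not sebbbi's actual code):

[CODE]
# Coverage-sample counts behind the MSAA trick: low-res G-buffer x MSAA factor vs. native rendering.
def samples(width, height, msaa):
    return width * height * msaa

print(samples(960, 540, 8))    # 540p + 8x MSAA  = 4,147,200 samples (2x the pixels of a plain 1080p frame)
print(samples(1920, 1080, 4))  # 1080p + 4x MSAA = 8,294,400 samples
print(samples(3840, 2160, 1))  # native 4K       = 8,294,400 pixels, i.e. the same count as 1080p + 4x MSAA
[/CODE]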
 
When you consider that an i7 6700K and a GTX 1080 (a 9 TFLOP GPU), which are already faster and more powerful than Scorpio, can't hit 4K/60fps, Scorpio has little chance of doing decent native 4K/60fps gaming without compromising on graphics.

Running games with better graphics and performance at 1080p and upscaling to 4K would definitely be a much better option now; then, when it's time for the next generation of hardware, native 4K/60fps gaming should be a more realistic option.
 

scently

Member
When you consider that an i7 6700K and a GTX 1080 (a 9 TFLOP GPU), which are already faster and more powerful than Scorpio, can't hit 4K/60fps, Scorpio has little chance of doing decent native 4K/60fps gaming without compromising on graphics.

Running games with better graphics and performance at 1080p and upscaling to 4K would definitely be a much better option now; then, when it's time for the next generation of hardware, native 4K/60fps gaming should be a more realistic option.

Running a game at "4K/60fps" on console is going to be a design choice and not entirely a matter of hardware. XB1 supposedly can't do 1080p/60fps, and yet Forza does it without dropping a single frame. Why? Because it was designed that way.

And saying that a 6 TFLOPS GPU can only do 4K native rendering with simplistic graphics is not right. The 390X is a PC part rated at 5.9 TFLOPS and does 4K/30fps with the best settings on a lot of recent games. On some you just have to drop a few settings down a bit.

I expect Forza 7 to run at 4K/60fps native. It's going to be a showcase title for MS.

And of course, as is being proposed in the tweets in the OP, there will be many clever ways to get 4K rendering without spending all of the system's resources doing it by brute force. What remains to be seen is how effective they will be. QB and R6 have achieved varying degrees of success using these techniques, and I'm sure we will see other implementations, for better or for worse.
 

Guymelef

Member
And saying that a 6 TFLOPS GPU can only do 4K native rendering with simplistic graphics is not right. The 390X is a PC part rated at 5.9 TFLOPS and does 4K/30fps with the best settings on a lot of recent games. On some you just have to drop a few settings down a bit.

People always say things like that while missing one important part of the equation: a few-hundred-dollar CPU.
 

nOoblet16

Member
Why are people saying this? 1080p + 4x MSAA is nowhere near the quality of 4K and doesn't cover nearly enough types of aliasing (no shader, no transparency, no texture). It's also very expensive in a deferred rendering engine so you may as well be increasing the resolution instead.

The MSAA trick only works in a deferred renderer; it is not compatible with a forward renderer at all, since the deferred G-buffer pass is the only place the trick applies.
And even with the high cost of MSAA it saves performance; after all, MSAA is just supersampling applied in a specific manner within a limited scope, so it will still be cheaper than actually rendering a full 4K image.
 

gatti-man

Member
There is a huge difference in visual quality between 1080p and even 1440p. People in this thread who are claiming the difference is negligible should list their display. It's night and day for me.

That being said, I'd prefer 1440p upconverted to 4K, or down to 1080p for those without 4K TVs. 1080p is a waste of 6 TF of power.

People always say things like that while missing one important part of the equation: a few-hundred-dollar CPU.

Far less CPU and GPU overhead on consoles. You get more, not less, for your power there.
 

amdb00mer

Member
Even home theater enthusiasts agree that 4K by itself isn't that much of an upgrade. It's the HDR and other things that come with it that make the difference worthwhile. I'd take 1080p/60fps with HDR over just the resolution upgrade 4K provides.

I noticed a difference. Netflix offers most of their original shows in 4K. I watched Marvel's Daredevil in 4K through my 4K LG TV. I then went back and watched part of one of the episodes in 1080p on my X1, and there most definitely is a difference. However, if you're not watching 4K video on your 4K TV and are just upscaling 1080p, then yes, there is little to no difference in that regard.
 

amdb00mer

Member
Or in the way that Quantum Break renders

Exactly. Forza Motorsport 6 Apex (PC), Halo 5, and Quantum Break use dynamic rendering engines that adjust resolution and some effects on the fly to maximize performance. Now just imagine this same approach applied to Scorpio. A game could run at 4K with decent performance, as the system would adjust 'settings' on the fly if the developer chooses.
What I would really like to see is MS offering gamers the option in single-player games to choose between running at 4K/30fps or 1080p/60fps. With the changes to the OS, it becoming more Win10-like, and UWP being used more and more, I think this is possible.
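
For anyone wondering what "adjusting resolution on the fly" looks like in practice, here's a minimal Python sketch of a frame-time-driven resolution controller. It's purely an illustration of the idea; the thresholds and scale steps are made up, and this isn't how Remedy or 343 actually implement it:

[CODE]
# Minimal dynamic-resolution controller: nudge the render scale so GPU frame time stays near the budget.
TARGET_MS = 33.3              # 30 fps budget; use 16.6 for a 60 fps target
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_resolution_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_MS:           # over budget: drop resolution
        scale *= 0.95
    elif last_frame_ms < TARGET_MS * 0.9:   # comfortably under budget: raise it back up
        scale *= 1.02
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# e.g. with a 3840x2160 output target, a scale of 0.75 means rendering at 2880x1620 and upscaling.
[/CODE]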
 

KageMaru

Member
You mean this standard?

"2K" and "4K" refer to the horizontal pixels. 2K would be roughly 2000 pixels. The 16:9 standard for 2K is 1920x1080. Double each dimension and you get 4K (3840x2160).

I do not understand where calling 2560x1440 "2K" came from... but it's silly. That's like calling 3440x1440 "3K." Which is equally silly.

Sorry I missed this the last time I checked out the thread. I just meant that I don't think using the vertical resolution to define a resolution term is accurate.

IMO, saying that 1080p and 2K are the same is no different from calling games with a sub-1920 horizontal resolution "1080p" games just because the vertical resolution is that high.

Sorry, I'm having difficulty finding a good way to put my point into words.
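
Just to put numbers on the resolutions being argued about (simple arithmetic, nothing more):

[CODE]
# Horizontal pixels and total pixel counts for the common 16:9 resolutions in this thread.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K UHD": (3840, 2160)}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w} horizontal, {w * h:,} pixels total")
# 1080p: 2,073,600 | 1440p: 3,686,400 (~1.8x 1080p) | 4K UHD: 8,294,400 (exactly 4x 1080p)
[/CODE]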
 

00ich

Member
Why are people saying this? 1080p + 4x MSAA is nowhere near the quality of 4K and doesn't cover nearly enough types of aliasing (no shader, no transparency, no texture).

The only anti-aliasing is the temporal component. It's not 1080p with AA; it's native 4K geometry rasterization with 1080p texture and shader sampling. There's no downsampling to 1080p and then back up to 4K.

It's also very expensive in a deferred rendering engine so you may as well be increasing the resolution instead.

Doesn't this always have to do with the resolved (downsampled) buffer? That wouldn't apply here.
 

amdb00mer

Member
Has there been a memory increase as well?

I think it will have a minimum of 12GB of RAM. There is no way they are going to be able to make these pushes with the same amount of RAM. However, the specs have not been finalized; hence the one-year wait for the release. I do think developers have been told what to expect at a minimum. Once the first dev kits go out, it will get leaked.
 

valkyre

Member
Anyone with a brain would realize that 4K is just for people who want to glue their face to a 60" TV...

A waste of resources that could be spent increasing graphical fidelity... and before some "smartass" comes out and says "4K is graphical fidelity", obviously I am talking about higher polycount and geometry, effects, shaders and the lot.

1080p is fine for 99% of the people out there. Make the games look even better instead of aiming at a ridiculous 4K "dick measuring contest".
 

Panajev2001a

GAF's Pleasant Genius
Anyone with a brain would realize that 4K is just for people who want to glue their face to a 60" TV...

A waste of resources that could be spent increasing graphical fidelity... and before some "smartass" comes out and says "4K is graphical fidelity", obviously I am talking about higher polycount and geometry, effects, shaders and the lot.

1080p is fine for 99% of the people out there. Make the games look even better instead of aiming at a ridiculous 4K "dick measuring contest".

4K is also about deep color and HDR-ready displays... those should be a game changer for real :).
 

jeffc919

Member
Anyone with a brain would realize that 4K is just for people who want to glue their face to a 60" TV...

A waste of resources that could be spent increasing graphical fidelity... and before some "smartass" comes out and says "4K is graphical fidelity", obviously I am talking about higher polycount and geometry, effects, shaders and the lot.

1080p is fine for 99% of the people out there. Make the games look even better instead of aiming at a ridiculous 4K "dick measuring contest".

Agreed, for most AAA games anyway, which is really all I care about. When it's all said and done though, I'll just be happy to have better looking, better performing games, however the devs decide to utilize the more powerful hardware.
 

scently

Member
People always say things like that while missing one important part of the equation: a few-hundred-dollar CPU.

My comment on the ability to do 4K/30fps is independent of needing a CPU, because you always need a CPU. Drawing a 4K screen falls almost entirely on the GPU. Stop going off on a tangent.
 