
Why won't PC do checkerboard 4K? (it does)

LordofPwn

Member
Have we considered rendering a bayer pattern and then having a separate chip handle the debayering like how cameras work? So instead of rendering full 3840x2160 RGB for all pixels it renders half the green pixels and 1/4 red and 1/4 blue.

1920x2160 G
1920x1080 B
1920x1080 R

Might be enough saving to really enhance that cinema feel.
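For reference, the camera-side debayer is mostly just interpolating each missing channel from its neighbors. A bare-bones bilinear sketch (assuming an RGGB layout, borders skipped; a real ISP is far fancier):

    #include <cstdint>
    #include <vector>

    struct RGB { uint8_t r, g, b; };

    // Bilinear debayer of an RGGB mosaic: one sample per pixel in, RGB out.
    std::vector<RGB> debayer(const std::vector<uint8_t>& m, int w, int h) {
        std::vector<RGB> out(w * h);
        auto at = [&](int x, int y) { return (float)m[y * w + x]; };
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                bool evenRow = y % 2 == 0, evenCol = x % 2 == 0;
                // Averages of the 4 edge neighbors, the 4 diagonal
                // neighbors, and the horizontal/vertical pairs.
                float cross = (at(x-1,y) + at(x+1,y) + at(x,y-1) + at(x,y+1)) / 4;
                float diag  = (at(x-1,y-1) + at(x+1,y-1) + at(x-1,y+1) + at(x+1,y+1)) / 4;
                float horiz = (at(x-1,y) + at(x+1,y)) / 2;
                float vert  = (at(x,y-1) + at(x,y+1)) / 2;
                RGB p;
                if (evenRow && evenCol)        // red site
                    p = { (uint8_t)at(x,y), (uint8_t)cross, (uint8_t)diag };
                else if (!evenRow && !evenCol) // blue site
                    p = { (uint8_t)diag, (uint8_t)cross, (uint8_t)at(x,y) };
                else if (evenRow)              // green site on a red row
                    p = { (uint8_t)horiz, (uint8_t)at(x,y), (uint8_t)vert };
                else                           // green site on a blue row
                    p = { (uint8_t)vert, (uint8_t)at(x,y), (uint8_t)horiz };
                out[y * w + x] = p;
            }
        return out;
    }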
 

Thraktor

Member
Have we considered rendering a bayer pattern and then having a separate chip handle the debayering like how cameras work? So instead of rendering full 3840x2160 RGB for all pixels it renders half the green pixels and 1/4 red and 1/4 blue.

1920x2160 G
1920x1080 B
1920x1080 R

Might be enough saving to really enhance that cinema feel.

There'd be almost no performance improvement from doing this (at least on existing GPUs). With 32-bit arithmetic you can already accommodate 8-bit R, G, B & alpha in a single integer and operate on all four at once.
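Concretely, the packing looks like this (just an illustration, not any particular API):

    #include <cstdint>

    // An RGBA8 pixel already fits in one 32-bit word, so memory traffic is
    // per-pixel, not per-channel; rendering the channels separately would
    // save essentially nothing.
    uint32_t pack_rgba8(uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
        return (uint32_t)r | ((uint32_t)g << 8)
             | ((uint32_t)b << 16) | ((uint32_t)a << 24);
    }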
 

DSix

Banned
Hell, I would even be willing to checkerboard 1080p in the future when games become too much for my GPU.
 
Rainbow Six Siege offers a similar technique by default on PC, with pretty significant performance returns (although with softened image quality, natch). We'll see more I'm sure.
 
Your $1000 graphics card (or $700 or whatever) can do it? Wow, you must represent 0.5% of PC owners.

Getting asinine, are ye? I paid $600 CAD for it, back before Pascal was even a paper launch. Now you can get the same performance for much less with a 1070, and if Steam surveys are an indication, there's healthy adoption. At the least you'd have the real deal framebuffer, instead of a sacrificed image.
 

Renekton

Member
Getting asinine, are ye? I paid $600 CAD for it, back before Pascal was even a paper launch. Now you can get the same performance for much less with a 1070, and if Steam surveys are an indication, there's healthy adoption. At the least you'd have the real deal framebuffer, instead of a sacrificed image.
This native purist thing is misplaced elitism, since many techniques in graphics are some kind of approximation. If not, we'd have raytracing up the wazoo :)

Per DF's testing, the first GPU to consistently do 4K ultra at >60fps (ultra =/= max) on cutting-edge titles is the Titan Xp. Below that, you're either on less demanding titles (e.g. Overwatch), at a 30fps lock, or making lots of settings sacrifices. Being able to trade off between approximation artifacts and Ultra4K60 is just an option that does not undermine PC's lead on gaming graphics.
 
This native purist thing is misplaced elitism, since many techniques in graphics are some kind of approximation. If not, we'd have raytracing up the wazoo :)

Per DF's testing, the first GPU to consistently do 4K ultra at >60fps (ultra =/= max) on cutting-edge titles is the Titan Xp. Below that, you're either on less demanding titles (e.g. Overwatch), at a 30fps lock, or making lots of settings sacrifices. Being able to trade off between approximation artifacts and Ultra4K60 is just an option that does not undermine PC's lead on gaming graphics.

But this is a matter of resolution. You could say we've been 'half-assing' resolution with AA methods, but it's not like the real thing is out of reach. It's not even a matter of asking everyone to get a Titan X. The definition of 4K gaming itself is different between PC expectations and whatever is laid out for the PS4P. You argue options, but who's to say a 1440p image with generous (good) AA on PC would be lesser than the PlayStation's 4K? Especially if PC is leading in other graphical fields?
 

Renekton

Member
The definition of 4K gaming itself is different between PC expectations and whatever is laid out for the PS4P. You argue options, but who's to say a 1440p image with generous (good) AA on PC would be lesser than the PlayStation's 4K? Especially if PC is leading in other graphical fields?
The use case is understandable, where some people bought bang-for-buck HDR TVs and want some couch gaming. This option could exist alongside other tradeoffs like 30-lock and reduced settings, based on personal pref.
 

mhayze

Member
Have we considered rendering a bayer pattern and then having a separate chip handle the debayering like how cameras work? So instead of rendering full 3840x2160 RGB for all pixels it renders half the green pixels and 1/4 red and 1/4 blue.

1920x2160 G
1920x1080 B
1920x1080 R

Might be enough saving to really enhance that cinema feel.

Sadly rendering doesn't work that way.
 

Newline

Member
Isn't this similar to the technique MGS has used for absolutely ages, and on practically all devices? Dithering or something.
 

Durante

Member
Rainbow Six Siege offers a similar technique by default on PC, with pretty significant performance returns (although with softened image quality, natch). We'll see more I'm sure.

Exactly, and again this is what makes the thread title and much of the discussion in it so confusing.

A PC game with checkerboard rendering shipped in 2015.
September 2016 NeoGAF thread title: Why won't PC do checkerboard 4K?
 

UrbanRats

Member
Because it's a poor cousin to the real thing? My 980ti can do 4K30fps+ easily. Last gen games are an easy 4K60fps+, without the need for 'remasters'. No one's going to pay premium price for 'approximation'.
Options are bad?
It's good if you want to push other types of eye candy and still keep 60fps with better than average IQ, even with a powerful card.
Things like hair works, for example.
 
That's just silly.

If you have a 4K screen but lack the GPU grunt to get there then native is not possible. The checkerboard technique is hugely superior to just upscaling a 1080p image.

True, but why assume PC is upscaling from a 1080p image when even 480s and 1060s can push 1440p/30?

Also, on PC you can force custom resolutions. You could, for instance, run the full 2160 vertical resolution and cut your horizontal resolution in half.

I would like to see a DF article directly comparing Sony's reconstruction to different upscales like I mentioned.

I am also wondering if NVIDIA's multi-projection could be used to do reconstruction.
 
True, but why assume PC is upscaling from a 1080p image when even 480s and 1060s can push 1440p/30?

Also, on PC you can force custom resolutions. You could, for instance, run the full 2160 vertical resolution and cut your horizontal resolution in half.

I would like to see a DF article directly comparing Sony's reconstruction to different upscales like I mentioned.

I am also wondering if NVIDIA's multi-projection could be used to do reconstruction.

1440p looks like shit on 4k.
 

Stitch

Gold Member
Maybe there's something wrong with my PC but Siege looks terrible if I play it with temporal filtering. Every object has a weird dithering effect around it.

 
Definitely not 4k30fps at max settings.

Actually, my 1070, which is very close to the 980ti, does many games, including Rise of the TR, at 4K/30 max settings. Not that max settings matter; the PS4 Pro isn't using PC max settings.

1440p looks like shit on 4k.

Does it? I don't know from experience, but Durante said the PS4 Pro results were similar to 1440 upscaled. That's why I want to see comparisons.
 

McHuj

Member
I think it's only a matter of time until all high-resolution rendering moves to reconstruction-based techniques. It makes too much sense.

4K displays are going to become more and more common, and unfortunately I don't see the pace of silicon advancement being able to keep up. I don't think we'll be able to get Titan X performance in a 460/1050-level GPU in the next decade, so software will have to get much smarter in order to provide rendering improvements across all products, not just at the high tier.

The good news, I think, is that this will probably be a very hot and active research topic and I expect many advancements in terms of quality and performance.
 

Lister

Banned
1440p looks like shit on 4k.

If this comparison is accurate, 1440p on PC looks BETTER than the PS4 Pro's Checkerboard upscale:

Looks a ways off even PC at 1440p, with artifacts.

[Image: PS4 Pro (top), PC 1440p, PC 2160p]

Those artifacts are ugly, and the image is noticeably softer and lacking in detail compared to even PC 1440p. It looks like in motion it would be even worse, but it might be the opposite; I haven't seen it in motion.

PS: Right click and open in new tab to see full res shots.
 
If this comparison is accurate, 1440p on PC looks BETTER than the PS4 Pro's Checkerboard upscale:



Those artifacts are ugly, and the image is noticeably softer and lacking in detail compared to even PC 1440p. It looks like in motion it would be even worse, but it might be the opposite; I haven't seen it in motion.

PS: Right click and open in new tab to see full res shots.

Yeah the 1440 image is noticeably better. I am not great at catching subtle differences in a lot of comparison shots, but this one is not subtle. I still want to see more comparisons when the PS4 Pro releases, hopefully a DF article.

I wonder what a 1920x2160 scaled image would look like. That's a bit more pixels than 1440p (about 4.15M vs 3.69M), but not drastically so.
 

galv

Unconfirmed Member
Can someone explain to me why "checkerboard" upscaling is better than something like bilinear/bicubic upscaling or lanczos?

Is it only performance, or can you not use the latter in upscaling games?
 

Miker

Member
Maybe there's something wrong with my PC but Siege looks terrible if I play it with temporal filtering. Every object has a weird dithering effect around it.


Something's wrong there. I played w/ temporal filtering on my old GPU and I didn't have any of that.
 

Lister

Banned
Can someone explain to me why "checkerboard" upscaling is better than something like bilinear/bicubic upscaling or lanczos?

Is it only performance, or can you not use the latter in upscaling games?

I don't know, to be honest. Hopefully someone else can chime in, cause I too am a little confused.

The end result is worse than upscaling from 1440p, so I can't imagine that they are working with a 1440p image. From what I kinda understand, they are rendering something close to 1080p but over a spread-out grid of 4x4 pixel clusters. The scaler then alternates (I think) between the clusters on a frame-by-frame basis, and interpolates the in-between clusters using temporal data (previous frames).

That's my bare-bones, possibly incorrect interpretation of things, which would explain the artifacts, and why it's not as sharp as 1440p upscaled on PC.

But it doesn't explain why this would be any sort of performance hit on the PS4 Pro. If this is done in hardware, and you are essentially rendering 1080p at any one point, then why can't the PS4 run at 60 FPS and/or higher quality at this upscaled checkerboard 4K? They've said that the 1080p mode will include better graphics and/or frame rate; if my understanding is correct (above), then that shouldn't change just because of upscaling. Which leads me to believe my understanding is flawed :)

When the results are better by running at 1440p, I would imagine developers would just do that. With a 4.2 TFLOP GPU they should be able to pull off 1440p at current PS4 graphics settings without too much issue.

I'm confused, but definitely, the technique doesn't stand up even to 1440p upscaled on PC. Or at least doesn't appear to.
 
Can someone explain to me why "checkerboard" upscaling is better than something like bilinear/bicubic upscaling or lanczos?

Is it only performance, or can you not use the latter in upscaling games?

You are comparing apples to oranges here. Checkerboard rendering is all about the "weird" input format; it says absolutely nothing about how exactly the missing pixels are generated. You probably could use a bilinear-esque method for filling them, though I doubt that's what the PS4P does (or at least what it will do in most games, if the reconstruction is programmable).

Anyway, all the methods you mentioned, while they have their advantages, ignore a lot of information that already exists in a game's rendering pipeline. So while you can apply them, so can your monitor/TV. And even TVs nowadays use more complicated filters to try to preserve the shapes of things, so using something like (off the top of my head) the z-buffer to get better results, more efficiently than whatever the TV would be doing, sounds like a good idea.
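As a rough illustration of the difference, here's a toy reconstruction pass that fills the unshaded half of a checkerboard from the previous frame, falling back to spatial neighbors. Everything here (the names, the per-pixel validity mask) is hypothetical; real implementations reproject with motion vectors and depth, which is exactly the pipeline information a TV scaler never has:

    #include <cstdint>
    #include <vector>

    struct Color { float r, g, b; };

    // Each frame shades only pixels where (x + y + frame) is even; the rest
    // are filled from history, or from the four freshly shaded neighbors
    // when the history pixel has been flagged invalid (e.g. by a
    // motion/depth test, not shown).
    void reconstruct(std::vector<Color>& cur, const std::vector<Color>& prev,
                     const std::vector<uint8_t>& history_ok,
                     int w, int h, int frame) {
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                if ((x + y + frame) % 2 == 0) continue;  // shaded this frame
                int i = y * w + x;
                if (history_ok[i]) {
                    cur[i] = prev[i];                    // temporal fill
                } else {                                 // spatial fallback
                    const Color &l = cur[i-1], &r = cur[i+1],
                                &u = cur[i-w], &d = cur[i+w];
                    cur[i] = { (l.r + r.r + u.r + d.r) / 4,
                               (l.g + r.g + u.g + d.g) / 4,
                               (l.b + r.b + u.b + d.b) / 4 };
                }
            }
    }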
 

laxu

Member
I don't know if this was mentioned, but the upscaling on at least Nvidia GPUs is seriously bad quality. I have a 1440p, 144 Hz G-Sync display that does not have a built-in scaler. My previous display was a 2560x1600 Dell with a built-in scaler. The Dell's scaler, when running at, for example, 1080p upscaled to 1440p, did a much better job. I tried switching between GPU scaling and display scaling in the Nvidia control panel and compared the two using the same Dell monitor. I wish I had taken screenshots; the difference was noticeable.

Where the Dell's internal scaler was surprisingly sharp and clean looking, the GPU-scaling option was noticeably blurry. I don't know if the Dell applies a sharpening filter to the end result, but whatever it did ended up in a much more pleasing result. Of course, now I have a GPU that runs native 1440p all day long, but back then the situation was similar to what we have with 4K, where GPUs just about manage but are still not 100% able to do that resolution well.

I wish Nvidia gave us more options for upscaling, just like they offer a setting for the gaussian blur filter when downscaling from a higher resolution. Offering at least some choice of upscaling algorithm would be fine; failing that, they really need to just make it do a better job.

Likewise developers should just stop using FXAA. It desperately needs a sharpen post-process filter on top to not look like vaseline smeared on screen. SMAA does a much better job in general without much of a performance hit or blurring. Of course it's not perfect but it is still better. I am now playing Deus Ex: Mankind Divided and had to use ReShade to apply a more subtle sharpening because they went totally overboard with the built-in one, showing noticeable ringing.
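For reference, the sharpen pass described above is essentially an unsharp mask: subtract a blurred copy and add the difference back. A toy single-channel version (the name and parameters are made up; ReShade's LumaSharpen adds clamping controls that limit the ringing mentioned above):

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    // Unsharp mask: out = src + strength * (src - blur(src)).
    std::vector<uint8_t> sharpen(const std::vector<uint8_t>& src,
                                 int w, int h, float strength) {
        std::vector<uint8_t> dst(src);
        for (int y = 1; y < h - 1; ++y)
            for (int x = 1; x < w - 1; ++x) {
                int i = y * w + x;
                float blur = (src[i-1] + src[i+1] + src[i-w] + src[i+w]) / 4.0f;
                float v = src[i] + strength * (src[i] - blur);
                dst[i] = (uint8_t)std::clamp(v, 0.0f, 255.0f);
            }
        return dst;
    }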
 

Stitch

Gold Member
Something's wrong there. I played w/ temporal filtering on my old GPU and I didn't have any of that.

And here's the game with TAA enabled:



So I don't know what's going on but I'm playing it with temporal filtering disabled + FXAA. Otherwise I wouldn't know what the hell is happening.
 

epmode

Member
I don't know if this was mentioned, but the upscaling on at least Nvidia GPUs is seriously bad quality.

This drives me nuts.

I have a 1440p monitor. Logically, a 720p game should look razor-sharp/pixel-perfect on my monitor since it's a perfect multiple. One pixel in 720p should map to 4 pixels in 1440p... but it doesn't.

Nvidia's garbage upscaler blurs the image no matter what you do. The only way for 720p to look perfect at 1440p is to use your monitor's scaler. Unfortunately, my new G-Sync monitor doesn't have one so I'm screwed.

Nvidia should really update their drivers to provide more upscaling options.
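For what it's worth, the 'pixel-perfect' upscale being asked for is just nearest-neighbor at an integer factor: every 720p pixel becomes an unfiltered 2x2 block at 1440p. A minimal sketch (illustrative only, not anything the driver actually exposes):

    #include <cstdint>
    #include <vector>

    // Nearest-neighbor integer upscale; factor = 2 maps 1280x720 to 2560x1440.
    std::vector<uint32_t> integer_upscale(const std::vector<uint32_t>& src,
                                          int w, int h, int factor) {
        std::vector<uint32_t> dst((size_t)w * factor * h * factor);
        for (int y = 0; y < h * factor; ++y)
            for (int x = 0; x < w * factor; ++x)
                dst[(size_t)y * w * factor + x] = src[(y / factor) * w + (x / factor)];
        return dst;
    }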
 

galv

Unconfirmed Member
This drives me nuts.

I have a 1440p monitor. Logically, a 720p game should look razor-sharp/pixel-perfect on my monitor since it's a perfect multiple. One pixel in 720p should map to 4 pixels in 1440p. ..but it doesn't.

Nvidia's garbage upscaler blurs the image no matter what you do. The only way for 720p to look perfect at 1440p is to use your monitor's scaler. Unfortunately, my new G-Sync monitor doesn't have one so I'm screwed.

Nvidia should really update their drivers to provide more upscaling options.

I believe in Nvidia Inspector, there's an option to use higher quality upscaling, no idea if it actually works though...
 
And here's the game with TAA enabled:



So I don't know what's going on but I'm playing it with temporal filtering disabled + FXAA. Otherwise I wouldn't know what the hell is happening.

Something definitely appears wrong there. Even with Temporal Filtering disabled there still appears to be major dithering happening in the image.
 

epmode

Member
I believe in Nvidia Inspector, there's an option to use higher quality upscaling, no idea if it actually works though...

You mean "Nvidia Quality upcaling"? I just tried it but it didn't make much of a difference. Nothing like a pixel perfect upscale.
 

galv

Unconfirmed Member
You mean "Nvidia Quality upcaling"? I just tried it but it didn't make much of a difference. Nothing like a pixel perfect upscale.

Yeah, that's the one. I don't know if you'd be able to find compatibility bits elsewhere either... for now, PC seems to be limited to either a display with good upscaling or settling for NVIDIA's thing.
 
Someone finally did the 1440p comparison I was hoping for (didn't have Rise to do it myself).

I wasn't expecting it to be that noticeable... I imagine static scenes will look better using checkerboard, and in-motion will look worse than 1440p.

Honestly, on my 4K TV I more often play newer games in 1440p, because the difference between 1080p and 1440p is huge, and 1440p to 2160p not so much. At least at my current ratio of display size to viewing distance.
 

shark sandwich

tenuously links anime, pedophile and incels
Because it's a poor cousin to the real thing? My 980ti can do 4K30fps+ easily. Last gen games are an easy 4K60fps+, without the need for 'remasters'. No one's going to pay premium price for 'approximation'.

I own a 4K monitor and a GTX 1080. I'd definitely be interested in something like checkerboard rendering. It would be a good compromise in order to hit a higher frame rate and still get a better IQ than traditional upscaling. I would like the option.

I really don't understand the controversy here. Might as well say you don't want different AA settings or texture resolutions.
 

bonej

Member
Would be awesome if Nvidia or AMD could bake this into hardware via an ASIC, or use async compute for it. Also, since it uses temporal reconstruction, it should work better with more fps. The problem is that the absolute cost of this method in ms always stays the same, which means it eats a higher percentage of the frame time at higher fps.
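To put rough numbers on that (the 2 ms pass cost is made up):

    // A fixed 2 ms reconstruction pass is ~6% of a 30fps frame (33.3 ms),
    // ~12% of a 60fps frame (16.7 ms), and ~24% of a 120fps frame (8.3 ms).
    double pass_share(double pass_ms, double fps) {
        return pass_ms / (1000.0 / fps);  // fraction of the frame budget
    }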
 

buenoblue

Member
Yeah, I was wondering why Sony don't use their VR reprojection tech on normal PS4 games. Apparently in PSVR it takes 60fps and makes it 120 for smoother VR. If they could reproject 30 to 60 on normal PS4 games, that would be awesome.
 

Miker

Member
And here's the game with TAA enabled:



So I don't know what's going on but I'm playing it with temporal filtering disabled + FXAA. Otherwise I wouldn't know what the hell is happening.

Yeah that's still messed up. Dunno what to tell you though.
 
The use case is understandable, where some people bought bang-for-buck HDR TVs and want some couch gaming. This option could exist alongside other tradeoffs like 30-lock and reduced settings, based on personal pref.


Options are bad?
It's good if you want to push other types of eye candy and still keep 60fps with better than average IQ, even with a powerful card.
Things like hair works, for example.

I just believe this is a false option. There are better ways to improve image quality (see edge-based AA), and reprojection does little to resolve texel representation. I'm under the strict belief that 1440p+, even without any AA, is better than the examples we've been shown so far. As for performance, you get what you pay for.

Definitely not 4k30fps at max settings.

I just did the RotTR bench with all settings maxed (no AA, natch) at 4K. The average is over 30, and even the Geothermal section, the lowest dip, stays at 28+fps.
 

Momentary

Banned
I'm running 4K RotTR @ 67fps maxed with AA on 1 card.

Here's Mad Max @ 5K running at 79 FPS in this scene MAXed out on 1 card. Action scenes with enemies and explosions bring it down to 65FPS.


Volta will bring an affordable 4K/60fps solution for consumers, especially if they choose to run their games at console fidelity settings. The same can be said for cards now: if people run their games at settings comparable to consoles on their 980tis/1060s/1070s, they will still obtain better picture quality with better performance.

I really don't see the reason for NVIDIA or AMD to focus resources on implementing this on platforms where hardware is dynamic. This technique can already be used by the individual developers anyway on PC. It would be viable for people with older cards or for people with new cards to run their games at resolutions like 5K/6K/7K/8K or what have you.

I'd rather other cool pieces of technology like Ansel and wide POV/Multi monitor correction be implemented.
 

Momentary

Banned
What card does that?

It's a Titan XP and it's overclocked. More than likely the XX60-series cards from Volta will bring the same performance at a very modest price for most PC gamers. I feel 4K gaming will be more than doable for the masses in the next 2 years. When I say 4K, I mean at high or medium/high settings.

I don't know what the deal is with people trying to set the baseline for 4K@60 gaming by using max settings.
 

Momentary

Banned
Which AA?

I believe I just had FXAA when I was benchmarking. I shouldn't have even had it on @ 4K, but I was trying to compare against other benchmarks that used similar settings. I did have Pure Hair on Very High while most benchmarks just have it set to "ON".
 