
Why won't PC do checkerboard 4K? (it does)

twobear

sputum-flecked apoplexy
They've patented one (hardware) implementation, and we have no idea what the general quality of that implementation is compared with others, so it's a bit meaningless to say 'nobody else can do it because patent'.
 

Durante

Member
That patent isn't the checkerboard method. (Nor does it affect any of the usual tricks for getting higher resolution on GPUs: the MSAA trick, checkerboard, temporal reconstruction, etc.)

One implementation of checkerboard rendering is already used in a game on PC and gives a >30% improvement in performance.
http://www.geforce.com/whats-new/guides/tom-clancys-rainbow-six-siege-graphics-and-performance-guide

Nope.

http://www.gdcvault.com/play/1022990/Rendering-Rainbow-Six-Siege
Page 44 forward.

Actually, if developers want artifact-free rendering at 4K with near-1080p shading cost, the basic 4xMSAA trick might be a good choice.
It gives perfect edge resolution with no temporal artifacts. (Texture resolution/shading is 1080p.)
Without any tweaking, in areas where there are no polygon edges the basic version would give a result identical to the 1080p 2x nearest-neighbor scaling you suggested.
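As a rough illustration of the sample math behind that trick (just a sketch for checking the numbers, not from the thread; the actual resolve step is engine-specific), here is a quick Python calculation:

full_4k_samples = 3840 * 2160       # native 4K: one shaded and one depth/coverage sample per pixel
trick_pixels    = 1920 * 1080       # 1080p render target used by the 4xMSAA trick
trick_coverage  = trick_pixels * 4  # 4x MSAA: four depth/coverage samples per pixel
trick_shaded    = trick_pixels      # ordinary MSAA shades once per pixel

print(trick_coverage == full_4k_samples)  # True: edge/geometry sampling matches native 4K
print(trick_shaded / full_4k_samples)     # 0.25: shading cost stays at roughly 1080p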
Thanks for the links.

They've patented one (hardware) implementation, and we have no idea what the general quality of that implementation is compared with others, so it's a bit meaningless to say 'nobody else can do it because patent'.
It's even more meaningless since people have already done it. On PC in fact!
 

Lister

Banned
On my 3440x1440 monitor, upscaling from around 90% resolution looks good, though at the distance I sit from my monitor there is a noticeable difference, which only gets more noticeable the further I drop below 90%.

Sitting 6 feet from say a 50 inch 4K TV would probably make 90% or even 80% resolution just fine.

I'll probably be doing that on PC vs lower rez reconstruction once I buy a 4K TV.
 

pottuvoi

Banned
Wait... Does the PS4 Pro have the reality creation feature that my Sony Bravia already has?

Is that how it does 4k?
No.
The scaling on the Bravia series works by taking a series of input images and then scaling them up to 4K.

Games use many different methods of resolving/rendering into a higher resolution. (Usually by changing how the rendering itself is done in some way, not just working on the final images.)
Read the presentations linked in these threads if you are interested in the specifics.
 

Wavebossa

Member
Yeah, so in the end I doubt this is even worth bringing out. Consoles and PCs are vastly different beasts. While the PS4 Pro might be in consumers' hands anywhere from 2-3 years depending on Sony's PS5 timeline, PCs upgrade according to their owners' will and that alone. 4K monitor adoption stands at under 2% of Steam users right now. What's the point in working on and releasing such tech in a driver profile for the current cards out there, when the people running a 4K monitor are likely already using the highest-end cards and getting acceptable framerates anyway? By the time GPU hardware catches up to 4K at a midrange price point (presumably next year) and people start buying 4K monitors, the tech is pointless except for legacy cards, whose owners are probably just going to stick to 1080p anyway until they upgrade again.

There's nothing wrong with having more options though
 

jwhit28

Member
Is the checkerboard effect really that much better than native 1440p? Sony has to make people say and hear 4K for marketing purposes, whether it's native or not.
 
No.
The scaling on the Bravia series works by taking a series of input images and then scaling them up to 4K.

Games use many different methods of resolving/rendering into a higher resolution. (Usually by changing how the rendering itself is done in some way, not just working on the final images.)
Read the presentations linked in these threads if you are interested in the specifics.
I just read the checkerboard explanation.

How is that different in practice from Reality Creation on a Bravia? It separates each 1080p pixel out in a 1:4 ratio like a checkerboard and then fills in the color gradients corresponding to the change in the surrounding pixels. I'm not talking about its normal scaler. The Reality Creation feature turns each 1080p pixel into four pixels of different gradients corresponding to the surrounding pixels.

What am I not understanding? Because it looks like another method of doing the exact same thing.

The only difference I'm seeing is that one is happening before the framebuffer and the other is happening slower on a less powerful secondary device.
 

belmonkey

Member
This isn't something to be implemented by either IHV. The games must support it in their engines.

Surely there's something, even a similar type of method, that could be used anywhere via software? Maybe kind of like how DSR/VSR can work in any game while applying a filter along with it. I dunno.

Would simply upscaling from 1440p-1800p to 4k provide a similar level of clarity?
 

petran79

Banned
Couldn't they make the PS4 Pro downscale to 1080p like PC GPUs do, or are the PSU and hardware insufficient?
 

leandrro

Neo Member
Ain't that just post-processing? It seems possible as a post-processing filter to be used in every game, like with SweetFX.

I'd also bet that 4K extrapolated to 5K, 6K, whatever looks better on a 4K monitor than native 4K, like a Pixar movie looks better in 720p than a 720p game.
 

Lylo

Member
With the PS4 Pro doing upscaled 4K* using the checkerboard technique, I'm wondering why PC GPUs don't start doing this? This would give us consistent 4K at >60fps Ultra without needing a Titan Xp.

You answered your own question: Nvidia wants you to buy a Titan Xp.
 
I'm a bit iffy on secret-sauce-style gimmicks until we see some actual results from it.

That being said, if it does result in an actual increase in IQ, then sure, why not? Although you really can't beat native.
 

RedSwirl

Junior Member
The 1070 can do this at 4K:

Rise of the Tomb Raider / very high quality - 39fps

Dirt Rally / Ultra - 53fps

Ashes of the Singularity / extreme - 41fps

Battlefield 4 / Ultra - 60fps

Crysis 3 / Very high + FXAA - 28fps

The Witcher 3 / Ultra - 28-35fps

The Division / Ultra - 32fps

GTA V / Very High - 22-29fps

Hitman / Ultra - 40fps

Source: http://www.anandtech.com/bench/product/1731

Yeah, so in the end I doubt this is even worth bringing out. Consoles and PCs are vastly different beasts. While the PS4 Pro might be in consumers' hands anywhere from 2-3 years depending on Sony's PS5 timeline, PCs upgrade according to their owners' will and that alone. 4K monitor adoption stands at under 2% of Steam users right now. What's the point in working on and releasing such tech in a driver profile for the current cards out there, when the people running a 4K monitor are likely already using the highest-end cards and getting acceptable framerates anyway? By the time GPU hardware catches up to 4K at a midrange price point (presumably next year) and people start buying 4K monitors, the tech is pointless except for legacy cards, whose owners are probably just going to stick to 1080p anyway until they upgrade again.

Kinda this.

A point that needs to be brought out is that these 4K PS4 Pro games are still mostly running at 30fps with "console" graphics settings.

When people say it's still really tough for PC gaming to do native 4K, what they really mean is native 4K and 60fps at ultra settings for recent games. If you're willing to settle for 30fps and/or slide down some settings, playable native 4K is definitely within reach of modern graphics cards, even Maxwell cards probably. I remember someone in a thread a few months ago saying they could play GTA V at 4K on a 970 or 980 with mostly medium/high settings.
 

pa22word

Member
Kinda this.

A point that needs to be brought out is that these 4K PS4 Pro games are still mostly running at 30fps with "console" graphics settings.

When people say it's still really tough for PC gaming to do native 4K, what they really mean is native 4K and 60fps at ultra settings for recent games. If you're willing to settle for 30fps and/or slide down some settings, playable native 4K is definitely within reach of modern graphics cards, even Maxwell cards probably. I remember someone in a thread a few months ago saying they could play GTA V at 4K on a 970 or 980 with mostly medium/high settings.


I can play every game I've tried at 4K on my 980 Ti, so yeah, Maxwell is right up there too.
 

mieumieu

Member
I don't think you can do it completely in hardware, like a post-process filter on the final image, because the programmers need to set the exact point at which reconstruction is applied. Some post-processing (like depth of field) would break if checkerboard reconstruction were applied after it.
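A minimal Python sketch of that ordering point (the stage names here are made up for illustration, not taken from any real engine):

def render_checker_half(scene, frame_idx):
    # The engine shades only half the pixels, in a checker pattern that
    # alternates each frame.
    ...

def reconstruct_checkerboard(curr_half, prev_half):
    # Rebuilds a full-resolution frame from the two interleaved halves.
    ...

def depth_of_field(full_frame):
    # Post effect that assumes a complete full-resolution image.
    ...

def render_frame(scene, frame_idx, prev_half):
    curr_half = render_checker_half(scene, frame_idx)
    full = reconstruct_checkerboard(curr_half, prev_half)  # must run first...
    return depth_of_field(full), curr_half                 # ...so DoF never sees the checker pattern

Swapping the last two calls is exactly the 'apply it to the final image' case being argued against: the depth of field pass would then blur a half-filled checker image.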
 
Nah, go native or go home. Don't want to end up with another Quantum Break mess.
Options are always good, no? That is what PC gaming is all about, right? Not all PC gamers are die-hard "PC Master Race" types. I game on a 50 inch 1080p plasma with my ITX PC (newly upgraded to a GTX 1060) and it will stay there. I am not a desktop PC gamer anymore.

I won't go to a 4K TV for maybe a few more years, but I would love this technique or similar techniques (programmed properly) within games, to be able to have 4K on lower-end graphics cards that may not handle 4K very well at acceptable framerates.

Also, it's up to the programmers to implement these techniques properly, or it will run poorly/look poor regardless.
 
Options are always good, no? That is what PC gaming is all about, right? Not all PC gamers are die-hard "PC Master Race" types.
This is my opinion too. PC gaming to me has always been about flexibility, not spending hundreds or thousands annually to brute force performance (at least we don't need boot floppies any more!).

If different resolutions and anti-aliasing methods are available to tweak appearance vs performance, why not this? I'd love to be able to buy a 4K screen and have my otherwise-capable GTX 670 render at 1080p up-whatevered to 4K.
 

pottuvoi

Banned
I just read the checkerboard explanation.

How is that different in practice from Reality Creation on a Bravia? It separates each 1080p pixel out in a 1:4 ratio like a checkerboard and then fills in the color gradients corresponding to the change in the surrounding pixels. I'm not talking about its normal scaler. The Reality Creation feature turns each 1080p pixel into four pixels of different gradients corresponding to the surrounding pixels.

What am I not understanding? Because it looks like another method of doing the exact same thing.

The only difference I'm seeing is that one is happening before the framebuffer and the other is happening slower on a less powerful secondary device.
Half of the pixels are rendered in each frame, and the other half in the next frame.
The result without motion is pretty close to a perfect full frame. (It can resolve single-pixel detail at 4K.)

On the TV, as you said, the input is 1080p, which is half of the data fed each frame into the checkerboard reconstruction in that paper.
Also, the game can jitter/change its sample positions each frame, while the TV signal stays the same, so those additional sampling possibilities aren't there.
Those new scalers are amazing though; they do a feature search against a huge array of images and try to hallucinate the correct result. (Most likely getting pretty close guesses.)
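A toy numpy sketch of that alternating-halves idea (illustration only, not the Pro's actual resolve): with a static image, interleaving two consecutive checker frames recovers every pixel of the full-resolution frame.

import numpy as np

rng = np.random.default_rng(0)
full = rng.random((8, 8))                      # stand-in for a native-resolution frame

parity = np.add.outer(np.arange(8), np.arange(8)) % 2
frame_a = np.where(parity == 0, full, np.nan)  # half the pixels, frame N
frame_b = np.where(parity == 1, full, np.nan)  # the other half, frame N+1

merged = np.where(np.isnan(frame_a), frame_b, frame_a)
print(np.allclose(merged, full))               # True: nothing moved, so nothing was lost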
 

Momentary

Banned
I bench Rise of the Tomb Raider around 67 FPS maxed out @ 4K on a single card. But the average consumer will not want to pay that kind of money and it's clearly understandable. The PS4 Pro is pretty damn neat in terms of what it's offering for people and at a more than reasonable price point.
 

Mohasus

Member
What is this that Rainbow Six Siege does? I'm getting native 4K when I set it to that, right?

Not if you enable temporal filtering.

"Temporal Filtering renders the game at half-resolution with 2x MSAA. In other words, a 1920x1080 picture is rendered at 960x540, and 2x MSAA is applied to smooth out the now-rougher edges.

As a result, there are the same number of depth samples as the full-resolution 1920x1080 picture, but only a quarter of the shaded samples, improving performance greatly, but also decreasing image quality. This manifests as a reduction in the quality and visibility of Ambient Occlusion shadowing, increased shader aliasing, decreased lighting and shading fidelity, and a loss of fidelity on smaller game elements, such as leaves, grass, visual effects and minute pieces of geometry."
http://www.geforce.com/whats-new/gu...e#tom-clancys-rainbow-six-siege-anti-aliasing
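For what it's worth, the shading arithmetic behind that quote works out to a straight quarter (a quick Python check, nothing more):

full_res_shaded = 1920 * 1080   # shaded samples for the full-resolution picture
half_res_shaded = 960 * 540     # shaded samples with Temporal Filtering enabled
print(half_res_shaded / full_res_shaded)  # 0.25 -> "only a quarter of the shaded samples"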
 

mrklaw

MrArseFace
This talk of PCs doing it 'properly' is silly. Games are all about smoke and mirrors; you get great results by cutting as many corners as you can. This is another option to increase perceived detail and image quality for less computational cost than doing it 'properly'. As with things like lower-resolution reflections, shadows, etc., you accept a minor trade-off for the overall benefit.

And clearly it would be an option - nobody would be forced to use it
 
Also, this. Single solution GPUs for 4K / 144Hz / G-Sync will be arriving soon, I can literally taste it.
4K and 144 Hz?

If "soon" is like 4 years away, I guess so... lol.

I hope you actually meant 4K or 144 Hz, lol. That would be much sooner and is already partly here.
 

Woo-Fu

Banned
There is a massive shortage of the type of partial cloud processing gaming implementations MS was trying to market 3 years ago though.

Because it doesn't fit the typical business model. Who wants to keep paying for Azure after you've made the bulk of your game sales? Nobody who wants to make money---unless they're trying to promote Azure.
 
Okay, so I don't have anything against this solution, but here's my take on this whole thing. Right now, you can easily play most games at 4K/locked 30 (very high/ultra) on a GTX 1070, which costs $380 or something. Hell, you can play a lot of the games (the slightly older ones) at 4K/60 with some settings turned down. Considering how quickly these things drop in price, I'd imagine we'd be seeing this same GPU horsepower at a sub-$250 price point by next summer/fall. While reconstruction methods can definitely be nifty, I'm just not a fan of them. We're really not that far off from affordable 4K/30 and high-end 4K/60.
 

dogen

Member
I just read the checkerboard explanation.

How is that different in practice from Reality Creation on a Bravia? It separates each 1080p pixel out in a 1:4 ratio like a checkerboard and then fills in the color gradients corresponding to the change in the surrounding pixels. I'm not talking about its normal scaler. The Reality Creation feature turns each 1080p pixel into four pixels of different gradients corresponding to the surrounding pixels.

What am I not understanding? Because it looks like another method of doing the exact same thing.

The only difference I'm seeing is that one is happening before the framebuffer and the other is happening slower on a less powerful secondary device.

This alternates the rendered checkerboard pattern between frames and uses information from the previous frame to construct the current full-resolution frame.

So, if nothing is moving on screen, you're going to get 100% of the data of a full native 4K frame. Even in motion you'll still be reconstructing a lot more of that data than a simple upscaled image would have.
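A rough Python sketch of that "reuse the previous frame where it still fits" idea (purely illustrative; real implementations rely on motion vectors and smarter rejection than this):

import numpy as np

def reconstruct(curr, curr_mask, prev, threshold=0.2):
    # curr: this frame with only its checker half shaded (zeros elsewhere)
    # curr_mask: True where curr was shaded this frame
    # prev: the previous frame's full-resolution result
    # In a checker pattern the up/down/left/right neighbours of a missing pixel
    # were all shaded this frame, so a 4-neighbour average is a cheap spatial fallback.
    pad = np.pad(curr, 1, mode='edge')
    neighbours = (pad[:-2, 1:-1] + pad[2:, 1:-1] + pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0

    out = curr.copy()
    missing = ~curr_mask
    # Keep the previous frame's pixel where it still agrees with the local
    # estimate (roughly "nothing moved here"); otherwise fall back to the
    # spatial interpolation, which is no worse than a plain upscale.
    reuse = np.abs(prev - neighbours) < threshold
    out[missing & reuse] = prev[missing & reuse]
    out[missing & ~reuse] = neighbours[missing & ~reuse]
    return out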
 

fresquito

Member
Definitely from r/PCMR <_<
I'm just amused at the capacity to reconstruct your narrative. A few days ago 4K didn't matter, then 4K mattered, but now that 4K is not a thing anymore, its absence is suddenly a feature. The PS5 will eventually land and everybody will be happy to leave this checkerboard behind.

More features are always good, but this is not a feature; it's a workaround for a shortcoming.
 

dr_rus

Member
Surely there's something, even a similar type of method, that could be used where ever through software? Maybe kinda like how DSR / VSR can work in any game while applying a filter along with it. I dunno.

Would simply upscaling from 1440p-1800p to 4k provide a similar level of clarity?

No, there isn't. That type of upscaling must have full access to the framebuffer contents and know how that content is being built for each pixel. For that it must be part of the program that is building the framebuffer: such upscaling has to be implemented inside the rendering pipeline, not as some finishing step or override. Basically, you need a custom reconstruction shader integrated into the engine, and that isn't something any IHV would be willing to provide for all the games out there.

DSR/VSR isn't a good example at all, as all it does is downscale the image from a high rendering resolution to that of your display.
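To put that contrast in code terms (a sketch with hypothetical names, not any real driver or engine API):

def driver_level_filter(final_image):
    # A driver override only ever sees the finished image.
    ...

def engine_reconstruction_pass(half_res_color, depth, motion_vectors, sample_jitter, prev_frame):
    # A reconstruction pass needs per-pixel data that only the renderer
    # has while the frame is still being built.
    ...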
 
That's why PC gaming is uncompromised gaming, no tricks and no secret sauce. All raw power.
Of course, if you've got the money to spend for said uncompromised gaming.



Also, this. Single solution GPUs for 4K / 144Hz / G-Sync will be arriving soon, I can literally taste it.

Soon? Single GPUs can't max out newer games at even 60fps at 4K. There are also no 4K monitors above 60Hz right now. 4K is nice, but 21:9 is the real game changer.
 

Chobel

Member

Fucking really? Patenting an algorithm?

Okay, so I don't have anything against this solution, but here's my take on this whole thing. Right now, you can easily play most games at 4K/locked 30 (very high/ultra) on a GTX 1070, which costs $380 or something. Hell, you can play a lot of the games (the slightly older ones) at 4K/60 with some settings turned down. Considering how quickly these things drop in price, I'd imagine we'd be seeing this same GPU horsepower at a sub-$250 price point by next summer/fall. While reconstruction methods can definitely be nifty, I'm just not a fan of them. We're really not that far off from affordable 4K/30 and high-end 4K/60.

And what about the people who can't or won't upgrade?
 