Thanks for the links. That patent isn't the checkerboard method. Nor does it affect any of the usual tricks for getting higher effective resolution on GPUs (the MSAA trick, checkerboard, temporal reconstruction, etc.).
One implementation of checkerboard rendering is already used in a game on PC and gives a >30% improvement in performance.
http://www.geforce.com/whats-new/guides/tom-clancys-rainbow-six-siege-graphics-and-performance-guide
Nope.
http://www.gdcvault.com/play/1022990/Rendering-Rainbow-Six-Siege
Page 44 forward.
Actually, if developers want artifact-free rendering at 4K with near-1080p shading cost, the basic 4xMSAA trick might be a good choice.
It gives perfect edge resolution and no temporal artifacts. (Texture resolution/shading is 1080p.)
Without any tweaking, in areas where there are no polygon edges the basic version would give a result identical to the 1080p 2x nearest-neighbor scaling you suggested.
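To make that comparison concrete, here's a toy sketch (my own illustration, not from any of the linked presentations) of plain 2x nearest-neighbor scaling, where each 1080p pixel simply becomes a 2x2 block of identical 4K pixels:

```python
def upscale_nearest_2x(img):
    """2x nearest-neighbor upscale: every source pixel becomes a 2x2 block.
    `img` is a list of rows of pixel values (any type)."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(2)]  # duplicate horizontally
        out.append(wide)
        out.append(list(wide))                     # duplicate vertically
    return out

# A 2x2 "1080p" patch becomes a 4x4 "4K" patch with no new detail:
src = [[1, 2],
       [3, 4]]
dst = upscale_nearest_2x(src)
for r in dst:
    print(r)  # [1, 1, 2, 2] / [1, 1, 2, 2] / [3, 3, 4, 4] / [3, 3, 4, 4]
```

The MSAA trick only adds information on top of this at polygon edges, where the extra coverage samples land.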
It's even more meaningless since people have already done it. On PC, in fact!

They've patented one (hardware) implementation, and we have no idea what the general quality of that implementation is compared with others, so it's a bit meaningless to say 'nobody else can do it because patent'.
If you ask me, QB does the opposite, lol.

There are many similar techniques, such as the ones used in Quantum Break and Rainbow Six Siege, so that really means nothing.
No.

Wait... Does the PS4 Pro have the Reality Creation feature that my Sony Bravia already has?
Is that how it does 4k?
Yeah, so in the end I doubt this is even worth bringing out. Consoles and PCs are vastly different beasts. While the PS4 Pro might be in consumers' hands for anywhere from 2-3 years depending on Sony's PS5 timeline, PCs upgrade according to their owners' will, and that alone. 4K monitor adoption stands at about 2% of Steam users right now. What's the point in working on and releasing such tech in a driver profile for the current cards when the people running 4K monitors are likely already using the highest-end cards and getting acceptable framerates anyway? By the time GPU hardware catches up to 4K at a midrange price point (presumably next year) and people start buying 4K monitors, the tech will be pointless except for legacy cards, whose owners will probably just stick to 1080p until they upgrade again.
Well it just does real 4K.
It's nothing like a standard upscale, though; reconstruction is a better term.
I imagine you'd probably have a better experience with checkerboarding, considering how poorly most games run at native 4K.
It's probably more noticeable up-close on a PC monitor.
I'm just thinking it's too much to hope for from AMD / Nvidia. They do want to sell more expensive GPUs after all.
I just read the checkerboard explanation.

No.
The Bravia's scaler works by taking a series of input images and scaling them to 4K.
Games use many different methods of rendering/resolving to a higher resolution, usually by changing in some way how rendering is done, not just by working on the final images.
Read presentations in these threads if you are interested in specifics.
This isn't something to be implemented by either IHV. The games must support it in their engines.
With the PS4 Pro doing upscaled 4K* using the checkerboard technique, I'm wondering why PC GPUs don't start doing this. It would give us consistent 4K at >60fps on Ultra without needing a Titan Xp.
It's the new blast processing.
The 1070 can do this at 4K:
Rise of the Tomb Raider / very high quality - 39fps
Dirt Rally / Ultra - 53fps
Ashes of the Singularity / extreme - 41fps
Battlefield 4 / Ultra - 60fps
Crysis 3 / Very high + FXAA - 28fps
The Witcher 3 / Ultra - 28-35fps
The Division / Ultra - 32fps
GTA V / Very High - 22-29fps
Hitman / Ultra - 40fps
Source: http://www.anandtech.com/bench/product/1731
Kinda this.
A point that needs to be brought out is these 4K PS4 games are still mostly running at 30fps with "console" graphics settings.
When people say it's still really tough for PC gaming to do native 4K, what they really mean is native 4K and 60fps at ultra settings for recent games. If you're willing to settle for 30fps and/or slide down some settings, playable native 4K is definitely within reach of modern graphics cards, even Maxwell cards probably. I remember someone in a thread a few months ago saying they could play GTA V at 4K on a 970 or 980 with mostly medium/high settings.
PCs already have enough brute force to do 4K without upscaling tricks and nonsense. Why would we want it?
Options are always good, no? That is what PC gaming is all about, right? Not all PC gamers are die-hard "PC Master Race" types. I game on a 50-inch 1080p plasma with my ITX PC (newly upgraded to a GTX 1060), and it will stay there. I am not a desktop PC gamer anymore.

Nah, go native or go home. I don't want to end up with another Quantum Break mess.
This is my opinion too. PC gaming to me has always been about flexibility, not spending hundreds or thousands annually to brute-force performance. (At least we don't need boot floppies anymore!)
Is checkerboard rendering the new hUMA?
Half of the pixels are rendered each frame, and the other half the next frame.

I just read the checkerboard explanation.
How is that different in practice from Reality Creation on a Bravia? It separates each 1080p pixel out in a 1:4 ratio like a checkerboard and then fills in the color gradients corresponding to the change in the surrounding pixels. I'm not talking about its normal scaler. The Reality Creation feature turns each 1080p pixel into four pixels of different gradients corresponding to the surrounding pixels.
What am I not understanding? Because it looks like another method of doing the exact same thing.
The only difference I'm seeing is that one is happening before the framebuffer and the other is happening slower on a less powerful secondary device.
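For what it's worth, the alternating-halves part can be sketched like this (the function name and the naive carry-over fill are my own simplification; real implementations such as Sony's use motion vectors and ID buffers to decide how to fill the unrendered half):

```python
def checkerboard_merge(prev_frame, half_frame, parity):
    """Merge a newly rendered checkerboard half into the previous frame.

    Pixels where (x + y) % 2 == parity were rendered this frame; the rest
    are simply carried over from the previous frame here (a real
    reconstruction filter would reproject and blend them instead)."""
    out = [row[:] for row in prev_frame]
    for y, row in enumerate(half_frame):
        for x, px in enumerate(row):
            if (x + y) % 2 == parity:
                out[y][x] = px
    return out

# Alternate parity each frame, so every pixel is refreshed every two frames.
frame = [[0, 0], [0, 0]]
frame = checkerboard_merge(frame, [[1, 1], [1, 1]], parity=0)
print(frame)  # only the (x + y) even pixels updated: [[1, 0], [0, 1]]
```

So only half the 4K pixels are shaded per frame, which is where the roughly-1080p shading cost comes from.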
What is this that Rainbow Six Siege does? I'm getting native 4K when I set it there, right?
Because no one wants a blurry image 2 feet from their eyes.
Definitely from r/PCMR <_<

Console gamers, boy. Turning every downside into a feature.
4K and 144 Hz?

Also, this. Single-solution GPUs for 4K / 144Hz / G-Sync will be arriving soon; I can literally taste it.
There is a massive shortage of the kind of partial cloud-processing gaming implementations MS was trying to market three years ago, though.
Couldn't they make the PS4 Pro downscale to 1080p like PC GPUs do, or are the PSU and hardware insufficient?
I'm just amused at the capacity to reconstruct your speech. A few days ago 4K didn't matter, then 4K mattered, but now that 4K is not a thing anymore, its absence is suddenly a feature. The PS5 will eventually land and everybody will be happy to leave this checkerboard behind.

Definitely from r/PCMR <_<
Surely there's something, even a similar type of method, that could be applied anywhere through software? Maybe kind of like how DSR / VSR can work in any game while applying a filter along with it. I dunno.
Would simply upscaling from 1440p-1800p to 4k provide a similar level of clarity?
That's why PC gaming is uncompromised gaming, no tricks and no secret sauce. All raw power.

Of course, if you've got the money to spend for said uncompromised gaming.
Sony patented it.
Okay, so I don't have anything against this solution, but here's my take on this whole thing. Right now, you can easily play most games at 4K/locked 30 (very high/ultra) on a GTX1070, which costs $380 or something. Hell, you can play a lot of the games (bit older ones) at 4K/60 with some settings turned down. Considering how quickly these things drop in price, I'd imagine we'd be seeing this same GPU horsepower at a sub-$250 price point by next summer/fall. While reconstruction methods can definitely be nifty, I'm just not a fan of them. We're really not that far off from affordable 4K/30 and high-end 4K/60.