
Digital Foundry + GTX1080ti SLI vs 8K

elelunicy

Member
Almost responded, but where does one even begin with this.

I never even said or implied that I thought "1440p is halfway between 1080p and 4K", so I'm not understanding how you responded to my post with such drivel.

Which is why I said "people seem to think" as opposed to you. I never said or implied that you think "1440p is halfway between 1080p and 4k."
 

K.Jack

Knowledge is power, guard it well
Which is why I said "people seem to think" as opposed to you. I never said or implied that you think "1440p is halfway between 1080p and 4k."

"Sorry but 1440p is a mariginal upgrade over 1080p at best. I hardly even notice an IQ difference between 1080p and 1440p cause they're both bad."

See how much better this reads, as a response to my post? The other interjected editorial was worthless.
 

Paragon

Member
Here's a helpful chart for people who think 1440p is close to halfway between 1080p and 4k.
http://i.imgur.com/dktvR7M.png
Oh I see, you literally meant "halfway between" 1080p and 4K (2.5x 1080p) and not "half of" 4K (2x 1080p) - which is what most people are talking about when they compare 1440p to 4K.
2560x1440 being 1.8x 1080p makes it roughly half of 4K resolution, while 3440x1440 is 2.4x 1080p and more than half of 4K resolution.
Either of these should be a clear upgrade over 1080p, being roughly double the resolution.
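For anyone who wants the actual numbers behind those ratios, here's a quick back-of-the-envelope pixel-count comparison (a minimal Python sketch of the arithmetic, nothing more):

# Pixel counts and ratios for the resolutions discussed above.
resolutions = {
    "1080p (1920x1080)":    1920 * 1080,
    "1440p (2560x1440)":    2560 * 1440,
    "UW 1440p (3440x1440)": 3440 * 1440,
    "4K (3840x2160)":       3840 * 2160,
    "8K (7680x4320)":       7680 * 4320,
}

base_1080p = resolutions["1080p (1920x1080)"]
four_k = resolutions["4K (3840x2160)"]

for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} px = "
          f"{pixels / base_1080p:.2f}x 1080p, {pixels / four_k:.2f}x 4K")

That prints 1.78x 1080p / 0.44x 4K for 2560x1440 and 2.39x 1080p / 0.60x 4K for 3440x1440, which is where the "roughly half of 4K" and "more than half of 4K" figures come from.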
 
Most of the gaming I do is under 720p lol and I do not mind it at all. It is good progress. I hope prices come down too.

I only wish development cost and time would come down as well. I do not give a shit about graphics if the games take so long and are so expensive.
 

Paragon

Member
That feels so pointless - if you have GPU power for 8K@30 you have enough for 4K@120, which is much better
That doesn't guarantee you'll have the CPU power to push 120 FPS though.
Some games still struggle to hit 60 FPS no matter what CPU you have.

Think you need a 6950X just to watch the damn video!
If you have a recent GPU (I have a GTX 1070), try using Edge for playback. Most browsers don't seem to handle 8K playback well.
Firefox plays back 8K60 videos, but constantly drops frames.
Chrome doesn't seem to use GPU acceleration for 8K playback.
Edge plays back 8K60 perfectly smoothly with only 3% CPU usage and without dropping a frame.
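To give a rough sense of why browsers that fall back to software decoding struggle here: the raw decoded pixel throughput at 8K60 is 16x that of 1080p60 (just arithmetic, not a benchmark of any particular browser or decoder):

# Decoded pixel throughput per second - a rough illustration of why
# 8K60 playback really wants hardware (GPU) decoding.
def pixels_per_second(width, height, fps):
    return width * height * fps

for label, (w, h) in {"1080p60": (1920, 1080),
                      "4K60":    (3840, 2160),
                      "8K60":    (7680, 4320)}.items():
    rate = pixels_per_second(w, h, 60)
    print(f"{label}: {rate / 1e9:.2f} billion pixels per second")

That works out to roughly 0.12, 0.50 and 1.99 billion pixels per second respectively.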

That particular video will always look like a stuttering mess though due to it using 4-way SLI.
2-way SLI is bad enough for 'microstutter'.
Multi-GPU setups give you good framerate numbers, but bad frame pacing and latency.
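To illustrate the frame-pacing point with made-up numbers: two setups can report the exact same average FPS while one delivers frames far less evenly (the frame times below are hypothetical, not measurements from that video):

# Same average FPS, very different frame pacing (hypothetical frame times).
from statistics import mean, pstdev

single_gpu_ms = [16.7] * 8          # even ~60 FPS delivery
sli_ms        = [5.0, 28.4] * 4     # alternating fast/slow frames, same mean

for label, frame_times in [("single GPU", single_gpu_ms), ("AFR SLI", sli_ms)]:
    avg = mean(frame_times)
    print(f"{label}: {1000 / avg:.0f} FPS average, "
          f"frame-time stddev {pstdev(frame_times):.1f} ms")

Both print a 60 FPS average, but the alternating case has an 11.7 ms standard deviation in frame delivery, which is the microstutter people complain about.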
 

elelunicy

Member
Either of these should be a clear upgrade over 1080p, being roughly double the resolution.

See, I do not think 4k is high enough to achieve good IQ (especially in games that lack a good TAA implementation, in which case they have awful IQ even at 5k). Both 1440p and 1080p are so far below 4k (a resolution I do not consider high at all) that trying to find a difference between the two is basically splitting hairs. If a game has a lot of shimmering/flickering/pixel crawling at 1080p, rest assured that it will also have a lot of shimmering/flickering/pixel crawling at 1440p. 4k is the bare minimum where you start to see a difference.
 

Nabbis

Member
I wonder how much of a difference that resolution provides for image quality compared to 4k on a 27-inch screen.
 

Xyber

Member
My PC (i7 3770k@4.4) can't play it smoothly and the buffering is immense. Damn son.

200/20

I gotta give it to MS, Edge played that video perfectly on my cheap laptop with an i3 @2.4GHz and Intel HD 620.

Chrome was a damn slideshow.
 

Qassim

Member
In Edge, the 8K YouTube videos actually seem to use the GPU and play perfectly smoothly (my GPU clocked up to 2000MHz) while GPU usage remained low. In Chrome it stutters constantly: my CPU usage spikes, my GPU usage spikes, but the GPU clock only spikes to somewhere between 900MHz and 1500MHz, nothing consistent.
 
8K seems entirely excessive to me, outside of VR. But by the time we have 8K VR screens we'll also have foveated rendering to dramatically reduce the performance costs, so it's not like rendering an entire 8K screen will be necessary in that case.

I thought that about 4k vs 1080p until I actually saw 4k.
 
I greatly look forward to 8k.

It's the TV upgrade I'm really looking forward to in the next few years. I've seen 8K in person and there's just no desire for me to buy anything else until it arrives at affordable prices. 8K is the fucking *dream*. You'll all see soon enough. 4K became stillborn for me when I saw 8K IRL. Resolution makes it feel like you can step through the TV, as if it were a window. To my eyes, anyway.
 
I greatly look forward to 8k.

It's the TV upgrade I'm really looking forward to in the next few years. I've seen 8K in person and there's just no desire for me to buy anything else until it arrives at affordable prices. 8K is the fucking *dream*. You'll all see soon enough. 4K became stillborn for me when I saw 8K IRL. Resolution makes it feel like you can step through the TV, as if it were a window. To my eyes, anyway.

where did you demo 8K btw?
 

Paragon

Member
See, I do not think 4k is high enough to achieve good IQ (especially in games that lack a good TAA implementation, in which case they have awful IQ even at 5k). Both 1440p and 1080p are so far below 4k (a resolution I do not consider high at all) that trying to find a difference between the two is basically splitting hairs. If a game has a lot of shimmering/flickering/pixel crawling at 1080p, rest assured that it will also have a lot of shimmering/flickering/pixel crawling at 1440p. 4k is the bare minimum where you start to see a difference.
You should perhaps learn the difference between the words "good" and "perfect".
4K is more than enough for good image quality. Without good anti-aliasing it is certainly not perfect though.
Bad image quality is a lack of anti-aliasing, poor texture filtering, sub-native resolution rendering etc.
Just because the image is not entirely free of all forms of aliasing and does not look like an offline render, does not mean that it has bad image quality - only that it does not have perfect image quality.

There is still a big difference between 1080p and 1440p, with the latter being almost double the resolution.
Just because it is nearly double the resolution and not 4x or 16x does not mean that there is hardly any difference. Doubling the resolution is a big difference.
It's like saying there is no difference between 720p and 1080p. Or 480p and 720p.

I'm not saying that 4K or 8K are worthless; I hope that it won't be long before those resolutions are actually playable on a single GPU.
However I still don't consider a 1080 Ti to be enough for single-GPU 4K gaming today, because I want to keep framerates above 60 FPS.

I'm really curious to know what kind of display you're using, since you're so dogmatic about this.
Based on your comments, I assume it's the smallest 4K panel you could find.

And image quality is not just about resolution: https://timothylottes.github.io/20161114.html
 