
Digital Foundry + GTX1080ti SLI vs 8K

CariusD

Member
Well then it's not really 4k is it?

No, but I wouldn't mind if most games that couldn't hit a native 4K60 at that point offered a choice between native 4K30 and checkerboard 4K60, since graphics options are becoming a thing for console games.
 
No, but I wouldn't mind if most games that couldn't hit a native 4K60 at that point offered a choice between native 4K30 and checkerboard 4K60, since graphics options are becoming a thing for console games.

It doesn't work like that. You need a powerful CPU to hit 60 fps unless you design the game for 60 fps in the first place. Dropping the resolution won't be enough in most cases.
 

Frozone

Member
I would rather not chase pixel throughput trying to get higher resolutions, but put the research into shading, lighting, FX, more accurate hair, and higher-res textures/normal maps (which the author alludes to when playing with graphics settings).

Right now, every game appears to use the same techniques (i.e. making most of them look the same), consequently making it appear as if we've hit a wall in developing more complex techniques.
 

CariusD

Member
It doesn't work like that. You need a powerful CPU to hit 60 fps unless you design the game for 60 fps in the first place. Dropping the resolution won't be enough in most cases.

You are right, but if they choose to design around checkerboard 60 then they should definitely have enough room for 4K30 as well.
 

Truant

Member
Extremely impressive, but my couch + TV setup makes it really hard to justify getting even a 4k TV. Hell, 1440p downsampled to 1080p looks really good at my viewing distance.
 

Madness

Member
Yeah, I don't think 8K will ever be mainstream. Diminishing returns and all that.

It will be mainstream in home theatre and theatres, but the television size requirements will go up to 70"+ screens. Keep in mind that as a resolution, 8K content will look amazing.

As for games, why not? The more resolution you can fit, the better. In fact, greater than 8K is needed for more immersive VR. I think television/film-wise we will peak at 8K before some kind of holographic/AR/VR future comes, but for games 8K-16K will be the norm pretty quickly. 4K came quickly and is already being widely adopted.
 

CariusD

Member
I would rather not chase pixel throughput trying to get higher resolutions, but put the research into shading, lighting, FX, more accurate hair, and higher-res textures/normal maps (which the author alludes to when playing with graphics settings).

Right now, every game appears to use the same techniques (i.e. making most of them look the same), consequently making it appear as if we've hit a wall in developing more complex techniques.

Textures at that point will be a compression issue, so as not to take 200GB. The rest can and probably will happen in parallel, as it has every other generation.
 

baconcow

Member
so how many 2DS screens is 8k?

If you include the upper (400x240) and lower (320x240) screen resolutions together, it takes exactly 192 pairs of 2DS (or 3DS, including New and XL varieties) screens to make up 8K resolution (7680x4320).
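That figure checks out with plain arithmetic, using only the screen resolutions quoted above:

```python
# Each 2DS has an upper 400x240 screen and a lower 320x240 screen.
pair_pixels = 400 * 240 + 320 * 240   # 172,800 pixels per 2DS
eightk_pixels = 7680 * 4320           # 33,177,600 pixels at 8K

# 33,177,600 / 172,800 divides evenly: exactly 192 2DS units.
print(eightk_pixels / pair_pixels)    # 192.0
```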
 

Paragon

Member
The architecture after Volta will definitely be making 4K commonplace for PC gaming. By that time mid-enthusiast to top-enthusiast hardware will do 4K@120 without any issues. That will probably be around 2020.
High framerate gaming is increasingly CPU-limited.
Most game engines are still poorly multi-threaded and it really seems like a limit has been reached on single-threaded performance.
Single-threaded performance has improved very little in the last five years, while CPUs at the high-end have been getting more and more cores for significantly faster performance overall.
Intel have 24-core CPUs and AMD are due to release 32-core CPUs soon. Lots of games only really use two.

Even if your GPU is fast enough to do 8K60, that's no guarantee that you would be able to run a game at 4K120.

What was with the slowdown in Forza Horizon 3? Looked like what happens when emulation drops frames.
Forza is a completely busted port.
Not only does it appear to have problems when things slow down, it also has problems with cars jumping all over the place when the framerate gets too high as well.
I'm surprised that Richard said the engine was well-optimized in this video, considering that 90% of the game's CPU workload runs on a single CPU core.
 
Forza is a completely busted port.
Not only does it appear to have problems when things slow down, it also has problems with cars jumping all over the place when the framerate gets too high as well.
I'm surprised that Richard said the engine was well-optimized in this video, considering that 90% of the game's CPU workload runs on a single CPU core.

I'm not surprised at all.
 

Hux1ey

Banned
Forza is a completely busted port.
Not only does it appear to have problems when things slow down, it also has problems with cars jumping all over the place when the framerate gets too high as well.
I'm surprised that Richard said the engine was well-optimized in this video, considering that 90% of the game's CPU workload runs on a single CPU core.

From watching DF over the years, I'm pretty sure they can't be negative about anything haha.
 

Momentary

Banned
High framerate gaming is increasingly CPU-limited.
Most game engines are still poorly multi-threaded and it really seems like a limit has been reached on single-threaded performance.
Single-threaded performance has improved very little in the last five years, while CPUs at the high-end have been getting more and more cores for significantly faster performance overall.
Intel have 24-core CPUs and AMD are due to release 32-core CPUs soon. Lots of games only really use two.

Even if your GPU is fast enough to do 8K60, that's no guarantee that you would be able to run a game at 4K120

I run games at 4K@120FPS+ maxed out all the time. Forza Apex and ReCore come to mind right off the bat. I mean yeah you have limitations, but I haven't come across a game so far where my CPU is the bottleneck for performance. Even if it isn't taking advantage of my threads.

*I stay current on hardware, so yes older CPUs are the source of bottlenecks if you're using a brand new GPU.

Forza is a completely busted port.
Not only does it appear to have problems when things slow down, it also has problems with cars jumping all over the place when the framerate gets too high as well.
I'm surprised that Richard said the engine was well-optimized in this video, considering that 90% of the game's CPU workload runs on a single CPU core.

I've uninstalled Forza Horizon 3. I tried playing it the other day and the fluctuating framerate is ridiculous.


With all this said, I think jumping from 4K to 8K is a bit ridiculous and I don't think most people understand how strenuous that is on hardware right now. I think we'll see a more LOGICAL increase in the coming years in the PC space:

  • 5120x2880 is the next step from 4K.
  • After that is 6400x3600.
  • Then you have 7680x4320.
Jumping from 4K to 8K is fucking ridiculous. I don't see 8K@60fps being feasible for the majority of gaming consumers until 2023 or 2024.

As a matter of fact, until consoles use 7680x4320 @ 30fps as a baseline for the majority of their games, this is not happening anytime soon... So maybe 2025 or later even. Hell, physical gaming platforms might not even be a thing by then.
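The "8K is a ridiculous jump" point is easy to see in raw pixel counts. A quick check of the proposed steps above, relative to 4K (no assumptions beyond the resolutions listed):

```python
# Pixel counts of the proposed intermediate steps, relative to 3840x2160 (4K).
fourk = 3840 * 2160  # 8,294,400 pixels

steps = {
    "5120x2880": 5120 * 2880,  # 14,745,600 -> ~1.78x 4K
    "6400x3600": 6400 * 3600,  # 23,040,000 -> ~2.78x 4K
    "7680x4320": 7680 * 4320,  # 33,177,600 -> exactly 4x 4K
}
for name, px in steps.items():
    print(f"{name}: {px:,} pixels, {px / fourk:.2f}x 4K")
```

So going straight from 4K to 8K quadruples the pixels pushed per frame, where each intermediate step is a much smaller increase.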
 

Accoun

Member
The real question is: how high a resolution can this thing run Quake 1 (or even Quake 3/Live) at 144 FPS?
 

K.Jack

Knowledge is power, guard it well
I feel so behind, I'm barely gonna start gaming at 2k (1440p) on my PC.

In truth, 1440p is really the best place to be, with the best balance of IQ and high framerate performance. 4K isn't worth it right now, for those of us who don't spend 1080/Ti money.

When an x070-level card can do 4K/60 with ease, that will be the time to make that leap.

I personally won't make the jump, until 4K/120fps is possible.
 

pa22word

Member
In truth, 1440p is really the best place to be, with the best balance of IQ and high framerate performance. 4K isn't worth it right now, for those of us who don't spend 1080/Ti money.

When an x070-level card can do 4K/60 with ease, that will be the time to make that leap.

I personally won't make the jump, until 4K/120fps is possible.

I got sorta forced into a 1440p monitor buy a month ago due to my old monitor shitting the bed, and I was pleasantly surprised at how good an increase it is. Very noticeable increase in clarity, and when I decide to finally make the jump to a 4K60-ready card I think I'll keep this thing around for ultra-fast gaming (165Hz monitor, PG278QR) to really let the framerate potential fly.
 

elelunicy

Member
In truth, 1440p is really the best place to be, with the best balance of IQ and high framerate performance.

Sorry, but 1440p is a marginal upgrade over 1080p at best. People seem to think 1440p is halfway between 1080p and 4K, when in reality it's nowhere close to that (fun fact: even ultrawide 1440p (3440x1440) is still closer to 1080p than it is to 4K in terms of pixel count). I hardly even notice an IQ difference between 1080p and 1440p because they're both bad.
 

Luigiv

Member
[image attachment: naamloos1080ytylk.png]

This reminds me of the time I tried to play a PSP game on my 1080p TV. Never again.
 

Paragon

Member
Sorry, but 1440p is a marginal upgrade over 1080p at best. People seem to think 1440p is halfway between 1080p and 4K, when in reality it's nowhere close to that (fun fact: even ultrawide 1440p (3440x1440) is still closer to 1080p than it is to 4K in terms of pixel count). I hardly even notice an IQ difference between 1080p and 1440p because they're both bad.

2560x1440 is 1.8x as many pixels as 1080p, which is very close to being halfway between 1080p and 4K.
3440x1440 is 2.4x as many pixels as 1080p, which makes it closer to 4K than 1080p.

Both are a significant increase in resolution and image quality over 1080p.
You are allowed to say that you prefer 4K resolution over 1440p without having to exaggerate to make your point.


For single-GPU setups - which is all that I would ever consider due to the microstutter and latency problems with multi-GPU - 1440p makes a whole lot more sense than native 4K on today's hardware.
And moving to 24:10 is a more meaningful upgrade than rendering 16:9 at higher resolutions in my opinion.

That does not mean that I think 4K is a waste of power - or 8K for that matter.
I've already been rendering older games at 4K and 8K using DSR and the image quality is stunning.
I just don't think it's the right choice for today's GPU hardware.

If televisions had not jumped straight from 1080p to 2160p, and there were 1440p options available instead, I doubt anyone would be trying to push 4K resolution as 'mainstream' today.
I'm glad that progress is being made, and the demand for higher resolutions is pushing GPU manufacturers and game developers to innovate, but as someone that wants to be gaming at or above 60 FPS, 1440p is the right choice today.
 
Sorry, but 1440p is a marginal upgrade over 1080p at best. People seem to think 1440p is halfway between 1080p and 4K, when in reality it's nowhere close to that (fun fact: even ultrawide 1440p (3440x1440) is still closer to 1080p than it is to 4K in terms of pixel count). I hardly even notice an IQ difference between 1080p and 1440p because they're both bad.

Uhh... You're completely factually wrong here.
 

Arkeband

Banned
I might go 1440p with my next video card upgrade but 4K still seems like it brings most cards to their knees. The things I currently play hover around 60fps at 1080p which is where I want it to stay.
 

Momentary

Banned
Sorry, but 1440p is a marginal upgrade over 1080p at best. People seem to think 1440p is halfway between 1080p and 4K, when in reality it's nowhere close to that (fun fact: even ultrawide 1440p (3440x1440) is still closer to 1080p than it is to 4K in terms of pixel count). I hardly even notice an IQ difference between 1080p and 1440p because they're both bad.

I made a recording to test on my LG 55e6p earlier today and 1440p looks better than 1080p at a normal viewing distance. It's hard for me to even tell the difference between 1440p and 4K on a TV when not sitting up close to it.

Guilty Gear 4K Test Vid
https://www.youtube.com/watch?v=4nhMJr_G4UM

*All resolutions are upscaled or downscaled to 3840x2160.

I'm a big stickler when it comes to jaggies on polygons, but I feel like 1440p is a sweet spot right now both for PC and PS4 Pro/Scorpio gamers when it comes to gaming on a TV. I feel like it's the sweet spot even for monitors. You get decent IQ with great framerates if you are on the upper level of the hardware spectrum. I'm in the camp where performance > image quality.
 

hesido

Member
I'll wait for 16K, thanks. I can still see pixels with 8k from 9 inches to screen, which is how I'd like to play my games and watch my movies. With 16k, I'll be able to lick the screen and still not see a pixel with a reading glass.

What a fucking waste of GPU cycles.
 

elelunicy

Member
2560x1440 is 1.8x as many pixels as 1080p, which is very close to being halfway between 1080p and 4K.
3440x1440 is 2.4x as many pixels as 1080p, which makes it closer to 4K than 1080p.

The halfway point between 100 and 400 is 200. 240 is 2.4x as much as 100, and thus 240 is closer to 400 than it is to 100. Or not.


I specifically said pixel count, not any other metrics. The halfway point between 1080p and 4K in terms of pixel count is (3840x2160 + 1920x1080)/2 = 5,184,000. Meanwhile 3440x1440 = 4,953,600, and thus it's closer to 1080p than it is to 4K.
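The pixel-count arithmetic is easy to verify directly, using only the resolutions named in the posts above:

```python
# Where do the 1440p variants fall between 1080p and 4K, by raw pixel count?
p1080 = 1920 * 1080            # 2,073,600 pixels
p4k = 3840 * 2160              # 8,294,400 pixels
midpoint = (p1080 + p4k) / 2   # 5,184,000 pixels

for name, px in [("2560x1440", 2560 * 1440), ("3440x1440", 3440 * 1440)]:
    side = "1080p" if px < midpoint else "4K"
    print(f"{name}: {px:,} pixels, closer to {side}")
```

Both 2560x1440 (3,686,400 pixels) and 3440x1440 (4,953,600 pixels) fall below the 5,184,000-pixel midpoint, so by this metric both are closer to 1080p.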

Here's a helpful chart to people who think 1440p is close to the halfway between 1080p and 4k.

[chart attachment: dktvR7M.png]
 

K.Jack

Knowledge is power, guard it well
Sorry but 1440p is a mariginal upgrade over 1080p at best. People seem to think 1440p is halfway between 1080p and 4k, when in reality it's nowhere close to that (fun fact: even ultrawide 1440p (3440x1440) is still closer to 1080p than it is to 4k in terms of pixel count). I hardly even notice an IQ difference between 1080p and 1440p cause they're both bad.

Almost responded, but where does one even begin with this.

I never even said or implied that I thought "1440p is halfway between 1080p and 4K", so I'm not understanding how you responded to my post with such drivel.
 

Spukc

always chasing the next thrill
i like how this has no added function in gaming.
unless you own an 8K projector.

250 inches i guess
 