
PCGAF: 4K/30fps or 1080p/60fps?

Considering that 4K needs roughly 4x the GPU power of 1080p while you're only halving the framerate, you should be able to find a middle ground. 1440p/60, via downsampling, should offer the best of both worlds and stay within your GPU's capabilities.
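To put rough numbers on that (a back-of-the-envelope sketch; it assumes GPU cost scales roughly linearly with pixels drawn per second, which real games only approximate):

```python
# Rough pixel throughput for each mode, relative to 1080p/60.
# Assumes GPU load scales ~linearly with pixels drawn per second.
modes = {
    "1080p/60": (1920, 1080, 60),
    "1440p/60": (2560, 1440, 60),
    "4K/30":    (3840, 2160, 30),
    "4K/60":    (3840, 2160, 60),
}

base = 1920 * 1080 * 60  # 1080p/60 as the reference workload

for name, (w, h, fps) in modes.items():
    pixels_per_second = w * h * fps
    print(f"{name}: {pixels_per_second / 1e6:.0f} Mpix/s "
          f"(~{pixels_per_second / base:.2f}x the 1080p/60 workload)")
```

By that crude measure 4K/60 is ~4x the 1080p/60 workload, 4K/30 is ~2x, and 1440p/60 lands at ~1.78x, which is why it makes a decent middle ground.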
 

Deleted member 325805

Unconfirmed Member
24 fps is totally playable for most game types.

I played through Crysis with an average of about 15 fps haha.

Loved every second.

 

Tagyhag

Member
4K is absolutely gorgeous but I will always ALWAYS take performance first. It's why I'm getting a 1440p G-Sync monitor before a 4K one.
 
I typically favor framerate over image quality, but if a game has piss-poor antialiasing solutions, I might take things up to 4K resolution (typically at the expense of the graphics if the framerate doesn't hold). I can tolerate a lack of shadow quality, post-processing effects and all that if it means less aliasing and a smooth FPS.

We should have a poll. Polls are all the rage.

Mods, can we get a poll?

I'll save you the time and just tell you that the joke option wins
 

finalflame

Member
Isn't that an inaccurate comparison? If 1080p60 is your PC's limit, then wouldn't 4K drop the FPS to 15, not 30?

1080p60 is not my computer's limit; it can do well over 60fps at 1080p, but my TV's refresh rate is capped at 60Hz.



I could play it at 1440p/70+fps sitting at my desk on my XB27HU, but I prefer couch gaming for games like Witcher 3. I'll be playing Overwatch at my desk at 1440p, 100+fps, G-Sync, though :)
 

Buburibon

Member
1080p60, though it will drop below 60fps (into the low 50s) in certain scenarios if you're maxing the settings. Then again, that's what performance was like on both an OC'd 5820K and a Titan X back when the game was released. There's a chance things have improved since.
 

Josh7289

Member
1080p @ max settings vs 4K at lowered settings; HairWorks in particular is a huge drain on resources

I see, thanks.

For me 1080p60 is the easy choice. Framerate is generally the most important aspect of a game's visuals to me. I'll sacrifice resolution and IQ to improve the framerate, in general. Also, I've said it before, but I believe resolutions above 1080p are past the point of diminishing returns, while the difference between 30 fps and 60 fps is still large and immediately perceptible both in terms of how the game feels to play and visually, at least to me.

Now if you want to ask about 1080p60 with max settings and 4K60 with lower settings, that's a more interesting comparison. That might come down to each individual game for me.
 

finalflame

Member
OP, have you considered something in between 1080p and 4K at 60fps?

Considering my TV's native resolution is 4K, I'd rather not pick something asymmetrical like 1440p. 1080p upscales perfectly to 4K. My desk monitor is 1440p, but I prefer playing games like Witcher on my couch.
 

Buburibon

Member
Considering my TV's native resolution is 4K, I'd rather not pick something asymmetrical like 1440p. 1080p upscales perfectly to 4K. My desk monitor is 1440p, but I prefer playing games like Witcher on my couch.

I've had several Panasonic and Sony 4K TVs in the past few years, and they've always resolved 1440p by first downsampling it to 1080p and then upscaling it to 4K. So, in that case, 1440p is preferable to 1080p. But if your TV actually upscales 1440p directly to 4K, then I agree that 1080p is the better alternative since it scales up nicely to 2160p.
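The "scales up nicely" part is just integer-scaling arithmetic; here's a quick sketch, assuming a standard 3840x2160 panel:

```python
# Scale factors from common render resolutions to a 3840x2160 (4K UHD) panel.
panel_w, panel_h = 3840, 2160

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
    sx, sy = panel_w / w, panel_h / h
    clean = sx.is_integer() and sy.is_integer()
    print(f"{name} -> 4K: {sx}x horizontal, {sy}x vertical "
          f"({'clean integer scale' if clean else 'non-integer scale, needs interpolation'})")
```

1080p maps to 4K at an exact 2x in each direction (one source pixel becomes a 2x2 block), while 1440p is a 1.5x stretch that has to be interpolated, which is where the softness comes from on sets that do it badly.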
 

Deleted member 325805

Unconfirmed Member
1080p@30fps because I'm on a budget, so I bought a 970.

Did you mean to type 960? A 970 allows you to play everything at 1080p/60 on a mix of high and ultra settings, in my experience.
 

Peterthumpa

Member
24 fps is totally playable for most game types.

I played through Crysis with an average of about 15 fps haha.

Loved every second.

IQ is king for me, but 30 FPS is really the minimum.

But yeah, 30/4K. That's how I'm playing Dark Souls 3 right now and it's G-O-R-G-E-O-U-S.
 

Inumbris

Member
Definitely a 60 FPS minimum for me, so 1080p/60 wins every time in this match-up.

If you said 4K 60fps or 1080 120/144fps, that would be a harder choice (most likely 4K/60), but as it stands now definitely 1080/60fps. Smooth frame-rate over great visuals.

I agree, this would be a much harder choice for me to make.
 
I would probably take an even lower res with over 60fps, like 900p/120fps over 4K/30, tbh.

Framerate matters so much to me personally. It doesn't just affect the way the game looks but the way it feels to play, so it takes priority.
 

inner-G

Banned
1440p 60 FPS.

If you can do 4K at 30 you can do that. Best of both worlds, in my opinion, especially if you're downsampling onto a 1080p screen.
That's what I shoot for. (I have a 1440p screen and find it to be a tangible step up from 1080p without being overkill like 4K)
 

Reallink

Member
What percentage of responses in this thread do you think have actually played a game in 4k/30 on a proper set up?
 

SapientWolf

Trucker Sexologist
60fps is a minimum requirement for gaming, the accepted standard for going on 40 years now. 4K resolution is not. Easy choice.

Keep in mind that temporal resolution is really important too, it's something too many gamers ignore. When you cut your framerate in half you are dramatically reducing resolution, just like going from 4K to 1080p does. You're not choosing resolution over framerate, as is popularly assumed, you're choosing one form of resolution over another. You want a balance, and for my money 1080p/60fps is going to get closer to that ideal for most situations than 4K/30fps will.
That starts to play a big part in driving games, or some of the faster-paced shooters. At 30fps you can kiss a lot of background detail goodbye.

But you won't actually consciously notice it unless there's scrolling text. The overall effect is that you'll perceive the higher framerate imagery to be sharper than the pixel count suggests. I first noticed that phenomenon on a CRT.
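A rough way to see the motion argument (the pan speed below is a made-up figure purely for illustration): at a fixed pan rate, the image jumps much farther between frames at 30fps than at 60fps, and that per-frame jump is what eats the detail while things are moving.

```python
# How far does a panning image shift between frames?
# Bigger per-frame jumps mean less usable detail while the camera is moving.
pan_fraction_per_s = 0.5  # hypothetical pan: half the screen width per second

for name, (width, fps) in {"1080p/60": (1920, 60), "4K/30": (3840, 30)}.items():
    pixels_per_frame = width * pan_fraction_per_s / fps
    print(f"{name}: image shifts ~{pixels_per_frame:.0f} px every frame")
```

With those made-up numbers the 1080p/60 image moves ~16 px per frame while the 4K/30 image moves ~64 px, so during the pan the extra spatial resolution is largely wasted.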
 

Momentary

Banned
1440@60 or 1080@120 for me.

What percentage of responses in this thread do you think have actually played a game in 4k/30 on a proper set up?

Some games don't take too much to run at 4K. I played Tales of Zestiria at 5K/60fps. Depends on the game. I'm sure a lot of people have done 4K/30 on less demanding games.
 

Peterthumpa

Member
60fps is a minimum requirement for gaming, the accepted standard for going on 40 years now. 4K resolution is not. Easy choice.

Keep in mind that temporal resolution is really important too, it's something too many gamers ignore. When you cut your framerate in half you are dramatically reducing resolution, just like going from 4K to 1080p does. You're not choosing resolution over framerate, as is popularly assumed, you're choosing one form of resolution over another. You want a balance, and for my money 1080p/60fps is going to get closer to that ideal for most situations than 4K/30fps will.
Care to explain? That's definitely not what my eyes see when running a game @ 30 FPS in 4K.
 

Dezzy

Member
2560x1440 at 60 for me. When a game can't do that, I'll make it 1920x1080. I'll always choose 60 fps over resolution or higher settings. I won't notice slightly lower shadows or something as much as a choppy framerate.
 

Odrion

Banned
this is harder to consider than I thought

1080p is pretty bad these days... but so is 30fps...

1440p is a good compromise really
 