
For PC gaming it's really not worth upgrading from a 1440p/144Hz monitor until 8K 60fps becomes standard (probably in a decade)

Stick with a 1080p/1440p monitor until 8K becomes standard?

  • Yes, I will stick with a 1080p/1440p monitor

  • No, I will upgrade to a 4K monitor


I play on a TV.

 

Diddy X

Member
On a monitor 1440p is fine, but if you play on a TV like me, 4K is definitely better and more necessary. 8K? Not unless above 100"
 
Last edited:

Schmendrick

Member
The quality of what you see is determined not only by the resolution but also by the PPI of your monitor. E.g. 1440p at 27" is fine; at 32", however, it really starts to look blurry.
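To put numbers on that: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch (Python; the comparisons are my arithmetic, not the poster's figures):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 PPI: 1440p at 27"
print(round(ppi(2560, 1440, 32), 1))  # ~91.8 PPI: same panel stretched to 32"
print(round(ppi(3840, 2160, 32), 1))  # ~137.7 PPI: 4K at 32"
```

So going from 27" to 32" at 1440p drops pixel density by roughly 16%, while 4K at 32" is denser than either.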
 
Last edited:

SHA

Member
Some games will downscale to 1080p unless they don't plan to make better looking games, just saying.
 

Minsc

Gold Member
lol I can't imagine gaming on a 27 inch monitor in 2024. That's fucking insane if you are rocking a 4080 or 4090. WTF.

You need to go OLED. You need to go big screen. You need great tvs with a far wider color range than what monitors, even expensive ones can offer. The tv is a much better upgrade than buying a $1000 card. Once you have a big 55-65 inch tv you can then think about upgrading your GPU.

I left monitor based PC gaming way back in 2013 or so. It was too small. Even the $400-500 monitors were just not good at colors or HDR. Gaming like movies is more immersive on giant screens in a dark room. We are not animals.

Things have changed though, for $1400 or so you can get a top of the line 240hz 32" 4k OLED and HDR works great, even Dolby Vision compatible.

I'd argue the motion clarity on such a monitor would be far greater than that on a 60/120hz HDTV.
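The motion-clarity argument can be roughed out with the usual sample-and-hold approximation: on a full-persistence display, perceived smear is roughly scroll speed times the time each frame is held on screen. A sketch (the 960 px/s test speed is an arbitrary choice of mine, and this ignores strobing/BFI modes):

```python
def hold_blur_px(speed_px_per_s: float, refresh_hz: float) -> float:
    # A sample-and-hold display shows each frame for 1/refresh seconds;
    # an object tracked by the eye smears across that many pixels.
    return speed_px_per_s / refresh_hz

for hz in (60, 120, 240):
    print(hz, hold_blur_px(960, hz), "px of smear")  # 16.0, 8.0, 4.0
```

By this approximation a 240Hz panel has a quarter the hold blur of a 60Hz HDTV at the same pan speed.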
 

SlimySnake

Flashless at the Golden Globes
Things have changed though, for $1400 or so you can get a top of the line 240hz 32" 4k OLED and HDR works great, even Dolby Vision compatible.

I'd argue the motion clarity on such a monitor would be far greater than that on a 60/120hz HDTV.
I bought my OLED some 4 years ago for $1,800. It's 65 inch and has VRR and 120Hz at 4K. I can't imagine spending $1,400 for half the size.

It's not like any card out there is running modern games at 240 fps at 4K anyway. The only 4K 120Hz game I've played on my 3080 is Hades. The rest require me to drop down to 1440p, and it's just not worth it at that point. 240 fps is just overkill. A bigger screen will be far more transformative than going over 120 fps, imo
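For context on the GPU cost being traded here: 4K pushes 2.25x the pixels of 1440p and 4x the pixels of 1080p, which is roughly why framerates fall off so hard at native 4K. Quick arithmetic:

```python
def pixels(w: int, h: int) -> int:
    """Total pixels rendered per frame at a given resolution."""
    return w * h

print(pixels(3840, 2160) / pixels(2560, 1440))  # 2.25  (4K vs 1440p)
print(pixels(3840, 2160) / pixels(1920, 1080))  # 4.0   (4K vs 1080p)
```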
 

Holammer

Member
I'll get a 4k/240 HDR monitor later, just so I can have spiffy CRT shaders in emulators.
It sounds like a joke, but I'm 100% serious. That said, 1440/144-240 will be good enough for years to come.
 

Minsc

Gold Member
I bought my OLED some 4 years ago for $1,800. It's 65 inch and has VRR and 120Hz at 4K. I can't imagine spending $1,400 for half the size.

It's not like any card out there is running modern games at 240 fps at 4K anyway. The only 4K 120Hz game I've played on my 3080 is Hades. The rest require me to drop down to 1440p, and it's just not worth it at that point. 240 fps is just overkill. A bigger screen will be far more transformative than going over 120 fps, imo

I suppose it also depends on your distance; you can certainly sit close enough to a 32" screen that it appears larger in your FOV than an 80" HDTV does from a couch further back. I believe this is already the case with my 27" monitor.

And for framerate, it depends on what you're playing. For competitive games, the motion clarity of a 240Hz OLED gives an advantage. LG even makes 480Hz OLED displays, and you can also play on a smaller section of the screen so you don't lose details in your peripheral vision.

But for normal AAA gaming at 30-60fps, sure, a nice HDTV is probably better, unless it's an RTS game like StarCraft or something; those just feel more at home on a regular monitor. Same for high-motion FPS like Doom.
 

Xyphie

Member
After 2560x1440@144Hz+, the next monitor upgrade IMO is to go ultrawide, not 4K. It's much more impactful than 4K. To my knowledge there isn't an ultrawide monitor above 1600 pixels vertical yet, which kind of rules out denser PPI.
 
So yesterday I tried some 4K gaming and honestly I couldn't see that much difference compared to my 1440p. My 1440p is 27"; the 4K is on a 28" monitor. For console gaming it probably is worth it, especially if you have a 55"+ TV.

But you definitely notice the downgrade in fps.

I'll get an RTX 5080 when it comes out and it's probably gonna last me a lot of years on my 1440p monitor.

Until most GPUs can run 8K at decent settings, I don't think it's worth upgrading to 4K in between
Really depends on viewing distance. If you're a meter away from a 27", you won't really notice. If you're an immersion whore like me and want to sit super close to fill up your FOV, you would absolutely tell the difference. I sit a meter away from my 48" 4K. If I sat a meter away from a 48" 1440p monitor, I would be disgusted, frankly 😂

If you're happy sitting a little further away from a 27 in monitor, then yes, I would imagine the difference would be borderline negligible
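The viewing-distance point can be quantified as pixels per degree of visual angle, where ~60 ppd is the common 20/20-vision rule of thumb for when pixels stop being resolvable. A sketch using the 48"-at-one-meter setup described above (the formula and threshold are standard approximations, not the poster's):

```python
import math

def ppd(width_px: int, height_px: int, diagonal_in: float,
        distance_in: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    ppi = math.hypot(width_px, height_px) / diagonal_in
    # One degree of visual angle spans this many inches on the screen.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

one_meter = 39.37  # inches
print(round(ppd(3840, 2160, 48, one_meter)))  # ~63 ppd: at/above the threshold
print(round(ppd(2560, 1440, 48, one_meter)))  # ~42 ppd: visibly coarser
```

Which lines up with the post: a 48" 4K panel at one meter sits right at the resolvability limit, while 1440p at the same size and distance is well below it.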
 

MrPaul

Neo Member
I don't know, for general browsing I think 4K is easily noticeable. I went from 1440p 27" to 4K 32" and the razor-sharp text clarity and high PPI make general use very nice. You can even skip AA and everything still looks clean in games too. I have a 4080 and things run well enough; I use DLSS Quality when available, and I have no problem with how it looks.
I do like the idea of a DPI so high that running at native resolution gives you the equivalent of super-resolution AA as a natural property of the display; anti-aliasing artifacts could be dispensed with altogether. I'd still imagine the smoothing effects of AA were perceivable, just not as noticeable at 4K/32", as much as I like the sound of your experience. Was this the case?
 

Soodanim

Gold Member
8K is where there’s now enough pixels available that we begin to cross into uncanny valley territory. It’s a huge step up from 4K.
I've heard of this phenomenon before, and it was described as the brain interpreting what you're seeing as actual 3D. I think it was in the era of 3D TVs.

I don't know if there's any truth to it, but it stuck with me.
 

bitbydeath

Gold Member
As someone that's played several games at 8k, no it isn't.

And rendering resolution doesn't do anything to cross a game into "uncanny valley territory". That's lighting/animation/physics/volumetrics/particle effects/and so on. Resolution gets you a sharper/clearer image with less aliasing.


And 4k is already past the point of diminishing returns for most people.
You forgot the most important graphical aspect - textures. And you haven’t played a game made for 8K yet, they don’t exist.

If you think graphics have peaked you will be sorely mistaken.
 

analog_future

Resident Crybaby
You forgot the most important graphical aspect - textures. And you haven’t played a game made for 8K yet, they don’t exist.

If you think graphics have peaked you will be sorely mistaken.

Graphics haven't come anywhere near peaking, but arbitrarily increasing rendering resolution isn't going to be what pushes anything forward.
 

Puscifer

Member
Maybe try playing on 4k for a few months and then going back to 1440p?

That would be proper testing.
Seriously, a 4K monitor is an IMMEDIATE upgrade in sharpness.

I can buy 65'' OLED with 120Hz, miles better HDR + VRR support for the price of high end and outdated 4K monitor without VRR support and BS HDR. Damn, hard choice, can't decide.... Ugh, FFS.

PC Monitors and their makers can fuck right off with absolutely insane and unjustified prices.
That's still a television, not a monitor. And before the usual crap, here's from when I ran a C3. Monitors fill up your vision way better. I always felt like I had to keep looking in the corners.

PXL-20230329-222103451.jpg
 

Jack Videogames

Gold Member
I swapped my prehistoric 1080p60 Dell (more than 10 y.o.) for a 34" ultrawide 1440p165 and it made more difference than any GPU purchase I've made in the last 10 years. But note that such a refresh rate spoils you; if a game doesn't reach 100fps it feels choppy to me, even though 60fps was more than enough two months ago. And my 4070 struggles with it.

What I mean is, make sure the synergy between your GPU and your monitor is appropriate. If you don't want to pony up for a 4090 then don't get a high resolution /high refresh rate monitor and wait.
 

Puscifer

Member
The quality of what you see is determined not only by the resolution but also by the PPI of your monitor. E.g. 1440p at 27" is fine; at 32", however, it really starts to look blurry.
Even at 27", 4K is a huge upgrade, especially when you're working from home, given the amount of information you can fit on screen
 

Bry0

Member
I do like the idea of such a high dpi that running in native resolution gives you the equivalent of super resolution AA as a natural property of the display. Anti-Aliasing artifacts could be dispensed with all together. I still imagine that the smoothing effects of AA were still perceivable, just not as noticeable at 4k/32" as much as I like the sound of your experience. Was this the case?
Yeah, it's still perceivable when turned on, but I think IQ is good without it. You can also use something like DSR.
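For anyone unfamiliar with why high DPI behaves like supersampling AA: rendering more pixels than you display and averaging them down softens jagged edges. A toy illustration of the idea with a plain 2x2 box filter (real DSR-style downscaling typically uses a fancier filter; this is just the principle):

```python
def downsample_2x(img):
    """Average each 2x2 block of a supersampled grayscale image,
    halving its resolution - a box-filter downscale."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x+1] + img[y+1][x] + img[y+1][x+1]) / 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard black/white diagonal-ish edge rendered at 2x resolution...
hi_res = [[0,   0, 255, 255],
          [0,   0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
# ...comes out with an intermediate gray where the edge crossed a block.
print(downsample_2x(hi_res))  # [[0.0, 255.0], [127.5, 255.0]]
```

That intermediate 127.5 value is the "free" anti-aliasing: the higher the native DPI, the finer the blocks being averaged by your eye.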
 

Danknugz

Member
I told myself I wouldn't go 4K until I have a headset that does 4K with a realistic FoV (i.e. not less than 110 degrees horizontal, no goggle effect, full vertical FoV), in an attempt to achieve parity between what I normally game on and VR.

Of course, at that point it would still be slightly less in VR due to the pixels being stretched, but up to that point I won't have gotten used to 4K, so the idea is that it will look better in VR due to me still being used to 1080p.
 
Last edited:

Kataploom

Gold Member
How would 8K help though? I am on a 4K TV and happy to go as low as 1440p if that means ~80fps in my games instead of just 60; the loss of IQ is acceptable
 

analog_future

Resident Crybaby
More pixels on screen make for a much more detailed texture.

Yet modern CGI on a 1080p blu-ray looks infinitely better & more realistic than any 8k game. Why? Lighting, animations, physics simulations, detail (this means a HELL of a lot more than just “textures”), volumetrics, particle effects, etc.. etc..


I don’t know why I’m even bothering with this argument lol
 
Last edited:

bitbydeath

Gold Member
Yet modern CGI on a 1080p blu-ray looks infinitely better & more realistic than any 8k game. Why? Lighting, animations, physics simulations, detail (this means a HELL of a lot more than just “textures”), volumetrics, particle effects, etc.. etc..


I don’t know why I’m even bothering with this argument lol
You really think today’s CGI is made in 1080P?

 
What's the latest OLED and/or MiniLED in the Heisenberg household?
Currently in OLED I have the LG 45" ultrawide 240 hz
The MSI 32" 4K 240Hz and the LG 32" 4K 240Hz with 1080p 480Hz mode (one of these is getting returned)

In mini LED I have the Samsung 32" 4k 240 hz Odyssey NEO G8 (which I still dearly love)

Then a few IPS panels sitting in storage as well

Yeah I have a problem
 

Kenpachii

Member
I found ultrawide a more meaningful upgrade; it's hard to go back to standard resolutions after that.

I moved from 3440x1440 to 4K and found it a downgrade. Then put my 4K LG C2 at 3440x1440 and it's glorious so far.
 
Last edited:
My late-model Panasonic plasma still works perfectly. I'm in no rush to upgrade when what I have is superior to current TVs that cost 8 times as much.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
To me TVs are better. I played on an excellent Sylvania 1080p HDTV around 2010, when Battlefield 3 with Battlelog was a must-play game
 

Agent_4Seven

Tears of Nintendo
That's still a television, not a monitor.
True, but a 65'' TV is way better than a monitor cuz it has miles better HDR and an OLED screen. You can use such a TV for gaming only. The only downside is the 120Hz limit, and even then not for everyone.
And before the usual crap, here's from when I ran a C3. Monitors fill up your vision way better. I always felt like I had to keep looking in the corners.

PXL-20230329-222103451.jpg
Why would anyone put a huge TV on their table and use it as a PC monitor?:messenger_mr_smith_who_are_you_going_to_call:
 
Last edited:

TrebleShot

Member
Only peasants use monitors.
Ascend to the OLED master race and use custom resolutions.


Currently rocking an LG C2 42, running at 3840 x 1836 (21:10) ultrawide.
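Deriving a custom letterboxed resolution like that is simple arithmetic: keep the panel's full width and solve for the height that gives the target aspect ratio. A hypothetical helper (my own rounding convention; conventions vary, which is why 21:10 lands near, rather than exactly on, the 1836 above):

```python
def letterbox_height(width_px: int, aspect_w: int, aspect_h: int) -> int:
    # Height that fits aspect_w:aspect_h across a panel width_px wide,
    # rounded to an even number of lines.
    h = width_px * aspect_h / aspect_w
    return int(round(h / 2) * 2)

print(letterbox_height(3840, 21, 9))   # 1646: a 21:9 strip on a 4K panel
print(letterbox_height(3840, 21, 10))  # 1828: close to the 3840x1836 setup
```

The remaining panel rows above and below simply stay black, which is a non-issue on OLED.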
 

MetalRain

Member
I've been using a 4K monitor for five years; it was worth it and still is. Now monitor prices are lower, GPUs are more powerful, DLSS works well. Really a no-brainer.
 
Last edited:

rofif

Can’t Git Gud
4K looks ten thousand times better than 1440p though. Get a big OLED monitor or TV that's 4K, even if you use DLSS.
If not, why are you even gaming?
 
Last edited:

kiunchbb

www.dictionary.com
A screen is more than just resolution; color accuracy, input lag, refresh rate, HDR brightness, motion blur, etc. matter more.

I'd been using budget monitors all my life, but once I upgraded to a high-end monitor I couldn't go back. I remember being so amazed at Cyberpunk 2077's HDR when a car headlight pointed directly at my eyes that I almost had to cover them in real life.

The second best thing is super ultrawide; come join us in the super ultrawide master race. Don't settle for anything smaller than 32:9.
 

GreatnessRD

Member
Currently in OLED I have the LG 45" ultrawide 240 hz
The MSI 32" 4K 240Hz and the LG 32" 4K 240Hz with 1080p 480Hz mode (one of these is getting returned)

In mini LED I have the Samsung 32" 4k 240 hz Odyssey NEO G8 (which I still dearly love)

Then a few IPS panels sitting in storage as well

Yeah I have a problem
Wow, rich!

I love my dual monitor setup along with my 55" 4K TV, but I'm thinking about moving to a single monitor 32" for 4K OLED/MiniLED as it appears I won't get 4K 27" OLED, lol. I'm going to wait a little while longer though.
 

ClosBSAS

Member
4K is overrated trash and a waste of resources. These weak-ass consoles should target 1440p 60fps and maybe next gen go for 4K 60...
 

Gaiff

SBI’s Resident Gaslighter
lol I can't imagine gaming on a 27 inch monitor in 2024. That's fucking insane if you are rocking a 4080 or 4090. WTF.

You need to go OLED. You need to go big screen. You need great tvs with a far wider color range than what monitors, even expensive ones can offer. The tv is a much better upgrade than buying a $1000 card. Once you have a big 55-65 inch tv you can then think about upgrading your GPU.

I left monitor based PC gaming way back in 2013 or so. It was too small. Even the $400-500 monitors were just not good at colors or HDR. Gaming like movies is more immersive on giant screens in a dark room. We are not animals.
There are OLED monitors now and there have been for a few years.
 

BlackTron

Member
So yesterday I tried some 4K gaming and honestly I couldn't see that much difference compared to my 1440p. My 1440p is 27"; the 4K is on a 28" monitor. For console gaming it probably is worth it, especially if you have a 55"+ TV.

But you definitely notice the downgrade in fps.

I'll get an RTX 5080 when it comes out and it's probably gonna last me a lot of years on my 1440p monitor.

Until most GPUs can run 8K at decent settings, I don't think it's worth upgrading to 4K in between

I'd say 4K is my upper limit for ever giving a crap at these screen sizes. As it is, my monitor is 23.5" and only 1080p, which leaves lots of headroom for performance on the 3070. I chose this monitor size because I think it's the maximum before diminishing returns on 1080p make me want 1440p.

I would look for a discernible difference in 4K before bothering to compress pixels even further.
 
Wow, rich!

I love my dual monitor setup along with my 55" 4K TV, but I'm thinking about moving to a single monitor 32" for 4K OLED/MiniLED as it appears I won't get 4K 27" OLED, lol. I'm going to wait a little while longer though.
Yeah, those 4K 27" OLED panels aren't even scheduled to possibly go into production until late next year, according to TFTCentral



13:24 mark if my timestamp doesn't work right
 

SF Kosmo

Al Jazeera Special Reporter
4K is useful if you game on a large screen, not so much if you don't. Very confused why you think 8K would be any different.
 

phant0m

Member
Gotta disagree here. I just got the new Alienware 32" 4K/240Hz (AW3225QF) monitor and it felt like a HUGE upgrade from my previous 27" 1440p/120 display.

The 3080 lets me get 70-90 fps at 4K with DLSS in most current games; I expect a 5080 would be able to either hold 120 or deliver true native 4K rendering at the same spec.
 

Rickyiez

Member
You do know that there are those of us who use a big display, like 32-48", as a primary or secondary monitor for gaming, right? That's best with 4K.

You can't be so narrow-minded and assume everyone uses a tiny-ass monitor like you.
 

Sentenza

Member
I've been on 1440p/165Hz for a while, but around the time the 50X0 series from Nvidia drops is precisely when I hope to make 4K/100Hz+ my new baseline standard.
 

yansolo

Member
I don't see much of a difference personally between 1440p and 4K, but of course I'll always pick 4K if my PC can handle it; I also don't mind 1080p either
 