That's not a thing... movies work at 24fps because it captures an aesthetic that is pleasing... games are about controlling well, so 60+ is always superior. Also, games actually render discrete frames at their framerate... movie frames, for obvious reasons, are exposures of continuous real motion.
So I guess the "half-refresh" thing is part of Nvidia's drivers...
What's the AMD counterpart (if there is one)?
I've also had problems getting a smooth 30 fps on my PC.
I really stand by the "refresh rate on PC right before you launch the game" theory, and I barely see it being brought up:
on a TV you're looking at almost everything at 24fps, then you launch a game at 30fps and it's "omg, smooth"
on a PC you're looking at everything on the desktop at at least double the refresh rate you then get when you launch a game at a lower framerate, so it's jarring in the opposite way it is on a TV
I really do think it's mostly that.
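The TV side of this checks out with simple arithmetic. A minimal sketch (assuming a plain 60 Hz panel with no motion interpolation, which is my own illustration, not anything from this thread) of how evenly each source framerate maps onto display refreshes:

```python
# How many display refreshes each source frame occupies on a fixed 60 Hz panel.
def refreshes_per_frame(fps, refresh_hz=60.0, frames=10):
    """Count the refreshes each source frame stays on screen."""
    counts = []
    for i in range(frames):
        start = int(i * refresh_hz / fps)       # refresh index where frame i appears
        end = int((i + 1) * refresh_hz / fps)   # refresh index where frame i+1 takes over
        counts.append(end - start)
    return counts

print("30fps on 60 Hz:", refreshes_per_frame(30))  # [2, 2, 2, ...] -> perfectly even
print("24fps on 60 Hz:", refreshes_per_frame(24))  # [2, 3, 2, 3, ...] -> 3:2 judder
```

30 divides 60 exactly, so every frame is held for exactly two refreshes; 24 has to alternate between two and three refreshes, which is the judder you see in film playback on 60 Hz displays.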
For a console example, play the Last of Us on PS4 and toggle the frame rate to 30 FPS after playing for a bit. It feels like you enabled slideshow mode.
The answer is motion blur. Uncharted 4 uses copious amounts of it.
- Proper frame pacing (see the sketch after this list)
- Object motion blur (if this is not implemented properly it can give games a choppy look, like how 30FPS TLoU Remastered feels choppier than the PS3 version, which ran at an unstable framerate, because the object motion blur was not optimised for 30FPS in the PS4 version)
- A framerate capper
- Controllers, which work better at low framerates than 1:1 input devices like the mouse
In that order... not having any one of these will make your game feel choppier in comparison to something like UC4. It's not that UC4 doesn't feel like 30FPS and feels like more; it's just that you've had bad 30FPS experiences in comparison.
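For the frame pacing and capper points above, here's a minimal sketch of what a well-paced 30fps cap loop does (render_frame() is a hypothetical stand-in, not any engine's actual API; tools like RTSS achieve the same effect more precisely by limiting when the game presents frames):

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def render_frame():
    """Stand-in for the game's real simulation + render work."""
    time.sleep(0.010)  # pretend a frame costs 10 ms

deadline = time.perf_counter()
for _ in range(300):  # ~10 seconds of "gameplay"
    render_frame()
    deadline += FRAME_TIME
    # Pacing: sleep most of the remaining budget, then busy-wait the last
    # couple of milliseconds, because OS sleeps are too coarse to hit the
    # deadline exactly. Uneven frame delivery (33 ms, 20 ms, 45 ms...) is
    # precisely what makes an "average 30fps" feel choppy.
    while (remaining := deadline - time.perf_counter()) > 0:
        if remaining > 0.002:
            time.sleep(remaining - 0.002)
```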
+1 for the "use half refresh rate + RTSS" team. It changes everything.
EDIT: Borderless Fullscreen + RTSS might be enough for some games, too.
Doesn't the "use half refresh rate" option increase the amount of input lag by a considerable amount? I've done this method and noticed that the controls were not as responsive as before.
For a console example, play the Last of Us on PS4 and toggle the frame rate to 30 FPS after playing for a bit. It feels like you enabled slideshow mode.
Funny thing is, if I remember correctly... just restart the game after you enable the 30 fps lock and it will feel much better ))

That's because the 30FPS mode has borked motion blur (both camera and object) that is suited for 60FPS but wasn't optimised for 30FPS. So what happens is that when you cap it at 30FPS, each frame stays on screen for twice as long, but the motion blur implementation is unchanged and as such only covers half of that duration, causing a break which makes it look choppy even compared to the PS3 version, which ran at a much lower framerate.
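To put toy numbers on that (my own illustration, not anything from the game's code): if the blur streak is generated for a 60FPS interval but each frame is held for a 30FPS interval, the blur only bridges half the motion:

```python
# Toy numbers for the "blur tuned for 60FPS, game running at 30FPS" case.
speed = 600.0            # object speed in px/s
blur_window = 1 / 60     # interval the blur streak was authored for (s)
frame_hold = 1 / 30      # how long each frame actually stays on screen (s)

blur_length = speed * blur_window        # 10 px of smear baked into the frame
motion_per_frame = speed * frame_hold    # 20 px the object actually moves per frame

print(f"Blur covers {blur_length / motion_per_frame:.0%} of the per-frame motion")
# -> 50%: half of each frame's motion has no blur bridging it, so it reads as stutter
```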
The frame-pacing is really off on PC unless you jump through hoops with some combination of the driver control panel, RTSS and in-game settings.
The issue is still there 100% with an X360 controller.

It's the controls. With a mouse 30fps feels sloppy; with a controller the effect is reduced.
I can't stand playing games at 30fps on my PC. It just looks awful. However, 30fps on PS4 doesn't bother me at all. Not sure why that is. Maybe developers use some kind of motion blur to make it look smoother, or maybe it's just my monitor that makes 30fps look bad. When I first built my PC I kept saying I'd be happy to go down to 30fps, but now games just have to be 60fps.
It all depends on correct frame pacing in my opinion, as uneven frame times will make 30 FPS on PC absolutely GARBAGE.
Capping at 30 with RTSS and playing in Borderless Windowed Mode basically provided me with identical 30 FPS "smoothness" between TV and PC.
What if one just uses the borderless function from within the game's graphics options instead of using the Borderless Windowed program? Is it the same?
I honestly have to say, and it's not a dig at PC, that I've always found 30fps on PC looks way worse than 30fps on consoles; it's really bad, stutters like a slideshow, and feels janky. But then, I find the difference between 30fps and 60fps on consoles not that big, not nearly as big as the gap between 30fps on consoles and 30fps on PC. Especially when going into U4 solo after playing some multiplayer at 60fps.
Is this because of the absence of a true 30fps cap by default on PC? Is it the result of frame-pacing issues and small but constant fluctuations of the framerate around 30fps? This is a genuine question, because I've never experienced a fluid framerate below 60fps on PC.
Edit: I don't have a PC anymore, but if I get one, I'll test whether that adaptive half refresh rate option in the Nvidia panel really improves things.
I would love to compare 30fps on a TV with 30fps on a monitor; I would guess motion resolution is vastly superior on the monitor, with the TV looking blurrier in motion. That would explain why, to me, 30fps feels really different on a console/TV setup compared with a PC/monitor one.
I just finished Bloodborne on PS4 about 2 months ago. I then fired up Dark Souls 3 on PC and right away was blown away by how smooth the game looked and played. If you want a stark comparison of 30fps vs 60fps gaming, try that one. I want the next Bloodborne on PC so bad now.
That's not it. Movies work at 24fps because each frame is a representation of the movement that happened in the frame interval; whenever there's motion, there's perfect motion blur in a movie. Take the motion blur out and the low framerate becomes very noticeable, which is what happens in stop motion animation.
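This is the standard 180-degree shutter arithmetic (textbook film numbers, not something from this thread):

```python
# Film's "perfect" motion blur comes from physical exposure time. With the
# classic 180-degree shutter, each 24fps frame is exposed for half of the
# frame interval, so real motion smears naturally across the frame.
fps = 24
shutter_angle = 180  # degrees; the standard cinema look

frame_interval = 1 / fps                          # ~41.7 ms between frames
exposure = frame_interval * shutter_angle / 360   # ~20.8 ms, i.e. a 1/48 s shutter

print(f"frame interval: {frame_interval * 1000:.1f} ms")
print(f"exposure per frame: {exposure * 1000:.1f} ms (1/{round(1 / exposure)} s)")
```

A game frame, by contrast, is an instantaneous snapshot unless the renderer synthesizes that smear itself, which is why low framerates read so much harsher in games than in film.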
Now playing The Witcher 3 on PS4 feels choppy to me after playing UC4, and both most definitely run at 30 most of the time. So I guess motion blur and the quality of animations go a long way.
The answer is motion blur. Uncharted 4 uses copious amounts of it.
I see a smooth-moving ball, then two jerky-moving balls; the only difference is that one of the jerky balls looks all blurry and the other looks clearer.
Motion blur doesn't make things look like they have a higher framerate; it just makes them look blurrier.
It looks even worse than intended if you don't have a 120Hz monitor (or any monitor whose refresh rate is divisible by 24), due to the judder.

I don't see a major difference between 30fps and 30fps with motion blur. The animation in the 3rd picture feels the same as in the 2nd, even worse actually.
No, that's because of lag and frame pacing.
And this applies to 60fps games too.
The answer is motion blur. Uncharted 4 uses copious amounts of it.
Also: how was the light? The more light in the room, the worse low fps feels. That's why cinemas can do 24fps movies: it's totally dark. It would feel choppy in bright daylight.
I was playing Uncharted 4 at a friend's place recently, and it blew my mind as to how smooth the game was, at 30 fps no less.
I'm not sure the example posted is all that great, but yes, motion blur absolutely makes things look like they run more smoothly than they actually do. Uncharted 4 doesn't look nearly as choppy as a 30fps game with no motion blur.