darth_infamous
Member
Apparently no one here has heard of polling https://en.m.wikipedia.org/wiki/Polling_(computer_science)
> Apparently no one here has heard of polling https://en.m.wikipedia.org/wiki/Polling_(computer_science)

You really have to be significantly more specific than that.
> Apparently no one here has heard of polling

It's more caused by pipel...games) let you change it, not sure about AMD.
> I thought Bloodborne's PS4 performance was absolutely awful, especially given how it looked. I'll never play another FROM game anywhere except PC.

I usually notice frame-pacing issues and I hate them, but I didn't notice anything during my 50 hours with Bloodborne. Must be minor issues (graph, anyone?). The game also ran great for me - no big dips. I was pleasantly surprised with this FROM title on consoles. Expected worse. Much worse.
PS. Didn't play Multiplayer. I know it looked different there.
Not sure what is happening, but 30fps just feels so much better on my PS4 than on my PC monitor.
I'm guessing it's either because devs use some post FX to make it look smoother, or maybe it's just because my TV has some crazy motion blur effect or something.
My PC gaming monitor is a 60Hz, 1ms, 1080p panel, and 30fps on it is just unbearable.
I'm not crazy!
That's kind of the point of this thread, with many solutions listed if you'd care to read them:
- Either lock FPS to 30 with RTSS or use Nvidia half-refresh v-sync. Some say to use both, but it is recommended to disable any other frame limiter that might interfere (in-game limiters, v-sync, borderless windowed, etc.) - use ONE method of frame-limiting, otherwise they may clash and hurt frame-times. My recommendation is half-refresh v-sync alone, with the game in full screen.
- max pre-rendered frames to 1
- motion blur options, if available, can help but only if they use a 33ms 'shutter'
This is what I did for Witcher 3 and got perfect frame-pacing that looks the same as the PS4 version, only better because I didn't have a single frame drop in over 100 hours.
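For anyone curious what an external cap like RTSS is doing conceptually, here is a minimal sketch of a fixed-interval frame limiter. This is purely illustrative (made-up function names, simple sleep-plus-spin timing); real limiters hook the game's present call at a much lower level, but the idea is the same: hold every frame to the same 33.3 ms grid so frame-times stay even.

```python
import time

TARGET = 1 / 30  # 33.3 ms per frame for a 30 fps cap


def limited_loop(render_frame, num_frames):
    """Run render_frame under a simple fixed-interval cap.

    Sleeping most of the wait and busy-waiting the last ~1 ms keeps
    frame starts close to the 33.3 ms grid. Hypothetical sketch only.
    """
    next_deadline = time.perf_counter() + TARGET
    starts = []
    for _ in range(num_frames):
        starts.append(time.perf_counter())
        render_frame()
        # Sleep until ~1 ms before the deadline, then spin for accuracy.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0.001:
            time.sleep(remaining - 0.001)
        while time.perf_counter() < next_deadline:
            pass
        next_deadline += TARGET
    return starts
```

The point of the spin at the end is consistency: `time.sleep` alone can overshoot by a few milliseconds, and it's exactly that jitter that reads as "bad frame pacing" on screen.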
I will need to try this. I'm another person who has always felt that 30fps on PC just doesn't feel the same as on console.
Arkham Knight feels so incredibly smooth on PS4. If I totally max Witcher 3 with Hairworks and try to lock to 30fps I don't get the same smoothness.
So setting max pre-rendered frames to 1 helps? Would that also help with a 60fps lock? I've found that it makes Witcher 3 smoother at a 60fps lock, but I'm not sure if it's placebo.
> The reason why TLoU on PS4 felt choppy at 30 FPS was the way the motion blur worked: it behaved as if the game was still running at 60FPS, but because the time between frames is longer at 30FPS, the blur only covered half of each frame interval, creating a choppy look.

I've always wondered about this - why is it that 30fps on console always feels much better than it does on PC? Whenever I play games with a locked 30fps on console (like DriveClub, The Order, Horizon 2, etc.) it feels fine. Whenever I lock games to 30FPS on PC, it always feels like there's some stutter effect and input lag. I tried playing The Witcher 3 at 1620p/30fps as opposed to 1080p/60fps and I really couldn't stand it, despite how much cleaner it looked IQ-wise. From what I've read, I understand it has something to do with v-sync and your TV's refresh rate. Is there any way to make it feel as good on PC?
Just a side note - TLOU Remastered had an option to lock the framerate to 30FPS and that felt similarly awful, which makes things even more confusing. People will say it's placebo/recency effect, but I'm sure it has something to do with the method used to lock the FPS.
Just tried limiting Arkham Knight to 30 in rivatuner and putting my gpu on adaptive half refresh.
Man, this is smooth. I didn't think 30fps could look this good. You learn something new...
Now try without Rivatuner (making sure the game is on full screen not windowed) and see if it's the same.
Without the limit the fps flits between 30-32 and looks slightly stuttery.
Were you suggesting the limit was unnecessary? Not technically versed here.
Max pre-rendered frames = 1 helps in so many games I just set it globally. It's the lowest input lag you can get (other than 0, which gives your GPU no buffer at all to work with, and will probably cause random frame drops).
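As a back-of-the-envelope illustration of why the render-ahead queue matters: each frame sitting in the queue ahead of yours adds roughly one frame time of input lag when the GPU is the bottleneck. This is a simplified model (made-up function name, not how the driver actually schedules work), but it shows why 1 is so much better than 3 at a 30fps lock.

```python
def render_ahead_latency_ms(frame_time_ms, prerendered_frames):
    """Rough input-lag model (illustrative, not vendor-exact):

    a new frame waits behind every already-queued frame before the
    GPU gets to it, so each extra pre-rendered frame adds roughly one
    frame time of latency when the GPU is the bottleneck.
    """
    queue_wait = prerendered_frames * frame_time_ms
    return queue_wait + frame_time_ms  # plus the frame's own render time


# At a 30 fps lock (33.3 ms frames):
for depth in (1, 2, 3):
    print(depth, round(render_ahead_latency_ms(33.3, depth), 1))
    # roughly 66.6 / 99.9 / 133.2 ms queue-to-display
```

Under this model, going from the old default of 3 down to 1 saves about two frame times (~67 ms at 30fps), which is very noticeable with a mouse.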
For Witcher 3 I recommend:
NVCP: pre-rendered frames=1, half-refresh v-sync, prefer maximum performance
In-game: full screen, frame-cap unlimited, v-sync off, motion blur off
I used max settings (inc. hairworks, exc. various post-processing options that aren't to my taste) and kept 30fps 100% of the time, and it felt smooth, barring the inherently hitchy animation when running or riding the horse. As I said, played for over 100 hours and it was very tolerable. Even though I could maintain 60fps probably 80-90% of the time on my 970 with slightly lower settings, I find drops so jarring I would rather just lock at 30 and forget about it.
Thanks for all your input in this thread! I did exactly what you did and the game looks phenomenal on my GTX 780. All graphic settings cranked except for hairworks. I'll be using this in future graphic-intensive games to get a little more life out of my 780. I don't mind 30fps as long as it is consistent.
> I've always wondered about this - why is it that 30fps on console always feels much better than it does on PC? [...] Is there any way to make it feel as good on PC?
> Just a side note - TLOU Remastered had an option to lock the framerate to 30FPS and that felt similarly awful [...]

Because if a game runs at 30fps on consoles, it means it was optimized and meant to run that way; that's the difference you are seeing.

> Because if a game runs at 30fps on consoles, it means it was optimized and meant to run that way [...]

Excuse me?
> The reason why TLoU on PS4 felt choppy at 30 FPS was because of the way the motion blur worked [...]

In CryEngine this was always controlled by the motion blur shutter speed, which you could arbitrarily change - so a 30fps game could have the shutter speed of a 60fps game, and vice versa. If they made shutter speed vary with framerate (or just let the user toggle their preference), it would not do that. AFAIK.
This is a simple explanation and I am sure someone who knows better can explain it better.
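The shutter-speed mismatch described above can be put in numbers with a tiny sketch (an illustrative model with a made-up function name, not CryEngine's actual blur code): blur tuned for a 60fps frame interval covers only about half of each 30fps frame, and the uncovered half is what reads as choppiness.

```python
def blur_coverage(shutter_ms, frame_time_ms):
    """Fraction of each displayed frame interval covered by motion blur.

    A 'shutter' tuned for 60 fps (16.7 ms) covers only about half of a
    33.3 ms frame at 30 fps, so motion trails have visible gaps - the
    choppy look described above. Illustrative model only.
    """
    return min(shutter_ms / frame_time_ms, 1.0)


print(blur_coverage(16.7, 16.7))  # 60 fps blur at 60 fps: full coverage
print(blur_coverage(16.7, 33.3))  # 60 fps blur at 30 fps: ~half coverage
```

If the engine instead scaled the shutter with the real frame time (33.3 ms at 30fps), coverage would stay at 1.0 and the trails would join up smoothly.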
No worries. If you find your performance suddenly tanks check that the game hasn't switched itself back to borderless windowed. It does that sometimes.
Question for you. Just downloaded Nvidia Inspector and it's great. Under "Sync and Refresh" for my Witcher 3 profile, I've put vertical sync at 1/2 refresh rate, and vertical sync tear control at standard. There's one option I don't understand though: vertical sync smooth AFR behaviour. It's set to off and is not grayed out like the other options (since those are all default). Should I concern myself with this?
I have that turned off - I believe it's only for SLI setups (alternate-frame rendering, or something?).
I also prefer standard to adaptive, as in my experience adaptive will occasionally tear when it doesn't need to.
Can someone tell me why *locking* a v-synced output to your monitor's refresh would improve frame pacing? If it's synced to the display, how could pacing possibly improve by adding a frame lock?
I blame it on Kb+M control on PC. Playing 30 fps games on PC with a controller feels fine. But as soon as I use a mouse, it gets ugly.
Even when you get a smooth 30fps following the tips in this thread, the mouse just doesn't seem great as an input device at 30fps, just honestly find it hard to deal with at that framerate. Gamepad feels much better at 30fps.
> Out of curiosity, is it likely that the Steam controller gamepad will feel more janky at 30 FPS than, say, an Xbox controller? (Unless maybe a southpaw setup is used to use the joystick on the left side as the "look around / camera pan" option?)

I assume that will depend on how good it turns out to be.
The better it is, the more it will expose the inherent insufficiency of 30 FPS, just like the mouse (king of input devices) does.
> Tearing looks bad enough when there are two frames on screen. The tear itself looks shitty and it can create a hitching effect in motion. But when the frametime/framerate is fluctuating, you can actually get three frames on screen, often with one being a runt - a sliver between the two torn ones that you don't even see in motion. The runt frame can be just a few pixels tall, residing between the two larger torn frames. So that's a whole frame down the shitter while your system is attempting to draw in a timely way.

Oh, that's what that is! The first time I got that was before I went G-Sync and was messing with settings in New Vegas. It kept happening about 10 seconds into loading a particular save, and I thought my then-new 970 was dying already.
With G-Sync, framerates will still fluctuate, as they tend to do, but at least the whole frame will be shown every time - not two halves of two frames. And you won't be getting runt frames. It's a big improvement.
In a perfect world, all game engines would deliver their frames with perfectly even timing. But they don't, so G-Sync can help.
> Out of curiosity, is it likely that the Steam controller gamepad will feel more janky at 30 FPS than, say, an Xbox controller? (Unless maybe a southpaw setup is used to use the joystick on the left side as the "look around / camera pan" option?)

It's a circular trackpad, so it will feel the same as a mouse. Sticks can feel better because you're just pushing the camera at an engine-set speed, instead of moving the camera from A to B in a much shorter time. Any fast movement looks better at higher framerates - it's the same reason FPSs and driving games are the best for noticing 120fps+.
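The stick-vs-mouse difference described here boils down to rate control versus position control. A toy sketch (all names, sensitivities, and speeds are made up for illustration):

```python
def stick_camera(yaw_deg, stick_x, max_speed_dps, dt):
    """Rate control: stick deflection sets an angular velocity, so
    per-frame rotation is capped at max_speed_dps * dt no matter how
    fast you move the stick."""
    return yaw_deg + stick_x * max_speed_dps * dt


def mouse_camera(yaw_deg, mouse_dx, sensitivity_deg_per_count):
    """Position control: the camera jumps by the full mouse delta in a
    single frame, so a fast flick produces one big per-frame step that
    a low framerate makes very visible."""
    return yaw_deg + mouse_dx * sensitivity_deg_per_count


dt_30 = 1 / 30
# Full stick deflection for one 30 fps frame at 180 deg/s:
print(stick_camera(0.0, 1.0, 180.0, dt_30))  # ~6 degrees per frame
# A quick 300-count mouse flick at 0.05 deg/count, in the same one frame:
print(mouse_camera(0.0, 300, 0.05))  # ~15 degrees in one frame
```

Because the stick's per-frame step is engine-capped, consecutive 30fps frames differ by a small, even amount; a mouse flick can dump a much larger rotation into a single frame, which is why it exposes low framerates so brutally.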