
Killzone Shadow Fall's SP runs at an unlocked frame rate

Durante

Member
Sounds like PS4 could use G-sync.

Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.
 

Perkel

Banned
If you watch a 24fps Blu-ray on a screen that is not a true 120Hz refresh rate--so pretty much any plasma as well as many LCDs--and you look at a slow panning shot, say the beginning of The Dark Knight with the pan over the city, you will see a bit of jumpiness because 24 doesn't divide into 60 without extra frames. That is judder.

Essentially a dropped or added frame, like what I mentioned. Ok, I get it. Now can someone explain to me how, out of 60 or even 120 frames a second, someone can notice one frame added or lost? And we are not talking here about a white frame in a sea of black frames, but the same frame doubled or cut like the rest of the frames. At a low FPS like 35 that should be noticeable, because that means essentially one of 35 frames lost or doubled. The more FPS, the less noticeable the effect is. At 35 FPS it is 1/35 of a second or

Maybe the people noticing it are playing 20-40 fps games non-stop? :>
 

Thrakier

Member
Which is why we need G-Sync technology (or something like it) to become mainstream and appear in more than just crappy LCD monitors.

Well, I'd say the much more elegant solution would be for devs to just optimize their games for 60FPS, which is always the superior solution. So I'm really not that interested in G-Sync; however, I am thankful that finally someone is making these issues public. Maybe today's "elitist" is the soccer mom in 10 years; that would be a huge step. I mean, every soccer mom would call their cable TV provider if movies suddenly started stuttering like crazy. It's just that, for some weird reason (probably the 3D transition, where being 3D was the first priority), it's widely accepted that games can run like shit, and an inconsistent experience affecting visuals AND gameplay is somehow "fine".
 

Sethos

Banned
Yeah. I really don't get this. How many PC gamers lock their framerate? I doubt it is that many.

Wut? I don't know of a single PC gamer among the people I know that doesn't lock his framerate either via a frame specific lock or Vsync / Triple Buffer.
 
Sounds like PS4 could use G-sync.

Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.
How many monitors and TVs am I supposed to buy? But it needs to happen!!
 

Panajev2001a

GAF's Pleasant Genius
If it never dips below 30 FPS, or does so very rarely, it might not be too annoying... I just remember the poor experience of Sonic Adventure DX on GameCube, where the frame rate would go from 15 or so FPS to 60 FPS all over the place... Yikes, talk about a port worsened by its upgrades...
 
I would rather they lock at 60 fps and use dynamic resolution scaling instead of framerate jumping around. Is that feasible on PS4? I read some about that feature on XB1.
 

Thrakier

Member
Essentially a dropped or added frame, like what I mentioned. Ok, I get it. Now can someone explain to me how, out of 60 or even 120 frames a second, someone can notice one frame added or lost? And we are not talking here about a white frame in a sea of black frames, but the same frame doubled or cut like the rest of the frames. At a low FPS like 35 that should be noticeable, because that means essentially one of 35 frames lost or doubled. The more FPS, the less noticeable the effect is. At 35 FPS it is 1/35 of a second or

Maybe the people noticing it are playing 20-40 fps games non-stop? :>

In most games I notice frame drops from 60 to 59 or 58. I can't explain the technical detail, but it's somehow like the engine is just choking up frames for a second and then going normal. In most cases the effect is quite jarring. Take for example the latest NfS games - as soon as they drop just 1 frame, the game is a stuttery mess. Back to 60 and it's perfectly smooth. Amnesia: Machine for Pigs was the same. Put me in a blind test if you don't believe me.
 

dark10x

Digital Foundry pixel pusher
I don't know jack about GG, their engine, or this game. Just sayin', if an experienced dev says the game is better this way than that way, then that is probably the case.

Whether the game is any good or not, I have no clue. Maybe unlocked just makes the game a bit less shitty, who knows. But simply saying "locked 30 is always better than unlocked 30-60" is reductive.
Their choice introduces image judder that would not be present at a locked 30 fps. That is fact.

The other truth, however, is that input latency decreases as the framerate increases (frame time is simply the inverse of the framerate). They are likely prioritizing that over visual consistency.

The solution would be to simply offer the option to limit the framerate. That would completely solve this issue for everyone and would be very easy to implement. As it stands, I may have to use my PC to find a creative work around this problem by combining it with a 1080p capture card and some other software.
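To put rough numbers on the latency point above, here's a back-of-the-envelope sketch in Python; these are bare frame times only, and real input latency adds display, OS, and engine overhead on top:

```python
# Rough sketch: one frame of render delay at various framerates.
# Real input latency adds display, OS, and engine overhead on top.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 45, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 45 fps -> 22.2 ms, 60 fps -> 16.7 ms
```

So going from a locked 30 to an unlocked 45-60 shaves somewhere between 11 and 17 ms off each frame of delay.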

Well, I'd say the much more elegant solution would be for devs to just optimize their games for 60FPS, which is always the superior solution. So I'm really not that interested in G-Sync; however, I am thankful that finally someone is making these issues public.
I'm not sure I agree with you. G-Sync is really incredible stuff. Variable framerates on a G-Sync display no longer APPEAR variable in the traditional sense. 55 fps looks virtually identical to 60 fps, for instance, and you can easily hit 70-80 or higher fps without worrying about dips. It completely turns everything we knew on its head, really.
 

SapientWolf

Trucker Sexologist
Which is why we need G-Sync technology (or something like it) to become mainstream and appear in more than just crappy LCD monitors.


I suspect they believe they are doing the hardcore a favor as higher framerates will produce less input lag even if the end result suffers from image judder. So they probably focused heavily on getting the latency under control. If they cap it at 30 fps latency will increase but I'd sooner take a little bit of extra latency over an inconsistent framerate.
I don't see how G-Sync is going to help evenly pace the wildly fluctuating frametimes if it's designed around delivering frames as soon as possible. Or maybe there's more to it than that.
 

Sethos

Banned
I don't see how G-Sync is going to help evenly pace the wildly fluctuating frametimes if it's designed around delivering frames as soon as possible. Or maybe there's more to it than that.

G-sync is designed to set the Hz to the actual FPS, which may sound simple but solves a lot of problems.
 

Durante

Member
I would rather they lock at 60 fps and use dynamic resolution scaling instead of framerate jumping around. Is that feasible on PS4? I read some about that feature on XB1.
Variable resolution rendering is a software feature, not a hardware feature. Every HW platform can do it; e.g. some PS3 games do it.
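For illustration, a toy sketch of how such a scaler might work in software (all numbers, thresholds, and the step size here are made up, not how any real engine does it):

```python
# Toy sketch: nudge a resolution scale factor toward a 16.7 ms
# (60 fps) GPU budget. All thresholds are invented for illustration.
def adjust_scale(scale, gpu_ms, budget_ms=16.7, step=0.05):
    if gpu_ms > budget_ms:
        scale -= step          # over budget: render fewer pixels
    elif gpu_ms < budget_ms * 0.85:
        scale += step          # comfortably under: claw resolution back
    return round(max(0.5, min(1.0, scale)), 2)

print(adjust_scale(1.0, 20.0))  # over budget -> 0.95
print(adjust_scale(0.8, 12.0))  # under budget -> 0.85
```

The key design point is that the framerate stays fixed while the pixel count absorbs the load spikes, which is the opposite trade-off from an unlocked framerate.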

In most games I notice frame drops from 60 to 59 or 58. I can't explain the technical detail, but it's somehow like the engine is just choking up frames for a second and then going normal. In most cases the effect is quite jarring. Take for example the latest NfS games - as soon as they drop just 1 frame, the game is a stuttery mess. Back to 60 and it's perfectly smooth. Amnesia: Machine for Pigs was the same. Put me in a blind test if you don't believe me.
Yeah, that's because of fixed refresh rate displays. That's exactly the kind of stutter G-sync will fix.
 
When your framerate does not divide evenly into your refresh rate, frames must be repeated. At 30 fps on a 60 Hz display each frame is displayed twice. At 60 fps each frame is displayed just one time. If you display at 45 fps, however, some frames will be shown for one refresh while others are shown for two. It becomes worse when the framerate varies all over the place and you end up with repeating frames here and single frames there, all out of order. The result is motion judder.
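A quick sketch of that counting, assuming perfectly even frame delivery (Python, just for illustration):

```python
# Sketch: how many 60 Hz refreshes each rendered frame occupies,
# assuming frames arrive at a perfectly even rate.
def repeats(fps, hz=60, frames=6):
    return [int((f + 1) * hz / fps) - int(f * hz / fps) for f in range(frames)]

print(repeats(30))  # [2, 2, 2, 2, 2, 2] -> even cadence, smooth
print(repeats(60))  # [1, 1, 1, 1, 1, 1] -> even cadence, smooth
print(repeats(45))  # [1, 1, 2, 1, 1, 2] -> mix of 1s and 2s, judder
```

With a real unlocked game the frame delivery isn't even either, so the pattern of 1s and 2s becomes irregular on top of being mixed.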

I thought the LCD/digital age got rid of this. I remember reading (when LCD/digital input came into the picture) that LCD displays only change pixels when there is a change in the image feed/a new frame, and the refresh rate only limits the fastest possible frame change (for example 1/60s) until the next frame can be updated. If that is not the case, TV manufacturers are the biggest idiots in the world. We live in the digital age, where this would be easy as shit to implement.
 

leadbelly

Banned
Wut? I don't know of a single PC gamer among the people I know that doesn't lock his framerate either via a frame specific lock or Vsync / Triple Buffer.

I will rephrase it: how many PC gamers lock their fps to 30fps because occasionally there are framerate drops?
 
I'm sure they have tested the game locked and unlocked. If they say unlocked improves the experience then I'm not going to argue before I play it.
 

Perkel

Banned
In most games I notice frame drops from 60 to 59 or 58. I can't explain the technical detail, but it's somehow like the engine is just choking up frames for a second and then going normal. In most cases the effect is quite jarring. Take for example the latest NfS games - as soon as they drop just 1 frame, the game is a stuttery mess. Back to 60 and it's perfectly smooth. Amnesia: Machine for Pigs was the same. Put me in a blind test if you don't believe me.

That is not what we are talking about here. What you described is stutter because the engine is fucked up. Like Fallout 3/NV.

And those drops aren't from 60 to 59, but from 60 to sometimes essentially 0 for a split second, and FRAPS doesn't register that since it shows an average framerate.

I had it with Fallout 3/NV, where FRAPS showed 60FPS non-stop but in reality the game stutters like a motherfucker because of the 60Hz bug in their engine.

What you described rarely happens on consoles, because it is most of the time a problem with how the engine handles particular GPUs or GPU drivers.
 

dark10x

Digital Foundry pixel pusher
I don't see how G-Sync is going to help evenly pace the wildly fluctuating frametimes if it's designed around delivering frames as soon as possible. Or maybe there's more to it than that.
Nothing like it has existed before so I can understand why it is difficult to understand but, as Sethos notes, it's basically directly tying refresh rate to framerate. No more duplicate frames. Of course, if you dip under 30 fps using this technology it seems like you end up with gaps in the image so I'm not sure how they can deal with that yet.

The point is, 51 fps will produce an image that looks very much like 60 fps. A drop to 58 fps will be virtually undetectable. It's a complete change in the way the screen is drawn.

I thought the LCD/digital age got rid of this. I remember reading (when LCD/digital input came into the picture) that LCD displays only change pixels when there is a change in the image feed/a new frame, and the refresh rate only limits the fastest possible frame change (for example 1/60s) until the next frame can be updated. If that is not the case, TV manufacturers are the biggest idiots in the world. We live in the digital age, where this would be easy as shit to implement.
It works just as before. Nothing has changed, and it wasn't a simple change to make. Most displays were made with video content in mind (which displays at an even framerate). G-Sync is designed for gaming.
 

Durante

Member
I thought the LCD/digital age got rid of this. I remember reading (when LCD/digital input came into the picture) that LCD displays only change pixels when there is a change in the image feed/a new frame, and the refresh rate only limits the fastest possible frame change (for example 1/60s) until the next frame can be updated. If that is not the case, TV manufacturers are the biggest idiots in the world. We live in the digital age, where this would be easy as shit to implement.
TV (and monitor) manufacturers are the biggest idiots in the world, and Nvidia just now fixed this issue after more than a decade of digital displays being the norm.

Nothing like it has existed before so I can understand why it is difficult to understand but, as Sethos notes, it's basically directly tying refresh rate to framerate. No more duplicate frames. Of course, if you dip under 30 fps using this technology it seems like you end up with gaps in the image so I'm not sure how they can deal with that yet.
Below 30 FPS they will do frame doubling. So you should still keep your framerate above 30 at all times even with a G-sync display.
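As a sketch of that idea (assuming a hypothetical panel with a 30 Hz minimum and 144 Hz maximum refresh; real panel limits vary):

```python
# Sketch: choosing a refresh rate on a variable-refresh panel that
# can't refresh below 30 Hz. Limits are assumed, not from any spec.
def effective_refresh(fps, min_hz=30, max_hz=144):
    hz = fps
    while hz < min_hz:       # re-scan the last frame until in range
        hz *= 2
    return min(hz, max_hz)

print(effective_refresh(55))   # 55 -> panel refreshes at 55 Hz
print(effective_refresh(24))   # 48 -> each frame scanned out twice
```

So below the panel's minimum the motion cadence is still even, but you're burning the headroom on duplicate scans rather than new frames.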
 

Thrakier

Member
I'm not sure I agree with you. G-Sync is really incredible stuff. Variable framerates on a G-Sync display no longer APPEAR variable in the traditional sense. 55 fps looks virtually identical to 60 fps, for instance, and you can easily hit 70-80 or higher fps without worrying about dips. It completely turns everything we knew on its head, really.

It will only encourage devs to choose 30FPS, I'm afraid. The technology itself is fine, but it's a workaround for a shitty game design philosophy. It would not be needed if devs set their priorities straight.
 

QaaQer

Member
Sounds like PS4 could use G-sync.

Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.

It would be a great way to sell tvs to gamers for sure. I'd be pretty excited if this was the case.

I wonder if a G-Sync thing would help Oculus at all.
 

dark10x

Digital Foundry pixel pusher
It will only encourage devs to choose 30FPS, I'm afraid. The technology itself is fine, but it's a workaround for a shitty game design philosophy. It would not be needed if devs set their priorities straight.
It will no longer be about "choosing" 30 fps as you can just let the framerate run free without issue.

Below 30 FPS they will do frame doubling. So you should still keep your framerate above 30 at all times even with a G-sync display.
Interesting, so that's what they've decided on? Fluctuating between 29 and 30 fps would be pretty awful then.

Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.
*IF* Sony were to do this I would immediately buy one of their TVs. I'm dreaming of an OLED TV with this type of technology now....

I'm so annoyed that G-Sync is so far off for me as I want it so very much. It's useless to me in small PC LCD form as I simply do not play games on my monitor these days.
 
Wut? I don't know of a single PC gamer among the people I know that doesn't lock his framerate either via a frame specific lock or Vsync / Triple Buffer.

I never lock my framerate if it's going to be variable above 30. I also don't see any judder if it isn't dipping below 30, as long as I have V-Sync on. Please show me a video of this judder you speak of.
 

QaaQer

Member
Essentially a dropped or added frame, like what I mentioned. Ok, I get it. Now can someone explain to me how, out of 60 or even 120 frames a second, someone can notice one frame added or lost? And we are not talking here about a white frame in a sea of black frames, but the same frame doubled or cut like the rest of the frames. At a low FPS like 35 that should be noticeable, because that means essentially one of 35 frames lost or doubled. The more FPS, the less noticeable the effect is. At 35 FPS it is 1/35 of a second or

Maybe the people noticing it are playing 20-40 fps games non-stop? :>

With Blu-rays, it actually isn't very noticeable unless you know what it is, you are looking for it, and it is a panning shot. I don't really think judder is the right word for what people are discussing here, though. I think what bothers people is large fps changes, say from 55 to sub-20 back to 55 within the space of 1 or 2 seconds.
 

dark10x

Digital Foundry pixel pusher
I never lock my framerate if it's going to be variable above 30. I don't know any that do, and I never see you in any of the PC threads, Sethos.
I always lock my framerates. I never accept fluctuation on the PC.

With Blu-rays, it actually isn't very noticeable unless you know what it is, you are looking for it, and it is a panning shot. I don't really think judder is the right word for what people are discussing here, though. I think what bothers people is large fps changes, say from 55 to sub-20 back to 55 within the space of 1 or 2 seconds.
No, I'm talking about judder which is exactly what you get with a Blu-ray movie on most displays.

The Kuro that I use features a proper 72 Hz mode, however, which can properly display 24 fps content (by displaying each frame three times). This produces completely even image panning in all films. It's hard for me to watch movies on other displays at this point.
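The arithmetic behind that is easy to check with a small sketch: 24 divides 72 exactly, but not 60, so at 60 Hz you get the uneven 3:2-style cadence people call judder.

```python
# Sketch: refreshes per film frame at 60 Hz vs a proper 72 Hz mode,
# assuming perfectly timed 24 fps content.
def cadence(fps, hz, frames=8):
    return [int((f + 1) * hz / fps) - int(f * hz / fps) for f in range(frames)]

print(cadence(24, 60))  # [2, 3, 2, 3, ...] -> uneven pulldown, judder
print(cadence(24, 72))  # [3, 3, 3, 3, ...] -> every frame held 3x, smooth
```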
 

sol_bad

Member
And some people can't see 60FPS, some people can't see tearing, some people can't feel input and some people can't even see a high resolution. That doesn't mean it isn't there.

I can tell framerate differences, I notice horrible screen tearing and I notice resolution differences on my plasma. I have also noticed micro stutters on my PC in the past at times when it has happened (Assassin's Creed).

I don't understand what you mean by juddering when switching between framerates.
 
In Crysis 3 I was around 30-45 with everything relatively maxed. Never once did I think that was anything but a smooth experience.

Crysis games are a rare exception due to their relatively pristine motion blur and use of temporal AA. The visual experience is much better than most as a result.
 

dark10x

Digital Foundry pixel pusher
In Crysis 3 I was around 30-45 with everything relatively maxed. Never once did I think that was anything but a smooth experience.
Whatever works for you.

When I play Crysis 3 I either drop details for a constant 60 or max everything out and lock it to 30 (depends on whether I'm using a keyboard/mouse or a gamepad).

Obviously this varies from person to person but let's not pretend the issue doesn't exist.

I don't understand what you mean by juddering when switching between framerates.
I explained it above.
 
This game looks like it would be really fun if you play it without the goddamn xray vision thing. Seeing through walls when you have the ability to blast through them is silly.
 
Whatever works for you.

When I play Crysis 3 I either drop details for a constant 60 or max everything out and lock it to 30 (depends on whether I'm using a keyboard/mouse or a gamepad).

Obviously this varies from person to person but let's not pretend the issue doesn't exist.


I explained it above.

I understand it being an issue if it's dipping below 30, but he is saying 30 is the baseline, right? Show me some video of the issue encountered in a game varying between 45 and 30.

Crysis games are a rare exception due to their relatively pristine motion blur and use of temporal AA. The visual experience is much better than most as a result.

Was Tomb Raider the same way? That was dropping from 60 to around 42, if I remember correctly.
 

jett

D-Member
Wut? I don't know of a single PC gamer among the people I know that doesn't lock his framerate either via a frame specific lock or Vsync / Triple Buffer.

How do you artificially lock a framerate in a PC game with an AMD card? Using triple buffering certainly doesn't do that.
 

Perkel

Banned
Whatever works for you.

When I play Crysis 3 I either drop details for a constant 60 or max everything out and lock it to 30 (depends on whether I'm using a keyboard/mouse or a gamepad).

Obviously this varies from person to person but let's not pretend the issue doesn't exist.


I explained it above.

Could this be an issue of image processing in TVs and monitors? I haven't used a monitor since like 2008, when I swapped my CRT for an LCD TV.

Maybe frame judder is simply way less noticeable on TVs that do some image processing? Monitors strive to display everything as close to the source material as possible, mostly without any image processing. TVs, on the other hand, usually have some tech that does image processing.

Maybe that is why I can't see judder on my TV? Maybe there is some image processing going on between frames that "smooths" them?

With Blu-rays, it actually isn't very noticeable unless you know what it is, you are looking for it, and it is a panning shot. I don't really think judder is the right word for what people are discussing here, though. I think what bothers people is large fps changes, say from 55 to sub-20 back to 55 within the space of 1 or 2 seconds.

Those are micro-stutters, which are created by the engine itself, not by the display we are talking about here. They are mostly bugs in the engine that rarely happen on consoles (because of the fixed hardware).
 

dark10x

Digital Foundry pixel pusher
I understand it being an issue if it's dipping below 30, but he is saying 30 is the baseline, right? Show me some video of the issue encountered in a game varying between 45 and 30.



Was Tomb Raider the same way? That was dropping from 60 to around 42, if I remember correctly.
I can't easily show you a video without sitting down and making one myself, and there are very few 60 fps videos out there to begin with. Do you think we're just making this up?!
 
TV (and monitor) manufacturers are the biggest idiots in the world, and Nvidia just now fixed this issue after more than a decade of digital displays being the norm.

Holy poop. LCDs don't have to sweep the screen constantly like CRT TVs to keep a picture. How updating the screen only when the frame updates is not the norm for displays is mind-boggling.
 

QaaQer

Member
I always lock my framerates. I never accept fluctuation on the PC.


No, I'm talking about judder which is exactly what you get with a Blu-ray movie on most displays.

The Kuro that I use features a proper 72 Hz mode, however, which can properly display 24 fps content (by displaying each frame three times). This produces completely even image panning in all films. Hard to watch movie on other displays for me at this point.

You're like my gf. Once she experiences something really, really good, anything less is just of no interest to her, to the point of actually being repulsive. I can't buy grocery store strawberries, cheapish cars, econo transatlantic airline tix, Starbucks espresso, etc.

Luckily she's never seen a 72Hz color-calibrated plasma, so we can still watch TV together. I fear the day she watches something on a $10k 72Hz calibrated OLED TV.
 

dark10x

Digital Foundry pixel pusher
How do you artificially lock a framerate in a PC game with an AMD card? Using triple buffering certainly doesn't do that.
Afterburner. Everything else introduces micro stutter, for the most part, including Nvidia's options.
 