SniperHunter
Thnx, will watch soon: http://www.ustream.tv/recorded/40142006
51 minutes in.
Seriously no idea why all of the recent preview footage is like sub-20. I think somebody fucked up big time while encoding the media stock footage
If you watch a 24 fps Blu-ray on a screen that is not a true 120 Hz refresh rate--so pretty much any plasma as well as many LCDs--and you look at a slow panning shot, say the beginning of The Dark Knight with the pan over the city, you will see a bit of jumpiness because 24 doesn't divide into 60 without extra frames. That is judder.
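The uneven cadence behind that jumpiness can be sketched with a few lines of arithmetic. This is an illustrative model (the function and numbers are mine, not from the thread): count how many display refreshes each film frame occupies when 24 fps content plays on a 60 Hz panel versus a 72 Hz mode.

```python
import math

def repeat_pattern(fps, hz, frames):
    """For each source frame, count the display refreshes it occupies."""
    counts = []
    for i in range(frames):
        # refresh ticks covering this frame's interval [i/fps, (i+1)/fps)
        first = math.ceil(i * hz / fps)
        last = math.ceil((i + 1) * hz / fps)
        counts.append(last - first)
    return counts

print(repeat_pattern(24, 60, 4))  # -> [3, 2, 3, 2]: uneven 3:2 cadence = judder
print(repeat_pattern(24, 72, 4))  # -> [3, 3, 3, 3]: even cadence, smooth pans
```

The alternating 3-then-2 repeats are exactly why pans stutter on a 60 Hz display, while a 72 Hz mode (as on some Pioneer Kuro plasmas mentioned later in the thread) shows every frame the same number of times.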
Which is why we need G-Sync technology (or something like it) to become mainstream and appear in more than just crappy LCD monitors.
Yeah. I really don't get this. How many PC gamers lock their framerate? I doubt it is that many.
It's fucking stupid. At least give me the option to lock it to 30 and play without the fucking stutter. Same for Knack.
Sounds like PS4 could use G-sync.

How many monitors and TVs am I supposed to buy? But it needs to happen!!
Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.
Essentially a dropped or added frame, which is what I mentioned. OK, I get it. Now can someone explain to me how, out of 60 or even 120 frames a second, someone can notice one frame added or lost? And we are not talking here about a white frame in a sea of black frames, but the same frame doubled or cut like the rest of the frames. At a low FPS like 35 it should be noticeable, because that means essentially one of 35 frames is lost or doubled. The more FPS, the less noticeable the effect is. At 35 FPS that is 1/35 of a second, or roughly 29 ms.
Maybe the people noticing it are playing 20-40 fps games non-stop? :>
Their choice introduces image judder that would not be present at a locked 30 fps. That is fact.

I don't know jack about GG, their engine, or this game. Just saying, if an experienced dev says the game is better this way than that way, then that is probably the case.
Whether the game is any good or not, I have no clue. Maybe unlocked just makes the game a bit less shitty, who knows. But simply saying "locked 30 is always better than unlocked 30-60" is reductive.
I'm not sure I agree with you. G-Sync is really incredible stuff. Variable framerates on a G-Sync display no longer APPEAR variable in the traditional sense. 55 fps looks virtually identical to 60 fps, for instance, and you can easily hit 70-80 or higher fps without worrying about dips. It completely turns everything we knew on its head, really.

Well, I'd say the much more elegant solution would be for devs to just optimize their games for 60 FPS, which is always the superior solution. So I'm not really that interested in G-Sync; however, I am thankful that someone is finally making these issues public.
Which is why we need G-Sync technology (or something like it) to become mainstream and appear in more than just crappy LCD monitors.

I don't see how G-Sync is going to help evenly pace the wildly fluctuating frametimes if it's designed around delivering frames as soon as possible. Or maybe there's more to it than that.
I suspect they believe they are doing the hardcore a favor as higher framerates will produce less input lag even if the end result suffers from image judder. So they probably focused heavily on getting the latency under control. If they cap it at 30 fps latency will increase but I'd sooner take a little bit of extra latency over an inconsistent framerate.
I would rather they lock at 60 fps and use dynamic resolution scaling instead of the framerate jumping around. Is that feasible on PS4? I read some about that feature on XB1.

Variable resolution rendering is a software feature, not a hardware feature. Every HW platform can do it. E.g. some PS3 games do it.
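Nobody in the thread explains how dynamic resolution scaling actually picks a resolution, so here is a minimal, made-up controller to illustrate the idea: measure the last frame's time against the 60 fps budget and nudge a render-scale factor down when over budget, up when there is headroom. All names and thresholds are invented for illustration, not taken from any real engine.

```python
def adjust_resolution(scale, frame_ms, budget_ms=1000 / 60,
                      step=0.05, min_scale=0.5, max_scale=1.0):
    """Return a new render-scale factor based on the last frame's time."""
    if frame_ms > budget_ms:           # over budget: render fewer pixels
        scale -= step
    elif frame_ms < budget_ms * 0.85:  # comfortable headroom: render more
        scale += step
    return max(min_scale, min(max_scale, scale))

# simulated frame times: two heavy frames force the scale down, then it recovers
scale = 1.0
for ms in [15.0, 19.0, 18.0, 14.0]:
    scale = adjust_resolution(scale, ms)
print(round(scale, 2))  # -> 0.95
```

The output resolution stays fixed (the GPU scaler upsamples the smaller render target), which is why the framerate can hold 60 while only image sharpness varies.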
In most games I notice frame drops from 60 to 59 or 58. I can't explain the technical detail, but it's somehow like the engine is just choking up frames for a second and then going normal. In most cases the effect is quite jarring. Take for example the latest NfS games - as soon as they drop just 1 frame, the game is a stuttery mess. Back to 60 and it's perfectly smooth. Amnesia: A Machine for Pigs was the same. Put me in a blind test if you don't believe me.

Yeah, that's because of fixed refresh rate displays. That's exactly the kind of stutter G-Sync will fix.
When your framerate does not divide evenly into your refresh rate frames must be repeated. At 30 fps on a 60 Hz display each frame is repeated twice. At 60 fps each frame is displayed just one time. If you display at 45 fps, however, the first frame will be displayed on the screen 1 time while the second frame will be displayed twice. It becomes worse when the framerate varies all over the place and you end up with repeating frames here and single frames there all out of order. The result is motion judder.
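The repeating-frame pattern described above can be simulated in a few lines. This is a toy model, not any engine's actual presentation logic: assume a 60 Hz vsync'd display, quantize each frame's completion time to the next refresh tick, and count how many refreshes each frame stays on screen.

```python
import math

HZ = 60

def onscreen_refreshes(frame_ms):
    """Given frame render times in ms, return each frame's on-screen refresh count."""
    ticks = []
    t = 0.0
    for ms in frame_ms:
        t += ms
        # a frame can first be shown at the next vsync tick after it finishes
        ticks.append(math.ceil(t * HZ / 1000))
    # each frame stays on screen until the next frame's tick
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(onscreen_refreshes([33.4] * 5))  # -> [2, 2, 2, 2]: steady 30 fps, smooth
print(onscreen_refreshes([22.3] * 5))  # -> [1, 2, 1, 1]: ~45 fps, uneven = judder
```

At a steady 30 fps every frame is held for exactly two refreshes; at ~45 fps the hold times alternate between one and two refreshes, which is the motion judder the post describes.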
Wut? I don't know of a single PC gamer among the people I know that doesn't lock his framerate either via a frame specific lock or Vsync / Triple Buffer.
I don't see how G-Sync is going to help evenly pace the wildly fluctuating frametimes if it's designed around delivering frames as soon as possible. Or maybe there's more to it than that.

Nothing like it has existed before, so I can understand why it is difficult to grasp, but, as Sethos notes, it's basically directly tying the refresh rate to the framerate. No more duplicate frames. Of course, if you dip under 30 fps using this technology it seems like you end up with gaps in the image, so I'm not sure how they can deal with that yet.
I thought the LCD/digital age got rid of this. I remember reading (when LCD/digital input came into the picture) that LCD displays only change pixels when there is a change in the image feed/a new frame, and that the refresh rate only limits the fastest possible frame change (for example 1/60 s) before the next frame can be updated. If that is not the case, TV manufacturers are the biggest idiots in the world. We live in the digital age, where this would be easy as shit to implement.

It works just as before. Nothing has changed, and it wasn't a simple change to make. Most displays were made with video content in mind (which does display at an even framerate). G-Sync is designed for gaming.
TV (and monitor) manufacturers are the biggest idiots in the world, then, and Nvidia just now fixed this issue after more than a decade of digital displays being the norm.
Nothing like it has existed before, so I can understand why it is difficult to grasp, but, as Sethos notes, it's basically directly tying the refresh rate to the framerate. No more duplicate frames. Of course, if you dip under 30 fps using this technology it seems like you end up with gaps in the image, so I'm not sure how they can deal with that yet.

Below 30 FPS they will do frame doubling. So you should still keep your framerate above 30 at all times, even with a G-Sync display.
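Assuming that claim is right, the frame-doubling logic would look something like this sketch: when the framerate falls below the panel's minimum refresh rate, re-scan the same frame enough times to keep the effective refresh in range. The 30 Hz minimum and the function name are my assumptions for illustration; the actual G-Sync module behavior may differ.

```python
MIN_HZ = 30  # assumed minimum refresh rate the panel can hold a frame for

def scanouts_for(fps):
    """How many times one rendered frame is scanned out to the panel."""
    multiplier = 1
    # double (triple, ...) the scanout until the effective refresh is in range
    while fps * multiplier < MIN_HZ:
        multiplier += 1
    return multiplier

print(scanouts_for(60))  # -> 1: each frame shown once
print(scanouts_for(45))  # -> 1: still one scanout, just a longer refresh interval
print(scanouts_for(24))  # -> 2: frame doubled, panel effectively refreshes at 48 Hz
```

Because the doubled scanouts show an identical image, this avoids the "gaps" problem without reintroducing the uneven cadence of a fixed 60 Hz display.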
It will no longer be about "choosing" 30 fps, as you can just let the framerate run free without issue.

It will only encourage devs to choose 30 FPS, I am afraid. The technology itself is fine, but it's a workaround for a shitty game design philosophy. It would not be needed if devs set their priorities straight.
Below 30 FPS they will do frame doubling. So you should still keep your framerate above 30 at all times, even with a G-Sync display.

Interesting, so that's what they've decided on? Fluctuating between 29 and 30 fps would be pretty awful then.
Actually, there wouldn't really be anything stopping Sony from implementing their own take on variable refresh rates, I mean they build both TVs and the PS4.

*IF* Sony were to do this I would immediately buy one of their TVs. I'm dreaming of an OLED TV with this type of technology now....
It would be a great way to sell tvs to gamers for sure. I'd be pretty excited if this was the case.
I wonder if a G-Sync thing would help Oculus at all.
Framerate didn't look too stable in the recent footage
It's fucking stupid. At least give me the option to lock it to 30 and play without the fucking stutter. Same for Knack.

This. Variable frame rate is an open invitation to judder.
I always lock my framerates. I never accept fluctuation on the PC.

I never lock my framerate if it's going to be variable above 30. I don't know anyone who does, and I never see you in any of the PC threads, Sethos.
With Blu-rays, it actually isn't very noticeable unless you know what it is, you are looking for it, and it is a panning shot. I don't really think judder is the right word for what people are discussing here, though. I think what bothers people is large fps changes, say from 55 to sub-20 and back to 55 within the space of 1 or 2 seconds.

No, I'm talking about judder, which is exactly what you get with a Blu-ray movie on most displays.
I always lock my framerates. I never accept fluctuation on the PC.
And some people can't see 60FPS, some people can't see tearing, some people can't feel input and some people can't even see a high resolution. That doesn't mean it isn't there.
In Crysis 3 I was around 30-45 with everything relatively maxed. Never once did I think that was anything but a smooth experience.
I don't understand what you mean by juddering when switching between framerates.

I explained it above.
Whatever works for you.
When I play Crysis 3 I either drop details for a constant 60 or max everything out and lock it to 30 (depends on whether I'm using a keyboard/mouse or a gamepad).
Obviously this varies from person to person but let's not pretend the issue doesn't exist.
Crysis games are a rare exception due to their relatively pristine motion blur and use of temporal AA. The visual experience is much better than most as a result.
How do you artificially lock a framerate in a PC game with an AMD card? Using triple buffering certainly doesn't do that.

Use double buffering. Dxtory has it.
Awful.
Good.

This is neogaf.gif
I understand it being an issue if it's dipping below 30, but he is saying 30 is the baseline, right? Show me some video of the issue encountered in a game with variables of 45-30.

I can't easily show you a video without sitting down and making one myself; there are very few 60 fps videos out there to begin with. Do you think we're just making this up?!
Was Tomb Raider the same way? That was dropping from 60 to around 42, if I remember correctly.
The Kuro that I use features a proper 72 Hz mode, however, which can properly display 24 fps content (by displaying each frame three times). This produces completely even image panning in all films. It's hard for me to watch movies on other displays at this point.
How do you artificially lock a framerate in a PC game with an AMD card? Using triple buffering certainly doesn't do that.

Afterburner. Everything else introduces micro stutter, for the most part, including Nvidia's options.