
AMD FreeSync 2 Brings Latency, LFC and Color Space Requirements

dex3108

Member
Today AMD is announcing FreeSync 2, a new, concurrently running program that adds new display qualifications for latency, color space and LFC. This new program will be much more hands-on from AMD, requiring per-product validation and certification, and this will likely come at a cost. (To be clear, AMD hasn't yet confirmed to me whether that is the case.)

Let's start with the easy stuff first: latency and LFC. FreeSync 2 will require monitors to support LFC and thus to have no effective bottom limit to their variable refresh rate. AMD will also impose a maximum allowable latency for FS2, on the order of "a few milliseconds" from frame buffer flip to photon. This can be easily measured with some high-speed camera work by both AMD and external parties (like us).
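For reference, LFC works by repeating frames whenever the game's framerate drops below the panel's minimum refresh, so the effective rate climbs back into the VRR window. Here is a minimal sketch of the idea, assuming a simple smallest-multiple heuristic (real driver logic is more involved):

```python
def lfc_refresh_hz(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Repeat each frame enough times that the effective refresh
    rate lands back inside the panel's VRR window."""
    if fps >= vrr_min:
        return fps  # already in range, no compensation needed
    multiple = 2
    while fps * multiple < vrr_min and fps * (multiple + 1) <= vrr_max:
        multiple += 1
    return fps * multiple

# 20 fps on a 48-144 Hz panel: each frame shown 3x -> 60 Hz effective
print(lfc_refresh_hz(20.0, 48.0, 144.0))  # 60.0
```

A monitor without LFC (i.e. with a hard 48 Hz floor) would instead fall back to tearing or stuttering V-Sync below that floor, which is why FS2 mandating LFC matters.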

These are fantastic additions to the FreeSync 2 standard and should drastically increase the quality of panels and products.

All GPUs that support FreeSync will support FreeSync 2 and both programs will co-exist. FS2 is currently going to be built on DisplayPort and could find its way into another standard extension (as Adaptive Sync was). Displays are set to be available in the first half of this year.

https://www.pcper.com/news/Graphics...rements?utm_source=dlvr.it&utm_medium=twitter
 

Tecnniqe

Banned
I wonder how AMD will do in the coming years.

I do like their initiative but it comes with a question of additional cost.
 

Vuze

Member
This sounds like a much more closed-off approach than FreeSync 1.
I wonder if Nvidia could still support FS2 if they wanted to. But it sounds like the one new feature in this revision in comparison to Gsync is reliant on AMD drivers so probably not?
a game that integrates support for FS2 will be able to get data from the AMD driver stack about the maximum color space of the attached display
 

Guess Who

Banned
It's good that this program runs alongside normal FreeSync rather than replacing it. It would suck for monitor vendors to have no freely-available standard to pick from here.
 
I wonder how AMD will do in the coming years.

I do like their initiative but it comes with a question of additional cost.

It should still be much cheaper than Gsync, which is pretty limited to more expensive displays. I don't think people will mind paying an extra $10 or so if the price of the monitor is still quite low.
 
This sounds like a much more closed-off approach than FreeSync 1.
I wonder if Nvidia could still support FS2 if they wanted to. But it sounds like the one new feature in this revision in comparison to Gsync is reliant on AMD drivers so probably not?

Nvidia doesn't care. They won't support FreeSync or adaptive sync standards while they are making additional profits off GSync and the ecosystem lock-in it creates. Why would they? I think it will take consoles and TV manufacturers embracing adaptive sync through HDMI/DP to get Nvidia to budge. Even that may not be enough.

Sincerely a long time Nvidia user who has a freesync/adaptive sync monitor and refuses to buy a display device that can only be used fully within one ecosystem.

I wonder how AMD will do in the coming years.

I do like their initiative but it comes with a question of additional cost.

The scalers required are about $5-10 on top of the monitor cost. Keep in mind the logic gate board that Nvidia uses for Gsync increases the cost by about $150-200+ on average over a standard monitor with similar features.

I got my MG279Q for $440. The GSync equivalent monitor with the exact same panel is like $700 on sale, $750-800 usually. Pretty much identical monitors.

The greatest irony is that adaptive sync benefits lower-end hardware the most. Reserving it for the highest-end users via a large price gate means the benefits are being withheld from the majority of PC users for poor reasons, holding back a great tech for gamers in general.
 
This sounds like a much more closed-off approach than FreeSync 1.
I wonder if Nvidia could still support FS2 if they wanted to. But it sounds like the one new feature in this revision in comparison to Gsync is reliant on AMD drivers so probably not?
Well, the only reason that nVidia doesn't support Freesync now is because nVidia went in and disabled the feature through drivers IIRC, so.... Probably won't support FS2, since they want you to go out and buy a Gsync monitor.
 

Tecnniqe

Banned
Nvidia doesn't care. They won't support FreeSync or adaptive sync standards while they are making additional profits off GSync and the ecosystem lock-in it creates. Why would they? I think it will take consoles and TV manufacturers embracing adaptive sync through HDMI/DP to get Nvidia to budge. Even that may not be enough.

Sincerely a long time Nvidia user who has a freesync/adaptive sync monitor and refuses to buy a display device that can only be used fully within one ecosystem.



The scalers required are about $5-10 on top of the monitor cost. Keep in mind the logic gate board that Nvidia uses for Gsync increases the cost by about $150-200+ on average over a standard monitor with similar features.

I got my MG279Q for $440. The GSync equivalent monitor with the exact same panel is like $700 on sale, $750-800 usually. Pretty much identical monitors.

The greatest irony is that adaptive sync benefits lower-end hardware the most. Reserving it for the highest-end users via a large price gate means the benefits are being withheld from the majority of PC users for poor reasons, holding back a great tech for gamers in general.
Seems like a massive difference between the two.

How does FreeSync hold up to GSync?

I was gonna get myself the 1080TI and a GSync UW this year, at least that was the plan.
 

bee

Member
How does FreeSync hold up to GSync?

gsync is superior but overpriced; you get the ULMB (ultra low motion blur) mode which is absent from freesync, and it also usually operates on a wider range, e.g. 30-144Hz. Lots of freesync monitors only start at 45-48Hz, which is pretty useless for me
 
Seems like a massive difference between the two.

How does FreeSync hold up to GSync?

I was gonna get myself the 1080TI and a GSync UW this year, at least that was the plan.

Gsync is better, there's no doubt. It can work with borderless windowed games, usually has a wider range of coverage and has ULMB (Ultra Low Motion Blur) mode (however you can't use the adaptive sync feature and ULMB at the same time).

Freesync works generally the same, usually has a smaller range, and varies depending on the monitor. Many can be fairly easily tweaked to change or increase the ranges though. The main benefit is that it's essentially open source, and is already a standard on DisplayPort and upcoming HDMI revisions. Nvidia actively blocks it, even though they use it on laptops with Gsync, proving that they can do Gsync without the expensive board and introduce a low-cost option.

The main thing is that it's an option that I believe needs to be provided to gamers considering it's already in most new Nvidia cards from my understanding. It's especially a tech I would love to see console gamers and TV manufacturers adopt as well.
 

Jonnax

Member
Looks interesting. I'm holding off for HDR monitors so I'm happy to wait a bit.

 

decoy11

Member
The solution is for games to map directly to the color space of the display. AMD will foster this through FreeSync 2 – a game that integrates support for FS2

This line here makes FreeSync 2 as dead as TrueAudio. When was the last time AMD put out an API that took off with developers?
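For what it's worth, the quoted idea is just that the engine tone maps straight to the display's reported peak instead of a generic target, skipping the monitor's own second tone-mapping pass. A minimal sketch of that concept, assuming a simple Reinhard-style roll-off (the function name and curve are illustrative, not AMD's actual API):

```python
def reinhard_to_peak(scene_nits: float, display_peak_nits: float) -> float:
    """Crude Reinhard-style curve: compresses arbitrary scene luminance
    into [0, display_peak_nits) so the display never hard-clips."""
    return display_peak_nits * scene_nits / (scene_nits + display_peak_nits)

# A 600-nit highlight on a 400-nit panel rolls off to 240 nits, while a
# 4000-nit sun asymptotically approaches the 400-nit ceiling.
print(reinhard_to_peak(600.0, 400.0))   # 240.0
print(reinhard_to_peak(4000.0, 400.0))  # ~363.6
```

The point of FS2 exposing the real peak to the game is that this curve can be fitted once, in-engine, rather than the game mapping to a generic 10,000-nit target and the monitor re-mapping it again with extra latency.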
 

Tecnniqe

Banned
gsync is superior but overpriced; you get the ULMB (ultra low motion blur) mode which is absent from freesync, and it also usually operates on a wider range, e.g. 30-144Hz. Lots of freesync monitors only start at 45-48Hz, which is pretty useless for me

Gsync is better, there's no doubt. It can work with borderless windowed games, usually has a wider range of coverage and has ULMB (Ultra Low Motion Blur) mode (however you can't use the adaptive sync feature and ULMB at the same time).

Freesync works generally the same, usually has a smaller range, and varies depending on the monitor. Many can be fairly easily tweaked to change or increase the ranges though. The main benefit is that it's essentially open source, and is already a standard on DisplayPort and upcoming HDMI revisions. Nvidia actively blocks it, even though they use it on laptops with Gsync, proving that they can do Gsync without the expensive board and introduce a low-cost option.

The main thing is that it's an option that I believe needs to be provided to gamers considering it's already in most new Nvidia cards from my understanding. It's especially a tech I would love to see console gamers and TV manufacturers adopt as well.

I see. Thank you for the insight.

Guess I'll go ahead with GSync for now with a hope of FreeSync becoming the standard in time.
 

DieH@rd

Banned
As long as support for basic Freesync continues to be added to new monitors, I don't see this premium option as a problem.

I hope to see what models will be showcased at CES.
 

Mechazawa

Member
This line here makes FreeSync2 as dead as TrueAudio. When was the last time AMD put out an API that took off with developers?
Maybe that's just for the color space stuff. LFC is already in some Freesync monitors and is really just matching parity with Gsync, and it sounds insane to have latency reduction be supported on a per-game basis rather than at a global level.
 
I got my MG279Q for $440. The GSync equivalent monitor with the exact same panel is like $700 on sale, $750-800 usually. Pretty much identical monitors.

Yeah, it's the same panel, but that's where the similar quality ends - it has a 35-95Hz Freesync range and doesn't support strobing in any form.

To get something similar to G-sync 1440p IPS screens you would need to go with a 1000 Euro Eizo and still end up with two separate frequency ranges.

The best thing Freesync 2 has is HDR support.
 

Alexious

Member
Anandtech version of the story



The amount of work necessary to implement FreeSync 2 in a game [engine] should be extremely small compared to previous cases. Even Mantle got some decent support, and that was far more demanding of developers.

Even if that was the case, some developers just won't do it due to their preferred relationship with NVIDIA. That's what I hate about the NVIDIA vs AMD rivalry on PC: there will never be a standard in these cases.
 

laxu

Member
Good news but could open the door for sneaky marketing where it is hard to figure out if the display supports Freesync 1 or 2.

I really wish they had added a requirement for a strobing mode too as that is one of the things I absolutely love about G-sync displays. While not all support it for technical reasons, I hope this year we will finally get the GPUs and displays capable of 120+ Hz so ULMB also works at 4K or ultrawides. The ULMB mode is great when you have a game that runs at 60+ fps.
 

DieH@rd

Banned
It doesn't do anything at 30fps or lower.

But it could help a lot with games that cannot hold a steady 60fps or are always in the 40-50fps range.

I just want to see freesync TVs and consoles.

I want freesync on TVs so that I can finally find some good 40-43" 4K TV that I can use as a PC monitor [and for PS4 Pro also]. I would rather pay 500-600€ for a good TV than 1000€ for a premium gaming monitor with the same screen size.
 

b0bbyJ03

Member
But it could help a lot with games that cannot hold a steady 60fps or are always in the 40-50fps range.



I want freesync on TVs so that I can finally find some good 40-43" 4K TV that I can use as a PC monitor [and for PS4 Pro also]. I would rather pay 500-600€ for a good TV than 1000€ for a premium gaming monitor with the same screen size.

Agree 100%. Plus I could see this working in AMD's favor to some extent. I'm sure if people already had a freesync TV in their house they would really consider going with an AMD GPU. I know I would. I have a Gsync monitor now, so I know I'd never want to game without adaptive sync again given the option, and that alone keeps me in the Nvidia camp.
 
Freesync has worked in borderless windowed since December.

Oh that's awesome news. I didn't know that. I really hope the new AMD card is close enough to the 1080ti to give me a reason to jump from my 1070. I would love to be able to use my MG279Q properly. Although with that being said I've been playing a lot more on my 55" Samsung KS8000. I'd say I use them about equally now. As more games on PC get HDR support I may be using it more than my monitor. Hard to justify monitors up near $1000 when my 55" top of the line TV was $729.

Monitor is great for design work and stuff though. 144hz is magic.

Yeah, it's the same panel, but that's where the similar quality ends - it has a 35-95Hz Freesync range and doesn't support strobing in any form.

To get something similar to G-sync 1440p IPS screens you would need to go with a 1000 Euro Eizo and still end up with two separate frequency ranges.

The best thing Freesync 2 has is HDR support.

The best thing Freesync has is that it's a great low-cost value add. A lot of the features you described are not something most consumers care enough about or can afford to pay the premium for. There are now numerous freesync monitors out there for sub-$200, or great IPS ones that can be had for significantly less than their Gsync counterparts. Additionally, it's something that could be implemented into TVs, which are growing significantly amongst PC users, for a very low cost. No TV manufacturer is going to say "oh wow, let's put that $200 Gsync board in our TV".
 

intbal

Member
Yes, this post is 3 years out of date.
But I figured someone might find it interesting anyway.

Everyone knows what low framerate compensation is, but how about taking a look at an ideal use case for the technology.

The Xbox 360 version of Max Payne 3 was patched (pre-release?) with an improperly set framerate cap of 33fps. This is what you download when you play it through backwards compatibility. There's no way to cap it to 30. So, in order to get smooth frame delivery, you need both VRR and LFC. As the video shows, the frames are tripled, resulting in a 99hz refresh rate on this 120hz monitor.
Max Payne 3 still delivers its frames erratically, so the 99hz on the framerate counter jumps around a bit, but it's still a lot smoother than trying to play the game without VRR/LFC.
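The tripling in that example falls out of simple arithmetic: repeat each frame by the largest whole count that still fits under the panel's maximum refresh. A quick sketch, assuming that selection rule (actual driver heuristics may differ):

```python
def max_repeat(fps: float, panel_max_hz: float) -> int:
    """Largest whole frame-repeat count that stays at or under the
    panel's maximum refresh rate."""
    return int(panel_max_hz // fps)

# Max Payne 3's broken 33 fps cap on a 120 Hz monitor:
count = max_repeat(33.0, 120.0)
print(count, 33 * count)  # 3 99
```

Showing each 33 fps frame three times yields the 99Hz readout in the video; a fourth repeat would need 132Hz, which the 120Hz panel can't do.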

 
LOL 20 years after Nvidia introduced G-Sync and set specific standards on what that supports, AMD catches up huh

edit: Oh it's from 2017 but as expected of AMD we haven't heard shit about it since then and it's now 2024 and nothing has changed lol
 