
Xbox One X Freesync Support

EvB

Member
It's been confirmed that Xbox One X will be able to utilise FreeSync. What does this mean for Xbox One X games? Could framerates be unlocked with no issue?

We've heard Ubisoft and Bungie refer to their games being 30fps, however that is typically intended to be the target or the minimum expected framerate. Is it possible that Xbox One X games could have unlocked v-sync and framerate when used with the correct display?
 

Tain

Member
Ideally yeah, they will, and framerate caps for games built around flexible framerates will no longer be a thing when used on the right display.

And if the implementation is anything like on PC, it shouldn't involve extra work from devs.
 

mitchman

Gold Member
I think you should be cautious about what MS is saying until support is actually confirmed by external parties. Just look at how they touted VR support but then backpedaled on it.
Microsoft PR and its "true 4K" statement is another one, and yet two of the biggest titles demonstrated were using checkerboard rendering for their 4K images.
 
If this is true and works well, is it better for me to play X1X games on my 26" FreeSync monitor, or still my 44" TV? They're both 1080p.

Never experienced FreeSync so I don't know what the difference is, really. (I bought the monitor for my wife's Xbox and a weak PC lol)
 

EvB

Member
Wouldn't be surprised if 30fps titles enforce a cap even with the existence of FreeSync, console walled garden and all that.

I am expecting a framerate cap and v-sync off to be the minimum we can expect; it would be nice to have confirmation.

I'm considering getting a 42 or 50 inch FreeSync monitor, rather than going for a local dimming LCD or OLED.
 

Tain

Member
If this is true and works well, is it better for me to play X1X games on my 26" FreeSync monitor, or still my 44" TV? They're both 1080p.

Never experienced FreeSync so I don't know what the difference is, really. (I bought the monitor for my wife's Xbox and a weak PC lol)

If a game's performance is a solid 60fps on a TV it won't matter much. Otherwise, yeah, it could very well be worth it to play on the monitor.

It's hard to explain what it's like to have basically the entire 30-60fps range automatically viable.
 
It's been confirmed that Xbox One X will be able to utilise FreeSync. What does this mean for Xbox One X games? Could framerates be unlocked with no issue?

We've heard Ubisoft and Bungie refer to their games being 30fps, however that is typically intended to be the target or the minimum expected framerate. Is it possible that Xbox One X games could have unlocked v-sync and framerate when used with the correct display?

This part should be emphasized.

HDMI 2.1 specs are barely finalized. We have no idea what, if anything, TV manufacturers will provide next year, and even if they do, which features or standards they're likely to support.

Might be different on the monitor side, but for TVs, it could be a couple years yet...
 

EvB

Member
This part should be emphasized.

HDMI 2.1 specs are barely finalized. We have no idea what, if anything, TV manufacturers will provide next year, and even if they do, which features or standards they're likely to support.

Might be different on the monitor side, but for TVs, it could be a couple years yet...

You can get large FreeSync displays right now (as monitors), just not from the major manufacturers.
Although LG have just announced a 43-inch display that has FreeSync.
 

Head.spawn

Junior Member
They could definitely go with an optional unlocked framerate for every game, and it would ensure that every single game can be the best it can be on Xbox One X if the user is using a FreeSync/VRR display... Unfortunately, I asked Albert Penello yesterday, and while he said he would love to see an unlocked option in more games, at the end of the day they are leaving it up to the developer to implement such a feature.


I'd suggest, if you have any games that you know are going to be locked to 30fps or whatever, that it would be a good idea to hit up developers on Twitter, share your opinion with them, and let them know having an option to unlock would be in their best interests.
 

spectator

Member
Wouldn't be surprised if 30fps titles enforce a cap even with the existence of FreeSync, console walled garden and all that.

The only reason I can imagine for that is a shared code-base with the PS4 version, which would be capped because the PS4 Pro doesn't include hardware-level FreeSync. It would then be a separate development investment to specifically remove the cap for the Xbox One X version.
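
Purely as an illustration of that point (nothing any dev has confirmed), a shared code-base might gate the cap behind a platform flag roughly like this; PLATFORM_XBOX_ONE_X and the 33ms budget are invented for the example:

Code:
#include <chrono>
#include <thread>

// Hypothetical shared frame-pacing code; PLATFORM_XBOX_ONE_X is an
// invented build flag, not a real SDK define.
#if defined(PLATFORM_XBOX_ONE_X)
constexpr bool kAllowUncappedFramerate = true;   // could float on a FreeSync/VRR display
#else
constexpr bool kAllowUncappedFramerate = false;  // PS4 Pro path keeps the 30fps cap
#endif

void endOfFrame(std::chrono::steady_clock::time_point frameStart) {
    if (!kAllowUncappedFramerate) {
        // Sleep out the remainder of a ~33ms (30fps) frame budget.
        std::this_thread::sleep_until(frameStart + std::chrono::milliseconds(33));
    }
    // Otherwise present immediately and let the adaptive-sync display
    // match whatever framerate the GPU manages this frame.
}

int main() {
    auto frameStart = std::chrono::steady_clock::now();
    // ... render the frame ...
    endOfFrame(frameStart);
}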
 

MaxiLive

Member
The only reason I can imagine for that is a shared code-base with the PS4 version, which would be capped because the PS4 Pro doesn't include hardware-level FreeSync. It would then be a separate development investment to specifically remove the cap for the Xbox One X version.

Less so these days, but games can have a lot of dependencies on framerate, such as timers, AI, physics, etc., and it can take a lot of work to re-code them to work with varied framerates.

More devs focus on the PC dev approach first, so it is less of an issue these days, but it still exists. Hopefully games will start to support this feature in the future and give players the option to enable/disable it, as it can still be very jarring when the framerate jumps from 30 to 50 and then back to 35 within a few seconds.
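
To make that concrete, here's a minimal made-up example (not from any real engine) of the difference between movement tuned per frame and movement scaled by the real elapsed time:

Code:
#include <chrono>

// Hypothetical projectile to show framerate-dependent vs. independent updates.
struct Projectile {
    float position = 0.0f;
    float unitsPerSecond = 10.0f;          // speed in world units per second
    float unitsPerFrame  = 10.0f / 30.0f;  // tuned assuming a locked 30fps

    // Framerate-dependent: correct only at exactly 30fps. At 50fps the
    // projectile covers ~66% more distance per real-world second.
    void updateFixed() { position += unitsPerFrame; }

    // Framerate-independent: scales by elapsed time, so it behaves the
    // same whether the game runs at 30, 45, or 60fps.
    void updateDelta(float deltaSeconds) { position += unitsPerSecond * deltaSeconds; }
};

int main() {
    using clock = std::chrono::steady_clock;
    Projectile p;
    auto previous = clock::now();
    for (int frame = 0; frame < 600; ++frame) {  // stand-in for the real game loop
        auto now = clock::now();
        float dt = std::chrono::duration<float>(now - previous).count();
        previous = now;
        p.updateDelta(dt);   // safe with an unlocked/variable framerate
        // p.updateFixed();  // only safe if the game is hard-locked to 30fps
    }
}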
 

Atolm

Member
While it's very nice that they support it, it's an extremely niche feature as so far you need a monitor that supports it. No TVs so far have FreeSync support afaik.
 

wildfire

Banned
I don't see this being used in console games until TVs start to support the technology.

Adaptive sync is independent of the games. It's all about whether the GPU and the display can communicate with each other properly. The same goes for ULMB/backlight strobing to reduce motion blur for CRT-esque image quality.
 

Dehnus

Member
I think you should be cautious about what MS is saying until support is actually confirmed by external parties. Just look at how they touted VR support but then backpedaled on it.
Microsoft PR and its "true 4K" statement is another one, and yet two of the biggest titles demonstrated were using checkerboard rendering for their 4K images.

MS never stated that they would force third parties to go 4K, though. But hey, whatever keeps you going, brave warrior.
 

mitchman

Gold Member
MS never stated that they would force third parties to go 4K, though. But hey, whatever keeps you going, brave warrior.

You realize they are on record saying it, right? They did during the press conference and used "non-true 4K" titles to demonstrate it.

https://www.theverge.com/2017/6/13/15790162/microsoft-phil-spencer-xbox-one-x-vs-ps4-pro-interview

“I look at [PS4] Pro as more of a competitor to [Xbox One] S than I do to Xbox One X,” claims Spencer. “This is a true 4K console. If you just look at the specs of what this box is, it's in a different league than any other console that's out there.” Spencer points out 40 percent more GPU speed, more RAM, and the speed of storage as the advantages of the Xbox One X over the PS4 Pro, but he also knocks Sony’s methods for getting to 4K resolutions with some of its games. “When I think about techniques to somehow manufacture a 4K screen like what some other consoles try to do, this is different than that.” Spencer also says he expects the majority of consoles that Microsoft sells next year will be Xbox One S.
 
The ideal would be if somehow the game could know if VRR/Freesync is enabled.

If it is not: Lock the framerate and scale the resolution
If it is: Lock the resolution and scale the framerate.

Or at least this could be a system-wide setting for newer games; that way it's always your choice.
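
As a purely hypothetical sketch of that logic (isVrrDisplayConnected() and the Settings fields are invented names, not a real console API):

Code:
#include <cstdio>

// Hypothetical settings chooser; none of these names come from a real SDK.
struct Settings {
    bool  capFramerate;       // hold the 30/60fps target
    bool  dynamicResolution;  // scale resolution to protect that cap
    float targetFps;
};

// Stub standing in for a platform query about the attached display.
bool isVrrDisplayConnected() { return true; }

Settings chooseSettings(float targetFps) {
    Settings s{};
    s.targetFps = targetFps;
    if (isVrrDisplayConnected()) {
        // VRR/FreeSync display: keep resolution fixed and let the
        // framerate float; the display absorbs the variation.
        s.capFramerate = false;
        s.dynamicResolution = false;
    } else {
        // Fixed-refresh display: keep the cap and scale resolution
        // instead, avoiding tearing and v-sync judder.
        s.capFramerate = true;
        s.dynamicResolution = true;
    }
    return s;
}

int main() {
    Settings s = chooseSettings(30.0f);
    std::printf("cap=%d dynres=%d target=%.0f\n", s.capFramerate, s.dynamicResolution, s.targetFps);
}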
 

gamz

Member
Wat? They always said devs can use that power how they like. Even in an interview at E3 he spoke freely about how they can use native, checkerboard, or dynamic stuff.

They've been saying it since 2015. Someone posted a bunch of articles dating back, and Phil has always said it's up to the devs. He's never changed or wavered on that.
 

Daffy Duck

Member
Just imagine how much extra TV manufacturers will add on for this feature to their TVs on top of the premium for 4K, HDR.
 
Could be useful for some unlocked or 60fps titles that don't reach their target. I'm not sure how well the current FreeSync panels would deal with <30 FPS.

FreeSync over HDMI is AMD proprietary tech, so it's unlikely to be seen in TVs. It doesn't use the same open VESA Adaptive-Sync tech. The same goes for FreeSync 2; that's more like G-Sync in that it's more stringently certified by AMD. HDMI 2.1 Game Mode VRR is probably what will finally make this feature available to all without relying on AMD/Nvidia.
 

Datschge

Member
It doesn't use the same open VESA Adaptive-Sync tech.
All adaptive sync solutions by all manufacturers use the same VESA Adaptive-Sync tech. The issue here is not the tech but how that information is transmitted. That's why Microsoft can currently only refer to FreeSync 2, since HDMI 2.1 is not finished and as such can't be used for promotion yet.
 

Ehker

Member
You can get large FreeSync displays right now (as monitors), just not from the major manufacturers.
Although LG have just announced a 43-inch display that has FreeSync.

If you mean this monitor, I'm not seeing how it supports the 1X for FreeSync, as it uses HDMI 2.0 inputs.
 

Kelegacy

XBOX - RECORD ME LOVING DOWN MY WOMAN GOOD
I really hope Nvidia lowers the price of the licensing/module so that G-Sync monitors become as reasonable as FreeSync. If FreeSync keeps being adopted like this, that is great; more marketshare for AMD. But of course then you need a display that only supports the maker of whatever GPU is in your console, which could get problematic if they switch from gen to gen. Frustrating.

I'd ideally like a generic solution for all devices, incorporated into all displays and hardware as standard, not on a per-company basis. Or at least, if possible, someday displays that support both (not cost effective now), like LG and Vizio do with HDR.

I say this as someone who owns a GTX 1080 and two consoles, but currently no adaptive sync tech. I would hate to someday need two displays and swap them out according to what I play, console or PC. The worst thing so far about owning an Nvidia card (only a month so far) is paying the premium for G-Sync over FreeSync.
 
The ideal would be if somehow the game could know if VRR/Freesync is enabled.

If it is not: Lock the framerate and scale the resolution
If it is: Lock the resolution and scale the framerate.

Or at least this could be a system-wide setting for newer games; that way it's always your choice.

Freesync is game independent.
The game only needs to have an unlocked framerate (or capped at 60fps).
Freesync also works with VSync enabled.

This is a great video that explains what Freesync actually does:
https://www.youtube.com/watch?v=p7_ZiVY8vwE
 

Fatmanp

Member
I have a G-Sync monitor, and the dips below 60 are smoothed out, but it is still noticeable and still impacts gameplay and my enjoyment of the game. Basically, don't put too much stock in these technologies, because IMO their use is far more noticeable at framerates above 60 due to the elimination of tearing.
 
I have a G-Sync monitor, and the dips below 60 are smoothed out, but it is still noticeable and still impacts gameplay and my enjoyment of the game. Basically, don't put too much stock in these technologies, because IMO their use is far more noticeable at framerates above 60 due to the elimination of tearing.

I think it's extremely helpful in the 45-60fps range, because that is usually where the most noticeable stuttering is.
With the expected number of FreeSync TVs in the wild being more or less zero, I don't think that devs will even bother.

Again, Freesync is game independent.
It works on a driver basis.
 

sangreal

Member
All adaptive sync solutions by all manufacturers use the same VESA Adaptive-Sync tech. The issue here is not the tech but how that information is transmitted. That's why Microsoft can currently only refer to FreeSync 2, since HDMI 2.1 is not finished and as such can't be used for promotion yet.

The spec sheet already includes HDMI VRR, not just FreeSync. They just don't claim to support the rest of HDMI 2.1.

https://news.xbox.com/wp-content/uploads/FACT-SHEET_Xbox-Specs_FINAL.docx
 

Fatmanp

Member
I think it's extremely helpful in the 45-60fps range, because that is usually where the most noticeable stuttering is.

I tend to find it is better above 45fps. Between 30fps and 45fps I much prefer to limit the fps with RivaTuner to 30fps. The beauty of having G-Sync is that I can select my max framerate on a game-by-game basis. An example is right now I am playing Watch Dogs 2. I cannot play the game without bumping pixel density, otherwise it just looks too blurry, so I hard lock it to 45fps and it feels good.

If MS/devs allow players to select a custom framerate when they have a FreeSync monitor, this would be a very good thing for console gaming.
 

mitchman

Gold Member
MS never stated that they would force third parties to go 4K, though. But hey, whatever keeps you going, brave warrior.

They've been saying it since 2015. Someone posted a bunch of articles dating back, and Phil has always said it's up to the devs. He's never changed or wavered on that.

I guess my point wasn't taken here. They go out with arrogant statements claiming only the Xbox One X can do "true 4K" and compare the PS4 Pro to the Xbox One S, but then fail to demonstrate that "true 4K" they use in their marketing in actual demos. It comes off as arrogant and disingenuous when there is little evidence that the titles they claimed were "true 4K" are actually native 4K. I'll still get it, but I don't like their marketing here.
 
I tend to find it is better above 45fps. Between 30fps and 45fps I much prefer to limit the fps with RivaTuner to 30fps. The beauty of having G-Sync is that I can select my max framerate on a game-by-game basis. An example is right now I am playing Watch Dogs 2. I cannot play the game without bumping pixel density, otherwise it just looks too blurry, so I hard lock it to 45fps and it feels good.

If MS/devs allow players to select a custom framerate when they have a FreeSync monitor, this would be a very good thing for console gaming.
Usually, if my GPU can't render more than 45 FPS in a game, I lower the settings.
I find playing on PC below 40-45 FPS very hard on the eyes, because you sit so close to the monitor.
 

Costia

Member
I think it's extremely helpful in the 45-60fps range, because that is usually where the most noticeable stuttering is.
Again, Freesync is game independent.
It works on a driver basis.
Not really. It depends on the game.
If it's designed to run at a capped and stable 30fps, unlocking the framerate can reveal or introduce new bugs.
So at the very least it will require additional testing.
It could be something like PS4's "use at your own risk" boost mode.
 

sangreal

Member
I guess my point wasn't taken here. They go out with arrogant statements claiming only the Xbox One X can do "true 4K" and compare the PS4 Pro to the Xbox One S, but then fail to demonstrate that "true 4K" they use in their marketing in actual demos. It comes off as arrogant and disingenuous when there is little evidence that the titles they claimed were "true 4K" are actually native 4K. I'll still get it, but I don't like their marketing here.

They did show native 4K games, and they said from the start that not all games would be native 4K. The X was designed to play 1080p Xbox One games at native 4K. That fact is driving the marketing you're complaining about. Developers taking a different road doesn't change that. None of this has anything to do with the topic of this thread.
 
Freesync is game independent.
The game only needs to have an unlocked framerate (or capped at 60fps).
Freesync also works with VSync enabled.

This is a great video that explains what Freesync actually does:
https://www.youtube.com/watch?v=p7_ZiVY8vwE

I know, but this is a console, and one aimed at a more locked 30 or 60fps experience compared to regular xbone.

The only way I believe we would see any improvements is if the game removes the framerate lock.
 
All adaptive sync solutions by all manufacturers use the same VESA Adaptive-Sync tech. The issue here is not the tech but how that information is transmitted. That's why Microsoft can currently only refer to FreeSync 2, since HDMI 2.1 is not finished and as such can't be used for promotion yet.

HDMI is proprietary, and FreeSync over HDMI uses AMD's proprietary extension to HDMI 1.4, not Adaptive-Sync, which is a VESA standard and only used in DisplayPort. FreeSync over HDMI and FreeSync through DisplayPort do the same thing, but one is a proprietary extension and the other is an implementation of a VESA standard.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
May or may not be off-topic.

But would an Nvidia-powered PC benefit in any way from a FreeSync monitor?

G-Sync is just too far out of my budget, even if it's been on my wishlist pretty much since launch... but I keep finding other things to upgrade/buy instead.

Cuz if Nvidia-powered machines can utilize FreeSync, even through some workaround, I'm so jumping on a FreeSync monitor.
 

Colbert

Banned
I know this is kind of off-topic, but ...

For anyone in Germany who is interested in an LG OLED 4K TV for less money than usual, there is a special offer from mediamarkt.de for the LG OLED B6D 55" and 65" for a limited time:

LG OLED B6D 65" 2996,- Euro Direct Link

LG OLED B6D 55" 1796,- Euro Direct Link

I myself own a 65" B6D. That TV is amazing!
 

SliChillax

Member
I know this is kind of off-topic, but ...

For anyone in Germany who is interested in an LG OLED 4K TV for less money than usual, there is a special offer from mediamarkt.de for the LG OLED B6D 55" and 65" for a limited time:

LG OLED B6D 65" 2996,- Euro Direct Link

LG OLED B6D 55" 1796,- Euro Direct Link

I myself own a 65" B6D. That TV is amazing!

For how long have you owned it? Any burn in issues?
 

inner-G

Banned
Does adaptive sync even work at framerates like 30fps?

I thought it needed to be in the 40ish fps range to even kick in?
 