
Digital Foundry: A Frame-Rate Free Lunch? FSR3 Frame Generation on PS5/Xbox Series X|S Tested

The video didn't show a side-by-side test of the exact same section, and those percentages don't take the drops to 0 FPS on XSX into account. There were also differences in IQ settings favoring PS5, found previously, that aren't mentioned here. There simply isn't enough solid data to jump to such conclusions.

Agreed, we need NXGamer to do a breakdown as well, with much more thorough testing. I am taking DF at their word in my conclusion.
 

Fafalada

Fafracer forever
This does make me wonder: why don't the console makers utilize Anti-Lag/Reflex variants for the consoles? It's clearly a massive improvement.
Well for one - it's a game-specific thing, not a device thing - and some games absolutely do.
It was pretty common to implement specific latency reduction measures for console VR titles, for example (in many cases, superior to what you'd find in PC versions of the same titles).
Eg. one of the titles I worked on, at 60fps, had an 'engine-native' motion-to-photon latency of 45-50ms, which we reduced to 30ms by launch. The PC version of the same title achieved a similar latency - but only because it ran at 90fps.

Still, most of what these techs do is just rearrange the rendering pipeline and reduce stalls.
The biggest gains come from rearranging both rendering and simulation (and more importantly, how they interact) - but the viability of some of those changes varies greatly from game to game, as sometimes they are prohibitively complex to implement.
The work NVidia's doing on profiling tools/hw is pretty interesting though, as for many developers the idea of even measuring this is pretty foreign.
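A rough sketch of the point above - that motion-to-photon latency comes from pipeline arrangement, not raw framerate. The stage counts here are hypothetical, chosen only to land near the 45-50 ms and 30 ms figures mentioned:

```python
# Hypothetical pipeline model for a 60 fps title. Motion-to-photon
# latency is the time from input sampling to photons leaving the
# display; serial pipeline stages simply add up in frame slots.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60 fps

def motion_to_photon(stages_in_frames):
    """Latency when the input-to-display path spans N frame slots."""
    return stages_in_frames * FRAME_MS

# Naive pipeline: input -> sim -> render -> scanout, ~3 full frames.
naive = motion_to_photon(3)        # ~50 ms, like the 'engine-native' figure
# Rearranged: late input sampling, sim overlapped with render, so only
# ~1.8 frame slots sit between input and scanout.
optimized = motion_to_photon(1.8)  # ~30 ms, like the shipped figure

print(f"naive: {naive:.0f} ms, optimized: {optimized:.0f} ms")
```

Same 60fps output in both cases - only the arrangement of work between input and scanout changed.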
 

hinch7

Member
The input lag is horrible even without FG. Wireless controllers add even more. They need to sort that out for next-generation consoles.
 

SKYF@ll

Member
The launch version was FSR2/720p, but now it's FSR3/960p, and the image quality has improved.
These are screenshots from the PS5 version (latest version).
The launch version had terrible image quality and a low frame rate, so this is a big improvement.
[PS5 screenshots]
 

vkbest

Member


- If VRR doesn't work with FSR3, it will be terrible on console (this has been fixed since the video).
- Using FSR3 to take 30 to 60 would be terrible (true). As Richard says, you need at least 40 as a baseline (true as well).
- You would need less demanding games to lock to 120 with FSR3.
- They think it's doable on Immortals, but a locked 120 is probably not possible (true again).

The key takeaway is that they were focused on 60/120 and not considering variable framerates, as back then FSR3 simply didn't work with VRR.

So, to show how he was right, you are using a video where he already knew about this game getting this on consoles?
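The 40fps-minimum point in the list above can be sketched with a toy model. The 2-frame input path and ~10 ms interpolation overhead are my assumptions, not DF's measurements; the idea is just that frame generation roughly doubles the presented framerate while input latency still tracks the base rate:

```python
# Toy model of frame generation: presented framerate roughly doubles,
# but input latency still scales with the *base* frame time, plus a
# small cost for buffering the extra frame (assumed ~10 ms here).
INTERP_COST_MS = 10  # assumed interpolation/buffering overhead

def framegen(base_fps):
    frame_ms = 1000 / base_fps
    presented_fps = base_fps * 2
    latency_ms = frame_ms * 2 + INTERP_COST_MS  # assumed 2-frame input path
    return presented_fps, latency_ms

for base in (30, 40, 60):
    fps, lat = framegen(base)
    print(f"{base} fps base -> {fps} fps presented, ~{lat:.0f} ms latency")
```

Under these assumptions a 30fps base presents at 60fps but carries noticeably more latency than a 40fps base presenting at 80fps - which is why the 40fps floor matters.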
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
We don't need to keep doing this; they came out nearly 4 years ago. We know it's easier to optimize for PS5; anyone who codes knew this too.

I code and I didn't know this.
What makes it easier to optimize for PS5 compared to XSX?
 

Zathalus

Member
So, to show how he was right, you are using a video where he already knew about this game getting this on consoles?
It was the very first time they tested the possibility. Previously they claimed FSR3 would not really work for making 30fps games run at 60fps, and that, according to information they got directly from AMD themselves, the performance cost might be too high for the console GPUs.

The very first time they tested it themselves, they realized that AMD was being extremely conservative about the performance requirements, and made the video.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Talking GPUs here brah

You said CPU.
Nonetheless, that's incredibly reductive.
I thought you had some insight I missed.
There's nothing you've said that backs up your claim that it's easier to optimize for PS5.
 
You said CPU.
Nonetheless, that's incredibly reductive.
I thought you had some insight I missed.
There's nothing you've said that backs up your claim that it's easier to optimize for PS5.
Physics? The same workload on a CU on PS5 will run faster than the Xbox. Or do you think that isn't so?
 

digdug2

Member
After trying this in Immortals on PS5, I hope it comes to most games moving forward.
It doesn't fix the core issues with the game (horrible writing, meh visuals and gameplay) but it feels a lot better to play now. The fights are at least somewhat enjoyable now that it doesn't feel like the FPS tanks whenever a couple of enemies show up.

It might not be as good as native frames, but it sure beats 30fps or frames jumping all over the place from 25-60fps.
Would be amazing if Capcom could implement this in Dragon's Dogma 2 on console.
I know that they just implemented DLSS3 on DD2, so maybe we'll see FSR3 come to fruition as well.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Physics? The same workload on a CU on PS5 will run faster than the Xbox. Or do you think that isn't so?

That's assuming all else is equal.
Which it isn't.
And that also doesn't tell anyone why it's easier to optimize for PS5.

The 6750 XT clocks faster than the 6800 by quite a bit. Does that mean it's easier to optimize for the 6750 XT?

As I said earlier you are being very reductive and making it seem like clock speed automatically means easier to optimize for.
 

Shane89

Member
I remember DF saying this would be IMPOSSIBLE on consoles because it's not free and it just won't work, highly unlikely and all that. Can't wait to see Sony's solution on PS5 Pro.
DF, especially the beaver, said a lot of BS
 
That's assuming all else is equal.
Which it isn't.
And that also doesn't tell anyone why it's easier to optimize for PS5.

The 6750 XT clocks faster than the 6800 by quite a bit. Does that mean it's easier to optimize for the 6750 XT?

As I said earlier you are being very reductive and making it seem like clock speed automatically means easier to optimize for.
No, I'm not just talking about clock speed, am I? Narrow+fast vs wide+slow: the point is that you won't be able to use all the CUs effectively, meaning you won't get access to the full power of an Xbox vs a PS5, because dealing with sync locks, memory barriers, etc. across multiple CUs is harder. Having them complete faster means you don't have to go to excessive trouble optimizing your shader code to ensure you have coherent data. So you don't have to work as hard. So it's easier.
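The narrow+fast vs wide+slow argument can be sketched numerically. The CU counts and clocks below are the public console specs; the occupancy model is my own deliberate oversimplification, just to show how a poorly-scaling workload flips the comparison:

```python
# Crude occupancy model: when a workload can't keep every CU busy,
# effective throughput tracks clock speed rather than total CU count.
def effective_tflops(cus, clock_ghz, busy_fraction):
    """Peak FP32 = CUs * 64 lanes * 2 ops/clock * clock, scaled by occupancy."""
    peak = cus * 64 * 2 * clock_ghz / 1000  # TFLOPS
    return peak * busy_fraction

ps5 = (36, 2.23)    # 36 CUs at up to 2.23 GHz
xsx = (52, 1.825)   # 52 CUs at 1.825 GHz

# Fully occupied: the wider GPU wins (~12.1 vs ~10.3 TFLOPS).
print(effective_tflops(*ps5, 1.0), effective_tflops(*xsx, 1.0))

# A workload that only generates ~36 CUs' worth of parallel work:
# the narrow+fast GPU now comes out ahead without any extra tuning.
print(effective_tflops(*ps5, 1.0), effective_tflops(*xsx, 36 / 52))
```

Real occupancy depends on wave counts, register pressure and memory behavior, not a simple busy fraction - this just illustrates why the wider GPU needs more optimization effort to realize its paper advantage.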
 

Buggy Loop

Member
There has to be a catch. Buggy Loop get in here

Uh?

I didn't really follow the discussion in this thread, but I watched the DF video. The results are about what was expected, no? There are motion artifacts, and the FSR upscaling being a pixel soup in motion amplifies the error when there's rapid movement. I don't see a catch. Not sure I would enable it if I were on console; 50-60 fps with VRR would already be good enough.

Holy shit at the base latency on the game though, LOL. 130 ms. Garbage for a shooter.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Holy shit at the base latency on the game though, LOL. 130 ms. Garbage for a shooter.

In a way, adding only 8ms of latency for a 72% boost in performance from interpolated frames is kinda great, the original latency notwithstanding.
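Working those numbers through with a hypothetical 45fps base (the 72% boost and 8 ms figures are from the video; the base framerate is my assumption):

```python
# The tradeoff from the video, worked through: a 72% boost in
# presented framerate for ~8 ms of extra input latency.
def tradeoff(base_fps, boost=0.72):
    presented = base_fps * (1 + boost)
    # how much sooner each presented frame arrives on screen
    frame_time_saved = 1000 / base_fps - 1000 / presented
    return presented, frame_time_saved

presented, saved = tradeoff(45)
print(f"45 fps -> {presented:.0f} fps presented; frames arrive "
      f"{saved:.1f} ms apart less, for ~8 ms more input lag")
```

Under this assumed base, the per-frame smoothness gain (~9 ms shorter frame times) is roughly the same size as the 8 ms latency cost - which is why it reads as close to a free lunch on a game whose base latency is already 130 ms.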
 

Buggy Loop

Member
In a way, adding only 8ms of latency for a 72% boost in performance from interpolated frames is kinda great, the original latency notwithstanding.

Sure. But people were throwing tomatoes at Nvidia for double the performance with a 10-12 ms latency adder and fewer artifacts 🤷‍♂️

But yes, it can be good. Personally though, if I already had a game in the 60 fps range, I'm not sure I would enable frame gen for the tradeoff in artifacts. Latency is not really the problem, never has been.
 

Ivan

Member
We only need these frame generation engines to evolve to a solution that gives us 2 generated frames instead of 1. I can easily see it happening in the next few years....
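The arithmetic of that idea, as a toy model (the function and numbers are illustrative, not any announced feature):

```python
# Extrapolating multi-frame generation: N generated frames per rendered
# frame takes a base framerate to base * (N + 1), while input latency
# would still be tied to the base rate.
def multi_framegen(base_fps, generated_per_real):
    return base_fps * (generated_per_real + 1)

print(multi_framegen(40, 1))  # today's FSR3: 40 -> 80
print(multi_framegen(40, 2))  # two generated frames: 40 -> 120
```

So two generated frames would turn the same 40fps base that hits 80 today into a 120fps output - the smoothness scales, but the underlying responsiveness wouldn't.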
 