
Quantum Break PC performance thread

Daingurse

Member
This has nothing to do with UWP. It has to do with DX11 vs DX12.

Rise of the Tomb Raider UWP has comparable performance to the Steam version.

Either way, I'd like to see how these people are going to continue to try to justify Quantum Break's original performance, as they have the past few pages, now that we're seeing 30% gains on the DX11 version.
 

Zedox

Member
Either way, I'd like to see how these people are going to continue to try to justify Quantum Break's original performance, as they have the past few pages, now that we're seeing 30% gains on the DX11 version.

I would say that the performance difference is based more on the developer's familiarity with the API than on DX11 vs. DX12 in itself. It's clear that MS rushed Remedy to make the game; why it was DX12, I don't know (UWP has nothing to do with that, as Killer Instinct is DX11, I believe). Time and familiarity are the causes for these performance issues (granted, that could be said of anything, but we all presume this is a rushed port).
 
I would say that the performance difference is based more on the developer's familiarity with the API than on DX11 vs. DX12 in itself. It's clear that MS rushed Remedy to make the game; why it was DX12, I don't know (UWP has nothing to do with that, as Killer Instinct is DX11, I believe). Time and familiarity are the causes for these performance issues (granted, that could be said of anything, but we all presume this is a rushed port).

Agreed. But again, read back through this thread. Many were claiming that there were no "performance issues", but it's just that Quantum Break is the new Crysis and it looks so amazing and is so far ahead of everything and that's why it runs like total shit.
 

dr_rus

Member
Quantum Break Benchmark: Steam with DX11 vs. Windows Store with DX12

Yep, some insane performance gains on NV, basically confirming that the UWP/DX12 version was just very badly optimized for NV h/w.

AMD seems to be losing a bit of performance, but I'm pretty sure they'll gain that bit back with new drivers.

More at the link.
 

Locuza

Member
If you follow the link to the Computerbase article, you will see that for AMD there are quite big gains from DX12 if you have a weak CPU.
For both AMD and Nvidia, the UWP variant shows better frametimes than the Steam version most of the time, but Nvidia does have ugly frame spikes under UWP, so DX11 feels smoother nonetheless.
For AMD users it's a mixed bag, since whether UWP (DX12) or Steam (DX11) is better depends on how your system is balanced.

From a customer perspective it's really bizarre and disappointing that Remedy decided to make DX12 and DX11 store-specific.
 

Vuze

Member
Quantum Break Benchmark: Steam with DX11 vs. Windows Store with DX12

Yep, some insane performance gains on NV, basically confirming that the UWP/DX12 version was just very badly optimized for NV h/w.

AMD seems to be losing a bit of performance, but I'm pretty sure they'll gain that bit back with new drivers.

More at the link.
Nice gain after all. The initial benchmarks, which lacked a direct comparison, were probably taken as a reference a bit too quickly.

I hope they'll add the DX11 renderer to the Store version, but I'm still unsure whether or not they'll do that. Probably not, ugh.
 

Mohasus

Member
970 @1478MHz here.
1440p (native res), scaling on, medium settings (ultra textures). Almost locked 60fps, act 1 completed. I'm ok with this.
 

dr_rus

Member
For AMD and Nvidia the UWP-Variant shows better frametimes than the Steam-Version at least most of the time but Nvidia does have ugly frame-spikes under UWP so DX11 does feel smother nonetheless.

This is a bit of a strange result tbh, as my personal feeling says the exact opposite, at least on a GTX 1080: the Steam version in exclusive fullscreen has better frametime stability than the UWP one in a window.
 
Agreed. But again, read back through this thread. Many were claiming that there were no "performance issues", but it's just that Quantum Break is the new Crysis and it looks so amazing and is so far ahead of everything and that's why it runs like total shit.
The game is far ahead of anything IMO, but while any improvement is welcome, it's not like it runs amazingly now and ran like shit before. It's still a taxing game to run with everything maxed.
 
I'm out of the loop on this one. So, does this mean that people who have Quantum Break from the Windows store will or won't get a DX 11 patch for this?
 

eero94

Member
I'm out of the loop on this one. So, does this mean that people who have Quantum Break from the Windows store will or won't get a DX 11 patch for this?
It would probably be too embarrassing for MS to release a DX11 patch on their store that runs better than the DX12 version they are hyping.
 

b0bbyJ03

Member
GTX 1080
3770K@4.3
16 GB RAM
Windows 10

Runs well. I get about 90 fps with G-Sync at 1440p with everything at ultra, except GI and VL at medium, with scaling on.
 

Locuza

Member
This is a bit of a strange result tbh as my personal feeling says the exact opposite, at least on a GTX 1080 - Steam version in exclusive fullscreen have better frametime stability than UWP one in a window.
The results from Computerbase back this up, since the smaller variance will probably be unnoticeable, while the huge frame spikes that occur sometimes are definitely noticeable.
 

Tomodachi

Member
I think triple buffering is disabled for me, anyone with the same problem?
The framerate jumps from 60 to 30 and then back to 60, no middle ground.
I have vsync on (obviously) and am running the game on a 970, latest nvidia driver. Steam version.

EDIT: also, any way to hide the mouse pointer when I'm playing with a pad? It keeps popping up right at the center of the screen in menus and when I read documents, VERY annoying.
 
Quantum Break Benchmark: Steam with DX11 vs. Windows Store with DX12

Yep, some insane performance gains on NV, basically confirming that the UWP/DX12 version was just very badly optimized for NV h/w.

AMD seems to be losing a bit of performance, but I'm pretty sure they'll gain that bit back with new drivers.

More at the link.

Wow, that's bad when you can't get 1080/60 on a 980 Ti with the DX12 version.

I don't understand why they can't just update the Windows Store version with the DX11 version that Steam has. It feels wrong to buy the game a second time just to get 1080/60 on a Titan X (OG).
 
Digital Foundry: Quantum Break: Better on DirectX 11! GTX 970/1060 vs RX 480 Gameplay Frame-Rate Tests


GTX 970 runs over 30% faster, with no more stutter or driver crashing.
GTX 1060 runs about 20% faster.
No difference for the RX 480.

Nice gains there, especially on medium; we're looking at roughly 3x the performance of the Xbox version, and in real-world games a 980/1060 does have about 3x the Xbox's performance.

Glad to have waited. I feel sorry for 970 Windows Store owners; they should be sent a Steam code if they have proof of owning the Windows Store version.

Just to add, the devs went for 20% of the GPU market and also nerfed the most popular card, the 970. What a sorry-ass launch QB was.
 

Manac0r

Member
Have the game running at 60fps, but the transitions between cutscenes and gameplay are horrendous. Stuttery and full of pauses; is this moving from 60 to 30 fps? They look like in-game cutscenes, to be fair...
 

4jjiyoon

Member
I think triple buffering is disabled for me, anyone with the same problem?
The framerate jumps from 60 to 30 and then back to 60, no middle ground.
I have vsync on (obviously) and am running the game on a 970, latest nvidia driver. Steam version.

EDIT: also, any way to hide the mouse pointer when I'm playing with a pad? It keeps popping up right at the center of the screen in menus and when I read documents, VERY annoying.

this.

it's sooooo annoying sitting on the couch playing on the tv and the mouse cursor keeps popping up every time i find a collectible or go into a menu.

vsync seems to be double buffered to me. i set it myself via nvidia inspector and disabled the ingame one.

Have the game running at 60fps but the transition between cut scenes and game play is horrendous. Stuterry and pauses - is this moving from 60 to 30 fps? Looks like in game cut scenes to be fair...

yeah the ingame cutscenes are locked to 30fps.
 

Braag

Member
Some of the outside areas with a lot of effects going on are a lot more demanding than others, and it's usually in these areas that the frame rate jumps all over the place.

Here I'm standing looking at this car and the game is at a steady 60 fps


But as soon as I aim at the car my fps drops to 30. Not a single frame higher or lower, but a steady 30.


I'm running a GTX1070. This is the first game where I've experienced such huge jumps in frames with this card.
 
Some of the outside areas with a lot of effects going on are a lot more demanding than others, and it's usually in these areas that the frame rate jumps all over the place.

Here I'm standing looking at this car and the game is at a steady 60 fps



But as soon as I aim at the car my fps drops to 30. Not a single frame higher or lower, but a steady 30.



I'm running a GTX1070. This is the first game where I've experienced such huge jumps in frames with this card.

That is double buffered Vsync.
 

fantomena

Member
Just reached chapter 3 act 1.

Super stable at 60 FPS at 1080p, no drops.

Playing at everything maxed out. Film Grain off, AA on, upscaling on, exclusive full screen.
 
Have you seen Forza Horizon 3?

UWP is not necessarily related to performance.

Oh my god guys for the last fucking time I'm not saying UWP is what's killing performance. I'm just saying there's two different versions of the game: UWP and Steam. UWP runs like ass. Not because it's UWP. There's just two different versions of the game out there and one runs like ass. I never said it runs like ass because of UWP.
 
Some of the outside areas with a lot of effects going on are a lot more demanding than others, and it's usually in these areas that the frame rate jumps all over the place.

Here I'm standing looking at this car and the game is at a steady 60 fps



But as soon as I aim at the car my fps drops to 30. Not a single frame higher or lower, but a steady 30.



I'm running a GTX1070. This is the first game where I've experienced such huge jumps in frames with this card.

Vsync. It means you were probably just above 60fps before, and just below after. With Vsync on, that will cut your FPS in half and sync it to 30.
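The halving behavior described in this post can be sketched as a toy model (purely illustrative, not tied to any real graphics API): with double-buffered vsync on a 60 Hz display, a frame that misses the ~16.7 ms refresh deadline has to wait for the next refresh, so the presented rate snaps to 60, 30, 20 fps and so on, with no middle ground.

```python
import math

# Toy model: effective fps under double-buffered vsync on a 60 Hz display.
# A frame that misses a refresh deadline waits for the next one, so the
# presented rate snaps to 60/1, 60/2, 60/3, ... fps.

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def vsynced_fps(render_ms: float) -> float:
    """Return the presented fps for a given per-frame render time."""
    refreshes_waited = max(1, math.ceil(render_ms / REFRESH_MS))
    return REFRESH_HZ / refreshes_waited

# Rendering slightly faster than 16.7 ms -> full 60 fps.
print(vsynced_fps(15.0))  # 60.0
# Rendering slightly slower (e.g. aiming at the car) -> a hard 30 fps.
print(vsynced_fps(17.0))  # 30.0
```

This is why turning vsync off (or using a non-halving presentation mode) reveals the "true" framerate in the low-to-mid 40s instead of a locked 30.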
 
Oh my god guys for the last fucking time I'm not saying UWP is what's killing performance. I'm just saying there's two different versions of the game: UWP and Steam. UWP runs like ass. Not because it's UWP. There's just two different versions of the game out there and one runs like ass. I never said it runs like ass because of UWP.

Hey, don't get angry with us because you leaned toward being confrontational rather than being articulate with your thoughts.
 

Braag

Member
That is double buffered Vsync.

Vsync. It means you were probably just above 60fps before, and just below after. WIth Vsync on that will cut your FPS in half and sync it to 30.

That's right.
Turning off Vsync dropped my frames from 75 to around 40 when aiming at that same spot. Yet once I enable borderless window, my frames don't drop and stay at a steady 60 regardless of whether I'm aiming or not.
 

scitek

Member
Some of the outside areas with a lot of effects going on are a lot more demanding than others, and it's usually in these areas that the frame rate jumps all over the place.

Here I'm standing looking at this car and the game is at a steady 60 fps



But as soon as I aim at the car my fps drops to 30. Not a single frame higher or lower, but a steady 30.



I'm running a GTX1070. This is the first game where I've experienced such huge jumps in frames with this card.

You are using double buffered vsync.

EDIT: Beaten. In other news, the most demanding effects in this game to turn down with minimal visual impact are the "visual effects" or whatever they're called. Basically, the refraction from the time-shifting effects is what hits performance the most for me.
 
It's one of the shittiest and weirdest-performing UWP games out there. Plagued with stuttering, awful CPU utilization, and bad performance on most older GPU architectures. I mean, when a 390 is half as fast as a 480, there's something seriously wrong.

I'm personally not running into any issues with the game on my rig, so I can't comment on issues others are having. I'll pop over to the PC performance thread.
 

TSM

Member
Considering it didn't change for the 480... it may be due to Nvidia's DX12 performance.

That's not how DX12 works. DX12 performance is all on the dev. The difference between DX12 and DX11 is all the performance Remedy left on the table compared to Nvidia's driver team.
 
That's not how DX12 works. DX12 performance is all on the dev. The difference between DX12 and DX11 is all the performance Remedy left on the table compared to Nvidia's driver team.

Unless it's due to async compute limitations. Nvidia's implementation is nowhere near as efficient as AMD's. So, if a game relies heavily on async compute in its DX12 implementation, then Nvidia cards will suffer. Pascal corrected a lot of the issues that Maxwell 2 had, such as:

Source Anandtech:
The issue, as it turns out, is that while Maxwell 2 supported a sufficient number of queues, how Maxwell 2 allocated work wasn’t very friendly for async concurrency. Under Maxwell 2 and earlier architectures, GPU resource allocation had to be decided ahead of execution. Maxwell 2 could vary how the SMs were partitioned between the graphics queue and the compute queues, but it couldn’t dynamically alter them on-the-fly. As a result, it was very easy on Maxwell 2 to hurt performance by partitioning poorly, leaving SM resources idle because they couldn’t be used by the other queues.

on Pascal :

Right now I think it’s going to prove significant that while NVIDIA introduced dynamic scheduling in Pascal, they also didn’t make the architecture significantly wider than Maxwell 2. As we discussed earlier in how Pascal has been optimized, it’s a slightly wider but mostly higher clocked successor to Maxwell 2. As a result there’s not too much additional parallelism needed to fill out GP104; relative to GM204, you only need 25% more threads, a relatively small jump for a generation. This means that while NVIDIA has made Pascal far more accommodating to asynchronous concurrent execution, there’s still no guarantee that any specific game will find bubbles to fill. Thus far there’s little evidence to indicate that NVIDIA’s been struggling to fill out their GPUs with Maxwell 2, and with Pascal only being a bit wider, it may not behave much differently in that regard.

and in comparison to AMD:

Meanwhile, because this is a question that I’m frequently asked, I will make a very high level comparison to AMD. Ever since the transition to unified shader architectures, AMD has always favored higher ALU counts; Fiji had more ALUs than GM200, mainstream Polaris 10 has nearly as many ALUs as high-end GP104, etc. All other things held equal, this means there are more chances for execution bubbles in AMD’s architectures, and consequently more opportunities to exploit concurrency via async compute. We’re still very early into the Pascal era – the first game supporting async on Pascal, Rise of the Tomb Raider, was just patched in last week – but on the whole I don’t expect NVIDIA to benefit from async by as much as we’ve seen AMD benefit. At least not with well-written code.

This could account for the difference between the 970 and the 1060 on DX12 and for the difference between the 1060 and 480. It seems to scale pretty well with the async compute capabilities of the cards.
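The static-vs-dynamic partitioning point in the Anandtech quote above can be sketched as a toy utilization model (the numbers and the two functions are illustrative assumptions, not real GPU scheduling): if SMs are split between the graphics and compute queues ahead of execution, a poorly chosen split leaves units idle, while dynamic scheduling lets spare units pick up work from either queue.

```python
# Toy model of SM partitioning between graphics and compute queues.
# Illustrative only; work is measured in abstract "SM-slices" per frame.

def static_utilization(total_sms, gfx_share, gfx_work, compute_work):
    """Static split decided ahead of execution (Maxwell 2-style)."""
    gfx_sms = int(total_sms * gfx_share)
    comp_sms = total_sms - gfx_sms
    # Each partition can only run its own queue; surplus capacity idles.
    used = min(gfx_sms, gfx_work) + min(comp_sms, compute_work)
    return used / total_sms

def dynamic_utilization(total_sms, gfx_work, compute_work):
    """Dynamic scheduling (Pascal-style): idle SMs take work from either queue."""
    used = min(total_sms, gfx_work + compute_work)
    return used / total_sms

# A poorly chosen static split (75% graphics) with compute-heavy work
# strands graphics SMs while the compute partition is oversubscribed:
print(static_utilization(16, 0.75, gfx_work=8, compute_work=10))  # 0.75
# Dynamic scheduling fills the idle units with the queued compute work:
print(dynamic_utilization(16, gfx_work=8, compute_work=10))       # 1.0
```

This is the "partitioning poorly" failure mode the quote describes: the work exists, but under a static split it can't reach the idle SMs.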
 
Unless it's due to async compute limitations. Nvidia's implementation is nowhere near as efficient as AMD's. So, if a game relies heavily on async compute in its DX12 implementation, then Nvidia cards will suffer.
NV cards won't suffer (as in run worse) as they will just run it serially as they would anyway under DX11 (on Maxwell and below).
 
NV cards won't suffer (as in run worse) as they will just run it serially as they would anyway under DX11 (on Maxwell and below).

That's just it: if a game's implementation of DX12 relies on running code asynchronously, then running it serially could cause a lower framerate, and could stall out if the card isn't able to keep up with vital processes. Sort of like what seems to be happening with the 970 in the DF video, which leads to a driver crash.

Considering that the DX12 version was coded to maximize performance on a console, it's a safe bet that it heavily favors async compute, as that should be the best way to maximize performance there.
 