
Quantum Break PC and Xbox One Patch in the Works

Carlius

Banned
Such bullshit. Show some accountability ffs. Don't try to act like the platform is responsible for you not putting in something so obvious. You're basically the first to fuck this up.

really? for a QUIT button? come on man. seamless alt tab transition, alt+f4 instant quit...where is the issue? its so much faster to click alt f4 than to go to menu and go to quit and press a button. this is waaay overblown considering there are serious issues with the game. the quit button not being one of them.
 

GHG

Member
really? for a QUIT button? come on man. seamless alt tab transition, alt+f4 instant quit...where is the issue? its so much faster to click alt f4 than to go to menu and go to quit and press a button. this is waaay overblown considering there are serious issues with the game. the quit button not being one of them.

How many times do we have to go round in circles about the quit option before the defenders will understand why it's an issue?

Do you need me to make a video documenting why it's an issue for some users?
 

SomTervo

Member
On the whole seems really reasonable.

read: we will not support multi-GPU setups in Quantum Break; it's too much work.

What's wrong with that? It's completely reasonable. The game wasn't built for it and they didn't want to/couldn't rebuild the entire structure to support it.

read: Quantum Break renders at nonnative resolution on PC, just like XB1, and we won't be changing that.

This is a bit lame indeed. It would be a bit easier to swallow if they also said it was too much work to re-jig.
 

Carlius

Banned
How many times do we have to go round in circles about the quit option before the defenders will understand why it's an issue?

Do you need me to make a video documenting why it's an issue for some users?

Just telling me is more than enough, cause I don't see it as an issue... not on PC at least.
 

SomTervo

Member
really? for a QUIT button? come on man. seamless alt tab transition, alt+f4 instant quit...where is the issue? its so much faster to click alt f4 than to go to menu and go to quit and press a button. this is waaay overblown considering there are serious issues with the game. the quit button not being one of them.

I don't really disagree, but I seriously think it should be made clearer on-screen (perhaps at boot-up) how users can quit the game.
 

Rodin

Member
Render technique and resolution on Windows 10
The Windows 10 version of Quantum Break uses the same reconstruction method as on Xbox One. If your resolution is set to 1080p, the game temporally reconstructs the image (except UI) from four 720p buffers rendered with 4xMSAA, just like on Xbox One. Engine assigns input geometry samples from 4xMSAA rendering into shaded clusters in order to maximize covered geometry while keeping the performance on acceptable level by reducing expensive shaded samples. When you change the resolution, the buffers used to construct the image are always 2/3rds of the set resolution, i.e. in 2560x1440 they would be 1706x960.

So there's no way around this? It's just how the engine works? Wow
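For reference, Remedy's 2/3 rule above can be sketched as a quick calculation (the integer truncation is my assumption, inferred from 2560x1440 mapping to 1706x960 rather than 1707x960):

```python
def reconstruction_buffer(width, height):
    """Internal buffer size per Remedy's description: the four
    reconstruction buffers are always 2/3 of the set resolution.
    Integer truncation (rather than rounding) is assumed here."""
    return width * 2 // 3, height * 2 // 3

for res in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(res, "->", reconstruction_buffer(*res))
# (1920, 1080) -> (1280, 720)
# (2560, 1440) -> (1706, 960)
# (3840, 2160) -> (2560, 1440)
```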
 
really? for a QUIT button? come on man. seamless alt tab transition, alt+f4 instant quit...where is the issue? its so much faster to click alt f4 than to go to menu and go to quit and press a button. this is waaay overblown considering there are serious issues with the game. the quit button not being one of them.

Yo. REAL PC gamers don't use controllers, amirite?
 

Jude

Banned
I like the "performance on acceptable level" part the most really.

[image: LtNb.png]


The GTX 970 has 3+ times the power of Xbox One's GPU, yet barely manages to hit 30 fps at 1080p.

Xbox one is on med. settings... so more or less:

[image: qb_medium.png]
 
MultiGPU support won't be coming as long as Remedy uses the reconstruction technique, which relies on temporal buffering of frames and is incompatible with AFR. It's the same reason temporal AA solutions don't work with SLI/CF.

MultiGPU isn't just AFR; that's one of the benefits of what DX12 offers. AFR is just usually the easiest mode to implement in an engine. The other methods DX12 offers to leverage multiple graphics cards would take a lot of work, but at least one of them is compatible with temporal AA.

I'm still disappointed that they aren't looking to do this though. So far, of the three MS-published DX12 titles I am aware of, precisely zero support DX12's mGPU capabilities. The only game I know of that uses them is Ashes of the Singularity.

I think it's very fair to question MS's feelings about mGPU when they've not worked to ensure that the games they themselves have published support it.
 

GHG

Member
Just telling me is more than enough, cause I don't see it as an issue... not on PC at least.

Some of us have comfy couch setups that are designed around never having to use a mouse or a keyboard, believe it or not. Some people use a controller pretty much exclusively even if they don't have a comfy couch setup. It's 2016 and PC users have more options than ever in terms of how they might want to set things up for themselves. Taking away one of those options is a step backwards.
 

etta

my hard graphic balls
I was told Alt+F4 was for planting the bomb or rescuing the hostages. And purchasing a defuse kit.
 

LordRaptor

Member
I think it's very fair to question MS's feelings about mGPU when they've not worked to ensure that the games they themselves have published support it.

I know, I was hoping that mGPU would be an included feature of DX12 at a system level rather than a thing left for devs to implement themselves, because I'm pretty sure most PC gamers on Intel at the very least have an integrated GPU doing fuck all that might be nice to tap for a little extra oomph.

If DX12 mGPU support is only ever going to be a case-by-case thing that's up to the developer, it's going to be more effort than it's worth and DX12 titles just won't bother. Which, ironically, is going to have AMD and nVidia pushing devs to use DX11 instead, where SLI / Crossfire is handled automagically.
 

Fracas

#fuckonami
Glad the performance issues will be fixed at least. The resolution thing is dumb though.

Hope the patch comes soon.
 

dmix90

Member
I am not seeing this issue being addressed.
If you cross any light source with your character you will see those lights on a model for a few frames. Is it like that on X1 as well or this is a PC exclusive "feature"? Looks like another side effect of their reconstruction shenanigans.
 

dr_rus

Member
I would imagine max detail settings on PC are pushing a lot more detail than the standard XB1 version, even looking past the much higher resolution.

Well, the problem is that even if it does push a lot more detail, no one can see shit on PC because of all the upscaling / postprocessing blur. Those details are just completely lost, so in effect that's power and electricity spent on heating the atmosphere.

But DX11 multi-adapter AFR usually required Nvidia involvement to get working well, so you still have THAT issue. Hence why I think it's dead. No one will invest in taking full advantage of the SLI goodies in DX12, and fewer and fewer games will continue to support the old way of SLI (as I understand it, many modern effects have issues running in AFR SLI).

NV's involvement in DX11 is something which is transferred to the dev in DX12 here, but it's kinda minimal: if you actually want implicit AFR to work, you have to code the renderer in a way which allows it from the start, and then it's some smaller tweaks to add before release, which is what NV does in DX11 and the dev has to do in DX12. So yeah, the developer has to put in some time here to make it work, but the main point is that the renderer has to be compatible from the start; if it's not, then nobody will be able to make it work.

DX12 adds a couple of other options in splitting the rendering between several GPUs. Whether they will be used is unknown but fundamentally the old DX9/11 way of handling things is still there, with all the issues it had recently as well.

Wait, am I reading this right? My r9 390 works better than an 980ti with this game?

They are pretty much on the same level. Because hey why would we optimize our renderer for anything but Xbox, right?
 

holygeesus

Banned
At the moment, any difference between quality settings just isn't worth the crippled framerate. You notice the jerkiness more than you notice any improved lighting or shadows. I aim to play through on Xbox One equivalent settings, at a decent frame rate, then revisit when they actually fix this shit shower.
 

Kezen

Banned
At the moment, any difference between quality settings just isn't worth the crippled framerate. You notice the jerkiness more than you notice any improved lighting or shadows. I aim to play through on Xbox One equivalent settings, at a decent frame rate, then revisit when they actually fix this shit shower.

That is true. The atrocious frame pacing defeats the purpose of higher settings really.
I played on high (ultra shadows and textures), it was "playable" most of the time but in places the stuttering was unbearable.
I suspected memory leaks, and it seems to be exactly what Remedy have described here:
Sometimes, after a longer play session, the game can end up in a state where the video memory becomes fragmented, and an important asset gets moved to system memory, which slows rendering performance significantly
Restarting, reloading fixed this.
 
Uhhh... Pretty sure 720p is under HALF resolution of 1080p, not 2/3rds. 720 is 2/3rds of 1080, but vertical lines alone don't make up resolution. Microsoft either can't into math or they purposefully spun this to make it look good... err... less bad.

Gee, given their history with spinning stuff, I wonder which is it?
 
Uhhh... Pretty sure 720p is under HALF resolution of 1080p, not 2/3rds. 720 is 2/3rds of 1080, but vertical lines alone don't make up resolution. Microsoft either can't into math or they purposefully spun this to make it look good... err... less bad.

Gee, given their history with spinning stuff, I wonder which is it?

2/3 is an easy way to figure out the correct settings for native res, pretty much, but yeah... once you multiply the numbers it's not 2/3 at all.
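The arithmetic behind that point, as a quick sketch: scaling each axis by 2/3 leaves (2/3)^2 ≈ 44% of the pixels, which is indeed well under half.

```python
def pixel_fraction(res_a, res_b):
    """Fraction of res_b's total pixel count that res_a carries."""
    return (res_a[0] * res_a[1]) / (res_b[0] * res_b[1])

# 720p vs 1080p: 2/3 on each axis, but only (2/3)^2 of the pixels
print(round(pixel_fraction((1280, 720), (1920, 1080)), 3))  # 0.444
```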
 

Kezen

Banned
So they're gonna fix everything but the resolution

Remedy pls, the game has a worse IQ on PC than Alan Wake on 360

At "reconstructed" 1080p, no.
AW on 360 was much worse than that.

There is a reason I stopped playing and waited for the PC version in spite of having a 360 at the time.
 

robotrock

Banned
Yeah they better have an option that allows it to render at whatever resolution I want. Even if it's like, an in game downsampling options or whatever.

Just make it happen
 

Lister

Banned
I didn't know they were rendering the game at sub-native resolution on PC.

This game just went from "wait for patch" to never picking it up. Too many other games to waste my time on this.
 

Armaros

Member
That is on Nvidia, not on Remedy. The drivers NV has been pushing out as of late have been mostly hot garbage, causing all kinds of minor and major problems for users. The drivers Remedy are suggesting are more or less the last actually good drivers NV has released.

Ignoring that DX12 specifically puts the workload onto the developers and gives AMD and Nvidia less access to fix things.
 

Tovarisc

Member
Ignoring that DX12 specifically puts the workload onto the developers and gives AMD and Nvidia less access to fix things.

What does that have to do with 362.00 being pretty much the last stable driver from NV that doesn't bring the risk of BSODs, BSOD loops, the GPU being locked to idle clocks, random performance drops because why not [not talking just DX12 games], or even bricking your GPU? I think Remedy promoting/suggesting an actually stable and working driver version is quite legit and has nothing to do with "Ignoring that DX12 specifically puts the workload onto the developers".

Edit: I may have understood your post the wrong way? If so, sorry, tad tired :/ Still leaving the rant here as I'm pissed about NV driver quality.
 

Armaros

Member
What does that have to do with 362.00 being pretty much the last stable driver from NV that doesn't bring the risk of BSODs, BSOD loops, the GPU being locked to idle clocks, random performance drops because why not [not talking just DX12 games], or even bricking your GPU? I think Remedy promoting/suggesting an actually stable and working driver version is quite legit and has nothing to do with "Ignoring that DX12 specifically puts the workload onto the developers".

And why isn't Remedy working directly with Nvidia to get their software compatible with Nvidia's drivers, considering they are the only ones that can fix it?

"We can't be bothered to get things working on the latest updates, just go back."

Just like other normal PC features are not getting fixed here.
 

atr0cious

Member
Wow, I can't believe I'm actually going to skip a Remedy game, and I have owned at least two copies of all their prior work.
 

Tovarisc

Member
And why isn't Remedy working directly with Nvidia to get their software compatible with Nvidia's drivers, considering they are the only ones that can fix it?

"We can't be bothered to get things working on the latest updates, just go back."

Just like other normal PC features are not getting fixed here.

Apparently I understood you right and you think it's Remedy's fault that NV makes shit drivers? We are talking about shit in the general sense, not just about Quantum Break. E.g. the Division release drivers could put a user's PC into an infinite BSOD loop and required some tech know-how and gymnastics to fix. The DS3 release drivers [currently the latest from NV] have small issues like not detecting any 3D software, so the GPU stays forever idle clocked, with even some reported cases of this driver turning GPUs into expensive paperweights.

How is that stuff on Remedy, and how is suggesting the last known solid driver [362.00] bad? Also, I don't think Remedy getting pissy at NV about shit drivers would achieve anything that large amounts of user feedback to NV isn't already achieving.
 

diaspora

Member
And why isn't Remedy working directly with Nvidia to get their software compatible with Nvidia's drivers, considering they are the only ones that can fix it?

"We can't be bothered to get things working on the latest updates, just go back."

Just like other normal PC features are not getting fixed here.
Remedy isn't working with Nvidia to fix performance issues? Do we have a source on that?
 

dr_rus

Member
Might have to wait for this patch, the grain effect looks terrible.

I'm pretty certain that the noisy grain effect is hiding some artifacts of the resolution reconstruction they have running, and thus removing it may actually produce an even worse result until there's an option to render at native res.

That is true. The atrocious frame pacing defeats the purpose of higher settings really.
I played on high (ultra shadows and textures), it was "playable" most of the time but in places the stuttering was unbearable.
I suspected memory leaks, and it seems to be exactly what Remedy have described here:

Restarting, reloading fixed this.

That's not a memory leak, that's crappy resource management, which must be done by the renderer in DX12 and is done by the driver in DX11. Most DX12 issues come from parts where the developer needs to control something which was previously controlled by the driver. Turns out that most devs can't produce code which is as efficient, let alone better, than what is already there in IHV drivers, big surprise.

Apparently I understood you right and you think it's Remedy's fault that NV makes shit drivers? We are talking about shit in the general sense, not just about Quantum Break. E.g. the Division release drivers could put a user's PC into an infinite BSOD loop and required some tech know-how and gymnastics to fix. The DS3 release drivers [currently the latest from NV] have small issues like not detecting any 3D software, so the GPU stays forever idle clocked, with even some reported cases of this driver turning GPUs into expensive paperweights.

How is that stuff on Remedy, and how is suggesting the last known solid driver [362.00] bad? Also, I don't think Remedy getting pissy at NV about shit drivers would achieve anything that large amounts of user feedback to NV isn't already achieving.

You're assuming that Remedy knows more about driver stability than NV, given NV isn't recommending downgrading to the 362.00 drivers anywhere? And why would Remedy even care about which driver is "stable" when what they need to care about is how their game performs on the latest driver there is?

Fixing their D3D12 issues in QB is on Remedy and not on NV. NV had some driver issues lately but they are hardly related to DX12 in any way.
 
Not really. Nothing much in the average and your card drops to a lower FPS.

The 390 is getting the same average fps. I don't think you understand how wrong that is; they shouldn't be on the same level at all. The fact that my 390 manages to hit the same level as a 980 Ti is really weird. Clearly the game is not well optimized; something isn't right.
 

shandy706

Member
1080 * 3/2 = 1620, so 2880x1620@90Hz for 1080p60

You are correct.

I'll start downloading the game tonight and see how this works out, lol.

It will probably be tomorrow before I can test it though.

While I'm sure they can get the game running better, I'm not so sure they can fix the resolution thing. I honestly think that would require rebuilding the entire game (engine) and the way it handles rendering in some form.
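A quick sketch of the trick in the quoted post, assuming the engine always renders internally at 2/3 of the set resolution as Remedy describes: setting the resolution to 3/2 of native makes the internal buffers land exactly at native.

```python
native = (1920, 1080)

# Downsampling target: 3/2 of native on each axis
target = (native[0] * 3 // 2, native[1] * 3 // 2)

# What the engine's 2/3 rule would then render internally
buffers = (target[0] * 2 // 3, target[1] * 2 // 3)

print(target)   # (2880, 1620)
print(buffers)  # (1920, 1080): native-resolution internal buffers
```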
 

vg260

Member
I appreciate the technical explanations for the problems; you don't usually see that from developers.

Yes, regardless of whether the issue was dumb in the first place, people don't like the answer, or it doesn't change a purchase decision, it's good to see them provide explanations for the various situations.
 

Kezen

Banned
I'm pretty certain that the noisy grain effect is hiding some artifacts of the resolution reconstruction they have running, and thus removing it may actually produce an even worse result until there's an option to render at native res.



That's not a memory leak, that's crappy resource management, which must be done by the renderer in DX12 and is done by the driver in DX11. Most DX12 issues come from parts where the developer needs to control something which was previously controlled by the driver. Turns out that most devs can't produce code which is as efficient, let alone better, than what is already there in IHV drivers, big surprise.

I know you are being sarcastic, but that is a genuine surprise for me. I thought devs would be able to outclass the driver easily, considering the app knows what it needs and when, unlike the driver, which sometimes has to assume a lot about the game's needs.

So yeah, seriously, I'm disappointed that DX12 is behind quality DX11 drivers. Considering top-end devs have worked with such APIs before, I was expecting a much flatter learning curve and a better experience right out of the gate.

I was proven wrong, sadly. Although my experience with DX12 (ROTTR) is very positive: it does run better overall, slightly less so in pure GPU-bound scenarios (2-3 frames less on my 980).
 