
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

...yet still no question about the amount of RAM that is available to PS4 games.

Are developers under threat from Sony not to reveal it or something? If so, why the big deal?
 

MaLDo

Member
Tessellation has also been a casualty.

In the TLoU comparison, they talked about how to move a last-gen game to current gen. So both Metro games are an improvement in graphics.

Now, if we're talking about the PC version, that's a different story. A multiplatform game from 4 years ago on PC has better graphics than the new current-gen remake on consoles.
 
To be fair, they significantly improved the asset quality while at the same time massively reducing or even eliminating two of the most performance-hungry effects (volumetric lighting and high-quality pp motion blur). It's not an improvement across the board (unless you mean compared to the last-gen console versions?)

Of course to the last-gen versions; how would it be fair to compare it to TLOU?
 
I don't see that comment as a dig at Naughty Dog. If anything he is acknowledging them to be one of the best.

People are being too touchy.
 

gofreak

GAF's Bob Woodward
I guess freeing CPU time is almost always helpful, no matter what you do.

Helpful to what is the question. In some cases the 'help' might be limited to reducing power consumption on the CPU... in terms of game pipeline performance, you're overall as fast as your slowest component. In cases where you are GPU bound, speeding up CPU work may not net you anything except a cooler CPU maybe :) If you see a game cutting resolution to meet a framerate on XB1, for example, that's an indicator the game is GPU bound rather than CPU bound. Cutting CPU work in those cases (or making it faster) won't win you a higher framerate. Conversely, if your frametime is bound by CPU work, obviously cutting CPU work or speeding it up will be helpful to framerate. So it does, from a game perf point of view, depend a fair bit on what you do.
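To make that point concrete, here's a minimal toy sketch (all timings invented, nothing from the article): with CPU and GPU work pipelined, the slower side sets the frame time, so optimising the CPU buys nothing in a GPU-bound frame.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: CPU and GPU work overlap, so frame time is set by the
// slower of the two. All numbers are hypothetical.
int main() {
    double cpu_ms = 10.0; // game logic + draw-call submission
    double gpu_ms = 20.0; // rendering work

    printf("frame: %.1f ms (GPU bound)\n", std::max(cpu_ms, gpu_ms));

    // Halving CPU work buys nothing while the GPU is the bottleneck.
    cpu_ms /= 2.0;
    printf("after CPU optimisation: %.1f ms\n", std::max(cpu_ms, gpu_ms));
    return 0;
}
```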
 

Marlenus

Member
This seems confusing, because both have effectively the same CPU (the Xbox's is even apparently a little faster). So why would it be so much slower? Are they using the GPU somehow to generate draw calls for itself? A main memory bandwidth limitation? A shit driver from MS?

It is not slower; they just have more overhead. As he says, a graphics op on the PS4 is just a few DWORDs written to the command buffer, which might take a few cycles; on the Xbox, because of the API, that same graphics op requires many, many more cycles.

The sooner they get this sorted the better though, because that is where parity between PS4 and Xbox One will be. If the API on Xbox requires more overhead, that is less CPU runtime available to improve the world simulation, so you have fewer things like interactive physics and so on. Those will be scaled to the lowest common denominator, as they would actively change the gameplay/balancing/feel of the game, and no dev will want to deal with that.

As the CPU overhead comes down and the compute functionality ramps up, I really do hope world interactivity gets a large increase to make everything feel that little bit more real.
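A back-of-the-envelope sketch of that budget argument (per-call costs and call counts are invented for illustration): whatever the API burns per draw call comes straight out of the CPU time left for world simulation.

```cpp
#include <cstdio>

// Toy model: a fixed per-frame CPU budget, spent partly on draw-call
// submission overhead. Costs and counts are made-up numbers.
int main() {
    const double frame_budget_ms = 16.6;  // 60 fps target
    const int    draw_calls      = 2000;
    const double thin_us_call    = 0.5;   // "a few DWORDs" style submission
    const double thick_us_call   = 5.0;   // heavy per-call bookkeeping

    printf("sim time left, thin API:  %.1f ms\n",
           frame_budget_ms - draw_calls * thin_us_call / 1000.0);
    printf("sim time left, thick API: %.1f ms\n",
           frame_budget_ms - draw_calls * thick_us_call / 1000.0);
    return 0;
}
```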
 

arhra

Member
I wonder how much of an improvement this makes. And this could put the Diablo 3 situation into a different light.

I doubt that in particular had anything to do with the Diablo situation - the "DX12-style" API is probably essentially a prototype of DX12 itself (with whatever GCN-specific custom tweaks they've added that won't generalize over to Nvidia/Intel hardware), and taking advantage of that would likely take a near-complete rewrite of the renderer.

During the DX12 announcement presentation, they mentioned that the Forza DX12 port took them 4 months (with a team of 4 engineers working on it), so depending on when that API became available in the XDKs, I wouldn't expect anything to ship using it until next year (unless some first-party teams have had early access - Forza being ported early this year does make me wonder whether Horizon 2 might have had the time to benefit from that work), as nobody would want to embark on anything of that magnitude just before shipping.

Any improvements for games shipping in the near future will be down to whatever they can wring out of the Kinect reserve being lifted, plus improvements to the DX11.X runtime to fast-path more calls and eliminate as much of the overhead as they can without completely re-architecting the API.
 

c0de

Member
Helpful to what is the question. In some cases the 'help' might be limited to reducing power consumption on the CPU... in terms of game pipeline performance, you're overall as fast as your slowest component. In cases where you are GPU bound, speeding up CPU work may not net you anything except a cooler CPU maybe :) If you see a game cutting resolution to meet a framerate on XB1, for example, that's an indicator the game is GPU bound rather than CPU bound. Cutting CPU work in those cases (or making it faster) won't win you a higher framerate. Conversely, if your frametime is bound by CPU work, obviously cutting CPU work or speeding it up will be helpful to framerate. So it does, from a game perf point of view, depend a fair bit on what you do.

I wonder what games generally are bound by nowadays. If they are already GPU bound, devs should use the spare CPU cycles for everything else that can be computed, e.g. AI.
 

R_Deckard

Member
Great article, and with <6 months of hands-on dev kit time before launch, they did a great job.

Refreshing to read a dev confirm a lot of what many of us here knew or suspected, and a real sign that when devs move to GPGPU and, more importantly, GPU-driven render pipelines, we will see huge improvements; something ND is rumoured to have already done with UC4!
 

c0de

Member
I doubt that in particular had anything to do with the Diablo situation - the "DX12-style" API is probably essentially a prototype of DX12 itself (with whatever GCN-specific custom tweaks they've added that won't generalize over to Nvidia/Intel hardware), and taking advantage of that would likely take a near-complete rewrite of the renderer.

During the DX12 announcement presentation, they mentioned that the Forza DX12 port took them 4 months (with a team of 4 engineers working on it), so depending on when that API became available in the XDKs, I wouldn't expect anything to ship using it until next year (unless some first-party teams have had early access - Forza being ported early this year does make me wonder whether Horizon 2 might have had the time to benefit from that work), as nobody would want to embark on anything of that magnitude just before shipping.

I meant the XDK, not the new way to work around the API. That will be too "near" for most devs, I guess.
Still, I think what these devs achieved in four months on Xbone and six months on PS4 seems amazing, and I wonder what we can expect in the future.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
That dig at ND by Leadbetter I thought was slightly unnecessary. Only 30 people working for 5 months to reverse-engineer a game specifically developed for esoteric hardware, the Cell/RSX/512MB RAM trifecta, onto new hardware at twice the framerate and 2.25x the resolution is entirely different from a whole team like 4A porting an already-x86 game to other x86 platforms, dialing down a few features and putting in better environmental assets for certain areas.

Whether you think ND are lazy or not as talented should not enter into it; they should not be compared because they are entirely different situations.
 
Two past-gen console games, or porting two old PC games to what are basically PCs?

There is a bit of a difference.

It is pretty obvious to the most casual observer that 4A's accomplishment should not be understated. Seriously, stop with your console war stuff, please.

Just because the game runs better than TLOU Remastered does not mean you need to shit on 4A's accomplishment.
 

UnrealEck

Member
I am skeptical of the 2x performance increase they think can still be gained beyond what developers have managed so far. If developers are still getting used to the console APIs and over time come close to utilising them fully, wouldn't that just mean the console's graphics processing power is fully utilised? I'm not understanding why they (and Carmack) think 2x performance increases are coming.
 

R_Deckard

Member
It is pretty obvious to the most casual observer that 4A's accomplishment should not be understated. Seriously, stop with your console war stuff, please.

Just because the game runs better than TLOU Remastered does not mean you need to shit on 4A's accomplishment.
He is not, any more than Leadbetter did in the article; he is merely pointing out the obvious point that, in around the same time frame, the two processes are not comparable.
 
That dig at ND by Leadbetter I thought was slightly unnecessary. Only 30 people working for 5 months to reverse-engineer a game specifically developed for esoteric hardware, the Cell/RSX/512MB RAM trifecta, onto new hardware at twice the framerate and 2.25x the resolution is entirely different from a whole team like 4A porting an already-x86 game to other x86 platforms, dialing down a few features and putting in some better environmental assets.

Whether you think ND are lazy or not as talented should not enter into it; they should not be compared because they are entirely different situations.

I don't think it has anything to do with laziness or talent; it's all about effort. The remaster of TLOU was simply not the priority for ND that Redux was for 4A, which sucks considering how good the game is. IMO it deserved more than a cheap port.
 
He is not, any more than Leadbetter did in the article; he is merely pointing out the obvious point that, in around the same time frame, the two processes are not comparable.

How are they not comparable?

Both games were developed with the "esoteric" hardware of the PS3 in mind. The 4A engine has had a PS3 code path since 2009 or so.
 

R_Deckard

Member
How are they not comparable?
As he says, one was already running on x86 code to begin with and would have been an easier base to start with.

The other was entirely designed around a system that was unique in its CPU/GPU, RAM and code base... i.e. chalk and cheese.

Neither is less praiseworthy, and thus they do not need to be compared.
 
I am skeptical of the 2x performance increase they think can still be gained beyond what developers have managed so far. If developers are still getting used to the console APIs and over time come close to utilising them fully, wouldn't that just mean the console's graphics processing power is fully utilised? I'm not understanding why they (and Carmack) think 2x performance increases are coming.

He didn't say that. He said that programming close to the metal can be 2x faster on console than an "equivalent PC spec", if equivalent means flops, transistors, MHz.
That implies the consoles will last longer than some people like to think, and that when they're fully exploited it might take 2x the transistors in the PC's heterogeneous, API-heavy world to get the same result on screen.
 

hodgy100

Member
I really love the direct answers; no garbage PR in play here. I am quite surprised at how layered the XO API is; he basically just said that it's DX11. As for Leadbetter's comparison with TLOU:R, it's true. The effort that went into the Redux edition is way above ND's: not only did they significantly improve the graphics, they managed to make the game a locked 60 FPS at 1080p.

It's not comparable though. Both Metro games had a PC version which was miles ahead of the console versions to start with.
 

KKRT00

Member
It's not comparable though. Both Metro games had a PC version which was miles ahead of the console versions to start with.

OMG, but neither was optimized around heavy low-frequency multithreading, nor built with around 100Hz in mind.
Refactoring was definitely harder on both Metro games, especially 2033, than it was for TLoU.
 

mrklaw

MrArseFace
It was the 'shit driver' - the API was doing a lot of window dressing around draw calls, eating up CPU time etc. As they discuss later, the GNM (and now, DX12) model is more 'you look after this yourself, we won't do extra housekeeping for you anymore'.

For all the shit Sony gets on software, I find it interesting that they basically pioneered the model of a modern 'low level' API with GNM that everyone else is now following. At least on the game runtime and tool side, PS3 really kicked them into shape.

Why even launch with such a driver then? Was that part of the Windows 8 heritage of the Xbox? Surely they'd learned from two previous consoles that you need an efficient low-level OS for games?
 

Kayant

Member
It depends. Taking what they're talking about there, if freeing up some CPU improves the framerate on the game then that would obviously be an improvement. But where your framerate is bound is quite a variable thing per game. From what they're saying there it wouldn't improve things on the GPU side so much as making things more efficient on the CPU side, so GPU bound scenarios may not benefit so much.

So in a case like TF there would be an improvement to framerate, not resolution, right?


In general - I don't really get why they chose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

I guess it's not called DX11.X for no reason then. Given the way they have talked about DX on XB1, it's odd the situation is actually like this. I guess I eat some crow then :p. Well, I guess this is what Matt was talking about with his "Yes, you can get more out of the PS4's CPU than you can the Xbox's" statement. It also sounds like MS was surprised at PS4 launching so early and/or the way PS4 was set up.
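For anyone wondering what "a few DWORDs written into the command buffer" versus a bookkeeping-heavy API might look like, here's a rough sketch; the opcode, layout and helper names are invented, not real GNM or D3D code.

```cpp
#include <cstdint>
#include <vector>

std::vector<uint32_t> cmd_buffer;

// Console-style "to the metal": a draw is just a few words appended to
// a buffer the GPU front-end consumes directly. Opcode is made up.
void draw_direct(uint32_t vertex_count, uint32_t first_vertex) {
    cmd_buffer.push_back(0xC0001000u);  // hypothetical DRAW opcode
    cmd_buffer.push_back(vertex_count);
    cmd_buffer.push_back(first_vertex);
}

// Stand-ins for the hidden per-call work a DX11-model runtime does.
void validate_pipeline_state() { /* state validation */ }
void track_dependencies()      { /* hazard/dependency tracking */ }
void add_lifetime_references() { /* resource refcounting */ }

// DX11-style: the same draw, but CPU cycles go to bookkeeping before
// any DWORDs reach the command buffer.
void draw_dx11_style(uint32_t vertex_count, uint32_t first_vertex) {
    validate_pipeline_state();
    track_dependencies();
    add_lifetime_references();
    draw_direct(vertex_count, first_vertex);
}

int main() {
    draw_direct(3, 0);      // cheap path: appends 3 DWORDs
    draw_dx11_style(3, 0);  // same draw plus the API-model overhead
    return 0;
}
```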

They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.

No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.

I'm guessing he is talking about the same thing here? And that the update will help improve the CPU situation even more, bringing it closer to how it's handled on PS4.
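And a sketch of the "multi-threaded command buffer chunks" idea, under the same caveat (this illustrates the model, not any real API): each worker records its own chunk without touching a shared driver lock, and submission stays ordered.

```cpp
#include <cstdint>
#include <thread>
#include <vector>

using Chunk = std::vector<uint32_t>;

// Each worker records draws into its own chunk; no shared state.
Chunk record_chunk(uint32_t first_draw, uint32_t draw_count) {
    Chunk c;
    for (uint32_t i = 0; i < draw_count; ++i) {
        c.push_back(0xC0001000u);    // hypothetical DRAW opcode
        c.push_back(first_draw + i); // hypothetical payload
    }
    return c;
}

int main() {
    const int workers = 4;
    std::vector<Chunk> chunks(workers);
    std::vector<std::thread> pool;

    for (int w = 0; w < workers; ++w)
        pool.emplace_back([&chunks, w] {
            chunks[w] = record_chunk(static_cast<uint32_t>(w) * 100, 100);
        });
    for (auto& t : pool) t.join();

    // Recording was parallel; submission order is still deterministic.
    Chunk submit;
    for (const auto& c : chunks) submit.insert(submit.end(), c.begin(), c.end());
    return 0;
}
```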

That dig at ND by Leadbetter I thought was slightly unnecessary. Only 30 people working for 5 months to reverse-engineer a game specifically developed for esoteric hardware, the Cell/RSX/512MB RAM trifecta, onto new hardware at twice the framerate and 2.25x the resolution is entirely different from a whole team like 4A porting an already-x86 game to other x86 platforms, dialing down a few features and putting in better environmental assets for certain areas.

Whether you think ND are lazy or not as talented should not enter into it; they should not be compared because they are entirely different situations.

It wasn't the whole team.

We are doing both. We have been working on a new game as well as Redux. We had the production resource free to handle Redux while the next project was in early pre-production, although now the Redux team are needed on the next project as we ramp up!
 

KainXVIII

Member
And we can't correctly compare the Metro Redux series and TLoU because they have different points of view, graphical fidelity (for example, motion blur is absent in Metro) and so on.
 
It is pretty obvious to the most casual observer that 4A's accomplishment should not be understated. Seriously, stop with your console war stuff, please.

Just because the game runs better than TLOU Remastered does not mean you need to shit on 4A's accomplishment.

Come on, there is no need for that.

If you had the job of porting a PS3, Xbox 360 or PC title, I can bet the PS3 version is the last one you would touch.

Nobody is shitting on 4A's accomplishment, but the fact stands that it is a different process.
 

gofreak

GAF's Bob Woodward
Why even launch with such a driver then? Was that part of the Windows 8 heritage of the Xbox? Surely they'd learned from two previous consoles that you need an efficient low-level OS for games?

I'm not sure what MS's API on 360 was like... it may not have mattered so much back then compared to PS3, because of the other challenges PS3 had itself, or because the GPUs could only handle a certain scale of draw calls anyway and it never really troubled the CPU. But now CPU power is relatively stagnant while GPU power has grown dramatically. From the way I see others talk about it, it wouldn't surprise me if DX12 is MS's first crack at this kind of API, if they did not have that kind of low-level API on 360.

As for why they didn't see this coming or why they launched with this kind of driver in the current environment, I guess there may have been a bit of politics involved (i.e. Windows wanted a close software relationship), and probably also just time constraints.
 
He won't mention it I'm sure. Does anyone else read his stuff just for laughs? It's really good satire honestly, the comments are almost as inexplicable as the posts. I hate to give the guy any credit, but damnit if it isn't a joy to read when I need some comedy in my internet browsing.

I honestly cannot wait to see what the Metro guys come up with for their first true next-gen game... he mentions 2.5x "better, richer visuals". I like what I'm hearing :)
Who is Mr. X and where does he comment?
 

SmokedMeat

Gamer™
And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec.

Who does this guy think he is? PC GAF says they're exactly the same, and will always yield the same results.

Edit: Can't wait to see what these guys can do when they have more than four months!
 
Come on, there is no need for that.

If you had the job of porting a PS3, Xbox 360 or PC title, I can bet the PS3 version is the last one you would touch.

Nobody is shitting on 4A's accomplishment, but the fact stands that it is a different process.

While you are not saying it directly, I get the feeling the undercurrent of what you and others are saying is that 4A's flat 60fps on PS4, as a port of a last-gen game... is not in any way more impressive than a "not flat 60fps" port of TLOU.

Rather... it is just "so different" that the comparison becomes illogical.


So porting 2 SKUs, compared to porting 1, is not in any way comparable?

Similarly, without comparing the process... can't one just compare the metrics? One is a flat 60 and the other is not?
Who does this guy think he is? PC GAF says they're exactly the same, and will always yield the same results.

I get the feeling he is talking about pretty specific scenarios (draw calls, async compute, whatever) and not overall render performance.

No intelligent person would deny that with a lower-level API (e.g. Mantle, DX12, OpenGL 4.x, the PS4 libs) you can get better results for specific hardware. But that would be quite different from saying console hardware gets 60fps while equivalent hardware on PC under DX11/DX9 gets 30fps.
 
In the TLoU comparison, they talked about how to move a last-gen game to current gen. So both Metro games are an improvement in graphics.

Now, if we're talking about the PC version, that's a different story. A multiplatform game from 4 years ago on PC has better graphics than the new current-gen remake on consoles.

Are you saying that the first iteration of 2033 looked better across the board? It seems to me re-rendering in the new engine has improved some things, but maybe I am mistaken. Have they upped the modelling, for instance?
 
Woah, they ported 2 beautiful games (1 in a new skin) in 6 months.

And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec


This is going to be a good gen, with further optimizations and GPGPU capabilities barely being touched yet. 2016-2017 is gonna see amazing-looking shooters with Doom 4, Prey 2 and 4A's next project.

Also, stupid dig from Leadbetter at TLOU; the game has a couple of drops for a few seconds. A few seconds out of ~14 hours of gameplay is basically locked 60fps.
 
Who does this guy think he is? PC GAF says they're exactly the same, and will always yield the same results.

Edit: Can't wait to see what these guys can do when they have more than four months!

You forgot this line: "Practically achieving that performance takes some time, though!"

Not everyone is Carmack or Shishkovstov. In fact most people aren't.
 

bede-x

Member
Leadbetter takes a dig at Naughty Dog

That's not a dig at Naughty Dog. He's just saying that 4A did something that even someone like ND, known for their technical achievements, couldn't achieve. Which is true. As fine a remaster as it is, TLoU does have frame drops in quite a few places.
 

ValeYard

Member
But my favourite quote - just for out-of-context potential...

"On Xbox One it easily could be one million times slower because of all the bookkeeping the API does."
PS4 one million times faster than Xbox One confirmed.

Joking aside, it seems crazy to me that the Xbox One API is like this right now. Not only are they bottlenecked by the eSRAM's size, they also have an API that isn't comparatively as close to the metal as Sony's. Man, the tables really turned this gen.

Great interview with lots of insight into current state of development for consoles and a bit of optimism for the future.
 

Renekton

Member
I'm not sure what MS's API on 360 was like... it may not have mattered so much back then compared to PS3, because of the other challenges PS3 had itself, or because the GPUs could only handle a certain scale of draw calls anyway and it never really troubled the CPU. But now CPU power is relatively stagnant while GPU power has grown dramatically. From the way I see others talk about it, it wouldn't surprise me if DX12 is MS's first crack at this kind of API, if they did not have that kind of low-level API on 360.
According to Eurogamer and bit-tech, the DX API on X360 does have noticeable costs, but maybe one-tenth that of PCs.

https://twitter.com/jimhejl/status/1450730530

I get the feeling he is talking about pretty specific scenarios (draw calls, async compute, whatever) and not overall render performance.

No intelligent person would deny that with a lower-level API (e.g. Mantle, DX12, OpenGL 4.x, the PS4 libs) you can get better results for specific hardware. But that would be quite different from saying console hardware gets 60fps while equivalent hardware on PC under DX11/DX9 gets 30fps.
Well, equivalent-ish PC hardware is probably a lower-end Kaveri paired with an R7 265.
 

Percy

Banned
Because of course Leadbetter would have to drag ND into it when no one needed to be singled out to get his point across. Groan.

Regardless, sounds like 4A did a good job with what they had to work with... even if some of the cutbacks sound a little disappointing.
 

stryke

Member
That's not a dig at Naughty Dog. He's just saying that 4A did something that even someone like ND, known for their technical achievements, couldn't achieve. Which is true. As fine a remaster as it is, TLoU does have frame drops in quite a few places.

I don't think I said anything about it being true or not. It's a clear criticism of ND failing to do what 4A achieved.
 