
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

stryke

Member
Metro Redux: what it's really like to develop for PS4 and Xbox One

As tech interviews go, this one's a corker. Readers of our previous Metro 2033 and Metro Last Light tech Q&As will know that 4A Games' chief technical officer Oles Shishkovstov isn't backward about coming forward on the matters that are important to him, and in the transition across to the new wave of console hardware, clearly there are plenty of important topics to discuss.

Digital Foundry: In our last interview you were excited by the possibilities of the next-gen consoles. Now you've shipped your first game(s) on both Xbox One and PlayStation 4. Are you still excited by the potential of these consoles?

Oles Shishkovstov: I think what we achieved with the new consoles was a really good job given the time we had with development kits in the studio - just four months hands-on experience with Xbox One and six months with the PlayStation 4 (I guess the problems we had getting kits to the Kiev office are well-known now).

But the fact is we haven't begun to fully utilise all the computing power we have. For example we have not utilised parallel compute contexts due to the lack of time and the 'alpha' state of support on those consoles at that time. That means that there is a lot of untapped performance that should translate into better visuals and gameplay as we get more familiar with the hardware.

Carmack's console/PC performance quote returns -

Digital Foundry: Xbox 360 and PS3 were highly ambitious designs for the 2006/7 era. Xbox One and PS4 are much more budget conscious - have they got what it takes to last as long as their predecessors?

Oles Shishkovstov: Well obviously they aren't packing the bleeding edge hardware you can buy for PC (albeit for insane amounts of money) today. But they are relatively well-balanced pieces of hardware that are well above what most people have right now, performance-wise. And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec. Practically achieving that performance takes some time, though!

ESRAM stuff

Digital Foundry: Is ESRAM really that much of a pain to work with?

Oles Shishkovstov: Actually, the real pain comes not from ESRAM but from the small amount of it. As for ESRAM performance - it is sufficient for the GPU we have in Xbox One. Yes it is true, that the maximum theoretical bandwidth - which is somewhat comparable to PS4 - can be rarely achieved (usually with simultaneous read and write, like FP16-blending) but in practice I've seen only a few cases where it becomes a limiting factor.

Leadbetter takes a dig at Naughty Dog

Digital Foundry: The Metro games have a reputation for pushing visual boundaries, but even high-end PCs can struggle to sustain 60fps. I've played Metro Last Light for hours on PS4 and Xbox One and that 60fps is basically locked. Obviously some of the PC high-end features are reduced or altered, but in an age where not even Naughty Dog can run its last-gen title consistently at 1080p60 on PS4, what is the secret of your success?

Oles Shishkovstov: There is no secret. We just adapted to the target hardware.

[...]

Their next title could return to 30fps on consoles -

Digital Foundry: Surely the easier path would have been to lock at 1080p30 and concentrate on integrating as many high-end rendering features as possible. Why target 60fps over 30fps?

Oles Shishkovstov: Because we can! Actually for the next unannounced project, the designers want more and more of everything (as usual) and quite possibly we will target 30fps.

Thinking about PC? Spend more on the GPU

Digital Foundry: Console development trends do have an impact on PC gaming. If you were building a mainstream gaming PC now with the future in mind, what choices would you make?

Oles Shishkovstov: This is tricky to answer without going into 'fan wars'. Get the most powerful components your budget allows for, with the emphasis on GPU.

There's some more stuff on PBR, APIs, unified memory architecture and mobile as well.
 

madmackem

Member
Pretty much echoes what people on GAF have been saying for some time: PS4 is more powerful, and the ESRAM is a major issue that will hamper them well into the gen.
 

Putty

Member
Pretty much echoes what people on GAF have been saying for some time: PS4 is more powerful, and the ESRAM is a major issue that will hamper them well into the gen.

Still won't quell the most... rabid XB1 owners, and a particular individual and his band of merry men (cult) in particular. Though I do await Mr X's response 8). I'm expecting more diagrams...
 

JesseDeya

Banned
But my favourite quote - just for out of context potential...

In general - I don't really get why they choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

PS4 one million times faster than Xbox One confirmed.
 

Plywood

NeoGAF's smiling token!
I wonder what he means by 2x more performance. GPU and CPU combined ?
I'm assuming he means that optimizing your code for specific hardware yields better performance versus hardware that would have the same/similar specs on PC. Which is essentially what he said.
 

T.O.P

Banned
Clear and direct answers

Great job on the game anyway, waiting for my X1 Redux copy to arrive later today :D
 

Kezen

Banned
I'm assuming he means that optimizing your code for specific hardware yields better performance versus hardware that would have the same/similar specs on PC. Which is essentially what he said.

We can thank DX11 for that I guess. DX12 is long overdue, and should drastically lower the overhead on PC.
I don't think we will ever need 2x console specs with DX12.
 

Vidpixel

Member
Digital Foundry: The Metro games have a reputation for pushing visual boundaries, but even high-end PCs can struggle to sustain 60fps. I've played Metro Last Light for hours on PS4 and Xbox One and that 60fps is basically locked. Obviously some of the PC high-end features are reduced or altered, but in an age where not even Naughty Dog can run its last-gen title consistently at 1080p60 on PS4, what is the secret of your success?

Was this really necessary? Good interview overall, though.
 

Rourkey

Member
Seems MS was way behind Sony with their APIs at release; they probably just used DX11 to get the thing out the door. I'm sure things will improve for XB1 as time goes by.
 

d00d3n

Member
I found this interesting:

Digital Foundry: You've improved both Metro titles, but the Redux is the same two games at their core. Last time we spoke, you dropped some hints about the future - specifically physics-based character animation. Now you've been hands-on with the next-gen consoles, can you tease anything else you're working on?

Oles Shishkovstov: For the game we are working on now, our designers have shifted to a more sand-box-style experience - less linear but still hugely story-driven. I will not go into details, but it requires some work from programmers as well. Also, we are improving graphics in very different aspects, like recently we did a physically-based global ambient occlusion (instead of local, like SSAO). I will not talk about PBR (physically-based rendering) here, because here we are at the stage when artists are still adapting their mentality to it.
 
Is ESRAM expensive? Is there a reason Microsoft couldn't have gone with 64 or 128MB?

I know this has been answered, but to show the scale of such an increase, here's what the Xbox One APU looks like with only 32MB of ESRAM:

[image: Xbox One APU die shot]
 

Nizz

Member
that dig at naughty dog is shitty. TLOU pretty much does run at 60fps for 99.9% of the time. :/
Yep, I agree. TLoU felt really smooth to me. A couple of drops into the 50s I still consider pretty damn smooth. I honestly did not encounter any spots in TLoU where I thought the framerate bothered the gameplay.

On topic, I only played about an hour of Metro 2033 but the game looks amazing and runs like a dream (PS4 version). Props to 4A Games for accomplishing this.
 

scoobs

Member
Still won't quell the most... rabid XB1 owners, and a particular individual and his band of merry men (cult) in particular. Though I do await Mr X's response 8). I'm expecting more diagrams...

He won't mention it I'm sure. Does anyone else read his stuff just for laughs? It's really good satire honestly, the comments are almost as inexplicable as the posts. I hate to give the guy any credit, but damnit if it isn't a joy to read when I need some comedy in my internet browsing.

I honestly cannot wait to see what the Metro guys come up with for their first true next-gen game... he mentions 2.5x "better, richer visuals". I like what I'm hearing :)
 
So will the people who vociferously post that coding to the metal is not an advantage get quieter now? Anyone who programs to squeeze out maximum performance ideally wants to know exactly how many cycles various low-level operations take, and that's impossible to pin down on a heterogeneous platform - and that's before you even consider the imported PC abstraction layers.
 

Feindflug

Member
He won't mention it I'm sure. Does anyone else read his stuff just for laughs? It's really good satire honestly, the comments are almost as inexplicable as the posts. I hate to give the guy any credit, but damnit if it isn't a joy to read when I need some comedy in my internet browsing.

I honestly cannot wait to see what the Metro guys come up with for their first true next-gen game... he mentions 2.5x "better, richer visuals". I like what I'm hearing :)

2.5x better visuals if they go for 30fps instead of 60fps.

Also they did a fine job with the console versions considering the time they spent with the devkits, really impressive stuff.
 

KKRT00

Member
Was this really necessary? Good interview overall, though.

But it's true. It's not locked like Metro, even though the Metro games do more in terms of pure tech than TLoU. That doesn't mean it runs badly or anything; it's just not a locked 60fps.
 

Renekton

Member
One little example I can give: Metro Last Light on both previous consoles has some heavily vectorised and hand-optimised texture-generation tasks. One of them takes 0.8ms on single PS3 SPU and around 1.2ms on a single Xbox 360 hyper-thread. Once we profiled it first time - already vectorised via AVX+VEX - on PS4, it took more than 2ms! This looks bad for a 16ms frame. But the thing is, that task's sole purpose was to offload a few cycles from (older) GPUs, which is counter-productive on current-next-gen consoles. That code path was just switched off
Makes sense.

Wii U started the trend of stacking the GPU since GPUs are better equipped for SIMD than CPUs ever will be, or at least Carmack said so too (when discussing Intel's Xeon Phi).
 

scoobs

Member
But it's true. It's not locked like Metro, even though the Metro games do more in terms of pure tech than TLoU. That doesn't mean it runs badly or anything; it's just not a locked 60fps.

Isn't it kind of semantics though? It sits at exactly 60fps for 99% of the game, with infrequent drops. I would agree that it was an unfair and uncalled for poke at Naughty Dog. Felt intentional, and it probably was. He's getting a reaction out of GAF so he probably feels like he did his job :p
 

c0de

Member
Digital Foundry: DirectX 11 vs GNMX vs GNM - what's your take on the strengths and weakness of the APIs available to developers with Xbox One and PlayStation 4? Closer to launch there were some complaints about XO driver performance and CPU overhead on GNMX.

Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

so DirectX12 might actually change the situation on xbone while others said it won't?
 
But it's true. It's not locked like Metro, even though the Metro games do more in terms of pure tech than TLoU. That doesn't mean it runs badly or anything; it's just not a locked 60fps.

It's a reductive statement. Naughty Dog are aiming for 1080/60 on their first next-gen title. Porting a title to PS4 originally made for PS3 is different to adapting an engine for current gen consoles.
 

KKRT00

Member
Isn't it kind of semantics though? It sits at exactly 60fps for 99% of the game, with infrequent drops. I would agree that it was an unfair and uncalled for poke at Naughty Dog. Felt intentional, and it probably was. He's getting a reaction out of GAF so he probably feels like he did his job :p

Going for a locked 60fps is harder than 60fps with slight drops, so it's a valid point. You need every scene, even the heaviest, to render in under 16.6ms, whereas in a game that merely runs at 60fps most of the time you only have to cover the common cases. A locked 60fps in very dynamic games is especially challenging, because you don't control how many enemies and explosions there will be at any time for different players.

To be clearer: the Metro games probably run at about 90-100fps uncapped, where TLoU is at about 70-75fps. Metro needs more framerate headroom to stay locked at 60.
---
It's a reductive statement. Naughty Dog are aiming for 1080/60 on their first next-gen title. Porting a title to PS4 originally made for PS3 is different to adapting an engine for current gen consoles.

4A adapted two past-gen games with some DX11 features to consoles and still aimed for a locked 60fps - that's challenging. Definitely more challenging than what ND did.

And again, it's not really a complaint about ND, but locking 60fps is a different philosophy and a harder challenge than making your game run at 60fps most of the time.
 

gofreak

GAF's Bob Woodward
so DirectX12 might actually change the situation on xbone while others said it won't?

It depends. Taking what they're talking about there, if freeing up some CPU improves the framerate on the game then that would obviously be an improvement. But where your framerate is bound is quite a variable thing per game. From what they're saying there it wouldn't improve things on the GPU side so much as making things more efficient on the CPU side, so GPU bound scenarios may not benefit so much.
 

Durante

Member
It depends. Taking what they're talking about there, if freeing up some CPU improves the framerate on the game then that would obviously be an improvement. But where your framerate is bound is quite a variable thing per game. From what they're saying there it wouldn't improve things on the GPU side so much as making things more efficient on the CPU side, so GPU bound scenarios may not benefit so much.
This is also true on PC by the way. With the added fact that you are less likely to be CPU-bound on a modern PC.
 
I really love the direct answers - no PR garbage in play here. I'm quite surprised at how layered the XO API is; he basically said it's DX11. As for Leadbetter's comparison with TLOU:R, it's true: the effort that went into the Redux edition is way above ND's. Not only did they significantly improve the graphics, they managed to make the game a locked 60fps at 1080p.
 

scoobs

Member
What do you mean by change the situation?

Once they unlock that stacked second GPU they gonna destroy that old PS4 hardware.
Jk... I'm sure he is referring to getting the lesser versions of multiplatform games? It might bring the Xbox One more in line with the PS4; it just depends on whether devs want to utilize the more powerful hardware in the PS4 or just force parity across both. The PS4 is simply more powerful - he even says so in this interview - so theoretically there should always be some kind of gap, if developers decide to take advantage of both consoles.
 
4A adapted two past-gen games with some DX11 features to consoles and yet still aimed for locked 60fps, its challenging. Definitely more challenging that what ND did.

Two past-gen console games, or porting two old pc games to what are basically PCs?

There is a bit of a difference.
 

mrklaw

MrArseFace
so DirectX12 might actually change the situation on xbone while others said it won't?

Digital Foundry: DirectX 11 vs GNMX vs GNM - what's your take on the strengths and weakness of the APIs available to developers with Xbox One and PlayStation 4? Closer to launch there were some complaints about XO driver performance and CPU overhead on GNMX.

Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result.

This seems confusing, because both have effectively the same CPU (the Xbox's even apparently a little faster). So why would it be so much slower? Are they using the GPU somehow to generate draw calls for itself? A main memory bandwidth limitation? A shit driver from MS?
 

Denton

Member
Most significant new information from this interview is the fact that their next game will FINALLY be more open and less linear. And it probably concerns their new space game.

And as a bonus, they might target 30fps on consoles, meaning much better graphics at 60fps for me. Win-win.
 

gofreak

GAF's Bob Woodward
This seems confusing, because both have effectively the same CPU (the Xbox's even apparently a little faster). So why would it be so much slower? Are they using the GPU somehow to generate draw calls for itself? A main memory bandwidth limitation? A shit driver from MS?

It was the 'shit driver' - the API was doing a lot of window dressing around draw calls, eating up CPU time etc. As they discuss later, the GNM (and now, DX12) model is more 'you look after this yourself, we won't do extra housekeeping for you anymore'.

For all the shit Sony gets on software, I find it interesting that they basically pioneered the model of a modern 'low level' API with GNM that everyone else is now following. At least on the game runtime and tool side, PS3 really kicked them into shape.
 

Marlenus

Member
Is ESRAM expensive? Is there a reason Microsoft couldn't have gone with 64 or 128MB?

It is on the same die as the GPU and the CPU, and 32MB is already the largest on-die SRAM pool ever made, so making it even larger would have:

a) increased power consumption,
b) decreased APU yield,
c) increased APU cost due to b,
d) increased power supply cost due to a,
e) increased cooling costs due to a.
 

Durante

Member
I really love the direct answers - no PR garbage in play here. I'm quite surprised at how layered the XO API is; he basically said it's DX11. As for Leadbetter's comparison with TLOU:R, it's true: the effort that went into the Redux edition is way above ND's. Not only did they significantly improve the graphics, they managed to make the game a locked 60fps at 1080p.
To be fair, they significantly improved the asset quality while at the same time massively reducing or even eliminating two of the most performance-hungry effects (volumetric lighting and high-quality pp motion blur). It's not an improvement across the board (unless you mean compared to the last-gen console versions?)
 
But its true. Its not locked like Metro, even though Metro games do more in terms of pure tech than TLoU. This doesnt mean that it runs badly or anything, its just not locked 60fps.

There's a big difference in that TLoU was developed solely for PS3 - bizarre hardware at best. Bringing Metro, a PC game, across to platforms that mimic PC hardware is a more straightforward undertaking.

I think both developers appear to have done a great job.
 

Kezen

Banned
To be fair, they significantly improved the asset quality while at the same time massively reducing or even eliminating two of the most performance-hungry effects (volumetric lighting and high-quality pp motion blur).

Tessellation has also been a casualty.
 

c0de

Member
It depends. Taking what they're talking about there, if freeing up some CPU improves the framerate on the game then that would obviously be an improvement. But where your framerate is bound is quite a variable thing per game. From what they're saying there it wouldn't improve things on the GPU side so much as making things more efficient on the CPU side, so GPU bound scenarios may not benefit so much.

I guess freeing up CPU time is almost always helpful, no matter what you do. I guess this is one of the most interesting parts, besides this:
But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.
I wonder how much of an improvement this makes. And it could put the Diablo 3 situation in a different light: obviously MS is doing a lot of work with each XDK release, so if you build against an old XDK you might lose performance.
 

coastel

Member
performance-wise. And let's not forget that programming close to the metal will usually mean that we can get 2x performance gain over the equivalent PC spec.

Oh shit, PC GAF is not gonna like that... Anyway, great article; even though I don't understand some of it, it's very interesting to read.
 

Denton

Member
To be fair, they significantly improved the asset quality while at the same time massively reducing or even eliminating two of the most performance-hungry effects (volumetric lighting and high-quality pp motion blur). It's not an improvement across the board (unless you mean compared to the last-gen console versions?)
In the PC Gamer comparison it seemed like texture quality in 2033 Redux also took a hit compared to the original.
 

Marlenus

Member
But it's true. It's not locked like Metro, even though the Metro games do more in terms of pure tech than TLoU. That doesn't mean it runs badly or anything; it's just not a locked 60fps.

As others have mentioned, there is a big difference between optimising a game that already runs on x86 hardware (even if the API is a bit different) and porting a game running on a completely different architecture.

4A had an easier base to start with, and as they said, they ported it and then tested and tested and tested. When ND first did the port the game did not even run, so they had to get it working before they could get into the testing phase, which meant they had less time to optimise for performance - and they had further to go.

By the sounds of it there is still a lot of performance on the table, so games that come out in two or three years will be really interesting to see. Hopefully it will be used for some great interactivity and physics rather than just extra screen fluff. I hope they use it to make the worlds feel really alive.
 

KKRT00

Member
There's a big difference in that TLoU was developed solely for PS3 - bizarre hardware at best. Bringing Metro, a PC game, across to platforms that mimic PC hardware is a more straightforward undertaking.

I think both developers appear to have done a great job.
Two past-gen console games, or porting two old pc games to what are basically PCs?

There is a bit of a difference.

Nope. Metro 2033 was not very multithreading-friendly, and the same goes for Metro: LL.
Both games needed tons of refactoring, especially for a locked 60fps.
Both games were also DX9 at heart, and running them at almost 100Hz on weak Jaguar cores is very impressive.
 

arhra

Member
This seems confusing, because both have effectively the same CPU (the Xbox's even apparently a little faster). So why would it be so much slower? Are they using the GPU somehow to generate draw calls for itself? A main memory bandwidth limitation? A shit driver from MS?

Vanilla DX11 does a ton of book-keeping as a result of its legacy as a hardware-independent API designed to make things as easy as possible: buffering GPU commands to make sure they're issued in the right order, checking that the GPU state is correct before issuing a new batch of commands, inserting multiple checks to make sure memory is being accessed safely, and so on. That all eats CPU time.

DX12, and traditional console APIs, generally leave all that up to the programmer themselves, under the assumption that they know what they're doing (and for consoles, because they're always running on the same hardware, there's no worry about abstracting any differences away), so they can do the absolute bare minimum of checking, saving a ton of CPU cycles.
 