
DF: Leadbetter interviews 4A's Oles Shishkovstov about current gen consoles + PC

Durante

Member
Who does this guy think he is? PC GAF says they're exactly the same, and will always yield the same results.
Please provide a quote from "PC GAF" stating that. Don't worry, I'll wait. (By the way, why is generalizing "PC GAF" not treated in the same way as "Sony GAF"?)

I still think people should qualify such a statement a lot more before they present it to the masses. Reduce your CPU overhead twofold? Sure, at least in some scenarios (and you better, considering the anemic CPUs in consoles). But you won't get much better performance shading pixels or transforming vertices on consoles than on equivalent PC hardware -- and many high-end multiplatform titles confirm this.
 
I'm skeptical of the 2x performance increase they think can be gained, based pretty much on what developers have achieved so far. If developers are still getting used to the console APIs and over time come close to utilising them fully, wouldn't that just mean the console's graphics processing power will be fully utilised? I'm not understanding why they (and Carmack) think it's going to be a 2x performance increase.

PC games are horribly inefficient. Late-gen console games will be achieving much closer to peak throughput in all areas of their hardware than a PC will ever dream of.
 

coastel

Member
Please provide a quote from "PC GAF" stating that. Don't worry, I'll wait. (By the way, why is generalizing "PC GAF" not treated in the same way as "Sony GAF"?)

I still think people should qualify such a statement a lot more before they present it to the masses. Reduce your CPU overhead twofold? Sure, at least in some scenarios (and you better, considering the anemic CPUs in consoles). But you won't get much better performance shading pixels or transforming vertices on consoles than on equivalent PC hardware -- and many high-end multiplatform titles confirm this.

The problem is they are still early tools -- the article pretty much confirms this -- and there's no way a multi-platform game will be as well optimised as a first-party game, so it's kind of pointless to be honest, as neither side can prove it.
 

JaggedSac

Member
I doubt that in particular had anything to do with the Diablo situation - the "DX12-style" is probably essentially a prototype of DX12 itself (with whatever GCN-specific custom tweaks they've added that won't generalize over to nvidia/intel hardware), and taking advantage of that would likely take a near-complete rewrite of the renderer.

During the DX12 announcement presentation, they mentioned that the Forza DX12 port took them 4 months (with a team of 4 engineers working on it), so depending on when that API became available in the XDKs, I wouldn't expect anything to ship using it until next year (unless some first-party teams have had early access - Forza being ported early this year does make me wonder whether Horizon 2 might have had the time to benefit from that work), as nobody would want to embark on anything of that magnitude just before shipping.

Any improvements for games shipping in the near future will be down to whatever they can wring out from the kinect reserve being lifted, and improvements to the DX11.x runtime to fast-path more calls and eliminating as much of the overhead as they can without completely re-architecting the API.

16 man-months -- which means 4 people for 4 months.
 

Dragon

Banned
People on both sides are way too sensitive. Let Leadbetter say what he wants; the most interesting parts seem to be a discussion between Durante and gofreak, among others. Thanks for the link, OP.
 
Please provide a quote from "PC GAF" stating that. Don't worry, I'll wait.
It comes up all the time!
- the coding to the metal "myth."
- let's not dredge up that CliffyB tweet.
- there have been zero examples of a multi-platform game benefiting from any metal advantage.
- PC has assembly too.
- any performance increase over the years was down to algorithms, not optimization.

It is PC GAF lore that metal is not a thing that matters, and PC GAF lore that the same number of teraflops means the same game.
That's just my impression! But what do I know? I've only been lurking daily for a year.
 

c0de

Member
Pretty much echoes what people on GAF have been saying for some time: PS4 is more powerful, and the ESRAM is a major issue that will hamper them well into the gen.

Of course it is more powerful. According to him, "PS4 is just a bit more powerful."
But ESRAM a major issue? I guess you opened a different link.
 

Oemenia

Banned
It comes up all the time!
- the coding to the metal "myth."
- let's not dredge up that CliffyB tweet.
- there have been zero examples of a multi-platform game benefiting from any metal advantage.
- PC has assembly too.
- any performance increase over the years was down to algorithms, not optimization.

It is PC GAF lore that metal is not a thing that matters, and PC GAF lore that the same number of teraflops means the same game.
That's just my impression! But what do I know? I've only been lurking daily for a year.
To be fair many PC gamers here on GAF have been reasonable about it. A lot however start frothing at the mouth when the word comes up but then talk about the benefits of Mantle/DX12.
 

Shpeshal Nick

aka Collingwood
I don't know about ESRAM itself being expensive, but it would have made the APU a lot more expensive if they didn't want to compromise the GPU any further.

It directly increases the size of your main chip by a significant amount. So, yes.

I guess this greatly depends on how you define an "insane amount of money".

It's integrated on the same die with the APU (hence the E, from embedded) which is already huge as it is. That makes it expensive at the moment, until they shrink the whole chip.

I know this has been answered, but for the scope of such an increase, here's what the Xbox One APU looks like with only 32MB of ESRAM:

It is on the same die as the GPU and the CPU. 32MB is already the largest on-die SRAM pool ever made, so making it even larger would have:

a) increased power consumption,
b) decreased APU yield,
c) increased APU cost due to b,
d) increased power supply cost due to a,
e) increased cooling costs due to a.

Ok cool. Thanks for the replies.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Of course it is more powerful. According to him, "PS4 is just a bit more powerful."
But ESRAM a major issue? I guess you opened a different link.

PS4 has twice the number of GPU components but it's only a bit more powerful?
 

coastel

Member
To be fair many PC gamers here on GAF have been reasonable about it. A lot however start frothing at the mouth when the word comes up but then talk about the benefits of Mantle/DX12.

Yeah, it's wrong to brand either side as a whole because of what a few hardcore platform fans say.
 

c0de

Member
What would you expect him to say? "PS4 has a decent performance lead and it took us a while to get the X1 version up to scratch"?

Why not? I think given they had even less time on Xbox One their work on this version is amazing.
 

Seanspeed

Banned
I don't know if I still want 60 fps.

Seriously 2.5 times better, that's a lot.
That's a pretty arbitrary number there. Define what 1x better means, ya know?

And it all depends on what the developer decides to prioritize. You can easily take a 60fps game, throw in 8x MSAA, and suddenly you have a 30-40fps game. Is it going to look 2.5x better? Hardly. It will look a lot cleaner, obviously, but the multitude of other things that make up 'good graphics' are being ignored there.
 

JaggedSac

Member
Digital Foundry: To what extent will DX12 prove useful on Xbox One? Isn't there already a low CPU overhead there in addressing the GPU?

Oles Shishkovstov: No, it's important. All the dependency tracking takes a huge slice of CPU power. And if we are talking about the multi-threaded command buffer chunks generation - the DX11 model was essentially a 'flop', while DX12 should be the right one.

It is important, it isn't important -- can people make up their minds?
 

Seanspeed

Banned
It comes up all the time!
- the coding to the metal "myth."
- let's not dredge up that CliffyB tweet.
- there have been zero examples of a multi-platform game benefiting from any metal advantage.
- PC has assembly too.
- any performance increase over the years was down to algorithms, not optimization.

It is PC GAF lore that metal is not a thing that matters, and PC GAF lore that the same number of teraflops means the same game.
That's just my impression! But what do I know? I've only been lurking daily for a year.
I think what most people say is that more is made of the 'coding to the metal' thing than is really there. Perhaps, on-paper, in a theoretical capacity, it might be true. But in practical terms, I don't think there's a whole lot of evidence to support the difference being quite that big. And I also don't think you'll find anybody who denies that there is some advantage to using a low level console API.

You are exaggerating and generalizing.
 

Nzyme32

Member
It comes up all the time!
- the coding to the metal "myth."
- let's not dredge up that CliffyB tweet.
- there have been zero examples of a multi-platform game benefiting from any metal advantage.
- PC has assembly too.
- any performance increase over the years was down to algorithms, not optimization.

It is PC GAF lore that metal is not a thing that matters, and PC GAF lore that the same number of teraflops means the same game.
That's just my impression! But what do I know? I've only been lurking daily for a year.

What the hell is "PC GAF lore" and where can I see actual quotes of it?
 
Pretty much echoes what people on GAF have been saying for some time: PS4 is more powerful, and the ESRAM is a major issue that will hamper them well into the gen.

They had kits for a limited time and said themselves it was in an infant state.

ESRAM is not a "hamperiment", per se. It might have become one in early titles due to a lack of tools and APIs in the Xbone devkit, but that doesn't mean it's going to stay that way forever. Heck, we are already seeing improvements.

And this is a huge reason why:

But my favourite quote - just for out of context potential...



PS4 one million times faster than Xbox One confirmed.

The Xbone's XDKs even today don't offer the borderline performance the hardware is capable of. Things like multithreaded draw calls, which have been possible on PS4 since before day one, are still not possible on the Bone.

Of course, it's MS's own fault, but it means things will still improve further.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
In the grand scheme of computing power, he isn't wrong.

In the grand scheme, no. The GPU in the PS4 is only a bit more powerful than the Xbone in comparison with a GTX Titan. But in a direct comparison between just the PS4 and Xbone, I don't think that's true.
 
PS4 has twice the number of GPU components but it's only a bit more powerful?

Because PS4 does not have twice the GPU components, and even the ones it does have more of (like ROPs or TMUs) are usually bound by things other than the units themselves, like bandwidth.
 

KKRT00

Member
It comes up all the time!
- the coding to the metal "myth."
- let's not dredge up that CliffyB tweet.
- there have been zero examples of a multi-platform game benefiting from any metal advantage.
- PC has assembly too.
- any performance increase over the years was down to algorithms, not optimization.

It is PC GAF lore that metal is not a thing that matters, and PC GAF lore that the same number of teraflops means the same game.
That's just my impression! But what do I know? I've only been lurking daily for a year.

Coding to the metal is a myth -- yes. It's a myth because people treat it exactly like the quote above:
you get 2x more global performance on consoles, or you are only achieving 50% performance on PC hardware.
It's such a ridiculous statement to think that PCs run at 50% or less efficiency that I don't know who with any brain would claim that -- but you know what?
There are tons of people here and on the internet who do, even though not only is it theoretically not possible, there is not a single real example of it.

P.S. No one is claiming a lower-level API has no benefits. Far from it, actually.
 

c0de

Member
In the grand scheme, no. The GPU in the PS4 is only a bit more powerful than the Xbone in comparison with a GTX Titan. But in a direct comparison between just the PS4 and Xbone, I don't think that's true.

I think we are all well aware of the hardware differences, but we should keep in mind that all the numbers given are theoretical max values which probably can't be achieved all the time.
For example, we currently know that the OSes of the PS4 and Xbone (AFAIR) take a good amount of RAM (3GB, if I am not wrong) and two CPU cores. Then you have to keep in mind that both systems run multi-tasking OSes, so the scheduler is constantly stealing CPU time from your game (with priorities, of course, but still).
Comparing both systems with raw numbers is, in my opinion, of no real use this gen, as there are side effects on both systems whose performance hits we are uncertain of.
 

stryke

Member
Oles Shishkovstov: The problem with unified memory is memory coherence. Even on consoles, where we see highly integrated SoCs (system on chips), we have the option to map the memory addresses ranges basically 'for CPU', 'for GPU' and 'fully coherent'. And being fully coherent is really not that useful as it wastes performance. As for the traditional PC? Going through some kind of external bus just to snoop the caches - it will be really slow.

Isn't fully coherent the ultimate goal of HSA and heterogeneous compute?

Does that mean gaming is unlikely to benefit from it?
 

Mr Vast

Banned
PC games are horribly inefficient. Late-gen console games will be achieving much closer to peak throughput in all areas of their hardware than a PC will ever dream of.

Are you sure of that? Because the 8800 GT, released in November 2006 for 200 bucks, is faster than a PS3 in recent multiplats. This whole optimization thing is fishy.
 

meppi

Member
More like the OP trying to start a faux controversy.

 
Are you sure of that? Because the 8800 GT, released in November 2006 for 200 bucks, is faster than a PS3 in recent multiplats. This whole optimization thing is fishy.

It's also over twice as powerful as the GPU in the PS3, yet provides an experience that's marginally better. Please find me anything a 7950 GT, or whatever card is spec-matched to the RSX, can run at playable fps that looks anywhere near as good as Uncharted 3, God of War, TLOU, Killzone 3, etc.
 
It's also over twice as powerful as the GPU in the PS3. Please find me anything a 7950 GT, or whatever card is spec-matched to the RSX, can run at playable fps that looks anywhere near as good as Uncharted 3, God of War, TLOU, Killzone 3, etc.

So it should run games about the same as the PS3 thanks to the latter's automagical 2x applies-to-everything-and-every-developer-out-there-can-do-it-while-standing-on-his-head code to the metal optimizations, not significantly better (as it does).
 

Leb

Member
I was about to ask for an actual example of a game running on consoles at 2x performance compared to an equivalent PC. Maybe there are, but I can't find any.

Well, given that DX11 overhead is felt most acutely on the CPU side, I think you might actually see a large performance delta if you were to pair, say, an AMD 5150 (4 core Jaguar) with a 7850.

But in practice, of course, this is fairly meaningless, as even a mid-level build from 2011 is likely to sport a CPU that is more than twice as fast as the console counterparts, thereby erasing much of that delta.
 
In the grand scheme, no. The GPU in the PS4 is only a bit more powerful than the Xbone in comparison with a GTX Titan. But in a direct comparison between just the PS4 and Xbone, I don't think that's true.

He is primarily a PC developer so I imagine he is looking at this in terms of the bigger picture. To him, the difference is not a huge one.
 

Shin-Ra

Junior Member
'dem ROPs.

I think it's very promising we're getting 1080p60 with this level of detail early on while Valve/EA/HL2 struggled to maintain 720p30 early last-gen.
 
He is primarily a PC developer so I imagine he is looking at this in terms of the bigger picture. To him, the difference is not a huge one.

People should remember this. Unless someone is explicitly comparing PS4 and XBO, the difference will be minimal. I remember this discussion from other threads (I think it was about a developer from CD Projekt).
 
I was about to ask for an actual example of a game running on consoles at 2x performance compared to an equivalent PC. Maybe there are, but I can't find any.

There aren't any; if there were, you can be sure you would see them plastered all over NeoGAF threads. I believe no developer would seriously consider spending many months or even years on optimization just to eke out a bit better performance. Carmack, whose "2x" tweet is often brought up as evidence (since there isn't any actual evidence out there), had this to say on the topic of his greatest regrets:

“That’s the biggest thing when we look back on this generation,” said the co-founder of id Software, when asked about the length of time it’s taken to develop Rage in an interview with CVG.

“We had, at one point or another, seven years of work that went into it.

“It’s a foregone conclusion that we’re never going to throw out that much of the code base ever again. If we have eight times the resources on a next gen console, we can’t spend a decade writing the optimal technology for that. We need to be able to be at a point where we rewrite sections that matter and have an impact, and reuse as much as we can.”

He went on to say that his own development outlook has changed. He wants id’s content creators to be able to do their job without the never-ending drive for better technology getting in the way.

“A lot of the things we do nowadays, with the AI and the animation, is really good enough,” he said.

“Every aspect of Rage could be made better if we wanted to spend another two years on it, but it’s not the optimal use of [our] resources. I’d be happier if we produced two games in that time. And that’s actually my personal marching orders for technology development going forward.”

Carmack in the end understood what the vast majority of multiplatform devs already know. That endless optimization is a gigantic waste of time for little actual benefit.
 

Renekton

Member
It's such a ridiculous statement to think that PCs run at 50% or less efficiency that I don't know who with any brain would claim that
Well, you are directly contradicting Shishkovstov and Carmack; they are not mincing words or leaving it open to interpretation. Any counterclaim has to start with "Sorry, Oles and John, but you guys are wrong".
 

Oemenia

Banned
There aren't any; if there were, you can be sure you would see them plastered all over NeoGAF threads. I believe no developer would seriously consider spending many months or even years on optimization just to eke out a bit better performance. Carmack, whose "2x" tweet is often brought up as evidence (since there isn't any actual evidence out there), had this to say on the topic of his greatest regrets:



Carmack in the end understood what the vast majority of multiplatform devs already know. That endless optimization is a gigantic waste of time for little actual benefit.
Oh, so you believe that optimisation DOES exist after all. And you're quoting that n00b Carmack!

Well, you are directly contradicting Shishkovstov and Carmack; they are not mincing words or leaving it open to interpretation. Any counterclaim has to start with "Sorry, Oles and John, but you guys are wrong".
The defensive attitude only proves that it's to do with vested interests. Some people see bringing up the idea as somehow making more powerful PC hardware less impressive.
 

Seanspeed

Banned
Well, you are directly contradicting Shishkovstov and Carmack; they are not mincing words or leaving it open to interpretation. Any counterclaim has to start with "Sorry, Oles and John, but you guys are wrong".
More like, "Sorry Oles and John, but I don't see the evidence that supports this."

Once again, it's likely a theoretical thing, but not something that actually pans out in practical circumstances.
 

Alej

Banned
You people just use that "2x more perf" quote in fanboyish ways. Please stop; it's stupid, and you aren't devs or anything close to that.
I endlessly see odd comparisons flourish here, like comparing multiplatform games and the like to prove anything.

Comparing multiplatform games between PC and consoles will always be about brute-forcing things. Why? Basically because those multiplatform games aren't designed to take advantage of platform-specific hardware features (and it's the same for console-war bullshit using multiplatform software), so yes, more flops = more performance here, always.

PC will always have a power advantage over consoles; it is just limited by the tech available at the time, and that's it. It's undebatable. Problems arise when you compare things that aren't very comparable, especially using bad examples (multiplatform games).

Yes, Crysis 3 on PS3 is exactly what you should expect of equivalent hardware on PC (if it exists), because that kind of game is distributed across a big range of hardware and isn't designed around the advantages (and flaws) of one piece of hardware in particular. But a game like TLOU is something you can only dream of achieving with equivalent hardware in a PC. That doesn't mean PC hardware isn't as capable; I bet if a PC developer designed a game just for that particular hardware (say, a PC config with tflops and bandwidth equivalent to the PS3), he could build a game as ambitious as TLOU, because the hardware is effectively as capable.

This particular problem resides in the design philosophy. Designing a game around the hardware of a closed platform improves efficiency by a lot. You could say, to end this, that multiplatform console gaming and PC gaming have the same limiting factors. PC is just as if there were plenty of other consoles out there, some with more power and some with less, with games designed to run on all of them.

Ultimately, I hear every now and then that consoles are a limiting factor for PC gaming, but in fact the biggest limiting factor for high-end PC gaming is PC itself. If there were a label like "high-end PC gaming only", forming a high-end platform with devs targeting this (and only this) range of high-end hardware, you would see things right now that you'd never imagine your shiny hardware was capable of, things you won't otherwise see until the next generation of consoles. I know it's frustrating, and it's a shame that hardware isn't fully used and never will be, because high-end PC gaming is niche (in the eyes of big publishers), but it's not the fault of console gaming or even budget PC gaming; it's because devs don't (or aren't economically allowed to) design their games for your hardware.

That's why, in fact, plenty of us choose the console road.
 