
EuroGamer: More details on the BALANCE of XB1

goonergaz

Member
They are all the same.


basically 14 CU's is good enough for 1080P @ 60FPS using it for just fixed function graphics, so the other 4 can get you something like 1080P @ 75FPS or 1400P @ 60FPS.

but why would you do that if they are not standards? so you will be better off using the 4 CU's to add more effects using compute.

Or ensuring games hit 1080p60 locked?
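For what it's worth, the back-of-the-envelope version of that claim, assuming performance scales perfectly linearly with CU count (which is exactly the assumption the rest of this thread picks apart), looks something like this - the numbers are purely illustrative, not from any Sony material:

```python
# Naive linear-scaling sketch: if 14 CUs were enough for 1080p @ 60fps,
# what would 18 CUs buy you, assuming performance scaled 1:1 with CU count?
BASE_CUS, TOTAL_CUS = 14, 18
scale = TOTAL_CUS / BASE_CUS              # ~1.29x raw ALU

# Option A: keep 1080p, spend the extra ALU on frame rate
fps = 60 * scale                          # ~77 fps

# Option B: keep 60 fps, spend the extra ALU on pixels
base_pixels = 1920 * 1080
scaled_pixels = base_pixels * scale       # ~2.67M pixels
# at 16:9 that's roughly 2177x1224, i.e. nearer "1224p" than 1440p

print(f"{scale:.2f}x ALU -> ~{fps:.0f} fps at 1080p, "
      f"or ~{scaled_pixels / 1e6:.2f}M pixels at 60 fps")
```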
 
Tbh my interpretation (fwiw) was that 14 was already past a very sharp drop off...now you're sounding like "a bit of a drop past 14"

I was trying to take the edge out of it, since it gets people so riled up, but this is exactly how it was communicated: there is a point at which there is a huge knee in the performance curve, beyond that point there's a significant drop off in the value of additional ALU resources for graphics, and the PS4 is well to the right of that point or knee. Developers are then encouraged to use the additional ALU resources for compute instead, with a recommendation of 14 CUs for graphics rendering and 4 CUs for compute tasks.

So, it then becomes not what my definition of significant drop off is, but instead what is Sony's definition of a significant drop off. I don't know the answer to that.

Well it's a bit harsh to call bs until bish confirms either way isn't it...

See, well, this is kind of the reason I held off on saying anything for so long. People get so touchy when you say this stuff, but this is how it was presented on an official Sony slide at a devcon. I heard nothing about it being presented as just a one-off case or example. It sounded like it was fact regardless of what the developer intends to do, but that doesn't somehow prevent a developer from ignoring the recommendation and dedicating 15+ CUs, or even all 18, purely to graphics operations.
 
I'm not sure that claim is being made. The same kind of behaviour about non linear return on ALU in 'typical games' can be observed on PC, in that DF comparison article for example.
Reading the initial posts, it seemed that way. They made it sound like a ubiquitous and sharp drop off in performance gain - EDIT: specifically affecting this console hardware, as opposed to desktop GPUs.

I don't think anyone necessarily finds the idea that there may be non-linear returns on increasing the number of ALUs for graphics performance that controversial, if all else remains equal, and depending upon the specific software in question - as you've been describing.
 
They are all the same.


basically 14 CU's is good enough for 1080P @ 60FPS using it for just fixed function graphics, so the other 4 can get you something like 1080P @ 75FPS or 1400P @ 60FPS.

but why would you do that if they are not standards? so you will be better off using the 4 CU's to add more effects using compute.

Which basically means that the diminishing returns are not due to inefficiency, but that applying those gains to graphics may not be as beneficial as using them for compute. Eventually 4k gaming will require more. Come on 4k gaming
 
The DF comparison that tried to isolate 50%+ ALU showed a range of returns in 5 of today's games at some different levels of settings and effects. Those games showed a circa 17-30% improvement with 50% more ALU.

i.e. non-linear, 'might be better off spending it on something else' in some cases, but still fairly substantial

17%-30% is still a very significant change

Would love it if that could be used to improve the general graphics (resolution/framerate) of my PS4 games, although I am curious to see what becomes of the GPGPU push
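Putting rough numbers on that quoted range - this is just the 17-30% figure above divided by the 50% ALU increase, nothing more:

```python
# Rough scaling efficiency implied by the figures quoted above:
# 50% more ALU returning 17-30% more performance.
alu_increase = 0.50

for perf_gain in (0.17, 0.30):
    efficiency = perf_gain / alu_increase     # fraction of the extra ALU "realised"
    print(f"+{perf_gain:.0%} perf from +{alu_increase:.0%} ALU "
          f"-> ~{efficiency:.0%} scaling efficiency")
# Prints roughly 34% and 60%: clearly sub-linear, but far from worthless.
```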
 

gofreak

GAF's Bob Woodward
They are all the same.


basically 14 CU's is good enough for 1080P @ 60FPS using it for just fixed function graphics

There's no such thing (fixed function graphics) today...they're saying x CUs worth of ALU might often be a sweet spot for a typical mix of shaders today at a given res and framerate (probably 1080p, probably not 60fps if I had to hazard a guess...).

Of course, what is 'a typical shader mix'? They probably did some benchmarks. Has nothing to do with fixed function vs programmable graphics though.
 
No. Games are inherently spiky when it comes to performance; nothing is better served by an evenly distributed average load, and that's the entire reason we've had unified shaders since 2005.
But yes, XB1 will operate like that as well (barring the differences in approach to ACU by each company).

I remember reading that unified shaders were less efficient than dedicated shaders when the 360 came around (i.e. one pixel shader in the PS3 was more "powerful" at pixel shading than one unified shader in the 360), or was that just BS spreading around? I would assume things have been sorted out by now anyway.
 
Just sent.



Keep in mind I didn't say it couldn't be used however a dev wished. Just what Sony presented to devs about a bit of diminishing returns for graphics operations after the 14 CU mark. Devs if they wish can use all 18 for graphics, but apparently there's a significant drop off in the benefit that you see for the additional ALU resources for graphics past the 14 CU mark, which is why Sony suggests the most optimal use for the extra 4 CUs is to use them for compute work.

This is down to the xbone not having the grunt to have any use past 12. Just because the One cannot do it, stop making out that every other device out there above 14 is rubbish.

The amount of PR spin from you is unbelievable. This is the sort of junk I read on other forums trying to make out the PS4 is rubbish because it does not use 12 like it's the magic friggin number.

Maybe this is their secret sauce.

I might not buy a new graphics card for my PC next year but go and buy an old 8800 as that might have the magic number.

Sorry for the rant, just get hacked off with PR spin.
 
This is down to the xbone not having the grunt to have any use past 12. Just because the One cannot do it, stop making out that every other device out there above 14 is rubbish.

The amount of PR spin from you is unbelievable. This is the sort of junk I read on other forums trying to make out the PS4 is rubbish because it does not use 12 like it's the magic friggin number.

Maybe this is their secret sauce.

I might not buy a new graphics card for my PC next year but go and buy an old 8800 as that might have the magic number.

Sorry for the rant, just get hacked off with PR spin.

Dude, take a chill pill. This is actual information presented by Sony to developers. After you understand that, you're free to take it however you want. This can't be applied to desktop GPU scenarios. This is purely down to the design and balance of the PS4 hardware.

Oh shit, I said balance.

But, I kinda expected this kind of silly reaction, so I'll excuse myself.
 

onQ123

Member
If you don't think the extra ALU's for Compute is important just take a look at the Xbox One games like Ryse & Killer Instinct taking a hit in resolution just so they can have the effects in their games.

if XBOX One had 4 CU's leftover they could have added the effects at 1080P but instead they had to dig into the 12 CU's that they have to get these effects.
 
Dude, take a chill pill. This is actual information presented by Sony to developers. After you understand that, you're free to take it however you want. This can't be applied to desktop GPU scenarios. This is purely down to the design and balance of the PS4 hardware.

Oh shit, I said balance.

But, I kinda expected this kind of silly reaction, so I'll excuse myself.

But all we see is your interpretation of Sony's information

There is no impartiality here

That's why it would be nice to have someone else's take on this information

I don't know what Sony said about this

I know what you said about what sony said about this

That is not the same
 
Well it's a bit harsh to call bs until bish confirms either way isn't it...
Usage does not diminish with more CUs if your application uses them, simple as that. If you can feed the GPU there are no "diminishing returns". There could be a part of the PS4 that hinders use of all CUs effectively, BUT the general assertion that with over 14 CUs comes a big drop off is BS, pure and simple.
 

LiquidMetal14

hide your water-based mammals
Dude, take a chill pill. This is actual information presented by Sony to developers. After you understand that, you're free to take it however you want. This can't be applied to desktop GPU scenarios. This is purely down to the design and balance of the PS4 hardware.

Oh shit, I said balance.

But, I kinda expected this kind of silly reaction, so I'll excuse myself.

I don't have any real understanding of how internal parts work down to the silicon but the kind of stuff you mentioned is the kind of stuff most devs have known and talked about for a while now.

The way you've presented it comes across as a sort of desperate attempt to downplay the PS4. I don't necessarily disagree with everything you're saying but you are who you are and you do a ton of spinning for MSFT. That's the way it is so you bringing this thing up isn't exactly a bad piece of news. I just find it interesting that you would bring it up but this is a performance/tech related discussion so I accept it. I don't like how you brought this up and the message you're conveying but it is what it is.
 

gofreak

GAF's Bob Woodward
Dude, take a chill pill. This is actual information presented by Sony to developers. After you understand that, you're free to take it however you want. This can't be applied to desktop GPU scenarios. This is purely down to the design and balance of the PS4 hardware.

No, it's not. It's down to how games 'typically' run on any GPU and how much ALU a 'typical' game uses at a given res vs other resources. A PC GPU with the same ratio of ALU:Rops:bandwidth would see the same less than linear return on more ALU past a certain point with the same games. It is not some soldered-into-the-hardware switch, it's a matter of 'typical' software balance and how it relates to hardware.

Your root information is correct, but that interpretation is not.
 

onQ123

Member
There's no such thing (fixed function graphics) today...they're saying x CUs worth of ALU might often be a sweet spot for a typical mix of shaders today at a given res and framerate (probably 1080p, probably not 60fps if I had to hazard a guess...).

Of course, what is 'a typical shader mix'? They probably did some benchmarks. Has nothing to do with fixed function vs programmable graphics though.


It was just easier to explain it that way.
 

jcm

Member
I did :p
That explains exactly diminishing returns. There is a drop off in the value of the extra ALU resources for graphics after a certain point, that point being 14 CUs.

Maybe we're just quibbling over semantics, but to me a "huge knee" isn't "a bit of diminishing returns". It's a bottleneck.
 
If you don't think the extra ALU's for Compute is important just take a look at the Xbox One games like Ryse & Killer Instinct taking a hit in resolution just so they can have the effects in their games.

if XBOX One had 4 CU's leftover they could have added the effects at 1080P but instead they had to dig into the 12 CU's that they have to get these effects.

I never, ever, said they weren't important, or somehow won't be useful to the PS4. I simply said what Sony said to developers. Compute is used in games, too, so even if devs take sony up on their recommendation and use 14 CUs for graphics, that doesn't mean the remaining 4 CUs won't be used to make the game even better. The only point of mentioning it is that the PS4 GPU 14 + 4 balance rumor is true. If people didn't know what was meant by balanced at 14 CUs before, then they know now.

I didn't present the info as "haha ps4 blows! Xbox rocks!" I presented it in a purely informational, non trolling fashion. I know the PS4 is the stronger of the two machines.

I don't have any real understanding of how internal parts work down to the silicon but the kind of stuff you mentioned is the kind of stuff most devs have known and talked about for a while now.

The way you've presented it comes across as a sort of desperate attempt to downplay the PS4. I don't necessarily disagree with everything you're saying but you are who you are and you do a ton of spinning for MSFT. That's the way it is so you bringing this thing up isn't exactly a bad piece of news. I just find it interesting that you would bring it up but this is a performance/tech related discussion so I accept it. I don't like how you brought this up and the message you're conveying but it is what it is.

I brought it up specifically in reference to this post.

http://www.neogaf.com/forum/showpost.php?p=83275817&postcount=1107

On one hand the poster felt Microsoft was saying "there's a lot of misinformation out there," and then on the other they felt Microsoft was guilty of spreading more misinformation about what Sony said about their own hardware balance. It's a bit slick obviously since Sony wouldn't necessarily say that their system is only balanced for 14 CUs. They would say their GPU is balanced for 18, with 14 for graphics and 4 for compute being their preferred or more optimal use for the system. But there was some truth to the balanced at 14 CUs statement. The details explaining what that means were simply not elaborated on, in a similar fashion to how Cerny talked about the same thing in a Eurogamer interview, also without fully elaborating on what was meant, but extra details were presented to devs in a slide, and I just felt that required pointing out.

No, it's not. It's down to how games 'typically' run on any GPU and how much ALU a 'typical' game uses at a given res vs other resources. A PC GPU with the same ratio of ALU:Rops:bandwidth would see the same less than linear return on more ALU past a certain point with the same games. It is not some soldered-into-the-hardware switch, it's a matter of 'typical' software balance and how it relates to hardware.

Your root information is correct, but that interpretation is not.

I can accept that. It's just how it was presented: the "Given the rest of the design, the PS4 is heavily slanted towards ALUs" part made it sound like something that was a consequence of the design, and maybe something that may not translate quite the same to someone's PC that has a Radeon 7970. But maybe I'm wrong about that, and if I am then I admit to being wrong in my interpretation of the reason why this is the case.
 

LiquidMetal14

hide your water-based mammals
Maybe we're just quibbling over semantics, but to me a "huge knee" isn't "a bit of diminishing returns". It's a bottleneck.

This is true but we do not know how everything in PS4 would work. There will be theoretically some bottlenecks since it isn't an unhinged PC with the best of the best parts. That doesn't take away from PS4 though so that's why I question the motive and interpretation and yes, the person who is bringing this up and his track record. I don't want to typecast but that's how I see it.

I never, ever, said they weren't important, or somehow won't be useful to the PS4. I simply said what Sony said to developers. Compute is used in games, too, so even if devs take sony up on their recommendation and use 14 CUs for graphics, that doesn't mean the remaining 4 CUs won't be used to make the game even better. The only point of mentioning it is that the PS4 GPU 14 + 4 balance rumor is true. If people didn't know what was meant by balanced at 14 CUs before, then they know now.

I didn't present the info as "haha ps4 blows! Xbox rocks!" I presented it in a purely informational, non trolling fashion. I know the PS4 is the stronger of the two machines.

Fair enough. I still have my trepidation about you but I'm not calling you anything or demeaning you. I sit back and don't say a lot though so your original message got my attention for good or bad.
 

JaggedSac

Member
If you don't think the extra ALU's for Compute is important just take a look at the Xbox One games like Ryse & Killer Instinct taking a hit in resolution just so they can have the effects in their games.

if XBOX One had 4 CU's leftover they could have added the effects at 1080P but instead they had to dig into the 12 CU's that they have to get these effects.

Nothing but conjecture.
 

goonergaz

Member
Usage does not diminish with more CUs if your application uses them, simple as that. If you can feed the GPU there are no "diminishing returns". There could be a part of the PS4 that hinders use of all CUs effectively, BUT the general assertion that with over 14 CUs comes a big drop off is BS, pure and simple.

Yeah, I suppose I'm reading his comments in context...so therefore there are PS4 bottlenecks meaning that beyond 14 there is a sharp drop
 
No, it's not. It's down to how games 'typically' run on any GPU and how much ALU a 'typical' game uses at a given res vs other resources. A PC GPU with the same ratio of ALU:Rops:bandwidth would see the same less than linear return on more ALU past a certain point with the same games. It is not some soldered-into-the-hardware switch, it's a matter of 'typical' software balance and how it relates to hardware.

Your root information is correct, but that interpretation is not.

This is what I mean. If a dev wants to use 8 for one and 10 for the other, that is up to them. There is no diminishing return that drops off at 14. Some could drop off at 8, but then they have that extra 10 to do anything else.

With this there is so much more the PS4 will be capable of doing in the long run.
 

gofreak

GAF's Bob Woodward
Maybe we're just quibbling over semantics, but to me a "huge knee" isn't "a bit of diminishing returns". It's a bottleneck.

Look at any graph with a line scaling up uniformly that, at some point, changes how it scales. You'll have an angle - a 'knee', i.e. something that is no longer scaling linearly with an axis. There's no bottleneck, your perf isn't dipping past that point, it's just not scaling as it used to.

Yeah, I suppose I'm reading his comments in context...so therefore there are PS4 bottlenecks meaning that beyond 14 there is a sharp drop

Yikes. There is no drop. The gain may not be linear or the same past a certain point of ALU - which is entirely game dependent.
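If it helps to see the shape in numbers rather than on a slide, here's a toy curve where each CU past a hypothetical knee contributes less than the one before it. The knee position and the falloff factor are completely made up for illustration; nothing here comes from Sony:

```python
# Toy diminishing-returns curve: each CU past a hypothetical "knee" contributes
# less than the one before it. KNEE and FALLOFF are invented to show the shape.
KNEE = 14
FALLOFF = 0.6   # hypothetical: each CU past the knee adds 60% of the previous one's gain

def relative_perf(cus):
    perf, per_cu = 0.0, 1.0
    for cu in range(1, cus + 1):
        if cu > KNEE:
            per_cu *= FALLOFF
        perf += per_cu
    return perf

baseline = relative_perf(KNEE)
for cus in (12, 14, 16, 18):
    print(f"{cus} CUs -> {relative_perf(cus) / baseline:.2f}x the 14-CU baseline")
# Performance never drops past the knee -- it just stops scaling linearly.
```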
 
Senju probably scouring super old speculative B3D posts again.

Bish has been made aware of whatever information Senju is on about

Hopefully he will chime in with his take or even a suggestion one way or the other

I don't really believe it changes much of what we know already
 

LiquidMetal14

hide your water-based mammals
Bish has been made aware of whatever information Senju is on about

Hopefully he will chime in with his take or even a suggestion one way or the other

I don't really believe it changes much of what we know already

His info is not necessarily false, but interpreted the wrong way. But I'm not a PS4 HW engineer, so it's hard to even explain some of the things you read here, and I don't claim to know the details of how each transistor works in tandem.
 
Yeah, I suppose I'm reading his comments in context...so therefore there are PS4 bottlenecks meaning that beyond 14 there is a sharp drop

Tasks are not equal. I say build better engines / write better code. I would like to see this bottleneck.

edit: So it is just lower gain per CU past a certain point in the PS4... no bottleneck (well, there has to be some)
 

viveks86

Member
But all we see is your interpretation of Sony's information

There is no impartiality here

That's why it would be nice to have someone else's take on this information

I don't know what Sony said about this

I know what you said about what sony said about this

That is not the same

Actually it is what he said about what his developer source said about what Sony said. LOL. And that's what bugs me. This has multiple layers of interpretation and needs some corroboration from others to be taken seriously.
 

Skeff

Member
Everyone will be so disappointed when this comes down to being that 14+4 example sony gave that one time about if a developer wanted to designate certain resources to GPGPU... I highly doubt Sony have stated what SS claims. Even if it was true Sony wouldn't admit it.
 

Dragon

Banned
Gemüsepizza;83311589 said:
Oh right, now SenjutsuSage (of all people) has some "secret insider information" about Sony hardware that he can not post publicly for "reasons" and which somehow describe a bottleneck which isn't in the specs and which magically reduces the performance beyond 14 CUs, while AMD has plans to release cards with up to 44 compute units.

One usually gets banned for saying something like that without backing it up.
 

onQ123

Member
I never, ever, said they weren't important, or somehow won't be useful to the PS4. I simply said what Sony said to developers. Compute is used in games, too, so even if devs take sony up on their recommendation and use 14 CUs for graphics, that doesn't mean the remaining 4 CUs won't be used to make the game even better. The only point of mentioning it is that the PS4 GPU 14 + 4 balance rumor is true. If people didn't know what was meant by balanced at 14 CUs before, then they know now.

I didn't present the info as "haha ps4 blows! Xbox rocks!" I presented it in a purely informational, non trolling fashion. I know the PS4 is the stronger of the two machines.

Wasn't talking about you I was just putting things in perspective.

at a time when compute is being used more & more in video games this isn't the time to say oh 12 CU's is all that's needed for 1080P 60FPS games on a console.
 
Look at any graph with a line scaling up uniformly that, at some point, changes how it scales. You'll have an angle - a 'knee', i.e. something that is no longer scaling linearly with an axis. There's no bottleneck, your perf isn't dipping past that point, it's just not scaling as it used to.
That knee, and where it occurred, would also presumably depend on the specific software one was trying to run, though, right? I.e. if you're compute-limited then performance would continue to scale more-or-less linearly with more compute units?

The corresponding question would then be why one begins to see non-linear performance gain beyond 14 CUs in the situation in question - what is limiting its performance gain?
 

Guymelef

Member
One usually gets banned for saying something like that without backing it up.

The problem here is that he is not the source and there are no docs; he can PM words from a friend/etc... talking about it.
At the same time his source could be wrong...
 
The problem here is that he is not the source and there are no docs; he can PM words from a friend/etc... talking about it.
At the same time his source could be wrong...

Is that what senju's source is though?

We literally have no information on this except from Senju, and while I'm sure there is a source, I have no idea what it is comprised of, aside from the fact that someone who attended a Sony devcon is somehow involved
 
But he has so many threats to attend. I don't think he has enough enough time to scour old B3D posts.

What, now I have to fend off threats, too!? I can't keep up with all this stuff. :p

The problem here is that he is not the source and there are no docs; he can PM words from a friend/etc... talking about it.
At the same time his source could be wrong...

I wouldn't have said it if there were any chance of this being the case. Just saying.
 

Ebomb

Banned
Is that what senju's source is though?

We literally have no information on this except from Senju and while I'm sure there is a source I have no idea what it is comprised of aside from someone who attended a Sony devcon conference is somehow involved

Shouldn't the vgleaks stuff count as a Source? I thought that disclosed 14+4 as well.
 

gofreak

GAF's Bob Woodward
That knee, and where it occurred, would also presumably depend on the specific software one was trying to run, though, right? I.e. if you're compute-limited then performance would continue to scale more-or-less linearly with more compute units?

Correct.


The corresponding question would then be why one begins to see non-linear performance gain beyond 14 CUs - what is limiting its performance gain?

The number 14 depends on the software. It might be 'typical'. The reason a 'typical' shader workload may scale less than linearly beyond that inflection point is because the ratio of its demands on ALU vs its demands on other resources (like pixel write or bandwidth) is less than the ratio available in the hardware. So other resources start forming a bound that prevents the ALU side of the system from running on all cylinders. ALUs will start to idle more as the ratio of some other resource in the hardware isn't matching what the software calls for. Again, what ratio the software calls for will vary. It's why in DF's benchmarks some games gain more, some less, from additional ALU past the base card they used in that test. If you had a software mix that was very ALU intensive you could see a linear scale with all the CUs you can throw at it.

What Sony was saying here was 'just' what many would have already concluded - that past a certain point for a certain resolution, an x% larger amount of ALU ("flops") won't necessarily give you an x% higher framerate. In a typical case today, you might be leaving up to 4 CUs worth of ALU time on the table. So consider using (more ALU intensive) GPGPU to soak up the excess.
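To put that in really crude model form: treat frame time as whichever resource takes the longest, and note that extra CUs only shrink the ALU term. The workload figures below are invented purely for illustration - it's the max() behaviour, not the specific numbers, that matters:

```python
# Crude bottleneck model of the explanation above: frame time is set by whichever
# resource is slowest, and adding CUs only shrinks the ALU term. ALU_WORK and
# OTHER_BOUND are invented for illustration; only the max() behaviour matters.
ALU_WORK = 230.0      # hypothetical CU-milliseconds of shading work per frame
OTHER_BOUND = 15.5    # hypothetical ms floor set by ROPs / bandwidth / etc.

def frame_time_ms(cus):
    alu_time = ALU_WORK / cus             # ALU time shrinks as CUs are added...
    return max(alu_time, OTHER_BOUND)     # ...until something else becomes the bound

for cus in (12, 14, 16, 18):
    t = frame_time_ms(cus)
    print(f"{cus} CUs -> {t:.1f} ms/frame (~{1000 / t:.0f} fps)")
# Once ALU time dips under OTHER_BOUND, extra CUs sit partly idle for graphics --
# which is exactly why the suggestion is to soak up that headroom with compute.
```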
 

Ebomb

Banned
It's the second time you come here with this.
At least read the rest of the thread where you can see how this was debunked multiple times.

Hey, the jerkstore called. This is not the second time I've said this. My only other comment in this thread was relating to the Cerny quote on roundness.
 