
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Duke2k

Neo Member
Seriously, this needs to stop. The 14+4 split was debunked.

Not implying that there is a HW limitation enforcing a 14+4 split, but that 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.

The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply using 8 for non-GPU tasks. Not sure whether that 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. The summary says a 10+8 split, but I think Cerny cites 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
 

Tabular

Banned
PlayStation 4 is currently around 50 per cent faster than its rival Xbox One. Multiple high-level game development sources have described the difference in performance between the consoles as “significant” and “obvious.”

Our contacts have told us that memory reads on PS4 are 40-50 per cent quicker than Xbox One, and its ALU (Arithmetic Logic Unit) is around 50 per cent faster. One basic example we were given suggested that without optimisation for either console, a platform-agnostic development build can run at around 30FPS in 1920×1080 on PS4, but it’ll run at “20-something” FPS in 1600×900 on Xbox One. “Xbox One is weaker and it’s a pain to use its ESRAM,” concluded one developer.

Microsoft is aware of the problem and, having recently upped the clock speed of Xbox One, is working hard to close the gap on PS4, though one developer we spoke to downplayed the move. “The clock speed update is not significant, it does not change things that much,” he said. “Of course, something is better than nothing.”

So "20-something" is a wild card here, as some have said. We pretty much have to guess what "20-something" actually is. The difference between 21 and 29 is huge, for example. Personally I would assume it will be high twenties, or they would have said low or mid twenties. However, using the purported 50% higher performance we can estimate. If you do the math:

(1920 x 1080) / (1600 x 900) = 1.44

(1.44 / 1.50) x 30 = 28.8

So if the Xbone is performing at 66% of the PS4, it is likely running the game at 900p @ 28.8 fps. So not a huge disparity to work with, although I wonder if there will be other degradations happening in the build process. Anyway, not saying I know what they meant, but ~28fps is my guess until we know better.
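To make that back-of-the-envelope estimate explicit, here's a quick Python sketch of the same math (assuming a purely GPU-bound workload that scales linearly with pixel count, which real games won't do cleanly):

ps4_pixels = 1920 * 1080   # PS4 build resolution from the EDGE quote
xb1_pixels = 1600 * 900    # Xbox One build resolution
ps4_fps    = 30.0          # quoted PS4 framerate
perf_ratio = 1.50          # purported PS4 performance advantage

pixel_ratio = ps4_pixels / xb1_pixels          # = 1.44
xb1_fps = ps4_fps * pixel_ratio / perf_ratio   # = 28.8 fps at 900p
print(round(xb1_fps, 1))

Same 28.8 number as above; tweak perf_ratio and you can see how sensitive the "20-something" guess is to the assumed gap.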

What it does not cover is what happens once these builds are massaged to use each console's specific strengths. What happens once the dedicated hardware in the Xbox is put to use, what happens when the developer actually thinks about their render pipeline and starts to use the ESRAM, etc. Likewise when developers start to experiment with the more abundant GPU on the PS4 for extra GPGPU.

At the moment developers are having a hard time: they have had poor drivers and non-final specs, and they are targeting more consoles than ever, as they often have 360 and PS3 ports on the go as well.

Yes, it would appear the Xbox takes slightly more work than porting a PC build, but as the console game is all about refinement and optimization, this is not a bad thing, since developers will be doing this anyway.

Are you thinking that the early Xbox One APIs talked about by devs aren't using the ESRAM yet, and that's why there is a 50% performance gulf ("40-50% quicker memory reads")? Obviously ESRAM is used in all these early builds, or the difference would be more like 160% instead of 40-50%. Although I agree with you that the Xbone should benefit from optimizations more than the simpler PS4 memory architecture. As you said, once GPGPU inevitably becomes integral to game development, the PS4 too will benefit from the extra CUs and 16 times more ACE queues, etc. At launch, however, I do expect to see ~40% better-performing multiplats on the PS4.
 

CLEEK

Member
Not implying that there is a HW limitation enforcing a 14+4 split, but that 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.

The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply using 8 for non-GPU tasks. Not sure whether that 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. The summary says a 10+8 split, but I think Cerny cites 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).

There's no split. The VGLeaks 14+4 was debunked as untrue. There are just 18 CUs.

I think you're confusing it with the 8 ACEs in the PS4, which are used for GPGPU. GPGPU works in tandem with typical GPU rendering; it's not one or the other. GPGPU compute tasks are fed to the GPU when it has spare cycles.

W!CKED did a good summary of how GPGPU in the PS4 works in another thread. The Xbox One will work in the same way, but far more limited. Not only does the One have fewer CUs, it only has 2 x ACE and 2 x compute queues (compared to 8 x ACE and 64 x compute queues in the PS4).
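For anyone wondering what "fed to the GPU when it has spare cycles" means in practice, here's a toy Python sketch (purely conceptual, not how the actual hardware or any real API works): queued compute jobs just drain into whatever time slots the graphics work leaves idle, instead of permanently claiming CUs.

from collections import deque

def run_frame(gfx_busy, compute_queue):
    """gfx_busy: list of bools per time slot; compute_queue: deque of job names."""
    schedule = []
    for slot, busy in enumerate(gfx_busy):
        if busy:
            schedule.append((slot, "graphics"))
        elif compute_queue:                       # idle slot -> feed a compute job
            schedule.append((slot, compute_queue.popleft()))
        else:
            schedule.append((slot, "idle"))
    return schedule

# e.g. shadow-map passes leave the shader cores under-used mid-frame
gfx = [True, True, False, False, True, False, True, True]
jobs = deque(["physics", "raycast_audio", "collision"])
print(run_frame(gfx, jobs))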
 

Finalizer

Member
Not implying that there is a HW limitation enforcing a 14+4 split, but that 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.

The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply using 8 for non-GPU tasks. Not sure whether that 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. The summary says a 10+8 split, but I think Cerny cites 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).

So now we're moving from 14+4 to 10+8. Goodie.

This fellow explained it nicely. You don't dedicate entire CUs to GPGPU; it's all scheduled by the PS4 when there are resources available. EDIT: Or the guy above me can find a quote that explains it in further detail, heh.
 

Klocker

Member
So now we're moving from 14+4 to 10+8. Goodie.

This fellow explained it nicely. You don't dedicate entire CUs to GPGPU; it's all scheduled by the PS4 when there are resources available. EDIT: Or the guy above me can find a quote that explains it in further detail, heh.

Let's say 12+6


where the 6 makes up for the lack of ESRAM ;)
 

KidBeta

Junior Member
Not implying that there is a HW limitation enforcing a 14+4 split, but that 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.

The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply using 8 for non-GPU tasks. Not sure whether that 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. The summary says a 10+8 split, but I think Cerny cites 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).

"All you will need for graphics".

What?, thats not how things work at all, to imply something that, that there is a fixed number of resources needed for certain activities shows a distinct lack of understanding about how any of it works.
 

Duke2k

Neo Member
There's no split. The VGLeaks 14+4 was debunked as untrue. There are just 18 CUs.

I think you're confusing it with the 8 ACEs in the PS4, which are used for GPGPU. GPGPU works in tandem with typical GPU rendering; it's not one or the other. GPGPU compute tasks are fed to the GPU when it has spare cycles.

W!CKED did a good summary of how GPGPU in the PS4 works in another thread. The Xbox One will work in the same way, but far more limited. Not only does the One have fewer CUs, it only has 2 x ACE and 2 x compute queues (compared to 8 x ACE and 64 x compute queues in the PS4).

Oh well, it might be the 8 ACEs. The other link with the summary concluded it to be SPE; I thought he knew Japanese and so translated it better than Google.
 

Tabular

Banned
Sounds like the anonymous dev needs to brush up on his ESRAM skills.

Dev(s), as in several of them. And no, it won't be easy getting parity with the PS4. There's a lot of redundancy in using 32 MB as a framebuffer, which will eat away at the high bandwidth.

As for the DDR3: only 68 GB/s, minus OS needs, minus CPU needs = not much left for textures.
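Just to show the shape of that subtraction (Python, illustrative only - the OS and CPU shares below are made-up placeholders, not known figures; only the 68 GB/s number comes from the spec):

ddr3_total_gbps = 68.0   # DDR3 peak bandwidth
os_share_gbps   = 5.0    # hypothetical OS reservation
cpu_share_gbps  = 15.0   # hypothetical CPU traffic
left_for_gpu = ddr3_total_gbps - os_share_gbps - cpu_share_gbps
print(f"{left_for_gpu:.0f} GB/s left for textures and render targets (illustrative)")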
 

NBtoaster

Member
Dev(s), as in several of them. And no, it won't be easy getting parity with the PS4. There's a lot of redundancy in using 32 MB as a framebuffer, which will eat away at the high bandwidth.

As for the DDR3: only 68 GB/s, minus OS needs, minus CPU needs = not much left for textures.

The OS doesn't consume bandwidth like that.
 

mrklaw

MrArseFace
The OS doesn't consume bandwidth like that.

10% of the GPU is reserved for the OS - presumably because a game has to cope with being 'snapped' and running alongside other apps. If they reserve GPU time, they'll also need to reserve memory bandwidth - if you're using Skype or fantasy football snapped alongside your game, that will be using up some bandwidth too (not much, but some).
 

Klarax

Banned
If sales for the Xbox are low (can't see it, though) then developers won't bother developing for the system; Wii U syndrome...

Any game developed won't be held back by the Xbox being weaker, then.
 

Log4Girlz

Member
If sales for the Xbox are low (can't see it, though) then developers won't bother developing for the system; Wii U syndrome...

Any game developed won't be held back by the Xbox being weaker, then.

Porting to Xbone would be trivial. So anything going to PS4 is going to Xbone. Even with low sales, ROI shouldn't be a problem.
 

QaaQer

Member
Ignored this thread mostly because I thought I could guess how stupid it could become. Then, out of curiosity, I poke my head in and find dudes blowing up a single frame of a pic to prove it was Photoshopped to hide clipping in an Xbone game. I don't know if it was or wasn't; if I had to guess I'd say it was. Either way, it doesn't make a lick of difference about anything at all. That being said, I know someone believes there's some scandal or winning going on here, so by all means continue.

What psychological need is fulfilled by coming into a discussion to tell people how stupid they are for having a particular interest in something? I guess I'm asking if that post made you feel better about yourself?
 

JonnyLH

Banned
Not implying that there is a HW limitation enforcing a 14+4 split, but that 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.

The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply using 8 for non-GPU tasks. Not sure whether that 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. The summary says a 10+8 split, but I think Cerny cites 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
This is a number we definitely need announced by Sony, because with the architecture of the box it's natural for its GPU to be picking up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back a lot of this information to fuel wars like this.
 

Busty

Banned
If sales for the Xbox are low (can't see it, though) then developers won't bother developing for the system; Wii U syndrome...

Any game developed won't be held back by the Xbox being weaker, then.

Even if the Xbone is a Wii U-level disaster (which it won't be), you'll see MS simply throw money at the problem, and third parties will still release shitty, vanilla ports of titles anyway.

Just look at all the effort Sony put into getting titles onto the PSP. I believe Sony were actually paying, or at least partially paying, for third parties to port titles to the system.

Jesus Christ. As someone who has worked at a major game studio, if you don't think this 'photoshopping' issue you guys are hung up on is happening at every studio, I got a bridge in Brooklyn to sell you.

Pffft. You clearly don't think much of the members of GAF. Bridge indeed.

That said, we are looking to invest the GAF pension fund in some magic beans. If you could hook us up, please PM me.
 

twobear

sputum-flecked apoplexy
If sales for the Xbox are low (can't see it, though) then developers won't bother developing for the system; Wii U syndrome...

Any game developed won't be held back by the Xbox being weaker, then.
Hahahaha, wow, just imagine how great the PS2 generation would have been if the Xbox had massively outsold the PS2, then!
 

Cesar

Banned
So they're capable of storing an extra 30fps in the cloud?

Forza 5 90fps confirmed (online only).

No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI AT LAUNCH, you will get cloud-controlled AI enemies in FPSes like Halo 5 in the future. It will go further each time.
 

bonus_sco

Banned
No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI AT LAUNCH, you will get cloud-controlled AI enemies in FPSes like Halo 5 in the future. It will go further each time.

Forza's AI isn't being calculated in the cloud at runtime. They're using machine learning to generate driving "personalities".
 

Finalizer

Member
This is a number we definitely need announced by Sony, because with the architecture of the box it's natural for its GPU to be picking up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back a lot of this information to fuel wars like this.

Stop trying to spread FUD. Again, there is no hard split of CUs dedicated to GPGPU tasks.

From Cerny's own mouth:

Now when I say that many people say, "but we want the best possible graphics". It turns out that they're not incompatible. If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame - for example during the rendering of opaque shadowmaps - that the bulk of the GPU is unused. And so if you're doing compute for collision detection, physics or ray-casting for audio during those times you're not really affecting the graphics. You're utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute.

Again, resources do not have to be taken away from graphical capabilities to utilize GPGPU functionality.
 
This is a number we definitely need announced by Sony, because with the architecture of the box it's natural for its GPU to be picking up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back a lot of this information to fuel wars like this.

The main job of the OS is to control processes. It's not the kind of workload that can be done on a GPU. Also, if a game needs 4 CUs for compute and 14 CUs for graphics, it doesn't mean those 4 CUs are off-limits to graphics. GPUs don't work like that. It means that in each second, 4/18 of the GPU time will be used for compute while 14/18 will be used for graphics.

If a multiplat game needs 4 CUs for compute, you can be sure the Xbone version will need 4 CUs for compute as well, which means the Xbone will have 8 CUs for graphics while the PS4 has 12.
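A tiny Python sketch of that "fraction of GPU time" reading (assuming a perfectly even share, which real workloads won't have): the same 4-CU-equivalent compute load eats a bigger slice of a smaller GPU.

def gpu_time_split(total_cus, compute_cus_equiv):
    compute_frac = compute_cus_equiv / total_cus
    return compute_frac, 1.0 - compute_frac

for name, cus in [("PS4", 18), ("Xbox One", 12)]:
    c, g = gpu_time_split(cus, 4)   # same compute load on both
    print(f"{name}: {c:.0%} of GPU time on compute, {g:.0%} on graphics")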
 

Chobel

Member
This is a number we definitely need announced by Sony, because with the architecture of the box it's natural for its GPU to be picking up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back a lot of this information to fuel wars like this.

Seriously, dude? Sony is doing that so NeoGAF members start console wars?

How much of the GPU is reserved for the OS?

Probably none. The PS4 OS doesn't do "snap".
 

Kaako

Felium Defensor
No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI AT LAUNCH, you will get cloud-controlled AI enemies in FPSes like Halo 5 in the future. It will go further each time.
Unless MS magically created a solution to minimize latency and make it a non-issue, you won't see cloud implementation for real-time AI in any meaningful way.
Keyword: magically.
 

bonus_sco

Banned
The main job of the OS is to control processes. It's not the kind of workload that can be done on a GPU. Also, if a game needs 4 CUs for compute and 14 CUs for graphics, it doesn't mean those 4 CUs are off-limits to graphics. GPUs don't work like that. It means that in each second, 4/18 of the GPU time will be used for compute while 14/18 will be used for graphics.

If a multiplat game needs 4 CUs for compute, you can be sure the Xbone version will need 4 CUs for compute as well, which means the Xbone will have 8 CUs for graphics while the PS4 has 12.

Is this where you get told to stop spreading FUD? ;)

Do you know what the PS4 reserves?
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Do you know what the PS4 reserves?

Why should the PS4 reserve any non-trivial GPU time for anything other than the game when the game is the only application rendering graphics during gaming? The XBO runs snapped Metro apps concurrently with the actual games, and since Metro is a hardware-accelerated UI and the WinRT-based apps have access to rendering capabilities, GPU time reservation is necessary. This does not apply to the PS4 which simply doesn't have the snap feature. As others already said, background tasks are much better put on the CPU.
 

mrklaw

MrArseFace
Why should the PS4 reserve any non-trivial GPU time for anything other than the game when the game is the only application rendering graphics during gaming? The XBO runs snapped Metro apps concurrently with the actual games, and since Metro is a hardware-accelerated UI and the WinRT-based apps have access to rendering capabilities, GPU time reservation is necessary. This does not apply to the PS4 which simply doesn't have the snap feature. As others already said, background tasks are much better put on the CPU.

At a guess, I'd say the same as the PS3 for popup notifications etc. - probably sub-1%?
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
At a guess, I'd say the same as the PS3 for popup notifications etc. - probably sub-1%?

This is marginal and, most importantly, there is no need to reserve a fixed amount of resources all the time for such things. Most probably, notifications are just part of the game's process and handled by the overall game framework, whereas snapped applications not only run in different processes, but in a different virtual partition.
 
Is this where you get told to stop spreading FUD? ;)

Do you know what the PS4 reserves?

In what sense am I spreading FUD? Don't start taking things personally now.

2 CPU cores for the OS and 2 ~ 2.5 GB of RAM is all we know so far. And the PS4 doesn't support snap, which means games can be suspended while using the PS4's UI. So I don't think there'll be as much of a hit on GPU time compared to the Xbone, which needs to render the game and the OS UI at the same time.
 

bonus_sco

Banned
In what sense am I spreading FUD? Don't start taking things personally now.

2 CPU cores for the OS and 2 ~ 2.5 GB of RAM is all we know so far. And the PS4 doesn't support snap, which means games can be suspended while using the PS4's UI. So I don't think there'll be as much of a hit on GPU time compared to the Xbone, which needs to render the game and the OS UI at the same time.

It's not personal :)
 
It's not personal :)

The following is my original post. Please point out where I'm spreading FUD.

The main job of the OS is to control processes. It's not the kind of workload that can be done on a GPU. Also, if a game needs 4 CUs for compute and 14 CUs for graphics, it doesn't mean those 4 CUs are off-limits to graphics. GPUs don't work like that. It means that in each second, 4/18 of the GPU time will be used for compute while 14/18 will be used for graphics.

If a multiplat game needs 4 CUs for compute, you can be sure the Xbone version will need 4 CUs for compute as well, which means the Xbone will have 8 CUs for graphics while the PS4 has 14.

Edit: changed ps4 having 12 CUs worth of GPU power left to 14 in the last sentence of my quote.
 
All of it is an assumption on how you think it works, none of it is accurate.

What is not accurate about it? Unlike CPU processes and threads, which have to be managed by a programmer, on GPUs you just pass in a small shader program which the GPU itself sends off to its many cores. Also, why would Cerny talk about asynchronous compute if you needed to partition the GPU? That doesn't make sense at all.
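Rough CPU-side analogy of that point (a Python thread pool, not a GPU API): you hand over one small function plus the data, and the runtime decides where it runs; you never pin the work to particular cores, much like you don't pin shader work to particular CUs.

from concurrent.futures import ThreadPoolExecutor

def kernel(x):      # the "small shader program"
    return x * x

data = range(16)
with ThreadPoolExecutor() as pool:   # the scheduler picks the workers
    results = list(pool.map(kernel, data))
print(results)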
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
All of it is an assumption on how you think it works, none of it is accurate.

The statement that you don't need to assign a fixed number of CUs to a category of tasks is accurate. GPU threads and thread groups are hardware-managed.
 

Jack_AG

Banned
Not implying that there is a HW limitation that there's a 14+4 split, but that probably 14 is all you would need to graphics. The rest 4 would have to be used for something else if you did not want them to be idle.

The finer point is that 50% tflop difference is quoted in the context of graphics and that's there is not going be 50% visual difference if you are not using all 18 CUs.

There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html, where he seems to imply usage of 8 for Non-GPU tasks. Not sure whether the number 8 is CUs or not, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ seems to imply so. Now the summary says a 10-8 split, but I think Cerny cites the usage of 8 only as an example (use google translate on the original interview, its almost gibberish, but still).
Dude. There is no such thing as "14 is all you need". No such thing. Devs will use whatever they can to bring the most to their games. It is never a case of "good enough", but a case of "how far can we go".

There is no single limitation other than the maximum number of CUs. There is no 14+4. It's just 18, and they can be used however a dev wishes.
 

Finalizer

Member
There's more to it than that.

I really can't say any more than that.

PS4 has secret sauce, lol.

Cool story, bro.



EDIT: Or should this be interpreted as "UNLEASH THE BISH" time?
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I really can't say any more than that.

There has been some coquetry with alleged insider knowledge lately. If somebody does not let his credentials be verified by a mod, I am inclined to dismiss all such statements as trolling.
 

Respawn

Banned
Ignored this thread mostly because I thought I could guess how stupid it could become. Then, out of curiosity, I poke my head in and find dudes blowing up a single frame of a pic to prove it was Photoshopped to hide clipping in an Xbone game. I don't know if it was or wasn't; if I had to guess I'd say it was. Either way, it doesn't make a lick of difference about anything at all. That being said, I know someone believes there's some scandal or winning going on here, so by all means continue.
Must have bugged the heck out of you, since you posted all that. You would be better off observing.
 