
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

JonnyLH

Banned
Doesn't seem like you made it clear enough; this is what you said in your very first post:

Reads more like you're now finally here to school the poor uninformed masses.

And now you have to admit you were wrong, which, given that your first post called out Edge for being wrong, is very ironic.
Oh my absolute god. You wonder why this forum has such a bad reputation and is often called "SonyGAF". I'm not even preaching for the X1 here, nor a fanboy; I'm literally saying that the 50% claim is absolute horse-shit. Which, if you check the OP, is what this thread is about.
 

bonus_sco

Banned
Which is all fine and dandy, but there's still extra load there.

I'm not sure what your point is there.

If the GPU stalls, cycles go to waste, so it switches wavefronts and carries on processing. That's exactly how GPUs are designed to hide latency.

CPUs reorder instructions to achieve the same thing.

Getting as close to 100% hardware utilisation as possible is exactly what coding to the metal on consoles is all about.
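To put toy numbers on that, here's a rough Python sketch of how keeping many wavefronts in flight hides memory latency. The cycle counts are invented for illustration, not real GCN figures:

```python
# Toy occupancy model of GPU latency hiding: each wavefront does a few
# ALU cycles, then waits hundreds of cycles on memory. While one wavefront
# waits, the SIMD switches to another, so ALU utilisation grows with the
# number of wavefronts in flight. All numbers are illustrative only.
def alu_utilisation(wavefronts, alu_cycles=4, mem_latency=400):
    # fraction of the memory-wait window covered by other wavefronts' ALU work
    return min(1.0, wavefronts * alu_cycles / mem_latency)
```

With these made-up figures, 10 wavefronts only keep the ALUs 10% busy, while 100 wavefronts hide the latency completely.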
 

benny_a

extra source of jiggaflops
Oh my absolute god. You wonder why this forum has such a bad reputation and is often called "SonyGAF". I'm not even preaching for the X1 here, nor a fanboy; I'm literally saying that the 50% claim is absolute horse-shit. Which, if you check the OP, is what this thread is about.
And you have not given sufficiently good arguments for why that is the case. You even made the GDDR5 latency argument; that should tell everyone who has followed tech threads on GAF enough about your grasp of the technical debate.
 

JonnyLH

Banned
The GPU doesn't 'HAVE' to do anything; hell, the GPU doesn't even have to render the scene if you don't want it to. What most people are bringing you up on is that you still insist there is a hardware division somewhere that means 4 CUs have to be used for something, when that's obviously not true.
People have forwarded me information showing that it is possible, which is fine. But then you've got to consider which poor sod is coding it to do that. The fact of the matter is, it's the best and easiest way to 'dedicate' hardware to certain tasks. I'm not denying anything here.
 

onanie

Member
You're now gloating over one terminology mistake, disregarding the whole of my other posts, and avoiding my underlying point: the GPU will be utilised for things other than graphics, which makes this console generation a more level playing field.

The GPU can be utilised for things other than graphics, but not because it needs to (which is what you're desperately trying to imply here).
 

JonnyLH

Banned
And you have not given sufficiently good arguments for why that is the case.

The GPU is going to be working on other tasks, including physics, audio, and supporting the CPU in ALU operations. Which everyone in this heated debate is confirming. Which means when it comes to games, you're seeing a very level playing field. Which, may I add, we are seeing.
 

KidBeta

Junior Member
People have forwarded me information showing that it is possible, which is fine. But then you've got to consider which poor sod is coding it to do that. The fact of the matter is, it's the best and easiest way to 'dedicate' hardware to certain tasks. I'm not denying anything here.

Of course, but if you want to consider the poor sod, then you need to consider that GPGPU isn't exactly the easiest thing to extract performance from. There are many tasks where it is slower than a CPU, or only barely faster despite 10x the 'theoretical performance'; it is at times like getting blood from a stone. Most people will default to the CPU for a lot of tasks because of this.
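A back-of-the-envelope way to see why: the GPU pays a fixed dispatch/copy overhead per job, so for small workloads the CPU wins even against much higher raw throughput. Rough Python sketch; every number here is made up for illustration:

```python
# Toy cost model for "GPU slower than CPU on small tasks": the GPU is
# 10x faster per item but pays a fixed dispatch/transfer overhead that
# the CPU doesn't. All figures are invented, not measured.
def gpu_wins(n_items, cpu_ns_per_item=5.0, gpu_ns_per_item=0.5,
             dispatch_overhead_ns=50_000.0):
    cpu_time = n_items * cpu_ns_per_item
    gpu_time = dispatch_overhead_ns + n_items * gpu_ns_per_item
    return gpu_time < cpu_time
```

With these figures the break-even point sits around 11,000 items; below that, defaulting to the CPU is the right call.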

The GPU is going to be working on other tasks, including physics, audio, and supporting the CPU in ALU operations. Which everyone in this heated debate is confirming. Which means when it comes to games, you're seeing a very level playing field. Which, may I add, we are seeing.

Could be working on, not going to be; it doesn't have to. We have been over this already.
 

bonus_sco

Banned
Oh my absolute god. You wonder why this forum has such a bad reputation and is often called "SonyGAF". I'm not even preaching for the X1 here, nor a fanboy; I'm literally saying that the 50% claim is absolute horse-shit. Which, if you check the OP, is what this thread is about.

The PS4's GPU has 50% more CUs than the Xbox One's.

The Xbox One has fixed-function coprocessors which will help out a bit.
 

JonnyLH

Banned
Of course, but if you want to consider the poor sod, then you need to consider that GPGPU isn't exactly the easiest thing to extract performance from. There are many tasks where it is slower than a CPU, or only barely faster despite 10x the 'theoretical performance'; it is at times like getting blood from a stone. Most people will default to the CPU for a lot of tasks because of this.
Not in the case of the PS4. Isn't that why the extra beef is there in the GPU?
 

JonnyLH

Banned
The PS4's GPU has 50% more CUs than the Xbox One's.

The Xbox One has fixed-function coprocessors which will help out a bit.
So that means the whole console is 50% more powerful? You've proved my point. My god, I'm off for some food and games. Also, 15 co-processors, may I add.
 

bonus_sco

Banned
The GPU is going to be working on other tasks at hand which include physics, audio, support the CPU in ALU operations. Which everyone in heated debate is confirming. Which means when it comes to games, you're seeing a very level playing field. Which may I add, we are seeing.

The Xbox is going to be doing those things too.

If PS4 is split 14+4, Xbox might be split 9+3.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
You're now gloating over one terminology mistake, disregarding the whole of my other posts, and avoiding my underlying point: the GPU will be utilised for things other than graphics, which makes this console generation a more level playing field.

I am not gloating over terminology; I already said that you got the concepts wrong, and you are not "admitting" that, you are just jumping to the next topic, going in circles, just to repeat the same stuff you already wrote later on. As for avoiding issues, that's highly ironic; I already wrote:

By the way, I love how we are bouncing from issue to issue without any conclusion after every argument gets debunked. First, we are talking about an OS reserve of GPU time on the PS4. Without addressing the counter-arguments, you switch the debate to the PS4 "sorely" relying on the GPU. Without addressing the arguments there, you switch to a debate that the 4 CUs are supposedly different, and you dig out the latency argument that was already debunked a few posts ago.

That does not bring any debate forward.

I don't see a reason to jump from red herring to red herring without actually addressing the issues.
 

Finalizer

Member
The GPU is going to be working on other tasks, including physics, audio, and supporting the CPU in ALU operations. Which everyone in this heated debate is confirming.

You're the one saying this, not the rest of us. Everyone else is explaining to you that the GPU can be used for whatever devs want it to do. You're the one insisting on this 14+4 hardware-division nonsense.
 

KidBeta

Junior Member
Not in the case of the PS4. Isn't that why the extra beef is there in the GPU?

What I posted is the case for GPGPU programming in general, for every GPU.

The PS4 has plenty of CPU power. It might not have the most ultra-fast CPU of today, but it has 8 cores, for god's sake; that's plenty.
 
Not in the case of the PS4. Isn't that why the extra beef is there in the GPU?

I might be speaking out of turn, but Cerny said they designed the PS4 so that it was easy to use the power from day one, while futureproofing it so developers can extract more power from GPGPU enhancements as the years go on.
 

onQ123

Member
The ALUs are there in the GPU to help the CPU. If you take that away and dedicate all of them to rendering, then be prepared for poor audio, jerky fps, and a lack of physics.

Just because they can all be used for that purpose doesn't mean they should. I was going off the VGLeaks article, where it clearly said they can be used for rendering if devs want.

I was just showing you that it's not 4 CUs dedicated to Compute & that all of the CUs can be used as the devs please.


Nothing that you said changes that; you are just going from one thing to the next.
 

onanie

Member
The GPU is going to be working on other tasks, including physics, audio, and supporting the CPU in ALU operations. Which everyone in this heated debate is confirming. Which means when it comes to games, you're seeing a very level playing field. Which, may I add, we are seeing.

The PS4 GPU can work on tasks other than graphics, but if that is not matched by an equivalent effort from the Xbone GPU, it won't be a level field by any stretch of the imagination. It will be ugly for the Xbone.
 

JonnyLH

Banned
What I posted is the case for GPGPU programming in general, for every GPU.

The PS4 has plenty of CPU power. It might not have the most ultra-fast CPU of today, but it has 8 cores, for god's sake; that's plenty.
Of course it is, and it will be used extensively. Although it's not the best place for audio processing or physics at all in this case.

You're the one saying this, not the rest of us. Everyone else is explaining to you that the GPU can be used for whatever devs want to. You're the one insisting on 14+4 hardware division nonsense.
No I'm not? I'm just going from the VGLeaks article, and in software terms it's much easier to dedicate CUs to a specific purpose, which can easily provide that.

I am not gloating over terminology; I already said that you got the concepts wrong. As for avoiding issues, that's highly ironic; I already wrote:



I don't see a reason to jump from red herring to red herring without actually addressing the issues.
When you read a page and most of the quote boxes are green, it's easy to skip over replies. By no means is my underlying point wrong.

I'm out.
 

KidBeta

Junior Member
Of course it is, and it will be used extensively. Although it's not the best place for audio processing or physics at all in this case.

Audio will be fine on the CPU. I honestly cannot see it taking up more than one Jaguar core to do as many effects and reverbs as you want, which leaves you with at least five more. To do what? AI? May as well do other stuff on it.
 

Finalizer

Member
No I'm not? I'm just going from the VGLeaks article, and in software terms it's much easier to dedicate CUs to a specific purpose, which can easily provide that.

Again,

Now when I say that many people say, "but we want the best possible graphics". It turns out that they're not incompatible. If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame - for example during the rendering of opaque shadowmaps - that the bulk of the GPU is unused. And so if you're doing compute for collision detection, physics or ray-casting for audio during those times you're not really affecting the graphics. You're utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute.

Devs don't need to dedicate CUs to GPGPU functionality.
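Cerny's point can be put into a toy model: add up the ALU headroom left in each phase of a frame, since that's the time async compute can soak up "for free". The phase names, durations, and utilisation figures below are invented for illustration, not measured data:

```python
# Sketch of filling under-utilised GPU phases with async compute.
# Each phase is (name, duration_ms, alu_utilisation 0..1); the numbers
# are made up to illustrate the idea, e.g. shadowmap rendering leaving
# most of the ALUs idle.
frame_phases = [
    ("shadowmap", 3.0, 0.30),
    ("gbuffer",   4.0, 0.90),
    ("lighting",  5.0, 0.95),
    ("post",      2.0, 0.70),
]

def spare_alu_ms(phases):
    # ALU-milliseconds that compute jobs could use without hurting graphics
    return sum(dur * (1.0 - util) for _, dur, util in phases)
```

With these invented figures a 14 ms frame still leaves about 3.35 ALU-milliseconds spare, which is the headroom Cerny is describing: compute work scheduled there doesn't cost the graphics anything.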
 

Perkel

Banned
The GPU does have to work on them; it's why this ALU argument got added in the first place, because of the VGLeaks article. Sony have modified this for the purpose of it picking up these tasks. The extra beef is there for a reason.


Stop insisting that they added extra ALUs because their other hardware, like audio, is lackluster.

They added it because they see a future in GPGPU, not because it is a patch for lackluster hardware.
 
The GPU is going to be working on other tasks, including physics, audio, and supporting the CPU in ALU operations. Which everyone in this heated debate is confirming. Which means when it comes to games, you're seeing a very level playing field. Which, may I add, we are seeing.

Completely, utterly false.

Only if the developer WANTS the GPU to do tasks outside of rendering scenes will it do those tasks.

It is not a requirement whatsoever.

Someone forgot to tell you the PS4 has a dedicated audio processor as well, fully able to handle high-def audio streams on its own.

Also, they haven't "added" ALUs at all. The Radeon 7850 has 16 CUs, the 7870 has 20 CUs. Looks like they just met in the middle with the PS4. Most everything else in this GPU is right in line with the Radeon 7850, besides having a little more memory bandwidth and the like. I like to call the PS4 GPU a 7860; it seems to be right there in the middle.
 

onanie

Member
Of course it is, and it will be used extensively. Although it's not the best place for audio processing or physics at all in this case.

That is why the PS4 also has a dedicated audio processor. Neither console has dedicated physics processing. Thus all those "other tasks" that the PS4 GPU "needs" to do will be done with more available resources than the Xbox GPU has.
 

bonus_sco

Banned
Audio will be fine on the CPU. I honestly cannot see it taking up more than one Jaguar core to do as many effects and reverbs as you want, which leaves you with at least five more. To do what? AI? May as well do other stuff on it.

AI, animation, PVS, the GPU driver (we're looking at loads more draw calls, setting up compute, PRT work, streaming, etc.). The PS4's CPU or GPU is going to have to do the swizzling the move engines do on Xbox, and maybe audio on PS4 if you want SHAPE-like effects.

The CPU won't be short of work :)
 

Finalizer

Member
Hurray, another thread with another junior spreading misinformation.

 

KidBeta

Junior Member
AI, animation, PVS, the GPU driver (we're looking at loads more draw calls, setting up compute, PRT work, streaming, etc.). The PS4's CPU or GPU is going to have to do the swizzling the move engines do on Xbox, and maybe audio on PS4 if you want SHAPE-like effects.

The CPU won't be short of work :)

The GPU memory controllers will do all kinds of swizzling for you; it shouldn't be a surprise, as that is most likely what the XBONE DMEs are based on.

Setting up and firing off jobs these days is a lot cheaper than it used to be; I think it was around DX10 when they started trying to reduce the number of draw calls to put less strain on the CPU. PRT stuff should be done from the GPU, as it is the device that reads in the textures.

I honestly don't see a lot of work for the CPU to do :p. (I counted the GPU driver in the 2 reserved cores, btw, and reserved an entire Jaguar core just for SHAPE.)

Animation is a good point though.

If we give each job a core of its own, we still have 2 or 3 left over doing nothing.
 

bonus_sco

Banned
We don't even know what they do or what they contain; I'm going to treat it as PR fluff until we learn more.

Well, it depends on whether the VGLeaks docs are accurate, but you've got memcpy, tiling/swizzle, JPEG decode, LZ encode/decode, framebuffer scaling, framebuffer blending, audio generation, audio encode/decode, and h.264, all on coprocessors.
 
You also have to remember that not only does the PS4 have extra CUs compared to the X1; it also has more ROPs and texture units, so every little bit adds up when we're talking about GFX.
 

bonus_sco

Banned
The GPU memory controllers will do all kinds of swizzling for you; it shouldn't be a surprise, as that is most likely what the XBONE DMEs are based on.

Setting up and firing off jobs these days is a lot cheaper than it used to be; I think it was around DX10 when they started trying to reduce the number of draw calls to put less strain on the CPU. PRT stuff should be done from the GPU, as it is the device that reads in the textures.

I honestly don't see a lot of work for the CPU to do :p. (I counted the GPU driver in the 2 reserved cores, btw.)

Animation is a good point though.

People keep saying GPUs can do hardware tiling/swizzling. They can't :).

It's usually done by the driver on the CPU; MS added dedicated hardware for the task.
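For anyone wondering what swizzling actually is: one common layout is Morton (Z-order), where you interleave the bits of x and y so that 2D-neighbouring texels land close together in linear memory. Real console texture layouts differ in the details; this Python sketch just shows the idea:

```python
# Morton (Z-order) swizzle: interleave the bits of a 16-bit coordinate
# so texel (x, y) maps to a cache-friendly linear offset.
def _part1by1(n):
    # spread the low 16 bits of n so they occupy the even bit positions
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton_address(x, y):
    # linear offset of texel (x, y) in a Z-order-swizzled texture
    return _part1by1(x) | (_part1by1(y) << 1)
```

The 2x2 block (0,0), (1,0), (0,1), (1,1) maps to offsets 0, 1, 2, 3, so small neighbourhoods of the texture share cache lines instead of being a full row apart. Doing this per texel on the CPU is exactly the kind of busywork you'd want a fixed-function unit for.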
 

jaosobno

Member
Are you ignoring my posts on purpose?

The Xbox One only has 2 ACEs with 4 compute queues and a fixed-function Audio DSP; the PS4 has 8 ACEs with 64 compute queues, which can execute compute and graphics threads in parallel, as well as a basic Audio DSP.

It has more than enough compute "umph" compared to XBone.

Doesn't each ACE have 8 queues?
 

bonus_sco

Banned
The GPU memory controllers will do all kinds of swizzling for you; it shouldn't be a surprise, as that is most likely what the XBONE DMEs are based on.

Setting up and firing off jobs these days is a lot cheaper than it used to be; I think it was around DX10 when they started trying to reduce the number of draw calls to put less strain on the CPU. PRT stuff should be done from the GPU, as it is the device that reads in the textures.

I honestly don't see a lot of work for the CPU to do :p. (I counted the GPU driver in the 2 reserved cores, btw, and reserved an entire Jaguar core just for SHAPE.)

Animation is a good point though.

If we give each job a core of its own, we still have 2 or 3 left over doing nothing.

The CPU has to decide which tiles should be resident for PRT and "upload" them to the driver. The GPU just reads/writes to a virtual memory address.
 

Oppo

Member

.. he said, as the blast door came down on him...

I wonder why people do that: double down on an argument when practically the whole thread is saying no, that's not right. You'd think you'd want to be right, to have the actual correct info, more than to "win" the debate. Ah well.

Console launches, eh? Good times.

bonus sco said:
It's usually done by the driver on the CPU; MS added dedicated hardware for the task.

MS has a hardware swizzler?

Not doubting that, but I find the term kind of hilarious.
"Quickly! Into the Swizzler!"
 

bonus_sco

Banned
That is why the PS4 also has a dedicated audio processor. Neither console has dedicated physics processing. Thus all those "other tasks" that the PS4 GPU "needs" to do will be done with more available resources than the Xbox GPU has.

Cerny explicitly called out "audio" GPGPU. He talked about raycasting to work out effect volumes, etc.
 

bonus_sco

Banned
.. he said, as the blast door came down on him...

I wonder why people do that: double down on an argument when practically the whole thread is saying no, that's not right. You'd think you'd want to be right, to have the actual correct info, more than to "win" the debate. Ah well.

Console launches, eh? Good times.



MS has a hardware swizzler?

Not doubting that, but I find the term kind of hilarious.
"Quickly! Into the Swizzler!"

Lol, it was known as tiling until recently, but that gets confused with the tiles in PRT.

Swizzling gets confused with vector element operations now. Too many words have too many overloaded meanings.
 