
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I am winding you guys up, it's easy done ;)

Just a question before you are perma-banned: why would you go through the whole torturous registration process and then post stuff like this?
Surely you must have lurked on this site for ages before getting the all-clear. Please explain why you would do that.
 

Hurley

Member
Are wind ups taken that seriously around here?

I'd imagine they are, considering these threads are supposed to promote meaningful and intelligent discussion about the consoles' hardware. The last thing people want is someone in here winding others up just for a laugh.
 

Artorias

Banned
Are wind ups taken that seriously around here? It didn't take much to lead people on...

This is a forum dedicated to gaming news and discussion. Pretending to have inside info and then admitting to trolling, or "wind-ups", is definitely taken seriously.

It's a bit hard to believe that anyone might think that wouldn't be an issue.
 

Knuf

Member
Just a question before you are perma-banned: why would you go through the whole torturous registration process and then post stuff like this?
Surely you must have lurked on this site for ages before getting the all-clear. Please explain why you would do that.

Just look at his username: bonus_sco.
SCO were masters of trolling (remember the SCO vs. Linux patent case?); he probably worked there, and trolling is part of his DNA now.
 

KidBeta

Junior Member
Are wind ups taken that seriously around here? It didn't take much to lead people on...

Pretending to have insider information and then attempting to discredit what people are saying based on information you made up is taken seriously, I believe.
 

JonnyLH

Banned
Dude. There is no such thing as "14 is all you need". No such thing. Devs will use whatever they can to bring the most to their games. It is never a case of "good enough", but a case of "how far can we go".

There is no single limitation other than the maximum number of CUs. There is no 14+4. It's just 18, and they can be used however a dev wishes.

It's a common belief that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?

I hate to bring this topic up again, but CPUs are hindered by GDDR memory. Tasks like audio processing, which would normally be handled by the CPU, will get offloaded to the GPU. Same goes for physics. GDDR latency isn't as apparent in the GPU.

In conclusion, the number of CUs actually 'dedicated' to games, so to speak, may very well decrease.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
It's a common belief that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?

My guess would be the dedicated encoding/decoding hardware.

Mark Cerny said:
For example, by having the hardware dedicated unit for audio, that means we can support audio chat without the games needing to dedicate any significant resources to them. The same thing for compression and decompression of video.

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=3
 

KidBeta

Junior Member
It's a common belief that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?

I hate to bring this topic up again, but CPUs are hindered by GDDR memory. Tasks like audio processing, which would normally be handled by the CPU, will get offloaded to the GPU. Same goes for physics. GDDR latency isn't as apparent in the GPU.

In conclusion, the number of CUs actually 'dedicated' to games, so to speak, may very well decrease.

Every modern AMD GPU includes fixed-function hardware to encode and decode h.264 video on the fly without using any of the CUs.

Also, the GDDR5 latency rumour has been debunked multiple times; the latency is close enough to, if not the same as, DDR3's.
 
It's a common belief that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?

Jonny, I won't pretend to be as technically savvy as you are, but you keep missing stuff that Sony has already said has dedicated hardware for specific functions.

You should go and read up on all the PS4 hardware details and interviews first, or you'll keep getting debunked on trivial stuff.
 

JonnyLH

Banned
It's known that the PS4's audio chip is just an encoder/decoder and provides only basic support, in next-gen terms, for actual voice processing. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.
 

Perkel

Banned
Pretending to have insider information and then attempting to discredit what people are saying based on information you made up is taken seriously, I believe.

It's a 100% ban. We've already had cases like that, and people got banned. Not the first time, not the last.
 

JonnyLH

Banned
Every modern AMD GPU includes fixed-function hardware to encode and decode h.264 video on the fly without using any of the CUs.

Also, the GDDR5 latency rumour has been debunked multiple times; the latency is close enough to, if not the same as, DDR3's.
In the GPU it's a non-issue. In the CPU it's a completely different story: latency-dependent tasks, like an audio buffer, come to suffer.

I'm not trying to degrade the PS4 in any way; I'm just stating that the extra beef in the GPU was a smart move and will be used for tasks the CPU will come to struggle with.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
That tells me nothing. Also, it's known that the PS4's audio chip is just an encoder/decoder. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.

Without going into a new argument about all of that, what does any of that have to do with your original statement that "it's a common belief that some CUs will be reserved for OS tasks"?

Nice red herring. (http://en.wikipedia.org/wiki/Red_herring)
 

gruenel

Member
That tells me nothing. Also, it's known that the PS4's audio chip is just an encoder/decoder. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.

Really? Source on that? Because I was under the impression we know virtually nothing about PS4's audio chip.
 

KidBeta

Junior Member
That tells me nothing. Also, it's known that the PS4's audio chip is just an encoder/decoder. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.

Every (modern) AMD GPU includes a full h264 decoder/encoder.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/9
 

Tiberius

Member
It's known that the PS4's audio chip is just an encoder/decoder and provides only basic support, in next-gen terms, for actual voice processing. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.
Since when?
 

JonnyLH

Banned
Without going into a new argument about all of that, what does any of that have to do with your original statement that "it's a common belief that some CUs will be reserved for OS tasks"?

Nice red herring. (http://en.wikipedia.org/wiki/Red_herring)
When you have a system solely relying on its GPU, there very well could be. A question, though: people say the 14+4 CU rumour was debunked, but how so? From the early reports, VGLeaks have been spot on. Just wondering.
 

Guymelef

Member
It's known that the PS4's audio chip is just an encoder/decoder and provides only basic support, in next-gen terms, for actual voice processing. There's still the huge feat of actually processing those audio voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.

Here we go again... TheKayle 2.0.
Encoder/decoder is one of the functions, not the only one.
 

Perkel

Banned
When you have a system solely relying on its GPU, there very well could be. A question, though: people say the 14+4 CU rumour was debunked, but how so? From the early reports, VGLeaks have been spot on. Just wondering.


Spot on what? All CUs are the same; there is no 14+4.
 

KidBeta

Junior Member
When you have a system solely relying on its GPU, there very well could be. A question, though: people say the 14+4 CU rumour was debunked, but how so? From the early reports, VGLeaks have been spot on. Just wondering.

It wasn't debunked; it was just reported on wrong. It was an example Sony gave in a bit of developer documentation (is my guess). The reason people don't believe the 14+4 stuff is that a physical difference between the two doesn't make sense: it would require you to majorly change the design of the GPU. In essence, you would be spending money to remove features and flexibility.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
When you have a system solely relying on its GPU, there very well could be.

That doesn't even begin to be an argument. How is the PS4 a system that is "solely relying on its GPU"? And does "very well could be" mean that you have no idea why the PS4 would reserve GPU time for the OS, but you assume it nevertheless? On the XBO we have a very good idea why that is the case. On the PS4 there is no OS feature that would need dedicated GPU time while the game is running.

A question, though: people say the 14+4 CU rumour was debunked, but how so? From the early reports, VGLeaks have been spot on. Just wondering.

Because the VGLeaks document never said that there is a fixed assignment of "14+4".
 

JonnyLH

Banned
Here we go again... TheKayle 2.0.
Encoder/decoder is one of the functions, not the only one.

I'm going with what I know and read from Mark Cerny here. He specifically mentions the decoding/encoding of formats and refers to the actual processing inside the GPU.
 

KidBeta

Junior Member
I'm going with what I know and read from Mark Cerny here. He specifically mentions the decoding/encoding of formats and refers to the actual processing inside the GPU.

He talks about audio raycasting on the GPU IIRC, something that SHAPE doesn't do.
 

onQ123

Member
In the GPU it's a non-issue. In the CPU it's a completely different story: latency-dependent tasks, like an audio buffer, come to suffer.

I'm not trying to degrade the PS4 in any way; I'm just stating that the extra beef in the GPU was a smart move and will be used for tasks the CPU will come to struggle with.

Where was all this concern about using GDDR for the GPU & CPU when the Xbox 360 was being released?

All I remember hearing about was how it was so much better to have just one unified pool of memory.
 

Oppo

Member
JonnyLH said:
It's a common belief that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?
It's not the GPU. It's a dedicated part, an encoder/decoder chip. We know this because it happens in every game, and the devs have already said there's zero overhead for the function from their side. That means no GPU time, although it might live in there physically.
 

JonnyLH

Banned
That doesn't even begin to be an argument. How is the PS4 a system that is "solely relying on its GPU"? And does "very well could be" mean that you have no idea why the PS4 would reserve GPU time for the OS, but you assume it nevertheless? On the XBO we have a very good idea why that is the case. On the PS4 there is no OS feature that would need dedicated GPU time while the game is running.



Because the VGLeaks document never said that there is a fixed assignment of "14+4".
http://www.vgleaks.com/world-exclusive-orbis-unveiled-2/

More information inside that article. The PS4's 4 'hardware balanced' CUs are extra ALUs. These are primarily for mathematical and logical operations. This is very smart, actually; these will be quite different from the other CUs, as they're balanced for this operation. These usually process any logic and maths and place the result in RAM for the CPU to pick up. They've put these in the GPU because they're calling RAM pretty much constantly. Like I said, the GPU gets lower latency from the RAM (depending on the unified bus) and will handle these operations much better. If these were in the CPU, you'd be throttling your whole system. Like the article says, because these are balanced in this way, they would only provide a minor benefit in terms of actual rendering.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
He talks about audio raycasting on the GPU IIRC, something that SHAPE doesn't do.

Yes, the SHAPE hardware operates on audio streams and basically accelerates certain functions of the XAudio2 API [2]. Raycasting, on the other hand, is a rather generic technique that is computed on the game world's geometry, not on audio data. The results can be, among other things, used to parameterize transformations of audio streams (e.g. employ raycasting to determine the travel of sound waves from a source to a receiver in the game world, and then use the results as a parameter to transform audio accordingly).

http://en.wikipedia.org/wiki/XAudio2
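
The separation described above can be illustrated with a minimal sketch (all names here are illustrative, not any real engine's API): cast a ray from the sound source to the listener through the level geometry, count the occluding walls, and use only the resulting gain parameter on the audio side.

```python
# Minimal sketch of raycast-driven audio occlusion, assuming simple 2D
# wall segments. All names are illustrative, not any real engine's API.

def _ccw(a, b, c):
    """True if points a, b, c are in counter-clockwise order."""
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, p3, p4):
    """True if segment p1-p2 crosses segment p3-p4 (2D, general position)."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)
            and _ccw(p1, p2, p3) != _ccw(p1, p2, p4))

def occlusion_gain(source, listener, walls, loss_per_wall=0.5):
    """Cast a ray from source to listener through 2D wall segments and
    return a gain multiplier: each occluding wall halves the volume."""
    hits = sum(1 for a, b in walls if segments_intersect(source, listener, a, b))
    return loss_per_wall ** hits

# One wall between source and listener -> gain 0.5; open air -> 1.0.
print(occlusion_gain((0, 0), (10, 0), [((5, -1), (5, 1))]))  # 0.5
print(occlusion_gain((0, 0), (10, 0), [])) # 1.0
```

Note that the raycast itself only touches game-world geometry; the audio hardware would then apply the computed gain to the stream, which is exactly the division of labour described in the post above.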
 

KidBeta

Junior Member
http://www.vgleaks.com/world-exclusive-orbis-unveiled-2/

More information inside that article. The PS4's 4 'hardware balanced' CUs are extra ALUs. These are primarily for mathematical and logical operations. This is very smart, actually; these will be quite different from the other CUs, as they're balanced for this operation. These usually process any logic and maths and place the result in RAM for the CPU to pick up. They've put these in the GPU because they're calling RAM pretty much constantly. Like I said, the GPU gets lower latency from the RAM (depending on the unified bus) and will handle these operations much better. If these were in the CPU, you'd be throttling your whole system. Like the article says, because these are balanced in this way, they would only provide a minor benefit in terms of actual rendering.

If you had read and responded to any of my posts, you'd know this is 100% horseshit. What you typed doesn't even make sense: extra ALUs can be used for any ALU tasks, which would include graphics.
 