I think we need to call Bish in.
I am winding you guys up, it's easily done.
This is where I say you done fucked up, my dear junior.
It was nice to meet you, bonus_sco. See you in another life, brother.
There's more to it than that.
I really can't say any more than that.
PS4 has secret sauce, lol.
Are wind-ups taken that seriously around here? It didn't take much to lead people on...
Just a question before you are perma-banned: why would you go through the whole torturous registration process and then post stuff like this?
Surely you have lurked on this site for ages before getting the all clear. Please explain why you would do that.
Just look at his username: bonus_sco.
SCO were masters of trolling (remember the SCO vs. Linux litigation?); he probably worked there, and trolling is part of his DNA now.
Dude. There is no such thing as "14 is all you need". No such thing. Devs will use whatever they can to bring the most to their games. It is never a case of "good enough", but a case of "how far can we go".
There is no single limitation other than the maximum number of CUs. There is no 14+4. It's just 18, and they can be used however a dev wishes.
It's a common conception that some CUs will be reserved for OS tasks. For example, what do you think decodes your screen and encodes it into a 15-minute video for you to share?
Mark Cerny said: "For example, by having the hardware dedicated unit for audio, that means we can support audio chat without the games needing to dedicate any significant resources to them. The same thing for compression and decompression of video."
I hate to bring this topic up again, but CPUs are hindered by GDDR memory. Tasks like audio processing, which would normally be handled by the CPU, will get offloaded onto the GPU. Same goes for physics. GDDR latency isn't as apparent on the GPU.
In conclusion, the number of CUs actually 'dedicated' to games, so to speak, may very well decrease.
It's known the PS4's audio chip is just an encoder/decoder and provides only basic support, in next-gen terms, for actual voice processing. There's still the huge feat of actually processing those voices: effects, post-processing, actual rendering, etc. Regarding video, I was just using it as an example; I've personally not seen that article before. It would be stupid not to have it offloaded.

My guess would be the dedicated encoding/decoding hardware.
http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=3
Pretending to have insider information, and then attempting to discredit what people are saying based on information you made up, is taken seriously, I believe.
Every modern AMD GPU includes fixed-function hardware to encode and decode H.264 video on the fly without using any of the CUs.
Also, the GDDR5 latency rumour has been debunked multiple times; the latency is close enough to, if not the same as, DDR3's.
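The cycles-versus-nanoseconds point is easy to illustrate with back-of-the-envelope numbers. The CAS latencies and clocks below are illustrative assumptions (common desktop DDR3 timings and a plausible GDDR5 command clock), not measured console figures:

```python
# Rough absolute-latency comparison: CAS latency in cycles divided by the
# clock in MHz gives microseconds; multiply by 1000 for nanoseconds.
# All figures below are illustrative assumptions, not measured PS4 numbers.

def cas_latency_ns(cas_cycles: float, clock_mhz: float) -> float:
    """Absolute CAS latency in nanoseconds for a given clock."""
    return cas_cycles / clock_mhz * 1000.0

# DDR3-1600: 800 MHz I/O clock, CL11 (common desktop timing).
ddr3 = cas_latency_ns(11, 800)
# GDDR5 at 5.5 GT/s: ~1375 MHz command clock, CL ~15 (assumed).
gddr5 = cas_latency_ns(15, 1375)

print(f"DDR3  ~{ddr3:.1f} ns")   # ~13.8 ns
print(f"GDDR5 ~{gddr5:.1f} ns")  # ~10.9 ns: higher cycle counts are
                                 # offset by the higher clock
```

The point of the sketch: GDDR5's latency looks worse when quoted in cycles, but in absolute nanoseconds it lands in the same ballpark as DDR3.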
That provides me nothing. Also, it's known the PS4's audio chip is just an encoder/decoder; the huge feat of actually processing those voices (effects, post-processing, actual rendering, etc.) remains.
Without going into a new argument about all of that, what does any of that have to do with your original statement that "Its a common conception that some CU's will be reserved for OS tasks"?
Nice red herring. (http://en.wikipedia.org/wiki/Red_herring)
Since when?
When you have a system solely relying on its GPU, there very well could be.
Question, though: people say the 14+4 CU rumour was debunked, but how so? From the early reports, VGLeaks have been spot on. Just wondering.
Here we go again... TheKayle 2.0.
Encoder/decoder is one of the functions, not the only one.
I'm going with what I know and read from Mark Cerny here. He specifically mentions the decoding/encoding of formats and refers to the actual processing inside the GPU.
http://www.gamechup.com/mark-cerny-ps4-contains-a-dedicated-audio-processing-chip/
To be honest, it's very misleading how they say "Audio Processor" and then go on to say, yeah, it just decodes/encodes over 200 streams. It doesn't actually process them.
The principal thing that it does is that it compresses and decompresses audio streams, various formats.
In the GPU it's a non-issue. In the CPU it's a completely different story: it does have massive effects where latency-dependent tasks, like an audio buffer, come to suffer.
I'm not trying to degrade the PS4 in any way; I'm just stating that the extra beef in the GPU was a smart move and will be used for tasks the CPU will come to struggle with.
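To put numbers on why an audio buffer is latency-dependent, here is a quick sketch; the 256-sample buffer and 48 kHz rate are typical values assumed for illustration, not PS4 specifics:

```python
# Why audio is latency-sensitive: the mixer must refill each buffer
# before the DAC drains it, or the listener hears a glitch.
# 256 samples at 48 kHz are typical values, assumed for illustration.

def buffer_deadline_ms(samples: int, sample_rate_hz: int) -> float:
    """Time until an audio buffer of `samples` frames runs dry."""
    return samples / sample_rate_hz * 1000.0

deadline = buffer_deadline_ms(256, 48_000)
print(f"~{deadline:.2f} ms per buffer")  # ~5.33 ms hard deadline

# A GPU hides memory latency by switching between many wavefronts in
# flight; a CPU audio thread stalled on memory just burns its ~5 ms budget.
```

The contrast is the point being argued above: throughput hardware can hide memory stalls behind other work, while a deadline-bound CPU thread cannot.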
It's not the GPU. It's a dedicated part, an encoder/decoder chip. We know this because it happens in every game, and the devs have already said there's zero overhead for the function from their side. That means no GPU time, although it might live in there physically.
http://www.vgleaks.com/world-exclusive-orbis-unveiled-2/

That doesn't even begin to be an argument. How is the PS4 a system that is "sorely relying on its GPU"? And does "very well could be" mean that you have no idea why the PS4 would reserve GPU time for the OS, but you assume it nevertheless? On the XBO we have a very good idea why that is the case. On the PS4 there is no OS feature that would need dedicated GPU time while a game is running.
Because the VGLeaks document never said that there is a fixed assignment of "14+4".
He talks about audio raycasting on the GPU IIRC, something that SHAPE doesn't do.
http://www.vgleaks.com/world-exclusive-orbis-unveiled-2/
More information inside that article. The PS4's 4 'hardware-balanced' CUs are extra ALUs, primarily for mathematical and logical operations. This is very smart, actually; they will be quite different from the other CUs, as they're balanced for this kind of work. They typically process logic and maths and place the result in RAM for the CPU to pick up. They've been put in the GPU because they're calling on RAM pretty much constantly. As said, the GPU gets lower latency from the RAM (depending on the unified bus) and will handle these operations much better; if these were in the CPU, you'd be throttling your whole system. As the article says, because these are balanced in this way, they would only provide a minor benefit in terms of actual rendering.
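The flow described above (compute units do the math, deposit the result in shared RAM, and the CPU picks it up) can be sketched in miniature. This toy version uses a Python thread as a stand-in for the compute unit and a dict as the shared buffer; it is purely an analogy, not console or GPU API code:

```python
import threading

# Toy model of the produce-into-shared-RAM flow described above:
# a "compute unit" thread does the offloaded math and writes results
# into shared memory; the "CPU" side waits on a completion signal,
# then reads them. Purely an analogy; no real GPU is involved.

shared_ram = {}               # stand-in for a results buffer in unified memory
done = threading.Event()      # stand-in for a completion fence

def compute_job(values):
    shared_ram["squares"] = [v * v for v in values]  # the offloaded math
    done.set()                                       # signal completion

gpu = threading.Thread(target=compute_job, args=([1, 2, 3, 4],))
gpu.start()

done.wait()                       # "CPU" waits on the fence...
print(shared_ram["squares"])      # ...then picks up the result: [1, 4, 9, 16]
gpu.join()
```

The fence is the important part of the pattern: the consumer never reads the shared buffer until the producer signals that the results are complete.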