
PS4's memory subsystem has separate buses for CPU (20 GB/s) and GPU (176 GB/s)

i-Lo

Member
Because the quoted post is a backwards way of looking at it. There shouldn't be any subtracting of numbers from the 176 GB/s. There's only one way in and out of the APU, and that's 176 GB/s. How that bandwidth is divided up internally is academic. The Xbone has the same structure: one unified path in and out of the APU. How it's divided up on there between CPU, GPU, ESRAM blah blah is academic. There shouldn't be any adding or subtracting of numbers.

You should only add if your internal buses can read/write from each other while you're pulling down that full 176 GB/s through your unified memory interface simultaneously. Otherwise you're playing number games.

Technically, if the CPU and GPU both had to access data concurrently at their hard limits via their dedicated buses, wouldn't the assignment of bandwidth be 20GB/s and 156GB/s respectively?
 
How he put it really isn't anywhere near as important as the key point he made, IMO, which was to make clear that this 20GB/s CPU bus isn't in addition to the 176GB/s bandwidth. As I said, despite not being technically correct in every way, that key point was correct and worth pointing out, because some people were thinking 196GB/s.
Those people were joking.


Technically, if the CPU and GPU both had to access data concurrently at their hard limits via their dedicated buses, wouldn't the assignment of bandwidth be 20GB/s and 156GB/s respectively?

Yes, because that's still 176 total to the APU. The problem is that 156 isn't the GPU's hard limit. It can go the whole 176.
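To put numbers on that, here's a minimal sketch of the split, assuming a single 176 GB/s pool with the CPU's dedicated bus hard-capped at 20 GB/s (the function name and structure are illustrative, not anything from Sony's documentation):

```python
# Toy model of the PS4's unified memory bus as described above.
# All numbers are theoretical peaks; real contention behaves differently.

TOTAL_BW = 176.0     # GB/s, total bandwidth in and out of the APU
CPU_BUS_CAP = 20.0   # GB/s, hard limit of the CPU's dedicated bus

def split_bandwidth(cpu_demand_gbs):
    """Return (cpu_granted, gpu_available) in GB/s.

    The CPU can never draw more than its 20 GB/s bus allows, and the
    GPU gets whatever remains of the single 176 GB/s pool, up to the
    full 176 GB/s if the CPU is idle. Nothing ever sums past 176.
    """
    cpu_granted = min(cpu_demand_gbs, CPU_BUS_CAP)
    gpu_available = TOTAL_BW - cpu_granted
    return cpu_granted, gpu_available

print(split_bandwidth(20.0))   # (20.0, 156.0): CPU bus saturated
print(split_bandwidth(10.0))   # (10.0, 166.0): lighter CPU load
print(split_bandwidth(0.0))    # (0.0, 176.0): GPU can use it all
```

There's no input to that function that yields 196 GB/s, which is the whole point being argued here.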
 
I'm a game developer. I have no idea what these numbers mean.
I'm a game developer, and I do know what this all means. Doesn't matter, though, as it specifically applies to this hardware.

Also, unless you're developing the engine for your title(s) you may never need to know what these numbers mean, but you can ask whoever is handling your rendering pipeline :)
 

FuturusX

Member
One of my friends who was working for a big publisher (I won't say which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.

Don't be coy. It's family....

Let me guess: your cousin? Or is it your uncle... perhaps your dad? No wait... it's your brother.
 

MORT1S

Member
The problem is that maybe the Killzone slides were misinterpreted. The Killzone presentation was based on early code running on alpha kits and not on final PS4 hardware. Thus, it's possible that on the alpha dev kits only 6 cores were available because 2 of them were emulating the dedicated audio/video hw of the PS4.

http://www.edge-online.com/features/killzone-shadow-fall-and-the-power-of-playstation-4/

Regardless, the article this thread is focused on also states the 2-core OS reservation. Given it's in relation to their talks with Ubi, would you not assume their claim is accurate?

Having 2 cores reserved isn't that big of a deal. Core reservation has never seemed to matter till this point. I believe everyone is inflating these numbers in their minds, making them far more significant than they are.

So what if Windows doesn't reserve that much... Or FBSD doesn't reserve that much in ideal cases.

I have a feeling that you will see similar reserves for both systems... had none of this come to light, you would never know.
 
[image: laughvlrx1.gif]

Who is this?
 

i-Lo

Member
One of my friends who was working for a big publisher (I won't say which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.

The system doesn't have more than one OS, and even that isn't designed to be intensely multitasking-oriented.

I'm calling bullshit on this.
 

Krakatoa

Member
The system doesn't have more than one OS, and even that isn't designed to be intensely multitasking-oriented.

I'm calling bullshit on this.

It could be true; it's better to assign more memory at the start for the OS rather than later, as that would affect game performance.

Also it gives Sony the ability to upgrade the OS features over time, and try to match MS in the entertainment area.
 

i-Lo

Member
It could be true; it's better to assign more memory at the start for the OS rather than later, as that would affect game performance.

Also it gives Sony the ability to upgrade the OS features over time, and try to match MS in the entertainment area.

While not out of the realm of possibility, it is highly improbable. Their base OS is different from MS's. Plus, multimedia functionality courtesy of Kinect, which has its own resource requirements, will never be a part of the PS4.

I am absolutely confident that regardless of the refinement to services (as Sony has done with the PS3 while concurrently shaving the RAM footprint from 120MB to ~50MB at present), they will not block off 3GB of the extra 4GB.
 

Ryoku

Member
In this instance, Sony is being completely open with their information, which clearly indicates an upgrade of performance, and posters are literally reading the information and purposefully lying that the system is being downgraded.

No, it's not "clearly" an upgrade of performance. It remains where it was a couple of months prior to E3, aside from the 4GB of additional RAM. The CPU just has a "hard limit" on the bandwidth it can access from memory. The GPU, on the other hand, has access to whatever bandwidth is left over.

Let's say the CPU is utilizing the full 20GB/s of bandwidth. The GPU has access to 156GB/s of bandwidth. Of course, this probably isn't realistic. These are theoretical maximums. Real-world performance may be lower.

And if the devs choose to, they can create a game that is less reliant on CPU tasks, thereby utilizing a fraction of the 20GB/s allotted to the CPU. Let's say 10GB/s. In this case, the GPU has access to a theoretical maximum of 166GB/s.

This isn't really an improvement in performance. And as said before, this is why GDDR5 is overkill for CPU. CPUs don't need crazy bandwidth. But the unified pool makes development easier, so there's that.
 

KoopaTheCasual

Junior Member
No, it's not "clearly" an upgrade of performance. It remains where it was a couple of months prior to E3, aside from the 4GB of additional RAM. The CPU just has a "hard limit" on the bandwidth it can access from memory. The GPU, on the other hand, has access to whatever bandwidth is left over.

Let's say the CPU is utilizing the full 20GB/s of bandwidth. The GPU has access to 156GB/s of bandwidth. Of course, this probably isn't realistic. These are theoretical maximums. Real-world performance may be lower.

And if the devs choose to, they can create a game that is less reliant on CPU tasks, thereby utilizing a fraction of the 20GB/s allotted to the CPU. Let's say 10GB/s. In this case, the GPU has access to a theoretical maximum of 166GB/s.

This isn't really an improvement in performance. And as said before, this is why GDDR5 is overkill for CPU. CPUs don't need crazy bandwidth. But the unified pool makes development easier, so there's that.

Sorry, I was loose with my wording. I realize this info has been public since April and isn't a simple 1+1=2 type of improvement. I was simply saying there's literally no way this information can be taken as a downgrade without some seriously flawed logic and ulterior motives.

If this type of info was public for the XB1 and people were trying to flip numbers into a downgrade, I would expect them to be quickly banned as well. I don't think the mod team on this forum is as biased as some would like to believe.
 

Famassu

Member
It could be true; it's better to assign more memory at the start for the OS rather than later, as that would affect game performance.
Sony wouldn't add 4GB more RAM just to futureproof their OS by giving 2 of those 4GB to the OS. That's just silly talk. It makes absolutely no sense for a single-OS system like the PS4 and would be a completely overboard decision. In no scenario would their OS require 2GB more RAM.
 
Regardless, the article this thread is focused on also states the 2-core OS reservation. Given it's in relation to their talks with Ubi, would you not assume their claim is accurate?

Even if it's stated in the article, it's still all speculation at this point. Furthermore, they clearly said the Ubi guys could not give them specific details about the PS4 due to NDA.

Having 2 cores reserved isn't that big of a deal. Core reservation has never seemed to matter till this point. I believe everyone is inflating these numbers in their minds, making them far more significant than they are.

So what if Windows doesn't reserve that much... Or FBSD doesn't reserve that much in ideal cases.

I have a feeling that you will see similar reserves for both systems... had none of this come to light, you would never know.

The PS4 has dedicated hw to handle the most intensive OS tasks like audio chat, DVR functionality, and so on. On top of that, it has a whole ARM CPU handling other OS tasks like background downloading and security. So, what would be left for the 2 reserved cores to handle? Not much, I think. That's why it does seem an incredible waste of resources dedicating 2 cores to something that would use just a fraction of their power. And that's why I think this rumor may be wrong.
 

kuroshiki

Member
It could be true; it's better to assign more memory at the start for the OS rather than later, as that would affect game performance.

Also it gives Sony the ability to upgrade the OS features over time, and try to match MS in the entertainment area.

If previous Sony OSes are any indication (PSP, PS3, etc.), the OS footprint usually decreases, not increases. 1GB just for the OS is unprecedented for a Sony console. 3GB? No way in hell.
 

MORT1S

Member
The PS4 has dedicated hw to handle the most intensive OS tasks like audio chat, DVR functionality, and so on. On top of that, it has a whole ARM CPU handling other OS tasks like background downloading and security. So, what would be left for the 2 reserved cores to handle? Not much, I think. That's why it does seem an incredible waste of resources dedicating 2 cores to something that would use just a fraction of their power. And that's why I think this rumor may be wrong.

You can have as much dedicated, fixed-function hardware as you want to handle all of those tasks... The OS still needs to drive it, no?

The dedicated hardware isn't going to "know" what to do otherwise. I would imagine it's no different from how the drivers are handled for the CPU and GPU.
 

nib95

Banned
One of my friends who was working for a big publisher (I won't tell which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.

Considering most devs were working on the presumption the final retail kit would have only 4GB of GDDR5, I highly doubt this lol. My guess is 1GB, 1.5GB max. But so far all insider knowledge points to 1GB, and that's already a bump up from the original reserved amount of 500MB.
 

spwolf

Member
You can have as much dedicated, fixed-function hardware as you want to handle all of those tasks... The OS still needs to drive it, no?

The dedicated hardware isn't going to "know" what to do otherwise. I would imagine it's no different from how the drivers are handled for the CPU and GPU.

I'm not sure what your point is, exactly. Dedicated hardware is not important because the OS needs drivers?
 

MORT1S

Member
I'm not sure what your point is, exactly. Dedicated hardware is not important because the OS needs drivers?

No, the fact that the dedicated hardware is used as reasoning for there being little to no OS overhead.

I would imagine it would take a modest amount of CPU resources for the OS to manage tasks between all of that hardware. NEVER did I say dedicated hardware is unimportant.
 
You can have as much dedicated, fixed-function hardware as you want to handle all of those tasks... The OS still needs to drive it, no?

The dedicated hardware isn't going to "know" what to do otherwise. I would imagine it's no different from how the drivers are handled for the CPU and GPU.

And you do not need to reserve entire cores for that. The Xbox 360 reserves 5% of the power of cores 2 and 3 for OS tasks; the rest of the CPU power is available for games. Thus, I don't see why the PS4 would do any different.
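For scale, a quick back-of-the-envelope comparison of the two reservation schemes; the 360 figure is the one stated above, and the PS4 figure is the rumored, unconfirmed one:

```python
# Rough share of total CPU capacity reserved for the OS in each scheme.

# Xbox 360: 3 cores, with 5% of two of them reserved.
xbox360_reserve = (0.05 * 2) / 3    # about 0.033, i.e. ~3.3% of the CPU

# PS4 (rumored, unconfirmed): 8 cores, 2 reserved outright.
ps4_rumored_reserve = 2 / 8         # 0.25, i.e. 25% of the CPU

print(f"360 OS reserve:      {xbox360_reserve:.1%}")      # 3.3%
print(f"PS4 rumored reserve: {ps4_rumored_reserve:.1%}")  # 25.0%
```

If the two-core figure is right, the gap between the schemes is nearly an order of magnitude, which is exactly why it's being argued over.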
 

MORT1S

Member
And you do not need to reserve entire cores for that. The Xbox 360 reserves 5% of the power of cores 2 and 3 for OS tasks; the rest of the CPU power is available for games. Thus, I don't see why the PS4 would do any different.

The tasks the 360 CPU handles are not even in the same ballpark as the tasks being handled here. Comparing the two would be selling the PS4 short.

The fact is, there is evidence of at least ONE FULL CORE being reserved, and evidence for TWO cores being reserved. I don't blame anyone for believing the lesser... it sounds better.
 

Truespeed

Member
This was all disclosed by Gamasutra after Mark Cerny did his initial presentation way back in April.

Enabling the Vision: How Sony Modified the Hardware

The three "major modifications" Sony made to the architecture to support this vision are as follows, in Cerny's words:

"First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms; it’s larger than the PCIe on most PCs!

"Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines; we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time; in other words, it radically reduces the overhead of running compute and graphics together on the GPU."

Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands: the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."
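To make the 'volatile' bit idea concrete, here's a toy software model of selective invalidation and write-back. Purely illustrative: the class names and structure are invented for this sketch, and real GCN L2 hardware obviously doesn't work like line-by-line Python.

```python
# Toy model of a cache whose lines carry a 'volatile' tag, so compute
# work can invalidate/write back only its own lines while graphics
# lines stay resident. Illustrative only; not how the silicon works.

class CacheLine:
    def __init__(self, addr, data, volatile=False):
        self.addr = addr
        self.data = data
        self.volatile = volatile   # set for accesses marked as compute
        self.dirty = False

class L2Cache:
    def __init__(self):
        self.lines = {}

    def write(self, addr, data, volatile=False):
        line = self.lines.setdefault(addr, CacheLine(addr, data, volatile))
        line.data = data
        line.dirty = True

    def invalidate_volatile(self):
        """Compute is about to read fresh data from system memory:
        drop only the volatile lines, leaving graphics lines intact."""
        self.lines = {a: l for a, l in self.lines.items() if not l.volatile}

    def writeback_volatile(self, memory):
        """Compute finished: flush only dirty volatile lines to memory."""
        for line in self.lines.values():
            if line.volatile and line.dirty:
                memory[line.addr] = line.data
                line.dirty = False

# Usage sketch: graphics and compute share the cache without compute
# forcing a full flush of the graphics working set.
mem = {}
l2 = L2Cache()
l2.write(0x100, "graphics texel")                 # graphics line
l2.write(0x200, "compute result", volatile=True)  # compute line
l2.writeback_volatile(mem)   # only 0x200 is written to memory
l2.invalidate_volatile()     # 0x100 stays cached
```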
 

nib95

Banned
This was all disclosed by Gamasutra after Mark Cerny did his initial presentation way back in April.

Sounds great. It seems as though they really thought the customisations through. Can't wait to see second and third gen games for this console!
 
The tasks the 360 CPU handles are not even in the same ballpark as the tasks being handled here. Comparing the two would be selling the PS4 short.

That's why you have all this dedicated hw in the PS4 that handles OS tasks.

The fact is, there is evidence of at least ONE FULL CORE being reserved, and evidence for TWO cores being reserved. I don't blame anyone for believing the lesser... it sounds better.

Please provide this evidence. There has been no official confirmation, just speculation.
 

spwolf

Member
No, the fact that the dedicated hardware is used as reasoning for there being little to no OS overhead.

I would imagine it would take a modest amount of CPU resources for the OS to manage tasks between all of that hardware. NEVER did I say dedicated hardware is unimportant.

lol, what? Why would it take any resources from the OS... what are you talking about? That's the funniest thing I've read in a while.
 

MORT1S

Member
Please provide this evidence. There has been no official confirmation, just speculation.
Would the article that this thread is about not be evidence?

Is Eurogamer not a reliable source? Serious question.


lol, what? Why would it take any resources from the OS... what are you talking about? That's the funniest thing I've read in a while.
Ummm... I don't recall saying anything about OS resources. Perhaps you meant CPU?

Regardless, I concede... I don't know what I am talking about. Eurogamer is probably wrong about the OS CPU reserve.
 

Durante

Member
What the PS4 has is a memory controller connected to the main RAM (at 176 GB/s), which then has buses to both the CPU and GPU. The bandwidth of the bus from the controller to the GPU is up to the full speed of the memory (176 GB/s); the other bus, from the controller to the CPU, is limited to 20 GB/s.

It even says so in that diagram!
Pretty much, I don't think there's any new information here.
 
If I'm not wrong, doesn't Eurogamer's 2-core OS figure come from the Guerrilla Games PDF?

I don't think that was confirmation of anything tbh.
 

satam55

Banned
This is why you don't see GDDR5 used by PC CPUs; both Intel's and AMD's CPUs can't use that bandwidth, so it's overkill.

So... the PS4's CPU has the bandwidth of DDR3 and the GPU has the bandwidth of GDDR5... just like a PC setup.

Not really. This is still faster than PCIe, and starting next year PCs will be sold with APUs.


AMD has already been selling APUs for a couple of years. But the main reason I'm replying is to say that the next generation of desktop AMD APU (coming out later this year, called "Kaveri") will support GDDR5 RAM.
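For what it's worth, the headline bandwidth figures fall out of a simple formula (effective transfer rate times bus width); the 5500 MT/s and 256-bit numbers below are the commonly reported PS4 specs, so treat them as assumptions:

```python
# Peak memory bandwidth = effective transfer rate * bus width in bytes.

def peak_bandwidth_gbs(mega_transfers_per_s, bus_bits):
    """Theoretical peak bandwidth in GB/s."""
    return mega_transfers_per_s * (bus_bits / 8) / 1000

# PS4 GDDR5: 5500 MT/s effective on a 256-bit bus.
print(peak_bandwidth_gbs(5500, 256))   # 176.0 GB/s

# Typical PC DDR3-1600 in dual channel (2 x 64-bit).
print(peak_bandwidth_gbs(1600, 128))   # 25.6 GB/s
```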
 

spwolf

Member
Ummm... I don't recall saying anything about OS resources. Perhaps you meant CPU?

Regardless, I concede... I don't know what I am talking about. Eurogamer is probably wrong about the OS CPU reserve.

Dedicated hardware takes load off the CPU, not adds to it... Whether Eurogamer is right or wrong, I have no idea; it's based on Killzone slides that were done a long time ago.

But "tasking" is not the issue, dedicated hardware is only positive and nothing negative... it helps every step of the process actually, both games and OS usage.
 

TheCloser

Banned
If I'm not wrong, doesn't Eurogamer's 2-core OS figure come from the Guerrilla Games PDF?

I don't think that was confirmation of anything tbh.

Yep, the slide showed that they were using 6 cores, not that they were limited to 6 cores.
 

MORT1S

Member
Dedicated hardware takes load off the CPU, not adds to it... Whether Eurogamer is right or wrong, I have no idea; it's based on Killzone slides that were done a long time ago.

But "tasking" is not the issue; dedicated hardware is purely a positive, nothing negative... it actually helps every step of the process, for both games and OS usage.
Right, of course the dedicated hardware is a good thing. I was only trying to grasp why 2 cores would be reserved. Those were my assumptions.

Sorry
 

Lord Error

Insane For Sony
Regardless, the article this thread is focused on also states the 2-core OS reservation. Given it's in relation to their talks with Ubi, would you not assume their claim is accurate?
The two-core reservation was again an assumption made by the author of the article, probably based on that same KZ presentation. No one from Ubi said that (nor could they, due to NDA).
 