Because the quoted post is a backwards way of looking at it. There shouldn't be any subtracting of numbers from the 176 GB/s. There's only one way in and out of the APU, and that's 176 GB/s. How that bandwidth is divided up internally is academic. The Xbone has the same structure -- one unified path in and out of the APU. How it's divided up on there between CPU, GPU, ESRAM blah blah is academic. There shouldn't be any adding or subtracting of numbers.
You should only add if your internal buses can read/write from each other while you're simultaneously pulling down the full 176 GB/s through your unified memory interface. Otherwise you're playing number games.
Those people were joking.

How he put it really isn't anywhere near as important as the key point he made, IMO: making it clear that this 20GB/s CPU bus isn't in addition to the 176GB/s bandwidth. As I said, despite not being technically correct in every way, that key point was correct and worth pointing out, because some people were thinking 196GB/s.
Technically, if the CPU and GPU both had to concurrently access data at their hard limits via their dedicated buses, wouldn't the assignment of bandwidth be 20GB/s and 156GB/s respectively?
I'm a game developer, and I do know what this all means. Doesn't matter, though, as it specifically applies to this hardware.

I'm a game developer. I have no idea what these numbers mean.
Those people were joking.
One of my friends who was working for a big publisher (I won't tell which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.
Yes, because that's still 176 total to the APU. The problem is that 156 isn't the GPU's hard limit. It can go the whole 176.
The problem is that maybe the Killzone slides were misinterpreted. The Killzone presentation was based on early code running on alpha kits and not on final PS4 hardware. Thus, it's possible that on the alpha dev kits only 6 cores were available because 2 of them were emulating the dedicated audio/video hardware of the PS4.
http://www.edge-online.com/features/killzone-shadow-fall-and-the-power-of-playstation-4/
One of my friends who was working for a big publisher (I won't tell which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.
It looks as though the PS4 will have specific capabilities that go beyond simply having more graphics cores or a faster CPU and that could give it the advantage, this time around.
The system doesn't have more than one OS, and even that isn't designed to be intensely multitasking-oriented.
I'm calling bullshit on this.
Have you heard any figures?

Nah.
Who is this?
It could be true; it's better to assign more memory for the OS at the start rather than later, as that would affect game performance.
Also, it gives Sony the ability to upgrade the OS features over time and try to match MS in the entertainment area.
In this instance, Sony is being completely open with their information, which clearly indicates an upgrade of performance, and posters are literally reading the information, and purposefully lying that the system is being downgraded.
No, it's not "clearly" an upgrade of performance. It remains where it was a couple of months prior to E3, aside from the 4GB of additional RAM. The CPU just has a "hard limit" to the bandwidth it can access from memory. The GPU, on the other hand, has access to whatever bandwidth is left over.
Let's say the CPU is utilizing the full 20GB/s of bandwidth. The GPU has access to 156GB/s of bandwidth. Of course, this probably isn't realistic. These are theoretical maximums. Real-world performance will probably be lower.
And if the devs choose to, they can create a game that is less reliant on CPU tasks, thereby utilizing a fraction of the 20GB/s allotted to the CPU. Let's say 10GB/s. In this case, the GPU has access to a theoretical maximum of 166GB/s.
This isn't really an improvement in performance. And as said before, this is why GDDR5 is overkill for CPU. CPUs don't need crazy bandwidth. But the unified pool makes development easier, so there's that.
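The accounting in the posts above can be sketched in a few lines of Python. This is a toy model only: the 176 GB/s and 20 GB/s figures come straight from the thread, and real-world contention behavior is far messier than a simple subtraction.

```python
# Toy model of the PS4 bandwidth accounting discussed above.
# The figures are the theoretical maximums cited in the thread;
# real-world throughput will be lower.

TOTAL_BW = 176.0    # GB/s, unified GDDR5 interface into the APU
CPU_BUS_CAP = 20.0  # GB/s, hard limit on the CPU's bus

def gpu_available_bw(cpu_usage_gbs: float) -> float:
    """Bandwidth left for the GPU given the CPU's current usage.

    The CPU can never consume more than its 20 GB/s bus cap; the GPU
    gets whatever remains of the single 176 GB/s interface.
    """
    cpu_usage = min(cpu_usage_gbs, CPU_BUS_CAP)
    return TOTAL_BW - cpu_usage

print(gpu_available_bw(20.0))  # CPU saturated -> 156.0
print(gpu_available_bw(10.0))  # lighter CPU load -> 166.0
print(gpu_available_bw(0.0))   # GPU free to use the full 176.0
```

Note that the GPU's share is a leftover, not a fixed allocation -- which is exactly why "156 GB/s" is not a hard GPU limit.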
Sony wouldn't add 4GB more RAM just to futureproof their OS by giving 2 of those 4GB to the OS. That's just silly talk. It makes absolutely no sense for a single-OS system like the PS4 and would be a completely overboard decision. In no scenario would their OS require 2GB more RAM.

It could be true; it's better to assign more memory for the OS at the start rather than later, as that would affect game performance.
I get shit speeds and I have the exact connection as you (Verizon FiOS I assume).
Regardless, the article this thread is focused on also states the 2-core OS reservation. If it's in relation to their talks with Ubisoft, would you not assume their claim is accurate?
Having 2 cores reserved isn't that big of a deal. Core reservation has never seemed to matter until this point. I believe everyone is inflating these numbers in their minds, making them far more significant than they are.
So what if Windows doesn't reserve that much... or FreeBSD doesn't reserve that much in ideal cases.
I have a feeling that you will see similar reserves for both systems... had none of this come to light, you would never know.
One of my friends who was working for a big publisher (I won't tell which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.
It could be true; it's better to assign more memory for the OS at the start rather than later, as that would affect game performance.
Also, it gives Sony the ability to upgrade the OS features over time and try to match MS in the entertainment area.
The PS4 has dedicated hardware to handle the most intensive OS tasks like audio chat, DVR functionality, and so on. On top of that, it has a whole ARM CPU handling other OS tasks like background downloading and security. So, what would be left for the 2 reserved cores to handle? Not much, I think. That's why it seems an incredible waste of resources to dedicate 2 cores to something that would use just a fraction of their power. And that's why I think this rumor may be wrong.
One of my friends who was working for a big publisher (I won't tell which it is) told me recently that the OS memory footprint for the PS4 is 3GB. Same as Xbox One.
You can have as much dedicated, fixed-function hardware to handle all of the tasks you want... the OS still needs to drive it, no?
The dedicated hardware isn't going to "know" what to do otherwise. I would imagine it to be no different than how the drivers are handled for the CPU and GPU.
I'm not sure what your point is, exactly. Dedicated hardware is not important because the OS needs drivers?
You can have as much dedicated, fixed-function hardware to handle all of the tasks you want... the OS still needs to drive it, no?
The dedicated hardware isn't going to "know" what to do otherwise. I would imagine it to be no different than how the drivers are handled for the CPU and GPU.
And you do not need to reserve entire cores for that. The Xbox 360 reserves 5% of the power of cores 2 and 3 for OS tasks; the rest of the CPU power is available for games. Thus, I don't see why the PS4 would do any different.
Enabling the Vision: How Sony Modified the Hardware
The three "major modifications" Sony did to the architecture to support this vision are as follows, in Cerny's words:
"First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today's terms -- it's larger than the PCIe bandwidth on most PCs!"
"Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines; we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time -- in other words, it radically reduces the overhead of running compute and graphics together on the GPU."
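The "volatile bit" idea Cerny describes can be illustrated with a toy cache model. Everything here (the class names, the dict-backed cache) is invented for illustration; the real hardware operates on cache-line tags, not Python objects.

```python
# Toy model of selective invalidation via a per-line 'volatile' tag,
# as described in the quote above. Purely illustrative.

class CacheLine:
    def __init__(self, addr, data, volatile=False):
        self.addr = addr
        self.data = data
        self.volatile = volatile  # set for asynchronous-compute accesses

class L2Cache:
    def __init__(self):
        self.lines = {}

    def access(self, addr, data, from_compute=False):
        # Compute accesses get the volatile bit; graphics accesses don't.
        self.lines[addr] = CacheLine(addr, data, volatile=from_compute)

    def invalidate_volatile(self):
        # Before compute reads fresh data from system memory, it drops
        # only its own (volatile) lines -- graphics lines survive, so
        # graphics work isn't disturbed.
        self.lines = {a: l for a, l in self.lines.items() if not l.volatile}

cache = L2Cache()
cache.access(0x100, "graphics-data")
cache.access(0x200, "compute-data", from_compute=True)
cache.invalidate_volatile()
print(sorted(hex(a) for a in cache.lines))  # ['0x100'] -- graphics line kept
```

The point of the mechanism is in `invalidate_volatile`: compute can flush its view of memory without nuking the lines graphics is still using.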
Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we've worked with AMD to increase the limit to 64 sources of compute commands -- the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."
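The "64 sources of compute commands" idea can be sketched as many queues feeding one arbiter. The real hardware uses multiple levels of arbitration; the round-robin policy below is just a stand-in for illustration, and all names are invented.

```python
from collections import deque

# Toy sketch: 64 compute-command queues drained by a simple
# round-robin arbiter, standing in for the hardware's multi-level
# arbitration described above.

NUM_QUEUES = 64
queues = [deque() for _ in range(NUM_QUEUES)]

def submit(queue_id, command):
    """A game system drops an asynchronous compute command into a queue."""
    queues[queue_id].append(command)

def drain_round_robin():
    """Run commands round-robin across non-empty queues until all are empty."""
    order = []
    while any(queues):
        for q in queues:
            if q:
                order.append(q.popleft())
    return order

submit(0, "blurA")
submit(0, "blurB")
submit(5, "physics")
print(drain_round_robin())  # ['blurA', 'physics', 'blurB']
```

Note how commands from different queues interleave: work submitted to queue 5 doesn't wait behind everything in queue 0, which is the whole point of having many independent sources.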
This was all disclosed by Gamasutra after Mark Cerny did his initial presentation way back in April.
The tasks the 360 CPU handles are not even in the same ballpark as the tasks being handled here. Comparing the two would be selling the PS4 short.
The fact is, there is evidence of at least ONE FULL CORE being reserved, and evidence of TWO cores being reserved. I don't blame anyone for believing the lesser... it sounds better.
I'm on Telus in Canada, actually.

I get shit speeds and I have the exact connection as you (Verizon FiOS I assume).
No. The fact is that the dedicated hardware is being used as reasoning for there being little to no OS overhead.
I would imagine it would take a modest amount of CPU resources for the OS to manage tasks between all of that hardware. NEVER did I say dedicated hardware is unimportant.
Garlic and onion are good,but not as good as Paula, Agnus and Denise
Would the article that this thread is about not be evidence?

Please provide this evidence. There has been no official confirmation, just speculation.
Ummm... I don't recall saying anything about OS resources. Perhaps you meant CPU?

lol, what? Why would it take any resources from the OS... what are you talking about? That's the funniest thing I have read in a while.
Pretty much, I don't think there's any new information here.

What the PS4 has is a memory controller connected to the main RAM (at 176 GB/s), which then has buses to both the CPU and GPU. The bandwidth of the bus from the controller to the GPU is up to the full speed of the memory (176 GB/s); the other bus, from the controller to the CPU, is limited to 20 GB/s.
It even says so in that diagram!
This is why you don't see GDDR5 used by PC CPUs; both Intel's and AMD's CPUs can't use that bandwidth, so it's overkill.
So... the PS4's CPU has the bandwidth of DDR3 and its GPU has the bandwidth of GDDR5... just like a PC setup.
Not really. This is still faster than PCIe, and starting next year PCs will be sold with APUs.
Ummm... I don't recall saying anything about OS resources. Perhaps you meant CPU?
Regardless, I concede... I don't know what I am talking about. Eurogamer is probably wrong about the OS CPU reserve.
If I'm not wrong doesn't Eurogamer's 2 core OS figure come from the Guerrilla Games PDF?
I don't think that was confirmation of anything tbh.
Who is this?
Right, of course the dedicated hardware is a good thing. I was only trying to grasp why 2 cores would be reserved. Those were my assumptions.

Dedicated hardware takes load off the CPU, not onto the CPU... whether Eurogamer is right or wrong, I have no idea; it is based on Killzone slides that were done a long time ago.
But "tasking" is not the issue, dedicated hardware is only positive and nothing negative... it helps every step of the process actually, both games and OS usage.
The two-core reservation was again an assumption made by the author of the article, probably due to that same KZ presentation we assume this from. No one from Ubi said that (nor could they, due to NDA).

Regardless, the article this thread is focused on also states the 2-core OS reservation. If it's in relation to their talks with Ubisoft, would you not assume their claim is accurate?
Based on the last 2 pages: told you so for introducing that dumb-ass bandwidth accounting into the thread.

You're taking this way too seriously, benny.
The name of their new libGCM equivalent was revealed here. All hail the king, GNM.

Pretty much, I don't think there's any new information here.
Garlic and onion, now they only need an olive oil thingie in there and we're cooking.