
PS4's memory subsystem has separate buses for CPU (20GB/s) and GPU (176GB/s)

RoboPlato

I'd be in the dick
Really interesting article, and somewhat worrying.
First of all, how on earth was this game only running at 30FPS on, presumably, some high-spec PC? It's not that great looking at all.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

The 10fps stat is fairly typical for getting a base engine up and running, which is all they were doing. Getting it up that quickly with a tiny team is pretty impressive.
 

MORT1S

Member
From what we know, the Shadowfall demo used 6GB of RAM on the devkit (which had 8GB), and that was planned to be further optimised for the previous 4GB GDDR5 PS4 spec. It's changed now.

You're right that there's no official statement on how much RAM the PS4 OS will use, but 1GB is a pretty safe bet, considering that no developer has disputed it, nor has this guy right here:

http://67.227.255.239/forum/showthread.php?t=617901

To pry a bit...

Do we know if actual numbers were used in the question, or was the interviewer prefacing the question for the reader?

Either way, I still believe the KZ slide, as it was on the money with the CPU reserve.
 

mrklaw

MrArseFace
Most probably because some 3rd party multimedia programs will be hosted there [Netflix].

3rd party apps won't be running in the background while you're gaming. When the OS is front and center it can use all 8 cores. The reservation is just for stuff happening in the background.

Background downloads shouldn't count; there's a dedicated processor for that (although it might need some housekeeping).

Maybe it's just a case of better safe than sorry, and they can dial it back later if needed.
 
Really interesting article, and somewhat worrying.
First of all, how on earth was this game only running at 30FPS on, presumably, some high-spec PC? It's not that great looking at all.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

This thread man, oh boy.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
The CPU's 20GB/s bus to main memory: is this separate from the GPU's 176GB/s peak read/write bandwidth on its own bus? Or is the figure actually 156GB/s GPU<-->GDDR5 and 20GB/s CPU<-->GDDR5?

It is not separate. The total bandwidth between the APU and the memory is 176GB/s. Garlic can use 100% of that bandwidth budget. Onion can use 20GB/s of it (which is totally fine for the CPU). There aren't really two physical buses between the APU and main memory, but internally the APU has different ways of accessing main memory, and those have different individual upper limits on how much of the available bandwidth they can allocate.
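To make that concrete, here's a minimal sketch of how the split typically shows up in code. The allocator and flags below are made-up names for illustration, not a real SDK API; the point is that you choose the access path per buffer:

```c
#include <stddef.h>

/* Hypothetical access-path flags -- illustrative only, not a real SDK API. */
enum mem_path {
    MEM_ONION,  /* coherent path: snoops CPU L1/L2, capped at ~20GB/s */
    MEM_GARLIC  /* direct GPU path: no cache snooping, up to 176GB/s  */
};

/* Made-up allocator that maps memory through the chosen path. */
extern void *sys_alloc(size_t size, enum mem_path path);

void setup_buffers(void)
{
    /* The CPU rewrites these constants every frame and the GPU reads
       them: route through Onion so the CPU caches stay coherent. */
    void *constants = sys_alloc(64 * 1024, MEM_ONION);

    /* Textures the CPU never touches after upload: route through
       Garlic so heavy GPU traffic never snoops (and stalls) the CPU. */
    void *textures = sys_alloc(256 * 1024 * 1024, MEM_GARLIC);

    (void)constants; /* silence unused-variable warnings in this sketch */
    (void)textures;
}
```

Either way it's the same physical GDDR5; the two "buses" are per-path caps carved out of the single 176GB/s budget.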
 

Krakn3Dfx

Member
Really interesting article, and somewhat worrying.
First of all, how on earth was this game only running at 30FPS on, presumably, some high-spec PC? It's not that great looking at all.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

 

mrklaw

MrArseFace
Eh

I would imagine 2GB would not be a bad number to reserve.

I would rather assume 6GB usable than 7GB, and hear better news later.

When the PS4 was first rumoured to have 2GB total, don't you think reserving 2GB of that for the OS would have been too much? Even when it was 4GB it would be too much. When they went up to 8GB at the last moment, the OS would have been fairly done, and would have been specified based on no more than 4GB total system memory, i.e. it would be designed to be compact.

With the increase they may choose to add some extra OS memory for overheads and future-proofing (with the aim of bringing it back down again if possible), but 2GB just sounds like pie in the sky. It's like MS's 3GB is normalising in people's minds, and therefore 1GB can't possibly be enough for a decent OS.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

10 FPS for the literal first successful startup of the game after changing the entire low-level API is actually pretty good.
 

tensuke

Member
Eurogamer said:
"The PS4's GPU is very programmable. There's a lot of power in there that we're just not using yet. So what we want to do are some PS4-specific things for our rendering but within reason - it's a cross-platform game so we can't do too much that's PS4-specific," he reveals.

"There are two things we want to look into: asynchronous compute where we can actually run compute jobs in parallel... We [also] have low-level access to the fragment-processing hardware which allows us to do some quite interesting things with anti-aliasing and a few other effects."



But really though, this is some fantastic news. Just 2-3 guys ported their core engine? Loving how much easier this is to work on than with Cell. God, that was a cool chip but it was confusing as hell.
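To put the asynchronous compute line in concrete terms: the idea is that compute jobs get submitted on their own queues and overlap the graphics work, synchronising only where results are consumed. A rough sketch follows; every name in it is a hypothetical stand-in for illustration, not the actual GNM API:

```c
#include <stdint.h>

/* Hypothetical queue handles and calls -- stand-ins for illustration,
   not the real GNM API. */
typedef struct queue queue_t;
extern queue_t *gfx_queue;      /* the graphics command queue      */
extern queue_t *compute_queue;  /* one of the async compute queues */

extern void submit_draws(queue_t *q);
extern void dispatch_compute(queue_t *q, const char *kernel);
extern void signal_label(queue_t *q, volatile uint64_t *label, uint64_t v);
extern void wait_label(queue_t *q, volatile uint64_t *label, uint64_t v);

static volatile uint64_t lights_done = 0;

void render_frame(void)
{
    /* Graphics work proceeds on its own queue as usual... */
    submit_draws(gfx_queue);

    /* ...while a compute job (say, light culling) runs in parallel on a
       compute queue, soaking up ALU cycles the draws leave idle. */
    dispatch_compute(compute_queue, "cull_lights");
    signal_label(compute_queue, &lights_done, 1);

    /* The graphics queue only stalls at the point it actually consumes
       the compute result, not for the whole frame. */
    wait_label(gfx_queue, &lights_done, 1);
    submit_draws(gfx_queue);  /* passes that depend on the culled lights */
}
```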
 

Interfectum

Member
Really interesting article, and somewhat worrying.
First of all, how on earth was this game only running at 30FPS on, presumably, some high-spec PC? It's not that great looking at all.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

To get a port up and running at 10 fps before you even touch much of the code for optimization seems like a pretty good thing to me.
 

pirata

Member
When I saw "Garlic Bus," all I could think of was Wario driving a city bus and cackling maniacally as passengers hop on board.
 

MORT1S

Member
When the PS4 was first rumoured to have 2GB total, don't you think reserving 2GB of that for the OS would have been too much? Even when it was 4GB it would be too much. When they went up to 8GB at the last moment, the OS would have been fairly done, and would have been specified based on no more than 4GB total system memory, i.e. it would be designed to be compact.

With the increase they may choose to add some extra OS memory for overheads and future-proofing (with the aim of bringing it back down again if possible), but 2GB just sounds like pie in the sky. It's like MS's 3GB is normalising in people's minds, and therefore 1GB can't possibly be enough for a decent OS.

No, I would assume that the footprint has grown with the available RAM.

Is it much of a stretch to say that as RAM increased, they saw more advantages it could bring to the OS? I would hate for them to reserve too little.

I don't think it has anything to do with the One's reserve; it's more about reserving enough for the future.

If I recall, 1GB of RAM in the Xbox One is reserved, doing nothing... I might be wrong, though.
 

Lord Error

Insane For Sony
Guys, before 10 more people quote me: the worrying part for me was that they never said what performance they achieved after they optimized things, switched to the lower-level hardware access API (GNM instead of GNMX), etc. Devs tend to brag about accomplishments, so I'd think if they'd hit a solid 30FPS after optimizing, they'd probably say so in the article. But they never revisited that performance comment.
 
Xbone's low spec holding PS4 back confirmed?

They are not developing the Xbone version; they are only handling the conversion from the base (PC) version to PS4.

Other than the Onion buses, isn't this exactly like a PC? I thought there would be high bandwidth between memory and CPU, and we could see some new programming paradigms.

For a CPU, that's actually a lot of available bandwidth, plus they have a bus between the CPU and GPU which really seems big enough for lots of new paradigms.
 
They are not developing the Xbone version; they are only handling the conversion from the base (PC) version to PS4.



For a CPU, that's actually a lot of available bandwidth, plus they have a bus between the CPU and GPU which really seems big enough for lots of new paradigms.

Not to support who you quoted, but it does seem like they won't be utilizing all the uniqueness of the PS4 setup, in order to preserve fidelity across all platforms. Anything they can share between the two systems I'm sure they'll capitalize on.
 

i-Lo

Member
From what we know, the Shadowfall demo used 6GB of RAM on the devkit (which had 8GB), and that was planned to be further optimised for the previous 4GB GDDR5 PS4 spec. It's changed now.

You're right that there's no official statement on how much RAM the PS4 OS will use, but 1GB is a pretty safe bet, considering that no developer has disputed it, nor has this guy right here:

http://67.227.255.239/forum/showthread.php?t=617901

Yep, this is the one I remember.

Also (I've been away for a while), I take it that 6 cores for games is still not set in stone.
 

Pistolero

Member
Ease of development is a great thing to have, but what I am most excited about is the low-level access to the GPU and the flexibility of the PS4's GPGPU architecture. It bodes well for the visual evolution (and not just the visual) of console games over the next 5-6 years...
 

Rashid

Banned
I got sidetracked by this as well in the Eurogamer article in the OP:

http://www.eurogamer.net/articles/digitalfoundry-ps3-disc-vs-digital-delivery

PSN downloads are capped at 1.5MB/s, or 12Mbps. My connection tops out at that too, which is why I've never experienced slowdown (and I hardly ever download anything). What I'm worried about is my download speed not being enough, and having the game stop or something, essentially buffering, because I caught up with the download. Hopefully GAF's man-crush Cerny will explain soon?
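For the unit-watchers: 1.5MB/s × 8 bits per byte = 12Mbps, so those two figures are the same cap expressed two ways. At that rate a 10GB download takes roughly 10,240MB ÷ 1.5MB/s ≈ 6,800 seconds, a bit under two hours.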
 

nib95

Banned
So the PS4 has more powerful hardware, an almost complete SDK, and lower-level access to hardware, which should give higher performance for more effort.

Xbone has less powerful hardware, an SDK which still needs significant work, and DirectX, which provides ease of use at the expense of performance.

The Xbone supposedly has 2 games (Forza and Titanfall) already running on Xbone hardware at 1080p/60, and the PS4 doesn't have any, IIRC (corrections welcome).

The PS4 games do seem to be graphically superior to me but not by a huge amount. Something isn't adding up.

You know what, a lot of juniors have been making posts like this...


Anyway, all the sports games run at 1080p/60fps on PS4, as do BF4, COD Ghosts, and several PS titles as well. So no, nothing is up. Just design choices.

And it's not a very good idea to bring up Titanfall and COD Ghosts as good examples of 60fps since, well, they look pretty crap graphically speaking. Far worse than other games. F5 looks great imo, though as has been said before, several major corners were cut to achieve the performance it has, and it's still lacking here and there in areas such as textures and filtering. Having said that, I fully expect them to go dynamic with the next one whilst retaining 60fps. These are launch games; things will change massively.
 

vazel

Banned
I got sidetracked by this as well in the Eurogamer article in the OP:

http://www.eurogamer.net/articles/digitalfoundry-ps3-disc-vs-digital-delivery

PSN downloads are capped at 1.5MB/s, or 12Mbps. My connection tops out at that too, which is why I've never experienced slowdown (and I hardly ever download anything). What I'm worried about is my download speed not being enough, and having the game stop or something, essentially buffering, because I caught up with the download. Hopefully GAF's man-crush Cerny will explain soon?
I get higher speeds than that on PSN.

 

Ryoku

Member
You know what, a lot of juniors have been making posts like this...


Anyway, all the sports games run at 1080p/60fps on PS4, as do BF4, COD Ghosts, and several PS titles as well. So no, nothing is up. Just design choices.

I don't think the resolution of BF4 was stated anywhere. Correct me if I'm wrong.
 

Rashid

Banned
Yep, this is the one I remember.

Also (I've been away for a while), I take it that 6 cores for games is still not set in stone.

I know that with Cell there were 6 SPUs available for games, 1 for the OS, and 1 locked to increase yields. They wouldn't do the same for what is essentially a conventional PC processor, would they? But I did hear that Intel's 6-core processors actually have another 2 cores that Intel switches off, according to this article. For a chip as comparatively easy to manufacture as Jaguar (compared to Cell), they really wouldn't need to disable cores to increase yields, because it's a simple chip, right?
 
This is why you don't see GDDR5 used by PC CPUs; both Intel's and AMD's CPUs can't use that bandwidth, so it's overkill.

So... the PS4's CPU has the bandwidth of DDR3 and the GPU has the bandwidth of GDDR5. Just like a PC setup.
 

Oppo

Member
This is why you don't see GDDR5 used by PC CPUs; both Intel's and AMD's CPUs can't use that bandwidth, so it's overkill.

So... the PS4's CPU has the bandwidth of DDR3 and the GPU has the bandwidth of GDDR5. Just like a PC setup.

Not really. This is still faster than PCIe, and starting next year PCs will be sold with APUs.
 

injurai

Banned
The GPU seems God-like.

Not really. The GPU architecture itself is already outdated. What makes these systems overcome that is how things are designed and reconfigured specifically for developing video games.

Sony has taken a lot of care to give this system a lot of growing room in terms of highly optimized development. This is how it will keep up with superior PC and off-the-shelf architecture. But the GPU in and of itself is not g0d-liek.
 

benny_a

extra source of jiggaflops
The CPU in a PC doesn't communicate with RAM via the PCIe Bus.
Look up what hUMA does and why he talks about PCIe.

I'm not quite sure what your comment is supposed to mean. Do you genuinely think you're correcting him on any point?
 

i-Lo

Member
Not really. The GPU architecture itself is already outdated. What makes these systems overcome that is how things are designed and reconfigured specifically for developing video games.

Sony has taken a lot of care to give this system a lot of growing room in terms of highly optimized development. This is how it will keep up with superior PC and off-the-shelf architecture. But the GPU in and of itself is not g0d-liek.

While I don't subscribe to the hyperbole of the user you quoted, I'm not sure that calling it "outdated" is fair, given that, expected customizations notwithstanding, the 8 Asynchronous Compute Units (queuing 64 commands in total) in Liverpool will become part of AMD's next-generation retail cards.
 

DieH@rd

Banned
The GPU seems God-like.

AMD makes the best modular CPUs and GPUs on the market, with great scalability, price/performance/power ratios, the ability to modify modules per buyers' needs, and all the latest tech inside. Plus they play nice [they need the money].

AMD was the only choice for MS and Sony. Nvidia and Intel are arrogant, pricey, hard to work with, and don't have the tech to create high-powered APUs.
 

lyrick

Member
Look up what hUMA does and why he talks about PCIe.

I'm not quite sure what your comment is supposed to mean. Do you genuinely think you're correcting him on any point?

Buzzwords that are as cute as the Horde chanting 8GB GDDR5 RAM. It enables some cool parallelism, but it isn't world-changing.

aquamala is pretty spot-on with his bandwidth comparison. The PS4's CPU bus is about on par with an Intel/AMD system memory bus, while the PS4's GPU bus is about on par with a midrange discrete GPU's video memory bandwidth.
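For rough context (ballpark figures, not from the article): dual-channel DDR3-1600 in a desktop gives 2 × 12.8GB/s = 25.6GB/s, the same league as the 20GB/s Onion cap, while a midrange card like the Radeon HD 7850 has 153.6GB/s of GDDR5 bandwidth, the same league as the 176GB/s Garlic peak.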
 
I like the part where they kinda say that they could be doing a lot more with the PS4, but won't because it's multiplatform, so they'll just do a little more.

Next gen will be interesting.
 

benny_a

extra source of jiggaflops
Buzzwords that are as cute as the Horde chanting 8GB GDDR5 RAM. It enables some cool parallelism, but it isn't world-changing.
So he and I (I presume) were using buzzwords while you were correcting some misconception that you didn't elaborate on.

I would welcome actual misconceptions being corrected instead of made-up ones like "the CPU doesn't go through PCIe on PCs", which nobody implied at any point in the history of NeoGAF. Giraffes are not fishes, by the way.
 

DieH@rd

Banned
While I don't subscribe to the hyperbole of the user you quoted, I'm not sure that calling it "outdated" is fair, given that, expected customizations notwithstanding, the 8 Asynchronous Compute Units (queuing 64 commands in total) in Liverpool will become part of AMD's next-generation retail cards.

Yeah, those modifications will most likely be part of the Radeon 9xxx series.
 
It is really just about whether an individual address in main memory should be mapped through the CPU's L1/L2 caches (Onion) or not (Garlic). The CPU's L1/L2 is (a) of limited size and (b) highly relevant to the CPU but at the same time irrelevant to the GPU. Hence, you issue access to CPU-relevant data through Onion, and access to CPU-irrelevant data through Garlic. As a result, the GPU does not bully the CPU.

And how does this compare to the Xbox 360 architecture?
 

Evoga

Member
Really interesting article, and somewhat worrying.
First of all, how on earth was this game only running at 30FPS on, presumably, some high-spec PC? It's not that great looking at all.
Second, they said that the initial PS4 build was running at just 10FPS, which in itself is worrisome given that they didn't say what performance they achieved after all those optimizations they've done.

It looks like it's hard to program the PS4 with that damn flashlight controller dazzling your eyes and screen.
 