
PS4 initial costing analysis [Updated]

DarkFlow

Banned
That GDDR5 cost is insane. I'm really shocked Sony management allowed this. Still, it's going to be amazing for (hardcore) consumers.
Well it will be high out of the gate, yes, but Sony alone will most likely drive the cost way down. The sheer amount they need is going to make it cheaper across the board, I would guess very fast.
 
People keep trotting out the latency boogeyman for some reason. It's not going to be a problem on either console.

RAM latency is the largest factor for the CPU by far; the CPU rarely needs the bandwidth. The GPU, on the other hand, is the opposite, due to being parallel.

Again, the PS4 will have a superior graphics solution while the X1 will have a superior CPU solution. And both parts are equally important in modern gaming. My prediction is that, in the end, both boxes perform very close to each other in the real world.
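The latency-vs-bandwidth split can be sketched with a toy transfer-time model: total time is roughly a fixed access latency plus payload divided by bandwidth, so small CPU-style accesses are latency-bound while big GPU-style transfers are bandwidth-bound. A quick sketch (the latency and bandwidth numbers are illustrative assumptions, not measured DDR3/GDDR5 timings):

```python
# Toy model: total transfer time = fixed access latency + payload / bandwidth.
# All figures below are illustrative assumptions, not real DDR3/GDDR5 timings.

def transfer_ns(bytes_moved, latency_ns, bandwidth_gb_s):
    # 1 GB/s moves roughly 1 byte per nanosecond, so payload time in ns
    # is bytes divided by bandwidth in GB/s.
    return latency_ns + bytes_moved / bandwidth_gb_s

# A 64-byte cache line (CPU-style access): the fixed latency dominates.
cpu_ddr3 = transfer_ns(64, 50, 68)     # ~51 ns
cpu_gddr5 = transfer_ns(64, 60, 176)   # ~60 ns

# An 8 MB texture (GPU-style transfer): bandwidth dominates.
gpu_ddr3 = transfer_ns(8 * 2**20, 50, 68)     # ~123,000 ns
gpu_gddr5 = transfer_ns(8 * 2**20, 60, 176)   # ~48,000 ns
```

On these made-up numbers, the lower-latency memory wins the cache-line case and the higher-bandwidth memory wins the texture case, which is exactly the asymmetry being described.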
 
Well it will be high out of the gate, yes, but Sony alone will most likely drive the cost way down. The sheer amount they need is going to make it cheaper across the board, I would guess very fast.
Its increased usage in PCs over the years hasn't done much to spot prices, and the number of video cards shipped these days is substantial, higher than console shipments. I doubt Sony will cause a paradigm shift in the business. Any cost decrease will just be the natural trend, IMO.
 

coldfoot

Banned
the number of video cards shipped these days is substantial, higher than console shipments.
Do you have any figures to back that up? Especially when you only include video cards that have GDDR5? Even then, PS4 will have 16 4Gbit chips and most video cards won't have that many on them to drive down prices.

GDDR5 is really not much different from DDR3 in terms of fundamental silicon production; it's just produced in far smaller quantities than DDR3.
 
RAM latency is the largest factor for the CPU by far; the CPU rarely needs the bandwidth. The GPU, on the other hand, is the opposite, due to being parallel.

Again, the PS4 will have a superior graphics solution while the X1 will have a superior CPU solution. And both parts are equally important in modern gaming. My prediction is that, in the end, both boxes perform very close to each other in the real world.
I don't think this is true at all and the latency problem was debunked here multiple times already:

-CPUs are very weak in both consoles. Devs are supposed to use the GPGPU features more and more instead of the CPU. Therefore the GPU will play the biggest part nextgen

-PS4 uses an OOO CPU
While DDR3 has the advantage of low latency compared to GDDR5, you need high bandwidth when transferring large amounts of data (graphics processing is precisely where you have to deal with such a scenario), and that is where GDDR5 excels over DDR3.

The low latency found in DDR3 is meaningful (AFAIK) in order to remedy situations like a cache miss, where the CPU fails to find data in the cache and has to look for it in system memory (the worst-case scenario). This causes the CPU to stall and increases system latency (and decreases system performance). The low latency of DDR3 enables you to fetch the data more quickly on your next attempt. However, since the CPU in the PS4 is an out-of-order processor, it doesn't have to wait for that missed data to appear in the cache; the CPU will simply execute other instructions instead of just sitting and waiting for the data from the cache miss. So when dealing with small chunks of data, DDR3 is better. When dealing with large (huge) chunks of data, you need the high bandwidth supplied by GDDR5.
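The latency-hiding argument above can be sketched as a toy cycle count (the miss latency and the amount of independent work are made-up illustrative numbers, not Jaguar figures): an in-order core pays the full miss and then does the remaining work, while an out-of-order core overlaps the two.

```python
# Toy comparison of how an in-order vs an out-of-order core spends cycles on
# a cache miss. The cycle counts are illustrative assumptions, not real specs.

MISS_LATENCY = 200      # cycles to fetch a missed line from DRAM (assumed)
INDEPENDENT_WORK = 150  # cycles of instructions that don't need the load

def in_order_cycles(miss, work):
    # An in-order core stalls for the whole miss, then runs the work.
    return miss + work

def out_of_order_cycles(miss, work):
    # An OOO core runs the independent instructions while the miss is in
    # flight; only the non-overlapped part of the miss is exposed.
    return max(miss, work)

print(in_order_cycles(MISS_LATENCY, INDEPENDENT_WORK))      # 350
print(out_of_order_cycles(MISS_LATENCY, INDEPENDENT_WORK))  # 200
```

On these assumed numbers the out-of-order core hides most of the extra miss latency, which is why a few nanoseconds of GDDR5 penalty need not show up in CPU throughput.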

Nothing of any importance.
A console processor requires very limited functionality compared with a PC processor. Console processors aren't required to be optimized for a vast range of software, drivers, hardware changes, OS bloat, and concurrent processes. Even a PC CPU can make do with a very small amount of RAM if the workload is streamlined, as it is in a console.

Vienn_22 said: ↑
You also said above that the APU is "sensitive to memory bandwidth" but that is only for the GPU side, since the APU has a CPU side too what will happen now that PS4 is using GDDR5 for both, since the 2 memories are opposites wouldn't it affect the CPU performance??
Nope, not in the slightest. What applications would a console be running that are CPU intensive and require minimal latency? CPUs require minimal latency because of multiple applications fighting for resources from available compute threads/cores - and multiple concurrent applications aren't likely to come into play with a console.
Most, if not all, applications running on a console APU would be hardware (GPU) accelerated. At this point I'm not even sure if PhysX wouldn't be HW accelerated on an AMD APU.
http://www.techspot.com/community/t...-memory-and-gddr5-memory.186408/#post-1295335


-PS4 has several backdoors to speed things up
Latency has been going up year after year on both desktop memory and video memory, but so has performance. The PS4 has backdoors to skip the cache with the Onion and Garlic customizations.
http://www.vgleaks.com/playstation-4-architecture-evolution-over-time/


I'm pretty sure latency is going to be a non-issue.
 
RAM latency is the largest factor for the CPU by far; the CPU rarely needs the bandwidth. The GPU, on the other hand, is the opposite, due to being parallel.

Again, the PS4 will have a superior graphics solution while the X1 will have a superior CPU solution. And both parts are equally important in modern gaming. My prediction is that, in the end, both boxes perform very close to each other in the real world.

Your prediction is wrong. The X1's marginally better latency (we are talking about a few nanoseconds here) won't be able to compete with 55+% more processing power, 158% more memory bandwidth and 40% more usable memory.
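For what it's worth, those percentages roughly check out against the commonly cited pre-launch figures. Treat the numbers below as rumored/assumed specs, not confirmed ones; the usable-memory split in particular was only a rumor at the time.

```python
# Back-of-the-envelope check of the quoted percentages, using commonly
# reported pre-launch figures. These are assumptions/rumors, not official.
ps4_bw, xb1_bw = 176.0, 68.0      # GB/s main-memory bandwidth (GDDR5 vs DDR3)
ps4_tf, xb1_tf = 1.84, 1.18       # TFLOPS, per early GPU rumors
ps4_ram, xb1_ram = 7.0, 5.0       # GB rumored usable for games

def pct_more(a, b):
    # How much bigger a is than b, in percent.
    return (a / b - 1.0) * 100.0

print(round(pct_more(ps4_bw, xb1_bw)))    # 159 -> "158% more bandwidth"
print(round(pct_more(ps4_tf, xb1_tf)))    # 56  -> "55+% more processing power"
print(round(pct_more(ps4_ram, xb1_ram)))  # 40  -> "40% more usable memory"
```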
 

nib95

Banned
Assembly - 13 (Based on being made in China, for made in Japan, make that ~ $45)

So on a manufacturing run of 10 million (estimated quantity based on the first few batches for global production), if made in China instead of Japan, they save themselves ($45 - $13) x 10 million, an estimated $320 million? Damn.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
True but Sony can't change the physical properties of GDDR5. It flat out sucks for CPU tasks.

It doesn't suck, it is ill-suited. CPUs don't need massive bandwidth, and hence they don't use more expensive high-bandwidth RAM. It's like saying a V8 Hemi "flat out sucks" for a Civic.
 
APU and ARM chip costs don't seem realistic.
Why?
RAM latency is the largest factor for the CPU by far; the CPU rarely needs the bandwidth. The GPU, on the other hand, is the opposite, due to being parallel.

Again, the PS4 will have a superior graphics solution while the X1 will have a superior CPU solution. And both parts are equally important in modern gaming. My prediction is that, in the end, both boxes perform very close to each other in the real world.

Wait what?

An overly latent memory solution would hurt the CPU, but neither system suffers from that. The PS4 and XB1 are identical at worst, as far as CPUs.
 

Krilekk

Banned
Do you have any figures to back that up? Especially when you only include video cards that have GDDR5? Even then, PS4 will have 16 4Gbit chips and most video cards won't have that many on them to drive down prices.

GDDR5 is really not much different from DDR3 in terms of fundamental silicon production; it's just produced in far smaller quantities than DDR3.

Problem: GDDR6 comes out next year and will be adopted by GPU manufacturers in no time (there was basically no GPU in the last four years that didn't use GDDR5), meaning the PS4 won't change anything in overall quantities. And since it's five-year-old tech that isn't being developed anymore (we have stacked DDR3, DDR4 and GDDR6 next year, each better than GDDR5), the cost won't go down substantially.
 

SSM25

Member
I firmly believe MS's decision to go with DDR3 was purely based on their need to have 8 GB of RAM available, rather than on cutting costs (it does not make sense for MS to cut costs on this).

8 GB of RAM + 1.6 billion transistors for eSRAM is not cheap
 

Razgreez

Member
Problem: GDDR6 comes out next year and will be adopted by GPU manufacturers in no time (there was basically no GPU in the last four years that didn't use GDDR5), meaning the PS4 won't change anything in overall quantities. And since it's five-year-old tech that isn't being developed anymore (we have stacked DDR3, DDR4 and GDDR6 next year, each better than GDDR5), the cost won't go down substantially.

You mean "might" come out late next year, will likely only be available in low quantities initially, and will take quite some time for production to ramp up. A more realistic outlook, at least.

Also, there are still GPUs today that don't use GDDR5.
 
An overly latent memory solution would hurt the CPU, but neither system suffers from that. The PS4 and XB1 are identical at worst, as far as CPUs.

The Xbox One uses eSRAM, so it actually has a lower-latency (albeit smaller) pool of RAM than the PS4. Sony has said they have built memory-latency reductions into their APU design, but I guess only they and their devs have real benchmarks.

You mean "might" come out late next year, will likely only be available in low quantities initially, and will take quite some time for production to ramp up. A more realistic outlook, at least.

Also, there are still GPUs today that don't use GDDR5.

His overall point is that there are lots of upcoming memory solutions that compete with GDDR5. Not sure how that will affect supply or demand at this point.
 
The Xbox One uses eSRAM, so it actually has a lower-latency (albeit smaller) pool of RAM than the PS4. Sony has said they have built memory-latency reductions into their APU design, but I guess only they and their devs have real benchmarks.




Well, yeah, I know the Xbone uses DDR3 and therefore has lower latency. I'm saying that doesn't necessarily make the CPU situation better.
 
Well, yeah, I know the Xbone uses DDR3 and therefore has lower latency. I'm saying that doesn't necessarily make the CPU situation better.

I wrote eSRAM; I didn't even mention DDR3. But I don't quite understand what you are saying about latency and the CPU taken together.

His overall point is moot if either company wanted/wants to release this year.

They will be making consoles for the next 5+ years, so I don't see how it is moot. We're talking about GDDR5 price over the entire production life.
 

DBT85

Member
His overall point is that there are lots of upcoming memory solutions that compete with GDDR5. Not sure how that will affect supply or demand at this point.

As some people speculated in the early GDDR5 threads, we don't know whether Sony will be able to switch to stacked DDR4 production later in the life cycle.
 

Razgreez

Member
They will be making consoles for the next 5+ years, so I don't see how it is moot. We're talking about GDDR5 price over the entire production life.

Let me get this straight: you're saying Sony should wait, at best, over a year to release their next console, most likely at a higher price point, to gain the benefit of the decreasing price of GDDR6, which might only reach price parity with GDDR5, again best case, a year after its release, i.e. two years from now. Not forgetting that the RAM is only one component in the entire system.

What a brilliant business plan
 
Let me get this straight: you're saying Sony should wait, at best, over a year to release their next console, most likely at a higher price point, to gain the benefit of the decreasing price of GDDR6, which might only reach price parity with GDDR5, again best case, a year after its release, i.e. two years from now. Not forgetting that the RAM is only one component in the entire system.

What a brilliant business plan

No one ever said that; at least I didn't. Again, we're talking about forecasting the price of GDDR5, NOT about Sony changing memory options. GDDR6 was mentioned as a competing memory option, not as a possibility for the PS4.

As some people speculated in the early GDDR5 threads, we don't know whether Sony will be able to switch to stacked DDR4 production later in the life cycle.

Sure. I think this is a scenario where Sony has better models for the price of PS4 memory, and for the possible options, than we do. We don't really know what technologies will be available further out, nor do we fully know their economics.
 
I wrote eSRAM; I didn't even mention DDR3. But I don't quite understand what you are saying about latency and the CPU taken together.



They will be making consoles for the next 5+ years, so I don't see how it is moot. We're talking about GDDR5 price over the entire production life.

The XBone's memory is DDR3. The eSRAM is nothing more than a big scratchpad, primarily for the GPU. DDR3 has lower latency than GDDR5, but GDDR5's latency is nothing to worry about.
 
The XBone's memory is DDR3. The eSRAM is nothing more than a big scratchpad, primarily for the GPU. DDR3 has lower latency than GDDR5, but GDDR5's latency is nothing to worry about.

I'm not going to discuss this with you any further than this post. Talking about memory latency, you said that the CPU on both systems was "identical at worst, as far as CPUs." I wrote that the Xbone's CPU and memory setup could enable it to do certain things better than the PS4's CPU and memory setup. That's it.

So, no, they are not "identical at worst." Just because you say the eSRAM would be used as a GPU scratchpad doesn't mean the Xbone's CPU + eSRAM + DDR3 setup couldn't perform better at some tasks than the PS4's CPU + GDDR5.

In practice, I don't think any of this will make a difference, especially since the GPU is what will be featured over the CPU in both machines. And frankly, only Sony and their devs know the memory latency after all their hardware tweaking and optimizing.
 

Melchiah

Member
I remember the PS2 was $375 to manufacture and sold at $299. The PS3 cost $840 and sold at $599. Unless you want to make a very weak console like Nintendo has been doing lately, taking a loss on hardware comes with the territory. If the PS4 costs ~$450 then I'm expecting it to be sold at $400. Of course they may not do that and may want to break even or even make a bit per console, but I won't lose sleep over it. I'm planning on getting one regardless.

That's news to me. They recouped their losses well in the European market, where the PS2 cost 500€ at launch.

EDIT: And the PS3 cost 659€ here at launch
 
I'm not going to discuss this with you any further than this post. Talking about memory latency, you said that the CPU on both systems was "identical at worst, as far as CPUs." I wrote that the Xbone's CPU and memory setup could enable it to do certain things better than the PS4's CPU and memory setup. That's it.

So, no, they are not "identical at worst." Just because you say the eSRAM would be used as a GPU scratchpad doesn't mean the Xbone's CPU + eSRAM + DDR3 setup couldn't perform better at some tasks than the PS4's CPU + GDDR5.

In practice, I don't think any of this will make a difference, especially since the GPU is what will be featured over the CPU in both machines. And frankly, only Sony and their devs know the memory latency after all their hardware tweaking and optimizing.

The Xbox One CPU can't read from the eSRAM. It's a non-factor in CPU performance.
 
That's news to me. They recouped their losses well in the European market, where the PS2 cost 500€ at launch.

EDIT: And the PS3 cost 659€ here at launch

PS2 was a different beast. Sony will have to start boosting revenue from digital distribution and various other services by a lot.

The Xbox One CPU can't read from the eSRAM. It's a non-factor in CPU performance.

Well, my bad.
 
I'm not going to discuss this with you any further than this post. Talking about memory latency, you said that the CPU on both systems was "identical at worst, as far as CPUs." I wrote that the Xbone's CPU and memory setup could enable it to do certain things better than the PS4's CPU and memory setup. That's it.

So, no, they are not "identical at worst." Just because you say the eSRAM would be used as a GPU scratchpad doesn't mean the Xbone's CPU + eSRAM + DDR3 setup couldn't perform better at some tasks than the PS4's CPU + GDDR5.

In practice, I don't think any of this will make a difference, especially since the GPU is what will be featured over the CPU in both machines. And frankly, only Sony and their devs know the memory latency after all their hardware tweaking and optimizing.

LOL, that's a hell of a preamble. I'm saying that latency only matters past a certain point. I can dive into the details if you want, but then again you don't seem to know what you're talking about.
 

mavs

Member
I'm not going to discuss this with you any further than this post. Talking about memory latency, you said that the CPU on both systems was "identical at worst, as far as CPUs." I wrote that the Xbone's CPU and memory setup could enable it to do certain things better than the PS4's CPU and memory setup. That's it.

So, no, they are not "identical at worst." Just because you say the eSRAM would be used as a GPU scratchpad doesn't mean the Xbone's CPU + eSRAM + DDR3 setup couldn't perform better at some tasks than the PS4's CPU + GDDR5.

In practice, I don't think any of this will make a difference, especially since the GPU is what will be featured over the CPU in both machines. And frankly, only Sony and their devs know the memory latency after all their hardware tweaking and optimizing.

The CPU has to go through the GPU memory controller to access the eSRAM, if vgleaks is to be believed. So we really don't know what the latency will be like. And really, there's basically no configuration of components where we could say with any confidence what it would be without measuring it in a hands-on experiment.
 
I'm not going to discuss this with you any further than this post. Talking about memory latency, you said that the CPU on both systems was "identical at worst, as far as CPUs." I wrote that the Xbone's CPU and memory setup could enable it to do certain things better than the PS4's CPU and memory setup. That's it.

So, no, they are not "identical at worst." Just because you say the eSRAM would be used as a GPU scratchpad doesn't mean the Xbone's CPU + eSRAM + DDR3 setup couldn't perform better at some tasks than the PS4's CPU + GDDR5.

In practice, I don't think any of this will make a difference, especially since the GPU is what will be featured over the CPU in both machines. And frankly, only Sony and their devs know the memory latency after all their hardware tweaking and optimizing.
Again, there is nothing to worry about. GDDR5 latency is a non-issue in a console environment. A console doesn't do 1000 things at the same time like a PC, so the latency won't come into play at all. Even if it does, the CPU in the PS4 is OOO, which means that if it has to wait on a read from RAM, it simply executes other instructions first. And in this controlled environment devs can easily program around it, because they know exactly what they are working with.

These two posts lay it out pretty well:

While DDR3 has the advantage of low latency compared to GDDR5, you need high bandwidth when transferring large amounts of data (graphics processing is precisely where you have to deal with such a scenario), and that is where GDDR5 excels over DDR3.

The low latency found in DDR3 is meaningful (AFAIK) in order to remedy situations like a cache miss, where the CPU fails to find data in the cache and has to look for it in system memory (the worst-case scenario). This causes the CPU to stall and increases system latency (and decreases system performance). The low latency of DDR3 enables you to fetch the data more quickly on your next attempt. However, since the CPU in the PS4 is an out-of-order processor, it doesn't have to wait for that missed data to appear in the cache; the CPU will simply execute other instructions instead of just sitting and waiting for the data from the cache miss. So when dealing with small chunks of data, DDR3 is better. When dealing with large (huge) chunks of data, you need the high bandwidth supplied by GDDR5.

A console processor requires very limited functionality compared with a PC processor. Console processors aren't required to be optimized for a vast range of software, drivers, hardware changes, OS bloat, and concurrent processes. Even a PC CPU can make do with a very small amount of RAM if the workload is streamlined, as it is in a console.

Vienn_22 said: ↑
You also said above that the APU is "sensitive to memory bandwidth" but that is only for the GPU side, since the APU has a CPU side too what will happen now that PS4 is using GDDR5 for both, since the 2 memories are opposites wouldn't it affect the CPU performance??
Nope, not in the slightest. What applications would a console be running that are CPU intensive and require minimal latency? CPUs require minimal latency because of multiple applications fighting for resources from available compute threads/cores - and multiple concurrent applications aren't likely to come into play with a console.
Most, if not all, applications running on a console APU would be hardware (GPU) accelerated. At this point I'm not even sure if PhysX wouldn't be HW accelerated on an AMD APU.
http://www.techspot.com/community/t...-memory-and-gddr5-memory.186408/#post-1295335
 

prag16

Banned
I remember the PS2 was $375 to manufacture and sold at $299. The PS3 cost $840 and sold at $599. Unless you want to make a very weak console like Nintendo has been doing lately, taking a loss on hardware comes with the territory. If the PS4 costs ~$450 then I'm expecting it to be sold at $400. Of course they may not do that and may want to break even or even make a bit per console, but I won't lose sleep over it. I'm planning on getting one regardless.

This is only the BOM. That's not the whole story.

The Wii's BOM was $160 IIRC, and it sold for a slight (single-digit dollar) profit. The Vita's BOM was $170 or $180, and it sold at a loss at $250. I don't know the exact breakdowns, but you have to consider manufacturing, labor, packaging, shipping, retailer markup ($10-20 for last-gen consoles at launch, I believe), and probably other minor costs.

So selling a $450-BOM console at $400 is much more than a $50 loss; it would easily be over $100, maybe even close to $150.
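As a sketch, with purely hypothetical per-unit costs stacked on top of the BOM (none of these figures are known; they just illustrate how the loss grows past the $50 sticker gap):

```python
# Hypothetical per-unit cost stack on top of a $450 BOM, sold at $400.
# Every non-BOM figure below is an assumption for illustration only.
bom = 450
retail = 400
assembly_and_labor = 20   # assumed
packaging_shipping = 15   # assumed
retailer_markup = 15      # in the $10-20 range mentioned above

total_cost = bom + assembly_and_labor + packaging_shipping + retailer_markup
loss_per_unit = total_cost - retail
print(loss_per_unit)  # 100 -> over $100 lost per console on these assumptions
```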

With regard to the numbers in the OP, a few of them are probably a little high, but the ARM chip and the APU sound too low. The Chipworks guy speculated the Wii U's MCM cost Nintendo around $100. So either Nintendo fucked that up way worse than we thought by paying that much, or the estimate for the APU is way too low. It DOES sound low for a 'decent' octocore CPU and an alleged HD 7850 equivalent.

$449 or $499 is more likely imo.
 

King326

Member
I'm pretty sure Sony learned from their previous mistake of launching the PS3 at such a high price. The $400-$450 range for the PS4 should do it for me.
 

KidBeta

Junior Member
Best case for CPUs, you don't even hit the RAM. I'm sure the devs can cook up some cache magic for a good majority of cases on a nice closed box.
 
The problem with hailing the Xbone as having a superior CPU due to latency is that both the PS4's and the Xbone's CPUs are cheap trash. I think the GPU will have a bigger influence this gen, and the PS4 has a definite lead there, judging from the specs we know.
 

Jburton

Banned
The problem with hailing the Xbone as having a superior CPU due to latency is that both the PS4's and the Xbone's CPUs are cheap trash. I think the GPU will have a bigger influence this gen, and the PS4 has a definite lead there, judging from the specs we know.

Xbox One does not have a superior CPU ........ where are you getting that from?


I stated the day of the PlayStation 4 reveal, a few posts down from the OP, that the PS4 will retail in the UK for £359 ....... I still stand by that, maybe less.
 

Shikoro

Member
Lol at those prices...
You can easily slash 50% off that.

Not really half, but at least 25% off the total price. I don't expect the manufacturing costs for the whole console to be above $350. The rest will bring the cost a bit above $399, but they will easily make it back with the price we in Europe will be paying. I really wouldn't worry about either the affordability or them taking big losses. They'll be perfectly fine. :)
 

PistolGrip

sex vacation in Guam
Cloud computing has less latency than GDDR5, amirite? Guys...?

No, but the bandwidth increases dramatically, therefore making the GDDR5 bandwidth advantage moot. When it comes to physics and animation you will see a tremendous advantage on the Xbone due to its HPC (high performance cloud).

LOL
 