lostinblue
Banned
> You're right of course. I should've read that more carefully, sorry.

No need to say sorry, I knew it because I researched it, and had read the very same PDF you presented.
> Still, that seems almost ridiculously high to me and would make the PS4 250+ W. Much more than I'd have expected. Are you sure that GDDR5 chips are not quite a bit less power hungry today on more modern production processes (sadly I can't find anything on that on the net)?

It is ridiculously high, yes.
As for them being less power hungry, not really. These chips are still stuck on the 46 nm process and use the same voltage as before. The best part money can buy here is 4 Gbit chips at 1.35 V; any other solution will eat up even more energy.
> Interesting; wasn't aware they had 512 MB chips. Thought it was all 256 still at this point (at any sane price). I was envisioning 32 of those things on the board, lol. 16 isn't as bad.

Perhaps it'll have 32 of them; but since this seems like a late change, doubling density seems like the only sane way to avoid going back to the drawing board with the whole circuit and power-supply design/testing.
I'm also assuming the dummies didn't go with a second DDR3 bank because the GPU and the CPU lack a DDR3 controller and it's kinda tight to change that at this point.
Most chips are still 256 MB; a lot of manufacturers are not offering 512 MB ones, or do so in limited numbers. They're also kinda expensive, seeing as production clearly hasn't fully ramped up.
But the possibility of 32 chips for 8 GB is just too much, power-drain-wise.
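A rough sketch of why 32 chips is scary. The per-chip wattage here is my own assumption for illustration (the thread never gives one), so treat the totals as order-of-magnitude only:

```python
# Back-of-the-envelope RAM power estimate.
# ~2.5 W per active GDDR5 chip is an ASSUMED figure, not from any datasheet.
WATTS_PER_GDDR5_CHIP = 2.5  # assumption: full-clock draw per chip

for chips in (16, 32):
    total = chips * WATTS_PER_GDDR5_CHIP
    print(f"{chips} chips -> ~{total:.0f} W for the RAM pool alone")
```

Under that assumption, going from 16 chips to 32 doubles the RAM's share of the power budget, which is the whole objection above.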
> Wii U has 1 GB out of 2 GB for OS functions. How does that make you feel?

It's not about the amount of RAM, it's the type. GDDR5 for shit like OS and caching is like using a Ferrari for city traffic. 8 GB of GDDR5 is overkill; hell, it makes me sad thinking 1/2 GB of that will be allocated for OS duties and perhaps another extra chunk will go to caching Blu-ray data, being fed at a whopping 27 MB/s.
And everyone is ignoring the real bottleneck of this upcoming generation; sure, fast RAM is nice and all, but the more RAM you add, the worse that bottleneck gets in relative terms.
And that bottleneck is Blu-ray transfer rate, and how many times the usable RAM fits into a disc; because most of the data populating the RAM pool comes from the disc.
To illustrate my point:
PSone had 3 MB of RAM (2 MB plus 1 MB of VRAM), and the CD-ROM drive had a 300 KB/s throughput; that puts a full 3 MB load at 10 seconds (ignoring seek time).
These 3 MB also fit 217 times in a disc, meaning that (not in a linear way; just defining a pattern here, as there is always shared data) you could have 217 completely separate scenes filling a 650 MB disc, which is why devs could only actually fill a disc by stepping up on FMVs.
On PS2, the drive had a 5.54 MB/s transfer rate, which for 32 MB of RAM means a full load takes about 6 seconds to stream; 32 MB also fits 147 times in a 4.7 GB DVD.
This generation... Most multiplatform games were bound by the X360's DVD storage limit, so we're talking 8.54 GB DVDs (and throughout most of the generation they could only use about 7 GB, as the rest was reserved for security countermeasures). Going by the same logic, it took 33 seconds to stream 512 MB off the disc at 15.85 MB/s, and 512 MB fits 14/17 times in 7/8.54 GB (respectively).
Next generation... I'll assume the PS4 will only have 6 GB for games; that's 3 minutes and 48 seconds to stream 6 GB off the disc at 27 MB/s, and 6 GB fits roughly 4 times in a 25 GB Blu-ray.
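The per-generation arithmetic above can be sketched in a few lines. The figures are the ones quoted in the post (the original mixes decimal and binary megabytes, so the outputs can differ from the rounded numbers above by a second or two):

```python
# Time to stream one full RAM pool, and how many RAM pools fit on a disc,
# using the drive rates and capacities quoted in the post.
CONSOLES = {
    #  name:        (ram_mb, drive_mb_s, disc_mb)
    "PSone":        (3,      0.3,        650),
    "PS2":          (32,     5.54,       4700),
    "X360 (DVD9)":  (512,    15.85,      8540),
    "PS4 (6 GB)":   (6144,   27.0,       25600),
}

for name, (ram, rate, disc) in CONSOLES.items():
    fill_s = ram / rate   # seconds to stream one full RAM pool off the disc
    loads = disc / ram    # how many full RAM pools the disc holds
    print(f"{name}: ~{fill_s:.0f} s per full load, ~{loads:.0f} loads per disc")

# The mandatory-install scenario: reading a whole 25 GB Blu-ray at 27 MB/s.
install_min = 25600 / 27.0 / 60
print(f"Full-disc install: ~{install_min:.0f} minutes")
```

The trend the post is describing falls straight out of the two columns: fill time keeps climbing while loads-per-disc keeps shrinking.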
This means a tendency toward huge load times, and games that are short or repetitive asset-wise, if the whole RAM pool is used on a regular basis. It's a major bottleneck, seeing as you can't possibly make a 16-minute install mandatory before playing a game (that's how long 25 GB of data takes to transfer).
For games that do pre-emptive caching, they could be using 170 GB/s RAM for 27 MB/s transfers; that's what makes me sad. They're pouring money into a RAM type that is overkill for the tasks the hardware has to do, to compensate for the fact that the drive is hellishly slow compared with the amount of data it has to feed the machine. You have comparatively shitty DDR3 in PCs and the like for a reason: it's cheap and it consumes 2.58 W per 16-chip DIMM. This... this is nuts. It's megalomaniac, at a time I thought Sony had learned its lesson.
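Two of the numbers in that paragraph, spelled out. The GB-to-MB conversion here is decimal, so treat the bandwidth ratio as order-of-magnitude:

```python
# DDR3 power per chip, derived from the 2.58 W per 16-chip DIMM figure above.
ddr3_w_per_chip = 2.58 / 16
print(f"DDR3: ~{ddr3_w_per_chip:.2f} W per chip")

# Mismatch between the GDDR5 pool's bandwidth and the drive feeding it:
# 170 GB/s of RAM bandwidth vs a 27 MB/s optical drive (decimal units).
ratio = (170 * 1000) / 27
print(f"RAM is ~{ratio:.0f}x faster than the drive filling it")
```

Roughly 0.16 W per DDR3 chip versus the multi-watt GDDR5 parts, and a RAM pool thousands of times faster than the disc it caches; that's the Ferrari-in-city-traffic point in numbers.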
Wii U will probably have textures with more compression (less resolution too, seeing as it's meant for 720p) but far more sustainable load times; in the end most people won't notice, and it'll feel more like a console, seeing as installs won't be mandatory and games will simply run more seamlessly, sans pauses, in that sense. Although I do think it could have more RAM, just in case / for future-proofing reasons; 1 GB right now seems just right.