
Pitcairn (Radeon 7870) is such a good candidate for consoles

It's $349 at retail not because of the cost of manufacturing it, but because AMD feels that is a good price for profit/supply reasons. Wafer costs for 28nm are higher than for 40nm, but they will come down as the process matures later this year and should easily be around similar pricing to 40nm by the time the next console cycle is about to begin.

In short, I expect a Pitcairn-sized GPU to cost about $30-50.
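A rough back-of-the-envelope sketch of where a number like that comes from (the die size is Pitcairn's; the wafer price and yield below are my own assumptions, not figures from this thread):

```python
import math

# Back-of-the-envelope per-die cost for a Pitcairn-sized GPU.
# Wafer price and yield are illustrative assumptions, not known figures.
wafer_diameter_mm = 300
die_area_mm2 = 212          # Pitcairn is roughly 212 mm^2
wafer_cost_usd = 5000       # assumed 28nm wafer price
yield_rate = 0.70           # assumed fraction of good dies

wafer_area_mm2 = math.pi * (wafer_diameter_mm / 2) ** 2
gross_dies = wafer_area_mm2 // die_area_mm2   # ignores edge loss / scribe lines
good_dies = gross_dies * yield_rate
print(f"~{good_dies:.0f} good dies -> ~${wafer_cost_usd / good_dies:.0f} per die")
```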

If that were the case, Microsoft would completely redesign the Xbox720.
 

artist

Banned
I wouldn't remove the RAM from the equation; the next-gen consoles will probably have 2GB of GDDR3. If the peak consumption is around 115W, think of adding components up to some max TDP. If you had a 150W budget, this would leave 35W for the rest of the system (CPU, optical drive, HDD, USB, WiFi, etc.). That's probably enough. If the system power budget is 200W, then it's probably doable.

I'm thinking we'll see something along the lines of an 80W GPU (what would be 7830 levels), a 45W CPU, and about 20-30W for everything else: ~150W total system draw, less than the launch consoles of this gen, but still up there.
The reason why I removed GDDR5's power consumption from the 7870's TDP is to get the TDP for the GPU alone. Kind of like how you got the 80W GPU figure, I wanted to get that number for Pitcairn.
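Spelling the budget arithmetic from the two posts above out (a minimal sketch; the 175W card TDP and 20W GDDR5 figure are the ones quoted in the thread, and the 80/45W split is the poster's estimate):

```python
# Power-budget arithmetic from the posts above.
card_tdp = 175                 # desktop 7870 board power (W)
gddr5_power = 20               # conservative estimate for the memory chips (W)
gpu_only_tdp = card_tdp - gddr5_power   # ~155W for the GPU die alone

system_budget = 150            # hypothetical console power budget (W)
gpu_budget = 80                # cut-down/downclocked GPU ("7830 level")
cpu_budget = 45
everything_else = system_budget - gpu_budget - cpu_budget   # 25W left over

print(f"GPU die alone: ~{gpu_only_tdp}W")
print(f"Left for optical/HDD/USB/WiFi etc.: ~{everything_else}W")
```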

If that were the case, Microsoft would completely redesign the Xbox720.
You mean ditch Turks for Pitcairn? Too late; it's a completely different architecture and would require a bigger PSU and also a more significant HSF (compared to the one required for Turks).
 

artist

Banned
You're seriously overestimating the power draw of the RAM modules and completely neglecting the fact that they still need to be included in the console anyway. You can't base results on cherry-picked chips that run undervolted in a few inconclusive tests. Even the runt of the litter needs to hit your performance/power targets when you're designing silicon for a mass-market device.

I read during the 58xx release period that the power draw of GDDR5 was around 20W for 8 chips, I'm still looking for it and will update the first post if I find it.
I remembered incorrectly.

It was the 48xx and the power consumption of 8 GDDR5 chips was almost 40W. So my estimate of 20W is still quite conservative :)
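Breaking that recalled figure down per chip (just the arithmetic, nothing new):

```python
# Per-chip breakdown of the 48xx-era figure recalled above.
total_power_w = 40      # ~40W for 8 GDDR5 chips (as remembered)
chips = 8
per_chip_w = total_power_w / chips      # ~5W per chip

# Subtracting only 20W from the 7870's board TDP therefore assumes the
# newer GDDR5 draws roughly half as much per chip.
print(f"~{per_chip_w:.1f}W per GDDR5 chip")
```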
 

artist

Banned
There's a problem with heat, I think. Your "heat chart" is based on a model with a 2-slot GPU.
Now, let's go back to 2005. The PS3/Xbox 360 were big, had heat problems, and were based on single-slot GPUs:

[image: ATI X1800 GTO (ati_x1800gto_front.jpg)]
Custom versions from AIB partners will come in 1 slot eventually ;) (w/o watercooling)
Called it.

[image: Ls0ar.jpg]
 

SapientWolf

Trucker Sexologist
Rumors are saying 7970m, which has a 100W TDP versus the 175W TDP on the desktop 7870. So where did the 75W go? Lower clocks?
 
Rumors are saying 7970m, which has a 100W TDP versus the 175W TDP on the desktop 7870. So where did the 75W go? Lower clocks?

It's a laptop part, so they take extra measures to lower the TDP. On desktop parts they're not as worried about power usage, since those run on high-wattage power supplies. If they wanted to take the extra measures to lower the TDP on the desktop part, they could, but it's a waste of money to do that for obvious reasons.
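The rough scaling behind that (a sketch; the clocks are the real 7870/7970M clocks, the voltages are assumptions I'm plugging in for illustration): dynamic power goes roughly with frequency times voltage squared, so a modest clock drop plus the lower voltage it allows sheds a disproportionate chunk of TDP.

```python
# Why a lower-clocked, lower-voltage mobile bin sheds so much TDP:
# dynamic power scales roughly with frequency * voltage^2.
# Clocks are the real 7870 (GHz Edition) and 7970M clocks; the voltages
# are assumptions for illustration only.
desktop_clock_mhz, desktop_v = 1000, 1.20
mobile_clock_mhz, mobile_v = 850, 1.00

scale = (mobile_clock_mhz / desktop_clock_mhz) * (mobile_v / desktop_v) ** 2
print(f"~{scale:.0%} of desktop dynamic power")   # ~59% of 175W is ~103W
```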
 

artist

Banned
http://en.wikipedia.org/wiki/Comparison_of_AMD_graphics_processing_units

Based on that wiki, the 7970M has a TDP of 75W @ 850MHz with 20 compute units (64 shader cores per CU) = 1280 shader cores.

The PS4 is rumored to have 18 compute units @ 800MHz plus some compute units for the CPU. I'm thinking they have 20 compute units, with 2 compute units (128 shader cores) set aside for HSA functions but accessible if needed for graphics work.
Or the more obvious reason for settling on 18 CUs seems to be higher yields and budgeting the TDP.
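Running the numbers being thrown around (GCN packs 64 shader cores per CU and does 2 FLOPs per shader per clock; the 18 CU / 800MHz figures are the rumours quoted above, not confirmed specs):

```python
# Quick arithmetic on the CU counts discussed above.
def gcn_gflops(compute_units, clock_mhz):
    shaders = compute_units * 64             # 64 shader cores per GCN CU
    return shaders * 2 * clock_mhz / 1000    # 2 FLOPs/shader/clock -> GFLOPS

print(gcn_gflops(20, 850))   # 7970M: 1280 shaders -> 2176 GFLOPS
print(gcn_gflops(18, 800))   # rumoured PS4 GPU: 1152 shaders -> ~1843 GFLOPS
```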
 