
Pitcairn (Radeon 7870) is such a good candidate for consoles

artist

Banned
AMD just lifted the embargo on reviews of their latest GPU, Pitcairn, which makes up the Radeon 7800 series. The most striking thing I gathered from the reviews is that this is a tiny chip that consumes 100W less than the GTX580 while pumping out similar graphics power. Here are a few reasons why I think a custom design based on Pitcairn is a good candidate to power the next cycle of consoles from Sony and Microsoft.

A. Die Size

Die size dictates the cost of manufacturing the GPU. The Pitcairn (7870) GPU is 212mm². For reference, Xenos (the X360 GPU) is 190mm² without the daughter die and the RSX is 240mm². Remove the PC-centric logic from Pitcairn (CrossFire, Eyefinity, UVD, PCIe) and it would probably be pretty close to Xenos' die size.

[image: Pitcairn die shot]


So it's definitely in the same ballpark as the previous console GPUs.

B. Power consumption

Peak power consumption is about 115W, and Pitcairn is also the highest performance-per-watt GPU around. Hardware.fr undervolted their card and managed to get the power draw down to only 95W. Now take the 2GB of GDDR5 memory (8 chips, about 20W) out of the equation and the TDP of the GPU alone would be closer to 75W.

While I can't find the exact TDP of Xenos or RSX, good estimates out there put both of them at <100W.
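The power arithmetic above can be sketched as a quick budget (all figures are the thread's estimates, not measured specs; the per-chip GDDR5 split is an assumption):

```python
# Rough Pitcairn power budget, using the estimates quoted above.
peak_board_w = 115      # stock 7870 peak board power (from reviews)
undervolted_w = 95      # Hardware.fr's undervolted result
gddr5_total_w = 20      # ~8 GDDR5 chips at ~2.5 W each (assumption)

gpu_alone_w = undervolted_w - gddr5_total_w
print(f"GPU alone: ~{gpu_alone_w} W")   # -> GPU alone: ~75 W
```

That ~75W figure is what puts a cut-down Pitcairn in the same rough envelope as the <100W estimates for Xenos and RSX.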

C. Heat dissipation

In terms of heat dissipation, the Pitcairn GPU isn't the hottest GPU around and it isn't employing an elaborate cooling setup. The GPU could run slightly cooler (being smaller) with a custom design (without the PC-centric logic) and could also be clocked slightly lower than 1GHz to fit the heat envelope of the required design.



D. Performance

In terms of performance, Pitcairn is overall pretty close to the GTX580 and not really that far off from the 7950, both flagship-class PC GPUs currently available.



Compared to the original Xenos and the RSX, Pitcairn ticks off all the right boxes.



The only (major) unknown is the direction Sony/Microsoft are heading in. Will they emphasize the technical requirements as much as they did in the previous cycle, or will they decide to go the other direction? I hope for the former.

/fanboy wishlist
 

benny_a

extra source of jiggaflops
Very good argument. In the other threads about GPUs in next-gen consoles, the basic argument I've heard from those who say graphics will not increase that much is that power draw would have to be so much higher than current-gen.

If they can increase power draw by 20% over the first models of the current consoles but increase performance by a factor of 5-6 like you show, that would be fantastic.

I'm not at all into hardware, so I need to see some objections.
 

androvsky

Member
Funny thing is the PS3 uses a 7800 series derived GPU, so having the PS4 use a 7800 series GPU would be rather amusing. :)

Yes, I know, different vendors and all that; it's just funny how the two companies' numbering schemes have looped around.
 

mclaren777

Member
I think cost will keep this GPU from being a good candidate.

That, and I personally don't want AMD hardware in the next Playstation.
 

StevieP

Banned
I think cost will keep this GPU from being a good candidate.

That, and I personally don't want AMD hardware in the next Playstation.

You're likely going to get it, however. Pitcairn shaved back a little (clock, units) would indeed make a good candidate.
 

DieH@rd

Banned
I think cost will keep this GPU from being a good candidate.

That, and I personally don't want AMD hardware in the next Playstation.

It's a 210mm² chip. Cost will not be a problem.

But as Charlie said, Sony is going with SoC solution [several of them], there will be no standalone GPU.
 

i-Lo

Member
I see a high probability of a highly customized (OC'd) R7770 or an R7850. The performance difference between the 7870 and 7850 is not substantial but the power savings is (around 19W). Of course, I am only speculating that the paradigm will be set based on power usage. If we can get the performance of an R7870 packed into consoles, I don't think anyone will be unhappy.
 

KKRT00

Member
I think cost will keep this GPU from being a good candidate.

That, and I personally don't want AMD hardware in the next Playstation.

Why don't you want an AMD GPU there?

I don't think the cost is really that high. It's $349 at retail, so custom-made and sold in bulk it shouldn't cost Sony and Microsoft more than $200 at launch. $400 is a reasonable price for a next-gen launch machine, and $200-250 for the rest of the components should be enough.
 

artist

Banned
I dont think that cost is really high, its 349$ at retail, so custom made and sold in bulks shouldnt cost more than 200$ at launch for Sony and Microsoft. 400$ machine is reasonable price for next gen launch and 200-250$ for rest of components should be enough.
It's $349 at retail not because of the cost of manufacturing it, but because AMD feels that's a good price for profit/supply reasons. Wafer costs for 28nm are higher than 40nm, but they will come down as the process matures later this year and should easily be at similar pricing to 40nm around the time frame when the next cycle is about to begin.

In short, I expect a Pitcairn-size GPU to cost about $30-50.
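As a sanity check on that $30-50 figure, here's a back-of-the-envelope dies-per-wafer calculation. The ~$5,000 wafer price and 70% yield are illustrative assumptions, not figures from AMD or the thread:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross dies per wafer using the common edge-loss approximation."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

gross = dies_per_wafer(212)     # Pitcairn is ~212 mm^2
good = int(gross * 0.70)        # assumed 70% yield
cost_per_die = 5000 / good      # assumed ~$5k per 28nm wafer
print(gross, good, round(cost_per_die, 2))   # -> 287 200 25.0
```

Even doubling the assumed wafer price leaves the bare die well under $100, so $30-50 including packaging and test looks plausible once 28nm wafer pricing matures.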
 

THE:MILKMAN

Member
From my point of view it is not about what is technically possible at all.

I'm sure if Sony or MS invested $10 billion they could launch an SLI 580/4GHz 8-core monster with cryo cooling. Sony, especially, can't afford to do this.

I wouldn't rule out something like the Pitcairn if the launch was 2014 but I'm not expecting it.

I'm keeping my expectations low for PS4/Xbox Next: 2-3 times Wii U, if that is indeed 2 times the 360.
 

luffeN

Member
Its a 210mm^2 chip. Cost will not be a problem.

But as Charlie said, Sony is going with SoC solution [several of them], there will be no standalone GPU.

Is there a difference in performance when comparing a SoC solution to a standalone GPU? Can a SoC solution be "as strong as" a standalone GPU without costing more etc.?
 

artist

Banned
But as Charlie said, Sony is going with SoC solution [several of them], there will be no standalone GPU.
Charlie's record with consoles is spotty at best, isn't it? He kept insisting that the next Xbox was Oban when it wasn't... Not saying he's incorrect, but even in his article about the PS4 he said that the SoC was using a GCN-based core, didn't he?
 

StevieP

Banned
Well, there's the 2.5 TFLOPS that Tim Sweeney was looking for. I'm obviously oversimplifying.

Hah! Tim Sweeney.

Tim Sweeney said:
We only need what Moore’s law will readily provide. Compared to current-generation consoles, I’d much like to see roughly 8-12x more CPU performance, 10x higher GPU triangle and rasterizer throughput, and 20x more GPU computational (ALU) throughput. With that degree of leap, I’m confident we can ship next-generation games with graphics on par with the Samaritan real-time demo.

http://www.gamestm.co.uk/interviews...xt-gen-consoles-and-the-future-of-videogames/
 
Charlie's record with consoles is spotty at best, isnt it? He kept insisting that the next Xbox was Oban when it wasnt .. Not saying he's incorrect but even his article about the PS4, he said that the SoC was using a GCN based core, didnt he?

Wait what did Oban end up being?
 
There's a problem with heat, I think. Your "heat chart" is based on a model with a 2-slot GPU.
Now, let's go back to 2005. The PS3/Xbox 360 were big, had heat problems, and they were based on single-slot GPUs:
[images: single-slot X360/PS3-era graphics cards]



I don't see how they could make that:
[image: dual-slot Radeon HD 7870]


fit in a console.


 

StevieP

Banned
Edit: Sorry, GAF crashing led to double post.

Double-edit: GhostTrick, I don't think something QUITE as good as Pitcairn will go in a console (in reality, the most they could top out at would be something between a 7770 and a 7850 in a console-sized box), but even then it would be downclocked from its PC version, among other things. Even with 140W of power consumption, give or take, there's still heat to deal with, as you point out. At slower clocks, this "somewhere between 7770 and 7850" chip wouldn't pour out heat as heavily :)

Triple-edit: yeah, what brain_stew said:

brain_stew said:
No console will have a GPU quite that powerful, though the PS4 shouldn't be too far off.
 

artist

Banned
There's a problem with heat I think. Your "heat chart" is based on a model with a 2-slot GPU.
Now, let's back to 2005. PS3/Xbox360 were big, had heat problem and they were based on single slot GPU:
[images: single-slot X360/PS3-era graphics cards]
Custom versions from AIB partners will come in one slot eventually ;) (w/o watercooling). Besides, you don't need to cool a 5-phase PWM and 8 GDDR5 chips in a console..

Wait what did Oban end up being?
Durango.

I took it that "Oban" was the codename for the SoC and "Durango" the codename for the console itself.
Hmm .. goes to recheck. Thanks.
 

THE:MILKMAN

Member
Charlie's record with consoles is spotty at best, isnt it? He kept insisting that the next Xbox was Oban when it wasnt .. Not saying he's incorrect but even his article about the PS4, he said that the SoC was using a GCN based core, didnt he?

I took it that "Oban" was the codename for the SoC and "Durango" the codename for the console itself.
 

Solid07

Banned
Too expensive. Sony wouldn't want their next-gen console to be 599.99 USD again. We all saw what happened to their launch. Ouch.
 
No console will have a GPU quite that powerful, though the PS4 shouldn't be too far off.

Drop the clockspeed a little and ditch a few compute units and you'll be in the right sort of ballpark, anything over 2 teraflops is unrealistic if Sony/Microsoft use a modern GPU architecture.

You're seriously overestimating the power draw of the RAM modules and completely neglecting the fact that they still need to be included in the console anyway. You can't base results on cherry-picked chips that run undervolted in a few inconclusive tests. Even the runt of the litter needs to hit your performance/power targets when you're designing silicon for a mass-market device.
 

wsippel

Banned
Charlie's record with consoles is spotty at best, isnt it? He kept insisting that the next Xbox was Oban when it wasnt .. Not saying he's incorrect but even his article about the PS4, he said that the SoC was using a GCN based core, didnt he?
He said the chip was called Oban, not the whole console. Chips usually don't have the same codename as the console, and in many cases they even have two codenames (IBM doesn't necessarily use the same codename as Microsoft).
 

pestul

Member
I think a design modified around the 7850 would be fantastic frankly. I think it falls into a really nice performance window.
 
Its a 210mm^2 chip. Cost will not be a problem.

But as Charlie said, Sony is going with SoC solution [several of them], there will be no standalone GPU.

Charlie said it could be a 3D stack: several chips (CPU, GPU and memory) joined together on a common base (a silicon interposer), enabling bigger chips that way than in a normal SoC.
 

artist

Banned
You're seriously over estimating the power draw of the RAM modules and completely neglecting the fact that they still need to be included in the console anyway. You can't base results on cherry picked chips that run undervolted in a few inconclusive tests. Even the runt of the litter needs to hit your performance/power targets when you're designing silicon for a mass market device.
I read during the 58xx release period that the power draw of GDDR5 was around 20W for 8 chips; I'm still looking for it and will update the first post if I find it.

I don't get what you mean in terms of cherry-picked chips. Pretty sure that by the time the next cycle of consoles launches, this should be the norm and not an anomaly.

How does this shit make any sense? Unless the damn PS3 is 2 PS3's duct taped together, people constantly complain it'll be a $599.99 monster.
It doesn't make sense, just like this one, for example:
Hey why stop there! Why not slap a 7970 in the next gen consoles while where at it!
 

DieH@rd

Banned
Hey why stop there! Why not slap a 7970 in the next gen consoles while where at it!

It can be done. Lower the core speed to ~700MHz and you will get a console-viable part that draws ~130W [both for the GPU chip and 2GB of GDDR5 RAM].

But the 7870 is a better [and smaller] solution.
 

SapientWolf

Trucker Sexologist
I think the GPU that goes into next gen consoles will be closer to what's in the notebook segment than the desktop part.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It would be great to have that level of GPU in the next machines but I somehow doubt it.

The OP put a pretty good case together so you never know.
 

squidyj

Member
Too expensive. Sony wouldn't want their next-gen console to be 599.99 USD again. We all saw what happened to their launch. Ouch.

Yeah because that happened because of teh cell and totally not because they wanted to push Blu Ray at a time when it was hella expensive even while having a hardware BC module inside the box. That shit had nothing to do with it.
 

Shai-Tan

Banned
There's a problem with heat I think. Your "heat chart" is based on a model with a 2-slot GPU.
Now, let's back to 2005. PS3/Xbox360 were big, had heat problem and they were based on single slot GPU:

First, consoles have big heatsinks inside. Second, the current consoles already run hot because they were made on old processes: the original PS3 IIRC is 90nm, later ones 65nm and now 45nm; the Xbox 360, same. Current GPUs are also much quieter than they used to be; they run hotter than they otherwise would in order to keep fan RPM down.
 

-viper-

Banned
Yeah because that happened because of teh cell and totally not because they wanted to push Blu Ray at a time when it was hella expensive even while having a hardware BC module inside the box. That shit had nothing to do with it.

If they go with the AMD CPU + GPU route, then I imagine the console will be cheaper.

I think they should aim for $399 or $450 max.
 

DCKing

Member
If Sony and Microsoft will go for $399 large powerhouses again, then the OP is correct. A GPU similar to Pitcairn but clocked lower is the best you can fit in the same envelope as the first Xbox 360.

I don't think Microsoft and Sony will go down that route, however. I think the Cape Verde chips (Radeon HD7750 & HD7770) are the best analogy now, maybe with a bit more stuff on the chip.
 

StevieP

Banned
From Anand today:
Anand said:
The 7870 PCB itself runs 9.5” long, with an additional .25” of shroud overhang bringing the total to 9.75”. Our card is equipped with 8 5GHz 256MB Hynix GDDR5 memory chips, the same 5GHz chips that we saw on the 7700 series. For the 7870 power is provided by a pair of 6pin PCIe power socket, while the sub-150W 7850 uses a single socket. Both cards feature a single CrossFire connector, allowing them to be paired up in a 2-way CrossFire configuration.

http://www.anandtech.com/show/5625/...d-7850-review-rounding-out-southern-islands/2
 
Well, the 7870 is high tier (low high tier) in the 7000 series. It's feasible if the launch of the next gen is 1.5-2 years away, because then it will be closer to true mid-tier, which is where the current gen's cards were when they launched.
 
first, consoles have big heatsinks inside, second the current consoles run hot already because they were made with old processes. the original ps3 iirc is 90nm, later are 65nm and now 45nm. xbox 360, same. current gpu are also much more quiet than they used to be. they run more hot than they would to keep fan rpm down

Current GPUs are only quieter because of the money spent on expensive cooling solutions that aren't viable for a cheap mass market box.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Too expensive. Sony wouldn't want their next-gen console to be 599.99 USD again. We all saw what happened to their launch. Ouch.

Blu-ray drives are cheap now. You're being ridiculous.
 

McHuj

Member
Now take the 2GB of GDDR5 memory or 8 chips out of the equation or about 20W and the TDP of the GPU alone would be closer to 75W.

I wouldn't remove the RAM from the equation; the next-gen consoles will probably have 2GB of GDDR5. If peak consumption is around 115W, think of adding components up to some max TDP. If you had a 150W budget, this would leave 35W for the rest of the system (CPU, optical drive, hard drive, USB, WiFi, etc.). That's probably enough. If the system power budget is 200W, then it's probably doable.

I'm thinking we'll see something along the lines of an 80W GPU (what would be 7830 levels), a 45W CPU and about 20-30W for everything else: ~150W total system draw, less than the launch consoles of this gen, but still up there.
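That breakdown adds up; as a quick sketch (all numbers are the rough estimates from the post above, not spec-sheet values):

```python
# Hypothetical next-gen console power budget (estimates from the post).
budget_w = {
    "GPU (~7830 level)": 80,
    "CPU": 45,
    "everything else": 25,   # optical drive, HDD, USB, WiFi, etc.
}
total_w = sum(budget_w.values())
print(total_w)   # -> 150, below this gen's launch consoles
```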
 

Solid07

Banned
Blu-ray drives are cheap now. You're being ridiculous.

I'm being ridiculous because I want Sony to make money off their consoles from the beginning this time around and get themselves out of their BBB+ credit rating? Yeah. Okay.

Realistically speaking, both MS and Sony will use ~HD6000-series GPU chips. I'm happy with that, considering the 7800GT-like GPU chip in the PS3 is still performing today like an HD 4850/GTX260.
 