
PS4 Rumors: APU code-named 'Liverpool', Radeon HD 7970 GPU, Steamroller CPU, 16GB Flash


LastNac

Member
I'm sure the majority of us hate this thought, but the idea of just a pure gaming machine is long dead, IMO.

Depends on what you consider a "pure gaming machine". Hell, technically we haven't had a "pure gaming machine" for a while, given that the PS2 and Xbox both played DVDs. I don't think any of us are looking for exclusivity in what it runs. Keep in mind, there is "pure" and then there is predominant, and I don't see why we can't get a machine with a predominant focus on what many of us bought it to do in the first place.
 

Globox_82

Banned
LOL he's playing with you. Double precision would make that console $5999

PS3: 255.2 GFLOPS (24, 2 x 4-way ALUs (pixel)+ 8, 5-way ALUs (vertex) @ 550MHz)

Nvidia flops lol.

How do you report someone? I thought NeoGAF was strict when it comes to trolling.

Thanks for the reply.
 
How the next-gen consoles compare, the story so far:

Xbox Orange - Wii U - PlayStationForever

GPU: ??? - Radeon e6760 - Radeon HD 7970
CPU: ??? - Tri-core Wii++ CPU - Quad-core AMD
RAM: ??? - 2GB (of which 1GB available to developers)+32MB eDRAM - 2GB (as of now, might increase.)

The Wii U GPU has around 500-600 Gflops, the PSF GPU has about 1800 Gflops - or roughly 3x better.
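The GFLOPS figures in the comparison above follow from simple back-of-the-envelope arithmetic: each AMD stream processor can do one fused multiply-add (2 FLOPs) per clock. A minimal sketch; the E6760 numbers are its published specs, while the shader count and clock for the rumored PSF part are assumptions chosen to land near the ~1800 GFLOPS figure:

```python
def gpu_gflops(stream_processors, clock_mhz):
    # Peak single-precision GFLOPS: one FMA (2 FLOPs) per stream processor per cycle.
    return stream_processors * clock_mhz * 2 / 1000.0

wii_u_like = gpu_gflops(480, 600)    # Radeon E6760: 576 GFLOPS (the "500-600" above)
psf_like = gpu_gflops(1152, 781)     # assumed cut-down 7970-class part -> ~1800 GFLOPS
retail_7970 = gpu_gflops(2048, 925)  # retail HD 7970 for comparison: ~3789 GFLOPS

print(wii_u_like, round(psf_like), round(psf_like / wii_u_like, 1))  # 576.0 1799 3.1
```

The same formula reproduces the "roughly 3x" gap quoted above.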
 

Globox_82

Banned
How the next-gen consoles compare, the story so far:

Xbox Orange - Wii U - PlayStationForever

GPU: ??? - Radeon e6760 - Radeon HD 7970
CPU: ??? - Tri-core Wii++ CPU - Quad-core AMD
RAM: ??? - 2GB (of which 1GB available to developers)+32MB eDRAM - 2GB (as of now, might increase.)

The Wii U GPU has around 500-600 Gflops, the PSF GPU has about 1800 Gflops - or roughly 3x better.

If the rumor is true.
Didn't Sony say they aim for 10x? So it would be around 2,500 GFLOPS, no?
BTW, what about the 720?
 
How the next-gen consoles compare, the story so far:

Xbox Orange - Wii U - PlayStationForever

GPU: ??? - Radeon e6760 - Radeon HD 7970
CPU: ??? - Tri-core Wii++ CPU - Quad-core AMD
RAM: ??? - 2GB (of which 1GB available to developers)+32MB eDRAM - 2GB (as of now, might increase.)

The Wii U GPU has around 500-600 Gflops, the PSF GPU has about 1800 Gflops - or roughly 3x better.

Xbox Orange?
 

Meelow

Banned
If the rumor is true.
Didn't Sony say they aim for 10x? So it would be around 2,500 GFLOPS, no?
BTW, what about the 720?

Sony never said how much more powerful the PS4 will be; they just said they want the PS3's successor to have noticeably better-looking games than the PS3. With the Xbox 720 it's in the middle: the GFLOPS are said to be 1200 (correct me if I'm wrong).
 

thuway

Member
If the rumor is true.
Didn't Sony say they aim for 10x? So it would be around 2,500 GFLOPS, no?
BTW, what about the 720?

This is what I'm hoping for. The performance offered by the parts in the Radeon 8000 series is truly something to behold. The 8870 yields FOUR teraflops of power, which would put this beyond next-gen and into next gen and a bit more.
 

Globox_82

Banned
This is what I'm hoping for. The performance offered by the parts in the Radeon 8000 series is truly something to behold. The 8870 yields FOUR teraflops of power, which would put this beyond next-gen and into next gen and a bit more.

Sadly, probably too much. Wet dream. I mean, 4 TFLOPS, that thing would cost $999 or more, correct?
EDIT: Besides, isn't it too late for the 8000 series if consoles are launching fall 2013? Production and testing must begin at least a year earlier.
 

Globox_82

Banned
[Image: amd-radeon-8000-series-1.jpg]


How is it possible for newer cards to be cheaper and yet way better on paper? Makes no sense. The one that is $199 sounds cheap enough to end up in a console.
 

thuway

Member
Sadly, probably too much. Wet dream. I mean, 4 TFLOPS, that thing would cost $999 or more, correct?

It's not about cost. The prices for the rumored cards are pretty reasonable. The real issue is thermal/power envelope. The other question is long term reliability. An environment with that much heat, electricity, and size is not healthy for the parts.

Still, this has been such a long wait between cycles. Jack Tretton is famously quoted as saying, "If you can build a better machine... it's better than rushing to market.... The number one goal is to be the best machine."

http://www.youtube.com/watch?v=RHKkoKVthck#t=6m01s
 

thuway

Member
[Image: amd-radeon-8000-series-1.jpg]


How is it possible for newer cards to be cheaper and yet way better on paper? Makes no sense. The one that is $199 sounds cheap enough to end up in a console.

Son, Sony and Microsoft don't pay retail for the parts; in fact they don't even buy the parts. They buy the manufacturing process and do their own magic.
 

Globox_82

Banned
Son, Sony and Microsoft don't pay retail for the parts; in fact they don't even buy the parts. They buy the manufacturing process and do their own magic.

You didn't reply whether it is too late to use the 8000 series, in case they can, if consoles are coming next year?
 

thuway

Member
You didn't reply whether it is too late to use the 8000 series, in case they can, if consoles are coming next year?

No one knows, but I'd bet my bottom dollar both Sony and Microsoft knew of the 8000 series, maybe even 9000 series cards, before they decided to commit to any sort of timeline. If Sony truly wants a 10X increase over PS3, there is no way around a 2.5 teraflop card with 4 gigs of RAM.
 
No one knows, but I'd bet my bottom dollar both Sony and Microsoft knew of the 8000 series, maybe even 9000 series cards, before they decided to commit to any sort of timeline. If Sony truly wants a 10X increase over PS3, there is no way around a 2.5 teraflop card with 4 gigs of RAM.

But straight up raw power is only one way of achieving that 10x. It could be through efficiency gains on top of raw power. In other words 7-8x raw power (in this case FLOPs) + efficiency gains in newer hardware = 10x.
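The "raw power plus efficiency" argument above is just one division. A sketch with assumed numbers: the ~250 GFLOPS RSX ballpark used in this thread and a hypothetical 30% per-FLOP efficiency gain from newer hardware:

```python
ps3_gpu_gflops = 250.0    # ballpark RSX figure used in this thread
target = 10.0             # Sony's rumored "10x" goal
efficiency_gain = 1.3     # assumed: newer hardware does ~30% more work per FLOP

raw_needed = ps3_gpu_gflops * target / efficiency_gain
print(round(raw_needed))  # ~1923 raw GFLOPS, i.e. only ~7.7x raw power, for an effective 10x
```

Under those assumptions, ~1800-1900 raw GFLOPS would already be an effective 10x, which is why the rumored spec and the "10x" statement aren't necessarily in conflict.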
 
Sony would only need to get a GPU about the power of an HD 5870 if we're talking about simple flops. The modern GPUs have a tremendous amount of computational power and it's been that way since the first DX11 cards showed up.

This is just by theoretical FLOPS; real-world performance in games would be off the charts. GPU tech has advanced so far that they don't even need ultra over-the-top hardware in consoles anymore. I'd be satisfied with Sony or M$ using something as lowly as the 7850, because these cards pack so much power that hasn't even been tapped yet, since the consoles have kept the bar very low.

People think next gen isn't going to be a significant upgrade, but if the rumors are true and we get 7970-level GPUs in next gen, the lighting/physics and special shader effects are going to be a monumental leap forward.
 

androvsky

Member
GPU SPU's might be able to. But it would take several SPU's to emulate one SPE in the Cell... which would likely cause latency...

Yeah, it's pretty much impossible to split up emulation of a single thread into multiple threads*, since in a single thread future calculations will often rely on previous results, so one core will have to wait for the other to finish anyway. When you see regular emulators using threads, it's typically when they're emulating hardware that operates independently anyway, such as the VUs on a PS2.

missile made a fascinating post a little while back about swapping out microcode on a modern processor to handle instructions from a different architecture. In a case like that, emulating the 360's CPU or the PPC core from the PS3 would be trivial as long as the cache controller is updated to handle the different endian order between PPC and x86. That still leaves the matter of the SPUs. I was thinking if the PS4 has at least 7 cores, it could have the microcode on six of them handle SPU instructions, and have the L2 cache make it look like the local stores. They'd need to be pretty beefy cores that can handle lots of vector processing. It could do fetches to main memory when the SPUs do a DMA request, and... it'd be a big job for AMD. I guess if they wanted the design win bad enough they might do it, and it'd be a good selling point to draw Sony away from using the Cell again (as had been rumored after they gave up on Larrabee). Now that I think about it, Sony did something similar with the Vita, since iirc an exec mentioned they added a few extra instructions to the CPU to make PSP emulation easier.

Still, it'd pretty much have to be a 7+ core CPU with beefy vector units, but not a GPU (unless the compute units are monsters).


*There is out-of-order execution, but that requires a lot of hardware logic and probably isn't practical in software emulation, as all the time spent splitting up instructions and checking dependencies would slow things down much more than you'd get back from the extra cores. There's also the case of when the emulator knows exactly what it's executing it can split it up into threads (assuming it's running an algorithm that can be split up), but in those cases it's probably better to just run a native library that does the same job.
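The endianness point above is concrete: the 360's CPU and the PS3's PPU are big-endian PPC, while x86 is little-endian, so a pure software emulator has to byte-swap on every memory access; doing it in the cache controller, as suggested, would make it free. A minimal illustration (the function name is made up for this sketch):

```python
import struct

def ppc_load_word(memory, addr):
    # Big-endian 32-bit load, as emulated PPC code expects on a little-endian x86 host.
    return struct.unpack_from(">I", memory, addr)[0]

mem = bytes([0x12, 0x34, 0x56, 0x78])
print(hex(ppc_load_word(mem, 0)))  # 0x12345678; a naive little-endian read gives 0x78563412
```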
 
So, judging by the past couple of pages, is it safe to assume we are going to have another generation of HD twins?

I mean, unless of course Sony wants to go out swinging and decides to throw a Tesla chip in with an i7, a couple gigs of RAM, and a GTX 790 just to really go into the red!

Yes, I am aware that things are pointing to them going with AMD. I just am firing off the names of components that come to my head first.
 

TheD

The Detective
Yeah, it's pretty much impossible to split up emulation of a single thread into multiple threads*, since in a single thread future calculations will often rely on previous results, so one core will have to wait for the other to finish anyway. When you see regular emulators using threads, it's typically when they're emulating hardware that operates independently anyway, such as the VUs on a PS2.

missile made a fascinating post a little while back about swapping out microcode on a modern processor to handle instructions from a different architecture. In a case like that, emulating the 360's CPU or the PPC core from the PS3 would be trivial as long as the cache controller is updated to handle the different endian order between PPC and x86. That still leaves the matter of the SPUs. I was thinking if the PS4 has at least 7 cores, it could have the microcode on six of them handle SPU instructions, and have the L2 cache make it look like the local stores. They'd need to be pretty beefy cores that can handle lots of vector processing. It could do fetches to main memory when the SPUs do a DMA request, and... it'd be a big job for AMD. I guess if they wanted the design win bad enough they might do it, and it'd be a good selling point to draw Sony away from using the Cell again (as had been rumored after they gave up on Larrabee). Now that I think about it, Sony did something similar with the Vita, since iirc an exec mentioned they added a few extra instructions to the CPU to make PSP emulation easier.

Still, it'd pretty much have to be a 7+ core CPU with beefy vector units, but not a GPU (unless the compute units are monsters).


*There is out-of-order execution, but that requires a lot of hardware logic and probably isn't practical in software emulation, as all the time spent splitting up instructions and checking dependencies would slow things down much more than you'd get back from the extra cores. There's also the case of when the emulator knows exactly what it's executing it can split it up into threads (assuming it's running an algorithm that can be split up), but in those cases it's probably better to just run a native library that does the same job.

The idea of using different microcode is all well and good, bar the problem that the new CPU will not run at exactly the same speed (faster or slower) as the original CPU whose code it is trying to run, thus breaking the code or giving different performance.
 

androvsky

Member
The idea of using different microcode is all well and good, bar the problem that the new CPU will not run at exactly the same speed (faster or slower) as the original CPU whose code it is trying to run, thus breaking the code or giving different performance.

Emulators aren't perfect either, but shipping game code is usually solid enough to handle some minor variances, especially for modern games that have to deal with multi-threading and OS interruptions anyway. It still wouldn't be trivial (see the Xbox 360 S having to purposefully slow down a connection to avoid breaking software), but it'd be a lot less impossible than pure software emulation.
 

squidyj

Member
So, judging by the past couple of pages, is it safe to assume we are going to have another generation of HD twins?

I mean, unless of course Sony wants to go out swinging and decides to throw a Tesla chip in with an i7, a couple gigs of RAM, and a GTX 790 just to really go into the red!

Yes, I am aware that things are pointing to them going with AMD. I just am firing off the names of components that come to my head first.

Sony secretly develops memristor drive, ENTIRE HARD DRIVE IS MEMORY, WOAHHHHHHHHHHHHHHH
 
Yeah, it's pretty much impossible to split up emulation of a single thread into multiple threads*, since in a single thread future calculations will often rely on previous results, so one core will have to wait for the other to finish anyway. When you see regular emulators using threads, it's typically when they're emulating hardware that operates independently anyway, such as the VUs on a PS2.

missile made a fascinating post a little while back about swapping out microcode on a modern processor to handle instructions from a different architecture. In a case like that, emulating the 360's CPU or the PPC core from the PS3 would be trivial as long as the cache controller is updated to handle the different endian order between PPC and x86. That still leaves the matter of the SPUs. I was thinking if the PS4 has at least 7 cores, it could have the microcode on six of them handle SPU instructions, and have the L2 cache make it look like the local stores. They'd need to be pretty beefy cores that can handle lots of vector processing. It could do fetches to main memory when the SPUs do a DMA request, and... it'd be a big job for AMD. I guess if they wanted the design win bad enough they might do it, and it'd be a good selling point to draw Sony away from using the Cell again (as had been rumored after they gave up on Larrabee). Now that I think about it, Sony did something similar with the Vita, since iirc an exec mentioned they added a few extra instructions to the CPU to make PSP emulation easier.

Still, it'd pretty much have to be a 7+ core CPU with beefy vector units, but not a GPU (unless the compute units are monsters).


*There is out-of-order execution, but that requires a lot of hardware logic and probably isn't practical in software emulation, as all the time spent splitting up instructions and checking dependencies would slow things down much more than you'd get back from the extra cores. There's also the case of when the emulator knows exactly what it's executing it can split it up into threads (assuming it's running an algorithm that can be split up), but in those cases it's probably better to just run a native library that does the same job.

I still think the ideas that Jeff put forward (based on patents) are brilliant. It's supposed to have 4 Jaguar cores, but it COULD have 2 Jaguar cores + 2 PPU/SPE modules to emulate the Cell, while the GPU can easily emulate the RSX. The Cell would allow Blu-ray and 4K movies/streaming without needing to use the electricity-hogging GPU.
 

StevieP

Banned
I still think the ideas that Jeff put forward (based on patents) are brilliant. It's supposed to have 4 Jaguar cores, but it COULD have 2 Jaguar cores + 2 PPU/SPE modules to emulate the Cell, while the GPU can easily emulate the RSX. The Cell would allow Blu-ray and 4K movies/streaming without needing to use the electricity-hogging GPU.

This description makes it sound a LOT more trivial than it really is, lol. Even if those parts were involved, I mean. The design paradigm of RSX differs greatly even from that of modern Nvidia GPUs, for example, let alone an ATI GCN part.
 

DieH@rd

Banned
[Image: amd-radeon-8000-series-1.jpg]


How is it possible for newer cards to be cheaper and yet way better on paper? Makes no sense. The one that is $199 sounds cheap enough to end up in a console.


The Radeon 7xxx/GeForce 6xx series brought a big price increase. This alleged price drop will not bring prices down to the level we had before for "middle class" cards, but it's a nice start.
 
LOL he's playing with you. Double precision would make that console $5999

PS3: 255.2 GFLOPS (24, 2 x 4-way ALUs (pixel)+ 8, 5-way ALUs (vertex) @ 550MHz)

Nvidia flops lol.

How many flops is Cell?

So Sony plans on supporting the PS3 with game releases till 2015, and further into the future if games sell well on the system. The updated PS3 is going for a street price of 270 USD. Some are speculating that the PS4 will have an 8000-series chipset.

Based on the above paragraph I am assuming that the PS4 will launch in 2014.

My only question now is: can the new specs support software emulation of the Cell/RSX? Previous specs said no, but have things changed enough to allow software-based BC?

Well, according to Jeff it may be possible if they include this 1PPU4SPU co-processor in addition to the GPU/CPU SoC. According to Jeff, Sony filed a patent for a 1PPU4SPU back in 2010. There's been no indication that they plan to use it in anything, though; in fact all the rumors point to them abandoning Cell tech altogether. So who knows. Jeff has brought up a lot of good points to support this 1PPU4SPU, though.

I have a question. Would they have any trouble emulating the RSX? Besides the fact it's Nvidia and they're rumored to go with AMD now. There's no eDRAM like the PS2, so they shouldn't have any bandwidth concerns this time around, correct? Essentially, if they had this 1PPU4SPU they'd be in the clear, right? There are no problems on the GPU side of things?
 

Panajev2001a

GAF's Pleasant Genius
How many flops is Cell?



Well, according to Jeff it may be possible if they include this 1PPU4SPU co-processor in addition to the GPU/CPU SoC. According to Jeff, Sony filed a patent for a 1PPU4SPU back in 2010. There's been no indication that they plan to use it in anything, though; in fact everything points to them abandoning Cell tech altogether. So who knows. Jeff has brought up a lot of good points to support this 1PPU4SPU, though.

Each SPU can process up to 4 fused multiply-adds per cycle (and so can the VMX unit on the PPU), which means 8 FP ops per cycle * 3.2 GHz * 8 (7 SPU's + 1 PPU) = 204.8 GFLOPS at single precision (DP is considerably less than that on consumer CELL CPU's), although games can use more like 179.2 GFLOPS since one of the SPU's is used by the OS... still, it is not like it is always twiddling its thumbs... often the OS is doing things that the apps would have to do by themselves.
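The arithmetic above, spelled out as a quick sanity check:

```python
fp_ops_per_cycle = 8   # 4-wide single-precision FMA = 8 FLOPs per cycle
clock_ghz = 3.2
simd_units = 8         # 7 active SPUs + the PPU's VMX unit

peak = fp_ops_per_cycle * clock_ghz * simd_units          # GFLOPS
usable = fp_ops_per_cycle * clock_ghz * (simd_units - 1)  # one SPU reserved for the OS
print(round(peak, 1), round(usable, 1))  # 204.8 179.2
```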
 
The growing consensus seems to be that they're not. Wii U gave them another year to milk this gen.

I've been predicting a 2014 launch for the PS4 since last May, but isn't the next Xbox still rumored for 2013?

Each SPU can process up to 4 fused multiply-adds per cycle (and so can the VMX unit on the PPU), which means 8 FP ops per cycle * 3.2 GHz * 8 (7 SPU's + 1 PPU) = 204.8 GFLOPS at single precision (DP is considerably less than that on consumer CELL CPU's), although games can use more like 179.2 GFLOPS since one of the SPU's is used by the OS... still, it is not like it is always twiddling its thumbs... often the OS is doing things that the apps would have to do by themselves.

Thx. So in other words the PS3 has around 420 total GFLOPS usable for games (CPU + GPU).

How much is Xenon? Just wondering, because Cell is supposed to be a monster in FLOPS, as that is its main forte.
 

Panajev2001a

GAF's Pleasant Genius
Yeah, it's pretty much impossible to split up emulation of a single thread into multiple threads*, since in a single thread future calculations will often rely on previous results, so one core will have to wait for the other to finish anyway. When you see regular emulators using threads, it's typically when they're emulating hardware that operates independently anyway, such as the VUs on a PS2.

missile made a fascinating post a little while back about swapping out microcode on a modern processor to handle instructions from a different architecture. In a case like that, emulating the 360's CPU or the PPC core from the PS3 would be trivial as long as the cache controller is updated to handle the different endian order between PPC and x86. That still leaves the matter of the SPUs. I was thinking if the PS4 has at least 7 cores, it could have the microcode on six of them handle SPU instructions, and have the L2 cache make it look like the local stores. They'd need to be pretty beefy cores that can handle lots of vector processing. It could do fetches to main memory when the SPUs do a DMA request, and... it'd be a big job for AMD. I guess if they wanted the design win bad enough they might do it, and it'd be a good selling point to draw Sony away from using the Cell again (as had been rumored after they gave up on Larrabee). Now that I think about it, Sony did something similar with the Vita, since iirc an exec mentioned they added a few extra instructions to the CPU to make PSP emulation easier.

Still, it'd pretty much have to be a 7+ core CPU with beefy vector units, but not a GPU (unless the compute units are monsters).


*There is out-of-order execution, but that requires a lot of hardware logic and probably isn't practical in software emulation, as all the time spent splitting up instructions and checking dependencies would slow things down much more than you'd get back from the extra cores. There's also the case of when the emulator knows exactly what it's executing it can split it up into threads (assuming it's running an algorithm that can be split up), but in those cases it's probably better to just run a native library that does the same job.

I still think that Sony will not offer software BC for PS3 titles, while they will offer it for PSOne titles and maybe PSTwo titles. I think they might offload the cost onto consumers by selling HW BC kits like in some patents they filed for a while ago. If people want it, fine... if the demand is low, fine too. It could be a good solution, although people would bitch about it anyways.
 
I still think that Sony will not offer software BC for PS3 titles, while they will offer it for PSOne titles and maybe PSTwo titles. I think they might offload the cost onto consumers by selling HW BC kits like in some patents they filed for a while ago. If people want it, fine... if the demand is low, fine too. It could be a good solution, although people would bitch about it anyways.

Yeah, I was suggesting this a while ago. It could be very profitable for Sony if they could sell a BC add-on for $99 or less. A lot of people would bitch, but I would be all for it if it led to higher specs and BOM for the PS4. Maybe they could add some benefits to justify it too, like higher framerates.

A lot of people were saying they wouldn't be able to manufacture something like this at that price, though. IMO they would need around a 25%-40% profit margin on such an accessory, so ~$75 would be the max it should cost to make.
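The margin arithmetic behind that ~$75 ceiling, for what it's worth (the $99 price and margin range are the post's own assumptions):

```python
retail_price = 99.0
for margin in (0.25, 0.40):
    max_cost = retail_price * (1 - margin)      # max BOM that preserves the margin
    print(margin, round(max_cost, 2))           # 74.25 at 25%, 59.4 at 40%
```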
 

Panajev2001a

GAF's Pleasant Genius
I've been predicting a 2014 launch for the PS4 since last May, but isn't the next Xbox still rumored for 2013?



Thx. So in other words the PS3 has around 420 total GFLOPS usable for games (CPU + GPU).

How much is Xenon? Just wondering, because Cell is supposed to be a monster in FLOPS, as that is its main forte.

Xenon has three VMX-128 units (this refers to the number of vector registers, which is 128 as opposed to 32 IIRC in regular VMX, so 128x128 bits vs 32x128 bits... SPU's have a general-purpose register file of 128x128 bits each).

So, 8 FP ops per cycle * 3.2 GHz * 3 = 76.8 GFLOPS on the CPU side of things.

You could also add some FLOPS from the FPU unit (peak of 2 FP ops per cycle) in each of the Xenon cores and in the PPU (although I am not sure how much you could dual-issue VMX instructions and FPU instructions, still...):

Xenon's new peak = 76.8 + (2 * 3.2 * 3) = 96 GFLOPS.

CELL CBE new peak = 211.2 GFLOPS.
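Panajev's Xenon-vs-Cell figures above reduce to the same per-cycle arithmetic; a quick sketch reproducing them:

```python
fp_ops, ghz = 8, 3.2

xenon_vmx = fp_ops * ghz * 3   # three VMX-128 units -> 76.8 GFLOPS
xenon_fpu = 2 * ghz * 3        # scalar FPU in each of the three cores
xenon_peak = xenon_vmx + xenon_fpu

cell_simd = fp_ops * ghz * 8   # 7 SPUs + PPU VMX, from the earlier post
cell_fpu = 2 * ghz * 1         # the PPU's scalar FPU
cell_peak = cell_simd + cell_fpu

print(round(xenon_peak, 1), round(cell_peak, 1))  # 96.0 211.2
```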
 

Panajev2001a

GAF's Pleasant Genius
Yea I was suggesting this a while ago. It could be very profitable for Sony, if they could sell a BC add on for $99 or less. A lot people would bitch, but I would be all for it if it led to higher specs and BOM for the PS4. Maybe they could add some benifits to justify it too, like higher framerates.

a lot people were saying they wouldnt be able to manufacture something like this at that price though. IMO they would need to have around a 25%-40% profit margin on such an accessory. So ~$75 would be the max it should cost to make.

No Blu-Ray drive, no I/O features (no WiFi, no Bluetooth, no Ethernet), no TV out, etc... only CPU, GPU, main RAM, and VRAM, the PSU could be external, etc... Such a module would not need a great deal of things.

If they are somehow able to put both CPU and GPU on the same chip they might even try to use a single 512 MB pool of RAM, but it would not be an easy task (probably easier to use two separate pools), still they still have potential manufacturing technology improvements left in both the CPU and the GPU.
 

Panajev2001a

GAF's Pleasant Genius
The actual difference where the consoles are concerned would be less than that though wouldn't it due to the PS3 getting a "7-core" Cell at best?

I am not considering the disabled SPE.
The CELL CPU in PS3 has 9 "cores" (8 active and 1 disabled at manufacturing time). 1 PPE, 7 active SPE's, and 1 disabled SPE.
Each of the active cores has a SIMD unit capable of a peak of 8 FP ops per clock cycle.
 
I am not considering the disabled SPE.
The CELL CPU in PS3 has 9 "cores" (8 active and 1 disabled at manufacturing time). 1 PPE, 7 active SPE's, and 1 disabled SPE.
Each of the active cores has a SIMD unit capable of a peak of 8 FP ops per clock cycle.

I was under the impression it was eight, but with one disabled SPE? Regardless, I guess there would still be a sizeable gulf.
 

Panajev2001a

GAF's Pleasant Genius
I was under the impression it was eight, but with one disabled SPE? Regardless, I guess there would still be a sizeable gulf.

Yes, 8 SPE's with one of them disabled and the PPE.

[Image: LXLbk.png]


I hope the picture is visible.

Edit: the PPU's L2 cache area in that layout is huge, you could have fit almost 2 more SPE's there, lol.
 
I still think that Sony will not offer software BC for PS3 titles, while they will offer it for PSOne titles and maybe PSTwo titles. I think they might offload the cost onto consumers by selling HW BC kits like in some patents they filed for a while ago. If people want it, fine... if the demand is low, fine too. It could be a good solution, although people would bitch about it anyways.
There were many reasons why I speculated Sony would totally redesign the PS3 this refresh; if it is not done this refresh, that implies a roadmap going forward that does not include the PS3 as a separate console.

It all depends on the cost of the PS4, whether PS3 BC can be included economically, and what it would cost Sony to refresh the PS3 to 22nm at the end of its life (a 22nm refresh would be released late 2014). Given Sony's statements that they plan to support the PS3 till 2015, maybe more, a refresh in 2014 doesn't seem likely.

If the target price of the PS4 is close to $299 with a 500 gig hard disk and Sony keeps the PS3 price at $200 because they can't or do not want to spend the money for a complete redesign (that would allow for a cheaper PS3), who would buy a PS3 at $200 when they can get a PS4 at $299?

My mistake, if my speculation is wrong, was in believing that Sony would keep supporting the PS3 as a bottom end lower price game console primarily for the XMB/Browser front end. That they would redesign for a 1 chip SoC they could sell cheaper like Microsoft did with the 360S @ 45nm and is likely going to further reduce the price of the Xbox 360 refresh @ 32nm to be released this year.

Look at the press at the PS3 4K price; everyone is saying Sony is making a big mistake. Will Microsoft make the same mistake or can we expect a cheaper Xbox 360 this holiday season?

We still don't know the power usage of the PS3 4K... that will give us some information, and a tear-down and picture of the PS3 4K motherboard, more. If it's a one-chip, 35-watt-at-the-XMB motherboard design (a complete redesign using modern hardware, or coreless (cheaper to manufacture) and more efficient), then it changes the PS3 endgame.

From NathansFortune, a Sony Software Developer's post on BY3D 5/2011

Yes, I've always thought that the 2009 model is what Sony wanted to launch in 2006. There is definitely another slim in the pipeline before PS3 is EOL'd. It will be 32nm Cell, 28nm RSX, 2x 1Gbit XDR chips at 32nm, 2x 1Gbit GDDR3 at 28nm, and 20GB NAND with an empty 2.5" HDD slot for people who want to upgrade. If Sony really want to monetise they could make it 1.8" and team up with Toshiba to sell standard 1.8" drives with PlayStation branding and a tidy markup.

That would be the final stage of PS3 cost-cutting, and it will give Sony access to the $149/129/99 market for times after PS4 launches.
The thrust of the "speculation" is the same as mine (a cheaper PS3 refresh), but I could find no work by IBM on a 32nm Cell processor, just the Xbox 360 @ 32nm international project, with no mention of the elements in the CPU, just the cache (which could have included SPUs). RSX @ 28nm and Cell @ 32nm, with both coreless, might meet the 35 watts at the XMB for 3rd-tier EPA Goldstar specs.

But there is a statement that Sony was skipping 32nm for Cell, and the IBM LinkedIn employee post mentions work on Cell at 22nm.
 