
Next-Gen PS5 & XSX |OT| Console Tech Thread

Status
Not open for further replies.
I'm just saying, copying the unified memory arrangement sounds dubious. It was either ESRAM + DDR3 or GDDR5 in a single pool, for both of them.

Designing a console in 2010, with the main requirement being 8GB of RAM, there was only one choice. If MS had known for certain that GDDR5 could deliver 8GB in 2013, they would have gone with that, putting more compute on the chip without raising the BOM.
I merely said they copied the unified GDDR5 setup, which PS4 did first. Where's the disagreement here?

I don't know if XB1 R&D started in 2010, but Cerny had said he has been x86-focused since 2008. Back then, I doubt they had more than 2GB GDDR5 in mind. You'd be surprised if you read 2011 GAF posts where people were saying that 2GB would be more than enough for PS4/XBOX720.

There's also this: http://n4g.com/news/1291403/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gb-s

Gambling/taking calculated risks is part of the business.

Then every single crypto miner would buy the consoles and mine crypto if they offered more power per dollar than buying a card outright. Like the PS3: a lot of people bought those to run non-gaming applications.
Did Sony say that OtherOS/Linux capabilities will make a comeback?

If not, why do you assume that people will use them as mining rigs?

Not to mention that cracking console security doesn't happen quickly these days; it will take years, if ever, to get Linux running on the PS5.

Why do people still think it works like this? Every gen we get this quote.
I guess many people are oblivious about console history.

There was no crypto mania in 2005 with the PS3/X360, and with the X1 and PS4 being underpowered it was not an issue.
There was Folding@Home in case you have forgotten. PS3 even had an official, Sony-sanctioned app.

It was basically the same concept (distributed computing over the internet).

Rest assured though, PCs will always have much more powerful GPGPUs, whether from nVidia or AMD.

There's that Arcturus rumor about a compute-focused monster (128 CUs) with zero rasterization fat (no ROPs/TMUs etc.)
 
If Stadia performs at 10+ TF, why couldn't next gen perform more?

Some time ago, a post (or was it a tweet?) claimed that the PS5 was scoring over 20,000 in the 3DMark Fire Strike benchmark. If we assume that info is correct, the closest card is the 1080 Ti:

https://www.3dmark.com/fs/14278291

The 1080 Ti is said to be around 11TF:

https://www.anandtech.com/show/11172/nvidia-unveils-geforce-gtx-1080-ti-next-week-699

So ... the PS5 could be over 10TF? Maybe? There are a lot of ifs in all of this.
 

R600

Banned
I merely said they copied the unified GDDR5 setup, which PS4 did first. Where's the disagreement here?

I don't know if XB1 R&D started in 2010, but Cerny had said he has been x86-focused since 2008. Back then, I doubt they had more than 2GB GDDR5 in mind. You'd be surprised if you read 2011 GAF posts where people were saying that 2GB would be more than enough for PS4/XBOX720.

There's also this: http://n4g.com/news/1291403/ps4-sony-earlier-thought-about-slow-gddr5-edram-1088gb-s

Gambling/taking calculated risks is part of the business.


Did Sony say that OtherOS/Linux capabilities will make a comeback?

If not, why do you assume that people will use them as mining rigs?

Not to mention that cracking console security doesn't happen quickly these days; it will take years, if ever, to get Linux running on the PS5.


I guess many people are oblivious about console history.


There was Folding@Home in case you have forgotten. PS3 even had an official, Sony-sanctioned app.

It was basically the same concept (distributed computing over the internet).

Rest assured though, PCs will always have much more powerful GPGPUs, whether from nVidia or AMD.

There's that Arcturus rumor about a compute-focused monster (128 CUs) with zero rasterization fat (no ROPs/TMUs etc.)
Because "copying" is like saying Sony copied MS if the PS5 ends up with a 4K BD player. It's not; it would be a decision they made in time (PS4 Pro) for whatever business or design reason they had. If they feel it's the way to go now, it won't be because "they copied MS by including a 4K BD player in the PS5".

What I am saying is, there were technical and business decisions that made MS go with 8GB of DDR3 + ESRAM back in 2013, and with unified memory by the time of the Xbox One X, and it has nothing to do with copying. By the time the X was out, ESRAM was completely unnecessary. It was a good decision for the PS2 and 360, bad for the Xbone. The OG Xbox, for example, had a unified setup like the PS4: a single pool of the fastest available memory.
 
Last edited:

CrustyBritches

Gold Member
Ethereum at $178 with a card that does 50MH/s at 200W, and electricity at $0.12/kWh, is a losing proposition. Even if consoles could mine, spending $500 to lose $15/yr doesn't make sense. Better off with your old mining rigs with Polaris cards.
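For anyone who wants to sanity-check that, here's a rough back-of-envelope sketch using those numbers. The yearly ETH yield per MH/s is a made-up illustrative constant, and pool fees and difficulty growth are ignored:

```python
# Back-of-envelope Ethereum mining profitability.
# ETH_PER_MHS_YEAR is an illustrative assumption, not a real network figure;
# pool fees and difficulty growth are ignored.
ETH_PER_MHS_YEAR = 0.022  # assumed ETH mined per MH/s per year

def annual_mining_profit(hashrate_mhs, power_w, eth_price_usd, usd_per_kwh):
    """Yearly USD profit (negative = loss) for one card or console."""
    revenue = hashrate_mhs * ETH_PER_MHS_YEAR * eth_price_usd
    electricity = (power_w / 1000) * 24 * 365 * usd_per_kwh
    return revenue - electricity

# 50 MH/s at 200W, ETH at $178, power at $0.12/kWh
print(round(annual_mining_profit(50, 200, 178, 0.12), 2))  # roughly -$15/yr
```

With those inputs the electricity alone runs about $210/yr, which is what makes a $500 console a terrible mining buy.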
---
Concerning the FS 20K+ score, now that we have plenty of data for the 5700 series, we know you can get 20K+ with an RX 5700 (non-XT) + Ryzen 5 2600, compared to 22K+ for a 5700 XT + Ryzen 5 3600. Switch to Time Spy for a comparison under a more modern workload, and the RX 5700 + Ryzen 5 2600 performs something like a 2060 + Ryzen 5 3600, or a 2060 Super + Ryzen 5 2600.
 
I've used this same argument. It's true, you will not get a 2080 Ti for a 1060 price. Not only that, but AMD will not devalue their beefier video cards.
The (manufacturing) cost is dictated by the die size:

[image uzkPJ50.png: chart of manufacturing cost vs. die size]


I'm a bit shocked people don't realize there's a huge markup in PC CPUs & GPUs. Even AMD realized it and we saw blatant price gouging with 5700 XT.

Someone had done some calculations and a 7nm 400mm2 die shouldn't cost more than $140.

Is that far-fetched for $499 MSRP and $599 BoM cost? $200 higher BoM than OG PS4, $100 loss that will be recouped by subscriptions.

2013 consoles didn't have an MSRP vs BoM deviation. They were profitable from day 1.

The way I see it is like this:

OG PS5 ~400mm2 (7nm EUV - 2020) -> PS5 Slim ~200mm2 (5nm EUV - 2023) -> PS5 Super Slim ~100mm2 (3nm GAAFET - 2027)

PS6 will probably utilize the same 3nm process as well. It might be the last physical/consumer console, unless a miracle happens.
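That "$140 for a 400mm2 die" claim is easy to sanity-check with a toy cost model. The wafer price and defect density below are illustrative assumptions on my part, not insider figures:

```python
import math

WAFER_PRICE_USD = 12000  # assumed price of a 300mm 7nm wafer
DEFECT_DENSITY = 0.1     # assumed defects per cm^2

def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    """Crude gross-die count: wafer area / die area minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2)
    return int(wafer_area / die_mm2 - edge_loss)

def cost_per_good_die(die_mm2):
    """Cost per working die using a simple Poisson yield model."""
    yield_rate = math.exp(-DEFECT_DENSITY * die_mm2 / 100)  # mm^2 -> cm^2
    return WAFER_PRICE_USD / (dies_per_wafer(die_mm2) * yield_rate)

print(dies_per_wafer(400))            # ~143 gross dies per wafer
print(round(cost_per_good_die(400)))  # lands in the same ballpark as $140
```

With these assumed inputs a 400mm2 die comes out in the low-$100s per good die, so the $140 estimate is at least plausible.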

Because "copying" is like saying Sony copied MS if the PS5 ends up with a 4K BD player. It's not; it would be a decision they made in time (PS4 Pro) for whatever business or design reason they had. If they feel it's the way to go now, it won't be because "they copied MS by including a 4K BD player in the PS5".

What I am saying is, there were technical and business decisions that made MS go with 8GB of DDR3 + ESRAM back in 2013, and with unified memory by the time of the Xbox One X, and it has nothing to do with copying. By the time the X was out, ESRAM was completely unnecessary. It was a good decision for the PS2 and 360, bad for the Xbone. The OG Xbox, for example, had a unified setup like the PS4: a single pool of the fastest available memory.
It seems you're slightly offended by the copying logic, even though it's true.

Yes, successful consoles tend to influence the industry. What's so weird about it? Sony copied XBOX360's unified memory (vs PS3's discrete), but they didn't copy the eDRAM (even though they could).

See what I'm saying? Copy the good stuff, not the bad ones. Nothing wrong with that. It happens all the time.

Regarding PS5 discs, it will have 100GB media not because people care about 4K Blu-Ray (they didn't care on XBOX, since movie streaming is so prevalent), but because next-gen games will need bigger discs. Nothing more, nothing less.

If physical movie discs were a killer feature for a console (they used to be important back in the PS2 era, but not anymore), then you would be right about Sony copying MS in regards to 4K Blu-Ray.

Regarding OG XBOX, it was a bit memory starved due to having to share the same memory pool with the CPU and the GPU. Plain old DDR, there was no GDDR variant back then.
 
The (manufacturing) cost is dictated by the die size:

[image uzkPJ50.png: chart of manufacturing cost vs. die size]


I'm a bit shocked people don't realize there's a huge markup in PC CPUs & GPUs. Even AMD realized it and we saw blatant price gouging with 5700 XT.

Someone had done some calculations and a 7nm 400mm2 die shouldn't cost more than $140.

Is that far-fetched for $499 MSRP and $599 BoM cost? $200 higher BoM than OG PS4, $100 loss that will be recouped by subscriptions.

2013 consoles didn't have an MSRP vs BoM deviation. They were profitable from day 1.

The way I see it is like this:

OG PS5 ~400mm2 (7nm EUV - 2020) -> PS5 Slim ~200mm2 (5nm EUV - 2023) -> PS5 Super Slim ~100mm2 (3nm GAAFET - 2027)

PS6 will probably utilize the same 3nm process as well. It might be the last physical/consumer console, unless a miracle happens.


It seems you're slightly offended by the copying logic, even though it's true.

Yes, successful consoles tend to influence the industry. What's so weird about it? Sony copied XBOX360's unified memory (vs PS3's discrete), but they didn't copy the eDRAM (even though they could).

See what I'm saying? Copy the good stuff, not the bad ones. Nothing wrong with that. It happens all the time.

Regarding PS5 discs, it will have 100GB media not because people care about 4K Blu-Ray (they didn't care on XBOX, since movie streaming is so prevalent), but because next-gen games will need bigger discs. Nothing more, nothing less.

If physical movie discs were a killer feature for a console (they used to be important back in the PS2 era, but not anymore), then you would be right about Sony copying MS in regards to 4K Blu-Ray.

Regarding OG XBOX, it was a bit memory starved due to having to share the same memory pool with the CPU and the GPU. Plain old DDR, there was no GDDR variant back then.
NEAT. My point still stands. Motherboard, CPU, GPU, RAM, 3D sound chip, SSD, embedded SSD, case, cooling, 4K Blu-Ray drive, licensing for all the technologies it will have, labor, shipping, box, retail space. All that is not free. The GPU is not even half of the equation.
 
Last edited:
NEAT. My point still stands. Motherboard, CPU, GPU, RAM, 3D sound chip, SSD, embedded SSD, case, cooling, 4K Blu-Ray drive, licensing for all the technologies it will have, labor, shipping, box, retail space. All that is not free. The GPU is not even half of the equation.
It's hard to read that people don't get that.
 

bitbydeath

Gold Member
Name me a gen where the console surpassed the strongest PC graphics card that was 18 months old?
If the consoles were sold below cost per dollar it would be like

PS3 was the last crazy expensive and exquisite hardware we got. PS4 I believe had the latest of what could go inside a console without burning your house down. As parts get smaller and generate less heat things change back in favour of powerful hardware.
 
PS3 was the last crazy expensive and exquisite hardware we got. PS4 I believe had the latest of what could go inside a console without burning your house down. As parts get smaller and generate less heat things change back in favour of powerful hardware.
But he's right, you're not going to exceed the power of a PC in a console.. ever.
 
NEAT. My point still stands. Motherboard, CPU, GPU, RAM, 3D sound chip, SSD, embedded SSD, case, cooling, 4K Blu-Ray drive, licensing for all the technologies it will have, labor, shipping, box, retail space. All that is not free. The GPU is not even half of the equation.
If BoM is $600, then I think we will get all that.

Why did you mention the 3D sound "chip" separately? That's part of the Navi GPU, not a discrete DSP/chip (like Aureal 3D in the late 90s). Not even RT will be a discrete chip.

But he's right, you're not going to exceed the power of a PC in a console.. ever.
Yes, because PCs will always have a higher TDP ceiling. It wasn't always like that.

Also, 2080 Ti will be more than 2 years old when the PS5 is released:


Sep 2018 vs Nov 2020. 26 months later, not 18.

As soon as Ampere is released, Turing will be old news. Nobody will be impressed anymore.
 

Panda1

Banned
If BoM is $600, then I think we will get all that.

Why did you mention the 3D sound "chip" separately? That's part of the Navi GPU, not a discrete DSP/chip (like Aureal 3D in the late 90s). Not even RT will be a discrete chip.


Yes, because PCs will always have a higher TDP ceiling. It wasn't always like that.

Also, 2080 Ti will be more than 2 years old when the PS5 is released:


Sep 2018 vs Nov 2020. 26 months later, not 18.

As soon as Ampere is released, Turing will be old news. Nobody will be impressed anymore.
Nvidia GeForce RTX 2080 Ti
Best Overall / 4K (When Price is No Object)
GPU: Turing (TU102) | Core Clock: 1,350 MHz | Video RAM: 11GB GDDR6 | TDP: 260 watts


£989.99

 
Nvidia GeForce RTX 2080 Ti
Best Overall / 4K (When Price is No Object)
GPU: Turing (TU102) | Core Clock: 1,350 MHz | Video RAM: 11GB GDDR6 | TDP: 260 watts


£989.99
You're repeating yourself like a broken record, while dismissing previous posts. May I ask why?

[image uzkPJ50.png: chart of manufacturing cost vs. die size]


Also, do you mind addressing my OtherOS/Linux question? Is that coming back in the PS5?
 

Norse

Member
A 2070 Super may be better than a 5700 XT spec-wise, but the 5700 XT still outperforms the 2070 Super in some games. There are plenty of YouTube videos out there that compare them across quite a few games.
 

Farrell55

Banned
Turing uses 12nm FinFET. Roughly the same process as 14-16nm.

7nm offers 3.3x density scaling: 3 times more transistors in the exact same die area:
Wrong!

Only the 7nm HD process scales around 3.3x.
The 7nm HP process that is used for PC/console CPUs and GPUs scales only 2x!

The HD process is used for low-clocked, high-transistor-count mobile parts like Apple's A12 chips.
 
Here is another way to think about it: if the next-gen consoles were only 7-8 TFLOPs, that would put them at the current RX 5700 (non-XT) level. Does anybody here really believe that the next-gen consoles will not be at least as powerful, if not significantly more so, than the current RX 5700 XT (~10 TFLOPs)? Does anyone think Microsoft and Sony would invest billions of dollars to build a machine meant to last 5-7 years and be the future of gaming, and launch it with an entry-level to mid-range GPU released over a year before their box? With the slew of more powerful Navi cards still to launch between now and the end of 2020, they will settle on a small mid-range card from 2019 that is barely a step up from the Xbox One X? Does that even make sense?

"Does anyone think Microsoft and Sony would invest billions of dollars to build a machine meant to last 5-7 years and be the future of gaming, and launch it with an entry-level to mid-range GPU released over a year before their box?"

There is nothing to believe... they have proven it:

PS4 and Xbox one
 
Wrong!

Only the 7nm HD process scales around 3.3x.
The 7nm HP process that is used for PC/console CPUs and GPUs scales only 2x!

The HD process is used for low-clocked, high-transistor-count mobile parts like Apple's A12 chips.
Even in that case, a 754mm2 12nm die will become a 377mm2 die at 7nm. That's smaller than an RTX 2060 (mid-range, RT-enabled card / 445mm2).

EUV increases density by another 15-20%.
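Either way, the shrink arithmetic in this exchange is easy to spell out. The density factors below are the ones claimed in these posts (2x for 7nm HP over 12nm, plus ~15% from EUV), which are forum numbers rather than official foundry figures:

```python
def shrunk_area(die_mm2, density_gain):
    """Die area for the same transistor count after a density improvement."""
    return die_mm2 / density_gain

TU102_MM2 = 754  # RTX 2080 Ti (Turing) die on 12nm

print(shrunk_area(TU102_MM2, 2.0))                # 377.0 mm^2 on 7nm HP
print(round(shrunk_area(TU102_MM2, 2.0 * 1.15)))  # ~328 mm^2 with +15% from EUV
```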
 

Fake

Member


Thoughts?

Disagree. It doesn't really matter when they push Scarlett if the problem isn't the hardware.
The PS5 and Scarlett will have similar hardware, so the difference will be software.
IMO Microsoft needs to get back to its original form of releasing exclusive games in the Xbox garden.
Besides, what does it matter if Game Pass keeps undercutting Scarlett's PR and its chance to shine?
They still hide info about how Game Pass works against their first-party studios' sales.
'Hey, we'd like to show our next-gen Xbox, and today its first game, which will be available on Game Pass as well.'
Doesn't make much sense to me. Next gen will be expensive for sure.
 
Last edited:
Some time ago, a post (or was it a tweet?) claimed that the PS5 was scoring over 20,000 in the 3DMark Fire Strike benchmark. If we assume that info is correct, the closest card is the 1080 Ti:

https://www.3dmark.com/fs/14278291

The 1080 Ti is said to be around 11TF:

https://www.anandtech.com/show/11172/nvidia-unveils-geforce-gtx-1080-ti-next-week-699

So ... the PS5 could be over 10TF? Maybe? There are a lot of ifs in all of this.
Add to that: before Navi came out, there were rumors that it would perform like a 1080 Ti priced as a $250-350 card. Cyberpunk showcased their game with a 1080 Ti. Coincidence? No, that studio is not going to waste money on programming. People should stop assuming that next gen is 7TF (weak) or 13TF (way too much); 9-11TF is a realistic number. Stop dreaming of 2080 Ti performance with RT. In a Pro variant three years from now, sure, why not, but not in the 2020 base model. Check the Halo trailer, what a joke.
 
Also


Thoughts? I somewhat agree. It wouldn't be a full reveal but I think scarlett should have some presence.

I disagree. They will only vaguely hint at Scarlett specs, as you don't want to give away the keys to the castle until 6 months before launch; that way your competitor can't change things. Halo Infinite gameplay makes sense, a Fable teaser makes sense.
 

McHuj

Member
We'll get specs soon enough and we have gotten some bits. What we haven't gotten is actual gameplay of a next-gen game.

If there is one thing that I would want to see at X019, would be some actual next gen gameplay. Be it Halo Infinite or something else, I want to see what a next gen game looks like.
 
We'll get specs soon enough and we have gotten some bits. What we haven't gotten is actual gameplay of a next-gen game.

If there is one thing that I would want to see at X019, would be some actual next gen gameplay. Be it Halo Infinite or something else, I want to see what a next gen game looks like.
.......... Cyberpunk ?
 

DeepEnigma

Gold Member

Scraping the bottom of the barrel using random tweets, eh?

Also this...
Sony strongly hinted that PlayStation 4 games will be playable on the PlayStation 5, so there may be some truth in this rumor.

They did not "strongly hint". They said it will. Multiple times.
 

Tqaulity

Member
"Does anyone think Microsoft and Sony would invest billions of dollars to build a machine meant to last 5-7 years and be the future of gaming, and launch it with an entry-level to mid-range GPU released over a year before their box?"

There is nothing to believe... they have proven it:

PS4 and Xbox one
Thank you, I was waiting for someone to bring this up :) Common misconception (at least for PS4). More education:

Most people assume the PS4 GPU was a scaled-down 7870 desktop part, which released in March 2012 and was only the 2nd-fastest card AMD made (the fastest was the 7970). In fact, the PS4 was actually based on the mobile GPU variant called the 7970M, which initially released on the PC in April 2012. That card was the absolute fastest mobile card AMD produced, and it was in fact based on the full 7870 desktop silicon. If you look at the specs you will see that it is indeed virtually identical to the known PS4 GPU, except the PS4 GPU is underclocked further, from 850 to 800 MHz. Now even though the PS4 did not release until Nov 2013, that 7970M was still the fastest mobile card AMD manufactured at the time of the PS4 release! In fact, if you look at the history, AMD did not make another faster mobile GPU until the R9 M295X, which didn't release until Nov 2014. That was essentially the mobile version of the 7970 desktop card, but it came too late to be included in the PS4 (and was also way too power hungry to be in a console).

So contrary to popular belief, Sony included the absolute best and fastest card available at that time considering AMD's roadmap, the PS4 release schedule, and thermal considerations for the console. In fact, if you know about computer hardware and thermals, then you will know why the PS4 HAD to use the mobile version. In short, AMD GPUs at that time were not very power efficient. GPUs in a console typically need to be < 100W TDP in order to be practical given the thermal and cooling limitations of a console. The desktop 7970 GPU released with a TDP of 250W, which is absolutely not practical for a console. Even the desktop 7870 GPU had a TDP of 175W, which again is not practical given that the total TDP for the console is less than 150W. However, AMD was able to make a mobile variant of the 7870 with a TDP of only 75W, perfect for a console. Again, this was the absolute fastest card that was feasible up until the PS4 released in Nov 2013.

People always want to talk about the PS4 being so "underpowered" at launch, but if you know the facts and what it takes to build a console from a hardware standpoint, Sony did the best they could given what AMD had available. But the focus for the PS4 (and Xbox One) wasn't on pure power; it was on developer ease and convenience. The real change was the move to x86 architecture on the CPU, and again Jaguar was the only thing AMD had available that could "fit" in a console form factor at that time (the AMD desktop parts were way too big and power hungry).

However, AMD is NOT the same company today that it was in 2012/2013. They have a wide range of highly competitive CPUs and will be releasing a wide range of highly competitive GPUs in the coming year. Not just in raw performance, but more importantly in terms of efficiency.

Also, the priorities of the console manufacturers are different this time around. While the PS4 and Xbox One were mostly about removing barriers for developers, both Sony and Microsoft have made it clear that the PS5 and Scarlett will be about pushing the boundaries of gaming. Having announced a GPU with 8K, 120fps, and ray-tracing capabilities says that they will definitely be significantly more powerful than the current RX 5700. If you look back, the PS4 and Xbox One GPUs didn't really offer anything "new" at the time (except for the ACE units on PS4). There were no marketing buzzwords highlighting advanced features; in fact, all Sony could say was that it was a "supercharged" PC.

Microsoft focused on software and UX features for Xbox One as part of their defocus on gaming initially. They paid the price and are hell bent on making sure they are technically competitive from day one with Scarlett. The difference in power between the two will be extremely small, probably smaller than any other previous generation.

So again if you think that the next consoles will ship with AMD Navi GPUs equivalent to the current RX 5700, then you are surely mistaken and you will see in due time!
 
Last edited:
Not really sure why the price of the 2080ti is being brought up. RRP isn't really relevant any more. Yes the 2080ti is $1200 RRP, but that's only really because it has no competition. Literally one generation ago the 1080ti was priced at $700.
Nvidia have been gouging consumers since Pascal buried AMD. The 2080 retailed at the same price as a 1080ti, while offering no performance increase. To buy a card with more performance than a 1080ti you needed to drop $1200. Nvidia was making insane margins. Even current 5700(XT) and 2070 prices are way out of line with what is expected of cards traditionally in that performance tier. They're making as much margin as they can get away with.

For that reason, citing prices of current-gen graphics cards, particularly Nvidia's ridiculously overpriced ones, as a measure for determining the performance bracket of a console due to be released a year from now is meaningless. Console chips are sold by AMD at very thin margins compared to their own graphics cards.

Unless we know exactly how much AMD paying for each wafer at TSMC, and how big the APU die sizes are, what the process yields are and what margins they're charging console manufacturers, we have no way of knowing what the performance limit of a console for a given BOM, could be. This of course doesn't factor in what suppliers are charging Sony and Microsoft for RAM modules, PCB components, VRMs, MOSFETs etc. We don't know how much the plastic case, or the heat sinks or fans cost. We don't know how much R&D $$$ needs to be amortized during the expected lifetime of the products.

One could argue that Sony have an advantage in that all four of their previous home consoles have sold in excess of 85 million units, with three reaching over 100 million. This could give them leverage in negotiating bulk prices for smaller components. Microsoft are a big company and can obviously throw down a lot of dollars for R&D, but make no mistake, they will not be tolerant of a situation where each sold console is losing a shitload of money. The Xbox division needs to be independently profitable, or there at least needs to be a thorough roadmap detailing ROI over the console lifespan that justifies the BOM. The same of course applies to Sony and PlayStation, though PSN makes more money than the entirety of Xbox's and Nintendo's annual income combined, so they might have more room to play with - again assuming they can detail a roadmap that satisfies higher-ups and shareholders that the return will be worth the cost.

Side note: I don't really understand what the obsession with TFLOPs is about. TFLOPs = TFLOPs in the same way a metre is a metre. It's a measurement of theoretical peak FP32 compute performance of a graphics chip. Doesn't matter whether it's Nvidia TFLOPs or Polaris or Vega or Navi a TFLOP is a TFLOP. It measures compute, and while it can give you an idea of how it might perform in gaming, it does not measure it directly.

Imo stuff like memory bandwidth, stream processors, geometry, asynchronous compute engines, ROPs, etc. are more important than purely the compute performance.
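To make the "a TFLOP is a TFLOP" point concrete: peak FP32 compute is just shader count x clock x 2 (one fused multiply-add per ALU per cycle), regardless of vendor. A quick sketch, using rough published boost clocks:

```python
def fp32_tflops(shaders, clock_mhz):
    """Peak FP32 throughput in TFLOPs: 2 ops (one FMA) per shader per cycle."""
    return shaders * clock_mhz * 2 / 1e6

print(round(fp32_tflops(2560, 1905), 2))  # RX 5700 XT: ~9.75 TF
print(round(fp32_tflops(3584, 1582), 2))  # GTX 1080 Ti: ~11.34 TF
```

Which is exactly why the number says nothing about bandwidth, geometry throughput, or actual game performance.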
 

Aceofspades

Banned
Stolen from "that other place". PS5 is slightly more powerful, slightly.


Add this to the list of other sources that have been saying this since May.

PS5 being more powerful, plus having native BC to all PlayStation consoles, will be a mega blow to MS. It will destroy Scarlett's momentum for good.
 
Last edited:

FranXico

Member
Stolen from "that other place". PS5 is slightly more powerful, slightly.

I hope any difference between the two is either slight or none, indeed.
 

Darius87

Member
With a late 2020 release it's more likely that the PS5 will be based on the 7nm+ process and the RDNA2 arch, and shortly after AMD should release its RDNA2-based PC cards. The same thing happened with the PS4, which had GCN 2.0 when all PC cards at the time were GCN 1.0.
The TFLOP numbers probably won't be very impressive, but there was a leak mentioning that the PS5 and next Xbox will be in double digits, so 10-11 TFLOPs is my guess with RDNA2.
 
Imo stuff like memory bandwidth, stream processors, geometry, asynchronous compute engines, ROPs, etc. are more important than purely the compute performance.

Just a quick addition to this, as I don't know how/cannot edit my post.
Memory bandwidth is particularly important. The PS5/Scarlett could have 80 compute units and be capable of 20TFlops, but if it's only got a 256-bit bus with 448GB/s of bandwidth, then it's just not going to be fed with enough data for that extra compute to matter. Heck, even a 384-bit bus would struggle with that much compute.
There is concern online about rumours of a Navi 12 die, which has been theorised to be larger than Navi 10, as it may only have a 256-bit bus. People reckon that bus width would be insufficient to keep a larger Navi die fed with enough data, and therefore think it's between Navi 10 and 14 in size. Others think it might be a 2-stack HBM2E solution with a 2048-bit bus, as that also matches up with the datamined specs, but that's another story entirely.
Either way, memory bandwidth is hugely important for the consoles.
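The bandwidth figures above come from a one-line formula: bus width in bytes times the per-pin data rate. A quick sketch; the HBM2E per-pin rate is an assumed value:

```python
def peak_bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin rate."""
    return bus_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(256, 14))    # 448.0 GB/s: 256-bit GDDR6 at 14 Gbps
print(peak_bandwidth_gbs(384, 14))    # 672.0 GB/s: 384-bit GDDR6 at 14 Gbps
print(peak_bandwidth_gbs(2048, 3.2))  # ~819 GB/s: 2-stack HBM2E (assumed 3.2 Gbps/pin)
```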
 