
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

QaaQer

Member
How about you read this http://www.reddit.com/r/Games/comments/1h2qxn/xbox_one_vs_ps4_memory_subsystems_compared/caqjldw
It proves that GDDR5 latency is almost the same as DDR3's, or maybe even better.

I'll try to copy it here.
Here's a write-up I did a while ago that I'll just paste here:

My background is in VLSI design, but I know nothing about graphics programming. Here goes my analysis of the chip- and system-level design.

1) There are low to mid-range video cards that have both a DDR3 and GDDR5 version. This provides a direct comparison point. In the very low end, I found benchmarks showing that the GDDR5 version is only 10-20% faster. However, in mid-high range cards, the GDDR5 version can be almost 50% faster. http://ht4u.net/reviews/2012/msi_rad...ZfRJ3KdXukbpQQ

This makes sense. In low end cards, the GPU does not have enough processing power to be significantly bottlenecked by memory bandwidth, but in faster cards it definitely can.

So bandwidth is critical for GPU tasks. That's why high-end video cards use 384-bit wide interfaces to memory while CPU memory interfaces are only 64 bits wide (per channel)! It certainly is not cheap to dedicate that many IO pins to memory, so they do it for a good reason.
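To put rough numbers on that, here's a back-of-the-envelope sketch; the bus widths and per-pin data rates are typical, assumed figures rather than any specific card's:

```python
# Peak theoretical bandwidth = (bus width in bytes) * (data rate per pin).
# Assumed, illustrative figures: a 384-bit GDDR5 card at 6.0 Gbps/pin
# vs. a dual-channel (2 x 64-bit) DDR3-1600 CPU setup.

def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

print(peak_bandwidth_gb_s(384, 6.0))  # GPU: 288.0 GB/s
print(peak_bandwidth_gb_s(128, 1.6))  # CPU, dual-channel DDR3-1600: 25.6 GB/s
```

An order of magnitude apart, which is why those extra IO pins are worth paying for on a GPU.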

Memory bandwidth and latency are not too critical for most CPU tasks though. For the PC builders out there, you can find benchmarks comparing different memory timings and speeds; in most cases you'd be better off buying a faster video card instead of spending money on better RAM.

2) GDDR5 having much higher latency than DDR3 is a myth that's been constantly perpetuated with no source to back it up. Go look up datasheets of the actual chips and you'll see that the absolute latency has always been the same, at around 10ns. It has been around that since DDR1. Since the data rates have been increasing, the latency in clock cycles has increased but the absolute latency has always been the same. Anyone who wants to argue with me should dig through datasheets to back their claims up.

From Wikipedia: DDR3 PC3-12800 @ IO frequency 800MHz has typical CAS latency 8. This means the absolute latency is 10ns. DDR2 PC2-6400 runs at IO frequency 400MHz, with CAS latency 4. This is also 10 ns.

Here's a typical GDDR5 chip datasheet: http://www.hynix.com/datasheet/pdf/g...FR(Rev1.0).pdf

Here is the table showing CAS latency vs frequency (page 43):


The data rates are a factor of 4x faster than the memory clock. So at a typical 5.0 Gbps output data rate, the memory runs at 1.25 GHz (source: page 6)

and supports a CAS latency (CL) of 15. This is 15/(1.25 GHz) = 12 ns.
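Spelling the arithmetic out in one place, using only the figures quoted above:

```python
# Absolute CAS latency in ns = CAS cycles / I/O clock frequency.
# All figures are the ones quoted above (Wikipedia / Hynix datasheet).

def cas_latency_ns(cas_cycles, io_clock_mhz):
    """Absolute CAS latency in nanoseconds."""
    return cas_cycles / io_clock_mhz * 1000

print(cas_latency_ns(4, 400))    # DDR2 PC2-6400:  10.0 ns
print(cas_latency_ns(8, 800))    # DDR3 PC3-12800: 10.0 ns
print(cas_latency_ns(15, 1250))  # GDDR5 @ 5 Gbps: 12.0 ns
```

The latency in clock cycles keeps climbing with each generation, but the wall-clock latency barely moves.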

3) The Xbox One has an additional SRAM cache to improve its bandwidth. However, they needed to dedicate additional silicon area and power budget to the cache and the cache controller. This is definitely a big cost-adder in terms of yield and power budget, but probably not as much as using GDDR5 chips. Chips these days are limited only by the amount of power they can dissipate, and everything is a trade-off. By adding complexity in one area, the designer must remove it from another. So Microsoft spent some of their power budget on implementing a cache, while Sony could use it to actually increase the number of GPU cores. And it shows.

Who knows how well the Xbox One's cache system will work to catch up to PS4's bandwidth advantage. But it is certainly not going to be _faster_ or simpler. When you're streaming in the huge textures needed for next-gen 720p to 1080p graphics, a 32MB cache is not big enough to constantly provide enough bandwidth.
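For a rough sense of scale (my own arithmetic, assuming standard uncompressed surface formats, not anything from the consoles' actual specs): a single 1080p render target at 4 bytes per pixel is already about 8MB, so a deferred renderer's G-buffer plus depth buffer can nearly fill 32MB before a single texture is considered.

```python
# Rough sizes of uncompressed 1080p surfaces, to show how fast 32 MB fills up.
# Assumed, illustrative formats: 32-bit color (4 B/px) and a 32-bit depth buffer.

def surface_size_mb(width, height, bytes_per_pixel):
    """Size of an uncompressed 2D surface in MiB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = surface_size_mb(1920, 1080, 4)  # ~7.9 MiB per RGBA8 render target
depth = surface_size_mb(1920, 1080, 4)  # ~7.9 MiB for a 32-bit depth buffer
print(3 * color + depth)                # three targets + depth: ~31.6 MiB
```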

Also, since the PS4 has more GPU power, it will definitely need all the bandwidth it can get.

nice find
 

Razgreez

Member
GTX 780 and Titan are both based on GK110. PS4 and Xbox One's GPUs are not based on the same chip.

Not only that, it has only 15% more cores and the same memory bandwidth. This individual clearly has no idea what he's talking about. Definitely another kayle/reiko type here
 

vcc

Member
What? The 360 will probably be the biggest selling console ever in the UK.

In consoles, it seems brand power is a pretty fickle thing. You'd recall the PS2, the Wii, and the SNES sold enormously well and were well regarded, and their follow-ups disappointed. For one reason or another, gamers seem fickle and brand power isn't enough.
 

Cheech

Member
Both companies will have great games. This is not a competitive advantage.

Come on. Games steer the ship. Guys like those of us posting in this thread are in the vast, vast minority.

I know a guy who is getting an Xbone so his young daughters can play Just Dance, and his son can play CoD. I asked him why he settled on that instead of a PS4, and he didn't even know Sony was still making Playstations.

Obviously, I'm American. I'm getting a PS4 initially and an Xbone down the road, primarily because I have faith that Sony will have corrected their numerous errors this gen. I primarily play multiplat games, and it appears the PS4 will be getting the best versions.

That said, the Xbone will certainly be getting loads of must-have exclusive games. People writing them off are deluding themselves.
 

JonnyLH

Banned
Thank you!
The drivers will probably never be finished, as both companies will continue to optimize their consoles and tools for developers. As it stands now, the PS4 has better tools from what we have heard from developers. I forget the exact name of the studio, but I am sure someone will pull up the quote (I think it was Avalanche, but I'm not too sure). The hardware not being final is something that Microsoft keeps repeating, so it would be better to ask them about that.
This may well be true, but it may boil down to the personal preference of the developer. For example, I know both companies are using VS 2013 as an IDE now, so for a developer it's just the same application with a different tool set in there. Regarding the software stacks on the consoles themselves, I would say MS has the more feature-rich environment, purely because of the full DirectX stack.
Again, this isn't some random conjecture we created in our minds; this type of stuff comes from the developers, who have said the PS4 is easier to code for and has more mature development tools. We don't know how good the drivers are on either side, but I think we can confidently say both will be very optimized.
Yeah, no doubt about that. Both sides have very lightweight stacks which tie heavily into their hardware. If we brought business into it, unfortunately a lot of modern engines are designed specifically with DirectX in mind. Frostbite is a prime example. With that in mind, a lot of that code feels more natural sitting on the X1. This also raises the question about BF4 on the X1, but that's a totally different subject.
The problem is 6 more CUs is not the only thing that the PS4 has over the Xbox One. It also has 32 ROPs over the 16 in the Xbox One, and it has optimizations specifically for GPGPU. The main problem here is that these 2 GPUs come from the same product family but from a different product stack. The GPU in the Xbox One is more akin to a mainstream product (7700 series) while the GPU in the PS4 is a mid-range enthusiast card (7800 series).
I didn't actually know that; output streams can't really be matched. That will be a clear pro for the PS4. I'd argue against the comparable PC counterparts for the GPUs, though. The numbers will be the only similarities, whereas the silicon will be heavily modified by both parties.
This is wrong, GDDR5 doesn't inherently have any more latency than DDR3. In fact, both are pretty much the same memory; it's just that one is optimized for latency while the other is optimized for bandwidth. The reason PCs use DDR3 for system RAM is that there are so many applications calling on the CPU at the same time, so the low latency helps in that regard. But on a closed-hardware console, what is going to be calling on the CPU while you are gaming? Nothing. Another thing is that Jaguar is an OoO CPU, so it doesn't have to sit around doing nothing while it's waiting for data to be retrieved.
I've just had to read up on this, because it's a common belief but just seems so inaccurate. GDDR has higher timings because of its architectural layout; it was designed with bandwidth in mind. Any Mark Cerny quote I've seen always refers to the "GPU of course", because you can't avoid the fact that the CPU throws clock cycles out of the window. Unfortunately, GDDR does have high latency by design. The CPU is used for an awful lot while gaming: physics, AI, networking, audio processing. A lot of these tasks would just have to be offloaded to the GPU. Now it gets technical: OoO execution in a CPU is governed by the input streams of data, rather than the order of execution of processes. Unfortunately, GDDR is an input stream, so the problem isn't resolved by that.

Audio will not be off-loaded to the GPU. Mark Cerny was talking about the future, when developers harness the power of GPGPU and will be able to offload some audio tasks to the GPU, like audio raycasting. As far as I remember the Xbox One doesn't have a PPU, so it's also going to be running physics on the GPU, and that's not really a PlayStation-specific problem. On the CPU being "crippled", I have already addressed this above.
It depends on the developer's intention. The audio chip in the PS4 is just the encoder, so a chip somewhere has to handle the processing of the audio voices. The CPU would have a heck of a hard time doing this with GDDR, hence his opinion of offloading it to the GPU.
 

EGM1966

Member
I'm not saying the Xbox will dominate, far from it. I'm saying it will be very close. The Xbox brand is very powerful in the US and UK, I don't see that changing. Microsoft's mistakes will cost them the lead they would otherwise have after such a successful generation but that is all. People who expect a PS4 domination are just letting their feelings cloud their judgment.

I honestly doubt it will be very close worldwide, but I will concede that unless MS really screws up they should be able to retain a decent share in the US/UK. Right now, though, I think all the signs point to the XB1 doing worse than the 360 outside those core territories, and I doubt it'll dominate the US either this time around.

Brand is important, but as I posted it's also linked strongly to price, differentiation and perception in the market at the point in time of release. Otherwise the 360 would never have stood a chance vs the PS3.

Right now the general perception will be based on non-niche coverage, and the generally reported perception is that the PS4 is cheaper, may be more powerful, and that MS has taken a consumer-backlash hammering.

I have no doubt the XB1 will sell okay and MS will repair much of the current damage. But a lot of the 360's success was down to launching first, being much cheaper for most of the time, having some killer timed exclusives (that the market didn't suspect were timed back then) and having far and away the better online experience. Pretty much all of that is a wash or in the PS4's favour this time around, and that, coupled with the current image issues, is what's almost surely going to hurt the XB1 in the market vs the PS4.
 

Vizzeh

Banned
No, it is a hardware thing as to whether tiled resources need to be implemented via software or via hardware. There was a thread on Beyond3D where an AMD guy came in and spoke about Tier 1 and Tier 2 hardware. I am attempting to look it up now. We don't know which Tier either of the consoles fall into though.

Edit: Found it on this thread: http://forum.beyond3d.com/showthread.php?t=64206&page=9

A relevant post is here:



http://forum.beyond3d.com/showthread.php?t=64206&page=6



AMD guy in response to the post above:



In reference to people trying to figure out his vague posts.

Interesting, I guess we have to wait and find out which version it is. At least we know both consoles support DX11.2, so it should be available to some degree. I think some of the Tier 2 stuff was added tweaks like the mip locking, and I think the textures could be manually sized (I'm possibly inaccurate here, but all the detail was in the DX11.2 Microsoft reveal vid).
 

gruenel

Member
Unfortunately it's not that linear. I hate reverting back to PC because they're quite incomparable, but for this instance it's not too bad.

Look at the Titan compared to the GTX 780. The Titan has a whole 0.5 TFLOPS of power more, which is a 12.5% theoretical performance gain. Unfortunately, they're both very similar in benchmark results. For example:
http://www.videocardbenchmark.net/high_end_gpus.html

Yeah, that's why I said almost. I don't know how videocardbenchmark.net does their benchmarking, but the Titan does have a ~10% FPS advantage in lots of games. Of course it depends on the game; if it's CPU- or bandwidth-limited, the advantage will be smaller or nonexistent.
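For reference, here's how those theoretical numbers work out; the core counts and base clocks are the cards' published specs, treated as assumptions here:

```python
# Theoretical FP32 throughput = cores * clock * 2 (an FMA counts as 2 flops).
# Assumed published base specs: Titan 2688 cores @ 837 MHz,
# GTX 780 2304 cores @ 863 MHz.

def tflops(cores, clock_mhz):
    """Theoretical single-precision TFLOPS."""
    return cores * clock_mhz * 2 / 1e6

titan, gtx780 = tflops(2688, 837), tflops(2304, 863)
print(titan, gtx780)               # ~4.5 vs ~4.0 TFLOPS
print((titan / gtx780 - 1) * 100)  # ~13% gap on paper, ~10% in practice
```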
 
 

JonnyLH

Banned
How about you read this http://www.reddit.com/r/Games/comments/1h2qxn/xbox_one_vs_ps4_memory_subsystems_compared/caqjldw
It proves that GDDR5 latency is almost the same as DDR3's, or maybe even better.

I'll try to copy it here.
This is referring to GDDR vs DDR latencies inside the GPU, which, as I've said, doesn't matter inside the GPU. The CPU doesn't have the bus to be able to handle GDDR, hence why it skips clock cycles.

EDIT: It's also very different inside a unified architecture, because the GPU doesn't have the RAM sat next to it; its bus is narrower and latencies would be similar.

My head now hurts, I'm off to play some games.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Thanks for the link... Oh wait, more assertions with no evidence.

I've been looking through some technical papers and Hynix data sheets. The popular position is that DDR3 has half the latency of GDDR5, but the data sheets don't seem to support that. The latency seems quite similar.

Either I'm missing a piece of the puzzle or there are a lot of folks out there who are wrong.
 
What? The 360 will probably be the biggest selling console ever in the UK.

Let me put it another way: the majority of the UK has no brand loyalty to anyone.
Both MS and Sony are going to have their fans, but what allowed MS to overtake in the UK was not brand.
It was price and release date, and they don't have either of them this gen.
The same can be said for the rest of the world, to a certain degree.
 

kartu

Banned
According to some, the PS4 has a magic memory controller that offsets the latency issues of GDDR5.

It has been explained many times, but here we go again.

In short: there are no "latency issues of GDDR5"; it's a myth.

It was probably born because some articles were comparing CPU memory controllers, which work with DDR3 and are OPTIMIZED FOR LOW LATENCY, to memory controllers on GPUs, which are optimized for throughput.

The wonderful thing about the PS4 is that TWO different memory controllers are accessible to its GPU: low-throughput/low-latency and high-throughput/high-latency.
 

RoboPlato

I'd be in the dick
I've been looking through some technical papers and Hynix data sheets. The popular position is that DDR3 has half the latency of GDDR5, but the data sheets don't seem to support that. The latency seems quite similar.

Either I'm missing a piece of the puzzle or there are a lot of folks out there who are wrong.

I was under the impression that at the same clock speed the latency of DDR3 was less than GDDR5's, but GDDR5's significantly higher clock rates generally balance that out.
 

Chobel

Member
This is referring to GDDR vs DDR latencies inside the GPU, which, as I've said, doesn't matter inside the GPU. The CPU doesn't have the bus to be able to handle GDDR, hence why it skips clock cycles.

My head now hurts, I'm off to play some games.

Did you read it? Really?
Where does it say it doesn't work with a CPU? How do you know it doesn't work with a CPU?
How about some links or sources to prove your claim...?

I was under the impression that at the same clock speed the latency of DDR3 was less than GDDR5's, but GDDR5's significantly higher clock rates generally balance that out.

You're correct.
 
Did you read it? Really?
Where does it say it doesn't work with a CPU? How do you know it doesn't work with a CPU?
How about some links or sources to prove your claim...?

You would think AMD and Intel would spring into that market to enable it on motherboards?
So what is keeping the CPU and mobo manufacturers from using GDDR5, is it a cost thing?
 

Boss Man

Member
Even if the latencies were significantly different, I'm pretty sure latency is not much of a factor here. Bandwidth has been a much tighter bottleneck. I mean, there's a reason why having GDDR5 in a graphics card costs more. It's superior tech.
 

JonnyLH

Banned
Did you read it? Really?
Where does it say it doesn't work with a CPU? How do you know it doesn't work with a CPU?
How about some links or sources to prove your claim...?



You're correct.
Yup, had to wrap my head around it. It's mentioned when he's referring to buses:
So bandwidth is critical for GPU tasks. That's why high-end video cards use 384-bit wide interfaces to memory while CPU memory interfaces are only 64 bits wide (per channel)! It certainly is not cheap to dedicate that many IO pins to memory, so they do it for a good reason.
Then he makes the point:
Memory bandwidth and latency is not too critical for most CPU tasks though.
While he's right in saying it's not critical for most tasks, they don't care. Although, from the CPU's point of view, it's literally throwing its power out of the window. The best example affected by latency is audio, because of its sync with the frame. If you're throwing away clock cycles, you're seeing big stutters.

Now I'm really off. Been insightful guys :)
 
I've been looking through some technical papers and Hynix data sheets. The popular position is that DDR3 has half the latency of GDDR5, but the data sheets don't seem to support that. The latency seems quite similar.

Either I'm missing a piece of the puzzle or there are a lot of folks out there who are wrong.

I may be wrong, but I think the assumption came from CAS/RAS values.
DDR3 has lower CAS/RAS values... because it runs at lower clock rates, while GDDR5 has higher ones.
 

Chobel

Member
Yup, had to wrap my head around it. It's mentioned when he's referring to buses:

Then he makes the point:

While he's right in saying it's not critical for most tasks, they don't care. Although, from the CPU's point of view, it's literally throwing its power out of the window. The best example affected by latency is audio, because of its sync with the frame. If you're throwing away clock cycles, you're seeing big stutters.

Now I'm really off. Been insightful guys :)

That doesn't say the latency for the CPU is not the same as the latency for the GPU.
 

JonnyLH

Banned
That doesn't say the latency for the CPU is not the same as the latency for the GPU.

The size of the bus directly affects the latency of the memory call. He mentions the bus being narrower for CPUs because of its high cost (and he's right). Then he says they can get away with that because the effect is small. I can see what he means, but it does have a greater effect than he mentioned. Like he said himself, he doesn't architect software.
 

Applecot

Member
You would think AMD and Intel would spring into that market to enable it on motherboards?
So what is keeping the CPU and mobo manufacturers from using GDDR5, is it a cost thing?

Price jumps are huge. That's why everyone was saying Sony got lucky with the GDDR5 panning out the way it did.
 

QaaQer

Member
The size of the bus directly affects the latency of the memory call. He mentions the bus being narrower for CPUs because of its high cost (and he's right). Then he says they can get away with that because the effect is small. I can see what he means, but it does have a greater effect than he mentioned. Like he said himself, he doesn't architect software.

And you do? So find us some sources, please.
 

TechnicPuppet

Nothing! I said nothing!
In consoles, it seems brand power is a pretty fickle thing. You'd recall the PS2, the Wii, and the SNES sold enormously well and were well regarded, and their follow-ups disappointed. For one reason or another, gamers seem fickle and brand power isn't enough.

Saying MS is not a powerful brand in the UK, when the current MS console is going to be the biggest-selling one ever, is plainly nonsense.
 

astraycat

Member
I've been looking through some technical papers and Hynix data sheets. The popular position is that DDR3 has half the latency of GDDR5, but the data sheets don't seem to support that. The latency seems quite similar.

Either I'm missing a piece of the puzzle or there are a lot of folks out there who are wrong.
The missing piece of the puzzle is that the GPUs themselves are probably where the latencies come in.

Just look at the cache latencies on AMD cards -- L1 on the GPU is as high as L3 latency on Intel CPUs.
 

Chobel

Member
The size of the bus directly affects the latency of the memory call. He mentions the bus being narrower for CPUs because of its high cost (and he's right). Then he says they can get away with that because the effect is small. I can see what he means, but it does have a greater effect than he mentioned. Like he said himself, he doesn't architect software.

Now this is interesting. How about a source, link, article or PDF?
 

onanie

Member
The size of the bus directly affects the latency of the memory call. He mentions the bus being narrower for CPUs because of its high cost (and he's right). Then he says they can get away with that because the effect is small. I can see what he means, but it does have a greater effect than he mentioned. Like he said himself, he doesn't architect software.

You completely ignored the latency figures for both DDR3 and GDDR5.
 

TechnicPuppet

Nothing! I said nothing!
Let me put it another way: the majority of the UK has no brand loyalty to anyone.
Both MS and Sony are going to have their fans, but what allowed MS to overtake in the UK was not brand.
It was price and release date, and they don't have either of them this gen.
The same can be said for the rest of the world, to a certain degree.

What has price and release date got to do with the massive sales for the last 4 years?

There is brand loyalty in the UK, both to Sony and MS. The fact Sony sold any consoles at all when the PS3 launched is testament to that.
 

Razgreez

Member
Price jumps are huge. That's why everyone was saying Sony got lucky with the GDDR5 panning out the way it did.

There's more to it than the price of GDDR5. In fact, that's the easiest/cheapest part of the puzzle: APUs and motherboards specifically designed to take advantage of it, the fact that stacking is on the horizon (which might mitigate the need for GDDR in its current form), etc. There are so many factors which impact cost exponentially.
 

Vizzeh

Banned
Saying MS is not a powerful brand in the UK, when the current MS console is going to be the biggest-selling one ever, is plainly nonsense.

The Xbox 360 has sold 8.4 million; the PS2 is at 10 million. I agree it's a powerful brand, but once the next generation begins I'm sure 360 sales will slow completely. Pre-orders + hype suggest they will be giving up their console dominance this gen.
 
People attacking the Edge article for lack of sources and/or lack of technical explanation are still not factoring in all the developers/people in the know who have either publicly or off the record spoken about the power difference.
 
People attacking the Edge article for lack of sources and/or lack of technical explanation are still not factoring in all the developers/people in the know who have either publicly or off the record spoken about the power difference.

Don't you see? Non-developers know more about these machines than the game developers who have worked on both. It's science.
 

TechnicPuppet

Nothing! I said nothing!
The Xbox 360 has sold 8.4 million; the PS2 is at 10 million. I agree it's a powerful brand, but once the next generation begins I'm sure 360 sales will slow completely. Pre-orders + hype suggest they will be giving up their console dominance this gen.

We are way off topic here, so I'm not going to continue.
 