
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

Titanfall will be by FAR the more enjoyable game on the PC. It's supposedly not even 1080p on XB1. If anything, the case of Titanfall proves that my point is valid.

(And yes, I'm pretty sure you can buy a $500 PC which plays Titanfall at 1080p/60.)

Really? It will be far more enjoyable because it's 1080p on the PC? I'll probably have way more enjoyment playing it with my friends and clanmates on the X1.

I can't wait.
 

rjinaz

Member
Everyone was like, let's wait to hear what the devs say. The devs speak, and everyone argues cloud (STILL, somehow) and that these "devs" don't know what they are talking about, or aren't real... November can't come soon enough.

I really don't think when these consoles launch, much is going to change, at least not right away. The discussions will just move to pic comparisons and the like and we'll still have the exact people saying the same things just in different ways. Console war never changes. It is what it is and watching is entertaining for me :)
 

Chobel

Member
GPUs love GDDR because they don't work on clock cycles. They process their tasks in parallel and without interruption, like a "stream"; hence their name. If the CPU is calling the RAM, it has to wait for that RAM to return with its result. It can't change what it's doing, or carry on with something else; it literally has to sit and wait. The CPU could miss several clock cycles (depending on the clock speed) until GDDR returns. That's why DDR is preferred as a general computing memory.

GPUs, on the other hand, can jump ship and work on something else while they wait for GDDR to return; they don't have clock cycles and simply keep ticking. That's why they're preferred for maths calculations and not for intermittent calls or changing things about. When the GPU is managing its tasks, though, this is primarily down to the software handling, which in this case will be OpenGL. How baked and fully implemented this iteration of OpenGL is in the PS4, I don't know, but if it's not in there, it'll be left to the developers, and things can get messy. It's essentially what Sony meant by "coding to the metal".

Sorry if I've just rambled a bit.

Ok, so you're talking about GDDR5 latency.
GDDR5 in the PS4 has almost the same latency as DDR3.
 

JaggedSac

Member
Let me ask, did DirectX or OpenGL ever get upgraded after the consoles were released? These drivers are final at manufacturing, and both companies have had theirs final for a very long time, which makes me believe this article is just capitalistic journalism at its best.

Yes, the 360 got some DirectX upgrades, such as being able to precompile the command buffer.

Who knows more about hardware design than Mark Cerny, and who knows more about the power differences between these two consoles than the developers actually making software for them?

Got it.

To be fair, most of the stuff that usually gets linked to is from people who are basing it completely on specs (which is a fair thing to do) and not on actual Bone development.
 

TheCloser

Banned
GPUs love GDDR because they don't work on clock cycles. They process their tasks in parallel and without interruption, like a "stream"; hence their name. If the CPU is calling the RAM, it has to wait for that RAM to return with its result. It can't change what it's doing, or carry on with something else; it literally has to sit and wait. The CPU could miss several clock cycles (depending on the clock speed) until GDDR returns. That's why DDR is preferred as a general computing memory.

GPUs, on the other hand, can jump ship and work on something else while they wait for GDDR to return; they don't have clock cycles and simply keep ticking. That's why they're preferred for maths calculations and not for intermittent calls or changing things about. When the GPU is managing its tasks, though, this is primarily down to the software handling, which in this case will be OpenGL. How baked and fully implemented this iteration of OpenGL is in the PS4, I don't know, but if it's not in there, it'll be left to the developers, and things can get messy. It's essentially what Sony meant by "coding to the metal".

Sorry if I've just rambled a bit.


This post explains that point.

You know what hUMA is, right? Surely you must know its purpose and general goal. Firstly, there's no waiting for the GDDR to return, because they both have access to the same pool of memory. The only time the CPU or GPU won't have access to a specific memory address is when one or the other is doing calculations with the data at that memory address. The issue you mentioned above is not a real issue as long as you design your algorithms with that in mind. It only occurs if you try to do calculations with the GPU and CPU on the same memory address, which would require new types of algorithms. This is kinda the whole idea behind hUMA.
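The shared-pool idea in this post can be sketched in ordinary code. This is a toy model only (the worker names, region count, and per-region locking scheme are invented for illustration; real hUMA coherence is handled in hardware): two workers share one memory pool and only block each other when they touch the same address range.

```python
import threading

# Toy model of a unified memory pool: one shared array with a lock per
# region rather than per pool. The "CPU" and "GPU" workers only contend
# when they touch the same region; different addresses never block.
POOL_REGIONS = 4
pool = [0] * POOL_REGIONS
region_locks = [threading.Lock() for _ in range(POOL_REGIONS)]

def worker(region, increments):
    # Each worker holds only the lock for the region it is computing on,
    # mirroring the post's point: contention exists only when both sides
    # work on the same memory address.
    for _ in range(increments):
        with region_locks[region]:
            pool[region] += 1

cpu = threading.Thread(target=worker, args=(0, 10000))
gpu = threading.Thread(target=worker, args=(1, 10000))
cpu.start(); gpu.start()
cpu.join(); gpu.join()
print(pool[:2])
```

Neither worker ever waits on the other here, because their regions are disjoint; move both onto region 0 and they serialize, which is the algorithm-design caveat the post is making.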
 

JonnyLH

Banned
Adding another ALU (CU) would usually decrease performance because of locks and synchronization between CUs, and because most of the time the data can't be processed in parallel. However, this is not always the case.
Adding another CU to a GPU will increase performance because rendering data can always be processed in parallel (data parallelism).
It pretty much always is; you're never going to see full utilisation across the board.
 
I really don't think when these consoles launch, much is going to change, at least not right away. The discussions will just move to pic comparisons and the like and we'll still have the exact people saying the same things just in different ways. Console war never changes. It is what it is and watching is entertaining for me :)

I don't know if it's politically correct or not, whatever, but when people believe so strongly in cloud power I often wanna ask if they are creationists. Think dinosaurs are fake. xp
 

SapientWolf

Trucker Sexologist
I think it's fair to point out that the value proposition is not entirely in the PS4's favor. Yes, the PS4 has better specs, but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences; it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.
Or make the same games look better. A lot of the marketing for the One focuses on how much the graphics have improved over last gen. If the improvements were only marginal then MS would have a whole other issue on their hands.

Core gamers are cold towards Kinect, MS hasn't done anything to change that so far, and the conversation about the performance gap is going to get louder and more heated as time progresses. The One isn't priced for the mainstream, so at best their blunder is going to cost them valuable momentum after the launch hype subsides. At worst, they just gave the gen to Sony by bundling Kinect.
 

James Sawyer Ford

Gold Member
That's not going to happen... ever. Resolution may be a bit worse, same with framerate, but the games will mostly look similar. Giving up resolution provides a lot of room to keep all the other graphical bells and whistles turned on. I had to do this with my PC in order to keep games maxed out before I got around to upgrading my GPU.

I think you'll easily see it with Sony first party exclusives. Combination of talent and additional hardware.

The difference might not be about visual fidelity as much as it is about interactivity, though. PS4 has quite a bit more power under the hood that's currently not being tapped thanks to the impressive amount of GPGPU capability they focused on.

Things like physics, destructibility, etc could become a much bigger deal. In which case you'd notice the differences in motion.
 

Deuterium

Member
My conclusions, from reading these informative tech comparison / console war free-for-alls:

A) Microsoft's Xbox One is in serious trouble right out of the gate, due to lower performance specs (at least on paper), a higher selling price, and major public relations / marketing blunders. Microsoft managed to pull off a trifecta in this regard. Somehow I can't imagine this happening if J. Allard had been at the helm.

-- or --

B) The theoretical performance advantages of the PS4 will not translate into the major graphical and gameplay improvements that everyone is expecting in actual real-world applications. Multi-platform games will be more or less a wash, with the deciding factors then becoming the quality of exclusive games, as well as online features, content and the multiplayer experience.

At this point, I think either is possible. However, I am now leaning towards scenario "A" as the more probable. Of course, since all of us here on GAF are more or less "hardcore" gaming enthusiasts, we may be biased in how we predict these factors will influence the majority of video game consumers.

P.S. -- Yes, I realize these are not earth-shattering insights.
 

Vizzeh

Banned
First things first: the PS4 does not depend on DirectX or OpenGL; it has its own API.
They have a wrapper API to make porting easy, and another called PSSL.

From Playstation themselves.
PS4 is very approachable for development
- DX11/OpenGL 4.4 level Shader language in PSSL
 
JonnyLH
you said you're getting a PS4, but you come off as one of the ones who says that to seem unopinionated and fair.
your doubt seems very much swayed to one side, when there have been nothing but reports about drivers being further ahead on Sony's side, easier to work with, etc.
 

JonnyLH

Banned
You know what hUMA is, right? Surely you must know its purpose and general goal. Firstly, there's no waiting for the GDDR to return, because they both have access to the same pool of memory. The only time the CPU or GPU won't have access to a specific memory address is when one or the other is doing calculations with the data at that memory address. The issue you mentioned above is not a real issue as long as you design your algorithms with that in mind. It only occurs if you try to do calculations with the GPU and CPU on the same memory address, which would require new types of algorithms. This is kinda the whole idea behind hUMA.
hUMA is a very awesome memory bus, no taking that away from it. It doesn't change the architecture of GDDR, though; the latency still remains. The CPU still has to wait for the RAM to return, and it misses clock cycles because of it. It's just a given downfall.
 

EGM1966

Member
A console generation is not a race; it's a marathon. Maybe the PS4 sells better in the first few months. It will just be temporary. As I said, the casual market is crucial to the success of a console.

The general market is indeed crucial, but I don't see why you think Kinect is going to make the difference. The PlayStation brand is arguably far better known to the general market; Kinect 1, after the huge marketing push and initial forced interest, clearly tailed off; and the general market rarely comes back for V2 of something that has no clear differentiator (as Nintendo can attest to right now).

The general market, I'd argue, is unlikely to flock to a more expensive XB1 with Kinect, looking at known games and market trends. This gen was a marathon, and it ended mostly a draw worldwide despite the PS3 giving the 360 a head start and being more expensive. Why would you even consider for a moment that Sony launching cheaper this time is going to result in the XB1 mopping up the general market?

I'm sure the XB1 will indeed see okay general sales, but right now you'd have to expect the PS4 to see even more globally, given each console's historic performance, the historic market reaction to price, expectations of suitability, and general market perception (and by perception I mean how the brands are viewed, not whether this is warranted or not).
 

Chobel

Member
hUMA is a very awesome memory bus, no taking that away from it. It doesn't change the architecture of GDDR, though; the latency still remains. The CPU still has to wait for the RAM to return, and it misses clock cycles because of it. It's just a given downfall.

I'll repeat this:
GDDR5 in the PS4 has almost the same latency as DDR3.
 

JonnyLH

Banned
JonnyLH
you said you're getting a PS4, but you come off as one of the ones who says that to seem unopinionated and fair.
your doubt seems very much swayed to one side, when there have been nothing but reports about drivers being further ahead on Sony's side, easier to work with, etc.
I fully understand that comment; it's something I was trying to steer away from when writing my posts. For me personally, it all comes down to the games. Even though I love fast hardware and enjoy a nice piece of kit, I just look out for the games. I feel like the X1 is getting a lot of stick for something it's not; they're both very, very capable consoles and it's very much a level playing field. I could mention some negatives around the X1 in the very same manner, but unfortunately that isn't this topic's subject.
 

Vizzeh

Banned
hUMA is a very awesome memory bus, no taking that away from it. It doesn't change the architecture of GDDR, though; the latency still remains. The CPU still has to wait for the RAM to return, and it misses clock cycles because of it. It's just a given downfall.

The latency point has been squashed many times on these boards. It's the same as DDR3; only the eSRAM has lower latency, and that is GPU-only.
 

JaggedSac

Member
Well, cheers for that; I'll retract my previous statement. I was just trying to make a point about going mass-market with beta drivers.

Absolutely, especially since online is not required, so update downloads can't be relied upon; the drivers have to be complete.

If GDDR's latency is so bad for GPUs, why was it specifically designed for GPUs?

He/she is saying it is bad for CPUs, not GPUs. Whether that is correct or not is a different story.
 

SapientWolf

Trucker Sexologist
hUMA is a very awesome memory bus, no taking that away from it. It doesn't change the architecture of GDDR, though; the latency still remains. The CPU still has to wait for the RAM to return, and it misses clock cycles because of it. It's just a given downfall.
You're gonna have to do a lot more homework if you want to soldier that hard and still keep your new GAF account. And even then it's risky.
 

Jon Canon

Member
That's what I was basing my tiled resources quotes on: a 16MB file that can exist in the eSRAM, being STREAMED from files as big as 10GB textures. Alongside that, a 1080p frame buffer = 8MB, double buffered = 16MB, so 16MB tiled + 16MB frame buffer = 32MB eSRAM.

However, the PS4 can do this too, since it supports DX11.2.

eSRAM is ideal with its low latency, but I'm sure the GDDR5 can do it also.

Why would you want a 16MB chunk of 10GB of data in the eSRAM? It is way too small to be a useful subset for rendering.
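For reference, the eSRAM arithmetic in the quoted post can be checked directly (assuming 32-bit RGBA pixels, which is what the quoted 8MB figure implies):

```python
# Check the quoted eSRAM budget: a 1080p frame buffer at 4 bytes per
# pixel (32-bit RGBA) is roughly 8 MB, so a double-buffered pair plus
# the quoted 16 MB tile pool just about fills 32 MB of eSRAM.
width, height, bytes_per_pixel = 1920, 1080, 4
frame_buffer_mb = width * height * bytes_per_pixel / (1024 * 1024)
double_buffered_mb = 2 * frame_buffer_mb
total_mb = double_buffered_mb + 16  # + quoted 16 MB tiled-resource pool
print(round(frame_buffer_mb, 1))   # ~7.9 MB, loosely "8 MB"
print(round(total_mb, 1))          # ~31.8 MB, just under 32 MB
```

So the numbers in the quote are internally consistent, leaving aside whether a colour buffer alone (no depth buffer or other targets) is a realistic eSRAM layout.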
 

JonnyLH

Banned
I'll repeat this:
GDDR5 in PS4 has almost the same latency of DDR3.
I really don't see that as being possible. It's just how GDDR "works" and is structured. Very much like how you couldn't get an x86 CPU to do PowerPC's tricks.

If I see evidence, then obviously that makes this invalid. I honestly don't see it happening, though.
 
From Playstation themselves.
PS4 is very approachable for development
- DX11/OpenGL 4.4 level Shader language in PSSL

Yep, but they don't depend on them, so they can make changes however they want.
The point is both the X1 and PS4 are going to have changes in their APIs compared to normal OpenGL and DX11.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
hUMA is a very awesome memory bus, no taking that away from it. It doesn't change the architecture of GDDR, though; the latency still remains. The CPU still has to wait for the RAM to return, and it misses clock cycles because of it. It's just a given downfall.

And remember this little snippet.

AY: It depends what you're doing. GPU, like 40 per cent more powerful. DDR5 is basically 50 per cent more powerful than DDR3, but the memory write [performance] is bigger on Xbox One, so it depends on what you're doing.
 

JaggedSac

Member
That's what I was basing my tiled resources quotes on: a 16MB file that can exist in the eSRAM, being STREAMED from files as big as 10GB textures. Alongside that, a 1080p frame buffer = 8MB, double buffered = 16MB, so 16MB tiled + 16MB frame buffer = 32MB eSRAM.

However, the PS4 can do this too, since it supports DX11.2.

eSRAM is ideal with its low latency, but I'm sure the GDDR5 can do it also.

Yep, but we don't know whether these devices are Tier 1 or Tier 2 with regard to tiled resource support. Tier 1 is a software implementation; Tier 2 is a hardware implementation, and thus faster.
 

Vizzeh

Banned
Why would you want a 16MB chunk of 10GB of data in the eSRAM? It is way too small to be a useful subset for rendering.

Tiled resources are supported in DX11.2. It STREAMs the data from the 10GB based on your POV: instead of having one large texture viewable at distance, it would use a small texture file, and the closer you get to that object, the higher the texture quality. The 16MB doesn't grow in size... tiled resources looks awesome.
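A toy sketch of that POV-driven selection (the mip sizes and the distance rule here are entirely made up for illustration; real tiled-resource residency is driven by the GPU's sampler feedback, not a formula like this):

```python
import math

# Toy tiled-resources picker: only the detail level needed for the
# current viewing distance is kept resident; the full texture never is.
MIP_SIZES_MB = [64, 16, 4, 1]  # mip 0 (closest/sharpest) .. mip 3

def mip_for_distance(distance_m):
    # Doubling the distance moves one mip coarser, clamped to the chain.
    level = int(math.log2(max(distance_m, 1.0)))
    return min(level, len(MIP_SIZES_MB) - 1)

for d in (1, 4, 16, 100):
    m = mip_for_distance(d)
    print(d, "m -> mip", m, f"({MIP_SIZES_MB[m]} MB if fully resident)")
```

The point the post makes survives the simplification: distant objects pull the tiny mips, so the resident working set stays small no matter how big the source texture is.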
 

JonnyLH

Banned
You're gonna have to do a lot more homework if you want to soldier that hard and still keep your new GAF account. And even then it's risky.
If I get banned for keeping to the topic and discussing the integrity of a journalistic article, then there's something wrong here.

It's not like I'm saying I'm always right; I'm just posting based on my experience and knowledge of the subject. Prove me wrong, I'm not fussed.
 

rjinaz

Member

My conclusion from reading lots of these types of threads is that I still have no clue what is being said. As far as I know, anybody could be right. I basically try to decide who is right based on who has the most people agreeing with them in the thread. My personal opinion is that the PS4 will be more powerful, but whether the difference is small or large, I don't think anybody can really say for sure at this point, because there are a lot of contradictions. I enter these threads for the interesting discussion, not for answers to the questions presented.
 

REV 09

Member
If Sony actually advertises the "most powerful console ever" line, it's a done deal IMO.
MS will just advertise it right back. It's like politics: oftentimes you advertise hard against your primary weakness by presenting it as a strength. The layman won't be able to tell the difference when both are saying they're the most powerful. Some may assume that the Xbox is more powerful due to the higher price.

I think these power differences are going to be hard to use as a selling point outside of message boards. "Xbox on" is easier to sell than steadier framerates or higher resolutions.
 

BigDug13

Member
What are the exact latencies of DDR3 and GDDR5?

I see people bring it up all the time, but they never give numbers when discussing it.
 

Thrakier

Member
Really? It will be far more enjoyable because it's 1080p on the PC? I'll probably have way more enjoyment playing it with my friends and clanmates on the X1.

I can't wait.

1080p, better effects, better image quality, a stable 60fps, a mouse, cheaper, free online. You can have friends on PC too. Yeah. Much more enjoyable.
 

CoG

Member
 
I fully understand that comment; it's something I was trying to steer away from when writing my posts. For me personally, it all comes down to the games. Even though I love fast hardware and enjoy a nice piece of kit, I just look out for the games. I feel like the X1 is getting a lot of stick for something it's not; they're both very, very capable consoles and it's very much a level playing field. I could mention some negatives around the X1 in the very same manner, but unfortunately that isn't this topic's subject.

It's by no means a level playing field. That's just patently disingenuous.

MS will just advertise it right back. It's like politics: oftentimes you advertise hard against your primary weakness by presenting it as a strength. The layman won't be able to tell the difference when both are saying they're the most powerful. Some may assume that the Xbox is more powerful due to the higher price.

I think these power differences are going to be hard to use as a selling point outside of message boards. "Xbox on" is easier to sell than steadier framerates or higher resolutions.

"most powerful console ever" is far easier sell than "control your cable".
 

rjinaz

Member
If I get banned for keeping to the topic and discussing the integrity of a journalistic article, then there's something wrong here.

It's not like I'm saying I'm always right; I'm just posting based on my experience and knowledge of the subject. Prove me wrong, I'm not fussed.

I really don't think you have anything to worry about, judging by the TOS (at least not yet). You're being quite respectful and only looking to have discussions. It's when people lose their cool that they start having problems.
 
I'm sure the XB1 will indeed see okay general sales, but right now you'd have to expect the PS4 to see even more globally, given each console's historic performance, the historic market reaction to price, expectations of suitability, and general market perception (and by perception I mean how the brands are viewed, not whether this is warranted or not).

I'm not saying the Xbox will dominate, far from it. I'm saying it will be very close. The Xbox brand is very powerful in the US and UK, and I don't see that changing. Microsoft's mistakes will cost them the lead they would otherwise have had after such a successful generation, but that is all. People who expect PS4 domination are just letting their feelings cloud their judgment.
 

Vizzeh

Banned
Yep, but we don't know whether these devices are Tier 1 or Tier 2 with regard to tiled resource support. Tier 1 is a software implementation; Tier 2 is a hardware implementation, and thus faster.

Does this not suggest it's hardware?

"Modern GPU
- DirectX 11.2+/OpenGL 4.4 feature set
- With custom SCE features"

Again, from an official PlayStation source.
 
GDDR latency is worked around through GPU parallelization. I don't think it's that big of a deal, with this being a game console and not primarily a PC.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
If GDDR's latency is so bad for GPUs, why was it specifically designed for GPUs?

Because it's more important to have the ability to move large chunks of data around quickly than it is to respond to a request immediately.
And the latency is going to be at a fixed value, so you can design your graphics engine to accommodate it. It's good for GPU tasks.

It's less ideal for a general-purpose CPU.
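The bandwidth-over-latency trade-off described here can be made concrete with some illustrative numbers (both figures below are invented for the sketch, not actual DDR3/GDDR5 specs): a streaming workload pays the latency roughly once and is then limited by bandwidth, while a chain of dependent accesses pays the full latency on every step.

```python
# Illustrative (made-up) figures: latency in ns, bandwidth in bytes/ns.
latency_ns = 100.0        # time for one isolated access to come back
bandwidth_bpns = 176.0    # sustained transfer rate once streaming

def streaming_time(total_bytes):
    # GPU-style: one latency hit, then back-to-back bursts hide the rest.
    return latency_ns + total_bytes / bandwidth_bpns

def pointer_chase_time(accesses, bytes_each):
    # CPU-style dependent loads: every access waits out the full latency
    # because the next address isn't known until the current load returns.
    return accesses * (latency_ns + bytes_each / bandwidth_bpns)

mb = 1024 * 1024
# Moving 16 MB as one stream beats even a short dependent chain:
print(streaming_time(16 * mb) < pointer_chase_time(1000, 64))
```

That asymmetry is the whole argument in this post: a GPU's access pattern amortizes the latency, a general-purpose CPU's often can't.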
 