
Digital Foundry: GTA V PS4 and Xbox One compared in new frame-rate stress test.

drotahorror

Member
This needs to be addressed (if at all possible, a CPU overclock via firmware?) by Sony.

If all these new games will be bottlenecked by the CPU, then it doesn't matter that much if their GPU is 40% better (well, it matters as they can render everything with more detail).
I don't know the thermal threshold they have due to their smaller box, though. It might cause them more issues (heating, etc) than solutions.

For a console that is supposed to be the clear superior performer, it might start to be a thorn in their side if we start seeing more CPU-intensive games.

150MHz is pretty negligible. You never hear about people OCing CPUs for a 150MHz gain. Maybe the garbage CPUs that the PS4/XB1 use benefit greatly from tiny-ass overclocks, though; I am ignorant of that at least.
 
Your mind is sometimes something mystical to understand. It goes way beyond human nature. I'm not sure what you want to say here, really.

I'll make it simple then: why is the fact that Leadbetter wrote the article of importance? Someone above said that he is a Microsoft plant, a serious accusation. Is there any proof?
 

wachie

Member
I don't get it. Is that supposed to mean something? I'm missing some context here.
Yes, this ..

Yeah, performance is great. I've just watched the video and from what I've just watched, the XB1 doesn't perform anywhere near as consistently as the PS4 version. I'm completely baffled by this and the way this thread has gone. At one point the XB1 dropped to 25fps and the PS4 held 30fps throughout. Is this a case of selective thinking?

I'll take the higher detail thank you. I'm not bothered about silly fluctuations that both versions have. And by all accounts it actually performs better on the PS4 most of the time.

THIS IS GETTING NUTS.

I find it weird that some people are holding up the XB1 as the superior performer.

Just reading the OP quotes shows that they both drop frame rates in city junctions, but the XB1 does so to "a consistently lesser degree"

XB1 drops frames in driving outside these areas as well, while the PS4 does not.

XB1 also drops frames during explosive missions, while PS4 does not.

So there's one scenario where both drop and the PS4 supposedly drops more, which isn't borne out in the video, and two more where the XB1 drops and the PS4 doesn't. The article is terrible.

The entire article is written in a way that implies both console versions are on equal footing when they are in fact not. Their own video shows that one has more graphical detail via higher-resolution shadows, increased environmental detail and a larger draw distance in vast open areas. The same version offers a consistent 2-4fps advantage across almost every performance scenario except a single instance in which the other version is slightly higher. That is not "about equal", as the written article would have people believe.

The PS4 version not only has more graphical flourish, it also performs better 99% of the time, offering a slight (2-4 frames per second) advantage in situations where both versions falter and a solid 30fps in many scenarios where the XBO is dropping to 24-26fps. There is only a single instance where that performance advantage shifts to the other version, and yet the article makes it seem that is more important than every other example they themselves provide. It's absurd.

As far as the actual quotation from the article goes:



This is in direct contradiction of their own performance analysis videos, wherein the XBO actually produces 24fps in high-speed chases and the PS4 produces 26fps. But they would have you believe that this completely bullshit "advantage" on the XBO is good enough to offset the frequent and stark performance gap shown in shootouts (a solid 30 vs a sustained 24fps). How is that anything but misleading? They even go so far as to insinuate that the minimal difference in performance in this extremely limited scenario is due to the small 150MHz speed bump on the XBO CPU, when in reality it's far more likely to be due to porting a last-gen engine to current-gen hardware.

So yes, to put it bluntly, it IS biased bullshit.

Then, if you're like me, you might want to ask "why bother?" But then you read stuff like this in this thread and it becomes fairly obvious why they would minimize the advantages of one console in order to make them both appear as though they are on more equal footing:



All of a sudden the dialogue is shifting to the idea that the XBO can overcome the performance gap that has existed thus far due to actual hardware disadvantages. When you have the leading publication for performance analysis minimizing graphical and performance advantages, something is wrong. The only explanation is bias with an intent to muddy the water around the existing hardware and the ensured performance gap between the two consoles going forward.
along with a whole bunch of other posts rightfully calling Leadbetter out on his BS.

But you can carry on with the 9/11 tinfoil hatting.
 

twobear

sputum-flecked apoplexy
I'll make it simple then: why is the fact that Leadbetter wrote the article of importance? Someone above said that he is a Microsoft plant, a serious accusation. Is there any proof?
Proof? I don't need proof when I have selective quoting and a funny feeling in my gut.
 
He's right. The Scene on the PS4 has more cars.
http://youtu.be/5z6ONWZ7etA?t=40s

Start the video at 40 seconds, PS4 has more cars on the screen.

Without a doubt.

Hopefully something good will come out of this soon - as in Sony or another developer really showing the power of GPGPU(?).

Perhaps Sony should have concentrated on tools to make this easier for developers to access?

Or perhaps Sony didn't want this too early, so as to help along a continual graphical improvement through the generation (as in, to combat the ease suggested by Mark Cerny's 'time to triangle' statement). I.e. it keeps things interesting for developers to push themselves to get better/more efficient results.
 
He's right. The Scene on the PS4 has more cars.
http://youtu.be/5z6ONWZ7etA?t=40s

Start the video at 40 seconds, PS4 has more cars on the screen.

During the gun fight, the frame-rate drops more on the XB1
http://youtu.be/5z6ONWZ7etA?t=2m22s

This should be a better comparison considering there's the same amount of objects on the screen during this event.

The gun fight starts earlier.

Also XB1 drops during the police chase scene after the 1st heist where PS4 has no issues.
 

twobear

sputum-flecked apoplexy
Without a doubt.

Hopefully something good will come out of this soon - as in Sony or another developer really showing the power of GPGPU(?).

Perhaps Sony should have concentrated on tools to make this easier for developers to access?

Or perhaps Sony didn't want this too early, so as to help along a continual graphical improvement through the generation (as in, to combat the ease suggested by Mark Cerny's 'time to triangle' statement). I.e. it keeps things interesting for developers to push themselves to get better/more efficient results.
Durante has said that not everything a CPU can do can be offloaded effectively to a GPU. It depends how parallelisable the task is.
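(To illustrate that point, a minimal sketch assuming a simple Amdahl's-law model, where only part of a frame's CPU work is parallelisable enough to offload to the GPU; the function and numbers below are purely illustrative, not from the article:)

# Illustrative only: an Amdahl's-law view of GPU offloading.
# parallel_fraction = share of a frame's CPU work that could move to the GPU.
# gpu_speedup = how much faster the GPU runs that share.
def offload_speedup(parallel_fraction, gpu_speedup):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / gpu_speedup)

# Even with an infinitely fast GPU, offloading half the work
# can never make the frame more than 2x faster overall.
print(offload_speedup(0.5, 1e9))   # ~2.0
print(offload_speedup(0.9, 10.0))  # ~5.3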
 

KageMaru

Member
So it's looking like the PS4 is more likely to drop in heavily populated areas while the XBO is likely to drop when alpha effects fill the screen? Makes sense considering the hardware, especially the XBO. I've noticed frame drops on the PS4 version but nothing to kill the experience.

For quite good reasons. Let's not rely on the hater word, hun.

No, the reasons aren't good actually. It usually looks like people are bitter over him covering the lesser PS3 ports that plagued the system all those years. Many of the claims used to support the hate for him have come from exaggerated examples or misunderstandings of what he says.

Not to mention how people sometimes just assume there is a slant towards Microsoft when he isn't even the author of the article.
 

omonimo

Banned
I'll make it simple then: why is the fact that Leadbetter wrote the article of importance? Someone above said that he is a Microsoft plant, a serious accusation. Is there any proof?
I'm not the same person who said those things. The point is that it was Leadbetter personally who wrote the article, so if it says bullshit, it's not the whole editorial staff that's responsible for what's written in it. As for the MS plant claim, well, Leadbetter surely doesn't hate them, but I don't judge a person for his love.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I don't get it. Is that supposed to mean something? I'm missing some context here.

It's a long-standing tradition that whenever a DF article about something comes up, people bring up his name to try and discredit it, as if he's a corporate shill or a sellout or something.

It's completely stupid, but it still continues to happen with every Richard Leadbetter-written article.
 

omonimo

Banned
Durante has said that not everything a CPU can do can be offloaded effectively to a GPU. It depends how parallelisable the task is.
I think the point is more that the PS4 doesn't need to offload what the Xbone does with its juiced-up CPU, because it's meaningless to the overall performance.
 
Durante has said that not everything a CPU can do can be offloaded effectively to a GPU. It depends how parallelisable the task is.

Of course not, but it does not need to do everything, and it was never meant to. However, offloading anything helps, and there is always something that can be offloaded; the amount that can be, and the efficiency of doing so, will only increase. I would much rather be CPU bound in the PS4's case, with its hardware, than GPU bound in the Xbox's case.
 

RiZ III

Member
So basically there's almost no difference and the games run at a solid 30fps 99% of the time. Outrage! I demand refund for those missing frames!
 

KageMaru

Member
I'll make it simple then: why is the fact that Leadbetter wrote the article of importance? Someone above said that he is a Microsoft plant, a serious accusation. Is there any proof?

No there is no real proof, just bitter tears.

You can see in the grass comparison thread that many people claim he made a big deal over the difference in foliage in the RDR comparison while downplaying the significance of the difference in GTAV. However, here is the original RDR quote surrounding the foliage:

Much has been said about the foliage in the game, the assortment of brush that adds a bit more life to the fairly barren landscape within Rockstar's recreation of the Wild West.

It's fairly obvious from the screenshots shown to date that there has been some significant paring back of detail on the PS3 version of the game, so what's going on? Well, the foliage is generated from transparent alpha textures and these are more expensive for the host hardware to draw.

Cutting back on them saves on performance in terms of fill-rate and bandwidth. It's also interesting to note that the alpha-to-coverage transparency type is used on both systems, but looks considerably worse on PS3 owing to the reduced resolution.

This is incidental detail - albeit one of the more prominent elements - so unless you have both games operating side by side you are not really going to notice it.

Nothing in that quote is a lie; he even tries to educate the readers on why there are differences. What's more, he even goes on to say it's an "incidental detail", but that didn't stop people from exaggerating.
 

Marlenus

Member
Except a GPU and a CPU do different things, so you can't really compare 1 GPU flop to 1 CPU flop. If you could, why have a CPU at all? They would just drop the CPU altogether, put a Titan and a lot of memory in there, and have a super powerful console.

A flop is a flop is a flop. The CPU can handle certain code better than the GPU and vice versa, but flops are flops; they are just one type of workload that is easy to calculate, so you can form a comparison using actual, objective numbers rather than subjective reasoning.

Again, this is all perhaps right, but not necessarily. We just don't know, no matter how hard we try to find reasons for whatever opinion we have.
We don't even know how the scheduler in these new consoles works. We know that only 6 of 8 cores are used, but are threads locked to certain cores? Does this change? This would have immediate implications for how cache lines get invalidated, for example. And this is only one uncertainty we have. This is why I am saying we need profiling data - without it, _nothing_ can be said for sure.

If you think that cache allocations or OS CPU affinities will lead to a 20% FPS difference then that is a bit barmy tbh. That is more barmy than the idea that the CPU alone is the culprit; it may be a contributing factor, but it is not the sole reason.

I agree with the concept that there are many unknowns and there are specific differences, such as the CPU-RAM bandwidth being 30GB/s on Xbox One and 20GB/s on PS4. Does that make a difference? Well, I do not think so, as even an i7 4770K only has 24GB/s of memory bandwidth when paired with DDR3 1600, and that is by far a more powerful CPU which could actually use the bandwidth.

All of these little things do make a difference but you need to always put it in the context of the overall performance picture and sometimes when looking at the finer details it can be hard to consider that it is only going to make a tiny difference overall.

The point I am making is that with what we know of the hardware CPU bottlenecking does not explain the differences in AC:U or the occasional differences in GTA:V. These differences are just too great for it to be CPU alone so for a website like DF to state that it points to CPU bottlenecking is just poor analysis. Even more so when you look at Kabini reviews which show a 28.1% clock speed increase giving 4% gains at best in gaming scenarios and 11.7% gains in CPU limited scenarios.
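(A rough way to frame that Kabini point, as a minimal sketch: if only a fraction of each frame's time is CPU-limited, a clock bump scales only that fraction. The function and the fractions below are illustrative assumptions, not measured data:)

# Illustrative model: a clock increase only speeds up the CPU-limited
# share of frame time, so the overall gain is much smaller than the bump.
def fps_gain(clock_increase, cpu_limited_fraction):
    new_time = (1.0 - cpu_limited_fraction) + cpu_limited_fraction / (1.0 + clock_increase)
    return 1.0 / new_time - 1.0

# A ~9.4% clock bump (150MHz over 1.6GHz) at various CPU-limited shares:
for f in (0.25, 0.5, 1.0):
    print(f, round(fps_gain(0.094, f) * 100, 1), "% faster")  # roughly 2.2, 4.5, 9.4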
 

RurouniZel

Asks questions so Ezalc doesn't have to
but even then we don't know if the XBO has less traffic (cut-back like the foliage to make it run better)

again, why would the other examples all have PS4>XBO

Open-world game engines have been built to be very CPU-centric for some time. In games that don't stress the CPU the way open-world games like AC: U and GTA V do, the PS4's GPU and RAM win out. In CPU-heavy games we're seeing the XBO take the lead.
 
No there is no real proof, just bitter tears.

You can see in the grass comparison thread that many people claim he made a big deal over the difference in foliage in the RDR comparison while downplaying the significance of the difference in GTAV. However, here is the original RDR quote surrounding the foliage:

Nothing in that quote is a lie; he even tries to educate the readers on why there are differences. What's more, he even goes on to say it's an "incidental detail", but that didn't stop people from exaggerating.

It seems to me the poor guy is being blamed for nothing.
 

c0de

Member
All of these little things do make a difference but you need to always put it in the context of the overall performance picture and sometimes when looking at the finer details it can be hard to consider that it is only going to make a tiny difference overall.

The point I am making is that with what we know of the hardware CPU bottlenecking does not explain the differences in AC:U or the occasional differences in GTA:V. These differences are just too great for it to be CPU alone so for a website like DF to state that it points to CPU bottlenecking is just poor analysis. Even more so when you look at Kabini reviews which show a 28.1% clock speed increase giving 4% gains at best in gaming scenarios and 11.7% gains in CPU limited scenarios.

I think there are many unknown variables which *do* influence performance, but we don't know exactly how much. And if many of these come into play, things can even get worse. Proving points with different CPUs in a different context, with different APIs, different OSs and different schedulers, will lead us to nothing but even more confusion. We need hard facts on how the new consoles really work and which variables actually drive performance, and in what way. As we don't know this yet, any talk about power differences might be legitimate, but it doesn't tell us why game x performs worse/better on console 1 or 2.
But I agree that the CPU clock by itself shouldn't make this much of a difference, though that doesn't mean it isn't playing a role in it.
 

Ploid 3.0

Member
Wow just saw the frame rate video, even on current gen GTA can't hold a steady frame rate. Rockstar seem to have this as a signature. Will they ever make a solid framerate GTA?
 
150MHz is pretty negligible. You never hear about people OCing CPUs for a 150MHz gain. Maybe the garbage CPUs that the PS4/XB1 use benefit greatly from tiny-ass overclocks, though; I am ignorant of that at least.

150MHz is a 9.37% increase over 1.6GHz for the 8-core console CPUs.

150MHz is a 3.95% increase over 3.8GHz for a typical 2/4-core PC CPU.
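(A quick sanity check of those percentages, using the clocks quoted above:)

# The relative size of a 150MHz bump depends on the baseline clock.
console_gain = 150 / 1600   # = 0.09375 -> the 9.37% figure above
pc_gain = 150 / 3800        # ~= 0.0395 -> the 3.95% figure above
print(console_gain, pc_gain)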
 

KageMaru

Member
Wow just saw the frame rate video, even on current gen GTA can't hold a steady frame rate. Rockstar seem to have this as a signature. Will they ever make a solid framerate GTA?

Open world games are some of the hardest to optimize because of all of the unpredictable variables that result in having a free roam world filled with AI. I think it's a great port since the game doesn't look to struggle to hit 30fps like we saw with so many open world games last Gen.
 

BigDug13

Member
So basically there's almost no difference and the games run at a solid 30fps 99% of the time. Outrage! I demand refund for those missing frames!

That's not what I saw in those videos posted on this very page. I saw the PS4 doing what you just described and the XBO version dropping frames quite a bit in action scenes, usually in the 2-4 FPS range.
 

BigDug13

Member
Open-world game engines have been built to be very CPU-centric for some time. In games that don't stress the CPU the way open-world games like AC: U and GTA V do, the PS4's GPU and RAM win out. In CPU-heavy games we're seeing the XBO take the lead.

I doubt that tiny CPU uptick is going to seriously do that much. Maybe a little, but not much. "Taking the lead" is most likely also going to come at a lower resolution or with compromises in GPU-centric areas.

Also why in this CPU centric open world game does the PS4 outperform the XBO according to the videos posted on this page?
 

gofreak

GAF's Bob Woodward
Open-world game engines have been built to be very CPU-centric for some time. In games that don't stress the CPU the way open-world games like AC: U and GTA V do, the PS4's GPU and RAM win out. In CPU-heavy games we're seeing the XBO take the lead.

I don't know if I'd say that in this case.

There are scenes in the DF video where the XB1 is giving better performance and scenes where the PS4 is. In the car-chase example above both are stuttering, with a min-FPS gap of 4fps. In a scene later in the video XB1 is stuttering - with a min fps gap of 5fps - while PS4 remains locked at 30.

Either this game is not CPU bound enough, or XB1's CPU advantage is not substantial enough here, to offer a general overall lead at all.
 

omonimo

Banned
Open-world game engines have been built to be very CPU-centric for some time. In games that don't stress the CPU the way open-world games like AC: U and GTA V do, the PS4's GPU and RAM win out. In CPU-heavy games we're seeing the XBO take the lead.
I thought it was explained, with tons of examples in the PC rig comparison, that the CPU has barely any advantage compared to the PS4's, and that it doesn't translate into anything really palpable.
 
What's happening in the shoot-out scene that is taxing the XB1? It mentions effects in the article but even when nothing's happening the XB1 is still dropping frames?
 

Ce-Lin

Member
Open-world game engines have been built to be very CPU-centric for some time. In games that don't stress the CPU the way open-world games like AC: U and GTA V do, the PS4's GPU and RAM win out. In CPU-heavy games we're seeing the XBO take the lead.

Well... I see a gunfight scene that's totally CPU-bound where the PS4 maintains a remarkably stable framerate whereas the XBO chugs along for the entire duration of the sequence, and the PS4 manages to do it with better gfx as well. I also see random driving scenes where the XBO drops frames too. We are not talking about an XBO "lead" here, more like a "lead-better".

What's happening in the shoot-out scene that is taxing the XB1? It mentions effects in the article but even when nothing's happening the XB1 is still dropping frames?

CPU, the effects come and go and the XBO still drops frames until the AI (gunfight) stops stressing the system.
 
Indeed, so let's put this into context.

Both consoles use 6 cores for gaming with 2 running the OS and other functions. Each core can do 8 flops / clock so we can work out the GFlops for the CPUs.

Xbox 1 = 1750 * 6 * 8 = 84 GFlops.
PS4 = 1600 * 6 * 8 = 76.8 GFlops.
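(For reference, the peak-FLOPS arithmetic quoted there as a tiny script; the 8 FLOPs/clock-per-core figure is taken from the post itself, and the helper below is just an illustration:)

# Peak CPU FLOPS = clock (MHz) * game-available cores * FLOPs per clock per core.
def peak_gflops(clock_mhz, cores, flops_per_clock):
    return clock_mhz * cores * flops_per_clock / 1000.0

print(peak_gflops(1750, 6, 8))  # Xbox One: 84.0 GFLOPS
print(peak_gflops(1600, 6, 8))  # PS4: 76.8 GFLOPS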

Are you aware that your GFLOPS maths only covers the CPU's FPU?

Even then it is a lot different. The IBM PPC in the 360 is a RISC-based CPU using the Power architecture; the AMD Jaguar is a CISC-based CPU using the x86 architecture. They work in fundamentally different ways.

Talking about RISC and CISC was totally meaningless even 10 years ago.

RISC isn't a technology, just a CPU design philosophy. And CISC isn't even a thing, just what RISC apologists called everything else that wasn't RISC.

With current CPU technology there is no place for such distinctions.
 

Hanmik

Member
lol.. this thread... here it is in gif form..

fewUc0o.gif
 

Fafalada

Fafracer forever
gofreak said:
Either this game is not CPU bound enough, or XB1's CPU advantage is not substantial enough here, to offer a general overall lead at all.
Like I said with AC:U - chasing a few frames of difference down to a marginal CPU clock-speed differential without having any context about code, data setup or actual performance metrics of such frames is basically as good as random-guessing (and Gaf has banned certain sites for doing that with sales).

Eg. I could take a guess that high-speed scenes are related to streaming, and data-unpacking is where the differences come from. This could be due to dedicated LZO-units performing differently (which aren't the same across the two consoles), issues with I/O async performance (one console implementation just being more demanding than the other), using different compression altogether, or maybe just the two tests not being 1:1 in terms of processing load (which they obviously aren't)...
Thing is that's already 4 variable guesses, all possible without any differences from "CPU" performance itself - and we're just scratching the surface here - could list a lot of other potential causes without knowing any better detail that we have.
 
Like I said with AC:U - chasing a few frames of difference down to a marginal CPU clock-speed differential without having any context about code, data setup or actual performance metrics of such frames is basically as good as random-guessing (and Gaf has banned certain sites for doing that with sales).

Eg. I could take a guess that high-speed scenes are related to streaming, and data-unpacking is where the differences come from. This could be due to dedicated LZO-units performing differently (which aren't the same across the two consoles), issues with I/O async performance (one console implementation just being more demanding than the other), using different compression altogether, or maybe just the two tests not being 1:1 in terms of processing load (which they obviously aren't)...
Thing is that's already 4 variable guesses, all possible without any differences from "CPU" performance itself - and we're just scratching the surface here - could list a lot of other potential causes without knowing any better detail that we have.

Too many variables.

I'm out of here until the inevitable post-stability-patch Digital Foundry frame-rate stress test.
 

gofreak

GAF's Bob Woodward
Like I said with AC:U - chasing a few frames of difference down to a marginal CPU clock-speed differential without having any context about code, data setup or actual performance metrics of such frames is basically as good as random-guessing (and Gaf has banned certain sites for doing that with sales).

Eg. I could take a guess that high-speed scenes are related to streaming, and data-unpacking is where the differences come from. This could be due to dedicated LZO-units performing differently (which aren't the same across the two consoles), issues with I/O async performance (one console implementation just being more demanding than the other), using different compression altogether, or maybe just the two tests not being 1:1 in terms of processing load (which they obviously aren't)...
Thing is that's already 4 variable guesses, all possible without any differences from "CPU" performance itself - and we're just scratching the surface here - could list a lot of other potential causes without knowing any better detail that we have.

Indeed. Outside of maybe two scenes in the DF video, there are blips on both consoles that could be pretty much anything, or nothing of particular consequence.

And in the two scenes where things are a bit more suspect, it's impossible for us to really say for sure what's going on.

I'm just poking at the narrative that this is another 'CPU bound open world game' that does better on XBO. Even if one wanted to assign CPU perf as the factor where the PS4 is doing worse in the DF video, I think you'd have to assume people haven't watched the video to try and push that idea.
 

Krisprolls

Banned
I'm not sure why we care that much about Digital Foundry when it has shown time and time again it had a MS bias. Minimizing differences is now their business, the same way they used to magnify every 2 fps advantage the Xbox 360 used to have over the PS3.

XB1 and PS4 share a very similar architecture, but the PS4 is around 30% more powerful, so of course it'll show in every game by running better on PS4, unless you really try hard to make the XB1 version run the same (mostly by downgrading the PS4 version on purpose).

As far as I know, a slightly better XB1 version happened exactly once, with AC : Unity. But it had nothing to do with CPU and everything with Ubi wanting to force parity.

Analyzing made more sense when we had different architectures. Now ? Not so much... Of course the most powerful machine will get a slightly better version than a less powerful machine with the same architecture.
 

Marlenus

Member
Are you aware that your GFLOPS maths only covers the CPU's FPU?

No shit that the FPU deals with floating point operations, just like the shaders in a GPU deal with floating point operations and it has nothing to do with the TMUs, ROPS etc. It is just a single metric to show the scale difference between the CPU and the GPU.



Talking about RISC and CISC was totally meaningless even 10 years ago.

RISC isn't a technology, just a CPU design philosophy. And CISC isn't even a thing, just what RISC apologists called everything else that wasn't RISC.

With current CPU technology there is no place for such distinctions.

Funny that the 3rd fastest supercomputer (Sequoia) uses a PowerPC-based CPU, which uses a RISC-type instruction set.

RISC vs CISC are different CPU architecture paradigms, just like the object-oriented paradigm vs the functional paradigm. Going from one to the other is possible, but it takes more work than going between different implementations of the same paradigm.

Like I said with AC:U - chasing a few frames of difference down to a marginal CPU clock-speed differential without having any context about code, data setup or actual performance metrics of such frames is basically as good as random-guessing (and Gaf has banned certain sites for doing that with sales).

Eg. I could take a guess that high-speed scenes are related to streaming, and data-unpacking is where the differences come from. This could be due to dedicated LZO-units performing differently (which aren't the same across the two consoles), issues with I/O async performance (one console implementation just being more demanding than the other), using different compression altogether, or maybe just the two tests not being 1:1 in terms of processing load (which they obviously aren't)...
Thing is that's already 4 variable guesses, all possible without any differences from "CPU" performance itself - and we're just scratching the surface here - could list a lot of other potential causes without knowing any better detail that we have.

Exactly this, looking at the difference and just pointing at the CPU is lazy. If DF really wanted to earn their stripes they would see if they could speak to the devs to find out what is really going on. Sure the devs may not want to share but they could do a better analysis than 'points to a CPU bottleneck'.
 

c0de

Member
Funny that the 3rd fastest supercomputer (Sequoia) uses a PowerPC-based CPU, which uses a RISC-type instruction set.

RISC vs CISC are different CPU architecture paradigms, just like the object-oriented paradigm vs the functional paradigm. Going from one to the other is possible, but it takes more work than going between different implementations of the same paradigm.

Funny that the distinction doesn't really exist in current x86 (or x64) CPUs, as they have been RISC-like internally for a good amount of time now.
 
I'm not sure why we care that much about Digital Foundry when it has shown time and time again it had a MS bias. Minimizing differences is now their business, the same way they used to magnify every 2 fps advantage the Xbox 360 used to have over the PS3.

XB1 and PS4 share a very similar architecture, but the PS4 is around 30% more powerful, so of course it'll show in every game by running better on PS4, unless you really try hard to make the XB1 version run the same (mostly by downgrading the PS4 version on purpose).

As far as I know, a slightly better XB1 version happened exactly once, with AC : Unity. But it had nothing to do with CPU and everything with Ubi wanting to force parity.

Analyzing made more sense when we had different architectures. Now ? Not so much... Of course the most powerful machine will get a slightly better version than a less powerful machine with the same architecture.

I don't know; MGR had a 10fps average advantage, yet they called it a draw because the PS3 had slightly better AA, which made them seem Sony-biased to me.
 