
Digital Foundry: GTA V PS4 and Xbox One compared in new frame-rate stress test.

STEaMkb

Member
Seems like R* put in a good effort on both platforms.

I'd wager R* has spent more time developing one version over the other.

It's no secret that Xbox One's memory configuration has compelled Microsoft to work closely with third-party developers, identifying bottlenecks and helping them write more efficient code tailored to the needs of their system. Performance gains and advanced know-how are shared promptly after being acquired. Getting something decent running on PlayStation 4 in a similar timeframe is not nearly as problematic (though the best approach isn't always obvious). It's evident from interviews that Microsoft has been more proactive in this area, out of necessity. Sony, in stark contrast, are in the rewarding position of not needing to share their most intimate and recently gained knowledge about the workings of their system, particularly when it comes to pushing its limits. They want their own titles to shine that much brighter, which is fair enough.

I'm not implying PS4 multiplats are being ignored--not by any stretch--but in regard to this first wave of titles, it's a reasonable inference that third-party developers are spending more time narrowing the deficit by refining and polishing the Xbox One version. Exploring the boundaries of PS4 is not a high priority for them at present. So far, external devs are using PS4's extra power in the most mundane way possible (exactly as Richard Leadbetter said they would). It's largely Sony's closest partners who are developing advanced techniques and exploring alternative avenues.

If Sony's first-party teams get to play with additional system resources (there was some early suggestion that they do), then those resources need to be opened up to all developers, not merely a select few. As Microsoft overcome their initial stumbling blocks, Sony may wish to re-evaluate the balance between maintaining such a strong first-party advantage and helping others lift the PS4 versions of multiplats more impressively above the competition.
 

That's all fine and dandy as far as cache and bandwidth go... but it falls apart when the system is waiting on the GPU to compute things. Basically, after the instructions have been sent and the system is waiting for the GPU to write to the framebuffer, that's where the PS4 lags behind. We can see this in all the PC ports, which can just brute-force their way to 1080p/60fps.
 
People say that every. single. time. And it's simply not true. It's not necessarily about 'extracting' more power, it's about figuring out cleverer algorithms that do things better and faster. More optimized math, better-optimized parallelization, things like that, which require ingenuity and new thinking rather than making old algorithms run faster. As a recent example, you have an AA method like HRAA, which now runs in the same frame time as FXAA on the same hardware but blows it out of the water in terms of quality.
If you want to talk about actually extracting more power out of something, that too is not out of the question. Just look at the Tomorrow Children presentation and how they've done exactly that, using unconventional methods.

My point is that since the consoles are PCs, most developers have already been figuring out ways to optimize their graphics engines -- for years, in fact. There is nothing special about the new console hardware that would allow them to make another engine more optimal, with even more visual fidelity, and still run at a solid 1080p/30fps. There comes a point of diminishing returns between how much fidelity you can introduce and how much you have to scale back due to hardware limitations.

In 5 years, the console hardware will be the same as it was on PC back in 2007 (I believe that's when DX11 cards came out). The only advancement will be on the PC side with DX12 and faster brute force GPUs.
 
So what you're saying is that it experiences frame drops during gameplay-intensive segments. Surely there's no problem with that in a video game?

No, that's not what I'm saying. 99% of the game is at 30FPS. It does dip at rare moments, and those dips are nearly imperceptible. Which is exactly what the original CVG report showed: that the game is basically fixed at 30FPS at all times, except for very rare instances where it might dip to 29FPS.


You'll have to excuse my bag of salt, but I don't think it's unreasonable to get fed up with devs trying to shove as much stuff into their game as possible then forgetting to budget the polish, because polish is not marketable.
And this is a 60 dollar re-release of a game that came out last year. I don't want it to be good most of the time.

It's actually good all of the time, regardless of the fractional, incredibly rare FPS drops. That it dips from 30FPS to 29FPS for a second maybe once every few hours in the game doesn't actually make it a bad game. If anybody is capable of noticing that 30 to 29 to 30 dip, an event that occurs basically once every few hours of playing the game, I'd be shocked.

Although, I should note, I haven't played the high-explosion mission where the game dips to 26FPS for a handful of seconds, so maybe that makes this a terrible game; I haven't gotten that far to tell.

Watch the initial 20 minute Digital Foundry FPS analysis:

https://www.youtube.com/watch?v=mxJCBgcX_jk#t=15

This video is 19 minutes long. I watched the whole thing on Tuesday and counted about 5 instances of the game dropping to 29fps, then bouncing back up to 30fps a second later. In that 19-minute video you have 34,200 frames (19 * 60 * 30). During those 19 minutes, you have about 5 instances of 29fps; let's be generous and say those frame drops last for 2 seconds each, which means you've lost 10 total frames (30 * 5 * 2 - 29 * 5 * 2). 34190 / 34200 * 100 = 99.97%. So, assuming those 19 minutes are generally reflective of the whole game, do you think that a game maintaining 30fps for 99.97% of the entire game is bad, or "chugs at sub-30 fps"?
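For anyone who wants to check the arithmetic, here's the same back-of-the-envelope math as a quick Python sketch (the dip count and the 2-second dip length are my rough estimates from watching the video, not captured data):

```python
# Frame-budget math from the post above; dip count and duration are
# eyeballed estimates, not measured data.
minutes = 19
target_fps = 30
total_frames = minutes * 60 * target_fps        # 34,200 frames at a locked 30fps

dip_instances = 5        # observed dips to 29fps in the video
dip_duration_s = 2       # generous guess per dip
frames_lost = dip_instances * dip_duration_s * (target_fps - 29)   # 10 frames

pct_at_target = (total_frames - frames_lost) / total_frames * 100
print(f"{pct_at_target:.2f}% of frames hit the 30fps target")      # ~99.97%
```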
 

Lord Error

Insane For Sony
My point is that since the consoles are PCs, most developers have already been figuring out ways to optimize their graphics engines -- for years, in fact. There is nothing special about the new console hardware that would allow them to make another engine more optimal, with even more visual fidelity, and still run at a solid 1080p/30fps. There comes a point of diminishing returns between how much fidelity you can introduce and how much you have to scale back due to hardware limitations.

In 5 years, the console hardware will be the same as it was on PC back in 2007 (I believe that's when DX11 cards came out). The only advancement will be on the PC side with DX12 and faster brute force GPUs.
I guess time will tell. The examples I counted above already prove it's not at all as cut and dried as you think it is, though.
In terms of DX11 stuff, you already have something like Tomorrow Children, which circumvents the whole DX feature paradigm by using custom raytraced rendering (and runs at above 30FPS).
 

omonimo

Banned
My point is that since the consoles are PCs, most developers have already been figuring out ways to optimize their graphics engines -- for years, in fact. There is nothing special about the new console hardware that would allow them to make another engine more optimal, with even more visual fidelity, and still run at a solid 1080p/30fps. There comes a point of diminishing returns between how much fidelity you can introduce and how much you have to scale back due to hardware limitations.

In 5 years, the console hardware will be the same as it was on PC back in 2007 (I believe that's when DX11 cards came out). The only advancement will be on the PC side with DX12 and faster brute force GPUs.
You're talking as if the latest big AAA games base their graphics specs on PC rigs, which is completely false.
 
I guess time will tell. The examples I counted above already prove it's not at all as cut and dried as you think it is, though.
In terms of DX11 stuff, you already have something like Tomorrow Children, which circumvents the whole DX feature paradigm by using custom raytraced rendering (and runs at above 30FPS).

Yes, but TTC has pretty simple gameplay and a world that's pretty empty. Not knocking the game but it's not the same as a game like GTA5 or AC:U.
 

Lord Error

Insane For Sony
Yes, but TTC has pretty simple gameplay and a world that's pretty empty. Not knocking the game but it's not the same as a game like GTA5 or AC:U.
It's not simple gameplay at all. It's a real-time collaborative game where many people can participate in the world at the same time, and the environment is completely destructible/rebuildable. The world is only 'empty' because there are probably like ten people total working on the game, vs. a thousand people on AC:U, not because their engine is not capable of rendering more.
 

Reg

Banned
TTC looks way more interesting from a gameplay perspective than that gutter trash automated AC game.
 

Lethe82

Banned
Unlikely. They will have been binned and tested based on the current clock speed, at the current voltage settings, with the current cooler. While 90% of consoles may be able to support the increased clock speed without upgrading the cooling system or increasing the voltage, there will be a few that will not, because the particular chip only just passed the tests at the desired speed bin.

I still find it funny that they are attributing a >9.4% FPS advantage to the CPU alone. It is likely part of the reason, but when you are comparing 24 FPS to 28 FPS you are talking about a gap of roughly 17%, so there must be something else going on too.

They may just wait for a hardware revision.
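Quick back-of-the-envelope check of the percentages being thrown around above (the 1.75GHz/1.6GHz clocks and the 24-to-28 FPS spread are the figures under discussion, not my own measurements):

```python
# Sanity check: CPU clock advantage vs. the worst-case frame-rate gap.
xb1_clock, ps4_clock = 1.75, 1.6        # GHz
low_fps, high_fps = 24, 28

cpu_gap = (xb1_clock - ps4_clock) / ps4_clock * 100   # ~9.4%
fps_gap = (high_fps - low_fps) / low_fps * 100        # ~16.7%

print(f"CPU clock gap: {cpu_gap:.1f}%, FPS gap: {fps_gap:.1f}%")
```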
 
It's not simple gameplay at all. It's a real-time collaborative game where many people can participate in the world at the same time, and the environment is completely destructible/rebuildable. The world is only 'empty' because there are probably like ten people total working on the game, vs. a thousand people on AC:U, not because their engine is not capable of rendering more.

Well, even if it were, there is a limit to how much you'll be able to put on the screen while keeping that RT GI implementation in check. This stuff isn't free. The big question is how much more can it render and still maintain 1080p/30fps consistently? How much texture information can be rammed down its pipe, with the complexity of the interiors in AC:U, and still maintain a decent framerate?

You sound like in 5 years we'll have AC:U with realtime GI like in TTC and it will run at 1080p/30fps no matter the complexity. That's not going to happen IMO.
 

prag16

Banned
Grass has never been this cinematic.

No, seriously, am I missing something here? Why is a last-gen up-port, enhanced as it may be, chugging at sub-30? Devs, stop feeding us this slop and demanding 60 dollars for it.

Don't forget that PS360 had older but significantly beefier CPUs in terms of raw power. The CPU advantage of the new consoles isn't massive, and once you get done upgrading everything to take advantage of the more powerful GPUs and higher memory bandwidth, it's not a shock that the comparatively weak CPUs can't quite keep up.
 
Has anyone tried switching off DoF to see if that changes anything?

Yeah, I did; it makes no difference.

Curiously, the dropped frames when driving through traffic don't seem to appear at night on either console...

I'll test PS4 with the light bar and speaker effects turned off and see if that makes a difference during the day.
 

Forsete

Member
After having played the game for a few hours now, I am perfectly happy with how it is running. It is stable 90% of the time, and nowhere is the framerate terrible from what I have seen. A few dropped frames here and there I can tolerate for such a massive and glorious-looking game.

Good job Rockstar. Now blow us out of the water again with GTA VI. :p
 

-griffy-

Banned
Where? I'm not noticing parallax mapping. Could you give some examples?

[screenshot]


Look at the bushes here:
[screenshot]


[animated gif]


Many of the ground textures/sidewalks/roads/rocks feature it. The snow in the opening sequence has it. Sometimes it's using fairly low-res textures, so it's hard to notice the effect. Even roof tiles have it (at 2:20, also NSFW language).
 
lol @ the concern

Any drops to 26 are minor and are triggered by driving flat-out through certain city intersections during the day; even then they are not perfectly reproducible (hanging around in an area tends to eliminate them).

That is 1% of the GTA5 experience, considering Trevor hates the city, many missions are indoors or at night, or you are parachuting, swimming, hang gliding, flying a plane, surfing a train, or running in those areas -- or doing one of a hundred other things.

Suddenly it's a crisis that an open-world game hasn't got 30fps (which looks great, btw) nailed down in every single microsecond of the hours and hours of experience? If you are fixated on 60fps, are you going to throw a wobbly if the average and median is 60 but there are drops to 52fps?
 
[screenshot]


Look at the bushes here:
[screenshot]


[animated gif]


Many of the ground textures/sidewalks/roads/rocks feature it. The snow in the opening sequence has it. Sometimes it's using fairly low-res textures, so it's hard to notice the effect. Even roof tiles have it (at 2:20, also NSFW language).

Thanks, it's low-quality POM, but it is there. I should play more with the first-person option, because in third person, even when aiming, it's hard to notice.
 

ttech10

Member
We have very different ideas of fun ;p

It was fun taking out a grenade or rocket launcher. It also made for better police chases when I could hop on a bike and weave through large sections of traffic in Times Square.

My favorite, though, is that it added a nice touch of realism. I hope I see the day where, in a GTA game, all of the car parks and parking lots/spaces are populated by vehicles and the sidewalks have Unity levels of crowds. It's strange seeing how empty the pier/boardwalk is during peak hours, or near-empty car parks downtown during business hours.
 

STEaMkb

Member
My point is that since the consoles are PCs, most developers have already been figuring out ways to optimize their graphics engines -- for years, in fact. There is nothing special about the new console hardware that would allow them to make another engine more optimal

We have firsthand accounts from developers who, after porting their PC games to console, reported the games ran poorly. These consoles can be challenging under different circumstances (parallel execution, etc). There's plenty of room left for optimization.
 
Xbox One and PS4 both have CPUs that are composed of 8 cores of shit. Current iPad CPUs can run circles around them with six fewer cores.

I honestly think such a weak architecture (one that wasn't even good enough to compete with Atoms in the mobile space) shouldn't have been used in plugged-into-the-wall gaming consoles. AMD's solution of throwing more cores at the problem clearly doesn't cut it.
 
Who would have guessed that the XB1's 10% faster CPU would actually translate into a meaningful performance advantage?

With that said, can someone explain to me what the purpose of GPGPU is? I thought it was supposed to help offload some of the work from the CPU to the GPU. I recall the Resogun developer saying they were using mostly GPGPU and didn't even really need the CPU. Why can't other developers do the same?
 

Piggus

Member
Who would have guessed that the XB1's 10% faster CPU would actually translate into a meaningful performance advantage?

With that said, can someone explain to me what the purpose of GPGPU is? I thought it was supposed to help offload some of the work from the CPU to the GPU. I recall the Resogun developer saying they were using mostly GPGPU and didn't even really need the CPU. Why can't other developers do the same?

GPGPU isn't really suited for stuff like AI, traffic simulation, etc. It's great for physics though.
 
MS would have had to retest all chips to make sure they could handle the increased clock speed. There is no other way to guarantee it will work.

Doesn't it depend on how the chips are binned and the testing they go through? I don't think retesting is necessarily a must.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Who would have guessed that the XB1's 10% faster CPU would actually translate into a meaningful performance advantage?

Leadbetter guessed, because there is no evidence or source for this conclusion. The drops are not correlated with a 9% difference, and often the XB1 drops when the PS4 doesn't in traffic. It's probably just some stall in the pipeline for both.
 
Leadbetter guessed, because there is no evidence or source for this conclusion. The drops are not correlated with a 9% difference, and often the XB1 drops when the PS4 doesn't in traffic. It's probably just some stall in the pipeline for both.

Well, I'm going based off the article, but doesn't ACU also perform better on XB1 than PS4? And that's because of the XB1's slightly faster CPU.
 

RexNovis

Banned
4 fps = no discernible difference (in PS4's favor) on AC: Unity, but here it's interesting/fascinating/megaton? I appreciate the detail explored in other performance scenarios in this article, but the wording in some of the recent articles almost comes across as apologist for the Xbone's shortcomings.

On top of that, it went from "XBO is a solid 30FPS throughout!" and "Both versions offer the same graphical quality across platforms" to "XBO has drops in effect-laden scenarios, shootouts, and the countryside that are completely absent on PS4, but PS4 totally drops more frames in this specific spot, guys, so it's totally even" and "The XBO does indeed have less ground clutter and sports lower-resolution shadows and a slightly lower LOD in the countryside when compared to PS4, but who cares about any of that?!? We never noticed it when playing!"

I mean, come the fuck on. The game performs the same or better on PS4, with added graphical detail, 99% of the time. The video even shows the game dropping 2 more frames on average on XBO in like-for-like situations, not to mention having more scenarios where the frame rate drops on XBO and stays at a solid 30 on PS4. Somehow this stuff wasn't worth a mention, but the one area where the XBO holds a frame advantage is a fucking game changer? Absolutely ridiculous. Impartial analysis my ass.
 

Darknight

Member
I wonder if Sony's 1st-party teams will hit the so-called "CPU bottleneck" and have it affect their performance, as they are usually the ones pushing the system to its limits.

While it's obvious that the CPU is holding both systems back, are the other devs dealing with the same issue? And I'm not talking about Ubisoft.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
While it's obvious that the CPU is holding both systems back,

Why do people keep repeating stuff they have no idea about? Is this some Fox News echo chamber effect?

You are in a thread about a last-gen port. Last gen...
 

Marlenus

Member
Doesn't it depend on how the chips are binned and the testing they go through? I don't think retesting is necessarily a must.

Well, it is possible they always intended to release it with a 1.75GHz CPU and an 853MHz GPU, so they just binned for those, but IIRC they did state they were increasing the clocks. If they only tested at the original speeds, they would need to retest at the increased speed.
 

Darknight

Member
Why do people keep repeating stuff they have no idea about? Is this some Fox News echo chamber effect?

You are in a thread about a last-gen port. Last gen...

Oh, that's why I'm asking if other developers are facing the same issue, because I don't really buy it. I put "CPU bottleneck" in quotes for a reason. I'm curious whether 1st-party devs will have any issues, or whether they'll prove it's not really an issue.

I understand GTA5 is an HD port, so I'm not really concerned that this will be an issue for native games, unless we get shit engines from, say, Ubi.
 

-griffy-

Banned
Well, I'm going based off the article, but doesn't ACU also perform better on XB1 than PS4? And that's because of the XB1's slightly faster CPU.

Your question implies GTAV also performs worse on PS4, which clearly isn't the case. XBO performs better in one specific set of circumstances, whereas PS4 has a clear lead in other circumstances. Overall the result for both versions is they maintain 30fps a majority of the time, and have notable drops in specific, different circumstances. If anything the PS4 has the slight lead in performance overall.

Again, we don't know that AC:Unity performance is better on XBO over PS4 because of CPU differences; that is merely speculation from people based on specs. When Ubisoft came out and said they are working on performance optimization, and that the number of NPCs wasn't affecting performance so they wouldn't be changing that, it brought the CPU advantage into question. It might be some other simple issue causing poorer performance in AC:U on PS4.
 
Xbox One and PS4 both have CPUs that are composed of 8 cores of shit. Current iPad CPUs can run circles around them with six fewer cores.

I honestly think such a weak architecture (one that wasn't even good enough to compete with Atoms in the mobile space) shouldn't have been used in plugged-into-the-wall gaming consoles. AMD's solution of throwing more cores at the problem clearly doesn't cut it.

I like this post.
 
It runs at 30 FPS the majority of the time; quit being nitpicky. This is getting as bad as PC gamers with the nitpicking. I hate how this greater focus on tech has brought out the non-gamers to scrutinize little details. Back in the PS1 and PS2 era we played games where characters looked like block people, and they were some of the best games ever. Threads like this just need to be banned already.

But the bickering is pure entertainment
 
It runs at 30 FPS the majority of the time; quit being nitpicky. This is getting as bad as PC gamers with the nitpicking. I hate how this greater focus on tech has brought out the non-gamers to scrutinize little details. Back in the PS1 and PS2 era we played games where characters looked like block people, and they were some of the best games ever. Threads like this just need to be banned already.

Why do you have to drag PC gamers into this? Some people do this all the time on console threads and then complain when PC gamers reply to these posts. Please leave PC gaming out of whatever this thread is.
 

Marlenus

Member
Xbox One and PS4 both have CPUs that are composed of 8 cores of shit. Current iPad CPUs can run circles around them with six fewer cores.

I honestly think such a weak architecture (one that wasn't even good enough to compete with Atoms in the mobile space) shouldn't have been used in plugged-into-the-wall gaming consoles. AMD's solution of throwing more cores at the problem clearly doesn't cut it.

The only Atom it does not compete with is the latest one, which was released after the consoles came out. It is still faster than any tablet SoC by a good margin.
 

Gold_Loot

Member
Once again a thread gets derailed via PC jargon, and once again people are comparing PC hardware like-for-like with console specs. Stop doing this.
 

THRILLH0

Banned
People are always happy to "wait for the DF analysis" until it doesn't suit their preferred narrative.

I think the problems in the PS4 version are the lesser of two evils here. The firefights are where I have the most fun in GTA games, so I'd prefer they were stable. Besides, I've got FH2 right over here if I want to fang around the city...
 

Piggus

Member
So GPGPU is good at crunching numbers/calculations, but bad at keeping track of/directing numerous objects? Is that the gist of it?

I don't know much about the difference in how each is programmed, but yes, I believe it has to do with the complexity of, say, a set of AI instructions compared to calculating the physics of a flag flapping in the wind. The type of instructions used in AI simulation is better suited to a general-purpose CPU, whereas physics is MUCH more efficient on the GPU.
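To illustrate the shape of the two workloads (just a loose CPU-side Python sketch of the idea, not actual GPU code; the particle counts and agent fields are made up for illustration): physics-style updates apply the same arithmetic to every element, which is exactly what a GPU's wide SIMD hardware likes, while AI-style updates branch differently per agent, which defeats the GPU's lockstep execution.

```python
import numpy as np

# Physics-style: one uniform operation over every particle -- trivially
# data-parallel, so it maps well onto GPU compute (simulated here with numpy).
positions = np.random.rand(100_000, 3)
velocities = np.random.rand(100_000, 3)
gravity = np.array([0.0, -9.81, 0.0])
dt = 1.0 / 30.0
velocities += gravity * dt
positions += velocities * dt

# AI-style: per-agent branching on game state -- neighbouring "threads" take
# different paths, so a GPU's lockstep wavefronts gain little over a CPU.
def update_agent(agent):
    if agent["sees_player"] and agent["ammo"] > 0:
        return "attack"
    elif agent["health"] < 25:
        return "flee"
    return "patrol"
```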
 