
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

ICPEE

Member
That's the point I'm trying to make: it's not that simple. These GPUs are largely the same; the PS4 has more execution units, but they share a lot of the same hardware, and all of that shared hardware runs faster on the Xbone due to the higher clock.



Of course there are. Plenty of reasons actually.

Like low-performance SDKs with high CPU overhead, virtualization costs, etc. Have you not noticed a trend where, as soon as MS started improving their tools, performance started creeping closer to the PS4's? The PS4 will probably still outperform the Xbone, but early titles were not showing the real baseline performance of the Xbone hardware.


That's subjective as fuck.

I think games like FH2 and SO have nothing to be ashamed of when compared to PS4 exclusives; in many ways they even outdo them. But it's a bit pointless arguing over that.



I will try to simplify:

- The cutscenes are largely GPU bound. The PS4 drops frames as well, so it's not being capped or anything. Still, the performance delta is never anywhere near the 40% the extra flops on the PS4 would lead you to believe. Why? It might be because there are parts of the rendering pipeline (for example, setting up vertex data as fragments for the pixel shader) that run faster on the Xbone, which can make up for the difference. It might also be that the shaders they're using rely on bandwidth or some resource other than flops. Either way, the 40% isn't showing up here, while curiously a 10% overclock is fairly often netting more than 10% extra frames for the Xbone during gameplay.

- Using the smoke grenade causes frame drops on both platforms, but on the PS4 not only is the drop more severe, it's also the lowest point for the console (18fps). The PS4 has twice the number of ROPs, so why does this happen? It's kind of hard to pinpoint a culprit without any profiling data, but looking at the architectures might give an answer: the ESRAM on the Xbone provides, at its theoretical max, more bandwidth than the entire GDDR5 pool on the PS4, but that bandwidth is only reachable when reading from and writing to it at the same time, something that a huge curtain of smoke might very well do. The game uses deferred lighting, has tons of post-processing that relies on screen space, and has some alpha effects; it's not out of the ordinary to say they're often bandwidth bound, so in a scenario like that the ESRAM might be an advantage for the Xbone, despite it having fewer ROPs.

See what I'm talking about? The PS4 might be more powerful, but in one scenario it doesn't outperform by as much as it should, and in the other it's being outperformed, despite theoretically having more hardware to deal with the issue.
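
For anyone who wants to sanity-check the numbers being thrown around, here's a rough back-of-the-envelope sketch using the commonly cited specs (peak figures only, not profiled data; the ESRAM number in particular is a theoretical best case that assumes simultaneous reads and writes):

    # Rough peak numbers from the commonly cited specs -- approximations, not measurements
    def gpu_gflops(cus, clock_ghz, lanes_per_cu=64, ops_per_clock=2):
        return cus * lanes_per_cu * ops_per_clock * clock_ghz

    ps4_gflops   = gpu_gflops(18, 0.800)    # ~1843 GFLOPS
    xbone_gflops = gpu_gflops(12, 0.853)    # ~1311 GFLOPS
    print(ps4_gflops / xbone_gflops)        # ~1.41 -> where the "40% more flops" figure comes from

    ps4_gddr5_bw   = 176.0    # GB/s, unified GDDR5 pool
    xbone_ddr3_bw  = 68.0     # GB/s, main DDR3 pool
    xbone_esram_bw = 204.0    # GB/s quoted theoretical peak, only with simultaneous reads and writes

Whether a given scene actually tracks the flops number or the bandwidth numbers is exactly the point being argued above.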
 

Valnen

Member
Sony should not have let this pass certification without hitting 1080p/locked 30 fps minimum. A competent developer could have pulled it off.
 

Cynn

Member
Jesus fucking christ, Ubisoft. Alright, so in a perfect timeline:

  • This game is delayed by one year to get its shit sorted.
  • Rogue, which is apparently far better anyway, is cross-gen and becomes this year's AC game.

If this were a few months ago and you were an exec at Ubi, you'd think you'd see where the wind was blowing and do the above.

I never understood just throwing Rogue out there randomly anyway. One last cross-gen game would have been fine.
 

HardRojo

Member
I meant to say that I've never played an Assassin's Creed game, and thank god I'm not starting with this one.

The present-day stuff barely matters, so the Ezio Trilogy is a good way to start; you can then go on to AC Black Flag. Avoid AC3 like the plague, it's horrible, tedious and boring.
 
It really does take one hell of a fuck-up to have poorer performance on the better hardware. Did they just get 3 interns together and ask them to port it over to PS4 while everyone else worked on the Xbone?
 
No.

The PS4 reserves CPU time for the OS, too.

Yeah, that's the issue here. Two entire CPU cores are reserved for the OS; I think it's a waste of resources for running such a basic OS, especially when games like this could benefit a lot from having more CPU power available.
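
To put a rough number on that (a quick sketch, assuming the commonly reported figures: 8 Jaguar cores on both machines, 2 reserved for the OS on both, 1.6 GHz on PS4 vs 1.75 GHz on Xbone):

    # Crude CPU-budget comparison under the assumptions above -- ignores everything except cores and clocks
    cores_total, cores_reserved = 8, 2
    cores_for_games = cores_total - cores_reserved    # 6 on both consoles

    ps4_budget   = cores_for_games * 1.60    # 9.6  "core-GHz" available to games
    xbone_budget = cores_for_games * 1.75    # 10.5 "core-GHz" available to games
    print(xbone_budget / ps4_budget - 1.0)   # ~0.094 -> the ~9% CPU edge people keep citing

In a heavily CPU-bound game, that's the kind of margin that could plausibly show up as a few frames.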
 

RetroStu

Banned
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.
 
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.

MS paid you to write this huh?
 

SSReborn

Member
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.
Lol
 

mid83

Member
So my takeaway, as someone completely ignorant when it comes to hardware issues, is that the bottleneck is the NPC density throughout the city.

That just seems stupid. I always thought urban areas in AC games seemed full and crowded in a way that made other open world games seem barren. Not sure where the logic lies in adding so many more NPCs. Seems like a completely unnecessary reason to sacrifice performance.

For those of you who understand this stuff way better than me, is the density of NPCs in the city something that could be reduced and patched accordingly to help performance, or is it something far more complicated?
 
And we're back to last gen; forget about the PS4's slightly worse performance. This game's performance is really bad across the board on both of them.
 

Head.spawn

Junior Member
Well, yes, obviously, but I was under the impression the Xbox was running 3 operating systems at once, along with the Kinect features, which is why they'd decided to overclock their otherwise identical CPU to compensate.

The PS4 still reserves 2 cores though, so the fact that MS's OS does more is irrelevant, I would think.
 

joeblow

Member
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.


 

kabel

Member
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.

http://www.neogaf.com/forum/showthread.php?t=489769
 

thelastword

Banned
The PS4's version of Jaguar is capable of using a turbo mode, which should set the CPU clock to 2.75 GHz.
Sony might have to do something about this to avoid future issues with CPU-bound games; they could up the clock speed and release some CPU reserves from the OS. I imagine there's more they can release compared to Microsoft, since they have dedicated hardware for streaming and capture, etc.
 

i-Lo

Member
That's the point I'm trying to make: it's not that simple. These GPUs are largely the same; the PS4 has more execution units, but they share a lot of the same hardware, and all of that shared hardware runs faster on the Xbone due to the higher clock.



Of course there are. Plenty of reasons actually.

Like low-performance SDKs with high CPU overhead, virtualization costs, etc. Have you not noticed a trend where, as soon as MS started improving their tools, performance started creeping closer to the PS4's? The PS4 will probably still outperform the Xbone, but early titles were not showing the real baseline performance of the Xbone hardware.


That's subjective as fuck.

I think games like FH2 and SO have nothing to be ashamed of when compared to PS4 exclusives; in many ways they even outdo them. But it's a bit pointless arguing over that.



I will try to simplify:

- The cutscenes are largely GPU bound. The PS4 drops frames as well, so it's not being capped or anything. Still, the performance delta is never anywhere near the 40% the extra flops on the PS4 would lead you to believe. Why? It might be because there are parts of the rendering pipeline (for example, setting up vertex data as fragments for the pixel shader) that run faster on the Xbone, which can make up for the difference. It might also be that the shaders they're using rely on bandwidth or some resource other than flops. Either way, the 40% isn't showing up here, while curiously a 10% overclock is fairly often netting more than 10% extra frames for the Xbone during gameplay.

- Using the smoke grenade causes frame drops on both platforms, but on the PS4 not only is the drop more severe, it's also the lowest point for the console (18fps). The PS4 has twice the number of ROPs, so why does this happen? It's kind of hard to pinpoint a culprit without any profiling data, but looking at the architectures might give an answer: the ESRAM on the Xbone provides, at its theoretical max, more bandwidth than the entire GDDR5 pool on the PS4, but that bandwidth is only reachable when reading from and writing to it at the same time, something that a huge curtain of smoke might very well do. The game uses deferred lighting, has tons of post-processing that relies on screen space, and has some alpha effects; it's not out of the ordinary to say they're often bandwidth bound, so in a scenario like that the ESRAM might be an advantage for the Xbone, despite it having fewer ROPs.

See what I'm talking about? The PS4 might be more powerful, but in one scenario it doesn't outperform by as much as it should, and in the other it's being outperformed, despite theoretically having more hardware to deal with the issue.

I... uh... uhh... agree with this. ESRAM is da beast; Ubisoft is taking advantage by turning the theoretical max into the actual max, which is beyond the PS4's GDDR5, which is also highly susceptible to latency. Of course, a higher CPU clock speed doesn't hurt either.
 
this is what "parity" pretty much entails. More time spent on the weaker hardware to match expectations. This is what many feared would happen. I can just sit back and cont the days till Naughty Dog sets the precedent for next ten games on PS4. This is utterly baffling what i am seeing here.
 
So my takeaway, as someone completely ignorant when it comes to hardware issues, is that the bottleneck is the NPC density throughout the city.

That just seems stupid. I always thought urban areas in AC games seemed full and crowded in a way that made other open world games seem barren. Not sure where the logic lies in adding so many more NPCs. Seems like a completely unnecessary reason to sacrifice performance.

For those of you who understand this stuff way better than me, is the density of NPCs in the city something that could be reduced and patched accordingly to help performance, or is it something far more complicated?

According to some of the reviews it makes navigating through them a nightmare too.
 

Cynn

Member
So my takeaway, as someone completely ignorant when it comes to hardware issues, is that the bottleneck is the NPC density throughout the city.

That just seems stupid. I always thought urban areas in AC games seemed full and crowded in a way that made other open world games seem barren. Not sure where the logic lies in adding so many more NPCs. Seems like a completely unnecessary reason to sacrifice performance.

For those of you who understand this stuff way better than me, is the density of NPCs in the city something that could be reduced and patched accordingly to help performance, or is it something far more complicated?
NPC count and AI were a huge selling point for the game, so dialing that back probably wasn't an option.
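
Purely hypothetically, to show why this reads as a design decision rather than a technical impossibility: crowd density in most open-world engines boils down to a handful of tunable numbers. Nothing below comes from Ubisoft's engine; the names are invented.

    # Hypothetical sketch only -- invented names, not AnvilNext code
    MAX_CROWD_NPCS = 5000        # headline "thousands of NPCs" target for dense districts
    MIN_CROWD_NPCS = 1500        # floor below which the city would feel empty

    def crowd_budget(last_frame_ms, target_ms=33.3, current=MAX_CROWD_NPCS):
        """Shrink the simulated crowd when the previous frame ran long."""
        if last_frame_ms <= target_ms:
            return current
        return max(MIN_CROWD_NPCS, int(current * target_ms / last_frame_ms))

The hard part isn't the dial, it's that the dial was the marketing.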
 

omonimo

Banned
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.
So now you show your true face, Phil. Stop using a fake account.
 

Percy

Banned
This game really does seem to have been a top-to-bottom embarrassment on both of these platforms. Curiously, though, DF is really pushing quite hard on a minor difference between two versions of the game that both run like garbage.

Ah well... if nothing else this is certainly going to be a fun thread to refer back to when the next DF article showing the PS4 version of a game being better than the Xbox One version rolls into town.
 
Is it even historically accurate to have that many people walking on the streets of Paris more than 200 years ago, as if it's a crowded morning on Times Square?
 

TyrantII

Member
NPC count and AI were a huge selling point for the game, so dialing that back probably wasn't an option.

And yet they still failed on both of them: huge, distracting pop-in and the same brain-dead AI.

I would have cut down the NPCs; no one would have missed them if it got the game playable.
 
Why do people always blame Microsoft for EVERYTHING on this forum?

I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about it themselves if there were any truth to this?

Jesus, some of you never give up.

I think Sony is to blame for having a quarter of the CPU locked away for the OS as well as almost half of the system memory. I hope this changes with future firmware releases.
 
NO. A 9% increase is still a 9% increase when its competitor has the same number of cores. You don't multiply it.

I have two cores. You have two cores. I increase my clock speed 20%. Does that now make me 40% faster than you? No, it's still 20 freakin percent, because we have the same number of cores still.


Past me -

I actually don't follow, so please explain it like I'm 5. If the clock speed of a single core is increased 9%, it seems reasonable that a parallel task spread across all available cores gets a 9% increase on each core. In the case of a 6-core processor, is the total potential gain then 54% for any given parallel task?

So for example, if I have 120 tasks that can be performed in parallel (ignoring the overhead of managing those tasks, memory contention and so on), each core/thread would get 20 tasks. It seems reasonable to me that if I crank up the clock rate on each core by 9%, the entire batch of 120 tasks would complete 54% faster. Why is that thinking wrong just because another system has the same number of cores?
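
For what it's worth, a tiny worked example with made-up units shows why: when every core speeds up by the same 9%, the whole batch finishes about 9% sooner, not 54%, because the per-core gains happen in parallel rather than stacking on top of each other.

    # 120 parallel tasks on 6 cores; raise every core's clock by 9% and compare total batch times
    tasks, cores = 120, 6
    base_clock = 1.00                # work units per second per core (arbitrary)
    fast_clock = 1.09 * base_clock   # the 9% bump

    time_base = (tasks / cores) / base_clock   # 20.0 time units for the whole batch
    time_fast = (tasks / cores) / fast_clock   # ~18.3 time units

    print(time_base / time_fast)               # ~1.09 -> the batch is ~9% faster, full stop

The 54% comes from adding the same 9% six times, but those six speed-ups happen simultaneously; against another 6-core machine, the only thing that changed is the 9% clock.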
 
Sad to see such an interesting setting for a game wasted by bad project management and/or politics.
It will be interesting to see which other multiplats perform better on Xbox, and whether they happen to be games that also have a marketing deal.
 