
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

MMaRsu

Banned
I was thinking the same thing lol. Anyway, since this is a pretty CPU-limited game and the Xbox One CPU is a bit faster than the PS4's, I can see it being a few frames smoother on Xbox in the most CPU-heavy scenarios, without screaming that Microsoft paid Ubisoft etc...

More like Xbox One was the main platform. Developing for the lowest common denominator so nobody can bitch about better graphics on PS4.
 

Marlenus

Member
No, ACEs are for creating wavefronts generally.
And you can't do async compute when the CUs are fully occupied, which they are in many cases, especially in heavier scenarios. Sure, there will always be some async time available, but it can vary a lot and you will never be able to push all compute to async.

No game will reach 100% rendering efficiency so there will always be some available shader units somewhere on the GPU.

From Anandtech.

Anandtech said:
Meanwhile on the compute side, AMD’s new Asynchronous Compute Engines serve as the command processors for compute operations on GCN. The principal purpose of ACEs will be to accept work and to dispatch it off to the CUs for processing. As GCN is designed to concurrently work on several tasks, there can be multiple ACEs on a GPU, with the ACEs deciding on resource allocation, context switching, and task priority. AMD has not established an immediate relationship between ACEs and the number of tasks that can be worked on concurrently, so we’re not sure whether there’s a fixed 1:X relationship or whether it’s simply more efficient for the purposes of working on many tasks in parallel to have more ACEs.

One effect of having the ACEs is that GCN has a limited ability to execute tasks out of order. As we mentioned previously GCN is an in-order architecture, and the instruction stream on a wavefront cannot be reordered. However the ACEs can prioritize and reprioritize tasks, allowing tasks to be completed in a different order than they're received. This allows GCN to free up the resources those tasks were using as early as possible rather than having the task consuming resources for an extended period of time in a nearly-finished state. This is not significantly different from how modern in-order CPUs (Atom, ARM A8, etc) handle multi-tasking.
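To put some rough numbers on the "there will always be some available shader units" point, here is a minimal back-of-the-envelope sketch in Python. The utilisation and workload figures are made-up assumptions for illustration, not measurements from Unity or any other game:

Code:
# Rough occupancy sketch: how much async compute can hide in idle GPU time.
# All numbers below are illustrative assumptions, not measurements.

frame_time_ms = 33.3          # 30 fps frame budget
graphics_utilisation = 0.85   # assume graphics keeps the CUs ~85% busy on average
compute_work_ms = 8.0         # assume 8 ms of GPGPU work per frame if run serially

idle_ms = frame_time_ms * (1.0 - graphics_utilisation)  # time the CUs sit idle
hidden_ms = min(compute_work_ms, idle_ms)                # portion async can absorb
leftover_ms = compute_work_ms - hidden_ms                # portion still stealing render time

print(f"Idle GPU time per frame: {idle_ms:.1f} ms")
print(f"Compute hidden by async: {hidden_ms:.1f} ms")
print(f"Compute still competing with rendering: {leftover_ms:.1f} ms")

With these made-up numbers, some of the compute work hides in the idle gaps and the rest still competes with rendering, which is really what both sides of this argument are saying.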
 

jpax

Member
No, ACEs are for creating wavefronts generally.
And you can't do async compute when the CUs are fully occupied, which they are in many cases, especially in heavier scenarios. Sure, there will always be some async time available, but it can vary a lot and you will never be able to push all compute to async.

---


For sure in cutscenes, probably in gameplay too. It's clear that the Xbone is GPU-limited in cutscenes.

No program ever has 100% efficiency. That is the reason ACE is so good.

Edit: beaten above
 

Sweep14

Member
I think you'll see a lot of this. MS want parity and are clearly influencing that overtly for indie games, and less so for AAA titles - a clear example being Diablo III, where they forced parity from Blizzard.

Devs will make the lowest common denominator the lead dev platform - in the hope that the more powerful unit will cope with having it ported.

Because X1 takes the lead dev position, it's more likely to be easier to optimise and you'll get results like this.

If Unity was coded on PS4 as the lead platform, this outcome would not exist.

Then Sony should reject bad ports and make it clear to publishers that they will not accept unoptimized, botched software.
 

Ashes

Banned
This is their second assassin's creed game on the next gen console. You'd think they would learn how much they can get away with on this hardware.
 

TheYanger

Member
Not gonna lie, I'm not that surprised with these results after watching the GB quicklook. The sizes of those crowds in practice (I never believe the BS at E3 and such) are extremely large and impressive. Would've done a lot of good to cut some of that shit down in my opinion. If you cut the crowd density by like 25% I'd still have been impressed by their size.
 

virtualS

Member
Update your PS4s to 2.02 and we will approach greater parity.

Excuse me while I throw up. The thought of all those poor GPU transistors chained up by an exclusive marketing deal makes me ill.
 

eso76

Member
Welp, I was wrong: I thought parity meant we'd see the Xbone version performing much worse, but this at least does prove that the game is indeed CPU bound like Ubisoft said.

Still doesn't explain why the PS4 version couldn't be better as far as GPU related tasks go.

If both versions are running with the same exact assets/resolution/shaders/AA etc. and frame drops are, as it seems, CPU related, we can assume the PS4 version isn't using the PS4's GPU advantages at all, and it's fair to assume the game could have used higher res, better AA or whatever non-CPU-affecting improvement without impacting performance.
 

Marlenus

Member
Of course. Technical evidence is shite.

Moneyhat conspiracy is the only truth.

The move engines are there to assist with moving data into and out of ESRAM, something the PS4 does not have to do, so the CPU time saving from that is minimal at best.

A 9% clock speed advantage will never give you a 20% FPS advantage; the disparity in performance is too great for the CPU alone to be considered the primary cause. That leaves shoddy development work, which is practically a Ubisoft trademark.
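As a rough sanity check on that claim, here's the arithmetic in Python. The clock speeds are the publicly stated 1.6 GHz / 1.75 GHz figures; the frame rate is purely illustrative and assumes a perfectly CPU-bound frame, which is the best case for the clock advantage:

Code:
# Sanity check: what a pure CPU clock advantage buys in a fully CPU-bound frame.
# Clock speeds are the publicly stated figures; the frame rate is illustrative.

ps4_clock_ghz = 1.6
xb1_clock_ghz = 1.75
clock_advantage = xb1_clock_ghz / ps4_clock_ghz - 1.0    # ~9.4%

ps4_fps = 24.0                                           # hypothetical CPU-bound frame rate
best_case_xb1_fps = ps4_fps * (1.0 + clock_advantage)    # upper bound if 100% CPU-bound

print(f"Clock advantage: {clock_advantage:.1%}")                  # ~9.4%
print(f"Best-case XB1 frame rate: {best_case_xb1_fps:.1f} fps")   # ~26.2, not a 20% jump

Even in the ideal case the clock bump alone tops out at roughly the same percentage as the clock difference, which is the point being made here.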
 

KKRT00

Member
No game will reach 100% rendering efficiency so there will always be some available shader units somewhere on the GPU.

From Anandtech.
No program ever has 100% efficiency. That is the reason ACE is so good.

Edit: beaten above

Where did I write that there would be 100% rendering efficiency?
I actually wrote the opposite, but async time is highly variable and you can't push all your GPGPU tasks there in every scenario.

----

Still doesn't explain why the PS4 version couldn't be better as far as GPU related tasks go.

If both versions are running with the same exact assets/resolution/shaders/AA etc. and frame drops are, as it seems, CPU related, we can assume the PS4 version isn't using the PS4's GPU advantages at all, and it's fair to assume the game could have used higher res, better AA or whatever non-CPU-affecting improvement without impacting performance.

Do I have to repeat the same thing every page? Check the cutscene analysis; the game would drop heavily at 1080p.
 

oldergamer

Member
I wonder if people are upset because this harks back to how things often were on PS3. The PS3 was a significantly stronger machine but often ended up with worse framerates than the Xbox 360 versions.
Unlike resolution and jaggies and AA, which are about looking nice for the aesthetics, low framerates can literally make a game unplayable.
As explained in the DF article, they were heavily CPU bound. There was lots of GPU headroom, but GPUs don't have anything to do with the computation of 5000 NPCs, AI and so on. So the Xbox One's slightly superior CPU actually showed its graces here, while Ubi didn't take advantage of all the GPU headroom on the PS4, and that of course is extra upsetting on top of everything else.


Perhaps Unity should have strived for 2500 NPCs instead (and 5000 in that one scene, instead of 10000). I am not sure why having so many NPCs matters when so many of them are just copy-pasted models with the same faces. Takes me out of it completely. They should have made procedurally randomized faces.

PS3 wasn't significantly stronger. It had a more powerful CPU, but a less powerful GPU. The CPU often had to take on tasks from the GPU, not to mention it was much harder to utilize, which made squeezing high performance out of it difficult. I see the current systems being much closer in this regard.
 
Not gonna lie, I'm not that surprised with these results after watching the GB quicklook. The sizes of those crowds in practice (I never believe the BS at E3 and such) are extremely large and impressive. Would've done a lot of good to cut some of that shit down in my opinion. If you cut the crowd density by like 25% I'd still have been impressed by their size.

This is my thinking as well. Nothing here is surprising other than that they chose to sacrifice the game's IQ and framerate over a number. Yes, it's true it feels like a great sales trick to tout 5000! 10000! But at the same time, these CPUs are not built for that, and people are getting thrown out of the immersion anyway by the way all the faces are the same or how character models swap/change right in front of you.
 

Mastperf

Member
Of course. Technical evidence is shite.

Moneyhat conspiracy is the only truth.
What technical evidence? You have one unfinished open-world game performing better on the XB1. You seem to be ignorant about PS4 hardware and mention things that the PS4 doesn't need (move engines) or things it already has, like unique buses.
 

oldergamer

Member
The move engines are there to assist with moving data into and out of ESRAM, something the PS4 does not have to do, so the CPU time saving from that is minimal at best.

A 9% clock speed advantage will never give you a 20% FPS advantage; the disparity in performance is too great for the CPU alone to be considered the primary cause. That leaves shoddy development work, which is practically a Ubisoft trademark.

The shoddy development work comment isn't realistic. What if the audio in the game is what causes a larger portion of the delta between the two systems? Perhaps the XB1 has more capacity to handle all the audio in the game, whereas the CPU plays a bigger role in handling audio on the PS4.
 

Marlenus

Member
The shoddy development work comment isn't realistic. What if the audio in the game is what causes a larger portion of the delta between the two systems? Perhaps the XB1 has more capacity to handle all the audio in the game, whereas the CPU plays a bigger role in handling audio on the PS4.

The GPUs both have TrueAudio; if the devs are not using it and are putting the audio workload on the CPU, they are stupid.
 
The game looks quite good, but the performance is terrible. My 770 renders scenes in the fucking teens on moderately high settings. I'd say it's a shitty port, but then the console versions run like shit themselves, so is it really a shitty port or is it simply a shitty game?
 

Elios83

Member
Of course. Technical evidence is shite.

Moneyhat conspiracy is the only truth.

You're not bringing in any technical evidence.
The move engines in the XB1 are used to assist the eSRAM.
The CPU clock speed advantage the XB1 has is pretty much insignificant in real-world performance, and on PS4 they could use part of the GPU to assist the CPU thanks to GPGPU; considering both versions run at the same resolution, that is even more true.
Also, if the game is CPU limited and the PS4 has more GPU headroom, it's not clear why the game doesn't run at a higher res on PS4 while keeping the same CPU-bound frame rate.
There really is no conspiracy, btw; Ubisoft has just fucked up. They were way behind schedule, the PS4 version got much less development time and the focus was on their co-marketed XB1 lead platform, and both versions have been released in an unfinished state. The parity debacle last month was the tip of the iceberg for them.
 

iceatcs

Junior Member
Looks like you need a massive group to achieve parity.

You would need a lot of people to get the Xbox One version on par?
Look at Rockstar, Bungie, EA's internal studios and then Ubisoft Montreal, all of which have hundreds of people in the studio or spread across multiple studios. So you will have a number of people on both sides - like internet console wars inside the studio.

Hopefully in the future there will be no need for a massive team or $100m+ per game.
 

Marlenus

Member
Do I have to repeat the same thing every page? Check the cutscene analysis; the game would drop heavily at 1080p.

Maybe, but with the graphical fidelity on display in those cutscenes you would expect a rock-solid 30 fps on both consoles. They are no better than other games, so what is causing the crappy frame rates other than inefficient code?
 

oldergamer

Member
The GPUs both have TrueAudio; if the devs are not using it and are putting the audio workload on the CPU, they are stupid.
So they both have TrueAudio, but as I recall weren't there some extra DSPs in the XB1 audio chip, as it was more custom? I recall them saying they packed a lot more into it and it was almost like a small CPU?

Anyway, having the hardware and keeping it fed with data could be different between the two systems. The PS4 might have the CPU moving audio data to its audio processors, whereas the XB1 doesn't need to eat up those resources.
 

RexNovis

Banned
I only have one thing to say...

QUp7Biu.gif

RikerPicardLOL.gif


Perfection.
 

Marlenus

Member
So they both have TrueAudio, but as I recall weren't there some extra DSPs in the XB1 audio chip, as it was more custom? I recall them saying they packed a lot more into it and it was almost like a small CPU?

Anyway, having the hardware and keeping it fed with data could be different between the two systems. The PS4 might have the CPU moving audio data to its audio processors, whereas the XB1 doesn't need to eat up those resources.

They were talked about a lot pre-launch, but didn't that just end up being something to do with Kinect? I have not looked into it for a long while, so I would have to do some further digging to find out.

Why try to find complicated reasons for a performance difference bigger than the clock speed difference? The simplest explanation is that they developed the game in DX for the PC and Xbox One and then just ported it over to the PS4 and did not really do much, if any, performance tuning.
 

jpax

Member
Of course. Technical evidence is shite.

Moneyhat conspiracy is the only truth.

Stop making a fool out of yourself... You presented zero evidence, except the presumed clock rate. Move engines? Why would Sony want something like that? MS cared more about their architecture? Everything points to the opposite.
Just stop...
Just stop...

Edit: beaten a thousand times ;-) but hey it was directed at me :)
 

Mastperf

Member
So they both have TrueAudio, but as I recall weren't there some extra DSPs in the XB1 audio chip, as it was more custom? I recall them saying they packed a lot more into it and it was almost like a small CPU?

Anyway, having the hardware and keeping it fed with data could be different between the two systems. The PS4 might have the CPU moving audio data to its audio processors, whereas the XB1 doesn't need to eat up those resources.
The majority of that extra hardware was put in for Kinect and is unusable by devs. We don't know how much, if any, was returned with the removal of the Kinect reserves since it was for voice recognition. You seem to be trying to find any reason possible to explain it other than what it clearly is. The game is a buggy and unfinished mess that shouldn't be used as any sort of technical benchmark for either console.
 

Marlenus

Member
The game is a buggy and unfinished mess that shouldn't be used as any sort of technical benchmark for either console.

This. The game seems like it should have had a three-month QA and optimisation period prior to release, but that would have pushed it into next year, so Ubi just released it in this state.

If the devs get to work on it and patch it, I hope DF revisits it to see if they managed to improve it.
 

DasDamen

Member
If both versions are running with the same exact assets/resolution/shaders/AA etc. and frame drops are, as it seems, CPU related, we can assume the PS4 version isn't using the PS4's GPU advantages at all, and it's fair to assume the game could have used higher res, better AA or whatever non-CPU-affecting improvement without impacting performance.

Maybe they didn't want people to think that the graphical upgrades would be to blame for the garbage performance? I don't know. I'm grasping at straws here.
 

RexNovis

Banned
Oh, it's like nobody on GAF ever pointed out that this could happen in open-world games with heavy CPU demands.

Except some people did.

It's not only about the 150MHz upclock; MS cared more about softening the Jaguar cores' deficiencies with all that move engine crap and dedicated buses, to max out performance and lose fewer cycles.

OK people, time to surrender your parity posts, your moneyhat accusations and your lazy-devs claims.

Of course. Technical evidence is shite.

Moneyhat conspiracy is the only truth.

RikerPeaceOutLOL.gif


Please just stop. You obviously have no clue what you are talking about and, no, false dichotomies are not going to hide your complete ignorance.
 
Async GPGPU exists, but most GPGPU-related algorithms will not run async. So yeah, it's possible to some degree, but most of the time GPGPU will take GPU time away from rendering.

---

But their games work fine on low-spec and mid-range PCs. See the Watch Dogs or AC IV analyses on DF for lower-spec PCs or GPUs.

Watch Dogs did NOT run fine on my mid-high end PC at all.
 
Welp, I was wrong: I thought parity meant we'd see the Xbone version performing much worse, but this at least does prove that the game is indeed CPU bound like Ubisoft said.

Still doesn't explain why the PS4 version couldn't be better as far as GPU related tasks go.

If both versions are running with the same exact assets/resolution/shaders/AA etc. and frame drops are, as it seems, CPU related, we can assume the PS4 version isn't using the PS4's GPU advantages at all, and it's fair to assume the game could have used higher res, better AA or whatever non-CPU-affecting improvement without impacting performance.

Reposting this graph from the previous page - could GPU bandwidth be the constraint if the CPU is stealing loads of it?


Is anyone more qualified than me able to confirm if that graph is legit or not? It's the only thing I can think of that would explain the difference, and in particular why there's no resolution bump on a CPU-bound game. Either that, or simply not as much effort was put into the PS4 version.
 

Ty4on

Member
Is anyone more qualified than me able to confirm if that graph is legit or not? It's the only thing I can think of that would explain the difference, and in particular why there's no resolution bump on a CPU-bound game. Either that, or simply not as much effort was put into the PS4 version.

They likely ran out of time, and the game still chugs in cutscenes from GPU load (look at how the XB1 suddenly loses). Resolution, AFAIK, does affect RAM usage, which they would have had to debug if running the PS4 version at 1080p.
 

Harlock

Member
Like that guy said in the Bombcast email: Ubisoft reuses as much code as possible. Plus a short time to release the game. Very poor optimization on each platform.

Last gen, some companies started building on the PS3 first, to run better on the trickiest platform. Maybe now that situation applies to the X1. They just aren't getting enough time to do a decent job on the PS4 port.
 
Reposting this graph from the previous page - could GPU bandwidth be the constraint if the CPU is stealing loads of it?



Is anyone more qualified than me able to confirm if that graph is legit or not? It's the only thing I can think of that would explain the difference, and in particular why there's no resolution bump on a CPU-bound game. Either that, or simply not as much effort was put into the PS4 version.

The graph is legit, but I don't think that's the case here.

What I'm wondering now is: isn't there a direct bus between the CPU and GPU parts of the PS4's APU? Couldn't this be used (in special situations) to "stream" data from the memory through the GPU to the CPU without decreasing the overall memory bandwidth?

Probably not.
 

Marlenus

Member
Reposting this graph from the previous page - could GPU bandwidth be the constraint if the CPU is stealing loads of it?



Is anyone more qualified than me able to confirm if that graph is legit or not? It's the only thing I can think of that would explain the difference, and in particular why there's no resolution bump on a CPU-bound game. Either that, or simply not as much effort was put into the PS4 version.

CPU --> RAM bandwidth is limited to 20 GB/s. That leaves 156 GB/s available to the GPU, which is in line with the 7850 and 7870, and those do not seem to be too badly affected by memory bandwidth limitations.

This can be shown by comparing the 7870 GHz to the R7 265. The R7 265 has 179.2 GB/s of bandwidth, but the 7870 GHz, with only 153.6 GB/s, is the faster performer, as you would expect given its higher shader and TMU count.
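Quick arithmetic on that split in Python, taking the 20 GB/s CPU ceiling from the quoted graph at face value (the 176 GB/s total is the PS4's stated peak GDDR5 bandwidth; the desktop figures are the ones cited above):

Code:
# Bandwidth-split sketch, taking the quoted figures at face value.

total_bw = 176.0          # GB/s, PS4 GDDR5 peak bandwidth
cpu_bw_cap = 20.0         # GB/s, CPU ceiling per the quoted graph
gpu_bw_left = total_bw - cpu_bw_cap

# Desktop reference points for context (peak memory bandwidth):
hd7870_ghz_bw = 153.6     # GB/s
r7_265_bw = 179.2         # GB/s

print(f"Worst-case bandwidth left for the GPU: {gpu_bw_left:.0f} GB/s")
print(f"HD 7870 GHz: {hd7870_ghz_bw} GB/s, R7 265: {r7_265_bw} GB/s")

Even in the worst case the GPU is left with bandwidth comparable to those desktop cards, which is the point of the comparison.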

The most reasonable explanation is that they did not really put any time into tuning the PS4 version which is still required, despite having the same hardware architecture, due to the difference in the APIs.
 

oldergamer

Member
The majority of that extra hardware was put in for Kinect and is unusable by devs. We don't know how much, if any, was returned with the removal of the Kinect reserves since it was for voice recognition. You seem to be trying to find any reason possible to explain it other than what it clearly is. The game is a buggy and unfinished mess that shouldn't be used as any sort of technical benchmark for either console.

I'm theorizing reasons. IMO some of you aren't trying to think of valid reasons and instead resort to the lazy-developers excuse (which just doesn't fly in this instance). I actually don't think that hardware was specifically for Kinect, not from what I remember. I'm fairly certain the two systems aren't exactly identical in what they can do with the audio hardware.
 
The move engines are there to assist with moving data into and out of ESRAM, something the PS4 does not have to do, so the CPU time saving from that is minimal at best.

A 9% clock speed advantage will never give you a 20% FPS advantage; the disparity in performance is too great for the CPU alone to be considered the primary cause. That leaves shoddy development work, which is practically a Ubisoft trademark.

What technical evidence? You have one unfinished open-world game performing better on the XB1. You seem to be ignorant about PS4 hardware and mention things that the PS4 doesn't need (move engines) or things it already has, like unique buses.

You're not bringing in any technical evidence.
The move engines in the XB1 are used to assist the eSRAM.
The CPU clock speed advantage the XB1 has is pretty much insignificant in real-world performance, and on PS4 they could use part of the GPU to assist the CPU thanks to GPGPU; considering both versions run at the same resolution, that is even more true.
Also, if the game is CPU limited and the PS4 has more GPU headroom, it's not clear why the game doesn't run at a higher res on PS4 while keeping the same CPU-bound frame rate.
There really is no conspiracy, btw; Ubisoft has just fucked up. They were way behind schedule, the PS4 version got much less development time and the focus was on their co-marketed XB1 lead platform, and both versions have been released in an unfinished state. The parity debacle last month was the tip of the iceberg for them.

Increased CPU resources on the Xbox One are a subject already discussed at length on this same forum. You just have to remember all those "balance" posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2 GHz, refusing to accept any evidence of stock speeds they get. Thread after thread.

Then they blame devs every time they consider the gap not wide enough, like some anorexic girl in front of the mirror, unable to see anything beyond her own belly.

Also, there are many people claiming that 150 MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound the current gen is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this game's launch proves it. Deal with it.
 

Chobel

Member
Increased CPU resources on the Xbox One are a subject already discussed at length on this same forum. You just have to remember all those "balance" posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2 GHz, refusing to accept any evidence of stock speeds they get. Thread after thread.

Then they blame devs every time they consider the gap not wide enough, like some anorexic girl in front of the mirror, unable to see anything beyond her own belly.

Also, there are many people claiming that 150 MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound the current gen is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this game's launch proves it. Deal with it.
post-33537-Jim-Carrey-Truman-Show-gif-wha-cIrC.gif


The PS4 CPU doesn't need any of these "increased CPU resources" because it doesn't have eSRAM.
 

Mastperf

Member
I'm theorizing reasons. IMO some of you aren't trying to think of valid reasons and instead resort to the lazy-developers excuse (which just doesn't fly in this instance). I actually don't think that hardware was specifically for Kinect, not from what I remember. I'm fairly certain the two systems aren't exactly identical in what they can do with the audio hardware.
According to an MS employee on Beyond3D, it was for Kinect.
 

Mastperf

Member
Increased CPU resources on the Xbox One are a subject already discussed at length on this same forum. You just have to remember all those "balance" posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2 GHz, refusing to accept any evidence of stock speeds they get. Thread after thread.

Then they blame devs every time they consider the gap not wide enough, like some anorexic girl in front of the mirror, unable to see anything beyond her own belly.

Also, there are many people claiming that 150 MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound the current gen is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this game's launch proves it. Deal with it.
Are you ok?
 

jpax

Member
Increased CPU resources on the Xbox One are a subject already discussed at length on this same forum. You just have to remember all those "balance" posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2 GHz, refusing to accept any evidence of stock speeds they get. Thread after thread.

Then they blame devs every time they consider the gap not wide enough, like some anorexic girl in front of the mirror, unable to see anything beyond her own belly.

Also, there are many people claiming that 150 MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound the current gen is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this game's launch proves it. Deal with it.

As it seems you will not stop making a fool out of yourself... keep rocking to the beat of ignorance!
 