
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

I wonder if people are upset because this harks back to how things often were on PS3. The PS3 was a significantly stronger machine, but its versions often ended up with worse framerates than the Xbox 360 versions.
Unlike resolution, jaggies, and AA, which are about looking nice for aesthetics, low framerates can literally make a game unplayable.
As explained in the DF article, they were heavily CPU bound. There was lots of GPU headroom, but GPUs don't have anything to do with the computation of 5,000 NPCs, AI, and so on. So the Xbox One's slightly faster CPU actually showed its strengths here, while Ubi didn't take advantage of all the GPU headroom on the PS4, and that, of course, is extra upsetting on top of everything else.
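For what it's worth, here's a toy frame-time model (a rough sketch with illustrative numbers, nothing measured from the actual game) of what "CPU bound with GPU headroom" means: the frame only finishes when the slower of the two processors is done, so a faster GPU buys nothing once the CPU is the bottleneck.

```python
# Toy model: CPU cost grows with NPC count, GPU cost stays roughly flat.
# All numbers are made up purely for illustration.
def frame_ms(npc_count, cpu_base_ms=8.0, cpu_us_per_npc=4.0, gpu_ms=18.0):
    cpu_ms = cpu_base_ms + npc_count * cpu_us_per_npc / 1000.0
    return max(cpu_ms, gpu_ms)  # whichever processor finishes last sets the pace

for npcs in (1000, 2500, 5000, 10000):
    ms = frame_ms(npcs)
    print(f"{npcs:>5} NPCs -> {ms:5.1f} ms/frame (~{1000 / ms:4.1f} fps)")
```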


Perhaps Unity should have strived for 2,500 NPCs instead (and 5,000 in that one scene, instead of 10,000). I am not sure why having so many NPCs matters when so many of them are just copy-pasted models with the same faces. It takes me out of it completely. They should have made procedurally randomized faces.
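On the procedurally randomized faces point, a minimal sketch of the idea (the parameter names are invented for illustration, not anything from Ubisoft's actual character system): seed a per-NPC random generator from a stable ID and perturb a few base face parameters, so identical base models stop reading as obvious clones.

```python
import random

# Hypothetical face parameters; a real character system would have its own set.
BASE_FACE = {"jaw_width": 1.0, "nose_length": 1.0, "eye_spacing": 1.0, "cheek_depth": 1.0}

def face_params(npc_id):
    rng = random.Random(npc_id)  # stable seed -> the same NPC gets the same face every time
    return {name: value * rng.uniform(0.9, 1.1) for name, value in BASE_FACE.items()}

print(face_params(42))
print(face_params(43))  # a different NPC gets a slightly different face
```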
 
I read once that the PS4 has 4.5 GB of RAM for games and the XOne 5 GB; the rest of the memory is reserved for the OS.

Under some circumstances developers can use more RAM on the PS4, but they can never rely on having more than 4.5 GB.

Taken together with the faster CPU of the XOne, this would explain the performance difference.

But the PS4's RAM is much faster and less complex than the XBO's move engines and ESRAM setup. It's more likely that the Xbox One was simply the lead platform in Montreal and was therefore optimized a bit better.
 

Asmodai48

Member
I wonder if people are upset because this harks back to how things often were on PS3. The PS3 was a significantly stronger machine, but its versions often ended up with worse framerates than the Xbox 360 versions.
Unlike resolution, jaggies, and AA, which are about looking nice for aesthetics, low framerates can literally make a game unplayable.
As explained in the DF article, they were heavily CPU bound. There was lots of GPU headroom, but GPUs don't have anything to do with the computation of 5,000 NPCs, AI, and so on. So the Xbox One's slightly faster CPU actually showed its strengths here, while Ubi didn't take advantage of all the GPU headroom on the PS4, and that, of course, is extra upsetting on top of everything else.


Perhaps Unity should have strived for 2,500 NPCs instead (and 5,000 in that one scene, instead of 10,000). I am not sure why having so many NPCs matters when so many of them are just copy-pasted models with the same faces. It takes me out of it completely. They should have made procedurally randomized faces.

PS3 was significantly stronger?
 

Faith

Member
Why is this surprising? The Xbox One has a faster CPU and lower latency RAM.

There will always be games that perform better on the Xbox One.
 

Durante

Member
Wow 10% or less upclock and now it's got a MORE powerful CPU
Well, yes, that's how it works. If you take the same CPU architecture and have one instance of it clocked higher than the other, then it's by definition more powerful.

True, it also matches the results of those Ubisoft tests, but it does seem weird considering other developers' past comments about both CPUs and both consoles' draw call costs (see the Witcher 3 tech lead interview and the earlier Bink benchmarks, for example), which would matter when you are dealing with lots of objects and would help the slower CPU.
Still, their AI and animation code is probably the big decider there.
Yeah, I also wonder about that part. Maybe MS' recent SDK releases put them on par in CPU overhead.
 

Durante

Member
Well that wouldn't be a problem for a team of 10,000.
They could have a team of 100000 people working on the game and there would still be some algorithms which are inherently not suited to GPGPU.

The fact that so many people consider it a magic bullet, interchangeable with CPU performance, is quite the marketing success.
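A small sketch of the distinction, purely as an illustration: the first function is embarrassingly parallel (every element is independent) and maps naturally onto a GPU, while the second carries a serial dependency chain with data-dependent branching, which is exactly the kind of work that tends to stay on the CPU.

```python
def gpu_friendly(values):
    # Each output depends only on its own input, so thousands of GPU threads
    # could each handle one element independently.
    return [v * 0.5 + 1.0 for v in values]

def gpu_hostile(values):
    # Each step depends on the previous result and branches on it, so the
    # work cannot simply be split across thousands of threads.
    state = 0.0
    out = []
    for v in values:
        state = state * 0.25 - v if state > 10.0 else state + v
        out.append(state)
    return out

print(gpu_friendly([1.0, 2.0, 3.0]))
print(gpu_hostile([1.0, 2.0, 3.0]))
```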
 
PS3 was significantly stronger?

Yes, the PS3 had a Cell processor, which they claimed was on the level of a supercomputer. It was notoriously difficult to develop for, though. They also had problems with the PS2's Emotion Engine. Several games ended up looking better on Dreamcast as well (Dead or Alive 2 being one of the more striking examples, if I remember correctly).
The PS4 is not supposed to be more difficult to develop for, though!
 

Lord Error

Insane For Sony
Why is this surprising? The Xbox One has a faster CPU and lower latency RAM.

There will always be games that perform better on the Xbox One.
It's surprising because this is the first one, or at least the first notable game, where it performs better with all other things being identical. The lower-latency RAM thing is BS, by the way.
 

jryi

Senior Analyst, Fanboy Drivel Research Partners LLC
The Xbox One has a faster CPU and lower latency RAM.

I've been trying to find some information about the latency advantage, but with no luck so far. Is there actual data to back this up somewhere out there, or is this just one of those "facts" that has no basis in reality?
 
Was this posted yet? If legit, it may explain some of this (heavy CPU use of memory bandwidth reduces GPU bandwidth by a disproportionately larger amount).

[Image: PS4-GPU-Bandwidth-140-not-176.png]
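If the figure is legit, a back-of-the-envelope reading of it (only the 176 GB/s peak and ~140 GB/s effective numbers come from the image title; the CPU traffic figure below is a pure assumption for illustration):

```python
peak_bw = 176.0           # GB/s, PS4 GDDR5 theoretical peak
effective_gpu_bw = 140.0  # GB/s, effective GPU figure claimed in the slide
cpu_traffic = 10.0        # GB/s, hypothetical CPU memory demand

lost = peak_bw - effective_gpu_bw  # 36 GB/s of GPU bandwidth gone
print(f"GPU loses ~{lost:.0f} GB/s for ~{cpu_traffic:.0f} GB/s of CPU traffic "
      f"(~{lost / cpu_traffic:.1f}x the CPU's own usage)")
```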
 
They could have a team of 100000 people working on the game and there would still be some algorithms which are inherently not suited to GPGPU.

The fact that so many people consider it a magic bullet, interchangeable with CPU performance, is quite the marketing success.



Yep. Exactly this. GPGPU became a buzzword. I've even read people using GPGPU as a hardware term, as in "this GPU is a GPGPU".

People just don't get that GPU computation remains limited to certain fields. It cannot replace a CPU. It can ease the workload for some tasks, of course, but that eats into your GPU power.

But if it were that easy, then consoles would only be made of GPUs with GDDR5.
 
They could have a team of 100000 people working on the game and there would still be some algorithms which are inherently not suited to GPGPU.

The fact that so many people consider it a magic bullet, interchangeable with CPU performance, is quite the marketing success.

So, by and large, because it is CPU-heavy, Unity would always run better on the One than on the PS4? Was there nothing Ubi could do with the Sony tech to get it to run better?
 

big_z

Member
I've been trying to find some information about the latency advantage, but with no luck so far. Is there actual data to back this up somewhere out there, or is this just one of those "facts" that has no basis in reality?


RAM latency: A delay in transmitting data between a computer's RAM and its processor. Since RAM is not necessarily fast compared to the computer's processor, RAM latency can occur, causing a delay between the time a computer's hardware recognizes the need for a RAM access (initiates a request for data) and the time the data or instruction is available to the processor. If the CPU requests data that is not stored in the cache, then it will have to wait for the RAM to retrieve the data, opening the door to latency problems.


A slower CPU plus higher latency equals lower performance in certain situations.
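A rough illustration of the cache-miss point (a toy Python script, not a rigorous benchmark; a real test would use a native-code microbenchmark): chasing pointers through memory in a random order stalls on every access, while scanning the same data linearly streams through it.

```python
import random
import time

N = 1_000_000
order = list(range(N))
random.shuffle(order)          # random visit order -> cache-hostile access pattern
next_idx = [0] * N
for i in range(N - 1):
    next_idx[order[i]] = order[i + 1]

def chase(start, steps):
    i = start
    for _ in range(steps):
        i = next_idx[i]        # each load depends on the previous one finishing
    return i

t0 = time.perf_counter()
chase(order[0], N - 1)
print(f"pointer chase: {time.perf_counter() - t0:.3f} s")

t0 = time.perf_counter()
total = sum(next_idx)          # sequential access, prefetcher-friendly
print(f"linear scan:   {time.perf_counter() - t0:.3f} s")
```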

So, by and large, because it is CPU-heavy, Unity would always run better on the One than on the PS4? Was there nothing Ubi could do with the Sony tech to get it to run better?

If they had spent time optimizing the engine for each console, it would run better on both. Instead, they seemed to go for a one-size-fits-all approach.

If Ubi really wanted to get into optimization, they could offload the crowd AI onto servers, much like Titanfall did. Not connected online? Reduce the number of people on screen.
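Something along those lines could look like the following sketch (the class and function names are entirely hypothetical, not anything from Ubisoft's actual code): simulate the heavy ambient-crowd AI on a server when online, and fall back to a smaller locally simulated crowd when offline.

```python
class CrowdServer:
    def fetch_crowd_state(self, count):
        # Stand-in for a network call that returns server-simulated NPC states.
        return [("npc", i) for i in range(count)]

class LocalCrowdSim:
    def step(self, count):
        # Reduced crowd simulated on the local CPU.
        return [("npc", i) for i in range(count)]

def update_crowd(online, server, local_sim, full_count=5000, offline_count=1500):
    if online:
        return server.fetch_crowd_state(full_count)  # heavy AI runs server-side
    return local_sim.step(offline_count)             # offline: shrink the crowd

crowd = update_crowd(online=False, server=CrowdServer(), local_sim=LocalCrowdSim())
print(len(crowd))  # 1500 NPCs when playing offline
```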
 

Durante

Member
So, by and large, because it is CPU-heavy, Unity would always run better on the One than on the PS4? Was there nothing Ubi could do with the Sony tech to get it to run better?
I don't have nearly enough information to make any judgment on that.

In general, for a huge game like this, there are always lots of things you can do better on every platform it's released for.

Yep. Exactly this. GPGPU became a buzzword. I've even read people using GPGPU as a hardware term, as in "this GPU is a GPGPU".
I was really amused when I first heard the term used like this. It's actually pretty common now. "GPGPU" used to be the software term, but now that seems to be "GPU computing", and "GPGPU" has turned into a hardware term (making very little sense).
 

TGO

Hype Train conductor. Works harder than it steams.
It seems even DF are confused as to why the PS4 isn't at a higher resolution, not to mention the lower framerate it has at 900p. Something is obviously not right.
 

Mooreberg

Member
I wonder if some people who get this in a bundle are going to think their system is screwed up. The YouTube videos of glitches are heinous.
 
They could have a team of 100000 people working on the game and there would still be some algorithms which are inherently not suited to GPGPU.

The fact that so many people consider it a magic bullet, interchangeable with CPU performance, is quite the marketing success.

I don't think everyone sees it that way, but it can still be used to help the console.
I see some people saying GPGPU is going to take years to use, when we had a launch game that used it.
Yeah, it was from a small team and it was first party, but it's not something that's going to take years to use. The Tomorrow Children also uses it.
Even if the game is CPU bound, at the end of the day it looks like Ubi did not use any of the PS4's extra GPU power, even if only for some better AA.
 

Scanna

Member
I still have friends who are telling me that on PS4 the framerate is so much better... and I can trust them, but DF is usually very reliable...
 
Given the specs and the performance of the last AC, the only logical conclusion is that MS paid Ubi not to improve the PS4 version; therefore, paid parity. Unless Ubi comes out with a detailed essay along with factual evidence explaining exactly why the PS4 version cannot perform better than the XB1 version, there is no reason to believe otherwise. MS and Ubi have officially killed all the goodwill I ever had for them with this move. This is simply unacceptable.
Lol
 

Vroadstar

Member
Well, yes, that's how it works. If you take the same CPU architecture and have one instance of it clocked higher than the other, then it's by definition more powerful.

Maybe next time please post my entire quote so you don't take it out of context? He's taking it as a WIN based on his console's MORE powerful CPU, when the game is clearly performing terribly on both consoles.

To put it simply, he is saying that based on this game, a 10% advantage is now MORE powerful, while the 40%-50% advantage of the other console is considered only SLIGHTLY more powerful. So by your definition, is that how it works?
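For reference, the rough numbers behind those percentages (based on the commonly reported specs, so treat them as approximate): 1.75 GHz vs 1.6 GHz on the CPU side, and roughly 1.31 TFLOPS vs 1.84 TFLOPS on the GPU side.

```python
xb1_cpu_ghz, ps4_cpu_ghz = 1.75, 1.60
xb1_gpu_tflops, ps4_gpu_tflops = 1.31, 1.84

cpu_edge = xb1_cpu_ghz / ps4_cpu_ghz - 1        # ~0.094 -> roughly a 9-10% XB1 edge
gpu_edge = ps4_gpu_tflops / xb1_gpu_tflops - 1  # ~0.40  -> roughly a 40% PS4 edge
print(f"XB1 CPU clock advantage:   {cpu_edge:.1%}")
print(f"PS4 GPU compute advantage: {gpu_edge:.1%}")
```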
 
They could have a team of 100000 people working on the game and there would still be some algorithms which are inherently not suited to GPGPU.
Surely the best developers won't have trouble identifying those cases, though? Between the CPU and GPGPU, it's just a case of getting the right things to the right places.
 

ufo8mycat

Member
Just did a screenshot comparison then noticed DF has their comparison up.
Oh well, here's the one I did anyway. Just a few screenshots (middle-clicking the links works best):
(PS4 screens taken from console screenshot thread)

PS4 Screenshot 1

PC Screenshot 1

PS4 Screenshot 2
PC Screenshot 2

PS4 Screenshot 3

PC Screenshot 3

First of all, let me get this out of the way: yes, the PC screenshots look more crisp, but overall the graphical look isn't that big of a difference at all to me, looking at these screens.

It is quite astounding considering how much weaker the PS4 hardware is compared to a high-end PC, yet this is the only graphical difference?

Maybe I am not someone who looks at the extra finer details etc - but yeah.

Just my opinion.
 

EGM1966

Member
It's interesting pinning down the "whys", and there are some good views being posted. For me, though, the big issue is that Ubi chose to push ahead with a game that didn't deliver on a known hardware base (both PS4 and XB1).

I'm amazed the NPC count wasn't culled down to sensible levels for the hardware, with more focus put on core code optimization.

It really feels to me like the scope and the number of people involved ended up hindering an efficient development process.

You could easily reduce the NPC count and still have packed streets, for example, and keep the gameplay experience intact.

It's obvious, though, that the game is buggy and unfinished, as framerate is an issue when it shouldn't be, and there are seemingly lots of glitches/issues.

The game wasn't ready, plain and simple; the correct choices weren't made on things like NPC count, and the game was obviously shoved out regardless to make the holiday sales period.

All in all, a pretty mediocre effort, although I have a degree of sympathy for all the individuals who probably worked really hard to meet impossible design goals and unrealistic launch dates.
 

DSN2K

Member
The way I see it, the PS4 version is likely just poorly optimised, while the XB1 potentially got more attention to get it up to the standard (LOL) required.

The game also seems a technical mess on both; there is nothing here to champion. Ubisoft needs to improve their game engines and development process. These massive teams clearly aren't doing the best job.
 

MMaRsu

Banned
First of all, let me get this out of the way: yes, the PC screenshots look more crisp, but overall the graphical look isn't that big of a difference at all to me, looking at these screens.

It is quite astounding considering how much weaker the PS4 hardware is compared to a high-end PC, yet this is the only graphical difference?

Maybe I am not someone who looks at the extra finer details etc - but yeah.

Just my opinion.

The game isn't a looker by any means on any platform.
 

ISee

Member
No, but I told them what kind of stutter I was encountering, and they insist it did not happen to them.

The human eye does not see any difference between 20 and 30 fps. Don't you know? Do some research before you accuse your friends of being fanboys.
 

gofreak

GAF's Bob Woodward
There was lots of GPU headroom, but GPUs don't have anything to do with the computation of 5,000 NPCs, AI, and so on.

Well, depending on requirements, they could. But that doesn't mean they have in this case, or that any parts they might be running on a GPU shift the bound away from the CPU. I expect future software will get better about moving simulation of various kinds to the GPU. It kind of has to, based on the example here, unless Ubisoft left CPU optimisation on the table that they could exploit more easily in the future... because the performance here isn't good on either platform, and, correct me if I'm wrong, while better, it isn't brilliant on PC either.
 

Marlenus

Member
Aside from developer incompetence, the likely reason the Xbox One is faster than the PS4 in this game comes down to how they created it. All I can think of is that they created the game in DX and did a quick port over to the PS4 with no performance tuning at all. That means some bits of code will be suboptimal, leading to a performance differential greater than the clock speed difference, as there is nothing else in the CPUs that is different.

The question is, would turning the resolution up to 1080p actually cost frames? I doubt it, to be honest, as the weaker GPU can obviously handle a higher framerate when given enough to do by the CPU, which means the PS4 could do it at 1080p with no performance penalty.
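A quick sanity check on that, assuming the usual 1600x900 vs 1920x1080 render targets:

```python
pixels_900p = 1600 * 900        # 1,440,000 pixels
pixels_1080p = 1920 * 1080      # 2,073,600 pixels
extra = pixels_1080p / pixels_900p - 1  # ~0.44 -> roughly 44% more pixels to shade
print(f"1080p shades ~{extra:.0%} more pixels than 900p")
# If the frame is CPU bound (the GPU idles for part of each frame), that extra
# pixel work may fit into the unused GPU time without costing any frames.
```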

Poor optimisation + poor decision making have led to the worst PS4 release so far this generation. Prior to release, I never expected any game to be faster on Xbox One for any reason; the extra overhead of the hypervisor and OS management would eat away at the performance gains the clock speed increase gave the Xbox One, leaving the CPUs pretty much neck and neck. To actually achieve what should be impossible really takes some special talent, so Ubi should be thankful it has the engineers and developers in house to take the more powerful console and make it run like crap. An amazing feat.
 

dark10x

Digital Foundry pixel pusher
Poor optimisation + poor decision making have led to the worst PS4 release so far this generation.
The worst? No, my friend, the worst has to be pre-patch The Evil Within. The frame-rate is much lower than Assassin's Creed Unity. Even after the patch is applied it's still rubbish.
 

Scanna

Member
The human eye does not see any difference between 20 and 30 fps. Don't you know? Do some research before you accuse your friends of being fanboys.
They are game journalists, though... so they should be used to it. Then again, I'm not accusing anybody, just reporting: as of yesterday I was convinced I had gotten the worse version, and now DF is telling me the contrary... still confused.
 
The worst? No, my friend, the worst has to be pre-patch The Evil Within. The frame-rate is much lower than Assassin's Creed Unity. Even after the patch is applied it's still rubbish.

With the bugs I'm seeing, it has to be close for some; either way, still bad decision-making from both companies.
 

Marlenus

Member
The worst? No, my friend, the worst has to be pre-patch The Evil Within. The frame-rate is much lower than Assassin's Creed Unity. Even after the patch is applied it's still rubbish.

What about the bugs, pop-in, NPCs walking over obstacles, teleporting NPCs, etc.? That on top of the framerate makes it a real mess. Does The Evil Within suffer from those kinds of bugs and immersion-breaking issues too, or does it just have a poor framerate?
 

Vizzeh

Banned
We know the X1 CPU is clocked marginally faster, but the PS4 GPU is significantly stronger, between 40% and 100%, on GPGPU.

What we also know is that the PC will always be a powerhouse when running games like this... I guess some of you need to check out the PC performance thread, where a GTX 980 is CPU bound by an i7 4790K, which is ridiculously quicker than both console CPUs.

What does that say when computers of that spec struggle to maintain a solid 40-60 fps? A piss-poor optimised engine. The PS4 vs. X1 debate in AC Unity is irrelevant in that context.

Hell, even on that setup DayZ, a CPU-bound, unoptimised game, will run worse than an exceptionally hardware-dependent game like BF4, which fully utilises the GPU rather than the CPU.

Where is your common sense, Ubi? At least scale back the game until the GPU is saturated...
 
Right now I'm waiting for Ubisoft to issue a statement that a patch is on its way to fix the framerate problems and the various huge game-breaking bugs. If they don't in the next week, then I will be taking it back. I've got plenty of other things to play whilst I wait for them to fix this shit.
 