
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

Kezen

Banned
[benchmark chart image]


Hm, on the PC side you have stronger GPUs beating out weaker ones. How strange.

It also shows that you don't need cutting-edge hardware to have a much better experience than consoles.
Mid-range CPUs such as the i5-2500K fare very well and can provide a stable 30fps even with very high LOD.
 

BigDug13

Member
It also shows that you don't need cutting-edge hardware to have a much better experience than consoles.

I don't know, that i7 processor and 32GB of DDR4 RAM seem kinda beefy. The 3.0 GHz version of that processor is over $1000 on Newegg. And that speed and amount of RAM is about $600 on Newegg.

So before the video card you're looking at $1,600 in processor and RAM. (The chart seems to be showing unrealistic hardware for the general user, outside of the video card.)
 
Xbox or PS4 being the lead platform makes no difference. They are running the same code. So everything that you code for one will run mostly the same or better on the other.

Where you all say there are unused GPU resources on PS4, I clearly see a system struggling to hold 30 FPS even in cutscenes while still performing better than Xbox. Maybe your expectations are unrealistic.

The minor CPU advantage on Xbox is a fact. And the AC4 analysis is consistent with what we know about both consoles.

The game is coded to rely heavily on the CPU, where the X1 has a minor 150 MHz advantage, all else being equal. Both of these machines, the PS4 especially, were designed to offload tasks like hordes of brainless NPCs from the CPU. Given the bug-riddled, unoptimized state of the game (it even performs like shit on high-end PCs), it is likely that Ubisoft did not properly leverage GPGPU compute in crowd scenarios. If they had, the performance gulf would likely be significantly in the PS4's favor.
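A minimal sketch of what that kind of offload looks like, assuming nothing about AnvilNext internals: one GPU thread per background NPC, advancing a looping animation and a simple walk so the CPU never touches the crowd per frame. CUDA is used here only as a compact way to show the pattern; the consoles would use their own compute APIs, and every name and constant below is made up for illustration.

#include <cuda_runtime.h>
#include <math.h>

struct Npc { float x, y, heading, animPhase; };   // minimal per-NPC state

// One GPU thread per background NPC: drift along the current heading and
// advance a looping animation cycle. No per-NPC CPU work is needed per frame.
__global__ void updateCrowd(Npc* npcs, int count, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= count) return;
    Npc n = npcs[i];
    n.x += cosf(n.heading) * 1.2f * dt;           // ~1.2 m/s walk speed (made-up constant)
    n.y += sinf(n.heading) * 1.2f * dt;
    n.animPhase = fmodf(n.animPhase + dt, 1.0f);  // repeat the walk cycle
    npcs[i] = n;
}

int main()
{
    const int count = 10000;                      // "hordes" of background NPCs
    Npc* d_npcs = nullptr;
    cudaMalloc(&d_npcs, count * sizeof(Npc));
    cudaMemset(d_npcs, 0, count * sizeof(Npc));

    const float dt = 1.0f / 30.0f;                // one 30 fps frame
    updateCrowd<<<(count + 255) / 256, 256>>>(d_npcs, count, dt);
    cudaDeviceSynchronize();

    cudaFree(d_npcs);
    return 0;
}

The point of the pattern is simply that per-frame crowd cost scales with GPU threads rather than with the two consoles' weak CPU cores.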

That's why you have a verified 3rd-party dev, Matt, in this very thread being surprised and saying, "Wow." He also commented that he was tempted to say there wasn't an excuse for this, but he changed his comment since he wasn't actually there at Ubisoft to see the particular scenario encountered.
 

Marlenus

Member
Then you are blaming Sony for a worse API or compiler, not Ubi. AFAIK that isn't an issue, so I don't agree.

Not worse, just different. They each have certain nuances and if you ignore them you can tank performance quite easily.



My point is that:

a. The Xbone's CPU has an advantage in both clock speed and memory setup. Just not a flat 9%; it would be more or less depending on the load scenario.

b. You can't expect linear progression in a heavily CPU-bound scenario.

What do you mean by 'memory setup'? Are you talking about read/writes, or the fact that the memory controller will run at the higher speed too? Regardless, it still results in a 9% increase at best, and that is all you are getting.
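For reference, the arithmetic behind that figure, assuming the commonly cited clocks (XB1 CPU at 1.75 GHz after the upclock, PS4 CPU at the widely reported but unconfirmed 1.6 GHz): the gap is 0.15 GHz, and 1.75 / 1.6 ≈ 1.094, so roughly a 9% higher clock at best, before any memory-subsystem effects in either direction.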

Except the most heavily CPU-limited game I can think of, StarCraft 2, does show linear progression with CPU clock speed.
 

Kezen

Banned

Clockwork5

Member
You don't need an i7 to run the game at 30+fps. ;) I never talked about price... just the range of hardware. A 760/i5 combo is far, far from being "cutting edge".
760 + i5 = much better than consoles, at settings higher than consoles as well.

[CPU benchmark chart]

Sure, with SLI 980s...
 

Kezen

Banned
Sure, with SLI 980s...

You miss the point of the benchmark: it's to show the impact of the CPU on performance. As you can see, even an i5 does not bottleneck an SLI pair of high-end GPUs that much, much less a mid-range GPU.

Point remains: a mid-range PC achieves significantly more than any console in this game, for a port labeled "unoptimized"... I'll take that. ;-)
 

BigDug13

Member
You miss the point of the benchmark: it's to show the impact of the CPU on performance.

Which is "not very much," you're saying? Because people are saying an i5 processor is overkill for this game, so it must not be doing all that much CPU-wise.
 

Kezen

Banned
Which is "not very much," you're saying? Because people are saying an i5 processor is overkill for this game, so it must not be doing all that much CPU-wise.

Those people are wrong; Unity stresses your CPU. If you have less than an i5 (2500K or better), things are rough, very rough.
If you are running the game at PS4/XBO settings then the mid-range PC I was talking about does even better.

I think it's quite incredible that the bar is so low for better-than-consoles experiences in such a demanding game. :)
Sustaining a solid 30fps with console settings is not difficult at all on PC.
 
They are not the same code, that is ridiculous. At best they run the same algorithm with similar code.

The rest of your hypothesis is just confirmation bias. You are taking data (CPU clocks) and fitting it to the results. The best case you could build without profiling the games is knowing the game is 100% bottlenecked by the CPU and then showing that the XB1 version has a frame-rate advantage that exactly matches the percentage upclock. Of course you don't have that, and there are so many other variables that you cannot make that case.

I could just as easily make the case that Ubi has a bundle and marketing deal with MS and therefore spent 1.5x the man-hours on the XB1 version. This is far more controversial, so no one will ever admit it publicly.

If Ubi were smart they would have removed 9% of the crowds and made the two versions equal, but of course that excuse about the AI is probably just that, an excuse. The engine is probably new and a POS at this point, but hey, MS paid for some extra TLC.

Let's pretend you can't run the same code on different platforms without bothering with much more than which compiler flags you use for each.

Some obscure M$ deal is the only acceptable answer if you don't want to be considered biased or get spammed with Star Trek gifs.

Excuse me if I'm having fun.
 
Between this and COD:AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
^ That doesn't make any sense. Dynamic resolution is going to get more consistent results in terms of FPS. COD:AW has 360 framerates on PS4 in single player and the same framerate as Xbox One in multiplayer at a higher resolution. How does it come out to being better, especially in technical messes like this? 1 percent of games (you listed only 2 out of every other game on the platforms) being inferior doesn't make up for the other 99%.



The fact of the matter is, Ubi didn't bother to optimize at all for 2 of the 3 platforms they were launching on. That's a really big problem and reeks of laziness. At that point, don't even port the game if you can't be bothered to do a good job on it.

If the CPU really is the problem, I don't need 10,000 NPCs on screen at one time to enjoy a game. Cut it in half or even to a quarter and my enjoyment is not suddenly going to be lessened. If this kind of thing is going to be an issue, performance comes first.

The consoles are what they are; if a developer can't build around the specs and come out with a decent product, then the onus is on them.
 
You don't need an i7 to run the game at 30+fps. ;) I never talked about price... just the range of hardware. A 760/i5 combo is far, far from being "cutting edge".
760 + i5 = much better than consoles, at settings higher than consoles as well.

[CPU benchmark chart]

What this chart indicates is that the game code is fully CPU-reliant and is in no way leveraging GPGPU to offload anything from the CPU.

Again, the game is an unoptimized piece of crap. It's not like the NPCs are smart. They are hordes of do nothings with repetitive animations.
 

Marlenus

Member
You miss the point of the benchmark: it's to show the impact of the CPU on performance. As you can see, even an i5 does not bottleneck an SLI pair of high-end GPUs that much, much less a mid-range GPU.

Point remains: a mid-range PC achieves significantly more than any console in this game, for a port labeled "unoptimized"... I'll take that. ;-)

A mid-range Nvidia GPU; the AMD GPUs get trounced. Considering both consoles use GCN, you would not expect that kind of performance differential, which further suggests it was actually PC-first using GameWorks, then ported to X1 with some performance tuning and to PS4 with very little tuning.

I think someone earlier got it right that the PS4 is almost too easy to develop for, so the devs here left it till last and did not have time to optimise it in any way. The fact that the rest of the game is somewhat broken too would imply they were very rushed to get it to even this state for release. It needed at least 3 more months, if not 6.

They would have been better off doing a cross gen version of Rogue and releasing Unity next year.
 

dark10x

Digital Foundry pixel pusher
So how well does using half-refresh plus RivaTuner OSD to lock at 30 fps actually work? Seems like it should be possible to achieve a stable 30 fps but I'm also hearing of other dips and stutters that interfere. Anyone actually getting a 100% locked, stable 30 fps on the PC? Seems like 60 fps is out of reach for the moment.
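For anyone wondering what such an external cap actually does, here is a minimal sketch of the idea, not how RTSS implements it: pace every frame to a fixed ~33.3 ms budget so presentation lines up with every other 60 Hz refresh. The loop body is a stand-in for the game's own simulate/render work, and the function name in the comment is hypothetical.

#include <chrono>
#include <thread>

int main()
{
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(33333); // ~1/30 of a second
    auto next = clock::now() + frameBudget;

    for (int frame = 0; frame < 300; ++frame)    // stand-in for the game loop
    {
        // simulateAndRender();                  // hypothetical: the game's real work
        std::this_thread::sleep_until(next);     // wait out the rest of the frame budget
        next += frameBudget;                     // fixed schedule keeps pacing even
    }
    return 0;
}

In practice the half-refresh vsync lock does the pacing at the driver level, which is why it tends to feel smoother than a pure software sleep like this one.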
 
Let's pretend you can't run the same code on different platforms without bothering with much more than which compiler flags you use for each.

Some obscure M$ deal is the only acceptable answer if you don't want to be considered biased or get spammed with Star Trek gifs.

Excuse me if I'm having fun.

I agree with your premise that they are using pretty much the same code on all platforms. The game is a combination of poor use of the available compute hardware outside of the CPU, a general lack of optimization, and broken, bug-riddled code.
 

Kezen

Banned
What this chart indicates is that the game code is fully CPU-reliant and is in no way leveraging GPGPU to offload anything from the CPU.
I don't think the data we have here is enough to make that claim. Even accounting for some offloading to the GPU, the game still needs a lot of CPU power; that could be it.
I would not immediately conclude the game does not use GPGPU in any way.

Again, the game is an unoptimized piece of crap. It's not like the NPCs are smart. They are hordes of do nothings with repetitive animations.
It's not perfectly optimized on any platform, but the performance data we have on PC is extremely promising; a mid-range PC maxing out the game at 30fps does not scream "crap optimization" to me, at all.

A mid-range Nvidia GPU; the AMD GPUs get trounced. Considering both consoles use GCN, you would not expect that kind of performance differential, which further suggests it was actually PC-first using GameWorks, then ported to X1 with some performance tuning and to PS4 with very little tuning.
In CPU bound situations Nvidia's driver superiority shows. Not surprising AMD can't keep up.

Still an expensive CPU, so why argue that an expensive PC will run it better than the consoles when it will still run like shit? Usually you would be right; in this case it's not worth the money.
A 4690K, for example, is $200; that's very cheap, isn't it? That's nowhere near my definition of expensive. Same goes for the 760.
You get what you pay for. ;-)

Whether or not the performance on PC is worth the investment is a completely subjective argument.
 

Marlenus

Member
Let's pretend you can't run the same code on different platforms without bothering with much more than which compiler flags you use for each.

Some obscure M$ deal is the only acceptable answer if you don't want to be considered biased or get spammed with Star Trek gifs.

Excuse me if I'm having fun.

It is not about running the same code on different platforms; that is fine as long as all the platforms support the same API. The thing is, the PS4 and Xbox One use different APIs, so they require separate code paths.
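As a rough illustration of what "separate code paths" tends to mean in practice (the platform macros and submit functions here are invented for the sketch, not Ubisoft's or the platform holders' real APIs): the shared gameplay code calls one wrapper, and the per-platform graphics submission is selected at build time behind it.

#include <cstdio>

struct DrawCall { int meshId; int materialId; };

// Stand-ins for per-platform graphics back ends (all hypothetical).
void ps4SubmitDraw(const DrawCall& dc) { std::printf("PS4-style submit %d/%d\n", dc.meshId, dc.materialId); }
void xb1SubmitDraw(const DrawCall& dc) { std::printf("XB1-style submit %d/%d\n", dc.meshId, dc.materialId); }
void pcSubmitDraw (const DrawCall& dc) { std::printf("PC D3D11-style submit %d/%d\n", dc.meshId, dc.materialId); }

// Shared gameplay code calls this; the platform-specific path is chosen per build.
void submitDrawCall(const DrawCall& dc)
{
#if defined(PLATFORM_PS4_LIKE)
    ps4SubmitDraw(dc);   // would talk to the PS4's low-level graphics API
#elif defined(PLATFORM_XB1_LIKE)
    xb1SubmitDraw(dc);   // would go through the Xbox One's D3D11.x-style path
#else
    pcSubmitDraw(dc);    // default PC path
#endif
}

int main()
{
    submitDrawCall({ 42, 7 });   // same call site on every platform
    return 0;
}

The shared half really is "the same code"; the back-end half behind the wrapper is where the per-platform optimization effort diverges.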
 

Chobel

Member
That wasn't from a year ago? Several SDK updates arrived since then, including Kinect reservation removal.

You have this game showing better performance in crowded scenes on Xbox One, and better performance in cutscenes and some other emptier scenes on PS4. That says something. Most likely, you have a CPU-bound and GPU-bound console (One) and a slightly more CPU-bound console (PS4). Lawl.

No, he said it after the famous June SDK http://www.neogaf.com/forum/showpost.php?p=121958788&postcount=203

And why are you ignoring that he said in this thread that there's no excuse for this?

Edit: Another one http://www.neogaf.com/forum/showpost.php?p=120463891&postcount=435
 

TheStruggler

Report me for trolling ND/TLoU2 threads
Between this and COD:AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.
The PS4 version got the nod for multiplayer from DF; as for single player, they said it's a draw: PS4 had some frame drops while the Bone has screen tearing and doesn't hold max resolution. Since COD is primarily a multiplayer-focused game, the win goes to PS4.
 

DryvBy

Member
Between this and COD:AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.

Or you could just not support this kind of crap.
 

EGM1966

Member
Between this and COD:AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.
I'm sorry but you sound horribly misinformed my friend.

CoD SP on PS4 has a solid frame rate at full 1080p, with minor dips here and there, none of which really affect gameplay.

CoD SP on XB1 has a slightly more solid frame rate at the expense of a pretty big resolution drop, though not all the time.

Try them both and you'll find the resolution hike more than makes up for a small fps difference. If PS4 SP is patched to support variable resolution in line with XB1 then PS4 will easily outperform it.

MP is a clear and notable win: equal fps but full 1080p on PS4, making for a much better experience with better resolution of foes at a distance.

Nobody with common sense would think for a nanosecond that the XB1 version performs better.

Unity is simply a mess, but it's a mess that's been slightly better optimized on XB1.

If you think that's a trend versus the huge number of titles performing better on PS4, I suggest a quick Google on basic statistical analysis. The good news is you'll realise you're fine with your choice with regard to technical performance.
 
I don't think the data we have here is enough to make that claim. Even accounting for some offloading to the GPU, the game still needs a lot of CPU power; that could be it.
I would not immediately conclude the game does not use GPGPU in any way.

Well, the frame rate is scaling pretty linearly with CPU speed for the same graphics card setup. Seems overly CPU-reliant to me. Of course there could be other factors at work.

They did say they were using GPGPU for cloth simulation in one of the slides. Had they added their piss-poor zombie AI to that list, these issues probably would not have crept up, especially for NPCs that are not in the direct vicinity of the player (i.e., background NPCs, who pretty much have looping animations anyway).
 

Kezen

Banned
Well, the frame rate is scaling pretty linearly with CPU speed for the same graphics card setup. Seems overly CPU-reliant to me. Of course there could be other factors at work.
True, but is that mutually exclusive with GPU compute being used? I'm not saying they've maxed out compute on any platform, but I would be extremely surprised if they haven't taken great advantage of it. Compute isn't some sort of esoteric tech; it's been around a very long time (CUDA, Nvidia, 2008).
 

Ateron

Member
No, he said it after the famous June SDK http://www.neogaf.com/forum/showpost.php?p=121958788&postcount=203

And you're ignoring that he said in this thread that there's no excuse for this?

Psst...don't let facts get in the way of a good story.
This poorly optimized mess of a game has opened the floodgates for everyone waiting for a chance to bash the PS4 for once. Vindication or some shit. The fact that even DF, of all places, was surprised this was not a win for PS4 says it all.

Yesterday I was wondering the same thing many posters said today, and even commented on it to my gf. Maybe the PS4 is so easy to program for that the devs reach their target with relative ease, and then spend more time on the X1 version until it surpasses the shitty target they've set on PS4. This isn't proof that the PS4 is less powerful than the X1. It proves that if you settle for a low bar on PS4 and then spend more time polishing the other version, you will reach it as well and then surpass it, even if only slightly.

Maybe the same happened with GTA V on PS3. I bought it fully expecting it to be outclassed by the 360 version, as had happened before (GTA IV and RDR), and yet the PS3 version was the better one. Rockstar clearly spent more time optimizing for the more difficult platform, adapting to the Cell's quirks. If they had spent an equal amount of time on both versions, the 360 would probably have performed better. One can be a PS fan and still see things for what they are. I think the same happened here. The PS4 is more capable, yet that made them lazy: easy enough to reach their target, get it over with, and then jump to the rescue of the other version. It really looks like they gave up halfway ("fuck it, it's good enough, let's focus on the other version till it ships") and called it a day.

I'm aware that this seems like a console version of this:

[image]


but this ease of development may be a double-edged sword, especially for lazy, overambitious, or batshit-insane devs with unrealistic expectations of what these consoles can or cannot handle properly.
 
True, but is that mutually exclusive with GPU compute being used? I'm not saying they've maxed out compute on any platform, but I would be extremely surprised if they haven't taken great advantage of it. Compute isn't some sort of esoteric tech; it's been around a very long time (CUDA, Nvidia, 2008).

It has been around for a while, but my understanding is that it is an oft-underutilized capability due to archaic memory setups that severely penalized the CPU with latency-induced misses when it was leveraged. hUMA and a number of other advances in the memory subsystem, along with the more granular threads that can be set up and managed via the latest GCN GPU architecture, are supposed to help alleviate this issue.
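A tiny sketch of the unified-memory idea described above, using CUDA managed memory purely as a stand-in for the consoles' shared-memory setup (this is not console code): CPU and GPU touch the same allocation, so there is no explicit staging copy between "CPU memory" and "GPU memory".

#include <cuda_runtime.h>
#include <cstdio>

__global__ void scale(float* data, int n, float k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= k;
}

int main()
{
    const int n = 1 << 20;
    float* data = nullptr;
    cudaMallocManaged(&data, n * sizeof(float)); // one allocation visible to CPU and GPU

    for (int i = 0; i < n; ++i) data[i] = 1.0f;  // CPU writes directly, no cudaMemcpy

    scale<<<(n + 255) / 256, 256>>>(data, n, 2.0f);
    cudaDeviceSynchronize();

    std::printf("data[0] = %.1f\n", data[0]);    // CPU reads the GPU result in place
    cudaFree(data);
    return 0;
}

The appeal for CPU/GPU work-sharing is exactly the round trips this removes: the less you pay to hand data back and forth, the cheaper it is to push jobs like crowd updates onto the GPU.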

Because this is a 3rd party game that seems undercooked and unoptimized for each particular platform, I would doubt they leveraged GPGPU to the extent that they could given more development time, especially on the AI end of things.
 

Kezen

Banned
It has been around for a while, but my understanding is that it is an oft-underutilized capability due to archaic memory setups that severely penalized the CPU with latency-induced misses when it was leveraged. hUMA and a number of other advances in the memory subsystem, along with the more granular threads that can be set up and managed via the latest GCN GPU architecture, are supposed to help alleviate this issue.

Because this is a 3rd party game that seems undercooked and unoptimized for each particular platform, I would doubt they leveraged GPGPU to the extent that they could given more development time, especially on the AI end of things.

There is always more you can do with a later deadline. Unity has been in development (not saying full production) for 4 years. Console specs have been known since 2012; Ubisoft had the time to exploit the hardware. Based on the data we have, I suspect they aimed too high for those consoles.
 

ExoSoul

Banned
I think Sony may have made things TOO easy for developers. They can get their games into a "working state" so easily on PS4, while on XBO they have to work a lot harder with ESRAM management, etc., so developers throw WAY more man-hours at the XBO version than the PS4 version. Add the marketing partnership and console bundling, and Ubisoft made sure to get that version running as well as possible and didn't have the manpower left to improve what they had going on PS4.

It's baffling, though, why any company would want the version releasing on the console with 2x+ more potential customers to be the worse-performing version, when it seems like more man-hours devoted to optimization would have improved it based on the spec differences.

In the end, though, it's a technical failure of a game. These consoles should not already be struggling to hit 30fps, and based on how poorly this game runs on PC as well, it's not the consoles' fault here.

Should have made Rogue cross-gen this time and released Unity next year spring or something.

Just like how they did with the PS3 and CELL right?
Oh...
 

RexNovis

Banned
No, he said it after the famous June SDK http://www.neogaf.com/forum/showpost.php?p=121958788&postcount=203

And you're ignoring that he said in this thread that there's no excuse for this?

You're just pulling stuff out of your ass, purely based on a clock value for the XB1 CPU.
The rest is just not based on anything, and it contradicts what the devs posting here have to say on the topic, like Matt and Fafalada. But sure, carry on.

It is not about running the same code on different platforms; that is fine as long as all the platforms support the same API. The thing is, the PS4 and Xbox One use different APIs, so they require separate code paths.

They are not the same code, that is ridiculous. At best they run the same algorithm with similar code.

The rest of your hypothesis is just confirmation bias. You are taking data (CPU clocks) and fitting it to the results. The best case you could build without profiling the games is knowing the game is 100% bottlenecked by the CPU and then showing that the XB1 version has a frame-rate advantage that exactly matches the percentage upclock. Of course you don't have that, and there are so many other variables that you cannot make that case.

I could just as easily make the case that Ubi has a bundle and marketing deal with MS and therefore spent 1.5x the man-hours on the XB1 version. This is far more controversial, so no one will ever admit it publicly.

If Ubi were smart they would have removed 9% of the crowds and made the two versions equal, but of course that excuse about the AI is probably just that, an excuse. The engine is probably new and a POS at this point, but hey, MS paid for some extra TLC.

As much as I and all the sane posters in this thread agree with you guys here, we all know this already. It's been said over and over. In this case, I hate to break it to you, but you're just wasting your breath. You're arguing facts with someone who actually said optimization is pointless because it's the same code across different platforms. A statement that is so entirely ignorant and devoid of common sense that it leads to one conclusion: Clearly facts have no bearing. So, do yourselves a favor and save your time and effort.

In fact, I might humbly suggest enjoying the gifs, videos, and general hilarity ensuing in the "framerate is atrocious" thread here. I assure you it is a far more productive and enjoyable use of your time.
 
There is always more you can do with a later deadline. Unity has been in development (not saying full production) for 4 years. Console specs have been known since 2012; Ubisoft had the time to exploit the hardware. Based on the data we have, I suspect they aimed too high for those consoles.

Most definitely the case. They stated in an interview they were expecting the CPUs in the consoles would be able to handle 10x the AI and that they were disappointed in what they wound up with. They stated that it has been a challenge to make the AI work on the CPUs in these boxes (which is not surprising, being that they are glorified low-power netbook CPUs). This all points to Ubisoft not really leveraging the GPGPU to handle any element of the AI, as they were caught off guard mid-development.
 

cgcg

Member
So how well does using half-refresh plus RivaTuner OSD to lock at 30 fps actually work? Seems like it should be possible to achieve a stable 30 fps but I'm also hearing of other dips and stutters that interfere. Anyone actually getting a 100% locked, stable 30 fps on the PC? Seems like 60 fps is out of reach for the moment.

A Ubisoft game architect implied that MS is making them lock the framerate on PC too.

Mission accomplished?
 
Between this and COD:AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.

Nothing like the hard truth of one game being poorly optimized to warrant a console swap. You should go for it.
 

SpotAnime

Member
I don't know, that i7 processor and 32GB of DDR4 RAM seem kinda beefy. The 3.0 GHz version of that processor is over $1000 on Newegg. And that speed and amount of RAM is about $600 on Newegg.

So before the video card you're looking at $1,600 in processor and RAM. (The chart seems to be showing unrealistic hardware for the general user, outside of the video card.)

That's the one thing I hate about their performance charts. They always measure with the best hardware, not average hardware, which would be a better representation.
 

Kezen

Banned
Most definitely the case. They stated in an interview they were expecting the CPUs in the consoles would be able to handle 10x the AI and that they were disappointed in what they wound up with. They stated that it has been a challenge to make the AI work on the CPUs in these boxes (which is not surprising, being that they are glorified low-power netbook CPUs). This all points to Ubisoft not really leveraging the GPGPU to handle any element of the AI, as they were caught off guard mid-development.
Well, I hope they will find a way to shift more work to the GPU.
Nvidia has something called GAI:
http://www.geeks3d.com/20100606/gpu-computing-nvidia-gpu-artificial-intelligence-technology-preview/

This technology preview is a snapshot of some internal research we have been working on and talking about at various conferences for the past couple years. The level of interest in GPU-accelerated AI has continued to grow, so we are making this (unsupported) snapshot available for developers who would like to experiment with the technology.

That's the one thing I hate about their performance charts. They always measure with the best hardware, not average hardware, which would be a better representation.
For a GPU benchmark to be relevant, one has to make sure the CPU is out of the way; you want to know how the various cards stack up against each other.
With both CPU and GPU benchmarks you can easily find out how "average" hardware does in a game. Look at what a measly i3 achieves in such a CPU-bound game: pair it with a mid-range GPU and you can get a very solid 30fps at 1080p, along with more effects than the consoles.
 