
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

Ty4on

Member
What does that say when computers of that spec struggle to maintain over 40-60fps solid... A piss poor optimised engine. The PS4 v X1 debate in AC unity is irrelevant when in that context.
I'm not trying to defend Ubisoft for releasing a rushed game, but optimizing a game this complex is far from trivial. And to their credit a high end PC will run the game much better than the consoles which can't be said about all games.
Yes, the PS3 had a Cell processor which they claimed was on the level of a supercomputer. It was notoriously difficult to develop for though. They also had problems with the PS2's Emotion Engine. Several games ended up looking better on Dreamcast (Dead or Alive 2 being one of the more striking examples if I remember correctly) as well.
PS4 is not supposed to be more difficult to develop for though!
The PS3 CPU was very efficient when running very specific code. It sounds great to say those "lazy programmers" should just do their job, but it was obviously bad for game development when you look at the early years.
Does anyone have an idea how Digital Foundry does those frame tests? I've searched a bit on the internet, but I couldn't find an answer.
I'm pretty sure they have hardware that reads each frame and checks when new ones are displayed. Nvidia has FCAT which is used in some PC benchmarks.

With vsync a new frame can only be displayed every 1/60th of a second, so if frame n-1 != n = n+1 != n+2 then the same frame was displayed for 2 * 1/60th of a second, or 33ms (30 fps). If n-1 != n = n+1 = n+2 != n+3 then the same frame was displayed for 3 * 1/60th of a second, so 50ms (20 fps). If you look at the frametimes window you'll notice it oscillates between 33 and 50 when the framerate isn't 20 or 30 fps. That can then be averaged over a second to give you the fps number, and that's why frame times are a better measurement than fps :p
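To make that bookkeeping concrete, here's a minimal sketch of the idea in Python (my own illustration, not DF's code), assuming you already have the capture as a list of frames plus some pixel-level comparison; `frames_equal` here is a hypothetical placeholder:

```python
# Minimal sketch (not DF's actual code) of turning a 60 Hz vsynced capture
# into frame times. `frames` is a list of captured frames and
# `frames_equal(a, b)` is a hypothetical pixel-level comparison.

REFRESH_MS = 1000.0 / 60.0  # one display refresh at 60 Hz (~16.7 ms)

def frame_times_ms(frames, frames_equal):
    """How long each unique frame stayed on screen, in milliseconds."""
    times = []
    run = 1  # number of refreshes the current unique frame has persisted
    for prev, cur in zip(frames, frames[1:]):
        if frames_equal(prev, cur):
            run += 1  # same frame repeated -> it persisted one more refresh
        else:
            times.append(run * REFRESH_MS)  # new frame -> close out the run
            run = 1
    times.append(run * REFRESH_MS)
    return times

def average_fps(times_ms):
    """Average fps over the capture: unique frames divided by elapsed time."""
    return 1000.0 * len(times_ms) / sum(times_ms)
```

A frame that persists for two refreshes comes out as ~33ms and one that persists for three as ~50ms, which is exactly the 33/50 oscillation you see in the frametime graph.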
 

jpax

Member
Why is this surprising? The Xbox One has a faster CPU and lower latency RAM.

There will always be games that perform better on the Xbox One.

Nonsense! Our very own Matt wrote that the PS4 CPU is stronger

The Xbone does not have lower latency RAM; that is a myth. Both systems sit somewhere between roughly 10ns and 13ns. GDDR5 has CAS latencies between 13 and 17 cycles; DDR3 between 9 (incredibly expensive) and 15 cycles. That works out to roughly 9.5ns to 12.4ns for the PS4 and 8.5ns (again, this is the really expensive stuff) to 14ns for the Xbone.

So stop spreading misinformation; the real latency is added by the memory controllers.

Edit: It seems like it could even be the other way around, i.e. that the PS4 actually has the lower latency.
http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf starts at around page 130.
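For anyone who wants to check the arithmetic, here is a rough sketch of where those nanosecond figures come from: CAS latency in cycles divided by the command clock. I'm assuming the nominal data rates (5500 MT/s GDDR5 in the PS4, 2133 MT/s DDR3 in the Xbone); the exact cycle counts used in the consoles aren't public, so treat it as an estimate.

```python
# Rough sketch of the conversion: CAS latency in cycles divided by the
# command clock. Assumed clocks: GDDR5 at 5500 MT/s -> 1375 MHz command
# clock (data rate / 4); DDR3-2133 -> ~1066 MHz command clock (data rate / 2).

def cas_ns(cas_cycles, command_clock_mhz):
    """Convert a CAS latency given in clock cycles to nanoseconds."""
    return cas_cycles * 1000.0 / command_clock_mhz

# PS4 (GDDR5), CL 13..17 cycles
print(round(cas_ns(13, 1375.0), 1), round(cas_ns(17, 1375.0), 1))    # ~9.5 .. ~12.4 ns
# Xbone (DDR3-2133), CL 9..15 cycles
print(round(cas_ns(9, 1066.5), 1), round(cas_ns(15, 1066.5), 1))     # ~8.4 .. ~14.1 ns
```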
 

rafbanaan

Member
I'm pretty sure they have hardware that reads each frame and checks when new ones are displayed. Nvidia has FCAT which is used in some PC benchmarks.

With vsync a new frame can only be displayed every 1/60th of a second, so if frame n-1 != n = n+1 != n+2 then the same frame was displayed for 2 * 1/60th of a second, or 33ms (30 fps). If n-1 != n = n+1 = n+2 != n+3 then the same frame was displayed for 3 * 1/60th of a second, so 50ms (20 fps). If you look at the frametimes window you'll notice it oscillates between 33 and 50 when the framerate isn't 20 or 30 fps. That can then be averaged over a second to give you the fps number, and that's why frame times are a better measurement than fps :p

And on consoles? :p
 

justino

Neo Member
I think you'll see a lot of this. MS want parity and are clearly pushing it overtly for indie games, and less so for AAA titles - a clear example being Diablo III, where they forced parity from Blizzard.

Devs will make the lowest common denominator the lead dev platform - in the hope that the more powerful unit will cope with having it ported.

Because the X1 takes the lead dev position, it's more likely to be easier to optimise and you'll get results like this.

If Unity had been coded with the PS4 as the lead platform, this outcome would not exist.
 

goonergaz

Member
Hell, even on that setup DayZ, a CPU-bound, unoptimised game, will run worse than an exceptionally hardware-dependent game like BF4 that fully utilises the GPU rather than the CPU.

dang, I forgot about DayZ - was looking forward to that on PS4 but now I doubt it'll run well :(
 
Nonsense! Our very own Matt wrote that the PS4 CPU is stronger

The Xbone does not have lower latency RAM; that is a myth. Both systems sit somewhere between roughly 10ns and 13ns. GDDR5 has CAS latencies between 13 and 17 cycles; DDR3 between 9 (incredibly expensive) and 15 cycles. That works out to roughly 9.5ns to 12.4ns for the PS4 and 8.5ns (again, this is the really expensive stuff) to 14ns for the Xbone.

So stop spreading misinformation; the real latency is added by the memory controllers.

Edit: It seems like it could even be the other way around, i.e. that the PS4 actually has the lower latency.
http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf starts at around page 130.

This needs to be confirmed. Hope Matt makes a clear post about the PS4 CPU someday.
 

KKRT00

Member
Does anyone have an idea how Digital Foundry does those frame tests? I've searched a bit on the internet, but I couldn't find an answer.

http://www.eurogamer.net/articles/digitalfoundry-fps-tools-v2-blog-post
http://www.eurogamer.net/articles/digitalfoundry-what-gta4-did-for-us

Basically they check for frame changes at the pixel level and count unique frames.
Two frames are never identical in modern games, because of animations, particles and post-effects.
I think in the newest build of their tool they are using something similar to FCAT.
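Something like this, conceptually (an illustrative sketch, not their actual tool; I'm assuming the capture comes in as one numpy array per 60Hz refresh):

```python
# Illustrative sketch (not DF's actual tool): count unique frames in a
# lossless 60 fps capture by looking for pixel-level changes between
# consecutive frames. `capture` is assumed to be an iterable of
# numpy arrays of shape (height, width, 3), one per captured refresh.
import numpy as np

def count_unique_frames(capture, threshold=0):
    """Count frames that differ from the previous one at the pixel level."""
    unique = 0
    prev = None
    for frame in capture:
        cur = frame.astype(np.int16)  # signed type so differences don't wrap around
        if prev is None or np.abs(cur - prev).max() > threshold:
            unique += 1  # the game rendered a new frame
        prev = cur
    return unique
```

Count the unique frames over each second of capture and you get the per-second fps figure.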
 
It's not such a shock if you believe Ubisoft's claim that it's CPU limited. And given the nature of the game that doesn't seem so far-fetched.

Absolutely Spot On!
There's a huge difference in CPUs between the consoles, so obviously the X1 version would be better even though the PS4 GPU is superior and the GDDR5 RAM is better suited for games!
 

Durante

Member
Absolutely Spot On!
There's a huge difference in CPUs between the consoles, so obviously the X1 version would be better even though the PS4 GPU is superior and the GDDR5 RAM is better suited for games!
Calm down. You are not making any sense, your attempt at sarcasm is falling flat since you are overdoing it, and spittle is flying around the room.
 

Marlenus

Member
Why is this surprising? The Xbox One has a faster CPU and lower latency RAM.

There will always be games that perform better on the Xbox One.

The faster CPU is a very small advantage and with the added API and OS overhead I am not really sure that it is that much faster in a real world scenario. Some benchmarks might indicate it is in specific tasks but generally they are practically the same.

The Xbox One does not have lower latency RAM though. In terms of latency they are the same give or take ~ 2ns.

DDR3 Act to Act is ~ 46.09ns
GDDR5 Act to Act is ~ 40ns

This is from Hynix data sheets though so the timings in the PS4 and the Xbox One may be slightly different to the above but they are going to be accurate to within a very small margin of error. Latency is not an issue for PS4 so this myth needs to stop.

EDIT: Beaten above!
 

jpax

Member
This needs to be confirmed. Hope Matt makes a clear post about the PS4 CPU someday.

The CPU part? Yeah, it does... But we have one independent benchmark which confirms it.
http://www.neogaf.com/forum/showthread.php?t=737629

The latency part? No. This is clear, and it is just wrong that the Xbone has significantly lower latency. And as suggested in the link provided, it could very well be the other way around.

Interesting read (not about latency):
http://wccftech.com/playstation-4-vs-xbox-one-vs-pc-ultimate-gpu-benchmark/
 

benny_a

extra source of jiggaflops
The CPU part? Yeah, it does... But we have one independent benchmark which confirms it.
http://www.neogaf.com/forum/showthread.php?t=737629
That was then, now is now.

Any reasonable person sees the Ubisoft dancer graphs and concludes that the Xbone CPU is now marginally faster in real-world scenarios, despite the overhead which caused it to fall behind before they did the June SDK update.
That graph doesn't translate to the relative difference that is being seen here in the videos, but it is what it is. Arguing that the PS4 CPU is stronger in real-world scenarios post-June SDK is a fool's errand.
 

jpax

Member
The faster CPU is a very small advantage and with the added API and OS overhead I am not really sure that it is that much faster in a real world scenario. Some benchmarks might indicate it is in specific tasks but generally they are practically the same.

The Xbox One does not have lower latency RAM though. In terms of latency they are the same give or take ~ 2ns.

DDR3 Act to Act is ~ 46.09ns
GDDR5 Act to Act is ~ 40ns

This is from Hynix data sheets though so the timings in the PS4 and the Xbox One may be slightly different to the above but they are going to be accurate to within a very small margin of error. Latency is not an issue for PS4 so this myth needs to stop.

EDIT: Beaten above!

http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf

Did you get the information also from this document?
 

jpax

Member
That was then, now is now.

Any reasonable person sees the Ubisoft dancer graphs and concludes that the Xbone CPU is now marginally faster in real-world scenarios, despite the overhead which caused it to fall behind before they did the June SDK update.
That graph doesn't translate to the relative difference that is being seen here in the videos, but it is what it is. Arguing that the PS4 CPU is stronger in real-world scenarios post-June SDK is a fool's errand.

I agree with your That was then, now is now statement.
But are you really suggesting trusting Ubisoft makes you a reasonable person?
 

Conduit

Banned
Absolutely Spot On!
There's a huge difference in CPUs between the consoles, so obviously the X1 version would be better even though the PS4 GPU is superior and the GDDR5 RAM is better suited for games!

So in raw numbers the Xbone CPU's 10% advantage is HUGE compared to the PS4 CPU, but the PS4 GPU's 50% advantage compared to the Xbone's GPU is only SLIGHT!

Got it!
 

Marlenus

Member
http://www.hynix.com/datasheet/pdf/graphics/H5GQ1H24AFR(Rev1.0).pdf
Did you get the information also from this document?

I got the DDR3 timings from the Computing_DDR3_H5TQ2G4(8)3CFR(Rev1.0).pdf document as it includes 2133MHz DDR3, and I used the now rather old H5GQ1H24AFR(Rev1.0).pdf document for the GDDR5.

I tried to find a document for the Samsung k4g41325fc-HC03 GDDR5, as that is what is in the PS4, but I was not able to find one that had the information.
 
a clear example being Diablo III, where they forced parity from Blizzard.

Blizz said 1080p PS4, 900p Xbox One.

Microsoft sent in the ninja devs (as I like to call them, tongue in cheek!) and got it to 1080p.

The way you put it makes it sound like a negative thing.
 

benny_a

extra source of jiggaflops
I agree with your That was then, now is now statement.
But are you really suggesting trusting Ubisoft makes you a reasonable person?
That the PS4 CPU outperformed the Xbone CPU was odd in the first place. PS4 is clocked lower by 150MHz.

When you see the specs, it's not what you expect. The explanation offered was that there are reserves built in (which were now freed) and the inherent inefficiency due to their virtualization setup.

Ubisoft's dancer graph is just one extra data point. About your question of trust: I don't trust Ubisoft the corporation because they want to sell you a product, but an individual engineer at GDC is not automatically disqualified because he is on Ubi's payroll.
 

Truant

Member
Don't you people see what's going on? Stop the stupid console wars. The game runs like shit on all platforms. This is Ubisoft's great sacrifice. A terrible game to join all gamers in a single cause. They did this for the greater good.

Assassin's Creed: Unity.
 

Marlenus

Member
That was then, now is now.

Any reasonable person sees the Ubisoft dancer graphs and concludes that the Xbone CPU is now marginally faster in real-world scenarios, despite the overhead which caused it to fall behind before they did the June SDK update.
That graph doesn't translate to the relative difference that is being seen here in the videos, but it is what it is. Arguing that the PS4 CPU is stronger in real-world scenarios post-June SDK is a fool's errand.

These are specific test scenarios. The Metro developers have stated that the Xbox One has high CPU overhead on draw calls compared to the PS4, so while the APIs might look very similar in a specific scenario, there is no evidence to suggest the same in a more generalised scenario.

As another poster has stated, it is most likely the Xbox One was the lead platform and they just ported the code over to the PS4 and did very little, if any, performance tuning for that platform.
 

jpax

Member
That the PS4 CPU outperformed the Xbone CPU was odd in the first place. PS4 is clocked lower by 150MHz.

When you see the specs, it's not what you expect. The explanation offered was that there are reserves built in (which were now freed) and the inherent inefficiency due to their virtualization setup.

Ubisoft's dancer graph is just one extra data point. About your question of trust: I don't trust Ubisoft the corporation because they want to sell you a product, but an individual engineer at GDC is not automatically disqualified.

Fair enough.
 

hodgy100

Member
Absolutely Spot On!
There's a huge difference in CPUs between the consoles, so obviously the X1 version would be better even though the PS4 GPU is superior and the GDDR5 RAM is better suited for games!

There is not a huge difference; the CPUs are identical bar clock speed, of which the X1 is only 150MHz faster. It is a small difference, but obviously one that makes a visible difference in this instance.
 

benny_a

extra source of jiggaflops
These are specific test scenarios. The Metro developers have stated that the Xbox One has high CPU overhead on draw calls compared to the PS4, so while the APIs might look very similar in a specific scenario, there is no evidence to suggest the same in a more generalised scenario.

As another poster has stated, it is most likely the Xbox One was the lead platform and they just ported the code over to the PS4 and did very little, if any, performance tuning for that platform.
Oh for sure. I'm not saying that the PS4 performs that way because of the slight difference in CPU performance according to the Ubisoft dancer graph.
This does not look like a port that performs exactly in line with the computational specs you throw at it. The discrepancy is often way too big and other times way too small.

I'm just saying that the PS4 CPU is worse in the real world at this point and posting about how it was performing better last year is ignoring the June SDK where Microsoft freed up resources.
 

jpax

Member
I got the DDR3 timings from the Computing_DDR3_H5TQ2G4(8)3CFR(Rev1.0).pdf document as it includes 2133MHz DDR3, and I used the now rather old H5GQ1H24AFR(Rev1.0).pdf document for the GDDR5.

I tried to find a document for the Samsung k4g41325fc-HC03 GDDR5, as that is what is in the PS4, but I was not able to find one that had the information.

So, the same document then. OK, thank you.
 
I will probably get this once it is on sale for $10; I would really like to see how it looks.

They make it sound like it actually is really pretty, but runs like crap.
 

Bl@de

Member
Absolutely Spot On!
There's a huge difference in CPUs between the consoles, so obviously the X1 version would be better even though the PS4 GPU is superior and the GDDR5 RAM is better suited for games!

150MHz is not a huge difference. I benchmarked my old CPU (X4 975) @ 3.6 and 4.2GHz (600MHz difference^^)... the 15% clock difference only showed a 5-8% fps increase...

So stop acting like 1.6GHz vs 1.75GHz is huge. It's 1-2%.
 

Fafalada

Fafracer forever
jpax said:
Did you get the information also from this document?
I would avoid speculating on the latency of memory subsystems unless there are 1:1 benchmarks available. E.g. the PS2 was lambasted for supposed RAM latency for years even though its real-world access times were actually the 2nd fastest of the 4 consoles of its era.

Durante said:
The fact that so many people consider it a magic bullet, and interchangeable with CPU performance, is quite the marketing success.
You're in a thread where people are taking an amalgamation of CPU & GPU jobs spread across 8 cores and X GPU resources, get a 25% difference at the end and go "AHA - it's because of the CPU clock speed being 10% different" - it's just silly assertions against silly assertions on all sides.
I mean we're discussing an application that is barely holding together at its seams - there's little guarantee different platforms have even shipped on the same data/code-version and a difference in either could account for dramatic performance differences even if hw-performance was identical. And that's assuming performance differences aren't purely down to bugs, rather than optimization, to begin with.

Panajev2001a said:
What about the rooftop scenes though? Open vistas with a much reduced NPC count...
IMO it's meaningless to even speculate on this - anything near the complexity of ACU is not a usable CPU benchmark with the number of variables involved (see my comments above).
And even if we assume those variables are NOT an issue - there's platform-specific codebase unknowns. As an extreme example - a number of Ubisoft's early PS3 games ran on a DirectX emulation layer, badly obfuscating any relative-hw performance metrics you might expect to get comparing cross-platforms.
 

jpax

Member
Oh and another side note. This is what happens if you are "fine with 30fps"... If your goal is 30fps there is a really good chance of getting 20fps. That game is the reason why guys get so upset with "30fps is fine" people.
 

Scanna

Member
I am NOT rooting for anybody, but I wrote to DF anyway, to shed some light on the PS4 vs X1 debate. I have multiple sources telling me that they are not experiencing frame drops in the PS4 version, at least no big ones, while I have been struggling with the X1 version.
Could it be some firmware thing? Maybe DF wrote the article before 2.02? I dunno, I can't call either the DF guys or my colleagues (Italian reviewers) liars. That's a pickle.
 

KOHIPEET

Member
It's probably simple:
X1 was "lead" platform and the PS4 port didn't get much love


I'm leaning towards this.

Also, the fact that the game looks exactly the same on both platforms - regarding resolution, texture and effects quality - suggests that the PS4's GPU is being underutilized, right?

So couldn't the unused part of the GPU be used to make up for the slightly lower-clocked CPU in the PS4? (GPGPU)
 
Oh and another side note. This is what happens if you are "fine with 30fps"... If your goal is 30fps there is a really good chance of getting 20fps. That game is the reason why guys get so upset with "30fps is fine" people.

Yeah put the fault on the players...makes sense.
 

quetz67

Banned
It's probably simple:
X1 was "lead" platform and the PS4 port didn't get much love

Yes probably.

And the game is unoptimized at a very high level, which makes it so CPU dependent.

At least there is no excuse not to make it 1080p on the PS4 while maintaining the current framerate.
 

Marlenus

Member
I would avoid speculating on the latency of memory subsystems unless there are 1:1 benchmarks available. E.g. the PS2 was lambasted for supposed RAM latency for years even though its real-world access times were actually the 2nd fastest of the 4 consoles of its era.

It is not speculation when you are using the data direct from the memory manufacturers regarding the latency. The memory controllers in both consoles are also going to be pretty much the same, so I find it very unlikely that they will have an impact on the relative latency of the systems.

Memory latency for the consoles is as good as the same, there is so little difference it is not worth mentioning.
 

Durante

Member
It is not speculation when you are using the data direct from the memory manufacturers regarding the latency.
Of course it's speculation. The raw latency of the memory chips doesn't tell you the overall latency of the memory subsystem.

If there's one point where there's a significant difference between the two console SoCs it's their respective memory architectures, so I don't think we have the information to do anything other than speculate about latency.
 

Ty4on

Member
150MHz is not a huge difference. I benchmarked my old CPU (X4 975) @ 3.6 and 4.2GHz (600MHz difference^^)... the 15% clock difference only showed a 5-8% fps increase...

So stop acting like 1.6GHz vs 1.75GHz is huge. It's 1-2%.

That depends entirely on the task.
The A5150 is 1.6GHz and the A5350 is 2.05GHz, while both are otherwise identical Jaguar CPUs:
[Benchmark chart: A5150 vs A5350, including Cinebench]

Without a calculator you can see Cinebench scales perfectly.
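As a toy calculation (the CPU-bound fraction is purely an assumption for illustration, not measured data): if only part of each frame's time actually scales with CPU clock, the fps gain from a clock bump shrinks accordingly.

```python
# Toy model (illustrative assumption, not measured data): if only a fraction
# of each frame's work scales with CPU clock, a clock bump buys less fps.

def fps_speedup(clock_ratio, cpu_bound_fraction):
    """Overall speedup when only cpu_bound_fraction of frame time scales with clock."""
    return 1.0 / ((1.0 - cpu_bound_fraction) + cpu_bound_fraction / clock_ratio)

ratio = 1.75 / 1.6  # Xbone vs PS4 CPU clock, ~9% higher

print(fps_speedup(ratio, 1.0))   # fully CPU-bound (Cinebench-like): ~1.09, the full ~9%
print(fps_speedup(ratio, 0.3))   # mostly GPU-bound game: only ~1.03, a few percent
```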
 