
Performance Analysis: Assassin's Creed Unity (Digital Foundry)



Woohoo 4 frame advantage.

Regardless, glad I didn't pick this one up.
 
RikerPeaceOutLOL.gif


Please just stop. You obviously have no clue what you are talking about and, no, false dichotomies are not going to hide your complete ignorance.

Quoting you, though I could quote anyone else overreacting to my posts with insults and stupidity.

Basically, the PS4 is the first pure unified memory design. There were earlier approaches with the N64 and 360, but both of those had private pools of memory (EDRAM or DMEM). The PS4's approach was possible thanks to the huge bandwidth provided by GDDR5, we know. But every design has its own downsides.

When people see 176GB/s they think they have 156GB/s for the GPU and 20GB/s for the CPU, but it doesn't work that way. Every time the memory has to change its electrical state, it loses cycles. Every time you change the job being done, you lose cycles. Every time you can't repeat an access pattern, you lose efficiency. And, of course, GPGPU isn't free, despite what many people in this thread argue.

Both components hammering the same pool of RAM just decreases its overall performance. This doesn't matter as much for the GPU, but it greatly hinders the CPU.

The way the One is designed, the CPU gets more efficiency on top of its already higher clock. I'm not saying the One has a better architecture than the PS4, just that some scenarios favour it, and this is one of them.

The sooner you accept this, the fewer preorders you will have to cancel.
 

Lemondish

Member
Increased CPU resources on the Xbox One is a subject that has already been discussed at length on this same forum. You just have to remember all those balance posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2GHz, refusing to accept any evidence of stock speeds they are given. Thread after thread.

Then they blame devs every time they feel the gap isn't wide enough, like an anorexic girl in front of the mirror, unable to see anything but her own belly.

Also, there are many people claiming that 150MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound this generation is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this launch game proves it. Deal with it.

There's a psych ward in hell dedicated to your special brand of silly.
 

Bastardo

Member
The main points of why Unity performs so horribly at all (developed only for PC, not taking GPU compute into account) were already made in this thread, but I wanted to summarize them with a case study here.

TL;DR is: Unity was developed for the PC and not converted appropriately to the architectures of the PS4 and XB1. Ubisoft calls this "being CPU-heavy", because they simply didn't offload enough onto the GPU.

Long explanation:

Facts:
1.) The consoles use an HSA architecture. Memory is shared between CPU and GPU.
2.) Comparatively weak CPUs (though with 8 cores each).
3.) The PS4 memory has about double the bandwidth (good) compared to the XB1.
Edit: It was pointed out to me that, taking the clock rate into account, the latency of the XB1 and PS4 memory is about comparable. I previously wrote that the PS4 has a much higher latency (this is wrong). Thanks to jpax for pointing it out. (68 GB/s DDR3 and 204 GB/s ESRAM for the XB1, 176 GB/s GDDR5 for the PS4.)
4.) PCs usually have separate memory for the CPU and GPU. The CPU memory handles random access well; the GPU memory has high bandwidth (good) but handles random access poorly.
5.) CPU memory access is very often random, which is what allows for complex algorithms. GPU memory access is usually aligned and therefore benefits from high bandwidth.
6.) Due to 5.), some algorithms run well on the GPU and some run well on the CPU.

Now imagine a very simple AssCreed algorithm: 3000 NPCs on screen each wave towards their nearest neighbour. The hard part of this algo is figuring out which NPC is the nearest neighbour.

A good CPU algo would look like this:
1.) Geometrically divide the 2D plane somehow (e.g. a KD-tree: a complex algo with lots of random memory access), complexity O(N*log(N))
2.) Query that data structure sequentially for each actor (3000 queries, each costing about log(N)) to find its nearest neighbour, also O(N*log(N)) in total

A bad CPU algo would look like this:
1.) For every pair of actors (3000*3000 = 9,000,000 pairs), calculate the distance. O(N^2)
2.) Keep track of the actor with the minimum distance so far. O(1) per pair
Total time O(N^2), which will destroy your performance (both CPU approaches are sketched in code below).
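
To make the difference concrete, here is a minimal Python sketch of both CPU approaches, assuming the 3000 NPCs are just 2D points; the KD-tree variant leans on scipy.spatial.cKDTree instead of a hand-rolled tree, and all the names are illustrative, not Ubisoft's actual code.

```python
# Hypothetical sketch of the two CPU approaches above (illustrative names,
# not Ubisoft code). Assumes 3000 NPCs as 2D points; the "good" variant uses
# scipy's cKDTree rather than a hand-rolled spatial tree.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
npcs = rng.random((3000, 2))           # x, y positions of 3000 NPCs

def nearest_bruteforce(points):
    """Bad CPU algo: O(N^2) pairwise distance checks."""
    n = len(points)
    nearest = np.empty(n, dtype=int)
    for i in range(n):
        best_j, best_d = -1, np.inf
        for j in range(n):
            if i == j:
                continue
            d = np.sum((points[i] - points[j]) ** 2)   # squared distance is enough
            if d < best_d:
                best_j, best_d = j, d
        nearest[i] = best_j
    return nearest

def nearest_kdtree(points):
    """Good CPU algo: O(N log N) tree build plus one ~log(N) query per actor."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=2)   # k=2: the closest hit of each point is itself
    return idx[:, 1]
```

On 3000 actors the brute-force loop does roughly nine million distance checks per frame, while the tree version does 3000 queries of a few comparisons each, which is the O(N^2) vs O(N*log(N)) argument in practice.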

When thinking about the GPU, however, the bad CPU algo becomes good:
1.) Assign each actor one GPU core (a GTX 980 has 2048 cores), so it takes two passes
2.) For each GPU core, calculate 3000 distances (exactly 3000 calls)
In other words: the GPU could run the bad CPU implementation in about 2*N cycles. The good CPU implementation does not really work well on the GPU, because it requires lots of random memory accesses (the data-parallel shape is sketched below).
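
For illustration, here is the same O(N^2) formulation rewritten in the data-parallel shape a GPU kernel would use. This is plain NumPy standing in for GPU code (a readability assumption, not real GPGPU): the point is that each row of the distance matrix is independent work that could be handed to one core.

```python
# Illustration only: the "bad" O(N^2) algo restructured so every actor's work is
# independent. On a GPU each row below would be one core's job; NumPy broadcasting
# is just a CPU-side stand-in for the shape of that kernel, not real GPGPU code.
import numpy as np

def nearest_dataparallel(points):
    diff = points[:, None, :] - points[None, :, :]   # (N, N, 2) pairwise differences
    dist2 = (diff * diff).sum(axis=-1)               # (N, N) squared distances
    np.fill_diagonal(dist2, np.inf)                  # an actor is not its own neighbour
    return dist2.argmin(axis=1)                      # nearest neighbour per actor
```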

In theory the following is therefore true (for the algo above):
GPU algorithm > good CPU algorithm > bad CPU algorithm

In practice, the following usually happens:
The GPU algorithm above is missing two easily overlooked steps:
new: 0.) Transfer the actor coordinates from main memory to GPU memory
old: 1.) Assign each actor one GPU core (GTX980 has 2048 cores), so it takes two passes
old: 2.) For each GPU Core calculate 3000 distances (exactly 3000 calls)
new: 3.) Transfer the result back to the CPU to use it further

Now if 1.) and 2.) take much longer than 0.) and 3.), this is completely acceptable. This holds true for very compute-intensive problems and for problems where step 3.) is unnecessary (think Lara Croft's hair: it just needs to be rendered on the GPU, not sent back to the CPU, because it has no gameplay impact).
No matter how fast your GPU is (even two Titans), steps 0.) and 3.) will always take about the same time, because they are limited by your main memory and PCIe bandwidth (a toy cost model follows below).
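
A toy cost model makes that tradeoff explicit; the 12 GB/s transfer rate and the function name are illustrative assumptions, not measured figures.

```python
# Toy cost model for the tradeoff above. The 12 GB/s transfer rate is an
# illustrative PCIe-ish figure, not a measured number.
def offload_pays_off(bytes_in, bytes_out, gpu_compute_ms, cpu_compute_ms,
                     transfer_gb_per_s=12.0):
    transfer_ms = (bytes_in + bytes_out) / (transfer_gb_per_s * 1e9) * 1e3
    return transfer_ms + gpu_compute_ms < cpu_compute_ms   # steps 0+1/2+3 vs CPU

# On an HSA console transfer_ms is effectively zero, so this collapses to
# gpu_compute_ms < cpu_compute_ms, which is why even tiny jobs can be offloaded.
```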

Therefore, on a non-HSA architecture, the following usually happens for small data sizes in practice:
good CPU algorithm > GPU algorithm > bad CPU algorithm

Keeping in mind that the PS4 and the XB1 are HSA architectures, those memory transfers are not required. This is the one thing that elevates them above most Intel/Nvidia PC architectures. Yes, your Titan is always going to have a better fillrate, and yes, it's going to be the only contender for rendering at 4K, but it will always require memory transfers between main and GPU memory. So there is always a tradeoff before GPU compute becomes viable. GPU compute does not incur the overhead on the XB1 and PS4 that it does on most (non-Kaveri) PCs. Quite certainly Ubisoft went the PC route and ported the code over to PS4 and XB1 without using much GPU compute, so the code is very CPU-heavy. And since typical optimized PC code uses very many random accesses, it sucks on both the Xbox and the PS4.
Edit: I previously explained that it runs better on the XB1 because I thought the XB1 had lower-latency memory. This was wrong, as the latency difference is too small to account for the performance gap.

Bottom line:
It is easy to code efficiently on the new consoles, much easier than for the Cell, because unlike on the PC, GPU compute does not incur a transfer overhead, and you should use it whenever possible, even for very small computations.
 
I am NOT rooting for anybody, but I wrote to DF anyway, to shed some light on the PS4 vs X1 debate. I have multiple sources telling me that they are not experiencing frame drops in the PS4 version, at least no big ones, while I have been struggling with the X1 version.
Could it be some firmware thing? Maybe DF wrote the article before 2.02? I dunno, I can't call either the DF guys or my colleagues (Italian reviewers) liars. That's a pickle.

Interesting. And Dragon Age seems to also have issues if you don't use the latest firmware, so it may have something to do with that.
 
TL;DR is: Unity was developed for the PC and not converted appropriately to the architectures of the PS4 and XB1. Ubisoft calls this "being CPU-heavy", because they simply didn't offload enough onto the GPU.

The console game was developed by the main studio and the PC version was outsourced.
 
The way I see it:

Assassin's Creed Unity was a rushed project that obviously needed more time in the oven. It was an ambitious title on a new engine.

When a project gets rushed out the door, it's doubtful that both versions were given proper development time. There's a good chance one platform is lagging behind.

Imagine if Ubisoft had pushed out Watch Dogs in 2013 for the PS4 and Xbox One launch.
The game was obviously buggy and lacking polish. The frame rate was weak, too.
Can you imagine what the Xbox One version would have looked like? It probably would have been disastrous and would have demonstrated the largest gap between the Xbox One and PS4 we had ever seen.

If Ubisoft actually does give a shit about the performance of this game and decides to release a patch, I wouldn't be surprised to see the gap between these two versions close.
 

jpax

Member
The main points of why the XB1 performs better in this game (lower memory latency) and why it performs so horribly at all (developed only for PC) were already made in this thread, but I wanted to summarize them with a case study here.

TL;DR is: Unity was developed for the PC and not converted appropriately to the architectures of the PS4 and XB1. Ubisoft calls this "being CPU-heavy", because they simply didn't offload enough onto the GPU.

Long explanation:

Facts:
1.) The consoles use an HSA architecture. Memory is shared between CPU and GPU.
2.) Comparatively weak CPUs (though with 8 cores each).
3.) The PS4 memory has about double the bandwidth (good) compared to the XB1. It also has a much higher latency (bad). (68 GB/s DDR3 and 204 GB/s ESRAM for the XB1, 176 GB/s GDDR5 for the PS4.)
4.) PCs usually have separate memory for the CPU and GPU. The CPU memory has low latency; the GPU memory has high bandwidth (good) and high latency (bad).
5.) CPU memory access is very often random and therefore benefits from low latency. GPU memory access is usually aligned and therefore benefits from high bandwidth.
6.) Due to 5.), some algorithms run well on the GPU and some run well on the CPU.

Why would you repeat the low latency nonsense? You taint your whole post with that.
 

score01

Member
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.
 

Raist

Banned
Increased CPU resources on the Xbox One is a subject that has already been discussed at length on this same forum. You just have to remember all those balance posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2GHz, refusing to accept any evidence of stock speeds they are given. Thread after thread.

Then they blame devs every time they feel the gap isn't wide enough, like an anorexic girl in front of the mirror, unable to see anything but her own belly.

Also, there are many people claiming that 150MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound this generation is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this launch game proves it. Deal with it.

I suppose that totally explains the, what, 20 other multiplats that run better on PS4 at higher resolutions, too? It's because this new gen is totally CPU-bound and the XB1 is so much better at it, right?
 

HIR0

Member
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

michael-jordan-laugh.gif


Edit: yay! I'm a member now. Thanks mods :]
 

Dicer

Banned
Thanks for the many chuckles in this thread, it's been pretty damned amazing....


OT: instead of blaming the consoles for the performance, why not blame Ubi for overreaching, creating something far too unoptimized and bloated, and pushing it out the door? If there was ever a time for the Xbone/PS4/PC crowd to be tri-partisan, this would be it.
 
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

Tell this to misterxmedia as well. He was right from the beginning; this is the point where the secret sauces and hidden chips come into play!
 
That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2GHz, refusing to accept any evidence of stock speeds they are given. Thread after thread.
I've never seen anyone talking about the PS4 CPU being overclocked in any thread (other than speculation threads from 2013). Can you give us 2 or 3 examples of this? Thanks.
 
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.
I'm pleased to see you're not getting carried away with all this.
 

jimi_dini

Member
Phil: So we are getting screwed on the graphics front, it's creating a bad image for the X1 and will stop us from world domination. $25 million ought to do it?

Yves: Hoh-hoh je suis le fromage hoh hoh for $25 mill, I wipe my french ass with $25mill I want a life size diamond statue of Aisha Tyler, nothing else

Phil: I will have to pull a few strings, but yes, I can sort that out. But if the PS4 version is not inferior, MS will destroy you faster than you can say Sacre bleu.

Yves: Oui oui

Phil: So yes?

Yves: no I have to take a piss you stupid american

Phil: huh

Yves: Let me tell you of zee secret of ubisoft. all our games are programmed using zhe Quiz Whiz

I want a crepe hoh hoh

oOFGU0Z.gif
 
I find it interesting that despite PS4 having bigger numbers, XBone seems to have an edge here because words and the recent words. According to my calculations, PS4 should be pushing numbers but for some reason the code for words is largely unoptimized for words words words, most likely due to the game being on PC.
 
Thanks for the many chuckles in this thread, it's been pretty damned amazing....


OT: instead of blaming the consoles for the performance, why not blame Ubi for overreaching, creating something far too unoptimized and bloated, and pushing it out the door? If there was ever a time for the Xbone/PS4/PC crowd to be tri-partisan, this would be it.

Get the hell out of here with your common sense and stuff - this is not the place.

OT: I agree. The whole thing is disappointing on so many levels across all platforms.
 
I find it interesting that despite PS4 having bigger numbers, XBone seems to have an edge here because words and the recent words. According to my calculations, PS4 should be pushing numbers but for some reason the code for words is largely unoptimized for words words words, most likely due to the game being on PC.

This is now my favorite post in this thread.
 

patientx

Member
They obviously have one solution now: lock both consoles to 24 fps for dat cinematic experience. Also, people won't notice any frame drops then... :)
 
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

Wow... some people in this thread..

Djeezus, people. It would be wise not to buy this game on any of the consoles because they made the wrong decisions while making the game.

People would rather play a smooth game than look at 3000 useless copy-pasted characters popping into the screen. They also like the game NOT to be boring, and to walk around in a world that is not boring.

It's a shame. The setting could have been sooo good.
 
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

So you are telling me that MS has won the gen because of 4 extra frames? This is why Ubisoft doesn't make games designed with NeoGAFfers in mind. lol
 

dofry

That's "Dr." dofry to you.
I find it interesting that despite PS4 having bigger numbers, XBone seems to have an edge here because words and the recent words. According to my calculations, PS4 should be pushing numbers but for some reason the code for words is largely unoptimized for words words words, most likely due to the game being on PC.

Word.
 

Percy

Banned
Increased CPU resources on the Xbox One is a subject that has already been discussed at length on this same forum. You just have to remember all those balance posts.

I know there are people here who need the PS4 to be superior to the One in every regard. That leads to people repeating in every thread that the PS4 CPU might be overclocked to 1.8 or even 2GHz, refusing to accept any evidence of stock speeds they are given. Thread after thread.

Then they blame devs every time they feel the gap isn't wide enough, like an anorexic girl in front of the mirror, unable to see anything but her own belly.

Also, there are many people claiming that 150MHz is negligible based on their desktop computers, not taking into account how heavily CPU-bound this generation is and how low-clocked those CPUs are to start with.

You were wrong about paritygate, and this launch game proves it. Deal with it.

Using anorexia as a serious point of analogy for a subject as trivial as this... wow.

I should probably be shocked by this, but seeing as this is a thread where some seem to genuinely believe that significant differences between two pieces of locked hardware already available to consumers are a 'gap' that can be 'closed', appalling shit like this somehow seemed like the only direction the posts could go.

Reminds me of "PS4 gives you brain tumours" guy from a similar thread a couple of weeks ago.
 

RexNovis

Banned
Quoting you, though I could quote anyone else overreacting to my posts with insults and stupidity.

Basically, the PS4 is the first pure unified memory design. There were earlier approaches with the N64 and 360, but both of those had private pools of memory (EDRAM or DMEM). The PS4's approach was possible thanks to the huge bandwidth provided by GDDR5, we know. But every design has its own downsides.

When people see 176GB/s they think they have 156GB/s for the GPU and 20GB/s for the CPU, but it doesn't work that way. Every time the memory has to change its electrical state, it loses cycles. Every time you change the job being done, you lose cycles. Every time you can't repeat an access pattern, you lose efficiency. And, of course, GPGPU isn't free, despite what many people in this thread argue.

Both components hammering the same pool of RAM just decreases its overall performance. This doesn't matter as much for the GPU, but it greatly hinders the CPU.

The way the One is designed, the CPU gets more efficiency on top of its already higher clock. I'm not saying the One has a better architecture than the PS4, just that some scenarios favour it, and this is one of them.

The sooner you accept this, the fewer preorders you will have to cancel.

Picard-facepalm-o.gif


Well, you would have a point if the PS4 didn't have hUMA and simultaneous read/write access to the RAM for the CPU and GPU. The fact is the single advantageous hardware feature for the XB1 is a 150MHz up-clock on the CPU. It is outperformed in literally every other capacity by the competing platform. The FPS gap between versions is not proportional to a minor CPU clock boost.

This, combined with the fact that we have members going on record saying that the XB1 was the lead platform, would indicate it's an optimization issue, not the result of any hardware advantage. There's also the matter of resolution and effects parity, which essentially means the GPU advantage the PS4 enjoys is just being left unused for some unknown reason. Readily available and easily utilized power on a locked hardware system is being left unused, and you are claiming some superiority for the system that served as lead platform and as such received extensive optimization. Either you're being purposefully obtuse or you're just incapable of grasping the big picture. There are more factors involved here than hardware limitations.

But hey don't just take my word for it.

We have a vetted dev (matt) who said (paraphrasing here) that there's no practical scenario where the Xbone beats the PS4 in performance. He also commented in this thread saying there's no excuse for this.
 

ascar

Neo Member
Man, people really came out of the woodwork with the latency thing, ahaha.
Any technical discussion of this game is completely useless since the game is just an utter unoptimized mess, and we don't know how much is unoptimized on each of the platforms involved... we should just wait for a couple of patches instead of trying to build theories founded on no logical basis...
 

Hanmik

Member
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

K2uKMCX.gif
 

Bastardo

Member
Why would you repeat the low latency nonsense? You taint your whole post with that.

Yes, you are right. I presented the latency of DDR3 as clear-cut in favor of the XB1, when in reality both are more like 10ns to 12ns at most. It is not as clear as I presented it, and I was wrong to do so.

Here is a quote which actually underlines your point. From here: http://www.redgamingtech.com/ps4-vs-xbox-one-gddr5-vs-ddr3-latency/
"The latency on GDDR5 and DDR3 memory does vary some, but the typical CAS latency of most sticks of DDR3 is between 7 and 9 (see the earlier Tomshardware link for more information). GDDR5 RAM meanwhile is a typical latency of about 15. Once again, all of this depends on the make and models of the RAM.

So – there's the answer right? Let's say 8 CAS vs 15 CAS? No, it's not. We have to remember that the speeds are, for all intents and purposes, relative. If you take the CAS of both, and then divide it by the clock speed – you get the ns of delay. CAS of 15/1.35 = 11ns."
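
As a quick sanity check of that arithmetic (a minimal sketch; the DDR3-1600 CL9 and GDDR5 CL15 at 1.375 GHz figures are illustrative examples, not confirmed console memory specs):

```python
# Reproducing the arithmetic from the quote: CAS latency in ns is the CAS cycle
# count divided by the command clock in GHz. Clock/CL pairs below are illustrative
# examples, not confirmed console memory specs.
def cas_latency_ns(cas_cycles, command_clock_ghz):
    return cas_cycles / command_clock_ghz

print(cas_latency_ns(9, 0.8))      # e.g. DDR3-1600 CL9         -> 11.25 ns
print(cas_latency_ns(15, 1.375))   # e.g. GDDR5 CL15 @ 1.375GHz -> ~10.9 ns
# Both land around 11 ns, so latency can't explain the frame-rate gap.
```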


I will edit my post reflecting that.
Edit: The edit of my post above is finished.
 

Emedan

Member
Because GPUs are good at maths (physics, collision, graphics) but not at decision-making, which makes them pretty bad at AI and game logic.
So allocate to the GPU the tasks it can handle, to free up the CPU; isn't that the point of GPGPU?

First they need to have the game in beta longer than 1 week, for starters. It's completely unpolished.
So it seems. I wonder what happened to the developers who took pride in technical accomplishments and strived for them... this conveyor-belt mentality towards making games is terrible.
 
The fanboys in this thread are blowing my mind, anus, and everything in between.

Why are we not condemning Ubisoft for this obvious shit job of a video game? (I know some are, but the majority are arguing about why it runs better on X1, which is obviously because the game is a turd.)
 

RobRSG

Member
I find it interesting that despite PS4 having bigger numbers, XBone seems to have an edge here because words and the recent words. According to my calculations, PS4 should be pushing numbers but for some reason the code for words is largely unoptimized for words words words, most likely due to the game being on PC.

Best post.

Another cool thing is that lots of high-profile technical guys keep appearing out of nowhere.
 
Really??

People are using this half-baked piece of cr@p as some kind of victory for the Bone?

It is performing badly on both platforms.

I'm flabbergasted. Regardless of leaning towards a certain console, we as gamers should be sticking two fingers up at Ubisoft, not giving any credence to the idea that this proves the Bone's technical superiority.
 
You guys are still fighting console warz even though everyone with half a brain should be blaming Ubisoft for rushing this shit out, and nothing else?
Come on!

This right here. Now is not the time to be divided. We must come together to overthrow the gaming developer aristocracy that has ruled this generation and last with an ignorant, unwavering iron fist.

No longer should we the people accept scraps of framerate, the secrecy of embargoes, and excuses for poor products from the so-called elite who feel they hold the power to crush our wallets with an iron boot.

We must no longer lay down our wallets in advance. We must no longer lay down our wallets for half a cake, only to lay them down again for the rest! We must stand and make them deliver the products we deserve. For we hold the power! We hold the money! They do not survive without us!

Viva gaming! Viva la revolución!




Ah who am I kidding. 5 million people already made this game profitable for them. I'm sure of it.
 