
Performance Analysis: Assassin's Creed Unity (Digital Foundry)

You're just pulling stuff out of your ass, purely based on a clock value for the XB1 CPU.
The rest isn't based on anything, and it contradicts what devs posting here, like Matt and Fafalada, have said on the topic. But sure, carry on.

That hobby you have of always being late surely has a name; I'll look it up. Later, so you can get it in time.

Irrelevant. Right now I don't need to upgrade my PS4's hard drive; everything I need is included in the $400 price.

You don't need 32GB of DDR4 RAM and an octa-core Intel CPU to run this game above 30fps with better settings than the consoles, either.
 
Between this and COD: AW, I'm beginning to wonder if I made the right decision in selling my Xbone and keeping my PS4 because it was the more powerful/capable of the two.

If multiplats keep turning up best on Xbone, I may need to make a switch.


People keep saying this! Did COD get a post-launch 1080p patch for the XB1 I missed?
 

hodgy100

Member
Let's pretend you can't run the same code on different platforms without worrying about much more than which compiler flags to use for each.

youre_serious_futurama.gif


So look up the differences between writing an OpenGL renderer and a DX renderer; now you have an idea of how code needs to differ between platforms. And that's before taking into account differences in the rest of the API for, say, interfacing with the system's OS for networking, file I/O, trophies/achievements, setting up job systems and managing multiple threads, controller handling, accessory handling... I can go on and on.
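To make that concrete, here is a minimal, hypothetical C++ sketch of one trivial task, uploading a vertex buffer, spelled out in OpenGL and in Direct3D 11. The function names are invented, a GL context and a D3D11 device are assumed to already exist, and error handling is skipped; multiply this kind of difference across a whole engine and you get a sense of the per-platform work.

// Hypothetical sketch: same job, two graphics APIs, different shapes of code.
#include <GL/glew.h>   // OpenGL path (assumes a current GL context)
#include <d3d11.h>     // Direct3D 11 path (assumes an existing device)

GLuint upload_gl(const void* verts, size_t bytes)
{
    GLuint vbo = 0;
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);                          // bind-to-edit model
    glBufferData(GL_ARRAY_BUFFER, bytes, verts, GL_STATIC_DRAW); // upload in one call
    return vbo;
}

ID3D11Buffer* upload_d3d11(ID3D11Device* dev, const void* verts, UINT bytes)
{
    D3D11_BUFFER_DESC desc = {};                                 // descriptor-struct model
    desc.ByteWidth = bytes;
    desc.Usage     = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = verts;                                        // initial data supplied at creation
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&desc, &init, &buf);
    return buf;
}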
 

RexNovis

Banned
Do you know what unoptimized means?

Just out of curiosity.

According to him optimization doesn't matter because it's the same code:

Xbox or PS4 being the lead platform makes no difference. They are running the same code. So everything that you code for one will run mostly the same or better on the other.

Where you all say there are unused GPU resources on the PS4, I clearly see a system struggling to hold 30 FPS even in cutscenes, while still performing better than the Xbox. Maybe your expectations are unrealistic.

The minor CPU advantage on Xbox is a fact, and the AC4 analysis is consistent with things we know about both consoles.

So code magically works across platforms with "mostly the same or better" performance.

Simply put, there are not enough facepalm GIFs on the Internet to demonstrate the fail.
 

omonimo

Banned
There are always more things you can do with a later deadline. Unity has been in development (not saying full production) for 4 years. Console specs have been known since 2012, so Ubisoft had the time to exploit the hardware; based on the data we have, I suspect they aimed too high for these consoles.
So they worked on this hardware for 4 years, and only in the last month do they say 'oops, we aimed too high'. Very competent.
 

Nafai1123

Banned
According to him optimization doesn't matter because it's the same code:



So code magically works across platforms with "mostly the same or better" performance.

Simply put, there are not enough facepalm GIFs on the Internet to demonstrate the fail.

Lol, makes sense.
 

Ateron

Member
So they worked on this hardware for 4 years, and only in the last month do they say 'oops, we aimed too high'.

More like "oops, we forgot that we make open world games that, by nature, don't run in controlled environments and are subject to a bunch of player induced variables that might tank the fps. We also forgot that we have been showing the game on top of the line pcs that run laps around what these consoles can achieve, on top of it being advertised on several controlled vertical slices of the game that in no shape or form will correspond to the end result, especially on consoles. tee heee, they will buy it anyways, screw it. Just make sure no one gets to review the game until it's out there and on people's hands and we're golden."
 
According to him optimization doesn't matter because it's the same code:



So code magically works across platforms with "mostly the same or better" performance.

Simply put, there are not enough facepalm GIFs on the Internet to demonstrate the fail.

Tell me how you would manage to make a given architecture, with all those gigaflops and all that GDDR5, run the same code worse than the same architecture with DDR3 and M$ logos all over it.

Oh, wait, moneyhats.

Some day the Sony fanbase will stop blaming devs for everything that goes wrong. That day isn't close.


Still waiting for some dev to come here and tell us all that the PS4's CPU performs better than the Xbox One's. Not that the console is better overall; that much is clear enough. Can you tell the difference?
 

JohnnyFootball

GerAlt-Right. Ciriously.
Don't worry. Once they see that the PS4 version of this game sells a lot more than the Xbox One version, Ubisoft will think twice before making the Xbox One version of their games the lead platform.

Aside from Splinter Cell, Ubisoft has traditionally had more marketing deals with Sony. Ironically, Far Cry 4 next week will have Sony marketing behind it.

I remember people commenting that Far Cry 3 ended up being what everyone hoped Assassin's Creed 3 would be!

Maybe the same thing will happen with AC: Unity and Far Cry 4.
 

Chobel

Member
Can you tell the difference between CPU and GPU?

Just out of curiosity.

Yep, now do you know the meaning of "There is nothing about the XBO that allows it to do anything over the PS4" and "In essentially every way, the PS4's hardware is more powerful than the XBO's."?

Just out of curiosity.
 
According to him optimization doesn't matter because it's the same code:



So code magically works across platforms with "mostly the same or better" performance.

Simply put, there are not enough facepalm GIFs on the Internet to demonstrate the fail.

Yeah, it's super clear they have never worked with cross-platform or cross-API code, so, as you've said, there's no point putting much value in what they're saying.
 

Ateron

Member
Yep, now do you know the meaning of "There is nothing about the XBO that allows it to do anything over the PS4" and "In essentially every way, the PS4's hardware is more powerful than the XBO's."?

Just out of curiosity.

I wouldn't waste more time on that, mate. We have given those examples over and over again, and it still isn't enough for some people.
 

Marlenus

Member
Tell me how you would manage to make a given architecture, with all those gigaflops and all that GDDR5, run the same code worse than the same architecture with DDR3 and M$ logos all over it.

Well, why would an SQL query with sub-selects be slower than a query that gives you the same result set but uses joins on indexed tables? The simple answer is that there is often more than one way to get from A to B, and the fastest way in API 1 might not be the fastest way in API 2, even when running on the same hardware.
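To put the same analogy in code (a hypothetical C++ sketch, nothing to do with the game's actual data): both routines below return the identical answer, but which one is faster depends entirely on the sizes involved and how often you call it, just like the sub-select vs. join case.

// Hypothetical illustration: same result, two very different cost profiles.
#include <unordered_set>
#include <vector>
#include <cstddef>

// Linear scan per query: fine for a handful of queries, terrible for many.
std::size_t count_hits_scan(const std::vector<int>& data, const std::vector<int>& queries)
{
    std::size_t hits = 0;
    for (int q : queries)
        for (int d : data)
            if (d == q) { ++hits; break; }
    return hits;
}

// Build an index first: the up-front cost only pays off past a certain workload size.
std::size_t count_hits_indexed(const std::vector<int>& data, const std::vector<int>& queries)
{
    std::unordered_set<int> index(data.begin(), data.end());
    std::size_t hits = 0;
    for (int q : queries)
        if (index.count(q)) ++hits;
    return hits;
}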
 
Yep, now do you know the meaning of "There is nothing about the XBO that allows it to do anything over the PS4" and "In essentially every way, the PS4's hardware is more powerful than the XBO's."?

Just out of curiosity.

Thread name:

Virtual testing of PS4 and XBO GPUs prove PS4 has bigger grafix numbers :

I fully agree with that quote in that context.

Now keep talking about the CPU. Show me where he, or any other dev, says that the Xbox's CPU is worse in essentially every way.
 
Well, why would an SQL query with sub-selects be slower than a query that gives you the same result set but uses joins on indexed tables? The simple answer is that there is often more than one way to get from A to B, and the fastest way in API 1 might not be the fastest way in API 2, even when running on the same hardware.

So it's Sony's fault for a lackluster API, as far as its ability to run AC:U goes.

Lazy Sony :mad:
 
Thread name:

Virtual testing of PS4 and XBO GPUs prove PS4 has bigger grafix numbers :

I fully agree with that quote in that context.

Now keep talking about the CPU. Show me where he, or any other dev, says that the Xbox's CPU is worse in essentially every way.

Is there even a point to all your posts beyond trying to shoehorn why badly coded game X runs slightly less badly on platform A compared to platform B into reason Y?

So it's Sony's fault for a lackluster API, as far as its ability to run AC:U goes.

Lazy Sony :mad:

Yeah, you're either utterly clueless or just completely trolling.
 

Sweep14

Member
Mark Cerny am cry.

All that time, devising the super-duper-turbo-charged-system with the GDDR rams and the additional ROPS and ACE units and for what?

MS overclocks their CPU by 10% a couple of weeks before launch and BAM. They have won the gen.

I now understand the true meaning behind the infamous Albert P post. We really do owe him an apology.

Hope you're being sarcastic there, otherwise...
 

cakely

Member
Came in to check on this thread, and, wow! It looks like someone here is using a broken game as proof of ... let's see ... that Sony can't write an API?

Truly amazing.
 

KidJr

Member
I won't speak on matters I know nothing about, but for people who believe you can't run AI algorithms on the GPU: you would be very, very, very wrong.
 

SapientWolf

Trucker Sexologist
The way the game is coded relies heavily on the CPU, where the X1 has a minor 150MHz advantage, all else being equal. Both of these machines, the PS4 especially, have been designed to offload tasks like hordes of brainless NPCs from the CPU. Given the bug-riddled, unoptimized state of the game (it even performs like shit on high-end PCs), it is likely that Ubisoft did not properly leverage GPGPU compute in crowd scenarios. If they had, the performance gulf would likely be significantly in the PS4's favor.

That's why you have a verified third-party dev, Matt, being surprised in this very thread and saying, "Wow." He also commented that he was tempted to say there wasn't an excuse for this, but he changed his comment since he wasn't actually there at Ubisoft to see the particular scenario they encountered.
I don't think it's the CPU. If it were, the PS4 could move up to 1080p, since a higher resolution shouldn't have an effect in a CPU-limited game. The relatively small difference in CPU capabilities alone doesn't explain the wide performance gap we saw in the DF video. A member of the AI team even stated that the AI isn't using a lot of processing power. It looks like most of them just go through canned animations, and the people in the background don't have a lot of detail or behavior. The only people you interact with are the ones you are pushing out of the way.

Also, the PS4's low point was when there were a lot of shader effects on screen (https://www.youtube.com/watch?v=clQfCP3NFuc#t=45s). Similar story with the XB1, and I noticed that there was no volumetric lighting coming through the window in the XB1 scene, so it may be that the PS4 is running higher graphical settings. There wasn't a large crowd on screen doing complex things. The scene with the large crowd at the end kept a steady 30fps.

The team wrote a white paper on GPGPU, so it's not like they ignored it. There's just a limited subset of algorithms you can use it on (i.e. ones well suited to SIMD). I suspect the problem is memory-related, and 32GB of DDR4 plus another 4GB of GDDR5 on the GPU provide a crap ton of memory bandwidth. That's something devs seem to struggle with on the consoles (thus the lack of AF).
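For what "GPGPU compute in crowd scenarios" could look like in practice, here is a deliberately simplified, hypothetical CUDA sketch (invented names, not Ubisoft's code): one thread per background NPC running the same small, branch-light steering update, which is exactly the SIMD-friendly kind of work being described above.

// Hypothetical crowd update: every thread does the same short, branch-light work.
#include <cuda_runtime.h>

struct Agent { float2 pos, vel; };

__global__ void step_crowd(Agent* agents, int n, float2 goal, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    Agent a = agents[i];
    float dx = goal.x - a.pos.x;       // steer every agent toward a shared goal point
    float dy = goal.y - a.pos.y;
    a.vel.x += 0.1f * dx * dt;
    a.vel.y += 0.1f * dy * dt;
    a.pos.x += a.vel.x * dt;           // integrate position
    a.pos.y += a.vel.y * dt;
    agents[i] = a;
}

// Host side, once d_agents holds n agents in GPU-visible memory:
// step_crowd<<<(n + 255) / 256, 256>>>(d_agents, n, goal, dt);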
 
Still waiting for some dev to come here and tell us all that the PS4's CPU performs better than the Xbox One's. Not that the console is better overall; that much is clear enough. Can you tell the difference?

As a discrete piece of hardware, the X1's CPU is clocked 150MHz higher than the PS4's CPU and is otherwise exactly the same CPU. It is known that, by itself, it has a minor advantage. There was a question about how the X1's CPU performed functionally, especially prior to the SDK updates and the relinquishing of the in-game Kinect CPU reserve. Matt and others said in prior threads that it was easier to get performance out of the PS4's CPU.

This may be due to the memory subsystem setup: the fact that the PS4 actually leverages hUMA, with its unified pool of high-speed memory and volatile memory pointers for data tagged as needing to be worked on by both the CPU and the GPU. Not having to copy information back and forth between separate pools of CPU and GPU memory means less copying/memory overhead, less bandwidth usage, lower latency, and more CPU and GPU cycles left over to do productive work.

This could be one possible explanation for why, in most scenarios, the PS4's CPU outperforms the X1's even though the latter is clocked higher: the utilization of the PS4's CPU (and by extension its GPU) may simply be more efficient in most cases.
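As a rough illustration of the copy-overhead point, in CUDA terms rather than either console's actual API (so treat it as an analogy, with invented kernel and variable names): the first path below stages data between separate CPU and GPU allocations around a kernel, while the second uses one shared allocation that both sides touch directly.

// Hypothetical sketch: explicit split-memory copies vs. a unified allocation.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

__global__ void scale(float* data, int n, float k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= k;
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Split pools: separate CPU and GPU allocations, plus a copy out and back per GPU pass.
    float* host = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;
    float* dev = nullptr;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // copy #1
    scale<<<(n + 255) / 256, 256>>>(dev, n, 2.0f);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // copy #2

    // Unified allocation: one pointer visible to both sides, no explicit staging copies.
    float* shared = nullptr;
    cudaMallocManaged(&shared, bytes);
    for (int i = 0; i < n; ++i) shared[i] = 1.0f;            // CPU writes it directly
    scale<<<(n + 255) / 256, 256>>>(shared, n, 2.0f);
    cudaDeviceSynchronize();
    printf("%.1f %.1f\n", host[0], shared[0]);               // CPU reads the result directly

    cudaFree(dev);
    cudaFree(shared);
    free(host);
    return 0;
}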
 

Chobel

Member
Thread name:

Virtual testing of PS4 and XBO GPUs prove PS4 has bigger grafix numbers :

I fully agree with that quote in that context.

Now keep talking about the CPU. Show me where he, or any other dev, says that the Xbox's CPU is worse in essentially every way.

Are you kidding me? Are you even reading the comments I linked? "In essentially every way" means virtually and non-virtually.

His other comment (which I also linked) was in the "Sniper Elite 3 Digital Foundry Face-Off" thread. Do you see "virtual" in that title?

And FFS! Why are you ignoring that he said, right in this thread, that there is no excuse for this?

No one is arguing that the PS4's CPU is more powerful; you're just twisting what we're saying so it fits your narrative.
 
I won't speak on matters I know nothing about, but for people who believe you can't run AI algorithms on the GPU: you would be very, very, very wrong.

Well, I hope they will find a way to shift more work to the GPU.
Nvidia has something called GAI:
http://www.geeks3d.com/20100606/gpu-computing-nvidia-gpu-artificial-intelligence-technology-preview/:

"This technology preview is a snapshot of some internal research we have been working on and talking about at various conferences for the past couple years. The level of interest in GPU-accelerated AI has continued to grow, so we are making this (unsupported) snapshot available for developers who would like to experiment with the technology."

.
 

SpotAnime

Member
For a GPU benchmark to be relevant, one has to make sure the CPU is out of the way. You want to know how the various cards stack up against each other.
With both CPU and GPU benchmarks you can easily find out how "average" hardware does in a game. Look at what a measly i3 achieves in such a CPU-bound game: pair it with a mid-range GPU and you get a very solid 30fps at 1080p, along with more effects than the consoles.

That makes sense. I would eventually like to see someone create some kind of +/-% performance scale depending on the CPU and GPU used, relative to some hardware baseline. I know that's easier said than done, but I bet some approximation of performance could be achieved.
 

KidJr

Member

Dude, come on man, I'm not trying to start a separate debate, but right now I'm looking at a k-means clustering algorithm that has been optimized to run on a pretty average GPU and seeing a hell of a performance increase.

Right now, in front of me. Please do not tell me that AI-type algorithms aren't suited to GPU use.

I don't get what you're trying to show me; I'm saying AI is very well suited to GPU use???
 

Randdalf

Member
Dude, come on man, I'm not trying to start a separate debate, but right now I'm looking at a k-means clustering algorithm that has been optimized to run on a pretty average GPU and seeing a hell of a performance increase.

Right now, in front of me. Please do not tell me that AI-type algorithms aren't suited to GPU use.

I don't get what you're trying to show me; I'm saying AI is very well suited to GPU use???

GPUs are very good at embarrassingly parallel algorithms that have minimal branching (i.e. decision making). AI code is basically all decision making; it is the absolute worst sort of thing you could ask a GPU to do unless it is drastically trivialised.

Incidentally, doing k-means on a GPU is probably a bad idea too and could be done more effectively on a CPU.
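A tiny, kernel-only CUDA sketch of that branching point (hypothetical names, just for illustration): the 32 threads of a warp share one instruction stream, so when a data-dependent branch splits them, the hardware runs both sides with the inactive lanes masked off and you pay roughly for the sum of the two paths.

// Hypothetical divergence example: both branches execute when a warp disagrees.
#include <cuda_runtime.h>

__global__ void decide(const float* state, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    if (state[i] > 0.5f)            // "decision making": neighbours in the same warp
        out[i] = state[i] * 2.0f;   // may take this path...
    else
        out[i] = state[i] * 0.5f;   // ...while the rest take this one; the warp serialises both
}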
 

KidJr

Member
GPUs are very good at embarrassingly parallel algorithms that have minimal branching (i.e. decision making). AI code is basically all decision making; it is the absolute worst sort of thing you could ask a GPU to do unless it is drastically trivialised.

Incidentally, doing k-means on a GPU is probably a bad idea too and could be done more effectively on a CPU.

Not when you offload the compute-intensive parts of the algorithm to the GPU. It's MUCH faster.

It's more complicated, yes, because the speed at which the GPU and CPU communicate is considerably slower. But even that time cost is offset by the speed gains of using GPU compute.

May I ask why you think this could be done more efficiently on the CPU alone?

GPU compute is also good for decision-making algorithms. I'm heading out of work but will edit this or respond shortly.
 
Quoting you, though I could quote anyone else overreacting to my posts with insults and stupidity.

Basically, the PS4 is the first pure unified memory design. There were previous approaches, with the N64 or the 360, but both of them had private pools of memory (EDRAM or DMEM). This was possible thanks to the huge bandwidth provided by the GDDR5, we know. But every design has its own downsides.

When people see 176GB/s they think they have 156GB/s for the GPU and 20GB/s for the CPU, but it doesn't work that way. Every time the memory has to change its electrical state, it loses cycles. Every time you change the job being done, you lose cycles. Every time you can't repeat an access pattern, you lose efficiency. And, of course, GPGPU isn't free, as many people in this thread argue.

Both components hammering the same pool of RAM just decreases its overall performance. This doesn't matter as much for the GPU, but it greatly hinders the CPU.

The way the One is designed, the CPU gets more efficiency on top of the already higher clock. It's not like I'm saying the One has a better architecture than the Four; it's just that some scenarios favour it, and this is one of them.

The sooner you can accept this, the fewer preorders you will have to cancel.

If the game were programmed to take advantage of each console's abilities, there is no way the X1 version would be superior. This means the PS4 version isn't optimised and was more than likely just ported over from the X1 version.
 

Randdalf

Member
Not when you offload the compute-intensive parts of the algorithm to the GPU. It's MUCH faster.

It's more complicated, yes, because the speed at which the GPU and CPU communicate is considerably slower. But even that time cost is offset by the speed gains of using GPU compute.

May I ask why you think this could be done more efficiently on the CPU alone?

GPU compute is also good for decision-making algorithms. I'm heading out of work but will edit this or respond shortly.

The problem I have with doing k-means on a GPU is that it would require frequent (depending on the rate of convergence, I suppose) reduction operations and global synchronisations (which can only be done by re-launching the kernel). Those reductions would also not generally operate on the data set uniformly; rather, one would need to be done for each cluster of items, which could be distributed all over the data set at random.

If you have an absolutely massive data set, then I imagine the compute might overshadow the reductions, but I'm still not convinced a GPU is the platform best suited to this task, which is more about making decisions and moving data around than raw compute.

I absolutely disagree that the GPU is good for decision-making algorithms; it really does not make sense based on how the hardware actually works. Often you'll have a 16- or 32-item-wide SIMD execution pipeline that runs through the data. This means that branching is extremely costly relative to the peak performance you could be getting, because every work item should be performing the same instruction at the same time.
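To make the k-means example concrete, here is a hedged, hypothetical CUDA sketch of just the assignment step, the part that does map cleanly to one thread per point with the warp staying in lockstep. The per-cluster centroid update that needs reductions and a global sync is exactly the part left out, to be done in a second kernel launch or on the CPU.

// Hypothetical k-means assignment step: embarrassingly parallel, branch-light.
#include <cuda_runtime.h>
#include <cfloat>

__global__ void assign_clusters(const float2* points, const float2* centroids,
                                int* labels, int n, int k)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    float best = FLT_MAX;
    int bestC = 0;
    for (int c = 0; c < k; ++c) {                    // every thread walks the same k centroids,
        float dx = points[i].x - centroids[c].x;     // so the warp stays in lockstep
        float dy = points[i].y - centroids[c].y;
        float d = dx * dx + dy * dy;
        if (d < best) { best = d; bestC = c; }
    }
    labels[i] = bestC;                               // the per-cluster mean update still needs a
}                                                    // reduction + global sync: another kernel
                                                     // launch or a host-side pass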
 
Are you kidding me? Are you even reading the comments I linked? "In essentially every way" means virtually and non-virtually.

His other comment (which I also linked) was in the "Sniper Elite 3 Digital Foundry Face-Off" thread. Do you see "virtual" in that title?

And FFS! Why are you ignoring that he said, right in this thread, that there is no excuse for this?

No one is arguing that the PS4's CPU is more powerful; you're just twisting what we're saying so it fits your narrative.

If Ubisoft actually fully utilized both the X1 and the PS4, there is no way in hell the X1 would perform better. So what are Ubisoft doing with the extra 40% of GPU power on the PS4 and the benefits of unified GDDR5 RAM? And if the CPU was bottlenecking this game so severely, why didn't they just lower the NPC count by 5-10%???
 
Tell me how you would manage to make a given architecture, with all those gigaflops and all that GDDR5, run the same code worse than the same architecture with DDR3 and M$ logos all over it.

Oh, wait, moneyhats.

Some day the Sony fanbase will stop blaming devs for everything that goes wrong. That day isn't close.


Still waiting for some dev to come here and tell us all that the PS4's CPU performs better than the Xbox One's. Not that the console is better overall; that much is clear enough. Can you tell the difference?
You're like those people who say that if God were real, he would drop a million dollars out of the sky right then and there, and when it doesn't happen, you use that as proof that you were right all along.
 

Marlenus

Member
So it's Sony's fault for a lackluster API, as far as its ability to run AC:U goes.

Lazy Sony :mad:

As I have said more than once now, it has nothing to do with the quality of the API; they are simply different, and something that is quick and efficient in one may not be in the other, so it requires tuning for each platform.
 
TotalBiscuit about the PC version:

https://www.youtube.com/watch?v=SgpzT5V5Mgs

He usually does the "Let's not play" with terrible and unplayable games. Usually pre-alpha indie garbage.

Just consider that he has an amazing PC rig. If he is having problems with that....

And people are blaming Sony/MS for this.

He's a bit misinformed. AC: Unity has FXAA, MSAA (very taxing), and TXAA (even more taxing than MSAA). He also seems to call anything less than 1080p/60fps bad performance. Benchmarks show the game is CPU-bottlenecked, which is why he can never hit a locked 60fps; it's beyond any current processor. In cutscenes, though, bokeh depth of field tanks GPU performance, which explains the drops there.
 