Muppet of a Man
There is no excuse for this.
A wild Matt approaches....
Welp, one and done. PS4 should not perform like it does. Ubisoft confirmed incompetent. Pack it up. Nothing to see here.
There is no excuse for this.
That's the point I'm trying to make: it's not that simple. These GPUs are largely the same; the PS4 has more execution units, but they share a lot of the same hardware, and all of that shared hardware runs faster on the Xbone due to its higher clock.
Of course there are. Plenty of reasons actually.
Like poorly performing SDKs with high CPU overhead, virtualization costs, etc. Have you not noticed the trend where, as soon as MS started improving their tools, performance started creeping closer to the PS4's? The PS4 will probably still outperform the Xbone, but early titles were not showing the real baseline performance of the Xbone hardware.
That's subjective as fuck.
I think games like FH2 and SO have nothing to be ashamed of when compared to PS4 exclusives; in many ways they even outdo them. But it's a bit pointless arguing over that.
I will try to simplify:
- The cutscenes are largely GPU bound. The PS4 drops frames as well, so it's not being capped or anything. Still, the performance delta never comes close to the 40% that the PS4's extra flops would lead you to believe. Why? It might be because parts of the rendering pipeline (for example, setting up vertex data as fragments for the pixel shader) run faster on the Xbone, which can make up for the difference. It might also be that the shaders they are using are bound by bandwidth or some resource other than flops. Either way, the 40% isn't showing here, while, curiously, a 10% overclock is fairly often netting more than 10% extra frames for the Xbone during gameplay.
- Using the smoke grenade causes frame drops on both platforms, but on the PS4 not only is the drop more severe, it's also the lowest point for the console (18fps). The PS4 has twice the number of ROPs, so why does this happen? It's hard to pinpoint a culprit without any profiling data, but looking at the architectures might give an answer: the eSRAM on the Xbone provides, at its theoretical max, more bandwidth than the PS4's entire GDDR5 pool, but that bandwidth is only reachable when writing and reading from it at the same time, something that a huge curtain of smoke might very well do. The game uses deferred lighting, has tons of post-processing which relies on screen space, and has some alpha effects; it's not out of the ordinary to say those are often bandwidth bound, so in a scenario like that the eSRAM might be an advantage for the Xbone, despite it having fewer ROPs.
See what I'm talking about? The PS4 might be more powerful, but in one scenario it doesn't outperform as much as it should, and in the other it's being outperformed, despite theoretically having more hardware to deal with the issue.
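For reference, the raw numbers behind the argument above can be sketched from the commonly cited public specs (CU counts, clocks, and bandwidth figures — not profiling data from this game):

```python
# Back-of-envelope sketch using commonly cited public specs,
# NOT profiling data from this game.

def gcn_tflops(compute_units, clock_ghz, lanes_per_cu=64, ops_per_cycle=2):
    """Theoretical FP32 throughput of a GCN GPU:
    CUs * 64 shader lanes * 2 ops (fused multiply-add) * clock."""
    return compute_units * lanes_per_cu * ops_per_cycle * clock_ghz / 1000.0

ps4_tf = gcn_tflops(18, 0.800)   # ~1.84 TFLOPS (18 CUs @ 800 MHz)
xb1_tf = gcn_tflops(12, 0.853)   # ~1.31 TFLOPS (12 CUs @ 853 MHz, post-overclock)
flop_gap = (ps4_tf / xb1_tf - 1) * 100
print(f"PS4 {ps4_tf:.2f} vs XB1 {xb1_tf:.2f} TFLOPS -> +{flop_gap:.0f}% raw flops")

# Bandwidth side of the smoke-grenade point: the eSRAM peak only
# beats the PS4's GDDR5 when reads and writes overlap.
gddr5_gbps      = 176.0   # PS4 unified GDDR5
esram_peak_gbps = 204.0   # XB1 eSRAM, simultaneous read + write
ddr3_gbps       = 68.0    # XB1 main DDR3 pool
print(f"eSRAM peak {esram_peak_gbps:.0f} vs GDDR5 {gddr5_gbps:.0f} GB/s")
```

The flop gap comes out around 41%, which is the "40%" figure the post is weighing against the observed frame rates.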
Yeah I know lol, was just trying to crack a joke. Which failed!
Jesus fucking christ, Ubisoft. Alright, so in a perfect timeline:
- This game is delayed by one year to get its shit sorted.
- Rogue, which is apparently far better anyway, is cross-gen and becomes this year's AC game.
If this were a few months ago and you were an exec at Ubi, you'd think you'd have seen which way the wind was blowing and done the above.
I meant to say that I've never played an Assassin's Creed game, and thank god I'm not starting with this one.
No.
The PS4 reserves cpu time for the OS, too.
Why do people always blame Microsoft for EVERYTHING on this forum?
I'm sick of seeing constant "Microsoft money hatting" this and "developers don't want to upset Microsoft" that. It's like Sony is some tiny penniless company that developers don't care about, or a company that wouldn't do something about this themselves if there were any truth to it.
Jesus, some of you never give up.
Lol
Well, yes, obviously, but I was under the impression the Xbox was running 3 operating systems at once, along with the Kinect features, hence why they'd decided to overclock their identical CPU to compensate.
Parity.
Lol, no.
Fuck it, this is the last Ubisoft game I'll ever buy.
rofl... Oh, the gap is closing. Good.
Sony might have to do something about this to avoid future issues with CPU-bound games; they can up the clock speed and release some CPU reserves from the OS. I imagine there's more they can release than Microsoft, since they have dedicated hardware for streaming, capture, etc. The PS4's version of Jaguar is capable of using a Turbo mode, which should set the CPU clock to 2.75 GHz.
Wow.
So my takeaway, as someone completely ignorant when it comes to hardware issues, is that the bottleneck is the NPC density throughout the city.
That just seems stupid. I always thought urban areas in AC games seemed full and crowded in a way that made other open-world games seem barren. I'm not sure where the logic lies in adding so many more NPCs. It seems like a completely unnecessary reason to sacrifice performance.
For those of you who understand this stuff way better than me, is the density of NPCs in the city something that could be reduced and patched accordingly to help performance, or is it something far more complicated?
NPC count and AI was a huge selling point for the game, so dialing that back probably wasn't an option.
You've missed some brilliant games.
So now you show your true face, Phil. Stop using the fake account.
I think you got it wrong... It should be the first of many Ubisoft games you won't buy...
Dude that's just a limitation of the Animus.
Wow.
No. A 9% increase is still a 9% increase when its competitor has the same number of cores; you don't multiply it.
I have two cores. You have two cores. I increase my clock speed 20%. Does that now make me 40% faster than you? No, it's still 20 freakin' percent, because we still have the same number of cores.
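The actual consoles illustrate the same arithmetic. Assuming the commonly cited clocks (1.6 GHz on PS4, 1.75 GHz on Xbox One, eight Jaguar cores each), a quick sketch:

```python
# A clock bump does not multiply across cores: aggregate throughput
# scales with cores * clock, so with equal core counts the per-core
# percentage IS the whole-CPU percentage.

cores = 8                      # both consoles: 8 Jaguar cores
ps4_ghz, xb1_ghz = 1.6, 1.75   # commonly cited CPU clocks

per_core_gain = (xb1_ghz / ps4_ghz - 1) * 100                       # ~9.4%
whole_cpu_gain = (cores * xb1_ghz) / (cores * ps4_ghz) * 100 - 100  # ~9.4%

print(f"per core: +{per_core_gain:.1f}%  whole CPU: +{whole_cpu_gain:.1f}%")
```

The `cores` factor cancels out of the ratio, which is exactly the point being made above.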
Past me: "Well i need mah Rainbow Six."
I mean, I already bought it, but I won't be buying any more.
I learned PR guys are that "innocent", lol.
Am I being trolled by Ubisoft after a failed trolling attempt?