I'm not trying to defend Ubisoft for releasing a rushed game, but optimizing a game this complex is far from trivial. And to their credit, a high-end PC will run the game much better than the consoles, which can't be said of all games.

What does that say when machines of that spec struggle to hold a solid 40-60 fps? A piss-poor optimised engine. The PS4 vs X1 debate over AC Unity is irrelevant in that context.
The PS3 CPU was very efficient when running very specific code. It sounds great to say those "lazy programmers" should just do their job, but it was obviously bad for game development when you look at the early years.

Yes, the PS3 had the Cell processor, which Sony claimed was on the level of a supercomputer. It was notoriously difficult to develop for, though. They had similar problems with the PS2's Emotion Engine (its CPU), which was also awkward to program for; several early games ended up looking better on Dreamcast (Dead or Alive 2 being one of the more striking examples, if I remember correctly).
PS4 is not supposed to be more difficult to develop for though!
Does anyone have an idea how Digital Foundry does those frame tests? I've searched a bit on the internet, but I couldn't find an answer.

I'm pretty sure they use capture hardware that reads every frame and detects when a new one is displayed. Nvidia has FCAT, which is used in some PC benchmarks.
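This isn't DF's actual tooling, but the core idea is simple enough to sketch: capture the 60 Hz output and compare each captured frame with the previous one to decide whether the game presented a new image. The function name and the tolerance parameter below are illustrative assumptions, not anything taken from FCAT or Digital Foundry.

```python
import numpy as np

def is_new_frame(prev: np.ndarray, curr: np.ndarray, tolerance: int = 0) -> bool:
    """Return True if this captured frame differs from the previous capture.

    A small tolerance can absorb capture noise or compression artifacts.
    """
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return int(diff.max()) > tolerance

# Synthetic 60 Hz capture of a game rendering at 30 fps:
# every rendered frame shows up in two consecutive captures.
rendered = [np.full((4, 4), i, dtype=np.uint8) for i in range(5)]
captured = [frame for frame in rendered for _ in range(2)]

flags = [True] + [
    is_new_frame(captured[i - 1], captured[i]) for i in range(1, len(captured))
]
print(flags)  # [True, False, True, False, ...] -> a new frame every other refresh
```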
With vsync, a new frame can only be displayed every 1/60th of a second. So if frame n equals n+1 but differs from n-1 and n+2, the same image was on screen for 2 × 1/60th of a second, i.e. 33 ms (a 30 fps frametime). If n equals n+1 and n+2 but differs from n-1 and n+3, it was on screen for 3 × 1/60th of a second, i.e. 50 ms (20 fps). If you look at the frametime graph you'll notice it oscillates between 33 and 50 ms whenever the framerate sits between 20 and 30 fps. Averaging those frametimes over a second gives you the fps number, and that's why frametimes are a better measurement than fps.
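Following that logic, here is a minimal sketch (assuming a 60 Hz capture and per-refresh "new frame?" flags like those from the comparison step above) that turns run lengths of repeated frames into frametimes in milliseconds and an average fps over a second. The example data reproduces the 33 ms / 50 ms oscillation described above.

```python
CAPTURE_INTERVAL_MS = 1000.0 / 60.0  # one 60 Hz refresh is ~16.7 ms

def frametimes_ms(new_frame_flags):
    """Turn per-refresh 'new frame?' flags into per-frame display times in ms.

    A frame held for 2 refreshes -> ~33 ms (30 fps); 3 refreshes -> 50 ms (20 fps).
    """
    times, run = [], 0
    for is_new in new_frame_flags:
        if is_new and run:
            times.append(run * CAPTURE_INTERVAL_MS)
            run = 0
        run += 1
    if run:
        times.append(run * CAPTURE_INTERVAL_MS)
    return times

# One second of 60 Hz capture where frames alternate between being held for
# 2 and 3 refreshes, i.e. frametimes oscillating between ~33 ms and 50 ms.
flags = ([True, False] + [True, False, False]) * 12
times = frametimes_ms(flags)
print([round(t, 1) for t in times[:4]])   # [33.3, 50.0, 33.3, 50.0]
print(1000.0 * len(times) / sum(times))   # average fps over that second: ~24
```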