Wow, what the hell, that's rough. They should have taken longer or just dropped the res/effects if they had to. That's some N64 frame rate there.
If there isn't a day-one patch that radically improves it, I hope users take Square to task for letting it release like that.
I'll get the game on PC soon, but with Xenoblade coming out soon and Fallout still on the table, I think I'll just wait and see whether the PC version has the memory leaks too.
Dropping res is not the ultimate answer for everything; it's only the cheapest and most hassle-free way to bring up perf. This game clearly shows the devs needed to be hassled into console optimization. In many cases it's not that the hardware isn't enough, it's just that they don't make proper use of it.
He absolutely can't without an SDK and the source code. He can only guess, and sadly that kind of thing is what he's sort of known for. Usually the devs politely correct him, but I haven't seen that happen yet.
I really wish he would just cover what he actually knows, because each time he does this it spreads FUD, and the better the knowledge, the better armed consumers are. Luckily others have already pointed it out.
Again, that's not to say it isn't true; it's just that when you're doing analysis and talking about this stuff, it's best to actually be factual. Especially in this instance.
He didn't just say it in the text of an article, he actually shows his assumptions in video form: "Didn't you watch the video where the game locks down and goes to 0fps?"
This generation is so not lasting 8 years.
The crazy thing, Amy, is that these consoles could be decked out with overclocked i7s and 12GB GPUs and you would still get releases like this. Some devs have to realize that developing for PC and for consoles are two different things; you have to spend time on the console versions to get them to acceptable levels.
Simply porting code over is not enough, and it's the reason we're getting such low presets in console games that could have been higher, or framerates that could have been much smoother. The Borderlands devs getting a 20+fps improvement via their patch to that package says a lot about how some games are shipped on consoles.
Arkham Knight PC runs at over 30fps even on weaker GPUs, so not really. It's 60fps that's hard to maintain.
Weaker GPUs than what exactly? The consoles? Could you link me to where GPUs weaker than the consoles are running AK better?
It's also funny that when an open world game is properly done on consoles and PC gets the bad port, the devs are the worst; yet some people never entertain such a thought when it's the console versions getting that treatment, especially when weaker GPUs run said games much better than the consoles do...
Mad Max was a better looking game to me and it ran like a dream. Shame that the original JC2 team didn't get to make this instead of Mad Max. =/
I agree totally. JC has awesome explosions, but the character models and its IQ issues are really bad on consoles. It also has this stylized, almost cartoony look, with its varied color palette and less realistic/detailed world objects.
All the whining about this, and the average framerate in the video is 27-something... I'm stuck wondering how that's bad? The crashing, the long load times: yes, bitch about those. But a 27fps average is hardly a slideshow.
That's like saying said game only falls to the teens for a second here and there. It also falls to the low 20s, and it's always around 24fps and below in any action scene. Is that fine to you? Of course, when he's just running across an empty road with no NPCs, engaging no one, it will be locked to 30fps, and that tallies into the average framerate you're seeing in the stats. That does not mean the game is 27fps throughout.
Even then, are our standards really deteriorating? A 27fps-average game is nowhere near acceptable; it never was. It's just like Ryse, which averaged around 25fps, and people pretended that was fine. At what point is "Framerate is King" important, and at what point is it not?
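To make the point about averages hiding dips concrete, here's a quick back-of-the-envelope calculation (the per-second numbers are hypothetical, just chosen to match the "27-something average" in the video):

```python
# Hypothetical per-second fps samples from a benchmark run:
# the empty-road stretches sit locked at 30fps, the action dips hard.
samples = [30] * 6 + [24] * 3 + [18] * 1  # 10 seconds of gameplay

avg_fps = sum(samples) / len(samples)
print(avg_fps)  # 27.0 -- "27-something average" despite dips to the teens

# A fairer summary looks at the worst frames, not the mean:
worst = min(samples)
share_below_30 = sum(1 for s in samples if s < 30) / len(samples)
print(worst, share_below_30)  # 18 0.4 -- 18fps lows, 40% of the run under 30
```

That's why lows and percentiles matter more than the headline average: the exact same run reads as "locked 30 most of the time" or "27fps average", depending on which number you quote.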
Is it crazy though? Every time there's a debate about what constitutes an acceptable framerate, tons of people defend poorly performing games: "I used to play games at 15 fps, 30 with drops is fine", "I survived Blighttown, I'm fine with this" and so on. Plus quite a few of those poorly performing games sell millions on consoles. So I ask again: are publishers crazy in thinking that poor performance is acceptable, or are they actually right?
Even in this thread, I've noticed some people saying they have been playing for a while with no issues. It's similar to the Fallout 4 thread, where some people were saying it's fine. The sales of that game surely did the talking, though. I'm not sure Bethesda is even worried about its performance issues, tbh, or ever will be.
Seems like a game like this should really be taking advantage of GPU compute for physics, especially on PS4. Pretty stupid if they really are using the terrible CPUs for physics.
Yet another case of devs underutilizing the hardware they're given, and then we have a select bandwagon rushing in to say "these consoles are weak with their smartphone CPUs"...
I'm not even convinced it's a CPU issue, tbh. Look at all the crashes, frame locks, memory leaks, and awful load times. It's not even like the game is a looker on consoles outside of some nice explosions: awful character models, jaggies galore, planes and cars that look like plastic...
So let's set aside the fact that they're not making use of GPGPU, which is there for even better performance: they have a million issues to resolve, and they can resolve them just by optimizing against the basic hardware profile, sans async compute.
I've heard they have patches underway, or that the game is not complete in its current state; we'll see how it pans out. One thing is for sure: this should be the perfect game for GPGPU use, since it's heavy on physics. You would think that's the first thing the devs would prioritize, knowing their game is an open world title with lots of physics calculations and heavier CPU use.
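For what it's worth, the reason physics maps so well to GPU compute is that it's embarrassingly parallel: the same integration step runs independently on every particle or debris chunk. A rough sketch of that data-parallel shape (plain NumPy standing in for a compute shader here, purely illustrative, not how any actual engine does it):

```python
import numpy as np

# Thousands of debris particles, each with a position and velocity.
# On a console this whole update could be one compute dispatch with
# one GPU thread per particle; NumPy's vectorization mimics that shape.
N = 100_000
pos = np.zeros((N, 3), dtype=np.float32)
vel = (np.random.randn(N, 3) * 10.0).astype(np.float32)

GRAVITY = np.array([0.0, -9.81, 0.0], dtype=np.float32)
DT = np.float32(1.0 / 30.0)  # one 30fps frame

def step(pos, vel):
    # Semi-implicit Euler: identical, independent math per particle --
    # exactly the workload async compute can overlap with rendering.
    vel = vel + GRAVITY * DT
    pos = pos + vel * DT
    # Crude ground plane: clamp to y=0 and kill vertical velocity on impact.
    hit = pos[:, 1] < 0.0
    pos[hit, 1] = 0.0
    vel[hit, 1] = 0.0
    return pos, vel

pos, vel = step(pos, vel)  # no particle ends the frame below the ground
```

The CPU's job shrinks to dispatching the update and handling the few interactions that genuinely need serial logic, which is exactly the split you'd want on these consoles' weak CPU cores.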
More and more, I'm thinking that many devs simply develop on PC and just port the code over to consoles without much care afterwards. Few third-party devs actually cater to the console hardware and try to use what it has, or try to squeeze as much performance out of it as they can. Perhaps Alexandros has a point: just do any lacklustre port, these console guys can't tell the difference or won't care anyway, and we'll get the most sales from consoles with minimal effort...
It would appear Rocksteady was the only dev to truly shine on consoles recently (I mean really set the bar). They tried to make the most of the hardware, developing the XB/PS4 versions in-house and giving those platforms their due attention; it may also be why the AK port was more difficult to transition properly over to PC. From what I'm seeing, there is a bevy of games that run at double the framerate on even entry-level GPUs, with better effects too, and yet the console offering is substandard, even on the console with the better GPU.
Look at Witcher 3 on entry-level PCs against the mess of presets and load times it is on consoles. Some people will still convince themselves that the low-preset version with awful load times is now acceptable because the devs patched it up to a 26-27fps average in the swamps six months later. JC3, Fallout 4, Blops 3... I mean, come on. Yet JC3 is no different from the million other games like it on consoles with subpar framerates. It's also baffling when people say "oh, this franchise is running better on average than last gen". By what, two frames? It's just sad that we continue to accept such efforts and deem them fine, when standards are supposed to continuously rise.
Are people really trying to imply that an extra 2 or 3 frames on average for an AC game at a subpar resolution is all we envisioned over last gen for that franchise, when these consoles have so much more GPU power and much more RAM? And all these games that run at 60fps with better effects and presets on a 750 Ti: do they only warrant 30fps on consoles with lowered effects, where they still fall below 30fps?
At first I thought PC-like console hardware would make things easier for devs, but it seems the easier you make it for them, the easier they want it, not optimizing for consoles at all. Perhaps if it were a 16-core Cell CPU, they would have no choice but to learn how to get their games running at higher than 5fps after porting. Since consoles form the biggest slice of sales for the majority of these games, the PS4 should be the lead platform for most multiplats; it's the only way these titles will be halfway decent in terms of presets and performance.
I guess a lot of developers went a bit overboard with their scope for next-gen games, not anticipating the CPU issues in open world games (which are becoming the norm, it seems).
Games like Arkham City show how good a game can look and run when the scope is appropriate.
Yet there are open world games that run at 60fps, and ones that average around 40fps in ISS/FL, and they have pretty good load times. It's really up to the developer to plan what type of open world game they want to make and balance a proper resolution against effects, load times, performance, and the best presets they can muster.
At this point, all games should target 900p on the XB1 and 1080p on the PS4, then optimize until you get the best presets and solid performance to match.