Yep, clearly it's 900p, just like he said...
Is this sarcasm? You've been corrected many times on this issue, yet you won't concede. Again, NXGamer never said conclusively that the game was 900p; he said it was below true/native HD and in the range of 900p. He was also right about the 30fps cutscenes, and he actually did gamers a huge favour by giving them a heads-up on release day.
DF's article is coming out way after the fact (only today, just after I enquired), when most have already purchased the game and noticed much of this for themselves. At least the Xbox One players could corroborate with NXGamer on the day of release that the game was indeed sub-native.
He was unable to spot the obvious artifacts you get from upscaling only horizontally.
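For illustration, a minimal sketch of why horizontal-only upscaling leaves detectable artifacts: mapping 1344 source columns onto 1920 output columns means some source columns get duplicated and others don't, producing the uneven column pattern pixel counters look for. (Nearest-neighbour sampling is just an assumption here for clarity; the game's actual scaler filter is unknown.)

```python
from collections import Counter

# Nearest-neighbour horizontal scale from 1344 to 1920 columns.
src_w, dst_w = 1344, 1920

# For each output column, which source column does it sample?
mapping = [int(x * src_w / dst_w) for x in range(dst_w)]

# Count how many times each source column is repeated in the output.
repeats = Counter(mapping)

# Some source columns appear once, others twice -> uneven "column widths"
# across the image, which is the telltale sign of a horizontal-only upscale.
print(sorted(set(repeats.values())))  # → [1, 2]
```

The vertical axis, meanwhile, maps 1:1 (1080 to 1080), so edges only show scaling artifacts horizontally.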
I'll let you in on something: none of these Eurogamer editors are pixel counters. They get their pixel information from Beyond3D, or from B3D members who happen to post it on GAF. I'm pretty sure Eurogamer got this figure from an Al Strong post on GAF where he declared 1344x1080/1360x1080. At this point, even in DF's final article, the resolution isn't conclusive/definite, yet people want to throw NXGamer under a bus because he has a good eye and said it first.
The big problem I find with so-called technical editors is when they can't tell that a game is sub-HD, or when they can't differentiate between AA methods, motion blur methods/sample counts, or AO methods.
I believe some people were simply butt-hurt when NX called the lower resolution on the Xbox One version. Many said they would wait for DF's analysis, insinuating that NXGamer is not as credible and pointing out that the dev had said both versions were 1080p/60fps. I would imagine many were actually waiting for DF to publish an article saying the Xbox One version was native 1080p, nails and coffin at the ready for NX. But no, once again he's right; he just offered the information ahead of everyone else to help with release-day purchases.
Dat first post.
Why not just go 900 on One? Killer Instinct looks and runs great.
1600 × 900 = 1,440,000
1344 × 1080 = 1,451,520 (an 11,520-pixel upgrade over 900p)
1360 × 1080 = 1,468,800 (a 28,800-pixel upgrade over 900p)
The difference between these three is negligible relative to performance.
A huge performance/IQ differentiator, on the other hand, is the 633,600-pixel upgrade that 1920 × 1080 provides.
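A quick back-of-the-envelope check of those figures, sketched in Python (the resolutions are just the ones discussed above):

```python
# Pixel counts for the resolutions discussed in this thread,
# with the delta each represents over the 900p baseline.
resolutions = {
    "900p (1600x900)": (1600, 900),
    "1344x1080":       (1344, 1080),
    "1360x1080":       (1360, 1080),
    "1080p (1920x1080)": (1920, 1080),
}

base = 1600 * 900  # 1,440,000 pixels

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels - base:+,} vs 900p)")
```

Running it confirms the deltas quoted above: +11,520 and +28,800 are rounding errors next to the +633,600 that full 1080p costs.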
So NXGamer's analysis was completely in line with DF's. I knew he was right when he said the XB1 version is upscaled. Also, nice to see 8x AF on the consoles, especially PS4 for a game running on Unreal Engine 3. Hopefully 8x AF becomes the standard this year.
I'm waiting on NXGamer's final analysis myself for a more thorough and visual take, but yes, it would seem that AF was sorted after the uproar. I do wonder why they don't go full 16x AF on PS4 given the stronger hardware, though; it's clear that MKX never drops frames during gameplay, so there must be significant headroom.
Either way, the fact that they got the AA solution right here makes your entire little comment totally irrelevant to the article.
I believe they got it right because it was easy to tell: it's the only option available in the PC version, and by extension all versions. I just think they're a bit too predictable with their assumptions when there's less evidence at play, especially when it comes to AA, effects, etc. That's why they're always being corrected by devs, which has to be embarrassing for a tech site. In essence, I don't trust their eye to pick up on things and get them right as much as other comparison sites right now, and that's of their own doing.