The 7790 is the closest approximation, because it's a lot more architecturally similar to the Xbox One GPU than a 7770 is. People are focusing on the gap between a 7850 and a 7770 and using that as a guide, but considering how the Xbox One GPU has a 7770 beat, the numbers would obviously look even better for the Xbox One GPU by comparison. Plus it will benefit from console-level optimizations. Even in the Edge article itself, they refer to roughly 20fps at 1600x900 before any optimizations whatsoever. The Xbox One, being a system that has to be more carefully managed by developers, would obviously suffer more from a lack of proper optimization, while the more powerful PS4, which is simpler to develop for, would fare much better in such an un-optimized scenario.
Keep in mind that similar things could be said about the 7850 as a representation of the PS4 - you're talking about a GPU with two fewer CUs.
The point is, neither GPU in the example gives the full, true picture - and that's beside the point in the first place. No one (reasonable) would use these metrics as an actual judge of either of the new systems; they simply do not, and cannot, tell the full story. They're just an example meant to give a very general idea of the performance difference - or at least of the resolution difference needed to achieve rough performance parity on these GPUs.
Again, it's not meant to be wholly accurate. If you want that, there was another post with a nice write-up detailing the exact GPUs, and the respective over/underclocks needed on them, for a more accurate test. Even then, it would still be an inherently muddy comparison, and I'm not sure the effort would be worth it - though it'd still be interesting to see nonetheless.