And now that people like to talk so much about ESRAM latency... think about the latency when you send data over the internet to be processed by a server far, far away...
And another thing: do people really think MS is going to reserve an equivalent 600 GFLOPS of GPU power in its servers for every X1 sold, just to catch up to the PS4's GPU? It would have been easier and cheaper to include that power in the console from the beginning and make the customer pay for it directly...
These are all just uncomfortable truths (especially the latency one), and there's no room for those in console wars and games PR *cough* I mean journalism, sorry, I choked on my words.
Cloud 'computing' is NEVER going to fit into games unless the server is in your own house.
Devs work their asses off to cut a single millisecond off each part of a frame, and the bright idea is: "I know, guys, let's send this data off to be calculated on another computer hundreds of kilometers away and add another 50-100 ms of input lag." Brilliant. Man, holy shit, think about how uneven the frametimes would be if you had to rely on pinging data to a server, with packet loss and congestion doubling or tripling the latency at random!
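Quick back-of-envelope sketch to put that in perspective (numbers are assumptions for illustration: a 60 fps target and a ~70 ms round trip to a distant server, not measurements of any real setup):

```python
# Back-of-envelope: what one remote round trip does to a frame budget.
# All figures below are assumed values for illustration, not measurements.
frame_budget_ms = 1000.0 / 60   # ~16.7 ms per frame at 60 fps
network_rtt_ms = 70.0           # assumed round trip to a far-away server

frames_lost = network_rtt_ms / frame_budget_ms
print(f"Frame budget at 60 fps: {frame_budget_ms:.1f} ms")
print(f"One round trip eats ~{frames_lost:.1f} whole frames of budget")
```

So a single round trip can burn through several entire frames' worth of time before the result even comes back, and that's before packet loss makes it spike.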
AMD and Nvidia try their hardest to improve the memory bus of a GPU so that data can be accessed and moved as fast as possible.
"I know what we should do!" said a brilliant young Microsoft engineer.
Instead of sending this data through a memory bus at a bandwidth measured in hundreds of GB/s and a latency measured in clock cycles, let's send it over a Wi-Fi connection (with hilarious amounts of packet loss) onto the internet at a bandwidth measured in megabits per second and a latency measured in tens to hundreds of milliseconds.
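Here's the same comparison as a rough sketch, with assumed ballpark figures (GDDR5-class memory bandwidth, a typical home broadband link) rather than any particular console's specs:

```python
# Back-of-envelope: time to move 64 MB of frame data over each link.
# Figures are assumed ballpark values, not specs for any real console.
payload_mb = 64.0

gpu_bus_gb_per_s = 170.0        # assumed GDDR5-class memory bandwidth
internet_mbit_per_s = 20.0      # assumed home broadband throughput
internet_latency_ms = 70.0      # assumed round-trip latency

bus_time_ms = payload_mb / (gpu_bus_gb_per_s * 1024) * 1000
net_time_ms = payload_mb * 8 / internet_mbit_per_s * 1000 + internet_latency_ms

print(f"Over the memory bus: ~{bus_time_ms:.2f} ms")   # a fraction of a ms
print(f"Over the internet:   ~{net_time_ms:.0f} ms")   # tens of seconds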
The mere concept is so fucking stupid, yet games journalists like Edge try to sell it to the plebs anyhow.
Shills.
(Can't wait for someone to confuse rendering a frame with server-side loot/spawns like in Diablo.)