
Phil Spencer: Next Cloud Demo being planned will show BW/CPU/Latency Info

MrJoe

Banned
The point of the statement is that the same game (let's say Mirror's Edge or Borderlands 2) can have PhysX "on" and "off" modes, both being viable games but one with "more".

A game with cloud-based physics processing could also have "on" and "off" modes, with one (online) using remote computing and the other (offline) being more basic local physics (Earth Defense Force-style).

Do you understand now?

yes, your thorough explanation of the difference between "on" and "off" was adequate. my point was the massive differences in scale.

take the PC I'm using now as an example. the GPU has a bandwidth of 31.51GB/s (aka 32,266.24MB/s) both ways (up and down.) a relatively good internet connection (to the cloud) has about 20MB/s down, and usually much slower up. even if you doubled or tripled that, it's like comparing a new Corvette to a new tricycle. thus the massive differences between "local" and "remote." local is ALWAYS better.
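to put a rough number on that gap (a back-of-envelope sketch using my figures above, nothing more; the 20MB/s is just an assumed "good" line):

[code]
# Back-of-envelope: local GPU memory bandwidth vs. an internet connection.
# Figures are the ones from the post above, not measurements of any cloud setup.
gpu_bw_mb_s = 31.51 * 1024      # ~32,266 MB/s local
net_down_mb_s = 20.0            # assumed "good" connection in MB/s
net_down_if_mbit = 20.0 / 8     # 2.5 MB/s if the line is really 20 megabits

print(f"local is ~{gpu_bw_mb_s / net_down_mb_s:,.0f}x faster")       # ~1,613x
print(f"local is ~{gpu_bw_mb_s / net_down_if_mbit:,.0f}x faster")    # ~12,906x on a 20 Mbit line
[/code]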
 

Dredd97

Member
Is this really gonna become a 'thing' now? a game that isn't an MMO requiring the internet so you get all the bells and whistles because MS cheapened out on their GPU budget?

Yes, it's going to work so spectacularly well isn't it...
 
They've needed to do this for a while.

There has been so much BS said about the cloud that now even the real parts of it are met with extreme skepticism. Hopefully a very straightforward demo with all the little details explained will convince most non-believers.
 

jem0208

Member
How about just showing the game.
Isn't it slated for 2016. The cloud stuff is probably all they have to show of it yet.

Is this really gonna become a 'thing' now? a game that isn't an MMO requiring the internet so you get all the bells and whistles because MS cheapened out on their GPU budget?

Yes, it's going to work so spectacularly well isn't it...
They'd have to put a pretty beast GPU in the thing if they wanted to do what they've been demoing locally...
 

KidBeta

Junior Member
Have Microsoft ever mentioned what sort of network these demos are running on? Whilst I don't want to rule everything out, I just find it weird that I haven't seen anything running over an actual, decently realistic (6mbit/etc) internet connection yet.
 
You know, not everyone has to get every game. If this game requires an internet connection it's not the end of the world; there are hundreds of games that don't.
 
Network based demo in a controlled environment means NOTHING.

What they need to do is release a small free program to Gold members that utilizes Azure for graphics.

This way, they get free beta testers and will be better able to predict what will happen when they do release a full game utilizing Azure for graphics. Not to mention optimize their games around the gathered data.
 
there is no correlation between what a GPU (local) can do and what the "cloud" (remote) can do. your comparison is highly specious. but then you already knew that; "I know this is different."
Actually in this case - calculating physics - it is EXACTLY the scenario that's been demoed to date, so nothing specious about it.
 

thematic

Member
neogaf is schizophrenic...

It almost seems like no matter what MS says we want the opposite here.

Neogaf: BS! CLOUD TECH IS BS! SHUT UP AND SHOW PROOF!
MS: Okay, we're working on a demo for you. Thanks for the feedback.
Neogaf: Conspiracy! Conspiracy! You're going to try and fool us!
Neogaf: Why don't you just show the damn game, jesus!

sigh...

That happens because MS is promoting the "cloud" as if it were powerful enough to compensate for their weaker GPU choice.

The cloud isn't some new tech. Its main function currently is storing files/save data. The computational side is only useful in "latency-proof" games like Clash of Clans.

Even if they managed to achieve a MAJOR performance gain, Sony/Nintendo could easily create their own "cloud".
 
I'm just staggered by the tone of some of the replies in this thread. It's as if, for some people, they are simply unable or unwilling to concede even a 1% chance that cloud computing might actually be a real thing.

Games/tech go from idea to proof of concept then the usual alpha/beta cycle, etc. What was shown at //Build was clearly a proof of concept demo. I'm sure at some point there will be a large scale test to gain telemetry/real world feedback but doesn't look as if they're anywhere near that stage yet.

Suggesting a demo while being clear about latency/bandwidth/etc. is a great next step; it doesn't have to be wrapped up as a fully functional vertical slice, or rolled out to everyone with an Xbox, to be valid. It just seems to be too early for that.
 

RedStep

Member
yes, your thorough explanation of the difference between "on" and "off" was adequate. my point was the massive differences in scale.

take the PC I'm using now as an example. the GPU has a bandwidth of 31.51GB/s (aka 32,266.24MB/s) both ways (up and down.) a relatively good internet connection (to the cloud) has about 20MB/s down, and usually much slower up. even if you doubled or tripled that, it's like comparing a new Corvette to a new tricycle. thus the massive differences between "local" and "remote." local is ALWAYS better.

What does that have to do with anything? The comment wasn't "online is better than offline", it was (and I quote):

SatansReverence:
"They may be able to have an online/offline mode. Just very basic destruction while offline."


Hawks269:
"It could be like that. On my gaming PC and I know this is different, but using Nvidia PhysX you can turn it on and off and the difference is pretty substantial. A good example was Mafia 2 with and without PhysX. Perhaps, off-line, there will still be some destruction, but no where to the degree when connected to the servers for the cloud compute."


You're over here talking about whether bicycles or corvettes are better, meanwhile the conversation is about what might happen to the physics in a game if they were intended for remote computation but that's not available.

Also, bandwidth is not the limiting factor in either case, not sure what that has to do with anything. PhysX is available now and not everybody uses it, because the cost in many cases is greater than the benefit. If you had the choice (say on some existing PC game) between using PhysX at 30fps or a "cloud" solution and keeping 60fps, which would you take?
 
I'm just staggered by the tone of most of the replies in this thread. It's as if, for some people, they are simply unable or unwilling to concede even a 1% chance that cloud computing might actually be a real thing.

Games/tech go from idea to proof of concept then the usual alpha/beta cycle, etc. What was shown at //Build was clearly a proof of concept demo. I'm sure at some point there will be a large scale test to gain telemetry/real world feedback but doesn't look as if they're anywhere near that stage yet.

Suggesting a demo while being clear about latency/bandwidth/etc. is a great next step; it doesn't have to be wrapped up as a fully functional vertical slice, or rolled out to everyone with an Xbox, to be valid. It just seems to be too early for that.

Oh "cloud computing" is real alright**, what we aren't convinced of is it's application to games - which are effectively real-time applications with time constraints.

** "Cloud Computing" is just a buzz word. We have had cloud computing since the 80s back when computers worth a damn cost an arm, a leg and your first born. People rented "computer time" back then. You connect via a "dumb terminal" (basically just a monitor with basic communications circuitry) to the mainframe else where. That went the way of the dodo when personal computers (aka PCs) became affordable and people could have their own computer!
 
Oh "cloud computing" is real alright**, what we aren't convinced of is it's application to games - which are effectively real-time applications with time constraints.

** "Cloud Computing" is just a buzz word. We have had cloud computing since the 80s back when computers worth a damn cost an arm, a leg and your first born. People rented "computer time" back then. You connect via a "dumb terminal" (basically just a monitor with basic communications circuitry) to the mainframe else where. That went the way of the dodo when personal computers (aka PCs) became affordable and people could have their own computer!

Cloud computing isn't a buzzword though (it's two, hah!)... But seriously, the key point with cloud computing is the "elastic capacity" side of it. That's what differentiates it from simply a giant pool of dedicated servers.

The elastic scale side of it is what makes it easier for companies to adopt (no need to build out/rent a fixed number of servers which either sit there idle or lead to launch day people unable to connect).

Then factor in the sdk support for making use of the capacity and you have the things which differentiate it from the more traditional approach.
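
If it helps, here's a toy sketch of the "elastic" idea (purely illustrative Python with made-up numbers, not any real Azure API): capacity follows demand instead of being fixed up front.

[code]
import math

# Toy illustration of elastic capacity: fleet size tracks current demand, so
# you aren't paying for idle servers on a quiet night or melting down on
# launch day. All numbers are made up for illustration.
def desired_instances(current_players: int,
                      players_per_instance: int = 500,
                      minimum: int = 2) -> int:
    return max(minimum, math.ceil(current_players / players_per_instance))

for players in (1_000_000, 40_000, 3_000):   # launch day, steady state, 4am
    print(f"{players:>9} players -> {desired_instances(players)} instances")
[/code]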

Source: Do this for a living
 

Alx

Member
I believe what we need to see:

* The specific conditions where it "works".
* How the software reacts to a dropped connection (local or server).
* How the software reacts to a slow or lagging connection.
* Assuming it handles these cases, how much development/design work this takes.
* Assuming the development work is viable, what restrictions this places on interactivity with the chosen effects. Will they be cosmetic only (because you can't guarantee when and where the changes will appear to the player), and hence better done as pre-calculated scenes?

A demo covering all that would set the record straight.

If it's a demo built "for the internet", I don't expect to get that many details... it would be different if it were a paper for something like GDC or SIGGRAPH, of course (but those will probably come soon).
 

ZehDon

Member
I'm just staggered by the tone of most of the replies in this thread. It's as if, for some people, they are simply unable or unwilling to concede even a 1% chance that cloud computing might actually be a real thing...
Well, cloud computing is 100% a real thing, however it creates problems for people. The destructibility demo was a nice piece of tech - but it'll add an internet requirement to any game that uses it. This was a major issue when the Xbone was announced, as you might recall.

That demo was also devoid of anything resembling gameplay. Tracking physically accurate particles using cloud computing is one thing. Interacting with a realtime game on the other side of the world using full destructibility is something else. More than likely, given the speed of light issue, cloud computing will track the generated destruction particles, but they won't have an impact on the gameplay. In that demo, the processing issue was tracking the physically accurate movement of the generated particles - not the actual destruction itself, which the offline Xbone could handle with relative ease.

So, it shouldn't be too hard to concede that the majority of gamers don't want to add an internet requirement for better particles. It's cool, but it's ultimately not going to change anything that actually matters.
 

Alx

Member
So, it shouldn't be too hard to concede that the majority of gamers don't want to add an internet requirement for better particles. It's cool, but it's ultimately not going to change anything that actually matters.

Maybe we should wait for more information on what use "the cloud" will be for the games, before reaching those conclusions. Of course if it is only for cosmetic effects, it may not be worth adding an online requirement (although it would be easier then to offer an offline alternative, with the equivalent of "low settings").
But if it is used for complex environment destruction that couldn't be handled by an offline console, like the demo and trailers suggest, then it could "change things that actually matter".
I don't think people at MS or Cloudgine are stupid; they know that if the first examples of their tech aren't relevant, soon enough another developer will show a similar demo running offline on a regular console and say "look Ma, no cloud!".
 

SiRatul

Member
Why does he look away from the falling structure each time he initiates the destruction in that cloud demo? It seems like there are some frame dips starting to occur and then he just moves on to the next thing.

Well anyway, I really wanna see what they can do with this, but I'll only believe it when I see it used in real time in a game.
 
Why does he look away from the falling structure each time he initiates the destruction in that cloud demo? It seems like there are some frame dips starting to occur and then he just moves on to the next thing.

Well anyway, I really wanna see what they can do with this, but I'll only believe it when I see it used in real time in a game.

Because he wanted to blow other stuff up. They posted another demo later where they didn't look away, to avoid any conspiracy theories.
 

RulkezX

Member
Does this sort of thing mean Crackdown etc. are going to come with a required net connection and upload/download speed requirement on the box?
 

EmpReb

Banned
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.
As someone who lives in Nebraska... you really don't know how normal this state really is... and we do have decent internet everywhere now, too.
 

Mastperf

Member
yes, your thorough explanation of the difference between "on" and "off" was adequate. my point was the massive differences in scale.

take the PC I'm using now as an example. the GPU has a bandwidth of 31.51GB/s (aka 32,266.24MB/s) both ways (up and down.) a relatively good internet connection (to the cloud) has about 20MB/s down, and usually much slower up. even if you doubled or tripled that, it's like comparing a new Corvette to a new tricycle. thus the massive differences between "local" and "remote." local is ALWAYS better.

A 20 megabyte per second internet connection is far beyond relatively good. You sure you don't mean 20 megabit per second?
 

Stinkles

Clothed, sober, cooperative
Is this really gonna become a 'thing' now? a game that isn't an MMO requiring the internet so you get all the bells and whistles because MS cheapened out on their GPU budget?

Yes, it's going to work so spectacularly well isn't it...

Young man yells at cloud.
 

Dabanton

Member
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.

Tbh, what value would there be in 'cooking' the numbers? They need this to be above board so people can see it's viable.
 

Dredd97

Member
Isn't it slated for 2016. The cloud stuff is probably all they have to show of it yet.

They'd have to put a pretty beast GPU in the thing if they wanted to do what they've been demoing locally...

Then make games that suit your GPU budget, not one that will fall over the second you lose your net connection

Young man yells at cloud.

I'm hardly young ;) but thanks. I'm not complaining about the tech, I'm complaining about the fact that MS are creating a game that requires additional computation from a remote server because they got caught with their pants down and are desperately trying to catch up with Sony...

how long before people start expecting 1080p with cloud, 900p without cloud...

it's just bloody ridiculous...
 
One thing the cloud could be used for is item locations in open world games. For example, Elder Scrolls games are notorious for poorly managing memory when it comes to item locations, leading to the eventual slowdown in Skyrim on PS3 when too many items were moved from their original locations. Those item locations and information could be stored in the cloud; if you disconnect you'd just not see them, or have fewer junk items in the world.
 

Alx

Member
I'm complaining about the fact that MS are creating a game that requires additional computation from a remote server because they got caught with their pants down and are desperately trying to catch up with Sony...

Not everything is about the console wars, you know... At the moment, most of the MS strategy is based on "the cloud", one way or another. Azure is an expanding and lucrative activity; they put Office and Visual Studio "in the cloud", plus music and video services, file storage... That's even the main focus of Satya Nadella, their new CEO (probably one of the reasons he got the job).
From the beginning the Xbox One was a "cloud console", for game saves, recorded videos, and even retail game licenses (with the old DRM). Cloud computing is just another aspect of that logic, and certainly not something you improvise at the last minute because you need more performance.
 
Cloud computing isn't a buzzword though (it's two, hah!)... But seriously, the key point with cloud computing is the "elastic capacity" side of it. That's what differentiates it from simply a giant pool of dedicated servers.

The elastic scale side of it is what makes it easier for companies to adopt (no need to build out/rent a fixed number of servers which either sit there idle or lead to launch day people unable to connect).

Then factor in the sdk support for making use of the capacity and you have the things which differentiate it from the more traditional approach.

Source: Do this for a living

How is that different from time slicing of mainframe time in the bad old days before personal computers? You pay more, they give you more CPU cycles and memory.

2ndly, the "cloud" is fine for non-realtime applications. But games have 33ms to complete all their relevant computations per frame and have to at a minimum do it 30 times a sec.

Lastly, there is no reason other companies can't do it themselves. This is especially true for AAA development. If you have the $$$ to fund a AAA game (100s of millions), you have the $$$ to develop your own cloud server farm that will elastically scale for each of your games. It would be greatly beneficial to DIY too, as this way you aren't tied to MS - a bad idea; MS is a company with a history of showing no hesitation when it comes to screwing over others for their own benefit, be it competitors, partners or even customers. No one with their head screwed on right would want to be under MS's thumb.

PS: I just noticed something. No one is talking about the "cloud" but MS and developers who are firmly in their pocket. Not a peep from EA, Ubisoft, Activision, Crytek, Epic, Take 2... etc.
 

jem0208

Member
Then make games that suit your GPU budget, not one that will fall over the second you lose your net connection

Chances are it's online only. I don't really see why this is a problem. If the game is online, why shouldn't they take advantage of extra power?


Why do you want them to shackle their games to weaker hardware when they can take advantage of the cloud to make their games better??
 

Filaipus

Banned
I just calculated an estimate of the data rate for the Crackdown demo shown at Build. Obviously there are a couple more variables involved, for example how the building breaks and the shape of the chunks. Would those derive from the local box and then get sent up to Azure? Presumably there's a server application which has the collision meshes of the map so it can sync up with the local box; it'd first receive the variables around the explosion, like size, direction, radius etc.

Data Rate

UPDATED: Rather than real-time calculating of every chunk, 32 times a second, /u/caffeinatedrob recommended drawing paths which I've just substituted into the calculations

32 bits * 6 - Float

9 bits * 2 - 9 Bit Integer

Compression Ratio: 85%

Chunks: 10,000

Total Bits per Chunk: 210 bits

Total Bits for Chunks: 2,100,000

Total Bits Compressed: 315,000

Typical Ethernet MTU = 1500 bytes = 12000 bits

Data Frames per Initial Explosion of 10,000 Chunks: 27

Typical UDP Overhead = 224 bits

Total Overhead per Explosion = 6048 bits

Total Bits Needing to Be Sent Per Explosion: 321,048

Throughput Needed Per Initial Explosion: 313Kbps

All of Chunks Collide in 4 seconds: 2500 Chunks re-drawn every second

2500*210 = 525000

Compressed: 78750 bits

Data Frames per second needed for re-draw: 7

UDP Overhead = 1568 bits

Total Bits Needed per re-draw: 80318 bits

Throughput Needed per re-draw: 78kbps

Overall throughput needed in the first second: 391kbps

Every second after initial explosion would be: 78kbps

For the data, I've used float values for the X,Y,Z starting co-ordinates and the same for the finishing co-ordinates of the path on the map. I've assigned 9 bit integers for the rotation values on the path and the radius of the arc of the path.

The compression used is a given in this scenario. With the data being compressed consisting purely of floats/ints, the compression is very high, in around the 80% range, which is what I've substituted in.

To compare this to services which are used daily: Netflix, for example, uses 7Mbps for a Super HD stream, which is pretty much standard these days. Both next-gen and previous-gen consoles support Super HD.

Latency

Average RTT (Round Trip Time) to Azure: 40ms

Calculation Time at Server: 32ms (For 32FPS)

Total RTT = 72ms

In Seconds = 0.072 Seconds

That means it takes 0.072 seconds from the beginning of the explosion for it to come back and start happening on your screen. Once the first load has occurred, you only have to receive data if the chunks collide with anything, which would result in the re-draw of paths. The latency on that would be the calculation time, call it 16ms, which is a lot considering that only a few may have to be re-drawn. Then, add the half trip time of 20ms, which would result in waiting 36ms or 0.036 seconds before the re-drawn path gets updated on-screen.

Packet Loss

In regards to packet loss, in 2014 you simply don't have any. ISPs these days tend to be both Tier 3 and Tier 2, with peering often directly to the large services which make up a lot of the bandwidth. This includes services like Google, Facebook, Twitter, Netflix etc. Honestly, unless you have a poor wireless signal inside your house, which often causes some slight packet loss, you're not going to get any. Even if you drop a couple of packets, you'd lose a handful of chunks for that one frame, and in terms of gameplay it's not really going to be noticeable.

Conclusion

After taking suggestions on board and drawing paths rather than doing real-time chunk calculation, the data rates needed are significantly lower, and the requirements for the internet connection are perfectly acceptable, needing to transmit at only 391kbps.

If anyone's got any suggestions on how to increase accuracy, or anything else, let me know.

The OLD solution which requires 5.8Mbps is documented here:

http://pastebin.com/vQQs5ffZ

TL;DR: Cloud computing is definitely feasible on normal ISP connections. Would require 391kbps when the explosion starts.


From reddit, you can read it here http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/

Other guys posted some corrections and things like that in the comments but the numbers stay mostly the same.
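
If anyone wants to sanity-check the arithmetic, here's a short Python sketch that reproduces his figures (the chunk count, bit layout, 85% compression ratio, MTU and UDP overhead are all his assumptions, not measured values):

[code]
import math

# Reproducing the reddit poster's estimate. All inputs are his assumptions.
CHUNKS = 10_000
BITS_PER_CHUNK = 32 * 6 + 9 * 2        # 6 floats + 2 nine-bit ints = 210 bits
COMPRESSION_PCT = 85                   # assumed 85% size reduction
MTU_BITS = 1500 * 8                    # typical Ethernet MTU
UDP_OVERHEAD_BITS = 224                # assumed per-datagram overhead

def wire_bits(chunks: int) -> int:
    """Bits on the wire for `chunks` path updates, including UDP overhead."""
    compressed = chunks * BITS_PER_CHUNK * (100 - COMPRESSION_PCT) // 100
    frames = math.ceil(compressed / MTU_BITS)
    return compressed + frames * UDP_OVERHEAD_BITS

initial = wire_bits(CHUNKS)            # initial explosion, sent over ~1 second
redraw = wire_bits(2_500)              # 2,500 chunks/s of collision re-draws

print(f"initial explosion: {initial} bits (~{initial // 1024} kbps)")   # 321,048 -> ~313 kbps
print(f"per-second redraw: {redraw} bits (~{redraw // 1024} kbps)")     # 80,318 -> ~78 kbps
print(f"first second total: ~{(initial + redraw) // 1024} kbps")        # ~391 kbps
print(f"latency to first movement: {40 + 32} ms")                       # his 72 ms RTT + server frame
[/code]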
 

mrklaw

MrArseFace
This is good. Even him saying it reassures me that this might be possible.

Still seems like more effort than is practical, but I look forward to seeing it.
 

Shpeshal Nick

aka Collingwood
Then make games that suit your GPU budget, not one that will fall over the second you lose your net connection



I'm hardly young ;) but thanks. I'm not complaining about the tech, I'm complaining about the fact that MS are creating a game that requires additional computation from a remote server because they got caught with their pants down and are desperately trying to catch up with Sony...

how long before people start expecting 1080p with cloud, 900p without cloud...

it's just bloody ridiculous...

But did you think that Microsoft might have still pushed the Azure stuff even if the One was as powerful or more powerful than the PS4?

Could it be, just possibly, that they want to integrate as many of their services/platforms into all their products to further enhance their ecosystem?

Just a thought.
 
From reddit, you can read it here http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/

Other guys posted some corrections and things like that in the comments but the numbers stay mostly the same.

The compression used is a given in this scenario. With the data being compressed consisting purely of floats/ints, the compression is very high, in around the 80% range, which is what I've substituted in.

Maybe there is something I'm missing here... What does the format of the data being floats/ints have to do with compression? White noise can be encoded as floats/ints... it will still be practically impossible to compress.

Compression relies on redundancy in the data. Music data is stored as 16-bit ints (assuming it's from a CD); lossless compression exploiting redundancy in stereo can at best bring it to ~50%. I'm not sure the positions of thousands to millions of particles have any redundancy at all, much less 80%.
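
To make that concrete, here's a quick sketch (zlib used just as an example general-purpose compressor): essentially random particle positions barely compress at all, while highly regular ones compress enormously.

[code]
import random
import struct
import zlib

# Compressibility depends on redundancy in the data, not on it being floats.
N = 10_000

# Essentially random particle positions: very little redundancy.
random_pos = struct.pack(f"{3 * N}f",
                         *(random.uniform(-500.0, 500.0) for _ in range(3 * N)))

# Highly regular positions (a repeating grid): lots of redundancy.
grid_pos = struct.pack(f"{3 * N}f",
                       *(float(i % 100) for i in range(3 * N)))

for name, blob in (("random", random_pos), ("grid", grid_pos)):
    saved = 1 - len(zlib.compress(blob, 9)) / len(blob)
    print(f"{name:6s}: {saved:6.1%} size reduction")
[/code]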
 

MrJoe

Banned
What does that have to do with anything? The comment wasn't "online is better than offline", it was (and I quote):

SatansReverence:
"They may be able to have an online/offline mode. Just very basic destruction while offline."


Hawks269:
"It could be like that. On my gaming PC and I know this is different, but using Nvidia PhysX you can turn it on and off and the difference is pretty substantial. A good example was Mafia 2 with and without PhysX. Perhaps, off-line, there will still be some destruction, but no where to the degree when connected to the servers for the cloud compute."


You're over here talking about whether bicycles or corvettes are better, meanwhile the conversation is about what might happen to the physics in a game if they were intended for remote computation but that's not available.

Also, bandwidth is not the limiting factor in either case, not sure what that has to do with anything. PhysX is available now and not everybody uses it, because the cost in many cases is greater than the benefit. If you had the choice (say on some existing PC game) between using PhysX at 30fps or a "cloud" solution and keeping 60fps, which would you take?

show me the money. show me an example of a 30 FPS PhysX demo vs. a 60 FPS equivalent running on a remote cloud. we have this MS demo showing a remarkable difference; you may call me paranoid but I don't have any trust when it comes to MS. oh and I am aware that a number of factors are at play: bandwidth caps, latency on the remote side (internet) as well as local latency, and of course the fact that no one but MS has access to the demo. I wonder why they don't just release it like Nvidia does for their demos?
 

Jomjom

Banned
If Spencer is saying this, it must work pretty well.

He wouldn't make this claim if it was all bullshit. He knows how much shit he would get if there's no substance behind this. This and DX12 are really going to change the landscape of the console competition I think.
 

DrM

Redmond's Baby
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.
Server in the room next door :D
 

Jomjom

Banned
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.

Will that really matter though? People said the same about PS Now, but it works exactly as advertised for me in California. As long as this works for a significant number of people, won't it still be a very big benefit? The hypothetical person living in Nebraska will still be able to play the game, but maybe he just won't get the extra effects and whatnot. That's not a huge deal.
 

kyser73

Member
Still have 1 question:

What happens to cloud-reliant games that are commercial failures? How long will support last?
 