intheinbetween
I notice that the Cloudgine website now sort of exists.
It's been up for a long time, not just now.
The point of the statement is that the same game (let's say Mirror's Edge or Borderlands 2) can have PhysX "on" and "off" modes, both being viable games but one with "more".
A game with cloud-based physics processing could also have "on" and "off" modes, with one (online) using remote computing and the other (offline) being more basic local physics (Earth Defense Force-style).
Do you understand now?
Why would I want to live there?
When someone says "the cloud opens up new possibilities", what they really mean is "something you could already do is now cheaper".

Which is like "something you could already do is actually viable to do because it is now cheaper."
How about just showing the game.

Isn't it slated for 2016? The cloud stuff is probably all they have to show of it yet.
Is this really gonna become a 'thing' now? A game that isn't an MMO requiring the internet so you get all the bells and whistles because MS cheaped out on their GPU budget?

They'd have to put a pretty beast GPU in the thing if they wanted to do what they've been demoing locally...
Yes, it's going to work so spectacularly well isn't it...
there is no correlation between what a GPU (local) can do and what the "cloud" (remote) can do. your comparison is highly specious. but then you already knew that; "I know this is different."

Actually in this case - calculating physics - it is EXACTLY the scenario that's been demoed to date, so nothing specious about it.
neogaf is schizophrenic...
It almost seems like no matter what MS says we want the opposite here.
Neogaf: BS! CLOUD TECH IS BS! SHUT UP AND SHOW PROOF!
MS: Okay, we're working on a demo for you. Thanks for the feedback.
Neogaf: Conspiracy! Conspiracy! You're going to try and fool us!
Neogaf: Why don't you just show the damn game, jesus!
sigh...
yes, your thorough explanation of the difference between "on" and "off" was adequate. my point was the massive differences in scale.
take the PC I'm using now as an example. the GPU has a bandwidth of 31.51GB/s (aka 32,266.24MB/s) both ways (up and down.) a relatively good internet connection (to the cloud) has about 20MB/s down, and usually much slower up. even if you doubled or tripled that, it's like comparing a new Corvette to a new tricycle. thus the massive differences between "local" and "remote." local is ALWAYS better.
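For a sense of that scale, here is the same comparison as a quick back-of-the-envelope calculation, using only the figures quoted above (the 20MB/s connection is an assumption, not a measurement):

```python
# Rough scale comparison using the figures quoted above (assumed, not measured).
GPU_BANDWIDTH_MBPS = 31.51 * 1024   # local GPU memory bandwidth: 31.51 GB/s in MB/s
NET_DOWN_MBPS = 20                  # a "relatively good" internet connection, MB/s down

ratio = GPU_BANDWIDTH_MBPS / NET_DOWN_MBPS
print(f"local bandwidth is roughly {ratio:,.0f}x the remote link")  # ~1,613x
```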
I'm just staggered by the tone of most of the replies in this thread. It's as if, for some people, they are simply unable or unwilling to concede even a 1% chance that cloud computing might actually be a real thing.
Games/tech go from idea to proof of concept then the usual alpha/beta cycle, etc. What was shown at //Build was clearly a proof of concept demo. I'm sure at some point there will be a large scale test to gain telemetry/real world feedback but doesn't look as if they're anywhere near that stage yet.
Suggesting a demo while being clear about latency/bandwidth/etc. is a great next step; it doesn't have to be wrapped up as a fully functional vertical slice, or rolled out to everyone with an Xbox, for it to be valid. It just seems to be too early for that.
Oh "cloud computing" is real alright**, what we aren't convinced of is it's application to games - which are effectively real-time applications with time constraints.
** "Cloud Computing" is just a buzz word. We have had cloud computing since the 80s back when computers worth a damn cost an arm, a leg and your first born. People rented "computer time" back then. You connect via a "dumb terminal" (basically just a monitor with basic communications circuitry) to the mainframe else where. That went the way of the dodo when personal computers (aka PCs) became affordable and people could have their own computer!
I believe what we need to see:
* The specific conditions where it "works".
* How the software reacts to a dropped connection (local or server) - see the sketch after this list.
* How the software reacts to a slow or lagging connection.
* Assuming it handles these cases, how much development/design work this takes.
* Assuming the development work is viable, what restrictions this places on interactivity with the chosen effects: will they be cosmetic only (because you can't guarantee when and where the changes will appear to the player), and hence better done as pre-calculated scenes?
A demo covering all that would set the record straight.
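For the dropped/laggy-connection items above, here is a minimal sketch of what a fallback might look like. All names and numbers are hypothetical, not from any announced SDK: the game asks the remote simulation first and quietly degrades to coarse local physics when the link fails or the result arrives too late to use.

```python
import random

LATENCY_BUDGET_MS = 120  # hypothetical cutoff: results later than this are useless

def remote_destruction(event):
    """Stand-in for a round trip to a cloud physics server; may fail or be slow."""
    if random.random() < 0.05:
        raise ConnectionError("link dropped")
    return {"chunks": 10_000, "latency_ms": random.gauss(70, 40)}

def local_destruction(event):
    """Much coarser simulation the console can afford to run itself."""
    return {"chunks": 200, "latency_ms": 0}

def resolve_destruction(event):
    try:
        result = remote_destruction(event)
        if result["latency_ms"] > LATENCY_BUDGET_MS:
            return local_destruction(event)  # arrived too late to look right: degrade
        return result
    except ConnectionError:
        return local_destruction(event)      # offline: basic local destruction only

print(resolve_destruction({"building": "tower_03"}))
```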
I'm just staggered by the tone of most of the replies in this thread. It's as if, for some people, they are simply unable or unwilling to concede even a 1% chance that cloud computing might actually be a real thing...

Well, cloud computing is 100% a real thing; however, it creates problems for people. The destructibility demo was a nice piece of tech - but it'll add an internet requirement to any game that uses it. This was a major issue when the Xbone was announced, as you might recall.
So, it shouldn't be too hard to concede that the majority of gamers don't want to add an internet requirement for better particles. It's cool, but it's ultimately not going to change anything that actually matters.
Why does he look away from the falling structure each time he initiates the destruction in that cloud demo? It seems like there are some dips starting to occur in the frame rate, and then he just moves on to the next thing.
Well anyway, I really wanna see what they can do with this, but I'll only believe it when I see it being used in a game in real time.
Wouldn't you like to see some proof that the cloud can offer some tangible, and feasible, benefits to games?
Does this sort of thing mean Crackdown etc. are going to come with a required net connection / upload / download speed requirement on the box?
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.

As someone who lives in Nebraska... you really don't know how normal this state really is... and we do have decent internet everywhere now too.
Young man yells at cloud.
Cloud computing isn't a buzzword though (it's two, hah!)... But seriously, the key point with cloud computing is the "elastic capacity" side of it. That's what differentiates it from simply a giant pool of dedicated servers.
The elastic scale side of it is what makes it easier for companies to adopt (no need to build out/rent a fixed number of servers which either sit there idle or leave people unable to connect on launch day).
Then factor in the sdk support for making use of the capacity and you have the things which differentiate it from the more traditional approach.
Source: Do this for a living
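A toy illustration of the elastic-capacity point (all numbers invented): with a fixed fleet you pay for the peak around the clock, while elastic capacity only pays for what each hour actually needs.

```python
import math

# Invented hourly session counts for one game, plus an assumed per-server capacity.
sessions_per_hour = [5_000, 3_000, 2_000, 8_000, 20_000, 45_000, 30_000, 12_000]
SESSIONS_PER_SERVER = 500

fixed_fleet = math.ceil(max(sessions_per_hour) / SESSIONS_PER_SERVER)  # sized for peak
fixed_server_hours = fixed_fleet * len(sessions_per_hour)
elastic_server_hours = sum(math.ceil(s / SESSIONS_PER_SERVER) for s in sessions_per_hour)

print(f"fixed fleet:   {fixed_fleet} servers, {fixed_server_hours} server-hours")
print(f"elastic scale: {elastic_server_hours} server-hours "
      f"({100 * (1 - elastic_server_hours / fixed_server_hours):.0f}% fewer)")
```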
Then make games that suit your GPU budget, not one that will fall over the second you lose your net connection
I just calculated an estimate of the data rate for the Crackdown demo shown at Build. Obviously there are a couple more variables involved, for example, how the building breaks and the shape of the chunks. Would they derive from the local box and then get sent up to Azure? Presumably a server application would have the collision meshes of the map so it can sync up with the local box; it'd first receive the variables around the explosion, like size, direction, radius, etc.
Data Rate
UPDATED: Rather than calculating every chunk in real time, 32 times a second, /u/caffeinatedrob recommended sending drawn paths instead, which I've substituted into the calculations.
32 bits * 6 - Float
9 bits * 2 - 9 Bit Integer
Compression Ratio: 85%
Chunks: 10,000
Total Bits per Chunk: 210 bits
Total Bits for Chunks: 2,100,000
Total Bits Compressed: 315,000
Typical Ethernet MTU = 1500 bytes = 12000 bits
Data Frames per Initial Explosion of 10,000 Chunks: 27
Typical UDP Overhead = 224 bits
Total Overhead per Explosion = 6048 bits
Total Bits Needing to Be Sent Per Explosion: 321,048
Throughput Needed Per Initial Explosion: 313Kbps
All of Chunks Collide in 4 seconds: 2500 Chunks re-drawn every second
2500*210 = 525000
Compressed: 78750 bits
Data Frames per second needed for re-draw: 7
UDP Overhead = 1568 bits
Total Bits Needed per re-draw: 80318 bits
Throughput Needed per re-draw: 78kbps
Overall throughput needed in the first second: 391kbps
Every second after initial explosion would be: 78kbps
For the data, I've used float values for the X,Y,Z starting co-ordinates and the same for the finishing co-ordinates of the path on the map. I've assigned 9 bit integers for the rotation values on the path and the radius of the arc of the path.
The compression used is a given in this scenario. With the data being compressed consisting purely of floats/ints, the compression is very high, somewhere in the 80s, which is what I've substituted in.
To compare this to services which are used daily: Netflix, for example, uses 7Mbps for a Super HD stream, which is pretty much standard these days. Next-gen consoles, and the previous gen, support Super HD.
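The arithmetic above is easy to sanity-check by re-running it. This sketch just reproduces the post's own assumptions (210 bits per chunk path, 85% compression, 1500-byte frames, 224 bits of UDP/IP overhead per datagram), so the outputs mirror the figures listed above rather than measured reality.

```python
import math

BITS_PER_CHUNK = 32 * 6 + 9 * 2   # six 32-bit floats + two 9-bit ints = 210 bits
COMPRESSION = 0.85                # assumed 85% reduction on the float/int payload
MTU_BITS = 1500 * 8               # typical Ethernet MTU
UDP_OVERHEAD_BITS = 224           # per-datagram UDP/IP header overhead

def bits_on_the_wire(chunks):
    payload = chunks * BITS_PER_CHUNK * (1 - COMPRESSION)
    frames = math.ceil(payload / MTU_BITS)
    return payload + frames * UDP_OVERHEAD_BITS

initial = bits_on_the_wire(10_000)   # the whole building's chunk paths
redraw = bits_on_the_wire(2_500)     # paths re-sent each second after collisions

print(f"initial explosion: {initial / 1024:.1f} Kb -> ~313 Kbps if sent in one second")
print(f"per-second redraw: {redraw / 1024:.1f} Kb  -> ~78 Kbps")
print(f"first second:      {(initial + redraw) / 1024:.1f} Kb -> ~392 Kbps")
# The post's 391 Kbps figure is just the rounded 313 + 78.
```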
Latency
Average RTT (Round Trip Time) to Azure: 40ms
Calculation Time at Server: 32ms (For 32FPS)
Total RTT = 72ms
In Seconds = 0.072 Seconds
That means it takes 0.072 seconds from the beginning of the explosion for it to come back and start happening on your screen. Once the first load has occurred, you only have to receive data if the chunks collide with anything, which would result in the re-draw of paths. The latency on that would be the calculation time, call it 16ms, which is a lot considering that only a few may have to be re-drawn. Then, add the half trip time of 20ms, which would result in waiting 36ms or 0.036 seconds before the re-drawn path gets updated on-screen.
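The same latency figures in runnable form; the 40ms round trip and the 32ms/16ms calculation times are the post's assumptions, not measurements:

```python
RTT_MS = 40                 # assumed round trip to the nearest Azure region
FIRST_CALC_MS = 32          # one full simulation step at 32 updates per second
REDRAW_CALC_MS = 16         # assumed cost when only a few paths need re-drawing

first_visible = RTT_MS + FIRST_CALC_MS        # 72 ms from trigger to first chunks
redraw_visible = RTT_MS / 2 + REDRAW_CALC_MS  # 36 ms for later path corrections

print(f"explosion trigger to first on-screen movement: {first_visible} ms")
print(f"later path corrections:                        {redraw_visible:.0f} ms")
```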
Packet Loss
In regards to packet loss, in 2014, you simply don't have any. ISPs these days tend to be both Tier 3 and 2, with peering often directly to large services which make up a lot of the bandwidth. This includes services like Google, Facebook, Twitter, Netflix etc. Honestly, unless you have a poor wireless signal inside your house, which often causes some slight packet loss, you're not going to get any. Even if you drop a couple of packets, you'd lose a handful of chunks for that one frame, and in terms of gameplay it's not really going to be noticeable.
Conclusion
After taking suggestions on board and drawing paths rather than real-time chunk calculation, the data rates which are needed are significantly lower, and the requirements for the internet connection are perfectly acceptable, only needing to transmit at 391kbps.
If anyone's got any suggestions on how to increase accuracy, or anything, let me know.
The OLD solution which requires 5.8Mbps is documented here:
http://pastebin.com/vQQs5ffZ
TL;DR: Cloud computing is definitely feasible on normal ISP connections. Would require 391kbps when the explosion starts.
I'm hardly young, but thanks. I'm not complaining about the tech, I'm complaining about the fact MS are creating a game that requires additional computation from a remote server because they got caught with their pants down and are desperately trying to catch up with Sony...
how long before people start expecting 1080p with cloud, 900p without cloud...
it's just bloody ridiculous...
From reddit, you can read it here http://www.reddit.com/r/xboxone/comments/27yczf/i_just_calculated_an_estimate_of_the_internet/
Other guys posted some corrections and things like that in the comments but the numbers stay mostly the same.
What does that have to do with anything? The comment wasn't "online is better than offline", it was (and I quote):
SatansReverence:
"They may be able to have an online/offline mode. Just very basic destruction while offline."
Hawks269:
"It could be like that. On my gaming PC and I know this is different, but using Nvidia PhysX you can turn it on and off and the difference is pretty substantial. A good example was Mafia 2 with and without PhysX. Perhaps, off-line, there will still be some destruction, but no where to the degree when connected to the servers for the cloud compute."
You're over here talking about whether bicycles or corvettes are better, meanwhile the conversation is about what might happen to the physics in a game if they were intended for remote computation but that's not available.
Also, bandwidth is not the limiting factor in either case, not sure what that has to do with anything. PhysX is available now and not everybody uses it, because the cost in many cases is greater than the benefit. If you had the choice (say on some existing PC game) between using PhysX at 30fps or a "cloud" solution and keeping 60fps, which would you take?
This will be very interesting. But how doctored and controlled is this test going to be? They could easily cook these numbers. Latency is a real factor here. You can't compare a world class internet connection of a huge corporation to some dude who lives in bum fuck Nebraska.

Server in the room next door.