no thanks
Holy, that is a LOT of hardware to invest for a single player. It'd almost be more practical just to give everyone Titan cards.
Yes, this. It makes no sense to keep rendering servers up and running, for single player games that after six months only have 10K people playing at any given time. Especially if these ten thousand are distributed over a wide geographic area, and aren't paying a subscription.
Additionally, I'm a little skeptical about certain aspects of this demo. Most of these scenes assumed 150 ms latency, which is about the best real-world scenario you could ask for. Even with a good connection, it's not uncommon for latency to rise to 500-700 ms or more in certain situations. What's gonna happen to lighting under those scenarios?
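To put those round-trip times in perspective, here's a rough staleness sketch. It assumes (my assumption, not anything from the video) that the cloud delivers one lighting update per round trip while the game renders at 60 FPS:

```python
# Rough staleness estimate: how many rendered frames go by between
# cloud lighting updates at a given round-trip latency. Illustrative
# assumption: one irradiance update per round trip, 60 FPS rendering.

def stale_frames(rtt_ms: float, fps: float = 60.0) -> int:
    """Frames rendered while waiting on one cloud lighting update."""
    frame_ms = 1000.0 / fps
    return round(rtt_ms / frame_ms)

for rtt in (150, 500, 700, 1000):
    print(f"{rtt} ms RTT -> lighting ~{stale_frames(rtt)} frames old")
```

At 150 ms the lighting lags about 9 frames; at 1000 ms it's a full second behind, which matches where the demo visibly struggles.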
Free Titans for everyone! Where do i sign up??
Cloud-based rendering would be really amazing for offline applications, though, like professional rendering tasks. You could rent the Nvidia cloud for $X/hr and it would render your scene for you. Professional GPUs cost thousands of dollars, so this would be a great tool for people who do that kind of work for a living.
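As a back-of-envelope on that rent-vs-buy question (the $2.50/hr rate and $3,000 card price below are made-up placeholders, since the post leaves the rate as $X):

```python
# Break-even hours for renting cloud GPU time vs. buying a workstation
# card outright. Both prices are hypothetical placeholders.

def break_even_hours(card_price: float, rate_per_hour: float) -> float:
    """Hours of rented rendering at which rental cost matches the card price."""
    return card_price / rate_per_hour

hours = break_even_hours(card_price=3000.0, rate_per_hour=2.50)
print(f"Renting beats buying until about {hours:.0f} hours of use")
```

Under those placeholder numbers, occasional renderers come out ahead renting, while a studio rendering all day every day would buy the card.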
I would love that.

Do we need to go dig up some quotes from that cloud thread shortly after E3?
So, the impossible.....is looking possible?
I don't think you understand how cloud-based virtualization works...
Or even watched the video
Going to be a lot of crow eating in the next year or two when cloud computing becomes a differentiator.
Just as the platforms mature and get better year over year, so will the cloud impact as devs become more comfortable with the tech and process.
What about the other systems, like the voxel one and the irradiance map? How much bandwidth is needed?
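The video doesn't give figures for those systems, but a rough uncompressed-streaming sketch shows the order of magnitude involved. Every parameter here (a 64^3 volume, RGB at one byte per channel, two updates per second) is a guess, not a number from the demo:

```python
# Very rough bandwidth sketch for streaming a voxel irradiance volume
# from the cloud. All parameters are hypothetical guesses.

def stream_mbps(dim: int, bytes_per_voxel: int, updates_per_sec: float) -> float:
    """Raw (uncompressed) streaming rate in megabits per second."""
    bytes_per_update = dim ** 3 * bytes_per_voxel
    return bytes_per_update * updates_per_sec * 8 / 1_000_000

# 64^3 volume, 3 bytes (RGB) per voxel, 2 updates/sec:
print(f"~{stream_mbps(64, 3, 2.0):.1f} Mbit/s uncompressed")
```

Even this small, slowly updated volume lands around 12-13 Mbit/s before compression, which is already more downstream bandwidth than many home connections can spare.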
Also, dat latency.
Nvidia can't do 60 FPS, and that's with Titans helping out. I thought this was next-gen.
I watched about 2/3rds; the last third locked up on me for some reason.
Yes, I know how cloud-based virtualization works. It still flipping costs money, even just to have cloud compute servers on standby.
This is pretty delusional. Sorry to be so blunt... but just look at the costs (hardware and bandwidth) and the practical applications. It is completely impractical right now... and will be for quite some time.
What's your point? The video shows that it holds up very well under latency. Only at extreme latencies does it start to show signs of trouble.
For nvidia and independent devs, sure but for someone like MS who already has a large cloud infrastructure established, not so much.
Only saw an issue at 1000 ms; not sure what you're getting at.
Except even if it did happen (which it probably still won't until internet service gets a LOT better than it is currently), it wouldn't be a differentiator. The cloud is, unsurprisingly, not linked in any way to the local hardware, so the PS4 can do it just as easily as the X1, or a PC, or a Steambox, or whatever. In fact, a dev could in all likelihood use Windows Azure in tandem with the PS4 if they wanted to; they just wouldn't get the break on server costs that X1 development apparently gets them.
Are you saying Microsoft is going to invest $1,000-plus per XB1 in server-side support so that 200 ms lagging indirect lighting can become a reality in a handful of games?
Yep... sounds very realistic
what is this I don't even
Most cloud compute platforms only charge when instances are actually utilized. Developers spawn new instances as needed. They don't keep them idling on standby
That's not how it works. You don't have dedicated hardware for each user, not everyone is playing at the same time and the ones that are aren't all playing games that require cloud processing. That's the whole point of the cloud, having processing power available when needed for whatever application you need it for.
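That point can be made concrete: a shared pool is sized for peak *concurrent* players, not for every copy sold. All figures below are hypothetical:

```python
import math

# Shared-pool sizing: capacity is provisioned for peak concurrent
# users, not total copies sold. All figures are hypothetical.

def gpus_needed(copies_sold: int, peak_concurrency: float,
                players_per_gpu: int) -> int:
    """GPUs required to cover the peak concurrent player count."""
    peak_players = copies_sold * peak_concurrency
    return math.ceil(peak_players / players_per_gpu)

# 1M copies sold, 10% online at peak, one GPU serving 4 players:
print(gpus_needed(1_000_000, 0.10, 4))
```

Under those assumed numbers you'd need 25,000 GPUs, not a million; the argument then comes down to how optimistic the concurrency and players-per-GPU guesses are.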
You would have to have dedicated resources for each user, yes. For a game like CoD, that can average 10,000+ players and peak at 20,000 or so, the investment would be gargantuan to support them.
Even if this stuff isn't viable in the next couple of years people seem to forget how fast technology moves. When the 360 and PS3 were launched the iPhone didn't even exist yet.
Okay guys. Deep breath. Stop and read.
We are in a climate where publishers are pulling down their multiplayer servers for games that still have active communities. The biggest, richest companies do this, even for games that don't use dedicated servers and just run matchmaking servers.
And you people think that publishers will run these servers?!?! Really???
It's absolutely delusional to think that the pubs would spend extra money for this stuff, which takes a lot more overhead than multiplayer servers and won't even work for their entire userbase.
The claim above that realtime cloud stuff like this is going to be a big deal in the next year is not correct.
The cloud is good at the stuff it is already used for. Save games and the like.
Nope, your upload is too slow.
I think you could decrease the compute cost.
By keeping it only for the sun and maybe some stationary lights, like torches in rooms.
Bullets, flashlights, and particles I would ignore for GI.
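The suggested split can be sketched as a simple whitelist; the light-type names here are made up for illustration:

```python
# Sketch of the suggested split: only static or slow-moving lights feed
# the (expensive) cloud GI pass; fast transient lights stay local-only.
# Light-type names are hypothetical.

CLOUD_GI = {"sun", "torch", "ceiling_lamp"}              # static / slow
LOCAL_ONLY = {"muzzle_flash", "flashlight", "particle"}  # fast / transient

def uses_cloud_gi(light_type: str) -> bool:
    """True if this light type is worth a cloud GI bounce pass."""
    return light_type in CLOUD_GI

print([t for t in ("sun", "flashlight", "torch") if uses_cloud_gi(t)])
```

The fewer light types you send to the cloud, the smaller the per-player compute and bandwidth bill, at the cost of dynamic lights getting no GI contribution.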
Of course cloud services are virtualized. But--at least currently--you need the equivalent of a Titan for each player. (Or a Titan and a Quadro, if you use the voxel solution!) So if you sell 1m copies of a game with this lighting, you'd need, say, the power of 750,000 Titans in the cloud; not everyone would play at once, but you have to prepare for a high-use scenario. Building this level of infrastructure quickly is very unlikely to happen. For consoles, why not just invest that money in making a more powerful local machine?
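That worst-case arithmetic can be sketched directly; the $1,000 per card is a ballpark I'm supplying, and the 75% high-use fraction mirrors the 750,000-of-1M assumption in the post above:

```python
# Sketch of the worst-case provisioning arithmetic: if you must cover a
# high-use scenario with one GPU per player, fleet cost scales with
# copies sold. The per-card price is a ballpark, not a quoted figure.

def fleet_cost(copies_sold: int, high_use_fraction: float,
               cost_per_gpu: float) -> float:
    """Hardware cost to cover the assumed high-use player count."""
    return copies_sold * high_use_fraction * cost_per_gpu

cost = fleet_cost(1_000_000, 0.75, 1000.0)
print(f"${cost:,.0f} in GPUs for one game")
```

Three-quarters of a billion dollars in cards for a single million-seller is the crux of the "just build a stronger console" argument.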
No, I mean if there are 50 million consoles you don't need 50 million dedicated Titans in the cloud.

You're going to need them for every game that uses cloud processing, though. CoD would be one game; think about how many people game on XBL at peak times. Currently it seems like MS is pimping its own Azure servers for this infinite power. Costs would be insane.
But if you're only using GI for stationary lights, you might as well prebake them and avoid the cloud entirely. This is exactly the point: the cloud only helps if the lighting is dynamic, but this video shows some strong limitations (technical and economic) on that usage.