
NVIDIA unveils Cloud Based Indirect Lighting

nib95

Banned
What was I thinking :p
But from the sun alone it should help with the lighting of the game.

Hundreds of millions of dollars worth of cloud investment, development complexity, a mass of obstacles and potential bandwidth and latency issues, all so gamers can get some Cloud computed Globally Illuminated sun. Lol. Which they will likely get even without the Cloud on local hardware...
 

amardilo

Member
That's some pretty cool technology.

This could really help game graphics. It would be great if games could use additional lighting effects on top of the standard lighting when the user has a good internet connection (and fall back to just the standard lighting when they don't), roughly as sketched below.

Hopefully AMD has something similar and the likes of Microsoft can upgrade their Azure cloud infrastructure to include this stuff (same for Sony).
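A minimal sketch of that fallback idea (my own, not from the article; the host name and thresholds are made up, with the 200 ms number borrowed from what the demo reportedly tolerated):

```python
# Probe the lighting server once at startup; enable the cloud-enhanced path
# only if the connection looks good enough, otherwise use standard lighting.
import socket
import time

CLOUD_LIGHTING_HOST = ("lighting.example.com", 443)   # hypothetical endpoint
MAX_ACCEPTABLE_RTT_MS = 200                           # demo reportedly tolerated ~200 ms

def probe_rtt_ms(host, timeout=1.0):
    """Rough TCP connect round-trip time in milliseconds, or None on failure."""
    start = time.monotonic()
    try:
        with socket.create_connection(host, timeout=timeout):
            return (time.monotonic() - start) * 1000.0
    except OSError:
        return None

def choose_lighting_mode():
    rtt = probe_rtt_ms(CLOUD_LIGHTING_HOST)
    if rtt is not None and rtt <= MAX_ACCEPTABLE_RTT_MS:
        return "cloud_indirect_lighting"    # enhanced path
    return "standard_local_lighting"        # offline / poor-connection fallback

print(choose_lighting_mode())
```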
 

gofreak

GAF's Bob Woodward
The latency might be OK for lower frequency indirect lighting. But higher frequency effects might not be so tolerant.

The quality is something they should try to ramp up if they're offloading stuff, though... the quality here didn't look all that great (I mean, versus what one could do with a tonne of power). Like really pump up the quality, the number of lights, etc. Although I guess there's an economic limit to how much you can assign to one player's view, and that limit may not be very high at the moment.
 

Darknight

Member
What about offline users who play games with cloud support? Will they play a shittier game, or have a worse gameplay experience? If the game is online-only single-player, then it would be stupid just to use DA CLOUD.
 
Hundreds of millions of dollars worth of cloud investment, development complexity, a mass of obstacles and potential bandwidth and latency issues, all so gamers can get some Cloud computed Globally Illuminated sun. Lol. Which they will likely get even without the Cloud on local hardware...
No. This is a demo displaying only one application of cloud servers.
There are many uses beyond Global Illumination.
Of course nobody would invest all of that money simply for cloud generated lighting.
 

nib95

Banned
No. This is a demo displaying only one application of cloud servers.
There are many uses beyond Global Illumination.
Of course nobody would invest all of that money simply for cloud generated lighting.

In terms of graphics advantages?

If folks read the actual SIGGRAPH presentation, they will see that NVIDIA looks at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process


They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43 Mb/s of Internet bandwidth for a Battlefield 3-sized map (using the photon system)

So, a good research paper, but not yet economically practical on either the client or the server side.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.
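To make the split concrete, here's a toy sketch (mine, not NVIDIA's code; all names and timings are illustrative): the client does UI, shadow maps, physics, direct illumination and post-processing every frame, while indirect illumination arrives asynchronously from a server and whatever result is newest gets composited.

```python
# Toy client/server split: the "server" produces an indirect-lighting buffer
# every ~100 ms (solve time + network latency), while the client renders at
# 60 fps and composites the most recent buffer it has received.
import queue
import threading
import time

indirect_results = queue.Queue()

def cloud_indirect_lighting_worker(stop):
    frame = 0
    while not stop.is_set():
        time.sleep(0.1)                       # simulated compute + latency
        indirect_results.put(f"GI buffer {frame}")
        frame += 1

def client_render_loop(num_frames=30):
    latest_gi = "local fallback GI"           # used until the first result arrives
    for frame in range(num_frames):
        while not indirect_results.empty():   # drain anything received since last frame
            latest_gi = indirect_results.get()
        # Local, latency-sensitive work happens every frame regardless:
        direct = f"UI + shadows + physics + direct lighting, frame {frame}"
        print(f"{direct}  |  composited with: {latest_gi}")
        time.sleep(1 / 60)                    # 60 fps client

stop = threading.Event()
threading.Thread(target=cloud_indirect_lighting_worker, args=(stop,), daemon=True).start()
client_render_loop()
stop.set()
```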
 

Hex

Banned
Well, at least I got a Lie to Me nod out of this thread.
Amazing how many people blow through the OP, cherry-picking but ignoring the important parts, and rush to the reply button for the sake of console crusadery.
Love me some Nvidia, not enough to splurge on a Titan or two yet though lol.
 

wsippel

Banned
Even Nintendo is investing money in that stuff (GPGPU powered cloud computing), so there's obviously something to it. It's what NERD's currently working on, apparently.
 

FINALBOSS

Banned
No. This is a demo displaying only one application of cloud servers.
There are many uses beyond Global Illumination.
Of course nobody would invest all of that money simply for cloud generated lighting.

I had a sinking feeling when you quoted me earlier that you never read any of the threads you post in--hell, you might not even read what people say when you quote them.

This confirms it.
 
I think cloud computing is a little more than just a buzzword lol
No, it truly is a buzzword in every possible way. There is absolutely nothing new about the technology behind the word. Thinking so only shows that you have no clue about server technology. Virtualization has been around for a very long time. People have just commoditized and automated it to the point where it can be sold to the masses in bite-sized chunks. And that requires a buzzword.

Wrapping Ruby, C, bash, and Python scripts around command-line tools and then exposing a web-based admin console is nothing new. Having useful scripts that you can market to consumers without using wording like PaaS, Platform as a Service, or virtual machine server hosting is much more advantageous.
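For anyone curious what "wrapping scripts around command-line tools and exposing a web-based admin console" amounts to, here's a deliberately tiny sketch (purely illustrative; the endpoints and tools are my own picks, and a real PaaS control plane adds auth, billing and automation on top of this idea):

```python
# Minimal "admin console": an HTTP endpoint that shells out to a whitelisted
# command-line tool and returns its output as plain text.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

COMMANDS = {"/uptime": ["uptime"], "/disk": ["df", "-h"]}   # whitelisted Unix tools

class AdminConsole(BaseHTTPRequestHandler):
    def do_GET(self):
        cmd = COMMANDS.get(self.path)
        if cmd is None:
            self.send_error(404, "unknown command")
            return
        output = subprocess.run(cmd, capture_output=True, text=True).stdout
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(output.encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), AdminConsole).serve_forever()
```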
 
In terms of graphics advantages?
No, probably not for rendering graphics right away.
Maybe in half a decade, as these next-gen GPUs come down in price and ISPs are offering better services to more people.
But in titles that are designed to use cloud computing nowadays, we'll see more fidelity in certain areas, such as scale, which makes the game look more impressive.

I had a sinking feeling when you quoted me earlier that you never read any of the threads you post in--hell, you might not even read what people say when you quote them.

This confirms it.
Thanks for the attention, it's flattering
 
Using the cloud for deferred graphics rendering is a bit lazy and misguided. I could see it used for the ultimate procedurally generated world and world art asset creator, but I don't really care about anything else. It's nice that online games may stay online longer due to automated scaling, but why wasn't that done years ago?
 
No, probably not for rendering graphics right away.
Maybe in half a decade, as these next-gen GPUs come down in price and ISPs are offering better services to more people.
But in titles that are designed to use cloud computing nowadays, we'll see more fidelity in certain areas, such as scale, which makes the game look more impressive.


Thanks for the attention, it's flattering

Scale? What does scale have to do with the cloud? If you aren't dynamically generating a world and streaming it to a client, there isn't much else the cloud is doing. If it is static content, then in that utopian distant future where everyone has über net you could easily negate having the cloud at all and use a peer-to-peer system for asset sharing, for better geographical coverage.
 
That was a 35% build, pre-alpha...

Killzone Shadow Fall is using GI too. And these are LAUNCH titles running on far worse GPUs than a Titan...
The same baked GI that gets deemed the most unimpressive thing ever when people talk about Forza 5.

But isn't that exactly what's being done in this tech demo as well?

No, they are actually using the same global lighting algorithm Epic was using in UE4, running it in real time and then bringing the result to the PC.
 

Momentary

Banned
Right on brotha, fuck the armed forces.


Are you kidding me? If we're in an area where there isn't any electricity or internet connectivity, we're definitely not worried about gaming. Those who are might be in the wrong business. As a U.S. Marine, I remember taking "libbo" trips to Army and Navy bases since they had basically established a small city AND they had more than decent internet, with actual download speeds of up to 4.0 Mb/s.

So every time I hear this it kind of pisses me off. Most POGs don't have it bad at all when they're deployed, so stop saying this crap. When a service member is at a base that has a massage parlor, a Taco Bell, a KFC, a Starbucks, and a shopping district, they really have no reason to complain about anything.
 

Portugeezer

Member
I prefer my games to work offline.

How so?

And about the latency: they actually showed it in the demo; even with 200 ms of latency (which is absurdly high) it still looked amazing.

Even at 500 ms it was okay for me; only the one-second delay was actually noticeable in that demo.
 
Guessing DriveClub uses real-time Global Illumination, given the sun moves around, sets, rises, etc. The DriveClub dev comments on it here.

http://www.youtube.com/watch?v=VoengumG6FI&t=3m40s

Even more curious to see how this game ends up looking and performing at launch now.

When people say global illumination they mean direct + indirect lighting (including reflections and refractions), not just dynamic lighting.

The only company (that I know of) that has games in development or shipped with real-time GI is Crytek, and even then they only have GI for the sun, and only for the first indirect bounce.
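In rough notation (my own, not from the thread): global illumination is the direct term plus the sum of the indirect bounce terms, and a solution like Crytek's keeps only the k = 1 term, for the sun.

```latex
% GI = direct lighting plus every indirect bounce; real-time engines
% typically truncate the sum after the first bounce (k = 1).
L_{\mathrm{GI}} = L_{\mathrm{direct}} + \sum_{k \ge 1} L_{\mathrm{indirect}}^{(k)}
```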

If folks read the actual SIGGRAPH presentation, they will see that NVIDIA looks at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process

They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43 Mb/s of Internet bandwidth for a Battlefield 3-sized map (using the photon system)

So, a good research paper, but not yet economically practical on either the client or the server side.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.
If indirect lighting is acceptable, then indirect shadowing most likely is too, like ambient occlusion. Some games like Gears 2 used an accumulation buffer for AO that had a few seconds of delay before showing the proper AO when an object moved.
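A tiny sketch of that accumulation-buffer idea (mine, not the actual Gears 2 code): blend each frame's AO estimate into a running per-pixel buffer, so the correct occlusion fades in over several frames instead of snapping, which is exactly the kind of latency hiding a cloud-computed term would need.

```python
# Exponential moving average of per-pixel AO: a low blend factor means a
# smoother but laggier result, i.e. the "few seconds of delay" effect.
import numpy as np

def accumulate_ao(history, current, blend=0.2):
    return (1.0 - blend) * history + blend * current

# An object moves, so the true AO at these pixels jumps from 0.2 (occluded)
# to 1.0 (unoccluded); the buffer takes many frames to catch up.
history = np.full(4, 0.2)
target = np.full(4, 1.0)
for frame in range(20):
    history = accumulate_ao(history, target)
print(history)   # ~0.99 after 20 frames, i.e. the correct AO arrives late
```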
 

FINALBOSS

Banned
No, probably not for rendering graphics right away.
Maybe in half a decade, as these next-gen GPUs come down in price and ISPs are offering better services to more people.
But in titles that are designed to use cloud computing nowadays, we'll see more fidelity in certain areas, such as scale, which makes the game look more impressive.


Thanks for the attention, it's flattering

I didn't really post it for you.

I posted it for everyone else attempting to engage with you--that it'd be a 100% complete waste of time.
 

Leb

Member
Man, people are getting pretty bent out of shape about this. I mean, let's be clear: this is an academic paper presented at ACM's preeminent conference for computer graphics. The paper represents important foundational work for what will one day become an important technique for supplementing the processing power of connected devices.

Suggesting that this has any immediate implications for either next-gen console is disingenuous at best.
 

MrKayle

Member


am I ready for teh clod??
 

Respawn

Banned
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.
Always these posts without any legs to stand on. Do you run into burning buildings because someone tells you there's a chance the flames won't touch you? Or have you just ignored the posts that explain why this cannot be done at this level, from Azure to the Xbone?
 
Always these posts without any legs to stand on. Do you run into burning buildings because someone tells you there's a chance the flames won't touch you? Or have you just ignored the posts that explain why this cannot be done at this level, from Azure to the Xbone?

Why not? Azure has so-called "big compute" servers: beasts with 16 cores, 120 GB of RAM, and insanely high network connectivity between them (40 Gb/s).

500 of these servers managed to rank in the TOP500 supercomputers with 151 teraflops (far beyond what a Titan is capable of), at higher than 90% efficiency.

The only question is whether MS has enough of those servers to serve millions of gamers at a time.
 

Chobel

Member


am I ready for teh clod??

You are the 1%

Why not? Azure has so-called "big compute" servers: beasts with 16 cores, 120 GB of RAM, and insanely high network connectivity between them (40 Gb/s).

500 of these servers managed to rank in the TOP500 supercomputers with 151 teraflops (far beyond what a Titan is capable of), at higher than 90% efficiency.

The only question is whether MS has enough of those servers to serve millions of gamers at a time.

Except those servers aren't specialized for real-time graphics; they contain CPUs, not GPUs.
 
200 ms? 500 ms?

Brings me back to the good old days of being an HPB and sliding around in Quake until they made QuakeWorld with client prediction.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Why not? Azure has so-called "big compute" servers: beasts with 16 cores, 120 GB of RAM, and insanely high network connectivity between them (40 Gb/s).

500 of these servers managed to rank in the TOP500 supercomputers with 151 teraflops (far beyond what a Titan is capable of), at higher than 90% efficiency.

The only question is whether MS has enough of those servers to serve millions of gamers at a time.

Azure does not, as of now, have Titans in its servers. Why is that so difficult to understand?
 
Why not? Azure has so-called "big compute" servers: beasts with 16 cores, 120 GB of RAM, and insanely high network connectivity between them (40 Gb/s).

500 of these servers managed to rank in the TOP500 supercomputers with 151 teraflops (far beyond what a Titan is capable of), at higher than 90% efficiency.
I'm sure you don't see it, but the numbers you just posted prove how impossible this solution is on Azure.

Assuming for the sake of argument that compute teraflops are the exact same as GPU teraflops, 500 Azure servers equal about 50 Titans' worth of teraflops. That is, it takes 10 entire Azure "big compute" servers to equal 1 Titan.

Microsoft has announced that they're allocating 300,000 servers to the One. That means 30,000 players is the max load that could use this cloud lighting...and no one else could use the cloud for anything.
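Redoing that arithmetic with the thread's own figures (treating compute TFLOPS and GPU TFLOPS as interchangeable, and taking the roughly 3 TFLOPS per Titan that the "500 servers ≈ 50 Titans" comparison implies):

```python
# All numbers come from the posts above; the Titan rating is the one the
# comparison implicitly assumes, not an official spec.
azure_cluster_tflops = 151                 # 500 "big compute" servers
servers_in_cluster = 500
tflops_per_server = azure_cluster_tflops / servers_in_cluster        # ~0.30
titan_tflops = 3.0                         # implied by "500 servers ~ 50 Titans"
servers_per_titan = titan_tflops / tflops_per_server                 # ~10
xbox_one_servers = 300_000                 # Microsoft's announced allocation
players_served = xbox_one_servers / servers_per_titan                # ~30,000
print(round(servers_per_titan), round(players_served))
```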
 

nib95

Banned
The same baked GI that gets deemed the most unimpressive thing ever when people talk about Forza 5.



No, they are actually using the same global lighting algorithm Epic was using in UE4, running it in real time and then bringing the result to the PC.

Only the GI with respect to the sun direction. All the lighting in KZ SF is dynamic, including shadows. Everything in Forza 5 is pre-baked. I'd be surprised if F5 even uses GI. Based on the track we saw, it looks like it uses clamped highlights, rays, and techniques similar to Killzone 2.
 
Wow, I'm definitely shocked. I thought we were at least a decade from cloud rendering being possible, let alone ready for a demonstration...this is a BFD for sure.

It also makes me wonder about external GPU rendering through USB 3.0 (or even 2.0?). If this can be done now through the internet, why can't it be done through USB now? That'd make high-end PC gaming much more accessible for me and, I'm sure, many others. There was a demonstration of external full rendering (not just part of the pipeline) through USB 2.0 a while ago; AMD and Nvidia should be all over that if they want $$$, in my opinion.
 
Why are we assuming that no better hardware could be produced to do this job? I'd suggest that a GPU made to go into a server farm to do this job could have a lot of stuff cut out and be optimised for cost/performance.
 

twobear

sputum-flecked apoplexy
sure, nvidia might claim the tech demo is using cloud compute, but i'm sure that exactly like simcity it's all local and the connection is just for drm
 
Why are we assuming that no better hardware could be produced to do this job? I'd suggest that a GPU made to go into a server farm to do this job could have a lot of stuff cut out and be optimised for cost/performance.
No one is assuming that. Inevitably better hardware will be produced. And Nvidia already makes server farm cards under the "Grid" trade name; each card includes multiple Kepler GPUs. Over time, improved versions will be rolled out. The question, as GameSeeker trenchantly observed, is whether the combination of server hardware and internet service will improve in cost/performance faster than client hardware.

sure, nvidia might claim the tech demo is using cloud compute, but i'm sure that exactly like simcity it's all local and the connection is just for drm
When there's a reasonable discussion going on, posting sarcastic responses to imagined idiots is very counterproductive. If you're concerned about the level of discourse, don't be part of the problem.
 
It also makes me wonder about external GPU rendering through USB 3.0 (or even 2.0?).

Why do you need that?
1) Cost will be the same.
2) Bandwidth will be lower.
3) Latency will be higher.

I don't see any positives, besides an easier installation process.

--

Also, guys, how come you don't believe in cloud computing for additional stuff, but have no problem with Gaikai? It's basically the same thing (same problems with latency, internet bandwidth, and server power).
 
Also, guys, how come you don't believe in cloud computing for additional stuff, but have no problem with Gaikai? It's basically the same thing (same problems with latency, internet bandwidth, and server power).
Gaikai is not the same thing. Rather than split the workload across the client and the server, it does all work on the server. The only thing passed to the client is a video stream, and the only thing sent back is controller inputs. The biggest technical advantage over the cloud-distributed idea is being able to have all calculations run in a single locale, thus not requiring more elaborate executables that have to balance multiple, variably-latent resources.

In addition, video streaming is a very mature technology. Not only does it require comparatively light bandwidth, it also has more graceful degradation. If a cloud-distributed solution has problems, entire graphical effects will be missing, or will be delayed versus other elements. This is visible in the original video as pop-in, flickering, and visual sync issues. In the same situations for server-side setups, color depth or resolution can fluctuate dynamically, or compression can throttle the bitrate, which may be less visible to users.

That's not to say latency is no problem with a Gaikai-type solution; controller input lagging is usually going to disrupt player immersion worse than lighting lag, for example. And different people might have different preferences regarding pop-in versus compression artifacts. As for server power, that's certainly a concern--but that's why Sony have only announced Gaikai to run previous-gen games.

The differences in tech, though, mean that the Gaikai solution is far more economically feasible right now. Remember Gaikai was a real product, used millions of times, years ago. Cloud-distributed graphics, on the other hand, are still in the research phase. They might take over in coming years, but it'll be a slow ramp up. (They won't necessarily follow the same curve, but just for comparison Gaikai was announced at GDC '09 and didn't become a live product for two years.)
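One back-of-envelope way to see the economic gap; the video figure here is my assumption (720p game streaming of that era was typically quoted at a few Mb/s), while the 43 Mb/s comes from the SIGGRAPH numbers quoted earlier in the thread:

```python
# Rough bandwidth comparison: streaming finished video vs streaming the
# demo's indirect-lighting (photon) data for a BF3-sized map.
video_stream_mbps = 5        # assumed Gaikai-style 720p stream, ballpark only
gi_photon_data_mbps = 43     # figure quoted from the presentation
print(f"~{gi_photon_data_mbps / video_stream_mbps:.1f}x more bandwidth "
      f"for the cloud-distributed lighting data")
```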
 
Why do you need that?
1) Cost will be the same.
2) Bandwidth will be lower.
3) Latency will be higher.

I don't see any positives, besides an easier installation process.

--

Also, guys, how come you don't believe in cloud computing for additional stuff, but have no problem with Gaikai? It's basically the same thing (same problems with latency, internet bandwidth, and server power).
1) Cost will not be the same if you don't have to buy a new computer.
2) Not on USB 3.0 (5 to 10 gigabits per second), and usually not on 2.0 (480 megabits per second). Although 2.0 only allows one-way communication at a time, while 3.0 allows two-way in parallel.
3) I have no idea where you're getting the latency argument from; care to explain?

Just doing some quick research on this, I've found that people have already jerry-rigged something like this themselves, sort of. Apparently USB 3.0 is a WIP (a good enough PCI-to-USB 3.0 converter is needed), but PCI is working just fine right now with external GPUs.

And keep in mind that the link above is talking about full pipeline rendering, whereas Nvidia is just demonstrating supplemental rendering aid through the internet. USB 3.0, possibly even 2.0, and certainly PCI would be fast enough for that.
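Quick sanity check of "fast enough for that", using only numbers already in this exchange (the 43 Mb/s figure is the one quoted from the presentation earlier in the thread):

```python
# Headroom of each link over the demo's reported indirect-lighting bandwidth.
links_mbps = {
    "USB 2.0": 480,       # 480 megabits per second, as stated above
    "USB 3.0": 5_000,     # 5 gigabits per second; PCIe is faster still
}
demo_requirement_mbps = 43
for name, bw in links_mbps.items():
    print(f"{name}: ~{bw / demo_requirement_mbps:.0f}x the demo's requirement")
```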
 
1) Cost will not be the same if you don't have to buy a new computer.
2) Not on USB 3.0 (5 to 10 gigabits per second), and usually not on 2.0 (480 megabits per second). Although 2.0 only allows one-way communication at a time, while 3.0 allows two-way in parallel.
3) I have no idea where you're getting the latency argument from; care to explain?

Just doing some quick research on this, I've found that people have already jerry-rigged something like this themselves, sort of. Apparently USB 3.0 is a WIP (a good enough PCI-to-USB 3.0 converter is needed), but PCI is working just fine right now with external GPUs.

And keep in mind that the link above is talking about full pipeline rendering, whereas Nvidia is just demonstrating supplemental rendering aid through the internet. USB 3.0, possibly even 2.0, and certainly PCI would be fast enough for that.

1) What? How exactly will you pay less by buying the same video card, just with a different connection?
2) Where is "up to 10 Gb" coming from? All the sources say only 5 Gb/s in one direction. And PCI-E 2.0 already has ~16 GB/s. Look closely: not gigabits, like USB 3.0, but gigabytes.
3) Latency argument? I assume motherboard designs favor PCI-E slots for latency. Dunno, I may be wrong; do you have data to refute me?
 