
NVIDIA unveils Cloud Based Indirect Lighting

nib95

Banned
Yeah, but DriveClub looks poor graphically, so it'd be hard to argue that offloading the lighting wouldn't be of any benefit.

That was a 35% complete pre-alpha build...

Killzone Shadow Fall is using GI too. And these are LAUNCH titles running on far worse GPUs than a Titan...
 

glenn8

Banned
Seriously?

[reaction GIFs: lol-no.gif, tumblr_m2ingt6FmK1r3nmy6.gif]
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
That's quite a statement.

Not only do you need to spend extra time and resources writing network code, changing the rendering pipeline, and handling network interruptions or congestion, but you also have to pay for servers with GPUs to make it work.

This will not be used anytime soon. Period.

Not when the benefits are so small compared to the costs.
 

shandy706

Member
The pop in becomes really noticeable at 200ms of latency. That might get a little distracting in environments where lights are turning on and off all the time (say, any action game.)

It seems the best use is real time lighting for the overall world. I would think lighting that moves quickly (flashlights, fire, particles) should be left to local rendering. It would certainly help free up resources when the local client doesn't have to render world lighting.
 
It's funny, even with it being in the OP, people are ignoring that this requires a lot of GPU power. Something Azure does not have.
 
It seems the best use is real time lighting for the overall world. I would think lighting that moves quickly (flashlights, fire, particles) should be left to local rendering. It would certainly help free up resources when the local client doesn't have to render world lighting.

I think you could decrease the compute cost by keeping it only for the sun and maybe some stationary lights, like torches in rooms. For bullets, flashlights and particles, I'd skip using them for GI.
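Roughly, that split could look something like this. This is a made-up sketch, not anything from NVIDIA's paper: the light descriptors, thresholds, and the tagging scheme are all hypothetical, just to illustrate "sun and stationary lights go to cloud GI, fast or flickering lights stay local."

```python
# Hypothetical light descriptors; names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class Light:
    name: str
    max_speed: float            # world units per second the light can move
    toggles_per_minute: float   # how often it switches on/off

def use_cloud_gi(light: Light) -> bool:
    """Send slow, stable lights (sun, torches) to cloud GI; keep fast or
    flickering lights (flashlights, muzzle flashes, particles) local only."""
    return light.max_speed < 0.1 and light.toggles_per_minute < 1.0

lights = [
    Light("sun", max_speed=0.01, toggles_per_minute=0.0),
    Light("room_torch", max_speed=0.0, toggles_per_minute=0.0),
    Light("flashlight", max_speed=5.0, toggles_per_minute=10.0),
    Light("muzzle_flash", max_speed=0.0, toggles_per_minute=120.0),
]

for light in lights:
    target = "cloud GI" if use_cloud_gi(light) else "local only"
    print(f"{light.name}: {target}")
```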
 

Takuya

Banned
It seems the best use is real time lighting for the overall world. I would think lighting that moves quickly (flashlights, fire, particles) should be left to local rendering. It would certainly help free up resources when the local client doesn't have to render world lighting.

Like everyone has said from the start, it's only really even remotely useful for areas where you don't much care whether the indirect lighting updates or not. For the focus area, it's terrible when latency bogs it down.
 

EvB

Member
:)

I'm willing to eat crow at launch if you are.

I am; there is nothing wrong with eating a little crow (or a big one).

I think Sony are putting themselves in an awkward position by launching the undeniably impressive GT6 just before DriveClub.

Forza 5 is an impressive game, but I still think GT has the edge. I'm not even sure why.
 

pixlexic

Banned
It's funny, even with it being in the OP, people are ignoring that this requires a lot of GPU power. Something Azure does not have.

I don't think people are looking at it as an MS-only deal. Also, if someone like MS or Sony went to a manufacturer of Nvidia or ATI GPUs and said "we need 100 thousand," I am sure they would work out a wholesale deal.
 

nib95

Banned
I only saw an issue at 1000ms, even 500ms looked ok in the demo. Anything under 200ms looked damn near perfect.

Perhaps in the last segment. The midway scenes had pop in even at decent latency. Lighting pop in, flickering, all sorts. Certain pillars go from lit to unlit in a split second etc.
 

pixlexic

Banned
Perhaps in the last segment. The midway scenes had pop in even at decent latency. Lighting pop in, flickering, all sorts. Certain pillars go from lit to unlit in a split second etc.

I didn't see that, and it wouldn't work that way anyway: the local client would keep the last set of data until it got another response.

The only thing the video shows is that at very high latency the lighting is a bit behind.
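That fallback is easy to picture. Here's a minimal, hypothetical sketch (not from the paper, class and method names are mine) of a client that just keeps the last indirect-lighting payload it received and reuses it whenever a newer one hasn't arrived:

```python
class IndirectLightCache:
    """Keeps the most recent indirect-lighting data received from the server.

    Late or missing packets never blank the lighting; the renderer keeps
    shading with the last payload it got (it only grows stale).
    """

    def __init__(self):
        self.latest_seq = -1
        self.latest_payload = None  # e.g. a lightmap or voxel-grid blob

    def on_packet(self, seq: int, payload: bytes) -> None:
        # Ignore out-of-order packets so old data can't overwrite newer data.
        if seq > self.latest_seq:
            self.latest_seq = seq
            self.latest_payload = payload

    def get_for_frame(self):
        # Called every frame: returns stale data under lag, or None before
        # the first packet (fall back to local-only lighting in that case).
        return self.latest_payload
```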
 
It's funny, even with it being in the OP, people are ignoring that this requires a lot of GPU power. Something Azure does not have.

Right now it doesn't.

But that's the great thing about having these resources in the cloud. MS is free to update and improve the hardware in their servers over the life of the console, adding powerful GPUs, updating the CPU/RAM, etc. On the console itself, the hardware is going to stay the same for the entire generation, but that isn't the case with Azure.
 
Right now it doesn't.

But that's the great thing about having these resources in the cloud. MS is free to update and improve the hardware in their servers over the life of the console, adding powerful GPUs, updating the CPU/RAM, etc. On the console itself, the hardware is going to stay the same for the entire generation, but that isn't the case with Azure.

They haven't announced it yet; who knows, maybe they do have, or are building, Nvidia Titan racks at Azure right now. From what I saw, it is extremely easy to replace the containers holding the CPU racks at Azure data centers.

Trololol, that's what the E3 Titan cards were for: emulating the X1 and cloud resources...

Guessing DriveClub uses real-time global illumination, given the sun moves around, sets, rises, etc. A DriveClub dev comments on it here.

http://www.youtube.com/watch?v=VoengumG6FI&t=3m40s

Even more curious to see how this game ends up looking and performing at launch now.

You want us to count frames in a shitty video played by a terrible or distracted driver?
Hard to tell; it could be updating once every 500-1000ms for all I know.
 

Tripolygon

Banned
This is running on Nvidia GRID; Azure is different from Nvidia GRID. Think of this as SLI but over the internet: the server (Nvidia GRID) renders the indirect illumination and sends it down, the local client (an Nvidia 650M) renders the scene, and the two are composited together to form the final frame. An oversimplification, of course.

This experiment was done in a closed environment. Nobody said it couldn't be done, but it's going to be difficult in an open internet environment where major latency is introduced, which is what people have been saying.
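As a rough picture of that split (my own sketch, not NVIDIA's code): the client shades direct lighting locally every frame and adds whatever indirect term the server last delivered, something like final = direct + albedo * indirect per pixel.

```python
import numpy as np
from typing import Optional

def composite_frame(direct: np.ndarray, albedo: np.ndarray,
                    cloud_indirect: Optional[np.ndarray]) -> np.ndarray:
    """Combine locally rendered direct lighting with server-supplied
    indirect lighting. All inputs are HxWx3 float arrays in linear space.

    If no indirect data has arrived yet, the frame still renders,
    just without bounce lighting.
    """
    if cloud_indirect is None:
        return direct
    # Simple additive composite: direct light plus albedo-modulated bounce light.
    return direct + albedo * cloud_indirect

# Toy usage with a 2x2 "image".
h, w = 2, 2
direct = np.full((h, w, 3), 0.5)
albedo = np.full((h, w, 3), 0.8)
indirect = np.full((h, w, 3), 0.2)   # what the server last sent
print(composite_frame(direct, albedo, indirect))
```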
 

Damian.

Banned
I think cloud computing is a little more than just a buzzword lol

Let him have his fun while most people don't know any better.

On topic, cloud computing is going to make some massive strides in the next decade or so when latency gets down to acceptable ping levels. Make sure your bodies are ready.
 

GameSeeker

Member
[image snapshot]


Found this.

If folks read the actual Siggraph presentation you will see that they look at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process

They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)

So, a good research paper, but not economically practical either on the client or the server side yet.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.
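To put that 43 Mb/s figure in perspective, here's a back-of-the-envelope calculation. The update rates are my assumption, not something stated in the presentation:

```python
# Rough arithmetic on the quoted 43 Mb/s figure for the photon-based approach.
bandwidth_mbit = 43                                 # megabits per second, from the presentation
bandwidth_bytes = bandwidth_mbit * 1_000_000 / 8    # ~5.4 MB per second

for updates_per_sec in (10, 30, 60):                # assumed lighting update rates
    per_update_kb = bandwidth_bytes / updates_per_sec / 1000
    print(f"{updates_per_sec} Hz -> ~{per_update_kb:.0f} KB of photon data per update")

# For comparison, a typical 2013 home connection was well under 43 Mb/s downstream.
```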
 
They haven't announced it yet; who knows, maybe they do have, or are building, Nvidia Titan racks at Azure right now. From what I saw, it is extremely easy to replace the containers holding the CPU racks at Azure data centers.

Trololol, that's what the E3 Titan cards were for: emulating the X1 and cloud resources...

It's totally possible. I'm just going off the assumption that the servers they're offering for XBL Cloud Computing are the standard set of instances they offer in Azure.

It is really easy to replace the containers in Azure. It takes about 24 hours to configure a new set of 2,000-3,000 servers, since they're self-contained in a shipping container. That number is actually from 2010-2011, so maybe they're able to install new servers even faster, or have a higher density of servers in the containers.

This is definitely one area to keep an eye on to see how it evolves over the life of the console. I'm hoping MS will allow developers to open up about this at GDC.
 

shandy706

Member
Just to clarify, you're saying neither KZ SF nor DriveClub uses real-time GI?

Watching DriveClub, it looks like they're using that flying orb for lighting sometimes, heh. The warping/moving light on the trees during gameplay is disorienting.

Anyway...off subject.
 

nib95

Banned
If folks read the actual Siggraph presentation you will see that they look at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process

They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)

So, a good research paper, but not economically practical either on the client or the server side yet.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.

Interesting. So cloud computing advantages in terms of graphics are still very limited. How much computational effort is actually saved by using the cloud for this? Obviously right now the costs and bandwidth limitations make it non-viable, but what about in five years' time? Or will local hardware (and, with next-gen consoles, engine optimisations) have caught up enough that real-time GI would be more easily serviceable?
 

Pimpbaa

Member
A Titan for every server? Stutter when a person was moving through the demo? An insane amount of bandwidth? It can be done, but there are some major hurdles to overcome before this becomes available for PC or consoles. Internet providers are going to hinder this more than anything. I just hope it doesn't take as long as the generation after next. Also, doing ALL the rendering server-side, like OnLive, might be more desirable by then.
 

Ploid 3.0

Member
So it's real....Well damn.

NVIDIA showed off their cloud rendering in real time with Transformers CGI stuff. They have GPU farms that they rent to companies. These graphics aren't for gaming, though. During the stage where you're planning and creating CG assets it looks very basic and a bit laggy, but workable. This is for a different market.

http://www.youtube.com/watch?v=HYUOUMy-VDo

Video of it in action.

Think of it as the ability to control powerful PCs from any laptop/pc. Like remote desktop.
 
If folks read the actual Siggraph presentation you will see that they look at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process

They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)

So, a good research paper, but not economically practical either on the client or the server side yet.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.

43 Mb/s! Guess this isn't feasible yet.
 

Hollow

Member
I noticed a bit of jitter in some of the demonstrations in that vid, but this is still impressive and an interesting proof of concept.

Finally, someone has shown the cloud isn't just PR bullshit.
 
They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)

What about the other systems, like the voxel one and the irradiance map? How much bandwidth do they need?
 

WolvenOne

Member
And even then it only makes sense in a subscription-style package; otherwise it would be idiotic to keep such a setup running for games beyond their prime selling period.

Yes, this. It makes no sense to keep rendering servers up and running for single-player games that, six months after launch, only have 10K people playing at any given time, especially if those ten thousand are spread over a wide geographic area and aren't paying a subscription.

Additionally, I'm a little skeptical about certain aspects of this demo. Most of these scenes assumed 150ms latency, which is the absolute best real-world scenario you could ask for. Even with a good connection, however, it's not uncommon for latency to rise to 500-700ms or more in certain situations. What's going to happen to the lighting then?

Finally, I think this might be a decent solution for, say, MMOs, but only for lighting that the player doesn't have direct control over. If there's a half-second lag burst during a sunrise animation, it might not be too noticeable, but if a character is using a flashlight and gets hit by lag even momentarily, they'll notice immediately.

This brings up my last concern: flexibility. When using such a system, can latency-critical lighting effects be handled locally, or is it an all-or-nothing affair? I didn't catch them mentioning anything along these lines during the presentation.

Overall, it's an interesting tech, but I don't think it really changes anything. There are still practical and technical limitations that make this sort of thing difficult to implement, and even when implemented, it'd be most useful serving a function similar to pre-baked lighting.
 

FyreWulff

Member
So, the impossible.....is looking possible?

The impossible is still impossible.

A component of rendering is now possible, but cloud rendering as a whole is still not. Well, it's technically possible, but games would have massive input lag. Think Splosion Man or Halo campaign co-op button lag, but actually a bit worse, and across every game no matter what mode you're playing. I guess the key word here is practical. Cloud rendering is not practical.
 

FINALBOSS

Banned
Yeah, no. I was specifically told that I was an idiot for believing that the cloud would ever be used to improve graphics.

Not my fault you've been talking to naughty members.


And even if they did say that, I'm sure they meant now.

Who the hell knows what fancy shit we'll have in the future.
 

WolvenOne

Member
If folks read the actual Siggraph presentation you will see that they look at the entire rendering pipeline to see which parts would make sense for Cloud computing.

Here are the parts of the pipeline they conclude are BAD for Cloud computing:
1) UI
2) Shadow Map Render
3) Physics
4) Direct Illumination/Composite/Post-Process

They found only one part of the pipeline suitable for Cloud computing: Indirect Illumination.

And to do the demo they needed:
1) A GeForce Titan in every server
2) 43Mb/s Internet bandwidth for a Battlefield 3 size map (using Photon system)

So, a good research paper, but not economically practical either on the client or the server side yet.

Which will improve faster, client H/W or server H/W + Internet bandwidth? That's the magic question.

Holy, that is a LOT of hardware to invest in for a single player. It'd almost be more practical just to give everyone Titan cards. :p
 