
NVIDIA unveils Cloud Based Indirect Lighting

Ok so the video keeps crapping out on me and I've just been able to see the title screen.

But I assume the systems composing this cloud are using GPUs.. is that the case?
 

Ahasverus

Member
First realtime screenshot of cloud based indirect lightning

gaming-lightingreturns-cloud.jpg
/dead
 

Krakn3Dfx

Member
I don't ever want to buy a game knowing that 5 years down the line it'll be unplayable because the publisher couldn't financially justify keeping the indirect lighting cloud servers online.

Maybe that's just me though.
 

i-Lo

Member
Yeah, no. I was specifically told that I was an idiot for believing that the cloud would ever be used to improve graphics.

Yes, realistically it's still an impractical feat with existing tech. Variable lag is the biggest barrier.
 
I like it!
It's not like everyone was guessing.. the cloud servers do not render the frames for you; they just render certain data aspects and stream that data to the client to be added to the local rendering pipeline. It doesn't have to be 1:1 frame rendering.

This is a very good sign for both the ps4 and Xbone. Think of it as a way to upgrade the machines in the future without really upgrading their hardware.
Yep, it is definitely not going to replace hardware at this point, just augment it.
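To make the "stream data, not frames" idea concrete, here is a minimal hypothetical sketch (every name is made up; nothing here is from NVIDIA's demo): the server periodically solves the expensive indirect lighting, and the client folds the latest result into the direct lighting it computes locally each frame.

```python
# Hypothetical sketch of the split described above; none of these names come
# from NVIDIA's demo. The point is that the server streams lighting *data*,
# not finished frames, and updates can arrive less often than the client renders.

def cloud_compute_indirect(scene_state):
    """Would run in the cloud: the expensive bounce-lighting solve (stubbed here)."""
    return {probe_id: 0.5 for probe_id in scene_state["probes"]}

def client_render_frame(scene_state, latest_indirect):
    """Runs locally every frame: cheap direct lighting plus whatever indirect
    data most recently arrived from the server (possibly slightly stale)."""
    direct = {probe_id: 1.0 for probe_id in scene_state["probes"]}
    return {p: direct[p] + latest_indirect.get(p, 0.0) for p in direct}

scene = {"probes": ["p0", "p1", "p2"]}
indirect = {}                     # empty at first: the game still renders without it
for frame in range(6):
    if frame % 3 == 0:            # server updates arrive less often than frames render
        indirect = cloud_compute_indirect(scene)
    lighting = client_render_frame(scene, indirect)
```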
 

shandy706

Member
Many people here and across the Internet are wrong about the "cloud" systems being explored.

When used correctly, it's very capable.

I don't ever want to buy a game knowing that 5 years down the line it'll be unplayable because the publisher couldn't financially justify keeping the indirect lighting cloud servers online.

Maybe that's just me though.

Yeah, because it's impossible to put generic lighting in for when a game can't connect...right. /sarcasm
 

DesertFox

Member
Seems like pretty promising tech. I think that it is still a way off from being able to power a title that sells millions of copies (How would the logistics of 5 million people trying to use this at once work?), but it definitely gives credence to claims that this can be viable for offloading graphics crunching in the future.
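For a feel of the logistics question, a purely back-of-envelope calculation; the sessions-per-GPU number is an invented assumption, not anything from the video:

```python
# Purely illustrative arithmetic for the "5 million players at once" question.
# sessions_per_gpu is an invented assumption; the demo gives no such figure.
concurrent_players = 5_000_000
sessions_per_gpu = 20                     # hypothetical players served per cloud GPU
gpus_needed = concurrent_players // sessions_per_gpu
print(f"{gpus_needed:,} cloud GPUs")      # 250,000 with these made-up numbers
```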

...Even Cerny said it could be done. No one said it couldn't be done.

No one in the industry, or no one on this forum? Because there were definitely people on this forum saying it couldn't be done.
 

Nirolak

Mrgrgr
So one thing that needs to be clarified is whether this is running completely in the cloud or augmenting a local client. Because if it's the former, it's nothing new and has zero implications for the XB1's "infinite power".
The tablet solution is full cloud rendering while for desktops it feeds data into the local rendering pipeline.
 

Ahasverus

Member
I don't ever want to buy a game knowing that 5 years down the line it'll be unplayable because the publisher couldn't financially justify keeping the indirect lighting cloud servers online.

Maybe that's just me though.
This. But it's your choice if you want to support this; don't weep later though.
 

Vic20

Member
I don't ever want to buy a game knowing that 5 years down the line it'll be unplayable because the publisher couldn't financially justify keeping the indirect lighting cloud servers online.

Maybe that's just me though.

I worry about this as well, and I think the best solution is to have any game that uses tech like this also include the option to do all of its calculations locally if so desired. This would, at least on the PC, allow the game to stand on its own. So as the parent company moves on and the hardware we own improves over time, we would just switch to doing those calcs locally. I can dream, right...?
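One way that fallback could be structured, sketched with entirely hypothetical function names: try the cloud first, then a local computation of the same data, then baked lighting as the last resort.

```python
# Hypothetical fallback chain; none of these functions exist in any real SDK.

def get_indirect_lighting(scene, cloud_available, local_gpu_is_fast):
    if cloud_available:
        return request_indirect_from_cloud(scene)   # streamed data, best quality
    if local_gpu_is_fast:
        return compute_indirect_locally(scene)      # same maths, run on the local GPU
    return load_baked_lightmaps(scene)              # static "generic" lighting

def request_indirect_from_cloud(scene):
    return {"source": "cloud"}

def compute_indirect_locally(scene):
    return {"source": "local"}

def load_baked_lightmaps(scene):
    return {"source": "baked"}

print(get_indirect_lighting({}, cloud_available=False, local_gpu_is_fast=True))
```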
 
Ok so the video keeps crapping out on me and I've just been able to see the title screen.

But I assume the systems composing this cloud are using GPUs.. is that the case?

All our indirect lighting algorithms run in the Cloud on a GeForce TITAN. The voxel algorithm streams video to the user and relies on a secondary GPU to render view-dependent frames and perform H.264 encoding. Because this secondary GPU leverages direct GPU-to-GPU transfer features of NVIDIA Quadro cards (to quickly transfer voxel data), we use a Quadro K5000 as this secondary GPU. Timing numbers for client-side photon reconstruction occurred on a GeForce 670.

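Restating the division of labor in that answer as a rough sketch (hypothetical names, plain Python standing in for real GPU work): the TITAN in the cloud always runs the indirect-lighting solve; for the voxel path a Quadro K5000 then renders view-dependent frames and H.264-encodes them for streaming, while for the photon path the data goes to the client GPU, which reconstructs it into its local pipeline.

```python
# Rough restatement of the setup described in the quote above; all function
# names are hypothetical, and the strings only label which GPU does what.

def cloud_titan_indirect_solve(scene, mode):
    """GeForce TITAN in the cloud: runs the chosen indirect-lighting algorithm."""
    return {"mode": mode, "data": f"indirect lighting for {len(scene)} objects"}

def cloud_quadro_encode(solved):
    """Quadro K5000 (voxel path only): renders view-dependent frames and
    H.264-encodes them; it receives the data via GPU-to-GPU transfer."""
    return {"h264_stream": solved}

def client_consume(payload, mode):
    """Client GPU (a GeForce 670 in the timings above): decodes video for the
    voxel path, or reconstructs photons into the local pipeline otherwise."""
    return "decode and display video" if mode == "voxel" else "reconstruct photons locally"

for mode in ("voxel", "photon"):
    solved = cloud_titan_indirect_solve([{"mesh": "room"}], mode)
    payload = cloud_quadro_encode(solved) if mode == "voxel" else solved
    print(mode, "->", client_consume(payload, mode))
```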
 

nib95

Banned
Ok, hope enough people realize there is a lot of difference between a cloud system with GPUs and one with only CPUs..

Anyway, will have to wait to see the damn video...

Going from memory, a single Titan costs nearly as much to manufacture as the entire Xbox One console lol. That's based loosely on the BoM predictions for the Xbox One, a good portion of which is taken up by the cost of manufacturing Kinect as well. In other words, a single one of these servers is going to cost considerably more than a single Xbox One or PS4.
 
I can already see where this is going. For months we've been arguing against the "INFINITE POWER OF THE CLOUD, IT'S AMAZING!" narrative, basically trying to say it has a lot of limitations and is far, far from the idealistic picture that it's being portrayed as. The system has tons of barriers standing in its way, like: what if people are offline? What if it's a multi-platform game? What will happen on the other platforms if they don't have any backend server support? Overhead? Worth implementing? Azure being primarily a CPU farm whilst this stuff also needs GPU rendering, as proven in the OP video, how will that work? Basically trying to deflate the pumped up PR image of the cloud in its current form.

That is slowly turning into: LOL EVERYONE SAID THE CLOUD WAS FAKE AND IT DIDN'T WORK!!!1

Basically, this doesn't answer any of these questions. Plenty of people said that it was possible to use the cloud for things like these, but there were a lot of problems and questions regarding the tech.
 

????

All our indirect lighting algorithms run in the Cloud on a GeForce TITAN. The voxel algorithm streams video to the user and relies on a secondary GPU to render view-dependent frames and perform H.264 encoding. Because this secondary GPU leverages direct GPU-to-GPU transfer features of NVIDIA Quadro cards (to quickly transfer voxel data), we use a Quadro K5000 as this secondary GPU. Timing numbers for client-side photon reconstruction occurred on a GeForce 670.

Interesting...
 

Raist

Banned
Come on now. A lot of people on GAF said that.

A lot of people said that what Microsoft suggested it could do with it is completely unrealistic. Not that theoretically, cloud computing is a fairy tale.

It's like some guy tries to sell you a miracle cure for all cancers, and then a couple of months later there's a decent breakthrough in breast cancer, like improving remission by 50% in certain cases, and people go "hey, that guy was right all along!"
 
Really cool stuff. I knew when people were talking about cloud graphics that irradiance maps would be used, but the photon stuff is really cool as well.
 

benny_a

extra source of jiggaflops
I can already see where this is going. For months we've been arguing against the "INFINITE POWER OF THE CLOUD, IT'S AMAZING!" narrative, basically trying to say it has a lot of limitations and is far, far from the idealistic picture that it's being portrayed as. The system has tons of barriers standing in its way, like: what if people are offline? What if it's a multi-platform game? What will happen on the other platforms if they don't have any backend server support? Overhead? Worth implementing? Azure being primarily a CPU farm whilst this stuff also needs GPU rendering, as proven in the OP video, how will that work? Basically trying to deflate the pumped up PR image of the cloud in its current form.

That is slowly turning into: LOL EVERYONE SAID THE CLOUD WAS FAKE AND IT DIDN'T WORK!!!1

The prophecy has been fulfilled.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Yeah, no. I was specifically told that I was an idiot for believing that the cloud would ever be used to improve graphics.

Used in the real world? Nope.

Used in a proof of concept? Sure.
 

nib95

Banned
Ok that's cool.

2521341-1810295880-Snap%2B.png


Found this.

Seems like it's somewhat augmenting the lighting, but is it improving it?

That's what I'd like to know too. It also still seems to work better with better local hardware according to the vid. Maybe someone more well versed can break it down.

I know a few next gen games are using Global Illumination in conjunction with real time lighting and shadows (DriveClub, Killzone Shadow Fall), so just how resource- or computationally intensive can it be to justify the use of the cloud and such expensive cloud hardware?
 
The pop-in becomes really noticeable at 200ms of latency. That might get a little distracting in environments where lights are turning on and off all the time (say, any action game).
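One standard trick (not something the demo claims to use, just an illustration) is to blend toward newly arrived lighting values over a short window instead of snapping to them, trading a little extra lag for much less visible popping:

```python
# Hypothetical smoothing of late-arriving cloud lighting; toy numbers only.
# Instead of snapping to the new value when an update finally lands after
# ~200 ms, move toward it a little each frame so the change reads as a fade.

def blend_toward(current, target, dt, blend_time=0.25):
    """Moves `current` a fraction of the way to `target` each frame; after
    roughly `blend_time` seconds of frames it has mostly converged."""
    alpha = min(1.0, dt / blend_time)
    return current + (target - current) * alpha

lighting = 0.0            # value currently used for shading
latest_from_cloud = 1.0   # update that just arrived, already ~200 ms stale
for _ in range(20):       # simulate twenty ~16 ms frames
    lighting = blend_toward(lighting, latest_from_cloud, dt=0.016)
```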
 
That's what I'd like to know too. It also still seems to work better with better hardware. Maybe someone more well versed can break it down. I know a few next gen games are using Global Illumination in conjunction with real time lighting and shadows (DriveClub), so just how resource- or computationally intensive can it be to justify the use of the cloud and such expensive cloud hardware?
Yeah, but DriveClub looks poor graphically, so it'd be hard to argue that offloading the lighting wouldn't be of any benefit.
 

StudioTan

Hold on, friend! I'd love to share with you some swell news about the Windows 8 Metro UI! Wait, where are you going?
Ok that's cool.

2521341-1810295880-Snap%2B.png


Found this.

Seems like it's somewhat augmenting the lighting, but is it improving it?

Global illumination makes a big difference in the quality of the lighting and therefore improves the graphics (see the toy example below the image).

GI_garden_760x240.jpg
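As a toy illustration of what the indirect term adds (made-up numbers, simple Lambertian-style shading): with direct light only, a surface facing away from the light is pure black; adding even a crude bounced-light term fills it in.

```python
# Toy numbers only: direct-only vs direct-plus-indirect shading at one point.
albedo = 0.6
direct_light = 0.0       # the point is in shadow / faces away from the light
indirect_light = 0.35    # light bounced in from nearby lit surfaces (the GI term)

direct_only = albedo * direct_light                     # 0.0  -> pure black
with_gi = albedo * (direct_light + indirect_light)      # 0.21 -> visibly lit
print(direct_only, with_gi)
```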
 