
NVIDIA unveils Cloud Based Indirect Lighting

Sethos

Banned
Oh, you used one of the magical buzzwords - This thread is going to be real interesting in 5 ... 4 ... 3 ... 2 ...
 

Glix

Member
And who is hosting these resources? How many servers do you need for each instance of the game that is running?

As OnLive showed us, the tech for this stuff can work, but it takes way too much overhead.
 

Nirolak

Mrgrgr
Oh, you used one of the magical buzzwords - This thread is going to be real interesting in 5 ... 4 ... 3 ... 2 ...
On the one hand it's cloud-based lighting; on the other hand it isn't Microsoft's cloud, so the thread could go either way.

That said, this seems to hold up pretty well even in high-latency situations, but I'm not sure who would be interested in adding that much cost overhead.
 

nubbe

Member
What if I have two computers with 3-way SLI and connect them with an Ethernet cable?

That's a local cloud.
 

KKRT00

Member
Yup, this thread will go places.

That kind of tech is all sorts of awesome. I wonder how they work out the latency, since each of the demos showed 100+ ms of latency.

Watch it to the end; they show how it looks even with 1 s of latency.

The tech looks really good, but I still think it's not tech for PC. You can do all of that in real time, since you only have to update GI every 3-7 frames [something CryEngine 3 can be set up to do].
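For illustration, here is a minimal sketch of that amortization idea (all names and numbers are hypothetical, not taken from any engine): pay the expensive GI cost only on every Nth frame and reuse the cached result in between.

```python
# Sketch: amortize GI updates across frames, in the spirit of the
# "refresh GI every 3-7 frames" approach mentioned above.

FRAMES_PER_GI_UPDATE = 4  # anywhere in the 3-7 range works

cached_gi = None

def compute_gi(scene):
    # Stand-in for an expensive GI pass (e.g. a voxel cone tracing solve).
    return f"gi_solution_for_{scene}"

def render_frame(frame_index, scene):
    global cached_gi
    # Only refresh GI on every Nth frame; all other frames reuse the cache.
    if cached_gi is None or frame_index % FRAMES_PER_GI_UPDATE == 0:
        cached_gi = compute_gi(scene)
    return f"frame {frame_index}: shaded with {cached_gi}"

for i in range(8):
    print(render_frame(i, "sponza"))
```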
 

ferr

Member
Guessing it'll be like NVIDIA + PhysX, except the extra hardware is reached over the network. Buy an NVIDIA card, get access to servers. Devs will use it just like they use PhysX: just another asset to utilize if available.
 

Majanew

Banned
Cool. Wonder why Microsoft didn't bother to show what Azure was doing to improve any visual effects for Xbox One games. When you make claims, you should prove them.
 

Vestal

Gold Member
RAM and cloud computing send GAF into flames.
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.
 

Darkroronoa

Member
Very interesting, so that's what you can do with the power of the cloud, huh? I guess Microsoft will use something like this in the future, because if I understand correctly this won't come cheap. Maybe that's why they haven't demonstrated anything like this yet.
Pretty awesome that we will see stuff like this in PC games too.
 

nib95

Banned
Cool. Wonder why Microsoft didn't bother to show what Azure was doing to improve any visual effects for Xbox One games. When you make claims, you should prove them.

Because Azure has nowhere near the tech or GPU performance capability of this Nvidia cloud. Still watching through it now. I'm intrigued. It does appear this might not be financially viable for some time.
 

benny_a

extra source of jiggaflops
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.
This also flies in the face of the claim that Azure's cloud can be used for this.

This is using Titans and Q5000 graphics cards, not CPUs.

Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.

I don't know that anyone in the know ever said it was impossible; they said it was years away and expensive. I don't think this video changes that perception.

I think most tech experts agree that we are all heading towards cloud computing... but if you think that Forza during the Xbox One launch is going to use it for graphics, you're getting a bit ahead of yourself.
 

fade_

Member
Plus, this kind of tech is still in its infancy, and I would be surprised if it's used in any games this console gen, let alone in launch titles.
 
I don't know that anyone in the know ever said it was impossible; they said it was years away and expensive. I don't think this video changes that perception.

I think most tech experts agree that we are all heading towards cloud computing... but if you think that Forza during the Xbox One launch is going to use it for graphics, you're getting a bit ahead of yourself.

And even then it only makes sense in a subscription-style package; otherwise it would be idiotic to maintain such a setup for games beyond their prime selling period.
 

FINALBOSS

Banned
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.

...Even Cerny said it could be done. No one said it couldn't be done.
 

Sethos

Banned
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.

I think you've misread a LOT of posts and completely failed to understand the arguments.
 
But does AMD have similar tech? Could this help with Xbox One?
Mark Cerny said the cloud wouldn't help with graphics, but does this prove that wrong?
 

GribbleGrunger

Dreams in Digital
I still don't see how you can build a game around the notion of 'cloud' computing. What about the people who haven't got the internet?
 
But... but... PR gimmick supported by money-hatted devs and Microsoft shills!
Although Microsoft does still need to show that features like this are possible through their cloud infrastructure.
Great work by NVIDIA.

No one said it couldn't be done.
LOL okay
 

Vestal

Gold Member
I don't know that anyone in the know ever said it was impossible; they said it was years away and expensive. I don't think this video changes that perception.

I think most tech experts agree that we are all heading towards cloud computing... but if you think that Forza during the Xbox One launch is going to use it for graphics, you're getting a bit ahead of yourself.
Did I ever mention any games using it currently?

I haven't even mentioned MS in any of this, lol. Everyone needs to stop being so defensive. I'm simply commenting on the tech and how it flies in the face of a lot of the talk surrounding the cloud.
 

tauke

Member
I still see potential for lighting artists who need a rough approximate render through the cloud to quickly preview a lighting mood before offline baking.

For real-time usage like gaming? There are various potential issues when the connection gets borked.
 

akira28

Member
Well, this proof of concept flies in the face of everything that has been said here about the cloud. It is cloud computation used to offload graphical computation from the local system.

There are still a lot of questions regarding the necessary bandwidth, server power, etc. But it is proof that it can be done.

Still waiting on proof that it should be done.

I don't trust the cloud for anything.
 

Sethos

Banned
I can already see where this is going. For months we've been arguing against the "INFINITE POWER OF THE CLOUD, IT'S AMAZING!" narrative, basically trying to say it has a lot of limitations and is far, far from the idealistic picture being painted. The system has tons of barriers standing in its way: what if people are offline? What if it's a multi-platform game? What will happen on the other platforms if they don't have any backend server support? Overhead? Worth implementing? Azure is primarily a CPU farm while this stuff also needs GPU rendering, as shown in the OP video, so how will that work? Basically, we were trying to deflate the pumped-up PR image of the cloud in its current form.

That is slowly turning into; LOL EVERYONE SAID THE CLOUD WAS FAKE AND IT DIDN'T WORK!!!1
 
So one thing that needs to be clarified is whether this is running completely on the cloud or augmenting a local client. Because if it's the former, it's nothing new and has zero implications for the XB1's "infinite power".
 

Sub_Level

wants to fuck an Asian grill.
Cloud will be used for skill server redundancy. If you lose your skill at Call of Duty you can system restore yourself back to a previous point in time and be good again.
 

StudioTan

Hold on, friend! I'd love to share with you some swell news about the Windows 8 Metro UI! Wait, where are you going?
...Even Cerny said it could be done. No one said it couldn't be done.

Yeah, no. I was specifically told that I was an idiot for believing that the cloud would ever be used to improve graphics.
 

pixlexic

Banned
I like it!
It's not like everyone was guessing... the cloud servers do not render the frames for you; they just compute certain data and stream it to the client to be added to the local rendering pipeline. It doesn't have to be 1:1 frame rendering.

This is a very good sign for both the PS4 and Xbone. Think of it as a way to upgrade the machines in the future without really upgrading their hardware.
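A minimal sketch of that split (purely illustrative, not NVIDIA's actual protocol): a "cloud" thread computes lighting updates on its own clock, while the client renders every frame with whatever update arrived last.

```python
# Sketch: the client never waits on the network; it just picks up the
# latest lighting payload whenever one happens to have arrived.

import queue
import threading
import time

lighting_updates = queue.Queue()

def cloud_lighting_server():
    # Pretend each GI solve plus network hop costs ~100 ms.
    for version in range(5):
        time.sleep(0.1)
        lighting_updates.put(f"gi_v{version}")

def client_render_loop():
    current_gi = "local_fallback"  # used before any update arrives
    for frame in range(20):
        try:
            # Non-blocking: a frame never stalls on the stream.
            current_gi = lighting_updates.get_nowait()
        except queue.Empty:
            pass
        print(f"frame {frame:2d} -> {current_gi}")
        time.sleep(0.016)  # ~60 fps local rendering

threading.Thread(target=cloud_lighting_server, daemon=True).start()
client_render_loop()
```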
 

nib95

Banned
A few of those scenes midway through the vid seemed, to me at least, to show lag or lighting pop-in at several points around the level, even at modest and controlled latencies. That could be a bit jarring in real-world application, though the comparisons at the end are less so because of the more spaced-out variance of light. I'm going to assume it is going to be a long time till this sort of tech comes to fruition and practical application. Do we even know how much computational power this technique actually saves? I didn't see anything that any of those devices could not render locally, especially given one of them had a local GTX 680, and the other a GT 650M(?).

On a side note, someone did mention earlier that GI could be one of the things offloaded because it does not need to be updated every frame. That seems to be the case here. I did not realise GI would count as something less latency-sensitive; interesting to see that they've managed to pull this off, albeit with serious cloud hardware muscle.
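One way (purely illustrative, not something shown in the video) to soften the pop-in described above: rather than swapping to a newly arrived GI result instantly, blend toward it over several frames.

```python
# Sketch: exponential blend toward a freshly arrived GI value so the
# update fades in instead of popping.

def lerp(a, b, t):
    return a + (b - a) * t

displayed_gi = 0.0   # stand-in for an on-screen GI intensity value
target_gi = 1.0      # value from a newly arrived cloud update
BLEND_RATE = 0.25    # fraction of the remaining gap closed each frame

for frame in range(10):
    displayed_gi = lerp(displayed_gi, target_gi, BLEND_RATE)
    print(f"frame {frame}: displayed GI = {displayed_gi:.3f}")
```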
 

squidyj

Member
Crassin does good work.

The thing is, you have to consider the latency of the solution, the amount of data that needs to be moved, and the fact that for a single-player game you can't amortize the cost across multiple players, unless the changes in the lighting conditions are predictable, in which case you could have predicted them and precomputed the GI solution anyway.
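Some back-of-the-envelope numbers for those tradeoffs (every figure below is an assumption, not from the video): the bandwidth of streaming a GI volume, and how per-player server cost only falls when one solve is shared.

```python
# Sketch: rough bandwidth and amortization arithmetic, all inputs assumed.

gi_cells = 64 ** 3        # assumed 64^3 voxel GI volume
bytes_per_cell = 4        # assumed RGBA8 per cell
updates_per_second = 10   # assumed GI refresh rate

raw_rate = gi_cells * bytes_per_cell * updates_per_second
print(f"uncompressed stream: {raw_rate / 1e6:.1f} MB/s")   # ~10.5 MB/s

compressed_rate = raw_rate / 20  # assume ~20:1 video-style compression
print(f"compressed stream:   {compressed_rate / 1e6:.2f} MB/s")

# Amortization: a shared solve divides one server's cost across players.
server_cost_per_hour = 1.0  # assumed $/hour for one cloud GPU
for players in (1, 8, 64):
    print(f"{players:3d} player(s) sharing one solve: "
          f"${server_cost_per_hour / players:.3f} per player per hour")
```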
 

benny_a

extra source of jiggaflops
I still see potential for lighting artists who need a rough approximate render through the cloud to quickly preview a lighting mood before offline baking.
It would be interesting to see how many lighting artists a company has, and whether it's worth setting up GeForce cloud servers vs. just putting Nvidia Titans into the lighting artists' workstations.

In the end, economics plays the biggest role. If Nvidia started offering a GeForce cloud and maybe partnered with Epic on Unreal Engine to do what you describe, then people could use that.

But that service would need to be cheaper, or people would just continue to buy high-end cards for that specific use case, which is less money for Nvidia again. It's probably a margin game at that point.
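Some illustrative break-even math for that margin game (all prices are assumptions): renting a cloud GPU seat vs. putting a Titan in each artist's workstation.

```python
# Sketch: when does a one-off local card beat a recurring cloud seat?

titan_price = 1000.0       # assumed one-off cost of a high-end card
cloud_rate = 0.50          # assumed $/hour for a cloud GPU seat
hours_per_month = 160      # one artist's working hours per month

monthly_cloud_cost = cloud_rate * hours_per_month
months_to_break_even = titan_price / monthly_cloud_cost

print(f"cloud seat: ${monthly_cloud_cost:.0f}/month per artist")
print(f"a local card pays for itself after {months_to_break_even:.1f} months")
```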
 