
Nintendo researches GPGPU powered cloud computing

hteng

Banned
So they could use the cloud to calculate physics, send the data back to the game, and do a playback? How does that work? And could it do other things, like AI?

So if you don't have the internet, your game will just be a barren wasteland?
 

wsippel

Banned
Not you too, Nintendo. :-(
You can do this shit and make it entirely optional. At least in theory. And remember: Nintendo doesn't even have a "proper" account system in place, so DRM most likely won't become an issue, either. If the cloud does something for me, but doesn't restrict me, I'm all for it.
 
Don't worry, Iwata already said that the current cloud business is not yet good for gaming, so if Nintendo implements this in the Wii U successor it will be different, and done in a way people will actually like.
I won't like it because it will destroy my internet usage. A cloud-dependent game is effectively DRM.
 

Maximilian E.

AKA MS-Evangelist
Stupid move by Nintendo..

The honorable word of GAF has taught us that DA CLOUD has no gaming potential and cannot be beneficial to gaming.

The all-seeing GAF has taught us that it is really a waste of time and resources, because technology never advances, and therefore such a thing as DA CLOUD does not belong to gaming and will never evolve into something that could add myriads of possibilities to games..


or something along these lines..

(seriously, good for you Nintendo, even though you will not have front row seats, it is good to be attending at least, for "DA CLOUD SHOW" :)
 

Anteo

Member
It makes sense. The question with cloud computing is not "Can you do it?" but "Can you do it in time? What do we do if we can't?".

Research is always cool. They're checking out some technologies to see if they're viable.
 

Darryl

Banned
Okay then we are on the same page. The numbers might be bs, but it's a viable tech. We just need someone to make it happen in the real world. At least on a console.

ya of course its useful tech its just more of an into-the-future kinda thing that probably exists more-so to infinitely extend this console generation into the future rather than to provide any immediate and practical benefit to gaming
 

Meelow

Banned
When is Sony going to jump on this train?



...seems everyone is doing it.

It will be interesting to see how this pans out by mid/end gen.

 

Godslay

Banned
ya of course its useful tech its just more of an into-the-future kinda thing that probably exists more-so to infinitely extend this console generation into the future rather than to provide any immediate and practical benefit to gaming

We are close, though. I don't think it would be out of the realm of possibility to see it implemented in some form this gen. It probably won't be fully baked until sometime in the future. Should be cool to see it happen if they attempt it.
 

wsippel

Banned
I won't like it because it will destroy my internet usage. A cloud-dependent game is effectively DRM.
You can use the cloud without relying on the cloud. And it's not necessarily DRM, either. Using cloud computing doesn't require product activation or anything.
 

Godslay

Banned
They are only researching it, for heaven's sake. People overreact way too much. Bloody hell...

But research gives us real results, which then can hopefully be used in a real world application.

This is the paper I was referring to:

http://research.microsoft.com/pubs/72894/NOSSDAV2007.pdf

It's from Microsoft Research with help from the University of California; here are the most interesting pieces:

Due to our unfamiliarity with available MMOG code bases [7,8], we evaluate the above mechanism in an open-source first-person shooter (FPS) game, specifically Quake III. Although FPS games are quite different from MMOGs in many respects, the basic game loop and combat AI logic are very similar [1,19]. We built a prototype implementation of partitioned AI inside Quake III’s bot code. We employ Quake III’s standard AI [21] for every aspect of bot control except the direction of motion, which we determine by a Taylor-approximate influence field. Notably, we did not modify the target-selection logic or shooting accuracy.

In this paper, we propose enhancing the AI of game servers by offloading computation to clients. To address the problem of latency, we partition each computation into a critical tight-loop server-side AI and an advice-giving client-side AI. As an exemplar, we design an enhanced AI for tactical navigation based on influence fields, and we partition it using Taylor series approximation. Prototype experiments show substantial improvement in AI abilities, even with round-trip latencies up to one second.


A further step with our prototype is to replicate the client-side AI and test its ability to deal with client failure. Another step is to add a local fallback mode to the server-side AI and investigate the transition between advised and unadvised AI behavior. A minor but practically important improvement is to execute the client-side AI on a low-priority thread, to ensure that it does not disturb the gameplay of the user on the client machine. Moreover, we would like to implement partitioned AI in an MMOG and conduct a user study to see whether it improves the game as we expect.

One aspect of client exploitation we did not consider is information leakage, in which clients inspect glimpses from the server to learn details of game state they should not be allowed to observe. We would like to investigate anonymization and obfuscation techniques to limit client visibility into offloaded computations.
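For the curious, the paper's partitioning trick can be sketched roughly like this: the "slow" side periodically computes an expensive influence field and ships its value plus gradient (first-order Taylor coefficients) at the bot's position; the "fast" side extrapolates from those coefficients every frame, so advice that arrives late still degrades gracefully. The field, positions, and names below are illustrative stand-ins, not the paper's actual code.

```python
def influence_field(pos, enemies):
    """Expensive 'advice' computation: summed repulsion from each enemy.
    A made-up stand-in for the paper's tactical influence field."""
    x, y = pos
    return sum(1.0 / ((x - ex) ** 2 + (y - ey) ** 2 + 1.0)
               for ex, ey in enemies)

def taylor_advice(pos, enemies, h=1e-3):
    """Slow side: field value plus numerical gradient -- the first-order
    Taylor coefficients that get shipped across the laggy link."""
    f0 = influence_field(pos, enemies)
    gx = (influence_field((pos[0] + h, pos[1]), enemies) - f0) / h
    gy = (influence_field((pos[0], pos[1] + h), enemies) - f0) / h
    return {"origin": pos, "value": f0, "grad": (gx, gy)}

def estimate(advice, pos):
    """Fast side, every frame: cheap first-order extrapolation from the
    (possibly stale) advice instead of recomputing the whole field."""
    ox, oy = advice["origin"]
    gx, gy = advice["grad"]
    return advice["value"] + gx * (pos[0] - ox) + gy * (pos[1] - oy)

enemies = [(5.0, 5.0), (8.0, 2.0)]
advice = taylor_advice((1.0, 1.0), enemies)  # sent once; tolerates latency
approx = estimate(advice, (1.2, 1.1))        # evaluated cheaply every frame
```

The bot then steers along the estimated field each frame; the approximation only drifts as the bot moves far from where the advice was computed.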
 
D

Deleted member 80556

Unconfirmed Member
This sounds like something for next-next-gen. I wouldn't expect much to come out until like 5 years have passed.

This is mere R&D, nothing substantial. It's great they're looking into it though!
 

RoboPlato

I'd be in the dick
This sounds like something for next-next-gen. I wouldn't expect much to come out until like 5 years have passed.

This is mere R&D, nothing substantial. It's great they're looking into it though!

Yeah, a lot of people in this thread laughing at those of us who have said the cloud will not work as Microsoft describes don't seem to realize that it is theoretically possible, just years off.
 
But research gives us real results, which then can hopefully be used in a real world application.

This is the paper I was referring to:

http://research.microsoft.com/pubs/72894/NOSSDAV2007.pdf

It's from Microsoft Research with help from the University of California; here are the most interesting pieces:
Right, so...

From a technical perspective, I think it absolutely makes sense. AI decision making falls into a class of high-latency (or rather, latency-resistant) problem that makes WAN-side computation a viable option. It'll likely lead to some weird consequences, e.g. "hang on I need to unplug my router and make the AI stupider so I can beat this level", but the technology is sound. It could have some fun applications, not just for AI but for other, similar problems.

However, there are still a few misgivings:

1) this is only viable for that certain class of latency-resistant problem, which is another way of saying that it's only viable in the context of it being optional. This optional requirement isn't going away until consumers have super-reliable ISPs, and we aren't getting those until we have true competition, and :lol @ that happening in the USA any time soon. In South Korea you might be able to have some real fun, but in the US... we're talking window dressing here. For the immediate future, say the next 15 years or so, games that utilize "the cloud" will need to prepare for it to be unavailable for long stretches of time. Cloud features will, as such, be relatively confined to being window dressing.

2) having a dedicated server farm of processing power for a game (which is the whole point -- to vastly outclass the amount of computational power you have on your local computer/console) strikes me as a very "AAA" path from a game development perspective. Put another way: given infinite money and resources, what features can we add to games to make them better? As long as the game has to function without the cloud functionality, or put another way, as long as the cloud functionality isn't central to core gameplay, I'm not sure the payoff is worth the expense. Granted, AAA games typically blow money right and left, so they'll probably do it anyway, but that doesn't mean it's a wise decision.

e: I remembered the 3rd thing
3) this isn't really a misgiving, more a qualifier: note that for an MMO, or some other implicitly online game, you have a stronger guarantee of connectivity and thus have less to worry about from a "contingency" perspective. Instead of the game needing built in logic for what happens when the connection goes out to lunch, you just get disconnected. Cloud features might be able to get closer to the core gameplay in that environment. There are also turn based games ("games that don't need a pause button") that could potentially take advantage of the lack of a real-time requirement to offload work to the cloud. So there's some hope for the immediate future here.
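The "optional" requirement in point 1 (and the disconnect contingency in point 3) boils down to a deadline-plus-fallback pattern: ask the cloud for a better answer, but never let the frame wait on it. A minimal sketch, where cloud_ai and local_ai are hypothetical stand-ins for the real computations:

```python
import concurrent.futures

# One worker is enough for a sketch; a real game would reuse a connection pool.
_pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def cloud_ai(state):
    """Stand-in for the expensive remote computation (illustrative only)."""
    return max(state["moves"], key=lambda m: m["score"])

def local_ai(state):
    """Cheap fallback that must always work offline."""
    return state["moves"][0]

def choose_move(state, timeout=0.05):
    """Try the cloud within a frame-budget deadline; never block gameplay."""
    future = _pool.submit(cloud_ai, state)
    try:
        return future.result(timeout=timeout)
    except Exception:  # timed out, connection dropped, server error...
        return local_ai(state)
```

The game always has a legal answer on time; the cloud only ever upgrades it. That's the shape any truly optional cloud feature has to take.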
 
So in theory, you could offload the processing of all this gaming by moving processing from the individual hardware that people buy -- > to massive gajillion dollar data centers.

Who is going to pay for all those servers?
 

demolitio

Member
The difference here is that this is R&D, something for the future, compared to a PR release claiming a huge boost in overall power without giving any details, which basically makes it look like just a bold claim for now. Even if it's demonstrated at E3, MS had better impress people after throwing out such a huge number. I'm sure it could help in some ways right now, but there are a lot of questions left to be answered, and that big of an overall boost only made people distrust MS's "cloud" even more compared to a modest assessment of its advantages.

I have no doubt it could help out in the future, but internet speeds and the overall infrastructure need to improve drastically before it could really change a system. I hope each and every company is looking into it (and they are, apparently), but I don't see it being that effective tomorrow when so many things need to improve for it to matter to the majority of the market. Console makers would have to invest billions in infrastructure to make it work for all of their customers, and internet providers would need to spend millions or billions on new infrastructure of their own before that can happen in most places in America, which is why we haven't seen it yet in most areas here. It says a lot that the internet providers don't see that as a worthy investment right now; they basically dictate the evolution of many industries, and the console makers would be looking at astronomical costs just to set up the infrastructure for genuine cloud computing on a major level when other companies aren't willing to make that kind of investment. I live in a major city in the U.S. with basically only two real internet providers. There's no real competition to make them spend the money, and being an old city means upgrading their services would cost a lot, so both companies are complacent with just competing with each other.

It's definitely a thing for the future, but MANY things have to change before it can have a major effect. I have no doubt that internet will improve more with time, but it's just not there yet. R&D is one thing, but putting out statements about how the cloud essentially renders any hardware disadvantages irrelevant TODAY is the part that kills me. It could have some major benefits in some areas, but I just don't see the "40x" statement as feasible.

Just my opinion, but I can't wait to see what any of them have up their sleeves. I just don't buy the 40x more powerful claim without any real information on it when it seemed just to be damage control at the time.

Link could be "in the cloud" while making his way through the Sky Temple, and Mario could have extra physics calculations done each time he lands on a goomba's head, making each one look different.
 

freddy

Banned
Yeah, a lot of people in this thread laughing at those of us who have said the cloud will not work as Microsoft describes don't seem to realize that it is theoretically possible, just years off.

A lot of us are just laughing that Microsoft is using it as a crutch, right now.
 
So everyone's jumping in on this PR nonsense now? What's the over-under on Sony and Nintendo touting a 40x increase in power to their handhelds once you're in Wifi range come E3?

Cloud gaming may someday come, but between poor internet infrastructure in many countries and the technical limitations on how these things actually work, we're nowhere near a reality in which this kind of stuff is going to start happening. Certainly not near enough to it that any company should claim it as a console feature.
 

leroidys

Member
So you're telling me that when MS, as well as USC (iirc), researched AI distributed over the cloud, it was BS? Granted, it was a prototype using Quake III, but they clearly concluded that with up to roughly one second of latency, distributed AI performs better than local AI. It's in its infancy, but there are very real applications for this.



Cloud computing is not bullshit. That's the only point. I don't care who uses it. It's in its infancy for games, but I'm sure it will grow as an application, especially if someone funds it on a large scale, like it looks like MS is attempting to do.

AI is probably the main area that could benefit from it right now, because it's the computation that is expensive, not the storing and transferring of data across the system. For a simplified example, visualize the game of chess. Any given board has exponentially many possible continuations, and arriving at the optimal next move can take a lot of CPU time. Once it reaches that solution, however, the server only has to transmit, say, two pieces of data back to the console (which piece to move and where to move it), which can be transmitted very, very quickly.

It's not going to make games look better though.
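The chess analogy makes the bandwidth argument concrete: the search cost explodes with depth, but the reply serializes to a handful of bytes. A toy sketch (the board, move list, and scoring function are made up for illustration, not a real engine):

```python
import itertools
import json

def best_move(board, moves, depth):
    """Stand-in for an expensive server-side search: score every move
    sequence up to `depth` plies (exponential in depth) and keep the
    first move of the best line. The evaluation is fake."""
    def score(seq):
        return sum(hash((board, m, i)) % 100 for i, m in enumerate(seq))
    best = max(itertools.product(moves, repeat=depth), key=score)
    return best[0]

# 3**4 = 81 candidate lines examined "server-side"...
move = best_move("start", ["e2e4", "d2d4", "g1f3"], depth=4)
# ...but only a tiny payload ever needs to cross the wire to the console.
payload = json.dumps({"move": move}).encode()
```

The asymmetry is the whole pitch: CPU time scales with the search, network traffic scales with the answer.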
 

Maximilian E.

AKA MS-Evangelist
AI is probably the main area that could benefit from it right now, because it's the computation that is expensive, not the storing and transferring of data across the system. For a simplified example, visualize the game of chess. Any given board has exponentially many possible continuations, and arriving at the optimal next move can take a lot of CPU time. Once it reaches that solution, however, the server only has to transmit, say, two pieces of data back to the console (which piece to move and where to move it), which can be transmitted very, very quickly.

It's not going to make games look better though.

A game does not need to look better to become better..
 

leroidys

Member
A game does not need to look better to become better..

Absolutely. I'm mainly aiming that comment at Microsoft saying that their console is "infinitely powerful with the power of the cloud." More an outright lie than bullshit PR speak. Reminds me of Sony back in the day.
 

rpmurphy

Member
ya of course its useful tech its just more of an into-the-future kinda thing that probably exists more-so to infinitely extend this console generation into the future rather than to provide any immediate and practical benefit to gaming
People have been playing MMOs for years. This research sounds more like scaling out the computational capability of server clusters for stuff like physics and AI, rather than gaming VMware.
 

demolitio

Member
Absolutely. I'm mainly aiming that comment on microsoft saying that their console is "infinitely powerful with the power of the cloud." More on the side of utter lie than bullshit PR speak. Reminds me of Sony back in the day.

Exactly. No one is disputing that cloud computing could help in SOME areas now but mostly in the future, but Microsoft's "PR" about it making the console a shitload more powerful is the part I have a problem with. It can help improve games, but it isn't going to magically make the actual console itself more powerful.
 
You can use the cloud without relying on the cloud. And it's not necessarily DRM, either. Using cloud computing doesn't require product activation or anything.
Here's my logic; it may be faulty:
1) Cloud requires a constant connection.
2) Your system has to therefore always be online.
3) And always connected to a server.
4) Which means the server can check for game authenticity, assuming publishers plan for it (and you know they will).
5) Able to check serial numbers, etc.

I just don't see 1-3 happening without 4-5 happening as well.

I also wonder which developers will actually consider it worthwhile to spend resources developing cloud-only features for a game. Like, I can't see Capcom coming out and saying "Street Fighter 5 will be 720p, or 1080p with a cloud connection". I'm not even sure this would be desirable, since I'd rather have consistently mediocre quality than fluctuating quality.
 

TunaLover

Member
I don't think Nintendo would implement something that needs a permanent internet connection; they have always been opposed to putting requirements on games. Nintendo is all about plug and play.
 

komaruR

Member
This cloud computing: does it in any way make the graphics look more crisp and higher quality, with more and better 3D models (higher polygon count), simultaneously? Or is that limited to what the physical graphics chip can output to your HDTV/monitor?

To put it simpler: what can and can't cloud computing do to benefit graphics output?
 

Godslay

Banned
Right, so...

From a technical perspective, I think it absolutely makes sense. AI decision making falls into a class of high-latency (or rather, latency-resistant) problem that makes WAN-side computation a viable option. It'll likely lead to some weird consequences, e.g. "hang on I need to unplug my router and make the AI stupider so I can beat this level", but the technology is sound. It could have some fun applications, not just for AI but for other, similar problems.

However, there are still a few misgivings:

1) this is only viable for that certain class of latency-resistant problem, which is another way of saying that it's only viable in the context of it being optional. This optional requirement isn't going away until consumers have super-reliable ISPs, and we aren't getting those until we have true competition, and :lol @ that happening in the USA any time soon. In South Korea you might be able to have some real fun, but in the US... we're talking window dressing here. For the immediate future, say the next 15 years or so, games that utilize "the cloud" will need to prepare for it to be unavailable for long stretches of time. Cloud features will, as such, be relatively confined to being window dressing.

2) having a dedicated server farm of processing power for a game (which is the whole point -- to vastly outclass the amount of computational power you have on your local computer/console) strikes me as a very "AAA" path from a game development perspective. Put another way: given infinite money and resources, what features can we add to games to make them better? As long as the game has to function without the cloud functionality, or put another way, as long as the cloud functionality isn't central to core gameplay, I'm not sure the payoff is worth the expense. Granted, AAA games typically blow money right and left, so they'll probably do it anyway, but that doesn't mean it's a wise decision.

e: I remembered the 3rd thing
3) this isn't really a misgiving, more a qualifier: note that for an MMO, or some other implicitly online game, you have a stronger guarantee of connectivity and thus have less to worry about from a "contingency" perspective. Instead of the game needing built in logic for what happens when the connection goes out to lunch, you just get disconnected. Cloud features might be able to get closer to the core gameplay in that environment. There are also turn based games ("games that don't need a pause button") that could potentially take advantage of the lack of a real-time requirement to offload work to the cloud. So there's some hope for the immediate future here.

I don't disagree with any of your points, but the tasks in this particular research were latency-sensitive ones, which is good news for less sensitive processes.

To explore this approach, we develop an improved AI for tactical navigation, a challenging task to offload because it is highly sensitive to latency

Unless we exceed the speed of light and violate causality in 15 years, it ain't gonna happen.

Locality. Elforkksu and I have discussed this to a certain degree. It's not as simple as just locality, but if they do offer a one-hop connection to a server (like they mentioned in the Engineer's Conference), it should alleviate some of the issues. Other things help as well, like how the load is distributed, as mentioned in the paper linked above.

The speed of light is a cap on everything for sure though, can't disagree.
 
the interview seems to suggest they are exploring gpgpu and cloud independently as well as together... so cloud and heterogeneous data centers aside, more middleware for GPGPU would be amazing...

the entire field is underdeveloped right now... we only recently got people building relational databases driven by GPUs that are available and usable without severe headaches...

also makes me wonder what the next-gen of nintendo hardware will be like - my sense is that the CPU will be even less important than it is today as a proportion of theoretical power

also really like the end of the interview where iwata says that nintendo really wants to work with partners - hopefully people see that as an invitation and try to reach out... nintendo is clearly very eager to expand its footprint and put itself out there as a willing and collaborative partner...

 

Alchemy

Member
The cloud is going to be really fucking huge at some point; it already is for many applications. I think Microsoft is just jumping the gun a little, and I doubt there will be any significant usage of the cloud on the Xbone anyway. Good to see Nintendo not being scared off by the internet.
 

PhantomR

Banned
Don't worry, Iwata already said that the current cloud business is not yet good for gaming, so if Nintendo implements this in the Wii U successor it will be different, and done in a way people will actually like.

Can this be repeated please?

Willing to bet this will be tied into the unified account system coming later.
 
I don't disagree with any of your points, but the tasks in this particular research were latency-sensitive ones, which is good news for less sensitive processes.

Locality. Elforkksu and I have discussed this to a certain degree. It's not as simple as just locality, but if they do offer a one-hop connection to a server (like they mentioned in the Engineer's Conference), it should alleviate some of the issues. Other things help as well, like how the load is distributed, as mentioned in the paper linked above.

The speed of light is a cap on everything for sure though, can't disagree.
This is where the "optional" part comes in. I'm totally willing to believe that we can achieve stable performance 98% of the time, but the game still needs to avoid falling on its face for the 2% of the time when packet loss is killing me, or my ISP is being terrible, or my router is acting up. That 2% is what kills us.

Also, I'd struggle to classify anything that's allowed to function with over, say, 100ms of lag as "latency sensitive", but that may just be personal bias creeping in. (;p) We'll always have other latency to fight in addition to the network latency we're now introducing. To me, the real takeaway of the research is that they could cope with over a second of lag and still have a positive effect on the gameplay. That sort of flexibility will be required, because you're going to have packet loss etc.
 