
Phil Spencer: Next Cloud Demo Being Planned Will Show BW/CPU/Latency Info

Compression relies on redundancy in the data. Music data is stored as 16-bit ints (assuming a CD source); lossless compression exploiting redundancy between the stereo channels can at best bring it to ~50%. I'm not sure the positions of thousands to millions of particles have any redundancy at all, much less enough for 80%.

Not to mention that compression itself is quite a computationally intensive process. If we take a general-purpose compression algorithm (say, LZMA), we can achieve the stated ratio on something like a large body of source code (~90%) while compressing at roughly 2.5 megabytes per second of CPU time. If we accept the cited estimates, that is 15 ms, or approximately 50% of the budget estimated there for computing the contents of a frame. And that's not counting decoding time.
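
As a rough illustration of both points (a minimal sketch of my own using Python's built-in lzma module; none of this comes from the reddit post or from Microsoft), repeated source-like text compresses dramatically while random particle positions barely shrink, and neither case is free in CPU time:

```python
# Minimal sketch (assumptions mine): compare how LZMA handles highly redundant
# text vs. random particle positions, and how fast it runs. Absolute numbers
# depend on the machine and the preset.
import lzma
import random
import struct
import time

def report(label, data):
    start = time.perf_counter()
    packed = lzma.compress(data, preset=6)
    elapsed = time.perf_counter() - start
    saved = 1 - len(packed) / len(data)
    rate = len(data) / (1024 * 1024) / elapsed
    print(f"{label}: {saved:.0%} saved, {rate:.1f} MB/s")

# Redundant, source-code-like input compresses very well.
text = b"for (int i = 0; i < n; ++i) { p[i].x += v[i].x * dt; }\n" * 20000

# 100,000 random particle positions (three 32-bit floats each) barely compress.
particles = b"".join(
    struct.pack("<fff", *(random.uniform(-100.0, 100.0) for _ in range(3)))
    for _ in range(100_000)
)

report("source-like text", text)
report("random particle positions", particles)
```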

Now, if they could come up with a compression scheme which exploits redundancy in the data, they would probably reduce encoding and decoding time, and the compression ratio would seem more realistic... The problem is that compression works by using known patterns in the data to reduce size at the cost of computation the decoder has to do. There is a compression efficiency-vs-complexity slider, which can go from 0% (no compression) to 100% (the client already knows everything it needs and can compute everything without the cloud, after all). I doubt the values in the middle of that spectrum are actually that attractive; plus, to develop a complicated compression scheme they would need a lot of engineering resources and a very specific type of talent (the kind of people who work on video codecs).

Even then, this would most likely consume a lot of CPU time and would hardly come anywhere near the stated compression efficiency. But then again, we don't know what exactly Microsoft intends to offload to the cloud, so they might have better-compressible data or looser latency requirements. I am not claiming anything about Microsoft's technology here (because I don't actually know what they are really doing), merely that the cited calculations on reddit are based on a model which is not actually useful.
 
Still have 1 question:

What happens to cloud-reliant games that are commercial failures? How long will support last?

The advantage of cloud computing over older dedicated servers is that cloud computing can scale. So if fewer people are playing a certain game, that game won't use many resources.

So in theory, support can last a lot longer than dedicated servers.
 

jryi

Senior Analyst, Fanboy Drivel Research Partners LLC
I don't doubt at all that cloud computing is a thing. And I don't doubt that there was a very nice tech demo that showed a destructible building. And I don't even doubt that the technology could be integrated in a game, so you could basically blow up a house and have a shit ton of particles whose movement is individually calculated.

What I do doubt, however, is that this would make a game better. That is what I would like to see demoed.
 

Dredd97

Member
But did you think that Microsoft might have still pushed the Azure stuff even if the One was as powerful or more powerful than the PS4?

Could it be, just possibly, that they want to integrate as many of their services/platforms into all their products to further enhance their ecosystem?

Just a thought.

Here's a thought: the always-connected console never went away, they're just dressing it up in new clothes....
Sure, you can play your games offline if you really want to, but prepare for the downgrade...
 

Rembrandt

Banned
I don't doubt at all that cloud computing is a thing. And I don't doubt that there was a very nice tech demo that showed a destructible building. And I don't even doubt that the technology could be integrated in a game, so you could basically blow up a house and have a shit ton of particles whose movement is individually calculated.

What I do doubt, however, is that this would make a game better. That is what I would like to see demoed.

Name one game with destructible environments that would have been better without them. You can't, because destructible environments make every game at least a little bit better. lmao.
 

Mastperf

Member
If Spencer is saying this, it must work pretty well.

He wouldn't make this claim if it was all bullshit. He knows how much shit he would get if there's no substance behind this. This and DX12 are really going to change the landscape of the console competition I think.
Change in what way? Even with better utilization of the hardware there will always be a significant difference in performance between the two.
 

jryi

Senior Analyst, Fanboy Drivel Research Partners LLC
Name one game with destructible environments that would have been better without them. You can't, because destructible environments make every game at least a little bit better. lmao.
Yes, there have been games with destructible environments. They didn't require cloud computing. What is your point?
 

Alx

Member
Here's a thought: the always-connected console never went away, they're just dressing it up in new clothes....
Sure, you can play your games offline if you really want to, but prepare for the downgrade...

Just face the facts: consoles are connected devices. Like it or not, that's where technology is right now, and it's not about to change. Most of the improvements and added features during the past generations and the coming ones are based on online connectivity, and being offline makes you miss all the features that define modern consoles: online gaming, community, VOD, online marketplace, demos, notifications, sharing pictures and videos, watching other people play, ... Remove online and you remove almost everything that makes the new consoles "new"; you're just left with "a PSOne with pretty graphics".

The always connected console will be a factual reality. MS just got hated because they wanted to force everybody into it (or so it seemed), while it's just the natural evolution of the technology.
 

KidBeta

Junior Member
Not to mention that compression itself is quite a computationally intensive process. If we take a general-purpose compression algorithm (say, LZMA), we can achieve the stated ratio on something like a large body of source code (~90%) while compressing at roughly 2.5 megabytes per second of CPU time. If we accept the cited estimates, that is 15 ms, or approximately 50% of the budget estimated there for computing the contents of a frame. And that's not counting decoding time.

Now, if they could come up with a compression scheme which exploits redundancy in the data, they would probably reduce encoding and decoding time, and the compression ratio would seem more realistic... The problem is that compression works by using known patterns in the data to reduce size at the cost of computation the decoder has to do. There is a compression efficiency-vs-complexity slider, which can go from 0% (no compression) to 100% (the client already knows everything it needs and can compute everything without the cloud, after all). I doubt the values in the middle of that spectrum are actually that attractive; plus, to develop a complicated compression scheme they would need a lot of engineering resources and a very specific type of talent (the kind of people who work on video codecs).

Even then, this would most likely consume a lot of CPU time and would hardly come anywhere near the stated compression efficiency. But then again, we don't know what exactly Microsoft intends to offload to the cloud, so they might have better-compressible data or looser latency requirements. I am not claiming anything about Microsoft's technology here (because I don't actually know what they are really doing), merely that the cited calculations on reddit are based on a model which is not actually useful.

Yeah, that was a pretty big sign to me. 85% compression and 9-bit integers make me wonder just how much data he is talking about. I won't say 9-bit ints are impossible, just that it seems like a bad format.
 
Cloud-based physics is total bollocks. The tech is completely impractical for gaming, since it would require meaty Internet connections to work, and consequently an always-online connection, which, as literally every videogame that requires an Internet connection demonstrates, means crippled infrastructure for a week or more. Considering said infrastructure is the same one many businesses rely on, it's going to be lawsuit central, to say nothing of consumer complaints and returns over their game not working.

And that's assuming they can fit the data in the average household Internet connection. Spoiler alert: they almost certainly can't.
 
Cloud-based physics is total bollocks. The tech is completely impractical for gaming, since it would require meaty Internet connections to work, and consequently an always-online connection, which, as literally every videogame that requires an Internet connection demonstrates, means crippled infrastructure for a week or more. Considering said infrastructure is the same one many businesses rely on, it's going to be lawsuit central, to say nothing of consumer complaints and returns over their game not working.

And that's assuming they can fit the data in the average household Internet connection. Spoiler alert: they almost certainly can't.

I assume you deal with networking and server infrastructure in your career? Lots of accusations with nothing to back them up. It would be cool if you went through your post point by point and explained why exactly you're right. Oh wait, the internet. I expect nothing.

And yes, some online titles have issues the first week. What is your point?
 
I mean, if this does work and is effective, all it will do is force other parties to invest in the same technology, so go for it.
 

d9b

Banned
No disrespect to Phil Spencer, but time for talk is over. I want action! We'll do this, we'll do that, DX12, 10% more power....now cloud demo. Want to see all that in action, don't want to dream about it happening one sunny day.
 

geordiemp

Member
I believe what we need to see:

* The specific conditions where it "works".
* How the software reacts to a dropped connection (local or server).
* How the software reacts to a slow or lagging connection.
* Assuming it handles these cases, how much development/design work this takes.
* Assuming the development work is viable, what restrictions does this place on interactivity with the chosen effects? Will they be cosmetic only (because you can't guarantee when and where the changes will appear to the player) and hence better done as pre-calculated scenes?

A demo covering all that would set the record straight.

They need to add a player count stress test / discussion - if a new game releases and 500,000 people boot up across a few continents, it's a bit more work than a single game demo...

That's where it falls down for me, the player numbers and location spread.

I believe it will work for a few consoles for a demo....
 

Alx

Member
No disrespect to Phil Spencer, but time for talk is over. I want action! We'll do this, we'll do that, DX12, 10% more power....now cloud demo. Want to see all that in action, don't want to dream about it happening one sunny day.

Well, that's precisely the reason why they're preparing a demo. The one mentioned in OP and the thread title...
 

jryi

Senior Analyst, Fanboy Drivel Research Partners LLC
The always connected console will be a factual reality. MS just got hated because they wanted to force everybody into it (or so it seemed), while it's just the natural evolution of the technology.

There is still quite a crucial difference. MS wanted to sell you something that would have been little more than a brick without online connectivity. A PS4 works as a single player machine even if it never sees the internet.
 

Alx

Member
It is a difference, but not one that makes MS's approach non-viable. The console may be useless without online connectivity, but most people have online connectivity, so that's not a real problem. And for those who don't, well, tough luck; that crowd isn't in the target audience.
Same thing happened with the last models of consoles for people who were not up to date with their TV screens. "Your TV is SD? Or it doesn't have HDMI in? Well, you can't use our console then..."
 

Kayant

Member
If Spencer is saying this, it must work pretty well.

He wouldn't make this claim if it was all bullshit. He knows how much shit he would get if there's no substance behind this. This and DX12 are really going to change the landscape of the console competition I think.

DX12 will change the PC landscape more than it will change XB1, because XB1 already has several DX12 features in its DX11.X API. So for XB1 it's more of an efficiency/incremental update than a revamp of the API like DX12 on PC/mobile.


http://i.imgur.com/2TCRXbX.jpg?1?1850


http://www.slideshare.net/DevCentralAMD/inside-x-box-one-by-martin-fuller

OT - Back to the cloud, I'm hoping the demo is done with average connections to see how it works for the majority of people, and maybe even demos how it handles someone losing their connection to the server.
 

Sweep14

Member
This, imho, will only complicate devs' work and extend development times. Games using this tech will require many more tests and adjustments than non-cloud ones.
 

jryi

Senior Analyst, Fanboy Drivel Research Partners LLC
It is a difference, but not one that makes MS's approach non-viable. The console may be useless without online connectivity, but most people have online connectivity, so that's not a real problem. And for those who don't, well, tough luck; that crowd isn't in the target audience.
Same thing happened with the last models of consoles for people who were not up to date with their TV screens. "Your TV is SD? Or it doesn't have HDMI in? Well, you can't use our console then..."

But it's way easier to go buy a new TV than it is to get reliable broadband somewhere it's not available. Besides, both the PS3 and Xbox 360 could be played on an age-old SD television. Of course, calling it suboptimal is sugarcoating, but it was at least possible.
 

Freeman

Banned
I don't like having to rely on the internet to play single-player games. I hope they use this only in multiplayer games.
 

Genio88

Member
That demo is awesome. If that could be applied to games it would be great; looking forward to seeing the first implementation in a real game.
 

Calabi

Member
I just don't think it's going to work, because it's going to cost Microsoft more to run the servers than it would to just run it on the console. Maybe Microsoft haven't realised it yet, or maybe they have a plan - a subscription model or something.

Subscribe to the cloud for extra physics, only 9.99 a month.
 
How is that different from time-slicing on mainframes in the bad old days before personal computers? You pay more, they give you more CPU cycles and memory.

So in the concept of "offloading some work to another machine", there is a similarity: "pay money to get access to more computing power than you have locally otherwise".

Specifically, though, it comes down to a couple of points. The first is scale: one local mainframe (or a group of them) vs. hundreds of thousands of Azure servers.

Geo-dispersal is the other point: there are publicly accessible data centres all around the world (well, nearly all of it) with good latency.

So, yes, architecturally it's "offloading work to other servers to get more processing power than you have locally" - that should give you confidence that it can have real benefit, right?

If you want to distill it down to "like mainframe time slicing" - it's having hundreds of thousands of mainframes around the world that can be tapped into at low cost (free for Xbox devs), programmatically and on-demand, separated into geo-clusters that ensure pretty good coverage to most (not yet all) markets around the world.


Secondly, the "cloud" is fine for non-realtime applications. But games have 33 ms to complete all their relevant computations per frame, and have to do that at a minimum 30 times a second.

Server-based realtime games have been around for ever (hi, MMOs!). If you distill it down to "dedicated server-hosted game" there's nothing controversial here - only the amount of number crunching that's being done server-side.

On latency, the point appears to be that things like exploding buildings don't require a true 16 or 33 ms response time.
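
To make that concrete, here's a back-of-envelope sketch (every figure below is an illustrative assumption of mine, not a Microsoft number): a cloud-computed effect can arrive several frames late and still be usable if the effect only needs to look right within a couple of hundred milliseconds.

```python
# Back-of-envelope latency budget (all figures are illustrative assumptions).
FRAME_MS = 1000 / 30        # 30 fps frame budget (~33 ms)
ROUND_TRIP_MS = 60.0        # assumed client <-> datacenter round trip
SERVER_SIM_MS = 50.0        # assumed server-side simulation time
CLIENT_APPLY_MS = 5.0       # assumed client-side unpack/apply cost

total_ms = ROUND_TRIP_MS + SERVER_SIM_MS + CLIENT_APPLY_MS

print(f"Result arrives ~{total_ms:.0f} ms after the request "
      f"({total_ms / FRAME_MS:.1f} frames later)")
print("Fits a per-frame gameplay budget:", total_ms <= FRAME_MS)
print("Fits a ~200 ms 'cosmetic debris' budget:", total_ms <= 200)
```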

Lastly, there is no reason other companies can't do it themselves. This is especially true for AAA development: if you have the $$$ to fund a AAA game (hundreds of millions), you have the $$$ to develop your own cloud server farm that will elastically scale for each of your games.

Opportunity cost is a thing. Risk is another. Companies outsource non-core capabilities all the time. But you're right, there's absolutely no reason why they couldn't build their own scalable server farm on Rackspace, Azure, or Amazon. I have no issue with that - my argument is first and foremost that cloud-based computing is beneficial whoever the supplier, and secondly that Microsoft's offering is good because of its global presence, maturity and ease / cost of use for Xbox.

It would be greatly beneficial to DIY too, as that way you aren't tied to MS - bad idea; MS is a company with a history of showing no hesitation when it comes to screwing over others for their own benefit, be it competitors, partners or even customers. No one with their head screwed on right would want to be under MS's thumb.

Well... not much I can say on that since you clearly believe MS are evil - but the millions of people all around the world who make a comfortable living, or run their businesses, off the back of MS software and service would disagree. If you pay attention to Microsoft's strategy too, you'll realise that we're becoming a "cloud first" company - the cloud model is our future, it's not a flash in the pan.

PS: I just noticed something. No one is talking about the "cloud" but MS and developers who are firmly in their pocket. Not a peep from EA, Ubisoft, Activision, Crytek, Epic, Take 2 ... etc.

Other than Sony who bought Gaikai for PSNow?

Or OnLive?

Both of which are doing low-latency stuff (rendering in the cloud, control responsiveness).

Or Nvidia who have a video of cloud-based lighting, showing the impact of different latencies (200ms for example) on lighting.

http://www.youtube.com/watch?v=aiWdJxshWMM

Or EA, who published Titanfall?


Seriously - it's okay to be sceptical about how much it will benefit games, but to write it off completely without understanding what is and isn't possible, or just because it's Microsoft, is short sighted IMO.
 

arhra

Member
If they really want me to believe, they need to provide me the demo on my console so I can download it and try it myself :)

If the universe has any sense of cosmic irony at all, the timing will work out so that it makes sense for there to be a Crackdown beta/demo bundled with Halo 5.
 

Lettuce

Member
I don't like having to rely on the internet to play single-player games. I hope they use this only in multiplayer games.

I know I can't speak for everyone, but my Virgin Media (UK) internet connection has only gone down once this year, and only for about half an hour. So I have no problem with this, as I had no problem with the original XB1 vision!
 

Freeman

Banned
I know I can't speak for everyone, but my Virgin Media (UK) internet connection has only gone down once this year, and only for about half an hour. So I have no problem with this, as I had no problem with the original XB1 vision!

I wish I could say the same about my internet connection, and it's not just about being able to access the internet, but also about getting a good ping to the server.
 
yes, your thorough explanation of the difference between "on" and "off" was adequate. my point was the massive differences in scale.

take the PC I'm using now as an example. the GPU has a bandwidth of 31.51GB/s (aka 32,266.24MB/s) both ways (up and down.) a relatively good internet connection (to the cloud) has about 20MB/s down, and usually much slower up. even if you doubled or tripled that, it's like comparing a new Corvette to a new tricycle. thus the massive differences between "local" and "remote." local is ALWAYS better.

The PC rendering the physics will have tons of internal bandwidth and flops too. It's not like the cloud needs to send the data for the client to process; it just needs to send the results, which could be as small as some velocity vectors for updating the pieces.
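
For a sense of scale, here's a hypothetical sketch of that "just send the results" idea (every parameter is my own assumption for illustration, not anything confirmed about the demo): streaming compact per-piece state at a modest tick rate, with the client interpolating between updates.

```python
# Rough payload math for streaming debris state (all parameters assumed).
N_PIECES = 5000          # rigid debris chunks, not individual particles
FLOATS_PER_PIECE = 6     # position (x, y, z) + velocity (x, y, z)
BYTES_PER_VALUE = 2      # assume values quantized to 16 bits on the wire
TICK_HZ = 10             # assume one update every 100 ms, client interpolates

payload_bytes = N_PIECES * FLOATS_PER_PIECE * BYTES_PER_VALUE
kbit_per_s = payload_bytes * TICK_HZ * 8 / 1000

print(f"{payload_bytes / 1024:.0f} KiB per update, "
      f"~{kbit_per_s / 1000:.1f} Mbit/s sustained")
```

Under those assumptions it's a few Mbit/s: far below local GPU bandwidth, but within reach of a decent home connection, which is exactly what this thread is arguing about.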
 

Widge

Member
Subtly bringing back the always online it seems.

Well yes, goes without saying, none of these games will be offline single player entities.

Also, I would imagine that they will all be MS first party exclusive titles. No third party will make a game that cuts out a huge chunk of their market. As such, I'm not envisioning a hell of a lot of support for this.
 

MrJoe

Banned
So in the concept of "offloading some work to another machine", there is a similarity: "pay money to get access to more computing power than you have locally otherwise".

Specifically, though, it comes down to a couple of points. The first is scale: one local mainframe (or a group of them) vs. hundreds of thousands of Azure servers.

Geo-dispersal is the other point: there are publicly accessible data centres all around the world (well, nearly all of it) with good latency.

So, yes, architecturally it's "offloading work to other servers to get more processing power than you have locally" - that should give you confidence that it can have real benefit, right?

If you want to distill it down to "like mainframe time slicing" - it's having hundreds of thousands of mainframes around the world that can be tapped into at low cost (free for Xbox devs), programmatically and on-demand, separated into geo-clusters that ensure pretty good coverage to most (not yet all) markets around the world.




Server-based realtime games have been around for ever (hi, MMOs!). If you distill it down to "dedicated server-hosted game" there's nothing controversial here - only the amount of number crunching that's being done server-side.

On latency, the point appears to be that things like exploding buildings don't require a true 16 or 33 ms response time.



Opportunity cost is a thing. Risk is another. Companies outsource non-core capabilities all the time. But you're right, there's absolutely no reason why they couldn't build their own scalable server farm on Rackspace, Azure, or Amazon. I have no issue with that - my argument is first and foremost that cloud-based computing is beneficial whoever the supplier, and secondly that Microsoft's offering is good because of its global presence, maturity and ease / cost of use for Xbox.



Well... not much I can say on that since you clearly believe MS are evil - but the millions of people all around the world who make a comfortable living, or run their businesses, off the back of MS software and service would disagree. If you pay attention to Microsoft's strategy too, you'll realise that we're becoming a "cloud first" company - the cloud model is our future, it's not a flash in the pan.



Other than Sony who bought Gaikai for PSNow?

Or OnLive?

Both of which are doing low-latency stuff (rendering in the cloud, control responsiveness).

Or Nvidia who have a video of cloud-based lighting, showing the impact of different latencies (200ms for example) on lighting.

http://www.youtube.com/watch?v=aiWdJxshWMM

Or EA, who published Titanfall?


Seriously - it's okay to be sceptical about how much it will benefit games, but to write it off completely without understanding what is and isn't possible, or just because it's Microsoft, is short sighted IMO.

I take it from this post that you are a Microsoft employee?

isn't commenting on the idea "Microsoft is evil" or to be a bit more subtle about it "Microsoft isn't very nice" therefore a farce? you're their employee, they pay your salary, of course you think they're nice. hey, if they paid me enough money I would think they're nice too (but I would obviously have no credibility in saying so.)
 

Jezehbell

Member
There is still quite a crucial difference. MS wanted to sell you something that would have been little more than a brick without online connectivity. A PS4 works as a single player machine even if it never sees the internet.

Yeah, but games like Destiny require an internet connection to work. It's just the beginning; more games will be "always online" in the future.
 
DX12 will change the PC landscape more than it will change XB1, because XB1 already has several DX12 features in its DX11.X API. So for XB1 it's more of an efficiency/incremental update than a revamp of the API like DX12 on PC/mobile.


http://i.imgur.com/2TCRXbX.jpg?1?1850


http://www.slideshare.net/DevCentralAMD/inside-x-box-one-by-martin-fuller

OT - Back to the cloud, I'm hoping the demo is done with average connections to see how it works for the majority of people, and maybe even demos how it handles someone losing their connection to the server.

I think the most impactful change DX12 will bring is multi-threaded rendering (multiple threads issuing draw calls), which apparently isn't possible yet on Xbone but will be after DX12, and which could be very useful when each of the CPU cores is so puny.

But apparently the new API will also bring Xbone development closer to PC in terms of API usage. This could be helpful too, because optimizations that carry over to multiple platforms are more likely than very specific ones that only benefit a single platform.

EDIT>

Not to mention that compression itself is quite a computationally intensive process. If we take a general-purpose compression algorithm (say, LZMA), we can achieve the stated ratio on something like a large body of source code (~90%) while compressing at roughly 2.5 megabytes per second of CPU time. If we accept the cited estimates, that is 15 ms, or approximately 50% of the budget estimated there for computing the contents of a frame. And that's not counting decoding time.

Now, if they could come up with a compression scheme which exploits redundancy in the data, they would probably reduce encoding and decoding time, and the compression ratio would seem more realistic... The problem is that compression works by using known patterns in the data to reduce size at the cost of computation the decoder has to do. There is a compression efficiency-vs-complexity slider, which can go from 0% (no compression) to 100% (the client already knows everything it needs and can compute everything without the cloud, after all). I doubt the values in the middle of that spectrum are actually that attractive; plus, to develop a complicated compression scheme they would need a lot of engineering resources and a very specific type of talent (the kind of people who work on video codecs).

Even then, this would most likely consume a lot of CPU time and would hardly come anywhere near the stated compression efficiency. But then again, we don't know what exactly Microsoft intends to offload to the cloud, so they might have better-compressible data or looser latency requirements. I am not claiming anything about Microsoft's technology here (because I don't actually know what they are really doing), merely that the cited calculations on reddit are based on a model which is not actually useful.

Two of the DME engines on Xbone can handle LZ compression (dunno about the specific implementation), freeing up CPU resources for that.
 

JoJo UK

Unconfirmed Member
show me the money. show me an example of a 30 FPS PhysX demo vs. a 60 FPS equivalent running on a remote cloud. we have this MS demo showing a remarkable difference; you may call me paranoid but I don't have any trust when it comes to MS. oh and I am aware that a number of factors are at play: bandwidth caps, latency on the remote side (internet) as well as local latency, and of course the fact that no one but MS has access to the demo. I wonder why they don't just release it like Nvidia does for their demos?
You're paranoid.
I take it from this post that you are a Microsoft employee?

isn't commenting on the idea "Microsoft is evil" or to be a bit more subtle about it "Microsoft isn't very nice" therefore a farce? you're their employee, they pay your salary, of course you think they're nice. hey, if they paid me enough money I would think they're nice too (but I would obviously have no credibility in saying so.)
Would it not be better to address the content of the post rather than discussing employment history? BTW, I doubt Microsoft is 'evil'; it's a company trying to make money, just like Sony, Nintendo, Tesco, Walmart and Gregs the Bakers.
 

PG2G

Member
I take it from this post that you are a Microsoft employee?

isn't commenting on the idea "Microsoft is evil" or to be a bit more subtle about it "Microsoft isn't very nice" therefore a farce? you're their employee, they pay your salary, of course you think they're nice. hey, if they paid me enough money I would think they're nice too (but I would obviously have no credibility in saying so.)

Is that what you resort to when you can't counter the points that have been brought up?
 

SecretSquirrel

Neo Member
You guys are talking too much about a connection with a lot of throughput. That isn't necessary. You just need a connection with low latency and a fast ping; it doesn't need to be a 100/50 connection. I would also imagine that the speed of the return calculation just has to be faster than the sync delay time of the players in a multiplayer game.
 
I take it from this post that you are a Microsoft employee?

As my profile says: Work @ Microsoft - nothing to do with gaming though - but opinions may be unintentionally biased.

isn't commenting on the idea "Microsoft is evil" or to be a bit more subtle about it "Microsoft isn't very nice" therefore a farce? you're their employee, they pay your salary, of course you think they're nice.

Not necessarily true, I've worked for companies that I thought were ****s in the past :)

However, you're right - you should always read people's posts through the bias they bring to the table, and I try to make a point of mentioning it explicitly because people need to be aware I can't be entirely unbiased in this. Also be aware that astroturfing is a disciplinary offence at Microsoft; we are encouraged to be open and honest, so while you won't see me taking a massive unconstructive dump on my employer in public, I will always do my best to be objective.
 

TK Turk

Banned
Am I the only one who doesn't mind having to be online to get the best experience? Sony said 95% of all PS4 users are online, so I'm sure it's about the same for X1. Should we handicap games just to satisfy 5% of the user base? I don't think that's a good idea.
 
Not to mention that compression itself is quite a computationally intensive process.

It isn't, compared to slower alternatives like network bandwidth or even disk access, etc... it's actually super efficient.

Anyway... thousands of particles would have quite a bit of redundancy in the data structure... so good compression can be expected.
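
One hedged sketch of where that redundancy could come from (my own illustration of a generic delta-coding idea, not a description of any known Microsoft scheme): consecutive simulation frames differ only slightly, so quantized frame-to-frame deltas compress far better than the raw positions.

```python
# Delta + quantization sketch (illustrative only): compare compressing raw
# positions against compressing quantized per-frame deltas.
import lzma
import random
import struct

def pack_raw(frame):
    return b"".join(struct.pack("<fff", x, y, z) for x, y, z in frame)

def pack_deltas(prev, curr, scale=1000):
    # Encode each coordinate's change since the previous frame as a 16-bit int.
    out = bytearray()
    for (px, py, pz), (cx, cy, cz) in zip(prev, curr):
        out += struct.pack("<hhh",
                           round((cx - px) * scale),
                           round((cy - py) * scale),
                           round((cz - pz) * scale))
    return bytes(out)

frame0 = [tuple(random.uniform(-100.0, 100.0) for _ in range(3))
          for _ in range(50_000)]
# Next frame: every particle has moved by a small amount.
frame1 = [tuple(c + random.uniform(-0.01, 0.01) for c in p) for p in frame0]

print("raw positions, compressed bytes:   ", len(lzma.compress(pack_raw(frame1))))
print("quantized deltas, compressed bytes:", len(lzma.compress(pack_deltas(frame0, frame1))))
```

The catch, as noted earlier in the thread, is that the quantization and the compression both cost CPU time on one end and decode time on the other.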

You guys are talking too much about a connection with a lot of throughput. That isn't necessary. You just need a connection with low latency and a fast ping; it doesn't need to be a 100/50 connection. I would also imagine that the speed of the return calculation just has to be faster than the sync delay time of the players in a multiplayer game.

100% correct.
 

MrJoe

Banned
You're paranoid. Would it not be better to address the content of the post rather than discussing employment history? BTW, I doubt Microsoft is 'evil'; it's a company trying to make money, just like Sony, Nintendo, Tesco, Walmart and Gregs the Bakers.

the content of the post largely did not interest me so it was ignored. of course microsoft isn't evil, I would think an allegiance with lucifer or sacrificing unicorns would be required for that. I put it better the second time; "Microsoft isn't very nice." yes the same thing could be said about many mega-corporations like wal-mart or whatever, but corporations are (and this is a relatively recent change) now considered people under the law so I can judge them as people. I'll say it again, they're not very nice. as people that is, as a business they are quite successful and have done an A+ job at making money for their shareholders.
 