
Anandtech: Tech analysis -- PS4 vs Xbox One, PS4/Xbox One's CPU performance

 

luffeN

Member
Can somebody explain to me how this cloud computing will help AI opponents in games? XBONE seems to have this tech and I find it fascinating. At first, I thought it was just jargon to mask some sort of DRM or whatever, but I'm intrigued at how the cloud can actually make gaming better.

Theoretically, as far as I understood the cloud thing, you can have more computational power with cloud computing. So you can offload tasks to the cloud so that the console does not have to do all the work. Take OnLive for example: You could play Crysis 3 on your tablet because the computing is done server side and not on your tablet. Please someone correct me if I am wrong.
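For illustration only, a minimal sketch of what offloading a non-latency-critical task to "the cloud" could look like: the console fires off a request for an AI plan in the background and keeps acting on whatever plan it already has. The endpoint URL and payload format below are made-up assumptions, not anything Microsoft has described.

```python
import json
import threading
import urllib.request

# Hypothetical endpoint that computes AI decisions server-side (made up for illustration).
AI_ENDPOINT = "https://cloud.example.com/ai/plan"

latest_plan = {"action": "idle"}  # fallback plan used until the cloud replies

def request_ai_plan(world_state):
    """Ship the world state to the server and cache whatever plan comes back."""
    global latest_plan
    try:
        req = urllib.request.Request(
            AI_ENDPOINT,
            data=json.dumps(world_state).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req, timeout=0.5) as resp:
            latest_plan = json.loads(resp.read())
    except Exception:
        pass  # network hiccup: keep using the last plan we received

def game_tick(world_state):
    # Kick off the request on a background thread so the frame isn't blocked,
    # then act on the most recent plan we already have (which is always at
    # least one network round trip out of date).
    threading.Thread(target=request_ai_plan, args=(world_state,), daemon=True).start()
    return latest_plan
```

The catch, as later posts point out, is that whatever comes back from the server describes a game state that has already moved on.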
 

herod

Member
Can somebody explain to me how this cloud computing will help AI opponents in games? XBONE seems to have this tech and I find it fascinating. At first, I thought it was just jargon to mask some sort of DRM or whatever, but I'm intrigued at how the cloud can actually make gaming better.

just check out the latest Sim City!
 

maverick40

Junior Member
This is probably a stupid question, but will Sony make money off each Blu-ray disc sold on the Xbox One? I know their stake in it is small, but still.
 

Durante

Member
Can somebody explain to me how this cloud computing will help AI opponents in games? XBONE seems to have this tech and I find it fascinating. At first, I thought it was just jargon to mask some sort of DRM or whatever, but I'm intrigued at how the cloud can actually make gaming better.
Sure, I can explain it to you:
Microsoft knows that their box is less capable, so they deflect with vague non-statements about stuff outside of that box.

A real-time interactive simulation (i.e. a game) is one of the worst possible applications for cloud computing.
 

Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
I feel NeoGAF will need to self-censor the term "cloud computing" into "magic" for the coming year.
 
I read somewhere that the GPU cores in ONE cannot be compared directly to PS4 GPU cores. I guess we'll have to wait for Todd Holmdahl to shed more light on the architecture.
 

nico1982

Member
Is the PS4 CPU confirmed at 1.6 GHz? I read 2.0 GHz in other sources.
Well, it is not going to be a difference as big as the GPU since the PS4 CPU would be just 25% faster than the Xbox one and the Xbox CPU would be only 20% slower than the PS4 one...

:p
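For anyone who missed the joke: assuming the 2.0 GHz and 1.6 GHz figures floated above (both rumors, neither confirmed), "25% faster" and "20% slower" describe the exact same gap, just measured from opposite ends:

```python
ps4_clock = 2.0    # GHz, rumored, not confirmed
xbone_clock = 1.6  # GHz, rumored, not confirmed

print(ps4_clock / xbone_clock - 1)   # 0.25 -> "the PS4 CPU is 25% faster"
print(1 - xbone_clock / ps4_clock)   # 0.20 -> "the Xbox CPU is 20% slower"
```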
 

1-D_FTW

Member

I don't really get why they're saying it should have better thermals/power efficiency. Aren't transistors all created equal? If you go by the DF article, PS4 and XBone have similar die sizes (XBone just allocated a bunch of transistors so they could go with 8GB of RAM... which they thought required using slower RAM that needed bandwidth optimizations).

I'm not expecting much difference as a result. Should both be around 100 watts.
 

Durante

Member
I don't really get why they're saying it should have better thermals/power efficiency. Aren't transistors all created equal?
Not really, no. But that's not the important point, GDDR5 uses quite a bit more power than DDR3.

However, I don't see an issue. It will still be much less total TDP than a launch PS3, and Sony had no problem cooling that.
 

Brashnir

Member
just check out the latest Sim City!

I know you jest, but Diablo 3 (or any MMO) is a better example, since monster spawns and behavior are all handled server-side.

As for how much computational power this saves the end user - it's not enough to matter, really, and the negative of hitting lag spikes and dying to them outweighs the possible benefit to single player games.
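Rough numbers on why that latency hit hurts real-time gameplay; the costs below are plausible assumptions for illustration, not measurements:

```python
FRAME_BUDGET_MS = 1000 / 60   # ~16.7 ms per frame at 60 fps
LOCAL_AI_COST_MS = 2.0        # assumed cost of running the AI on-console
CLOUD_RTT_MS = 80.0           # assumed round trip to a remote data center

# Local AI fits comfortably inside a single frame...
print(LOCAL_AI_COST_MS <= FRAME_BUDGET_MS)   # True

# ...while a cloud response spans several frames, so whatever comes back
# describes a game state that is already multiple frames old.
print(CLOUD_RTT_MS / FRAME_BUDGET_MS)        # ~4.8 frames of staleness
```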
 

TronLight

Everybody is Mikkelsexual
Not really, no. But that's not the important point, GDDR5 uses quite a bit more power than DDR3.

However, I don't see an issue. It will still be much less total TDP than a launch PS3, and Sony had no problem cooling that.

What about YLOD? Wasn't that because of some overheating issues?
 
LOL, even anandtech can't help but sugarcoat things.

FFS guys, just call it like it is. The Xbone is weaker. It's not "more broad". It's weaker.
 
What about YLOD? Wasn't that because of some overheating issues?

Due to shitty lead-free solder that became a new requirement for electronics due to EU regulations. Even if the quality of lead-free solder hasn't improved in that amount of time, the PS4/XBone will still consume half the power of a launch PS3, so heat should never be a concern this generation.

APUs are going to save both companies a lot of money, and all indications are that neither will be passing the savings onto the customers, so hopefully they'll at least offer good build quality and heatsinks in their original console models.
 
I hope we'll see it in the games then. Currently, DriveClub does not look much more impressive than Forza 5.

Somehow I don't think that rumored platforming game on Xbox One is gonna look any worse than Knack, either. MS'll probably show an exclusive shooter as well.

Come E3, there are gonna be a lot of graphics comparisons, but at this stage, I'm not sure there will be a clear-cut winner, even if on paper the PS4 is "substantially more powerful".

Have we seen gameplay footage of either? And I mean gameplay, not target render crap.
 
Sure, I can explain it to you:
Microsoft knows that their box is less capable, so they deflect with vague non-statements about stuff outside of that box.

A real-time interactive simulation (i.e. a game) is one of the worst possible applications for cloud computing.

wow
 

Brashnir

Member
Have we seen gameplay footage of either? And I mean gameplay, not target render crap.

I don't think final silicon actually exists yet for either system, so no. Until final silicon, everything is either a target render or running realtime on estimated hardware built from off-the-shelf parts.
 

bomblord

Banned
How did game OSes go from a few MB to several GB of RAM usage in one generation?

The Wii's entire OS ran in 64 MB of RAM, and now the Wii U uses 1 GB.

The 360 reserved 32 MB for the OS, and now the Xbox One is using 3 GB.
 

Zabka

Member
How did game OSes go from a few MB to several GB of RAM usage in one generation?

The Wii's entire OS ran in 64 MB of RAM, and now the Wii U uses 1 GB.

The 360 reserved 32 MB for the OS, and now the Xbox One is using 3 GB.

Xbone's running three OSes. That 3 GB is for apps to use.
 

Brashnir

Member
How many bits are these systems?

It depends on how you count. The "bits" war was about the width of the processor registers in old single-processor systems. Modern systems with parallel processing, and most of the heavy lifting done off the CPU, can't really be measured by that metric.

If you wanted to, though, each core of the PS4 and X1 is 64-bit.

If you wanted to break out the 8 cores and add them together, which is kind of dumb, but whatever, that would be 512-bit.

If you wanted to count the registers of every shader on the GPU... I'm not really interested in figuring that out.
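If you really want the (admittedly meaningless) core-count arithmetic spelled out:

```python
cores = 8
register_width_bits = 64  # x86-64 general-purpose register width per core

print(register_width_bits)           # 64  -> the honest per-core answer
print(cores * register_width_bits)   # 512 -> only if you insist on adding cores together
```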
 

Hyphen

Member
Theoretically, as far as I understood the cloud thing, you can have more computational power with cloud computing. So you can offload tasks to the cloud so that the console does not have to do all the work. Take OnLive for example: You could play Crysis 3 on your tablet because the computing is done server side and not on your tablet. Please someone correct me if I am wrong.

But this is a bit silly, isn't it? Wouldn't that then mean, for example, that a single-player game with no online functionality would need to be played 'online' if it uses the cloud for "computational power"?
 

DBT85

Member
Is it possible to replace it with stacked wide IO DDR4 one day (and raise latencies to match GDDR5 levels)?

This was speculated in the RAM threads in the lead-up to the PS4 reveal, yes.

That would end up lowering the cost and heat requirements of GDDR5 while allowing them to get the bandwidth into the system at launch, when DDR4 isn't ready.

Technologically I have no idea if it would be possible. It is beyond my understanding. Durante would probably be able to help.
 
I'm just wondering... what happens if you registered a code but break/lose your disc? Is having the disc inside the console required? Or is it like Steamworks where you can trash your disc as long as you have the code?
 

herod

Member
But this is a bit silly, isn't it? Wouldn't that then mean, for example, that a single-player game with no online functionality would need to be played 'online' if it uses the cloud for "computational power"?

Yep, it's nonsense. They claim it doesn't require 'always on' internet but then hype cloud computing for the games. It doesn't add up.

It might be used in online games for enemy AI, instead of having a host console do the crunching, which should mean more headroom for devs making multiplayer games, but that's all I can imagine.

There are very few logical reasons to take the latency and availability hit for offloading to cloud computing when you have 8 threads plus a GPU right in front of you.
 
I think someone should post the tech talk Major Nelson had yesterday about the Xbox. I would love to join this conversation but have no idea wtf the specs even mean lol, but I think most of you guys would like it.
 

jaosobno

Member
This was speculated in the RAM threads in the lead-up to the PS4 reveal, yes.

That would end up lowering the cost and heat requirements of GDDR5 while allowing them to get the bandwidth into the system at launch, when DDR4 isn't ready.

Technologically I have no idea if it would be possible. It is beyond my understanding. Durante would probably be able to help.

Ok, thx for the info. Durante, could you please clarify if such a thing would be possible? In the long run, Sony would definitely benefit from such an approach.
 

heyf00L

Member
Yes. But the increased throughput of GDDR5 trumps it in almost all graphics-related situations, which is why PC GPUs use GDDR5.

Indeed. Since it's shared RAM, I'd rank the GPU's RAM needs as more important than the CPU's for gaming purposes.

But obviously that's not what MS was concerned with.
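For a sense of scale, peak memory bandwidth is roughly bus width times per-pin data rate. Here is the estimate using the widely reported figures for both consoles (treat them as assumptions, not official numbers):

```python
def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak bandwidth in GB/s = (bus width in bytes) * (per-pin data rate in Gbps)."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

# Widely reported figures, not official confirmations:
print(peak_bandwidth_gb_s(256, 5.5))    # PS4 GDDR5:          ~176 GB/s
print(peak_bandwidth_gb_s(256, 2.133))  # Xbox One DDR3-2133:  ~68 GB/s
```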
 

Durante

Member
Ok, thx for the info. Durante, could you please clarify if such a thing would be possible?
I would if I could. It depends on how much developers are allowed to depend on the exact performance characteristics of the GDDR5 incarnation of the PS4. I assume those would be rather hard to replicate with an entirely different memory type -- remember that MS actually had to dedicate hardware to essentially slow down the SoC version of 360. Even small changes are hard in the console space if they are not designed for from day 1.
 

scently

Member
Yeah, I was also a bit surprised to see it being called "more than enough" for frame buffer storage. Even in a forward renderer you can't even do 1080p with 4xMSAA in that.

Do you expect MSAA to be used in the coming generation? Really? 32 MB is enough for a 1080p framebuffer without MSAA.
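The back-of-the-envelope math behind both claims, counting a 32-bit color target plus a 32-bit depth/stencil buffer per sample (a simplified assumption; real render targets vary):

```python
def framebuffer_mib(width, height, msaa_samples, bytes_per_sample=8):
    """Color (4 bytes) + depth/stencil (4 bytes) per sample, in MiB."""
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

print(framebuffer_mib(1920, 1080, 1))  # ~15.8 MiB -> fits in 32 MB of ESRAM
print(framebuffer_mib(1920, 1080, 4))  # ~63.3 MiB -> does not fit
```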
 