amongst the many other members...
http://www.blu-raydisc.com/en/AboutBlu-ray/SupportingCompanies.aspx
What the... Microsoft isn't even on there.
Can somebody explain to me how this cloud computing will help AI opponents in games? XBONE seems to have this tech and I find it fascinating. At first, I thought it was just jargon to mask some sort of DRM or whatever, but I'm intrigued at how the cloud can actually make gaming better.
Does MS have to pay Sony some kind of licensing fee now that they have a Blu-ray drive in their system?
This is probably a stupid question, but will Sony make money off each Blu-ray disc sold for the Xbox One? I know their stake in it is small, but still.
Is the PS4 CPU confirmed at 1.6 GHz? I read 2.0 GHz in other sources.
Well, it is not going to be a difference as big as the GPU: the PS4 CPU would be just 25% faster than the Xbox One's, and the Xbox One CPU would be only 20% slower than the PS4's...
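For what it's worth, that math assumes the rumored clocks of 2.0 GHz (PS4) and 1.6 GHz (Xbox One), neither of which is confirmed. A quick sketch of the arithmetic:

```python
# Relative clock-speed difference under the rumored (unconfirmed) clocks.
ps4_ghz, xb1_ghz = 2.0, 1.6

faster = (ps4_ghz / xb1_ghz - 1) * 100   # PS4 relative to Xbox One
slower = (1 - xb1_ghz / ps4_ghz) * 100   # Xbox One relative to PS4

print(f"PS4 CPU is {faster:.0f}% faster")       # 25% faster
print(f"Xbox One CPU is {slower:.0f}% slower")  # 20% slower
```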
Specs didn't matter in any of the previous generations. Consoles will be judged on the games they have and don't have.
I don't really get why they're saying it should have better thermals/power efficiency. Aren't transistors all created equal?
Just check out the latest SimCity!
8 GB is confirmed. 12 CUs are confirmed. 3 GB reserved for the OS is confirmed. An 8-core CPU is confirmed. The rest, I don't know.
Not really, no. But that's not the important point: GDDR5 uses quite a bit more power than DDR3.
However, I don't see an issue. It will still be a much lower total TDP than a launch PS3, and Sony had no problem cooling that.
What about YLOD? Wasn't that because of some overheating issues?
I hope we'll see it in the games then. Currently, DriveClub does not look much more impressive than Forza 5.
Somehow I don't think that rumored platforming game on Xbox One is gonna look any worse than Knack, either. MS'll probably show an exclusive shooter as well.
Come E3, there are gonna be a lot of graphics comparisons, but at this stage I'm not sure there will be a clear-cut winner, even if on paper the PS4 is "substantially more powerful".
Sure, I can explain it to you:
Microsoft knows that their box is less capable, so they deflect with vague non-statements about stuff outside of that box.
A real-time interactive simulation (i.e. a game) is one of the worst possible applications for cloud computing.
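To put rough numbers on why, here's the frame-budget math (the 100 ms round trip is an illustrative assumption, not a measurement of any particular service):

```python
# Why per-frame cloud offload doesn't work for a real-time simulation.
frame_budget_ms = 1000 / 60    # ~16.7 ms per frame at 60 fps
round_trip_ms = 100            # assumed typical internet round trip

frames_late = round_trip_ms / frame_budget_ms
print(f"Frame budget: {frame_budget_ms:.1f} ms")
print(f"A {round_trip_ms} ms round trip spans ~{frames_late:.0f} frames")
# Anything the simulation needs *this frame* can't wait on a server;
# only latency-tolerant work (long-term AI planning, world persistence)
# is a plausible fit.
```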
Have we seen gameplay footage of either? And I mean gameplay, not target render crap.
How did game OSes go from a few MB to several GB of RAM usage in one generation?
The Wii's entire OS ran in 64 MB of RAM, and now the Wii U uses 1 GB.
The 360 reserved 32 MB for the OS, and now the Xbox One is using 3 GB.
"Xbox One should enjoy better power/thermal characteristics compared to the PlayStation 4"
PS4 will be even bigger.
I don't think final silicon actually exists yet for either system, so no. Until final silicon, everything is either a target render or running realtime on estimated hardware built from off-the-shelf parts.
How many bits are these systems?
Theoretically, as far as I understand the cloud thing, you can have more computational power with cloud computing: you can offload tasks to the cloud so that the console does not have to do all the work. Take OnLive, for example: you could play Crysis 3 on your tablet because the computing is done server-side and not on your tablet. Please, someone correct me if I am wrong.
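That's the gist, though for a game running locally the console can only offload work that tolerates latency. Here's a toy sketch of the idea (everything below is made up for illustration; it's not any actual Xbox Live or OnLive API): kick off a slow "cloud" job, keep rendering every frame, and fall back to a cheap local result until the answer arrives.

```python
import concurrent.futures
import time

def cloud_ai_plan(world_state):
    """Hypothetical stand-in for a cloud AI service; in reality this
    would be a network call to a remote server."""
    time.sleep(0.2)  # simulate network + server compute latency
    return f"flanking route for state {world_state}"

def local_ai_plan(world_state):
    """Cheap on-console fallback."""
    return f"simple chase for state {world_state}"

executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)
pending = executor.submit(cloud_ai_plan, 42)

# The game loop never blocks on the cloud: each frame it uses the
# cloud plan if it has arrived, otherwise the local fallback.
for frame in range(15):
    plan = pending.result() if pending.done() else local_ai_plan(42)
    print(f"frame {frame:2d}: {plan}")
    time.sleep(1 / 60)  # ~16.7 ms frame

executor.shutdown()
```

Around frame 12 the cloud plan shows up and the AI "upgrades" itself; the catch is that nothing frame-critical can be handled this way.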
Yes, it will be. GDDR5 is an interesting choice for a long-term proposition.
Is it possible to replace it with stacked Wide I/O DDR4 one day (and raise latencies to match GDDR5 levels)?
But this is a bit silly, isn't it? Wouldn't that mean, for example, that a single-player game without any online functionality would need to be played "online" if it uses the cloud for "computational power"?
I feel NeoGAF will need to self censor the term "cloud-computing" into "magic" for the coming year.
This was speculated in the RAM threads in the lead-up to the PS4 reveal, yes.
That would end up lowering the cost and heat requirements of GDDR5 while allowing them to get the bandwidth into the system at launch, when DDR4 isn't ready.
Technologically I have no idea if it would be possible. It is beyond my understanding. Durante would probably be able to help.
Gamespot: PS4 < XBone
WHO AM I SUPPOSED TO BELIEVE???!!!
DDR3 does have lower latency than GDDR5, right?
Yes. But the increased throughput of GDDR5 trumps it in almost all graphics-related situations, which is why PC GPUs use GDDR5.
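A back-of-the-envelope illustration, using the reported bandwidth figures (176 GB/s GDDR5 for the PS4, 68.3 GB/s DDR3 for the Xbox One, ignoring the ESRAM for simplicity):

```python
# Time to stream one 1080p RGBA8 buffer at each console's reported
# main-memory bandwidth (Xbox One's ESRAM ignored for simplicity).
frame_bytes = 1920 * 1080 * 4  # ~7.9 MB per 32-bit 1080p buffer

for name, gb_per_s in [("PS4 GDDR5", 176.0), ("XB1 DDR3", 68.3)]:
    microseconds = frame_bytes / (gb_per_s * 1e9) * 1e6
    print(f"{name}: {microseconds:.0f} us per full-buffer pass")
# GPUs hide memory latency with massive parallelism, so the ~2.6x
# throughput gap matters far more for graphics than the latency gap.
```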
Ok, thanks for the info. Durante, could you please clarify if such a thing would be possible?
I would if I could. It depends on how much developers are allowed to depend on the exact performance characteristics of the GDDR5 incarnation of the PS4. I assume those would be rather hard to replicate with an entirely different memory type -- remember that MS actually had to dedicate hardware to essentially slow down the SoC version of the 360. Even small changes are hard in the console space if they are not designed for from day 1.
Yeah, I was also a bit surprised to see it being called "more than enough" for frame buffer storage. Even in a forward renderer you can't do 1080p with 4xMSAA in that.
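The arithmetic behind that, assuming the "that" here is the Xbox One's widely reported 32 MB of ESRAM, with 32-bit color and a 32-bit depth/stencil buffer:

```python
# 1080p with 4x MSAA in a forward renderer: per-sample color + depth.
width, height, msaa = 1920, 1080, 4
bytes_per_sample = 4  # RGBA8 color; D24S8 depth is also 4 bytes

color_mb = width * height * msaa * bytes_per_sample / 2**20
depth_mb = width * height * msaa * bytes_per_sample / 2**20

print(f"color {color_mb:.1f} MB + depth {depth_mb:.1f} MB "
      f"= {color_mb + depth_mb:.1f} MB vs 32 MB of ESRAM")
# ~31.6 MB + ~31.6 MB = ~63.3 MB: the color buffer alone nearly fills
# the ESRAM before you even add depth or any intermediate targets.
```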