
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

DDayton

(more a nerd than a geek)
As a Wii U owner, I'm finding this whole thing rather amusing... it also means I feel as though I can be a disinterested third party.

Given that, I am really puzzled over any scenario put forth in which the PS4 doesn't solidly trounce the Xbox One in every way. It's clearly more powerful, costs $100 less, and is released at the same time. Given that the biggest selling games in the market at the moment (for the 360 or PS3) are third party titles, it seems like the writing is on the wall.

(What's really amusing is that the last MS press bit didn't even dispute that the PS4 was more powerful -- it just used weasel words to walk around it and vaguely claim it didn't matter. Heheheh.)
 

Y2Kev

TLG Fan Caretaker Est. 2009
Who cares about the price? If you can only have ONE console and you are not choosing based on the game lineup, then your priorities are off. IMO of course.


The only reason I am getting both consoles is because they both have game series that I like that I cannot get on PC.
Is this serious? "Who cares about the price"?
 

StaSeb

Member
I think you might have found a good English link for bolstering effect if you looked up confirmation bias.

Oh cool, thanks. I didn't know the effect by that name. Non-native English speaker here; vocabulary is a never-ending challenge.

And yes, buyer's remorse - maybe a kind of pre-order remorse in this case. I can definitely smell it on GAF! ^^
 

Cesar

Banned
let me ask you a simple question, Cesar.

What do you think is going to happen to Forza 5's "graphical effects" when you play it offline, since the game is completely playable offline? Do you think it's going to be a downgrade? Do you believe when playing offline, all these snazzy visual effects will disappear due to the lack of the POWER OF THE CLOUD™?

Offline and online modes that look a bit different are 100% possible; look at Killzone with its 60fps online mode and 30fps campaign. Turn 10 can use more artwork with the cloud.
 
Offline and online modes that look a bit different are 100% possible; look at Killzone with its 60fps online mode and 30fps campaign. Turn 10 can use more artwork with the cloud.

No. Just no. All Turn 10 is using the MS cloud infrastructure for is servers and Drivatars.
 
Offline and online modes that look a bit different are 100% possible; look at Killzone with its 60fps online mode and 30fps campaign. Turn 10 can use more artwork with the cloud.

didn't expect to see the cloud here. and a reference to a competitor title that's not using any damn cloud as an argument. the hell
 

stryke

Member
Offline and online modes that look a bit different are 100% possible; look at Killzone with its 60fps online mode and 30fps campaign. Turn 10 can use more artwork with the cloud.

unsure-larry-davidqfbs7.gif
 

gofreak

GAF's Bob Woodward
They sure are, the Move Engines can swizzle textures during copy. Useful for packing data for optimal GPU access and unpacking for optimal CPU access (if accessing linearly on the CPU).

Yeah, but he made it sound like the Move Engines were saving the GPU on texture decompression in general, the kind that, say, a 7770 has to spend ALU time on, not that they might be useful for packing data in a nice way when sharing textures between CPU and GPU processing. And most texture access is going to be vanilla reads from static data anyway, where it can be formatted offline however best suits the GPU.
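For anyone wondering what "swizzling" means here: below is a minimal sketch of a Morton-order (Z-curve) swizzle, one common tiling scheme, applied during a copy the way a DMA engine might. The actual tiling format the Move Engines use isn't stated anywhere in this thread, so treat the layout as an illustrative assumption.

```python
# Minimal sketch of a Morton-order (Z-curve) swizzle: tiling a texture so that
# nearby texels end up close together in memory. Illustrative only; the real
# Move Engine / GPU tiling format is an assumption here.

def part1by1(n: int) -> int:
    """Spread the bits of a 16-bit integer so there is a zero between each bit."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton_index(x: int, y: int) -> int:
    """Interleave the bits of (x, y) to get the swizzled (tiled) offset."""
    return part1by1(x) | (part1by1(y) << 1)

def swizzle(linear: list, width: int, height: int) -> list:
    """Copy a linear (row-major) texture into Morton order, as a copy engine might."""
    tiled = [0] * (width * height)
    for y in range(height):
        for x in range(width):
            tiled[morton_index(x, y)] = linear[y * width + x]
    return tiled

if __name__ == "__main__":
    # Swizzle a tiny 4x4 "texture" of texel IDs and show the reordering.
    w = h = 4
    print(swizzle(list(range(w * h)), w, h))
```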
 
Offline and online modes that look a bit different are 100% possible; look at Killzone with its 60fps online mode and 30fps campaign. Turn 10 can use more artwork with the cloud.

So they're capable of storing an extra 30fps in the cloud?

Forza 5 90fps confirmed (online only).
 
Is this serious? "Who cares about the price"?

I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.
 
I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.

Kinect over better specs will apply to how many gamers? They've shown zero potential (there's that word again - funny how it's never been realized, ever, in the history of videogames).


Once word gets out that the PS4 is demonstrably more powerful and will most likely get the superior ports (as the 360 did last gen), do you think the CoD guys will think "nope, gotta Kinect!"?

If Sony actually advertises the "most powerful console ever" line, it's a done deal IMO.
 

rjinaz

Member
I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.

Definitely, if you're into the Kinect. But many people (like myself) owned a 360 and never owned or even wanted a Kinect, so being forced to pay more because Kinect is bundled is not appealing, which makes the PS4 the better deal. So I would say it comes down to preference on Kinect.
 
I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.

I can't tell you how many times I've looked at a gorgeous game and thought, "You know, I'd give up all of this extra graphical flair and smooth framerate for a chance to wave my arms around like a dipshit to control my character. But only if the input is more laggy than using my controller."

And finally, Xbone gives me that. It's value, baby.

Obviously, I'm joking. Different strokes for different folks.
 
I can't tell you how many times I've looked at a gorgeous game and thought, "You know, I'd give up all of this for a chance to wave my arms around like a dipshit to control my character. But only if the input is more laggy than using my controller."

And finally, Xbone gives me that. It's value, baby.

Your post seems quite objective and articulate in pointing out both the positives and negatives of what Kinect can do.
 

thefit

Member
I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.

That was mighty generous of them to bundle in something I don't want and charge $100 extra for it, instead of focusing that money and time on better specs.

Value.
 

Chobel

Member
I think it's fair to point out that the value proposition is not entirely in PS4's favor. Yes, the PS4 has better specs but it is debatable whether it presents the best value, as Microsoft chose to invest in an improved Kinect Sensor and bundle it for everyone. A better GPU does not have the potential to offer truly different gaming and multimedia experiences, it will just make the same games go a bit faster. In terms of overall package, I think Microsoft made the right call.

1) You can get the PS Eye for PS4.
2) I and many others don't care about Kinect.
3) You assume the PS4 can't offer any multimedia experiences, which is not true.
4) Many people don't care about that multimedia experience at all.
 

Vizzeh

Banned
I was watching a DX11.2 video and came to the theory that the eSRAM in the Xbox One will be used to store the tiled-resources pool.

Tiled resources only need a small resident pool (around 16 MB) that can stream textures from files as big as 10 GB, most likely stored on the HDD/disc. Since the eSRAM is dedicated to the GPU and is low latency, it is perfect for that.

That's not to say GDDR5 can't do tiled resources too, since it supports DX11.2, but its latency is slightly higher. I don't know how tolerant the technique is of that, though I'm sure it can work fine.

So obviously there is a massive amount of bandwidth available on 8 GB of GDDR5, but eSRAM plus tiled resources is possibly their solution: it can stream the large texture data through a 16 MB pool that basically increases the detail in your POV the closer you get to an object.

This is likely a pain in the ass to develop with, but there is middleware you can add to your game, called Granite, that can handle most of the code for you; it's up to the devs to implement it and work with it.

What I don't understand is how they can fit a 1080p frame buffer with AA in alongside it, considering the eSRAM is SMALL - surely they should have gone with at least 64 MB.

To answer my own question about the frame buffer: according to this link about the Wii U's eDRAM http://www.gameranx.com/updates/id/16172/article/shin-en-dev-manfred-linzner-explains-wii-u-s-edram/ a 1080p frame buffer needs about 16 MB when double buffering. So a single buffer is about 8 MB, meaning 8 MB @ 1080p x 2 = 16 MB, plus a 16 MB tiled-resources pool = 32 MB of eSRAM.
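For reference, the byte math behind that 8 MB / 16 MB / 32 MB budget, as a minimal sketch. The 32-bit colour format, the lack of any depth-buffer or AA overhead, and the 16 MB resident tile pool are assumptions for illustration, not confirmed figures from any dev.

```python
# Minimal sketch of the eSRAM byte budget described above.
# Assumptions (not confirmed anywhere in this thread): 32-bit colour targets,
# no MSAA or depth buffer counted, and a 16 MB resident tile pool.

BYTES_PER_PIXEL = 4          # 32-bit RGBA colour
MB = 1024 * 1024

def buffer_mb(width: int, height: int, bytes_per_pixel: int = BYTES_PER_PIXEL) -> float:
    """Size of one colour buffer in MB."""
    return width * height * bytes_per_pixel / MB

single_1080p = buffer_mb(1920, 1080)   # ~7.9 MB per colour buffer
double_buffered = 2 * single_1080p     # ~15.8 MB front + back buffer
tile_pool = 16.0                       # assumed resident tile pool for tiled resources

print(f"single 1080p buffer : {single_1080p:.1f} MB")
print(f"double buffered     : {double_buffered:.1f} MB")
print(f"with 16 MB tile pool: {double_buffered + tile_pool:.1f} MB of 32 MB eSRAM")
```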
 
Kinect over better specs will apply to how many gamers?

Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.
 

JonnyLH

Banned
Finally got my account approved on NeoGAF. Cheers admins!

I've been watching this thread since its birth, and I've got to say, the amount of misinformation and conclusion-jumping here is immense. For a community which is highly "in the know", I can't believe no one has critically evaluated this with any technological understanding.

This article by Edge has blown up massively over these past few days, and guess what, that was its sole purpose. They've got what they wanted: a metric f**k ton of ad revenue and page hits. The article is wrong for many reasons:

  • How can a console's drivers still be unfinished and its hardware not final when they've gone into mass production?
  • DirectX has been on the platform since the start; it's not buggy or "poor", it just works thanks to the shared codebase. They also released their mono driver during E3, which is the specially optimised version of DirectX for the platform. So saying they have been late with drivers is flat out wrong.
  • They mention "without optimisation". To me, that means someone is working these numbers out without a real kit and is literally speculating for EDGE's page views in this upcoming next-gen war. There is also more work offloaded to special processors inside the X1 than there is in the PS4.
  • Do 6 more CUs mean the whole console is 50% faster? It's a very well known fact that extra CUs dramatically decrease the efficiency of multi-threaded tasks and of the shader cores themselves. It's not a linear performance gain; it depends on many factors. I'm not saying the PS4 GPU doesn't have more CUs, which it does. But consider that the PS4 GPU is going to have a lot more to work on outside of games compared to the X1: video encoding, video decoding and even, as Mark Cerny said, a lot of the audio tasks will be offloaded to the GPU, because the GPU is a parallel processing unit which isn't affected by GDDR latency in the same way the CPU is. Those extra CUs start counting for less and less without the custom architecture to back them up. Oh, and the developers have a lot more legwork managing the threading and task handling of the GPU.
  • Memory reads are 50% faster? From what? I can tell you for a fact that if it's the CPU doing the memory read, it would be a heck of a lot slower. Even if it's the GPU doing the read, if the developer doesn't implement task switching while waiting for the GDDR return, it'll still be slower. It depends how deep the OpenGL wrapper goes.
By no means am I saying the PS4 doesn't have more of a GPU, because it does. The thing is, though, it needs that GPU when you've got a CPU crippled by GDDR latency. Audio processing (not to be confused with the audio encoder in the PS4) will have to be offloaded to the GPU, and a lot of the physics will be handled by the GPU. Those extra CUs get eaten away more and more, and when you've got a CPU you have to think a lot about because they've put GDDR in there, you start to see what Albert Penello is saying.
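To make the "not a linear performance gain" claim concrete, here is a minimal Amdahl's-law-style sketch. The 18 vs. 12 CU counts are the publicly known figures, but the fraction of a frame that is actually ALU-bound is purely an assumed, illustrative number, not a measured profile of either console.

```python
# Minimal Amdahl's-law-style sketch of why more CUs don't translate 1:1 into frame rate.
# ASSUMPTION: the ALU-bound fractions below are illustrative only.

def speedup(cu_ratio: float, alu_bound_fraction: float) -> float:
    """Overall speedup when only part of the frame scales with CU count."""
    return 1.0 / ((1.0 - alu_bound_fraction) + alu_bound_fraction / cu_ratio)

cu_ratio = 18 / 12  # PS4 vs Xbox One CU count -> 1.5x raw ALU throughput

for frac in (1.0, 0.7, 0.5):
    print(f"ALU-bound fraction {frac:.0%}: overall speedup {speedup(cu_ratio, frac):.2f}x")
    # 100% -> 1.50x, 70% -> ~1.30x, 50% -> 1.20x
```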
 

Thrakier

Member
Let's be fair: Microsoft showed TitanFall, which is probably the single most hyped and desired next gen game so far. The lines at PAX were insane for that game and dwarfed all others (based on what I read, I should add. I wasn't there).

They have plenty to get folks excited.

Titanfall will be by FAR the more enjoyable game on the PC. It's supposedly not even 1080p on XB1. If anything, the case of Titanfall proves that my point is valid.

(And yes, I'm pretty sure you can buy a $500 PC which plays Titanfall at 1080p/60.)
 
A little off topic: has anyone posted that gif with the PS4 RAM being taken out, kitty crying and Mark Cerny smiling?

I wanna see it again but can't find it.
 

Chobel

Member
Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.

I don't know about you, but I really don't care about casual games.
 
Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.

Casual gamers do not spend $500 on a gaming box.

Titanfall will be by FAR the more enjoyable game on the PC. It's supposedly not even 1080p on XB1. If anything, the case of Titanfall proves that my point is valid.

(And yes, I'm pretty sure you can buy a $500 PC which plays Titanfall at 1080p/60.)

So if it's not 1080p it's not fun?

Tell Naughty Dog, quick!
 

twobear

sputum-flecked apoplexy
Nope, because Drivatar AI processing is done when not playing or racing.

No, I meant: in online races, couldn't they offload the AI drivers in the race to the cloud and then use the CPU resources freed up by that to do more stuff locally?

This is what I thought was weird about the cloud stuff at the time: they were talking about offloading the low-latency stuff (graphics) to the cloud instead of the latency-tolerant stuff. Why couldn't they offload the latency-tolerant stuff to the cloud and then use the spare cycles for more local low-latency work?

I'm not saying this is some kind of secret sauce, just that if they're serious about offloading stuff to remote servers, this seems like the kind of thing they could hypothetically do.
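A minimal sketch of the pattern being described: kick the latency-tolerant work (AI planning) out to a remote service while the latency-sensitive local loop keeps ticking, and apply whatever result comes back whenever it arrives. The 200 ms round trip and the exact work split are illustrative assumptions, not anything Turn 10 or Microsoft have described.

```python
# Minimal sketch of offloading latency-tolerant work (e.g. AI planning) while the
# latency-sensitive loop keeps running locally. The 200 ms "cloud" round trip and
# the work split are assumptions for illustration only.
import concurrent.futures
import time

def remote_ai_plan(race_state: dict) -> dict:
    """Stand-in for a cloud call: slow, but nothing frame-critical waits on it."""
    time.sleep(0.2)                       # pretend network + server time
    return {"target_speed": race_state["lap"] * 10 + 100}

def local_frame(frame: int, plan: dict) -> None:
    """Latency-sensitive per-frame work that must never block on the network."""
    print(f"frame {frame:02d}: steering with plan {plan}")

if __name__ == "__main__":
    plan = {"target_speed": 100}          # start with a default plan
    with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(remote_ai_plan, {"lap": 1})
        for frame in range(10):
            if pending.done():            # apply the cloud result whenever it lands
                plan = pending.result()
                pending = pool.submit(remote_ai_plan, {"lap": frame})
            local_frame(frame, plan)
            time.sleep(1 / 30)            # ~30 fps local tick
```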
 
Finally got my account approved on NeoGAF. Cheers admins!

I've been watching this thread since its birth, and I've got to say, the amount of misinformation and conclusion-jumping here is immense. For a community which is highly "in the know", I can't believe no one has critically evaluated this with any technological understanding.

This article by Edge has blown up massively over these past few days, and guess what, that was its sole purpose. They've got what they wanted: a metric f**k ton of ad revenue and page hits. The article is wrong for many reasons:

  • How can a console's drivers still be unfinished and its hardware not final when they've gone into mass production?
  • DirectX has been on the platform since the start; it's not buggy or "poor", it just works thanks to the shared codebase. They also released their mono driver during E3, which is the specially optimised version of DirectX for the platform. So saying they have been late with drivers is flat out wrong.
  • They mention "without optimisation". To me, that means someone is working these numbers out without a real kit and is literally speculating for EDGE's page views in this upcoming next-gen war. There is also more work offloaded to special processors inside the X1 than there is in the PS4.
  • Do 6 more CUs mean the whole console is 50% faster? It's a very well known fact that extra CUs dramatically decrease the efficiency of multi-threaded tasks and of the shader cores themselves. It's not a linear performance gain; it depends on many factors. I'm not saying the PS4 GPU doesn't have more CUs, which it does. But consider that the PS4 GPU is going to have a lot more to work on outside of games compared to the X1: video encoding, video decoding and even, as Mark Cerny said, a lot of the audio tasks will be offloaded to the GPU, because the GPU is a parallel processing unit which isn't affected by GDDR latency in the same way the CPU is. Those extra CUs start counting for less and less without the custom architecture to back them up. Oh, and the developers have a lot more legwork managing the threading and task handling of the GPU.
  • Memory reads are 50% faster? From what? I can tell you for a fact that if it's the CPU doing the memory read, it would be a heck of a lot slower. Even if it's the GPU doing the read, if the developer doesn't implement task switching while waiting for the GDDR return, it'll still be slower. It depends how deep the OpenGL wrapper goes.
By no means am I saying the PS4 doesn't have more of a GPU, because it does. The thing is, though, it needs that GPU when you've got a CPU crippled by GDDR latency. Audio processing (not to be confused with the audio encoder in the PS4) will have to be offloaded to the GPU, and a lot of the physics will be handled by the GPU. Those extra CUs get eaten away more and more, and when you've got a CPU you have to think a lot about because they've put GDDR in there, you start to see what Albert Penello is saying.

Who are you again? Honestly asking.
 

xelloss12

Member
Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.

Those gamers have moved on to phone and tablet gaming, and are not coming back.
 
Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.

The Wii wasn't $500, nor had the Wii Remote already been on the market for two years bundled with the GameCube before the Wii launched, the way Kinect has been on the 360. The casual crowd doesn't buy expensive consoles early in their lifecycle. The people buying 360s last Christmas are not the same audience that bought 360s in 2005. Casuals prefer an affordable console with an established library, not a console with an expensive price tag.
 

libregkd

Member
But a whole lot of people do. The failure of Nintendo has left a void in the market.
I think it's more accurate to say Nintendo has failed because the casual market that the Wii had has left, not that there is a market for the taking now that Nintendo has failed with the Wii U. The casual fans that the Wii brought in have already moved on to other things.
 
Finally got my account approved on NeoGAF. Cheers admins!

I've been watching this thread since its birth, and I've got to say, the amount of misinformation and conclusion-jumping here is immense. For a community which is highly "in the know", I can't believe no one has critically evaluated this with any technological understanding.

This article by Edge has blown up massively over these past few days, and guess what, that was its sole purpose. They've got what they wanted: a metric f**k ton of ad revenue and page hits. The article is wrong for many reasons:

  • How can a console's drivers still be unfinished and its hardware not final when they've gone into mass production?
  • DirectX has been on the platform since the start; it's not buggy or "poor", it just works thanks to the shared codebase. They also released their mono driver during E3, which is the specially optimised version of DirectX for the platform. So saying they have been late with drivers is flat out wrong.
  • They mention "without optimisation". To me, that means someone is working these numbers out without a real kit and is literally speculating for EDGE's page views in this upcoming next-gen war. There is also more work offloaded to special processors inside the X1 than there is in the PS4.
  • Do 6 more CUs mean the whole console is 50% faster? It's a very well known fact that extra CUs dramatically decrease the efficiency of multi-threaded tasks and of the shader cores themselves. It's not a linear performance gain; it depends on many factors. I'm not saying the PS4 GPU doesn't have more CUs, which it does. But consider that the PS4 GPU is going to have a lot more to work on outside of games compared to the X1: video encoding, video decoding and even, as Mark Cerny said, a lot of the audio tasks will be offloaded to the GPU, because the GPU is a parallel processing unit which isn't affected by GDDR latency in the same way the CPU is. Those extra CUs start counting for less and less without the custom architecture to back them up. Oh, and the developers have a lot more legwork managing the threading and task handling of the GPU.
  • Memory reads are 50% faster? From what? I can tell you for a fact that if it's the CPU doing the memory read, it would be a heck of a lot slower. Even if it's the GPU doing the read, if the developer doesn't implement task switching while waiting for the GDDR return, it'll still be slower. It depends how deep the OpenGL wrapper goes.
By no means am I saying the PS4 doesn't have more of a GPU, because it does. The thing is, though, it needs that GPU when you've got a CPU crippled by GDDR latency. Audio processing (not to be confused with the audio encoder in the PS4) will have to be offloaded to the GPU, and a lot of the physics will be handled by the GPU. Those extra CUs get eaten away more and more, and when you've got a CPU you have to think a lot about because they've put GDDR in there, you start to see what Albert Penello is saying.

so basically you typed all that just to imply the ps4 is unbalanced tech?

OK, then.....
 
Judging by the Wii, maybe a metric ton of them. If it's used correctly. I firmly believe that bundling the Kinect was the right decision. A console can't survive on specs alone and the PS4 has nothing to attract the casual crowd with.

The X1 is nothing like the Wii, and the people that made the Wii so big have gone to tablets and cellphones.
First, the Wii had motion control, which had never happened before; people already know about Kinect.
Second, the Wii was cheap, while the X1 is $500, which is far from that.

All the casual crowd needs is games and a cheap price, and they will come.
Cheaper games via digital distribution and F2P on consoles will help too.
 