Ricerocket
Member
Nothing matters when there is the... Cloud.
I haven't heard the 40x power from the cloud claim in a while, I wonder if they gave up on that?
Penello: "18 CUs [compute units] vs. 12 CUs =/= 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU."
Ars: "The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."
FWIW my expectation is that PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CUs are MASSIVELY underutilized on vertex-heavy workloads, and plenty of the frame will be ROP or bandwidth limited.
There are just too many little things that can significantly impact performance, there were times with early firmwares on PS4 where seemingly innocuous changes would affect performance by as much as 10%.
The eSRAM will certainly provide an advantage under some circumstances, and I'm interested if the ROP difference will end up being a factor, or the lower eSRAM latency will end up nullifying it.
For that matter I could imagine the graphics API having a significant effect; the CPU single-threaded performance isn't great on these machines, and a poorly conceived implementation could hurt games across the board. Sony having "lower level access" here isn't necessarily a win.
Hell I could imagine cases where CPU limited games demonstrate an advantage on XB1.
Penello: "Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall."
Ars: "What the hell does that even mean?"
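For what it's worth, the raw arithmetic behind both claims is easy to check. Assuming the widely reported GCN figures (64 ALU lanes per CU, 2 FLOPs per lane per cycle via fused multiply-add) and the reported clocks (800 MHz for PS4, 853 MHz for Xbox One), a quick sketch:

```python
def peak_gflops(cus, clock_mhz, lanes=64, flops_per_lane=2):
    """Theoretical peak shader throughput for a GCN-style GPU.
    lanes and flops_per_lane are the commonly reported GCN values
    (64 lanes per CU, FMA = 2 FLOPs/cycle), not official console docs."""
    return cus * lanes * flops_per_lane * clock_mhz / 1000.0  # GFLOPS

ps4 = peak_gflops(18, 800)  # 1843.2 GFLOPS
xb1 = peak_gflops(12, 853)  # ~1310.2 GFLOPS
print(ps4, xb1, ps4 / xb1)  # ratio ~1.41, not 1.5 -- the 6% clock bump
                            # claws back a little of the CU deficit
```

So on paper the gap is about 41%, which is what the "6% faster per CU" remark amounts to.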
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."
Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."
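The peak bandwidth figures themselves are just effective transfer rate times bus width. A sketch using the commonly cited memory configurations (5500 MT/s GDDR5 and 2133 MT/s DDR3, both on 256-bit buses; these are assumptions from public spec reporting, not official documentation):

```python
def peak_gbps(mt_per_s, bus_bits):
    """Peak bandwidth in GB/s: transfer rate (MT/s) x bus width (bits) / 8."""
    return mt_per_s * bus_bits / 8 / 1000.0

gddr5 = peak_gbps(5500, 256)  # PS4 main memory: 176.0 GB/s
ddr3 = peak_gbps(2133, 256)   # Xbox One main memory: ~68.3 GB/s
print(gddr5, ddr3)
```

The 272 GB/s figure comes from summing the DDR3 peak with the ESRAM peak, which is exactly the "just adding up bandwidth numbers" Ars objects to: the two pools are separate, and the ESRAM is only 32 MB.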
That's the WiFi frequency...
I thought that was the GDDR5 and has nothing to do with the CPU??
I thought there were killzone shadowfall slides released that indicated it was 1.6.
As much as I think Penello is mostly blowing smoke, most of the Ars rebuttals were less than thorough.
I agree with this, but I also understand that this specs war thing is beyond beating a dead horse at this point. We're never going to see MS come out and openly say their specs are weaker in comparison to Sony's, nor will they openly admit that they misled the public (neogaf) or lied.
I feel bad for Albert. I shouldn't, but I do.
I appreciate him taking the time to post and interact with the community, but he's doing more harm than good with everything he says making headlines daily (and ALL of it negative in nature). If Microsoft was smart they'd shut him down before he does even more harm.
Most industry folks get PR training before they're allowed to do interviews or post, and his recent posts are prime examples of engagements that should be avoided at all costs.
I feel bad for Albert. I shouldn't, but I do.
Don't. It was his choice.
I feel bad for Albert. I shouldn't, but I do.
Albert must be sure because he is going all in with his boss.
How reliable is Ars when it comes to understanding this kind of stuff though?
Extremely. Technology is their original expertise and specialty.
That this type of response is coming from Peter Bright (Ars's own MS reporter, who is a huge MS fanboy by his own admission and uses a Windows Phone) and Kyle Orland (their games editor) says a lot.
Ars's readership also heavily prefers MS and hates Sony, so there's that, too.
let's get everyone from MS here.
I think they've moved on from that talking point.
How reliable is Ars when it comes to understanding this kind of stuff though?
They called the Vita TV "dead in the water" or something similar, they don't exactly play the favorites game.
Man so many numbers being thrown around the past few months it's been hard to keep track.
http://www.blogcdn.com/www.engadget.com/media/2013/07/sony-ps4-dev-kit-specs.jpg
Was that not talking about the CPU?
No. It was the max frequency for any chip, transmitter, or receiver within the PS4 devkit. Including WLAN, Bluetooth, whatever else.
[quote="SwiftDeath, post: 81287589"]So what the hell is going to happen with multiplats then?
Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched[/QUOTE]
MS might be able to negotiate that devs keep the same quality of assets (models and textures), but it would take some real work to make the performance any worse on PS4 than on XBOne. They might be able to lock the Xbox One and PS4 versions to the same framerate, but the XBO version would see dips in performance where the PS4 version wouldn't.
So what the hell is going to happen with multiplats then?
Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched
nfs does not look worse than driveclub
Full of shit as expected
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."
Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."
I know, I was surprised they didn't assert that the cloud would make up for the difference. I thought for sure they would. :'/
Currently it is 1.6 until Sony says otherwise. Just like Microsoft confirmed 1.75
Exactly!
Wait!
Penello really said that?
Bwhahahaha!
Penello: "18 CUs [compute units] vs. 12 CUs =/= 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU."
Ars: "The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."
Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."
Penello: "We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles."
Ars: "Maybe true."
Penello: "Speaking of GPGPU—we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU."
Ars: "I don't know if that's even true."
The sooner AMD and MS release the full rundown on the final tech, the better. If what was said is true, we have to endure just a few more weeks of very predictable tech speculation threads and tech site hit-generators. They need to do direct interviews with all of these tech sites when they do; otherwise we're gonna have more thread-derailing, conversation-stifling accusations of bias or unreliability, if not straight bullshit.
So what the hell is going to happen with multiplats then?
Albert's continual push that differences will look negligible almost makes me think the tinfoil hat theory of MS buying multiplat parity is not so far-fetched
Penello is a marketing person, not a "fanboy". He is trying to sell us a product by any means necessary, that includes misrepresentation of facts and bizarre misdirection (like his "we invented Direct Compute/DirectX" comment).
It's also why he is trying to "buddy up" on the forums.
I highly doubt this is true. If so, AMD/Nvidia would just keep adding CUs to cards instead of increasing bandwidth, ROPs, etc. Why consider bandwidth at all if increasing CUs gives perfect scaling?
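That intuition can be made concrete with a toy roofline model: attainable throughput is the lesser of the ALU peak and what the memory system can feed (arithmetic intensity in FLOPs per byte times bandwidth). The numbers below are illustrative, not measurements:

```python
def attainable_gflops(peak_gflops, bandwidth_gbps, arithmetic_intensity):
    """Roofline sketch: throughput is capped by whichever of the ALUs
    or the memory bus runs out first. arithmetic_intensity = FLOPs/byte."""
    return min(peak_gflops, bandwidth_gbps * arithmetic_intensity)

# A bandwidth-bound kernel (1 FLOP/byte) sees no benefit from more CUs:
print(attainable_gflops(1843, 176, 1.0))   # 176.0 -- memory-bound
print(attainable_gflops(1310, 176, 1.0))   # 176.0 -- same, despite fewer CUs
# A compute-bound kernel (20 FLOPs/byte) does scale with CU count:
print(attainable_gflops(1843, 176, 20.0))  # 1843 -- ALU-bound
```

Which is why real GPU designs balance CUs against bandwidth and ROPs instead of scaling one knob forever.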
http://www.neogaf.com/forum/showpost.php?p=80986853&postcount=949 said: "And please allow me to help out your PR department a little. 204gb/sec, according to your understanding of the number, actually implies the old clock rate of 800mhz. The new number should be 853 * 128 * 2 (simultaneous read/write per cycle) = 218gb/sec. They can thank me later."
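For what it's worth, that arithmetic does check out: the quoted 204 GB/s falls out of the same formula at the old 800 MHz clock, and the 853 MHz upclock gives roughly 218 GB/s. A quick check:

```python
def esram_gbps(clock_mhz, bytes_per_cycle=128, rw_factor=2):
    """ESRAM peak per the quoted formula: clock x bus bytes x simultaneous
    read/write. The 128-byte width and 2x R/W factor are taken from the
    quoted post, not from official documentation."""
    return clock_mhz * bytes_per_cycle * rw_factor / 1000.0

print(esram_gbps(800))  # 204.8 -- matches the quoted 204 GB/s figure
print(esram_gbps(853))  # ~218.4 -- the post's corrected number
```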
As far as we know it is 100% true.
Why not try to find out?