Incredible rebuttal!
I don't know... you really think Microsoft would do that? Just go on the internet and try to mislead people with numbers?
new theory: Albert got trolled by his coworker
Penello: "Ok dude, are you SURE this stuff is accurate? Those cats on GAF are pretty rough"
Fellow: "Dude, it's totally accurate, go for it" trollface.jpg
I'm waiting for Albert now, maybe he will explain what he meant by that.
It's a well known site calling bullshit on his claims, not just random gaf posters.
Lol, this is going to be the exact same thread all over again but with the twist that the OP cites the first thread. Inception.
Looking forward to a reply. Buckling in.
Penello: "18 CUs [compute units] vs. 12 CUs =/= 50% more performance. Multi-core processors have inherent inefficiency with more CUs, so it's simply incorrect to say 50% more GPU."
Ars: "The entire point of GPU workloads is that they scale basically perfectly, so 50% more cores is in fact 50% faster."
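For what it's worth, the raw numbers both sides are throwing around are easy to sanity-check. A quick back-of-the-envelope sketch (the 853 MHz / 800 MHz clocks and the 64 ALUs per GCN compute unit are assumptions from public spec reporting, not from this thread):

```python
# Theoretical peak shader throughput for a GCN-style GPU:
# GFLOPS = CUs * 64 ALUs/CU * 2 ops per cycle (fused multiply-add) * clock in MHz / 1000
def peak_gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

xb1 = peak_gflops(12, 853)   # widely reported Xbox One figures
ps4 = peak_gflops(18, 800)   # widely reported PS4 figures
print(xb1, ps4, ps4 / xb1)   # the ratio lands around 1.4, not 1.5
```

So on paper the 50% CU advantage shrinks to roughly +40% once the clock difference is folded in; whether real workloads realize even that is the whole argument.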
I found it hilarious that he admitted that 'technical fellows' are rare at Microsoft.
It's a well known site calling bullshit on his claims, not just random gaf posters.
Don't hold your breath. When called out on the BS and asked for explanations, he went into "Well I give you guys proofs straight from our elite engineers but you just think I lie. WE MADE DIRECT X!" and "it's pointless for me to post more about the topic, you don't trust me. You'll see at launch anyway" mode.
I'm on mobile so I can't give a link, but ERP on B3D (mod and former first party Sony dev) said that you have a performance decrease per CU the more you have. And the rest is basically "we don't know better".
Meh article
http://beyond3d.com/showpost.php?p=1782673&postcount=6777
FWIW my expectation is that PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CU's are MASSIVELY underutilized on vertex heavy workloads and plenty of the frame will be ROP or bandwidth limited.
There are just too many little things that can significantly impact performance, there were times with early firmwares on PS4 where seemingly innocuous changes would affect performance by as much as 10%.
The eSRAM will certainly provide an advantage under some circumstances, and I'm interested if the ROP difference will end up being a factor, or the lower eSRAM latency will end up nullifying it.
For that matter I could imagine the graphics API having a significant effect; the CPU single threaded performance isn't great on these machines, and a poorly conceived implementation could hurt games across the board. Sony having "lower level access" here isn't necessarily a win.
Hell I could imagine cases where CPU limited games demonstrate an advantage on XB1.
Yes it only performs 30% better. What a worthless improvement!
Except GPU benchmarks usually don't support this view. Even if you compare GPU's in the same line, performance never scales perfectly. For example, look at Anandbench for a comparison of the 7870 vs 7970. The Tflop difference is +48% in favor of the 7970, yet in benchmarks it performs on average ~30% better.
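That scaling-efficiency point can be checked against the cards' listed peak figures. A rough sketch (the 2.56 / 3.79 TFLOPS numbers are the vendor spec-sheet peaks, and the ~30% benchmark delta is the poster's figure, not something measured here):

```python
# Peak single-precision throughput from spec sheets, in TFLOPS
tflops_7870 = 2.56
tflops_7970 = 3.79

paper_gain = tflops_7970 / tflops_7870 - 1   # ~48% advantage on paper
bench_gain = 0.30                            # poster's ~30% observed average
efficiency = bench_gain / paper_gain         # fraction of the paper gain realized

print(f"paper +{paper_gain:.0%}, only {efficiency:.0%} of it shows up in benchmarks")
```

In other words, by these numbers only about 60% of the on-paper advantage survives contact with real workloads, which is the crux of the disagreement.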
who is ars, just curious, because he really didn't say much but I'm most likely not understanding something lol
Penello: "We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted."
Ars: "Just adding up bandwidth numbers is idiotic and meaningless. While the Xbox One's ESRAM is a little faster, we don't know how it's used, and the PS4's GDDR5 is obviously a lot bigger."
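The contested arithmetic itself is trivial; the argument is about whether the addition is meaningful. A sketch using only the on-paper figures quoted above:

```python
# Peak bandwidth figures as quoted in the thread, in GB/s
ddr3  = 68    # Xbox One main memory (DDR3)
esram = 204   # Xbox One eSRAM peak, counting simultaneous read+write
gddr5 = 176   # PS4 unified memory (GDDR5)

xb1_combined = ddr3 + esram   # Penello's 272 GB/s figure
# Caveat: the eSRAM peak applies only to a 32 MB pool, and only when read
# and write cycles actually overlap, so the sum is a theoretical ceiling,
# not a sustained number for the whole memory system.
print(xb1_combined, gddr5)
```

So both sides' numbers are "correct"; they just measure different things, which is why simply comparing 272 against 176 tells you very little.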
Yes it only performs 30% better. What a worthless improvement!
For some reason I just feel that when this guy and Major Nelson start talking hardware, they're just parroting what the hardware team tells them.
"Yes, it's just as powerful as the PS4," MS' hardware team tells them.
So that's what they tell people.
When Cerny speaks, however, I know that he designed the damn thing. When he talks numbers, I know that he knows what they mean, even though I might not fully understand him. If he says the PS4 is more powerful, I sure as hell believe him.
WE GET IT. PS4 IS FASTER. What in fuck's name is the point of this all? Albert WORKS for Microsoft, it's his job to make the difference seem as marginal as possible. He will never come out and say "shucks guys, you got me". Xbone fanboys will try to downplay the difference tooth and nail. Do we have to do this each and every day?
That is a job title there, why would that be hilarious?
Another one of these threads I see.
It would have been nice of Ars to actually respond with something insightful instead of just off the cuff remarks.
Was it this?
http://beyond3d.com/showpost.php?p=1782673&postcount=6777
FWIW my expectation is that PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CU's are MASSIVELY underutilized on vertex heavy workloads and plenty of the frame will be ROP or bandwidth limited.
There are just too many little things that can significantly impact performance, there were times with early firmwares on PS4 where seemingly innocuous changes would affect performance by as much as 10%.
The eSRAM will certainly provide an advantage under some circumstances, and I'm interested if the ROP difference will end up being a factor, or the lower eSRAM latency will end up nullifying it.
For that matter I could imagine the graphics API having a significant effect; the CPU single threaded performance isn't great on these machines, and a poorly conceived implementation could hurt games across the board. Sony having "lower level access" here isn't necessarily a win.
Hell I could imagine cases where CPU limited games demonstrate an advantage on XB1.
and above all, this
You guys are just having trouble imagining how the less powerful console could, in actuality, be the more powerful console.
You'd think that Microsoft would like to have large amounts of highly skilled knowledgeable people working for them ... that understand the basics of their technology lol.
Penello: "Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall."
Not everyone will understand all that but it was a damned good post.
I just told my niece this when she wanted an explanation.
You run faster than I do right? But I'm stronger right?
So what goes faster, you running back and forth from the car and taking two shopping bags at the time or me taking all eight at once?
And she usually gets that better than all the technical explanations, but she is getting there; can't expect a 13-year-old to understand everything. Heck, even I have to Google up bits of info nearly every damn day since I forget something... it sucks getting old, folks.
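The shopping-bag analogy maps straight onto throughput arithmetic, if anyone wants the boring version (all the numbers here are invented for illustration, not console specs):

```python
# Fast runner: carries 2 bags per trip, 10 seconds per round trip
bags_total = 8
fast_runner_time = (bags_total // 2) * 10   # 4 trips of 10 s = 40 s

# Strong carrier: all 8 bags in one slower 25-second trip
strong_carrier_time = 1 * 25                # 25 s

# The "slower" walker finishes first: width beats speed when the
# workload is wide enough, which is the whole CU-count argument.
print(fast_runner_time, strong_carrier_time)
```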
Penello: "Adding to that, each of our CUs is running 6% faster. It's not simply a 6% clock speed increase overall."
Ars: "What the hell does that even mean?"
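Presumably the "6%" refers to the reported GPU clock bump from 800 MHz to 853 MHz (an assumption; the quote itself never gives the clocks), which lifts every per-cycle throughput figure by the same factor:

```python
# Reported Xbox One GPU clocks, in MHz (assumed source of the "6%" claim)
old_clock, new_clock = 800, 853

bump = new_clock / old_clock - 1
# roughly 6.6% -- every CU, ROP, and cache runs that much faster per second,
# which is why a clock bump scales the whole pipeline, not one component
print(f"{bump:.1%}")
```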