
Ars Technica: Penello's XB1 Numbers, Ars "Sniff Test" Analysis

Vizzeh

Banned
Can the PS4 not use a flash cache to prevent any latency issues between the CPU and GDDR5 - if a problem were to exist, even though it may not... Sony have included flash before in the PS3, and Cerny is well aware that there are no latency issues with the GPU (so I'm sure he wouldn't build his system with a weakness)
 
I'd like to point out that neither Sony nor Microsoft know whose console is actually going to perform better. They have the same numbers as us on their competition's specs.

The only people who I'd trust on this are the multiplatform developers.

Actually fuck that, I'll trust the Digital Foundry head-to-heads come November 22nd.
 
Specs are marketing? What?

Shouting them from the rooftops is marketing, but a specification is a specification. They didn't sit round the table and go, 'hey technical fellows, we thought it would be a really good marketing idea if we stuck some of those GDDR5 stick things in this box, get to it men.'

I'm not sure making it known that your specifications are better than your rivals makes it any less relevant. It is still accurate information.

The specs of discrete parts without the whole picture don't really mean anything if you're comparing one only partially complete picture with the far more complete one we have for PS4. We know from leaks and confirmed info that these systems are not apples-to-apples comparable. They're just not.

All this time and energy spent in the press circles on reporting every single Twitter message, or on pointlessly uninformative articles like this Ars one, is about keeping the message open about specs, feeding info-hungry messageboarders and, of course, getting hits. They get a free drama to piggyback off of for more traffic.

Obviously, specs are important on some level, but they don't dictate the quality of games or the choice of exclusives... the real, tangible reasons to buy a platform. For hardcore players with fewer titles to play at launch, specs mean a lot more because of the promise they hold rather than the ultimate results, since those take years to filter out of constantly maturing development. In a typical launch vacuum, where there is little applicable head-to-head proof of final performance, specs are the thing you crow about to get people to buy yours instead of the other. It's a big number on a bullet-point list right now. It's marketing.
 
I'd like to point out that neither Sony nor Microsoft know whose console is actually going to perform better. They have the same numbers as us on their competition's specs.

The only people who I'd trust on this are the multiplatform developers.

Actually fuck that, I'll trust the Digital Foundry head-to-heads come November 22nd.

They both know. One has been acknowledging it from the start, and the other one...
 

LiquidMetal14

hide your water-based mammals
I'd like to point out that neither Sony nor Microsoft know whose console is actually going to perform better. They have the same numbers as us on their competition's specs.

The only people who I'd trust on this are the multiplatform developers.

Actually fuck that, I'll trust the Digital Foundry head-to-heads come November 22nd.

Prepare to be disappointed, since the launch games are not exactly the best optimized and are hardly scratching the surface of next-gen development. This goes mainly for 3rd-party games.
 

Tmecha

Neo Member
Why don't we let the games do the talking and cut out all this BS about whose spec penis is bigger than the other's.
Both consoles have great-looking games and that's all that matters.
 

Klocker

Member
Am I the only one who finds it funny that every time there is a dev downplaying the PS4 (like this B3D guy) he is anonymous, yet there are plenty of devs that have no problem going on record saying there is a fairly big difference.


I think it's funny that people think he downplayed the PS4

because he didn't. He just gave perspective.
 
I'm not going to lie. I love reading these threads. It's crazy addictive.

Last gen was the worst mix of misinformation and ignorant fanboys hashing it out.

However, this new gen? It's perfect. It's the perfect blend of broken loyalties, informed destruction of misinformation, humility, learnt lessons and the crushing pain of ignorant PR fools getting force-fed their lies.

It's at the point where it's more fun than actually playing the games.
 

twobear

sputum-flecked apoplexy
It's kind of obvious now that Albert shouldn't post about technical stuff that he doesn't understand (and hey, I probably wouldn't understand much of it either in his position), but I don't think he, like, conspired to lie or something, as some people are implying or outright stating. It's PR; he was probably handed a bill of goods from the engineers to overhype the platform, and he, not really knowing how to interpret it one way or the other, simply explained it as he saw it and ended up getting his shit in a twist. He may have known it was, in fact, PR, but that's as far as I'm willing to toss it. Everything else is conjecture.

For someone who apparently prides himself on his cutting-through-bullshit-ness, I have to say I completely disagree with you. Albert is a PR guy; his modus operandi is to sell you shit and tell you it's gold. He knows full well that his comparisons are bunk.

It's an unenviable position to be in, but the least MS could do is not deliberately obfuscate. What happened to 'we don't think specs are the be-all-end-all'? At least that was true.
 

jaypah

Member
Why don't we let the games do the talking and cut out all this BS about whose spec penis is bigger than the other's.
Both consoles have great-looking games and that's all that matters.

Because some of us like reading tech discussions? There are plenty of threads that are just about the games. Sometimes we have a hardware discussion, nothing wrong with that.

Gotta pay the bills!

I'm watching you SS! :p
 

Hrothgar

Member
He has a decent point about how there's probably not going to be a huge difference between PS4 and XBO games by the end of the gen, at least in terms of how the average consumer could casually grasp the difference. But for us, on a hardcore gaming message board, it was absolutely the wrong message because we will know the difference and some of it WILL be major for us. Things like a worse framerate, for example, can kill a game for some of us.

Actually, I would argue the difference (between multiplats) will only be visible at the end of the generation, unless developers take the lazy/cheaper approach and keep console versions the same. But then again devs do take pride in their (technical) abilities, otherwise we wouldn't currently be seeing multiplats on PC look so much better.
 

evilalien

Member
Can the PS4 not use a flash cache to prevent any latency issues between the CPU and GDDR5 - if a problem were to exist, even though it may not... Sony have included flash before in the PS3, and Cerny is well aware that there are no latency issues with the GPU (so I'm sure he wouldn't build his system with a weakness)

Uh, flash speed is measured in MB/s while GDDR5 is measured in GB/s. Also, the latency for flash is thousands of times higher than RAM's. The only kind of cache you could add that would help GDDR5 would be ESRAM/EDRAM.
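
For a rough sense of scale, here's a quick back-of-the-envelope sketch of that point; the figures are ballpark assumptions about 2013-era NAND and GDDR5, not numbers taken from this thread:

```python
# Rough, order-of-magnitude figures (assumptions, not from the thread)
# showing why flash can't sit between the CPU and GDDR5 as a cache.

gddr5_bandwidth_gbs = 176.0      # PS4's quoted peak GDDR5 bandwidth (GB/s)
flash_bandwidth_gbs = 0.4        # ~400 MB/s, generous for 2013-era NAND
gddr5_latency_ns    = 200.0      # loaded DRAM access, roughly hundreds of ns
flash_latency_ns    = 50_000.0   # NAND read latency, tens of microseconds

print(f"Bandwidth gap: ~{gddr5_bandwidth_gbs / flash_bandwidth_gbs:.0f}x")
print(f"Latency gap:   ~{flash_latency_ns / gddr5_latency_ns:.0f}x")
# A cache only helps if it is faster than the memory it fronts; flash loses
# on both axes, which is why only on-die SRAM/eDRAM makes sense in that role.
```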
 
Why don't we let the games do the talking and cut out all this BS about whose spec penis is bigger than the other's.
Both consoles have great-looking games and that's all that matters.

But you see, specs are an important part of the console war as you need to know the girth of the consoles, the various holes for inserting cable and if developers can force it in without getting the shaft. Y'know, scientific analysis.
 
Every time I see Penello's name I think Pirelli... so I thought the two went together quite well :p

[image: t5pHWJf.jpg]

OH MY GOD. That's awesome.

Avatar changed!
 

sobaka770

Banned
Hahahaha, did they really add up bandwidth to prove they have more of it?

Most of this stuff from Ars sounds legit. I'm not a hardcore chip designer, but some of these remarks show a basic lack of understanding of the subject. What I mean is not only the bandwidth thing (although that's hilarious) but also the whole "we invented DirectX" thing, which has nothing to do with hardware. Sure, you will optimise the drivers, but the underlying hardware is still the basis for any performance gain.

Anyway, it's not long until release, holding tight.
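
For reference, the addition being mocked here looked roughly like this; the figures are the publicly quoted peak numbers from the time as I recall them, not anything stated in this post:

```python
# Publicly quoted peak-bandwidth figures from late 2013 (my recollection,
# treat as assumptions), to illustrate the "just add them up" problem.

ddr3_gbs  = 68.0     # Xbox One main memory (8 GB DDR3)
esram_gbs = 204.0    # Xbox One eSRAM peak (revised read+write figure)
gddr5_gbs = 176.0    # PS4 unified GDDR5

print(f"XB1 'added' peak: {ddr3_gbs + esram_gbs:.0f} GB/s vs PS4 {gddr5_gbs:.0f} GB/s")
# The sum looks bigger, but the 204 GB/s pool is only 32 MB and only covers
# data a developer explicitly stages there; everything else still goes
# through the 68 GB/s DDR3, so the two peaks aren't simply additive.
```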
 
Yeah, they're not equal - they are 6% faster, but there are still 50% more of them on PS4, and since they scale linearly, it's not exactly some mystery math. Also, I never understood Albert's point with the API either. Presumably they both have excellent APIs, I wouldn't doubt MS on that, but SCE has more than proven their worth there as well with PSSL and LIBGCM, and tons of good graphics research and implementation they've done in the Ice group.

Well, it's one thing to say a car engine has 50% more cylinders vs saying it has 50% more power, even if displacement and compression ratios are different.

The GameCube, for example, had 8 pixel pipelines vs the Xbox, which only had 4; however, the Xbox's pipelines could each do 2 pixels per pipe in one pass while the GC's pipes could only do 1 each (4x2 vs 8x1). Then the Xbox GPU was clocked at 233 MHz vs 160 MHz for the GC, IIRC. So even if we could say that the GameCube has twice the pixel pipes, it's completely incorrect to say it would have twice the performance.
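
Putting rough numbers on both comparisons - using the poster's GameCube/Xbox figures above, plus the widely reported 2013 GPU specs (18 CUs at 800 MHz for PS4 vs 12 CUs at 853 MHz for XB1), all treated as back-of-the-envelope assumptions rather than measurements:

```python
# Back-of-the-envelope theoretical throughput; figures are assumptions
# taken from the post above and the publicly reported 2013 GPU specs.

# Pixel fill rate: pipes * pixels-per-pipe * clock
gc_fill   = 8 * 1 * 160e6    # GameCube, per the post's numbers
xbox_fill = 4 * 2 * 233e6    # original Xbox
print(f"GC ~{gc_fill / 1e9:.2f} Gpix/s vs Xbox ~{xbox_fill / 1e9:.2f} Gpix/s")

# Compute units: 50% more CUs at a ~6% lower clock still comes out ahead
ps4_cu = 18 * 800e6          # 18 CUs at 800 MHz
xb1_cu = 12 * 853e6          # 12 CUs at 853 MHz
print(f"PS4/XB1 theoretical CU throughput ratio: ~{ps4_cu / xb1_cu:.2f}x")
```

So twice the pipes ends up as less raw fill rate once per-pipe throughput and clock are factored in, which is exactly the "counting cylinders vs measuring power" point.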
 

Vizzeh

Banned
Uh, flash speed is measured in MB/s while GDDR5 is measured in GB/s. Also, the latency for flash is thousands of times higher than RAM's. The only kind of cache you could add that would help GDDR5 would be ESRAM/EDRAM.

I must have misread the article, as I thought it mentioned you could use a flash cache between the RAM and CPU to prevent cache misses. (Trying to find it for reference.)
 

Ravidrath

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."
 

Tmecha

Neo Member
Because some of us like reading tech discussions? There are plenty of threads that are just about the games. Sometimes we have a hardware discussion, nothing wrong with that.
What tech discussion? All I see is 'you're wrong' or 'he's lying'.
 

Nafai1123

Banned
He has a decent point about how there's probably not going to be a huge difference between PS4 and XBO games by the end of the gen, at least in terms of how the average consumer could casually grasp the difference. But for us, on a hardcore gaming message board, it was absolutely the wrong message because we will know the difference and some of it WILL be major for us. Things like a worse framerate, for example, can kill a game for some of us.

Oh, I don't argue that average consumers might not notice the difference. But if that's the case, and someone's going to claim that specs are marketing, then it's only marketing to that very specific group (NeoGAF). Downplaying the difference is poor marketing towards that group, since they are more interested in the technical details to begin with, so it comes across as disingenuous.

MS continues to flounder with their statements regarding the XB1. At first they claimed they intentionally did not target the highest specs, and now they try to downplay the differences. Average consumers might not even know or care, but to people that are tech oriented it implies a level of stupidity and short-term memory loss in those MS are attempting to convince.
 

evolution

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

No wonder we haven't seen the xbox version
 
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Wow if this is true.
 
OH MY GOD. That's awesome.

Avatar changed!

Not sure what to make of this

Although it's always nice to see people who can take a joke

This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Now that is interesting, very interesting
 
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Holy shit. There's no way that can be true. Either Sony are hardware geniuses or MS really messed up.
 

46w500

Banned
At first they claimed they intentionally did not target the highest specs
This one is categorically true. And Albert's mention that neither did Sony is also categorically true. Albert even went so far as to proclaim that high-end PC gamers would agree on this. And looking at the specs of both, how can you deny that claim?
 
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Wow, that is a huge difference.
 

Tmecha

Neo Member
But you see, specs are an important part of the console war as you need to know the girth of the consoles, the various holes for inserting cable and if developers can force it in without getting the shaft. Y'know, scientific analysis.

Ok my bad, carry on :)
 
That can't be true. That sounds too drastic.

I'm not an Xbox One customer, nor a techy, but I do love that Albert is willing to roll with the punches.

Shine on, you crazy diamond!

Yeah I like Albert a lot. Only guy from MS that can talk and doesn't make me want to poke myself in the eye with a stick.
 

USC-fan

Banned
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."
thanks for the info.

I can understand only 32MB of eSRAM being a headache. It's the same size as the Wii U's...
 

Amir0x

Banned
Holy shit. There's no way that can be true. Either Sony are hardware geniuses or MS really messed up.

Honestly, it sounds like it's one of those stories that has hints of truth in it, but it has been exaggerated as it passed through the grapevine, like a game of telephone.

Probably something like "It took PS4 1 month to come out at 60fps unoptimized, and it took Xbox One 2 months to come out at 30fps" when the word first got passed on.
 
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."
That's... quite bad!

It somewhat tallies with what The Crew guys were saying, in how quickly they were able to port to PS4 and indeed how easy it was, too.

I'm hoping that theme continues, as it should mean that more time is spent on optimisation and getting the most out of the console's unique feature set.
 
Honestly, it sounds like it's one of those stories that has hints of truth in it, but it has been exaggerated as it passed through the grapevine, like a game of telephone.

Probably something like "It took PS4 1 month to come out at 60fps unoptimized, and it took Xbox One 2 months to come out at 30fps" when the word first got passed on.

You're probably right.
 

Skeff

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

meh 120fps or bust.

But seriously, that's certainly interesting. I had heard Microsoft's APIs were behind, but I'm sure they've been updated a lot since then. Still, wow, 4 months vs 3 weeks is the big hitter there. If the difference is that large and the PS4 begins to outsell the XB1, then I think Sony could start getting some exclusives based purely on costs of dev and sales. (No, nothing like CoD) - smaller games that don't make many sales, for instance a PC indie game being ported across.

As others have stated, a bit of truth has probably been lost as it was passed around, but I'm sure the general idea is the same.
 
That can't be true. That sounds too drastic.



Yeah I like Albert a lot. Only guy from MS that can talk and doesn't make me want to poke myself in the eye with a stick.

Hmm honestly I find the 90fps harder to believe than the 15fps

I always thought Framerates were usually terrible until final optimizations

Basically it's crazy if either of those are true
 

Freki

Member
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Sounds at least logical when it comes to the core of the rumor - no idea about the exact numbers (probably not that drastic, but the right direction):
For PS4 everything just gets dumped into 8GB of GDDR5 - no dev interaction needed.
For Xbox One the dev must decide what to put into the 32MB of eSRAM and what to leave in the 8GB of DDR3 - probably on a case-by-case basis to get the best results.
More things to consider means more possibilities for failure...
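
As a purely hypothetical sketch of that extra decision - this is not real XDK or PS4 SDK code, just an illustration of a small fast scratchpad plus a big slow pool versus a single unified pool:

```python
# Illustrative only: hypothetical allocators, not actual console APIs.

ESRAM_BYTES = 32 * 1024 * 1024   # small fast on-chip pool (XB1-style)

class SplitMemory:
    """Dev must decide, per resource, what deserves the fast pool."""
    def __init__(self):
        self.esram_used = 0
    def alloc(self, name, size, bandwidth_hungry):
        if bandwidth_hungry and self.esram_used + size <= ESRAM_BYTES:
            self.esram_used += size
            return f"{name}: eSRAM"
        return f"{name}: DDR3 (fallback)"

class UnifiedMemory:
    """Everything lands in one large fast pool; no placement decision."""
    def alloc(self, name, size, bandwidth_hungry=False):
        return f"{name}: GDDR5"

split, unified = SplitMemory(), UnifiedMemory()
for name, size in [("1080p colour target", 8 * 1024 * 1024),
                   ("1080p depth target", 8 * 1024 * 1024),
                   ("G-buffer", 24 * 1024 * 1024)]:
    print(split.alloc(name, size, bandwidth_hungry=True),
          "|", unified.alloc(name, size))
# On the split design the third render target no longer fits and silently
# drops to the slow pool - the kind of case-by-case tuning described above.
```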
 

RoboPlato

I'd be in the dick
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Holy fuck if that's anywhere near true. Numbers are probably exaggerated but the fact that devs can get stuff up and running well on PS4 in a fraction of the time bodes really well.
 

KoopaTheCasual

Junior Member
Honestly, it sounds like it's one of those stories that has hints of truth in it, but it has been exaggerated as it passed through the grapevine, like a game of telephone.

Probably something like "It took PS4 1 month to come out at 60fps unoptimized, and it took Xbox One 2 months to come out at 30fps" when the word first got passed on.
This sounds much more realistic.
 

Proelite

Member
I would love it if the Xbox One-endorsed version of Ghosts ran at half of the framerate / resolution of the PS4 version.

Sweet irony. The DF thread will be delicious.

In honesty though, I doubt early development troubles from when MS tools were months behind really mean anything for final perf. It's troubling, though, that we haven't seen Ghosts or BF4 running on Xbox One yet.
 

Ravidrath

Member
Honestly, it sounds like it's one of those stories that has hints of truth in it, but it has been exaggerated as it passed through the grapevine, like a game of telephone.

Probably something like "It took PS4 1 month to come out at 60fps unoptimized, and it took Xbox One 2 months to come out at 30fps" when the word first got passed on.

I actually repeated that I had heard this to a high-level Sony guy, and he basically confirmed it. But he also wanted to make sure that I hadn't actually heard it from someone at Sony, because they didn't want it getting back to them.

This Sony guy in particular I don't think would lie or exaggerate, but obviously they would have something to gain if they did. So I don't know.
 

Skeff

Member
I actually repeated that I had heard this to a high-level Sony guy, and he basically confirmed it. But he also wanted to make sure that I hadn't actually heard it from someone at Sony, because they didn't want it getting back to them.

This Sony guy in particular I don't think would lie or exaggerate, but obviously they would have something to gain if they did. So I don't know.

damn...

Pre-order cancelled, just so I can order it again.
Joking - sold out till 2014 in the UK :'(
 

Guymelef

Member
I would love it if the Xbox One-endorsed version of Ghosts ran at half of the framerate / resolution of the PS4 version.

Sweet irony. The DF thread will be delicious.

In honesty though, I doubt early development troubles from when MS tools were months behind really mean anything for final perf. It's troubling, though, that we haven't seen Ghosts or BF4 running on Xbox One yet.

I don't think Microsoft allows that when the game is tied to a big partnership.
 

Metfanant

Member
For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

WHAT!? for realz? staahhhp...
 