
Ars Technica: Penello's XB1 Numbers, Ars "Sniff Test" Analysis

I think the people that did the HotChips talk are the most reliable. You don't present nonsense, at least not too much nonsense, at a professional/scientific conference. And, of course, these guys are indeed the engineers.

And even if we assume the engineers got it wrong for the HotChips talk, I don't believe the audience couldn't notice and tell MS at the talk that their numbers don't match up.

Gaffers could do it with basic maths; I don't believe the HotChips audience couldn't.
 
I don't know why people claim that Sony lied about their PS2/PS3 specs.

The PS2 was the most powerful console in 2000, period; anyone who thinks the DC is more powerful knows nothing about specs.

The PS3 IS easily the most powerful console this generation. How? Well, look at TLoU, GoW and the Uncharted games; there is nothing in the console space that comes close to them.

So Sony said the PS3 was more powerful than the 360, then PROVED that advantage with their first-party output. The PS3's only downside is that it was the most difficult console to program for in console history, followed by the Saturn and then the PS2. The PS3 would have held an advantage over the 360 in all games if it had been easier to program for, but sadly it wasn't, and its potential remained untapped except by Sony's elite ICE team.

I don't remember Sony saying the PS3 was easy to develop for; they just claimed it was more powerful, then proved over time that once you master its exotic design you end up with something more powerful than the 360.


Next gen is quite different: the PS4 and X1 share almost the same architecture; it's just that the PS4 has more of everything, in addition to being easier to develop for. So I really hope people stop bringing up the PS3/360 example when talking about next gen. Totally different scenario, as explained above.

IF MS thinks the X1 is on par with or better than the PS4, then they have to prove it themselves, just like Sony did last gen.


The problem is not just that the PS3 was difficult to develop for, but that to get the most out of it you had to make sacrifices in your game design that you did not need to make with the 360. It's little wonder the PS3's best games follow a very similar core design.
 

Ebomb

Banned

beast786

Member
What exactly is your implication here? We have conflicting statements that must all be lies because ______. Help me fill in that blank. If every number released is a lie, what's the right number, and why do you think they are lying?

Draw your own implication. The fact is that he said the numbers he posted are correct and were verified. Yet, even now, there is absolutely no explanation of how they came up with that number, even when you use their own definition for it. And even at that, they contradict themselves.

That number is a theoretical/mathematical number, so there is actually very little room for confusion or error. And it's also not very complicated to figure out.
 

Argyle

Member
How ignorant are these MS execs if they think they can run the same HD-DVD-style FUD campaign at NeoGAF that they ran at the AVS forums? Pointless question, since these were the same clowns behind HD-DVD.

The same FUD strategy will not fly here as it did there back then, primarily because of the user base and the moderation of this forum compared to AVS, which was totally unprepared for the FUD and astroturfing campaign that occurred with HD-DVD.

It's funny seeing Penello post here; the only people agreeing with him were already inclined to do so, and others are finally seeing the thin coat of PR wash off his posts.

When someone of claimed authority makes a shit post, this forum will eviscerate that post and sometimes that individual. Microsoft needs an all-new social marketing strategy, since the current FUD tactics have little to no chance of succeeding here. They hugely underestimate their audience. They have learned very little.

I wrote this in another thread:

zomgbbqftw said:
Fuck this guy. Seriously, threads on comments from Major Nelson, Albert Penello and such just need to be closed. They cause a whole load of shit, and we're supposed to accept their comments because they work for MS and are in a position of knowledge. Well fuck all of them; either show the goods and documentation or shut the fuck up. Just like every other GAF member.

avaya said:
Penello is really handling GAF. They are going to realise very quickly that NeoGAF is not AVS Forums which they reduced to shit during the HD format wars from 2006-2008.

OMG this x1000! I was going to come in to basically post the same thing.

I was there too when I saw this playbook being run. The pointless Blu-ray vs. HD-DVD format war was in full swing, and there are eerie parallels to this console war. (Blu-ray is a technically superior format, with 66% more capacity than HD-DVD. HD-DVD had some short-term advantages in authoring software, minimum player spec, and cost of manufacture, but those are, again, only short-term advantages, not necessarily something a customer would notice, and all have been effectively nullified in the years since.)

MS sent Amir, a VP who I'm told convinced MS to go all-in on HD-DVD. So people were thrilled to see an executive from one side coming to talk to them, and understandably so, as he shared his insider's perspective. I don't have a problem with this, to be honest.

What I do have a problem with is when they come on and basically spread FUD (fear, uncertainty, doubt) and rumors about their competitor's product. I saw this happen over there - the one I remember most vividly was Amir telling everyone that (paraphrased) "dual layer BD-50 discs are science fiction. They can't be manufactured." And a LOT of people believed him, because here was an exec from Microsoft telling you so! Worse, there could be no response from the other side - it's all rumor and hearsay, and maybe no one is at liberty to disclose publicly exactly how much BD-50 capacity they have? IIRC a Sony Pictures guy started posting, but he was on the content side so he couldn't really comment on the rumors. (But that guy played it straight - talked about new movies in the pipeline, etc., and from what I remember never got caught up talking crap about HD-DVD; what the hell would he know about HD-DVD anyway?)

Folks on the AVSForum bought Amir's BS right up until the point that the first BD-50 discs rolled out of the replication plants...and HD-DVD died and Amir took his retirement.

Don't get me wrong, I think it's cool to see executives coming in and posting. It's awesome when they share new information with us! But there should be some ground rules, I think. I don't want to see NeoGAF turn into format war-era AVSForums.

Want to talk about how amazing your product is? I don't have a problem with that, as long as you tell us who you are.

Want to clarify misconceptions about your product? Please, be my guest. I'm sure we would all love getting the straight scoop!

Want to spread rumors and innuendo about your competitor's product? IMHO this should be ban-worthy and we should not tolerate it.

To put a timeline on this crazy circus of deception.

Timeline:

Friday, June 28th: Leadbetter at Eurogamer published an article, exclusive to him alone, in which MS had found this amazing, never-before-known two-way ESRAM, which helps double the bandwidth.
http://www.eurogamer.net/articles/digitalfoundry-xbox-one-memory-better-in-production-hardware

Leadbetter claims his source was "Well-placed development sources have told Digital Foundry"... well, now we know that this development source had to be MS themselves, so Leadbetter became a PR mouthpiece. As soon as the article hit GAF, questions sparked about math that did not make any sense, as summarized by the post below from the thread:



Well,
800 MHz × 1024 bits (128 bytes) = 102.4 GB/s
According to the article, the flux capacitor found by MS allows read and write at the same time (Intel, hire these guys), and so the theoretical max bandwidth should be 102 × 2 = 204.

http://www.neogaf.com/forum/showthread.php?p=66898711#post66898711

Basic question: if it can read and write at the same time, then the max should have been 204. Why 192? This was raised and debated, but there was no answer.
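For anyone who wants to check the arithmetic themselves, here's a minimal sketch of that calculation, assuming the commonly cited 1024-bit (128 bytes per cycle) ESRAM bus width:

```python
# Sketch of the ESRAM bandwidth arithmetic at the original 800 MHz clock.
# Assumes a 1024-bit (128 bytes/cycle) ESRAM bus, the commonly cited width.
clock_hz = 800e6
bus_bytes = 128
one_way = clock_hz * bus_bytes / 1e9      # ~102.4 GB/s one-directional
doubled = 2 * one_way                     # ~204.8 GB/s if read+write were free
print(f"one-way: {one_way:.1f} GB/s, doubled: {doubled:.1f} GB/s")
# The DF article nonetheless quoted 192 GB/s, which is where the question comes from.
```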

Then we had the upclock, when MS increased the GPU clock speed. Albert then made a post here on GAF comparing it with the PS4:

• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
http://www.neogaf.com/forum/showthread.php?p=80951633&highlight=#post80951633

He was then asked by Freki:

So why is the bidirectional bandwidth of your eSRAM 204GB/s although your one-directional bandwidth is 109GB/s - shouldn't it be 218GB/s?


To which Albert responded:

Yes, it should be. And I was quickly corrected (both on the forum and from people at the office) for writing the wrong number.

So now, after talking to his "technical people", he claims the number should be 218.

But here is the damn catch AGAIN...

The HotChips presentation, which was given after the GPU upclock and delivered by MS's own technical experts, contradicts that number and states it to be 204:

[HotChips XB1 SoC diagram: XBO_diagram_WM.jpg]


So seriously, all of this is a circus of misinformation, to the point that they can't even keep up with their own deceptions and lies.
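For reference, the same arithmetic after the upclock, together with Penello's quoted totals. This is only a sketch using the figures from the posts above; the 1024-bit bus width is again an assumption.

```python
# Sketch of the post-upclock figures and Penello's "peak on paper" totals.
one_way_853 = 853e6 * 128 / 1e9          # ~109.2 GB/s at 853 MHz
doubled_853 = 2 * one_way_853            # ~218.4 GB/s if read+write were free
hotchips_esram = 204                     # GB/s, figure on the HotChips slide
ddr3 = 68                                # GB/s, XB1 main memory
penello_total = ddr3 + hotchips_esram    # = 272 GB/s, the number Penello posted
ps4_gddr5 = 176                          # GB/s, PS4 peak for comparison
print(f"{doubled_853:.1f} vs {hotchips_esram}, total {penello_total} vs {ps4_gddr5}")
```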

I really want to understand this double bandwidth mode. The way it was originally presented made it seem like a "trick" mode - something you could do, but there were severe limitations on using it as you're kind of abusing the hardware into doing something it wasn't originally designed to do. The fact that the bandwidth number does not double (I believe the slightly lower non-doubled number to be the correct number, it can't just be a typo that they have repeated over and over, including at HotChips) suggests that there are times where perhaps it is totally unsafe to write while reading or vice versa, and if you took into account those times, perhaps you could hit 204GB/s or whatever.

I wonder though if it is ever possible to even come close to 204GB/s. The original article says that they were able to get 133GB/sec doing alpha blending on FP16 render targets, but I wonder what limitations are imposed and how crazy the timing has to be to avoid memory corruption. Maybe you can do alpha blend but can only run a trivial shader? Could still be useful for certain things, but yeah, I really want to understand how it works...
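As a rough yardstick, here's how that measured 133GB/s figure compares to the various peaks being thrown around; just a sketch, with all numbers taken from the article and posts above.

```python
# How DF's measured FP16 alpha-blend figure compares to the quoted peaks (800 MHz era).
measured = 133                               # GB/s, DF's reported real-world figure
one_way = 102                                # GB/s, one-directional peak
for label, peak in [("one-way peak", one_way),
                    ("192 GB/s DF figure", 192),
                    ("204 GB/s HotChips figure", 204)]:
    print(f"vs {label}: {measured / peak:.0%}")
# Exceeding the one-way peak (~130%) suggests simultaneous read/write does buy something,
# but the result is still well short of either doubled figure.
```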
 

sangreal

Member
I really want to understand this double bandwidth mode. The way it was originally presented made it seem like a "trick" mode - something you could do, but there were severe limitations on using it as you're kind of abusing the hardware into doing something it wasn't originally designed to do.

That's how DF presented it, not MS
 

artist

Banned
I think the people that did the HotChips talk are the most reliable. You don't present nonsense, at least not too much nonsense, at a professional/scientific conference. And, of course, these guys are indeed the engineers.
Microsoft's HotChips talk was very lacking in terms of actual technical details and filled with all the buzzwords. Other HotChips talks were more in-depth and I'd doubt anyone used such mathemagic or buzzwords in their ppts.

The roundtable after the presentation was even more fluff - it's up on youtube and I'd encourage people to watch it. Major Nelson has all these really important guys and all they do is repeat PR sanctioned buzzwords - Growth via clouds, transistors in the clouds, gamers have changed, gaming has changed, this is what the gamers want, kinect, kinect, kinect, NUI, NUI, NUI, ESRAM, VM OS ..
 

Metfanant

Member
I truly want Albert to answer my ridiculous question about choosing DDR2 over DDR3...

If adding the bandwidth of system RAM and embedded RAM is in no way misleading, then DDR2 would have still allowed the Xbone to outperform the PS4...
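To put numbers on that rhetorical point, here's a sketch; the DDR2 configuration is a made-up stand-in (DDR2-800 on the same 256-bit bus), not anything MS shipped.

```python
# Metfanant's point, numerically: if adding system RAM and ESRAM bandwidths is fair game,
# even a hypothetical DDR2 setup would "beat" the PS4 by that accounting.
esram = 204                                  # GB/s, MS's quoted ESRAM peak
ddr3_actual = 68                             # GB/s, XB1's DDR3-2133 on a 256-bit bus
ddr2_hypothetical = 800e6 * 32 / 1e9         # ~25.6 GB/s, assumed DDR2-800 on a 256-bit bus
ps4_gddr5 = 176                              # GB/s
print(ddr3_actual + esram, ddr2_hypothetical + esram, ps4_gddr5)   # 272, ~229.6, 176
```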
 

RoboPlato

I'd be in the dick
Microsoft's HotChips talk was very lacking in terms of actual technical details and filled with all the buzzwords. Other HotChips talks were more in-depth and I'd doubt anyone used such mathemagic or buzzwords in their ppts.

The roundtable after the presentation was even more fluff - it's up on youtube and I'd encourage people to watch it. Major Nelson has all these really important guys and all they do is repeat PR sanctioned buzzwords - Growth via clouds, transistors in the clouds, gamers have changed, gaming has changed, this is what the gamers want, kinect, kinect, kinect, NUI, NUI, NUI, ESRAM, VM OS ..

I'm still surprised they tried to pull that at Hotchips. The crowd in attendance must have been pissed. It's a very technical group and I bet they were hoping for some real in depth info.
 

Biker19

Banned
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

Damn...Xbox One is the new PS3 in terms of architecture development.

It's also no less true. You wanting it not to be true so you can hate on it, doesn't make it false.

You should also re-read what you posted above. The key word in the quote is SOME.

This is 2013, not 1996. I know that for Sony fans the ability to connect to the internet is a relatively new concept that came about late in the PS2's life as an expensive add-on with only a handful of games supporting it, and for the PS3 it was still a mess of a system that got better over the generation. But some of us have been online gaming with consoles since the Saturn. It's become a way of life to have a system always plugged in or WiFi-connected.

Hell, I started being connected online through the Sega Channel back in the 16-bit days.

I'm ready for the industry to move forward and not to stay stuck in the past.

I've got a news flash for you: the PS2 was no "add-on." Not everybody wants to move to digital-only, and besides, there are tons of countries out there without internet, or with poor internet connections and bandwidth caps. And the internet here in America isn't as perfect as you think it is; around 30% of Americans are without broadband.

Get off of your soapbox.
 

beast786

Member
And this is the 1.000.000 Dollar question ;)
And I guess we know it already.

Which was part of my deduction. We now know exactly who Leadbetter used as a source and how he was basically used as a FUD mouthpiece. It was part of a bunch of his articles in which he kept minimizing the PS4-XB1 performance differences using FUD math and FUD setups. That FUD math is now consistent with Albert's post.

EDIT: At the same time, I am still open to an explanation from Albert. They were the first to discover simultaneous read/write, so who knows, maybe they have some other magic.
 

Fredrik

Member
I don't know why people claim that Sony lied about their PS2/PS3 specs.

The PS2 was the most powerful console in 2000, period; anyone who thinks the DC is more powerful knows nothing about specs.

The PS3 IS easily the most powerful console this generation. How? Well, look at TLoU, GoW and the Uncharted games; there is nothing in the console space that comes close to them.

So Sony said the PS3 was more powerful than the 360, then PROVED that advantage with their first-party output. The PS3's only downside is that it was the most difficult console to program for in console history, followed by the Saturn and then the PS2. The PS3 would have held an advantage over the 360 in all games if it had been easier to program for, but sadly it wasn't, and its potential remained untapped except by Sony's elite ICE team.

I don't remember Sony saying the PS3 was easy to develop for; they just claimed it was more powerful, then proved over time that once you master its exotic design you end up with something more powerful than the 360.


Next gen is quite different: the PS4 and X1 share almost the same architecture; it's just that the PS4 has more of everything, in addition to being easier to develop for. So I really hope people stop bringing up the PS3/360 example when talking about next gen. Totally different scenario, as explained above.

IF MS thinks the X1 is on par with or better than the PS4, then they have to prove it themselves, just like Sony did last gen.
I'm not saying that they're outright lying, I'm saying that I've been fooled by their hype too many times to suddenly start believing everything they say.

Start of last gen: Sony said that next gen starts when Sony says so, Cell and RSX made the Xbox 360 seem more like an Xbox 1.5, the Killzone and Motorstorm target renders fooled everyone, etc, etc.

End of last gen: the PS3 turned out to be a great console, but it still almost always got beaten by the Xbox 360 in performance in multiformat titles. First-party software is great, but not to the degree that it makes the Xbox 360 seem more like an Xbox 1.5. The specs are still great, but you basically need to be a Naughty Dog wizard to unleash the power.

It might not turn out the same way with the PS4, and Sony seem to have learned from their mistakes, but I get a bit worried when just recently the Resogun dev started talking about the PS4 having raw power and some devs possibly getting bloodied knuckles before they get control of that power... And all this talk about some high-profile first-party games running below 1080p and aiming for 30fps makes me even more worried. :/
And meanwhile we're seeing reports that some XB1 games are running at 60fps at 1080p while not looking much worse.
 

Iacobellis

Junior Member
I'm not saying that they're outright lying, I'm saying that I've been fooled by their hype too many times to suddenly start believing everything they say.

Start of last gen: Sony said that next gen starts when Sony says so, Cell and RSX made the Xbox 360 seem more like an Xbox 1.5, the Killzone and Motorstorm target renders fooled everyone, etc, etc.

End of last gen: the PS3 turned out to be a great console, but it still almost always got beaten by the Xbox 360 in performance in multiformat titles. First-party software is great, but not to the degree that it makes the Xbox 360 seem more like an Xbox 1.5. The specs are still great, but you basically need to be a Naughty Dog wizard to unleash the power.

It might not turn out the same way with the PS4, and Sony seem to have learned from their mistakes, but I get a bit worried when just recently the Resogun dev started talking about the PS4 having raw power and some devs possibly getting bloodied knuckles before they get control of that power... And all this talk about some high-profile first-party games running below 1080p and aiming for 30fps makes me even more worried. :/
And meanwhile we're seeing reports that some XB1 games are running at 60fps at 1080p while not looking much worse.

Such as? Only one I can think of is Forza 5.
 

Fredrik

Member
Such as? Only one I can think of is Forza 5.
Killer Instinct is 60fps, but so is Resogun, so make that "one" instead of "some" to not start a list war. I just think it's a bit worrisome to not see the claimed 40% power difference even in first-party titles when the framerates are so low. After all, the proof is in the pudding; talk about much better specs doesn't mean much unless it clearly shows.
 
Killer Instinct is 60fps, but so is Resogun, so make that "one" instead of "some" to not start a list war. I just think it's a bit worrisome to not see the claimed 40% power difference even in first-party titles when the framerates are so low. After all, the proof is in the pudding; talk about much better specs doesn't mean much unless it clearly shows.

KI is 720p.
 
Killer Instinct is 60fps, but so is Resogun, so make that "one" instead of "some" to not start a list war. I just think it's a bit worrisome to not see the claimed 40% power difference even in first-party titles when the framerates are so low. After all, the proof is in the pudding; talk about much better specs doesn't mean much unless it clearly shows.

Killer Instinct isn't 1080p, it's 720p.

Edit: Beaten..
 

CLEEK

Member
but I get a bit worried when just recently the Resogun dev started talking about the PS4 having raw power and some devs possibly getting bloodied knuckles before they get control of that power...

You've misinterpreted the Housemarque dev's comment. He was saying that Sony allows for very low-level access if you want to delve down that far. Jumping in and coding at the lowest level will give some devs 'bloody knuckles'.

He's not saying that the power of the PS4 in itself causes any issues, which ties in with everything we've heard from other devs, third party and otherwise. It's very simple to get games up and running on the PS4.
 
Killer Instinct is 60fps, but so is Resogun, so make that "one" instead of "some" to not start a list war. I just think it's a bit worrisome to not see the claimed 40% power difference even in first-party titles when the framerates are so low. After all, the proof is in the pudding; talk about much better specs doesn't mean much unless it clearly shows.

KI is also running at 720p last I saw, so that doesn't fit your criteria of 1080p 60fps.
 

Krakn3Dfx

Member
Killer Instinct is 60fps, but so is Resogun, so make that "one" instead of "some" to not start a list war. I just think it's a bit worrisome to not see the claimed 40% power difference even in first-party titles when the framerates are so low. After all, the proof is in the pudding; talk about much better specs doesn't mean much unless it clearly shows.

Resogun is 1080p/60fps.
 
Draw your own implication. The fact is that he said the numbers he posted are correct and were verified. Yet, even now, there is absolutely no explanation of how they came up with that number, even when you use their own definition for it. And even at that, they contradict themselves.
I don't think they necessarily do. It seems SRAM does come in two-port variants that can read and write at the same time. There's also an add-on feature called "zero bus turnaround" where switching between those modes is instantaneous...obviously implying that turnaround is not usually zero. So if Microsoft has two-ported eSRAM but no ZBT feature, the numbers presented both times suddenly can make sense:

102 GB/s two-way is 204 GB/s...but without ZBT, true theoretical max is only 192 GB/s
109 GB/s two-way is 218 GB/s...but without ZBT, true theoretical max is only 204 GB/s

This would clarify a whole lot. First, it means Albert Penello, his Technical Fellow source, and the Hotchips spokesmen aren't blatantly lying in a way that'd inevitably be found out. Second, it would illustrate why we've gotten two different "theoretical max" numbers each time. Third, and most tellingly, it would explain the otherwise apparently coincidental appearance of "204" at both clock rates. If mode-switching overhead is 5.9% at 800 MHz, at 853 MHz it would be 6.3%. And 6.3% overhead on 218 GB/s leaves...204 GB/s.

I don't think anyone from Microsoft is lying. I believe the true theoretical max bandwidth of their eSRAM really is 204 GB/s now.
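For what it's worth, the arithmetic behind that theory checks out, at least with the rounded figures from the thread; note that the linear scaling of the overhead with clock speed is this post's assumption, not anything Microsoft has said.

```python
# Sanity check of the "no zero-bus-turnaround" theory using the thread's rounded figures.
pure_double_800 = 204                         # GB/s, 2 x 102 at 800 MHz
reported_800 = 192                            # GB/s, original DF figure
overhead_800 = 1 - reported_800 / pure_double_800        # ~5.9%

overhead_853 = overhead_800 * (853 / 800)     # ~6.3%, assuming overhead scales with clock
pure_double_853 = 218                         # GB/s, 2 x 109 at 853 MHz
effective_853 = pure_double_853 * (1 - overhead_853)     # ~204 GB/s

print(f"{overhead_800:.1%} -> {overhead_853:.1%}, effective: {effective_853:.0f} GB/s")
```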
 

JaggedSac

Member
Microsoft's HotChips talk was very lacking in terms of actual technical details and filled with all the buzzwords. Other HotChips talks were more in-depth and I'd doubt anyone used such mathemagic or buzzwords in their ppts.

The roundtable after the presentation was even more fluff - it's up on youtube and I'd encourage people to watch it. Major Nelson has all these really important guys and all they do is repeat PR sanctioned buzzwords - Growth via clouds, transistors in the clouds, gamers have changed, gaming has changed, this is what the gamers want, kinect, kinect, kinect, NUI, NUI, NUI, ESRAM, VM OS ..

You are mistaken; that thing with Major was filmed at E3 or after their initial reveal (I cannot remember). It had nothing to do with HotChips. Could you link to the HotChips presentation please?

EDIT: It was after their initial reveal in May.
 

teiresias

Member
How ignorant are these MS execs if they think they can run the same HD-DVD-style FUD campaign at NeoGAF that they ran at the AVS forums? Pointless question, since these were the same clowns behind HD-DVD.

The same FUD strategy will not fly here as it did there back then, primarily because of the user base and the moderation of this forum compared to AVS, which was totally unprepared for the FUD and astroturfing campaign that occurred with HD-DVD.

Oh wow, I'd completely forgotten about that whole affair. It was pretty awful over there at the time, and there was one guy in particular who was quite blatantly an HD-DVD shill (and an MS employee to boot). The moderation didn't know what hit them. Granted, they'd also given a forum account to Deadmeat around that same time period, and I quickly pointed out how stupid a decision that was.
 

Fredrik

Member
You were making claims about 60fps 1080p games on Xbox One.
My bad; that's not the issue I wanted to focus on, though. I just think that, given the claimed power difference, we really shouldn't see ANY XB1 games running at 60fps if there are first-party PS4 titles running at 30fps, whether at 1080p or 720p, unless there is a big visual difference or some problem harnessing that extra power. Otherwise, what's the point of talking so much about it?
And if, as the Resogun dev said, some devs might have to get their knuckles bloodied to harness that power, will multiformat devs bother doing it?
 

astraycat

Member
I don't think they necessarily do. It seems SRAM does come in two-port variants that can read and write at the same time. There's also an add-on feature called "zero bus turnaround" where switching between those modes is instantaneous...obviously implying that turnaround is not usually zero. So if Microsoft has two-ported eSRAM but no ZBT feature, the numbers presented both times suddenly can make sense:

102 GB/s two-way is 204 GB/s...but without ZBT, true theoretical max is only 192 GB/s
109 GB/s two-way is 218 GB/s...but without ZBT, true theoretical max is only 204 GB/s

This would clarify a whole lot. First, it means Albert Penello, his Technical Fellow source, and the Hotchips spokesmen aren't blatantly lying in a way that'd inevitably be found out. Second, it would illustrate why we've gotten two different "theoretical max" numbers each time. Third, and most tellingly, it would explain the otherwise apparently coincidental appearance of "204" at both clock rates. If mode-switching overhead is 5.9% at 800 MHz, at 853 MHz it would be 6.3%. And 6.3% overhead on 218 GB/s leaves...204 GB/s.

I don't think anyone from Microsoft is lying. I believe the true theoretical max bandwidth of their eSRAM really is 204 GB/s now.

This still doesn't make a lot of sense to me. First off, where did your 5.9% come from? And 6.3%? If the overhead is percentage based (a static number of cycles per swap), it should be the same percent no matter the clock speed.

The other half is that the claim is to be able to read/write in the same cycle. If you can read/write in the same cycle, why would you have overhead for switching between read and write?
 

Fredrik

Member
You've misinterpreted the Housemarque dev's comment. He was saying that Sony allows for very low-level access if you want to delve down that far. Jumping in and coding at the lowest level will give some devs 'bloody knuckles'.

He's not saying that the power of the PS4 in itself causes any issues, which ties in with everything we've heard from other devs, third party and otherwise. It's very simple to get games up and running on the PS4.
Then it's even worse... Read my last post. Why aren't we seeing the extra power when comparing screenshots and/or trailers from PS4 games to XB1 games?
What was the theoretical power difference between PS3 and Xbox360?
 

KidBeta

Junior Member
Then it's even worse... Read my last post. Why aren't we seeing the extra power when comparing screenshots and/or trailers from PS4 games to XB1 games?
What was the theoretical power difference between PS3 and Xbox360?

Because different developers target different features, effects, and games, and have different skills.

Trying to compare two consoles' power with exclusives is difficult; wait for the third-party cross-platform games to come out to compare. I hear most are being demoed on the PS4.
 
I don't know why MS keeps this up about having better hardware. They don't, we all know it. Get your first party shit together and put all your eggs in that basket.

Nobody believes it's better, nobody wants Kinect.
 

malfcn

Member
I don't know why MS keeps this up about having better hardware. They don't, we all know it. Get your first party shit together and put all your eggs in that basket.

Nobody believes it's better, nobody wants Kinect.

There are quite a few people that don't agree with you. Don't speak for me.
 

CLEEK

Member
I just think that, given the claimed power difference, we really shouldn't see ANY XB1 games running at 60fps if there are first-party PS4 titles running at 30fps, whether at 1080p or 720p, unless there is a big visual difference or some problem harnessing that extra power.

Both the 360 and PS3 can happily run games at 1080p/60 if the devs want (the PS3 launched with one). It just comes down to what they're prepared to compromise to achieve this.

So of course Xbone games can run at 1080p/60 if the devs want.

Don't look at PS4 games running at 30fps, or Xbone games running at 720p, as obviously pointing to performance issues, as this doesn't tell the true story. Personally, I think 720p sets off more alarm bells in my mind than 30fps, but it's too early to know what exactly has to be sacrificed on the Xbone for it to run 1080p/60. The only game I know of that does hit this target uses last-gen baked lighting, so it has compromised to get the frame rate and resolution.
 

nib95

Banned
There are quite a few people that don't agree with you. Don't speak for me.

Better hardware as in you prefer the design and Kinect? Or better hardware as in more performance and power than the PS4? If it's the latter, no offence buddy, but that's bordering on plain delusional. There is absolutely no way to comprehend the Xbox One as having more hardware performance than the PS4. None.
 

CLEEK

Member
Then it's even worse... Read my last post. Why aren't we seeing the extra power when comparing screenshots and/or trailers from PS4 games to XB1 games?

I don't think anyone can judge console performance based on screenshots. And trailers are usually smoke and mirrors, with in-engine scenes being passed off as in-game, or just rendered on dev kits or even running on PCs.

We'll see in November what all the launch games are really like.

Even though I've just said the above, if you compare KI with Resogun - both games whose devs make a big point about all the particles being thrown about - I would say the PS4 game looks several orders of magnitude more advanced than the Xbone game, even without considering that the PS4 game is 1080p compared to the Xbone game's 720p. I've been more wowed by Resogun than by any other PS4 or Xbone title.

Again, we'll see how they finally turn out come November.
 
This still doesn't make a lot of sense to me. First off, where did your 5.9% come from? And 6.3%? If the overhead is percentage based (a static number of cycles), it should be the same percent no matter the clock speed.
The 5.9% comes from Microsoft saying max bidirectional bandwidth was 192 GB/s, even though a pure doubling would've been 204 GB/s. As for the other percentage, I have no concept of what physical process might underlie it. I just find it significant that, if the cost isn't static but instead scales linearly with clock speed, you get 6.3% overhead. And thus on 218 GB/s you get the final result of 204 GB/s, which is what Microsoft has announced. However they're figuring that, they're doing it in an intelligible way.

The other half is that the claim is to be able to read/write in the same cycle. If you can read/write in the same cycle, why would you have overhead for switching between read and write?
I don't know, this is way beyond my depth. But if there's a specific variant of SRAM called "zero bus turnaround", that definitely seems to imply variants where bus turnaround isn't zero, right?
 

sangreal

Member
Sure, but who presented it to DF?

Sorry, I was being rushed out the door so I didn't have time to elaborate, but yes, obviously DF got the information from a source who got it from MS, but who knows how it was (mis)interpreted. Most people believe they got the PS4 memory situation completely wrong despite that info coming from the SDK as well. I also wanted to add that I agree that 204 is obviously the real number, regardless of how it's achieved or under what circumstances it can be reached. The idea that it was a typo and the real number is 218 is ridiculous. AP as a product manager gets his info secondhand from the engineering team, but the HotChips presentation would have been direct info.
 
This is anecdotal from E3, but...

I've heard the architecture with the ESRAM is actually a major hurdle in development because you need to manually fill and flush it.

So unless MS's APIs have improved to the point that this is essentially automatic, the bandwidth and hardware speed are probably irrelevant.

For reference, the story going around E3 went something like this:

"ATVI was doing the CoD: Ghosts port to nextgen. It took three weeks for PS4 and came out at 90 FPS unoptimized, and four months on Xbone and came out at 15 FPS."

What? This can't be true.
 

astraycat

Member
The 5.9% comes from Microsoft saying max bidirectional bandwidth was 192 GB/s, even though a pure doubling would've been 204 GB/s. As for the other percentage, I have no concept of what physical process might underlie it. I just find it significant that, if the cost isn't static but instead scales linearly with clock speed, you get 6.3% overhead. And thus on 218 GB/s you get the final result of 204 GB/s, which is what Microsoft has announced. However they're figuring that, they're doing it in an intelligible way.


I don't know, this is way beyond my depth. But if there's a specific variant of SRAM called "zero bus turnaround", that definitely seems to imply variants where bus turnaround isn't zero, right?

FYI, the difference in the ratio between 204/192 and 218/204 is pretty small (much smaller than 0.4%). This mysterious overhead is likely consistently applied to both numbers, but the rounding to GB/s gives some error.

The point I was trying to make about read/write on the same cycle was that there should be no turnaround penalty at all: if you can read and write at the same time, there's no reason to turn the bus around in the first place when calculating maximum theoretical throughput. You just read and write every cycle.
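A quick check of that consistency point; this sketch computes the peaks from the assumed 1024-bit bus at each clock rather than from the rounded 204/218 figures.

```python
# The effective-to-peak ratio is almost identical at both clocks, supporting the idea
# that the same overhead was applied each time and the rest is rounding to GB/s.
peak_800 = 2 * 800e6 * 128 / 1e9      # ~204.8 GB/s doubled peak at 800 MHz
peak_853 = 2 * 853e6 * 128 / 1e9      # ~218.4 GB/s doubled peak at 853 MHz
ratio_800 = 192 / peak_800            # ~0.9375
ratio_853 = 204 / peak_853            # ~0.9342
print(f"{ratio_800:.4f} vs {ratio_853:.4f} "
      f"(relative difference {abs(ratio_800 - ratio_853) / ratio_800:.2%})")
```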
 

Fredrik

Member
Because different developers target different features, effects, and games, and have different skills.

Trying to compare two consoles' power with exclusives is difficult; wait for the third-party cross-platform games to come out to compare. I hear most are being demoed on the PS4.
That's yet another problem. If the console is easy to develop for, and super-skillful first-party devs like Guerrilla and Evolution still have to aim for 30fps to get better effects, yet those effects still don't make you go "yuck" whenever you see an XB1 title, doesn't that make you question what the big deal is with that extra power?
It's a missed opportunity and quite annoying, really. I know I'm whining a bit here, but I'd rather see them all going locked 60fps on everything; otherwise it just seems like the console is too weak or they aren't trying hard enough. Wii U titles running at 60fps are easy to accept, but XB1's Forza 5 at 60fps while PS4's Driveclub is 30fps is simply annoying, even if there are more advanced effects etc. Killzone SF at 30fps will be even more annoying if Halo 5 turns out to run at 60fps.

Oh well, I still have a PS4 with a bunch of games preordered for Nov 29, while the XB1 still hasn't got a launch date in my country, so maybe the second-wave PS4 games and ND's next wizardry will be out before I even get to try the XB1 launch titles.
 