
TRUTHFACT: MS having eSRAM yield problems on Xbox One

Status
Not open for further replies.

Freshmaker

I am Korean.
Looking worse for Microsoft each day. I'm sure however their games line up will be solid.

They still got infinite power of the cloud.

A part of me is wondering if this "rumor" has anything to do with confusion over the different power states of the Xbox One. ;)

It would make sense that there are different power states, some that lower clock rates or disable specific parts of the system entirely when it isn't busy playing full games, or is occupied with less intensive tasks. Are we sure the sources informing people of this stuff aren't confusing the lower base clocks with the higher boost clock the system likely runs when games demand it?

During manufacturing?
 
Is it just me or does this whole XBone platform seem like the suits and bean counters had way too much influence in every aspect of its design?
 

Chittagong

Gold Member
It's been suggested on Beyond3D that the SRAM array may literally be too large for signals to travel the physical distance in time to be considered valid within a single clock cycle. No one has ever made a pool of SRAM this large before. IBM uses eDRAM for its large CPU caches, perhaps in part because it was not sure SRAM could scale effectively due to such issues. If correct, that means the normal measures you'd take to improve yields (additional cooling, deactivating defective regions, even increasing production runs) would not be effective. But lowering the clock would give a signal more time to travel through a wire.

That would be a massive architecture oversight or one of the craziest risks ever.
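The timing argument above can be sanity-checked with rough numbers. Everything here is illustrative: the 800 MHz clock is the figure rumored in this thread, and the ~100 ps/mm wire delay is a generic ballpark for unrepeated on-chip interconnect, not real process data.

```python
# Back-of-envelope: how far can a signal travel in one clock cycle?
CLOCK_HZ = 800e6                # rumored eSRAM clock (assumption)
cycle_ns = 1e9 / CLOCK_HZ       # nanoseconds per cycle

# On-chip wires are RC-limited, far slower than the speed of light;
# ~100 ps/mm is an illustrative ballpark, not a real process figure.
WIRE_DELAY_NS_PER_MM = 0.1

reach_mm = cycle_ns / WIRE_DELAY_NS_PER_MM
print(f"cycle time: {cycle_ns:.2f} ns, reach: {reach_mm:.1f} mm")
# → cycle time: 1.25 ns, reach: 12.5 mm
```

A reach on the order of 10 mm is comparable to the span of a very large SRAM array, which is why a modest downclock (more nanoseconds per cycle) would relax the constraint where extra cooling or fusing off bad regions would not.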
 
A part of me is wondering if this "rumor" has anything to do with confusion over the different power states of the Xbox One. ;)

This isn't new to the Xbone; PCs have done it for years, the PS4 will have multiple power states too, and the people who would know enough to start these rumours would know the difference.
 

TKM

Member
Is XBone's GPU similar to a 7790 running at 800MHz with two CUs disabled for yield? If so, maybe they can get some performance back by running 14 CUs at a lower clock.
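The trade-off described above is simple arithmetic. A minimal sketch, assuming the GCN figure of 64 shader ALUs per CU, each doing one fused multiply-add (2 ops) per cycle; the CU counts and clocks are this thread's speculation, not confirmed specs:

```python
# Shader throughput scales linearly with both CU count and clock speed.
OPS_PER_CU_PER_CYCLE = 64 * 2    # 64 ALUs x 2 ops (FMA) per cycle on GCN

def gflops(cus: int, clock_mhz: float) -> float:
    return cus * OPS_PER_CU_PER_CYCLE * clock_mhz / 1000.0

baseline = gflops(12, 800)       # rumored config: 12 CUs @ 800 MHz
match_clock = 800 * 12 / 14      # clock at which 14 CUs equal the baseline

print(f"12 CUs @ 800 MHz: {baseline:.0f} GFLOPS")
print(f"14 CUs match that at about {match_clock:.0f} MHz")
# → 12 CUs @ 800 MHz: 1229 GFLOPS
# → 14 CUs match that at about 686 MHz
```

So enabling the two spare CUs only pays off if yields still allow a clock above roughly 686 MHz; below that you lose yield and gain heat for no extra throughput.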
 

KidBeta

Junior Member
Is XBone's GPU similar to a 7790 running at 800MHz with two CUs disabled for yield? If so, maybe they can get some performance back by running 14 CUs at a lower clock.

Then they lower their yield and also increase the heat.
 

artist

Banned
Is XBone's GPU similar to a 7790 running at 800MHz with two CUs disabled for yield? If so, maybe they can get some performance back by running 14 CUs at a lower clock.
That's a (big) assumption that the 2 extra CUs were disabled in the first place.
 

GribbleGrunger

Dreams in Digital
If the yield issues are due to the eSRAM, then chopping CUs won't help. Perhaps it's a heat issue affecting the eSRAM, so they have to downclock the GPU.

They still have 500 billion transistors and 'rocket science level stuff', so it's not all bad
 

CLEEK

Member
That would be a massive architecture oversight or one of the craziest risks ever.

While it does sound technically plausible, I can't believe they could make such a huge error. Unless they were always aiming for 28nm and, for whatever reason, had to stick with 40nm (as was suggested after the reveal).
 
This isn't new to the Xbone; PCs have done it for years, the PS4 will have multiple power states too, and the people who would know enough to start these rumours would know the difference.

The same people who for so long saw a Xeon processor inside Xbox One dev kits, there purely to emulate the console's audio chip, and took it to mean the Xbox One had a 16-core processor?

Those people?
 
Since I'm in the OP, I might as well say that I completely stand by what I said and feel the person that informed me of this is as reliable as it possibly gets on the Xbox One. I feel a source doesn't get more accurate than this.

If it turns out wrong, feel free to perma ban me. Although, if I'm wrong, which I really, really doubt, I throw myself on the mercy of the court.

Who am I kidding? Gaf? Mercy? Still, I take full responsibility for what I said, and I absolutely stand by it. :)

 
It sounds like something that was most definitely designed by a bunch of suits.
It seems to me that any hardware design type person who knew their stuff could've seen this eSRAM problem coming from a mile away but some suit decided to go with it to save $10 per unit or something.
 
Matt's intervention was about heat issues and inadequate yields, which have been rumored for a while now, so there is no reason to doubt him (especially someone with his post history). But he didn't hint at any downclock...

Matt did not hint one way or the other. He offered a vague confirmation of something in the OP. What he was confirming is unknown.
 

Haha, just give me a fair trial. This shit better be confirmed from proper sources, and not some random and completely unverifiable nonsense source that is just piggybacking what they saw elsewhere, and just figured they'd get in on the fun.

If I'm to be banned, I want my money's worth.

 

Klocker

Member
It seems to me that any hardware design type person who knew their stuff could've seen this eSRAM problem coming from a mile away but some suit decided to go with it to save $10 per unit or something.

They built an entire oversized simulated machine of the chip architecture, testing all the theories... it's not like it should have been missed, but simulations and real silicon are two different things.
 
Haha, just give me a fair trial. This shit better be confirmed from proper sources, and not some random and completely unverifiable nonsense source that is just piggybacking what they saw elsewhere, and just figured they'd get in on the fun.

If I'm to be banned, I want my money's worth.


How about this. All of you PM me who your source is. I shall be the judge! D:

Your secrets are safe with me!
 
Here we go again. I'll take it as fact if I want it to be true. Predictable.

Yield issues were reported just as much last time for both and they got through it.
 
It's been suggested on Beyond3D that the SRAM array may literally be too large for signals to travel the physical distance in time to be considered valid within a single clock cycle. No one has ever made a pool of SRAM this large before. IBM uses eDRAM for its large CPU caches, perhaps in part because it was not sure SRAM could scale effectively due to such issues. If correct, that means the normal measures you'd take to improve yields (additional cooling, deactivating defective regions, even increasing production runs) would not be effective. But lowering the clock would give a signal more time to travel through a wire.

Brad, would an oversight of that magnitude even be possible? The whole hardware design team needs to be sacked if they are lowering clocks for this reason.
 

daffy

Banned
So basically it's an ice cube tray with some half melted ice here and there? Doesn't sound like a huge deal to me, X1 games will be ok. I don't plan on being an early adopter anyway with the Kinect Trojan and everything.
 
Here we go again. I'll take it as fact if I want it to be true. Predictable.

Yield issues were reported just as much last time for both and they got through it.

PS3 definitely didn't "get through". Sony had to disable one SPE and the original cost of Cell was somewhere around $250 per unit without amortisation of R&D costs. That is the very definition of not "getting through". In the end the PS3 lost Sony all of their profits from PS1 and PS2, again, in what way is that getting through?
 

CLEEK

Member
Brad, would an oversight of that magnitude even be possible? The whole hardware design team needs to be sacked if they are lowering clocks for this reason.

Yeah, it's like building a house, only to find it's so big, it takes too long to get out of it. So you have to get up earlier just to walk out the front door in time for work.

I'm awesome with analogies.
 
It's been suggested on Beyond3D that the SRAM array may literally be too large for signals to travel the physical distance in time to be considered valid within a single clock cycle. No one has ever made a pool of SRAM this large before. IBM uses eDRAM for its large CPU caches, perhaps in part because it was not sure SRAM could scale effectively due to such issues. If correct, that means the normal measures you'd take to improve yields (additional cooling, deactivating defective regions, even increasing production runs) would not be effective. But lowering the clock would give a signal more time to travel through a wire.

That would be incredible if true.

So would that hypothesis suggest that when the pool was designed, it was supposed to be fabbed on a smaller process than they are currently using? And that the previously planned process wasn't yielding an acceptable percentage, so they had to go with a larger process that is now out of spec? I'm trying to wrap my head around how this misadventure would come to pass under this explanation.
 

Rooster

Member
I expected initial yield problems with a chip that size. Expect limited stock on release. Release will probably be staggered as well.
 
The only thing I see in here is a possible delay of the launch, not really less performance. Am I wrong?

Edit: Damn everybody talking in Gamecube terms here, seems pretty underpowered.
 

gofreak

GAF's Bob Woodward
That would be incredible if true.

So would that hypothesis suggest that when the pool was designed, it was supposed to be fabbed using a smaller process than they are currently utilizing? And that previously-planned on process wasn't yielding an acceptable percentage, so they had to go with a larger process that is now out-of-spec? I'm trying to wrap my head around how this misadventure would come to pass under this explanation.

That's sort of what I was thinking. Was there any indication at any point that the design was initially targeting 22nm? From that leaked doc a while back, for example?
 

kitch9

Banned
This "power of the cloud" crap should end now. Either post something constructive or keep it to yourself.

Its infinite though.

Never forget.

If you were around here for the PS3 launch, "shooting for the sun" was a thing that got ridiculed, along with "get another job."

Complete fucking retard bullshit will get the piss taken out of it, and that particular infinite cloud shit MS spouted is complete retard.
 