
TRUTHFACT: MS having eSRAM yield problems on Xbox One

[gif: woman spills beer]
Dat sweep to the leg... Right out of karate kid
 

Y2Kev

TLG Fan Caretaker Est. 2009
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
 

StiLt

Member
No matter if Xbone ends up being a 900GF or 1200GF console, multiplatform games will be designed around it and then up-ported to PS4. The only thing in question is just how much better they're going to end up on the PS4. If MS downclocks to something like 900GF, that could make an impact on design and possibly limit the scope and size of the games, not just IQ and FPS. And then Sony devs are going to destroy everything else.

Don't think that takes developer pride/reputation into the equation at all. No way all 3rd parties would do this to then be shat on from a very great height by first party developers. You're only as good/marketable as your last game, and if that is too far behind the curve (because targeting lowest denominator surely wouldn't be industry unanimous)... well we're already seeing devs dropping like flies of late. Reputation is a massive deal to survival in the current climate. If the power difference is as large as this thread suggests it could be, I would hazard that it will be a case of downwards rather than upwards ports. Particularly if the bulk of core (i.e. frequent game purchasing) gamers gravitate to the more powerful system.
 

GopherD

Member
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?
Production yield targeting and heat mapping happen at the same stage of console development. While they don't actually relate to each other, they can occur at the same time. This may explain the mixed messages from a few posters.
 

thuway

Member
For everyone asking- this information is all pretty recent. Around the PlayStation Meeting the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.

It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.

For those asking how this affects performance: to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; nor will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to Medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.

Microsoft is simply behind and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.
 

DBT85

Member
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?

I think because eSRAM bandwidth and GPU frequency are linked if I read it correctly.

So if they are having problems making the eSRAM that works at 102GB/s but works fine at say 95GB/s then they would drop the GPU frequency as well.

But I have no idea what I'm talking about.
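For what it's worth, here's a back-of-the-envelope sketch of that link. The 128-bytes-per-cycle interface width below is my assumption for illustration, not a confirmed spec; it just happens to reproduce the 102 GB/s figure at an 800 MHz clock:

# Rough sketch: if eSRAM bandwidth scales with the GPU clock,
# a downclock drops the usable bandwidth proportionally.
BYTES_PER_CYCLE = 128  # assumed eSRAM interface width, purely illustrative

def esram_bandwidth_gbs(gpu_clock_mhz, bytes_per_cycle=BYTES_PER_CYCLE):
    """eSRAM bandwidth in GB/s for a given GPU clock in MHz."""
    return gpu_clock_mhz * 1e6 * bytes_per_cycle / 1e9

print(esram_bandwidth_gbs(800))  # 102.4 GB/s at an 800 MHz clock
print(esram_bandwidth_gbs(750))  # 96.0 GB/s if the clock dropped to 750 MHz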
 

thuway

Member
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?

http://www.neogaf.com/forum/showpost.php?p=61347677&postcount=671

Sony's APU is 3 billion transistors. MS's is 5 billion, and more complex because of the added parts.

Because it's all in one part, one part being fucked up means the whole APU is affected by it. And since 25% of the APU's die is dedicated to eSRAM (in my opinion this is not smart at all; with this many transistors MS should have just outright put in more GPU power), the heat issue can be quite real.

And since eSRAM bandwidth is directly correlated with GPU speed (lower the GPU speed and you lower the eSRAM bandwidth, and therefore the thermal issue), it seems plausible, but I still don't think MS could be THAT stupid as to not figure this out at the blueprint stage.

If they did, then this is a colossal FUCK UP.

Come to think of it, the yield for this chip must be horrible.
 
Don't think that takes developer pride/reputation into the equation at all. No way all 3rd parties would do this to then be shat on from a very great height by first party developers. You're only as good/marketable as your last game, and if that is too far behind the curve (because targeting lowest denominator surely wouldn't be industry unanimous)... well we're already seeing devs dropping like flies of late. Reputation is a massive deal to survival in the current climate. If the power difference is as large as this thread suggests it could be, I would hazard that it will be a case of downwards rather than upwards ports. Particularly if the bulk of core (i.e. frequent game purchasing) gamers gravitate to the more powerful system.

I don't think that's going to happen if the difference gets that massive. If you're going 1080p on PS4, you would have to drop to 720p on Xbone to get comparable performance. If you're going for something in between on the PS4, you would have to cut even more on Xbone unless you're going sub-HD. Either way, PS4 will end up with a massive IQ advantage. If the devs target a solid IQ on the Xbone and design their games around that, their games will end up looking poor compared to PS4 exclusives. If they want to match Sony's 1st party stuff, they'll be forced to make severe IQ concessions.
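For context on those resolution gaps, the raw pixel counts are easy to compare (this is only a pixel-count ratio; how performance actually scales depends on the game):

# Pixel counts for the resolutions being thrown around in this thread.
resolutions = {"1080p": (1920, 1080), "900p": (1600, 900), "720p": (1280, 720)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")
print(f"1080p vs 720p: {pixels['1080p'] / pixels['720p']:.2f}x")  # 2.25x
print(f"1080p vs 900p: {pixels['1080p'] / pixels['900p']:.2f}x")  # 1.44x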
 

DC1

Member
Actually if Apple are anything to go by, nothing makes money like producing great products that people want to buy and then selling them with a healthy profit margin.

I'm not sure why loss leadership has become de rigueur in the console space, but it's always seemed batty to me. It's as though the companies (except Nintendo) have come to believe that the aim in console manufacture is to 'win' by selling the most units rather than turning a profit, because that's what the fans think.

Yes. But there is an inherent risk with that.

May not be the best example but... :
The NFL gets 3.1 billion dollars per year from TV rights (CBS, Fox and NBC). 3.1 billion per year.
How do CBS, Fox and NBC pay for this? Advertising!

Companies will pay to put their product in front of an audience .. especially if you can guarantee that they are watching ;)
 

v1oz

Member
For everyone asking- this information is all pretty recent. Around the PlayStation Meeting the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.

It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.

For those asking how this affects performance: to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; nor will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to Medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.

Microsoft is simply behind and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.

lol
 

Deku Tree

Member
For everyone asking- this information is all pretty recent. Around the PlayStation Meeting the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.

It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.

For those asking how this affects performance: to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; nor will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to Medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.

Microsoft is simply behind and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.

Good to know.
 

The Jason

Member
For everyone asking- this information is all pretty recent. Around the PlayStation Meeting the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.

It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.

For those asking how this affects performance: to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; nor will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to Medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.

Microsoft is simply behind and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.

They really should have anticipated these problems during the design process.
 
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?

If you have 90% defective yields at the target specs, say 40% from defective eSRAM + 40% from a defective GPU + 10% from a defective CPU (simplifying), you will try to lower that 90% rate at all costs, by reducing the eSRAM clock, the GPU clock, or whatever.
A GCN GPU per se does not have a very high yield rate. I remember hearing that in the beginning it was something like 40% (4 of 10 were good chips). Doing worse than a vanilla GCN GPU (eSRAM is difficult to get fabbed) would in the end mean a severe loss of money.
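As a toy illustration of why you attack the worst block first (the numbers are the made-up 40/40/10 split from the post above, treated here as independent failure probabilities rather than simply added together):

# Toy yield model: a die is only good if every block on it is good.
def die_yield(esram_defect, gpu_defect, cpu_defect):
    """Fraction of good dies, assuming independent per-block defect rates."""
    return (1 - esram_defect) * (1 - gpu_defect) * (1 - cpu_defect)

print(die_yield(0.40, 0.40, 0.10))  # ~0.32 -> roughly 1 in 3 dies usable
# If relaxing the eSRAM spec (e.g. via a downclock) cut its defect rate to 15%:
print(die_yield(0.15, 0.40, 0.10))  # ~0.46 -> a big jump in usable dies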
 
I think because eSRAM bandwidth and GPU frequency are linked if I read it correctly.

So if they are having problems making the eSRAM that works at 102GB/s but works fine at say 95GB/s then they would drop the GPU frequency as well.

But I have no idea what I'm talking about.

The eSRAM is on the same die as the GPU/CPU, isn't it? Which would obviously affect the whole APU.
 

thuway

Member
They really should have anticipated these problems during the design process.

It was a bet, like any other bet, but the downclock isn't what is worrisome, it's the half baked OS. I hope they just fix things in time. I'm optimistic they'll get it right, but it might be a few months after launch.

They have a very ambitious goal with the Xbox One OS.
 

RayMaker

Banned
I wouldn't say it's a colossal mess-up like people are saying.

It's just a result of wanting 8GB of RAM from the get-go, to meet their TV/Kinect needs.

It doesn't really make sense to me, because Mark Cerny did say that they were interested in the eSRAM method because of its ease of manufacturability.

Was Mark Cerny lying?
 
I read the OP but I can't find the explanation. Why would an esram issue cause a gpu downgrade?

There are many theories. IMO, the more I think about this one, the more it makes sense, and I generally discount a lot of what is read on B3D:

It's been suggested on Beyond3D that the SRAM array may literally be too large for signals to travel the physical distance in time to be considered valid within a single clock cycle. No one has ever made a pool of SRAM this large before. IBM uses eDRAM for its large CPU caches; maybe that's in part because they were not sure SRAM could scale effectively due to such issues. If correct, that means the normal measures you'd take to improve yields (additional cooling, deactivating defective regions, even increasing production runs) would not be effective. But lowering the clock would give a signal more time to travel through a wire.

This explanation would require one to believe that MS either originally designed the eSRAM pool on a razor's edge, with maximum transistor density in mind, and the design failed because of leakage/thermal/yield reasons, or they designed the pool with a smaller process in mind, yields sucked, and they had to shift to a larger process.

If this turns out to be true, the story of this misadventure will be very very interesting.
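If the timing theory is right, the arithmetic behind a downclock helping is simple: the clock period bounds how far a signal can propagate per cycle, so a lower clock buys slack. A quick sketch (800 MHz was the GPU clock widely rumoured at the time; the lower values are just examples):

# Clock period = 1 / frequency; a lower clock gives signals more time per cycle.
def period_ns(clock_mhz):
    """Clock period in nanoseconds for a clock given in MHz."""
    return 1e3 / clock_mhz

for mhz in (800, 750, 700):
    print(f"{mhz} MHz -> {period_ns(mhz):.2f} ns per cycle")
# 800 MHz -> 1.25 ns, 750 MHz -> 1.33 ns, 700 MHz -> 1.43 ns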
 
It was a bet, like any other bet, but the downclock isn't what is worrisome, it's the half baked OS. I hope they just fix things in time. I'm optimistic they'll get it right, but it might be a few months after launch.

They have a very ambitious goal with the Xbox One OS.

Well, if this progression goes on, MS could end up with a Wii U-power console. So that's worrisome too.
 

thuway

Member
Well, if this progression goes on, MS could end up with a Wii U-power console. So that's worrisome too.

To get to Wii U levels, you would need a new, lower-powered, shittier GPU with 1 GB of DDR3 allocated to games and no eSRAM. That's a far cry from what the Xbone is.
 

LukeTim

Member
I would say that's somewhat exaggerated. Keep in mind that the X1 is closed hardware with custom strengths such as the Data Move Engines. Having seen nightmarish hardware like the PS3 push some titles like GT5 or MotorStorm Apocalypse at 12whatever x 1080, I can easily see games on the Xbone running at native 1080p. That would depend on the game, the skill of the studio and the complexity of the effects and assets, of course.

Would the PS4 outperform it, and in specific games even double the frame-rate? If this rumour is true (and I don't buy it much), probably and frequently.
Would this downgrade (again, if it's true) lead to a 1080 vs 720 situation? I doubt it. But man, the differences in AA, now that's where I'd expect a great leap.

I thought those Data Move Engines were just PR nonsense for DMA... which the PS4 will definitely have.

Or is it something more than that?
 

syko de4d

Member
Now the PS4 only needs to support Oculus Rift and it can play every X1 game with the same graphics but in stereo 3D without any problems xD
 

1-D_FTW

Member
Think this way.

Sony's APU is 3 billion transistors. MS's is 5 billion, and more complex because of the added parts.

Because it's all in one part, one part being fucked up means the whole APU is affected by it. And since 25% of the APU's die is dedicated to eSRAM (in my opinion this is not smart at all; with this many transistors MS should have just outright put in more GPU power), the heat issue can be quite real.

And since eSRAM bandwidth is directly correlated with GPU speed (lower the GPU speed and you lower the eSRAM bandwidth, and therefore the thermal issue), it seems plausible, but I still don't think MS could be THAT stupid as to not figure this out at the blueprint stage.

If they did, then this is a colossal FUCK UP.

Come to think of it, the yield for this chip must be horrible.

Jeez. I didn't realize Sony was only using 3 billion. Until you see it directly compared, you just don't appreciate how badly they guessed wrong on needing to go DDR3 to get their 8GB OS hog.
 
There are many theories. IMO, the more I think about this one, the more it makes sense, and I generally discount a lot of what is read on B3D:



This explanation would require one to believe that MS either originally designed the eSRAM pool with maximum transistor density in mind, and the design failed because of leakage/thermal/yield reasons, or they designed the pool with a smaller process in mind, yields sucked, and they had to shift to a larger process.

If this turns out to be true, the story of this misadventure will be very very interesting.

That would have been one catastrophic mistake to make. That's a fundamental design mistake, not just a yield issue.
 
It was a bet, like any other bet, but the downclock isn't what is worrisome, it's the half baked OS. I hope they just fix things in time. I'm optimistic they'll get it right, but it might be a few months after launch.

They have a very ambitious goal with the Xbox One OS.
A shitty OS that gobbles up RAM?

Someone should tell MS to stop reading Sony's 2006 book of stupid things to do.
 
Now the PS4 only needs to support Oculus Rift and it can play every X1 game with the same graphics but in stereo 3D without any problems xD

Well, if "Thomas was alone" developer went to the Sony offices to present his PS4 new project with an Oculus Rift...maybe is because his game will need one...
 

jaosobno

Member
More importantly, will this cause a launch delay for the Xbone to sort stuff out, or will they sacrifice more GPU power instead? So in essence, can more time afford them a solution that is not detrimental to performance?

They will launch; no way they can allow Sony to get ahead of them (even for a couple of months) after the shitstorm that the Xbox One has been so far.

Thuway was banned for teasing nonsense and posting speculation that all Xbox E3 demos would be downgraded as fact.

He was banned because he was teasing info but didn't want to spill it. Learn some history before posting such things.
 

Durante

Member
I don't think going embedded memory with DDR3 was that bad a decision. What I really don't get is why they went with 6t-SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this I am unaware of?
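For a sense of scale on that size difference: a 6T cell spends six transistors per bit, so a 32 MB pool is enormous in transistor terms alone. A rough count (storage cells only, ignoring redundancy, ECC and peripheral logic; the 32 MB figure is the commonly reported eSRAM size):

# Rough transistor budget for a 32 MB pool of 6T SRAM (storage cells only).
MB = 1024 * 1024
bits = 32 * MB * 8             # 32 MB expressed in bits
transistors = bits * 6         # six transistors per bit for a 6T SRAM cell
print(f"{transistors / 1e9:.2f} billion transistors")  # ~1.61 billion
# Against a ~5 billion transistor APU, that's roughly a third of the budget;
# an eDRAM cell (one transistor plus a capacitor) would be far denser.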
 

DBT85

Member
Well, if "Thomas was alone" developer went to the Sony offices to present his PS4 new project with an Oculus Rift...maybe is because his game will need one...

I doubt it would NEED one. It might be able to use one.

I get why people are excited by Oculus Rift, but it's not going to permeate the market enough in my view.
 
I don't think going embedded memory with DDR3 was that bad a decision. What I really don't get is why they went with 6t-SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this I am unaware of?

Isn't the problem the way it handles data?
 
That would have been one catastrophic mistake to make. That's a fundamental design mistake, not just a yield issue.

It's a design mistake that never appears if your yields are good and the chips work.

It's happened before that smart designers fuck up silicon. Even Intel has been known to have some catastrophes that never seemed to show up until production wafers started rolling off. Remember the P5 FDIV bug? That was a pretty simple fuckup compared to engineering an eSRAM pool of unprecedented size on an APU.
 

jaosobno

Member
What we are seeing today with MS is their end game/goal. MS is betting that they are the one software company that can bring all the components of life and entertainment together. And by doing so will become the only information gathering conglomerate that will have a major television viewing angle.

Guys "Please understand" Nothing makes money like advertisement sales and MS is looking for a piece of That pie. Not gaming.

Yep, this is accurate. Ballmer himself confessed to this:

http://www.youtube.com/watch?v=UYFyKNQB1Js
 

LukeTim

Member
I don't think going embedded memory with DDR3 was that bad a decision. What I really don't get is why they went with 6t-SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this I am unaware of?

SRAM is cheaper, maybe?
 