
TRUTHFACT: MS having eSRAM yield problems on Xbox One

MS is having trouble making enough Xbox Ones with the current configuration, which may lead to either a delay... or lowering the specs in order to be able to make them in sufficient numbers.

Bad yields like this mean, for example, that for every 10 chips they make, only 5 make it out of the factory, which is very expensive.

They try to manufacture as many chips as they can fit on a silicon wafer, then test the chips to see if they are up to spec. Low yield means there are more chips than expected not meeting the spec. Their options are to delay the launch; lower the spec (down-clocking) so more chips can meet it; accept the low yields and be supply constrained and cost inefficient; or use brute force and try to pump out as many as they can, which is also cost inefficient.
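To put rough numbers on that, here's a minimal sketch of how yield drives the cost of each usable chip. The wafer cost and dies-per-wafer figures are made-up illustrative values, not real Microsoft/TSMC numbers:

```python
# Rough sketch: how chip yield drives the cost of each usable console chip.
# wafer_cost and dies_per_wafer are illustrative guesses, not real figures.

wafer_cost = 5000.0      # assumed cost of one processed 28nm wafer, in dollars
dies_per_wafer = 200     # assumed number of APU-sized dies per wafer

def cost_per_good_die(yield_rate: float) -> float:
    """Cost of each die that actually passes testing at a given yield."""
    good_dies = dies_per_wafer * yield_rate
    return wafer_cost / good_dies

for y in (0.9, 0.7, 0.5):  # healthy, mediocre, and bad yields
    print(f"yield {y:.0%}: ${cost_per_good_die(y):.2f} per good chip")

# yield 90%: $27.78 per good chip
# yield 70%: $35.71 per good chip
# yield 50%: $50.00 per good chip  <- the "5 out of 10 don't make it" case
```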
OK thanks guys, that helps a lot.
 
Something I've never understood is why people are so quick to hang us insiders. We literally go out of our way to give you something to talk about and a glimpse behind the scenes. As far as I know, none of them (Bruce, Cboat, Matt, or Gopher) have any intention of starting a console proxy war. To the Xbone community: we are just telling you what we heard.

I for one appreciate all the information insiders have provided to us. It sucks when the info is bad news for a product I'm hyped for, but I would never shoot the messenger for it.

That said, there is a difference between insider information and a random poster making crazy predictions to generate buzz and claiming them as facts. The mods are there to help filter the real from the fake.
 

KidBeta

Junior Member
Well, MS already did the pooled UMA design thing with X360 while adding a fast EDRAM buffer that benefits more than hurts, AFAICT, just looking at released titles over the years. Just looking at the stuff rumors and leaks have revealed, MS, having a long view with their DX roadmap and experience from their own studios, seems to have designed their hardware to address the problem of feeding units work data all of the time as well as trying to wipe out most of the cost of cache being killed to feed commonly used data. That seems to be a more direct approach than the more general purpose one Sony took, AFAICT, but I'm no hardware or graphics tech-head. MS, it seems to me, built their version of the core hardware both they and Sony share to be potentially much greater at maximizing the benefit of partially resident textures that UE4, idTech 5/6, and CE3+ support.

In a general way, Sony added and then maxed out their pipes to the single RAM pool and CPU/GPU caches, while MS did much the same except for focusing on having a high-speed on-chip cache and adding more dedicated copy/load/store/compress units to ease the burden on the CPU for data movement. MS' solution seems more complicated, but when these newer APU designs' goals are accounted for, memory virtualization should make managing it a relative snap and probably mostly hidden from programmers unless they specify more micromanagement with prefetching and how data is laid out to be consumed in the 32MB cache.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance. That's the same goal as building out a high bandwidth connection to GDDR5 RAM and adding more ways to access cache, bypass it, and move data through it. If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly? MS sees the benefit, and I'm not convinced that it's just a Rube Goldberg machine method of achieving the same goal because they're using DDR3 for their main work RAM.

Perhaps it's a mistake, but we're not quite near launch yet, with less than six months or so to go. MS could just take a big hit on cost upfront by taking what they get out of current manufacturing and just eat the higher cost/lower yields until the process is smoothed out over the next year, but not necessarily downclocking to make target shipments and time window. It's not uncommon for new consoles to be in less than great shape so close to release, as MS was behind with X360 and was forced to demo in-progress games with less than ideal performance at E3 '05 with their beta kits, IIRC, which don't quite approximate the final hardware's speed and behavior completely. We could be looking at something very similar with X1. In any case, rumors are rumors, and even when they are right, they can miss a lot of important details that can make them seem more major than they really are. I'm not going to underestimate MS' plans and execution when they came out fine twice before.

2 DMA units that can swizzle textures seem par for the course for GCN iirc.

So in reality they added one extra DMA unit which can decode JPG and one that can do LZ77.

These aren't great advancements in tech, DMA is decades old.
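For context on what "swizzle" means here: GPUs generally store textures in a tiled or Morton (Z-order) layout rather than row by row, so nearby texels end up close together in memory, and a DMA engine that can do that reordering during the copy saves the CPU/GPU the work. A minimal sketch of Morton-order addressing (a generic illustration, not GCN's actual tiling modes, which are more involved):

```python
def part1by1(n: int) -> int:
    """Spread the bits of a 16-bit value so there is a zero between each bit."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton_index(x: int, y: int) -> int:
    """Interleave x and y coordinates into a Z-order (Morton) address."""
    return part1by1(x) | (part1by1(y) << 1)

# "Swizzling" a linear (row-major) texture means copying each texel to its
# Morton address, e.g. the texel at (x=3, y=5) lands at index:
print(morton_index(3, 5))  # 39
```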
 

Minions

Member
More like a lack of the people who made Xbox what it is today for many people. Most of the folks who created the original Xbox are not around anymore.

Agreed. Most of the Xbox creators have gone their separate ways. Microsoft has lost a lot of their hardware side. They still have great software devs. Sad to say the least.
 

Minions

Member
Well, MS already did the pooled UMA design thing with X360 while adding a fast EDRAM buffer that benefits more than hurts, AFAICT, just looking at released titles over the years. Just looking at the stuff rumors and leaks have revealed, MS, having a long view with their DX roadmap and experience from their own studios, seems to have designed their hardware to address the problem of feeding units work data all of the time as well as trying to wipe out most of the cost of cache being killed to feed commonly used data. That seems to be a more direct approach than the more general purpose one Sony took, AFAICT, but I'm no hardware or graphics tech-head. MS, it seems to me, built their version of the core hardware both they and Sony share to be potentially much greater at maximizing the benefit of partially resident textures that UE4, idTech 5/6, and CE3+ support.

In a general way, Sony added and then maxed out their pipes to the single RAM pool and CPU/GPU caches, while MS did much the same except for focusing on having a high-speed on-chip cache and adding more dedicated copy/load/store/compress units to ease the burden on the CPU for data movement. MS' solution seems more complicated, but when these newer APU designs' goals are accounted for, memory virtualization should make managing it a relative snap and probably mostly hidden from programmers unless they specify more micromanagement with prefetching and how data is laid out to be consumed in the 32MB cache.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance. That's the same goal as building out a high bandwidth connection to GDDR5 RAM and adding more ways to access cache, bypass it, and move data through it. If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly? MS sees the benefit, and I'm not convinced that it's just a Rube Goldberg machine method of achieving the same goal because they're using DDR3 for their main work RAM.

Perhaps it's a mistake, but we're not quite near launch yet, with less than six months or so to go. MS could just take a big hit on cost upfront by taking what they get out of current manufacturing and just eat the higher cost/lower yields until the process is smoothed out over the next year, but not necessarily downclocking to make target shipments and time window. It's not uncommon for new consoles to be in less than great shape so close to release, as MS was behind with X360 and was forced to demo in-progress games with less than ideal performance at E3 '05 with their beta kits, IIRC, which don't quite approximate the final hardware's speed and behavior completely. We could be looking at something very similar with X1. In any case, rumors are rumors, and even when they are right, they can miss a lot of important details that can make them seem more major than they really are. I'm not going to underestimate MS' plans and execution when they came out fine twice before.

Even if they wanted to, they would end up being supply constrained, and you would be seeing the nextbox at a (much) higher price point than the PS4. They wouldn't take a loss on fab costs and cut the console price in addition... we could be seeing (nearly) PS3-level losses per console sold.
 

Minions

Member
Basically, that he had a source saying there is no issue with MS having to downclock anything, and he's willing to take a ban bet on it.

Ouch... with all the confirmations by people who have been correct in the past, I wouldn't put my ass on the line. Sounds like a lose-lose proposition to me.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Who is Senjutsu? I have never heard of him. What did he say?

Last night he was claiming he knew the down clock wasn't true. Of course based on his posting history here and B3D, he doesn't "know" as much as "want".
 
Who is Senjutsu? I have never heard of him. What did he say?

I'll help out. I can confirm for a fact that it's false. If I'm wrong, I'll take a ban. I can't say how I know this for sure, but I'm pretty damn sure.
Nope, I don't speak nearly as often on here about what I know for certain on the new Xbox. In fact, I try not to be the subject of any leaks, because I'd more or less be betraying someone's trust, but in the case of this most recent thing about a downclock of the GPU, I literally begged permission to say definitively that it's absolutely not true.

The only thing that's true is that they are having a bit of a headache with the ESRAM. THAT is certainly true, but (all speculation after this) I think that's more down to the manufacturing difficulty of the ESRAM and yields, not actually using a finished and working version of the GPU for software related purposes.



LOL. Nice dig :)
 

Perkel

Banned
Well, MS already did the pooled UMA design thing with X360 while adding a fast EDRAM buffer that benefits more than hurts, AFAICT, just looking at released titles over the years. Just looking at the stuff rumors and leaks have revealed, MS, having a long view with their DX roadmap and experience from their own studios, seems to have designed their hardware to address the problem of feeding units work data all of the time as well as trying to wipe out most of the cost of cache being killed to feed commonly used data. That seems to be a more direct approach than the more general purpose one Sony took, AFAICT, but I'm no hardware or graphics tech-head. MS, it seems to me, built their version of the core hardware both they and Sony share to be potentially much greater at maximizing the benefit of partially resident textures that UE4, idTech 5/6, and CE3+ support.

In a general way, Sony added and then maxed out their pipes to the single RAM pool and CPU/GPU caches, while MS did much the same except for focusing on having a high-speed on-chip cache and adding more dedicated copy/load/store/compress units to ease the burden on the CPU for data movement. MS' solution seems more complicated, but when these newer APU designs' goals are accounted for, memory virtualization should make managing it a relative snap and probably mostly hidden from programmers unless they specify more micromanagement with prefetching and how data is laid out to be consumed in the 32MB cache.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance. That's the same goal as building out a high bandwidth connection to GDDR5 RAM and adding more ways to access cache, bypass it, and move data through it. If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly? MS sees the benefit, and I'm not convinced that it's just a Rube Goldberg machine method of achieving the same goal because they're using DDR3 for their main work RAM.

Because no one predicted that 8GB of GDDR5 would be available when the consoles came out. Sony all along planned 4GB of GDDR5, and even earlier 2GB. They got lucky with 8GB because it is super fresh tech, like January this year, and their old 4GB layout was a clamshell design, so they just switched it.

If they wanted their media hub they needed 8GB. The only way back then was to use 8GB of DDR3; there was no 8GB of GDDR5. Even 4GB at that time was still not out.

If they wanted to go with 8GB of DDR3 they needed to use eDRAM or eSRAM to "patch" (not "fix") the problem with DDR3 bandwidth.

The real question now is why eSRAM and not eDRAM. Some people mentioned that eDRAM can only be produced in a few fabs and it will be harder to shrink in the future.
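As a rough sanity check on the bandwidth gap being "patched", here's the back-of-the-envelope math using the bus configurations widely reported at the time; treat the exact clocks as assumptions, since final retail specs could differ:

```python
# Peak theoretical bandwidth = (effective data rate per pin) * (bus width in bits) / 8.
# Figures below are the commonly reported/rumored configurations, not confirmed final specs.

def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

ddr3_2133 = bandwidth_gb_s(2.133, 256)   # Xbox One main RAM: 8GB DDR3 on a 256-bit bus
gddr5_5500 = bandwidth_gb_s(5.5, 256)    # PS4 main RAM: 8GB GDDR5 on a 256-bit bus

print(f"DDR3-2133, 256-bit    : ~{ddr3_2133:.0f} GB/s")    # ~68 GB/s
print(f"GDDR5 5.5Gbps, 256-bit: ~{gddr5_5500:.0f} GB/s")   # ~176 GB/s

# The 32MB of on-die eSRAM (rumored at the time to be ~102 GB/s) is what makes up
# part of that gap for bandwidth-hungry render targets, i.e. the "patch" above.
```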
 

Nikodemos

Member
I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and trying to limit the loss going from potential performance to achieved performance.
It's a mistake due to an exceptionally simple reason. An APU is all-in-one. If one of the things inside it is a fab dud, the whole APU is a dud. The more things you cram into an APU, the higher the risk it'll be a dud.
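That intuition lines up with first-order yield models, where the chance of a die escaping fatal defects falls off roughly exponentially with die area. A toy sketch using a Poisson defect model; the defect density and die sizes are assumed illustrative values, not real 28nm foundry figures:

```python
import math

# Poisson yield model: yield ~= exp(-die_area * defect_density).
DEFECT_DENSITY = 0.25  # fatal defects per cm^2 (assumed illustrative value)

def die_yield(area_cm2: float, d0: float = DEFECT_DENSITY) -> float:
    return math.exp(-area_cm2 * d0)

# Adding a 32MB SRAM block (plus extra DMA/decode units) grows the die.
base_apu = 3.0      # cm^2, hypothetical APU without the extra on-die memory
bigger_apu = 3.6    # cm^2, hypothetical APU with the eSRAM block added

print(f"smaller die yield: {die_yield(base_apu):.0%}")    # ~47%
print(f"bigger die yield : {die_yield(bigger_apu):.0%}")  # ~41%

# Every extra square millimetre is more surface area where a single defect
# can kill the whole APU, which is the "all-in-one" risk described above.
```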

If Sony's approach was really just flat-out superior, why wouldn't MS have just gone with that considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly?
Because the PS4 was initially supposed to have just 4 GB GDDR5, a solution considered inferior to 8 GB DDR3 + special RAM according to some metrics. Nobody apart from Cerny and a couple of others fully believed they could get 8 GB in time. Hirai accepted the GDDR5 solution and hoped for a jackpot, since Sony badly needed it. They got it.
 

Drek

Member
So how would that be a 3rd console curse when the SNES had already sold less than the NES and the GameCube sold less than the N64?

People call it a "curse" primarily in jest. It's the "curse" of hubris: feeling like your fans will blindly follow you into the next generation regardless of your decisions, and thinking you can strong-arm your way into another successful generation.

Nintendo overplayed their hand when they continued to treat 3rd parties like shit, didn't recognize that an 800-pound gorilla was entering the industry whether they liked it or not, and didn't see that carts were a dead medium for home consoles due to cost and storage capacity. They failed to reach an accord with Sony on a new CD-based system, failed to convince 3rd parties to stick with the later-released N64 and its high-cost media, and had more expensive games with worse profit margins due to carts vs. CD.

Sony overplayed their hand when they thought MSRP was irrelevant, everyone would worship at the temple of Sony, and that they could Trojan horse blu-ray on everyone. Also, that ease of development was irrelevant because everyone tolerated the PS2's eccentricities so they'd tolerate the even more eccentric design of the PS3 because it's a Playstation.

SEGA overplayed their hand by releasing a muddled and confusing hardware design stuck halfway between the 2D and 3D transitions, with horrible marketing and a surprise launch. This came on the back of souring the SEGA fanbase with crazy Genesis add-ons though.

It isn't a curse, it's a market trend where the third hardware cycle seems to be when first parties who have seen some recent success have built a thick enough echo chamber to be oblivious to the market's demands. Nintendo has honestly never popped this bubble, the Gamecube had all the same problems as the N64 and the Wii only saw success with non-gamers and extremely casual gamers. They continue to survive thanks to a VERY dedicated core.

SEGA wasn't a strong enough company overall to dig out of the Saturn hole and were no longer financially capable of battling Sony, resulting in an exit from the console hardware market.

Sony is the first console manufacturer we're seeing make a real attempt to learn from these mistakes and dig their way out of the echo chamber. That so far has involved a change at the top of their consumer electronics division, their CEO/corporate president, the head of their worldwide studios, and the very people entrusted with product design. Sony handing the PS4 to Cerny and not their traditional stable of hardware engineers is the equivalent of Nintendo telling Miyamoto to just focus on games and letting real hardware guys design an efficient, powerful console. It's a massive directional shift and the first we've seen in the industry. How it pans out will likely change how the rest of the players in this industry work moving forward.
 
Nope, I don't speak nearly as often on here about what I know for certain on the new Xbox. In fact, I try not to be the subject of any leaks, because I'd more or less be betraying someone's trust, but in the case of this most recent thing about a downclock of the GPU, I literally begged permission to say definitively that it's absolutely not true.

The only thing that's true is that they are having a bit of a headache with the ESRAM. THAT is certainly true, but (all speculation after this) I think that's more down to the manufacturing difficulty of the ESRAM and yields, not actually using a finished and working version of the GPU for software related purposes.
lol i love how he has to beg his source to allow him to come to xbone's rescue
 

artist

Banned


The 7750 900MHz Edition is ~920 GFLOPS.
The 7750 is ~820 GFLOPS.
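For anyone wondering where those figures come from, the usual shorthand is stream processors x 2 FLOPs (one fused multiply-add) x clock. A quick sketch; the Xbox One line is based on the rumored 768-SP configuration, not a confirmed spec:

```python
def gflops(stream_processors: int, clock_ghz: float) -> float:
    # Each GCN stream processor can do one fused multiply-add (2 FLOPs) per clock.
    return stream_processors * 2 * clock_ghz

print(gflops(512, 0.80))  # HD 7750 @ 800 MHz -> ~819 GFLOPS
print(gflops(512, 0.90))  # HD 7750 @ 900 MHz -> ~922 GFLOPS

# For comparison, the rumored Xbox One GPU (768 SPs) at the rumored 800 MHz target:
print(gflops(768, 0.80))  # -> ~1229 GFLOPS
```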
 

KageMaru

Member
I still can't believe MS have designed such a mess of a console, especially after how well the last two were made.

So now it's looking like it'll be downclocked on top of eSRAM yield issues? This sounds like a disaster.
 
2 DMA units that can swizzle textures seem par for the course for GCN iirc.

So in reality they added one extra DMA unit which can decode JPG and one that can do LZ77.

These aren't great advancements in tech, DMA is decades old.

Great advancements aside, it may just be about maximizing performance based on expected workloads. I'm sure MS' hardware/software team at Xbox knows what they're doing. Do you have an inside line into how and why they decided upon their design? I mean, you're always down on it in every thread, dismissing it out of hand. So, clearly, you're not fooled by MS' apparently poor or not-so-special configuration. I'm being serious here, so I would like you to explain it for us plebs, considering your constant lack of enthusiasm for MS' approach.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
So has he been banned?

I'm not sure what happened after I went to bed. Last night his "info" was in the OP, now it is not.

I still can't believe MS have designed such a mess of a console, especially after how well the last two were made.

So now it's looking like it'll be downclocked on top of eSRAM yield issues? This sounds like a disaster.

To be fair, the OG Xbox was not well designed. It was just some PC tech slapped together that cost MS way too much to make. The 360 was much better thought out. Even the Xbone is a smart design on paper.
 
People call it a "curse" primarily in jest. It's the "curse" of hubris: feeling like your fans will blindly follow you into the next generation regardless of your decisions, and thinking you can strong-arm your way into another successful generation.

Nintendo overplayed their hand when they continued to treat 3rd parties like shit, didn't recognize that an 800-pound gorilla was entering the industry whether they liked it or not, and didn't see that carts were a dead medium for home consoles due to cost and storage capacity. They failed to reach an accord with Sony on a new CD-based system, failed to convince 3rd parties to stick with the later-released N64 and its high-cost media, and had more expensive games with worse profit margins due to carts vs. CD.

Sony overplayed their hand when they thought MSRP was irrelevant, everyone would worship at the temple of Sony, and that they could Trojan horse blu-ray on everyone. Also, that ease of development was irrelevant because everyone tolerated the PS2's eccentricities so they'd tolerate the even more eccentric design of the PS3 because it's a Playstation.

SEGA overplayed their hand by releasing a muddled and confusing hardware design stuck halfway between the 2D and 3D transitions, with horrible marketing and a surprise launch. This came on the back of souring the SEGA fanbase with crazy Genesis add-ons though.

It isn't a curse, it's a market trend where the third hardware cycle seems to be when first parties who have seen some recent success have built a thick enough echo chamber to be oblivious to the market's demands. Nintendo has honestly never popped this bubble, the Gamecube had all the same problems as the N64 and the Wii only saw success with non-gamers and extremely casual gamers. They continue to survive thanks to a VERY dedicated core.

SEGA wasn't a strong enough company overall to dig out of the Saturn hole and were no longer financially capable of battling Sony, resulting in an exit from the console hardware market.

Sony is the first console manufacturer we're seeing make a real attempt to learn from these mistakes and dig their way out of the echo chamber. That so far has involved a change at the top of their consumer electronics division, their CEO/corporate president, the head of their worldwide studios, and the very people entrusted with product design. Sony handing the PS4 to Cerny and not their traditional stable of hardware engineers is the equivalent of Nintendo telling Miyamoto to just focus on games and letting real hardware guys design an efficient, powerful console. It's a massive directional shift and the first we've seen in the industry. How it pans out will likely change how the rest of the players in this industry work moving forward.
Excellent explanation of what is meant by "the curse"
 

Des0lar

will learn eventually
I think the PS4 is back in front for me. If I HAD to choose only one console right now based on what we know, it would be the PS4.
You think? We heard nothing but bad news about the Xbone, while hearing mostly good stuff about the PS4 and you just think that you would choose the PS4 now?

Currently no sane gamer should consider the Xbone, except if you get all the consoles anyway or really really love Halo.
 
So how would that be a 3rd console curse when the SNES had already sold less than the NES and the GameCube sold less than the N64?
Nobody said selling less than your predecessor is 'third console curse' exclusive!

Stop spoiling the fun, keep the myth alive! :p
 

Donnie

Member
I don't think going with embedded memory plus DDR3 was that bad a decision. What I really don't get is why they went with 6T-SRAM instead of eDRAM. The size difference is humongous. Are there production/process advantages to this I am unaware of?

The very first time the 32MB eSRAM was mentioned I remember thinking "That makes no sense at all, that can't be right??", and to this day I'm still trying to figure out why they've gone that way. They could have used 32MB pseudo-static eDRAM, taking up one third of the transistors and the difference in latency wouldn't be at all significant.
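To put the "humongous" size difference in rough numbers, here's the array-only transistor math (ignoring decoders, sense amps, and redundancy, which add more on top):

```python
# Rough array-only transistor count for 32MB of on-die storage.
bits = 32 * 1024 * 1024 * 8           # 32 MB expressed in bits (~268 million)

sram_6t = bits * 6                    # classic 6-transistor SRAM cell per bit
edram_1t1c = bits * 1                 # eDRAM-style 1 transistor + 1 capacitor per bit

print(f"6T SRAM : ~{sram_6t / 1e9:.1f} billion transistors")    # ~1.6 billion
print(f"1T eDRAM: ~{edram_1t1c / 1e6:.0f} million transistors") # ~268 million

# Which is why the 32MB eSRAM block would be such a large slice of the Xbox One
# APU's reported ~5 billion transistor budget.
```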
 
So has he been banned?

Why? Nothing is confirmed yet. We would need an answer before banning him for being wrong.

On top of that, Beyond3D is taking this news with a much bigger grain of salt and is waiting for more established insiders (mainly a guy named Matt?) to confirm this possible rumor before going to the next rumor stage.

It is interesting reading over there; they do not have a high opinion of our GAF "insiders."
 
The very first time the 32MB eSRAM was mentioned I remember thinking "That makes no sense at all, that can't be right??", and to this day I'm still trying to figure out why they've gone that way. They could have used 32MB pseudo-static eDRAM, taking up one third of the transistors and the difference in latency wouldn't have been that significant.

There is no 28 nm eDRAM as far as I know.
 

chubigans

y'all should be ashamed
Why? Nothing is confirmed yet. We would need an answer before banning him for being wrong.

On top of that, Beyond3D is taking this news with a much bigger grain of salt and is waiting for more established insiders (mainly a guy named Matt?) to confirm this possible rumor before going to the next rumor stage.

It is interesting reading over there; they do not have a high opinion of our GAF "insiders."

Uh, the same Matt as this one?
 
You think? We heard nothing but bad news about the Xbone, while hearing mostly good stuff about the PS4 and you just think that you would choose the PS4 now?

Currently no sane gamer should consider the Xbone, except if you get all the consoles anyway or really really love Halo.

I really really love Halo. But the Xbone is such a disaster that I can live without it easily...
Still, is there anybody who can live without Naughty Dog? :)
 
After reading through the confirmations, it sounds like the thread title should get a change. This isn't just about low yields, but downclocking may well be the more salient point.
 
Why? Nothing is confirmed yet. We would need an answer before banning him for being wrong.

On top of that, Beyond3D is taking this news with a much bigger grain of salt and is waiting for more established insiders (mainly a guy named Matt?) to confirm this possible rumor before going to the next rumor stage.

It is interesting reading over there; they do not have a high opinion of our GAF "insiders."

Matt posted here on GAF.
 

operon

Member
Why? Nothing is confirmed yet. We would need an answer before banning him for being wrong.

On top of that, Beyond3D is taking this news with a much bigger grain of salt and is waiting for more established insiders (mainly a guy named Matt?) to confirm this possible rumor before going to the next rumor stage.

It is interesting reading over there; they do not have a high opinion of our GAF "insiders."

I only asked because the poster seemed sure he was full of shit, which would have been confirmed if he had been banned. He had pretty much bet his account and was willing to tell a mod his source in confidence.
 

Fox Mulder

Member
Other than the RROD, I can't think of MS having hardware problems. Or am I missing something?

The first Xbox had power supply issues that forced them to give out replacement cords with a trip switch in them to prevent your house from burning down.

If these rumors are true, this is the third console in a row that MS has had trouble with.
 
Because no one predicted that 8GB of GDDR5 would be available when the consoles came out. Sony all along planned 4GB of GDDR5, and even earlier 2GB. They got lucky with 8GB because it is super fresh tech, like January this year, and their old 4GB layout was a clamshell design, so they just switched it.

If they wanted their media hub they needed 8GB. The only way back then was to use 8GB of DDR3; there was no 8GB of GDDR5. Even 4GB at that time was still not out.

If they wanted to go with 8GB of DDR3 they needed to use eDRAM or eSRAM to "patch" (not "fix") the problem with DDR3 bandwidth.

The real question now is why eSRAM and not eDRAM. Some people mentioned that eDRAM can only be produced in a few fabs and it will be harder to shrink in the future.
I can see that, as I myself guessed months ago in another of these threads. Still, it seems like an awfully big miss that Sony saw it coming and MS did not, considering that they're shipping at nearly the same time with the same visibility into hardware availability. It just seems ridiculously patchwork if it really is just there to accommodate DDR3.

It's a mistake due to an exceptionally simple reason. An APU is all-in-one. If one of the things inside it is a fab dud, the whole APU is a dud. The more things you cram into an APU, the higher the risk it'll be a dud.
Yeah, sounds like a potential killer mistake if it's as bad as interpretations of the rumored situation sound.

Because the PS4 was initially supposed to have just 4 GB GDDR5, a solution considered inferior to 8 GB DDR3 + special RAM according to some metrics. Nobody apart from Cerny and a couple of others fully believed they could get 8 GB in time. Hirai accepted the GDDR5 solution and hoped for a jackpot, since Sony badly needed it. They got it.

Right, I get it, but I still can't quite believe it would be so over-engineered to match something so much simpler. It just feels like a ridiculously amateur mistake for a group that has shipped two consoles before now. Whatever the end result, it should make for some interesting reading if Dean Takahashi does another book on this project.
 