
Xbox One hardware breakdown by ExtremeTech.com after HotChips reveal

ekim

Member
http://www.extremetech.com/gaming/1...d-odd-soc-architecture-confirmed-by-microsoft

The Xbox One SoC appears to be implemented like an enormous variant of the Llano/Piledriver architecture we described for the PS4. One of our theories was that the chip would use the same “Onion” and “Garlic” buses. That appears to be exactly what Microsoft did.
[Image: AMD-APU-Diagram.png]

That slide is from Llano/Piledriver.
[Image: XBO_diagram_WM.jpg]

Here are the important points, for comparison’s sake. The CPU cache block attaches to the GPU MMU, which drives the entire graphics core and video engine. Of particular interest for our purposes is this bit: “CPU, GPU, special processors, and I/O share memory via host-guest MMUs and synchronized page tables.” If Microsoft is using synchronized page tables, this strongly suggests that the Xbox One supports HSA/hUMA and that we were mistaken in our assertion to the contrary. Mea culpa.

You can see the Onion and Garlic buses represented in both AMD’s diagram and the Microsoft image above. The GPU has a non-cache-coherent bus connection to the DDR3 memory pool and a cache-coherent bus attached to the CPU. Bandwidth to main memory is 68GB/s using 4×64 DDR3 links or 36GB/s if passed through the cache coherent interface. Cache coherency is always slower than non-coherent access, so the discrepancy makes sense.

The big picture takeaway from this is that the Xbox One probably is HSA capable, and the underlying architecture is very similar to a super-charged APU with much higher internal bandwidth than a normal AMD chip. That’s a non-trivial difference — the 68GB/s of bandwidth devoted to Jaguar in the Xbox One dwarfs the quad-channel DDR3-1600 bandwidth that ships in an Intel X79 motherboard. For all the debates over the Xbox One’s competitive positioning against the PS4, this should be an interesting micro-architecture in its own right. There are still questions regarding the ESRAM cache — breaking it into four 8MB chunks is interesting, but doesn’t tell us much about how those pieces will be used. If the cache really is 1024 bits wide, and the developers can make suitable use of it, then the Xbox One’s performance might surprise us.

Much more at the link.
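As a sanity check on the bandwidth figures quoted above, here is a back-of-the-envelope sketch. The DDR3-2133 transfer rate for the Xbox One and the X79 quad-channel DDR3-1600 figure are public platform specs, not from the article itself:

```python
# Xbox One main memory: 4 x 64-bit DDR3 channels at 2133 MT/s per pin.
channels = 4
bus_bytes_per_channel = 64 / 8
transfers_per_sec = 2133e6

xbox_one_bw = channels * bus_bytes_per_channel * transfers_per_sec / 1e9
print(f"Xbox One DDR3 bandwidth: {xbox_one_bw:.1f} GB/s")  # ~68.3 GB/s

# Intel X79 quad-channel DDR3-1600, for the comparison the article draws.
x79_bw = 4 * (64 / 8) * 1600e6 / 1e9
print(f"X79 quad-channel DDR3-1600: {x79_bw:.1f} GB/s")  # 51.2 GB/s
```

The ~68GB/s figure in the article is the rounded result of the first calculation; the X79 comparison holds, since quad-channel DDR3-1600 tops out at 51.2GB/s.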
 
"Garlic" is essentially the GPU directly to the main ram... which is expected.

It does appear there is an "Onion" solution (GPU to CPU cache), but the cache still needs to be flushed IIRC.

Also, you can't bypass the GPU cache. Onion+ is something Sony did.

I don't know why people keep using the word 'cache' to describe the eSRAM; it's not a cache, it's a scratchpad.

I don't either. You're right, but people are basically saying it's an L3, which it isn't.
 

statham

Member
If the cache really is 1024 bits wide, and the developers can make suitable use of it, then the Xbox One’s performance might surprise us.

I don't understand much about tech, but I like this line.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I don't know why people keep using the word 'cache' to describe the eSRAM; it's not a cache, it's a scratchpad.

Yeah, that statement is wrong. The ESRAM will primarily be there for pixel buffers, since rendering performance would go down the toilet otherwise. It is not a cache, and not to be confused with the large embedded caches in some CPUs. /edit: also, stressing the "wideness" of the ESRAM bus is a weird statement, since all that matters is bandwidth, which is the product of clock and bus width: (4 × 256 bit × 853 MHz) / 8 ≈ 109.2 GB/s.
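ElTorro's arithmetic checks out; as a quick sketch using only the figures from the post above:

```python
# ESRAM peak bandwidth from bus width and clock:
# four lanes of 256 bits each, at the 853 MHz GPU clock.
lanes = 4
bits_per_lane = 256
clock_hz = 853e6

bandwidth_gbs = lanes * bits_per_lane * clock_hz / 8 / 1e9
print(f"ESRAM one-way bandwidth: {bandwidth_gbs:.1f} GB/s")  # ~109.2 GB/s
```

Which matches the 109GB/s "minimum" figure the article attributes to Microsoft.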
 

mocoworm

Member
extremetech said:
"Of particular interest for our purposes is this bit: “CPU, GPU, special processors, and I/O share memory via host-guest MMUs and synchronized page tables.” If Microsoft is using synchronized page tables, this strongly suggests that the Xbox One supports HSA/hUMA and that we were mistaken in our assertion to the contrary. Mea culpa."

Nice! Good of them to put their hands up and accept the mistake.
 

Bundy

Banned
So the Xbox One has HUMA?
Doesn't look like it!

Anyone who really thought the Xbox One was going to launch $100 more and significantly weaker was just being dumb or a crazy fanboy. It may, in fact, be weaker but it would never be so much so that third party games would look different.
Well, wait and see. There will be differences.
(Maybe not at launch)
 

dude819

Member
Anyone who really thought the Xbox One was going to launch $100 more and significantly weaker was just being dumb or a crazy fanboy. It may, in fact, be weaker but it would never be so much so that third party games would look different.
 

dude819

Member
Well, wait and see. There will be differences.
(Maybe not at launch)

Just like all those third party games looked better on the Xbox and how they looked better on the PS3 the following gen?

The boxes are clearly very similar and third party games will reflect that. They would have to be wildly different for devs to spend the time making different versions as it saves them money not to.
 

KidBeta

Junior Member
The interesting ESRAM cache

First, there’s the fact that while we’ve been calling this a 32MB ESRAM cache, Microsoft is representing it as a series of four 8MB caches. Bandwidth to this cache is apparently 109GB/s “minimum” but up to 204GB/s. The math on this is… odd. It’s not clear if the ESRAM cache is actually a group of 4x8MB caches that can be split into chunks for different purposes, or how it’s purposed. The implication is that the cache is a total of 1024 bits wide, running at the GPU’s clock speed of ~850MHz for 109GB/s in uni-directional mode — which would give us the “minimum” figure. But that has implications for data storage — filling four blocks of 8MB each isn’t the same as addressing a contiguous block of 32MB. This is still unclear.

The other major mystery of the ESRAM cache is the single arrow running from the CPU cache linkage down to the GPU-ESRAM bus. It’s the only skinny black arrow in the entire presentation and its use is still unclear. It implies that there’s a way for the CPU to snoop the contents of ESRAM, but there’s no mention of why that capability isn’t already provided for on the Onion/Garlic buses and it’s not clear why they’d represent this option with a tiny black arrow rather than a fat bandwidth pipe.

Even with this new information, the use and capabilities of the ESRAM remain mysterious. It’s not clear what Microsoft expects it to be used for — if it’s for caching GPU data, why break it into 8MB chunks, and why does the CPU have a connection to it?
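One way to reconcile the article's 109GB/s "minimum" and 204GB/s peak is sketched below. The assumption that a write can land alongside a read on roughly 7 of every 8 cycles comes from Microsoft's later public statements, not from this article, so treat it as a hypothesis rather than confirmed behavior:

```python
# ESRAM bandwidth figures from the quoted article: 1024-bit bus, ~853 MHz.
bus_bits = 1024
clock_hz = 853e6

one_way = bus_bits * clock_hz / 8 / 1e9   # ~109.2 GB/s, the "minimum"
full_duplex = 2 * one_way                 # ~218.4 GB/s if read+write on every cycle

# Microsoft's 204 GB/s peak falls short of full duplex; it is consistent
# with simultaneous read+write happening on about 7 of every 8 cycles
# (an assumption, as noted in the lead-in above).
peak = one_way * (1 + 7 / 8)
print(f"{one_way:.1f} / {peak:.1f} / {full_duplex:.1f} GB/s")
```

That is, 204GB/s is less than a naive doubling of the one-way figure, which is why the article calls the math "odd."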

http://www.youtube.com/watch?v=G2y8Sx4B2Sk

They really like to push the point that it's a cache, don't they?
 

Dunlop

Member
And the 2DS

This thread made me realize the 2DS is a real thing. I am at a loss for what to say about it.

On topic, I am glad for the lack of DMZ references this time around and happy the breakdown is making it a little clearer for me to understand lol
 

mocoworm

Member
Extremetech said:
"The big picture takeaway from this is that the Xbox One probably is HSA capable, and the underlying architecture is very similar to a super-charged APU with much higher internal bandwidth than a normal AMD chip. That’s a non-trivial difference — the 68GB/s of bandwidth devoted to Jaguar in the Xbox One dwarfs the quad-channel DDR3-1600 bandwidth that ships in an Intel X79 motherboard. For all the debates over the Xbox One’s competitive positioning against the PS4, this should be an interesting micro-architecture in its own right. There are still questions regarding the ESRAM cache — breaking it into four 8MB chunks is interesting, but doesn’t tell us much about how those pieces will be used. If the cache really is 1024 bits wide, and the developers can make suitable use of it, then the Xbox One’s performance might surprise us."


Kind of relates to this, don't you think?

http://www.neogaf.com/forum/showthread.php?p=77551213

Videogamer.com said:
"Need For Speed Rivals will feature better graphics on one next-gen console than the other, Ghost Games’ executive producer Marcus Nilsson has suggested – but refused to clarify which.

“What we’re seeing with the consoles are actually that they are a little bit more powerful than we thought for a really long time – ESPECIALLY ONE OF THEM, but I’m not going to tell you which one," Nilsson told VideoGamer.com at Gamescom earlier today.

“And that makes me really happy. But in reality, I think we’re going to have both those consoles pretty much on parity – maybe one sticking up a little bit. And I think that one will look as good as the PC.”"
 
The big picture takeaway from this is that the Xbox One probably is HSA capable, and the underlying architecture is very similar to a super-charged APU with much higher internal bandwidth than a normal AMD chip. That’s a non-trivial difference — the 68GB/s of bandwidth devoted to Jaguar in the Xbox One dwarfs the quad-channel DDR3-1600 bandwidth that ships in an Intel X79 motherboard.
With regards to the above statement - I understand its benefit over PC tech, but how does it compare to PS4?
I'd assumed he meant PS4, especially given the quote was re-tweeted by Yoshida.
 
Planet3DNow! (Heavy focus on AMD) disagrees with the conclusion that the Xbox One has hUMA.

Unfortunately it's in German: http://www.planet3dnow.de/cms/1538-...f-der-hot-chips-25-huma-sein-oder-nicht-sein/

Who cares about hUMA? It's just AMD's implementation of a technique. The obsession with being able to call it hUMA(TM) is as meaningless as despising IBM's SMT because they can't call it Hyperthreading(TM).

Both Microsoft and Sony improved the memory subsystem of AMD's offering in their own way.
 

mocoworm

Member
With regards to the above statement - I understand its benefit over PC tech, but how does it compare to PS4?

I'd assumed he meant PS4, especially given the quote was re-tweeted by Yoshida.

You shouldn't assume ;)

Who knows!? But this is real fun while it lasts. I kind of don't want them to release so we can keep this going.
 
thanks ekim. I quoted this yesterday in the other thread:
If the cache really is 1024 bits wide, and the developers can make suitable use of it, then the Xbox One’s performance might surprise us.

interesting
 

BigDug13

Member
Not even talking about just graphics here:

Honestly this is a good thing. The PS4 will be more powerful, but anything on the XBO side that closes the power gap only makes multiplatform games better. Even PC ports of most games are held back by the weakest console hardware when it comes to how big your world can be, how many enemies are on screen at once, and other things beyond simple graphics.

More power in the weaker console means less features being held back due to power constraints.
 

Bundy

Banned
The post with the bold from the article is literally above yours...
Wait, what?
Just like all those third party games looked better on the Xbox and how they looked better on the PS3 the following gen?

The boxes are clearly very similar and third party games will reflect that. They would have to be wildly different for devs to spend the time making different versions as it saves them money not to.
I don't know how often we will hear that "bu... but... PS3/Xbox360 this gen..." comment again. As I've already said in several other threads, the PS3/Xbox360 were nearly identical, performance-wise, if you look at their best games.
The gap between the PS4 and XBone is much bigger.
And only the CPU is the exact same. That's it.

btw. the "saves money not to do anything with the PS4's extra power" line is bullcrap, because you don't have to work 20 hours a day to get a bit more out of the PS4. This is wishful thinking and not how it works.

Damn, I have to find Amir0x's great post on that topic.
 
Anyone who really thought the Xbox One was going to launch $100 more and significantly weaker was just being dumb or a crazy fanboy. It may, in fact, be weaker but it would never be so much so that third party games would look different.

really? you've already 'validated' the higher price like this in your mind?

price has nothing to do with quality or specs. It's about fucking profit. See: Apple.
 
What I understood of the first post:

Technical Technical Technical Technical Technical Technical Technical Technical Technical Technical Onion and Garlic ? Technical Technical Technical Technical Technical Technical...

I guess the Xbox has Onion and Garlic, then... XD
 

benny_a

extra source of jiggaflops
Who cares about hUMA? It's just AMD's implementation of a technique. The obsession with being able to call it hUMA(TM) is as meaningless as despising IBM's SMT because they can't call it Hyperthreading(TM).

Both Microsoft and Sony improved the memory subsystem of AMD's offering in their own way.
hUMA is just the short form of this
as opposed to this:

If the XB1 can do the former without being constrained in one way or another due to its architecture, then great.
 

dude819

Member
really? you've already 'validated' the higher price like this in your mind?

price has nothing to do with quality or specs. It's about fucking profit. See: Apple.

I validated the price in my mind when i pre-ordered on June 10th.

It would just make no sense for MS to put out a product so inferior that the third party games are way worse on theirs. That would end in them getting crushed.

The console may, in fact, be weaker but they would never let it be so much that they lose their existing audience. They need to keep those users while expanding the base with Kinect, TV, etc.
 
Anyone who really thought the Xbox One was going to launch $100 more and significantly weaker was just being dumb or a crazy fanboy. It may, in fact, be weaker but it would never be so much so that third party games would look different.

Looking at the specs, we know the XB1 is weaker than the PS4. The only reason it's more expensive is Kinect.
 

Skeff

Member
I validated the price in my mind when i pre-ordered on June 10th.

It would just make no sense for MS to put out a product so inferior that the third party games are way worse on theirs. That would end in them getting crushed.

The console may, in fact, be weaker but they would never let it be so much that they lose their existing audience. They need to keep those users while expanding the base with Kinect, TV, etc.

But it's not just an easy choice to go for Kinect/TV to expand while also keeping the power to hold the existing userbase. At the start of development they had to make a choice about 8GB of RAM for multimedia functions; from there they're doing the best they can to make a powerful console, but because of the DDR3 they need ESRAM, and that takes space on the silicon. The APU is already the largest consumer APU in the world, so they can't just add more GPU horsepower.

At the time of design, choices had to be made.
 