
Xbox One hardware breakdown by ExtremeTech.com after HotChips reveal

eastmen

Banned
Scared of RROD, overconfident on Kinect, and the misfortune of focusing everything around 8GB DDR3 first.

MS made a lot of good long term choices.

1) The chip is large due to the eSRAM, but as process-node shrinks happen the cost will decrease rapidly. These are designed for 28nm, and 22nm should happen sometime during the second full year of the console's life, so 2015. They should get about a 30% reduction in size.

2) DDR3 is mass produced and won't go anywhere for a long time. The quantities for it dwarf the quantities for GDDR. The GDDR RAM Sony is using is cutting edge and thus will have a cutting-edge price and availability. Prices for high-end RAM take a long time to come down.

3) RROD is a problem that happened, but they are far from the only console maker with those problems. The system design is beautiful in its simplicity; they learned a lot from the Xbox 360 launch and you can see that in this design. The Xbox One is extremely quiet. I can't use my Xbox 360 Slim or my PS3 40GB as a Netflix or Blu-ray/DVD player because of how loud they are, especially now.

4) Kinect sold over 20m units and is still selling. It's one of the most successful add-ons to a console ever, if not the most. There are a lot of people out there who will want to use the next one, and having it bundled in means it will get a whole lot more support than the previous one.
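The cost-reduction argument in point 1 can be rough-sketched. All of the numbers below (wafer cost, a defect-free yield model, the exact shrink factor) are illustrative assumptions, not Microsoft figures; only the ~363mm² die size was reported at Hot Chips.

```python
# Rough sketch of how a process shrink cuts per-die cost.
# Wafer cost and the no-defect yield model are assumptions for illustration.

WAFER_COST = 5000.0   # assumed cost of a 300mm wafer, USD
WAFER_AREA = 70685.0  # usable area of a 300mm wafer, mm^2 (pi * 150^2)

def dies_per_wafer(die_area_mm2):
    """Crude estimate ignoring edge loss and defects."""
    return int(WAFER_AREA // die_area_mm2)

die_28nm = 363.0             # reported XB1 SoC size at 28nm, mm^2
die_22nm = die_28nm * 0.70   # the post's claimed ~30% area reduction

cost_28 = WAFER_COST / dies_per_wafer(die_28nm)
cost_22 = WAFER_COST / dies_per_wafer(die_22nm)
print(f"28nm: ~${cost_28:.2f}/die, 22nm: ~${cost_22:.2f}/die")
```

Under these assumptions the shrink alone cuts per-die cost by roughly 30%, before any yield improvement from the smaller die.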




I've said it before, but I like the original idea of the Xbox One, and the new changes are still workable. I like the idea of being able to sit on my couch and not have to get up to change from Blu-ray to game to Netflix to game to Blu-ray again and so on. It may sound lazy, but sometimes my friends and I jump through three different games in an hour depending on who jumps on and who has what. I don't want to constantly get up or constantly look for the remote to change inputs.
 

miked808

Member
It's not so much a question of wanting them; it's paying a premium for them, and then paying a yearly upkeep on top of that for continued access, that's unacceptable to many, myself included.

You have to remember that a premium to you is not to some of us, and I for one jump on Amazon and Buy.com LIVE sales, so I actually pay about $20 a year for LIVE. I have lunches sometimes that cost more than that.
 

coldfoot

Banned
You two are not alone. It may be heresy to say it, but a lot of people do want those multimedia features.
Please show me your research showing that many people are willing to buy $500 hardware and pay $60/year to control their TVs with voice, which historically has NEVER been reliable enough to use for anything. Apple and Google couldn't do it with devices where the microphone is next to your mouth; what makes you think MS will be able to do it flawlessly with a microphone 10 feet away?

And for those casuals who want full body control motion games, there's the 360 and Kinect at a much cheaper price. In fact, they may already have one. Why would they want to spend $500+cost of games for games that will play the same?
 

ekim

Member
Nope, they didn't say. It was a misunderstanding by VentureBeat.

It's stated on this slide:
XBO_diagram_WM.jpg

(upper left)
 

coldfoot

Banned
The GDDR RAM Sony is using is cutting edge and thus will have a cutting-edge price and availability
There is nothing cutting edge about GDDR5-5500 RAM. In fact it's one of the slowest speeds of GDDR5 available.

Also I would like to see where you obtained the standalone pricing for GDDR5, which is based on DDR3. Mandatory kinect is a far bigger detriment to cost reduction than GDDR5. I guess it could be made optional, going against MS's vision as of today.
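For context on the speed grades being argued over, peak memory bandwidth is just transfer rate times bus width. A quick sketch, assuming the commonly reported 256-bit buses for both consoles:

```python
# Peak-bandwidth arithmetic for the memory configs being compared.
# Bus widths are the commonly reported figures, not official spec sheets.

def peak_bandwidth_gbs(mt_per_s, bus_bits):
    """Transfers/s times bits per transfer, converted to GB/s."""
    return mt_per_s * bus_bits / 8 / 1000

ps4_gddr5 = peak_bandwidth_gbs(5500, 256)   # GDDR5-5500, 256-bit bus
xb1_ddr3  = peak_bandwidth_gbs(2133, 256)   # DDR3-2133, 256-bit bus

print(f"PS4 GDDR5: {ps4_gddr5:.0f} GB/s")   # ~176 GB/s
print(f"XB1 DDR3:  {xb1_ddr3:.1f} GB/s")    # ~68.3 GB/s
```

Those match the 176 GB/s and 68 GB/s figures both companies quoted, which is why the thread treats the bus widths as settled.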
 

evilalien

Member
There is nothing cutting edge about GDDR5-5500 RAM. In fact it's one of the slowest speeds of GDDR5 available.

Also I would like to see where you obtained the standalone pricing for GDDR5, which is based on DDR3. Mandatory kinect is a far bigger detriment to cost reduction than GDDR5. I guess it could be made optional, going against MS's vision as of today.

Yeah, the only thing cutting edge about the GDDR5 Sony is using is that they are using 4Gbit modules. These will drop in price rapidly as upcoming graphics cards start using these large modules as well.
 

Skeff

Member
There is nothing cutting edge about GDDR5-5500 RAM. In fact it's one of the slowest speeds of GDDR5 available.

Also I would like to see where you obtained the standalone pricing for GDDR5, which is based on DDR3. Mandatory kinect is a far bigger detriment to cost reduction than GDDR5. I guess it could be made optional, going against MS's vision as of today.

There's nothing cutting edge about the speed, but there is in capacity; I believe it will be the first consumer product to use 4Gbit chips.
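The capacity arithmetic behind the 4Gbit point: one 4Gbit chip holds 512MB, so sixteen of them make the PS4's 8GB. A quick sketch (using binary units throughout):

```python
# Capacity arithmetic: a 4Gbit chip is 512MiB, so 16 chips give 8GiB.
GIBIBIT = 2**30                        # bits

chip_bits    = 4 * GIBIBIT             # one 4Gbit chip
chip_bytes   = chip_bits // 8          # = 512 MiB
chips_needed = (8 * 2**30) // chip_bytes

print(chip_bytes // 2**20, "MiB per chip")  # 512
print(chips_needed, "chips for 8 GiB")      # 16
```

With the older 2Gbit parts it would take 32 chips to reach the same capacity, which is why the larger modules matter for board cost and layout.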
 

benny_a

extra source of jiggaflops
In all fairness, this is not accurate - what they (Marc Whitten, I believe) explicitly said is that "for every physical Xbox One we build, we're provisioning the CPU and storage equivalent of three Xbox Ones on the cloud".
This is what people are referring to when they talk about 3x the power of the Xbox One with the cloud:
http://www.neogaf.com/forum/showthread.php?t=594436
(The part where a regular Xbone is 10x of a 360 and the Xbone with the cloud is 40x the 360.)

Obviously that part backfired, so 2 days later the clarification was made via OXM.

I also wasn't aware of the clarification by Marc Whitten, just wanted to provide the context why people say 3 x Xbones.
 
There is nothing cutting edge about GDDR5-5500 RAM. In fact it's one of the slowest speeds of GDDR5 available.

Also I would like to see where you obtained the standalone pricing for GDDR5, which is based on DDR3. Mandatory kinect is a far bigger detriment to cost reduction than GDDR5. I guess it could be made optional, going against MS's vision as of today.
Clock speed has nothing to do with how cutting edge something is. The PS4 is the first device to use 4Gbit chips, and the RAM is understood to be the most expensive singular component to manufacture. You should read zomgwtfbbq's breakdown; there's nothing wrong with what eastmen said...
 
No, the XB1 GPU is just about dead on the HD 7770 in regards to processing power, pixel and texel fillrate.

I don't care what the 'on paper' specs say; devs will get performance far closer to a 7790 than a 7770 on the Xbox, and 7870 performance from the 'on paper' 7850 specs of the PS4.
 

artist

Banned
Wait... you want to imply that the X1 does not have 15 special-purpose processors? How do you know that?
My motherboard has 10+ processors on it even without the CPU/GPU: northbridge, southbridge, audio processor, Gigabit controller, wireless controller, PCIe, Lucid, PWM...
 

coldfoot

Banned
Clock speed has nothing to do with how cutting edge something is. PS4 is the first device to use 4Gbit chips and the ram is understood to be the most expensive singular component to manufacture. You should read zomgwtfbbqs breakdown, there's nothing wrong with what eastmen said...
Dividing the $80-90 estimated cost of the GDDR5 in the PS4 by 16 gives a little over $5 per chip. I doubt that's the most expensive SINGULAR component to manufacture in the PS4.
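The division behind that figure, spelled out (the $80-90 total is the poster's assumed BOM estimate, not a confirmed price):

```python
# coldfoot's per-chip division: spread the estimated GDDR5 cost over 16 chips.
est_low, est_high = 80, 90   # assumed total cost for the 8GB of GDDR5, USD
chips = 16                   # 16 x 4Gbit (512MB) chips

per_chip_low  = est_low / chips
per_chip_high = est_high / chips
print(f"${per_chip_low:.2f} to ${per_chip_high:.2f} per chip")  # $5.00 to $5.62
```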
 

inherendo

Member
Seems weird that they spent all that time and effort designing such an enormous chip with crazy customisations out the wazoo when a more standard part would give them better performance. What's the advantage of this? Sony are probably paying less for a more powerful APU, so what gives? Just over-engineering?

They needed the eSRAM cache because they went with DDR3, unlike Sony with GDDR5. I'm guessing MS didn't know Sony would go for GDDR5, or maybe they did. They used a cache with the Xbox 360, and that familiarity is why they chose it here.
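The gap the eSRAM is meant to bridge can be sketched with the publicly reported figures. Note the eSRAM bandwidth was quoted anywhere from ~102 to ~204 GB/s over the reveal period, so treat that number as approximate:

```python
# Sketch of the bandwidth gap the eSRAM is meant to bridge.
# Figures are the publicly reported ones and may be approximate.

ddr3_gbs  = 68.3    # XB1 main memory: DDR3-2133 on a 256-bit bus
esram_gbs = 109.0   # reported minimum eSRAM bandwidth (~204 GB/s peak claimed)
esram_mb  = 32      # eSRAM capacity

# A render target only sees eSRAM speeds if it fits in those 32MB,
# e.g. a 1080p 32-bit color buffer:
fb_mb = 1920 * 1080 * 4 / 2**20
print(f"1080p RGBA8 color buffer: {fb_mb:.1f} MB of {esram_mb} MB eSRAM")
```

The catch the thread circles around: the fast pool is small, so developers have to juggle which buffers live in it, much as they did with the 360's 10MB of eDRAM.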
 

ekim

Member
My motherboard has 10+ processors on it even without the CPU/GPU: northbridge, southbridge, audio processor, Gigabit controller, wireless controller, PCIe, Lucid, PWM...

They are not counting the stuff that's usually in there. Just look at the stuff on the left side.
 

MoneyHats

Banned
The difference wasn't nearly as big or as obvious, considering the 360 and PS3 had such different architectures. I don't think there has ever been a difference this great between competitors, aside from the Wii/Wii U, but that's a bit different.

Xbox vs PS2 difference was LEAGUES bigger.

PS2 Xbox Difference

Bandwidth.......3.2GB/s..............6.4GB/s...............100%
Ram total........32MB..................64MB...................100%
FLOPS........... 6.2GFLOPS.........21.6GFLOPS ....... 250%
Fillrate.............1.2GT/s..............8GT/s...................570%


You know, I won't even go into details of how vastly superior the Geforce 3 architecture is over Graphics Synthesizer. GF3 was the first programmable shader GPU, the core of what you still see today in modern GPUs.


40% the largest we've seen El oh El.
 

hodgy100

Member
Xbox vs PS2 difference was LEAGUES bigger.

PS2 Xbox Difference

Bandwidth.......3.2GB/s..............6.4GB/s...............200%
Ram total........32MB..................64MB...................200%
FLOPS........... 6.2GFLOPS.........21.6GFLOPS ....... 350%
Fillrate.............1.2GT/s..............8GT/s...................670%


You know, I won't even go into details of how vastly superior the Geforce 3 architecture is over Graphics Synthesizer. GF3 was the first programmable shader GPU, the core of what you still see today in modern GPUs.


40% the largest we've seen El oh El.

Come on! The Xbox launched two years after the PS2 did and was more expensive than the PS2, as it was launching new. That situation isn't comparable to this one. We have a situation now where one machine is quite a bit more powerful than the other, they are releasing within a month of each other, and the weaker machine is £80 more.

It's no doubt the original Xbox was much more powerful than the PS2, but that's what happens when you launch two years later.
 

Skeff

Member
Perhaps you should use better words to explain yourself. In what world is 8GB of GDDR5, made up of 16 chips of 512MB each, considered a SINGULAR component?

You're starting to lean on semantics a little too much now. The discussion obviously wasn't about each single chip; it was about the cost of all 16 of them. Comparing the cost of all 16, the RAM as a bundle is probably around the same price as the APU, probably a little more expensive.

It should however be the fastest reduction in costs for the PS4.
 
Perhaps you should use better words to explain yourself. In what world is 8GB of GDDR5, made up of 16 chips of 512MB each, considered a SINGULAR component?
In the world of computing? Are you new or something? Or are you just playing cute? Yes, memory is a singular component in a computing system, which is why it's always reported as a single number...
 

JP

Member
Xbox vs PS2 difference was LEAGUES bigger.

PS2 Xbox Difference

Bandwidth.......3.2GB/s..............6.4GB/s...............200%
Ram total........32MB..................64MB...................200%
FLOPS........... 6.2GFLOPS.........21.6GFLOPS ....... 350%
Fillrate.............1.2GT/s..............8GT/s...................670%


You know, I won't even go into details of how vastly superior the Geforce 3 architecture is over Graphics Synthesizer. GF3 was the first programmable shader GPU, the core of what you still see today in modern GPUs.


40% the largest we've seen El oh El.
Although the flaws in your reasoning will be picked up by everybody else here, you could at least get the numbers right. The same applies to all of the ones you list, but the percentage difference between 3.2GB/s and 6.4GB/s is 100%, not 200%.
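The underlying confusion is ratio versus percentage increase: a doubling is 200% *of* the original but only a 100% *increase*. In code:

```python
# Ratio vs percentage increase: the source of the "200% vs 100%" argument.
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

def ratio_pct(old, new):
    """New value expressed as a percentage of the old one."""
    return new / old * 100

print(pct_increase(3.2, 6.4))  # 100.0 -> "100% more"
print(ratio_pct(3.2, 6.4))     # 200.0 -> "200% of"
```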
 

timlot

Banned
They needed the esram cache because they went with ddr3 unlike sony and gddr5. I'm guessing MS didn't know sony would go for gddr5 or maybe they did. They used a cache with the xbox 360 and chose it because of this.

MS used eDRAM in the Xbox 360, so I see them putting eSRAM on the die as a continuation of a technology they've used before. I seriously doubt they didn't know Sony was using GDDR5. At this level of silicon engineering and corporate co-operation, they knew; it isn't as if all these components are some kind of secret. They decided to invest in Kinect, voice recognition, and multimedia integration, as well as acceptable (to me) next-gen graphics technology.
 

ekim

Member
What you said is wrong;

Let's assume I am (which I'm not entirely; see the swizzle, decompress, SHAPE audio, and video de/encode chips), what was your initial point? (aka MS is desperate)

They are simply saying, they have 15 special purpose processors - what is so desperate about it?
 

artist

Banned
Let's assume I am, what was your initial point? (aka MS is desperate)
Nope, my initial point was (giving MS the benefit of the doubt) suspecting VentureBeat of misinterpreting. I was wrong there.

They are simply saying, they have 15 special purpose processors - what is so desperate about it?
edit: Since you added this, I'll respond... nobody does that. Like I said earlier, my motherboard would have 10+ "processors" by the same logic.
 

ekim

Member
Nope, my initial point was (giving MS the benefit of the doubt) suspecting VentureBeat of misinterpreting. I was wrong there.


edit: Since you added this, I'll respond... nobody does that. Like I said earlier, my motherboard would have 10+ "processors" by the same logic.

That's nice and all, but said processors sit inside the APU for the most part and have (cache-)coherent access to memory. I doubt the 10+ processors on your mainboard have the same features.

Btw - which 10+ processors with the same logic are on your mainboard?
 

MoneyHats

Banned
Although the flaws in your reasoning will be picked up by everybody else here, you could at least get the numbers right. The same applies to all of the ones you list but the percentage difference between 3.2GBs and 6.4GBs is 100% not 200%.


I was thinking in terms of doubling: twice the size is 200% of the original, not a 200% increase. Since we're not talking about a performance increase but pool size, I can see how that can be misleading.
 

ToyBroker

Banned
You two are not alone. It may be heresy to say it, but a lot of people do want those multimedia features.

Oh really?

Because smart-TV boxes (at probably 1/5th the cost, and from big-name companies) that do all the features of the Xbone AND MORE haven't sold worth a poop.

More and more people are using DVRs than ever these days... the Xbone not being able to control them seems like a major issue.

http://forum.beyond3d.com/showthread.php?p=1762942#post1762942

Dave Baumann's view; take it as you wish. If the eSRAM is used effectively, he'd wager the Xbox One will far outstrip a 7770 and a 7790. As I said, take his views as you wish.

Christ almighty... the Dave quote again? What are you, SenjutsuSage's minion? :X We've already said it before, but if what Dave said was even close to being true or relevant, it would have been heavily reported on.
 

Ducktail

Member
Anyone who really thought the Xbox One was going to launch $100 more and significantly weaker was just being dumb or a crazy fanboy. It may, in fact, be weaker but it would never be so much so that third party games would look different.

You seem to forget that the cost of the Xbone is high due to Kinect. That $500 is not necessarily spent on the console itself.
 

Begaria

Member
Alright, I forget... who's the sonofabitch who posted a link to that misterxmedia LiveJournal?

I can't stop reading it now. I swear this dude might be mentally ill.

And reading the comments, it ALMOST seems like it's just him posting as different people with slightly better or worse English. And the inside interview he just posted? Holy fuck, it's gold.

I know the feeling! I've been reading that damn LiveJournal almost every day. It's a fascinating look into obsessive fandom. There's definitely some weird voodoo math going on over there as well. All the talk about "level 3/level 4 developers", Sony lies, and Microsoft being infallible is a lot of fun to read. Even when Microsoft presents cold hard facts, these guys take them to the extreme and stretch them as far as they can.

Can't wait for September 29th! Then nothing will come out, and they'll say, "Ohhhh, I guess we'll have to wait until the consoles come out!" When they do, tech blogs are going to rip both consoles open to dig around in their guts. Much crow shall be eaten when that happens.
 

ekim

Member
Semi-Accurate weighs in:
http://semiaccurate.com/2013/08/29/a-deep-dive-into-microsofts-xbox-ones-architecture/

If you think about the sheer volume of coherency data that needs to go between the two CPU blocks, Microsoft probably had to beef up the L2 to NB links to almost match that of the L1 to L2 links. While specifics were not given out, SemiAccurate was told it was “significantly wider” along with beefed up buffers and deeper queues. Don’t discount this as a minor change, it is both critical to the system performance and a very complex thing to do. It will be interesting to see how Sony did their variant if they ever give a talk on the PS4 architecture.

That means the XBox One’s 8 Jaguar cores are clocked at ~1.9GHz, something that wasn’t announced at Hot Chips. Now you know.

The CPU NB also has coherent links to the GPU MMU and I/O MMU, something you would expect on any system that takes GPU compute work seriously. AMD has their HSA/HUMA architecture coming with Kaveri in short order but XBO is based on a design ~1+ generations older so no advanced AMD CPU/GPU coherency here. Luckily Microsoft is on the ball here and put their own mechanism in which they would unfortunately not go in to detail on. What SemiAccurate has heard about it says they did a pretty impressive job but until it is fully disclosed we can’t comment with authority. Lets just leave things at, “From what we can tell it looks good”.

More at the link.
 