
Xbox One hardware breakdown by ExtremeTech.com after HotChips reveal

Plinko

Wildcard berths that can't beat teams without a winning record should have homefield advantage
Why would Microsoft not announce that their CPU was 1.9Ghz? If it were true wouldn't they be making sure everyone knew?

Yeah, I'm a bit puzzled as to why this wouldn't be a huge announcement.
 

Pug

Member
Oh really?

Because smart-TV boxes (at probably 1/5th the cost--and from big name companies) that do all the features of the Xbone AND MORE haven't sold worth a poop.

More people are using DVRs than ever these days... the Xbone not being able to control them seems like a major issue.



Christ almighty... the Dave quote again? What are you, SenjutsuSage's minion? :X We've already said it before, but if what Dave said was even close to being true or relevant, it would have been heavily reported on.

I said take it as you wish.
 

gofreak

GAF's Bob Woodward
And one mystery is still unsolved:
[image: Xbox One SoC block diagram from the Hot Chips slides]


What's up with that black arrow between the CPU and eSRAM?

If you look at the 'special processors' on the left, a number of the processors listed relate to the Data Move Engines.

I guess the black line signifies that data can be copied by the DMEs through the attaching bus to the CPU. It's not represented as a bus in and of itself because that would confuse people into thinking the CPU has direct access. But they want to have 'something' to show there is a way to copy data from eSRAM to the CPU.
 

ToyBroker

Banned
I said take it as you wish.

And that's how I took it.

It seems I wasn't the only one either.

My point still stands though. These boxes that do all these magic multimedia functions aren't worth a damn in the marketplace. No one is going to shell out $100 bucks let alone $500 bucks because it can control some other things on your multimedia rack.

People have been using universal remotes for a long time and even plain ass remotes for even longer--people like it that way. The sales of these multimedia boxes prove it.

MS made a lot of good long-term choices.

1) The chip is large due to the eSRAM, but as process shrinks happen the cost will rapidly decrease. These are designed for 28nm, but 22nm should happen sometime during the second full year of the console's life, so 2015. They should get about a 30% reduction in size.

2) DDR3 is mass produced and won't go anywhere for a long time. The quantities for it dwarf the quantities for GDDR. The GDDR RAM Sony is using is cutting edge and thus will have a cutting-edge price and availability. Prices for high-end RAM take a long time to come down.

3) RROD is a problem that happened, but they are far from the only console maker with those problems. The system design is beautiful in its simplicity; they learned a lot from the Xbox 360 launch and you can see that in this design. The Xbox One is extremely quiet. I can't use my Xbox 360 Slim or my 40GB PS3 as a Netflix or Blu-ray/DVD player because of how loud they are, especially now.

4) Kinect sold over 20m units and is still selling. It's one of the most successful add-ons to a console ever, if not the most successful. There are a lot of people out there who will want to use the next one, and having it bundled in means it will get a whole lot more support than the previous one.




I've said it before, but I like the original idea of the Xbox One, and the new changes are still workable. I like the idea of being able to sit on my couch and not have to get up to change from Blu-ray to game to Netflix to game to Blu-ray again and so on. It may sound lazy, but sometimes my friends and I jump through 3 different games in an hour depending on who jumps on and who has what. I don't want to constantly get up or constantly look for the remote to change inputs.

This seems so very off. Do you get up to do these things now on your 360/PS3? No? I didn't think so. A $500 Xbone packed with multimedia features isn't going to change that. You're still going to swap from game, to Netflix, and now Blu-ray (did you forget it's in the Xbone now? You can bet anyone with Blu-rays will make the PS4/Xbone their new player of choice) without having to get up... same as you did before.

I mean, I love fancy gadgets and new ways to interact with my media just as much as the next guy... but let's not kid ourselves into thinking the Xbone is doing something fancy and revolutionary.
 
I dunno, they dropped the GPU upclock announcement randomly, so perhaps they don't care. Maybe they are saving it for another event.
This would be much more significant than that. This is something you'd tell your potential customers ASAP. It would be something you'd proudly proclaim as an advantage over your competitor. Unless they somehow were hoping Sony didn't find out?
 
This would be much more significant than that. This is something you'd tell your potential customers ASAP. It would be something you'd proudly proclaim as an advantage over your competitor. Unless they somehow were hoping Sony didn't find out?

PAX? TGS? Who knows? MS is hard to figure out these days.
 
Hey guys, have any of you read this? (Caution: English is not his 1st Language):



(source: http://misterxmedia.livejournal.com/132131.html)

So a good number of MS fanboys are going crazy about the above posted material lol, and of course the same guy wrote an "interesting" article last summer regarding Sony:

(http://misterxmedia.livejournal.com/98352.html)

Surely his name should be "Suzy" now and he should therefore be quiet, but I digress lol... Anyway, what do you guys think?
Lmfao. The shit that guy writes.
Holy fuck. Lmfao.
 
Please show me your research that shows many people are willing to buy $500 hardware and pay $60/year to control their TVs with voice, which historically has NEVER been reliable enough to use for anything. Apple and Google couldn't do it with devices where the microphone is next to your mouth; what makes you think MS will be able to do it flawlessly with a microphone 10 feet away?

And for those casuals who want full-body motion-control games, there's the 360 and Kinect at a much cheaper price. In fact, they may already have one. Why would they want to spend $500 plus the cost of games for games that will play the same?


I see your point, but there are quite a few people who carried Sony when the PS3 was $599, buying it solely for the fact that it was the best Blu-ray player; hell, the thing even played SACD. So it is fair to say that additional features do help sell a console. I don't think that attach rates on the PS3 were lower because everyone reads Digital Foundry, but rather that gaming is a secondary feature on many - not most - PS3s.

The prospect of how Kinect V2 will affect the buying decisions of families who already own V1 is interesting since Kinect is the first console add-on that hasn't been shat out and forgotten.

I guess we'll all find out by next Xmas.
 

Pug

Member
And that's how I took it. No, you took it as me being a minion to another poster.

It seems I wasn't the only one either. At this point, you are.

My point still stands though. These boxes that do all these magic multimedia functions aren't worth a damn in the marketplace. No one is going to shell out $100 bucks let alone $500 bucks because it can control some other things on your multimedia rack. I am.

People have been using universal remotes for a long time and even plain ass remotes for even longer--people like it that way. The sales of these multimedia boxes prove it.

Personally, I'd take the viewpoint of an AMD project manager (whose projects included the 7770 and 7790) over any post in this thread. But that's just me; call me crazy.
 

gofreak

GAF's Bob Woodward
Why would Microsoft not announce that their CPU was 1.9Ghz? If it were true wouldn't they be making sure everyone knew?

It seems a bit speculative, and I'm not sure the chain of logic used to conclude that - that there'd be no upside to an arbitrarily clockable interface - is bulletproof. I'd welcome more expert opinion, but wouldn't an interface that allows another, lower, clock be useful if they were trying to hit a power consumption/thermal target? From a pure performance point of view there might be no reason for it, but that's not the only consideration.

That's not to say it isn't the right figure; I'm just not sure there's no reason they'd want to temper that speed, as the writer assumes.

As for why they wouldn't announce it if it were the case - it's the same reason they wouldn't announce 1.5GHz, or why Sony hasn't announced the PS4's CPU clockspeed. In either case they're unlikely to exceed 2GHz, and even something like 2GHz - to the untrained eye - would raise question marks in simple comparisons with their last machines (3.2GHz).
 

ToyBroker

Banned
I see your point, but there are quite a few people who carried Sony when the PS3 was $599, buying it solely for the fact that it was the best Blu-ray player; hell, the thing even played SACD. So it is fair to say that additional features do help sell a console. I don't think that attach rates on the PS3 were lower because everyone reads Digital Foundry, but rather that gaming is a secondary feature on many - not most - PS3s.

The prospect of how Kinect V2 will affect the buying decisions of families who already own V1 is interesting since Kinect is the first console add-on that hasn't been shat out and forgotten.

I guess we'll all find out by next Xmas.

I very much disagree.

What propped the PS3 up in those early years wasn't the secondary feature of the Blu-ray player (which was a bonus, and not supported very well from the get-go--like most new standards) but the PlayStation brand name itself.
 

TheD

The Detective
http://forum.beyond3d.com/showthread.php?p=1762942#post1762942

Dave Baumann's view; take it as you wish. If the eSRAM is used effectively, he'd wager the Xbox One will far outstrip a 7770 and a 7790. As I said, take his views as you wish.

Hard facts are hard facts.
It is not any more powerful than an HD7770; no amount of extra bandwidth for framebuffers is going to change that unless AMD's cards are stupidly bandwidth-limited in that area (which seems very unlikely).
 

ToyBroker

Banned
Personally, I'd take the viewpoint of an AMD project manager (whose projects included the 7770 and 7790) over any post in this thread. But that's just me; call me crazy.

Uh, did you skip coldfoot's post? Hell, he even questioned the buying public's want for a fancy multimedia box before I did.

As for Dave, you can believe him if you want... but that quote has been posted plenty of times before you by the infamous SenjutsuSage and a few other rejects from Beyond3D. If it had any significance, it would have been heavily reported on. I'm not even sure if he's still with AMD, and if he is, he's most certainly not working on either of the consoles.

They would if it played games too. Why do you keep saying "no one"? I might even pick up two Xbones.

I used the words 'no one' because it was in the context of previously released media boxes.

There's nothing "new" that the Xbone brings to the table in multimedia functions, and people know it--I mean, it can't even control the DVR.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
It seems a bit speculative, and I'm not sure the chain of logic used to conclude that - that there'd be no upside to an arbitrarily clockable interface - is bulletproof. I'd welcome more expert opinion, but wouldn't an interface that allows another, lower, clock be useful if they were trying to hit a power consumption/thermal target? From a pure performance point of view there might be no reason for it, but that's not the only consideration.

Seems unlikely, since their formula does not take the protocol overhead for ensuring coherency into account. In addition, since the PS4 has the same 30GB/s of cache-coherent bandwidth it would also be clocked at this speed. However, the leaked documents for both consistently point at 1.6GHz, even though the 30GB/s figure was already present in them.
 

MaulerX

Member
Wasn't it reported somewhere that Microsoft left a lot of things out in Hotchips because they didn't want the competition to get a grip on their design? I suppose more news will be incoming.
 

ekim

Member
It seems a bit speculative, and I'm not sure the chain of logic used to conclude that - that there'd be no upside to an arbitrarily clockable interface - is bulletproof. I'd welcome more expert opinion, but wouldn't an interface that allows another, lower, clock be useful if they were trying to hit a power consumption/thermal target? From a pure performance point of view there might be no reason for it, but that's not the only consideration.

I really can't follow the math used there - I have an idea but this doesn't work out for the result of 1.9GHz.
 

ToyBroker

Banned
Wasn't it reported somewhere that Microsoft left a lot of things out in Hotchips because they didn't want the competition to get a grip on their design? I suppose more news will be incoming.

I'm pretty sure that was from before the Hotchips presentation, and coincidentally it was only really reported on by that crazy-ass blog we were talking about a little while ago.

http://misterxmedia.livejournal.com/132494.html#comments

The video is linked in the comments.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I really can't follow the math used there - I have an idea but this doesn't work out for the result of 1.9GHz.

Their formula is 1.9GHz * 256 bit / 2 (for a clock ratio of 1:2) = 243.2 Gbit/s = 30.4 GB/s (hence the ~1.9GHz needed to arrive at 30GB/s).

But as I said, cache-coherent buses have protocol overhead.
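To make the assumptions explicit, here's that same arithmetic as a tiny sketch (the 256-bit width and the 1:2 interface-to-CPU clock ratio are the article's guesses, not confirmed specs):

#include <cstdio>

// Back-of-the-envelope from the article's assumptions: a 256-bit coherent
// interface running at half the CPU clock. Not confirmed hardware specs.
int main() {
    const double bus_width_bits = 256.0;
    const double clock_ratio    = 0.5;   // assumed 1:2 interface-to-CPU ratio

    auto coherent_bw_gbps = [&](double cpu_ghz) {
        return cpu_ghz * bus_width_bits * clock_ratio / 8.0;  // bits -> bytes
    };

    std::printf("1.6 GHz CPU -> %.1f GB/s coherent bandwidth\n", coherent_bw_gbps(1.6));  // 25.6
    std::printf("1.9 GHz CPU -> %.1f GB/s coherent bandwidth\n", coherent_bw_gbps(1.9));  // 30.4
    return 0;
}

Note that 1.6GHz comes out to 25.6GB/s under the same assumptions, so the formula doesn't even line up with the leaked 1.6GHz plus 30GB/s combination in the first place, quite apart from the ignored protocol overhead.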
 

gofreak

GAF's Bob Woodward
Seems unlikely, since their formula does not take the protocol overhead for ensuring coherency into account. In addition, since the PS4 has the same 30GB/s of cache-coherent bandwidth it would also be clocked at this speed. However, the leaked documents for both consistently point at 1.6GHz, even though the 30GB/s figure was already present in them.

Good points.

Anyway, as I edited in, I don't see anyone publishing CPU clockspeeds whether they're 1.5 or 2GHz. To people trained to compare chips by clockspeed, any of the possible candidates will look 'bad' next to the clockspeeds in their last-gen systems. That's probably why they're happier to just say 'an 8-core x86'.
 
Maybe I should have just said, "XBOX ONE ROXXX and PS4 SUXXX".

It's just my opinion. There's a weird sense around here that if you haven't drunk the Sony kool-aid, something is wrong with you. A thread giving a technical breakdown of XBone hardware brings out the usual cast of "tech jargon characters" to bash it and bring up that the PS4 has more of this and 40% more of that, ROPs, and gigafooflopidiflops. I just don't think it's that simple, because the systems can do different things. It's just not an apples-to-apples comparison when you judge it as an overall system.

You specifically clicked on a link that led you to a thread about the technological aspects of the Xbox One. In short, a tech-related thread. People started comparing the technology to its nearest competitor... so you come out and say "Well, Xbox One has the games I want." Paraphrasing, of course.

Let me fill you in on something.

This isn't the thread for that. No one gives a shit unless they wanted to talk about it in the first place. It's shit like your post, DBZ shit and other garbage that derail threads.

Thanks.
 

McHuj

Member
In addition, since the PS4 has the same 30GB/s of cache-coherent bandwidth it would also be clocked at this speed.

It does? The VGLeaks docs seem to say that the coherent Onion/Onion+ buses share around 10GB/s, and there's less than 20GB/s for CPU to memory (non-coherent with the GPU).
 

artist

Banned
That's nice and all but said processors sit inside the APU for the most part and have (cache) coherent access to the memory. I doubt these 10+ processors on your mainboard have the same features.
Cache coherency is not a requirement for something to be called a processor. I doubt the Xbone APU has some of the features present on my motherboard as well...

Btw - which 10+ processors with the same logic are on your mainboard?
That's just MS's logic of calling every trivial block a processor.

Charlie has also been vehemently saying that the Xbone APU yields are disastrous.
 

Pug

Member
Uh, did you skip coldfoot's post? Hell, he even questioned the buying public's want for a fancy multimedia box before I did.

As for Dave, you can believe him if you want... but that quote has been posted plenty of times before you by the infamous SenjutsuSage and a few other rejects from Beyond3D. If it had any significance, it would have been heavily reported on. I'm not even sure if he's still with AMD, and if he is, he's most certainly not working on either of the consoles.



I used the words 'no one' because it was in the context of previously released media boxes.

There's nothing "new" that the Xbone brings to the table in multimedia functions, and people know it--I mean, it can't even control the DVR.

I believe what he says implicitly, and with very good reason.
 

ekim

Member
Seems unlikely, since their formula does not take the protocol overhead for ensuring coherency into account. In addition, since the PS4 has the same 30GB/s of cache-coherent bandwidth it would also be clocked at this speed. However, the leaked documents for both consistently point at 1.6GHz, even though the 30GB/s figure was already present in them.

Afaik the PS4 only has 20GB/s of cache-coherent bw?!

http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4
 

ToyBroker

Banned
1.9!! That's another 5-6% gain!

I think we ALL knew you'd show up here for that tidbit.

It's something that no one has reported on, and something significant enough that MS would have announced it immediately--which means it's probably inaccurate.


You know he's citing an article from May that's 100% guesswork, right?
 

dude819

Member
You seem to forget that the cost of the Xbone is high due to KINECT. That $500 is not necessarily spent on the console itself.

I do, in fact, remember that.

I am just saying that Microsoft would not price their system at $100 more, which I understand includes the price of Kinect, and then put themselves at a huge technical disadvantage that would show itself so soon (as in third-party titles this year or next).

The two systems will be similar enough that this will not be the case for a long while.

Will first-party Sony titles look better within a few years, like everyone's nerd savior Mark Cerny said? Sure. I just think third-party titles should remain fairly similar. Also, I think MS first-party titles will get better as they squeeze as much as they can out of the Xbox One.
 
I very much disagree.

What propped the PS3 up in those early years wasn't the secondary feature of the Blu-ray player (which was a bonus, and not supported very well from the get-go--like most new standards) but the PlayStation brand name itself.

Sony thought exactly what you wrote: gamers would get a second job so they could drop $600 on a PS3 just because of the PlayStation brand. They were proven incorrect.
 

ToyBroker

Banned
:) hugs ToyBroker

:D <3

Sony thought exactly what you wrote: gamers would get a second job so they could drop $600 on a PS3 just because of the PlayStation brand. They were proven incorrect.

I think you're misunderstanding me. I was saying that brand loyalty as a result of the PS2 (not secondary features like Blu-ray or Linux) is what kept the PS3 barely propped up in those early years... it was not the runaway success that Sony thought it was going to be at the time.
 
So the X1 CPU is clocked at 1.9GHz? I thought it was much less.

Do we know what the PS4 is clocked at?

Nope. I can see that MS would up the CPU clock. It's not hard. I expect Sony to upclock theirs as well. Both of those are very low-power processors and are much more likely to see clock increases than the GPU (since upclocking the GPU would produce a lot more heat).
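Rough sketch of that heat argument (every number below is invented purely for illustration, not a real XB1 or PS4 figure): at fixed voltage, dynamic power scales roughly linearly with clock, so the same percentage bump costs far more watts on the GPU, which is a much larger slice of the chip's power budget.

#include <cstdio>

// Illustrative only: dynamic power ~ C * V^2 * f, so at fixed voltage a clock
// bump adds power in proportion to how big the block already is.
// The power shares and total wattage are made up for this example.
int main() {
    const double chip_watts = 100.0;   // hypothetical total dynamic power
    const double cpu_share  = 0.15;    // assumed CPU slice of that power
    const double gpu_share  = 0.60;    // assumed GPU slice of that power
    const double clock_bump = 0.10;    // a 10% frequency increase

    std::printf("CPU +10%% clock: ~%.1f extra W\n", chip_watts * cpu_share * clock_bump);
    std::printf("GPU +10%% clock: ~%.1f extra W\n", chip_watts * gpu_share * clock_bump);
    return 0;
}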
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Afaik the PS4 only has 20GB/s of cache-coherent bw?!

Those are the maximum numbers for individual pathways. The important figure is the total maximum cache-coherent bandwidth (with respect to the CPU caches). My take on the info was that the ~20GB/s shown in the Orbis figures corresponds to the 20.8GB/s on the Jaguar modules in the Durango figures, and that the 10GB/s is additional cache-coherent bandwidth for the GPU. That reasoning is also based on the assumption that both systems have the same Jaguar cores, which hence have the same interconnect to the MMU.

The thing is that the Durango figures show the maximum bandwidth of individual pathways without showing interdependencies. Another example is the part of the figure that shows the move engines, where every single move engine is labeled with 25.6GB/s. The total bandwidth, however, is not the sum of those numbers, but also 25.6GB/s.
 

timlot

Banned
You specifically clicked on a link that led you to a thread about the technological aspects of the Xbox One. In short, a tech-related thread. People started comparing the technology to its nearest competitor... so you come out and say "Well, Xbox One has the games I want." Paraphrasing, of course.

Let me fill you in on something.

This isn't the thread for that. No one gives a shit unless they wanted to talk about it in the first place. It's shit like your post, DBZ shit and other garbage that derail threads.

Thanks.

Whoa, somebody woke up on the wrong side. My contention isn't with the thread, but with those who choose to sidetrack it into a pissing match. I don't recall the OP saying "here are some Xbone specs, now let's compare and contrast with the PS4 for the millionth time." Chill out though, it ain't that serious, bro.
 

coldfoot

Banned
In the world of computing? Are you new or something? Or are you just playing cute? Yes, memory is a singular component in a computing system. Which is why it's always reported as a singular number...
In the semiconductor industry, cost goes up exponentially with die size, so when you're talking about costs, the physical makeup of each component matters.
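A rough sketch of why, using the classic Poisson yield model: bigger dies mean both fewer candidates per wafer and a lower yield. The wafer cost and defect density below are invented for illustration; only the ~363mm² figure corresponds to the die size Microsoft quoted at Hot Chips.

#include <cmath>
#include <cstdio>
#include <initializer_list>

// Toy cost-per-good-die model. Wafer cost and defect density are made-up
// illustrative numbers, not real 28nm figures.
int main() {
    const double wafer_cost_usd = 5000.0;                    // hypothetical 300mm wafer
    const double wafer_area_mm2 = 3.14159 * 150.0 * 150.0;   // 300mm wafer area
    const double defect_density = 0.004;                     // defects per mm^2 (assumed)
    const double usable_frac    = 0.85;                      // crude edge-loss factor

    for (double die_mm2 : {100.0, 200.0, 363.0}) {           // ~363mm^2: XB1 SoC per Hot Chips
        double dies_per_wafer = usable_frac * wafer_area_mm2 / die_mm2;
        double yield          = std::exp(-defect_density * die_mm2);   // Poisson yield model
        double cost_per_good  = wafer_cost_usd / (dies_per_wafer * yield);
        std::printf("%6.0f mm^2: yield %5.1f%%, ~$%.0f per good die\n",
                    die_mm2, 100.0 * yield, cost_per_good);
    }
    return 0;
}

With these made-up inputs the 363mm² die is roughly ten times the cost per good die of the 100mm² one, not 3.6 times, which is the whole point about large dies.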
 

MaulerX

Member
I'm pretty sure that was from before the Hotchips presentation, and coincidentally it was only really reported on by that crazy-ass blog we were talking about a little while ago.

http://misterxmedia.livejournal.com/132494.html#comments

The video is linked in the comments.



Found it! This is where I read it: http://venturebeat.com/2013/08/26/m...se-details-are-critical-for-the-kind-of-expe/

Microsoft disclosed some details but left many important pieces out. Evidently, Microsoft doesn't want to tell all of its competitors about how well designed its system is.
 
Whoa, somebody woke up on the wrong side. My contention isn't with the thread, but with those who choose to sidetrack it into a pissing match. I don't recall the OP saying "here are some Xbone specs, now let's compare and contrast with the PS4 for the millionth time." Chill out though, it ain't that serious, bro.

Pissing match?

Whenever I see these technical threads where the XBone is just getting bashed with all this techno-mumbo-jumbo, I think, "well, what does this mean for actual games?"

Then I'm reminded...
[image: Xbox One games promotional poster]

Tell me more. Please.

The PS4 is a perfect piece of hardware to compare against, considering how similar the two are. The article in question mentions Onion and Garlic, two buses that are actually used in the PS4.

It's a very valid discussion.

What isn't valid is the PR poster you decided to embed in your post.

Once again, thanks. It's appreciated.
 
Any word on what GPGPU API the X1 will use?
From what I gathered, DirectCompute is lagging massively behind CUDA and OpenCL.

Or will they use C++ AMP? I'm not sure if it's built on DirectCompute, or whether we can expect a massive API update for DirectCompute this year or next.

And has anything been said about the API list for the X1 (Win32, WinRT, blah blah blah)?
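For what it's worth, Microsoft's C++ AMP implementation is layered on top of DirectX 11 / DirectCompute, so it's less a competing runtime than a C++ front end to it. A minimal sketch of what a C++ AMP kernel looks like (trivial scale-by-two example, just for illustration; builds with Visual C++ 2012 or later):

#include <amp.h>
#include <vector>
#include <cstdio>

using namespace concurrency;

int main() {
    std::vector<float> data(1024, 1.0f);
    array_view<float, 1> av(static_cast<int>(data.size()), data);

    // The lambda body runs on the accelerator (GPU) as a compute shader.
    parallel_for_each(av.extent, [=](index<1> idx) restrict(amp) {
        av[idx] *= 2.0f;
    });

    av.synchronize();              // copy results back to the host vector
    std::printf("%f\n", data[0]);  // prints 2.000000
    return 0;
}

Whether the X1 exposes anything beyond this and DirectCompute is exactly the open question being asked here.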
 