Xbox One hardware breakdown by ExtremeTech.com after HotChips reveal

I used the words "no one" because it was in the context of previously released media boxes.

There's nothing "new" that Xbone brings to the table in multimedia functions and people know it--I mean, it can't even control the DVR.

No media box to date, save for Google TV, has come close to making a true attempt at integrating with your television. As for the DVR comment, let's be honest: the rise of on-demand channels and services is quickly replacing the DVR, and for those who watch a lot of sports, the potential for extra info through overlays and Snap mode could be a game changer. Excuse the pun.

While not a core gaming function, do not downplay the attempt to integrate into the home theater rack.
 

timlot

Banned
Pissing match?



Tell me more. Please.

The PS4 is a perfect piece of hardware to compare, considering how similar they are. The article in question mentions Onion and Garlic, two buses that are actually used in the PS4.

It's a very valid discussion.

What isn't valid is the PR poster that you decided to embed into your post.

Once again, thanks. It's appreciated.

The image that I posted was about the XBone, which is in the spirit of the OP. Nowhere in the OP is the PS4 mentioned, but I see you make your own rules on thread etiquette.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
It might be, might not be. It doesn't seem to be as conclusive as the article suggests.

At the very least, their method of reasoning is not valid.

First, they base their numbers on the assumption that the bandwidth of the bus depends only on clock and bus width. That is not the case for cache-coherent buses, since these must employ protocols for cache coherency. (Just google "cache coherency protocol overhead".)

Second, if their reasoning were valid, then the leaked Durango documents could not have shown a CPU clock of 1.6GHz and a maximum cache-coherent bandwidth of 30GB/s at the same time. Either the clock would have had to be higher or the bandwidth lower. But they do show those numbers.
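To put rough numbers on that, here's the back-of-envelope (the 128-bit bus width is an assumption on my part; it's what turns 30GB/s into the article's ~1.9GHz estimate):

#include <cstdio>

int main() {
    const double bus_bytes = 16.0;  // assumed 128-bit coherent bus width
    const double doc_clock = 1.6;   // GHz, from the leaked Durango docs
    const double doc_bw    = 30.0;  // GB/s, from the same docs

    // The article's implicit model: bandwidth = clock x bus width.
    printf("clock x width at 1.6GHz: %.1f GB/s\n", doc_clock * bus_bytes); // 25.6
    printf("clock implied by 30GB/s: %.3f GHz\n", doc_bw / bus_bytes);     // 1.875

    // 25.6 GB/s != 30 GB/s at 1.6GHz, so the leaked docs themselves are
    // inconsistent with the "clock x width" model.
    return 0;
}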
 

coldfoot

Banned
I see your point, but quite a few people carried Sony when the PS3 was $599, buying it solely for the fact that it was the best Blu-ray player; hell, the thing even played SACD. So it is fair to say that additional features do help sell a console. I don't think attach rates on the PS3 were lower because everyone reads Digital Foundry, but rather because gaming is a secondary feature on many - not most - PS3s.
Comparing Blu-ray playback, which was the successor to DVD and had tangible, visible improvements over it (not to mention the start of the HD flat-screen upgrade cycle), to voice controls (which have never worked well in the past) is ridiculous. It didn't help the PS3 that much to begin with, as PS3 sales figures were pretty bad for the first year. Besides, for the $60 that you'll spend for one year of Xbox Live, you can just get a Harmony remote and be done with it.
 

benny_a

extra source of jiggaflops
The image that I posted was about the XBone, which is in the spirit of the OP. Nowhere in the OP is the PS4 mentioned, but I see you make your own rules on thread etiquette.
Titanfall having critical success has nothing to do with a hardware breakdown (which is the topic of this thread).

I'm questioning your interest in hardware myself, given that you've interpreted the wifi frequency numbers in the PS4 devkit FCC filings as its CPU clockspeed. (Even casually reading about that topic on GAF would have shown that isn't the case.)

Enough backseat moderating from me though; it's very transparent what your objective is.
 

HokieJoe

Member
And that's how I took it.

It seems I wasn't the only one either.

My point still stands though. These boxes that do all these magic multimedia functions aren't worth a damn in the marketplace. No one is going to shell out $100, let alone $500, because it can control some other things on your multimedia rack.

People have been using universal remotes for a long time, and plain ass remotes for even longer--people like it that way. The sales of these multimedia boxes prove it.

This seems so very off. Do you get up to do these things now on your 360/PS3? No? I didn't think so. A $500 Xbone packed with multimedia features isn't going to change that. You're still going to swap from game, to Netflix, and now Blu-ray (did you forget it's in the Xbone now? You can bet anyone with Blu-rays will make the PS4/Xbone their new player of choice) without having to get up...same as you did before.

I mean, I love fancy gadgets and new ways to interact with my media just as much as the next guy...but let's not blind ourselves into thinking the Xbone is doing something fancy and revolutionary.



No, I wouldn't say that at all. The tablet landscape is littered with failed examples - see Apple and Microsoft. It wasn't until the technology advanced enough that the platform became viable - see iPad.

There is plenty of room for improvement over the current home theater control schema. The question is not if someone will get it right, it's when. Whether the XB1's implementation is the answer is another question entirely, but you're completely off base in your assumption that people are satisfied with the current living room control schema. Nothing better has come along to replace it yet, that's all.
 

ekim

Member
Any word on what GPGPU API the X1 will use?
From what I gathered, DirectCompute is lagging massively behind CUDA and OpenCL.

Or will they use C++ AMP? I'm not sure if it's built on DirectCompute, or whether we can expect a massive API update for DirectCompute this year or next.

And has anything been said about the API list for the X1 (Win32, WinRT, blablabla)?

I'm sure I've read C++ AMP somewhere... I can't find it though. I'll continue looking for it.

edit: yeah - it's C++ AMP :eek:
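For anyone curious what that looks like in practice: C++ AMP sits on top of DirectCompute/DX11 and compiles with Visual C++. A minimal vector-add sketch (sizes and names are just for illustration):

#include <amp.h>
#include <vector>
using namespace concurrency;

int main() {
    const int n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n);

    // array_views wrap host memory; the runtime copies to/from the GPU as needed.
    array_view<const float, 1> av(n, a), bv(n, b);
    array_view<float, 1> cv(n, c);
    cv.discard_data();  // cv is write-only here, so skip the copy-in

    // restrict(amp) marks the lambda as compilable for the accelerator.
    parallel_for_each(cv.extent, [=](index<1> i) restrict(amp) {
        cv[i] = av[i] + bv[i];
    });

    cv.synchronize();   // copy the results back to the host vector
    return 0;
}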
 

HokieJoe

Member
Just look at those many processors in my processor!

trinity-die.jpg



What bandwidth and latency do those mainboard pipes have? Not comparable, IMO. They're part of the PC's subsystems, but then, so is the HDD.
 
The image that I posted was about the XBone, which is in the spirit of the OP. Nowhere in the OP is the PS4 mentioned, but I see you make your own rules on thread etiquette.

In the spirit of the OP? The OP is a technical post - a post about the technical merits of the X1, not the games the X1 has.

You derailed this thread. Don't try to pin it on the people that brought up the PS4 tech as well, when the OP/article includes information on the PS4.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
From the logical diagrams I've seen, it looks like the CPU is two sets of quad cores. Could they be clocked at different rates?
 

artist

Banned
The only valid mention is the second one. The other stuff isn't even mentioned by MS as an SP processor.
PCIe, Display, NB .. yeah.

At this point, I'm somewhat thankful they didn't go all out, count all 768 ALUs, and say it has over 1000 processors (like they did for the cache).
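For scale, here's the back-of-envelope those 768 ALUs actually give you (the clock figures are the ones floating around at the time, so treat it as a sketch):

#include <cstdio>

int main() {
    const int    alus  = 768;    // 12 CUs x 64 ALUs each
    const double clock = 0.853;  // GHz after the rumored upclock (0.8 before it)

    // Each ALU can retire one fused multiply-add per cycle = 2 FLOPs.
    printf("peak: %.2f TFLOPS\n", alus * 2 * clock / 1000.0);  // ~1.31 (1.23 at 800MHz)
    return 0;
}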
 

ToyBroker

Banned
No, I wouldn't say that at all. The tablet landscape is littered with failed examples - see Apple and Microsoft. It wasn't until the technology advanced enough that the platform became viable - see iPad.

There is plenty of room for improvement over the current home theater control schema. The question is not if someone will get it right, it's when. Whether the XB1's implementation is the answer is another question entirely, but you're completely off base in your assumption that people are satisfied with the current living room control schema. Nothing better has come along to replace it yet, that's all.

Then I'm going to assume that those people have multiple remotes for everything.

And nothing better has come along because there really isn't anything else you can do to enhance the experience (MS PR phrase ftwinz) over the remote.

I highly doubt Xbone's implementation is the answer if it can't even control your DVR--people are seriously underestimating this oversight. That means you will STILL have to swap inputs in order to record.
 
Oh really?

Because smart-TV boxes (at probably 1/5th the cost--and from big-name companies) that do all the features of the Xbone AND MORE haven't sold worth a poop.

More people are utilizing DVRs than ever these days...the Xbone not being able to control one seems like a major issue.



Christ almighty...the Dave quote again? What are you, SenjutsuSage's minion? :X We've already said it before, but if what Dave said was even close to being true or relevant, it would have been heavily reported on.

Umm, no, you're simply just wrong. Way wrong. Dave, I'm fairly certain, understands this stuff a whole hell of a lot better than you and a lot of other people do. It's extremely funny that we have people suggesting a guy such as Dave Baumann somehow doesn't know what he's talking about on this matter.

A simple post from a guy like that carries with it a whole hell of a lot more significance than much of the usually nonsensical back and forth that you normally see people arguing about on forums day in and day out.

"if what Dave said was even close to being true" Goodness lol, this man knows AMD graphics hardware better than you, me and a lot of other people that post here probably ever will. He's far from being ignorant on this matter, and I'm pretty certain that what he said is something you can take to the bank as far as its likely accuracy is concerned. Attempt to discredit the man as much as you like, but you're literally wasting your time. You think his post needs to be heavily reported on to be significant considering the level of direct involvement and, no doubt, very low level understanding that this guy has with AMD graphics hardware?

Seriously, do some reading on the man.

http://www.guru3d.com/articles-pages/an-interview-with-ati-dave-baumann,1.html

http://www.anandtech.com/show/2679/9

There's a snowball's chance in hell of you or anybody else discrediting his input on any subject concerning AMD graphics hardware, or graphics hardware in general. The man literally wrote the book on the Xbox 360 graphics subsystem. I don't know, I'm kinda leaning towards the possibility that he knows what the fuck he's talking about? Just a tiny bit.

http://www.beyond3d.com/content/articles/4/1

But, I forgot, he isn't just wrong when he gives input on what he expects from the XB1's graphics performance. According to you, what he has to say isn't even relevant.
 
From the logical diagrams I've seen, it looks like the CPU is two sets of quad cores. Could they be clocked at different rates?

I don't think so. They are linked via a crossbar. There should be no difference between the two; software sees all the cores as fully connected, even though it is two CPU modules.
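If anyone wants to see whether the two modules really look identical in practice, the usual trick is to ping-pong a cache line between threads pinned to two cores and time it; a cross-module hop through the crossbar should cost noticeably more than a same-module one. A Linux-only sketch (which core numbers land on which module is my assumption):

#include <atomic>
#include <chrono>
#include <cstdio>
#include <pthread.h>
#include <sched.h>
#include <thread>

static std::atomic<int> flag{0};

static void pin_to_core(int core) {
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

int main() {
    const int iters = 200000;

    std::thread responder([&] {
        pin_to_core(4);  // assumed to sit on the second quad-core module
        for (int i = 0; i < iters; ++i) {
            while (flag.load(std::memory_order_acquire) != 1) {}
            flag.store(0, std::memory_order_release);
        }
    });

    pin_to_core(0);      // first module
    auto start = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        flag.store(1, std::memory_order_release);
        while (flag.load(std::memory_order_acquire) != 0) {}
    }
    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(
                  std::chrono::steady_clock::now() - start).count();
    responder.join();

    printf("avg round trip: %.0f ns\n", (double)ns / iters);
    return 0;
}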
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Then I'm going to assume that those people have multiple remotes for everything.

And nothing better has come along because there really isn't anything else you can do to enhance the experience (MS PR phrase ftwinz) over the remote.

It's not really on-topic in this thread, but just as a side note: I have a Logitech Harmony, arguably the best programmable "unified" remote control, and it is shit. Not sure if the Xbox can improve on that, since many problems are inherent to "fire-and-forget" IR orchestration, but there is definitely room for improvement. HDMI-CEC hasn't delivered on its promises either.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Umm, no, you're simply just wrong. Way wrong. Dave, I'm fairly certain, understands this stuff a whole hell of a lot better than you and a lot of other people do.

I just looked up that quote to see what all the fuss is about, and I don't see why it should be controversial. Both graphics cards referenced in the context of that quote have way less memory bandwidth than the XB1, so the statement that the XB1 will outperform them is perfectly credible.
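For reference, the peak figures that were being thrown around (all GB/s; whether you can simply add DDR3 and ESRAM together is its own debate, and the ESRAM number is the "minimum" quoted at Hot Chips):

#include <cstdio>

int main() {
    const double xb1_ddr3  = 68.3;   // 256-bit DDR3-2133 main memory
    const double xb1_esram = 109.0;  // 32MB ESRAM, Hot Chips "minimum" figure
    const double hd7790    = 96.0;   // 128-bit GDDR5 at 6GT/s
    const double hd7770    = 72.0;   // 128-bit GDDR5 at 4.5GT/s

    printf("XB1 aggregate: %.1f GB/s vs 7790: %.0f GB/s, 7770: %.0f GB/s\n",
           xb1_ddr3 + xb1_esram, hd7790, hd7770);
    return 0;
}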
 

ekim

Member
The image that I posted was about the XBone, which is in the spirit of the OP. Nowhere in the OP is the PS4 mentioned, but I see you make your own rules on thread etiquette.

Which spirit? This is about the Xbox One HW - it's totally allowed to compare it to the PS4 HW. Maybe I should add this to the OP :lol
 
I just looked up that quote to see what all the fuss is about, and I don't see why it should be controversial. Both graphics cards referenced in the context of that quote have way less memory bandwidth than the XB1, so the statement that the XB1 will outperform them is perfectly credible.

Exactly, on pure memory bandwidth alone the XB1 will trounce either of those two, but somehow bringing his quote into the discussion was this super controversial thing. Fact is, when you consider who the man is and his unquestionable level of involvement with and understanding of AMD graphics hardware, it carries some serious weight when he makes a statement like that.

Now you get an idea for what I've been dealing with. Bringing that quote into discussion makes you some kind of nutjob that believes in secret sauce.
 

timlot

Banned
Titanfall having critical success has nothing to do with a hardware breakdown (which is the topic of this thread).

I'm questioning your interest in hardware myself, given that you've interpreted the wifi frequency numbers in the PS4 devkit FCC filings as its CPU clockspeed. (Even casually reading about that topic on GAF would have shown that isn't the case.)

Enough backseat moderating from me though; it's very transparent what your objective is.

My post about the PS4 devkit being clocked at 2.75GHz was a question. Can't ask questions anymore? I don't have any objective. I'm just curious why every Xbone thread, no matter what the initial topic was, is bombarded with negativity. There was one thread where they called Albert Penello from MS a liar and every other name in the book because he said there weren't any CPU yield problems. The anti-MS crowd just seems really mean-spirited. But I'm done derailing, bye bye.
 

TheD

The Detective
They've said nothing about CPU clock.



It might be, might not be. It doesn't seem to be as conclusive as the article suggests.

The "1.9 Ghz" rumor is from a writer that is well known for being a complete idiot, so take it with a huge grain of salt.
 

in2repid

Neo Member
I very much disagree.

What propped the PS3 up in those early years wasn't the secondary feature of the Blu-ray player (which was a bonus, and not supported very well from the get-go--like most new standards) but the PlayStation brand name itself.

Note that I'm not trying to discount your assertion entirely, but I do believe there is data that shows that the Blu-ray player had more of an impact on purchase motivation than people may believe.

http://www.nielsen.com/us/en/newswire/2010/how-much-do-video-games-matter-when-buying-a-console.html

While Blu-ray capability and the PS3 price reduction are included most often overall, there is evidence that suggests that software and gaming strongly motivate these potential buyers, with the library of games cited by 62%. In addition, the desire to connect and play games with friends who already own a PS3 and the ability to play multiplayer games on the PlayStation Network that speak to the appeal of PS3 as a gaming platform are also included by many gamers, indicating that these respondents are motivated to purchase the system in part for its gaming capabilities. A specific game as the motivator is included by only one in every eight gamers (12%) intending to buy the system.
 

HokieJoe

Member
Then I'm going to assume that those people have multiple remotes for everything.

And nothing better has come along because there really isn't anything else you can do to enhance the experience (MS PR phrase ftwinz) over the remote.

I highly doubt Xbone's implementation is the answer if it can't even control your DVR--people are seriously underestimating this oversight. That means you will STILL have to swap inputs in order to record.


You're confusing market reality and opinion (of course, so could I). I have fifty-eleven devices and fifty-eleven remotes. I also have a Harmony One (which I think is subpar in terms of ergonomics/button layout, but anyway). The point is, even with the Harmony, my user experience is less than optimal because the remote is less than optimal.

I agree that it would be nice if the XB1 were to have DVR capability. However, the XB1's IR blaster should allow you to record via the DVR you already own.
 

stryke

Member
I'm just curious why every Xbone thread, no matter what the initial topic was about, is bombarded with negativity.

What negativity? I'm not seeing any here in this thread, unless you're construing objectively weaker hardware = negativity bombardment.
 

Bundy

Banned
no matter what the initial topic was, is bombarded with negativity. There was one thread where they called Albert Penello from MS a liar and every other name in the book because he said there weren't any CPU yield problems. The anti-MS crowd just seems really mean-spirited. But I'm done derailing, bye bye.
So you think it's your job to enter every thread and post "look how many games and awards the XBone has", just because it is a fact that the XBone has the weaker hardware?
--> back on topic
 
Nope. I can see that MS would up the CPU. It's not hard. I expect Sony to upclock it as well. Both of those are very low power processors and are much more likely to see clock increases over the GPU (since that would produce a lot more heat).

Not really. Those Jaguar cores have short pipelines, so they wouldn't be able to increase the clock speed dramatically. 100MHz? Sure.

PCIe, Display, NB .. yeah.

At this point, I'm somewhat thankful they didnt go all out and count all the 768 ALUs and say it has over 1000 processors. (like how they did for the cache)

Do you think fixed-function blocks work by magic? Good job trying to compare a memory controller with a media decoder.
 

Pug

Member
What negativity? I'm not seeing any here in this thread, unless you're construing objectively weaker hardware = negativity bombardment.

It all depends on what part of the hardware we are talking about, I suppose. Isn't SHAPE objectively stronger than its counterpart in the PS4? By the way, I don't know the answer to this; I'm guessing.
 

stryke

Member
It all depends on what part of the hardware we are talking about, I suppose. Isn't SHAPE objectively stronger than its counterpart in the PS4? By the way, I don't know the answer to this; I'm guessing.

It's reasonable to have this expectation I suppose, but it doesn't become objective until we actually have details on its capabilities other than "PS4 has a dedicated audio chip".
 

StudioTan

Hold on, friend! I'd love to share with you some swell news about the Windows 8 Metro UI! Wait, where are you going?
Then I'm going to assume that those people have multiple remotes for everything.

And nothing better has come along because there really isn't anything else you can do to enhance the experience (MS PR phrase ftwinz) over the remote.

I highly doubt Xbone's implementation is the answer if it can't even control your DVR--people are seriously underestimating this oversight. That means you will STILL have to swap inputs in order to record.

No you won't. It just means you'll need to use your DVR remote to set up the recording. You won't have to swap any inputs, because the XB1 sits after the cable box in the HDMI chain.
 

Pug

Member
It's reasonable to have this expectation I suppose, but it doesn't become objective until we actually have details on its capabilities other than "PS4 has a dedicated audio chip".

Ah, so Sony hasn't released details on their audio chip. Interesting.
So if SHAPE saves CPU cycles and, say, frees up a core or two, could it be that the Xbone may have some CPU headroom that the PS4 won't have? Again, I don't know, I'm just guessing!
 
What negativity? I'm not seeing any here in this thread, unless you're construing objectively weaker hardware = negativity bombardment.

Well, there's definitely enough negativity in the thread to go around, and while that post was off topic, I understand it to a certain degree. One can get so caught up in how "weak" the hardware is that they forget just how great some of the games coming to the platform look, or how well received some of them have been. Not exactly relevant in a thread about a hardware discussion, so people are right about that much, but it's good sometimes to hit the reset button and realize that we aren't exactly dealing with a system lacking real power here, because when some posters really get on a roll, you'd swear the Xbox One can't even handle a game of Pong at 1080p.

It's important to keep in mind that we are dealing with a system that's weaker in raw performance than the PS4, and the numbers certainly back this up, but the system is, by no stretch of the imagination, weak.
 

longdi

Banned
I don't get the multi-tasking home theater integration with the Xbone...I can do most tasks now with the PS3. Seems like some real PR talk about the wonderful world of Xbone media snapping. Really don't understand the hype.

If anything, Xbone + Kinect is just a voice-enabled Logitech Harmony remote to me.

IMO, I'd rather save the console processing power and do my own channel/input switching with my fingers rather than my voice..., and at least I don't have to turn on a console and an internet modem just to watch TV.
 
Well, there's definitely enough negativity in the thread to go around, and while that post was off topic, I understand it to a certain degree. One can get so caught up in how "weak" the hardware is that they forget just how great some of the games coming to the platform look, or how well received some of them have been. Not exactly relevant in a thread about a hardware discussion, so people are right about that much, but it's good sometimes to hit the reset button and realize that we aren't exactly dealing with a system lacking real power here, because when some posters really get on a roll, you'd swear the Xbox One can't even handle a game of Pong at 1080p.

It's important to keep in mind that we are dealing with a system that's weaker in raw performance than the PS4, and the numbers certainly back this up, but the system is, by no stretch of the imagination, weak.


Well said. Er, written as it were.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
People fed up with negativity in tech threads can still have fun by playing fallacy bingo.

techthread_bingob7sb4.png
 

ToyBroker

Banned
Umm, no, you're simply just wrong. Way wrong. Dave, I'm fairly certain, understands this stuff a whole hell of a lot better than you and a lot of other people do. It's extremely funny that we have people suggesting a guy such as Dave Baumann somehow doesn't know what he's talking about on this matter.

A simple post from a guy like that carries with it a whole hell of a lot more significance than much of the usually nonsensical back and forth that you normally see people arguing about on forums day in and day out.

"if what Dave said was even close to being true" Goodness lol, this man knows AMD graphics hardware better than you, me and a lot of other people that post here probably ever will. He's far from being ignorant on this matter, and I'm pretty certain that what he said is something you can take to the bank as far as its likely accuracy is concerned. Attempt to discredit the man as much as you like, but you're literally wasting your time. You think his post needs to be heavily reported on to be significant considering the level of direct involvement and, no doubt, very low level understanding that this guy has with AMD graphics hardware?

Seriously, do some reading on the man.

http://www.guru3d.com/articles-pages/an-interview-with-ati-dave-baumann,1.html

http://www.anandtech.com/show/2679/9

There's a snowball's chance in hell of you or anybody else discrediting his input on any subject concerning AMD graphics hardware, or graphics hardware in general. The man literally wrote the book on the Xbox 360 graphics subsystem. I don't know, I'm kinda leaning towards the possibility that he knows what the fuck he's talking about? Just a tiny bit.

http://www.beyond3d.com/content/articles/4/1

But, I forgot, he isn't just wrong when he gives input on what he expects from the XB1's graphics performance. According to you, what he has to say isn't even relevant.

Christ on a popsicle.

I know who he is. I know he works/worked for AMD.

Notice how I said true or relevant? Did you even read the context in which he said it?

The previous post was talking about the Xbone aiming for 7790 performance but in reality getting 7700-class performance (1.28 TFLOPS). Gee, where is the Xbone sitting, from Microsoft's own mouth? Right around 1.28 TFLOPS.

I never took offense to the source, neither did anyone else. I took offense to the quote being used as some kind of messiah message from the AMD gods to alleviate all fears of Xbone fans when compared to the PS4.

Even in the very thread it's from, he's barely quoted on it, and when asked to expand further or elaborate on "far and away", he declines to.

The quote even drops off the face of the discussion within a page or two.

So, as I said, if his quote contained any relevance to the power discussion between the two, sites would have reported on the matter since, as you put it, Dave is an AMD god or whatever. You've single-handedly posted that quote more times than the media has.

So I'm so very sorry for questioning the relevance of a quote that you are so highly fond of--and I guess I should also apologize on behalf of everyone else who completely ignored it.
 

Dunlop

Member
I don't get the multi-tasking home theater integration with the Xbone...I can do most tasks now with the PS3. Seems like some real PR talk about the wonderful world of Xbone media snapping. Really don't understand the hype.

If anything, Xbone + Kinect is just a voice-enabled Logitech Harmony remote to me.

IMO, I'd rather save the console processing power and do my own channel/input switching with my fingers rather than my voice..., and at least I don't have to turn on a console and an internet modem just to watch TV.

It is something that differentiates it from its competitors, which I think is great, as I questioned why I owned both a PS3 and a 360 when they were pretty much the same box outside of first party.

It has little merit to the "core" but could be a bigger factor with the casual market.
 

longdi

Banned
It is something that differentiates it from its competitors, which I think is great, as I questioned why I owned both a PS3 and a 360 when they were pretty much the same box outside of first party.

It has little merit to the "core" but could be a bigger factor with the casual market.

Not at $499, with a required internet connection, monthly subs, and Kinect-is-not-new.
 