
Xbox One specs from Hot Chips session (8GB Flash, 1.31TFlops, 204GB/s peak BW)

Status
Not open for further replies.

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
No. 102GB/s is the ESRAM minimum and 204GB/s the peak.

That is absolute bullshit, it's 102GB/s peak reads and 204GB/s peak simultaneous reads and writes and neither figure will be achievable in real-world use.

How on earth anyone can think that translates to 102GB/s minimum bandwidth is laughable in the extreme.
 

Ushae

Banned
I'm very happy with the specs on both these platforms.

Now bring on games, innovative design and wow us. Use that touchpad, use Kinect and get every drop of power from these systems.

Can't wait :)
 

Flatline

Banned
That is absolute bullshit, it's 102GB/s peak reads and 204GB/s peak simultaneous reads and writes and neither figure will be achievable in real-world use.

How on earth anyone can think that translates to 102GB/s minimum bandwidth is laughable in the extreme.


So now we know where Leadbetter got his bullshit math from: Microsoft itself. They're adding reads and writes together without the existence of a second bus. Ridiculous.
 

Klocker

Member
I think this is exactly why they have it implemented.

Smart move, MS.

What is its purpose?

To reduce latency and to bring the DDR3 memory system up to speed when taken as a whole, using the ESRAM as a fast cache for the GPU... it does not need to be as big as the RAM pool for that.

The whole system was designed around the fact that MS and Sony both gambled on the availability of GDDR5 today; Sony won the gamble, and MS played it safe with a contingency.
 

allan-bh

Member
That is absolute bullshit, it's 102GB/s peak reads and 204GB/s peak simultaneous reads and writes and neither figure will be achievable in real-world use.

How on earth anyone can think that translates to 102GB/s minimum bandwidth is laughable in the extreme.

The slide says 102 GB/s minimum (or 109 GB/s, it's not very clear).
 

EagleEyes

Member
I'm very happy with the specs on both these platforms.

Now bring on games, innovative design and wow us. Use that touchpad, use Kinect and get every drop of power from these systems.

Can't wait :)
Now that's what I want to hear. :) We need more of this on GAF and less of the immature fanboyism that dominates these kinds of threads. Rock on.
 
Wrong, Microsoft was full of shit from the beginning:

http://www.techpowerup.com/184398/xbox-one-chip-slower-than-playstation-4.html

This is from May. The thread title is indeed misleading PR.

I thought so then, too. However, newer information since then, through both DF's development sources and especially now at Hot Chips, seems to directly contradict that initial assumption that they were simply cobbling bandwidths together.

Unless they're outright lying, which I really don't think is the case, the ESRAM peak theoretical bandwidth is indeed 204GB/s. I'm looking at the diagram from Hot Chips. I see 68GB/s for the DDR3 off by itself, a 30GB/s coherent bandwidth path off to one side, and then further down the diagram 4 x 8MB blocks of ESRAM totaling 32MB, labeled as having a minimum bandwidth of 109GB/s and a peak bandwidth of 204GB/s.

This isn't the first time a bandwidth figure from an MS console has been questioned. The 360's 256GB/s for its EDRAM was heavily doubted in the early days of the 360; I remember reading quite a bit of skepticism going around until it was later explained, I believe, by Dave Baumann on Beyond3D. People thought Microsoft was playing funny business then, too, but it turned out to be true.
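If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope tally in Python. The per-lane split and the combined totals are my own arithmetic, not figures the slide itself presents:

```python
# Rough tally of the bandwidth figures on the Hot Chips diagram (GB/s).
# The per-lane split and the combined totals are my own arithmetic,
# not anything the slide itself claims.

DDR3_BW     = 68.0    # main memory (DDR3) figure shown off by itself
COHERENT_BW = 30.0    # coherent CPU/GPU path (shown on the diagram; not summed below)
ESRAM_LANES = 4       # 4 x 8MB blocks = 32MB total
ESRAM_MIN   = 109.0   # "minimum" ESRAM figure on the slide
ESRAM_PEAK  = 204.0   # claimed peak with simultaneous reads and writes

print(f"per 8MB lane (min):  {ESRAM_MIN / ESRAM_LANES:.1f} GB/s")  # ~27.3
print(f"DDR3 + ESRAM (min):  {DDR3_BW + ESRAM_MIN:.1f} GB/s")      # 177.0
print(f"DDR3 + ESRAM (peak): {DDR3_BW + ESRAM_PEAK:.1f} GB/s")     # 272.0
```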

Now obviously I don't know what's going on, but neither do a lot of other people. As an example, Mark Cerny himself stated that it was possible to have an embedded memory chip with upwards of 1TB/s of bandwidth for the PS4, but they opted for a simpler design that was easier for developers to extract power from. He said he didn't want developers having to crack some puzzle to get at the system's power. After an unbelievable bandwidth figure like that, 204GB/s doesn't seem all that pie in the sky.

I know there's a tendency to be very skeptical of what Microsoft is saying, but they've more or less acknowledged, although not without some kicking and screaming, that they aren't packing the kind of raw horsepower that the PS4 is. I don't see what further point there is in lying about their memory bandwidth now, as if it would honestly make a big difference to public perception of the console, especially at Hot Chips of all places. They gave an honest presentation on Xbox 360 silicon at Hot Chips when the 360 was launching, and I don't see any reason why they wouldn't do the same this time with the Xbox One. They gain nothing at all by trying to wave a bigger penis on their ESRAM bandwidth.


I'm very happy with the specs on both these platforms.

Now bring on games, innovative design and wow us. Use that touchpad, use Kinect and get every drop of power from these systems.

Can't wait :)

I couldn't agree more. Both are great systems. Developers have what they need to blow our socks off in the coming months and years.
 
The slide is lying. In order to achieve that they'd need a dual bus ESRAM. X1 does not have one. It's the exact same bullshit Leadbetter was trying to spoonfeed us.

And you know this how? :)

Pretty big assumption there when there is so much information on both these consoles. I have a friend in development that I'll ask about this, though.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
The slide says 102 GB/s minimum (or 109 GB/s, it's not very clear).

It's utterly misleading. We know exactly where the 102GB/s figure comes from; it's simple math to do once you have the clockspeed, and it's based on 100% bus utilization, which is never seen in real-world use. The doubling to 204GB/s comes from the revelation that it's capable of simultaneous reads/writes, but that level of utilization will certainly never be seen either.
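To spell the "simple math" out (the 128 bytes per clock below is inferred from the published numbers, not something stated on any slide):

```python
# The "simple math" behind the ESRAM figures. The 128-bytes-per-clock path
# width is inferred from 102.4 GB/s at 800 MHz, not stated on any slide.

BYTES_PER_CLOCK = 128

for clock_mhz in (800, 853):
    one_way = BYTES_PER_CLOCK * clock_mhz / 1000   # GB/s at 100% utilization
    print(f"{clock_mhz} MHz: {one_way:.1f} GB/s one way, "
          f"{2 * one_way:.1f} GB/s if reads and writes truly doubled it")

# 800 MHz: 102.4 GB/s one way, 204.8 GB/s doubled
# 853 MHz: 109.2 GB/s one way, 218.4 GB/s doubled
# Note the slide's 204 GB/s peak sits below a straight doubling of the
# 853 MHz figure, so reads and writes presumably can't pair on every cycle.
```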
 

cchum

Member
PS4 = [image: CgqawpG.jpg]
Xbone = [image: Windows-8-Phone.jpg]
WiiU = [image: hVpONtm.jpg]
 

Chobel

Member
That is absolute bullshit, it's 102GB/s peak reads and 204GB/s peak simultaneous reads and writes and neither figure will be achievable in real-world use.

How on earth anyone can think that translates to 102GB/s minimum bandwidth is laughable in the extreme.

Yeah, that's really weird. No one can guarantee a minimum bandwidth.
PS: it says 109, not 102, but that's beside the point.
 

Reg

Banned
How do we know that? I've never seen him actually fight. But he looked like he could kick some ass.

I'll say this much about Lee: he was the most influential and important martial artist ever. And I think that's another reason why the comparison to Nintendo is apt.
 

pixlexic

Banned
There are so many computer engineers online, you would think MS would've known better.

Apparently they didn't read the Wikipedia teraflop article.

And the worst part is if/when it turns out to not matter. Like when these same statements were made about the PS3's GPU... then later we get games like GOW3. Nobody cares anymore about what was said. It's like a big aquarium and the fish keep going round and round and round.
 

StevieP

Banned
So this has become a console warrior troll thread along with a correction on MS' own BS figures. *shrug*

On the bright side, at least the specs are out in official channels now. We can go back to discussing games... right?
 

Majanew

Banned
It's utterly misleading. We know exactly where the 102GB/s figure comes from; it's simple math to do once you have the clockspeed, and it's based on 100% bus utilization, which is never seen in real-world use. The doubling to 204GB/s comes from the revelation that it's capable of simultaneous reads/writes, but that level of utilization will certainly never be seen either.

It's 109GB/s. The bump came from the increase to 853MHz.
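And that same clock bump is where the 1.31TFlops in the thread title comes from. Quick check (the 768 ALU count is the widely reported figure, not something on this slide):

```python
# Effect of the 800 -> 853 MHz bump on the ESRAM figure and GPU throughput.
# The 768 ALU count (12 CUs x 64) is the widely reported figure, not
# something taken from this slide.

ALUS = 768
BYTES_PER_CLOCK = 128   # ESRAM path width implied by 102.4 GB/s at 800 MHz

for clock_mhz in (800, 853):
    esram_gbps = BYTES_PER_CLOCK * clock_mhz / 1000   # one-way, GB/s
    tflops = ALUS * 2 * clock_mhz / 1e6               # 2 FLOPs per ALU per clock
    print(f"{clock_mhz} MHz: {esram_gbps:.1f} GB/s ESRAM, {tflops:.2f} TFLOPS")

# 800 MHz: 102.4 GB/s, 1.23 TFLOPS
# 853 MHz: 109.2 GB/s, 1.31 TFLOPS  <- matches the 109GB/s and the OP's 1.31TFlops
```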
 
Are you implying that Microsoft upgraded to dual bus ESRAM a couple of months before release and didn't tell us?

My point is that they've never really told us in any suitable detail how that ESRAM was designed. This is literally the first time we're learning that it's somehow 4 separate 8MB ESRAM blocks making up the combined 32MB that we know about. Before, we were all thinking it was one huge 32MB chunk of ESRAM. We've never been told, outside of some leaks, and even those didn't go into the level of detail required to tell us just what kind of memory we were dealing with. Who made it for them? How was it designed? How about some details on how it was incorporated into the GPU? There's a lot we don't know.

This wasn't done a couple of months before release. This was clearly done from the early architectural design stages of the system, or we wouldn't be looking at it now. They simply never went into the required detail on any of this stuff. Remember, it was Sony who was more forthcoming on the various details of the PS4 architecture, largely because they're so proud of the turnaround from the PS3. Microsoft was more cryptic and shy about giving out certain details, and now we have some more of them.
 

K1ng P3n

Member
So this has become a console warrior troll thread along with a correction on MS' own BS figures. *shrug*

On the bright side, at least the specs are out in official channels now. We can go back to discussing games... right?

Good luck with that, you know every MS thread turns into a versus thread.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
There's a lot we don't know.

Given your posts in any technical threads I'd amend that to:

"There's a lot SenjutsuSage doesn't know"

Starting with basic math.
 