
DigitalFoundry: X1 memory performance improved for production console (ESRAM 192 GB/s)

Status
Not open for further replies.
Just out of interest, how come there has been so much focus on memory bandwidth anyway? Is it because both companies have taken different approaches? Is Microsoft intentionally trying to distract us from the fact that their $100 more expensive console has a far inferior GPU?

PS4 has:
50% more Shader units - 768 v 1152
50% more Compute units - 12 v 18
50% more Texture mapping units (TMUs) - 48 v 72
100% more Render output units (ROPs) - 16 v 32

Even if PS4 & Xbox One had the exact same memory bandwidth/subsystem there is still a huge disparity in GPU power for two consoles of the same generation (which are supposedly going head to head).

Memory bandwidth matters less when there is less processing power to feed. Higher bandwidth is more necessary for the PS4 because its GPU can crunch data faster.
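That balance can be put in rough numbers as bandwidth per unit of compute. A minimal sketch, assuming the commonly reported 2013 spec figures (1.84 TFLOPS and 176 GB/s for PS4; 1.23 TFLOPS and 68 GB/s DDR3-only for Xbox One), none of which are official:

```python
# Back-of-envelope bytes-per-FLOP comparison. The TFLOPS and bandwidth
# figures are the commonly reported 2013 specs, not official numbers,
# and the Xbox One line counts only DDR3 main memory (no eSRAM).
specs = {
    "PS4":      {"tflops": 1.84, "bandwidth_gbs": 176.0},  # unified GDDR5
    "Xbox One": {"tflops": 1.23, "bandwidth_gbs": 68.0},   # DDR3 only
}

for name, s in specs.items():
    # GB/s divided by GFLOPS = bytes of main-memory bandwidth per FLOP
    bytes_per_flop = s["bandwidth_gbs"] / (s["tflops"] * 1000)
    print(f"{name}: {bytes_per_flop:.4f} bytes/FLOP")
```

By this crude measure the PS4 pairs its extra compute with even more extra main-memory bandwidth; the Xbox One's eSRAM exists precisely to close that gap.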
 
Just out of interest, how come there has been so much focus on memory bandwidth anyway? Is it because both companies have taken different approaches? Is Microsoft intentionally trying to distract us from the fact that their $100 more expensive console has a far inferior GPU?

PS4 has:
50% more Shader units - 768 v 1152
50% more Compute units - 12 v 18
50% more Texture mapping units (TMUs) - 48 v 72
100% more Render output units (ROPs) - 16 v 32

Even if PS4 & Xbox One had the exact same memory bandwidth/subsystem there is still a huge disparity in GPU power for two consoles of the same generation (which are supposedly going head to head).

Where did you get the TMU numbers from?
 

Vestal

Gold Member
Dreamcast had a real VGA adapter and a modem years before other consoles did. It was dramatically ahead of its time. It's sad what happened to the DC, though a great deal of it was Sega's own fault.

Remember playing Virtua Tennis over that 56k connection!!!!

And that sexy-ass controller with pressure triggers, it was so revolutionary back then, awesome for baseball games.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Just out of interest, how come there has been so much focus on memory bandwidth anyway? Is it because both companies have taken different approaches? Is Microsoft intentionally trying to distract us from the fact that their $100 more expensive console has a far inferior GPU?
...

Two reasons:

1.) It looks like BS, simply because you don't, under any normal circumstances, discover a sudden 88% increase in bandwidth months before a console launches.
2.) It might have inadvertently confirmed a downclock, which is quite amusing since it appears to be spin in the first place.

The downclock would be only marginal, and probably wouldn't have much of an effect.
 

guch20

Banned
It's amazing, the universal love for the Dreamcast. Damn, I miss Sega.

Same. I fondly remember playing NFL2K, Sonic Adventure, Power Stone, and Ready 2 Rumble all day and all night on launch day. Some of those haven't aged well, but the console itself was amazing for its time.

And the ads! "It's thinking."

http://youtu.be/EH9Xx-nXTo8

Dreamcast had a real VGA adapter and a modem years before other consoles did. It was dramatically ahead of its time. It's sad what happened to the DC, though a great deal of it was Sega's own fault.

Yeah, whoever decided it would be a good idea to flood the market with hardware in a short span of time should have their mouth peed in. Sega CD, 32X, Saturn, Dreamcast...it was too much, too fast.
 
This thread is fun.
As the general consensus is still that this is bunk, and apparently a post I made a few pages ago was ignored, I'll venture a repost for the sake of on-topic discussion:
This being from the starting point, not a before-after comparison of the total performance (if I have 5 apples and I double my apples to 10, it's a 100% increase in apples but also only 50% more apples when compared to the original amount.)
If you have 10 apples and previously had 5 apples you have 100% more apples than previously. 50% more apples than an original amount of 5 apples is 7.5 apples. Perhaps the comment was ignored because it was based on a false premise.
 
Alright if last gen was the move to HD, I want this gen to be the move to high IQ. I don't want to see aliasing anywhere. It ruins high quality visuals so much. Those 6 extra CUs on the PS4 better at least give me extra high IQ on games designed to not look too gimped on Xbox One.
 

Pwn

Member
Man, an eSRAM with only 32 megabytes. I highly doubt it can do anything significant in terms of graphics.

Moreover, the industry will get rid of DDR3 as VRAM in a year or two. This will drive GDDR5 prices down, and the industry will adopt it as the standard VRAM.

So the Xbox One's specs will seem dated after two years.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Alright if last gen was the move to HD, I want this gen to be the move to high IQ. I don't want to see aliasing anywhere. It ruins high quality visuals so much. Those 6 extra CUs on the PS4 better at least give me extra high IQ on games designed to not look too gimped on Xbox One.

This is up to developers to an extent; you should really get into PC gaming if IQ is such a significant priority.
 

astraycat

Member
So the same guy went into a little more detail about how DDR3+eSRAM can sometimes be better than GDDR5. He may be full of shit, but he sounds knowledgeable.



So does this clear anything up or just muddy the waters further?

The scenario posted omits a couple things: caches, and memory controllers.

Looking at the GCN whitepaper, you'll see a block with Z$, C$ and Color ROP/Z Stencil ROP units. The Z$ is the depth buffer cache and the C$ is the color buffer cache.

It's only in the case of a miss (and when flushing) that you'll actually need to go out to main memory (or eSRAM) to fetch whatever information you need. I hope the scheduling is smart enough to pipe fragment data from wavefronts to ROP groups that have already been working on that portion of the screen, so as to minimize reading/flushing of the caches.

Also, in the case of depth (depending on operating mode) the rasterizer can cull entire triangles based on early-Z, and this process is sped up by having a hierarchically compressed Z buffer for fast min/max depth on a region.

Lastly, as the diagrams in the whitepaper show, there are several individual memory controllers. I don't know much about how memory controllers work, but I'd assume that they can each individually read or write on a given cycle (please correct me if I'm wrong here).
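The early-Z point above can be sketched as a toy model. Everything here is a simplifying assumption (the 8x8 tile size, whole-tile updates), not actual GCN behavior:

```python
TILE = 8  # assumed 8x8-pixel tiles, purely illustrative

class HiZBuffer:
    """Coarse per-tile occlusion buffer: smaller depth = closer to camera."""

    def __init__(self, width, height, far=1.0):
        self.tiles_x = width // TILE
        self.tiles_y = height // TILE
        # farthest depth an occluder has reached in each tile so far
        self.tile_max = [[far] * self.tiles_x for _ in range(self.tiles_y)]

    def can_cull(self, tx, ty, tri_min_depth):
        # If the triangle's *closest* point is still behind the farthest
        # occluder depth in the tile, no pixel of it can be visible there,
        # so the per-pixel depth tests (and cache traffic) are skipped.
        return tri_min_depth > self.tile_max[ty][tx]

    def record(self, tx, ty, frag_max_depth):
        # Tighten the bound after drawing (simplified: assumes the draw
        # covered the whole tile).
        self.tile_max[ty][tx] = min(self.tile_max[ty][tx], frag_max_depth)

hiz = HiZBuffer(64, 64)
print(hiz.can_cull(0, 0, 0.5))  # False: nothing drawn yet
hiz.record(0, 0, 0.2)           # draw an occluder at depth 0.2
print(hiz.can_cull(0, 0, 0.5))  # True: 0.5 is behind 0.2 everywhere
```

The real hardware stores min/max per region and handles partial coverage; the sketch only shows why a coarse depth bound lets whole triangles be rejected cheaply.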
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
I do play PC games. It lets me know how much poor IQ ruins console games. TTT2 :(

Yeah, then you know that it's a trade-off situation: more graphics on the screen vs. higher IQ.

I'd love for games to come with a priority option you can set in the settings menu.
 

guch20

Banned
remember playing virtua tennis over that 56k connection!!!!

and that sexy ass controller with pressure triggers, was so revolutionary back then, awesome for baseball games.
I remember playing one of the NFL2K games online. Me and my brother on the same console, taking on all comers.

The first game we played online--ever--was Chu Chu Rocket.
 

I2amza

Member
Oops, crap. Didn't see you posted this and posted his whole blurb. He does specifically say it's better for some things, but I'm wondering if that's true. Only because I've read everywhere that Sony's solution is not only more powerful, but more elegant. But I'm no techie.

If it turns out Xbox has the better GPU solution again, I think a lot of folks will be munching on crow.

He later replied in the same thread that he wasn't really saying the PS4 GPU was weaker. He was just talking about how to get the most out of the X-One.

Oh joy, I never said it was better, just making the best use of what it's got. It's more "efficient" at what it has got to play with.
Better? That's pie-in-the-sky stuff.

PSU Post 164
 
Yeah, then you know that it's a trade-off situation: more graphics on the screen vs. higher IQ.

I'd love for games to come with a priority option you can set in the settings menu.

Well, that's why I was talking about multiplatform games designed to not look significantly worse on Xbox One. For example, something like BF4. If they are making it to have the same assets and interactivity on both PS4 and Xbox One, I want those extra 6 CUs on PS4 at least pumping out extra high IQ.
 

guch20

Banned
The scenario posted omits a couple things: caches, and memory controllers.

Looking at the GCN whitepaper, you'll see a block with Z$, C$ and Color ROP/Z Stencil ROP units. The Z$ is the depth buffer cache and the C$ is the color buffer cache.

It's only in the case of a miss (and when flushing) that you'll actually need to go out to main memory (or eSRAM) to fetch whatever information you need. I hope the scheduling is smart enough to pipe fragment data from wavefronts to ROP groups that have already been working on that portion of the screen, so as to minimize reading/flushing of the caches.

Also, in the case of depth (depending on operating mode) the rasterizer can cull entire triangles based on early-Z, and this process is sped up by having a hierarchically compressed Z buffer for fast min/max depth on a region.

Lastly, as the diagrams in the whitepaper show, there are several individual memory controllers. I don't know much about how memory controllers work, but I'd assume that they can each individually read or write on a given cycle (please correct me if I'm wrong here).
Even though I'm not a techie, I love reading this stuff. Thank you!
 

guch20

Banned
He later replied in the same thread that he wasn't really saying the PS4 GPU was weaker. He was just talking about how to get the most out of the X-One.



PSU Post 164
What confuses me is how he says it's more efficient at what it's got to work with. Granted, my knowledge of such things is limited, but wouldn't better efficiency mean better performance?
 

Vestal

Gold Member
Alright if last gen was the move to HD, I want this gen to be the move to high IQ. I don't want to see aliasing anywhere. It ruins high quality visuals so much. Those 6 extra CUs on the PS4 better at least give me extra high IQ on games designed to not look too gimped on Xbox One.

I have a feeling that this gen will be the online-everything gen, with services up the wazoo.

Not really happy about that, since I know we are going to get games with online components duct-taped on.
 

ethomaz

Banned
He later replied in the same thread that he wasn't really saying the PS4 GPU was weaker. He was just talking about how to get the most out of the X-One.

PSU Post 164
lol I got quoted.

How he defined the platforms is in line with how I see both:

"PS4=Way more PC like, and straightforward, you don't really need to go looking for optimizations unless you somehow need them.

Xbox One=More like a supercharge 360 in terms of having access to that edram, which also makes it more complicated."
 

guch20

Banned
I have a feeling that this gen will be the online-everything gen, with services up the wazoo.

Not really happy about that, since I know we are going to get games with online components duct-taped on.

So basically, a repeat of this gen.

 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
So does this clear anything up or just muddy the waters further?

I have no idea where he got the idea that the ESRAM has "zero" latency or that it allows for "free" z-buffer tests.

The other thing he is talking about is that the Xbox has two buses from the GPU to memory: one to the ESRAM and the other to main memory. He concludes that you can read on one bus and write on the other concurrently. While this may be true, GDDR5 can also issue read and write requests on the same cycle, so I am not sure where the benefit is supposed to be.
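The arithmetic behind the "two concurrent buses" argument, as a minimal sketch: the 68 / 102.4 / 176 GB/s figures are the ones circulating in this thread, and the summed number is a theoretical best case, not a sustained rate:

```python
# Peak-bandwidth arithmetic for the two topologies under discussion.
# Figures are the commonly quoted ones from this thread, not official,
# and the summed number is a theoretical best case.
ddr3_gbs = 68.0     # Xbox One main memory
esram_gbs = 102.4   # eSRAM, pre-"192 GB/s" figure
gddr5_gbs = 176.0   # PS4 unified pool

# Reading from one pool while writing to the other sums the buses, but
# only for workloads that split their traffic exactly that way.
xb1_combined = ddr3_gbs + esram_gbs
print(f"XB1 best-case split traffic: {xb1_combined:.1f} GB/s")
print(f"PS4 single pool, any mix:    {gddr5_gbs:.1f} GB/s")
```

Which is why the comparison hinges on how often real workloads actually split reads and writes across the two pools.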
 
This thread is fun.
If you have 10 apples and previously had 5 apples you have 100% more apples than previously. 50% more apples than an original amount of 5 apples is 7.5 apples. Perhaps the comment was ignored because it was based on a false premise.

Ouch... that hurt MY feelings. Sadaharu owns.
 

CLEEK

Member
Oh man. I stopped reading GAF mid-afternoon on Friday, and came back to this! Insane that there is more drama in this thread about something this trivial (in the scheme of things) compared to all the Durango/Orbis/Xbox One/PS4 hardware threads so far.

Man, an eSRAM with only 32 megabytes. I highly doubt it can do anything significant in terms of graphics.

Moreover, the industry will get rid of DDR3 as VRAM in a year or two. This will drive GDDR5 prices down, and the industry will adopt it as the standard VRAM.

So the Xbox One's specs will seem dated after two years.

This is nonsense.

There is absolutely no doubt the ESRAM will mitigate the low bandwidth of the main memory. It is only up for discussion whether that ESRAM can offer results comparable to the PS4 and its unified 8GB of GDDR5. The use of embedded RAM is not in itself bad hardware design, and 32MB is ample for its designed purpose.
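A quick size check backs up the "32MB is ample" point. The render-target formats here (RGBA8 color and D24S8 depth/stencil at 1080p) are illustrative assumptions:

```python
# Does a 1080p color target plus depth/stencil fit in 32 MB of eSRAM?
# The formats (RGBA8 color, D24S8 depth/stencil) are illustrative choices.
ESRAM_BYTES = 32 * 1024 * 1024

def target_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

color = target_bytes(1920, 1080, 4)  # RGBA8 back buffer
depth = target_bytes(1920, 1080, 4)  # D24S8 depth/stencil
total = color + depth

print(f"{total / 2**20:.1f} MiB used of {ESRAM_BYTES / 2**20:.0f} MiB")  # 15.8 MiB of 32 MiB
```

Roughly half the eSRAM is left over for other targets, though fatter formats, extra G-buffer targets, or MSAA eat the remainder quickly.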
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Lastly, as the diagrams in the whitepaper show, there are several individual memory controllers. I don't know much about how memory controllers work, but I'd assume that they can each individually read or write on a given cycle (please correct me if I'm wrong here).

Yes, every memory controller drives 2x32-bit channels of the memory interface independently.
 

Biker19

Banned
You make it sound like Sony hasn't been one of the technical leaders of music hardware in the industry.

Yeah, Sony knows what they're doing. They're also a hardware company after all, unlike Microsoft.

Just wanted to post that everyone who keeps bringing up launch/launch-window/first-year titles to show the full power of the system should tame their expectations a bit.

Yes, the hardware is closer to a PC than in other generations, but the devs on both sides have been working on incomplete (as in alpha, beta, not final) dev kits for a good chunk of their time. Which means they were never able to target final specs (until recently) because the specs kept changing.

Food for thought.

Exactly, & I just laugh at people that think the PS4 isn't powerful enough when the games Sony showed off at E3 were run off of early dev kits & with 4 GB of GDDR5 RAM.
 

Dlacy13g

Member
I have enjoyed this. Memories of PS1, Dreamcast, and blue shadows.

Agreed, it's been a fun read, but at some point (if this happened sometime yesterday then excuse my ignorance) I wish someone would have gotten DF to thoroughly spell out what they meant or how they came to their numbers, as in, were they provided to them without verification, etc...
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
Agreed, it's been a fun read, but at some point (if this happened sometime yesterday then excuse my ignorance) I wish someone would have gotten DF to thoroughly spell out what they meant or how they came to their numbers, as in, were they provided to them without verification, etc...

Some users have tried, tweeting at them and the like, and there are many comments on the article page itself. No answers were given.

I'm surprised Digital Foundry is going to shit up its reputation over it.
 

guch20

Banned
Some users have tried, tweeting at them and the like, and there are many comments on the article page itself. No answers were given.

I'm surprised Digital Foundry is going to shit up its reputation over it.
To be fair, they probably tried to get clarification from Microsoft, whose tech guys are furiously working with the PR guys on how best to explain it without explaining a goddamn thing.
 

I2amza

Member
A recap of this thread:

- DF link for "X1 eSRAM performance increase by 88%"

- 1st couple pages of people saying good news, and others asking if it makes it better than PS4, and even some saying "CBOAT fail" - CBOAT Fail

- Some members finding that the math is wrong and in conclusion = Downclock!

- Same members stating that the DF article is pure PR from MS

- Finding out that DF's source is MS directly (pure speculation imo) due to a Tweet. Tweet Image

- At the same time we had people not knowing the movie Spaceballs............

- Oldergamer kept pulling stuff out of his arse and when asked for proof, he kept refusing and his answer was always: "It's just a simple google search away!". Needless to say it didn't go too well for him. And let's not forget this golden post here:

I keep telling people the GPU differences aren't going to manifest it self in real world. MS put a ton of extra hardware in the xbOne to allow for preventing stalls in the graphics pipeline. They've done a ton of research to determine how much GPU's sit idle and the points they most like sit stalled waiting for data.

Even with the few areas that the GPU is PS4 is faster ( and it's only in a few areas ) ps4 developers are going to run into stalls and having the GPU sit idle more then the GPU in the xbOne. It's going to make the performance much more of a wash then you are all expecting.

- Ultimatums to Oldergamer by Bish:
Ultimatum 1
Ultimatum 2
Ultimatum 3

- People also found out that specialguy is Rangers from B3D, not that it means much tbh.

- Meanwhile we had some members rambling about the cloud, cloud size, cloud cost, cloud etc....

- Then some members had a persecution breakdown of some sort and believed all of GAF was against them.

- Afterwards people started arguing about full console specs with some stating that the specs were not finalized or it's all about games (in a tech analysis thread...) and also started comparing PS3 and 360 once again.

- Then we started comparing E3 showings of Infamous SS and Forza 4. With Hawk269 saying he played the game at E3 without showing proof. The problem is that according to many, the game was not playable for anyone. Guess now he has more time for his "special showings". Here is Bish's cease and desist: Send me credentials naoooo!

Yep. This has been a pretty damn good thread so far.
 

Vestal

Gold Member
A recap of this thread:

...

Yep. This has been a pretty damn good thread so far.

You should have included the 2 ultimatums by Bish. This thread has had everything.
 

CLEEK

Member
Oh god... this can't be real, can it? I've been using Excel to plan our company budget for years D:

I would normally shy away from such a massive thread derail, but god damn at this thread already.

That article states that 88% of spreadsheets contain errors, not that Excel is causing the errors.
 

Gestault

Member
This thread is fun.
If you have 10 apples and previously had 5 apples you have 100% more apples than previously. 50% more apples than an original amount of 5 apples is 7.5 apples. Perhaps the comment was ignored because it was based on a false premise.

For one, even assuming my specific example were wrong, I assume you can follow the reasoning to a point of being able to address it.

And secondly, a 100% increase to 5 apples gives you 10 apples. 5 apples is 50 percent of 10 apples. Which is why I wrote:

If I have 5 apples and I double my apples to 10, it's a 100% increase in apples but also only 50% more apples when compared to the original amount.
 
Some users have tried, tweeting at them and the like, and there are many comments on the article page itself. No answers were given.

I'm surprised Digital Foundry is going to shit up its reputation over it.

I was not aware that DF had a reputation to shit up in the first place. Richard Leadbetter is well known for being a blatant MS and 360 fanboy, and because of this almost all of their cross-platform 360-versus-PS3 game comparisons were unreadable, biased trash.
 

Suikoguy

I whinny my fervor lowly, for his length is not as great as those of the Hylian war stallions
I was not aware that DF had a reputation to shit up in the first place. Richard Leadbetter is well known for being a blatant MS and 360 fanboy, and because of this almost all of their cross-platform 360-versus-PS3 game comparisons were unreadable, biased trash.

I thought they at least had some clout regarding spec information... guess not.
 
For one, even assuming my specific example were wrong, I assume you can follow the reasoning to a point of being able to address it.

And secondly, a 100% increase to 5 apples gives you 10 apples. 5 apples is 50 percent of 10 apples.
I'm sorry, I don't follow what your example was supposed to show or how it relates to the 88%. Feel free to elaborate?

Yes, 5 apples is 50% of 10 apples. Yes, 5 apples is 50% less than 10 apples. While 10 apples is a 100% increase in apples from 5 apples. We've had this song and dance before around the difference in FLOPS. What is that supposed to show with regard to these numbers?

We know 192 is 88% more than 102. What people are seemingly querying is why is the new figure 192. A doubling, i.e. a 100% increase, would be 204.

What exactly are you trying to say with your example?

EDIT: I see you edited. No, 10 apples is not 50% more than 5 apples when using the original amount as the base; it is 100% more. 10 is not 50% more than 5.
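For anyone still following the apples, the disputed arithmetic is easy to check directly. A minimal sketch using the 102.4 GB/s and 192 GB/s figures under discussion:

```python
# Percentage-increase arithmetic from the apples argument, applied to the
# bandwidth figures in the article (102.4 GB/s old, 192 GB/s claimed).
def pct_increase(old, new):
    return (new - old) / old * 100

print(pct_increase(5, 10))       # 100.0: doubling is a 100% increase
print(pct_increase(102.4, 192))  # ~87.5; the headline's "88%" uses 102 as the base
print(102.4 * 2)                 # 204.8: what a true doubling would give
```

So 192 GB/s is an ~88% increase over the old figure, and a genuine doubling would have landed at about 204 GB/s, which is exactly the gap people are arguing over.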
 