
DigitalFoundry: X1 memory performance improved for production console (ESRAM 192 GB/s)

Status
Not open for further replies.

benny_a

extra source of jiggaflops
I have to contest this Infamous: Second Son point. Yes, it has pop-in and 2d trees, but what do you expect? The PS4 does not have limitless power and this is a launch window title. It's going to have problems even when it ships.

But when I go to the outside of the map and then look in the sky and then jump, then I will not get any pop-in, I will get 60 FPS, and it will be glorious. ;-)

I found a comment from an Indie dev about the article and Leadbetter:

http://www.psu.com/forums/showthrea...-hugely-underestimated-claim-developers/page3

I don't know what to make of this - it sounds reasonable, but AFAIK it is not possible to have a bus that can read and write at the same time. The depth/color block also sounds familiar if you've read the GCN white paper, but I'm not sure here.
I'm very skeptical when someone talks about DDR3 + 32 MB eSRAM as the superior choice. Assuming similar yields, the DDR3 + 32MB eSRAM combo is less expensive. Of course that's a big assumption.

Also, am I reading it right that the poster is using the efficiency gambit?
 

klaus

Member
But when I go to the outside of the map and then look in the sky and then jump, then I will not get any pop-in, I will get 60 FPS, and it will be glorious. ;-)

Damn that just reminded me of the one map in Quake 1 where you could get outside of the map and see lots of sky - and rain hell down on unsuspecting players :p

Sorry for the off-topic tangent, but I'm really just waiting for Bish's verdict..
 

ekim

Member
But when I go to the outside of the map and then look in the sky and then jump, then I will not get any pop-in, I will get 60 FPS, and it will be glorious. ;-)


I'm very skeptical when someone talks about DDR3 + 32 MB eSRAM as the superior choice. Assuming similar yields, the DDR3 + 32MB eSRAM combo is less expensive. Of course that's a big assumption.

Also, am I reading it right that the poster is using the efficiency gambit?

I don't know if he has gotten his hands on devkits, but years ago he developed his own 3D engine - so he has at least some knowledge of the matter. But it sounds fishy - yeah.
 

GribbleGrunger

Dreams in Digital
I have to contest this Infamous: Second Son point. Yes, it has pop-in and 2d trees, but what do you expect? The PS4 does not have limitless power and this is a launch window title. It's going to have problems even when it ships.

Pop-in and 2D trees? Any videos to support that? Because I can't say I've noticed. I'm willing to be proven wrong though.

edit: I don't see it here: http://www.youtube.com/watch?v=7PaC52BQe2I
edit 2: I saw a few pop-ins but no 2D trees lol
 
Damn that just reminded me of the one map in Quake 1 where you could get outside of the map and see lots of sky - and rain hell down on unsuspecting players :p

Sorry for the off-topic tangent, but I'm really just waiting for Bish's verdict..
Haha, I used to do that all the time in Unreal Tournament as well. :D

Man, I miss Unreal Tournament. I should stop by the Q3 vs UT thread.
Hawk269
Banned

oh snap.
Greatness Awaits.
 

guch20

Banned
Hawk269
Banned

...

Oh shit son. I guess his opinion on Infamous can be completely disregarded (or at least lowered in importance to any other average joe who's never played the game).

I found a comment from an Indie dev about the article and Leadbetter:

http://www.psu.com/forums/showthrea...-hugely-underestimated-claim-developers/page3

I don't know what to make of this - it sounds reasonable, but AFAIK it is not possible to have a bus that can read and write at the same time. The depth/color block also sounds familiar if you've read the GCN white paper, but I'm not sure here.

On topic, is this guy telling the truth? Is Xbone's GPU better than PS4's?
 

Majanew

Banned
lol, see ya, Hawk

I love that people boasting about having certain insider info, or pull in the industry, are told to provide proof to mods.
 

benny_a

extra source of jiggaflops
Oh shit son. I guess his opinion on Infamous can be completely disregarded (or at least lowered in importance to any other average joe who's never played the game).
The dumbest thing about it is that it's still true. The game does have pop-in, as can be seen in this thread. Why even lie about this shit when there is actual footage of the game out there?

Also I'm glad that a mod agreed that whether or not someone is being truthful matters in a discussion, no matter the outcome.
 

Gestault

Member
I have a question based on "common" sense, and not a particular degree of technical knowledge. If they're saying they discovered that they can read/write simultaneously when previously they had assumed they could only do one or the other (so firmware/tools would have been set up with that assumption), wouldn't the roughly doubling of memory transfer speed (simultaneous reading/writing compared to sequential) via the eSRAM/DDR3 combination account for an 88% (if not more) theoretical bump in memory performance?

This being from the starting point, not a before-after comparison of the total performance (if I have 5 apples and double them to 10, that's a 100% increase in apples, but the original 5 is only 50% of the new amount.)

Anyone have a Star-Trek-style breakdown of why that wouldn't work? I don't assume this is a "bingo" moment, but I'm curious myself.
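For what it's worth, the arithmetic behind the "88%" figure can be sanity-checked against the commonly reported numbers: 102.4 GB/s one-way eSRAM bandwidth (a 128-byte interface at 800 MHz) versus the article's 192 GB/s combined read+write figure. These inputs are assumptions taken from the coverage of the time, not confirmed specs:

```python
# Back-of-the-envelope check of the "88%" memory-performance bump,
# assuming the widely reported Xbox One eSRAM numbers.

one_way = 128 * 800e6 / 1e9  # 102.4 GB/s: 128 bytes/cycle at 800 MHz
full_duplex = 2 * one_way    # 204.8 GB/s: ideal simultaneous read+write
claimed = 192.0              # GB/s figure from the article

print(one_way)                # 102.4
print(claimed / one_way - 1)  # 0.875 -> the ~88% bump over one-way bandwidth
print(claimed / full_duplex)  # 0.9375 -> claimed rate is ~94% of a perfect doubling
```

So under these assumed inputs, the reasoning in the post holds up: a near-doubling from overlapped reads and writes, minus some real-world overhead, lands at roughly 88% over the one-way figure.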
 

joeblow

Member
Someone mentioned Cerny's statement about developers getting more out of the PS4 over time. I'm pretty sure he is referring to his earlier statements about lower level tools not yet being available for a while. Once they are released, developers can code closer to the metal for even better results.
 

guch20

Banned
The dumbest thing about it is that it's still true. The game does have pop-in, as can be seen in this thread. Why even lie about this shit when there is actual footage of the game out there?

Also I'm glad that a mod agreed that whether or not someone is being truthful matters in a discussion, no matter the outcome.
Yeah, I didn't get his angle either. If he wanted to say he thinks Forza looks better, fantastic. You are capable of forming an opinion. On incomplete games. In alpha states.

But to then try to bolster your opinion by pretending to be some kind of big shot is just stupid. When you're found out, it not only invalidates your posts, it makes you look absolutely childish.
 

Cidd

Member
Only 3 as far as I know. 2 high profile ones and one dude that couldn't restrain himself when accusing others.

I was expecting more, to be honest. Every time I return to this thread, a different argument is taking place about things that have nothing to do with the thread title.
 

Gestault

Member
I have to contest this Infamous: Second Son point. Yes, it has pop-in and 2d trees, but what do you expect? The PS4 does not have limitless power and this is a launch window title. It's going to have problems even when it ships.

I know it's...unwise to clarify an off-topic point to a mod, but I was under the impression people weren't even saying Second Son didn't look good or anything, just the relative amount of polish was less than other games on the floor like Forza 5. Because Infamous obviously looks sweet. That was part of why I was so surprised when people were saying it was more polished and that it hadn't been playable as part of the same point.
 

Vestal

Gold Member
I'm all for being impartial--but have you said anything even remotely positive about PS4?

I haven't said anything negative outside of saying that I prefer the launch lineup for the xbone over theirs.

They have the better hardware on paper as far as we know. They still have to translate that into a noticeable advantage in games.
 

Y2Kev

TLG Fan Caretaker Est. 2009
Pop-in and 2D trees? Any videos to support that? Because I can't say I've noticed. I'm willing to be proven wrong though.

There's like a 2d tree somewhere and omgno


I know it's...unwise to clarify an off-topic point to a mod, but I was under the impression people weren't even saying Second Son didn't look good or anything, just the relative amount of polish was less than other games on the floor like Forza 5. Because Infamous obviously looks sweet. That was part of why I was so surprised when people were saying it was more polished and that it hadn't been playable as part of the same point.

a) I don't ban people I'm arguing with.
b) But that's exactly my point. If there are 2d trees, they're there for a reason-- maybe they chose to put resources somewhere else. The PS4 is not limitless in power, so corner cutting and/or graphical sacrifices are not indicative of a lack of polish. The Last of Us has a shitty AA solution but I don't think it's because Naughty Dog didn't polish the game.
 

Cidd

Member
What I find even more silly is that people were comparing a driving game versus an open world game. Forza 5 is a straightforward driving game with no fancy bells and whistles like real-time weather or dynamic lighting. Infamous: SS is an open world game that has some destructible environments with physics-based particle effects and lots of AI on screen. In their early stages, which of these games do you really expect to look better?

seriously.
 

Grimhammer

Neo Member
I came here for a fun read - stayed to share.

Isn't Sony's rate of faulty electronics well below industry standards?

And if we are really talking about whether Sony has a better track record for console hardware quality vs MS... then duh, Sony wins via experience and a track record with some errors - but nothing compared to RROD.

A big part that I see many not mentioning: MS tried for months to simply ignore it! And only when it was no longer possible did they add that 1-year warranty extension for RROD.
 

Gestault

Member
a) I don't ban people I'm arguing with.
b) But that's exactly my point. If there are 2d trees, they're there for a reason-- maybe they chose to put resources somewhere else. The PS4 is not limitless in power, so corner cutting and/or graphical sacrifices are not indicative of a lack of polish. The Last of Us has a shitty AA solution but I don't think it's because Naughty Dog didn't polish the game.

a) I uhh....never suggested you did. No one did.

b) Wouldn't that apply theoretically to every game ever? That any inadequacy could be justified as a design decision, even when it's a massively distracting issue that'll be dealt with before release like buildings appearing?
 

beast786

Member
Those defending Hawk's honor look silly as well.

I love how they attack nibs right away.


Nib, I usually don't react to you, because your interactions are pretty predictable, but really, please stop. Just stop. You're trying too hard, and with regularity you try to tell people who have played these games what they saw, or that they couldn't have seen it if it contradicts your view on it.
 

Y2Kev

TLG Fan Caretaker Est. 2009
a) I uhh....never suggested you did. No one did.

b) Wouldn't that apply theoretically to every game ever? That any inadequacy could be justified as a design decision, even when it's a massively distracting issue that'll be dealt with before release like buildings appearing?

Yes, it applies to every game ever. Like how trees pop up in the middle of nowhere and cause annoying car crashes in GTA4. I don't consider that a lack of polish.
 

guch20

Banned
I found a comment from an Indie dev about the article and Leadbetter:

http://www.psu.com/forums/showthrea...-hugely-underestimated-claim-developers/page3

I don't know what to make of this - it sounds reasonable, but AFAIK it is not possible to have a bus that can read and write at the same time. The depth/color block also sounds familiar if you've read the GCN white paper, but I'm not sure here.

So the same guy went into a little more detail about how, sometimes, DDR3+eSRAM can be better than GDDR5. He may be full of shit, but he sounds knowledgeable.

The framebuffer is the only guaranteed piece of memory that you will access on your GPU.

Textures, model data etc. will change either frame to frame, or over time.

In any one given scene you will "draw" your models or textures once per frame (yes, you can instance to get multiple copies of the same model if you want), but you only access each once.

So in GPU commands you will do this:

Get model data 1
Texture data 1
Draw
Get model data 2
Texture data 2
Draw... etc etc




So in memory terms it looks like this:

GDDR READ model data 1
GDDR READ Texture data 1
Draw
GDDR WRITE Framebuffer
GDDR READ model data 2
GDDR READ Texture data 2
Draw
GDDR WRITE Framebuffer


So we have...

Read Address 1, Read Address 2, Write Address 3
Read Address 4, Read Address 5, Write Address 3
Read Address 6, Read Address 7, Write Address 3

Notice Address 3 is in every line; it's the framebuffer (what gets displayed on screen).

But it gets even worse, because we often do a "compare" on depth data to make sure we aren't drawing something that's actually behind something already on screen.

So a real memory access is closer to this:

Read Address 1, Read Address 2, Read Address 3, if pass Write Address 3
Read Address 4, Read Address 5, Read Address 3, if pass Write Address 3
Read Address 6, Read Address 7, Read Address 3, if pass Write Address 3

Fun times, eh? We have read, read, read, write repeated over and over again till our final picture is "created". One frame.

Every time we write to our frame, we also pay a little cost called latency.
We issue a command: "hey, we want to read, or we want to write".

The GPU has to issue that command and WAIT for the RAM to return the data at that address.

That's usually a cycle or two of "waiting" where it stalls the GPU pipeline.

Now, throw into that mix a second bus that incurs no such latency.


Take your "write" and place it there.

What do you get now?

GDDR READ model data 1
GDDR READ Texture data 1
Draw
ESRAM WRITE Framebuffer
GDDR READ model data 2
GDDR READ Texture data 2
Draw
ESRAM WRITE Framebuffer

The effect on Main memory and the GPU memory controller for it is this...

READ, READ, READ, READ, READ, READ.

On the ESRAM -> WRITE, WRITE, WRITE.

You also do not have a wait state on the memory.

In short, you can get close to your maximum read speeds.

Compare that to the read/write/wait pattern:

176 GB/s could vary wildly, as it will depend on how many write and how many read states you put in place.

In some cases, for example where you are reading and writing, it could be less than the speed of the DDR3.

So does this clear anything up or just muddy the waters further?
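The pattern he describes can be sketched as a toy cycle count, under purely hypothetical assumptions: 1 cycle per access and a fixed 2-cycle penalty whenever a single bus turns around between reading and writing. The numbers are illustrative only, not real Xbox One or GDDR5 timings:

```python
# Toy model of the read/read/read/write pattern described above, comparing
# one shared bus against reads on the main bus with framebuffer writes
# moved to a separate eSRAM bus. All costs are assumed, not measured.

TURNAROUND = 2  # assumed read<->write bus-turnaround penalty, in cycles

def single_bus_cycles(draws):
    # One shared bus: per draw we read model, texture, and depth,
    # then write the framebuffer -- every direction change stalls.
    ops = ["R", "R", "R", "W"] * draws
    cycles, prev = 0, None
    for op in ops:
        cycles += 1
        if prev is not None and op != prev:
            cycles += TURNAROUND  # pay the turnaround on each switch
        prev = op
    return cycles

def split_bus_cycles(draws):
    # Reads stream uninterrupted on the main bus while writes go to a
    # second bus in parallel, so total time is the longer of the streams.
    reads, writes = 3 * draws, 1 * draws
    return max(reads, writes)

print(single_bus_cycles(1000))  # 7998 cycles with one shared bus
print(split_bus_cycles(1000))   # 3000 cycles with writes on a second bus
```

Under these made-up costs the split-bus case finishes in well under half the cycles, which is the shape of the argument he's making, even if the real penalties are nothing like 2 cycles.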
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Isn't Sony's rate of faulty electronics well below industry standards?

I'm normally an Xbox fanboy, but I've got loads of Sony stuff in my house, from alarm clocks to TVs, and none of them have died. I think their build quality is very good.
 

Gestault

Member
As the general consensus is still that this is bunk, and a post I made a few pages ago was apparently ignored, I'll venture a repost for the sake of on-topic discussion:

I have a question based on "common" sense, and not a particular degree of technical knowledge. If they're saying they discovered that they can read/write simultaneously when previously they had assumed they could only do one or the other (so firmware/tools would have been set up with that assumption), wouldn't the roughly doubling of memory transfer speed (simultaneous reading/writing compared to sequential) via the eSRAM/DDR3 combination account for an 88% (if not more) theoretical bump in memory performance?

This being from the starting point, not a before-after comparison of the total performance (if I have 5 apples and double them to 10, that's a 100% increase in apples, but the original 5 is only 50% of the new amount.)

Anyone have a Star-Trek-style breakdown of why that wouldn't work? I don't assume this is a "bingo" moment, but I'm curious myself.
 

benny_a

extra source of jiggaflops
So does this clear anything up or just muddy the waters further?
He says that adding a second bus will help. That's one of the modifications done on the PS4.

I'm outside my element here, but his explanation ignores any cache, which would not have any waiting time associated with it, if I understand it correctly.

woah what's going on in this thread? Why did hawk get banned?
This is not a live radio show where you're going to miss something if you're not present on the last page. You can just read the thread and find out.
 

guch20

Banned
I guess he is saying that with the eSRAM the Xbone GPU is better utilized than the PS4 GPU... he tried to explain in this post:

http://www.psu.com/forums/showthrea...lopers/page7?p=6132276&viewfull=1#post6132276
Oops, crap. Didn't see you'd posted this, and I posted his whole blurb. He does specifically say it's better for some things, but I'm wondering if that's true, only because I've read everywhere that Sony's solution is not only more powerful but more elegant. But I'm no techie.

If it turns out Xbox has the better GPU solution again, I think a lot of folks will be munching on crow.
 

ethomaz

Banned
So does this clear anything up or just muddy the waters further?
Is 32MB enough for a 1080p framebuffer? Because he is saying that the framebuffer will be the only thing stored in the eSRAM.

If I remember, a framebuffer of 1280x720 with 4xMSAA needs ~28MB... 1920x1080 will need more.
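A quick size check bears this out, assuming a ballpark 8 bytes per sample (32-bit color plus 32-bit depth/stencil) rather than any console's actual render-target layout:

```python
# Rough framebuffer size estimate: width x height x bytes-per-sample x
# MSAA sample count. The 8 bytes/sample figure is an assumption
# (32-bit color + 32-bit depth/stencil), not a confirmed spec.

def fb_bytes(width, height, bytes_per_sample=8, msaa=1):
    return width * height * bytes_per_sample * msaa

MB = 1024 * 1024
print(fb_bytes(1280, 720, msaa=4) / MB)   # 28.125 -> matches the ~28MB figure
print(fb_bytes(1920, 1080, msaa=4) / MB)  # ~63.3 -> won't fit in 32MB
print(fb_bytes(1920, 1080, msaa=1) / MB)  # ~15.8 -> 1080p without MSAA fits
```

So under these assumptions, 1080p with no MSAA fits in 32MB with room to spare, but 1080p with 4xMSAA is roughly double the eSRAM's capacity.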
 

guch20

Banned
He says that adding a second bus will help. That's one of the modifications done on the PS4.

I'm outside my element here, but his explanation ignores any cache, which would not have any waiting time associated with it, if I understand it correctly.


This is not a live radio show where you're going to miss something if you're not present on the last page. You can just read the thread and find out.
Sorry, but do you mean that the modifications made to the PS4 wouldn't have the wait times he's talking about? Like maybe he's basing his opinion on old info?
 