
DigitalFoundry: X1 memory performance improved for production console (ESRAM 192 GB/s)

Status
Not open for further replies.

Hawk269

Member
Anybody who argues there isn't much of a difference between 720p and 1080p should by default have his/her opinion on tech/graphics shot down. Clearly that person would be fine saying Wii games looked just as good as PS3 games.

Thing is, there are a lot of people who say that. I still remember the threads here saying they would be fine with 720p PS4/Xbox One games since there is not that much of a difference from 1080p. To me that is insane. I have said since the first word on these consoles: 1080p or bust.
 

benny_a

extra source of jiggaflops
As soon as you are moving data to or from the DDR3 it maxes at 68GB/s of course. I'm wondering if the latest news is talking about something entirely on the eSRAM and not that bridge to the GPU memory system which most people are thinking it is.
Yup, that's also what I infer from the article because it's talking about some new advancement.

My point with the DDR3 was just that you could add the read and the write together and get a higher bandwidth than any single lane can actually sustain, which reminds me of the fancy arithmetic that was done last generation to obscure the numbers.
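The "fancy arithmetic" being criticized here can be sketched in a few lines. The figures are the ones commonly cited in this thread, not official specs:

```python
# Summing separate read and write paths produces a headline number that no
# single transfer can ever sustain (figures as cited in this thread).
ddr3 = 68.0          # GB/s, DDR3 main memory, one direction at a time
esram_read = 102.4   # GB/s, eSRAM read path
esram_write = 102.4  # GB/s, eSRAM write path (assumed symmetric)

marketing_total = esram_read + esram_write       # the "peak" you can quote
sustained_one_lane = max(esram_read, esram_write)  # what one lane can do

print(marketing_total)      # 204.8
print(sustained_one_lane)   # 102.4
```

The trick is the same one used last generation: the sum is only reachable if reads and writes happen to be perfectly balanced every cycle.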
 
Thing is, there are a lot of people who say that. I still remember the threads here saying they would be fine with 720p PS4/Xbox One games since there is not that much of a difference from 1080p. To me that is insane. I have said since the first word on these consoles: 1080p or bust.

Eh, just wait. I read some people make the same argument when it was DVD vs Blu-ray.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Anybody who argues there isn't much of a difference between 720p and 1080p should by default have his/her opinion on tech/graphics shot down. Clearly that person would be fine saying Wii games looked just as good as PS3 games.

I agree, but still these threads will keep happening and we will exchange the same arguments for another decade.
 

jaypah

Member
In that case MS is only prolonging the inevitable.

If the specs make a visual difference, then the truth will be there for the whole world to see, and MS cannot stop that from happening.

Comparisons will be made.
Right, so why do it now? I mean, literally, from a business standpoint what sense does it make to release the specs now? MS has been making some dumb statements/decisions lately but with all of the downplaying of specs, cloud talk and meaningless touting of 5 billion transistors why would they come out and basically say "We're 4 months out from launch and trying to get our pre-orders up but I know how much GAF likes tech specs so allow us to break down our technical inferiority in detail". Not gonna happen.
 

Hawk269

Member
Since this thread has derailed into so many sub-topics, can anyone explain what those Move Engines are on the diagram posted above? Has there been a good explanation of what they do?
 

Xenon

Member
The problem is that this is framed with the bit about this being a discovery by Microsoft.


No, the problem is that people are acting like this was some unified message from MS. For all we know, someone at Microsoft just discovered this info and thought it might be interesting to let people know, since he or she has been following the hardware discussions on the interwebs and understands them.

But for some reason the idea that MS is trying to use this in some PR move makes more sense to people here. Which is silly, since this is going to fly over the heads of 99% (being generous) of the people out there, and I am almost sure 100% of MS's marketing department. We are reaching console cold war levels of paranoia here.
 

ekim

Member
As soon as you are moving data to or from the DDR3 it maxes at 68GB/s of course. I'm wondering if the latest news is talking about something entirely on the eSRAM and not that bridge to the GPU memory system which most people are thinking it is.

durango_memory.jpg

Maybe a dual-bus eSRAM? Don't know.
 

jaypah

Member
No, the problem is that people are acting like this was some unified message from MS. For all we know, someone at Microsoft just discovered this info and thought it might be interesting to let people know, since he or she has been following the hardware discussions on the interwebs and understands them.

But for some reason the idea that MS is trying to use this in some PR move makes more sense to people here. Which is silly, since this is going to fly over the heads of 99% (being generous) of the people out there, and I am almost sure 100% of MS's marketing department. We are reaching console cold war levels of paranoia here.

That's what gets me about all of this. Most gamers don't care about/understand this news and those that do shouted "MUST BE A DOWNGRADE!" almost immediately. So how were they expecting to benefit from this?
 

kittoo

Cretinously credulous
Ok then just for you:

Xbone has 109% of the memory bandwidth, 50% of the ROP performance, and 67% of the shader performance of PS4.

Memory bandwidth:
Xbone: 192GB/s
PS4: 176GB/s

ROPs:
Xbone: 16 ROPs
PS4: 32 ROPs

Shader units:
Xbone: 768
PS4: 1152

Even theoretically Xbone still looks to be lacking quite considerably?

WTF dude! Go through the basic facts first, man! Even if the eSRAM has 192 GB/s, it's only the 32MB of eSRAM that will have it. The actual 8GB of DDR3 is somewhere around 60 GB/s I think, while the PS4 has the whole 8GB at 176 GB/s.
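For what it's worth, the percentages in the quoted post do check out against the spec figures it cites (with the caveat from the reply above: the 192 GB/s applies to the 32MB of eSRAM only):

```python
# Sanity-checking the quoted ratios, using the thread's own spec figures.
xbone = {"bandwidth": 192, "rops": 16, "shaders": 768}
ps4 = {"bandwidth": 176, "rops": 32, "shaders": 1152}

# Xbone as a percentage of PS4, rounded to whole percent
ratios = {k: round(100 * xbone[k] / ps4[k]) for k in xbone}
print(ratios)  # {'bandwidth': 109, 'rops': 50, 'shaders': 67}
```

So the 109% / 50% / 67% figures are arithmetically right; the dispute is only over whether the 192 GB/s numerator is a fair number to use.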
 
No, it can't. Well, it *might* be able to squeeze some compute into idle cycles, but I doubt most devs would rely on that; they'd want to control the situation better.

It does seem to have more fine-grained control of compute, though. So potentially the situation is even worse than I stated, if Xbox devs have to bluntly reserve entire CUs due to less control over compute jobs.

You're right. I looked back at the quote and he said "it radically reduces the overhead of running compute and graphics together on the GPU", so there will be some trade-off. I watched his Gamelab video and it seems that this is one of the more complicated aspects of the system that he expects developers to be able to take advantage of down the line. He said he wanted the system to be easy to learn, but difficult to master.
Cerny said:
The hardware should have a familiar architecture and be easy to develop for in the early days of the console life cycle, but there also needs to be a rich feature set which the game creators can explore for years.
 

Takuya

Banned
Ok then just for you:

Xbone has 109% of the memory bandwidth, 50% of the ROP performance, and 67% of the shader performance of PS4.

Memory bandwidth:
Xbone: 192GB/s
PS4: 176GB/s

ROPs:
Xbone: 16 ROPs
PS4: 32 ROPs

Shader units:
Xbone: 768
PS4: 1152

Even theoretically Xbone still looks to be lacking quite considerably?

lolllllllll...

Only the 32MB of eSRAM can achieve that "theoretical" peak; the 8GB of DDR3 doesn't even come close by itself.
 

ekim

Member
Require a redesign of the entire system; buses aren't free.

Maybe they were in there all the time but not usable within the API. But I'm guessing here.
And lol:
http://beyond3d.com/showpost.php?p=1761253&postcount=4268
I've gotten some PMs here and at NeoGAF talking about a modest upclock for the Xbox One. The 4 messages I received from 4 different accounts (could be the same person, but some of them are accounts from years ago) all state 75MHz for the GPU at this stage, and it may go up slightly more.

I don't know how accurate this is. But perhaps this is where the increase in bandwidth is really coming from, and MS simply won't ever announce clock speeds for the chip?

Now that is less believable than a downclock IMHO. But I would love to know if he really got those PMs and from which users. Especially here at GAF. SpecialGuy? :p
 

Flatline

Banned
That's what gets me about all of this. Most gamers don't care about/understand this news and those that do shouted "MUST BE A DOWNGRADE!" almost immediately. So how were they expecting to benefit from this?


In our defense it's not our fault DF posted a ridiculously vague and incomprehensible fluff piece that even experts can't decipher. Any company that does shit like that deserves the backlash.
 

klaus

Member
In that case MS is only prolonging the inevitable.

If the specs make a visual difference, then the truth will be there for the whole world to see, and MS cannot stop that from happening.

Comparisons will be made.

Finally someone making sense ^^

All I asked for was some visual proof (or clear testimony from a multiplat dev, or any other sound evidence) that the PS4 is significantly more powerful than the One. What I got was a lot of... interesting answers.

Again, I am not in any way disputing that the PS4 will be more powerful in almost every aspect; I just wonder why people automatically assume that the Xbox One will have clearly worse-looking games, especially in a thread whose topic is that the bandwidth is presumably better than thought.

Just my 5 cents.
 
All of this argument over 32MB of RAM.

Also, what do people mean when they say things like "Microsoft just discovered this info"? That they just happened upon 90GB/s of mystery bandwidth to their astonishment on a component that's presumably been in development for years? That shouldn't make a jot of sense to anyone with sense.
 

neptunes

Member
Thing is, there are a lot of people who say that. I still remember the threads here saying they would be fine with 720p PS4/Xbox One games since there is not that much of a difference from 1080p. To me that is insane. I have said since the first word on these consoles: 1080p or bust.
What happens when a good majority of next-gen games are not natively 1080p, but instead rendered somewhere between 720p and 1080p, on both systems no less?
 

iceatcs

Junior Member
It seems to me that memory speed isn't that important; even the DC and PS3 had faster RAM.
I think the GPU is the most important part. Whoever has the better GPU will most likely get the best version of multiplatform games.

First-party or exclusive titles are a different story, though, because those titles are made for one system.
 
What is the purpose of these 32 megs of ram anyway?
Do you want a frank answer or a PR-friendly answer?

It's there to make up for Microsoft needing large quantities of RAM to accomplish their intended goal of multitasking with TV, apps and other non-gaming functionality. They are targeting US audiences with an "all-in-one" box to own the living room, and consequently went the route of lower-bandwidth DDR3, as they didn't foresee and couldn't bet on GDDR5 being available in the quantities they needed. When one looks at the XB1's broad, overarching goals, the hardware decisions they've made are framed and explained quite clearly.

Or "something something latency, the cloud, it's better, something something" if you want the latter.
 

Flatline

Banned
All of this argument over 32MB of RAM.

Also, what do people mean when they say things like "Microsoft just discovered this info"? That they just happened upon 90GB/s of mystery bandwidth to their astonishment on a component that's presumably been in development for years? That shouldn't make a jot of sense to anyone with sense.


It's because technology has advanced so far that components upgrade themselves without the knowledge of their creators anymore. Next up: self-replicating Xbones.
 

klaus

Member
Ask yourself this question; did people give a fuck this gen about minute differences? How many meltdowns did we have when DF face offs constantly favored 360 versions of multiplats?

Well on the forums, they obviously did give lots of fucks. With their wallets... I am not so sure - PS3 seems to have fared quite well for the trainwreck it was in the beginning.
 

benny_a

extra source of jiggaflops
Well on the forums, they obviously did give lots of fucks. With their wallets... I am not so sure - PS3 seems to have fared quite well for the trainwreck it was in the beginning.
PlayStation kicked out Sega and dethroned Nintendo when it was introduced.
The follow-up was the most successful gaming console ever.

And that follow-up barely beat the Xbox 360 in worldwide sales while losing money overall for the SCE division.

That isn't faring quite well. That's merely an alright recovery.
 

Myshoe

Banned
WTF dude! Go through the basic facts first, man! Even if the eSRAM has 192 GB/s, it's only the 32MB of eSRAM that will have it. The actual 8GB of DDR3 is somewhere around 60 GB/s I think, while the PS4 has the whole 8GB at 176 GB/s.

If you follow the conversation: I originally quoted a "generous" (I thought) 133GB/s but was told by a disgruntled Xbox fan that I should use theoreticals. I was just pointing out that even with best-case theoretical figures the Xbone hardware still looks severely lacking; in fact, it probably doesn't even have the GPU power to make use of that bandwidth even if it exists.
 

benny_a

extra source of jiggaflops
If you follow the conversation: I originally quoted a "generous" (I thought) 133GB/s but was told by a disgruntled Xbox fan that I should use theoreticals. I was just pointing out that even with best-case theoretical figures the Xbone hardware still looks severely lacking; in fact, it probably doesn't even have the GPU power to make use of that bandwidth even if it exists.
I think if this were a who-can-come-up-with-the-dumbest-arithmetic contest, you could add the 20GB/s that the PS4 can use to write while bypassing the cache on top of the 176GB/s figure, as it uses a different physical lane.

But that would just be a pissing contest, with drunk people pissing all over each other.
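The equivalent PS4 version of the "sum the lanes" trick would look like this, using the 20GB/s coherent-write figure cited in the post above (an illustration of the dubious arithmetic, not an official spec):

```python
# PS4 flavor of the same lane-summing trick criticized earlier in the thread.
gddr5 = 176.0          # GB/s, PS4 GDDR5 main memory bus
coherent_write = 20.0  # GB/s, separate cache-bypassing write path (as cited)

inflated = gddr5 + coherent_write
print(inflated)  # 196.0 -- a number no single workload actually sees
```

Both "196 GB/s" here and "204.8 GB/s" on the Xbox side are sums of physically separate paths, which is exactly why benny_a calls it a pissing contest.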
 

Mastperf

Member
To be clear, there are 18 CUs. Sony suggests you use 14 for graphics and 4 for compute. Their research showed that more than 14 CUs for graphics helped little (which means the system is bottlenecked somewhere else; imo it's the CPU).

You're gonna need to back a claim like that up with a source.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
The difference is that Jack is right, but nobody not already on board will be swayed by those statements.

The statement that your console is the most powerful is not as forceful as the direct challenge to compare multi-platform titles. You need to have some substantial confidence to say that, whereas you don't need anything to say generic gibberish about infinite power.
 

Artorias

Banned
What, for asking people to prove their claims? Sorry for being critical, is this a bannable behaviour now?

I think it's more that you're asking for something you know can't be provided yet, and literally saying that the numbers aren't good enough for discussion.

The numbers that are confirmed are the most solid facts we have on the consoles and you're saying that isn't proof because there are no games released yet...Would you prefer that nobody discuss the obvious and significant gap in power simply because it hurts your feelings?
 

benny_a

extra source of jiggaflops
did they ever really say that?
"Most powerful console ever made"

ElTorro said:
The statement that your console is the most powerful is not as forceful as the direct challenge to compare multi-platform titles. You need to have some substantial confidence to say that, whereas you don't need anything to say generic gibberish about infinite power.
Fair enough, there is a difference.
 

RayMaker

Banned
I know he is a businessman, and maybe everything he says can't be 100% trusted, but in that interview I do genuinely believe him.
 

klaus

Member
I think it's more that you're asking for something you know can't be provided yet, and literally saying that the numbers aren't good enough for discussion.

The numbers that are confirmed are the most solid facts we have on the consoles and you're saying that isn't proof because there are no games released yet...Would you prefer that nobody discuss the obvious and significant gap in power simply because it hurts your feelings?

Well, I agree that I am asking for proof that might not exist yet, but I do so because at least two people in this thread claimed things that required exactly such proof. So please don't shoot the messenger / the person asking for proof.

There will be a performance gap, and I am happy to discuss it, but how can anyone at this point be 100% sure that it will be significant? Anyone who has followed the discussions around hardware over the last 15 years knows that pure numbers don't tell the whole story, and that they are prone to being used as a marketing ploy.

And (for the nth time): I am not in any way disputing that the PS4 will be generally more powerful, but why insist on the difference being big when there is no solid proof whatsoever backing that claim up? And why do so in a thread about what seems to be positive news for the One?
 

nib95

Banned
Where is this incredible persecution complex coming from? Jesus.

I think those who were too aligned with Microsoft took all the recent criticism and negativity towards Microsoft (DRM, always online, weaker specs for higher price, forced Kinect, constant PR fluff or outright lies, communication issues, company arrogance etc) personally, not realising it was all fully justified and actually brought on by Microsoft themselves.
 

jaypah

Member
In our defense it's not our fault DF posted a ridiculously vague and incomprehensible fluff piece that even experts can't decipher. Any company that does shit like that deserves the backlash.

The hell are you talking about? I'm asking what MS thought they were going to gain by putting out info that was going to be immediately dissected by the only people who care. Gamers non-versed in hardware don't understand it and those who understood it immediately came to the conclusion of a downclock. So, again, how were MS expecting to benefit from this news? If forumites instantly thought downgrade it seems like the person from MS would have also seen that the info would be used as evidence of a downgrade. I don't mind people delving into the issue at all, in fact it's one of the reasons I love GAF. Perhaps you thought I liked to indulge in fanboy shenanigans. I assure you I do not.
 

Espada

Member
And (for the nth time): I am not in any way disputing that the PS4 will be generally more powerful, but why insist on the difference being big when there is no solid proof whatsoever backing that claim up? And why do so in a thread about what seems to be positive news for the One?

PS3 got ports that looked and ran worse than those on the Xbox 360, and the gap between those two is much smaller than the one between the PS4 and Xbox One. To make matters worse, developers preferred the console that was easier to work with. Why is it wrong to state that there will be a significant gap between the two?
 

Ushae

Banned
Btw, if B3D can't make sense of this PR bullshit, no one in this thread will be able to. It's a fun thread but completely pointless imo.

It's only a select few that insist on justifying hardware specs. When are people going to realise that it's the exclusives and games that determine the worth of these consoles? The PS4 IS more powerful, period. There are no two ways about it. But the fact remains that next-gen games will all be coming to both platforms. What does that mean?

Both consoles have different capabilities and offer different experiences. So far the X1 seems to have showcased a lot more variety than the PS4, which looks to be a full-on hardware boost plus some nice indie support.

Smart matchmaking, dedicated servers, Kinect sensor, instant switching, Live TV and Windows 8 synergy opening up the platform to many devs. X1 has some good things coming, you'd have to be ignorant not to see that.
 

ekim

Member
From the AMD GCN white paper:
ECC protected data is about 6% larger, which slightly reduces the overall memory capacity and bandwidth.
As an architecture, GCN is customizable for both professional and consumer products. Professional applications can take advantage of ECC protection for on-chip SRAM and optional external DRAM coverage to provide the highest levels of reliability with minimal performance degradation. GPU products that support ECC will allow it to be enabled or disabled via the graphics driver.
http://www.amd.com/us/Documents/GCN_Architecture_whitepaper.pdf

So if they have ECC enabled, and thus ~6% larger data and ~6% less bandwidth because of it, we can also do this math:
204GB/s (which would be the actual peak bandwidth claimed by others at 800MHz) * 0.94 = 191.76 ≈ 192 GB/s

:p
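ekim's back-of-the-envelope math, spelled out. The ~6% overhead figure is from the GCN whitepaper quoted above; whether ECC is actually enabled on this part is pure speculation on his part:

```python
# If ECC inflates stored data ~6%, effective bandwidth drops ~6% (per the
# GCN whitepaper figure); applying that to the 204 GB/s peak used in the post.
peak = 204.0         # GB/s, read+write peak at 800 MHz as used in the post
ecc_overhead = 0.06  # ~6% larger data => ~6% effective bandwidth loss

effective = peak * (1 - ecc_overhead)
print(round(effective, 2))  # 191.76, i.e. roughly the 192 GB/s headline figure
```

It lines up suspiciously well, which is the whole joke; it is still numerology until someone confirms ECC is in play.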
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
here:
http://www.scei.co.jp/corporate/release/pdf/130221a_e.pdf

The GPU contains a unified array of 18 compute units, which collectively generate 1.84 Teraflops of processing power that can freely be applied to graphics, simulation tasks, or some mixture of the two.

Another misconception many people have is that you have to assign a compute unit to either graphics or compute. That is not the case. A compute unit can do both at the same time, prioritize one over the other, and fill in compute tasks when graphics computations are stalled. That is actually the point of many of the PS4's alterations over vanilla GCN.
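The 1.84 TFLOPS figure in the quoted press release falls straight out of standard GCN arithmetic, assuming the widely reported 800 MHz GPU clock:

```python
# Deriving PS4's quoted 1.84 TFLOPS from its GCN configuration.
cus = 18          # compute units (from the Sony release quoted above)
lanes = 64        # shader ALU lanes per CU in GCN
ops_per_lane = 2  # a fused multiply-add counts as 2 FLOPs per cycle
clock_ghz = 0.8   # 800 MHz (widely reported, assumed here)

tflops = cus * lanes * ops_per_lane * clock_ghz / 1000
print(round(tflops, 4))  # 1.8432
```

The same arithmetic with 12 CUs gives the ~1.31 TFLOPS commonly cited for the Xbox One, which is where the 67% shader ratio elsewhere in the thread comes from.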
 

Hawk269

Member
I stated this before and will do so again. At the end of the day it all comes down to the games and how they look and how they play. Up until now, there has been ONE show that has allowed people to play both consoles. I was one of those people that was able to play many games on both platforms. Some were played on the show floor, some behind closed doors as part of meetings.

Based on what I played, and I stress this is my opinion, the Xbox One games seemed cleaner, ran smoother, and just overall looked better than anything on the PS4. Forza 5 was the only playable game running at a pure 1080p/60fps, and it really showed its strengths because of that.

You can all take that for what you will, but like anything in this industry, it is going to come down to the games, not a spec sheet. While one console can mathematically be more powerful on paper, the games themselves usually tell a different story.

Just to be clear: Yes, I realize that E3 is not a great barometer because of how early games are and nothing shown/playable was anywhere near final. However, with that being said, right now that is the only comparison that can be made between the two consoles until another event is hosted that allows for people to play both systems again.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
So if they have ECC enabled, and thus ~6% larger data and ~6% less bandwidth because of it, we can also do this math:
204GB/s (which would be the actual peak bandwidth claimed by others at 800MHz) * 0.94 = 191.76 ≈ 192 GB/s

Yeah, but then you could not retain a peak bandwidth of 102.4GB/s for single reads or writes.
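Where the 102.4 GB/s per-direction figure comes from, assuming a 1024-bit eSRAM port at the 800 MHz GPU clock (a common assumption in this thread, not a confirmed spec):

```python
# Per-direction eSRAM bandwidth from assumed port width and clock.
port_bits = 1024   # assumed eSRAM port width
clock_hz = 800e6   # 800 MHz GPU clock (assumed)

bytes_per_cycle = port_bits // 8              # 128 bytes per cycle
gb_per_s = bytes_per_cycle * clock_hz / 1e9   # bandwidth in GB/s
print(gb_per_s)  # 102.4
```

Doubling it for simultaneous read and write gives the 204.8 GB/s peak, which is why ElTorro notes that an ECC-style discount on the combined number would have to eat into the single-direction 102.4 GB/s as well.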
 