
DigitalFoundry: X1 memory performance improved for production console (ESRAM 192 GB/s)


coldone

Member
It could be an elegant solution to get the maximum out of the GPU. But when you cut the clock by 50MHz, you only end up with a 1.1 TFLOPS GPU. Sure, the Xbox One may be able to eke out every last bit of that 1.1 TFLOPS.

GAFers are upset that they are not getting the original 1.2 TFLOPS GPU they thought they would be getting :(

Then there are the Sony folks who are trying to rub salt in the wound by pointing out that they have a 1.84 TFLOPS GPU.

Oops, crap. Didn't see that you'd posted this, and I posted his whole blurb. He does specifically say it's better for some things, but I'm wondering if that's true, only because I've read everywhere that Sony's solution is not only more powerful but more elegant. But I'm no techie.

If it turns out Xbox has the better GPU solution again, I think a lot of folks will be munching on crow.
 

benny_a

extra source of jiggaflops
Is 32MB enough for a framebuffer? Because he is saying that the framebuffer will be the only thing stored in the eDRAM.

If I remember right, a framebuffer for 1280x720 with 4xMSAA needs ~28MB... 1920x1080 will need more.
KZ2's framebuffer was 36MB. Earlier in the thread ElTorro said the KZ:SF frame buffer is 39MB. (But he wasn't sure.)
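
For anyone who wants to sanity-check that ~28MB figure, the back-of-envelope math is just pixels × samples × bytes per sample. A minimal C sketch, assuming 32-bit color plus 32-bit depth/stencil per sample (actual layouts vary by engine):

#include <stdio.h>

static double fb_megabytes(int width, int height, int msaa_samples)
{
    const int bytes_per_sample = 4 /* RGBA8 color */ + 4 /* D24S8 depth/stencil */;
    double bytes = (double)width * height * msaa_samples * bytes_per_sample;
    return bytes / (1024.0 * 1024.0);
}

int main(void)
{
    printf("720p  4xMSAA : %.1f MB\n", fb_megabytes(1280, 720, 4));  /* ~28.1 MB */
    printf("1080p no AA  : %.1f MB\n", fb_megabytes(1920, 1080, 1)); /* ~15.8 MB */
    printf("1080p 4xMSAA : %.1f MB\n", fb_megabytes(1920, 1080, 4)); /* ~63.3 MB */
    return 0;
}

By that math a plain 1080p target fits in 32MB easily, but 1080p with 4xMSAA blows well past it, which is why the "only the framebuffer goes in there" question matters.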

The 32MB eDRAM stuff is a band-aid; everyone has already acknowledged that in the past.

Sorry, but do you mean that the modifications done to the PS4 wouldn't have the wait times he's talking about? Like, maybe he's basing his opinion on old info?
I'm saying he is suggesting that a second bus with no latency helps, and as far as my understanding goes the PS4 has that. But I also said I'm out of my element here; I only remember that the PS4 has a second GPU bus that bypasses the caches (which are usually associated with waiting times).
 

guch20

Banned
Any chance you more technologically minded folks could go into that PSU thread and get him to explain this stuff? I would, but I wouldn't know what to ask, and I wouldn't know if his answer had any holes in it or if it's based on old news.
 

ethomaz

Banned
KZ2's framebuffer was 36MB. Earlier in the thread ElTorro said the KZ:SF frame buffer is 39MB. (But he wasn't sure.)

The 32MB eDRAM stuff is a band-aid; everyone has already acknowledged that in the past.
I agree... the eSRAM (sorry, I meant eSRAM before and wrote eDRAM) is a workaround to avoid the DDR3 bottlenecks, but it can work well if the devs code appropriately.

That doesn't change the fact that the PS4 GPU has more power to be used... you can't use a band-aid for that lol.

I found this about the framebuffer at AnandTech: "At 32MB the ESRAM is more than enough for frame buffer storage, indicating that Microsoft expects developers to use it to offload requests from the system memory bus." - http://www.anandtech.com/show/6972/xbox-one-hardware-compared-to-playstation-4/3
 

benny_a

extra source of jiggaflops
Any chance you more technologically minded folks could go into that PSU thread and get him to explain this stuff? I would, but I wouldn't know what to ask, and I wouldn't know if his answer had any holes in it or if it's based on old news.
Here is a thread where his basic argument about the waiting times is also discussed.

He is basically not making a new argument. (This does not mean eSRAM has no benefits. Using it as a big cache is a benefit. That does not mean in the end it's the superior solution overall.)
 

coldone

Member
http://www.beyond3d.com/content/articles/4/5

On the Xbox 360, using "tile + deferred rendering" they were able to fit it all into 10MB too. There are more tricks. Hopefully we will hear more during the Hot Chips session in August.

Basically, the frame buffer is split into portions, instead of one frame buffer for the entire 1080p screen.
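
In code terms the idea looks roughly like this. A minimal C sketch of strip-based tiling; draw_strip and resolve_strip are hypothetical stand-ins, not calls from the actual 360 SDK (real predicated tiling is more involved):

#include <stdio.h>

#define SCREEN_W 1920
#define SCREEN_H 1080
#define TILE_H    360  /* strip height chosen so one strip's render target fits in embedded RAM */

/* Hypothetical stand-ins for "draw the geometry touching this strip"
 * and "copy the finished strip out to main memory". */
static void draw_strip(int y0, int y1)    { printf("draw rows %d-%d\n", y0, y1); }
static void resolve_strip(int y0, int y1) { printf("resolve rows %d-%d\n", y0, y1); }

int main(void)
{
    /* Render the frame in horizontal strips so each strip's render
     * target fits in the small embedded RAM, then resolve it out. */
    for (int y = 0; y < SCREEN_H; y += TILE_H) {
        int end = (y + TILE_H < SCREEN_H) ? y + TILE_H : SCREEN_H;
        draw_strip(y, end);
        resolve_strip(y, end);
    }
    return 0;
}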

Is 32MB enough for a 1080p framebuffer? Because he is saying that the framebuffer will be the only thing stored in the eDRAM.

If I remember right, a framebuffer for 1280x720 with 4xMSAA needs ~28MB... 1920x1080 will need more.
 

nib95

Banned
He said Infamous was plagued by frame rate issues, when what we've seen of it shows no such thing. He says he'd played the game when apparently no one else has. Pop-in is real, obviously, but it's minor and present in a lot of the games we saw. In the end he was just a nobody with a particular console bias. Meh. Predictable imo. Next time don't snap at us for rightly raising an eyebrow when people make lofty claims that not even real insiders and devs muster.

Back on topic, has Leadbetter followed this up?

Any more detailed information on how this new figure was derived, and whether it does spell a downclock or not?
 

benny_a

extra source of jiggaflops
Any chance you more technologically minded folks could go into that PSU thread and get him to explain this stuff? I would, but I wouldn't know what to ask, and I wouldn't know if his answer had any holes in it or if it's based on old news.
To your second point: everything he argues can only be based on old information.

The slower main memory combined with faster embedded RAM is something that already exists.

What didn't exist already is the system the PS4 employs, with only fast main memory and proprietary customizations made for it.

So we can argue until we're blue in the face; all we can tell is that what the guy describes is a real-world problem, and the way the PS4 tries to tackle it is by using a bus that can bypass the caches.
 

badb0y

Member
A couple of things that people mentioned that are inaccurate.

First, the person comparing the GTX 680 to a GTX 670 and concluding that the PS4 and Xbox One would have a similar disparity in performance because of the difference in teraflops is wrong. The GTX 680 and GTX 670 are based on the same GK104 chip, so everything on these chips is exactly the same except the core clock and the texture units/stream processors. The PS4's GPU, on the other hand, is based on "Pitcairn", the chip used in the HD 7850/7870, while the Xbox One's GPU is based on "Bonaire", the chip used in the HD 7790. So what does that mean in layman's terms? The chip used as a reference for the Xbox One's GPU is a mainstream part, while the chip used in the PS4 is part of the enthusiast class of GPUs.

To get a better understanding of the disparity in power here, take a look at Anandtech's fantastic benchmark suite: http://www.anandtech.com/bench/Product/776?vs=857

Second, the person who brought up the GTX 580 versus the GTX 680 is also making the same mistake, because the GTX 580 is based on the Fermi family of GPUs while the GTX 680 is based on the Kepler family. Kepler is a refined version of Fermi, and making direct comparisons between the two is nowhere near the same as comparing a 7790 to a 7870. Both the 7870 and 7790 fall under the Southern Islands family of graphics cards, as indicated by the "7" in front.

Third, I am not too keen on the use of ESRAM; my understanding of the situation is that throwing ESRAM into the equation makes the system a lot more complex than it should be. The way the PS4 is designed, the APU has direct access to the memory pool, so both the CPU and the GPU can access the memory and use it however the developers want them to.

On the other hand, the Xbox One memory configuration seems a bit less optimized. Here's a look at the memory configuration:
[Image: durango_memory.jpg - Durango memory architecture diagram]

From what I can understand from this diagram, the ESRAM and DDR3 RAM are separate pools. The GPU is the only thing that can read/write to the ESRAM, while the DDR3 RAM is available to both the CPU and the GPU. Another point to note here is that simply adding the ESRAM and DDR3 bandwidth together to come up with a number that compares to the PS4 GDDR5's 176 GB/s is totally wrong; the Xbox One's memory system is a lot more complex. I don't think that we will reach the theoretical maximum that Microsoft is putting out (I don't think the PS4 will hit its figure either), but the memory system of the Xbox One looks a lot more fragmented compared to the PS4's.
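
For what it's worth, the headline bandwidth numbers fall out of simple arithmetic. A quick C sanity check, assuming the widely reported configs (DDR3-2133 and 5.5Gbps GDDR5, both on 256-bit buses); these are back-of-envelope figures, not official specs:

#include <stdio.h>

int main(void)
{
    /* bandwidth (GB/s) = transfers per second * bus width in bytes */
    double ddr3  = 2133e6 * (256 / 8) / 1e9;   /* ~68.3 GB/s */
    double gddr5 = 5500e6 * (256 / 8) / 1e9;   /* ~176 GB/s  */

    printf("Xbox One DDR3 : %.1f GB/s\n", ddr3);
    printf("PS4 GDDR5     : %.1f GB/s\n", gddr5);

    /* The 192 GB/s eSRAM figure is a read+write-combined peak; adding
     * it to the DDR3 number assumes both buses are saturated at once,
     * which is exactly the kind of addition called out above. */
    return 0;
}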
 

benny_a

extra source of jiggaflops
[...] I don't think that we will reach the theoretical maximum that Microsoft is putting out (I don't think the PS4 will hit its figure either), but the memory system of the Xbox One looks a lot more fragmented compared to the PS4's.
Would it be reasonable to expect that the PS4 will get closer to its theoretical peak more easily than the Xbox One, since a developer has to do nothing but read or write to the memory, while on the Xbox One the eSRAM needs to be managed by the developers themselves?
 

FINALBOSS

Banned
I haven't said anything negative outside of saying that I prefer the launch lineup for the xbone over theirs.

They have the better hardware on paper as far as we know. They still have to translate that into a noticeable advantage in games.


Ok. So you've said nothing negative, that's fine.


Mind doing it a little less douchey and confrontational? I'm sure others agree you've been extremely brash every single time you've posted here. I don't see anyone else behaving that way.
 

Vestal

Gold Member
It could be a elegant solution to get the maximum out of the GPU. But when you cut down the clock by 50Mhz.. you only land up with a 1.1T flops GPU. Sure Xbox one may able to eek out every last bit of that 1.1T flops.

Gafers are upset that they are not getting the original 1.2T flops GPU that they thought they will be getting :(

Then there are the sony folks who are trying to rub salt on the wound by pointing out that they have 1.84T flops GPU.

Here we go again. Stop posting rumors or your own made-up crap as fact.
 

Vestal

Gold Member
Ok. So you've said nothing negative, that's fine.


Mind doing it a little less douchey and confrontational? I'm sure others agree you've been extremely brash every single time you've posted here. I don't see anyone else behaving that way.
Confrontational with coldone, yes. If you look back about 20 pages you will see why.
 
This thread:

[Image: 3oe3mk.jpg]


Honestly this discussion is pretty useless until we get hard specs from the source or until a teardown is done. Speculation doesn't really contribute anything other than to the fanboy wars.
 

badb0y

Member
Would it be reasonable to expect that the PS4 will get closer to its theoretical peak more easily than the Xbox One, since a developer has to do nothing but read or write to the memory, while on the Xbox One the eSRAM needs to be managed by the developers themselves?

In my opinion, utilizing the PS4's memory system will be easier than the Xbox One's, because the PS4 has a simpler memory system: everything connects to one single pool of memory, and developers don't have to worry about bandwidth problems or using the ESRAM or the DMEs, etc. On the other hand, I do think that if coded right the Xbox One's memory system will work well; it's up to the developer to optimize the code, as opposed to the PS4, where the work is pretty much done for you.
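
To make the difference concrete, here's a toy C sketch of what "managed by the developer" means; alloc_esram, alloc_ddr3, and alloc_unified are hypothetical names (just malloc here), not real SDK calls:

#include <stdio.h>
#include <stdlib.h>

/* All three allocators are hypothetical stand-ins for illustration;
 * the point is who does the placement work, not the actual API. */
static void *alloc_esram(size_t n)   { return malloc(n); } /* 32MB fast pool, dev-managed */
static void *alloc_ddr3(size_t n)    { return malloc(n); } /* big, slower pool */
static void *alloc_unified(size_t n) { return malloc(n); } /* PS4-style single pool */

int main(void)
{
    /* Xbox One-style: the developer decides what earns a slot in the
     * 32MB fast pool, and must juggle it as the frame's needs change. */
    void *color = alloc_esram((size_t)1920 * 1080 * 4);
    void *depth = alloc_esram((size_t)1920 * 1080 * 4);
    void *tex   = alloc_ddr3((size_t)256 * 1024 * 1024);

    /* PS4-style: one pool, one bandwidth figure, no placement decisions. */
    void *all   = alloc_unified((size_t)256 * 1024 * 1024);

    printf("allocated: %p %p %p %p\n", color, depth, tex, all);
    free(color); free(depth); free(tex); free(all);
    return 0;
}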
 

Vestal

Gold Member
In my opinion, utilizing the PS4's memory system will be easier than the Xbox One's, because the PS4 has a simpler memory system: everything connects to one single pool of memory, and developers don't have to worry about bandwidth problems or using the ESRAM or the DMEs, etc. On the other hand, I do think that if coded right the Xbox One's memory system will work well; it's up to the developer to optimize the code, as opposed to the PS4, where the work is pretty much done for you.

Yup, basically. It will also be up to MS and Sony to offer robust enough tools to take advantage of both systems. Given the architecture difference, the pressure is more on MS, since it's not a single memory pool.
 

Vestal

Gold Member
Not to pile on, but isn't your attitude towards others the reason you're a Junior again?

If you must know, it was a mistake I made: a stupid comment in regards to how the last few weeks have been, and how I viewed a particular group of posters and how they would jump on some issues. It was stupid, and in all honesty I deserved it.

Now, in regards to coldone: he decided to question my credibility while posting false information. When I confronted him on it he simply abandoned the thread, then comes back here and there.

So for fun I've decided to point out his BS whenever he tries to do it.
 

guch20

Banned
If you must know, it was a mistake I made: a stupid comment in regards to how the last few weeks have been, and how I viewed a particular group of posters and how they would jump on some issues. It was stupid, and in all honesty I deserved it.
It was a rhetorical question. I was in that thread.

Edit: To clarify, and so I don't sound like a dick, I don't mean rhetorical as in it doesn't require an answer, more that I already knew the answer. But yeah.
 

Pimpbaa

Member
Someone mentioned Cerny's statement about developers getting more out of the PS4 over time. I'm pretty sure he is referring to his earlier statements about lower-level tools not being available for a while yet. Once they are released, developers can code closer to the metal for even better results.

I think this is true for just about every console launch: rushed launch titles made with incomplete or unoptimized tools. It must have been a bitch for some of the more notoriously difficult consoles like the Saturn or N64. Hell, in the Saturn's case all they were given was incomplete documentation, and they were expected to program "to the metal" (I could be very wrong on this).
 

Vestal

Gold Member
I think this is true for just about every console launch: rushed launch titles made with incomplete or unoptimized tools. It must have been a bitch for some of the more notoriously difficult consoles like the Saturn or N64. Hell, in the Saturn's case all they were given was incomplete documentation, and they were expected to program "to the metal" (I could be very wrong on this).
Yup. Just look at Halo 4 and TLOU. We never thought either console could pull that off 7 years ago.
 

guch20

Banned
I think this is true for just about every console launch: rushed launch titles made with incomplete or unoptimized tools. It must have been a bitch for some of the more notoriously difficult consoles like the Saturn or N64. Hell, in the Saturn's case all they were given was incomplete documentation, and they were expected to program "to the metal" (I could be very wrong on this).

If I remember correctly, wasn't the Saturn's launch so bad that Sega rereleased Virtua Fighter 2 with improved graphics because the launch edition was so fucked?
 

Vestal

Gold Member
If I remember correctly, wasn't the Saturn's launch so bad that Sega rereleased Virtua Fighter 2 with improved graphics because the launch edition was so fucked?
Was it really??

I actually enjoyed the Saturn a lot. Got it on release too.
 

kyo27

Member
If I remember correctly, wasn't the Saturn's launch so bad that Sega rereleased Virtua Fighter 2 with improved graphics because the launch edition was so fucked?

It launched with VF 1. They released Virtua Fighter Remix later with better graphics, but I don't remember Virtua Fighter being "so fucked".
 

coldone

Member

Vestal

Gold Member
http://www.extremetech.com/gaming/1...re-specs-and-games-detailed-the-anti-xbox-one

"Inside the PS4 is, essentially, a specialized mid-range gaming PC. There’s an 8-core AMD Jaguar/Kabini x86-64 CPU, a Radeon 7870-derived GPU with 18 compute units (vs. Xbox One’s 12 CUs), and 8GB of unified GDDR5 RAM.:

18 vs 12 CUs has been published in several articles. Is it not?
Selective, eh?? Where is the downclock listed?? While you are at it, I am still waiting for the math on those servers.
 

Xenon

Member

Vestal

Gold Member
Has it ever been confirmed that the XBO has 12 CUs? Did I miss the confirmation?
I believe it comes from the leaks before the reveals. Since there are no spec sheets for the Xbone, there is no confirmation either way.

Above poster looks like he's got it.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
Has it ever been confirmed that the XBO has 12 CUs? Did I miss the confirmation?

It was confirmed by MS that it does a total of 768 ops per clock.

It's a chip based on AMD's GCN architecture.

GCN is 64 ops per clock per compute unit (CU).

768 / 64 = 12.

So yeah, it has 12 CUs, confirmed.
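
The same arithmetic extends to the TFLOPS figures being thrown around in this thread, using the standard GCN formula FLOPS = CUs × 64 lanes × 2 ops per FMA × clock. A quick C check (the 750MHz line is the rumored downclock, not a confirmed spec):

#include <stdio.h>

/* Peak single-precision throughput for a GCN GPU:
 * CUs * 64 shader lanes * 2 ops per fused multiply-add * clock. */
static double tflops(int cus, double clock_mhz)
{
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12;
}

int main(void)
{
    printf("XB1 12 CU @ 800 MHz: %.2f TFLOPS\n", tflops(12, 800)); /* ~1.23 */
    printf("XB1 12 CU @ 750 MHz: %.2f TFLOPS\n", tflops(12, 750)); /* ~1.15, the rumored downclock */
    printf("PS4 18 CU @ 800 MHz: %.2f TFLOPS\n", tflops(18, 800)); /* ~1.84 */
    return 0;
}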
 
I believe it comes from the leaks before the reveals. Since there are no spec sheets for the Xbone, there is no confirmation either way.

Above poster looks like he's got it.


That and the system architects revealed the 768 ops figure at the Xbone hardware panel, which, as artist outlined, equates to 12 CUs.
 

guch20

Banned
It launched with VF 1. They released Virtua Fighter Remix later with better graphics, but I don't remember Virtua Fighter being "so fucked".
That's right, it was VF1. But yeah, the first release was a rushed port with a piss-poor poly count (even by Saturn standards), and it pissed a lot of people off. I remember reading in EGM that Sega made VF Remix in an attempt to regain gamers' favor, since so many who had been hoping for a faithful arcade conversion were so disappointed with their first attempt.

I believe they even gave the game to Saturn owners for free in an attempt to placate them.

Edit: http://en.wikipedia.org/wiki/Virtua_Fighter:_Remix#Home_versions

On release, Famicom Tsūshin scored the Sega Saturn version of the game a 36 out of 40.[1] Despite the success and acclaim that Virtua Fighter received in its arcade incarnation and internationally, the game was considered by Western audiences to have taken several missteps in making its transition to home consoles. The original Sega Saturn port was rushed to market in order to be ready in time for that system's surprise early American launch in May 1995, and as a result it suffered from inferior visuals, gameplay glitches, and a lack of adequate game modes. As an apology to fans who felt burned by this version, Sega released "Virtua Fighter Remix" in July of that year, available free to all registered Sega Saturn owners for a set amount of time, which improved on the Saturn original and was generally considered to be a better version of the game, though still noticeably inferior to the arcade version. Famicom Tsūshin scored the Remix version of the game a 35 out of 40.[2] Finally, a version of the game was released for Sega's short-lived 32X add-on for the Sega Mega Drive in 1995.

So fucked.
 

Xenon

Member
I believe it comes from the leaks before the reveals. Since there are no spec sheets for the xbone there is no confirmation either way..

above poster looks like he's got it.

768 ops per clock (confirmed) as per GCN arch (confirmed) is 12 CUs.

It was confirmed by MS that it does a total of 768 ops per clock.

It's a chip based on AMD's GCN architecture.

GCN is 64 ops per clock per Shader Unit.

768 / 64 = 12.

So yeah, it's has 12 CU's, confirmed.

Ok I remember that now. Thanks.
 

Myshoe

Banned
Just out of interest, how come there has been so much focus on memory bandwidth anyway? Is it because both companies have taken different approaches? Is Microsoft intentionally trying to distract us from the fact that their $100-more-expensive console has a much weaker GPU?

The PS4 has:
50% more shader units - 768 vs 1152
50% more compute units - 12 vs 18
50% more texture mapping units (TMUs) - 48 vs 72
100% more render output units (ROPs) - 16 vs 32

Even if PS4 & Xbox One had the exact same memory bandwidth/subsystem there is still a huge disparity in GPU power for two consoles of the same generation (which are supposedly going head to head).
 