
X1 DDR3 RAM vs PS4 GDDR5 RAM: “Both Are Sufficient for Realistic Lighting” (Geomerics)

ZaCH3000

Member
Sony deserves lots of credit. They also should thank our harsh criticisms of last gen. The game footage has been live footage. That to me is a testament of confidence Sony has with this machine.
 
I assume that this post has popped up in this thread but I haven't been keeping up:


http://forum.beyond3d.com/showpost.php?p=1763732&postcount=4689




Basically a ram upgrade that makes zero sense as far as logic goes.

An overclock to the cpu (seems possible...?)

Some other crap.


I can definitely see them trying to get the clock speed of the CPU up, but the fact that this is in the same rumor as something as absurd as redesigning the entire motherboard to hold more RAM (unless there is a DDR3 size/config I've never heard of...?) makes me doubt the entire thing. That, and the people on that forum are as crazy as we are about Platinum games on Nintendo systems or Half-Life 3.
 
They're essentially trying to turn wishful thinking into a rumor, even this late in the process.

12GB? Are they serious? They're destroying what's left of B3D's reputation as a legitimate place for tech discussion.
 

Key2001

Member
To achieve 12GB, wouldn't they have to use 24 4-Gbit chips instead of the 16 4-Gbit chips they currently use? The board they showed at the reveal didn't look like it had room for an additional 8 chips.

Why are they only now addressing the large system reservation, if it's such a problem? This seems like something that would have been known well in advance, with more than enough time to address it before the reveal or E3.

It would also seem like something Xbox fans would hope isn't true. What happens if developers really are complaining about the system reservation, and MS decides against upgrading to 12GB of RAM, or is unable to?
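As a quick arithmetic check on Key2001's chip counts (my own sketch, not from the thread; densities are per chip, in gigabits), the numbers do line up with the rumored total:

```python
# Total RAM from chip count and per-chip density (gigabits -> gigabytes).

def total_gb(chip_count: int, gbit_per_chip: int) -> float:
    """Total RAM in GB for a given number of chips of a given density."""
    return chip_count * gbit_per_chip / 8  # 8 bits per byte

print(total_gb(16, 4))  # the known 16 x 4-Gbit layout -> 8.0 GB
print(total_gb(24, 4))  # the rumored upgrade          -> 12.0 GB
```

So 12GB really would require 8 extra chips (or a denser chip than the board was shown with), which is exactly why the rumor strains belief.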
 

Rumors become what now? The design is done.
 

calder

Member
Lost Planet, Assassin's Creed, GTA 4, Red Dead Redemption, Mass Effect 3, Transformers and Skyrim on the PS3 say hello.

They are all garbage ports and are so clearly better on the Xbox 360 it isn't even a contest.

Uh, so you're agreeing with him? Just in an oddly phrased manner? Hope so anyway. ;)
 
Man when will this sort of nonsensical wishful thinking end?

I've shared this story on this forum once before, I believe, so sorry if it's a repeat to some of you. But back when I was in high school I loved Weezer. The Blue Album and Pinkerton were amazing. Then they broke up and disappeared for several years. When they got back together I was pretty active on the official Weezer message boards. The single for Hashpipe came out and we didn't really like it very much, but we tried to convince ourselves that it was just the radio single and the rest of the album was going to be good.

And then the album leaked online about two weeks before it was supposed to. It was awful. It was so watered down and lacked all of the passion and fucked up neuroses of their prior work. The guitars sounded neutered, Rivers' voice sounded lame, his lyrics were complete shit.

Then someone floated the idea that they released this leak on purpose and it was actually all the throwaway songs. We started digging up quotes from interviews where Rivers claimed to have written and recorded hundreds of songs during their breakup. It made perfect sense! Hashpipe, which we actually hated the least at this point, was the only song that sounded even halfway like their old stuff. IT HAD TO BE FAKE! And we all believed that. Or wished it, at least. And then the album came out and it was exactly what we had heard.

That was my big reality check on wishful thinking. I see so many people construct these insane conspiracy theories either in favor of their console of choice or against the ones they don't like... or even fear, like that MS is going to pay developers to make shitty PS4 ports - and I laugh. I laugh really hard. But once upon a time I was one of those dudes, and Weezer crushed my soul.
 

Dosia

Member
Lol, I remember that with Weezer. The same thing happens with Eminem's albums after The Eminem Show. Those people on B3D are delusional.
 
Uh, so you're agreeing with him? Just in an oddly phrased manner? Hope so anyway. ;)

Yes, I absolutely agree with him. PS3 had so many awful ports, especially pre-2010. Dead Space seemed like the exception among multiplats because of how good it was on the PS3. My roommate at the time cracked his Lost Planet disc in two out of frustration with the game. I walked into my room, took out the 360 version, and his jaw dropped at the difference. Even Bionic Commando was slightly better on the 360.

After 2010 things got better, except for lazy Unreal Engine 3 ports... I don't even touch UE3 games on PS3, even if they get good reviews, unless I'm just supporting the devs, like with the ME3 collector's edition.

I personally don't feel Xbone multiplats will fare any better in this regard. It didn't happen last gen and it won't happen this gen, unless moneyhats rule the roost.
 

mrklaw

MrArseFace
With HMC, the bandwidth feeding mobile SoCs will far outstrip that delivered by GDDR5; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side: performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.

I think this will have to be a five-year cycle this time around, since mobile will start to give comparable performance within 3-4 years.


Agree. But I think you're overplaying the mobile side. They'll still be restricted by wattage.

But absolutely, there is a potentially large shift coming, and that makes this generation interesting. If Nintendo continues to fail, or if MS does worse than expected, might they be tempted to have a short gen and release quickly? MS has done that before.
 

dr_rus

Member
I think people need to keep in mind that third-party developers on PS4 are going to be competing with Sony's first party, and if there is a huge disparity then third parties will get shit from the hardcore until they bring their titles up to scratch, which we know isn't going to be hard given the ease of development on PS4. While the same could be said for Xbone, it will be easier for third parties to dismiss it as difficult to work with, because the eSRAM and DMAs complicate development.
It is even less complicated than this: PS4 will get better fps and/or better resolutions in the same multiplatform titles. A developer doesn't have to do jack shit for that to happen; it simply comes with the faster GPU and memory bandwidth. Then there's the supposed additional 2GB of RAM available to games on PS4, which can lead to better textures and less pop-in, and this is very simple to implement too. And after that come more complex shaders and effects possible with a faster GPU.

I'm pretty sure that the difference will be quite visible in basically every multiplatform game in one way or another.
 

twobear

sputum-flecked apoplexy
The ESRAM is by default more complex than the PS4 setup. The key is how automated gaining its benefits is. If it works like a cache then it takes no effort to use, but then you also don't have much control over it. Alternatively, if you want more control to get better results, you need to put more effort in. And by your own comments, multiplatform devs will go for the easy route where possible. It's perfectly possible that if the Xbox One's memory system is in any way complex to use, some devs simply won't bother, because the publishers won't pay for the time needed.

My understanding of the ESRAM is that it is, for all intents and purposes, exactly like the EDRAM of the 360, but with fewer restrictions. Developers will use it in exactly the same way. Sure, some first-party developers might do funny things with it (hopefully nothing as fucking dumb as Bungie's Halo 3 HDR implementation), but the overwhelming majority will just use it for a framebuffer.

This is barely any more complex than the PS4's setup as far as I know. I don't know why people are making out like it's some vast gap in complexity and difficulty. Honestly a lot of it reads like 'it's not good enough that I should succeed—others must fail'. They're both going to be very easy for developers to work with.

It is even less complicated than this: PS4 will get better fps and/or better resolutions in the same multiplatform titles. A developer doesn't have to do jack shit for that to happen; it simply comes with the faster GPU and memory bandwidth. Then there's the supposed additional 2GB of RAM available to games on PS4, which can lead to better textures and less pop-in, and this is very simple to implement too. And after that come more complex shaders and effects possible with a faster GPU.

I'm pretty sure that the difference will be quite visible in basically every multiplatform game in one way or another.

Precisely this. You don't have to imagine that developers will spend more time and effort on the PS4 version; the difference will be obvious straight away even if they spend exactly the same time and effort.
 
That was my big reality check on wishful thinking. I see so many people construct these insane conspiracy theories either in favor of their console of choice or against the ones they don't like... or even fear, like that MS is going to pay developers to make shitty PS4 ports - and I laugh. I laugh really hard. But once upon a time I was those dudes and weezer crushed my soul.
I actually really enjoyed reading this post, because I feel we've all done something similar at some point. Sometimes you just want something to be true so badly that you lie to yourself.

It can be fun at times though. Every time a new update gets released on Steam, everyone's combing through the data trying to uncover an ARG for Half-Life 3. They'll piece bits of data together to make a coherent argument and I'll almost be on the bandwagon.

But sometimes you just have to step back and think 'am I only believing this because I want to?'.
 

Perkel

Banned
This is barely any more complex than the PS4's setup as far as I know. I don't know why people are making out like it's some vast gap in complexity and difficulty. Honestly a lot of it reads like 'it's not good enough that I should succeed—others must fail'. They're both going to be very easy for developers to work with.

Because the PS4 has a UMA, after you load things into RAM there is no other work needed to use that data. It is as vast as you can get.

After the eDRAM in the 360 and the generally split memory systems of last gen, this is not an issue for many devs, but the PS4's is a far better solution than split RAM.
 

twobear

sputum-flecked apoplexy
Because the PS4 has a UMA, after you load things into RAM there is no other work needed to use that data. It is as vast as you can get.

After the eDRAM in the 360 and the generally split memory systems of last gen, this is not an issue for many devs, but the PS4's is a far better solution than split RAM.

360 was always described as having a UMA too, because the EDRAM was only ever used to store the framebuffer (as I recall, the ROPs had to write the framebuffer to the EDRAM); both the CPU and GPU had access to the same 512MB of RAM. For all intents and purposes the 360 was a UMA, and unless there's some major reason why it's different now, Xbone is too.

Split memory (a la PS3) was a problem because you were severely constrained by what you could put in RAM: your GPU could never access more than 256MB of data and your CPU could not have access to more than 256MB of data either, without passing through the other. That is not a problem with Xbone: both have access to the same 5GB pool.
 
I don't think the two situations are comparable. Xbox 360 had ~4% more RAM for games than PS3, at the same speed, AND additional eDRAM. Xbox One has ~29% less RAM for games, at ~39% of the speed of the PS4's RAM, and additional eSRAM. Having an eDRAM/eSRAM configuration is not a problem when you already have more RAM at the same speed than the competition, because then it's a bonus. But it probably becomes a small annoyance when you have less RAM at a lower speed than the competitor.
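The percentages above are easy to reproduce (a sketch of my own; the 5GB/7GB game-available figures and the 68 vs. 176 GB/s bandwidths are the commonly cited numbers of the time, not stated in the post itself):

```python
# Reconstructing Gemüsepizza's percentages from the commonly cited specs.

def percent_less(a: float, b: float) -> int:
    """How much smaller a is than b, as a rounded percentage."""
    return round((1 - a / b) * 100)

print(percent_less(5, 7))     # XB1 game-available RAM vs PS4 -> ~29 (% less)
print(round(68 / 176 * 100))  # DDR3 vs GDDR5 bandwidth       -> ~39 (% of PS4's speed)
```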
 

twobear

sputum-flecked apoplexy
Gemüsepizza;69323561 said:
I don't think the two situations are comparable. Xbox 360 had ~4% more RAM for games than PS3, at the same speed, AND additional eDRAM. Xbox One has ~29% less RAM for games, at ~39% the speed of the PS4's RAM, and additional eSRAM. Having an eDRAM/eSRAM configuration is not a problem when you already have more RAM at the same speed than the competition, it's a bonus. But it probably becomes a small annoyance when you have less RAM at a lower speed than the competitor.

Why is this 'Xbone will be harder to work with' rather than merely 'Xbone is less powerful', though? They'll just cut effects that the Xbone can't manage.

Put it another way, is Wii U harder to program than the Xbone just because it's weaker?

[edit] Sorry, of course it makes it harder. I mean prohibitively harder rather than marginally so.
 

Perkel

Banned
360 was always described as having a UMA too, because the EDRAM was only ever used to store the framebuffer (as I recall, the ROPs had to write the framebuffer to the EDRAM); both the CPU and GPU had access to the same 512MB of RAM. For all intents and purposes the 360 was a UMA, and unless there's some major reason why it's different now, Xbone is too.

Split memory (a la PS3) was a problem because you were severely constrained by what you could put in RAM: your GPU could never access more than 256MB of data and your CPU could not access more than 256MB of data either, without passing through the other. That is not a problem with Xbone: both have access to the same 5GB pool.


My point with UMA was that the PS4 has only one pool of memory shared between CPU and GPU, no eDRAM or anything else. After you load things into it, you don't need to do anything more to use them. That is a far better solution than a pool of memory plus eDRAM/eSRAM. Not only is the memory config faster, it is also easier to use.

Cerny, in the last "Road to PS4" talk, said they had two choices: the 256-bit GDDR5 memory which is currently in the PS4, or 128-bit-bus GDDR5 memory (88GB/s) with a small eDRAM pool @ 1000GB/s.

They didn't choose the second because it added complexity.
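For reference, the two main-memory figures follow directly from bus width (a sketch of my own; the 5.5 GT/s effective data rate is an assumption, chosen because it yields the PS4's known 176GB/s and the 88GB/s figure quoted above):

```python
# Peak bandwidth = bytes transferred per cycle * effective transfer rate.

def bandwidth_gbs(bus_bits: int, gtps: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and data rate (GT/s)."""
    return bus_bits / 8 * gtps

print(bandwidth_gbs(256, 5.5))  # the chosen PS4 config    -> 176.0
print(bandwidth_gbs(128, 5.5))  # the rejected alternative -> 88.0
```

Halving the bus halves the bandwidth, which is why the rejected option needed the tiny 1000GB/s pool to compensate, and that pool is exactly the complexity Cerny says they avoided.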
 

twobear

sputum-flecked apoplexy
My point with UMA was that the PS4 has only one pool of memory shared between CPU and GPU, no eDRAM or anything else. After you load things into it, you don't need to do anything more to use them. That is a far better solution than a pool of memory plus eDRAM/eSRAM. Not only is the memory config faster, it is also easier to use.

Cerny, in the last "Road to PS4" talk, said they had two choices: the 256-bit GDDR5 memory which is currently in the PS4, or 128-bit-bus GDDR5 memory (88GB/s) with a small eDRAM pool @ 1000GB/s.

They didn't choose the second because it added complexity.

As interesting as the Cerny videos are, they are PR. They might just as well have not done it because, as we have seen from Xbone, EDRAM lowers yields and pushes up cost. I would prefer to hear from someone with experience whether EDRAM causes serious headaches instead of the chief architect of the console's main competitor.
 

dr_rus

Member
As interesting as the Cerny videos are, they are PR. They might just as well have not done it because, as we have seen from Xbone, EDRAM lowers yields and pushes up cost. I would prefer to hear from someone with experience whether EDRAM causes serious headaches instead of the chief architect of the console's main competitor.
Well, it is more difficult to work with two RAM pools of different bandwidths and sizes. The ESRAM of the XBO is the only reasonable pool to contain the framebuffer, but at the same time 32MB is not that much for 1080p with deferred shading and all. So you'll probably have to tile the frame, which adds complexity to the renderer and increases timeframes.
I think it's safe to assume, though, that using ESRAM on XBO is nowhere near as difficult as using the Cell's SPUs to the fullest of their potential. So it is more difficult than the straight 8GB of GDDR5 on a 256-bit bus in the PS4, but it's not that difficult at all for anyone who has been programming engines for the PS3 and 360 for the last ten years.
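The tiling point can be put in numbers (my own sketch; the G-buffer layout of four 32-bit render targets plus a 32-bit depth buffer is a hypothetical but typical deferred-shading setup, not something stated in the thread):

```python
# Does a 1080p deferred G-buffer fit in the XBO's 32MB of ESRAM?

ESRAM_MB = 32

def target_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one render target in MB."""
    return width * height * bytes_per_pixel / (1024 * 1024)

one_target = target_mb(1920, 1080, 4)  # ~7.9 MB per 1080p 32-bit target
gbuffer = 5 * one_target               # 4 color targets + depth: ~39.6 MB

print(round(gbuffer, 1), gbuffer > ESRAM_MB)  # over budget, so tile the frame
```

Even this modest layout overshoots 32MB, which is why tiling (or dropping resolution/targets) comes up at all.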
 

astraycat

Member
As interesting as the Cerny videos are, they are PR. They might just as well have not done it because, as we have seen from Xbone, EDRAM lowers yields and pushes up cost. I would prefer to hear from someone with experience whether EDRAM causes serious headaches instead of the chief architect of the console's main competitor.

The Xbone didn't go the EDRAM route either; it went with ESRAM, which is cheaper, lower bandwidth, and easier to manufacture.

The alternate route Cerny was describing would probably have been more expensive than the current layout, but the bandwidth to that little pool would have been astounding. I doubt it'd be more useful than high bandwidth to main memory, and it'd certainly be harder to optimize for.
 
I see so many people construct these insane conspiracy theories either in favor of their console of choice or against the ones they don't like... or even fear, like that MS is going to pay developers to make shitty PS4 ports - and I laugh. I laugh really hard. But once upon a time I was those dudes and weezer crushed my soul.

The people who actually make up these rumors are doing it for no reason other than the shiggles. It's the sheep who believe and share these rumors that need to be blamed.
 

Biker19

Banned
I suppose this fits here since I can't find a dedicated thread when talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?

PS4 is going to be worse AT GPU Boolean comparison operations while other systems can be multithreaded to aid Logic. This may hurt the PS4's use of Tessellation since boolean operations will be slower than Wii U and Xbox 1. It also has a weak CPU. It may not be able to handle many hardware lights ranging anywhere from Wii U to Last Gen. It just uses shader lights which don't have Alpha Channel correction. When a light shines on a ambient reflective object it only reflects back certain colors hiding the rest. PS4's lack of Edram means that the Char persision storage of shader information in RAM will be equivalent to Xbox One.

Booleans increase the number of header files to store their memory address and PS4 has huge memory banks think the location of one fish in a large ocean. PS4 has to store this address on the system to be read by memory controllers. This burden is much less for Embedded RAM which Wii U and Xbox one have which is much smaller in size and can be cycled for large storage. That means boolean dependent GPU operations will be worse. With all the boolean comparisons in tessellation to determine whether the structure is ABA or ABB or BAA the PS4 won't be able to handle as long tessellated strings. At 8bits Char persision Header files will reduce the Ram speed to Xbox One levels. PS4 has yet to show many hardware lights in Graphically intensive games it only has shader lights probably due to limitations which look unnatural.

It's not a myth that DDR5 is worse than DDR3 at the same speed.
 

astraycat

Member
I suppose this fits here since I can't find a dedicated thread when talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?

This poster clearly has no idea what he's talking about. Nothing he said remotely made sense.
 

malfcn

Member
Sony got lucky with GDDR5; they were planning on only 4GB.
The prices fell, and devs asked for more, so they bumped the specs.
With the original plan, everyone would be talking about how MS will crush Sony.

I hope both consoles do very well. Competition breeds excellence.
Honestly, I am an Xbox consumer right now. Will the difference in power yield the "wild advantages" people are already proclaiming? We'll have to see, but I hope it's not a massacre.
 

TheCloser

Banned
I suppose this fits here since I can't find a dedicated thread when talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?

This post makes no sense whatsoever. What the hell are "hardware lights"? Anyway, the slow CPU is not a concern because, according to Guerrilla, it's being used to schedule tasks for the GPU, and GPU compute is being used for some traditional CPU functions. So you can go to the site which cannot be named and tell that poster to troll elsewhere. Fortunately for us, the people who designed the hardware are smarter than that poster and took steps to alleviate several potential issues.
 
It's always a joy to hear about someone somewhere else who doesn't have any idea what they're talking about, especially when there's no link so I can't go tell them how stupid they are.
 

astraycat

Member
I assume that when the poster mentioned booleans, he was referring to branch prediction on the CPU.

I doubt the poster even knows what branch prediction is. He starts off by saying "GPU booleans", as if memory architecture matters for something like that. Then some randomness about tessellation.

Then the next quote has something about "header files to store their memory addresses" -- that's complete and utter bunk. It's like he found some programming terms and jumbled them together to try to form an argument. The rest of the post is the same way.
 
Sony got lucky with GDDR5; they were planning on only 4GB.
The prices fell, and devs asked for more, so they bumped the specs.
With the original plan, everyone would be talking about how MS will crush Sony.

I hope both consoles do very well. Competition breeds excellence.
Honestly, I am an Xbox consumer right now. Will the difference in power yield the "wild advantages" people are already proclaiming? We'll have to see, but I hope it's not a massacre.

Considering that under the original 4GB model the PS4's GPU was still 50% more powerful, with even faster GDDR5 RAM, probably not.
 

stryke

Member
Sony got lucky with GDDR5; they were planning on only 4GB.
The prices fell, and devs asked for more, so they bumped the specs.
With the original plan, everyone would be talking about how MS will crush Sony.

I hope both consoles do very well. Competition breeds excellence.
Honestly, I am an Xbox consumer right now. Will the difference in power yield the "wild advantages" people are already proclaiming? We'll have to see, but I hope it's not a massacre.

Yet Timothy Lottes (the FXAA creator) did a blog post on why he still favoured the PS4 with its 4GB setup. Clearly you don't know shit.
 

badb0y

Member
PS4 is going to be worse AT GPU Boolean comparison operations while other systems can be multithreaded to aid Logic. This may hurt the PS4's use of Tessellation since boolean operations will be slower than Wii U and Xbox 1. It also has a weak CPU. It may not be able to handle many hardware lights ranging anywhere from Wii U to Last Gen. It just uses shader lights which don't have Alpha Channel correction. When a light shines on a ambient reflective object it only reflects back certain colors hiding the rest. PS4's lack of Edram means that the Char persision storage of shader information in RAM will be equivalent to Xbox One.
Booleans increase the number of header files to store their memory address and PS4 has huge memory banks think the location of one fish in a large ocean. PS4 has to store this address on the system to be read by memory controllers. This burden is much less for Embedded RAM which Wii U and Xbox one have which is much smaller in size and can be cycled for large storage. That means boolean dependent GPU operations will be worse. With all the boolean comparisons in tessellation to determine whether the structure is ABA or ABB or BAA the PS4 won't be able to handle as long tessellated strings. At 8bits Char persision Header files will reduce the Ram speed to Xbox One levels. PS4 has yet to show many hardware lights in Graphically intensive games it only has shader lights probably due to limitations which look unnatural.

It's not a myth that DDR5 is worse than DDR3 at the same speed.

What is this? I don't even.
 

badb0y

Member
OK, so I went to the link and it just looks like some idiot pulling crap out of his ass.

If you want a better understanding of GDDR5 vs DDR3 in consoles you guys should check out the posts by dividedbyzero on this link:
http://www.techspot.com/community/t...-between-ddr3-memory-and-gddr5-memory.186408/

I think he does a good job of explaining it and I just don't feel like writing up a wall of text right now.

Come to think of it, I think there's a person on these forums with the same tag, I wonder if it's the same guy lol.
 
I suppose this fits here since I can't find a dedicated thread when talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?

It reads like complete bullshit, he might as well have tried to convince you that he rides a fucking unicorn.
 

Perkel

Banned
I suppose this fits here since I can't find a dedicated thread when talking about PS4's hardware, but what do you make of what I found from one poster on another site concerning PS4's RAM?

Oh my. Reads like something written by a B3D junior.
 

Averon

Member
It's so sad to see B3D forums reduced to the state it is in. It used to be a great site for technical discussion, even for those who didn't understand a lot of it.
 

Raymo

Member
Yeah I don't know what the fuck he is talking about. Boolean comparisons? Booleans are fucking true/false values.

Boolean comparisons return a true/false value.


Is this greater than that - true or false?
Is this equal to that - true or false?
Is this less than that - true or false?

Etc.

His post is still nonsense though.
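Raymo's point, as a literal sketch (a boolean comparison is just an expression that evaluates to true or false; nothing about memory architecture enters into it):

```python
# The three comparisons Raymo lists, each producing a boolean.

a, b = 3, 7

print(a > b)   # "is this greater than that" -> False
print(a == b)  # "is this equal to that"     -> False
print(a < b)   # "is this less than that"    -> True
```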
 
[image: z3hWjVI.png]

Whoever the copywriter assigned to this needs a raise lmao.
 
I see so many people construct these insane conspiracy theories either in favor of their console of choice or against the ones they don't like... or even fear, like that MS is going to pay developers to make shitty PS4 ports - and I laugh.
The fan-drivel is one of the highlights of GAF, especially when new hardware is on the horizon.
 