
X1 DDR3 RAM vs PS4 GDDR5 RAM: “Both Are Sufficient for Realistic Lighting” (Geomerics)

FINALBOSS

Banned
With HMC the bandwidth feeding mobile SoCs will far outstrip that delivered by GDDR5; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side: performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.

I think this will have to be a 5yr cycle this time around since mobile will start to give comparable performance within 3-4yrs.

That might as well have been in Chinese.
 
With HMC the bandwidth feeding mobile SoCs will far outstrip that delivered by GDDR5; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side: performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.

I think this will have to be a 5yr cycle this time around since mobile will start to give comparable performance within 3-4yrs.

We've already seen part of it. The Vita has a SiP with a 512-bit memory interface for its VRAM. 3D stacking is here.
 

TheCloser

Banned
If anyone refers to the PS4 as a supercomputer, they clearly have questionable logic.
Cerny went to great lengths to show how they went with time to triangle and a more straightforward architecture, with small (but effective) tweaks here and there to improve efficiency. He even mentioned eDRAM as a more exotic approach, but it would take developers too long to get to grips with.

I massively respect his approach of an effectively KISS design and I hope it bears real fruit, as the PS3 to me was the shining example of an attempt to be a supercomputer falling at the first hurdle.

Sorry, my mistake, I meant "supercharged" computer. Just quoting Cerny.
 

avaya

Member
8 was the last rumor I heard, but it hasn't been confirmed. I'm assuming at least 4 for each.

Cerny said 64 compute queues in several interviews: each ACE can handle 8 queues, and the PS4 has 8 ACEs. It's not a rumour. You have zero basis for thinking the Bone has more than the standard GCN count of 2. Considering the lackadaisical design, I don't think it has more than 2 either.
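
For what it's worth, the arithmetic is just queues-per-ACE times ACE count -- a quick sketch using only the figures already in this thread, nothing official beyond that:

Code:
# Compute-queue math for GCN-style GPUs, using the figures discussed in this thread
def compute_queues(num_aces, queues_per_ace=8):
    return num_aces * queues_per_ace

print(compute_queues(8))  # PS4: 8 ACEs x 8 queues = 64 compute queues
print(compute_queues(2))  # standard GCN figure cited above: 2 ACEs = 16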
 

MORT1S

Member
They could release the devkits. Isn't that similar to what happened with the 360 and its last-minute bump in RAM? They went with the devkit motherboard or something?

I am fairly certain that was the case. The 1 GB kit didn't hit for a while after launch.

Isn't that why the JTAG exploit was possible on earlier boards? Maybe I am remembering wrong.
 

Perkel

Banned
With HMC the bandwidth feeding mobile SoCs will far outstrip that delivered by GDDR5; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side: performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.

I think this will have to be a 5yr cycle this time around since mobile will start to give comparable performance within 3-4yrs.

This generation is already too long; waiting even more doesn't make any sense.
 
Cerny said 64 compute queues in several interviews: each ACE can handle 8 queues, and the PS4 has 8 ACEs. It's not a rumour. You have zero basis for thinking the Bone has more than the standard GCN count of 2. Considering the lackadaisical design, I don't think it has more than 2 either.

Thanks for the correction. I will be very surprised if the Xbone only has 2 ACEs (as GCN has at least 4, going by the rumors), and calling the design lackadaisical is laughable. It's bigger, has more transistors, and by all appearances more custom circuitry. Just because it's targeting a less aggressive TF number doesn't mean it's shit. Different budgets, different design goals.
 

TheCloser

Banned
With HMC the bandwidth feeding mobile SoCs will far outstrip that delivered by GDDR5; 2013 is really bad timing for next-gen. We are on the cusp of a paradigm shift on the semiconductor side: performance is not going to increment but explode across the entire range of TDPs. Intel finally getting serious about mobile will aid that ramp.

I think this will have to be a 5yr cycle this time around since mobile will start to give comparable performance within 3-4yrs.

Interesting, but I would shy away from predicting that mobile SoCs will give comparable performance in 3-4 years. I'm not saying it's not possible, but let's just say I have my doubts. I will take a wait-and-see approach, and if you are correct, I will be the first to applaud you.
 

Perkel

Banned
Thanks for the correction. I will be very surprised if the Xbone only has 2 ACEs (as GCN has at least 4, going by the rumors), and calling the design lackadaisical is laughable. It's bigger, has more transistors, and by all appearances more custom circuitry. Just because it's targeting a less aggressive TF number doesn't mean it's shit. Different budgets, different design goals.

I think he meant in comparison to the PS4 design. Sony got lucky with 8GB of GDDR5, whereas for MS 8GB was a requirement for their vision. At the time of design only 8GB of DDR3 was possible, so they designed their console with that in mind. ESRAM is just one of the side effects of that design.

I have no doubt that if MS could backtrack they would also go with 8GB of GDDR5 instead of what is in the Xbone.

"Realistic lighting" is compute intensive. This story makes no sense.

You are not the only one.
 

avaya

Member
5yrs is the perfect refresh cycle. Also these won't be the last consoles IMO, so we will see the stacking dream come to fruition.

The "demise of the console" myth is the result of popular myopia about trends in technology. Furthermore, broadband penetration and average reliable connection speeds plus latency with unlimited transfer will not be widespread enough (by widespread I mean >70% of the population) in the majority of markets, which makes the pure cloud/streaming model a bit of a fantasy as the primary way of enjoying next-next gen.

FTTH (not the VDSL/VDSL+Vectoring variety) is cost-prohibitive. VDSL is just shit past 250 metres, and the explosion in data use will eat up bandwidth, especially considering LTE/LTE R10 backhaul will rely so heavily on fibre. Cable is good, but again in many markets it will face issues with scope and availability.
 
I think he meant in comparison to the PS4 design. Sony got lucky with 8GB of GDDR5, whereas for MS 8GB was a requirement for their vision. At the time of design only 8GB of DDR3 was possible, so they designed their console with that in mind. ESRAM is just one of the side effects of that design.

I have no doubt that if MS could backtrack they would also go with 8GB of GDDR5 instead of what is in the Xbone.



You are not the only one.

Then the designs would have likely been the same. That would have been remarkably boring.
 

TheCloser

Banned
"Realistic lighting" is compute intensive. This story makes no sense.

This is true; unfortunately, that is not the focal point of this thread. It has simply turned into a PS4 vs Xbox One specs comparison thread. I don't think anyone even paid attention to the lighting aspect. I for one did not. Just saw 5GB vs 7GB and the debate started.
 
One last B3D cross post, it's a doozy:

http://beyond3d.com/showpost.php?p=1764168&postcount=4728

COPS N RAPPERS said:
Stringing together fragments of theories for the Xb1 and how much of it has surfaced.

SuperDaE -
[image: s6pKi2r.jpg]

The interesting thing about SuperDaE's information is that it's based on the display model(s), which were confirmed to be old (possibly dating back to the beginning of this year).

Pre-E3
[image: DSUCUoW.png]

http://youtu.be/ifa9Q7ATfVA?t=1m50s

NeoGaf/Xbox-Scene - (pre-May 21st E3 incident)
[image: fq4lAoM.jpg]


which were confirmed to be Xb1 devkits -

PR Manager Galit Motai -

The interesting thing is that both sources got the GPU/CPU and quantity of memory on the devkits correct, but both got the speeds of the memory incorrect.


http://www.eurogamer.net/articles/digitalfoundry-xbox-one-memory-better-in-production-hardware

Going by Digital Foundry's new information, the Xb1 is said to be making a couple more silicon revisions before launch. So I'm guessing "((interference))"'s statement would further back up the E3 incident.

These people are actually deluded.
 

androvsky

Member
One last B3D cross post, it's a doozy:

http://beyond3d.com/showpost.php?p=1764168&postcount=4728

These people are actually deluded.

ShiftyGeezer is editing out the "rumor" stuff; I can't believe he (and almost everyone else) is being so kind to it. That reads like a Half-Life 3 ARG deconstruction.

And they're still taking the RAM bump seriously. Is there a higher-capacity DDR3 chip on offer that I'm not aware of, or are they talking about a total redesign of the Xbox One's memory architecture after the May 21 announcement?
 
I also tend to think that 5GB will become an issue later in the generation - especially when your opponent has 2GB more of faster memory.

btw: How much RAM was common in PCs when 360/PS3 were revealed?

Nothing is stopping them from making a "refresh" 2 years down the road, like the S. Didn't that have a faster processor?

Everyone is assuming that they're just going to stick with this exact model for 10 years; they won't.
 
I look at the console Crysis 3 or GTA V, among dozens of other great-looking games that had a pool of 512MB to draw from, and they did just fine.

Moving from 512x512 textures as a base to 2048x2048 textures is a 16x increase in the texture space required, so 256MB of average VRAM becomes 4GB. I can see 2048 maps being the highest worth going to on most surfaces, as texture resolution is more or less bound to the 1080p resolution of the display.

That is the texture res we are using as the default moving forward in our projects.
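
Quick back-of-the-envelope for anyone who wants to check the 16x figure (the 256MB baseline is just the assumed average texture budget from last gen, per the numbers above):

Code:
# Texture memory scales with the square of the per-axis resolution increase.
base_res, new_res = 512, 2048
scale = (new_res / base_res) ** 2        # (2048/512)^2 = 16x
last_gen_budget_mb = 256                 # assumed average VRAM spent on textures last gen
print(scale)                             # 16.0
print(last_gen_budget_mb * scale)        # 4096.0 MB, i.e. roughly 4GB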
 

ekim

Member
I look at the console Crysis 3 or GTA V, among dozens of other great-looking games that had a pool of 512MB to draw from, and they did just fine.

Moving from 512x512 textures as a base to 2048x2048 textures is a 16x increase in the texture space required, so 256MB of average VRAM becomes 4GB. I can see 2048 maps being the highest worth going to on most surfaces, as texture resolution is more or less bound to the 1080p resolution of the display.

That is the texture res we are using as the default moving forward in our projects.

So to some degree you're doing the textures for the X1, and the PS4 suffers from this because you won't do extra assets for the PS4 version?
joking
 

astraycat

Member
Nothing is stopping them from making a "refresh" 2 years down the road, like the S. Didn't that have a faster processor?

Everyone is assuming that they're just going to stick with this exact model for 10 years; they won't.

You sort of have to. Console games are built with the restrictions and hardware quirks in mind -- you can't just change things in the hardware and expect old games to still work as they did. There may be shrinks to save cost, but they're going to ensure that all behavior is the same.

The only reason you can get away with stuff like this on the PC is because of how comparatively heavy the OS layer is -- it abstracts everything for you. On the GPU side shaders are all compiled at run-time (even offline compilations will just compile down to DX ASM or some similar IL, which are then compiled by the driver to actual GPU machine code).

Let's take your faster clock, for example -- games may hard-code the clock speed. If the clock were to suddenly speed up, all the simulations based on CPU cycle timing would now run faster, which leads to everything going out of sync in-game.
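
A toy sketch of the kind of breakage being described -- any timing derived from a hard-coded cycle count silently changes if the clock is bumped (the 1.6GHz and 1.75GHz figures here are purely illustrative):

Code:
# A game written against a fixed clock bakes that assumption into its timing.
HARDCODED_HZ = 1_600_000_000               # clock the game was tuned for (illustrative)

def frame_budget_cycles(target_fps=30, clock_hz=HARDCODED_HZ):
    return clock_hz // target_fps          # cycles the game assumes fit in one frame

budget = frame_budget_cycles()             # 53,333,333 cycles per intended 33.3ms frame

actual_hz = 1_750_000_000                  # hypothetical "refreshed" console clock
real_frame_ms = budget / actual_hz * 1000
print(real_frame_ms)                       # ~30.5ms -- the cycle-timed simulation now runs fast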
 

Freki

Member
Nothing is stopping them from making a "refresh" 2 years down the road, like the S. Didn't that have a faster processor?

Everyone is assuming that they're just going to stick with this exact model for 10 years; they won't.

Sorry - but you clearly do not know what you are talking about.
HW upgrades will not happen until a new Xbox (or PS) generation is launched.
 
I look at the console Crysis 3 or GTA V, among dozens of other great-looking games that had a pool of 512MB to draw from, and they did just fine.

Moving from 512x512 textures as a base to 2048x2048 textures is a 16x increase in the texture space required, so 256MB of average VRAM becomes 4GB. I can see 2048 maps being the highest worth going to on most surfaces, as texture resolution is more or less bound to the 1080p resolution of the display.

That is the texture res we are using as the default moving forward in our projects.

Thanks for this post; it helps quite a bit in explaining how the memory might be used.
 
You sort of have to. Console games are built with the restrictions and hardware quirks in mind -- you can't just change things in the hardware and expect old games to still work as they did. There may be shrinks to save cost, but they're going to ensure that all behavior is the same.

The only reason you can get away with stuff like this on the PC is because of how comparatively heavy the OS layer is -- it abstracts everything for you. On the GPU side shaders are all compiled at run-time (even offline compilations will just compile down to DX ASM or some similar IL, which are then compiled by the driver to actual GPU machine code).

Let's take your faster clock, for example -- games may hard-code the clock speed. If the clock were to suddenly speed up, all the simulations based on CPU cycle timing would now run faster, which leads to everything going out of sync in-game.

I understand, but still, if the N64 could add more RAM via an Expansion Pak, I don't see why a next-gen console couldn't do something like that either, as long as it's designed that way.
 

CoG

Member
I understand, but still, if the N64 could add more RAM via an Expansion Pak, I don't see why a next-gen console couldn't do something like that either, as long as it's designed that way.

Fortunately, we have a product for people who want to upgrade specs routinely; it's called a PC.
 

xxracerxx

Don't worry, I'll vouch for them.
Nothing is stopping them from making a "refresh" 2 years down the road, like the S. Didn't that have a faster processor?

Everyone is assuming that they're just going to stick with this exact model for 10 years; they won't.

Sorry everyone who bought a launch console, you can no longer play a game built for that system unless you buy a whole new refreshed console! Aahhhh
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Nothing is stopping them from making a "refresh" 2 years down the road, like the S. Didn't that have a faster processor?

No. Consoles don't upgrade specs; even the latest PS3 uses the same 2x BD drive speed to keep compatibility. The 360 die shrinks went through a lot of pain to keep the timings the same as the 90nm parts.
 
You're not going to see any major differences between the two systems FOR NOW.

I say that because the PS4 does have 50% more shader cores than the Xbone, so that's definitely something that will show up later down the road.

I say FOR NOW because we have to remember that we're coming from a gen that had less than 512MB of RAM available for devs... I'll say that again: LESS THAN 512MB OF RAM AVAILABLE FOR DEVS...

That's an insanely low amount for 2013.

Devs in the first few years will be getting their first games out, and most of these aren't going to take full advantage of the hardware at all because they're coming from 8-year-old tech.

Once engines start adapting and devs start retooling their work, we should see a difference between the two, mainly on exclusives, because multi-platform devs aren't going to push the boundaries on one and not the other. They'll just make a game that is comparable on both systems, and maybe the PS4 will have a higher native res or a smoother frame rate.

Wait till 2015-16, when devs are able to get their second game out. That's when we'll truly see what this gen is about.

So what the guy in the OP is talking about is pretty much true. The jump in power vs the current gen far outweighs the difference in RAM types.
 
You're not going to see any major differences between the two systems FOR NOW.

I say that because the PS4 does have 50% more shader cores than the Xbone, so that's definitely something that will show up later down the road.

I say FOR NOW because we have to remember that we're coming from a gen that had less than 512MB of RAM available for devs... I'll say that again: LESS THAN 512MB OF RAM AVAILABLE FOR DEVS...

That's an insanely low amount for 2013.

Devs in the first few years will be getting their first games out, and most of these aren't going to take full advantage of the hardware at all because they're coming from 8-year-old tech.

Once engines start adapting and devs start retooling their work, we should see a difference between the two, mainly on exclusives, because multi-platform devs aren't going to push the boundaries on one and not the other. They'll just make a game that is comparable on both systems, and maybe the PS4 will have a higher native res or a smoother frame rate.

Wait till 2015-16, when devs are able to get their second game out. That's when we'll truly see what this gen is about.

So what the guy in the OP is talking about is pretty much true. The jump in power vs the current gen far outweighs the difference in RAM types.

Agreed. Especially with all these cross-gen games out there. This transition is going to be different from last time because of the bloated budgets. The cross-gen games are going to be around longer than last time, maybe another 2-3 years, because the publishers can't afford to ignore the large install bases. Unfortunately this is going to hinder next-gen game development and scope.

But I also agree it's going to take a while till devs fully exploit all of the PS4's resources and advantages. If they can't even max out an XB1 right now, how would they be able to use the extra advantage offered by the PS4?
 

RoboPlato

I'd be in the dick
I'd also like to know as well.
It's been rumored for a while. I'd say it's a safe bet. It's exactly double the game/OS reserves from before the RAM upgrade. That much RAM is a ton for games and it'll allow them to add a lot of OS features compared to before.
 
So wait, there are seriously people who still believe the XB1 is getting a GPU upgrade? This late in the game? Four months from launch? How are people even logically proposing this? Are they off in la-la land?
 
I understand, but still, if the N64 could add more RAM via an Expansion Pak, I don't see why a next-gen console couldn't do something like that either, as long as it's designed that way.

The reason you haven't seen anyone try things like RAM expansion packs since the N64 is that it doesn't work.

It didn't take long before games were released that were broken without it or refused to run at all, and that pisses off the userbase and fragments it.
 

stryke

Member
So wait, there are seriously people who still believe the XB1 is getting a GPU upgrade? This late in the game? Four months from launch? How are people even logically proposing this? Are they off in la-la land?

Microsoft have so much money they can bend the space-time continuum and upgrade to whatever they want whenever they want. /TheKayle
 

Nilaul

Member
Sooo yo dude, I heard they're gonna pimp the Xbox One; they're essentially gonna add an Xbone inside an Xbone, dual-SLI everything. Heat won't be an issue, as it will be dissipated by a heat sink made out of a bone. The casing of the console will be made out of rare elephant tusk.

lol
 

FourMyle

Member
That game was played live on stage. No one sane said this.

You are factually wrong. I was there the entire day of the conference and I very specifically remember pointing out that in a couple of months I would look back on that day and laugh at the retards who thought the footage was faked.

Use the search function and look around.
 
I wonder what would happen if someone told them that the mainboard design is finalised and all of the supply contracts are already in place. Production is due to start very soon if it hasn't started already. It takes a minimum of three months from the first finished box rolling off the production line in China to it being in stores in the US. To build up enough supply for a simultaneous launch in the US and Europe they will have to start producing them now (it should have started a couple of weeks ago going by a normal production timetable), especially if they want to avoid the same mistakes as last time with QC. I think another RRoD-style fiasco would just kill the Xbox brand.

These deluded idiots won't even believe Microsoft when they announce the system has 8GB of RAM. Access to facts is not their problem.

If anyone refers to the PS4 as a supercomputer, they clearly have questionable logic.
Cerny went to great lengths to show how they went with time to triangle and a more straightforward architecture, with small (but effective) tweaks here and there to improve efficiency. He even mentioned eDRAM as a more exotic approach, but it would take developers too long to get to grips with.

I massively respect his approach of an effectively KISS design and I hope it bears real fruit, as the PS3 to me was the shining example of an attempt to be a supercomputer falling at the first hurdle.

Ironically, Microsoft officially markets the Xbox One as having supercomputer performance:

[image: z3hWjVI.png]
 