
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

vcc

Member
The latency difference is close to zero, since the identified X1 memory chips are Micron "2133 MHz CAS 14" parts, which are not exactly high-end modules.

Translated into real-world latency:
(14 / 1066) x 1000 = 13.1 ns

Some GDDR5 chips have better latency than this. Even at CL 20 it would be close, because the base clock is higher (1375 MHz).
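A quick sanity check of that arithmetic (clock and CAS figures as quoted above; real module timings vary):

```python
def cas_latency_ns(cas_cycles, clock_mhz):
    """Absolute CAS latency: CAS cycles divided by the memory clock."""
    return cas_cycles / clock_mhz * 1000  # cycles / MHz -> ns

ddr3_ns  = cas_latency_ns(14, 1066)  # XB1 DDR3-2133, CAS 14: ~13.1 ns
gddr5_ns = cas_latency_ns(20, 1375)  # GDDR5 at 1375 MHz, CL 20: ~14.5 ns
print(f"{ddr3_ns:.1f} ns vs {gddr5_ns:.1f} ns")
```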

Well then, we can spreadsheet equally well on both.
 
This one.


EDIT: I forgot there's Onion and Onion+.

It's actually 40 GB/s for the CPU across all 3 different buses.

Got it, thanks.
PS4
176GB/s OR 30GB/s (diagram shows 40GB/s?)

XB1
DDR3 68GB/s + ESRAM 109GB/s OR 30GB/s

You don't add the CPU's view of main memory. As far as I know the GPU can't access main memory at full throttle if the CPU is poking around.
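A toy sketch of why those numbers do and don't add up, using the peak figures quoted in this thread (real contention behaviour is far more complicated than this):

```python
# Peak figures quoted in the thread, in GB/s
PS4_GDDR5 = 176   # one unified pool
XB1_DDR3  = 68
XB1_ESRAM = 109
CPU_VIEW  = 30    # the CPU's window onto the SAME pool, not a new bus

# The CPU figure is not additive: it draws from the pool's 176 GB/s,
# so the PS4 total stays at the pool's peak.
ps4_total = PS4_GDDR5

# ESRAM is a physically separate pool, so in principle its bandwidth
# stacks with DDR3 -- but only for the few MB resident in ESRAM.
xb1_combined_peak = XB1_DDR3 + XB1_ESRAM

print(ps4_total, xb1_combined_peak)  # 176 177
```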

Just read your previous post, thanks.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
EDIT: I forgot there's Onion and Onion+.

It's actually 40 GB/s for the CPU across all 3 different buses.

Those two share the same pathway and thus the same bandwidth. It's 10GB/s each way, probably adding up to the 20GB/s Cerny referred to in some interview. Those coherent buses come on top of the other 20GB/s available to the CPU.
 
I'd say 'won't'. It could be used, but it doesn't seem like the XB1 is made to use it as easily as the PS4. Memory bandwidth limits, the need to copy work back and forth to ESRAM, and the lack of much spare GPU resource suggest that if it is used, it'll be to a much lesser extent.

I'm sure they will make their way onto multiplatform games too, but to what extent, no idea. The teams working on exclusives will have been involved with the development of the PS4 for longer, so will have known that the architecture allows for very efficient use of GPGPU that could be planned for. Personally I'd like to see soft body dynamics in Battlefield; it would be outrageous.

So my question is: let's say DICE are making BF5. Will they completely skip the use of soft body dynamics since GPGPU usage is so limited on the XB1? Or will they make a version of the game without it specifically for the XB1? Because now the question isn't a difference in resolution or frame rate. Physics are fundamental to the gameplay.

Edit: the point I'm trying to make is that GPGPU usage appears to me to be a truly next-gen feature. But since one of the next-gen consoles is lacking in that department, I wonder if it will have a negative impact on multiplatform games.
 

vcc

Member
So my question is: let's say DICE are making BF5. Will they completely skip the use of soft body dynamics since GPGPU usage is so limited on the XB1? Or will they make a version of the game without it specifically for the XB1? Because now the question isn't a difference in resolution or frame rate. Physics are fundamental to the gameplay.

I suspect multi-platform game studios will make it cosmetic instead of gameplay-impacting. So in the PC/PS4 version the player and NPCs will seem to physically interact with objects, while on the XB1 they'll do the three-inches-away hand wave they do this gen. You see this in Infamous. Or, if they choose to make a game that uses GPGPU heavily, they'd tone down the image quality and resolution for the XB1 to make room. It'll depend on how close the XB1 install base is. If the 4 PS4:1 XB1 number we hear persists, then multiplats will make a PC/PS4 version and heavily gimp it for the XB1 in a ham-handed fashion. If it's 1 PS4:1 XB1 or better, they'll keep it cosmetic.
 
I suspect multi-platform game studios will make it cosmetic instead of gameplay-impacting. So in the PC/PS4 version the player and NPCs will seem to physically interact with objects, while on the XB1 they'll do the three-inches-away hand wave they do this gen. You see this in Infamous. Or, if they choose to make a game that uses GPGPU heavily, they'd tone down the image quality and resolution for the XB1 to make room. It'll depend on how close the XB1 install base is. If the 4 PS4:1 XB1 number we hear persists, then multiplats will make a PC/PS4 version and heavily gimp it for the XB1 in a ham-handed fashion. If it's 1 PS4:1 XB1 or better, they'll keep it cosmetic.

Makes sense. I'd love to see more developers use voxels and soft body physics, etc.
 

artist

Banned
Since the other wrong parts of your post were taken care of, I'll tackle this bit.
The ps4 has a much better gpu than the xbox one, but both consoles have the same cpu. Doesn't this make the xbox one more balanced? As in mid-end gpu with a mid-end cpu, compared to the ps4 high-end gpu with a mid-end cpu?
No.

A mid-end cpu with a high-end gpu is a FAR better balanced gaming configuration.
 

viveks86

Member
First time poster, long time lurker, here.

This has been one hell of a thread to watch this past month. Only wish I could have participated in it from the start. Can't wait to be a part of the crazy community here. ;p

Welcome! :)

Any GPGPU algorithm that can be implemented on PS4 can also be implemented on XBO. There isn't a fundamental barrier preventing such techniques on either platform. The PS4, however, has the benefit of having significantly more compute resources to spend, and, in addition, has the more efficient architecture (scheduling, cache management).

This.


Since the other wrong parts of your post were taken care of, I'll tackle this bit.

No.

A mid-end cpu with a high-end gpu is a FAR better balanced gaming configuration.

This. The reason being games are GPU intensive and not CPU intensive.
 

Skeff

Member
So my question is: let's say DICE are making BF5. Will they completely skip the use of soft body dynamics since GPGPU usage is so limited on the XB1? Or will they make a version of the game without it specifically for the XB1? Because now the question isn't a difference in resolution or frame rate. Physics are fundamental to the gameplay.

Edit: the point I'm trying to make is that GPGPU usage appears to me to be a truly next-gen feature. But since one of the next-gen consoles is lacking in that department, I wonder if it will have a negative impact on multiplatform games.

I would imagine they'd use a different update rate; a 60fps game doesn't necessarily have to have everything running at 60fps. Perhaps you would get 20fps destruction on PS4 and 10fps destruction on XB1.
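The decoupled-update idea above can be sketched as a simple frame-stride scheduler, where physics ticks at its own fixed rate regardless of the render rate (the 60/20/10 figures are the hypothetical ones from this post, not from any real engine):

```python
def physics_steps(render_fps, physics_fps, frames):
    """How many physics updates run across a span of render frames,
    when physics ticks once every (render_fps // physics_fps) frames."""
    stride = render_fps // physics_fps
    return frames // stride

# Over one second of a 60fps game:
print(physics_steps(60, 20, 60))  # 20 destruction updates (PS4)
print(physics_steps(60, 10, 60))  # 10 destruction updates (XB1)
```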

Also, as vcc says, it really depends on the install base and the software attach rate for third-party games. They have to make money, so they'll do whatever is best for them. The things they'll need to consider are:

Competition - Both first-party and third-party support. If BF5 can get GPGPU soft body dynamics working correctly it would give them yet another edge over CoD, which would be good for DICE. Especially if they are also competing against Halo and Killzone.

Parity - Parity is a word not used very well at the moment. The games need to have "parity" between consoles but not exact parity: both versions need to run well, though discrepancies in resolution and frame rate dips are fine. This is only the case now, while sales are unconfirmed; if the XB1 somehow outsells PS4 comfortably, then the last thing you want is a reputation for making poor XB1 games when all you actually did was make a good PS4 game. After a few years this may change.

Sales on each platform - If PS4 outsells XB1 4:1, then DICE won't care about XB1; PS4 will get full GPGPU systems and XB1 will get a reskinned BF4. If it's 2:1 with an even attach rate, then we'll likely get a GPGPU system on both. But if it's 2:1 and a lot of XB1 owners also have a PS4 and are buying the better multiplat, so that PS4 sales are higher than the 2:1 suggested by pure install base, we may get more GPGPU and be closer to the full-GPGPU-versus-reskinned-game scenario. The handheld/PS2 FIFAs are a good example of this reskin technique for low sales.

Cost - This is something Sony can help with. As we've already discussed, the PS4 exclusives are using GPGPU. If Sony were able and willing, they could help by creating specific APIs for GPGPU physics, which would significantly reduce the cost of implementing them. I have no doubt Sony will be trying to do this to get an edge over the XB1, by providing something to developers that would be difficult to duplicate on the XB1 themselves and would likely require Microsoft's help to create. If these are in place then perhaps we would get GPGPU in BF5, but only on PS4; MS updates the XB1 dev environment after that, and BF6 is GPGPU on both platforms.

There are so many things to consider when implementing features like these on multiplatform games.

EDIT: the parity bit was regarding the difference between exact parity (we won't get that) and broad parity (same game, higher res, higher framerate, better shadows)
 

viveks86

Member
I would imagine they'd use a different update rate; a 60fps game doesn't necessarily have to have everything running at 60fps. Perhaps you would get 20fps destruction on PS4 and 10fps destruction on XB1.

Good post, Skeff! My slightly different theory is that they would have some way to throttle the number of physics calculations being done. So the PS4 could have 100,000 particles with physics in realtime while the Xbox has 50,000. FPS in this case could be the same; just the density would be reduced.
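That throttling could be as simple as scaling a particle budget by each platform's spare compute. A toy sketch using the hypothetical 100,000 vs 50,000 figures above (`compute_ratio` is an illustrative number, not a measured one):

```python
def particle_budget(base_particles, compute_ratio):
    """Scale a baseline particle count by a platform's spare-compute ratio."""
    return int(base_particles * compute_ratio)

# Illustrative only: PS4 as the baseline, XB1 at roughly half the spare
# GPGPU compute (matching the hypothetical 100,000 vs 50,000 above).
ps4_particles = particle_budget(100_000, 1.0)
xb1_particles = particle_budget(100_000, 0.5)
print(ps4_particles, xb1_particles)  # 100000 50000
```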

Also, I should add that we shouldn't look at this on a game-to-game basis. Most of these multiplats run on engines that are improved over time to get the best out of each platform. So if the engine has the feature, those benefits would be seen in most games using that engine (if the dev includes it as part of the game design, of course). So even if GPGPU doesn't make it at launch on some of these engines, it eventually will, since GPU compute will be big in PC gaming as well.
 
Some say the CPU will be a bottleneck for the PS4, and it does make sense.
Not really. If you're CPU-bound on PS4 you're doing it wrong; it means you should have gone GPGPU and offloaded processing to the GPU. By design there's an incentive to go GPGPU on PS4, and if you decide to go against it, well, that's a mistake.
 
Not really. If you're CPU-bound on PS4 you're doing it wrong; it means you should have gone GPGPU and offloaded processing to the GPU. By design there's an incentive to go GPGPU on PS4, and if you decide to go against it, well, that's a mistake.

The whole balance argument MS was making tried to say the same thing: that the beefier GPU of the PS4 is useless because it's not balanced with the weaker CPU, which of course isn't true.
 
Regarding hUMA, all we know is that the PS4 most definitely supports it; the Xbox One might support it, according to this:
From the latest info it seems the One GPU might not be able to write to main memory, which would mean that hUMA would only be possible within the eSRAM.

Regarding the bolded parts, there shouldn't be any texture loading differences comparing GDDR5 vs DDR3, as there is barely any latency difference.
Texture loading is affected by bandwidth, not latency. That means the One is very likely to start showing texture loading issues before the PS4 (though individual games may not have any issues on either machine).
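To see why bandwidth dominates: a texture fetch is one long sequential read, so the one-off CAS latency is negligible next to the transfer time. A rough sketch (the 32MB texture size is illustrative; bandwidths are this thread's peak figures):

```python
def texture_load_ms(size_mb, bandwidth_gbps, latency_ns=13.1):
    """One-off access latency plus the streaming time for the payload."""
    transfer_ms = size_mb / 1024 / bandwidth_gbps * 1000  # GB / (GB/s) -> ms
    return latency_ns * 1e-6 + transfer_ms

ddr3_ms  = texture_load_ms(32, 68)   # XB1 DDR3:  ~0.46 ms
gddr5_ms = texture_load_ms(32, 176)  # PS4 GDDR5: ~0.18 ms
print(f"{ddr3_ms:.3f} ms vs {gddr5_ms:.3f} ms")
```

The ~13 ns latency term contributes well under a thousandth of either total, which is the poster's point.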
 

RayMaker

Banned
For anyone who has been studying this thread: has it given us a clearer picture of what the actual difference will be in games?
 
For anyone who has been studying this thread: has it given us a clearer picture of what the actual difference will be in games?

Depends on the game.

Multiplatform titles could be a crapshoot: who is the lead platform, how much effort is going into the game, etc. The PS4 is never going to look or play worse, but you might struggle to notice the difference in something that's cranked out every year like Madden.

Higher-profile third-party games (say, GTA) with a lot of resources and time devoted to them will be noticeably different, with the edge to PS4: higher resolutions and better framerates, higher-quality textures due to more free memory, etc.

Exclusive third-party games and first-party titles will be significantly and substantially better on PS4.
 

mrklaw

MrArseFace
So my question is: let's say DICE are making BF5. Will they completely skip the use of soft body dynamics since GPGPU usage is so limited on the XB1? Or will they make a version of the game without it specifically for the XB1? Because now the question isn't a difference in resolution or frame rate. Physics are fundamental to the gameplay.

Edit: the point I'm trying to make is that GPGPU usage appears to me to be a truly next-gen feature. But since one of the next-gen consoles is lacking in that department, I wonder if it will have a negative impact on multiplatform games.

Personally, I think GPGPU is going to take off big time next gen, which means on PC as well. It has to, because the CPUs are so relatively underpowered. So as PC engines develop, the PS4 will get better ports than the Xbox One. Even if the PS4 had the same number of CUs as the Xbox One, the extra ACEs give it the potential to use them more efficiently.
 

Snubbers

Member
Adding to that, the latest real-world bandwidth numbers for the PS4 are 172 GB/s, and the latest real-world numbers for the ESRAM are 140 GB/s. We do not have real-world bandwidth numbers for the Xbox One's DDR3 RAM.

The DF interview with MS yielded some actual real-world and theoretical numbers:
ESRAM / DDR3
204 / 68 theoretical
150 / 50-55 in a real application

Having over 200GB/s in a real application setting isn't too bad. Of course, even assuming they put that to good use, it's never going to make up the gap in sheer GPU ability of the PS4. But it's not hard to see that the XB1 at least has some usage patterns that might get it moderate performance; however, as soon as a dev strays from the MS-designed usage pattern, it's going to really struggle. IMHO of course.
 

Skeff

Member
The DF interview with MS yielded some actual real-world and theoretical numbers:
ESRAM / DDR3
204 / 68 theoretical
150 / 50-55 in a real application

Having over 200GB/s in a real application setting isn't too bad. Of course, even assuming they put that to good use, it's never going to make up the gap in sheer GPU ability of the PS4. But it's not hard to see that the XB1 at least has some usage patterns that might get it moderate performance; however, as soon as a dev strays from the MS-designed usage pattern, it's going to really struggle. IMHO of course.

Just to point out, they were maximums not averages.
 

vcc

Member
The DF interview with MS yielded some actual real-world and theoretical numbers:
ESRAM / DDR3
204 / 68 theoretical
150 / 50-55 in a real application

Having over 200GB/s in a real application setting isn't too bad. Of course, even assuming they put that to good use, it's never going to make up the gap in sheer GPU ability of the PS4. But it's not hard to see that the XB1 at least has some usage patterns that might get it moderate performance; however, as soon as a dev strays from the MS-designed usage pattern, it's going to really struggle. IMHO of course.

If you read the interview (the one about spare-cycle extra reads/writes), it's actually:

109GB/s / 68GB/s designed theoretical max

204GB/s / 68GB/s improbable projection of a one-off trick, which they could only get to 133GB/s on a best-case synthetic workload

<109GB/s / <68GB/s in a real-world application

They don't have 200+GB/s theoretical. If devs use the ESRAM well and run it in parallel with main memory, they'll get a restrictive rough parity in bandwidth, with many, many asterisks.
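For reference, the theoretical peaks fall out of bus width times transfer rate (bus widths and clocks here are the commonly reported figures for these consoles; the 204GB/s ESRAM claim additionally assumes simultaneous reads and writes on spare cycles, which is the disputed part):

```python
def peak_gbps(bus_bits, mtransfers_per_s):
    """Peak bandwidth = bus width in bytes x transfers per second."""
    return bus_bits / 8 * mtransfers_per_s / 1000  # MT/s -> GB/s

ddr3  = peak_gbps(256, 2133)   # XB1 DDR3 main memory, ~68 GB/s
gddr5 = peak_gbps(256, 5500)   # PS4 GDDR5, ~176 GB/s
esram = peak_gbps(1024, 853)   # XB1 ESRAM, one direction, ~109 GB/s
print(f"{ddr3:.0f} / {gddr5:.0f} / {esram:.0f} GB/s")
```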
 
I read Cerny talk about having the option of DDR3 + very fast eDRAM that would give him 10x the bandwidth of GDDR5. Instead he chose the simpler 8GB of GDDR5. So could MS have gone with faster embedded RAM? Or maybe even larger? And why didn't they?

Actually it was 8GB of GDDR5 coupled with a small cache of eDRAM (probably 32MB), dropping the bus from 256-bit to 128-bit. He said that they could get bandwidth of over 1TB/s, and that this method was actually cheaper to develop. He also said that companies would need to design "special techniques" in order to unlock the system's true potential. I complained back when he mentioned this, but it actually makes sense. I don't even think the PS4's GPU would be capable of reading 1TB/s. Also, if he went that route the system wouldn't be hUMA-compliant, which is why he said that it might be counter-intuitive, but less (176 unified) is more.
 
I read Cerny talk about having the option of DDR3 + very fast eDRAM that would give him 10x the bandwidth of GDDR5. Instead he chose the simpler 8GB of GDDR5. So could MS have gone with faster embedded RAM? Or maybe even larger? And why didn't they?

My guess is ESRAM will be significantly easier/cheaper come the next die shrink about 12-18 months from now.
 

viveks86

Member
I read Cerny talk about having the option of DDR3 + very fast eDRAM that would give him 10x the bandwidth of GDDR5. Instead he chose the simpler 8GB of GDDR5. So could MS have gone with faster embedded RAM? Or maybe even larger? And why didn't they?

Actually it was 8GB of GDDR5 coupled with a small cache of eDRAM (probably 32MB), dropping the bus from 256-bit to 128-bit. He said that they could get bandwidth of over 1TB/s, and that this method was actually cheaper to develop. He also said that companies would need to design "special techniques" in order to unlock the system's true potential. I complained back when he mentioned this, but it actually makes sense. I don't even think the PS4's GPU would be capable of reading 1TB/s. Also, if he went that route the system wouldn't be hUMA-compliant, which is why he said that it might be counter-intuitive, but less (176 unified) is more.

This. Posting direct quotes from Cerny for reference:

"One thing we could have done is drop it down to 128-bit bus, which would drop the bandwidth to 88 gigabytes per second, and then have eDRAM on chip to bring the performance back up again," said Cerny. While that solution initially looked appealing to the team due to its ease of manufacturability, it was abandoned thanks to the complexity it would add for developers. "We did not want to create some kind of puzzle that the development community would have to solve in order to create their games. And so we stayed true to the philosophy of unified memory."

In fact, said Cerny, when he toured development studios asking what they wanted from the PlayStation 4, the "largest piece of feedback that we got is they wanted unified memory."

"I think you can appreciate how large our commitment to having a developer friendly architecture is in light of the fact that we could have made hardware with as much as a terabyte [Editor's note: 1000 gigabytes] of bandwidth to a small internal RAM, and still did not adopt that strategy," said Cerny. "I think that really shows our thinking the most clearly of anything."

So could MS have gone with faster embedded RAM? Or maybe even larger? And why didn't they?

Faster embedded RAM would have increased the BOM further. Larger embedded RAM would have increased both BOM and die space further. I think MS was already maxed out on both fronts.
 

twobear

sputum-flecked apoplexy
One thing that confuses me is that Cerny says that the eDRAM is easier to manufacture. But we know that the Xbone's chip is a beast that they're having difficulty manufacturing in good quantities. Is the difference solely down to eDRAM/eSRAM?

Seems like another one of those completely baffling design choices by MS.
 

Skeff

Member
One thing that confuses me is that Cerny says that the eDRAM is easier to manufacture. But we know that the Xbone's chip is a beast that they're having difficulty manufacturing in good quantities. Is the difference solely down to eDRAM/eSRAM?

Seems like another one of those completely baffling design choices by MS.

The ESRAM MS is using takes up 3x the space of the same amount of eDRAM, I believe.
 

James Sawyer Ford

Gold Member
One thing that confuses me is that Cerny says that the eDRAM is easier to manufacture. But we know that the Xbone's chip is a beast that they're having difficulty manufacturing in good quantities. Is the difference solely down to eDRAM/eSRAM?

Seems like another one of those completely baffling design choices by MS.

Microsoft chose eSRAM so that they wouldn't need a separate chip and it could be integrated with their APU
 

Biker19

Banned
Parity - Parity is a word not used very well at the moment. The games need to have "parity" between consoles but not exact parity: both versions need to run well, though discrepancies in resolution and frame rate dips are fine. This is only the case now, while sales are unconfirmed; if the XB1 somehow outsells PS4 comfortably, then the last thing you want is a reputation for making poor XB1 games when all you actually did was make a good PS4 game. After a few years this may change.

But what happens when PS4 versions of games consistently outsell Xbox One versions, despite the Xbox One having a bigger install base than the PS4? Then it's really not going to be worth the investment for publishers/developers to make the Xbox One the lead platform for most multiplat games over the PS4. It'll all be a big waste.
 
But what happens when PS4 versions of games consistently outsell Xbox One versions, despite the Xbox One having a bigger install base than the PS4?

I don't follow

PS4 sales of multiplats will almost certainly outsell XB1's due to the PS4 having the larger install base

Is that what you were saying?
 

viveks86

Member
But what happens when PS4 versions of games consistently outsell Xbox One versions, despite the Xbox One having a bigger install base than the PS4? Then it's really not going to be worth the investment for publishers/developers to make the Xbox One the lead platform for most multiplat games over the PS4. It'll all be a big waste.

Huh? I think everyone agrees the PS4 install base will be larger. I don't think this is up for debate for the first year at least, so I'm not sure I understand your point.

Microsoft chose eSRAM so that they wouldn't need a separate chip and it could be integrated with their APU
This.

But why did they choose to integrate it with the APU if eDRAM could have been done in a third of the die space? Latency? Did MS really think latency was the highest priority? Maybe it had to do with low-latency requirements for the OS features rather than gaming?
 

Biker19

Banned
I don't follow

PS4 sales of multiplats will almost certainly outsell XB1's due to the PS4 having the larger install base

Is that what you were saying?

No, I'm saying that if the Xbox One has a larger install base than the PS4, but most multiplat games sell better on PS4 than on Xbox One, then it would be a big waste to make the Xbox One the lead platform for multiplat games.
 
No, I'm saying that if the Xbox One has a larger install base than the PS4, but most multiplat games sell better on PS4 than on Xbox One, then it would be a big waste to make the Xbox One the lead platform for multiplat games.

All the pre-order data we have on hand suggests quite strongly that the PS4 will outsell the XB1 at launch by a good margin and likely do well over the holiday season

I see little reason to believe the XB1 install base will be larger with a more limited launch
 
But what happens when PS4 versions of games consistently outsell Xbox One versions, despite the Xbox One having a bigger install base than the PS4? Then it's really not going to be worth the investment for publishers/developers to make the Xbox One the lead platform for most multiplat games over the PS4. It'll all be a big waste.

If they're smart they won't use either system; instead they should use the PC as the lead platform.

It's weird that all the arguments seem to be from the Xbox and PlayStation camps. We've seen the difference in titles from the PC to the PS3/Xbox 360, then to the Wii, and then to the handhelds. So why is all the attention on two systems that were relatively close in power last generation, with minimal differences, and now on the upcoming systems? Some cried foul that developers were lazy during the PS3 and Xbox 360 era, and the same camp will likely do the same on the PS4 and Xbox One. Yet chances are the PC will show improvements over the PS4 and Xbox One, and those two will show improvements compared to the Wii U.
 

Skeff

Member
No, I'm saying that if the Xbox One has a larger install base than the PS4, but most multiplat games sell better on PS4 than on Xbox One, then it would be a big waste to make the Xbox One the lead platform for multiplat games.

Well, if the install base is higher on XB1 but sales are higher on PS4, I'd expect them to go for very similar games. I don't think we'd get too much difference, unless PS4 software sales were a looooong way ahead of the XB1, in which case we might get more GPGPU and more pushing of the hardware. But the projected split of game sales for a title is only one thing to consider when devoting time and personnel to it.
 
my 2 cents...

remote play locks ps3, so it will lock ps4 I suspect. On the PS3 it's complete utter dog shit and barely supports any games. I suspect things will change in that regard with the PS4.

As far as PSN speeds are concerned, I don't have any problems lately, makes me believe they've fixed things. Not discounting other people's experiences, just stating my own.

Like Sword of Doom said, I've experienced improvements in speed over the past year or so...
 

Skeff

Member
my 2 cents...

remote play locks ps3, so it will lock ps4 I suspect. On the PS3 it's complete utter dog shit and barely supports any games. I suspect things will change in that regard with the PS4.


As far as PSN speeds are concerned, I don't have any problems lately, makes me believe they've fixed things. Not discounting other people's experiences, just stating my own.

Like Sword of Doom said, I've experienced improvements in speed over the past year or so...

We already have confirmation that it works on all games that do not require the camera or any other special peripheral. We also know you can play as a DS4 player and a Vita remote player at the same time, though with identical picture output. Perfect for games like Diablo 3.
 

RoboPlato

I'd be in the dick
my 2 cents...

remote play locks ps3, so it will lock ps4 I suspect. On the PS3 it's complete utter dog shit and barely supports any games. I suspect things will change in that regard with the PS4.

As far as PSN speeds are concerned, I don't have any problems lately, makes me believe they've fixed things. Not discounting other people's experiences, just stating my own.

Like Sword of Doom said, I've experienced improvements in speed over the past year or so...

There's dedicated internal hardware for it in the PS4. It'll be supported by every game that doesn't require a camera and should be pretty smooth.
 

viveks86

Member
No, I'm saying if the Xbox One has a larger install base than the PS4, but most multiplat games sells well on PS4 over Xbox One's versions of games, then it would be a big waste to make Xbox One the lead platform for multiplat games.

Hmmm. Considering the architectures are now similar to PC, I would expect PC to become the lead platform for most multiplats. There would be a few console only titles, but I expect to see more devs releasing on PC as well. In these cases, PC would be the lead platform anyway.

Also, the choice of lead platform has more to do with hardware performance and ease of development than the install base. The install base will be a factor only to decide whether they should develop for it at all. But once that choice has been made, the lead platform will be the one that has the potential to realize the devs' vision to the maximum.

To summarize, for multiplats, it is reasonable to expect either PC or PS4 to be the lead platform, regardless of whether Xbox (hypothetically) ends up with a larger install base.
 
No, I'm saying if the Xbox One has a larger install base than the PS4, but most multiplat games sells well on PS4 over Xbox One's versions of games, then it would be a big waste to make Xbox One the lead platform for multiplat games.

Like the way it is now, where the PS3 has outsold the 360 but the 360 sells more multiplatform copies and has higher attach rates in general.

Edit: with Sony addressing the issues with the DS4, ease of development, PSN, cross-party chat, and more power than the Xbox, not only will the PS4 outsell the Xbone globally but it'll also have higher attach rates.
 

BigJoeGrizzly

Neo Member
It's a pain to program for.

Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen? I also remember the recent Digital Foundry interview regarding the Xbone architecture where the engineers said they've actually made improvements to the embedded RAM setup compared to the 360 that made it even EASIER for developers.
 

RoboPlato

I'd be in the dick
Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen?

The 360's eDRAM usage was managed by the API, but the eSRAM in the XBO has to be managed by devs, and they're still coming to grips with what needs to go into it. The 360's main pool of GDDR3 was also much better multipurpose RAM than the DDR3 in the XBO, so those decisions about what to put where are even more important.
 
Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen? I also remember the recent Digital Foundry interview regarding the Xbone architecture where the engineers said they've actually made improvements to the embedded RAM setup compared to the 360 that made it even EASIER for developers.

ESRAM =/= eDRAM

From my understanding, eDRAM on the X360 was handled by the SDK, while the ESRAM on the XB1 requires the programmer's input and is not handled by the SDK automatically.

I'm sure MS's SDK will improve, but right now it seems like MS's RAM solution is a bit of a pain to program for.

Edit: Beaten by RoboPlato ^^^
 

viveks86

Member
Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen?

It's all relative. The 360 was easier to develop for relative to the PS3, but the architecture still posed many challenges. One might argue that since devs have been programming for the 360 for a while, things should be easy now. That may be the case. But when you move from one generation to another, you end up doing quite an overhaul of the code base to make sure you can get the maximum out of next gen. So the fewer hurdles to get this done, the better. The Xbox One will be relatively harder to program for than the PS4 because of the ESRAM.

Also, a good point from RoboPlato and SwiftDeath. I didn't know that! :)
 

BigJoeGrizzly

Neo Member
The 360's eDRAM usage was managed by the API, but the eSRAM in the XBO has to be managed by devs, and they're still coming to grips with what needs to go into it. The 360's main pool of GDDR3 was also much better multipurpose RAM than the DDR3 in the XBO, so those decisions about what to put where are even more important.

From the Digital Foundry interview:

Andrew Goossen: I just wanted to jump in from a software perspective. This controversy is rather surprising to me, especially when you view ESRAM as the evolution of eDRAM from the Xbox 360. No-one questions on the Xbox 360 whether we can get the eDRAM bandwidth concurrent with the bandwidth coming out of system memory. In fact, the system design required it. We had to pull over all of our vertex buffers and all of our textures out of system memory concurrent with going on with render targets, colour, depth, stencil buffers that were in eDRAM.

Of course with Xbox One we're going with a design where ESRAM has the same natural extension that we had with eDRAM on Xbox 360, to have both going concurrently. It's a nice evolution of the Xbox 360 in that we could clean up a lot of the limitations that we had with the eDRAM. The Xbox 360 was the easiest console platform to develop for, it wasn't that hard for our developers to adapt to eDRAM, but there were a number of places where we said, "Gosh, it would sure be nice if an entire render target didn't have to live in eDRAM," and so we fixed that on Xbox One where we have the ability to overflow from ESRAM into DDR3 so the ESRAM is fully integrated into our page tables and so you can kind of mix and match the ESRAM and the DDR memory as you go.

Sometimes you want to get the GPU texture out of memory and on Xbox 360 that required what's called a "resolve pass" where you had to do a copy into DDR to get the texture out - that was another limitation we removed in ESRAM, as you can now texture out of ESRAM if you want to. From my perspective it's very much an evolution and improvement - a big improvement - over the design we had with the Xbox 360. I'm kind of surprised by all this, quite frankly.
 
Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen? I also remember the recent Digital Foundry interview regarding the Xbone architecture where the engineers said they've actually made improvements to the embedded RAM setup compared to the 360 that made it even EASIER for developers.

Maybe because developers now have to try to squeeze a lot more performance out of the architecture. Now they have to try to get the same performance out of the Xbox One that they're getting out of the PC and the PS4. I'm not a tech guy so I can't explain it that well, but that's how I see it.
 

Skeff

Member
Is it really though? Didn't Microsoft use a similar memory architecture (embedded RAM) with the 360, and it was lauded for its ease of development (especially over the PS3)? How is it all of a sudden a pain to program for when devs have had nearly 8+ years of experience with it in current-gen?

It wasn't really ease of development; it was just easier than the Cell. It's like being the tallest sibling in an extremely short family: even as the tallest, you wouldn't be considered tall at 5 foot 6.

Worse API support, and it's a bigger crutch this time.

Last gen, the 360's bandwidth between RAM and GPU was around 22GB/s or so, whereas the PS3 had essentially the same bandwidth between GPU and RAM.

It was not a crutch last time, it was an advantage. Now it must be used to make up for the difference in the speed of the main memory, rather than just being used for the limited cases it was great for.
 