
AMD: PlayStation 4 supports hUMA, Xbox One does not

On a classical system you have a RAM pool and a VRAM pool that are physically separated. Copying data from one pool to the other creates latency. The GPU is very good at hiding latency; what it needs most is high bandwidth. The CPU, on the other hand, is extremely sensitive to latency and needs extremely low latency to work efficiently. Copying data from the RAM (CPU) to the VRAM (GPU) creates latency, but that's okay for the GPU. Copying data from RAM (CPU) to VRAM (GPU) and back to the RAM (CPU) creates even more latency, and that's too much for the CPU. The copying alone takes longer than the computation, which makes this roundtrip highly inefficient.

Xbox 360 and older APUs have unified RAM. This means that the RAM is no longer physically separated, but even though it's the same RAM chips, the system still distinguishes between memory partitions for the different processors. You still need to copy the data between the CPU partition and the GPU partition, though this is much more efficient than copying between physically separated pools. But it's still too much latency for a CPU, GPU, CPU roundtrip.

PS4 will have hUMA, which means that you no longer need a distinction between a CPU partition and a GPU partition. Both processors can use the same pieces of data at the same time. You don't need to copy stuff, and this allows for completely new algorithms that utilize CPU and GPU at the same time. This is interesting since a GPU is very strong but extremely dumb, while a CPU is extremely smart but very weak. Since you can utilize both processors at the same time for a single task, you have a system that is extremely smart and extremely strong at the same time.

It will allow for an extreme boost for many, many algorithms and parts of algorithms. On top of that it will allow for completely new classes of algorithms. This is a game changer.
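As a rough illustration only (this is a minimal C sketch; gpu_double and the "vram" buffer are made-up stand-ins, not a real GPU API or anything from the article), the roundtrip described above looks roughly like this:

/* Minimal sketch of the CPU -> GPU -> CPU roundtrip described above.
   The gpu_double call is a hypothetical stand-in for a GPU kernel. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define N 4096

/* Pretend this buffer lives in VRAM on a classical system. */
static float vram[N];

/* Hypothetical GPU kernel: in reality this would run on the GPU. */
static void gpu_double(float *data, size_t n) {
    for (size_t i = 0; i < n; i++) data[i] *= 2.0f;
}

int main(void) {
    float *ram = malloc(N * sizeof *ram);     /* CPU-side pool */
    for (size_t i = 0; i < N; i++) ram[i] = (float)i;

    /* Classical path: two copies bracket the actual work. */
    memcpy(vram, ram, N * sizeof *ram);       /* RAM -> VRAM (latency) */
    gpu_double(vram, N);                      /* GPU compute */
    memcpy(ram, vram, N * sizeof *ram);       /* VRAM -> RAM (more latency) */

    /* hUMA-style path: both processors touch the same allocation,
       so the two memcpy calls above simply disappear. */
    gpu_double(ram, N);                       /* no copies, same pointer */

    printf("%f\n", ram[1]);
    free(ram);
    return 0;
}

For a small task, the two memcpy calls can easily cost more than gpu_double itself, which is exactly the "roundtrip" problem being described.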

Sounds great!
 
GDDR5 > DDR3
More GPU flops > Less GPU flops
More CUs > Less CUs
HuMa > Cloud

Sony wins again, Xbox One is last gen, PS4 is next gen confirmed
Not when it comes to latencies. I don't know what the advantages of hUMA are, but in terms of CPU power, that GDDR5 could be VERY problematic.
 

Chobel

Member
Since they're all being rushed, though, I'm assuming that the detriment to each is the same. I'm not assuming that they're all indicative of what visuals will look like in three years, only that if the gap were as big as Xbox and DC (even PS2), we'd already see it. Unless you think Forza 5 is as good as Xbone will ever look and Driveclub is only utilising ~20% of the PS4's power, of course. Which is possible but seems unlikely to me.

[edit] I picked DC and F5 because they're both racers. I think the point remains valid if you pick, say, Fable Heroes (Legends?) and Knack, though. The visuals in Knack are noticeably better but they're not Dreamcast to Xbox level.

You can't just compare Forza graphics to Driveclub, because they're using two different rendering techniques. Forza is using static lighting whereas Driveclub is using dynamic lighting.
Yes, Forza has better graphics, but at the expense of lighting that changes (morning/noon/night...) in real time.
 

Toski

Member
You're saying these things as though I disagree with them...I was replying specifically to the comment that the gap will be akin to Xbox vs Dreamcast.

Well then I agree with you. I still think this is bad for MS going forward, as it adds to the list of deficiencies that the X1 has compared to the PS4.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I added another quote from the ArsTechnica article to the OP to make things clearer.

http://arstechnica.com/information-...orm-memory-access-coming-this-year-in-kaveri/

As well as being useful for GPGPU programming, this may also find use in the GPU's traditional domain: graphics. Normally, 3D programs have to use lots of relatively small textures to apply textures to their 3D models. When the GPU has access to demand paging, it becomes practical to use single large textures—larger than will even fit into the GPU's memory—loading the portions of the texture on an as-needed basis. id Software devised a similar technique using existing hardware for Enemy Territory: Quake Wars and called it MegaTexture. With hUMA, developers will get MegaTexture-like functionality built-in.
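To make the demand-paging idea concrete, here is a toy C sketch of a "megatexture" page cache. Everything here (page size, load_page, the fill pattern) is invented for illustration; it is not id's technique or AMD's API, just the general shape of loading texture pages on first use:

/* Toy demand-paged "megatexture": only pages that are actually sampled
   get loaded. All names and sizes are illustrative, not a real engine. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define PAGE_SIZE 128                        /* texels per page side */
#define TEX_SIZE  (PAGE_SIZE * 64)           /* 8192x8192 virtual texture */
#define PAGES_X   (TEX_SIZE / PAGE_SIZE)

typedef struct {
    unsigned char *texels;                   /* NULL until first use */
} Page;

static Page pages[PAGES_X][PAGES_X];

/* Stand-in for streaming a page from disk into memory. */
static void load_page(Page *p, int px, int py) {
    p->texels = malloc(PAGE_SIZE * PAGE_SIZE);
    memset(p->texels, (px + py) & 0xFF, PAGE_SIZE * PAGE_SIZE);
    printf("loaded page (%d,%d)\n", px, py);
}

/* Sample the virtual texture; pages are faulted in on first access. */
static unsigned char sample(int x, int y) {
    int px = x / PAGE_SIZE, py = y / PAGE_SIZE;
    Page *p = &pages[py][px];
    if (!p->texels)                          /* "page fault": load on demand */
        load_page(p, px, py);
    return p->texels[(y % PAGE_SIZE) * PAGE_SIZE + (x % PAGE_SIZE)];
}

int main(void) {
    /* Only the pages we touch ever get loaded, even though the virtual
       texture is far larger than we'd want resident at once. */
    sample(10, 10);
    sample(5000, 7000);
    sample(11, 12);                          /* same page as the first call: no load */
    return 0;
}

With hardware demand paging and a shared address space, the GPU can effectively do that if (!p->texels) fault itself, instead of the engine managing the pages by hand.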
 

longbisquit

Neo Member
On a classical system you have a RAM pool and a VRAM pool that are physically separated. Copying data from one pool to the other creates latency. The GPU is very good at hiding latency; what it needs most is high bandwidth. The CPU, on the other hand, is extremely sensitive to latency and needs extremely low latency to work efficiently. Copying data from the RAM (CPU) to the VRAM (GPU) creates latency, but that's okay for the GPU. Copying data from RAM (CPU) to VRAM (GPU) and back to the RAM (CPU) creates even more latency, and that's too much for the CPU. The copying alone takes longer than the computation, which makes this roundtrip highly inefficient.

Xbox 360 and older APUs have unified RAM. This means that the RAM is no longer physically separated, but even though it's the same RAM chips, the system still distinguishes between memory partitions for the different processors. You still need to copy the data between the CPU partition and the GPU partition, though this is much more efficient than copying between physically separated pools. But it's still too much latency for a CPU, GPU, CPU roundtrip.

PS4 will have hUMA, which means that you no longer need a distinction between a CPU partition and a GPU partition. Both processors can use the same pieces of data at the same time. You don't need to copy stuff, and this allows for completely new algorithms that utilize CPU and GPU at the same time. This is interesting since a GPU is very strong but extremely dumb, while a CPU is extremely smart but very weak. Since you can utilize both processors at the same time for a single task, you have a system that is extremely smart and extremely strong at the same time.

It will allow for an extreme boost for many, many algorithms and parts of algorithms. On top of that it will allow for completely new classes of algorithms. This is a game changer.

Daaaaaamn! Why you soo wicked?!
 

twobear

sputum-flecked apoplexy
You can't just compare Forza graphics to Driveclub, because they're using two different rendering techniques. Forza is using static lighting whereas Driveclub is using dynamic lighting.
Yes, Forza has better graphics, but at the expense of lighting that changes (morning/noon/night...) in real time.
You didn't read my post very well, I take it? I said Driveclub has better graphics (because it does).
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Not when it comes to latencies. I don't know what the advantages of hUMA are, but in terms of CPU power, that GDDR5 could be VERY problematic.

GDDR5 doesn't have inherently higher latencies than DDR3. It's all a matter of how the memory controller is set up to work, and those factors are different for CPU/GPU. That's why the PS4 has multiple pathways (or "access modes", although that's not technically correct).
 

Pug

Member
You can't just compare Forza graphics to Driveclub, because they're using two different rendering techniques. Forza is using static lighting whereas Driveclub is using dynamic lighting.
Yes, Forza has better graphics, but at the expense of lighting that changes (morning/noon/night...) in real time.

But Forza is 60fps, Driveclub is 30 due to the expense of dynamic lighting. Swings and roundabouts.
 

Osiris

I permanently banned my 6 year old daughter from using the PS4 for mistakenly sending grief reports as it's too hard to watch or talk to her
A lot of people are confusing unified memory with uniform memory in here.

Standard unified set-ups (UMA) allow devs to use one pool of memory split however they want into CPU/GPU 'pools', but the two still remain separate.

hUMA allows both CPU and GPU to address the same memory pool, removing the split and therefore removing the need to ever copy data between CPU/GPU pools, which would still need to be done on a UMA-based design.
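A deliberately simple C sketch of that unified-vs-uniform distinction (the "partitions" here are just pointer offsets into one allocation; nothing console-specific is implied):

/* Toy contrast between "unified" and "uniform" as described above.
   One physical pool (a single malloc) stands in for the console's RAM. */
#include <stdlib.h>
#include <string.h>

int main(void) {
    float *pool = malloc(1024 * sizeof *pool);       /* one physical pool */

    /* UMA-style: same chips, but split into partitions; data still gets copied. */
    float *cpu_part = pool;                          /* "CPU partition" */
    float *gpu_part = pool + 512;                    /* "GPU partition" */
    cpu_part[0] = 1.0f;
    memcpy(gpu_part, cpu_part, 512 * sizeof *pool);  /* copy across the split */

    /* hUMA-style: no split; both processors would just use the same pointer. */
    float *shared = pool;
    shared[0] = 2.0f;                                /* CPU writes ...              */
    /* ... and the GPU could read shared[0] directly, with no copy step at all. */

    free(pool);
    return 0;
}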
 
What is a huma?

It's what this guy refers to the natives of Earth as.

quark-ferengi-9330446toyha.jpg
 

orioto

Good Art™
...and Driveclub looks better as a result. Not sure what your point is here? You think that DC is 4.5x the computational workload of F5? Okay...

My point is that the games have made such different tech choices that they're not really relevant for comparing the power of the two consoles.
 
The graphical king on consoles, or on every platform?

What I see is that hUMA will bring performance with some rendering techniques that can't be touched by PCs brute-forcing right now, and that's why it makes the PS4 "a generation ahead of high-end PC" in terms of architecture right now (and until 2014). Am I right?

alexandros coming in 3... 2... 1...

You rang? Tell you what guys, how about we try a little role reversal? Since most of the time it is me who is asked to specify PC hardware that could theoretically match the PS4, how about you do it this time?

So here's my question: if you had to make an educated guess, what kind of graphics horsepower would I need to match the PS4? A GTX 780? A TITAN? A 7970?

Second question: What kind of differences do you expect to see in multiplatform games between the PS4 and the XBO? 60 fps vs 30 fps? 1080p vs 720p?
 
GDDR5 doesn't have inherently higher latencies than DDR3. It's all a matter of how the memory controller is set up to work, and those factors are different for CPU/GPU. That's why the PS4 has multiple pathways (or "access modes", although that's not technically correct).

correct. the ps4's memory controllers are designed with that memory in mind. there IS no "latency issue" with the gddr5.
 
If there are fewer frame rate drops and better textures on the PS4 that is a big deal.

We had those discrepancies during this console cycle and that didn't stop most games from selling reasonably on either platform. Didn't Bayonetta, as an example, end up selling more on the Playstation 3 than it did on the 360, even though the 360 was significantly better in most aspects?

I'm not saying it won't make a difference for informed consumers such as us or people who happen to own both consoles and are able to distinguish and appreciate the differences, but I still honestly don't think that for the general market they will make much of a difference.

I'm not seeing any third-party developer trying to make the PS4 or XB1 versions significantly better than one another. The way that they choose to distinguish the two platforms is by settling on content deals with either one.

We'll just have to wait and see how things unravel. I'm just glad that both platforms are now easier to develop for, regardless of which one is slightly better. Game development costs really needed to come down.
 

Kum0

Member
Sounds interesting... I wonder how MegaTextures would work, or whether they would even be needed, in this type of environment?
 

JaseC

gave away the keys to the kingdom.
Welp, I learned something new today. I just assumed it was easier the other way around since that's how it was done with 360 and PS3. But given that they had different internal architectures, I suppose that makes sense.

And Soul Destroyer said the pc will be the lead platform this gen so it sounds like the pc version will look best, followed by PS3 and 360. Hopefully that means no gimped pc versions this time around (I'm looking at you Dark Souls)

The PS3 isn't significantly more powerful than the X360, though, as while Cell's superiority can be used to aid in rendering tasks such as HDR lighting, the fundamental groundwork still needs to be done by the rather weak RSX -- areas where the X360's GPU is better-equipped. It's certainly ideal to port from the PS3 to the X360 if the engine is up to the task, but due to the PS3's unique architecture, this was and still is rarely the case. With the PS4 and X1, however, the roadblock of architectural dissimilarities doesn't exist -- the only notable concern for developers is having to micromanage memory utilisation in the X1 due to the DDR3+ESRAM configuration versus the unified GDDR5 approach in the PS4.

In fewer words, due to the PS3 and X360 largely being opposites (the PS3 has a much better CPU and a crummy GPU versus the X360 having a weaker CPU but a more capable GPU), the theoretical power advantage of the PS3 is largely cancelled out; the PS4 and X1, however, have virtually identical CPUs and sport different flavours of the same GPU family.
 

Perkel

Banned
Still "funny" that an AMD representative is so public about the performance differences of its customer's competing products.

What will MS do now? Back out of AMD?

If anything, this may hurt them with future consoles (if there is a future for them or for consoles).
 
Let's wait and see how multiplatform games end up looking before jumping to conclusions.
Geeze, this site cracks me up. Look at Forza 5; the X1 is no slouch in the graphics department.
IMO, I still haven't seen anything on PS4 that matches that. And I'm not biased, I'm getting both systems.
 
What will MS do now? Back out of AMD?

If anything, this may hurt them with future consoles (if there is a future for them or for consoles).


highly unlikely. what are they going to do, go back to Nvidia? Nvidia burned MS once with the OG xbox, then turned around and burned sony with a gimped RSX.
No one in their right mind would give Nvidia a third shot at designing a console GPU.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
So sorry to have offended you. Hopefully you aren't angry the rest of the day.
Perhaps you missed where I wasn't explaining, I was asking.

Then I very much apologise. Sorry.
 

Chobel

Member
But Forza is 60fps, Driveclub is 30 due to the expense of dynamic lighting. Swings and roundabouts.

No argument there, but my point is that comparing Forza and Driveclub graphics is unfair and shouldn't be used to compare the two consoles' graphics power.

You didn't read my post very well, I take it? I said Driveclub has better graphics (because it does).

Sorry, I jumped the gun a bit.
 

longbisquit

Neo Member
"With the PS4 and HSA, AMD may well be well on its way to dominating the gaming scene in future. Not only will their platform be easier to code for, but Intel and Nvidia are currently not releasing any products with the same benefits. Intel’s Haswell graphics still partition off the GPU memory and Nvidia’s graphics cards won’t be compatible with hUMA. Its a great time to be an AMD fan."

http://mygaming.co.za/news/hardware/53750-amds-plan-for-the-future-huma-fully-detailed.html

Nice. Had to search to get a better understanding. Seems legit.
 

Myshkin

Member
I added a quote from the extensive article at ArsTechnica to the OP.

http://arstechnica.com/information-...orm-memory-access-coming-this-year-in-kaveri/

This is a useful add.

Just give the programmer the flexibility they want, as long as their life stays simple; it's up to them to restrain themselves. For example, in socket programming, one process should have R/W privileges to some shared memory while the other process has only R access. Someone could come along and say, "Here, I'll give you sockets where everyone can R/W," but you should ask yourself what effect this would have on debugging, not to mention maintenance. I'd suggest you turn down such sockets. But UMA I would never turn down.
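A minimal POSIX sketch of that kind of asymmetric access, assuming a plain shm_open/mmap setup (the object name is arbitrary, error handling is omitted, and both mappings live in one process here purely for brevity; in a real design the read-only mapping would belong to the other process):

/* One mapping writable, one read-only, over the same shared memory object. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    const char *name = "/huma_demo";        /* arbitrary name for the example */
    int fd = shm_open(name, O_CREAT | O_RDWR, 0600);
    ftruncate(fd, 4096);

    /* "Producer" view: read/write. */
    char *rw = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    /* "Consumer" view: read-only; writing through this pointer would fault. */
    const char *ro = mmap(NULL, 4096, PROT_READ, MAP_SHARED, fd, 0);

    strcpy(rw, "hello from the writable mapping");
    printf("read-only view sees: %s\n", ro);  /* same bytes, no copy */

    munmap(rw, 4096);
    munmap((void *)ro, 4096);
    close(fd);
    shm_unlink(name);
    return 0;
}

(On older Linux systems this needs -lrt at link time.) The point is the same as the post's: one shared pool, but with deliberately different privileges per consumer.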
 

Alej

Banned
Not when it comes to latencies. I don't know what the advantages of hUMA are, but in terms of CPU power, that GDDR5 could be VERY problematic.

There's no problem with GDDR5's latency.

Mark Cerny said:
Latency in GDDR5 isn’t particularly higher than the latency in DDR3. On the GPU side… Of course, GPUs are designed to be extraordinarily latency tolerant so I can’t imagine that being much of a factor.

http://www.dualshockers.com/2013/07/13/mark-cerny-on-ps4-gddr5-latency-not-much-of-a-factor/
 

twobear

sputum-flecked apoplexy
My point is that the games have made such different tech choices that they're not really relevant for comparing the power of the two consoles.
I guess? It seems to me that you can always make these kinds of arguments though, 'oh that one runs at 60fps, this one runs at 30fps', etc. My point remains: if the gap were really as big as Xbox to DC there would be no argument; it would be massively obvious; nobody thinks that Splinter Cell: Chaos Theory on Xbox looks worse than Soul Calibur on DC just because the latter runs at 60fps and has baked shadows while the former runs at 30fps and has a slew of expensive graphical effects.

At the rate the Xbone is currently slipping backwards down the hardware power rankings in these threads I wouldn't be surprised if next time people are pushing an Xbox vs NES power gap though. Seems to me both sides of this debate have a taste for their fave's secret sauce.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
Can someone please explain this to me.

From the translation of the article by ElTorro.

This was explained by AMD's Senior Product Marketing Manager Marc Diana to c't [big German IT magazine] at gamescom. This should put the 3D-performance of PlayStation 4 much farther ahead of Xbox One than many have expected so far.

There is a clear implication that the AMD Senior Product Marketing Manager has said that the PS4 is much further ahead in terms of performance than the Xbox One.

Did the original German article really say that?
 
Hey. My preferred next gen console (PS4) is shaping up just nicely, thank you. I just don't take the OP's word at face value and try to apply some common sense to the OP's implication.

Also, for the GPU or CPU to 'see' the updated data in real time, the respective caches still need to complete a read to the main memory pool (I think).
I still don't see why there is an advantage over what the XBO does. I'm missing a piece of the puzzle.

Doesn't need to.

Check out Onion and Onion+

h3e67FC.png
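The Onion/Onion+ buses themselves obviously can't be reproduced in portable code, so take this only as a loose C11 analogy for "both sides see the same data without an explicit copy step": a second thread stands in for the GPU and observes the first thread's in-place writes through ordinary shared memory plus an acquire/release flag (all names invented):

/* Loose analogy only: two threads sharing one buffer coherently,
   standing in for CPU and GPU sharing one pool. Not actual console code. */
#include <stdatomic.h>
#include <stdio.h>
#include <threads.h>

static int buf[4];
static atomic_int ready;                     /* zero-initialised */

static int consumer(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        thrd_yield();                        /* wait until the data is published */
    printf("consumer sees: %d %d %d %d\n", buf[0], buf[1], buf[2], buf[3]);
    return 0;
}

int main(void) {
    thrd_t t;
    thrd_create(&t, consumer, NULL);

    for (int i = 0; i < 4; i++) buf[i] = i * 10;     /* write in place */
    atomic_store_explicit(&ready, 1, memory_order_release);

    thrd_join(t, NULL);
    return 0;
}

The consumer never copies the buffer anywhere; it simply reads the producer's writes once they are published, which is the flavour of behaviour the coherent-bus diagram is getting at.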
 
I guess? It seems to me that you can always make these kinds of arguments though, 'oh that one runs at 60fps, this one runs at 30fps', etc. My point remains: if the gap were really as big as Xbox to DC there would be no argument; it would be massively obvious.

At the rate the Xbone is currently slipping backwards down the hardware power rankings in these threads I wouldn't be surprised if next time people are pushing an Xbox vs NES power gap though. Seems to me both sides of this debate have a taste for their fave's secret sauce.


anyone who claims the gap will be as huge as DC to Xbox never owned either. the claim is insane. there's going to be a gap, but nowhere near that large.
 

TheHater

Member
Forza 5, Ryse, and Killer Instinct were all running on actual consoles. I certainly don't see anyone using COD: Ghosts as a comparison point against PS4 exclusives.
You cannot compare exclusives on either system due to those games being exclusive. You can only compare multiplatform games running on both systems. In this case, multiplatform games are running on PS4 devkits while the Xbox One versions are running on PCs with an Xbox One controller.

BTW, I am not denying the fact that Forza, Ryse, and KI are running on Xbox One hardware. The article was about 3rd-party studios and their games running on both systems, so that's why I'm saying what I'm saying.
 

strata8

Member
cyberheater said:
There is a clear implication that the AMD Senior Product Marketing Manager has said that the PS4 is much further ahead in terms of performance than the Xbox One.

Did the original German article really say that?

That second sentence might not necessarily be quoting the AMD rep.

You cannot compare exclusives on either system due to those games being exclusive. You can only compare multiplatform games running on both systems. In this case, multiplatform games are running on PS4 devkits while the Xbox One versions are running on PCs with an Xbox One controller.

BTW, I am not denying the fact that Forza, Ryse, and KI are running on Xbox One hardware. The article was about 3rd-party studios and their games running on both systems, so that's why I'm saying what I'm saying.

Considering we haven't seen a multiplatform game running on both consoles (AFAIK), this point is a bit moot. It's a bit fallacious to claim that all multiplat PS4 games are running on devkits since The Division was shown on a PC.
 

gruenel

Member
Can someone please explain this to me.

From the translation of the article by ElTorro.



There is a clear implication that the AMD Senior Product Marketing Manager has said that the PS4 is much further ahead in terms of performance than the Xbox One.

Did the original German article really say that?

I think that part was actually a conclusion from heise, not from the AMD guy.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Did the original German article really say that?

I translated it as literally as possible, and I haven't changed the structure of the sentences, so it should be accurate. Others are invited to double-check.
 

ZiggyRoXx

Banned
Lol..

I'd like to have seen the looks on the faces of the XO hardware design team when they learned about the PS4 GPU having this advantage.

Bet they were not happy.
 
You rang? Tell you what guys, how about we try a little role reversal? Since most of the time it is me who is asked to specify PC hardware that could theoretically match the PS4, how about you do it this time?

So here's my question: if you had to make an educated guess, what kind of graphics horsepower would I need to match the PS4? A GTX 780? A TITAN? A 7970?

Who gives a shit? Really? And how is it even relevant to the topic at hand?

A "generation ahead" does not equate to "more powerful". But you already know this as low-end next gen GPUs are invariably less powerful than previous gen high-end cards.

Disingenuous much?
 
I always find it amusing when people want to put so much weight behind what developers say about the new consoles being "about the same", etc. Developers are never going to bite the hand that feeds them, or could potentially be feeding them soon. They just don't do that. So they're always going to sugar coat things and say they're comparable. Every once in a while someone will step out of line and say something honest, but it's rare.

At the end of the day I think it's undeniable the PS4 is going to have a substantial advantage over the XB1. The GPU and memory bandwidth advantage is too big to ignore. Details like this hUMA nugget only further reinforce that.

It should be very interesting to see those multiplatform comparisons at launch, but also down the line after launch. The big question is will developers be willing to show up the XB1 by making full use of that PS4 horsepower, because if they do it will be noticeable.
 

Hana-Bi

Member
That's what AMD said in April (same German source that posted this news):

The hUMA concept supports different memory types. Phil Rogers explained that AMD will be able to build APUs with DDR3 or GDDR5 memory. As already became known in early January, the Kaveri processor expected for late 2013 will support both memory types. In addition, hUMA is compatible with embedded DRAM, such as that used in game consoles: it can be mapped into the shared memory space or integrated as a cache.

http://www.heise.de/newsticker/meld...er-Kaveri-und-Co-1850975.html?view=zoomzoom=1

In short: hUMA will work with DDR3 or GDDR5, and embedded DRAM - as used in gaming consoles - can be mapped into the shared memory space or used as a cache.
 
Lol..

I'd like to have seen the looks on the faces of the XO hardware design team when they learned about the PS4 GPU having this advantage.

Bet they were not happy.

that had to be known from day 1 of the planning stages, if the assumption that the ESRAM is the reason it's not possible on Xbone is correct.

MS NEEDED 8 gigs because of their plans with the OS and TV functions. incorporating ESRAM was the only way to make that feasible.


MS's assumption was that sony would be limited to only 4 gigs of ram, not that they would have hUMA and MS wouldn't.
 