
AMD: PlayStation 4 supports hUMA, Xbox One does not

Vestal

Gold Member
Well, I guess we've reached a point where further discussion is useless until one of the involved parties comes clean.

AMD won't say anything and Sony already implied in interviews and slides that PS4 will use hUMA. It's time for Microsoft to finally give some official statements on their tech.
Monday at Hot Chips.
 

Nafai1123

Banned
It's already been confirmed the PS4 uses shared address space between the CPU/GPU, so we know it's doing some sort of hUMA.

MS hasn't commented much on any of the specific hardware specs of the XB1, so at this point it's just conjecture.

People will continue to argue back and forth regardless and this post will go mostly unnoticed.
 
I dunno if it's just me, but I thought it was already revealed a long time ago that the PS4 was using unified memory, and the spec leaks and the X1 reveal showed the X1 did not have it.

UMA is one thing. hUMA is another. The "h" stands for heterogeneous. It allows both the GPU and the CPU to access the same address space.

UMA is one big pool that is virtually partitioned between the CPU and GPU. Examples include the N64 and Xbox 360. They have one big pool, devs dictate how much memory is allocated to each, and each partition has its own address space.
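
To put that difference in rough code terms, here's a quick conceptual sketch. The vectors and the worker thread below just stand in for the two memory regions and the GPU; this isn't any real console API.

Code:
#include <cstring>
#include <thread>
#include <vector>

// UMA with partitions (N64/360-style): one physical pool, but the CPU and
// the GPU each get their own region with its own address space, so data has
// to be staged/copied from one region into the other.
void uma_partitioned() {
    std::vector<float> cpu_partition(1024, 1.0f);
    std::vector<float> gpu_partition(1024);
    std::memcpy(gpu_partition.data(), cpu_partition.data(),
                cpu_partition.size() * sizeof(float));
    // ...the GPU would now consume gpu_partition...
}

// hUMA-style shared address space: both processors dereference the very same
// pointer, so there is no staging copy (the hardware keeps them coherent).
void huma_shared() {
    std::vector<float> shared(1024, 1.0f);
    std::thread gpu([&] {                 // stand-in for a GPU compute job
        for (float& x : shared) x *= 2.0f;
    });
    gpu.join();
    // the CPU reads the results in place, no copy-back needed
}

int main() {
    uma_partitioned();
    huma_shared();
}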
 

Skenzin

Banned
Dude, a spokesman from the company whose parts drive both consoles said that the gap between the two will be more than what many have expected. That's why the thread was popular.

That performance gap may be true. But a PR/marketing person putting down a billion-dollar account holder in front of a journalist shows questionable judgement; yes, I'm saying the guy wasn't very smart about it.

If it's true, I want to start hearing from multiplatform devs: "having a hard time getting game X running, have to lower the resolution, cut shaders, etc."

We know there is a gap, but I wish we could start hearing from actual devs as to whether it's negligible or significant. This rabid cheerleading atmosphere is not interesting... well, maybe a little.
 
This is getting a bit childish even for an internet forum.

hUMA is a trademarked term; both the PS4 and XBONE probably have some sort of implementation that allows something very similar, the XBONE with ESRAM + Move engines, the PS4 with customized buses.

The only reason this thread has gone on like it has is because it showed Xbone in a negative light.

Trademarked by whom? AMD?
Durango is an AMD APU too.
 

jiggles

Banned
I'm sorry for veering off topic, but what the hell is this "bathtub" meme I keep seeing? Where did it come from? What does it mean?

It's been bugging me all day.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
Last time around (back in 2005, that is) I remember a lot of discussion about various architectural features in Xenos that people were saying would put it above and beyond the PS3, stuff like MEMEXPORT, and iirc something about how it was much better at 'dynamic branching' than RSX.

MEMEXPORT is an explicit instruction that copies data from the GPU partition into the CPU partition of the 360's memory. I don't have a reference from a developer about a concrete use case, though.

Did any of these end up being meaningful? As I recall in the end the RSX ended up being basically as powerful as the Xenos anyway, so I'm assuming they weren't the secret sauce that some people thought they would be?

Xenos is indeed more advanced and powerful than RSX (for other reasons). The main reason why PS3 exclusives look so good is that RSX can work in tandem with the Cell, which can implement tasks that are usually done on more sophisticated GPUs, like post-processing effects or, as the latest example, tessellation in GT6.
 
I'm sorry for veering off topic, but what the hell is this "bathtub" meme I keep seeing? Where did it come from? What does it mean?

It's been bugging me all day.

There was a Mark Cerny interview thread. Some guy said "time to watch it in the bathtub." Kinda creepy, but... whatever floats his boat. Lol.
 

Chobel

Member
Update from the Heise author:

He had a phone call with AMD and he says that they left open whether Marc Diana's statements are true or untrue. He also says that AMD made it pretty clear that they're not allowed to talk about the PS4 or Xbox One.

"AMD ließ auch während eines Telefongespräches offen, ob die Behauptungen von Marc Diana der Wahrheit entsprechen oder nicht. Besonderen Wert legte das Unternehmen vor allem auf die Aussage, dass die Firma keine Aussagen zu den Produkten seiner Kunden treffen darf – also auch nicht zu Sonys Playstation 4 oder Microsofts Xbox One."

http://www.heise.de/newsticker/meld...et-Unified-Memory-Xbox-One-nicht-1939716.html

Thanks for this news, it should be included in the OP.
I'll keep my tub filled just in case.
 
Update from the Heise author:

He had a phone call with AMD and he says that they left open whether Marc Diana's statements are true or untrue. He also says that AMD made it pretty clear that they're not allowed to talk about the PS4 or Xbox One.

"AMD ließ auch während eines Telefongespräches offen, ob die Behauptungen von Marc Diana der Wahrheit entsprechen oder nicht. Besonderen Wert legte das Unternehmen vor allem auf die Aussage, dass die Firma keine Aussagen zu den Produkten seiner Kunden treffen darf – also auch nicht zu Sonys Playstation 4 oder Microsofts Xbox One."

http://www.heise.de/newsticker/meld...et-Unified-Memory-Xbox-One-nicht-1939716.html

not surprised.
 
I already wish I could go back to not knowing.

Here is how I listen if it helps you picture it.

[image: vSOzsrk.png]
 
MEMEXPORT is an explicit instruction that copies data from the GPU partition into the CPU partition of the 360's memory. I don't have a reference from a developer about a concrete use case, though.

Pretty much. Shader output could become input for more shader output.

The only game I ever heard of using it was Viva Piñata for tessellation.
 

Nafai1123

Banned
Old interview from Cerny, but it seemed appropriate.

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2

"A typical PC GPU has two buses," said Cerny. "There’s a bus the GPU uses to access VRAM, and there is a second bus that goes over the PCI Express that the GPU uses to access system memory. But whichever bus is used, the internal caches of the GPU become a significant barrier to CPU/GPU communication -- any time the GPU wants to read information the CPU wrote, or the GPU wants to write information so that the CPU can see it, time-consuming flushes of the GPU internal caches are required."

•"First, we added another bus to the GPU that allows it to read directly from system memory or write directly to system memory, bypassing its own L1 and L2 caches. As a result, if the data that's being passed back and forth between CPU and GPU is small, you don't have issues with synchronization between them anymore. And by small, I just mean small in next-gen terms. We can pass almost 20 gigabytes a second down that bus. That's not very small in today’s terms -- it’s larger than the PCIe on most PCs!
•"Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time -- in other words, it radically reduces the overhead of running compute and graphics together on the GPU."
•Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands -- the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."

Particularly the first point, about the extra bus that lets the GPU access system memory directly, sounds like a shared address space.
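
If it helps to picture why bypassing the GPU caches matters, here's a minimal sketch of the kind of fine-grained hand-off Cerny describes. Purely illustrative: the second thread stands in for an asynchronous compute job, and the atomic flag stands in for the coherent bus that makes a small write visible without a full cache flush.

Code:
#include <atomic>
#include <cstdio>
#include <thread>

int main() {
    int payload = 0;                 // small piece of data shared in place
    std::atomic<bool> ready{false};  // publication flag

    std::thread gpu([&] {            // stand-in for an async compute job
        while (!ready.load(std::memory_order_acquire)) { /* spin */ }
        std::printf("compute job sees payload = %d\n", payload);
    });

    payload = 42;                    // the CPU writes the data...
    ready.store(true, std::memory_order_release);  // ...and publishes it
    gpu.join();                      // no explicit cache flush anywhere
}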
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I am so confused :( ... so does the PS4 support it or not? Can someone please summarize what happened?

Yes, there is no controversy about that in my opinion. Whether they want to officially name it "hUMA" may be another question.
 
From the word go I figured that someone at AMD had opened their mouth a little too far on this topic and spoke with a bit too much frankness. Regardless of the differences, AMD wouldn't throw one of their customers under the bus unless it was someone talking when they shouldn't have been.

I am unsurprised this concludes with AMD PR coming out to say that statements were inaccurate and they've nothing more to say about it; it was the only way this could end.
 
Oh, this thread got real good in a hurry. That said, let's consider for a moment that we are still dealing with closed box systems that will be easier and easier for developers to utilize over time. Let's also consider that whether one has hUMA and another doesn't, or whether none of them have hUMA, there are many proven technologies and software practices that already do just fine without AMD's hUMA technology, and I can't see why these things wouldn't continue to prove effective absent AMD's technology.

In other words, I think the consoles will be just fine with or without hUMA.
 
Especially display lists, that is, lists of geometrical primitives together with instructions to draw them. The CPU assembles them as a kind of "task list" that the GPU executes in order to draw the frame. This is usually done via a bus like PCI; here it is done via the shared memory.

[image: kzsfsharedmem1moin.png]
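
Roughly, a display list living in shared memory could look like the sketch below. All names here are invented for illustration, and the thread just stands in for the GPU's command processor; this is not real console or Killzone code.

Code:
#include <atomic>
#include <cstdio>
#include <thread>

struct DrawCmd { int mesh_id; int instance_count; };

int main() {
    DrawCmd list[64];                       // the "display list" in shared memory
    std::atomic<int> count{0};              // how many commands are published
    std::atomic<bool> done{false};          // CPU finished building the frame

    std::thread gpu([&] {                   // stand-in for the GPU front-end
        int read = 0;
        while (!done.load(std::memory_order_acquire) ||
               read < count.load(std::memory_order_acquire)) {
            while (read < count.load(std::memory_order_acquire)) {
                const DrawCmd& c = list[read++];
                std::printf("draw mesh %d x%d\n", c.mesh_id, c.instance_count);
            }
        }
    });

    for (int i = 0; i < 8; ++i) {           // the CPU appends draw commands
        list[i] = DrawCmd{i, 4};
        count.store(i + 1, std::memory_order_release);
    }
    done.store(true, std::memory_order_release);
    gpu.join();
}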

The GG slides have become so useful here on NeoGAF, it seems.
 
Oh, this thread got real good in a hurry. That said, let's consider for a moment that we are still dealing with closed box systems that will be easier and easier for developers to utilize over time. Let's also consider that whether one has hUMA and another doesn't, or whether none of them have hUMA, there are many proven technologies and software practices that already do just fine without AMD's hUMA technology, and I can't see why these things wouldn't continue to prove effective absent AMD's technology.

In other words, I think the consoles will be just fine with or without hUMA.

Dat damage control.
 

QaaQer

Member
From the word go I figured that someone at AMD had opened their mouth a little too far on this topic and spoke with a bit too much frankness. Regardless of the differences, AMD wouldn't throw one of their customers under the bus unless it was someone talking when they shouldn't have been.

I am unsurprised this concludes with AMD PR coming out to say that statements were inaccurate and they've nothing more to say about it; it was the only way this could end.


No way AMD would have officially sanctioned those comments. Still, I'm glad the AMD guy at Gamescom opened his big mouth as now we have a better understanding of the consoles...hope he doesn't get fired tho. MS likes having bigmouths fired.
 

calder

Member
I haven't really followed or given much of a shit about this thread, but damn, you KNOW someone was talking out of turn when an OP has 7 legit news updates to the original story.
 

SaintR

Member
Oh, this thread got real good in a hurry. That said, let's consider for a moment that we are still dealing with closed box systems that will be easier and easier for developers to utilize over time. Let's also consider that whether one has hUMA and another doesn't, or whether none of them have hUMA, there are many proven technologies and software practices that already do just fine without AMD's hUMA technology, and I can't see why these things wouldn't continue to prove effective absent AMD's technology.

In other words, I think the consoles will be just fine with or without hUMA.

As an innocent bystander in this console war (my word means nothing), it's really hard not to point out how transparent you are on this issue and the technical differences between the two machines. Sometimes, just sometimes, there are things in life you just can't coat in a different shade of green of your own volition.
I'm not arguing the premise of what you wrote so much as the intent.
 

QaaQer

Member
It's a beautiful OP, and a great story.
Indeed, they told him to STFU as saying that one of your vendors drastically improved your die, and another barely bothered, is bad corporate relations.

Yup, and I think the credit goes to W!CKED, a pretty great junior.
 

Stike

Member
Old interview from Cerny, but it seemed appropriate.

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2

Particularly the first point about bypassing bus to directly access system memory with the CPU/GPU sounds like shared address space.

Okay, this sounds pretty amazing - because apparently, Cerny built a new Amiga.

The Amiga came with an array of custom chips, each designed for a specific task: one for audio, one to process colors and move objects, one for the video output, etc. Every one of those chips had direct access to the RAM, and thus DMA (direct memory access) was born. (Guess where DMA Design came from, the guys behind Lemmings, Blood Money and, later, GTA and Crackdown...)

Here it is explained that there are custom chips and the GPU has direct access to the RAM, much like the Amiga back then.

Suddenly it doesn't seem like just a coincidence that Shadow of the Beast, one of the memorable technical showcases for the Amiga, has been resurrected, huh?

This design gave programmers a lot of options to squeeze out more and more performance even after many years. Just take a look at these examples (sorry if not appreciated here):

This demo from 1993 ran on a vanilla Amiga 500, 7 MHz, 512 KB RAM, built 1986:
http://www.youtube.com/watch?v=bfg8IFnRvFg

Same hardware: Shadow of the Beast, 1989 (too bad you can't see the 50 fps this was running at)
http://www.youtube.com/watch?v=e-U6HUaAONI

I am VERY much looking forward to what devs can get out of that pretty black box...
 

Deleted member 80556

Unconfirmed Member
So, what I got from the updates is that it's very likely that both consoles have hUMA, right?

But AMD isn't supposed to talk about it at all; they slipped up, and that's how we got here?
 

Leb

Member
This thread has enjoyed remarkable longevity, especially when one considers that we know precisely as much now as we did before the thread was posted in the first place.
 

Sounddeli

Banned
MS Engineer Architect Panel

We have to invest a lot in coherency throughout the chip. There's been I/O coherency for a while, but we really wanted to get the software out of the mode of managing caches and put in hardware coherency for the first time on a mass scale in the living room, on the GPU.
 