
AMD confirms GCN, incl. XBO, doesn't support DX12 12_1 (but does 12_0, stop panicking)

tuxfool

Banned
Not just the GPU but also the CPU, which is 10000x less likely to happen considering how antsy AMD/Intel seem to be about giving out x86 licences.

Totally. However, I'm uncertain whether the licence Intel gave to AMD permits sharing of x86 IP and only restricts manufacturing, or whether it also restricts who can design x86 devices.
 

RandSec

Neo Member
AMD has been very clear about their Semicustom business: They offer to modify their already working designs, which supports a quicker part design process. The customer pays for the customization work as it progresses and owns the resulting chip design. Since AMD has the right to build both the x86-64 CPU parts and their own GPU designs, AMD can get the parts built for the customer. Both of the game chips were AMD Semicustom designs.
 

tenchir

Member
Why do people say that AMD fucked up with their 200 series concerning 12_1? The 290X was launched in Oct 2013, while the 970/980 launched in Sept 2014. That's almost a year between launches. So why blame AMD for something they couldn't control a year down the line? Even Nvidia's first-gen Maxwell card, the 750 Ti, which launched in early 2014, doesn't have 12_1.
 

tuxfool

Banned
Why do people say that AMD fucked up with their 200 series concerning 12_1? The 290X was launched in Oct 2013, while the 970/980 launched in Sept 2014. That's almost a year between launches. So why blame AMD for something they couldn't control a year down the line? Even Nvidia's first-gen Maxwell card, the 750 Ti, which launched in early 2014, doesn't have 12_1.

People aren't. It is mostly a single poster with a certain point of view regarding anything AMD does. It is indescribably pathetic.
 
What about the PS4? Didn't it have a more modern architecture? I remember it shared a lot of features with the 290X (not performance, of course).
 

tuxfool

Banned
What about the PS4? Didn't it have a more modern architecture? I remember it shared a lot of features with the 290X (not performance, of course).

"More modern" depends on which features you count as distinguishing GCN 1.0 from GCN 1.1.

There are features in the PS4 GPU that aren't in the XB1, such as the cache volatile bit (also shared with the 290 series). But on the whole they are fairly similar.
 
"More modern" depends on which features you count as distinguishing GCN 1.0 from GCN 1.1.

There are features in the PS4 GPU that aren't in the XB1, such as the cache volatile bit (also shared with the 290 series). But on the whole they are fairly similar.

and? so it has the same limitations?
 

tuxfool

Banned
and? so it has the same limitations?

Yeah, in terms of hardware features required by 12_1.

Of course, the PS4 has its own API, but it just means that most games will mostly target hardware 12_0 features.

12_1 features will have to be added specifically for the PC version.
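
To make that concrete, here's a minimal sketch of how a PC build might create a device at the 12_0 baseline and then check whether the GPU actually reaches 12_1 before turning on the extra features. The function name and variables are just illustrative, not from any shipping engine:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Sketch: target 12_0 as the baseline (roughly what both consoles map to),
// then ask whether the adapter actually reaches feature level 12_1.
bool CreateDeviceAndCheck12_1(ComPtr<ID3D12Device>& device, bool& has12_1)
{
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return false;  // no 12_0-capable adapter at all

    D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = (UINT)(sizeof(requested) / sizeof(requested[0]));
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

    // The 12_1-only extras (conservative raster, ROVs) get enabled off this flag.
    has12_1 = (levels.MaxSupportedFeatureLevel >= D3D_FEATURE_LEVEL_12_1);
    return true;
}
```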
 
Nobody suggested that. However, there is a contingent suggesting that MS bought the rights to the GCN IP, redesigned it, and then handed it off to AMD to build.

Even more laughable is the thought that only they had a part in defining the hardware requirements for 12_1 (no input from AMD, Nvidia, Intel, Qualcomm etc.), and thus the XB1 GPU secretly has full 12_1 features because MS had a hand in its specifications. It should also be mentioned that at the time the APU was designed, the DX12 API and feature set were probably still entirely in flux.

Well, I wouldn't suggest that nobody else had input, nor would I suggest the crazy idea that Microsoft somehow bought the rights to GCN, redesigned it and then said to AMD "Build what we just designed for you." It was obviously a collaborative effort in which AMD's pre-existing graphics expertise, and make no mistake, Microsoft's own as well, was a significant factor. That isn't to disregard the contributions from others including Nvidia, but Microsoft's effort in bringing the whole thing together and making DX12 happen the way it has so far is something they deserve full credit for.

I'm not saying you specifically are doing so, but let us not forget that Microsoft has been at this graphics API programming thing for quite some time now.
 

Durante

Member
There's no way Microsoft didn't have serious input on the design of the Xbox One GPU. I mean, are people implying they just said "hey, build us a GPU and get back to us when it's done?"

Surely Microsoft collaborated with AMD to ensure that whatever Microsoft had in the pipeline, AMD did as much as possible to accommodate it and get the most out of it. Microsoft may not be a known major hardware manufacturer, but what exactly do people believe Microsoft does when they work so closely with companies like AMD or Nvidia in prep for a new DirectX release? Considering when the console released, there's no way we get feature level 12_0 on the Xbox One without a serious collaboration between Microsoft and AMD.
There is collaboration on DirectX, but historically not the way people are now imagining it. MS doesn't say "these will be the features, create the hardware"; GPU companies create hardware and then DX versions and features are specified to use it.
 

Dice

Pokémon Parentage Conspiracy Theorist
Here is the thing.

When DX11 was the new shit, we had tessellation and some fancy dynamic lighting and shadows as the main attractions. There were, at the start, cards marketed as compatible, and sure enough they could render that shit in a frame. Cards like the HD5850. However, they ran like donkey balls with those features turned on, so you were going to get shitty/inconsistent performance, or just run in DX9 mode. It has been this way with every single DX release I can remember.

I have full faith in current cards sucking ass at any proper DX12 game until the next set of cards arrives with hardware designed from the ground up around the way DX12 works. And even with the proclaimed reduced CPU load, I bet CPU load is still increasing and the reduction just mitigates its exponential growth. Same with the relationship between system RAM and VRAM.

Because of these things I'm waiting a couple more years to build a whole new nice DX12 system. The AAA games really built around it, rather than just slapping it on in gimmicky ways, should be finished by then.
 

Panajev2001a

GAF's Pleasant Genius
You won't get it for free, but vastly cheaper than before with ROVs.
Supported by Maxwell Gen 2 and Intel's Haswell.
Sadly, AMD doesn't seem to be supporting this feature until 2016.

I know, which is why I said "one day" :). There might be a set of bottlenecks among future rendering must-haves that could make this feature essentially free, like arithmetic ops in a heavily texture-bound game. We shall see :).
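
In API terms, ROV support is just a caps bit. Here's a minimal sketch of the query, assuming an already-created ID3D12Device (the `device` parameter and function name are only for illustration):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: ROV support is a single caps bit in the D3D12 options struct.
// Per the post above, Maxwell Gen 2 and Intel Haswell report TRUE;
// current GCN parts report FALSE.
bool SupportsROVs(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &options, sizeof(options))))
        return false;
    return options.ROVsSupported != FALSE;  // gate ROV-based render paths on this
}
```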
 
Is there a picture of misterxmedia's physical manifestation out there somewhere? I'm curious...
*imagines The Simpsons' comic book guy*

I think a picture of a stacked dual gpu would suffice. :p

There is collaboration on DirectX, but historically not the way people are now imagining it. MS doesn't say "these will be the features, create the hardware"; GPU companies create hardware and then DX versions and features are specified to use it.

It works both ways. Companies like Nvidia and AMD make hardware and then DX versions and features are created to utilize it, but I'm certain the reverse is also true: there are features MS has in mind that would improve the graphics rendering process, but they know existing hardware isn't quite where it needs to be tech-wise to properly get the most from them.

Basically it's a two-way street. Microsoft offers up ideas and solutions, their partners like AMD, Nvidia etc. do the same on their end, and when it all comes together and some consensus is reached, we get new pieces of hardware and eventually new DirectX releases. Just because new hardware with a certain set of capabilities may release well before there's a new version of DirectX to expose it to Windows gamers doesn't automatically mean that AMD or Nvidia didn't discuss or coordinate those efforts with Microsoft, or with other parties, before making their move. I imagine there's an extensive back and forth where Nvidia or AMD would want Microsoft (and vice versa) to know exactly where their heads are at with regard to certain features.
 

Irobot82

Member
Has anyone checked out this article?

In order to ensure that Direct3D 12 could support the widest range of hardware, without significant compromises that could limit the longevity of the new API, Microsoft and its partners decided to divide support for the new resource-binding model into three "tiers".
Each tier is a superset of its predecessor: tier 1 hardware comes with the strongest constraints on the resource-binding model, tier 3 conversely has no limitations, while tier 2 represents an intermediate level of constraints.

If we talk about the hardware on sale, the situation about the resource-binding tiers is the following:

Tier 1: Intel Haswell and Broadwell, NVIDIA Fermi
Tier 2: NVIDIA Kepler, Maxwell 1.0 and Maxwell 2.0
Tier 3: AMD GCN 1.0, GCN 1.1 and GCN 1.2
Regarding resource binding, currently only AMD GPUs come without hardware limitations, which has been erroneously described as "full support" by some sources.

I'm not really sure what this part means.
 

BeEatNU

WORLDSTAAAAAAR


I have no idea what I'm reading, but it sounds interesting. What does it mean for DX12 on X1?

lol
 
Has anyone checked out this article?



I'm not really sure what this part means.

What it means is developers are going to have an interesting awful time trying to support DX12 in games with so many different GPUs out there which all support a different set of features.

The good news is that the important part of DX12, reduced CPU overhead, requires no hardware support, as it's an improvement in the API itself.
 

dr_rus

Member
What it means is developers are going to have an interesting awful time trying to support DX12 in games with so many different GPUs out there which all support a different set of features.

The good news is that the important part of DX12, reduced CPU overhead, requires no hardware support, as it's an improvement in the API itself.

Not really. I'd wager that most devs will opt for one of the feature levels, with most of them choosing 12_0 since that's what is in both consoles (probably). Resource binding is just one feature within these feature levels, and there is no FL which requires tier 3 resource binding. It probably just won't be used until some DX12.2 or DX13.
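
To tie that back to the resource-binding tiers from the article quoted earlier, here's a sketch of how the tier is reported separately from the feature level, again assuming an existing ID3D12Device (the function name is just illustrative):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: the resource binding tier is queried independently of the feature level,
// which is why GCN's tier 3 binding doesn't by itself mean "full DX12" support.
// Tier assignments in the comments follow the article quoted earlier in the thread.
D3D12_RESOURCE_BINDING_TIER QueryBindingTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS options = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &options, sizeof(options));

    switch (options.ResourceBindingTier)
    {
    case D3D12_RESOURCE_BINDING_TIER_1: break; // Intel Haswell/Broadwell, NVIDIA Fermi
    case D3D12_RESOURCE_BINDING_TIER_2: break; // NVIDIA Kepler, Maxwell 1.0/2.0
    case D3D12_RESOURCE_BINDING_TIER_3: break; // AMD GCN 1.0/1.1/1.2
    }
    return options.ResourceBindingTier;
}
```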
 

wachie

Member
Wow FUD much? GCN 1.1 came out 2 years ago. Kepler owners are seriously fucked with your attitude and in light of the recent tanking performance. GJ nVidia.

[image: DX Feature Levels chart]



I would bet there's a high possibility there.
This.

Most likely 12_1 will become the new DX11.1, i.e. it won't go mainstream, because the majority of GPUs support 12_0, including both consoles. (Yes, the PS4 has its own API, but it's likely based on the same feature set as the Xbox One GPU - Sea Islands.)
 

ekim

Member
http://gamingbolt.com/epic-games-un...rs-to-squeeze-out-even-more-from-the-xbox-one

Microsoft’s Xbox One already has a low level API that resembles DirectX 12. What possible benefits will DX12 along with Unreal Engine 4 bring to games development on Xbox One?

“Unreal Engine 4 already does a great job of showcasing what the Xbox One is capable of and with the advent of DirectX 12 we’re excited to see developers squeezing even more out of the hardware. Several internal Microsoft teams are using UE4 for games development so it’s made it incredibly easy to closely partner with them and to ensure that Unreal Engine is a great development tool for the broader Microsoft ecosystem.”
 

dr_rus

Member
DX12 is a mess:
http://wccftech.com/directx-12-supp...12-1-gcns-resource-binding-tier-3-intels-rov/

Strange to see wccftech doing analysis instead of rumors but there it is. Nobody supports the "full" DX12 right now, devs get to pick and choose what they want to support. If they want to make sure Kepler or Intel IGP works properly with their games, they have to restrict themselves to mostly DX11 feature levels.

No more a mess than DX11 was. The only truly uniform API was DX10 - and that didn't last long because things got complex again with DX10.1.
 