
AMD confirms GCN, incl. XBO, doesn't support DX12 12_1 (but does 12_0, stop panicking)

ps3ud0

Member
So what are the differences in hardware feature set between DX12.0 and DX12.1 full compatibility?

ps3ud0 8)
 
DX releases are often a feature year for GPU manufacturers (i.e. expensive rebadges for little effort); I'm not expecting any cards released in 2015 to be fully DX12 compatible.
 

Seanspeed

Banned
So if I'm in the market for a GPU, how do I know which current cards will make use of Direct X 12.1?

I take it a 750ti does not?
No, 750Ti does not. All other Maxwell cards, 960 and up, do, though.

It's probably not any major deal, though. I certainly wouldn't go judging which GPU I got over this, unless all else was equal.
 
So if Nvidia is confirming support for DX 12_1 on the 970 and 980, why did their site update the Titan X page with support for it when the 980 Ti was announced, but hasn't done so yet for the 970 or 980? Something is wrong here.

http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-970/specifications

http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-titan-x/specifications

All of the features specified in 12_1 are in the docs detailing the second-gen Maxwell architecture, so GM200 (Titan X, 980 Ti), GM204 (970, 980) and GM206 (960) are at the 12_1 feature level.

You should just assume they haven't updated their store page.
 

SmokedMeat

Gamer™
No, 750Ti does not. All other Maxwell cards, 960 and up, do, though.

It's probably not any major deal, though. I certainly wouldn't go judging which GPU I got over this, unless all else was equal.

If I'm doing a build though, I guess it's worth going for the GTX960 over the 750ti?
 

jett

D-Member
I thought it was confirmed that R9 cards like the 280X would support DX12...I guess not, sigh.
 

Ty4on

Member
I thought it was confirmed that R9 cards like the 280X would support DX12...I guess not, sigh.

DX12 =/= DX12.1

I think DX12 is primarily CPU optimizations and all DX11.1 compatible cards support it. DX12.1 has new features.
Edit: They can't mean GCN 1.0 won't get any DX12, can they?
 

dr_rus

Member
What I don't get is what they mean by only *some* GCN cards supporting 12.0. Is that not base DX12 support? Or is there another, more basic level, with 12.0 being the first tiered 'feature level'?

It seems that GCN 1.0 won't be able to support feature level 12_0. Thus its maximum supported feature level under DX12 will be 11_1. This means that Pitcairn and Tahiti owners will only receive "legacy" DX12 support ("thin" API itself with the performance improvements but no new rendering features), like all owners of non-Maxwell 2 DX11 GPUs from NV.

This remains to be seen though; they haven't said anything official about GCN 1.0 yet.
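
For anyone wondering how that "legacy DX12" support would actually show up: the runtime reports a maximum feature level per device. Here's a minimal C++ sketch against the public D3D12 API from the Windows 10 SDK (error handling mostly omitted); a GCN 1.0 card with a DX12 driver would still create a device here, it would just come back with 11_0/11_1 rather than 12_0/12_1:

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Create a DX12 device on the default adapter and report the highest
// feature level the driver exposes (11_0, 11_1, 12_0 or 12_1).
D3D_FEATURE_LEVEL QueryMaxFeatureLevel()
{
    ComPtr<ID3D12Device> device;
    // Ask for 11_0 as the minimum, so older GCN parts still get a device.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return D3D_FEATURE_LEVEL_9_1;   // sentinel: no DX12 device at all

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1 };

    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels        = _countof(requested);
    levels.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &levels, sizeof(levels));

    return levels.MaxSupportedFeatureLevel;   // e.g. 11_1 on Tahiti/Pitcairn, if the article is right
}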
 

Kayant

Member
Maybe because the 12_1 only got fully finalised after the fact? Just a hunch.

All of the features specified in 12_1 are in the docs detailing the second-gen Maxwell architecture, so GM200 (Titan X, 980 Ti), GM204 (970, 980) and GM206 (960) are at the 12_1 feature level.

You should just assume they haven't updated their store page.

Fair enough, that's great. I was worried it wouldn't support it. Thanks.
 

Panajev2001a

GAF's Pleasant Genius
AMD's global head of technical marketing, Robert Hallock, confirmed to Computerbase that GCN-based graphics cards will support DX12 and that at least some will also support DX12 feature level 12_0. Computerbase mentions that Tonga, Hawaii and the Xbox One GPU will surely support 12_0; however, question marks remain for older chips like Bonaire, Tahiti and Pitcairn.

Meanwhile, Nvidia confirmed earlier their chipsets GM200, GM204 and GM206 will already support DX12 feature level 12_1. At GDC this year, MS gave an overview of feature level 12_0 and 12_1:


Hallock said it shouldn't be an issue that AMD cards only support feature level 12_0 since all the features relevant for games are included in 11_1 and 12_0. Also, most games will stick closely to what the consoles use and will therefore mostly use 12_0. Keep in mind, though, he's the global head of technical marketing, so his words shouldn't just be taken as gospel. Computerbase also pointed out Nvidia said the same thing when their Kepler cards didn't support feature level 11_1.

tl;dr: If you want a fully compatible DX12 card you should look into getting an Nvidia GTX 960, 970, 980, 980 Ti or Titan X. Or wait until more information regarding AMD's upcoming Fiji card(s) is released.

PS4? It probably does not have full feature level 12_1, but it would be interesting to see how they would place it.
 

leeh

Member
PS4? It probably does not have full feature level 12_1, but it would be interesting to see how they would place it.
It won't have the hardware changes to theoretically support 12_1.

From the sounds of that, the impression is that at a hardware level it would theoretically be able to support 11_1, which is what all the GCN cards will support.

My question is, what's the difference between 12_0 and 11_1?
 

Ty4on

Member
My question is, what's the difference between 12_0 and 11_1?

I'm pretty sure it was CPU optimization for draw calls. DX11 couldn't use more than one core very efficiently making it scale poorly on more than four threads.

Regarding GCN 1.0 not getting DX12:
(image attachment: 73019.png)
 

Ty4on

Member
Ah, so it's literally just the CPU optimisations?

So all the stuff around command list/bundles and the descriptor heaps and tables are just 12_0 and 12_1?

I have no idea why 12.0 and 12 would be different, but to my understanding 12 doesn't have any new features. The CPU optimization is pretty big though as that has held back a lot of bigger games and means you can save something on the CPU by not needing a K series i5.
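
For anyone trying to picture what those descriptor heaps/tables actually are: they're not a feature level thing, just the DX12 binding model. A rough C++ sketch under some assumptions (a device, a command list and a root signature whose parameter 0 is a descriptor table already exist; the helper names are mine, purely illustrative):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// A shader-visible heap holds the CBV/SRV/UAV descriptors a frame will use.
ComPtr<ID3D12DescriptorHeap> CreateSrvHeap(ID3D12Device* device, UINT numDescriptors)
{
    D3D12_DESCRIPTOR_HEAP_DESC desc = {};
    desc.Type           = D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV;
    desc.NumDescriptors = numDescriptors;
    desc.Flags          = D3D12_DESCRIPTOR_HEAP_FLAG_SHADER_VISIBLE;  // readable by the GPU

    ComPtr<ID3D12DescriptorHeap> heap;
    device->CreateDescriptorHeap(&desc, IID_PPV_ARGS(&heap));
    return heap;
}

// A "descriptor table" is just a root-signature parameter that points at a
// contiguous range of descriptors inside that heap.
void BindTable(ID3D12GraphicsCommandList* cmdList, ID3D12DescriptorHeap* heap)
{
    ID3D12DescriptorHeap* heaps[] = { heap };
    cmdList->SetDescriptorHeaps(1, heaps);
    cmdList->SetGraphicsRootDescriptorTable(0, heap->GetGPUDescriptorHandleForHeapStart());
}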
 

leeh

Member
I have no idea why 12.0 and 12 would be different, but to my understanding 12 doesn't have any new features. The CPU optimization is pretty big though as that has held back a lot of bigger games and means you can save something on the CPU by not needing a K series i5.
From my understanding, 12_0 had the optimisations above, but 12_1 had hardware changes which added more hardware based optimisations for voxels, and enabling real-time GI?
 

Seanspeed

Banned
Well, specifically, I'm trying to stay under $200.
Wait and see what AMD do. If they do a rebadged 290 as a 380 and price it around $200, that is going to be much preferable to a GTX960. It'll probably be more like $250, but you never know.

If not, then I'd still say a 280X is a better bet. More powerful and more vRAM.

Like I said, I wouldn't base your choice on whether it has 12.1 functionality. At that price level, I'd recommend looking for best bang-for-buck. And AMD seems to do better at this level.
 

etta

my hard graphic balls
Yea.... I doubt it. DX is a Microsoft product, I am sure when they built the Xbox One, they also knew what would be required for DX12.1. I mean, these guys are not amateurs, they are Microsoft.
 

Seanspeed

Banned
Yea.... I doubt it. DX is a Microsoft product, I am sure when they built the Xbox One, they also knew what would be required for DX12.1. I mean, these guys are not amateurs, they are Microsoft.
Microsoft did not build the GPU.....

The company doesn't revolve around Xbox, either.
 

Locuza

Member
There is no DX12.1; there is only DX12 and its feature levels 11_0, 11_1, 12_0 and 12_1.
The consoles are using GCN Gen 2 / GCN 1.1 / IP v7 / Sea Islands, you name it.
FL12_0 is the highest feature level they support (from a DX12 perspective).

FL12_1 needs serious hardware modifications, for which the console technology is too old.
 

Rootbeer

Banned
But when developers start to adopt DX12, will they even care about 12_1 since they can target way, way more GPUs by sticking to 12_0?

Didn't we already go through this with DX 10.1 and 11.1? How many games even required you to have a GPU with those feature sets? I can't recall many. Why should we believe this is going to be a different scenario, even though DX 12_0 and 12_1 are becoming available at the same time and not spaced apart like the previous feature levels?

If the XB1 is also 12_0 only then it gives devs even less incentive to target 12_1 even for ports.

I'm making certain that my next GPU supports 12_1 but I'm wondering if it will even matter.
 
There is no DX12.1; there is only DX12 and its feature levels 11_0, 11_1, 12_0 and 12_1.
The consoles are using GCN Gen 2 / GCN 1.1 / IP v7 / Sea Islands, you name it.
FL12_0 is the highest feature level they support (from a DX12 perspective).

FL12_1 needs serious hardware modifications, for which the console technology is too old.

This should be in OP.
 

etta

my hard graphic balls
Microsoft did not build the GPU.....

The company doesn't revolve around Xbox, either.

Except they designed the features. AMD just provided the base hardware and manufacturing. It is like how Apple designs their CPU's but Samsung manufactures them.

There's no way in hell Microsoft did not anticipate and design the Xbox One accordingly.
 

Locuza

Member
But when developers start to adopt DX12, will they even care about 12_1 since they can target way, way more GPUs by sticking to 12_0?

Didn't we already go through this with DX 10.1 and 11.1? How many games even required you to have a GPU with those feature sets? I can't recall many. Why should we believe this is going to be a different scenario, even though DX 12_0 and 12_1 are becoming available at the same time and not spaced apart like the previous feature levels?

If the XB1 is also 12_0 only then it gives devs even less incentive to target 12_1 even for ports.

I'm making certain that my next GPU supports 12_1 but I'm wondering if it will even matter.
It's of course very unlikely that FL12_1 will be a prominent requirement.
I only see a very few software vendors/games using it for PC-exclusive features.
I assume this will be more a job tackled by the various hardware vendors with small middleware like VXGI from Nvidia, which uses Tiled Resources Tier 3 and Conservative Rasterization for a global illumination solution.
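
For reference, those two requirements show up to an engine as plain caps bits rather than a whole new API, so middleware like VXGI can simply check them and fall back if they're missing. A minimal sketch using the real D3D12 caps struct; the gating function itself is only my illustration:

#include <windows.h>
#include <d3d12.h>

// Returns true if the device has the FL12_1-class hardware a voxel GI path
// like VXGI asks for: Tiled Resources Tier 3 plus Conservative Rasterization.
bool SupportsVoxelGIPath(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return false;

    return opts.TiledResourcesTier            >= D3D12_TILED_RESOURCES_TIER_3 &&
           opts.ConservativeRasterizationTier >= D3D12_CONSERVATIVE_RASTERIZATION_TIER_1;
}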
 

Panajev2001a

GAF's Pleasant Genius
It's of course very unlikely that FL12_1 will be a prominent requirement.
I only see a very few software vendors/games using it for PC-exclusive features.
I assume this will be more a job tackled by the various hardware vendors with small middleware like VXGI from Nvidia, which uses Tiled Resources Tier 3 and Conservative Rasterization for a global illumination solution.

I am more interested in having fast, one day free again, Order Independent Transparency... The DreamCast's GPU was doing it (HW translucency sorting, basically the same end result) in HW years ago :(...
 

Ke0

Member
Except they designed the features. AMD just provided the base hardware and manufacturing. It is like how Apple designs their CPU's but Samsung manufactures them.

There's no way in hell Microsoft did not anticipate and design the Xbox One accordingly.

So MS ended up designing a GPU that's JUST like the GPU Sony had designed for them by AMD and it performs better.

Weird. It's like AMD sold Sony MS's GPU design, they should totally sue or something!
 

Locuza

Member
I am more interested in having fast, one day free again, Order Independent Transparency... The DreamCast's GPU was doing it (HW translucency sorting, basically the same end result) in HW years ago :(...
You won't get it for free, but it gets vastly cheaper than before with ROVs.
Supported by Maxwell Gen 2 and Intel's Haswell.
Sadly, AMD doesn't seem to be supporting this feature until 2016.
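
ROV support is likewise a single caps bit, so a renderer can pick its transparency path off it at startup. A small C++ sketch; the path names and the fallback choice are mine, not any vendor's actual implementation:

#include <windows.h>
#include <d3d12.h>

enum class OitPath { RasterizerOrderedViews, SortedFallback };

// ROVs (the FL12_1 feature that makes order-independent transparency much
// cheaper) are advertised through a single BOOL in the D3D12 options caps.
OitPath ChooseTransparencyPath(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    return opts.ROVsSupported ? OitPath::RasterizerOrderedViews
                              : OitPath::SortedFallback;   // e.g. sorted/depth-peeled transparency
}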
 

dr_rus

Member
My question is, what's the difference between 12_0 and 11_1?

http://en.wikipedia.org/wiki/Direct3D#Direct3D_12_levels


I'm pretty sure it was CPU optimization for draw calls. DX11 couldn't use more than one core very efficiently making it scale poorly on more than four threads.

Regarding GCN 1.0 not getting DX12:
(image attachment: 73019.png)

CPU optimization for draw calls is a feature of the DX12 API, which has four feature levels for hardware - see the wiki above. This optimization will be available on all GPUs that get a DX12 driver, and it's not tied to feature levels in any way.
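
To make the draw-call point concrete: under DX12 each worker thread can record into its own command list, and the main thread submits them in one go, which is where the multi-core scaling comes from. A rough sketch of the pattern (real D3D12 calls, but the thread partitioning and the commented-out RecordDraws helper are purely illustrative; synchronization and error handling omitted):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Each worker thread records draws into its own command list/allocator;
// the main thread then submits everything with a single ExecuteCommandLists.
void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue, unsigned workers)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread>                       threads;

    for (unsigned i = 0; i < workers; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));

        threads.emplace_back([&, i] {
            // RecordDraws(lists[i].Get(), i);   // hypothetical per-thread draw recording
            lists[i]->Close();                   // list is now ready to execute
        });
    }
    for (auto& t : threads) t.join();

    std::vector<ID3D12CommandList*> submit;
    for (auto& l : lists) submit.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(submit.size()), submit.data());
}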
 

Locuza

Member
So MS ended up designing a GPU that's JUST like the GPU Sony had designed for them by AMD and it performs better.

Weird. It's like AMD sold Sony MS's GPU design, they should totally sue or something!
The GPU design/technology is the same, sold by AMD to both of them.
MS and Sony just said how many units of flavour X (ALUs, ROPs, ACEs) they would like.
The custom portion happened in other sections, like the memory network and special fixed-function hardware.
 

Seanspeed

Banned
Except they designed the features. AMD just provided the base hardware and manufacturing. It is like how Apple designs their CPU's but Samsung manufactures them.

There's no way in hell Microsoft did not anticipate and design the Xbox One accordingly.
Microsoft did not build the GPU's, though. If AMD simply didn't have the capability for it, then there's nothing else to do. Microsoft aren't going to limit the development of DX12 just because there's a few things the Xbox couldn't do. Like I said, the company doesn't revolve around the Xbox.
 

etta

my hard graphic balls
So MS ended up designing a GPU that's JUST like the GPU Sony had designed for them by AMD and it performs better.

Weird. It's like AMD sold Sony MS's GPU design, they should totally sue or something!

Microsoft did not build the GPU's, though. If AMD simply didn't have the capability for it, then there's nothing else to do. Microsoft aren't going to limit the development of DX12 just because there's a few things the Xbox couldn't do. Like I said, the company doesn't revolve around the Xbox.

Let me clarify. AMD provided the base hardware; let's assume an HD 7970 GPU. Microsoft took that core product and added some features, removed some features, and redesigned some features. It is now a heavily modified HD 7970 that is in most ways better than the original 7970, but it may also be missing features that Microsoft thought the Xbox One didn't need, making it worse than the 7970 in those aspects.

Sony did the same thing with the PS4. They took that base GPU and heavily modified it. Just because the core is the same as the Xbox One GPU does not mean they are both the same in every single way. In some features the PS4 clearly has the advantage, like the extra ROPs and all that other stuff. The Xbox One version, on the other hand, has a higher clock, maybe because Microsoft modified something, i.e. the cooling design, letting it run at a higher clock at the same temperature. AMD then just manufactures the GPU to the specifications given by Microsoft/Sony.

The same process then repeats with the CPU. Microsoft and Sony took that base Jaguar CPU and they each modified it to their liking. Of course with a CPU there is less headroom for customization, but some things can be modified. As an example, the Xbox One Jaguar runs at a higher clock speed, and again this is probably tied to the cooling design Microsoft has for the rest of the system.

Some of the hardware, such as the RAM, is just base hardware, and not modified. The same can be said for HDD, but these are probably tested more thoroughly to ensure it lasts an entire generation.

Is this clearer?
 

hodgy100

Member
Let me clarify. AMD provided the base hardware; let's assume an HD 7970 GPU. Microsoft took that core product and added some features, removed some features, and redesigned some features. It is now a heavily modified HD 7970 that is in most ways better than the original 7970, but it may also be missing features that Microsoft thought the Xbox One didn't need, making it worse than the 7970 in those aspects.

Sony did the same thing with the PS4. They took that base GPU and heavily modified it. Just because the core is the same as the Xbox One GPU does not mean they are both the same in every single way. In some features the PS4 clearly has the advantage, like the extra ROPs and all that other stuff. The Xbox One version, on the other hand, has a higher clock, maybe because Microsoft modified something, i.e. the cooling design, letting it run at a higher clock at the same temperature. AMD then just manufactures the GPU to the specifications given by Microsoft/Sony.

The same process then repeats with the CPU. Microsoft and Sony took that base Jaguar CPU and they each modified it to their liking. Of course with a CPU there is less headroom for customization, but some things can be modified. As an example, the Xbox One Jaguar runs at a higher clock speed, and again this is probably tied to the cooling design Microsoft has for the rest of the system.

Some of the hardware, such as the RAM, is just base hardware, and not modified. The same can be said for HDD, but these are probably tested more thoroughly to ensure it lasts an entire generation.

Is this clearer?

No, not really. It's more likely MS and Sony provided specifications of what they wanted their SoCs to be capable of, and then AMD designed them.
 

etta

my hard graphic balls
No, not really. It's more likely MS and Sony provided specifications of what they wanted their SoCs to be capable of, and then AMD designed them.

So Microsoft would do the R&D for most of the console but omit the GPU R&D? Makes sense...
not really.

This is exactly like the case with Apple and Samsung. Apple designs the CPU, but lacks the manufacturing abilities, so they pay Samsung to manufacture it. Also, they would have to pay a lot more to AMD/Samsung for them to design it as well, and not just manufacture it. Microsoft/Apple also have way more control over the design process if they do it themselves.
 