
DirectX 12 GPU-exclusive features to be shown/announced at GDC

Skinpop

Member
Xbox One development is via DirectX and the Xbox SDK. You can't directly access hardware now. You're even limited to your Hyper-V VM container.
I see, this surprises me. For the Xbox 360 they had a close-to-the-metal version of DX, if I remember correctly?
 

Chobel

Member
Xbox One development is via DirectX and the Xbox SDK. You can't directly access hardware now. You're even limited to your Hyper-V VM container.

That said, DirectX 12's goal is to unlock the lower-level calls (via API calls). Xbox One has a 1.75GHz 8-core CPU, 6 cores usable (+1 partly usable), so everything needs to be as multicore as possible. In this case, Xbox is driving the market here, with *significant* gains for PC gamers.

PS: XB1 gfx chip is AMD.

Source? AFAIK you've been able to "write to the metal" since before launch, plus the devs of Metro said this last year

Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available
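
To make the multicore point quoted above concrete: the headline DX11-to-DX12 change is that command recording no longer funnels through a single immediate context, so each core can record its own command list. A minimal sketch against the public D3D12 API (Win10 SDK assumed; the worker count of six is just a nod to the six game-usable cores, not anything from the XDK):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Device on the default adapter; FL 11_0 is the minimum DX12 accepts.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    const int kWorkers = 6;  // hypothetical: one recorder per usable core
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(kWorkers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kWorkers);
    for (int i = 0; i < kWorkers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each thread records its slice of the frame in parallel -- the work a
    // DX11 immediate context would serialise onto a single core.
    std::vector<std::thread> workers;
    for (int i = 0; i < kWorkers; ++i)
        workers.emplace_back([&, i] {
            // ... set state and issue this slice's draws on lists[i] ...
            lists[i]->Close();
        });
    for (auto& t : workers) t.join();

    // The closed lists would then all be submitted in one go via
    // ID3D12CommandQueue::ExecuteCommandLists.
    return 0;
}
```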
 

gossi

Member
I see, this surprises me. For the Xbox 360 they had a close-to-the-metal version of DX, if I remember correctly?

During the lifetime of 360 development MS unlocked a significant portion of direct access. It's happening here too, but MS are trying to do it in a much more controlled fashion this time. And rightly so: it means all titles will potentially see big performance gains via DX12, rather than just the Big Boy(tm) developers.
 

gossi

Member
Source? AFAIK you've been able to "write to the metal" since before launch, plus the devs of Metro said this last year

You cannot do anything on XB1. True story. MS have been very, very good at working with and listening to developers, unlocking things via the SDK. E.g. you can do more work via multicore already: core 7 is now partly usable (shared with the OS), when it wasn't at launch. There's also an SDK API layer to turn off the Kinect sensor, which frees up its reservations.
 

TheMAK

Banned
You cannot do anything on XB1. True story. MS have been very, very good at working with and listening to developers, unlocking things via the SDK. E.g. you can do more work via multicore already: core 7 is now partly usable (shared with the OS), when it wasn't at launch. There's also an SDK API layer to turn off the Kinect sensor, which frees up its reservations.

Cocaine's one hell of a drug. True story, brah!
 

dr_rus

Member
it's probably already been said somewhere in this thread, but a 20% GPU boost to the xbone is a pretty big deal
http://m.windowscentral.com/what-microsoft-revealed-about-xbox-and-windows-10-today-gdc

Yeah, that quote was for the DX12 version of Fable on PC and has no relation to Xbox One. There is a rather lengthy thread on this somewhere around here.

As for DX12 somehow making a game harder to run on PS4 - that's complete nonsense. Nothing will change in multiplatform development with DX12 (I consider Win PC and Xbox One to be one platform already).
 

Chobel

Member
You cannot do anything on XB1. True story. MS have been very, very good at working with and listening to developers, unlocking things via the SDK. E.g. you can do more work via multicore already: core 7 is now partly usable (shared with the OS), when it wasn't at launch. There's also an SDK API layer to turn off the Kinect sensor, which frees up its reservations.

Just making sure here: is this your experience working on Xbox One devkits, or is this some stuff you read online?
 

tuxfool

Banned
Please enlighten me then, because everything I've seen has shown them to be mostly equivalent performance-wise.

oh, and something that isn't Civ 5

If that is the case, why did you state it would hurt performance if an i7 were used? Even then you're far more likely to benefit (even if it doesn't scale to all games).
 
lol no
That figure was for the PC version of Fable Legends.
I remembered that; the way the article was written, though, I figured it was affecting the xbone too.
There's no 20% GPU boost. Please rewatch the full keynote instead of quoting journos who don't have a clue.
how do you fuck that up though?!
Yeah, that quote was for the DX12 version of Fable on PC and has no relation to Xbox One. There is a rather lengthy thread on this somewhere around here.

As for DX12 somehow making a game harder to run on PS4 - that's complete nonsense. Nothing will change in multiplatform development with DX12 (I consider Win PC and Xbox One to be one platform already).
someone else has posted. as for your second statement, are we to presume then that the Civ 5 dev was worrying about a whole lot of nothing?

http://gamingbolt.com/dx12-adoption...production-cross-support-with-ps4-is-a-factor
You probably missed this thread. I'm sorry to disappoint you!

http://www.neogaf.com/forum/showthread.php?t=1003987
ah, no, I do remember that thread, but when I came across that article I thought that maybe it was affecting the xbone as well. but thanks.
 

King_Moc

Banned
it's probably already been said somewhere in this thread, but a 20% GPU boost to the xbone is a pretty big deal
http://m.windowscentral.com/what-microsoft-revealed-about-xbox-and-windows-10-today-gdc

It's a 20% boost for PCs using DirectX. The XB1 is already "coding to the metal", so there is no such performance boost to be gained. The update basically allows PCs to get closer to the level of efficiency that the consoles currently have.

how do you fuck that up though?!

A lot of journalists don't have much of a clue about the technical ins and outs of these things. Also, a few may have willfully misinterpreted it.
 

tuxfool

Banned
During the lifetime of 360 development MS unlocked a significant portion of direct access. It's happening here too, but MS are trying to do it in a much more controlled fashion this time. And rightly so: it means all titles will potentially see big performance gains via DX12, rather than just the Big Boy(tm) developers.

Except it is mostly the Big Boy developers that will use, and have the expertise to use, low-level APIs. DX12 isn't a drop-in replacement for DX11.
 
Except it is mostly the Big Boy developers that will use, and have the expertise to use, low-level APIs. DX12 isn't a drop-in replacement for DX11.

Given that many of the smaller developers don't even touch DX11 or DX12 but rather use ready-made engines like Unity, Unreal Engine or CryEngine, we might actually see more of a difference there in the short term than with publishers' (big boys') in-house engines, as those are pretty much assured of gaining DX12 compatibility sooner rather than later.

When it comes to expertise you're right though. There is a reason why few smaller developers write their own engines nowadays.
 

Skinpop

Member
During the lifetime of 360 development MS unlocked a significant portion of direct access. It's happening here too, but MS are trying to do it in a much more controlled fashion this time. And rightly so: it means all titles will potentially see big performance gains via DX12, rather than just the Big Boy(tm) developers.

I don't see how this is a good thing. You can keep DX while giving access to a low-level API at the same time, like how the PS3 has support for OGL (though most don't use it). It does make some sense from a cross-platform perspective, but when it comes to efficiency, seamless optimization between platforms is pie in the sky anyway. Wouldn't leveraging the strengths of consoles, aka low-level access to the hardware, make more sense?

It would look bad for MS to admit that they use another low-level API for better performance. I think it's nothing more than marketing, trying to make DX look like the default.
 

tuxfool

Banned
That was my point.

I inferred that you meant only big boys got to use low-level APIs close to the metal. DX12 doesn't change that; those that don't have the expertise will either use ready-made engines or program using the DX11 API, at least in the near term.
 

dr_rus

Member
someone else has posted. as for your second statement, are we to presume then that the Civ 5 dev was worrying about a whole lot of nothing?

http://gamingbolt.com/dx12-adoption...production-cross-support-with-ps4-is-a-factor

Even the wording on that link makes no sense whatsoever:
The biggest catch with it right now is the support fragmentation with PS4. There are around 20 million PS4s on the market right now and if a developer pursues DX they’re making a calculated decision to not release to that user base.
So all these developers who are using DX (11.X) on the Xbox One right now are making a calculated decision to not release these games on PS4, am I right?
 

100k

Neo Member
The 20% GPU boost came from Fable Legends converting from DX11 to DX12. Also, has it been confirmed that Xbox One won't support all of the DX12 hardware features?
 

Heigic

Member
If Xbox One fully supported DX12, Microsoft would have said so by now. The fact that they haven't makes it pretty obvious it does not.
 

Kezen

Banned
If Xbox One fully supported DX12, Microsoft would have said so by now. The fact that they haven't makes it pretty obvious it does not.

They have said it does, but what is debated is whether or not it supports all the hardware features. I was sceptical at first, but the more I think of it the more it makes sense: the Xbox One GPU must support at least one of them (tiled resources tier 3), since DirectX 11 does not expose all the GPU features of GCN. I also remember AMD saying the 300 series (GCN 1.1) supported conservative rasterization, so in all likelihood the Xbox (and PS4) must support it as well.
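
For what it's worth, those per-GPU capabilities (tiled resources tier, conservative rasterization tier, ROVs) are queryable at runtime rather than guessed from slides. A minimal sketch using D3D12's CheckFeatureSupport on the default adapter (Win10 SDK assumed):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // One struct reports all the contested hardware tiers at once.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                           &opts, sizeof(opts))))
        return 1;

    printf("Tiled resources tier:            %d\n", opts.TiledResourcesTier);
    printf("Resource binding tier:           %d\n", opts.ResourceBindingTier);
    printf("Conservative rasterization tier: %d\n",
           opts.ConservativeRasterizationTier);
    printf("Rasterizer-ordered views:        %s\n",
           opts.ROVsSupported ? "yes" : "no");
    return 0;
}
```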
 

dr_rus

Member
The 20% GPU boost came from Fable Legends converting from DX11 to DX12. Also, has it been confirmed that Xbox One won't support all of the DX12 hardware features?

GCN 1.0-1.2 does not support all hardware features of DX 12.1. So there's that.
 

Mivey

Member
GCN 1.0-1.2 does not support all hardware features of DX 12.1. So there's that.
There's currently no way of knowing what is in the standard in order to support it in the first place. It's like trying to find a browser that fully supported HTML5 back when HTML5 wasn't even fully standardized.
 

dr_rus

Member
There's currently no way of knowing what is in the standard in order to support it in the first place. It's like trying to find a browser that fully supported HTML5 back when HTML5 wasn't even fully standardized.
You've missed the GDC info, I take it? It is known what a GPU has to support to be compatible with the 12.0 or 12.1 feature levels. It is almost certain that current GCN won't support 12.1. There is a possibility that they will support 12.0. But to be fully DX12 a GPU needs to support 12.1, and right now only Maxwell 2 GPUs are capable of that.
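
The feature levels themselves are also queryable: any GPU with a WDDM 2.0 driver can create a DX12 device at FL 11_0, and the app then asks how high the hardware actually goes. A minimal sketch (Win10 SDK assumed; 0xc000 and 0xc100 are the enum values for 12_0 and 12_1):

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Any WDDM 2.0 driver can give you a DX12 device at FL 11_0...
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // ...then you ask how high the hardware itself actually goes.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels        = _countof(wanted);
    fl.pFeatureLevelsRequested = wanted;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &fl, sizeof(fl))))
        printf("Max supported FL: 0x%x\n", fl.MaxSupportedFeatureLevel);
    return 0;
}
```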
 

Kezen

Banned
You've missed the GDC info, I take it? It is known what a GPU has to support to be compatible with the 12.0 or 12.1 feature levels. It is almost certain that current GCN won't support 12.1. There is a possibility that they will support 12.0. But to be fully DX12 a GPU needs to support 12.1, and right now only Maxwell 2 GPUs are capable of that.

AMD have confirmed that all GCN cards support feature level 12_0. Like Fermi and Kepler.
Maxwell supports 4 DX12 hardware features.
 

Mivey

Member
You've missed the GDC info, I take it? It is known what a GPU has to support to be compatible with the 12.0 or 12.1 feature levels. It is almost certain that current GCN won't support 12.1. There is a possibility that they will support 12.0. But to be fully DX12 a GPU needs to support 12.1, and right now only Maxwell 2 GPUs are capable of that.
I am sure that any new GPU could actually support the upcoming standard. AMD is part of the group that actually develops the Direct3D standard, after all.
I just meant that we don't fully know what the standard fully entails yet, except the broad goals. Those special hardware operations that were shown in a slide a while ago didn't seem too important to me; maybe useful for a few edge cases, where they could make things faster.

I wonder if a feature level 12_0 GPU is going to be much slower in actual games than a GPU with feature level 12_3. Maybe a few percent in benchmarks?
 

dr_rus

Member
AMD have confirmed that all GCN cards support feature level 12_0. Like Fermi and Kepler.
Maxwell supports 4 DX12 hardware features.

a. Where did AMD confirm that?
b. Fermi and Kepler certainly won't support FL 12_0 and are unlikely to support 11_1 either.
c. Maxwell 2 supports FL 12_1.

I am sure that any new GPU could actually support the upcoming standard. AMD is part of the group that actually develops the Direct3D standard, after all.
I just meant that we don't fully know what the standard fully entails yet, except the broad goals. Those special hardware operations that were shown in a slide a while ago didn't seem too important to me; maybe useful for a few edge cases, where they could make things faster.

I wonder if a feature level 12_0 GPU is going to be much slower in actual games than a GPU with feature level 12_3. Maybe a few percent in benchmarks?

Well, until we have any official data on Fiji we can only guess from the leaks. But it is strange that AMD didn't say that they support 12.1 and opted for saying that they support Tier 3 of the resource binding feature instead. That kinda says to me that they might not support FL 12_1 in Fiji.

There is no FL 12_3.
 

Kezen

Banned
a. Where did AMD confirm that?
http://www.legitreviews.com/amd-says-gcn-products-will-support-dx12-amd-freesync-still-coming_137794


b. Fermi and Kepler certainly won't support FL 12_0 and are unlikely to support 11_1 either.
c. Maxwell 2 supports FL 12_1.
Hmm, I may be confused then. I thought Fermi supported DX12 but no DX12 hardware features, so which feature level does this correspond to? 11_0?


Well, until we have any official data on Fiji we can only guess from the leaks. But it is strange that AMD didn't say that they support 12.1 and opted for saying that they support Tier 3 of the resource binding feature instead. That kinda says to me that they might not support FL 12_1 in Fiji.

There is no FL 12_3.

I guess they will support DX12_1 at the very least.
 

dr_rus

Member
That's basically confirmation that they'll have WDDM 2.0 drivers for GCN chips, meaning that GCN will be able to use the "thin API" advantage that DX12 provides. This doesn't say anything about the features supported by the hardware, though. GCN chips currently support FL 11_1 in DX11, and this may be the case with DX12 as well. Although I have a feeling that FL 12_0 is basically what the Xbox One will support, and that means that at least GCN 1.1+ should support FL 12_0 as well.

Hmm, I may be confused then. I thought Fermi supported DX12 but no DX12 hardware features, so which feature level does this correspond to? 11_0?
Yes. Fermi and Kepler will have WDDM 2.0 drivers (like old GCN chips) and will be able to use the "thin" DX12 API, but their hardware feature support is likely to be limited to FL 11_0.

I guess they will support DX12_1 at the very least.

Who? AMD in Fiji? This is unknown at the moment and we can only guess.

DX12_0 corresponds to DX11_2.

DX12_1 corresponds to DX11_3.

Unless MS announced some new FLs for DX11 in Win10 and I've missed it, there are no 11_2 and 11_3.

It is interesting how they'll handle the addition of new features in DX11.3, actually. In DX11.2 they opted for expanding the optional FL 11_1 functionality instead of introducing a new feature level. I'm thinking that they may do the same thing with DX11.3, which would mean that DX11 will still have only two feature levels - 11_0 and 11_1 - but the latter would be further expanded by the additional features of Maxwell 2 and GCN3.
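
That split shows up directly in the DX11 API: a device is still created against FL 11_0/11_1 only, and the DX11.3-era hardware features surface as optional caps bits rather than a new feature level. A minimal sketch, assuming a Win10 SDK d3d11.h where the OPTIONS2 struct carries those bits:

```cpp
#include <d3d11.h>  // Win10 SDK
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // The requested list still only goes up to 11_1 -- DX11 never grew a
    // new feature level for the 11.2/11.3 additions.
    const D3D_FEATURE_LEVEL want[] = { D3D_FEATURE_LEVEL_11_1,
                                       D3D_FEATURE_LEVEL_11_0 };
    ID3D11Device* dev = nullptr;
    D3D_FEATURE_LEVEL got;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                 0, want, 2, D3D11_SDK_VERSION,
                                 &dev, &got, nullptr)))
        return 1;

    // The Maxwell 2 / GCN3-class extras hang off optional caps instead.
    D3D11_FEATURE_DATA_D3D11_OPTIONS2 o2 = {};
    if (SUCCEEDED(dev->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS2,
                                           &o2, sizeof(o2))))
        printf("FL 0x%x, conservative raster tier %d, ROVs %d\n",
               got, o2.ConservativeRasterizationTier, o2.ROVsSupported);
    dev->Release();
    return 0;
}
```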
 

tuxfool

Banned
That's basically confirmation that they'll have WDDM 2.0 drivers for GCN chips, meaning that GCN will be able to use the "thin API" advantage that DX12 provides. This doesn't say anything about the features supported by the hardware, though. GCN chips currently support FL 11_1 in DX11, and this may be the case with DX12 as well.

Unless MS announced some new FLs for DX11 in Win10 and I've missed it, there are no 11_2 and 11_3.

Hmm. Yeah, I'm confusing DX release features with feature levels. DX11.2 is FL 11_1; the added features are only software-related.

This would imply DX12_0 is equivalent to DX11_1. I did read in AT that DX11.3 will support the same features as DX12_1, obviously to accommodate the same features in both APIs. Now, what feature level 11.3 gets, I don't know.
 

Kezen

Banned
DX12_0 corresponds to DX11_2.

DX12_1 corresponds to DX11_3.

There's nothing there about hardware feature levels.

That's basically confirmation that they'll have WDDM 2.0 drivers for GCN chips, meaning that GCN will be able to use the "thin API" advantage that DX12 provides. This doesn't say anything about the features supported by the hardware, though. GCN chips currently support FL 11_1 in DX11, and this may be the case with DX12 as well. Although I have a feeling that FL 12_0 is basically what the Xbox One will support, and that means that at least GCN 1.1+ should support FL 12_0 as well.


Yes. Fermi and Kepler will have WDDM 2.0 drivers (like old GCN chips) and will be able to use the "thin" DX12 API, but their hardware feature support is likely to be limited to FL 11_0.



Who? AMD in Fiji? This is unknown at the moment and we can only guess.



Unless MS announced some new FLs for DX11 in Win10 and I've missed it, there are no 11_2 and 11_3.

It is interesting how they'll handle the addition of new features in DX11.3, actually. In DX11.2 they opted for expanding the optional FL 11_1 functionality instead of introducing a new feature level. I'm thinking that they may do the same thing with DX11.3, which would mean that DX11 will still have only two feature levels - 11_0 and 11_1 - but the latter would be further expanded by the additional features of Maxwell 2 and GCN3.

Okay, I wasn't clear on the differences between the API and feature levels.
 

nkarafo

Member
I remember this with DX9 cards and DX10. I don't even remember DX10 features being important or making a huge difference in games. I do remember Lost Planet having slightly better fur effects on coats and Crysis having some extra lighting effects, but that's it. Not to mention that you could actually enable most of the "DX10" effects in Crysis even if you had a DX9 card.

I don't know what DX11 brought to the table as I was away from PC gaming for more than 5 years.
 
And I just uninstalled the WDDM 2.0 driver to try the latest one for W8.1 -.-, anyway, downloading.

edit: bah, nevermind

The API Overhead feature test is available now in the latest version of 3DMark Advanced Edition and 3DMark Professional Edition. The Steam version of 3DMark updates automatically. The standalone will prompt you to download and install an update. The test cannot be run from the free 3DMark Basic Edition or Steam demo.
 

Journey

Banned
Source? AFAIK you've been able to "write to the metal" since before launch, plus the devs of Metro said this last year

Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available



But they also said it was too late to do anything for Metro, and that's a pretty recent game, which would indicate most available games have not been able to take advantage of this DX12/GNM-style API; emphasis on the word "style", because it does not mean it features everything that DX12 will allow.


Here's the full quote comparing the APIs:

Oles Shishkovstov: Let's put it that way - we have seen scenarios where a single CPU core was fully loaded just by issuing draw-calls on Xbox One (and that's surely on the 'mono' driver with several fast-path calls utilised). Then, the same scenario on PS4, it was actually difficult to find those draw-calls in the profile graphs, because they are using almost no time and are barely visible as a result. [That's how much faster the PS4 API handles draw calls]

In general - I don't really get why they [Microsoft] choose DX11 as a starting point for the console. It's a console! Why care about some legacy stuff at all? On PS4, most GPU commands are just a few DWORDs written into the command buffer, let's say just a few CPU clock cycles. On Xbox One it easily could be one million times slower because of all the bookkeeping the API does.

But Microsoft is not sleeping, really. Each XDK that has been released both before and after the Xbox One launch has brought faster and faster draw-calls to the table. They added tons of features just to work around limitations of the DX11 API model. They even made a DX12/GNM style do-it-yourself API available - although we didn't ship with it on Redux due to time constraints.


Sounds horrible, and it looks like the important changes didn't come soon enough for games like Metro and possibly many other games with the same release window. Particularly alarming is that the PS4 handles draw calls so fast that they're barely visible in the profile graph; that just screams how badly it's handled on the bone, where a CPU core was found to be hosed by draw calls.
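
To illustrate what "a few DWORDs written into the command buffer" amounts to, here's a toy mock-up; the packet layout is invented for illustration and is not the real GNM/GCN encoding:

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// A console-style draw is little more than a few 32-bit words appended to
// a buffer the GPU front-end consumes directly. Packet layout below is
// made up for illustration; it is NOT the real GNM/GCN encoding.
struct CommandBuffer {
    std::vector<uint32_t> words;

    void draw(uint32_t vertexCount, uint32_t startVertex) {
        words.push_back(0xC0DE0001u);  // hypothetical DRAW opcode
        words.push_back(vertexCount);
        words.push_back(startVertex);
        // A few CPU cycles: no validation, no hazard tracking, none of the
        // per-call driver bookkeeping a DX11-era runtime pays for -- which
        // is exactly the overhead DX12-style APIs strip out.
    }
};

int main() {
    CommandBuffer cb;
    for (uint32_t i = 0; i < 10000; ++i)
        cb.draw(36, 0);  // 10k draws come to ~120 KB of command words
    printf("%zu words buffered\n", cb.words.size());
    return 0;
}
```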
 

diffusionx

Gold Member
buyers of 970/980 are going to be pissed.
In hindsight, it will be a while before Windows 10 comes out, right?


Old post, but man, buyers of the 970 are already pissed.

I have been supporting PC gaming for almost 20 years, but I always seem to get hosed in some way, shape or form. Even when I buy the "sure thing." Comes with the territory, I guess.
 