
DirectX 12 GPU exclusive features to be shown/announced at GDC

I've learned a lot about PCs and how they work over the past couple of years, but I must ask:

If I were to keep my i5 processor and get a Titan X, would it be a bad idea?

You definitely would not be using the Titan X to its full potential in that scenario. I went from an i7 975 Extreme to a 4770K, and with the same GPU (GTX 680) I saw significant gains in many games - all that from just a CPU upgrade.
 

Kronik

Banned
So based on what we currently know about DX12 features and hardware, it looks like this:

[image: DX12 feature/hardware support chart]
 

RVinP

Unconfirmed Member
So based on what we currently know about required DX12 functions and hardware, it looks like this:

[image: DX12 feature/hardware support chart]

Most of those features are already present at the hardware level on AMD, while on Nvidia it's either emulated (prev gen) or partially supported (cur gen)?

Would these features make a difference on the consumer's end?
 
So based on what we currently know about DX12 features and hardware, it looks like this:

[image: DX12 feature/hardware support chart]

AMD way ahead of the curve with a GPU architecture released in 2012. Christ.

I might ride my R9 290 until 16nm depending on how DX12 games run (I'm expecting "really well" because of async compute and general efficiency improvements). This way I can put the money I would spend on a 390X this year towards video games and a new monitor. Maybe a Vive.

Not planning to go beyond 1440p in any case.

I dodged a bullet by not getting a GTX 780. Feel pretty smug right now.

EDIT: Also I was right when I said DX12 was going to be "whatever GCN can do" heh
 

Chobel

Member
Sum of Absolute Differences instruction. Probably not a big deal, but useful in some cases.

Thanks! I did a quick search on it, and it looks like it's mainly useful in object detection stuff and video compression. I couldn't find how it can be useful for 3D rendering.
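For reference, SAD itself is dead simple. Here's a plain C++ version over two 8x8 pixel blocks (my own illustration of what the instruction computes; the hardware just does it in far fewer operations over packed bytes):

```cpp
#include <cstdint>
#include <cstdlib>

// Sum of Absolute Differences over two 8x8 blocks of 8-bit pixels.
// Video encoders use this to score how well a candidate block matches
// a reference block during motion estimation: lower sum = better match.
uint32_t sad_8x8(const uint8_t* a, const uint8_t* b, size_t stride)
{
    uint32_t sum = 0;
    for (size_t y = 0; y < 8; ++y)
        for (size_t x = 0; x < 8; ++x)
            sum += static_cast<uint32_t>(
                std::abs(int(a[y * stride + x]) - int(b[y * stride + x])));
    return sum;
}
```

An encoder runs this over thousands of candidate blocks and picks the one with the lowest sum, which is why a single-instruction version matters.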
 

Kronik

Banned
Yeah, well, the GCN architecture was always more feature-rich and forward-thinking; that's one of the reasons why it consumes more power than the cut-down, mobile-first Kepler/Maxwell.

However, most of the GCN features couldn't be used previously on PC, and will only come into play with DX12. So, in a sense, Nvidia made the right calls for the present, and AMD went for future proofing -- that's why it's good that GCN is in consoles.

GCN is also a stateless compute architecture, which will be a huge advantage when doing async timewarp, which is what the Oculus Rift will heavily rely on.

Also, if you have a new Maxwell card, I wouldn't worry too much; that's why DX12 has tiers. And, of course, the first DX12 games are a long time away...
 

riflen

Member
So based on what we currently know about DX12 features and hardware, it looks like this:

This is interesting. What's the source, please?
Looks like Nvidia gambled that you wouldn't be seeing any DirectX 12 games in the wild until 2016 and that even if there were one or two, GM20x had support that was "good enough".
 

Kronik

Banned
Thanks! I did a quick search on it, and it looks it mainly useful in object detection stuff and video compression. I couldn't find how it can be useful for 3D rendering.

Yes, its current use is mainly in the Xbox One Kinect stuff. But many developers are working on analytical AA methods that use SAD4, so Microsoft made its support mandatory in DX12 (emulation is also accepted).
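For a rough idea of what the 4-wide variant does, here's a scalar C++ emulation (my own sketch, not the official spec): it treats each 32-bit word as four packed bytes, compares them pairwise, and accumulates the differences - the kind of similarity score an analytical AA filter could use on neighbouring pixels.

```cpp
#include <cstdint>
#include <cstdlib>

// Scalar emulation of a 4-wide packed-byte SAD: each input holds four
// 8-bit values packed into one 32-bit word. The GPU instruction does
// the four compares plus the accumulate in a single operation.
uint32_t sad4(uint32_t a, uint32_t b, uint32_t accum)
{
    uint32_t sum = accum;
    for (int i = 0; i < 4; ++i) {
        int av = (a >> (i * 8)) & 0xFF;
        int bv = (b >> (i * 8)) & 0xFF;
        sum += static_cast<uint32_t>(std::abs(av - bv));
    }
    return sum;
}
```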
 

Durante

Member
We'll probably see a few DX12 games in the not too distant future (e.g. 2016) I'd say, simply because the large engines will be quick to support it. They just won't use any feature levels which apply to only <10% of the market.

The major reason to use DX12 is clearly the CPU side of things.

Though personally, I wish everyone would just use Vulkan :p
 

Kronik

Banned
We'll probably see a few DX12 games in the not too distant future (e.g. 2016) I'd say, simply because the large engines will be quick to support it. They just won't use any feature levels which apply to only <10% of the market.

The major reason to use DX12 is clearly the CPU side of things.

Though personally, I wish everyone would just use Vulkan :p

You seem to forget that effects that can be used on max DX12 FEATURE_LEVEL can also be used on consoles and that's not <10% of the market. Especially when porting from Xbox One, the developers simply run a check at the start, and if the user has a GCN card, they switch on the Xbox One effects.
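For what it's worth, a check like that is cheap. Based on the current public D3D12 headers it could look roughly like this (hedged sketch; adapter enumeration and error handling omitted, and pick_feature_level is just my own helper name):

```cpp
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Try to create a device at the highest feature level first and fall
// back; whatever level we end up with decides which effect path to enable.
D3D_FEATURE_LEVEL pick_feature_level(ComPtr<ID3D12Device>& device)
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };
    for (D3D_FEATURE_LEVEL fl : levels) {
        // nullptr = default adapter; fl is the minimum level we accept.
        if (SUCCEEDED(D3D12CreateDevice(nullptr, fl, IID_PPV_ARGS(&device))))
            return fl;
    }
    return D3D_FEATURE_LEVEL_9_1; // no D3D12-capable device available
}
```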
 

tuxfool

Banned
AMD way ahead of the curve with a GPU architecture released in 2012. Christ.

That is insane. I always knew it was a forward-looking architecture, but not to this level. GCN doesn't have the dedicated voxelization hardware found in Maxwell, however. Maybe this will quiet down the people claiming that even recent GCN iterations are crusty and old. It sort of validates their GCN-based iteration vs. developing completely "different" architectures. One does have to bear in mind this defines support, not how well those features are executed.

But it also means, as you said, that MS has an incentive to build DX12 fully around GCN.
 

Kronik

Banned
This is interesting. What's the source, please?
Looks like Nvidia gambled that you wouldn't be seeing any DirectX 12 games in the wild until 2016 and that even if there were one or two, GM20x had support that was "good enough".

Sorry, should've said that the source is Prohardver, whose editor compiled the sheet based on currently available facts - DX12 is not finalized yet, so there's no official sheet. It's in Hungarian, so I translated the sheet to English. I know it's not Anandtech or something, but I trust them 100%. They are the leading hardware site in Hungary, very knowledgeable, and their architecture & hardware reviews are very in-depth and comparable to Anandtech (here is an example).
 

Durante

Member
You seem to forget that effects that can be used on max DX12 FEATURE_LEVEL can also be used on consoles and that's not <10% of the market. Especially when porting from Xbox One, the developers simply run a check at the start, and if the user has a GCN card, they switch on the Xbox One effects.
And what effects would that be exactly? Can you give some concrete examples? Because of all the features in there, the only truly impactful ones I can see only apply with a significant change in the rendering pipeline, and aren't isolated effects you just turn on and off. I don't really see people shipping games with multiple rendering paths which differ significantly based on feature level -- it's just not something that has generally happened in the history of PC gaming. I can see something like SAD4 being used in a post-processing pass where it makes sense, but that's not a huge difference.

But hey, we'll see in 2016.

Source that GCN supports Conservative Rasterization and ROVs?
Yeah, I'm especially curious about the conservative rasterization part. (In other words: I don't believe it)
 

dr_rus

Member
So based on what we currently know about DX12 features and hardware, it looks like this:

[image: DX12 feature/hardware support chart]

Based on what we currently know no GCN chip supports conservative rasterization and ROVs.

And if the image on Fiji above is true it is highly likely that Fiji won't support DX 12.1 FL while Maxwell 2 will.

It is a bit messy, because a GPU can support a higher tier of one feature, but to claim a feature level it needs to meet the minimum required tiers for all of that FL's features.
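Once the runtime ships, anyone can query those tiers themselves. Based on the current public SDK headers, a sketch would be (struct and field names per the preview headers, so treat it as illustrative):

```cpp
#include <d3d12.h>
#include <cstdio>

// Query the per-feature tiers of an already-created device. A feature
// level is only reported if *every* feature meets that level's minimum
// tier, which is why a card can have Tier 3 binding yet still be FL 12.0.
void print_tiers(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (FAILED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts))))
        return;

    std::printf("Resource binding tier:    %d\n", opts.ResourceBindingTier);
    std::printf("Conservative raster tier: %d\n",
                opts.ConservativeRasterizationTier);
    std::printf("Rasterizer ordered views: %s\n",
                opts.ROVsSupported ? "yes" : "no");
    std::printf("Tiled resources tier:     %d\n", opts.TiledResourcesTier);
}
```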

Leaked slides for AMD's next-gen cards suggest their new architecture is the first to be fully DX12 hardware compatible:

[images: two leaked AMD next-gen slides]

These are highly questionable as well, because AFAIK no DX12 feature level requires Tier 3 support for resource binding; it would probably be more informative to indicate which DX12 feature level the GPU supports.
It is possible that Fiji doesn't support FL 12.1 and only supports FL 12.0 (with Tier 3 support for resource binding), which would mean that Maxwell 2 is actually more advanced in terms of DX12 specs.

I wouldn't read much into this anyway as most of these features won't be used in games until the next gen of console hardware probably.
 

dr_rus

Member
Have these been posted?

And does slide 16 confirm full DX12 for Xbox One?

We've known for some time now that DX12 is coming to Xbox One, so what's the point of this question? Will the GCN 1.1 GPU of Xbox One support all features of DX12 to their maximum tiers? No. The hardware isn't advanced enough for this. Fiji might do this. Maxwell 2 will support FL 12.1. But all pre-Fiji GCN cards and APUs (with a possible exception of Carrizo) will have limited DX12 support only.
 

deadman69

Member
Just saw this article about how DX12 could harm development of games for PS4.

With my very basic knowledge of these things I feel this is a crock of shit, but someone smarter than me could shed some light on whether these are genuine concerns.
 
Just saw this article about how DX12 could harm development of games for PS4.

With my very basic knowledge of these things I feel this is a crock of shit, but someone smarter than me could shed some light on whether these are genuine concerns.

I don't know if I'd say "harm." You could start seeing more multiplats developed first for PC, then Xbox, and only then PS4, just because the port to X1 would be easier.
 

Ziffles

Member
That is simply not true; don't give misinformation to people who are looking to buy something new. You might say it's not worth it, but some games definitely do take advantage of an i7.

You're right, that was inaccurate. Allow me to correct myself:

i7 CPUs don't see a benefit in almost any game. In fact, a few games even take a performance hit in some circumstances.

Better? :)
 

Skinpop

Member
Something I'd like clarified is whether Xbone devs actually write their graphics code in DirectX. Wouldn't MS offer to-the-metal level control with an internal API/extension, and thus make any ease-of-porting claims irrelevant? It just doesn't make sense to me that the Xbone would run on regular DX11 when one of the advantages of consoles is that you get to fully take advantage of the hardware.

By extension, this would mean that DX12 realistically won't do anything for the Xbone. It's just marketing, boys.

Can't GNM and GNMX interfere with games made for DX12?

Usually the GPU communication sits on its own software layer, so they just need to port that code. Most GPU APIs are largely the same, but offer varying degrees of granularity.
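To make that concrete, the layering conceptually looks like this toy C++ sketch (my own illustration, not any shipping engine's actual API). Porting means swapping the backend, not rewriting the game code:

```cpp
#include <cstdint>
#include <cstdio>
#include <memory>

// Toy rendering-backend interface: game/engine code above this layer
// never touches D3D12, GNM, or Vulkan directly.
struct IRenderBackend {
    virtual ~IRenderBackend() = default;
    virtual void begin_frame() = 0;
    virtual void draw(uint32_t mesh_id) = 0;
    virtual void end_frame() = 0;
};

// Stand-in backend; a real one would map these calls onto D3D12 command
// lists or GNM command buffers with whatever granularity the API offers.
struct NullBackend : IRenderBackend {
    void begin_frame() override { std::puts("begin frame"); }
    void draw(uint32_t mesh_id) override { std::printf("draw %u\n", mesh_id); }
    void end_frame() override { std::puts("end frame"); }
};

int main() {
    std::unique_ptr<IRenderBackend> gpu = std::make_unique<NullBackend>();
    gpu->begin_frame();
    gpu->draw(42);
    gpu->end_frame();
}
```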
 

Seanspeed

Banned
You're right, that was inaccurate. Allow me to correct myself:

i7 CPUs don't see a benefit in almost any game. In fact, a few games even take a performance hit in some circumstances.

Better? :)
Worse, actually. Far more games will benefit from it than be hurt by it, yet you're explaining things as if it were the opposite.
 

Ziffles

Member
Worse, actually. Far more games will benefit from it than be hurt by it, yet you're explaining things as if it were the opposite.

Please enlighten me then, because everything I've seen has shown them to be mostly equivalent performance wise.

oh, and something that isn't Civ 5
 

gossi

Member
Xbox One development is via DirectX and the Xbox SDK. You can't directly access hardware now. You're even limited to your Hyper-V VM container.

That said, DirectX 12's goal is to unlock the lower-level calls (via API calls). Xbox One has a 1.75GHz 8-core CPU, 6 cores usable (+1 partially usable), so everything needs to be as multicore as possible. In this case, Xbox is driving the market, with *significant* gains for PC gamers.
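To illustrate the multicore point, here's a bare-bones sketch against the public D3D12 API (fences, pipeline state, and error handling omitted, so very much illustrative): each core records its own command list, and the queue swallows them in one batch.

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Record one command list per worker thread, then submit them all in a
// single batch. D3D11 serialized this through one immediate context;
// D3D12 lets every core record in parallel.
void record_in_parallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                        unsigned num_threads)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(num_threads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(num_threads);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < num_threads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // ...record this thread's share of the draw calls here...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```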

PS: XB1 gfx chip is AMD.
 