
DirectX 12 GPU exclusive features to be shown/announced at GDC

SapientWolf

Trucker Sexologist
While I have no idea where GPUlover is getting their ‘information’ from, except for cherry-picking sources from around the net, it would be much more interesting (and constructive) to point out their errors with sound reasoning and conversation.
Cue the 'oh, it's too late for that' replies with little to no substance. :)

While much of the actual topic is over my head, I find technology threads interesting reads and always want to learn more. When the majority of the replies are berating another poster (even if they seem to be ‘riding the dragon’) the thread becomes almost embarrassing to read. I mean, sarcasm is great and all, but it does get old very quickly ;) .

Anyway, as I said, if someone could steer the conversation back on topic for those of us who would like to learn rather than just post ‘lol look at this muppet’, that would be great!
It's not even worth the effort. Every correction is met with another post that's even more nonsensical than the last. There isn't much to post here besides speculation until the actual presentation so I guess the sideshow helps to pass the time.
 

leeh

Member
Is XB1 so gimped with DX11 that it can only muster 1.3TF?
Will DX12 be the key to getting to 2.3 - 2.6TF?
Any API improvements would just allow the console to run closer to its theoretical maximum more of the time, in certain scenarios; they don't suddenly boost the theoretical maximum.

If there are any hardware differences in the GPU that DX12 could exploit to boost the theoretical maximum, then we could see a boost. Can't see this happening though, especially to that degree.
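To put that in concrete numbers, here's a minimal back-of-the-envelope sketch (the 60%/75% utilisation figures are made-up illustrations, not measurements). The GCN peak is fixed by CU count and clock; a lower-overhead API can only change how much of that fixed peak real workloads actually reach:

```cpp
#include <cstdio>

// Peak single-precision throughput of a GCN GPU:
// CUs * 64 ALU lanes per CU * 2 FLOPs per lane per clock (FMA) * clock in GHz
double peak_gflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz;
}

int main() {
    const double xb1_peak = peak_gflops(12, 0.853); // ~1310 GFLOPS, i.e. the "1.3TF"
    std::printf("XB1 theoretical peak: %.0f GFLOPS\n", xb1_peak);

    // An API with less overhead raises average utilisation of that fixed ceiling
    // (illustrative numbers only); it never raises the ceiling itself.
    std::printf("at 60%% utilisation: %.0f GFLOPS\n", 0.60 * xb1_peak);
    std::printf("at 75%% utilisation: %.0f GFLOPS\n", 0.75 * xb1_peak);
    return 0;
}
```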

Out of all the members on this forum, we must have some people who like hard drugs.
 

GPUlover

Banned
So is the hidden GPU on the XB1 an Nvidia chip?

I think the devs are wetting themselves with excitement.

NO! We don't know at this time. Let's take a look at the past to see where we can go from here. Fans overlook DETAILS like this. (Remember, I only use facts.)

Some fans said that the XB1 GPU was weak and the same, but if that was true, then why is the XB1 architect saying this when talking about PS4's GPGPU?
" We've actually taken a very different tack on that. The experiments we did showed that we had headroom on CUs as well. In terms of balance, we did index more in terms of CUs than needed so we have CU overhead. There is room for our titles to grow over time in terms of CU utilisation."
There are two things in this statement that were overlooked:
1. Headroom on CUs.
How is that? This was before the 10% given back from Kinect.
2. There is room for our titles to grow over time in terms of CU utilisation.
Grow from where?
Phil Spencer said the hardware will not change.
Maybe a great API will use that hardware?

"Microsoft's approach to asynchronous GPU compute is somewhat different to Sony's - something we'll track back on at a later date."
How is this when it's the same as PS4's GPU?
Right.

"Exemplar ironically doesn't need much ALU. It's much more about the latency you have in terms of memory fetch, so this is kind of a natural evolution for us," he says. "It's like, OK, it's the memory system which is more important for some particular GPGPU workloads."
This is a good one. Ubisoft's GPGPU slides show PS4 with a sky-high workload of 1600 vs. XXX, and PS4 fans jump up in joy. What system has the lowest latency? It's ESRAM by a long shot, and what did Ubisoft not put in that little slide show? That's right: ESRAM.
[Image: "Inside Xbox One" slide by Martin Fuller (inside-xbox-one-by-martin-fuller-8-1024.jpg)]

It makes everything better!
Maybe with the right API?
$hit, look, it's right there. Name a GPU in the world that can read 1024 bits + 256 bits at one time with 8 memory controllers?
Here, this will help.
PS4 is only 64 bits on 4 lines with 4 memory controllers.
R9 290X: 512 bits
AMD's next PC GPU is 4096 bits (not out as of yet)
At last count it was 0.
Where is all this data going to? Must be that 1.3TF GPU from AMD.
 
This shit makes me laugh. Why would Microsoft "hide" performance from developers and consumers, considering their second place in the market and low sales, just so they can go 'taadaaaaaaa, now we've unlocked twice the power'?

All DX12 will do is allow developers to max out more of the 1.3TF GPU than they could with DX11 overheads.

OpenGL 4.5 was created in response to Mantle and DX12; there is no reason why Sony can't adapt their PSSL, which is OpenGL 4.4 based.
 

MacBosse

Member
Jesus, wading through this thread in search of actual insight and information turns into an impossible proposition for the uninitiated (=me).

When is GDC btw?
 

JoJo UK

Unconfirmed Member
Jesus, wading through this thread in search of actual insight and information turns into an impossible proposition for the uninitiated (=me).

When is GDC btw?
Starts next week (I think); the MS show is next Wednesday, 7PM GMT IIRC.
 

leeh

Member
Interesting article here on Toms Hardware:
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

Basically, DX12 is rumoured to allow different cards to work together, creating SFR (Split Frame Rendering) support between AMD and Nvidia cards. Similar to Mantle, but without it being limited to AMD cards. Very exciting! It could seriously improve multi-GPU configurations.

In terms of XB1, I suppose it could allow the SHAPE block, with its programmable DSPs and 15 co-processors, to potentially be used for graphics processing, as suggested on Beyond3D.
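If that rumour pans out, the plumbing would presumably be DX12's explicit multi-adapter model, where the application enumerates every DXGI adapter itself and creates a device on each one, regardless of vendor. A rough sketch of just that enumeration step (error handling omitted; this is a guess at how such a setup would start, not anything Microsoft has shown):

```cpp
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

// Create a D3D12 device on every hardware adapter in the system, regardless
// of vendor. With explicit multi-adapter, splitting the frame between them
// (SFR) is then the application's job, not the driver's.
std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            std::wprintf(L"Created device on: %s\n", desc.Description);
            devices.push_back(device);
        }
    }
    return devices;
}
```

From there, rendering the top half of the frame on one device and the bottom half on the other, then combining them, is all application code, which is exactly why mixed AMD/Nvidia setups become thinkable.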
 

Locuza

Member
OpenGL 4.5 was created in response to Mantle and DX12; there is no reason why Sony can't adapt their PSSL, which is OpenGL 4.0 based.
GL 4.5 is not a real response to Mantle and DX12.
glNext will be.

And you really must define "based on" when claiming PSSL is based on GLSL 4.0.
 

GPUlover

Banned
It's not even worth the effort. Every correction is met with another post that's even more nonsensical than the last. There isn't much to post here besides speculation until the actual presentation so I guess the sideshow helps to pass the time.

Look, more BS. What is nonsensical?

Nonsensical, right, OK....
Are you the one that said you can't put textures in ESRAM? Wrong!
Are you the one that said that consoles don't have a 2x? Wrong!
Are you the one that said XB1 has the same GPU as PS4? Wrong!
Are you the one saying that XB1 is using a to-the-metal API already? Wrong!
Are you the one saying it's just PR? Wrong!
Are you the one that said there are not any NDAs on XB1 hardware? Wrong!
Are you the one that said that PS4's memory system was better than XB1's? Wrong!
Are you the one that said that DX12 will not help XB1? Wrong!

Get a mirror; you are looking at nonsensical.
Where are all the fans' links that are saying I'M wrong?
 
GL 4.5 is not a real response to Mantle and DX12.
glNext will be.

And you really must define "based on" when claiming PSSL is based on GLSL 4.0.

"based on" as in the PS4 keynote at GDC 2013 said it is based on 4.4..................



Get a mirror; you are looking at nonsensical.
Where are all the fanboy links that are saying I'M wrong?
That's just it; it's never been true, just fanboys kissing A$$.
Just because someone disagrees with you doesn't mean they are fanboys. Get a grip.
 

GPUlover

Banned
This shit makes me laugh. Why would Microsoft "hide" performance from developers and consumers, considering their second place in the market and low sales, just so they can go 'taadaaaaaaa, now we've unlocked twice the power'?

All DX12 will do is allow developers to max out more of the 1.3TF GPU than they could with DX11 overheads.

OpenGL 4.5 was created in response to Mantle and DX12; there is no reason why Sony can't adapt their PSSL, which is OpenGL 4.4 based.

Let me say this: outside of the walls of MS/AMD, we may never know.

As I have already said, hardware is always ahead of software. DX12 is still not 100% done.
DX11 is not efficient for XB1's hardware....
As said here by Turn 10:
"What we have is newer to DX12 and not even on XB1: resource tables. And what that lets us do is create a complex set of resources, like I said, all the constants and textures we need for a particular bundle, and we can set that entire set with a single command. So a single command to set the full resource set and a single command to set the bundle gives us the maximum efficiency."
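For what it's worth, the "resource tables" and "bundles" in that quote line up with D3D12's descriptor tables and bundle command lists. Here's a rough sketch of the pattern under that assumption (function names and draw parameters are mine, purely illustrative, not from the Turn 10 talk):

```cpp
#include <d3d12.h>

// Record once: a bundle that binds the whole "resource set" (constants,
// textures, ...) through a single descriptor table and issues the draw.
void RecordBundle(ID3D12GraphicsCommandList* bundle,
                  ID3D12RootSignature* rootSig,
                  ID3D12DescriptorHeap* heap,
                  D3D12_GPU_DESCRIPTOR_HANDLE resourceTable,
                  const D3D12_VERTEX_BUFFER_VIEW& vbv)
{
    ID3D12DescriptorHeap* heaps[] = { heap };
    bundle->SetDescriptorHeaps(1, heaps);
    bundle->SetGraphicsRootSignature(rootSig);
    bundle->SetGraphicsRootDescriptorTable(0, resourceTable); // one call binds the whole set
    bundle->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    bundle->IASetVertexBuffers(0, 1, &vbv);
    bundle->DrawInstanced(36, 1, 0, 0);
    bundle->Close();
}

// Replay every frame: the "maximum efficiency" path is then just two calls on
// the direct command list (the heaps must match the ones the bundle used).
void DrawWithBundle(ID3D12GraphicsCommandList* directList,
                    ID3D12DescriptorHeap* heap,
                    ID3D12GraphicsCommandList* bundle)
{
    ID3D12DescriptorHeap* heaps[] = { heap };
    directList->SetDescriptorHeaps(1, heaps);
    directList->ExecuteBundle(bundle);
}
```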
 
I'll ask like I did in a previous thread:

Has anyone actually PMed a mod? They can't be everywhere. So, if you've had enough, don't just complain, do something about it.
 

Bobnob

Member
GPUlover, I'm not sure who you're trying to convince, others or yourself. I guess we will find out the true facts soon enough.
 

Angel_DvA

Member
This shit makes me laugh. Why would Microsoft "hide" performance from developers and consumers, considering their second place in the market and low sales, just so they can go 'taadaaaaaaa, now we've unlocked twice the power'?

All DX12 will do is allow developers to max out more of the 1.3TF GPU than they could with DX11 overheads.

OpenGL 4.5 was created in response to Mantle and DX12; there is no reason why Sony can't adapt their PSSL, which is OpenGL 4.4 based.

Please, stop using logic.
 

Shabad

Member
Look, more BS. What is nonsensical?

Nonsensical, right, OK....
Are you the one that said you can't put textures in ESRAM? Wrong!
Are you the one that said that consoles don't have a 2x? Wrong!
Are you the one that said XB1 has the same GPU as PS4? Wrong!
Are you the one saying that XB1 is using a to-the-metal API already? Wrong!
Are you the one saying it's just PR? Wrong!
Are you the one that said there are not any NDAs on XB1 hardware? Wrong!
Are you the one that said that PS4's memory system was better than XB1's? Wrong!
Are you the one that said that DX12 will not help XB1? Wrong!

Get a mirror; you are looking at nonsensical.
Where are all the fans' links that are saying I'M wrong?

  • Who said otherwise? What does that have to do with DX12?
  • What's a "2x"?
  • Of course they don't have the same GPU; that's one of the key points of differentiation.
  • Most consoles have custom APIs; the Xbox One is certainly no different.
  • Wut?
  • The XB1 is out there in the wild; you can open it up and see what's inside. What can Microsoft really hide about it...? And more importantly, why would they?
  • Well it is, isn't it? Sony considered both options and chose the most expensive one. Why would they have gone this route if they thought it was worse and more expensive?
  • No one is saying that DX12 won't help XB1.

Edit: NOOOO!!! He is banned :-(
 

Kezen

Banned
No one saw that coming, no one. The Xbox One has no secret sauce of any sort, but that does not mean the console's hardware will not be more shrewdly used as time goes on.

DirectX 12's impact on the Xbox One will be 1/10th of what it will be on PC, especially on machines with capable CPUs.
 

Bliman

Banned
Let me first say that I don't know the ins and outs of the hardware or how DirectX 12 works exactly.
But I have a problem with how some are ridiculing GPUlover without giving any facts, or saying how DirectX 12 works, or how much benefit it might be to the Xbox One.
How come this is tolerated, but when he states his opinion it gets jumped on without an explanation?
Let's just say that if at GDC Microsoft explains how it benefits the Xbox One and it makes a big difference, would all these naysayers then apologize?
I don't doubt there are many with much knowledge on this forum. It's just this picking at someone that I hate, especially if you don't explain why these things work.
I would just like to hear some constructive things, not this easy picking.
Regards
Manuel
 

riflen

Member
Let me first say that I don't know the ins and outs of the hardware or how DirectX 12 works exactly.
But I have a problem with how some are ridiculing GPUlover without giving any facts, or saying how DirectX 12 works, or how much benefit it might be to the Xbox One.
How come this is tolerated, but when he states his opinion it gets jumped on without an explanation?
Let's just say that if at GDC Microsoft explains how it benefits the Xbox One and it makes a big difference, would all these naysayers then apologize?
I don't doubt there are many with much knowledge on this forum. It's just this picking at someone that I hate, especially if you don't explain why these things work.
I would just like to hear some constructive things, not this easy picking.
Regards
Manuel

I'm sorry, but if you can't see why he got the replies he did from simply reading the content of his posts, then I don't think any explanation will satisfy you. It's like arguing with moon-landing conspiracists.
 

Bliman

Banned
I don't agree. I don't know enough about the hardware or DirectX 12 to comment on it.
But things like "the second GPU is in the PSU" and "what hard drugs are you taking" don't make for much constructive conversation.
In my opinion it sets up these camps of those who expect miracles and those who say it doesn't do anything.
Why not give some reasoning as to why it doesn't work that way, so that people like me know what to expect (with all these experts)?
I think all this hardware is complicated enough not to shove it under such ridicule.
And if not, explain what it does and why DirectX 12 doesn't give the Xbox One (if it is prepared for DirectX 12) any more graphical fidelity, or free up more time to work on such things, or free up resources.
 
Guys, I am so happy the mods are what they are. Thank you, mods.

Back on topic, does anyone have any idea what other unannounced GPU features we could be seeing? Perhaps they will announce an open-ended design whereby DX12 gets more and more features each year or so? That could be interesting.
 

tuxfool

Banned
Guys, I am so happy the mods are what they are. Thank you, mods.

Back on topic, does anyone have any idea what other unannounced GPU features we could be seeing? Perhaps they will announce an open-ended design whereby DX12 gets more and more features each year or so? That could be interesting.

I assume you have read the AT article about DX11.3/12?

It would kind of make sense to extend the API. They had a primitive version of this in DX10 (supporting a DX9 codepath). DirectX 11 extended it to feature levels, but I hope that it doesn't descend into OpenGL extension insanity...
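And the feature-level mechanism carried over into D3D12 too: one runtime, and the application asks how much of the spec the GPU actually implements. A minimal sketch of that query (assumes a device has already been created; nothing exotic):

```cpp
#include <d3d12.h>

// Ask the driver for the highest feature level this GPU supports, out of the
// levels the application cares about. Same runtime, different hardware tiers.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    info.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &info, sizeof(info));
    return info.MaxSupportedFeatureLevel;
}
```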
 
Really stupid question, but could someone explain to me the physical limitation that stops a graphics card being updated to, say, the next version of DirectX or OpenGL? I always hear that card X or Y won't be able to support the newest version of whatever graphics library, but I've never fully understood it. Is the limitation that a card doesn't have a specific hardware component required to run the full version, or is it something that is controlled by firmware? For example, if a graphics card maker really wanted to, could they release a firmware update that flashes the card and updates it with the latest APIs?
 
Interesting article here on Toms Hardware:
http://www.tomshardware.com/news/microsoft-directx12-amd-nvidia,28606.html

Basically, DX12 is rumoured to allow different cards to work together, creating SFR (Split Frame Rendering) support between AMD and Nvidia cards. Similar to Mantle, but without it being limited to AMD cards. Very exciting! It could seriously improve multi-GPU configurations.

In terms of XB1, I suppose it could allow the SHAPE block, with its programmable DSPs and 15 co-processors, to potentially be used for graphics processing, as suggested on Beyond3D.

o_O

I wonder if it would be possible to blend AMD and Nvidia exclusive tech into a single game through a DX12 SLI config?

Or would it just default everything to the lowest common denominator for the sake of smooth syncing?
 

Locuza

Member
Really stupid question, but could someone explain to me the physical limitation that stops a graphics card being updated to, say, the next version of DirectX or OpenGL? I always hear that card X or Y won't be able to support the newest version of whatever graphics library, but I've never fully understood it. Is the limitation that a card doesn't have a specific hardware component required to run the full version, or is it something that is controlled by firmware? For example, if a graphics card maker really wanted to, could they release a firmware update that flashes the card and updates it with the latest APIs?
It's a good question.
Since we have programmable logic, nearly every effect is possible with a workaround.
But as you guessed, without the necessary hardware adjustments and the special abilities that come with them, the desired function will be really slow or not even possible.
So sometimes new hardware is needed for the next graphics library to be supported effectively.
But simple driver updates for a new API are also possible,
as with DX11.1/11.2/OpenGL and now DX12.
Many things, or even all of them, were supported just by a new driver.
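That split between "the driver can expose it" and "the silicon has to support it" is visible directly in the API as optional caps. A minimal sketch of querying a few of them (which feature lands in which bucket varies by GPU, so the comments are only illustrative):

```cpp
#include <d3d12.h>
#include <cstdio>

// Query a few optional D3D12 caps. A tier of 0 / NOT_SUPPORTED means the
// hardware simply can't do it, no matter how new the driver is.
void PrintOptionalCaps(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));

    // Features that generally need dedicated hardware:
    std::printf("Conservative rasterization tier: %d\n", (int)opts.ConservativeRasterizationTier);
    std::printf("Rasterizer-ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");
    std::printf("Tiled resources tier:            %d\n", (int)opts.TiledResourcesTier);

    // Largely a reflection of how flexible the existing binding hardware is:
    std::printf("Resource binding tier:           %d\n", (int)opts.ResourceBindingTier);
}
```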

I wonder if it would be possible to blend AMD and Nvidia exclusive tech into a single game through a DX12 SLI config?

Or would it just default everything to the lowest common denominator for the sake of smooth syncing?
I think a multi-GPU config with GeForce + Radeon will only be possible if the driver doesn't check for a certain ID.
Nvidia, for example, locks out Radeon GPUs: if the driver detects a Radeon as the primary GPU, an Nvidia GPU used as a coprocessor for PhysX doesn't work, except with modded drivers.

For rendering, I think cross-vendor multi-GPU is not the best idea.
But I would like a coprocessor relationship:
one card rendering the scene, the other doing compute work.
Something like that.
If both were to render a scene together, I fear negative effects due to the hardware differences.
For example, the AF filter is different, and the sample positions could differ when using MSAA or other anti-aliasing methods; maybe some things won't play as nicely as one would wish.
 

wachie

Member
I don't agree. I don't know enough about the hardware or DirectX 12 to comment on it.
But things like "the second GPU is in the PSU" and "what hard drugs are you taking" don't make for much constructive conversation.
In my opinion it sets up these camps of those who expect miracles and those who say it doesn't do anything.
Why not give some reasoning as to why it doesn't work that way, so that people like me know what to expect (with all these experts)?
I think all this hardware is complicated enough not to shove it under such ridicule.
And if not, explain what it does and why DirectX 12 doesn't give the Xbox One (if it is prepared for DirectX 12) any more graphical fidelity, or free up more time to work on such things, or free up resources.
You don't get it, do you? The guy was following one nonsense post with another, and any discussion with him led to even more ridiculous claims, which he kept presenting as "facts", besides calling out "the rest of NeoGAF". If I were a mod, I would have let him continue his tirade, as he was at least providing us with some entertainment until next week.

As for your last question, it's been covered quite well in the thread already. Console APIs are lower level than the traditional Direct3D API; if anything, the performance boost that DX12 brings will go to lower-specced PCs, especially on the CPU side.
 
Lulz, who banned the GPU guy, aka mistercteam? He is the guy that makes all those MS Paint graphics correlating random technologies and saying they are in the Xbox One, lol.
 

Putty

Member
Lulz, who banned the GPU guy, aka mistercteam? He is the guy that makes all those MS Paint graphics correlating random technologies and saying they are in the Xbox One, lol.

With all that super-duper tech knowledge, you'd think he'd get picked up by some big tech firm, but no, he works in a pharmacy. Nut jobs, all of that mob.
 

Skinpop

Member
Hardware support for a tiled shading stage would be sweet. Could make super sampling free in terms of memory usage.

Maybe fully programmable blending?
 

Raven77

Member
I'm coming into this thread a little late, but have there been any specific DX12-only graphical features revealed or hinted at so far? Any interesting innovations it is going to allow for once developers have time to take full advantage of it?
 