Is it though? Wouldn't it be better for Nvidia (not the gamer) to release their own version of Mantle, causing API fragmentation, which would make Mantle's work in the space fall apart (no clear winner, thus everyone defaulting back to DX/OGL)? Having "overhead" allows Nvidia (and AMD) to continue to sell premium models that try to brute-force through the inefficiencies inherent in the heavy APIs. After all, why does it make sense for Nvidia to make older, cheaper hardware run better, rather than forcing buyers to buy new $600+ cards (or multiples of them) to get that extra 20%?
Nvidia hasn't really shown that they're into the gamer, but more into locking gamers into their ecosystem to crush competition in the marketplace. Sad really.
Perhaps, but Mantle was also designed with GCN in mind, so even if Nvidia could make their own thin driver layer, all the Mantle pieces may not fit together with how their architecture works. If it were easy, and worth it, why doesn't ATI also support older hardware that isn't GCN, even in a deprecated/lesser mode in their driver? As far as I can see, the GCN architecture was designed with Mantle in mind, and is critical to how it functions.
The way I see it is, kind of like chess, AMD finally has Nvidia in check, and now it's Nvidia's move. I think it's in Nvidia's best interest not to join Mantle, and to find a way to make Mantle fail, because I don't think Nvidia has the long-term vision in their architecture to support such a venture (to support something like this, your core architecture can't change too much over time, or it becomes unmanageable). Nvidia's newest card + G-Sync + market share + general opinion on drivers is sort of Nvidia's reciprocating check. If they can kill Mantle, it'd be checkmate.
It's kind of a pessimistic way of looking at Nvidia, and it's very anti-gamer, but from what I've seen, that's how Nvidia feels about Mantle.