
Johan Andersson's Keynote on MANTLE - Nov 13, 4:15PM EST

Perkel

Banned
That's definitely for non-gaming things, right? And wouldn't the scaling be pretty bad, or have they got a nice solution to that problem?


MANTLE was created for such things. With Mantle they essentially fix a ton of CF/SLI problems, and they can use multi-GPU properly now.

It opens up a few new directions in which multi-GPU can go, especially for VR.

For example, one GPU per eye.
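
Very roughly, the idea looks something like this. Just a sketch with made-up names (the actual Mantle API isn't public), showing how an explicit multi-GPU model lets the engine decide the split instead of relying on the driver's CF/SLI alternate-frame heuristics:

    #include <array>
    #include <thread>

    // Hypothetical stand-ins for whatever an explicit, thin API would expose.
    struct Gpu {};                   // one handle per physical adapter
    struct Frame { int eye; };       // the rendered image for one eye

    // Placeholder: real code would record and submit command buffers here.
    Frame renderEye(Gpu&, int eyeIndex) { return Frame{eyeIndex}; }

    // Placeholder: real code would composite/present both eye images.
    void presentStereo(const Frame&, const Frame&) {}

    int main()
    {
        std::array<Gpu, 2> gpus;     // the app addresses each GPU explicitly
        Frame left{}, right{};

        // One GPU per eye, rendered in parallel. The split is the engine's
        // decision, not a driver heuristic, which is what avoids the usual
        // CF/SLI latency and scaling headaches for VR.
        std::thread l([&] { left  = renderEye(gpus[0], 0); });
        std::thread r([&] { right = renderEye(gpus[1], 1); });
        l.join();
        r.join();

        presentStereo(left, right);
    }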

He also mentioned that the implementation was surprisingly cheap: only 2 months of work. The first half was harder, getting the engine running; in the second half it was much easier to optimize everything.

To be fair, he is not really the right person to gauge implementation time.


It would be better to hear from another party, one that wasn't involved with Mantle from the start, how much time it took them to port their engine.
 

belmonkey

Member
MANTLE was created for such things. With Mantle they essentially fix a ton of CF/SLI problems, and they can use multi-GPU properly now.

It opens up a few new directions in which multi-GPU can go, especially for VR.

For example, one GPU per eye.

Would be amazing to get nearly full scaling on 3 or more GPUs, and it could give some real incentive to go beyond single GPU setups, although I wouldn't look forward to the power draw :/
 
Nvidia getting on board with Mantle is a pretty big reveal. Usually you see tech competitors trying to hoard stuff to get an edge on the competition, so AMD not doing that shows how confident they are that this is must-have tech, plus it gets them positive goodwill from the gaming community.
 

joesiv

Member
Nvidia is not getting on board just yet. The only thing that is confirmed is that the API supports multiple vendors, if those vendors want to adopt it.

Not sure how it would benefit Nvidia to jump on board. Clearly Mantle is designed with GCN in mind, so any other vendor would have to run in a crippled mode. There may be a tangible benefit to gamers in this crippled mode; however, in the long run, if Mantle were to stick around and become the norm, Nvidia/Intel would get creamed in all benchmarks even with similar hardware.
 

DieH@rd

Banned
Not sure how it would benefit Nvidia to jump on board. Clearly Mantle is designed with GCN in mind, so any other vendor would have to run in a crippled mode. There may be a tangible benefit to gamers in this crippled mode; however, in the long run, if Mantle were to stick around and become the norm, Nvidia/Intel would get creamed in all benchmarks even with similar hardware.

Mantle is designed to talk to a "thin AMD driver". Johan was quite specific when he spoke about cross-vendor support. He definitely wants Nvidia to adopt Mantle, optimize it, and make their own thin driver.
 

ekim

Member
Mantle is designed to talk to a "thin AMD driver". Johan was quite specific when he spoke about cross-vendor support. He definitely wants Nvidia to adopt Mantle, optimize it, and make their own thin driver.

Which is the only way to make Mantle a success, IMHO.
 

Mr Swine

Banned
Mantle is designed to talk to a "thin AMD driver". Johan was quite specific when he spoke about cross-vendor support. He definitely wants Nvidia to adopt Mantle, optimize it, and make their own thin driver.

That would probably be the best and "cheapest" way for Nvidia to go, without screwing half of the PC user base with their own thing.
 

Zarx

Member
Not sure how it would benefit Nvidia to jump on board. Clearly Mantle is designed with GCN in mind, so any other vendor would have to run in a crippled mode. There may be a tangible benefit to gamers in this crippled mode; however, in the long run, if Mantle were to stick around and become the norm, Nvidia/Intel would get creamed in all benchmarks even with similar hardware.

Well, if all the games that support Mantle (and there are already quite a few, including all future major EA games) suddenly run significantly faster on cards with Mantle support, you'd better believe they will bend over backwards to add support, even if Nvidia's implementation only received a smaller performance boost. And if all major vendors support it, it will be good for the PC market in general, which is good for Nvidia.
 

joesiv

Member
Well, if all the games that support Mantle (and there are already quite a few, including all future major EA games) suddenly run significantly faster on cards with Mantle support, you'd better believe they will bend over backwards to add support, even if Nvidia's implementation only received a smaller performance boost. And if all major vendors support it, it will be good for the PC market in general, which is good for Nvidia.
Is it, though? Wouldn't it be better for Nvidia (not the gamer) to release their own version of Mantle, causing API fragmentation, which would make Mantle's work in the space fall apart (no clear winner, thus everyone defaulting back to DX/OGL)? Having "overhead" allows Nvidia (and AMD) to continue to sell premium models that try to brute-force through the inefficiencies inherent in the heavy APIs. After all, why does it make sense for Nvidia to make older, cheaper hardware run better, rather than forcing buyers to buy new $600+ cards (or multiples of them) to get that extra 20%?

Nvidia hasn't really shown that they're into the gamer, but more into locking gamers into their ecosystem to crush competition in the marketplace. Sad, really.

Mantle is designed to talk to a "thin AMD driver". Johan was quite specific when he spoke about cross-vendor support. He definitely wants Nvidia to adopt Mantle, optimize it, and make their own thin driver.
Perhaps, but Mantle was also designed with GCN in mind, so even if Nvidia could make their own thin driver layer, all the Mantle pieces may not fit together with how their architecture works. If it were easy, and worth it, why doesn't ATI also support older hardware that isn't GCN, if only in a deprecated/lesser mode in their driver? As far as I can see, the GCN architecture was designed with Mantle in mind and is critical to how it functions.



The way I see it, it's kind of like chess: AMD finally has Nvidia in check, and now it's Nvidia's move. I think it's in Nvidia's best interest not to join Mantle and to find a way to make Mantle fail, because I don't think Nvidia has the kind of long-term vision in their architecture to support such a venture (to support something like this, your core architecture can't change too much over time, or it becomes unmanageable). Nvidia's newest card + G-Sync + market share + general opinion on drivers is sort of like Nvidia's reciprocating check. If they can kill Mantle, it'd be checkmate.

It's kind of a pessimistic way of looking at Nvidia, and it's very anti-gamer, but from what I've seen, that's how Nvidia feels about Mantle.
 

Mr Swine

Banned
Is it, though? Wouldn't it be better for Nvidia (not the gamer) to release their own version of Mantle, causing API fragmentation, which would make Mantle's work in the space fall apart (no clear winner, thus everyone defaulting back to DX/OGL)? Having "overhead" allows Nvidia (and AMD) to continue to sell premium models that try to brute-force through the inefficiencies inherent in the heavy APIs. After all, why does it make sense for Nvidia to make older, cheaper hardware run better, rather than forcing buyers to buy new $600+ cards (or multiples of them) to get that extra 20%?

Nvidia hasn't really shown that they're into the gamer, but more into locking gamers into their ecosystem to crush competition in the marketplace. Sad, really.


Perhaps, but Mantle was also designed with GCN in mind, so even if Nvidia could make their own thin driver layer, all the Mantle pieces may not fit together with how their architecture works. If it were easy, and worth it, why doesn't ATI also support older hardware that isn't GCN, if only in a deprecated/lesser mode in their driver?

Isn't it a bit late for Nvidia to do their own Mantle? If it took AMD 2 years, then they probably have a much bigger head start over Nvidia.
 

joesiv

Member
Isn't it a bit late for Nvidia to do their own Mantle? If it took AMD 2 years, then they probably have a much bigger head start over Nvidia.

But how long would it take for them to build up a thin driver to support their current/legacy/future cards? A year? Less? More?

If they wanted, they could work hard, build up a draft, and soft-launch it to the public in the short term, thus deflating the whole Mantle discussion; gamers tied to Nvidia would hold out for Nvidia's version with the promise that all the "Nvidia: The Way It's Meant to Be Played" games will get support (a lot of games, btw). They could "target" another year for the release of their "dismantle" thin API, and then conveniently miss their ship date. Over-promise, under-deliver, seems right in line ;)
 
Is it, though? Wouldn't it be better for Nvidia (not the gamer) to release their own version of Mantle, causing API fragmentation, which would make Mantle's work in the space fall apart (no clear winner, thus everyone defaulting back to DX/OGL)? Having "overhead" allows Nvidia (and AMD) to continue to sell premium models that try to brute-force through the inefficiencies inherent in the heavy APIs. After all, why does it make sense for Nvidia to make older, cheaper hardware run better, rather than forcing buyers to buy new $600+ cards (or multiples of them) to get that extra 20%?

Nvidia hasn't really shown that they're into the gamer, but more into locking gamers into their ecosystem to crush competition in the marketplace. Sad, really.
Not if Mantle's performance boost is noticeable enough to make consumers switch to AMD en masse. You've got the studios and publishers of tons of AAA games behind Mantle, thanks to its performance enhancements and its integration into the next-gen consoles. If Nvidia wanted to cause API fragmentation, they should've gotten their graphics cards into one of the consoles. But since they didn't, they have to stay competitive with AMD, even if that means adopting their API and tweaking it to work on Nvidia cards. There are just too many people adopting Mantle for Nvidia to say "fuck it" and do their own thing, so the best thing Nvidia can do for its bottom line is adopt Mantle and figure out a way to get it to work with their new features like G-Sync.
 

Zarx

Member
Is it, though? Wouldn't it be better for Nvidia (not the gamer) to release their own version of Mantle, causing API fragmentation, which would make Mantle's work in the space fall apart (no clear winner, thus everyone defaulting back to DX/OGL)? Having "overhead" allows Nvidia (and AMD) to continue to sell premium models that try to brute-force through the inefficiencies inherent in the heavy APIs. After all, why does it make sense for Nvidia to make older, cheaper hardware run better, rather than forcing buyers to buy new $600+ cards (or multiples of them) to get that extra 20%?

Nvidia hasn't really shown that they're into the gamer, but more into locking gamers into their ecosystem to crush competition in the marketplace. Sad, really.


Perhaps, but Mantle was also designed with GCN in mind, so even if Nvidia could make their own thin driver layer, all the Mantle pieces may not fit together with how their architecture works. If it were easy, and worth it, why doesn't ATI also support older hardware that isn't GCN, if only in a deprecated/lesser mode in their driver? As far as I can see, the GCN architecture was designed with Mantle in mind and is critical to how it functions.



The way I see it, it's kind of like chess: AMD finally has Nvidia in check, and now it's Nvidia's move. I think it's in Nvidia's best interest not to join Mantle and to find a way to make Mantle fail, because I don't think Nvidia has the kind of long-term vision in their architecture to support such a venture (to support something like this, your core architecture can't change too much over time, or it becomes unmanageable). Nvidia's newest card + G-Sync + market share + general opinion on drivers is sort of like Nvidia's reciprocating check. If they can kill Mantle, it'd be checkmate.

It's kind of a pessimistic way of looking at Nvidia, and it's very anti-gamer, but from what I've seen, that's how Nvidia feels about Mantle.

Mantle offers a cross-platform API that promises superior performance to existing APIs. Considering Valve's current Linux/SteamOS push (which Nvidia is involved with), Mantle offers an avenue for developers to support both Windows and Linux/SteamOS with the same high-performance render path. Additionally, it will allow high-end setups like multi-GPU to perform much better and more reliably, which is a good thing for Nvidia (sell more high-end cards). I wouldn't expect Nvidia to add support for their existing GPUs either (hell, they don't even support all the DX11.2 features), so they could use it as a selling point for Maxwell. Nvidia does like their proprietary APIs for sure, but that is all the more reason to deprive AMD of an exclusive Mantle. Killing Mantle doesn't really do Nvidia any good at all if they could have it too. They could launch their own competing API, but they would be at a severe disadvantage, because one of the primary driving forces of Mantle is allowing developers to leverage their console optimizations on PC, and Nvidia doesn't have that link. Also, they would be starting from behind, with the list of Mantle-supported games growing steadily: all Frostbite 3 games, Star Citizen, Thief (and likely the next Deus Ex and Tomb Raider games along with it), and that new 64-bit strategy engine Nitrous (which will be powering Stardock's future titles, among others) have already pledged support.
 

backstep

Neo Member
I think the major obstacle to Nvidia getting on board with Mantle is that it's an AMD initiative. It's more likely that they'd provide a similar driver implementation once it becomes part of an independent API like DX or OGL. I don't believe Nvidia will try to kill it with a competing proprietary API; they'll just publicly ignore it (and privately work toward it) until it or something similar joins a standard.

It could well play out like shader languages did, with a proprietary standard like Cg leading to API versions in HLSL and GLSL, or the way compute started with CUDA and led to other API implementations in DirectCompute and OpenCL.

I don't think there's any technical obstacle preventing Nvidia from developing a similar thin driver that handles the same API commands (in most cases). If you watched repi's really excellent presentation, nothing was super specific to the GCN architecture, or even to AMD cards; it was more about handling memory resources and submitting commands in a way that's closer to how modern GPUs work.

If you think about the current DX/OGL APIs, they present the rendering pipeline based on an SGI model from the 80s (I think). It's been extended a lot since then, of course. Still, it requires the graphics driver to do a lot of translation and management to present that view of resources and commands to a modern GPU.

The main thrust of Mantle seems to be more about avoiding all that work in the driver by directly managing the GPU through an API that exposes the way GPUs really work now, rather than obfuscating it with an outdated model.
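
As a loose illustration of the difference (purely invented names, not Mantle's real entry points): instead of the driver re-validating and translating state on every draw call, the application places resources and records command buffers itself, and a thin driver mostly just passes them through:

    #include <cstdint>
    #include <vector>

    // Hypothetical stand-ins for what a thin, explicit API might expose.
    struct GpuMemory     { std::uint64_t offset, size; };      // app-managed heap region
    struct DrawPacket    { std::uint32_t vertexCount; GpuMemory vertices; };
    struct CommandBuffer { std::vector<DrawPacket> packets; }; // built entirely by the app

    // The application decides where resources live and what commands are
    // recorded; nothing here is hidden behind driver-side state tracking.
    CommandBuffer recordScene(const std::vector<DrawPacket>& scene)
    {
        CommandBuffer cb;
        cb.packets = scene;   // in reality: binds, barriers, draws, all explicit
        return cb;
    }

    // A thin driver would hand the packets to the GPU almost verbatim,
    // instead of rebuilding hardware state from a DX/OGL-style abstraction.
    void submit(const CommandBuffer& cb) { (void)cb; }

    int main()
    {
        std::vector<DrawPacket> scene = { {36, {0, 4096}}, {1024, {4096, 65536}} };
        CommandBuffer cb = recordScene(scene);  // record once...
        for (int frame = 0; frame < 3; ++frame)
            submit(cb);                         // ...replay cheaply every frame
    }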

There are still architecture-specific features that you'd want to expose, and I think one of the slides actually said there are API extensions available similar to OpenGL for that. That in itself implies a certain commonality to the base API though, if extensions are provided for specific GPU features. It might get a bit weirder with cross-vendor shader support, since they use different instruction sets to implement the same capabilities, but I would expect Mantle still uses some form of shader language and doesn't expect you to write them all in assembly.

It looks like repi's been pushing for the vendors and API holders to address this for a good few years, and it's just that AMD are the first to step up. You can see why they would, too. It's true that a newer API is going to improve GPU utilisation with things like resource aliasing, bindless resources, GPU-generated command buffers, etc., but it also addresses AMD's biggest weakness in the CPU arena. Current drivers have to submit commands to the GPU from a single CPU thread, with all the associated overhead, and single-threaded performance is where AMD has been behind for years now. So allowing for multithreaded and low-overhead command submission basically nullifies Intel's lead in gaming/visualisation CPU performance.
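
A rough sketch of what that looks like on the CPU side (again, invented names rather than Mantle's real API): each worker thread records its own command buffer, and only the final, cheap submission touches the GPU queue, so the per-draw CPU cost spreads across cores instead of piling onto one thread:

    #include <thread>
    #include <vector>

    // Hypothetical command buffer; real contents would be draw/state packets.
    struct CommandBuffer { std::vector<int> draws; };

    // Each worker records its slice of the scene independently -- no global
    // driver lock and no single "immediate context" thread as in D3D11/GL.
    CommandBuffer recordSlice(int firstDraw, int lastDraw)
    {
        CommandBuffer cb;
        for (int d = firstDraw; d < lastDraw; ++d)
            cb.draws.push_back(d);   // stands in for recording a real draw call
        return cb;
    }

    // Placeholder for one low-overhead queue submission of all buffers.
    void submitAll(const std::vector<CommandBuffer>& buffers) { (void)buffers; }

    int main()
    {
        const int drawCount = 10000, workers = 4, perWorker = drawCount / workers;
        std::vector<CommandBuffer> buffers(workers);
        std::vector<std::thread> threads;

        for (int w = 0; w < workers; ++w)
            threads.emplace_back([&, w] {
                buffers[w] = recordSlice(w * perWorker, (w + 1) * perWorker);
            });
        for (auto& t : threads)
            t.join();

        submitAll(buffers);  // a single cheap submit instead of thousands of driver calls
    }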

For nvidia it would only improve their GPU performance, but for AMD it's a double win.

I expect it's going to be a win for gamers too, in the long run. Whichever vendor you prefer, Mantle is addressing the shortcomings in PC development that vendors/APIs have avoided for a while now, and it's to our benefit that PC gaming progresses beyond them.
 

Rafterman

Banned
Not if Mantle's performance boost is noticeable enough to make consumers switch to AMD en masse. You've got the studios and publishers of tons of AAA games behind Mantle, thanks to its performance enhancements and its integration into the next-gen consoles. If Nvidia wanted to cause API fragmentation, they should've gotten their graphics cards into one of the consoles. But since they didn't, they have to stay competitive with AMD, even if that means adopting their API and tweaking it to work on Nvidia cards. There are just too many people adopting Mantle for Nvidia to say "fuck it" and do their own thing, so the best thing Nvidia can do for its bottom line is adopt Mantle and figure out a way to get it to work with their new features like G-Sync.


Where are you getting this from? There aren't "tons", nor are there currently "too many" to ignore. Currently there are a handful of studios and publishers on board with Mantle, and at this point no one knows what kind of performance enhancements it will bring, because AMD hasn't shown any performance results, nor have their current partners.
 

B_Boss

Member
Sheesh, 12 years? When did he start... at the age of 10? He looks young as hell. What an amazingly talented guy lol.
 
Where are you getting this from? There aren't "tons", nor are there currently "too many" to ignore. Currently there are a handful of studios and publishers on board with Mantle, and at this point no one knows what kind of performance enhancements it will bring, because AMD hasn't shown any performance results, nor have their current partners.

I agree; Nvidia-sponsored games are about as numerous as AMD-sponsored ones.

To combat AMD, the most likely move Nvidia will make is to have AMD users start missing more and more effects in Nvidia titles, not to join Mantle themselves!

"You can get an AMD card for 10% better performance or... you can get our card for exclusive AA/AO/DoF/Physics/etc"
 

-SD-

Banned
Sheesh, 12 years? When did he start... at the age of 10? He looks young as hell. What an amazingly talented guy lol.
I think I'm seeing some grey hair in Johan's latest photos, so I'd guess he's around 40. He's just one of those people who stay young-looking for most of their lives.
 

B_Boss

Member
I think I'm seeing some grey hair in Johan's latest photos, so I'd guess he's around 40. He's just one of those people who stay young-looking for most of their lives.

You never know. I started growing grey hairs about the age of 12-13 lol....true story.

While I'm at it, does anyone have any info (transcript, etc) on the PS4 Processor from the conference?
 
I think I'm seeing some grey hair in Johan's latest photos, so I'd guess he's around 40. He's just one of those people who stay young-looking for most of their lives.

The half-life of an engineer is short, so he could be in his late 20s for all we know :p
 