
Inside PlayStation 4 Pro: How Sony made the first 4K games console

dr_rus

Member
It's not part of the engine; it's part of the GPU.

The ROPs have their own clock.
The SPs have their own clock.

This is only an example... the GTX 280 runs its ROPs at 602MHz while the SPs run at 1296MHz... I don't know the clocks for the GTX 1080 because Nvidia only shares the SP clock, and on AMD each part of the GPU has its own clock (most of the time they use the same base clock).

And like I showed, not all parts of Polaris got upgraded, and those parts could be slower than the PS4's GPU if they run at lower clocks.

That is all guesswork too.

Everything in Polaris runs on the same clock; the only things running on a different clock are, I believe, the memory controllers (they run on the memory base clock).

Thing is, all this talk about how the same (more or less) h/w with a slight clock bump leads to compatibility issues shows that Sony doesn't really care about s/w compatibility with future platforms. Otherwise this stuff would be abstracted behind APIs enough to not make any difference.

This kinda reaffirms Leadbetter's impression that PS5 may be a "clean slate" again, losing all compatibility with PS4/Pro s/w.
 

RoboPlato

I'd be in the dick
I'm a bit scared about some games having a worse framerate on Pro (CoD IW, Mantis Burn Racing, Killing Floor 2...). I mean, pushing the resolution race too hard looks like it can hurt framerate, and nobody seems to care about it.
Mantis Burn has some minor drops, we haven't had an extended look at CoD, and KF2 doesn't run well on the standard PS4 either. Sony has mandated that the framerate can't be lower than on the standard PS4; I imagine Mantis Burn is just within a margin of error.
 
It is when your engine is built entirely that way and would take months or even a year to redesign, just for the purpose of *maybe* being compatible with a future console that doesn't exist in any form at the time of production.

Hyper Light Drifter just got a 30fps-to-60fps patch, and it took months to reprogram the game. What you're asking for is one hundred percent unreasonable.



Decades? Hell naw, that's not true at all for game development. What developer making a PS2 game back in the day was thinking, "hey, we should create some extra assets just in case this game gets an HD port someday, at a resolution we have no way of knowing will be standard, with no idea how the game will perform on completely exotic architecture"?

No one thinks this way. Only more recently do we have a general idea of where games are heading, tech-wise, and that's only for the immediate 4K future. Will 120fps monitors skyrocket in sales before then? Or will 8K be a thing in ten years? Or will it all be VR/AR-based? Who the hell knows, but I'm not wasting even a day trying to future-proof for something that no one will benefit from for years, if they ever do. Again, what you're asking for is madness, and no one is doing it, aside from working on consoles that are in production or to be released very soon.

Increasing the base framerate is indeed unreasonable to expect. But tons of games have less-than-great framerates, and a stronger console incapable of running those games flawlessly, as they were intended, is a huge missed opportunity.
 

ethomaz

Banned
Yeah, that's my line of reasoning too, and why, not being as tech literate as some of the guys here, I'm trying to understand what could go wrong with a frequency bump.

I understand the whole "logic tied to framerate" thing, but nobody's expecting a capped 30fps game to run at 60.
More something along the lines of a PS4 game that would sometimes drop to 25 because of a CPU or GPU bottleneck running at, or closer to, 30 on the Pro.
"Logic tied to framerate" isn't a good way to explain it, because most games today run their logic on a different schedule than the framerate... so they aren't tied.

The issue is that if your game logic runs faster in a way you didn't predict, it can generate bugs, glitches, and openings for somebody to abuse the game.

That is why it's safe and recommended to keep hardware performance as close as possible, if you don't want to quality-test and patch the eventual issues across your full library of games.
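The decoupling described here is typically done with a fixed-timestep loop: logic advances in constant steps no matter how fast frames render. A minimal sketch (Python for illustration only; the names are invented, not from any real engine):

```python
# Sketch of a fixed-timestep loop, the usual way game logic is decoupled
# from framerate. All names here are illustrative, not from any engine.

LOGIC_DT = 1.0 / 60.0  # simulation always advances in 60Hz steps

def run(frame_times, update):
    """frame_times: wall-clock duration of each rendered frame."""
    accumulator = 0.0
    sim_time = 0.0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many fixed logic steps as the elapsed time covers.
        while accumulator >= LOGIC_DT:
            update(LOGIC_DT)
            sim_time += LOGIC_DT
            accumulator -= LOGIC_DT
        # ...render the frame here...
    return sim_time

# One second of wall time at 30fps and at 60fps produces the same number
# of logic steps, so faster hardware can't change game behavior.
steps = []
slow = run([1.0 / 30.0] * 30, lambda dt: steps.append(dt))
n_slow = len(steps)
steps.clear()
fast = run([1.0 / 60.0] * 60, lambda dt: steps.append(dt))
n_fast = len(steps)
```

Both runs execute 60 logic steps, which is why a clock bump only breaks games whose logic is *not* structured this way.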
 

ethomaz

Banned
This kinda reaffirms Leadbetter's impression that PS5 may be a "clean slate" again, losing all compatibility with PS4/Pro s/w.
Even Cerny said as much to EG.

PS5 will be a new arch, pushing things forward without holding onto old hardware compatibility.

About the clocks on AMD... yes, you're right, AMD uses the same base clock for almost everything, the exceptions being the memory controller and the SP clock (which increases on demand... it's called the boost clock).
 
Correct me if I'm wrong, but does support for a 4K mode, either native or checkerboarded, mean that there will invariably be downsampling support for HDTVs? Or is that still a significant amount of dedicated dev time/implementation?

I'm definitely interested in getting the Pro at launch, even though I'm not ready to jump onto the 4K/HDR train until this time next year-ish, so HDTV support for upcoming games (and the ease of providing that support) is definitely paramount for me. It seems like Cerny at least is adamant about supporting HDTVs, but who knows how far that'll take us.
 
Ah, logic was the wrong word, but physics and such (the Bethesda engine that powers Fallout/Elder Scrolls).

I mean logic/physics too. I absolutely concur that going from 30 to 60 is completely unreasonable to expect games to have prepared for in advance.

However, slowdowns are quite common, and the engine is capable of handling them; those could and should be completely eliminated by stronger hardware.

Bethesda games, for instance, have terrible framerates on consoles.
 

ethomaz

Banned
Correct me if I'm wrong, but does support for a 4K mode, either native or checkerboarded, mean that there will invariably be downsampling support for HDTVs? Or is that still a significant amount of dedicated dev time/implementation?

I'm definitely interested in getting the Pro at launch, even though I'm not ready to jump onto the 4K/HDR train until this time next year-ish, so HDTV support for upcoming games (and the ease of providing that support) is definitely paramount for me. It seems like Cerny at least is adamant about supporting HDTVs, but who knows how far that'll take us.
Cerny said in the interview that the minimum requirement for devs is to create a 4K mode, and that mode will be downsampled to 1080p on HDTVs without any additional dev time.

But devs are free to create their own, better HDTV mode, and that does need dev time, even if only a little.
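Conceptually, that automatic 1080p mode is just a box filter: every 2×2 block of 4K pixels averages down to one 1080p pixel. A grayscale sketch of the idea (illustrative only, not Sony's actual scaler):

```python
# Minimal sketch of 2x2 box-filter downsampling: each output pixel is
# the average of a 2x2 block of input pixels. Grayscale, illustrative
# only -- a real scaler works per color channel with better filters.

def downsample_2x2(img):
    """img: list of rows of grayscale values; dimensions divisible by 2."""
    h, w = len(img), len(img[0])
    return [
        [
            (img[2 * y][2 * x] + img[2 * y][2 * x + 1] +
             img[2 * y + 1][2 * x] + img[2 * y + 1][2 * x + 1]) / 4.0
            for x in range(w // 2)
        ]
        for y in range(h // 2)
    ]

# A 4x4 "4K" tile becomes a 2x2 "1080p" tile; a real 3840x2160 frame
# maps to 1920x1080 the same way, just with more pixels.
tile = [
    [0, 0, 8, 8],
    [0, 0, 8, 8],
    [2, 2, 4, 4],
    [2, 2, 4, 4],
]
out = downsample_2x2(tile)  # [[0.0, 8.0], [2.0, 4.0]]
```

This is also why 4K-rendered games supersample nicely on a 1080p set: each output pixel gets four samples' worth of information.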

I mean logic/physics too. I absolutely concur that going from 30 to 60 is completely unreasonable to expect games to have prepared for in advance.

However, slowdowns are quite common, and the engine is capable of handling them; those could and should be completely eliminated by stronger hardware.

Bethesda games, for instance, have terrible framerates on consoles.
You don't need better hardware to eliminate these issues... you just need a new patch from the dev to fix the slowdowns and framerate drops... your main issue is with the developer, not the hardware.

They didn't do that because of time and cost.

Most weren't happy to spend developer time on the Pro, but Sony convinced them ;)
 

rambis

Banned
"Logic tied to framerate" isn't a good way to explain it, because most games today run their logic on a different schedule than the framerate... so they aren't tied.

The issue is that if your game logic runs faster in a way you didn't predict, it can generate bugs, glitches, and openings for somebody to abuse the game.

That is why it's safe and recommended to keep hardware performance as close as possible, if you don't want to quality-test and patch the eventual issues across your full library of games.

Yeah, Cerny even said they ran some tests to this effect. The Xbox One S really isn't comparable here and will only serve to confuse people.
 
Increasing the base framerate is indeed unreasonable to expect. But tons of games have less-than-great framerates, and a stronger console incapable of running those games flawlessly, as they were intended, is a huge missed opportunity.
If the developer wants a locked framerate, that intention would already be manifested.

There's a lot going on when you start messing with framerates, and even interpolation between frames has drawbacks if you aren't careful in your logic cycle.

That's not a "missed opportunity" - that's what we devs call "unknown unknowns". They are borderline impossible to prepare for, and huge time and finance dumps that almost always lead nowhere.

I'm really amazed so many people are clamoring for Sony to fix what some of us devs fucked up to begin with. Sony's job is to ensure compatibility. It's on us to deliver solid performance, regardless of hardware.
 

ethomaz

Banned
Yeah, Cerny even said they ran some tests to this effect. The Xbox One S really isn't comparable here and will only serve to confuse people.
The XB1 S barely runs faster... a 7% bump isn't something that will expose issues in game logic.

The Pro's GPU at 911MHz has about a 14% clock advantage on top of Polaris's ~10% per-clock gain... that can end up around a 25% performance difference.
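The arithmetic behind that estimate is simple: the clock bump and the per-clock (IPC) gain multiply together. A quick sketch using the figures from the thread:

```python
# Back-of-the-envelope math for the "~25%" claim: a clock-speed ratio
# and a per-clock architectural gain combine multiplicatively.
# The ~10% Polaris per-clock figure is the rough number from the thread.

ps4_clock_mhz = 800.0
pro_clock_mhz = 911.0
polaris_per_clock_gain = 1.10

clock_ratio = pro_clock_mhz / ps4_clock_mhz        # ~1.14 (about +14%)
combined = clock_ratio * polaris_per_clock_gain    # ~1.25 (about +25%)

# For comparison, the XB1 S style bump is a ~7% clock increase alone.
s_bump = 1.07
```

Note 911/800 is about a 14% clock increase, not 13%; either way the combined figure lands near 25%.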
 

ethomaz

Banned
If it hits the magic point, it could have an impact.
Things like a higher jump at a certain framerate have happened before.
Yeah, it can... I worded it wrong :D
The chance of that happening is lower on the S than on a Pro running at 800MHz... imagine if it were at 911MHz... you'd only be increasing the chances of hitting that scenario.

Like some said here... in testing, Cerny ran into these magic points.
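The "higher jump at certain framerates" class of bug can be shown in a few lines: if physics uses a simple explicit Euler step per frame, the peak of a jump depends on the timestep. A toy illustration with made-up numbers:

```python
# Toy illustration of framerate-dependent physics: a jump integrated
# with explicit Euler steps reaches a different peak height depending
# on the timestep. All constants here are made up for the example.

def jump_peak(fps):
    dt = 1.0 / fps
    v = 10.0   # initial upward velocity, units/s
    g = 20.0   # gravity, units/s^2
    y = 0.0
    peak = 0.0
    while y >= 0.0:
        y += v * dt   # explicit Euler: position first...
        v -= g * dt   # ...then velocity, so error depends on dt
        peak = max(peak, y)
    return peak

# The ideal physics peak is v^2 / (2g) = 2.5 units; the simulated peak
# overshoots by roughly v*dt/2, so 30fps jumps higher than 60fps.
peak30 = jump_peak(30)
peak60 = jump_peak(60)
```

Run a game like this at an unexpected rate and jump heights, knockback distances, and similar tuned values quietly shift, which is exactly the kind of edge case Cerny's tests would surface.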
 

Moosichu

Member
It seems overly glib to simply call this bad design if pretty much everyone in the development world supposedly agrees it's bad design, yet a significant portion do it anyway. I.e., there are clearly advantages to engaging in the practice. The console world is not a separate, insular segment of developers who simply didn't get the memo - there's abundant overlap between console and PC development these days.

Very few devs do it. The only recent notable example has been the Souls games. It's bad practice to tie logic to framerate nowadays.
 
The XB1 S barely runs faster... a 7% bump isn't something that will expose issues in game logic.

The Pro's GPU at 911MHz has about a 14% clock advantage on top of Polaris's ~10% per-clock gain... that can end up around a 25% performance difference.

But doesn't it cap at base PS4 specs? Or am I misunderstanding you here?
 

belvedere

Junior Butler
Wuuuuut?

"In other words, at full floats, we have 4.2 teraflops. With half-floats, it's now double that, which is to say, 8.4 teraflops in 16-bit computation. This has the potential to radically increase performance."
 

chubigans

y'all should be ashamed
Increasing the base framerate is indeed unreasonable to expect. But tons of games have less-than-great framerates, and a stronger console incapable of running those games flawlessly, as they were intended, is a huge missed opportunity.

You'd have to do it on a title-by-title basis... and at that point it would be better for the developer to test and patch their own game, which is what they're aiming for.

Very few devs do it. The only recent notable example has been the Souls games.

And Bethesda's games (Fallout/Elder Scrolls), and Konami's MGS V, and a lot of indie games as well. Heck, the engine I use ties logic to the framerate, so I had to target 60fps from the outset.
 
Uh.

I don't get this argument. You're asking Sony and game developers to have hindsight on a console they haven't even made yet.

Honestly, it's giving me a headache.
He's implying that Sony always had the Pro in its plans. Maybe they didn't. Maybe they got wind of Scorpio and put the Pro out.
 
I don't know why some are talking about framerates when Cerny said frequency was the problem.

Anyway, does anyone have an example where you would tie something to the frequency of the GPU or CPU?
 

ethomaz

Banned
But doesn't it cap at base PS4 specs? Or am I misunderstanding you here?
It caps at the base PS4's clock... not its specs... the Pro's GPU is up to 10% faster at the same clock than the PS4's GPU... it's Polaris, 4th-gen GCN, while the PS4's GPU is based on the 1st gen.
 

Nydus

Member
If the developer wants a locked framerate, that intention would already be manifested.

There's a lot going on when you start messing with framerates, and even interpolation between frames has drawbacks if you aren't careful in your logic cycle.

That's not a "missed opportunity" - that's what we devs call "unknown unknowns". They are borderline impossible to prepare for, and huge time and finance dumps that almost always lead nowhere.

I'm really amazed so many people are clamoring for Sony to fix what some of us devs fucked up to begin with. Sony's job is to ensure compatibility. It's on us to deliver solid performance, regardless of hardware.

So much truth in this post. Yeah, it would be nice if older games ran at a stable 30, but it's not on the console maker to salvage bad programming. I mean, look at some PC games that run like shit on a $2000 PC. Do you blame the h/w or the game?
 
You don't need better hardware to eliminate these issues... you just need a new patch from the dev to fix the slowdowns and framerate drops... your main issue is with the developer, not the hardware.

They didn't do that because of time and cost.

Most weren't happy to spend developer time on the Pro, but Sony convinced them ;)

If the developer wants a locked framerate, that intention would already be manifested.

There's a lot going on when you start messing with framerates, and even interpolation between frames has drawbacks if you aren't careful in your logic cycle.

That's not a "missed opportunity" - that's what we devs call "unknown unknowns". They are borderline impossible to prepare for, and huge time and finance dumps that almost always lead nowhere.

I'm really amazed so many people are clamoring for Sony to fix what some of us devs fucked up to begin with. Sony's job is to ensure compatibility. It's on us to deliver solid performance, regardless of hardware.


Now that's completely unreasonable. To lock the framerate you have to ensure that the minimum framerate is at least the one you're aiming for. You can't do that and push the console at the same time. For example, even the almighty Uncharted 4 has framerate drops.

The game doesn't have to have a terrible framerate; a mostly solid 30fps with minor drops would already benefit.

What you guys are not considering is that throwing better hardware at a game is precisely the easiest way to get that game to run flawlessly, one that Sony is simply blocking by making the Pro cap itself to a regular PS4 when playing old games.

They shouldn't have to do anything extra other than expose the extra power to the games, which would in turn run better on the Pro without any extra dev work.

Dunno why they aren't doing it, but it's definitely possible. MS is doing it even with 360 games running on the Xbone.
 
Can someone explain to me whether FP16 is a big deal or not? Sorry if it was explained before; I just saw one post on this page saying nobody uses it on PC, so only first-party games could take advantage of it. But even for those, would it be a big deal?
 

Marmelade

Member
So much truth in this post. Yeah, it would be nice if older games ran at a stable 30, but it's not on the console maker to salvage bad programming. I mean, look at some PC games that run like shit on a $2000 PC. Do you blame the h/w or the game?

It's not about who's to blame.
It's just that it would have been nice to be able to brute-force through these issues with the Pro (just like it's done on PC for some games).
But if it's not meant to be, it's not meant to be.
 

Fisty

Member
It's not about who's to blame.
It's just that it would have been nice to be able to brute-force through these issues with the Pro (just like it's done on PC for some games).
But if it's not meant to be, it's not meant to be.

But you would be asking Sony to either falsely claim that all 700 PS4 games work flawlessly on the PS4 Pro without patches, or tell people that some games won't work without capping performance. Neither of those is a good outcome. Sony can't test every game to make sure it works on the new specs, so they're leaving it up to the devs across the board - the only realistic solution.

People keep saying this should work no problem without patches, and for maybe a large portion of those 700 games that's true; a simple patch that tells the game "hey, it's cool to stretch your legs" is all it would need. But for those that need more, hey, at least they're guaranteed to work as they did before, instead of you finding out 20 hours in that you've corrupted your save due to compatibility issues.

Not to mention, I'm willing to bet real money that Sony is giving all devs a free patch for their old titles if they support the Pro in some way, even if it's just a small res bump to get supersampling at 1080p.
 

mrklaw

MrArseFace
Even Cerny said as much to EG.

PS5 will be a new arch, pushing things forward without holding onto old hardware compatibility.

About the clocks on AMD... yes, you're right, AMD uses the same base clock for almost everything, the exceptions being the memory controller and the SP clock (which increases on demand... it's called the boost clock).

It'll inevitably be BC though, just maybe not forward compatible
 
The Pro sounds like a nice piece of hardware. Sony has done well here with the PS4. I'm waiting for reviews before I jump in.


However, this makes me worry about BC on PS5. I've bought a ton of digital games on my PS4 and Xbox One, from both 3rd and 1st parties. Looks like I might start getting all 3rd-party games on Xbox One. Idk, we shall see how Scorpio handles this, but it kind of makes me shaky about investing in the PS4 ecosystem.
 

Fisty

Member
The Pro sounds like a nice piece of hardware. Sony has done well here with the PS4. I'm waiting for reviews before I jump in.


However, this makes me worry about BC on PS5. I've bought a ton of digital games on my PS4 and Xbox One, from both 3rd and 1st parties. Looks like I might start getting all 3rd-party games on Xbox One. Idk, we shall see how Scorpio handles this, but it kind of makes me shaky about investing in the PS4 ecosystem.

Don't be worried about BC on PS5; it's almost guaranteed at this point. There's a reason they went to x86. I wouldn't expect Bloodborne to take advantage of PS5 specs unless specifically patched, though - just normal BC. I doubt forward compatibility; that would get muddy over time.
 

Fisty

Member
Some Xbox fanboys, sure; all the regular common-sense people? No way.

The PS4 Pro was leaked way before Scorpio.

Well, MS certainly wants us to believe it, which is the reason they announced a console 18 months before they planned on launching it, just so they could meet/beat the Pro's announcement.
 

ethomaz

Banned
Can someone explain to me whether FP16 is a big deal or not? Sorry if it was explained before; I just saw one post on this page saying nobody uses it on PC, so only first-party games could take advantage of it. But even for those, would it be a big deal?
My take on FP16.

Nobody uses it on PC because it's fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal, FP32 is 64x faster than FP16.

The Pro runs FP16 twice as fast as FP32.

Why bother using it if it's so slow? Devs only use it on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required isn't a known quantity in the PC space... they'd need to change the code and see whether they get a performance increase.

That is why I believe it will only be used in exclusive games, and not games that share PC development, because the use of FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
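Beyond throughput, the other way FP16 "can break your game" is precision: half floats carry a 10-bit mantissa (roughly 3 decimal digits) and max out at 65504. Python's `struct` module supports IEEE 754 half precision, which makes the effect easy to demonstrate:

```python
# Why dropping to FP16 can visibly break things: half precision has a
# 10-bit mantissa and overflows past 65504, so large values lose whole
# units. struct's 'e' format code is IEEE 754 half precision.
import struct

def to_fp16(x):
    """Round a float to the nearest representable half-precision value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# Fine for small values...
small = to_fp16(0.5)    # exactly representable -> 0.5

# ...but large world coordinates snap to a coarse grid: near 4096 the
# spacing between representable half floats is 4 whole units.
coord = to_fp16(4097.0)                    # rounds to 4096.0
gap = to_fp16(4100.0) - to_fp16(4096.0)    # 4.0
```

This is why shaders mix precisions: FP16 is fine for colors and normals, but positions and depth usually need FP32 to avoid artifacts.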
 

kaching

"GAF's biggest wanker"
Very few devs do it. The only recent notable example has been the Souls games. It's bad practice to tie logic to framerate nowadays.
I have no firsthand knowledge; I'm just observing, in the course of the discussion, that there seems to be some dispute over how limited the practice is. As such, simply calling it "bad practice" without better qualifying why numerous devs may still engage in it isn't really helpful, or strictly accurate.

And the practice has since been clarified as not being about game logic per se.
 

Tripolygon

Banned
My take on FP16.

Nobody uses it on PC because it's fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal, FP32 is 64x faster than FP16.

The Pro runs FP16 twice as fast as FP32.

Why bother using it if it's so slow? Devs only use it on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required isn't a known quantity in the PC space... they'd need to change the code and see whether they get a performance increase.

That is why I believe it will only be used in exclusive games, and not games that share PC development, because the use of FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
Umm, because Nvidia artificially limits FP16 performance on their consumer cards.

Mantis Burn uses FP16, and that game is on PC, Xbox One, PS4, and PS4 Pro.
 

Fafalada

Fafracer forever
Marmelade said:
It's just that it would have been nice to be able to brute-force through these issues with the Pro (just like it's done on PC for some games).
PCs brute-force things through driver layers (not APIs, as some people here think). There's no such thing as a free lunch for backwards compatibility - either you have a software stack that is maintained asymmetrically to the applications (graphics drivers, in the specific case of GPUs), or you have to update the applications.

Clock speed is something you can "mostly" change without touching the software - but the unpredictability factor remains, and whenever consoles have increased unpredictability in recent history, reactions have been less than positive.

LukasTaves said:
MS is doing it even with 360 games running on the Xbone.
As is Sony with PSP on Vita, PS2 on PS3/PS4, PS1 on PSP/PS2/PS3... But emulators are a simulation of the hardware. The application doesn't get to talk to different h/w - it talks to something that was optimized to pretend it's the same h/w, running at the same clock speed.
It sometimes runs internally faster/slower - but that is exceedingly unlikely to break anything (and when it does, the emulator gets fixed to work around it - part of the reason releases trickle out).
 

ethomaz

Banned
Umm, because Nvidia artificially limits FP16 performance on their consumer cards.
I really believe Polaris has slower FP16 than FP32 too... so no matter what, you'll break your game on PC.

The Pro's 2x-faster FP16 is something from AMD's roadmap (Vega?).

Edit - I found some info: the FP16:FP32 ratio on Polaris is 1:1... so it takes the same time to process an operation whether it's FP32 or FP16... the only benefit is the lower memory usage of FP16 on the RX 480... no performance improvement.

So why use something that gives no benefit (Polaris), or runs slower (Pascal), to make a game???

Mantis Burn uses FP16, and that game is on PC, Xbox One, PS4, and PS4 Pro.
It won't... only the Pro version appears to use it.
 

Tripolygon

Banned
I really believe Polaris has slower FP16 than FP32 too... so no matter what, you'll break your game on PC.

The Pro's 2x-faster FP16 is something from AMD's roadmap (Vega?).
It doesn't. On Polaris, you can calculate 1 FP16 op in the same time it takes to calculate an FP32 op. Vega can do 2:1, meaning you can do 2 FP16 ops in the time it takes to do 1 FP32 op.
 

ethomaz

Banned
It doesn't. On Polaris, you can calculate 1 FP16 op in the same time it takes to calculate an FP32 op. Vega can do 2:1, meaning you can do 2 FP16 ops in the time it takes to do 1 FP32 op.
I fixed my comments some minutes ago, and yes.

There's no incentive to use FP16 on PC... there's no performance increase; the opposite, in fact: all GPUs from Nvidia and AMD will run it slower, except Polaris lol (yes, I checked: GCN before Polaris didn't even have native FP16 support, only slow emulation).
 

Tripolygon

Banned
I fixed my comments some minutes ago, and yes.

There's no incentive to use FP16 on PC... there's no performance increase; the opposite, in fact: all GPUs from Nvidia and AMD will run it slower, except Polaris lol (yes, I checked: GCN before Polaris didn't even have native FP16 support, only slow emulation).
The incentive to use it on console is that it's a single closed platform where you can improve performance and save memory without major sacrifices to your goals.
It won't... only the Pro version appears to use it.
Yes, that's my point, and you said:
That is why I believe it will only be used in exclusive games, and not games that share PC development, because the use of FP16 can literally break your game on PC
 
Now that's completely unreasonable. To lock the framerate you have to ensure that the minimum framerate is at least the one you're aiming for. You can't do that and push the console at the same time. For example, even the almighty Uncharted 4 has framerate drops.

The game doesn't have to have a terrible framerate; a mostly solid 30fps with minor drops would already benefit.

What you guys are not considering is that throwing better hardware at a game is precisely the easiest way to get that game to run flawlessly, one that Sony is simply blocking by making the Pro cap itself to a regular PS4 when playing old games.

They shouldn't have to do anything extra other than expose the extra power to the games, which would in turn run better on the Pro without any extra dev work.

Dunno why they aren't doing it, but it's definitely possible. MS is doing it even with 360 games running on the Xbone.
You have no clue how game development works on closed architecture on consoles.

None. Nada. Zip. Zilch.

You have no clue how low-level APIs work.

"throwing a better hardware to a game is precisely the easiest way to get that game to run flawlessly"

Oh, why sure it is! Sure, sure. I mean, forget the fact that people like Chubigans and I work on PS4 daily. We have no clue what we're talking about. We certainly don't have intimate knowledge of the system architecture or the APIs for the OG/Pro, and can't cite the differences leading to the decision that 1:1 performance is the appropriate choice.

Wait, no... That's not it... Dammit, where are my PS4 developer cliff notes?

I don't even really need to explain the 360/XO compatibility, do I? I'll just ask and see if you can get to the answer yourself:

Does every 360 game run on XO right out of the box? Or do only certain games have compatibility?

There should be a light bulb going off above your head any second now followed by an "ah ha!" and then a "doh!".
 

ethomaz

Banned
The incentive to use it on console is that it's a single closed platform where you can improve performance and save memory without major sacrifices to your goals.

Yes, that's my point.
Exclusives will use it ;)

Multiplatform games will mostly ignore it, with a few exceptions, mostly indies.
 

dr_rus

Member
About the clocks on AMD... yes, you're right, AMD uses the same base clock for almost everything, the exceptions being the memory controller and the SP clock (which increases on demand... it's called the boost clock).
There is no "SP clock"; the core clock is the only clock there is (plus the memory clocks for the MCs), and it changes with boost.

My take on FP16.

Nobody uses it on PC because it's fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal, FP32 is 64x faster than FP16.
Nobody uses it on PC because it made no sense to use it until Pascal, since FP16 was at best as fast as FP32.

The Pro runs FP16 twice as fast as FP32.

Why bother using it if it's so slow? Devs only use it on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required isn't a known quantity in the PC space... they'd need to change the code and see whether they get a performance increase.

That is why I believe it will only be used in exclusive games, and not games that share PC development, because the use of FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
FP16 is a shader calculation precision. If you can get a 2x boost by using FP16 precision for some part of your shader(s), then you will most likely use it, since performance is scarce on console h/w. The same shaders will then be ported to PC, since it's rather rare that there's any kind of deep rewrite between platforms, and if there's a card that can perform FP16 calculations twice as fast - it will.

Btw, your info on FP16 on NV h/w on PC is correct only for CUDA applications. FP16 is not exposed in DX for Pascal cards at all. If there's an FP16 hint in the code of some DX game, it will just be ignored and the calculation will be performed at FP32 precision, at FP32 throughput.

Umm, because Nvidia artificially limits FP16 performance on their consumer cards.
Not really, unless by "artificially limits" you mean "specifically cut the h/w needed for this out of their consumer cards to make it a feature of the GP100 chip and keep consumer cards simpler".
 