Wait, I'm a little confused. Are they saying the PS4 Pro is a legitimate, bonafide 4K games machine? I thought the PS4 Pro was only capable of upscaling retail games to 4K quality unless they are counting native 4K indie, lower budget style games.
Thanks guys. I honestly had no idea that the PS4 could natively run games in 4K beyond indie titles. I assumed that everything was upscaled.
They didn't cut out features on their consumer cards so they can sell it in their high-margin pro cards?
Nvidia has been doing that for ages.
Nvidia artificially restricts the performance of FP16 in their consumer cards in order to sell it in their pro cards. You are hung up on the word artificial as if it means hardware support is there and they are using software to restrict it (yes, they have also done that before via software). That's not what the word artificial means; it means they are creating market segmentation, which is good business.

When someone "cuts out" a big engine from a cheaper car, is it artificial or not? The h/w isn't there in consumer cards; that was a marketing-positioning decision, not something artificial. Nv has been doing this for ages, and look where Nv is now and where their competitors are.
The biggest thing to take out of this is that if FP16 actually begins to benefit the GeForce (gaming) market, they will be able to bring that h/w into their consumer products rather fast.
It's more like a Neo Geo that can also run Genesis games.

So that I can understand this, could someone talk to me in bits?
For example, the Sega Genesis was 16-bit,
then came the 32X.
So is that what a PS4 Pro is, like some sort of 32X?
It's more like a Neo Geo that can also run genesis games.
Can we say it's like an N64 with the memory pack ?
Thing is, all the talk about how the same (more or less) h/w with a slight clock bump leads to compatibility issues shows that Sony doesn't really care about s/w compatibility with future platforms. Otherwise this shit would be abstracted behind APIs enough to not make any difference.
This kinda reaffirms Leadbetter's impression that PS5 may be a "clean slate" again, losing all compatibility with PS4/Pro s/w.
Even Cerny said to EG that PS5 will be a new architecture, pushing things forward without holding onto old-hardware compatibility.
You have no clue how game development works on closed architecture on consoles.
None. Nada. Zip. Zilch.
You have no clue how low-level APIs work.
"throwing a better hardware to a game is precisely the easiest way to get that game to run flawlessly"
Oh, why sure it is! Sure, sure. I mean, forget the fact that people like Chubigans and I work on PS4 daily. We have no clue what we are talking about. We certainly don't have an intimate knowledge of the system architecture or the APIs for OG/Pro, and can't cite the differences leading to the decision that 1:1 performance is the appropriate choice.
Wait, no... That's not it... Dammit where are my PS4 developer cliffnotes!
I don't even really need to explain the 360/XO compatibility, do I? I'll just ask and see if you can get to the answer yourself:
Does every 360 game run on XO right out of the box? Or do only certain games have compatibility?
There should be a light bulb going off above your head any second now followed by an "ah ha!" and then a "doh!".
Glad to see more actual devs who know what they're talking about shedding light. I did say god dayuuumn tho lol.
Uh, if I'm reading that right, it's kinda worrying.
I've been buying more digital titles on PS4 lately on the assumption that the x86 shift this gen would pretty much ensure these games will work on future generation machines for the foreseeable future.
I'm going to be really disappointed if PS5 makes us start all over again. I thought MS and Sony were doing away with restarts every generation, I wanted to start building a proper digital library.
Question: do you want better PS4 games or do you want a new generation of games with better interface & better all around instead of just giving you better versions of what you already have?
Nvidia artificially restricts the performance of FP16 in their consumer cards in order to sell it in their pro cards. You are hung up on the word artificial as if it means hardware support is there and they are using software to restrict it (yes they have also done that before via software). That's not what the word artificial means, it means they are creating market segmentation, it is good business.
I don't care about their market share; that's not what I was talking about. Someone said FP16 is slow on PC, using Nvidia cards as an example. I simply pointed out that's because Nvidia restricts the performance of FP16 on their consumer cards.
You can stay hung up on the word artificial, but we are going to have to agree to disagree. I hate car analogies; they seldom make sense. Two cars with the same engine, but one car has a higher horsepower rating: this can be accomplished by restricting the exhaust and air inlets, changing valve timing, and so on.
Cerny is on a different level!
It's unfortunate that technical discussions between people who claim to be professionals turn into name calling and insults so quickly when they could be valuable exchanges of knowledge that would benefit everyone interested in the subject.
That's not what the word artificial means in my context, so don't tell me what it means. You sarcastically asked if that was what I meant and I said yes, so I don't know what you're still going on about.

That's exactly what the word "artificial" means, and you're just grasping at straws now. There is no FP16 h/w beyond the single SP for CUDA compatibility in NV's consumer GPUs, and you don't know nearly enough to say that it's "artificial" and not in fact a conscious choice to make the consumer part simpler and better performing where it actually matters, namely in FP32.
man-made or produced by human beings rather than occurring naturally
Limiting the performance of compute-centric features in consumer parts is nothing new for NVIDIA. FP64 has been treated as a Tesla feature since the beginning, and consumer parts have either shipped with a very small number of FP64 CUDA cores for binary compatibility purposes, or when a GeForce card uses an HPC-class GPU, FP64 performance is artificially restricted. This allows NVIDIA to include a feature for software development purposes while enforcing strict market segmentation between the GeForce and Tesla products. However in the case of FP64, performance has never been slower than 1/32, whereas with FP16 we’re looking at a much slower 1/128 instruction rate. Either way, the end result is that like GP104’s FP64 support, GP104’s FP16 support is almost exclusively for CUDA development compatibility and debugging purposes, not for performant consumer use.
As for why NVIDIA would want to make FP16 performance so slow on Pascal GeForce parts, I strongly suspect that the Maxwell 2 based GTX Titan X sold too well with compute users over the past 12 months, and that this is NVIDIA's reaction to that event. GTX Titan X's FP16 and FP32 performance was (per-clock) identical to its Tesla equivalent, the Tesla M40, and furthermore both cards shipped with 12GB of VRAM. This meant that other than Tesla-specific features such as drivers and support, there was little separating the two cards.
Obviously they're not going to make any promises at this stage, but considering the competition's intent to build a BC library moving forward, I can't imagine PS5's development at the moment not taking into account a PS4 emulator to simulate the PS4 platform.
There were a couple helpful analogies earlier regarding the differences between FP16 and FP32 and how PS4Pro can benefit from that, but can someone answer this:
How will the increased capabilities of FP16 manifest in games? In pure layman's speak, what will this make shinier?
Just higher quality shaders?
That was a great call, man. You know your shit.

Say if a dev made a game for the PS4 & they used FP32 for a lot of tasks that they could have used FP16 for, that same dev could make that game for the PS4 Pro & use FP16 & get a little over 4X the performance.
This is where we will see native 4K PS4 Pro games vs 1080P PS4 games.
http://www.neogaf.com/forum/showpost.php?p=215207877&postcount=3366
Say if a dev made a game for the PS4 & they used FP32 for a lot of tasks that they could have used FP16 for, that same dev could make that game for the PS4 Pro & use FP16 & get a little over 4X the performance.
This is where we will see native 4K PS4 Pro games vs 1080P PS4 games.
http://www.neogaf.com/forum/showpost.php?p=215207877&postcount=3366
FP16 is not a compressed FP32.
You lose range and accuracy using FP16, which can end up causing a loss in IQ after the processing.
And FP16 is only twice as fast as FP32 on Pro... that "over 4x" performance claim is bullshit again.
You can use FP16 in places where you don't need FP32's range of precision, and talking about a real game, the boost from changing to FP16 where you can is around 20% from what I read.
Even the 2x performance boost is a fantasy, because you can't use FP16 in every single place.
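The range/accuracy loss being described here is easy to see directly with NumPy's IEEE half-precision type (a quick sketch of the generic FP16 trade-offs; nothing PS4 Pro-specific is claimed):

```python
# Quick sketch of the FP16 range/precision trade-offs the post describes,
# using NumPy's IEEE half-precision float16 type.
import numpy as np

# FP16 tops out around 65504 and carries ~3 decimal digits of precision;
# FP32 reaches ~3.4e38 with ~7 digits.
print(np.finfo(np.float16).max)   # 65504.0
print(np.finfo(np.float32).max)   # ~3.4028235e38

# Rounding error is far larger at half precision.
fp32_sum = np.float32(0.1) + np.float32(0.2)
fp16_sum = np.float16(0.1) + np.float16(0.2)
print(fp32_sum, fp16_sum)  # FP16 result is visibly off from 0.3

# Values that are routine in FP32 overflow to infinity in FP16.
print(np.float16(70000.0))  # inf
```

This is why shader code has to pick its spots: FP16 is fine for colour-like quantities, but positions, depth, and long accumulations can fall outside its range or precision.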
I see that, but the difference between a game using FP16 over FP32 is nowhere near 2x on a 2:1 GPU. And you need to decide if the loss in quality is worth it.

4x performance over the vanilla PS4: I think you didn't understand what he is logically saying.
An FP32 task on PS4 is four times slower than the same task processed in FP16 on Pro (assuming the difference in FLOPs between PS4 and Pro is over 2x).
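The arithmetic behind that "4x over vanilla PS4" reading can be sketched with the commonly cited FLOP figures (these numbers are assumptions from press coverage, not measurements):

```python
# Rough arithmetic behind the "over 4x" claim, using commonly cited
# figures (assumed, not measured): PS4 ~1.84 TFLOPS FP32; PS4 Pro
# ~4.2 TFLOPS FP32, with FP16 running at twice the FP32 rate.
ps4_fp32 = 1.84              # TFLOPS
pro_fp32 = 4.2               # TFLOPS
pro_fp16 = pro_fp32 * 2      # 2:1 FP16 rate on Pro

print(round(pro_fp32 / ps4_fp32, 2))  # 2.28x: Pro vs PS4, both at FP32
print(round(pro_fp16 / ps4_fp32, 2))  # 4.57x: Pro at FP16 vs PS4 at FP32
```

So the "over 4x" figure only holds when comparing FP16 on Pro against FP32 on the base PS4, and only for work that can actually run at half precision.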
You are comparing to the PS4; I misunderstood... I thought you were talking about 4x over Pro with FP32.

It all just went over your head.
PS4 Pro is already over 2X the PS4, & when you add in devs optimizing & using FP16 where they had used FP32 (when FP16 was limited to the same throughput as FP32 on PS4), you get over 4X the performance.
Don't forget that PS4 Pro has extra hardware like the advanced work distributor, ID buffer, color compression & so on that will lighten the load vs. making a game on the PS4.
BTW, a bit of GPU history.
Up to NV35 (FX 5900), nVidia's GPUs didn't have a native FP32 unit, and because of that FP32 was way slower than FP16. ATI did have a better FP32 unit, still slower than FP16 but faster than nVidia's FP32 emulation. ATI also had a middle option called FP24, with performance close to FP16 and a better image-quality trade-off versus FP32.
nVidia was accused of cheating because they forced games via drivers to run in FP16, which was faster than ATI's solution, but the output image was far lower quality than ATI's FP24 option... the FP16 output was really a big trade-off in terms of image quality.
GPUs after that moved to native FP32 units, and all performance and image quality were measured in FP32.
Mobile GPUs use FP16 because a small screen can't show the differences in image quality, making the trade-off worth it.
Big-screen GPUs all use FP32 because of the better quality and optimal performance (the move to native FP32 units meant FP16 was emulated until Polaris, and at way lower performance than FP32).
People who think FP16 is a new thing that will give games a free performance boost are lying to themselves.
FP16 offers lower image quality compared to FP32, and not every part of the code can run in FP16 (most shaders can)... the gain won't be close to 2x performance, and it comes with degradation in image quality.
The industry needs to, and should, move to FP64 once the hardware implementation gets cheaper and free in terms of performance.
On the same size TV, 4K pixels will be 1/4 the size of 1080P pixels, so the 4K pixels don't need as much detail in them as a pixel from a 1080P game. It's probably going to be hard to tell the difference between pixels made using FP32 & FP16 at 4K resolutions.
In the paper they say they would rather devs use checkerboard rendering at 4K than ship a normal 1440P game, because a 1440P game on a 4K TV doesn't look much different from a 1080P game on a 1080P TV.
4K cheap pixels will look better than 1440P high-quality pixels on a 4K TV. It's a middle ground: it's not going to look as good as 4K without the tricks, but it's going to look better than the next best thing.
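A rough numbers check on the 1440P-vs-checkerboard comparison, assuming checkerboard rendering shades half the 4K grid per frame (the usual description of the technique; exact PS4 Pro details are not claimed here):

```python
# Rough numbers behind "4K cheap pixels vs 1440P high-quality pixels",
# assuming checkerboard rendering shades half of the 4K grid per frame
# (the common description of the technique, not a PS4 Pro specification).
full_4k      = 3840 * 2160   # 8,294,400 pixels on screen
native_1440  = 2560 * 1440   # 3,686,400 shaded pixels
checkerboard = full_4k // 2  # 4,147,200 shaded pixels per frame

print(full_4k, native_1440, checkerboard)
print(checkerboard / native_1440)  # 1.125
```

So checkerboard shades only slightly more pixels per frame than native 1440P, but it reconstructs onto the full 4K grid instead of being upscaled from a 1440P buffer, which is where the perceived sharpness comes from.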
This is a recurring thing between ATI (AMD) and Nvidia. Nvidia also accused ATI of cheating using the same tactic.
Yep, AMD used FP16 in some game benchmarks recently, but it was only for the render buffer I guess... not shaders or other parts.
Important note if you are testing the following applications:
* Dawn of War 2
* Empire Total War
* Need for Speed: Shift
* Oblivion
* Serious Sam II
* Far Cry 1
AMD has admitted that performance optimizations in their driver alter image quality in the above applications. The specific change involves demoting FP16 render targets to R11G11B10 render targets, which are half the size and less accurate. The image-quality change is subtle, but it alters the workload for benchmarking purposes. The correct way to benchmark these applications is to disable Catalyst AI in AMD's control panel. Please contact your local AMD PR representative if you have any doubts on the above issue.
NVIDIA's official driver-optimization policy is to never introduce a performance optimization via .exe detection that alters the application's image quality, however subtle the difference. This is also the policy of FutureMark regarding legitimate driver optimizations.
NOTE: If you wish to test with Need for Speed: Shift or Dawn of War 2, we have enabled support for FP16 demotion similar to AMD in R260 drivers for these games. By default, FP16 demotion is off, but it can be toggled on/off with the AMDDemotionHack_OFF.exe and AMDDemotionHack_ON.exe files which can be found on the Press FTP.
For apples-to-apples comparisons with our hardware versus AMD, we ask that you run the AMDDemotionHack_ON.exe when performing your graphics testing with these games. In our own internal testing, speedups of up to 12% can be seen with our hardware with FP16 demotion enabled.
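For context on "half the size": assuming the FP16 target is a four-channel RGBA16F surface (a common case, not stated in the note), the byte math works out as:

```python
# Why demoting an FP16 render target to R11G11B10 halves it, assuming the
# FP16 target is a four-channel RGBA16F surface (an assumption for this
# sketch; the quoted note doesn't name the exact format).
width, height = 1920, 1080

rgba16f_bpp   = 4 * 16        # 64 bits per pixel: 16 each for R, G, B, A
r11g11b10_bpp = 11 + 11 + 10  # 32 bits per pixel, packed, no alpha channel

rgba16f_bytes   = width * height * rgba16f_bpp // 8
r11g11b10_bytes = width * height * r11g11b10_bpp // 8
print(rgba16f_bytes // r11g11b10_bytes)  # 2: packed target is half the size

# The cost is precision: the R11/G11 channels keep only a 6-bit mantissa
# versus FP16's 10 bits, hence the subtle image-quality difference.
```

Half the bytes per pixel means half the render-target bandwidth, which is where the reported speedups come from.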
Geometry rendering is a simpler form of ultra HD rendering that allows developers to create a 'pseudo-4K' image, in very basic terms. 1080p render targets are generated with depth values equivalent to a full 4K buffer, plus each pixel also has full ID buffer data - the end result is that via a post-process, a 1080p setup of these 'exotic' pixels can be extrapolated into a 4K image with support for alpha elements such as foliage and storm fences (albeit at an extra cost). On pixel-counting, it would resolve as a native 4K image, with 'missing' data extrapolated out using colour propagation from data taken from the ID buffer. However, there is a profound limitation.
When the pixel counters go to count the pixels it will be 8294400 pixels & not 2073600 pixels scaled across 8294400 pixels.
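Those pixel counts check out:

```python
# Checking the pixel counts in the post: a reconstructed 4K frame carries
# 8,294,400 distinct pixels, versus 2,073,600 for a 1080P frame scaled up.
pixels_4k   = 3840 * 2160
pixels_1080 = 1920 * 1080

print(pixels_4k)                 # 8294400
print(pixels_1080)               # 2073600
print(pixels_4k // pixels_1080)  # 4: four 4K pixels per 1080P pixel
```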
Yeah, I was interested in this ever since you first mentioned it. BTW, the PS4 is not capable of FP16, right? This is exclusively a Pro feature until the Scorpio comes around?

Say if a dev made a game for the PS4 & they used FP32 for a lot of tasks that they could have used FP16 for, that same dev could make that game for the PS4 Pro & use FP16 & get a little over 4X the performance.
This is where we will see native 4K PS4 Pro games vs 1080P PS4 games.
http://www.neogaf.com/forum/showpost.php?p=215207877&postcount=3366