
Inside PlayStation 4 Pro: How Sony made the first 4K games console

ethomaz

Banned
Mantis Burn is not exclusive.
...with a few exceptions, or indie games.

It has already been proven across generations, and even on PS4... exclusive features are only fully utilized in exclusive titles... multiplatform development always targets the lowest common denominator.

Games run on all platforms coded with FP32... does it make sense to recode all these variables where possible (where you don't require FP32) just to build the Pro version?
 

Tripolygon

Banned
Not really, unless you mean "specifically cut out the h/w needed for this from their consumer cards to make it a feature of the GP100 chip and make consumer cards simpler" under "artificially limits".
Yes, that's what I mean by artificially limiting it.
 

Colbert

Banned
The incentive to use it on console is that it's a single closed platform, so you can improve performance and save memory without major sacrifices to your goals.

The question to me is: for what types of workloads can you afford to use that lesser level of precision in calculations?
 
Cerny said in the interview that the minimum requirement for devs is that they create a 4K mode, and this mode will be downsampled to 1080p on HDTVs without any additional dev time.

Where did you get that information? I think you may have interpreted it incorrectly. One reason is that Paragon doesn't have a 4K mode, so that debunks the idea that a 4K mode is required. From the article I read, it was said that they were strongly encouraging developers to render at higher resolutions, but ultimately it was up to them.
 

ethomaz

Banned
Where did you get that information? I think you may have interpreted it incorrectly. One reason is that Paragon doesn't have a 4K mode, so that debunks the idea that a 4K mode is required. From the article I read, it was said that they were strongly encouraging developers to render at higher resolutions, but ultimately it was up to them.
The Paragon 1080p enhanced mode is better than 4K downsampled... so it's not the minimum.

The minimum is about what you get on an HDTV... the minimum is to use the Pro mode (mostly 4K) downsampled to 1080p... of course, using a 1080p enhanced Pro mode will be even better.
 
The Paragon 1080p enhanced mode is better than 4K downsampled... so it's not the minimum.

The minimum is about what you get on an HDTV... the minimum is to use the Pro mode (mostly 4K) downsampled to 1080p... of course, using a 1080p enhanced Pro mode will be even better.

Have a link?
 

panda-zebra

Banned
Imagine some artist who can paint a nicely detailed painting in X amount of time. This is FP32 4.2TF mode.

Then imagine the same artist forced to paint the same painting but with less care, cutting some corners here and there. You may or may not notice depending on which part you're looking at. It looks similar to the original but he took half the amount of time. This is 8.4TF FP16 mode.

Damn, you sure are a great mate, I understand now! Holy sh**!

How about a slightly different analogy - the painter is producing a mural. This task doesn't require very fine skills due to its nature. Although she can paint very finely with her left hand, nobody would ever see this detail, so her efforts are somewhat wasted.

In the past (PS4), if she chose to paint less accurately, because the task at hand doesn't require such precision, nothing would have been gained: one painter, one brush, same time to complete the task either way (FP32).

Instead (PS4 Pro), now when she paints a little less accurately she gets to hold a brush in each hand (FP16). As it's a mural and fine detail is not required, she's choosing to be less accurate to double her throughput, either completing the task in half the time or painting two different murals, one on the ceiling, one on the wall.
long arms, yo

Not every task is suitable for this new technique of hers. If she was painting a still life portrait, she'd probably go with the left hand only and produce her finest results, taking her time.
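
If it helps to put the "brush in each hand" idea in code: double-rate FP16 conceptually packs two 16-bit values into one 32-bit register and operates on both lanes with a single instruction. Plain C++ can't issue real packed-FP16 hardware ops, so the toy sketch below uses 16-bit integer lanes purely to illustrate the one-operation, two-results idea.

```cpp
#include <cstdint>
#include <cstdio>

// Conceptual illustration of "packed" math: one 32-bit register holds two
// 16-bit lanes, and a single operation produces two results. PS4 Pro's
// double-rate FP16 does this with half-precision floats in hardware; the
// integer version below only demonstrates the two-lanes-per-register idea.
static uint32_t pack(uint16_t lo, uint16_t hi) {
    return (uint32_t)lo | ((uint32_t)hi << 16);
}

static uint32_t add_packed(uint32_t a, uint32_t b) {
    // Add the low lanes and the high lanes independently (wrap-around per lane).
    uint32_t lo = (a + b) & 0x0000FFFFu;
    uint32_t hi = ((a & 0xFFFF0000u) + (b & 0xFFFF0000u)) & 0xFFFF0000u;
    return lo | hi;
}

int main() {
    uint32_t a = pack(100, 7);
    uint32_t b = pack(23, 5);
    uint32_t r = add_packed(a, b);   // one "instruction", two results
    std::printf("low lane: %u, high lane: %u\n", r & 0xFFFF, r >> 16);
}
```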

At the end of this article they mention Knack looking better without doing anything to it.

https://www.engadget.com/2016/10/20/ps4-pro-mark-cerny-interview-hardware/

While I don't agree that it's automagically improved without a patch just because Cerny isn't quoted as saying so explicitly, my only true reaction is this:

[reaction gif]
 

ethomaz

Banned
PS3 pixel shaders were all 16-bit. Go figure.
Because of this...

Each pixel pipeline has a pair of dedicated FP32 units. For this reason, nVidia describes the architecture as either 16×1 or 32×0, depending on whether a texturing operation is being executed. The 6800 Ultra is effectively a 16×1 architecture when texture operations are occurring. When texturing operations are not being carried out, the 6800 can act as a 32×0 architecture.

The FP32 texture unit found in each pixel shader pipeline also tends to filtering chores, doing bilinear, trilinear, and up to 128-tap anisotropic filtering. However, these texturing units can also perform filtering on FP16 color values, such as those used in ILM’s OpenEXR format. This preserves pixel color precision by not “dumbing down” the pixel color value to a fixed point 32-bit value, where filtering operations could introduce rounding errors. These errors can show up as banding or blotching in the image, particularly in areas with higher than normal dynamic range. It also means that an FP16 color value can be written into the frame buffer, and then be read back into the GPU without any loss of precision.

This was an issue in old nVidia GPUs (like the PS3's, the 6800, etc.).
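
To make the banding point concrete, here's a small standalone sketch (mine, not from the quoted article): a dim, slowly varying gradient stored as 8-bit fixed point collapses into a couple of dozen distinct steps, while the same values kept in floating point stay distinct.

```cpp
#include <cstdio>
#include <cmath>
#include <set>

// Illustrates the banding the article describes: storing a smooth gradient
// as 8-bit fixed point collapses many nearby shades into the same value,
// while a floating-point representation (FP16/FP32) keeps them distinct.
int main() {
    const int width = 1024;
    std::set<int>   fixed_levels;   // distinct 8-bit values produced
    std::set<float> float_levels;   // distinct float values produced

    for (int x = 0; x < width; ++x) {
        // A dim, slowly varying gradient (e.g. a dark sky), range [0, 0.1].
        float intensity = 0.1f * x / (width - 1);
        int   as_8bit   = (int)std::lround(intensity * 255.0f); // fixed point
        fixed_levels.insert(as_8bit);
        float_levels.insert(intensity);
    }
    std::printf("distinct shades as 8-bit fixed point: %zu\n", fixed_levels.size());
    std::printf("distinct shades kept as float:        %zu\n", float_levels.size());
    // The 8-bit version has only a couple of dozen steps -> visible banding.
}
```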
 

ethomaz

Banned
Have a link?
The EG article in the OP.

Devs will be required to do a Pro mode, and they are encouraging resolutions higher than 1080p for 4K; the minimum they will do for HDTVs is downsample from a higher resolution (no dev time needed).

Devs can choose to do a 1080p enhanced mode for HDTV instead, which requires dev time.
 

ethomaz

Banned
They didn't cut out features on their consumer cards so they can sell it in their high-margin pro cards?
Nvidia has been doing that for ages.
Did you consider the option that it was to make the chip smaller and cheaper?

Every company cuts features going from enterprise to end-consumer... it's called product segmentation... you sell the expensive, full-featured chip to the big companies and the smaller chip, lacking unused features, cheaper to consumers.

That happens with cars, phones, etc. too if you want an example, or just look at the PS4 vs Pro ;)
 

Tripolygon

Banned
Did you consider the option that it was to make the chip smaller and cheaper?

Every company cuts features going from enterprise to end-consumer... it's called product segmentation... you sell the expensive, full-featured chip to the big companies and the smaller chip, lacking unused features, cheaper to consumers.

That happens with cars too if you want an example, or the PS4 vs Pro ;)
Nowhere in my post do I dispute that or say they can't do that. I said they artificially limit certain features on their consumer cards for market segmentation.
 

Electret

Member
My take on FP16.

Nobody uses it on PC because it is fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal FP32 is 64x faster than FP16.

The Pro runs FP16 twice as fast as FP32.

Why bother to use it if it is so slow? Devs use it only on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required is not a known thing in the PC scene... they will need to change the code and see if they get a performance increase.

That is why I believe it will only be used in exclusive games and not games that share PC development, because using FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5

I don't suppose someone in the know could produce a rough estimate of the percentage of data in a rendering pipeline that could be run as FP16?
 

DBT85

Member
Looking at the 2TB thing, but how much difference will it make? And how difficult is it to change?

It'll be easy to change, but we have no idea at all if it will make any more difference than SSDs did in the base PS4. Wait for people to test.
 

martino

Member
The hypervisor on Xbox will make things a lot different for Scorpio. BC will probably become more and more the strength of the Xbox ecosystem.
 

onQ123

Member
My take on FP16.

Nobody uses it on PC because it is fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal FP32 is 64x faster than FP16.

The Pro runs FP16 twice as fast as FP32.

Why bother to use it if it is so slow? Devs use it only on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required is not a known thing in the PC scene... they will need to change the code and see if they get a performance increase.

That is why I believe it will only be used in exclusive games and not games that share PC development, because using FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5


Edit: this post was silly by the way




I'll just repost what I posted 2 weeks ago


You still can't gauge it by that quote either, because there have been hardware & API changes to make better use of FP16 since the PS3 was developed.



They were just beginning to use it when the PS3 was being thought of, & they are just now adding native support for it in AMD PC cards, & MS is using it with Direct3D 12 & Shader Model 6


Half-precision floating point is a relatively new binary floating-point format. Nvidia and Microsoft defined the half datatype in the Cg language, released in early 2002, and was the first to implement 16-bit floating point in silicon, with the GeForce FX, released in late 2002.[1]
ILM was searching for an image format that could handle a wide dynamic range, but without the hard drive and memory cost of floating-point representations that are commonly used for floating-point computation (single and double precision).[2]
The hardware-accelerated programmable shading group led by John Airey at SGI (Silicon Graphics) invented the s10e5 data type in 1997 as part of the 'bali' design effort. This is described in a SIGGRAPH 2000 paper[3] (see section 4.3) and further documented in US patent 7518615.[4]

This format is used in several computer graphics environments including OpenEXR, JPEG XR, OpenGL, Cg, and D3DX. The advantage over 8-bit or 16-bit binary integers is that the increased dynamic range allows for more detail to be preserved in highlights and shadows for images. The advantage over 32-bit single-precision binary formats is that it requires half the storage and bandwidth (at the expense of precision and range).[2]

https://en.wikipedia.org/wiki/Half-precision_floating-point_format
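
As a concrete illustration of the s10e5 layout described above (1 sign bit, 5 exponent bits, 10 mantissa bits), here's a rough float-to-half round trip in C++. It's a simplified sketch that truncates instead of rounding and ignores subnormals/NaN, so treat it as illustrative rather than a reference converter.

```cpp
#include <cstdint>
#include <cstring>
#include <cstdio>

// Convert an IEEE-754 single-precision float to half precision (1 sign,
// 5 exponent, 10 mantissa bits). Simplified: round-to-nearest-even is
// replaced by truncation, and subnormals/NaN are not handled carefully.
static uint16_t float_to_half(float f) {
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));
    uint16_t sign = (bits >> 16) & 0x8000;            // sign bit
    int32_t  exp  = ((bits >> 23) & 0xFF) - 127 + 15; // rebias exponent
    uint32_t mant = bits & 0x007FFFFF;
    if (exp <= 0)  return sign;                        // too small -> zero
    if (exp >= 31) return sign | 0x7C00;               // too large -> infinity
    return sign | (uint16_t)(exp << 10) | (uint16_t)(mant >> 13);
}

static float half_to_float(uint16_t h) {
    uint32_t sign = (uint32_t)(h & 0x8000) << 16;
    uint32_t exp  = (h >> 10) & 0x1F;
    uint32_t mant = h & 0x03FF;
    uint32_t bits;
    if (exp == 0)       bits = sign;                             // zero (ignore subnormals)
    else if (exp == 31) bits = sign | 0x7F800000 | (mant << 13); // inf/NaN
    else                bits = sign | ((exp - 15 + 127) << 23) | (mant << 13);
    float f;
    std::memcpy(&f, &bits, sizeof(f));
    return f;
}

int main() {
    // Round-trip a few values to show where FP16 precision is "good enough"
    // (colours, normals) and where it visibly degrades (large values).
    float samples[] = {0.5f, 0.7071f, 3.14159265f, 1000.25f, 65504.0f, 70000.0f};
    for (float v : samples)
        std::printf("%12.5f -> %12.5f\n", v, half_to_float(float_to_half(v)));
}
```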





 
Was there any mention of utilization of FreeSync?

AMD has FreeSync over HDMI monitors available, but those are HDMI 1.4 at this point. I don't think the TV manufacturers are interested in adding dynamic refresh rate support to their sets, and it would require AMD to at least first make an HDMI 2.0b version of the extension. FreeSync over HDMI isn't a part of Adaptive-Sync standard, it's a proprietary extension by AMD. It doesn't look very likely that Sony would push to support it. There could be issues with handling the dynamic refresh rate with sub 30 FPS performance. Theoretically though AMD has stated iirc that FreeSync could go as low as 9 Hz.
 

viHuGi

Banned
AMD has FreeSync over HDMI monitors available, but those are HDMI 1.4 at this point. I don't think the TV manufacturers are interested in adding dynamic refresh rate support to their sets, and it would require AMD to at least first make an HDMI 2.0b version of the extension. FreeSync over HDMI isn't a part of Adaptive-Sync standard, it's a proprietary extension by AMD. It doesn't look very likely that Sony would push to support it. There could be issues with handling the dynamic refresh rate with sub 30 FPS performance. Theoretically though AMD has stated iirc that FreeSync could go as low as 9 Hz.

The PS4 Pro, based on Polaris, should totally be able to support FreeSync since it works with HDMI 2.0 and so on. I doubt Sony cares though, since TVs don't support it anyway and monitors which support it are not that mainstream.
 

ethomaz

Banned
Yes, it was silly. FP16 isn't slow, Nvidia just gimped FP16 on its new consumer cards; that has nothing to do with the speed of FP16.
It was always slow on any card... now they are "fixing" this issue with Polaris.

The Tesla segment has always had better FP16 performance compared with gaming GPUs.
 
How worthwhile is the PS4 Pro for 1080p TVs? Will every game have different options to choose from? A base mode for the normal PS4, a Pro mode at 1080p with better graphics/framerate, and a 4K resolution mode?

I remember reading something about this somewhere but I can't find it. I'm willing to pick up the Pro... but not really willing to upgrade my TV yet... especially considering it won't have 4K Blu-ray support... which probably would have tipped me over to getting a new TV as well.
 

renzolama

Member
It seems overly glib to simply call this bad design if pretty much everyone in the development world supposedly agrees it's bad design yet a significant portion do it anyway. AKA, there are clearly advantages to engaging in the practice. The console world is not a separate insular segment of developers who simply didn't get the memo - there's abundant overlap between console and PC development these days.

Software is rife with bad design; it's a fundamental problem for which the discipline of software engineering (vs. software programming) was built. The term 'bad' has no moral attachment, it just means that something is not designed optimally based on current standards and practices. As software architecture knowledge grows and changes, software using designs that were once considered acceptable inevitably becomes software with bad design; it's part of the natural cycle in technical disciplines that are relatively young with lots of problems still being solved.

I'm not casting aspersions on console developers because some of them may still follow practices that are no longer considered acceptable in other software industries, I'm simply pointing out that (in my opinion) you shouldn't limit hardware capability in order to provide those developers an excuse to continue following those practices rather than adapting the way everyone else has. Everyone here who is getting worked up defending developers is missing the point; I can guarantee you almost any of those developers would agree that these practices are bad and tell you that they're moving away from them. I'm not being glib or insulting, I'm just discussing the engineering/architecture the same way I would with any of the developers I work with, and I would phrase it exactly the same if I were to discuss it with any of the console developers in question.
 
My take on FP16.

Nobody uses it on PC because it is fucking slow... you can do 64 FP32 operations in the time of 1 FP16 operation on a GTX 1080, for example... in other words, on Pascal FP32 is 64x faster than FP16.

The Pro runs FP16 twice as fast as FP32.

Why bother to use it if it is so slow? Devs use it only on mobile/portable (Android, iOS, Tegra, etc.)... so experience with how FP16 improves performance in places where FP32 isn't required is not a known thing in the PC scene... they will need to change the code and see if they get a performance increase.

That is why I believe it will only be used in exclusive games and not games that share PC development, because using FP16 can literally break your game on PC.

You can read more here: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5

It seems Nvidia chose to have a very low ratio between FP16 and FP32 cores. A game using FP16 operations would run really slowly on a PC with an Nvidia card. Got it, thanks.

I am really fascinated by graphics programming (I'm a software dev) but I don't really know where to start. There also isn't a lot on offer in terms of MOOCs, it seems.
 
I don't understand why Sony doesn't push FreeSync with AMD, even if only Sony themselves support it on their TVs.

Would be a nice selling point for the Sony console/TV combo.
 

DBT85

Member
You have no clue how game development works on closed architecture on consoles.

None. Nada. Zip. Zilch.

You have no clue how low-level APIs work.

"throwing a better hardware to a game is precisely the easiest way to get that game to run flawlessly"

Oh, why sure it is! Sure, sure. I mean, forget the fact that people like Chubigans and I work on PS4 daily. We have no clue what we are talking about. We certainly don't have an intimate knowledge of the system architecture or the APIs for OG/Pro and can't cite the differences leading to the decision that performance as 1:1 is the appropriate choice.

Wait, no... That's not it... Dammit where are my PS4 developer cliffnotes!
If you indeed have experience working with the PS4 and Pro, then you are not making a very strong argument when all you have to say is "I'm the guy and I'm right."

Chubigans said all game logic is tied to the framerate. I said that's not the case and even explained how most (if not all) modern engines handle this.

Even if you work at AMD and developed the goddamn chip, that doesn't mean you can present misinformation as fact.

And I'm telling you again, any engine worth a damn these days does not lock the game logic to the CPU to the point where it would bug out or not benefit at all from said CPU running faster. ESPECIALLY when all that changed on the CPU was the clock.

Unless you are saying that whenever MS or Sony are developing a console and they change the clock mid-development, all the developers have to rework their logic to make their games work again. If that's what you're saying, then LOL.



I don't even really need to explain the 360/XO compatibility, do I? I'll just ask and see if you can get to the answer yourself:

Does every 360 game run on XO right out of the box? Or do only certain games have compatibility?

There should be a light bulb going off above your head any second now followed by an "ah ha!" and then a "doh!".
Oh no, you got me! Oh wait, you didn't.

First, because that doesn't cover the Xbox One S scenario, which is playing 100% of Xbox One games with increased performance and better framerates.

And second, in case you missed it, MS said when they announced the feature that, unlike what they did with the OG Xbox, they now have a single emulator that runs all games, instead of tweaking the emulation for each game.

They do have to get individual licenses from publishers again, get new art (both for the cover and achievements), and make a new entry for the game on Xbox Live, but it's the same emulator running all games. They don't have to make tweaks so the emulator doesn't break an individual game, or for that specific game to run better. And in some cases, especially before the emulator got updated with performance improvements, it could also run a game slower without bugs popping up out of nowhere because of it.

And why? Because these days no one couples game logic to a processor clock or anything like that anymore.

Unless that's how they do it on PlayStation. But that's kind of my point: it's Sony that's dropping the ball here.
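
For what it's worth, here's a minimal sketch of the kind of decoupled loop being described (not any particular engine's code; update_simulation and render are placeholder names): the simulation advances on a fixed timestep derived from wall-clock time, so a faster or higher-clocked CPU just produces more rendered frames, not faster game logic.

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical stand-ins for real engine work.
static void update_simulation(double dt_seconds) { (void)dt_seconds; /* physics, AI, ... */ }
static void render(double interpolation)         { (void)interpolation; /* draw current state */ }

int main() {
    using clock = std::chrono::steady_clock;

    const double dt = 1.0 / 60.0;   // fixed simulation step: 60 updates per second
    double accumulator = 0.0;
    auto   previous    = clock::now();

    int frames = 0;
    while (frames < 600) {           // bounded for this sketch
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Run as many fixed steps as real time dictates. On a faster CPU the
        // outer loop spins more often, but the simulation still advances at 60 Hz.
        while (accumulator >= dt) {
            update_simulation(dt);
            accumulator -= dt;
        }

        // Render whenever we get the chance; faster hardware means more frames,
        // not faster gameplay.
        render(accumulator / dt);
        ++frames;
    }
    std::printf("rendered %d frames without tying logic to CPU clock speed\n", frames);
}
```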
 
If you indeed have experience working with the PS4 and Pro, then you are not making a very strong argument when all you have to say is "I'm the guy and I'm right."

Chubigans said all game logic is tied to the framerate. I said that's not the case and even explained how most (if not all) modern engines handle this.

Even if you work at AMD and developed the goddamn chip, that doesn't mean you can present misinformation as fact.

And I'm telling you again, any engine worth a damn these days does not lock the game logic to the CPU to the point where it would bug out or not benefit at all from said CPU running faster. ESPECIALLY when all that changed on the CPU was the clock.

Unless you are saying that whenever MS or Sony are developing a console and they change the clock mid-development, all the developers have to rework their logic to make their games work again. If that's what you're saying, then LOL.




Oh no, you got me! Oh wait, you didn't.

First, because that doesn't cover the Xbox One S scenario, which is playing 100% of Xbox One games with increased performance and better framerates.

And second, in case you missed it, MS said when they announced the feature that, unlike what they did with the OG Xbox, they now have a single emulator that runs all games, instead of tweaking the emulation for each game.

They do have to get individual licenses from publishers again, get new art (both for the cover and achievements), and make a new entry for the game on Xbox Live, but it's the same emulator running all games. They don't have to make tweaks so the emulator doesn't break an individual game, or for that specific game to run better. And in some cases, especially before the emulator got updated with performance improvements, it could also run a game slower without bugs popping up out of nowhere because of it.

And why? Because these days no one couples game logic to a processor clock or anything like that anymore.

Unless that's how they do it on PlayStation. But that's kind of my point: it's Sony that's dropping the ball here.
I really think you need to understand the difference between emulation and native.

You also need to understand the difference between same hardware with upclock and new hardware with new APIs.

I'll let you work that out on your own. It's not difficult. I mean, all you keep saying is "why can't this apple taste like this orange?" but you aren't hearing yourself.

Hint:
It doesn't work like you think it does, therefore, comparisons can't be drawn.

Now take it from there.
 
The hypervisor on Xbox will make things a lot different for Scorpio. BC will probably become more and more the strength of the Xbox ecosystem.
Potentially, yes. The entire OS architecture of the Xbox seems geared towards being moved to different architectures as easily as possible; they seem to have had a level of hardware abstraction in mind from the get-go.

It definitely could be a feather in Scorpio's cap if, when it's released, it can run Xbox One games unmodified at more consistent framerates, or if its APIs/OS simply allow significantly easier patching to enable higher res/better framerates. That still wouldn't rectify the QA issue though; no matter how small a code change is, it requires QA involvement.
 

Humdinger

Member
How worthwhile is the PS4 Pro for 1080p TVs? Will every game have different options to choose from? A base mode for the normal PS4, a Pro mode at 1080p with better graphics/framerate, and a 4K resolution mode?

This has been discussed a lot in many different threads. Brief answer: At minimum, you will get 4K images downsampled in most games, and that by itself will result in better images at 1080p. Then you will also get game-by-game upgrades, depending on what the developer wants to do with the extra power at the 1080p level. We've already got several examples of games that are putting it to good use (e.g., Horizon, Paragon, several others), and some games that have special upgraded 1080p modes that can be selected (e.g., Tomb Raider, Mass Effect).

How it will all shake out is yet to be determined, because it depends on how the developers handle it. If you're uncertain whether it's worth it to you, wait and watch for the comparisons.
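
For anyone wondering what "downsampled" buys you on a 1080p set: here's a simple sketch (my own illustration, not Sony's actual scaler) of a 2×2 box filter that averages four rendered pixels into one output pixel, which is roughly why a 3840×2160 render shown at 1920×1080 looks cleaner than a native 1080p render.

```cpp
#include <vector>
#include <cstdio>

// Average 2x2 blocks of a high-res image down to half resolution per axis,
// e.g. 3840x2160 -> 1920x1080. Each output pixel blends four rendered
// samples, which is where the extra smoothness on a 1080p set comes from.
std::vector<float> downsample_2x2(const std::vector<float>& src, int w, int h) {
    std::vector<float> dst((w / 2) * (h / 2));
    for (int y = 0; y < h / 2; ++y)
        for (int x = 0; x < w / 2; ++x) {
            float sum = src[(2 * y)     * w + 2 * x] + src[(2 * y)     * w + 2 * x + 1]
                      + src[(2 * y + 1) * w + 2 * x] + src[(2 * y + 1) * w + 2 * x + 1];
            dst[y * (w / 2) + x] = sum * 0.25f;
        }
    return dst;
}

int main() {
    // Tiny 4x2 "image" (one channel) standing in for a 4K render target.
    std::vector<float> img = {0.0f, 1.0f, 0.2f, 0.8f,
                              1.0f, 0.0f, 0.8f, 0.2f};
    auto out = downsample_2x2(img, 4, 2);
    std::printf("downsampled pixels: %.2f %.2f\n", out[0], out[1]); // 0.50 0.50
}
```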
 
I don't understand why Sony doesn't push FreeSync with AMD, even if only Sony themselves support it on their TVs.

Would be a nice selling point for the Sony console/TV combo.
Part of the reason is that variable frame rate games that run between 40-60fps are generally pretty rare on consoles relative to the entire library - in order for Freesync to work effectively, you really need more than 40fps. Most console games are 30 where it would have no benefit, and the 60fps ones are generally pretty close to that the majority of the time. Games that fluctuate between 40-60fps usually just get a 30fps lock from the dev.

There are exceptions of course, but it's far more applicable on the PC as potentially every modern 3D game can be made to run at a variable framerate of 40-60fps if you just jack up (or lower) the rendering load, freesync/gsync allows the user to enjoy better gfx without a significant increase in stutter. When you take that control away from the end-user and put it in the hands of the developer, these variable sync technologies have less of a value if only a handful of games would show a benefit. For a PC, almost every game can show the benefits.
 