
Inside PlayStation 4 Pro: How Sony made the first 4K games console

I have no idea what Cerny is technically talking about, but I'm gonna trust his word on something he designed versus some NeoGAF techies.

If he says something is useful for game performance, I'll take his word for it. Dunno what you guys are even arguing about; as a consumer I'm much more interested in the UI, the controller, and actually playing the games.
 

onQ123

Member
Your arm must be sore from clapping yourself on the back ;)

Not yet :) I had to deal with people talking down to me for months, but I haven't heard a peep out of them since Cerny did his talk.

Yeah, I was interested in this ever since you first mentioned it. BTW, the PS4 is not capable of FP16, right? This is exclusively a Pro feature until Scorpio comes around?


PS4 can use FP16, but it will still be 1.84 TF. And no, it's not exclusive to the PS4 Pro: the Nintendo Switch will most likely have a GPU with FP16 as its peak flop number, and most mobile devices have higher FP16 throughput than FP32.
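For anyone wanting to sanity-check those peak-flop numbers, here's a back-of-envelope sketch in Python. The shader counts and clocks (1152 lanes @ 800 MHz for the base PS4, 2304 lanes @ 911 MHz for the Pro) are the commonly cited figures, not from this thread, so treat it as illustration only:

```python
# Back-of-envelope peak-FLOP math: shader lanes * clock * ops per cycle.
# An FMA (fused multiply-add) counts as 2 ops; double-rate FP16 packs
# two operations per lane, so it counts as 4.

def peak_tflops(shaders, clock_ghz, ops_per_cycle=2):
    """Peak throughput in TFLOPS for a GPU's shader array."""
    return shaders * clock_ghz * ops_per_cycle / 1000.0

ps4 = peak_tflops(1152, 0.8)          # base PS4: 18 CUs * 64 lanes @ 800 MHz
pro_fp32 = peak_tflops(2304, 0.911)   # PS4 Pro: 36 CUs * 64 lanes @ 911 MHz
pro_fp16 = peak_tflops(2304, 0.911, ops_per_cycle=4)  # double-rate FP16

print(f"PS4 FP32:   {ps4:.2f} TF")       # ~1.84 TF, as quoted above
print(f"Pro FP32:   {pro_fp32:.2f} TF")  # ~4.20 TF
print(f"Pro FP16x2: {pro_fp16:.2f} TF")  # ~8.40 TF
```

This is also why mobile parts advertise their FP16 number: halving precision doubles the headline figure without any extra lanes.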
 
So you're saying Pro games' framerates will be held back to prevent an advantage over PS4 users?

BF1 will have the same framerate drops as the PS4 to prevent this? That sounds completely ridiculous to me.

Not quite. The PS4/Pro versions of multiplayer games will have the same framerate target, but due to the 1.33x higher-clocked CPU and a GPU about 2.3x as powerful, the Pro will stick to those framerate targets much better than the original PS4.
For example, if a game runs consistently in the mid-40s on the PS4 and is CPU-limited, then the Pro version should be able to hit 60, or very close to 60, consistently.
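The CPU-limited case above is just proportional arithmetic. A quick sketch (the 1.6 GHz base and 2.1 GHz Pro clock figures are the commonly reported ones, assumed here):

```python
# Rough sketch of the CPU-limited framerate argument (assumed clock figures).
base_fps = 45.0          # CPU-limited mid-40s on the base PS4
cpu_speedup = 2.1 / 1.6  # Pro CPU at 2.1 GHz vs base 1.6 GHz = 1.3125x
est_fps = base_fps * cpu_speedup

print(f"Estimated Pro framerate if purely CPU-bound: {est_fps:.1f} fps")  # ~59.1
```

Close enough to 60 that a slightly smaller workload or the GPU uplift covers the rest, which is the claim being made.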
 

onQ123

Member
Geometry rendering seems like a perfect fit for The Witness if they are not able to hit 4K native.


1440p seems like a crazy choice after Sony basically asked devs not to go for 1440p.
 
Not yet :) I had to deal with people talking down to me for months, but I haven't heard a peep out of them since Cerny did his talk.




PS4 can use FP16, but it will still be 1.84 TF. And no, it's not exclusive to the PS4 Pro: the Nintendo Switch will most likely have a GPU with FP16 as its peak flop number, and most mobile devices have higher FP16 throughput than FP32.
I believe in u onq
 
Not yet :) I had to deal with people talking down to me for months, but I haven't heard a peep out of them since Cerny did his talk.




PS4 can use FP16, but it will still be 1.84 TF. And no, it's not exclusive to the PS4 Pro: the Nintendo Switch will most likely have a GPU with FP16 as its peak flop number, and most mobile devices have higher FP16 throughput than FP32.

And they didn't even lift your Junior status!
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
Geometry rendering seems like a perfect fit for The Witness if they are not able to hit 4K native.


1440p seems like a crazy choice after Sony basically asked devs not to go for 1440p.

They were referring to checkerboarding from 1440p, weren't they? Not downsampling to 1080p? Because they referenced 4K displays.
 
And they didn't even lift your Junior status!

His junior status wasn't even from that.

I don't really have a dog in this fight, especially when it comes to tech (since I don't know much about the stuff to begin with), but I rooted for onQ simply because the way people talked down to him was so arrogant, rude and condescending that it pushed me to his side.

Whether he was right or not never really mattered in my mind, since I didn't care either way, but the way people talked to him was so irritating that I'm glad he got some level of "payback".

There are constructive ways to discuss rather than resorting to insults.
 

TronLight

Everybody is Mikkelsexual
so an extra gig of "slow" ram and a "primitive" discard accelerator...
the age of sony is done.
It's not "primitive" as in "basic", but as in a geometric primitive, like a vertex or triangle.

Edit: I didn't notice this topic is 13 pages long.
 

dr_rus

Member
That's not what the word artificial means in my context, so don't tell me what it means. You sarcastically asked if that was what I meant and I said yes, so I don't know what you're still going on about.



I chose my words carefully: I said Nvidia artificially restricts them in their consumer cards. That is a segmentation made by Nvidia to sell pro cards that have higher margins. They've been doing that for compute features in their consumer cards.

OK, after reading the link that ethomaz quoted to support his claim:

Anandtech




Clearly the author of the article didn't know nearly enough to make that claim.

Again, there's nothing artificial in not supporting FP16x2 on consumer GPUs. Contrary to popular belief, FP16 isn't in fact just a half of FP32, and supporting it with 2x throughput does require additional hardware, which would lead to more heat, lower clocks and bigger dies for consumer GPUs, for precisely no gain at all in their main gaming market, as there are no PC games using FP16 math at the moment.

There's no "artificial" in any of your quotes regarding FP16. He only says "artificial" when talking about FP64 limitations on cards which actually had the FP64 h/w in their GPUs but had it "artificially" limited with s/w (this is mostly about GK110-based 780 cards). This isn't the case for either FP16 or FP64 when it comes to Pascal consumer cards; the decision not to include fast support for these precision modes is not "artificial" by any means.

And again, FP16 isn't slow for gaming usage on any consumer Pascal card; it's just not twice as fast, running at FP32 throughput exactly as it has since G80. What we're talking about is native FP16 support, which is limited to 1 FP16x2 SP per SM in GP10x cards - but that functionality is exposed only to CUDA applications, as its presence there is strictly for CUDA code debugging, not for actual usage in games.
 

DjRalford

Member
Just read a rundown of the FP16 stuff and the multi-resolution VR stuff; there'll be a few people eating humble pie when Pro-mode outputs are streamlined.
 

onQ123

Member
They were referring to checkerboarding from 1440p, weren't they? Not downsampling to 1080p? Because they referenced 4K displays.

No, he said that they weren't going to use checkerboard rendering and that they would render at 1440p and upscale to 4K.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
No, he said that they weren't going to use checkerboard rendering and that they would render at 1440p and upscale to 4K.

No, I'm talking about the quote where Sony supposedly discourages 1440p.

If downsampling to 1080p, it would still make a significant difference.
 

onQ123

Member
And they didn't even lift your Junior status!

His junior status wasn't even from that.

I don't really have a dog in this fight, especially when it comes to tech (since I don't know much about the stuff to begin with), but I rooted for onQ simply because the way people talked down to him was so arrogant, rude and condescending that it pushed me to his side.

Whether he was right or not never really mattered in my mind, since I didn't care either way, but the way people talked to him was so irritating that I'm glad he got some level of "payback".

There are constructive ways to discuss rather than resorting to insults.


They Juniored me for making 3 or 4 4K gaming threads over the span of 2 months.

You can check my thread history and you will not find anything worth getting Jr'ed for. These are the threads that got me Jr'ed:

Could 4K take PC Gamers out of the Desk Space Ghetto & into the Comfy Couch Suburbs?

4K Support : could it have a influence on which Next Gen Console you buy?

4K Video Gaming is already here (Toshiba Video Demo )
 

onQ123

Member
No, I'm talking about the quote where Sony supposedly discourages 1440p.

If downsampling to 1080p, it would still make a significant difference.


No, they were telling devs to use more efficient rendering techniques to get closer to 4K, rather than using 1440p the old way.
 

Adam M

Member
I see people asking about the best SSD and many mentioning Samsung.

Yes, Samsung is pretty much the de facto brand for SSDs.

However.....


Don't waste your money getting the best SSD. Save that for a PC or laptop.

Get a decent SSD, as it will be many times faster than the HDD that ships with the console. Crucial or Mushkin brand SSDs are easy enough to recommend.

http://www.newegg.com/Product/Product.aspx?Item=N82E16820226596

Of course, everyone here is assuming that an SSD will even provide a worthwhile benefit, despite the fact that it provided a negligible benefit on the current PS4. Even on SATA2 an SSD should have blown away the stock HDD, but it didn't.

First we need benchmarks to know if the PS4 Pro can achieve good SATA3 speeds; if it can do about 500 MB/s read/write, it could be a killer machine for the money.

I'm also really curious what non-patched games will look like if the PS4 Pro upscales instead of the TV.
 

ethomaz

Banned
Yeah, I was interested in this ever since you first mentioned it. BTW, the PS4 is not capable of FP16, right? This is exclusively a Pro feature until Scorpio comes around?
It is capable, but not at 2x FP32 speed... so if FP16 is slower than or the same speed as FP32, you are better off staying with FP32.

There is no advantage to using FP16 on PC, PS4 or XB1.

Polaris added a native FP16 unit that runs at the same speed as FP32, which gives no advantage over FP32 at all.

The Pro looks to use Vega's native FP16 support, which runs 2x faster than FP32, and that can give some advantage over FP32.
 
I don't think I could handle having less than 1 TB of storage; 2 TB would be better, regardless of improvements to loading times.

Those cost more than the Pro, right?
 
They Juniored me for making 3 or 4 4K gaming threads over the span of 2 months


You can check my thread history & you will not find anything worth getting Jr'ed for & these are the threads that got me Jr'ed


Could 4K take PC Gamers out of the Desk Space Ghetto & into the Comfy Couch Suburbs?

4K Support : could it have a influence on which Next Gen Console you buy?

4K Video Gaming is already here (Toshiba Video Demo )

Ha! Damn, man. Guess you were too far ahead of your time trying to start a discussion around 4K :)
 

Tripolygon

Banned
Again, there's nothing artificial in not supporting FP16x2 on consumer GPUs. Contrary to popular belief, FP16 isn't in fact just a half of FP32, and supporting it with 2x throughput does require additional hardware, which would lead to more heat, lower clocks and bigger dies for consumer GPUs, for precisely no gain at all in their main gaming market, as there are no PC games using FP16 math at the moment.
Again, we will have to agree to disagree. The rest of your drivel has nothing to do with what I'm talking about. There is no popular belief that FP16 is half of FP32, and supporting it does not mean more heat or lower clocks. It is all about market segmentation, not making consumer cards simpler or more efficient. The Maxwell Titan X and Tesla M40 both have similar clocks, are both rated for 7 TF FP32, have the same FP16/FP64 performance, and have similar TDPs. The only difference is that one costs almost $5000 while the other costs $1500.
Anandtech said:
As for why NVIDIA would want to make FP16 performance so slow on Pascal GeForce parts, I strongly suspect that the Maxwell 2 based GTX Titan X sold too well with compute users over the past 12 months, and that this is NVIDIA’s reaction to that event. GTX Titan X’s FP16 and FP32 performance was (per-clock) identical its Tesla equivalent, the Tesla M40, and furthermore both cards shipped with 12GB of VRAM. This meant that other than Tesla-specific features such as drivers and support, there was little separating the two cards.

There's no "artificial" in any of your quotes regarding FP16. He only says "artificial" when talking about FP64 limitations on cards which actually had the FP64 h/w in their GPUs but had it "artificially" limited with s/w (this is mostly about GK110-based 780 cards). This isn't the case for either FP16 or FP64 when it comes to Pascal consumer cards; the decision not to include fast support for these precision modes is not "artificial" by any means.
No, what he is saying is that Nvidia has either cut hardware from consumer GPUs and/or artificially restricted performance in order to create differences between pro and consumer cards, using FP64 as an example. This is what I'm calling an artificial restriction.
Anandtech said:
However in the case of FP64, performance has never been slower than 1/32, whereas with FP16 we’re looking at a much slower 1/128 instruction rate. Either way, the end result is that like GP104’s FP64 support, GP104’s FP16 support is almost exclusively for CUDA development compatibility and debugging purposes, not for performant consumer use.

And again, FP16 isn't slow for gaming usage on any consumer Pascal card; it's just not twice as fast, running at FP32 throughput exactly as it has since G80. What we're talking about is native FP16 support, which is limited to 1 FP16x2 SP per SM in GP10x cards - but that functionality is exposed only to CUDA applications, as its presence there is strictly for CUDA code debugging, not for actual usage in games.
Again, this has nothing to do with what I'm talking about. Whether it's slow or fast for gaming usage has ZERO bearing on my argument. My whole point is that Nvidia artificially restricts performance on their consumer cards compared to their pro cards. Whether via hardware or software, it is an artificial segmentation done by Nvidia to sell pro cards. I think I'm done repeating myself.
 

onQ123

Member
It is capable, but not at 2x FP32 speed... so if FP16 is slower than or the same speed as FP32, you are better off staying with FP32.

There is no advantage to using FP16 on PC, PS4 or XB1.

Polaris added a native FP16 unit that runs at the same speed as FP32, which gives no advantage over FP32 at all.

The Pro looks to use Vega's native FP16 support, which runs 2x faster than FP32, and that can give some advantage over FP32.

There are some advantages to using FP16 on Polaris. Did you even read the article that's in the OP?

"Finally, there's better support of variables such as half-floats. To date, with the AMD architectures, a half-float would take the same internal space as a full 32-bit float. There hasn't been much advantage to using them. With Polaris though, it's possible to place two half-floats side by side in a register, which means if you're willing to mark which variables in a shader program are fine with 16-bits of storage, you can use twice as many. Annotate your shader program, say which variables are 16-bit, then you'll use fewer vector registers."

The enhancements in PS4 Pro are also geared to extracting more utilisation from the base AMD compute units.

"Multiple wavefronts running on a CU are a great thing because as one wavefront is going out to load texture or other memory, the other wavefronts can happily do computation. It means your utilisation of vector ALU goes up," Cerny shares.

"Anything you can do to put more wavefronts on a CU is good, to get more running on a CU. There are a limited number of vector registers so if you use fewer vector registers, you can have more wavefronts and then your performance increases, so that's what native 16-bit support targets. It allows more wavefronts to run at the same time."
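The register-packing idea Cerny describes (two 16-bit values sharing the space of one 32-bit value) can be illustrated outside shader code. A small Python sketch using the `struct` module's IEEE 754 half-precision format (the values chosen are arbitrary):

```python
import struct

# Two half-floats fit in the storage of one 32-bit float -- the same
# space saving that frees up vector registers for extra wavefronts.
two_halves = struct.pack('<2e', 1.5, -0.25)  # two FP16 values ('e' = binary16)
one_float = struct.pack('<f', 1.5)           # one FP32 value

print(len(two_halves), len(one_float))       # both are 4 bytes

# Round-tripping shows the reduced precision you accept in exchange:
roundtrip = struct.unpack('<e', struct.pack('<e', 0.1))[0]
print(roundtrip)                             # 0.0999755859375, not 0.1
```

That precision loss is why the article says the developer has to annotate which shader variables are "fine with 16 bits of storage" rather than the compiler packing everything automatically.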
 

ethomaz

Banned
There are some advantages to using FP16 on Polaris. Did you even read the article that's in the OP?
That's cool, less internal memory use, but I don't think that will make devs on PC move... maybe after Vega, and after nVidia does a similar solution.

I just think PC devs want to maintain compatibility in most cases, and that makes them avoid specific GPU features.

It will most probably be exclusive games on the Pro using it.
 

onQ123

Member
That's cool, less internal memory use, but I don't think that will make devs on PC move... maybe after Vega, and after nVidia does a similar solution.

I just think PC devs want to maintain compatibility in most cases, and that makes them avoid specific GPU features.

It will most probably be exclusive games on the Pro using it.


I can't remember where it is right now, but I read about Microsoft doing something with DirectX 12 where it would choose the least amount of FP precision needed for the process.
 

ethomaz

Banned
I can't remember where it is right now, but I read about Microsoft doing something with DirectX 12 where it would choose the least amount of FP precision needed for the process.
You mean it chooses the best option based on the GPU it is running on? Like FP16 for Polaris and FP32 for Pascal, if you say the minimum is FP16.

That is so good if true.

I remember years ago OpenGL set everything to FP32 when you chose float... maybe they added more custom options.
 
Would this extra 1 GB of RAM, the DDR3 one, actually make things snappy/responsive each time you hit the PS button to get to the home screen when you're in the middle of a game? Things are sluggish and then smooth out on the base PS4!
 

Frostman

Member
I'm not a seriously techy guy when it comes to this stuff, but I have a question, haha.

To support 4K 60Hz you require HDMI 2.0. How will downsampling work if a 1080p monitor doesn't support HDMI 2.0? Does it or doesn't it?

Because going by this, how will the PS4 Pro send the 4K data if the monitor is not compatible? Or does the Pro detect the 1080p display, downsample the picture itself, then send it to the monitor?

Sorry for being a dumbass.
 

Tripolygon

Banned
I'm not a seriously techy guy when it comes to this stuff, but I have a question, haha.

To support 4K 60Hz you require HDMI 2.0. How will downsampling work if a 1080p monitor doesn't support HDMI 2.0? Does it or doesn't it?

Because going by this, how will the PS4 Pro send the 4K data if the monitor is not compatible? Or does the Pro detect the 1080p display, downsample the picture itself, then send it to the monitor?

Sorry for being a dumbass.
The TV is not doing the downsampling; the console renders at the higher resolution, downsamples internally, and sends a regular 1080p signal out over HDMI.

Edit: And no, it's not a dumb question. Trust me when I say a good 90% of us in this thread, myself included, only have a vague idea of what's in the OP.
 

onQ123

Member
You mean it chooses the best option based on the GPU it is running on? Like FP16 for Polaris and FP32 for Pascal, if you say the minimum is FP16.

That is so good if true.

I remember years ago OpenGL set everything to FP32 when you chose float... maybe they added more custom options.

Found it

https://msdn.microsoft.com/en-us/library/windows/desktop/hh968108(v=vs.85).aspx

Using HLSL minimum precision

Starting with Windows 8, graphics drivers can implement minimum precision HLSL scalar data types by using any precision greater than or equal to their specified bit precision. When your HLSL minimum precision shader code is used on hardware that implements HLSL minimum precision, you use less memory bandwidth and as a result you also use less system power.
You can query for the minimum precision support that the graphics driver provides by calling ID3D11Device::CheckFeatureSupport with the D3D11_FEATURE_SHADER_MIN_PRECISION_SUPPORT value. For more info, see HLSL minimum precision support.




Declare variables with minimum precision data types


To use minimum precision in HLSL shader code, declare individual variables with types like min16float (min16float4 for a vector), min16int, min10float, and so on. With these variables, your shader code indicates that it doesn’t require more precision than what the variables indicate. But hardware can ignore the minimum precision indicators and run at full 32-bit precision. When your shader code is used on hardware that takes advantage of minimum precision, you use less memory bandwidth and as a result you also use less system power as long as your shader code doesn’t expect more precision than it specified.

You don't need to author multiple shaders that do and don't use minimum precision. Instead, create shaders with minimum precision, and the minimum precision variables behave at full 32-bit precision if the graphics driver reports that it doesn't support any minimum precision. HLSL minimum precision shaders don't work on operating systems earlier than Windows 8 so if you plan to target earlier operating systems, you'll need to author multiple shaders, some that do and others that don't use minimum precision.
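The "hint the driver may honour or ignore" contract from that excerpt can be sketched in plain Python. The function name and flag below are illustrative, not part of any real API; half-precision rounding is emulated with `struct`:

```python
import struct

def to_min16(x, driver_supports_min_precision):
    """Mimic the HLSL min16float contract: the driver MAY round the value
    through 16-bit precision, or MAY silently keep full 32-bit precision.
    Both behaviours are legal, so shaders must tolerate either."""
    if driver_supports_min_precision:
        # round through IEEE 754 half precision, as FP16 hardware would
        return struct.unpack('<e', struct.pack('<e', x))[0]
    return x  # driver ignores the hint and runs at full precision

lo = to_min16(3.14159, True)    # precision reduced to FP16
hi = to_min16(3.14159, False)   # hint ignored, value untouched
print(lo, hi)                   # 3.140625 3.14159
```

This is why a single shader suffices: code written against `min16float` keeps working, only faster and with less bandwidth, wherever the driver actually implements the reduced precision.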
 
I'm not a seriously techy guy when it comes to this stuff, but I have a question, haha.

To support 4K 60Hz you require HDMI 2.0. How will downsampling work if a 1080p monitor doesn't support HDMI 2.0? Does it or doesn't it?

Because going by this, how will the PS4 Pro send the 4K data if the monitor is not compatible? Or does the Pro detect the 1080p display, downsample the picture itself, then send it to the monitor?

Sorry for being a dumbass.
Ding.
 

JohnnyFootball

GerAlt-Right. Ciriously.
First we need benchmarks to know if the PS4 Pro can achieve good SATA3 speeds; if it can do about 500 MB/s read/write, it could be a killer machine for the money.

I'm also really curious what non-patched games will look like if the PS4 Pro upscales instead of the TV.

Yes, that's exactly what my first post was saying! An SSD on the current model PS4 does not yield dramatic improvements. Even on a SATA2 bus the improvements should have been much greater.

I hope that the PS4 Pro fixes this.

While it's known that The Witcher 3 will NOT get a PS4 Pro patch, if an SSD in the Pro performs up to its potential and can dramatically reduce the load times in The Witcher 3 and Fallout 4, it easily becomes the best way to play on a console.

Given that the PS4 Pro is using a SATA3 bus, I am actually confident that will be a possibility. If not, then why bother spending the extra $$ for SATA3 when they could have stuck with SATA2?
 

ethomaz

Banned
I think that is the biggest issue for devs...

HLSL minimum precision shaders don't work on operating systems earlier than Windows 8 so if you plan to target earlier operating systems, you'll need to author multiple shaders, some that do and others that don't use minimum precision.
To use this feature they will be tied to Windows 8+, unless they create two different shader code paths.

I think most will prefer the old way, with everything declared as FP32, which runs the same on older OSes (Windows 7 is still popular) and newer ones.

They will start to use this fully when the market moves off Windows 7.
 

Flandy

Member
Long shot, but do we know if the Pro will give an option to output at 1440p?
Would be nice to still get the benefits of downsampling while playing at my native resolution.
 

Aceofspades

Banned
Not yet :) I had to deal with people talking down to me for months, but I haven't heard a peep out of them since Cerny did his talk.


You were right all along; I have to admit that.
Since I'm not as knowledgeable about tech as some people here, I didn't have a conflict of opinion with you, but I admit I showed some discomfort every time you said "uprendering" and "double precision", both of which ended up being very true.

So I apologize for that :)
 
Long shot, but do we know if the Pro will give an option to output at 1440p?
Would be nice to still get the benefits of downsampling while playing at my native resolution.

IDK, I don't think that will be an option. You can't select 900p on a 900p monitor, so that's why I have doubts. (Responded to you here too in case you aren't checking the other thread.)
 

Flandy

Member
IDK, I don't think that will be an option. You can't select 900p on a 900p monitor, so that's why I have doubts. (Responded to you here too in case you aren't checking the other thread.)
Guess I'll just have to hope my monitor accepts a 4K signal without issue lol
 
So, when are we expecting more info from Sony? I presume the Pro enhanced games are going to be demonstrated more before launch.
 

Ozorov

Member
So, when are we expecting more info from Sony? I presume the Pro enhanced games are going to be demonstrated more before launch.

"There's going to be plenty of articles about the Pro games soon, so you can make an informed decision as to whether the differences are worth it. And whether you get all the FASA, LOG or whatever other nonsense it is, those articles should make it pretty clear."
 

Aceofspades

Banned
How does a PS4 Pro compare to the best PC has to offer?

I guess mid-range going by pure specs, but in practice the Pro is really punching above its weight (I didn't expect the large number of 4K games we've got so far, and the checkerboard rendering technique is providing spectacular results).

If you go by results, I'd say mid-high.

Pure guesses, though.
 