
PS4 PRO: for best IQ (Native 4K + HDR + 60 Hz + 4:4:4), 2160p-YUV420 or 2160p-RGB?

Aceofspades

Banned
Quoting myself from 2 months ago..

This reminds me of "HD Ready" TVs in 2006. Best advice is to wait for newer generations of displays next year.

I'll stick to my 1080p for my Pro.

It's truly just like 2006 to me: at the time, TVs were advertised as "HD Ready" when in fact they were only 720p. Better to wait for the next wave of displays next year and hopefully better standardization of the tech.

EDIT: Top of the page :(
 
Looking forward to testing the Pro this Thursday on my Sony 55X8509c. It might not be the best 4K TV for HDR, being a little older now, but it will do the job decently enough for now, and the 4K picture quality is awesome.

I guess what we need to do is find a 60fps 4K HDR game (CoD Infinite Warfare), then do a test of the same cutscenes in RGB, then YUV. That'll give us our answer.
 

Alej

Banned
So what would you choose on your PS4?

RGB or YUV?

Automatic is always an unknown quantity... as it may choose the wrong setting (like full range / limited)

I would choose automatic.
Or RGB444 if automatic oddly produces chroma subsampling.

The X83C doesn't support HDR or 10-bit natively. You can force HDR10 via content on, for example, a USB drive, though it won't work on a PS4 Pro.
 
I guess what we need to do is find a 60fps 4K HDR game (CoD Infinite Warfare), then do a test of the same cutscenes in RGB, then YUV. That'll give us our answer.


COD Infinite Warfare isn't native 4K, but checkerboarded 2160p. You will notice some artefacts here and there. Your best bet is to try NBA 2K17, which has everything, though no TV can take full advantage of it.
 
First person to do a test with a native 4K HDR game... on a TV that supports 4K60 HDR... on the two settings RGB / YUV gets a big biscuit.
 

Alej

Banned
COD Infinite Warfare isn't native 4K, but checkerboarded 2160p. You will notice some artefacts here and there. Your best bet is to try NBA 2K17, which has everything, though no TV can take full advantage of it.

We're talking about the output, not the internal res of any game.

The internal res doesn't change anything here if you output a 4K signal via HDMI in the end.

No HDR10 and 444RGB simultaneously. Even on a 1080p game.
 

Alej

Banned
First person to do a test with a native 4K HDR game... on a TV that supports 4K60 HDR... on the two settings RGB / YUV gets a big biscuit.

This is simple: you choose between HDR10 at 4:2:0 chroma (resulting in some subtle blur) or HDR8 at 4:4:4 chroma (which could hypothetically cause some banding).

BTW, you probably won't notice anything unless you're in person in front of the TV doing the test. This won't translate well to screenshots or off-screen pics.
 
We're talking about the output, not the internal res of any game.

The internal res doesn't change anything here if you output a 4K signal via HDMI in the end.

No HDR10 and 444RGB simultaneously. Even on a 1080p game.

This is simple: you choose between HDR10 at 4:2:0 chroma (resulting in some subtle blur) or HDR8 at 4:4:4 chroma (which could hypothetically cause some banding).

I know. I just wanted to clarify that you may get some artefacts that some may think are caused by a specific mode. To avoid such a situation, NBA 2K17 is the best example.
 

Alej

Banned
I know. I just wanted to clarify that you may get some artefacts that some may think are caused by a specific mode. To avoid such a situation, NBA 2K17 is the best example.

No it won't. Uncharted 4 at 1440p will output a 4K60p signal from the PS4 Pro, exactly like NBA 2K17.

If you don't force 1080p in the A/V settings.
 

Elios83

Member
It's an HDMI bandwidth limitation.
Basically, at 4K, 60fps and HDR, the current HDMI standard doesn't have enough bandwidth to transmit an uncompressed chroma signal.
So you need to switch from 4:4:4 (uncompressed chroma) to a compressed format where the chroma samples are not sent to the TV for every pixel (although the console calculates them) but are shared among pixels (4:2:0 is the heaviest compression: basically you get square blocks of four pixels all sharing the same chroma but with different luminance, since luminance is never compressed).
So basically the point is to stick to RGB (equivalent to 4:4:4).
For Pro I don't think this is a significant limitation, since titles running at native 4K + 60fps + HDR will really be just a few exceptions, and personally I'd choose color accuracy over HDR in those titles, especially if there's small text to read. But as always, things have to be tried first hand to decide the best settings.
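To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It ignores blanking intervals and protocol overhead and assumes the commonly quoted ~14.4 Gbps of usable data rate for HDMI 2.0 (18 Gbps link minus 8b/10b encoding), so treat the figures as approximate:

PIXELS_PER_SECOND = 3840 * 2160 * 60      # 4K UHD at 60 Hz
HDMI20_PAYLOAD_GBPS = 14.4                # approx. usable data rate of HDMI 2.0

# Average samples carried per pixel for each pixel format
SAMPLES_PER_PIXEL = {
    "RGB / 4:4:4": 3.0,   # full chroma for every pixel
    "YUV 4:2:2":   2.0,   # chroma shared between horizontal pairs of pixels
    "YUV 4:2:0":   1.5,   # chroma shared across 2x2 blocks
}

for fmt, samples in SAMPLES_PER_PIXEL.items():
    for bits in (8, 10):
        gbps = PIXELS_PER_SECOND * samples * bits / 1e9
        verdict = "fits" if gbps <= HDMI20_PAYLOAD_GBPS else "too much"
        print(f"{fmt:12} {bits:2}-bit: {gbps:5.1f} Gbps -> {verdict}")

Which lines up with the trade-off being discussed: 4:4:4 only fits at 8-bit, while 10-bit HDR has to drop to 4:2:2 or 4:2:0.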
 
No it won't. Uncharted 4 at 1440p will output a 4K60p signal from the PS4 Pro, exactly like NBA 2K17.

If you don't force 1080p in the A/V settings.

I didn't mean the artefacts of a non-native 4K game outputting a 4K signal; I was talking about the artefacts caused by checkerboarding. This is common knowledge. Read Digital Foundry's coverage.
 
It's an HDMI bandwidth limitation.
Basically, at 4K, 60fps and HDR, the current HDMI standard doesn't have enough bandwidth to transmit an uncompressed chroma signal.
So you need to switch from 4:4:4 (uncompressed chroma) to a compressed format where the chroma samples are not sent to the TV for every pixel (although the console calculates them) but are shared among pixels (4:2:0 is the heaviest compression: basically you get square blocks of four pixels all sharing the same chroma but with different luminance, since luminance is never compressed).
So basically the point is to stick to RGB (equivalent to 4:4:4).
For Pro I don't think this is a significant limitation, since titles running at native 4K + 60fps + HDR will really be just a few exceptions, and personally I'd choose color accuracy over HDR in those titles, especially if there's small text to read. But as always, things have to be tried first hand to decide the best settings.

Good post, but this is a sacrifice I wanted to avoid. Either we have to switch between modes each time according to whether a game has HDR support or not (and we don't know the changelogs of all games, so we don't know the full list of games that support HDR), or we have to rely on the automatic option, which I don't trust to choose the best mode each time.
 
You know that when you enable Game Mode, all the post-processing tools and supersampling effects are turned off, which isn't the case when playing movies. So you will get a noticeable loss of detail and IQ in Game Mode. This will be accentuated by the loss of colour grading and detail if you use 4:2:0 or HDR8. What would be the point of highlighting HDR and better colours in all the recent talk if we're stuck using last-gen modes?

Why are you lumping them together? As also shown in the chart, 4:2:0 is not "HDR8"; 4:2:0 can have 10-bit color, like on 4K Blu-ray.
Also, most people turn off or limit most post-processing and image enhancement when calibrating, even in movie modes.

People have also sampled PS4 Pro games in person at 10-bit 4K HDR 60fps and said it looks very good.
You're exaggerating; 4:2:0 can still look good.

Normally it is possible, yeah. I didn't know 4K Blu-rays are stuck at 4:2:0. I read that the available UHD Blu-ray players aren't real 4K ones and that they just use some upscaling and post-processing methods to enhance the picture to 4K, since for real native 4K content you'd need Blu-ray discs that can go up to 250 GB. Real 4K movies are around that size too.

I wonder if digital 4K content has the same limitations, but anyway, the PS4 Pro Media Player isn't updated yet to read 4K content.

Off-topic but:

Huh? UHD Blu-ray players are real 4K players; they use 3840x2160 video. You may be confusing this with what Hollywood originally considered 4K (4096x2160) compared to what has become the consumer 4K/UHD standard (3840x2160), or with those "4K upscaling Blu-ray players" that only play regular Blu-rays. You can't go by file size because UHD BDs use H.265/HEVC to get smaller file sizes. I believe one of the other reasons, in addition to what Chamber said, for choosing 4:2:0 is that the spec supports first-gen 4K TVs limited to HDMI 1.4, provided the TV supports the latest HDCP copy protection. This way, even people who can't get the new HDR, wide color gamut, etc., could seemingly still get at least 4K from a 4K Blu-ray player.
 

Wallach

Member
What? Game Mode basically just turns off all the enhancements that the TV does, in order to get the least input lag in games.

None of those enhancements are capable of improving image quality in the case of a raw digital source like this; in fact, they can pretty much only do the opposite.

You wouldn't want them enabled even if they didn't increase input lag when dealing with a source like a console or a PC.

The only thing you really want in this case is HDR.
 
I'd rather have the lowest input lag on my KS9000, which is with 4:2:0. Do I need to change any setting for this to happen, or will the automatic mode on the PS4 Pro do it for me?
 
I'd rather have the lowest input lag on my KS9000, which is with 4:2:0. Do I need to change any setting for this to happen, or will the automatic mode on the PS4 Pro do it for me?

I think to get 4:4:4 on a TV you need to enable HDMI UHD Color first. By not enabling it on your TV, you will avoid getting 4:4:4 in whatever mode the PS4 Pro offers.

Depends. If all three HDMI ports support 4K60 4:4:4, then you will need to ensure that HDMI UHD Color is turned off for the port you are plugging your Pro into; this should limit it to receiving 4:2:0 signals.

Yep.
 

jonno394

Member
I'd rather have the lowest input lag on my KS9000, which is with 4:2:0. Do I need to change any setting for this to happen, or will the automatic mode on the PS4 Pro do it for me?

Depends. If all three HDMI ports support 4K60 4:4:4, then you will need to ensure that HDMI UHD Color is turned off for the port you are plugging your Pro into; this should limit it to receiving 4:2:0 signals.
 

Gitaroo

Member
They should add 4:2:2 12-bit. Someone posted a comparison pic somewhere in this forum; 4:2:2 is much closer to 4:4:4, while 4:2:0 loses a lot of fine detail. Weird that I can't get my GeForce 970 to output 4:2:0 10-bit; I had to use 4:2:2 12-bit for Shadow Warrior 2 HDR.
 

BigEmil

Junior Member
Quoting myself from 2 months ago..



It's truly just like 2006 to me: at the time, TVs were advertised as "HD Ready" when in fact they were only 720p. Better to wait for the next wave of displays next year and hopefully better standardization of the tech.

EDIT: Top of the page :(

Worthy of being top of the page. People should listen to this advice and just wait it out on the TVs, IMO. You can still play the PS4 Pro on your 1080p TV for now.
 

McSpidey

Member
This is pretty simple in my view. If you're using a TV, use YUV; if you're using a PC monitor, use RGB. There are no HDR PC monitors or standards yet, only TV - that's why HDR10 is YUV - it was designed for video media to be viewed on a TV.

RGB on a TV at couch distance simply isn't worth the pain and misery, since it's non-standard, media by and large isn't designed or tested for it (all video is YUV), and now with 10-bit HDR signals it simply isn't supported.
 

etta

my hard graphic balls
Why the fuck is that naming so complex?
Why didn't they use 4K instead of 2160p? Do they expect the average Joe to know that 2160p is 4K? And what the hell is that YUV420 crap, which is even worse for most people, since they will see RGB and use that because it's familiar, but then HDR doesn't work?
 

Elios83

Member
This is pretty simple in my view. If you're using a TV, use YUV; if you're using a PC monitor, use RGB. There are no HDR PC monitors or standards yet, only TV - that's why HDR10 is YUV - it was designed for video media to be viewed on a TV.

RGB on a TV at couch distance simply isn't worth the pain and misery, since it's non-standard, media by and large isn't designed or tested for it (all video is YUV), and now with 10-bit HDR signals it simply isn't supported.

The problem is that, according to this:

http://i.imgur.com/hja7O1X.jpg

The Pro uses 4:2:0 for its YUV setting even when it's not required.
So no, you should use RGB.
 

dark10x

Digital Foundry pixel pusher
You mean: 4K @ 60Hz @ 4:4:4 + HDR drops to HDR8 instead of HDR10, and the only way to get 4K @ 60Hz + HDR10 is 4K @ 60Hz @ 4:2:0 + HDR10 through 2160p - YUV420?

So it means that even the latest TVs don't allow 4K @ 60Hz @ 4:4:4 + HDR10, and we should wait for newer TVs to get such a configuration, and then we can use 2160p - RGB?
RTINGS has really confused so many people with that diagram.

Yes, you can use HDR with 8-bit color, which means 4:4:4 at 4K60 + HDR is possible. It simply results in very noticeable color banding in many areas of the image (depending on how much the material takes advantage of this feature).

That said, I think you should really give it a look before condemning it. I would prefer RGB, sure, but 4:2:0 on a 4K display looks far better than you'd expect.
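If you want a feel for why 8-bit shows banding, here's a tiny illustrative sketch; nothing PS4-specific, and real HDR pipelines use the PQ transfer curve, so this just counts quantization steps on a plain ramp:

def distinct_steps(bit_depth, samples=100_000):
    # Quantize a smooth 0..1 ramp to the given bit depth and count the
    # distinct code values that survive.
    levels = 2 ** bit_depth - 1
    return len({round(i / samples * levels) for i in range(samples + 1)})

print("8-bit: ", distinct_steps(8), "steps")    # 256
print("10-bit:", distinct_steps(10), "steps")   # 1024, four times finer gradations

With HDR stretching those steps across a much wider brightness range, the 8-bit case is where the banding becomes visible.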

Why the fuck is that naming so complex?
Why didn't they use 4K instead of 2160p? Do they expect the average Joe to know that 2160p is 4K? And what the hell is that YUV420 crap, which is even worse for most people, since they will see RGB and use that because it's familiar, but then HDR doesn't work?
It's even worse since the PSVR processing unit only supports 8-bit 4:2:0. If you own both, you pretty much need to disconnect it when playing regular content if you want improved image quality.
 

McSpidey

Member
The problem is that, according to this:

http://i.imgur.com/hja7O1X.jpg

The Pro uses 4:2:0 for its YUV setting even when it's not required.
So no, you should use RGB.

I know what it does, but I also know the problems that using RGB on TVs causes, so I stand by recommending YUV on a TV. If Sony mistakenly set things up to use the wrong YUV format at some resolutions where it's unnecessary, they should fix that, obviously.
 

Alej

Banned
The biggest issue I have with this thread is:
"How do you know 2160p-RGB is 444 only?"

Because, by default and by common sense, and as HDR and Deep Colour are different sub-settings, you would think you'd get chroma subsampling automatically with the "2160p-RGB" mode while playing an HDR10 source.

No HDR setting and No HDR source = rgb444
HDR setting but No HDR source = rgb444
HDR setting and HDR source = yuv420

I don't think the 2160p setting, RGB or YUV, is related to this. It is obviously for old UHD TVs without RGB444 support altogether.

Edit: I'd forgotten the YUV and RGB prefixes, which was causing some confusion.
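Restating that hypothesis as a lookup table (this is speculation from the thread, not confirmed PS4 Pro behaviour):

# Presumed output for each combination of the HDR sub-setting and the source.
presumed_output = {
    ("HDR setting off", "SDR source"): "RGB 4:4:4",
    ("HDR setting on",  "SDR source"): "RGB 4:4:4",
    ("HDR setting on",  "HDR source"): "YUV 4:2:0",
}
print(presumed_output[("HDR setting on", "HDR source")])   # -> YUV 4:2:0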
 
The biggest issue I have with this thread is:
"How do you know 2160p-RGB is 444 only?"

Because, by default and by common sense, and as HDR and Deep Colour are different sub-settings, you would think you'd get chroma subsampling automatically with the "2160p-RGB" mode while playing an HDR10 source.

No HDR setting and No HDR source = 444
HDR setting but No HDR source = 444
HDR setting and HDR source = 420

I don't think the 2160p setting, RGB or YUV, is related to this. It is obviously for old UHD TVs without RGB444 support altogether.

This is a good question.
It may do this switch on its own, even in the RGB setting, just when it detects HDR.
That would make it the same as choosing automatic and the system picking the RGB setting for you when it reads that your TV supports it (which may then already adjust itself when detecting HDR).
 

Elios83

Member
The biggest issue I have with this thread is:
"How do you know 2160p-RGB is 444 only?"

In RGB you don't have luminance and chroma as separate entities.
It's a different color space where the coordinates are just the relative intensities of three primaries.
To get chroma subsampling in RGB, you would first need to convert the signal into another color space like YUV, assuming specific luminance and chroma values for the RGB primaries (and here there's already a standardization problem, because in fact every TV uses slightly different RGB primaries), do the chroma subsampling, then reconvert to RGB and send the signal to the TV.
I think it's very unlikely they're doing this.
RGB is equivalent to uncompressed chroma in YUV as far as I know.
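To make the luma/chroma split concrete, here's a minimal sketch using the BT.709 coefficients with full-range values; purely illustrative, not what any console actually does internally:

def rgb_to_ycbcr(r, g, b):
    # BT.709 luma plus colour-difference channels (full range, 0..1 values)
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

# A 2x2 block of slightly different reds
block = [(1.0, 0.2, 0.2), (0.9, 0.3, 0.2), (1.0, 0.2, 0.3), (0.8, 0.2, 0.2)]
ycbcr = [rgb_to_ycbcr(*px) for px in block]

# 4:2:0 keeps per-pixel luma but only one chroma pair for the whole block
avg_cb = sum(c[1] for c in ycbcr) / 4
avg_cr = sum(c[2] for c in ycbcr) / 4

print("per-pixel chroma:", [(round(cb, 3), round(cr, 3)) for _, cb, cr in ycbcr])
print("4:2:0 chroma:    ", (round(avg_cb, 3), round(avg_cr, 3)), "shared by all four pixels")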


I know what it does, but I also know the problems that using RGB on TVs causes, so I stand by recommending YUV on a TV. If Sony mistakenly set things up to use the wrong YUV format at some resolutions where it's unnecessary, they should fix that, obviously.

You should try RGB as well, because YUV420 could be very messy in games with lots of small text to read, IMO.
 
Why the fuck is that naming so complex?
Why didn't they use 4K instead of 2160p? Do they expect the average Joe to know that 2160p is 4K? And what the hell is that YUV420 crap, which is even worse for most people, since they will see RGB and use that because it's familiar, but then HDR doesn't work?
The naming is complex because precision is needed. For people who care, they want to know exactly what the settings do. For people like you who don't understand the terminology (e.g. incorrectly assuming that 2160p and 4K are the same), there's no need to ever even look at this stuff. Leave it on "automatic" and forget about it.
 
In RGB you don't have luminance and chroma as separate entities.
It's a different color space where the coordinates are the relative intensities of three primaries.
To get chroma subsampling in RGB, you would first need to convert the signal into another color space like YUV, assuming specific luminance and chroma values for the RGB primaries (and there's already a standardization problem there), do the subsampling, then reconvert to RGB and send the signal to the TV.
I think it's very unlikely they're doing this.
RGB is equivalent to uncompressed chroma in YUV as far as I know.

I think his point is that the RGB setting may "smartly" drop to 4:2:0 on its own just when it detects HDR (in the third scenario). Just like a TV can auto-activate its HDR mode.
 

etta

my hard graphic balls
The naming is complex because precision is needed. For people who care, they want to know exactly what the settings do. For people like you who don't understand the terminology (e.g. incorrectly assuming that 2160p and 4K are the same), there's no need to ever even look at this stuff. Leave it on "automatic" and forget about it.
So 2160p is not the same as the 4K they've been advertising?
 

Elios83

Member
I think his point is that the RGB setting may drop to 4:2:0 on its own just when it detects HDR (in the third scenario). Just like a TV can auto-activate its HDR mode.

But in that case it isn't RGB anymore; he should say that it auto-switches to YUV420.
I guess that to know precisely how the console behaves, things simply need to be tested hands-on; otherwise we can only speculate.

So 2160p is not the same as the 4K they've been advertising?

No, it's the same difference as between 1080p and 2K.
4K is mostly a cinematographic format with a resolution of 4096 × 2160 pixels; it is not a 16:9 format.
The 4K used in current TVs is more precisely called 4K UHD or 2160p, a resolution of 3840 × 2160 pixels, a perfect 16:9 format just like the aspect ratio of the TVs.
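The raw numbers, for reference:

for name, (w, h) in {"DCI 4K": (4096, 2160), "4K UHD / 2160p": (3840, 2160)}.items():
    print(f"{name}: {w}x{h}, {w * h / 1e6:.2f} megapixels, aspect ratio {w / h:.3f}")

# DCI 4K: 4096x2160, 8.85 megapixels, aspect ratio 1.896 (~17:9)
# 4K UHD / 2160p: 3840x2160, 8.29 megapixels, aspect ratio 1.778 (16:9)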
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
Åesop said:
No worries guys. There is a console for you, it's called the PS4 Casual ( ͡° ͜ʖ ͡°)

@OT: This is all really confusing.. an OT with all the important information and tutorials would be nice :/

LOL at PS4 Casual. So condescending that it feels so good to say.
 

sloppyjoe_gamer

Gold Member
Reading this thread trying to find out which PS4 video setting I should use with my 4K HDR TV:

[chalkboard image]
 

Cleve

Member
Man, I think some of you are trying to confuse yourselves. Use auto if you don't understand; it does the thinking for you.

If you thought you were going to get 10-bit HDR RGB at 4K/60, sorry to disappoint you.
 
Yup

But in that case it isn't RGB anymore; he should say that it auto-switches to YUV420.
I guess that to know precisely how the console behaves, things simply need to be tested hands-on; otherwise we can only speculate.

True enough, but I think that was implied for simplicity in the third scenario since it's the only way.
 

jonno394

Member
I've decided I'll just plug it into HDMI 1 on my TV and unplug it whenever I want to use my PC. HDMI 1 on my TV supports 4K60 4:4:4 at 8-bit and 4K60 4:2:0 at 10-bit; if I just set the Pro to auto, it'll do my thinking for me!

If I chose to just use HDMI 2 (which doesn't support RGB 4:4:4 or UHD Color, due to only one of the HDMI ports supporting full bandwidth), I'd be locked into 4:2:0 only, so at least if I use socket 1 I'll have the choice!
 

MrBigBoy

Member
But I... I mean, Bob doesn't trust the console's auto setting, especially given how much it seems to bork things up with the black level set to automatic...
Tell Bob he should read this thread.

I'm sure he'll understand why automatic will be best for him :p
 

adamsapple

Or is it just one of Phil's balls in my throat?
This thread is confusing my little brain.

I was told that for a 4K TV with HDMI 2.0 but without HDR, the best option to use is RGB. That's what I'm gonna use.
 

jonno394

Member
This thread is confusing my little brain.

I was told that for a 4K TV with HDMI 2.0 but without HDR, the best option to use is RGB. That's what I'm gonna use.

Indeed, now you just have to make sure all 3/4 of your HDMI ports support full bandwidth/UHD Colour (or the equivalent for Sony/LG/Panasonic sets); otherwise you have to make sure you're using the right one!
 
So 2160p is not the same as the 4K they've been advertising?
Sort of. The term "4K" is used with multiple meanings: the UHD TV spec 3840×2160, and several slightly wider movie specs.

For conversational uses "4K" is fine, but in settings greater precision is desirable. (Technically, "2160p" isn't precise either, but in context it's clearly UHD.)

Again, anyone who doesn't want to learn all this can just let the machine choose. If you want more control, you gotta get down in the weeds.
 