
Television Displays and Technology Thread: This is a fantasy based on OLED

Paragon

Member
ABL is always in effect in bright scenes. It's a known issue, especially for those sensitive to it. Watching a hockey game exposes this greatly. It's why, when you calibrate professionally, just like for plasma, you don't use 100% windows but 20%.
The dropoff in luminosity on the rtings page you linked even indicates this:
Well yes, the ABL is obviously in effect at high brightness settings since the brightness drops from 790 nits with a 10% window pattern to 150 nits full-screen.

The question I have is whether the ABL was still in effect when the display was calibrated to 150 nits or less, rather than when it is set to its maximum brightness.
Because everything I had read online up to this point said that it was not in effect below 150 nits, and you're suggesting that is not the case.

So if it was calibrated to 100 nits (SDR spec) you're saying that it would still dim the picture below that if the image was bright?
If it was still enabled below 150 nits, I'm glad that they're fixing a stupid decision, and disappointed that this is yet another area of performance that OLED owners have been misrepresenting.

Trying to discuss issues with OLED now is like trying to discuss issues about the Kuros back when they were still being sold.
People refused to admit that they were anything less than perfect until something better finally arrived. (better = deeper black level on AV sites, apparently nothing else matters)
 

vivftp

Member
Wait... Triluminos is Sony's marketing term for quantum dots? Always thought it was some color calibration stuff...
It shows how much I followed the TV scene these past few years lol.

Triluminos was originally used as a term on the XBR8 TVs which used a full array RGB backlighting system back around 2009 or so. They brought the term back around 2012 or 2013 when they partnered with QD Vision to use quantum dots as a colour filter to improve the picture on some of their higher end sets.

After the partnership with QD Vision ended I'm quite sure they moved away from quantum dots, but dunno what they went to. Haven't looked into it in a long time.
 

Geneijin

Member
Well yes, the ABL is obviously in effect at high brightness settings since the brightness drops from 790 nits with a 10% window pattern to 150 nits full-screen.

The question I have is whether the ABL was still in effect when the display was calibrated to 150 nits or less, rather than when it is set to its maximum brightness.
Because everything I had read online up to this point said that it was not in effect below 150 nits, and you're suggesting that is not the case.

So if it was calibrated to 100 nits (SDR spec) you're saying that it would still dim the picture below that if the image was bright?
If it was still enabled below 150 nits, I'm glad that they're fixing a stupid decision, and disappointed that this is yet another area of performance that OLED owners have been misrepresenting.

Trying to discuss issues with OLED now is like trying to discuss issues about the Kuros back when they were still being sold.
People refused to admit that they were anything less than perfect until something better finally arrived. (better = deeper black level on AV sites, apparently nothing else matters)
Yes, it does, even though I set the luminosity of 100% white to 120 nits. hdtvtest.co.uk and rtings.com have confirmed this. The only reason I first noticed this myself was because there's a scene in the season finale (Episode 24) of an anime called Haikyuu!! Season 2 where sometimes there's focus on a singular character on a white background (spoilers) or really bright white highlights, and I noticed dimmed whites. The contrast was ruined for me, and I would never have noticed had I not owned an X800D for less than a month and watched the same episode on both the X800D and C6P. I've had to change my nits to 140 at 100% white to compensate for ABL because of this. Otherwise, 140 is usually too bright for my room conditions compared to my old LED (X800D). My brightness setting (black levels) is 65 for reference, compared to the 50 that rtings uses. Whether or not you notice is another thing entirely. I can confirm this since I calibrated my own TV with an i1 Display Pro, i1 Pro 2, and HCFR.

Edit:

A video that shows this
 

mrklaw

MrArseFace
It's not an OLED, but I recently bought a 65in KS9000 and I find myself using it more now than my 55in W900A. I think it's the size difference; I never thought I would appreciate the difference between 55 and 65.

You get about an extra 3.5 square feet of TV area going from 55" to 65": 1291 sq in -> 1800 sq in.

That's an increase in size greater than the entire viewable area of a 32" widescreen TV :)
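If anyone wants to check the maths, here's a quick Python sketch (my own back-of-the-envelope, assuming 16:9 panels and ignoring bezels, hence the slightly different rounding):

```python
import math

def screen_area_sq_in(diagonal_in, aspect=(16, 9)):
    """Area of a panel from its diagonal, via Pythagoras."""
    w, h = aspect
    scale = diagonal_in / math.hypot(w, h)  # diagonal^2 = w^2 + h^2
    return (w * scale) * (h * scale)

for size in (32, 55, 65):
    print(f'{size}": {screen_area_sq_in(size):.0f} sq in')

# 32": 438 sq in
# 55": 1293 sq in
# 65": 1805 sq in  (~512 sq in, i.e. ~3.6 sq ft, more than the 55")
```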
 

Sky Chief

Member
I just called Cleveland Plasma, and they told me that they do not have any kind of X-number-of-days price guarantee. That's a huge bummer, because if they did, it would have locked me in for a purchase, knowing that I wouldn't have to worry about any major drops between now and the Super Bowl. Without that guarantee, buying from them becomes a potential minefield: buy now and risk missing out on a cheaper deal between now and one of the biggest sales periods of the year for TVs.

What's their current pricing?
 

Paragon

Member
Yes, it does, even though I set the luminosity of 100% white to 120 nits. hdtvtest.co.uk and rtings.com have confirmed this. The only reason I first noticed this myself was because there's a scene in the season finale (Episode 24) of an anime called Haikyuu!! Season 2 where sometimes there's focus on a singular character on a white background (spoilers) or really bright white highlights, and I noticed dimmed whites. The contrast was ruined for me, and I would never have noticed had I not owned an X800D for less than a month and watched the same episode on both the X800D and C6P. I've had to change my nits to 140 at 100% white to compensate for ABL because of this. Otherwise, 140 is usually too bright for my room conditions compared to my old LED (X800D). My brightness setting (black levels) is 65 for reference, compared to the 50 that rtings uses. Whether or not you notice is another thing entirely. I can confirm this since I calibrated my own TV with an i1 Display Pro, i1 Pro 2, and HCFR.

Edit: A video that shows this
Yes, I know what ABL looks like - it's one of the reasons I couldn't live with the Kuros, and something I am concerned about with OLED when viewing HDR content.
It's why I'm glad to see improvements still being made to LCD like QLED and Panasonic's 1,000,000:1 IPS panels.

Your reply is still a bit ambiguous though.
You say that you have to set "100% white" to 140 nits.
When you say "100% white" do you mean a full white screen (100% APL) or do you mean 100% luminance in a 10-20% APL window?
 

The_Spaniard

Netmarble

I'll start the process tomorrow. For anybody that has used Cleveland Plasma before (correct me if I'm wrong): are they open to some light negotiating? I think they quoted me around $3,200 on the set I was interested in, and I'm looking more at $3,000.
 

molnizzle

Member
25ms is my threshold where I can feel lag, yes, but we are only talking 2ms above that, so I don't think it will be an issue. I didn't like the 34-36ms that the C6 had though.

This is because of the way frame rates actually work on fixed-refresh displays.

In 60fps games each "frame" is on screen for 16.6 ms, whereas 30fps "frames" are on screen for 33.3 ms. So anything between 16.6-33.3 is gonna feel the same in a 30fps game since there will always be at least 33.3ms of lag since that's how long each frame lasts.

For 60fps games you're looking at 2 frames of lag in the 16.6-33.3 sweet spot. To achieve a single frame you'd need a screen that has less than 16.6 ms of input lag. But again, 14 ms of lag would feel identical to 1 ms since there's always gonna be at least a one frame delay.

This is all assuming the games are running at locked frame rates on a locked 60hz display. Safe to assume for current gen console gaming.

tl;dr - your sweet spot isn't 25 ms. It's 33.3 ms, because that's where you gain an extra frame of lag in both 30fps and 60fps console games.
 

Paragon

Member
This is because of the way frame rates actually work on fixed-refresh displays.
In 60fps games each "frame" is on screen for 16.6 ms, whereas 30fps "frames" are on screen for 33.3 ms. So anything between 16.6-33.3 is gonna feel the same in a 30fps game since there will always be at least 33.3ms of lag since that's how long each frame lasts.
For 60fps games you're looking at 2 frames of lag in the 16.6-33.3 sweet spot. To achieve a single frame you'd need a screen that has less than 16.6 ms of input lag. But again, 14 ms of lag would feel identical to 1 ms since there's always gonna be at least a one frame delay.
This is all assuming the games are running at locked frame rates on a locked 60hz display. Safe to assume for current gen console gaming.
tl;dr - your sweet spot isn't 25 ms. It's 33.3 ms, because that's where you gain an extra frame of lag in both 30fps and 60fps console games.
Display latency doesn't block the game rendering process. It's added on top of it.

If you're playing a game at 30 FPS with two frames of lag from V-Sync (67ms) on a display with 50ms latency, you're going to feel 117ms lag.
If the display had 1ms latency, you would feel 68ms lag instead.
If it was a VRR display with 1 frame latency because there's no need for V-Sync, it would drop to 33ms.

The display latency doesn't "cover up" other sources of latency, it's all additive.
You want the minimum latency possible from all components.
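A minimal sketch of that arithmetic in Python (the numbers are just the ones from my examples above, nothing measured):

```python
FRAME_MS = 1000 / 30  # one game frame at 30 FPS, ~33.3ms

def felt_lag_ms(pipeline_frames, display_latency_ms):
    # Total lag = the frames buffered by the game/V-Sync pipeline,
    # plus the display's own processing delay added on top.
    return pipeline_frames * FRAME_MS + display_latency_ms

print(round(felt_lag_ms(2, 50)))  # V-Sync'd 30 FPS on a 50ms TV -> 117
print(round(felt_lag_ms(2, 1)))   # same game on a 1ms TV        -> 68
print(round(felt_lag_ms(1, 0)))   # VRR, one frame, no V-Sync    -> 33
```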
 

jstevenson

Sailor Stevenson
I'll start the process tomorrow. For anybody that has used Cleveland Plasma before (correct me if I'm wrong): are they open to some light negotiating? I think they quoted me around $3,200 on the set I was interested in, and I'm looking more at $3,000.

Chris doesn't mess around. He gives you his current low price.

His price will go down over time or if LG issues promotions.

If you're using a Citicard price match to a website, does it even matter what you pay Cleveland Plasma?
 

holygeesus

Banned
Trying to discuss issues with OLED now is like trying to discuss issues about the Kuros back when they were still being sold.
People refused to admit that they were anything less than perfect until something better finally arrived. (better = deeper black level on AV sites, apparently nothing else matters)

I'm not sure what you mean by this. Nigh on every owner on this site, myself included, loves their set but is willing to accept it isn't perfect. To counter your argument, there is a clear group of people, with whatever unknown agenda, who seem to latch onto whatever stick they can find to twat the technology with - now that lag has largely been solved, it seems ABL will be next.

For the record, I have never noticed ABL, but then I don't watch hockey... Maybe it is down to how you calibrate your panel (my contrast is 84 and OLED light 55), but in the thousands of hours I've watched my B6, it's never been an issue. And I came from a Kuro before, so if I was sensitive I would see it.

As to improvements I would like to see - near-black performance on low-quality source material is probably number one. I'm actually surprised that *that* isn't the thing the anti-OLED brigade focus on more, as it is more noticeable than anything else. Feed these sets poor source material and they produce poor results; however, feed them a decent signal and they shine.

In terms of picture processing, again, I don't have a problem. 1080p material scales up perfectly, and even most of the 720p material I watch looks great, as long as the compression isn't too harsh.

If you think though, that OLEDs are just about black levels, you've never seen one.
 

Paragon

Member
I'm not sure what you mean by this. Nigh on every owner on this site, myself included, loves their set but is willing to accept it isn't perfect.
Yeah I meant on other AV sites, GAF has been pretty good about it.
Except for the whole "image retention doesn't happen" thing.

For the record, I have never noticed ABL, but then I don't watch hockey... Maybe it is down to how you calibrate your panel (my contrast is 84 and OLED light 55), but in the thousands of hours I've watched my B6, it's never been an issue. And I came from a Kuro before, so if I was sensitive I would see it.
I think that if you weren't bothered by it on your Kuro or OLED you just don't notice it.
For many types of content - games being one of them - it's very noticeable to me.

If you think though, that OLEDs are just about black levels, you've never seen one.
I don't, but that's where the majority of people on AV sites seem to focus their attention. If it can't do pure black, it's not worth looking at. (Kuros magically did "pure black" until OLEDs came out)
 

KaoticBlaze

Member
Glad to hear the 2016 OLED models will be getting the HLG update as well. Also the fact that he mentions picture quality is basically the same as the 2017 models makes me feel better about getting the C6 instead of waiting for C7.
 

The_Spaniard

Netmarble
Chris doesn't mess around. He gives you his current low price.

His price will go down over time or if LG issues promotions.

If you're using a Citicard price match to a website, does it even matter what you pay Cleveland Plasma?

Quite simply, Citi price-matches a maximum of 500 bucks under what you paid. So I'm going to try to get the lowest base price possible before finding the lowest listed price I can find anywhere else to lower it another 500 bucks.
 

holygeesus

Banned
Yeah I meant on other AV sites, GAF has been pretty good about it.
Except for the whole "image retention doesn't happen" thing.

I think that if you weren't bothered by it on your Kuro or OLED you just don't notice it.
For many types of content - games being one of them - it's very noticeable to me.

I don't, but that's where the majority of people on AV sites seem to focus their attention. If it can't do pure black, it's not worth looking at. (Kuros magically did "pure black" until OLEDs came out)

Image retention does happen. Burn-in doesn't. IR is actually very easy to achieve, but it disappears very quickly in my experience. If you want to see this at its worst, a game like The Witness in HDR mode really does show it. This has been the only game I have noticed IR in thus far though - I guess it is because of the extremely bright white puzzles on-screen, coupled with HDR, which has contrast set to max by default.

I think you are right in that ABL may be something you are either sensitive to or aren't.

As to the pure black thing, it's a massive selling point for sure, but I don't subscribe to the whole Kuro = perfect set thing either. The blacks on mine had a slight reddish tinge to them, whereas my OLED is complete black, as in the TV disappears into the darkness at night. Every set out there has its issues - even the ZD9 (arguably the best all-round consumer set on the market). Finding out which of these issues you can live with is key to finding the set for you - for example, I would find DSE far more of a problem for gaming than ABL - something OLEDs do not suffer from, whereas most LCD panels do.
 

Geneijin

Member
Yes, I know what ABL looks like - it's one of the reasons I couldn't live with the Kuros, and something I am concerned about with OLED when viewing HDR content.
It's why I'm glad to see improvements still being made to LCD like QLED and Panasonic's 1,000,000:1 IPS panels.

Your reply is still a bit ambiguous though.
You say that you have to set "100% white" to 140 nits.
When you say "100% white" do you mean a full white screen (100% APL) or do you mean 100% luminance in a 10-20% APL window?
Ah, my bad. Yeah, 100% luminance in a 20% window. I forget offhand what it lowers to, but I'm thinking down to something like 120 nits from 140, since I couldn't do 120 nits anymore in my room because of the ABL. I'll post a reading when I find time to recalibrate it. It has a gamma of 2.3 right now, but I might experiment with 2.2 again.

I think that if you weren't bothered by it on your Kuro or OLED you just don't notice it.
For many types of content - games being one of them - it's very noticeable to me.
Yeah, if you don't notice it, don't ever go looking for it. But when you do though, man, the contrast in bright scenes is so disappointing.
 
The problem is Samsung calling it a "QLED TV", because it's not. The Q's don't even have anything to do with the LEDs (backlights) in these sets. They're just LCDs with quantum dots.

Also, using edge-lit on them is just embarrassing.

What I don't understand is what they're going to call their actual QLEDs when they come out. They fucked themselves over with the naming for a short-term marketing buzzword.
 

ACH1LL3US

Member
This is because of the way frame rates actually work on fixed-refresh displays.

In 60fps games each "frame" is on screen for 16.6 ms, whereas 30fps "frames" are on screen for 33.3 ms. So anything between 16.6-33.3 is gonna feel the same in a 30fps game since there will always be at least 33.3ms of lag since that's how long each frame lasts.

For 60fps games you're looking at 2 frames of lag in the 16.6-33.3 sweet spot. To achieve a single frame you'd need a screen that has less than 16.6 ms of input lag. But again, 14 ms of lag would feel identical to 1 ms since there's always gonna be at least a one frame delay.

This is all assuming the games are running at locked frame rates on a locked 60hz display. Safe to assume for current gen console gaming.

tl;dr - your sweet spot isn't 25 ms. It's 33.3 ms, because that's where you gain an extra frame of lag in both 30fps and 60fps console games.

Lol!!

You explained this perfectly, and it also explains why I could feel the lag on the C6 yet it wasn't an issue on the KS8000.

Should we be going by the bottom bar on the Leo Bodnar? So for instance, the C6 bottom bar was 37ms and the B6 bottom bar with the new firmware is 29.8ms, which means the B6 is under the two-frame 33.3ms number. The KS8000 had a bottom bar of 27.5ms. So I completely agree with your explanation, and it now helps me understand it fully. This B6 should be on point for me in regards to lag :0
 

AddiF

Member
I received my E6 65" last night. I haven't unboxed the TV itself, only cut the straps and took out the box with the remotes and manuals. Now I'm a bit worried. The box was laid flat on its side, but there is a picture on the box showing not to do that, and it also says not to in the instruction manual. An elderly man was driving the delivery van and drove carefully. Should I be worried or is this ok?
 
I received my E6 65" last night. I haven't unboxed the TV itself, only cut the straps and took out the box with the remotes and manuals. Now I'm a bit worried. The box was laid flat on its side, but there is a picture on the box showing not to do that, and it also says not to in the instruction manual. An elderly man was driving the delivery van and drove carefully. Should I be worried or is this ok?
I've seen plenty of YouTube videos of people laying it on its side so they could install the stand. Personally, the guy who wall-mounted mine made sure not to tip it at all. You should be ok, but the delivery person shouldn't have tipped it at all, so maybe contact where you purchased it from to see what they say before you open the box.
 

Chao

Member

pswii60

Member
Glad to hear the 2016 OLED models will be getting the HLG update as well. Also the fact that he mentions picture quality is basically the same as the 2017 models makes me feel better about getting the C6 instead of waiting for C7.
Really? Where is this confirmed?

Only thing I've seen is a 'rep' saying 'most likely' at CES but no official confirmation.
 

NYR

Member
Really? Where is this confirmed?

Only thing I've seen is a 'rep' saying 'most likely' at CES but no official confirmation.
Jesus. People are so cynical. Why would they put out a press release advertising that last year's model will be just as good as the model they are just putting out for sale? Be realistic.
 

Kyoufu

Member
I don't think the 2017 models are much of an upgrade tbh. A 10-20% brightness increase isn't enough for me to think about upgrading from an E6.
 

KaoticBlaze

Member
Really? Where is this confirmed?

Only thing I've seen is a 'rep' saying 'most likely' at CES but no official confirmation.

It's in the youtube video a few posts up. The guy from LG says during the interview that they will roll out a firmware update with HLG for the 2016 models sometime during the year after the 2017 models come out. He also says the picture quality is basically the same except for the 2017 model being about 25% brighter. Seems like they focused more on rolling out that new wallpaper model this year.
 

Vanillalite

Ask me about the GAF Notebook
As I said before, the key is how soon HDR content becomes mainstream.

You'll generally still get a better picture on OLED outside of some edge cases like the hockey example. This is especially the case for movies. Plus the input lag has been fixed for games.

That being said, there are just a handful of good things to watch in HDR. Not that OLED is bad at handling HDR, but it can't compete with LCD variants getting brighter and brighter nits-wise and being better at sustaining it.

We really are kinda fucked in that I don't think we are gonna get anything close to OLED blacks anytime soon in LCD land while I also don't think the OLED brightness HDR problem is magically gonna be solved anytime soon either.

It would be nice to have one tech to rule them all, but I don't see that happening at least in the next 3 to 5 years.

This problem becomes more of an issue the more HDR content we get too.
 

Anarion07

Member
I've seen plenty of YouTube videos of people laying it on its side so they could install the stand. Personally, the guy who wall-mounted mine made sure not to tip it at all. You should be ok, but the delivery person shouldn't have tipped it at all, so maybe contact where you purchased it from to see what they say before you open the box.

You're supposed to tip it on the side/front/panel to install the stand. It's in the instructions.
 

BumRush

Member
It's in the youtube video a few posts up. The guy from LG says during the interview that they will roll out a firmware update with HLG for the 2016 models sometime during the year after the 2017 models come out. He also says the picture quality is basically the same except for the 2017 model being about 25% brighter. Seems like they focused more on rolling out that new wallpaper model this year.

Why would a rep ever say any of that lol? I wonder if he still has a job.
 

holygeesus

Banned
As I said before, the key is how soon HDR content becomes mainstream.

You'll generally still get a better picture on OLED outside of some edge cases like the hockey example. This is especially the case for movies. Plus the input lag has been fixed for games.

That being said, there are just a handful of good things to watch in HDR. Not that OLED is bad at handling HDR, but it can't compete with LCD variants getting brighter and brighter nits-wise and being better at sustaining it.

We really are kinda fucked in that I don't think we are gonna get anything close to OLED blacks anytime soon in LCD land while I also don't think the OLED brightness HDR problem is magically gonna be solved anytime soon either.

It would be nice to have one tech to rule them all, but I don't see that happening at least in the next 3 to 5 years.

This problem becomes more of an issue the more HDR content we get too.

The ZD9 is as close as I've seen to OLED-level blacks, but even that isn't there yet, and the dimming technology isn't good enough for me. On the demo I saw, you kinda lose fine details, such as during night-time shots, where the stars disappear.

The ZD9 also blows away any OLED when it comes to HDR, as you mention. For me though, I can't live with DSE as I watch a lot of sports and obviously game, and the demo model had signs of this, which I noticed even in-store. I have no idea if it's a prevalent problem though.

Taking into consideration the cost, and the fact that we mainly watch non-HDR material in a dark room, I can't budge from OLED now. If I had the money, and was doing any day-time viewing, I would probably go for the ZD9 (and an OLED for the bedroom ;)
 

molnizzle

Member
Display latency doesn't block the game rendering process. It's added on top of it.

If you're playing a game at 30 FPS with two frames of lag from V-Sync (67ms) on a display with 50ms latency, you're going to feel 117ms lag.
If the display had 1ms latency, you would feel 68ms lag instead.
If it was a VRR display with 1 frame latency because there's no need for V-Sync, it would drop to 33ms.

The display latency doesn't "cover up" other sources of latency, it's all additive.
You want the minimum latency possible from all components.

Right, but each "frame" still displays on screen for the same amount of time. In the example you used, the 117ms of lag would feel the same as 132ms because both would result in 4 "frames" of lag in a 30fps game (technically 8 actual frames at 60hz). Now 140ms? That's over the threshold for 5 frames so you might feel the difference if you're sensitive to input lag.
 

Paragon

Member
Right, but each "frame" still displays on screen for the same amount of time. In the example you used, the 117ms of lag would feel the same as 132ms because both would result in 4 "frames" of lag in a 30fps game (technically 8 actual frames at 60hz). Now 140ms? That's over the threshold for 5 frames so you might feel the difference if you're sensitive to input lag.
People don't work in frames.
We feel as much lag as there is, we don't lock to 33ms when we view something running at 30 FPS.
That's like saying you can't tell the difference between displays that have 1ms and 32ms latency in a 30 FPS game.

The display's latency is not connected to the game at all.
It's just additional latency that's added on top.
 

molnizzle

Member
People don't work in frames.
We feel as much lag as there is, we don't lock to 33ms when we view something running at 30 FPS.
That's like saying you can't tell the difference between displays that have 1ms and 32ms latency in a 30 FPS game.

The display's latency is not connected to the game at all.
It's just additional latency that's added on top.

In a 30fps game on a 60hz display the image only changes once every 33.3ms. So no matter what, there's at least 33.3ms of lag (not even taking stuff like Vsync into account). Even if the display has 1ms of input lag it's still gonna take 33.3ms for the next frame to appear on screen. If you have two 30fps console games running side by side - one on a 1ms display and the other on a 32ms display - and you test the same input at the same exact time (perfectly at the beginning of a frame), both displays would reflect the input on the next rendered frame.

Obviously there is more to take into account since games have built-in input lag depending on the engine, Vsync implementation, etc. And I suppose with lower latency you can enter an input slightly later and still have it reflected in the next frame. But the multiples are still 33.3ms for 30fps games and 16.6ms for 60fps games, as a rule of thumb.

...exception being PC games running on VRR displays (or any display over 60hz) or any game running at an unlocked frame rate. Then it gets dicier.
 

Theonik

Member
The display lag is the time that passes between a frame being received by the display and being displayed. It can take arbitrary values; it's not tied to framerate.

If you are trying to determine input-to-photon time then framerate matters, but for games there are a number of factors involved.
 

Paragon

Member
In a 30fps game on a 60hz display the image only changes once every 33.3ms. So no matter what, there's at least 33.3ms of lag (not even taking stuff like Vsync into account). Even if the display has 1ms of input lag it's still gonna take 33.3ms for the next frame to appear on screen. If you have two 30fps console games running side by side - one on a 1ms display and the other on a 32ms display - and you test the same input at the same exact time (perfectly at the beginning of a frame), both displays would reflect the input on the next rendered frame.
Let's make this even simpler than pressing a button at the same time.
You have a single console and the output is going into an HDMI splitter.

When you press a button, TV1 (1ms) would update the screen after 34.3ms while TV 2 (32ms) would update after a 65.3ms delay.
It doesn't matter that the frames are spaced 33.3ms apart, what matters is how much of a delay the TV adds on top of that.

The delay is not linked to the source in any way, it's just an arbitrary delay caused by the TV's image processing when it receives a signal.
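As a trivial sketch (hypothetical 1ms and 32ms sets, using the 33.3ms frame spacing from above):

```python
FRAME_MS = 33.3  # one 30 FPS frame

# Same console signal through a splitter; only the TVs' processing differs.
for name, display_ms in (("TV1", 1), ("TV2", 32)):
    print(f"{name}: button press visible after ~{FRAME_MS + display_ms:.1f}ms")

# TV1: button press visible after ~34.3ms
# TV2: button press visible after ~65.3ms
```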
 
From what I read yesterday before bed, all of the new Samsungs are edge-lit LCDs, even the Q9. Which means that this year's KS9800 is probably a better TV for HDR with its full-array backlight.

EDIT* Beaten by The Beard.

Looking at Samsung's KS line this year, it seems the curved TVs (KS8500 at least) are marketed as "full array" as well on Amazon. Is that true? If so, I'll go against my initial bias against a curved TV for the next few years.
 

molnizzle

Member
Let's make this even simpler than pressing a button at the same time.
You have a single console and the output is going into an HDMI splitter.

When you press a button, TV1 (1ms) would update the screen after 34.3ms while TV 2 (32ms) would update after a 65.3ms delay.
It doesn't matter that the frames are spaced 33.3ms apart, what matters is how much of a delay the TV adds on top of that.

The delay is not linked to the source in any way, it's just an arbitrary delay caused by the TV's image processing when it receives a signal.

...but the game doesn't stop rendering new frames when you're not entering inputs. It's showing a new frame (even if it's the same scene) every 33.3ms. A fixed-refresh screen can't update at arbitrary numbers like 34.3ms or 65.3ms. It updates every 16.6ms no matter what (or 33.3 in the case of 30fps games). You get:

1st frame - 33.3ms
2nd frame - 66.6ms
3rd frame - 99.9ms
etc.

In your example, both 34.3ms and 65.3ms are over 33.3 but under 66.6. So the actual action on screen would be reflected at the same time, because the displays would be adding 2 additional frames of lag on top of whatever the game has natively.
 
Looking at Samsung's KS line this year, it seems the curved TVs (KS8500 at least) are marketed as "full array" as well on Amazon. Is that true? If so, I'll go against my initial bias against a curved TV for the next few years.

The 8500 is edge lit. It does not feature full array local dimming as far as I am aware.
 

Yukstin

Member
I received my E6 65" last night. I haven't unboxed the TV itself, only cut the straps and took out the box with the remotes and manuals. Now I'm a bit worried. The box was laid flat on its side, but there is a picture on the box showing not to do that, and it also says not to in the instruction manual. An elderly man was driving the delivery van and drove carefully. Should I be worried or is this ok?

It will be fine. When I set up my C6, I laid it down to attach the wall mount and had no issues. Just make sure you put something down so the screen doesn't get scratched. Here's an unboxing video of a B6 to show this: https://www.youtube.com/watch?v=uHBFqnYVd5w
 

Yukstin

Member
Quite simply, Citi price-matches a maximum of 500 bucks under what you paid. So I'm going to try to get the lowest base price possible before finding the lowest listed price I can find anywhere else to lower it another 500 bucks.

I would monitor this thread on the AVS forums for deals: http://www.avsforum.com/forum/322-o.../2399970-2016-lg-b-c-e-g-series-deals-65.html

Consensus seems to be that prices have spiked now post-holiday and will drop in the next couple of weeks for Super Bowl deals, so you may want to hold off until then.
 

Paragon

Member
...but the game doesn't stop rendering new frames when you're not entering inputs. It's showing a new frame (even if it's the same scene) every 33.3ms. A fixed-refresh screen can't update at arbitrary numbers like 34.3ms or 65.3ms. It updates every 16.6ms no matter what (or 33.3 in the case of 30fps games). You get:

1st frame - 33.3ms
2nd frame - 66.6ms
3rd frame - 99.9ms
etc.

In your example, both 34.3ms and 65.3ms are over 33.3 but under 66.6. So the actual action on screen would be reflected at the same time, because the displays would be adding 2 additional frames of lag on top of whatever the game has natively.
Yes, each frame is 33.3ms apart at 30 FPS.
But the starting point is different on the two TVs due to their processing delays.

So a TV with 1ms latency:
1st frame - 34.3ms
2nd frame - 67.6ms
3rd frame - 100.9ms
etc.

A TV with 32ms latency:
1st frame - 65.3ms
2nd frame - 98.6ms
3rd frame - 131.9ms
etc.
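The same thing as a little Python sketch (illustrative numbers only, matching the tables above):

```python
FRAME_MS = 33.3  # one 30 FPS frame

def on_screen_times_ms(display_latency_ms, frames=3):
    # Frame n leaves the console at n * FRAME_MS; the TV then adds its
    # fixed processing delay before the frame reaches the glass.
    return [round(n * FRAME_MS + display_latency_ms, 1)
            for n in range(1, frames + 1)]

print(on_screen_times_ms(1))   # [34.3, 67.6, 100.9]
print(on_screen_times_ms(32))  # [65.3, 98.6, 131.9]
# The cadence (33.3ms apart) is identical on both TVs; the slower TV just
# shows every frame 31ms later, and that offset never goes away.
```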
 

Midas

Member
If I'm not getting a new model from either Samsung, LG or Sony this year, I'll either get the KS8005 (Europe model, as I understand it this is KS9000 in North America?) or the OLED55B6V.

Are there things I need to think about before making my purchase in regards to the pros and cons of these two TV sets? Or does it just depend on the budget? Should I just go for the OLED if I have the money?
 

pswii60

Member
...but the game doesn't stop rendering new frames when you're not entering inputs. It's showing a new frame (even if it's the same scene) every 33.3ms. A fixed-refresh screen can't update at arbitrary numbers like 34.3ms or 65.3ms. It updates every 16.6ms no matter what (or 33.3 in the case of 30fps games). You get:

1st frame - 33.3ms
2nd frame - 66.6ms
3rd frame - 99.9ms
etc.

In your example, both 34.3ms and 65.3ms are over 33.3 but under 66.6. So the actual action on screen would be reflected at the same time, because the displays would be adding 2 additional frames of lag on top of whatever the game has natively.

You know, the fixed refresh screen still updates every 16.6ms in 30fps games, but the native refresh is 60hz (actually 120hz for OLED and most modern LCDs, so your screen is refreshing every 8.3ms regardless of the input, and just duplicating frames where necessary). The console is just sending the same frame twice during a 30fps game to the 60hz output, and your screen is still refreshing at its native 60hz/120hz regardless.

The Leo Bodnar doesn't lie. It tests the precise gap between pressing the button and the white square showing on the screen. So, there's nothing complicated about this - if the Bodnar reports 34ms of lag, then it's taking exactly 34ms from when you press the button to the output being displayed on the screen. It's a button, and a light sensor. A simple, but incredibly elegant solution.
 

Kyoufu

Member
I can't help but drool whenever I see the Sony A1E OLED. That design is just so intriguing. Really curious to read reviews of it and how it uses the X1 processor.
 
Looking at Samsung's KS line this year, it seems the curved TVs (KS8500 at least) are marketed as "full array" as well on Amazon. Is that true? If so, I'll go against my initial bias against a curved TV for the next few years.

Don't know where you read that but Amazon and Rtings tell me that only KS9800 (US) / KS9500 (EU) are full array of the 2016 Samsungs. Seems to be a great TV too, but a bit too expensive for me.
 

molnizzle

Member
Yes, each frame is 33.3ms apart at 30 FPS.
But the starting point is different on the two TVs due to their processing delays.

So a TV with 1ms latency:
1st frame - 34.3ms
2nd frame - 67.6ms
3rd frame - 100.9ms
etc.

A TV with 32ms latency:
1st frame - 65.3ms
2nd frame - 98.6ms
3rd frame - 131.9ms
etc.

I don't get it...

Wouldn't both screens be
1st frame - 33.3ms
2nd frame - 66.6ms
3rd frame - 99.9ms

no matter what? How can they be anything else on a locked 60hz display?

You know, the fixed refresh screen still updates every 16.6ms in 30fps games, but the native refresh is 60hz (actually 120hz for OLED and most modern LCDs, so your screen is refreshing every 8.3ms regardless of the input, and just duplicating frames where necessary). The console is just sending the same frame twice during a 30fps game to the 60hz output, and your screen is still refreshing at its native 60hz/120hz regardless.

The Leo Bodnar doesn't lie. It tests the precise gap between pressing the button and the white square showing on the screen. So, there's nothing complicated about this - if the Bodnar reports 34ms of lag, then it's taking exactly 34ms from when you press the button to the output being displayed on the screen. It's a button, and a light sensor. A simple, but incredibly elegant solution.

Yes yes I know it's repeating frames, I tried to indicate that by airquoting "frames" when referring to 33.3ms refresh.

That's kinda my point: if a 60hz display is refreshing every 16.6ms no matter what, how can it matter if one set takes 15ms to process a frame and another takes 1ms? Both would be showing the new frame after 16.6ms. They can't show it after 1ms; it has to be a multiple of 16.6.
 
What's the input lag like for the Z9D right now?

"We tested the ZD9 with our Leo Bodnar tester and in Game mode with the local dimming off we got a measurement of 42ms and with the local dimming on we got 47ms, which should be low enough for most people, although serious gamers may find it slightly high. It is also slightly higher than the measurements we got for the XD93 and XD94, so we suspect the increased lag may be the result of the extra processing in the ZD9. However, personally we found gaming on the Sony to be an enjoyable experience as we blasted through a few sessions of No Man's Sky on our PS4." Source
 
The 8500 is edge lit. It does not feature full array local dimming as far as I am aware.

I apologize, you are correct. I was confusing it with the KS9800, which is a bit out of my price range.

Don't know where you read that but Amazon and Rtings tell me that only KS9800 (US) / KS9500 (EU) are full array of the 2016 Samsungs. Seems to be a great TV too, but a bit too expensive for me.

I agree; at that price I would just buy the OLED. In my case the room will be dark most of the time, so performance in bright rooms isn't an issue for me, hence why I don't shy away from LCD/LED panels.
 