Why OLED for PC use?

While I am still getting used to the intense highlights associated with an HDR capable screen, it's amazing how quickly you get used to TRUE BLACKS.

Looking at my side screen is instantly almost bothersome by comparison.
 
While I am still getting used to the intense highlights associated with an HDR capable screen, it's amazing how quickly you get used to TRUE BLACKS.

Looking at my side screen is instantly almost bothersome by comparison.

I don't mix displays for this reason. I see the worst in each tech if they're next to each other for comparison.

Blacks are the reason I stayed with CRT initially. (Though over time I came to see their other virtues.) Absolutely hated what LCD had done to blacks. It just seemed outrageous to me. And to dynamic range. Just looked so shallow in comparison to a CRT. (Obviously LCD improved over time, especially with FALD.)

And though HDR is neat, I do wonder if there would have been such a clamor for it if we had never abandoned displays with high dynamic range in the first place.

Text on LG OLED looks ok to me with BetterClearTypeTuner set to grayscale. I know it could be better if regular RGB pattern, but it doesn't look bad to me.

OLED Light set to 0 and contrast to 75 for my work stuff, which is plenty bright to me. Admittedly, I always use a room with dimmer ambient light. I never want anything competing with the display if I can help it.
 
I have to do a 15-17 hour drive about 4 times a year for family. Don't you have a screen that just uses Android Auto/CarPlay? Why would the phone screen itself be on for that entire time? Having a truck/car with CarPlay is pretty much the only 'feature' purchase that I require for my daily driver. I wouldn't buy one without it.
The company trucks don't have the fancy new technology. Their trucks are a few years older. That's why lol.
 
You’re lucky. Being micromanaged by camera remotely is awful.
I'm not that lucky lol. While the owners are too cheap to install navigation in any of the trucks, they absolutely have installed front/back facing cameras in all the trucks as well as GPS tracking. So to be fair, we are prolly in similar boats lol but I'd rather be out driving as opposed to sitting in the office all day (what I used to do).
 
I'm not that lucky lol. While the owners are too cheap to install navigation in any of the trucks, they absolutely have installed front/back facing cameras in all the trucks as well as GPS tracking. So to be fair, we are prolly in similar boats lol but I'd rather be out driving as opposed to sitting in the office all day (what I used to do).


I imagine this is pretty much a liability/insurance thing these days. Install them, or pay many times more :/
 
Exactly. Cheaper insurance.

I wonder how much cheaper it is. Would be interesting to balance that cost against driver turnover. No one likes being micro-managed or watched all the time. That is a total morale killer, and must hurt employee retention a lot.
 
Text on LG OLED looks ok to me with BetterClearTypeTuner set to grayscale. I know it could be better if regular RGB pattern, but it doesn't look bad to me.

I've been going back and forth on this. The Windows UI looks much better with grayscale font AA, but in Word it makes all the fonts look really light, making it difficult to pick up markup colors when reviewing team edited documents. I've been switching back and forth between the grayscale and RGB for this reason, and I haven't decided where to leave it.

I know it wouldn't be perfect, as there is some inherent lack of smoothness in fonts due to the pixel layout, but it would be nice if we got specially designed cleartype settings for this subpixel layout as well.
 
do most oleds really not have a text mode yet? an update on my uled gave me one and it's a night and day difference.
 
Text on LG OLED looks ok to me with BetterClearTypeTuner set to grayscale. I know it could be better if regular RGB pattern, but it doesn't look bad to me.
AFAIK, setting greyscale mode in BCTT is the same as just disabling ClearType in the built-in ClearType tuner, so if that is the only thing one wants, you can do it without 3PP.
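For anyone who wants to flip this without a third-party tool, my understanding is that the grayscale option just selects the standard (non-ClearType) font smoothing type, which you can also set through the Win32 SystemParametersInfo call. A rough Python sketch (Windows only; the set_grayscale_aa helper is just something I made up, and some apps may need a restart to pick up the change):

```python
import ctypes

# Constants from winuser.h
SPI_SETFONTSMOOTHING      = 0x004B
SPI_SETFONTSMOOTHINGTYPE  = 0x200B
FE_FONTSMOOTHINGSTANDARD  = 0x0001   # grayscale anti-aliasing
FE_FONTSMOOTHINGCLEARTYPE = 0x0002   # RGB subpixel anti-aliasing (ClearType)
SPIF_UPDATEINIFILE = 0x0001
SPIF_SENDCHANGE    = 0x0002

def set_grayscale_aa(grayscale: bool = True) -> None:
    """Keep font smoothing on, but switch between grayscale and ClearType."""
    user32 = ctypes.windll.user32
    flags = SPIF_UPDATEINIFILE | SPIF_SENDCHANGE
    # Leave smoothing enabled either way; only the smoothing *type* changes.
    user32.SystemParametersInfoW(SPI_SETFONTSMOOTHING, 1, None, flags)
    smoothing = FE_FONTSMOOTHINGSTANDARD if grayscale else FE_FONTSMOOTHINGCLEARTYPE
    user32.SystemParametersInfoW(SPI_SETFONTSMOOTHINGTYPE, 0, ctypes.c_void_p(smoothing), flags)

if __name__ == "__main__":
    set_grayscale_aa(True)   # grayscale AA suits non-standard subpixel layouts like WOLED
```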
 
do most oleds really not have a text mode yet? an update on my uled gave me one and it's a night and day difference.
I'm not aware of one on the LG 48GQ900 4k 48" "Ultragear" gaming monitor I picked up. As best I can tell it's a TV with a more aggressive AG coating (the main reason I went with this model) and a DisplayPort input. I don't find the text on it terrible at all. I actually think it's better than the 43" 4k Samsung Neo G7 LCD I had briefly before taking it back because it had issues with both DisplayPort and HDMI: HDMI would randomly lose signal and DisplayPort wouldn't wake up from sleep. The HDMI cable has since tested good on the LG that replaced it... Both get crushed on text by my 4-year-old Dell 43" 4k 60Hz no-HDR IPS pure business monitor.

As for OLED vs. VA vs. IPS, it depends on how far you are from the screen. The IPS screen is the gold standard for text. OLED text fringing doesn't seem to affect readability much even if I plant my face in it at programming distance; it's OK everywhere, and the goal there is max real estate for text, so scaling stays off. The VA screen is good in the center but worse than the OLED for the outer half or so of the screen (1/4 on each side) at programming distance, and good everywhere at a larger distance, but if I move back the text is too small and I need to turn on scaling. The problem with the VA screen is color shifting; it makes everything look muddy out at the edges if you're at programming distance.

I ended up just buying a little more cheap Ikea furniture (it's my office, don't care if it's pretty, nondescript all black like I got is fine), rearranging things a bit, and going with an OLED + IPS setup. I'd be ok with just using the OLED, but I already have the IPS, it's better for work, and saving the OLED for gaming, video, etc. will avoid any possible burn-in issues. I suppose I ought to look into that a bit more. Never worried about burn-in with CRTs and never had a problem with it. After not liking that VA panel and realizing that pretty much all 43" 4k gaming monitors used the same panel I decided to just get an OLED and some more furniture and rearrange my setup rather than trying to replace that 43" 4k Dell IPS LCD.
 
I forget who it was earlier in this thread who was on a rant about how OLED isn't bright enough and that this is why you should always choose MicroLED, but after a good amount of first-hand experience with a 42" LG C3, I can confidently say I disagree.

Even after Windows 11 HDR calibration, with HDR on for desktop use every freaking white icon or window is eye-scorchingly bright to the point of being unusable.

HDR can look great when well implemented in games and films, but on the desktop? I don't understand how anyone ever leaves it on. I feel the need to switch HDR off every time I exit a game.

Then again, maybe it's just me. I'm an avid indoorsman. My office is in my basement and doesn't have any windows, and that's the way I like it, away from the bothersome sun. I essentially never go outside during daylight without wearing sunglasses, as it is uncomfortable.

Heck, I feel like wearing sunglasses when sitting in front of this damn LG OLED in HDR mode. :p

No, I am not a vampire...

...or goth.
 
I saw this subject and I realized it has been almost a year since I bought my Alienware AW3423dfw. I've had no issues with burn in.

Steps I take:

- Do the pixel refresh on occasion. Maybe once or twice a day if I'm heavily using it. Most of the time it's a drive-by checking email or a couple of websites, so I don't run the pixel refresh.
- Turn monitor off when not in use for more than 20 minutes.
- Run Buttery Taskbar app so the taskbar is totally hidden. Before installing this app the top edge of the bar could be seen at the bottom of my screen.

Text looks fine to me. Either I've gotten used to it, or it's because I enabled ClearType early on (it's currently enabled).

 
I saw this subject and I realized it has been almost a year since I bought my Alienware AW3423dfw. I've had no issues with burn in.

Steps I take:

- Do the pixel refresh on occasion. Maybe once or twice a day if I'm heavily using it. Most of the time it's a drive-by checking email or a couple of websites, so I don't run the pixel refresh.

Most of the TVs trigger the pixel refresh on their own when powered off after 4-6 hours of cumulative use. The Alienware doesn't do this? I guess it doesn't hurt to do it more often, though.

- Turn monitor off when not in use for more than 20 minutes.

Does having it on with a blank black screen cause any harm? Or do you have icons on the desktop, a wallpaper and taskbar?

- Run Buttery Taskbar app so the taskbar is totally hidden. Before installing this app the top edge of the bar could be seen at the bottom of my screen.

This is something I've been concerned about as well. In Linux Mint Cinnamon edition it's a one pixel wide grey line. Being grey the burn-in threat is probably minimal, and I'm not sure if it would even be particularly visible in one line across the bottom of the screen like that.

Still, concerning because it is there almost all the time.
 
This is something I've been concerned about as well. In Linux Mint Cinnamon edition it's a one pixel wide grey line. Being grey the burn-in threat is probably minimal, and I'm not sure if it would even be particularly visible in one line across the bottom of the screen like that.

Still, concerning because it is there almost all the time.
This sort of concern, even for what some would consider not worth worrying about, is precisely why I won't use current self-emissive tech on the desktop.

Gorgeous for entertainment purposes, undeniable. But for all other purposes I really don't want that constant thought of burn-in holding me back from using a screen as I should be able to. It would always be there, making me conscious of what I'm holding on the screen in one place for too long. Annoying mitigations like hiding things I'm used to having visible can get in the bin, honestly. Not to mention weird subpixel layouts.

I very much hope for a self-emissive tech where this burn-in crap isn't an issue because the benefits otherwise are massive and obvious. But with even MicroLED suffering it I don't know what or when that will be.
 
This sort of concern, even for what some would consider not worth worrying about, is precisely why I won't use current self-emissive tech on the desktop.

Gorgeous for entertainment purposes, undeniable. But for all other purposes I really don't want that constant thought of burn-in holding me back from using a screen as I should be able to. It would always be there, making me conscious of what I'm holding on the screen in one place for too long. Annoying mitigations like hiding things I'm used to having visible can get in the bin, honestly. Not to mention weird subpixel layouts.

I very much hope for a self-emissive tech where this burn-in crap isn't an issue because the benefits otherwise are massive and obvious. But with even MicroLED suffering it I don't know what or when that will be.

Well, I have been following RTINGS' (I always pronounce it "Arr-Tings") long term burn-in tests.

What convinced me it was time to give it a try was that starting with the 2022 models (C2, G2, etc.), LG's offerings appear to be much more burn-in resistant.

And in their torture test from hell (endless 24/7 stream of CNN with the CNN logo on high brightness) this is what the screen looks like after 4500 hours:

[Attached image: RTINGS grey uniformity test screen after 4500 hours]


Now, my screen is probably only on 11-12 hours a day, 5 days a week (probably closer to 8 on average if we factor in screen saver time and the days I actually go into the office in person), plus maybe 5 hours on the weekends when I get some gaming time, so that totals maybe 50 hours a week.

At that usage rate, it would take me 90 weeks to accumulate 4500 hours on mine, which is what, a year and 9-10 months?
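If anyone wants to sanity-check that math, here's the quick version (the numbers are just my rough estimates from above):

```python
# Quick sanity check of the usage math above (values are my own rough estimates).
weekday_hours = 9          # ~8-9 effective hours a day, 5 days a week
weekend_hours = 5          # weekend gaming time
hours_per_week = 5 * weekday_hours + weekend_hours   # ~50 h/week
rtings_test_hours = 4500   # hours on the RTINGS torture-test screen shown above

weeks = rtings_test_hours / hours_per_week
print(f"{hours_per_week} h/week -> {weeks:.0f} weeks, about {weeks / 52:.1f} years")
# 50 h/week -> 90 weeks, about 1.7 years (call it a year and nine months)
```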

In their screen above there is some slight burn in visible on the grey uniformity test screen, but it would probably be mostly unnoticeable on the screen in regular use.

Then add the fact that my use is nowhere near as worst-case torture-testy as their test. I do have some semi-static elements, but I move them occasionally, and I work in my basement office where I have no sunlight and can thus run my screens pretty dark. For my desktop use I run it in SDR mode at much lower brightness than in the HDR torture test.

I usually upgrade my screen every 4-5 years, and I'm thinking with this level of burn-in performance I should be able to make it to 4 without significant problems.

And if I don't, I paid for the 5 year Geek Squad protection plan, so I'm covered.

So yeah, I was concerned about burn-in, but I think I've gotten to the point where I really no longer am. OLEDs have improved. (I mean, my phone has an OLED screen, and I've been using it for 2.5 years without burn-in.) I still take precautions (no background image, self-hiding menu bars, no icons on the desktop), but I'm not even sure that is necessary.

I'm not saying I am 100% sure I won't get burn-in. I'm not. But I think the chances are on the low side, and if the worst case happens, I have the protection plan.


It really does look amazing in titles that do HDR right. Cyberpunk 2077 is a masterpiece with RT turned up, in 4k with HDR on this screen. The bright sunrays through the leaves of the palm trees, the neon signs at night, and the lightning bolts are spectacular, not to mention sunsets and sunrises. I wish I had played it this way the first time around, but at least I'll get a chance with the new Phantom Liberty expansion.

Other games have a ways to go in improving their HDR implementation. Starfield is pretty bad, and Windows 11's AutoHDR feature can only help so much (but there are mods that help).

But yeah. Ask me how it went in 2028, and I'll be sure to give you my full review :p
 
Well, I have been following RTINGS' (I always pronounce it "Arr-Tings") long term burn-in tests.
[...]
And in their torture test from hell (endless 24/7 stream of CNN with the CNN logo on high brightness) this is what the screen looks like after 4500 hours:

[Attached image: RTINGS grey uniformity test screen after 4500 hours]
[...]

Yeah, either WOLED really is more durable than QD-OLED, or LG has a pretty good implementation of compensation cycles. Makes me kinda wary about getting that Asus QD-OLED coming out next year, because I have no intention of babysitting my screen. I treated my CX as I would any other PC display and it's got zero burn-in to this day. Meanwhile the Alienware QD-OLED from RTINGS has started to develop burn-in after a measly 2400 hours.

[Attached image: RTINGS Alienware QD-OLED showing burn-in at 2400 hours]
 
I've realized that OLED, whether WOLED or QD, causes me eye strain. I can't really figure out why.

My C2 did, the AW3423DW did, and so did my friend's 27" Asus that I demo'd for a day. Within like 1-2 hours my eyes just feel tired, whereas I can go 8 hours straight on a DC-dimmed LCD without issue.
 
I am the exact opposite lol, most screens tire my eyes easily but the CX at home does not and feels by far the best after hours and hours behind it.
 
I've realized that OLED, whether WOLED or QD, causes me eye strain. I can't really figure out why.

My C2 did, the AW3423DW did, and so did my friend's 27" Asus that I demo'd for a day. Within like 1-2 hours my eyes just feel tired, whereas I can go 8 hours straight on a DC-dimmed LCD without issue.

I think it has to do with the peak brightness these panels have. The bright whites give me a headache and tire my eyes a lot.

In just about every situation except films and games I prefer SDR with the brightness turned way down.

These OLED's are just too damn bright!

(No one tell kramnelis :p )
 
I think it has to do with the peak brightness these panels have. The bright whites give me a headache and tire my eyes a lot.

In just about every situation except films and games I prefer SDR with the brightness turned way down.

These OLED's are just too damn bright!

(No one tell kramnelis :p )

Yeah...still running my CX at OLED Light 0 plus 120Hz BFI High for office work, which seems plenty bright to my eyes.
 
I think it has to do with the peak brightness these panels have. The bright whites give me a headache and tire my eyes a lot.

In just about every situation except films and games I prefer SDR with the brightness turned way down.

These OLED's are just too damn bright!

(No one tell kramnelis :p )
My PG32UQX was nearly 9x brighter (1200 nits full field vs 140) and I never once experienced eye strain with it.

I think it might be that PWM-like brightness dip that all of these OLEDs have in common. Maybe I'm just sensitive to it.
 
I've been going back and forth on this. The Windows UI looks much better with grayscale font AA, but in Word it makes all the fonts look really light, making it difficult to pick up markup colors when reviewing team edited documents. I've been switching back and forth between the grayscale and RGB for this reason, and I haven't decided where to leave it.

I know it wouldn't be perfect, as there is some inherent lack of smoothness in fonts due to the pixel layout, but it would be nice if we got specially designed cleartype settings for this subpixel layout as well.
Have you tried MacType for Windows from www.mactype.net?

Don't forget to do the Chrome tweak, since it tries to insist on using its own built-in font rendering.

Well, I have been following RTINGS' (I always pronounce it "Arr-Tings") long term burn-in tests.
[...]
So yeah, I was concerned about burn-in, but I think I've gotten to the point where I really no longer am. OLEDs have improved. (I mean, my phone has an OLED screen, and I've been using it for 2.5 years without burn-in.) I still take precautions (no background image, self-hiding menu bars, no icons on the desktop), but I'm not even sure that is necessary.

Check out the LCD images in RTINGS' burn-in/aging tests: MONTH 00 versus MONTH 08.

For example, the Insignia LCD TV. And there are a number of panels. I was surprised that the newest fabs of LG WOLED wear less badly than some of the cheap LCDs (!!)

Usually, slight aging improves LCD due to liquid reflow, but some of the uniformities have become noticeably worse.

It appears that the "aging degradation" Venn diagram now finally overlaps between (worst) LCD and (best) OLED.
 
HDR can look great when well implemented in games and films, but on the desktop? I don't understand how anyone ever leaves it on. I feel the need to switch HDR off every time I exit a game.
I only use Windows 10, but it gives an option to set the desktop (SDR content) brightness while HDR is on. As I mentioned, it's worth keeping in mind that LG OLEDs have aggressive brightness-limiting features that kick in shortly after the image stops changing enough, so SDR is usually a better experience for desktop use.

And I understand why they leave it on. They're sticking it to OLED-gang by staring at 2000 nits for 8 hours straight. If they feel like they need to burn their eyes out to get full usage of their monitor, I say go for it!
 
Yea, you're supposed to use the Windows HDR brightness setting if you're gonna run HDR on the desktop. With that slider set to 0 I get exactly the same brightness as SDR with OLED Light 20 (80 nits on my CX), but HDR content can still use the full range if needed. This slider ONLY affects SDR content in HDR mode. It sounds like Zarathustra[H] may have missed that part?

That's the whole point of Windows HDR: to let us see both SDR and HDR as intended and never have to fiddle with settings. (It's far from perfect, but it has gotten better.)
 
Yea, you're supposed to use the Windows HDR brightness setting if you're gonna run HDR on the desktop. With that slider set to 0 I get exactly the same brightness as SDR with OLED Light 20 (80 nits on my CX), but HDR content can still use the full range if needed. This slider ONLY affects SDR content in HDR mode. It sounds like Zarathustra[H] may have missed that part?

That's the whole point of Windows HDR: to let us see both SDR and HDR as intended and never have to fiddle with settings. (It's far from perfect, but it has gotten better.)

That's true. I always leave HDR on with the slider set to "0", but you have to tweak the black level with black stabilizer / fine tune dark areas, because the black levels are raised when you enable HDR for desktop usage. To match SDR with gamma 2.2 and OLED Light 20 on my C1, I have set black stabilizer to 9 and fine tune dark areas to -13: http://www.lagom.nl/lcd-test/black.php

When I'm gaming (always in HDR) I simply change the game preset within the game optimizer menu on my LG OLED.
 
Your display at 18% is about 160 nits if we follow the measured levels from TFTCentral; even at 0% it's almost 100 nits lol


https://i.imgur.com/fiV2ivl.png


There still must be some unit-to-unit variation or firmware changes, because according to TFTCentral, my new LG C3 is about 120 nits at a brightness setting of 41 in SDR mode, but when brightness-matching the monitors on a white screen, I lowered the C3's brightness until it approximately matched, and that took me down to somewhere around 20-25.

So one of them or the other is wrong :p
 
There still must be some unit-to-unit variation or firmware changes, because according to TFTCentral, my new LG C3 is about 120 nits at a brightness setting of 41 in SDR mode, but when brightness-matching the monitors on a white screen, I lowered the C3's brightness until it approximately matched, and that took me down to somewhere around 20-25.

So one of them or the other is wrong :p
Both of them may be correct; it's easy to miss some nuances nowadays given how complex displays have become in the HDR era:

- ABL on/off setting;
- HDR on/off setting;
- VRR on/off setting (it can affect nits ever so slightly on some panels);
- Window size (1%, 5%, 10%, etc) of the nit testing pattern;
- Whether HDR white or SDR white is used;
- Whether there are non-black pixels outside the window size;
- Firmware revisions that might adjust thresholds for specific window sizes for specific settings;
- Ambient light sensor in the display that auto-adjusts brightness thresholds (sometimes hard to disable on certain displays);
- Even how long the test pattern was displayed (OLED displays can dial back peak HDR brightness after a few seconds of static HDR pixels, e.g. photo HDR instead of video HDR)

CRTs and plasmas behaved similarly with their APL/ABL behaviors.

You can turn off HDR and turn off ABL behaviors, to get more linear color behavior, but you lose all the nice HDR peaking behavior you want in gaming.

It's such a shockingly big rabbit hole -- just a different rabbit hole than things like LCD GtG and LCD blacks.
 
Damn, I run mine with HDR brightness at 100 in the TV menu and the SDR slider set to 4 for the Win 11 HDR desktop. I don't touch black level or fine tune dark areas at all. Looks perfect to me. Though I did use CRU to set the HDR max luminance to 128 so it reflects 800 nits correctly.
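In case anyone wonders where 128 comes from: as far as I understand it, CRU's max luminance field is the CTA-861 HDR static metadata code value, which maps to nits as 50 × 2^(CV/32). Quick sketch:

```python
import math

# Sketch, assuming CRU's "max luminance" field is the CTA-861 HDR static
# metadata code value (CV), which maps to nits as 50 * 2^(CV / 32).
def max_luminance_nits(code_value: int) -> float:
    """HDR metadata code value -> peak luminance in cd/m² (nits)."""
    return 50.0 * 2.0 ** (code_value / 32.0)

def code_value_for_nits(nits: float) -> int:
    """Inverse mapping: the code value closest to a target nit level."""
    return round(32 * math.log2(nits / 50.0))

print(max_luminance_nits(128))   # 800.0 -> why a value of 128 reads back as 800 nits
print(code_value_for_nits(800))  # 128
```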
 
Damn, I run mine with HDR brightness at 100 in the TV menu and the SDR slider set to 4 for the Win 11 HDR desktop. I don't touch black level or fine tune dark areas at all. Looks perfect to me. Though I did use CRU to set the HDR max luminance to 128 so it reflects 800 nits correctly.

That works great in Windows 11 I think, but I spend most of my desktop time in Linux, which as of yet has no HDR support that I am aware of. At least outside of some specialized video decode codecs.
 
I purchased the Alienware QD-OLED AW3423DW last week from Best Buy while it was on sale. I use it mainly for work and some gaming. So far with cleartype, the text isn't that bad. It just looks a bit blurry to me. I may try that MacType. After 20 hours, no burn in. :)

I hope it lasts. The faster refresh compared to my older IPS (75hz) and reduction of blurring with fast moving scenes makes it worth it in my opinion. I may change my mind by 2400 hours if/when burn in appears.
 
I purchased the Alienware QD-OLED AW3423DW last week from Best Buy while it was on sale. I use it mainly for work and some gaming. So far with cleartype, the text isn't that bad. It just looks a bit blurry to me. I may try that MacType. After 20 hours, no burn in. :)

I hope it lasts. The faster refresh compared to my older IPS (75hz) and reduction of blurring with fast moving scenes makes it worth it in my opinion. I may change my mind by 2400 hours if/when burn in appears.

I've got 60 hours on my 42" LG C3 thus far. So far so good.

But honestly, even on the old "bad" OLED panels that were prone to burn-in, judging by RTINGS' long term burn-in tests I wouldn't expect to see anything until you get into the thousands of hours.

Think not week two, but year 2-3. That's when we find out if we made a mistake or not :p

So time will tell.
 
I purchased the Alienware QD-OLED AW3423DW last week from Best Buy while it was on sale. I use it mainly for work and some gaming. So far with cleartype, the text isn't that bad. It just looks a bit blurry to me. I may try that MacType. After 20 hours, no burn in. :)

I hope it lasts. The faster refresh compared to my older IPS (75hz) and reduction of blurring with fast moving scenes makes it worth it in my opinion. I may change my mind by 2400 hours if/when burn in appears.

https://www.reddit.com/r/Monitors/c.../?utm_source=share&utm_medium=web2x&context=3
 
And though HDR is neat, I do wonder if there would have been such a clamor for it if we had never abandoned displays with high dynamic range in the first place.
When did we have displays with HDR? I don't ever recall any home tech with HDR capabilities. CRT certainly didn't have it. One of the major issues with CRT was brightness. The brighter you pushed it, the more blooming you got and the less saturation you could get out of phosphors. That's why the sRGB standard got set at 80 nits and a dim room: not because that was something we thought was awesome, but for display quality.

HDR isn't about a perfect black point; it is about more range, with brightness being an important part. Also, black point was, in my experience, overstated on CRTs. As they operated, their black point would rise, and many had various features that pulled it up some anyhow to deal with burn-in. My Electron22BlueIV, the last CRT I had and the first one where I owned a calibrator, measured something like a 900:1 contrast ratio.

Likewise, the color gamut increase, which, while not technically required for HDR, kind of comes part and parcel with it, is something we couldn't really have with CRTs. The reason the BT.709/sRGB primaries got set where they were was that that was about what we could get phosphors to actually do at a reasonable brightness. The original NTSC 1953 gamut called for wider colors, but it just wasn't achievable.

What older display tech did we have that did HDR?
 
HDR isn't about a perfect black point; it is about more range, with brightness being an important part. Also, black point was, in my experience, overstated on CRTs. As they operated, their black point would rise, and many had various features that pulled it up some anyhow to deal with burn-in. My Electron22BlueIV, the last CRT I had and the first one where I owned a calibrator, measured something like a 900:1 contrast ratio.

Agreed. To me good HDR has to do with the ratio between the brightest and dimmest part of the screen, not the absolute brightness or the absolute darkest blacks. Your eyes will adjust to absolute levels.

Measure the light emission at the brightest point (with a 5 or 10% window) and divide by the light emission at the darkest point, and that should give you a representation of the quality of HDR.

(Of course, with an OLED you might be dividing by zero....)
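In code terms, the metric I have in mind is roughly this (the numbers are made up for illustration, and a real test needs an actual luminance meter):

```python
# Rough sketch of the contrast metric described above: peak luminance of a
# small (5-10%) white window divided by the luminance of the darkest point.
def hdr_contrast_ratio(peak_window_nits: float, black_nits: float) -> float:
    if black_nits <= 0:
        return float("inf")   # the OLED case: dividing by a true-zero black
    return peak_window_nits / black_nits

# Hypothetical numbers, purely illustrative:
print(hdr_contrast_ratio(800.0, 0.08))   # a decent FALD LCD: 10000:1
print(hdr_contrast_ratio(800.0, 0.0))    # OLED: inf ("infinite" contrast)
```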
 
Agreed. To me good HDR has to do with the ratio between the brightest and dimmest part of the screen, not the absolute brightness or the absolute darkest blacks. Your eyes will adjust to absolute levels.

Measure the light emission at the brightest point (with a 5 or 10% window) and divide by the light emission at the darkest point, and that should give you a representation of the quality of HDR.

(Of course, with an OLED you might be dividing by zero....)
It is about ratio, but getting a higher max level matters too. The thing is, we are only so sensitive to low-light changes, particularly in the presence of brighter light and if we want color (cones don't function much, if at all, at low light levels). So we can't just make a display like OLED that goes to 0 black, give it low brightness, and declare it "HDR" because we have a lot of driving levels in between. I could design a display that does 12-bit levels between 0 and 1 nit, but it would still look dark and uninteresting, and most of those changes would be unnoticeable. Really, I would argue a more useful contrast measure for actual content would be the difference between the first signal level that produces a noticeable change and the maximum output.

So to get high dynamic range we need more light output too. We need to push that peak up so that the perceptible difference is more.

We also like it better as it gets brighter. The whole thing of the 10,000-nit target maximum and the PQ EOTF was actually done because of research by Dolby. They did tests with people and discovered that most observers liked darker black points but also higher bright points, and then researched the minimum change in luminance people could perceive at various levels.
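For reference, this is roughly what that PQ curve (the SMPTE ST 2084 EOTF) does; the constants are the published ones, but treat the sketch as illustrative rather than anything calibration-grade:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF: maps a normalized 0-1 signal value to
# absolute luminance on a 0-10,000 nit scale, spending more code values where
# the eye is most sensitive (the dark end).
m1 = 2610 / 16384            # 0.1593017578125
m2 = 2523 / 4096 * 128       # 78.84375
c1 = 3424 / 4096             # 0.8359375
c2 = 2413 / 4096 * 32        # 18.8515625
c3 = 2392 / 4096 * 32        # 18.6875

def pq_eotf(signal: float) -> float:
    """PQ signal (0..1) -> luminance in nits (0..10000)."""
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(0.0))   # 0 nits
print(pq_eotf(0.5))   # ~92 nits: half the signal range sits below ~100 nits
print(pq_eotf(1.0))   # 10000 nits
```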

Also also, our eyes are tricky and adapt to what we see, as you note, and can get fatigued, cones can get over saturated, and so on, so the impact of something bright is different if it is sustained vs only a short time. That is one of the reasons why HDR content is generally mastered such that most of the scene is in SDR brightness ranges, with only parts popping higher. If the whole thing is bright all the time, it loses its impact, the impact comes from the small areas of brightness. The other reasons are SDR compatibility, and simple display capabilities, of course.

Basically all to say that you don't get to be "HDR" just because you can claim a theoretical 0 black point or "infinite" contrast. The brightness matters too, and that's what makes consumer HDR displays new. It's also why good MiniLED can look really impressive. Even if the measured contrast isn't "infinite" like OLED and is "only" 500,000:1 or something, its near-zero black and higher brightness can add up to a more impactful HDR experience in some cases.

It isn't about theoretical numbers, it is about human perception.
 
It is about ratio, but getting a higher max level matters too. [...]

Basically all to say that you don't get to be "HDR" just because you can claim a theoretical 0 black point or "infinite" contrast. The brightness matters too, and that's what makes consumer HDR displays new. It's also why good MiniLED can look really impressive. Even if the measured contrast isn't "infinite" like OLED and is "only" 500,000:1 or something, its near-zero black and higher brightness can add up to a more impactful HDR experience in some cases.

Good reply. Have to say that FALD still raises blacks around lit areas in a scene though. It's non-uniform, using a tetris-like brickwork of lit shapes, often more than one cell wide like a short gradient, in order to avoid more overt blooming (blending it across zones). Larger dark areas versus larger bright areas, like testing contrast ratings with big white and black tiles or full-screen patterns, will show the big contrast numbers, but mixed-contrast areas and the areas around their edges drop back to around 3000:1 to 5000:1, since the brighter zones end up lifting the blacks/darks and/or dimming the brights (and can lose some detail on either end).

It also doesn't help that most if not all FALD screens have a matte abraded outer layer, which will raise the blacks to more like grey-blacks when any ambient lighting hits it. That happens with matte coatings on any tech, but at least there are some OLEDs available in glossy.

FALD tech does a good job considering the number of zones it has to work with; I'm not saying it's crap or anything. Both FALD and OLED use a variety of tricks/hacks to squeeze the most out of the limitations of their current tech.
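To put rough numbers on the lifted-blacks point, here's an illustrative sketch (hypothetical panel figures, not measurements from any particular display): inside a zone the backlight has driven bright, local contrast falls back to roughly the LCD panel's native ratio.

```python
# Illustrative sketch (not measured data): inside a FALD backlight zone that is
# driven bright, local contrast is limited by the LCD panel's native contrast,
# so blacks sitting next to a highlight get lifted toward grey.
def local_black_nits(zone_peak_nits: float, native_contrast: float) -> float:
    """Black level achievable inside a zone whose backlight is at zone_peak_nits."""
    return zone_peak_nits / native_contrast

# Hypothetical numbers: a VA-ish 4000:1 panel with one zone pushed to 1000 nits
print(local_black_nits(1000.0, 4000.0))   # 0.25 nits next to the highlight
print(local_black_nits(1.0, 4000.0))      # 0.00025 nits in a fully dimmed zone
```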
 
My statement was in regard to the transition from CRT, with full on/off contrast ratios (dynamic range) in excess of 15,000:1, to LCDs with a small fraction of that. HDR with its further expanded color and contrast is great, but it will not be as impressive to someone transitioning from another great technology.

When my TV broke last year, my GDM-F520 became my TV for a bit as I was figuring out what to do. Even though on paper it has a fraction of today's displays' color and contrast, it could still produce a spectacular picture. One that already looked closer to HDR than, say, an LCD without FALD anyway...
 