Why OLED for PC use?

Yes, but I was wishlisting one that would work globally in the gaming TV's OSD no matter what the source is (external media players, consoles, TV OS apps, etc.), and even if it's HDR material. Thanks though, I'll check into that; it sounds useful for some Windows-based material.
 
RTINGS: Longevity Burn-In Investigative Paths After 3 Months QD-OLED vs. WOLED, LG vs. Sony, And More

s95b-burn-in-large.jpg


What about PC users; should they avoid using QD-OLED displays? These results don't look good for computer users. A computer's user interface often has large white areas, even if you're using your computer's Dark Mode feature, and those areas are likely to cause burn-in. There are steps you can take to reduce it, though, and as long as you mix up your usage, you probably won't have any issues. It's also unclear if the QD-OLED panels used for computer monitors perform the same. They use different compensation cycles than the TV versions, and this could play an important role in reducing image retention or preventing burn-in. We're looking into possibly adding a QD-OLED monitor like the Dell Alienware AW3423DW or AW3423DWF or the Samsung OLED G8 to the test temporarily to see how they perform. Let us know if this is something you're interested in or if you have any ideas for why these QD-OLED panels might perform differently when used as a TV compared to the Monitor implementations.
Interesting findings by Rtings.com about OLED burn-in. As always, I will strictly use my LG C1 OLED TV for gaming and media consumption only. If I ever consider an OLED monitor, it would be for a dedicated gaming PC and I would go W-OLED over QD-OLED.

This has got me thinking: MicroLED panel monitors can't come soon enough. Hopefully that will come with a more standard pixel layout so text fringing wouldn't be an issue with it.
 
Last edited:
RTINGS: Longevity Burn-In Investigative Paths After 3 Months QD-OLED vs. WOLED, LG vs. Sony, And More


Interesting findings by Rtings.com about OLED burn-in. As always, I will strictly use my LG C1 OLED TV for gaming and media consumption only. If I ever consider an OLED monitor, it would be for a dedicated gaming PC and I would go W-OLED over QD-OLED.

This has got me thinking: MicroLED panel monitors can't come soon enough. Hopefully that will come with a more standard pixel layout so text fringing wouldn't be an issue with it.


. .

Saw this in one of the YouTube comments (yes... I know) of a video regarding the S95B burn-in. Again, it's a YouTube comment, so it is probably pulled right from someone's ass, but maybe it has some truth to it:
"Regarding the burn in on S95B and A95K, 1st gen QD-OLED uses Hydrogen (Protium) while LG’s EVO WOLEDs use Deuterium which is much more stable, the new 2nd Gen QD-OLED now also has Deuterium which along with the other improvements contributes to the 2x longer life vs the 1st gen."
...
I don't know which gen RTINGS was using, but if Gatecrasher's quote is true, that might make a difference vs. current QD-OLEDs.

OLED TVs, at least LG's, from what I've read reserve the top 25% of their brightness/energize range for a wear-evening routine that runs periodically, including a more aggressive one that burns all the emitters down evenly every so many thousand hours. Then it boosts everything back up to normal again. If you abuse your screen foolishly you'll burn through that buffer sooner. I don't know how badly RTINGS was abusing theirs and with what settings; I'll have to read up on that later. AFAIK there is no way to know how much "charge is left in the battery," so to speak, in regard to that burn-down buffer.

Personally I do use my LG oled as a media and gaming "stage", with side screens for 2d desktop/apps.

This has got me thinking: MicroLED panel monitors can't come soon enough. Hopefully that will come with a more standard pixel layout so text fringing wouldn't be an issue with it.

Once we move to 8k screens in the long run, we won't have to mask and smudge edges as aggressively as we must now to hide how bad the PPD and perceived pixel sizes actually are in the first place.

These methods of text sub-sampling and graphics AA are just blurring and smudging how blocky the pixels and their edges are, how low the density of the pixel grid is. Display tech on larger screens (vs. phones, for example) has been so bad for so long that we just accept it as a given that we have to try to hide how bad the resolution actually is... but it's a temporary solution until tech advances to high enough resolution that this wouldn't be necessary. Similarly, FALD is an even lower resolution for the lighting, with large ~7,000-pixel cells at 4k (roughly 28k worth of pixels per cell at 8k I think, since 8k is like four 4k screens). Eventually we should get resolutions and PPD high enough that we won't need to rely on edge-smudging hacks anymore, and everything will go per-pixel emissive with MicroLED, so we won't have a whole segment using bucket lighting anymore either.
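Quick back-of-envelope sanity check on those zone sizes, assuming a ~45x25 (about 1,125-zone) FALD grid like the one mentioned later in the thread -- actual zone counts vary by model, so treat these as rough figures only:

```python
# Rough pixels-per-FALD-zone arithmetic (the zone count is an assumption; it varies by model).
ZONES = 45 * 25  # ~1,125 local dimming zones

for name, (w, h) in {"4k": (3840, 2160), "8k": (7680, 4320)}.items():
    pixels_per_zone = (w * h) / ZONES
    print(f"{name}: {w*h:,} pixels / {ZONES} zones -> ~{pixels_per_zone:,.0f} pixels per zone")

# 4k: ~7,373 pixels per zone; 8k: ~29,491 -- in line with the "~7,000" and "~28k" figures above.
```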

Text sub-sampling is made for RGB. In the meantime, someone could theoretically develop a WRGB and a PenTile text sub-sampling method. Perhaps they could apply the 2D desktop to a 3D cube in a game engine or something so that game AA types could be available too.

Text sub-sampling does nothing for the 2D desktop's graphics and imagery, so the poor PPD remains unmasked/unsmudged there even with subpixel-layout-appropriate text-ss applied.

Also, a big reason (pun intended) that non-standard subpixel layouts vs. text are a more vocal complaint is that people are shoehorning 42", 48", and 55" gaming TVs directly onto their desks instead of using a simple TV stand, wall mount, or other mount to get a little more distance, into the 60 to 50 degree human viewing angle where they'd get 64 to 77 PPD, respectively.

The text rendering/anti-aliasing is masking how low your PPD really is by smudging the edges.

1400p-to-1500p-like perceived pixel sizes at sub-60 PPD (e.g. ~50 PPD) on a larger 4k screen viewed too near will look fringed regardless, but WRGB will make it look even worse because the subpixel smudging isn't aligned to it.

Once you drop below 60 PPD, though, text sub-sampling and graphics anti-aliasing can't fully compensate anymore even on RGB.

In addition, 2D desktop graphics and imagery have no sub-sampling or AA to smooth/smudge/blur the edges, so they remain uncompensated, with no masking.

At the optimal 50 to 60 degree human viewing angle you get 64 to 77 PPD on a 4k screen (77 PPD at 50 degrees, 64 PPD at 60 degrees).

It's not some crazy distance relative to screen size. It's the optimal human viewing angle, and it's the same viewing angle and PPD regardless of the (4k) screen size. It would have the same pixels per degree and perceived pixel size as a 27 inch 4k at around a 23 inch view distance (~70 PPD). The higher the PPD, the less you have to lean on masking how large the pixel structure appears.
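If anyone wants to check these numbers for their own setup, here's a minimal calculator using the same simple pixels-per-horizontal-degree approximation as the figures in this thread (the example desk distances below are just illustrative assumptions):

```python
import math

def ppd(diagonal_in, distance_in, h_pixels=3840, v_pixels=2160):
    """Approximate horizontal viewing angle (degrees) and pixels per degree
    for a flat 16:9 screen viewed on-axis at the given distance."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)   # horizontal screen width
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return angle_deg, h_pixels / angle_deg

for diag, dist in [(27, 23), (42, 24), (42, 32), (48, 36)]:
    angle, density = ppd(diag, dist)
    print(f'{diag}" 4k at {dist}" -> {angle:.0f} deg viewing angle, ~{density:.0f} PPD')

# 27" at 23" lands near ~70 PPD; a 42" parked ~24" away on a desk is only ~51 PPD,
# while ~32" gets it back up to roughly 64 PPD (about the 60 degree viewing angle).
```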

Low PPD exacerbates fringing issues in general: high-contrast graphics aliasing, (RGB subpixel format) text fringing, frame-insertion edge artifacts, DLSS AI-upscaling edge artifacts, and non-standard pixel structures like PenTile and WRGB. It's probably not a coincidence that the most vocal complaints about WRGB text often come from people cramming a 42", 48", or 55" OLED into a near, on-the-desk setup that results in closer to 50 PPD than 70+ PPD. Bigger perceived pixels, bigger problems.
 
Last edited:
OLED is the biggest trade-off for PC use. You can really only buy OLED for SDR, yet you still have to be afraid of burn-in regardless.
You active at AVSForum?
Folks there really need your help, experience & knowledge calibrating their OLEDs.
 
RTINGS: Longevity Burn-In Investigative Paths After 3 Months QD-OLED vs. WOLED, LG vs. Sony, And More

View attachment 555874


Interesting findings by Rtings.com about OLED burn-in. As always, I will strictly use my LG C1 OLED TV for gaming and media consumption only. If I ever consider an OLED monitor, it would be for a dedicated gaming PC and I would go W-OLED over QD-OLED.

This has got me thinking: MicroLED panel monitors can't come soon enough. Hopefully that will come with a more standard pixel layout so text fringing wouldn't be an issue with it.
This is also a pretty extreme setup. About 16 years ago, one of my duties was managing a system for ads on a local shopping mall's TVs, and one of the ads managed to cause burn-in on one of the LCD displays because it had a similar white and bright red banner on it and came up regularly enough in the short cycle of ads shown.

Some YouTubers managed to get burn-in on their LG OLEDs in 6 months because they did not bother adapting their usage at all. Meanwhile mine is 2.5 years old, with 2 years of 8+ hour desktop usage almost every day of the week. No burn in on that one.

I would not expect the QD-OLEDs to have significant issues for burn-in as long as you make some effort in mitigation. The QD-OLED monitors are most likely also more aggressive in their pixel refresh cycles which should help. I guess we will see in a year or two how the current AW etc models fare.
 
You active at AVSForum?
Folks there really need your help, experience & knowledge calibrating their OLEDs.
They don't need help calibrating OLED because there's not much that can be done with it.

To calibrate a display, it needs to be over-spec. OLED can be over-spec for sRGB 80 nits, allowing the range to be reduced, but it's not possible to magically calibrate Rec.2020 1000 nits out of a 400-nit panel. The range can only be reduced to prevent clipping and achieve a "calibrated" HDR 300.
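A loose illustration of that "reduce the range to prevent clipping" point -- this is a generic soft-clip sketch, not any vendor's actual tone mapping, and the knee and peak values are arbitrary assumptions:

```python
def roll_off(nits_in, panel_peak=400.0, knee=300.0):
    """Map mastered luminance onto a limited panel: pass through below the knee,
    then compress everything above it into the remaining headroom instead of clipping."""
    if nits_in <= knee:
        return nits_in
    headroom = panel_peak - knee                      # e.g. 100 nits of compression room
    excess = nits_in - knee
    # Simple asymptotic compression: excess is squeezed toward (but never past) panel_peak.
    return knee + headroom * (excess / (excess + headroom))

for target in (100, 300, 600, 1000, 4000):
    print(f"{target:>5} nit target -> {roll_off(target):6.1f} nits on a 400-nit panel")
# Detail above ~300 nits is preserved as gradation rather than a clipped flat white,
# but nothing can recreate the missing 1000-nit headroom -- hence "calibrated HDR 300".
```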
 
This is also a pretty extreme setup. About 16 years ago as one of my duties I was managing a system for ads on a local shopping mall's TVs and one of the ads managed to cause burn in on one of the LCD displays because it had a similar white and bright red banner on it and came up regularly enough in the short cycle of ads shown.

Some YouTubers managed to get burn-in on their LG OLEDs in 6 months because they did not bother adapting their usage at all. Meanwhile mine is 2.5 years old, with 2 years of 8+ hour desktop usage almost every day of the week. No burn in on that one.

I would not expect the QD-OLEDs to have significant issues for burn-in as long as you make some effort in mitigation. The QD-OLED monitors are most likely also more aggressive in their pixel refresh cycles which should help. I guess we will see in a year or two how the current AW etc models fare.

People already have burn in on the Alienwares.
 
Told y'all so.

That's why newer LG OLEDs are ready for office desktop use nowadays. RTINGS noticed how massively LG improved their burn-in resistance compared to their early OLEDs.

And that's why I run Visual Studio and office workloads on my OLED.

Edit to add note: You can get good longevity out of a Samsung QD-OLED, but that requires more manual mitigation steps, since they are brighter, which may be part of the cause of the faster wear and tear. So you need to do more intentional mitigations with a Samsung panel, while LG's mitigations are much more automatic and frequent. Samsung is still new to the large-consumer OLED game and will catch up to LG longevity soon. I love how bright Samsung looks, but LG OLED arrived at office computing before Samsung did.
 
Last edited:
This has got me thinking: MicroLED panel monitors can't come soon enough. Hopefully that will come with a more standard pixel layout so text fringing wouldn't be an issue with it
Unfortunately, MicroLED burns in too (and faster than some OLEDs). Some burn in slower, but not all of them.

Even MiniLED and jumbotrons too -- anything utilizing *LED emissive tech (OLED, MiniLED, MicroLED). You've seen those few-year-old Daktronics LED jumbotrons running at sunlight-bright settings -- those jumbotron billboards often have burn-in after a while if they're showing the same static ads in rotation for over a year at maximum sunlight-fighting brightness on the side of freeways. The better advertisers are creative about it to extend their expensive investments in LED billboards. Some new jumbotron systems also have pixel refresh systems now to reduce this (not too different from, say, an LG OLED), where their equivalent pixel refresh process recalculates power output to each pixel based on that specific pixel's wear and tear. So do many OLED panels, especially with the years of experience LG has with their particular OLEDs.

In many pixel refresh systems, a powerful FPGA/ASIC is like an automated per-pixel electrician that readjusts voltage and current to each pixel to push more power to a more worn-down pixel, to maintain the same light output as less-used pixels. Often it's just 1% or 2% more power to a pixel to make a dimmed lightbulb shine as bright as a new lightbulb, so you have a lot of headroom (say, 25% is healthy).
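A toy model of that per-pixel compensation idea -- purely illustrative, not any vendor's actual Pixel Refresh algorithm; the wear figures and the 25% headroom budget are just the assumptions from the paragraph above:

```python
HEADROOM = 0.25  # reserved drive budget per emitter (illustrative assumption)

def compensated_drive(target_level, wear_fraction):
    """Boost a worn emitter's drive so its light output matches an unworn one.
    wear_fraction is the estimated efficiency lost so far (0.0 = new, 0.02 = 2% dimmer)."""
    boost = target_level / (1.0 - wear_fraction)          # more power for the same lumens
    max_drive = target_level * (1.0 + HEADROOM)           # can't exceed the reserved budget
    return min(boost, max_drive)

# A pixel that has lost 2% efficiency needs ~2% more drive to look identical to its neighbours:
print(compensated_drive(100.0, 0.02))   # ~102.0
print(compensated_drive(100.0, 0.30))   # capped at 125.0 -- buffer exhausted, burn-in becomes visible
```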

For anything emissive *LED, all LED lightbulbs have finite lifetime -- regardless of whether they are in a light fixture or a microdisplay subpixel.

It will be ironic if an OLED-avoiding person buys the first-ever small 55" all-MicroLED TV (full every-pixel dimming, no LCD layer), and it burns in faster than a mature recent LG OLED panel... Hopefully they engineer it to last at least 3 years without any faint burn-in.
 
Last edited:
Unfortunately, MicroLED burns in too (and faster than some OLEDs). Some do burn-in slower, but not all of them.

Even MiniLED too and Jumbotrons too. Anything utilizing *LED emissive tech (OLED, MiniLED, MicroLED). You've seen those few-year-old Daktronics LED jumbotrons running at sunlight-bright settings -- those jumbotron billboards often have burn-in after a while if they're showing the same static ads in rotation for over a year at maximum sunlight-fighting brightness on the side of freeways. The better advertisers are creative about it to extend their expensive investments in LED billboards. They have some fantastic pixel refresh systems now though to reduce this (not too different from, say, an LG OLED), where their equivalent pixel refresh process re-calculates power output to each pixel based on a specific pixel's wear and tear. So do many OLED panels, especially with years of experience LG has with their particular OLEDs.

For anything emissive *LED, all LED lightbulbs have finite lifetime -- regardless of whether they are in a light fixture or a microdisplay subpixel.

It will be ironic if an OLED-avoiding person buys the first-ever small 55" all-MicroLED TV (full every-pixel dimming, no LCD layer), and it burns in faster than a mature recent LG OLED panel... Hopefully they engineer it to last at least 3 years without any faint burn-in.
I was actually going to ask this… People keep saying MicroLED won't burn in, but aren't ALL emissive displays at risk? Ironically enough, I've used my CRTs a lot. And when kept calibrated and not overly bright (90 or so nits in a light-controlled room) I've never experienced burn-in on them.
 
Told y'all so.

That's why newer LG OLEDs are ready for computer desktop use nowadays. RTINGS noticed how massively LG improved in their burnin resistance compared to their early OLEDs.

And that's why I use Visual Studio and office usage on my OLED.
I've used my 42" C2 for almost a year now as a PC monitor with ASBL disabled from day one, and surprisingly enough I don't have a hint of burn-in. And that is with about 8-10 hours of work every day at max brightness (I'm a brightness junkie, don't hate me). I don't even bother to hide my taskbar anymore. Honestly, I deserve to have burn-in by now, but still nothing; I just checked it again with slides. My previous OLEDs with the same usage did eventually develop some burn-in, but the C2 seems rock solid (the GX had a panel replacement due to faulty pixels, so I have to count that one out), but the C8 before it developed some burn-in.
 
I was actually going to ask this… People keep saying micro LED won’t burn in, but aren’t ALL emissive displays at risk? Ironically enough, I’ve used my CRT’s a lot. And when kept calibrated and not over bright (90 or so nits in a light controlled room) I’ve never experienced burn in on them.
CRT is a single light source, so it can't burn in the same way.
 
Personally I wouldn't use an OLED for static desktop/apps. I keep mine as its own media/gaming "stage" with side screens for static desktop/apps. Kind of like how everyone has their own workstation screens in Star Trek, but they also have the big main screen to view events. I use the turn-off-the-screen-emitters trick when not viewing content on the OLED. That turns the emitters off without dropping the screen out of the monitor array or affecting anything running on the screen, including audio. I only raise the curtain, so to speak, when I'm actually playing a game or viewing media, and activate the turn-off-the-screen (emitters) feature when going AFK, etc.

Plenty of games have HUDs with a lot of hours on them. The LGs use logo dimming, which helps a bit... but I probably wouldn't play a static, app-like game like the Magic: The Gathering card game 24/7 at high brightness, as that is more like a static desktop app than a more dynamic game.

Still, there are people in the OLED threads who have used their OLEDs for mixed static desktop/app usage with a lot of gaming as an all-around display -- even disabling ASBL in the service menu, etc. -- for 4 years and more currently without burn-in. OLEDs reserve the top 25% of brightness/energize capability for the wear-evening routine, so you shouldn't get burn-in until after that runs out completely. That should last a long time if you aren't foolishly abusing the screen. If you are worried about it that much, you can get the more expensive G-model LG OLEDs (55" minimum, though, I think), which come with a 4-year burn-in warranty, or pony up for the pricey Best Buy warranty on any brand/model OLED, which also covers burn-in for several years. E.g., the 5-year BB warranty on the 42" C2 is ~$210 USD. It's typically more or less around 1/5th of the price of the TV.



So maybe more affordable 8k OLEDs in the future, I hope. Though 8k is 4x the pixels, it still might mean more affordable 55" 8k OLEDs potentially, since a 55" is close in size to four 27" 4k screens. Maybe even a 48" at the figures you linked from Discord. Can hope.

A LOT of what determines OLED longevity in a PC environment is how many hours you use it. If you use it like 2-3 hrs per day, it won't matter what you do; it will be fine for years.

The concerns arise with heavy usage, 10-12 hrs a day, managing workloads on it.
 
Unfortunately, MicroLED burns in too (and faster than some OLEDs). Some do burn-in slower, but not all of them.

Even MiniLED too and Jumbotrons too. Anything utilizing *LED emissive tech (OLED, MiniLED, MicroLED). You've seen those few-year-old Daktronics LED jumbotrons running at sunlight-bright settings -- those jumbotron billboards often have burn-in after a while if they're showing the same static ads in rotation for over a year at maximum sunlight-fighting brightness on the side of freeways. The better advertisers are creative about it to extend their expensive investments in LED billboards. Some new jumbotron systems also have pixel refresh systems now though to reduce this (not too different from, say, an LG OLED), where their equivalent pixel refresh process re-calculates power output to each pixel based on a specific pixel's wear and tear. So do many OLED panels, especially with years of experience LG has with their particular OLEDs.

In many pixel refresh systems, a powerful FPGA/ASIC is like an automated per-pixel electrician that readjusts voltage and current to each pixel to push more power to a more worn-down pixel, to maintain the same light output as less-used pixels. Often it's just 1% or 2% more power to a pixel to make a dimmed lightbulb shine as bright as a new lightbulb, so you have a lot of headroom (say, 25% is healthy).

For anything emissive *LED, all LED lightbulbs have finite lifetime -- regardless of whether they are in a light fixture or a microdisplay subpixel.

It will be ironic if an OLED-avoiding person buys the first-ever small 55" all-MicroLED TV (full every-pixel dimming, no LCD layer), and it burns in faster than a mature recent LG OLED panel... Hopefully they engineer it to last at least 3 years without any faint burn-in.

I stand corrected about MicroLED then. If that's the case, then all the more reason to use the right display technology for the right application. I'll keep on using IPS for my general PC use and OLED for gaming and video consumption. I wanted something that could do it all, and really well, but I will just have to wait for self-emissive panel tech to advance enough to get there.

I think that while all panel technologies are susceptible to burn-in and image retention, they do so to different degrees and at different rates. From what I've read, OLED being organic makes it more prone to burn-in compared to other panel technologies, hence why OLED TV and monitor manufacturers go to such great lengths to mitigate burn-in. I guess OLED vs. LCD burn-in is something worth testing by RTINGS.
 
I stand corrected about MicroLED then. If that's the case, then all the more reason to use the right display technology for the right application. I'll keep on using IPS for my general PC use and OLED for gaming and video consumption. I think, while all panel technologies are susceptible to burn-in and image retention, they do so at different degrees and rates. From what I've read, OLED being organic makes it more prone to burn in compared to other panel technologies, hence why OLED TV and monitor manufacturers go to great lengths to mitigate burn in compared to other panel technologies. I guess OLED vs LCD burn in is something worth testing by RTINGS.
Yolo my man. Just get the best you can afford and let it rip. My plasma has some burn in too. Most of the time you can’t see it. Still looks great and I’m not worried about the kids using it.
 
The concerns arise with heavy usage 10-12hrs a day managing workloads on it.
Um, that's 2018-era news, when RTINGS tested those old B6/B7 era panels.

With latest formulations, it's okay to office with OLED, at least with 2023-era LG OLED formulations and mitigations. They burn in slower than plasma now.

There are people doing overtime programming work on LG OLEDs at 250 h/month with no burn-in over 2 years. I've jumped into the fray.

Samsung and LG pushed the burn-in problem further down the line via various methods such as:
  • Unification of light emission medium, equalizing predictable wear and tear per hour of illumination for any subpixels at any given nit. It is easily wear-mapped into a LUT for Pixel Refresh algorithms.
    ... All Samsung subpixels emit only blue and use QD for red/green
    ... All LG subpixels emit only white light, and use filters for red/green/blue
  • Wear-reducer optimizations
    ... Samsung uses ultra-efficient QD phosphors, given blue OLED formulations are most durable and bright;
    ... LG uses white subpixels, as a wear-per-lumen reducer for color-filtered subpixels (inefficiencies of using filters).
  • Modern Pixel Refresh algorithms which monitor the lifetime of every single pixel and adjust power on a per-pixel basis to compensate (see the sketch right after this list).
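To make the wear-mapping bullet concrete, here's a minimal sketch of how a unified emission medium lets firmware keep a per-pixel wear LUT (illustrative only; the wear-rate constant and the taskbar example are assumptions, not LG or Samsung numbers):

```python
import numpy as np

H, W = 2160, 3840
wear_lut = np.zeros((H, W), dtype=np.float32)   # fraction of emitter life consumed per pixel

# Because every subpixel uses the same emission medium, wear can be modelled as a single
# function of drive level (nits) and lit time -- no per-colour chemistry needed.
WEAR_PER_NIT_HOUR = 1e-7   # illustrative constant, not a real panel figure

def accumulate_wear(frame_nits, hours):
    """frame_nits: HxW array of average drive level over this period."""
    global wear_lut
    wear_lut += frame_nits * hours * WEAR_PER_NIT_HOUR

# Example: a 200-nit static taskbar strip shown 8 hours/day for a year wears its rows faster.
frame = np.full((H, W), 80.0, dtype=np.float32)   # dim desktop background
frame[-60:, :] = 200.0                            # bright static taskbar rows
accumulate_wear(frame, hours=8 * 365)
print("background wear:", wear_lut[0, 0], " taskbar wear:", wear_lut[-1, 0])
# A compensation pass (like the earlier sketch) would read this LUT and boost the taskbar rows.
```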
From brightness tests in the past (HDR window sizes), it appears that LG caps lumens more conservatively than Samsung, which may have contributed to the longer LG longevity being revealed in these brand-new RTINGS tests. Not surprising, although the Samsung and LG routes each rest on their own separate scientific progress;

True, LCDs are more proven for durability, but it's incredible how they have optimized OLED to last much longer than it used to. RGB OLEDs burned in very fast, which necessitated a unification of the light emission medium.

Also, LCD backlights have a limited lifetime too. I also have a BenQ XL2411Z that is down to 100 nits from extreme edgelight wear and tear. I've also seen 7-year-old DELL and HP 1080p office monitors that can't go bright anymore, making them uncomfortable to use under office fluorescent lighting. YMMV.

LCD is obviously lower risk for burn-in, but that doesn't change the fact that there are several OLEDs today that have the longevity for full-time-plus-overtime office desktop use. You kinda still have to cherry-pick (and possibly mitigate on any vendor that has more uncapped brightness), but as a matter of fact, the panels are here today.

I've used my 42" C2 for almost a year now as a PC monitor with ASBL disabled from day one and don't have a hint of burn in surprisingly enough. And that is with about 8-10 hours of work every day with max brightness (I'm a brightness junkie, don't hate me). Don't even bother to hide my task bar anymore.
I've witnessed thousands of anecdotes like yours now; at least with ultra-recent LG panels, at least when running in LG-branded product.

I believe it largely contributed to LG's decision to finally release OLEDs for computer use.

Be noted -- there are two kinds of image retention behaviors for OLEDs, with very different causes. One is the phosphorescence-style effect (the temporary image retention that lasts for moments or a few minutes and clears without needing a Pixel Refresh), and the other is permanent wear-and-tear-based burn-in. People who have seen the phosphorescence-style effects of some OLEDs should not confuse that with wear-based burn-in.

Now that being said, the RTINGS tests show that some OLEDs have poorly optimized Pixel Refresh logic. It's pretty important to have good Pixel Refresh logic in an OLED -- and one that is very user friendly (e.g. pausable/interruptible), where Pixel Refresh resumes at the last pixel it left off the moment a user tries to use the OLED while the display is in the middle of a Pixel Refresh. The algorithms behind Pixel Refresh are pretty incredible, powerful FPGA engineering feats, the equivalent of creating a few million personal electricians babysitting every single OLED subpixel with a voltmeter/multimeter, computing a bunch of Ohm's Law stuff, running through a LUT, and readjusting a variable power supply on a per-subpixel basis... Well, I'm certainly glorifying, but in a roundabout way, that's what Pixel Refresh is.

A bunch of this stuff is just Coles Notes of display journals from SID/DisplayWeek and various sources;
 
Last edited:
Then why are there so many panels documented on Reddit with desktop icons burned in permanently?

Linus' OLED got burned in as well, along with those of his employees.
 
Then why are there so many panels documented on Reddit with desktop icons burned in permanently?

Linus' OLED got burned in as well, along with those of his employees.
Some of his panels did, and some other panels did not.

Watch Linus' video beginning at 12 minutes:

1678781471038.png

Yes, as a cautionary tale, you do have to be careful, do some cherry-picking and settings mitigations. It's becoming easier to abuse an OLED with lower risk, but it's not 100% risk free. Be preemptively conservative and you'll get great longevity on current LG panels now. The fact is, even LTT acknowledges me, if you're a whole-video watcher;

You will notice the pattern (more or less, on average) that the newer the generation of an LG panel, the less likely burn in is after a specific XXXX or XXXXX hours, apples-vs-apples wear and tear (same brightness setting etc). LG really worked hard on reducing burn in.
 
Last edited:
Nah, Linus had a CX IIRC; other workers in his office did as well. Let me try and find the video again.


That's the exact video I linked to.

My screenshot is sometime between the 12 and 13 minute marks.

Start watching the whole video, including beyond the 12 minute mark.

It's a great video, so watch from the beginning all the way to the end. In typical fashion, he'll cherry-pick the worst-worn panels, and then later on in the video LTT acknowledges pretty much what I've said (most of the content beginning at time offset 12:00). In a roundabout way, you'll see LinusTechTips acknowledge exactly what I am saying: he acknowledges there is no burn-in on some other/newer panels, and he mentions it's generally the recent LG panels that have the vast amounts of slow-wear accolades, even showing off many people who have witnessed a lack of wear and tear on OLEDs.

Ask oneself: Clickbait (understandable for views) YouTube titles vs objectively understanding the whole video;

Oh yes, you can definitely permanently burn-in a CX OLED, albiet needs more hours than with a B6/B7 OLED at same brightnesses. Now, the C2 OLEDs (newer) has more longevity than CX. Now it's gotten even harder beyond CX to permanent burn-in an OLED after the same number of hours at the same brightness. There's fuzz factor involved (panel lottery) so you have some variability here. But you can pretty much certainly see what LTT is getting at; if you watch the whole video objectively.

The fact is, even LTT acknowledges me, if you're a whole-video watcher;

Also, I currently execute more mitigations on my OLED than LTT does; I don't use 100% brightness -- not necessary for office usage scenarios with gaming in between. Running bright Windows Desktops for good video editing requires LTT to drive their slightly older OLEDs a bit harder than I do for, say, Visual Studio, CMD, WSL2, SSH, Microsoft Office, various Google Apps, or whatnot.

You will eventually burn in your OLED, given enough time. All displays have wear and tear. That being said, we are already reaching the point where you can approach a full tech-gadget lifecycle with an OLED without reaching objectionable levels of any kind of nonuniformity. (Be noted, LCD can have different kinds of nonuniformities that some people find far worse than an OLED's first few years -- e.g. splotchy blacks from BLB/glow/etc., or a specific edge of the screen that ghosts more than the opposite edge, or weird edgelight unevenness appearing after a few years, or finding you can't brighten the screen as much, etc.)

Just because your desktop OLED gains a still-reasonably-faint 0.5% burn-in nonuniformity after a reasonable time (such as 3 years) does not mean the display is more worthless than a more-splotchy-behaving LCD panel (e.g. various kinds of dark nonuniformities, or ghosting nonuniformities, or backlight nonuniformities). There are all kinds of weirdnesses of the various panel technologies you've got to contend with. For example, on one VA, especially in dark games, the bottom half of the panel ghosts noticeably more than the top half in dark dungeon/cyberpunk-style games after a while, probably because the embedded power supply at the back of the panel heats up and creates VA GtG differences along its temperature gradient (pixels in the cold area have slower GtG, and pixels in the power-supply-heated area have faster GtG pixel response -- LCD GtG is very temperature sensitive). I prefer external power bricks nowadays with the more temperature-sensitive VA LCDs because of that. Not many reviewers even pay attention to GtG-nonuniformity effects like these! You've got different poisons to contend with across the various panel technologies.

Pick your poison from the poison buffet, obviously, but one can't deny that OLED is becoming less and less of a poison for office desktop use.
 
Last edited:
That's the exact video I linked to.

My screenshot is sometime between the 12 and 13 minute marks.

Start watching the whole video, including beyond the 12 minute mark.

It's a great video, so watch from the beginning all the way to the end. In typical fashion, he'll cherrypick the worst-worn panels, and then later on in the video, LTT acknowledge pretty much what I've said (most of the content beginning at time offset 12:00). In a roundabout way, you'll see LinusTechTips acknowledge exactly what I am saying: He acknowledges there is no burn in in some other / newer panels, and he mentions it's generally the recent LG panels that have the vast amounts of slow-wear accolades, even showing off many people who witnessed lack of wear and tear on OLEDs.

Ask oneself: Clickbait (understandable for views) YouTube titles vs objectively understanding the whole video;

Oh yes, you can definitely permanently burn-in a CX OLED, albeit needs more hours than with a B6/B7 OLED at same brightnesses. Now, the C2 OLEDs (newer) has more longevity than CX. Now it's gotten even harder beyond CX to permanent burn-in an OLED after the same number of hours at the same brightness. There's fuzz factor involved (panel lottery) so you have some variability here. But you can pretty much certainly see what LTT is getting at; if you watch the whole video objectively.

The fact is, even LTT acknowledges me, if you're a whole-video watcher;

Also, I execute more mitigations on my OLED than LTT does; I don't use 100% brightness -- not necessary for office usage scenarios with gaming in between;

You will eventually burn in your OLED, given enough time. All displays have wear and tear. Being that said, we are already reaching the point where you can approach full tech-gadget lifecycle with an OLED without reaching objectionable burn in anymore.
As an LG CX 48" owner: the reason Linus has burn-in is that he never bothered to adapt his workflow to try to mitigate it in any way. He can do this because if it burns in, he makes a video of it, and that pays for the TV.

For us normal folk, just doing some mitigation that is a one-time setup will help a lot. Here are the tactics I used:
  • Don't run it super bright in desktop use. I used the same 120 nits I use on any LCD.
  • Hide the taskbar/dock/topbar. You gain some desktop space and lose a thing that doesn't need to be there most of the time, and it pops back up just by putting your mouse at the bottom.
  • Set up a blank screensaver that triggers before display sleep. I used 10 minutes.
  • Set display sleep to shorter time than on a LCD. I used 15 or 20 minutes.
  • Turn the TV off from the remote if you take a longer break like go to lunch or something.
  • Use a black desktop background. I almost never see my desktop anyway. Alternatively use a slideshow with lots of pictures.
  • Hide desktop icons. Again, I don't use the desktop for anything so don't care about these.
The way I see it, LG C9 and up should be relatively safe for burn-in as long as you put in a bit of effort to mitigate it. If this is not OK for you, use an LCD instead.
 
Yep. While I mitigate more than LTT, I don't even mitigate to THAT extent. I'm a 75%-Brightness setting person + automatic display sleep + dark mode + orbit setting. I don't turn off ABL. 75% also just so happens to be the factory default setting on the 240Hz OLED I'm also testing out privately;

C9 and up is correct (when executing mitigations). C2 can be abused somewhat more with fewer mitigations (default settings are now pretty burn-in resistant). Yes, risk not zero, but see my previous post for perspective.

Also, people like me and Linus can generally afford to abuse our OLEDs in the name of science, taking the sacrifice so end users like you can begin to Office + overtime on an OLED tomorrow. Providing valuable data to OLED vendors to further improve longevity of their OLED product...

Although I go straight to calibrated settings (120-nit Photoshop league) for plain old-fashioned Office use, I'm surprised I still can't burn in despite my hours of on-time. But then again, I started off on 2022-era OLEDs that are sometimes running a screensaver or rotating login desktop wallpapers overnight (just because!). I don't manually Sleep my monitor when walking away for a meal, just relying on the auto-dim (of LG's auto power save on static imagery) and auto-screensaver (of Windows), though half of the time I manually Sleep the monitor before bed.

I do unleash maximumness from time to time! I'll certainly, from time to time, unlock the glowy HDR + maximum brightness peaking for a flagship game like Cyberpunk 2077. Or a space game. Or another adventure, to make it sing. But I keep the Windows desktop at pretty eye-comfortable "roughly as bright as my surrounding lighting" settings that are plenty bright and so vastly more eye-ergonomic for office usage. The subpixel rendering may not be IPS quality, but it's fairly mitigatable with the ClearType tuner, and still eye-pleasing. (Subpixel text rendering improvements are coming.) Yes, while you can't get 400 nits for a whole white field, the neons in Cyberpunk 2077 get incredibly bright during nighttime gaming -- far brighter than most non-FALD LCDs for 10%-and-fewer-pixel areas of the panel surface. Starry fields in space games get the royal red-carpet treatment on OLED (ultra-bright 500+ nit pixels can be immediately adjacent to inky black pixels). In many games, it's not like you're going to get large windows of brightness, and you're continually in motion in games, so you can quite easily unleash more brightness when motion is involved, though watch those HUDs.
 
Last edited:
Fuck, I just want something that will last at least 5 years, especially with the pricing and cost of living in Sydney.

Something that will not look like an abomination in picture quality after 2 or 3 years.

The RTINGS torture tests are truly terrifying for OLED and LED.
 
You active at AVSForum?
Folks there really need your help, experience & knowledge calibrating their OLEDs.

I know you were just baiting him but . .

OLED's range and sustained-brightness limitations/burn-in avoidance methods, WRGB's white subpixel polluting the color space, as well as FALD's implementation of a jumbo 45x25 lighting resolution, its overall non-uniformity, and its varying luminance throughout dynamic scenes are hacks. They are very clever stop-gap workarounds on both technologies that provide gains, but they both have major tradeoffs/failings, and they are both inaccurate throughout actual general usage -- especially inaccurate in HDR media and games, due to their respective limitations being exacerbated by HDR's greater extremes in both cases. So you should take the idea of calibration on these screens, as far as it being resolved 1:1 screen-wide in real-world usage, with a grain of salt. You'd have to hobble both of them pretty extremely to approach accuracy across the whole screen in actual usage: a sub-ABL brightness cap on OLEDs at the very least (180-200 nits, a bright SDR range for a dark room, really), and on a FALD screen the dynamic FALD array would have to be disabled to get accuracy, which doesn't even make it a FALD anymore, just a basic screen with bad contrast and black depth.

I'm not saying they both suck, or that one does and the other doesn't. I'm saying they are both inaccurate and have visible issues - but both are very usable for some fun HDR gaming and media playback, considering what clunky workarounds the tech has to resort to in order to provide what good each display tech currently can. There are plenty of reviews to back up those pros and cons of each, but they typically give the top screens of both technologies high marks with caveats. Personally I'm all in on per-pixel emissive, as I think it's the ultimate way to display things. OLED just can't do per-pixel emissive without major tradeoffs, but both display types have major tradeoffs.


I stand corrected about MicroLED then. If that's the case, then all the more reason to use the right display technology for the right application. I'll keep on using IPS for my general PC use and OLED for gaming and video consumption. I wanted something that could do it all and really well but I will just have to wait for self emissive panel tech to advance enough to be able to.

I think, while all panel technologies are susceptible to burn-in and image retention, they do so at different degrees and rates.From what I've read, OLED being organic makes it more prone to burn in compared to other panel technologies, hence why OLED TV and monitor manufacturers go to great lengths to mitigate burn in compared to other panel technologies. I guess OLED vs LCD burn in is something worth testing by RTINGS.

Even the 2,000-nit+ Samsung QD-LED FALD 4k and 8k LCDs have aggressive ABL, probably because they get too hot.

I think panel mfgs might have to start looking into boxier designs with large heat sinks and active cooling methods instead of the ultra slim designs at some point.

Dolby's small HDR 4000 testing rig years ago reportedly had some crazy cooling on it.

I read somewhere that their 42" Pulsar (pictured below), which is 4,000 nits, actually has to use liquid cooling too.

GkH2FEF.png


I guess Sony had a big 8k prototype at CES years ago that could do 10,000 nits and probably required some serious active cooling as well.


Meta's Starburst VR prototype, which supposedly goes to 20,000 nits, also employs a big bulky heatsink with fans.



https://www.auganix.org/hands-on-with-meta-reality-labs-starburst-prototype-for-hdr-vr/

Why is Starburst so big?


It is Starburst’s peak brightness of 20,000 nits that answers the question of “why is Starburst so big?” The main reason for the device’s size of course, is the fact that something that bright gets very hot. In order to dissipate that heat, Starburst employs the use of rather large 3D printed aluminium heat sinks, which according to Matsuda, account for at least 50 percent of the device’s weight. Then, there are the large fans on top of the device that suck the heat in these sinks up and away from the headset (and ultimately away from the user’s head).


Matsuda also noted that originally when the DSR team first started designing Starburst, there was a hope that it would have to be water-cooled (simply because that would have been… cool). But, as it turns out, water-cooling was not required, and simple fans sufficed and achieved the same outcome. This was mainly due to the fact that water cooling would not have really had any impact on the weight, given that the aluminium heat sinks would still have been required regardless.

As well as the two hanging and functional prototypes, an exploded view of Starburst was also proudly being displayed inside a glass box. According to Matsuda, the device consists of 36 parts (plus screws).


Meta-Reality-Labs-Starburst-Prototype.jpg

Meta-Reality-Labs-Starburst-headset.jpg

Meta-Reality-Labs-User-in-Starburst.jpg
 
Fuck i just want something that will last at least 5years especially with the pricing and cost of living in Sydney.

Something that will not look like a abomination in Picture Quality after 2 or 3 years.

The RTINGS torture tests are truly terrifying for OLED and LED.

Linus kept a bright WHITE cross in the middle of his screen where the frames of four quadrants of app windows met. He wasn't using a dark theme or shrinking his window frame sizes with 3rd-party customization apps, manual registry edits, Windows themes, etc., and he was probably using relatively extreme brightness, no screensaver or off time, and not using the turn-off-the-screen-emitters "trick" I outlined in my previous replies. Both he and the other guy in the video were pretty abusive to their screens.

That said, personally I don't use my OLED as a static desktop/app screen. I've used two or more screens at my desk since at least 2005, as there have always been big tradeoffs with each tech between gaming/media and desktop/app use. For the most part my OLED is a multimedia and gaming "stage", and the turn-off-the-screen "trick" is the curtain, or acts as a way to minimize the whole screen's contents, so to speak, when not giving the screen face time/eyeballs, going AFK, etc.

If you are worried about it... the LG 42" C2 is around $1,100 right now at Best Buy, and the 5-year BB warranty that covers burn-in is another $210 on top of that. $210 is $42 a year of insurance on the whole TV, and it covers burn-in with a panel replacement or a whole-TV replacement. So if you are afraid, then get the insurance for ~$3.50 a month or whatever it is after all taxes, heh.

Anyone have experience with recent OLEDs for ~10-hour-a-day productivity work?

Are we at the point where we no longer need to worry about burn-in?

I would love to pick up a 42" LG C2 (or C3, I guess, whenever it is released) for my desktop, but I am still concerned that within a few months the Office ribbon or Start menu or something else will be burned in...


I used my LG CX 48" like that for two years. ~8h work and personal use on top of that. The display is still without burn in and working fine.

This will heavily depend on how you use your display. I had some mitigations in place:
  • Dark modes where available.
  • Autohide taskbar/dock/topbar. I use MacOS for work.
  • Turn off the display with the remote when taking a longer break.
  • Keep display connected to power so it can run its pixel refresh cycles.
  • Brightness calibrated to 120 nits.
  • Virtual desktops in use so there is some movement between content.
  • Blank screen saver going on in 10 minutes of idle. Faster to get out of that than display off.
  • Display off in 20 minutes of idle.
While this may seem like a lot, it's a one time setup. You really don't need the taskbar/dock for anything 99% of the time so even after returning to a smaller LCD I keep it hidden.

I don't use mine as a static desktop/app screen other than a browser once in awhile or something since I have side screens for static apps and desktop stuff. I've been using multiple monitors for years so it's normal to me to do so.

I think of it like the main screen in Star Trek... they aren't typically doing all of their engineering and science work and experiments on the large main viewer. All of their data is on other workstation screens, while the main screen is the big show. Or you might think of the OLED screen as a "stage" for playing media and games.


That's my personal preference. Like kasakka said, there are a lot of burn-in avoidance measures, many of which he listed. If you keep ASBL on, it would be even less likely to burn "down" (see below), but most people using them for desktop/apps turn off ASBL dimming via the service menu using a remote, since it's annoying to have full bright pages dim down.

=======================================================================

Pasting some info from my comment history here for you in case you find any of it useful:

Some burn-in (burning through your "burn-down" buffer) avoidance measures
A few reminders that might help in that vein:

....You can set up different named profiles with different brightness, peak brightness, etc., and maybe contrast, in the TV's OSD. You can break down any of the original ones completely and start from scratch settings-wise if you want to. That way you could use one named profile with lower brightness, and perhaps contrast, for text and static app use. Just make sure to keep the Game one for gaming. I keep several others set up for different kinds of media and lighting conditions.
  • Vivid
  • Standard
  • APS
  • Cinema
  • Sports
  • Game
  • FILMMAKER MODE
  • isf Expert (Bright Room)
  • isf Expert (Dark Room)
  • Cinema Home
....You can change the TV's settings several ways. Setting up the quick menu or drilling down through menus works but is tedious. Keying the mic button on the remote with voice control active is handy for changing named modes or doing a lot of other things. You can also use remote control software over your LAN, even hotkeying it. You can change a lot of parameters directly via hotkeys that way. Those hotkeys could also be mapped to a Stream Deck's buttons with icons and labels. That way you could press a Stream Deck button to change the brightness and contrast or to activate a different named setting. Using Stream Deck functions/addons you can also set up keys as toggles or multi-press, so you could toggle between two brightness settings or step through a brightness cycle, for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, via voice command with the remote's mic button, or via the remote-control-over-LAN software + hotkeys (+ Stream Deck, even easier). "Turn off the screen" (emitters) only turns the emitters off. It doesn't put the screen into standby mode. As far as your PC OS, monitor array, games, or apps are concerned, the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you are AFK or not giving that screen face time, and restoring the screen when you come back. It's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time. You might not realize how much time is cumulatively wasted with the screen displaying when you're not actually viewing it - especially when idling in a game or on a static desktop/app screen.

...You can also use a Stream Deck + a handful of Stream Deck addons to manage window positions, saved window-position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few Stream Deck buttons in order to prevent your window frames from being in the same place all of the time, for example.

... Dark themes in OS and any apps that have one available, web browser addons (turn off the lights, color changer), taskbarhider app, translucent taskbar app, plain ultra black wallpaper, no app icons or system icons on screen (I throw mine all into a folder on my hard drive "desktop icons"). Black screen saver if any.

... Logo dimming on high. Pixel shift. A lot of people turn asbl off for desktop but I keep it on since mine is solely for media/gaming. That's one more safety measure.

. .

Turn off the Screen (emitters only) trick
I use the "turn off the screen" feature, which turns the OLED emitters off. You can set that turn-off-the-screen command icon in the quick menu so it's only 2 clicks to activate with the remote (I set mine to the bottom-most icon on the quick menu), or you can enable voice commands and then hold the mic button and say "turn off the screen". You can also use the color control software to set a hotkey for the "turn off the screen (emitters)" function, and even map that hotkey to a Stream Deck button if you have one. Clicking any button on the remote, or via the color control software hotkeys, wakes up the emitters instantly. I usually hit the right side of the navigation wheel personally if using the remote.

https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/

While the emitters are off, everything is still running, including sound. This works great to pause games or movies and go AFK/out of the room for a while, for example. I sometimes cast Tidal HD to my Nvidia Shield in my living room from my tablet while utilizing the "turn off the screen" (emitters) feature. That allows me to control the playlists, find other material, pause, skip, etc. from my tablet with the TV emitters off when I'm not watching TV. You can do the same with YouTube material that is more about people talking than viewing anything. I do that sometimes when cooking in my kitchen, which is adjacent to my living room TV. You can probably cast or AirPlay to the TV's webOS itself similarly. Some receivers also do AirPlay/Tidal, etc. directly on the receiver.
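For the LAN-control/hotkey angle, here's a rough sketch of the wiring idea: a small script binds global hotkeys (which a Stream Deck button can also emit) to TV actions. The tv_* functions are hypothetical stubs, not a real API -- in practice they'd call whatever LG webOS LAN-control tool or library you use, which is why those calls aren't shown:

```python
# Sketch only: hotkey -> TV-action dispatch. The tv_* functions are hypothetical stubs;
# wire them to your actual LG webOS LAN-control tool/library of choice.
import keyboard  # pip install keyboard (global-hotkey library)

def tv_screen_off():
    # TODO: send the "turn off the screen (emitters)" command over LAN here
    print("emitters off (TV stays 'on' to the PC, audio keeps playing)")

def tv_screen_on():
    # TODO: send any wake/button command over LAN here
    print("emitters back on")

def tv_set_profile(name):
    # TODO: switch the named picture profile (e.g. a dim 'desk' preset vs. 'Game')
    print(f"picture profile -> {name}")

keyboard.add_hotkey("ctrl+alt+f9", tv_screen_off)                       # going AFK
keyboard.add_hotkey("ctrl+alt+f10", tv_screen_on)                       # back at the desk
keyboard.add_hotkey("ctrl+alt+f11", lambda: tv_set_profile("desk"))     # low-brightness preset
keyboard.add_hotkey("ctrl+alt+f12", lambda: tv_set_profile("Game"))     # full-brightness gaming

keyboard.wait()  # keep the script alive, listening for hotkeys
```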

. . .
Distrust Screensavers
I wouldn't trust a screensaver, especially a PC screensaver. Not only do they fail or get blocked by apps - apps can crash and freeze on screen, and so can entire Windows sessions, or a spontaneous reboot can get stuck on the BIOS screen, etc. It's rare but can happen. Some apps and notifications even take the top layer above the screensaver, leaving a notification/window there static.

While on the subject: I kind of wish we could use the LG OSD to make mask areas. Like sizing one or more black boxes or circles, being able to set their translucency, and moving them via the remote to mask or shade a static overlay, HUD element, bright area of a stream, etc.

. .
LG's reserved brightness buffer. You aren't burning in because you are burning down that buffer first, for a long time (depending on how badly you abuse the screen).
From what I read, the modern LG OLEDs reserve the top ~25% of their brightness/energy states, outside of the user-available range, for their wear-evening routine that is done in standby periodically while plugged in and powered. Primarily that, but along with the other brightness limiters and logo dimming, pixel shift, and the turn-off-the-screen (emitters) trick if utilized, should extend the life of the screens considerably. With the ~25% wear-evening buffer, you won't know how much you are burning down the emitter range until after you bottom out that buffer, though. As far as I know there is no way to determine what % of that buffer remains. So you could be fine abusing the screen outside of recommended usage scenarios for quite some time, thinking you aren't damaging it, and you aren't, sort of... but you will be shortening its lifespan, wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, persistent toolbar, or a cross of bright window frames in the middle of the same 4 window positions or whatever... might be the first thing to burn in when the time comes, but on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, as the routine would have been wearing down the rest of the screen to compensate all along, over a long time.
The buffer seems like a decent system for increasing an OLED screen's lifespan considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery that has an extra 25% charge module, yet after you turn on your device and start using it you have no idea what your battery charge level is. You can use more power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it, etc., and still get full-charge performance for quite some time, but eventually you'd burn through the extra 25% of battery.
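As a very rough way to picture how usage habits eat into that reserved buffer, here's a toy model (all the numbers -- the 25% reserve, the wear rates, the hours -- are illustrative assumptions, not LG specs):

```python
# Toy burn-down-buffer model: the wear-evening routine keeps the screen uniform by
# sacrificing reserved headroom; your habits only change how fast the reserve is consumed.
RESERVE = 0.25  # fraction of drive range held back for wear-evening (illustrative)

def years_until_buffer_gone(hours_per_day, wear_per_1000h):
    """wear_per_1000h: assumed fraction of emitter capability lost per 1,000 lit hours
    for the most-stressed screen region at your brightness/static-content habits."""
    hours = (RESERVE / wear_per_1000h) * 1000.0
    return hours / (hours_per_day * 365.0)

# Gentler habits (dim desktop, screen-off when AFK) vs. abusive ones (max brightness, static HUDs):
print(f"{years_until_buffer_gone(8, 0.005):.1f} years")   # ~17 years of buffer left
print(f"{years_until_buffer_gone(12, 0.02):.1f} years")   # ~2.9 years -- then burn-in starts showing
```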

. .
View distance vs. display quality

You aren't going to get the full picture quality that larger 42", 48", and 55" 4k screens are capable of when you keep them near, on top of a desk, rather than decoupling them from the desk and gaining distance using a flat-foot or caster-wheeled slim-spine floor stand... or a non-modular wall mount option, or some other second surface to put them on. Your text will be fringed and your graphics in games aliased like on a ~1500p screen, even with aggressive/alternate text sub-sampling and aggressive in-game AA applied. Also, the 2D desktop typically has no AA at all outside of text-ss, so desktop graphics and imagery will be uncompensated and have even more pixelization. So they aren't really the best choice for most people's desks and workstation areas.

There are two ways, IMO, to get clear text.

- The optimal way is to sit far enough away to get 60 PPD, or better yet higher. Sitting at near distances with the screen on your desk will result in a 1500p-like pixel density to your eyes, which will make text and graphics look fringed even with text-ss and game AA applied aggressively. It will also push the sides of the screen outside of your 50 to 60 degree human viewpoint and exacerbate off-axis and color uniformity issues on the sides of the screen.

- If you have to sit that close, where you are at a ~1500p-like pixel density, you can sacrifice desktop real estate down from 1:1-pixel 4k by using Windows scaling to scale everything up some. That will mean more pixels per character of text, so it will help, but you will lose space compared to 1:1-pixel 4k.



tJWvzHy.png


---------------------------------------------------------
PPD and optimal/full PQ view distances

It's all about PPD (pixels per degree) and viewing angle. Every 4k screen size has the same perceived pixel density and viewing angle when the distance is scaled.

At a 50 to 60 degree viewing angle on all sizes of 4k screens, you will get 77 PPD down to 64 PPD.

Massaged or alternative text sub-sampling and aggressive graphics anti-aliasing (at a performance hit) start to compensate enough against grosser text fringing and graphics aliasing at around 60 PPD, which on a 4k screen is about a 64 degree viewing angle. This works, though it's a bit outside of the 50 to 60 degree human viewpoint.

Beneath 60 PPD you will get text fringing and graphics aliasing more like what a 1500p screen would look like at traditional near desk distances, and if you scale the text up to compensate for the pixelization, you'll then be dropping from 1:1 4k to around 1500p-like desktop real estate too. It's also worth noting that on the 2D desktop there is typically no AA for desktop graphics and imagery, just for text via text sub-sampling. So the aliasing there is entirely uncompensated, outside of certain authoring apps' 3D viewports, etc.
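If you want to work backwards from a PPD target to a viewing distance (same simple horizontal-pixels-per-degree approximation as the earlier sketch; the screen sizes are just the examples from this thread):

```python
import math

def distance_for_ppd(diagonal_in, target_ppd, h_pixels=3840, v_pixels=2160):
    """Viewing distance (inches) needed for a flat 16:9 screen to hit target_ppd,
    i.e. a horizontal viewing angle of h_pixels / target_ppd degrees."""
    aspect = h_pixels / v_pixels
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    angle_deg = h_pixels / target_ppd
    return (width_in / 2) / math.tan(math.radians(angle_deg / 2))

for diag in (42, 48, 55):
    print(f'{diag}" 4k: ~{distance_for_ppd(diag, 64):.0f}" for 64 PPD (60 deg), '
          f'~{distance_for_ppd(diag, 77):.0f}" for 77 PPD (50 deg)')

# e.g. a 42" needs roughly 32" of eye-to-screen distance for ~64 PPD and ~39" for ~77 PPD --
# hence the suggestion to get the screen off the desk and back a bit.
```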


View attachment 540412

View attachment 540413

. . .

On a flat screen, the edges of the screen are always off-axis by some amount. On OLED and VA screens, those off-axis extents show non-uniform color (OLED) or shift/shading (VA) gradients whose size grows the closer you sit to the screen.
The distortion field and eye-fatigue zone is still there at the optimal viewing angle (on a flat screen), but it is smaller. The edges of the screen are still as off-axis as if you were sitting an equivalent distance from them outside of the screen:

View attachment 540414


When you sit closer than a 50 to 60 deg viewing angle, the sides are pushed further outside of your viewpoint, causing a larger eye-fatigue and non-uniform screen area on each side (as well as driving the PPD down):


View attachment 540415


. . . . . . . . . . . . . .

With larger screens you're almost always better off decoupling the screen from the desk entirely, going with something like this to get a better viewing angle and higher perceived pixel density:

View attachment 540416
Something like that, or one with caster wheels, or a wall mount, pole mount, separate surface, etc. - and push the desk back.
Anyone have experience with recent OLEDs for ~10-hour-a-day productivity work?

Are we at the point where we no longer need to worry about burn-in?

I would love to pick up a 42" LG C2 (or C3, I guess, whenever it is released) for my desktop, but I am still concerned that within a few months the Office ribbon or Start menu or something else will be burned in...

 
Last edited:
Yep. While I mitigate more than LTT, I don't even mitigate to THAT extent. I'm a 75%-brightness-setting person + automatic display sleep + dark mode + orbit setting. I don't turn off ABL. 75% also just so happens to be the factory default setting on the 240Hz OLED I'm also testing out privately.

C9 and up is correct (when executing mitigations). C2 can be abused somewhat more with fewer mitigations (default settings are now pretty burn-in resistant). Yes, risk not zero, but see my previous post for perspective.

Also, people like me and Linus can generally afford to abuse our OLEDs in the name of science, taking the sacrifice so end users like you can begin to Office + overtime on an OLED tomorrow. Providing valuable data to OLED vendors to further improve longevity of their OLED product...

Although I go straight to calibrated settings (120-nit Photoshop league) for plain old-fashioned Office use, I'm surprised I still can't get it to burn in despite my hours of on-time. But then again, I started off on 2022-era OLEDs that are sometimes running a screensaver or rotating login desktop wallpapers overnight (Just Because!). I don't manually Sleep my monitor when walking away for a meal; I just rely on the auto-dim (LG's auto power save on static imagery) and auto-screensaver (Windows), though half of the time I manually Sleep the monitor before bed.

I do unleash maximumness from time to time! I'll certainly unlock the glowy HDR + maximum brightness peaking for a flagship game like Cyberpunk 2077, or a space game, or another adventure, to make it sing. But I keep the Windows desktop at pretty eye-comfortable "roughly as bright as my surrounding lighting" settings, which is plenty bright and vastly more eye-ergonomic for office usage. The subpixel rendering may not be IPS quality, but it's fairly mitigatable with the ClearType tuner and still eye-pleasing. (Subpixel text rendering improvements are coming.) Yes, while you can't get 400 nits on a whole white field, the neons in Cyberpunk 2077 get incredibly bright during nighttime gaming -- far brighter than most non-FALD LCDs for 10%-and-fewer pixels of the panel surface. Starry fields in space games get the royal red carpet treatment on OLED (ultra-bright 500+ nit pixels can be immediately adjacent to inky black pixels). In many games, it's not like you're going to get large windows of brightness, and you're continually in motion, so you can quite easily unleash more brightness when motion is involved - though watch those HUDs.
At 75% brightness, OLED displays typically only reach a maximum of 150 nits or even lower. Of course LG displays are less susceptible to burn-in this way.

Nonetheless, you end up with an OLED display with low brightness levels.

It's nothing compared to running 300-400 nits flicker-free 24/7. Or just keeping HDR on.
 
At 75% brightness, OLED displays typically only reach a maximum of 150 nits or even lower. Of course LG displays are less susceptible to burn-in this way.
Incidentally, ~150 nits is just about perfect for very ergonomic office use too.

During many games and HDR, you can still get those brightness leagues for the typical content of many games -- especially space, cyberpunk, dungeon, horror, and other "dark and bright mix" games. You get stunning above-400-nit pixels next to inky black in the starry scenes of space games.

It's nothing compared to running 300-400 nits flicker-free 24/7.
Ah, the joy of unboxing:
To unpack your statement fully:

Number One: 300-400 nits is gigantically too bright for me (and many!) for office usage scenarios;
Number Two: I've replied to you about flicker in that other thread; your text content involving the subject of flicker is gigantically overblown when it comes to the 240Hz OLED panels.
Number Three: 300-400 nits is great for games and I already can exceed that brightness (e.g. 500nit stars next to inky black outer space) in my dark games where the bright pixels are only a few percentage of the surface of the screen; ABL is a graceful workaround (CRTs / plasmas / some FALDs also have ABL-like behavior so it's not objectionable to my eyes). Also while FALD looks amazing on a lot of content, traditional FALD makes starry scenes look flatter than they do on OLEDs. If you play a lot of cyberpunk and space games, they already really shine with OLEDs.

Pick your poisons of pros/cons. OLED still ticks quite a lot of great boxes. FALD and OLED are great technologies with their respective pros/cons. I have FALD, VA, IPS, TN, IPS Black, 360Hz+, and still like the eye-friendly (for me) look of the OLED for a general-purpose jack-of-all-trades display that feels downright pleasing on eyes for prolonged multiple-usage scenarios. Displays are typically jack-of-all-trades, masters-of-none, as you get compromises. However, the multipurposeness of a large OLED casts a rather gigantically wide net for a lot of people in these parts.

You can still reduce motion blur better with strobing, but oh my, brute framerate-based motion blur reduction hasn't looked better on any consumer digital display other than OLED. Very important if you're sensitive to motion blur, à la my namesake of Blur Busters. Strobing on most displays also drops brightness to around 100-200 nits, with sometimes rather nasty squarewave flicker that affects more people than invisible (unnoticeable/unfeelable to a supermajority) shallow-depth, shallow-wave flicker. And the glory of extremely bright low-blur content on OLED is quite a stunning Goldilocks combination of low blur and no visible/feelable flicker.
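
For anyone who wants the back-of-the-envelope version of the persistence math behind that, the rule of thumb is that eye-tracked motion blur in pixels is roughly the frame (or strobe) visibility time multiplied by the panning speed; the 1000 px/s pan below is just an example value:

```python
# Rough persistence math: perceived blur on an eye-tracked pan is roughly
# frame (or strobe) visibility time * panning speed. Example values only.
def blur_px(visible_ms, speed_px_per_s):
    return visible_ms / 1000 * speed_px_per_s

pan = 1000  # px/s eye-tracked panning speed

print(f"60 Hz sample-and-hold : {blur_px(1000 / 60,  pan):.1f} px of motion blur")
print(f"120 Hz sample-and-hold: {blur_px(1000 / 120, pan):.1f} px of motion blur")
print(f"240 Hz sample-and-hold: {blur_px(1000 / 240, pan):.1f} px of motion blur")
print(f"1 ms strobe           : {blur_px(1.0,        pan):.1f} px of motion blur")
```

Which is why brute framerate (shorter frame visibility) and strobing (a shorter flash) both end up attacking the same number.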
 
Last edited:
I'm not sure kram is ever going to understand that brightness isn't the only factor in image quality. If anything, taking it too far is uncomfortable.
 
Incidentally, ~150 nits is just about perfect for very ergonomic office use too.

During games and HDR, you can still have above-400nit pixels, e.g. starry scenes, as I've said.


Ah, the joy of unboxing:
To unpack your statement fully:

Number One: 300-400 nits is gigantically too bright for me (and many!) for office usage scenarios;
Number Two: I've replied to you about flicker in that other thread; your text content involving the subject of flicker is gigantically overblown when it comes to the 240Hz OLED panels.
Number Three: 300-400 nits is great for games and I already can exceed that brightness (e.g. 500nit stars next to inky black outer space) in my dark games where the bright pixels are only a few percentage of the surface of the screen; ABL is a graceful workaround (CRTs / plasmas / some FALDs also have ABL-like behavior so it's not objectionable to my eyes). Also while FALD looks amazing on a lot of content, traditional FALD makes starry scenes look flatter than they do on OLEDs. If you play a lot of cyberpunk and space games, they already really shine with OLEDs.

Pick your poisons of pros/cons. OLED still ticks quite a lot of great boxes. FALD and OLED are great technologies with their respective pros/cons. I have FALD, VA, IPS, TN, IPS Black, 360Hz+, and still like the eye-friendly (for me) look of the OLED for a general-purpose jack-of-all-trades display that feels downright pleasing on eyes for prolonged multiple-usage scenarios. Displays are typically jack-of-all-trades, masters-of-none, as you get compromises. However, the multipurposeness of a large OLED casts a rather gigantically wide net for a lot of people in these parts.
Office use typically requires 250-300nits of brightness, unless you're in a house with low ambient light.

Games in SDR require much higher brightness than 150 nits - essentially what "HDR" on OLED offers. However, HDR on OLED screens can be dimmer than SDR, yet the flickering kicks in at a slightly higher APL. You cannot stare at an APL of 250 nits on an OLED screen in a dimly lit room for long; you will say it is too "bright".

Games in HDR require much higher brightness than what OLED can offer. Cyberpunk, for example, requires a range of colors and brightness levels that often exceeds what OLED can deliver. Those games don't shine on OLED at all.
52342202119_0901d582b7_o_d.png


Even in a low-APL night scene, the highlight range already reaches 1500+ nits. It's not the same image on OLED.
52746318877_e6eaaa0438_o_d.png
 
Last edited:
I agree that bright fluorescent office environments and showroom floors at Best Buy etc. are poor viewing conditions. Our eyes see everything relatively, so in bright viewing conditions you probably need 300 (to 400 nits, depending) to see SDR the same as it looks in dim to dark viewing conditions at 100 (to 200 nits, depending). Reviews aren't giving high marks to 400-nit-capable SDR screens so you can watch 400-nit SDR in dim to dark viewing conditions though lol... they are cheering those nits as compensation for bright viewing conditions, since our eyes see the brightness as dimmer relative to the bright room - not so you can torch SDR in dim to dark viewing conditions.

It's like viewing a phone at a full 100% brightness setting in a dark bedroom at night vs. outside in the sunlight on a bright summer day - and the same comparison at 25% brightness. It will look brighter in the darkness and dimmer in extremely bright environments because our eyes work relatively. So you have to turn the brightness slider up when viewing the phone in the sunlight, and you benefit from a 300 - 400 nit SDR setting in a bright room in exactly the same way: getting back to how it looks in dim to dark viewing environments.

However, that scenario - cranking SDR to compensate for a bright viewing environment so it looks the way lower-nit SDR would look in dim to dark conditions - combined with the matte AG abraded layer on most screens, means the bright ambient light is going to reflect off of the abrasions, lifting the black levels to our eyes and brain (even OLED's blacks) and causing a frost-like sheen on the screen that compromises how saturated and clear it looks. It will also reflect light sources, even if it scatters the light somewhat, so the screen surface gets polluted by light in blobs and bars that compromise its picture. So it's a poor viewing environment, especially for media and games. HDR is best in dim to dark viewing conditions anyway, so why buy an HDR multimedia display for office work if you intend to keep your office lit up like an operating room?
 
Last edited:
It's not just "compensating" for high ambient light anymore.

With local dimming the dynamic contrast is increased to the level of HDR400 lol. This is why SDR can be even better than OLED HDR.
 
OLED calibration (to the degree it's even possible) realized as accuracy in real world use?
=============================================================================

. . . OLEDs have lower HDR color volume/brightness peaks - a lower % of the screen able to be 400 - 800 nit at a time, something like 35% - and they sustain brightness for shorter periods.


( .. though 50% of the screen space in Dolby Vision material's typical scenes may be in the 100 - 200 nit range. Then the HDR mids, across up to ~25% of the screen surface, are capable of up to ~600 nits in dynamic content, and/or some occasional 800 - 1000 nit small highlights on 10% of the screen surface at a time are mixed in, depending on the scene composition, the model of the OLED, and whether it's in gaming mode vs. watching HDR media. It's still a very decent HDR experience imo, since OLED gets extreme contrast down in the darks, all the way to oblivion ultra blacks, contrasted right next to ~8 million other pixels that can be colored or dark right alongside them to a razor's edge, pixel by pixel.)


. . . OLED and some very bright 2000nit+ FALDS hit ABL limits - especially in scenes with very high brightness on the whole screen.

(though most media, and especially games, are very dynamic movement-wise, with darker areas detailing things or casting shade, so it doesn't happen all that often, or for very long when it does. It still sucks to have to suffer it on scenes that are exceptionally bright across the whole surface of the screen - and it makes calibrated accuracy within the full HDR capability of screens with aggressive ABL questionable, probably impossible).


. . . WRGB OLEDs have a white subpixel, so the brighter the colors get, the less accurate they are, since more white is mixed in.

(still looks very good for general media and gaming, especially on a per pixel basis and contrasted all the way down to oblivion blacks)


Considering those tradeoffs, I'd take calibration accuracy of OLEDs in real world usage with a grain of salt.


. .

FALD calibration realized as accuracy in real world use?
===============================================


You can calibrate or test a FALD in the middle of fields of white, black, and color and say it's accurate, but you'd be lying to yourself. FALD is less than a 45x25 lighting resolution. In each area where bright content is contrasted against darker detail, FALD dims thousands of bright pixels darker than they should be and/or lifts thousands of dark pixels paler in each affected FALD cell. In some cases, especially the Samsungs in gaming mode, a wider band of cells straddling the contrasted areas is affected, and with slower transitions.


FALD adjusts ~ interprets the lighting of large tetris-block cells, with the result even affecting different pixels in the same cell in different directions (+/-) at the same time, relative to what those pixels should be if they were lit individually - for example, potentially making thousands of dark pixels brighter and/or thousands of bright pixels darker within the same cell.
Those contrasted areas' FALD zones will drop to low contrast at their set point. You might think that's OK - maybe a contrast comparable to an older non-FALD VA in those brickwork structures of multiple tetris-block backlight zones - but the rest of the screen and scene is much darker, or much brighter, especially in HDR content, with a much, much higher contrast ratio, and those areas' pixels should individually be much darker and/or much brighter side by side. So those cell areas end up as watered-down tetris-block groupings of non-uniformity that don't match the rest of the screen/scene or the backdrop around them. That would be bad enough, but then the camera changes to a different shot or pans, or in games the FoV is moved by the player's mouse-looking, movement keying, or controller panning, so the scene moves across all of those ice-cube-tray cells, puddle-jumping through them and causing varying luminance across the scene. Most scenes are full of contrasted areas where dark areas (dark areas, not just darker brights relative to extreme brightness) give detail to brighter ones - in geography, architecture, room features and dressings, hair and clothing, and general relief/details/textures.


Most review sites recommend turning off FALD if you want to do anything accurate on the desktop. That's the reality check - FALD is always inaccurate and non-uniform in overall usage, so turning it off is the only way to get an accurate screen.

Hell, it's not even uniform across static letterbox bars vs. the dynamic content in the picture area, and some FALD screens can't even show text credits and closed captions as white - they get dimmed toward grey to prevent them from blooming/having a glow aura, which is just another way of saying they'd otherwise lift the darker screen content and backdrops/rooms/environments surrounding them brighter/paler. It has to err in one direction or the other (and sometimes both, for different pixels per block) because they are large ~7000-pixel cells. It's a very low lighting resolution. That's not isolated to letterbox bars and white text either - it translates to any bright area in a scene having to spill into darker ones, lifting them, or darker ones spilling into brighter ones, dimming them (across an even wider number of zones and with slower transitions in Samsung's game mode), plus compromised contrasted edges and small details. So again, take claims of screen-wide 1:1 calibration accuracy in real world use with a grain of salt on a ~45x25 lighting grid.
Turning off FALD drops the UCX down to about 1300:1 via its OSD, and Samsung QD FALD LCDs down to something like 3200:1 (though you can only disable FALD on the Samsungs via the service menu, so it's not really a general-use on/off capability people would typically use). Those are very poor contrast values and accompanying black depths, especially compared to an OLED's per-pixel contrast down to oblivion black.
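
To put rough numbers on those cell sizes and the one-backlight-level-per-cell compromise described above, here's a toy calculation - the 45x25 grid is the example grid from above, and the ~3000:1 native contrast and the nit targets are just illustrative assumptions in the ballpark of those FALD-off figures:

```python
# Toy math for the FALD points above: how many pixels share one backlight level
# in a ~45x25 grid, and the compromise a single zone has to make when a bright
# highlight sits next to near-black detail. The ~3000:1 native contrast and the
# nit targets are illustrative assumptions, not measurements.
panel_px = 3840 * 2160            # ~8.3 million individually addressable pixels
zones = 45 * 25                   # 1125 backlight zones in the example grid
print(f"{panel_px:,} px / {zones} zones = {panel_px / zones:,.0f} px sharing one backlight level")

native_contrast = 3000            # assumed LCD-layer contrast without local dimming
highlight_target = 1000           # nits wanted for a small highlight in the zone
black_target = 0.005              # nits wanted for the dark pixels around it

# Drive the zone's backlight for the highlight -> the blacks in that zone lift:
lifted_black = highlight_target / native_contrast
print(f"zone lit for {highlight_target} nits -> blacks in it lift to ~{lifted_black:.2f} nits")

# Or protect the blacks -> the highlight gets crushed down:
capped_highlight = black_target * native_contrast
print(f"zone lit for {black_target} nit blacks -> highlight capped at ~{capped_highlight:.0f} nits")
```

Either way, thousands of pixels in that cell end up somewhere they shouldn't be - which is the non-uniformity being described.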



ucx:

ucx_mouse-bloom_1.gif



. .


Both OLED's top-end range and sustained-brightness limitations/burn-in avoidance methods, as well as FALD's implementation of a coarse ~45x25 lighting resolution, are hacks. They are very clever stop-gap workarounds on both technologies, but they both have major tradeoffs/failings, and they are both inaccurate throughout actual general usage - especially in HDR media and games, since their respective limitations are exacerbated by HDR's greater extremes. You'd have to narrow both of them pretty extremely to even approach anything remotely nearing accuracy: at the very least a sub-ABL brightness cap on OLEDs (180 - 200 nit, really a bright SDR range for a dark room) to be less inaccurate, and the FALD array completely inactive on FALD screens to make them accurate - which doesn't even make it a FALD anymore, just a basic screen with bad contrast and black depth.

This is from Eizo's marketing of their dual-layer LCD tech from a while back. We are still stuck with a pick-your-poison choice, tech-wise, for now:

NVsBTV1.png



I'm not saying they both suck, or that one does and the other doesn't. I'm saying they are both inaccurate and have visible issues - but both are very usable for some fun HDR gaming and media playback, considering what clunky workarounds the tech has to resort to in order to provide what good each display tech currently can. There are plenty of reviews to back up those pros and cons of each, but they give the top screens of both technologies high marks, which says a lot for both of them. Personally I'm all in on per-pixel emissive, as I think it's the ultimate way to display things. OLED just can't do per-pixel emissive without major tradeoffs - but both display types have major tradeoffs for what they are capable of.
 
Last edited:
I know you were just trying to bait him with that but since you brought it up . .


[the same OLED vs. FALD calibration breakdown as the post above, reposted]
did you forget you already replied to that this morning?
 
did you forget you already replied to that this morning?

no... that was just the abbreviated version.. lol... most of that reply was about cooling / ABL in the main body, and this latest reply was aimed at the overall stream of this thread, just using his comment as a stepping stone again. I could omit the RE: part and it would still stand just fine, so maybe I will.
 
Speaking of accuracy lol.

Even with blooming FALD is way more accurate than OLED in HDR.

At 0.01%, 0.1%, and 1% window sizes, OLED can be calibrated for more accuracy. It's hit or miss at 1%-10% window sizes. At anything higher than a 10% window size, OLED cannot be calibrated, since the brightness is off the chart.
 
I was actually going to ask this… People keep saying MicroLED won't burn in, but aren't ALL emissive displays at risk? Ironically enough, I've used my CRTs a lot, and when kept calibrated and not overly bright (90 or so nits in a light-controlled room) I've never experienced burn-in on them.
Inorganic LEDs can also lose their luminance, but nowhere near as much as OLED.

Likewise, CRTs had this issue - "burn-in" is a term from CRT tech.
That said, none of my CRTs developed any visible burn-in. The phosphor is resilient enough that it doesn't affect even people who use these displays for decades. It's much more likely that something else will fail than for the tube to develop burn-in.

I did see quite a few burned-in CRTs, but those were old B&W work/terminal tubes and tubes from arcade cabinets. I do not remember any color computer monitor with burn-in.

OLED unfortunately is nowhere near as resilient, and burn-in happens much more quickly even when used at the same brightness levels... and of course OLEDs are typically used at much higher luminance than what a typical CRT could achieve.
 
Speaking of accuracy lol.

Even with blooming FALD is way more accurate than OLED in HDR.

At 0.01%, 0.1%, and 1% window sizes, OLED can be calibrated for more accuracy. It's hit or miss at 1%-10% window sizes. At anything higher than a 10% window size, OLED cannot be calibrated, since the brightness is off the chart.
Challenging question for you!

Is there anything you do like about OLED?
 
Challenging question for you!

Is there anything you do like about OLED?
Why do you even care? You said you don't care.

I like OLED being a dim SDR monitor in the long term.

A company like CSOT, which holds 10% of JOLED, can only use printers to make OLED for fun. It had a long-term plan to buy that 10% stake 9 years ago. As a supplier to Samsung, CSOT found it far more beneficial to make a 5000-zone mini-LED fast VA prototype in 2021.
 
Why do you even care? You said you don't care.

I like OLED being a dim SDR monitor in the long term.

A company like CSOT, which holds 10% of JOLED, can only use printers to make OLED for fun. It had a long-term plan to buy that 10% stake 9 years ago. As a supplier to Samsung, CSOT found it far more beneficial to make a 5000-zone mini-LED fast VA prototype in 2021.
Just wondering why you spend so much time and effort on your anti-OLED proselytism, that's all.

And if you like them as low brightness SDR monitors, which have their place, what's the problem? Why attack everyone in this thread and claim their individual preferences and opinions don't count?

I'm trying to figure out where you're coming from, if not simply from a place of sheer hate and trolling behaviour. I don't understand the goal.
 