42" OLED MASTER THREAD

Well why would they bother with 144Hz for the C4 then? That's just as useless outside of PC gaming.
144 is evenly divisible by 48 while 120 is not, though I don't know if there is a lot of 48 Hz content out there. Avatar: The Way of Water is a notable recent release that is 48 Hz.
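The divisibility point is easy to check directly; a quick sketch (the cadence list below is just common examples):

```python
# Which common content cadences (fps) divide evenly into a panel's refresh
# rate? An even divisor means each source frame is held for a whole number
# of refreshes, giving judder-free playback without interpolation.
def even_cadences(refresh_hz, cadences=(24, 25, 30, 48, 60)):
    return [fps for fps in cadences if refresh_hz % fps == 0]

print(120, even_cadences(120))  # [24, 30, 60] -- no even 48
print(144, even_cadences(144))  # [24, 48] -- no even 30/60
```

Interestingly, by this measure 144 trades even division of 30/60 fps content for 24/48.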
 
Why would LG implement 240hz when it's useless outside PC gaming? These newer OLEDs do 4k@120hz for two main reasons: allowing proper frame pacing of 24fps/30fps movie content, and supporting consoles that can output 120hz.
Even if 4k@240hz is possible with DSC, you need better and faster components to avoid banding issues. That means increased cost for a feature that is ultimately meaningless for the market this TV is primarily designed for.
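For rough context on why DSC comes into it (approximate figures; blanking overhead is ignored, and HDMI 2.1's effective payload after FRL coding is taken as ~42.6 Gbps, both simplifications):

```python
# Rough uncompressed data rate for a video mode, in Gbps. 30 bpp = 10-bit
# RGB per pixel; blanking overhead is ignored, so real link demand is a
# bit higher than these figures.
def data_rate_gbps(width, height, hz, bpp=30):
    return width * height * hz * bpp / 1e9

HDMI21_PAYLOAD = 42.6  # approx. effective Gbps of a 48 Gbps FRL link

r120 = data_rate_gbps(3840, 2160, 120)  # ~29.9 Gbps: fits uncompressed
r240 = data_rate_gbps(3840, 2160, 240)  # ~59.7 Gbps: needs DSC
```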

LG has done a lot specifically for PC gaming.

In 2019 LG partnered with NVIDIA for GSYNC certification. https://www.nvidia.com/en-us/geforce/news/lg-gsync-compatible-hdmi-big-screen-gaming/ This is before consoles supported VRR or 4k@120hz. LG even marketed the 48" as a PC gaming monitor.

They were the first to do 48gbps HDMI 2.1, even though the ONLY thing that could take advantage of it was a PC.

The 42" size was also made for PC gamers. I bet the 42" TV is used more as a PC monitor than an actual TV. There are very few people spending over $1000 on tiny 42" TVs for their living rooms.

During the high frame rate interpolation craze 10 years ago there were TVs that did 240hz interpolation; LG had LCD TVs with "TruMotion 240hz". Now that there is actual competition between different OLED panels, I think we'll see 240hz in a TV soon. It may be a while before one actually accepts a 4k@240hz signal because of bandwidth limitations though.
 
They both use the same panel. This one is just overdriven to get 138 Hz refresh rate. There was a firmware update on the C3 that fixed the VRR flickering. I got my C3 on day one and it was really bad before the firmware update.
Well, the RTINGS review still says it's an issue, and shows a video clip of it. And there are still recent complaints about it on Reddit. So I decided on this KTC.
 
Well why would they bother with 144Hz for the C4 then? That's just as useless outside of PC gaming.
To set it apart from last year's model and to make it equal to Samsung's competing models. Same reason why some new facelift cars have a small increase in horsepower.
Also, the jump from 120hz to 144hz is small enough and inexpensive compared to 240hz.
 
LG has done a lot specifically for PC gaming.

In 2019 LG partnered with NVIDIA for GSYNC certification. https://www.nvidia.com/en-us/geforce/news/lg-gsync-compatible-hdmi-big-screen-gaming/ This is before consoles supported VRR or 4k@120hz. LG even marketed the 48" as a PC gaming monitor.

They were the first to do 48gbps HDMI 2.1, even though the ONLY thing that could take advantage of it was a PC.

The 42" size was also made for PC gamers. I bet the 42" TV is used more as a PC monitor than an actual TV. There are very few people spending over $1000 on tiny 42" TVs for their living rooms.

During the high frame rate interpolation craze 10 years ago there were TVs that did 240hz interpolation; LG had LCD TVs with "TruMotion 240hz". Now that there is actual competition between different OLED panels, I think we'll see 240hz in a TV soon. It may be a while before one actually accepts a 4k@240hz signal because of bandwidth limitations though.
Again, these upgrades were more likely aimed at the PS5 and Xbox Series X; the latter in particular can run some games at 4k@120hz. It's more of a happy coincidence that said upgrades also happened to benefit PC gamers rather than a deliberate design choice.
4k@240hz is an expensive niche reserved only for the highest performing PCs, and for that, LG already has a 4k@240hz OLED monitor on the roadmap.
Sure, I'll be happy if the C5/C6 supports 4k@240hz eventually, but I wouldn't hold my breath.
 
To set it apart from last year's model and to make it equal to Samsung's competing models. Same reason why some new facelift cars have a small increase in horsepower.
Also, the jump from 120hz to 144hz is small enough and inexpensive compared to 240hz.

I guess. I'm not expecting anything worthwhile from the C5 either, and it also looks like the C4 will STILL not have MLA tech in it. My lord, the LG C series almost reminds me of Intel's quad-core tick-tock decade.
 
You guys are funny. My C2 plays 4k games at 120hertz. Now there is a push for more? I get it totally. But why does everyone want 4k at a higher refresh now? Let the tech grow and usher in 8k. Stop pushing 4k at 5000 hertz on everyone. Not a lot of people care about 4k at ridiculously high refresh rates. One, the video cards do not even pump out those frames at that hertz. Really?
 
You guys are funny. My C2 plays 4k games at 120hertz. Now there is a push for more? I get it totally. But why does everyone want 4k at a higher refresh now? Let the tech grow and usher in 8k. Stop pushing 4k at 5000 hertz on everyone. Not a lot of people care about 4k at ridiculously high refresh rates. One, the video cards do not even pump out those frames at that hertz. Really?
I probably would have kept the C2 if it did 144hz, although the terrible text quality bothered me more. Video cards do pump out high frames at 4k; my 3070 was able to do 120 fps at 4k in Overwatch 2.
 
You guys are funny. My C2 plays 4k games at 120hertz. Now there is a push for more? I get it totally. But why does everyone want 4k at a higher refresh now? Let the tech grow and usher in 8k. Stop pushing 4k at 5000 hertz on everyone. Not a lot of people care about 4k at ridiculously high refresh rates. One, the video cards do not even pump out those frames at that hertz. Really?

I can literally say the same thing to you? Let the tech grow and usher in 1000Hz instead of 8K. If you want 8K that's fine, but I would personally take higher Hz over more pixels if I had to choose.
 

That's a disappointing spec bump, that "slightly more horsepower in a new car model" is a very apt comparison. Sure, 144 Hz is better than 120 Hz but at the same time, the real world difference is pretty negligible and seems like only something to combat Samsung's 144 Hz QD-OLED products.

To me, LG should be pushing for 4K 240 Hz on their TVs and bringing 120/144/240 Hz BFI along with it. Even if true 240 Hz is only going to be usable on PCs, there could still be benefits for TV use from the higher refresh rate when combined with BFI or motion interpolation. Who can benefit from motion interpolation? People who watch sports; I think it works pretty well for that use case.

The LG CX 48" has been one of the best display purchases I've made and I have few complaints about it. But at the same time it's disappointing to read that we are likely to go yet another year without any significant improvements. Mine is still working fine after 3 years of ownership, but eventually the writing will be on the wall, and I hope that by then there will be a truly superior upgrade.
 
That's a disappointing spec bump, that "slightly more horsepower in a new car model" is a very apt comparison. Sure, 144 Hz is better than 120 Hz but at the same time, the real world difference is pretty negligible and seems like only something to combat Samsung's 144 Hz QD-OLED products.

To me, LG should be pushing for 4K 240 Hz on their TVs and bringing 120/144/240 Hz BFI along with it. Even if true 240 Hz is only going to be usable on PCs, there could still be benefits for TV use from the higher refresh rate when combined with BFI or motion interpolation. Who can benefit from motion interpolation? People who watch sports; I think it works pretty well for that use case.

The LG CX 48" has been one of the best display purchases I've made and I have few complaints about it. But at the same time it's disappointing to read that we are likely to go yet another year without any significant improvements. Mine is still working fine after 3 years of ownership, but eventually the writing will be on the wall, and I hope that by then there will be a truly superior upgrade.
BFI at higher refresh rates is definitely needed.
 
I guess. I'm not expecting anything worthwhile from the C5 either, and it also looks like the C4 will STILL not have MLA tech in it. My lord, the LG C series almost reminds me of Intel's quad-core tick-tock decade.
Very bad move if MLA isn't available for the C series and smaller sizes. Samsung has already proved they can downsize QD-OLEDs to 32", so if they decide to make 42~48" versions of their TVs (S90C), which have amazing HDR performance, then LG's will no longer be the favorite.
 
Very bad move if MLA isn't available for the C series and smaller sizes. Samsung has already proved they can downsize QD-OLEDs to 32", so if they decide to make 42~48" versions of their TVs (S90C), which have amazing HDR performance, then LG's will no longer be the favorite.
Samsung is a pile of crap for smart TV functionality if you care about that, though. But then you can replace it with a Google Chromecast, Apple TV, Nvidia Shield, etc.

Samsung's settings menus are also awkward to work with, not sure if the latest ones do the "settings per input" thing that LG allows.

I do agree leaving MLA out of the smaller sizes is a big minus. Samsung so far hasn't ventured into the under-49/50" sizes afaik for the QD-OLEDs, so maybe they consider it a niche market.
 
8k will almost certainly get 120hz and higher in the long run (that AUO model demoed is 120hz). Manufacturers putting 8k on ice for a while has likely slowed progress on competition in 8k features and pricing considerably though, unfortunately.

The higher the Hz that 4k reaches, the more likely manufacturers will eventually be pushed to make 8k higher Hz imo, though it will lag behind. Just like oled will eventually be pushed to deliver higher hz in the wake of higher-hz 4k FALD LCDs, rather than oled only doing higher hz at 1440p.

Outside of pc use, 8k is not as big of an upgrade for watching movies on ordinary sized tvs at relatively far living room distances. Those are often viewed at ~30 deg or so, closer to half of your binocular viewing angle, so nearly twice the PPD of a desktop-sized monitor at a desk. I.e. a 4k screen at a 30deg+ viewing angle will already have perceived pixel sizes looking nearly 8k-ish. On top of that, gpu power is more comfortable running 4k at higher hz, and consoles barely do 4k at all (though AI upscaling to 8k rez has some promise). So 8k is probably not very enticing for most people, especially at the prices samsung is demanding for some of their 8k screens, though they lack competition in the 8k arena for now.

It doesn't have to be an either/or though. They can deliver 8k tech while having higher hz lower/4k rez screens and scale up over time. Just like 1080p lcds were much higher hz, or 1440p oled and 4k FALD LCD vs 4k oled.


The LG CX 48" has been one of the best display purchases I've made and I have few complaints about it.


I think there is less push for many of these increases because people can get such a good gaming tv for $750 - $1200 now (or a lot less for less impressive screens), so fewer people would buy into 8k by spending 2x - 3x as much for it, or spend 2x or more on a 240hz 4k screen (made up figures). The market is probably pretty slim for that.

That's a disappointing spec bump,

A Hz increase cuts the sample-and-hold blur in half (doubles motion clarity) every time the fps+Hz doubles. So yeah, 120 vs 144 or 165 is incremental, not as big a difference as each full doubling of fps+Hz. It's a skip, not a jump.

We'll prob get diminishing returns on the motion definition/articulation/smoothness aspect of high fpsHz (more dots defining a dotted-line curve/path, more unique animation cells in a flip book flipping faster) somewhere after 200 ~ 240 fpsHz (guessing), but full HDR color volume w/o BFI will continue to benefit from higher and higher fpsHz to reduce image persistence/sample-and-hold blur, especially blur of the entire viewport/game-world while moving your in-game FoV via mouse-look, movement keys, or controller panning at speed in games.

from blurbusters.com (sample-and-hold motion blur chart):


144 fps+Hz at 1000 pixels/sec = 6.9 pixels ~ 7 pixels
165 fps+Hz at 1000 pixels/sec = 6.1 pixels

1000px/sec is just a baseline to measure with. FoV movement speed can be slower or much higher than that.
E.g. panning across 90 or 180 degrees of a 3840-pixel-wide 4k screen in a period much shorter than a second would produce many times that number of pixels of blur, but the ratio of blur between the different fpsHz rates would still be the same.
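The numbers above follow from a simple full-persistence sample-and-hold model (no BFI or strobing assumed): blur width is just panning speed divided by refresh rate.

```python
# Full-persistence sample-and-hold model: each frame is displayed for
# 1/Hz seconds, so tracked motion smears across (speed / Hz) pixels.
def blur_px(speed_px_per_s, hz):
    return speed_px_per_s / hz

for hz in (120, 144, 165, 240):
    print(f"{hz:>3} Hz: {blur_px(1000, hz):.1f} px of blur at 1000 px/s")
```

Note that doubling 120 to 240 exactly halves the blur, which is the "full leap" point: 8.3 px down to 4.2 px, versus the small step from 6.9 px (144) to 6.1 px (165).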

. . . . . . .

AUO reveals a high refresh rate 8K gaming TV panel at Touch Taiwan (April 2023)

One of the key aspects of this new 8K gaming panel is its use of AUO's A.R.T. "Reflectionless" technology, which is designed to prevent the screen from becoming reflective when used in bright environments. This screen, which is a demo unit from AUO, is powered by MediaTek's Pentonic 2000 SoC, MediaTek's highest-end 8K TV processor, which supports both H.266 (also known as VVC) and AV1 decoding.

The 8K 120Hz panel that AUO showcased at Touch Taiwan was a 65-inch model that features a 4,608 zone full-array local-dimming (FALD) backlight, and peak brightness levels of 1,800 cd/m2. This peak brightness can be delivered across 10% of the screen in HDR mode.

Currently it is unclear when we can expect this new 8K 120Hz screen from AUO to make its way to market, as TV manufacturers are currently steering away from 8K due to its costs and the popularity of OLED screens over competing technologies. While it is great to see that 8K 120Hz screens are possible, I do not see many major TV manufacturers attempting to adopt this technology urgently, or any TVs using this new screen type being affordable for most consumers. Expect 8K TVs using this screen panel from AUO to be incredibly expensive.
.
 
For 8K to be viable to the market, there needs to be 8K content.

There are virtually no movies being finished in 8K; many are still 2K upscales to 4K. PC games won't be viable for affordable 8K for at least a few years. And console games will have to wait until PlayStation 7 for effective 8K gaming.
 
For 8K to be viable to the market, there needs to be 8K content.

AI-upscaled content on a higher resolution screen can look a lot better than the same content shown natively on a screen of its own lower resolution. For example, 1080p content upscaled to 4k via my nvidia shield 2019 looks great on my 4k oled compared to how it would look native on a same-sized 1920x1080 screen. AI upscaling has probably improved since 2019 too, and will continue to do so. That's not worthwhile enough from 4k to 8k for most people to dump money on though. Especially for living rooms/movies:

Outside of pc use, 8k is not as big of an upgrade for watching movies on ordinary sized tvs at relatively far living room distances. Those are often viewed at ~30 - 34 deg or so, closer to half of your binocular viewing angle, so nearly twice the PPD of a desktop-sized monitor at a desk. I.e. a 4k screen at a 30deg+ viewing angle will already have perceived pixel sizes looking nearly 8k-ish. A 65" 4k screen at a 34 deg viewing angle = 114 PPD.

PC desktops and PC games can run 8k though. That, or desktop/apps at ~6k with scaling for font-size readability and a greater number of pixels per glyph, or games at 4k DLSS-upscaled to 8k. You can even downsample from 16k (supersample, DSR) on PC.

Consoles, yeah, they really can't even do 4k at high fps. Even relying on dynamic resolution scaling, checkerboarding, etc., a lot of people opt for 1440p for the performance increase.



For 8K to be viable to the market, there needs to be 8K content.

There are virtually no movies being finished in 8K; many are still 2K upscales to 4K. PC games won't be viable for affordable 8K for at least a few years. And console games will have to wait until PlayStation 7 for effective 8K gaming.

For the record, I do want (120hz) 8k eventually, but I want one for a wall of desktop/app real-estate at high PPD. For games, I'd like to run 4k, 5k, 6k, and different uw resolutions 1:1 pixel (letterboxed) on a big 8k at 120hz - or potentially even higher Hz than the 8k native on some of those resolutions, since the rez would be lower, if at all possible on some model. That, as well as good AI upscaling of 4k to 8k full screen, which could be very good if done right - like DLSS upscaling games when done right, nvidia's AI upscaling to 4k on the shield for media, sony tvs' upscaling, etc. As gpu power increases and dlss/AI upscaling and frame gen technology mature, we will be able to use 4k as the foundation for dlss upscaling instead of 1080p or 1440p, once we have the headroom.


8k wall of desktop/app space
===========================

For example, the 55" 4k ark is only ~61 PPD at the center of its curvature sitting ~40 inches away. Sitting any closer for immersion would start making pixels look more like a 1400-1500p desktop-sized monitor's would at a desk. It's not the multi-monitor replacement it's marketed as, since it's only like quads of 1080p, and most modern multi-monitor setups aren't using 1080p screens. On a larger 65" 8k, it would be like quads of ~32.5" 4k screens. For reference, a 4k screen hits 60 PPD at a 64 deg viewing angle, which is close enough viewing-angle-wise for full screen content, but that's good, not great, PPD-wise imo, esp. for desktop/apps and the 2d desktop's graphics and imagery (uncompensated by text sub-sampling and graphics AA).

You aren't wrong market-wise though. It's very niche, more like /r/battlestations or [H]'s "show your lcd setup" command-center style setups. I would love to get a 55" or 65" 8k, one with quality upscaling for media and high enough hz to do 8k 120hz - optimally even somewhat higher hz at lower resolutions, like you could run 1440p at 120hz on some 60hz screens in the past. Market-wise, you could argue similarly about 240hz 4k gaming tvs though: it comes down to what people are willing to spend price/performance-wise, and consoles can't do 240hz. They can't even really do 4k at 120hz at high fps.



. . . . . . . . . . . . .


65 inch 8k
==========

For the size, if sitting nearer than the (~60 to 50 deg) optimum for full screen content, viewing a 65" 8k at a 71 deg viewing angle in order to get a 40 inch view distance = 109 PPD.
That would make sitting at the 1000R (1000mm = ~40 inch) center of curvature of a 1000R curved screen (or nearer) work a lot better ppd-wise, and would provide immersion with some degrees outside of your binocular viewing angle.

8k 64 deg viewing angle = 45 inch view distance = ~ 120 PPD
8k 60 deg viewing angle = 49 inch view distance = 128 PPD

8k 50 deg viewing angle = 61 inch view distance = 154 PPD

55 inch 8k
================

A 55" 8k would be a good fit vs. your human binocular viewing angle at the center of a 1000R screen (or a little nearer for immersion, though the pixels would get more off-axis the closer you sat from that center).
55" 8k at 1000R curve's ~ 40inch to center of the curve view distance = 62 deg viewing angle = 124 PPD


8k 64 deg viewing angle = 38 inch view distance = ~ 120 PPD
8k 60 deg viewing angle = 42 inch view distance = 128 PPD

8k 50 deg viewing angle = 51 inch view distance = 154 PPD
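These PPD figures can be reproduced with basic flat-screen trigonometry (a sketch; curvature is ignored, which shifts the numbers slightly for a 1000R panel):

```python
import math

# Viewing angle and pixels-per-degree for a flat 16:9 screen viewed
# centered. diag_in is the diagonal in inches, h_px the horizontal pixels.
def viewing_angle_deg(diag_in, dist_in):
    width = diag_in * 16 / math.hypot(16, 9)  # screen width in inches
    return 2 * math.degrees(math.atan(width / 2 / dist_in))

def ppd(diag_in, h_px, dist_in):
    return h_px / viewing_angle_deg(diag_in, dist_in)

print(round(viewing_angle_deg(65, 40)))  # ~71 deg for a 65" at 40"
print(round(ppd(65, 7680, 40)))          # ~109 PPD: 65" 8k at 40"
print(round(ppd(55, 7680, 40)))          # ~124 PPD: 55" 8k at 40"
print(round(ppd(55, 3840, 40)))          # ~62 PPD: a 55" 4k at 40"
```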

. . . . .

The 8k at 1000R curvature is just me wish-listing but maybe someone will make one some year, perhaps an upgraded 8k version of the ark by samsung. I'll prob end up with a flat 8k sometime before that happens though, especially if there is ever some competition (in features, pricing) in the 120hz 8k segment in the next few years.
 
I can literally say the same thing to you? Let the tech grow and usher in 1000Hz instead of 8K. If you want 8K that's fine, but I would personally take higher Hz over more pixels if I had to choose.
but why do we need 240Hz in the first place?
 

Higher PPD is great for the desktop: text and especially 2d graphics and imagery get no anti-aliasing/sub-sampling to mask how large the pixels appear. High PPD on a higher-rez screen also gives a lot more desktop/app real-estate and can replace multiple monitors, where a 4k screen is only like quadrants of 1080p.

With a larger 120hz 8k screen you could theoretically run lower resolutions at higher hz if a manufacturer designed for it, yet still get all of that desktop/app real-estate at high PPD outside of games.

AI upscaling and frame insertion will mature, as will future gens of gpu power (nvidia 5000 in 2025, supposedly). So you might be able to upscale a healthy 4k rez's frame-inserted high frame rate to 8k too, even if gaming tvs start out at 8k 120hz for a few gens... but hopefully some models will be able to run a lower rez at higher than the 8k's native hz, 1:1 pixel, or at least 1:1 letterboxed (at 5k, 6k, various uw resolutions) for higher fps vs demanding games, rtx, etc.

. . . . . . .

Gaming monitors will probably continue to push higher hz ahead of gaming tvs, though they need to advance to higher rez than 1080p and 1440p at the high end (especially oleds). The G95NC, Samsung's 7680x2160 240hz FALD LCD, is a big leap there.

It's all good as far as I'm concerned. More options = more competition in features so it could push advancement on all fronts.
 
More is better for sure; there's a long way to go before it becomes useless. On one hand I'm glad my CX is holding up so well (I love it, and it's probably my favourite IT-related purchase of all time) and is still "competitive" in 2023, but at the same time I want display technology to progress faster. And while 240fps is a lot, there is a large number of older or lighter games where it is achievable today at 4k, also with DLSS3 in some cases.

If my CX broke now, the replacement would be practically identical and that's kinda sad :(
 
I have a 165Hz VA panel and prefer the 120Hz OLED, hands down. I also have a 24" 1080P IPS 240Hz strictly for 1st person shooters and I do enjoy that as well.
 
More is better for sure; there's a long way to go before it becomes useless. On one hand I'm glad my CX is holding up so well (I love it, and it's probably my favourite IT-related purchase of all time) and is still "competitive" in 2023, but at the same time I want display technology to progress faster. And while 240fps is a lot, there is a large number of older or lighter games where it is achievable today at 4k, also with DLSS3 in some cases.

If my CX broke now, the replacement would be practically identical and that's kinda sad :(

The LG C series feels like when Intel was on 14nm+++++++++ node for however many years.
 
Interesting video about OLED coatings and black levels:

View: https://www.youtube.com/watch?v=uF9juVmnGkY


View: https://i.imgur.com/woaYQLh.jpeg

Main takeaways:
- Under ambient lighting, QD-OLED has worse black levels than VA LCDs
- Even under ambient lighting, glossy and matte WOLEDs always have better black levels than every other display
- While QD-OLED can look worse under light, in ideal conditions it has better contrast due to superior luminance levels
 
Interesting video about OLED coatings and black levels:

View: https://www.youtube.com/watch?v=uF9juVmnGkY


View: https://i.imgur.com/woaYQLh.jpeg

Main takeaways:
- Under ambient lighting, QD-OLED has worse black levels than VA LCDs
- Even under ambient lighting, glossy and matte WOLEDs always have better black levels than every other display
- While QD-OLED can look worse under light, in ideal conditions it has better contrast due to superior luminance levels

QD-OLEDs lack a polarizing layer. That's why the black level rises with ambient light. This has been noted in every RTINGS review of a QD-OLED panel. *I think Samsung's newest TVs mitigate this somewhat with their screen coating, but they still lack the polarizer.
 
QD-OLEDs lack a polarizing layer. That's why the black level rises with ambient light. This has been noted in every RTINGS review of a QD-OLED panel. *I think Samsung's newest TVs mitigate this somewhat with their screen coating, but they still lack the polarizer.

OLEDs in general should be used in darker room conditions anyway. Buying an OLED for use in a super bright environment is like buying a sports car for towing.
 
OLEDs in general should be used in darker room conditions anyway. Buying an OLED for use in a super bright environment is like buying a sports car for towing.
WOLED and JOLED still show big contrast advantages in lit rooms. That per-pixel lighting control is difficult to summarize in a review. You just gotta see it.

Even at Costco, I enjoy looking at a C3 or a G3 much more than any of the VA panel TVs.
And while mini-LED TVs are brighter, they don't have enough dimming zones to work well with small details, such as the ever-popular starry sky/space example. OLEDs are transformative for sci-fi enthusiasts, and that is still apparent at Costco.

Indeed, a living room with huge windows which may not even have coverings + sunny weather, is tough for a TV.
 
QD-OLEDs lack a polarizing layer. That's why the black level rises with ambient light. This has been noted in every RTINGS review of a QD-OLED panel. *I think Samsung's newest TVs mitigate this somewhat with their screen coating, but they still lack the polarizer.

Any abraded outer layer will raise the black level when the abraded haze/sheen is "activated" by ambient light splashing on it, no matter what the panel type is.

From TFTcentral review of the ASUS PG42UQ (42" LG OLED Panel with matte abraded outer layer) :

The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you’d find on TV’s including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps and we noticed much better reflection handling (no surprise) than the LG C2. However this does mean that in some conditions the blacks do not look as deep or inky visually to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio this could be considered a problem – are you “wasting” that by having an AG coating that reduces your perceived contrast?
.
In certain conditions blacks look a little more dark grey as the anti-reflective coating reflects some of the surrounding light back at you and it “dulls” the contrast a bit. The anti-glare coating means the image is not as clear and clean as a fully glossy coating. You don’t get this same effect if the coating is fully glossy as there’s no AG layer, but what you do get instead is more reflections. Don’t forget this same thing applies to all AG coated desktop monitors, you have the same impact on perceived black depth and contrast on IPS, TN Film and VA panels depending on your lighting conditions if there’s an AG coating used. You’d still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.

While they concentrate on how it affects the blacks, which is bad enough, it can also degrade the color saturation to your eyes, esp. when stronger light is hitting the surface of the screen (but any screen is polluted by light sources hitting it). It also compromises detail somewhat, as it creates a mild haze.
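The perceived-contrast hit can be modeled simply: reflected ambient light adds to both the black and white levels. The nit figures below are illustrative guesses, not measurements of any particular panel:

```python
# Simple perceived-contrast model: light reflected off the panel surface
# adds to both the black and white levels, compressing the ratio.
def effective_contrast(white_nits, black_nits, reflected_nits):
    return (white_nits + reflected_nits) / (black_nits + reflected_nits)

dark_room = effective_contrast(800, 0.0005, 0.0)  # ~1,600,000:1
lit_room  = effective_contrast(800, 0.0005, 1.0)  # ~800:1 with 1 nit of glare
```

A matte AG layer scatters more ambient light back at the viewer (a higher reflected term), which is the "dark grey blacks" effect the TFTCentral excerpt describes; a polarizer lowers that term, which is what QD-OLED gives up.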

. . . . . .

OLEDs in general should be used in darker room conditions anyway. Buying an OLED for use in a super bright environment is like buying a sports car for towing.
Yep.

It's also like a photography studio or movie set trying to faithfully create scenes from books and scripts, but being forced to leave bright lights on all of the time, or to have giant windows open to whatever daytime, nighttime, seasonal and weather lighting happens to occur whenever they are shooting. 📷📽️🎬

Also, a screen in bright viewing conditions needs brighter SDR output just to reach the same perceived levels to our eyes+brain that an OLED would have in a dim-to-dark reference environment. Our eyes view everything relatively. It's not just a binary question of whether ambient or direct lighting is hitting the screen: allowing ambient lighting levels to shift will also change how all the screen's parameters look to your eyes/brain. Home theater / controlled-lighting viewing areas (at least while viewing) are the way to go imo: design the room around the display and sound systems, not the other way around.

For a sound analogy: you probably wouldn't use an open-back set of headphones in a room that has a lot of ambient noise, and you probably wouldn't like having to put in earplugs that cut out both the ambient noise and some of the headphone quality just to compensate for the poor listening environment you put yourself into. (The screen equivalent of noise-canceling headphones would be an enclosed VR headset, or a completely hooded/boxed screen you sat right up close to.) It would be a lot better to design a listening/recording environment that isn't being polluted by other noises, if at all possible.

That said, OLED could obviously use much higher sustained nit levels for HDR in the future. There are lots of tradeoffs between FALD LCD, OLED and screens in general, but compromising screen clarity by putting a scratched surface on it doesn't have to be one of them, forced onto everyone. Unfortunately, in a lot of cases it is.

In the future, with high-res XR/MR/AR glasses, screens and other virtual content in real space won't actually be hit by ambient and direct light sources, so light splashing onto the screens shouldn't be the same kind of problem anymore. The current XR screens do look better with the lenses dimmed like sunglasses for more contrast, though. Glasses/goggles will probably get maximum contrast and visibility of the real world (as a video feed) + virtual objects by using forward-facing camera MR instead of see-through lenses. The systems will probably try to emulate real-world light play on virtual objects, and shadows cast by virtual objects, as realistically as possible, but they won't have to cast fake light onto virtual screens.
 
Sure. But that is a separate issue from what has already been said about QD-OLED lacking a polarizing layer.

IMO, a matte finish on OLED isn't as much of a problem as it can be on edge-lit IPS and VA, because you are still looking at an OLED: per-pixel light control, no light bleed, no halos, no "IPS glow". On LCDs, all of those can themselves "activate" the matte layer, diffusing and obscuring what you are trying to look at, and they remain apparent problems in any lighting condition, even more so in darker conditions.

All that said, no matter the panel type, matte vs. gloss vs. glass all have tradeoffs, which may very well come down to opinion and use case.

Gloss/semi-gloss offers a richer picture and can still have reflection-absorbing treatments. But those treatments can cause color issues, especially for whites, and especially at the edges and corners of the screen, which sit at a relatively sharper angle to the eye.

Glass is the clearest and most vivid, but you deal with more true reflections, which can themselves distract from what you are viewing. Glass is also heavy.

Matte has the lowest display clarity, but its diffusion and blunting of reflections can be a real benefit. And in a dark room, you'd be hard pressed to tell the difference.


When done well, I think a gloss or semi-gloss coating is probably the best all-rounder. But we generally only see that on TVs; gloss monitors are pretty rare.
 
I was reading so many complaints about QD-OLED black levels in a bright room, and comparisons like this put me off for a long time. Previously I used a 42" C2 for a year.
[Image: 50884176_1.gif]

I have my PC in the living room with a big window on the left side. In the evening I also don't like to sit in darkness and usually have the ceiling lights on.
I spoke to Adam Simmons from pcmonitors.info and he convinced me to try the AW3423DWF.

"Yeah, the lightening up is far from ideal but that example shows you two extremes and is at an angle which further exaggerates the effect. There are many conditions between “dark” and “bright” where the QD-OLED screen surface fares well. Take a look at our review of the AW3423DW with images shot in various lighting conditions for some examples. The notion that you have to use it in a dark room for a good experience is flat out false, but you do need to be able to control your lighting environment to avoid strong ambient light and minimise direct light where possible. In at least reasonably controlled conditions from a normal viewing distance, the reflections I personally find less distracting than the glare patches invited by matte screen surfaces. And yes, I find the reflections more subdued than the sharper glare patches on ‘light to very light matte’ surfaces. The diffusion of light on matte screen surfaces lighten up the screen surface in some conditions significantly more than on the QD-OLEDs, but it is localised or variable across the screen rather than a lightening up of the entire screen. I have extensive (and I mean extensive) experience with QD-OLEDs as I use the AW3424DW as my ‘daily driver’. I’m using it right now, in fact."

Honestly guys, I do prefer QD-OLED right now, mainly because of the reflection handling. The C2 coating looks darker, but it's like a mirror. I tried to play RE4 in HDR during the day on the C2, and in dark scenes the reflections absolutely ruin everything.

Just to give you an idea, the AW3423DWF has fewer reflections with the ceiling lights at 100% than the C2 has with the lights at 20-30%.

IMHO QD-OLED and the LG G3 have the best coatings: good reflection handling, no grain, and the black level is still OK if you don't hit the screen with strong light directly.

If you have a window behind your back, please do everything you can to relocate your monitor, because even the strong matte coating on the 27" OLEDs will look bad in this scenario.
[Image: IMG-20231201-113032.jpg]


And this is my setup, BTW:
[Image: IMG-20231110-190812.jpg]

[Image: IMG_20231201_121045.jpg]

On the C2, in a scene like this under the same lighting conditions, all the shadow detail is eaten by reflections.
 
You have to find what works for you. I don't have that screen but according to RTings:
"The Dell AW3423DWF has an OLED panel with a near-infinite contrast ratio. It means that it displays perfect blacks next to bright highlights. However, it looks best in a dark room because the black levels raise in a bright room, causing blacks to look purple/pink when there's any ambient light on the screen."

As chameleoneel informed me, the lack of a polarizing layer is the problem:


[Image: average-room-off-large.jpg]

[Image: bright-room-off-large.jpg]


"In theory, the reflection handling of the Dell AW3423DWF is remarkable. There aren't any distracting reflections from strong light sources, meaning glare won't be an issue if you want to use it in a bright room. However, light also stretches across the screen, as you can see in the photo above. The main issue with QD-OLED displays is that they lack a polarizing layer, which causes the black levels to raise when there's ambient light on them. It means that blacks look closer to purple/pink in a bright room, and you lose the advantage of the near-infinite contrast of OLEDs. You need to be in a dark room to see the perfect black levels. This issue isn't only limited to monitors, but any current QD-OLED display, including the Samsung S95B OLED."


. . . . . . . . . . . . . . . . . . .

Yeah, the lightening up is far from ideal but that example shows you two extremes and is at an angle which further exaggerates the effect. There are many conditions between “dark” and “bright” where the QD-OLED screen surface fares well.


IMHO QD-OLED and the LG G3 have the best coatings: good reflection handling, no grain, and the black level is still OK if you don't hit the screen with strong light directly.


You putting the screen in an "average" room is probably helping to compensate, like you said. An "OK" black level isn't a good enough tradeoff for everyone though, especially when you're already making tradeoffs to get the "infinite" black depth of an OLED, which is one of its strongest performance facets. There are always tradeoffs between different panels, techs and models, so it's all good if you found one that works best for you, or whose worst doesn't bother you as much as others might, or whatever combination of the two your viewing environment's needs/restrictions allow.

Personally I have a matte abraded-AG Legion 5 Pro laptop because the other features and the price/performance were good enough that I chose to stomach the AG. I still miss glossy on it though, and wish it were glossy. I've also considered the 55" 1000R 4k Ark and the 57" G95NC 4k+4k super-ultrawide, even though both are FALD with AG and the tradeoffs that come with it. Every time I've bought an AG screen I've had some regret, ever since I had a 27" 1440p Mac IPS years ago and ran a 1080p 120hz, and later a 144hz RoG Swift 1440p with AG, next to it, etc. For a while I had a 15.6" 4k glossy laptop too, and my phone and tablet are both glossy. When the LG gaming OLEDs came out I was extremely happy that they were glossy. I don't like being railroaded into different coatings just to get the other features I want when shopping around for many of these screens, but that's the way it is.

(Rtings images from the review)

Dell Alienware AW3423DWF in "Average Room lighting"

[Image: average-room-alternate-large.jpg]


Dell Alienware AW3423DWF in "Bright Room lighting"

[Image: bright-room-alternate-large.jpg]


Dell Alienware AW3423DWF next to OLED and IPS displays

[Image: reflections-comparison-large.jpg]


Techspot review:

[Image: 2022-12-27-image-14.jpg]

[Image: 2022-12-27-image-15.jpg]


* All of the images I added in this reply are obviously limited by what your own screen and environment can show you, but they should still give some idea of what's going on. E.g. you aren't going to see dim/dark-room OLED black in an image viewed on a screen (or screen+environment combo) of your own that can't show its depth fully. The same goes for the levels in the pictures krisdee replied with.
 
Yeah, I have seen all these pictures like 100 times 🤪 but honestly, in average room lighting this screen is fantastic, and I was expecting it to be terrible.
I just went to my bedroom to check my C2 again, and the reflections are really distracting.
[Image: IMG_20231201_155256.jpg]

[Image: IMG_20231201_155148.jpg]


And this bedroom is the darker room, with a window on the right side 2m from the screen.
In the living room I have a window on the left 50cm from the screen.
 
I wouldn't put up with the lack-of-polarizer effect raising the blacks if I could otherwise avoid it, but I understand that personal choices/priorities and/or a lack of options could make someone opt for different screen choices.

I've been using glossy for years whenever I can, but I have often been forced onto matte/AG screens over the years by the limited choices at a given gaming-performance level. Glossy provides a much more lush, wet-looking, clear screen in controlled environments and layouts designed for it.


Layouts and Lighting
================

In regard to your window layout: you have to keep all windows facing the back of the screen, placement-wise, if you want to control direct light pollution. That, or put blackout shutters/blinds on the offending ones when the screen is in use.

OLED (and really any HDR screen) is best in dim-to-dark home theater environments. Glossy is best with no lighting source past the horizon of the sides of the screen, but any screen surface type is polluted by direct light sources hitting it. If the light source were a person and the screen a mirror (any screen, not just glossy), there is no way you should be able to see that person in the mirror, vector-wise, in a proper layout where the screen is the priority. If you can't avoid that due to restrictions in your environment/layout, then you'll probably end up making compromises elsewhere in your screen choices, etc., in an attempt to compensate. Even non-direct, relatively strong ambient light will "activate" matte abraded surfaces, compromising the screen's parameters and clarity, but light sources hitting screens directly are the worst polluters on all screens. I'd rather design my room and lighting around the screen than the other way around, compromising on the screen for the room. But different people can have different priorities, or more limited options.

I keep a standing lamp on each side of my 77" TV, a foot or more away from it, in line along the same wall, on smart switches, plus a bias LED strip on the back of the TV that I can turn on when I want the standing lamps off. If I'm going to watch a movie, preferably HDR material, in the daytime for some reason, I'll usually pull the blinds and commit to a theater experience. If I want some room lighting while watching something intently on the TV, I can turn the pillar lamps on. I usually don't watch much TV in the daytime in my living room anyway, so the blinds aren't always pulled; it's more often a "movie night" for shows and movies.

My PC screens in PLP orientation (incl. a central 48CX OLED for games and some media) are in a spare room on my main floor, nearly against one wall, with a single high window behind them. There are two high windows on the wall to my left, running back along the wall, but I pull the blinds on those when using the gaming PC (esp. for OLED gaming sessions). I keep a few clip-on brushed steel desk lamps behind my three screens, each with an LED bulb facing the back wall at an angle. Smart bulbs work great with assistants for changing lighting intensity from zero through 100%, and some models can also change colors (though the color modes peak dimmer, so they're usually best kept at 100%).
I have other normal room lighting, but it's on smart switches, so I turn it off when I'm going to be using my PC, especially for gaming or media.


. . . . .

Allowing Lighting Swings
===================

Window light and other direct light sources pollute screen surfaces directly, while dimmer non-facing lighting and dark conditions don't, but it's not just a binary on/off, direct-splash effect. Allowing your room lighting to swing overall will change how your eyes perceive all of your screen's parameters. Our eyes view everything relatively, so wherever your factory-calibrated screen + personal/suggested tweaks end up, or wherever your own hardware calibration lands settings-wise, letting the room lighting shift from night to day, cloud cover to clear skies, through weather events and seasons will make those parameters deviate a lot to your eyes/brain across that lighting range, just like a phone or flashlight looks different in bright sun, at night, on overcast days, or under passing cloud cover. On top of that, direct light sources typically aren't hitting the whole screen surface uniformly, so it's more like a "hot spot". A controlled viewing environment has the most stable parameters to your eyes and is the most accurate way to view screens. That's why a lot of professional/reference screens come with hoods you can install to block overhead and side lighting from hitting the screen while sitting near it, as they have to be very accurate. Movie/TV studio and professional camera viewfinders are often hooded similarly. It's also why photography studios have controlled lighting environments, and audio studios have controlled sound environments.


Different tools for different jobs
=================================

I keep a different screen/rig in my front hall for bright-room viewing, doing bills, media storage, an IP camera server, etc. I can bring my laptop anywhere though, and I usually put my 10" tablet next to it on a metal fold-out "mini easel" stand in portrait mode. My BT mouse (and keyboard, if I deploy it) swaps between the laptop and tablet with a quick button press on the peripheral, but I usually just manage the tablet side with the mouse. I have a couch desk I can use the laptop with, or I sometimes use it on my kitchen table, or the patio table in summer, etc. I also have a really nice BT boombox speaker for either of those (or headphones, or decent monitor earbuds). The point is: if I'm digging into a virtual game world for hours, or into an HDR movie for hours on a high-performing HDR gaming and media screen, I want a theater environment, lighting-wise and speaker-layout-wise, that optimizes the experience. I have other devices if I want to do screen-based things in bright environments. For me, in gaming worlds, HDR movie worlds, etc., the screen and sound come first. The real world is secondary, as it's not my focus during that. I'm trying to go through the looking glass and be immersed, not peek at a porthole from a bright, insistent room. But that's me. I don't want to have to compensate or suffer tradeoffs for lighting polluting the space to my eyes/brain any more than I want distracting ambient noise hitting my ears.
 
Last edited:
QD-OLEDs lack a polarizing layer. That's why the black level raises with ambient light. This has been noted in every RTings review of a QD-OLED panel. *I think Samsung's newest TVs mitigate this somewhat with their screen coating, but they still lack the polarizer.
And yet there are still people claiming that their particular unit does not have this problem and has perfect blacks even in a bright room :)
 