42" OLED MASTER THREAD

I have the LG C2 42". Just found out HDR looks much better when you raise the Black Levels in the Game Optimizer under Picture. Looks more like HDR on a good LCD display now, still with perfect black, but much more shadow detail. There is simply too much black crush and the whole picture looks too dark to me without adjusting Black Levels. I use Black Levels at 14. Anyone else have the same experience? I'm enjoying dark movies much more now. The Nun 2 simply looked too dark at default with not much shadow detail. It may not be "correct", but it simply looks better to me.
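Mechanically, a "Black Level" lift like this remaps the darkest code values upward so near-black detail clears the crush point. Here's a toy Python sketch of one way such a lift could work; the `lift_black` function and the linear lift-and-rescale math are my own illustration, not LG's actual (undocumented) processing:

```python
# Toy sketch of a black-level lift on 10-bit code values (0..1023).
# Assumes a simple linear lift-and-rescale; the C2's real internal
# processing is not documented, so treat this as illustrative only.

def lift_black(code_value, lift=14, max_code=1023):
    """Raise the floor so near-black detail sits above the crush
    point, while keeping peak white (1023) unchanged."""
    return lift + code_value * (max_code - lift) / max_code

# Near-black shadow detail that might otherwise be crushed toward 0:
for v in (0, 4, 8, 16, 1023):
    print(v, round(lift_black(v)))
```

The point is just that the very bottom of the signal gets pushed up into a range the panel (and your eyes) can actually resolve, at the cost of slightly raised blacks.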

You should use whichever settings look best to you dude. Forget all the stupid BS about "creator's intent". I do whatever I want to my games/movies to make them look best to MY eyes and if that means completely changing the colors with ReShade filters, messing with Digital Vibrance in NVCP, injecting HDR into a game that never had it, or changing the TV settings, then that's what I'm going to do. If changing the Black Level in the TV settings gives the best looking image to you then I say keep using it.
 
I have an urgent question: what is the difference between the C1 and the C2? More importantly, does the C1 have 4:4:4 chroma for text display?
 
You should use whichever settings look best to you dude. Forget all the stupid BS about "creator's intent". I do whatever I want to my games/movies to make them look best to MY eyes and if that means completely changing the colors with ReShade filters, messing with Digital Vibrance in NVCP, injecting HDR into a game that never had it, or changing the TV settings, then that's what I'm going to do. If changing the Black Level in the TV settings gives the best looking image to you then I say keep using it.
Korea-man said you're DEAD WRONG:

https://www.youtube.com/watch?v=O23zZb4ipds
 
For SDR games on my 48CX, I was blown away by some nature/anthropology 4K HDR material I watched when I first got the HDR OLED, with things like an islander spinning a double-ended flaming torch on a dark beach at night, etc., but I was disappointed when I started gaming on it because I found the colors to be dull. I tried swapping out of game mode to different media modes and the games looked much less muted, but I needed game mode to avoid lag and have the best gaming performance. So I later upped the saturation/color slider in the TV's OSD for game mode since I found it too muted compared to the other named picture modes. I actually cranked it a little higher than I'd ever use because I then use ReShade or Special K to tweak each game (down) individually to where I want it. Turning the saturation down from the slightly over-heightened OSD color level seems easier using the 3rd-party tools than the other way around. You can also adjust a lot of other things in ReShade/Special K, though I usually only use a few on most games.

HDR game mode has its own separate HDR picture mode that activates when an HDR game is running, so the saturation/color boost I apply to the SDR game mode doesn't affect the TV's HDR game mode. HDR games have their own color mapping/range, so they never looked muted to me like SDR game mode did, and that kind of thing is unnecessary for HDR games. I prioritize HDR games anyway now, and there is Auto HDR too, but for any games that don't have those I still have an option for a better gaming picture to my tastes. I'm not talking about blasting saturation to neon or anything, just bringing it back up to where it looks outside of the default game mode, which was muted/dulled. That's on the 48CX though, I can't speak for what the other, newer gaming TVs' game modes look like.

. . . . . .

Re: black level. I prefer things that should be dark to be dark. If I'm in an adventure game I'll use a torch 🔥, or in Elden Ring or Lords of the Fallen the hanging lantern on your gear that you can turn on 🎃
Some games do require adjusting the settings in the games themselves to begin with, but imo not every dark area is meant to be seen in full detail in most dark games or game areas unless you bring a bright light near or cast one into the area.

[images: reaction GIFs and the Lords of the Fallen lantern main-menu art]




Shooters/competitive games are a whole other conversation, since eking out little advantages over other players is usually done by people who prioritize that over aesthetics and game design, scene and creator intent, etc. Afaik there is no PunkBuster forcing black levels, contrast, level of detail vs. distance, turning FX and grass/foliage off, changing shadow settings, etc., and nothing at all tied to your OSD's settings either, obviously - so people can whittle their image down or blow it out however they want if it lets them see their opponents better (and/or gives them higher frames per second)... manipulating settings kind of like predator vision, in some sense/to some degree, for advantage compared to what the scene would normally be represented as.

 
I have the LG C2 42". Just found out HDR looks much better when you raise the Black Levels in the Game Optimizer under Picture. Looks more like HDR on a good LCD display now, still with perfect black, but much more shadow detail. There is simply too much black crush and the whole picture looks too dark to me without adjusting Black Levels. I use Black Levels at 14. Anyone else have the same experience? I'm enjoying dark movies much more now. The Nun 2 simply looked too dark at default with not much shadow detail. It may not be "correct", but it simply looks better to me.
Check the black level setting in Windows, macOS, or on your console. If it's wrong, it can cause weird issues.
 
My initial impressions are kind of mixed, after having this for about... two weeks? One? I don't remember now. Anyway, first thing out the door, this does have a dead pixel or dead subpixel:
[attachment: photo of the dead subpixel]


Looks like the white subpixel is dead (iirc these have a different subpixel structure)? I'm not sure.

I do also notice the edge fringing, but only in very particular cases:

[attachment: photo of the edge fringing]


I can always get Costco to give me an exchange for the dead pixel (though every panel is a roll of the dice...)


What annoys me more is other stuff.

1. The initial settings prompted me to set up "Game Mode". But the colors looked like shit in game mode. I thought maybe it was just me preferring more saturated colors somehow, but I looked up the panel on RTings and set it to their recommended accuracy settings (which took it off of game mode, though). And it looked SO much better. Wtf are they thinking with the default color settings for game mode? Looks like garbage. But I've read that turning off the game mode increases input lag or something... and I have kind of noticed some difference in my gameplay since I turned it off (currently playing Empyrion).
2. There are all kinds of quirks due to this natively being a TV. One of them is display flickering depending on some of the content I have, sometimes even if the content is on another monitor. The flickering is pretty annoying sometimes.
3. The pixel shift mode for burn in is glitched and broken, I feel. I can really notice it cutting off some of the edges of GUIs occasionally and it feels like it shifts too much, too long, and is too noticeable sometimes. I can really notice it shifting while I'm on desktop content, too.
4. The lack of curve really does hurt at this size. But I can sort of deal with it.
5. I'm going to need to get VERY inventive to somehow get another monitor above it for other usage at this point. No monitor arm will go up that high with the stock pole...
6. Well, motion clarity is obviously not as good as the 45 inch QD panels at 240Hz, and definitely not when it's glitching. Might be because this is an older title with oddball settings though.

I'm still sort of leaning towards keeping it, but there are definitely a lot of tradeoffs. I really do like the larger single screen for content consumption. I just REALLY wish it was curved sometimes.
 
My initial impressions are kind of mixed, after having this for about... two weeks? One? I don't remember now. Anyway, first thing out the door, this does have a dead pixel or dead subpixel:
View attachment 623276

Looks like the white subpixel is dead (iirc these have a different subpixel structure)? I'm not sure.

I do also notice the edge fringing, but only in very particular cases:

View attachment 623277

I can always get Costco to give me an exchange for the dead pixel (though every panel is a roll of the dice...)


What annoys me more is other stuff.

1. The initial settings prompted me to set up "Game Mode". But the colors looked like shit in game mode. I thought maybe it was just me preferring more saturated colors somehow, but I looked up the panel on RTings and set it to their recommended accuracy settings (which took it off of game mode, though). And it looked SO much better. Wtf are they thinking with the default color settings for game mode? Looks like garbage. But I've read that turning off the game mode increases input lag or something... and I have kind of noticed some difference in my gameplay since I turned it off (currently playing Empyrion).
2. There are all kinds of quirks due to this natively being a TV. One of them is display flickering depending on some of the content I have, sometimes even if the content is on another monitor. The flickering is pretty annoying sometimes.
3. The pixel shift mode for burn in is glitched and broken, I feel. I can really notice it cutting off some of the edges of GUIs occasionally and it feels like it shifts too much, too long, and is too noticeable sometimes. I can really notice it shifting while I'm on desktop content, too.
4. The lack of curve really does hurt at this size. But I can sort of deal with it.
5. I'm going to need to get VERY inventive to somehow get another monitor above it for other usage at this point. No monitor arm will go up that high with the stock pole...
6. Well, motion clarity is obviously not as good as the 45 inch QD panels at 240Hz, and definitely not when it's glitching. Might be because this is an older title with oddball settings though.

I'm still sort of leaning towards keeping it, but there are definitely a lot of tradeoffs. I really do like the larger single screen for content consumption. I just REALLY wish it was curved sometimes.
Input lag is a lot higher outside of game mode. RTings says about 90ms vs. about 11ms. 90ms would be very noticeable.

The flicker may be due to VRR. LG's TVs seem to have a persistent problem with flickering while using VRR. Apparently it's most noticeable in darker colors, especially dark grey.
 
The flicker, if it's not the VRR flicker, can be reduced by setting the option Reduce Input Delay to Standard instead of Boost.
 
Input lag is a lot higher outside of game mode. RTings says about 90ms vs. about 11ms. 90ms would be very noticeable.

The flicker may be due to VRR. LG's TVs seem to have a persistent problem with flickering while using VRR. Apparently it's most noticeable in darker colors, especially dark grey.
Yeah at 90ms the cursor delay makes it feel like the mouse is underwater with extra resistance! ;-)
 
The flicker, if it's not the VRR flicker, can be reduced by setting the option Reduce Input Delay to Standard instead of Boost.
Does this affect the input delay that the previous posters are talking about?

Yeah at 90ms the cursor delay makes it feel like the mouse is underwater with extra resistance! ;-)
Input lag is a lot higher outside of game mode. RTings says about 90ms vs. about 11ms. 90ms would be very noticeable.

The flicker may be due to VRR. LG's TVs seem to have a persistent problem with flickering while using VRR. Apparently it's most noticeable in darker colors, especially dark grey.
Yes, I noticed a HUGE difference with the game mode enabled, gameplay wise. It's no wonder I was dying so much in Empyrion, and felt like my actions were sort of disjointed. I tried it out in Red Dead Redemption with and without the game mode, and there was a large difference with just panning around.

Unfortunately, the colors looked a lot worse. A lot worse. I googled around, and this is a common complaint with this display.

So, I tried to see if I could "sort of" fix it. Because this display's performance in games without game mode on is abysmal. For reference, these are the settings I was going to try to get the display closest to, with game mode on. This is what I tweaked the values to, in the short span before I had to leave for work:

Gamma: 1.9 (this made the biggest difference in my experience; while the non-game modes (and game mode too) have this at 2.2 by default, from what I'm seeing 1.9 brings game mode closer to them, for some reason)
Color temperature: Warm 8-50; this is kind of a preference, I like it warm. I'm trying 8-15 and those looked fine.
Black stabilizer: 7
White stabilizer: 10 (should be default)
Game Contrast: 85
Game Black Level: 50
Game Sharpness: 0 (this was just lifted from the RTings settings, kind of a preference imo)

I think everything else was mostly default (Method 2 points, point high, no tint, etc etc). Just going to do a quick rant to say that working in two different menus is freaking annoying.

FWIW, RTings' color warmth is Warm 50 for the white balance, which is the stock one when you put it on that setting. I'm sure many would think this is too warm, but I'm sort of used to having Night Light on in the past, while working on the displays. I'm a bit dubious that this is the most accurate setting, though I do think that without any warmth game mode looks very washed out. I think around Warm 8-20 may be better while still not being quite as harsh as 0, or the stock setting on the Game Mode. Anyway, with these settings it sort of started coming close enough for my tastes... I think. I swapped back and forth between the two during a sunrise scene in Red Dead 2 and there was a much more subtle change than before. I would be curious if anyone here could mess with it and try to improve it. One setting I haven't touched is "Fine Tune Dark Areas" in the Game Optimizer menu.
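For what it's worth, the gamma 1.9 tweak lifting shadows makes mathematical sense: a lower gamma exponent maps the same dark input to a brighter output. A quick sketch with normalized 0..1 values (the actual panel response will differ, so this is only a rough illustration):

```python
# Compare output levels under gamma 1.9 vs. gamma 2.2 for dark inputs.
# Inputs/outputs are normalized 0..1; real panels add their own EOTF
# behavior on top, so this only shows the direction of the change.

def gamma_out(signal, gamma):
    """Simple power-law transfer: output = input ** gamma."""
    return signal ** gamma

for s in (0.05, 0.10, 0.25, 0.50):
    g19 = gamma_out(s, 1.9)
    g22 = gamma_out(s, 2.2)
    print(f"input {s:.2f}: gamma 1.9 -> {g19:.4f}, gamma 2.2 -> {g22:.4f}")
```

At every dark input, the 1.9 curve produces a brighter output than the 2.2 curve, which is why dropping the gamma value in game mode reads as more shadow detail.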
 
1. The initial settings prompted me to set up "Game Mode". But the colors looked like shit in game mode.

Input lag is a lot higher outside of game mode. RTings says about 90ms vs. about 11ms.

Yes, I noticed a HUGE difference with the game mode enabled, gameplay wise. It's no wonder I was dying so much in Empyrion, and felt like my actions were sort of disjointed. I tried it out in Red Dead Redemption with and without the game mode, and there was a large difference with just panning around.

Unfortunately, the colors looked a lot worse. A lot worse. I googled around, and this is a common complaint with this display.


. . .

For SDR games on my 48CX, I was blown away by some nature/anthropology 4K HDR material I watched when I first got the HDR OLED, with things like an islander spinning a double-ended flaming torch on a dark beach at night, etc., but I was disappointed when I started gaming on it because I found the colors to be dull. I tried swapping out of game mode to different media modes and the games looked much less muted, but I needed game mode to avoid lag and have the best gaming performance. So I later upped the saturation/color slider in the TV's OSD for game mode since I found it too muted compared to the other named picture modes. I actually cranked it a little higher than I'd ever use because I then use ReShade or Special K to tweak each game (down) individually to where I want it. Turning the saturation down from the slightly over-heightened OSD color level seems easier using the 3rd-party tools than the other way around. You can also adjust a lot of other things in ReShade/Special K, though I usually only use a few on most games.

HDR game mode has its own separate HDR picture mode that activates when an HDR game is running, so the saturation/color boost I apply to the SDR game mode doesn't affect the TV's HDR game mode. HDR games have their own color mapping/range, so they never looked muted to me like SDR game mode did, and that kind of thing is unnecessary for HDR games. I prioritize HDR games anyway now, and there is Auto HDR too, but for any games that don't have those I still have an option for a better gaming picture to my tastes. I'm not talking about blasting saturation to neon or anything, just bringing it back up to where it looks outside of the default game mode, which was muted/dulled. That's on the 48CX though, I can't speak for what the other, newer gaming TVs' game modes look like.
 
Sure it does, in a negative way. Although the eyestrain from the Boost setting is even worse.

Do you know how big the input delay difference is? I honestly didn't even know I was sensitive to input delay until I tried this TV's standard settings out lol. Agreed that the flickering does cause a lot of eyestrain btw...
 
Do you know how big the input delay difference is? I honestly didn't even know I was sensitive to input delay until I tried this TV's standard settings out lol. Agreed that the flickering does cause a lot of eyestrain btw...
No, I haven't measured. Standard mode is very easy on eyes though.
 
For SDR games on my 48CX, I was blown away by some nature/anthropology 4K HDR material I watched when I first got the HDR OLED, with things like an islander spinning a double-ended flaming torch on a dark beach at night, etc., but I was disappointed when I started gaming on it because I found the colors to be dull. I tried swapping out of game mode to different media modes and the games looked much less muted, but I needed game mode to avoid lag and have the best gaming performance. So I later upped the saturation/color slider in the TV's OSD for game mode since I found it too muted compared to the other named picture modes. I actually cranked it a little higher than I'd ever use because I then use ReShade or Special K to tweak each game (down) individually to where I want it. Turning the saturation down from the slightly over-heightened OSD color level seems easier using the 3rd-party tools than the other way around. You can also adjust a lot of other things in ReShade/Special K, though I usually only use a few on most games.

HDR game mode has its own separate HDR picture mode that activates when an HDR game is running, so the saturation/color boost I apply to the SDR game mode doesn't affect the TV's HDR game mode. HDR games have their own color mapping/range, so they never looked muted to me like SDR game mode did, and that kind of thing is unnecessary for HDR games. I prioritize HDR games anyway now, and there is Auto HDR too, but for any games that don't have those I still have an option for a better gaming picture to my tastes. I'm not talking about blasting saturation to neon or anything, just bringing it back up to where it looks outside of the default game mode, which was muted/dulled. That's on the 48CX though, I can't speak for what the other, newer gaming TVs' game modes look like.

. . . . . .

Re: black level. I prefer things that should be dark to be dark. If I'm in an adventure game I'll use a torch 🔥, or in Elden Ring or Lords of the Fallen the hanging lantern on your gear that you can turn on 🎃
Some games do require adjusting the settings in the games themselves to begin with, but imo not every dark area is meant to be seen in full detail in most dark games or game areas unless you bring a bright light near or cast one into the area.

View attachment 622790

View attachment 622791

View attachment 622792



Shooters/competitive games are a whole other conversation, since eking out little advantages over other players is usually done by people who prioritize that over aesthetics and game design, scene and creator intent, etc. Afaik there is no PunkBuster forcing black levels, contrast, level of detail vs. distance, turning FX and grass/foliage off, changing shadow settings, etc., and nothing at all tied to your OSD's settings either, obviously - so people can whittle their image down or blow it out however they want if it lets them see their opponents better (and/or gives them higher frames per second)... manipulating settings kind of like predator vision, in some sense/to some degree, for advantage compared to what the scene would normally be represented as.

View attachment 622793
Windows has an HDR tuning wizard, which has a color saturation slider at the end of it. I had to put the slider at 60 or 70 to roughly match the color saturation of SDR mode when displaying SDR content via Auto HDR.
 
So I found out yesterday that apparently I was never actually in HDR mode. I thought for sure that Windows set it automatically when I connected the thing, but I actually had to set it manually. The display had some visual artifacts at first when I did it, but then I just unplugged it and plugged it back in and it seems to be stable.

Hogwarts Legacy looked really amazing even in game mode with HDR enabled. Some games default to SDR, like Red Dead 2, though. But it started hurting my eyes because it was too bright. I had to turn down the OLED pixel brightness, which might defeat the point of HDR to begin with, I guess, but I can't view the display too long at 100% brightness. I have to turn it down to 80-90%, usually 80. And even 80 is kind of pushing it. I'm not sure how you people do 100% HDR brightness lol.
No, I haven't measured. Standard mode is very easy on eyes though.

I tried out standard mode, but I noticed a difference immediately... sigh... guess I just have to deal with the flickering occasionally.

HDR does look really amazing color wise though.
 
A lot of games have their own in-game adjustments in their menus for HDR, so by default a lot of them probably aren't at any kind of HDR reference point for brightness, peak brightness, saturation, etc. The majority of the games I've seen only have a gamma/darkness setting ("move the slider until this symbol is barely visible"), which is not exactly a perfect science. Same goes for the Windows HDR Calibration tool. A few games have two or more HDR settings in their menus though, and some look pretty good by default.

Whether the game has its own HDR picture controls from the dev, whether those controls cover enough bases, whether the game has good HDR by default without tweaking, and whatever settings the user decided to play the game with can all make a big difference in how a game looks outside of your gaming TV's own OSD settings/limits. If you haven't mapped the settings to match your monitor/gaming TV's peak brightness correctly, then more of the HDR range will sit at the top end of the scale, compressed and clipping (blowing out) the color detail to white. For example, in the past when an HDR game wasn't dev'd with good enough controls of its own, Windows would default to an HDR 4000-nit curve, I think, instead of 1000 or the 720-800 of an LG OLED, so the TV would have to compress that 4000-nit range down into the screen's limitations, clipping a lot more of the top end to white. The Windows HDR Calibration tool helps assign a range now, using a color profile it generates, but it's based on a series of "is this better or is this better?" decisions made with sliders by the user, so it's not exact values from a decision perspective. It still helps a ton and lets you set a scale without having to use a CRU edit to exact values. Some games have a better set of HDR settings in them too, which helps a ton in how the end result appears.
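To put rough numbers on that mismatch, here's a toy Python sketch comparing a hard clip at an ~800-nit panel peak against a simple soft-knee roll-off for content on a 4000-nit curve. The `simple_rolloff` function and its knee value are made-up illustrations of the general idea, not the TV's actual tone-mapping math:

```python
# Toy comparison: hard clipping vs. a soft-knee roll-off when 4000-nit
# content meets an ~800-nit panel. All numbers are illustrative.

def hard_clip(nits, panel_peak=800):
    """Everything above the panel's peak blows out to the same white."""
    return min(nits, panel_peak)

def simple_rolloff(nits, panel_peak=800, knee=0.75, source_peak=4000):
    """Hypothetical soft knee: linear below knee*peak, then compress
    the remaining source range into what's left of the panel range."""
    start = knee * panel_peak  # where compression begins (600 nits here)
    if nits <= start:
        return nits
    return start + (nits - start) * (panel_peak - start) / (source_peak - start)

for n in (400, 800, 1500, 4000):
    print(n, hard_clip(n), round(simple_rolloff(n)))
```

With the hard clip, 1500-nit and 4000-nit highlights come out identical (both 800), so all the detail between them is lost; the roll-off keeps them distinguishable at the cost of compressing the top end.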



That said, if you are in a very dark room you might try some subtle LED bias lighting behind the screen. If I remember correctly, Red Dead 2 had a lot of white snow scenes, at least in the beginning, and white snow can be very bright (though an OLED's ABL would prob kick in and dim it anyway). I never really played it through so can't really comment on it much.

. . . .

This is one user's settings from Elden Ring in HDR. I'm not sure about the Windows HDR = off setting working, since I leave mine on all of the time, but a big thing is to use HGiG, which is more like a static scale, and not dynamic tone mapping, which instead sort of interpolates and swings the color ranges around and can end up like showroom-floor "torch mode".

Thought I'd drop my settings as I have been doing a lot of research on the best options. I'm using windows 10 with LG C1 55".

Windows HDR Setting: OFF (your TV will automatically switch to HDR mode when you open the game in fullscreen mode and enable HDR; if you have the Windows setting turned on it can bug out the game and either wash out or over-expose the colours)

Elden Ring in-game HDR Settings: ON. Brightness: 5-7 (depending on your preference and lighting in your room). Maximum brightness: 800 (any higher and you will lose detail in the brighter areas). Contrast: 5-7 (depending on your preference - I have mine set to 7 so the colours pop).

LG C1 TV Settings: Open the 'Home Dashboard' app and go to the HDMI input source that your PC is hooked up to. You'll need to edit this input device and make sure it is set to 'PC' and HDMI Deep Colour is enabled (this might be in the general TV settings - I can't quite remember). When the game is open and HDR enabled (this will kick your TV into the Game Optimiser HDR mode and an HDR logo should pop up on your screen when you tab into the game), go into your TV's Advanced Picture settings -> OLED Brightness: 100, Screen Brightness: 50, Contrast: 100, Gamma: 2.2, Dynamic Tone Mapping: HGIG, and then leave everything else default. Under Colour settings, change White Balance to '-50', Colour Depth to 55. This will make sure your TV puts out a white point of 6500K, which is the reference amount, rather than the default of around 7000K, which is too 'blue-ish'/cool. It might look a little orangey at first but trust me that is the correct amount (as stated by HDTVTest's C1 HDR settings on YouTube) and you'll get used to it very quickly. Then, under the sharpness setting, set this to 0 as it is just adding another layer of post-process sharpening which you won't need at 4K.

This is how I got Elden Ring looking absolutely insane on my setup and it looks 100X better than SDR. Let me know if these settings worked for you :)


tl;dr: Windows HDR: OFF. Elden Ring HDR: 7, 800, 7. LG Dynamic Tone Mapping: HGIG.


Here is a screenshot of the Elden Ring in-game HDR controls. Notice the Max Brightness setting. A lot of games seem to lack that for some reason.

[screenshot: Elden Ring in-game HDR settings menu]

https://www.hdrgamer.com/2022/02/elden-ring-hdr-settings.html
 
I have a 48" C2 and it completely floored me; I'm now afraid to buy OLEDs from anyone but LG from now on. Coming from IPS, TN, and VA, the picture quality is that good. Crazy beautiful. Gaming and movies, HDR, 4K. Awesome. Mine is properly calibrated though.
 

https://youtu.be/2L_OtIQEXOk?si=dHji2MzZVu7W-HOn

Some interesting updates, for example LG implementing a new subpixel layout. And that 240Hz 16:9 42" panel has just been completely removed from all roadmap information, so it's maybe fully canceled, which is a huge kick in the groin to me.


The 42" is pretty much in no man's land right now, I don't think it's even going to get MLA tech next year. So while the other monitor sized panels are getting up to 1300 nits brightness, new RGWB subpixel layout, 240Hz refresh rate, the 42" panel gets nothing.
 
The 42" is pretty much in no man's land right now, I don't think it's even going to get MLA tech next year. So while the other monitor sized panels are getting up to 1300 nits brightness, new RGWB subpixel layout, 240Hz refresh rate, the 42" panel gets nothing.
Yeah it seems like LG has accepted that the 42" is strictly a TV and won't be upgrading it to have monitor features. Probably safe to say now that the C series will continue to offer the same features it has now for the next few years, with maybe slightly faster processing.
Dang...
 
The video is about OLED monitors, not TVs. The roadmap doesn't even mention the C4 getting 144Hz, for instance.
I find it extremely hard to believe there won't be innovations on the TV side considering how fierce the competition is.
 
That said, if you are in a very dark room you might try some subtle LED bias lighting behind the screen. If I remember correctly, Red Dead 2 had a lot of white snow scenes, at least in the beginning, and white snow can be very bright (though an OLED's ABL would prob kick in and dim it anyway). I never really played it through so can't really comment on it much.

I already have two lamps right behind the display. Actually out to the sides of it, so their light reaches me more or less directly. It doesn't really help when it's that bright.

Here is a screenshot of the Elden Ring in-game HDR controls. Notice the Max Brightness setting. A lot of games seem to lack that for some reason.

The thing is I tend to play a lot of my games in Borderless Windowed mode because I like to alt tab a lot.
This is one user's settings from Elden Ring in HDR. I'm not sure about the Windows HDR = off setting working, since I leave mine on all of the time, but a big thing is to use HGiG, which is more like a static scale, and not dynamic tone mapping, which instead sort of interpolates and swings the color ranges around and can end up like showroom-floor "torch mode".

I already set it to HGiG because that's what the setup guide recommended. It helps during games a bit, but not during desktop usage. As it is, I had to set the OLED brightness to about 45-70%, depending on how photosensitive I am at the time. Otherwise it's just too much. It looks great, but there's no point if it causes me pain during those times. What is sort of impressive is how low the brightness actually goes, though. At 0 this can become the dimmest display I've ever tried. I'm kind of tempted to get one of these for bedroom use, because at night I could read things on it without any strain with the lights out and at 0 brightness.
 
For me, HDR is not too bright at all. I own a 48CX and I also have a 77" C1 in my living room. OLED is actually not bright enough for scenes with 25% to 50% of the screen in the middle ranges. OLED can't sustain it and ABL will kick in.

Most scenes break down with roughly 50% of the scene in the SDR range (80 to 120, maybe up to ~200 nits), 25% of the pixels in the mids, and 25% in bright highlights/edges/light sources, at least on typical scenes. It can vary on particular scenes, but on average it's more like that breakdown.

This is from RTings' *monitor* review of the 42-inch LG C2:

You can see that the brightness is relatively low unless only a tiny percentage (2%-10%) of the screen is showing bright highlights.

Like I said if you are still having a problem, adding some bias lighting (e.g. a LED lamp/smart bulb or led lighting strip on a smart outlet behind the screen) can help because our eyes view everything relatively (e.g. viewing a phone screen or flashlight in bright daylight vs dark night for an extreme example). However you get the greatest perceived range when viewing in dim to dark home theater scenarios due to the way our eyes work. For example, you'd prob have to use SDR at 250nit+ in a brighter room just to get back to matching how it looks to your eyes in dim to dark viewing conditions at lower SDR brightness levels.

[chart: RTings LG C2 42" monitor review - HDR brightness by window size]
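As a back-of-envelope check on that rough 50/25/25 breakdown: weighting each bucket by its share of the screen gives an approximate frame-average light level. The per-bucket nit values below are hypothetical stand-ins I picked for illustration, not measurements:

```python
# Back-of-envelope frame-average light level from a rough scene
# breakdown: ~50% SDR-range pixels, ~25% mids, ~25% highlights.
# The nits assigned to each bucket are hypothetical, not measured.

buckets = [
    (0.50, 150),  # SDR-range content, ~80-200 nits -> assume 150
    (0.25, 400),  # mids (hypothetical value)
    (0.25, 800),  # highlights/light sources (~ assumed panel peak)
]

avg_nits = sum(frac * nits for frac, nits in buckets)
print(f"frame-average light level ~ {avg_nits:.0f} nits")
```

Even with generous highlight values, the frame average lands well below the panel's small-window peak, which is the kind of sustained load where ABL starts dimming the picture.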


. . . . . . .

It helps during games a bit, but not during desktop usage.

I'd definitely use a completely different named picture mode that is much dimmer, set up just for desktop/app use. Then switch to Game mode for games with full range, and another picture mode for movies and shows. On my living room TV I set up several different named picture modes for different lighting conditions (day, night/blinds) or viewed content (anime, CGI animation, SDR-recorded "real world" material/shows, etc.). Note that when viewing HDR material, the named picture modes switch to a different HDR version of the picture modes. They are separate. I find SDR needs more tweaking than the more defined HDR material's supplied metadata ranges, and I don't mind perverting SDR a little more toward what I find more eye-pleasing since it's so limited to start with by comparison.
 
Yeah it seems like LG has accepted that the 42" is strictly a TV and won't be upgrading it to have monitor features. Probably safe to say now that the C series will continue to offer the same features it has now for the next few years, with maybe slightly faster processing.
Dang...
Plenty of brands are using LG 42" and 48" panels to build monitors.
 
My LG CX used mostly for work so far. OLED Light at 0 for that. Dim lighting in here. (No overhead lighting on, blinds closed...)
 
Like I said, if you are still having a problem, adding some bias lighting (e.g. an LED lamp/smart bulb or LED lighting strip on a smart outlet behind the screen) can help, because our eyes view everything relatively (e.g. viewing a phone screen or flashlight in bright daylight vs. dark night, for an extreme example). However, you get the greatest perceived range when viewing in dim-to-dark home theater scenarios due to the way our eyes work. For example, you'd probably have to use SDR at 250 nits+ in a brighter room just to match how the screen looks to your eyes at lower SDR brightness levels in dim-to-dark viewing conditions.

I mean, I just said I have two lamps behind the display. Like not quite directly, but to the sides of it. I think there's plenty of bias lighting around it, more than an LED strip would provide. Unfortunately, the brightness it gets to even at 75% just hurts my eyes. Physically. I think I just have some periods where I'm photosensitive (usually tied to sleep deprivation and playing too many games during holidays), so I'll just have to lower brightness. Thing is, I'm just used to working on dim monitors at this point. To reduce eyestrain I usually leave many of my monitors at incredibly low brightness values. One of the ASUS monitors I'm using as a side monitor is actually at 0% brightness in the menu.

The inky blacks make everything look great either way, though. HDR does somehow make things pop more even at low brightness values.

My LG CX used mostly for work so far. OLED Light at 0 for that. Dim lighting in here. (No overhead lighting on, blinds closed...)

Yeah I think the ability of OLED to go down to that low of a brightness might be something that's a bit undersold. When I'm reading books or just consuming text content on my S23 Ultra, the ability to just have super low brightness made me kind of ditch my Kindle.
 
The video is about OLED monitors and not TVs. The roadmap doesn't even show the C4 getting 144 Hz, for instance.
I find it extremely hard to believe there won't be innovations on the TV side considering how fierce the competition is.

The roadmap doesn't show it but he verbally mentions the 144Hz update and says that the 240Hz version has disappeared off the roadmap. There will be innovations at some point yeah, but 2024 ain't it.
 
I mean, I just said I have two lamps behind the display. Like not quite directly, but to the sides of it. I think there's plenty of bias lighting around it, more than an LED strip would provide. Unfortunately, the brightness it gets to even at 75% just hurts my eyes. Physically. I think I just have some periods where I'm photosensitive (usually tied to sleep deprivation and playing too many games during holidays), so I'll just have to lower brightness. Thing is, I'm just used to working on dim monitors at this point. To reduce eyestrain I usually leave many of my monitors at incredibly low brightness values. One of the ASUS monitors I'm using as a side monitor is actually at 0% brightness in the menu.

The inky blacks make everything look great either way, though. HDR does somehow make things pop more even at low brightness values.



Yeah I think the ability of OLED to go down to that low of a brightness might be something that's a bit undersold. When I'm reading books or just consuming text content on my S23 Ultra, the ability to just have super low brightness made me kind of ditch my Kindle.

That's a ping-pong usage scenario, minimizing the game and doing work/reading in short bursts, so I can understand that you wouldn't want to keep switching the TV's OSD to a dimmer named picture mode you set up and then back to Game mode in its full glory. That's kind of an outlier situation and imo would be better served using two different screens - one for dim desktop/app reading, and another for the concert stage / movie theater / gaming world fireworks display. That is, rather than degrading one use to serve only the other optimally.
 
Input lag is a lot higher outside of Game mode. RTINGS says about 90 ms vs. about 11 ms. 90 ms would be very noticeable.

The flicker may be due to VRR. LG's TVs seem to have a persistent problem with flickering while using VRR, apparently most noticeable in darker colors, especially dark grey.
That is outside of Game mode and not in PC mode as I recall it?
 
I mean, I just said I have two lamps behind the display. Like not quite directly, but to the sides of it. I think there's plenty of bias lighting around it, more than an LED strip would provide. Unfortunately, the brightness it gets to even at 75% just hurts my eyes. Physically. I think I just have some periods where I'm photosensitive (usually tied to sleep deprivation and playing too many games during holidays), so I'll just have to lower brightness. Thing is, I'm just used to working on dim monitors at this point. To reduce eyestrain I usually leave many of my monitors at incredibly low brightness values. One of the ASUS monitors I'm using as a side monitor is actually at 0% brightness in the menu.

The inky blacks make everything look great either way, though. HDR does somehow make things pop more even at low brightness values.



Yeah I think the ability of OLED to go down to that low of a brightness might be something that's a bit undersold. When I'm reading books or just consuming text content on my S23 Ultra, the ability to just have super low brightness made me kind of ditch my Kindle.


Not trying to argue against your specific case of not getting along with brighter light. There are conditions that people who are sensitive to light may have, and you did mention headaches...

https://vestibular.org/article/diagnosis-treatment/vision-hearing/light-sensitivity/
Light-Sensitivity-Infographic-by-Theraspecs.jpg.jpg


. . . . . . . .

But most people without light sensitivity issues who've been tested (famously by Dolby, testing viewers on HDR 4,000 and HDR 10,000 screens vs. lower-brightness-range screens) prefer the brightest HDR highlights and light sources for the most realism in on-screen images / media ~ virtual experiences. That is, of course, bright areas of the screen from objects, light source edges, highlights, and reflections in the scene that would be bright in real life - not turning a whole SDR scene's relatively low range up across the entire scene.

..... 🌚🌒 ☀️ 👀


low-light.jpg



tPyXzb2.png


This kind of thing below looks amazing and impactful to me in HDR (they are just SDR gifs here), and the same kind of thing in games. OLED can't sustain bright scenes or brightness across larger percentages of the screen (ABL will kick in), but it can still provide appreciably bright HDR highlights and, of course, black depths down to oblivion at the pixel level. Typically scenes are more like the bottom two rather than the sun-like portal in the first two; the bottom two still have very bright HDR highlights, but they are confined to smaller areas. Star Wars scenes have a lot of overall normal-to-dim scenes but with laser beams, ship thrusters, electronic panel lights, etc. too. The first image below would be extra bright on a larger portion of the screen, but it would be brief and have a big impact.

It would be nice to have the capability for those scenes when they happen briefly, even on HDR 4,000- and HDR 10,000-capable screens in the future. Screens with high HDR color volume / range can display color variations higher up the scale, so they show more color detail with many more brighter color values added to the palette - it's not just a screen with the brightness turned up. The more limited the screen's HDR range, the more the colors will be compressed - more colors will share the same color value, all blended together - and/or the top brightest colors/details will be clipped to white, depending on the tone mapping.
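A minimal sketch of that compression vs. clipping trade-off, assuming a hypothetical 800 nit display peak and a simple Reinhard-style roll-off (the knee value and nit levels are my own illustrative choices, not any TV's actual tone mapping):

```python
# Sketch: two ways a display with a limited peak (say 800 nits) can handle
# highlights mastered far above it. Hard clipping collapses everything above
# the peak to the same white; a Reinhard-style roll-off compresses instead,
# so bright values stay distinguishable but get squeezed together.

PEAK = 800.0  # hypothetical display peak in nits

def hard_clip(nits):
    return min(nits, PEAK)

def reinhard_rolloff(nits, knee=0.75):
    """Pass values below the knee through; compress the rest toward PEAK."""
    start = PEAK * knee
    if nits <= start:
        return nits
    # Map (start, infinity) smoothly into (start, PEAK)
    overshoot = nits - start
    room = PEAK - start
    return start + room * overshoot / (overshoot + room)

for n in (500, 1000, 2000, 4000):
    print(n, hard_clip(n), round(reinhard_rolloff(n), 1))
```

With the hard clip, 1,000, 2,000, and 4,000 nit details all land on 800 and blend into one white; the roll-off keeps them as distinct (if compressed) values below the peak.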



Maybe HDR is not for everyone, just like some people have trouble with VR (though using Mixed Reality instead should help there and be preferable for people who suffer from it, since it keeps the real world as a foundation in the viewport vs. vertigo). I'm joking with the powder reference here. To be honest, I have light skin (mostly Irish descent) and I actually wear transition lenses at work because I dislike the excessive brightness of the overhead fluorescent/tube lights in some areas. But those lights are stationary and full-bright like an operating room. I also hate those "Edison bulbs" - clear bulbs with a bright lighting coil inside and no diffusion - as they burn into my retinas like a flash bulb 📸. Yet I still love bright HDR (well, as bright as OLED can get so far on my screens) and I prioritize HDR in both movies and games now - so it might depend on the person.

MV5BMTQ0Nzg4NzMzN15BMl5BanBnXkFtZTcwODAyNDAxMw@@.jpg
 
That is outside of Game mode and not in PC mode as I recall it?

The ~90 ms of input lag is outside of game mode.



The flicker, as I understand it, is because the gamma of the screen is calibrated at 120 Hz and doesn't vary to track the refresh rate as it changes with VRR. Therefore (when using VRR), the more extreme and erratic your frame rate range's peaks vs. dips are - and, I would think, the lower your frame rate sinks below the 100s, and the deeper the fps potholes on those kinds of graphs - the worse the flicker and/or black lifting would be. The closer your frame rate range stays to 120 fpsHz, the closer it stays to the normal gamma, and the less erratic the variance, the less strobing the flicker would probably be. Some specific games (and game menus) apparently exacerbate the flicker issue for one reason or another too.

. . .

Running, for example, a 100fps average graph might get:

(70fpsHz) 85fpsHz <<<< 100fpsHz average >>> 115fpsHz (118 fpsHz capped)

. . .

Running an 80fps average graph might get:

(50fpsHz) 65fpsHz <<<< 80fpsHz average >> 95fpsHz (110fpsHz)

. . .

Running a graph that hits 120fps (or higher) on the top end might help minimize the issue:

(75fpsHz) 90fpsHz <<<< 105fpsHz average >>> 120fpsHz+ (118fpsHz capped)

. . .

Running a graph where 2/3 of it is at the cap would probably reduce it even more:

(90fpsHz) 105fpsHz <<< 118fpsHz capped >>> 118fpsHz capped

. . . .

Otherwise, running a graph whose minimum is higher than 120fps would probably be the only way to eliminate it entirely, and you wouldn't need VRR on at that point because the frame rate would never vary beneath the peak Hz of the screen (you could just cap the top fps to 118 to eliminate tearing from overages). Most people are using VRR so that they can squeeze some higher graphics settings out of their game, though, so lowering your graphics settings on some games to get a 120fps+ minimum probably isn't a very appealing fix for a lot of people.
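The example graphs above can be sketched numerically. This is just a toy illustration, under the assumption that the flicker scales with how far the VRR refresh swings - the sample numbers mirror the examples, not measured data:

```python
# Toy sketch of the fps graphs above: "flicker severity" here is modeled
# simply as the spread between the lowest and highest refresh rate the
# panel tracks, with a 118 fps cap applied (just under the 120 Hz peak).

CAP = 118  # fps cap just under the 120 Hz panel maximum

def refresh_range(fps_samples, cap=CAP):
    """Return (low, high, spread) of the capped refresh rates VRR will track."""
    capped = [min(f, cap) for f in fps_samples]
    return min(capped), max(capped), max(capped) - min(capped)

# Mirrors the example graphs: higher minimums shrink the swing.
print(refresh_range([70, 85, 100, 115, 130]))   # 100 fps average graph
print(refresh_range([90, 105, 118, 125, 140]))  # mostly pinned at the cap
print(refresh_range([125, 130, 140, 150]))      # never dips below 120: no swing
```

The last case shows the "only way to eliminate it entirely" point: if the frame rate never drops below the panel's peak, the capped refresh never varies at all.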

. . .
 
That is outside of Game mode and not in PC mode as I recall it?

In my experience game mode on and off is a huge difference, and then on top of that, turning off the "Boost" mode also makes a big difference. I tested this out in multiple games just panning around. It's really noticeable, unfortunately.


Not trying to argue vs. your specific case of you not getting along with brighter light. There are conditions people who are sensitive to it may have and you did mention headaches...
<a lot of stuff>

For one, I never said I didn't prefer it at max brightness. I do like how it looks at max brightness. But on the days it causes me pain, it's going to get turned down. I don't care what I'm missing out on visually, any pain isn't worth it.

Two, I never actually mentioned headaches (I'm pretty sure you can CTRL+F the last three pages of this topic and you won't find a single mention of the word "headache"). I rarely ever get headaches, and if I do, they're usually the direct result of sleep deprivation. As for migraines, I'm not sure I've ever had one. I have a friend who experiences frequent migraines, and I've never felt anything like what he describes during a headache. The only pain I get is strictly eye pain - I feel it only in that area - due to exposure to bright lights.

The thing is it's very off and on, too. Today I have the display at 100% brightness, but I'm looking at it without issues. Even on bright web pages, I'm not really experiencing any pain. I'm kind of wondering if it's allergy/dry eye related or something. Some days it's an issue some days it isn't. But discussing my medical issues (or lack thereof) isn't really the point of this topic lol.
 

View: https://imgur.com/c5CISOd

Maybe I considered eyestrain as something headache-like or related to headaches, since you mentioned the light caused you pain. The point was light sensitivity issues. If you are basing your position on your photosensitivity, then I think it's pretty relevant to the topic - at least in replies addressing what you brought up - otherwise why mention it in the first place?

I can't view the display too long at 100% brightness. I have to turn it down to 80-90%, usually 80. And even 80 is kind of pushing it. I'm not sure how you people do 100% HDR brightness lol.

As it is, I had to set the OLED brightness to about 45-70%, depending on how photosensitive I am at the time. Otherwise it's just too much. It looks great, but no point if it causes me pain during those times.

Pain is not good; I'm not trying to argue that you should put yourself in pain. In fact, I suggested using a different named picture mode with lower brightness just for static desktop/apps/reading and using the full range for Game mode, but I can see how that probably won't work for the way you are using your screens currently. They're your screens, so use them how you want. 😎
 
I can't view the display too long at 100% brightness. I have to turn it down to 80-90%, usually 80. And even 80 is kind of pushing it. I'm not sure how you people do 100% HDR brightness lol.

Just to be clear, the gaming TV's OSD brightness setting is a separate thing that you keep at max for content like games and movies. You don't have to keep your desktop as bright as your OSD peaks. I'm pretty sure that most of us who keep Windows HDR on are using the HDR/SDR brightness slider in Windows' HDR settings at a very low setting for desktop/app use. Games use their own metadata/curve with the full brightness range, separate from that, which kicks in for the game. The idea isn't to use desktop/apps at 100% HDR brightness or even 80%, but rather to set this slider much lower in Windows settings. That makes the desktop a lot dimmer compared to the peaks set in the TV's OSD.

Still, it wouldn't reduce the full HDR volume ~ color heights available once you run games and movies with the OSD at full HDR range, though I suspect living by candlelight reading and then stepping out into the sunlight over and over could cause a contrast issue with pupil dilation ~ eyesight adjusting to conditions. But remember that even in the full HDR range of a particular screen's capabilities, the highest-nit colors are mostly isolated in highlights, light sources, and bright reflections in smaller parts of the scene, rather than the full field of the screen being bright all of the time. (y)

Brightness-balance-UI.png
 
Just to be clear, the gaming TV's OSD brightness setting is a separate thing that you keep at max for content like games and movies. You don't have to keep your desktop as bright as your OSD peaks. I'm pretty sure that most of us who keep Windows HDR on are using the HDR/SDR brightness slider in Windows' HDR settings at a very low setting for desktop/app use. Games use their own metadata/curve with the full brightness range, separate from that, which kicks in for the game. The idea isn't to use desktop/apps at 100% HDR brightness or even 80%, but rather to set this slider much lower in Windows settings. That makes the desktop a lot dimmer compared to the peaks set in the TV's OSD.

Still, it wouldn't reduce the full HDR volume ~ color heights available once you run games and movies with the OSD at full HDR range, though I suspect living by candlelight reading and then stepping out into the sunlight over and over could cause a contrast issue with pupil dilation ~ eyesight adjusting to conditions. But remember that even in the full HDR range of a particular screen's capabilities, the highest-nit colors are mostly isolated in highlights, light sources, and bright reflections in smaller parts of the scene, rather than the full field of the screen being bright all of the time. (y)

View attachment 624101
Exactly. The way I set this up is that I simply toggle between SDR and HDR modes and adjust the slider until it looks pretty close. Colors will be a bit worse in HDR mode because it's not running in sRGB color space but one of the HDR color spaces, but otherwise you should be able to get them to look pretty similar on OLEDs.
 
Beyond 2m (6.6ft) without fiber, cable quality starts to matter a lot more. You are way more likely to get blinks at 10+ ft if it's not a great cable and not an active one, so do a lot of checking before you buy a long passive cable.
 