LG 48CX

I've run mine with Windows HDR permanently enabled, OLED light at 100, and the Windows SDR brightness slider set to 12. I also disabled ASBL the first week I got the TV, and so far no issues in 2 years. Pretty sure I will end up replacing the CX long before it develops any hint of burn-in.
I used SDR mode because it had more accurate colors, and since I was mostly viewing SDR content, HDR mode made no real difference to me. But HDR is certainly an option that works fine as long as you set the SDR slider.
 
I used SDR mode because it had more accurate colors, and since I was mostly viewing SDR content, HDR mode made no real difference to me. But HDR is certainly an option that works fine as long as you set the SDR slider.

Windows does a pretty good job of displaying SDR content now even when it is set to HDR. For example, if you have an HDR YouTube video running in windowed mode in one corner of your screen while the rest of your desktop is SDR, only that video will be displayed in HDR, and the rest of the desktop looks like normal SDR. When I first tried Windows HDR back when it launched it was a huge mess and washed everything out. It's come a long way since then, so I would recommend just leaving HDR on all the time and adjusting your SDR brightness accordingly.
 
Windows does a pretty good job of displaying SDR content now even when it is set to HDR. For example, if you have an HDR YouTube video running in windowed mode in one corner of your screen while the rest of your desktop is SDR, only that video will be displayed in HDR, and the rest of the desktop looks like normal SDR. When I first tried Windows HDR back when it launched it was a huge mess and washed everything out. It's come a long way since then, so I would recommend just leaving HDR on all the time and adjusting your SDR brightness accordingly.
Yes, but on the LG OLEDs it still results in less accurate color. I think it has to do with running in a color space intended for HDR rather than sRGB. It's a visible difference, but not something I would call a dealbreaker; I notice it mostly in things like purple hues.
 
Windows does a pretty good job of displaying SDR content now even when it is set to HDR. For example, if you have an HDR YouTube video running in windowed mode in one corner of your screen while the rest of your desktop is SDR, only that video will be displayed in HDR, and the rest of the desktop looks like normal SDR. When I first tried Windows HDR back when it launched it was a huge mess and washed everything out. It's come a long way since then, so I would recommend just leaving HDR on all the time and adjusting your SDR brightness accordingly.
This. Heck, it's only been in the past 6 months or so that Microsoft fixed the longstanding bug where the Windows OSD would briefly kick back into SDR even when everything (Windows + media) was in HDR mode.
 
This. Heck, it's only been in the past 6 months or so that Microsoft fixed the longstanding bug where the Windows OSD would briefly kick back into SDR even when everything (Windows + media) was in HDR mode.

SDR overlays always pop things out of HDR mode, even on my Nvidia Shield. The Windows volume control doing this with its overlay was a huge PITA for me, not because it popped out of HDR for a moment like an overlay on my Shield, but because on Windows it was like doing a whole resolution switch when a game was in fullscreen exclusive mode, resulting in a clunky transition to and from.

On my PC I ended up using a USB MIDI board and manually assigning my apps, and whatever games I was playing at the time, to different sliders and knobs (labeling them with thin 1/4" red or black artist tape and black or white paint pens). That way I can control their volume levels individually without invoking the global volume control. The small USB MIDI board setup is pretty handy for adjusting volume levels and muting/unmuting in general: individual apps, the browser, Windows system sounds (set very low or muted when other volumes are high), the microphone, etc.
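The glue between a MIDI fader and a per-app volume is simple: MIDI control-change values are 7-bit, so they just need rescaling before being handed to the OS mixer. A minimal Python sketch of that mapping; the fader-to-app table is a made-up example, and actually applying the volume on Windows would go through something like the pycaw library, which is omitted here:

```python
# Sketch of mapping a MIDI fader (CC value 0-127) to a per-app volume
# scalar (0.0-1.0), the kind of glue a USB MIDI mixer setup needs.
# Applying the result to a real app session is OS-specific and omitted.

def cc_to_volume(cc_value: int) -> float:
    """Convert a 7-bit MIDI control-change value to a 0.0-1.0 volume scalar."""
    if not 0 <= cc_value <= 127:
        raise ValueError("MIDI CC values are 7-bit (0-127)")
    return cc_value / 127.0

# Hypothetical fader-to-app assignments, like the labeled sliders above.
FADER_MAP = {1: "game.exe", 2: "browser.exe", 3: "SystemSounds"}

def handle_cc(cc_number: int, cc_value: int):
    """Return (app, volume) for a mapped fader, or None for unmapped CCs."""
    app = FADER_MAP.get(cc_number)
    if app is None:
        return None
    return (app, round(cc_to_volume(cc_value), 3))
```

Each incoming CC message then resolves to one app's volume without ever touching the global Windows volume overlay.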
 
SDR overlays always pop things out of HDR mode, even on my Nvidia Shield. The Windows volume control doing this with its overlay was a huge PITA for me, not because it popped out of HDR for a moment like an overlay on my Shield, but because on Windows it was like doing a whole resolution switch when a game was in fullscreen exclusive mode, resulting in a clunky transition to and from.
At least on my PC, this got fixed at some point. Not sure if it's the current driver I'm on or if Microsoft fixed this for good, but I haven't had an overlay pop me out of HDR mode. [Note HDR is on within Windows; that might matter here]
 
At least on my PC, this got fixed at some point. Not sure if it's the current driver I'm on or if Microsoft fixed this for good, but I haven't had an overlay pop me out of HDR mode. [Note HDR is on within Windows; that might matter here]

Same here. I've had Windows HDR enabled for the past 2 years and have never experienced this.
 
It was a known issue with the Windows volume overlay. They may have fixed it in drivers. It could also have had something to do with me using multiple monitors. When I invoke the system menu overlay on my Nvidia Shield's Android system on my C1 TV (single-screen setup), it also drops paused movies out of HDR similarly, greying them out of HDR mode, though less clunkily than it used to happen on my multi-monitor setup.

Using the relatively small USB MIDI board means the Windows volume overlay/global volume never has to be invoked, so it hasn't been a problem for me in a long time either, regardless. Anyway, it's pretty convenient to have a separate slider or knob (with a mute/unmute toggle button above it) for every commonly used audible app, Windows system sounds, the microphone, incoming voice chat/calls from apps, headphones and surround speakers, or whatever games I'm playing at the moment, without having to change focus off of a fullscreen exclusive game or other focused window to a software audio mixer. A Stream Deck also helps with a lot of things.
 
It was a known issue with the windows volume overlay. They may have fixed it in drivers.
I know I tested earlier in the year and the bug was still present; I only tested again a bit before I got my C1, just to see if it was fixed, and found it surprisingly was. Note it wasn't the overlay itself; I had an app that disabled the overlay from appearing, but Windows still briefly kicked back into SDR. Other OSDs (Game Bar, etc.) had much the same effect.

Still, the fact we're *still* talking about this tells you what a train wreck Windows's HDR support has been.
 
I ran a burn-in tester earlier today and noticed blotches of slightly different color intensity, kind of like small clouds. I thought the display was starting to degrade.
I just ran the test on my MacBook Pro 14 with the fantastic mini led screen. I see the same artifacts on it, so crisis averted lol.

I guess it's an optical illusion coupled with light smudges on the glass.
 
This app has been posted in this thread previously, but from what I can tell that was over a year ago, and I didn't see much detail on its features.
It totally flew under my radar.

I stumbled upon this after updating to Windows 11 yesterday while looking for guides to set up HDR properly.

Color Control:
https://github.com/Maassoft/ColorControl
[screenshot]


It lets you set up all kinds of macros with different toggles. It looks like it is designed to change screen modes and toggle HDR depending on what application/game is running. I really don't want to bother going through the trouble of setting all of that up, but it has a killer feature I have been wanting.

Built-in macros to raise and lower the OLED light. Unfortunately it can't just tell the TV to go to a specific value; instead it sends remote-control commands through the network.
So, say you run the macro to raise the OLED light by +20: it will send the button presses to go through the menu and raise the OLED light value by 20.
You could also trivially edit the macro to, say, ±70 to go between 30 and 100.

Still a hell of a lot easier than doing it yourself. It wears me out: while working I keep the value low, and I raise it for games, so it's always back and forth.
 
This app has been posted in this thread previously, but from what I can tell that was over a year ago, and I didn't see much detail on its features.
It totally flew under my radar.

I stumbled upon this after updating to Windows 11 yesterday while looking for guides to set up HDR properly.

Color Control:
https://github.com/Maassoft/ColorControl
[screenshot]

It lets you set up all kinds of macros with different toggles. It looks like it is designed to change screen modes and toggle HDR depending on what application/game is running. I really don't want to bother going through the trouble of setting all of that up, but it has a killer feature I have been wanting.

Built-in macros to raise and lower the OLED light. Unfortunately it can't just tell the TV to go to a specific value; instead it sends remote-control commands through the network.
So, say you run the macro to raise the OLED light by +20: it will send the button presses to go through the menu and raise the OLED light value by 20.
You could also trivially edit the macro to, say, ±70 to go between 30 and 100.

Still a hell of a lot easier than doing it yourself. It wears me out: while working I keep the value low, and I raise it for games, so it's always back and forth.

Nice! Thanks.

Can you set any menu item up to a "hotkey" on the controller? I've always wanted to set one of the buttons to do the "turn off the screen" (emitters) function. I have that menu item at the bottom of my popup quick menu at the moment, but it takes a click + navigating to that heading. Alternately, with voice control active I can hold the mic button down and say "turn off the screen", but having a one-button press would be more convenient and would work more like a mute button. I do see a "screen off" heading in your screenshot, but I don't know if that is like hitting the power button of the screen and shutting it down into standby rather than the "turn off the screen" emitters trick.

I'll head over to the link and see if there is a writeup about it. Thanks again.

Not sure if that means the shortcuts can be activated on your pc itself? If that were the case I could link hotkeys to a stream deck which would be extremely convenient.

Looks like that might be the case like the controller software...

https://github.com/Maassoft/ColorControl/releases/tag/v4.0.0.0

[screenshot]


or in your remote controller screenshot they have an entry as "Alt + F5" for example.



I knew I could do some things via the phone software but I've been wanting to do the turn off the screen trick as well as some other tv functions via my stream deck (mapping hotkeys to it) ever since I got the 48cx.

Great find. If it was posted before I guess I snoozed on it too, so thanks all the same. Maybe I even posted it or replied to it and forgot about it, the way things go lol. I vaguely remember some color control software, but not the shortcut functions to the remote/TV itself.

Presets

With the presets you can perform actions on your TV that you would normally do via the remote control.
 
Nice! Thanks.

Can you set any menu item up to a "hotkey" on the controller? I've always wanted to set one of the buttons to do the "turn off the screen" (emitters) function. I have that menu item at the bottom of my popup quick menu at the moment, but it takes a click + navigating to that heading. Alternately, with voice control active I can hold the mic button down and say "turn off the screen", but having a one-button press would be more convenient and would work more like a mute button. I do see a "screen off" heading in your screenshot, but I don't know if that is like hitting the power button of the screen and shutting it down into standby rather than the "turn off the screen" emitters trick.

I'll head over to the link and see if there is a writeup about it. Thanks again.

Not sure if that means the shortcuts can be activated on your pc itself? If that were the case I could link hotkeys to a stream deck which would be extremely convenient.

Looks like that might be the case like the controller software...

https://github.com/Maassoft/ColorControl/releases/tag/v4.0.0.0

[screenshot]

or in your remote controller screenshot they have an entry as "Alt + F5" for example.

I knew I could do some things via the phone software but I've been wanting to do the turn off the screen trick as well as some other tv functions via my stream deck ever since I got the 48cx.

Great find. If it was posted before I guess I snoozed on it too, so thanks all the same. Maybe I even posted it or replied to it and forgot about it, the way things go lol. I vaguely remember some color control software, but not the shortcut functions to the remote/TV itself.

You can set up anything that is in the menus on the TV or on the TV's remote. It is sent through the software, so your TV's remote isn't involved.
The menus come up on the tv as if you were pressing the buttons.

I couldn’t get the key binds to work yet. Probably not doing something right. The stream deck is a good idea. I don’t use mine enough as it is.

Before I posted I searched the thread and at a glance, I think it wasn’t spoken about in this context.
 
You can set up anything that is in the menus on the TV or on the TV's remote. It is sent through the software, so your TV's remote isn't involved.
The menus come up on the tv as if you were pressing the buttons.

I couldn’t get the key binds to work yet. Probably not doing something right. The stream deck is a good idea. I don’t use mine enough as it is.

Before I posted I searched the thread and at a glance, I think it wasn’t spoken about in this context.

Thanks again for the info. Hopefully the shortcuts/hotkeys can work.

I use my Stream Deck for window management all of the time on my multi-monitor array, among other things.

With the stream deck window management method you can set up your own app window sizes and locations so can tile your desktop windows however you want, and deal them out or teleport them back with the press of a button.

------------------------------------------------------------------------------------
Stream deck info from some of my previous comments below:
--------------------------------------------------------------------------------------

I've been doing big verticals with a 43" 4K 60 Hz Samsung VA (NU6900, 6100:1 contrast) bookended on each side of my main screen for several years now. I call them "the two towers".

It's neat to see more people getting exposed to large vertical screens and their usage scenarios nowadays, with ultrawides, that Ark screen's portrait mode and built-in window management, Windows 11's easy snap-to window management system, etc.

That's pretty much my reply but in case anyone might be interested or benefit from it, here is some info about how I do window management on my tall screens below.

.........................................................................
Easier Stream Deck + handful of stream deck addons method w/o displayfusion:
.........................................................................

You can do most of what is in the DisplayFusion section below more easily with some of the available plugins for a Stream Deck, without having to use DisplayFusion:

https://altarofgaming.com/stream-deck-guide-faq/

>Navigate to "More Actions…" and install the following plugins to your Stream Deck: Advanced Launcher & Windows Mover & Resizer. Advanced Launcher, will not only let you pick the application you want to run, but you can also choose to Run as Administrator, Limit the number of running instances, Kill the existing instances – as well as the best one of all – set your unique arguments on the launcher!


>Windows Mover & Resizer on the other hand, takes productivity to a WHOLE new level! The macro will apply to either your currently selected window, or a specific application you define, and it will let you choose the exact monitor, position & size you want the window to take on the click of your button! This. Is. Sick!


>And what's the last piece of this glorious puzzle? Stream Deck Multi Actions! Simply combine Advanced Launcher & Windows Mover into a Multi Action, where you first launch the application with the exact settings you need, then it gets automatically positioned in the exact coordinates and size you need!


>Bringing us to the last huge step! Creating a HUGE multi action that will instantly launch a "Workspace" for you! Launch your game, audio output & input settings, stream starting soon scene, face camera, lights and whatever else you can dream of, in the press of a button! Oh yes, Stream Deck can get THAT good!

*Note by me: you can also set up different sets of those "saved window position profiles" in that last step (or via DisplayFusion Pro + the DisplayFusion hotkey mapped to a Stream Deck button). That way, you can hit different buttons, or multi-press/toggle a single button, to shuffle between different window layouts of your apps.


....................................................................
Stream Deck + Displayfusion Methods
....................................................................

I use a Stream Deck's plugins combined with DisplayFusion functions to open and set all of my apps' "home" locations across three screens. Cobbling together a few existing functions in the DisplayFusion library, I can hit a single app-icon key several times to: check whether the app is open and launch it if not; then check whether the app is minimized, and minimize it if not, or restore it to its home position if it is. That way I can hit the button once to launch an app, or hit it a few times to shuffle the app between minimized and restored at its home position. I also have a bunch of Stream Deck buttons that will move whatever window is active to various pre-set locations:

[screenshot]


I also have a button set to an overall saved window position profile so that once all of my apps are launched or after anytime I move any from their home positions - I can hit one button and they'll all shuffle back to where I saved them in the window position profile. (I can also save more than one position profile).

I keep the main taskbar dragged off of my primary screens to the top of one of my side portrait-mode screens. I use Translucent Taskbar to make it transparent, and I set up the TaskbarHider app to show/hide the taskbar as a toggle via hotkey/Stream Deck button. That locks the taskbar away, or shows it, via hotkey/button rather than relying on mouse-over. I can still hit Win+S, type two letters, and hit Enter for anything I have yet to map to a page of my Stream Deck's buttons. I can use Win+Tab to page between tiles of all apps/windows (DisplayFusion can optionally limit which app thumbnails are shown in that popup overlay to the apps open on the currently active screen), but I can usually just do that with each app's button as I outlined above, so I rarely need to. The Start menu button is always available too, obviously, but again I have little need for it with Win+S.
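The saved-position profiles above boil down to precomputing a rectangle per app per monitor, then telling the OS to move each window there. A toy sketch of just the layout math; the monitor coordinates and app names are made-up example values, and the actual window moves would be done by the Stream Deck plugin or DisplayFusion:

```python
# Sketch of the "saved window positions" idea: precompute pixel rectangles
# for app windows stacked on a portrait (rotated) side monitor. A tool
# like a Stream Deck plugin or DisplayFusion would then move each window
# into its slot. Monitor geometry and app list are example values only.

def stack_vertically(monitor_x, monitor_y, width, height, apps):
    """Split a monitor into equal horizontal bands, one per app, top to bottom."""
    band = height // len(apps)
    layout = {}
    for i, app in enumerate(apps):
        # (x, y, width, height) for each app's "home" rectangle.
        layout[app] = (monitor_x, monitor_y + i * band, width, band)
    return layout

# A 2160x3840 portrait monitor positioned to the left of the main screen,
# holding three stacked windows.
layout = stack_vertically(-2160, 0, 2160, 3840, ["chat", "browser", "logs"])
```

Saving several such layouts and binding each to a button gives the "shuffle everything back home" behavior described above.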

.................................

I highly recommend a Stream Deck to break away from the whole Windows kiosk interface even more. It'll change your life.
 
OK, good news: it can directly control the OLED light level.
backlight() is the command. Make sure to hit Save after putting your keyboard shortcut in.
[screenshot]
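For the curious, ColorControl and similar tools drive this over the TV's network control interface (webOS). Below is a minimal sketch of the kind of settings payload such a request carries; the luna service URI and field names follow what open-source LG control libraries (e.g. bscpylgtv) send, so treat them as assumptions rather than documented ColorControl internals, and note that actually sending it requires a paired websocket session, which is omitted here:

```python
# Sketch of the JSON payload a webOS "set OLED light" request carries.
# The luna URI and field names mirror what open-source LG TV control
# libraries use; they are assumptions, not documented ColorControl
# internals. Transport (paired websocket to the TV) is omitted.

def backlight_payload(value: int) -> dict:
    """Build a settings-service request setting the OLED light (0-100)."""
    if not 0 <= value <= 100:
        raise ValueError("OLED light is 0-100")
    return {
        "uri": "luna://com.webos.settingsservice/setSystemSettings",
        "payload": {"category": "picture", "settings": {"backlight": value}},
    }
```

Because the request names an absolute value, no simulated button presses through the menu are needed.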
 
New firmware for CX series dropped a few days ago at the KR mothership... 04.40.16

Release notes for most recent two:

(04.40.16)
-Minor Software-Related Issues Improvements

(04.40.10)
-Improves speech recognition performance.
-When playing Dolby Vision content, the game optimizer will automatically switch to the image quality mode.
-Improved accessibility menu usability for people with disabilities.
 
I finally saw some burn in on mine, but I can only see it with a green test screen on. It is around the static Rust UI in the lower right hand corner. 2000+ hours in Rust on this screen in the last year. I cannot see the burn in during any kind of normal usage.
 
I finally saw some burn in on mine, but I can only see it with a green test screen on. It is around the static Rust UI in the lower right hand corner. 2000+ hours in Rust on this screen in the last year. I cannot see the burn in during any kind of normal usage.
Holy smokes, that's annual FTE hours (2080) without much sick time or PTO, some dedication..🙂
 
Holy smokes, that's annual FTE hours (2080) without much sick time or PTO, some dedication..🙂
Well, I leave it running in the background all the time while I am working. So it is probably more time looking at a wall than anything else, listening for the furnaces to run out of fuel.
 
So I just tried out the recently released Windows HDR Calibration Tool on my CX; supposedly it helps improve the Auto HDR image quality. So far I'm not seeing a whole lot of difference from before, if any. Is it because I already did the CRU tweak to make Windows report my display's peak brightness as 800 nits? I did the calibration with my TV in HGiG mode, btw, and again, the CRU edit had already been done a while back, prior to the calibration. Perhaps this is more useful to people who did not do the CRU edit.
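For context on that CRU tweak: the peak-brightness value lives in the display's HDR static metadata block, where CTA-861 stores "desired content max luminance" as a single byte CV decoded as 50 · 2^(CV/32) cd/m². A quick sanity check that 800 nits corresponds to byte value 128:

```python
import math

# CTA-861 HDR static metadata encodes "desired content max luminance"
# as one byte CV, decoded as 50 * 2**(CV/32) cd/m^2. This shows which
# byte value a CRU tweak targeting ~800 nits corresponds to.

def cv_to_nits(cv: int) -> float:
    """Decode a CTA-861 max-luminance code value to cd/m^2 (nits)."""
    return 50.0 * 2.0 ** (cv / 32.0)

def nits_to_cv(nits: float) -> int:
    """Find the nearest code value for a target luminance in nits."""
    return round(32.0 * math.log2(nits / 50.0))
```

Since the encoding is logarithmic with 1-byte resolution, arbitrary targets only round-trip approximately, but 800 nits happens to land exactly on CV = 128.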
 
So I just tried out the recently released Windows HDR Calibration Tool on my CX; supposedly it helps improve the Auto HDR image quality. So far I'm not seeing a whole lot of difference from before, if any. Is it because I already did the CRU tweak to make Windows report my display's peak brightness as 800 nits? I did the calibration with my TV in HGiG mode, btw, and again, the CRU edit had already been done a while back, prior to the calibration. Perhaps this is more useful to people who did not do the CRU edit.
I did it with HGiG off. Pretty sure the content/game has to support HGiG to pass the values, and given the level of HDR support we have historically had, I would assume HGiG support is scarce if not nonexistent.
As far as quality increase, it's subjective. Looks fine to me. Auto HDR seems to do something; it didn't really wow me though.
 
I did it with HGiG off. Pretty sure the content/game has to support HGiG to pass the values, and given the level of HDR support we have historically had, I would assume HGiG support is scarce if not nonexistent.
As far as quality increase, it's subjective. Looks fine to me. Auto HDR seems to do something; it didn't really wow me though.

I'll try again with HGiG off. What I meant by not seeing a whole lot of difference was that I didn't see much difference in Auto HDR between the pre- and post-calibration image quality. Auto HDR definitely works; it's just that the image before and after using the calibration tool remains largely the same. The idea that the game itself has to have HGiG support in order to actually make use of it seems correct. I'm not sure why people keep insisting that HGiG is the way to go for the most accurate or best image when the games themselves do not even support it to begin with. I've previously been using DTM On when it comes to Auto HDR games.
 
I'll try again with HGiG off. What I meant by not seeing a whole lot of difference was that I didn't see much difference in Auto HDR between the pre- and post-calibration image quality. Auto HDR definitely works; it's just that the image before and after using the calibration tool remains largely the same. The idea that the game itself has to have HGiG support in order to actually make use of it seems correct. I'm not sure why people keep insisting that HGiG is the way to go for the most accurate or best image when the games themselves do not even support it to begin with. I've previously been using DTM On when it comes to Auto HDR games.
I know what you mean. Maybe they read about it on an AV forum.
 
HGiG turns off all processing rather than using DTM. DTM takes lower colors and steps them on top of color values higher in the curve, which results in lost color detail, since values that should be distinct colors get muddied together with lifted colors. The curve lacks the range, so it compresses more, and more colors share the same values where they should sit at different levels. DTM also often lifts the lows and mids unnecessarily, because it works on each frame dynamically using algorithms, so it's not really doing it logically. Things that should be shadowed and dark get lifted, and some mids may be lifted outside of the artist's intent, in some cases giving areas a washed-out look.

...

HDTVTest YouTube vid: DTM On, DTM Off, HGiG explained

We demonstrate the effects of [Dynamic Tone Mapping] "On", "Off" and "HGIG" in [HDR Game Mode] on the LG CX OLED TV

HGiG means the TV is going to disable [its tone mapping]: it's going to follow the PQ EOTF curve up to maybe its peak brightness capability, and then it's going to hard clip. By disabling the tone mapping, it hands the tone mapping over to the game.

He's talking about HGiG handing the tone mapping over to console games that may actually have HGiG curve data, but HGiG is still the most accurate on PC, since it turns off all processing and the PC isn't passing a high-nit HDR curve to the TV.

Using the DTM = Off mode will send the full brightness/color-volume curve to the TV, and then the TV will apply its own static curve, compressing the upper part of the range.
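The PQ EOTF mentioned above (SMPTE ST 2084) is an absolute encoding: each signal level maps to a fixed luminance up to 10,000 nits, which is exactly why a display must tone map or clip whatever exceeds its capability. A small sketch of the inverse EOTF (nits in, signal level out), using the constants from the specification:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> the
# normalized 0.0-1.0 signal level. Constants are from the ST 2084 spec.

M1 = 2610 / 16384          # ~0.1593
M2 = 2523 / 4096 * 128     # ~78.84
C1 = 3424 / 4096           # ~0.8359
C2 = 2413 / 4096 * 32      # ~18.85
C3 = 2392 / 4096 * 32      # ~18.69

def pq_encode(nits: float) -> float:
    """Encode absolute luminance (0-10000 nits) to a PQ signal value."""
    y = max(nits, 0.0) / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2
```

Note how nonlinear the curve is: 100 nits already sits at roughly half the signal range, which is why so much code-value precision is reserved for the highlights a TV may not even be able to show.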

This from a reddit thread's replies sums it up I think:

DTM Off is still performing (static) tone mapping, up to 4,000 nits. HGiG is a hard clip at 800 nits, regardless of the game, or whether it supports HGiG or not.

With different TVs' HGiG, the hard clip may be set to a higher value by the manufacturer. HGiG is the most accurate, as it's a hard-defined range with a cutoff and no funny business, heavy compression, lifting, etc. I think DTM Off is still OK, but I do not like DTM On, for the aforementioned reasons at the top of this reply.

To be clear, when using the DTM off setting your tv is still static tone mapping taller curves down to what the TV can do.

Monstieur explained the static tone mapping curves well much earlier in this thread:

The curves are arbitrarily decided by LG and don't follow a standard roll-off. There does exist the BT.2390 standard for tone mapping which madVR supports (when using HGIG mode).

LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.
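Those three curves can be approximated as piecewise maps: identity up to a knee, then compression of the remainder into what's left below 800 nits, versus HGiG's plain clip. A simplified sketch (the linear roll-off is an assumption for illustration; LG's actual roll-off is a smooth curve, not a straight line):

```python
# Simplified model of the static tone maps described above: identity up
# to a knee, then linear compression of [knee, source_max] into
# [knee, display_max]. LG's real roll-off is curved; linear is just the
# simplest illustration of "squeezing" the top of the range.

def static_tone_map(nits, knee, source_max, display_max=800):
    """Map source luminance to display luminance with a knee + roll-off."""
    if nits <= knee:
        return nits
    scale = (display_max - knee) / (source_max - knee)
    return knee + (min(nits, source_max) - knee) * scale

def hgig_clip(nits, display_max=800):
    """HGiG: no remapping at all, just a hard clip at the display's peak."""
    return min(nits, display_max)
```

Plugging in the 1000-nit curve's numbers: 560 nits passes through unchanged, while 1000 nits lands at 800, matching the "accurate up to 560, squeeze 560-1000 into 560-800" description above.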

Regarding MPC for HDR movies in windows not getting metadata:
I would use only the 1000-nit curve or HGIG for all movies, even if the content is mastered at 4000+ nits. All movies are previewed on a reference monitor and are designed to look good at even 1000 nits. It's fine to clip 1000+ nit highlights. HGIG would clip 800+ nit highlights.

LG defaults to the 4000-nit curve when there is no HDR metadata, which a PC never sends.
 
"Most accurate" seems arbitrary here... sure, with HGiG you get everything accurate up to 800 nits, but then you lose every single bit of detail beyond that. With DTM Off you only have accuracy up to 560 nits before it starts compressing the rest to fit within the TV's 800-nit capability, but at least you get that detail somewhat rather than losing it entirely. This almost seems like a pick-your-poison kind of thing, and there isn't really a "most accurate" choice; it just depends on what you prefer. Do you want to see the details beyond 800 nits even if they're compressed, or would you rather just take the hard clip instead? If you played a game where the sun output 1000 nits, would you rather see that sun at 800 nits or see nothing at all? I'm guessing in a game with HGiG support, the game would know that the sun cannot be any more than 800 nits, while a game that doesn't support HGiG would completely ignore that, still output 1000 nits, and let your TV just hard-clip it.
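The trade-off can be put in numbers: hard-clipping a highlight gradient collapses every step above the display's peak into a single level, while static compression keeps the steps distinct at reduced contrast. A toy comparison using the 1000-nit curve's 560-nit knee quoted earlier in the thread:

```python
# Toy comparison: a highlight gradient from 700 to 1000 nits shown on an
# 800-nit display. A hard clip (HGiG-style) collapses everything above
# 800 into one level; linearly compressing 560-1000 into 560-800 (a
# simplified DTM Off-style curve) keeps every step distinct but dimmer.

gradient = list(range(700, 1001, 50))  # 700, 750, ..., 1000 nits

clipped = [min(n, 800) for n in gradient]
compressed = [n if n <= 560 else 560 + (n - 560) * (800 - 560) / (1000 - 560)
              for n in gradient]

distinct_clipped = len(set(clipped))        # steps above 800 nits merge
distinct_compressed = len(set(compressed))  # every step stays separate
```

So clipping preserves accuracy below the peak at the cost of all highlight separation above it, while compression trades some mid-highlight accuracy for keeping those steps visible.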
 
I finally saw some burn in on mine, but I can only see it with a green test screen on. It is around the static Rust UI in the lower right hand corner. 2000+ hours in Rust on this screen in the last year. I cannot see the burn in during any kind of normal usage.
What brightness do you keep your display at, or play at?
 
"Most accurate" seems arbitrary here... sure, with HGiG you get everything accurate up to 800 nits, but then you lose every single bit of detail beyond that. With DTM Off you only have accuracy up to 560 nits before it starts compressing the rest to fit within the TV's 800-nit capability, but at least you get that detail somewhat rather than losing it entirely. This almost seems like a pick-your-poison kind of thing, and there isn't really a "most accurate" choice; it just depends on what you prefer. Do you want to see the details beyond 800 nits even if they're compressed, or would you rather just take the hard clip instead? If you played a game where the sun output 1000 nits, would you rather see that sun at 800 nits or see nothing at all? I'm guessing in a game with HGiG support, the game would know that the sun cannot be any more than 800 nits, while a game that doesn't support HGiG would completely ignore that, still output 1000 nits, and let your TV just hard-clip it.

More or less, though as you mentioned, that otherwise-clipped detail is not simply added and gained. HGiG is most accurate because otherwise multiple color luminances share the same slots in the upper ~half of the display's range under whatever compression scheme the TV uses. And OLEDs typically can't even do 1000-nit curves, so they are squashed a little more. I guess you could say HGiG avoids muddying out-of-range colors together with the accurate values at the top end of the display's scale, and instead cuts off the highest ranges that the TV is technically incapable of displaying in the first place.

With a few exceptions, practically all displays also have relatively short percent-of-screen brightness duration limitations, plus ABL, so it's quite a juggling or plate-spinning act anyway.

I agree that DTM Off, which uses the display manufacturer's static tone-mapping curve and its set compression scheme, is a good compromise, but you can try DTM Off or HGiG on a per-game basis, since some devs drop the ball on HDR to one degree or another and the results can vary.

DTM On, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and lifting mids outside the artistic intent. I also think it lifts some of the range that would be beneath the static tone-mapping curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static curve would. It's sort of compressing the whole range dynamically, all over the place, up as well as down, on a per-frame basis, not doing it logically.
 
Where is the latest, greatest place to find the most up-to-date settings for gaming on the 48CX?
 
More or less, though as you mentioned, that otherwise-clipped detail is not simply added and gained. HGiG is most accurate because otherwise multiple color luminances share the same slots in the upper ~half of the display's range under whatever compression scheme the TV uses. And OLEDs typically can't even do 1000-nit curves, so they are squashed a little more. I guess you could say HGiG avoids muddying out-of-range colors together with the accurate values at the top end of the display's scale, and instead cuts off the highest ranges that the TV is technically incapable of displaying in the first place.

With a few exceptions, practically all displays also limit how long a given % of the screen can sustain peak brightness, plus ABL on top of that, so it's quite a juggling or plate-spinning act anyway.

I agree that DTM off, which uses the display manufacturer's static tone mapping curve's fixed compression scheme, is a good compromise, but you can try DTM off or HGiG on a per-game basis. That's because some devs drop the ball on HDR to one degree or another, so the results can vary.

DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and the artistic intent of the mids. I also think it lifts some of the range that would sit beneath the static curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's compressing the whole range dynamically and all over the place, up as well as down, on a per-frame basis rather than logically.

Right, but let's say a game was mastered to 1000 nits to begin with, and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800-nit container? IIRC HGiG is just meant to avoid any sort of "double tone mapping" and instead let the game itself do the tone mapping, but it's still tone mapping regardless: you are taking nits that are outside the TV's capability and compressing them to fit within what it can show. That sounds exactly like what DTM off is already doing, except with HGiG you are simply avoiding a double tone map. So if the purpose of HGiG is to let the game tone map and compress the dynamic range into what's appropriate, then it seems like, if the game does not support HGiG, you should want to use DTM off to get the same effect of compressing 1000 nits down to 800 nits rather than clipping it off. Unless of course I'm misunderstanding things and that is exactly what HGiG is supposed to be doing in a supported game: it just hard clips any detail beyond 800 nits and doesn't actually tone map anything.
 
What brightness do you keep your display at, or play at?

A few reminders that might help in that vein:


....You can set up different named profiles with different brightness, peak brightness, etc., and maybe contrast, in the TV's OSD. You can break any of the original ones down completely and start from scratch settings-wise if you want. That way you could use one named profile with lower brightness, and perhaps contrast, for text and static app use. Just make sure to keep the Game one for gaming. I keep several others set up for different kinds of media and lighting conditions.
  • Vivid
  • Standard
  • APS
  • Cinema
  • Sports
  • Game
  • FILMMAKER MODE
  • isf Expert (Bright Room)
  • isf Expert (Dark Room)
  • Cinema Home

....You can change the TV's settings several ways. Setting up the quick menu or drilling down through menus works but is tedious. Keying the mic button on the remote with voice control active is handy for changing named modes or doing a lot of other things. You can also use remote control software over your LAN, even hotkeying it; you can change a lot of parameters that way directly via hotkeys. Those hotkeys can also be mapped to a Stream Deck's buttons with icons and labels, so you could press a Stream Deck button to change the brightness and contrast or to activate a different named setting. Using Stream Deck functions/addons you can set up keys as toggles or multi-press as well, so you could toggle between two brightness settings or step through a brightness cycle, for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, a voice command with the remote's mic button, or the remote-control-over-LAN software + hotkeys (even easier with a Stream Deck). "Turn off the screen" only turns the emitters off; it doesn't put the TV into standby. As far as your PC OS, monitor array, games, or apps are concerned, the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you're AFK or not giving that screen face time, and restoring it when you come back, and it's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time. You might not realize how much time is cumulatively wasted with the screen displaying when you're not actually viewing it, especially when idling in a game or on a static desktop/app screen.

...You can also use a Stream Deck + a handful of addons to manage window positions, saved window position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few Stream Deck buttons, for example, to keep your window frames from sitting in the same place all of the time.

... Dark themes in the OS and in any apps that offer one, web browser addons (Turn Off the Lights, color changer), a taskbar-hider app, a translucent-taskbar app, a plain ultra-black wallpaper, and no app or system icons on screen (I throw mine all into a folder on my hard drive named "desktop icons"). Black screen saver, if any.

... Logo dimming on high. Pixel shift. A lot of people turn ASBL off for desktop use, but I keep it on since mine is solely for media/gaming. That's one more safety measure.
 
Right, but let's say a game was mastered to 1000 nits to begin with, and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800-nit container? IIRC HGiG is just meant to avoid any sort of "double tone mapping" and instead let the game itself do the tone mapping, but it's still tone mapping regardless: you are taking nits that are outside the TV's capability and compressing them to fit within what it can show. That sounds exactly like what DTM off is already doing, except with HGiG you are simply avoiding a double tone map. So if the purpose of HGiG is to let the game tone map and compress the dynamic range into what's appropriate, then it seems like, if the game does not support HGiG, you should want to use DTM off to get the same effect of compressing 1000 nits down to 800 nits rather than clipping it off. Unless of course I'm misunderstanding things and that is exactly what HGiG is supposed to be doing in a supported game: it just hard clips any detail beyond 800 nits and doesn't actually tone map anything.

HGiG
=======

Pretty sure HGiG will just truncate/clip at the display's peak nit regardless. So every color displayed is within the range the TV is capable of, and each is shown 1:1 per color value. That's why it's cited as the most accurate: it isn't sharing or swapping color/luminance slots with higher, out-of-range values, which by nature muddies or substitutes them together and is less accurate.

From user Monstieur... he said to use either a 1000-nit curve *or* HGiG. Other sites online and reddit threads say similar things.
"I would use only the 1000-nit curve or HGIG for all movies, even if the content is mastered at 4000+ nits. All movies are previewed on a reference monitor and are designed to look good at even 1000 nits. It's fine to clip 1000+ nit highlights. HGIG would clip 800+ nit highlights."

This from a reddit thread's replies sums it up I think:
DTM off is still performing (static) tone mapping, up to 4,000 nits. HGiG is a hard clip at 800 nits, regardless of the game, or whether it supports HGiG or not.

* While the most accurate, it's still not accurate throughout, really, due to %-of-screen sustained-brightness duration limits and, of course, aggressive ABL reflexively dropping values. Like I said, there are a lot of plates being spun in firmware.
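To make the "hard clip" concrete, here's a minimal sketch (a hypothetical illustration, not LG's firmware logic): in-range values pass through 1:1 and anything above the panel's peak is simply cut off.

```python
def hgig_map(nits: float, panel_peak: float = 800.0) -> float:
    """HGiG-style output: 1:1 up to the panel's peak, hard clip above.
    Nothing is compressed, so in-range values stay accurate; detail
    above the peak is simply lost rather than remapped downward."""
    return nits if nits <= panel_peak else panel_peak

# A 700-nit highlight displays at 700 nits; a 1000-nit one clips to 800.
```

Contrast that with a static curve, which would remap a 1000-nit value down into the top of the panel's range instead of discarding it.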

...................

DTM Off
========

DTM off uses an HDR scale or curve from LG that compresses the top end as intelligently as they could in a static, hard-defined way, rather than dynamically as with DTM on. So while less accurate, it tries to preserve some high-end detail in colors (for example, in game textures) that would otherwise be lost.

LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.

*Some software lets you choose which curve you send; for example, the madVR renderer for MPC. Movies have their own limits too: most are mastered at HDR 1,000 nits, some at 4,000, and there are only a few HDR 10,000 movie releases atm. If you have DTM off and send a 1000-nit curve from a movie or game, it will compress per the first range listed above.
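Those three curves can be approximated as a piecewise function: identity below the knee, then the remaining source range squeezed into the panel's remaining headroom. This is a simplified linear sketch of the numbers above; LG's actual roll-off is a smooth curve, not a straight line.

```python
def static_tone_map(nits: float, knee: float, source_peak: float,
                    panel_peak: float = 800.0) -> float:
    """Static (DTM off) curve sketch: accurate up to the knee, then
    linearly squeeze [knee, source_peak] into [knee, panel_peak]."""
    if nits <= knee:
        return nits                       # 1:1 region, fully accurate
    frac = (min(nits, source_peak) - knee) / (source_peak - knee)
    return knee + frac * (panel_peak - knee)

# 1000-nit curve: static_tone_map(1000, knee=560, source_peak=1000) -> 800.0
# 4000-nit curve: static_tone_map(2240, knee=480, source_peak=4000) -> 640.0
```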

......................

DTM On
=======

DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and the artistic intent of the mids. I also think it lifts some of the range that would sit beneath the static curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's compressing the whole range dynamically and all over the place, up as well as down, on a per-frame basis rather than logically.



........................

Like I said though, there are a bunch of games where the devs dropped the ball on HDR. They don't all have an HDR peak brightness slider; they might only have a middle brightness/white point slider, and some don't have a saturation slider. Games like Elden Ring have HDR peak brightness, HDR middle brightness, and HDR saturation sliders, for example. In that case, a CRU edit to the display's reported peak brightness can help. Still, since HDR is screwed up in some games, it's worth experimenting with HGiG on vs. DTM off to see which looks best, as it probably won't be the same result in every HDR game.
 
From Vincent of HDTVTest's twitter:

Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

John Linneman of Digital Foundry on twitter:

With HGIG, the game/console does all tone mapping -


Right, but let's say a game was mastered to 1000 nits to begin with, and now it supports HGiG and you have it set to 800 nits max. Aren't you technically still tone mapping it and squeezing what should be 1000 nits down into an 800-nit container?

The console is doing the tone mapping if the game/console supports HGiG. HGiG is turning tone mapping (even static tone mapping) off on the TV itself.

So yes, but only in that case. It would be tone mapped by the console itself, not the TV, as long as the console + game supported HGiG in the first place.

"HGiG on" mode active on the TV from pc sources, or any other source that isn't tone mapping on it's own end, will map colors more accurately to their actual color value "location" all the way up through the peak nit of the display, and will hard clip without any compression. According to Vincent,[HDR Tone Mapping] "Off" in [Game Optimizer] mode will also hard clip in 2021 LG firmware, at least when he posted that May 2021.

Another from that tweet of Vincent's:

Certainly, when checked using the PS5 & Xbox Series X HGiG calibration screen, my LG G1 review sample hard clips at 800 nits, which translates well to HGiG-compliant games such as Dirt5.


...........................................
 
That sounds to me like HGiG has two completely different behaviors depending on whether the game supports it or not. If the game doesn't support it, then HGiG behaves more like a "static tone mapping OFF" button, just letting the display show what it can and clipping what it cannot. But if the game does support it, then actual tone mapping happens at the console level, compressing the dynamic range to fit within the display's capabilities, while the TV's tone mapping is turned off to avoid a double tone map. So what is the real intended use of HGiG here? To display everything up to 800 nits and clip everything beyond, or to let the game compress everything, including the 1000-nit details, into 800 nits?
 
Correct on the first point.

In the second usage scenario it's like this:

From Vincent of HDTVTest's twitter:

Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

From what I've read, HGiG turns off all curves on the TV and hard clips at the TV's actual peak luminance / HDR color volume. That leaves it up to the HGiG console + HGiG game, or whatever other source, to do its own tone mapping down to within the range of the TV's clip/cut-off.

This has also been found useful by users who value greater, "1:1" accuracy, where each color value matches its actual location on the color scale up to the display's peak nit (~725 to 800 nits on most OLEDs), rather than "pulling down" or muddying higher, out-of-range color volume/brightness values into the top end of the TV's actual range via compression and relying on whatever the firmware devs decided was best.

Still, due to the poor HDR implementation in some games, you might experiment with DTM off and HGiG on a per-game basis to see if there is any improvement (and maybe do the CRU edit to your screen's peak nit beforehand). One may compensate for a poor HDR implementation better than the other, but it varies per game, so you'd have to try both in each game.

Personally I'd use whichever I found I preferred for a given game, but I'd probably lean toward static tone mapping (DTM off) if it preserved some details compared to HGiG mode. I just don't like DTM on, for the reasons I gave in my recent replies.
DTM on, however, is sort of wacky world: it lifts lows and mids unnecessarily, "breaking" shadowed areas and the artistic intent of the mids. I also think it lifts some of the range that would sit beneath the static curve's roll-off threshold up into and throughout the high end, muddying detail away more than the static tone mapping curve would. It's compressing the whole range dynamically and all over the place, up as well as down, on a per-frame basis rather than logically.


............................................

To answer your last question: yes, like you've been saying, HGiG's main use is to let an external device do the tone mapping itself and turn it all off on the TV, avoiding tone mapping twice, once on the console and once on the TV. (Technically, HGiG also means less processing on the TV even without an HGiG source.) I believe HGiG-enabled console games, if done right, would act much like DTM off (the display's static tone mapping), provided the game supports HGiG.

I think it also lets console/game devs use their own tone mapping methods/algorithms instead of whatever the TV's flavor is. I doubt they are using LG's static tone mapping firmware method. Theoretically their HGiG-mapped game could be mapped better to the designer's intent for that specific game, for good or ill, compared to doing static tone mapping on your OLED on a per-game basis.
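The console-side half of the handshake can be sketched too. In HGiG mode the TV adds no curve of its own, so an HGiG-aware game has to compress its mastered range into the panel peak it learned during calibration before the signal ever leaves the box. Below is a toy extended-Reinhard curve, purely illustrative; no claim that any console actually uses this shape:

```python
def hgig_source_map(scene_nits: float, game_peak: float = 1000.0,
                    panel_peak: float = 800.0) -> float:
    """Toy source-side tone map for an HGiG-aware game: compress so the
    mastered peak lands exactly at the panel peak, since the TV (HGiG on)
    will hard clip anything above it. Extended-Reinhard shape, which
    gently compresses the whole range rather than using a hard knee."""
    n = scene_nits / panel_peak        # scene value in panel-peak units
    w = game_peak / panel_peak         # "white point" = mastered peak
    return panel_peak * n * (1.0 + n / (w * w)) / (1.0 + n)
```

With the defaults, a 1000-nit mastered highlight lands at the 800-nit panel peak instead of clipping, at the cost of slightly darkening everything below it.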

..............................................
 
That is what HGiG is supposed to do, though. I think people should just ask LG for a fourth option called "all tone mapping OFF".
 
They kind of did make that an option, at least in the 2021 fw:

From Vincent of HDTVTest's twitter:


Based on my testing, HGiG behaves correctly on 2021 LG OLEDs on the latest firmware. However, [HDR Tone Mapping] "Off" in [Game Optimiser] mode is hard clipping - that's why you see no difference to HGiG. [HDR Tone Mapping] "Off" in other non-Game picture modes does a roll-off.

End result is the same either way... semantics.

Would be nice if it was in the same menu as HGiG, DTM: On, DTM: Off though, like you said.
 
He's talking about the C1. Does the CX have a Game Optimizer option? I can only recall the Instant Game Response setting.
 
Yeah I get them mixed up b/c I have a C1 in my living room. You are prob right. Either way same end result.
 