LG 48CX

Yeah I get them mixed up b/c I have a C1 in my living room. You are prob right. Either way same end result.

I tried switching between HGiG and DTM OFF and there is definitely zero difference between the two modes after using the Windows HDR Calibration tool. This makes sense: after the calibration, Windows Auto HDR is aware of the TV's capabilities and no longer sends any signal beyond what the TV can display, so there is no need to even perform static tone mapping if the TV never receives anything above 750/800 nits.



 
There might be a way to set up the Windows HDR Calibration tool at 1000 nits so it passes a 1000-nit curve, if you wanted a static tone mapping option that recovers some otherwise lost highlight detail. I'd have to look into it but it seems feasible. From what I've read it just creates an HDR color profile.

From a reddit thread:
......................................

I upgraded to Windows 11 specifically for Auto HDR, and while it did work initially, the calibration was off and the whites were way too bright. I'd use it on some games but would have to disable HDR on my monitor when I got off because it made browsing the web, playing other games, watching movies and such horrible. Ultimately I just stopped using HDR because it was a hassle.

With the update and the tool to calibrate your HDR profile, I have enabled it and it seems to be perfect no matter what content I am viewing now. The whites are not as bright in general anymore, which is fantastic, and games which natively support HDR such as Destiny 2 and Cyberpunk 2077 look so incredible now.

The app is separate but the functionality is built into Windows with ICC profiles that the tool helps generate. People always hate default applications, so I think Microsoft limits them to where they feel comfortable.

...............

All the app does is create an HDR color profile.
Search "Color Management" in Start and open the tool; you should see the profile created by the app under: ICC Profiles (Advanced Color).

Obviously, the profile gets created only after completing the calibration process in the app at least once.

If you use an Nvidia GPU, you need to set the color accuracy to "Accurate" under: Nvidia control panel > Display > Adjust desktop color settings > 2. Color accuracy mode.

I tried Ghostwire Tokyo and it looks much better. Maybe I went a little too far with the saturation in the calibration app, but it definitely looks different than before using the calibration app.

...................

I am aware of a bug that prevents you from going back to Accurate if you ever touch the sliders under this option; take a look at this thread, especially post number 9: https://forums.guru3d.com/threads/color-accuracy-mode.435755/

Enhanced should be Accurate with the sliders applied to it, so if you keep the sliders in the default position, it should be equivalent to Accurate even if it says Enhanced.
I don't know if Nvidia ever provided an official comment on this behavior or the current state of it in the latest drivers.


I experienced the same issue a long time ago and in order to revert to accurate I had to reinstall the driver after using DDU. Since then, I am really careful when I use that area of the panel and I avoid touching any of the sliders.

.........................

If you want to truly calibrate HDR for Win11 you have to use CRU to set your peak brightness. Windows defaults to 2000 nits.

Follow step 3 from this link
https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/?utm_medium=android_app&utm_source=share

This works especially well for HGiG gaming.
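As an aside on what that CRU step actually edits: the peak brightness lives in the EDID's CTA-861.3 HDR static metadata block as an 8-bit coded value rather than raw nits. This is my reading of the encoding (worth double-checking against what CRU or edid-decode reports), with max luminance = 50 * 2^(CV/32) cd/m2:

Code:
// Conversion between the CTA-861.3 HDR static metadata "coded value" and nits.
// Based on my reading of the spec (max luminance = 50 * 2^(CV/32) cd/m^2);
// verify against CRU / edid-decode before trusting it for a real EDID edit.
#include <cmath>
#include <cstdio>

double max_nits_from_cv(int cv)      { return 50.0 * std::pow(2.0, cv / 32.0); }
int    cv_from_max_nits(double nits) { return (int)std::lround(32.0 * std::log2(nits / 50.0)); }

int main() {
    printf("~800 nits -> coded value %d\n", cv_from_max_nits(800.0)); // 128
    printf("coded value 128 -> %.0f nits\n", max_nits_from_cv(128));  // 800
    return 0;
}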

.............................


The way this is all supposed to work is that the TV/monitor advertises its peak HDR10 brightness as part of the display metadata. This is true of most PC monitors and any DisplayHDR certified monitor. When the content/Windows advertises the same or lower peak brightness, the display disables tone mapping. FreeSync 2 / Premium Pro monitors also have a feature to explicitly disable tone mapping, though I don't know whether this is used in practice or how it is signalled. Whether the resulting output is correct is up to the monitor and is frequently inaccurate, though the latest monitors are getting better.

Windows apps use this HDR display metadata together with the SDR brightness slider (which is a paper-white setting for SDR content in the HDR container) to decide what to do with tone mapping. This HDR calibration app is just providing another way to tell Windows what values to provide to the game using the standard Windows HDR APIs that have been around since Windows 10. Games that have their own peak brightness sliders can override this but should be using these values as their default (some don't, so check the slider). Games that don't have peak brightness sliders (so-called HGIG games) should be using these values to decide their tone mapping.
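A minimal sketch of what those standard Windows HDR APIs report per display (my own example, not from the quoted post; assumes Windows 10 1803+ and linking against dxgi.lib):

Code:
// Prints the HDR metadata Windows exposes for each connected display:
// peak luminance, max full-frame luminance, and min luminance in nits.
// These are the values games/apps are supposed to pick up as defaults.
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <cstdio>
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT a = 0; factory->EnumAdapters1(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a) {
        ComPtr<IDXGIOutput> output;
        for (UINT o = 0; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o) {
            ComPtr<IDXGIOutput6> output6;
            DXGI_OUTPUT_DESC1 desc{};
            if (SUCCEEDED(output.As(&output6)) && SUCCEEDED(output6->GetDesc1(&desc))) {
                wprintf(L"%s  peak %.0f nits, full-frame %.0f nits, min %.4f nits\n",
                        desc.DeviceName, desc.MaxLuminance,
                        desc.MaxFullFrameLuminance, desc.MinLuminance);
            }
        }
    }
    return 0;
}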

Unfortunately, for some reason most TVs that support Dolby Vision leave the HDR10 peak brightness metadata in the EDID blank. Windows falls back to using 1599 nits as a default (not sure why they picked this number, maybe it avoids tone mapping up to 1000 nits?). This is especially bad because when Windows supplies this as content metadata it kicks the TV into 4000-nit tone mapping, making everything overly dim.

HGIG calibration is a silly workaround "standard" that gets users to set these values manually with a test pattern (and/or keeps a database of displays on the console) instead of just updating the TVs to advertise the expected HDR10 display metadata. The most annoying part is that most TVs do expose this metadata as Dolby Vision metadata (to support player-led Dolby Vision), but Windows and consoles don't read that information. I suspect making proper use of this metadata is one of the reasons Dolby Vision on consoles looks "better" (plus 12-bit dithering from the Dolby engine).

An alternative way to solve this calibration issue more objectively is to use edid-info to dump the Dolby Vision metadata and copy that into the appropriate EDID fields (or into this Windows calibration app, which puts the metadata in a Windows-specific ICC profile tag instead). This is reasonably easy for brightness but more annoying for the color gamut primaries, as those aren't exposed or imported by default in CRU, so you have to mess with EDID editors.
Having this metadata wrong also makes the Windows 11 legacy ICC profile emulation compatibility feature in HDR broken, as it generates the wrong ICC profile, but the primaries don't seem to be used by much else yet.

If all of this works correctly end to end on something like an OLED monitor (to avoid backlight) you should get identical output between SDR sRGB mode and the windows desktop in HDR mode with matching SDR brightness settings (possibly with some extra banding due to all the transformations going on).

As a side note the calibration app saturation slider should be all the way to the left on a monitor/TV with accurate HDR output, anything else is messing with the image saturation using the ICC profile. The ICC profile isn't a "normal" one, it includes an undocumented Microsoft Advanced Color MHC2 tag which does global system wide color transformations in the GPU similar to novideo_srgb and applies to all apps. Until this HDR calibration tool that tag was mostly only used for factory calibrating laptop built in displays.
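If you're curious whether a given profile actually carries that MHC2 tag, ICC files are simple enough to inspect by hand: a 128-byte header, then a big-endian tag count and 12-byte tag entries. A rough sketch of my own (standard ICC layout; pass it the .icm the tool created):

Code:
// Lists the tag signatures in an ICC profile and flags the Microsoft 'MHC2'
// advanced-color tag if present.
#include <cstdint>
#include <cstdio>
#include <fstream>
#include <vector>

static uint32_t be32(const unsigned char* p) {
    return (uint32_t)p[0] << 24 | (uint32_t)p[1] << 16 | (uint32_t)p[2] << 8 | p[3];
}

int main(int argc, char** argv) {
    if (argc < 2) { printf("usage: %s profile.icm\n", argv[0]); return 1; }
    std::ifstream f(argv[1], std::ios::binary);
    std::vector<unsigned char> data((std::istreambuf_iterator<char>(f)), {});
    if (data.size() < 132) { printf("not an ICC profile\n"); return 1; }

    uint32_t count = be32(&data[128]);            // tag count follows the 128-byte header
    if (data.size() < 132 + 12ull * count) { printf("truncated tag table\n"); return 1; }
    for (uint32_t i = 0; i < count; ++i) {
        const unsigned char* e = &data[132 + 12 * i];
        printf("%c%c%c%c%s\n", e[0], e[1], e[2], e[3],
               be32(e) == 0x4D484332 ? "   <-- MHC2 advanced color tag" : "");
    }
    return 0;
}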

.................................
 
There might be a way to set up the Windows HDR Calibration tool at 1000 nits so it passes a 1000-nit curve, if you wanted a static tone mapping option that recovers some otherwise lost highlight detail. I'd have to look into it but it seems feasible. From what I've read it just creates an HDR color profile.

From a reddit thread:

[screenshots of the quoted reddit posts]


This explains why I saw no difference in highlights before and after doing the windows 11 hdr calibration tool. I had already done the CRU edit a long time ago so that was essentially the HGiG calibration.

So I just tried out the recently released Windows HDR Calibration tool on my CX, and supposedly it helps to improve the Auto HDR image quality. So far I'm not seeing a whole lot of difference from before, if any really. Is it because I already did the CRU tweak to make Windows report my display's peak brightness as 800 nits? I did the calibration while my TV was in HGiG mode btw, and again the CRU edit to have the display recognized as 800 nits peak was already done a while back, prior to doing the calibration. Perhaps this is more useful to people who did not do the CRU edit.

Now here's my question: if the CRU edit tells Windows I don't have a 2000-nit display but only an 800-nit one, doesn't that mean Windows will now tone map 2000 nits down into an 800-nit container? That would put us back at the original problem of squeezing out-of-range colors/brightness down to 800 nits. What EXACTLY is CRU doing when it tells Windows that your display is only capable of 800 nits? If it tells Windows to only send an 800-nit signal and clip everything from 800 nits up to 2000 nits, how is that any different from not doing the CRU edit, letting Windows send the full 2000-nit signal, and then letting your TV itself clip everything above 800 nits by enabling HGiG?
 
As I understand it, CRU is just sending the correct (well as correct as you set it to be) peak nit value of the screen to windows and then to the game. The windows HDR calibration tool is attempting to do similar. A game with a peak brightness slider is also doing similar.

Tone mapping on a game might depend on the game dev (especially a console game) but generally, from everything I've read - Tone Mapping on the display will only happen when the nits exceed the peak of the display.

So in your example, where I think you are saying you hypothetically have a 2000-nit peak display and you:
... use the CRU edit to define the display as 800 nits
... or purposefully use wrong settings in the Windows HDR Calibration tool (or maybe edit its end-result ICC profile) to get 800 nits
... or purposefully set the peak nit slider in a game, whose devs have provided one, to ~800 nits.

I think the result would be the same. You aren't exceeding the peak nits of the display, so the display won't do any compression/static tone mapping in DTM=off mode.

A game can apparently override this but it should be using it as the foundation so if it's ignoring it, the HDR implementation is probably "broken".

...If you do none of the above and only turn on HGiG on a 2000nit peak display, you'd be turning off all processing/tonemapping/compression. The display would take all color ranges up to the peak of the display and then clip (e.g. if sent HDR4000 or HDR10,000 data).
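To make the clip-vs-compress distinction concrete, here's a toy comparison (illustration only; LG's actual static tone mapping curve isn't public, the soft roll-off below is just a generic knee):

Code:
// Toy comparison of HGiG-style hard clipping vs a generic soft roll-off.
// Values are nits; peak = panel capability, content_max = what the source sends.
#include <algorithm>
#include <cstdio>

double hard_clip(double nits, double peak) { return std::min(nits, peak); }

// Compress everything above 'knee * peak' into the remaining headroom so some
// highlight detail survives, at the cost of accuracy above the knee.
double soft_rolloff(double nits, double peak, double content_max, double knee = 0.75) {
    double start = knee * peak;
    if (nits <= start) return nits;
    double t = std::min(1.0, (nits - start) / (content_max - start));
    return start + (peak - start) * t * (2.0 - t); // eases into the panel peak
}

int main() {
    for (double n : {400.0, 800.0, 1200.0, 2000.0})
        printf("%6.0f nits in -> hard clip %4.0f | roll-off %4.0f\n",
               n, hard_clip(n, 800.0), soft_rolloff(n, 800.0, 2000.0));
    return 0;
}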






 

Right. Tone mapping on the display won't occur because we will not exceed 800 nits. But what happens to everything beyond 800 nits when you do the CRU edit? Is it all being tossed out or will Windows Auto HDR tone map 2000 nits down to 800 nits?
 


I think the CRU edit tells Windows that an 800-nit screen is connected, so that is all it will work with. It'd be the same as if you edited CRU to only show a 1920x1080 resolution: no other resolution (or in this case, color range/nits) would be available.
 

This is why I gave up on HDR for PC gaming and didn't bother with it again until late 2020, after I got a CX and an RTX 30 series for 4k 120Hz 10-bit. It was an absolute mess on PC, and even today it seems like you need to do a whole bunch of digging around to find the right TV settings, Windows settings, GPU settings, etc. on a per-game basis. From all the testing I've done, once you do the CRU edit there is no longer any difference between HGiG and DTM OFF, so really the two options are either DTM ON or HGiG, and one can look better than the other depending on the game/scene. I guess we just need much, much brighter displays to avoid tone mapping altogether.
 

It's way better than it used to be. You have three ways to set the peak brightness now. The Windows HDR Calibration tool does it overall for Windows, a lot like a game with a peak brightness slider does for its individual game. So that is two. Then there is the CRU edit, which gives you more precise control over the values (though you could probably also edit the ICC profile that the Windows HDR Calibration tool creates more precisely after the fact).

The problem was, Windows wasn't using the right data or range before in some content/games. Now you have a few choices for how to assign the peak brightness Windows uses as, more or less, the peak nit capability of your display.

...........

Still you could probably set up windows to use a 1000nit curve using one of those three methods if you wanted DTM=off mode to allow the TV to apply static tone mapping compression - which might preserve more details (inaccurately color value wise) that would have otherwise been lost.
 
It seems to me that Windows HDR calibration tool is applying a global peak nit similar to how games that have them in their own menus do. That would make the CRU edit less needed - though the calibration style tools are doing the adjustments relative to how you feel examples look, kind of like cleartype tuning, rather than using a more precise number based assignment.
 

It does something similar to the PS5's HDR calibration screen, but it has a few extra sections that let you adjust the saturation; I guess that's why people are thinking it's supposed to be a color profile. My assumption was that this is doing what CRU does, so people who have already done the CRU edit may not need to do this unless they wish to adjust the saturation. I left my saturation values in the middle, so I guess that's why I see no difference pre- and post-calibration, since CRU already took care of capping the peak nits. For people who are not keen on using CRU this would be the "official" solution I suppose.
 
According to one of the quotes I posted, regarding the Windows HDR calibration tool:

All the app does is create an HDR color profile.
Search "Color Management" in Start and open the tool; you should see the profile created by the app under: ICC Profiles (Advanced Color).

Obviously, the profile gets created only after completing the calibration process in the app at least once.

If you use an Nvidia GPU, you need to set the color accuracy to "Accurate" under: Nvidia control panel > Display > Adjust desktop color settings > 2. Color accuracy mode.

I tried Ghostwire Tokyo and it looks much better. Maybe I went a little too far with the saturation in the calibration app, but it definitely looks different than before using the calibration app.

So apparently it does create an actual ICC profile. Even though it's called a "color profile" it covers all parameters, incl. brightness, etc.

A warning regarding the Nvidia Control Panel -> Display -> Adjust desktop color settings -> Color accuracy mode "Accurate" setting (the quote below suggests Enhanced behaves the same with default sliders):
I am aware of a bug that prevents you from going back to Accurate if you ever touch the sliders under this option; take a look at this thread, especially post number 9: https://forums.guru3d.com/threads/color-accuracy-mode.435755/

Enhanced should be Accurate with the sliders applied to it, so if you keep the sliders in the default position, it should be equivalent to Accurate even if it says Enhanced.
I don't know if Nvidia ever provided an official comment on this behavior or the current state of it in the latest drivers.


I experienced the same issue a long time ago and in order to revert to accurate I had to reinstall the driver after using DDU. Since then, I am really careful when I use that area of the panel and I avoid touching any of the sliders.

Here is where the ICC profiles are referenced. You could probably find a way to automate switching/choosing with a stream deck's buttons somehow if you wanted to set up different curves e.g. 800 nit, 1000nit, different saturations.

https://pcmonitors.info/articles/using-icc-profiles-in-windows/

[screenshot of the Display Profile utility]


Obviously it would be a pain to have to go into Colour Management and switch profiles on and off every time you wanted to play a certain game or return to the desktop, or switch between multiple profiles for different purposes. Windows 10 and 11 include a drop-down list in ‘Display Settings’ which makes this easier. Alternatively, there is an excellent and tiny utility called ‘Display Profile’ (above), created by X-Rite, which gives you a very quick and convenient way of doing this. You can download it here. This allows you to toggle between ICC profiles or use the system defaults if you essentially want to disable any ICC profile corrections. This utility lists profiles located in ‘X:\Windows\system32\spool\drivers\color’, where ‘X’ denotes the drive you’ve installed Windows to. You can therefore simply drag and drop any profiles to this folder and they should be selectable in this utility. To use system defaults and disable any specific LUT and gamma corrections simply select ‘sRGB IEC61966-2.1’ in the utility.
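For the Stream Deck automation idea, anything that can run a small program can at least enumerate that folder as a starting point; a quick sketch of my own (assumes a default C: install and C++17):

Code:
// Lists the ICC/ICM profiles in the folder Windows (and the Display Profile
// utility) reads them from, as a starting point for a profile-switching macro.
#include <filesystem>
#include <cstdio>

int main() {
    const std::filesystem::path colorDir =
        "C:\\Windows\\System32\\spool\\drivers\\color";
    for (const auto& entry : std::filesystem::directory_iterator(colorDir)) {
        auto ext = entry.path().extension().string();
        if (ext == ".icm" || ext == ".icc" || ext == ".ICM" || ext == ".ICC")
            printf("%s\n", entry.path().filename().string().c_str());
    }
    return 0;
}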

.....................

Yes, it's like Windows added a peak brightness slider like some games have, so you wouldn't have to use the CRU edit. The CRU edit is just more specific. (It also allows you to add/remove resolutions, e.g. remove 4096x and add ultrawide resolutions.) The Windows HDR Calibration tool was pretty recently released on PC after being on Xbox. It seems like a good fix.
 
C3 needs to up that refresh rate past 120Hz now. 4090 is ~1.7x over a 3090 as I thought, plenty for well above 120fps.

[attached 4090 benchmark chart]
 
For now, 4090 seems like a 4K120 card... wow.

145fps is on the lower end of what it can do since the testing was done with DLSS off and ultra settings. There's wiggle room to improve performance with optimized settings and using DLSS to easily push it into 160fps+ averages.
 
So...another day and another frustrating moment with HDR for me lol. Just tried the new Plague Tale game this morning after watching this video:



He says that the game does obey the Windows HDR Calibration tool and not to deviate too far from the in-game HDR default values if you used the calibration tool, which I did already. However, I am noticing some pretty insane clipping going on here, which I'm assuming is normal behavior: if a highlight is greater than 800 nits then it simply gets hard clipped, as that is what HGiG is supposed to do, isn't it? There is no roll-off going on, so it displays 0-750/800 nits as it should be shown and then clips everything beyond that. So what's the problem? It seems like the game has a lot of highlight detail beyond 800 nits, at least in the beginning section so far, and it's all getting completely clipped, so now like half of my screen is just filled with clipped highlights. Funny enough, DTM ON actually causes even more clipping than HGiG, and DTM OFF behaves exactly the same as HGiG. The areas in red are where I'm seeing a bunch of clipping, mostly the ground and the clouds/sky, but like I said it takes up a large portion of the screen, and I wouldn't really consider this a great HDR experience if the rest of the game is like this, full of clipped-off detail.
 

Attachments: 20221018_083607.jpg (screenshot with the clipped areas marked in red)
Can you delete the Windows HDR calibration profile from the color profiles list in Windows without screwing things up? (Idk if you can do that, or remove the calibration / zero it back to default in the calibration tool.) The Windows calibration is done by your impression rather than by numbers for some of its config steps.



You already have CRU edited to ~800 nits I think, so you could leave it like that and set the TV to DTM = off so the game's curve gets static tone mapped down by LG's curve. Also, is Auto HDR on? Could that clash with true HDR in some games? Seems like an HDR curve problem like you said though. Are there any DTM settings in the game itself? An in-game peak brightness setting? A middle brightness/white setting?

DTM = On can step over other color values and wash out detail. There are some HDTVTest vids showing it washing out textures in bright highlights, so that's not that unexpected really. DTM on lifts lower color values into those occupied by higher ones; it can overbrighten the picture or parts of it, and it will run out of already-occupied rungs on the ladder to step onto. It does it all dynamically via analysis, so it can have some bad results.

I've only watched that video on a basic SDR screen so far, but those areas looked very bright in the video and obviously clipped in your screenshots. It's tough to show HDR issues using SDR screenshots and videos rather than testing hardware and graphs. Even actual HDR images/videos would look different on different screens or screen settings, even in different ambient room lighting.


Edit: more sliders and subjective adjusting-by-eye. Maybe this game's HDR calibration and the Windows calibration are clashing. Maybe try lowering this one?

[screenshot of the game's HDR luminance slider]
 

Yeah, the weird part is that I dragged that slider all the way to the left and it was still clipping. I think it's definitely some sort of conflict between Auto HDR / the Windows HDR calibration and the game. Maybe this game is supposed to follow the HDR Calibration tool but isn't doing it properly. I'll try removing the calibration profile, turning Auto HDR off, and just adjusting the in-game settings with HGiG enabled, and see if I still get a bunch of clipping.
 
Yes wiping it might be a good starting point but idk if you can swap to a different profile using that windows color profile menu instead. Otherwise, you could try to save that existing HDR calibration tool color profile file into a different folder or rename it, and then run the windows calibration tool again but turn things down. Could be difficult to fine tune it in that manner though. That way you'd still have your tweaked and working windows calibration tool profile around to swap back to file wise without having to go through all of the calibration tools again trying to duplicate what you had if you were happy with it.
 

Well, I deleted the color profile in Color Management and that seems to get me halfway to fixing the problem. What I had to do to restore all the highlight detail was turn the in-game luminance value all the way down to the minimum, on the far left. I'm not sure why I need to go so far down on the HDR settings, but it is what it is. Here are some screenshots showing the highlight detail being recovered, starting from the default setting (the middle of the luminance slider) and gradually adjusting it to the left until I reach the end. So for now it seems like games that supposedly follow the Windows HDR Calibration tool, such as Plague Tale, may actually end up screwing with your final image in a bad way. That contradicts what GamingTech said about the OS-wide Windows HDR Calibration tool working as intended for this game, but hey, maybe I just did something wrong on my end.
 

Attachments: 20221018_204603.jpg, 20221018_204620.jpg, 20221018_204647.jpg, 20221018_204701.jpg (screenshots stepping the luminance slider down)
Glad you got the clipping to go away. That kind of thing looks terrible. DTM can similarly have a lot of bad tradeoffs like that: muddied/lost details from color values stepping over themselves, and scenes being operated on dynamically from analysis, which can vary (how bright the sky is, for example) just from looking at a different angle when returning to the same scene - which is why I don't use it.

Maybe the game should have a checkbox to disable the game's own calibration or to not use windows' one. The HDR calibration tool is still relatively new on pc so this might just be some growing pains where the different scalings/curve mappings are stepping over one another.

Can also check a few other things..

make sure you have hdmi deep color enabled on your hdmi input in the TV OSD

make sure you are in PC mode icon RGB/444 rather than 4:2:0

You probably are in both already though.

Might check your game HDR mode settings too:

https://www.reddit.com/r/OLED_Gaming/comments/mbpiwy/lg_oled_gamingpc_monitor_recommended_settings/


If your other games are working properly it's probably just the dev's fault though like you said.
 

The only issue now is that by having to adjust the slider all the way to the left, the overall picture is way dimmer and looks more like SDR. It's still better than actually playing in SDR, so maybe it's more like HDR400 I guess. I'm not sure what method GamingTech uses to measure the in-game scene's luminance output, but I would say the peak brightness is definitely dimmer than your average HDR game on an OLED screen. My other settings like HDMI Deep Color and RGB chroma are enabled and so is PC mode. Not sure how else I can get the peak brightness to be more in line (700-800 nits) without causing clipping.
 
If you don't use the Windows HDR calibration profile and use DTM=off, it would normally map down to your 800 nits. However, that game's own HDR calibration seems to be screwed up somehow, using its own range or formula, or maybe even throwing the gamma off or something.

You could try to sacrifice one of the other default named TV modes (the HDR version of it, with the HDR game running to enable that) in the TV's OSD and tweak its settings to try to compensate for what's happening in that game. However, when not using the SDR-named or HDR-named game mode you'll get a lot more input lag, so that's not a great option.

. . . . . . . . . .

Apparently reshade supports 10bit HDR so you could set up reshade for that game. It's pretty easy to work with if you just want to adjust a few simple sliders for the game (brightness, saturation, black detail, white point, etc). You don't have to go crazy adding a ton of additional shaders/plugins to it. Still might be hard to compensate for a hard clip.

Some Reshade plugin recommendations from a 48" OLED reshade thread:

I'd recommend the fake HDR setting; make sure to put Levels on there so you can really play with that contrast and whatnot . . . tbh just tinkering is the best thing, there is no one-size-fits-all and your preference could be different than mine.

Magic HDR, fake HDR, prod brightness contrast, prod colour space curves. I am an HDR OLED user.


https://forum.level1techs.com/t/guide-for-installing-running-reshade-and-presets/126368

.

HOW TO RESHADE w/ qUINT LIGHTROOM.fx BASICS

. .

Magic HDR is a separate filter/addon from lightroom but you can load multiple filters

The Magic HDR ReShade plugin (apologies that it's from a low-rez source, but you can make out the lettering).
Exposure settings might help your game.

[screenshot of the Magic HDR ReShade plugin settings]



. . . . . . . . . . .

Trying to think of a way to trick the game into thinking your peak nits are higher. Maybe purposefully doing the Windows calibration tool "wrong" to get a higher peak out of it, or using CRU to make the screen 1000 instead of 800, but I wouldn't want to change the CRU edit for one game. I think the Windows calibration method would be the easier thing to mess with, since you could swap your more accurate color profile out and copy+paste it back later.

The scaling/range of the in game sliders is probably just a bad HDR implementation and it sounds like there is no way to disable the in game calibration entirely.

I'd give reshade a shot. It looks promising to me.
 
I'll do some more digging on it later but for now I've held off on playing the game until I can get a 4090. With the current settings I'm using the game doesn't necessarily look bad, I just feel like it's not reaching the max peak brightness that I know the CX can pull off.
 
If you ever get around to it, I think the exposure settings could have a chance of muting that white clipping/blowout on the ground. That panel comes up as a side panel with a hotkey show/hide toggle once ReShade is loaded into the game, so you can move the sliders around experimentally while the game is running and see what results you get.

When you run reshade it asks what game you want to add so you just navigate to the game exe. Then you pick which filters you want in your library. The screenshot I posted shows a bunch of filters in his list but if you look you can see he's only using 2 that he has checkboxed. You really don't have to go overboard with reshade and complicate it. The lightroom filter has sliders for saturation, brightness, black detail , etc kind of like photoshop/lightroom. The Magic HDR one has a few headings but I think in that game's case the input and output exposure might help.
 
You might be interested in this workaround for games with that problem.

Youtube Video:

compressing HDR games to show 10,000 nits on LG C1

. . . . .

He's setting the nvidia gpu range to limited and then changing the black detail and white settings in the TVs OSD using provided test patterns.

Similar clipping to your game:

[screenshots from the video showing similar clipping]


He changes the Nvidia setting to limited. This isn't the only step though, he changes OSD settings for black detail and the regular brightness settings.

He had that game at screen brightness 46 instead of 50, and at a dark area level of -1 when in limited range. He said that was equivalent in effect to brightness 50 and dark detail of -17 when in full range. He tweaks different values for different games though.

[screenshot of the adjusted OSD brightness / dark area settings]
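For context on the limited-range part of that trick: switching the GPU output between full and limited RGB is just the standard 8-bit requantization below, so the picture gets squeezed into codes 16-235 and the TV's black/brightness controls are then re-tuned against it with test patterns. How exactly that buys highlight headroom is my read of the video, not something it spells out:

Code:
// Standard 8-bit full <-> limited (video) range RGB requantization that the
// Nvidia output dynamic range setting switches between.
#include <cmath>
#include <cstdio>

int full_to_limited(int v) { return 16 + (int)std::lround(v * 219.0 / 255.0); }
int limited_to_full(int v) { return (int)std::lround((v - 16) * 255.0 / 219.0); }

int main() {
    printf("full 0   -> limited %d\n", full_to_limited(0));    // 16  (black)
    printf("full 255 -> limited %d\n", full_to_limited(255));  // 235 (white)
    printf("limited 235 -> full %d\n", limited_to_full(235));  // 255
    return 0;
}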
 

I'm currently playing Uncharted atm since my 3080 Ti can do 4k maxed with DLSS at 100+ fps. The HDR experience here has been far less frustrating to say the least. All I did was enable HDR, adjusted the brightness slider from 5 to 7, and everything seems to look just fine as far as I can tell. I really think we shouldn't have to resort to messing around with Reshade, NVCP, CRU, etc. etc. just to get a good looking picture. Hopefully HDR gets to that point one day on PC where it's as simple as turning it on and as Jensen Huang says, "It just works".
 
Yes, that would be nice. If I was very into a particular game and spending a lot of time on it, I'd go the extra mile and try to get a better result, but not for every game or for so-so games.
 
I'm currently playing Uncharted atm since my 3080 Ti can do 4k maxed with DLSS at 100+ fps. The HDR experience here has been far less frustrating to say the least. All I did was enable HDR, adjusted the brightness slider from 5 to 7, and everything seems to look just fine as far as I can tell. I really think we shouldn't have to resort to messing around with Reshade, NVCP, CRU, etc. etc. just to get a good looking picture. Hopefully HDR gets to that point one day on PC where it's as simple as turning it on and as Jensen Huang says, "It just works".
To my understanding the HGIG setting should be for that so that the game itself could provide the right settings but I don't know if anything on PC implements it.
 

Most games that I've played do have some sliders to adjust the HDR so I guess that's currently the best we have to use with HGiG. Some games do not have any adjustment sliders at all so for those it might be better to just use DTM OFF. Hopefully one day that Windows HDR Calibration tool will act like the PS5's and all PC games will support it.
 
Did anyone swap to the 42C2 yet? Thinking about it as the black friday prices are looking really good. Probably will be cheaper yet though when the C3 comes out next year. I'm so used to the 48 at this point though I wonder if I'll regret downsizing.
 
I go by PPD, so the screen size just determines how far away I'll have to sit (and modify my setup accordingly). A 42 inch 4k saves 4.5 to 6 inches of view distance compared to a 48 inch 4k - that's in regard to your size-difference reasoning. Newer models often have a few slightly better specs, like a bit more peak brightness. You'll find people who swapped in the 42" OLED threads.


..
42" 4k flat: 64deg viewing angle = 29" view distance = 60 PPD

42" 4k flat: 55deg viewing angle = 35" view distance = 70 PPD

42" 4k flat: 48 deg viewing angle = ~ 41" view distance = 80 PPD <----- view distance makes an equilateral triangle pyramid/cone viewing angle so you can see the bulk of the screen surface better

. .

48" 4k flat: 64 deg viewing angle = 33.5" view distance = 60 PPD

48" 4k flat: 55 deg viewing angle = ~ 40" view distance = 70 PPD

48" 4k flat: 48 deg viewing angle = 47" view distance = 80 PPD <----- view distance makes an equilateral triangle pyramid/cone viewing angle so you can see the bulk of the screen surface better

..


..

At 60 PPD, massaged or alternative types of text sub-sampling are perhaps enough against text fringing, and aggressive amounts of AA are enough to compensate for graphics aliasing in 3D game engines. However, on the 2D desktop there is typically no AA outside of the AA in text sub-sampling, so desktop graphics and imagery have their pixelization uncompensated. So 70 to 80 PPD (for now, or higher someday) is preferable.

Sit closer than the 60 PPD point and your sub-sampling and AA won't be enough to compensate anymore, but you will also be sitting nearer than your 60 degree human viewpoint. That results in the sides of the screen being pushed outside of a 60 deg viewing angle, which means you'd have to dart your eyes side to side continually during gameplay to see anything in those "eye fatigue zones". The closer you sit, the more off-axis or off-angle the sides of the screen become as well, which means more viewing-angle color shift, and over a larger area of the sides of the screen.

So there is a sweet spot range of PPD and viewing angle degrees. Changing the screen size of a 4k 16:9 vs. a different size 4k 16:9 just determines how far the viewing distances are for those ranges. The nearer 29" view distance for 60 PPD on a 42 inch 4k could be helpful for space constraints, but it's still out of bounds of a traditional near-upright-piano sheet music type setup's 1.5 to 2 foot viewing distance. 70 to 80 PPD is better though, especially for 2D desktop imagery and better looking text (perhaps even more so with WOLED text sub-sampling). For the vast majority of desk depths, to be able to get 60 PPD, and certainly to get to 70 to 80 PPD distances, you would have to decouple the screen from the desk using any of various mounting options. A simple slim rail spine floor-footed stand is probably the easiest way.
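The distances in the list above fall out of simple trigonometry; a rough sketch of the math, assuming a flat 16:9 panel:

Code:
// PPD from screen diagonal (inches), horizontal resolution, and view distance:
// horizontal width from the diagonal, viewing angle from the width/distance,
// then pixels per degree = horizontal pixels / horizontal degrees.
#include <cmath>
#include <cstdio>

double ppd(double diag_in, int h_pixels, double distance_in) {
    const double pi = 3.14159265358979;
    double width = diag_in * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0); // 16:9 width
    double angle_deg = 2.0 * std::atan(width / 2.0 / distance_in) * 180.0 / pi;
    return h_pixels / angle_deg;
}

int main() {
    printf("42\" 4k at 29\" away: %.0f PPD\n", ppd(42.0, 3840, 29.0)); // ~60
    printf("48\" 4k at 47\" away: %.0f PPD\n", ppd(48.0, 3840, 47.0)); // ~80
    return 0;
}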
 
Did anyone swap to the 42C2 yet? Thinking about it as the black friday prices are looking really good. Probably will be cheaper yet though when the C3 comes out next year. I'm so used to the 48 at this point though I wonder if I'll regret downsizing.

I've gotten used to 48 over the last 2 years so I don't see much reason to upgrade if the image quality and Hz are the same. Some would argue that the image quality would be a downgrade due to the 42's lower peak brightness vs the 48. The only thing that will get me to upgrade at this point is noticeably better image quality, like QD-OLED, or a decent bump in Hz to something like 160Hz.
 
If anything Samsung's QD-OLED might actually be worse for desktop use if it keeps its odd pixel arrangement and they don't get Microsoft to support it. At least with the LG OLEDs you can mostly mitigate it with Cleartype adjustments and DPI scaling.

The things I want to see next for OLED from any manufacturer:
  • Higher refresh rate.
  • Higher peak and sustained brightness.
  • Curved 40-43" models.
  • 6-8K models at 40-55" size range.
 

That wouldn't bother me personally since I don't use my OLED for any sort of text heavy work, or any work in general. So the trade off of bad subpixel structure for higher peak brightness and color volume would be worth it to me.
 

At least with the LG OLEDs you can mostly mitigate it with Cleartype adjustments and DPI scaling.
------>> Or sitting at a higher PPD distance like 70 - 80 PPD + text subsampling methods. Not a fan of losing 4k desktop real-estate by scaling text personally. Also, scaling text and sub-sampling doesn't help aliasing of desktop 2d graphics and imagery. I wouldn't want to be using a PPD where pixels and sub-pixels are so large that you need to scale text due to pixelization. On very high PPD displays of course I'd scale text just so it's not microscopic though.


The things I want to see next for OLED from any manufacturer:
  • Higher refresh rate. <--- agree, but at high resolutions 4k+. Not the 1440p 16:9 higher hz ones that are put out with resulting lower PPD (outside of a 49" g9 if sitting nearer to the ~ 40" focal point)
  • Higher peak and sustained brightness. <---- agree. Especially sustained. Heatsink tech would probably help a lot which I'd trade off vs thinness. QD-OLED's blue basis and color filter theoretically allows for brighter colors at lower energy states/heat too.
  • Curved 40-43" models. <---- Curved yes but my preference would be on the larger end. I'd go up to 48" for immersion at 4k (and additionally uw resolutions on a 4k screen) 1000R ~ 39.4" focal point ~ 70PPD, 55" at 8k sitting at focal point of 1000R curve ~> 122 PPD
  • 6-8K models at 40-55" size range. <-- yes but as above, I'd prefer 1000R 48" 4k to 55" screen. Might as well skip 6k and go right to 8k on a 55" so that you'd be able to get quads of fair sized 4k windows.
  • Gaming Upscaling tech on the screen if possible so you could send 4k high hz bandwidth over the port/cable bottleneck to be upscaled to 8k on the screen without it looking muddy or introducing anything other than negligible input lag. Would be cool if it were possible to get gaming AI upscaling hardware on the monitor end.

Ultrawide resolutions would be great at higher Hz, especially on an 8k screen that is ~55" 1000R. On a 4k screen, when you do 32:9 or 32:10 you only get 1080px or 1200px of viewable height, which makes me uninterested in that aspect. The resolutions below seem like they'd fit in HDMI 2.1's bandwidth.

From using the LTT resolution/bandwidth calculator I linked: these ultrawide resolutions look like they'd be nice to run on a 16:9 4k or 8k screen if they upped the Hz on OLED TVs, even with 4k upscaled to 8k being sent as an 8k signal as far as the display is concerned. They fit within HDMI 2.1's bandwidth, at least when using a DSC 3:1 compression ratio. DP 2.0 would be nice, but realistically I'd probably stick with a TV model rather than paying up to triple+ in some cases for a comparable desktop gaming monitor version to someday get DP 2.0, and potentially end up suffering AG coating to boot, which would really bother me, especially on an OLED.

(The 8k signals, even if they're 4k upscaled on the display end in order to get higher fps.)

8k at 32:10 ultrawide, 7680 × 2400 @ 200Hz, 10-bit signal at 3:1 compression: 41.03 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

8k at 24:10 ultrawide, 7680 × 3200 @ 150Hz, 10-bit signal at 3:1 compression: 40.02 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)

4k at 24:10 ultrawide, 3840 × 1600 @ 500Hz, 10-bit signal at 3:1 compression: 40.73 Gbit/s (HDMI 2.1 ~> 41.92 Gbit/s)
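As a rough sanity check on those figures, the active-pixel math is simple; blanking overhead (which the LTT calculator includes via its timing model) pushes the real numbers higher, noticeably so at very high refresh rates:

Code:
// Rough link-bandwidth estimate: active pixels only, RGB, divided by the DSC
// compression ratio. Real signals add blanking, so calculator figures run higher.
#include <cstdio>

double gbit_per_s(int w, int h, int hz, int bits_per_channel, double dsc_ratio) {
    double bits = (double)w * h * hz * bits_per_channel * 3.0; // 3 channels
    return bits / dsc_ratio / 1e9;
}

int main() {
    printf("7680x2400 @ 200Hz 10-bit, DSC 3:1: %.1f Gbit/s active-only\n",
           gbit_per_s(7680, 2400, 200, 10, 3.0)); // ~36.9 vs ~41 with blanking
    printf("3840x1600 @ 500Hz 10-bit, DSC 3:1: %.1f Gbit/s active-only\n",
           gbit_per_s(3840, 1600, 500, 10, 3.0)); // ~30.7 vs ~40.7 with blanking
    return 0;
}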
 
Finally managed to get an RTX 4090. Decided to pair it up to the CX instead of the X27. Maybe I'm just crazy but it feels like on nvidia there's noticeably more VRR flicker than there was when I was using my 6900 XT. Maybe Freesync Premium helped with this?
 
Interesting. You might want to compare which games and, especially, which frame rate ranges you are seeing, including the lower end, and how much fluctuation there is - how erratically the frames are being delivered. Might try lowering settings to see if it's as obvious at higher frame rate ranges, and try a different game in case one game or its settings is making the frame rate flow more choppy for some reason.

Make sure HDMI black level is set to full and things like that (brightness levels, gamma, etc. not lifting dark areas), and that you're running native rez (using DLSS?). When the TV is disconnected some things revert, like the named PC icon mode (444/RGB).

Will try to think of anything else.
 
Maybe also lock the max framerate to 119 or 118 to keep Gsync on and in range without going over max refresh and dropping back down under constantly... which a 4090 is powerful enough to do. :)
 
I don't think it has much to do with the average frame rates though. The specific example that caught my eye was the "loading screen" in AC Valhalla where you can free-run around in a dark foggy area. The 4090 seemed to exhibit much more pronounced flicker than the 6900 XT did. But it's probably just a case of more/worse frame time spikes on the 4090 during that loading screen or something. VRR flickering was something I always experienced on loading screens where my frametime graph is all over the place, even on non-OLED displays, so this is nothing new at all. In game though I don't really see much of it, at least not with the games that I play. And yes, the 4090 is absolutely able to smash past 120fps at 4K, unless you love ray tracing lol. I might even swap it out for a 7900 XTX if the raster performance is close enough, as I honestly can't tell a damn difference between RT on and off at a glance. I either need a direct side-by-side or I have to really analyze parts of the image to notice, and nobody plays games like that.
 
That reminds me of the menu screen issue in "New World" that was frying GPUs at 1000 fps or something - fan module overvolting and burning up/popping like popcorn. I forget the details, but it was definitely related to an extreme uncapped-fps menu issue like that.

Idk if you ever watch "Sadly it's Bradly" on youtube. He does insider info + data mining rumors/information on upcoming VR tech. One of his more recent vids was on Valve Prism + Deckard. In it he mentions some indications of VRR brightness compensation tech.

[screenshot from the video referencing VRR brightness compensation]


. . . . . . . . . . . . .

Steam VR's Prism - Sadly It's Bradly (youtube vid link)
 