42" OLED MASTER THREAD

Reshade
https://reshade.me/

Reshade is pretty easy to use if you just want to edit a few sliders. After you install the ReShade app, pick your game if it's detected, or use "Browse" to search directories for the game's exe.

Borrowing "how to install" images from here: https://forums.flightsimulator.com/...listic-colors-and-tone-fixes/454411/51?page=3
[installer screenshots]


"Now select which shaders (effects) you want to install, my recommendation is to uncheck all, and then recheck all, to install all the shaders. So make sure all are checked and then click next."

You should probably install all of the shaders since you can uncheck them all later within the app and only use the few you want on each game.
[shader selection screenshot]


"Reshade should then load automatically when you start the sim"
"The first thing I would do would be to navigate to the settings menu in reshade at the top and select a key for effects toggle."

Like he said, select a unique hotkey combo so you can enable/disable the reshade effects in order to see before/after as you tweak.

[in-game ReShade overlay screenshot]




You really only need to activate a few shader checkboxes to get controls for saturation, color strength/tone, contrast, black detail level, white point, sharpness, etc.

[shader settings screenshot]


This one is dated but still shows how to change a lot of things pretty well, notably black point and white point.
https://forum.level1techs.com/t/guide-for-installing-running-reshade-and-presets/126368

There are also pre-baked settings packages (presets) you can download that people have already taken the time to put together for particular games. In that case you wouldn't have to edit anything yourself; you'd just install the preset. Personally I'd rather adjust a few sliders myself though, since everyone's monitor settings, viewing environment, and personal taste are different.

If you want to install a preset someone authored for a game, for the most part it is just a matter of decompressing the preset INI into the specific game's folder:
1) Install the latest ReShade and ALL of the base effects. You can install the extras if you want, but they're not important for this tutorial.

2) Download the ReShade preset of your choice

3) Extract ONLY the INI file into your game folder (where ReShade is installed)

Some presets come with their own custom shaders, LUTs and/or textures. In that case, you'll have to install those too.

4) Open ReShade settings with the default key or a key you assigned by choice.

5) Go to HOME

6) In the drop down, select the preset you downloaded (the INI file you extracted)

7) It should load just fine. If not, press *** to reload.
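Steps 1-7 above boil down to "copy only the .ini into the game folder". As a sketch, the extract-only-the-INI step could be scripted like this (the function name is my own; normally you'd just use your archive tool):

```python
import zipfile
from pathlib import Path

def install_preset(preset_zip, game_dir):
    """Extract ONLY the .ini file(s) from a preset archive into the game
    folder (the folder where the game exe and ReShade live)."""
    game_dir = Path(game_dir)
    extracted = []
    with zipfile.ZipFile(preset_zip) as z:
        for name in z.namelist():
            if name.lower().endswith(".ini"):
                z.extract(name, game_dir)  # step 3: the INI goes next to the exe
                extracted.append(name)
    return extracted
```

If the preset ships custom shaders, LUTs, or textures, you would of course extract those too, per the note above.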

======================================================

The below shows how to use Reshade's HDR tools along with the Lilium tone mapping shader to fix "broken" HDR games that default to HDR 10,000 or HDR 4,000 due to poor implementation/broken sliders by the devs:

Before he set the Max CLL for static tone mapping to match what the game was trying to push (10,000 or 4,000) and set the target peak luminance to that of his screen, the brights were crushed to white and the black level was lifted and muddied.

PlasmaTVforGaming Youtube channel - Reshade HDR tools

YouTube Video:
HDR tone mapping is now possible with Reshade. What to do with 10000 nits games on any TV or Monitor [Plasma TV for Gaming Channel, 6 mo. ago]

excerpts from the video linked below:

"I'm going to show you a way to get a better HDR on these pc games that are trying to output 10,000 nits, and they do NOT have a peak brightness slider, or they just have broken sliders like here in Horizon Zero Dawn - but we also have Uncharted, The Last of Us, God of War.

Unfortunately it is very frequent that we get (this in) these games and we need to fix them, especially if you are using a monitor that is not able to do tone mapping, or a TV that cannot do tone mapping at 10,000 nits, because on this LG OLED we can actually come here to the settings and we select 'Tone Mapping: Off' or 'Dynamic Tone Mapping' (either one works) and then hit 1, 3, 1, 1 on the remote, and then you can change here (in the HDMI signalling override menu) the peak to Max LL 10,000, and then the TV is going to do tone mapping. When you do that the mid tones look dark, it's just too compressed.. it doesn't look great, ok?" "How can they mess this up so bad in Horizon Zero Dawn?" "It's just crap, look at this before and after".

. .

"So now what I'm using here on the PC is Reshade. So on reshade we have this new HDR analysis tool and we have other shaders to fix the HDR. So the shader that I 'm using to fix the HDR is to do tone mapping basically. It's this shader called lilium tone mapping.

So how can you fix 10,000 nit HDR games? Very easy, you just have to select where it says 'Max CLL for static tone mapping'. You select '10,000' and then you select 'Target Peak Luminance' and set it to '800'. Or, it (the Max CLL for static tone mapping) can depend on the game - so you can use that HDR analysis tool in Reshade (to find the values the game is trying to output). If you want 800 nit for HGIG on this LG C1, or one of these LG OLEDs that have 800 nit HGIG, you just make sure that the max CLL is 800 or just right below. So that's all you have to do, just adjust this Target Peak Luminance to do tone mapping, and the difference is remarkable."

"Now I am doing other tweakings that I will share with you, that fix the midtones and colors, etc.".

"The difference is just absolutely Gigantic." . . " So that's how we can do tone mapping (with Reshade) - you want 800 if the game is trying to output 4000, you do the same here: select 4000 where it says Max CLL for static tone mapping, and then you select 800 for Target Peak Luminance".


He states that Horizon Zero Dawn and God of War are both trying to push 10,000 nit HDR curves/ranges; not sure if The Last of Us is doing 10,000 or 4,000. So other than following his videos on those games, or researching online to find out what a game is pushing, you could simply use the HDR analysis tool in Reshade that he shows in that video. It displays that value in the top left corner of the game, sort of like an fps meter, so you can set the Max CLL for static tone mapping to the correct value (usually 10,000 or 4,000 for those broken HDR games). Then set the Target Peak Luminance to that of your gaming TV/monitor.
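For intuition, what the Max CLL / Target Peak Luminance pair does is compress a 10,000 (or 4,000) nit range down into your panel's range while leaving the low end mostly alone. Here's a rough sketch of that kind of curve in Python - this uses a generic extended-Reinhard rolloff, not Lilium's actual operator, and the function name is my own:

```python
def tone_map(nits, max_cll=10000.0, target_peak=800.0):
    """Compress [0, max_cll] into [0, target_peak] with a highlight rolloff.

    Generic extended-Reinhard curve for illustration only; the Lilium shader
    uses its own tone mapping math.
    """
    ln = nits / target_peak        # luminance relative to the display peak
    lw = max_cll / target_peak     # the "white" the curve maps exactly to 1.0
    compressed = ln * (1.0 + ln / (lw * lw)) / (1.0 + ln)
    return compressed * target_peak
```

With the defaults, a 10,000 nit highlight lands exactly at 800 nits; a game pushing 4,000 with `max_cll=4000` does the same; and low luminances pass through only slightly darkened - which is the general shape of static tone mapping.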

. .

Also, some SDR enhancement info about the way he uses Reshade in SDR games on an HDR-capable screen:

HDR gaming on the PC keeps getting better. New Reshade and Lilium Shaders update. SDR HDR trick [Plasma TV for Gaming Channel, 5 mo. ago]

He shows how he used the HDR shaders to fix Cyberpunk in the video below, and that video goes into more detail later on, further tweaking some things in Reshade in the HDR game beyond the Max CLL for static tone mapping and the target peak luminance.
"Man, honestly don't know how many times I've tried to fix cyberpunk native HDR but this is it - this is the best of my knowledge and abilities". "I'm very, very happy with this result".


*This video has an HDR version which shows the differences in HDR. YouTube HDR might need Chrome or Edge to get the HDR option for the vid.



View: https://www.youtube.com/watch?v=OKThK63kYj0
 
God damn! Seems like even the professionals don't clearly agree on length/performance standards.
I'll try this club3d cable, and if I still get issues I'll try an optical cable, and if I still have issues after that I'll just live with it.
Maybe I should just man up and move my tower on top of my desk.
The club3d cable did not resolve my issue with bf2042 locking up for half a second every 8 seconds or so :(.
I'm making major assumptions that it's the HDMI cable causing this however, I just can't think of what it would be.
Maybe I'll try an optical cable.
 
Maybe the years of having 'vivid' set on my TVs and phones have derailed what I think a screen should look like...
Or maybe you are brave enough to admit what most people won't, and claim that they are after the "creator's intent" when all we really want is pop and lots of it :)
 
Or maybe you are brave enough to admit what most people won't, and claim that they are after the "creator's intent" when all we really want is pop and lots of it :)

Hell yeah I want that pop! I crank up the saturation values when using RTX HDR and I could care less what the creator intent is :D
 
Wanting a picture that pops is equivalent to using ketchup on a fine dish. Feel free to do so, but don't complain when more informed folks give you the weird looks. :D
 
I turn the saturation up a little in my LG OLED's game mode in its OSD for SDR games, since I found it a little dull compared to the other, non-game picture modes. Then I use Reshade to turn it back down to where I like it on a per-game basis, along with tweaking a few other parameters. When I launch an HDR game and HDR kicks in, it uses its own HDR game picture mode, so the saturation slider I adjusted higher only affects the SDR version of game mode. HDR games don't need the saturation boost in my experience; the HDR range is very colorful. Some games have their own saturation slider in the HDR settings of the game though, so it depends how you look at it.

More SDR games are able to use Win 11 AutoHDR, and now Nvidia RTX HDR, so their appearance is enhanced compared to the default SDR game mode.
 
Wanting a picture that pops is equivalent to using ketchup on a fine dish. Feel free to do so, but don't complain when more informed folks give you the weird looks. :D

As if we care lol. Image quality is personal preference, just like sound. Some want that bass boost cranked up, others do not. I bet you're also the type of person who would never use the motion interpolation feature on your TV and whine about soap opera effect, and you know what? That's totally fine, because it's your preference; not everyone has to like what you like. It doesn't make us any less "informed". Plenty of people here love using wide gamut mode for playing SDR games. To me, playing SDR games with their default muted colors that are considered "accurate" is like eating your food plain and without any seasoning.
 
As if we care lol. Image quality is personal preference, just like sound. Some want that bass boost cranked up, others do not. I bet you're also the type of person who would never use the motion interpolation feature on your TV and whine about soap opera effect, and you know what? That's totally fine, because it's your preference; not everyone has to like what you like. It doesn't make us any less "informed". Plenty of people here love using wide gamut mode for playing SDR games. To me, playing SDR games with their default muted colors that are considered "accurate" is like eating your food plain and without any seasoning.

You are aware but - just an aside since you mentioned it. OLED is so fast you pretty much have to use some level of interpolation or low framerate video playback will look stuttered in some scenes. LCDs are so slow they hide low framerate media/video choppiness in transition blur/smear of their slower response time.

I'm very interested in seeing how Samsung's new AI upscaling chip performs in the 8K 900D whenever some real reviews start coming out. It supposedly does great things for motion, like the ball in different sports content. It's a FALD LCD so it won't have the same stutter look as an OLED on lower frame rate, panning, etc. content, but theoretically they could someday put the same chip in more of the gaming TVs and monitors, hopefully even future QD-OLEDs.

. . .

From what I've been reading, Nvidia's RTX HDR "autoHDR" can cause an appreciable performance hit. I've heard 6% to 10% compared to other "HDR injector" methods (some people report up to 20%, but I haven't seen any confirmation of that). Considering that, some people are using Special K (accurate to 480 nit) or the Lilium Reshade filter to do their own 'autoHDR'.


Special K HDR Retrofit:
https://wiki.special-k.info/en/HDR/Retrofit

Special K vs. Native HDR

View: https://www.youtube.com/embed/p7J1KnTPa_c?autoplay=1

Excerpt from the video. Unless special K has changed since, this 480nit limit on most games might apply:
"I've tuned these values not to give you the most contrast or to give you the most peak brightness possible but to more accurately match the native HDR presentations in terms of average picture level, contrast, saturation, black levels, and leaving the peak brightness to wherever those sliders leave the peak brightness to - in this case, it's about 480nits. There is not much you can do about this currently with special K. This is the brightness you are limited at. It is still significantly higher than it will ever be in SDR if you're watching SDR in a reference grade environment - and it gives you some little fine tuning adjustments if you want a more punchy image or if you want a more contrasty, less saturated . . whatever you want the image."

"To go over it again, if you were to play a game like Farcry 3 which doesn't have a HDR presentation at all , without having to guesswork where to slide the sliders to - you can just use these pin values and know in the back of your mind that 'if this game had a HDR presentation, this is roughly what it would look like'. "

"There are some limitations with special K currently. Special K currently does not allow you to have a peak brightness whilst retaining the average picture level as dim as it should be, past ~ 480-ish nits and this is just a limitation of how the tone mapper and such works. There are some edge cases or different examples for example Halo Infinite - because the game's native SDR presentation has pixels that exceed 255 RGB value it goes past that and special K can extract this information when you inject it in and it will present it in a brighter format. Halo infinite with these settings goes above 1000nits whereas most games where I can't get that extra information will cap at around 480. You can go past this, obviously the slider is there you can do whatever you want. However for a reference image, the settings in the description are what you see on screen"


. .
Reshade Lilium HDR
(have to disable autoHDR in windows to use this method)
You install ReShade with full add-on support (his shaders are included in the ReShade installer nowadays; you just have to check them when ReShade prompts you to download shaders), then you download his add-on from GitHub. It's 2 files, autohdr32 and autohdr64; choose the one that's right for the game, depending on whether it's a 32-bit or 64-bit game, and put it next to the game exe. Launch the game, go to the add-on options in ReShade, click on AutoHDR, check the 'use hdr' box, and optionally "use scRGB" - it's a bit better quality of colors and banding, but doesn't work with frame gen games. Restart the game, find Lilium's inverse tone mapping, and enable it. Then you'll have to manually set peak brightness, tone mapping method, and gamma method. If the game looks too dark, it's probably linear gamma; otherwise use gamma 2.2 or sRGB.
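The "too dark? probably linear gamma" rule of thumb exists because the three gamma options disagree the most in the shadows, so a wrong choice shows up as crushed or washed-out dark areas. A quick sketch of the three decodes (my own illustration, not the shader's code):

```python
def srgb_to_linear(c):
    # piecewise sRGB decode (IEC 61966-2-1), c in [0, 1]
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    # pure power-law 2.2 decode; darker than sRGB near black
    return c ** 2.2

def linear_passthrough(c):
    # the game already wrote linear values; no decode needed
    return c
```

At an input of 0.1, sRGB gives ~0.0100, gamma 2.2 gives ~0.0063, and linear stays at 0.1 - so decoding linear content with a gamma curve pushes it toward black, which matches the "game looks too dark" symptom.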

Then load Lilium’s hdr analyzer after inverse (load order matters) to check if peak nits are right, blacks not crushed or raised. You can set the MAX_CLL in the Ilium filter to what the game is trying to output (shown in the hdr analyzer in the top left of screen sort of like a fps meter), then change the target peak luminance in the filter to the peak nit of your screen (e.g. around 800nit on the older OLEDs).

In some games it messes things up. I think it's because in scRGB mode it's trying to remaster 8-bit into 16-bit, and that doesn't go well sometimes. In Last Epoch, for example, HDR10 works fine, but with scRGB some UI elements disappear.

You can also try Special K. It's also very good and works almost the same, but remastering 8-bit, 10-bit, 11-bit comes as an option for DX11 games, for both HDR10 and scRGB. So it might work better. It also has built-in features to reduce latency, so it might help with stuttering.

PlasmaTVforGaming YouTube channel: SDR enhancement info about the way he uses Reshade in SDR games on an HDR-capable screen


View: https://www.youtube.com/watch?v=GXRLSc4soiY

HDR gaming on the PC keeps getting better. New Reshade and Lilium Shaders update. SDR HDR trick [Plasma TV for Gaming Channel, 5 mo. ago]

"We can get a nice HDR effect using Reshade for games that do not support native HDR, or AutoHDR from windows 11".
"Enable SDR black floor emulation. It's not losing any details. We can get a perfect black floor with a single click, you don't have to tweak settings up, down and do a lot of guesswork"
"There is a white point heading, max it out as bright as possible" (after following his earlier steps in the tutorial)
"SDR HDR Slider on Win11 86%(not 97 like I said, I didn't remember it correctly). This is to Get Maxcll 401 to use Mastering Peak Maxcll 400 tone mapping OFF on LG OLEDs. "


Reshade (download button at bottom of page) : https://reshade.me/

Lilium HDR filter repository: https://github.com/EndlesslyFlowering/ReShade_HDR_shaders

Direct download to latest version (2-24-2024) : https://github.com/EndlesslyFlowering/ReShade_HDR_shaders/releases/tag/2024.02.24

. . .

That, or you could use Windows Auto HDR on supported titles, which supposedly has less than a 1% performance hit. Win11 Auto HDR might even get a higher max CLL than 400-480. I don't know Windows AutoHDR's max accurate nits before it screws up the image (crushing black detail, clipping bright colors to white blobs, etc.), or Nvidia RTX HDR's for that matter. The Nvidia RTX HDR performance hit is probably less of an issue on older/easier-to-render games, but losing some fps might not be great when you are trying to scale the top end of your frame rate graph over 120fpsHz or 200fpsHz depending on your monitor, while trying to keep the bottom end of the graph from sinking down farther than it would without RTX HDR in demanding parts. Anyway, there are at least some options to get better than SDR using one of a few methods, and those suites of shaders usually provide more tweakability if you want to micromanage more.
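As a quick sanity check on what those percentages cost in practice (my numbers, purely arithmetic):

```python
def fps_after_hit(fps, hit_fraction):
    # e.g. a 10% RTX HDR hit at 120 fps leaves ~108 fps
    return fps * (1.0 - hit_fraction)

def frame_time_ms(fps):
    # the frame budget shrinks fast at high refresh rates
    return 1000.0 / fps
```

At 120 fps a 10% hit costs ~12 fps; at 200 fps a 6% hit costs ~12 fps too, but the per-frame budget there is only 5 ms, so there's less headroom to absorb it in demanding scenes.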
 
From what I've been reading, nvidia's RTX HDR "autoHDR" can cause an appreciable performance hit. I've heard 6% to 10% compared to other "hdr injector" methods (some people report up to 20% but I haven't seen any confirmation of that). Considering that, some people are using special K (accurate to 480nit), or Lilium Reshade filter to do their own 'autoHDR'.

Performance hit seems to vary. Could be based on how much the tensor cores are already being utilized prior to running RTX HDR, or just how fast the GPU is in general (say, a 2060 will suffer a greater performance hit than a 4090), but I'm not sure. Either way, this is a solution that is readily available at the push of a button (two actually, since it's Alt + F3) and it just WORKS. For years now I've grown tired of dealing with games that have broken HDR implementations or no implementation, and I can't be bothered to manually fix every single one of them every time. RTX HDR has provided me with what is essentially an instant fix for such scenarios and I am so glad it exists.
 
Performance hit seems to vary. Could be based on how much the tensor cores are already being utilized prior to running RTX HDR, or just how fast the GPU is in general (say, a 2060 will suffer a greater performance hit than a 4090), but I'm not sure. Either way, this is a solution that is readily available at the push of a button (two actually, since it's Alt + F3) and it just WORKS. For years now I've grown tired of dealing with games that have broken HDR implementations or no implementation, and I can't be bothered to manually fix every single one of them every time. RTX HDR has provided me with what is essentially an instant fix for such scenarios and I am so glad it exists.

The more options (like RTX HDR) the better, and it could get better performance eventually, and into the 5000 series. Hopefully DLSS, frame gen, and RTX HDR will all continue to mature across generations.
 
The club3d cable did not resolve my issue with bf2042 locking up for half a second every 8 seconds or so :(.
I'm making major assumptions that it's the HDMI cable causing this however, I just can't think of what it would be.
Maybe I'll try an optical cable.
For anyone wondering, I installed the beta version of the AMD driver and it resolved my screen freezing in BF2042. I wasted all that time swapping cables for nothing lol.
 
Do we know what brightness figures the C2 has with ASBL disabled? Does it actually get brighter, or does it just not dim as quickly?
 
Do we know what brightness figures the C2 has with ASBL disabled? Does it actually get brighter, or does it just not dim as quickly?
I have all of that disabled on my C2, I could not deal with the dimming. It is no brighter, it just remains constant.
 
The more options (like RTX HDR) the better, and it could get better performance eventually, and into the 5000 series. Hopefully DLSS, frame gen, and RTX HDR will all continue to mature across generations.
For sure. I'm kind of thinking of just using RTX HDR for everything. Gaming wise I mean. Closest to a one and done maybe. Have it set up similar to Plasma for Gaming's recommendations.
 
I updated my C2 42" to WebOS 23 and it could just be my imagination, but the input latency feels slightly worse. Maybe I need to reset the firmware and set it up again. :inpain:
 
If the 42 inch had a 1000R or more curve, I would have kept it. However, it appears that the LG 45 inch Ultrawide OLED monitor is better suited for my needs.
 
If the 42 inch had a 1000R or more curve, I would have kept it. However, it appears that the LG 45 inch Ultrawide OLED monitor is better suited for my needs.
I’ve seen that 3440x1440 stretched out to 45” in person and, ugh. I could never be happy with that sort of PPI.
 
I’ve seen that 3440x1440 stretched out to 45” in person and, ugh. I could never be happy with that sort of PPI.

The thing is, it is not stretched at all, it is a proper 21:9 resolution. Also, it actually looks fantastic on the desktop and in games. You just have to make sure not to sit on top of it. 😊
 
The thing is, it is not stretched at all, it is a proper 21:9 resolution. Also, it actually looks fantastic on the desktop and in games. You just have to make sure not to sit on top of it. 😊
Of course it's 21:9; that wasn't the concern. It's just so few pixels for the size, which is better suited to 34". 45" is the domain of the upcoming 5120x2160 UWs, IMO.
 
If the 42 inch had a 1000R or more curve, I would have kept it.
That would be the LG Flex. Last month I saw one in a store in Japan and it looked pretty great. Would like them to make an updated version that isn't just a "curvable C2 at 2-3x the price".
 
That would be the LG Flex. Last month I saw one in a store in Japan and it looked pretty great. Would like them to make an updated version that isn't just a "curvable C2 at 2-3x the price".

If I am understanding correctly, the LG Flex is simply a flexible version of the LG 45 Ultragear, which has a permanent 800R curve. (It is what I have.) If the C2 were curved, I would have kept it but, for me, a flat 42 inch unit was too large, overall, since I am 5ft 7 inches tall and therefore, my head does not come to the top of that screen.
 
The thing is, it is not stretched at all, it is a proper 21:9 resolution. Also, it actually looks fantastic on the desktop and in games. You just have to make sure not to sit on top of it. 😊
3440x1440 at 45" is 82 PPI. It's the same pixel density as a 27" 1920x1080 monitor.

3440x1440 is also not a "proper" 21:9 resolution. It's an approximation. 21:9 = 2.[3]:1, while 3440x1440 = 2.3[8]:1. The proper 21:9 resolution would be 3360x1440, just like it should be 5040x2160 instead of 5120x2160.
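Both figures are easy to verify (a sketch; the helper function is my own):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

# ppi(3440, 1440, 45) -> ~82.9, essentially the same as
# ppi(1920, 1080, 27) -> ~81.6
#
# Aspect ratios: 3440/1440 = 2.388..., 21/9 = 2.333...,
# while 3360/1440 equals 21/9 exactly (both reduce to 7:3).
```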
 
3440x1440 at 45" is 82 PPI. It's the same pixel density as a 27" 1920x1080 monitor.

3440x1440 is also not a "proper" 21:9 resolution. It's an approximation. 21:9 = 2.[3]:1, while 3440x1440 = 2.3[8]:1. The proper 21:9 resolution would be 3360x1440, just like it should be 5040x2160 instead of 5120x2160.

That would mean all the 21:9 monitors are not correct, but close enough. However, the PPI of the LG monitor is something you will not notice unless you are sitting on top of it. I sit back from it, and at first I was concerned I would not like the text sharpness, but I actually do not notice. I do run MacType, since it is an RWGB monitor, and that helps a lot, although it appears good on my Ubuntu install as well.
 
Oh yeah, I was thinking of the CORSAIR XENEON FLEX 45WQHD240, you are right.

Yeah, I should have started with the LG Flex 42 but it was more than I wanted to spend. I also purchased the LG 45 Ultragear at $1070 3 days ago and that made it worth it.
 
Hi guys, I'm having some difficulty with ultrawide resolutions on my 7800 XT and CX.

It won't display 2560x1080 properly (it stretches it) until I enable GPU scaling. I don't wanna use that as it's not as nice. What am I missing?
 
Hi guys, I'm having some difficulty with ultrawide resolutions on my 7800 XT and CX.

It won't display 2560x1080 properly (it stretches it) until I enable GPU scaling. I don't wanna use that as it's not as nice. What am I missing?
I believe you will need to use 2560x1440 not to have it stretch.
 
Hi guys, I'm having some difficulty with ultrawide resolutions on my 7800 XT and CX.

It won't display 2560x1080 properly (it stretches it) until I enable GPU scaling. I don't wanna use that as it's not as nice. What am I missing?
If you're trying to set an UW res for a 4K display, you'll want to stick to 3840x1600. That's mostly 21:9 and fits the width of the display natively.
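The 3840x1600 recommendation falls out of simple panel arithmetic (a sketch; the helper is my own):

```python
def uw_height(panel_width_px, aspect_w, aspect_h):
    # height that fills the panel's native width at a given aspect ratio
    return panel_width_px * aspect_h / aspect_w

# True 21:9 on a 3840-wide panel would be 3840 x ~1646; the standard mode rounds
# that to 3840x1600 (exactly 2.4:1), which still maps 1:1 onto the panel width.
# 2560x1080 matches neither panel dimension, so something has to scale it.
```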
 
If you're trying to set an UW res for a 4K display, you'll want to stick to 3840x1600. That's mostly 21:9 and fits the width of the display natively.

Yep, otherwise, after he sets the screen to a 21:9 aspect ratio, he will get black bars all around.
 