LG 48CX

..

Might even swap it out for a 7900 XTX if the raster performance is close enough, as I honestly can't tell a damn difference between RT on and off at a glance. I either need a direct side-by-side or I have to really analyze parts of the image to notice, and nobody plays games like that.

I know that shadows can make a huge difference. People used to turn them off or set them to low, perhaps more often in the past, but good global dynamic shadows can deliver an almost 3D ~ holographic look to a game. I notice that kind of aesthetic gain in a big way personally. It takes away the flatness of the scene and gives things depth, especially in large outdoor games with long view distances and animated objects (and shadows) both near and far, all the way out in the distance. So RT can add a very dynamic 3D interplay of shadows, highlights, and reflections in those types of games, depending on how well it's implemented.

I don't use RT due to the performance hit, but I can see how it could be valuable if the big performance drop weren't the tradeoff. The catch-22 right now might be that the games with frame rate to spare often aren't as detailed or as open-world, so the effect isn't as dramatic overall. Even less detailed, stylized RPG/adventure/MMO games can be very demanding with very high view distances and lots of animated objects in the distance. Games with very long view distances unlocked and a high number of animated objects and effects maxed out in the settings usually destroy frame rates outdoors. It could also depend to a degree on how well the dev designed the game. Still, even simple Minecraft-style/pixelated games look cool with the dynamic lighting of ray tracing, so I definitely see the benefit outside of the frame rate hit.

. . .

Even dynamic shadows on high+ without ray tracing, in large outdoor games with very far view distances and a large number of animated objects in the distance, can deliver a more 3D/holographic feel - so RT could do that and better, with highlight/reflection/shadow dynamism. You might not notice it as much in a corridor shooter or some demolition-derby arena shooter. AC Valhalla or Odyssey in HDR should benefit from RT though, as they are large adventure games with long view distances (depending on your settings and where you are in the game worlds).
 

Well don't get me wrong, I do think RT is cool and that it's the future of gaming. I think the main issue right now is that RT in just about every game is limited to only certain lighting. It's not like, say, going from Portal to Portal RTX, where the entire game is path traced and the difference in visuals is immediately obvious and huge. Turning RT from off to the maxed-out preset in games like CP2077, Control, Dying Light 2, etc., there is a difference in visual fidelity, sure, but it's far more subtle and doesn't make an immediate day-and-night difference like Quake II RTX and Portal RTX do, IMO. And yeah, cutting my fps by ~40-50% on average also isn't a good proposition for turning it on, since I'm turning what is an over-120fps experience into an 80fps one, on a $1599+ GPU to boot. The performance hit for turning on RT is still pretty big even on Lovelace; in fact, in some games it's no better than Ampere, in that both architectures lose the same % of performance when turning RT on.

[Attached performance charts]



I was hoping that by the 3rd generation of RTX GPUs we could turn on RT without cutting fps by up to 66%. Hell, even the FIRST gen RTX GPU, the 2080 Ti, isn't that far off from the latest and greatest Lovelace when it comes to % performance loss from turning on RT, which is kinda sad actually. It seems like Nvidia isn't really making these RT cores more efficient at their jobs at all; instead we're just getting more brute force to push higher frame rates in general. Until either games look massively upgraded with RT on, or the performance hit is down below 20% in even the heaviest RT implementations, I'd prefer to just leave it off.
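For what it's worth, thinking of the hit as added frametime instead of a percentage makes the comparison a bit clearer, since a fixed ms cost looks like a bigger % loss the higher your base frame rate is. Rough arithmetic with the ~120 -> 80 fps example above (my own numbers, nothing measured):

```python
def rt_cost_ms(fps_off: float, fps_on: float) -> float:
    """Extra milliseconds of frametime the RT workload adds."""
    return 1000.0 / fps_on - 1000.0 / fps_off

# ~120 fps raster dropping to ~80 fps with RT on (the case above)
cost = rt_cost_ms(120, 80)
print(f"RT adds about {cost:.1f} ms per frame")  # ~4.2 ms

# The same ~4.2 ms added to a 60 fps baseline only reads as a ~20% loss
print(f"60 fps baseline -> {1000.0 / (1000.0 / 60 + cost):.0f} fps with RT")  # ~48 fps
```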
 
It sounds like it's going to take a dev shift for RTX, but probably also for the other thing we'd talked about: instead of Nvidia guessing vectors based on a buffered image, the games themselves would report the vectors of objects, backgrounds, cameras, etc., while the OS/drivers would report those of the peripherals. I believe VR dev already does this, but VR game devs expect that going in, whereas pancake-screen devs adopting it might be slow. The VR frame projection / spacewarp stuff is based on in-game virtual object, background, camera, etc. vectors but also the headset, where on a PC desktop it would be the mouse and keyboard or gamepad input (mouse-looking, movement keying, controller panning).

They have to up their game, literally:

- HDR dev (maybe even Dolby Vision)
- RTX dev
- Frame amplification dev (based on actual reported game-engine and peripheral/OS driver vectors projecting the next frame, rather than Nvidia's "look at the next frame and guess the vectors from the difference in the two images" method)
- Surround sound / Atmos

I'd also like it if they put gaming AI-upscaling hardware on the displays themselves, if possible, to avoid the port + cable bottleneck, even if just 4k to 8k on future 8k screens (or their ultrawide resolutions when gaming).
 

I'm just wondering why everyone believes Nvidia has been making improvements to RT performance every gen when the numbers don't line up with that. Let's say the upcoming 4070 Ti is supposed to be on par with the 3090 Ti in raster but faster in ray tracing because of its 3rd gen RT cores vs 2nd gen, and both cards get 100fps with RT disabled. If the 3090 Ti turns on RT and loses 40% performance, dropping to 60fps, then the 4070 Ti should suffer a much smaller penalty for turning RT on and lose, say, 25%, but I'm pretty sure it's also going to lose 40% and be right in the same ballpark as the 3090 Ti in RT as well. So how exactly is Nvidia improving things on the RT performance front? Seems like all they did was make more powerful GPUs in general. Anyways, I'm getting off topic here so I'll end my rant about RT performance. Seeing the raster performance of the 4090 though really makes me pray for a 4K 160+Hz OLED next year. CES 2023 maybe?
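To put that hypothetical in plain numbers (made-up figures from the example above, not benchmarks):

```python
# Hypothetical figures from the example above, not benchmark results.
rt_penalty = {
    "3090 Ti (2nd gen RT cores)": 0.40,               # fraction of fps lost with RT on
    "4070 Ti (what I expect)": 0.40,
    "4070 Ti (if the RT cores truly improved)": 0.25,
}
raster_fps = 100
for card, penalty in rt_penalty.items():
    print(f"{card}: {raster_fps} fps raster -> {raster_fps * (1 - penalty):.0f} fps with RT")
```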
 

[Attachment: HZD 4090.png]
If we look at a purely path traced game (Quake 2 RTX) you start to see where the performance gains are. Unfortunately these are rarely seen in tests.

I don't have exact numbers, but if I set a target framerate of 144 fps and let Quake 2 RTX use dynamic resolution scaling to maintain it, my 2080 Ti would be running at sub-1080p resolutions, probably closer to 720p if not even lower. My 4090 runs at around 2880x1620 so 75% of 4K.
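As a rough sanity check, if we assume the path-tracing cost scales roughly with pixel count at the same frame rate target (only an approximation), those resolutions imply something like:

```python
# Rough resolutions from above while both cards hold ~144 fps in Quake 2 RTX
# via dynamic resolution scaling -- not measured values.
px_2080ti = 1280 * 720   # "probably closer to 720p if not even lower"
px_4090 = 2880 * 1620    # ~75% of 4K per axis
print(f"~{px_4090 / px_2080ti:.1f}x more pixels path traced per frame")  # ~5.1x
```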

My guess is that even if the percentages for framerate drops look the same, just the fact that the 4090 runs RT games at massively higher framerates means it has to process the RT effects faster because they are going to take a good chunk of frametime.

As another empirical example, with the 2080 Ti I pretty much turned RT off in Shadow of the Tomb Raider because it was a good way to reach better framerates. With my 4090 I can turn off DLSS, turn RT effects to max and still maintain around 200 fps at 4K!

TechPowerUp has really done a disservice by not publishing minimum framerates too. I tried my 4090 with my old Ryzen 3700X and then replaced it with a 13600K system, and the minimum framerates improved massively while the maximums were not always that much higher, but the overall experience is so much smoother. Also note that a lot of TechPowerUp data uses a 5800X CPU, which is fine, but they have themselves tested that the 5800X3D or latest-gen AMD/Intel CPUs give up to 30% boosts in some games.
 

Yeah, but again that's kinda like just saying the 4090 is a faster card in general and that's why it does RT better, not that it does RT more efficiently. I'd prefer a GPU generation where I can turn on RT effects in games and see a minimal performance penalty for doing so. But it seems like we aren't going to head in that direction and will instead just brute force it, so that even with RT maxed out we can still get 200fps at 4K - but then if you turn RT off you'll get 400fps, and that reason for not using RT is still going to be there.
 
Frame amplification tech seems like a good way to at least double frame rates to start with. Perhaps even more in the future. That would have a huge effect on frame rates which could help RT be more easily digested.

. . . . . .

However this is how I understand it:

Imagine an animation flip book with a clear overlay sheet on each drawn page, and a blank page in between every drawn cell.

(Clear overlay) on Drawn Cell 1 + [Blank Cell 2] + Drawn Cell 3

Nvidia's frame generation form of frame amplification tech starts at the drawn Cell 1 page and then flips ahead two pages to the next drawn page, Cell 3. It then does its best AI thinking to guess what the vectors were on the 1st cell in order to figure out how to build a "tween" cell to get to the 3rd cell, and draws that best guess on the middle blank page based on its imagined/guessed vectors. This has a lot of trouble in a lot of things, especially 3rd-person adventure games or any game where the virtual camera moves around independently of the character, causing artifacts from bad guesses as the camera pathing makes some things in the scene remain motionless, or move at different rates relative to the FoV, even though they are technically moving (or moving at different speeds) in the game world itself. It also artifacts more the lower the foundation frame rate is (e.g. beneath 120fps), since that means larger time gaps between animation cells (a flip book with far fewer pages, so it transitions between fewer animation cell states, with more staggered/choppy animation cycles).

VR's spacewarp-style frame amplification tech instead has a clear page overlaid on top of the first page. The clear page has a z-axis grid with a number of arrows and rates for some of the vectors already written on it. VR systems and apps can use actual vector data, including rates, arcs, player inputs, etc., and then plot and draw measured results on the middle page going forward. So it's not completely guessing what the vectors are for everything in the scene, including the player's inputs and the effects of the virtual camera (or the player's head and hand motion in VR).
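A toy 1D sketch of the difference as I understand it (my own simplification - not how DLSS 3 frame generation or ASW is actually implemented):

```python
from dataclasses import dataclass

@dataclass
class ObjState:
    pos: float   # 1D position, purely for illustration
    vel: float   # velocity the engine actually knows (units per frame)

# "Flip ahead two pages and guess": needs BOTH rendered cells, infers the
# motion from the image difference, then draws the tween between them.
# The guess breaks when the camera, HUD or parallax move independently.
def interpolated_tween(pos_cell1: float, pos_cell3: float) -> float:
    guessed_vel = pos_cell3 - pos_cell1
    return pos_cell1 + 0.5 * guessed_vel

# Spacewarp-style reprojection: needs only the LAST rendered cell plus the
# vectors the engine/OS/peripherals report, so it extrapolates forward
# instead of waiting for (and buffering) the next real frame.
def reprojected_next(obj: ObjState, dt_frames: float = 1.0) -> float:
    return obj.pos + obj.vel * dt_frames

ball = ObjState(pos=10.0, vel=2.0)
print(interpolated_tween(10.0, 14.0))  # 12.0 -- tween guessed from two real frames
print(reprojected_next(ball))          # 12.0 -- projected from reported vectors
```

Same output in the ideal case, but the reprojected version never has to buffer a future frame or guess the motion, which is where the interpolation approach gets both its added latency and its artifacts.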

Unfortunately Nvidia went with the former, uninformed version, at least in the near term. Hopefully the whole industry will switch to the VR method at some point. That would require PC game devs to write their games using the same kind of VR tools and do the work to code for that. They have a lot of room to advance. Unfortunately progress is a lot slower on the PC and PC-display front than the VR front, though in aspects like PPD, VR is way "behind"/inferior and will continue to be for some time yet.

. . .

From blurbusters.com's forum replies (Mark R.):

reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it.

One interesting innovation of reprojection that has not yet been done is making sure the pre-reprojected framerate is above flicker fusion threshold. That causes stutters in reprojection to disappear!

For VR, We Already Have Hybrid Frame Rates In The Same Scene
----------------------------------------------------------------------------------------------------


For example on Oculus Rift 45fps to 90fps, sometimes certain things stutter (hand tracking 45fps) while the background scrolls smooth (90fps) via head turns.

But if we had 100fps reprojected to 500fps, then even physics objects like enemy movements would still be smooth looking, just simply more motionblurred (due to frametime persistence) than turns (due to reprojection-based frame rate amplification).

Not everything in the game world *needs* to run at the same framerate; if motion blur is acceptable for such movements.

Different things running at different frame rates on the same screen is very common with reprojection (Oculus Rift), which is ugly when some of the framerates are below stutter/flicker detection threshold.

But if all framerates could be guaranteed perfect framepaced triple-digit, then no stuttering is visible at all! Just different amounts of persistence motion blur (if using reprojection on a non-strobed display). This will be something I will write about in my sequel to the Frame Rate Amplification Article.

Hybrid Frame Rates Stop Being Ugly if 100fps Minimum + Well Framepaced + Sample And Hold
------------------------------------------------------------------------------------------------------------------------------------------


Hybrid frame rates will probably be common in future frame rate amplification technologies, and should no longer be verboten, as long as best practices are done:

(A) Low frame rates are acceptable for slow enemy movements, but keep it triple-digit to prevent stutter
(B) High frame rates are mandatory for fast movements (flick turns, pans, scrolls, fast flying objects, etc)
(C) If not possible, then add GPU motion blur effect selectively (e.g. fast flying rocket running at only 100 frames per second, it's acceptable to motionblur its trajectory to prevent stroboscopic stepping)

The frame rates of things like explosions could continue at 100fps to keep GPU load manageable, but things like turns (left/right) would use reprojection technology. The RTX 4090 should easily be capable of >500fps reprojection in Cyberpunk 2077 at the current 1 terabyte/sec memory bandwidth -- and this is a low lying apple just waiting to be milked by game developers!

In other words, don't use 45fps. Instead of 45fps-reproject-90fps, use min-100fps-reproject-anything-higher. Then frame rate amplification can be hybridized at different frame rates for different objects on the screen -- that becomes visually comfortable once every single object is running at triple-digit frame rates!

Technically, reprojection could in theory be created as a parallel-layer API running between the game and the graphics drivers, much like how VR APIs do it. Except it's overlaid on top of non-VR games.

One major problem occurs when you're doing this on strobed displays -- sudden double/multi-image effects -- and requires GPU motion blur effect (mandatory) to fix the image duplicating (akin to CRT 30fps at 60Hz). However, this isn't as big a problem for sample-and-hold displays.
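. . .

Back-of-the-napkin check on that ">500fps reprojection" figure (my assumptions: a 4K RGBA8 color buffer, one full read plus one full write per reprojected frame, and nothing else touching memory - real numbers would be lower):

```python
width, height, bytes_per_px = 3840, 2160, 4
frame_bytes = width * height * bytes_per_px   # ~33 MB per 4K RGBA8 frame
traffic = 2 * frame_bytes                     # read source + write output
bandwidth = 1.0e12                            # ~1 TB/s on a 4090
print(f"frame size: {frame_bytes / 1e6:.1f} MB")
print(f"memory-bound ceiling: {bandwidth / traffic:,.0f} reprojections per second")
```

Even with generous overhead for depth sampling and filtering, 500fps of pure reprojection is nowhere near a bandwidth wall.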
 
Decided to keep my 48CX and buy the new zowie 25" XL2566K instead for FPS. I need to find a monitor arm though that can reach ~70-80cm so I can put it in front of my CX and then move it out of the way when not using it. Anyone have suggestions?
 
I need to clean my 48CX after a move is there anything I need to be careful not to use on the screen?
 
You're not really supposed to use any cleaners on screens. Just use a dry microfiber cloth; if it's still dirty, get it a little damp, wipe off any spots, then go back to a dry microfiber cloth.

I've heard people say X cleaner is fine to use on screens, but I've never had anything on a screen a damp microfiber cloth can't get off. I wouldn't risk it if you don't have to.
 
I use Ecomoist cleaner (available in the UK; there is probably an NA equivalent) - spray it on the cloth then wipe. Alternatively, if you want to be really careful, dampen a cloth in distilled water (tap water has minerals and will streak; I wouldn't use it, especially if you have hard water). The LG manual says to use a dry cloth, but it's impossible to clean it with a dry cloth imo, and you don't want to rub hard when cleaning the screen.
 
^ Agree with those folks. There may be some screen cleaners that are safe to use, but I've rarely had to use anything more than filtered water + a microfiber cloth. Anything beyond that could be detrimental and/or a waste of $. Some of them could work well but make sure you read the fine print (and try water first because why not?).
 
I've gotten used to the 48 over the last 2 years so I don't see much reason to upgrade if the image quality and Hz are the same. Some would argue that the image quality would be a downgrade due to the 42's lower peak brightness vs the 48. The only thing that will get me to upgrade at this point is noticeably better image quality, like a QD-OLED, or a decent bump in Hz to something like 160Hz.

Yeah I think you're right. Still, the prices this Black Friday are so good (£700 right now in the UK). I still find the 48 slightly too big occasionally (like if I have to do a work call on Zoom and need to full-screen share), but the immersion is great at around ~1m away.

edit: I couldn't help myself, I pulled the trigger on the 42, that price is too good. I'll be trying out FPS games, which were my main problem on the 48, and report how much of an improvement it is.
 

FPS is epic on a big OLED unless you want a competitive edge for multiplayer then yeah I can see 48" being a little problematic. I'm enjoying Crysis Remastered on my CX with Auto HDR and maxed out RT on a 4090 and it's amazing.
 

[Attachment: 20221124_213225.jpg]
Yeah that’s why I keep thinking 42” will feel like a downgrade, for immersive gaming at least. Been playing cyberpunk with a 4090 on it at max ray tracing settings and it’s glorious. It really is a great size for gaming and watching movies, at about 1m away. It’s my fave monitor of all time and I’ve put thousands of hours of gaming and productivity work on it now.

BUT, I do find it a bit too large occasionally. I can’t see the full screen at 1m when playing FPS so I have to look around to see hud elements. I’m pretty used to it now (have 400 hours playing DRG on it for example) - but it can be annoying sometimes. I basically can’t full screen apps at all on it either, which is annoying 1% of the time for screen sharing at work.

Maybe I should invest in an ergotron HX so I can push it farther back when I need to. It’s sort of hard to justify a £250 arm though when the C2 is on sale for £700. 42 would be better ultimately as a pc monitor I think - would lose some immersion but it is a more practical size.

Okay, I convinced myself to cancel my 42" order and look out for a used HX arm :D.
 

When I moved, I was originally thinking of selling my CX 48", but it happened to be a great size for our small living room. I was actually happy to go back to a smaller LCD for desktop use, though. The CX was good, but I often wasn't using its full desktop, maybe just the bottom 2/3rds, sort of like a 3840x1600 display I suppose. Even at a 1m viewing distance I found it a bit large otherwise, though it was lovely for media and games. I think the 42" size would be more desktop friendly.

I tried putting the CX on a monitor arm initially but the problem is that it is of course limited by the depth of your desk. I bought a cheap floorstand for it instead and that's a more practical solution though not very adjustable. I figured that I am not very interested in moving the screen back and forth but will instead find a good spot and leave it there.

At this point there is not much reason to buy the C2 42" if you have a CX/C1/C2 48". It's pretty much the same thing but smaller. Wait for C3 to be unveiled to see if it brings anything new at least.

What I am really hoping LG would do is a fixed curve C3 that is not much more expensive than the flat C3. To me the motorized curve of the LG Flex is a stupid feature. You will play with it a bit and then probably park it on one setting. LG should just make a 42", 4K 240 Hz version of the 45GR95QE (the curved 45" 3440x1440 OLED).
 
Yup I'm just going to wait until we get something significantly better at 42". Not worth the side grade right now. Still have 0 burn in and only like 3-4 dead pixels on the edges at about 6.5k hours so far.
 

I expect the C3 to be exactly like how the C1 was to the CX: basically the same thing and not worth it over a heavily discounted C2. After that, in 2024, maybe we'll get the next big thing for WOLED, but I'm pretty sure LG's main focus for their TVs isn't the PC gaming/desktop crowd, so instead of giving us higher refresh rates or a curve, they'll probably be releasing that MLA panel that supposedly increases brightness significantly - and even that may not hit the C series, instead being limited to the larger G series TVs. I think we've reached the end of the road here for WOLED TVs, but hey, at least LG is actually making monitors now, starting with that 27" 240Hz, so hopefully we'll get a 32" 4K 144Hz+ WOLED monitor next year or in 2024 and never have to bother with using TVs as a monitor again.
 
That's kind of why I was hoping they would focus on things like increased refresh rate and a fixed curve model. It's hard to sell a C3 if it's just the same thing as the C2 but more expensive unless they delay release until C2 supplies are depleted.
 

Well, again, higher Hz and curved screens are features that appeal more to the PC crowd, and I don't think that's LG's main focus when it comes to their TV lineup, so I don't see them bothering with it. I'd be happy to be wrong, but I don't expect much out of the C3 that would interest us PC people who use our TVs as a monitor.
 
Does anyone know of any way to send a command to the PC whenever you press a certain button on the LG remote?

What I'm trying to achieve:
1. Hold button 2 to run the Emby shortcut.
2. TV runs Emby and notifies the PC (this is the part I'm trying to figure out).
3. PC runs a script to pause Wallpaper Engine.

Or the opposite:
1. Hold button 1 to return to the HDMI1 shortcut (PC).
2. TV goes back to PC mode and notifies the PC of that.
3. PC runs a script to unpause Wallpaper Engine.

Edit: Figured it out using Home Assistant (with LG webOS integration) + HASS.Agent.
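In case it helps anyone else, here's a rough sketch of what the PC-side script that HASS.Agent triggers could look like, assuming Wallpaper Engine's documented -control command-line switches and a default Steam install path (both assumptions - adjust for your setup). Home Assistant's webOS integration fires the automation when the TV's source/app changes, and HASS.Agent runs a command like this on the PC:

```python
import subprocess
import sys

# Assumed default Steam install path -- change to match your system.
WE_EXE = r"C:\Program Files (x86)\Steam\steamapps\common\wallpaper_engine\wallpaper64.exe"

def wallpaper_engine(action: str) -> None:
    """action is one of Wallpaper Engine's -control commands, e.g. 'pause' or 'play'."""
    subprocess.run([WE_EXE, "-control", action], check=False)

if __name__ == "__main__":
    # e.g. HASS.Agent command: python we_control.py pause
    wallpaper_engine(sys.argv[1] if len(sys.argv) > 1 else "pause")
```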
 

Want to share the details?
 

I've had the CX48 since launch. Just recently I bought the 42C2. You might laugh but I'm returning the 42 and keeping the 48. Don't get me wrong, the 42 is beautiful but dare I say....too small. Picture wise they are exactly the same.
 

That's why I skipped the 42C2 as well; the picture quality isn't an upgrade over the 48CX, and the size, while closer to the ideal 32" that I prefer, still isn't there, so it's yet another stopgap on the way to 32". What sucks is that as the screens get smaller, so does the brightness. The 42C2 is less bright than the 48CX, and that 27" WOLED monitor coming out is going to be even dimmer. So when a 32" finally lands, I'm not sure I'll even want to upgrade to it if the brightness is going to be significantly nerfed compared to my CX. We need some 4K QD-OLED options in the monitor space! Or for LG to stop making smaller WOLEDs lose brightness compared to larger ones.
 
I am seeing the same thing going from the 65" C9 in my living room to the CX 48". C9 seemed to get brighter overall.

I was definitely tempted by the Black Friday deals for the 42C2, but I knew that, like the CX 48", it would be less than ideal. I'm using a Samsung G70A 28" 4K 144 Hz as an interim solution and feel like a 32" 16:9 or a 40" 5120x2160 would be ideal for me, but since neither has great options regardless of tech, I'm hoping for some improvements in the 42-43" size.

I found the 48" size too tall for me and often ended up using the bottom 2/3rds of the display. 42" would probably have been just that little bit smaller to be appropriate for me.

If LG could make the 42" C2 with a higher refresh rate, brightness similar to the 48" CX, and a fixed 800-1000R curve, they would have a real winner, because a big part of why the CX 48" needed so much viewing distance is that seeing the sides and corners of the display is awkward when it's too close. A curve would bring them toward you and thus make the size easier to manage. I would probably buy the LG Flex if it wasn't literally just the C2 with a 3x price tag. Just too expensive for what it is.
 

I guess it's up to Asus to do what LG won't and slap a heatsink on their OLED monitors. They will be releasing their own 27" with a custom heatsink, it seems. So when a 32" eventually lands I would expect Asus to follow up with a heatsink option as well. On the PG42UQ the heatsink is pretty effective, but for some reason Asus decided to nerf brightness below the C2 once you go past a 25% window. Surely it can deliver higher brightness across the entire range and not just from 1-25%.

[Attached brightness comparison chart]
 
As the TV notifications have been alluding to for a week or two, LG finally released an updated firmware for the CX/GX series today: v04.40.70

[Web OS] 2020 LG OLED AI Latest Software _OLEDxxCXX_GXX_RXX_WXX_O20). Ver.04.40.70

change log:
1. The voice recognition feature has been improved.(China)
2. TBD
 

Just got this firmware update last night. It's great that LG has continued to give the CX support to this day but I think at this point we should expect nothing major with any future firmware updates. Most of the big issues were already fixed within the first few months anyway which is already more than most manufacturers would ever do.
 
Can anyone do me a favor and check something? So I've decided to make my CX a permanent BFI screen and what's happened is that with OLED Motion Pro (BFI) set to HIGH, my CX can now only output 78 nits fullfield with OLED light at 100 and 6500k whitepoint. I swore this thing could do just over 100 nits at the same settings when it was brand new so maybe it's degraded quite a bit after over 2 years of use. Wondering what kind of brightness output you guys are getting with BFI set to High, OLED light at 100, and whitepoint at 6500k. I would ask some of my friends who also own a CX but none of them have colorimeters to take readings.

EDIT: From TFTCentral: "The OLED Motion Pro feature has settings for off, low, medium, high and auto. We measured the off/on pattern as shown above in each mode. As you increase the setting the “on” period is shortened which can help improve the motion clarity further, but does also impact the brightness of the screen. We measured the following maximum brightness levels in each mode (with OLED light setting at 100) – low setting 216 cd/m2, medium 148 cd/m2 and high 119 cd/m2."

It definitely used to be able to do over 100 nits on the HIGH BFI mode before. Looks like after 2.5 years of use my CX can no longer output the same brightness levels with BFI and I'm capped to 78 nits on HIGH. Weird part is that if I turn BFI off and switch back to HDR it can still output 760 nits on the Windows HDR Calibration menu before clipping.
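For reference, measured against TFTCentral's number (assuming their review unit is representative of a new panel at the same settings):

```python
# TFTCentral's figure for a new CX (BFI "High", OLED light 100) vs. my reading.
review_unit_nits = 119.0
my_unit_nits = 78.0
print(f"shortfall vs. the review unit: {1 - my_unit_nits / review_unit_nits:.0%}")  # ~34%
```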
 

Just wondering, why would you want this?
Reduced brightness & contrast, no HDR, no VRR. Sure, your 4090 is stout, but is it locked 120fps ultra setting stout?
Legit question
 

I have my Acer X27 set for HDR gaming duty. The kind of games that I would even play on the CX with BFI are games that get no benefit from HDR, like Celeste and Dead Cells, and that are very easy to run. Think pixel-art games and non-demanding side scrollers. Seriously, would anyone even think about playing those kinds of games at 1000+ nits? Not ALL games benefit from HDR; some benefit much more from amazing motion clarity, and a BFI OLED is hard to beat in that regard.
 

MistaSparkul,

Maybe you could also try asking users like "jorimt" from the Blur Busters forums about their current CX nit levels.

I remember when I asked him about some of his OLED CX nit levels when using BFI mode, he confirmed the following: "60hz bfi reduces peak brightness as low as 60 nits, other higher frequencies up to 100 - 120hz have arround 116nits", referring to the CX's best-motion-quality BFI mode ("OLED Motion Pro" set to "High"). But that was in July 2020, so now, more than 2 years after that post, some CX peak brightness degradation would make sense I guess.

This is the link to the post I was referring to:

https://forums.blurbusters.com/viewtopic.php?f=2&t=7161&p=54567&hilit=3dfan#p54567
 
I think that sounds about right. I never measured it, but visually, switching between my calibrated 120-nit SDR brightness and BFI with OLED light at 100 gave a similar brightness level on the "medium" or "auto" setting. "High" to me was unusably dim even when the display was new. With most games I play being HDR capable these days, I have found no use for BFI.

My understanding is that these displays should have some reserve brightness so when the pixel refresh algorithm runs, your display should retain its brightness. How long, I don't know. I really can't say if my CX is dimmer after 2.5 years or not as I don't have anything for comparison.
 

I measured the brightness when it was brand new, and high BFI only dropped it by a few nits compared to my non-BFI 120-nit calibrated SDR setting, so this doesn't sound right at all. Perhaps there's a setting I forgot to change; I'll have to investigate more later.
 
Here's to hoping. If not, that's a pretty substantial drop in just a few years.
 