LG 48CX

I never said it was all about maximum image smoothness. I merely mentioned the fact that IPS still has less motion blur, as does VA.

Higher Hz screens that approach their peak Hz in the frame rate of the game do. 240Hz at 220fps+, 360Hz at 220fps - 330fps+, etc. 175fpsHz is marginal, and even that is only when at 175fps.

At the same FPS ranges, where the top 2/3 of their range hovers near 120fpsHz on demanding games (at 4k resolution), or more commonly at lower frame rate averages given how demanding 4k is and how high the graphics settings ceilings go at very high to ultra settings, all of the screens would show nearly the same sample-and-hold blur, except that the OLED would probably look slightly tighter due to its response time.

------------------------------

  • 60 fps at 1000 pixels/sec = 16.7ms persistence = 16.7 pixels of motion blur
  • 120 fps at 1000 pixels/sec = 8.3ms persistence = 8.3 pixels of motion blur
  • 240 fps at 1000 pixels/sec = 4.1ms persistence = 4.1 pixels of motion blur
  • 480 fps at 1000 pixels/sec = 2.1ms persistence = 2.1 pixels of motion blur
  • 1000 fps at 1000 pixels/sec = 1ms persistence = 1 pixel of motion blur
But you have to factor in response times in actual use, so those numbers run higher on LCD transitions, while OLED transitions are essentially instant.
Assumptions: The exact Blur Busters Law minimum is achieved only if pixel transitions are fully square-wave (0ms GtG) on fully-sharp sources (e.g. VR, computer graphics). Actual MPRT numbers can be higher. Slow GtG pixel response will increase numbers above the Blur Busters Law guaranteed minimum motion blur.
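If anyone wants to plug in their own numbers, here's a minimal Python sketch of that same Blur Busters Law math (my own illustration, assuming 0ms GtG and the 1000 pixels/sec panning speed used in the list above; values are rounded):

```python
# Minimal sketch of the Blur Busters Law numbers above.
# Assumes 0ms GtG, i.e. sample-and-hold persistence is the only blur source.

def persistence_ms(fps):
    """Frame persistence in milliseconds on a sample-and-hold display."""
    return 1000.0 / fps

def motion_blur_px(fps, speed_px_per_sec=1000):
    """Eye-tracking motion blur in pixels at a given panning speed."""
    return speed_px_per_sec * persistence_ms(fps) / 1000.0

for fps in (60, 120, 175, 240, 480, 1000):
    print(f"{fps:>4} fps @ 1000 px/sec: {persistence_ms(fps):4.1f} ms persistence "
          f"= {motion_blur_px(fps):4.1f} px of blur")
```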


 
Higher Hz screens that approach their peak Hz in the frame rate of the game do. 240Hz at 220fps+, 360Hz at 220fps - 330fps+, etc. 175fpsHz is marginal, and even that is only when at 175fps.
No, we started with me saying that my experience shows the IPS and VA smear less. Doesn't matter if you max out your fps. And it's not marginal but VERY noticeable.
 
No, we started with me saying that my experience shows the IPS and VA smear less. Doesn't matter if you max out your fps. And it's not marginal but VERY noticeable.

What? So are you saying an IPS or VA running at the same frame rate and refresh rate as an OLED will have less smearing? I don't see how that is even possible. And I seriously doubt the gsync module is the magic "smear be gone" device.
 
What? So are you saying an IPS or VA running at the same frame rate and refresh rate as an OLED will have less smearing? I don't see how that is even possible. And I seriously doubt the gsync module is the magic "smear be gone" device.
Agreed. Without supplying frame rates that near the peak Hz of higher and higher Hz displays, the higher Hz is pretty meaningless in regard to sample-and-hold blur, especially when using VRR no less.

No, we started with me saying that my experience shows the IPS and VA smear less. Doesn't matter if you max out your fps. And it's not marginal but VERY noticeable.

Motion clarity increases the closer the frame rate range gets to matching the peak Hz on higher Hz screens. That's why the Blur Busters graphic shows them all on a 1000Hz screen for reference: to show that the frame rate is the limiting factor once you have the higher Hz.

[Blur Busters chart: motion blur vs. frame rate on a 1000Hz reference display]

*at zero GtG response time, that is; variable response times on LCDs can add slightly to those pixel counts, though better gaming overdrive implementations reduce that factor.


..the motion clarity is increased a lot more in the neighborhood of 240fps on a 240Hz screen,
..(240fps to) ~300fps average+ nearing 360fps on a 360Hz screen, etc.
..what I said was that 175Hz at 175fps is a marginal increase (and only when at 175fps-ish ranges in a game) compared to those (240fpsHz, 360fpsHz), and that 175fps on a 175Hz screen is a marginal clarity increase vs 120Hz at 120fps with OLED response time factored in.
..240Hz with averages touching the ~240fps zone, and 360Hz at 240 to 360-ish fps graph averages, is VERY noticeable.


..a 480Hz monitor running 240fps will look the same motion clarity-wise vs sample-and-hold blur as a 240Hz monitor running 240fps, if all other factors and specs scale equally (fast enough response times, good enough overdrive implementations with the same results at both ranges, etc).
 
What? So are you saying an IPS or VA running at the same frame rate and refresh rate as an OLED will have less smearing? I don't see how that is even possible. And I seriously doubt the gsync module is the magic "smear be gone" device.
I was not saying that.
Motion clarity increases the closer the frame rate range gets to matching the peak Hz on higher Hz screens. That's why the Blur Busters graphic shows them all on a 1000Hz screen for reference: to show that the frame rate is the limiting factor once you have the higher Hz.
Probably the reason why 37.5" IPS 160hz is so much smoother than OLED 120hz, plus hardware gsync.
 
What? So are you saying an IPS or VA running at the same frame rate and refresh rate as an OLED will have less smearing? I don't see how that is even possible. And I seriously doubt the gsync module is the magic "smear be gone" device.
..
I was not saying that.

Probably the reason why 37.5" IPS 160hz is so much smoother than OLED 120hz, plus hardware gsync.

175Hz **AT** 175fps is marginally less sample-and-hold blur and a little more motion definition than 120fpsHz OLED ---> when the bulk of the frame rate graph's average is at or approaches that peak frame rate of 175fpsHz.

175Hz AT 175fps at ZERO GtG is 5.7ms persistence per frame, so 5.7 pixels of blur.
120Hz AT 120fps at ZERO GtG is 8.3ms persistence per frame, so 8.3 pixels of blur. OLED response time is essentially instantaneous, so there is no added response-time factor here.
175Hz AT 120fps at ZERO GtG is 8.3ms persistence per frame, so 8.3 pixels of blur, plus the relationship of the overdrive to the GtG and outside-of-GtG response times of the LCD, so in practice slightly more than that.

Between the two (175Hz when AT 175fps on LCD and 120Hz when AT 120fps on OLED), the raw persistence difference is 2.6px, but with LCD response time added on the 175fps side it's probably around ~1.5px of difference in practice, which is marginal and keeps both in a similar softened-blur range.
240Hz AT 240fps at zero GtG is only 4.1px of blur (plus LCD overdrive and response-time factors), which is more appreciable at around half the blur of 120Hz at 120fps.


175HZ at lower than 175 fps
--------------------------------------------
e.g. a 120fps or less frame rate graph:

..175Hz 4k at a 120fps-or-so average (or less) and a 4k 120Hz OLED at a 120fps-or-so average (or less) with VRR/g-sync will be at the same motion clarity and motion definition. The OLED might even look better due to the response time (perhaps considerably better depending on the quality, or lack thereof, of a given LCD model's overdrive and the nature of LCD GtG and outside-of-GtG transition response times).

...175hz at 120fps solid or so compared to a 120Hz at 120fps solid or so would also look similar outside of response times, overdrive, etc.


============================================

You could have a 1000HZ screen and it won't look appreciably better than a 4k 120hz OLED screen if both are running a demanding game with frame rate graphs that aren't hitting 120fps or going much over it.
.
Notice that the chart is referencing the FPS the source is capable of showing you, or the fps the game is running at - not the peak Hz.
You can figure out that you won't actually see those fps unless the game is running at those rates and the monitor is capable of enough Hz to match or exceed the FPS, so they use a 1000Hz and zero-GtG reference point.

e.g. increasing the depth of the pool (Hz) doesn't mean anything unless you are filling it with more water at a higher flow rate (frames/sec) - and only while you are, as this pool has a giant drain emptying it as fast as you fill it :ROFLMAO:

[Blur Busters chart: motion blur vs. frame rate on a 1000Hz reference display]
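Quick back-of-the-envelope sketch of that point (my own simplification, assuming 0ms GtG and VRR so frames aren't repeated): the lower of your frame rate and your refresh rate sets the blur floor.

```python
# Rough sketch: on a sample-and-hold display, the frame you actually see
# persists for 1/min(fps, Hz) of a second, so the lower number sets the
# blur floor (assumes 0ms GtG; LCD response time adds on top of this).
def blur_px(fps, refresh_hz, speed_px_per_sec=1000):
    effective = min(fps, refresh_hz)
    return speed_px_per_sec / effective

print(blur_px(120, 1000))  # 1000Hz panel fed 120fps -> ~8.3 px, same as...
print(blur_px(120, 120))   # ...a 120Hz panel fed 120fps -> ~8.3 px
print(blur_px(240, 480))   # 480Hz panel at 240fps -> ~4.2 px, same as a 240Hz panel at 240fps
```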
 
I was not saying that.

Probably the reason why 37.5" IPS 160hz is so much smoother than OLED 120hz, plus hardware gsync.

I cannot tell a difference in motion clarity between my Acer XB271HU @ 165hz (high end gaming IPS) vs my C1 @ 120hz. obviously the motion is slightly smoother because of the 45 extra hertz but I can't perceive any difference in the actual pixel response (ghosting, whatever you want to call it.) with the Acer also set to 120hz i really can't differentiate response times by eye between the two of them. there is no way anyone could say one was clearly smoother than the other at the same refresh rate.

also the input lag feels exactly the same, and i'm one who is extremely sensitive to it (i can literally tell when there is even one added frame of latency.)
 
That makes perfect sense. Thanks for that.

obviously the motion is slightly smoother because of the 45 extra hertz

Slight clarification on that part though - it is only on games or periods when you are running an 'extra' 45fps higher than 120fpsHz along with the increased peak Hz capability of 165Hz. That's why I keep using fpsHz instead of Hz. With lower frame rate graphs near or below 120fps there won't be any difference in motion clarity (blur reduction) or motion definition/smoothness. I know that is what you meant though when you said an extra 45Hz --> the extra 45fps shown on games that can go that high, nearing or surpassing 165fps at 165Hz in the bulk of their frame rate graph.

That's why the Blur Busters graphic uses a 1000Hz reference point. The fps capable of being shown to your eyeballs can be limited by the source (e.g. a console game's limit, a console port's limit, or an "under-powered" gpu vs. the resolution, e.g. 4k with game settings at or near the graphics ceiling) or by the peak Hz of the screen itself. It doesn't matter much if you have a 1000Hz screen if you aren't running near-1000fps frame rate graphs.
 
Anyone know what's been in the last 4ish updates other than 4k/120hz Dolby Vision on XSX? I just got another update today 4.30.55 and have no idea what's different.
 
Release notes a few pages back and/or via Korean mothership website translation. Nothing to write home about sadly.
 
my c1 really made me start thinking about entering the fray of finding a 30 series gpu... i was totally satisfied with my 2080 ti before but the 4k 120hz 10 bit now dangling in front of me is impossible to resist. ugh

at least my 2080 still has value in this insane market i guess
 
The DP1.4 to HDMI 2.1a adapters have come a long way from where they were a year plus ago.

I recently picked up the alternative to Club3D (which I used and returned a year or more ago)
and my aging RTX2060S now perfectly outputs 4K RGB VESA at 10/12bit and 120Hz with 10bit Temporal Dithering enabled.

Sadly it does nothing for banding, in fact making it worse for sub-1080p video material and many games which don't offer film grain/dithering booleans.

RTX4xx/5xx will be the next time I look at GPUs.
Perhaps by then it will be time to change MOBO to switch to some new GPU interface and CPU socket tech too.
 
Also have RTX2080 with C1. Use the USB-C adapter for HDMI 2.1. It does 4k 120Hz and 10bit. Only VRR doesn't work.
 
at least my 2080 still has value in this insane market i guess

The resale of my 2080 Ti paid for 73% of my 3090, before state tax. Insane market indeed. I do admit I was lucky as hell to snatch a 3090 at MSRP from Best Buy. I hesitated at the time, but who could predict pulling the trigger on a $1500 GPU would be a good financial decision? What an upside down world.
 
The resale of my 2080 Ti paid for 73% of my 3090, before state tax. Insane market indeed. I do admit I was lucky as hell to snatch a 3090 at MSRP from Best Buy. I hesitated at the time, but who could predict pulling the trigger on a $1500 GPU would be a good financial decision? What an upside down world.
Yeah, I also randomly found a 3090 in stock at Best Buy at MSRP soon after it launched and casually checked out! Now they are never in stock even if you use alerts. At the time I thought the price was insane over the 3080 but now it's the best purchase I made in the last 1.5 years.
 
I bought the Gigabyte Aorus external box with the 3090 AiO inside and shucked it (they don't run as fast in the box over TB cables as on the motherboard in a pci-e slot). You can still find the egpu units in stock pretty regularly on newegg, but their water lines are very short on the stock AiO. You can strip it and put a 3rd party AiO + bracket on it for longer lines though. The 3090 egpus aren't cheap, but they were available for just over $2k for quite some time. They've gone up in price several hundred dollars since then. I had a spare NZXT AiO bracket from a long time ago. Besides, they are just sheet metal; you can drill the bracket to fit any holes really, especially if you have a drill press. So the older NZXT bracket or the modern one will fit one way or the other. I just needed an AiO of the style with tabs/notches around a disc/puck heatsink to lock into the bracket's mounting tabs..

I still have the broken-down egpu case in the box it came in and could put some other gpu in it someday if I really wanted to. It ran ok shucked in my pc with the short lines and the stock AiO, but it wasn't an optimal config with the radiator sitting awkwardly in my case, versus going the whole 9 yards with new pads, paste, fans, AiO, etc.

In general, a lot of people choose to replace the thermal pads on the backplate and any on the front heatsink of their 3000 series gpus and then add one or more fans to the back because the memory junction runs hot. People also usually undervolt them slightly as it doesn't cause a performance hit (in some cases even slight gains).

The pics below are the general idea, including a few pictures from someone else who did the same mod on the 3090 egpu.
Mine is a different AiO (NZXT Kraken) with 3rd party fans swapped onto the radiator and different fans on the gpu pcb (I don't like the look of Noctua browns). I kept the huge copper plate front heatsink that covers all of the front components. That huge copper heatsink came with it so I might as well use it instead of putting tiny heatsink stacks on. I paid for it after all..

[images: 3090 eGPU AiO mod]




This is what it looks like right out of the case (from Linus tech tips review of it):

[images: Aorus RTX 3090 Gaming Box internals]



Pic of one with the heavy copper front heatsink and the rear black backplate taken off:

[images: front copper heatsink and backplate removed]
 
Quick question:
My Dell 3011 just died. So I have 3 choices:
1. 48c1,
2. something else,
3. wait for the 42c2.

Any advice would be welcome :)
 
Depends what you want to do with it. Personally I'm not a proponent of using them as a desktop monitor; I'd rather use them as a media/gaming "stage" - though you'll find plenty in the thread who disagree. I use side screens for static desktop/app stuff.

Samsung is coming out with all-blue OLEDs, so you might want to wait to see what that's about, as they are about 15% to 25% higher peak brightness and potentially less prone to burn-in since they get more output per energy state (or burn down the burn-in wear-evening buffer less rapidly, perhaps). Their smaller gaming monitor is 175Hz (and too tiny at only ~13" tall for my taste).. but the TVs with the same tech are going to be able to do 144Hz at 4k. Though if price is a big consideration, they will likely be expensive and the CX/C1's will be cheaper.. in some cases 1k or less. When the 42" comes out prices may shift too.




On the QD-OLED TV - he said they measured 1000 nits in a 1% patch and 1500 nits of color volume in a 3% patch:

[measurement screenshot]


Reports say they have increased saturation, so much more vivid greens and reds. An all-blue (high energy) OLED array is stepped down with a quantum dot layer to red and green.


https://www.theverge.com/2022/1/4/2...-34-inch-qd-oled-samsung-gaming-monitor-specs




The ~ 13" tall 3440x1440 alienware gaming monitor is peak 1000nit instead of 1500 as a safer desktop usage scenario. They are giving it a 3 year exchange policy including burn-in. I don't know if they will do the same with their 144Hz 4k tvs at the higher peaks.
 
I'm almost hoping QD-OLED won't be as good as I have been reading, as now I will want to wait for its prices to come down rather than buying the 42" C2..
 
there is always something better around the corner, or around 2 corners... marginally at least. Early adopters pay max price. Even the CX dropped to around $900 - $1000 a few times later after many bought it for $1500
 
I'm almost hoping QD-OLED won't be as good as I have been reading, as now I will want to wait for its prices to come down rather than buying the 42" C2..

Going from LCD to OLED = massive upgrade 1000% better
Going from LG C2 to Alienware QD-OLED = kind of an upgrade, sort of a side grade because of the aspect ratio and lower resolution.

Also we don't know how much the QD-OLEDs will cost yet, but they could be well over 2X as expensive. It's going to be years until they come down to current LG OLED prices.


If you're only willing to spend $1000 my recommendation is go with the LG C2, you could have a very long wait for QD-OLED.
Enjoy the C2 which is a massive upgrade over any LCD, and almost as good as the QD-OLED.
Don't suffer for years waiting for QD-OLED to come down to the price you want.
 
Going from LG C2 to Alienware QD-OLED = kind of an upgrade, sort of a side grade because of the aspect ratio and lower resolution.

Not bad advice, but just to be clear they are also going to release 4k 144Hz QD-OLED TVs. The alienware monitor is probably going to be overpriced for what it is by comparison, at least in my opinion, as long as you have the space vs. PPD of a larger screen. The TVs will be expensive though, sure. No word on whether they will have any smaller TVs (42" - 43" - 48"), at least that I've heard yet, but I'm hoping so.
 
I have been using an Acer X35 for about 5 years now, so an OLED will be a massive upgrade. So maybe the best upgrade path for me will be the C2 42" in the spring, and a QD-OLED (micro LED?) 5 years from now..
In reality I'm probably not going to get anything as I'm about to have two kids in daycare, so my money has gone bye bye for the next few years anyhow lol.
 
I have been using an Acer X35 for about 5 years now, so an OLED will be a massive upgrade. So maybe the best upgrade path for me will be the C2 42" in the spring, and a QD-OLED (micro LED?) 5 years from now..
In reality I'm probably not going to get anything as I'm about to have two kids in daycare, so my money has gone bye bye for the next few years anyhow lol.

Sounds like a now or never situation.
 
Quick question:
My Dell 3011 just died. So I have 3 choices:
1. 48c1,
2. something else,
3. wait for the 42c2.

Any advice would be welcome :)
Since you seem to need a monitor now, I would look at the 48" C1. The fascination with the 42" is mostly, I feel, due to people having smaller desks.
 
Since you seem to need a monitor now, I would look at the 48" C1. The fascination with the 42" is mostly, I feel, due to people having smaller desks.

It's not even that big of a difference in viewing distance whether you want to stay at a minimum of 60PPD, where text subsampling and AA are better able to compensate for the perceived pixel granularity at that distance, or keep closer to 80PPD for a more optimal viewing angle, a somewhat less aggressive pixel structure, and not having to lean quite as hard on AA.


60PPD 64 degree viewing angle
=======================

..technically a bit too close of a viewing angle (the periphery of the screen gets pushed out too far), but the pixel granularity will at least be low enough that subsampling and AA can compensate for the most part - at a performance hit.

98" 4k screen at ~ 68.5" away has the same PPD and viewing angle and looks the same as:

77" 4k screen at ~ 54" away (60PPD, 64deg viewing angle)

65" 4k screen at ~ 45" away

55" 4k screen at ~ 38.5" away

48" 4k screen at ~ 33.5" away

42" 4k screen at ~ 29" away

27" 4k screen at ~ 19" away

---------------------------------------------------------

80 PPD 48 deg viewing angle (optimal viewing angle is typically 45 - 55 deg)
========================================================

..reduced pixel granularity, so you can probably get away with more moderate AA, and text (with tweaked subsampling) will look a little better.
..until we get to something like 150PPD+, the pixels won't appear fine enough to stop relying on AA and subsampling. However the gpu demand of that resolution gain (8k+) would counteract it anyway, losing motion clarity and motion definition aesthetics, so you're probably better off using an optimal PPD on a 4k screen along with AA and text subsampling for the next few years (though using an 8k screen on the side for desktop/apps would be good).

98" 4k screen at ~ 96" away has the same PPD and viewing angle and looks the same as:

77" 4k screen at ~ 75.5" away (80PPD, 48deg viewing angle)

65" 4k screen at ~ 64" away

55" 4k screen at ~ 54" away

48" 4k screen at ~ 47" away

42" 4k screen at ~ 41" away

27" 4k screen at ~ 26.5" away

-------------------------------------------------------------

You can see the 80PPD point (on a 4k screen) is where the screen diagonal measurement and the viewing distance make what is more or less an equilateral triangle or pyramid cone with your viewing angle. The view distance approaching the screen's diagonal is the neighborhood of the optimal viewing angle for anything with HUDs, notifications, pointers, text windows, etc. in my opinion, regardless of the PPD.

Coincidentally, a 48" 4k screen at ~47" - 48" away is a 48 degree viewing angle: 48" diagonal ~ 48" viewing distance ~ 48 deg.
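If you want to check other sizes/distances, here's a minimal Python sketch of the PPD / viewing angle math behind those lists (my own approximation, assuming a flat 16:9 4k panel and using the average pixels per degree across the horizontal field of view):

```python
import math

def screen_width_in(diagonal_in, aspect_w=16, aspect_h=9):
    """Horizontal width of a 16:9 panel from its diagonal, in inches."""
    return diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)

def viewing_angle_deg(diagonal_in, distance_in):
    """Horizontal viewing angle the screen subtends at a given distance."""
    w = screen_width_in(diagonal_in)
    return 2 * math.degrees(math.atan(w / (2 * distance_in)))

def ppd(diagonal_in, distance_in, horizontal_px=3840):
    """Average pixels per degree across the horizontal field of view."""
    return horizontal_px / viewing_angle_deg(diagonal_in, distance_in)

for size, dist in [(48, 33.5), (42, 29), (48, 47), (42, 41)]:
    print(f'{size}" 4k at {dist}" away: ~{ppd(size, dist):.0f} PPD, '
          f'~{viewing_angle_deg(size, dist):.0f} deg viewing angle')
```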

The thing is, if you were trying to use a 42" at 80PPD you'd actually be saving 6", so you'd be able to set it up at a 41" viewing distance.
However it's more likely that people choosing a 42" over a 48" don't have the space for a 41" viewing distance on the 42" screen or 47" on the 48" screen to start with, so they are going to use 60PPD, which is only a ~4" difference: 33.5" vs 29.3" distance. I doubt their desk setup is going to be able to achieve a 29.3" distance where it can't do 33.5", but maybe there are a few setups within that 4" threshold. heh... That, or they are going to try to use it below 60PPD, where the pixel structure will be too aggressive for AA and text subsampling to operate effectively without compromised fidelity, let alone desktop content that can't benefit from 3d-mode AA and text subsampling... and the viewing angle would be poor.

Say you had a desk that only let you sit ~29" away max. You'd be at 60PPD on the 42" and 54PPD on the 48" at that distance. So there the 42" screen would make sense to me.
If you could get a TV stand or other surface to put the screen on, I suspect you could add the other 4" of view distance so the 48" screen would look exactly the same as the 42": same PPD, viewing angle, and perceived screen size from your perspective.

If you had a desk setup that let you sit 41" away for a 42" 4k screen to have 80PPD, I bet you could manage the extra 6" to 47" view distance for a 48" screen to be at 80PPD too. There are some setups that are probably cutting it close and so could benefit from a 6" difference though.

The 42" also might change things and create different options for setups using multiple screens.
 
Whether it be the C2 or some QD-OLED, I'm committed to updating this year. I'll take some pictures of my B6 to show what extremely heavy use as a monitor over 5+ years, including two years of constant WFH, has done to the panel.

EDIT

NVM, forgot how hard it is to capture a 55" screen on a tiny phone without losing all the detail. :/
 
I'm definitely interested in Samsung's consumer QD-OLED solution. Hopefully it's going to be great!
 
I'm definitely interested in Samsung's consumer QD-OLED solution. Hopefully it's going to be great!
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.
 
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.
What do you run your OLED light level at? And do you move your windows around periodically or leave them in the same position all day when WFH?

Either way, the good news is that their panel tech has been revised to be more robust since the 6 series, so your next one should fare better.
 
Pending reviews. At this point, we know what we're getting from LG. But if Samsung offers the same or better at a competitive price and there aren't any of the SW pains that LG went through, then I would consider them a perfectly viable alternative.

We'll see in a few months. I'm certainly upgrading this year come hell or high water; my B6 is "really" starting to wear at this point.

It will not be a competitive price because yields are low. And don't bet on Samsung not having any SW pains; in fact it might be even worse given what the Neo G9 experienced. Besides, Samsung will not be offering QD-OLED in a 4K 42" form factor, just 34" ultrawide, 55", and 65" sizes, so they are already not even competing in the same display size category.
 
It will not be a competitive price because yields are low. And don't bet on Samsung not having any SW pains; in fact it might be even worse given what the Neo G9 experienced. Besides, Samsung will not be offering QD-OLED in a 4K 42" form factor, just 34" ultrawide, 55", and 65" sizes, so they are already not even competing in the same display size category.
I am interested in the QD-OLED tech, but you just reminded me of why I like my LG CX so much more than my Samsung Q90R. The Q90R has a more natural picture to it, which I really like, but I HATE the post-launch support. There are still many unresolved software bugs and at least one feature (eARC) which was promised but never delivered. Compare that to the LG CX, which has gotten amazing post-purchase support with continual improvements and feature adds, and it's no contest; if my LG CX quit working after the warranty expired, I'd probably buy the newest model of the same product only because I know LG is committed to their products.
 
The same bad track record can be said of Sony TVs' support, lack of promised features, and their proprietary bent. LG really stands out.
 
A friend that recently bought a Samsung TV told me that none of them support Dolby Vision (!!!). Doesn't necessarily matter right now for monitor use...but DV gaming in Windows is coming. They may also get with the times and add it to the QD-OLED TV's.
 
A friend that recently bought a Samsung TV told me that none of them support Dolby Vision (!!!).
you’re very very late to the party…

Btw: there isn't a monitor (not a TV) out there with Dolby Vision support, and there weren't any announcements at CES 2022 either.
Dolby Vision on monitors is not a thing, so it seems… unfortunately.
 
you’re very very late to the party…

Btw: there isn't a monitor (not a TV) out there with Dolby Vision support, and there weren't any announcements at CES 2022 either.
Dolby Vision on monitors is not a thing, so it seems… unfortunately.
The PA32UCG does support Dolby Vision input over HDMI.
 