LG 32GS95UE – OLED 31.5″ with 4K @ 240Hz and 1080p @ 480Hz Support - The death knell of LCD panels

I know you guys will blow me up for this, but when I had the C2 and PG42UQ side by side, the difference between glossy and matte, if I had to quantify it, was maybe a 5% improvement in clarity. In my opinion the glossy vs. matte thing gets really blown out of proportion (unless we're talking the Neo G8's Vaseline coating). I'll take a 1080p/480Hz button, 200+ nits higher highlight brightness, and far fewer firmware issues over a glossy screen.

This guy comes to the same conclusion:


https://youtu.be/iNIN9ud9PZU?si=JSK6dHtKUTlAfRxf&t=439

I just want that guy's hair, never mind the monitor.
 
Well, TFTCentral just reviewed the PG32UCDM and it looks like it no longer has any issues when it comes to HDR performance. I don't have access to the full review, so I'm just repeating what others have said about it, but so far it looks like the previous HDR issues aren't present on the PG32UCDM. Regardless, it's still locked to the same brightness levels as all the other QD OLED monitors, which is a huge turn-off for me. Anyway, at this point I'm seriously tired of waiting, so whichever model I can place an order on first, whether it's the LG WOLED, MSI QD OLED, or Asus QD OLED, I'm just going to buy it and move on from all this.
TFTCentral reviewed the PG42UQ and the 27" variant and found no issues with either. It was only once they were out in the wild, in customers' hands, that all the issues were discovered. Reviews are great for measurements, but I swear none of these guys actually use the displays.
 
Yeah, but 1080p at 32" is eye cancer, so I consider the 480Hz mode a moot point. All I want is a 32" 4K 240Hz FLAT OLED. QD OLED is preferred, but after seeing just how hard the brightness has been nerfed on these monitors, I'm thinking that's also a moot point. Who cares about color volume/saturation at higher brightness levels when the monitors can't even reach those brightness levels in the first place?

I'm a brightness whore and I think these QD-OLEDs are annoying in a bright room... but the Alienware 4K 240 is plenty bright in a dark room with HDR1000 on... Halo Infinite has been jaw-droppingly beautiful on it.
 
Yeah, but 1080p at 32" is eye cancer, so I consider the 480Hz mode a moot point.
Games that take advantage of 480Hz also don't require much graphical fidelity, so a lower resolution works fine. You can even use DSR/DLDSR or integer scaling to get better results.
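Just to illustrate what integer scaling does in this 1080p-on-a-4K-panel case (a minimal sketch, not tied to any vendor's implementation): each 1080p pixel is replicated into a 2x2 block, so a 1920x1080 frame maps exactly onto 3840x2160 with no interpolation blur.

```python
import numpy as np

def integer_scale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbour integer scaling: replicate each source pixel into a
    factor x factor block. With factor=2, 1920x1080 maps 1:4 onto 3840x2160."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)  # dummy RGB frame
print(integer_scale(frame_1080p).shape)                   # (2160, 3840, 3)
```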
 
Yeah, but 1080p at 32" is eye cancer
I figure for some older games where 4K doesn't work at all, or the UI gets too small due to no scaling, I'll make use of the 1080p mode. Might come in handy for some newer esports titles too, which have basic graphics, or for 2D games without 4K sprites.
 
The pixelation of 1080p on a 32" screen is going to look terrible because of how close the monitor is to your face. It's a small screen with an extremely low resolution. You will see physical blocks like Legos, and anti-aliasing can't save that amount of blockiness. If it was 1440p 360 and 4K 240 it would make more sense. I don't think I've used 1080p for 15-20 years now. There is NO WAY in hell I would be sold on anything having to do with 1080p. Hell, it's barely acceptable on a 7" phone, let alone a monitor, lol. They really dropped the ball here, because if it was 4K 360 and 1440p 240 it would have been awesome. (Burn-in is a whole different discussion for hardcore gaming, which is what this is for with that high refresh, and FPS games all have HUDs.)
 
I have a 32” monitor so I can see what integer scaling looks like at 1080p. I play a lot of pixel art and retro games, so I am pretty stoked for this mode, assuming there will be a way to get RetroArch to support it. May not be the same use case for everyone.
 
The pixelation of 1080p on a 32" screen is going to look terrible because of how close the monitor is to your face. It's a small screen with an extremely low resolution. You will see physical blocks like Legos, and anti-aliasing can't save that amount of blockiness. If it was 1440p 360 and 4K 240 it would make more sense. I don't think I've used 1080p for 15-20 years now. There is NO WAY in hell I would be sold on anything having to do with 1080p. Hell, it's barely acceptable on a 7" phone, let alone a monitor, lol. They really dropped the ball here, because if it was 4K 360 and 1440p 240 it would have been awesome. (Burn-in is a whole different discussion for hardcore gaming, which is what this is for with that high refresh, and FPS games all have HUDs.)
You seem to be obsessing over only the resolution while completely neglecting the refresh rate. Until 2026, when a potentially higher refresh rate OLED emerges, this year's and next year's 480Hz OLEDs will have the cleanest motion clarity outside of eye-destroying BFI implementations. I know it's hard to imagine, but there are real use cases for this, like 2D/retro/emulators, and I'd also like to try it in some 3D titles with DLDSR.
 
You seem to be obsessing over only the resolution while completely neglecting the refresh rate. Until 2026, when a potentially higher refresh rate OLED emerges, this year's and next year's 480Hz OLEDs will have the cleanest motion clarity outside of eye-destroying BFI implementations. I know it's hard to imagine, but there are real use cases for this, like 2D/retro/emulators, and I'd also like to try it in some 3D titles with DLDSR.

That's if DLDSR even works in 1080p mode. On the Alienware models, DLDSR doesn't work even when you use a resolution/Hz combo that does NOT require DSC. 1440p 360Hz requires DSC, but dropping down to 144Hz doesn't and so should allow you to run DLDSR, yet you can't. This LG may have the same issue, where even if 1080p 480Hz does not utilize DSC, Nvidia will still lock you out of DLDSR.
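For anyone curious why some of those combos need DSC in the first place, here's a rough back-of-the-envelope check against the DP 1.4 link budget. It's only a sketch: it ignores blanking intervals and protocol overhead, so modes that land near the limit could go either way in practice.

```python
DP14_PAYLOAD_GBPS = 25.92  # DP 1.4 HBR3, 4 lanes, after 8b/10b encoding

def raw_gbps(width: int, height: int, refresh_hz: int, bpc: int = 10) -> float:
    """Uncompressed RGB pixel data rate in Gbit/s (no blanking, no DSC)."""
    return width * height * refresh_hz * bpc * 3 / 1e9

for w, h, hz in [(2560, 1440, 360), (2560, 1440, 144), (1920, 1080, 480)]:
    print(f"{w}x{h}@{hz}Hz: ~{raw_gbps(w, h, hz):.1f} Gbit/s "
          f"(DP 1.4 payload ~{DP14_PAYLOAD_GBPS} Gbit/s)")
```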
 
Also, here is a new article on color volume that really outlines why this LG panel is inferior to its QD-OLED counterpart:

https://tftcentral.co.uk/articles/e...d-oled-and-the-need-for-new-metrics-and-specs

QD OLED is absolutely superior. But my issue with this article is that they are literally showing it only in the best case scenario which is 1% window size.

[attached chart from the article: 1% window comparison]


At 10% window size, where the QD OLED won't even go past 480 nits, the gap between the two is much smaller in certain colors, like green/cyan/yellow. Anyway, I think for SDR gaming use I would be fine with the LG WOLED, since I'm leaning towards getting a discounted Samsung S90D for HDR gaming instead, so I can have both high brightness AND good color luminance rather than having to pick and choose between the two when it comes to monitors.


EDIT: Reading further into the article, he shows the differences (or lack thereof) at larger APL sizes:

[attached chart from the article: comparisons at larger APL window sizes]


Looks like QD OLED's advantage diminishes as you increase APL; he also mentions this in the conclusion:

[attached screenshot: the article's conclusion]


So yeah, with QD OLED monitors being restricted this much, it looks like there isn't a huge difference compared to WOLED. TVs, on the other hand, should show a much larger gap.
 
That was a very good article, and while in theory QD-OLED has better color saturation, what does it mean in practice? Studios have been using LG's WOLEDs as reference monitors for years, so this theoretical disadvantage isn't enough to affect color accuracy and calibration against established standards.
If there are no differences for professionals, I doubt the average Joe who just uses his TV/monitor in Vivid/torch mode will be able to notice or appreciate the theoretical advantage QD-OLED has. It's mostly marketing.

The more significant differences lie in the specifications; for instance, there won't be a 480Hz QD-OLED this year, and WOLEDs typically only offer matte coatings. But both WOLED and QD-OLED remain very good choices for both power users and typical users.
 
Yeah, and I've always found this 1-2% window size peak brightness thing pretty pointless. You'll rarely if ever see that kind of scenario in real content, so the 10%+ windows are what matter most.
 
You seem to be obsessing over only the resolution while completely neglecting the refresh rate. Until 2026 when a potentially higher refresh rate OLED emerges, this and next years 480hz OLED's will have the cleanest motion clarity outside of eye destroying BFI implementations. I know its hard to imagine but there are real use cases for this like 2D/retro/emulators and I'd like to also try it in some 3D titles with DLDSR.
Actually, you seem to be obsessing over a 32" 480Hz 1080p screen, right? How small a niche is this audience, and who actually gives a fawk except for you and maybe a small handful of others? For a while there you were preaching that 120Hz OLEDs are the best and all this crap. Now you're here to talk about a 480Hz 1080p screen, ok bro lol.
This monitor isn't exciting to me in the least.

When a 240Hz 40"-plus 4K display is available that is fast and bright and doesn't burn in, then it's something to get excited about.
 
Actually, you seem to be obsessing over a 32" 480Hz 1080p screen, right? How small a niche is this audience, and who actually gives a fawk except for you and maybe a small handful of others? For a while there you were preaching that 120Hz OLEDs are the best and all this crap. Now you're here to talk about a 480Hz 1080p screen, ok bro lol.
This monitor isn't exciting to me in the least.

When a 240Hz 40"-plus 4K display is available that is fast and bright and doesn't burn in, then it's something to get excited about.
There are numerous people in this thread alone who have shown interest in the 480Hz mode.

As for the rest of your tirade, TLDR.
 
There are numerous people in this thread alone who have shown interest in the 480Hz mode.

As for the rest of your tirade, TLDR.
Lol tirade 😆

Like I said, a tiny, minuscule niche crowd. You just proved my point. :D
 
Interesting article from TFT, but it read like a Samsung-sponsored piece, in some cases representing worst-case scenarios. No mention of the black level issues in normal lighting conditions, the lack of a polarizing layer / reflections, or perceived contrast. Also, I thought the new pixel structure was RWGB, not RWBG, which at 32” 4K should mean perfectly sharp text going by close-ups from CES?
 
Interesting article from TFT, but it read like a Samsung-sponsored piece, in some cases representing worst-case scenarios. No mention of the black level issues in normal lighting conditions, the lack of a polarizing layer / reflections, or perceived contrast. Also, I thought the new pixel structure was RWGB, not RWBG, which at 32” 4K should mean perfectly sharp text going by close-ups from CES?
I thought the new structure went from RWGB to RGWB. Yeah, the article really reads like an advertisement for QD-OLED and spends very little time on the real-world implications, other than that single chart which everyone overlooks.

Here is a C2 vs. these new QD-OLEDs:


https://youtu.be/5yXcGZ9Jjo0?si=Nda0FE1261Km4_qc&t=3061

Lots of C2 owners have been returning the Alienware 32" because the C2 looks brighter. People underestimate how important that 10-25% highlight range is for real content. I'm still going to grab one of the 32" QD-OLEDs just to see for myself, side by side, how it compares to this LG, but I already know it's too dim, and I prefer overall brightness > situational color volume advantage.
 
I thought the new structure went from RWGB to RGWB. Yeah, the article really reads like an advertisement for QD-OLED and spends very little time on the real-world implications, other than that single chart which everyone overlooks.

Here is a C2 vs. these new QD-OLEDs:


https://youtu.be/5yXcGZ9Jjo0?si=Nda0FE1261Km4_qc&t=3061

Lots of C2 owners have been returning the Alienware 32" because the C2 looks brighter. People underestimate how important that 10-25% highlight range is for real content. I'm still going to grab one of the 32" QD-OLEDs just to see for myself, side by side, how it compares to this LG, but I already know it's too dim, and I prefer overall brightness > situational color volume advantage.


Looks like QD OLED TVs are where it's at. You can have your cake and eat it too with high brightness, minus the white subpixel diluting the colors. The monitors are neutered too much for the sake of longevity, which is a shame. I'm gonna see whether or not Samsung releases a 48-50" QD OLED TV next year, and if not then I'll look for a discounted S90D. I'm still going to pick up one of these 4K 240Hz OLEDs, but I'll just be using them for shooters.
 
Actually, you seem to be obsessing over a 32" 480Hz 1080p screen, right? How small a niche is this audience, and who actually gives a fawk except for you and maybe a small handful of others? For a while there you were preaching that 120Hz OLEDs are the best and all this crap. Now you're here to talk about a 480Hz 1080p screen, ok bro lol.
This monitor isn't exciting to me in the least.

When a 240Hz 40"-plus 4K display is available that is fast and bright and doesn't burn in, then it's something to get excited about.

I'm interested in seeing what the 65" Q900D 8K FALD screen can do, but its release price is going to be a little too expensive for my taste. Maybe it'll hit sub-$3k USD six months after release though. The 77" LG C3 4K was just (maybe still is) $2,000 at Walmart, and I think that tier used to be $3,300 or so. The Samsung Ark v2 with a discount has been down to $1,700, where at release it was like $3,300. The 8Ks stay pretty high priced though, since there's no real competition. I could get the previous year's flagship 900C for $2,100 + tax now (discounted price), but it's nearing the end of its cycle as the 900D is due out soon. When it came out it was like $4,500 though :rolleyes: . I'd probably be willing to pay $2,600 for a 65" 900D if it hits that point sooner than the end of its cycle with some sale + discount, as long as its reviews are pretty good.

The 900D supposedly can do 120Hz native 8K, and 240Hz 4K upscaled to 8K. It has a new, much faster AI upscaling chip that provides 8K detail on 4K material and handles motion of things like balls in sports, even small golf balls, much better than most TVs, i.e. high detail with no artifacting on the ball. But I don't know what gaming input lag there would be at 4K upscaled to 8K (in game mode, running at 240Hz). The 60Hz 8K 900C got 6.0ms input lag when running 4K at 120Hz, for reference, according to RTings' review. I'm also interested in what coating it has, whether it's the same as the 900C or hopefully better. I'm not interested in the very low resolution (1080p, 1440p) higher-Hz thing, but since a 60Hz 8K like the 900C could do 4K at 120Hz and 1440p at 144Hz, the 900D might even be able to do something higher than 240Hz at 1440p or 1080p (not necessarily, just saying there might be potential for that). There is also the issue that the 8K screens prior to the 900D reportedly upscale everything (to 8K, and to full screen) because DSC is always on; they don't have two Hz settings in their OSD, so there is no lower-Hz, non-DSC setting. However, the 57" 7680x2160 "4K doublewide" G95NC does have a 120Hz non-DSC setting in its OSD for compatibility, where you don't have to always run it at 240Hz + DSC, so who knows. Have to wait on some reviews. The only news I see so far is from people who saw the marketing for it and recorded some video at CES.
 

Great article you linked. That is a huge con/tradeoff to me. Unfortunately most of the FALD LCD screens have a matte abraded outer layer like that, and so do some of the OLEDs, especially OLED desktop monitors. If you use a screen with a matte abraded outer layer in the same dim-to-dark viewing conditions that an OLED works best in, some of those tradeoffs would at least be less prominent. Still, it's a shame there aren't more glossy options.

They always say the black depth is raised toward grey blacks and that the contrast is lowered, which is true and bad enough. However, in my experience a matte abraded/AG layer also loses the saturated look you get from that wet-ice, clear glossy screen. It can also create an over-layer look on the screen when the AG is "activated" by ambient or direct light, which can make you feel like you are looking through a plastic layer instead of right at the screen surface. An abraded screen layer can also affect clarity with its "sheen", a small degree of DSE (dirty screen effect), some say a "Vaseline sheen", when enough ambient light hits it. The light coming out from inside the panel is also being diffused somewhat by the abrasions.

From that tftcentral article you linked:
This “meter-to-panel” measurement approach only tells one part of the story though, as it is really capturing the maximum contrast ratio the panel is capable of. In real use you are sat much further away from the panel’s surface, and the way you perceive the image on the screen, and the black depth, and therefore the resulting contrast ratio, will be influenced by a number of variables. These include:

Variables impacting perceived black depth and contrast ratio
  • 1. Ambient lighting and brightness of your viewing environment
  • 2. The location and positioning of any light sources (i.e. lamps, lights and windows)
  • 3. The coating on the screen surface, which has a direct impact on how internal light from the panel/backlight, and external light from other sources is handled
  • 4. Your viewing position

Ambient lighting isn't just a static thing, either. People allow their ambient lighting conditions to shift and change over time: daylight to night, cloud cover/weather, seasons/time of year. Our eyes view everything relatively, so your parameters are going to change to your eyes/brain as your viewing environment changes, no matter what values they were technically set at. Just like a phone at 100% brightness might look dim in bright sunlight but way too bright in a dark bedroom at night. Every lighting change significantly away from where you set your baseline screen values will alter the way your eyes + brain see the screen, so your calibration or factory calibration + tweaking state goes "out the window". On my living room TV, I keep a number of different named picture settings in order to adjust for different room lighting levels. In the room where I keep my gaming rig, I just control the room lighting like a studio or home theater.

There are a lot of tradeoffs between different screens, especially FALD and OLEDs. I've been considering a FALD in the future to get some of the features that aren't available in OLED, namely 8K 120Hz that can do 4K 240Hz, but I'm also looking at different form factors only available in FALD LCD. If or when I ever try a modern FALD, the abraded outer layer would unfortunately be one of the biggest trade-offs to me coming from a glossy 4K 120Hz OLED.

=================================

This has been argued in hardforum threads many times. Here is how I see it.

====================

Think of it like a light haze on clear "dry" ice vs. ultra clear wet ice.

Direct light sources hitting a screen are going to pollute the screen surface regardless. Some (many?) people are using their screens in poor setups. It's just like audio or photography: you should set up your environment to suit your hardware/screen, not the other way around, imo.

Like I said, improper lighting conditions and room layouts that allow direct light sources to hit a screen surface are going to pollute the screen regardless, as shown in the images below.
[images: AG vs. glossy screen reflecting a direct light source (two shots); half-matte / half-glossy screen comparison]



Since desks have traditionally been laid out up against the wall, like a bookshelf or an upright piano with sheet music against a wall, most setups act like a catcher's mitt for direct light source pollution from behind and overhead. Professional/reference monitors often come with a hood that covers the top and some of the sides, like some production cameras have. Light pollution (as well as allowing lighting conditions to change throughout the day) will alter/pollute how even a calibrated screen's values are seen and perceived.

The direct light source vectors hitting matte or AG screens will blow out contrast and pale the saturation, washing out the areas of the screen they hit and are diffused onto. Allowing lighting conditions to change will also alter the way our eyes/brain perceive the screen's contrast and saturation, so even the "calibrated" values will be lost to your eyes and brain. E.g., the screen will look more pale, weakly contrasted and undersaturated the brighter the room gets, and vice versa. Some people keep several sets of settings so they can switch between them for different times of the day or different room lighting conditions. So you are going to get compromised results if you don't design your viewing environment more optimally, no matter what screen coating you have.

. . . . . . . . . .

From TFTCentral's review of the PG42UQ:

The PG42UQ features a more traditional monitor-like matte anti-glare coating, as opposed to a glossy panel coating like you’d find on TV’s including the LG C2. This does a very good job of reducing reflections and handling external light sources like windows and lamps and we noticed much better reflection handling (no surprise) than the LG C2. However this does mean that in some conditions the blacks do not look as deep or inky visually to the user. With this being an OLED panel, famous for its true blacks and amazing contrast ratio this could be considered a problem – are you “wasting” that by having an AG coating that reduces your perceived contrast?
.
In certain conditions blacks look a little more dark grey as the anti-reflective coating reflects some of the surrounding light back at you and it “dulls” the contrast a bit. The anti-glare coating means the image is not as clear and clean as a fully glossy coating. You don’t get this same effect if the coating is fully glossy as there’s no AG layer, but what you do get instead is more reflections. Don’t forget this same thing applies to all AG coated desktop monitors, you have the same impact on perceived black depth and contrast on IPS, TN Film and VA panels depending on your lighting conditions if there’s an AG coating used. You’d still get better relative blacks and contrast on the OLED (not to mention other benefits) compared with LCD technologies. They are all impacted in the same way by their coatings.

While they are concentrating on how it affects the blacks, which is bad enough, it can also degrade the color saturation as it creates a haze.

==========================

https://arstechnica.com/gadgets/202...-more-reflections/?comments=1&comments-page=1

Of course, you are seeing the picture below on whatever screen surface you are using at the moment, so it's more of a simulation.
[image: matte vs. glossy comparison]




. . . . . .

https://euro.dough.tech/blogs/news/matte-vs-glossy-gaming-monitors-technology-explained

[images: matte vs. glossy comparison shots from the Dough article]
 
Copied the comment from the link below in case you don't want to open it:

https://reddit.com/r/OLED_Gaming/co...d_brightness_woled_vs_qdoled_and_the/kql7u2n/

"Pretty good article. But in faith of better evaluating white/color luminance measurements, I feel the writer isn't fully considering how real-world content makes use of these colors and is overemphasizing QD-OLED's color luminance findings.

The frontrunning issue I have is that the writer is spearheading the display techs' 1% APL color measurements in his conclusions. The problem with this is that most users don't have any real intuition to what 1% APL looks like; they've covered it's an outdated term, and for good reason. In the case of a 1000-nit monitor at 1% APL, this translates to a frame-average light level of about 10 nits across the entire screen.

Most HDR10 movies have an average FALL (frame-average light level) of 10–20 nits throughout the entire movie. Note that for the most part, the large majority of pixel values in scenes are within the SDR domain, since the average light levels between SDR and HDR frames aren't all that different. Scenes that make use of more generous highlights will see FALLs trend toward 40–100 nits. This is why the industry standard is measuring HDR highlights at a 10% window, which covers a FALL up to 100 nits at 1000 nits mastering peak. And for most films, the MaxFALL is often 100 nits or greater. 5%, in my opinion, should be the bare minimum APL peak brightness value that should be advertised in good faith.

Another factor is that most people don't use their monitors in a reference setting. For many people, HDR10 content may look too dim, and that's because it's graded for a dim viewing environment of about 5 nits surround. On average, the content people consume on PC monitors are often of greater pixel level than reference-level films. That is to say, if reference-level HDR10 film is already running at a higher FALL than expected by these monitors outputting 1% APL, then it has very little practical application in using those luminance numbers.

When evaluating the display tech at 10% APL, they found the color luminance between WOLED and QD-OLED to be very closely matched. For most content, that's probably the condition most people will form a conclusion from when fairly evaluating both monitors in the same color-managed state. And holistically, when approaching 1-3% APL, these are often very dark scenes that make little-to-no use of high-chrominance colors, and QD-OLED's advantage here is mostly in test patterns. We see WOLED's color volume plummet in this scenario, but that's mostly because it has a much higher volumetric ceiling due to its much higher white luminance, though in reality the total color volume is actually very similar in magnitude (rather than when expressed as a percentage relative to boosted white).

Finally, in a reference setting, having a color gamut extend past DCI-P3 doesn't provide all that much value, since most content is still confined to within P3. But realistically, many users (like my whole family) do use their monitors in a saturation-boosting state, and that's where QD-OLED truly shines."
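The APL-to-FALL arithmetic that comment leans on is simple enough to sketch (assuming a single white window at peak brightness on an otherwise black frame):

```python
def frame_average_light_level(window_pct: float, peak_nits: float) -> float:
    """A window covering window_pct of a black frame at peak_nits averages out
    to window_pct/100 * peak_nits across the whole screen."""
    return window_pct / 100.0 * peak_nits

for pct in (1, 5, 10):
    print(f"{pct}% window at 1000 nits -> "
          f"~{frame_average_light_level(pct, 1000):.0f} nits FALL")
```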

"The volumetric "collapse" is a consequence of expressing the panel color volume as a percentage relative to ceiling white.

At 10% APL, the color luminance of WOLED and QD-OLED for those monitors are almost evenly matched. But since the white luminance for WOLED is greater, the modeled relative color volume is figuratively smaller. In terms of absolute volume, the WOLED in that scenario ends up having greater color volume from its white luminance extension. The color capabilities toward 1-3% APL are much greater on the QD-OLED, but at light levels that dark (<30 nits frame-average light level) most of those scenes don't exhibit high-chroma colors, especially not over 300 nits."
 
At 10% APL, the color luminance of WOLED and QD-OLED for those monitors are almost evenly matched. But since the white luminance for WOLED is greater, the modeled relative color volume is figuratively smaller. In terms of absolute volume, the WOLED in that scenario ends up having greater color volume from its white luminance extension.

I tend to agree that the difference at 1% is overstated, and that the 1% window is overvalued.

Here is the RTings blurb on how they test HDR peak brightness on screens.
Our peak window tests measure the maximum brightness of a white rectangle displayed on an area covering a certain percentage of the TV’s screen. This provides an idea of how bright a small highlight—the sun, a distant explosion, etc.—might look on-screen, but the larger areas can also represent very bright areas, like if you're watching something with a bright sky.
For this test, we display each of the five slides shown above, representing 2%, 10%, 25%, 50%, and 100% APL (Average Picture Level). We use a Konica Minolta LS-100 Luminance Meter connected to a PC to measure the luminance of the square of light over time. This allows us to see not only the peak brightness of the TV but also to capture the sustained brightness over time for the next test.

What it is: The maximum luminance of the TV, even if only maintained for a short time, of a white square covering 2% of the screen.


When it matters: Bright highlights in HDR, present on-screen for a short time.

Good value: > 650 cd/m²

https://www.rtings.com/tv/tests/picture-quality/hdr-peak-brightness#test_643
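For a sense of scale, here's roughly what those test windows work out to in pixels on a 4K panel, assuming centered square patches as the blurb describes (illustrative only, not RTings' actual test files):

```python
import math

W, H = 3840, 2160  # 4K UHD panel
for pct in (2, 10, 25, 50):
    side = math.sqrt(pct / 100 * W * H)
    print(f"{pct}% window ~ {side:.0f} x {side:.0f} px square")
print("100% window = full 3840 x 2160 frame")
```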

[screenshot from the RTings HDR peak brightness test page]


FALD proponents would say to look at the 25% and 50% and sustained values for where real advancements are needed/will come in OLED. Besides all of that, the part you pasted didn't say anything about the other tradeoffs of QD-OLED, which may include slightly worse black depth by nature of the QD layer when ambient light hits it, and things like a magenta hue to the screen reported on some of them with ambient light hitting it. That, and the fact that some of them have an abraded outer layer, which also raises blacks and has other tradeoffs.

I think this part might be being a little forgiving of WOLED though:
"since the white luminance for WOLED is greater, the modeled relative color volume is figuratively smaller. In terms of absolute volume, the WOLED in that scenario ends up having greater color volume from its white luminance extension."

That seems like a kind way of saying that the white subpixel is polluting, or taking up, some of the color space that should be the high range of different colors in the spectrum rather than white. Yes, there is more white, but there is less of the other colors that should be represented, and which provide a delineation from lower values of those colors. That is, those high color values should provide detail-in-color at those heights, but are instead being substituted with white, or washed-out/whitened versions of the colors where the full color should be. Surmising that more white poured into the color volume than normal makes for a greater color volume ("look how much more white!"), even if it did peak at a somewhat higher white threshold, seems disingenuous.
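To make that relative-vs-absolute distinction concrete, here's a toy example with made-up numbers (not measurements of any real panel): if both panels hit the same absolute red luminance but the WOLED's boosted white is much higher, the WOLED's red looks like a smaller share of its own color volume even though the absolute color output is identical.

```python
# Hypothetical figures, purely to illustrate the relative-vs-absolute point.
panels = {
    "WOLED (hypothetical)":   {"white_nits": 900, "red_nits": 300},
    "QD-OLED (hypothetical)": {"white_nits": 450, "red_nits": 300},
}

for name, p in panels.items():
    relative = p["red_nits"] / p["white_nits"] * 100
    print(f"{name}: red = {p['red_nits']} nits absolute, "
          f"~{relative:.0f}% of that panel's peak white")
```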

Clipping, like when certain games try to assign color values higher in the spectrum than your screen is capable of (without logically compressing them down into other colors), also tends to "clip" or blast things to white, which blows out detail, and that's not desirable either.

It's a scale though, so in the WOLED it's probably more white at the brightest end of the HDR range rather than the same amount of dilution throughout the whole range. Which is one of the reasons, along with ABL, that it's been said it's difficult if not impossible to calibrate OLED accurately past a certain height. At least that's what I've heard.

[image: HDR color gamut/volume diagram]


However, micro lens array tech (MLA), in models that get it going forward, may allow less reliance on the white subpixel, or reliance at a different scale. Phosphorescent blue OLEDs might also provide more resilience, where "cheating" a higher perceived brightness by pumping the white subpixel might not be as necessary. So in models that get MLA and PhOLED, including from a burn-in avoidance standpoint, the design and firmware choices might fire the white subpixels less, and less intensely across the range, inserting white into the higher part of the color volume less. Maybe a different gradient, with a higher starting point in the range where the white starts blending in and diluting color values appreciably. Less reliance on the white.

At least that's how I perceived everything after reading that, and how I understand the techs to operate, but I'm open to discussion :D
 
I thought the new structure went from RWGB to RGWB. Yeah, the article really reads like an advertisement for QD-OLED and spends very little time on the real-world implications, other than that single chart which everyone overlooks.

Here is a C2 vs. these new QD-OLEDs:


https://youtu.be/5yXcGZ9Jjo0?si=Nda0FE1261Km4_qc&t=3061

Lots of C2 owners have been returning the Alienware 32" because the C2 looks brighter. People underestimate how important that 10-25% highlight range is for real content. I'm still going to grab one of the 32" QD-OLEDs just to see for myself, side by side, how it compares to this LG, but I already know it's too dim, and I prefer overall brightness > situational color volume advantage.


I don't get returning the 32" 4K 240 to go back to a 42" 120.

I'm pretty picky about my monitors, and the 32s are pretty dang close to endgame for me.
I have had the 42s, 48s and 55s... hell, I even have a C1 77" in the living room right now... but nothing comes closer to that classic gaming-on-a-CRT feeling than these 32" 4K 240s... I was playing CoD MW2 multiplayer last night and it felt like that classic CRT experience with ALL the benefits of modern high-resolution tech... it's just incredible... The faster 240Hz helps me with immersion way more than a larger, slower screen.

I am totally happy gaming with this Alienware 32, which is not something I have felt in a long-ass time... I thought I was just getting old and bored of the hobby, which is partially true lol... but it turns out my eyes are super picky about clarity.
 
I don't get returning the 32" 4K 240 to go back to a 42" 120.

I'm pretty picky about my monitors, and the 32s are pretty dang close to endgame for me.
I have had the 42s, 48s and 55s... hell, I even have a C1 77" in the living room right now... but nothing comes closer to that classic gaming-on-a-CRT feeling than these 32" 4K 240s... I was playing CoD MW2 multiplayer last night and it felt like that classic CRT experience with ALL the benefits of modern high-resolution tech... it's just incredible... The faster 240Hz helps me with immersion way more than a larger, slower screen.

I am totally happy gaming with this Alienware 32, which is not something I have felt in a long-ass time... I thought I was just getting old and bored of the hobby, which is partially true lol... but it turns out my eyes are super picky about clarity.

My guess is that the people who returned it probably weren't able to hit 240fps anyway. 240Hz doesn't mean much if you're only able to get 100-120fps, but anyone with a PC fast enough to get at least 160fps would not be going back down to 120Hz. I don't buy the whole "not immersive enough" argument, because a 42" display isn't exactly a world of difference in immersion vs. a 32". Going from a 77" down to a 32", sure, but 42" to 32"? Nah, they're too similar.
 
Most HDR10 movies have an average FALL (frame-average light level) of 10–20 nits throughout the entire movie. Note that for the most part, the large majority of pixel values in scenes are within the SDR domain, since the average light levels between SDR and HDR frames aren't all that different. Scenes that make use of more generous highlights will see FALLs trend toward 40–100 nits. This is why the industry standard is measuring HDR highlights at a 10% window, which covers a FALL up to 100 nits at 1000 nits mastering peak. And for most films, the MaxFALL is often 100 nits or greater. 5%, in my opinion, should be the bare minimum APL peak brightness value that should be advertised in good faith.

(I know you were just quoting that reddit reply to be clear).

I watched a Dolby Vision mastering documentary video a few years ago where the Dolby tech said a typical HDR 10,000 scene was mastered as something like 50% at "zero" to 100 nits, 25% at 100 to 1,000 nits, and 25% at 1,000 to 10,000 nits.
Your TV's firmware would tone map that down within its own ratios to fit the capacity of the screen though. When fed an HDR 10,000 curve, the LG CX's curve was quoted to me as mapping accurately up to 400 nits, then squeezing 400 - 10,000 nits into the remaining 400 - 800 nits. Higher-range screens would probably map the accurate part somewhat higher before starting to compress (multiple ranges of) color values into singular values to fit.
LG CX 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG CX 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG CX 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.

That middle 25% and top 25% is still a lot of brightness, color volume and detail you'd be losing out on with current screens compared to what it was mastered for 1:1, and especially OLED's mids and highs at larger percentages of the screen and on sustained windows. It's all crushed down and compressed into the screen's capability with static tone mapping, then curtailed by the screen's own %-of-screen brightness cutoffs + ABL kicking in. Saying that to be honest, but I still love OLEDs for their pros despite their cons. You were quoting a comparison of two OLED techs to each other anyway, to the point that they run pretty parallel to each other in general usage and compared to the larger view of HDR.
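As a rough illustration of that kind of static tone mapping (a toy linear knee, not LG's actual roll-off curve): values pass through 1:1 up to the knee, and everything above gets squeezed into what's left of the panel's range.

```python
def tone_map(nits_in: float, knee: float = 400.0,
             content_max: float = 10_000.0, panel_max: float = 800.0) -> float:
    """Toy static tone map: 1:1 up to the knee, then linearly compress
    knee..content_max into knee..panel_max."""
    if nits_in <= knee:
        return nits_in
    frac = (nits_in - knee) / (content_max - knee)
    return knee + frac * (panel_max - knee)

for v in (100, 400, 1000, 4000, 10_000):
    print(f"{v} nit source -> ~{tone_map(v):.0f} nits on panel")
```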

.


My guess is that the people who returned it probably weren't able to hit 240fps anyway. 240Hz doesn't mean much if you're only able to get 100-120fps, but anyone with a PC fast enough to get at least 160fps would not be going back down to 120Hz. I don't buy the whole "not immersive enough" argument, because a 42" display isn't exactly a world of difference in immersion vs. a 32". Going from a 77" down to a 32", sure, but 42" to 32"? Nah, they're too similar.

Sounds probable. Hz is pretty meaningless if you aren't filling it with higher frame rates; that's why I usually describe the gains in terms of "fpsHz" rather than Hz.
Some people are probably pushing all of the graphics settings over the top, RTX etc., and valuing the eye candy there over the motion excellence and the up to 2x blur reduction you'd get at 240fpsHz vs. 120fpsHz. Also, like you hinted at, not everyone who gets glammed by newer screen tech necessarily has a 4090 GPU.

Immersion can be relative to view distance. I'd view a 32" 4K at the same 50 to 60 degree viewing angle I'd view any other size 4K screen (like my 48"), so it's just as immersive, since it would fill the same number of degrees of my personal FoV, and with the same perceived pixel density.
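The viewing-angle math behind that is straightforward; here's a quick sketch (flat 16:9 screens, distances in inches) showing that a 32" sat closer fills the same horizontal FoV as a 48" sat further back:

```python
import math

def distance_for_hfov(diagonal_in: float, hfov_deg: float, aspect=(16, 9)) -> float:
    """Viewing distance at which a flat screen of the given diagonal fills
    hfov_deg of your horizontal field of view."""
    w, h = aspect
    width = diagonal_in * w / math.hypot(w, h)
    return (width / 2) / math.tan(math.radians(hfov_deg) / 2)

for size in (32, 42, 48):
    print(f'{size}": sit ~{distance_for_hfov(size, 60):.0f}" away for a 60 deg view')
```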

The overall feel of a larger screen is different though, even at the same viewing angle and perceived pixel density; it's less cramped. I use both types of setups regularly (a traditional desktop-sized monitor vs. a larger gaming-TV command-center style setup), more or less, so I can deal with a smaller screen just fine if I have to, though I mostly game on my main rig with the larger screens.
 
The better question is what games out there actually support 480 fps. Even with dynamic rendering, engines can still break when the framerate gets too high. To keep things consistent, physics engines can be limited to a constant rate on a separate thread, and once the framerate goes beyond that point things can still get funky.
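On the physics point: the usual pattern is a fixed-timestep loop where the simulation always advances in constant increments regardless of how fast frames render. A minimal, generic sketch (not any specific engine's implementation):

```python
import time

PHYSICS_DT = 1.0 / 120.0  # physics always advances at a fixed 120 Hz

def game_loop(update_physics, render, run_seconds: float = 1.0) -> None:
    """Render as fast as the display allows (e.g. 480 fps) while physics
    steps in fixed increments, so simulation behaviour doesn't drift with
    framerate."""
    accumulator = 0.0
    previous = time.perf_counter()
    end = previous + run_seconds
    while time.perf_counter() < end:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= PHYSICS_DT:   # catch the simulation up in fixed steps
            update_physics(PHYSICS_DT)
            accumulator -= PHYSICS_DT
        render(accumulator / PHYSICS_DT)   # leftover fraction, usable for interpolation
```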
 
The better question is what games out there actually support 480 fps. Even with dynamic rendering, engines can still break when the framerate gets too high. To keep things consistent, physics engines can be limited to a constant rate on a separate thread, and once the framerate goes beyond that point things can still get funky.

Some people are saying they would use the 480Hz mode for playing games like Valorant, OW2, R6S, CS2, etc., basically those esports games that would easily run at 480fps. It still seems like a rather niche use case, but you know what, even if I wouldn't personally use it, I'm not gonna hate on it. I liked using 120Hz BFI on my OLED despite it being a niche use case, and it was pretty annoying to see people hate on that feature and celebrate its removal just because they didn't use it themselves.
 
Some people are saying they would use the 480Hz mode for playing games like Valorant, OW2, R6S, CS2, etc., basically those esports games that would easily run at 480fps. It still seems like a rather niche use case, but you know what, even if I wouldn't personally use it, I'm not gonna hate on it. I liked using 120Hz BFI on my OLED despite it being a niche use case, and it was pretty annoying to see people hate on that feature and celebrate its removal just because they didn't use it themselves.
It actually wouldn't be good for comp shooters because of the terrible resolution. It doesn't matter how fast it is if you can't resolve the player's head you're trying to shoot; it's a blocky, pixelated blur, making it harder to pinpoint their head. At 25 inches, yeah; not at 32". That's why this mode is pointless.
 
It actually wouldn't be good for comp shooters because of the terrible resolution. It doesn't matter how fast it is if you can't resolve the player's head you're trying to shoot; it's a blocky, pixelated blur, making it harder to pinpoint their head. At 25 inches, yeah; not at 32". That's why this mode is pointless.
I personally wouldn't sacrifice resolution if it's already 120Hz or more, but Counter-Strike and some other competitive games don't have many encounters where people are far enough away for higher resolution to make a significant difference.
 
I personally wouldn't sacrifice resolution if it's already 120Hz or more, but Counter-Strike and some other competitive games don't have many encounters where people are far enough away for higher resolution to make a significant difference.
Of course they do. Jiggle peeking down long sight lines makes a very significant difference. Why do you think there are literally no other 32" 1080p super-high-refresh monitors? Because it's not the correct size. That's why most Twitch streamers play on 27" 1440p, not because it looks better but because they can see the enemy more easily.
 
Just placed an order for the MSI at Best Buy with an ETA of Monday. Still no ETA on the LG.

Edit: Based on the LG promo page, it will be available for preorder at the end of February or early March. From what I remember, the 27" LG OLED opened for preorders 2-3 weeks before they were delivered.
 
They seriously did a drop in the middle of the night? Meh, well, one good thing about not being able to snag one of these monitors is that it's at least forcing me to wait on reviews to see which is the best option.
 