PG32UQX vs Top 2023 Displays (Subjective Review)

Burn-in and ABL are things that will never, ever be "fixed" IMO. They're just inherent to the technology and there's no getting around them; they can only be improved up to a certain point. I'll still be keeping my 32M2V for desktop use and for certain HDR games with a ton of high-APL scenes where an OLED would struggle.
I mean, I think they'll become less of an issue to the point where they aren't such a concern. CRTs got there, burn-in wise, back in the day. Yes, you could burn in modern CRTs, but it was difficult enough that, beyond a basic screensaver or shutting the screen off, you didn't worry about it. I think we'll see the same with OLED as time goes on, and part of it will be related to brightness. The amount an emitter fades, and thus the burn-in, is related to how hard it is driven. So if they're able to increase peak brightness a whole lot, and get panels that can be happily driven to 2000-3000 nits, then normal operation at 100-150 nits for desktop usage could be so sedate it just isn't a problem. They'd still burn in eventually, but it could take so long that it's never realistically an issue.
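To put rough numbers on that intuition: OLED lifetime is commonly modeled with a power-law acceleration factor, where wear scales roughly with drive level raised to an exponent somewhere around 1.5-2. A minimal sketch of the idea, where the exponent and the equal-life-at-peak assumption are purely illustrative, not measured:

```python
# Rough illustration of why panels capable of much higher peak brightness
# should burn in far more slowly at the same desktop brightness.
# Uses the common empirical acceleration model: lifetime ~ (L_peak/L_used)^n,
# with n assumed here (typically quoted around 1.5-2), and assumes for
# illustration that each panel has comparable life when run at its own peak.

def relative_lifetime(peak_nits: float, used_nits: float, n: float = 1.8) -> float:
    """Lifetime multiplier vs. running the panel flat out at its peak."""
    return (peak_nits / used_nits) ** n

desktop = 150  # typical SDR desktop brightness in nits

for peak in (1000, 2000, 3000):
    print(f"{peak}-nit-capable panel at {desktop} nits: "
          f"~{relative_lifetime(peak, desktop):.0f}x its at-peak life")
```

Under those assumptions a 3000-nit panel run at 150 nits accrues wear roughly 7x slower than a 1000-nit panel at the same setting, which is the "so sedate it just isn't a problem" scenario.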

The ABL issue will always be there too, but I'm just hoping they can get monitors more around the level of TVs. ABL isn't a big deal on my TV because it can maintain a fair bit of brightness over a fairly large area, even if its full-field brightness is actually less than a monitor's. The AW3423DW can do 250 nits full field (though I don't know that it's a good idea to do that for long periods); the S95B caps out at around 210. They also both hit about the same peak of a little over 1000. The big difference is in the 10-25% window range: the AW3423DW is over 1000 nits at a 1% window but down to 450 at 10%, falling further to 360 at 25%, while the S95B is still over 1000 nits at 10% and only down to 560 at 25%.

Net effect is you can have a scene that isn't bright enough overall to trigger limiting on either display, yet the monitor still can't render the bright highlights because they cover too much of the screen, while the TV can. If monitors could be brought up to the same kind of mid-window range as TVs, brightness limiting would be much less of an issue in real-world content.
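A quick sketch of that comparison using the figures quoted above: interpolating each display's peak-nits-vs-window-size curve shows how bright a highlight of a given size each one can sustain (the log-window interpolation and the intermediate values are assumptions for illustration; only the anchor points come from the measurements above).

```python
# Compare sustainable highlight brightness vs. highlight size (% of screen)
# for the two displays, using the rough figures quoted above.
# Interpolating in log(window) space is an illustrative assumption.
import numpy as np

window_pct = np.array([1, 10, 25, 100])
aw3423dw   = np.array([1000,  450, 360, 250])  # QD-OLED monitor
s95b       = np.array([1000, 1000, 560, 210])  # QD-OLED TV

def nits_at(window: float, curve: np.ndarray) -> float:
    """Interpolated sustainable brightness for a highlight of this size."""
    return float(np.interp(np.log10(window), np.log10(window_pct), curve))

for w in (5, 10, 20):
    print(f"{w:>3}% window: monitor ~{nits_at(w, aw3423dw):.0f} nits, "
          f"TV ~{nits_at(w, s95b):.0f} nits")
```

At a 10% window that's roughly 450 nits vs 1000 nits, which is exactly the "highlights too large for the monitor but fine on the TV" situation described above.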
 
I feel like color performance is almost a moot point when it comes to gaming, simply because most games don't seem to actually use super wide color gamuts. Even Cyberpunk 2077 is mostly Rec.709 in HDR with just a tiny bit of DCI-P3. That would explain why CP2077 on my CX OLED looks basically identical to my 32M2V, with the Mini LED monitor just being brighter overall; colors on it don't "pop" any harder than on the OLED.
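For the curious, the underlying check a gamut analysis makes is straightforward: transform each pixel from the container gamut (BT.2020) into a smaller gamut and see whether any channel goes negative. A minimal sketch of that test (the matrices are the standard linear-light BT.2020-to-BT.709 and BT.2020-to-P3-D65 conversions, rounded; the tolerance is an arbitrary choice, and this is a simplified version of what a heatmap tool does, not its actual code):

```python
# Classify a linear BT.2020 RGB pixel by the smallest standard gamut
# that contains it -- the same basic test an HDR gamut heatmap performs.
import numpy as np

# Standard linear-light primaries conversions (both white points D65, rounded).
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])
BT2020_TO_P3D65 = np.array([
    [ 1.3435, -0.2822, -0.0613],
    [-0.0653,  1.0758, -0.0105],
    [ 0.0028, -0.0196,  1.0168],
])

def gamut_of(rgb2020: np.ndarray, tol: float = 1e-4) -> str:
    """Smallest gamut containing the pixel (tolerance is arbitrary)."""
    for name, m in (("BT.709", BT2020_TO_BT709), ("DCI-P3", BT2020_TO_P3D65)):
        if (m @ rgb2020).min() >= -tol:  # no negative channel => inside gamut
            return name
    return "BT.2020"

print(gamut_of(np.array([0.5, 0.5, 0.5])))    # neutral gray -> BT.709
print(gamut_of(np.array([0.9, 0.05, 0.02])))  # very saturated red -> BT.2020
```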

[Attached screenshot: in-game overlay reporting the color space of the on-screen content]

BTW, does anyone know where you can get a program like that, one that shows the color space being output by the source material? I'm really curious to see what color spaces other games use.
Not sure about the one shown in particular, but the only one I can find that does this in software is a ReShade filter. You get it here and add it to ReShade; the specific thing to enable is called lilium_hdr_analysis. It'll get you an analysis like this.
 

Will try it out. Thanks.
 
I played with it last night and it works well. I can't speak to the veracity of what it reports, but it does what it says and gives statistics, a histogram, etc. in real time. In that specific picture they've turned on its heatmap mode, which shows what is HDR and what isn't.
 

That's good to know. I mainly want to try this out because I see so many conflicting reports regarding WOLED vs QD-OLED, where some people claim it's a "night and day difference" in the colors while others say there is absolutely no difference at all. I think it just boils down to what color space the game itself is using. Obviously, if a game is only using the Rec.709 color space then there shouldn't be any difference between WOLED and QD-OLED, since both are more than capable of displaying full Rec.709; but if a game heavily uses BT.2020, that's where QD-OLED should separate itself from WOLED.
 
I've only tested Jedi Survivor, and only briefly, but in that game it varies a lot by location. There were places where all the colors were within the BT.709 gamut, but also places where the colors went WAY into BT.2020, beyond what my monitor (or any monitor) can display.
 