The 32 inch 4K IPS 144Hz's... (Update - this party is started) (wait for it...)

I do agree the halo effect isn't as visible in SDR mode, but with adaptive dimming on, the mouse cursor looks very washed out and hard to see.
What's the purpose of using adaptive dimming when programming or doing other productivity tasks? In those cases, just use plain SDR without dimming.
 
There are many reasons to use FALD on SDR content.
Feel free to list those out. SDR has a dynamic range of about 6 stops* and is designed for 100 cd/m². SDR doesn't need any form of dynamic backlight to show all of its information.

If you want to 'enhance' SDR outside of the way it was intended to be viewed, you're welcome to do so. Those would be the only "reasons" to use FALD on SDR.

EDIT: Actually "just" 6-stops. I knew I was reading that wrong.
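
For anyone skimming, here's the rough arithmetic behind the stops/contrast back-and-forth: a stop is simply a doubling of luminance, so stops and contrast ratio are two views of the same number. A minimal sketch (the 6-stop figure is the one from the post above, not a measurement of any particular panel):

```python
# Sketch: converting between "stops" of dynamic range and contrast ratio.
# One stop = one doubling of luminance, so stops = log2(white / black).
import math

def stops_from_contrast(ratio: float) -> float:
    """Dynamic range in stops for a given contrast ratio (e.g. 1000 for 1000:1)."""
    return math.log2(ratio)

def contrast_from_stops(stops: float) -> float:
    """Contrast ratio needed to cover a given number of stops."""
    return 2 ** stops

print(contrast_from_stops(6))      # 64    -> ~6 stops only needs about 64:1
print(stops_from_contrast(1000))   # ~9.97 -> a 1000:1 native IPS already covers that
print(stops_from_contrast(3000))   # ~11.6 -> a 3000:1 VA has even more headroom
```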
 
Feel free to list those out. SDR has a dynamic range of about 10 stops and is designed for 100 cd/m². SDR doesn't need any form of dynamic backlight to show all of its information.

If you want to 'enhance' SDR outside of the way it was intended to be viewed, you're welcome to do so. Those would be the only "reasons" to use FALD on SDR.
Better blacks and higher contrast. Nits are irrelevant.
 
Better blacks and higher contrast. Nits are irrelevant.
Theoretically, if the local dimming had enough zones it would create a greater contrast ratio specifically for SDR content. In most cases it doesn't, because by the nature of a zone turning on and off, anything inside that zone keeps the panel's native contrast ratio; dimming only changes one zone relative to another zone.
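
To put rough numbers on that, here's a minimal sketch of the checkerboard-style measurement RTINGS describes below; the 1000:1 native contrast, the dimming factor and the zone layouts are illustrative assumptions, not figures for any specific monitor:

```python
# Sketch: why zone count decides whether local dimming helps a checkerboard
# contrast measurement. All numbers are illustrative assumptions.

NATIVE_CONTRAST = 1000   # assumed 1000:1 IPS-class panel
WHITE_NITS = 300         # assumed SDR white level

def checkerboard_contrast(zones_per_black_square: float, dim_factor: float) -> float:
    """Measured white/black ratio on a black-and-white checkerboard.

    zones_per_black_square < 1 means every backlight zone also covers some
    white content, so no zone can dim and black stays at the native level.
    dim_factor is how far the backlight drops in an all-black zone (0.1 = 10%).
    """
    native_black = WHITE_NITS / NATIVE_CONTRAST
    if zones_per_black_square < 1:
        black = native_black               # zone must stay lit for its white square
    else:
        black = native_black * dim_factor  # zone sits entirely inside a black square
    return WHITE_NITS / black

print(checkerboard_contrast(0.25, 0.10))  # coarse zones -> 1000.0, no gain over native
print(checkerboard_contrast(4.0, 0.10))   # dense mini-LED zones -> 10000.0
```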

If contrast ratio is the big thing, then you're far better off getting a VA panel with 3000:1 contrast ratio rather than any FALD panel with a native ratio of 1000:1 and a "greater" contrast ratio with FALD on. RTINGS states it thusly:

Native Contrast vs. Contrast With Local Dimming

One frequently asked question is which is more important, a panel's native contrast or contrast with local dimming? It's a good question. The answer is a bit complicated, but basically, it depends. Unlike TVs, most monitors don't have a local dimming feature. The few that do, generally speaking, don't work very well. They usually have very small zone counts, and the algorithms can't keep up with fast-paced motion, so the leading edge of a bright object in a dark scene ends up looking darker than the rest, and there's a trail of light behind it.

Because of these issues with local dimming, it's almost always more important to look at the native capabilities of a monitor instead of the contrast ratio with local dimming. Because most monitors have poor local dimming features, there's usually not that much of a difference between the native contrast of the panel and the contrast with local dimming when tested with a checkerboard pattern. In fact, of the 23 monitors with local dimming that we've tested on our latest test bench, only 4 of them can improve contrast by 10% or more with our test pattern through local dimming.
https://www.rtings.com/monitor/tests/picture-quality/contrast-ratio

At best you can state "it depends whether or not you should turn FALD on to increase contrast ratio for SDR content", rather than that it is "better" by default. And a majority of the time it isn't better.

My personal recommendation? If contrast ratio on SDR content is the metric you care most about, it makes far more sense to go OLED. RTINGS basically says the same.
 
If contrast ratio is the big thing, then you're far better off getting a VA panel with 3000:1 contrast ratio rather than any FALD panel with a native ratio of 1000:1 and a "greater" contrast ratio with FALD on. RTINGS states it thusly:
I disagree. FALD should be as usable in SDR as it is in HDR. Nits are irrelevant. Better blacks should always be better than worse blacks. If it's not then it's a problem with the monitor.
 
If contrast ratio is the big thing, then you're far better off getting a VA panel with 3000:1 contrast ratio rather than any FALD panel with a native ratio of 1000:1 and a "greater" contrast ratio with FALD on. RTINGS states it thusly:
My VA panel looks considerably better in SDR with FALD enabled. Properly implemented FALD on an LCD monitor will improve the SDR viewing experience. The RTINGS page you quoted even alludes to that, stating how "it depends", because most of the monitors they tested were sucky edge-lit LCDs with very few dimming zones.
 
Sorry, but the halo effect seen with adaptive dimming just bothers me too much. But hey, that's me. To each their own.
 
Those of you that game at 4K with 32" screens: what do you have to change your resolution scale to in game? Or does Windows scaling apply to games? I assume some change would need to be made since the PPI is 140; hitting targets would be more difficult compared to, say, a 27" at 2560x1440.

I was looking at 3440 x 1440 ultrawides, but 4K 16:9 is wider (in pixel count) lol. Although with the scaling needed, would it end up being about the same?
 
Those of you that game at 4K with 32" screens: what do you have to change your resolution scale to in game? Or does Windows scaling apply to games? I assume some change would need to be made since the PPI is 140; hitting targets would be more difficult compared to, say, a 27" at 2560x1440.
As a rule you don't have to change anything. Modern games tend to just ignore Windows scaling and use the panel's native 4K resolution while scaling their 2D elements like the HUD and text themselves, either implicitly or by providing scaling options.
I've yet to run into a game where using the panel would present any issues due to high PPI. There are some, like Deus Ex HR, which don't scale their HUDs with resolution, where this may be a bit of an issue. But it's far from what you'd get on a 27" 4K, for example.
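
For anyone wondering where the ~140 PPI figure comes from, here's the quick math for the sizes mentioned in this thread (a rough sketch, diagonals in inches):

```python
# Pixel density for the screen sizes discussed above.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 31.5), 1))  # ~139.9 -> the "140 PPI" 32" (31.5") 4K
print(round(ppi(2560, 1440, 27.0), 1))  # ~108.8 -> 27" 1440p
print(round(ppi(3840, 2160, 27.0), 1))  # ~163.2 -> 27" 4K, noticeably denser
```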
 
I disagree. FALD should be as usable in SDR as it is in HDR. Nits are irrelevant. Better blacks should always be better than worse blacks. If it's not then it's a problem with the monitor.
For photorealism, better blacks always trump higher brightness. That's one of the reasons so many enthusiasts ditched FALD LCDs for OLED, and why OLED is doing so well right now while FALD IPS and FALD VA aren't.
 
https://tftcentral.co.uk/news/monitor-oled-panel-roadmap-updates-march-2023

32" 4K OLED panel from LGD:
31.5″ with 4K and 240Hz refresh rate (and 480Hz support!)

This is likely to be the most popular and eagerly awaited OLED panel option we think. A ~32″ sized panel with a 3840 x 2160 “4K” resolution and an impressive 240Hz refresh rate, including also an innovative approach to supporting 480Hz as well! Since the release of the 27″ 1440p 240Hz panels this year, we’ve had loads of feedback from people saying they want a 32″ 4K 144Hz model instead, so this planned panel will surpass those expectations with an even higher refresh rate at 240Hz.

This would also represent a step change in pixel density on any of their WOLED panels, increasing from the current ~105 – 110 PPI limit options (42″ 4K and 26.5″ 1440p) to ~140 PPI which is their highest density panel confirmed from LG.Display at the moment. A 27″ 4K is also being discussed but not firm at this stage, which would be even higher density if produced.

All 3 of these new 16:9 panels (excluding the 27″ 4K 240Hz which is still in planning) are currently expected around Q3 2024, so there’s a bit of a wait.
Also from Samsung:
31.5″ with 3840 x 2160 “4K” resolution and 240Hz – a direct competitor to the panels LG.Display are planning from their technology
 
For photorealism, better blacks always trump higher brightness. That's one of the reasons so many enthusiasts ditched FALD LCDs for OLED, and why OLED is doing so well right now while FALD IPS and FALD VA aren't.
Remember that not everyone uses a PC like a console xD
 
AUO has been busy developing what the company calls Advanced Reflectionless Technology or A.R.T. which the company was showing off at the Touch Taiwan trade show. The panel in question was a 32-inch, 4K AHVA panel, which is AUO's own variant of IPS. Although the panel was set up so there were fewer reflections in the area where it was being displayed, the panel did seem to be less reflective compared to other displays sitting next to it. However, it's always hard to judge these things on a show floor, so we'll have to wait for some reviews before passing final judgement on A.R.T. but it looks like a promising technology when it comes to reducing unwanted reflections.

The panel is also one of the first 4K panels capable of delivering a 240 Hz refresh rate, although it's going to be hard finding a graphics card capable of driving all games at that kind of refresh rate at 4K resolution. Sadly the brightness is only 400 cd/m², although AUO claims 1 ms grey to grey response time with overdrive enabled and a 95 percent coverage of the DCI-P3 colour gamut, which is better than most 4K gaming panels on the market today. Sadly, AHVA panels suffer from IPS glow, just as normal IPS panels, which can clearly be seen in the second image below.
https://www.techpowerup.com/307474/auo-shows-off-4k-240-hz-a-r-t-gaming-monitor-display-panel
 
I hear that AOC PD32M units manufactured in 2023 have fixed almost all of its problems. So I wonder how the PD32M compares to the X32FP? The two monitors have the same panel and the PD32M has double the mini-LED zones of the X32FP, so it must be better, right?
 
I don't think anyone would drop around $1800 on a rumor that a monitor which received terrible reviews just might be good now, IF you happen to get this year's model, which is out of your control anyway.
 
Finally caved and bought a refurb XG321UG off eBay directly from Viewsonic. Wish it was a VA panel, but beggars can't be choosers. Excited to finally ditch this shitty size/aspect ratio monitor I have. Fingers crossed it doesn't suck.
 
I don't think anyone would drop around $1800 on a rumor that a monitor which received terrible reviews just might be good now, IF you happen to get this year's model, which is out of your control anyway.
I read that the PD32M uses a cheaper Innolux Q7B (or K7B) panel compared to the panel of the X32FP, so I ended up ordering an X32FP yesterday.
 
I read that the PD32M uses a cheaper Innolux Q7B (or K7B) panel compared to the panel of the X32FP, so I ended up ordering an X32FP yesterday.
Yeah, I am waiting for some major reseller in my country with a decent return policy to offer one before I pull the trigger myself... or for Asus to offer their spin on the same panel later this year. Whichever comes first.
 
Finally caved and bought a refurb XG321UG off eBay directly from Viewsonic. Wish it was a VA panel, but beggars can't be choosers. Excited to finally ditch this shitty size/aspect ratio monitor I have. Fingers crossed it doesn't suck.
Enjoy, I picked one up not too long ago to use as my primary monitor because of a good deal; color and performance are amazing. The only issues are the haloing and some slight uniformity issues against greyish colors, which are acceptable tradeoffs for me.

Recently, I used some suggestions from the ASUS PG32UQX thread to turn off sRGB via the OSD and to use YCbCr444/BT.1886; this improves contrast and color for SDR content viewed in HDR mode and brings it much closer to actual HDR content. It worked out well since the ViewSonic uses the same panel.
 
I've tried YCbCr444 too, and the OSD of the XG321UG does indeed show a BT.1886 gamma curve. But as far as I can see and measure, the gamma goes up to 2.4, which is in line with BT.1886, and I do not see any other advantages in color (sRGB colors set to off). Can you explain why this is better than just setting gamma to 2.4 in the OSD?
 
Has any XG321UG user ever received a firmware update for their unit? I cannot even find my version in OSD or with EliteDisplayController.
My unit uses ABL more aggressively than the units in reviews. White patches of 25% or larger in HDR mode get dimmed after a few seconds, especially a full white screen, which drops to 1000 nits after three seconds.
What's your experience with this?
 
I've tried YCbCr444 too, and the OSD of the XG321UG does indeed show a BT.1886 gamma curve. But as far as I can see and measure, the gamma goes up to 2.4, which is in line with BT.1886, and I do not see any other advantages in color (sRGB colors set to off). Can you explain why this is better than just setting gamma to 2.4 in the OSD?
This link kind of explains the differences: https://www.portrait.com/resource-center/bt-1886-10-questions-10-answers/

From my understanding, a straight 2.4 power curve may crush blacks; BT.1886 brings out dark details by having a lower effective gamma at the dark end and a higher gamma at the light end (closer to 2.4 at the high end).
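
To make that concrete, here's a small sketch of the BT.1886 EOTF from ITU-R BT.1886 next to a plain 2.4 power curve. The 100-nit white and 0.1-nit black levels are placeholder assumptions, not measurements of the XG321UG:

```python
# Sketch of the BT.1886 EOTF vs. a plain 2.4 power curve, assuming a
# 100-nit white and a 0.1-nit black (placeholder values, not measurements).

GAMMA = 2.4

def bt1886(v: float, lw: float = 100.0, lb: float = 0.1) -> float:
    """ITU-R BT.1886 EOTF: signal v in [0, 1] -> luminance in cd/m^2."""
    a = (lw ** (1 / GAMMA) - lb ** (1 / GAMMA)) ** GAMMA
    b = lb ** (1 / GAMMA) / (lw ** (1 / GAMMA) - lb ** (1 / GAMMA))
    return a * max(v + b, 0.0) ** GAMMA

def pure_power(v: float, lw: float = 100.0) -> float:
    """Plain 2.4 gamma with no black-level lift."""
    return lw * v ** GAMMA

for v in (0.05, 0.10, 0.50):
    print(f"v={v:.2f}  bt1886={bt1886(v):7.3f} nits  gamma2.4={pure_power(v):7.3f} nits")
```

With those numbers, the darkest signal steps come out noticeably brighter under BT.1886 than under a pure 2.4 power curve, which is the "brings out dark details" effect described above.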
 
So my refurb (direct from Viewsonic on eBay) XG321UG arrived today. For anyone wondering, these refurbs do NOT come in the standard retail packaging. It arrived in a smaller cardboard box with 2 foam pieces (barely) holding the monitor in place. The stand and power supply/cables were bubble wrapped and tossed in the bottom of the box. The panel and stand appeared to be in like-new condition, and the power supply/cables appeared brand new as well. It does sound like there are some loose plastic bits inside the monitor itself that I can hear when moving it around, which is slightly concerning... but it powered on and looks great. It appears to have one dead pixel in the bottom right corner, but I can't even notice it from my normal viewing distance.

I'm not sure what magic Viewsonic worked with the fan, or if they somehow built a G-Sync Ultimate monitor without a fan, but I can't hear anything. A huge improvement coming from the PG35VQ.

Windows HDR calibration app put me at 1550 nits peak and 1450 nits full screen white. Seems a bit high compared to measurements for the PG32UQX (and the measurements in Tom's Hardware XG321UG review), but whatever. Blooming isn't noticeably better or worse than the PG35VQ. Some scenarios it's better, some it's worse. Changing the variable backlight setting from the default "mode 2" to "mode 1" is a necessity. Significantly improves blacks.

I've tested a few games so far, and I'm impressed. Dead Space remake made me go "WOW!" when I first loaded it up. Maybe it's just the size or the resolution, but it seemed significantly better looking than on my PG35VQ. Coming from a nearly 1100 nit peak monitor already, I wasn't expecting a big brightness difference, but DAMN does this thing get BRIGHT! Fiery explosions and specular highlights seem much brighter on this monitor.

Overall happy I took the plunge so far. Very happy to be back on a flat 16:9 display.

Recently, I used some suggestions from the ASUS PG32UQX thread to turn off sRGB via the OSD and to use YCbCr444/BT.1886; this improves contrast and color for SDR content viewed in HDR mode and brings it much closer to actual HDR content. It worked out well since the ViewSonic uses the same panel.

Why not just use RGB/BT2020?
 
So my refurb (direct from Viewsonic on eBay) XG321UG arrived today. For anyone wondering, these refurbs do NOT come in the standard retail packaging. It arrived in a smaller cardboard box with 2 foam pieces (barely) holding the monitor in place. The stand and power supply/cables were bubble wrapped and tossed in the bottom of the box. The panel and stand appeared to be in like-new condition, and the power supply/cables appeared brand new as well. It does sound like there are some loose plastic bits inside the monitor itself that I can hear when moving it around, which is slightly concerning... but it powered on and looks great. It appears to have one dead pixel in the bottom right corner, but I can't even notice it from my normal viewing distance.

I'm not sure what magic Viewsonic worked with the fan, or if they somehow built a G-Sync Ultimate monitor without a fan, but I can't hear anything. A huge improvement coming from the PG35VQ.

Windows HDR calibration app put me at 1550 nits peak and 1450 nits full screen white. Seems a bit high compared to measurements for the PG32UQX (and the measurements in Tom's Hardware XG321UG review), but whatever. Blooming isn't noticeably better or worse than the PG35VQ. Some scenarios it's better, some it's worse. Changing the variable backlight setting from the default "mode 2" to "mode 1" is a necessity. Significantly improves blacks.

I've tested a few games so far, and I'm impressed. Dead Space remake made me go "WOW!" when I first loaded it up. Maybe it's just the size or the resolution, but it seemed significantly better looking than on my PG35VQ. Coming from a nearly 1100 nit peak monitor already, I wasn't expecting a big brightness difference, but DAMN does this thing get BRIGHT! Fiery explosions and specular highlights seem much brighter on this monitor.

Overall happy I took the plunge so far. Very happy to be back on a flat 16:9 display.



Why not just use RGB/BT2020?
Congratulations on your XG321UG!

When in the Windows HDR Calibration app, my unit displays the full white screen at full brightness for three seconds and then goes down to around 1000 nits (while the white point is still around 1500 nits). Do you also see this behaviour on your unit?
 
I read that the PD32M uses a cheaper Innolux Q7B (or K7B) panel compared to the panel of the X32FP, so I ended up ordering an X32FP yesterday.
Acer just posted a 20% discount in my country, so I just ordered an X32 FP as well from the Acer store. Asus is just so obnoxiously slow and secretive about their release date that they've pissed me off too much already.
 
Remember that not everyone uses a PC like a console xD
Burn-in is an annoying cliché at this point. It won't happen unless you torture it with 1000 hrs of a static taskbar from day one. I'd worry more about screen degradation of the organic materials, even at this state of advancement.
 
Congratulations on your XG321UG!

When in the Windows HDR Calibration app, my unit displays the full white screen at full brightness for three seconds and then goes down to around 1000 nits (while the white point is still around 1500 nits). Do you also see this behaviour on your unit?
Nope, mine does not do that. I can leave it on the full screen white test for 15+ seconds and it does not dim. Ok, I see what you're saying now. When advancing to the third test menu (full screen brightness), it is initially BRIGHT white. Then after about 3 seconds it will dim. This happens without touching the brightness slider at all. Strange.

One strange behavior is that there seems to be some sort of weird/inconsistent tone mapping going on. When I loaded the HDR calibration app to test this, I initially needed a setting of 2000+ nit full screen white for the test pattern to disappear. I had previously seen weird behavior after powering off/back on the monitor, where some games (Hitman 3 for example) looked washed out in HDR, but then looked fine after rebooting my PC. So I loaded Hitman 3 just now, and the game looked normal. Closed the game, launched HDR calibration again, and now it's back to normal where I need 1450 nits for full screen white.

My PG35VQ would do something similar where it would randomly need ~3000 nits in game calibration settings instead of 1000 nits. I never could figure out what would cause this, but it was rare. This XG321UG seems much more finicky, and I can see this being a major point of frustration for me in the future.
 
I think it's pretty clear that 32" 4K IPS high-refresh FALD HDR screens are for "monitor hobbyists" only. Which is why I am certain that, even though I just bought such an IPS, OLEDs will ultimately trample all over this mini-LED tech.

As a point of comparison, in my country the Acer X32 FP currently costs exactly as much as the Asus 27" 240Hz 1440p OLED, and the Asus OLED is still selling like crazy compared to the Acer.
 
I think it's pretty clear that 32" 4K IPS high-refresh FALD HDR screens are for "monitor hobbyists" only. Which is why I am certain that, even though I just bought such an IPS, OLEDs will ultimately trample all over this mini-LED tech.

As a point of comparison, in my country the Acer X32 FP currently costs exactly as much as the Asus 27" 240Hz 1440p OLED, and the Asus OLED is still selling like crazy compared to the Acer.
I think it's pretty clear that all high-end computer monitors are "monitor hobbyist" only.

Do you have any proof that the Asus 27" 240Hz 1440p OLED is "selling like crazy"? Because the number of people spending four figures on ANY computer monitor is a drop in the bucket compared to the overall market.

For anyone considering spending over a grand on a computer monitor, the 27" 240Hz 1440p OLED potentially has three HUGE downsides:
  1. Dinky size (27")
  2. Potato resolution (1440p)
  3. Piss poor brightness (<500 nits for 25%+ window)
 
I think it's pretty clear that all high-end computer monitors are "monitor hobbyist" only.

Do you have any proof that the Asus 27" 240Hz 1440p OLED is "selling like crazy"? Because the number of people spending four figures on ANY computer monitor is a drop in the bucket compared to the overall market.

For anyone considering spending over a grand on a computer monitor, the 27" 240Hz 1440p OLED potentially has three HUGE downsides:
  1. Dinky size (27")
  2. Potato resolution (1440p)
  3. Piss poor brightness (<500 nits for 25%+ window)
No, the monitors we discuss here are hobbyist because they require a lot of fiddling to get core features like HDR, local dimming and VRR running correctly. On the other hand, these OLEDs are more or less plug and play and look great without all the fiddling.

Yes, I am following the biggest online store that regularly posts accurate numbers of the units they have ready to ship, and the whole Asus OLED stock is just disappearing in front of my eyes, at prices well over MSRP. The Acer X32 FP is nowhere to be found, except at their official store. No one has even heard of it, but everyone sure has heard about the Alienware and Asus OLEDs.

This Asus in particular has 900 nits at a 10% window (Neo G7: 1200) and almost 500 nits at 25% (Neo G7: 1000), which is far from "piss poor", and I've had the Neo G7 myself and returned it.
1440p is actually ideal for the vast majority of gamers who can't afford crazy expensive graphics cards like the RTX 4090, and 27" is IMHO the maximum size anyone should go with at 1440p anyway without things looking too pixelated and blurry.
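
(For scale, the GPU-load point is mostly raw pixel count; a quick back-of-the-envelope check:)

```python
# Raw pixel counts behind the "easier to drive" point above.
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
print(uhd / qhd)    # 2.25 -> 4K pushes 2.25x the pixels of 1440p every frame
```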

Finally, these OLEDs are basically first generation and brand new, while what we got after years of mini-LED is extremely underwhelming...
 
Nope, mine does not do that. I can leave it on the full screen white test for 15+ seconds and it does not dim. Ok, I see what you're saying now. When advancing to the third test menu (full screen brightness), it is initially BRIGHT white. Then after about 3 seconds it will dim. This happens without touching the brightness slider at all. Strange.

One strange behavior is that there seems to be some sort of weird/inconsistent tone mapping going on. When I loaded the HDR calibration app to test this, I initially needed a setting of 2000+ nit full screen white for the test pattern to disappear. I had previously seen weird behavior after powering off/back on the monitor, where some games (Hitman 3 for example) looked washed out in HDR, but then looked fine after rebooting my PC. So I loaded Hitman 3 just now, and the game looked normal. Closed the game, launched HDR calibration again, and now it's back to normal where I need 1450 nits for full screen white.

My PG35VQ would do something similar where it would randomly need ~3000 nits in game calibration settings instead of 1000 nits. I never could figure out what would cause this, but it was rare. This XG321UG seems much more finicky, and I can see this being a major point of frustration for me in the future.
I never saw that behavior on my unit. The white point always sits around 1500 nits. The only thing I have is that in Forza 5 I always need to reset the white point in the game (even though the value was correct), otherwise bright highlights are only displayed at 1000 nits instead of 1700 nits.
 
Acer just posted a 20% discount in my country, so I just ordered an X32 FP as well from the Acer store. Asus is just so obnoxiously slow and secretive about their release date that they've pissed me off too much already.
Congrats! My X32 FP hasn't arrived yet because of the big holiday here. I hope you love your X32 FP!
I still have my PA32UC here and it only has 384 zones, but the color and blacks are stunning, so I hope the X32 FP will have good picture quality compared to the PA32UC for my work, plus 160Hz for my gaming :)
 
I think it's pretty clear that 32" 4K IPS high-refresh FALD HDR screens are for "monitor hobbyists" only. Which is why I am certain that, even though I just bought such an IPS, OLEDs will ultimately trample all over this mini-LED tech.

As a point of comparison, in my country the Acer X32 FP currently costs exactly as much as the Asus 27" 240Hz 1440p OLED, and the Asus OLED is still selling like crazy compared to the Acer.
No. I am a photographer and the X32 FP is a gem for someone like me. It is a 4K monitor with 99.5% AdobeRGB coverage for work and it also has 160Hz for gaming, compared to the Asus OLED's 2K resolution and 90% AdobeRGB.
I waited almost 3 years for a 4K monitor that has high AdobeRGB accuracy but also a high refresh rate. And most important, it is just 1/3 the price of the Asus ProArt.
 
No. I am a photographer and the X32 FP is a gem for someone like me. It is a 4K monitor with 99.5% AdobeRGB coverage for work and it also has 160Hz for gaming, compared to the Asus OLED's 2K resolution and 90% AdobeRGB.
I waited almost 3 years for a 4K monitor that has high AdobeRGB accuracy but also a high refresh rate. And most important, it is just 1/3 the price of the Asus ProArt.
I agree, the Acer X32 FP is a terrific monitor, the best of both worlds: a great prosumer monitor with excellent gaming performance.
 