PG32UQX - ASUS 32" 4K 144 Hz HDR1400 G-Sync Ultimate

I don't understand why people buy an HDR monitor without knowing how to use it. Or maybe it is just another troll attempt to bash the best HDR FALD LCDs, which outperform OLEDs and every other monitor not capable of true HDR.
Though RDR2 is not graded specifically for HDR, the game already offers HDR calibration with white points up to 500 nits and HDR brightness up to 10,000 nits.
With the white point at 500 nits and HDR brightness at 1,800 nits, the game looks good without losing detail, better than on any other monitor.

low exposure
View attachment 492617

high exposure
View attachment 492618
I think he means RDR is not mastered for HDR?

My recommendation: not a game, but try Dune in 4K Dolby Vision/HDR10. I watched it on this monitor and it is revelatory. I played it side by side with an OLED, and while the latter shows marginally better contrast (hair-splitting) in dark scenes, the XG shows bright details like lights much better and makes them pop in a more lifelike way. The outdoor scenes, such as in the desert sun, were extraordinary and looked realistic; the OLED could not compare at all to the XG. It wasn't even close when the hot desert sun was blazing.

For a game, maybe try Elden Ring.
 
Exactly, RDR is not real HDR; the signal is SDR.
On YouTube, I found this channel from Europe and the quality of the videos is amazing!
 


It is a typical misconception that movies look more impressive than games. Games are already ahead of movies, and most YouTube videos don't exceed 1,000 nits.
You won't be able to see half of the PG32UQX's HDR capability from YouTube videos.
And I don't know why people test HDR on the old RDR instead of RDR2. I know people say RDR2 has crap HDR, but that's because their monitors are crap at HDR, not the game.
RDR2 is already a very good title. As monitors get brighter in the future, RDR2 will look more and more realistic.

For example, the sunset cloud in the YouTube video you linked only touches 1,000 nits. That's not a real-life sunset. The Lumetri scope can show the brightness graph for the scene.
52222937189_58b413621e_o_d.png



However, a similar scene in RDR2 can be far more impressive, especially on the PG32UQX.
52222674221_1a5e831d66_o_d.png



If you do the EOTF calculation from the graph, the sunset cloud has electrical signal values of around 0.875, so the sunset cloud above the top of the mountain is easily 3,000 nits.
52222677571_afbc0cd4b2_o_d.jpg
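For anyone who wants to reproduce that number, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF in Python, assuming the 0.875 value read off the waveform is a normalized PQ signal:

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF.
# Assumes the waveform value (0.875) is a non-linear PQ code value in [0, 1].
m1 = 2610 / 16384          # PQ constants from ST 2084
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ code value to absolute luminance in nits."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(0.875)))  # ~3100 nits, i.e. "easily 3,000 nits"
```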

Unless you don't know how to use HDR, games like RDR2 already look much better than average YouTube videos.
And a monitor like the PG32UQX is even more impressive, even though the original content needs tone mapping from 3,000 nits down to the PG32UQX's 1,800-nit peak brightness to display properly, which is configurable in RDR2's game options.
 

Where can I find movies or series in HDR of the best quality?
 
The RDR2 sunset can reach well above 6,000 nits.

You can download and check the original HDR picture in Windows HDR mode with the built-in Microsoft Photos app.
RDR2 Sunset 6,000+ nits

The game looks best on the PG32UQX without a doubt.
Other monitors, however, display the same HDR with massive overexposure and ABL.
RDR2_Lumi_2.png
 
Do you think that the pg32uqx is better than the xg321ug? It's the same panel
Monitors with the same panel have broadly similar performance. But even with the same panel, different monitors from each company use different circuit boards, and how the TCON is configured, which controls how brightness and color behave, also differs slightly. Build quality differs too.
You need to know those details to compare them.
 
By now, at least some people should be able to see the trade-offs:
1. Without an additional algorithm, making the LEDs track the PQ EOTF exactly gives accurate brightness but causes blooming because of the limited number of LEDs/zones.
2. Configuring the backlight LEDs to reduce blooming can sacrifice contrast. When the signal comes in, the FPGA uses it to make the backlight behind the content area shine at the intended nits, or at reduced nits. With a more refined FPGA, an additional algorithm lights the LEDs/zones around the content area by a small number of nits. This method is mature on the Sony M9's 96-zone AUO panel (a toy sketch of approaches 1 and 2 follows below).
3. VA has better contrast but a smaller color space than IPS. HDR will look washed out on VA compared to IPS.
4. OLED has per-pixel dimming, but its brightness is too low. Perceived contrast can end up even lower than on FALD LCDs.

To deliver a vivid HDR experience, some trade-offs have to be made among the points above.
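As a toy illustration of points 1 and 2 (not ASUS's actual algorithm; the zone grid and lift factor are invented for the example):

```python
import numpy as np

def zone_targets(nits: np.ndarray, zones=(16, 24), neighbor_lift=0.0) -> np.ndarray:
    """Toy FALD model: each zone is driven by the brightest pixel behind it.

    nits          -- per-pixel target luminance of the frame (H x W), in nits
    zones         -- backlight grid (rows, cols); real monitors differ
    neighbor_lift -- fraction of a zone's level leaked into its 8 neighbours
                     (approach 2 above); 0 means strict tracking (approach 1)
    """
    h, w = nits.shape
    zr, zc = zones
    grid = np.zeros(zones)
    for r in range(zr):
        for c in range(zc):
            block = nits[r * h // zr:(r + 1) * h // zr,
                         c * w // zc:(c + 1) * w // zc]
            grid[r, c] = block.max()          # accurate brightness, but blooms
    if neighbor_lift > 0:                     # soften hard zone boundaries
        padded = np.pad(grid, 1)
        neigh = np.max(np.stack([padded[dr:dr + zr, dc:dc + zc]
                                 for dr in range(3) for dc in range(3)]), axis=0)
        grid = np.maximum(grid, neighbor_lift * neigh)
    return grid

# frame = load_hdr_frame(...)                       # hypothetical: per-pixel nits
# strict = zone_targets(frame)                      # approach 1: accurate, blooms
# soft   = zone_targets(frame, neighbor_lift=0.1)   # approach 2: softer bloom edge
```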

The PG32UQX uses IPS with the widest color space and accurate EOTF brightness, without much zone lifting on Level 2. Color and contrast are accurate and brightness is sustained, so the HDR impact is more powerful.

Compared to the PG27UQ, which drives the background at higher nits to reduce blooming, the PG32UQX with its higher zone count still shows less blooming in HDR scenes. In the worst case, such as a starfield, blooming on the PG32UQX is on the same level as the PG35VQ.

Most people don't have these monitors; only a side-by-side comparison shows the difference.

Reference
52235077412_c2a63a6433_o_d.png


The camera uses the same fixed exposure for all shots. The pictures below are close to what you see in person.

PG27UQ
52236072923_a5425e969f_o_d.png


PG32UQX
52235078787_755b13547d_o_d.png


PG35VQ
52235078547_434f3ae965_o_d.png


PG27UQ vs PG32UQX
52236072003_1cc037169c_o_d.png


PG32UQX vs PG35VQ
52235077922_78dce7e34a_o_d.png


PG27UQ vs PG32UQX vs PG35VQ
52236058391_e3736cb07c_o_d.png



This is the worst case; real HDR content rarely contains a starfield, so blooming is not that noticeable in practice. In high-APL scenes, unlike other monitors that get hit by ABL, these monitors are unmatched because of their sustained EOTF brightness, higher contrast and wider color space.
 
What do you think of the level 3 variable backlight? I'm using level 3 and notice it is a bit faster at lighting up the zones, but with brighter blooming compared to level 2.
 
Level 3 is similar to the fast mode on the previous model. The backlight changes as soon as the next frame refreshes, so it has more blooming than the slower transitions.
In high-APL games like Forza Horizon 5, level 3 is better. A starfield is not good for level 3; that's the limitation of the zone count.
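As a rough mental model of the level setting (the smoothing factors here are invented; the real firmware tuning isn't public), think of it as how aggressively the zone targets are smoothed from frame to frame:

```python
# Toy model: variable backlight "speed" as per-frame smoothing of zone targets.
# Smoothing factors are made up; the real firmware behaviour is not public.
SPEED = {1: 0.25, 2: 0.5, 3: 1.0}   # level 3: jump to the new target immediately

def step_backlight(current, target, level=3):
    """Move each zone from its current level toward the new frame's target."""
    a = SPEED[level]
    return [(1 - a) * c + a * t for c, t in zip(current, target)]

# Level 3 tracks the content instantly (good for high-APL racing games),
# but a star popping into a zone flashes at full brightness -> visible bloom.
zones = [0.0, 0.0, 0.0]
print(step_backlight(zones, [0.0, 1.0, 0.0], level=3))  # [0.0, 1.0, 0.0]
print(step_backlight(zones, [0.0, 1.0, 0.0], level=1))  # [0.0, 0.25, 0.0]
```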
 
I have both the X27 and the XG321UG (same panels as the PG27UQ and UQX). I have viewed them side by side, and while the X27 still has beautiful HDR performance, it has less "pop" than the newer XG: it has lower brightness in unhighlighted areas, whereas the XG has a more uniform-looking screen. I assume this is due to fewer zones and lower peak brightness. This may also account for the perceived "less blooming" people notice compared with the mini-LED. However, I can confirm the blooming is still there, and the overall performance of the XG is superior.

I wonder if someone can compare the Neo G8 and the XG/UQX as you have done above?

Also, I wonder whether the newer "AmLED" AUO panels, like the Acer X32 FP, have the newer FPGA algorithm you mentioned.
 
IPS > OLED > VA in terms of Rec. 2020 color space coverage. IPS is the most colorful in HDR; OLED looks closer to IPS than VA does.
The Neo G8 is a VA, so it doesn't look very colorful in HDR, and with its ABL issue the color looks even stranger. But it's fast, brighter than OLED, and has much less blooming.

The Acer X32 FP has already been out for a while.
The new algorithm is implemented in some form, tuned to Acer's preference, without a G-SYNC module. Acer is usually the worst at dialing in the algorithm; the screen is lifted a bit too hard in HDR mode. But the new 576-zone AmLED blooming control on the X32 FP is already better than the AOC PD32M.
 
1658818464059.png

Download the VESA DisplayHDR Test app from the Microsoft Store to check the display info. There seem to be two batches of the PG32UQX with different maximum frame-average luminance.
 
Is this info read from the firmware?

My PA32UCX:

View attachment 495362
This is the ProArt configuration, which makes the backlight align completely with the EOTF. The peak brightness at any window size from 1% to 100% is the same as the sustained brightness at a 100% window.
In the Calibrate MAX Full Frame Luminance Value test section, the inner boxes should disappear when it is set to 1,200 nits at any window size, including full screen.
 
yep on 1.150
 
That firmware has the best approach; there is no ABL at all on the PA32UCX.

The PG32UQX can still have ABL. Brightness can drop from 1,600 nits to 1,000 nits when the scene has an APL over 1,000 nits. The effect can be seen both in person and on camera: when ABL kicks in, about 3 seconds later the sun is simply no longer there. It also takes about 5-10 seconds to recover the brightness after the APL drops.
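A toy model of that behaviour, using the numbers described above (the linear ramp shape is my assumption, not a measurement):

```python
# Toy ABL model for the PG32UQX based on the behaviour described above:
# when scene APL stays above ~1,000 nits, peak brightness sags from ~1,600
# to ~1,000 nits over ~3 s, and recovers over ~5-10 s once the APL drops.
def simulate_abl(apl_per_second, peak=1600.0, limited=1000.0,
                 attack_s=3.0, recover_s=8.0):
    cap = peak
    caps = []
    for apl in apl_per_second:              # one sample per second of content
        if apl > 1000:
            cap = max(limited, cap - (peak - limited) / attack_s)
        else:
            cap = min(peak, cap + (peak - limited) / recover_s)
        caps.append(round(cap))
    return caps

# 5 seconds of a very bright scene followed by 10 seconds of a normal one:
print(simulate_abl([1200] * 5 + [300] * 10))
```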



The original screenshot can be downloaded to check the ABL effect.
Forza Horizon 5 Screenshot 1,000+ nits APL - Windows Photo Compatible

reference
52245201491_5a0d1f6773_o_d.png


52244268947_faa751c74c_o_d.jpg



It's interesting that Samsung's HDR 1500 ad has 10,000-nit highlights, but you won't see them through Chrome on YouTube: Chrome has a tone-mapping layer that matches the current monitor's HDR specs. They can only be seen in other video players, such as mpv, without an additional tone-mapping setting.

Overall, this is a different level of ABL on the PG32UQX, while other monitors with lesser specs look 2-5 times dimmer in such scenes, and their contrast and colors shrink as well.
 
The XG321UG covers additional color space over the PG32UQX: about 50% of the human-visible gamut, and 100% of sRGB and Adobe RGB. This is consistent with the review from Tom's Hardware. The other numbers are similar to the UQX. I will post a screenshot later.
 
For those interested, on Amazon the PG32UQX has dropped price to $2459 (new).
I've been watching that; it was as low as $2,300 earlier in the year on Amazon as well. My local MC will price match if it's shipped and sold by Amazon. Still, having a PG27UQ, I can't justify it at over $2k as a worthwhile upgrade. Yes, it is certainly better overall, but the PG27UQ is still better than most monitors out right now and does everything I need. Games look amazing and I rarely notice any blooming in the games I play.
 
That first video test looks great on my monitor... that second video looks like someone has horrible blooming, though.
It's specifically made to test the bloom pattern on 1,152-zone mini-LED monitors. However, the white dots are too small to trigger the PG27UQ to display them at its supposed max brightness.
So the image on the PG27UQ is a grey background with white dots at SDR nits, while on the PA32UCG it is a grey background with 1,800-nit white dots and blooming.

That doesn't mean the PG27UQ has less blooming, though; its local dimming simply hasn't kicked in yet. If I increase the size of the white dots, once local dimming kicks in the blooming is worse on the PG27UQ than on the PG32UQX.
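As a toy illustration of why a tiny dot can fail to drive a zone, here is a sketch that assumes a simplistic rule where a zone only boosts once a dot covers some minimum fraction of it; the threshold and the exact zone grids are my assumptions, not the G-SYNC module's real logic:

```python
def zone_boosts(dot_px, panel_px=(3840, 2160), zone_grid=(24, 16), min_coverage=0.002):
    """Does a single white dot cover enough of one zone to trigger a boost?

    dot_px       -- dot size in pixels (square, for simplicity)
    zone_grid    -- backlight zones as (cols, rows); grid layouts are assumed
    min_coverage -- invented threshold: fraction of a zone the dot must cover
    """
    zone_area = (panel_px[0] / zone_grid[0]) * (panel_px[1] / zone_grid[1])
    coverage = (dot_px * dot_px) / zone_area
    return coverage >= min_coverage, coverage

# 384-zone grid (PG27UQ-like, assumed 24x16) vs 1152-zone grid (PG32UQX-like, assumed 48x24):
print(zone_boosts(6, zone_grid=(24, 16)))   # tiny dot in a big zone: likely ignored
print(zone_boosts(6, zone_grid=(48, 24)))   # same dot covers 3x more of a smaller zone
```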
 
Level 3 is unusable. It looks like fireworks going off.
Level 3 is the fastest backlight; in fact, it is the best setting for moving images.
If level 3 is unusable, then levels 2 and 1 are even more unusable.
Funny how some people keep talking about starfields on OLED without understanding how a backlight works.
 
Interesting, my UQX seems to share this pattern here, although I do have better color performance. What is the other variation of the UQX set to?
The max FALL (maximum frame-average light level) reads above 1,000 nits instead of 951 nits. If the basic readout changed, the firmware has been changed.
There is also information that can't be read out this way, such as the luminance level at each window size and how long the monitor can stay at that luminance.
The previous firmware could keep the brightness up at a larger window size, so the new firmware has more ABL than the previous one.
52338692924_457004ab20_o_d.png
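For context, FALL and CLL themselves are easy to compute once you have per-pixel luminance in nits (for example via the PQ EOTF sketched earlier in the thread); a minimal sketch, with the caveat that the official MaxCLL definition uses the max of the R, G and B components per pixel rather than pixel luminance:

```python
import numpy as np

def fall_and_cll(frames_nits):
    """MaxFALL = highest per-frame average luminance, MaxCLL = brightest pixel value.

    frames_nits -- iterable of H x W arrays of per-pixel luminance in nits
    (deriving per-pixel nits from encoded RGB is a separate, content-dependent step).
    """
    max_fall, max_cll = 0.0, 0.0
    for frame in frames_nits:
        max_fall = max(max_fall, float(frame.mean()))
        max_cll = max(max_cll, float(frame.max()))
    return max_fall, max_cll

# A frame that averages ~1,000 nits reads as FALL ~1,000, which is exactly the
# kind of scene that trips the ABL behaviour discussed in this thread.
```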

For example, the PA32UCG can maintain a full-field 1,500 nits for over 30 minutes before dropping to 1,000 nits, while the PG32UQX lasts about 3 seconds before it drops to either 1,200 nits or 1,000 nits, depending on the version.

You can simply run HDR picture tests to check how well the monitor handles the drop.

Forza Horizon 5 Screenshot 1,000+ nits APL - Windows Photo Compatible
Forza Horizon 5 Screenshot 1,000+ nits APL - Windows Photo compatible - 2
 
Does the PG32UQX support HDR10+? I know it supports HDR10, but I was wondering whether HDR10+ metadata can be supported, either now or via a firmware update. Dolby Vision I know is not supported, which is a shame, but only three monitors do. Would that require a hardware change, or can support be added at a software/firmware level?

Watching House of the Dragon on this monitor is an absolute treat. Seeing fire spew out of a dragon's maw, and the overall color volume in HDR1400, is simply breathtaking.
 
Very interesting info.

While I don't think OLED will hit 1,000+ nits in small % windows for 30 minutes any time soon :p (very impressive),
I have hopes for improvements in sustained brightness at different window sizes via better heatsinks, and via tech like QD-OLED getting more color volume/brightness out of lower energy/heat states.

The per-pixel ultra contrast of OLED down to nil, "infinite black", sits right alongside much brighter HDR color volumes to a pixel's razor edge; there are no backlights with bloom, and no dimming offsets in FALD firmware losing color detail and detail-in-darks where high-contrast color/dark areas meet; plus the 0.1 ms response times of OLED panels (without having to rely on overdrive and RTC/response-time compensation like LCDs do, especially in gaming). Those are still greatly superior facets, so there are always trade-offs.

The PA32UCG sounds like a much better screen for authoring HDR, and for anything else at a workspace desk (without spending ~$30k on a reference monitor), due to its HDR peaks and sustained periods, size and viewing angles, and PPD up close.

During dynamic scenes and in ordinary HDR scene types you might say blooming is less apparent and less aggravated on higher-density FALDs, but you might also say ABL happens less noticeably on some of the modern OLEDs or QD-OLEDs, depending on how dynamic the scenes are and what types of scene they are. The graph of a typical HDR scene has a lot of the screen in the SDR range, with the HDR being bright highlights and light sources, so those small % windows can be sustained longer than large portions of the screen typically would be, and the content usually changes from scene to scene, or at least from camera shot to camera shot, relatively quickly. There are exceptions in certain more static scenes, of course (watching a sunrise, the sun above the surface of water for a while, or a long shot of a white snowfield/mountain). It depends on the cinematography and content in a movie, or on where you are and how much you are moving in a game.

Another factor is that the PA32UCG is "only" 32" in size, or 31.5" viewable, so while great for traditional desk sizes, it isn't really large enough to double as a media-room or larger-screen movie/show experience for kicking back, imo. It also can't do an ultrawide resolution in a ~17.5"-tall viewable area the way a 48" OLED TV can when running 3840x1600 (or at a still quite decent height doing that on a 42"). Personally I'm much more into 42"-48" screens now, at ~70 PPD or more at distances of 34" to 48" away if I can get it.

Also worth noting that our eyes process brightness and saturation in a relative way, so your room lighting environment comes into play somewhat. E.g. staring into a flashlight (or a tablet) in someone else's hand on a bright sunny day outside vs. staring into that same flashlight at night. OLEDs are best in dim-to-dark home-theater environments.

Of course the specs stand on their own for each, but there is also the price factor, with the FALD UCG being up to $5,000, while you can get a 42"-48" 120 Hz VRR LG OLED or an Alienware QD-OLED gaming screen for $1,000-$1,200 (or less with some of the LGs). The 55" A95K is around $3k currently, though; they don't offer one in a 42"-48" size. Still, the PA32UCG seems like a great screen, and I appreciate all the fine details posted in this thread. For me personally, I'd go all OLED for media and gaming, where the 32" FALD seems like a better all-arounder for static desktop usage + media + authoring in a smaller format more suitable to "upright piano with sheet music" style desk setups rather than command~media-center type setups.

A few examples of color-mapped HDR brightness levels, showing that the brightest points are often a small % of the screen and that most of the screen in these shots sits in much lower ranges. The scenes are usually dynamic (moving around, panning or zooming, camera switching, etc.) too. Obviously the higher end of the range would be mapped and compressed down with static tone mapping to whatever the range of your actual display is, so that comes before % window brightness limitations and screen-protection triggers are reached. E.g. a 700-800 nit OLED or a 1,500 nit FALD.


I forget where the star wars game shot came from but the star wars movie shots are from a video from the HDTVTest channel on youtube.

. . . . .

I also modified this old graphic/meme below at one point to be about HDR. Just making a joke - I do appreciate what brighter/higher color volumes and longer sustained HDR can do for realism ;)
. .

tPyXzb2.png
 
That's interesting info. Is there any way to update the firmware? I believe mine has the firmware with more ABL, as I have noticed the dimming before.
 
I forget where the star wars game shot came from but the star wars movie shots are from a video from the HDTVTest channel on youtube.
The heatmap was made by EvilBoris, who provides materials for the HDTVTest YouTube channel. It's a visualization that outputs brightness as colors. I should make heatmaps like this as well, to make the brightness easier to read.
 
Hmmm, echoing Skyhopper, that is really interesting. What stands out is that the initial reviews were clearly based on the firmware with less ABL and higher sustained brightness at much larger windows. I'm not sure how I feel about a stealth change after the fact, but I guess that's par for the course with tech, unfortunately. So it seems the very first batch of PG32UQXs had a less restrictive ABL, is that so?
 

I know the video from HDTVTest was made by EvilBoris but thanks for giving him the proper full credit. I was just sourcing where I took the screenshots from which is a HDTVTest video. It was the star wars battlefront game slides that I meant when I said I didn't remember where I got that pic from.

Edit: found it here, and yeah, I guess it was EvilBoris again. Thanks. (y)

https://arstechnica.com/gaming/2018...es-broken-down-by-incredible-heatmap-imagery/

originally from his posts here:

https://www.resetera.com/threads/hdr-games-analysed.23587/

I never realized that original thread was his, since I visited it a long time ago. The original post is from 2018.
 
I don't think ASUS will release a firmware update; ASUS probably won't even admit it. A monitor with more ABL should be sold cheaper now.
I have had two PG32UQXs. The first one was close to the review sample. After a year it developed a dark spot with a green tint around it, so I RMA'd it and got a new one. The second one, made in May 2022, has more ABL but less blooming. It also feels a bit faster in terms of response time.
 
I made a heatmap with a pseudo-color scope. The original size of the scope is 68,200 x 9,600; it's basically a ruler with 68,200 scale increments. This may be the most accurate visualization so far.

52341008888_56a4974768_o_d.png


However, the heatmap is more useful for comparing different images. I still prefer the RGB waveform, which shows directly how many nits each area hits.
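For anyone who wants to roll their own, a minimal sketch of that kind of nits heatmap, assuming a 10-bit PQ-encoded frame; the nit bands and colors are arbitrary choices, not the exact palette of the scope above:

```python
import numpy as np

def pq_eotf(signal):
    """ST 2084 PQ EOTF: normalized signal -> nits (vectorized)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = np.power(signal, 1 / m2)
    return 10000 * np.power(np.maximum(e - c1, 0) / (c2 - c3 * e), 1 / m1)

def nits_heatmap(code_values_10bit):
    """Bucket per-pixel nits into bands and paint each band a flat color."""
    nits = pq_eotf(code_values_10bit / 1023.0)
    bands = [100, 400, 1000, 4000, 10000]                    # arbitrary cut points
    colors = np.array([[40, 40, 40], [0, 0, 255], [0, 255, 0],
                       [255, 255, 0], [255, 128, 0], [255, 0, 0]], dtype=np.uint8)
    return colors[np.digitize(nits, bands)]

# frame = read_10bit_pq_frame("sunset.png")   # hypothetical loader
# heatmap = nits_heatmap(frame)               # H x W x 3 uint8 image
```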
 
Made a heatmap

You picked an exceptionally bright skyline example. Most of the games heatmapped in Boris's thread don't have that kind of full-sky 1,000+ nit brightness, heh.

Probably the brightest one in his examples, in terms of % of screen, is Forza 3, with some bright 1,000-nit-and-higher breaks (the ~3,000-nit sun parts) in what is otherwise a skyline of clouds sitting in the yellow band of the heatmap.

hdr-06a.jpg



Though his Tomb Raider example definitely has some extremely "hot" spots (a.k.a. the sun). :D

hdr-04a.jpg



Shadow of Mordor is similar.

hdr-10a.jpg


Horizon Zero Dawn: The Frozen Wilds

Even from this short clip we can see how to do HDR right, almost everything you see sits within the standard SDR range, however the highlights on Aloy's weapons glimmer in the sun and the sparkle that runs down her back are heading towards the 10k nit level. You can see the clouds are all highly illuminated and sit between 1000-4000nits with the sun itself hitting 10k

https://www.youtube.com/embed/W5GwqYgd0uI


You aren't always staring at the sun like the earlier heat maps above are; he purposely had his character looking at the sun in the sky. Most games and scenes won't be that extreme most of the time, highlight-wise. Whatever the range of your display is, the top part of the HDR scale gets compressed down to fit the static tone-mapping curve too.

Screenshotted by me to show some average scenes. I actually had to scrub to find areas in the video with red in them; most are all orange and yellow.

vXsiLtJ.png


QJQz4Jj.jpg
 
It depends on whether the monitor can display it, whether the HDR is in-game, Windows Auto HDR, or another HDR implementation.
If you adjust the HDR slider in the game settings, Forza Horizon 5 can have an APL up to 700 nits and highlights up to 10,000 nits. The heatmap will have a lot of purple and red, and the HDR looks more impressive as well. The sand looks like real sand; you can see it reflecting 1,000+ nit sunlight.
Future content will be a lot brighter, with APL going above 700 nits. And the real outdoors is a lot brighter than that.
 
The color heat maps can show 10,000 nits even when the display can't, apart from one game in his examples which he said was capped at 1,000 (and which I didn't post here). The Horizon example was a 10,000-nit map, as evidenced by the chart and the quote. My point was that his other examples were staring up at the sun, so I showed a few screenshots from the Horizon walkthrough of the world, where typical scenes are commonly not that bright, even on open pathways.

The occasional cascade of scintillation on the armor plating on her back would probably sit at 725 nits or 795 nits in game mode on a C1 or C2 OLED, respectively (peak 2% window, non-sustained), and a little higher for movies outside of game mode (751 nits, 810 nits). That is, for momentary, dynamic highlights like that in small areas of the screen, rather than massive static areas in ABL-inducing scenes. In a dim-to-dark viewing environment this is still very appreciable highlight-wise and looks great, especially contrasted down to the pixel with per-pixel emissive OLED and its black depths. Mind-blowingly good looking to me compared to SDR or many of the "fake HDR" screens. Still, I'm hopeful that advancements in OLED heatsink tech and QD-OLED's ability to get higher color volumes out of lower energy states will push these numbers up more in the future, at least until micro LED is actually a mainstream thing and we might get 10,000-nit HDR in consumer-priced screens some years from now. By then we'll probably have higher-PPD VR headsets with HDR, and maybe even some early mixed-reality headsets/glasses/goggles.

For the 3,000-nit and 10,000-nit glints of sunlight off metal or water, staring at the sun in passing, etc., all of the screens will be using static tone mapping with a roll-off.

For example, the LG CX's roll-offs are:
LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.

I don't know what the roll-offs are for the QD-OLEDs or the PG32UQX. The PG32UQX will show a much brighter range overall at HDR1400 (and with sustained periods), but it is still compressing any HDR4000 or HDR10,000 content into whatever top-end block of range remains above the accuracy roll-off point ASUS went with (a toy example of such a roll-off follows below).
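A sketch of what such a static roll-off does, using the LG CX numbers quoted above (real curves roll off smoothly; the linear squeeze here is just for illustration):

```python
def roll_off(nits, knee, source_peak, display_peak):
    """Pass through below the knee, compress [knee, source_peak] into [knee, display_peak]."""
    if nits <= knee:
        return nits
    nits = min(nits, source_peak)
    t = (nits - knee) / (source_peak - knee)          # 0..1 above the knee
    return knee + t * (display_peak - knee)           # simple linear squeeze

# LG CX, 4000-nit curve: accurate up to 480 nits, 480-4000 squeezed into 480-800.
for n in (300, 480, 1000, 4000):
    print(n, "->", round(roll_off(n, knee=480, source_peak=4000, display_peak=800)))
```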
 