PG32UQX - ASUS 32" 4K 144 Hz HDR1400 G-Sync Ultimate

The color heat maps can show 10,000 nit even when the display can't, outside of one game in his examples where he said it was capped at 1,000 (which I didn't post here). The Horizon example was a 10,000 nit map, as evidenced by the chart and the quote. I was showing that his other examples involved staring up at the sun, so I posted a few screenshots of the Horizon walkthrough of the world, where the typical scenes are commonly not that bright, even on open pathways.

The occasional cascade of scintillation on her back armor plating would probably hit 725 nit or 795 nit in game mode on a C1 or C2 OLED, respectively (peak 2% window, non-sustained), and a little higher for movies outside of game mode (751 nit, 810 nit). That's for momentary, dynamic highlights in small areas of the screen rather than massive static areas in ABL-inducing scenes. In a dim to dark viewing environment this is still very appreciable highlight-wise and looks great, especially contrasted down to the pixel against per-pixel-emissive OLED and its black depths. Mind-blowingly good looking to me compared to SDR or many of the "fake HDR" screens. Still, I'm hopeful that advancements in OLED heatsink tech and QD-OLED's ability to get higher color volumes out of lower energy states will push these numbers up more in the future, at least until microLED is actually a mainstream thing and we might get 10,000 nit HDR in consumer-priced screens some years from now. By then we'll probably have higher-PPD VR headsets with HDR and maybe even some early mixed-reality headsets/glasses/goggles.

For the 3,000 nit and 10,000 nit glints of sunlight off of metal or water, staring at the sun in passing, etc., all of these screens will be using static tone mapping with a roll-off.

For example, the LG CX's roll-offs are:


I don't know what the roll-offs are for the QD-OLEDs or the PG32UCG. The PG32UCG will show a much brighter range overall at HDR1400 (and for sustained periods), but it's still compressing any HDR4000 or HDR10,000 content into whatever top-end block of range remains after the accuracy roll-off point ASUS went with.
The color heat maps only show the transferred data, without concern for hardware limits.
There is a lot of money in mappings such as Dolby Vision, Apple XDR, and ASUS PQ that handle the brightness data transferred from content.
They are trying to do tone mapping and inverse tone mapping to make the monitor display its full potential while not looking overexposed or underexposed. Manually mapping/grading from HDR400 to HDR1400 is not efficient for distribution.
But overall, it depends on the monitor's hardware.
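To make the tone mapping vs. inverse tone mapping distinction concrete, here's a minimal sketch of the two directions a display pipeline has to handle. The knee and peak numbers are made up for illustration, not any vendor's actual algorithm:

```python
def tone_map_down(nits, content_peak=10000, display_peak=1400):
    """Squeeze content mastered above the panel's peak into its range:
    track accurately up to a knee, then compress the rest (illustrative)."""
    knee = display_peak * 0.75               # hypothetical knee point
    if nits <= knee:
        return nits                           # shown 1:1, no compression
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (display_peak - knee)   # squeeze the remainder

def inverse_tone_map_up(nits, content_peak=400, display_peak=1400):
    """Expand e.g. HDR400-graded material toward a brighter panel:
    leave diffuse white and below alone, stretch only the highlights."""
    if nits <= 100:
        return nits                           # don't lift SDR-range content
    t = (nits - 100) / (content_peak - 100)
    return 100 + t * (display_peak - 100)     # stretch highlights upward

print(tone_map_down(4000))    # ~1,165 nit on a hypothetical HDR1400 panel
```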
 

The point was, Horizon was indeed a 10,000 nit map, yet most of the scenes in the YouTube video walking over the mountain had HDR highlights in the 400-600 nit range. Unless you are staring at the sun, or the sun breaking through clouds, the HDR-ranged areas of the environments in a lot of games' scenes are often in the yellow and orange (300-400 nit) through 600 nit to red (1,000 nit) range, with the bulk of the scene in SDR range. You do lose the higher peak scintillation, like on her armor occasionally, but even HDR1400 would compress that down from the 4,000-10,000 nit the scintillation would otherwise be mapped as.

But in reply to your last comment - yes, overexposure happens a lot in bad HDR implementations in games, where the game has no HDR peak brightness setting, for example. CRU-editing your driver's EDID to the peak of your screen (around 800 nit in my case) can help with that in games where the game/Windows would otherwise end up using the wrong curve because the dev dropped the ball. Similarly, the blown-out loss of detail on the high end also happens when you use dynamic tone mapping instead of relying on the static tone mapping of the screen. Dynamic will pump up the colors, but it's a perverted curve that crushes out detail since it has to compress somewhere, more or less pushing colors on top of detail in higher ranges they shouldn't be in (they wouldn't be in those ranges with a less inaccurate static metadata curve). There are a lot of examples of this in HDTVTest videos.
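For reference, the EDID trick works because the HDR static metadata block (CTA-861.3) stores the display's desired max luminance as an 8-bit code value rather than raw nits. A rough sketch of the encoding as I understand the spec (treat the exact rounding as an assumption):

```python
import math

def nits_to_cv(nits):
    """CTA-861.3 desired content max luminance: L = 50 * 2^(CV/32) cd/m2."""
    return round(32 * math.log2(nits / 50))

def cv_to_nits(cv):
    """Decode the 8-bit code value back to cd/m2 (nits)."""
    return 50 * 2 ** (cv / 32)

print(nits_to_cv(800))    # 128 -> the value you'd set for an ~800 nit OLED
print(cv_to_nits(128))    # 800.0
```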
 
Whether it's in-game, Windows Auto HDR, or another HDR implementation, it depends on whether the monitor can display it or not.
If you adjust the HDR slider in the game settings, Forza Horizon 5 can have APL up to 700 nits and highlights up to 10,000 nits. It will have a lot of purple and red, and the HDR looks more impressive as well. The sand looks like real sand; you can see the sand reflecting 1,000+ nit sunlight.
Future content will be a lot brighter, with APL going above 700 nits. And the real outdoors is a lot brighter than this.

I hope my immediate reaction to those APL numbers is off base here, but if not, it's SDR mode for life. My existing displays are set at around 100 nits, and I can't comfortably use them in a dark room; the brightness difference between the screen and the wall behind gives me eyestrain. The outdoors gets extremely bright, but everything is that bright, so my pupils can narrow all the way down and I'm not constantly blasting my eyes with too-bright screen lighting every time I glance away for a moment and they dilate open again.

I'm not going to run 7x my normal lighting (and not just because it'd be massively too much for everything else); and universally poor experiences with built-in automatic ambient brightness adjustment features in monitors I've owned have me dubious that any of them could implement an HDR-brightness-only-for-highlights mode that wouldn't also drive me crazy by noticeably adjusting the overall level up and down.

On the plus side, one less feature to care about will let me save money I can spend on the rest of my system.
 
- I'm not trying to say that HDR1400, and especially the periods it can be sustained on the PG32UCG, isn't superior HDR. It clearly is. I'm just showing how appreciable the HDR is on OLEDs in normal gaming scenes.
Also, even HDR1400 isn't "bright enough" to do HDR3000 through HDR10,000 levels; it's still compressing after an arbitrary fall-off.

- Outside of staring at the sun and some particularly bright sun reflections off of metal or water in parts of a game, a lot of average screen content in many games (especially adventure games) is within SDR ranges, with the bulk of the HDR % of many HDR scenes at 300 through 600 nit. Even explosions, lasers, torches, and spells are usually not as bright as the sun, though they might hit ~1,000 nit (red on the color maps).

- The heat map's 3,000-10,000 nit ranges of content (hot pink to white on the heat maps) will all be compressed into the top end of a screen's range (an 800 nit range, or an HDR1400 range, for example) after an arbitrary fall-off point, defined by the manufacturer, where compression starts (see the sketch after this list). So where my screen might be accurate to 400 nit for HDR10,000 content and compress everything else into another ~400 nit curve-wise, an HDR1400 screen might be accurate to 1,000 nit and compress everything else into another ~400 nit curve-wise. Just guessing, but it would be interesting to find out where the ASUS PG32UCG's accuracy fall-off point is.

- Also explaining how good OLED HDR looks in dim to dark viewing environments due to the way our eyes perceive contrast and saturation in a relative way. E.g., 800 nit highlights in a dark room might look as brightly impactful as 1,200 nit highlights in a bright room (not actual figures, just saying; much like looking at a tablet or phone screen in daylight vs. at night). There is more to consider than just the brightness of the screen due to the way our eyes work. Even 1,400 nit will look different (more muted) in a bright room or in sunlight vs. 1,400 nit in a dark room (where it will look more intensely bright to your eyes).

- OLED's 0.1 ms response time vs. LCDs still having to rely on RTC and overdrive is a factor, as is per-pixel emission: "infinite black depth" and detail in blacks down to the pixel level, right alongside a pixel of higher color volume, vs. FALD firmware having to choose between leaning toward either a dimming or a glowing "halo," which can lose detail. ABL/low sustained peaks is the big tradeoff there on OLED. Heatsinks and QD-OLED should help diminish ABL occurrences in some screens going forward, though.

- Size difference for more media-center-oriented setups. 32" is nice up close at a workstation desk, but it's not great for a bigger-screen feel playing with a controller or watching a video (especially letterboxed videos, which essentially shrink the viewable size of onscreen material/characters perceptually). A 48" or 42" screen can also run a big ultrawide resolution: at 3840x1600, a 48" OLED ends up with a ~17.5" tall viewable ultrawide area, for example. Hopefully some screens with HDR1400-and-higher tech will come out in somewhat larger sizes eventually. Still hopeful regarding heatsink tech and QD-OLED energy levels for now, since I'm a huge fan of per-pixel emissive razor's edge to infinite black.

- Price/performance: ~$4,500 to $5k+ vs. $800-$1,200 for an OLED TV or the Alienware QD-OLED monitor.
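Re: the arbitrary fall-off point above, here is a rough sketch of how that kind of roll-off can work, using a BT.2390-style Hermite-spline knee in PQ space. The ST 2084 constants are standard; where any given monitor actually places its knee (including the PG32UCG's) is the manufacturer's call, so treat the outputs as illustrative:

```python
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):                      # SMPTE ST 2084 inverse EOTF
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(e):                         # ST 2084 EOTF
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

def roll_off(nits, display_peak):
    """Map mastered luminance into display_peak with a soft BT.2390 knee."""
    e, lw = pq_encode(nits), pq_encode(display_peak)
    ks = 1.5 * lw - 0.5                   # knee start, per BT.2390
    if e <= ks:
        return pq_decode(e)               # tracked 1:1 below the knee
    t = (e - ks) / (1 - ks)               # Hermite spline above the knee
    e2 = ((2*t**3 - 3*t**2 + 1) * ks + (t**3 - 2*t**2 + t) * (1 - ks)
          + (-2*t**3 + 3*t**2) * lw)
    return pq_decode(e2)

for nits in (400, 1000, 4000, 10000):     # same content, two panel peaks
    print(nits, round(roll_off(nits, 800)), round(roll_off(nits, 1400)))
```

With this reference curve, a 10,000 nit master hits the panel's peak exactly at the top, and everything above the knee gets progressively squeezed; real firmware picks its own knee, which is exactly the "fall-off point" question.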

It sounds like a great monitor (the PG32UCG), especially as an all-rounder and content-creator screen rather than only a media/gaming screen. Several years ago I might have sprung for something like this, but coming from where I am now, it's not my choice. I fully understand that brighter HDR ranges are more realistic and look better, so please don't get me wrong. Going from a compressed height of ~800 to a compressed height of ~1,400 is appreciable, but it's still scaled way down on the top end regardless, and importantly, many common scenes (not saying all) live in the 300-600 nit range with some 1,000 nit reds splattered in.

I still found a lot of the info posted here very valuable in regard to HDR tech and discussion. I try to keep my eyes open.
. . . . . . . . .
 


The average screen brightness isn't like that most of the time. Like I said, there were some extreme examples shown here. Not every game is going to the beach at high noon in the summer.

Most of a typical HDR scene is in SDR range, with the HDR ranges confined to some highlights and light sources. Even then, in a lot of HDR scenes the environmental content is in the 300-600 nit range, with a few sources/highlights flaring up to 1,000 nit at times.

I'm not going to run 7x my normal lighting
- HDR is actually best in dim to dark home-theater environments, imo. You aren't supposed to be counteracting its impact.
- The viewport is your "world" while viewing HDR content, like a theater. When not viewing media/gaming, you can turn HDR off or switch to another profile (or a differently named mode you've set up on a TV) with a low peak brightness.


Outside of a bright sunlit wide-open sky you are staring at in certain scenes or a particular game, your entire screen surface won't be blasting 1,000-1,400 nit in your face. Even sunlight glinting off of a gun barrel or armor at 3,000-10,000 nit, had you a 10,000 nit capable screen, would only occupy tiny percentages of the screen, like glitter, and if an exceptionally bright moon were in the frame for a time, it might be the size of a coin.


This graphic is like what you are saying. I'm not saying it never happens momentarily (maybe a nuclear explosion scene full frame?), but in reality that's not what happens in typical scenes throughout a movie or game session. You aren't turning your whole screen "up to 11" (or 800-1,000+) full frame all of the time.

725876_tPyXzb2.png
 
"Most" HDR or "normal" HDR is made based on current average displays. You have to consider how many monitor can display the content.
While a lot of monitors can display low light, low APL scene, not many monitors can display HDR 1,000 to its upper end. That's where HDR shines the most.

I assume this is probably the brightest HDR most people have ever seen, if their monitors can do it properly. And it is just HDR1000.


You can download the HDR screenshots to check the images.
52341679318_eb150dcd73_o_d.png


52341865800_ed20df5eff_o_d.png


52341679158_cd8bc1c56b_o_d.png


52341438846_0d2f16669a_o_d.png


52341739224_51a2c7cd8d_o_d.png


52340476602_fc04c599f5_o_d.png


52341866555_e7840e2618_o_d.png

52341678773_2cb1433663_o_d.png

52341438061_6cc5fa815a_o_d.png

52341678338_872d048b8e_o_d.png


52341437816_2871a549a0_o_d.png
 
Yeh, those are all very bright areas, with the bulk of them long views of considerably bright open skylines... but sure. Not every game and movie is all high noon with a skyscape or under intense fluorescent lighting. Even so, those very bright environment shots are 650-1,000 nit mostly in the intense areas and orange to blue in the rest (but, as you said, it is HDR1000 content).

Games aren't necessarily limited to HDR1000 the way a lot of movies and shows are released, since they are just code.
The sun itself and the breaking sunny sky where there aren't clouds in the Tomb Raider shot and the Shadow of Mordor shot, as well as the heavenly orb in the sky in the Star Wars Battlefront shot, were much brighter spots than the examples you just showed, containing up to ~10,000 nit hot pink to white, but the character was purposefully aimed to look at the brightest part of the sky in those.

Horizon Forbidden West, as quoted in the mini review and YouTube video, also had an HDR 10,000 peak, with 10k nit scintillating reflections on the main character's armor periodically, like glitter (the reflections glinting on the barrel of the gun in the Battlefront shot hit 10k as well)... but the typical walking scenes outside of those highlights were SDR + 300 to 600 nit in the environment, with a bright part of the sky peeking through at ~1,000 nit at the very top of the screen briefly. A 10k content highlight might end up at ~1,400 nit or ~780 nit depending on the screen, with everything else compressed into a range at the top end.

Of course the actual real world has much more intense lighting levels, and the brighter and better HDR becomes, the more realistic it will look - but the content we have now is what we are running through HDR curve fall-offs on our screens, so that is what is comparable for now.

The UCG will stand out with its higher peak in the brightest scenes and highlights, and especially its sustained HDR brightness levels, but the bulk of scenes, even HDR10,000 game scenes, remain in SDR range + 300-600 nit + some red 1,000 nit light sources depending on the environment. All of that will be subject to different fall-offs and compressed at the top end, so it will still scale relative to each part of a scene's brightness/color volume. I'm not saying a ~600 nit increase on the top end won't look better, by any means, even just for highlights alone; just that a large % of the screen in a lot of scenes isn't that high (even in an HDR10,000 game).

Here are a few shots of Elden Ring from that earlier boris screenshot link:

HVdzVeS.jpg


kYujI3p.png


dIb0J0i.png
 
I haven't played games for a while. Most games have a slider option to adjust the average HDR luminance level and the highlights.
This can make games look dramatically different. I will check these games when I have time.
 
It still depends on the monitor. The luminance of a game can be adjusted for different displays. They can look a lot more impressive with higher average luminance.

Horizon Zero Dawn and God of War work differently. They don't cap the max brightness for highlight objects, but the highlight objects are few in number, such as flames, lightning, or the sun. Other objects look similar to SDR 100 nits. The devs should've graded more objects; these two games have a normal SDR look.

The most impressive games somehow cap the max brightness to the monitor, but the APL is the highest. The APL will reach 700-1,000 nits easily.

52341899186_14f46eba9e_k_d.jpg

52342202119_86337114c4_k_d.jpg

52340941817_b8b77bca16_k_d.jpg

52342200999_32b27800d5_k_d.jpg

52341898156_8bad37a60c_k_d.jpg

52342200254_ee9196690c_k_d.jpg

52341897336_17c5fc5b20_k_d.jpg

52342140333_9a6c411169_k_d.jpg

52340938057_b74af6c1dd_k_d.jpg

52341892031_0ab3ea9201_k_d.jpg

52342193919_dbc671862a_k_d.jpg

52342207409_8391aabfd6_k_d.jpg

52342345605_4b3f605c33_k_d.jpg

52342360020_c358222f07_k_d.jpg

Only Resident Evil 3 can look the same; it takes place at night, with a lot of low-APL scenes.
 
Elden Ring feels like a downgrade from Sekiro. The game looks underwhelming; the in-game brightness setting is not enough, and the HDR lacks impact.
There should be a lot more highlights to display the rays from the Erdtree. I might just make a mod to expand the brightness settings.

52342671060_a4b1150318_o_d.png

Compared to the original image, the second image looks a lot better.
 
It's all good. Coincidentally, I just watched a scene from that Johnny Depp Harry Potter movie, "Fantastic Beasts," where they showed a nuclear bomb go off full screen in Dolby Vision, lol.



However, with HDR scenes, just like real life, we aren't always in scenes where you might want to throw on a pair of sunglasses :watching: . Those ordinary scenes not necessarily breaking 1,000 nit still look very impressive in person, especially in dim to dark viewing conditions. Where the difference would really hit dramatically screen-wide for me with the UCG would be on large bright fields of ice and snow, like long slow pans over snowy mountains in bright sunlight, where ABL would diminish it after a few seconds. I guess that is similar to your very bright shining white sands comparison, or very bright skylines, which might be sort of like the white mountains screen inverted. Scintillation off metal, water, glass, etc. and other reflections would look more dramatic the brighter and closer to real life they could get, but 400-800 nit fields of color volume and small highlights are still quite a bright contrast against the lower and SDR bulk of a scene at the per-pixel-emissive level on an OLED, with its black depths, in a dim to dark room.

The UCG sounds like a better monitor for a traditional desk distance overall, and as an all-around static-app and workstation screen (rather than a dedicated media screen) by a long shot, but at quite the premium price. The sustained brightness at HDR1400 on the UCG you mentioned is extremely impressive and could probably come into play when actually authoring or editing HDR material. I wish OLEDs could do longer durations currently, even just for regular scene or camera clip durations rather than 30 minutes. They should at least get some improvement with better heatsink tech combined with QD-OLED tech's ability to produce brighter color volumes at somewhat lower energy states than WOLED. For that reason I'd be much more interested in a very good heatsink-model OLED (for media/games) in the next few years than a curved-screen one, if I had to choose between the two.

I really like seeing and learning how HDR performs in general on real-world screens of today, but also in theory, if we had much brighter screens (much brighter even than falling off somewhere and then capping around HDR1400 or so) - so all of these heat maps in this thread and the other link are a lot of fun to check out for both reasons. It's what made me look into this thread again: seeing your posts of the color volume maps and your mention of the sustained brightness on the UCG. I'd love to have a 1,400-2,000 nit or higher screen with some of my other favorite specs (incl. dimensions) someday. I also look forward to years from now, when we might get HDR10,000 (microLED?) displays and maybe some level of HDR in VR/AR glasses with decent PPD. All good discussion feeding my head (and eyeballs). (y)

.
Compared to the original image, the second image looks a lot better.
.
Be careful that the higher output doesn't crush color/texture detail and contrast on the tree, for example. The colors in the first image should all shift in a relative way in the second picture rather than becoming a uniform blob. Where the yellow-orange from the first pic becomes red in the second pic, the red detail in the first pic should probably shift to a higher color: yellow -> yellow-orange or orange, orange -> orange-red or red, red -> lighter red or leaning toward hot pink (or higher).

This same kind of thing happens when people use dynamic settings instead of static metadata curves. The lower color volumes get pumped up and step on top of the higher ranges rather than maintaining a gradient upward relative to each other - so they flatten out or clip the detail-in-color. It's brighter color overall, but flat, with lost detail, so it's actually worse.
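A toy numerical version of that flattening, assuming a hypothetical 800 nit panel and a crude gain-then-clip "dynamic" mode (not any real TV's algorithm, just the failure mode): four distinct highlight steps survive a static roll-off, but merge into one clipped blob after the boost.

```python
PEAK = 800                                   # hypothetical panel peak, nits

def dynamic_boost(nits, gain=1.8):
    """Naive 'dynamic' lift: brighter everywhere, hard clip at the peak."""
    return min(nits * gain, PEAK)

def static_rolloff(nits, knee=500, content_peak=10000):
    """Static curve: accurate to a knee, then monotonic compression."""
    if nits <= knee:
        return nits
    return knee + (nits - knee) / (content_peak - knee) * (PEAK - knee)

highlights = [450, 600, 800, 1000]           # four distinct detail steps
print([round(dynamic_boost(n)) for n in highlights])   # [800, 800, 800, 800]
print([round(static_rolloff(n)) for n in highlights])  # [450, 503, 509, 516]
```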

529993_sfRuldn.png


529995_dqUEC1E.png
 
Anyone have thoughts on the ViewSonic version of this monitor? I've read some anecdotal reviews from people who previously had the PG32UQX stating that the blooming is less (algorithm modified?). With a discount code you can get the ViewSonic for $2,149 before tax.
 
The ViewSonic also has a better/cleaner TestUFO if you compare side by side, so some improvement in OD tuning has occurred. It also has a newer G-Sync module, since it supports Reflex. I know someone who bought a refurbished one from ViewSonic Direct on eBay for $1,850 and is happy.
 
Be careful that the higher output doesn't crush color/texture detail and contrast on the tree, for example. The colors in the first image should all shift in a relative way in the second picture rather than becoming a uniform blob. Where the yellow-orange from the first pic becomes red in the second pic, the red detail in the first pic should probably shift to a higher color: yellow -> yellow-orange or orange, orange -> orange-red or red, red -> lighter red or leaning toward hot pink (or higher).
This is why I still prefer the RGB waveform, where you see directly how many nits it has. The contrast is also increased in every part relatively, compared to the first image.
52343222558_4ba9c540fe_o_d.png

This same kind of thing happens when people use dynamic settings instead of static metadata curves. The lower color volumes get pumped up and step on top of the higher ranges rather than maintaining a gradient upward relative to each other - so they flatten out or clip the detail-in-color. It's brighter color overall, but flat, with lost detail, so it's actually worse.
You have it reversed. Only passing through static metadata will cause clipping when the input is beyond the output limits.
Unlike Horizon Zero Dawn or God of War, which pass through static data, Elden Ring uses dynamic data processing. The problem with Elden Ring is that the offset is too much (maybe it is a bug); the overall brightness is a lot lower.
And there are two stages of tone mapping. The first happens in the game, where the software processes the data. Then the second happens on the monitor, where the hardware processes the data using a built-in algorithm such as Dolby Vision or a custom one. The hardware processing overrides the software.
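A minimal sketch of that two-stage order of operations (the function names and curves are mine, purely illustrative, not any real game's or monitor's code):

```python
def game_stage(scene_nits, slider_peak=1000):
    """1st stage (software): the game's own mapping, driven by its
    HDR sliders - here simply capped at the slider's peak setting."""
    return min(scene_nits, slider_peak)

def display_stage(signal_nits, panel_peak=800, knee=500):
    """2nd stage (hardware): the monitor's fixed built-in curve always
    runs last, so it overrides whatever the software produced."""
    if signal_nits <= knee:
        return signal_nits
    return knee + (signal_nits - knee) / (10000 - knee) * (panel_peak - knee)

print(display_stage(game_stage(4000)))  # 4000 -> 1000 (game) -> ~516 (panel)
```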
 
Anyone have thoughts on the ViewSonic version of this monitor? I've read some anecdotal reviews from people who previously had the PG32UQX stating that the blooming is less (algorithm modified?). With a discount code you can get the ViewSonic for $2,149 before tax.

Worth noting that while the UCG has HDMI 2.1, the ASUS PG32UQX and the ViewSonic XG321UG do not.

From TFT Central's review of the PG32UQX:
"The PG32UQX offers a good range of connectivity with 1x DisplayPort 1.4 (with DSC support) and 3x HDMI 2.0 offered for video connections. There is no HDMI 2.1 offered,"
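A quick back-of-the-envelope on why that matters for this panel (raw pixel data rates only, ignoring blanking overhead; the usable link rates are approximate):

```python
# 4K 144 Hz, 10-bit RGB = 30 bits per pixel of raw video data
gbps = 3840 * 2160 * 144 * 30 / 1e9
print(round(gbps, 1))   # ~35.8 Gbps required
# vs approximate usable link rates:
#   HDMI 2.0      ~14.4 Gbps  -> not even close
#   DP 1.4 HBR3   ~25.9 Gbps  -> needs DSC to carry 4K 144 Hz 10-bit
#   HDMI 2.1      ~42.7 Gbps  -> fits uncompressed
```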
 
This is why I still prefer the RGB waveform, where you see directly how many nits it has. The contrast is also increased in every part relatively, compared to the first image.

You have it reversed. Only passing through static metadata will cause clipping when the input is beyond the output limits.
Unlike Horizon Zero Dawn or God of War, which pass through static data, Elden Ring uses dynamic data processing. The problem with Elden Ring is that the offset is too much (maybe it is a bug); the overall brightness is a lot lower.

I'm saying that when people turn on dynamic tone mapping, it pushes the colors up, stepping on the higher parts of the color gradient of the regular curve, flattening the color detail into a blob like the image I linked - and like your tree example visibly showed, as one red blob and bright area in the 2nd pic lacking detail-in-color. The contrasted areas on the tree and tree trunk in the first pic should scale up color-wise in the heat map, maintaining a difference, and thus detail, relative to each other. Vincent shows this loss of detail in a lot of his TV review vids.
 


If you engage dynamic tone mapping, everything becomes brighter, including the shadow detail and the mid-tones as well - just washing out the image. The thing people don't understand is that they equate a brighter image with a better image in terms of "pop," but that is not the case. For me, a bright object will look better against a dark background compared with, say, a bright object against a brighter background that has been lifted in APL or luminance by artificial means such as dynamic tone mapping. As the villain in The Incredibles said: "If everyone is special, no one is special."

- you may be blowing out highlight detail. You may be lifting the shadow regions, causing a more washed-out picture. By brightening the mid-tones as well, you are actually losing dynamic range, losing "pop."

2WFRhUz.png
 
I'm saying that when people turn on dynamic tone mapping, it pushes the colors up, stepping on the higher parts of the color gradient of the regular curve, flattening the color detail into a blob like the image I linked - and like your tree example visibly showed, as one red blob and bright area in the 2nd pic lacking detail-in-color. The contrasted areas on the tree and tree trunk in the first pic should scale up color-wise in the heat map, maintaining a difference, and thus detail, relative to each other. Vincent shows this loss of detail in a lot of his TV review vids.
We are talking about software-level tone mapping.
If you check the pictures of the heat map, the second image has more details than the first one. It's aligned with the RGB waveform.
Dynamic tone mapping won't have the clipping; it keeps the relative contrast. The offset will lower the APL to bring the highlights inside the range while keeping the relative contrast of each part of the screen.
 
This is just LG's naming for their hardware tone mapping algorithms.
In a brighter mode, the highlight part is kept above the display's output, so it can have more APL while keeping the relative contrast of the rest of the screen; it has clipping.
Another mode might keep all the highlights inside the output while it has lower APL.
Considering it's an OLED, it's not surprising it needs multiple algorithm options to display different scenes in order to achieve higher APL.
 
...everything becomes brighter, including the shadow detail and the mid-tones as well - just washing out the image. The thing people don't understand is that they equate a brighter image with a better image in terms of "pop," but that is not the case. For me, a bright object will look better against a dark background compared with, say, a bright object against a brighter background that has been lifted in APL or luminance by artificial means such as dynamic tone mapping. As the villain in The Incredibles said: "If everyone is special, no one is special."

- you may be blowing out highlight detail. You may be lifting the shadow regions, causing a more washed-out picture. By brightening the mid-tones as well, you are actually losing dynamic range, losing "pop."
- Vincent Teoh (HDTVTEST)


znUdyS8.jpg



game_tone-mapped-badly_A.png


RJHpg16.png



game_tone-mapped-badly_HeatMap_A.png

 
Well, you are definitely wrong now. You are confused about more than one aspect.

1. The images are HDR. Remember, the converted SDR images mapped down from them will be capped to 100 nits, so they have more blown-out parts at higher APL.
That's why we need the heat maps as well as the RGB waveforms. If you compare the heat maps closely, the second one definitely has more details than the first one. The heat map outline and silhouette are exactly the same; there isn't any clipping on highlights at all. The RGB waveform shows clear as day that the second image has more contrast, and I can see every detail preserved in the second HDR image.

52343528939_3430f00a5f_o_d.png


2. Maybe you cannot see the HDR images accurately on your display. The hardware tone mapping is not included here, and your display is doing another tone mapping, giving you a clipped view. So you assume the SDR representation is the same as the HDR.

3. You are using Control to compare HDR. It is an SDR game. Of course the details of the SDR images will easily show the difference, but the details of HDR images will not be preserved in SDR. However, they are aligned with the heat maps and RGB waveforms.

4. Software tone mapping and hardware tone mapping happen at two different stages. The original metadata is the input for the software's first-stage tone mapping output. Then the first-stage output is the input for the hardware's second-stage tone mapping. You talk about them as if they were the same one.
 
Funny how people complain about HDR > 1,000 nits being so bright it strains their eyes... ever stepped outside for a minute? Or even turned a light bulb on? Seriously lol…

Wonder why studios mastering for formats like Dolby Vision are targeting 4,000-10,000 nit HDR media? I guess they're all idiots. The holy grail of display technology is to mimic reality, isn't it?
 
It's not that the peaks are too high, and it's not that the distance between the mids and highs was preserved after they were all scaled up.

Vincent explained what I was seeing better than I could. He's saying DON'T ramp the mid-tones up at the same time as the highs. Scaling everything up to the same degree is not best. He's saying raising the mid-tones blows out the image, because contrast and a larger dynamic range are more meaningful than turning the mid-tones up along with the highs.

...everything becomes brighter, including the shadow detail and the mid-tones as well - just washing out the image. The thing people don't understand is that they equate a brighter image with a better image in terms of "pop," but that is not the case. For me, a bright object will look better against a dark background compared with, say, a bright object against a brighter background that has been lifted in APL or luminance by artificial means such as dynamic tone mapping. As the villain in The Incredibles said: "If everyone is special, no one is special."

- you may be blowing out highlight detail. You may be lifting the shadow regions, causing a more washed-out picture. By brightening the mid-tones as well, you are actually losing dynamic range, losing "pop."
 
5. The pictures from your post are definitely about the peaks being too high, so there is heavy clipping when the data passes through instead of being mapped down. Otherwise you wouldn't use SDR Control to compare. But the HDR images are not clipped, and that Control example is all about pass-through data clipping.

6. The mid-tones can be changed or kept depending on the grading/tone-mapping process and on the scene. It is "higher mid-tone, higher APL" most of the time. The contrast of the mid-tones is larger; it looks better with high APL and higher contrast. The highlights and lowlights on either side of the mid-tones are extended, so highlights get higher while lowlights get lower or stay the same.

7. But Vincent was actually explaining that the lowlight gets raised as well in the process of LG's OLED tone-mapping algorithm.
 
Your map raised all of the yellow and orange ranges on the tree, and the darker green area on the right. He's saying that quality-wise, maintaining the mid-tone levels and shadow-detail levels rather than raising them is best. Raising the high end away from that (within the artist's intent), alongside the darker areas, is good and increases the dynamic range greatly. Raising all of those levels into one big red tree is not doing that. "A bright object will look better against a dark background compared with, say, a bright object that is against a brighter background that has been lifted."
 
8. Maintaining the mid level you defined is clearly not the best. All the yellow is the SDR range, around 80 to 160 nits. Imagine how dull and unnatural the images would look inside this range in a wide-open area like Elden Ring's or Cyberpunk's. The mids will only increase when your monitor can deliver higher APL. So far the mids look best at 500-700 nits. That's the mid.
 
You're raising the dark green area on the right too. You need to maintain the dark parts of the scene and the darkness of the shadow detail while raising the top end's light source or bright highlight right next to them. It's about increasing the highlights and sources to contrast with the normal and middle parts of the scene, or the detail-in-darks and dark areas of the screen. Vincent has it right: "By brightening the mid-tones as well you are actually losing dynamic range."
 
Now you're talking about the lowlight instead of the highlight being clipped. Below 10 nits is the lowlight area; I kept that in mind.
There is a reason the PQ curve allocates roughly a quarter of its scale to 0-10 nits. You can open two HDR pictures in Windows Photos at the same time to compare them side by side. It's obvious the lowlights are kept or even lowered, and the mids and highs are raised. Which one looks better and more natural is rather clear.
That's what the Erdtree is supposed to look like.
 
No matter how bright your screen can get, you'll be losing range/contrast compared to if you had not lifted those other areas (in the Elden Ring shot). "By brightening the mid-tones as well you are actually losing dynamic range."
 
9. Well, you need to know how to read the RGB waveform. I have said the contrast is actually higher and the range is fully extended; how can the contrast be getting lower? You quote things from Vincent's journalism channel too much. What he says doesn't apply to everything, especially here.
This is HDR1400 territory for LCDs that can maintain brightness: the monitor will reach that highlight brightness without getting hit by ABL, maintaining the mid-tones. The dynamic range is a lot higher than what you normally view.
 
It's never going to be higher than if you didn't lift the other ranges along with it. ;)
 
10. Funny, you are in denial. Below 10 nits is not raised; it is even lower. You don't understand how the mapping works. You can use any tool to check the contrast.
The adjacent parts of each pixel are uniformly increased in contrast with an algorithm such as X + (X/10)^(1/2). So do the calculation when pixel A is increased from 100 nits to [100 + 10^0.5] nits and pixel B is increased from 50 nits to [50 + 5^0.25] nits, and see how much the A/B contrast increased. The RGB waveform shows very clearly that the image has higher contrast. The good algorithm behind the mapping increases the contrast first, lol.
 
You raised the mid-tones. Had you not lifted them alongside, you'd still have greater contrast in that scene than what you ended up with. No matter how much brighter and larger you make the gap upwards ("look, the contrast is extreme comparing this laser to a nuclear bomb!!" /s), it will always be higher contrast - and, importantly, an actually darker range you would see in the mids - had you not lifted them (and their colors in the temp map, grossly, in large areas, from green/yellow/orange).

The bulk of those mid levels below red would remain visually darker, like the 1st pic rather than the 2nd pic, raw-luminance-wise, remaining largely their original temp-map colors but with much brighter highlights and light sources mixed in at the top end, so it would look more impactful.
 
I told you to do the math. The algorithm is set there for a reason: it increases the contrast uniformly. What I use is a very advanced algorithm. So I increase your "mid-tone" from 80 nits to 160 nits, and the highlight is raised from 800 nits to 1,800 nits. Tell me which contrast is larger: 800/80 vs. 1800/160, lol.
Don't deny the math. Certain algorithms even need a license to use; that's how valuable they are. They're called Dolby Vision or Apple XDR. What would Dolby Vision be worth if it lost contrast in a brighter APL mode?
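Working both sides of that arithmetic (the numbers are the ones from this exchange):

```python
print(800 / 80)     # 10.0   original highlight-to-mid ratio
print(1800 / 160)   # 11.25  mids and highs both lifted: ratio rises a bit
print(1800 / 80)    # 22.5   highs lifted, mids held: ratio rises far more
```

So both posts can be right about different things: lifting everything can still raise the measured ratio slightly, while holding the mids and lifting only the highs raises it much further.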
 
"look the contrast is extreme comparing this laser to a nuclear bomb!!" just measure the gap! it's larger than that dark archway behind that tree that is on fire /s (exaggerating).

"By brightening the mid-tones as well you are actually losing dynamic range" compared to if you hadn't lifted the original mid tones no matter how high you make the top end. It's fine if you like seeing it better that way but you are lifting the mid tones from their more impactful and deeper original depth where they might have remained if mastered better.

. . . . . . . .

Part of Dolby Vision's PR:

"Catch nuances in every scene, from seeing the emotions change on a character's face in a dark night shot, to avoiding blown-out details under a bright sunlit scene."

"The code values are used in the most perceptually important parts of the PQ curve. This allows for exceptionally detailed shadows while allowing highlights up to 10,000 nit. "

So, based on a 10,000 nit screen's curve, obviously (other screens, like 725-800 nit and HDR1400 ones, have an accuracy fall-off and compress the top end, so the values will be somewhat different, compressed into the remaining range on the high end):

"Over half of the code values are in the zero to 100 nit range.

About 25% of the code values are in the 100 to 1000nit range."

So,
50% of the code values cover 0 to 100 nit (SDR range, and down to infinite black depth on OLED). This foundation probably remains the same even on screens that compress after the fall-off.

25% of the code values cover 100 to 1,000 nit (the bulk of which is typical mids to high mids that many HDR screens can display more or less, at least for a time, if not long-sustained).

"The remaining code values are 1000 to 10,000".
The top 25% (likely bright highlights, e.g. scintillation/glints/reflections off sources, and very bright direct light sources; see the quick check below).
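Those quoted percentages can be sanity-checked directly against the PQ curve (ST 2084), using the standard constants:

```python
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(nits):
    """ST 2084 inverse EOTF: fraction of signal range used at this level."""
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(round(pq(100), 2))             # ~0.51 -> "over half" below 100 nit
print(round(pq(1000) - pq(100), 2))  # ~0.24 -> "about 25%" for 100-1,000 nit
print(round(1 - pq(1000), 2))        # ~0.25 -> the rest covers 1,000-10,000
```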


The top end will be compressed on screens, since practically all of them are below 10,000 nit, so those last 25/25 splits won't hold exactly. For example, the HDR10,000 curve on an LG OLED is accurate to around 400 nit, then falls off, compressing the rest into the remaining ~400 nit. Not sure where the fall-off is on the ASUS PG32UQX and PG32UCG.

More from the Dolby mastering page:

"I found it helpful to focus on segmenting an image and placing different portions of the shot at different nit values where they are best represented."
 
No, it doesn't. Again, do the math; don't deny the math.

The dynamic range is larger. I have said the contrast is uniformly increased. It applies to every pixel, even below 10 nits, except there it is stretched down instead of being stretched up as the highlights are. So pixel B is increased from 80 nits to 160 nits, and pixel A is increased from 100 nits to 220 nits. Tell me which contrast is larger: 100/80 vs. 220/160.
I don't doubt you can understand it, but you actively deny it. You've denied the RGB waveforms, which are so obvious, lol.

Check out these contrasts.
52344048974_8f95e0c286_o_d.png

52344152550_4602350c61_o_d.png
 
You're just saying the same thing again, lol. It's fine if you like it better that way, but you've taken objects and areas of certain brightness/darkness in the mids and raised them. You took darker crayons and substituted brighter ones in the mid areas. It doesn't matter if you raise the top end to 100,000 and make the crayon box 1,000 crayons tall, saying "look, it's a larger contrast measurement now." You've lifted the mids, so they don't look the same, and no matter how high you push the peaks toward nuclear blinding white for a taller contrast measurement, it will never be as contrasted and as impactful as if you hadn't raised the mids along with the highs.

727255_znUdyS8.jpg
 
Last edited: