PG32UQX - ASUS 32" 4K 144 Hz HDR1400 G-Sync Ultimate

None of that is text šŸ˜‚. Text is where the scaling matters.
I was chiming in from a value perspective. Different uses, different sizes, different prices, different displays. One display isn't right for everyone, and it's all good lol, again just a different perspective :)
 
I have 27/24, 27/27/27 (16:10 for the last one), a G9, and a pre-C-series LG. All of the above. The G9 is great for productivity and games are hit or miss. The LG 4K is only good for entertainment, though I guess it could do OK at video or image editing, but nothing with text on screen (scaled, it's effectively just a 24" 1080p screen; that was 2009). The middle one is good for anything and has a ProArt display for accuracy if I have to do photo or video work, but it's 1920x1200. The rest are 1440 because I'm close to them; a 4K screen would literally dwarf me at that size.
That's awesome. šŸ‘
 
Out of curiosity, are you all running this monitor at native true 10-bit? Or do you set 12-bit, with an additional 2-bit dithering stage added by the scaler for all refresh rates (up to 144Hz)?

Wondering if it's worth it or not, or if it could actually make quality worse (not that my old eyes can tell anyway past 10-bit... lol)
 
Out of curiosity, are you all running this monitor at native true 10-bit? Or do you set 12-bit, with an additional 2-bit dithering stage added by the scaler for all refresh rates (up to 144Hz)?

Wondering if it's worth it or not, or if it could actually make quality worse (not that my old eyes can tell anyway past 10-bit... lol)
There is hardly any DSC artifacting on this monitor. You can set 12-bit (10-bit + FRC) to see the benefits of HDR 1400.
Color is lifted by brightness. A 10-bit color has 2^10 = 1024 shades. At 1,000 nits, it is very easy to see each shade. At 1,400 nits, you just begin to see the extra shades carried by 12-bit color footage, which can be fully utilized at HDR 4000.
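To put rough numbers on that, here is a back-of-the-envelope sketch (my own, assuming the standard SMPTE ST 2084 PQ curve) of how many nits a single code step covers at different brightness levels for 10-bit versus 12-bit quantization:

============

# Rough sketch: nits per code step on the SMPTE ST 2084 (PQ) curve,
# comparing 10-bit and 12-bit quantization. Constants are the standard
# PQ EOTF constants; the step sizes are approximate illustrations only.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(e):
    """Map a normalized PQ signal (0..1) to absolute luminance in nits."""
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse(nits):
    """Map absolute luminance in nits back to a normalized PQ signal."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for target_nits in (100, 1000, 1400):
    for bits in (10, 12):
        levels = 2 ** bits
        code = round(pq_inverse(target_nits) * (levels - 1))
        step = pq_eotf((code + 1) / (levels - 1)) - pq_eotf(code / (levels - 1))
        print(f"{bits}-bit @ ~{target_nits:>4} nits: one code step is about {step:.2f} nits")

============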
 
There is hardly any DSC artifacting on this monitor. You can set 12-bit (10-bit + FRC) to see the benefits of HDR 1400.
Color is lifted by brightness. A 10-bit color has 2^10 = 1024 shades. At 1,000 nits, it is very easy to see each shade. At 1,400 nits, you just begin to see the extra shades carried by 12-bit color footage, which can be fully utilized at HDR 4000.
How will 12-bit work in 144Hz mode? I know that mode is 8-bit + FRC above 120Hz, so I'm curious how it might be handled that high. Although everything I play at 4K with max settings still sits in the 120Hz range just fine so far.
 
How will 12-bit work in 144Hz mode? I know that mode is 8-bit + FRC above 120Hz, so I'm curious how it might be handled that high. Although everything I play at 4K with max settings still sits in the 120Hz range just fine so far.
DSC compresses 12-bit color data into the bandwidth of 10-bit color. Then FRC dithers the 10-bit panel so you can see more of the shades in the 12-bit signal.
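For what it's worth, the FRC part is conceptually simple. A toy sketch (not ASUS's actual scaler logic, just an illustration of temporal dithering) of how a 12-bit value can be approximated on a 10-bit panel by alternating between adjacent panel levels over a few frames:

============

# Toy illustration of 2-bit FRC (temporal dithering): a 12-bit code value
# is shown on a 10-bit panel by cycling the two nearest 10-bit levels so
# that the time-average matches the 12-bit target. Not the real scaler
# algorithm, just the idea.

def frc_frames(value_12bit, frames=4):
    base = value_12bit >> 2            # nearest lower 10-bit level
    remainder = value_12bit & 0b11     # 0..3, the fractional part in quarters
    # Show 'remainder' of the 4 frames one level higher.
    return [base + (1 if i < remainder else 0) for i in range(frames)]

target = 0b101010101011                # some 12-bit code value (2731)
frames = frc_frames(target)
print(frames)                          # [683, 683, 683, 682]
average_12bit_equivalent = sum(frames) / len(frames) * 4
print(average_12bit_equivalent)        # 2731.0 -- matches the 12-bit target

============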
 
By now at least some people can work out:
1. Without an additional algorithm, making the LEDs align exactly with the PQ EOTF gives accurate brightness but causes bloom, due to the limited number of LEDs/zones.
2. Configuring the backlight LEDs to reduce bloom can sacrifice contrast. When a signal comes in, the FPGA uses it to drive the backlight behind the content area to the intended nits, or to reduced nits. With a more refined FPGA, an additional algorithm lights the LEDs/zones around the content area by a small number of nits (a rough sketch of the idea follows below). This method is mature on the Sony M9's 96-zone AUO panel.
3. VA has better contrast but a smaller color space than IPS. HDR will look washed out on VA compared to IPS.
4. OLED has per-pixel zones, but its brightness is too low. Perceived contrast can end up even lower than on FALD LCDs.

To get a vivid HDR experience, some choices have to be made from the above.
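A very rough sketch of the trade-off in point 2 (my own simplification, not the actual FPGA algorithm): drive each zone from the brightest pixel behind it, then optionally lift neighbouring zones slightly so transitions are smoother, at the cost of some contrast around highlights:

============

# Simplified 1-D model of FALD backlight control. Each zone is driven by the
# brightest content behind it; an optional "neighbour lift" smooths zone
# transitions (less abrupt blooming edges) at the cost of raising black levels
# around highlights. This is an illustration, not ASUS's actual algorithm.

def zone_targets(pixel_nits, zones, neighbour_lift=0.0):
    per_zone = len(pixel_nits) // zones
    # Drive each zone to the brightest pixel it has to reproduce.
    targets = [max(pixel_nits[z * per_zone:(z + 1) * per_zone]) for z in range(zones)]
    if neighbour_lift <= 0:
        return targets
    lifted = targets[:]
    for z, t in enumerate(targets):
        for n in (z - 1, z + 1):
            if 0 <= n < zones:
                lifted[n] = max(lifted[n], t * neighbour_lift)
    return lifted

# A single 1,400-nit highlight on a black background (the "star field" worst case).
scene = [0] * 64
scene[32] = 1400

print(zone_targets(scene, zones=16))                       # hard zone edges
print(zone_targets(scene, zones=16, neighbour_lift=0.1))   # smoother, more glow

============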

The PG32UQX uses IPS with the widest color space and accurate EOTF brightness, without lifting surrounding zones much on Level 2. Color and contrast are accurate, and brightness is sustained. The HDR impact is more powerful.

Compared to the PG27UQ, which drives the background to higher nits to reduce blooming, the PG32UQX with its greater zone count still has less blooming in HDR scenes. The blooming on the PG32UQX is on the same level as the PG35VQ in the worst case, such as a star field.

Most people don't have these monitors. Only by comparing them side by side can you see the difference.

Reference
View attachment 494448

The camera uses the same fixed exposure. The pictures below are close to what you see in person.

PG27UQ
View attachment 494449

PG32UQX
View attachment 494450

PG35VQ
View attachment 494451

PG27UQ vs PG32UQX
View attachment 494452

PG32UQX vs PG35VQ
View attachment 494453

PG27UQ vs PG32UQX vs PG35VQ
View attachment 494454


This is the worst case in real usage. Real HDR content rarely contains star fields, so blooming is not that noticeable. In high-APL scenes, unlike other monitors hit by ABL, these monitors are unmatched because of sustained EOTF brightness, higher contrast, and a wider color space.
So how is your PG32UQX configured? Side by side, in similar dark scenes, my PG32UQX appears to have the same localized blooming that my PG27UQ has. Should I be using Level 2 instead of Level 3 for my variable backlight? I have OD set to Normal, using 12-bit color at 120Hz or 144Hz respectively (depending on what game I play that can handle higher frames at 4K).

It's also hard to tell, and maybe it's game-specific, but the HDR1400 of my PG32UQX looks about the same to my eyes as the HDR1000 of my PG27UQ. I don't know if that's because I need to configure something else or if the smaller screen of the 27UQ just makes it look that way with its smaller pixels. When I run the VESA HDR utility, it does indeed say my peak brightness is 1566 nits on the PG32UQX and only ~1186 nits on the PG27UQ.
 
So how is your PG32UQX configured? Side by side, in similar dark scenes, my PG32UQX appears to have the same localized blooming that my PG27UQ has. Should I be using Level 2 instead of Level 3 for my variable backlight? I have OD set to Normal, using 12-bit color at 120Hz or 144Hz respectively (depending on what game I play that can handle higher frames at 4K).

It's also hard to tell, and maybe it's game-specific, but the HDR1400 of my PG32UQX looks about the same to my eyes as the HDR1000 of my PG27UQ. I don't know if that's because I need to configure something else or if the smaller screen of the 27UQ just makes it look that way with its smaller pixels. When I run the VESA HDR utility, it does indeed say my peak brightness is 1566 nits on the PG32UQX and only ~1186 nits on the PG27UQ.
Blooming is the same or worse on this monitor compared to its predecessors, the PG27 and PG35, in most cases, since it favors brightness. FALD 3 is better than 2 most of the time, but you have to switch between them to see what looks better for you. 3 tends to be too aggressive for small bright lights like stars, so you get a fireworks effect, whereas FALD 2 transitions were smoother but more zones stayed lit. Another thing I found helpful was to change the color of crosshairs or similar if you can. In RE8, for example, going from white to red or green killed the halo effect on it.

In a well-lit room it will not be as noticeable a change, due to how the eye perceives light as a whole, without diving into the science behind it, if that makes sense. Regardless, highlights at least were noticeably brighter compared to my PG35, so you should notice an improvement.

Hope that helps a little! I am curious, since it sounds like this is a recent buy, what firmware or model version are you on? They never did a downloadable one to my knowledge, but I'm curious if new models are shipping with something better these days.
 
Blooming is the same or worse on this monitor compared to its predecessors, the PG27 and PG35, in most cases, since it favors brightness. FALD 3 is better than 2 most of the time, but you have to switch between them to see what looks better for you. 3 tends to be too aggressive for small bright lights like stars, so you get a fireworks effect, whereas FALD 2 transitions were smoother but more zones stayed lit. Another thing I found helpful was to change the color of crosshairs or similar if you can. In RE8, for example, going from white to red or green killed the halo effect on it.

In a well-lit room it will not be as noticeable a change, due to how the eye perceives light as a whole, without diving into the science behind it, if that makes sense. Regardless, highlights at least were noticeably brighter compared to my PG35, so you should notice an improvement.

Hope that helps a little! I am curious, since it sounds like this is a recent buy, what firmware or model version are you on? They never did a downloadable one to my knowledge, but I'm curious if new models are shipping with something better these days.
Where would I find out what firmware is running on it? OSD does not tell me, and I don't feel like ripping apart my back panel to look. AIDA64 confirms mine was manufactured in Week 21 of 2022, so it was made in the first half of this year...
 
It's also hard to tell, and maybe it's game-specific, but the HDR1400 of my PG32UQX looks about the same to my eyes as the HDR1000 of my PG27UQ. I don't know if that's because I need to configure something else or if the smaller screen of the 27UQ just makes it look that way with its smaller pixels. When I run the VESA HDR utility, it does indeed say my peak brightness is 1566 nits on the PG32UQX and only ~1186 nits on the PG27UQ.

What games are you playing? Many HDR-enabled games allow you to specify the maximum brightness (as well as minimum brightness) in HDR. If you are comparing the same games without tweaking the HDR settings, both monitors will look the same. Change the HDR peak brightness setting on the PC running the PG32UQX to 1,600 nits (not all games label this the same, so you'll have to dig in game by game) and you should see a difference. Though this will also depend on your lighting environment, as that can affect how you perceive brightness (as mentioned by another, there is a whole science behind why double the nits does not automatically mean double the perceived brightness, etc.).
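To put a rough number on that last point, a quick sketch using the common cube-root approximation of perceived lightness (one simple model, not the full science), applied to the two peak readings quoted above:

============

# Rough estimate of the perceived brightness difference between the two peaks,
# using the cube-root lightness approximation (L* roughly ~ Y^(1/3)).
# Just an illustration of why 1,566 vs 1,186 nits looks closer than the
# numbers suggest; real perception also depends on surround, adaptation, etc.

pg32uqx_peak = 1566   # nits reported by the VESA tool (from the post above)
pg27uq_peak = 1186

luminance_ratio = pg32uqx_peak / pg27uq_peak
perceived_ratio = luminance_ratio ** (1 / 3)

print(f"luminance ratio: {luminance_ratio:.2f}x")   # ~1.32x more light
print(f"perceived ratio: {perceived_ratio:.2f}x")   # only ~1.10x brighter-looking

============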
 
What games are you playing? Many HDR-enabled games allow you to specify the maximum brightness (as well as minimum brightness) in HDR. If you are comparing the same games without tweaking the HDR settings, both monitors will look the same. Change the HDR peak brightness setting on the PC running the PG32UQX to 1,600 nits (not all games label this the same, so you'll have to dig in game by game) and you should see a difference. Though this will also depend on your lighting environment, as that can affect how you perceive brightness (as mentioned by another, there is a whole science behind why double the nits does not automatically mean double the perceived brightness, etc.).
So far I've tested HDR in SOTTR, BF2042, COD: MW2, Forza Horizon 5, CP2077, RDR2, MSFS 2020... in all games that had a slider, I did make the adjustments before testing (most of which say to slide a slider until a logo disappears).
 
If I recall correctly, RDR2 does allow you to specify the peak brightness in nits as well. I'd have to double-check since it's been a while. Not sure about the others as I haven't played them yet, although I think BF2042 also does it.

Now here is where it gets tricky. The peak full-screen brightness flash rating for the PG32UQX is around 1,600 nits. After about two seconds, it falls to different levels depending on the revision you have. The initial batch with a manufacturing date of April 2021 had a sustained full-screen brightness of 1,200 nits, while subsequent batches had that lowered to 1,000 nits. Kramnelis (who posted this info earlier in this thread) is the one who alerted me to this fact. Depending on the scene you are using to compare, along with your background lighting (as well as the fact that a linear increase in nits does not equate to a linear increase in perceived brightness, etc.), the difference may be more subtle than expected. Having compared a PG32UQX and a PA32UCf side by side, I can definitely tell the difference between them.
 
If I recall correctly, RDR2 does allow you to specify the peak brightness in nits as well. I'd have to double-check since it's been a while. Not sure about the others as I haven't played them yet, although I think BF2042 also does it.

Now here is where it gets tricky. The peak full-screen brightness flash rating for the PG32UQX is around 1,600 nits. After about two seconds, it falls to different levels depending on the revision you have. The initial batch with a manufacturing date of April 2021 had a sustained full-screen brightness of 1,200 nits, while subsequent batches had that lowered to 1,000 nits. Kramnelis (who posted this info earlier in this thread) is the one who alerted me to this fact. Depending on the scene you are using to compare, along with your background lighting (as well as the fact that a linear increase in nits does not equate to a linear increase in perceived brightness, etc.), the difference may be more subtle than expected. Having compared a PG32UQX and a PA32UCf side by side, I can definitely tell the difference between them.
I'd be curious why they lowered it (unless that is posted here somewhere)... I wonder if it was overheating, causing too much haloing, etc. TBH, the main reasons I bought this monitor were essentially to up my screen size (27" is perfect for FPS MP games, but 32" at my distance is much more immersive), get DSC so I'd have full 144Hz RGB 4:4:4 at 10/12-bit, and of course the HDR. Jumping from HDR1000 to HDR1400 was not a primary reason, just a bonus for me. I got used to the halo effects on the PG27UQ in games... you'd have to look for them after a while, as I just sort of got used to it and never really noticed anymore; the perceived contrast was so much more worth it over a little haloing.
 
So how is your PG32UQX configured? Side by side, in similar dark scenes, my PG32UQX appears to have the same localized blooming that my PG27UQ has. Should I be using Level 2 instead of Level 3 for my variable backlight? I have OD set to Normal, using 12-bit color at 120Hz or 144Hz respectively (depending on what game I play that can handle higher frames at 4K).

It's also hard to tell, and maybe it's game-specific, but the HDR1400 of my PG32UQX looks about the same to my eyes as the HDR1000 of my PG27UQ. I don't know if that's because I need to configure something else or if the smaller screen of the 27UQ just makes it look that way with its smaller pixels. When I run the VESA HDR utility, it does indeed say my peak brightness is 1566 nits on the PG32UQX and only ~1186 nits on the PG27UQ.
There aren't many user settings in HDR. Backlight speed, Overdrive, and Contrast are the major settings you can change. The actual HDR settings such as Maximum Content Light Level are locked.

I use the Level 3 backlight. A fast backlight is better once the image starts to move. The backlight is supposed to be even faster in HDR 1400 to keep up with changes in image contrast.
With the same image, the blooming area on the PG32UQX is smaller than on the PG27UQ, and the contrast is higher and more accurate. There is a chance blooming is more pronounced with small highlights on a grey background, but it doesn't bother me because the overall contrast I can see is a lot higher.

OD is set to Normal most of the time. I set OD to Extreme in online FPS games.

Other settings are for SDR. I use the Adobe color space at 400 nits to emulate HDR 400 in SDR, so the settings are different from sRGB. Wide Gamut is enabled in the monitor, and the YCbCr format is enabled in the GPU driver.

Some games such as Final Fantasy VII or Resident Evil Village are not graded over 1,000 nits.
Some of Sony's games, such as God of War, Horizon, and Uncharted, have highlights at 10,000 nits, and the low end is locked. Their brightness settings adjust the mid-tone.
The new version of The Witcher 3 is rushed. There are no HDR options, and it looks about as bright as HDR 200. However, you can adjust the Contrast setting in the monitor to increase the range, or use an Nvidia filter to increase Highlights and Contrast to see a 1,500-nit sun.

Windows Auto HDR will make a game such as Battlefield 4 jump to HDR 1400. You can see the differences between the PG32UQX and PG27UQ when content is pushed to HDR 1400 rather than HDR 1000.
 
I can only guess as to why ASUS lowered it in the later revisions. Maybe they thought it was uncomfortable for the average user, because getting hit with even 1,000 nits is really eye-searing. I end up squinting when I get flash-banged in The Division 2 because it's so bright at 1,600 nits.
 
Sorry to jump in kind of off topic, but is there any reason not to use the 12-bit setting in Nvidia Control Panel? I have always used 10-bit at 144Hz instead of 12. Just curious if there will be any improvement to the image.
 
Sorry to jump in kind of off topic, but is there any reason not to use the 12-bit setting in Nvidia Control Panel? I have always used 10-bit at 144Hz instead of 12. Just curious if there will be any improvement to the image.
I've set it to 12-bit 144Hz and it works perfectly fine for me for gaming and work. It's not true 12-bit, it's 10-bit + FRC as someone else pointed out. I'm not sure what games would even use a 12-bit signal for HDR over 10-bit right now, but I guess it adds another FRC layer that takes it from 1,024 color steps to 4,096.

In short, it looks about the same to my eyes, so I use it... lol. I've tested both 10-bit and 12-bit at native 120Hz as well and my eyes couldn't tell the difference even when watching true HDR videos... lol.

Look at it this way: if you send a 12-bit signal to the monitor and only 10 bits are used, the two least significant bits are just zero padding. No harm, no foul. But if you somehow come across 12-bit HDR content, you get that extra FRC layer in the monitor to approximate those colors for your eyes.
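To make the "no harm, no foul" point concrete, here is a tiny sketch of zero-padding a 10-bit value into a 12-bit container (one common convention; purely illustrative, not what the driver literally does):

============

# Illustration: placing a 10-bit code value in a 12-bit container by padding
# the two least-significant bits with zeros. Scaling back down recovers the
# original value exactly, so sending 12-bit for 10-bit content loses nothing.

def pack_10_in_12(value_10bit):
    return value_10bit << 2            # the 2 LSBs are zero

def unpack_12_to_10(value_12bit):
    return value_12bit >> 2

original = 1023                        # brightest 10-bit code
packed = pack_10_in_12(original)       # 4092 in the 12-bit range (0..4095)
assert unpack_12_to_10(packed) == original
print(original, packed)                # 1023 4092

============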
 
Is there a native 12-bit panel available in the professional segment currently?
 
I've set it to 12-bit 144Hz and it works perfectly fine for me for gaming and work. It's not true 12-bit, it's 10-bit + FRC as someone else pointed out. I'm not sure what games would even use a 12-bit signal for HDR over 10-bit right now, but I guess it adds another FRC layer that takes it from 1,024 color steps to 4,096.

In short, it looks about the same to my eyes, so I use it... lol. I've tested both 10-bit and 12-bit at native 120Hz as well and my eyes couldn't tell the difference even when watching true HDR videos... lol.

Look at it this way: if you send a 12-bit signal to the monitor and only 10 bits are used, the two least significant bits are just zero padding. No harm, no foul. But if you somehow come across 12-bit HDR content, you get that extra FRC layer in the monitor to approximate those colors for your eyes.
Thanks for the info. I am going to leave it on as well.
 
There aren't many user settings in HDR. Backlight speed, Overdrive, and Contrast are the major settings you can change. The actual HDR settings such as Maximum Content Light Level are locked.

I use the Level 3 backlight. A fast backlight is better once the image starts to move. The backlight is supposed to be even faster in HDR 1400 to keep up with changes in image contrast.
With the same image, the blooming area on the PG32UQX is smaller than on the PG27UQ, and the contrast is higher and more accurate. There is a chance blooming is more pronounced with small highlights on a grey background, but it doesn't bother me because the overall contrast I can see is a lot higher.

OD is set to Normal most of the time. I set OD to Extreme in online FPS games.

Other settings are for SDR. I use the Adobe color space at 400 nits to emulate HDR 400 in SDR, so the settings are different from sRGB. Wide Gamut is enabled in the monitor, and the YCbCr format is enabled in the GPU driver.

Some games such as Final Fantasy VII or Resident Evil Village are not graded over 1,000 nits.
Some of Sony's games, such as God of War, Horizon, and Uncharted, have highlights at 10,000 nits, and the low end is locked. Their brightness settings adjust the mid-tone.
The new version of The Witcher 3 is rushed. There are no HDR options, and it looks about as bright as HDR 200. However, you can adjust the Contrast setting in the monitor to increase the range, or use an Nvidia filter to increase Highlights and Contrast to see a 1,500-nit sun.

Windows Auto HDR will make a game such as Battlefield 4 jump to HDR 1400. You can see the differences between the PG32UQX and PG27UQ when content is pushed to HDR 1400 rather than HDR 1000.
Have you tried the HDR in The Callisto Protocol by any chance? It's maybe the only game, or a rare case, where the HDR looks washed out :/ I have the PA32UCX, if you remember ^^'
If I boost the HDR brightness slider it gets better, but I feel like I lose the dark atmosphere.
 
Have you tried the HDR in The Callisto Protocol by any chance? It's maybe the only game, or a rare case, where the HDR looks washed out :/ I have the PA32UCX, if you remember ^^'
If I boost the HDR brightness slider it gets better, but I feel like I lose the dark atmosphere.
I haven't played that game but Nvidia overlay has a brightness/contrast filter. You can adjust the sliders there.
 
Out of curiosity, are you all running this monitor at native true 10-bit? Or do you set 12-bit, with an additional 2-bit dithering stage added by the scaler for all refresh rates (up to 144Hz)?

Wondering if it's worth it or not, or if it could actually make quality worse (not that my old eyes can tell anyway past 10-bit... lol)

As dumb as this sounds, when I upgraded to a 4090 from a 2080 Ti using DP, I can't select 12-bit in the NV options anymore; only 8 and 10 show up.
 
I can only guess as to why ASUS lowered it in the later revisions. Maybe they thought it was uncomfortable for the average user, because getting hit with even 1,000 nits is really eye-searing. I end up squinting when I get flash-banged in The Division 2 because it's so bright at 1,600 nits.

Wait till you see the flashbangs in Modern Warfare 2..
 
Wait till you see the flashbangs in Modern Warfare 2..
Then after you see the flash bang, you uninstall because you realize there are thousands of kids running around using the game's legitimized, built-in aimbot (aim assist) on console controllers, because it's cross-platform and you are trying to play fair and square with keyboard and mouse lmao.
 
Then after you see the flash bang, you uninstall because you realize there are thousands of kids running around using the game's legitimized, built-in aimbot (aim assist) on console controllers, because it's cross-platform and you are trying to play fair and square with keyboard and mouse lmao.

There are PC and console cheaters in online games too, up through the highest levels of competition. There are whole companies devoted to paid cheats as well. If you think aim assist is bad (and I agree, it is)... you might not be aware of how "fair" a lot of people have been caught playing (and those are just the ones that were caught).


<.....> That's not even going into the rabid cheating in online "competitive" games. They are smart enough to dial the cheats down to low-key cheating, so the advantages stay somewhat believable in many cases, and also use them to carry or ladder up their bros/teammates. There are a lot of articles on famously high numbers of cheaters logged by game companies in popular games, as well as high-ranked competitive and even professional gamers getting busted by the game companies or, stupidly, on stream. :LOL:




Fortnite players cheat the most in any online multiplayer game, with over 26,822,000 searches for hacks

Four pro 'Fortnite' players have been banned after FNCS cheating controversy
https://us.blastingnews.com/gaming/...fter-fncs-cheating-controversy-003096277.html

https://fortnitetracker.com/article/1236/fortnite-cheating-crisis-reaches-new-highs

. . .

ā€˜Call Of Duty: Warzoneā€™ Has Now Banned Half A Million Cheaters (may 2021)

https://www.forbes.com/sites/paulta...rzone-has-now-banned-half-a-million-cheaters/

. . .

March 2021
Hundreds of high ranked Apex Legends players just got banned for cheating

april 2021
Apex Legends Devs Looking Into Compensation For Losing To Cheaters

Feb 2021
https://charlieintel.com/apex-legends-cheaters-hit-by-huge-ban-wave-with-more-to-come/85678/

. . . . . . . . .

https://www.gizmodo.com.au/2021/03/...-game-cheating-ring-seize-millions-in-assets/


https://www.axios.com/2022/12/16/video-game-cheating-lawsuits


https://kotaku.com/xbox-automatically-banned-4-million-accounts-for-cheati-1849786570 (2022)

https://gamerant.com/overwatch-2-bans-cheaters/
Overwatch 2 First Hacker and Cheating Ban Wave šŸšØOctober 26 2022, #Overwatch2 Korean Servers 3,486 player accounts using programs not authorized by cheating and Blizzard have ban šŸš«The Overwatch 2 ban wave is just the beginning and will ban more hackers and cheaters. źøˆģ§€! pic.twitter.com/XIttLARwm0
ā€” Naeri X ė‚˜ģ—ė¦¬ (@OverwatchNaeri) October 26, 2022



https://www.talkesport.com/news/csg...ted-cheats-but-valve-has-ignored-the-problem/ (september 2022)
Valve is banning more than 300,000 players that cheated in Counter-Strike: Global Offensive as part of a VAC ban wave as part of its ongoing crackdown on malicious actors in the game.


https://www.reddit.com/r/chess/comments/y9ioza/hans_lawsuit_claims_that_chesscom_allowed_known/

159.Likewise, contrary to Chess.comā€™s self-serving contention that it merely wanted to ensure the integrity of the 2022 Chess.com Global Championship tournament, Chess.com allowed several players who had previously been banned from online chess for cheating in high profile events to participate in that tournament.
160.In fact, Sebastien Feller, a European Grandmaster who was caught cheating at the 2010 Chess Olympiad tournament and subsequently banned from participating in FIDE-sanctioned events for nearly three years, is currently playing in the same tournament as Carlsenā€”the 2022 European Club Cupā€”with no objection whatsoever from Chess.com or Carlsen. Likewise, Magnus recently played a FIDE-sanction game against Parham Maghsoodloo, who was also banned for Lichess.org for cheating. Apparently, Carlsen only reserves his protests for those who have defeated him and threaten to undermine the financial value of Carlsenā€™s brand and the Merger.

. .


https://www.fragster.com/tencent-bans-over-450,000-pubg-mobile-players-for-cheating/ (sept 2022)

https://fanspace.gg/pubg-mobile-ban-pan-report-of-late-november-2022/

As per the Ban Pan Report, PUBG Mobile removed a total of 2,64,237 accounts and 5,211 devices between 25th November to 1st December.

16% of the cheaters were suspended from the game because they were using auto-aim to shoot their enemies.

Around 61% of the accounts were banned for using X-ray vision to see the location of their opponents through the wall.

About 7% were banned for using speed hacks, to overpower their rivals, which is not a fair way to play the game.

10% of the players were suspended for using a modification of area damage, where they kill others using increased bullet damage. The remaining 6% were suspended for using third-party plugins, which allow them to modify their character.

. .

https://pubg.ac/news/65909-one-of-the-top-5-squad-players-of-pcs6-suspected-cheating

. .

https://www.gaming.net/5-pro-gamers-who-were-caught-cheating-mid-tournament/

. .

10 Pro Gamers Who Were Caught Cheating During Live Streams and/or Tournaments

. . .

https://www.sportskeeda.com/esports/5-twitch-streamers-caught-cheating-livestream

Solista is a high-ranking Valorant player who made headlines on Twitter and Reddit back in 2021. As he was streaming a competitive game on his Twitch channel, he was kicked out of the game when a message saying "error, you have been banned from playing Valorant" popped up.

Vanguard's Anti-Cheat system successfully detected the streamer's malicious and unprofessional cheats and banned him on sight. The streamer continues to stream and is currently in the Radiant Top 100 in Valorant's ranked ladder.

MrGolds was a pretty popular Call of Duty: Warzone streamer who fell off the radar after he was seen bragging about his in-game skills. He mentioned that:


"Just because I have a good recoil, I'm good at the game. This is the first time you see someone good at the game!"

Following this, he opened up the Task Manager which revealed a cheat engine running in the background of his stream. The program was developed by a cheat engine developer called EngineOwning that provides undetectable cheats for various games.


Soon enough, the Twitch streamer was banned from both the game, and from the streaming platform as well.

šŸ¤£
 
The HDR in the updated Witcher 3 looks very good. It can be scaled all the way up to 10,000 nits.

It's only a matter of time before the devs expose the slider options, which are not currently available in the game menu but are configurable in dx12user.settings in %USERPROFILE%\Documents\The Witcher 3

Under the [Visuals] section, HdrPaperWhite is the maximum mid-tone. MaxTVBrightness is the maximum highlight luminance.
In a daylight scene, the mid-tones are easily over 1,000 nits. Change these two options to match the parameters of the PG32UQX and the game will look at least 3x more impactful and more realistic than the original 200-nit SDR look.

============

[Visuals]
HdrPaperWhite=1200
MaxTVBrightness=1600

============
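If you'd rather not hand-edit the file, here is a small sketch of scripting the change (assuming the file keeps the plain Key=Value text layout shown above and lives under the documented path; back the file up first):

============

# Minimal sketch: set HdrPaperWhite / MaxTVBrightness in dx12user.settings.
# Assumes the simple Key=Value layout shown above and plain UTF-8 text;
# make a backup copy of the file before running this.
import os

path = os.path.join(os.environ["USERPROFILE"], "Documents", "The Witcher 3", "dx12user.settings")
targets = {"HdrPaperWhite": "1200", "MaxTVBrightness": "1600"}

with open(path, encoding="utf-8") as f:
    lines = f.readlines()

for i, line in enumerate(lines):
    key = line.split("=", 1)[0].strip()
    if key in targets:
        lines[i] = f"{key}={targets[key]}\n"

with open(path, "w", encoding="utf-8") as f:
    f.writelines(lines)

============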

The game can reach the same level as real-scene HDR benchmark footage. These images are not something a sub-HDR1400 monitor can showcase properly.
Heatmap 1x3 copy Scene 1.jpg


Heatmap 1x3 copy Scene 2.jpg


Heatmap 1x3 copy Bench.jpg
 
The HDR in the updated Witcher 3 looks very good. It can be scaled all the way up to 10,000 nits.

It's only a matter of time before the devs expose the slider options, which are not currently available in the game menu but are configurable in dx12user.settings in %USERPROFILE%\Documents\The Witcher 3

Under the [Visuals] section, HdrPaperWhite is the maximum mid-tone. MaxTVBrightness is the maximum highlight luminance.
In a daylight scene, the mid-tones are easily over 1,000 nits. Change these two options to match the parameters of the PG32UQX and the game will look at least 3x more impactful and more realistic than the original 200-nit SDR look.

============

[Visuals]
HdrPaperWhite=1200
MaxTVBrightness=1600

============

The game can reach the same level as real-scene HDR benchmark footage. These images are not something a sub-HDR1400 monitor can showcase properly.
View attachment 536400

View attachment 536401

View attachment 536403
This is great! Going to try this out.
 
Yeah, this is giving me a reason to reinstall that game years after I had beaten it... lol
What a difference. I actually lowered the midpoint to 1056, as that's what Windows reports for my monitor. Should all games use the peak full-screen luminance as the paper-white point?
 
HdrPaperWhite in this game functions as the maximum possible mid point. Setting it below the monitor's full-frame brightness will avoid clipping.
The gamma setting in the in-game menu functions as the minimum luminance, keeping shadows in the low range.
 
HdrPaperWhite in this game functions as the maximum possible mid point. Setting it below the monitor's full-frame brightness will avoid clipping.
The gamma setting in the in-game menu functions as the minimum luminance, keeping shadows in the low range.
Thanks for the info. I wonder how this differs in other games.
 
A game should include various HDR adjustment options, such as the ranges of shadows, mid-tones, highlights, and saturation. Most games have limited options, with just a max luminance. And there is only one grading trim, the way most movies do it.

The image on other monitors depends entirely on the tone mapping. Technically, once the image is tone mapped, it is not what the creators intended you to see. So the creators have to compromise to make the tone mapping look similar on every monitor, which means most of an HDR image ends up inside the SDR range, because most monitors can only show SDR, far from realistic, impactful HDR. This way you at least see most of what the creators, under that compromise, intended, but it only shows the tip of the HDR iceberg.

Another way to fix the issue is simply to give users various options to adjust HDR, so they can see better HDR matched to the capability of their monitors instead of one locked, uniformly compromised image.
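To make the tone-mapping point concrete, here's a toy example (a generic extended-Reinhard curve, not what any particular game or monitor actually uses) of how highlights graded up to 10,000 nits get squeezed down to a display's peak:

============

# Toy luminance tone mapper (extended Reinhard): compresses scene-referred
# luminance so that 'grading_peak' maps exactly to 'display_peak'. This is a
# generic curve for illustration only, not any specific game's mapping.

def tonemap(nits, display_peak=1400.0, grading_peak=10000.0):
    x = nits / display_peak
    w = grading_peak / display_peak
    return display_peak * (x * (1 + x / (w * w))) / (1 + x)

for nits in (100, 500, 1000, 4000, 10000):
    print(f"{nits:>5} nits graded -> {tonemap(nits):7.1f} nits on an HDR1400 display")

============

Note how even the mid-range values get pulled down, which is exactly why a locked, one-size-fits-all trim wastes capable monitors.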
 
Hello to all, thank you for your support. I hope you all are well.
I often read in reviews that OLED TVs have a high peak brightness only in small windows; at a 100% window, the peak falls into the range of 100 to 200 nits. The PG32UQX, on the other hand, if I understand correctly, is capable of maintaining 1,400 nits even in a full-screen window?
 
Hello to all, thank you for your support. I hope you all are well.
I often read in reviews that OLED TVs have a high peak brightness only in small windows; at a 100% window, the peak falls into the range of 100 to 200 nits. The PG32UQX, on the other hand, if I understand correctly, is capable of maintaining 1,400 nits even in a full-screen window?
You are correct, it's freaking bright on a full-screen flash or a brightness test. I believe after a few seconds it drops down to 1,000 nits full-screen, which is still very bright.
 
Another way to fix the issue is simply to give users various options to adjust HDR, so they can see better HDR matched to the capability of their monitors instead of one locked, uniformly compromised image.

Definitely agree the situation isn't great currently.

Monitors seem to provide a lot of information about themselves, including a certification that links to some pretty detailed requirements, so what's the barrier to games adjusting to the connected device automatically? The most I've noticed is that sometimes a game sets the peak-nits slider based on what's reported, but that's it.
 