DUAL Layer LCD Display Info Thread (Mono LCD as Backlight)

Ok, wanna give me a game that supposedly isn't too bright with HDR then? And who said I am blaming the display? I am saying that in general I don't see the need for 10,000 nit displays. Nobody said anything about blaming, I just don't think it's necessary.
Shadow of the Tomb Raider.

HDR isn't supposed to make the frame generally much brighter all the time, only those few things in the frame that should be.

I didn't say you were blaming that one display, I said you are blaming all displays that can go brighter.
After all, you said what is the point of brighter HDR.
I blamed the game and demonstrated you don't understand how it should work.
(I made an assumption you must have played other HDR games without complaint.)
You expect everything to work 100% and blew up because perhaps one game doesn't.
The problem may even be you haven't calibrated that display and it is too bright.

edit
Corrected Rise of the Tomb Raider to Shadow of the Tomb Raider.
Rise of the Tomb Raider doesn't use HDR.
 
Rise of the Tomb Raider.

HDR isn't supposed to make the frame generally much brighter all the time, only those few things in the frame that should be.

I didn't say you were blaming that one display, I said you are blaming all displays that can go brighter.
After all, you said what is the point of brighter HDR.
I blamed the game and demonstrated you don't understand how it should work.
(I made an assumption you must have played other HDR games without complaint.)
You expect everything to work 100% and blew up because perhaps one game doesn't.
The problem may even be you haven't calibrated that display and it is too bright.

Yeah, maybe I didn't. Thing is, how many people who are just average consumers do you expect to be able to properly tweak every HDR experience on a per-game, per-display basis? Most people DO expect things to just work 100% without having to do too much tweaking/fiddling. I guess I don't understand how it works, and probably neither does the vast majority of consumers, and don't you think that is kind of a problem with HDR? If 10,000 nits really is necessary, well yeah, I really don't understand it and I admit that.
 
It's a relatively new technology that is still undergoing a lot of development.
You must expect some issues, made more apparent because they're high-brightness related.
It's quite normal to adjust the brightness down (in this case probably the backlight) if it is too bright.
Although TVs are complex beasts these days.
Until HDR10+ and Dolby Vision (and any other new standards-based HDR) become the norm and TVs arrive pre-calibrated, it won't get much easier.

PS: I edited my post after stating the wrong game.
 
-----------------------------------------------------------------------------

--- This is what the uninformed think happens:

"If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.

If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display."

--- This is what really happens:


If you viewed HDR content on a 1,000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.

For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display. <---- which will show full color gradations across the color spectrum in bright highlights instead of crushing to white after hitting the SDR ceiling.


HDR enables far more vivid, saturated, and realistic images than SDR ever could.

High-brightness SDR displays are not at all the same thing as HDR displays.

-----------------------------------------------------------------------------

That is quoted from an earlier post, snippets of which were themselves taken from a tech article somewhere. I think the problem with ABL for current content - aside from it not hitting the HDR1000 and HDR1600 peaks, let alone the 4,000 and 10,000 nit levels HDR is mastered toward - is that even if a scene is mostly 600 nit color, the ABL probably keys off the highest nits in the scene. So if you have a "max nit" orb of the sun or a bright moon in the sky, its reflections on water, on the barrel of a gun, off of a car, etc., or similar scenes in space, the ABL might treat that brightest element as the 600 nit ceiling, so a scene that mostly peaks around 600 nits gets tone mapped well below 600 in relation to the brighter parts. Your 600 nit average scene could end up much dimmer. I'm not sure that is how ABL works in every case, but it would explain the much dimmer image on bright HDR content. Either way, 600 nits is typically the full/sustained scene limit, and the Dell gaming OLED reportedly isn't doing HDR at all, keeping a 400 nit limit, most likely to avoid burn-in on a computer display.
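To make that ABL guess concrete, here is a minimal sketch assuming the limiter simply scales the whole frame so the brightest element fits a sustained ceiling. The 600 nit limit and the scaling rule are assumptions for illustration, not how any particular panel's firmware actually behaves.

```python
# Minimal sketch of the ABL behaviour guessed at above (illustrative only,
# not how any specific panel firmware actually works).

SUSTAINED_LIMIT_NITS = 600  # assumed full-scene / sustained ceiling

def hypothetical_abl(scene_nits):
    """Scale the whole frame so its brightest element fits the sustained limit."""
    peak = max(scene_nits)
    if peak <= SUSTAINED_LIMIT_NITS:
        return scene_nits
    scale = SUSTAINED_LIMIT_NITS / peak
    return [n * scale for n in scene_nits]

# A mostly-600-nit scene with one 1600 nit sun orb:
scene = [600, 600, 600, 1600]
print(hypothetical_abl(scene))  # [225.0, 225.0, 225.0, 600.0]
# The 600 nit bulk of the scene drops to 225 nits, i.e. the dimming described above.
```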

If HDR were implemented properly we shouldn't be adjusting our brightness at all. It's supposed to be absolute values, not relative values where you adjust a slim SDR 2D color range up and down, losing black levels on the bottom end or color brightness on the top end. With HDR you have a tall 3D color volume that is supposed to show absolute color values through the entire range. You aren't supposed to be turning the entire display's brightness up and down. The only reason that is a thing is because HDR isn't being implemented properly on a lot of things (yet), especially games.
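Those absolute values come from the PQ (SMPTE ST 2084) transfer function that HDR10 uses. Below is a minimal sketch of the decode step plus a naive hard clip at the panel's peak; real displays usually tone map with a roll-off rather than hard-clipping, so treat the clip as an assumption for illustration.

```python
# PQ (SMPTE ST 2084) EOTF: a given code value always means the same absolute
# luminance in nits, regardless of the display it is sent to.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):                # code = PQ signal normalised to 0..1
    p = code ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

def displayed(code, panel_peak):     # naive hard clip at the panel's peak brightness
    return min(pq_to_nits(code), panel_peak)

print(round(pq_to_nits(0.58)))                       # ~202 nits, on any display
print(displayed(0.58, 500), displayed(0.58, 1000))   # identical below 500 nits
print(displayed(0.75, 500), displayed(0.75, 1000))   # ~1000 nit highlight clips to 500 on the dimmer panel
```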

HDR1000 itself is just fractional HDR. The format allows mastering up to a 10,000 nit color brightness ceiling, and many UHD HDR disc movies are currently mastered at 4,000 nits, with only a few at 10k nits so far (like Blade Runner 2049). I'm guessing that's because practically no one has displays that can show it yet.

Most of a scene stays within the SDR range, like bananadude said. A lot of people have misconceptions about how HDR works in that regard.




This is an example I found of an HDR scene in a game, tone mapped to show the targeted color brightness levels.

[Image: QfSF1rn.jpg]



I agree that you'd need a really good FALD array to get better HDR displayed. You'd likely get exacerbated flashlighting and bleed using HDR on edge-lit displays, at least on the brightest highlights/scenes. OLED can do per-pixel dimming, but it's shackled to a 600 nit color ceiling with ABL to avoid burn-in, and the screen sizes are too big IMO even for a large monitor desk setup.

In my opinion modern FALD arrays work very well, but they suffer from dimming or glow/halo since the backlight zones aren't small enough. So it's a trade-off between the dim/glow penalty on higher HDR color, or OLED's 600 nit limit imposed to avoid burn-in (with no burn-in warranty from the manufacturer at all, not even one year, despite the built-in protections), as well as OLED's screen sizes.
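Here is a toy 1-D illustration of that dim/glow trade-off, assuming each backlight zone naively drives to the brightest pixel it covers and the LC layer has a ~3000:1 native contrast; real local-dimming algorithms are much smarter, so this is only meant to show why zone count matters.

```python
# Toy 1-D model of FALD blooming: a coarse zone grid can't follow a small
# highlight, so the black pixels sharing its zone get lit up (glow/halo).
# The zone strategy and 3000:1 LC contrast are assumptions for illustration.

def fald_row(target_nits, zones, lc_contrast=3000):
    n = len(target_nits)
    per_zone = n // zones
    out = []
    for z in range(zones):
        seg = target_nits[z * per_zone:(z + 1) * per_zone]
        backlight = max(seg)             # naive: zone follows its brightest pixel
        leak = backlight / lc_contrast   # LC layer can't block all of that light
        out.extend(max(t, leak) for t in seg)
    return out

row = [0.0] * 384
row[200] = 1000.0                        # one 1000 nit highlight on black
coarse = fald_row(row, zones=8)          # 48-pixel zones
fine = fald_row(row, zones=384)          # hypothetical per-pixel backlight
print(coarse[195], fine[195])            # ~0.33 nits of glow next to the highlight vs none
```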

[Image: NVsBTV1.png]




I was hoping that the HDR1000 version of the monitor discussed in this thread would be FALD. However, there aren't that many quality HDR PC games out yet, and I wouldn't be using this monitor for movies personally. The roadmap seems to be keeping HDMI 2.1 GPUs and monitors out of reach for another year or two. In the meantime, mini-LED FALD seems to be slowly appearing in a few TVs (not MicroLED, but 1,000 to 3,000 backlight zones or more instead of 384-512). There may also be hope someday for dual-layer LCD tech, which uses a second, monochrome LCD layer as the backlight, much like OLED does per-pixel emission using all-white OLEDs, but without the burn-in risks and color brightness limitations.
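For the dual-layer idea, a back-of-the-envelope sketch of why stacking a mono LCD behind the colour LCD helps: the two panels' contrast ratios multiply, so per-pixel light control gets close to emissive black levels without OLED's burn-in and ABL constraints. All figures below are illustrative assumptions.

```python
# Back-of-the-envelope dual-layer LCD model: a monochrome LCD modulates the
# backlight per pixel, the colour LCD sits in front, and the effective
# contrast is roughly the product of the two panels' native contrasts.
# The 1000 nit backlight and 3000:1 panels are assumptions, not measurements.

BACKLIGHT_NITS = 1000.0
MONO_CONTRAST = 3000.0
COLOR_CONTRAST = 3000.0

def dual_layer_nits(target):   # target: 0.0 (black) .. 1.0 (peak white), linear light
    mono_pass = max(target, 1 / MONO_CONTRAST)    # neither layer can block below 1/contrast
    color_pass = max(target, 1 / COLOR_CONTRAST)
    return BACKLIGHT_NITS * mono_pass * color_pass

print(dual_layer_nits(1.0))   # 1000.0 nits peak
print(dual_layer_nits(0.0))   # ~0.0001 nits black -> effective contrast ~9,000,000:1
```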



I'd call these monitors good "right now" SDR gaming monitors if they perform decently, perhaps with some HDR enhancement gain. But HDMI 2.1, higher-density FALD arrays, higher HDR brightness ceilings (Samsung Q9FN TVs already do ~1,800-2,000 nits with a 512-zone backlight array), and other improvements are going to show up in the next few years.

You can always be waiting for the next best thing. Monitor tech advancement has been relatively slow up until recently though.
 
Yeah, maybe I didn't. Thing is, how many people who are just average consumers do you expect to be able to properly tweak every HDR experience on a per-game, per-display basis? Most people DO expect things to just work 100% without having to do too much tweaking/fiddling. I guess I don't understand how it works, and probably neither does the vast majority of consumers, and don't you think that is kind of a problem with HDR? If 10,000 nits really is necessary, well yeah, I really don't understand it and I admit that.
I tweak the brightness for every game I play. You won't get the optimal experience otherwise. But with HDR you're not adjusting the brightness, you're adjusting reference white. If your reference white point doesn't match your display's capabilities then you're going to have issues. I turn it up to 100 nits on my PG27UQ and then adjust the setting in games accordingly. Games like Mass Effect Andromeda and Battlefield V look amazing.
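To put numbers on the reference white idea, here is a minimal sketch, assuming a 100 nit paper white and a gamma 2.2 SDR-style curve for the non-highlight part of the image; the exact mapping varies per game, so treat this as illustration only.

```python
# Sketch of "reference white": you decide what diffuse/paper white should be
# in absolute nits, and the SDR-like portion of the image hangs off that,
# leaving everything above it for highlights. 100 nits and gamma 2.2 are
# assumed values for illustration.

PAPER_WHITE_NITS = 100.0

def diffuse_to_nits(relative, gamma=2.2):
    """Map an SDR-style relative level (0..1) to absolute nits via paper white."""
    return PAPER_WHITE_NITS * (relative ** gamma)

print(diffuse_to_nits(1.0))   # full diffuse white -> 100 nits
print(diffuse_to_nits(0.5))   # mid grey -> ~22 nits
# Highlights (sun, speculars) are then placed above 100 nits, up to the
# display's peak, instead of being crammed under the same white level.
```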
 
People seem to be forgetting that the 'R' in HDR stands for 'range'.

Meaning that you're not making stuff on average brighter or darker, but rather allowing for brighter and darker elements to be displayed concurrently.
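In numbers (typical assumed figures, not measurements of any specific display), the "range" point looks like this:

```python
# Dynamic range is the ratio between the brightest and darkest levels shown
# at the same time, often counted in stops (doublings). Figures are typical
# assumed values, not measurements.
import math

def stops(peak_nits, black_nits):
    return math.log2(peak_nits / black_nits)

print(round(stops(100, 0.1)))       # ~10 stops for a typical SDR LCD
print(round(stops(1000, 0.0005)))   # ~21 stops for a good HDR display
```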

On the desktop, this process is frustrated by the monitor industry largely targeting productivity, with color accuracy and dynamic range being secondary, as well as by operating systems' inability to cope well with differing content at the same time.

It's going to take a bit to sort all this out.
 