Why OLED for PC use?

I am using that very same TaskBar Hider, but it doesn't hide the mouse cursor...

I use multiple monitors, so I can just move the cursor off of the OLED.

You can just "bury" the mouse pointer at the bottom of the screen in Windows though, so I don't see the problem.

If you have TaskBar Hider set up so that it locks the taskbar away rather than activating on mouseover, you won't be activating the taskbar by burying the mouse beneath where it would otherwise be. You might have to disable taskbarhider.exe from its menu at first, then change the mouseover option on the taskbar hiding in Windows, then re-enable taskbarhider.exe to get it to behave in the lock-away-via-hotkey way. That is, when I am using it, I hit something like Ctrl+Shift+Z to hide the taskbar and it's locked away no matter what I do with my mouse. I hit Ctrl+Shift+Z a second time and the taskbar pops up solid. There are no mouse interactions to show/hide it whatsoever.

However, you could probably use DisplayFusion to move the mouse below the screen via a hotkey, and either set up that same hotkey as a toggle to bring it back (to the middle of the screen, for example), or just drag it back up. You can tie that kind of thing to a Stream Deck button via hotkey(s) too. DisplayFusion can move your pointer wherever you want if you set the coordinates, and it can even do things like move the mouse to the app you hotkeyed (so when you hit that app's button the mouse teleports to the app's preconfigured position), or just to the center of whatever monitor # if you're using multiple monitors, etc.
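For anyone who'd rather script the park/restore trick than run DisplayFusion, here's a minimal sketch assuming Windows and Python's ctypes. SetCursorPos/GetCursorPos/GetSystemMetrics are standard Win32 calls; the function names and the hotkey wiring are my own invention:

```python
# Park the cursor in the bottom-right corner and restore it later.
# Windows-only sketch; bind these to whatever hotkey tool you like
# (AutoHotkey, a Stream Deck action, etc.).
import ctypes

user32 = ctypes.windll.user32

class POINT(ctypes.Structure):
    _fields_ = [("x", ctypes.c_long), ("y", ctypes.c_long)]

_saved = POINT(0, 0)  # where the cursor was before parking

def park_cursor():
    """Remember the cursor position, then shove it into the corner."""
    user32.GetCursorPos(ctypes.byref(_saved))
    w = user32.GetSystemMetrics(0)   # SM_CXSCREEN: primary display width
    h = user32.GetSystemMetrics(1)   # SM_CYSCREEN: primary display height
    user32.SetCursorPos(w - 1, h - 1)  # Windows clamps to the visible area

def restore_cursor():
    """Put the cursor back where it was."""
    user32.SetCursorPos(_saved.x, _saved.y)
```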

Just dragging the cursor below the screen works well enough though, ez.


You might check this out though. Seems to work well even in Windows 11:

https://www.softwareok.com/?seite=Microsoft/AutoHideMouseCursor

https://www.majorgeeks.com/files/details/autohidemousecursor.html
 
I am just using the default taskbar auto-hide for now, but might consider trying the ones you folks are using.

When I leave the computer I minimize my window, make sure the taskbar is hidden with the Windows key, and then move the mouse off the screen. The background changes daily so there's nothing too static, and the monitor sleeps after 5 minutes anyways.
 
Not to defend Mr. "200 nits" (I'm guessing you are arguing with him, I have him blocked), but I will say you have to be careful with RTings because they have their own biases, in their case for contrast ratio. If you read a lot of their reviews, as I have, you notice that something that is real, real important to them is a high contrast ratio. Now I don't disagree that a higher contrast ratio is nicer, but I also don't feel it is as big a deal as they do. I'm an IPS fan, for TVs as well as monitors, and I'm OK with the lower contrast ratio as a tradeoff.
Pretty much the whole section has him blocked, lol.
I'd love 300-350 nits of 100% full-screen sustained brightness though. 300 nits with OLED's infinite contrast and overall superiority is comparable to 600 nits on an LCD.
 
Ya me too. The S95B works fine for our TV use but it is a thing where if we want to use it during the day, we either need to draw the blinds or just have to deal with it being a bit dim. Another 100-150 nits would be all it would take to be bright enough for that. Brightness is still an area that OLED could use improvements in. It isn't bad where it is now, but if it gets a bit better I think it'll be to the point where even though it isn't as good as LCD, it'll be enough that in most real situations people won't care.

Likewise, the other brightness improvement it could use, particularly for the monitors, is the amount of the screen you can have in the 1000+ nit range. On the AW3423 you only get that with a 1% window, which is small for real content. The S95B TV manages it up to a 10% window, which seems to work fine for pretty much all cases. I don't actually care about, or really even want, that kind of brightness sustained full screen as it would just be eye-searing. You really only want parts of an HDR scene to be real bright, and if the whole thing gets super bright you want the TV/monitor to tone it down for you.

I'm hopeful that it won't be too long before OLED gets brightness levels close enough to LCD as to make no difference for most uses, and gets burn-in down enough to not be a real worry for computers.
 

I agree that full-field brightness is a somewhat useless metric. But alongside the 10% window, I do feel that 25% and 50% windows should also be taken into consideration, and this is where OLED currently falls off a cliff. If they can get the 10-50% window sizes bright enough, then I say that's good enough; no need for 1000+ nits sustained full-field, as that just sounds ridiculous. So far though, even Samsung's 2023 QD-OLED hits a wall once you go past a 10% window.
 

RTings' #1 gaming monitor is the Dell AW3423DW
. . .

gamesradar's #1 gaming TV = LG C1 OLED

Forbes #1 Gaming TV Jan 2023 = LG C2

TechRadar's #1 gaming TV = LG C2

IGN 's #1 gaming TV = LG C2

eurogamer's #1 gaming TV = LG C2, C1

HowToGeek's #1 gaming TV = LG C2

. . . . . . . . .



If you are viewing any 4K screen at the human viewing angle of 50 to 60 degrees, the effective size is the same from your perspective, so the size of the screen shouldn't matter if you're using it at proper/optimal viewing angles. Sitting closer than that on a larger 4K screen in a PC scenario is going to cause some major tradeoffs to begin with.
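To put rough numbers on that, here's a sketch assuming a flat 16:9 4K panel and averaging PPD across the width, so the figures are approximate:

```python
# Same viewing angle => same effective size and same (average) PPD,
# regardless of panel size; only the required distance changes.
import math

def width_inches(diag_inches, aspect=(16, 9)):
    a, b = aspect
    return diag_inches * a / math.hypot(a, b)

def distance_for_angle(diag_inches, angle_deg):
    """Eye-to-screen distance that makes the panel span angle_deg horizontally."""
    w = width_inches(diag_inches)
    return (w / 2) / math.tan(math.radians(angle_deg) / 2)

def avg_ppd(h_pixels, angle_deg):
    return h_pixels / angle_deg  # pixels spread across the viewing angle

for size in (42, 48, 55):
    for angle in (50, 60):
        d = distance_for_angle(size, angle)
        ppd = avg_ppd(3840, angle)
        print(f'{size}" at {angle} deg: sit ~{d:.0f} in away -> ~{ppd:.0f} PPD')
# Any 4K panel at a 60 degree angle lands at ~64 PPD, and at 50 degrees
# ~77 PPD; a 42" and a 55" differ only in how far back you have to sit.
```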

I do understand the eye strain thing, as I get that from BFI/strobing displays vs. sample-and-hold. I haven't experienced it with an OLED though (not using BFI). I have a CX and a C1 and view a lot of HDR material.

These reviewers don't play games. Most of them don't even test high-end monitors. Optimum Tech at least plays games. They don't disassemble the monitors, and their reviews are not as detailed as those from a retailer like Snowman, who tests the monitors he sells.

When you start to play more games you need a brighter monitor than just 100, 160, or 250 nits. And you sit closer to a monitor than to a TV. The flickers on OLED are always there.

"The Best" usually means the average.
 
I agree that full-field brightness is a somewhat useless metric. But alongside the 10% window, I do feel that 25% and 50% windows should also be taken into consideration, and this is where OLED currently falls off a cliff. If they can get the 10-50% window sizes bright enough, then I say that's good enough; no need for 1000+ nits sustained full-field, as that just sounds ridiculous. So far though, even Samsung's 2023 QD-OLED hits a wall once you go past a 10% window.
I wouldn't mind seeing that improved for sure, but I'm not sure how needed it is. Again, the more of the screen you're talking about, the less bright you really want it to be, or it'll get too intense. Having the option is nice, of course; I'm just not sure it is that useful for real content.

I'd kinda compare brightness to SPL output of an audio system. For a reference system for movies, the standard is 105dB on the mains, 115dB on the sub peak. That is awesome... but not something you want to run at all that often. You want a brief peak to hit that hard, and then maybe only one or two channels. You don't want all your speakers blasting that as an average level on a continuous basis, it'll literally hurt your ears. Same kind of deal with brightness in HDR. You want some small areas that can get really bright, bigger areas that can get pretty bright, and the whole thing should never get all that bright. While it won't hurt you like too much SPL will, if you blast things too bright it'll be uncomfortable until your eyes adjust, and then when it gets dim you won't be able to see shit again until they adjust again.

Don't get me wrong, if there was a way to make a display that could do 10,000 nits, full field, 100% of the time without an issue, I'd say we do it. Better to have than not. But in actual use I think the brighter we are talking, the smaller the area we'd actually want it on for real content.
 
LCD = 56K modem
OLED = broadband

Once you switch from a 56K modem to broadband, you just don't go back. They will never make FALD with enough zones to prevent the blooming effect. It is either per-pixel light emission or it's not as good as OLED. The only technology that comes close to OLED is plasma, which actually has superior motion compared to OLED, but plasma has many other issues that make it unviable for PC monitors.

OLED's greatest issue is the possibility of burn-in, but that is being mitigated successfully via pixel shift, pixel refresh, and healthy/careful OLED display use habits. You can also enable temporal dithering on Nvidia (and probably AMD) cards to make sure there is beneficial non-stop pixel movement that adds to quality.

P.S. You don't have to auto-hide your taskbar, which doesn't auto-hide your mouse cursor anyway. You can just move your taskbar around (top, bottom, left, right) every 4-5 hours.

It's more like OLED is a quick Beetle with a higher power-to-weight ratio, but in the end it won't reach as high a top speed due to limited power.

A monitor without brightness has no future. OLED has been through an extra white subpixel, a QD layer, and an MLA layer. None of it changes the organic part, which doesn't output enough brightness while deteriorating faster over time, with unsolvable flickers.
 
I wouldn't mind seeing that improved for sure, but I'm not sure how needed it is. Again, the more of the screen you're talking about, the less bright you really want it to be, or it'll get too intense. Having the option is nice, of course; I'm just not sure it is that useful for real content.

I'd kinda compare brightness to SPL output of an audio system. For a reference system for movies, the standard is 105dB on the mains, 115dB on the sub peak. That is awesome... but not something you want to run at all that often. You want a brief peak to hit that hard, and then maybe only one or two channels. You don't want all your speakers blasting that as an average level on a continuous basis, it'll literally hurt your ears. Same kind of deal with brightness in HDR. You want some small areas that can get really bright, bigger areas that can get pretty bright, and the whole thing should never get all that bright. While it won't hurt you like too much SPL will, if you blast things too bright it'll be uncomfortable until your eyes adjust, and then when it gets dim you won't be able to see shit again until they adjust again.

Don't get me wrong, if there was a way to make a display that could do 10,000 nits, full field, 100% of the time without an issue, I'd say we do it. Better to have than not. But in actual use I think the brighter we are talking, the smaller the area we'd actually want it on for real content.

It's definitely helpful for those high APL scenes. Currently OLED can excel with small highlights, especially with the per-pixel dimming, but higher APL scenes tend to look duller and more washed out vs. FALDs due to ABL kicking in and nerfing the brightness down. If the 25-50% window sizes had that extra punch I would be more than satisfied, as currently high APL scenes on my CX look closer to SDR than HDR.
 
Applying this to a high-fidelity audio system would be a little different, and is possibly more analogous.
The louder it is, the more you can hear into the soundstage, without stepping over a comfortable level.
Getting the volume level correct is part of listening to my hi-fi; as loud as possible without fatigue is best, to hear as much as possible when listening critically.
I'm doing exactly this as I am typing, with a Dire Straits SACD. Awesome.

I wouldn't like to hear my system this loud all the time though.
In the same way, I wouldn't want to watch HDR on all TV channels as bright as I watch a movie.
Bright, correct-brightness HDR is a great thing for critical watching, such as an HDR movie.


I use HDR+ (fake HDR) on my Samsung TV when watching normal TV.
It's configured to be much less bright than movie HDR and really does add something wonderful to the SDR experience. Occasionally it's a little wild (such as when a lot of men are close up wearing white shirts), but that's easy to forgive.
Everyone who has seen it says leave that on.

This evening I watched a broadcast of the 1938 Robin Hood film made in Technicolour, and had to make sure of what I was watching because HDR+ made it look like a pretty recent release.
Quite amazing!
 
Applying this to a high-fidelity audio system would be a little different, and is possibly more analogous.
The louder it is, the more you can hear into the soundstage, without stepping over a comfortable level.
Getting the volume level correct is part of listening to my hi-fi; as loud as possible without fatigue is best, to hear as much as possible when listening critically.
I'm doing exactly this as I am typing, with a Dire Straits SACD. Awesome.

I wouldn't like to hear my system this loud all the time though.
In the same way, I wouldn't want to watch HDR on all TV channels as bright as I watch a movie.
Bright, correct-brightness HDR is a great thing for critical watching, such as an HDR movie.
Volume hugely influences how we perceive sound. I've got big Genelec studio speakers on my desk and they do sound awesome, but play them too loud in the room and you get ear fatigue quickly and can no longer hear everything right, as your ears adapt to try to protect your hearing. So more moderate volumes are preferable for the room, even if loud volumes are fun for a few songs.

We don't talk much about different use situations here. I sometimes watch Netflix on my Samsung Galaxy Fold 4 while in bed and it can get very bright for HDR content (I've seen something like 1300 nits claimed but haven't found any definitive reviews), to the point that something very bright on screen is downright uncomfortable to look at. By comparison watching the same thing on my LG OLED TV is not an issue - not because it doesn't get as bright, but because it's much farther away from my eyes with typically more ambient light in our living room.

A desktop display between 27-32" sits right in between the two: not as close up as a ~7" foldable phone, but much closer than a TV. I run my 28" 4K screen at around 120 nits for desktop use all day long, and at something like 200-300 nits it becomes very uncomfortable for me to view, where white text on dark backgrounds feels way too bright. So I really don't see the appeal of these high brightness levels in anything but HDR content, where the whole screen isn't super bright in most scenes. In real life, where things can be massively brighter, it's easy to avert your eyes from, say, car headlights, but with a game you can't do that unless it's VR with eye tracking.

I totally agree with the idea that OLEDs should get better in that 25-50% category and that full-field brightness is less relevant, though I guess these are somewhat tied together.
 
Volume hugely influences how we perceive sound. I've got big Genelec studio speakers on my desk and they do sound awesome, but play them too loud in the room and you get ear fatigue quickly and can no longer hear everything right, as your ears adapt to try to protect your hearing. So more moderate volumes are preferable for the room, even if loud volumes are fun for a few songs.
I covered this in the post you replied to; perhaps I should have said more?
I set the volume level to something that's comfortable for hours (usually 3hrs+), not blaringly loud.

Without pushing the volume high enough, some detail is not heard under critical listening, which is quite the experience when set up well.
Not all situations require such detailed listening, and a lower volume is good, perhaps while I'm creating or reading something, sitting with family/friends, doing housework...
Perhaps analogous to watching SDR TV on cutting-edge equipment (which can look pretty good now). Or perhaps watching at lower brightness, or on lesser equipment?
My comparison though was of audio vs. watching an HDR movie for the best possible high-end experience.
Although it's maybe not quite fair to go this far, as video technology isn't as advanced as audio equipment vs. the holy grail, i.e. I don't wish for anything better with my audio setup, but video still has a way to go and it can't be bought yet.

We don't talk much about different use situations here. I sometimes watch Netflix on my Samsung Galaxy Fold 4 while in bed and it can get very bright for HDR content (I've seen something like 1300 nits claimed but haven't found any definitive reviews), to the point that something very bright on screen is downright uncomfortable to look at. By comparison watching the same thing on my LG OLED TV is not an issue - not because it doesn't get as bright, but because it's much farther away from my eyes with typically more ambient light in our living room. ...
This goes beyond the remit of my post :)
 
The optimal viewing angle is the human 50 to 60 degree viewing angle regardless of the display size, especially at a PC... but yes, you sit a lot farther from larger screens to get that. That's a reason why micro-OLED in VR headsets' more modern pancake-lens builds will be able to reach very high HDR ranges, besides the fact that micro-OLED tech can go bright to begin with. It sits very close to your eyes, and in a boxed or more modern goggle-like design that blocks out light (though there is also mixed-reality functionality that shows the real world), so it will go brighter in effect to your eyes than if it were on something like a phone or smart watch. Like a pen light on a table aimed up at you vs. someone putting it right up near your eyeball. The problem for me with VR/MR, even as they reduce the bulk to more like goggles and include things like varifocal lenses, is that the PPD is still very poor relative to real-world screens. Especially comparing a real 4K or 8K screen in real life to a virtual/MR one side by side in real space. The day (year) that VR/MR fidelity is good enough to do that 1:1 in virtual space at something like 70 PPD, rather than rendering the whole screen/viewport/virtual world at lower PPD, will be a milestone.

. . . . .

I think we all agree in wanting the best ranges, and eventually the same in HDR4000 and HDR10,000 screens without aggressive ABL. It's just that for now we can't get it all in a per-pixel emissive technology, so as usual it comes down to tradeoffs in screen tech and what you personally put more value on between the major tradeoffs on both sides. Once we get a better overall option like microLED at consumer enthusiast price levels, no one is going to argue to go back to FALD LCD (and even there, in some cases, ABL), or to go back to OLED's ranges and ABL. (Assuming HDR2000+, HDR4000 and HDR10,000 microLED won't have to resort to ABL when we get there; we'll see.)
 
There's an increase in OLEDs being used for PC duty (whether an OLED TV or the ever-increasing selection of OLED monitors). But isn't burn-in a concern? Those using LG OLED TVs as PC monitors, have you experienced burn-in? Do you do anything "special" to prevent burn-in, or do you just use it like any other monitor?

The last monitor I was using was the AORUS FV43U. I now have the LG C2. No, I do not worry at all about burn-in. Nothing is ever just left on; on every monitor I've had, I use a screen saver and then it goes off. When I first started looking into OLED... first let me say it's wise to search, read, and listen, but in the end, for me it would have been foolish to buy something based only on what some person or site said. Lol, yes, yes, we're all experts, blah blah blah. Just buy it and test it yourself, and make sure you can return it if you don't like it. After reading and listening for many months (over a year), one would have thought "brightness" would be a huge problem based on what most people said, yet that's so far from the truth. The Aorus FV43U can get very bright, yet on my new LG C2 I've never noticed the difference at all. I don't even have it cranked up now. Oh, the colors... every game is like new now. I will never go back. I am so glad I never just listened to anyone and tried it for myself. It's like reading that if you turn off the auto-dimming on the LG you void the warranty. There is no LG statement to that effect, and there is no hidden "code" somewhere that will show LG you turned it off or on if they check.

Oh, then the "mirror" thing. Yeah, just watching all those videos you would think it's going to be a huge problem. HAHA, no lie, I never noticed it. My window never has the sun shining in, so that helps. I had the LG 48GQ900-B 48" at first. Again, reading and watching comments you would think the MATTE screen vs. glossy would be awful. Nope, I hardly noticed it. I went with the LG C2 for TV. I am one of those people you just hate who loves the soap opera effect. 43" was great; at 48" I don't really notice the size, but I really like it. Anything bigger is way too much for me.

So my advice is to read and study, but in the end never go by what someone you don't know tells you. What I found is that most people tell you things based on their personal experience, not what you're looking for. Everyone is different. Got it in Jan for $1049. I love this TV used as a monitor, which is what it was made for.
 
I didn't mean to start an audio fight or anything, just trying to use an example of where you might not want things to be too much for too long from too many sources :). My point was just that if something tries to be real bright over the whole screen, you might end up wanting ABL even if the screen could handle that brightness. Same kind of deal as to why we have volume knobs on HT systems or dynamic compression. Sometimes, you want to tone it down, sometimes Michael Bay made a movie and it is way too much all the time. Likewise you don't need or want it full blast everywhere. You want 105dB out of the mains, you probably care less about that out of the surrounds and you probably don't care at all about being able to do all channels at that level at the same time, and in fact wouldn't want that much intensity.

I feel similar with brightness on HDR screens. I want some intense brightness for a little bit, on a small area of the screen. I care less about brightness on bigger areas. I'll take screens that can do it, but if they can't I don't think it is a big deal because I imagine I'd have to tone it down if it was too bright, too large, too often.
 

I agree with you about not caring too much about sustained brightness in bigger areas, like over 50%, but I still think a 1-10% window is a little too small an area overall and should not be the only metric for determining great HDR. My CX can really excel at delivering peak highlights in dark scenes, with per-pixel dimming letting those 700 nits stand out from the dark areas, but in high APL scenes it just lacks that same punch compared to my X27.

 
I didn't mean to start an audio fight or anything, just trying to use an example of where you might not want things to be too much for too long from too many sources :). My point was just that if something tries to be real bright over the whole screen, you might end up wanting ABL even if the screen could handle that brightness. Same kind of deal as to why we have volume knobs on HT systems or dynamic compression. Sometimes, you want to tone it down, sometimes Michael Bay made a movie and it is way too much all the time. Likewise you don't need or want it full blast everywhere. You want 105dB out of the mains, you probably care less about that out of the surrounds and you probably don't care at all about being able to do all channels at that level at the same time, and in fact wouldn't want that much intensity.

I feel similar with brightness on HDR screens. I want some intense brightness for a little bit, on a small area of the screen. I care less about brightness on bigger areas. I'll take screens that can do it, but if they can't I don't think it is a big deal because I imagine I'd have to tone it down if it was too bright, too large, too often.
I agree with you for the most part.
However, full-screen very high nit output has a place.
For example, a nuclear explosion is a very short, high-intensity burst of light, same as other explosions.
It would be a short enough duration not to cause harm at, say, 4000+ nits, and it would create the shocking impact required, being 1 or 2 frames at full screen!
I don't think any display can do that with HDR1000 at the moment.

That said, I'm not for eye-searing brightness as a general thing; comfortable viewing is a must.
 

Part of Dolby Vision's PR:

"Catch nuances in every scene, from seeing the emotions change on a character's face in a dark night shot, to avoiding blown-out details under a bright sunlit scene."
"The code values are used in the most perceptually important parts of the PQ curve. This allows for exceptionally detailed shadows while allowing highlights up to 10,000 nit. "

10,000 nit Dolby HDR:

"Over half of the code values are in the zero to 100 nit range.
About 25% of the code values are in the 100 to 1000nit range."
"The remaining code values are 1000 to 10,000".

  • 50%+ of the screen displayed is at 0 to 100 nits (SDR ranges, and down to infinite black depth on OLED). This foundation probably remains the same even on screens that compress after the fall-off.
  • 25% of the screen is at 100 to 1000 nits (the bulk of which is typical mids to high-mids that many HDR screens can display more or less, at least for a time if not long sustained).
  • 25% or less at the top (likely bright highlights, e.g. scintillation/glints/reflections of sources, and very bright direct light sources).
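Those proportions fall straight out of the PQ curve itself. Here's a quick sketch of the SMPTE ST 2084 inverse EOTF (the published constants, nothing vendor-specific) that reproduces the quoted split:

```python
# PQ (SMPTE ST 2084) inverse EOTF: map luminance in nits to a normalized
# 0..1 signal value; multiply by 1023 for 10-bit code values.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_code(nits):
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(f"code at 100 nits:  {pq_code(100):.3f}")   # ~0.51 -> half the codes
print(f"code at 1000 nits: {pq_code(1000):.3f}")  # ~0.75 -> the next quarter
# So roughly 50% of the code values cover 0-100 nits, ~25% cover 100-1000,
# and the last ~25% cover 1000-10,000, as the PR blurb says.
```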

The top end will be compressed on screens, since all consumer screens are currently way below 10,000 nits, so those last 25/25 splits won't look the same. For example, the HDR10,000 curve on the first few gens of LG OLED is accurate to around 400 nits, then falls off, compressing the rest into the remaining ~400 nits (momentary ABL across dynamic media and gaming scenes aside).
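As a toy model of what that kind of rolloff looks like (not LG's actual tone mapping; the 400-nit knee and ~800-nit peak are just the figures from the example above):

```python
# Illustrative soft-knee tone map: track the source 1:1 up to a knee,
# then compress everything above it into the panel's remaining headroom.
def tone_map(nits, knee=400.0, panel_peak=800.0, source_peak=10000.0):
    """1:1 below the knee, square-root compression above it."""
    if nits <= knee:
        return nits                                 # accurate region
    x = (nits - knee) / (source_peak - knee)        # 0..1 position above knee
    return knee + (panel_peak - knee) * x ** 0.5    # rolloff into headroom

for n in (100, 400, 1000, 4000, 10000):
    print(f"{n:>6} nits in -> {tone_map(n):5.0f} nits out")
# 100 and 400 pass through untouched; 1000 lands near 500, 4000 near 645,
# and the 10,000-nit extreme is crushed into the ~800-nit panel peak.
```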


. .

I added the last cell to this well-known meme as a joke. It's not really like that in HDR material usually, for the above reasons, but like you said there are small windows where it might be.



. . .

"Fantastic Beasts" is one example I watched in dolby vision and it had a nuclear explosion scene:




It's worth noting that even the QD-LED LCDs from Samsung that go over 2000 nits (one is 4K and the other is 8K) both have aggressive ABL even though they are LED LCDs. So very high peak brightness/HDR color volume screens might have a hard time avoiding ABL without some good heatsink and cooling tech, and maybe a larger (thicker) form factor. There might be some brighter-at-lower-heat tech that could be used to help too. It might be a challenge: HDR4000 and HDR10,000 are very bright, and brightness usually means higher heat of some kind.
 
I agree with you for the most part.
However, full-screen very high nit output has a place.
For example, a nuclear explosion is a very short, high-intensity burst of light, same as other explosions.
It would be a short enough duration not to cause harm at, say, 4000+ nits, and it would create the shocking impact required, being 1 or 2 frames at full screen!
I don't think any display can do that with HDR1000 at the moment.

That said, I'm not for eye-searing brightness as a general thing; comfortable viewing is a must.
Sure, and I'm 100% for having displays that can do that. I like the idea that our displays, our speakers, etc shouldn't be the limiting factor if possible. But I also don't think it is a big deal, or the thing that needs the most focus with HDR displays. I think it is more important to have reasonable brightness over a decent area, and some great peak brightness for small parts, and if that's all that is feasible right now I'll take it.
 
OLED is never bright enough. It doesn't reach the level of "bright", let alone "too bright". Most OLEDs don't even reach 200 nits APL, and that's it. A proper daylight scene in HDR is easily over 200 nits. Just seeing SDR with a few dots of highlight is not HDR.

Before it gets brighter it needs to fix the flickers first.
 
HDR is unstandardized garbage.

At least OLED does not suffer from off-axis glow or gamma shift, slow blurry pixel transitions, or backlight bleed.

You can have erratic HDR, I'll take the OLED pros.
 
Without brightness, OLED cannot look better even if it has per-pixel dimming and a bit faster response time.

HDR is real life, with a lot more color and contrast. The range is high enough that it comes with sub-standards, and there are higher standards vs. lower standards. There are always better images at a higher range, closer to realism.
 
Subjective prattling.

Enjoy your off-axis washout, glowing corners, and smeared motion.

Your eyes must be ruined from blasting them with harmful LED sources. I have my C2 42 set to 15% brightness, and it's still too bright in some instances. I have always calibrated my displays for 110-120 nits, potentially saving my eyes.

HDR is a gimmick to sell displays. It's not even native to the film industry; it's added in post, is it not?
 
Can anyone who still has a Sony GDM-FW900 confirm whether the C9/CX/C1/C2/S95B/A95K's motion clarity is as good as a top-end CRT's, or even superior?
 
OLEDs use the sample-and-hold method of refreshing their picture by default, so they'll still have motion blur from that and won't match a CRT in terms of motion clarity. The LG OLEDs from the C8 on have black frame insertion, but the refresh rate it works at depends on the model.
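The gap is easy to put numbers on: with sample-and-hold, the perceived smear while your eye tracks motion is roughly the time each frame stays lit multiplied by the panning speed. A sketch with illustrative values (the exact BFI duty cycle varies by model):

```python
# Back-of-envelope persistence math: sample-and-hold vs BFI vs CRT.
def blur_px(persistence_ms, speed_px_per_s):
    """Approximate smear in pixels while eye-tracking a moving object."""
    return persistence_ms / 1000 * speed_px_per_s

speed = 960  # px/s, a common UFO-test style panning speed
cases = {
    "60 Hz sample-and-hold":  1000 / 60,   # ~16.7 ms lit per frame
    "120 Hz sample-and-hold": 1000 / 120,  # ~8.3 ms
    "120 Hz + BFI (half on)": 1000 / 240,  # ~4.2 ms if half the frame is black
    "CRT-like phosphor":      1.0,         # ~1 ms flash per refresh
}
for name, p in cases.items():
    print(f"{name:>24}: ~{blur_px(p, speed):4.1f} px of smear")
# 120 Hz BFI halves the smear of plain 120 Hz, but a ~1 ms CRT flash
# is still several times cleaner, matching the impressions below.
```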
 
Reading this thread is painful.

Ultimately we all have to pick the display technology with the most fitting compromises for our needs. For me, OLED burn-in is still too much of a concern and the text fringing issues seem a little annoying but everything else about it is enviable over the QLED display I'm using now. I can't blame anyone for picking OLED for PC use, image quality is great. I can't blame anyone for avoiding them as I'm doing just that for now.

Itching to see if MicroLED or similar displays can sort out the issues I have with OLED so I can finally move to a self-emissive display. The promise is display perfection... time will tell.
 
I think in the quote below Sycraft was talking about HDR brightness more generally, as in how bright it could get on very bright screens vs. more moderate ones currently. He actually said "I'm 100% for having displays that can do that" and "our displays, our speakers, etc shouldn't be the limiting factor if possible", but he also said what he personally puts more value on overall, considering the limitations we have available now across the board. That's what it comes down to, ad nauseam, in this thread. Once we can all get per-pixel emissive microLED 2000nit+, HDR4000 and HDR10,000 screens in enthusiast price ranges, I doubt anyone would argue to go back to FALD LCD and its considerable tradeoffs, or OLED and its considerable tradeoffs. We can tit-for-tat those tradeoffs forever until then, but they are huge enough tradeoffs that neither is the best choice for everyone's values across the board.

Sure, and I'm 100% for having displays that can do that. I like the idea that our displays, our speakers, etc shouldn't be the limiting factor if possible. But I also don't think it is a big deal, or the thing that needs the most focus with HDR displays. I think it is more important to have reasonable brightness over a decent area, and some great peak brightness for small parts, and if that's all that is feasible right now I'll take it.


Hardware Unboxed, Asus ProArt UCX: "Brightness, Contrast, Uniformity"


When a FALD screen gets a challenging mix of HDR brights and HDR darks in a scene, which happens throughout a lot of content, even a comparatively "high" (for now) 1400nit+ screen like the UCX only gets around 5000:1 contrast in those "difficult" mixtures/mappings of areas. Its native, non-FALD contrast is under 1300:1. In a low-brightness checkerboard test with FALD active it gets only a 4500:1 contrast ratio on those patterns, and on a high-brightness checkerboard it gets merely 3800:1. That's extremely poor, especially compared to an OLED's "infinite", or at least "ultra", black depth per pixel next to any color level it provides, as it applies to mixed parts of scenes. The FALD UCX also blooms when bright edges are defined against dark backgrounds (e.g. bright Steam interface thumbnails, Plex/Emby covers on a dark background, crosshairs; even the mouse cursor can bloom slightly) straddling anywhere across a ~7000 pixel (4K) zone (or what would be a ~15,000 pixel zone on an 8K screen), let alone those kinds of light-to-dark edges across HDR's 400-1400 nit ranges.

While you aren't always watching an actual checkerboard, the same thing really happens any time any area of bright + dark lands in the middle of cells... 7k to 15k pixels is a big field, and HDR is a big brightness-vs-darkness range to attempt to smooth across it. Tons of scenes have doorways, stairwells, darker areas cast by hair, hats, tables, trees/foliage, geology and architecture, clothing/robes/dresses, curtains casting darker areas in otherwise bright areas. Also modest-brightness or dim rooms with bright window(s), bright artificial lights in dark areas, bright (e.g. space) ships and lasers, dragon fire and spells/magic, lightsabers, starfields in space and dark areas, the light reflected off of people's eyeballs and metals in darker scenes, isolated flame, etc. Even light cast across detailed textured objects, with the dark pixels creating detail in an otherwise bright object.

And these aren't slides; they are dynamic scenes in media and games. While scenes can pause at times, those zones are otherwise being panned across all of the time as cinematography changes the camera angles/shots or pans, or a game's virtual cinematography does the same, or the player mouse-looks, uses movement keys, or pans with a controller. So you are landing on the zone fences randomly in the heat map of more static scenes, and you are jumping the fences constantly in dynamic scene flow and in gameplay.

https://tftcentral.co.uk/reviews/asus_rog_swift_pg32uqx
Back to 1300:1 contrast in SDR? Yuck.
"Note that you don’t really want to have the variable backlight operating for general desktop use, that’s better saved for HDR games and videos. For SDR general and office use just turn variable backlight off in the OSD menu we would recommend. If you use the FALD variable backlight for colour critical work then it will lead to additional inaccuracies than cannot be avoided with LCD local dimming backlights, which we explained in more detail in our LG 32EP950 OLED review here. You don’t need the local dimming for SDR content anyway."

The cells aren't small enough. The firmware lightens or darkens the entire cell of ~7000 pixels by the same amount, toning its lighting. It can't "split hairs", not even close. Actually, the opposite is true: it can end up toning more cells, the cells surrounding a contrasted area, in a type of jumbo lighting sub-sampling/anti-aliasing effect, so even more cells are affected in an attempt to make the transitions less abrupt. That's a quiltwork of sideways ice-cube trays, 7k to 15k pixels each, so the lighting is not uniform and you get lighter or darker patches, or very, very low-rez lighting-array brightness "gradients" spanning contrasted edges, with the lighting levels smoothed across a width of zones along the borders. It's almost like how text is subsample-smoothed, except very jumbo-sized: a 7k-pixel cell for each analogous smoothed "subpixel". It can bloom, or sort of reverse-bloom and darken cells that shouldn't be darkened, but otherwise that smoothing and toning method, when not overtly exhibiting bloom/dim "halos", spreads the lighting out across multiple zones when it's able to, depending on the scene elements, so the transitions in those areas are not as abrupt. It's a clever system, but it's toning any contrasted area's cells against each other all of the time, like AA or text sub-sampling in a jumbo grid of 7k pixels per cell, whether that's outright bloom/dim haloing or "anti-aliasing" the huge lighting zones against each other. So you can say "it's not blooming much" in some content, but it's lifting the black depths of some zones and lowering the brightness/color volume of others dynamically, in a wide backlight-array scar wherever high-contrast areas meet. It has to. It's only a 45x25 lighting resolution at best.
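To make the zone arithmetic concrete, here's a toy worst-case model (not the UCX firmware; the zone grid and native contrast are just the figures from the review above), where every zone contains both full-white and full-black pixels so the backlight can't dim at all:

```python
# Why a ~1152-zone FALD measures so low on checkerboards: each zone's
# backlight is one value for every pixel it covers, and the LCD in front
# can only attenuate that light by the panel's native contrast ratio.
W, H = 3840, 2160
ZONES_X, ZONES_Y = 45, 25      # the 45x25 grid referred to above
NATIVE = 1300                  # approximate native LCD contrast

px_per_zone = (W * H) / (ZONES_X * ZONES_Y)
print(f"pixels per zone: ~{px_per_zone:.0f}")   # ~7373, the "7000 pixel" cell

# Checkerboard worst case: every zone holds both white and black pixels,
# so the backlight stays fully driven and black leaks at backlight/NATIVE.
white_nits = 1000
black_leak = white_nits / NATIVE                # ~0.77 nits of glow on "black"
print(f"checkerboard contrast: ~{white_nits / black_leak:.0f}:1")  # = NATIVE
```

Real firmware dims zones partially and sacrifices some highlight brightness, which is why the measured checkerboard numbers land between native contrast and the full-field millions:1 rather than at either extreme.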

OLED has its own cons of course, and they are as considerable (among others, ABL, though the 2000nit+ Samsung QD LED LCDs also have aggressive ABL). I don't think any level-headed person disputes that both technologies have major tradeoffs. But I prefer per-pixel emissive with the current tech, on non-modified HDR curves, rather than self-distorted curves that would artificially trigger ABL more often. Currently you can only get per-pixel emissive in enthusiast price ranges with an OLED. Per-pixel emissive is a better way to display things, period. MicroLED will ultimately be per-pixel emissive. FALD is a hack we have for now because we can't do better with LCDs. It's clever and makes the most of what it can do, but it will never compare to a per-pixel emissive display method as a technical way to do things, OLED or not.

If you like the overall pros vs. cons of the large zones rather than the pros and cons of per-pixel emissive, that's fine, but that's not how everyone values things. OLED is inky black and color lighting down to the razor's-edge pixel level, splitting hairs side by side by side on over 8 million pixels throughout the entire screen. No matter what, there are 3840x2160 pixels, each with its own level independent of all the other pixels. ABL kicks them down momentarily in some dynamic scenes, which is occasional on non-deformed HDR curves, but 2000+ nit FALD LCDs aren't immune to that either so far. The higher-density FALDs are a lighting resolution of 45x25 cells at best, probably less grid-wise if you don't count the edge lights. The fact that there are really no glossy options on FALD LCDs just makes it worse, as matte-type AG hit by ambient lighting raises black levels on any screen type, so the end result is even higher than what the FALD array does alone, and it can compromise small details a bit with its sheen/frost-haze when "activated" by light hitting it and reflecting back. A glossy OLED is a gorgeous picture. Idk if I could ever personally switch to a haze-coated FALD's patchwork of very low lighting resolution, with halos and low-density lighting compensations as zone gradients/AA. I know I'd regret it. I will probably be on OLED for media and gaming until it is replaced by a per-pixel emissive tech like microLED, when that reaches enthusiast consumer prices. I'm also keeping an eye on micro-OLED goggle-like form factor VR generations with pancake lenses, varifocal lenses, HDR, and slightly better PPD, though VR's PPD is going to be very poor for my tastes for many years yet, I think. VR is going to micro-OLED in the next gens, and later on it will probably go to microLED some year, so for VR it's looking like it is all per-pixel emissive going forward. No one will ever go back to FALD if they can help it. It's a temporary masking method, like text sub-sampling compensating for low pixel density, but much worse because the cells are so large. Once the lighting resolution, or the display resolution ~ PPD, is high enough, you won't need any of those hacks anymore. 8K is getting close on the text-ss/AA front. Per-pixel lighting resolution is the goal for display tech overall.
 
Reading this thread is painful.

Ultimately we all have to pick the display technology with the most fitting compromises for our needs. For me, OLED burn-in is still too much of a concern and the text fringing issues seem a little annoying but everything else about it is enviable over the QLED display I'm using now. I can't blame anyone for picking OLED for PC use, image quality is great. I can't blame anyone for avoiding them as I'm doing just that for now.

Itching to see if MicroLED or similar displays can sort out the issues I have with OLED so I can finally move to a self-emissive display. The promise is display perfection... time will tell.

I agree there are big tradeoffs on both sides. Although, like I've said multiple times in this thread, the text-fringing issue is so vocal and somewhat overblown because most people are using larger 4K gaming TVs (42", 45", 48", 55") on a desk, instead of at a distance where they'd get the optimal 50 to 60 degree human viewing angle at 64 to 77 PPD. At sub-60 PPD on a desk, everything will fringe more, even with text sub-sampling and aggressive AA (and the 2D desktop has no SS or AA for its graphics and imagery), because the pixel sizes will be more like a 1500p screen's or worse at that kind of distance's PPD, instead of the fine pixels you should be getting from a 4K screen at the optimal viewing angle. With non-standard pixel structures the PPD matters even more; the text thing is exacerbated by low PPD. Shoehorning a large 4K screen, even a 42", onto a desk instead of mounting it separately on a slim-spined TV stand or wall mount, etc. at a more optimal viewing angle and PPD is putting a square peg into a round hole, PPD- and viewing-angle-wise.

OLED reserves the top 25% of the screen brightness for a wear-evening routine, so you won't burn in until you burn through that buffer completely. As long as you don't badly abuse the screen, that shouldn't happen for years, especially if you don't intend to use it as a desktop/app monitor and primarily play games and view dynamic media on it like I do. There are plenty of people who do use them as desktop monitors and even disable ASBL; they are probably burning through their buffer faster than I am, yet there are reports of 3+ years running of that usage. Search this thread for burn-in for simple reduction measures. OLEDs are great as media/gaming displays. If you are still that concerned about it after all of the replies in OLED threads from people who have used them for 3 or 4 years already, some companies' models have a burn-in warranty, or you can get Best Buy's. Both of those options are usually more expensive though. (The LG G line is pricier but has a burn-in guarantee, the pricey BB warranty covers burn-in, or there are other expensive gaming screens with warranties.)

The buffer seems like a decent system for increasing an OLED screen's lifespan, considering what we have for now. It's like having a huge array of candles that all burn down unevenly, but with 25% more candle beneath the table, so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet battery with an extra 25% charge module, except that after you turn on your device and start using it, you have no idea what your charge level is. You can use more power-hungry apps, disable your power-saving features and screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it, etc., and still get full-charge performance for quite some time, but eventually you'd burn through the extra 25% battery.
 
Subjective prattling.

Enjoy your off-axis washout, glowing corners, and smeared motion.

Your eyes must be ruined from blasting them with harmful LED sources. I have my C2 42 set to 15% brightness, and it's still too bright in some instances. I have always calibrated my displays for 110-120 nits, potentially saving my eyes.

HDR is a gimmick to sell displays. It's not even native to the film industry; it's added in post, is it not?
You should've realized those drawbacks of a typical LCD don't even matter on a FALD LCD. HDR is the future. OLED is an SDR monitor after all.

Just like you said, you bought an OLED and tolerate ABL and flickering just to see sRGB at 80 nits. You can keep the brightness low so you don't get burn-in. You can only see that much, within a very limited range. Claiming a monitor is too bright only exposes that your monitor is flickering.

While with a FALD LCD I can even make sRGB look similar to HDR400. Every time, a high-range image is much better on FALD than on OLED.
 
Can anyone who still has a Sony GDM-FW900 confirm whether the C9/CX/C1/C2/S95B/A95K's motion clarity is as good as a top-end CRT's, or even superior?

The CX/C1 are the only two models that support 120Hz BFI, so they can get really close to matching a CRT. Not quite the same level, but close enough that I feel in real-world usage/gaming you wouldn't care. I personally now have my CX permanently set to 120Hz BFI mode for certain types of games like Dead Cells, and it's amazing.
 
You should've realized those drawbacks of a typical LCD don't even matter on a FALD LCD. HDR is the future. OLED is an SDR monitor after all.

Just like you said, you bought an OLED and tolerate ABL and flickering just to see sRGB at 80 nits. You can keep the brightness low so you don't get burn-in. You can only see that much, within a very limited range. Claiming a monitor is too bright only exposes that your monitor is flickering.

While with a FALD LCD I can even make sRGB look similar to HDR400. Every time, a high-range image is much better on FALD than on OLED.
FALD or edge-lit does not solve the issues I stated; you are confused.
 
Can anyone who still has a Sony GDM-FW900 confirm whether the C9/CX/C1/C2/S95B/A95K's motion clarity is as good as a top-end CRT's, or even superior?
Being an owner of FW900 and Compaq 7550 CRT monitors, and having been able to test a C1 at a friend's house, I can confirm this OLED is still a little behind the motion quality of the CRTs.

On the C1 at 120Hz, with tests and the game running at a constant 120 fps, and even with OLED Motion Pro set to its best "High" mode (which was the refresh rate and motion quality combination I found to be the best), I was still able to perceive motion blurring in motion tests; in games I felt it was very close to the CRTs' motion quality.

The C1 also supports BFI at 60Hz, but I perceived minimal improvement in motion quality even at its best OLED Motion Pro "High" setting for 60Hz; still very noticeable motion blurring.

By the way, the FW900 is not the only CRT monitor able to produce the mentioned motion quality; the Compaq does just as well as the Sony. I perceive no motion quality difference between the two.
 
FALD or edge-lit does not solve the issues I stated; you are confused.
Does it even matter, if your OLED only gets 200 nits average APL at most, with even worse accuracy than FALD?

If your OLED is so good, why does nobody use it as a true HDR monitor?

It's always the same: when I say FALD LCD > OLED, you say FALD LCD has bad images, while OLED has even worse images.
 
When a FALD screen gets a challenging mix of HDR brights and HDR darks in a scene, which happens throughout a lot of content, even a comparatively "high" (for now) 1400nit+ screen like the UCX only gets around 5000:1 contrast in those "difficult" mixtures/mappings of areas. Its native, non-FALD contrast is under 1300:1. In a low-brightness checkerboard test with FALD active it gets only a 4500:1 contrast ratio on those patterns, and on a high-brightness checkerboard it gets merely 3800:1. That's extremely poor, especially compared to an OLED's "infinite", or at least "ultra", black depth per pixel next to any color level it provides, as it applies to mixed parts of scenes. The FALD UCX also blooms when bright edges are defined against dark backgrounds (e.g. bright Steam interface thumbnails, Plex/Emby covers on a dark background, crosshairs; even the mouse cursor can bloom slightly) straddling anywhere across a ~7000 pixel (4K) zone (or what would be a ~15,000 pixel zone on an 8K screen), let alone those kinds of light-to-dark edges across HDR's 400-1400 nit ranges.

While you aren't always watching an actual checkerboard, the same thing really happens any time any area of bright + dark lands in the middle of cells... 7k to 15k pixels is a big field, and HDR is a big brightness-vs-darkness range to attempt to smooth across it. Tons of scenes have doorways, stairwells, darker areas cast by hair, hats, tables, trees/foliage, geology and architecture, clothing/robes/dresses, curtains casting darker areas in otherwise bright areas. Also modest-brightness or dim rooms with bright window(s), bright artificial lights in dark areas, bright (e.g. space) ships and lasers, dragon fire and spells/magic, lightsabers, starfields in space and dark areas, the light reflected off of people's eyeballs and metals in darker scenes, isolated flame, etc. Even light cast across detailed textured objects, with the dark pixels creating detail in an otherwise bright object.

And these aren't slides; they are dynamic scenes in media and games. While scenes can pause at times, those zones are otherwise being panned across all of the time as cinematography changes the camera angles/shots or pans, or a game's virtual cinematography does the same, or the player mouse-looks, uses movement keys, or pans with a controller. So you are landing on the zone fences randomly in the heat map of more static scenes, and you are jumping the fences constantly in dynamic scene flow and in gameplay.

https://tftcentral.co.uk/reviews/asus_rog_swift_pg32uqx
Back to 1300:1 contrast in SDR? Yuck.
You still don't understand contrast. And you talk about FALD like it's edge-lit.

Don't forget FALD is implemented to boost native contrast to 1,000,000:1, with a much higher range. The dynamic range is not done at the pixel level, but given enough distance between highlight A and lowlight B, you can see an 1800-nit sun against a 0.01-nit shadow, while OLED has a 300-nit sun against a 0.01-nit shadow. The overall contrast of OLED is a lot lower than FALD.

Only banging on about pixel dimming won't work, because HDR needs both contrast and color. Yet OLED doesn't even reach more accurate contrast. OLED only has that much brightness; it won't have accurate HDR in the first place.
 
Plenty of people use OLED for HDR; you are just making sh*t up now. HDR is a gimmick anyway: 99% of the time the source is not calibrated to your display, requiring adjustment for every application, if it even works right to begin with (mode switching). Thus producing a blurry mess in high zones and crushed or washed-out low zones.

Low brightness (not an issue for me, my eyes are not ruined) and burn-in risk are the only cons of OLED. Your FALD backlight still has a shitty LCD in front of it, with all its cons intact. Even FALD is a gimmick. What classifies a display as FALD? 9 zones? How many zones?
 
Just because "plenty" of people uses OLED making it suddenly a lot better? 99% of monitors are office monitors. Funny why does source needs to be calibrated to fit the display? It's more like the display needs to be capable to show whatever source required.

It's more like you are making shit up saying HDR is a gimmick while letting OLED manufacturer fool you enough to enforce the belief that 100nits plus a few dots of highlight is HDR. That's just SDR.

You don't have FALD to see more realistic HDR1000. But you will have OLED ruin your eyes due to flickering if it ever has a chance to get a little brighter. You try to talk about backlight while FALD has DC dimming backlight far superior than OELD flickering.

The latest AW3423DW looks like crap compared to a 4-year-old FALD. When FALD display SDR, the image can look similar to HDR400. When OLED display SDR, the image only looks sRGB 80nits or DCI-P3 at most. When FALD display HDR1000 the OLED displays HDR200 with tons of ABL.

You like to see dim images with worse range of both contrast and color. Then go pick up the OLED. I'd like to see a monitor similar to real life which looks like a window.
 
The last monitor I was using was the AORUS FV43U. I now have the LG C2. No, I do not worry at all about burn-in. Nothing is ever just left on; with every monitor I've had, I use a screen saver and then it goes off. When I first started looking into OLED... first let me say it's wise to search, read, and listen, but in the end, for me, it would have been foolish to buy something based only on what some person or site said. Lol, yes yes, we're all experts, blah blah blah. Just buy it and test it yourself, and make sure you can return it if you don't like it. After reading and listening for many months (over a year), you would have thought "brightness" would be a huge problem, based on what most people said, yet that's so far from the truth. The Aorus FV43U can get very bright, yet on my new LG C2 I've never noticed a problem at all, and I don't even have it cranked up now. Oh, the colors... every game is like new now. I will never go back. I am so glad I never listened to anyone and just tried it for myself. It's like the claim that if you turn off the auto-dimming on the LG you void the warranty. There is no LG statement to that effect, and there is no hidden "code" somewhere that will show LG you turned it off or on if they check.

Oh, then the "mirror": yeah, just watching all those videos you would think it's going to be a huge problem. HAHA, no lie, I never noticed it. My window never has the sun shining in, so that helps. I had the LG 48GQ900-B 48" at first. Again, from reading and watching comments you would think the matte screen vs. glossy would be awful. Nope, I barely noticed it. I went with the LG C2 TV instead. I am one of those people you just hate who loves the soap opera effect. 43" was great; at 48" I don't really notice the difference, but I really like it. Anything bigger is way too much for me.

So my advice is: read and study, but in the end never go purely by what someone you don't know tells you. What I found is that most people tell you things based on their personal experience, not what you're looking for. Everyone is different. Got it in Jan for $1049. I love this TV used as a monitor, which is what it was made for.
I'm on an AORUS FV43U and I love how bright it gets. I can game in the summer with the sun coming in through my window and still have a great experience due to how bright this monitor can get.

I understand the FV43U leaves a lot to be desired when it comes to pixel response and dark scenes. How does your OLED compare?
 
Just because "plenty" of people uses OLED making it suddenly a lot better? 99% of monitors are office monitors. Funny why does source needs to be calibrated to fit the display? It's more like the display needs to be capable to show whatever source required.

It's more like you are making shit up saying HDR is a gimmick while letting OLED manufacturer fool you enough to enforce the belief that 100nits plus a few dots of highlight is HDR. That's just SDR.

You don't have FALD to see more realistic HDR1000. But you will have OLED ruin your eyes due to flickering if it ever has a chance to get a little brighter. You try to talk about backlight while FALD has DC dimming backlight far superior than OELD flickering.

The latest AW3423DW looks like crap compared to a 4-year-old FALD. When FALD display SDR, the image can look similar to HDR400. When OLED display SDR, the image only looks sRGB 80nits or DCI-P3 at most. When FALD display HDR1000 the OLED displays HDR200 with tons of ABL.

You like to see dim images with worse range of both contrast and color. Then go pick up the OLED. I'd like to see a monitor similar to real life which looks like a window.
Looks like a window? Maybe a window with petroleum jelly smeared all over it.

Plenty of people using OLED to consume HDR, and enjoying it, was to refute your idiotic claim that no one uses OLED for HDR. You are either just trolling or lack reading comprehension. Requiring calibration between sources for HDR content is f*cking stupid and a gimmick. Just like the gimmick of the ".1ms response" sticker on the box of your "FALD" LCD: unstandardized parameters manipulated for marketing. But any dupe that falls for that marketing would of course defend their purchase until the end of time, no matter how much buyer's remorse they have to endure.

I also don't know where you get this "flicker" on OLED from. I have not seen any, and no review I have seen mentions it. It's either in your imagination or something is wrong with your setup/display.

You got nothing, bro. FALD can be brighter, that's it, and that's not enough for most of us. You have no answer for everything people have countered with in this thread, you continue to spew the same thing over and over, and that's why you have been blocked.

Maybe they will make a laser-backlit display one day, so you can still enjoy an HDR display in your later years, when you're nearly blind from your FALD HDR1000 experience.

Enjoy your LCD. Don't bother to respond, I won't see it.
 
Kram and others are right that higher HDR brightness volumes are going to get better and better for more realism, but in regard to your eye-searing comments: the extreme end of the brightness range is about the highlights and light sources, not usually the whole scene. I just find the tradeoffs of the current, still very low-density, low "lighting resolution" FALD LCDs not worth it vs. the per-pixel emissive tech that is available now, and the better per-pixel emissive tech that will become available in the future. The current FALD options are also aggravated by the fact that pretty much all of them currently use matte-type AG coating.

In HDR, the whole screen isn't blasted typically like this:
[image]


. . .

"Fantastic Beasts" is one example I watched in dolby vision and it had a nuclear explosion scene:

[image]



It's worth noting that even the QD LED LCDs from Samsung that go over 2000 nits (one is 4k, the other 8k) both have aggressive ABL, even though they are LED LCDs. So very high peak brightness/HDR color volume screens might have a hard time avoiding ABL without good heatsink and cooling tech, and maybe a larger (thicker) form factor. There might be some brighter-at-lower-heat tech that could help too, but it will be a challenge: HDR 4000 and HDR 10,000 are very bright, and higher brightness usually means higher heat.


Instead, typical scene brightness is spread across a range. Screens map and compress the top half of the color volume/range down into whatever the particular screen is capable of.


Ultimately, HDR is trying to look more realistic. Your eyes see things in reality at those brightness ranges all the time, and you don't normally "go blind" from it. The whole screen isn't typically driven that bright in normal, non-distorted curves, even on a 10,000-nit-capable display.

Brightness will probably be a concern brought up for the next VR generations too, since the screens are right up against your eyes, and microOLED as a tech can go very bright itself, with microLED later (VR is going all per-pixel emissive, way ahead of desktop screens). So VR has potential for very high effective HDR color volumes/brightness. The scenes should still be a mix of levels, though, like a terrain relief map or a 3D audio map, so full-screen brightness shouldn't normally be a problem. If someone flashes a spotlight or flashbang at your eyes in a game, you can always close your eyes, squint, or look away like you would in real life; in VR you could probably even shield your eyes with your hand. If a whole game strobed your eyes like that constantly it could be an issue, but most media and gaming aren't like that throughout.

Part of DolbyVision's PR:

10,000 nit Dolby HDR:
  • 50%+ of the screen is at 0 to 100 nits (SDR range, and down to infinite black depth on OLED). This foundation probably stays the same even on screens that compress the top end.
  • 25% of the screen is at 100 to 1000 nits (the bulk of which is typical mids to high-mids that many HDR screens can display more or less, at least for a time if not long-sustained).
  • 25% or less at the top (likely bright highlights, e.g. scintillation/glints/reflections of sources, and very bright direct light sources). See the sketch below for a rough illustration of these bands.
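As a rough illustration of those bands, here's a toy histogram over a synthetic frame. The thresholds come from the PR summary above; the frame data is made up, with distribution parameters tuned so the split roughly matches the quoted 50/25/25:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "HDR frame": lognormal luminances tuned to roughly hit 50/25/25
frame_nits = rng.lognormal(mean=np.log(100), sigma=3.4, size=1_000_000)

bands = {
    "0-100 nits (SDR range)":    np.mean(frame_nits <= 100),
    "100-1000 nits (HDR mids)":  np.mean((frame_nits > 100) & (frame_nits <= 1000)),
    ">1000 nits (highlights)":   np.mean(frame_nits > 1000),
}
for name, frac in bands.items():
    print(f"{name}: {frac:.0%} of pixels")
```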

The top end will be compressed on screens, since all consumer screens are currently way below 10,000 nits, so those last 25/25 bands won't map the same. For example, the HDR 10,000 curve on the first few gens of LG OLED tracks accurately to around 400 nits, then rolls off, compressing the rest of the range into the remaining ~400 nits (momentary ABL across dynamic media and gaming scenes aside).
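A minimal sketch of that kind of roll-off: track the source 1:1 up to a knee, then compress the rest of the 10,000-nit range into the remaining headroom. The ~400-nit knee and ~800-nit peak are the approximate LG OLED figures mentioned above; the log curve itself is illustrative, not LG's actual tone-mapping algorithm:

```python
import math

def tone_map(nits: float, knee: float = 400.0, peak: float = 800.0,
             source_max: float = 10_000.0) -> float:
    """1:1 below the knee, then log-compress [knee, source_max] into [knee, peak]."""
    if nits <= knee:
        return nits                       # accurate region: displayed as mastered
    t = math.log(nits / knee) / math.log(source_max / knee)
    return knee + t * (peak - knee)       # everything above the knee gets squeezed

for x in (100, 400, 1000, 4000, 10_000):
    print(f"{x:>6} nits mastered -> {tone_map(x):6.1f} nits displayed")
```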


So: pretty bright HDR mids of 400 to 800 - 1000 nits, along with extremely bright highlights and light sources up to 10,000 nits. It's just that now, even by comparison to HDR 4000 and HDR 10,000, the lower 1400-nit HDR screens, or the aggressive-ABL-suffering Samsung 2000nit+ QD LED LCD screens, have to suffer an extremely low FALD lighting resolution of 45x25 at best, probably lower, with all of the tradeoffs I outlined in my last reply and many others.

Per-pixel emissive is the end goal in display technology. Pancake-lens, varifocal VR headsets are all moving to microOLED per-pixel emissive, and in the future they'll move to microLED per-pixel emissive, so good VR headsets will soon be all per-pixel emissive going forward. Eventually all displays will be, in the long run, as it is a better way to display things.

FALD is clever for what it can do for now, but in its current implementation it's a very, very low lighting-resolution hack that compensates as much as it can on LCDs. The bulk of scenes are full of contrasted areas that straddle the large zones, so it's always lifting and dimming large blocks of 7,000 pixels (or 15,000 on 8k). Throughout dynamic media and games, tons of contrasted areas are never "far enough away" from straddling zones in more static scenes, or from dynamically crossing the 45x25 grid in motion. And it's not only the exact cells that are straddled or panned across: the FALD system can affect a larger spread of zones to smooth the brightening/dimming across a scar of zones surrounding the contrasted areas, like an extremely low-rez gradient smoothing of block zones, so the transitions won't look as abrupt (i.e. otherwise "glow halos", or inversely "dim halos" that lose detail and brightness; those tradeoffs still show up on some highly contrasted edges regardless). FALD needs to shrink its zones by orders of magnitude even as a stopgap tech, imo.
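To picture the zone-straddling and the "gradient smoothing" described above, here's a toy FALD driver sketch. It is purely illustrative, not any vendor's algorithm, and the frame dimensions are hypothetical, picked so a 25x45 zone grid divides evenly:

```python
import numpy as np

def zone_backlight(target_nits, zone_rows=25, zone_cols=45, smooth_passes=2):
    """Toy FALD driver: each zone is driven by the brightest pixel it must
    show, then a 3x3 neighbor blur softens zone-to-zone steps (halo spread)."""
    h, w = target_nits.shape
    levels = target_nits.reshape(zone_rows, h // zone_rows,
                                 zone_cols, w // zone_cols).max(axis=(1, 3))
    for _ in range(smooth_passes):
        padded = np.pad(levels, 1, mode="edge")
        blurred = sum(padded[i:i + zone_rows, j:j + zone_cols]
                      for i in range(3) for j in range(3)) / 9
        levels = np.maximum(levels, blurred)  # never dim below what a zone needs
    return levels

# A single 10x10-pixel, 1,000-nit highlight on an otherwise black frame:
frame = np.zeros((2250, 3825))   # hypothetical size: divides into 25x45 zones
frame[1000:1010, 2000:2010] = 1000.0
levels = zone_backlight(frame)
print(f"{(levels > 0).sum()} of {levels.size} zones lit for one tiny highlight")
```

Even this crude version shows the tradeoff: one tiny highlight ends up lighting a whole patch of zones, each tens of thousands of pixels wide, which is the bloom/halo behavior described above.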

A review of a Dolby HDR demo unit is below. Notably, they were using a backlight of 18,000 RGB LEDs on a 21" screen, which is about a 179x101 "lighting resolution". Even reference monitors have a lot less. The UCG/UCX are around 1,152 zones (probably less than a 45x25 lighting resolution) and the Samsung QD LED LCD screens have 1,344 (less than 48x27.5). So Dolby's demo display had FALD zones many times smaller relative to screen size, at many times the lighting resolution: roughly 16,700 more zones than 1,344, on a 21" 1080p screen. Much, much higher lighting resolution/density than the best current consumer FALD screens. And that demo was from 2014. It would look much better than the low-density FALD we have now (though even at that lighting resolution it would still be a tradeoff vs. a per-pixel emissive display like microLED).
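The "lighting resolution" grids quoted here follow from assuming the zones tile the panel's 16:9 aspect ratio (a sketch; real zone layouts are vendor-specific and not usually published):

```python
import math

def zone_grid(total_zones: int, aspect: float = 16 / 9) -> tuple[float, float]:
    """Estimate cols x rows for a zone count tiling a 16:9 panel."""
    cols = math.sqrt(total_zones * aspect)
    return cols, cols / aspect

for name, zones in [("Dolby demo, 18,000 LEDs", 18_000),
                    ("UCG/UCX class", 1_152),
                    ("Samsung QD FALD", 1_344)]:
    cols, rows = zone_grid(zones)
    print(f"{name}: ~{cols:.0f} x {rows:.0f}")   # ~179x101, ~45x25, ~49x27
```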

So Dolby basically threw current reference standards away. "Our scientists and engineers said, ‘Okay, what if we don’t have the shackles of technology that’s not going to be here [in the future],’" Griffis says, "and we could design for the real target — which is the human eye?" To start the process, the company took a theatrical digital cinema projector and focused the entire image down onto a 21-inch LCD panel, turning it into a jury-rigged display made up of 20,000 nit pixels. Subjects were shown a series of pictures with highlights like the sun, and then given the option to toggle between varying levels of brightness. Dolby found that users wanted those highlights to be many hundreds of times brighter than what normal TVs can offer: the more like real life, the better.

Dolby built a liquid-cooled experimental display
With that knowledge in hand, Dolby then built a 1080p, liquid-cooled experimental display with a backlight made up of 18,000 RGB LEDs (In comparison, its standard reference monitor uses a mere 4,500). With a peak brightness of 4,000 nits, it allowed the company to color grade footage with vastly improved contrast and dynamic range — and it was the same kind of monitor Dolby brought out for journalists in a side-by-side shootout with its current professional display.
While both used the same LCD panel, the difference was staggering. It wasn't just a higher-quality version of the same image; it was a new kind of imagery. With the ability to reproduce a wider range of the color gamut, images glowed luxuriously. A worker welding looked like a clipped, diffuse blur of white on the standard display; on the Dolby Vision monitor it was a sharp punch of luminescent detail. A person stood in silhouette, and when the sun peeked out from around their head, I actually squinted. Granted, a demo is always a best-case scenario when evaluating new technology, but the combination of increased detail, color reproduction, and ultrabright highlights recreated reality in a way I'd simply never seen on a television before.

As you can see, the big focus is on the highlights in that top 25% range. I agree with others that increased color volume is ultimately the goal, but the very, very low lighting-resolution/density FALD matte screens we have now are not a good tradeoff vs. per-pixel emissive tech for my values, especially in the SDR and HDR1000 content world we live in at the moment. Also note that they had to liquid-cool the display. ABL and heat reduction remain concerns as display tech gets to 2000nit+ screens and heads toward 4000-nit and someday 10,000-nit screens. Even now, the current Samsung 4k and 8k QD LED FALD LCD screens that do 2000+ nits have aggressive ABL.
 
I like how people choose to block others to avoid the truth. You'd better check your comments. And I like how you busted out "petroleum jelly" just because your cheap OLED is glossy.

You claimed HDR is a gimmick, and now you say plenty of people use OLED to consume "HDR". OLED only shows SDR. OLED is only capable of limited contrast at the low range, with 8-bit color that barely touches HDR.

It's obvious you talk like you have never seen what FALD can do with the higher range of images that are actually HDR.

It's you who has nothing, running your mouth full of nonsense about FALD brightness making people blind, while whatever dim OLED you've got still flickers regardless. It's the flickering OLED that's going to ruin your eyes if it ever gets the chance to be brighter.
 