Would you ever go back to LCD after experiencing OLED?

At home, I watch HDR and DV content on my 48" LG C1 600nit OLED (calibrated with Calman for LG Home). When I'm on the go, I watch HDR and DV content on my MacBook Pro 14" 1600nit miniLED and the latter is very good. I also don't notice blooming. I can go back and forth between these displays with no complaints.
 
At home, I watch HDR and DV content on my 42" LG C1 600nit OLED (calibrated with Calman for LG Home). When I'm on the go, I watch HDR and DV content on my MacBook Pro 14" 1600nit miniLED and the latter is very good. I also don't notice blooming. I can go back and forth between these displays with no complaints.


Some of the major TV manufacturers are putting out 4k FALD TVs with a little more brightness and brightness-management chips, but no drastic increase in the number of backlight zones (or MLA and fully phosphorescent OLEDs across the board in their OLED lines), which would probably cut into profit. Most of them are watering down the zone count on their FALD monitors, and especially on 42" to 77" TVs. Some outliers: TCL's 115 inch LCD has 20,000 zones, and Hisense showed a 110 inch TV with 40,000 zones, but those are huge, expensive TVs most people would never buy. However, the TCL QM8 shown at CES, available in 65" (to 98") 4k, has 5,000 zones, 5,000 nit peak, 120Hz VRR, and a 240Hz game accelerator mode (3840x1080 upscaled gets 240Hz).


A 16" macbook has 10,000 miniLEDs and 2,500 zones in a small 16" screen space. The zones per degree of your vision is pretty high there. I suspect that the number of miniLEDs is also helping a bit even though the zone count is 2,500. It may be that the dimming system works so well through buffering or something, too. But they are very slow, so motion is smeary/blurry/ghosting even though they are 120Hz.
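As a rough check of that zones-per-degree point (the panel width, zone grid, and viewing distance below are my assumptions for the sketch, not measured values):

```python
import math

# Assumptions: 16" 16:10 panel, ~2500 zones in a ~64x40 grid, 50 cm viewing distance
diag_in = 16.0
aspect_w, aspect_h = 16, 10
width_in = diag_in * aspect_w / math.hypot(aspect_w, aspect_h)   # ~13.6" wide
distance_in = 50 / 2.54                                          # ~19.7" away
zones_horizontal = 64                                            # 64 x 40 = 2560 zones

# Horizontal field of view the panel subtends, then zones per degree of vision
fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
zones_per_degree = zones_horizontal / fov_deg
print(f"horizontal FOV: {fov_deg:.1f} deg, zones/deg: {zones_per_degree:.2f}")
```

That comes out to well over one zone per degree at laptop distance, which is far denser (per degree of vision) than a typical FALD TV across a room.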

So, when doing comparisons, while it is a valid point to bring up about available FALD tech vs OLED tech in general, I would discount the macbook from being an apples-to-apples comparison of FALD gaming monitors/TVs to OLED gaming monitors/TVs. For most people it would be considered firmly a non-gaming-capable display, since it has response times like an LCD TV from generations ago.


From Rtings review of the macbook M3 2023

Unfortunately, the response time is quite slow, resulting in visible ghosting behind fast-moving objects.

A reddit reply about the macbooks. It's probably actually more than 4x; it could be 5x or more on some transitions.
To give some context, <5ms response times are ideal but the bare minimum expectation is to be at least under the pixel refresh window. For a 120Hz display, the pixel refresh window is 8.33ms and this display takes like 4x longer to transition, which means the display has already refreshed four times in the time the pixels are changing from the first refresh. This is what leads to the long trails and blur while scrolling.
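The arithmetic in that reply, as a quick sketch (the 35 ms GtG figure is illustrative, roughly the ballpark reviewers have measured for these panels, not an exact value):

```python
# Refresh window vs. pixel response time (illustrative numbers)
refresh_hz = 120
frame_ms = 1000 / refresh_hz          # pixel refresh window: ~8.33 ms
response_ms = 35                      # assumed GtG transition time for the sketch
refreshes_during_transition = response_ms / frame_ms

print(f"refresh window: {frame_ms:.2f} ms")
print(f"refreshes elapsed mid-transition: {refreshes_during_transition:.1f}")
```

So by the time a pixel finishes one transition, the panel has already refreshed four-plus times, which is exactly the long-trail scrolling blur described above.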

From notebookcheck review of the MBP 16 2023 M3:

(screenshot of the review's response time measurements)

. .

. .

. . . . . .

Very few options for much over ~ 1800 to 2500 zones. The TCL one is listed below.

TCL QM8 (2024 model; the 2023 models have 2,300 zones), shown at CES and available in 65" (to 98") 4k, has 5,000 zones, 5,000 nit peak, 120Hz VRR, and a 240Hz game accelerator mode (3840x1080 upscaled gets 240Hz). Sometimes the zone count is lower on the smaller screens in a product line though; I'd have to confirm the 65" vs the 98". The 2023 models were 2,300 zones across the board.

The TCL QM8 screens are native 120Hz. They use a game accelerator mode that cuts the vertical resolution in half to hit 240Hz.

From a reddit reply on their previous model:

Apparently it doesn't cut the resolution in half, it just cuts the *vertical* resolution in half. So 3840*2160 becomes a very weird 3840*1080.

TCL have implemented this "motion accelerator" on a few of their native 120hz/144hz panels, too (specifically, I'm looking at the TCL 65C745K, which might be EU/UK-exclusive - I know I had trouble finding any retailers carrying 120hz TCL models widely available in America over here when I was making notes of what was available some time last year. This one does 120hz, 144hz and this weird 3840*1080@240hz).
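Presumably that trick works because halving the vertical resolution while doubling the refresh keeps total pixel throughput constant, so the panel/controller isn't being asked to move any more data; a quick check:

```python
# Pixel throughput (pixels per second) is identical in both modes,
# which is why halving vertical resolution lets the refresh rate double.
native = 3840 * 2160 * 120      # 4k @ 120 Hz
accel  = 3840 * 1080 * 240      # 3840x1080 @ 240 Hz

print(native, accel, native == accel)
```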
 
The Hisense 55U8N is out, so I can finally put the 55C9 out to pasture. We'll see what it's like going back to LED after using LG OLEDs for my last 3 monitors. I don't really feel like dropping $2K on a C4 right now, so if I hate the LED hopefully prices will come down a good bit.

All my OLEDs have eventually had burn-in, but I am pretty rough on them.

 
The Hisense 55U8N is out, so I can finally put the 55C9 out to pasture. We'll see what it's like going back to LED after using LG OLEDs for my last 3 monitors. I don't really feel like dropping $2K on a C4 right now, so if I hate the LED hopefully prices will come down a good bit.

All my OLEDs have eventually had burn-in, but I am pretty rough on them.


Thank you for your honesty. It seems like a rarity nowadays.

If you want a very good mini-LED, I can highly recommend the QN90B, QN90C, or a newer variant of that series. I have 2 of them and they are outstanding.
 
Thank you for your honesty. It seems like a rarity nowadays.

If you want a very good mini-LED, I can highly recommend the QN90B, QN90C, or a newer variant of that series. I have 2 of them and they are outstanding.

Yeah, I figured the high-end Samsungs are a little nicer, but I wanted to stay as close to the $1K range as I could; otherwise I would just drop $2K+ for something high end.
 
Thank you for your honesty. It seems like a rarity nowadays.

If you want a very good mini-LED, I can highly recommend the QN90B, QN90C, or a newer variant of that series. I have 2 of them and they are outstanding.
The U8N has 2000 dimming zones and a claimed peak brightness of 3000 nits. The QN90C has 504 zones and a peak brightness of 2000 nits. Even last year's U8K had double the number of dimming zones and outperformed the Samsung in HDR.
 
Dude, CRT and Plasma pimp-slap LCD, in my opinion. Yet I'm using LCD now. I use what I have for the job that I need. I do wish to get an OLED monitor though. But yes, I will continue to "go back" to LCD. Will I use LCD on my main gaming rig afterward? Probably not - why would I?
 
The U8N has 2000 dimming zones and a claimed peak brightness of 3000 nits. The QN90C has 504 zones and a peak brightness of 2000 nits. Even last year's U8K had double the number of dimming zones and outperformed the Samsung in HDR.
Oh, good to know. I was only speaking of displays I have personal experience with. None of the reviewers I follow have covered the one you're talking about yet; I will look into it, it sounds interesting.
 
Dude, CRT and Plasma pimp-slap LCD, in my opinion. Yet I'm using LCD now. I use what I have for the job that I need. I do wish to get an OLED monitor though. But yes, I will continue to "go back" to LCD. Will I use LCD on my main gaming rig afterward? Probably not - why would I?


I find that the thread title is too generic. I'd frame it as:

"Would you consider buying a modern FALD HDR LCD gaming display"
after experiencing a
"High performing HDR OLED gaming display" ?

You can't compare an edge-lit LCD, or even a local dimming LCD that isn't full array, or an old FALD display from years ago that has few zones. It probably needs 1200 - 2100 zones to be considered a good modern FALD, along with other modern performance heights.

If it's not a high performing modern FALD LCD, then it's not really worth talking about in such comparisons to OLED. I love OLED but have to be fair in comparisons.

Personally I'd also omit anything that can't do 4k(+), 120hz or higher, VRR, and high rated HDR. Those days are over for me. Losing any of those facets is unacceptable.


. . . . . . . . . . . . . . .
 
Most of this stuff is disposable. A $1200-1500 display that lasts over 3 years is good enough for me. That's $500 a year for a hobby that brings me much joy. Still cheaper than many other hobbies people have.
 
A 16" macbook has 10,000 miniLEDs and 2,500 zones in a small 16" screen space. The zones per degree of your vision is pretty high there. I suspect that the number of miniLEDs is also helping a bit even though the zone count is 2,500. It may be that the dimming system works so well through buffering or something, too. But they are very slow, so motion is smeary/blurry/ghosting even though they are 120Hz.
I have one of these, and I compared it to my LG CX 48" OLED TV with the same HDR video content. For darker scenes the Mac's screen performed well. It had a bit higher black levels, but overall it wasn't possible to see e.g. blooming. For bright scenes it definitely looked better and was able to coax more detail out of the brightest areas. Colors were also more vibrant, so I ended up adjusting the LG CX a bit to try to get closer to that.

But oh boy, the Mac's pixel response times are the worst ever. It's not fit for 120 Hz, let alone 60 Hz! Things in motion look so blurry. It's not a major issue in desktop use or watching video (good enough for 24/30 fps), but the problem is there. It's clearly a panel designed only for HDR performance. The M1 panel was apparently even worse, but we are still talking about something with about 3-4x longer response times than you should have for adequate 120 Hz performance.

And that brings us to the LCD problem. Even if you have a great mini-LED backlight that by all measures performs well for both SDR and HDR content, you still end up with the issues of LCD panels, where pixel response times and overshoot remain a problem even on the priciest models. It's one of the reasons why I didn't want to spend money on the PG32UQX (still 3500 € over here...) even if it otherwise ticked many boxes for me.

For me OLED is a better compromise, because I'm more likely to be annoyed by worse motion clarity than by lower HDR brightness. I'm not too bothered by the mitigations, and I would run even LCDs at about 120 nits in SDR. That said, my OLEDs have all been TVs or phone displays, so I don't have much experience with how the monitors perform.

The issue is really cost. My current 28" 4K 144 Hz LCD cost me <500 € on sale, has pretty solid pixel response times and overshoot, but is terrible at HDR with edge-lit 400 nits. Make that a mini-LED at 32" and we are talking 3-7x the cost for nothing more than good HDR performance and potentially worse pixel response times.

At least with OLED you get the motion performance benefits, the displays are now starting to come at 4K 240 Hz, and even if the HDR performance is nowhere near the mini-LEDs, it's still acceptable enough for most people to have a good time as there's still plenty of contrast to the image and the small, bright highlights can be handled well. I just wish the TVs caught up because 4K 240 Hz might be enough to make me swap my LG CX 48".

My next monitor is between the Samsung 57" superultrawide if the right sale comes up, or waiting for the 5120x2160 40-45" OLEDs next year. I wish there was a 5120x2160 40" mini-LED option, but I sure as hell am not paying 2000 € for "business" specs like the new Dell 120 Hz model where it performs no better than my <500 € 16:9 other than the extra resolution/size/KVM/USB-C.
 
I find that the thread title is too generic. I'd frame it as:

"Would you consider buying a modern FALD HDR LCD gaming display"
after experiencing a
"High performing HDR OLED gaming display" ?

You can't compare an edge-lit LCD, or even a local dimming LCD that isn't full array, or an old FALD display from years ago that has few zones. It probably needs 1200 - 2100 zones to be considered a good modern FALD, along with other modern performance heights.

If it's not a high performing modern FALD LCD, then it's not really worth talking about in such comparisons to OLED. I love OLED but have to be fair in comparisons.

Personally I'd also omit anything that can't do 4k(+), 120hz or higher, VRR, and high rated HDR. Those days are over for me. Losing any of those facets is unacceptable.


. . . . . . . . . . . . . . .
I'm a BFI whore. Unless I see a FALD LCD that allows the same tweaking as my Viewsonic XG-2431, I'm not interested. OLED is the only tech that doesn't require extensive tweaking to get a perfect crosstalk-less strobe. Even if it's at reduced clarity compared to said XG-2431.

So yes, I get that FALD is a different beast and is vastly superior to edge-lit. For my enthusiast tastes, it's immaterial to me. Couldn't care less, because that's not what I'm after.
 
Why would you, or someone in general, use a modern FALD LCD as their gaming rig's display? Mostly for a few reasons I'd think.

. . One is that OLEDs have very curtailed brightness levels, especially HDR mids and highs at 25%, 50% of the screen and more, because they are always trying to avoid burning the emitters down too quickly. They have strict brightness governors. A good modern FALD LCD can do very bright HDR color volume, including bright, long-sustained mids and highs over larger percentages of the screen (FALD has its own tradeoffs though, obviously).

. . In respect to that first reason, a lot of people would also change their screen usage habits with an OLED, e.g. OLED "best use practices" like hiding taskbars, not using bright wallpaper, dark modes all the time, lower brightness desktop usage, turning screen/emitters off when things are paused or when afk, etc. which can be considered a trade-off.

. . OLEDs suffer from VRR black flickering, since their gamma is pinned to the peak Hz of the screen and when you use VRR your framerate fluctuates a lot. It bothers some people more than others, and it's worse in some games than others (and might be somewhat worse when running at lower fps/Hz).

. . OLED can suffer banding in things like skies, etc., depending.

. . Since OLED's response time is so fast, it can't display low frame rate movies and shows without visibly stuttering some, unless you turn on some amount of interpolation ~ soap-opera effect to reduce that.

. . Another reason a person might consider a FALD rather than an OLED is that you can't get certain size/aspects/Hz/format screens in OLED. Like you couldn't get a 55" 1000R curvature 165Hz 4k ark in OLED, there is no 57" 7680x2160 240Hz OLED, and outside of astronomically priced giant sized screens, there are no 65" 8k 60/120(4k 144 - "240" upscalable) OLEDs. Not that there is no chance that there could ever be, but OLED seems to be very slow to adopt new formats, sometimes years behind FALD LCD in what formats and Hz are available.

FALDs have their own list of tradeoffs/issues, but those are some reasons I'd consider buying a FALD in relation to an OLED's cons, even though I love OLED and own two. There are a lot of tradeoffs on both sides, so you get cons using either. Maybe I'd end up unhappy with a FALD after a while, and maybe OLED will start to cover some of the bases FALD does a few years from now, and maybe microLED will start becoming more affordable in the longer run. So even if I ended up happy with a FALD for aspects I couldn't get in an OLED for a while, I wouldn't say the opposite and claim I'd "never" go back to an OLED.
 
For me, going for a FALD miniLED TV as a PC monitor was a conscious choice. While OLED burn-in has gotten better, to the point of almost non-existent unless you abuse it, I would still rather have no risk at all. But the good sides of LCD are also more apparent in games. For movies, OLED is king, there is no question about it. That is because HDR movies are usually mastered to about 200 nits average brightness, with only the highlights that are supposed to pop going beyond that. This is where OLED shines, because it rarely encounters full-screen bright scenes in movies. But games tend to be... more extreme. Lots of bright colors, intense lights and so on, and if they do have (functional) HDR calibration settings you can pretty much set the average screen brightness to your liking.
 
For me, going for a FALD microLED TV as a PC monitor was a conscious choice. While OLED burn-in has gotten better, to the point of almost non-existent unless you abuse it, I would still rather have no risk at all. But the good sides of LCD are also more apparent in games. For movies, OLED is king, there is no question about it. That is because HDR movies are usually mastered to about 200 nits average brightness, with only the highlights that are supposed to pop going beyond that. This is where OLED shines, because it rarely encounters full-screen bright scenes in movies. But games tend to be... more extreme. Lots of bright colors, intense lights and so on, and if they do have (functional) HDR calibration settings you can pretty much set the average screen brightness to your liking.
I would imagine that you mean MiniLED rather than MicroLED :)
 
For me, going for a FALD miniLED TV as a PC monitor was a conscious choice. While OLED burn-in has gotten better, to the point of almost non-existent unless you abuse it, I would still rather have no risk at all. But the good sides of LCD are also more apparent in games. For movies, OLED is king, there is no question about it. That is because HDR movies are usually mastered to about 200 nits average brightness, with only the highlights that are supposed to pop going beyond that. This is where OLED shines, because it rarely encounters full-screen bright scenes in movies. But games tend to be... more extreme. Lots of bright colors, intense lights and so on, and if they do have (functional) HDR calibration settings you can pretty much set the average screen brightness to your liking.

OLED stutters in movies, ironically because its response time is so superior. Movies are low frame rate. LCD transitions are so slushy that it's not (EDIT:) AS visible, not stuttering AS much at the same low frame rates as compared to OLED. OLED requires some interpolation~processing to be turned on if you want to avoid the worst stutter. Otherwise, yes, I love it for movies in my living room in a darkened room, either using a pillar lamp on each side of the screen where they won't show in the screen surface, or with lights out completely other than an LED bias lighting strip on the back of the TV itself.
 
I think I need a dual monitor setup. I have an AW3423DW and briefly had a PG32UQX beside it. I miss the HDR performance of the PG32UQX and wasn't bothered by its slow response. I would use the PG32UQX for work and single-player games and save the AW3423DW for faster games and dark-themed games.
 
OLED stutters in movies, ironically because its response time is so superior. Movies are low frame rate. LCD transitions are so slushy that it's not visible. OLED requires some interpolation~processing to be turned on if you want to avoid stutter. Otherwise, yes, I love it for movies in my living room in a darkened room, either using a pillar lamp on each side of the screen where they won't show in the screen surface, or with lights out completely other than an LED bias lighting strip on the back of the TV itself.

This may be heresy, but I am sensitive to low framerate stutter anyway, LCD or OLED, so I always keep motion interpolation on at very low settings. On TVs with a scale of 0 to 10 it is somewhere between 1 and 3, and in the case of my Panasonic OLED, at the minimum setting. It gets rid of the stutter without making things look like a cheap soap opera; movies still look like film.
 
OLED stutters in movies, ironically because its response time is so superior. Movies are low frame rate. LCD transitions are so slushy that it's not visible. OLED requires some interpolation~processing to be turned on if you want to avoid stutter. Otherwise, yes, I love it for movies in my living room in a darkened room, either using a pillar lamp on each side of the screen where they won't show in the screen surface, or with lights out completely other than an LED bias lighting strip on the back of the TV itself.

This is totally a blanket statement. The frametime of 24fps content is 41.6ms. What TV has response times slower than 41.6ms? The vast majority of TVs have response times more than fast enough that stuttering in low frame rate content is still visible. Obviously it's worst on OLED, but to say it's completely invisible due to the slushy response times of LCD is just flat-out wrong.

https://www.rtings.com/tv/reviews/samsung/qn90c-qn90cd-qled#test_207

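For scale, here is that 41.6ms frametime next to a couple of illustrative GtG figures (the 0.2 ms / 10 ms values are typical reviewer ballparks, not measurements of any specific set):

```python
# 24 fps frame hold time vs. pixel transition time (illustrative values)
frame_hold_ms = 1000 / 24                      # ~41.7 ms per film frame
for name, gtg_ms in [("OLED", 0.2), ("fast LCD", 10.0)]:
    fraction = gtg_ms / frame_hold_ms          # share of the hold spent transitioning
    print(f"{name}: transition covers {fraction:.1%} of the frame hold")
```

Even a "slow" LCD still holds a fully settled static image for roughly three quarters of each film frame, so the stutter is still there to see.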
 
This may be heresy, but I am sensitive to low framerate stutter anyway, LCD or OLED, so I always keep motion interpolation on at very low settings. On TVs with a scale of 0 to 10 it is somewhere between 1 and 3, and in the case of my Panasonic OLED, at the minimum setting. It gets rid of the stutter without making things look like a cheap soap opera; movies still look like film.


Agree. I like AI upscaling, whereas purists prefer the softened 35mm filmic look. I also don't dislike interpolation in itself at some levels, other than the fact that historically it has produced motion artifacts in complex scenes where the camera is moving through detailed/noisy areas. Like a camera tracking a troupe of chimps moving at a good clip through a leaf-filled scene full of trees, or some intros to shows with pouring sands, beads, and tiny things forming shapes. Some solutions were worse than others, especially in the past. I can remember some older TVs making it look like a character was cut out and floating across the screen. More advanced AI taking over the upscaling and interpolation duties should help improve performance going forward though.


This is totally a blanket statement. The frametime of 24fps content is 41.6ms. What TV has response times slower than 41.6ms? The vast majority of TVs have response times more than fast enough that stuttering in low frame rate content is still visible. Obviously it's worst on OLED, but to say it's completely invisible due to the slushy response times of LCD is just wrong.

https://www.rtings.com/tv/reviews/samsung/qn90c-qn90cd-qled#test_207


Well, it's obnoxious on OLED due to the speed, where LCD tends to mask it more. You are right, LCD doesn't zero it but by comparison it's way more obvious and aggressive looking on OLED to where imo it becomes more of a problem. (Newer LCDs might be faster than what I'm basing my comparison on also).


This below is in regard to gaming, but it has parallels. (I don't think he chose to cover anything beneath 40fps since he was talking about gaming in this instance.) The gaming LCDs he is likely referencing below are probably also using overdrive, where movies/media on a TV, for example, might not be using any kind of overdrive, depending.

https://forums.blurbusters.com/viewtopic.php?t=10593

"TL;DR version

LCD stops stuttering beyond about 50fps
OLED stops stuttering beyond about 75fps
(depends on human)


1. Don't confuse GtG and MPRT. Both can add blur that hides stutter.
2. OLED is sample and hold (MPRT is not zeroed out)
3. OLED simply visibly stutters until a higher frame rate because of faster pixel response (GtG is near zero). Slow GtG helps mask stutters.

That's why you need higher frame rates to compensate for the increased visibility of stuttering made visible by pixel response being too fast. That's why OLED stutters more at the same frame rate as LCD, especially for framerates between 40fps-70fps territory. Above 100fps, the effect is not an issue, but is a consideration for people who hate stutter and have to play at low frame rates. One method is to upgrade your GPU and lower game settings. Another method is to add GPU motion blur to compensate, if you get headaches from stutter in low frame rate games (e.g. Cyberpunk 2077). "


. . .

If you used frame duping and/or interpolation you could get over 50 - 75 fps, but (artifacting arguments aside) some people cling to the old softened filmic look and hate it just on that basis. The issue in the past has been that using those creates artifacts, especially in-your-face in certain types of scenes. They've gotten better at upscaling and interpolating~generating frames, especially with the most modern methods and AI chips, but it's still not completely artifact-free yet.
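To put numbers on the sample-and-hold point from that quote (the 960 px/s tracking speed is just an assumed example of eye-tracking a moving object):

```python
# Sample-and-hold motion blur: blur width ≈ tracking speed x persistence.
# On a non-strobed display, persistence (MPRT) is roughly the full frame time.
speed_px_per_s = 960            # assumed eye-tracking speed across the screen
for fps in (24, 60, 120):
    persistence_s = 1 / fps     # MPRT ≈ frame time for sample-and-hold
    blur_px = speed_px_per_s * persistence_s
    print(f"{fps} fps: ~{blur_px:.0f} px of perceived blur")
```

This is why GtG and MPRT are separate things: even with GtG near zero, an OLED at 24 fps still smears motion across tens of pixels, while the near-instant transitions make the frame steps themselves (the stutter) more visible.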
 
Is this why my plasma refreshes at 96hz for 24hz content? It does look lovely. Do OLED TVs not do the same thing? EDIT - I don't mean 96hz specifically, just displaying 24hz content at a multiple, like 120hz (which most of them are capable of doing)?
 
Agree. I like AI upscaling, whereas purists prefer the softened 35mm filmic look. I also don't dislike interpolation in itself at some levels, other than the fact that historically it has produced motion artifacts in complex scenes where the camera is moving through detailed/noisy areas. Like a camera tracking a troupe of chimps moving at a good clip through a leaf-filled scene full of trees, or some intros to shows with pouring sands, beads, and tiny things forming shapes. Some solutions were worse than others, especially in the past. I can remember some older TVs making it look like a character was cut out and floating across the screen. More advanced AI taking over the upscaling and interpolation duties should help improve performance going forward though.




Well, it's obnoxious on OLED due to the speed, where LCD tends to mask it more. You are right, LCD doesn't zero it but by comparison it's way more obvious and aggressive looking on OLED to where imo it becomes more of a problem. (Newer LCDs might be faster than what I'm basing my comparison on also).


This below is in regard to gaming, but it has parallels. (I don't think he chose to cover anything beneath 40fps since he was talking about gaming in this instance.) The gaming LCDs he is likely referencing below are probably also using overdrive, where movies/media on a TV, for example, might not be using any kind of overdrive, depending.

https://forums.blurbusters.com/viewtopic.php?t=10593

"TL;DR version

LCD stops stuttering beyond about 50fps
OLED stops stuttering beyond about 75fps
(depends on human)


1. Don't confuse GtG and MPRT. Both can add blur that hides stutter.
2. OLED is sample and hold (MPRT is not zeroed out)
3. OLED simply visibly stutters until a higher frame rate because of faster pixel response (GtG is near zero). Slow GtG helps mask stutters.

That's why you need higher frame rates to compensate for the increased visibility of stuttering made visible by pixel response being too fast. That's why OLED stutters more at the same frame rate as LCD, especially for framerates between 40fps-70fps territory. Above 100fps, the effect is not an issue, but is a consideration for people who hate stutter and have to play at low frame rates. One method is to upgrade your GPU and lower game settings. Another method is to add GPU motion blur to compensate, if you get headaches from stutter in low frame rate games (e.g. Cyberpunk 2077). "


. . .

If you used frame duping and/or interpolation you could get over 50 - 75 fps, but (artifacting arguments aside) some people cling to the old softened filmic look and hate it just on that basis. The issue in the past has been that using those creates artifacts, especially in-your-face in certain types of scenes. They've gotten better at upscaling and interpolating~generating frames, especially with the most modern methods and AI chips, but it's still not completely artifact-free yet.

Which proves my point. 24fps stutter is not invisible on a decently fast LCD which most are these days.
 
Which proves my point. 24fps stutter is not invisible on a decently fast LCD which most are these days.

I'll concede that. Perhaps I should have said not AS visible (and to me not AS obnoxious) at the same frame rates as an OLED. But like I said, I'm not opposed to using some interpolation for movies and shows (the newest AI upscaling, interpolation, and parameter-shaping chips and methods are supposed to do it even better with fewer artifacts), and I look forward to it getting better in the future.

The tradeoff is that, without interpolation/processing, OLED looks worse and stutters more in low frame rate content than LCD at the same rates (rather than it being binary, all or none).
 
This is one of my biggest gripes with OLED:
https://www.youtube.com/shorts/Ou1NG9W99tw

Just check out how ABL kicks in and starts clipping detail as more clouds cover a larger portion of the screen. It's extremely jarring in games where you have a skybox out: the more clouds cover your screen, the harder you get slapped with detail-clipping ABL.


I guess for those times you are standing in line picking your ass it could be annoying.

Lol. just joking around, that's what the character in the Hawaiian shirt was doing. I get it, they all have their tradeoffs. I don't mind leaving that on personally. It would make desktop/app use unusable for me though.
 
For competitive games having the brightness dim all of a sudden is crazy distracting. You're all into it and then record scratch!!!!........dim-dim-dim-dim!!! Uhhhh, "automatic brightness limiter just pwned me, okay?...
 
I watched that clip at least 50 times before seeing what you're talking about. The biggest thing I notice is the texture on the sidewalk getting all blurry before getting sharp again at 0:10, but I couldn't tell anything until I realized you meant in the clouds.
 
For competitive games having the brightness dim all of a sudden is crazy distracting. You're all into it and then record scratch!!!!........dim-dim-dim-dim!!! Uhhhh, "automatic brightness limiter just pwned me, okay?...

You are probably talking about ABL slapping you in the face, where he was specifically talking about the gradual step-down in brightness that ASBL does.

ASBL has been defeatable on some OLED models via the service menu, but I never bothered to do it since it doesn't really bother me much in media and games. ABL is not defeatable, and it would probably be unwise to defeat it even if you could.

Brightness limiters/governors are there because OLED emitters can only burn so bright for so long lifetime-wise, versus the wear-evening buffer's remaining "battery".


. .

There is one way to "defeat" ABL after all, so to speak: set your screen brightness, or a named picture mode, beneath the ABL-triggering brightness threshold, e.g. for desktop/app use. Most people would have no interest in those levels for media and gaming though, obviously.

. .
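Here's a toy sketch of what that threshold trick amounts to. The window-size/nit pairs are made up for illustration; real ABL curves are per-model and not published:

```python
# Illustrative ABL curve: allowed peak nits vs. % of screen lit.
# These points are invented for the sketch, not from any real panel.
abl_curve = [(2, 1000), (10, 800), (25, 500), (50, 300), (100, 200)]

def allowed_nits(window_pct):
    """Linear interpolation along the illustrative ABL curve."""
    for (w0, n0), (w1, n1) in zip(abl_curve, abl_curve[1:]):
        if w0 <= window_pct <= w1:
            t = (window_pct - w0) / (w1 - w0)
            return n0 + t * (n1 - n0)
    return abl_curve[-1][1]

# If your picture mode never asks for more than the full-screen limit
# (200 nits in this sketch), the limiter has nothing to clamp.
print(allowed_nits(10), allowed_nits(100))
```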
 
You are probably talking about ABL slapping you in the face, where he was specifically talking about the gradual step-down in brightness that ASBL does.

ASBL has been defeatable on some OLED models via the service menu, but I never bothered to do it since it doesn't really bother me much in media and games. ABL is not defeatable, and it would probably be unwise to defeat it even if you could.
Where I found that ASBL gets annoying is in games that have scenes with lots of text you are trying to read. In particular, Octopath Traveler 2 has real issues. I played it on the couch because it seems like a good couch game, and the couch TV is an S95B. During "cutscenes" where there's a lot of text and just sprites moving around on the same screen, ASBL would kick in HARD and the screen would get dimmed to absurdly low levels. Quite annoying. I ended up swapping over to my desktop to finish it up.
 
OLED stutters in movies, ironically because its response time is so superior. Movies are low frame rate. LCD transitions are so slushy that it's not (EDIT:) AS visible, not stuttering AS much at the same low frame rates as compared to OLED. OLED requires some interpolation~processing to be turned on if you want to avoid the worst stutter. Otherwise, yes, I love it for movies in my living room in a darkened room, either using a pillar lamp on each side of the screen where they won't show in the screen surface, or with lights out completely other than an LED bias lighting strip on the back of the TV itself.
I think the idea of stutter on OLED is overblown. Outside of slow-moving scenes, color is going to remain relatively stable, so a display that supports 24 Hz output isn't going to exhibit visual artifacts most of the time. Though I notice it occasionally, it hasn't bothered me to the point of wanting to go back to watching 24 Hz content on an LCD after 6 years of OLED usage.
This is totally a blanket statement. The frametime of 24fps content is 41.6ms. What TV has response times slower than 41.6ms? The vast majority of TVs have response times more than fast enough that stuttering in low frame rate content is still visible. Obviously it's worst on OLED, but to say it's completely invisible due to the slushy response times of LCD is just flat-out wrong.

https://www.rtings.com/tv/reviews/samsung/qn90c-qn90cd-qled#test_207

The frame hold time is not the issue; it's how quickly the pixels respond to color transitions. In lower framerate content, the slower response time of an LCD actually aids in a blurring effect between buffered images, so motion gets smoothed out. On OLED the color transitions so quickly that there is no visible blending from one image to the next, which can cause visible stutter as the new color information is presented.

As you said, this can still be a problem on fast LCD screens, but on monitors you can turn off pixel overdrive to slow the response time down, and on TVs you can use a movie or similar picture mode to increase the image processing time and slow down the pixels.
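That masking effect can be sketched as a first-order lag: the pixel exponentially blends toward each new frame's color, and the slower the time constant, the more of each frame hold is spent cross-fading. The time constants below are illustrative, not measured:

```python
import math

# Fraction of each 24 fps frame hold a pixel spends visibly transitioning,
# modeling the pixel as an exponential approach with time constant tau.
frame_ms = 1000 / 24

def settle_fraction(tau_ms, threshold=0.05):
    """Fraction of the frame until the pixel is within 5% of its target level."""
    settle_ms = -tau_ms * math.log(threshold)   # solve e^(-t/tau) = threshold
    return min(settle_ms / frame_ms, 1.0)

for name, tau in [("OLED", 0.1), ("LCD, overdrive off", 8.0)]:
    print(f"{name}: transitioning for {settle_fraction(tau):.0%} of the frame")
```

With these assumed numbers the slow panel spends over half of each film frame cross-fading between images, which reads as smoothing rather than a hard step, while the OLED snaps essentially instantly.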
 
I watched that clip at least 50 times before seeing what you're talking about. The biggest thing I notice is the texture on the sidewalk getting all blurry before getting sharp again at 0:10, but I couldn't tell anything until I realized you meant in the clouds.

Ha, watching this on my MBP at night was a huge difference from my QD oled during the day
 
I have a 48in C2 and the saturation, vibrancy and blacks are so top notch. I literally can not and WON'T go back to LCD. OLED is too good.
 
I didn't see color TV till 1974 and Star Trek was my favorite in color.

What's not to love about the old Trek seasons? It aired for three seasons on NBC from 1966 - 1969, though that doesn't mean you couldn't have first seen it in 1974. A little before my time.

I agree, very colorful.

A somewhat more modern and slightly campy colorful space opera show (Farscape) :



(Farscape screenshots)
 
I saw a recent preview of the upcoming Sony FALD where they show just the backlight in action compared to their best previous model and it’s so obvious how much more accurate the new implementation is. There’s still a lot of room for improvement in FALD - as long as it’s price competitive I see it as a viable option over the next few years.
 
as long as it’s price competitive I see it as a viable option over the next few years.
That's been the overall problem with FALD though. Not so bad with TVs, but it is on the monitor side. I wouldn't mind if I had my pick of OLED vs FALD monitors in the 1000-1500 euro range, but it gets hard to justify that a FALD is worth 3000+ euros.
 