What do we think of Sony moving away from OLED and back to LED?

I mean, if they really managed to overcome blooming, then yeah, picture-quality-wise there isn't much reason to go with OLED for the top-end flagship model.
 
OLED is impressive, but I don't think I could buy a high end OLED because I'd have crippling burn-in anxiety the whole time and it would ruin the experience. A $900 42" OLED to watch TV/Movies - sure, but as impressive as the upcoming OLED (45" 5120x2160!) monitors look, I couldn't drop $2k and use it all day and not just worry about it the whole time.
 
I think they're just tired of having their flagship being fundamentally a product made by another manufacturer. Both in terms of PR and margins.

If they give up on OLED completely it'll be a bit sad that we can't get QD-OLED with Dolby Vision anymore.

There is no indication whatsoever they've overcome blooming, just talk about how they want more peak brightness. Which is frankly marketing speak.
 
Isn't it really kind of a recycling of the brightness advantage LCD has typically enjoyed to push more, cheaper-to-make sets?

I find it disturbing. A retreat from what I think is, overall, a far superior technology. And one we waited so long for.

At the same time, my behavior is hypocritical on this: when I finally replaced my old TV and had a chance to do so with a 77" OLED, I went with a laser UST projector and a 120" screen instead. I wanted the biggest screen I could fit against my best wall to watch movies and such. I guess I went for quantity over quality with my new TV, but I think quantity is kind of a part of quality too. FWIW, my computer monitor is OLED.
 
OLED is impressive, but I don't think I could buy a high end OLED because I'd have crippling burn-in anxiety the whole time and it would ruin the experience. A $900 42" OLED to watch TV/Movies - sure, but as impressive as the upcoming OLED (45" 5120x2160!) monitors look, I couldn't drop $2k and use it all day and not just worry about it the whole time.
While I doubt me responding to you will change your mind, at this point OLED doesn't have any permanent retention issues. RTINGS has essentially been running TVs 24/7 for 3+ years on CNN, including its logo in the bottom corners of the screen, and they haven't suffered any permanent effects.

If you want to know the whole scoop, watch this entire video:

https://youtu.be/Fa7V_OOu6B8?feature=shared

And the results of short compensation cycles:

https://youtu.be/rWuwUb-7vjo?feature=shared

The short version? Even if you leave your display on with static elements for literally years, you won't have issues, and that would be straight-up abuse of your display. If you do even basic things to maintain them, it isn't hard for them to outlast their useful life, unless you're the sort who will be upset if a display doesn't last longer than 5-7 years.

Isn't it really kind of a recycling of the brightness advantage LCD has typically enjoyed to push more, cheaper-to-make sets?

I find it disturbing. A retreat from what I think is, overall, a far superior technology. And one we waited so long for.

At the same time, my behavior is hypocritical on this: when I finally replaced my old TV and had a chance to do so with a 77" OLED, I went with a laser UST projector and a 120" screen instead. I wanted the biggest screen I could fit against my best wall to watch movies and such. I guess I went for quantity over quality with my new TV, but I think quantity is kind of a part of quality too. FWIW, my computer monitor is OLED.
It's a different set of advantages and disadvantages. At this point I think OLED still has the best "all-around" image characteristics when considering motion handling, pure black, and self-emissive pixels (meaning no artifacts from haloing, etc.). Both techs can do similar things with contrast ratio. Mini-LED has gained a huge advantage in peak brightness and generally has far fewer issues with sub-pixel arrangements (because it can use a standard RGB array).

Most of the grading world is on mini-LED. The top-end grading monitors from Flanders Scientific or Sony (the $20k+ ones) all use mini-LED tech, in large part because they can sustain 2,000+ nits of peak brightness on 100% of the display (the one linked is capable of 5,000 nits max windowed). An OLED, even one with 1,600 nits of peak brightness, is usually only capable of doing that on a 3% window, with perhaps 1,000 nits on a 10% window; it drops off steadily from there. It's actually rare for a consumer OLED display to manage more than 250 nits full field. And while it's possible now (ironically, Flanders Scientific has just announced a quantum-dot OLED capable of 1,000 nits), it's incredibly expensive compared to consumer displays.

TL;DR - If Sony can make a mini-LED display for the masses that shows a significant difference vs. its OLED competition, I think its differences will be its strengths, the major one being much higher peak brightness. Combined with Sony's much better image processing compared to competitors, people buying flagships will be able to 'see' the difference vs. the OLED competition.
Everyone is already buying LG and Samsung OLEDs anyway; Sony is a distant third place at best. This can help them be competitive for people who want a true alternative, since Sony was just using Samsung QD-OLED panels anyway.

I have to agree here. Although the A90K is tempting as a desktop monitor replacement and the A95L has the best-looking flagship image out of the box, at this point all of Sony's features for me are offset by LG's features in the C3, which is the bang-for-buck king. LG has the best multi-use experience for games, movies, etc., all in one display, and does so at the lowest cost vs. competitors (including profile handling, VRR, etc.). Which is why LG is dominating the OLED market. Sony's top-end pricing for not enough features isn't good enough for most consumers, even of high-end TVs. So it makes sense to me that they would rather compete directly on specs using mini-LED for a real difference in image quality, while charging their Sony premium.
 
I played Wartales for nearly 80 hours over the past few weeks on an LG 27" OLED and experienced next to zero eyestrain. I even moved the monitor up half a foot and have it about 2.5 feet away for FPS or third-person games. The only things I've adjusted are lower brightness and contrast, turning Adaptive Sync off (it causes me to blink excessively), and a VESA mount stand lowered so I'm not looking up.

LCD is ancient history; people still buy those TVs because of the price, and 75-inch panels have taken over from 65-inch. There was a guy at Walmart pushing a pallet through who knocked two LCD TVs over because they are too wide for Walmart's aisles.
 
There is no indication whatsoever they've overcome blooming, just talk about how they want more peak brightness. Which is frankly marketing speak.
No, it's not just marketing. MiniLED displays can get brighter than OLED, a lot brighter, and it is totally something you can see and appreciate. The 10,000 nit target of HDR technology didn't come from nowhere, Dolby did some serious testing of human vision and preference involving shining a cinema projector at an LCD screen and seeing how bright people liked things. Answer? Bright.

Higher peak, and full screen, brightness is something you can notice and appreciate. I have a nice S95B TV which gets about 900-950 nits peak in a 10% window and 180 nits full screen, and a PG32UQX which gets about 1,600 nits peak in a 50% window and 1,200 nits full screen. You notice the difference, and it isn't subtle. That near-doubling of peak brightness makes scenes pop, and the higher full-screen brightness/lack of ABL is noticeable too. It is a seriously nice experience, despite the fact that it does have blooming. I generally prefer gaming on it (my computer is hooked to both).

The new MiniLED TVs are looking at going much higher than that still, which will give highlights even more brilliance and pop.

As for blooming, we'll see how it goes, but get the backlight array small enough and it won't matter, you can't tell. Seriously. Your eyes have veiling glare (all optical systems do, cameras included) that means you see blooming around bright points of light, they don't appear as razor sharp points. So, get the zones small enough, and it will literally not be perceptible in normal viewing.
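As a rough back-of-the-envelope check on how small zones actually get, here's a sketch; the proportional grid layout is my own simplifying assumption (real panels vary), so treat the numbers as illustrative only:

```python
import math

def zone_size_mm(diag_in, aspect=(16, 9), zones=1000):
    """Approximate width/height of one dimming zone for a panel,
    assuming zones are laid out in a grid matching the screen's
    aspect ratio (a simplification, not real panel data)."""
    w_ratio, h_ratio = aspect
    diag_ratio = math.hypot(w_ratio, h_ratio)
    width_mm = diag_in * 25.4 * w_ratio / diag_ratio
    height_mm = diag_in * 25.4 * h_ratio / diag_ratio
    # split the zone count into rows/columns proportionally
    cols = round(math.sqrt(zones * w_ratio / h_ratio))
    rows = max(1, round(zones / cols))
    return width_mm / cols, height_mm / rows

# A 65" TV: 1,000 zones -> ~34 mm squares, 5,000 zones -> ~15 mm
print(zone_size_mm(65, zones=1000))
print(zone_size_mm(65, zones=5000))
```

At a typical couch distance, a 15 mm zone subtends only a fraction of a degree, which is the intuition behind "get the zones small enough and it won't be perceptible."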

I'm not trying to shit on OLED, I still think it is the future of display technology, but MiniLED has some serious merit and a big part of that is brightness. The impact from high dynamic range comes from, well, dynamics. That means it needs to get up there. It is the same with a sound system. The reason THX/Dolby target 105dB for theater mains and 115dB for theater subs is that you want some serious dynamic impact. For an explosion or something to really hit, it needs to get way louder than normal content. If you were doing speech at 70dB but then limited everything to 80dB max for effects, it wouldn't give you the impact, no matter how good the sound system was, even if it had extremely linear frequency response and low distortion. We also don't want to just turn down the volume and say "Oh, well just make the speech 45dB, then you get the same range!" Nobody wants to listen to everything at around a whisper.

Likewise getting that real impact for highlights requires them to get bright. We don't want to turn down the brightness to everything and make it super dark just so highlights can pop, we want to keep it where it is, and have them go higher. Right now, if you want bright, MiniLED (or MicroLED if you have hundreds of thousands of dollars) is where it's at. If Sony feels they can make that offer a flagship experience, I'm here for it.
 
No, it's not just marketing. MiniLED displays can get brighter than OLED, a lot brighter, and it is totally something you can see and appreciate. The 10,000 nit target of HDR technology didn't come from nowhere, Dolby did some serious testing of human vision and preference involving shining a cinema projector at an LCD screen and seeing how bright people liked things. Answer? Bright.

Higher peak, and full screen, brightness is something you can notice and appreciate. I have a nice S95B TV which gets about 900-950 nits peak in a 10% window and 180 nits full screen, and a PG32UQX which gets about 1,600 nits peak in a 50% window and 1,200 nits full screen. You notice the difference, and it isn't subtle. That near-doubling of peak brightness makes scenes pop, and the higher full-screen brightness/lack of ABL is noticeable too. It is a seriously nice experience, despite the fact that it does have blooming. I generally prefer gaming on it (my computer is hooked to both).

The new MiniLED TVs are looking at going much higher than that still, which will give highlights even more brilliance and pop.

As for blooming, we'll see how it goes, but get the backlight array small enough and it won't matter, you can't tell. Seriously. Your eyes have veiling glare (all optical systems do, cameras included) that means you see blooming around bright points of light, they don't appear as razor sharp points. So, get the zones small enough, and it will literally not be perceptible in normal viewing.

I'm not trying to shit on OLED, I still think it is the future of display technology, but MiniLED has some serious merit and a big part of that is brightness. The impact from high dynamic range comes from, well, dynamics. That means it needs to get up there. It is the same with a sound system. The reason THX/Dolby target 105dB for theater mains and 115dB for theater subs is that you want some serious dynamic impact. For an explosion or something to really hit, it needs to get way louder than normal content. If you were doing speech at 70dB but then limited everything to 80dB max for effects, it wouldn't give you the impact, no matter how good the sound system was, even if it had extremely linear frequency response and low distortion. We also don't want to just turn down the volume and say "Oh, well just make the speech 45dB, then you get the same range!" Nobody wants to listen to everything at around a whisper.

Likewise getting that real impact for highlights requires them to get bright. We don't want to turn down the brightness to everything and make it super dark just so highlights can pop, we want to keep it where it is, and have them go higher. Right now, if you want bright, MiniLED (or MicroLED if you have hundreds of thousands of dollars) is where it's at. If Sony feels they can make that offer a flagship experience, I'm here for it.

Yup. In the right HDR content, mini-LED simply offers a pop and vibrance that OLED cannot match, due to its brightness caps and ABL kicking in harder at larger window sizes. Not to mention WOLED also suffers from color dilution at higher brightness due to the white subpixel.
 
Higher peak, and full screen, brightness is something you can notice, and appreciate.
But there's a point of diminishing returns. Going from 100 nits to 1,000 nits is a larger perceptual increment than going from 1,000 to 2,000 nits.
If higher brightness is the only thing LCDs can offer while sucking at everything else, then they can't beat OLEDs in overall PQ.
 
I really don't care what Sony does as I won't pay their ridiculous prices in light of the fact that other companies usually have better products.
 
But there's a point of diminishing returns. Going from 100 nits to 1,000 nits is a larger perceptual increment than going from 1,000 to 2,000 nits.
If higher brightness is the only thing LCDs can offer while sucking at everything else, then they can't beat OLEDs in overall PQ.
True, but it still matters, and the new mini-LEDs are talking about going to 4,000-5,000 nits peak. Have a look at the Dolby presentation I linked; they tested it and you can see the preference curves. People like bright.
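For what it's worth, the "diminishing returns" intuition can be put in numbers using the PQ encoding from SMPTE ST 2084 (the HDR transfer function both posts are implicitly arguing about), which is designed to be roughly perceptually uniform, so equal steps on its 0-1 scale are roughly equal visual steps. The constants below are from the standard:

```python
def pq_encode(nits):
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in
    nits -> perceptual signal value in [0, 1] (1.0 = 10,000 nits)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# Perceptual size of each brightness jump on the PQ scale:
step_a = pq_encode(1000) - pq_encode(100)   # ~0.24 of the scale
step_b = pq_encode(2000) - pq_encode(1000)  # ~0.08 of the scale
print(step_a, step_b)
```

So 100 to 1,000 nits is roughly three times the perceptual step of 1,000 to 2,000 nits: both posters are right, the second doubling is smaller but still plainly visible.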

Also, the claim that LCDs "suck at everything else" is pretty fanboy-ish. For most things they are a pretty even match: resolution, refresh rate (for TVs), etc. OLEDs are unquestionably better at fast transition times, which is great for gaming but not necessarily as great for low-frame-rate movies, and also great at precise brightness.

Precise highlights are one of the things that high-zone-count mini-LEDs seek to deal with, and they can do it pretty well. It lowers one of the OLED advantages while giving you the possibility of really high peak brightness.

There's no perfect display technology, and there's nothing wrong with companies trying to improve their displays. Everyone doesn't have to buy the same TV as you.
 
The biggest problem with OLED is the "O". The organic pixel just cannot take high temperatures. It's possible to push a pixel to over 1,000 nits on a 2% window because the rest of the surrounding area acts as a heatsink, but once the lit area grows, the brightness has to be brought down since you can't sink the heat away fast enough. That's why, despite finally getting peak brightness over 1,000 nits on a 2% window, they all basically still peak at about 200 nits on a 100% bright screen; beyond that the organic pixels would suffer damage.
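The windowed-vs-full-field behavior described above can be mimicked with a toy "fixed heat budget" model: assume the panel can dissipate a constant luminance-times-area budget, clamped at the emitter's windowed peak. This is my own crude sketch, not real panel physics, and both numbers in it are illustrative rather than measured:

```python
def abl_limit(window_frac, budget=200.0, hard_cap=1600.0):
    """Toy ABL model: 'budget' is the sustainable full-field
    luminance in nits; luminance * lit-area is held constant,
    clamped at the emitter's windowed peak ('hard_cap').
    Illustrative numbers only, not measurements of any panel."""
    return min(hard_cap, budget / window_frac)

for w in (0.02, 0.10, 0.50, 1.00):
    print(f"{w:4.0%} window -> {abl_limit(w):6.0f} nits")
```

Even this crude model reproduces the shape of the complaint: small windows hit the emitter cap, while full field collapses to the thermal budget (~200 nits here).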
 
There's no perfect display technology, and there's nothing wrong with companies trying to improve their displays. Everyone doesn't have to buy the same TV as you.
If anything it's good that there are more options.

Most OLED TVs on the market are pretty interchangeable; an LG vs. Sony vs. Samsung vs. Philips isn't going to differ drastically because they are all using the same LG WOLED or Samsung QD-OLED panels. This year even Samsung is releasing OLED TVs using LG WOLED panels, which are likely to perform like LG's equivalents but with worse software.

I'd still pick OLED for myself because I do plenty of gaming on my TV, and that's where OLED's benefits in pixel response times will outweigh the higher brightness of mini-LED, and things like burn-in aren't much of a concern.
 
I really don't care what Sony does as I won't pay their ridiculous prices in light of the fact that other companies usually have better products.
I would guess that this might be a reason behind their decision: quite a few people don't find the difference in price justified by the difference in performance. Kind of reminds me of Pioneer, and to some degree Panasonic, some years ago.
 
I think they're just tired of having their flagship being fundamentally a product made by another manufacturer. Both in terms of PR and margins.

If they give up on OLED completely it'll be a bit sad that we can't get QD-OLED with Dolby Vision anymore.

There is no indication whatsoever they've overcome blooming, just talk about how they want more peak brightness. Which is frankly marketing speak.
They could always R&D their own OLED panels.
 
Sony already co-founded JOLED (with Panasonic), which unfortunately is not yet fully geared up to mass-manufacture big TVs. Their professional monitors are great though, and they completed a plant in late 2020; not sure what came of that.
 
When I hear LED, I think "broken" LEDs, that is, LEDs that no longer light, and oftentimes entire strips. I know a ton of people who paid hundreds of dollars for TVs that have this issue... and with less than 5 years of use.

Personally, I expect "more".

Does Sony know how to make reliable LEDs?

Also, "micro" IMHO takes on a different meaning if talking about an elephant. When Sony shows off a 27" MicroLED with the same amount of LEDs, then, you've got my attention.
 
Sony has one issue: customer service. If they decide on a TV technology and successfully market it as "premium", it'll be quality and do well. I have a 10-year-old full-array local dimming Sony LED XBR that's still kicking with a perfectly good picture. I've also had Sonys with issues, and their customer service treated me like garbage. So, if you buy and have no issues, you'll probably love your TV.
 
Sony already co-founded JOLED (with Panasonic), which unfortunately is not yet fully geared up to mass-manufacture big TVs. Their professional monitors are great though, and they completed a plant in late 2020; not sure what came of that.

I'm pretty sure Sony even went from OLED back to LCD for their professional monitors. So I guess following up by switching the flagship TV back to LCD was always the plan.
 
No, it's not just marketing. MiniLED displays can get brighter than OLED, a lot brighter, and it is totally something you can see and appreciate. The 10,000 nit target of HDR technology didn't come from nowhere, Dolby did some serious testing of human vision and preference involving shining a cinema projector at an LCD screen and seeing how bright people liked things. Answer? Bright.

Higher peak, and full screen, brightness is something you can notice and appreciate. I have a nice S95B TV which gets about 900-950 nits peak in a 10% window and 180 nits full screen, and a PG32UQX which gets about 1,600 nits peak in a 50% window and 1,200 nits full screen. You notice the difference, and it isn't subtle. That near-doubling of peak brightness makes scenes pop, and the higher full-screen brightness/lack of ABL is noticeable too. It is a seriously nice experience, despite the fact that it does have blooming. I generally prefer gaming on it (my computer is hooked to both).
I wonder, if people could only choose one option, which would look better in most TV viewing cases: a TV with no blooming but lower HDR highlights, or a TV with blooming but good HDR highlights.
I would personally probably go for the good HDR, as I use my TV mainly for sports and gaming, and from what I can tell those two things do better with higher brightness.
 
I wonder, if people could only choose one option, which would look better in most TV viewing cases: a TV with no blooming but lower HDR highlights, or a TV with blooming but good HDR highlights.
I would personally probably go for the good HDR, as I use my TV mainly for sports and gaming, and from what I can tell those two things do better with higher brightness.

Well, if I could get OLED TV HDR specs in monitors, then I would absolutely just go OLED full time. The 3rd-gen QD-OLED TVs will be capable of 300 nits full field and 3,000 nits peak (probably a 1-2% window), so I would imagine the 10% window would land somewhere around the 2,000-nit mark. Real-scene HDR could be well over 1,000 nits, and that is what really matters anyway; that would be plenty for me, and if it were available in a QD-OLED monitor I would have no reason to go with mini-LED anymore. But unfortunately the QD-OLED monitors have their brightness nerfed to 1,000 nits peak, 400 nits in a 10% window, and 250 nits full-field white. Real scene is only around 500 nits.
 
I wonder, if people could only choose one option, which would look better in most TV viewing cases: a TV with no blooming but lower HDR highlights, or a TV with blooming but good HDR highlights.
I would personally probably go for the good HDR, as I use my TV mainly for sports and gaming, and from what I can tell those two things do better with higher brightness.
Probably varies with the person and the use case. The amount of veiling glare you have depends on your eyes, so some people are going to see more blooming anyhow and thus may not notice much, if any, difference. It also depends on the kind of content, I'm sure. For example, RE: Village is better on the OLED because it is so dark and features bright lights against a dark background. How bright the content is mastered matters too: if you are watching things mastered to something like an 800-nit peak, it won't matter if your display can do 4,000, as that headroom would never get used. It's also going to depend on the number of dimming zones; at some point they get small enough that it doesn't matter to a given observer, so while someone might notice haloing on a 1,000-zone display, a 5,000-zone display might be fine.
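The mastering-peak point can be shown with a trivial hard-clip sketch. Real TVs tone-map more gracefully than a hard clip, so this is only an illustration of the headroom argument, not how any shipping display actually behaves:

```python
def displayed_nits(scene_nits, display_peak):
    """Hard-clip model: the display shows the scene value up to
    its own peak. Illustrative only; real displays tone-map."""
    return min(scene_nits, display_peak)

# Content mastered to 800 nits never exceeds 800, so a 1,000-nit
# and a 4,000-nit display show its brightest highlight identically:
print(displayed_nits(800, 1000), displayed_nits(800, 4000))
# Only content mastered brighter exercises the extra headroom:
print(displayed_nits(4000, 1000), displayed_nits(4000, 4000))
```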

For games in general I'm rather a fan of higher brightness as most games seem to be perfectly willing to take advantage of it (many will render up to 10,000 nits if allowed to).

Bruh... this is a TV. And what exactly did LCDs do to you? :ROFLMAO:
The fanboyism of some people is unreal. "Everyone has to like the technology I like and have the same opinion I do!"
 
I can't believe the screen on my MacBook Pro isn't OLED. Apparently it's just mini-LED and not micro-LED, but this thing is wild. I was definitely a big believer in OLED and really wanted an LG OLED for the house, but it was just too expensive.

However, I don't think I could tell an appreciable difference between OLED and mini-LED unless I sat there and hunted for it.
 
I can't believe the screen on my MacBook Pro isn't OLED. Apparently it's just mini-LED and not micro-LED, but this thing is wild. I was definitely a big believer in OLED and really wanted an LG OLED for the house, but it was just too expensive.

However, I don't think I could tell an appreciable difference between OLED and mini-LED unless I sat there and hunted for it.
The two things that are noticeable, though not large, even on a high-zone-count screen like that, are response time/blur and viewing angles. OLEDs have sub-millisecond response times for everything except black-to-low-level transitions, and the mini-LED panels tend to be on the slow side for LCDs; not sure why. So you can notice some difference there. Viewing angles are also an area where OLED can be better, though how much depends. QD-OLED is amazing, with near-perfect viewing angles. WOLED isn't quite as good, and good IPS can come closer to it.

But ya, the difference for things like black levels between good MiniLED and OLED is less than many people think.
 
I mean... he nails it in the video. It 1000% is marketing, and about doing bigger displays. If this were for the pro market it would be a different story, but for the consumer market... it is marketing. It's about being able to make crazy-large TVs, which are already starting to get popular in other markets.
 
I would argue the only things OLED does better than mini-LED are motion and the lack of any haloes. Everything else, a well-dialed-in mini-LED does better: better brightness, both in simple test patterns and in complex imagery where average brightness matters more than a lone highlight, and from this better color volume (a byproduct of brightness); equivalent if not better gamuts on IPS panels (the highest measured gamuts currently are IPS panels); better near-black performance (i.e., no crushing of near-blacks and dark greys); basically no discernible pure-black difference (mini-LEDs can achieve < 0.1-nit blacks, and the consumer's eye is hard-pressed to see lower in real-world media consumption, nor is likely to care); far better durability; better energy efficiency (mini-LED is driven differently than OLED's full-screen fixed voltage output); no color shift when viewed at an angle (though OLED has superior viewing angles); and better text rendering due to the standard RGB pixel substructure.

I would argue OLED will be phased out once mini-LED zone counts increase enough that haloing lessens, because despite all the party tricks the OLED industry is using to compete with mini-LED (MLA, changing pixel substructures, etc.), it cannot overcome the laws of physics: OLED uses organic material, which is by its nature fragile. There is a reason organic emissive technology has been known for over 40 years (discovered in 1979) and the industry delayed its use in the market. This is the reason big players like Sony are moving back to mini-LED.
 
I would argue the only things OLED does better than mini-LED are motion and the lack of any haloes. Everything else, a well-dialed-in mini-LED does better: better brightness, both in simple test patterns and in complex imagery where average brightness matters more than a lone highlight, and from this better color volume (a byproduct of brightness); equivalent if not better gamuts on IPS panels (the highest measured gamuts currently are IPS panels); better near-black performance (i.e., no crushing of near-blacks and dark greys); basically no discernible pure-black difference (mini-LEDs can achieve < 0.1-nit blacks, and the consumer's eye is hard-pressed to see lower in real-world media consumption, nor is likely to care); far better durability; better energy efficiency (mini-LED is driven differently than OLED's full-screen fixed voltage output); no color shift when viewed at an angle (though OLED has superior viewing angles); and better text rendering due to the standard RGB pixel substructure.

I would argue OLED will be phased out once mini-LED zone counts increase enough that haloing lessens, because despite all the party tricks the OLED industry is using to compete with mini-LED (MLA, changing pixel substructures, etc.), it cannot overcome the laws of physics: OLED uses organic material, which is by its nature fragile. There is a reason organic emissive technology has been known for over 40 years (discovered in 1979) and the industry delayed its use in the market. This is the reason big players like Sony are moving back to mini-LED.

OLED will never be extinct, and neither will mini-LED, because there are die-hard customers on both sides who will never purchase the other tech. There is room in the market for both to coexist. Unless micro-LED stops being a pipe dream and actually becomes a thing for everyone, that is; then OLED and mini-LED will both die.
 
MiniLED displays can get brighter than OLED, a lot brighter, and it is totally something you can see and appreciate.
LCDs have always gotten brighter than competing technologies that offer better contrast. That's not anything new. And yes, it sells TVs in the stores. That isn't new.

The problem is always that going brighter is impressive, but if you're elevating the black levels in the surrounding area at the same time, you're not actually improving contrast.

As for blooming, we'll see how it goes, but get the backlight array small enough and it won't matter, you can't tell.
You're talking about something that happens at far higher densities of individually addressable LEDs than are currently available. Do you think Sony is going to drop a mini-LED TV with 100K zones? I sure don't.

Along with that peak brightness and haloing, you also get worse color accuracy, especially on moving objects (because the slowly changing backlight can't maintain good colors), and legitimately terrible response times (TV manufacturers have never made LED backlights fast enough to be tolerable in gaming; the only company that ever managed the hardware for that was Nvidia).

What I see is a change in corporate strategy from Sony, but no significant change in technology backing that change. Maybe it's a good change for them and they'll sell more TVs but I don't see it producing better image quality.

Honestly, I think Sony likes to pretend that they're still the technological leader in TVs that they used to be. But they just aren't anymore.
 
LCDs have always gotten brighter than competing technologies that offer better contrast. That's not anything new. And yes, it sells TVs in the stores. That isn't new.
At least with regard to max brightness in tests: when actually displaying something from real life, brightness drops substantially to hide haloing/blooming. And as you mention, contrast is noticeably lower compared to OLED as well.
 
I can't believe the screen on my MacBook Pro isn't OLED. Apparently it's just mini-LED and not micro-LED, but this thing is wild. I was definitely a big believer in OLED and really wanted an LG OLED for the house, but it was just too expensive.

However, I don't think I could tell an appreciable difference between OLED and mini-LED unless I sat there and hunted for it.
It's very good for HDR...but has the absolute worst pixel response times you can buy. Like truly terrible. It is a blurry mess in motion for anything above 30 fps.
 
I would argue the only things OLED does better than mini-LED are motion and the lack of any haloes. Everything else, a well-dialed-in mini-LED does better: better brightness, both in simple test patterns and in complex imagery where average brightness matters more than a lone highlight, and from this better color volume (a byproduct of brightness); equivalent if not better gamuts on IPS panels (the highest measured gamuts currently are IPS panels); better near-black performance (i.e., no crushing of near-blacks and dark greys); basically no discernible pure-black difference (mini-LEDs can achieve < 0.1-nit blacks, and the consumer's eye is hard-pressed to see lower in real-world media consumption, nor is likely to care); far better durability; better energy efficiency (mini-LED is driven differently than OLED's full-screen fixed voltage output); no color shift when viewed at an angle (though OLED has superior viewing angles); and better text rendering due to the standard RGB pixel substructure.

I would argue OLED will be phased out once mini-LED zone counts increase enough that haloing lessens, because despite all the party tricks the OLED industry is using to compete with mini-LED (MLA, changing pixel substructures, etc.), it cannot overcome the laws of physics: OLED uses organic material, which is by its nature fragile. There is a reason organic emissive technology has been known for over 40 years (discovered in 1979) and the industry delayed its use in the market. This is the reason big players like Sony are moving back to mini-LED.


The only reason IPS may show less black crush is that they are usually tuned to relative gamma: the darkest levels are compensated and pushed up to fit the limits of the panel. Calibrate them to absolute 2.2 gamma (or better, 2.4, which is the current standard for SDR movies) and you get even worse black crush than OLED, because they simply cannot display dark details as dark as they should be; their black levels do not go deep enough for that. OLED can show those darker just-above-black details at more correct brightness levels than a typical IPS with its compensated relative gamma curve, even if the first couple of steps are easily crushed.
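To put rough numbers on that, here's a sketch using a plain power-law EOTF. The 100-nit SDR white and the 1000:1 native-contrast figure for the hypothetical IPS are my illustrative assumptions, not measurements:

```python
def target_nits(code, gamma=2.4, white_nits=100.0):
    """Absolute-gamma EOTF: 8-bit code value -> target luminance
    in nits, for an assumed 100-nit SDR white."""
    return white_nits * (code / 255.0) ** gamma

# Assumed IPS black floor: 100-nit white at 1000:1 contrast.
ips_black_floor = 100.0 / 1000.0  # 0.1 nits

# The first steps above black land *below* that floor, so an
# absolute-gamma calibration has to crush them on this panel:
for code in (1, 5, 10, 20):
    t = target_nits(code)
    print(code, round(t, 4), "crushed" if t < ips_black_floor else "ok")
```

Under these assumptions, code values up to roughly 10/255 map to targets darker than the panel's native black, which is exactly the region a relative-gamma tuning lifts and an OLED (with effectively zero black floor) can render at the intended level.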
 
LCD apologists, your excuses now, if you would.


https://www.youtube.com/watch?v=VM7eFI5pqSY


You still don't get it. The organic pixel degrades with heat. Over a small area it can go bright by sinking heat into the surrounding area or with improved cooling, but once you light a large area, brightness drops like crazy. All OLEDs are pretty much limited to about 200 nits at 100%. This is actually worse because you get 3,000 nits at 3% but still drop to 200 nits at 100%, so the fall-off gets more dramatic as the bright area increases.
 
It's very good for HDR...but has the absolute worst pixel response times you can buy. Like truly terrible. It is a blurry mess in motion for anything above 30 fps.

Not everyone is that sensitive to motion blur, and even then it only matters if gaming is all you do with your monitor. A lot of larger monitors fill many roles now, from a desktop TV for streaming media to providing large screen real estate for work without having to enlarge fonts. A good QLED like a Samsung QN90B/C does a much better job overall than any OLED.
 