Why OLED for PC use?

The problem with WRGB is that the higher its brightness gets, the more its colours get washed out. You can see it clearly in a side-by-side comparison against quantum-dot-based IPS/VA/OLED displays. And I don't think MLA is a solution for that. The goal of this year's META + MLA WRGB is to beat QD-OLED in brightness just so they can say they have the brightest OLED TV in 2023, and in that department only. They don't take other factors like colour coverage into consideration.

This is one of my major issues with OLED: it needs to be RGB again.
I understand how difficult that is while keeping cell life up, but it's the truth.

QD-OLED is a great improvement, but its lifespan has serious issues on version 1. I'm not sure version 2 will be good enough to tempt me either, but I will keep a sharp eye on how things develop.
MicroLED won't feature in the 2000-nit-plus category with decent lifespan for many years, unless the whole back of the TV becomes a heatsink. Perhaps this is the way forward, and there should be some OLEDs built this way as well; it would solve many issues.
 
Been playing some RE4 Remake on my 32M2V and I'm blown away by the complete lack of blooming on it. The RE remakes were some of those games that I couldn't stand to play on my X27 due to the excessive blooming, which was just far too distracting vs. my LG CX. Yes, I can still tell that the blacks aren't truly OLED-level black at times on the InnoCN, but overall its performance has continued to impress me.
 
The 4-month update vid is up from Rtings.



Sounds like LG is still the way to go.

I also learned that they pronounce their company "ratings" like the word. I've been calling it R-tings for years! ArrTings :p

Also, a side note. I did not realize even LG's models were just using color filters over white OLEDs.

I was under the impression that OLED was supposed to have a native little emitter for each subpixel color (R, G, B, etc.).

I guess I missed a memo at some point.
 
Some colors die faster, so companies use a single color, or two colors, almost like a subpixel-level backlight in a way.

LG uses all white, and the "white" subpixel in their WRGB is just a clear spot for the white OLED emitter to shine through, I think. Samsung uses blue x3 plus green, at least they used to. I don't know if they've released anything with phosphorescent blue yet, but they aren't the only ones developing it.


https://www.techradar.com/news/samsungs-new-oled-tech-could-mean-cheaper-qd-oled-tvs (2022)

Samsung Display has been working on a new way of producing OLED pixels for its QD-OLED TV panels, according to a presentation by a South Korean professor, reported by The Elec. Specifically, Samsung has been working on a better kind of blue OLED pixel, and this could be just the boost the technology needs to be able to start dropping its prices and competing directly with the regular (and cheaper) OLED panels used in the best OLED TVs currently.

Here's the background: red and green OLED pixels are made of phosphorescent materials that provide 100% internal efficiency for brightness. But blue OLED pixels are made from fluorescent materials that provide only 25% efficiency.

Unlike regular OLED screens, which use red, green and blue pixels together to make colors, QD-OLED screens use only blue and green pixels (with the layer of quantum dots converting the light to other colors). The Elec notes that currently, QD-OLED screens use three layers of blue pixels (because their efficiency is so limited), and one layer of green pixels.

Samsung Display's research was apparently to create a phosphorescent blue pixel, hopefully bringing the blue efficiency up to match the other colors – the company has already published a paper based on its work. But according to Professor Kwon Jang-hyuk of Kyung Hee University, Samsung is "prioritizing applying phosphorescent blue OLED material on its advanced TV panel," says The Elec.


Professor Kwon said that "Samsung Display likely had made progress on developing phosphorescent blue OLED material" in practice, not just in theory, and he said that Samsung intends to show off something using the new tech "within the year".


The advantage here is that Samsung wouldn't need to use four layers of pixels anymore – it could reduce it down to just two: one layer of blue, one layer of green. This would reduce both the materials needed and the complexity of production of its OLED panels – it would make them more efficient to produce, as well as more efficient at putting out light.
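To put rough numbers on the article's layer math, here's a back-of-envelope sketch (my own illustration, not Samsung's engineering figures; it just assumes light output scales with internal quantum efficiency times layer count):

```python
# Back-of-envelope sketch of the article's layer math. My own illustration,
# not Samsung's numbers: assumes light output scales linearly with
# internal quantum efficiency (IQE) times layer count.

FLUORESCENT_BLUE_IQE = 0.25   # ~25% internal efficiency (per the article)
PHOSPHORESCENT_IQE   = 1.00   # ~100% internal efficiency

current_stack = 3 * FLUORESCENT_BLUE_IQE   # 3 fluorescent blue layers = 0.75
pholed_stack  = 1 * PHOSPHORESCENT_IQE     # 1 phosphorescent blue layer = 1.00

print(f"3x fluorescent blue layers: {current_stack:.2f} relative blue output")
print(f"1x phosphorescent layer:    {pholed_stack:.2f} relative blue output")
# One PHOLED layer would out-emit three fluorescent layers while cutting the
# stack from 4 layers (3 blue + 1 green) to 2 (1 blue + 1 green).
```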
 
The unfortunate problem with this method is that 2/3 of the light power is wasted, except for the intended white pixel, which washes the colour out.
If they could properly generate the colours without the waste, OLED power use and heat generation would be a LOT lower.
This is where MicroLED will likely sail past OLED, once they sort out the teething problems.
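Rough arithmetic behind the "2/3 wasted" claim, assuming idealized filters that each pass about a third of the white spectrum (my own simplification; real filter transmission curves differ):

```python
# Rough arithmetic behind the "2/3 wasted" claim. Idealized numbers of my
# own; real color-filter transmission curves differ.

WHITE_EMISSION = 1.0      # relative white-OLED output per subpixel
FILTER_PASSBAND = 1 / 3   # each R/G/B filter keeps roughly a third of the spectrum

for sub in ("R", "G", "B"):
    out = WHITE_EMISSION * FILTER_PASSBAND
    print(f"{sub} subpixel: {out:.2f} emitted, {WHITE_EMISSION - out:.2f} absorbed as heat")

# The W subpixel skips the filter (1.00 out, 0.00 wasted), which is exactly
# why leaning on it boosts brightness but dilutes the colors.
```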
 
2022:

https://www.whathifi.com/news/this-...at-much-brighter-oled-tvs-could-be-on-the-way

"UDC expects to meet the target specifications for phosphorescent blue by the end of the year", said Mike Hack, Vice President of Universal Display, according to Korean newspaper ETNews.


If so, UDC, which currently supplies TV titans such as LG and Samsung, could begin shipping blue 'PHOLED' materials in early 2024. And since TV makers tend to launch their flagship models in March/April, we could find ourselves reviewing the first ever blue PHOLED TV by late 2024.

Yet another acronym to learn.. PHOLED.

It's supposed to be FOH-LED but I'm sure someone is going to think PeeHOLED.
 
https://www.flatpanelshd.com/news.php?subaction=showfull&id=1670318058

After an expected launch of OLED TVs with micro lenses in 2023, phosphorescent blue could markedly improve efficiency in OLED TVs in 2024-2025.

OLED emission can be divided into two types: fluorescence and phosphorescence. Red and green OLEDs in displays have already transitioned to phosphorescence (PHOLED), which has up to 100% internal luminous efficiency. Blue OLED is still fluorescent, which has around 25% internal efficiency.

The industry has for years been researching switching blue OLED to phosphorescence, as it can markedly increase efficiency to enable higher brightness at the same energy level, or similar brightness at a reduced energy level, or something in between.
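A simple proportional model of that brightness-vs-energy tradeoff (my own illustrative sketch; it just assumes luminance scales with drive power times internal efficiency):

```python
# Simple proportional model of the tradeoff above. Illustrative only:
# assumes luminance scales with drive power times internal efficiency.

def luminance(drive_power: float, iqe: float) -> float:
    return drive_power * iqe

def power_needed(target_luminance: float, iqe: float) -> float:
    return target_luminance / iqe

POWER = 100.0                          # arbitrary drive-power units
today = luminance(POWER, 0.25)         # fluorescent blue -> 25 light units
print(f"PHOLED at the same power:      {luminance(POWER, 1.0):.0f} light units (4x brighter)")
print(f"PHOLED for today's brightness: {power_needed(today, 1.0):.0f} power units (1/4 the energy/heat)")
```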


. . . . .

This is where MicroLED will likely sail past
MicroLED is going to be many years away at anything enthusiast-consumer priced, I think. Per-pixel emissive LED tech is the next big frontier for sure, but we will see a lot of optimizations and gains from the existing stopgap technologies over the next several years.
 
 
  • Like
Reactions: elvn
like this
WOLED seems to have reached its peak with the 2016 C6. All they've done since then is make it more resistant to burn-in and boost the pure white brightness on the G2 and G3.
I've been rooting for LG Electronics to abandon WRGB for years now. They could easily do it after the G3, unless the pride of having to use a competitor's superior panel tech is holding them back.

Imagine a G4 based on QD-OLED or IJP RGB-OLED; it would blow Samsung's and Sony's 2024 flagships out of the water, and likely Philips' and Panasonic's as well. The G and C series were held back by their own panel. It seems both LG Electronics and LG Display just don't want to abandon this technology even though it has reached its absolute limit despite these enhancements, kind of like Samsung Electronics' decision to keep pushing their FALD VA QLED panels year after year.
 
Competition is good, so if both companies can keep improving their tech, including the VA QLED stuff, that just means better displays for all of us. I mean, LG didn't exactly have much competition in OLEDs for years, so they decided to focus on improving burn-in resistance over other capabilities.
 
Agreed on both counts (Xar and kasakka)

A micro lens array + quantum dot layer would be nice though.. Layer up the front and then fatten up the back with a big vented housing, big heatsink and active cooling. I don't need laptop screen thinness on a TV and especially not for pc-desk battlestation/mediastation use.

It's probably not pride in that way. It's proprietary tech that they'd have to pay to use, if their competitor even allowed them to license it. So the pride/greed might be more on the keep-it-for-themselves side rather than the not-willing-to-use-it side. Hell, Samsung doesn't even want to pay for Dolby Vision, and I think monitor companies dragged their feet on putting HDMI on their screens because they didn't want to pay for HDMI vs. DisplayPort's open standard.
 
I don't need laptop screen thinness on a TV and especially not for pc-desk battlestation/mediastation use.
Yeah, I still don't know what the hell that has been about. Nobody was complaining that TVs or displays were too thick, yet at some point manufacturers started pushing for thinner and thinner TVs too. Now with OLEDs we have very thin panels, except for the electronics bulge, so what's the point? It just makes them more fragile to move.
 
And probably the reason it gets too hot and burns in? What about an appropriate heatsink? Lol
 
A micro lens array + quantum dot layer would be nice though.. Layer up the front and then fatten up the back with a big vented housing, big heatsink and active cooling. I don't need laptop screen thinness on a TV and especially not for pc-desk battlestation/mediastation use.

I agree that TVs being super thin is silly and doesn't buy us anything, but I am totally against active cooling.

If I were to buy a TV and I heard even the slightest amount of barely perceptible fan noise, that thing would be returned within 30 seconds.
 
Same with a monitor. Not interested in active cooling at all due to noise and yet another potential point of failure. I'd happily take a display several times as thick as mine with a chunky passive heatsink for performance gains, though.
 
Same with a monitor. Not interested in active cooling at all due to noise and yet another potential point of failure. I'd happily take a display several times as thick as mine with a chunky passive heatsink for performance gains, though.

100% agreed.

If a screen of any kind has active cooling, it doesn't matter to me what other features it does or doesn't have or how awesome it looks. It instantly falls into the "never buy" category.

There is no room for negotiation here.

You could make the perfect god of screens, but if it has a fan it belongs in only one place: the trash.
 
Put it on water like everything else in the PC can be. High-performance devices equal high heat. I wouldn't want that tradeoff in my living room, but on a PC it would be fine, considering I already have two water loops with fans. Replaceable, modular components (fan, radiator, pump, if any) would go a long way toward making it more palatable though.


I think the ProArt models have active fans, and so do some of the $25k-plus professional reference monitors.
 
A pa 32ucx review on amazon..


“For the price this monitor should never have been sold. It is a very BULKY monitor, the brightness is nice but its nearly too bright. I had a sunburn in less than 30 minutes in front of it (joking). If you want to adjust the brightness there are only three settings. Imagine that. You can't do it 1% at a time ONLY three possible brightness levels.


The main issue with the monitor is noise. You have NO control over the fans. When the monitor is first on you get no fan noise. A few minutes into using the computer you will get a slight fan noise. The noise is roughly 7db. I measured it as I am in a sound studio. Then about every 30 minutes for about 5 minutes you get a super loud fan noise that is really loud and it puts an immovable image on the monitor telling you "cooling system is turned on, not this is abnormal using scenario". And what does that phrase even mean? The noise is STARTLING. I jumped once it kicked in so loud. Also you cannot move or do anything with the window that pops up from the monitor so you have to move anything you want to see off to the side. I went to take a screen shot and post it - but its a monitor image not a computer image so it didn't work. Took one with the phone instead.
This monitor is a horrible purchase.
I disagree with the other reviewer, the color and tone of the pictures and videos is horrible. I have tuned high end TVs and understand what color should look like and this monitor isn't it.”

That sounds bad (literally), but he may have had a lemon or set the screen up badly somehow, I don't know. Regardless, I think manufacturers could implement active cooling on displays better than that. We have relatively quiet cooling on PC CPUs and GPUs, with a lot of customization of profiles via software and a lot of modularity for replacement.

So on one hand there are ProArt displays that hit 1400 nits for long sustained periods, using boxy housings with airflow grilles and active fans on cooling profiles . . and on the other hand we have 2000-nit-plus Samsung 4K and 8K QLED FALD LCD models that suffer aggressive ABL instead. As we get to higher peak HDR brightness in both FALD and OLED, we may have to look to better cooling methods.
 
Sounds like LG is still the way to go.

I wish they would do the tests in a PC environment. Who is going to watch CNN for 20 hrs a day?
 
A pa 32ucx review on amazon..

Indeed, I have the UCG and its border is THICK, but it doesn't have a huge backprint or anything. It is quite heavy, unfortunately.

I have only heard the fan a few times, usually when I have multiple O3DE windows open.
 
I wish they would do the tests in a PC environment. Who is going to watch CNN for 20 hrs a day?
Hopefully they'll do more PC-oriented things when they test the monitors, I agree.

That said, I do think it makes sense for TVs, since network bugs like CNN's were one of the things I did experience burn-in from back when I had my early-gen LG OLED (along with Zelda: Breath of the Wild hearts and other TV channel bugs).
 
A pa 32ucx review on amazon..

I had the UCG briefly (about a week I'd say). It is a thicc boy, and HEAVY. The fan noise wasn't an issue for me in my brief usage, and I could have dealt with the bulk if I was otherwise happy. Unfortunately, it had many quality control issues out of the box, which would have prevented me from keeping it even if I had liked it and wanted another of that model. There was something akin to an eyelash [maybe a plastic shaving?] inside the screen, and I was able to resolve it by light tapping, but in doing so discovered at least 3 dead pixels and a stuck pixel. It also took forever to come out of standby and sometimes just didn't seem to want to. Colors and such seemed pretty great, I will say, and it could get extremely bright, but the local dimming was still a dealbreaker on desktop mode, and for the price, I decided a different type of monitor probably was a better fit for me instead of going through panel swaps. (I'd seen some reports of people exchanging many times before getting a good panel without dead pixels for this model).
 
I'm fine with a thick chassis and fans with good bearings, even water cooling with a good pump (and a replaceable/modular design).

I could see heatsink + active cooling (fan) design on very hot running displays having OSD cooling profiles just like a gpu.

Max performance with high fan ramping up at very high brightness peak.
Medium performance at a lower capped brightness threshold and/or aggressive ABL (like throttling a gpu at high heat).
Low performance at lower brightness cap where fans and ABL would never kick on.
 
I feel fans on monitors are executed like crap. They are often mounted somewhere totally inconvenient for airflow like where the stand attaches because that's the thickest part. Just make the whole thing thicker and put it somewhere where it can draw air in (or push it out) while being able to run at a very low RPM so it's inaudible. Pick a fan that does well in that situation.

Last display with a fan I've owned was a 50" Panasonic plasma TV. Never heard them.
 
Same here, it had 3 fans and I didn't hear them.
I gave it to a family member and it's still working great for PC gaming, no burn-in yet despite being 15 yrs old!
 
Yea I would generally say no to fans in monitors but if justified and done well it could be fine. Slap a big heatsink there and a large fan that spins as slowly as possible and it'll be fine.
 
I personally don't have a problem with fans if they are literally silent. Not marketing-speak silent, actually silent. Considering the size and surface area of TVs, they could have very large, very slow-rotation fans.

I also agree that on a TV, I don't mind extra depth. It kinda doesn't matter if a TV is 2" or 4" deep. A bigger issue would likely be the weight of a much larger heatsink/fan(s), more than anything else the increased depth would bring. I could see a bunch of people accidentally mounting their TVs onto the drywall and having their $6000 TV drop onto the floor because they didn't properly anchor it into the studs.
 
Theoretically they could set up profiles for you to choose from that work a lot like GPU profiles do (see the sketch below). That way you could select how you want to use it and even change depending on what you're doing: HDR gaming vs. media playback, whether you're using a headset, etc. If that were the case you'd have options rather than an all-or-nothing approach.

If you never want any fans: a low-performance profile where the brightness never hits the peak capability in nits and heat. Like a passively cooled GPU.

If you want very quiet, low-speed fans: a medium-performance profile with a higher peak-nits/heat threshold and ABL "throttling". Like a stock or slightly underclocked GPU with a conservative fan profile and throttle threshold in place.

If you want the highest brightness/heat possible: a high-performance profile with more aggressive fan ramping. Like an overclocked GPU (even factory overclocked) with a high fan curve.
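Sketched as a hypothetical OSD config, since no shipping display actually exposes profiles like this; all names and numbers here are invented for illustration:

```python
# Hypothetical cooling-profile config, as described above. Purely
# illustrative: no shipping display exposes an API like this, and all
# names and numbers here are invented.
from dataclasses import dataclass

@dataclass
class CoolingProfile:
    name: str
    peak_nits: int           # brightness cap before any throttling
    fan_max_rpm: int         # 0 = fully passive
    abl_window_pct: int      # % of screen at peak brightness before ABL engages

PROFILES = [
    CoolingProfile("low",    peak_nits=600,  fan_max_rpm=0,    abl_window_pct=100),  # passive; never throttles
    CoolingProfile("medium", peak_nits=1000, fan_max_rpm=800,  abl_window_pct=25),   # quiet fan, ABL as backstop
    CoolingProfile("high",   peak_nits=1600, fan_max_rpm=1800, abl_window_pct=50),   # full HDR pop, audible under load
]

def pick(name: str) -> CoolingProfile:
    """Select a profile from the OSD, GPU-fan-curve style."""
    return next(p for p in PROFILES if p.name == name)

print(pick("medium"))
```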
 
I have a 65" S95C on the way as well. The 55" S95B has been such a treat. QD OLED is where it's at! If you can dial in the proper sitting distance, these Samsung TV's can't be beat for PC gaming.
It's honestly the Samsung software that gives me pause. While my next TV will most likely be QD-OLED, I find that LG's WebOS has generally worked quite nicely.
 
Agreed. When my S95C comes in, it's staying offline. I am going to be very picky about updating firmware with this set! Samsung needs to step up their game with the OS and the forced firmware updates. They've turned a lot of people off in this regard.
 
I personally don't have a problem with fans if they are literally silent. Not marketing-speak silent, actually silent. Considering the size and surface area of TVs, they could have very large, very slow-rotation fans.
Reminds me of my old Sony plasma. Thing weighed 120lbs I think. Maybe more? That thing was a beast. Had six fans I think. Dead silent though
 
I have a 65" S95C on the way as well. The 55" S95B has been such a treat. QD OLED is where it's at! If you can dial in the proper sitting distance, these Samsung TV's can't be beat for PC gaming.

QD OLED monitor options are currently only ultrawide 1440p crap, but supposedly the TFTCentral article that's going to become public in a few days contains information about some 4K, monitor-sized QD OLED options that are in the works. I'm definitely off the WOLED train myself and sticking with a MiniLED until a good QD OLED monitor option comes around.
 
The unfortunate problem with this method is that 2/3 of the light power is wasted, except for the intended white pixel, which washes the colour out.
If they could properly generate the colours without the waste, OLED power use and heat generation would be a LOT lower.
This is where MicroLED will likely sail past OLED, once they sort out the teething problems.
Unfortunately, that technique accelerates burn-in.

OLEDs used to have separate R,G,B emitters, but they don't do that for PC monitors due to burn-in issues. Old direct-emissive RGB-OLED gains permanent burn-in very fast.

Emissive displays using different chemistries burn in at different speeds for red, green, and blue, so that's why Samsung unified its OLED light emitters to blue, and LG unified its OLED light emitters to white.

Unifying the light-emitting substance (chemical/semiconductor makeup, etc.) greatly improved OLED lifetime: find the longest-lasting emitter substance you can, and use it for all colors for the longest lifetime.

So it's kind of a pick-your-poison game; the ability to run office productivity applications is why LG WOLEDs usually tend to be the better jack-of-all-trades displays. Computer use is more burn-in risky, and the unification of light emitters helps a lot.

Since LCD has more inefficiencies (polarizers in addition to filters), while OLED only needs the filtering or QD step, OLEDs tend to be more efficient than LCDs in mixed-usage scenarios (some pixels dark, some pixels bright). That being said, QD-LCD displays run on a very similar principle to QD-OLED, but they still have to contend with polarizer-based light losses.

Even MicroLED burns in -- you've seen old JumboTrons/video billboards that go speckled/burnt in. The same thing happens to smaller MicroLED displays, and they are still working on that!

It's all pick-your-poison, from a science/physics perspective.
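A toy model of why mixed emitter chemistries drift in color rather than just brightness; the wear constants are invented for illustration, not measured data:

```python
# Toy model of why mixed emitter chemistries burn in unevenly.
# Wear constants are invented for illustration, not measured data.

HOURS = 10_000
WEAR_PER_KHR = {"red": 0.5, "green": 0.7, "blue": 2.0}  # % luminance lost per 1,000 h

def remaining(color: str, hours: float) -> float:
    """Fraction of original luminance left after driving a static image."""
    return max(0.0, 1.0 - (WEAR_PER_KHR[color] / 100.0) * (hours / 1000.0))

# An RGB-OLED showing a static white logo drives all three chemistries:
for color in WEAR_PER_KHR:
    print(f"RGB-OLED {color}: {remaining(color, HOURS):.0%} luminance remaining")
# They decay at different rates, so the worn patch drifts in COLOR, not just
# brightness. A unified emitter (all-white or all-blue) ages every subpixel
# with the same chemistry, so the wear is uniform and easier to compensate.
```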
 
There are a lot of clever workaround solutions in both OLED and FALD for sure. The heat issue, even with power-efficiency gains, will probably have to be addressed with heatsinks and active cooling at some point, I'd think. Some of the higher 1400-1600 nit ProArt screens already have boxier housings with air-grille venting plus active fan cooling, and the Samsung 2000-nit-plus 4K and 8K screens instead have aggressive ABL, even though they are QD FALD LCD and not OLED.


Samsung apparently uses two colors and converts them to the rest, but it's the same idea. They triple-layered the weaker blue along with a single green layer, then put it through a quantum-dot layer for the remaining colors. LG is all white, plus one clear subpixel to let the white show through.

https://www.techradar.com/news/samsungs-new-oled-tech-could-mean-cheaper-qd-oled-tvs (2022)

LG is also working on incorporating phosphorescent blue in their future screens.

. . .


https://www.flatpanelshd.com/news.php?subaction=showfull&id=1670318058

 
Yes, choosing the main OLED emitter color is where LG and Samsung took different approaches. The main emissive colors that Samsung and LG chose are what they (separately) believe are the longest-lasting OLED light emitters, which are then used to produce every primary color.

One of the many reasons Samsung chose blue is that they found a long-lasting blue, and it is easy to use QDs to downconvert short-wavelength light to longer-wavelength light.

One of the many reasons LG chose white is that it's easy to use a white subpixel as a brightness booster and as relief so the red/green/blue subpixels don't get overloaded (burn-in resistance).

If you're doing computer productivity work (office), it's hard to beat LG WOLED. Easy on the eyes and not a problem for Office / Visual Studio / etc., and text is fine with a MacType-style AA approach. You can use Better ClearType Tuner, or better yet, use the open source MacType-for-Windows system at www.mactype.net ...

Colors at high luminance will look more saturated on blue-based QD-OLED than on white-based WOLED, but let's face it -- even LG WOLED blows away the common non-QD LCDs (the garden-variety 72%-to-99% NTSC Dell/HP office fare) by a wide margin. Even though the white subpixel reduces saturation, it doesn't reduce it to a magnitude worse than those desktop displays. It's an obvious compromise.

LG WOLED is durability-ready for office productivity with the common white-on-black and black-on-white text widely used in the hybrid work+gaming use cases increasingly common among WFH types. Currently, of all the prototype/DVT monitors I have access to, I've chosen (and am allowed to disclose) the Corsair Xeneon Flex as the monitor that stays on my main office desk. While it's not 4K, the giant size allows many documents open at once, and the Windows 11 window-tiling manager makes it easy to arrange windows, even in 3 side-by-side full-page Word/Excel columns, or various panes of Visual Studio / Visual Studio Code, etc. The bendable OLED is great going flat for office productivity, and going curved for great 240fps FOV gaming. I'm in Visual Studio on it all the time now.

Monitors are inherently compromises, but this is vastly better than IPS, TN and VA panels as a general-purpose jack-of-all-trades panel for the vast majority (e.g. people who love high Hz and great colors but are not in paid esports).

You can tell that on an LCD, the whole surface gets warm (after just being powered on for a while). On an OLED, only the bright pixels get warm (if same white solid background is continually illuminated for a while). My Xeneon Flex has been on since this morning, and right now it's 10:30pm -- I just touched the OLED. It's very cold on the non-bright areas (like the HardForum website -- feels icy cool), and only warmish on the bright areas (not even as warm as some of my LED-backlit LCDs get). For an ultrawide, this monitor's power consumption is surprisingly low, very low two-digit watts for the whole 45 incher from a Kill-a-Watt meter.

1440p means you can get framerates that milk 240Hz easily, which is very important and something 4K cannot do. Strobeless blur reduction means 240fps has 1/4th the motion blur of 60fps, which matters if you love brute framerate-based motion blur reduction. With a bit of DLSS 3.0 heft, you can get triple-digit framerates even from Cyberpunk 2077, enough to sufficiently pamper a 45" 240Hz WOLED. A few games like Hogwarts will be a problem, but 1440p means the majority of games will show massive brute-framerate-based, flickerless motion-blur-reduction benefits on these 240Hz OLEDs.
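The persistence math behind that 1/4th figure, for a flickerless sample-and-hold display where per-frame visibility time is proportional to perceived motion blur:

```python
# Standard sample-and-hold persistence math behind the "1/4th the motion
# blur" claim: on a flickerless display, per-frame visibility time is
# directly proportional to perceived motion blur.

def persistence_ms(fps: float) -> float:
    """Frame visibility time (full persistence, no strobing/BFI)."""
    return 1000.0 / fps

for fps in (60, 120, 240):
    print(f"{fps:>3} fps -> {persistence_ms(fps):5.2f} ms persistence")
# 60 fps = 16.67 ms vs 240 fps = 4.17 ms: exactly 1/4th the blur,
# achieved with brute framerate instead of strobing.
```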

Yes, I want 4K dammit, but I'm not going to give up high-quality PWM-free motion blur reduction for it. I'm also working on helping some vendors add BFI to some future 240Hz WOLEDs (although it's a bit too late for the Xeneon Flex), as that's still important, especially for lower frame rates. I'm okay with text at 1440p, especially with text-AA optimizations.

Also, the W subpixel happens to be an ergonomic accident in favor of full-spectrum white for officing use -- yes, it reduces saturation of the brightest colors (but not to worse than LCD), yet the pure broad-spectrum white (instead of a 3-peaky colormix white) can also be more pleasing to many eyes.

If you want to office on an OLED too, the white is just solid and stable, with a "desk-lamp-on-paper" look to its white purity. It's also why RGBW lightbulbs and ribbons exist -- pure broad-spectrum white looks better than colormixed R+G+B. It's almost the CRI90 lightbulb of whites, if you're officing on an LG WOLED. That seems to contribute to why eyestrain is generally extremely low for most people with LG WOLED. It's a stunningly comfortable look. When you're spending 8-12 hours a day looking at a screen, multiple ergonomics categories can matter more than resolution and other attributes.

Pick your poison.

Compromises.

FALD and QD-OLED are great too. Can't really bash WOLED's competition all that much, with all the tech progress.

But the 240Hz WOLED is a great "high Hz + office" jack-of-all-trades if you just want ONE panel that can do almost all of it.

Anyway, yes, efficiency optimizations are coming, but current OLEDs are (on average) much more efficient than the average polarization-based, color-filtered LCD. There are combined light losses from polarization even before color filtering. Even WOLED is more efficient than LCD at the same APL in typical officing and gaming.
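A toy power model of that emissive-vs-backlight difference (illustrative constants of my own, not measurements of any real panel):

```python
# Toy power model of the emissive-vs-backlight difference (illustrative
# constants, not measurements of any real panel).

def oled_power_w(apl: float, full_white_w: float = 180.0, base_w: float = 15.0) -> float:
    """Emissive OLED: panel power roughly tracks average picture level (APL)."""
    return base_w + apl * full_white_w

def lcd_power_w(apl: float, backlight_w: float = 90.0, base_w: float = 15.0) -> float:
    """Global-backlight LCD: the backlight burns the same watts at any APL."""
    return base_w + backlight_w  # APL intentionally unused; content doesn't matter

for apl in (0.05, 0.15, 0.50, 1.00):  # dark forum page -> full white field
    print(f"APL {apl:4.0%}: OLED ~{oled_power_w(apl):5.1f} W | LCD ~{lcd_power_w(apl):5.1f} W")
# At typical desktop/gaming APL (well under 50%) the emissive panel wins,
# matching the icy-cool dark areas described above; only near a full white
# field does OLED draw more.
```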
 
This makes no sense whatsoever. Why would I use Wide Gamut SDR when the game supports native HDR (which looks beautiful)?
This absolutely makes sense when OLED monitors like the 27GR95QE can only display a 300-nit sun in "native HDR" with massive ABL. It can only display the 4th image with capped 650-nit lava. There is the 5th image with lava close to 8000 nits in the actual native HDR.
So the 27GR95QE only shows 8% accuracy, while FALD wide-gamut SDR can show the same 8% accuracy. When FALD shows HDR, it can easily do 1800-nit lava with 22% accuracy.
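A quick hard-clip sketch of where those percentages come from (real tone mapping rolls highlights off more gracefully, but the ratios hold):

```python
# Quick hard-clip sketch of where the 8% / 22% figures come from. Real
# tone mapping rolls highlights off more gracefully, but the ratios hold.

SOURCE_NITS = 8000.0  # the mastered lava highlight cited above

def shown_fraction(display_peak_nits: float) -> float:
    """Fraction of the mastered highlight luminance the display can reach."""
    return min(display_peak_nits, SOURCE_NITS) / SOURCE_NITS

for name, peak in [("27GR95QE HDR (650 nits)", 650.0), ("FALD LCD HDR (1800 nits)", 1800.0)]:
    print(f"{name}: {shown_fraction(peak) * 100:.1f}% of the {SOURCE_NITS:.0f}-nit highlight")
# Prints ~8.1% vs ~22.5%, matching the figures quoted above.
```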
 