Why OLED for PC use?


This is the crushed-black dimming that makes a low-APL scene that should be 1 nit show 0.1 nits instead. Infinite contrast doesn't even help in this situation. It has nothing to do with ABL at larger window sizes.
 
But the 240Hz WOLED is a great "high Hz + office" jack-of-all-trades if you just want ONE panel that can do it almost all.

The low PPI makes text look truly horrible on this 45" 1440p panel. It's usable, but awful for that. It's a reasonable choice for gaming, but you really need a different setup for text-based work if you do a lot of that.
 
Bring on the layers.





Hydro or Magnetic bearing Fans
Thick Heatsink
Electronics
Pho-Blue
Pho-Green
QD-Layer
Micro Lens Array.

. . . .

Would love it if a mfg produced a:

8k 55" 120Hz OLED capable of 4k 240Hz and 4k based uw resolutions.

8k 55" 120hz qd-led FALD LCD capable of 4k 240Hz and 4k based uw resolutions. Would be great if one could use the small zones the macbooks do, extrapolated to 55" 1000R curvature size which would be at least 100,000 zones instead of 6000 to 7000, 45x25. Heat would likely be an issue and would have to be addressed.

Even better if they managed to put an AI upscaling module in the screens themselves to bypass port/cable bandwidth limitations.
(i.e. send 4k relatively high hz signal at max port and cable is capable of, then AI upscale it on the TV end instead of trying to stuff a signal already upscaled to 8k over the ports and cable).
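For a rough sense of the bandwidth gap that idea works around, here's a back-of-the-envelope calculation (raw pixel rate only, 10-bit RGB assumed - it ignores blanking overhead, chroma subsampling and DSC, so it's illustrative rather than a spec):

```python
# Raw, uncompressed bandwidth of a video signal (10-bit RGB assumed).
def raw_gbps(w, h, hz, bits_per_channel=10, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

print(f"{raw_gbps(3840, 2160, 240):.0f} Gbit/s")  # ~60  - 4k 240Hz signal sent over the cable
print(f"{raw_gbps(7680, 4320, 240):.0f} Gbit/s")  # ~239 - the same signal already upscaled to 8k
```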

At 100,000 zones, a 4k screen would be down to around 83 pixels per zone instead of the up to 6,000 - 7,000 pixels per zone we have on today's up-to-1300-zone FALD (or the 829 pixels per zone on Apple's 10,000-zone display). That would probably be a huge difference. Don't forget that more than one zone gets activated across areas to compensate, though - but still, that would be a difference of orders of magnitude if they ever released something like that. You can get 64 PPD and a 60 deg viewing angle on a 55 inch 4k screen at 3.5 feet viewing distance, so a 55 inch screen is not undoable setup-wise using a simple rail TV stand. The radius, or focal point, of a 1000R curve is around 40" too, so it would work well for that if curved.
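The zone math there is easy to sanity check - a minimal sketch using the resolutions and zone counts already mentioned above:

```python
# Pixels per backlight zone for a 3840x2160 panel at various zone counts.
def pixels_per_zone(res_w, res_h, zones):
    return res_w * res_h / zones

print(round(pixels_per_zone(3840, 2160, 1300)))     # ~6380 - current ~1300-zone FALD
print(round(pixels_per_zone(3840, 2160, 10_000)))   # ~829  - the "10,000 zone" figure above
print(round(pixels_per_zone(3840, 2160, 100_000)))  # ~83   - hypothetical 100,000-zone panel
```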


. . .

You can tell that on an LCD the whole surface gets warm (after it has just been powered on for a while), whereas on an OLED only the bright pixels get warm (if the same solid white background is continuously displayed for a while).

A random idea I thought of was using flexible screen tech to make the screen like a film cartridge. The "film"/screen emitter layers would advance a little every time you turned the screen off. In that way, the screen would move where the hotspots were (taskbars/interfaces , HUDS, frequented tv channel and streaming logos, etc). The screen would be one full continuous band so it would continue to rotate through over and over. Just a weird idea.





. . . . . . .

https://www.oled-info.com/tcl-csot-suggests-new-hinged-pixel-design-stretchable-displays



. . . . . . . . . .

Tiny MicroOLED screens and later microLED through stationary lens projector type tech could also be a thing (rather than having to strap a boxy VR headset on).
 
Even better if they managed to put an AI upscaling module in the screens themselves to bypass port/cable bandwidth limitations.
(i.e. send 4k relatively high hz signal at max port and cable is capable of, then AI upscale it on the TV end instead of trying to stuff a signal already upscaled to 8k over the ports and cable).
That would be an interesting feature for a future Nvidia G-Sync module, if they ever make one. For games it might have to support feeding some additional data from an Nvidia GPU, considering DLSS requires things like motion vectors from the game, and Nvidia's video upscaling tech can only be described as "not worth it for anything but really low quality content".

LG TVs already market "AI Picture Pro", but with a little marker saying "*AI Picture Pro will not work with any copyright-protected content on OTT services." That probably makes it useless on streaming media services like most people use these days, and could potentially extend to most content if I interpret "OTT services" right here.

They also market "AI 8K Upscaling uses deep-learning to analyze and restore lost information, transforming lower resolution content into stunningly immersive 8K. And LG TVs feature our upgraded AI 8K Upscaling technology, which uses a one-step variable scale, allowing the device to upscale content from any resolution." on their 8K TVs but I have no idea how well that works in practice.

For games it's probably just easier to feed an 8K display a 4K signal, as you would be really hard pressed to notice a difference from native 8K. I just wish they started supporting higher refresh rates at lower resolutions, so if you are limited to 8K 60 Hz at native res, 4K could be 120 Hz or more and you could have the best of both worlds before bandwidth becomes an issue.

It will be interesting to see if Samsung makes an 8K ARK model this year. In theory at least they could make something like that out of two panels they use for the 57" superultrawide - it would result in a ~63 inch curved display which is honestly quite huge. But if they make a separate panel with the same tech, maybe they could shrink it down to 55" like the current ARK.
 
Even better if they managed to put an AI upscaling module in the screens themselves to bypass port/cable bandwidth limitations.
(i.e. send 4k relatively high hz signal at max port and cable is capable of, then AI upscale it on the TV end instead of trying to stuff a signal already upscaled to 8k over the ports and cable).
Personally I would prefer much simpler solution: integer scaling.

Unlike AI it is so easy to implement that anyone could do it with no effort at all.
Even I was able to make an integer scaler in Verilog. It took one weekend and was my first Verilog program. Really simple to do, and it had about two line-times' worth of latency.
Not having it as an option, especially on gamer-oriented displays, is intentional evil or absolute ignorance on the part of the people deciding on features.
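For illustration only (the scaler mentioned above was written in Verilog for hardware), here's roughly what integer / nearest-neighbour scaling amounts to, sketched in a few lines of NumPy:

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int) -> np.ndarray:
    """Duplicate each pixel `factor` times along both axes (nearest neighbour)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Example: a 2160p frame mapped pixel-for-pixel onto an 8k panel, no blur added.
frame_4k = np.zeros((2160, 3840, 3), dtype=np.uint8)
frame_8k = integer_upscale(frame_4k, 2)   # shape (4320, 7680, 3)
```

In hardware this reduces to repeating each incoming pixel and each incoming line, which is why it only needs a line buffer or two of latency, as mentioned above.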

A random idea I thought of was using flexible screen tech to make the screen like a film cartridge. The "film"/screen emitter layers would advance a little every time you turned the screen off. In that way, the screen would move where the hotspots were (taskbars/interfaces , HUDS, frequented tv channel and streaming logos, etc). The screen would be one full continuous band so it would continue to rotate through over and over. Just a weird idea.
It would make the screen bulky and very complex to design and then to manufacture. It would also introduce additional points of failure.
I also do not think changing the screen's shape this way would be good for it long term, and even if they did make a really bulky display like that, it would sooner or later develop image issues because of all the bending.

The name of the game is to make displays cheaper but still good enough for the typical consumer to buy, and then out-produce everyone else.
This is why we never had dual-layer LCDs for consumers and why FALD displays do not already have hundreds of thousands of backlight zones.
OLEDs could probably also already be much better than what we have, but engineers need to not only figure out how to make something better but also make it viable for cheap mass production.
 
I've said it on the CRT forum but I'll say it here. What about a rear-projection laser-scanned display? Sony already built a pico version of a laser-scanned display.


It scans out the image à la CRT. It probably has good motion clarity because it's single-pixel scanned (I don't know for sure, because no one really tested it for motion clarity in their gaming tests). Yes, the display will have some depth, but surely it can't be as deep as a CRT monitor? With some clever mirror techniques it could probably still be thin enough to be a "flat panel".

I know that folks will probably shoot this down but I'm just dreaming here. It would theoretically have close to infinite contrast as it would simply not shine the lasers if it was displaying a black pixel. Anyways - just a thought. Carry on...
 
Personally I would prefer much simpler solution: integer scaling.
100% agree with this.

If I were to view a 4k signal on an 8k panel my ideal way to get it on the screen is with nearest neighbour scaling. On the desktop in particular I don't want the blurriness of bilinear or bicubic upscaling and I'm not interested in "AI" trying to make things up for me either.

For games and (maybe) video it might be different but I'm happy to leave that to DLSS, FSR, etc.
 
I guess LG's EX tech uses AI learning to figure out emitter wear in advance, which allows LG to remove some wear-sensing electronics from their displays. That results in more output per pixel for a given amount of energy delivered, so it's more efficient. That means lower electricity used per pixel = longer OLED lifespan -or- higher light output.

Deuterium OLEDs are also more efficient.

Phosphorescent blue (and green) OLEDs in 2024 - 2025 will be much more efficient too = longer lifespan -or- brighter output.

Micro Lens Array will also increase efficiency or light output.

It might be up to the mfg whether they use these technologies to boost the light output or to lengthen the screen's lifespan but it would be nice if the users had the option themselves in the OSD. They'd likely use it to boost the brightness anyway since that is more impactful marketing wise.

. . . . . . . . .



. . . . . . . . .


.
https://cen.acs.org/materials/electronic-materials/next-TV-contain-uncommon-isotopes/100/i9

For LG’s OLED.EX technology, DuPont deuterated an existing molecule used to carry electrons in the blue-emissive substack. DuPont won’t say what the molecule is other than to classify it as a polyfused aromatic compound. A big synthetic challenge for the DuPont team, which has been working on deuterated OLED compounds for 15 years, was replacing hydrogen with deuterium while maintaining high purity.

“There are multiple carbon-hydrogen bonds in that molecule, and we are substituting some of them but not all of them,” Herron says. “And the net effect is we apparently increased the lifetime. So we must be addressing some of the weaker carbon-hydrogen bonds that were leading to the degradation behavior.”

LG is advertising brighter displays, not televisions that last a long time. That’s because with OLEDs, longevity and brightness have an inverse relationship.

"What you're doing when you make something run brighter is you're stressing it harder. You're running more current through," Herron says, noting that OLED displays could operate indefinitely if they're dim enough. By using a more robust material, LG can produce displays that are brighter than the previous generation while lasting just as long.

. . .

Protecting benzylic C–H bonds by deuteration doubles the operational lifetime of deep blue Ir-phenylimidazole dopants in phosphorescent OLEDs

https://assets.researchsquare.com/f...-7b43-494e-a184-712423349920.pdf?c=1631860670
Abstract
Much effort has been dedicated to increase the operational lifetime of blue phosphorescent materials in organic light-emitting diodes (OLEDs), but the reported device lifetimes are still too short for the industrial applications. An attractive method for increasing the lifetime of a given emitter without making any chemical change is exploiting the kinetic isotope effect, where key C–H bonds are deuterated. A computer model identified that the most vulnerable molecular site in an Ir-phenylimidazole dopant is the benzylic C–H bond and predicted that deuteration may lower the deactivation pathway involving C–H/D cleavage notably. Experiments showed that the device lifetime (T70) of a prototype phosphorescent OLED device could be doubled to 355 hours with a maximum external quantum efficiency of 25.1% at 1000 cd/m2. This is one of the best operational performances of blue phosphorescent OLEDs observed to date in a single stacked cell.
.

. . . .

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1670318058
After an expected launch of OLED TVs with micro lenses in 2023, phosphorescent blue could markedly improve efficiency in OLED TVs in 2024-2025.

OLED emission can be divided into two types; fluorescence and phosphorescence. Red and green OLEDs in displays have already transitioned to phosphorescence (PHOLED) which has up to 100% internal luminous efficiency. Blue OLED is still a fluorescent which has around 25% internal efficiency.

The industry has for years been researching switching blue OLED to phosphorescence as it can markedly increase efficiency to enable higher brightness at the same energy level or similar brightness at a reduced energy level – or something in-between.
 
The low PPI makes text look truly horrible on this 45" 1440p panel. It's usable, but awful for that. It's a reasonable choice for gaming, but you really need a different setup for text-based work if you do a lot of that.
Depends on what your eyes are most bothered by.

Not everyone sees the same way. Eyeglasses Prescription? Colorblind? (12% of population) See tearing more than stutters? Bothered by color problems more than brightness? Everyone nitpicks differently. Etc, etc.

Some of us get more motion blur eyestrain than spatial eyestrain (text not clear enough). There are solutions like www.mactype.net to make text look much better on OLEDs.

Being Chief Blur Buster, motion blur is a smidge higher priority, and some Blur Busters fans are fanatical like that too. 240Hz OLED has clearer motion than 360Hz LCD, even for mundane things like web browser scrolling!

The motion of LCD, including during fast text scrolling, can create bothersome blurs/ghosting that is seen by some people but not others.

Although I have had guest 4K displays, I'd rather work on a 45" 3840x1440 than a 27" 3840x2160 display. It'd be better if it was a 7680x2880 display, obviously; and use some kind of 2x mode or DLSS upconversion for keeping 240fps up in gaming.

However, the resolution doesn't bother me as much as the motion resolution limitations of other office displays. The text sizes I use don't demand 4K. After years of being forced to work with DELL 1080p displays in office cubicles until I went full-time with the Blur Busters hobby-turned-business, working with 3840x1440 isn't half bad at all. It's more desk-tidy than a two-monitor multimonitor setup too (e.g. 1080p esports monitor + 4K development monitor). I have several 27" 4K displays in the same office, but I've decided to keep the Xeneon Flex on my primary desk as the main jack-of-all-trades display, for improved convenience, to handle a wider variety of Blur Busting content. Especially with the way Windows 11 seems to let you organize windows in a very ultrawide-friendly manner (as a pseudo-multimonitor but on a single monitor).

YMMV, obviously, depending on your priorities and what your eyes feel the most pampered by.
 

I've said it on the CRT forum but I'll say it here. What about a rear-projection laser-scanned display? Sony already built a pico version of a laser-scanned display.
AFAIK, some (maybe not all?) of these used a strange bounce-scan system (scan downwards in one refresh cycle, scan upwards in the next refresh cycle) that created weird motion artifacts during fast horizontal pans. This created an aberration in motion quality relative to a proper CRT tube.

Some re-optimizing to keep scan direction identical in all refresh cycles, would nail the quality, IMHO.
 
Depends on what your eyes are most bothered by.

Not everyone sees the same way. Eyeglasses Prescription? Colorblind? (12% of population) See tearing more than stutters? Bothered by color problems more than brightness? Everyone nitpicks differently. Etc, etc.

Some of us get more motion blur eyestrain than spatial eyestrain (text not clear enough). There are solutions like www.mactype.net to make text look much better on OLEDs.

Being Chief Blur Buster, motion blur is a smidge higher priority, and some Blur Busters fans are fanatical like that too. 240Hz OLED has clearer motion than 360Hz LCD, even for mundane things like web browser scrolling!

The motion of LCD, including during fast text scrolling, can create bothersome blurs/ghosting that is seen by some people but not others.

Although I have had guest 4K displays, I'd rather work on a 45" 3840x1440 than a 27" 3840x2160 display. It'd be better if it was a 7680x2880 display, obviously; and use some kind of 2x mode or DLSS upconversion for keeping 240fps up in gaming.

However, the resolution doesn't bother me as much as the motion resolution limitations of other office displays. The text sizes I use don't demand 4K. After years of being forced to work with DELL 1080p displays in office cubicles until I went full-time with the Blur Busters hobby-turned-business, working with 3840x1440 isn't half bad at all. It's more desk-tidy than a two-monitor multimonitor setup too (e.g. 1080p esports monitor + 4K development monitor). I have several 27" 4K displays in the same office, but I've decided to keep the Xeneon Flex on my primary desk as the main jack-of-all-trades display, for improved convenience, to handle a wider variety of Blur Busting content. Especially with the way Windows 11 seems to let you organize windows in a very ultrawide-friendly manner (as a pseudo-multimonitor but on a single monitor).

YMMV, obviously, depending on your priorities and what your eyes feel the most pampered by.

I'd say 27" 1440p at typical viewing within the human viewing angle PPD wise is decent but not great compared to 4k at the same optimal viewing angles. Its definitely better than that 120hz 27" 1080p days heh. However, like I've said over and over in threads , for those people using a 42" 4k at ~ 24" view distance they are getting 1500p-like pixel sizes as well so they are in the same boat (Or larger 48" to 55" screens shoehorned onto a desk similarly with even worse PPD).

The 45" 240Hz OLED is an ultrawide so that can change how it's viewed, at least optimally. It is also curved at 800R which equals a 800mm or 31.5" radius. That radius taken as the focal point of the curve (as if it were a lens) would give you that ~ 32" as the focal view distance where every pixel on the screen surface is on axis pointed at you and all points on the screen surface are equidistant from your eyeballs. That would minimize any distortion due to the curve, any uniformity issues at the sides, and would fit within your human viewing angle. Conversely, sitting closer would push the sides of the monitor outside of your human viewing angle, would make distortion due to the curve worse, and would make uniformity issues on the sides worse as more of the screen is off-axis. I'd expect a lot of people to cram a 45" uw oled onto a desk similar to how they do the 42" C2 or 48" OLED though so I'd expect most people's PPD would be even worse than the 52 PPD shown below (and worse distortion, uniformity, amount of screen outside of their view).

The spec of the 45" OLED is 84ppi but PPD , pixels per degree (a measure of the perceived pixel density per any given distance) is a more meaningful value.

https://qasimk.io/screen-ppd/





OLEDs use non-standard subpixel layouts, so in my opinion they benefit from an even higher baseline than 60 PPD. I think it's the reason people are so vocal about the text fringing. So many people bought 42" LG OLEDs and shoehorned them onto near desk setups, which made the pixels look larger (more like 1500p than 4k). 1440p screens will never get over 60 PPD at the human 50 to 60 degree viewing angle, so there is a tradeoff there for sure. But like you said, pick your tradeoffs/poisons.

You wouldn't get to 64 PPD until sitting almost 10" farther than the focal point of the curve, at 41" away, on that 45" OLED, and if you did, the curve would start to become more of an alcove. That wouldn't line up right and wouldn't be worth setting up that way. Most curved screens have a fixed curve and focal point.

For reference, the 45" 1440p oled compared to 27" 1440p

27 inch 2560x1440 at 18" view distance = ~ 39 PPD
27 inch 2560x1440 at 20" view distance = ~ 42 PPD
27 inch 2560x1440 at 24" view distance = ~ 49 PPD
27 inch 2560x1440 at 26" view distance = ~ 52 PPD

45 inch 3440x1440 at 24" view distance = ~ 42 PPD (distance => distorted curve, off axis pixels, screen extents outside of viewing angle)
45 inch 3440x1440 at 32" view distance = ~ 52 PPD

So if you want to know what that screen's PPD looks like as if it were an RGB screen, at least at the optimal view distance of the curve's radius, just take a 27" 1440p screen and sit your eyeballs 26" away from it. That wouldn't be too bad. It wouldn't be as crisp and fine in detail as a 4k screen at optimal viewing distances, but it would look "ok". Graphics AA and text-ss start to show more aliasing on highly contrasted edges below around 60 PPD on RGB screens, even after those pixel-size masking compensations are applied (and note that the 2D desktop's graphics and imagery typically get no masking at all), so there is a tradeoff there. For non-standard pixel structures, higher PPD is even more critical on text in order to make any "tattered" or fringed edges tinier and less noticeable, where lower PPD's larger perceived pixel sizes can make them look obnoxious. The higher the PPD, the less obvious (the tinier) the artifacting and fringing you will get on most things (graphics aliasing, non-standard pixel structure fringing, standard RGB text fringing/edge resolution, the uncompensated 2D desktop's graphics and imagery, even DLSS/AI upscaling and frame insertion edge artifacts). Larger perceived pixel sizes, larger problems.
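For reference, the PPD figures in the table above can be reproduced with a simple flat-screen, on-axis approximation (total horizontal pixels divided by the horizontal viewing angle - it ignores the curve, so treat it as a ballpark):

```python
import math

def ppd(diag_in, res_w, res_h, view_dist_in):
    """Approximate pixels per degree for a flat screen viewed on-axis."""
    width_in = diag_in * res_w / math.hypot(res_w, res_h)   # physical width in inches
    h_fov_deg = 2 * math.degrees(math.atan((width_in / 2) / view_dist_in))
    return res_w / h_fov_deg

print(round(ppd(27, 2560, 1440, 24)))   # ~49 PPD
print(round(ppd(45, 3440, 1440, 24)))   # ~42 PPD
print(round(ppd(45, 3440, 1440, 32)))   # ~52 PPD
```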

I use a Stream Deck with buttons configured to teleport windows around: individually by whichever window is focused, by app button, or by global window-position profiles that move all of the windows at once. So outside of scrolling windows I'm not really moving much around other than the cursor if I'm editing images. The most noticeable blur would be image thumbnails in image libraries, media thumbnails on streaming services, or Emby/Plex-type things I suppose, but things blur more the faster you move them (pixels per second) and less if you move them somewhat slower. That's why blur in games is especially bad: you are typically a spinning top, moving the FoV (i.e. the entire screen) around constantly at speed, back and forth, as many games are all about action... mouse-looking, movement-keying, controller panning around quickly in combat or in moving vehicles. While you'd get somewhat less blur on a higher Hz screen on a 2D desktop during scrolling, it wouldn't be pristine, without some softening effect (+/- depending on how fast you were scrolling). The tradeoff would be larger pixel sizes, where the 2D desktop's graphics and imagery typically aren't compensated for at all (no text-ss or game-engine graphics AA). So those same thumbnails I talked about scrolling wouldn't look as good, image editing and viewing would have more aliased pixels, etc.
 
Good points -- everyone has different priorities with their displays.

The compromise between resolution and refresh rate is tough. And maximizing that compromise (for people greatly bothered by motion blur) is very tough: there is no 360Hz display at 3840x1440 yet, and this 3840x1440 screen beats a 360Hz LCD in motion clarity. Someday we will eventually reach 8K 1000Hz (early lab work is in progress), but that will not be for more than another decade.

One more thing:

The 45" 240Hz OLED is an ultrawide so that can change how it's viewed, at least optimally. It is also curved at 800R which equals a 800mm or 31.5" radius.
Don't forget that the Xeneon Flex is a bendable monitor. It can go all the way between a flat monitor and an 800R monitor -- or anywhere in between -- the curve is adjustable.

Every website thought it was a gimmick, until they actually tried it, and it actually helped the monitor's "jack of all trades" capability.

I am no big fan of curved monitors personally, but the optionalness of the curve wins it over pretty well too! It's not perfect by any means, but it enhances its utility as a single-monitor-fits-all. You can do things like run Microsoft Flight Simulator 2020 or play Cyberpunk 2077 at triple-digit DLSS-assisted frame rates in very immersive curved mode, and then the next hour you're back in Visual Studio programming away with multiple vertical panes on a very flat, non-curved 3840x1440 240Hz OLED.

Eventually I'll have some other screen on the desk, in revolving-door fashion, but it's been very hard to find something that doesn't feel like a "several line items downgrade" relative to the Xeneon Flex. 48" or 55" OLED TVs are currently the closest contenders, though they still have a Hz downgrade until 240Hz becomes a native feature of TVs of the future (~2025?).
 
As much as I dislike the downsides/tradeoffs of FALD, and as much as I love the per-pixel emissive nature and black depths of OLED, the 4k x2 Samsung ultrawide looks interesting. Unfortunately the OLED version is smaller and only 1440px high, so this FALD sounds like it might be the best step closer to a high-PPD gaming screen with a wall of bezel-free screen resolution outside of games, since the 55" 4k Ark missed the mark on a lot of things and flopped.

https://www.extremetech.com/gaming/...rst-57-inch-8k-ultrawide-oled-gaming-monitors

model G95NC

240Hz, quantum dot, VESA HDR 1000nit

1000R curve = 1000mm radius, i.e. the focal point of the curve = ~40" optimal view distance for all points on the screen surface to be equidistant from your eyeballs. That would result in the least distortion from the curve, as all of the pixels would be on-axis in relation to you. For racing and flight sims you might want to sit a little closer, pushing the ends of the screen into your periphery somewhat, or likewise if just using the desktop as if it were a bezel-free multi-monitor setup - though in both cases it would probably introduce some distortion/warping and off-axis pixels the closer you sat.

A 32:9 at 57" diagonal is around 55" wide by 15.4 inches tall. That's as tall as a 32" monitor. So sit back around (30 to) 40 inches away from a 32" (4k if possible) monitor and that would be what the middle of the 57" ultrawide looks like size-wise. Just imagine another 32" screen split in half and stitched to the left and right sides of the screen.

For comparison, a 48" CX running a 21:10 aspect at 3840x1600 rez has around a 17.4" tall viewable area, and a 42" 4k screen is probably around 16" tall viewable at that rez, so it is pretty close (a 42" 4k at 21:9 is ~15.5" tall, so 21:10 would be a little taller).



Tracking the same distance to compare at the focal point of the 1000R curve for reference ~ 40" on both:
70PPD on the 48" 4k
78 PPD on a 42" 4k
100PPD on a 32" 3840x2160 (two of them side by side essentially = 57" uw).

At 30" view distance:
55 PPD on 48" 4k
61 PPD on 42" 4k
77 PPD on 32" 4k (2 wide)

100 PPD is very good compared to most gaming screens today. At 30" view distance, though there would be some distortion and the edges of the screen would be outside of your human viewing angle, the 4k x2 screen would still be getting 77 PPD, which is still quite good - the same PPD you get on 4k screens viewed at the high end of their human viewing angle/distance range (a 50 degree viewing angle rather than a 60 degree one, where they get ~64 PPD).

It's too bad we can't get a 7680x2160 240Hz OLED version of this. 240hz 1440p is not for me. I'd actually prefer it even larger/taller personally in 32:10 and a slightly larger screen vs the 1000R, 1000mm radius. Maybe in the next few years OLED will catch up rez+Hz wise along with phosphorescent oleds etc.
 
https://twitter.com/TFTCentral/status/1643166176665649152?t=GD9rCSCaWQCTSLh2X8Zccg&s=19

Soooo many options. Looking forward to that 32 inch 4K 240Hz QD OLED!

Checking now.. direct link: https://tftcentral.co.uk/news/monitor-oled-panel-roadmap-updates-march-2023


LG

45″ ultrawide with high resolution 5120 x 2160 (ultrawide UHD) and 165Hz refresh rate – the refresh rate is lower than their current 240Hz 45″ panel, but the resolution is significantly increased and should be far more suitable for all non-gaming uses on a screen this large than the current 3440 x 1440 resolution option. This would also represent a step change in pixel density on any of their WOLED panels, increasing from the current approximate 105 – 110 PPI options (42″ 4K and 26.5″ 1440p) to around 123 PPI.
  • This panel is not expected to be released for quite some time although it is listed as being in production stage, as opposed to planning. It’s tentatively listed for Q1 2025 at the moment which seems an awfully long way off. Let’s hope it’s actually sooner.

That's a longgg way off unfortunately. Still smaller (including shorter, ~ 13" like a 27" 16:9) than the 57-inch diagonal FALD (~ 15.6" tall like a 32" 16:9), less desktop real-estate than a 7680-wide UW, and lower Hz too . . "sometime in 2025".



I'm looking for larger screen spaces but it's great that there are some single smaller 16:9 screens coming down the pipe that will fit better on a desk for people. I'm fine with mounting my screen separately so the size doesn't matter much largeness wise. More interesting upgrades to me would be 240hz+ and higher resolutions/larger screen resolutions, real-estate, aspect ratios etc.
 
I'm more curious about this part: "LG.Display plan to increase the brightness of these future panels, with target specs of 1300 nits peak brightness (HDR) and 275 nits (100% APL)."

Really now.....1300 nits? When the current MLA panels found in monitors are barely hitting 600 nits? Along with the huge increase in PPI....yeah I'm gonna have to doubt that.
 
I'm more curious about this part: "LG.Display plan to increase the brightness of these future panels, with target specs of 1300 nits peak brightness (HDR) and 275 nits (100% APL)."

Really now.....1300 nits? When the current MLA panels found in monitors are barely hitting 600 nits? Along with the huge increase in PPI....yeah I'm gonna have to doubt that.

We'll see what the actual numbers end up being, but they have made several advances in brightness output that are coming out in 2024 - 2025:

..phosphorescent deuterium OLEDs (PH-OLED) will be out in 2024 - 2025 with a lot more output for the same energy
.. EX tech using AI to monitor pixel wear, which allows for less circuitry per pixel = more output per unit of energy delivered to each pixel
.. micro lens array tech

I may switch to that 2x 4k FALD intermittently until 2025, depending on the reviews of it. I'd probably still have a 120Hz 4k OLED of some flavor above it though. I'd do an 8k above it in an over/under setup, but those are probably still going to be expensive even at 55" for a while yet, and the 2x 4k would be pricey itself. A flat 55" over the 2x 4k curve would probably align OK. Will see how it goes.
 
This absolutely makes sense when OLED monitors like the 27GR95QE can only display a 300-nit sun in "native HDR" with massive ABL. It can only display the 4th image, with lava capped at 650 nits. There is a 5th image with lava close to 8000 nits in the actual native HDR.
So the 27GR95QE only shows 8% accuracy, while FALD wide-gamut SDR can show the same 8% accuracy. When FALD shows HDR it can easily have 1800-nit lava with 22% accuracy.
HDR performance / brightness is the last thing anyone should bring up in a topic about OLED for PC use, because it has nothing to do with PC use specifically.

OLEDs are bad for PC use because of burn-in and subpixel structure (for most OLEDs at least).
Otherwise OLED would be amazing due to the lack of depth of its panels, making it much easier to focus your eyes on, and the lack of most off-angle issues. These characteristics make them amazing for eye comfort. Still not perfect, with some (albeit very minimal) flickering, but many people who already used/use OLEDs for desktop can confirm they are very comfortable to look at, especially from up close.

It would of course be much more comfortable with proper RGB subpixels. Also, automatic dimming, or even users manually dimming their displays to prevent burn-in, doesn't help.
 
HDR performance / brightness is the last thing anyone should bring up in a topic about OLED for PC use, because it has nothing to do with PC use specifically.

OLEDs are bad for PC use because of burn-in and subpixel structure (for most OLEDs at least).
Otherwise OLED would be amazing due to the lack of depth of its panels, making it much easier to focus your eyes on, and the lack of most off-angle issues. These characteristics make them amazing for eye comfort. Still not perfect, with some (albeit very minimal) flickering, but many people who already used/use OLEDs for desktop can confirm they are very comfortable to look at, especially from up close.

It would of course be much more comfortable with proper RGB subpixels. Also, automatic dimming, or even users manually dimming their displays to prevent burn-in, doesn't help.
/thread, pretty much.

This whole weird conversation has overly focused on HDR as if we're all watching movies or playing games constantly. Very little focus on use outside of that and even when there is, I'm told I must be staring at 1000+ nit brightness at all times or I suck, basically.
 
Yeah but that's great timing for next gen GPUs. The 4090 is more of a 4K 160Hz card. The 5090 will be a better match for such a display.
Depends entirely on what you are playing. I played through Like A Dragon Ishin recently and that ran at like 300-500 fps at native 4K on my 4090. Of course, it's not exactly an AAA graphics title...
Doom Eternal runs at something like 200 fps, 230 with DLSS.

The 4090 shifted the performance enough that we went from "4K 120 Hz is enough" to "you know what, 240 Hz would be nice headroom to have".
 
I've had my LG CX 48 since it pretty much came out and have used it for my work and gaming. I can only see a tiny bit of burn-in, when the entire screen is a solid gray color, from where the Rust UI is located - and only if I go looking for it.

Have loved this display and would buy again.
 
Uh huh. And didn't you say the same thing about the 4090? Lol....trust me you'll buy it just like everyone else. I know too many people who said they were "totally happy with their 3090" only to sell it and get a 4090 the moment it became more readily available.
Lmao you're right haha I was trying to stand my ground firmly with my 3080Ti but then ended up getting the 4090 also lol. But bro it's different this time I'm older and wiser I'm going to contain myself 😂
 
Lmao you're right haha I was trying to stand my ground firmly with my 3080Ti but then ended up getting the 4090 also lol. But bro it's different this time I'm older and wiser I'm going to contain myself 😂

:D 5090 is probably going to be an absolute beast with GDDR7 vram. Don't bother trying to contain yourself.

 
HDR performance / brightness is the last thing anyone should bring up in a topic about OLED for PC use, because it has nothing to do with PC use specifically.

OLEDs are bad for PC use because of burn-in and subpixel structure (for most OLEDs at least).
Otherwise OLED would be amazing due to the lack of depth of its panels, making it much easier to focus your eyes on, and the lack of most off-angle issues. These characteristics make them amazing for eye comfort. Still not perfect, with some (albeit very minimal) flickering, but many people who already used/use OLEDs for desktop can confirm they are very comfortable to look at, especially from up close.

It would of course be much more comfortable with proper RGB subpixels. Also, automatic dimming, or even users manually dimming their displays to prevent burn-in, doesn't help.
HDR is the most important thing, as it provides better images for PC use, which covers a variety of content, not just office usage. If you want an office image then buy an office monitor.

OLED doesn't hold brightness; that's why it has burn-in. It's the same thing. OLED is only easy on the eyes at low brightness, and with flicker.

So in the end you have a dim monitor that's not for PC use, just a bit fast - yet the 27GR95QE isn't even as fast or as smooth as the PG27AQN.
 
Depends entirely on what you are playing. I played through Like A Dragon Ishin recently and that ran at like 300-500 fps at native 4K on my 4090. Of course, it's not exactly an AAA graphics title...
Doom Eternal runs at something like 200 fps, 230 with DLSS.

The 4090 shifted the performance enough that we went from "4K 120 Hz is enough" to "you know what, 240 Hz would be nice headroom to have".

Well obviously it depends on what you're playing, but I'm saying overall the 4090 is more of a 144-160Hz card when averaged across a wider number of games than it is a 240Hz card. If the 5090 gives a ~50% performance uplift over the 4090 then that would make it a 240Hz card when looking at the average across many titles but yeah you will always have those games that are going to run below 100fps when maxed out and those that will run at 400+ fps.

 
I don't really care about how many fps you can get at max settings.

"max settings" are completely arbitrary

A developer could make max settings like an average console game, target 60 fps on the fastest hardware, or crank things up so insane that it isn't even possible to run until the next generation of hardware is out.
AAA games almost always target something like 60 fps for the highest end hardware so you'll in theory never be able to get 240 fps on a new AAA game.

Also, there is typically almost no noticeable difference between max and the next tier down, despite significant drops in performance.

What I care about is how many fps you can even get at any settings. I'm willing to lower settings to get higher fps. It's often a CPU constraint when you start getting into hundreds of fps.
 
Really now.....1300 nits? When the current MLA panels found in monitors are barely hitting 600 nits? Along with the huge increase in PPI....yeah I'm gonna have to doubt that.
In a lot of Journal of SID papers (Society for Information Display), tests are already done.

Both Samsung and LG panels are already capable of well over 1000 nits when intentionally overdriven to destruction using specialized internal firmwares, slightly higher max-voltage driver, and a beefier power supply.

Generally, these current OLED panels are already capable of that in "laboratory overdriven" fashion. This nit overdrive is not deployed to production panels, obviously, for liability, wear-and-tear and warranty reasons. Some companies use intentional nit-overdrive during accelerated wear-and-tear testing, and then calibrate down to a safe nit level for production, while also downsizing the monitor power supply.

Also, wear is often geometric -- e.g. double brightness often wears out far more than 2x as fast -- the geometricness can vary but it can be a fairly sharp hockey stick, and pushing that curve point further up the nit curve takes time -- it took many years to get this far. But OLED is changing faster than LCD did from 2008 to 2023. Even your run-of-the-mill Samsung 245BW TN LCD (one of the early 2ms TNs) from 2006 has an LCD look and response sometimes better than a 2023-era DELL 60Hz LCD, except for the move from 72% CCFL to 100% sRGB LED backlights. The direct-emissive *LED technologies (RGB OLED, WOLED, QD-OLED, PH-OLED, MicroLED, etc) are iterating faster, relatively speaking.
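As a loose illustration of that geometric relationship: OLED lifetime is commonly modelled with an acceleration law of the form lifetime ∝ 1/luminance^n, with n often quoted somewhere around 1.5-2 depending on the material. The exponent below is an assumed example value, not a measured panel spec:

```python
def relative_lifetime(brightness_factor, n=1.7):
    """Lifetime relative to baseline when brightness is scaled up, assuming lifetime ∝ 1/L**n."""
    return 1.0 / (brightness_factor ** n)

print(relative_lifetime(2.0))  # ~0.31 -> double the nits, roughly a third of the lifetime (for n = 1.7)
```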

The real question is how to fabricate it in a way that 2x brightness doesn't cause problems over the long term.

OLED burn-in has been a tough engineering challenge, sadly -- it's only now (~2022-2023 ish) that they're finally ready for standard desktop / officing usage scenarios.
 
It's often a CPU constraint when you start getting into hundreds of fps.

Definitely. I'm not even running games in the hundreds of fps and I'm already seeing plenty of CPU bottlenecks on my 4090 at 4K. Can't wait to get a 7800X3D tomorrow and finally let the 4090 fully stretch its legs. The 5800X3D and 7700X just didn't feel like worthwhile upgrades to me.
 
In a lot of Journal of SID papers (Society for Information Display), tests are already done.

Both Samsung and LG panels are already capable of well over 1000 nits when intentionally overdriven to destruction using specialized internal firmwares, slightly higher max-voltage driver, and a beefier power supply.

Generally, these current OLED panels are already capable of that in "laboratory overdriven" fashion. This nit overdrive is not deployed to production panels, obviously, for liability, wear-and-tear and warranty reasons. Some companies use intentional nit-overdrive during accelerated wear-and-tear testing, and then calibrate down to a safe nit level for production, while also downsizing the monitor power supply.

Also, wear is often geometric -- e.g. double brightness often wears out far more than 2x as fast -- the geometricness can vary but it can be a fairly sharp hockey stick, and pushing that curve point further up the nit curve takes time -- it took many years to get this far. But OLED is changing faster than LCD did from 2008 to 2023. Even your run-of-the-mill Samsung 245BW TN LCD (one of the early 2ms TNs) from 2006 has an LCD look and response sometimes better than a 2023-era DELL 60Hz LCD, except for the move from 72% CCFL to 100% sRGB LED backlights. The direct-emissive *LED technologies (RGB OLED, WOLED, QD-OLED, PH-OLED, MicroLED, etc) are iterating faster, relatively speaking.

The real question is how to fabricate it in a way that 2x brightness doesn't cause problems over the long term.

OLED burn-in has been a tough engineering challenge, sadly -- it's only now (~2022-2023 ish) that they're finally ready for standard desktop / officing usage scenarios.

Aye
 
Very much appreciating the insight from the blur busting master in this thread.

For one, I hadn't thought about just how much MicroLED displays will also likely burn in, especially early gens. If they go at the same rates as your average OLED I'm not sure what it's going to take for me to buy into self-emissive for desktop use.

I'd likely give a current gen OLED a chance for gaming/TV at this point but by the time I can warrant yet another display things will likely have moved on even further.

Going to be interesting to keep an eye on all this for sure.
 
Definitely. I'm not even running games in the hundreds of fps and I'm already seeing plenty of CPU bottlenecks on my 4090 at 4K. Can't wait to get a 7800X3D tomorrow and finally let the 4090 fully stretch it's legs. 5800X3D and 7700X just didn't feel like worthwhile upgrades to me.
Me and my 8700k feel sad these days, but I'm likely keeping this rig until Win12 SP1 comes out.
 
Also, wear is often geometric -- e.g. double brightness often wears out far more than 2x as fast -- the geometricness can vary but it can be a fairly sharp hockey stick, and pushing that curve point further up the nit curve takes time -- it took many years to get this far. But OLED is changing faster than LCD did from 2008 to 2023. Even your run-of-the-mill Samsung 245BW TN LCD (one of the early 2ms TNs) from 2006 has an LCD look and response sometimes better than a 2023-era DELL 60Hz LCD, except for the move from 72% CCFL to 100% sRGB LED backlights. The direct-emissive *LED technologies (RGB OLED, WOLED, QD-OLED, PH-OLED, MicroLED, etc) are iterating faster, relatively speaking.
True, cheap TN panels used terrible CCFL backlights (especially on laptops...), but that wasn't true for proper IPS panels, which used decent backlights with proper sRGB coverage right from the start. Some TN panels also used good CCFL lamps, but that was the exception, not the rule.

In fact, good CCFLs were often more accurate; even the best W-LEDs had blue over-coverage, which only proper sRGB emulation finally fixed.
Otherwise I agree that LCD panels - especially comparing the best old panels vs cheap new ones - didn't improve much.

OLED itself changed a lot but unfortunately not always for the better.
The first major change was the move from RGB-OLED to blue OLED + conversion to white + color filters, or without conversion + quantum dots. This didn't so much fix burn-in as make it less noticeable, since it's nearly impossible to avoid color shifting over time when using different OLED types for R, G and B, and producing only one type of OLED subpixel is much cheaper. Funnily enough, it's blue OLED that has always had the most burn-in...

Other than that, most of the issues still persist, like OLED having dirty screen effect and having issues when switching from pure black to dark shades. In fact, LG made a minor issue much worse in WOLED by applying ridiculous amounts of overdrive, turning barely visible dark streaks into flashes of relatively bright light. It is the single biggest disappointment I have with their OLED tech, and one which can really screw up darker content to the point of being unwatchable.

(overshoot_0-150_60hz.png - pixel response overshoot measurement for a 0→150 transition at 60Hz)

^^^ - easily fixable by not trying to achieve ~0ms pixel response times.
If reviewers did their measurements right and counted overshoot artifacts into the total pixel response times, we probably wouldn't get this nonsense 😪
 