Is OLED worth it?

EnvymeRT

Limp Gawd
Joined
Mar 18, 2013
Messages
384
Hello all,

Currently I have an LG 32in 4K 144Hz monitor. Colors are very accurate and speed is great. However, I was reading reviews of the new OLEDs coming out. Would I really notice much besides deeper black levels? Does anyone have experience with both? I'd love to hear from someone who has both, or who did a side-by-side comparison, before I drop another $1000 on a monitor.
 
I don't own one (yet), but motion clarity is a big difference from what I've read and seen online. I'm planning on buying one in around a month.
 
Really depends on your situation. Is the LG mini-led FALD? Are you in a darker room? Games primarily or office stuff?

With the speed, blacks, and per-pixel light control, it could be a spectacular upgrade, especially if you can control the ambient light. On the other hand, if you're in a bright room and crave screen brightness, maybe not so much.
 
Really depends on your situation. Is the LG mini-led FALD? Are you in a darker room? Games primarily or office stuff?

With the speed, blacks, and per-pixel light control, it could be a spectacular upgrade, especially if you can control the ambient light. On the other hand, if you're in a bright room and crave screen brightness, maybe not so much.
It’s actually a 15x12 ft room in my basement that I finished and made into a gaming room. Lighting is either fluorescent or completely dark.
 
For gaming in a dim or dark room? OLEDs are literally the best you can get, hands down. Nothing beats them. Motion clarity, responsiveness, contrast, sharpness: literally second to none.

For watching movies? Browsing the web? Doing work? In a bright room or sunlit area? Well, they aren't optimal.
 
Yes, if you're talking about gaming. Once you get used to the infinite contrast and other benefits OLED offers, you will never look at backlit displays the same way again. IPS just looks horrid to me now.
 
For now, there is no way I'll ever go back to LCD for my primary desktop display. OLED is just too good.
 
The only people who ask this question are those without an OLED. There are a lot of analogies out there; it just comes down to what you value most for your viewing pleasure. I recommend you go check out a display unit if there is one near you, or maybe a friend that has one.
 
I'll answer in a couple of parts:

For desktop usage? No. If I'm just scrolling text, doing Excel, that kind of crap, LCD is just as good or better. You don't actually want high contrast in that situation, as it can be more fatiguing on the eyes (some LCDs offer a "paper" mode that is deliberately low contrast); LCDs also get brighter if you need to overpower lights, have a standard subpixel structure, and so on. So for an office, I wouldn't get an OLED. Just a waste of money.

For gaming compared to a standard LCD? Absolutely. The greater contrast ratio is real nice, as are the wider viewing angles. The increased motion clarity is good too, at least if you get high frame rates (at low frame rates it can look a little jerky). However, the real selling point for me is HDR. Not every game supports HDR, and not all that do support it do it well, but man does it add to a game when it does. The new remakes of Resident Evil are just AMAZING in HDR. It takes the atmosphere up a huge amount. So if you are coming from a non-HDR monitor to an OLED and then get to do some HDR gaming on it, I think you'll find it stunning.

For gaming compared to a good MiniLED LCD? Hmmmm, this is hard. I don't have an OLED monitor; I have an OLED TV (Samsung S95B) and a MiniLED monitor (ASUS PG32UQX), and... it's a tradeoff. The OLED has pinpoint brightness ability, (much) better motion clarity, and better viewing angles. However, the MiniLED has a massive brightness advantage, both for small and large areas, and more brightness means more impactful HDR. Going to OLED is in some ways a step up, and in some ways a step down.


So it depends on what you have and what your usage is. The one other consideration with OLED in terms of usage is burn-in. If you are mostly/exclusively gaming, then burn-in should be essentially a non-issue. If you do heavy desktop work, particularly if you have a habit of leaving everything in the same position, leaving your computer with the screen on, etc., then burn-in might be an issue. That is part of why I personally got my MiniLED monitor: I was worried about burn-in because I'm known to do things like that; when I work in Nuendo there can be a lot of static elements on the screen for long periods of time.

But there's a reason OLED is popular for TVs, and now for monitors: It is a good technology and is likely the future of display tech. Good HDR really is an awesome experience, and more and more games are getting it.
 
Hello all,

Currently I have an LG 32in 4K 144Hz monitor. Colors are very accurate and speed is great. However, I was reading reviews of the new OLEDs coming out. Would I really notice much besides deeper black levels? Does anyone have experience with both? I'd love to hear from someone who has both, or who did a side-by-side comparison, before I drop another $1000 on a monitor.

Yes. But for you, with what you have now, I wouldn't bother...
 
One difference AFAIK is that most MiniLEDs have much worse input lag / latency in HDR. Not really sure why though.
 
One difference AFAIK is that most MiniLEDs have much worse input lag / latency in HDR. Not really sure why though.
They have to calculate the correct balance of backlight vs. panel, whereas OLED just drives pixels to a given level. Now, I don't know how much it matters in reality; I notice little difference, but then I may not be particularly lag sensitive. It isn't super high either way on my display, but I don't have a good way to measure it. It probably varies display to display as well. But the reason it is there is having to figure out the driving and timing of two separate things. Particularly since you can't do it on a pixel-by-pixel, or even line-by-line, basis, you have to have more of the image before you can figure out "for this group of pixels, I need to set the backlight to level X, and then from that I calculate the driving level of each pixel."
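That two-step dependency can be sketched in a few lines. This is a toy model of my own (not any real scaler's algorithm): for each backlight zone, pick a backlight level from the zone's brightest pixel, then recompute every pixel's LCD drive value so that backlight × drive reproduces the target, whereas the "OLED" path just passes the per-pixel targets straight through.

```python
# Toy illustration of the extra per-zone step a FALD LCD needs and a
# per-pixel-emissive OLED skips. Purely a sketch, not a real controller.

def fald_zone_pass(zone_pixels):
    """zone_pixels: target luminances (0.0-1.0) for one backlight zone."""
    backlight = max(zone_pixels)  # zone must be bright enough for its brightest pixel
    if backlight == 0:
        return 0.0, [0.0] * len(zone_pixels)
    # Per-pixel LCD transmittance so that backlight * drive == target:
    drive = [p / backlight for p in zone_pixels]
    return backlight, drive

def oled_pass(pixels):
    """OLED: each pixel is simply driven to its target level directly."""
    return list(pixels)

zone = [0.1, 0.8, 0.05, 0.3]
bl, drive = fald_zone_pass(zone)
# The reconstruction matches the targets, but only after the zone-level step:
assert all(abs(bl * d - p) < 1e-9 for d, p in zip(drive, zone))
```

The point of the sketch is that `fald_zone_pass` can't emit anything until it has seen the whole zone (to take the `max`), which is one plausible source of the extra buffering and latency described above.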
 
They have to calculate the correct balance of backlight vs. panel, whereas OLED just drives pixels to a given level. Now, I don't know how much it matters in reality; I notice little difference, but then I may not be particularly lag sensitive. It isn't super high either way on my display, but I don't have a good way to measure it. It probably varies display to display as well. But the reason it is there is having to figure out the driving and timing of two separate things. Particularly since you can't do it on a pixel-by-pixel, or even line-by-line, basis, you have to have more of the image before you can figure out "for this group of pixels, I need to set the backlight to level X, and then from that I calculate the driving level of each pixel."
I can get that the FALD probably needs calculations, but not why FALD in HDR would require more than FALD in SDR. It seems more likely that it is FALD rather than HDR that causes the lag, and perhaps that is the case. I recall that at least in the Rtings review of the Neo G9 57", they mentioned that it was almost as quick in HDR as in SDR, unlike most other FALD monitors.
 
Yes, if you're talking about gaming. Once you get used to the infinite contrast and other benefits OLED offers, you will never look at backlit displays the same way again. IPS just looks horrid to me now.
I agree with this. I have a tough time going back to anything non OLED now.
 
I think once you go OLED, you never come back
Unless Mini-LED at the performance level of OLED gets significantly cheaper, yeah. Right now, models that pack in enough local dimming zones are just too expensive and few. Meanwhile, OLED is booming in the PC space this year and going to keep growing and getting more affordable.
 
Yes, even if you have a multipurpose machine. I purchased my OLED for gaming and work. The compromise for work, which is made easier by my secondary monitor, is well worth it for the spectacular color and response times that I enjoy when gaming, not to mention video playback. I am hooked for life.
 
In my opinion, emissive displays will always beat backlit displays. No, OLED doesn't get as bright as FALD, but it doesn't have to cheat to get its blacks either. I think the only real contender to OLED is dual-layer LCD, where a second monochrome layer modulates the light to control blacks, but these displays are too slow to be adequate for gaming... for now.
 
I must admit that I was concerned that I would have buyer's remorse, but no sir. This thing has exceeded my expectations. OLED if you are a gamer, hands down. The response times, blacks, and vivid colors are breathtaking. Your mileage may vary, as I have the perfect environment for it.
 
I must admit that I was concerned that I would have buyer's remorse, but no sir. This thing has exceeded my expectations. OLED if you are a gamer, hands down. The response times, blacks, and vivid colors are breathtaking. Your mileage may vary, as I have the perfect environment for it.
What one do you have?
 
I just upgraded to the Asus PG32UCDM. It’s the best monitor I have ever used. The whole brightness stuff people talk about is not something I can see; for me this is plenty bright, and HDR is phenomenal on it. The bright room vs. dark room issue is also not a problem I've faced. I have been using OLEDs in bright rooms for over 4 years now. This monitor has no problem working fine in a bright room. Just don’t be stupid and put a light in front of the screen.

I am upgrading from a CX 55 inch, which I used sometimes, and a G7 32" QHD 240 Hz that I ran with DLDSR for the past 1.5-2 years.

The responsiveness of OLED, the colors, the HDR, and the per-pixel lighting cannot be matched.

Also, if that LG is an IPS monitor, then I don’t think you know what you are missing. I have spent the past 4 years on VA, and that is leagues ahead of the IPS monitors I used to have.
 
Seriously doubt that most normal users would consider us hanging in here to also be normal users :)
Fair point, but I'm talking more about people who seem to only be able to evaluate displays based on macro shots of pixel layouts vs. sitting in front of one at a real desk.
 
I've had everything under the sun, but currently have both OLED and LCD. I use flight simulators, which have a lot of bright areas on the screen (~half the screen being sky), so an OLED just cannot compete with my FALD LCD there. On my desktop, I have an AW32 OLED, which is king for motion clarity and gaming at night. Although I have windows behind it, I like a bright and airy environment during the day and like to look out the windows to help with eye fatigue, and the OLED is definitely lackluster playing during the day. I don't play as many fast-paced FPSes as I used to, so I'm considering swapping out the OLED for a brighter HDR FALD LCD. I just don't play a lot of dark games, and I do more gaming during the day than at night.
 
Not sure how much brightness is required but I am doing just fine with the PG32UCDM. For my use case it is best I can get. I just hope ASUS puts out a firmware to reduce VRR flicker in menus. It is a bit absurd that this still exists on Samsung panels.
 
Hello all,

Currently I have an LG 32in 4K 144Hz monitor. Colors are very accurate and speed is great. However, I was reading reviews of the new OLEDs coming out. Would I really notice much besides deeper black levels? Does anyone have experience with both? I'd love to hear from someone who has both, or who did a side-by-side comparison, before I drop another $1000 on a monitor.


You might want to check this thread for some balanced (as well as imbalanced/biased) insights, and general nit-picking info on the limitations of each tech.


Hardforum/Hardware/Displays: Would you ever go back to LCD after experiencing OLED?


 
They have to calculate the correct balance of backlight vs. panel, whereas OLED just drives pixels to a given level. Now, I don't know how much it matters in reality; I notice little difference, but then I may not be particularly lag sensitive. It isn't super high either way on my display, but I don't have a good way to measure it. It probably varies display to display as well. But the reason it is there is having to figure out the driving and timing of two separate things. Particularly since you can't do it on a pixel-by-pixel, or even line-by-line, basis, you have to have more of the image before you can figure out "for this group of pixels, I need to set the backlight to level X, and then from that I calculate the driving level of each pixel."

Like you said, it probably varies from display to display. That might get better with newer AI generations taking over those duties. DLSS and frame gen are also doing calculations on buffered frames (but that is with powerful hardware).



Online gaming also uses buffered frames and speculative prediction (around 2 frames on the server and 3 frames on the client in the case of Valorant), has queuing and tick rates in its simulation of "real time", and it delivers biased results based on the flavor of the netcode decisions made by the developer.

The highest-tick servers are 128 tick: 128Hz, 7.8ms per tick. But:

"Frames of movement data are buffered at tick-granularity. Moves may arrive mid-frame and need to wait up to a full tick to be queued or processed."

"Processed moves may take an additional frame to render on the client."


If you are running higher fpsHz minimums than the tick rate of the server, e.g. well over 128fpsHz on Valorant (I'm guessing probably something like 180 or 200fpsHz average to be safe), you will lower how much out of sync you are from the server, but there is still a **minimum of 72ms of "peeker's advantage" on 128-tick servers** according to the Valorant networking article referenced in the quote below. The size of the rubberband/gap, and thus the "peeker's advantage", for 60fpsHz players on Valorant's 128-tick servers is ~100 ms. Lower-tick servers, like 60 tick, would be even worse. Hard to believe some servers are still running much lower ticks, in the 20s. On top of that, some games' netcode might not be as optimized.
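For reference, the raw arithmetic behind the tick numbers is trivial. Here is a quick sketch; the "one tick plus one frame" worst case is my own rough upper bound on added input wait under the buffering quoted above, not the article's full peeker's-advantage model:

```python
# Tick and frame intervals behind the numbers above (rough sketch).
tick_rate = 128                 # Valorant server ticks per second
tick_ms = 1000 / tick_rate      # 7.8125 ms per tick

def worst_case_added_wait_ms(client_fps: float) -> float:
    """Hypothetical worst case: a move arrives just after a tick boundary
    (so it waits ~1 full tick to be processed), plus up to one more client
    frame to render the result. Not a full peeker's-advantage model."""
    return tick_ms + 1000 / client_fps

print(round(tick_ms, 1))                        # 7.8
print(round(worst_case_added_wait_ms(60), 1))   # 24.5
print(round(worst_case_added_wait_ms(240), 1))  # 12.0
```

Raising your frame rate shrinks only the frame term; the tick term is fixed by the server, which is consistent with the point that a faster local screen can't erase the 72ms floor.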



There is a lot more to it, but your local input lag has to go through a lot more machinery. What you see is not what you get in online gaming. So while low input lag is nice, it's not a 1:1 thing in how it's processed; as far as the server is concerned, you may not even be seeing what you think you are seeing at any given time to act on, as opposed to local gaming and LAN gaming/competition.

=====================================



The lowest "rubberband" gap you can get on a 128-tick Valorant server is to exceed that 128 tick with your local fpsHz, getting the 72ms peek/rubberband, as compared to someone at 60fpsHz on that same 128-tick server getting 100ms. Your frame rate minimum would have to exceed the 128Hz of the server's ticks. Having a 1000fpsHz-capable screen isn't going to change that 72ms.

The movement data (for Valorant in the excerpts below, since it's a 128-tick, optimized online gaming server system) is buffered at tick granularity, not at your client-side frame rate. Each of the 128 ticks is 7.8ms.




I'm sure that, along with myself, a lot of people have seen this since it's been out for several years, but this [2020 Valorant netcode article](https://technology.riotgames.com/news/peeking-valorants-netcode) is a good read, with good info on how things work overall in online games (as opposed to local, LAN, vs. bots, etc.) and on the benefits of 128-tick servers. It has some good explanatory images as well.

A few interesting things; too many to mention, really. I put the stuff pasted from the article in quotes; the rest is my take referencing the info in the link.




If you recall, a 120fpsHz screen gets motion definition at a rate of 8.3ms per frame, 240fpsHz at 4.17ms per frame, and a 480Hz screen at 480fpsHz gets motion definition at a rate of about 2ms per frame (locally).
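Those per-frame intervals are just the reciprocal of the refresh rate, e.g.:

```python
# Motion-definition interval per frame at a given fpsHz (local display only).
def frame_interval_ms(hz: float) -> float:
    return 1000 / hz

for hz in (120, 240, 480):
    print(hz, round(frame_interval_ms(hz), 2), "ms per frame")
# 120 -> 8.33 ms, 240 -> 4.17 ms, 480 -> 2.08 ms
```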


The server is buffering 2 frames, the client 3 frames. According to the quote below, I think the movement data is buffered at tick granularity (128 tick for Valorant), not at your client-side frame rate. Each of the 128 ticks is 7.8ms.


You always see yourself ahead of where you actually are on the server, and you always see your opponent behind where they actually are on the server. The server goes back in time, using the buffered-frames system, in an attempt to grant successful shot timing and other actions like player movement compared to (what your machine simulates to the server based on) what you saw locally. However, different games' server code uses its own biased design choices to resolve which player/action is successful, usually in regard to whose ping is higher or lower; it's an interpolated/simulated result. The client also uses predicted frames in the online gaming system.

>"Smooth, predictable movement is essential for players to be able to find and track enemies in combat. The server is the authority on how everyone moves, but we can’t just send your inputs to the server and wait around for it to tell you where you ended up. Instead, your client is always locally predicting the results of your inputs and showing you the likely outcome. ".

If you are running higher fpsHz minimums than the tick rate of the server, e.g. well over 128fpsHz on Valorant (I'm guessing probably something like 180 or 200fpsHz average to be safe), you will lower how much out of sync you are from the server, but there is still a **minimum of 72ms of "peeker's advantage" on 128-tick servers** according to that article. The size of the rubberband/gap, and thus the "peeker's advantage", for 60fpsHz players on Valorant's 128-tick servers is ~100 ms. Lower-tick servers, like 60 tick, would be even worse. Hard to believe some servers are still running much lower ticks, in the 20s. On top of that, some games' netcode might not be as optimized.


>At the highest tier of competitive play, the differences between player reaction times become razor thin. The difference between winning and losing a gunfight in our experiments often came down to 20-50ms. Even though the playtests were blind (players weren’t told what conditions each round was running on), skilled players were able to accurately identify small changes (~10ms) to peekers advantage. **Differences of 20ms felt very impactful** to these players. For evenly matched players, a delta of **10ms of peekers advantage made the difference between a 90% winrate for the player holding an angle with an Operator and a 90% winrate for their opponent peeking with a rifle.**

Even when exceeding 128 tick, with 128fpsHz as your local frame rate minimum, you can run into delays in that chain which result in delivering fewer than 128 ticks per second to the server, or receiving fewer than 128 ticks back, depending on timing:

>"Frames of movement data are buffered at tick-granularity. Moves may arrive mid-frame and need to wait up to a full tick to be queued or processed."
>
>"Processed moves may take an additional frame to render on the client."

When running a much higher frame rate than the tick rate of the server, the delivered server state "might be" blended across your more numerous available frames. I'm assuming "blending" is some kind of prediction/interpolation to spread a single delivered server state into the multiple frames awaiting it on a higher-fpsHz client, rather than just repeating the same state across all of the frames. I'm not sure how this works in reverse, with your multiple local frames mapping to the outgoing tick. Normally I'd think it would take the last frame (the most current from your perspective) that your local machine simulated before the end of that tick (along with the few continually buffered ticks), but according to the info below it is also simulating into the future based on all of that, so I'm not certain if it's just the last frame or some simulated frame based on all of them. It could be compressing multiple ticks into a single simulated delivered state for a 60fpsHz user, and compressing multiple local frames into a single outgoing tick for a 240fpsHz user.


"a client running at 60 FPS will simulate multiple movement ticks per frame, while a higher framerate client might blend a single simulation update across multiple frames. We simulate slightly into the future (e.g. move #408) as needed to make sure we know exactly where the client will be at frame boundaries. We then linearly interpolate the state of the world within a move update to draw things exactly where they should be when the frame is rendered."
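The "linearly interpolate the state of the world" part of that quote can be sketched like this (my own minimal illustration, not Riot's code): simulation states advance at a fixed 128 Hz, and each rendered frame looks up the two states that bracket its render time and lerps between them.

```python
# Minimal sketch of fixed-timestep simulation + render-time interpolation.
# Assumes one scalar "position" per simulation tick; tick i is at time i*TICK_DT.

TICK_RATE = 128
TICK_DT = 1.0 / TICK_RATE

def interpolated_position(states, render_time):
    """Lerp between the two simulation states bracketing render_time."""
    i = int(render_time / TICK_DT)              # tick just before this frame
    i = min(i, len(states) - 2)                 # clamp so i+1 is valid
    t = (render_time - i * TICK_DT) / TICK_DT   # 0..1 fraction into the tick
    a, b = states[i], states[i + 1]
    return a + (b - a) * t

# Player moving 1 unit per tick; a frame rendered halfway through tick 2
# lands exactly halfway between the bracketing states:
states = [0.0, 1.0, 2.0, 3.0]
pos = interpolated_position(states, 2.5 * TICK_DT)
assert abs(pos - 2.5) < 1e-9
```

This is why render frame rate and simulation tick rate can be decoupled: a 240fpsHz client just samples the same 128 Hz state timeline at more points.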

In the image below, perhaps a bit confusingly, the things labeled "Tick N" are your local machine's render rate (60fpsHz vs. 240fpsHz). The "move #" entries are a five-state section (#406 to #410) of the actual 128-per-second game server world states, i.e. server ticks.

"we decouple our simulation updates from game ticks (your render framerate). Regardless of render framerate, clients and servers always update movement, physics, and other related systems with a fixed timestep: exactly 128 times per second."


[Image: netcode3_final.png, from the Valorant netcode article]



[Image: netcode4.png, from the Valorant netcode article]

"As the server advances its own simulation, it executes the queued moves that each client sent, and transmits the resulting simulation state back to all clients. Sometimes, however, as shown for move #401 for client B above, the server hasn’t received a client update when it’s needed. In these cases, the server will predict what the client would have done. Usually, we guess that they continued to hold down whatever keys were being held in the last received update, since only a few milliseconds have passed. But hey, sometimes we get it wrong. Occasional client/server disagreements are unavoidable."
 
I just got my OLED ultrawide for work. I don't get all the talk about deep blacks for games. What games are you guys playing that have considerable amounts of color #000000?
Don't get me wrong, I like mine a lot, but the black really pops using dark mode, where the screen is 85% #000000 with white font popping out of it.

That said, the fonts do have issues. I figured it wouldn't be as obvious in actual use as reviews made it out to be; it is, but it's not the end of the world.

My MacBook Pro's mini-LED IPS is really impressive too and has super deep blacks. Maybe I'll take a picture of the two side by side for comparison.
 
OLED is "ok". At lower resolutions there are issues, but with most things being 4K, I guess the whole "pixel arrangement" issue isn't a huge deal. Of course, there aren't any "tiny" OLED monitors currently (and that might be related).

In short, it depends. I'm happy with my IPS and can't imagine giving up its features just for OLED. However, my 55" TV is OLED. But we'd probably be OK even if it weren't, as long as the viewing angles are there.
 
I just got my OLED ultrawide for work. I don't get all the talk about deep blacks for games. What games are you guys playing that have considerable amounts of color #000000?
Don't get me wrong, I like mine a lot, but the black really pops using dark mode, where the screen is 85% #000000 with white font popping out of it.

That said, the fonts do have issues. I figured it wouldn't be as obvious in actual use as reviews made it out to be; it is, but it's not the end of the world.

My MacBook Pro's mini-LED IPS is really impressive too and has super deep blacks. Maybe I'll take a picture of the two side by side for comparison.
Please do when you can or get the chance. I too have a MacBook Pro 14" so would like to see how they compare.
 
I would burn in an OLED in 3-6 months. Not going to be forced to watch content in the dark. Not going to hide my windows and taskbars and static images.
Not going to turn down the brightness. Not going to worry about anything. I will live whatever is left of my life carefree, lol. So for me it's easy: QN90B Mini-LED until the QN900D gets cheaper, or a smaller alternative is released, possibly in next year's lineup. Let's see how the AI upscaling chip does in the new Samsung Mini-LEDs.
If OLED could have the brightness power of my Mini-LED and not suffer from burn-in? Yeah, I would have one for sure. But the aggressive automatic brightness limiter, the dimmer brightness, burn-in occurring within months, and the way it falls flat with normal lighting in the room are all huge deal breakers for me. Maybe they will solve all of these issues, but I keep reading about how OLED is by nature fragile and, no matter how they market it, will burn in eventually even with heatsinks.
 
I just got my OLED ultrawide for work. I don't get all the talk about deep blacks for games. What games are you guys playing that have considerable amounts of color #000000?
Don't get me wrong, I like mine a lot, but the black really pops using dark mode, where the screen is 85% #000000 with white font popping out of it.

That said, the fonts do have issues. I figured it wouldn't be as obvious in actual use as reviews made it out to be; it is, but it's not the end of the world.

My MacBook Pro's mini-LED IPS is really impressive too and has super deep blacks. Maybe I'll take a picture of the two side by side for comparison.


HDR OLED, especially glossy OLED gaming TVs, can go down to an "infinite black" floor at the pixel level, right next to brightly colored pixels in HDR ranges, in mixed-contrast areas and even in texture detail. FALD reduces mixed, contrasted areas down to 3000:1 to 5000:1, while the larger, more uniform fields in those same scenes are capable of being far more richly/brightly saturated or dark, so it's non-uniform. FALD essentially puts a glow or dim spotlight (or an outline of multiple ones) on the perimeters of light or dark objects or areas, altering those "shared" or "blended" areas even when not noticeably, outright "blooming" from the spread.

A modern MacBook can have a very large array of very small LEDs on a smaller 16" screen, so its FALD lighting is better than most, but it isn't typical of most FALD gaming displays available. I don't think the Macs are any good for gaming either, due to horrible response times. I guess they still have "only" 2,500 FALD zones even with that number of LEDs (~4 LEDs per zone), but they perform their FALD very well with that zone density on a tiny 16" screen, and the LED count is probably helping a little. If you extrapolated that LED density to a larger screen, it would be extremely high.
If someone extrapolated that 10,000-mini-LED density of a 16" screen up to a 55" screen, it would be something like 100,000 LEDs, and even if you cut that to 4 mini-LEDs per zone, it would still be something like 25,000 zones on a 55" screen, lol. We are lucky to get 2,000 zones on gaming FALDs, if that. So between that and its FALD performance vs. very bad response time tradeoff, it's probably not a good apples-to-apples comparison to use it as a measuring stick against the gaming FALDs we have available. Even though they perform well, the 16" MacBooks are still FALD, so they won't be uniform like an OLED or a future microLED would be.
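The back-of-the-envelope extrapolation above works out as follows, assuming both panels share the same aspect ratio so that screen area scales with the square of the diagonal (the 10,000-LED / 2,500-zone figures are the MacBook numbers quoted above):

```python
# Rough extrapolation of mini-LED backlight density from a 16" panel to a
# 55" panel, assuming the same aspect ratio (area ~ diagonal squared).

def scale_count(count: int, from_diag: float, to_diag: float) -> int:
    """Scale an LED/zone count by the ratio of screen areas."""
    return round(count * (to_diag / from_diag) ** 2)

leds_16in = 10_000   # MacBook mini-LED count quoted above
zones_16in = 2_500   # MacBook FALD zone count quoted above

print(scale_count(leds_16in, 16, 55))   # ~118,000 LEDs on a 55" screen
print(scale_count(zones_16in, 16, 55))  # ~29,500 zones on a 55" screen
```

The real numbers would differ a bit since the laptop panel is 16:10 and a TV is 16:9, but it confirms the rough "100,000 LEDs / 25,000+ zones" ballpark.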




OLED's contrast and black depth is "ultra" or "infinite", meaning it can go down through black depths to "off" at the pixel level, right next to each and every other pixel of any other value the screen is capable of in the scene. That looks great in a lot of games, especially well-done HDR games or auto/RTX HDR gaming. So I don't share your opinion that OLED's pixel-level, side-by-side dark contrast is unimpressive in games (but your opinion might be spoiled somewhat by that non-gaming 16" screen's unique backlight density and its FALD performance vs. response time tradeoff).

OLED also has some bad limitations and faults, though. While the side-by-side pixel lighting looks great for smaller HDR highlights, and though more modern OLED gaming TVs can go brighter than previous gens, OLED still can't go very bright in the mids and highs on larger percentages of the screen compared to HDR FALD gaming TVs, and what OLED can hit on decent percentages of the screen it can't always hold for very long (ABL). There is also the fact that OLED best-usage practices are still a good idea vs. burn-in concerns, so they aren't completely unfettered (unless you don't really care about burning through your wear-evening buffer toward eventual burn-in faster than if you were more conscientious). They can also get some VRR black flicker depending on the game, and perhaps on how low and/or irregularly the frame rate drops from the 120fpsHz gamma set point, which bothers some people.

Matte, abraded screens in general can also lift black depths in ambient lighting, which is a problem. Most FALDs have matte, abraded outer layers on the screen, while you can still get some OLED displays that are glossy. IMO, matte also means not getting "OLED blacks" on OLEDs that have it, at least in ambient lighting. Apparently neither does QD-OLED in ambient lighting, since it supposedly reflects ambient light off the QD layer, lifting the blacks.
 
I would burn in an OLED in 3-6 months. Not going to be forced to watch content in the dark. Not going to hide my windows and taskbars and static images.
Not going to turn down the brightness. Not going to worry about anything. I will live whatever is left of my life carefree, lol. So for me it's easy: QN90B Mini-LED until the QN900D gets cheaper, or a smaller alternative is released, possibly in next year's lineup. Let's see how the AI upscaling chip does in the new Samsung Mini-LEDs.
If OLED could have the brightness power of my Mini-LED and not suffer from burn-in? Yeah, I would have one for sure. But the aggressive automatic brightness limiter, the dimmer brightness, burn-in occurring within months, and the way it falls flat with normal lighting in the room are all huge deal breakers for me. Maybe they will solve all of these issues, but I keep reading about how OLED is by nature fragile and, no matter how they market it, will burn in eventually even with heatsinks.
Meanwhile I did all the mitigations you mention and my LG CX 48" is still going strong without a single issue after 3.5 years. I only stopped using it as a desktop display because of its size. I went back to dual 28" 4K 144 Hz LCDs for work, and game on the CX in my living room with my ITX size PC or consoles.

I'm sold on OLED as an overall best compromise where the burn in potential and brightness limitations are far less of an issue in practice than LCD pixel response times or mini-LED blooming.

I just wish OLED TVs got into the 4K 240 Hz game already so I could find something truly better to replace the CX. "It's brighter" isn't enough to sell me on the latest ones.
 
Also worth noting that FALD gaming TVs typically drop some of their picture enhancements in game mode, and some of the Samsungs spread their FALD "blending" across a wider number of zones, with slower transitions, when in game mode. Some very bright FALDs that don't have a heatsink and active cooling fans, just more boxy, vented housings (like the UCG/UCX has), also suffer aggressive ABL even though they aren't OLEDs (though triggered at higher brightness levels). So the FALD performance and picture quality you get for games can be different from what you can get for media (and the very slow response times on the 16" FALD MacBooks may have something to do with how their quality zone management is unable to perform that well closer to "real time"; just guessing). OLED outside of media modes, in game mode, can be a little different too, though.

I think the dark-mode black page with white text/apps example, on most FALDs, due to the way the tech works, would have some glow, and if not outright glowing/haloing, then blended, lifted blacks and/or a similarly dimmed brighter-area gradient in an area around the white text, like a mouse cursor can have. On an OLED it would be dark to the pixel's razor edge right up against the pixels of other values, even within brighter things (even a stylized mouse cursor that is an outline with a "hollow center", for example). The overall massive field areas of a black page would be dark, sure, but FALDs drop the contrast drastically from their peak around where they spotlight and dim content, even if it's blended across zones like a localized small gradient: bright or dark blobs behind the scene elements. That has more impact on real-world media and games full of dodge-and-burn detail throughout everything, plus light sources and darker areas in scenes overall.

Still, looking at a modern FALD TV, what they can do with a limited number of zones looks pretty good. Both FALD and OLED do a good job with what they can squeeze out using tricks/hacks/workarounds; otherwise I doubt people would be so happy with either of them, and they wouldn't get decent or good reviews. It's the best they can do, with each being a limited and inherently flawed display technology (and some models performing better than others).

OLEDs, especially OLED gaming TVs, tend to be a lot more affordable than a top-performing FALD, though. I love OLEDs, but FALD's allure to me now is that it continues to be ahead of the curve on features like higher-Hz, high-resolution screens, higher-than-4K resolutions, and the screen aspects and dimensions available, etc. The higher HDR brightness levels would be a bonus to help soften the blow of losing per-pixel emissive a little. So I might try a FALD this year. The fact that the one I am looking at is essentially glossy would tick another major box for me, avoiding the matte finish most FALDs suffer; idk if I could be happy with a matte one compared to a glossy OLED, especially for media and gaming. If I'm happy with the FALD, I might still go back to OLED once they get some 240Hz OLEDs at higher than 4K, with higher brightness on larger percentages of the screen and for longer durations (PHOLED + MLA and other advancements). That's for my PC setup; I'm planning on keeping my 77" living room OLED for much longer than that.
 
Once you go QD-OLED for TV/home theater, you never go back. For PC use, the price would have to drop below 500 USD for a 32 inch for me to spend.
 