Would you ever go back to LCD after experiencing OLED?

MiniLED is my preference, at least until OLED gets substantially brighter and burn-in is resolved. I'm well aware that lots of people swear by OLED and claim never to have any burn-in whatsoever. However, unless I'm mistaken, burn-in is still not covered by manufacturer warranty. That tells me it's still a problem.
 
Can someone explain to me how the OLED screens on the iPhone 14 Pro and 15 Pro get so bright compared to OLED monitors and TVs? They have a peak brightness of 1600 nits in HDR and 2000 nits outdoors.
Mostly by not giving a shit about burn-in. Run them that high for very long, they'll burn in. They figure that most people don't, but it does happen. I've a friend with a Samsung phone who has burn-in on his screen.
 
MiniLED is my preference, at least until OLED gets substantially brighter and burn-in is resolved. I'm well aware that lots of people swear by OLED and claim never to have any burn-in whatsoever. However, unless I'm mistaken, burn-in is still not covered by manufacturer warranty. That tells me it's still a problem.

OLED warranty is a thing now, and it even gets extended from 2 to 3 years for the new models.

https://videocardz.com/newz/asus-an...-issue-msi-pledging-3-year-warranty-for-oleds

Not that I trust OLED one bit more after reading this.

I would run into issues with OLED in under 100 days simply from using my office suite on a daily basis.
 
Mostly by not giving a shit about burn-in. Run them that high for very long, they'll burn in. They figure that most people don't, but it does happen. I've a friend with a Samsung phone who has burn-in on his screen.

AFAIK, phones and tablets don't use a wear-evening routine with a reserved energizing buffer like big OLED screens do. Plus, phones and tablets are normally set to time out very quickly besides, so no matter how you look at it, it's not an apples-to-apples comparison.

Just a reminder that when people say "burn-in" on modern LG OLED tech, it means they have probably exhausted their wear-evening buffer. (I read somewhere that they reserve something like a 25% energize level on them; I'm not certain of the actual amount.)


As I understand it, you are always "burning down" OLED screens, like millions of tiny, very slow-burning candles. When the "candles" are sensed as being uneven enough, the firmware will burn them all down to level them out again, and then use a reserved energizing buffer to boost them back up, repeatedly through the lifetime of the display. It's when you exhaust that buffer, usually years down the road under normal media and gaming usage scenarios, that you will be left buffer-less, where the TV has no reserved energizing range left to even out the emitters and boost them back up to normal output levels again.
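The loop described above can be sketched as a toy simulation. Everything here is an illustrative assumption (the wear rates, the ~25% reserve, the leveling policy), not actual LG firmware behavior:

```python
# Toy model of OLED wear-evening: pixels "burn down" unevenly, the firmware
# levels them all to the most-worn pixel, then spends reserved drive
# headroom to push output back toward target. Numbers are made up.
import random

random.seed(0)        # deterministic demo
TARGET = 1.0          # normalized target luminance
RESERVE = 0.25        # assumed ~25% reserved drive headroom

def wear(pixels, hours):
    """Uneven usage: each pixel loses a slightly different amount."""
    return [max(0.0, p - random.uniform(1e-5, 5e-5) * hours) for p in pixels]

def compensation_cycle(pixels, reserve_left):
    """Level to the dimmest pixel, then boost uniformly from the reserve."""
    floor = min(pixels)
    boost = min(TARGET - floor, reserve_left)
    return [floor + boost] * len(pixels), reserve_left - boost

pixels, reserve_left = [TARGET] * 8, RESERVE   # a tiny 8-pixel "panel"
for _ in range(5):
    pixels = wear(pixels, hours=1000)
    pixels, reserve_left = compensation_cycle(pixels, reserve_left)

print(f"output after 5 cycles: {pixels[0]:.3f}, reserve left: {reserve_left:.3f}")
```

Each cycle the panel comes out uniform again, at the cost of some reserve; once `reserve_left` hits zero, the "candles" can no longer be boosted back to target, which is the point where uneven wear finally shows as burn-in.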

Think of it like a desert island scenario.
It's as if you had a single-charge "lifetime" battery in a high-performing phone or tablet that was rated for years of use, but was incompatible with charge sensing (or that sensing broke for whatever reason), so you'd never know how much charge was left.
After you turned on your fully charged device and started using it, you'd have no idea what your battery charge level was. You could use more power-hungry apps, disable your power-saving features, run very high screen brightness when you don't need to or aren't viewing content that benefits from it, max the OS screen brightness because you choose to view the screen in bright daylight instead of in the shade, use high-brightness/contrast backgrounds with no screen dimming kicking in, leave the screen on with a very long timeout or "always on" via the OS or a phone app even when you aren't looking at it, etc. You'd still get full performance for quite some time, but eventually you'd burn through nearly the entire battery to where the device was compromised, and you'd end up there a lot faster than someone who used the phone or tablet without those more abusive, faster-draining practices.

When viewing less abusive media and gaming (static MTG at high brightness might be bad though, lol), the burn-in and burn-down mitigation tech, combined with the safe-use practices outlined in many of these threads, plus forced ABL (and enabling logo dimming and pixel shift), should result in a much longer lifetime for the screen and buffer. Using an OLED for desktop/apps with a lower contrast+brightness named OSD picture profile or Windows color profile would probably help some if you had to use one for desktop apps (switching to a different named picture profile for media or games), and you could use something like DisplayFusion to save window-position profiles that you could switch between to help move borders and such. But personally I'd just use a different screen as a workstation/desktop-app display and keep the OLED for media and games.

. . . . . .

OLED warranty is a thing now, and it even gets extended from 2 to 3 years for the new models.

https://videocardz.com/newz/asus-an...-issue-msi-pledging-3-year-warranty-for-oleds

Not that I trust OLED one bit more after reading this.

I would run into issues with OLED in under 100 days simply from using my office suite on a daily basis.

Not all warranties are the same. The Best Buy one seems good, though at a price.

Terms-wise:

LG warranty is:
"The 5 year panel warranty applies to every size of the 2023/2022 SIGNATURE OLED 8K or 2023/2022 OLED evo G3 TV ranges. The warranty doesn't cover commercial or abnormal use, and is only available to the original purchaser of the product when bought lawfully and used within the country of purchase"

" *In the 1st year of the warranty, panel, parts, and labor costs are covered. In the 2nd - 5th year of the warranty, only panels are covered, and labor will be charged.
**5-year panel warranty covers 88Z3, 77Z3, 83G3, 73G3, 65G3, 55G3, 88Z2, 77Z2, 83G2, 73G2, 65G2, and 55G2. "

. .

The Best Buy warranty is:

"We make house calls for TVs 42" and larger.

No need to lug your screen into a store. If your TV is 42" or larger, we'll come to your home to repair your issue. If we originally installed your TV, we'll also uninstall and reinstall it.

You'll never pay for parts and labor.


We take care of 100% of the costs of parts and labor for covered repairs, with no hidden fees.

If your screen has bad pixels or a shadow image, we'll correct it.

If you have at least three pixels that are always the same color or a ghost image that won't go away, we'll get your picture looking like new.

If your TV won’t turn on because of a power surge, we'll fix it.

If there's a power surge or fluctuation that damages your product, we'll make things right. This includes a surge caused by a lightning strike.
If the remote that came with your TV stops working, we'll replace it.

Get a one-time replacement for the remote control that was included in your TV’s original box.


If there's a failure from normal wear and tear, we'll repair it.

This could be a problem with an internal part or how the product was manufactured. It could also be caused by dust, internal heat or humidity. Accidental damage is not included.
 
Lol. Lmao.

Literally read an article like 3 days ago saying they may have cracked the burn-in problem entirely with a new process, which will also let them run at higher brightness. So... very wrong.
Lots of articles out there on this. Example: https://hothardware.com/news/oled-display-breakthrough-eliminate-burn-in

May take a little while to trickle into current displays, but should put the burn-in issue to rest for good.

I still feel the whole burn-in issue is waaaay overblown. My LG C2 has been my daily driver for 2 years now and I've experienced zero burn-in issues.
 
OLED's great, and when you're new to it you definitely feel like you don't want any other display tech. I went through that phase where I ditched all my IPS and VA panels and only had OLED in the house.

Then I got the PG32UQX and realized how epic miniled was with bright stuff.

Don't get me wrong, the current crop of QD-OLED 4K240 displays are excellent. But nothing touches the color pop and brightness of MiniLED right now.

G9 57 is my current favorite display....even limited to 120hz, I still prefer it over the oleds.... its such a fucking gorilla monitor lol
 
In addition to the wear-evening buffer reason I mentioned in my previous reply, OLED burn-in is not as high a risk because it's not allowed to go as bright as, for example, FALD HDR gaming TV screens are. OLED has unavoidable ABL (though some bright Samsung FALDs do too, albeit at much higher brightness levels), and OLED won't hit at all, or sustain, bright mids and highs like some FALD HDR screens can. It's like a burner on a stove that you can't run on a high flame, where the highest flames it can do are short bursts and burner-hole flare-outs.
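The stove-burner analogy maps to how ABL behaves: the larger the bright area, the lower the sustained output allowed. A toy curve of that relationship; the 800-nit peak, 150-nit full-field floor, and the log-shaped falloff are all assumed illustrative figures, not any specific panel's measured limits:

```python
import math

# Illustrative ABL (Automatic Brightness Limiter) curve: allowed nits fall
# as the bright "window" covers more of the screen, which is why OLED can
# flash small highlights hard but can't sustain large bright areas.
def abl_limit(window_pct, peak_nits=800.0, fullfield_nits=150.0):
    if window_pct <= 2:            # tiny highlight: full peak allowed
        return peak_nits
    if window_pct >= 100:          # full-field white: heavily limited
        return fullfield_nits
    # simple logarithmic falloff between the two anchor points
    t = math.log(window_pct / 2) / math.log(100 / 2)
    return peak_nits - t * (peak_nits - fullfield_nits)

for pct in (1, 10, 25, 50, 100):
    print(f"{pct:3d}% window -> ~{abl_limit(pct):.0f} nits allowed")
```

Real panels implement this in firmware with their own window sizes and hold times, but the monotonic shape (small window bright, big window dim) is the part that matters for the comparison with FALD.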

I have a 48CX OLED at my PC and a 77" C1 OLED in my living room, however, and I love them both for HDR gaming and Dolby Vision movies and shows in controlled lighting environments (layout-wise and using smart lights).

. . . .

OLED HDR is decent in that it can do uniform pixel lighting down to a hairline pixel, side by side with other color levels and down to "infinite blacks," which creates pixel-by-pixel contrast and maintains detail. That side-by-side contrast isn't just between objects and areas in scenes either; it goes down to the detail and texture within objects. OLED HDR is good at highlights, isolated light sources, and details, rather than large bright mid (and high) scene areas.

FALD is still a swiss cheese of lighting zones, a tetris brickwork of backlights where the contrast drops in darker colors and dark areas around the bright backlights (and sometimes "radiates" the other way, dimming a brighter object and leaving it less detailed, paled). They are non-uniform, creating localized brightness gradients between zones, if not outright blooming at times. It works really well for what the tech can do, but it's a tradeoff either way, among several other tradeoffs overall between the two techs.

. . .



. . . .

I may even end up with an 8K 900D FALD sometime before I get a 5000-series GPU, so I'm not bashing FALDs or writing them off, really; both techs are using hacks to ameliorate their faults as best they can, but neither is perfect by a long shot.

If the 900D ends up doing 4K 240Hz PC gaming upscaled to 8K very well per the claims about Samsung's 3rd-gen AI chip (only available in that model, not even the 800D), upscales 4K media to greater-than-4K "8K" fidelity, has better FALD and color/screen parameter shaping, and still gives up to 8K worth of desktop/app real estate, then it might be worth getting once the price cools off some. Samsung displays have a history of dropping from their extreme early-adopter release prices several months after release, and more toward the end of the year. It would still be pretty expensive, but once the price drops, and if I qualify for a discount, I could theoretically swing it sometime before the 5000 series drops.

The main reason I'd get a FALD is that OLED isn't available in certain designs/form factors, resolutions, and specs. The fact that I wouldn't have to use OLED "best practices" (personally, I wouldn't be careless with one) when using a FALD would be a bonus, though.

. . . .


OLED mids and sustained mids+highs are limited, but there are major tradeoffs between FALD and OLED besides that. They are both using hacks/workarounds to get the best picture they can, but they are both flawed, so it'll be a circular argument as always.

I prefer the pixel-level granular uniformity and contrast on OLEDs, and the "infinite" black depth even right next to bright pixels, but I could see myself trying a FALD again at some point, as I indicated above. I also only use my OLEDs for media and games, not static desktop/apps, and I always use smart lamps so that I'm in dim-to-dark viewing conditions when really diving into a game or movie. If you are viewing in relatively bright conditions, you are essentially cutting the brightness of the screen down by a lot, since our eyes view everything relatively. Considerable ambient lighting would also activate the matte abraded outer layer on matte screens, which pollutes the screen parameters and even affects clarity a little.
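The "our eyes view everything relatively" point can be put in rough numbers: reflected room light adds luminance to blacks and whites alike, which collapses the effective contrast ratio. A sketch using the standard lambertian lux-to-nits approximation; the 5% coating reflectance and the 800-nit highlight are assumed ballpark figures, not measurements:

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance=0.05):
    """Contrast once reflected room light is added to both black and white.
    l_refl uses the standard lambertian approximation lux * R / pi."""
    l_refl = ambient_lux * reflectance / math.pi
    return (white_nits + l_refl) / (black_nits + l_refl)

# OLED: "infinite" native contrast (black ~0 nits), 800-nit highlight
dark_room   = effective_contrast(800, 0.0, ambient_lux=5)    # dim viewing room
bright_room = effective_contrast(800, 0.0, ambient_lux=500)  # lit office
print(f"dark room: {dark_room:.0f}:1, bright room: {bright_room:.0f}:1")
```

Even with "infinite" native contrast, a lit room knocks the effective ratio down by orders of magnitude, which is why controlled lighting matters so much more for OLED than its spec sheet suggests.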

(Luckily the 900D is more on the glossy side too, which is rare for an HDR FALD LCD, if that display, once reviewed in detail, lives up to the hype versus the marketing claims in the end.)
 
In addition to the wear-evening buffer reason I mentioned in my previous reply, OLED burn-in is not as high a risk because it's not allowed to go as bright as, for example, FALD HDR gaming TV screens are. OLED has unavoidable ABL (though some bright Samsung FALDs do too, albeit at much higher brightness levels), and OLED won't hit at all, or sustain, bright mids and highs like some FALD HDR screens can. It's like a burner on a stove that you can't run on a high flame, where the highest flames it can do are short bursts and burner-hole flare-outs.

I have a 48CX OLED at my PC and a 77" C1 OLED in my living room, however, and I love them both for HDR gaming and Dolby Vision movies and shows in controlled lighting environments (layout-wise and using smart lights).

. . . .

OLED HDR is decent in that it can do uniform pixel lighting down to a hairline pixel, side by side with other color levels and down to "infinite blacks," which creates pixel-by-pixel contrast and maintains detail. That side-by-side contrast isn't just between objects and areas in scenes either; it goes down to the detail and texture within objects. OLED HDR is good at highlights, isolated light sources, and details, rather than large bright mid (and high) scene areas.

FALD is still a swiss cheese of lighting zones, a tetris brickwork of backlights where the contrast drops in darker colors and dark areas around the bright backlights (and sometimes "radiates" the other way, dimming a brighter object and leaving it less detailed, paled). They are non-uniform, creating localized brightness gradients between zones, if not outright blooming at times. It works really well for what the tech can do, but it's a tradeoff either way, among several other tradeoffs overall between the two techs.

. . .



. . . .

I may even end up with an 8K 900D FALD sometime before I get a 5000-series GPU, so I'm not bashing FALDs or writing them off, really; both techs are using hacks to ameliorate their faults as best they can, but neither is perfect by a long shot.

If the 900D ends up doing 4K 240Hz PC gaming upscaled to 8K very well per the claims about Samsung's 3rd-gen AI chip (only available in that model, not even the 800D), upscales 4K media to greater-than-4K "8K" fidelity, has better FALD and color/screen parameter shaping, and still gives up to 8K worth of desktop/app real estate, then it might be worth getting once the price cools off some. Samsung displays have a history of dropping from their extreme early-adopter release prices several months after release, and more toward the end of the year. It would still be pretty expensive, but once the price drops, and if I qualify for a discount, I could theoretically swing it sometime before the 5000 series drops.

The main reason I'd get a FALD is that OLED isn't available in certain designs/form factors, resolutions, and specs. The fact that I wouldn't have to use OLED "best practices" (personally, I wouldn't be careless with one) when using a FALD would be a bonus, though.

. . . .


OLED mids and sustained mids+highs are limited, but there are major tradeoffs between FALD and OLED besides that. They are both using hacks/workarounds to get the best picture they can, but they are both flawed, so it'll be a circular argument as always.

I prefer the pixel-level granular uniformity and contrast on OLEDs, and the "infinite" black depth even right next to bright pixels, but I could see myself trying a FALD again at some point, as I indicated above. I also only use my OLEDs for media and games, not static desktop/apps, and I always use smart lamps so that I'm in dim-to-dark viewing conditions when really diving into a game or movie. If you are viewing in relatively bright conditions, you are essentially cutting the brightness of the screen down by a lot, since our eyes view everything relatively. Considerable ambient lighting would also activate the matte abraded outer layer on matte screens, which pollutes the screen parameters and even affects clarity a little.

(Luckily the 900D is more on the glossy side too, which is rare for an HDR FALD LCD, if that display, once reviewed in detail, lives up to the hype versus the marketing claims in the end.)

The majority of people who have tried both "swiss cheese" FALD and OLED have preferred the swiss cheese option. You keep saying you prefer per-pixel dimming, but have you actually tried a high-end FALD display yet? Whether it's a PG32UQX or one of the high-end Samsung FALDs like the QN95C? Because I'm pretty sure if you did, you would switch over just like the rest of us for HDR gaming. It's pretty easy to say you prefer LG C-series OLEDs if you haven't compared them against anything better.
 
This is when we need that crazy dude Kremnelis, I think his name was? lol. He would go to great lengths to prove how high-end FALD displays are superior to any OLED on the market. I think he was rough around the edges, so he may have gotten banned lol
Is this a stat somewhere I'm missing?

It's glass/full-glossy OLED for me from now on. I didn't think 42" would work, but now I don't want anything smaller.
It's one of those things where if you know, you know. The OLEDs are too weak to even compete with real HDR. I don't know the numbers to get all nerdy about it, but some others have definitely discussed it here in depth.
 
This is when we need that crazy dude Kremnelis, I think his name was? lol. He would go to great lengths to prove how high-end FALD displays are superior to any OLED on the market. I think he was rough around the edges, so he may have gotten banned lol

It's one of those things where if you know, you know. The OLEDs are too weak to even compete with real HDR. I don't know the numbers to get all nerdy about it, but some others have definitely discussed it here in depth.

Well, some of his arguments were just complete nonsense. Like when people say they want a balanced display that can do some HDR but also good motion clarity, which is what OLED can do, his response was to buy a 360Hz TN, completely disregarding the fact that a 360Hz TN is not at all a balanced display and is shit at everything besides motion clarity. OLEDs are pretty well-rounded displays, but if you want the best HDR experience above all else, it simply ain't it.
 
Well, some of his arguments were just complete nonsense. Like when people say they want a balanced display that can do some HDR but also good motion clarity, which is what OLED can do, his response was to buy a 360Hz TN, completely disregarding the fact that a 360Hz TN is not at all a balanced display and is shit at everything besides motion clarity. OLEDs are pretty well-rounded displays, but if you want the best HDR experience above all else, it simply ain't it.
He was entertaining haha, crazy guy said the wildest stuff. I would say FALD is more balanced though for mixed use in mixed conditions. OLED's limitations and conditions are pretty strict when it comes to where/how to use it.
 
I haven't seen a UCX/UCG/UQX in person, but I digested several detailed reviews (e.g. TFT Central, TechSpot) multiple times, plus several video reviews (HDTVTest and a few others) of their trade-offs. I'm not dropping that kind of money on something without researching it. I wouldn't want the slow response time's effect on clarity, on top of what is now becoming a lower 120fps/Hz peak for blur (and motion definition) compared to more modern 240Hz 4K screens, if I were going to upgrade at this point. It's got an abraded matte/AG layer on it too. Plus, I like larger screens. The checkerboarding, and the real-scene side-by-side mixed contrast of the UQX dropping to 3,800:1-4,500:1 in those swiss-cheese/lit tetris blocks and adjacent areas, is a tradeoff there, and non-uniform versus the extremely higher contrast in other areas of the screen. Plus overt blooming, even into the letterboxing in some instances. Again, not that OLED doesn't have some big tradeoffs.

I've seen modern FALD Samsung gaming TV screens in stores regularly, since I like to check out new screen tech and model series, but I've mainly read reviews of the tradeoffs of the QN95C model you mentioned, out of curiosity. It wasn't available when I bought either of my gaming TVs anyway. It was a previous QN model line I compared against when I bought my living room TV, and the 77" C1 OLED won the decision.

I'm not a time traveler, other than forward. I doubt I'd spend money on anything going forward that wasn't 240Hz 4K capable at this point, and higher-than-4K resolution would be a big incentive for me too (which is why I was investigating the 57" 7680x2160 G95NC at one point). So that rules out both of the screens you mentioned (as well as most OLEDs outside of the newest 32" 4K desktop monitors).

Both FALD and OLED are using hacks/tricks/workarounds to get the best picture they can, but they are both flawed, so it'll be a circular argument as always. I'm not saying either is terrible; they have strengths and weaknesses, even in HDR.

In fact, I had already gone through detailed research on the 55" 4K Samsung ARK (it failed on some functionality promises and isn't a true multi-monitor replacement with a quad of 1080p space), and then researched the G95NC 57" 7680x2160 FALD HDR screen (won't do 240Hz on Nvidia; its design/dimensions/curve make it too short for my liking; it also has a matte screen). So I am not against getting a large FALD HDR gaming screen, and if they had ticked enough boxes to tip the scales, I could have ended up with either one of those by this point. If I end up getting a 65" 900D 8K with the NQ8 AI processor by the end of the year (if it lives up to the hype once reviewed in more detail), I'll be able to compare that bright FALD to my two OLEDs more directly. That would be great.

Neither tech sucks. They both use workarounds/hacks to ameliorate their limitations, but they are both flawed. You have to pick which gains and losses you are OK with (or use more than one screen). That said, current screen tech limitations aside, per-pixel emissive is ultimately the better way to do things.
 
I haven't seen a UCX/UCG/UQX in person, but I digested several detailed reviews (e.g. TFT Central, TechSpot) multiple times, plus several video reviews (HDTVTest and a few others) of their trade-offs. I'm not dropping that kind of money on something without researching it. I wouldn't want the slow response time's effect on clarity, on top of what is now becoming a lower 120fps/Hz peak for blur (and motion definition) compared to more modern 240Hz 4K screens, if I were going to upgrade at this point. It's got an abraded matte/AG layer on it too. Plus, I like larger screens. The checkerboarding, and the real-scene side-by-side mixed contrast of the UQX dropping to 3,800:1-4,500:1 in those swiss-cheese/lit tetris blocks and adjacent areas, is a tradeoff there, and non-uniform versus the extremely higher contrast in other areas of the screen. Plus overt blooming, even into the letterboxing in some instances. Again, not that OLED doesn't have some big tradeoffs.

I've seen modern FALD Samsung gaming TV screens in stores regularly, since I like to check out new screen tech and model series, but I've mainly read reviews of the tradeoffs of the QN95C model you mentioned, out of curiosity. It wasn't available when I bought either of my gaming TVs anyway. It was a previous QN model line I compared against when I bought my living room TV, and the 77" C1 OLED won the decision.

I'm not a time traveler, other than forward. I doubt I'd spend money on anything going forward that wasn't 240Hz 4K capable at this point, and higher-than-4K resolution would be a big incentive for me too (which is why I was investigating the 57" 7680x2160 G95NC at one point). So that rules out both of the screens you mentioned (as well as most OLEDs outside of the newest 32" 4K desktop monitors).

Both FALD and OLED are using hacks/tricks/workarounds to get the best picture they can, but they are both flawed, so it'll be a circular argument as always. I'm not saying either is terrible; they have strengths and weaknesses, even in HDR.

In fact, I had already gone through detailed research on the 55" 4K Samsung ARK (it failed on some functionality promises and isn't a true multi-monitor replacement with a quad of 1080p space), and then researched the G95NC 57" 7680x2160 FALD HDR screen (won't do 240Hz on Nvidia; its design/dimensions/curve make it too short for my liking; it also has a matte screen). So I am not against getting a large FALD HDR gaming screen, and if they had ticked enough boxes to tip the scales, I could have ended up with either one of those by this point. If I end up getting a 65" 900D 8K with the NQ8 AI processor by the end of the year (if it lives up to the hype once reviewed in more detail), I'll be able to compare that bright FALD to my two OLEDs more directly. That would be great.

Neither tech sucks. They both use workarounds/hacks to ameliorate their limitations, but they are both flawed. You have to pick which gains and losses you are OK with (or use more than one screen). That said, current screen tech limitations aside, per-pixel emissive is ultimately the better way to do things.

Ah, the 8K... too bad those don't get as bright as their 4K counterparts. The QN900C is actually a lot closer to the S95C QD-OLED than to the QN95C in terms of HDR brightness.

[attached image: HDR brightness comparison chart]
 
Overall, the desktop real estate at 8K being 4x 4K, and whether it can do 4K 240Hz justice AI-upscaled to full-screen "8K" as well (and not some ill-performing quasi-240), would be huge facets for me. Some detailed reviews could trash it, though, potentially. We'll have to see. It has the same number of zones as the 900C, but it supposedly shapes the output and energy differently using the 3rd-gen AI chip, so we'll have to see what it can do; not that those are bad numbers to start with, IMO, considering what else it can (supposedly) do.

Some weird numbers are going around, but most reviewers are claiming it has higher output than the numbers they came up with when they did the 900C. Definitely looking forward to RTINGS reviewing the 900D in detail.

For me, at this point, things like higher-than-4K resolution plus 240Hz capability together, and maybe interesting different designs/form factors, would get my attention more than slightly higher nits when comparing two models of the same type (FALD-FALD, OLED-OLED), unless one was woefully bad in that facet. The 900D is also more like a glossy screen, which is rare for a FALD LCD, and I really dislike matte/abraded surfaces if I can avoid them.

Here is the RTINGS review comparison of the 900C (which may be at least somewhat lower than the 900D) against the QN95C in game mode (again), with the overall rating.

[attached image: RTINGS game-mode comparison screenshot]
 
He was entertaining haha, crazy guy said the wildest stuff. I would say FALD is more balanced though for mixed use in mixed conditions. OLED's limitations and conditions are pretty strict when it comes to where/how to use it.

Depends, I guess. For me, I wanted something 32 inches, 4K, 240Hz. There is only a single FALD that fits those specs, which is the Samsung Neo G8. But that monitor has a ridiculous 1000R curve, a disgustingly grainy matte coating, scanlines, horrible EOTF tracking, and typical Samsung QC issues. So for me, a QD-OLED is a no-brainer over that hot mess.
 
Depends, I guess. For me, I wanted something 32 inches, 4K, 240Hz. There is only a single FALD that fits those specs, which is the Samsung Neo G8. But that monitor has a ridiculous 1000R curve, a disgustingly grainy matte coating, scanlines, horrible EOTF tracking, and typical Samsung QC issues. So for me, a QD-OLED is a no-brainer over that hot mess.
Samsung's monitors have made me sad. There were a number I would potentially be interested in, like the Neo G8, but like you said: big curve, image issues, etc. So my PG32UQX it is. I don't love the blur, and I wouldn't mind 240Hz... but all things considered, it is probably the best balance for me.

I am watching the OLEDs, though. I'll be interested to see how the Monitors Unboxed burn-in test goes, and I'll be interested to see how the next generation handles brightness.

I will say, despite having to agonize over tradeoffs and imperfections, this is the happiest I've ever been with monitors. The PG32UQX is not perfect, but god damn does it look good and have bright, impactful gameplay. If I had one of the OLED monitors instead, I'm sure I'd likewise be super happy, as I like my OLED TV a ton. Sometimes picking the nits makes us forget just how great things are. 4K HDR is just a pleasure to have.
 
The biggest problem with using Samsung's TVs as monitors is that it seems like Samsung has done everything they could to make the experience much worse than it has to be, with dithering, undefeatable scaling, etc. I get the feeling they might actually be doing it deliberately so that their business areas don't cannibalize each other. You can say what you want about Samsung, but when it comes to doing business and earning money, maybe only Apple plays the game better than they do. I mean, just look at the imagination in their marketing about monitors curved like the eye, etc.

It kind of reminds me of when Apple seemed to realize they were making the best PC laptops on the market, saw that it might threaten their ecosystem, and went overboard making it such a PITA that most Windows users just gave up.
 
The biggest problem with using Samsung's TVs as monitors is that it seems like Samsung has done everything they could to make the experience much worse than it has to be, with dithering, undefeatable scaling, etc. I get the feeling they might actually be doing it deliberately so that their business areas don't cannibalize each other. You can say what you want about Samsung, but when it comes to doing business and earning money, maybe only Apple plays the game better than they do. I mean, just look at the imagination in their marketing about monitors curved like the eye, etc.

It kind of reminds me of when Apple seemed to realize they were making the best PC laptops on the market, saw that it might threaten their ecosystem, and went overboard making it such a PITA that most Windows users just gave up.

They definitely have Apple-tier pricing, at least at the hiked early-adopter release prices. They usually drop in price a considerable chunk (around $1000 less) after only 3 months, a bit more at 5 months, and even more than that approaching a year from release.

Since they always ask a bigger early-adopter price, it's worth waiting it out, kind of like buying a car at the end of the model year, because by then they'll have dropped a lot. It usually doesn't even take that long for Samsung to drop the price. Plus, some people qualify for a Samsung discount on top of that (though that removes the ability to add a Best Buy warranty by buying from them, even though you can pick up the Samsung purchase at Best Buy, ironically).

https://pangoly.com/en/browse/monitor/samsung

[attached image: price-history chart]


. .

[attached image: price-history chart]


. .

dithering, undefeatable scaling

The dithering on the 700B 8K screen was defeatable with a workaround: force-running VRR mode on the desktop. Dumb that they released it like that though, I agree. Maybe their upscaling or 8K resolution pipeline not being as good back then had something to do with that dithering being on, I don't know. It was also the lowest tier of their 8K screens, I think. It came out in April 2022.

The forced scaling on the 8K screens up until now is unfortunate, but on a 900D 8K with a 5000-series GPU you might be able to get 115 Hz 8K RGB/4:4:4 10-bit with roughly 3:1 DSC, and 120 Hz 8K is very near HDMI 2.1's max too, just a hair over, so maybe a slightly higher DSC ratio could get there. It depends on whether Nvidia sets up their ports better in the 5000 series (unlike the G95NC 7680x2160 240 Hz being limited to 120 Hz, or 3000/4000 GPUs not doing multiple 240 Hz 4Ks with a third monitor, etc.). If you could run the desktop at 115-120 Hz 8K, you could run windowed games at 120 Hz with other desktop windows/apps open when you didn't want whatever the peak upscaled full-screen mode turns out to be (they claim 240 Hz 4K, AI-upscaled to 8K).

There's no information yet on forced scaling on the 900D that I've been able to find. I think they expect an 8K screen to be used full screen, and if something like the Ark is broken up into multi-view, they want to be the ones doing it with their systems. I don't know whether the 900D's One Connect box can do anything like the newer version 2 of the Ark + OC box (breaking the screen into true multi-source-input tile configurations at 120 Hz, instead of the Ark screen's 165 Hz peak). Still too early to tell for everything on the 900D, but it wouldn't surprise me if it still forced scaling like the 900C. Some issues also come down to DSC implementations, which is why the G95NC 7680x2160 super-ultrawide having a dedicated 120 Hz mode in addition to its 240 Hz DSC mode might have been a good idea for compatibility reasons.
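The bandwidth arithmetic behind that "just a hair over" claim can be sketched as a back-of-the-envelope estimate. The blanking overhead (~5%) and the HDMI 2.1 FRL effective payload (~42.6 Gbps after 16b/18b coding) are rounded assumptions here, not exact link-budget figures:

```python
# Rough check: does 8K 120 Hz 10-bit RGB fit HDMI 2.1 with ~3:1 DSC?
# Assumptions: ~5% blanking overhead, HDMI 2.1 FRL effective payload
# ~42.6 Gbps (48 Gbps raw minus 16b/18b coding). Real budgets vary.

def video_bandwidth_gbps(width, height, hz, bits_per_channel,
                         channels=3, blanking=1.05):
    """Uncompressed video bandwidth in Gbps, including blanking."""
    pixels_per_sec = width * height * hz * blanking
    return pixels_per_sec * bits_per_channel * channels / 1e9

HDMI21_EFFECTIVE_GBPS = 42.6

raw = video_bandwidth_gbps(7680, 4320, 120, 10)
print(f"uncompressed: {raw:.1f} Gbps")      # ~125 Gbps, far over the link
print(f"with 3:1 DSC: {raw / 3:.1f} Gbps")  # squeaks in under ~42.6 Gbps
```

Dropping to 115 Hz or trimming blanking shifts the result by a few Gbps either way, which is exactly why the claim lands so close to the edge.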

grainy matte coating

I agree that sucks in general. I still have a setup with a glossy 1440p Cinema Display right next to an AG-coated 1440p PG278Q, and discounting the screen type/spec differences, the AG is way less appealing. My laptop has AG too, next to a glossy Samsung tablet I sometimes keep in portrait orientation beside it. So I regularly see, side by side, the stark difference between an abraded screen surface and a clear, lush one.

We'll see if the 900D screen I keep talking about pans out once fully fleshed-out reviews land, but of the YouTube reviewers who recently got hours with that gaming TV, a few who happen to be AG-coating enthusiasts complained that it's more or less on the glossy end of the spectrum "vs reflections". To me that's a good thing, since I love glossy. It's also rare for a FALD screen to be glossy or "reflective".

monitor has a ridiculous 1000R curve,

I'd love a 55" to 65" 1000R screen (8K > 120 Hz optimally), because the center of curvature of 1000R(adius) = 1000 mm ≈ a 40-inch view distance, which would suit screens of that size. For a desktop screen mounted directly on a desk, you'd need a 700R to 800R screen to view it from the center of curvature. Imo, most ultrawide screens' curvature is bad or sub-optimal for their dimensions, with a few exceptions.

700R (700mm, 27.5 inch view distance from center of curvature) to 800R (800mm, 31.5 inch view distance from center of curvature).
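Those radius-to-distance numbers fall straight out of the definition: the "R" figure is the curve radius in millimetres, and sitting at the center of curvature puts your eyes exactly one radius from every point on the panel. A quick sanity check of the conversions quoted above:

```python
# Convert a screen's curvature rating (e.g. 1000R = 1000 mm radius)
# to the ideal view distance in inches: sitting at the center of
# curvature means sitting one radius away from the panel.

MM_PER_INCH = 25.4

def view_distance_inches(curvature_r_mm):
    """Ideal view distance (inches) for a given curvature radius."""
    return curvature_r_mm / MM_PER_INCH

for r in (700, 800, 1000):
    print(f"{r}R -> {view_distance_inches(r):.1f} in")
# 700R -> 27.6 in, 800R -> 31.5 in, 1000R -> 39.4 in (~40")
```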

[Schematic: the top translucent example sits far inside, away from the center of curvature; the solid example sits at the ~40 inch center of curvature in the case of a 1000R (1000 mm radius) screen.]


The 45" 21:9 bendable Corsair Xeneon Flex gaming OLED (3440x1440, 240 Hz) is able to do 800R (800 mm, ~31.5 inches to the center of curvature). The LG 45GR85QE-B was also 800R (800 mm, ~32"), but it has a fixed curve.
 
This is the reality of burn-in, even on the newest OLED panels. Read this article.

https://www.tomsguide.com/opinion/i-didnt-fear-burn-in-on-my-oled-gaming-monitor-until-i-got-burned

Alienware bullshits you by saying it "shouldn't burn in", and the usual flock of people will say "looks like user error", all cocky and ignorant.

This is ridiculous: one year of use and the monitor is ruined. That's a joke 😂 actually it's not a joke 😐 For every post that says "I have xxxx hours and zero burn-in", you are just misleading thousands of buyers who accidentally buy into the fake hype and get burned, literally lol, burned in, and then have to deal with all the warranty or whatever bullshit. I mean, you can't say you weren't warned.

What a clown fiesta this OLED problem has been.

I'll tell you, sitting back watching and shaking my head has been unfortunate for you guys but interesting to me, since I didn't fall for the bullshit.

Burn-in is an absolute problem; if you don't believe it, go buy one yourself, suffer, and learn the hard way.
 
The dithering on the 700B 8K screen was defeatable with a workaround: forcing VRR mode on the desktop. Dumb that they released it like that, though, I agree. Maybe their upscaling or 8K resolution pipeline not being as good back then had something to do with the dithering being on, idk. It was also their lowest-tier 8K screen, I think. It came out in April 2022.

The forced scaling on the 8K screens up until now is unfortunate, but on a 900D 8K with a 5000-series GPU you might be able to get 115 Hz 8K RGB/4:4:4 10-bit with roughly 3:1 DSC,
My point was that, AFAIK, the dithering on the 700B is no different from other recent Samsungs; they all require Game mode + VRR to be used as a PC monitor, from what I recall.

There are some screen-ratio settings you could use, but while they prevent things from going full screen, they still won't disable scaling, unfortunately.
 
Biggest problem about using Samsung's TVs as monitors is that it seems like Samsung has really done everything they could to make the experience much worse than it would have to be with dithering, undefeatable scaling etc.
I did notice that on my S95B. In games you don't really see it, particularly from couch distance, but when I was looking at test patches to check dark levels, all the dark patches had noisy spatial dither going on, which is odd since it is supposedly 10-bit. This was in minimal-processing mode: game mode with VRR and ALLM enabled. Maybe it's to try to mitigate the black crush, maybe it's just lazy development. Whatever the case, it surprised me.

Next go-around with a TV I might pay the Sony tax, because they really seem to do their software right... but damn do they charge a large premium. The comparable Sony QD-OLED was about $1,000 more than the S95B when I bought. It probably won't be an issue for a while, because while I do look at the new TVs and drool over the minor spec improvements, I have to admit I'm really happy with the S95B in actual use and don't think I'll replace it for quite some time.

Again, like with the PG32UQX: all the nitpicking sometimes makes me lose sight of the fact that this is the happiest I've ever been with a TV. The flawless viewing angles are great for when there's company and not everyone can sit in the sweet spot, OLED just looks so damn great for dark games like Resident Evil, movies look awesome, etc.
 
no lol
Burn in is an absolute problem, if you don't believe it go buy one yourself and suffer yourself and learn the hard way.
Bought one. Actually, I own two OLEDs, one of which is five years old (an LG TV), and neither has any burn-in. My Alienware has had 12+ hours a day on it for close to two years now (received early May 2022) and still nothing.

Sorry to disappoint.

There's always risk, but in my book "oh no, my $900 monitor needs replacement a couple of years early" is not something that even crosses my mind to worry about.
 
no lol

Bought one. Actually, I own two OLEDs, one of which is five years old (an LG TV), and neither has any burn-in. My Alienware has had 12+ hours a day on it for close to two years now (received early May 2022) and still nothing.

Sorry to disappoint.

There's always risk, but in my book "oh no, my $900 monitor needs replacement a couple of years early" is not something that even crosses my mind to worry about.

Do you think I'm going to trust and believe you?

Or Rtings?
View: https://www.youtube.com/watch?v=Fa7V_OOu6B8&t=505s

lol, I don't care how many of you say you have 1,000,000,000 hours and no burn-in. If you want to babysit your display, that's your choice; it doesn't matter to me, because it's too dim anyway, let alone that it absolutely will burn in, guaranteed.
 
Can we give the histrionics a rest? So much drama-queening and "super meaningful indignation" recently in [H] display.

I did notice that on my S95B. In games you don't really see it, particularly from couch distance, but when I was looking at test patches to check dark levels, all the dark patches had noisy spatial dither going on, which is odd since it is supposedly 10-bit.

You mean noisy temporally? It's there on my Alienware 34 too.

Can anybody with a 3rd-gen panel see it on theirs?
 
You mean noisy temporally? It’s there on my alienware 34 too.
Yes, it's a temporal dither; it looks very much like the one on Lagom's black-level page, though not nearly as strong (theirs is exaggerated on purpose). Normally it's the kind of thing you get on 8-bit panels, since they have to dither for HDR, but supposedly the Samsung OLEDs are true 10-bit, which is what makes it strange to me. It's also likely more noticeable on an OLED because of the fast transition times. I used to have an 8-bit panel that dithered down 10-bit content and I had a real hard time noticing it.
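For reference, the basic idea of temporal dithering is to approximate an in-between level by alternating the two nearest representable codes across frames so the time-average matches the target. A minimal illustration of the principle, not how any particular panel's controller actually implements it:

```python
# Minimal temporal-dither sketch: display a 10-bit level on an 8-bit
# panel by alternating the two nearest 8-bit codes frame to frame.
# 10-bit has 4x the steps of 8-bit, so the target is level / 4.

def temporal_dither_frames(level_10bit, n_frames):
    """Return n_frames of 8-bit codes whose average approximates
    level_10bit / 4 using simple error accumulation."""
    target = level_10bit / 4
    lo, frac = int(target), target - int(target)
    err, frames = 0.0, []
    for _ in range(n_frames):
        err += frac            # accumulate the fractional remainder
        if err >= 1.0:         # emit the higher code when error wraps
            frames.append(lo + 1)
            err -= 1.0
        else:
            frames.append(lo)
    return frames

frames = temporal_dither_frames(513, 8)  # 513/4 = 128.25
print(frames)                  # mostly 128, one 129 in every 4 frames
print(sum(frames) / len(frames))  # averages to exactly 128.25
```

The flicker between adjacent codes is what shows up as "noise" on dark patches, and OLED's near-instant transitions make each frame's code change fully visible, which fits the observation that it stands out more than on slower LCD panels.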
 
Ah, I see, thanks. I overlooked that when I was last at the page; I should actually read stuff properly!

Yes, that's definitely what I see. I do feel like it's more pronounced in 8-bit compared with 10-bit in HDR, but it's there all the time, even in 8-bit SDR mode.

Forget brightness; I would upgrade for an improvement in that.
 
Biggest problem about using Samsung's TVs as monitors is that it seems like Samsung has really done everything they could to make the experience much worse than it would have to be with dithering, undefeatable scaling etc.
I think you are giving them way too much credit. I say it's incompetence, or management not giving display developers enough time to get things right, or to fix anything but major issues later on.

TizenOS is a huge pile of crap. Years ago, when I had a Samsung TV, it was so bad I started using a Google Chromecast for all my smart TV functions, because the apps just kept regularly crashing and I got the TV into a "need to reset the whole thing to get it working again" state several times, just using it normally. Those same apps worked just fine on the LG C9 that replaced it. The modern version of TizenOS is not as horrible, but it's still a user-experience nightmare, with settings buried deep in nested menus, and it's still kinda sluggish.

You would think all this would get better over time as an evolving software platform, but looking at various displays, it feels like developers of every brand start almost from scratch for each new display model. Instead of steadily improving features, it's a weird back-and-forth where this year's model loses some feature, then gets it back the next year, or a feature performs worse on the new model.

Same on the computer monitor side. I've watched the Samsung super-ultrawides, and the OSDs are all iterations of the same thing, except they threw out the useful preset functionality that was finally working like it should, the PbP stuff is still "did anyone even try using this regularly?" inconvenient, etc.
 
I think you are giving them way too much credit. I say it's incompetence, or management not giving display developers enough time to get things right, or to fix anything but major issues later on.

TizenOS is a huge pile of crap. Years ago, when I had a Samsung TV, it was so bad I started using a Google Chromecast for all my smart TV functions, because the apps just kept regularly crashing and I got the TV into a "need to reset the whole thing to get it working again" state several times, just using it normally. Those same apps worked just fine on the LG C9 that replaced it. The modern version of TizenOS is not as horrible, but it's still a user-experience nightmare, with settings buried deep in nested menus, and it's still kinda sluggish.

You would think all this would get better over time as an evolving software platform, but looking at various displays, it feels like developers of every brand start almost from scratch for each new display model. Instead of steadily improving features, it's a weird back-and-forth where this year's model loses some feature, then gets it back the next year, or a feature performs worse on the new model.

Same on the computer monitor side. I've watched the Samsung super-ultrawides, and the OSDs are all iterations of the same thing, except they threw out the useful preset functionality that was finally working like it should, the PbP stuff is still "did anyone even try using this regularly?" inconvenient, etc.

Shame that's the way it is, for sure. Regarding the smart TV stuff: I jumped to an Nvidia Shield a long time ago and then a Shield 2019. Even when the LG webOS on my screens was decent, the Shield is still much faster in operation, since it was designed for some light gaming duty. It accesses and manages libraries a lot quicker than any smart TV OS I've ever used, and it has a gigabit network port I actually utilize (it sucks that high-end TVs still usually have slow network ports). It can even act as a media server decently if desired. It also has AI upscaling that, while probably dated compared to what's coming out now, was way ahead of the curve and still works very well for 1080p material.

Another big thing is that it can run third-party apps and even sideload stuff, like an Android tablet more or less. So I can run third-party streaming apps, simple installs from the regular Google Play store, which have a ton of quality-of-life improvements, plus third-party custom interface apps, an OTA antenna app, camera apps, etc. No walled proprietary garden where you can't find or use (Android) apps. The only thing it lacks is the chip for YouTube HDR, but I can launch webOS for that, even using my voice assistant, whenever I want. The only time I ever do that is when I'm watching HDR game footage in a YouTube review or how-to, which is very, very rare to date; well under 1% of my YouTube watching, probably 0.01% if that.

I went to the Shield and never looked back, so the smart-apps facet of gaming TV reviews and complaints is never an issue for me, unless it affects the general OSD and regular TV management too much: like you said, things buried in menus, sluggishness, etc. Though some LG stuff isn't all that intuitive either if you aren't familiar with their OSD. Idk why everything doesn't have a search function like Windows 11, and even Samsung's Android phone settings, have. You should be able to just enter a keyword or use voice control and get associated results. Windows 11's works great: I hit Win+S to search, type two letters, and it brings up anything smartly and instantly. On a Win11 PC I rarely, if ever, use the start menu anymore (and I can Win+Tab or Alt+Tab between apps on a per-active-monitor basis using DisplayFusion, and activate/focus apps with Stream Deck buttons, so there's no real need for the taskbar much anymore other than peeking at the system tray).

I still have two Samsung NU6900 4K TVs, and even though their OS/OSD dates from 2018, it works well enough for my needs in just setting up screen parameters. Other than that I have an S6 Lite tablet and a Samsung phone. I never had a Samsung gaming monitor (at least not since one of the original 120 Hz screens, a 27" 1080p, before G-Sync was a thing). In more recent history I had a 32" LG 1600p VA, then went with the 48CX gaming TV and stuck with it.

The Samsung Ark had a lot of issues in version 1, though; I followed that thing for a while. V2 is decent, but it left v1 users feeling more like beta testers of an inferior system.


Most of that stuff is more applicable to living-room TV usage. I change between a few named picture profiles on the gaming TV at my PC, but I don't really use its smart apps.
 
What's a bit interesting with Samsung is that because they seem to release products before they're actually finished, the products actually improve over time. So the same product might genuinely be better a few months later; this seems especially true of firmware. Unfortunately, their firmware updates seem to introduce as many problems as they solve, at least with their LCD TVs...
 
I have been using my C7 daily, watching news and sports with their logos, Dolby Vision content, etc. As the living-room TV it gets a LOT of use. I also have 8 other current OLED screens across phones, a tablet, laptops, and home and work PCs, but I get that everyone is different. I would still consider LCD if it offered a good value proposition and performance with no major downsides like haloing and blur, or a unique size and refresh-rate combo. If they made a 36-38" 4K 240 Hz+ mini-LED with no haloing that cost the same or less than the 32"-42" 240 Hz OLEDs...
 
What's a bit interesting with Samsung is that because they seem to release products before they're actually finished, the products actually improve over time. So the same product might genuinely be better a few months later; this seems especially true of firmware. Unfortunately, their firmware updates seem to introduce as many problems as they solve, at least with their LCD TVs...

Good to hear that they tweak their firmware, at least.

Firmware update rollouts were frequent with the LG OLEDs. Some manufacturers have been really bad about it in the past, moving on to next-gen displays and leaving the older ones in the dust too quickly.


the fact that they seem to release products before they are actually finished, they actually seem to improve over time.

It seems even more so generationally on the Samsungs, though, like version 2 of the Ark, and I think the G9 (but I'm fuzzy on that one, tbh).
 
What's a bit interesting with Samsung is that because they seem to release products before they're actually finished, the products actually improve over time. So the same product might genuinely be better a few months later; this seems especially true of firmware. Unfortunately, their firmware updates seem to introduce as many problems as they solve, at least with their LCD TVs...
I feel like every manufacturer is the same: release a product in a kinda crappy state that's good enough that most users won't notice, then release fixes for maybe a year until the cycle repeats with the next-gen product being better.

Samsung just tends to have added quality-control issues, on their LCDs at least. It's not necessarily even the panels from TCL/CSOT (who bought Samsung Display's LCD manufacturing business); what usually fails on e.g. the super-ultrawides seems to be things like power supplies or the controller board.

I feel like Samsung is actually a lot like Apple in some ways. My rule is never to buy any first-gen Apple product, because the second gen always fixes the major issues. A good example is the MacBook Pro M1, which has only HDMI 2.0, whereas the M2 supports HDMI 2.1. On Samsung's side, the first-gen Ark is a good example; the second gen added a much-needed multi-HDMI-input system.
 
Do you think I'm going to trust and believe you?

Or Rtings?
View: https://www.youtube.com/watch?v=Fa7V_OOu6B8&t=505s

lol, I don't care how many of you say you have 1,000,000,000 hours and no burn-in. If you want to babysit your display, that's your choice; it doesn't matter to me, because it's too dim anyway, let alone that it absolutely will burn in, guaranteed.

Methinks you protest too much lol. I am four months into OLED goodness: zero issues, no babysitting, just common sense.
 
What's a bit interesting with Samsung is that because they seem to release products before they're actually finished, the products actually improve over time. So the same product might genuinely be better a few months later; this seems especially true of firmware. Unfortunately, their firmware updates seem to introduce as many problems as they solve, at least with their LCD TVs...
This is true. That's why I review the forums from time to time to see which firmware version has the best HDR, and stick with it. The classic "if it ain't broke, don't fix it" strongly applies to display firmware. My HDR looks so good with all types of content that I will never, for the life of the display, change it.
The Samsung OS I never use; I just go to my app of choice and use the app's GUI. It's not like it's my desktop OS, so I don't mind it. In fact it's even snappier than the LG in the bedroom, for comparison.
 
Honestly, I'd put a ton more stock in a couple of reviews from members here than in thousands of reviews from literally anywhere else.
 