Why OLED for PC use?

Do you have an Amazon link to the display you're talking about? I'm just curious about its price and reviews on Amazon since you give it such a good recommendation.

They actually delist the product every time it goes out of stock, and you have to wait for them to restock it before the item gets listed again. Rtings already reviewed the 27-inch version, the biggest difference being that the 27-inch version can do 160Hz and has a faster response time. I paid $850 for mine back in Feb, and between this display and an LG C2 for about the same price I would pick this one every single time.
 
Burn-in is not a huge concern unless your goal is to keep the same display for 5+ years. I used the LG CX 48" for two years straight working from home, so ~8h desktop use on workdays plus personal use. The same TV is still working without issue.

Just don't run high brightness, set your taskbar/dock/topbar to autohide, use dark modes where available.

People pick OLED because it's a good compromise atm where it's not super expensive, has real <1ms pixel response times, good enough HDR, excellent viewing angles etc. Like any tech it's far from perfect.

What kind of work do you do? What amount of static windows (Word, Excel, etc.)?

I mean, I tend to switch monitors about every 5 years on average, so at least that is what I expect out of a screen.
 
They actually delist the product every time it goes out of stock, and you have to wait for them to restock it before the item gets listed again. Rtings already reviewed the 27-inch version, the biggest difference being that the 27-inch version can do 160Hz and has a faster response time. I paid $850 for mine back in Feb, and between this display and an LG C2 for about the same price I would pick this one every single time.
How fast is yours? 144? or 120?
 
Rtings proves otherwise lol. Average users don't notice any difference with an OLED except how dim it is when they bring it home to the living room. Then when it burns in they get mad, of course. But I'm sure you're in a cave with low brightness, slamming gallons of copium till it's gushing out your ears haha. Go ahead and tell me how you run max brightness in a bright room for 3000 hours, all the usual bullshit, because the fact of the matter is Rtings just proved YOU ALL WRONG LOL
Lol you're high on copium
 
Hardware Unboxed ranked the options if you end up going for it:

Top 14 Best OLED Monitors I've Tested: The Ultimate Rankings
 
Most people usually watch the same channels or their favorite shows and can easily rack up hundreds and hundreds of hours in a year or two. Most gamers sink hundreds, even thousands, of hours into their favorite games. It is absolutely a reality. Rtings is the truth. Believe it or not.
Are you pausing the game for all those hours or playing it?
 
Well, almost 1300 hours on my 48CX so far. It came with a few dead pixels along the sides, which appears to be par for the course with these. Not something I'd notice without looking, though. Not seeing any burn-in or such. For office stuff I do run it at OLED Light 0 plus BFI High, which is already bright in my dim, but not dark, room. For non-work stuff there will be slightly brighter SDR and some HDR in the mix...

I also hide the taskbar, run screen savers, and use dark mode where available, but nothing too onerous. (I don't do those things with CRTs, except the screen savers, which I've always used aggressively. And I've never seen burn-in on mine.)

If it burns in, I'll be sad, but in the meantime: true blacks, huge contrast, and with 120Hz BFI an effective 320Hz of motion clarity performance, if my math is right. Well, this thing is awesome. And life ain't forever...
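
For anyone who wants to check that "effective 320Hz" figure, here's a rough sketch of the persistence math. The ~37.5% duty cycle is an assumption I picked to illustrate the idea, not a measured value for the CX's BFI High mode:

```python
# Rough motion-clarity math for BFI on a sample-and-hold display.
# ASSUMPTION: BFI High keeps each frame lit for ~37.5% of the refresh period;
# the real duty cycle of the CX may differ.
refresh_hz = 120
duty_cycle = 0.375                       # fraction of each refresh the frame is visible

frame_ms = 1000 / refresh_hz             # 8.33 ms per refresh
persistence_ms = frame_ms * duty_cycle   # ~3.1 ms of visible persistence
effective_hz = 1000 / persistence_ms     # clarity comparable to ~320 Hz sample-and-hold

print(f"{persistence_ms:.2f} ms persistence ~= {effective_hz:.0f} Hz-equivalent motion clarity")
```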
 
Are you pausing the game for all those hours or playing it?
Ever heard of a HUD? It's static, and you arguably need it for most games unless you're a master gamer who either doesn't need it, prefers no HUD for realism, or is preventing burn-in. Personally I love HUDs and they're a part of the game I need, especially for FPS games.
 
I just came to say that I looked at the latest Rtings burn-in test and saw that they managed to get burn-in before hitting 2000 hours of use.

With the firmware bugs and some of the extra OLED annoyances like the flickering, I finally decided to get rid of it and sell it.

OLEDs are great, but they are not yet ready for normal desktop use. BFI, higher brightness, and much better burn-in longevity are needed before spending over €1000 can be justified with such a short lifespan, unless you are Elon Musk.


https://www.rtings.com/monitor/reviews/lg/27gr95qe-b#test_19163

(attached: two screenshots of the Rtings burn-in results, one grey and one blue)
Is that grey one really burn-in? Spots like that? From what? The blue one, yeah... I'll limit news channels to my main TV. (DLP front projection, which is apparently impervious to such things.)
 
Is that grey one really burn-in? Spots like that? From what? The blue one, yeah... I'll limit news channels to my main TV. (DLP front projection, which is apparently impervious to such things.)
It may expose itself in different scenarios and with certain colors. Watching sports on a burned-in display is particularly annoying as the blobs are just smeared in the same spot. I can't unsee it once it's there. It's just like DSE. Both are distracting.
 
I tested mine: 10,000 hours, no burn-in at all.

Here is a full gray test for proof

 
I tested mine: 10,000 hours, no burn-in at all.

Here is a full gray test for proof

You have no evidence of how bright you ran your display. On low brightness, sure, it's fine, if you're ok with that. There is the other half of people who don't like it dim. Rtings proves that it doesn't hold up when you turn the brightness up. Between you and Rtings, guess who I'm gonna believe? The copium alcoholic, or Rtings, who has no bias or ulterior motives? 🙂 You're not the first or the last to make a post like "I have xxxxxx hours and look, I have no burn-in!!!" Then Rtings comes in and wipes all the bullshit right up lol.
 
Let's throw in more counterpoints, just to balance things out.

I also have some heavily worn LCDs that have gone splotchy due to uneven wear and tear on their backlights/edgelights. Even the elements of LCDs wear out, even if typically not as fast as OLEDs.

Also, with no burn-in, I have a nearly 1-year-old Corsair Xeneon Flex DVT prototype here (received long before public release, as part of a testing contract) and I've been Visual Studioing / browsing on it -- and walking away from it with lots of static text. I do use taskbar autohide and I do let it do the overnight pixel refresh. Corsair does provide 3-year burn-in warranties, if you want to derisk things. These aren't the bad old LG C6 days.

Each display has its pros and cons. Let's respect that.
 
I have like 16000h on my LG CX by now and it still looks flawless (it's over 3 years old now). That's a better track record than several LCDs I've had.

But yeah, I'm using it in a dark room with a low brightness setting for SDR, so it's not surprising given what I've learned about the tech over the years. The tech isn't flawless of course, but for me it's as close to perfection as any display can get right now, and the tradeoffs don't really impact me (it's bright enough, 120Hz is enough with 4K, etc.). But I am not everyone, and I very often recommend other people get some LCD display instead, so they can crank up the brightness in their living room and whatnot.
 
What kind of work do you do? What amount of static windows (Word, Excel, etc.)?

I mean, I tend to switch monitors about every 5 years on average, so at least that is what I expect out of a screen.
I'm a programmer. I typically have multiple virtual desktops I switch between with things like Slack/Teams/Outlook, an IDE, several browser windows, terminals and so on. I do not have e.g. Word or Excel open all day long; just switching the virtual desktops regularly probably gives enough movement to avoid burn-in.

3 years of ownership and still no burn-in on my CX 48". I have put it back in the living room to use as a TV though. A smaller LCD on the desktop is more convenient, and when my use is 70% work, 30% gaming/media, it's just better to use an LCD. Planning to get that 57" Samsung super-ultrawide when it releases.
 
Hardware Unboxed ranked the options if you end up going for it:

Top 14 Best OLED Monitors I've Tested: The Ultimate Rankings

QD-OLED's subpixel structure being better for text than WOLED, that was kind of a surprising statement to me based on what I have heard, read and seen myself (I returned the OLED G9 for being noticeably worse than my C2 in this regard). What am I missing here?

Not that any of them is even close to being ideal for text compared to something like a good IPS, though...
 
QD-OLED's subpixel structure being better for text than WOLED, that was kind of a surprising statement to me based on what I have heard, read and seen myself (I returned the OLED G9 for being noticeably worse than my C2 in this regard). What am I missing here?

Not that any of them is even close to being ideal for text compared to something like a good IPS, though...
WOLED at 4K will be better than QDOLED at 1440p.
 
You have no evidence of how bright you ran your display. On low brightness, sure, it's fine, if you're ok with that. There is the other half of people who don't like it dim. Rtings proves that it doesn't hold up when you turn the brightness up. Between you and Rtings, guess who I'm gonna believe? The copium alcoholic, or Rtings, who has no bias or ulterior motives? 🙂 You're not the first or the last to make a post like "I have xxxxxx hours and look, I have no burn-in!!!" Then Rtings comes in and wipes all the bullshit right up lol.
Rtings is using max brightness and streaming CNN 20 hours a day, and no other content besides commercials. Even some LCDs are showing signs of image retention now, among other issues. If you need max brightness on OLED, it's not a good fit. I have to keep my C2 at 15-20% brightness or I would go blind. I am so used to 110-120 cd/m2 SDR, I don't need any more in my environment.
 
QD-OLED's subpixel structure being better for text than WOLED, that was kind of a surprising statement to me based on what I have heard, read and seen myself (I returned the OLED G9 for being noticeably worse than my C2 in this regard). What am I missing here?

Not that any of them is even close to being ideal for text compared to something like a good IPS, though...

Like you said, neither one is great for text. I don't find the OLED G9 to be worse than the 42" C2.
 
. .

Burn in = "burn down" + restore as long as there is buffer remaining
=====================================================

Burn-in is some risk, but it's not like an OLED phone left on with an app that prevents the screen from timing out. OLED TVs have a 25% reserved brightness/energy buffer. They even out the wear on all of the emitters, then boost them back up to level again. This should last years unless you are foolishly abusive of the screen outside of normal gaming and media use with dark themes (e.g. leaving the screen on static content while paused/idle, etc. - there is a turn-off-the-screen trick that just turns off the emitters, so there's no reason to leave it lit while afk, for example). Still not a good choice for static desktop/app use imo, though it's doable.

People are a bit more abusive of their OLED and say "look, no burn-in"... but it's just burning down that much faster, using up more of the reserve buffer. It's a pretty clever system.

You can get the 42" LG C2 for $900 + tax currently at best buy. The 5 year best buy warranty on a c2 can be had for around $36 a year. That covers burn in if you are actually concerned about it but I doubt you'd burn in before 4+ years in normal media and gaming usage with some precautions taken. $36 a year insurance , $3 a month, $180 / 5 yr.

(The LG G series also comes with an LG 5-year burn-in warranty by default, but they start at 55".)

. .

. .


. .

Anyone have experience with recent OLEDs for ~10 hours a day of productivity work?

Are we at the point where we no longer need to worry about burn-in?

I would love to pick up a 42" LG C2 (or C3, I guess, whenever it is released) for my desktop, but I am still concerned that within a few months the Office ribbon or Start menu or something else will be burned in...


I used my LG CX 48" like that for two years. ~8h work and personal use on top of that. The display is still without burn in and working fine.

This will heavily depend on how you use your display. I had some mitigations in place:

  • Dark modes where available.
  • Autohide taskbar/dock/topbar. I use macOS for work.
  • Turn off the display with the remote when taking a longer break.
  • Keep display connected to power so it can run its pixel refresh cycles.
  • Brightness calibrated to 120 nits.
  • Virtual desktops in use so there is some movement between content.
  • Blank screen saver kicking in after 10 minutes of idle. Faster to get out of that than display off.
  • Display off after 20 minutes of idle.
While this may seem like a lot, it's a one-time setup (a couple of these can even be scripted; see the sketch below). You really don't need the taskbar/dock for anything 99% of the time, so even after returning to a smaller LCD I keep it hidden.
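
If you're on macOS like me, here is a minimal sketch of how two of the mitigations above can be scripted instead of clicked through. It only uses the stock defaults and pmset utilities, and the 20-minute value just mirrors the list above:

```python
# Minimal sketch (macOS): script two of the burn-in mitigations listed above.
# Uses only the stock `defaults` and `pmset` command-line utilities.
import subprocess

# Auto-hide the Dock so it isn't a permanently static element on screen.
subprocess.run(["defaults", "write", "com.apple.dock", "autohide", "-bool", "true"], check=True)
subprocess.run(["killall", "Dock"], check=True)

# Blank the display after 20 minutes of idle (changing pmset needs admin rights).
subprocess.run(["sudo", "pmset", "-a", "displaysleep", "20"], check=True)
```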

I don't use mine as a static desktop/app screen other than a browser once in a while or something, since I have side screens for static apps and desktop stuff. I've been using multiple monitors for years so it's normal for me to do so.

I think of it like the main screen in Star Trek.. they aren't doing all of their engineering and science work and experiments on their large main viewer, typically. All of their data is on other workstation screens while the main screen is the big show. Or you might think of the OLED screen as a "stage" for playing media and games.


That's my personal preference. Like kasakka said, there are a lot of burn-in avoidance measures, many of which he listed. If you keep ASBL on it would be even less likely to burn "down" (see below), but most people using them for desktop/apps turn off ASBL dimming via the service menu using a remote, since it's annoying to have full-brightness pages dim down.

=======================================================================

Pasting some info from my comment history here for you in case you find any of it useful:

Some burn-in (burning through your "burn-down" buffer) avoidance measures
A few reminders that might help in that vein:

....You can set up different named profiles with different brightness, peak brightness, etc. and maybe contrast in the TV's OSD. You can break down any of the original ones completely and start from scratch settings-wise if you wanted to. That way you could use one named profile with lower brightness, and perhaps contrast, for text and static app use. Just make sure to keep the game one for gaming. I keep several others set up for different kinds of media and lighting conditions.

  • Vivid
  • Standard
  • APS
  • Cinema
  • Sports
  • Game
  • FILMMAKER MODE
  • isf Expert (Bright Room)
  • isf Expert (Dark Room)
  • Cinema Home
....You can change the TV's settings several ways. Setting up the quick menu or drilling down through menus works but is tedious. Keying the mic button on the remote with voice control active is handy to change named modes or do a lot of other things. You can also use the remote control software over your LAN, even hotkeying it. You can change a lot of parameters directly via those hotkeys. The hotkeys could also be mapped to a Stream Deck's buttons with icons and labels. In that way you could press a Stream Deck button to change the brightness and contrast or to activate a different named setting. Using Stream Deck functions/addons you can also set up keys as toggles or multi-press, so you could toggle between two brightness settings or step through a brightness cycle, for example.

....You can also do the "turn off the screen emitters" trick via the quick menu, a voice command with the remote's mic button, or via the remote-control-over-LAN software + hotkeys (+ Stream Deck, even easier). "Turn off the screen" (emitters) only turns the emitters off. It doesn't put the screen into standby mode. As far as your PC OS, monitor array, games or apps are concerned, the TV is still on and running. The sound even keeps playing unless you mute it separately. It's almost like minimizing the whole screen when you are afk or not giving that screen face time, and restoring the screen when you come back. It's practically instant. I think it should save a lot of "burn down" of the 25% reserved brightness buffer over time. You might not realize how much time is cumulatively wasted with the screen displaying when you're not actually viewing it - especially when idling in a game or on a static desktop/app screen.

...You can also use a Stream Deck + a handful of Stream Deck addons to manage window positions, saved window position profiles, app launch + positioning, min/restore, etc. You could optionally swap between a few different window layouts set to a few Stream Deck buttons in order to prevent your window frames from being in the same place all of the time, for example.

... Dark themes in the OS and any apps that have one available, web browser addons (turn off the lights, color changer), a taskbar-hider app, a translucent taskbar app, a plain ultra-black wallpaper, and no app icons or system icons on screen (I throw mine all into a folder on my hard drive called "desktop icons"). A black screen saver, if any.

... Logo dimming on high. Pixel shift. A lot of people turn ASBL off for desktop use, but I keep it on since mine is solely for media/gaming. That's one more safety measure.

. .

Turn off the Screen (emitters only) trick

I use the "turn off the screen" feature which turns the oled emitters off. You can set that turn off the screen command icon to the quick menu so it's only 2 clicks to activate with the remote (I set mine to the bottom-most icon on the quick menu), or you can enable voice commands and then hold the mic button and say "turn off the screen". You can also use the color control software to set a hotkey to the "turn off the screen(emitters)" function, and even map that hotkey to a stream deck button if you have one. Clicking any button on the remote or via the color control software hotkeys wakes up the emitters instantly. I usually hit the right side of the navigation wheel personally if using the remote.

https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/

While the emitters are off everything is still running, including sound. This works great to pause games or movies and go afk / out of the room for a while, for example. I sometimes cast Tidal HD to my Nvidia Shield in my living room from my tablet, utilizing the "turn off the screen" (emitters) feature. That allows me to control the playlists, find other material, pause, skip, etc. from my tablet with the TV emitters off when I'm not watching TV. You can do the same with YouTube material that is more about people talking than viewing anything. I do that sometimes when cooking in my kitchen, which is adjacent to my living room TV. You can probably cast or AirPlay to the TV's webOS itself similarly. Some receivers also do AirPlay/Tidal etc. directly on the receiver.

. . .
Distrust Screensavers

I wouldn't trust a screensaver, especially a PC screensaver. Not only do they fail or get blocked by apps - apps can crash and freeze on screen, as can entire Windows sessions, or a spontaneous reboot can get stuck on the BIOS screen, etc. It's rare but it can happen. Some apps and notifications even take the top layer above the screensaver, leaving a notification/window sitting there, static.

While on the subject, I kind of wish we could use the LG OSD to make mask areas. Like size one or more black boxes or circles, be able to set their translucency, and move them via the remote to mask or shade a static overlay, HUD element, bright area of a stream, etc.

. .
LG's reserved brightness buffer. You aren't burning in because you are burning down that buffer first, for a long time (depending on how badly you abuse the screen).

From what I read, the modern LG OLEDs reserve the top ~25% of their brightness/energy states outside of the user-available range for their wear-evening routine, which is done periodically in standby while plugged in and powered. Primarily that, but along with the other brightness limiters, logo dimming, pixel shift, and the turn-off-the-"screen"-(emitters) trick if utilized, it should extend the life of the screens considerably. With the ~25% wear-evening buffer you won't know how much you are burning down the emitter range until after you bottom out that buffer, though. As far as I know there is no way to determine what % of that buffer is remaining. So you could be fine abusing the screen outside of recommended usage scenarios for quite some time thinking you aren't damaging it, and you aren't, sort of... but you will be shortening its lifespan, wearing down the buffer of all the other emitters to match your consistently abused area(s).

A taskbar, a persistent toolbar, or a cross of bright window frames in the middle of the same 4 window positions, or whatever, might be the first thing to burn in when the time comes, but on the modern LG OLEDs I think the whole screen would be down to that buffer-less level and vulnerable at that point, as the routine would have been wearing down the rest of the screen to compensate all along over a long time.

The buffer seems like a decent system for increasing an OLED screen's lifespan considering what we have for now. It's like having a huge array of candles that all burn down unevenly - but with 25% more candle beneath the table so that you can push them all up a little once in a while and burn them all down level again.
Or you might think of it like a phone or tablet's battery you are using that has an extra 25% charge module, yet after you turn on your device and start using it you have no idea what your battery charge level is. You can use more power hungry apps and disable your power saving features, screen timeouts, run higher screen brightness when you don't need to, leave the screen on when you aren't looking at it etc. and still get full charge performance for quite some time but eventually you'd burn through the extra 25% battery.
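
To make the "burn down, then restore" idea concrete, here's a toy model of the concept. All the numbers and the update rule are illustrative assumptions on my part, not LG's actual compensation algorithm:

```python
# Toy model of the "burn-down buffer" idea described above.
# ILLUSTRATIVE ONLY: the 25% reserve and the wear rates are assumptions,
# not LG's real wear-evening algorithm.
RESERVE = 0.25  # fraction of emitter range held back for wear-evening

def compensation_cycle(wear, buffer_used):
    """Even out uneven wear by spending reserve headroom.

    `wear` maps screen regions to cumulative wear since the last cycle.
    The routine boosts everything up to the worst-worn region, so nothing
    looks uneven yet; visible burn-in only appears once the reserve runs out.
    """
    buffer_used = min(RESERVE, buffer_used + max(wear.values()))
    for region in wear:          # differences are compensated away (for now)
        wear[region] = 0.0
    return buffer_used

# Example: a static taskbar region wears faster than the rest of the screen.
wear = {"taskbar": 0.0, "rest_of_screen": 0.0}
buffer_used = 0.0
for month in range(1, 61):
    wear["taskbar"] += 0.02           # heavy static content
    wear["rest_of_screen"] += 0.005   # varied content
    buffer_used = compensation_cycle(wear, buffer_used)
    if buffer_used >= RESERVE:
        print(f"Reserve exhausted around month {month}; uneven wear now shows as burn-in.")
        break
```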
 
Like you said, neither one is great for text. I don't find the OLED G9 to be worse than the 42" C2.

Yes, but typically people see much worse in practice because a lot of people choose to shoehorn larger gaming screens directly onto a desk instead of using a simple rail-spine floor TV stand, wall mount, or separate surface for their screen with a better viewing distance. When viewing a 4K at an optimal viewing angle and PPD you'd get much less obnoxious fringing in effect. A 1440p OLED is just lower PPD no matter what due to the screen dimensions/spec, but a 4K viewed up close has pretty similarly large pixel sizes to your perspective.

Pixel sizes in regard to text on the 2D desktop and in regard to 3D game engine graphics are more or less compensated for at 60 PPD, but only because text sub-sampling and aggressive AA in games are applied to mask how large the pixel structure actually is. As most people know, unfortunately LG OLED uses WRGB and Samsung uses pentile, which are both non-standard subpixel layouts that text-ss is not designed for. The 2D desktop's graphics and imagery typically have no pixel size/edge-masking compensation at all, so they are displayed at the "raw" pixel grid/size and the true granularity. So even higher than 60 PPD is better all around. The smaller the perceived pixel sizes, the less noticeable the artifacts and fringing issues are - in text, and even in occasional edge artifacts from things like DLSS and frame insertion/amplification technologies. Larger perceived pixel sizes, larger problems (and vice versa).

. . . .

The pasted comments below are mostly in regard to 42" 4K and 48" 4K screens. A 3440x1440 (440w + [2560w x 1440] + 440w), when viewed at "full height" to your perspective - like a 2560x1440 would be at a 60 to 50 degree viewing angle - would only be around 43 to 51 PPD. You could sit farther away than that, but the height would shrink to your perspective, making the screen look even shorter and more belt-like.

So if someone makes text sub-sampling perfectly mapped for WRGB or pentile, it would be an improvement compared to text-ss on non-standard subpixel layouts as it is now, of course - but you still won't have great edges at 1400-1500p-like pixel sizes to your perspective (1440p screens, or 42" 4K screens at ~24" or so view distance). You'd just be back to what text looks like on RGB subpixel layouts at ~43 to 51 PPD, which isn't great compared to the 64 - 70 - 77 PPD you'd get on a 4K screen within a 60 to 50 degree horizontal viewing angle. Properly mapped text-ss would allow the 60+ PPD range on 4K screens at a 60 to 50 degree viewing angle to look great for text though, where now you probably have to use 70+ PPD to shrink the actual pixel sizes small enough (and either use 3rd-party text sub-sampling, greyscale rendering, etc., or disable text-ss entirely). As mentioned, text-ss in general doesn't do anything for aliasing on the 2D desktop's graphics and imagery, so higher PPD is better anyway if you can get it.
. . . . .

https://qasimk.io/screen-ppd/



..At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 127 to 154 PPD

..At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD

..At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 PPD to 51 PPD

..At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 20 PPD to 25 PPD
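
If you want to sanity-check those numbers for your own screen and seating distance, here's a quick sketch of the kind of math those PPD calculators use (PPD here is the average across the horizontal viewing angle, 16:9 panels assumed, results rounded):

```python
import math

def ppd(horizontal_pixels: int, diagonal_in: float, distance_in: float,
        aspect=(16, 9)) -> float:
    """Average pixels per degree across the horizontal viewing angle."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)                      # panel width
    view_angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / view_angle_deg

print(round(ppd(3840, 42, 24)))   # 42" 4K on a desk at ~24"        -> ~51 PPD
print(round(ppd(3840, 42, 32)))   # same panel at ~32" (~60 deg)    -> ~64 PPD
print(round(ppd(2560, 27, 24)))   # 27" 1440p at a typical ~24"     -> ~49 PPD
```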

. . . . . .


>This graphic shows the optimal viewing distances of both a 42" 4K and a 48" 4K, plus a lower 50-ish PPD one at 24" to show how a lot of people view one on a desk sub-optimally, at ~1500p-like PPD. (attached graphic)



-.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-.-.--.-

>Human viewing angle is 50 to 60 degrees. (attached diagram)


>An optimal viewing angle minimizes the off-axis and non-uniform edges. (attached diagram)


>Sitting too close pushes the sides of the screen outside of your viewpoint and makes the off-axis areas larger. (attached diagram)





. . . . . . . . . . . .

TLDR: When relaying how text aliasing/fringing, graphics aliasing, etc. look on different monitor sizes and specs, it's more meaningful if you include the PPD (or at least the resolution + true viewing distance to your eyeballs). A 42" 4K at ~24" view on a desk is only ~51 PPD, which maps pretty closely to what 1400p-1500p would look like on a desktop monitor. Someone viewing text at 64 PPD or ~70 PPD with optimal viewing angles, on a 4K that is decoupled from the desk, is going to see things much differently - much smaller perceived pixel sizes, much less granularity, and much less obvious (much tinier) fringing and edge artifacts.

. . .

That said, I keep my OLED as a media and gaming "stage", and I use different non-OLED screens for desktop/apps. However, high PPD still matters in order to get a fine 4K pixel-density look instead of pixel sizes to your perspective that look more like a 1400-1500p desktop-sized screen's.


A few fun Gifs showing how the nearer you sit, the larger the pixel granularity, and vice versa. PPI alone is not a good enough measure.



 
Ever heard of a HUD? It's static, and you arguably need it for most games unless you're a master gamer who either doesn't need it, prefers no HUD for realism, or is preventing burn-in. Personally I love HUDs and they're a part of the game I need, especially for FPS games.
I have, and haven't had a problem with my OLED. That said, anyone watching CNN for that long deserves burn-in.

Sounds like you'll have to wait for Micro-LED monitors to address the brightness and burn-in concerns you have with OLED
 
I have, and haven't had a problem with my OLED. That said, anyone watching CNN for that long deserves burn-in.

Sounds like you'll have to wait for Micro-LED monitors to address the brightness and burn-in concerns you have with OLED

Nah, he's sticking to LCD for life... MicroLED is just as susceptible to burn-in as OLED is. No LCD with FALD either, because you could get burn-in from uneven wear on the backlight array. The higher the zone count, the higher the chance of burn-in! He wants a plain edge-lit LCD with a single LED (which ships with worse uniformity than the RTINGS OLED burn-in tests show).
 
You have no evidence of how bright you ran your display. On low brightness sure it's fine. If you're ok with that. There is the other half of people that don't like it dim. Ratings proves that it doesn't hold up when you turn the brightness up. Between you and Ratings guess who I'm gonna believe? The copium alcoholic or Ratings who has no bias alterier motives? 🙂 You're not the first or the last to make a post like "I have xxxxxx hours and look I have no burn in!!! Then Ratings comes in and wipes all the bullshit right up lol.

Brightness is just not terribly relevant, unless you are using a TV in a bright room with windows that let in sunlight.

I usually keep the brightness settings on all of my screens in the 20-40% range, or in my dark office they feel like they are scorching my eyeballs.

I'd much rather have the true deep blacks of an OLED screen than go for another technology that gives me added brightness I don't need or want.

I'm not blind. My eyes adjust just fine for low light.

Super bright screens are for the birds and completely useless unless your viewing position is in a bright room.
 
Yes, but typically people see much worse in practice because a lot of people choose to shoehorn larger gaming screens directly onto a desk instead of using a simple rail-spine floor TV stand, wall mount, or separate surface for their screen with a better viewing distance. When viewing a 4K at an optimal viewing angle and PPD you'd get much less obnoxious fringing in effect. A 1440p OLED is just lower PPD no matter what due to the screen dimensions/spec, but a 4K viewed up close has pretty similarly large pixel sizes to your perspective.


I'd argue a 43" 4k screen on a desk used as a traditional monitor about 2ft from my eyeballs is perfect. Perfect pixel density, perfect size for good peripheral vision in games, perfect large screen real estate for work.

I can't imagine doing anything else. This is the end stage monitor solution.

The only things I'd ask of my monitor which I don't have already today in my Asus XG438Q are better font rendering (I'm guessing the BGR layout sabotages this), better blacks, and HDR capability, and I consider a 42" LG OLED to be the perfect solution for this, but I am still a little hesitant regarding burn-in, which is why, despite thinking about it for years, I haven't pulled the trigger.

While I tend to upgrade monitors on a ~5 year basis, the next monitor I buy has the potential of being the last monitor I ever buy, as I'll never need or want a resolution above 4K, never need a refresh rate above 120Hz with VRR, and hope to get really good HDR capability.

I figure I have another 40 years left in me, so I don't want a monitor that is going to burn in and be unusable in just a few years.

I want to get my end stage monitor, and then move on and never have to think about monitors ever again.

In the past there was always something bigger and better on the horizon. Now I've gotten as big as I want, so I don't want anything bigger, and if I can just add good HDR and blacks, it will be as good as I ever need. Since I have zero interest in either 3D or head-mounted displays, this is it. This is my last monitor. It needs to last for decades :p
 
I'd argue a 43" 4k screen on a desk used as a traditional monitor about 2ft from my eyeballs is perfect. Perfect pixel density, perfect size for good peripheral vision in games, perfect large screen real estate for work.

It's fine if you like that, but it's only about 51 PPD, which is pixel sizes more like what a 1400-1500p desktop-sized monitor would have at its normal viewing distance. You don't hit 60 PPD until about a 29" view distance (screen surface to eyeballs) on a 43" 4K.. but a bit higher PPD is even better, especially for 2D desktop graphics and imagery that get no text-ss and game AA to mask the actual pixel sizes/granularity. Higher PPD / smaller perceived pixel sizes is also better for any occasional DLSS + frame generation edge artifacts because they will be tinier. Also, if you are using an OLED, higher PPD is better for its non-standard pixel structure, making text fringing tinier and less obnoxious.

People used 1440p desktop-sized screens for years though, so it's not like they aren't usable like that or anything. It's just not optimal and not what a 4K's fine pixel size would be. More like a larger field of 1440-1500p desktop-screen-sized pixels to your perspective.



https://qasimk.io/screen-ppd/


..At the human central viewing angle of 60 to 50 degrees, every 8k screen of any size gets around 127 to 154 PPD

..At the human central viewing angle of 60 to 50 degrees, every 4k screen of any size gets around 64 to 77 PPD

..At the human central viewing angle of 60 to 50 degrees, every 2560x1440 screen of any size gets only 43 PPD to 51 PPD

..At the human central viewing angle of 60 to 50 degrees, every 1920x1080 screen of any size gets only 20 PPD to 25 PPD


. . .

Personally I'd sit between 28" and 40" on a 55" 8k screen if I end up getting one.

At 28" view distance, where I'd probably sit for a 4k doublewide, - a 55" 8k would be 95 PPD, and a 65" 8k which most of the other ones available are would be 85 PPD. However at that view distance a 55" would result in an 81 deg viewing angle (and 65" screen at 28" would be 91deg). Could work like that for use as a "bezel-free multi monitor array" type of scenario though with some slight head turning It really would only have around 10 to 15 degrees on each end outside of your human central viewing angle for a 55" (15 to 20 deg each outisde on end for a 65") when used like that which probably isn't bad for desktop/app use. For comparison, a 57" super-ultrawide at 1000R has a base width straight across between the ends of about 51", a 55" flat screen is ~ 48" across, a 65" flat screen is 57" across. So it's probably more like a curved 65" but a flat 55" 16:9 would probably align better with it in an over/under setup.

If ever viewed at the human central viewing angle starting around 60 degrees (42" view distance), a 55" 8K would be ~128 PPD. That's close to the focal point/radius of a 1000R curve as well, but something like the 57" 4K doublewide would turn into a short belt to your perspective at that distance, so it wouldn't be good to game on from there. You'd have to move closer for gaming, but with the right setup, changing view distance like that depending on what you are doing wouldn't be a problem.
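
Plugging those 8K scenarios into the same kind of PPD math quoted earlier in the thread gives roughly the figures above (a quick sketch, 16:9 panels assumed, results rounded):

```python
import math

def ppd(horizontal_pixels, diagonal_in, distance_in, aspect=(16, 9)):
    """Average pixels per degree across the horizontal viewing angle."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    view_angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_pixels / view_angle_deg

print(round(ppd(7680, 55, 28)))   # 55" 8K at 28"            -> ~95 PPD (~81 deg angle)
print(round(ppd(7680, 65, 28)))   # 65" 8K at 28"            -> ~85 PPD (~91 deg angle)
print(round(ppd(7680, 55, 42)))   # 55" 8K at ~42" (~60 deg) -> ~128-129 PPD
```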
 
never need a refresh rate above 120Hz with VRR
FYI -- if you expect to live 40 more years, I hate to open a rabbit hole for ya, but there's better technology coming than VRR in the next ten years that could mainstream within twenty or thirty years, that even console makers (then) will adopt. Technologies that will also be able to de-stutter shader compile stutter and more. In the lab there is stuff that shows a far bigger human-visible difference (random population sample, not esports audience) than 60Hz non-VRR versus 120Hz VRR. And it's none of the garbage 240Hz-vs-360Hz LCD near-uselessness.

However, a lot of people can be satisfied with 120 Hz VRR for the rest of their lives, just as they can be satisfied with SDR displays and 1080p.

More power to you if you don't need any further -- it saves a lot of money, in a similar vein to not wanting a bigger place (keeping life simple) to things like being more interested in retro games (less need for the modern niceties), etc.

I even watch my Hollywood films in good 'ol Hollywood Filmmaker Mode (24fps + Dolby Vision HDR).

EDIT: LG now also includes a burn-in warranty, as of today:
https://www.theverge.com/23827701/lg-oled-burn-in-warranty-two-desktop-monitor-windows
 
FYI -- hate to open a rabbit hole for ya, but there's better technology coming than VRR in the next ten years, that will also be able to de-stutter shader compile stutter.
Is that involving "AI" generated frames? Call me a purist but I don't want to see that anywhere outside of videos and gaming, and even then it can go too far and introduce lag and artifacting.

Kinda hoping you mean something else. :D
 
Is that involving "AI" generated frames? Call me a purist but I don't want to see that anywhere outside of videos and gaming, and even then it can go too far and introduce lag and artifacting.
Kinda hoping you mean something else. :D
There are non-AI approaches too -- more user choice is involved.

I presume you've read my new lagless-and-artifactless frame generation article? (I'll let someone else link it.) Frame generation that reduces latency, too! No double-image artifacts, no parallax artifacts, and I even published a "Developer Best Practices" section.

Keep in mind, we're already faking real life with triangles/textures, so there are some new algorithms (not interpolation) that add another parallel, similar-quality method of faking (painting/drawing) that is many times more transistor-efficient, without needing further Moore's Law, which is sputtering, alas. It's a new era that will bring optimization. As you may have read in my earlier posts, Netflix is already 23 fake frames per second and 1 real frame per second, due to the prediction/interpolation mathematics built into the MPEGx and H.26x video compression standards. It's darn near artifactless nowadays because the encoder knows the original uncompressed material rather than guessing black-box style. Likewise, the new lagless/artifactless framegen algorithms know more of the ground truth (rather than being a black-box guesswork interpolator). However, you could also brute-force the 1000fps if you're a purist, but stay with me here. Read on.

However, there are bigger weak links: refactoring APIs to framerateless workflows. But I'm actually talking about something else -- a fundamental refactoring of the API ecosystem to fix the dreck/awfulness currently out there.

There are a lot of inefficiencies out there, and we need to make it easier for game developers to optimize. We've backed ourselves into a corner of temporal difficulties, giving an illusion of dead-end progress -- e.g. 4K 60fps games being downrated to 4K 30fps on consoles, and still having lots of stutters.

In the lab, there are visions of a long-term framerateless ecosystem where APIs/drivers/GPUs take over the temporal responsibilities and present the result as whatever the user wants or prefers (24fps with blur, or 120fps VRR). There are a lot of low-lying apples, much like how CPUs turned into multifaceted things (multicore), and GPUs are about to transition into a multilayer rendering hierarchy. At the end of the presentation, the user will have a choice of what they prefer -- but it also comes to a point where enabling a single setting is like enabling (ULMB + VRR + GSYNC + 10x GPU upgrade + Ergonomic FlickerFree + Less Lag Than Without This Setting) concurrently. So in a sense, the setting becomes irresistible.

Today's 120fps VRR will be even better 120fps VRR. So existing investments get better, but new investments get dramatically better. After what I've seen, the 240-vs-360 incrementalism is quite useless and garbage.

This isn't useful to everyone, but the point is -- 120fps VRR isn't the temporal visual endgame, even to grandma (the 90%+ mainstream), yet the current ecosystem is very poorly optimized for further progress. We're so hardcoded around the concept of a framerate, and with that come stutters/jerkiness/judder/etc. from poor temporal hygiene in the hardware and/or software. There is a stupendous amount of low-lying apples even with current GPU technology, but we are completely stymied from easily taking advantage of it, due to the current architecture of the existing APIs and engines.

There is a lot of inertia; we like APIs that we are familiar with. So it will likely take decades to go mainstream, but there is a surprising amount of "detail improvements concurrent with latency improvements concurrent with framerate improvements" waiting to be unlocked with a grand optimization, not too dissimilar from the transition from OpenGL -> Vulkan. Maybe a bit bigger of a jump than that. It will take a new Great Transition (over a decade or more) even if the first products utilizing such technologies come out by the end of the decade. As developers have more difficulty de-stuttering and optimizing their games, and future consoles/games/software try to keep up with 4K 120fps as cheaply as possible, new APIs/GPUs/drivers will take over a lot of temporal responsibilities and de-stutter vastly better than VRR / GSYNC can.

Yes, you can still stay a purist and keep the "original classic method of faking reality via triangles/textures" frames -- the API refactorings to a framerateless workflow will help original triangle/texture rendering workflows too -- but keep in mind the irresistible "up your settings" effect (further lag reductions concurrent with graphics-detail improvements and framerate improvements). So you might tolerate it much like everyone tolerates lossy video compression standards, despite the 1980s fears that consumers wouldn't accept/tolerate it because it looked worse than analog, etc.

Remember: there was once a time when lossy-compressed digital video looked much worse than analog videotape. Then, decades later, lossy-compressed digital video looks many times sharper and better than analog videotape (even Japan's MUSE analog HD videotape).

People like us are the temporal versions of the 1980s Japanese HDTV researchers -- modern digital video compression had not been invented yet -- and back then, many consumers couldn't even remotely imagine HDTV, as it sounded like a fairy tale.

Reading my posts sounds like pie-in-the-sky, but so did my posts 10 years ago, and people stopped laughing...

So today I'm a fairy tale, but tomorrow I am not.
 
Is that involving "AI" generated frames? Call me a purist but I don't want to see that anywhere outside of videos and gaming, and even then it can go too far and introduce lag and artifacting.

Kinda hoping you mean something else. :D

If there ends up eventually being a paradigm shift in the long run with frame insertion - where the game devs, peripherals and drivers, and OS all work together to inform the frame generation tech of actual in-game vectors - it would be much, much more accurate. Nvidia's current method is only comparing two frames and guessing without any added vector information, so it is using an uninformed method. It is especially "fooled", for example, by third-person orbiting cameras and such, because it can look like things aren't moving much or at all when according to the actual game physics they are. Frame gen has a lot of room to mature.

Also, we are still often using AI upscaling and frame gen on relatively LOW resolutions and LOW frame rates. You can't get blood from a rock. The higher the base frame rates (and minimums) get, and the higher the base resolutions for AI upscaling get, the better the results will be from frame amplification + AI upscaling (including for artifacting and input lag). Very high Hz filled with multiple "tween" frames could greatly reduce sample-and-hold blur / image persistence of the entire viewport during FoV movement, for a much better result aesthetically.
 
Nah, he's sticking to LCD for life... MicroLED is just as susceptible to burn-in as OLED is. No LCD with FALD either, because you could get burn-in from uneven wear on the backlight array. The higher the zone count, the higher the chance of burn-in! He wants a plain edge-lit LCD with a single LED (which ships with worse uniformity than the RTINGS OLED burn-in tests show).
😂 Now you've gone too far. Let's lay off the copium drugs; you're in another dimension 🤣
 
I'd argue a 43" 4k screen on a desk used as a traditional monitor about 2ft from my eyeballs is perfect. Perfect pixel density, perfect size for good peripheral vision in games, perfect large screen real estate for work.

I can't imagine doing anything else. This is the end stage monitor solution.

The only things I'd ask of my monitor which I don't have already today in my Asus XG438Q are better blacks and HDR capability, and I consider a 42" LG OLED to be the perfect solution for this, but I am still a little hesitant regarding burn-in, which is why, despite thinking about it for years, I haven't pulled the trigger.

While I tend to upgrade monitors on a ~5 year basis, the next monitor I buy has the potential of being the last monitor I ever buy, as I'll never need or want a resolution above 4K, never need a refresh rate above 120Hz with VRR, and hope to get really good HDR capability.

I figure I have another 40 years left in me, so I don't want a monitor that is going to burn in and be unusable in just a few years.

I want to get my end stage monitor, and then move on and never have to think about monitors ever again.

In the past there was always something bigger and better on the horizon. Now I've gotten as big as I want, so I don't want anything bigger, and if I can just add good HDR and blacks, it will be as good as I ever need. Since I have zero interest in either 3D or head-mounted displays, this is it. This is my last monitor. It needs to last for decades :p
This is what I'm saying. So I can spend the money on the other 20 hobbies that also cost a lot and not spend on the same thing over and over again lol. Except I wouldn't mind 200Hz; it would be even faster 🙂
 
About having difficulty spending money on upgrades anymore, I can relate. Moore's Law was really fast in the lifetime of any Gen-X'er like me; now it's time to optimize.

Let's not forget 4K was $10,000 in year 2001 -- the IBM T221. Now it's a $299 Walmart special.

With all the semi-mainstreaming of 120Hz going on, even 240Hz will someday follow (maybe in the 2030-2039 decade). Let's give a mainstream example. Going to conventions, I actually witnessed Apple employees hovering over the 500-600Hz display panels at DisplayWeek 2022-2023, at BOE's booth (it was BOE's laptop panel, and the 600Hz BOE panel was reported in the news). They're actively monitoring the refresh rate race, even if they're not ready (power budget, quality budget and price-per-screen budget), though it may take a long time before they leap -- but they are certainly aggressively monitoring higher-Hz developments for power/quality/cost metrics. The refresh rate race will (slowly) mainstream beyond 120Hz. Apple, though, is probably biding their time until the 2030s, when the power budget of 240Hz is feasible for inclusion in their future OLED/MicroLED displays. But I wouldn't be surprised if 240Hz becomes available as a "near free" mainstream option in the 2030s+ decade.

We see the very slow 120Hz mainstreaming as it enters consoles, televisions, VR, smartphones, and tablets. Although not a fully free feature yet, it's quickly getting there, and will finally hit all the bottom-barrel panels too over the course of the decade.

By 2030, it will be hard to buy a basic bottom-barrel TV that doesn't include 120Hz (like you can't buy 720p TVs today), and a number of the higher-end units will already include 240Hz. 240Hz TV-sized OLEDs are expected to start hitting the consumer market ~2025-2026ish after being shown off at conventions in 2024.

The refresh rate race will be a far slower one than the resolution race, but high Hz (beginning with 120Hz) is slowly falling to free inclusion.

We don't always use it, like we don't always even use 60Hz (e.g. people who only do 24fps streaming). But 60Hz is there for free already.

As long as there are obvious humankind-visible benefits, it will still progress. Even to the mainstream/grandma, 120Hz-vs-1000Hz (non-LCD, 0ms GtG) is much more visible for fast motion than 4K-vs-8K or 144Hz-vs-240Hz. Refresh rate incrementalism (further throttled by LCD GtG) is mainly only useful to esports; the mainstream needs the temporal equivalent of VHS-vs-8K to go more than "meh" on it. But even non-esports people's jaws are still dropping in the lab. So, it's a long century of incremental progress that will still occur.

FYI, a 24fps filmmaker (hard-core 24fps) acknowledged the human-visible benefits of 1000fps+ for gaming/reality simulation use cases:



This is the same filmmaker that did this video:
"A Defense Of Why 24fps Is Here To Stay"



So, even some of the 24fps Hollywood nuts (who saw the lab stuff, or saw motion demos, etc.) are slowly acknowledging the human-visibility benefits of the refresh rate race -- and the science/physics -- for various reality simulation use cases. They may not use it themselves, but they recognize the human-visible benefits.

Even 1000fps benefits web browser scrolling, map panning, FPS turning, etc. It is very plainly visible to the mainstream at large geometric jumps (4x-8x) on a 0ms-GtG display.
It may not be worth it if it costs $10K extra, but many will use it at $1-$10 extra.
So, if it were just a few dollars extra, like the cost-add of a "Retina display" (spatial resolution), it would get adopted. Those used to cost a lot; now near-retina resolution is almost the same cost as non-retina resolution.
 