Real-world content to demonstrate LCD vs OLED differences

OpenSource Ghost

I want to demonstrate LCD vs OLED differences to someone, but not with abstract high-contrast wallpapers. I need well-made photos and actual clips from well-mastered films that show the visual difference between the two display technologies.
 
Just make a black image with white text across it.
this is an OLED Screen.jpg


 
Buy the Spears & Munsil UHD HDR Benchmark disc to demonstrate real-scene contrast.

I've seen quite a few people put an OLED next to a window, while a FALD LCD can just look like one.
 
I'm waiting for a PG32UQX (1,152-zone FALD QD-IPS mini-LED LCD) vs S95B (QD-OLED) comparison in a dark room. Avatar 2 should be the benchmark title for a true HDR experience.
 
As nice as some streamed content can look, just be aware that streams use compression algorithms/methodologies that can muddy parts of the screen or keep parts static, so they aren't the best content to showcase the best picture quality a display is capable of. Even a downloaded version of a streaming service's content will be in a fairly compressed format compared to a 4K HDR disc studio release. They are lower bit rate.

A better test would be discs or full lossless/uncompressed downloads of studio-released true 4K material, in Dolby Vision (or HDR10+ if on a Samsung). Some 4K movies are actually 2K upscaled, so be aware of that too.

I think Blade Runner 2049's 4K disc is mastered for 10,000-nit HDR. Most movies released so far have been mastered at 1,000 nits, with some at 4,000, though they could feasibly be re-released later at 10,000 nits. I mention that because some screens can map over 1,000 nits of peak color volume/brightness, especially FALD LCDs.

I think HDTVTest uses Mad Max: Fury Road in a lot of their tests, so that might be a go-to for testing/impressions (for example, the part where the crazy guitar player shoots a flamethrower, among other scenes like one-armed Charlize Theron in the shade of the big rig out in the bright desert/badlands). That YouTube channel is a good resource if you want to do some research; they even do direct comparisons of some of the different TVs. Some of the Planet Earth style stuff might be a good test for overall PQ too, as they use very expensive cameras for those and the content is amazing.

You also have to be aware that different TV manufacturers map their tone-mapping compression ranges differently for different HDR mastering peaks. They may also have different ABL thresholds and aggressiveness, different peak brightness at 10%, 25%, 50%, and 100% windows, different sustained-brightness windows, etc.

For example, LG CX:

LG's 1000-nit curve is accurate up to 560 nits, and squeezes 560 - 1000 nits into 560 - 800 nits.
LG's 4000-nit curve is accurate up to 480 nits, and squeezes 480 - 4000 nits into 480 - 800 nits.
LG's 10000-nit curve is accurate up to 400 nits, and squeezes 400 - 10000 nits into 400 - 800 nits.
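
Those numbers boil down to a simple "knee": the curve tracks the source accurately up to a point, then squeezes everything between the knee and the mastering peak into the roughly 800 nits the panel can actually produce. Here is a minimal sketch of that idea in Python using the quoted LG CX values; the real firmware applies a smoother roll-off in the PQ domain rather than a straight line, so treat this purely as an illustration:

# Illustrative knee-style tone map using the LG CX numbers quoted above.
# The real firmware roll-off is smoother (applied in the PQ domain); this
# straight-line squeeze is only meant to show the general idea.
PANEL_PEAK = 800.0                                  # approx. nits the panel can show
KNEES = {1000: 560.0, 4000: 480.0, 10000: 400.0}    # mastering peak -> knee point

def tone_map(nits_in: float, mastering_peak: int) -> float:
    """Map a mastered luminance value (nits) onto the panel's range."""
    knee = KNEES[mastering_peak]
    if nits_in <= knee:
        return nits_in                              # accurate region: pass through
    # squeeze [knee, mastering_peak] into [knee, PANEL_PEAK]
    frac = (nits_in - knee) / (mastering_peak - knee)
    return knee + frac * (PANEL_PEAK - knee)

for peak in (1000, 4000, 10000):
    print(peak, [round(tone_map(x, peak)) for x in (300, 600, 1000, peak)])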

Besides that, there is now QD-OLED from Samsung, which gets brighter and shows more colors, but which raises blacks and takes on a pink tint when there is any ambient light in the room. There are also different screen coatings. Both of which bring up the fact that different screens perform better in different lighting environments due to light pollution on the screen and/or how our eyes perceive contrast and saturation against ambient lighting. There is also the LG G2 OLED, which has a heatsink that could potentially allow brighter peaks and less aggressive ABL than LG's other, less expensive OLEDs.

And that's just off the top of my head. Then there are all the differences among LCDs and the nearest competitive tier of LCD to OLED, FALD LCD, with its different technologies and firmware biases (favoring a "glow halo" vs. a "dim halo" on neighboring FALD zones, etc.).

HDR implementations in games can vary a lot too. Some devs dropped the ball on the peak brightness settings, but the Windows HDR Calibration tool seems to fix that for the most part now; still, some games are much worse than others. There is also the fact that you can use different Windows HDR calibration settings or in-game HDR peak brightness, middle brightness, saturation settings, etc., so it's hard to do apples-to-apples comparisons there. You'd probably have to shoot for what looks best on each particular screen rather than using the same values on all of the screens you were comparing.

So while you can form some general impressions of the two technologies, it's a little more complicated.
 
A better test would be discs or full lossless/uncompressed downloads of studio-released true 4K material, in Dolby Vision (or HDR10+ if on a Samsung). Some 4K movies are actually 2K upscaled, so be aware of that too.
How big would that be? Do they even exist? 2K compressed movie files for theaters (with no temporal compression) were around 150-300 GB back in the day, iPhone 13 footage is about 5.5 GB per minute, and ARRIRAW can go over 20 GB per minute; a fully uncompressed movie would be quite the thing to try to manage and play on consumer hardware.

Anything remotely long will always be massively compressed; uncompressed 10-bit 1080p at 24 fps would take about 1.5 TB of disk space.
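
For what it's worth, the ~1.5 TB figure checks out as rough arithmetic for a feature-length film, assuming full 4:4:4 RGB at 10 bits per channel (4:2:0 chroma subsampling would roughly halve it):

# Back-of-the-envelope check of "uncompressed 10-bit 1080p24 = ~1.5 TB".
# Assumes 4:4:4 RGB; 4:2:0 chroma subsampling would roughly halve this.
width, height = 1920, 1080
bits_per_sample, samples_per_pixel, fps = 10, 3, 24
runtime_hours = 2.25                     # typical feature-length runtime
bits_per_frame = width * height * samples_per_pixel * bits_per_sample
bytes_per_second = bits_per_frame * fps / 8
total_tb = bytes_per_second * runtime_hours * 3600 / 1e12
print(f"{bytes_per_second / 1e6:.0f} MB/s, {total_tb:.2f} TB")   # ~187 MB/s, ~1.51 TB
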
They are lower bit rate.
Which is not the only variable; they sometimes end up having better quality (for an obvious example, much lower-bitrate AV1 video can look better than higher-bitrate x264).
 
This timestamp shows compression artifacts in Game of Thrones. That's when it's most obvious, but really it's always compressed in streams.


You also have to be aware that different TV manufacturers map their tone-mapping compression ranges differently for different HDR mastering peaks. They may also have different ABL thresholds and aggressiveness, different peak brightness at 10%, 25%, 50%, and 100% windows, different sustained-brightness windows, etc.

How big would that be? Do they even exist? 2K compressed movie files for theaters (with no temporal compression) were around 150-300 GB back in the day, iPhone 13 footage is about 5.5 GB per minute, and ARRIRAW can go over 20 GB per minute; a fully uncompressed movie would be quite the thing to try to manage and play on consumer hardware.

Anything remotely long will always be massively compressed; uncompressed 10-bit 1080p at 24 fps would take about 1.5 TB of disk space.

Which is not the only variable; they sometimes end up having better quality (for an obvious example, much lower-bitrate AV1 video can look better than higher-bitrate x264).


Not the cheapest way, but buying a 4K disc player with Dolby Vision would be full quality.

Compression isn't always a dirty word. I should have said that streaming services use *lossy* compression, and it can also be varied notably on the fly and by demand for the title. There are "lossless" compression formats, either mostly or entirely lossless depending on the compression ratio. Rips strip out all of the extras, maybe some audio tracks (7.1 tracks are huge but are the best and are usually kept), menu content, etc., so there are some size savings there. You can rip your own discs and choose which movie track, at what compression rate, and in what format, too.

A typical high-quality 4K HDR rip probably ends up at 50 to 70 GB for just the movie and the 7.1 audio track; some are 100 GB. They use the H.265 format, which results in smaller sizes at the same quality. That will generally be much higher quality than what streaming services deliver on the fly.
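
As a rough sanity check on those sizes, a 60 GB rip of a two-hour movie implies an average bitrate in the mid-60s of Mbps for video plus audio, i.e. UHD Blu-ray territory and well above typical streaming rates (the numbers here are just illustrative):

# Average bitrate implied by a rip of a given size; values are illustrative.
size_gb, runtime_min = 60, 120           # movie + lossless audio, 2 h runtime
avg_mbps = size_gb * 8 * 1000 / (runtime_min * 60)
print(f"{avg_mbps:.0f} Mbps average")    # ~67 Mbps for this example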

The video posted about Disney+ is more about the title's composition/framing compared to the IMAX version than about the effect of bitrate on PQ, but I can see your point in that case. However, if they were both mastered in Dolby Vision, IMAX, etc., then it would be more of an apples-to-apples comparison.
 
This timestamp shows compression artifacts in Game of Thrones. That's when it's most obvious, but really it's always compressed in streams.
Feature films are always heavily compressed in any medium you have ever tried. Some streams are terrible, like Game of Thrones; some look better than their 4K UHD discs because, as you pointed out, streaming can upgrade to the latest and best codec and offer it, and easily fall back to other options, while Blu-ray needs to remain reasonably playable on old players.
Not the cheapest way, but buying a 4K disc player with Dolby Vision would be full quality.
I am not sure what "full quality" means here, but no; H.265/MPEG-H may be the best of its time for residential use, but how can it be full quality? The new Avatar uncompressed takes about 1 petabyte of storage. The regular (non-high-frame-rate) compressed theatrical version of the Hobbit movie in 2016 was over 380 GB. I would not be surprised if the best theatrical version of the new Avatar reached over 1 TB.

There are "lossless" compression formats,
Yes, like a zip file, but for long videos in people's homes, no, we are far from that; x265 is not lossless compression.
 
This timestamp shows compression artifacts in Game of Thrones. That's when it's most obvious, but really it's always compressed in streams.
HBO generally has shit quality for a lot of stuff, as well as one of the clunkier UIs. They just don't care. Even the new Game of Thrones show is often excessively dark.

I'm a big fan of Japan, so my spouse and I watch the walking videos from Rambalac or Virtual Japan as morning breakfast shows over the weekend. When not live streaming, they have very good image quality, and I think they can show off OLED capabilities quite well in a real-world setting, rather than the "color graded for intent" content you see in movies and TV. That said, cameras are not perfect and will sometimes end up with too-bright or too-dark content if they're not adjusted or if the auto-adjustment does not work well.

OLED will tend to do better in those "night walk" type videos, whereas a mini-LED LCD TV might do better for "day walk" videos, as there are more bright things overall.
 
I'm waiting for a PG32UQX (1,152-zone FALD QD-IPS mini-LED LCD) vs S95B (QD-OLED) comparison in a dark room. Avatar 2 should be the benchmark title for a true HDR experience.

The S95B will get rekt. A 10% perfect-square window test won't tell you much about ABL versus a true 1,400-nit HDR display.
They will only look similar in scenes where a large portion of the frame is black.

It will be similar to these comparisons:

PG35VQ HDR 1000 vs AW3423DW HDR 1000; ISO 100, shutter speed 1/125 vs 1/125
[comparison photos]
More of these

Even after putting the PG32UQX in SDR to emulate HDR 400, it will look similar to the comparison below:
PG35VQ YCbCr SDR vs AW3423DW ABL HDR
[comparison photos]
 
That looks sick on my Alienware.
That channel has a bunch of cool Tokyo-at-night vids that I figured would be a good simple demo for what the OP wants.


edit: and this one just made me realize YouTube's HDR10 actually works in my browser now. Sweet.
 
Feature films are always heavily compressed in any medium you have ever tried. Some streams are terrible, like Game of Thrones; some look better than their 4K UHD discs because, as you pointed out, streaming can upgrade to the latest and best codec and offer it, and easily fall back to other options, while Blu-ray needs to remain reasonably playable on old players.

I am not sure what "full quality" means here, but no; H.265/MPEG-H may be the best of its time for residential use, but how can it be full quality? The new Avatar uncompressed takes about 1 petabyte of storage. The regular (non-high-frame-rate) compressed theatrical version of the Hobbit movie in 2016 was over 380 GB. I would not be surprised if the best theatrical version of the new Avatar reached over 1 TB.


Yes, like a zip file, but for long videos in people's homes, no, we are far from that; x265 is not lossless compression.

I meant that it wouldn't have the lossy compression that streaming services use (and it would have uncompressed 7.1 audio over HDMI 2.1 as well). Streaming providers also apply dynamic resolution switching (something like some games', especially console games', dynamic resolution tech) and other compression algorithms to save bandwidth, and also to balance load in cases of extreme viewership. It's a type of transcoding.

Some people upload their 1080p content to YouTube as 4K in order to have their material's bitrate cut down less by YouTube's lossy compression. That shows how much they cut it down just at the compression stage. They do the same to 4K uploads compared to the source material.

This is an interesting article that goes into a lot of detail on YouTube compression:



As for lossless, they say x265 is "virtually lossless" in its typical usage, and AV1 can be lossless; that is, it won't lose fidelity from the source material you started with. Really it can depend on what parameters you set when you compress something, though. You can set x265 compression to be lossless if you really wanted to, but the file size would probably be somewhat larger (though still compressed), and you'd have negligible, perhaps even imperceptible, gains there. And even with AV1 files, streaming service providers can still use bandwidth-saving features that change your quality or bitrate on the fly to balance your stream quality network-wise (fidelity vs. potentially seeing visible buffering, choppiness, or interruptions), to optimize what they spend on bandwidth, or both.

https://x265.readthedocs.io/en/stable/lossless.html
x265 can encode HEVC bitstreams that are entirely lossless (the reconstructed images are bit-exact to the source images) by using the --lossless option. Lossless operation is theoretically simple. Rate control, by definition, is disabled and the encoder disables all quality metrics since they would only waste CPU cycles. Instead, x265 reports only a compression factor at the end of the encode.

. . .
HEVC (H.265) is probably the latest and a better compression codec than the others. HEVC is the best lossless video codec in terms of efficiency, performance, and compatibility. Most modern software and hardware support the HEVC lossless video codec. The newer AV1 codec is also a viable option.


Which video codec is best for quality?

For a given compression ratio, H.265 (HEVC) is the best codec for quality and is widely used for 4K UHD content. H.265 is the best video codec when it comes to the trade-off between compression and quality. VP9 is more commonly used for YouTube, Android, Google Chrome, and web browsers. In fact, there is no single best answer, because it depends on the source material, the bitrate, and your demands.


Is H.264 lossy or lossless?

H.264 is commonly used for lossy compression, but it also offers lossless compression. FFmpeg has a "lossless" mode for x264: if you use the libx264rgb FFmpeg encoder with the rgb24 pixel format, you won't lose video quality.
. . .
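
To make the lossless modes mentioned in those quotes concrete, here is a minimal sketch that shells out to ffmpeg from Python, assuming an ffmpeg build with libx264rgb and libx265; the file names are placeholders:

# Hypothetical helper invoking the lossless modes discussed above.
# Assumes ffmpeg is on PATH and was built with libx264rgb and libx265.
import subprocess

SRC = "master.mov"                        # placeholder source file

# H.264: libx264rgb with CRF 0 keeps an RGB source bit-exact (no 4:2:0 step)
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx264rgb", "-crf", "0",
                "-pix_fmt", "rgb24", "lossless_h264.mkv"], check=True)

# H.265: x265's dedicated lossless mode (the --lossless option quoted above)
subprocess.run(["ffmpeg", "-i", SRC, "-c:v", "libx265",
                "-x265-params", "lossless=1", "lossless_h265.mkv"], check=True)

Both outputs decode back to frames identical to the source, but they are still far larger than a normal CRF encode, which is why nothing delivered to the home actually ships this way.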

What I was comparing it to was as if you took a local source file at a given resolution and uploaded it to a streaming service, or took a UHD 4K HDR disc and converted it to a streaming service's format, compression, and file size, and then fed it through their networks and streaming algorithms.

A third-party test showed that Disney+ and Apple TV+ have considerably higher bitrates than the other streaming services, but Disney+ and Apple's bitrate is still around half that of a UHD 4K HDR disc. For a streaming service, that is great. Even if it ended up being comparable to something like "full" 1080p or 1500p local source fidelity, that would still be really high for a streaming service. It also doesn't account for the fact that, as an end result, you might have a good scaler and/or sharpness feature in your screen or device, perhaps even AI upscaling.

And as for the walking-in-Japan vids on YouTube in 4K: I've definitely watched those on my 48CX when I first got it and they look great, very colorful and with high contrast in the black nighttime shots. I just don't like that they keep their logo on the screen the whole time.

. . .

My original point was that there are a lot of variables and options in both local content and streaming content providers and their delivery systems, in manufacturers and models of screens (some models even switched panels during the same model-number run), in screen technologies (QD-OLED, WRGB OLED, FALD LCD and its number of zones, non-FALD local-dimming LCD, plain backlit LCD, IPS, VA, TN), firmware updates, user settings/setup guides/calibration, ambient lighting/room layout/light pollution/lighting vectors, screen coatings, etc.

So showing someone a random LCD and a random OLED can and will show differences, but you'd get different results depending on many of those factors.
 
As for lossless, they say x265 is "virtually lossless" in its typical usage, and AV1 can be lossless; that is, it won't lose fidelity from the source material you started with. Really it can depend on what parameters you set when you compress something, though. You can set x265 compression to be lossless if you really wanted to, but the file size would probably be somewhat larger. And even with AV1 files, streaming service providers can still use bandwidth-saving features that change your quality or bitrate on the fly to balance your stream quality network-wise, to optimize what they spend on bandwidth, or both.
You can use x265 in lossless mode, but it doesn't reduce the raw file size enough to be usable. When Warner Bros. made very high-quality files to test TV technology, they made 3 GB/s files; the highest bitrate you can go on a UHD Blu-ray, 144 Mb/s, is about 166 times smaller, and a lot of that is taken up by the audio.
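
As a quick check of that ratio, taking the 3 GB/s figure as bytes:

ratio = (3 * 8 * 1000) / 144   # 3 GB/s = 24,000 Mb/s vs. the 144 Mb/s disc cap
print(round(ratio))            # ~167, i.e. the "166 times smaller" figure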

A third-party test showed that Disney+ and Apple TV+ have considerably higher bitrates than the other streaming services, but Disney+ and Apple's bitrate is still around half that of a UHD 4K HDR disc.
Yes, but much as x265 at a similar bitrate will look much better than x264 (and I imagine AV1 looks much better than x265), there is more to it than raw bitrate. Netflix, for example, now streams some content via AV1 if the client supports it, and it will not need the same bitrate to look as good:
https://www.audioholics.com/hdtv-formats/netflix-av1

If they stay at around that 50% ratio, they could very well often be as good, if not better, in quality in the near future.

That talk of Blu-ray being lossless versus compressed streaming existed even in the days of the lower-bandwidth H.264 Blu-ray, and if they ever release 500 GB movies on a future better-than-AV1 codec, people will start calling the previous UHD discs the compressed affair. There is a notion we hear very often about discs not being compressed; maybe it is just a shortcut that everyone understands to mean a bit less compressed than the streaming platforms, but it is often stated as if there were more to it than that.
 
It really wouldn't make sense to run x265 uncompressed regardless; at the default "virtually lossless" compression, the difference would be pretty much imperceptible without a microscope, sitting up close, or zooming in on screenshots. AV1 can be functionally lossless, though, from what I've read.

As for streaming... there is more to streaming than compressing the file once and playing it back at that same compressed quality. They save bandwidth dynamically too. They are transcoding, and that transcoder isn't set to "original" quality; it's set to "dynamic," so to speak. Better, more efficient codecs for streaming (and discs) are a good thing though, and theoretically should reduce how much streaming services have to narrow your feed due to network dynamics, "load balancing," or saving money on bandwidth. Even with the AV1 codec, a streaming service could deliver less than 1:1 source quality if and when they wanted to. YouTube might still be reducing the quality of an upload from the 1:1 source in their transcoding of it even with AV1, though perhaps not by as much as they cut 1080p and 4K source material before. If so, I'd be interested to find out exactly how that might be changing.
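
That "dynamic" behavior is essentially adaptive bitrate streaming: the service encodes a ladder of renditions and the player keeps switching to whatever the measured throughput can sustain. A toy sketch of that selection logic (the ladder bitrates below are made up for illustration, not any real service's encode ladder):

# Toy adaptive-bitrate selection: pick the highest rendition that fits within
# the measured throughput, with some headroom. Ladder values are made up.
LADDER_KBPS = {"2160p_hdr": 16000, "1440p": 9000, "1080p": 5000,
               "720p": 3000, "480p": 1200}

def pick_rendition(throughput_kbps: float, headroom: float = 0.8) -> str:
    """Return the best rendition whose bitrate fits throughput * headroom."""
    budget = throughput_kbps * headroom
    for name, kbps in sorted(LADDER_KBPS.items(), key=lambda kv: -kv[1]):
        if kbps <= budget:
            return name
    return min(LADDER_KBPS, key=LADDER_KBPS.get)   # fall back to lowest rung

print(pick_rendition(25000))   # plenty of bandwidth -> "2160p_hdr"
print(pick_rendition(6000))    # constrained link    -> "720p"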

The point you made about compression/bitrate versus quality is admittedly true, and the tech is getting better. I'd love it if streaming services could reliably provide the same 4K UHD video + master 7.1 audio quality (or a future AV1 UHD disc quality?) without streaming compression/transcoding tradeoffs, and uncompressed 7.1 audio over HDMI 2.1 without dynamically altering that either. As it is, they are all still dynamic transcoders, and YouTube especially has limitations and downgrades as it stands now. They can still look and sound very good currently, especially for a streaming service. Apple and Disney seem to have the best feeds for now. AV1 might narrow that playing field in the long run, though, and, like you said, so might future codecs/technologies (e.g. AI upscaling, frame amplification, etc.) in the years ahead.
 
A friend was over just now, and I told him that video was produced by Nvidia to show off their new ray tracing technology, and he believed me.
He was like, "those people look so real," lol.

When I first got my 48CX, I watched some nature-type HDR documentary that showed a native guy spinning a torch baton in the dark. It was striking how isolated the flame was in the blackness of night. Then I soon noticed the gleam and reflections in people's eyes on screen. I've never seen an LCD look like that, even FALD ones, though I haven't owned a modern higher-density FALD one myself; I've only seen those in Best Buy. The complete isolation of high-color-volume pixels right next to contrasted areas going all the way down to ultra/"infinite" black, pixel by pixel side by side, is amazing. HDR games also look great.
 