Why OLED for PC use?

You just downplay the importance of brightness as if it doesn't matter. If there is no brightness, then there is no color. Realistic images have much higher midtones to showcase 10-bit color. A higher range is always better.

Of course overall image brightness matters, but it is somewhat moot for SDR. The SDR brightness reference is generally 100 nits. Anyone is free to go above that, as I do, if they want a bit more pop. But again, mastered in sRGB can't look more accurate than sRGB.

I can turn the brightness all the way up on my TV for SDR if I want (it's at 9 now for ~160 nits and goes all the way to 50). Colors will pop much more, and compared side-by-side there would be no contest as to which looks prettier/more vibrant if I took a photo. I'd also have a headache and my eyes would burn at the first white screen I came to. Of course, this isn't desirable. In a dark room, not comparing side-by-side, my eyes adjust to the 160 nits and things look good and are sufficiently easy on the eyes while looking like they're supposed to, albeit a touch brighter per my preference.

Even for HDR, the technology is displaying things in the nits they're meant to display in, and the majority of those things are going to be in the SDR or slightly-above-SDR brightness range. Where you need that extra brightness oomph is things like suns, lights, neon signs, etc., and OLED does fine here most of the time. FALD does, of course, do better because it can get brighter and sustain it longer, so OLEDs would be limited in full-field, super-bright images. But those tend to be fairly brief in games and movies, as they'd be too bright to look at comfortably for sustained periods anyways, so in most cases, OLED is going to look quite good, and that's been my experience so far.
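As a point of reference on what "the nits they're meant to display in" means technically: HDR10 uses the SMPTE ST 2084 "PQ" curve, where each code value corresponds to an absolute luminance. Here's a rough Python sketch of that mapping (it assumes full-range 10-bit values, which is a simplification of real video pipelines):

```python
# Rough sketch: decode SMPTE ST 2084 (PQ) code values to absolute luminance in nits.
# Assumes full-range 10-bit values (0-1023); real video pipelines often use limited
# range, so treat the exact numbers as illustrative.

M1 = 2610 / 16384          # standard PQ constants
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: int) -> float:
    """Convert a 10-bit PQ code value to luminance in cd/m^2 (nits)."""
    e = code / 1023.0                      # normalized non-linear signal
    p = e ** (1.0 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1.0 / M1)

for cv in (0, 256, 512, 640, 769, 1023):
    print(cv, "->", round(pq_to_nits(cv), 1), "nits")
# Code 1023 maps to 10,000 nits; most of a typical scene sits far below that.
```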

It's ridiculous to trade a little local dimming bloom, or sustained 1600 nits, or 10-bit color for a 200-nit OLED experience. A dim image with a few dots of highlights is not true HDR. FALD LCD can display better than this.

It may be ridiculous *to you*, and that's fine. The bloom I was experiencing in day-to-day use bothered me much more than any ABL on this monitor ever could, and at a higher price. In short, I tried one of the best-reviewed FALD on the market and was pretty unhappy. I'm much happier with the OLED. You can argue specs all you want. Hell, I bought the FALD because of the specs and was avoiding OLED for those same reasons. But real-world use matters, and in the end, I ended up much happier with the picture in the vast majority of the situations I use the display for with the OLED, and the HDR implementation is plenty for me.

As far as panel technology goes, manufacturers, especially LG, are on a road to going all-in just like Sharp did. If things don't go well, it will be too late for LG to realize it; before they can jump to next-gen tech, FALD LCD will have already taken the market.

I'm excited for the future of miniLED and perhaps even more so microLED. But I also don't feel it's there yet - it still has a lot of issues that need to be ironed out. Who knows - maybe FALD will solve its issues and be king with something like microLED, maybe OLED will continue to improve and solve burn-in and brightness and be king, or maybe it will be something different altogether that ends up being the best technology that can combine the best of both. But for right now, both FALD and OLED are popular for different people for valid reasons. They both have clear pros and cons that come down to what upsides a person values most and what downsides a person can live with.

Accuracy in limited sRGB doesn't mean it will be a better or more natural image. Accuracy is intended for mass distribution; it's more important for the creators. These creators might not have the intention or the budget to implement HDR for better images. Even if they implement native HDR, the UI can still shine at max brightness, just like every game made by Supermassive. These anomalies are easy to fix.

Native HDR can be good or bad depending on implementation, and the industry is still working out the kinks in that. I remember Red Dead Redemption's HDR was notoriously broken for quite a while. But the anomalies with Auto SDR do extend beyond that. When there's not proper information as to what the brightness of different aspects of a picture is supposed to be, extrapolation from an algorithm can be good, but far from perfect. Again, you may not prefer an sRGB image. But I do, for its objective accuracy, and for the naturalness I perceive. Many creators do like and prefer it when those who enjoy their work view it as close to reference as possible.

AutoHDR is an option if you want to see a higher range. It is only a matter of time before AutoHDR gives users various options, such as adjusting the levels of shadows, midtones, highlights, and saturation just like native HDR, so they can see better HDR matched to the range of their monitors instead of SDR sRGB.

The higher range is artificial, though. It's nothing more than an educated guess expanding an image not meant for the HDR range into the HDR range. The results may be appealing to some people, but they won't be for everyone, me included.
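Just to illustrate what I mean by an educated guess, here's a deliberately oversimplified Python sketch of SDR-to-HDR expansion. This is not how Windows Auto HDR actually works internally; the knee and the expansion curve are made-up parameters. The point is only that the extra highlight range is synthesized by a curve rather than recovered from the source:

```python
# Oversimplified SDR-to-HDR expansion ("inverse tone mapping") sketch.
# NOT the real Auto HDR algorithm -- the knee and curve below are arbitrary.

def srgb_to_linear(v: float) -> float:
    """Undo the sRGB transfer function (v in 0..1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def expand_to_nits(v: float, sdr_white: float = 100.0,
                   peak: float = 1000.0, knee: float = 0.6) -> float:
    """Keep shadows/midtones near SDR levels, then guess at brighter highlights."""
    lin = srgb_to_linear(v)
    if lin <= knee:
        return lin * sdr_white                  # unchanged, SDR-like
    t = (lin - knee) / (1.0 - knee)             # how far above the knee we are
    return knee * sdr_white + t ** 2 * (peak - knee * sdr_white)  # invented highlight

for v in (0.2, 0.5, 0.8, 1.0):
    print(v, "->", round(expand_to_nits(v), 1), "nits")
```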

As far as the customizability, I think giving people options is great. Me, personally? I don't want to spend forever tweaking an image to make it the way I want it, especially when I may have no reference to what it was supposed to look like in the first place. Intent and accuracy matters to me, so I just want to watch it as close to the creator's intent as possible on my display. I might make minor tweaks to things like brightness, but otherwise, I'd prefer to keep it simple for myself. (Though I'm great with others having all the options in the world, as long as I don't have to use them =oP).

I think on most of these points, we're just going to have to agree to disagree. I can respect not everyone has the same priorities with content. I'm just trying to point out not everyone has the same criteria for what's best, and a great monitor that's the best fit for someone can mean different things to different people. I never expected that to be OLED for me, but here we are, and at this time in technology, it is.
 
mastered in sRGB can't look more accurate than sRGB.
Accuracy in sRGB only means distribution. It's very easy to cover 100% of sRGB; even a TN monitor has high color accuracy in sRGB. That doesn't mean it looks good. sRGB always looks "accurately" dull and lifeless. People shouldn't limit themselves to that range only to see how dull it looks. What is more important is to stretch sRGB into high-range HDR, as there are monitors that can display better images than just the lowest sRGB.


I can turn the brightness all the way up on my TV for SDR
Simply turning up the brightness won't do the job. You will have a raised black level, and the color won't change either. You need to put the image into another color space such as Adobe for more colors at higher brightness, then use YCbCr to get a lower black level.

Even for HDR, the technology is displaying things in the nits they're meant to display in, and the majority of those things are going to be in the SDR or slightly-above-SDR brightness range.
The majority, or even all, of an image can be above 400 nits to give a higher APL. This is where HDR can show more color. Realistic images, even in a night scene, need a higher APL. You just don't realize that a 400-nit APL can be very common. 100-nit SDR plus a few highlights never shows that color; it doesn't look better.

You can argue specs all you want. Hell, I bought the FALD because of the specs and was avoiding OLED for those same reasons. But real-world use matters, and in the end, I ended up much happier with the picture in the vast majority of the situations I use the display for with the OLED, and the HDR implementation is plenty for me.
It's the same problem of seeing content with just 100-nit SDR plus a few highlights. That's just the tip of the iceberg; it barely scratches HDR. It's a matter of time before more content has a higher APL. OLED is not even going to be enough, because what you see on OLED is just 8-bit SDR, unless you are only satisfied with the tip of the iceberg.
 
Muh FALD LCD

[attached comparison photos of the two displays]
 
If you've bought the remastered UHD version of an old movie such as Jaws (1975), there is a section that specifically explains how the old Rec.709/sRGB footage was regraded to HDR1000 for better images, beyond what the director had seen before.
I can't find it. Be more specific.
Remastered movies are all graded from the lower range of sRGB/Rec.709 80 nits up to HDR1000.
They are absolutely not. Movies are not "converted" from "SDR 80 nits" to "HDR 1000". If they were, it would very much count as "fake HDR", much like the "Adobe SDR 400 nits" nonsense you are practicing.
I can grade movies myself. I know how it works.
I'm sure you can. You have probably graded many of the UHD Blu-Ray releases on the market...
A true HDR monitor has Adobe color to automatically deliver better images, similar to HDR400, so you don't need to see sRGB 80 nits, while actual HDR shows more powerful impact.

You can keep denying it with your imagination, while 400-nit Adobe SDR easily destroys OLED HDR400, which is still inside the SDR range.
Once again you are the one letting your imagination run wild just to discredit a certain type of technology. I know you know what the actual shortcomings of both OLED panels and FALD LCDs are, so I find it rather bizarre that you can't just stick to factual discussion. You know that it's a compromise no matter what you pick, but you are so dead set on full scene ultra bright HDR being the only factor that matters.
 
Accuracy in sRGB only means distribution. It's very easy to cover 100% of sRGB; even a TN monitor has high color accuracy in sRGB. That doesn't mean it looks good. sRGB always looks "accurately" dull and lifeless. People shouldn't limit themselves to that range only to see how dull it looks. What is more important is to stretch sRGB into high-range HDR, as there are monitors that can display better images than just the lowest sRGB.

This is just factually incorrect as far as "accuracy" only meaning "distribution". You may prefer stretching sRGB into HDR, but that doesn't mean it's accurate, and I think most professional calibrators/colorists/directors/etc. would tell you the same thing and say they value accuracy over "pop". Accuracy matters to me. If it doesn't to you, it's all good and that's your preference, but I personally have no interest in images being stretched into ranges they weren't mastered in, and I'm not alone. It's great people have the option if that's what they prefer, but trying to say everyone should prefer an inaccurate image because it's brighter or more colorful doesn't make a lot of sense to people who value accuracy. For me, striving towards accuracy will always be the goal.

Simply turning up the brightness won't do the job. You will have a raised black level, and the color won't change either. You need to put the image into another color space such as Adobe for more colors at higher brightness, then use YCbCr to get a lower black level.

If I crank the brightness way up on an SDR display, colors indeed do appear more vibrant. But yes, it will raise the black level - it's artificially raising brightness way past a sensible level. But putting things into another color space for "more colors" isn't really all that different. If it's mastered in sRGB, there are no "more colors" available in the data for you to see. Sure, you can get "more colors" by stretching to a different color space, but they're not intended colors, so to me that's an entirely *undesirable* outcome.
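A concrete way to see why that is, as a rough numpy sketch (nothing from a real calibration tool, and the gamma difference between the two spaces is ignored for brevity): take the same sRGB-encoded triplet and interpret it once against sRGB primaries and once against Adobe RGB primaries, using the commonly published D65 RGB-to-XYZ matrices. Same numbers, two different real-world colors - and only one of them is the color that was mastered:

```python
# The same RGB code values land on different real-world colors depending on which
# primaries they are interpreted against. Matrices are the commonly published
# D65 sRGB and Adobe RGB (1998) RGB->XYZ matrices (rounded); gamma differences
# between the two spaces are ignored here for simplicity.
import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

ADOBE_TO_XYZ = np.array([[0.5767, 0.1856, 0.1882],
                         [0.2974, 0.6273, 0.0753],
                         [0.0270, 0.0707, 0.9911]])

def srgb_eotf(v: np.ndarray) -> np.ndarray:
    """Linearize sRGB-encoded values (0..1)."""
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

rgb = np.array([0.9, 0.2, 0.2])     # a saturated red as mastered in sRGB
lin = srgb_eotf(rgb)

print("intended  XYZ:", np.round(SRGB_TO_XYZ  @ lin, 4))   # what the colorist saw
print("stretched XYZ:", np.round(ADOBE_TO_XYZ @ lin, 4))   # same data, wider primaries
# The "stretched" result is a more saturated red that was never in the source.
```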

The majority, or even all, of an image can be above 400 nits to give a higher APL. This is where HDR can show more color. Realistic images, even in a night scene, need a higher APL. You just don't realize that a 400-nit APL can be very common. 100-nit SDR plus a few highlights never shows that color; it doesn't look better.
It's the same problem of seeing content with just 100-nit SDR plus a few highlights. That's just the tip of the iceberg; it barely scratches HDR. It's a matter of time before more content has a higher APL. OLED is not even going to be enough, because what you see on OLED is just 8-bit SDR, unless you are only satisfied with the tip of the iceberg.

I've already conceded that native HDR is going to look best the brighter a display can go, which right now is indeed FALD, though I'd still argue it looks quite good in the vast majority of cases for things like gaming on OLED, partially because of the infinite blacks, but also because most of the time those highlights are all you need. Colors are saturated just fine, and even on a brighter display, would be at the same brightness level regardless of FALD or OLED since most of the images you're going to see, even in HDR, aren't using that many nits. 400 nits is bright! Unless you're in a really bright room (which HDR isn't meant for in the first place unless it's a heavily tone mapped version meant for a bright room), an image consistently that bright is going to cause a lot of eyestrain (I know I couldn't watch it without a headache), and it makes no sense to me that you're saying dark scenes need a high APL. Huh? The whole point of HDR is to portray realism, so if a shadow is at 20 nits, 20 nits is what is shown for said shadow... Maybe most of what you see in a midday scene is higher than 100 nits reference - I haven't measured, but I think for most of the HDR content I've watched (and this is on my really bright Sony FALD TV), APL is generally not that far from SDR. A bright light source would be a lot more nits for the light (or the best tone mapping can do if a display can't reach the exact figure), but those scenes tend to be brief and transient. Do you have any source for HDR content where the APL is 400 nits throughout? If there's content that pushes close to 400 consistently, I'd love for you to point me in the direction of what that is.
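For what it's worth, this is measurable rather than something we have to argue about. A rough sketch of the kind of per-frame stats I mean is below - the frame here is synthetic, purely for illustration; you'd substitute a real per-pixel luminance map in nits from whatever HDR content you want to check:

```python
# Rough sketch: APL and highlight coverage for one frame, given per-pixel
# luminance in nits. The "frame" below is synthetic and only for illustration.
import numpy as np

def frame_stats(nits: np.ndarray) -> dict:
    """Average picture level plus the share of pixels above a few thresholds."""
    return {
        "APL (nits)": float(nits.mean()),
        "% above 100 nits": float((nits > 100).mean() * 100),
        "% above 400 nits": float((nits > 400).mean() * 100),
        "% above 1000 nits": float((nits > 1000).mean() * 100),
    }

rng = np.random.default_rng(0)
frame = rng.gamma(shape=2.0, scale=40.0, size=(2160, 3840))  # mostly ~80-nit midtones
frame[:200, :200] = 3000.0                                   # one small bright highlight
for name, value in frame_stats(frame).items():
    print(f"{name}: {value:.1f}")
```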

Again, I haven't found problematic HDR content on modern OLED displays so far (my first OLED couldn't do tone mapping, and it was a problem then, but that was a long time ago and technology has come a long way since then). This OLED has done a great job with everything I've thrown at it so far. Brights aren't quite as good as my FALD TV, but contrast is better, and for my uses, I'm quite happy so far.
 

The camera has its own bias by nature of being a camera in the first place, plus camera settings, the picture format, and the fact that each monitor people are using to view your images is different. Cameras, while they have their own biases, also don't see screens vs. the ambient lighting environment / relative to each other the same way we do in person, and don't handle screen surface treatments (matte raises blacks and can affect details) in exactly the same way our eyes and brains process them, so it's not that meaningful. Each screen you took pictures of, as well as everyone's screens they are viewing them on, also has a myriad of settings that can be adjusted and modes that can be used to different effect.

That said, the Samsung has bad blooming and large, pale, washed-out areas in the pictures you posted, so idk if it's Poe's law and you forgot the /s or what. The trees lack depth/saturated blacks, the rocket has a huge bright area around it, and the starfield is missing a ton of stars. idk what's up with the Native American scene and the trees scenes and why the background details are so low on the OLED in that example - settings/mode-wise perhaps, idk - but the Samsung version seems washed out, with blacks lifted high to the greys/blues of a dirty chalkboard, like some first-person shooter player cranking up the settings in order to see in areas that are aesthetically supposed to be dark lol.

Again, due to the reasons I referenced above, you really can't show what each looks like in person, especially in HDR, so it's pretty dumb to use them as a reference either way - but if I compared the bright ones posted there, with missing details and blown-out lifted areas, to my own two OLED screens at my own settings with my own movies locally, I'd stick with my OLEDs. I did view those pictures full size on my glossy IPS tablet, 600-nit matte (non-FALD) LCD laptop, and my C1 OLED, for what it's worth.

========================================================
You just downplay the importance of brightness as if it doesn't matter. If there is no brightness, then there is no color. Realistic images have much higher midtones to showcase 10-bit color. A higher range is always better.

It's ridiculous to trade a little local dimming bloom, or sustained 1600 nits, or 10-bit color for a 200-nit OLED experience. A dim image with a few dots of highlights is not true HDR. FALD LCD can display better than this.

As far as panel technology goes, manufacturers, especially LG, are on a road to going all-in just like Sharp did. If things don't go well, it will be too late for LG to realize it; before they can jump to next-gen tech, FALD LCD will have already taken the market.

Yes, and you downplay the value of per-pixel emissivity - a razor's edge of side-by-side per-pixel contrast and side-by-side per-pixel detail in color, rather than large zones. Each zone on a 1,344-zone FALD screen has 6,171 pixels in it and is lit against all of the other surrounding zones of 6,171 pixels each. A zone may have even more pixels if they are counting any edge lights as zones too. 1,152 zones = 7,200 pixels per zone (or more), and even more per zone on an 8K screen. You also downplay the blooming and dimming and the losses those cause in fine detail and highly contrasted scenes. There is also the fact that many FALD screens have matte coating layer treatments that raise blacks no matter what screen they are on (LCD or OLED), lose the full glossy, clear, saturated wet-ice look to more of a hazy frost, and can compromise fine detail slightly.
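(A quick check of that per-zone arithmetic, for anyone who wants to run it - plain division, nothing more:)

```python
# Pixels per dimming zone on a 4K panel is just total pixels / zone count.
PIXELS_4K = 3840 * 2160        # 8,294,400 pixels

for zones in (1344, 1152):
    print(f"{zones} zones -> ~{PIXELS_4K // zones:,} pixels per zone")
# 1344 zones -> ~6,171 pixels per zone
# 1152 zones -> ~7,200 pixels per zone
```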

I agree that higher HDR and color are going to be the way to go in the future once these tradeoffs are nullified across the board - but these tradeoffs will continue to exist until another per-pixel emissive display type replaces both FALD and OLED.
 
I can't find it. Be more specific.

They are absolutely not. Movies are not "converted" from "SDR 80 nits" to "HDR 1000". If they were, it would very much count as "fake HDR", much like the "Adobe SDR 400 nits" nonsense you are practicing.

I'm sure you can. You have probably graded many of the UHD Blu-Ray releases on the market...

Once again you are the one letting your imagination run wild just to discredit a certain type of technology. I know you know what the actual shortcomings of both OLED panels and FALD LCDs are, so I find it rather bizarre that you can't just stick to factual discussion. You know that it's a compromise no matter what you pick, but you are so dead set on full scene ultra bright HDR being the only factor that matters.
No, you are the one imagining things. You don't even know how grading works. You can either directly grade movies from modern raw 12bit footage at HDR2000+ or you can grade old movies from old 8bit footage at HDR1000.

When you buy that UHD disc, you will find extra footage explaining how they graded it from the old archive. Don't imagine old footage from 1975 has anything more than sRGB/Rec.709. I can make HDR1000 just from stretched 8bit color.
 
...Film has a wide dynamic range (and fidelity) that hasn't been fully represented using most home media in the past... Whatever was seen has obviously either not been an accurate picture of what is occurring, been some process specific to how they went about creating that release, or been misunderstood.
 
This is just factually incorrect as far as "accuracy" only meaning "distribution". You may prefer stretching sRGB into HDR, but that doesn't mean it's accurate, and I think most professional calibrators/colorists/directors/etc. would tell you the same thing and say they value accuracy over "pop". Accuracy matters to me. If it doesn't to you, it's all good and that's your preference, but I personally have no interest in images being stretched into ranges they weren't mastered in, and I'm not alone. It's great people have the option if that's what they prefer, but trying to say everyone should prefer an inaccurate image because it's brighter or more colorful doesn't make a lot of sense to people who value accuracy. For me, striving towards accuracy will always be the goal.
Just sticking to sRGB doesn't make sense now. Most professional calibrators/colorists/directors/etc. will be happy if you can see a better image out of old sRGB, because human eyes can see a lot more than sRGB covers. They always intend more than sRGB. They grade for sRGB because of technology limitations, as the majority of monitors only do fine with sRGB. When all monitors can do HDR1000, nobody will want to see SDR sRGB.

Accuracy only means you see sRGB distributed in a compromised way, with a limited color space and brightness. HDR is what they would intend if they had more choices. Now there are HDR monitors, so we can make better images ourselves if the content is only sRGB.

an image consistently that bright is going to cause a lot of eyestrain (I know I couldn't watch it without a headache), and it makes no sense to me that you're saying dark scenes need a high APL
You won't have eyestrain watching higher-APL images as long as the monitor uses DC dimming. OLED flickers, so a higher APL will easily cause eyestrain on OLED.

Night scenes can have a high APL. You just don't see them in a more realistic way. If you keep seeing images like these, you will realize APL can be a lot higher. There are easily 4,000-nit highlights and a large area over 800 nits. OLED can hit ABL in this scene.
[attached HDR screenshot]


I've already conceded that native HDR is going to look best the brighter a display can go, which right now is indeed FALD, though I'd still argue it looks quite good in the vast majority of cases for things like gaming on OLED, partially because of the infinite blacks, but also because most of the time those highlights are all you need. Colors are saturated just fine, and even on a brighter display, would be at the same brightness level regardless of FALD or OLED since most of the images you're going to see, even in HDR, aren't using that many nits.
As I've said many times, 100-nit average images with a few dots of highlights only scratch HDR. You just need to wait for more content to see actual HDR, while I can make HDR myself. I know how much better these high-APL realistic images can look.
 
Yes, and you downplay the value of per-pixel emissivity - a razor's edge of side-by-side per-pixel contrast and side-by-side per-pixel detail in color, rather than large zones. Each zone on a 1,344-zone FALD screen has 6,171 pixels in it and is lit against all of the other surrounding zones of 6,171 pixels each. A zone may have even more pixels if they are counting any edge lights as zones too. 1,152 zones = 7,200 pixels per zone (or more), and even more per zone on an 8K screen. You also downplay the blooming and dimming and the losses those cause in fine detail and highly contrasted scenes. There is also the fact that many FALD screens have matte coating layer treatments that raise blacks no matter what screen they are on (LCD or OLED), lose the full glossy, clear, saturated wet-ice look to more of a hazy frost, and can compromise fine detail slightly.
Individual pixel-level emissivity is nice, but OLED doesn't produce enough brightness. Color is lit by brightness. HDR needs both contrast and color, or you just see SDR.

It is a bigger trade-off to have contrast without brightness, while moderate contrast with brightness can deliver more realistic HDR at a higher range.
 
Individual pixel-level emissivity is nice, but OLED doesn't produce enough brightness. Color is lit by brightness. HDR needs both contrast and color, or you just see SDR.

It is a bigger trade-off to have contrast without brightness, while moderate contrast with brightness can deliver more realistic HDR at a higher range.

It's tradeoffs either way.

Until we get another per-pixel emissive display type that can do everything better (and by then it will hopefully be well over 2,000 nits, if temps can be handled without aggressive ABL all over again). Per-pixel emissive display methods are clearly the superior way to display things, all other factors aside.

Once we get more per-pixel emissive display technologies like microLED in the enthusiast consumer space, no one will ever do "zones" again if they can help it, let alone zones with 7k or more pixels each, with dim/glow balancing between the large zones affecting the whole cell area's worth of those thousands and thousands of pixels, etc.

I can still use a FALD, or an edge-lit display, a matte layer treatment, etc. if I have to on whatever display, but I wouldn't currently seek one out as my media+gaming screen if I had an OLED option. If set on LCD or forced to use LCD, I similarly wouldn't want to go back to simple edge-lit displays flashlighting from the sides either, as FALD is at least clearly better pixel lighting and dimming than edge-lit. FALD is still flashlighting and dimming in an ice-tray grid against all of the other large zones, though, where OLED lights and darkens each pixel down to a razor's edge, with contrast down to ultra black throughout at the pixel level, as well as individual pixels of detail in color and detail in darks - pixels side by side by side by side, times 8,294,400 pixels.

Both FALD and OLED have strengths and weaknesses, I think everyone can agree. Eventually we'll get microLED per-pixel emissive technologies that should get the best of both worlds, and better, but this is what we have for now.

"I agree that higher HDR and color are going to be the way to go in the future once these tradeoffs are nullified across the board - but these tradeoffs will continue to exist until another per pixel emissive display type replaces both of FALD and OLED."
 
...Film has a wide dynamic range (and fidelity) that hasn't been fully represented using most home media in the past... Whatever was seen has obviously either not been an accurate picture of what is occurring, been some process specific to how they went about creating that release, or been misunderstood.

Completely agreed that how a film is originally captured isn't always reflected in the home release. I'm just saying how it's mastered/released is the most accurate experience you can get at home until a better version comes out, as you can't get that lost information back from something like sRGB if that's the best format available.

Just sticking to sRGB doesn't make sense now. Most professional calibrators/colorists/directors/etc. will be happy if you can see a better image out of old sRGB, because human eyes can see a lot more than sRGB covers. They always intend more than sRGB. They grade for sRGB because of technology limitations, as the majority of monitors only do fine with sRGB. When all monitors can do HDR1000, nobody will want to see SDR sRGB.

Please feel free to provide quotes/articles from any calibrators/colorists/directors, etc. where they encourage people auto-converting sRGB into HDR. Because just about every article I've ever read from people in these industries trends towards people getting the most accurate at-home experience, and that's without a doubt sRGB if no native HDR version is available. Now, I'll completely agree with you if something is manually and competently converted to HDR (Jaws for example), that's the version to get. But if it's not, and only sRGB is available (which is the case with a lot of old games/TV shows/movies), to a lot of us, sRGB is still preferable if we want the most accurate picture currently available. Auto HDR isn't usually a train wreck and is very watchable/even enjoyable, but there's no reason for me to make something less accurate than it can otherwise look by auto converting, until such time as a properly made HDR version is available.

Accuracy only means you see sRGB distributed in a compromised way, with a limited color space and brightness. HDR is what they would intend if they had more choices. Now there are HDR monitors, so we can make better images ourselves if the content is only sRGB.

sRGB has been the standard for a reason. Part of that is certainly limitations for home viewing, but that's not a bad thing. If they intend HDR, there will be an HDR native version eventually. Your "better" is completely subjective. It isn't objectively better to take something with only an sRGB source and expand it into an HDR colorspace. That's what I'm saying. That's YOUR preference for some extra pop, and that's great, but it is not the most accurate way to watch the material and it never will be.


You won't have eyestrain watching higher-APL images as long as the monitor uses DC dimming. OLED flickers, so a higher APL will easily cause eyestrain on OLED.

Personally, I haven't noticed any flicker on my OLED, and it feels easier on my eyes than the IPS and FALD models I tried. That said, I have no idea at higher APL if this would be an issue or not. It certainly hasn't for me on this monitor with the content I've viewed so far.


Night scenes can have a high APL. You just don't see them in a more realistic way. If you keep seeing images like these, you will realize APL can be a lot higher. There are easily 4,000-nit highlights and a large area over 800 nits. OLED can hit ABL in this scene.

[attached HDR screenshot]

Source for this video? Happy to test it on my LG OLED and see how it looks. I can even compare to my Sony FALD TV.

This (especially check out the first five minutes) had several sunrises and looked gorgeous on my OLED:


I'm sure it's brighter on FALD, but it looked plenty beautiful with great colors on the OLED. No hint of ABL that I saw in the brighter scenes either.

As I've said many times, 100-nit average images with a few dots of highlights only scratch HDR. You just need to wait for more content to see actual HDR, while I can make HDR myself. I know how much better these high-APL realistic images can look.

You keep saying that, but you're also not providing a compelling argument. "Wait for more content"? There's plenty of great HDR content already.

As far as making your own, that's great and I'm glad you can enjoy the content you want the way you enjoy it. If you have the ability to manually take source material and make it into great looking HDR, more power to you. Manual grading into HDR can look great if done right. My argument is simply that the Auto HDR algorithms don't do justice with sRGB when there's literally no information there to allow for a more accurate reproduction of colors, and you're better off watching in the best format it's mastered and released in if you care about accuracy, which for me is sRGB for sRGB (and HDR for native HDR).
 
No, you are the one imagining things. You don't even know how grading works. You can either directly grade movies from modern raw 12bit footage at HDR2000+ or you can grade old movies from old 8bit footage at HDR1000.
No. You're still the one imagining things. I haven't made up a single term, but you have made up several. This is well-documented.
And what on earth is this 8-bit footage you're talking about? Please provide a source.
You don't even know how grading works.
I don't? Well, why don't you post something that backs up your claims? I can't think of a single reason why you wouldn't...
You can either directly grade movies from modern raw 12bit footage at HDR2000+
You should elaborate on what you mean by modern raw 12-bit footage, since just calling it that is a bit of an oversimplification. I know what it is, but some of our viewers may not.
or you can grade old movies from old 8bit footage at HDR1000.
And this is where we go completely off the rails, deep into your imagination...
When you buy that UHD disc, you will find extra footage explaining how they graded it from the old archive.
Yeah, or you could just provide a source.
Don't imagine old footage from 1975 has anything more than sRGB/Rec.709.
I'm not imagining anything. Why on earth are you talking about SDR in connection with scanning analog film and delivering it in HDR? There is literally no connection except for maintaining consistency between SDR and HDR releases.
I can make HDR1000 just from stretched 8bit color.
Yeah, I'm sure a professional colorist such as yourself can do literally anything you want.
 
No. You're still the one imagining things. I haven't made up a single term, but you have made up several. This is well-documented.
And what on earth is this 8-bit footage you're talking about? Please provide a source.

I don't? Well, why don't you post something that backs up your claims? I can't think of a single reason why you wouldn't...

You should elaborate on what you mean by modern raw 12-bit footage, since just calling it that is a bit of an oversimplification. I know what it is, but some of our viewers may not.

And this is where we go completely off the rails, deep into your imagination...

Yeah, or you could just provide a source.

I'm not imagining anything. Why on earth are you talking about SDR in connection with scanning analog film and delivering it in HDR? There is literally no connection except for maintaining consistency between SDR and HDR releases.

Yeah, I'm sure a professional colorist such as yourself can do literally anything you want.
You don't even know the production workflow.

Both SDR and HDR are post-production from raw footage captured by a camera. There are cameras able to capture 12-bit. But there wasn't any camera capturing 12-bit for movies in 1975. You think you can magically scan 10-bit or 12-bit from a 1975 film that captures about as much data as 8-bit?

Funny, you talk like you have never graded HDR. You can only watch SDR on OLED and imagine it is HDR. True HDR is not something a 200-nit OLED can display properly.
 
Both SDR and HDR are post-production from raw footage captured by a camera. There are cameras able to capture 12-bit. But there wasn't any camera capturing 12-bit for movies in 1975. You think you can magically scan 10-bit or 12-bit from a 1975 film that captures about as much data as 8-bit?
It seems really strange to be talking about bits when talking about film; a remaster can start from a fresh, up-to-date scan, no?

The 1975 films (or a Tarantino/Nolan movie made in 2020) were not captured digitally with any set number of bits, but on 16/35/65mm film.

And the amount of range in a good film will be hard to beat with digital capture.
 
It seems really strange to be talking about bits when talking about film; a remaster can start from a fresh, up-to-date scan, no?

The 1975 films (or a Tarantino/Nolan movie made in 2020) were not captured digitally with any set number of bits, but on 16/35/65mm film.

And the amount of color range in a good film will be hard to beat with digital capture.
Old films only hold so much data. That's why they need to scan the original to get as much data as possible. They still don't look as detailed as modern footage, even after rescanning and regrading.

Even a digital phone camera can capture way more data now.
 
35 mm is around 5k. Interesting topic.

https://letsenhance.io/blog/all/film-renessainse-or-back-from-dead/


So what exactly is the resolution of film?​


The film resolution doesn’t exist.
We can’t measure the analog film in the same way we do with digital ones.

We can't do it until the film would be captured by the scanning device and turned into the digital pixel coordinate system. After that it's open for usage on Blu-Ray disks, by streaming services or broadcast media.

Why?
Because the film hasn’t got a pixel count. There is no systematic combination of red, blue, green dots onto any sort of grid.
Instead of this, the film has grain. Lots of tiny grains.

Every frame has a unique grain pattern. It gives an illusion of images moving within itself on the contrary with a digital version which is very flat.
The analog film has higher resolution than the most digital sensor cameras we have today.
35 mm translates to 4-16 megapixels depending on the film quality. These 16 megapixels (if the movie was shot on a good film) translates into 4920 x 3264 and it’s about 5K in modern digital equivalents.

Yeap, the old movies shot many years ago have approximately 4-5K of modern quality.
Mesmerizing?
So what quality approximately has 70mm?
If we convert 70mm celluloid film into digital it would be... 13K.
We barely can afford 8K quality in 2019!

Well, wait!
But we have 70mm for IMAX cameras and that’s almost unimaginable quality - 18K. And yeap, “Dunkirk” has 18K of quality. And all other visual masterpieces that were shot on 70mm IMAX film as well.

70mm IMAX frame is about 10 times larger in physical size and about three times larger in final resolution than the iconic 35mm frame.
By the way, 35mm is the standard for theatrical presentation widely known as DCP - Digital Cinema Package.

. . .

https://www.filmfix.com/blog.asp?post=599

Thirty-five-millimeter film has a digital resolution equivalent of approximately 5.6K. (This translates to an image size of about 5,600 × 3,620 pixels.) The finite resolution of film will fluctuate, based on multiple variables (see list below). A film’s image quality depends on its “grain”.
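(A quick sanity check of the pixel counts quoted above, if anyone wants to run the numbers:)

```python
# Megapixel math for the 35mm scan resolutions quoted above.
estimates = {
    "letsenhance 35mm estimate": (4920, 3264),
    "filmfix 35mm estimate": (5600, 3620),
}
for label, (w, h) in estimates.items():
    print(f"{label}: {w} x {h} = {w * h / 1e6:.1f} megapixels")
# Roughly 16.1 MP and 20.3 MP -- the "about 5K" ballpark both articles describe.
```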
 
Old films only hold so much data. That's why they need to scan the original to get as much data as possible. They still don't look as detailed as modern footage, even after rescanning and regrading.

Even a digital phone camera can capture way more data now.
That's a strange statement; they need to scan the original because film has much more data than the older scanners were able to scan in a manageable time (and file size) back in the day. (What would they have done with 8K HDR scans of Kubrick's A Space Odyssey in 1991?) It is just normal to redo it now.

It depends on how well the negative was kept, obviously, but well-shot old 65mm film will have an immense quantity of data, not that dissimilar to the latest Nolan movie. Film and lenses did get better over time, but by the '60s they were already quite good, and innovation slowed down in that era.

I am not so sure a digital phone camera can capture way more data than the 65mm film used by Kubrick.
 
Please feel free to provide quotes/articles from any calibrators/colorists/directors, etc. where they encourage people auto-converting sRGB into HDR. Because just about every article I've ever read from people in these industries trends towards people getting the most accurate at-home experience, and that's without a doubt sRGB if no native HDR version is available. Now, I'll completely agree with you if something is manually and competently converted to HDR (Jaws for example), that's the version to get. But if it's not, and only sRGB is available (which is the case with a lot of old games/TV shows/movies), to a lot of us, sRGB is still preferable if we want the most accurate picture currently available. Auto HDR isn't usually a train wreck and is very watchable/even enjoyable, but there's no reason for me to make something less accurate than it can otherwise look by auto converting, until such time as a properly made HDR version is available.
Gears 5 is used as a showcase of AutoHDR. AutoHDR has an algorithm. It's a matter of time before AutoHDR becomes sophisticated enough to match the level of manual HDR grading. It's a good feature to have, so you have an option to see a higher range instead of not being able to see it at all when native HDR is not supported. And most people won't manually grade HDR to see better content.

Personally, I haven't noticed any flicker on my OLED, and it feels easier on my eyes than the IPS and FALD models I tried. That said, I have no idea at higher APL if this would be an issue or not. It certainly hasn't for me on this monitor with the content I've viewed so far.
It's more like the content you watch is just 100 nits on average, which isn't anywhere near realistic. When the APL goes higher, the OLED flicker will cause eyestrain.

Source for this video? Happy to test it on my LG OLED and see how it looks. I can even compare to my Sony FALD TV.
You need to buy the Spears and Munsil UHD HDR Benchmark to see the image. It's mastered up to 10,000nits.

This (especially check out the first five minutes) had several sunrises and looked gorgeous on my OLED:
The video you show just looks like SDR, where the cloud around the sun is graded at only around 400 nits. Only the sun hits 1,000 nits, and OLED is going to deliver less brightness than 1,000 nits.

It's nothing compared to actual HDR, where the cloud is close to 2,000 nits while the sun is easily over 4,000 nits. The second image has a much higher APL overall.

[attached HDR comparison image]


You keep saying that, but you're also not providing a compelling argument. "Wait for more content"? There's plenty of great HDR content already.
Content can look even better if you grade the HDR to match the range of the monitor. Most content is just 100 nits on average and won't show 10-bit color. It's just the tip of the iceberg of HDR. High-APL HDR won't be distributed, because not many monitors can properly display the images, unless you grade the image yourself.
 
That's a strange statement; they need to scan the original because film has much more data than the older scanners were able to scan in a manageable time (and file size) back in the day. (What would they have done with 8K HDR scans of Kubrick's A Space Odyssey in 1991?) It is just normal to redo it now.

It depends on how well the negative was kept, obviously, but well-shot old 65mm film will have an immense quantity of data, not that dissimilar to the latest Nolan movie. Film and lenses did get better over time, but by the '60s they were already quite good, and innovation slowed down in that era.

I am not so sure a digital phone camera can capture way more data than the 65mm film used by Kubrick.
Higher resolution doesn't mean it has less noise. It doesn't mean it has more color. It still has limited data, the same as 8-bit or even less.

Technology has advanced for almost half a century since 1975. Old cameras cannot beat the modern ones, or they would've kept using the old stuff.

Remastered HDR movies still look less detailed than modern graded HDR.

It is easy for modern 4K to beat remastered 4K, as the sheer amount of noise just destroys data. It's not an artistic intent. If you've tried to grade movies, you will find that the less noise there is, the better the range the footage can be graded to.

2001: A Space Odyssey (1968) [attached screenshot]

Prospect [attached screenshot]
 
Higher resolution doesn't mean it has less noise. It doesn't mean it has more color. It still has limited data, the same as 8-bit or even less.
Why would film be limited by a set amount of bits?

https://petapixel.com/2019/05/02/film-vs-digital-this-is-how-dynamic-range-compares/

Really good, well-shot film can do 12-15 stops or so of range; today's HDRX RED cameras will do around 13 (https://www.red.com/red-101/hdrx-high-dynamic-range-video#:~:text=Dynamic range is typically specified,comparable to color film negatives.), so it is only recently that digital cameras can match film in that regard.
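(For anyone not used to thinking in stops: each stop is a doubling of light, so the contrast ratios those figures imply work out roughly like this:)

```python
# A stop is a doubling, so n stops of dynamic range is roughly a 2**n : 1 ratio
# between the brightest and darkest usable detail.
for stops in (10, 12, 13, 15):
    print(f"{stops} stops -> about {2 ** stops:,}:1")
# 10 -> 1,024:1, 12 -> 4,096:1, 13 -> 8,192:1, 15 -> 32,768:1
```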

Technology has advanced for almost half a century since 1975. Old cameras cannot beat the modern ones, or they would've kept using the old stuff.
Tarantino's The Hateful Eight used the same lenses that Ben-Hur did in 1959, and it is one of the best-looking movies of all time. Yes, film/lenses got better since 1975, but at the highest end not by that much; it was already quite a mature medium by then, and it is not like Kodak has been spending that much on making film better since 2000.

It is quite the ordeal to use; the reasons not to shoot on old-school 65mm are not about image quality, at least until extremely recently. The cameras are giant and make a lot of noise, film of that size takes up so much space that it limits how long a shot you can take and needs to be reloaded all the time, and you need an expensive scan.

There is a long list of advantages to using digital beyond the best possible image quality (scenes are often easier to light for digital, for example).
 
Why would film be limited by a set amount of bits?


Tarantino's The Hateful Eight used the same lenses that Ben-Hur did in 1959, and it is one of the best-looking movies of all time. Yes, film/lenses got better since 1975, but at the highest end not by that much; it was already quite a mature medium by then, and it is not like Kodak has been spending that much on making film better since 2000.

It is quite the ordeal to use; the reasons not to shoot on old-school 65mm are not about image quality, at least until extremely recently. The cameras are giant and make a lot of noise, film of that size takes up so much space that it limits how long a shot you can take and needs to be reloaded all the time, and you need an expensive scan.
If the film has that much noise, it won't look anywhere better than 8-bit. Isn't the movie you mentioned just SDR? They look fine and arty, but nowhere near realistic.
 
If the film has that much noise, it won't look anywhere better than 8-bit. Isn't the movie you mentioned just SDR? They look fine and arty, but nowhere near realistic.
Why do you say that the film has that much noise?
https://www.indiewire.com/2015/12/w...ionist-with-quentin-tarantino-watching-40956/

Yes, they did not make an HDR scan of it, and considering the difference between the film projection and the scan, it could be worth doing one day. All that to say that good film captures an impressive range, one that digital has only recently matched.
 
Gears 5 is used as a showcase of AutoHDR. AutoHDR has an algorithm. It's a matter of time before AutoHDR becomes sophisticated enough to match the level of manual HDR grading. It's a good feature to have, so you have an option to see a higher range instead of not being able to see it at all when native HDR is not supported. And most people won't manually grade HDR to see better content.

I'm sure it'll improve with time, AI, etc. And more and more things will support HDR natively as well. But manual HDR is created with intent. The creator can go in and choose what elements are at what brightness, etc. That's the only type of HDR I'm particularly interested in. Auto HDR is fine for those who like it, and I can see the appeal. I'd just rather know, with an SDR source, that I'm seeing things that aren't improperly blown out and are shown as intended. (Similarly, I don't like things like AI frame generation to make motion smoother, etc. I just prefer things to be as close to the source release as possible. It's a personal preference.)

It's more like the content you watch is just 100 nits on average, which isn't anywhere near realistic. When the APL goes higher, the OLED flicker will cause eyestrain.

I have yet to hear about significant problems with OLED eyestrain. It seems like you're saying it doesn't happen because OLED doesn't get bright enough. Even if that is the case, at the levels this OLED can do now (it's supposed to approach 1000 nits for HDR highlights), I stand by it being a bit easier on my eyes than the IPS panels I tried. And I've noticed no flicker problems in either SDR or HDR. I'm sometimes sensitive to such things, so if it becomes a problem, I'd definitely note it.

You need to buy the Spears and Munsil UHD HDR Benchmark to see the image. It's mastered up to 10,000nits.

I'm not opposed to buying that (I'd need to know which version), though I don't have a Blu Ray player on my PC right now, so I'd need to get one of those also. I'm not opposed to getting an external drive tho'. If there are any online HDR samples that showcase what you're saying, it'd be easier for me to look at those, but whatever works.


The video you show just looks like SDR, where the cloud around the sun is graded at only around 400 nits. Only the sun hits 1,000 nits, and OLED is going to deliver less brightness than 1,000 nits.

There is a profound difference between viewing it in SDR and HDR with that video, and it looks lovely on OLED in HDR. (It does look even better on FALD, for the most part, though there's also some blooming, especially in that first image with the red liquid, and it still looks great on both technologies). I was just looking for something in HDR with a similar sun scene. That said, if you can show me a video with a higher APL, I'm happy to take a look. There are quite a few HDR videos on YouTube these days - feel free to reference any you think will illustrate your point.

It's nothing compared to actual HDR, where the cloud is close to 2,000 nits while the sun is easily over 4,000 nits. The second image has a much higher APL overall.

[attached HDR comparison image]


Content can look even better if you grade the HDR to match the range of the monitor. Most content is just 100 nits on average and won't show 10-bit color. It's just the tip of the iceberg of HDR. High-APL HDR won't be distributed, because not many monitors can properly display the images, unless you grade the image yourself.

Well, right, when you're talking about videos that go up to 10,000 nits, practically nothing can support that. Even good FALD TVs don't often approach 4,000 nits. That's why tone mapping is so useful with HDR to make it fit in the display's capabilities. But things have come a long way toward providing a good experience regardless of technology. I've already conceded HDR will never look as spectacular on an OLED as on a much brighter FALD display, but these days it can still look very good and impactful enough to enjoy HDR content.
 
There is a profound difference between viewing it in SDR and HDR with that video, and it looks lovely on OLED in HDR. (It does look even better on FALD, for the most part, though there's also some blooming, especially in that first image with the red liquid, and it still looks great on both technologies). I was just looking for something in HDR with a similar sun scene. That said, if you can show me a video with a higher APL, I'm happy to take a look. There are quite a few HDR videos on YouTube these days - feel free to reference any you think will illustrate your point.
Even if there is a link to an HDR1000 video, it won't show correctly on a monitor that does less than HDR1000. YouTube will automatically tone map as well, based on the range of the monitor.
I have posted links before. You need at least a true HDR1000 monitor to correctly view these. They are not something OLED can display.





Well, right, when you're talking about videos that go up to 10,000 nits, practically nothing can support that. Even good FALD TVs don't often approach 4,000 nits. That's why tone mapping is so useful with HDR to make it fit in the display's capabilities. But things have come a long way toward providing a good experience regardless of technology. I've already conceded HDR will never look as spectacular on an OLED as on a much brighter FALD display, but these days it can still look very good and impactful enough to enjoy HDR content.
A 10,000-nit sun cannot be seen for now, but you can at least see a lot more of the 1,800-nit cloud around it, instead of just seeing a 1,000-nit sun with 400-nit clouds, similar to SDR. Tone mapping has its limits. It shrinks the intended range. Sometimes it is better just to let the image clip.
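To show the trade-off, here is a rough sketch of roll-off versus hard clipping for a display that peaks at 1,000 nits. The knee and curve are illustrative only, not any specific TV's tone-mapping behavior:

```python
# Two ways a 1,000-nit display can handle content mastered brighter than its panel.
# Hard clipping keeps everything below the peak untouched but flattens detail above
# it; a roll-off starts compressing below the peak so highlight gradation survives.
# Numbers are illustrative, not any real display's curve.

def hard_clip(nits: float, peak: float = 1000.0) -> float:
    return min(nits, peak)

def rolloff(nits: float, peak: float = 1000.0, knee: float = 700.0) -> float:
    """Linear up to the knee, then asymptotically approach the panel peak."""
    if nits <= knee:
        return nits
    excess = nits - knee
    return knee + (peak - knee) * excess / (excess + (peak - knee))

for n in (100, 700, 1000, 1800, 4000):
    print(f"{n:>5} nits mastered -> clip {hard_clip(n):6.0f}, roll-off {rolloff(n):6.1f}")
```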
 
Why do you say that the film has that much noise?
https://www.indiewire.com/2015/12/w...ionist-with-quentin-tarantino-watching-40956/

Yes, they did not make an HDR scan of it, and considering the difference between the film projection and the scan, it could be worth doing one day. All that to say that good film captures an impressive range, one that digital has only recently matched.
[attached screenshots from The Hateful Eight]

I've already said that just because it was shot at high resolution doesn't mean it will have less noise.

This movie has as much noise as 2001: A Space Odyssey. No wonder it wasn't graded to HDR, as this amount of noise prevents the color from being stretched to a higher range. When you start to stretch the color, the noise is going to be even more disturbingly visible.
 
Haven't followed the last several replies in detail but at a glance I'd just like to remind people that Tarantino usually goes for a nostalgic look to his films so that kind of thing could be intentional.
 
Haven't followed the last several replies in detail but at a glance I'd just like to remind people that Tarantino usually goes for a nostalgic look to his films so that kind of thing could be intentional.
You'd better find an old film that looks better than recent 4K UHD, then.
 
Can we just end this silly debate about film being limited to 8-bit sRGB? Film has a much wider color space. Here's Sony's own CIE chart showing their current 8K digital video camera only barely having a wider color space than print film.

Notice where ITU-709 (aka Rec. 709) is in the picture?

Like LukeTbk mentioned, the tech used to make a film scan in the first place has to support the output desired. If a film was scanned before such tech was available, then any subsequent digital masters/restorations made using that scan will be based on a limited representation of the original film.

[attached: Sony CIE chart]
 
Even if there is a link to an HDR1000 video, it won't show correctly on a monitor that does less than HDR1000. YouTube will automatically tone map as well, based on the range of the monitor.
I have posted links before. You need at least a true HDR1000 monitor to correctly view these. They are not something OLED can display.





A 10,000-nit sun cannot be seen for now, but you can at least see a lot more of the 1,800-nit cloud around it, instead of just seeing a 1,000-nit sun with 400-nit clouds, similar to SDR. Tone mapping has its limits. It shrinks the intended range. Sometimes it is better just to let the image clip.


Thanks for the links. I viewed both of these (the first in full; the second certain segments) on both my 75" Sony Z9F FALD television (RTings reviewed the 65-inch model and estimated real-scene HDR brightness at about 1,640 nits) and my new 27" LG OLED monitor (which can do highlights up to 1000 nits, but of course not full-scene nearly that high).

The bottom line is both were a pleasant experience and looked very good. It's true the HDR on the Sony is brighter, so it looked more impactful as expected (but also with a little bit of blooming, though not enough to really bother me). But the OLED viewing was still quite pleasant and decently impactful, not nearly the drastic downgrade it's presented as. I didn't feel like any scenes had a major tonal shift on the OLED - the tone mapping involved did a good job. I didn't see any scenes where the display seemed to unnaturally dim down/engage ABL in a distracting manner either. If I were to sit down and watch a movie like this, would my choice be the Sony? Absolutely! There's a wow factor there that's hard to match. But is it completely watchable/enjoyable on the OLED as well? Absolutely, and if I wasn't comparing between the two, I might not even notice. Given the varied use of my monitor for my PC, lots of work as well as gaming, SDR and HDR, I still feel like the slightly lower brightness for HDR is worthwhile and still gives me a really good experience when I do use HDR vs. the problems I had with the IPS/FALD display. For me, it does everything pretty well, including HDR, and has the fewest compromises that bother me for my needs. The blooming problems I had with the FALD IPS, as well as the poorer viewing angles (which my Sony doesn't really share, though viewing angles still aren't OLED good) were much bigger drawbacks (and then I had some quality control issues with that monitor on top of that).

I mean, I'd love a monitor with perfect viewing angles, perfect blacks, and high brightness, but they simply don't exist. Like others have said, until there's a perfect tech that manages to achieve the best of both technologies, it's always going to be a game of compromises. For me, the HDR this OLED is capable of is quite good with a nice amount of impact and decent reproduction of HDR material, even if using tone mapping and it can't display the absolute nits that would be ideal. It's certainly come a long ways from truly low-nit OLEDs.
 
Thanks for the links. I viewed both of these (the first in full; the second certain segments) on both my 75" Sony Z9F FALD television (RTings reviewed the 65-inch model and estimated real-scene HDR brightness at about 1,640 nits) and my new 27" LG OLED monitor (which can do highlights up to 1000 nits, but of course not full-scene nearly that high).

The bottom line is both were a pleasant experience and looked very good. It's true the HDR on the Sony is brighter, so it looked more impactful as expected (but also with a little bit of blooming, though not enough to really bother me). But the OLED viewing was still quite pleasant and decently impactful, not nearly the drastic downgrade it's presented as. I didn't feel like any scenes had a major tonal shift on the OLED - the tone mapping involved did a good job. I didn't see any scenes where the display seemed to unnaturally dim down/engage ABL in a distracting manner either. If I were to sit down and watch a movie like this, would my choice be the Sony? Absolutely! There's a wow factor there that's hard to match. But is it completely watchable/enjoyable on the OLED as well? Absolutely, and if I wasn't comparing between the two, I might not even notice. Given the varied use of my monitor for my PC, lots of work as well as gaming, SDR and HDR, I still feel like the slightly lower brightness for HDR is worthwhile and still gives me a really good experience when I do use HDR vs. the problems I had with the IPS/FALD display. For me, it does everything pretty well, including HDR, and has the fewest compromises that bother me for my needs. The blooming problems I had with the FALD IPS, as well as the poorer viewing angles (which my Sony doesn't really share, though viewing angles still aren't OLED good) were much bigger drawbacks (and then I had some quality control issues with that monitor on top of that).

I mean, I'd love a monitor with perfect viewing angles, perfect blacks, and high brightness, but one simply doesn't exist. Like others have said, until there's a perfect tech that manages to achieve the best of both technologies, it's always going to be a game of compromises. For me, the HDR this OLED is capable of is quite good, with a nice amount of impact and decent reproduction of HDR material, even if it relies on tone mapping and can't display the absolute nits that would be ideal. It's certainly come a long way from truly low-nit OLEDs.
YouTube applies automatic tone mapping. Once the image is tone mapped, it has a lower APL.

It looks like SDR on OLED. It's probably fine for you, but I don't want to see images that still sit in the SDR range, without 10-bit color.
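(For anyone following along, "tone mapping" here just means rolling the source's peak luminance down into what the panel can actually reach, which is why the average level drops. A minimal sketch of the idea; the 800-nit peak and the knee point are arbitrary assumptions for illustration, not any display's or YouTube's actual algorithm:)

```python
# Minimal illustration (not any real display's algorithm): roll source
# luminance above a knee point into the panel's peak, then compare the
# average picture level (APL) before and after tone mapping.

def tone_map(nits, display_peak=800.0, knee=0.75):
    """Pass-through below the knee, smooth roll-off above it."""
    k = knee * display_peak
    if nits <= k:
        return nits
    # Compress everything above the knee into the remaining headroom.
    excess = nits - k
    headroom = display_peak - k
    return k + headroom * (excess / (excess + headroom))

# Toy "frame": mostly midtones with a few bright highlights (in nits).
frame = [50, 80, 120, 200, 350, 1000, 1500, 4000]

mapped = [tone_map(n) for n in frame]
apl_before = sum(frame) / len(frame)
apl_after = sum(mapped) / len(mapped)

print(f"APL before: {apl_before:.0f} nits, after: {apl_after:.0f} nits")
# Highlights above the knee get squeezed toward the 800-nit peak,
# so the frame's average level ends up lower than the source's.
```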
 
It really doesn't look like SDR. The difference is immediately apparent.

I'll agree it'd be nice if tone mapping wasn't necessary, but as you yourself said, it is necessary on the majority of displays right now. Some day hopefully that won't be the case, but considering I was able to test on both a relatively high-nit display (that wouldn't have tone mapping, or at least not as aggressively, and you can see a difference) and this one, I found the results quite pleasing. Yes, better on the Sony, but the OLED still looks good. I've had similar results in games.

I understand liking the extra brightness a FALD affords you and agree it's hard to equal high-nit HDR for wow factor, but I still think OLED HDR can look quite good and impactful. We only disagree on which tradeoffs are more important in our respective cases, and on that we'll just have to agree to disagree. I can't wait for the day where we can have all the advantages of both technologies in one display.
 
I have two things to say:

1. Film has a ton of resolution, color, and visual detail, and glass for lenses has been very high quality for decades. It's only recently that digital has been able to match or surpass film, and even then only technically, because film is still very nice to view.
To get a film onto a Blu-ray or a video stream, you have to make a digital scan of it. So... you have to take a picture of the film. Think about how difficult it can be to take a good picture of something.

A scan of a film can only be as good as the scanner hardware and how carefully the team used it (the technique of the photographer is just as important as the quality of his camera). A lot of early digital scans have really "hot" highlights and added blooming to the picture, for one example of how things can go wrong.


2. Brightness isn't the only thing that makes HDR look good. It's right in the acronym: 'dynamic range'.

OLED has an 'infinite' black level, which tends to give the picture more dynamics and allows HDR at lower brightness to still have visual impact, because you still have a pretty wide dynamic range even though the peak brightness isn't as brilliant. What I mean is, HDR500 or whatever will look incredible on an OLED compared to HDR500 on an IPS or VA panel with zone lighting. And even though HDR can be very brilliant on a high-nit FALD display, the impeccable black-level detail of an OLED can still lend a certain depth of detail to the overall image.
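(To put rough numbers on that: dynamic range is just the ratio between the brightest and darkest levels a panel can show at once, often counted in stops. A minimal sketch in Python; the peak and black-level figures are illustrative assumptions, not measurements of any particular display:)

```python
import math

def stops(peak_nits, black_nits):
    """Dynamic range in photographic stops (doublings of luminance)."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers only, not measurements of any specific monitor.
displays = {
    "500-nit OLED (per-pixel off, ~0.0005 nit black)": (500, 0.0005),
    "500-nit FALD LCD (zone-limited ~0.05 nit black)": (500, 0.05),
    "1500-nit FALD LCD (zone-limited ~0.05 nit black)": (1500, 0.05),
}

for name, (peak, black) in displays.items():
    print(f"{name}: ~{stops(peak, black):.1f} stops")
# The OLED's near-zero black level buys it a wider range even at a
# lower peak, which is why low-brightness HDR can still have impact.
```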
The thing is, that doesn't matter as much anymore, because a lot of movies are being color graded so that they don't have a lot of range in the low tones and high tones. They're graded to be 'punchy' and high contrast, crushing a lot of detail together and limiting the color range, which lends itself to looking great on a high-nit FALD display.
 
You don't even know the production workflow.
I don't? Maybe not. I'm sure you could write a book about all the things I don't know...
You certainly don't. Now I'm not like you, so I will actually back my claims up. Not for your sake, but for anyone who might be misled by your nonsense.
Both SDR and HDR are post-production grades of raw footage captured by a camera.
Or scanner, yes. Quite contrary to your previous claims about stretching 8-bit sRGB video to HDR...
There are cameras able to capture 12bit.
Funny when Arri, RED and Sony all make cameras that capture at least 16-bit raw footage...

https://www.red.com/v-raptor-black?quantity=1&package=2
https://www.arri.com/en/learn-help/arri-camera-technology/best-overall-image-quality
https://pro.sony/ue_US/products/digital-cinema-cameras/venice

But there wasn't any camera capturing 12-bit for movies in 1975.
Very true. Digital cinema cameras were not really that popular in 1975...
https://en.wikipedia.org/wiki/Digital_movie_camera
You think you can magically scan 10-bit or 12-bit from a 1975 film that captures as much data as 8-bit?
No. It's not magic. It's very real and very possible.
https://www.arri.com/en/camera-systems/archive-solutions/arriscan-xt
https://lasergraphics.com/director.html
https://www.cinelab.co.uk/film-scanning
Funny, you talk like you have never graded HDR. You can only watch SDR on OLED and imagine it is HDR. True HDR is not something a 200-nit OLED can display properly.
And there goes the broken record again. More useless mudslinging and misinformation. Once again, you are the one obsessed with "Adobe RGB 400 nits" make-believe "enhanced" videos. You really should stick to what you know, which is what... downloading SDR videos off of Youtube and "enhancing" them with your professional HDR regrades..?
And if you haven't heard of it (but of course you have) you should check out SweetFX. A decade ago or something like that it was all the rage with the kids that thought their games looked too dull..
You wouldn't happen to have made this, would you?
https://sfx.thelazy.net/games/preset/8868/
THE GAME IS MORE BETTER WITH REAL COLOUR WITH THIS RESHADE
Sure sounds awfully familiar...
 
This is a very interesting discussion and I'll leave this link here.



The guy arguing above is factually correct but can't seem to grasp preference/subjectivity. Although OLED is incapable of doing HDR justice today, it still looks great to many people's eyes vs. LCD. For HDR games specifically, OLED really only provides an SDR+ experience, but many of us accept that for the deep blacks/contrast and pixel response.
 
Even 1,500 nits in whatever-% windows (and less in others) will look like SDR+ compared to future HDR4000 and HDR10,000 screens.

You forgot a huge aspect of the fundamental nature of each display tech besides those:

the value of per-pixel emissive, razor's-edge, side-by-side contrast and side-by-side per-pixel detail in color, rather than large zones. Each zone on a 1,344-zone FALD screen has about 6,171 pixels in it and is pitted against all of the other surrounding zones of ~6,171 pixels; it may even be more pixels per zone if any edge lights are being counted as zones too. 1,152 zones = 7,200 pixels per zone (or more), and even more pixels per zone on an 8K screen with the same zone count. You also downplay the blooming and dimming and the losses that causes in fine detail and highly contrasted scenes. There is also the fact that many FALD screens have matte coating/layer treatments that raise blacks no matter what screen they are on (LCD or OLED), lose the fully glossy, clear, saturated "wet ice" look to more of a hazy frost, and can compromise fine detail slightly.
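(The pixels-per-zone math above is just total pixel count divided by zone count. A quick sketch for a 4K panel, taking the zone counts mentioned in this thread as given:)

```python
# Pixels per dimming zone on a 4K (3840x2160) panel for the zone
# counts mentioned above; every pixel in a zone shares one backlight level.
pixels_4k = 3840 * 2160  # 8,294,400

for zones in (1344, 1152):
    print(f"{zones} zones -> ~{pixels_4k // zones:,} pixels per zone")

# An OLED, by contrast, effectively has one "zone" per pixel:
print(f"OLED: {pixels_4k:,} zones of 1 pixel each")
```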

I agree that higher HDR and color are going to be the way to go in the future once these tradeoffs are nullified across the board - but these tradeoffs will continue to exist until another per-pixel emissive display type replaces both FALD and OLED.

It's tradeoffs either way, until we get another per-pixel emissive display type that can do everything better (and by then it will hopefully be well over 2,000 nits, if temperatures can be handled without aggressive ABL all over again). Per-pixel emissive display methods are clearly the superior way to display things, all other factors aside. Once we get more per-pixel emissive display technologies like microLED in the enthusiast consumer space, no one will ever do "zones" again if they can help it, let alone zones of 7,000 to 15,000 pixels each, with dim/glow balancing between the large zones affecting the whole cell area's worth of thousands and thousands of pixels.

I can still use a FALD, or an edge-lit display, or a matte layer treatment, etc. if I have to, but I wouldn't seek one out as my media + gaming screen currently if I had an OLED option. If set on LCD, or forced to use LCD, I similarly wouldn't want to go back to simple edge-lit displays flashlighting from the sides, as FALD is at least clearly better at lighting and dimming pixels than edge-lit. FALD is still flashlighting and dimming in an ice-tray grid against all of the other large zones, though, whereas OLED lights and darkens each pixel down to a razor's edge, with contrast down to ultra black at the pixel level, plus individual pixels of detail in color and in darks - pixel side by side by side, times 8,294,400 pixels.

Both FALD and OLED have strengths and weaknesses, I think everyone can agree. Eventually we'll get per-pixel emissive technologies like microLED that should combine the best of both worlds, and better, but this is what we have for now.
 
This is a very interesting discussion and I'll leave this link here.



The guy arguing above is factually correct but can't seem to grasp preference/subjectivity. Although OLED is incapable of doing HDR justice today, it still looks great to many people's eyes vs. LCD. For HDR games specifically, OLED really only provides an SDR+ experience, but many of us accept that for the deep blacks/contrast and pixel response.

I actually love Vincent/HDTVtest's stuff. He does really good reviews and deep dives. I know he also likes OLED displays - he's just saying it doesn't do HDR justice without tone mapping (which is off for that video), which like you said is factually correct, and points out that sometimes applying tone mapping can change the tone of the image. (That's why in a previous post I compared on two displays to see how much I felt my OLED had to compensate, and I was pleased that there wasn't that drastic a tone shift even though of course the FALD still looked brighter).

What's exciting to me is that even though that video is only a couple years old, the new OLED monitors can actually exceed the brightness levels of what he's talking about there, with all the advantages you mention.

But yeah, I can't imagine the point where HDR is hitting 4,000+ nits on consumer screens. It'll be insane.
 
It really doesn't look like SDR. The difference is immediately apparent.

I'll agree it'd be nice if tone mapping wasn't necessary, but as you yourself said, it is necessary on the majority of displays right now. Some day hopefully that won't be the case, but considering I was able to test on both a relatively high-nit display (that wouldn't have tone mapping, or at least not as aggressively, and you can see a difference) and this one, I found the results quite pleasing. Yes, better on the Sony, but the OLED still looks good. I've had similar results in games.
OLED only shows SDR. Of course there is a difference if you only compare it with sRGB at 80 nits. But 400-600 nit SDR with a wide gamut can look better than OLED "HDR," because OLED HDR can be even dimmer than SDR. You can imagine what microLED could do if FALD LCD SDR can already look better than dim OLED HDR. And OLED cannot use a better colorspace in SDR mode.

An OLED monitor like the LG 27GR95QE only shows a sun as bright as 350 nits; the AW3423DW only shows 460 nits. The brightness is miserable. The sun should be 1,000 nits, and every object around the sun should have higher brightness and more color as well. The OLED doesn't show 10-bit color in this scene; it just looks like average 200-nit, 8-bit SDR. Just like the comparison I did before, 400-nit SDR from the PG35VQ, with only a 512-zone FALD LCD, can look better than the AW3423DW.
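(Some context on the 10-bit point: HDR10 encodes absolute luminance with the PQ curve from SMPTE ST 2084, so you can estimate how far up the signal range a given highlight sits. A rough sketch using the standard PQ constants; the nit levels plugged in are simply the ones being argued about here, not measurements of mine:)

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute luminance in nits -> signal 0..1.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits):
    y = min(max(nits / 10000.0, 0.0), 1.0)  # PQ is referenced to 10,000 nits
    yp = y ** m1
    return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

for nits in (100, 350, 460, 1000, 4000):
    v = pq_encode(nits)
    print(f"{nits:>5} nits -> PQ signal {v:.3f} (10-bit code ~{round(v * 1023)})")
# e.g. 1000 nits lands around 0.75 of the PQ range, while 350-460 nits sits
# around 0.64-0.67; a brighter panel uses more of the encoded range, but the
# 10-bit container is the same either way.
```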

27GR95QE:
[attached image: screenshot from "World's First LG 2K 240Hz WOLED Review丨LG 27GR95QE Gaming Display Comprehensive Evaluation"]

AW3423DW:
[attached image]
This APL in the BBC test scene isn't even that bright compared to the video I posted. The sun in that video is dimmer than 350 nits on OLED due to massive ABL.
[attached image]


It's ironic that people are laser-focused on the mere accuracy of SDR sRGB at 80 nits while ignoring HDR accuracy. This only results in:
1. SDR that can look better than dim OLED HDR.
2. Watching SDR but thinking it is HDR.

Higher range is always better.
 
I don't? Maybe not. I'm sure you could write a book about all the things I don't know...
You certainly don't. Now I'm not like you, so I will actually back my claims up. Not for your sake, but for anyone who might be misled by your nonsense.

Or scanner, yes. Quite contrary to your previous claims about stretching 8-bit sRGB video to HDR...

Funny when Arri, RED and Sony all make cameras that capture at least 16-bit raw footage...

Very true. Digital cinema cameras were not really that popular in 1975...

No. It's not magic. It's very real and very possible.

And there goes the broken record again. More useless mudslinging and misinformation. Once again, you are the one obsessed with "Adobe RGB 400 nits" make-believe "enhanced" videos. You really should stick to what you know, which is what... downloading SDR videos off of Youtube and "enhancing" them with your professional HDR regrades..?
And if you haven't heard of it (but of course you have) you should check out SweetFX. A decade ago or something like that it was all the rage with the kids that thought their games looked too dull..
You wouldn't happen to have made this, would you?

Sure sounds awfully familiar...
Except there isn't any old remastered movie that looks as good as modern 4K HDR. A camera is a system for processing images; it's not just about how large a 70mm lens is. It's useless to have high resolution if noise destroys the data.

How nice that a film production shot with a 70mm camera ends up as a 1080p release with tons of noise.

And color is lifted by brightness. 400-nit Adobe RGB SDR can beat OLED's ABL-limited HDR. OLED just doesn't have enough brightness. You only see SDR on OLED.
 
OLED only shows SDR. Of course there is a difference if you only compare it with sRGB at 80 nits. But 400-600 nit SDR with a wide gamut can look better than OLED "HDR," because OLED HDR can be even dimmer than SDR. You can imagine what microLED could do if FALD LCD SDR can already look better than dim OLED HDR. And OLED cannot use a better colorspace in SDR mode.

An OLED monitor like the LG 27GR95QE only shows a sun as bright as 350 nits; the AW3423DW only shows 460 nits. The brightness is miserable. The sun should be 1,000 nits, and every object around the sun should have higher brightness and more color as well. The OLED doesn't show 10-bit color in this scene; it just looks like average 200-nit, 8-bit SDR. Just like the comparison I did before, 400-nit SDR from the PG35VQ, with only a 512-zone FALD LCD, can look better than the AW3423DW.

I'll keep this brief because at this point we're going around in circles. In short, I don't agree with you that OLED can't display HDR. No, it's not as impactful as on a FALD display, but it's still more impactful than SDR and well worth using in my opinion. (And I don't watch SDR at 80 nits... I watch it at 160 nits.) Tone mapping makes the HDR "good enough" until there are panels that can do everything well, pixel-level control AND brightness. Clearly your opinion differs, and that's fine, but I have been quite pleased by all of the content I have tried. Arguing that SDR set to an unreasonable brightness (who watches SDR at 400-600 nits unless in an exceptionally bright room?) can top HDR doesn't make a lot of sense to me. That's just a recipe for headaches/eyestrain.


This APL in the BBC test scene isn't even that bright compared to the video I posted. The sun in that video is dimmer than 350 nits on OLED due to massive ABL.


It's ironic that people are laser-focused on the mere accuracy of SDR sRGB at 80 nits while ignoring HDR accuracy. This only results in:
1. SDR that can look better than dim OLED HDR.
2. Watching SDR but thinking it is HDR.

Higher range is always better.

I care about sRGB accuracy because that's generally where I'm viewing photos, web content, and SDR games. I want to view things as they are intended (albeit with a touch of extra brightness, at 160 nits), not an oversaturated version of them or an automatic HDR implementation that might get subtle things wrong.

HDR accuracy for native HDR is important to me, sure, but tone mapping helps quite a bit with that, and the only monitors that can display HDR without some form of tone mapping have other issues I consider more undesirable for accuracy (such as blooming with local dimming on, which was a major downside for desktop work, as well as imperfect black levels with local dimming off and poorer viewing angles).

Higher range objectively isn't always better. A photograph or piece of artwork shown in the wrong range may look more colorful, but if those aren't the right colors, that doesn't matter.
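(A concrete example of "more colorful but wrong": the same RGB code values land on different colors depending on which gamut the display assumes. A tiny sketch comparing published red-primary chromaticities, rather than a full color-managed conversion:)

```python
# The same "pure red" code value (255, 0, 0) lands on different CIE xy
# chromaticities depending on which gamut the display assumes.
primaries_red = {
    "sRGB / Rec.709": (0.640, 0.330),
    "DCI-P3":         (0.680, 0.320),
    "Rec.2020":       (0.708, 0.292),
}

for gamut, (x, y) in primaries_red.items():
    print(f"{gamut}: red primary at x={x:.3f}, y={y:.3f}")

# Feed sRGB content to a display running a wide-gamut mode without color
# management and every saturated color gets pushed outward like this:
# more vivid, but not the color the content was mastered as.
```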
 