Has HDR been standardized already? Is there an "HDR reference image"?

OpenSource Ghost

I have been out of the display calibration world ever since HDR came out. Has HDR been standardized in any way? Is there actually a reference HDR image that consumer displays are capable of producing after calibration? I remember arguments that consumer HDR displays are not capable of the tone mapping used by studio/reference HDR displays. Is that still the case? There are also sub-types of HDR and color gamut differences... What is the actual reference HDR standard?
 
Hello,
As the author of an upcoming (working) TestUFO HDR motion test for as-yet unreleased HDR-compatible web browsers, I have been studying this matter in a huge crash course;

And OMG, it's a rabbit hole the size of a large black hole.
Consequently, it is hard to come up with standard HDR test patterns that work across everything.

There are a large number of HDR colorspace standards
  • DCI-P3
  • Rec 2020
  • etc.
And lots of brand names
  • Dolby Vision
  • HDR10+
  • etc.
And lots of programming colorspace bytepacking formats (GPU framebuffers):
  • DCI-P3 encoded as three IEEE754 FP16 floats
  • REC2100 encoded as three IEEE754 FP16 floats, Perceptual Quantization (PQ)
  • REC2100 encoded as three IEEE754 FP16 floats, Hybrid Log Gamma (HLG)
  • etc.
And lots of metadata/transform/SMPTE standards
  • ST 2094-10
  • ST 2094-40
  • ST 2086:2018
  • etc.
And lots of display quirks that vary by display
  • Linear nit behavior (1:1 to the pixel data)
  • Dynamic behavior (if too many max-nit pixels, whole display dims to prevent APL from getting extreme)
  • Contouring behavior (dimming of pixels around ultra-high-brightness HDR pixels)
  • Adjustable windowing behaviors (user preference... you don't want 10,000-nit pixels to be more than 1% of pixels, e.g. sun reflections off chrome, nighttime neon, etc.)
  • Clipping behaviors vs curve behaviors (of various kinds and algorithms)
  • etc.
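To make the PQ item above concrete, here's a minimal Python sketch of the SMPTE ST 2084 (PQ) transfer function those framebuffer formats rely on. The constants come from the ST 2084 spec; the function names and the little demo at the bottom are just my own illustration, not any particular SDK:

```python
# Minimal sketch of SMPTE ST 2084 (PQ): absolute luminance (0..10,000 nits) <-> 0..1 signal.
# Constants come from the ST 2084 spec; function names are just for illustration.

M1 = 2610 / 16384           # 0.1593017578125
M2 = 2523 / 4096 * 128      # 78.84375
C1 = 3424 / 4096            # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32       # 18.6875
PQ_PEAK_NITS = 10_000.0

def pq_encode(nits: float) -> float:
    """Inverse EOTF: absolute luminance in nits -> PQ signal in 0..1."""
    y = max(nits, 0.0) / PQ_PEAK_NITS
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

def pq_decode(signal: float) -> float:
    """EOTF: PQ signal in 0..1 -> absolute luminance in nits."""
    sp = max(signal, 0.0) ** (1.0 / M2)
    return (max(sp - C1, 0.0) / (C2 - C3 * sp)) ** (1.0 / M1) * PQ_PEAK_NITS

if __name__ == "__main__":
    for nits in (0.05, 100, 1000, 10_000):
        s = pq_encode(nits)
        # Full-range 10-bit quantization shown for illustration; real video signals
        # usually use narrow/limited range.
        print(f"{nits:>8} nits -> PQ signal {s:.4f} -> 10-bit code {round(s * 1023)}")
```

Notice that PQ is absolute: a given code value always means a specific nit level, which is exactly why displays that can't reach those nits have to resort to the quirky behaviors listed above.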
However, proprietary test patterns are booming now.
Many different vendors have come up with their own test patterns.
Consumer displays have varying abilities to do HDR; the best available consumer HDR displays are currently the OLED displays, as well as the MiniLED locally-dimmed LCD displays (e.g. the new 4K TVs, or even mobiles like the new iPad Pro with its 2,596 local dimming zones and 1,600-nit peak brightness).

Very hard to Coles-Notes it all, but we're thinking of Blur Busters-style explainers in 2023.
 
Whatever image looks real can be a reference for HDR.
A good HDR monitor looks like a window.
 
I have been out of the display calibration world ever since HDR came out.
Calibration doesn't have a lot to do with this, other than to ensure your colors and luminosity values are falling where they are supposed to. That isn't to say it's not important to achieving good color, but it isn't directly related to HDR any more than it is related to SDR.
Has HDR been standardized in any way?
CBB discusses this a lot above. But the short of it is: without standardization, HDR wouldn't exist. There are a lot of engineers that have done a huge amount of work to create the pipelines for these formats to exist: from acquisition (though technically acquisition has exceeded HDR color space for some time), to working color spaces in NLEs (ACES, DWG), to exporting to consumer-level gamuts, as well as all of the transforms necessary to have these gamuts appear uniformly on a myriad of devices.

To reiterate, if there were no standards, there would be no way for you to see HDR imagery in the first place.
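If you want a taste of what one of those transforms actually is under the hood, here's a rough numpy sketch of my own, built only from the published CIE xy primaries (not anyone's actual pipeline; a real pipeline also deals with transfer functions, tone mapping, and clipping). It derives the 3x3 matrix that takes linear Rec.709 RGB into linear Rec.2020 RGB:

```python
import numpy as np

# Illustrative only: derive a Rec.709 -> Rec.2020 matrix from the published primaries.
REC709_PRIMARIES = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # R, G, B
REC2020_PRIMARIES = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # R, G, B
D65_WHITE = (0.3127, 0.3290)   # white point shared by both standards

def xy_to_xyz(x, y):
    """xy chromaticity -> XYZ tristimulus with Y normalized to 1."""
    return np.array([x / y, 1.0, (1.0 - x - y) / y])

def rgb_to_xyz_matrix(primaries, white):
    """Scale each primary's XYZ column so that RGB = (1,1,1) lands on the white point."""
    p = np.column_stack([xy_to_xyz(x, y) for x, y in primaries])
    scale = np.linalg.solve(p, xy_to_xyz(*white))
    return p * scale

M709 = rgb_to_xyz_matrix(REC709_PRIMARIES, D65_WHITE)
M2020 = rgb_to_xyz_matrix(REC2020_PRIMARIES, D65_WHITE)

# Linear Rec.709 RGB -> XYZ -> linear Rec.2020 RGB (rows sum to ~1, so white stays white)
M709_TO_2020 = np.linalg.inv(M2020) @ M709
print(np.round(M709_TO_2020, 4))
```

Point being, even the "boring" part of the pipeline is standards-derived math, and that's before any creative grading happens.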
Is there actually a reference HDR image that consumer displays are capable of producing after calibration?
Not really sure what you mean here. If you're just talking about a "test image" or "test pattern" of some sort, I'm sure there is. I'm just not fully aware of it.

ARRI has specific color charts so that you can view different gamut transforms and LUTs, but I wouldn't go as far as to say their tools are "standard" and all of those tools are designed specifically for working inside of NLEs. I would probably just look at something like "Planet Earth" or other HDR content with known excellent high production value.
I remember arguments that consumer HDR displays are not capable of the tone mapping used by studio/reference HDR displays. Is that still the case?
Yes, this is still the case. It is highly unlikely that reference displays will come down in pricing enough for consumers to afford. When you're looking at scientific monitors, they're capable of 5,000 nits of peak brightness and cost $20,000 or more, and they do so with absurd color accuracy (at all brightness levels) and uniformity, with extremely controlled precision edge to edge. If you're an engineer working for Dolby, you're sitting in front of likely $40k+ worth of displays (minimum). Likely $30k+ in a Mac Pro. Possibly $20k+ worth of color wheel controllers. And then probably another $40k+ of audio gear. It's out of reach price-wise for general consumers, and most consumers want displays that cover their walls.
There are also sub-types of HDR and color gamut differences... What is the actual reference HDR standard?
I would not focus on this aspect too much. Like all competing formats there are different flavors floating around that different agencies prefer. Film for the most part favors Dolby Vision because Dolby Vision was built from the ground up to attract customers to the theater model and was also designed to differentiate itself from watching at home. Getting a setup that can look and sound like a movie theater isn't "impossible", but it was kind of intentionally designed to give shock and awe, and to delight in ways that most normal people can't afford. Can you watch Dolby Vision content at home? Yes. Turns out OLED and excellent receivers with good decoders go a long way.

Then you'll find platforms like YouTube that specifically only use Rec2020, which builds off of broadcast and SDR display tech.

In the end, if you want to watch all HDR content, then you have to support different formats. I'm sure you were looking for a "clean" answer, but like most things it's not that simple.

EDIT 1: Spelling/grammar.
EDIT 18_03_2023: Found a small technical mistake, but drove me crazy so I had to change it.
 
Thank you for taking the time to provide such an informative answer. What I mean by "reference image" is the image on a calibrated display that the mastering team sees when mastering content. Another example is that calibrated displays allow graphics designer A to see a similar image on his/her screen to what graphics designer B sees on his/her screen. That way they could both be "on the same page" when designing graphics, or as close to that as possible given that their displays were properly calibrated. Another way to define it is to say that the image on a calibrated display is how content is meant to be seen, or how the content mastering team sees it when it masters that content.

I guess with HDR the same concepts apply, except there are more standards (in terms of quantity). With SDR there is mainly one standard, one "reference image". Having many standards, in a way, defeats the idea of standardization itself. It definitely makes the concept of display calibration less important. There is no longer a singular reference that defines how mastered content is meant to be seen.
 
I guess with HDR the same concepts apply, except there are more standards (in terms of quantity). With SDR there is mainly one standard, one "reference image". Having many standards, in a way, defeats the idea of standardization itself. It definitely makes the concept of display calibration less important. There is no longer a singular reference that defines how mastered content is meant to be seen.
The universal standard is literally the CIE 1931 color gamut. Every single SDR and HDR standard is essentially derived from CIE 1931, which was born out of research from the late 1920s to early 1930s.

Only minor tweaks have ever been made to CIE 1931 (aka CIE 1976), which were simply ultra-subtle tweaks to the gamut shape. The gamut shape maps the human-perceivable colors based on humans' red/green/blue vision behaviors -- the gamut is not a perfect triangle since the human photoreceptors don't have perfect R-only, G-only, B-only response -- and we've chosen an average R wavelength sensitivity, G wavelength sensitivity, and B wavelength sensitivity.

Different humans are often slightly more color-blind than others (12% of the population), and even the non-colorblind may have color primaries whose spectral sensitivity centers (the peak of the sensitivity "hump") vary from person to person (e.g. 560nm vs 564nm vs 569.7nm or whatever, for human vs human vs human).

The CIE 1931 gamut is just a human-population-average compromise, sampled from a population, whereupon we attempt to boilerplate an R spectral standard, G spectral standard, and B spectral standard onto the shape of the color gamut -- whether sRGB or NTSC color or DCI-P3 or REC.709 or REC.2020 -- despite different humans having slightly different gamut shapes than the next human (in the XY chart, and even Z, for brightness sensitivity/insensitivity differences between humans).

So there's kinda a grand-daddy of a standard: CIE 1931.

Every colorspace standard is just a subset of CIE. Example:

[Attached image: CIE 1931 chromaticity diagram with the colorspace gamut triangles overlaid]
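(If you want to poke at the "subset" idea numerically, here's a quick Python sketch of my own -- not from any standard toolkit -- that tests whether an xy chromaticity on that chart falls inside a given gamut triangle. The primaries are the published xy values; the sample point is just an illustrative pick:)

```python
# Quick sketch: is a CIE 1931 xy chromaticity inside a gamut triangle?
GAMUTS = {
    "Rec.709/sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def _cross(o, a, b):
    """2D cross product of vectors o->a and o->b (sign tells which side b lies on)."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def inside_gamut(xy, triangle):
    """True if xy is inside (or on the edge of) the triangle formed by the primaries."""
    r, g, b = triangle
    signs = [_cross(r, g, xy), _cross(g, b, xy), _cross(b, r, xy)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

# A deeply saturated green: inside Rec.2020, but outside both Rec.709 and DCI-P3.
point = (0.20, 0.65)
for name, tri in GAMUTS.items():
    print(f"xy {point} inside {name}? {inside_gamut(point, tri)}")
```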


The thing is, displays vary much more hugely now than they did in CRT tube days. We've got LCDs (IPS, VA, TN) of varying brightness (from under 100 nits all the way to 10,000 nits), with and without local dimming backlights, alongside laser displays, DLP displays, LCOS/SXRD displays, OLEDs, MiniLED/MicroLED displays (mainly jumbotrons and The Wall, but shrinking), etc.

So HDR is a game of compromises, where mastered content targeted at 10,000-nit-capable displays is simply rendered best-effort on consumer displays, through various kinds of algorithms. Tone mapping is often done instead of simply white-clipping, for example -- basically curving off near the maxes. And different displays behave differently on that.
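(Toy example of that curving-off, just to illustrate the idea -- this is a generic soft-knee roll-off I made up for illustration, not any specific TV's tone mapper:)

```python
# Toy highlight roll-off: map scene luminance mastered up to 10,000 nits onto a display
# that can only reach `display_peak` nits.  Below the knee the mapping is 1:1; above it,
# highlights are compressed asymptotically instead of hard-clipped.
# Made-up illustrative curve, not any particular display's algorithm.

def rolloff_tonemap(nits: float, display_peak: float = 600.0, knee: float = 0.75) -> float:
    knee_nits = display_peak * knee          # start bending here, e.g. 450 nits
    if nits <= knee_nits:
        return nits                          # linear (1:1) region
    headroom = display_peak - knee_nits      # output range left above the knee
    excess = nits - knee_nits                # input energy above the knee
    # Simple asymptotic compression: output never exceeds display_peak.
    return knee_nits + headroom * (excess / (excess + headroom))

for scene_nits in (100, 450, 1000, 4000, 10_000):
    print(f"{scene_nits:>6} nits in scene -> {rolloff_tonemap(scene_nits):7.1f} nits on a 600-nit panel")
```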

So really, a single reference is kinda impossible, except that each display aspires to cover as much of CIE 1931 as possible -- the near-century-old root standard -- from ultra-dim darkroom all the way to sunlight-bright, while pushing the colors all the way out to the extremes.

There are HDR test patterns that have encoded 10,000-nit pixels, but those only show at whatever the display's max nits is. And for protection, some displays (OLED) automatically dim max-bright pixels (whatever the specific consumer display does) after a set time -- sometimes only a second or a few at maximum -- so you can't display static test patterns at ultrabright nits for long.

And some displays with white pixels (e.g. WRGB) will reduce color saturation to render the peakier, brighter colors, so the color saturation isn't a linear curve along the HDR brightness. So you've got weird shapes of colorspace rendering capability when you're doing all 3 dimensions (XYZ) -- XY on the CIE 1931 gamut chart, followed by Z for luminance. The triangle shrinks as brightness gets brighter on certain displays, e.g. DLP with a white section in the color wheel, or OLED displays with white pixels next to the R, G, B pixels, etc.

The bottom line: CIE 1931 is the root granddaddy standard of it all, and different consumer displays are doing best-effort rendering of the 3D HDR volume (the entire space of addressable XYZ).

One display will "fudge" the image, e.g. a 400-to-600-nit HDR display may re-tonemap surrounding pixels to simulate 1000+ nits within 600 nits, simply by gradient-dimming around the bright pixels. This is an ugly behavior, even if it amplifies the HDR look on limited-nit-range displays, and it's also a workaround for the limited ability to peak pixels brightly.

A display's power supply can overload if more than 1% of pixels are at 10,000 nits, so you've got average-picture-level dimming algorithms, much like CRTs/plasmas had, where they dimmed automatically when displaying a bright white image. Some of these phosphor-based displays could do single white pixels at tons of nits, but dimmed massively when you displayed a whiteout sunny snow scene -- because they did not have enough power to drive all pixels at maximum nits. That behavior is being brought back with many HDR displays, especially OLED. So you've got HDR benchmarks that use a 1% square, a 5% square, a 10% square, at different nits, to benchmark how much a display deviates from ideal HDR behavior. So you have all kinds of new HDR test patterns.
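(Those windowed patterns are simple to sketch in code -- the snippet below is illustrative only; real benchmark patterns also care about background level, PQ quantization, timing, and so on:)

```python
import numpy as np

# Sketch: a "% window" HDR test frame -- a centered square covering a given percentage
# of the screen at a target nit level, on a black background.  Stored as linear nits;
# a real pattern generator would then PQ-encode the frame for the display.

def window_pattern(width: int, height: int, window_pct: float, target_nits: float) -> np.ndarray:
    frame = np.zeros((height, width), dtype=np.float32)             # background: 0 nits
    side = int(round((window_pct / 100.0 * width * height) ** 0.5))
    side = min(side, width, height)                                 # keep the square on screen
    top, left = (height - side) // 2, (width - side) // 2
    frame[top:top + side, left:left + side] = target_nits
    return frame

for pct in (1, 5, 10, 25):
    frame = window_pattern(3840, 2160, pct, 1000.0)
    lit = (frame > 0).mean() * 100
    print(f"{pct:>3}% window at 1000 nits -> {lit:.1f}% of pixels lit")
```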

Displays have gigantically differing HDR behaviors, but if you walked up to a prototype 10,000-nit display, it's like walking into a Star Trek Holodeck that can render almost any color at almost any brightness your eyes can see, from ultra-dim candlelit caves that look true-to-life, through sunlight glints on the chrome of a 1957 Chevy (e.g. the Sony 10,000-nit prototype I saw at CES 2018).

HDR is amazing when it's got more than 100x more colorspace-detail than NTSC (X multiplied by Y multiplied by Z) -- in other words, the ability to map over 0-thru-10,000 nit while having a color gamut (rec2020) triangle 72% bigger than good old fashioned sRGB, and 50% bigger than good old fashioned CRT (NTSC). When you multiply all of that together, it's mind bogglingly huge -- and shows how anemic consumer displays really are, compared to real life (or an aspirant Holodeck). Woe is us that we try to do a rec2020 test pattern and expect a SINGLE consumer display to render it successfully. Ouch.

Even though humans have irises that automatically open/close during dark/bright content, the ability to successfully simulate whole real-life behavior -- e.g. tiny sun glints off the chrome of a 1957 Chevy that look like the real sun, brightness-wise, or ultra-bright, perfect-8K, twinkle-tiny pinpricks of stars in an extremely dark countryside sky, in a way that is 100x more impressive than today's OLED (yes) -- is amazing. But you generally don't want that to exceed roughly 1% of pixels max, sometimes even only 0.1% (especially on a wide-FOV display like IMAX or VR). Few people ever see any display more impressive than the best OLED, but some lab displays I've seen... wow.

It will be a very, very, very long time before we can't tell apart a VR headset and transparent ski goggles in an A/B blind test of the most extreme content attempting to perfectly trick a person (Matrix style, or STTNG Moriarty style) into thinking VR is real life -- a complete scientific blind A/B test ("is it VR or is it real life you're seeing?" after being given a random headset that is either a VR headset or transparent ski goggles, where both look identical and weigh the same). It's astoundingly difficult when retina refresh rate AND retina resolution are thrown in simultaneously too, along with near-retina HDR range. The bandwidth required is supernova league to combine retina HDR + retina resolution + retina refresh rate, all simultaneously.

Anyway:

It's hair-pullingly frustrating when it comes to doing standard test patterns, I tell you.

Astoundingly, astoundingly bang-on-cement-wall. Picard facepalm moments too at how some wimpHDR displays try to algorithmfake it. 🤦

(TestUFO simply focuses on certain narrow scopes, obviously -- e.g. pixel response at different HDR brightnesses, and things like that. The other test pattern inventors can handle the rest, thank you. Whoopee.)

Hats off to all test pattern work that dives deeper into HDR benchmarking. Giant rabbit hole the size of a black hole.
 
CBB adds more about the technical side; I'll add other info as someone who makes films.
Thank you for taking the time to provide such an informative answer. What I mean by "reference image" is the image on a calibrated display that the mastering team sees when mastering content.
You can skip this response, but as someone who works in film (at the low-level end, not in Hollywood; I've intentionally avoided it, but I have worked on sets with ARRI Alexa Minis and various RED cameras, and I personally shoot on an E2-F6, which was extensively used in the latest MI:7 film, just for a point of reference), I can tell you that no one is looking at a "reference image" when mastering content. At least not in the way you mean.

There is an entire website called "Shotdeck" whose purpose is to pull up frames from films so that while the DP and director are storyboarding a film, they can both see what they are imagining in their mind(s) and communicate to other stakeholders what they want the film to look like, which gives them a target of roughly what they're going for. In the film industry, that is what a "reference image" is. They might reference certain images for color/tone, other images for what they want the action or locations to look like, and other images still for composition. So a single scene could have a dozen different reference points (or however many it takes) for different things such as action, location, color, luminosity, time of day, number of actors, framing, etc. But none of these are "a standard test pattern".

If some kind of standard test pattern were used, it would have to be shot in camera, and then in post you'd have it exactly match the reference. Which, by the way, colorists are more than capable of doing. In fact, color charts and software can do this for you "instantly", and such software has existed for at least 7 years at this point (maybe more? ColorCheckers have been used extensively, but I'm not certain at what point NLEs and software had this functionality built in). However, when you're making a film that's just a starting place, and things like color are HIGHLY subjective. The whole job of the colorist is to drive the feel of a film. If this wasn't the case, then all films would look the same.

Does "Drive" look like the live action "Aladdin" or even remotely similar to "No Country for Old Men" or "A Clockwork Orange"? The answer is definitively "no." Because they were not designed to. Creative choices might be to crush out blacks or blow highlights. Or never blow highlights. Or have 60% of the image live in the midtones. Michael Bay famously uses "Teal and Orange" as his color palate for all of his "Bayhem" films, which is also the most common color palate in action films. The Matrix intentionally pushed green into all of the blacks and mid-tones of the film whenever the characters are inside of the Matrix in order to intentionally make those places feel "less real" or you could say it was the color of the simulation, but did it in a subtle way to make it seem just like a stylistic choice. In other words, films aren't necessarily designed to "look real". If they were, then they'd match color charts and call it a day. But they don't, as a huge part of film is creativity. Color choice is part of the art. And it's incredibly subjective. Just how subjective? Try watching "Solo, A Star Wars Story", I have no idea who graded that film but it's awful. A heck of a lot of people must've thought it looked good or it would've never been released.

The point I'm making is there isn't some reference image because it wouldn't be relevant to creative decisions and film making in general. For more on this read below.
Another example is that calibrated displays allow graphics designer A to see a similar image on his/her screen to what graphics designer B sees on his/her screen. That way they could both be "on the same page" when designing graphics, or as close to that as possible given that their displays were properly calibrated. Another way to define it is to say that the image on a calibrated display is how content is meant to be seen, or how the content mastering team sees it when it masters that content.
Yes and no. The short is what you're describing is screen calibration. But there isn't a color chart or something that people look at on multiple monitors to see whether something is calibrated or it's not. Mostly because that's the point of calibration in the first place. I'm sure you've seen a bunch of those "eye tricks" or whatever they're called that demonstrate what we perceive to be pure white is based around perception and context. And so are things like grey (like that cylinder image on a checker board showing that the grey check in the shadow actually matches the "light" grey check in the bright area). Even what is day vs night or brightness for either can easily be manipulated in our minds. In fact, video in general is just a bunch of "still frames" played back at 24fps, and that's enough to fool our brains into thinking that something is moving.

Humans are poor calibrators. And giving you a test pattern wouldn't allow you to know whether or not your screen was calibrated unless you had a calibrated screen directly next to it to compare against. In a void, such a thing is useless.
I guess with HDR the same concepts apply, except there are more standards (in terms of quantity). With SDR there is mainly one standard, one "reference image". Having many standards, in a way, defeats the idea of standardization itself. It definitely makes the concept of display calibration less important. There is no longer a singular reference that defines how mastered content is meant to be seen.
I think you've learned the opposite lesson. It's all the more important. In terms of standards, mastering is all done to specific specifications. Like I mentioned before, if the destination is the theater, that film was likely fully mastered in Dolby Vision. And if you're putting up content on YouTube or OTA TV, then it's Rec2020. If you've mastered in one and then use the gamma of another you're going to have problems. The standards matter a heck of a lot. And it's obvious that if you know something was mastered in either Dolby Vision or Rec2020, you "should" be able to reproduce that on any other display of your choosing.

Let me put it to you another way: if you're working on cars in the US (as we all love car analogies), there are two standards, imperial and metric. And knowing which standard you're working with is essential to fixing cars (generally whether they're domestic or foreign, but that doesn't matter). First off, the engineering to make each set of tools operate on the car had to be exacting, and indeed the entire construction of every part in the car was designed around either metric or imperial measurements. That doesn't mean that because there are two standards, all of a sudden the standards don't matter and you can never know whether you're using the correct tool. I don't know of a single mechanic whose only wrench is an adjustable "monkey wrench". If anything, the toolset is highly specialized to work on specific models of cars.

All this to say: knowing what you're looking at matters a lot, and it's absolutely possible to view content exactly as it was intended if 1) your display is capable of reproducing the original gamuts involved, 2) you know what those gamuts are and have them input correctly into your display with proper transforms, and 3) your display is properly calibrated.

--------------------------------------------------------------------------------------------------------------------------------------------

EDIT: mostly grammar spelling. Added a small amount of content.

EDIT 2: Here is a simple calibration solution.
 
Funny after all this talk people will be sad to find out they still cannot calibrate HDR on 99.99% of their monitors.

If you want to calibrate HDR, your monitor needs to be over-spec to get the coverage first. Then shrink the coverage to match the accuracy. A reference image can just as well be the result of a calculation.

But have you ever seen a true HDR monitor that is over-spec enough to display images beyond real life? Of course not.

Real life is the ultimate reference for HDR. There are just so few monitors that can even cover 80% of Rec.2020, or maintain 1,000 nits of brightness, to make things look even slightly real. The most important thing is still the making of the monitor.

Under the true HDR is where all the inferior sub-colorspaces and the messy algorithms scavenge from other incompetent HDR standards.
 
Funny after all this talk people will be sad to find out they still cannot calibrate HDR on 99.99% of their monitors.
I generally agree with this statement, but not for your following reasoning.
If you want to calibrate HDR, your monitor needs to be over-spec to get the coverage first. Then shrink the coverage to match the accuracy. A reference image can just as well be the result of a calculation.
I would say the problem here is most of the time displays don't meet the specifications they're designed to fulfill.

And this is a two-part problem: 1) none of the certification programs are stringent enough, and that is paired with 2) the display market being primarily driven by cost. In short, display manufacturers are incentivized to do the least possible to get the certifications, in order to drive the cost down and sell more units.

In the high-end display world, "specs" will sound "modest" because they have actually done all the testing and engineering beforehand and will state clearly what you are getting. To consumers that don't know the difference, it "seems like" a monitor that costs $6000 and "just" covers HDR10 is a rip-off. Professionals know the difference.
This is actually one of the reasons why people on this forum "shit on" the Apple Pro Display XDR. They would rather talk about $1000 stands or its "high price" for "only 60Hz", not realizing that it's designed to compete with professional displays that cost $10k+ and is actually a "low-cost alternative" for people that want to master HDR10 content.
But have you ever seen a true HDR monitor that is over-spec enough to display images beyond real life? Of course not.

Real life is the ultimate reference for HDR.
This is kind of a mixed bag of somewhat true. Your first rhetorical question is a bit obvious, and your wording doesn't quite make sense. I'll assume you mean, generally, that there isn't a display capable of showing the wide dynamic range of reality. In which case: yes, we agree. And that will be something of science fiction for likely several decades, as it would require a source capable of amplitude far greater than the technologies we have now.
To put it bluntly, that thing we call "the sun" would be incredibly difficult to reproduce on any display technology while also trying to maintain shadow detail. Our eyes are also technically not capable of dealing with the brightness of the sun, so you could say our "analog imaging tech" is also at a loss. And that raises the question of "what is reasonable": if we were capable of reproducing the brightness of the sun, would that be an enjoyable experience? Every action movie would be less interesting if every time something exploded we had to turn away or close our eyes because it's too bright, just in order to be "realistic".

Bottom line, we're talking about movies, which as a medium aren't necessarily about trying to reproduce reality 1:1. And in fact they often aren't. As I mentioned earlier, "color grading" isn't realistic. Heck, Marvel films aren't realistic (I'm talking about the way they look, not the "magic" they do in them). It's about the art for a lot of filmmakers. And it's about fun for most moviegoers. As much as I like to talk about the technical, if you miss the point that films are supposed to be fun, then you will start to overstate the importance of realism. And frankly, outside of documentaries (and sometimes not even in documentaries) or "the news", it's not that important for things to "look real", in the sense of whether it's supposed to be reality or not. Film has always been about the fantastic and specifically "not realistic", whether we're talking about what is physically possible or how things "are" or "look".
And to be clear, I'm talking about purely how things look, not the fact that "Iron Man" exists. I'm saying "color grading" by its nature "doesn't look real". And that's a big part of film.
There are just so few monitors that can even cover 80% of Rec.2020, or maintain 1,000 nits of brightness, to make things look even slightly real. The most important thing is still the making of the monitor.
Sure, we mostly agree. But consumers, like I noted above, aren't willing to spend $10k+ to get better reproduction. And indeed there are diminishing returns for doing so.

The other part of this is also that these gamuts were made intentionally large so that display tech has "room to grow". The engineers behind these standards are very intelligent. It's not as if they couldn't have limited these gamut sizes to currently available tech. By their nature they're designed to force the display market to get better and grow. What you're seeing is part of that process.
As another reference point, it took something like 20 years for commonly available SDR displays to near 100% AdobeRGB coverage, or around 15 years for DCI-P3, although those gamuts have been fully covered on professional displays for some time. Again, it comes down to willingness to pay for professional displays or not. And it takes time for the tech to trickle down and become economical, so that even a $400 display can show what it took a reference display to do 10+ years ago.
Under the true HDR is where all the inferior sub-colorspaces and the messy algorithms scavenge from other incompetent HDR standards.
I would say you really don't know anything about this space at all if you think incompetence is involved. That is a very low-brow way of looking at this issue. It took many incredibly intelligent people to figure out the best ways to arrive where we are. And it wasn't done flippantly or easily.

The very short is that gamut mapping is incredibly complex.
The long to describe the problem: Cinema cameras are capable of capturing 11.5 stops of dynamic range at the low end and 17 stops of dynamic range at the high end. Ever since the inception of digital cinema back in the early 2000s we have had cameras exceeding at least 10 stops of DR.
SDR, for reference, only contains about 6 stops of dynamic range. That means even the lowest-end cinema camera was capturing 16x the luminance and color data that SDR was capable of showing. The Alexa 35 is now capable of well over 100x the luminance and color data of an SDR display. In other words, acquisition happens in incredibly large gamut sizes that are far larger than even "science fiction" displays could reproduce. This problem has existed since the beginning of digital cinema.
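Just to restate that arithmetic explicitly (each stop is a doubling of light; the stop counts below are the same rough figures as above):

```python
# Each photographic "stop" is a doubling of light, so the ratio between two dynamic
# ranges is 2 ** (difference in stops).  Stop counts here are the rough figures above.

def contrast_multiple(camera_stops: float, display_stops: float = 6.0) -> float:
    return 2.0 ** (camera_stops - display_stops)

print(contrast_multiple(10))   # ~10-stop early digital cinema vs ~6-stop SDR -> 16x
print(contrast_multiple(17))   # ~17-stop modern cinema camera -> 2048x ("well over 100x")
```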

Gamut mapping then has to take what a camera "sees" and squeeze that entire dynamic range into a color space that a display is capable of reproducing. And if you think that doing that is "easy" or that this is "incompetence", then you know very little about the industry or the technology that drives these displays. Working with these gamut mapping tools every day, I can tell you that getting images into a comprehensible and displayable form is an absolute wonder. And while I "wish" there were a way to see everything a camera does, the truth is, like I've mentioned before, that won't be possible for an incredibly long time (if ever, for practical reasons).

To take it a step further: if you think luminance mapping and gamut compression from a camera are hard, then you should look into the engineering it took to do the same types of compression from either a film negative or a digital RAW file into print. Because if you want to see what "not a lot of dynamic range" looks like, then printing is it. The tonal range is incredibly squished, and indeed there is no way to increase the luminance value of a print. Being able to print a photo onto a piece of paper (which is now "so easy" that we take it for granted) also took a tremendous number of incredibly smart engineers.

Similarly, magically increasing the luminance of displays isn't easy either, as evidenced by the multi-decade-long usage of CRTs and now LCDs. FED and SED were virtually impossible to build at reasonable sizes. QD-OLED and MicroLED will continue to get fleshed out, but if we're honest they will have trouble with luminance because of things like heat, and how that translates to individual pixels or subpixels failing. Just brightness = heat, and how that relates to burn-in and dead pixels, is a tremendous problem. And again, if this were easy to solve, it would be solved, because whoever cracks it first will have displays that absolutely dominate the consumer market. This isn't incompetence either. It's an incredibly hard engineering task that involves incredibly complex manufacturing, materials science, and complex software/hardware to control it all. And all of this is expensive. It is of course expected that it will take some time, like all technology does, to come down in cost. I'll be happy if I can buy a QD-OLED (or MicroLED) display capable of mastering HDR10 content (as you note, "actually" 1000 nits of peak brightness) at 32" and 4K for $2000; if it covers 90% of Rec2020, that will be plenty. I think that will probably take another 2 years to get such a display to market.

If you have a better solution to all of the above and you're not the one inventing this tech, then you're basically just a jerk. Not only a jerk, but you're leaving potentially billions of dollars on the table. It's easy to be an armchair quarterback. It's quite another thing to actually do the tasks that are being done.

----------

In terms of "why are there multiple standards" there are multiple reasons, and incompetence isn't one of them. Some of them are practical. Rec2020 exists because it was easy to build on the SDR foundation. And what is necessary for the future of TV's often times is "easy" standards that can drive adoption. That is a real and serious consideration. If you make an awesome technical standard and no one adopts it, it's meaningless; the tech doesn't exist in a void.
Dolby was the opposite. Obviously their target market was the theater and digital laser projection. In other words, their tech wasn't restricted by consumers; it was aimed at business owners who wanted a substantially different display format (enough, hopefully, to draw people back to theaters because home TVs couldn't work at that display level), and those business operators were willing to pay the cost, unlike consumers.

Again, just because there is more than one format doesn't mean incompetence. Even if you're cynical, at minimum it's about competition, not a lack of good engineers. Similar to how things such as HD-DVD and Blu-ray played out. Or VHS vs Betamax (which is actually a great example of how the "worse tech" won out, due to artificial restrictions on Betamax and a high cost that most consumers were unwilling to pay, despite Betamax having higher resolution and physically smaller cassettes).
 
You can scavenge all you want, and you might just be able to push the standards higher up, such as pushing VDE HDR 2000 displays that look as bright as VESA HDR 600, or pushing all the flickering OLEDs with a TÜV flicker-free Eye-Comfort sticker.

Of course, there are a few bucks left on the table waiting to be scavenged.

I have no doubt whatever HDR comes out of such standards looks nothing close to real life, unless you want to be arty instead of absolute.

Funny, the thing you describe as "incredibly complex" is a bunch of compromised functions done years ago from simple ideas, while the hardest part is always to make a competent monitor that can do the higher-bit log transfer without all the compromises.
 
I have no doubt whatever HDR comes out of such standards looks nothing close to real life, unless you want to be arty instead of absolute.
If your goal is "absolute", then you will be continuously disappointed. By literally all content ever produced, ever, until the end of time. Even if you're watching BBC's Planet Earth, their goal isn't to be literal with imaging either.
Funny, the thing you describe as "incredibly complex" is a bunch of compromised functions done years ago from simple ideas
On this point we disagree. At least on its face. It's true in the sense that "compromises" had to be made in order for us to have anything to see as a result of display technology being what it is. But transforms, gamut compression, and creating these image standards weren't easy.

In terms of the modern standards, those weren't necessarily easy either, and they had to be created with the greater "vision" and "knowledge" that display technology is improving. Or in other words, they wanted these standards to be aspirational enough not to limit future display tech, but not so far ahead that all you're doing is creating gamut clipping and colors that are irreproducible on all TVs.
Which to a certain degree is an impossible task that I think they've done incredibly well with.
while the hardest part is always to make a competent monitor that can do the higher-bit log transfer without all the compromises.
On this we mostly agree. Creating display tech is much more complex than acquisition tech. One merely has to "absorb light" and record it; the other has to actually create/reproduce it. And it turns out the latter is much more difficult to do.
It's not as if display manufacturers are intentionally doing things this way. It's just that TVs, like most tech, cost billions of dollars of investment to start making, the products sell for next to nothing, and the profit is razor thin. If you could convince everyone to only buy TVs that cost $3000+, we'd probably see display tech evolve faster.
As it is though, the LG C2 is incredible for what it's capable of reproducing for its cost, albeit with stupidly aggressive auto-dimming and limited total luminance. The Sony A95K exists if you want the best of the best QD-OLED in terms of a consumer-level display. It's cheap by professional standards and will likely blow the mind of anyone who hasn't ever gotten to see what a nice image at home looks like. I expect <0.01% of Americans will buy one, though.
 

The guys with greater "vision" and "knowledge" don't make the monitors. Their perception is limited just like Rec.709

Funny how an image on a cheaply made OLED TV suddenly becomes "amazing" with its inferior range of both color space and luminance, while the actual grading is done on an LCD instead.

The display PA32UCG is already at $5000 blowing away dual-layer LCDs.

Try a little to see what looks real, what looks better, to grade every movie you see with a better monitor.
 
The guys with greater "vision" and "knowledge" don't make the monitors. Their perception is limited just like Rec.709
That is a very cynical view of monitor tech, especially when in a few more sentences you directly reference a monitor. Where do you think said monitor comes from? The sky? Magic?

Rec709 existed because the technology we had was only capable of showing that size of color space. The limitation, as I discuss below, was film stock (capable of a maximum of 48 nits of brightness). The other half of that was our display technology, which was CRT tubes (also limited by over-the-air broadcast bandwidth) and film projectors. The fact that they were able to make standards for two wildly different formats, figure out ways to broadcast that over the airwaves, and have TVs be able to interpret that gamma was and is a wonder. And they did this all without the modern microprocessors that we just take for granted. A lot of very clever people created Rec709 out of the incredibly limited analog technology of the time. And that changed dramatically over the second half of the 20th century, as TV became ubiquitous. There were dozens of updates to both OTA and film to improve the standard. And even Rec709, where we are now, has had tons of updates, up to and including resolution and gamma curve.

Both film and TV were mastered at 100 nits and below; that was the limit of analog technology. What were they supposed to do? There couldn't be the improvement we have now until now. Microprocessors had to become ubiquitously plentiful, and manufacturing tech also had to reach this level of maturity. To say otherwise is to ignore history.

The engineers now are more than capable of developing "better monitors" and yes, they have the vision to produce them. Otherwise, like I mentioned at the beginning of this response, said monitors wouldn't even exist. They are very aware of the disparity between digital camera acquisition and display technology; hence why we even have HDR in the first place and can complain about it in this thread. But the technology doesn't exist yet to magically make cheap, perfect displays capable of showing Rec2020. The reality is that while it's technically possible to build displays with gamuts that large, consumers aren't willing to pay $20k+ for a TV. As I noted, few if any will pay 25% of that (likely <0.01%, or really 0.001%). And zero consumers will likely pick up the ASUS display you mention below. That's where the tech is now in terms of manufacturing. It might be another 10 years before such displays are cost-effective to make.

In other words, engineering is more than just "can we design xyz"; it's also "can we manufacture it in an economical way". We're talking about the bleeding edge here, and whether we're talking about cars, or microprocessors (as in supercomputers), or in this case display tech: that's expensive. I don't know what your deal is, but you're basically ignoring this basic fact. Ferraris exist. They can be engineered, again, by very smart people. It's just that most people can only afford Corollas. For more on that I'd look into the history of the S-Class Mercedes (here is a 16-year-old bootleg of Jeremy Clarkson talking about one in 2006), which kind of epitomizes this point: basically the tech in that car has been consistently ahead of the general consumer market by about 10 years. In the video he demonstrates GPS, voice commands, hands-free tech, radar-guided cruise control, etc., which for reference was before Siri existed. This car actually predates the iPhone and Android. Now it's possible to get those same things in cars that cost 20% as much. It just took the better part of a decade to do.

This is not an engineering issue in the sense of whether it can be designed and built. It's an engineering issue in the sense that the manufacturing technology to make this stuff isn't cheap yet, and it costs more than normal people either can afford or are willing to pay. And yes, some of the tech is maturing (like all things, 10 years from now the bleeding edge will be common and there will be other stuff people can't afford, as in the S-Class example; in the case of displays nothing is capable of displaying all of Rec2020 yet, but that will come from the top down). Obviously OLED isn't perfect, but it's not as if they're not working on it. For some reason you're complaining about the engineers designing this tech like they don't "get it", when they absolutely do.
The display PA32UCG is already at $5000 blowing away dual-layer LCDs.

Try a little to see what looks real, what looks better, to grade every movie you see with a better monitor.
The Asus PA32UCG is still a consumer-level display. People working at Dolby and other high-end color houses use scientific reference monitors that are capable of 5,000 nits of brightness and cost in excess of $22k (the monitor from the Dolby video I linked was $40k at the time and was an older model; in other words, they're "cheaper" now for greater specs). They are also SDI-fed with 12-bit internal processing. ASUS is trying to come in and undercut the mastering market by creating this much lower-level display; it's only consumers making 'content' that will be buying them, though. Not Hollywood.

It also seems like you don't know the process of grading footage either, which is very relevant to our discussion. In short, "real" is subject to interpretation. At the end of the day you're using analog eyes and an analog brain, both of which understand "real" to be contextual, not absolute (though Dolby Vision by design gives absolute standards in terms of luminance and is a massive upgrade). Grading is just as much art as science because humans are involved. As in, it's humans that watch things, not robots. Watch that Dolby video. Even if you don't understand a lot of the technical detail, a huge amount of grading is literally interpretation. It's not about "realism"; it's about making images people like and want to see. Film/TV is about looking better than real life (in fact, here is a timestamp from that Dolby video where he directly addresses that as a "misconception" about color). If you don't like that idea, then I'd suggest that you'll never be happy regardless of display technology, and that a medium driven by art and artists isn't for you. But he also covers the whys behind Dolby Vision and its purpose, as well as where we came from, which was CRT tubes and Rec709.
Funny how an image on a cheaply made OLED TV suddenly becomes "amazing" with its inferior range of both color space and luminance, while the actual grading is done on an LCD instead.
There is far more to even what "looks real" than just peak luminance. If luminance were the gold standard, then basically every piece of film made in the 19th and 20th centuries doesn't look "real" at all. Film is only capable of 50 nits of brightness (48 to be precise, in terms of mastering). All of the dynamic range of film has to be squished into that range. You can make a brighter projector, but effectively all you'd be doing is raising the black point and shifting middle grey up.

Anyway, the point is film has its own set of qualities, and a lot of it is incredibly cleverly built, realism being a part of that. There is an incredible history placed into film stocks and the technologies therein. Moving along, Tarantino as an example wants the quality of film and refuses to shoot on anything other than film, as does Christopher Nolan. Their films then, by design, are limited by the medium they film them on. From a technical standpoint they are limited in comparison with if they shot exclusively on a camera such as the ARRI Alexa 65. It's not as if they are unaware of the technical limitations of film; it's that they like the quality and the feel of it, and indeed place a higher value on those qualities than on your idea of "realism", which in this case is directly tied to "luminance" or lack thereof. You think 10,000 nits of brightness is the holy grail? Well, not all artists feel the same way. Nor do all color scientists. The engineers at ARRI, who make the best cameras, in fact push specifically for beautiful rendition of things like skin tones and are less concerned about "realism". Sony made cameras for 10 years that could easily match up 1:1 to how colors actually looked. But it turns out people want to look at something "pleasing" far more than something "real". This whole paragraph is really illustrating the fact that your ideal of "realism" as the highest standard isn't what most people want. In reality it's probably not even what you want. "Real" is boring. Otherwise just go outside and stare at things. Films are anything but real. Lenses aren't how people see, digital sensors have more DR than your eyes (though perhaps less latitude), and filmic color is more "fantastic" than real life. And that is to say nothing about the stories being "unreal". If you don't want/like any of that stuff, I'd perhaps suggest a different art form.

This is also a very long-winded way of saying that OLED looks fantastic. Precisely because it's capable of extremely high contrast ratios and a perfect black point, its motion clarity is excellent due to its incredibly short response time, it's flicker-free with zero judder on all sources, and its viewing angles are also much better than LCD, meaning there are no viewing-distance or color-shifting issues at the edges of the display. It also costs comparatively much less for a far larger size than is available vs a scientific display. Frankly, it's more enjoyable to watch films on something big rather than a 31" monitor. It's more fun to watch films on a couch with friends and loved ones rather than either being crammed shoulder to shoulder with them to view something at off-axis angles, or using a "tiny" box too far away. That is also part of the experience. We can wax on about the technical as we already are, but frankly we're also talking about entertainment, which means at the end of the day it's supposed to be fun. OLED, that thing you're crapping on, gives a huge amount of ROI on "fun" while looking incredibly good and meeting a lot of the metrics for making images look good. And this is ignoring the PQ curve (perceptual quantizer), which will make the lower peak luminance of OLED still look incredible (which, btw, still isn't bad: OLED has some of the highest color volume of any display type and it still reaches nearly 1000 nits peak brightness, which is plenty). In short, if I wanted a display to "just watch movies" on, it would be the Sony A95K and not a scientific monitor, and certainly not any monitor from Asus.

However, Flanders does make an OLED monitor designed for on-set use if you want access to 12-bit internal processing and are willing to pay the money to drive it. It's true that said display is just to be used as a reference for a DIT or other work on set, and not specifically to grade on, but it still will exceed the imaging quality of consumer-level displays.
 

Your way of grading movies is just being arty, doing whatever you like instead of grading it real.

If only you knew exactly how many nits and what color each pixel has in an image. Most graded images look nothing close to real life, so they have no true HDR impact. Better learn how PQ or HLG is calculated so you can grade content in a more precise way.

Most people shill too hard on OLED that cannot even do HDR properly if the brightness and the color shrink down.

A reference monitor is stepping away from OLED. It's also a waste to use a reference monitor to produce anything that strays from realism.

No wonder all these incompetent standards made no good HDR these days.
 
Your way of grading movies is just being arty, doing whatever you like instead of grading it real.
That’s all images you’ve ever seen in your whole life from every source. Including all pictures, all movies, the news. All visual mediums. Period.

If only you knew exactly how many nits and what color each pixel has in an image. Most graded images look nothing close to real life, so they have no true HDR impact. Better learn how PQ or HLG is calculated so you can grade content in a more precise way.
You’re not getting it. No one cares. If your goal is realism prepare to always be unhappy. You have never seen an 'ungraded' image in a theater. And you literally never will.

And though I've said it more than once, clearly you don't believe and/or don't understand: but you wouldn't want to anyway. Ungraded images look terrible, because by their design they are supposed to be manipulated in post. They're designed to hold and retain as much data as possible, which for a CCD chip means they look washed out (extremely low contrast) and desaturated, with very low luminance levels: a black point that's way above zero and a white point that's way below 10,000 nits. FWIW a film negative is also designed this way, and there has been a huge amount of manipulation done to all photography (whether video or still) in post since its inception.

Most people shill too hard on OLED
This isn't shilling. This is using a spectrophotometer and/or colorimeter to see precisely what a given display is capable of reproducing. It is this technical definition that shows OLED is superior in most ways to other display options. It also just happens to look better to our eyes as a result. Peak luminance is only one metric. I can give you a 10,000-nit monitor today; it will just be a white light that blasts into your eyes. Maybe at that point you'd get that contrast and color volume actually matter.

that cannot even do HDR properly if the brightness and the color shrink down.
You have eyes. They are analog. You aren't capable of seeing and assigning a numerical value to what you're seeing. But it would be a fun double-blind test study for you, because you're not willing to acknowledge this. I could do this today. All we have to do is get an LED fixture used on any set and use a spectrophotometer to measure its exact luminance value at different levels. Then simply test, in random order, different values and see if you're capable of even being within 100 nits of the actual volume of light. To make it even harder we could do this with an RGB light fixture, so you'd have to tell us the luminosity value while not only the brightness level but also the color is changing randomly. I doubt you'd get even 20% correct. In fact, in a "100 question test" with random RGB and luminosity values assigned, I'd bet you'd get less than 5%. Especially if we never acknowledged or told you whether you were correct or what the actual value was throughout the test. Again showing that it's your perception, and not reality.

And you’re ignoring the PQ curve which does a lot of work for not only OLED but all displays. In other words again it's perception. Not reality.

A reference monitor is stepping away from OLED.
You should probably be aware of what "reference" means. After you complete a grade on a monitor, all other displays are "wrong" because any other monitor won't show the decisions you made the same way.
If you're not using the display that the grader used when it was displayed, then everything you're seeing is being interpreted. Regardless of if the display you have is capable of "greater luminance targets" and "greater color volume". It's that reality that we live in every day.
So if you don't get that stuff in a movie theater was not graded using a projector and how that immediately relates to all other displays, then there isn't much I can do to get this through your skull.

Part of what we're even talking about here is the simple fact that these standards have to look good on a $40k display and a $400 display. On a laser projector and on a phone. And it turns out that an OLED display that's in the "middle" somewhere is better than about 95% of all displays. Pretty much the only way to beat it is spending at least 4x more for something much smaller and less practical for the intended use: which is watching movies.

It's also a waste to use a reference monitor to produce anything that strays from realism.
You’re just doing another flavor of the gamer complaint that movies should be filmed and played back at 120Hz because it’s “more real”.

Don’t watch movies. Ever. There are zero films you can cite, and that you ever will be able to cite, that look real. I’d love to have you list one movie. Just one that is supposed to look “real” to you.

No wonder all these incompetent standards have produced no good HDR these days.
Okay. It’s becoming clear you know nothing, want to know nothing, and just want to complain about things you know nothing about.

It’s clearly not for lack of being given the information or the knowledge. But you’re just some guy in an armchair who has never used a cinema camera, never developed an image, never worked in color science, never engineered a display, and who is unwilling to listen to anyone who has done those things, or to look at references from people who do and know more than you. So, I guess we’re done here because it’s no longer productive. You’re not bringing up new talking points; you’re simply repeating your uninformed, mistaken beliefs and stating that the entire industry should follow you. And they won’t. And they never will. But I suppose it's that precise position that allows you to stay uneducated and complain. Ignorance is bliss, right?
 
Last edited:

It looks like you forgot how PQ or HLG is calculated.

Anybody with a decent HDR monitor close to reference and a basic PQ/HLG calculation can grade movies of his own. Each pixel can be shown at the exact brightness and color. It stopped being an exclusive job for a certain group of people a long time ago.

Funny, I grade every movie I see. We can do a test to see how good a job you can do grading an SDR scene compared to me, or to anybody with the two requirements above.

Your movie standards are not living up; people are indeed wondering why certain HDR games can look more realistic and more impactful than the worst-graded movies now. You should see people mistake a realistically graded scene for gaming HDR CGI.

You are going to fade away if you don't even do realistic grading to appreciate the capability of a monitor.

I do hope ignorance is bliss, because I can tell a display is an OLED every time, since it helplessly flickers all over the place.
 
It looks like you forgot how PQ or HLG is calculated.

Anybody with a decent HDR monitor close to reference and a basic PQ/HLG calculation can grade movies of his own.
Getting "reality" isn't hard. You need a color chart and automated software. If you shot on target with middle grey using a light meter, between the chart and the exposure you can reproduce exactly what you saw.

Except you can't. Because by definition every point of interpretation isn't real. Your eyes aren't 200mm. Your eyes aren't 18mm. No lens sees the way human sight does. Camera sensors see more than human eyes do. And interpreting it by definition will always look different than what a human sees. Not only from the way the camera captures color, but also from the way whatever system you're using interprets color and your display interprets color. And although in theory, if your goal is to match, you can calibrate all of them using a myriad of targets and get close, no one wants to see that. This all has been possible for a decade. Watch a soap opera if you want a closer interpretation to "real colors" and "non-filmic motion".

HLG also has significantly less dynamic range than shooting in RAW or LOG. You'll never see anyone shooting in that mode unless they're going straight to broadcast, which was the original reason HLG was developed. Well, that and so it could be interpreted easily on both SDR and HDR displays simultaneously, specifically for OTA broadcasting. Which, if you're hardcore about HDR and "realism", you'd never use.
The irony here is you go on and on about how trash HDR standards are, while using a standard that is pretty much the worst one. Any SDR display that doesn't have HLG gamut mapping will clip and have tons of out-of-gamut SDR values and hue shifting. And in terms of HDR, although it allows for BT.2020 color primaries, it's still way more limited than using a "true" HDR standard. If that's what you're using, no wonder you think HDR standards are shit.
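For context on why HLG behaves so differently, here is a minimal sketch of the BT.2100 HLG OETF; the constants are from the spec, and note the input is relative scene light (scene-referred), not absolute nits the way PQ is:

```python
import math

def hlg_oetf(e):
    """BT.2100 HLG OETF: relative scene-linear light in [0, 1] -> non-linear signal in [0, 1]."""
    a = 0.17883277
    b = 1 - 4 * a                  # 0.28466892
    c = 0.5 - a * math.log(4 * a)  # 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)           # square-root segment, close to an SDR gamma
    return a * math.log(12 * e - b) + c   # log segment compresses the highlights
```

That bottom segment is what lets it degrade tolerably on SDR sets, and the scene-referred design is why nothing in an HLG signal pins a pixel to an exact nit value the way PQ does.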
Each pixel can be shown at the exact brightness and color. It stopped being an exclusive job for a certain group of people a long time ago.
That isn't in debate. It's just that no one cares.

And if you search YouTube for 30 seconds, you can find a bunch of color grading videos by people that have no idea what they're doing. Anyone "can" do it, but it's obvious the industry professionals are doing a far better job than amateur hour that is 99.99% of "colorists" on YouTube with few notable exceptions.
Funny, I grade every movie I see. We can do a test to see how good a job you can do grading an SDR scene compared to me, or to anybody with the two requirements above.
Cool. Post them. Let's see how "real" they are. You're bragging about it. Blow my mind. Show the before/after.

I don't feel it's necessary for me to do any work here. I haven't stated that my goal is to be "realistic", you have. My goal is always about acknowledging that it's an artistic practice and I care more about producing pleasant images rather than "what is real". That is your bent.
Your movie standards are not living up; people are indeed wondering why certain HDR games can look more realistic and more impactful than the worst-graded movies now. You should see people mistake a realistically graded scene for gaming HDR CGI.
Who? What people?

For reference I have no doubt that VR will become "a thing" and maybe in 20 years or so it becomes the more favored format over "traditional movies". But it won't be for the reasons you're bringing up. I guarantee you that the virtual world will be just as "artistic" as what is happening in films. If not more so. People in the virtual space want an escape from reality and not reality. Even after ray-tracing can be done "perfectly" on a phone at 120fps (with no rasterization), I guarantee you all games will continue to be driven by art. Because that is what it is.
You are going to fade away if you don't even do realistic grading to appreciate the capability of a monitor.
This is idiotic.
Again, you're the same sort of person that thinks that 24fps should go away. Or that film should change to be how you want it to be.
It turns out that the entire industry disagrees with you. So you'd have to change the minds of every director in Hollywood, every colorist, every DP, every production company, every streaming company (perhaps you haven't actually explored what "Netflix Approved" in terms of camera acquisition and processing means), and then also the tastes of every home viewer. Good luck with that.

Meanwhile every teenager posts on TikTok and Instagram using the heaviest and craziest filters, not giving a rip about whether or not color (or luminosity) is remotely close. And they're the next generation, showing pretty definitively that if "it's about realism" they don't care. Similarly the current generation making films (like Roger Deakins, arguably the best DP in Hollywood, or directors such as Tarantino or Nolan) as well as the ones leaving and nearing retirement (Ridley Scott, Spielberg, Lucas) also don't care. Jill Bogdanowicz, who was the colorist on Joker, made "the fakest", "most teal" grade pretty much of all time. Turns out everyone loved it. You may hate that all this is the truth, but it is. Turns out everyone cares about the art, not whether it "looks real".
I do hope ignorance is bliss, because I can tell a display is an OLED every time, since it helplessly flickers all over the place.
OLED doesn't necessarily flicker. The A95K doesn't. At all. Neither does the Flanders OLED monitor that I linked. Neither of the displays that I mention do. You're commenting on displays you've never seen or experienced and are trying to make performance claims. Ignoring literally the thing you're saying is the most important, which is: technical data that has been measured scientifically. You can't have it both ways.

And if you want, you can buy a $15 remote for your LG C2 as an example and turn off adaptive brightness. So, you really don't know anything about the display tech involved either. In fact any LCD with a PWM system for its backlight will "actually flicker" unlike OLED which is self-emissive and therefore has 100% stability.
 
Last edited:
Every OLED is born flickering. You cannot avoid that no matter what.

I get the strong impression that you cannot grade for crap and are just flapping around about your cameras, even with the whole imaginary Hollywood backing you up.

Better grade the scene below and let me see how far your arty "philosophy" has gotten you.

shot0069.jpg
 
Every OLED is born flickering. You cannot avoid that no matter what.
:rolleyes:
OLED is a self-emissive display type that uses sample and hold. You can avoid flickering easily, because OLED doesn't flicker. In order to have it flicker, it would also have to have an "off" state like a PWM system. And it doesn't. Your ignorance about OLED has been apparent this entire conversation.
I get the strong impression that you cannot grade for crap and are just flapping around about your cameras
You don't get it. Let's say for the sake of argument I can't. I don't need that capability in order to describe Hollywood, "realism", or the lack thereof. It's also not necessary to grade in order to know about the technology behind film.

You think that all of the directors I mentioned know how to color grade? Newsflash: they don't. The points I'm making stand up whether I have that capability or not. The fact that this is all you have to answer with as an argument doesn't help your case at all. And it’s also telling that you can’t name a single film that “looks real” while saying that this should be the goal when it isn’t and hasn’t ever been.

Still waiting: name one film.
even with the whole imaginary Hollywood backing you up.
Case in point, I literally linked a colorist for Dolby that spent an entire video talking about how grading is interpretive. Then I also referenced Jill Bogdanowicz, the colorist for the film "Joker". I don't have to be able to color grade at all to reference two sources that demonstrate my point completely, where you haven't demonstrated your point at all, either through yourself (whom you claim is the great source of knowledge) or through anyone with any clout or Hollywood knowledge.

If you'd like, I can also link for you this video here with industry colorist Cullen Kelly and also Jill Bogdanowicz herself, as they discuss show LUTs and creative intent while grading. But I doubt you'll watch that either, because you'd rather be in denial than actually see what people in Hollywood are doing.

Better grade the scene below and let me see how far your arty "philosophy" has gotten you.

View attachment 530489
Where's yours? Thought you were going to blow my mind? Show me the realism that makes me want to change my philosophies. You're supposed to change all of Hollywood's mind that it's supposed to be realer than real. Nevermind that anything you come up with will in fact be an interpretation.

But in order to do that correctly, I'd have to know what it was shot on and what gamma curve it was shot on. You know, basic stuff. I still won't bother, but your lack of bothering is also telling.
 
Last edited:

I was hoping to see at least a 1,000-nit highlight from a grade of yours, making an arty presentation, if your philosophy has ever made an image look good, but I'm disappointed. If you had checked the mp4 file, you would know the original gamut is your favorite Rec 709.


Heatmap 1x1 copy.png

Just check out the contrast boost.
There is still room for improvement from what it feels when painting a rooftop.
Scope.png


There are movies graded as well as a BBC documentary, such as the indie Prospect (2018), where you can fully appreciate the nature reserve of the Hoh Rainforest.

There is no CGI, just spectacular HDR sunlight.
 
Last edited:
I was hoping to see at least a 1,000-nit highlight from a grade of yours, making an arty presentation, if your philosophy has ever made an image look good, but I'm disappointed. If you had checked the mp4 file, you would know the original gamut is your favorite Rec 709.
Rec709 is just a color container format. LOG files present themselves as Rec709. That doesn’t mean that the image has been normalized. In fact your image is clearly not normalized through any transform. I wouldn’t consider that to be Rec709, or otherwise broadcast ready. And it’s obvious you agree.

View attachment 530514
Just check out the contrast boost.
There is still room for improvement from what it feels when painting a rooftop.
View attachment 530513

K. Which is still your creative intent. And you’re too ignorant to know that. You’ve crushed shadows according to both your waveform and false colors. Your eyes have enough DR to see transparently into shadows on a sunny day, but that isn’t what looks good most of the time, is it?

The fact that you’re talking to me about how you’ve added contrast and could “still add more” is literally showing how you’re interpreting how contrast should sit. A creative decision. You are not basing where contrast should sit empirically, as it was in the original scene. Otherwise you should be able to tell me the precise IRE or nits the shadows, the highlights, and the mid-tones should sit at, because you should’ve measured exactly where they were in “real life” in order to match it 1:1. It wouldn’t be a discussion about “adding contrast” because that isn’t an empirical measurement.

Precisely where you place contrast isn’t in order to match reality. It’s in order to match your creative intent.

You’re only proving my point.
There are movies graded as well as a BBC documentary, such as the indie Prospect (2018), where you can fully appreciate the nature reserve of the Hoh Rainforest.

There is no CGI, just spectacular HDR sunlight.
Feel free to link them. But you’ve missed it. Each of those images was made with creative intent. As was BBC’s Planet Earth. Not one image looks “real” or as it was. They’ve all been “enhanced”.
 
Last edited:
HDR grading makes images as close to reality as possible. That is the ultimate goal, not your defense of "creative intent" or whatever.

I've seen the same crap again: people who know nothing about PQ/HLG-based grading just saying the shadows are "crushed" or "lifted".

The shadows under 5 nits are basically the same as in the original SDR. They are in the low-light region, which won't be altered much. That is also why PQ is built on a simple idea: map the electrical signal 0-0.25 to 0-5 nits, 0.25-0.50 to 5-100 nits, 0.50-0.75 to 100-1,000 nits, 0.75-0.90 to 1,000-4,000 nits, and 0.90-1.00 to 4,000-10,000 nits in a smooth curve that stays under the Barten ramp. I can still lift the shadows or lower them even more if the scene requires.
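For anyone who wants to check those breakpoints, here is a minimal sketch of the ST 2084 inverse EOTF (nits in, signal out); the constants are from the standard, and the rounded outputs line up roughly with the ranges above:

```python
def pq_inverse_eotf(nits):
    """ST 2084 inverse EOTF: absolute luminance in cd/m^2 -> PQ signal in [0, 1]."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

# [round(pq_inverse_eotf(v), 3) for v in (5, 100, 1000, 4000, 10000)]
# -> [0.248, 0.509, 0.752, 0.902, 1.0]
```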

Scope-SDR.png
 
Last edited:
I've seen the same crap again: people who know nothing about PQ/HLG-based grading just saying the shadows are "crushed" or "lifted".

The shadows under 5 nits are basically the same as in the original SDR. They are in the low-light region, which won't be altered much. That is also why PQ is built on a simple idea: map the electrical signal 0-0.25 to 0-5 nits, 0.25-0.50 to 5-100 nits, 0.50-0.75 to 100-1,000 nits, 0.75-0.90 to 1,000-4,000 nits, and 0.90-1.00 to 4,000-10,000 nits in a smooth curve that stays under the Barten ramp.

View attachment 530530
Your standard is realism. This doesn’t look real. Period.

You’ve interpreted where shadows should sit.
 
Your standard is realism. This doesn’t look real. Period.

You’ve interpreted where shadows should sit.
As I said before, people will be sad to find out that 99.99% of monitors cannot even display a range like this.
The image can be graded from 0 nits all the way up to 10,000 nits, or even beyond that using another EOTF, with precision closer to real life, if there is also a monitor competent enough to display it.
Real life is the HDR reference.
 
Last edited:
As I said before, people will be sad to find out that 99.99% of monitors cannot even display a range like this.
Again, you’re not understanding what I’m saying.

if your grade was perfection:
What was the nit value of the shadows in real life?
What was the nit value of the roof in real life?
What was the nit value of the asphalt in real life?
What was the nit value of the power pole in real life?

If the answer to any of those questions is: “I don’t know” and you just “added contrast” then you aren’t working in realism. You’re just interpreting it how you want and not how it actually was.

This has nothing to do with available display tech. The PQ curve used with Rec2020 has a max luminance value of 10,000 nits. The sun. So you can place the sun and shadows in the same frame and have room for all of it. In other words, “perfect realism”. Just match it all up 1:1.

But if you aren’t matching the luminance value of each object in the frame in your grades 1:1 then it’s not realism. It’s just interpretation. The ability of displays to reproduce it isn’t the point. The point is your grading isn’t real. The same thing you’re claiming Hollywood is guilty of.

In other words, what the hell are you even talking about then? When called out on this you just say: oh well, displays can’t show it. All the while talking constantly about realism.
 
Last edited:
I know what your narrative is. It is all about incompetent, compromised HDR that favors low-range, more unrealistic images that just look arty.

If there were an HDR monitor with specs beyond real life, all this talk would be over. The shadows could be 0.01 nits while the rooftop could be graded to 1,000,000 nits, the same as real life, or whatever you detect with a luminance meter. When you looked at the image, it would look the same as a window.

HDR is meant for reality. That's the ultimate goal. People are going to push for it.
 
I know what your narrative is. It is all about incompetent, compromised HDR that favors low-range, more unrealistic images that just look arty.
You’re doing the same thing. So welcome to Hollywood. You didn’t grade “real” at all. Nor know how to.
If there were an HDR monitor with specs beyond real life, all this talk would be over. The shadows could be 0.01 nits while the rooftop could be graded to 1,000,000 nits, the same as real life, or whatever you detect with a luminance meter. When you looked at the image, it would look the same as a window.
Nope. Still will be subject to artistic interpretation.
HDR is meant for reality. That's the ultimate goal. People are going to push for it.
Lol.
 
I have no doubt a guy who can't even grade for crap won't be able to see or understand what a better image can be.
Okay? Well apparently you can’t either. Which is apparent above. Nor do you know how to make your own standard of realism. Soooo cool?

You don’t even understand the basics behind choices or decision making and you’re criticizing an entire industry that does. That much is obvious.
 
I can grade HDR 1000 and HDR 1400 with better contrast than you've ever seen, which is closer to reality.
Funny how you can use an entire industry to suddenly back you up when you don't do any HDR work yourself and have no good reasons other than inferior compromises.
These movies need to be graded in a way that sells on most monitors. That is the compromise. It has nothing to do with an HDR reference.
 
I can grade HDR 1000 and HDR 1400 with better contrast than you've ever seen, which is closer to reality.
Funny how you can use an entire industry to suddenly back you up when you don't do any HDR work yourself and have no good reasons other than inferior compromises.
These movies need to be graded in a way that sells on most monitors. That is the compromise. It has nothing to do with an HDR reference.
The point of grading on a reference monitor that “no one else can see” is to do things to the highest possible standards.

These are things the industry is already doing. I’ve given plenty of evidence of that. But you’re convinced that people are using 5,000-nit displays in order to merely shoot for “realism” and that isn’t the case. And it has never been the case. And it never will be the case.

No film looks real. Period. Regardless of if it’s on a 5,000-nit reference display or not. And the goal isn’t to be real. If it was, color grading wouldn’t be a thing.

You still don’t get that color grading isn’t necessary at all if all you want is 1:1 to real life. Just grade in a 10k-nit space, shoot a color chart. Match middle grey, use software that holds middle grey and gets all color values from your chart, let all values fall where they fall. Boom, done, every nit value would be the same as real life. Displaying it is another story, but that isn’t the point. You could say doing it this way would “future proof” your film and let quantizing allow for all displays to fall where they may until display technology catches up.
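To make that concrete, here is a minimal sketch of what such a "match it to the meter" pass could look like, assuming you have scene-linear RGB out of the camera, the grey patch's value in that data, and a spot-meter reading of the same patch on set. Every name here is hypothetical; it's just the "let every value fall where it fell" idea in code:

```python
import numpy as np

# ST 2084 (PQ) inverse-EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Absolute luminance (cd/m^2) -> PQ signal in [0, 1]."""
    y = np.clip(nits / 10000.0, 0.0, 1.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def grade_for_realism(scene_linear, chart_grey_value, metered_grey_nits):
    """Scale scene-linear RGB so the chart's grey patch lands at its metered nit level,
    then encode to PQ. No creative decisions anywhere -- which is exactly the point."""
    absolute_nits = scene_linear * (metered_grey_nits / chart_grey_value)
    return pq_encode(absolute_nits)
```

Nothing stops anyone from shipping a grade like this today; it just isn't done, because the goal is a pleasing image, not a metered one.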

You will see ZERO films done this way because it’s about artistic intent. So you keep going on and on about realism when everything can already be graded that way now, and it’s very intentionally not done.

There is no teal and blue in real life. No matrix green. No light reshaping. No film emulation. No post DOF or post blur. No targeting of values. No color compression. No pushing of hues. In other words for the 40th time, Hollywood, nor TV looks real or is designed to look real. And that isn't getting into things like added motion or distortions. Film makers are putting back in film gate and film weave artifacts. Halation. Film grain. None of that is real. And some of it if not all of it could be considered "defects".
You could argue that the news gets close, but then all you’re really getting is just how a particular camera interprets what it sees and compresses it down to Rec709 or HLG. And that isn’t “real” or “accurate” either.

If “real” is the metric, we will never be there because no one is trying to get there, other than you. Regardless of if we all had 10k peak brightness displays or not. But good luck with all that. Basically all I see is you insulting me, ignoring Hollywood, doing the same thing Hollywood is doing (interpreting footage), and complaining about “realism” while totally missing realism.
 
Last edited:

I already told you Prospect looks rather good. Movies like this will only look better as monitors become more capable, especially since it is mastered with Dolby Vision dynamic tone mapping.

You talk like you don't even have a true HDR 1000 monitor, since the way you talk is more like SDR.

Better show me what kind of work you've done so I can grade your SDR and let you know what a better image looks like lol.
 
I already told you Prospect looks rather good. Movies like this will only look better as monitors become more capable, especially since it is mastered with Dolby Vision dynamic tone mapping.

You talk like you don't even have a true HDR 1000 monitor, since the way you talk is more like SDR.

Better show me what kind of work you've done so I can grade your SDR and let you know what a better image looks like lol.
Which statements I've made about Hollywood, grading, TV, or display tech are false? You say the same garbage; meanwhile, I cite sources.
Give me your sources now. Or, I suppose, show me your show reel. You're saying that this has to be based on my work, but I'm not even citing myself. I'm citing professionals. So you're the professional? Then you're the one that needs to prove it.

Otherwise you really have nothing to say. But that has been apparent for basically all of your posts in this thread.
 
So you don't even have a true HDR monitor? lol.

Don't be shy. I have shown my way of HDR grading. Show me your work.
So you don’t know how Hollywood works? You can’t cite sources? You are ignorant about HDR? You don’t have a show reel? You’re not working at the level of any of the colorists that I cited, who work directly as industry professionals every day of their life for organizations like COMPANY3?

You don’t get it. Even if I was Mitch Bogdanowicz, I’m not citing myself. My "opinion" doesn't matter, it doesn't even factor into this. This has nothing to do with me. I’m telling you how it is (via numerous sources), and you’re out of touch. You’re the expert? You're the one citing yourself? Blow my mind. Still haven’t seen it yet. You demand from me, but I've already seen from you that you can't even live up to your own standards. I have ZERO to prove.
 
Last edited:

Of course you are not Mitch Bogdanowicz lol. You don't even have a true HDR monitor.

I don't give a crap about a guy who talks so much "philosophy" about HDR without ever seeing or realizing what HDR can look like.

The way you bring up Hollywood as the backup for your philosophy makes me laugh. Have you ever noticed that all the Marvel movies you fancy are graded more and more realistically with every release?

Heatmap 1x3 copy.jpg
 
Of course you are not Mitch Bogdanowicz lol. You don't even have a true HDR monitor.

I don't give a crap about a guy who talks so much "philosophy" about HDR without ever seeing or realizing what HDR can look like.

The way you bring up Hollywood as the backup for your philosophy makes me laugh. Have you ever noticed that all the Marvel movies you fancy are graded more and more realistically with every release?

View attachment 530588
None of that looks remotely realistic. It looks the opposite of real. It's beyond reality. And Thor in general is color graded like crazy. Which is ALSO NOT REAL or REALISTIC or towards REALISTIC INTENT. It's towards artistic intent. Literally the thing you're citing now is proving my point.
Realism is more than just "luminance" for the 50th time. "Realistic luminance" doesn't mean anything close to "realistic visuals". Everything here is designed to look better than reality, with nicer colors. Even every shade of blue in the sky and in his clothes is all hue shifted to stay inside specific ranges, as is the exact color of his red cape. If you even know who Mitch Bogdanowicz is and his work, then you should know that everything the man does is art driven.
Heck he also collaborated on Ravengrade which is a film emulation plugin: which again ISN'T WHAT REALITY LOOKS LIKE. So you're arguing with me about how Hollywood does things and then literally show just how manipulated their images are? What are you even talking about?

You talk about trying to look like reality and then post one of the most "enhanced" images from one of the most "enhanced" film franchises possible. There isn't even an attempt here for realism. It's stuff like this that has me convinced and you continue to convince that you don't know the difference between real and fake and even what color grading is or does.

EDIT: This WHOLE MOVIE WAS GRADED BY COMPANY3. Are you out of your mind? You think this is what reality looks like? What warped reality distortion field do you live in? Again, they're hired to make PLEASING IMAGES. Not real ones. For the love of the LORD Jesus Christ.
EDIT2: The main colorist is Jill Bogdanowicz! The person I cited up above that also graded Joker. And again, YOU CAN HEAR HER SAY FROM HER OWN MOUTH THAT SHE MAKES THINGS PLEASING AND NOT REAL. She talked with Cullen Kelly for an hour about LUTs, color separation, shaping, and film emulation! I cited her myself. Seriously, this is your win? She's saying the exact opposite things as you, and agreeing with my statements.
 
Last edited:
It is more realistic compared to other movies you fancy. Of course, your crap monitor doesn't do a proper job of making it look as intended lol.

HDR is meant for realistic images matched to human perception. An image like this is only going to look more realistic when the monitor displays more range.
 
It is more realistic compared to other movies you fancy.
What films do I fancy? You’re an expert on me, so go ahead and state all my viewing habits.
Of course, your crap monitor doesn't do a proper job of making it look as intended lol.
Source?
HDR is meant for realistic images matched to human perception. An image like this is only going to look more realistic when the monitor displays more range.
Every part of that image is manufactured. You don’t know what color grading is or does if you think that matches reality. It wasn’t designed to match reality to begin with.

The contrast doesn’t match reality. The color doesn’t match reality. The hues, everything, are driven towards artistic expression and how it feels, not reality at all. And you keep saying this is a display technology issue when it’s not. They could’ve mapped it all up perfectly with reality 1:1 AND DIDN’T.

I’M LITERALLY QUOTING THE COLORIST OF THAT FILM. HOW MUCH DENIAL CAN YOU BE IN TO SAY THE COLORIST IS WRONG ABOUT THEIR OWN WORK?

You are dense as bricks. COMPANY3, Jill Bogdanowicz, reality. Lol.
 