Do you like wide gamut monitors? And do you use a colorimeter?

sblantipodi

Hi there,

some years ago when the first wide gamut monitors became widespread, a lot of people talked about colorimeters.

No one talks about them anymore. Why?
Do you like how your wide gamut monitor behaves in an sRGB environment like Windows?

Are colorimeters unfashionable these days?
 
Because wide gamut (AdobeRGB) is useless for anyone except professionals who deal with photography/printing.
Rec. 2020 is superior and used in HDR content; any display worth its salt is capable of covering both sRGB and Rec. 2020/DCI-P3 gamuts.
 
The difference is that between professionals who are early adopters and a mature market.

Calibrating has always been niche. Displays do drift, but for 90% of people the factory calibration is good enough.

Plus as you can see above, many of us data geeks have drifted away from here to other locations due to attitudes.
 
Hi there,

some years ago when the first wide gamut monitors became widespread,
This has more or less already been answered, but I can add perhaps a bit more.

If a monitor is capable of any HDR standard, then it's capable of "wide gamut". HDR by definition is a "wide gamut".
Gamut is effectively a description of how many colors a given color space is capable of reproducing. And HDR gamuts are "large".
Rec. 2020 is the most common gamut that monitors target for HDR, though there is also Dolby Vision as the other major cinema format.
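
To put rough numbers on "large", here's a quick Python sketch (my own illustration, not from this thread) that compares the CIE 1931 xy chromaticity triangles of sRGB, DCI-P3 and Rec. 2020 using the published primary chromaticities. Triangle area in xy is only a crude 2D proxy for gamut size, but it shows the ordering.

```python
# Rough gamut comparison: area of each color space's primary triangle
# in CIE 1931 xy chromaticity coordinates (a crude 2D proxy, not a
# true 3D gamut volume).

PRIMARIES = {
    # (x, y) chromaticities of the red, green and blue primaries
    "sRGB / Rec.709": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":         [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":       [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a triangle given three (x, y) points."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

srgb_area = triangle_area(PRIMARIES["sRGB / Rec.709"])
for name, prims in PRIMARIES.items():
    area = triangle_area(prims)
    print(f"{name:>15}: area {area:.4f}  ({area / srgb_area:.2f}x sRGB)")
```

In plain xy terms that works out to roughly 1.4x sRGB for DCI-P3 and close to 1.9x for Rec. 2020; the coverage percentages quoted in reviews use different, perceptually weighted measures, so treat this only as a rough ordering.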
a lot of people talked about colorimeters.

No one talks about them anymore. Why?
Do you like how your wide gamut monitor behaves in an sRGB environment like Windows?

Are colorimeters unfashionable these days?
They're still integral to people doing professional work.
If your interest is ever trying to get an accurate image, they are just as important today as they were 10 years ago.

Most casual people don't care about accuracy, though. If you're a purist, as in you want to see the artist's full intent, then you'll have some interest in pursuing it.
If the image looks "fine" to most people then that's enough. Most don't think about display inaccuracy just as they don't think about audio EQ inaccuracy. In fact a lot of people will EQ their sound in a way that the producer, mixing engineer, and mastering engineer never intended.
 
Factory calibration has become a good bit better. The default preset is usually still a pile of crap, but there is typically a reasonably accurate option out of the box on most decent monitors.

sRGB modes are still terrible across the board because they lock every option, usually for no good reason, including brightness on many displays.

I used to borrow a colorimeter from work and calibrate my own displays with that, but my current workplace doesn't have one to borrow and I haven't bothered buying my own.
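
For anyone curious what the software side of that actually does with the readings, here's a toy Python sketch of grayscale calibration. It's my own illustration with made-up measurement numbers, and it only handles gamma; real tools like DisplayCAL or ArgyllCMS also deal with white point, per-channel curves and ICC profile generation.

```python
# Toy grayscale calibration: take measured luminance at several gray levels
# (what a colorimeter reports), fit the display's native gamma, and build a
# 1D correction LUT so the display tracks a gamma-2.2 target.

import numpy as np

# Hypothetical measurements: normalized input level -> measured luminance
# relative to white (numbers invented for the example, roughly gamma 2.45).
inputs   = np.array([0.10, 0.25, 0.50, 0.75, 1.00])
measured = np.array([0.0036, 0.034, 0.18, 0.49, 1.00])

# Estimate the display's native gamma with a least-squares fit in log space.
native_gamma = np.polyfit(np.log(inputs), np.log(measured), 1)[0]

target_gamma = 2.2

# Correction LUT: for input v we must send v ** (target / native) so that the
# display's own response (send ** native) lands on the target curve.
levels = np.linspace(0.0, 1.0, 256)
lut = levels ** (target_gamma / native_gamma)

print(f"estimated native gamma: {native_gamma:.2f}")
print("LUT sample (input -> value to send):")
for idx in (64, 128, 192):
    print(f"  {levels[idx]:.3f} -> {lut[idx]:.3f}")
```

The colorimeter's only job in that loop is supplying the `measured` array; everything else is software.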
 
Plus as you can see above, many of us data geeks have drifted away from here to other locations due to attitudes.
lol what exactly are you objecting to in that post?



OP, pros use calibration; most people wouldn't even notice the difference and factory settings are fine.
 
lol what exactly are you objecting to in that post?



OP, pros use calibration; most people wouldn't even notice the difference and factory settings are fine.

We used to have a lot more testing, teardowns, and data quoting to back up statements, but everyone seems much happier to make snappy hot takes that are blanket statements. (Not saying this isn't a widespread change.)

I'd say a lot of monitors come with options from the factory that are fine, but they still don't default to the most accurate-to-standard settings, and anyone who even asks about calibration has at least some interest in seeing things as the creator intended.
 
Some years ago, when wide gamut monitors started appearing, colorimeters were bought by normal users too; people simply didn't like the oversaturated colors in Windows, browsers, etc.
Why don't people have this problem now?

Most wide gamut monitors today come with a good factory calibration, but they are still wide gamut and colors are still very oversaturated.
 
Some years ago, when wide gamut monitors started appearing, colorimeters were bought by normal users too; people simply didn't like the oversaturated colors in Windows, browsers, etc.
Why don't people have this problem now?

Most wide gamut monitors today come with a good factory calibration, but they are still wide gamut and colors are still very oversaturated.

I used to have an AdobeRGB monitor, an HP LP24something (I don't remember the exact model), and when you used it to watch sRGB content (basically everything PC related) it made everything look really weird. Colors were super saturated and almost candy like, and every person suffered from chronic sunburns. No sRGB emulation in that one; the only way I could make colors look tolerable was to turn Digital Vibrance down in the Nvidia Control Panel.

The DCI-P3 gamut of my HDR TV used as a monitor does not have the same effect. Yes, if you use it to view sRGB content then colors do have more pop, but it does not look weird. Skin tones are still natural. While there is enough purist in me to always use a proper sRGB mode on my screen if available, subjectively I may actually prefer the way DCI-P3 looks with sRGB content, especially in games.

And of course HDR makes full use of DCI-P3 and Rec. 2020, and then you use the correct gamut from start to finish: no oversaturation, and the only colors that pop are the ones that are supposed to.
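
To put a number on why unmanaged sRGB looks oversaturated: the same (255, 0, 0) triplet points at different primaries in sRGB and in Display P3. The Python sketch below is my own illustration; it builds the RGB-to-XYZ matrices from the published primary chromaticities and a D65 white point (the standard construction), then shows where pure sRGB red actually belongs versus where it lands when a P3 panel displays the same values without color management.

```python
# Why unmanaged sRGB content looks oversaturated on a wide gamut panel:
# the same RGB triplet drives different primaries in sRGB vs Display P3.

import numpy as np

D65 = (0.3127, 0.3290)  # white point chromaticity

def rgb_to_xyz_matrix(primaries, white=D65):
    """Build a linear RGB -> XYZ matrix from (x, y) primaries and white point."""
    cols = []
    for x, y in primaries:
        cols.append([x / y, 1.0, (1 - x - y) / y])   # XYZ of primary with Y = 1
    m = np.array(cols).T
    xw, yw = white
    white_xyz = np.array([xw / yw, 1.0, (1 - xw - yw) / yw])
    scale = np.linalg.solve(m, white_xyz)            # scale primaries so R+G+B = white
    return m * scale

def xy(xyz):
    """XYZ -> chromaticity coordinates (x, y)."""
    return xyz[:2] / xyz.sum()

M_SRGB = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)])
M_P3   = rgb_to_xyz_matrix([(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)])

red = np.array([1.0, 0.0, 0.0])                      # linear "pure red"

print("sRGB red, displayed correctly:  xy =", xy(M_SRGB @ red).round(3))
print("same triplet, unmanaged on P3:  xy =", xy(M_P3 @ red).round(3))

# Proper color management converts through XYZ instead of passing values through:
managed = np.linalg.solve(M_P3, M_SRGB @ red)        # P3 drive values for sRGB red
print("P3 values that reproduce sRGB red:", managed.round(3))
```

By construction the first line lands on the sRGB red primary at (0.640, 0.330) and the second on the P3 primary at (0.680, 0.320): same pixel values, visibly more saturated red. The last line is roughly what an sRGB emulation mode or a color-managed app does instead of passing values straight through.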
 
I love the saturated look you can get with wide gamut screens. It's not accurate but I like it nonetheless ;)
 
Before HDR, wide gamut was the wild west in that there was no software "mode" that told the OS a wide gamut color space was expected (HDR now requires/expects wide gamut color spaces). As a consequence, most SDR images and all SDR video are sRGB mastered and look "wrong" in a wide gamut color space without correction. Since wide gamut was a niche thing, monitor makers would just quote wild specs, and in general most monitors were very badly calibrated in wide gamut (and sometimes in sRGB too), as you can see from older reviews.
Things are generally better now and out-of-the-box accurate modes are available on most decent monitors. I always check for this in reviews and look at the delta E pre and post calibration.
I still have and use both a colorimeter and a spectrophotometer.
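
Since delta E keeps coming up in reviews: it's just a distance between the measured color and the intended reference in a roughly perceptual space. Here's a minimal Python sketch of the classic CIE76 version with made-up patch values; reviewers usually quote CIEDE2000, which adds hue and chroma weighting terms, but the idea is the same.

```python
# Minimal CIE76 delta E: Euclidean distance between two colors in CIELAB.
# Smaller is better; the reference is what the standard says the patch
# should be, the other value is what the colorimeter measured.

import math

def delta_e_76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

# Hypothetical numbers: reference patch vs. what a colorimeter measured.
reference = (50.0, 20.0, -10.0)
measured  = (51.2, 18.5, -8.9)

print(f"delta E (CIE76): {delta_e_76(reference, measured):.2f}")
# Rule of thumb: under ~1 is generally invisible, under ~3 is hard to
# spot in normal viewing.
```

The "delta E < 2 out of the box" claims in reviews are usually this kind of number averaged (or maxed) over a set of test patches.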
 
This has more or less already been answered, but I can add perhaps a bit more.

If a monitor is capable of any HDR standard, then it's capable of "wide gamut". HDR by definition is a "wide gamut".
Gamut is effectively a description of how many colors a given color space is capable of reproducing. And HDR gamuts are "large".
Rec. 2020 is the most common gamut that monitors target for HDR, though there is also Dolby Vision as the other major cinema format.

They're still integral to people doing professional work.
If your interest is ever trying to get an accurate image, they are just as important today as they were 10 years ago.

Most casual people don't care about accuracy, though. If you're a purist, as in you want to see the artist's full intent, then you'll have some interest in pursuing it.
If the image looks "fine" to most people then that's enough. Most don't think about display inaccuracy just as they don't think about audio EQ inaccuracy. In fact a lot of people will EQ their sound in a way that the producer, mixing engineer, and mastering engineer never intended.
To be fair, not everyone sees things the same way (and I mean this literally, as in their vision is not the same). So "the way it was intended to be" is often not "the way it is perceived," even if the display is calibrated perfectly, in an appropriately lit room, at the correct distance, etc.
 