Are standard gamut (~99% sRGB) monitors still worth it in 2022?

3dprophet

Limp Gawd
Joined
Oct 9, 2020
Messages
178
Is it still worth buying standard gamut (~99% sRGB) monitors in 2022?

Or will I look like a cuck in a few years when wider gamut becomes the new standard and I'm still using an sRGB-only monitor?
 
Totally a cuck. Why you sRGB bro?

Honestly, unless creators are telling us which gamut to use, it's moot. My assumption is that unless spelled out otherwise, sRGB is the standard and is here to stay.
 
sRGB is still the standard for 99.9% of SDR content, and I doubt this is going to change at least until there is a system-wide way to map old sRGB content into wide gamut without ICC profiles calibrated to individual monitors, etc. Changing an existing standard is a pain in the ass, so it is easier to just make new ones and use them for specific purposes, like Adobe RGB for photo editing work or DCI-P3/Rec.2020 for HDR movies and games.
 
Gamut remapping and sRGB emulation have been a thing on wide gamut monitors for >10 years.
In fact, I use a monitor from 2010 with this feature, and sRGB (or rather Rec.709) is how I use this monitor 99.9% of the time.

Some earlier normal consumer models didn't have it (and some DCI-P3 monitors can still ship without it, which is quite ridiculous...), or the quality left a lot to be desired (e.g. Dell U2410/U2711), but professional models (pretty much what you would use today if you still used old monitors) almost universally had this nailed down.
 

Oh, I know it exists in hardware, but what about software? Say in a hypothetical distant future Windows 31 finally sheds the old sRGB and Rec.2020 becomes the norm from the ground up (and monitors capable of it are commonplace): what will happen to the old sRGB content? With no mapping it would become so oversaturated that it would look gross. And this mapping needs to be automatic for common people. We enthusiasts can always calibrate our screens and use ICC profiles or 3D LUTs, or turn on gamut emulation modes in the screens and whatever, but that is not what one should expect from common users.
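To illustrate what that automatic mapping would have to do under the hood, here is a minimal sketch in Python/NumPy. It assumes SDR content, the commonly quoted BT.709/sRGB-to-BT.2020 primaries matrix, and (for simplicity) re-encoding with the sRGB curve again; a real compositor would use the display's actual transfer function (e.g. PQ for HDR) and do this per frame for everything on screen.

Code:
import numpy as np

# Commonly quoted matrix for converting linear BT.709/sRGB primaries to BT.2020 primaries.
SRGB_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def srgb_decode(v):
    # sRGB-encoded 0..1 values -> linear light
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):
    # linear light -> sRGB-encoded 0..1 values
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def remap_srgb_to_rec2020(rgb):
    # Express sRGB pixels in Rec.2020 primaries so they keep their original,
    # less saturated look on a wide gamut display instead of being stretched.
    linear = srgb_decode(np.asarray(rgb, dtype=np.float64))
    linear_2020 = linear @ SRGB_TO_2020.T
    return srgb_encode(np.clip(linear_2020, 0.0, 1.0))

Skip that remapping and the same pixel values get interpreted as full Rec.2020 saturation, which is exactly the oversaturated look I mean.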
 
Maybe I don’t understand it… but aren’t bits bits? Why would software need to map to sRGB? Wouldn’t setting the monitor to sRGB mode be enough? Or are you advocating for a more seamless approach, which if possible, I would be on board for?
 

Definitely about a seamless approach that allows you to see sRGB content made today correctly on future operating systems that may work in larger gamuts internally. The monitor's sRGB mode is a band-aid that you have to turn on manually depending on the content you are watching.
 
How would it work? Do extended gamut content files have flags declaring which gamut they use? How would the OS know?
 

As far as I understand, yes. For example, a photo meant for the Adobe RGB gamut has flags or such to indicate it. When viewed in color-managed software (like Photoshop) and combined with an ICC profile calibrated from the monitor, it then shows correct colors. Or, if the monitor is not capable of showing it fully (say it only reaches 90% Adobe RGB), then you have choices for how it handles the out-of-gamut colors: stuff like Absolute, which shows colors as they are and hard-clips everything outside, or Perceptual, which shrinks the image gamut to fit the monitor without messing up the hue (i.e. a blue stays the same hue, just desaturated from the original), and a few others.

It works, but it is not universal, and it requires calibration of each individual monitor plus special software that makes use of the said calibration.
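For what it's worth, a minimal sketch of that workflow using Pillow's ImageCms (littleCMS) bindings, with made-up file names: it reads the profile embedded in the photo and converts to your own calibrated monitor profile with whichever rendering intent you pick.

Code:
import io
from PIL import Image, ImageCms

img = Image.open("photo_adobergb.jpg").convert("RGB")  # hypothetical Adobe RGB tagged photo

# Source profile is whatever the file itself declares (the "flag" mentioned above).
src = ImageCms.ImageCmsProfile(io.BytesIO(img.info["icc_profile"]))
# Destination profile is the one calibrated for your specific monitor.
dst = ImageCms.getOpenProfile("my_monitor.icc")  # hypothetical calibrated profile

# Perceptual: shrink the image gamut to fit the monitor, preserving hue.
perceptual = ImageCms.profileToProfile(
    img, src, dst, renderingIntent=ImageCms.INTENT_PERCEPTUAL)
# Absolute colorimetric: show in-gamut colors exactly, hard-clip the rest.
absolute = ImageCms.profileToProfile(
    img, src, dst, renderingIntent=ImageCms.INTENT_ABSOLUTE_COLORIMETRIC)

This is exactly the per-application, per-monitor dance that an OS-wide solution would have to make invisible.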
 
I like your idea. I think that, to add to what you're talking about, sRGB could be the fallback in case the content doesn't reveal itself, i.e. if it has no flags, then let's assume it's sRGB. :)
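Building on the sketch above, that fallback is just one extra check: if the file carries no profile/flag, treat it as sRGB.

Code:
import io
from PIL import Image, ImageCms

img = Image.open("some_photo.jpg").convert("RGB")  # hypothetical, possibly untagged file
icc_bytes = img.info.get("icc_profile")
if icc_bytes:
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))  # file declares its gamut
else:
    src = ImageCms.createProfile("sRGB")                   # no flags, so assume sRGB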
 
Windows 10/11 can already do gamut remapping for HDR, with full sRGB emulation for the desktop.
And HDR is really the only use case for wide gamut that anyone except people who prepare stuff for printing would have anyway, so it makes some sense.

Of course, HDR on LCD is nonsense. Maybe FALD displays are OK for videos/games, but they're not nearly good enough to use the desktop in HDR mode, so all this HDR mode business is another step to do before starting a game and another step after game over. One reason to not bother with FALD displays. At least with an HDR600 display I do not need to enable HDR mode at all 🥳
 