6 bit FRC vs True 8 bit?

jt55 - n00b - Joined Feb 28, 2020 - Messages: 12
Anyone who has owned/compared true 8bit and 6bit+FRC monitors, did you discern a difference?
All discussion I can find outside of pro photography is basically reduced to 'the difference is too small to matter'.

There is obviously a difference; it's the reason a professional market exists for true 8bit+ displays. But what is it?

It should not be related to vibrancy/gamut; there will be some 6bit+FRC panels capable of better/brighter colours than many true 8bit displays.

It shouldn't affect visible banding either; the whole purpose of FRC is to reduce banding.

I'm guessing it must be some other sort of visible artifact that is added by FRC?
 
I noticed a massive jump between 8-bit and true 10-bit. But then I was moving from something that could only really display sRGB to something that could display most of the DCI-P3 color gamut.

What FRC essentially does is switch a pixel's voltage back and forth between two states to "emulate" the colors it can't natively display. It uses an attempted "middle state" to show those colors. If each pixel state is a step on a staircase with no step in the middle, then this constant alternating voltage is an attempt at landing somewhere in the middle.
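Just as a toy sketch of the idea (not any vendor's actual algorithm - real panel controllers do this in hardware and usually mix in spatial dithering too), temporal dithering looks roughly like this in Python:

Code:
def frc_sequence(target_8bit, frames=8):
    # Approximate an 8-bit level on a 6-bit panel by alternating between
    # the two nearest 6-bit levels over successive frames.
    low = target_8bit >> 2           # nearest 6-bit step below (0-63)
    frac = (target_8bit & 0b11) / 4  # how far the target sits between steps
    out, acc = [], 0.0
    for _ in range(frames):
        acc += frac
        if acc >= 1.0:               # show the higher step just often enough
            out.append(min(low + 1, 63))
            acc -= 1.0
        else:
            out.append(low)
    return out

# 8-bit level 130 has no exact 6-bit equivalent; the panel shows
# [32, 33, 32, 33, ...] so the time-average lands on the missing shade.
print(frc_sequence(130))

Averaged over enough frames your eye sees the in-between shade; frozen on any single frame, the pixel is sitting on one of the two real steps, which is where the flicker/shimmer complaints come from.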
If all you do is game and other factors like Hz, response time, and rise and fall/grey-to-grey matter to you more than visual fidelity, then you probably won't care whether your display is 6-bit+FRC or true 8-bit.
But if you actually do anything that is remotely color critical and you need to be able to display a true, unaltered gamut in order to do your work, then it will matter a heck of a lot more. FRC displays will generally be harder to calibrate - their "middle states" aren't constant, and since they aren't able to hold those colors they will invariably be inaccurate (pixels in those in-between states may also flicker in a worst-case scenario). And yeah, it's still possible to have banding - but in theory FRC should help minimize it. Again, whether you care if you're not doing anything color critical is up to you.

To me, I'm so far on the other end of the spectrum that this isn't even a question I would ask. I more or less won't buy a display at this point that isn't true 10-bit. But if all your frames are 1/120th or 1/144th of a second long and you're busy blowing up people's heads, then whether every shade of red spurting from someone's recently decapitated neck is color accurate may not even register as being remotely important to you. If you're buying a display made in the last 2 years, though, it's also probably not something you have to worry about. It's rare to see 6+2 FRC anymore; I at least haven't personally seen a new display like that in a while. Even most gaming displays at this point are true 8-bit or 8+2 FRC.
 
Massive jump in what?
Viewing 10bit content or all content?
As you say, you went from a limited to a wider gamut display, which is going to be the biggest difference and, if uncalibrated, will show stronger colours.
 
EDIT and tl;dr: I want to preface this and stay on topic for your OP, as much as I enjoy explaining technical stuff, so to re-answer your question: you probably won't notice or care about 6+2 FRC vs 8-bit vs 8+2 FRC vs 10-bit if all you do is game, look at Word/spreadsheets, or browse. You might start to care if you're really interested or involved in media creation. You also might start to care if you're really into the film/video fidelity you can see while watching a film.
I can't say with any degree of accuracy whether you'll notice any of this stuff or not. But I work on a color-managed OS with a color-calibrated 10-bit display on a daily basis; that was important for me and what I do, and I definitely noticed a difference immediately - but I also have semi-trained eyes and am looking for and at color on a daily basis. Most normal people do not.
If you want to have probably more than you bargained for explained below, then feel free to read it. Otherwise I won't be offended if you just move on.

I would also recommend reading this primer from BenQ about 8-bit vs 10-bit if you're generally interested in learning more about this stuff and whether any of it matters to you: https://www.benq.com/en-us/knowledg...oes-monitor-panel-bit-color-depth-matter.html
Their interest is in selling monitors of course, so they really try to push people to 10-bit - so feel free to ignore that. But the information about 8 vs 10 bit is relevant.


Massive jump in what?
My career is in photo and video, so both of those vastly exceed 10 bits of color. They also vastly exceed the DCI-P3 color space. A Sony RAW file from any of their cameras made in the past 5 years or so is 14-bit. Moving from RAW to literally any color space drops a lot of information - unless you're working in a 16-bit color space with a gamut wide enough to hold all the color information (but even then you lose a lot of pliability moving from RAW to a color space that Photoshop accepts, such as ProPhoto RGB, and even less pliability than that with Adobe RGB). But anyway, suffice it to say I can definitely see way more color information and make much better-informed decisions about how things will appear in both print and on the web.

So a massive jump in both color and luminosity information. That 2-bit jump moves you from 16.7 million colors to over 1 billion. And by "colors" I mean not only differences in shade but differences in luminosity - which is really how all those colors are achieved. 12-bit panels, which are now starting to occupy the extreme top end, are able to process and display 68.7 billion colors.
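If you want to sanity-check those numbers, it's just powers of two per channel, cubed for the three RGB channels:

Code:
# levels per channel = 2**bits; total colors = (2**bits)**3 for RGB
for bits in (6, 8, 10, 12):
    per_channel = 2 ** bits
    print(f"{bits}-bit: {per_channel} levels per channel, {per_channel ** 3:,} colors")

# 8-bit  -> 16,777,216       (~16.7 million)
# 10-bit -> 1,073,741,824    (~1.07 billion)
# 12-bit -> 68,719,476,736   (~68.7 billion)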

But again, as I mentioned before, it's still not technically possible to show on a display every color or the full dynamic range that a camera is capable of capturing. ARRI, Sony, and RED (to name a few; there are others) all have cameras that shoot in full 16-bit RAW. This may be news to you, but every color space that you see on a computer, in a movie theater, or on a TV is an agreed-upon LIMITED color space, designed to be a standard that is achievable on those mediums. So they take 16 bits' worth of data and, essentially through the selection of a specific color space, limit it to something like DCI-P3, which is generally 10 bits' worth of data. All of which is far less than 16-bit. They squash down the dynamic range and color data massively before you ever see it.
Viewing 10bit content or all content?
It depends on whether you want to view things accurately or not. To directly answer your question: just for things that are outside the range of sRGB (sRGB is basically 8-bit content). I'm on a Mac, and unlike Windows, almost everything in macOS is color managed. So I don't have problems viewing sRGB content properly and then switching to another application that uses a bigger color space and viewing that "improperly". I only have to calibrate my display to its maximum gamut, and color management at the OS level handles the rest. There are a few apps that are not properly color managed, but those are the exceptions - whereas on Windows that's basically the norm.

So when I edit a movie in FCPX I can see the full range that's available to me in whatever format I choose - Rec. 601, Rec. 709, or HDR (or at least as much as 10 bits can display) - and when I export, if I export to a smaller gamut (Rec. 601/sRGB), I can then view the content properly inside that color space.
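If you've never seen what color management actually does, here's a rough Python/Pillow sketch (the filenames are made up, and this is only a crude stand-in for what the OS or FCPX does internally): it remaps a wide-gamut image into sRGB through the ICC profiles instead of just reinterpreting the raw RGB numbers.

Code:
import io
from PIL import Image, ImageCms  # Pillow built with littleCMS support

im = Image.open("wide_gamut_photo.jpg")    # hypothetical wide-gamut export
icc_bytes = im.info.get("icc_profile")     # embedded source profile, if any

if icc_bytes:
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    dst = ImageCms.createProfile("sRGB")
    # Convert colors through the profiles rather than copying values as-is
    srgb = ImageCms.profileToProfile(im, src, dst)
    srgb.save("photo_srgb.jpg")
else:
    print("No embedded profile - a color-managed viewer would just assume sRGB")

An un-managed viewer skips that conversion entirely, which is why wide-gamut content looks oversaturated in apps that aren't color managed.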
As you say, you went from a limited to a wider gamut display, which is going to be the biggest difference and, if uncalibrated, will show stronger colours.
I mentioned this above, but this is less of an issue in macOS than in Windows. In Windows you kind of have to calibrate your display for whatever it is you're trying to view and hope that it works out properly. Some software, like DaVinci Resolve and Photoshop, is properly color managed - but basically all of the viewing software isn't, and browsers on Windows aren't either, so it can cause a lot of headaches.

Also, to the best of my knowledge, most video-card manufacturers didn't even want to support 10-bit on Windows unless you bought a professional-level video card such as a Quadro or Radeon Pro - which was a really dumb, really annoying software limitation for working professionals who actually could use the additional color data (such as small-business photographers or videographers).
I'm fairly sure AMD came around to allowing full 10-bit on consumer cards on Windows through their driver stack; I'm not sure if NVIDIA has or not, but that was also a really big problem on Windows for a long time. Not only couldn't you output 10-bit (it required a very specific chain of software and hardware - specific graphics cards and specific displays), but then you couldn't color manage anything. In short, it was a mess even just 2 or so years ago. I haven't looked into it again in a while, but I'm also not frequently in Windows 10 unless I specifically want to game.
(EDIT: I may have slightly misspoken. AMD and NVIDIA specifically don't allow 10-bit on their non-professional video cards for rendering in OpenGL - it appears to be possible for the desktop - but that again still has all the headaches associated with managed versus unmanaged applications.)

For games, 10-bit probably means nothing or virtually nothing (unless you like an overly saturated look that the devs didn't intend and didn't bother to properly color manage). If you're involved in media development and creation, then a true 16-bit display can't come fast enough. But to say that that is a few years off is a massive understatement since, as I mentioned before, we're just starting to get 12-bit displays at the extreme high end.
 
Ok, just to be clear you are talking about 8bit+FRC to 10bit, not 8bit to 10bit?
If the differences are that clear for the 8bit+FRC to 10bit transition, then 6bit+FRC must be especially limited compared to 8bit.
You've kinda piqued my interest in 10bit displays as well, but being on Windows and using it only for entertainment, that is probably better left until it becomes more widely adopted in the future.
 
Ok, just to be clear you are talking about 8bit+FRC to 10bit, not 8bit to 10bit?
For my personal experience, I'm referring to true 8-bit to true 10-bit.
If the differences are that clear for the 8bit+FRC to 10bit transition, then 6bit+FRC must be especially limited compared to 8bit.
You've kinda piqued my interest in 10bit displays as well, but being on Windows and using it only for entertainment, that is probably better left until it becomes more widely adopted in the future.
Everything comes down to preferences and money. Right now the best consumer display is arguably the LG 48CX OLED (which is also getting discussed widely on this forum): an HDMI 2.1, 4K, 120Hz, 10-bit display with 4:4:4 chroma. The short version is that it meets basically the entire spec list if you can afford it and have the graphics card to drive it, with the only downside being that at 48" it's a bit large.
If you're in the professional sphere, everything is 10/12-bit. However, a lot of gaming monitors are 8+2 FRC or 10-bit now, the Acer Predator X27 being a prime example. Hopefully there will be some 32" options from LG and AOC this year that are also 120Hz, 4K, and 10-bit. Most, I think, would like to have at least that list of specs, if not also HDR10 certification and/or mini-LED backlights. But we're now getting into territory you haven't talked about.

Again, either way, you're likely to end up buying an 8-bit display anyway. I think it's pretty hard to get a 6+2 FRC display at this point unless whatever you're buying is old and/or costs less than $100.
 
6-bit + proper A-FRC will actually be less visible for static content like photo editing than in games, where the additional dithering will cause slight motion artifacts like faint banding and less defined edges.
There is slight noise on 6-bit, especially visible in darker tones - not that hard to notice when looking for it, but also not something most users will look for, and certainly not something that is noticeable on its own.

6-bit is not an issue except on the cheapest IPS panels. Most today are 8-bit+A-FRC. Some claim to be 10-bit, but even if they were just labelled 10-bit because they accept a 10-bit input, no one could really tell; the tonal differences are simply too small to discern. This would not be the case if LCDs had a contrast ratio like 10K:1, but they do not.
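Rough back-of-the-envelope for why the noise shows up mostly in the darker tones (assuming a plain 2.2 gamma display, which is a simplification):

Code:
# Relative jump in light output between two adjacent 8-bit codes,
# assuming output luminance ~ (code / 255) ** 2.2
def step_ratio(code):
    lo = (code / 255) ** 2.2
    hi = ((code + 1) / 255) ** 2.2
    return hi / lo

print(step_ratio(10))   # ~1.23 -> a one-code wobble is a ~23% jump near black
print(step_ratio(200))  # ~1.01 -> the same wobble is about 1% in bright tones

So the same one-step dithering wobble is proportionally a much bigger change near black, which is where you tend to spot it.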
 
I've had bad experiences with 6-bit + 2-bit FRC dithering relating to "eye dizziness"... YMMV though.
 
There is one game which had extra visible banding when I compared it between an 8bit and a 6bit+FRC screen; otherwise the only thing I generally notice is less dark detail.
 
I've had bad experiences with 6-bit + 2-bit FRC dithering relating to "eye dizziness"... YMMV though.
To really be able to make such a claim you would need to have the same panel in both a 6-bit and an 8/10-bit version.
It is actually possible to force 6-bit + all kinds of dithering on 8/10-bit panels, including the same algorithms that modern monitors use for A-FRC.
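For example (just a NumPy sketch, not any panel's actual firmware), you can fake the comparison on any display by quantizing a smooth gradient to 6-bit levels with and without a simple ordered dither:

Code:
import numpy as np

# A smooth 0-255 horizontal gradient, 64 rows tall
grad = np.tile(np.arange(256, dtype=np.float64), (64, 1))

# Plain truncation to 6-bit levels (0, 4, 8, ..., 252): visible banding
banded = (grad // 4) * 4

# Crude 2x2 ordered dither: add a small repeating offset before truncating,
# trading the hard bands for fine noise (a rough stand-in for panel dithering)
offsets = np.tile(np.array([[0.0, 2.0], [3.0, 1.0]]), (32, 128))
dithered = np.clip(((grad + offsets) // 4) * 4, 0, 252)

# Average 2x2 blocks (a crude stand-in for the eye blurring the fine noise)
def block_mean(img):
    return img.reshape(32, 2, 128, 2).mean(axis=(1, 3))

print(np.abs(block_mean(banded) - block_mean(grad)).mean())    # ~1.5 levels off
print(np.abs(block_mean(dithered) - block_mean(grad)).mean())  # much closer

Save the two arrays out as images and view them full screen and the trade-off (hard bands versus fine noise) is pretty obvious.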
 
I could see (barely) a shimmering effect on 2008-2012ish Dell 22" 1680x1050 monitors that used 6+2. It was very subtle, but fell under the category of things that couldn't be unseen afterwards. At my old job I had "fun" arguing with IT about why I wanted an older (but higher-end, native 8-bit) monitor rather than the newer-model-year (but cheaper) 6-bit panel I was initially given when I needed a screen replaced.

I'm not sure if it was that replacement or a different one, but I also had the dubious pleasure of disputing that a TN panel that would visibly color shift depending on whether I was sitting upright or leaning back was an acceptable alternative to an otherwise equivalent VA model.
 
I could see (barely) a shimmering effect on 2008-2012ish Dell 22" 1680x1050 monitors that used 6+2. It was very subtle, but fell under the category of things that couldn't be unseen afterwards.
I totally get what you're saying. I had a 6bit + 2 FRC monitor 10-15 years ago and at some point I saw that shimmering. Extremely subtle, but once I noticed it I could never unsee it. I looked for true 8bit displays after that - not that I need to color correct anything, but I prefer it. Since I recently got my Pixio 275h (which is most certainly an 8bit + 2 FRC panel) I haven't noticed that effect anymore, perhaps because there are way more colors being displayed, so it's harder to notice. Maybe it is a true 10bit panel, but I'd highly doubt it.
 