U2410: Grey gradients have pinkish stripes after calibration.

I get barely (but still) distinguishable pinkish color stripes on greyscale gradients after calibration. Is there any way to get rid of them?

Monitor: Dell U2410
Monitor preset: Adobe RGB
Connection: 10-bit DisplayPort
Calibration tool: Spyder4Pro
Calibration soft: Argyll CMS + dispcalGUI
Calibration quality: High
Profile quality: High
Profile type: XYZ LUT + swapped matrix

Additional info:

Video card: GTX 680
OS: Windows 7
 
Does the GTX 680 support 10-bit via DP?
Judging by the results (banding), it probably doesn't...
 
Does the GTX 680 support 10-bit via DP?
Judging by the results (banding), it probably doesn't...

The banding is virtually non-existent.

If GTX 680 doesn't support 10-bit via DP, then what on earth does?

I have calibrated this monitor over an 8-bit connection. Trust me, there was a HUGE difference in gradient banding, which leads to only one conclusion: DP on the GTX 680 does indeed provide a 10-bit connection.
 
Everything I'm finding on the internet suggests that only Quadro and FirePro drivers support 10-bit output.

This technical brief specifically mentions Quadro also.
 
Does the GTX 680 support 10-bit via DP?
Judging by the results (banding), it probably doesn't...

No, only Quadro and FireGL/FirePro through DisplayPort.

ATI's regular Radeon cards have internal 10-bit LUTs, which helps when calibrating via the video card LUT and "resists" banding a little better than nVidia boards, which (correct me if I'm wrong) only have internal 8-bit LUTs. Both will only output 8-bit color through DP and DVI, though.
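To illustrate the LUT point, here is a minimal Python sketch (numpy assumed) of what happens when a calibration curve ends up quantized to 8 bits in the video-card LUT: some output codes repeat and others get skipped entirely, which is exactly what shows up as banding in a grey ramp.

Code:
import numpy as np

# Illustration only: push an 8-bit grey ramp through a gamma-1.1 correction
# curve that has been quantized into a 256-entry, 8-bit video-card LUT.
gamma = 1.1
ramp = np.arange(256)                              # input grey levels 0..255
curve = (ramp / 255.0) ** (1.0 / gamma)            # calibration curve
lut_8bit = np.round(curve * 255).astype(np.uint8)  # LUT entries rounded to 8 bits

out = lut_8bit[ramp]                               # the ramp after the LUT
used = len(np.unique(out))
print("distinct output levels:", used)             # fewer than 256: some codes repeat
print("output codes never used:", 256 - used)      # gaps between codes -> visible steps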

It was suggested earlier by someone on this forum (maybe ToastyX), so I checked a GTX 680 against a 7970 and found there was more banding on the GTX 680 than on the 7970 (with all other variables the same).

Goodcoin, why not try a 14-day demo of BasicColor (from www.basiccolor.de) or ColorEyes?

This will help you isolate colorimeter versus software issues.

Also try the "black level" test or another program with a gradient, and not through Firefox -- try it in Internet Exploder instead. Firefox can do funny things with profiles.
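If you want a gradient you can open in any program outside the browser, here is a small sketch that generates one, assuming numpy and Pillow are installed (the filename is arbitrary).

Code:
import numpy as np
from PIL import Image

# Write a plain 8-bit horizontal grey ramp. Any tint or wide flat band you
# then see on screen comes from the display pipeline, not from the file.
width, height = 1024, 256
ramp = np.linspace(0, 255, width).astype(np.uint8)
Image.fromarray(np.tile(ramp, (height, 1)), mode="L").save("grey_ramp.png")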
 
OK, if the GTX 680 has an 8-bit internal LUT and its DP provides the same 8-bit connection as standard DVI, then how on earth do I get massively better gradients on DP? Any explanation?
 
Goodcoin, why don't you just sell the U2410 and get a good monitor instead of making new threads about all of its problems?
 
Goodcoin, why don't you just sell the U2410 and get a good monitor instead of making new threads about all of its problems?

I can't sell a monitor just because I can't calibrate it properly. Not until I'm 100% sure that it's not my fault and that it is physically impossible to calibrate this monitor properly. THEN I may consider selling it and buying something that gives less headache.
 
ATI's regular Radeon cards have internal 10-bit LUTs, which helps when calibrating via the video card LUT and "resists" banding a little better than nVidia boards, which (correct me if I'm wrong) only have internal 8-bit LUTs. Both will only output 8-bit color through DP and DVI, though.

It was suggested earlier by someone on this forum (maybe ToastyX), so I checked a GTX 680 against a 7970 and found there was more banding on the GTX 680 than on the 7970 (with all other variables the same).
All GF8xxx and above have a 10-bit LUT and can actually display 10-bit on all monitors via a D-SUB connection (even the lowest-end, D-SUB-only TN monitor will have pretty smooth gradients after e.g. gamma correction over that connection!)

The HD 7970 may be using some kind of dithering? But it's more likely there were no differences :rolleyes:

@Goodcoin
if you want a monitor that can be truly calibrated, it has to be equipped with its own hardware LUTs, and it would be very good if it had a gamut remapping feature...

but back to the U2410: to check how it would look with a 10-bit connection, you can try D-SUB
 
OK, if the GTX 680 has an 8-bit internal LUT and its DP provides the same 8-bit connection as standard DVI, then how on earth do I get massively better gradients on DP? Any explanation?

Not sure, really, but if it WERE 10-bit it wouldn't show banding at all. There are 10-bit tests around the web you can try. Once I get my 5870 back from XFX RMA I will try the 5870 > FirePro soft-mod and see what happens. I have a few 10-bit-capable screens.
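For what it's worth, the logic behind those web 10-bit tests is simple enough to sanity-check yourself; a rough sketch (numpy assumed):

Code:
import numpy as np

# A 10-bit test ramp steps by exactly one 10-bit code (1/1023) per patch.
# On a true 10-bit pipeline every patch is distinct; on an 8-bit pipeline
# groups of roughly four neighbouring patches collapse into the same value,
# so the ramp shows steps about four patches wide.
patches = np.arange(1024) / 1023.0
as_8bit = np.round(patches * 255).astype(int)
print("distinct values, 8-bit pipeline :", len(np.unique(as_8bit)))   # 256
print("distinct values, 10-bit pipeline:", len(np.unique(patches)))   # 1024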

Goodcoin, why don't you just sell the U2410 and get a good monitor instead of making new threads about all of its problems?

I recently tested a friend's brand new Revision A07. On BCC with the EODIS3 using the sRGB preset I got an 850:1 contrast ratio at 140 cd/m² white and a DE94 of 1.7 (average) and 3.02 (max).

Color temperature was a bit off at ~6900K so they aren't as shitty as they used to be. Also the AG was no better nor worse than any recent IPS 24".

My summation for him: at $370.00 CDN, plus a jerk-off ECO/recycling fee of $11.00, plus taxes, he got a good monitor at a great deal. As always YMMV, and it depends on how recently the Dell equipment was calibrated, etc., but it's not as bad as it was originally, IMHO. No tinting from green to red either that I can see.

All GF8xxx and above have a 10-bit LUT and can actually display 10-bit on all monitors via a D-SUB connection (even the lowest-end, D-SUB-only TN monitor will have pretty smooth gradients after e.g. gamma correction over that connection!)

The HD 7970 may be using some kind of dithering? But it's more likely there were no differences :rolleyes:

@Goodcoin
if you want a monitor that can be truly calibrated, it has to be equipped with its own hardware LUTs, and it would be very good if it had a gamut remapping feature...

but back to the U2410: to check how it would look with a 10-bit connection, you can try D-SUB

The AMD cards do "dither down" and it works very, very well; with ICM profiles there is less loss of gradients/colors and much better ramps. Trust me on that one ;)
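A rough sketch of what "dither down" buys you (numpy assumed): a target level that falls between two 8-bit codes gets snapped to one code by plain rounding, so neighbouring regions of a ramp collapse into the same flat band, while dithering mixes the two codes so the local average still hits the in-between value.

Code:
import numpy as np

rng = np.random.default_rng(0)

# A patch whose ideal grey level falls between two 8-bit codes,
# e.g. after a >8-bit calibration correction.
ideal = 127.4                                           # target level in 8-bit code units
pixels = np.full(100_000, ideal)

rounded = np.round(pixels)                              # every pixel becomes 127
dithered = np.floor(pixels + rng.random(pixels.size))   # unbiased mix of 127s and 128s

print("ideal level   :", ideal)
print("plain rounding:", rounded.mean())    # 127.0 -> the whole patch sits half a step off
print("dither down   :", dithered.mean())   # ~127.4 -> the average reproduces the in-between level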

I don't see how going through a DAC (digital-to-analog converter) and then an ADC (analog-to-digital converter) is going to be an improvement. Also, there is no "bitness" through VGA and no guarantee of 10-bit support through this interface on the monitor.

Too many variables there, IMHO. Also, nVidia didn't support 10-bit rendering in DX or OpenGL with the GF8xxx series. That started with the GT200, but it doesn't apply here, as the Windows API is 8-bit only and he's looking at a gradation in a web browser from a PNG file.

Even worse, this post says nVidia doesn't do any more than 8-bit over analog: http://forums.nvidia.com/index.php?showtopic=209573

It may be wrong but who knows.
 
A general comment-

It's worth noting that output isn't going to be any better than the source.

A lot of content out there is calibrated for sRGB and isn't 10-bit.
 
@10e
can you post close-up images of this image on some Radeon via DVI with gamma set to 1.10? On a GeForce via DVI/HDMI there are visible steps.

On VGA, on the other hand, everything is nice and smooth on all three monitors I have (the third is an EIZO L557 from 2003). Some time ago I did some testing on a normal old cheap TN that didn't even have A-FRC but older FRC, and it was obvious it used more than 8-bit ADCs, because changing gamma didn't introduce any banding. On DVI the same TN monitor had visible banding...

so nothing forbids D/A -> A/D from being more accurate than 8-bit, and apparently most monitors operate at more than 8 bits internally anyway, even cheap ones...

and definitely post your findings on the FireGL soft-mod. Also, pictures of dithering on Radeons would be nice, because I can't find any. If Radeons do dithering, it would mean people who calibrate monitors have no other option but to use Radeons, especially since PowerStrip's forced LUT works only on Radeons...
 
A lot of content out there is calibrated for sRGB and isn't 10-bit.

That's irrelevant. Even the panel itself is 8-bit. So what? The monitor is capable of producing smooth gradients internally and they get screwed after calibration, no matter the content.
 
That started with the GT200, but it doesn't apply here, as the Windows API is 8-bit only and he's looking at a gradation in a web browser from a PNG file.

Not exactly. I have the PNG saved on my PC and I'm looking at it in various software. By the way, Adobe Photoshop displays the file heavily banded, while Windows Photo Viewer displays it almost perfectly smooth, except for those barely noticeable pinkish stripes.

If not for those stripes of extra color, I would swear I'm looking at 10-bit gradients, or at least a decent dithering.
 
OK, folks. Basically, the topic is not about banding, but about extra colors (notably pinkish) in greyscale gradients. Any idea what could give an "impure" greyscale? Maybe it's related to the profile type, calibration settings or software in general?
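One plausible mechanism, just as a guess: the calibration puts slightly different correction curves on R, G and B, and once each channel is rounded to 8 bits independently, some grey input levels end up with, say, R one code above G and B, which would read as a faint pink band. A minimal sketch with made-up per-channel gammas (numpy assumed):

Code:
import numpy as np

# Hypothetical per-channel correction gammas, for illustration only.
gammas = {"R": 1.08, "G": 1.10, "B": 1.11}
x = np.arange(256) / 255.0
lut = {ch: np.round((x ** (1.0 / g)) * 255).astype(int) for ch, g in gammas.items()}

tinted = [i for i in range(256) if not (lut["R"][i] == lut["G"][i] == lut["B"][i])]
print(len(tinted), "of 256 grey levels come out non-neutral, e.g.:")
for i in tinted[:5]:
    print("  input", i, "-> R G B =", lut["R"][i], lut["G"][i], lut["B"][i])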
 
@Goodcoin
DVI outputs can be digital-only (those have a "-" sign) or digital + analog (a "+" sign). If there is a "+", you can use a cheap DVI-VGA adapter or a DVI-VGA cable.

it would be helpful if you took a photo of those gradients and posted it along with the profile you created
 
Convert the png to jpg and see if the pink tint is still there.
 
@Goodcoin
DVI outputs can be digital-only (those have a "-" sign) or digital + analog (a "+" sign). If there is a "+", you can use a cheap DVI-VGA adapter or a DVI-VGA cable.

it would be helpful if you took a photo of those gradients and posted it along with the profile you created

I do have a DVI-VGA adapter -- it came with my previous card. But where should I look for those signs exactly? This is how my ports look:

geforce-gtx680-ports.jpg


I did take a photo of the gradients, but it didn't come out well. I'll try again later with different light/camera settings.
 
OK, folks. Basically, the topic is not about banding, but about extra colors (notably pinkish) in greyscale gradients. Any idea what could give an "impure" greyscale? Maybe it's related to the profile type, calibration settings or software in general?
If multiple calibration attempts (at slightly different locations on the monitor) result in the same problem, I'd probably suspect the calibration unit -- or the correction matrix applied to it.
Is it possible to try calibrating another wide-gamut monitor? Not necessarily a U2410. But if the results still show similar tints, something's got to be wrong with the sensor/software.

I don't know the features of the software that you use, but HCFR allows you to use the sensor to measure the RGB and temperature values of whatever you point it at, without performing any calibration: http://www.homecinema-fr.com/colorimetre-hcfr/hcfr-colormeter/ . If the Spyder4Pro is supported, it would also be interesting to see whether it detects any red bias in specific parts of the grey scale.
 
I do have a DVI-VGA adapter -- it came with my previous card. But where should I look for those signs exactly? This is how my ports look:

geforce-gtx680-ports.jpg


I did take a photo of the gradients, but it didn't come out well. I'll try again later with different light/camera settings.

On the right side of the DVI connectors (in this picture). In the picture, the top one is DVI-D (digital only) and the bottom one is DVI-I (analog and digital).
 
If multiple calibration attempts (at slightly different locations on the monitor) result in the same problem, I'd probably suspect the calibration unit -- or the correction matrix applied to it.

ColorEyes Display Pro (PC version) should have Spyder4 support in a couple of weeks (presently only the Mac version has it). Then I'll see if it gives any different results from the Argyll CMS/dispcalGUI combo.
 
On the right side of the DVI connectors (in this picture). In the picture, the top one is DVI-D (digital only) and the bottom one is DVI-I (analog and digital).

Thanks! I'll recalibrate with DVI-I to D-SUB connection and see what it does.
 
If the Spyder4Pro is supported, it would also be interesting to see whether it detects any red bias in specific parts of the grey scale.

Unfortunately, the Spyder4 is not supported by the HCFR Colormeter software at the moment.
 
OK, I've recalibrated with D-SUB connection (via the adapter), but no improvement, sadly.

I tried to take another photo of the gradients, but there's no point due to the heavy moiré patterns. The difference is too subtle to be captured by a camera (at least mine).
 
I have gotten rather good results with the Spyder4 and Argyll+dispcalGUI on a W-LED monitor with normal gamut. Have you imported the colorimeter corrections from the Spyder4 software? And did you set the mode to wide-gamut CCFL?
 
Have you imported the colorimeter corrections from the Spyder4 software? And did you set the mode to wide-gamut CCFL?

Yes, I did.

I don't have a clue what the reason could be, but I've read somewhere that such a thing can happen when loading a 16-bit profile via an 8-bit LUT. I don't quite understand what that means, but it sounds like a theory worth checking. Only I have no idea how to do it. Is there any way to determine the "bitness" of a profile?
 
Color profiles are generally 16-bit.
Can you post the one with the banding issues?
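If it helps, the curves that get loaded into the video-card LUT are stored in the profile's 'vcgt' tag, and you can read its stored precision directly. A minimal Python sketch, assuming the common table-type vcgt layout that Argyll/dispcalGUI write (the filename is just an example):

Code:
import struct

def vcgt_info(icc_path):
    # Report the size and bit depth of the video-card gamma table
    # ('vcgt' tag) embedded in an ICC profile.
    with open(icc_path, "rb") as f:
        data = f.read()
    tag_count = struct.unpack_from(">I", data, 128)[0]   # tag table follows the 128-byte header
    for i in range(tag_count):
        sig, offset, size = struct.unpack_from(">4sII", data, 132 + 12 * i)
        if sig == b"vcgt":
            gamma_type = struct.unpack_from(">I", data, offset + 8)[0]
            if gamma_type == 0:  # table type
                channels, entries, entry_size = struct.unpack_from(">HHH", data, offset + 12)
                print("vcgt table:", channels, "channels x", entries, "entries,",
                      entry_size * 8, "bits per entry")
            else:
                print("vcgt stored as a formula (gamma/min/max per channel)")
            return
    print("no vcgt tag found")

# vcgt_info("my_u2410_profile.icm")  # hypothetical filename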
 
I have read a bunch of reviews of the Dell U2410 and your best bet is running it in sRGB or AdobeRGB modes. Also it is possible your U2410 could have different tints on the left and right side. What happens if you set it to sRGB or AdobeRGB and calibrate it once with the Spyder4 on the left side of the screen and once on the right side?
 
I have read a bunch of reviews of the Dell U2410 and your best bet is running it in sRGB or AdobeRGB modes. Also it is possible your U2410 could have different tints on the left and right side.

My unit does not have tinting issues. That was the first thing I checked, as I was aware of the bad history.

The gradient banding and impure greyscale after calibration are a universal issue, across all presets.
 
Have you tried the Spyder4 on a different monitor (just to make sure your Spyder4 is not defective)?
 
Have you tried the Spyder4 on a different monitor (just to make sure your Spyder4 is not defective)?

I only have cheap 6-bit TN panels at my disposal. I don't see any point in comparing their calibration results with an 8-bit panel; the test ramp already looks crappy on those monitors, for starters.
 