spacediver
Yes, it's the converter. At present, the only way to properly use the FW900 is with a video card that has VGA out, or DVI-I out.
At 1080p the monitor says "OUT OF SCAN RANGE" and it reports a value of 27.0 kHz / 24 Hz, which is totally within the display's range capability,
Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?
thanks but i need to use that I/O card for color correction ... i have 4 Titans but i need the FW900 plugged into the DeckLink I/O card ... i wonder if i could reclock the signal at 60 Hz though ...
thanks
g
wait though ... even the horizontal is out of range ... isn't 1080p always 27 kHz horizontally?
thanks
g
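For what it's worth, the horizontal rate isn't fixed for 1080p: it's just total scan lines per frame times the refresh rate, so 1080p only works out to 27 kHz at 24 Hz. A quick sketch, assuming the standard SMPTE total of 1125 lines per frame (1080 active plus blanking):

```python
# Horizontal scan rate = total scan lines per frame x vertical refresh rate.
# 1080p in standard SMPTE timings has 1125 total lines (1080 active + blanking).

def h_freq_khz(total_lines: int, refresh_hz: float) -> float:
    """Horizontal frequency in kHz for a progressive mode."""
    return total_lines * refresh_hz / 1000.0

print(h_freq_khz(1125, 24))   # 1080p24 -> 27.0 kHz
print(h_freq_khz(1125, 60))   # 1080p60 -> 67.5 kHz
```

So a source outputting 1080p24 lands at 27 kHz horizontal, while 1080p60 is up around 67.5 kHz; both numbers the monitor reports have to fall inside its scan range for the mode to sync.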
Thank you very much UV!
I'm curious, do you have any experience with the GDM5402? I cannot find any information if that is a true Trinitron tube...thoughts on picture quality?
The other quandary I have is, besides your death which can occur whenever, how long are you planning to service these Sony Monitors for, Unkle Vito?
I don't want to purchase a monitor and then need servicing 2 years later, only to find you've closed up shop, leaving me with nobody to go to.
Just curious how long you plan on doing this for.
Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?
Also, we have extensive clientele in the Hollywood area so I am here for the long haul... Your purchase is safe...
UV!
no need to have wild values

Also, it would be interesting to confirm that your EDID hack is actually having an impact. Try setting it to wildly different values, and measuring the chromaticity of your white point, using the free measure in HCFR (green play button). Just load up a white screen and measure, being sure to keep the probe in the same place with different EDID settings.
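One way to make that before/after comparison concrete is to compute the distance between the two measured white points in xy: if the EDID change has a real effect, the shift should be well above the probe's repeatability. A minimal sketch; the coordinates below are made-up example readings, not real measurements:

```python
import math

def xy_delta(xy_a, xy_b):
    """Euclidean distance between two CIE xy chromaticity points."""
    return math.hypot(xy_a[0] - xy_b[0], xy_a[1] - xy_b[1])

# Hypothetical readings of the same white screen under two EDID settings:
before = (0.3127, 0.3290)
after = (0.3200, 0.3350)

# A shift well above typical probe repeatability (~0.002 in xy) suggests
# the EDID hack really is changing the output.
print(round(xy_delta(before, after), 4))
```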
is it really one Euro? I see a comma next to the 1 digit, wouldn't it make it 1,000?
I stand corrected then. Maybe a Diamondtron then? Rabbidz, is there a service menu for your monitor? If so - then it's likely a Mitsu and not a Sony.
You need a spectro if you want truly accurate results on a wide gamut monitor.

My probes:
Spyder3 http://i.imgur.com/Tpe74ni.jpg
calibrating to 6500K with it resulted in an image that is both red and green. Absolutely unusable.
Spyder1 http://i.imgur.com/vVpScdK.jpg
This one actually measured the 6500K white as almost exactly what I set, and it is definitely not 6500K but something more like 7500K.
All those shite calibrators can do is help even out the grayscale and, with Argyll, make LUTs. They just can't measure gamut or white point... For obvious reasons I cannot use the suction cups with the polarizer, so I need to invent some way to keep the probe on screen for a few hours (it's kinda slow, and Argyll's high quality mode has many dark patches which it repeats and repeats and repeats ;.( )
I need to buy some good device to really talk about 'accuracy'.
The i1 Display2 is supported by my LG and affordable, but not really suitable for RGB-LED, which makes me reluctant to buy it. I will probably go with the i1 Display Pro, as it seems to be very good, supports RGB-LED monitors, and will help me build a correction matrix for the Spyder so I can calibrate the LG with it.
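The correction-matrix idea mentioned above is essentially a 3x3 matrix that maps one probe's XYZ readings onto a reference instrument's readings for the same patches. A minimal sketch; the matrix and reading values here are purely illustrative, not from any real profiling session:

```python
# Apply a 3x3 colorimeter correction matrix to a raw XYZ reading.
# In practice the matrix is derived by measuring the same patches with
# both the colorimeter and a reference spectro, then solving least squares.

def apply_correction(m, xyz):
    """Matrix-vector product: corrected XYZ from a raw XYZ triple."""
    return tuple(sum(m[r][c] * xyz[c] for c in range(3)) for r in range(3))

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
reading = (95.0, 100.0, 108.0)

# With the identity matrix the reading is unchanged:
print(apply_correction(identity, reading))  # (95.0, 100.0, 108.0)
```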
Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?

Displays should be calibrated to D65, not 6500K.
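The distinction matters because D65 is a specific chromaticity (x = 0.3127, y = 0.3290), not just "whatever the probe reads as 6500K"; its correlated color temperature actually works out to roughly 6504 K, and many different colors of white share the same CCT. A sketch using McCamy's well-known CCT approximation:

```python
def mccamy_cct(x, y):
    """Approximate correlated color temperature from CIE xy (McCamy 1992)."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 white point: CCT comes out near, but not exactly, 6500 K.
print(round(mccamy_cct(0.3127, 0.3290)))
```

Because CCT only tells you the nearest point on the blackbody curve, two whites can both read "6500K" while looking visibly different, which is why calibrating to the D65 coordinates is the better target.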
It is not a matter of preference. Photographers may use D50 or D55 because they are dealing with print work, whether that is books, magazines, or selling prints of their work.

NO! It is a matter of preference... For example... Photographers prefer 5000K (warm) because it simulates ambient sunlight. Most gamers like the blueish tone so they prefer 9300K (cold)... Some prefer 6500K, which is somewhat in the middle...
Beauty is in the eye of the beholder...
UV!
Just to be clear, the hues in your "EDID Correction Disabled" image are correct.

no need to have wild values
1. I edited the reference measures to match what is inside the EDID (based on spacediver's coordinates, plus the errors from the low 10-bit precision the EDID uses to store xy data; it is basically a 10-bit value divided by 1024...)
2. Using the Spyder3, I measured the screen without the EDID option
3. I created a correction file, after which the measured gamut matched the EDID coordinates perfectly
4. I measured with the AMD EDID option disabled
5. I measured with the AMD EDID option enabled
Notice how even with EDID enabled I have errors, but the hues are all corrected, while without EDID all the hues are totally and utterly wrong: especially green and magenta, but pretty much all of them. And of course it is all clearly visible by eye
With a proper instrument that measures a proper white point (right now it is set to be D65 and it definitely is not, hence the saturation errors), and after a full pass with Argyll to get proper gamma (here it is 2.1 on average), and with the EDID option, it should be near perfect.
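The 10-bit precision mentioned in step 1 is easy to demonstrate: the EDID stores each xy chromaticity coordinate as an integer over 1024, so every coordinate gets rounded to the nearest 1/1024. A sketch of that round trip:

```python
def edid_quantize(coord: float) -> float:
    """Round a CIE xy coordinate to EDID's 10-bit precision (n/1024)."""
    return round(coord * 1024) / 1024

x = 0.3127  # D65 x coordinate
qx = edid_quantize(x)
print(qx, abs(qx - x))  # quantized value and the error introduced
```

For D65's x = 0.3127 this stores 320/1024 = 0.3125, so an error of 0.0002 creeps in before any measurement even happens; that is the "low 10-bit precision" error folded into the reference.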
BTW. to make it perfectly clear: this gamut measured in first picture and taken as reference (gray triangle) is exactly the gamut that spacediver provided. It is not measured by me in normal sense of the word.
And imho it is not sRGB enough to use uncorrected. In fact it needs correction badly, which is what my inferior eye-instrument told me all along. LOL
Better probes will only make it better documented and proven. Besides, anyone can test what color correction does to the image, even with a GeForce, just not in everything at once.
Anything else?
You need a spectro if you want truly accurate results on a wide gamut monitor.
The default tables in the i1Display Pro are okay, though it would be much better if the manufacturer provided an OEM version with tables specific to their own displays.
So you really need both an i1Pro and an i1 Display Pro.
The first picture shows the HCFR reference gamut, which is exactly Rec. 709, as does the second picture. In the second there is also a light gray triangle showing the gamut from the first picture for comparison. You can click the 'reference' checkbox in one window and it will compare the gamuts. But only the gamuts. All saturation points, grayscale, temperature, etc. are compared to Rec. 709
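For reference, the Rec. 709 / sRGB gamut HCFR uses as its target is the triangle with primaries R(0.64, 0.33), G(0.30, 0.60), B(0.15, 0.06) in CIE xy, and a gamut comparison like the one in those screenshots ultimately boils down to comparing such triangles. A sketch computing the triangle's area with the shoelace formula:

```python
def triangle_area(p1, p2, p3):
    """Area of a triangle in the CIE xy plane (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Rec. 709 / sRGB primaries in CIE xy:
rec709 = [(0.64, 0.33), (0.30, 0.60), (0.15, 0.06)]
print(triangle_area(*rec709))
```

Comparing a measured triangle's area (or better, its overlap) against this one is a rough numeric stand-in for the visual triangle comparison HCFR draws.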
Wait, I was assuming that the white triangle in the first picture was based on your actual probe measurements before EDID correction and that the white triangle in the second picture was based on your actual probe measurements after EDID correction. Have I got this right?
Yes, an i1 Display Pro is all that you need with a CRT; XoR mentioned that he wanted to be able to calibrate wide gamut displays as well though.

The FW900 is not wide gamut, and the i1Display Pro works excellently with the FW900, especially when using the Hitachi CRT ccss correction (Colorimeter Calibration Spectral Set) that Graeme has provided in his Argyll drivers.
It's a good sanity check, and if you're working with different displays I'd agree, but for the FW900, an i1 Display Pro is all you need. (I have both, because I wanted that sanity check).
yes

Ok, but did you take two sets of actual probe measurements, one before and one after the EDID correction?
such colorimeters are far from a good representation of human eyes and need correction matrices for different light spectrum types.
One other thing I'd like to mention. I have a CRT, and at certain times I may need to go down in resolution, like 1024x768 @ 150Hz or even 800x600 @ 160Hz. Am I shortening the life of the tube? I ask because if I get an FW-900 or a Trinitron tube I will sometimes be using those resolutions at around those refresh rates.
I just wonder why this isn't advisable... in the service manual those timings are there, so I can't imagine there being a problem...
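One sanity check is simply whether those modes keep the scan frequencies inside the monitor's spec. A sketch, assuming the FW900's published range of roughly 30 to 121 kHz horizontal and 48 to 160 Hz vertical, and estimating total lines at about 5% above the active lines (the real blanking overhead varies by timing standard):

```python
# Check whether a mode's scan frequencies fall inside a CRT's supported range.
# Assumed FW900 spec range: 30-121 kHz horizontal, 48-160 Hz vertical.
H_RANGE_KHZ = (30.0, 121.0)
V_RANGE_HZ = (48.0, 160.0)

def mode_ok(active_lines: int, refresh_hz: float, blanking: float = 1.05) -> bool:
    """Rough check: total lines estimated as active lines * blanking factor."""
    h_khz = active_lines * blanking * refresh_hz / 1000.0
    return (H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
            and V_RANGE_HZ[0] <= refresh_hz <= V_RANGE_HZ[1])

print(mode_ok(768, 150))   # 1024x768 @ 150 Hz -> ~120.9 kHz, just inside
print(mode_ok(600, 160))   # 800x600 @ 160 Hz -> ~100.8 kHz, comfortably inside
```

By this estimate both modes sit inside the range, though 1024x768 @ 150 Hz runs right up against the horizontal limit, which is consistent with those timings appearing in the service manual.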
btw I did manage to find someone at the conference to ask about CRTs and circuit wear with higher scanning frequencies (a very prominent scientist who worked at bell labs in the 50's).
He said that older CRTs used a particular type of circuit (I think QI?), and I think the idea is that they'd naturally oscillate at a particular frequency. If you fed it a faster signal, it would be unable to even try to run at a higher frequency. With newer CRTs that are more flexible, I think he implied that if the CRT could handle the frequency it would not harm it.
Sorry for sounding dense, but what does hardware calibration mean? Is that a reference to changing settings using the buttons on the bezel or doing something with a hardware piece?
WinDAS might be unavailable unless I get a PC that still has Windows XP preloaded or I buy the software.
I'm starting to agree as well, thank you for the information.

I tend to think that it doesn't make a difference:
The instructions here on keeping a healthy G2 and peak luminance level could apply to any GDM Trinitron tube, correct?

Yes, you're calibrating the monitor internally, which means that, "out of the box", it has a high degree of accuracy. It's also important on CRTs for tube health reasons (keeping a healthy G2 and peak luminance level).
WinDAS should work with Windows 7, and it's not difficult to find a copy of XP that you can load on a laptop (I recommend running WinDAS off of a laptop).
Has anyone used a dolly/handtruck to move the FW900 up the stairs? I figure that would be the easiest way to do it outside of someone else doing the heavy lifting.
I still chuckle when my buddy and I moved I think a 32" Trinitron TV years ago, we were completely out of breath on the 6th floor...must have weighed 300 lbs for sure.
We got it in my apartment, only to turn it on and the display being a piece of junk...and back down it went... After that I stuck with 90s model Trinitrons with the rounded screens.