24" Widescreen CRT (FW900) from eBay arrived, comments.

Yes, it's the converter. At present, the only way to properly use the FW900 is with a video card that has VGA out, or DVI-I out.
 
at 1080p the monitor says "OUT OF SCAN RANGE" and it reports values of 27.0 kHz / 24 Hz, which is totally within the display's range capability,

Actually that's not within the display's capable range.
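For anyone wanting to sanity-check a mode before trying it: the horizontal scan rate is just the vertical refresh times the total line count (active plus blanking). A quick sketch in Python, assuming the commonly quoted FW900 limits of 30-121 kHz horizontal and 48-160 Hz vertical:

```python
# Check whether a video mode falls inside a CRT's scan ranges.
# FW900 limits below are the commonly quoted specs (assumption).
H_RANGE_KHZ = (30.0, 121.0)   # horizontal scan range
V_RANGE_HZ = (48.0, 160.0)    # vertical refresh range

def in_scan_range(total_lines, refresh_hz):
    """total_lines = active lines + vertical blanking lines."""
    h_khz = total_lines * refresh_hz / 1000.0
    h_ok = H_RANGE_KHZ[0] <= h_khz <= H_RANGE_KHZ[1]
    v_ok = V_RANGE_HZ[0] <= refresh_hz <= V_RANGE_HZ[1]
    return h_khz, h_ok and v_ok

# 1080p24: ~1125 total lines at 24 Hz -> 27.0 kHz, rejected on both counts
h, ok = in_scan_range(1125, 24)
print(f"{h:.1f} kHz, accepted: {ok}")   # 27.0 kHz, accepted: False
```

By the same math, a 1080i 59.94 signal (562.5 lines per field) would scan at roughly 33.7 kHz, which is why interlaced modes can squeak in where 1080p24 can't.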

[image: the FW900's supported scan ranges]
 
ouch ... i see it now ...

damn ... in every post i saw about watching blu-rays at 1080P on the monitor, i thought it would be a 24P signal ...

damn ... i believe every I/O card would output it that way ...

sad news..but thanks guys

g
 
It's worth buying a geforce so you can use your FW900 properly.

And blu-rays look fantastic on a calibrated unit.

The way I do it is to rip the blu-rays to my hard drive and watch them like that. That's an easy way to bypass the HDCP bullshit.
 
:)
thanks but i need to use that I/O card for color correction ... i have 4 titans but i need the FW900 plugged into the decklink I/O card ... i wonder if i could reclock the signal to 60 Hz though ...

thanks
g
 
Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?

NO! It is a matter of preference... For example... Photographers prefer 5000K (warm) because it simulates the ambient sunlight. Most gamers like the blueish tone, so they prefer 9300K (cold)... Some prefer 6500K, which is somewhat in the middle...

Beauty is in the eye of the beholder...

UV!
 
:)
thanks but i need to use that I/O card for color correction ... i have 4 titans but i need the FW900 plugged into the decklink I/O card ... i wonder if i could reclock the signal to 60 Hz though ...

thanks
g

If you don't mind 60hz, you could go with an HDFury to convert the HDMI to VGA.
 
wait though ... even the horizontal is out of range ... isn't 1080p always 27 kHz horizontally?
thanks
g

You probably want to run 1920x1200 @ 72 Hz, which I believe runs at 90.9 kHz.

Just means the display will render each frame three times.
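To put numbers on that: 72 is an exact multiple of 24, so each film frame is repeated a whole number of times (no 3:2 judder), and with an assumed total line count of ~1262 (1200 active plus typical blanking) the mode lands right around the quoted 90.9 kHz:

```python
# 24p content at 72 Hz: each film frame is shown an integer number of
# times, so there is no 3:2 judder. The 1262 total lines figure is an
# assumption (1200 active lines plus typical blanking).
refresh_hz = 72
total_lines = 1262
h_khz = refresh_hz * total_lines / 1000.0
repeats = refresh_hz // 24
print(f"{h_khz:.1f} kHz, each 24p frame shown {repeats}x")
```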
 
Thank you very much UV!

I'm curious, do you have any experience with the GDM5402? I cannot find any information on whether that is a true Trinitron tube... thoughts on picture quality?

The other quandary I have is, besides your death which can occur whenever, how long are you planning to service these Sony Monitors for, Unkle Vito?

I don't want to purchase a monitor and 2 years later I need servicing and you end up closing shop and then I would have nobody to go to.

Just curious how long you plan on doing this for.


No. We never serviced the GDM-5402. It is an older monitor, similar to the Dell p911.

I have clientele all over the world and outside the USA there is a huge demand for CRTs. Also, we have extensive clientele in the Hollywood area so I am here for the long haul... Your purchase is safe...

UV!
 
Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?

sorta... nowadays most computer and video content is designed with a d65 white point in mind. on a calibrated monitor, the 6500k setting will give a d65 white point

sometimes i lower the osd setting to 5000k as i find it less straining to read webpages/whatever at night
 
Also, we have extensive clientele in the Hollywood area so I am here for the long haul... Your purchase is safe...
UV!

Wow, now that is interesting. Does the movie industry still use CRTs? Surely they can afford absolutely top-end LCD or OLED displays. Could you shed some light on it?

Jeez, I hope someday soon I'll be able to afford a trip to your workshop, UV, and see all the magic you have in there :p
 
Also, it would be interesting to confirm that your EDID hack is actually having an impact. Try setting it to wildly different values and measuring the chromaticity of your white point using the free measure in HCFR (green play button). Just load up a white screen and measure, being sure to keep the probe in the same place across the different EDID settings.
no need to have wild values

1. I edited the reference measures to match what is inside the EDID (based on spacediver's coordinates, plus the error from the low-precision 10-bit format EDID uses to store xy data (each coordinate is basically a 10-bit value divided by 1024...))
2. using the Spyder3, I measured the screen without the EDID option
3. created a correction file, after which the measured gamut matched the EDID coordinates perfectly
4. measured with the AMD EDID option disabled
[screenshot: gamut measured with the AMD EDID option disabled]


5. measured with AMD EDID option enabled
[screenshot: gamut measured with the AMD EDID option enabled]


Notice how even with EDID enabled I have errors, but the hues are all corrected, and without EDID all hues are totally and utterly wrong, especially green and magenta, but pretty much all of them. And of course it is all clearly visible by eye.

With a proper instrument that measures a proper white point (it is now set to be D65 and it definitely is not, hence the saturation errors), and after a full pass with Argyll to get proper gamma (here it is 2.1 on average), and with the EDID option, it should be near perfect.

BTW, to make it perfectly clear: the gamut measured in the first picture and taken as reference (gray triangle) is exactly the gamut that spacediver provided. It was not measured by me in the normal sense of the word.

And imho it is not sRGB enough to use uncorrected. In fact it needs correction badly, which is what my inferior instrument (my eyes) told me all along. LOL

Better probes will only make it better documented and proven. Besides, anyone can test what color correction does to the image, even with a GeForce, just not in everything at once.

Anything else?
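For reference, the 10-bit precision mentioned in step 1 comes from how EDID stores each chromaticity coordinate: as a 10-bit binary fraction over 1024, so the worst-case rounding error per coordinate is 1/2048. A minimal encode/decode sketch:

```python
def edid_encode_xy(coord):
    """Quantize a CIE xy coordinate to EDID's 10-bit fraction."""
    return round(coord * 1024)

def edid_decode_xy(raw):
    """Recover the (quantized) coordinate from the stored 10-bit value."""
    return raw / 1024.0

x = 0.3127  # D65 x coordinate as an example
raw = edid_decode_xy(edid_encode_xy(x))
print(raw, abs(raw - x))  # rounding error is at most 1/2048 ~ 0.00049
```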
 
I stand corrected then. Maybe a Diamondtron then? Rabbidz, is there a service menu for your monitor? If so - then it's likely a Mitsu and not a Sony.

It does have a service menu. I have found that the CRT type is not listed in the service manuals. It goes up to 125 kHz horizontal and is rated for 180 Hz vertical, but it can actually go to 344 Hz, which makes it the fastest CRT (in vertical refresh rate) that is known.
 
Does anyone know who can service a Gateway VX1120? It went black one day and won't turn back on.
 
1. I edited reference measures to match what is inside EDID

Wait, did you edit reference measures in HCFR? No need to do that. Just stick with the standard Rec 709 coordinates (sRGB), and measure your display with and without EDID. That way, you can compare the gamut before and after EDID correction, against the same reference.

Unless by reference, you meant something in EDID?

BTW, to make it perfectly clear: the gamut measured in the first picture and taken as reference (gray triangle) is exactly the gamut that spacediver provided. It was not measured by me in the normal sense of the word.

Wait, I was assuming that the white triangle in the first picture was based on your actual probe measurements before EDID correction and that the white triangle in the second picture was based on your actual probe measurements after EDID correction. Have I got this right?
 
My probes:
Spyder3 http://i.imgur.com/Tpe74ni.jpg
calibrating to 6500K with it resulted in an image that is both red and green. Absolutely unusable.

Spyder1 http://i.imgur.com/vVpScdK.jpg
This one actually measured the 6500K white to be almost exactly what I set, and it is definitely not 6500K but something like 7500K.

All those shite calibrators are good for is evening out the grayscale and making LUTs with Argyll. Just not measuring gamut or white point... For obvious reasons I cannot use the suction cups with the polarizer, so I need to invent some way to keep it on the screen for a few hours (it's kinda slow, and Argyll's high-quality mode has many dark patches which it repeats and repeats and repeats ;.( )

I need to buy some good device to really talk about 'accuracy'.
The i1 Display 2 is supported by my LG and affordable, but not really suitable for RGB-LED, and that makes me reluctant to buy it. I will probably go with the i1 Display Pro, as it seems to be very good, supports RGB-LED monitors, and will help me make a correction matrix for the Spyder so I can calibrate the LG with it.
You need a spectro if you want truly accurate results on a wide gamut monitor.
The default tables in the i1Display Pro are okay, though it should be much better if the manufacturer provides an OEM version with tables specific to their own displays.

Since consumer-grade spectros are not very good at low-light measurements, if you're buying one you will also need a good colorimeter profiled off it.
So you really need both an i1Pro and an i1 Display Pro.

Just so I am clear, 6500k is the color temperature that should be used on all CRTs right?
Displays should be calibrated to D65, not 6500K.
Unfortunately 6500K is not marked on this image, but we can use 5000K as an example.
[image: CIE chromaticity diagram showing the planckian locus and isotherm lines]


5000K is anywhere along that line. It can have quite a strong green or magenta tint.
D50 is a neutral point in the center, approximately where the 5000K line crosses the planckian locus (the curve).
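You can put numbers on the difference: McCamy's well-known approximation recovers the correlated colour temperature from CIE xy coordinates, and it maps the standard D65 and D50 white points back to roughly their nominal temperatures. A quick sketch (the formula is an approximation, good to a few kelvin near the locus):

```python
def mccamy_cct(x, y):
    """McCamy's approximation of correlated colour temperature from CIE xy."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

# D65 and D50 land close to their nominal temperatures:
print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505
print(round(mccamy_cct(0.3457, 0.3585)))  # ~5001
```

Note that many different xy points share the same CCT (everything along one isotherm line), which is exactly why "6500K" alone doesn't pin down a white point the way D65 does.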

NO! It is a matter of preference... For example... Photographers prefer 5000K (warm) because it simulates the ambient sunlight. Most gamers like the blueish tone, so they prefer 9300K (cold)... Some prefer 6500K, which is somewhat in the middle...
Beauty is in the eye of the beholder...
UV!
It is not a matter of preference. Photographers may use D50 or D55 because they are dealing with print work; whether that is books, magazines, or selling prints of their work.
Displays used for graphics work, or for viewing games/movies etc. should be calibrated to D65.

no need to have wild values

1. I edited the reference measures to match what is inside the EDID (based on spacediver's coordinates, plus the error from the low-precision 10-bit format EDID uses to store xy data (each coordinate is basically a 10-bit value divided by 1024...))
2. using the Spyder3, I measured the screen without the EDID option
3. created a correction file, after which the measured gamut matched the EDID coordinates perfectly
4. measured with the AMD EDID option disabled
5. measured with the AMD EDID option enabled

Notice how even with EDID enabled I have errors, but the hues are all corrected, and without EDID all hues are totally and utterly wrong, especially green and magenta, but pretty much all of them. And of course it is all clearly visible by eye.

With a proper instrument that measures a proper white point (it is now set to be D65 and it definitely is not, hence the saturation errors), and after a full pass with Argyll to get proper gamma (here it is 2.1 on average), and with the EDID option, it should be near perfect.
BTW, to make it perfectly clear: the gamut measured in the first picture and taken as reference (gray triangle) is exactly the gamut that spacediver provided. It was not measured by me in the normal sense of the word.
And imho it is not sRGB enough to use uncorrected. In fact it needs correction badly, which is what my inferior instrument (my eyes) told me all along. LOL
Better probes will only make it better documented and proven. Besides, anyone can test what color correction does to the image, even with a GeForce, just not in everything at once.
Anything else?
Just to be clear, the hues in your "EDID Correction Disabled" image are correct.
The hue position is determined by the opposing primary and the white point.
Basically draw a line from red through the white point; that is where cyan should lie, and so on.

P.S. You should switch HCFR to display uv plots rather than xy plots.
They are far more perceptually uniform. xy plots tend to exaggerate errors with green in particular.
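The hue-line claim is easy to verify numerically for sRGB: compute cyan's chromaticity from the green and blue primaries and check that it falls on the line from red through D65. A sketch using the standard sRGB primaries and luminance weights:

```python
# Verify that a secondary (cyan) lies on the line from its opposing
# primary (red) through the white point. Standard sRGB values.
def xyY_to_XYZ(x, y, Y):
    return (x / y * Y, Y, (1 - x - y) / y * Y)

green = xyY_to_XYZ(0.30, 0.60, 0.7152)
blue = xyY_to_XYZ(0.15, 0.06, 0.0722)

# cyan = green + blue at full drive
cX, cY, cZ = (g + b for g, b in zip(green, blue))
s = cX + cY + cZ
cyan_xy = (cX / s, cY / s)

red, white = (0.64, 0.33), (0.3127, 0.3290)
# parameter t where the red->white line reaches cyan's x coordinate
t = (red[0] - cyan_xy[0]) / (red[0] - white[0])
line_y = red[1] + t * (white[1] - red[1])
print(cyan_xy, line_y)  # cyan's y sits within ~0.001 of the line
```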
 
You need a spectro if you want truly accurate results on a wide gamut monitor.
The default tables in the i1Display Pro are okay, though it should be much better if the manufacturer provides an OEM version with tables specific to their own displays.

The FW900 is not wide gamut, and the i1Display Pro works excellently with the FW900, especially when using the Hitachi CRT ccss correction (Colorimeter Calibration Spectral Set) that Graeme has provided in his Argyll drivers.

So you really need both an i1Pro and an i1 Display Pro.

It's a good sanity check, and if you're working with different displays I'd agree, but for the FW900, an i1 Display Pro is all you need. (I have both, because I wanted that sanity check).
 
The first picture shows the HCFR reference gamut, which is exactly Rec. 709, as does the second picture. In the second there is also a light gray triangle showing the gamut from the first picture for comparison. You can click the 'reference' checkbox in one window and it will compare gamuts. But only gamuts. All saturation points, grayscale, temperature, etc. are compared to Rec. 709.

---
I need something to compare the Spyder3 readings against to have even remotely accurate results, so I took your RGB xy values and made a correction matrix from them and my actual measurements; now my FW900 + Spyder3 measures exactly what you provided as the accurate FW900 gamut. It was only to show how AMD clips the gamut and corrects hues to the Rec. 709 gamut, which imho it did very nicely. To do any real measurements with this Spyder3 I need to make proper correction files using a reference probe.

An interesting but predictable fact: if I make a correction matrix for RGB-LED based on the EDID, it captures the built-in sRGB mode fairly well but screws up the FW900 measurement. Green is much higher and red is to the right, slightly outside the CIE diagram, which proves that such colorimeters are far from good representatives of human eyes and need correction matrices for different light spectrum types.

BTW, for the time being I won't measure the monitor without the polarizer. I will probably buy a probe and make a contraption to apply the filter properly, and then and only then remove the polarizer and do measurements without it.
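For anyone curious, the correction-matrix step described above can be sketched in a few lines of numpy: stack the reference and probe XYZ readings of the primaries as columns, and the correction is the reference matrix times the inverse of the measured one. The numbers below are illustrative placeholders, not real probe readings:

```python
import numpy as np

# Columns are the XYZ of red, green, blue as the *reference* instrument
# saw them (illustrative numbers, not real measurements).
ref = np.array([[41.2, 35.8, 18.1],
                [21.3, 71.5,  7.2],
                [ 1.9, 11.9, 95.1]])

# The same patches as the cheap probe reported them (also illustrative).
meas = np.array([[43.0, 34.0, 19.0],
                 [23.0, 69.0,  8.0],
                 [ 2.5, 13.0, 92.0]])

# 3x3 correction matrix: maps probe readings onto reference readings.
M = ref @ np.linalg.inv(meas)

# Applying it to the probe's primary readings reproduces the reference.
print(np.allclose(M @ meas, ref))  # True
```

This is why such a matrix is only valid for one display type: it is fitted to that display's particular spectral output, which is exactly the RGB-LED vs. CRT effect described above.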
 
The first picture shows the HCFR reference gamut, which is exactly Rec. 709, as does the second picture. In the second there is also a light gray triangle showing the gamut from the first picture for comparison. You can click the 'reference' checkbox in one window and it will compare gamuts. But only gamuts. All saturation points, grayscale, temperature, etc. are compared to Rec. 709.

Ok, but did you take two sets of actual probe measurements, one before and one after the EDID correction?

Wait, I was assuming that the white triangle in the first picture was based on your actual probe measurements before EDID correction and that the white triangle in the second picture was based on your actual probe measurements after EDID correction. Have I got this right?
 
The FW900 is not wide gamut, and the i1Display Pro works excellently with the FW900, especially when using the Hitachi CRT ccss correction (Colorimeter Calibration Spectral Set) that Graeme has provided in his Argyll drivers.

It's a good sanity check, and if you're working with different displays I'd agree, but for the FW900, an i1 Display Pro is all you need. (I have both, because I wanted that sanity check).
Yes, an i1 Display Pro is all you need with a CRT; XoR mentioned that he wanted to be able to calibrate wide gamut displays as well, though.
 
Ok, but did you take two sets of actual probe measurements, one before and one after the EDID correction?
yes

both with the same correction matrix

if I showed what the probe measured without it, it would not be very clear what happened, because the Spyder3 probe seems modified to handle RGB-LED gamuts, and the Spyder1 is ... a Spyder1 ;)
 
Ok, then this would be my suggested workflow:

1) WPB in WinDAS

2) Measure white point and primaries.

3) Use these measurements for the EDID

4) Argyll LUT adjustment (I'm assuming that the AMD gamut remapping occurs at a low enough level that a LUT adjustment is possible on top of it.)
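As a toy illustration of step 4: at its simplest, a LUT correction that retargets a measured gamma to a desired one just raises each entry to the ratio of the two exponents. Real Argyll profiling does far more than this (per-channel curves, grayscale tracking), so treat this as a sketch of the idea only:

```python
# Build a 256-entry 1D LUT that retargets a measured display gamma
# to a desired one. Toy model: assumes a pure power-law display.
measured_gamma = 2.1   # example average from earlier in the thread
target_gamma = 2.2

# display(lut(x)) = (x^(2.2/2.1))^2.1 = x^2.2, the desired response
lut = [(i / 255.0) ** (target_gamma / measured_gamma) for i in range(256)]

# Endpoints stay pinned and the curve stays monotonic
print(lut[0], lut[255])  # 0.0 1.0
```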
 
Sorry for sounding dense, but what does hardware calibration mean? Is that a reference to changing settings using the buttons on the bezel or doing something with a hardware piece?

I take software calibration to mean using software in order to calibrate the display.

WinDAS might be unavailable unless I get a PC that still has Windows XP preloaded or I buy the software.

Any general tips on keeping a CRT running well throughout the duration of its life? I always power off my monitor when not in use.

If a FW-900 is going to require software calibration once a year, and that can only be done through Windows XP, and I would need a device, that makes the Artisan more worthy in my eyes since it already comes with the calibration unit itself and all I would need is XP to install the software.
 
One other thing I'd like to mention: I have a CRT on which at certain times I may need to go down in resolution, like 1024x768 @ 150 Hz and even 800x600 @ 160 Hz. Am I shortening the life of the tube? I ask because if I get a FW-900 or a Trinitron tube I will sometimes be using those resolutions around those refresh rates.

I just wonder why this isn't advisable...in the service manual those timings are there so I can't imagine there being a problem...
 
such colorimeters are far from good representatives of human eyes and need correction matrices for different light spectrum types.

can't find the link, but the i1 Display Pro's filters are designed to pretty closely match the observer functions.
other cheap colorimeters however...
 
One other thing I'd like to mention: I have a CRT on which at certain times I may need to go down in resolution, like 1024x768 @ 150 Hz and even 800x600 @ 160 Hz. Am I shortening the life of the tube? I ask because if I get a FW-900 or a Trinitron tube I will sometimes be using those resolutions around those refresh rates.

I just wonder why this isn't advisable...in the service manual those timings are there so I can't imagine there being a problem...

the question is whether you wear the tube more quickly at these settings or at regular settings, and by how much.

and i don't think anyone has any numbers on this. all we have are generic arguments like "you're pushing the tube harder so of course it will wear faster" which is completely meaningless because even if true, it could be a 1% difference or a 400% difference.
 
I tend to think that it doesn't make a difference:

btw I did manage to find someone at the conference to ask about CRTs and circuit wear with higher scanning frequencies (a very prominent scientist who worked at bell labs in the 50's).

He said that older CRTs used a particular type of circuit (I think QI?), and I think the idea is that they'd naturally oscillate at a particular frequency. If you fed it a faster signal, it would be unable to even try and run at a higher frequency. With newer CRTs that are more flexible, I think he implied that if the CRT could handle the frequency it would not harm it.
 
Sorry for sounding dense, but what does hardware calibration mean? Is that a reference to changing settings using the buttons on the bezel or doing something with a hardware piece?

Yes, you're calibrating the monitor internally, which means that, "out of the box", it has a high degree of accuracy. It's also important on CRTs for tube health reasons (keeping a healthy G2 and peak luminance level).

WinDAS might be unavailable unless I get a PC that still has windows XP pre loaded or I buy the software.

WinDAS should work with Windows 7, and it's not difficult to find a copy of XP that you can load on a laptop (I recommend running WinDAS off of a laptop).
 
I tend to think that it doesn't make a difference:
I'm starting to agree as well, thank you for the information.

Yes, you're calibrating the monitor internally, which means that, "out of the box", it has a high degree of accuracy. It's also important on CRTs for tube health reasons (keeping a healthy G2 and peak luminance level).



WinDAS should work with Windows 7, and it's not difficult to find a copy of XP that you can load on a laptop (I recommend running WinDAS off of a laptop).
The instructions here on keeping a healthy G2 and peak luminance level could apply to any GDM Trinitron tube, correct?
 
APOLOGIES, i have posted this on the wrong thread as well (WinDAS guide thread)


hi guys

so, sorry guys, i am going to ask a few more basic questions again (regarding the signal: i am going to use 1080i 59.94 with pulldown)

in this post i would like to ask those who know about it about the following 2 topics:


1- WINDAS, POSSIBLY WRONG RESIDUAL CALIBRATION FROM PREVIOUS OWNER: i bought this unit from a guy who told me that a couple of years back he used to play with WinDAS. as far as i understand, WinDAS also lets you manipulate the monitor internally in a way that the OSD controls won't, is that correct?
Now,
if i reset the monitor using the reset button, the state of the image is pretty bad: the black level is probably like 1 nit too high, super green, the whole grey balance is way off, etc...

using the OSD i was able to get to a much better starting point: i use LightSpace (www.lightillusion.com) along with an X-Rite i1Pro 2 to read values,
so using the OSD GAIN and BIAS i managed to get quite decent D65 values on the grey scale,
BUT gamma seems off, and i am lacking some RED info (compared to REC709 RED)

QUESTION: i wonder if the guy messed with WinDAS wrongly and therefore the monitor has been shifted too much internally.
is there an easy way to reset anything that has been changed internally? (the alternative i guess would be for me to learn WinDAS and manually adjust things, but having LightSpace i was hoping to avoid that, even though LightSpace would ONLY create a LUT, so the better the starting point the better the result)
for example, i am concerned about the gamma because to get the monitor to have real blacks i have to set brightness to a value of 7 (that way i feel it reaches black; i use as references the black bars (top/bottom) generated by the aspect ratio of the signal),
but that way the gamma is too dark.
anyway, i wonder if i can reset what has been changed internally or if i should learn WinDAS to do so


PS: does WinDAS also allow geometry adjustments not possible using the OSD?



2: WARM-UP RELATED ISSUE: it looks like the monitor needs 30+ min to reach the image it has been set for (for example, when i turn it on the black level is way higher than after 30 min, when it goes black),

QUESTION: would you agree that 30 min is enough?


2B: after 30+ minutes of warm-up time, if i put on a black patch, the whole left side of the monitor has something that looks like a light leak (like it's a little brighter)...

QUESTION: is that something that can be fixed, or is the tube getting worse?

2C: after 30+ minutes of warm-up time, if i put on a white patch, sometimes i see some colored patches around the corners; the effect varies, sometimes soft, sometimes heavy. i feel like if i have the monitor warming up showing a white patch, those appear more (like displaying full white for a while will cause the issue). the DEGAUSS almost fixes it all, but sometimes not entirely... thoughts?



alright, enough questions for now

thanks in advance

i truly appreciate


ps: let me know if you want to know more about Lightspace


thanks
g
 
Has anyone used a dolly/handtruck to move the FW900 up the stairs? I figure that would be the easiest way to do it outside of someone else doing the heavy lifting.;)

I still chuckle when my buddy and I moved I think a 32" Trinitron TV years ago, we were completely out of breath on the 6th floor...must have weighed 300 lbs for sure.

We got it in my apartment, only to turn it on and the display being a piece of junk...and back down it went... After that I stuck with 90s model Trinitrons with the rounded screens.
 
Has anyone used a dolly/handtruck to move the FW900 up the stairs? I figure that would be the easiest way to do it outside of someone else doing the heavy lifting.;)

I still chuckle when my buddy and I moved I think a 32" Trinitron TV years ago, we were completely out of breath on the 6th floor...must have weighed 300 lbs for sure.

We got it in my apartment, only to turn it on and the display being a piece of junk...and back down it went... After that I stuck with 90s model Trinitrons with the rounded screens.

I will be getting a dolly because I've been working with 32-inch + Trinitrons myself and I can't keep asking friends to come over to help all the time. :)

The 36-inch wasn't quite 300 lbs but it did weigh ~ 220lbs. Yikes! :D
 
Has anyone used a dolly/handtruck to move the FW900 up the stairs? I figure that would be the easiest way to do it outside of someone else doing the heavy lifting.;)

honestly, get to a gym and do some basic strength training (squats & deadlifts are a good start). I recommend starting strength, and reading/watching anything Mark Rippetoe has to say on the topic :)

One of the (many) benefits of improving strength is being able to handle the almighty FW900 with ease!
 