24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I have a question. I have two FW900 monitors side by side. When one monitor is running and I power on the second monitor next to it, the first monitor flickers briefly, almost like a mild degauss effect on the screen. What causes this? Afterwards everything looks fine on both screens; it just concerns me. Could running them next to each other damage one of my units?
 

I have a decent theory, although I didn't use dual monitors until we moved to LCDs at work :D

Both monitors have a degauss coil that automatically triggers on power-up; all modern CRTs do this. The monitor that is already on is probably picking up the magnetic field generated by the other monitor's degauss. You can try powering on the other display first and see if they switch behavior. Alternatively, you can put more distance between them, and the effect should lessen.

I would think they're not very well magnetically shielded. They degauss on startup anyway BECAUSE they're susceptible to magnetic fields.

http://en.wikipedia.org/wiki/Degaussing#Monitors
 
Excellent explanation! I thought it might have something to do with that. I just wanted to make sure I wasn't causing harm to the units by having them close together. Thank you.
 
Because I have a CRT next to an IPS LCD and the colors tend to look better on the CRT. Am I wrong about this?

So you're taking your CRT vs your IPS and making a generalization about all IPS displays.

Yes, you are completely wrong. Go learn about what color gamut means, and do some research about various IPS displays out there.
 
Do let us know if it actually goes higher than 165 MHz. I think the fastest HDFury only does about 210 MHz, or 1080p @ 72 Hz.

Soooo, the adaptor arrived today and well... I can't say I'm surprised, but I was hoping for more.
It maxes out at 179 MHz; anything beyond that pixel clock isn't shown in the Windows resolution settings. The image itself still looks just as good as it did on my HD 7870 with native DVI-I, and it doesn't have any (noticeable) input lag, so there's that at least.

But using my FW900 again makes me realize how much better it feels than my 120 Hz IPS;
even at 70 Hz it's so much clearer and the movement feels so direct (I tried it with F.E.A.R.).
I completely forgot about that. (And also about the 120 watt power consumption, according to my electricity usage monitor, lol.)
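To put that 179 MHz ceiling in perspective, here's a rough back-of-the-envelope estimate of the refresh rates it allows at a few resolutions. It assumes generic blanking overhead (roughly CVT-style figures), so the real numbers depend on the exact timings you use; treat it as a sketch, not a spec.

Code:
# Rough estimate of the refresh rate a given pixel clock allows.
# Assumes ~20% horizontal and ~8% vertical blanking overhead
# (ballpark CVT figures; reduced blanking buys you a bit more).

def max_refresh(width, height, pixel_clock_mhz, h_blank=0.20, v_blank=0.08):
    htotal = width * (1 + h_blank)      # active pixels per line plus blanking
    vtotal = height * (1 + v_blank)     # active lines per frame plus blanking
    return pixel_clock_mhz * 1e6 / (htotal * vtotal)

for mode in [(1920, 1200), (1600, 1200), (2304, 1440)]:
    print(mode, "->", round(max_refresh(*mode, 179), 1), "Hz max at 179 MHz")

With standard blanking that works out to roughly 60 Hz at 1920x1200 and 72 Hz at 1600x1200, which lines up with the adapter feeling limited next to a native DVI-I card.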
 
Dead Space (1 and 2) on the FW900 @ 1920x1200 @ 85 Hz, with quality headphones, in a light-controlled environment, was seriously cool.
 
I have a question. I have two FW900 monitors side by side. When one monitor is running and I power on the second monitor next to it, the first monitor flickers briefly, almost like a mild degauss effect on the screen. What causes this? Afterwards everything looks fine on both screens; it just concerns me. Could running them next to each other damage one of my units?


The electromagnetic field. This is normal; both my 18" Dell CRTs do the exact same thing.
 
Soooo, the adaptor arrived today and well... I can't say I'm surprised, but I was hoping for more.
It maxes out at 179 MHz; anything beyond that pixel clock isn't shown in the Windows resolution settings. The image itself still looks just as good as it did on my HD 7870 with native DVI-I, and it doesn't have any (noticeable) input lag, so there's that at least.

But using my FW900 again makes me realize how much better it feels than my 120 Hz IPS;
even at 70 Hz it's so much clearer and the movement feels so direct (I tried it with F.E.A.R.).
I completely forgot about that. (And also about the 120 watt power consumption, according to my electricity usage monitor, lol.)


That's because CRTs are low persistence/latency.

https://www.youtube.com/watch?v=wQQgHYnPiFs

CRTs, if I'm not mistaken, refresh line by line; I think LCDs refresh whole chunks at a time (part of why it takes longer for the image to hit the screen, i.e. latency).

Also, can't the FW do like 85 Hz?
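To put some numbers on the "line by line" point, here's a quick sketch of how long a full scanout takes at a few refresh rates (ignoring vertical blanking, which is only a few percent of the frame time):

Code:
# Frame scanout time at a few refresh rates, ignoring vertical blanking.
# On a CRT a pixel lights up the instant the beam reaches its line, so the
# average on-screen latency from scanout alone is about half the frame time.

for hz in (60, 70, 85, 96):
    frame_ms = 1000.0 / hz
    print(f"{hz} Hz: full scan ~{frame_ms:.1f} ms, mid-screen ~{frame_ms / 2:.1f} ms")

So at 85 Hz the whole frame is painted in under 12 ms, top to bottom, with no buffering in between.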
 
I have been using the pixel clock patcher since I got my Sony two years ago (and to "overclock" my LCD to 96 Hz), but it doesn't change anything. It still doesn't show up.
 

Weird. The only other thing I can think of is to use Powerstrip to overwrite the adapter's EDID. I don't have any experience with this but I've read about people doing this to various monitors and adapters.
 
Just snagged two Diamondtron Pro 2070s for $20 each. Got them home and they're working great. They may need to be adjusted a little, but for $40 I feel like I stole them.
 
yea... that's a steal

The guy used them for Photoshop and all that stuff; he bought them brand new when they came out and apparently spent $1200 each.

Was testing out the refresh rates and resolutions they can do. I got one to do 2560x1440 @ 60 Hz and it's crystal clear; text looked sharper than on most LCD panels. Couldn't get 2560x1600 working though. :confused:
 

In the past, we ran some torture-resolution trial runs with the GDM-FW900 and achieved 2560x1440 and 2560x1600 @ 60 Hz, and the unit was able to maintain them. Again, these resolutions and timings ARE NOT RECOMMENDED! They only illustrate the range of the unit's bandwidth.

Hope this helps...

Sincerely,

Unkle Vito!
 
You can actually do 4K on a FW900 at 60 Hz interlaced. AMD cards won't let you go over 2000 pixels of vertical resolution in interlaced mode, though. Somebody should give it a shot on Nvidia.

Uncle Vito, can you give us an idea of what parts of the monitor are actually put under more stress as a result of non-recommended timings?
 
Question for the CRT gods. Just picked up a LaCie Blue IV 22" and the colors seem dull; I can't get the intense white that I can get with my LG 34 or Korean 27. Not sure if this is just a limitation of the CRT's 110-150 nits vs my LCD's 300 nits. Whites look very dull. On my LCD they are insane.
 
The superbright mode is very ugly though; it kills the contrast.

Nickerz, something you can try is the Extron RGB. I've used it to increase the brightness from some dark sources. Most models can increase the voltage on the VGA line, which gives you a brighter overall picture.
 
In the past, we ran some torture-resolution trial runs with the GDM-FW900 and achieved 2560x1440 and 2560x1600 @ 60 Hz, and the unit was able to maintain them. Again, these resolutions and timings ARE NOT RECOMMENDED! They only illustrate the range of the unit's bandwidth.

Hope this helps...

Sincerely,

Unkle Vito!

I was able to go as far as 2560x1600 at 72 Hz and 2800x1800 at 62 Hz. 1600p is bearable for shorter sessions, but I don't play for more than 30-40 minutes at 72 Hz. With Nvidia DSR that gives a whopping 5120x3200 screen resolution (nearly double 4K), but it only makes sense with older, less demanding games (like Halo 1 or Half-Life 2).

1800p at 62 Hz kills my eyes instantly.
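As a rough sanity check on modes like these, you can estimate the horizontal scan frequency and compare it against the FW900's commonly cited ~121 kHz limit. This assumes about 4-5% vertical blanking, so the exact numbers depend on your timing parameters:

Code:
# Rough horizontal scan rate check against the FW900's ~121 kHz limit.
# Assumes about 4.5% vertical blanking; your actual vtotal may differ.

H_LIMIT_KHZ = 121  # commonly cited GDM-FW900 maximum horizontal scan frequency

def h_freq_khz(height, refresh_hz, v_blank=0.045):
    vtotal = height * (1 + v_blank)   # active lines plus blanking
    return vtotal * refresh_hz / 1000.0

for height, hz in [(1600, 72), (1800, 62), (1440, 60)]:
    f = h_freq_khz(height, hz)
    flag = " (over the limit!)" if f > H_LIMIT_KHZ else ""
    print(f"{height} lines @ {hz} Hz -> ~{f:.0f} kHz{flag}")

1600 lines at 72 Hz works out to roughly 120 kHz, which is why it sits right at the edge of what the tube will hold.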


Just pasting a few links:

http://static.frazpc.pl/board/2015/03/49e9f3cadf678e1c.jpg
http://static.frazpc.pl/board/2015/03/16920230cb9c420a.jpg
http://static.frazpc.pl/board/2015/03/f3e7990d5ddec88e.jpg
http://static.frazpc.pl/board/2015/03/0235404501a75b0b.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/BF3/ScreenshotWin32-0005.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/BF3/ScreenshotWin32-0006.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/BF3/ScreenshotWin32-0007.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Crysis/2015-02-26_00004.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Crysis/2015-02-26_00007.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Crysis/2015-02-26_00009.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Crysis/2015-02-26_00011.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Warframe/2015-02-26_00005.jpg
https://dl.dropboxusercontent.com/u/14848901/Screenshots/Warframe/2015-02-26_00007.jpg

You can actually do 4K on a FW900 at 60 Hz interlaced. AMD cards won't let you go over 2000 pixels of vertical resolution in interlaced mode, though. Somebody should give it a shot on Nvidia.

Uncle Vito, can you give us an idea of what parts of the monitor are actually put under more stress as a result of non-recommended timings?

Any hint on how to try interlaced? As a quality geek I probably wouldn't use it at all, but it might be cool for taking screenshots like the ones I posted above :p
 
I wouldn't bother with interlaced on a CRT designed from the start for progressive scan. The flicker is even worse, because each line is effectively getting half the refresh rate, and CRT monitors are a lot lower-persistence than TVs were.

I tried the interlaced resolutions on my S3 card just to see, and they looked like crap. This is coming from one of the worst 2D-quality cards of all time - I could see the difference in interlaced resolutions even with a static screen.
 

Interlaced isn't really bad for games. I've noticed that it gives you some added perceivable resolution while being easier on the video card. It's not necessary for everyone, but for those of us (like me) who have older video cards, it's nice to have the option.
 
Any hint on how to try interlaced? As a quality geek I probably wouldn't use it at all, but it might be cool for taking screenshots like the ones I posted above :p

I use CRU to add interlaced resolutions. As I said earlier, it seems like AMD won't let you use interlaced resolutions over 2000 vertical. Hopefully Nvidia will.

I wouldn't bother with interlaced on a CRT designed from the start for progressive scan. The flicker is even worse, because each line is effectively getting half the refresh rate, and CRT monitors are a lot lower-persistence than TVs were.

I tried the interlaced resolutions on my S3 card just to see, and they looked like crap. This is coming from one of the worst 2D-quality cards of all time - I could see the difference in interlaced resolutions even with a static screen.

Did you actually try it with a game, though? Interlaced does look bad with text, but it looks fine in games if you lock the framerate at half the refresh rate. Like with Far Cry 3, my CPU prevents me from playing at a steady 60, but my graphics card gives me a lot of headroom, so I set it to 1836x1392 interlaced at 80 Hz (which my monitor can't scan progressively) and cap the framerate at 40. That way every two refreshes combine to make one full frame, rendering a perfect 40 fps without all the flicker.

And with Far Cry 4 I do the same, except at 2304x1728 @ 60 Hz, which gives me a perfect 30 fps.
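For anyone following along, here's the arithmetic behind that trick as a small sketch. It assumes the rate the driver reports for an interlaced mode is the field rate, and that the cap is set to exactly half of it:

Code:
# Interlaced scanning: each refresh draws only the odd or the even lines (a
# "field"), so two consecutive fields make one full frame. Capping the game
# at half the field rate means every rendered frame spans exactly two fields.

def interlaced_plan(field_rate_hz):
    full_frame_rate = field_rate_hz / 2           # two fields = one complete frame
    frame_time_ms = 1000.0 / full_frame_rate      # time to display one full frame
    return full_frame_rate, frame_time_ms

for fields in (80, 60):
    fps, ms = interlaced_plan(fields)
    print(f"{fields} Hz interlaced -> cap at {fps:.0f} fps ({ms:.1f} ms per full frame)")

That's where the 80 Hz/40 fps and 60 Hz/30 fps pairs above come from.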
 
You can actually do 4K on a FW900 at 60 Hz interlaced. AMD cards won't let you go over 2000 pixels of vertical resolution in interlaced mode, though. Somebody should give it a shot on Nvidia.

Uncle Vito, can you give us an idea of what parts of the monitor are actually put under more stress as a result of non-recommended timings?

The HOT (horizontal output transistor), the FBT (flyback transformer), the cathode, the anode, the guns, and the tube assembly.

Hope this helps...

UV!
 
Question for the CRT gods. Just picked up a LaCie Blue IV 22" and the colors seem dull; I can't get the intense white that I can get with my LG 34 or Korean 27. Not sure if this is just a limitation of the CRT's 110-150 nits vs my LCD's 300 nits. Whites look very dull. On my LCD they are insane.

You are not comparing apples to apples... You CANNOT compare a CRT to an LCD... LCDs are insanely color inaccurate, and they over-brighten and over-saturate the colors.

UV!
 
You are not comparing apples to apples... You CANNOT compare a CRT to an LCD...
Yes you can; they both show images.

LCDs are insanely color inaccurate, and they over-brighten and over-saturate the colors.

UV!

The backlight can be turned down to CRT levels.

Nowadays, even midrange LCDs have gamuts closer to Rec. 709 than CRTs. As to which gamut is better, well, most content on computers is designed for Rec. 709.

What LCDs will never match are the black levels and the viewing angles.
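If anyone wants to put rough numbers on the gamut point: one crude comparison is the area of the primary triangle in CIE 1931 xy space (crude because xy isn't perceptually uniform). The Rec. 709 primaries below are the standard values; the "CRT" set is just an illustrative SMPTE-C-like example, not measured FW900 phosphors:

Code:
# Crude gamut comparison: area of the primary triangle in CIE 1931 xy space.
# Rec. 709 primaries are the standard values; the "crt" set is an illustrative
# SMPTE-C-like example, NOT measured values from any particular monitor.

def triangle_area(primaries):
    (x1, y1), (x2, y2), (x3, y3) = primaries
    # Shoelace formula for the area of a triangle
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

rec709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]   # standard Rec. 709 primaries
crt    = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]   # illustrative phosphor set only

a709, acrt = triangle_area(rec709), triangle_area(crt)
print(f"Rec. 709 area: {a709:.4f}, CRT example: {acrt:.4f}, ratio: {acrt / a709:.2f}")

With numbers like these the two triangles come out within about 10% of each other, which is the point: on gamut alone, a decent modern LCD and a CRT are in the same ballpark.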
 

"yes you can; they both show images..." Comparing a CRT to an LCD is like comparing two items with much different technologies... Both show images and the "comparison" will not have any practical value: One is BRIGHTER that the other, one SATURATES COLORS AND CANNOT REPRODUCE A 1 IRE BOX IN A BLACK BACKGROUND, one has no flicker and the other one has, one is energy saving and the other is not, one has more active viable area and other has not, one is real time and the other is not.... I can go on and on but what practical value does this "comparison" of two different technologies showing an "image" has? This is clearly not comparing apples to apples...

Nevertheless, if two CRTs with same technologies are compared, then a more sound and practical value can be achieved.

Again, this is my own opinion...

UV!
 
You are not comparing apples to apples... You CANNOT compare a CRT to an LCD... LCDs are insanely color inaccurate, and they over-brighten and over-saturate the colors.

UV!

LCDs can be color calibrated though, can't they? I've seen a ton of reviews on various models stating excellent results after calibration, although I know this isn't something the average user can do without the required calibration hardware.
 
So Goodwill had a working Gateway VX920 for $10. I passed on it because it is a 19" and I already have two Dell P991s and one Dell P992. But apparently the Gateway used to retail for almost $1200? Is this monitor in a higher league than my Trinitrons? Should I go back and get it?
 

Your Dell monitors are 107 kHz. The VX920 is 110 kHz. The VX920 is slightly better than the Dells, but not by much. Expect a 2-5 Hz higher refresh rate.
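For the curious, that difference falls roughly out of the horizontal scan rates. A quick estimate, assuming ~4-5% vertical blanking (so the totals are approximate):

Code:
# Max refresh rate from the monitor's horizontal scan limit:
# refresh ~= h_freq / vtotal, where vtotal = active lines + blanking.
# Assumes about 4.5% vertical blanking, so treat results as estimates.

def max_refresh(h_freq_khz, active_lines, v_blank=0.045):
    vtotal = active_lines * (1 + v_blank)   # active lines plus blanking
    return h_freq_khz * 1000.0 / vtotal

for lines in (1024, 1200):
    dell, vx920 = max_refresh(107, lines), max_refresh(110, lines)
    print(f"{lines} lines: Dell ~{dell:.0f} Hz, VX920 ~{vx920:.0f} Hz (+{vx920 - dell:.1f} Hz)")

So at 1600x1200 you're looking at roughly 85 Hz vs 88 Hz: a real but small gain.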
 
I'd get it. Like the GDM-FW900, that monitor was designed for professional photo editing.

A 19-inch monitor is good too; just sit a bit closer to the screen if you think it's tiny :)

Whether or not it's worth buying if you already own a good CRT is up for debate.
 
LCDs can be color calibrated though, can't they? I've seen a ton of reviews on various models stating excellent results after calibration, although I know this isn't something the average user can do without the required calibration hardware.

The million dollar question... would the images displayed be color accurate?

After calibrating an LCD, check and see if the display can reproduce a one IRE box on a black background... Check the LCD for luminance uniformity, check for color casts and hues, and check if the LCD can be calibrated at 85-100 cd/m2 max.

Sure, they can be "calibrated"... but at the expense of which parameter? Luminance, primary color band(s)...

In my experience with displays, you just can't beat the glass...

Again, it is only my opinion...

UV!
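If anyone wants to try the 1 IRE test themselves, here's a quick Pillow sketch that writes a near-black box on a pure black background. How 1 IRE maps to 8-bit levels is a judgment call; this assumes full-range RGB where 100 IRE = 255, so 1 IRE rounds to about 3. Adjust to taste.

Code:
# Generate a "1 IRE box on black" test pattern as a PNG.
# Assumes full-range 8-bit RGB (0 = black, 255 = 100 IRE), so 1 IRE ~= 3/255.
# View it full screen in a dark room; on a good CRT the box is just visible.

from PIL import Image, ImageDraw

W, H = 1920, 1200                      # FW900-ish native resolution
level = round(255 * 1 / 100)           # ~1 IRE in 8-bit full range -> 3

img = Image.new("RGB", (W, H), (0, 0, 0))
draw = ImageDraw.Draw(img)
box = (W // 2 - 150, H // 2 - 150, W // 2 + 150, H // 2 + 150)
draw.rectangle(box, fill=(level, level, level))
img.save("one_ire_box.png")
print(f"Saved one_ire_box.png with box level {level}/255")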
 
I think I may have been overestimating the original price of the VX920. I see old posts selling it for $100-$200, so it probably didn't retail for a grand like the LaCie Electrons or the FW900. I can't find a lot of info through Google so maybe it wasn't that popular.

With proper WinDAS maintenance, my Dell Trinitrons are going to last me a while, so I'm only interested in another 19" monitor if it actually has better color or sharpness than what I already have.
 