24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Hi btw. It's been a while since I've been here, ever since my FW900 died.
I have a temporary G500 that is at around 70-80% of its original luminosity, and I found a LaCie Electron 22 Blue IV on sale today.
There are 3 FW900s for sale on eBay Kleinanzeigen here in Germany, but they're expensive, and since I can run 16:9 resolutions on a 4:3 screen at higher refresh rates and a lower price, I'm looking for a 4:3.

The owner is selling it for 375 euros, negotiable. I asked for 300. Does anyone have any experience with this monitor?
 
I have one, I've been using it for 8 years. I got it with less than 10,000 hours, possibly as low as 5,000. It's over 20,000 now and still looks great.

So ask them if you can look at the hours (you have to go into the service menu and it's on one of the main screens; look up the service manual for the NEC or Mitsubishi rebrand and it will tell you how), so you can gauge how much you want to pay based on that.

I think it's a beautiful monitor. Dimmer than my Trinitrons, but I use it in a dark room.
 
I have one, I've been using it for 8 years. I got it with less than 10,000 hours, possibly as low as 5,000. It's over 20,000 now and still looks great.

So ask them if you can look at the hours (you have to go into the service menu and it's on one of the main screens; look up the service manual for the NEC or Mitsubishi rebrand and it will tell you how), so you can gauge how much you want to pay based on that.

I think it's a beautiful monitor. Dimmer than my Trinitrons, but I use it in a dark room.
Thanks for the info, that's very helpful! I have a Sony CPD-G500 right now, and the tube is quite worn, so the brightness is at around 70-80% of what it originally was... Do you think it's around 70-80% as bright as your Trinitrons? I fell in love with the glow of my FW900, and ever since then, no CRT monitor I've gotten has had that sort of glow, except for a G400 I had before this one, but it only lasted less than a week since shipping damaged it badly.

That luminous glow is what sets CRTs apart from OLEDs in my experience, and with the G520 I have right now, it's not really there, even though it does have some glow.

I'm deciding whether I want to spend 1,500 euros on an FW900 I found, or 300-350 max on the LaCie Electron 22 Blue IV.
 
This one is better than the Delock 62967: https://www.lets-sell.de/detail/index/sArticle/46886?sPartner=delock

I have both this one and the 62967. It is around 10-15% faster than the 62967, and the StarTech is even slower than both.
The StarTech DP2VGAHD20 is the best converter out there. It can output a max pixel clock of around 375 MHz, which is more than the 2400x1800 at 59 Hz (362 MHz) limit of your 62967.
The Sunix and Delock variants of the splitter, although higher in max pixel clock, are unstable at certain resolutions (2048x1536) and produce various artefacts, so I don't recommend either of them (I have both).
There was a report a few months ago of an LK7112 chip that could do above 390 MHz, but it's nowhere to be found.
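
For anyone who wants to sanity-check figures like these: the pixel clock is just the total raster size (active pixels plus blanking) multiplied by the refresh rate. A minimal Python sketch, where the blanking totals are assumptions picked only to illustrate the ~362 MHz number above (real totals depend on whether you use GTF, CVT, etc.):

```python
# Minimal sketch: pixel clock = horizontal total x vertical total x refresh rate.
# The h_total/v_total values below are hypothetical CRT-style totals for a
# 2400x1800 active area (chosen to land near the 362 MHz figure quoted above),
# not anyone's actual modeline; real blanking depends on the timing standard.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz for a given total raster size and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

def max_refresh_hz(h_total: int, v_total: int, pclk_limit_mhz: float) -> float:
    """Highest refresh rate a DAC with the given pixel-clock limit can drive."""
    return pclk_limit_mhz * 1e6 / (h_total * v_total)

h_total, v_total = 3280, 1870  # assumed totals: 2400x1800 active + generous CRT blanking
print(f"2400x1800 @ 59 Hz needs roughly {pixel_clock_mhz(h_total, v_total, 59):.0f} MHz")
print(f"A 375 MHz converter tops out near {max_refresh_hz(h_total, v_total, 375):.1f} Hz here")
```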
 
The StarTech DP2VGAHD20 is the best converter out there. It can output a max pixel clock of around 375 MHz, which is more than the 2400x1800 at 59 Hz (362 MHz) limit of your 62967.
The Sunix and Delock variants of the splitter, although higher in max pixel clock, are unstable at certain resolutions (2048x1536) and produce various artefacts, so I don't recommend either of them (I have both).
There was a report a few months ago of an LK7112 chip that could do above 390 MHz, but it's nowhere to be found.
I disagree.

The only artifacts the Delock splitter produces for me are at 2048x1536. Aside from that, every other resolution I've tried has worked without a problem. I'm running 3200x1800 at 65 Hz and 2400x1800 at 69 Hz: 536 MHz on the 16:9 resolution and 427.8 MHz on the 4:3.

It's well worth the sacrifice of one resolution to have around 30% more pixel clock headroom than the StarTech one.

2400x1800 at 59 Hz is also unusable because of the flicker. Anything below 65 Hz is uncomfortable and also makes the image look worse because of detail popping in and out.
 
Does anyone here know if it's possible to use an older graphics card for its analog output, and use the other graphics card as the primary renderer outputting through it?

Here is someone who used their 560Ti with a 1080Ti.
 
Does anyone here know if it's possible to use an older graphics card for its analog output, and use the other graphics card as the primary renderer outputting through it?

Here is someone who used their 560Ti with a 1080Ti.

It works! This is amazing! No more adapters :D.

Unless I use a modern console*
 
Does anyone here know if it's possible to use an older graphics card for its analog output, and use the other graphics card as the primary renderer outputting through it?

You mean use an older card for VGA output?

Yeah, but it doesn't work well in many games. DX9 and DX11 games have very poor frame delivery when they go through a second GPU, in my experience. It'll make a 90fps game look like it's running at 25 fps because of all the stuttering.

Vulkan, on the other hand, I've had mostly good experience with. A little more input lag, but for some games that isn't a big deal. But for Doom Eternal it was a game breaker.
 
I disagree.

The only artifacts the Delock splitter produces for me are at 2048x1536. Aside from that, every other resolution I've tried has worked without a problem. I'm running 3200x1800 at 65 Hz and 2400x1800 at 69 Hz: 536 MHz on the 16:9 resolution and 427.8 MHz on the 4:3.

It's well worth the sacrifice of one resolution to have around 30% more pixel clock headroom than the StarTech one.

2400x1800 at 59 Hz is also unusable because of the flicker. Anything below 65 Hz is uncomfortable and also makes the image look worse because of detail popping in and out.
You must be lucky, I guess. I have had these splitters (Sunix and Delock) for a few years now and they are both buggy as hell. At 2048x1536 the signal to the monitor cuts out (it goes into standby) quite often.
Then there's the side-swapping bug that occurs randomly at all resolutions.
There are other resolutions that simply tremble or jitter, for lack of a better word, and 2184x1365 at 85 Hz on my FW900 flickers like crazy quite often on both of them.
They are utter crap in my opinion, but I guess people have various quality standards. The only reason they are not in the trash bin by now is that they are the only way to test custom timings above 400 MHz.
The only converter (sadly) that is identical in image quality and performance to a regular video card with analog output is the StarTech (up to 375 MHz). I use that on my Dell P1130 because 2048x1536 at 80 Hz is within range of the converter.
For my F520 and FW900 I use the GTX 980Ti.
 
😫
Also, here's a little secret. Since a CRT scans line by line, the same refresh rate can be achieved as long as the number of lines (the vertical resolution) stays the same. For example, this means that I can run 3200x1800 at 65 Hz as well, and 2560x1440 at the same refresh rate as 1920x1440 too, which is 86 Hz on my Sony G500. I tried 1942p as well, but I don't think there is a lot more to squeeze beyond 3200x1800. Maybe up to 1900p at 60 Hz max.
Not a secret, really.
The CRT only cares about the synchronization signals, so basically the only parameters that matter are the number of lines and the refresh rate.
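
To put the same point in numbers: vertical refresh is the horizontal scan rate divided by the total number of lines per frame, so the horizontal resolution never enters the formula (it only raises the pixel clock). A rough Python sketch; the 130 kHz scan limit and ~4% vertical blanking below are illustrative assumptions, not any particular monitor's spec:

```python
# Rough sketch: vertical refresh = horizontal scan rate / total lines per frame.
# The 130 kHz limit and ~4% vertical blanking are assumptions for illustration.

def max_refresh_hz(active_lines: int, hscan_limit_khz: float, blanking: float = 0.04) -> float:
    """Max refresh for a given vertical resolution under a horizontal scan-rate cap."""
    v_total = round(active_lines * (1 + blanking))
    return hscan_limit_khz * 1000.0 / v_total

for width in (1920, 2560, 3200):
    # Width doesn't change the result: 1440 lines cap out at the same refresh
    # whether each line is 1920 or 3200 pixels wide; only the pixel clock grows.
    print(f"{width}x1440 -> about {max_refresh_hz(1440, 130.0):.1f} Hz")
```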

they're expensive, and since I can run 16:9 resolutions on a 4:3 screen at higher refresh rates and a lower price, I'm looking for a 4:3.
Did you ever use a CRT before?
If not and you are interested, then grab a cheap Trinitron or even some 19" LG or something similar to test how it works, play with it, and when you think this is it, then get the big guns. Of course you will also need a converter if your GPU doesn't have DVI-I ports (or, God forbid, VGA ports XD), and for that the Delock 62967 is my recommendation, or the StarTech DP2VGAHD20, though the Delock has better picture quality IMHO, so I recommend the Delock.

Does anyone here know if it's possible to use an older graphics card for its analog output, and use the other graphics card as the primary renderer outputting through it?
Yes, it is possible. Starting with a certain version of Windows 10 you can force an application to render on a selected GPU even if it is displayed on a monitor attached to a different one.
It works as one could imagine:
- lower performance
- higher input latency (or rather display latency?)
Not sure about stuttering issues because I am not crazy enough to test such things thoroughly, but so far, other than a "floaty" feeling (especially at lower in-game frame rates) and lower framerates in general, it does seem to work just fine.

And this is why no one uses this solution. A DAC is cheap and converts the digital output to analog without any added latency, reduced frame rates, or extra system resource usage.
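
For completeness, the per-app GPU choice mentioned above lives in Settings > System > Display > Graphics settings. As far as I know Windows stores it in a per-user registry key, so something like the Python sketch below can script it; treat the key path, the value format, and the example exe path as assumptions and check them on your own machine first:

```python
# Sketch only: mark one executable to render on the high-performance GPU.
# The key layout ("GpuPreference=2;" = high performance, "1" = power saving) is an
# assumption based on what the Settings UI appears to write; verify before relying on it.
import winreg

exe_path = r"C:\Games\Example\game.exe"  # hypothetical path, use your game's exe
key_path = r"Software\Microsoft\DirectX\UserGpuPreferences"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, key_path) as key:
    winreg.SetValueEx(key, exe_path, 0, winreg.REG_SZ, "GpuPreference=2;")
```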
 
You mean use an older card for VGA output?

Yeah, but it doesn't work well in many games. DX9 and DX11 games have very poor frame delivery when they go through a second GPU, in my experience. It'll make a 90fps game look like it's running at 25 fps because of all the stuttering.

Vulkan, on the other hand, I've had mostly good experience with. A little more input lag, but for some games that isn't a big deal. But for Doom Eternal it was a game breaker.
Are you actually using this solution? 😮
I fail to see one reason why this would be a good idea...
 
The only converter (sadly) that is identical in image quality and performance to a regular video card with analog output is the StarTech (up to 375 MHz). I use that on my Dell P1130 because 2048x1536 at 80 Hz is within range of the converter.
For my F520 and FW900 I use the GTX 980Ti.
Did you really test the StarTech well enough? Or maybe I got a dud.
My unit shows issues on horizontal gradients like this one http://www.lagom.nl/lcd-test/gradient.php above a certain pixel clock. At <200 MHz it is fine, but definitely not fine above 300 MHz.
It is very slight and might not even be noticeable in normal usage (hence I am keeping it), but other than this issue the image on the Delock 62967 generally just looks better; it is brighter and more what I would expect from a proper VGA output device. When I compared the Delock to the GTX 980 Ti I did not see any difference, other than the 980 Ti doing a 400 MHz pixel clock and having a 10-bit DAC, thus not needing dithering as much for gamma correction.
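
If anyone wants to run this kind of check without the website, here is a short Python sketch (assuming Pillow is installed; the size is arbitrary) that renders a full-width grayscale ramp similar in spirit to the lagom gradient page, handy for eyeballing converter banding at a given pixel clock:

```python
# Sketch: generate a horizontal 0-255 grayscale ramp for spotting DAC/converter banding.
from PIL import Image

W, H = 2048, 256  # arbitrary; match the width of the mode you want to test if you like
ramp = [round(x * 255 / (W - 1)) for _ in range(H) for x in range(W)]  # row-major pixels
img = Image.new("L", (W, H))
img.putdata(ramp)
img.save("gradient.png")  # view it full-screen at the resolution/refresh under test
```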
 
You must be lucky, I guess. I have had these splitters (Sunix and Delock) for a few years now and they are both buggy as hell. At 2048x1536 the signal to the monitor cuts out (it goes into standby) quite often.
Then there's the side-swapping bug that occurs randomly at all resolutions.
There are other resolutions that simply tremble or jitter, for lack of a better word, and 2184x1365 at 85 Hz on my FW900 flickers like crazy quite often on both of them.
They are utter crap in my opinion, but I guess people have various quality standards. The only reason they are not in the trash bin by now is that they are the only way to test custom timings above 400 MHz.
The only converter (sadly) that is identical in image quality and performance to a regular video card with analog output is the StarTech (up to 375 MHz). I use that on my Dell P1130 because 2048x1536 at 80 Hz is within range of the converter.
For my F520 and FW900 I use the GTX 980Ti.
Same here with the 2048x1536. The side-swapping bug happened once in the last 48 hours that I used it, and only lasted less than 10 seconds. It hasn't happened at all resolutions, though; only one of them so far (I forgot which).

How is 375 MHz identical to a GPU that has an analog output? The Delock 62967, which supports more than 375 MHz, was much slower than my 980 Ti when I used it on the FW900. As far as I can tell, my 980 Ti was outputting much more than that. The only bottleneck was the FW900's max frequency.
 
Not a secret, really.
The CRT only cares about the synchronization signals, so basically the only parameters that matter are the number of lines and the refresh rate.


Did you ever use a CRT before?
If not and you are interested, then grab a cheap Trinitron or even some 19" LG or something similar to test how it works, play with it, and when you think this is it, then get the big guns. Of course you will also need a converter if your GPU doesn't have DVI-I ports (or, God forbid, VGA ports XD), and for that the Delock 62967 is my recommendation, or the StarTech DP2VGAHD20, though the Delock has better picture quality IMHO, so I recommend the Delock.


Yes, it is possible. Starting with a certain version of Windows 10 you can force an application to render on a selected GPU even if it is displayed on a monitor attached to a different one.
It works as one could imagine:
- lower performance
- higher input latency (or rather display latency?)
Not sure about stuttering issues because I am not crazy enough to test such things thoroughly, but so far, other than a "floaty" feeling (especially at lower in-game frame rates) and lower framerates in general, it does seem to work just fine.

And this is why no one uses this solution. A DAC is cheap and converts the digital output to analog without any added latency, reduced frame rates, or extra system resource usage.
I have used CRTs before. I had an FW900, a G400 and now a G520. It's a new discovery for me, which is why it felt like a secret.

That's good to know.
 
You mean use an older card for VGA output?

Yeah, but it doesn't work well in many games. DX9 and DX11 games have very poor frame delivery when they go through a second GPU, in my experience. It'll make a 90fps game look like it's running at 25 fps because of all the stuttering.

Vulkan, on the other hand, I've had mostly good experience with. A little more input lag, but for some games that isn't a big deal. But for Doom Eternal it was a game breaker.
Thanks for the info <3
 
Are you actually using this solution? 😮
I fail to see one reason why this would be a good idea...

I experiment with it from time to time to see how it works with newer APIs.

One reason why it would be useful is for interlaced resolutions, since new Radeon and GeForce cards no longer support interlacing.

If you're playing a new game where you're CPU-limited below 60 fps, or you just want to really crank the graphics settings/resolution, you will need to cap to some frame rate like 40 fps or 50 fps, and then interlacing helps you get a high refresh rate for those resolutions. So 150 Hz for a 50 fps game to get the 1:3 cadence and low input lag.
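
The cadence arithmetic is easy to script if anyone wants it: given a frame-rate cap and the highest refresh a mode can reach, take the largest integer multiple of the cap that still fits. A tiny sketch with made-up ceiling values just for illustration:

```python
# Sketch: highest refresh rate that is an even multiple of a frame-rate cap,
# so every frame is shown the same number of times (e.g. 50 fps at 150 Hz = 1:3).
def best_refresh(fps_cap: int, max_refresh: int) -> int:
    return (max_refresh // fps_cap) * fps_cap

print(best_refresh(50, 160))  # 150 -> each 50 fps frame is displayed for 3 refreshes
print(best_refresh(40, 130))  # 120 -> 1:3 cadence for a 40 fps cap
```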

Also, old Radeon cards can have their max pixel clock unlocked with ToastyX's utility, so they still have more capable DACs than any external adapter, including the Sunix.
 
Did you really test the StarTech well enough? Or maybe I got a dud.
Nope. No problems with my StarTech. The only "difference" between it and the 980 Ti output is that the image is shifted to the left about 2 mm or so and I have to recenter it in the "Center" menu option (FW900), and, of course, 375 MHz vs 400 MHz.
 
How is 375 MHz identical to a GPU that has an analog output? The Delock 62967, which supports more than 375 MHz, was much slower than my 980 Ti when I used it on the FW900. As far as I can tell, my 980 Ti was outputting much more than that. The only bottleneck was the FW900's max frequency.
Read my post again. I said they are identical up to 375 MHz (the limit of usability for the StarTech), as in: for any custom timing up to 375 MHz there's no difference between it and a regular VGA card.
What are you talking about? The Delock 62967 does NOT support 375 MHz. Please prove your above statement with a screenshot in NVCPL or CRU.
 
Read my post again. I said they are identical up to 375 MHz (the limit of usability for the StarTech), as in: for any custom timing up to 375 MHz there's no difference between it and a regular VGA card.
What are you talking about? The Delock 62967 does NOT support 375 MHz. Please prove your above statement with a screenshot in NVCPL or CRU.
Well, you've explained it now, so I get it.

I'm not gonna unplug the splitter and potentially ruin the resolutions I've sized right just to take a screenshot.

Btw, I have two Analogix ANX6212 chips somewhere. Do you happen to know what the max speed of those is? And whether they would work in the Delock 62967?
 
I have two Analogix ANX6212 chips. Does anyone know what their max speed is, and if I can replace the chip in the Delock 62967 with one?
 
Well, you've explained it now, so I get it.

I'm not gonna unplug the splitter and potentially ruin the resolutions I've sized right just to take a screenshot.

Btw, I have two Analogix ANX6212 chips somewhere. Do you happen to know what the max speed of those is? And whether they would work in the Delock 62967?
The Delock 62967 supports what is written on its box: 2560x1600 at 60 Hz, which is 350 MHz. That's it. At least that's what mine does.
Maybe you have a unicorn, but I really doubt that.
 
I have one, I've been using it for 8 years. I got it with less than 10,000 hours, possibly as low as 5,000. It's over 20,000 now and still looks great.

So ask them if you can look at the hours (you have to go into the service menu and it's on one of the main screens; look up the service manual for the NEC or Mitsubishi rebrand and it will tell you how), so you can gauge how much you want to pay based on that.

I think it's a beautiful monitor. Dimmer than my Trinitrons, but I use it in a dark room.
I went ahead and bought the FW900 for 1,400 euros. And I will also buy the LaCie Electron 22 Blue IV for the faster refresh rate.
 
Cool! Looking forward to hearing a comparison from you in a month or two.
Will do :).

I'm gonna learn how to use WinDAS to help maintain the FW900. I remember mine had its voltage regulator blow and take the tube with it, and I read that something called G2 has to be checked in WinDAS and brought back down to a stable voltage once every month, since, apparently, all Sony F monitors drift it upward a bit over time for some reason.
 
Will do :).

I'm gonna learn how to use WinDAS to help maintain the FW900. I remember mine had its voltage regulator blow and take the tube with it, and I read that something called G2 has to be checked in WinDAS and brought back down to a stable voltage once every month, since, apparently, all Sony F monitors drift it upward a bit over time for some reason.
This should help.
 
I went ahead and bought the FW900 for 1,400 euros. And I will also buy the LaCie Electron 22 Blue IV for the faster refresh rate.
Good for you 🙂

Unfortunately, as the years go by it gets exponentially harder to find this monitor for sale. I have not seen it locally for years at this point.
Even normal 21" Trinitrons are quite hard to find and have gotten very expensive. 22" Diamondtrons are rare unicorns, just like the FW900. In fact, the only CRTs that are still readily available and don't break the bank are <= 17".

I read that something called G2 has to be checked in WinDAS and brought back down to a stable voltage once every month, since, apparently, all Sony F monitors drift it upward a bit over time for some reason.
Doing it every month only increases the risk to the monitor, because you might screw something up while doing it.
People who think they know what they are doing recommend doing a WPB (white point balance) twice a year, and doing the WPB instead of just adjusting the G2 voltage.
 
Good for you 🙂

Unfortunately, as the years go by it gets exponentially harder to find this monitor for sale. I have not seen it locally for years at this point.
Even normal 21" Trinitrons are quite hard to find and have gotten very expensive. 22" Diamondtrons are rare unicorns, just like the FW900. In fact, the only CRTs that are still readily available and don't break the bank are <= 17".


Doing it every month only increases the risk to the monitor, because you might screw something up while doing it.
People who think they know what they are doing recommend doing a WPB (white point balance) twice a year, and doing the WPB instead of just adjusting the G2 voltage.
I definitely don't know what I'll be doing right now. I've yet to read the thread spacediver sent me, follow it, practice it, and learn more about maintenance, and the right way of doing it.

I received the monitor yesterday. My arms got a bit hurt since I hurried to help the delivery person, who was struggling, so I'm waiting to recover before I bring it to my room and finally start learning everything I need to learn.

WPB instead of adjusting G2 voltage, twice a year, I'll keep that in mind. Thanks.

Does it need extreme precision to avoid damaging anything while adjusting the voltage or G2? Or is it something that can be done without too much risk if done carefully?
 
I definitely don't know what I'll be doing right now. I've yet to read the thread spacediver sent me, follow it, practice it, and learn more about maintenance, and the right way of doing it.

I received the monitor yesterday. My arms got a bit hurt since I hurried to help the delivery person, who was struggling, so I'm waiting to recover before I bring it to my room and finally start learning everything I need to learn.

WPB instead of adjusting G2 voltage, twice a year, I'll keep that in mind. Thanks.

Does it need extreme precision to avoid damaging anything while adjusting the voltage or G2? Or is it something that can be done without too much risk if done carefully?
ABL and the other protections are inactive during the WPB, so if you set things wrong initially the tube can be overbright in the later phases. The initial adjustment is not made using instrument data, but the later ones are, so if they don't line up, redo it from the start. If you do the initial steps in a dark room or under a heavy blanket it will be fine. Adjusting G2 only in the DAT file is fine if you do it incrementally. I feel like as long as the tube is in the normal brightness range, nothing really bad can happen. Long-term, the monitors that go bad do so because of the flyback issue, but I don't actually think that is related to the WPB, in my opinion. Overall, the rule of thumb is: if the picture doesn't look odd, the monitor should be able to handle it.
 
Has anybody gotten the Lontium CGMHA to work with a new RTX card? A Delock bidirectional cable with a USB 3.1 gender changer doesn't work for me. I wonder if it would work with a Thunderbolt-capable mainboard and an RTX card, since Win11 seems to allow specifying the video output.
 
Thanks for the info, that's very helpful! I have a Sony CPD-G500 right now, and the tube is quite worn, so the brightness is at around 70-80% of what it originally was... Do you think it's around 70-80% as bright as your Trinitrons? I fell in love with the glow of my FW900, and ever since then, no CRT monitor I've gotten has had that sort of glow, except for a G400 I had before this one, but it only lasted less than a week since shipping damaged it badly.

That luminous glow is what sets CRTs apart from OLEDs in my experience, and with the G520 I have right now, it's not really there, even though it does have some glow.

I'm deciding whether I want to spend 1,500 euros on an FW900 I found, or 300-350 max on the LaCie Electron 22 Blue IV.
I have got these monitors, except for the G400. The CRT glow is only prominent on fairly new tubes in combination with a stable power supply etc., as far as I know. Although I have never seen a Diamondtron shine like a Trinitron. The 19" IV seems to be more vibrant than its bigger brother, but still inferior to the 21" G520 in my case (the hours are almost the same), even in superbright mode. Of course the FW900 is the king, as long as it is in good shape. Meanwhile, the missing hour count makes it a no-go for me, since it cost me more than a stupid amount of money... I would look at other brands instead of the LaCie at that given price. Just my 2 cents.
 
Has anybody gotten the Lontium CGMHA to work with a new RTX card? A Delock bidirectional cable with a USB 3.1 gender changer doesn't work for me. I wonder if it would work with a Thunderbolt-capable mainboard and an RTX card, since Win11 seems to allow specifying the video output.
I've read that bidirectional cables rarely work as advertised.

You want the Sunix UPD2018:

https://www.ebay.com/itm/234144370547?hash=item3684178f73:g:kMYAAOSwFo5hG-W0

By the way, where did you find a CGMHA?
 
Whenever you get the Sunix/Dell adapter or whatever, come back and let us know how the Vention adapter works for you.

We've heard that it goes to really high pixel clocks, but I don't think any of the regular posters here actually have one.
Unfortunately nobody wants to send the Sunix card to Germany. At the moment I'm waiting for an answer from a seller on eBay. The Dell card should cost around 20 dollars, which is a steal compared to the Delock clone that is offered here for 90+ euros... so maybe I'll buy a new mainboard earlier than planned.
Once again, this hobby is getting really insane.
 
Unfortunately nobody wants to send the Sunix card to Germany. At the moment I'm waiting for an answer from a seller on eBay. The Dell card should cost around 20 dollars, which is a steal compared to the Delock clone that is offered here for 90+ euros... so maybe I'll buy a new mainboard earlier than planned.
Once again, this hobby is getting really insane.
I could sell you mine and just buy another one, since I'm in the US.
 
I could sell you mine and just buy another one, since I'm in the US.
Sorry, I just read your post. I was going to go all in and buy the Gigabyte Z590I Vision D, since I was planning a mini-ITX build anyway. The manual says:

„Thunderbolt™ 4 Connector (USB Type-C® Port)
The connector supports standard DisplayPort and Thunderbolt™ video outputs. You can connect a standard DisplayPort/Thunderbolt™ monitor to this connector with an adapter. The Thunderbolt™ connector can daisy chain up to five Thunderbolt™ devices. Because of the limited I/O resources of the PC architecture, the number of Thunderbolt™ devices that can be used is dependent on the number of the PCI Express devices being installed. You can adjust the Thunderbolt™ settings under Settings\Thunderbolt Configuration in BIOS Setup. The maximum supported resolution is 5120 x 2880 @ 60 Hz with 24 bpp via single display output, but the actual resolutions supported are dependent on the monitor being used. Also, the connector is reversible and supports the USB 3.2 Gen 2 specification and is compatible with the USB 3.2 Gen 1 and USB 2.0 specifications. You can use this port for USB devices, too.
The Thunderbolt™ 4 connector supports a resolution of up to 4K @ 60 Hz when a graphics card is connected to the DisplayPort In port for the Thunderbolt™ 4 connector to output."

Is this a working solution? I miss a statement about how many lanes are utilized. Also, USB compatibility doesn't seem guaranteed when used with video passthrough. I can't remember if it needs to be 3.1 and/or 3.2. The adapter works on my MacBook Air 2020 with its integrated Thunderbolt / USB 3.1. Too many standards...
Sorry for such a long post.
Last but not least, thank you for the really nice offer to sell a card to me! Tomorrow I'll try to send you a DM.
 