24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Basically this card has an insanely fast RAMDAC that goes up to a whopping 655.xxMHz!! :eek: Unfortunately it has some really weird limits on horizontal pixel counts and a lower max bandwidth when doing interlaced. Really sad.
For fun I tried again to display 4K, in progressive, and I was able to squeeze 3840x2160p at 60Hz with this GPU, after a lot of trial and error adjusting the timings in CRU (see photo below). Not the best settings, but it worked, just for a test!
4K progressive-scan on a CRT! Nice to see, even if it's not resolvable.

I knew it was possible to do 2160i in the bandwidth of 1080p if you jumped through enough hoops -- but I haven't seen anybody do 2160p at the full 3840 width.

You must be using 200-300% DPI zoom at the moment!

Try testing VRR on your CRT!
This won't work with all CRTs, but it reportedly works with a few very-multisync tubes that have less aggressive "out-of-range" firmware cops. You should try the ToastyX force-VRR trick (give your output a FreeSync range) and see if it works on your CRT in a tight range. Try testing 800x600 at 55-65Hz with the windmill demo, and slowly expand your VRR range. Not all HDMI-to-VGA adaptors work, but if you've got an onboard DVI-I port with both analog/digital and a RAMDAC on a graphics card that supports VRR, bingo -- try it... This could be fantastic for capped 60fps games: the world's lowest latency "60Hz 60fps VSYNC ON lookalike" on a CRT. DVI-D already worked with VRR sometimes with the force-VRR trick ten years ago on some generic LCD panels, and the DVI-A signal may be along for the ride too.

However, since I hardly ever use VSync because of the latency, I will certainly look into the modern methods of reducing/eliminating it. Thanks!
There are many low-lag clones of "VSYNC ON":
  • Special K Latent Sync
  • RTSS Scanline Sync
  • Capped VRR (fixed-framerate), may not work with CRT
And latency-lowering spins on traditional VSYNC ON:
  • Low-Lag VSYNC HOWTO (use a microsecond-accurate capper like RTSS, set to 0.01fps below what you see at www.testufo.com/refreshrate)
  • VSYNC ON + NULL
You'll need one of the above tricks if you want that arcade / Sega Model 3 / NES Super Mario Bros. ultrasmooth 60fps on a CRT tube, at ultra-low latency, without tearing.

Whenever you *can* do VRR, it is the easiest low-lag VSYNC ON clone: you simply cap below max Hz, e.g. set your VRR range to 60.5Hz-63Hz and cap to 60fps for your emulators. Ideally you cap about 3% below max Hz (the old "3fps below" VRR rule of thumb, give or take).
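If you want to sanity-check the cap math, here is a tiny Python sketch of that "~3% below max Hz" rule of thumb (my own illustration, not from any Blur Busters tool; the 3% margin is just the figure quoted above):

def min_vrr_ceiling_hz(target_fps: float, margin: float = 0.03) -> float:
    """Smallest VRR max-Hz that keeps target_fps at least `margin` below the ceiling."""
    return target_fps / (1.0 - margin)

def fps_cap(vrr_max_hz: float, margin: float = 0.03) -> float:
    """Frame cap that stays `margin` below the VRR ceiling, so the cap (not VSYNC) limits you."""
    return vrr_max_hz * (1.0 - margin)

print(round(min_vrr_ceiling_hz(60), 1))   # ~61.9 Hz, so a 60.5-63 Hz range comfortably covers a 60 fps cap
print(round(fps_cap(63), 1))              # ~61.1 fps, the highest safe cap for a 63 Hz ceiling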

Capped VRR is the world's lowest lag "easy VSYNC ON" clone; do not ignore it simply because esporters complain it has more lag than VSYNC OFF. You should use VSYNC OFF for a lot of esports games. But not all games are esports games, and if you simply want to enjoy amazing TestUFO-smooth motion in games on your display, you can do it. If you ARE HUNTING for the world's lowest lag VSYNC ON clone, there ya go. Now, if your display doesn't support VRR, then Special K Latent Sync is becoming a quick favourite on Discord, as it seems to be slightly lower lag and slightly easier than RTSS Scanline Sync.

Scanline sync technologies, unfortunately, require enough GPU horsepower to achieve 60fps at only ~50% GPU load, for reliable raster-interrupt-style tearline steering between refresh cycles (tearingless VSYNC OFF), pushing the tearline above/below the edge of the screen.
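For the curious, here is a rough conceptual Python sketch of what tearline steering boils down to: render fast, then hold the VSYNC OFF present until just before blanking so the tear lands off-screen. This is my own illustration, not how RTSS or Special K actually hook the swapchain, and the vblank timestamp source is assumed to come from the OS/driver:

import time

REFRESH_HZ = 60.0
PERIOD = 1.0 / REFRESH_HZ      # ~16.7 ms per refresh cycle
STEER = 0.95                   # present ~95% into the scanout, near the bottom edge

def wait_for_tearline_slot(last_vblank_time: float) -> None:
    # Busy-wait until just before the next blanking interval, so the VSYNC OFF
    # tearline lands in the off-screen portion instead of mid-picture.
    target = last_vblank_time + STEER * PERIOD
    while time.perf_counter() < target:
        pass

# Per frame: render (must finish well under ~50% of PERIOD, hence the GPU headroom
# requirement), call wait_for_tearline_slot(vblank_timestamp), then Present() with VSYNC OFF.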

However, if you've already put the effort into procuring a CRT, it's worth pampering the visuals properly with a low-lag "VSYNC ON" lookalike technology.
 
4K progressive-scan on a CRT! Nice to see, even if it's not resolvable.

I knew it was possible to do 2160i in the bandwidth of 1080p if you jumped through enough hoops -- but I haven't seen anybody do 2160p at the full 3840 width.

You must be using 200-300% DPI zoom at the moment!

Yes, I usually don't like using UI scaling at all, but to make things readable for the photo, it was indeed set pretty high here!
It took me quite some time back then to tweak the timings until I got an acceptable result! I did hit the 655.35MHz pixel clock limit in CRU, and trying to use a DisplayID extension didn't help, as the resolution didn't show up in Windows past 655.35MHz anyway.
At those timings it is not possible to make the picture take the full width of the display, but it gets really close. Widening it further, or moving the picture to the left, ends up cropping into the display area, because the front porch isn't wide enough.

So as I said, it's not the best; it would have been better if a 700MHz pixel clock were attainable. But as always, we go as far as we can, way beyond what we thought was possible, and we still need just a little bit more to be satisfied! :ROFLMAO:
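For anyone following along, the pixel clock is just total pixels per refresh times the refresh rate, which shows how quickly 4K60 eats into a 655.35MHz budget. A quick Python sketch (the CTA-861 timing below is just a standard reference point I'm using for illustration; CRT-friendly blanking would be even larger):

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock = h_total x v_total x refresh rate (blanking included)."""
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(4400, 2250, 60))   # 594.0 MHz for the standard CTA-861 4K60 timing
# Widening the blanking enough for a CRT's porches quickly pushes past 655.35 MHz,
# which is why a ~700 MHz clock would have given some breathing room.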

Try testing VRR on your CRT!

Just realised this completely slipped my mind! Some time ago you already asked me to test it.

Sadly my 5700 XT died on me; I was going to test VRR on that card. I spent the afternoon flashing the VBIOS different ways, cleaning the card, and trying to figure out what was going on. I managed to get the card going just fine on Linux, but not on Windows at all (BSOD). Then it stopped posting completely...

So I reverted back to the 380X for testing VRR.

It took me a moment to figure out how to configure it, using CRU and adding the proper extension block set to a 55-65Hz range. I tried 1600x1200, 1280x720 and 800x600, using the AMD windmill demo.
But unfortunately it just glitches. The picture jumps a lot and then goes to a black screen, with the picture briefly showing up from time to time. It seems like the monitor doesn't appreciate the varying timings at all.

To make sure it wasn't one particular adapter, I tried 3 different HDMI-to-VGA adapters, with exactly the same results. I also tested on both my F520 and ElectronBlue IV, once again with the same behavior.

If I understood correctly, you told me it could work on the analog output of the GPU? I tried that as well, and the AMD drivers just told me VRR is not supported on that display when connected to DVI-I analog (VGA).

It's the first time I've tried to mess with VRR on unsupported displays, so if you think I did something wrong, or if you have other things in mind that I could try, just let me know. I will see what I can do; maybe with a bit of luck we can get something working. But so far, no success for me here.

---

Now, on another subject. I'm just mentioning this here, but understand that this is quite certainly DANGEROUS! I just learned that it is possible to alter the factory vertical and horizontal frequency limits on the Trinitrons supported by WinDAS...
Obviously, those limits are set at the factory for a very good reason, as going above them will put stress on the deflection circuit, the flyback, the yoke, maybe dynamic convergence, and everything else related to those.

With that out of the way, I did test it of course... because I couldn't possibly learn about something like that and not test it myself. But if you care about your CRT, just stay away from this, I would say!!

So, the F520 has a 29-138kHz horizontal range and a 47-171Hz vertical range, which can be seen in the WinDAS calibration data backup below.

Capture d’écran 2023-10-25 141033.png


After editing those and reuploading the data, the limits will change. So I did some very short tests here.

Capture d’écran 2023-10-25 141120.png


Knowing the behavior of my Iiyama 514 with unlimited vertical frequency, and also talking to someone who pushed the vertical limit on his FW900 quite high without issues SO FAR, I just set the vertical limit here to 251Hz.
For the horizontal I was very careful here, and just tried +5kHz to see how it would behave, matching the horizontal limit of my Iiyama 514 at 143kHz.

Note that I did not leave resolutions pushing those limits on screen for long! But everything performed normally here, so I pushed the horizontal a little further.

After 145kHz the OSD started showing some glitches, with the bottom portion of it jumping a bit on the screen. Everything else remained perfectly stable.
I kept pushing things a little more, and 150kHz seems to be a hard limit. After that the screen just stays black.
Between 145kHz and 150kHz, everything looks normal except for the OSD. So something clearly doesn't like such a high frequency.
That concluded the tests for the horizontal limits.

For the vertical, I did try 210Hz at 1600x1200 interlaced (143.328kHz), and 180Hz at 1920x1440 interlaced (144.631kHz), and no distortion whatsoever on the display. I did not go further.
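As a cross-check, horizontal frequency is just lines drawn per second; interlaced modes only draw half the frame per field. A small Python sketch (the v_total values are my guesses, reverse-engineered to roughly match the kHz figures quoted above):

def h_freq_khz(v_total_lines: int, field_rate_hz: float, interlaced: bool = True) -> float:
    """Horizontal scan rate = lines per second; interlaced draws half the frame per field."""
    frame_rate = field_rate_hz / 2.0 if interlaced else field_rate_hz
    return v_total_lines * frame_rate / 1000.0

print(h_freq_khz(1365, 210))   # ~143.3 kHz for 1600x1200 interlaced at 210 Hz
print(h_freq_khz(1607, 180))   # ~144.6 kHz for 1920x1440 interlaced at 180 Hz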

So here are some test results, if anyone is curious. Again, I don't know exactly how dangerous this is, but I assume running above spec can't possibly be good for the circuits!
So this was mostly for experimenting a bit, and getting some test results. That's all.

Getting a tiny little more performance out of the CRT clearly isn't worth potentially shortening its lifespan, but it was very interesting to find out that something like that is in fact possible. Please don't try this!
 
You're playing with fire pushing the monitor out of spec like that.
As unklevito used to say here: "You play, you pay".
 
You're playing with fire pushing the monitor out of spec like that.
As unklevito used to say here: "You play, you pay".
No worries, it's back to stock limits already. I didn't do more testing than that, and I did it with the monitor I have full spare parts for, just in case something went wrong. I have another F520 with a shorted tube, but I'd like to keep it original as long as possible.
Tests were done only for a few seconds, and then back within specs. Then I waited a bit before testing something else.
Just wanted to see how it behaves. As I said, not worth it to me really, just something curious I wanted to try out. I'm done with it now.
 
Funny thing about CRTs. They can literally explode in your face. My aunt Flo...yes I had an aunt Flo married to my Uncle Tom. Is that some God-like comedy or what? She had a floor TV in a wood cabinet. One night I came home and the TV's glass was literally all over the couch 8 feet away; the TV had literally exploded outward. Thank God no one was sitting there cause the glass would have impaled their eyes. No shit. Be careful as hell with them. They literally can fuck you up. That scene will always stay with me. Crazy as hell seeing the glass screen with a gaping hole in it and the couch literally covered in glass. The couch was 8FT AWAY!
 
Funny thing about CRTs. They can literally explode in your face. My aunt Flo...yes I had an aunt Flo married to my Uncle Tom. Is that some God-like comedy or what? She had a floor TV in a wood cabinet. One night I came home and the TV's glass was literally all over the couch 8 feet away; the TV had literally exploded outward. Thank God no one was sitting there cause the glass would have impaled their eyes. No shit. Be careful as hell with them. They literally can fuck you up. That scene will always stay with me. Crazy as hell seeing the glass screen with a gaping hole in it and the couch literally covered in glass. The couch was 8FT AWAY!
Sounds like my kind of way to die.
 
Funny thing about CRTs. They can literally explode in your face. My aunt Flo...yes I had an aunt Flo married to my Uncle Tom. Is that some God-like comedy or what? She had a floor TV in a wood cabinet. One night I came home and the TV's glass was literally all over the couch 8 feet away; the TV had literally exploded outward. Thank God no one was sitting there cause the glass would have impaled their eyes. No shit. Be careful as hell with them. They literally can fuck you up. That scene will always stay with me. Crazy as hell seeing the glass screen with a gaping hole in it and the couch literally covered in glass. The couch was 8FT AWAY!
Weird story. The front glass is very thick and if a CRT tube had to break, it would be in the neck area, and it wouldn't happen on its own.
 
This TV was from the '60s so who knows how it was made. This happened in 1996. And the TV was so old at that point it was probably made with standards that aren't around today.
 
Good to see this thread refusing to die. Godspeed CRT fans. Long live the king of monitors. Every now and then I get the urge to go out and find another GDM example. But alas... My life is just so different now.
The 👑 will return one day eventually in the form of either SED or LPD. Just a matter of time...
 
With the OLEDs being so dominant and at center of attention, my guesstimate is 2027 onwards.
I know. That was a joke. I'd been into this hobby for more than a decade when I rediscovered this display. Go through this thread and you'll see that for years OLED was promised as the next thing that would finally dethrone CRT, and that it would only be a "few more years". :)
 
With the OLEDs being so dominant and at center of attention, my guesstimate is 2027 onwards.
Sounds about right -- 1000 Hz OLED is about ~2027.

You can do a real-time 60-100Hz CRT electron beam simulator with that brute refresh rate, utilizing HDR nit surge headroom too. Complete temporal beam simulation in 1/1000sec timeslices.

Rolling scan, phosphor fade, low persistence, etc. All done in software (~16 digital refresh cycles per 1 analog refresh cycle).

So CRT tube simulation via 1000Hz fast-GtG DFP technologies such as OLED.
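To make the idea concrete, here is a toy Python/NumPy sketch of what one 60Hz "analog" refresh could look like when split across ~16 slices of a 1000Hz panel. This is my own illustration of the concept; the decay constant and nit boost are made-up placeholder values, not anything from a shipping implementation:

import numpy as np

OUT_HZ, CRT_HZ = 1000, 60
SUB_FRAMES = OUT_HZ // CRT_HZ          # ~16 digital refresh cycles per 1 analog refresh cycle
DECAY = 0.35                           # fraction of light a "phosphor" keeps after each 1 ms slice (assumed)
BOOST = float(SUB_FRAMES)              # idealized HDR nit surge so average brightness is preserved

def beam_simulate(frame: np.ndarray):
    """Yield SUB_FRAMES images approximating a rolling scan + phosphor fade of `frame` (H x W, 0..1)."""
    height = frame.shape[0]
    persistence = np.zeros_like(frame, dtype=float)
    for i in range(SUB_FRAMES):
        persistence *= DECAY                              # fade everything scanned earlier
        band = slice(i * height // SUB_FRAMES, (i + 1) * height // SUB_FRAMES)
        persistence[band] += frame[band] * BOOST          # freshly "scanned" rows, nit-surged
        yield persistence.copy()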

The ancestor to that is already arriving on the market -- even the Retrotink 4K is already injecting BFI in a box-in-middle video processor, and even using HDR nit-surge tricks to brighten classical BFI (pre-beam-simulation era).

In other words -- by the end of the decade, the temporal version of the spatial CRT filters will be arriving to piggyback off brute Hz to simulate retro displays even more accurately.
 
I don't think 1000hz will be necessary for OLEDs to achieve near-perfect motion clarity under most circumstances; 500hz will likely be the sweet spot, and anything more than that would be beyond diminishing returns. And by that point, BFI or ULMB implementations will be more than enough to close the gap for the CRT-tier clarity we've always been looking for.
The current 480hz+ offerings lack the pixel response to keep up with that speed, so we're still not seeing those refreshes blur-free.
 
I don't think 1000hz will be necessary for OLEDs to achieve near-perfect motion clarity under most circumstances; 500hz will likely be the sweet spot, and anything more than that would be beyond diminishing returns. And by that point, BFI or ULMB implementations will be more than enough to close the gap for the CRT-tier clarity we've always been looking for.
The current 480hz+ offerings lack the pixel response to keep up with that speed, so we're still not seeing those refreshes blur-free.
Why do you say that? Running the Viewsonic XG-2431 near 2ms at 60hz still has a hint of motion blur. It's not as crystal clear as a CRT at that refresh. No, Mark is absolutely right here. 1000hz is where we need to be to have CRT motion clarity, or the 1ms image persistence.

EDIT - if you mean that 2ms would be enough for people to be like "damn!" then I would agree. It's definitely worlds better than 8ms or even 4ms. But those who know (or still have CRT's) would be able to see the difference in no time.

Double EDIT - I have to run it to around 2ms of image persistence because while it could go down to 1ms at that refresh, it's way too dim to be reasonably useable.
 
1000hz is where we need to be to have CRT motion clarity, or the 1ms image persistence.
Not necessarily, because of Blur Busters Law: 1ms persistence = 1 pixel of motion blur per 1000 pixels/second of motion.
https://blurbusters.com/blur-buster...000hz-displays-with-blurfree-sample-and-hold/
Persistence by itself isn't enough to determine how many pixels of motion blurring there is; pixel MOTION is another important factor. For example:
1000 pixels/second at 16ms persistence = 16 pixels of motion blurring

So, if I double my refresh (cutting in half my persistence), while keeping the same pixel motion I get:
1000 pixels/second at 8ms persistence = 8 pixels of motion blurring

However, if I halve my pixel motion I can also halve pixel blur without changing my persistence:
500 pixels/second at 8ms persistence = 4 pixels of motion blurring

Therefore, to achieve a 1ms pixel blur I can either:
Have 125 pixels/second at 8ms persistence, or 1000 pixels/second at 1ms persistence. Or, like I suggested, 500 pixels/second at 2ms persistence, which would be a 500hz OLED. In other words, if the pixels move fast enough, even 1000hz/1ms persistence can blur, and if the pixels are slow enough, even a 1hz display is blur-free.
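The arithmetic above fits in one line of Python (this just restates the numbers from this post, nothing new):

def motion_blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """Blur width (px) = motion speed (px/s) x persistence (s), per Blur Busters Law."""
    return speed_px_per_sec * persistence_ms / 1000.0

print(motion_blur_px(1000, 16))  # 16 px (60 Hz sample-and-hold)
print(motion_blur_px(1000, 8))   # 8 px  (120 Hz sample-and-hold)
print(motion_blur_px(500, 8))    # 4 px  (half the speed, same persistence)
print(motion_blur_px(500, 2))    # 1 px  (500 Hz OLED at 500 px/s)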

In my experience, even fast paced shooters barely reach 1000 pixels/second. When playing CSGO on my 240hz LCD I have a really hard time perceiving blur compared to 120hz, which, in comparison, feels very blurry. Doubling that, with a panel that doesn't lag, would be effectively blur-free.
 
I don't think 1000hz will be necessary for OLEDs to achieve near-perfect motion clarity under most circumstances; 500hz will likely be the sweet spot, and anything more than that would be beyond diminishing returns. And by that point, BFI or ULMB implementations will be more than enough to close the gap for the CRT-tier clarity we've always been looking for.
The current 480hz+ offerings lack the pixel response to keep up with that speed, so we're still not seeing those refreshes blur-free.
Depends on your goal persistence. It's probably going to be a fantastic sweet spot for most users (e.g. average retro users)

For moderate and slower scrolling at low resolutions like Super Mario, 2ms persistence is perfectly fine. But for ultrafast scrolling a la Sonic the Hedgehog, you'll want a bit of extra clarity.

Now, if you are playing 1080p games on a CRT tube, then you'll already be able to tell apart 0.5ms MPRT (2000Hz equivalent) and 1.0ms MPRT (1000Hz equivalent).

Retina refresh rate at extreme cases (180-degree 16K VR) is not until the quintuple digits.

Everyone has different sensitivity -- For human visible benefits, it is all about geometrics (e.g. 4x differentials while GtG=0).

In my experience, even fast paced shooters barely reach 1000 pixels/second. When playing CSGO on my 240hz LCD I have a really hard time perceiving blur compared to 120hz, which, in comparison, feels very blurry. Doubling that, with a panel that doesn't lag, would be effectively blur-free.
120Hz vs 240Hz is throttled to roughly 1.2-1.5x difference due to the double problem of (A) VSYNC OFF; and (B) LCD GtG being nonzero.

VSYNC OFF, while great for low lag, isn't good for maximizing blur differentials between refresh rates. Retro games are VSYNC ON, unlike CS:GO! And often keyboard/gamepad-controlled so you've got perfect TestUFO-smooth panning motions in platformer games, unlike the more micro-jittery VSYNC OFF 800dpi mouselooks in CS:GO (Hint: If things seem to jitter/stutter more when you turn strobing ON, you've just revealed the weak link. Nonzero GtG hides this type of microjittering -- the strobe jitters can only be fully zeroed out with framerate=Hz VSYNC ON, for see-for-yourself low-MPRT motion demos such as www.testufo.com/map ...)

120Hz vs 240Hz is massively more visible on my OLEDs when using VSYNC ON. Much bigger compared to CS:GO 120-vs-240 on LCD. 120-vs-480 is much more noticeable, as >2x geometrics + GtG=0 + full "framerate = Hz" sync is where the human-visible benefits are in the refresh rate race to retina refresh rates.

Blur Busters Law is only a baseline -- it worsens via extra blur from nonzero GtG -- and it also worsens from jitter vibrating into blur (the stutter-to-blur continuum, see TestUFO; even 70 microstutters per second at 360Hz can add extra motion blur, like a fast-vibrating music string (blurry) versus a slow-vibrating music string (vibratey)).

The whac-a-mole of weak links in the refresh rate race goes far beyond the typical CS:GO + VSYNC OFF + 800dpi mouse + LCD 240Hz esports setup. That setup is great for low lag and control, but it isn't representative of Blur Busters Law, since the mouselooks aren't TestUFO-smooth and there are additive blurs on top.

Some of us are more sensitive to stutter. Tearing. Color/HDR. Blacks. Motion blur. I most certainly can tell apart 0.5ms MPRT and 1.0ms MPRT with my eyes in VSYNC-ON'd framerate=Hz motion.
___

However, this does not preclude the fact that ~2ms MPRT will be a great sweet spot for retrogaming, with their much-lower resolutions involved (e.g. 256x224 pixels etc).
 
Even though retrogaming will never be optimal on a flat panel, at least now we have options to mitigate persistence at 60hz. Besides BFI, retroscalers like the RetroTink 4K and OSSC Pro will offer support for hardware-level BFI and even 120hz modes. Far from CRT-optimal, but good enough for the convenience and wonders of playing on a large panel.
For everything else, I believe a 480hz OLED with proper BFI, ULMB 2.0 or DyAc+ would take motion quality to unprecedented levels, because we're talking about crystal clear and crosstalk-free motion.
 
I said in my previous posts that the limit is 191 MHz not 270 MHz.
I opened up my Gembird A-HDMI-VGA-04 and guess what? :)
Not even a lowly CH7101. The chipset is CS5210.
What a sham of a company!

Cheap adapters from Amazon (HDMI to VGA)
Benfei and Rankie HDMI to VGA: we don't know the model code, but the chipset appears to be the Algoltek AG6200. They should handle up to 330 MHz, but it is necessary to set the output to YCbCr 444, and this can cause problems for some users because with some drivers that option is not present.
They seem to go up to 177-180 MHz in classic RGB mode; above that, YCbCr is needed to have a good image.

I have an advanced question about the Unitek adapter Y-6333 (alternatively the Y-6355 or Y-5118E).
Will it handle 1280x960 at 100 Hz (Dell Trinitron P1130)?
That resolution isn't mentioned (nor are others like 1366x768), even though it is 4:3 (there is a gap between 1024x768 and 1400x1050 on the supported list, although other ratios like 1280x1024 or 1280x800 are there).
From what I have read on the internet, resolution handling may vary per adapter: some can work with anything in the supported range, others only with the ones fixed by the manufacturer.
Refresh rate can be even more problematic, with some adapters allowing only 60 Hz, which flickers and causes eye strain, while monitors made after 2000 can support 75/85/100/120 Hz as standard.
I also did not find any info about the chipset used, but I know that alone is not enough to guarantee all features, because it also depends on the whole adapter design.
When would the optional USB power be needed, and is it required for 1280x960 at 100 Hz?
Other open questions: what are the integrated equalizer, 10-bit DAC, "link on" and "valid DE" features, and will such an adapter work with all HDMI or DP sources (I guess it should)?
 
Sounds about right -- 1000 Hz OLED is about ~2027.
By the time it happens, a new panel technology may have come along and captured the center of attention, or OLED burn-in and degradation might be reduced to TN/IPS/VA/FALD levels and become the end-game until the 2030s.
You can do a real-time 60-100Hz CRT electron beam simulator with that brute refresh rate, utilizing HDR nit surge headroom too. Complete temporal beam simulation in 1/1000sec timeslices.

Rolling scan, phosphor fade, low persistence, etc. All done in software (~16 digital refresh cycles per 1 analog refresh cycle).

So CRT tube simulation via 1000Hz fast-GtG DFP technologies such as OLED. The ancestor to that is already arriving on the market -- even the Retrotink 4K is already injecting BFI in a box-in-middle video processor, and even using HDR nit-surge tricks to brighten classical BFI (pre-beam-simulation era). In other words -- by the end of the decade, the temporal version of the spatial CRT filters will be arriving to piggyback off brute Hz to simulate retro displays even more accurately.
Aside from ridiculously high refresh rates, I would love to see proper brightness (nits), HDR & VRR implementations on the CRT's successor, whether it's SED or LPD.

With the well-resourced research that SD, LGD, JOLED, TCL, BOE and AUO are rapidly putting in, I'm optimistic that 2027+ OLEDs without BFI should be able to surpass the FW900 and all professional/productivity CRTs in motion clarity and smoothness, or at worst come very close.
 
You are one of the last CRT-only Mohicans my friend :dead:

Then I am also one of the last, because there are still a few things about LCDs that I can't stand (no, it is not nostalgia).

First is the natural "motion blur", known as "ghosting", that exists even in "gaming" "1 ms" monitors.
I know most people do not notice it, but I believe it is because they are used to it and think it is normal.
You do not need super eyes; just use a good CRT for years in dynamic games like FPS or even strategy games (where you scroll the whole map), so your eyes get used to smooth animation.
Then you run a "gaming" "1 ms" LCD and see pixels from previous frames when, for example, you turn around in an open-space FPS game: ghost pixels of darker mountains moving across a brighter sky from left to right (or the other direction), while this never happens on a CRT.
There were also online tests with bright grey vertical blocks, set separately in a line on a black background and moving slowly, which the LCD shows as one long grey rectangle (grey pixel "ghosts" over the black background) while the CRT keeps the lines sharp and separated.
If you want more, go to Wikipedia or another website with a smallish black font on a white background, click and hold the scroll arrow in the bottom right corner to start moving the text at a very slow, continuous, stable pace, and lock your eyes onto the font: on an LCD it blurs into the background, while on a CRT it is as sharp as when it was not moving.
Yes, I have seen 144 Hz monitors, and the problem is smaller at that higher refresh rate, but it still exists.
A CRT does not do it even at 60 Hz, and we are comparing the '90s to 2023, so what the hell?
I have seen OLED and yes, the smoothness is good enough for me, but there are still no monitors on the market, only big TVs.

The second thing is the anti-glare coating that spreads white grain over darker areas (not to mention that the colors are "grayer"), or the other extreme, glossy glass that reflects all the light right into your eyes.
Today we also have semi-glossy variants (finally!), so the effect may vary per model.

The third thing is that only the native resolution looks good on an LCD, so if something forces you to use a different one (like very old games in full-screen mode), the image is blurred, which is especially noticeable in areas with a large color difference over a small number of pixels, like small dark fonts on bright backgrounds.

Have you heard about SED (Surface-conduction Electron-emitter Display)?
If not, check out Wikipedia for a start.
Too bad it was never widely produced, being limited mainly to military and medical equipment.
 
The ironic thing here is that the only tech that can actually do a full quick-strobe with no crosstalk is OLED. And from what I hear, OLED flicker is pretty pronounced because of the instant response time. CRT had some phosphor decay, but the initial drop-off was sudden enough not to cause motion blur, while the decay kept it from being so flickery as to cause eye discomfort.

It's like the perfect gaming display has already come and gone.
 
The ironic thing here is that the only tech that can actually do a full quick-strobe with no crosstalk is OLED. And from what I hear, OLED flicker is pretty pronounced because of the instant response time. CRT had some phosphor decay, but the initial drop-off was sudden enough not to cause motion blur, while the decay kept it from being so flickery as to cause eye discomfort.

It's like the perfect gaming display has already come and gone.
That phosphor decay slightly bugs me when you are in a dark area with only a few small light sources, but it's an elegant decay rate that softly and quickly goes away. I never see it otherwise.

Ok, so going by all the comments, the magical OLED screens with 1000Hz refresh and BFI are coming in a few years, and will finally be the replacement for my FW900?
 
That phosphor decay slightly bugs me when you are in a dark area with only a few small light sources, but it's an elegant decay rate that softly and quickly goes away. I never see it otherwise.

Ok, so going by all the comments, the magical OLED screens with 1000Hz refresh and BFI are coming in a few years, and will finally be the replacement for my FW900?
Only time will tell.

Regarding crt decay and dark scenes. Yep - no display is perfect. Such is life. :)
 
I have an advanced question about the Unitek adapter Y-6333 (alternatively the Y-6355 or Y-5118E).

I could attempt to answer the 50 questions you asked here...

but you're already here. If you just skim through the last year or so of posts in this thread, you'll answer a lot of your questions.

The one thing I'll say, to point you in the right direction, is that horizontal frequency is really the only number that matters for limitations of a CRT, and pixel clock is the only number that matters for DACs.

Now, of course there are caveats to both of those statements (max vertical Hz, YCbCr vs RGB full vs RGB limited, etc.), so let's not pick that apart. But for day-to-day gaming on your CRT, making custom resolutions for your games, those are the two numbers you'll be dealing with most often.
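If you want to turn that rule of thumb into a quick check, here is a Python sketch (the blanking numbers and limits below are purely illustrative assumptions, not the P1130's or any specific adapter's exact specs):

def mode_fits(h_total: int, v_total: int, refresh_hz: float,
              crt_max_khz: float, dac_max_mhz: float) -> bool:
    """A mode is usable if the CRT can scan it (h-freq) and the DAC can clock it (pixel clock)."""
    h_freq_khz = v_total * refresh_hz / 1000.0
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_freq_khz <= crt_max_khz and pixel_clock_mhz <= dac_max_mhz

# 1280x960 @ 100 Hz with roughly CVT-like blanking, against illustrative limits:
print(mode_fits(1712, 1040, 100, crt_max_khz=121, dac_max_mhz=200))   # True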
 
I could attempt to answer the 50 questions you asked here...

but you're already here. If you just skim through the last year or so of posts in this thread, you'll answer a lot of your questions.

Yes, I read some posts here, but could not find any info about Unitek adapters like the Y-6333, Y-6355 or Y-5118E (I also used Google site search and general search to help with it).
I wanted to buy a Gembird at first, but found out that it may contain a random chip, so it is a lottery, and I guess the AG6200 or CH7101 will not work with 1280x960 at 100 Hz?

The one thing I'll say, to point you in the right direction, is that horizontal frequency is really the only number that matters for limitations of a CRT, and pixel clock is the only number that matters for DACs.

Now, of course there are caveats to both of those statements (max vertical Hz, YCbCr vs RGB full vs RGB limited, etc.), so let's not pick that apart. But for day-to-day gaming on your CRT, making custom resolutions for your games, those are the two numbers you'll be dealing with most often.

I have already played many current and older games at 1280x960 at 100 Hz on the Dell Trinitron P1130, but I use DVI-I to VGA, so a simple analog-to-analog "conversion".
Now I want to change graphics card, and all newer products offer only digital ports like HDMI, DisplayPort, or DVI-D.
 
I have an advanced question about the Unitek adapter Y-6333 (alternatively the Y-6355 or Y-5118E).
Will it handle 1280x960 at 100 Hz (Dell Trinitron P1130)?
That resolution isn't mentioned (nor are others like 1366x768), even though it is 4:3 (there is a gap between 1024x768 and 1400x1050 on the supported list, although other ratios like 1280x1024 or 1280x800 are there).
From what I have read on the internet, resolution handling may vary per adapter: some can work with anything in the supported range, others only with the ones fixed by the manufacturer.
Refresh rate can be even more problematic, with some adapters allowing only 60 Hz, which flickers and causes eye strain, while monitors made after 2000 can support 75/85/100/120 Hz as standard.
I also did not find any info about the chipset used, but I know that alone is not enough to guarantee all features, because it also depends on the whole adapter design.
When would the optional USB power be needed, and is it required for 1280x960 at 100 Hz?
Other open questions: what are the integrated equalizer, 10-bit DAC, "link on" and "valid DE" features, and will such an adapter work with all HDMI or DP sources (I guess it should)?
1280x960 at 100 Hz requires a pixel clock of 178 MHz with CVT and 179 MHz with GTF timings; I think that even the worst adapter can handle that.
There aren't resolutions fixed by the manufacturer or that vary per adapter: the OS, through the video drivers, your monitor's EDID, and your adapter's input bandwidth, shows you the applicable resolutions.
But you can change that using a custom EDID override with software like CRU, or by forcing resolutions through the graphics control panels.
You can create whatever you want, as long as you respect the limitations of your monitor and adapter.
You can also remove unwanted resolutions shown by the OS by modifying some video driver registry keys (DALNonStandardModes with AMD and NV_Modes with NVIDIA).
 
1280x960 at 100 Hz requires a pixel clock of 178 MHz with CVT and 179 MHz with GTF timings; I think that even the worst adapter can handle that.
There aren't resolutions fixed by the manufacturer or that vary per adapter: the OS, through the video drivers, your monitor's EDID, and your adapter's input bandwidth, shows you the applicable resolutions.
But you can change that using a custom EDID override with software like CRU, or by forcing resolutions through the graphics control panels.
You can create whatever you want, as long as you respect the limitations of your monitor and adapter.
You can also remove unwanted resolutions shown by the OS by modifying some video driver registry keys (DALNonStandardModes with AMD and NV_Modes with NVIDIA).

So I believe I will give it a try with the Unitek or even the Gembird.
I hope that RGB will not be limited, because the current analog-to-analog setup works great with darker shades, so a noticeable color loss in some darker games would be fatal (it reminds me of highly compressed movies, or some 3dfx hardware rendering where you could not see things in darker places while having contrast jumps in lit areas).
I already use CRU, because the original drivers did not want to work correctly on Windows 10.
Unitek officially responded to my email, and they claim that the adapter (Y-6333, Y-6355, Y-5118E) will not work with 1280x960 (while it can do the popular 640x480 or 1920x1080), and also that 60 Hz is the maximum refresh rate at any supported resolution.
Two big specialized computer shop chains also responded that no adapter (from various manufacturers) in their offering can do it.
I guess they do not know much about it.
 
Unitek officially responded to my email, and they claim that the adapter (Y-6333, Y-6355, Y-5118E) will not work with 1280x960 (while it can do the popular 640x480 or 1920x1080), and also that 60 Hz is the maximum refresh rate at any supported resolution.

Guys, let's not be completely naive here. These are generic, zero-liability, zero-quality-assurance Chinese companies. They don't even know what their products are capable of. They just slap some chips in there that seem to do the job they advertise them for, and as long as they don't catch on fire, they go out for sale.

And seriously, NOBODY, besides the few dozen people in this thread and a small fraction of people on r/crtgaming, gives a shit about PC CRTs. So "Unitek", whatever that is, didn't give 2 seconds of thought to CRT use when they made this adapter. They were strictly thinking of business users who are A) using old LCDs with VGA and B) using old LCD projectors. Hence the answer you got of "1080p 60hz".

It is beyond pointless to contact them to ask about specs. You might get an answer about the chipset, if you email the right person.
 
This is probably a long shot, but I have confirmed that my FW900 has a bad flyback transformer and so am looking for ways to source this part.

Does anyone have any ideas?

Maybe someone reading this thread has a broken FW900 and is willing to disassemble and sell me their flyback? If so, please PM me and we can negotiate a fair price.

Don't know if Unkle Vito still posts on this forum, but anyone know if he can still get this part?

Any help would really be appreciated.

The tech who confirmed the flyback was the problem directed me to a recycling center, but there's no way I'm going to give up on it if there's any way to salvage it, especially given how rare and desirable these monitors are.

The tube has very low hours and is otherwise in great shape. It was fully calibrated and factory reset only a couple of years ago by Unkle Vito.
 
Does anyone have any ideas?

r/crtgaming and the associated Discord. Let people know you're looking for a beat-up, as-is FW900, or at least the flyback from one. I think there are a few other Sony monitors that used the same flyback, but I definitely don't remember which ones off the top of my head.
 
r/crtgaming and the associated Discord. Let people know you're looking for a beat-up, as-is FW900, or at least the flyback from one. I think there are a few other Sony monitors that used the same flyback, but I definitely don't remember which ones off the top of my head.

I read that the P1110 has a compatible flyback.

I have the opportunity to acquire a CPD-G520 for a reasonable price.

Does anyone know if I could use the flyback from this unit in the FW900?
 
I read that the P1110 has a compatible flyback.

I have the opportunity to acquire a CPD-G520 for a reasonable price.

Does anyone know if I could use the flyback from this unit in the FW900?
This is a big NO. The circuit schematic inside the flyback is different between the service manual of the G520 and that of the FW900; they're not pin-compatible.

Also, I think someone reported using another flyback with success in the past, and the patch-up job was holding, but this is a critical component that shouldn't be replaced by anything other than the original part, for safety reasons. There's absolutely no public information about the technical specs of these flybacks.
 
I read that the P1110 has a compatible flyback.

I have the opportunity to acquire a CPD-G520 for a reasonable price.

Does anyone know if I could use the flyback from this unit in the FW900?
Grab the G520 if it's cheap enough and use it while you search for the FW900 part?
This is a big NO. The circuit schematic inside the flyback is different between the service manual of the G520 and that of the FW900; they're not pin-compatible.

Also, I think someone reported using another flyback with success in the past, and the patch-up job was holding, but this is a critical component that shouldn't be replaced by anything other than the original part, for safety reasons. There's absolutely no public information about the technical specs of these flybacks.
Thanks. I was thinking about offering up my dusty P1110 but I wasn't sure if it actually shared the part.
 
Grab the G520 if it's cheap enough and use it while you search for the FW900 part?

Thanks. I was thinking about offering up my dusty P1110 but I wasn't sure if it actually shared the part.

If you have a P1110 that you are willing to part with, I am very interested.

To clarify, the P1110 and FW900 have very similar flybacks, but the G520's is different.

Obviously, ideally I'd replace the flyback with the exact same part, but realistically I know the odds of finding it are pretty low.

I know there was a member of this forum who swapped the flyback between these two units (P1110 and FW900), and it was working really well.

If that's my only option, I'd rather give it a try than give up on the unit.
 
Chief Blur Buster Do you know any 16:10 Trinitron Professional CRT better than FW900 that comes with DVI-I/DVI-D?
Sadly, no.

There are some Sonys with HDMI input (e.g. the Sony KD-30XS955), none of them better than the 900 series AFAIK as they are lower dot pitch. There's also the widescreen Sony PVM/BVM broadcast monitors, though they were generally designed to handle TV broadcasts rather than computer graphics.

Generally, it can be vastly superior to use a modern HDMI to RGB/component adaptor -- much less laggy than early builtin HDMI converters, which didn't do a really good job back in the day (Built-in HDMI is very laggy in 30XS955, for example).

I do not have direct experience with the Sony widescreen broadcast monitors, which are extremely rare now -- some of the models (especially the bigger widescreen ones) are now even more rare than a 900 series monitor.

But even with those, I'd still generally use an external HDMI-to-RGB/Component adaptor, due to generally superior performance of modern converters relative to 1990s/2000s converter electronics.
 
Great news everyone!

After more than 5 years of searching I have finally found the perfect RAMDAC converter, the LK7112 DAC that you can buy from Mert over here:

https://www.ebay.com/itm/3861892264...HPx5Ab7Q3O&var=&widget_ver=artemis&media=COPY

My sample with no cooling (naked chip) can do 398MHz; with the small radiator Mert put in his custom case, 411MHz; but with these heat sinks, a whopping 437MHz (see attached pics for proof).

The heat sinks: https://ro.mouser.com/ProductDetail/Wakefield-Vette/SKV505014-AL?qs=u16ybLDytRbIt3tRqNBa0w==

The thermal pads:
https://www.arctic.de/en/Thermal-Pad-10-x-10-x-3.5-mm/T-Ti39-010-010-35-04

This converter is perfect. It provides exactly the same experience I get with my trusty old GTX 980Ti via its analog output. No screen sides swapping, no loss of sync, no input lag, no artefacts! 😀

Do note that when the chip gets hot or you reach its voltage limit, you'll get flickering green pixels all over the screen so proper cooling can make the difference between 390MHz and 440MHz.

My sample can output a signal at 445MHz max but with green pixelated artefacts even though it is at room temperature. Above that frequency there is no sync and the monitor shuts off.

I've assembled a small tutorial on how to get the most out of your FW900 with the RTX 4000 series and the latest drivers on Windows 10, and I've attached a zip file with all the necessary files.

1. Extract the latest geforce driver and edit nv_dispig.inf like this in order to see ONLY the resolutions you set into CRU:

[nv_commonDisplayModes_addreg]
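; per the note above: with this override in place, only the resolutions you define in CRU will be exposed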
HKR,, NV_Modes, %REG_MULTI_SZ%, "{*}S 3840x2160x8,16,32,64=FFFF"

2. Install the driver via its Setup (you may need to disable the windows driver signature enforcement via F7 after an advanced restart in Win10 prior to this)

3. Run Nvidia Pixel Clock Patcher

4. Install the attached FW900 inf driver file via Device Manager -> Monitors.

This driver has been created with CRU and contains the vital HDMI datablocks required for the converter to use all of its bandwidth (feel free to load it up in CRU via Import to check it out). You'll see there that the highest pixel clock timing is 2560x1600 at 73Hz, which is ~430MHz. The best I could do with the GTX 980Ti at that resolution was 68Hz (400MHz), so I get an awesome 5Hz upgrade at the resolution I keep my desktop at all times.

5. Restart your PC. You should now see ONLY the resolutions set-up in CRU.

6. Use RefreshLock to lock your resolutions to the highest refresh rate (very useful in games). You can do it globally or per resolution.

7. All current geforce drivers are DCH so you'll notice the Control Panel is missing and you get an annoying Windows Store notification. Copy the Control Panel Client wherever you want and execute the included registry file.

8. Enjoy!

After 7 years of faithful service I can finally retire my GTX 980Ti with absolutely no compromise and even an increased refresh rate at my most important resolution!

Hello RTX 4000 series! 😀
First off, you're an absolute champ, thank you for all of this precious information brother.

Question: for the computer you'll be using this adapter with, what are the CPU and motherboard?
 
If you have a P1110 that you are willing to part with, I am very interested.

To clarify, the P1110 and FW900 have very similar flybacks, but the G520's is different.

Obviously, ideally I'd replace the flyback with the exact same part, but realistically I know the odds of finding it are pretty low.

I know there was a member of this forum who swapped the flyback between these two units (P1110 and FW900), and it was working really well.

If that's my only option, I'd rather give it a try than give up on the unit.
I'd be willing to trade it for another monitor or sell it if I found a decent replacement for cheap enough. It's my only 20" backup.
 