24" Widescreen CRT (FW900) From Ebay arrived,Comments.

So much misinformation and poor setup in that video that I blocked it out of my memory already.

Dude's head was in the right place, but he just didn't understand how to set up things like the Dreamcast properly. Like, why would you run it in 480i when you have two TVs that support 480p?
 
Let them poopoo it. Maybe the prices of CRTs will come down to sane levels.
 
Has anyone found an HDMI to VGA adapter that will actually properly pass through SMPTE 720p / 1080p at 0-255 full-range RGB from a PS4, Switch, etc., rather than the source device needing to be set to limited range? As far as I know this is only an issue with resolutions within the SMPTE spec; VESA resolutions don't have it. This is also what causes black crush most of the time, as many of you know.
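
For context, the conversion a DAC applies when it treats a signal as limited range is simple; here's a minimal sketch (my own illustration, not from any adapter's datasheet) of the standard expansion, and of why a full-range signal mislabeled as limited gets its blacks crushed:

```python
def limited_to_full(level: int) -> int:
    """Expand a limited-range (16-235) video level to full-range (0-255)."""
    level = max(16, min(235, level))        # clip to studio swing first
    return round((level - 16) * 255 / 219)  # rescale 16..235 -> 0..255

# If a full-range 0-255 source is wrongly treated as limited, every shadow
# level below code 16 collapses to 0: that's the "black crush".
for level in (0, 8, 16, 32, 235, 255):
    print(level, "->", limited_to_full(level))
```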
 
HD Fury adapters use RGB full range. Though some of them, like the HD Fury 2, can do either.
 
They don't actually have them in stock, according to the HDFury Discord.

The Gamma X feature is described here (it's also sold as a standalone VGA->VGA unit):
http://www.curtpalme.com/GammaX.shtm
So it has nothing to do with pixel clock. It was too good to be true...
Scrolling through the forum I see now that it has a pixel clock limit of around 280MHz...
Another overpriced piece of crap...
 
Found this forum linked in another CRT discussion on Reddit, and I've been doing a lot of heavy reading through here the last couple of days. I have a Sony GDM-F520 I ordered through a small shop in Akihabara, Japan back in 2002; they delivered it to my residence when I lived there. Around 2007 I moved back to the US and the monitor went with me. In 2008 I purchased a 24" Dell LCD monitor, put the F520 in plastic, and set it in the closet. The 24" LCD was...different...but I didn't understand why at the time. In 2010 I moved back to Japan, and the monitor stayed wrapped in plastic, in a box, in storage in San Diego. In 2019 I was in California on business, so I rented a truck, got our stuff from storage, and drove it across the country to the other coast where I live now. The monitor was unloaded and sat in the garage unused until a few weeks ago.

My now 15-year-old son decided to dig through some of the old storage boxes and found an "ancient" (his words, not mine) SFF Pentium 4 with an ASUS ATI 9800 XT that I used to take on deployments and to LAN parties; I had left it in storage too. He plugged it in and it powered on, so we dug out the old F520 and hooked it up, and it also came to life with no issues. He loaded up Half-Life 2 and it blew me away just how much smoother and crisper it looked compared to my 32" 1440P, 144Hz LCD that I currently own. It actually bugged me how much better it looked. Then Linus released that video on the FW900, which answered some of my questions, but I had more. That's when I went searching and found this thread.

I just wanted to share my story and thank everyone who has contributed over the years to this thread. It's truly amazing to me that in 20+ years we have barely caught up to the same quality (debatable), but have yet to match the motion and speeds available with CRTs. Though OLED is very close from my understanding.
I've learned a lot over the last week about things I never would have known if not for this thread. I'm an electronics tech, so even some of the deeper, more in-depth stuff wasn't too Greek to me. I'm getting ready to order an adapter to hook the F520 back up to my main rig and put it back to use. Thanks to those of you doing the testing for the adapters; it looks like the main winners are the Delock and Sunix adapters, so I'll look into those unless I missed something. I'll have to look into calibration and setup to make sure it's done right.
 
blew me away just how much smoother and crisper it looked compared to my 32" 1440P, 144Hz LCD that I currently own. It actually bugged me how much better it looked.
Yuuup. I think in the mid-to-late 2000s PC gamers underwent some sort of collective psychosis and irrationally ditched all their CRTs. Glad you at least kept yours stashed away. Thanks for stopping by and sharing.

Also make sure you look for that IcyBox version of the Sunix. I've seen it sold for cheaper than the Delock recently.

Two things I like to say to people getting back into CRTs:

Download CRU and don't be afraid to use the heck out of custom resolutions. My strategy is to basically make a custom resolution/refresh rate for every new game I get, because I can always adjust the resolution/refresh rate and not have to compromise on things like lighting, level of detail, shadows, etc. And you have one of the highest frequency models in existence, so you can make some really wild resolutions like 1920x1440 @ 90hz, 1024x768 @ 160hz, and 2880x2160 @ 60hz (see the quick timing sketch after these two tips).

And always use some form of vsync, as it's necessary to get the motion clarity advantage from your CRT. Recent third-party variants with no lag are Latent Sync in Special K (with delay bias to reduce input lag if you have GPU headroom) and Scanline Sync from RTSS. Additionally, some games perform best when using a frame cap + old-fashioned vsync: https://blurbusters.com/howto-low-lag-vsync-on/
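
On the custom resolutions point: if you want to sanity-check a mode before punching it into CRU, the timing arithmetic is simple. A rough sketch, using my own ballpark blanking fractions rather than real GTF/CVT math, for the wild modes mentioned above:

```python
def mode_stats(h_active, v_active, refresh_hz,
               h_blank_frac=0.20, v_blank_frac=0.05):
    """Ballpark the horizontal scan rate (kHz) and pixel clock (MHz) of a mode.

    Assumes ~20% horizontal and ~5% vertical blanking overhead; the exact
    totals CRU/GTF/CVT generate will differ somewhat.
    """
    h_total = h_active * (1 + h_blank_frac)   # pixels per line incl. blanking
    v_total = v_active * (1 + v_blank_frac)   # lines per frame incl. blanking
    h_freq_khz = refresh_hz * v_total / 1000
    pixel_clock_mhz = refresh_hz * v_total * h_total / 1e6
    return h_freq_khz, pixel_clock_mhz

# Compare against your tube's horizontal scan limit and your DAC's pixel clock:
for w, h, hz in ((1920, 1440, 90), (1024, 768, 160), (2880, 2160, 60)):
    khz, mhz = mode_stats(w, h, hz)
    print(f"{w}x{h} @ {hz} Hz: ~{khz:.0f} kHz scan, ~{mhz:.0f} MHz pixel clock")
```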
 
Anyone tried anything with the IT6692FN chipset? Noticed a pic of the PCB in the reviews for this one (I uploaded it as the attached image in this post), and it had a chipset I've never heard of before!

https://www.newegg.com/bytecc-hm201/p/N82E16812270601?Description=hdmi to vga&cm_re=hdmi_to vga-_-12-270-601-_-Product

Here is the datasheet
https://datasheetspdf.com/pdf-file/1156838/ITETECH/IT6692FN/1
Should top out around 225MHz according to the datasheet, but who knows...

---------------

Also, I emailed Delock about this one
https://www.delock.com/produkt/64172/merkmale.html?f=s
and they told me the chipset is the EP94Z3.
DACs with the other Explore chipset, the EP94Z1E, have been found to go up to 350+ MHz (one of the Tripp Lite units, the P131-06N-2VA-U).
Perhaps the chipset in this newer Delock unit is even better? The datasheet for the EP94Z3 also says it tops out at 225 MHz. I can't find one for the Z1E, but I assume it also says 225, in which case that figure clearly isn't a hard limit...

---------------

Lastly, I have read somewhere that this unit
https://www.amazon.com/gp/product/B00EIS3TEO/ref=ewc_pr_img_1?smid=A2IX3RNQE846HA&psc=1
actually processes full-range 0-255 RGB properly, unlike many others. I will probably check it out sometime...
 

[Attachment: vga.jpg]
Really started digging into the display tech that's out there, looking for something other than OLED that might come close to CRT performance. The only thing I could find was LPDs (laser phosphor displays), and there hasn't really been much innovation there since they started in 2010. The tech is probably the closest thing to CRT available today, but sadly it seems to be strictly for commercial use. Unless they innovate and can scale it down significantly, we will just have to wait for OLED to get BFI to appropriate levels.
 
Hi azhighwayz, nice to read your story. Yes, this thread has been very helpful to us CRT monitor enthusiasts, and you're not wrong to mention the Sunix as one of the best adapters for using our CRT monitors with cards that have no analog output. Thanks to the contributions and research from people here, I went for the Sunix DPU3000 a while ago and I am very happy with it. It can reach resolution/refresh combos that weren't possible even from video cards with native analog output and a 400MHz max pixel clock, like 2560x1440 @ 80hz; the Sunix can go up to 539MHz. This adapter seems to be discontinued, but there appear to be other variants with the same chipset, as reported by users like Derupter, Enhanced Interrogator, etc.

The Sunix is a bit tricky with some resolution/refresh combos, and it seems to behave differently on different video card brands (AMD vs. NVIDIA), both from my own testing and from other reports. You seem to have done your research, so you may already know it has some weird issues at certain resolution/refresh rate combos. In my case I managed to solve most of them by creating the custom resolution/refresh combos in the NVIDIA control panel with GTF standard timings, not with the CRU utility (in my case CRU gave me more issues, though that doesn't seem to be true for everyone). Thanks to the flexibility of CRT monitors, simply creating combos slightly different from the problematic ones seems to solve the issues, as others have also reported.

As you have done, the best we can do in the hard quest to finally find a real CRT monitor successor is to research deeply across the internet and be very careful about what "experts" recommend or suggest on this. From my own research, I consider normal forum user testimonials and opinions, and reviews from some low-profile, low-popularity YouTubers, to be definitely more trustworthy than the misinforming, poorly researched material about (or against) CRTs from highly popular YouTubers and sites, who are feeding their business or trying to make their advertised monitors look better than CRTs than they really are, rather than actually educating the user. In fact, I have been deeply disappointed by how much misinformation about CRT motion clarity, and how much overrating of modern monitor motion quality, I have read and watched coming from "experts".

In fact, there are plenty of modern monitors and TVs that can create CRT-like motion quality, but they come with many flaws: badly dimming the screen, introducing other motion artifacts like crosstalk, worse flicker than CRTs at the same refresh rates (especially lower ones), worse input lag, or requiring very high refresh rates and expensive GPUs capable of maintaining constant high fps to match the refresh rate and create the CRT motion clarity. With so many trade-offs, their motion clarity capabilities end up being rather hard to enjoy, and they rarely provide the motion experience CRT monitors can.

Take Linus Tech Tips' FW900 video: poor research (hilarious when they introduce themselves as "experts in consumer technology and video production which aims to inform and educate people"). They only mention 80hz vertical refresh, when even a video card with a 400MHz analog pixel clock output can drive it at refresh rates up to 106hz at resolutions like 1920x1080 (323MHz pixel clock), or 96hz at 1920x1200, which many consider the FW900's prime resolution (also 323MHz pixel clock), and it can even max out at 160hz vertical refresh at resolutions that fit within its horizontal scan limit (around 121kHz from my tests). Presenting it as a monitor for retro content is also not entirely true. I even see him recommending a standard CRT TV rather than an FW900 because the TV is much cheaper, as if CRT monitors and standard CRT TVs were the same thing. I agree it's not worth paying so much for a CRT monitor of any brand or screen size, since most will be used by now, with some wear, and there's no way to connect them to a modern video card without an external adapter (and the good ones are getting harder to find). But as an FW900 owner who uses it to play both retro and modern games (a lot of retro through the MAME emulator), I can testify that retro games on this monitor, like the classic King of Fighters games from the '90s and the classic Mortal Kombats, look just as ugly and pixelated as they do on a modern display or TV if you play them without emulator filters. The FW900 and many other CRT monitors have small, plentiful dots, just as modern monitors have small, plentiful pixels, so modern content looks beautiful on those CRT monitors too, but low-res retro content does not look so good, especially games based on low-res sprites or 2D low-res images rather than 3D polygons; you need filters to improve the visual quality of retro emulation on them. Modern gaming on these monitors, even on my smaller 17-inch Compaq 7550 CRT, looks very good.

I see many people tend to think standard CRT TVs and CRT monitors are the same, and so assume we CRT monitor users only use them for retro content; as I wrote, that's not correct for all of us CRT monitor enthusiasts. Standard CRT TVs have much bigger dots, notable scan lines, and lower resolutions and refresh rates than CRT monitors, so low-resolution content looks less pixelated on them than on a modern monitor. On most CRT monitors I have seen, I can barely make out the scan lines, and only at lower resolutions; they have far smaller and more numerous dots than standard CRT TVs and a much sharper image, so you can see detailed modern content on a CRT monitor much as it appears on a modern monitor. I have personally enjoyed even modern gaming on my CRTs, and not just for their motion quality: very good blacks, perfect viewing angles, low latency, flexibility across resolutions and refresh rates, zero crosstalk at any refresh rate used, softer flicker, good brightness, etc. Their phosphor decay is barely noticeable, shows only in high-contrast situations, and vanishes fast enough that it doesn't disturb the CRT motion quality the way modern monitors' motion flaws and trade-offs do. CRT monitor image quality has nothing to envy in modern monitors.

Sadly, more misinformation against CRT monitors is also coming from sites like Blur Busters, a site with very helpful information about motion clarity, but whose owner has become too biased, promoting ViewSonic monitors as having "better" motion quality than even an FW900 (monitors that in reality carry so many flaws in their CRT-style motion quality that they are far from delivering the real motion quality experience an FW900, or any other CRT monitor, can), and misinforming the public about CRTs, evidently to make CRT monitors look as flawed as their ViewSonic-certified monitors. For example, in the comments section of this link, you can observe that Mark Rejhon replied to the top ViewSonic XG2431 Amazon reviewer (namcost), who complained that the XG2431's 60hz strobing mode is badly flickery, so bad that even though namcost considers himself to have good tolerance to flicker, tweaks didn't improve the situation, and he definitely didn't recommend the XG2431's 60hz single-strobe "CRT motion quality" mode to anyone. Rejhon's reply bashes CRTs with nonsense information to justify the XG2431's bad 60hz CRT-motion-clarity flicker, with arguments like CRTs being "smaller" and today's monitors being "huge", when in reality a CRT like the FW900 (the monitor he keeps praising his ViewSonic-certified monitors over) is just 1.3 inches smaller (22.5-inch viewable area) than the XG2431 (23.8-inch viewable area), and other CRT monitors many still use have about 19 inches of viewable area, not that far in size from the XG2431. That makes the monitor-size justification for the XG2431's bad 60hz flicker totally senseless.

It can also be observed that Rejhon, in his persistence in justifying the XG2431's bad 60hz CRT-motion-quality flicker, advises reducing brightness to reduce flicker perception on the XG2431, but again misinforms with the false claim that "CRTs are just 50 nits", when the truth is that CRT monitors can reach up to 100 nits. He evidently makes this false claim because it's known that both Blur Busters-certified monitors, the XG270 and the XG2431, suffer from a very dim display at their best CRT-matching motion quality mode, such that reviewers only find them usable in that mode in a "cave", a dark room with no lights on; in fact the XG2431 was measured at only 53 nits in that review at its best CRT-quality motion mode. So, "curiously", Rejhon misinforms that CRTs can only reach 50 nits.

By the way, after using CRT monitors for over 20 years, I have never needed to use any of them in such a dark room. I can use them in a moderately naturally lit room, or even at night with a light bulb on in the room, with still very nice luminance levels, definitely far from being watchable only in a "cave" room. Even the YouTuber "Aperture Grille", in his review of one of the ViewSonic Blur Busters-certified monitors, the XG270, confirmed his 18-year-old CRT can reach 100 nits.

I also don't see any CRT monitor owner here on this forum complaining that their CRT monitors can only reach luminance levels requiring a "cave" room, or that their CRTs get so "fuzzy" that they need to run the brightness (luminance) so low that the image is only visible in a completely dark room with no lights on. As a matter of fact, I have used my CRT monitors, my 19-year-old Compaq 7550 and my FW900, at their maximum luminance levels with no "fuzzy" issues.

By the way, about a month ago I respectfully wrote a comment correcting those misinformations against CRTs that Rejhon wrote on that Blur Busters XG2431 public advertising page, but at the time of writing this, my comment has not been made public; it still sits behind a "Comment awaiting moderation" message, and Rejhon doesn't seem interested in correcting misinformation he knows is misleading the public. Hilariously, he just added an edit with more nonsense ("yesterday"? "yesteryear"? "fuzzy"? "pandemic"??).
This speaks badly of how Blur Busters is becoming not just another biased, misinforming site, but a censoring one.


Though OLED is very close from my understanding.


Wondering when this thread will finally be mothballed? 🤔 Join the OLED party guys, time to move on....


I recently had the pleasure of testing an LG OLED C2 (EDIT: IT'S A C1, NOT A C2 MODEL) myself at a friend's house; he bought one recently. I used my own computer and ran some limited tests, since I haven't had much time, focusing on the point I care about most: motion clarity.
It's the closest thing I have seen to the all-in-one quality of a CRT monitor; however, I still see issues with it that make me prefer CRT monitors:

-Its best 60hz motion quality mode still has more noticeable flicker than a CRT monitor, and there was still noticeable motion blur even at the highest OLED Motion Pro quality setting I found, so it was a huge disappointment for me. However, the screen was not dimmed so badly as to require something like a "cave" room; with that mode enabled I would say it's a bit dimmer than a CRT, but not by much, and still usable in a semi-dark room. Not bad on the brightness front.

-At 120hz with the best motion quality mode enabled, the motion quality is very good, very close to CRT motion and hard to fault in games, with zero noticeable crosstalk, decent screen brightness (luminance) not requiring a "cave" room, and very good latency. Sadly, all this wonder sits at such a high refresh rate that you would need a GPU capable of running a constant 120 fps in modern games; even an expensive RTX 3090 Ti will have trouble holding a constant 120 fps in all modern games, even without ray tracing.

I would dare to say that if running at 120hz for very good motion clarity is no worry, and TV size and price are not a concern, or maybe for emulators at 120hz with built-in black frame insertion (I didn't test emulator BFI, but I guess it should give something similar to the clarity of a 60hz CRT; for me, though, 60hz clarity in every type of gaming content is critical, not just emulators), this TV is the CRT replacement many have been looking for. For me, not yet. But damn, so f*** close!!

It seems LG degraded motion quality at lower refresh rates on this model. From what I read, the LG CX had better motion quality in its best 60hz motion mode but seemed to dim the screen a lot, worse than this one, so maybe they removed their absolute best motion quality mode in favor of better brightness, since even at its "best" motion quality mode there was still noticeable motion blur.

Sadly, there seems to have been very poor improvement in motion clarity versus brightness loss. I see the same issue on all modern TVs and monitors as on the old LightBoost tech: you need a very dim screen to achieve CRT-like clarity on all modern displays, or very high refresh rates, up to 240hz+ (and so a GPU capable of a constant 240+ fps), or 120+hz plus strobing (and so a GPU capable of a constant 120+ fps).
 
Not to derail, but no way is that OLED dimmer than a CRT, even with BFI enabled. I've seen measurements of the C2 with that mode enabled and I think we're talking 170 nits or so. Twice as bright as what I had my old GDM monitors set to. Your friend probably had the OLED light setting (or whatever it's called) turned down.

I also have a ViewSonic XG2431 and I think you need to see it in person, as your sticking points against it are exaggerated, in my opinion. I do agree with you on some of the marketing BS about it, though. CRTs weren't 50 nits. No way. Most badly calibrated screens I used tended to be too bright and needed to be lowered to be correct.
 
CRTs are what gas cars will be in the next two decades: they were great for their time, but time always marches on. Says someone who still drives a V6 every day....
 
I love my FW900 to the point where it's probably my prized possession if I had to say, as someone who also "rediscovered" CRTs as an adult and realized what I'd been missing out on, but I can't argue with how inefficient they are as far as power consumption, size, weight, etc. Not to mention the legitimately hazardous process of producing them in the first place...

It's kind of like vinyl records and analog sound, or viewing a painting versus a high-res scan of the same painting; human eyes and ears are analog sensors, so there's just something more natural and cozy feeling to me than a digital recreation of an image or sound etc. I'd concede that it's totally possible it's just a bias, or something to do with using them in my youth with fond memories, but I'd happily part with every flat panel I've ever owned over losing my CRTs ^.^
 
CRTs are what gas cars will be in the next two decades: they were great for their time, but time always marches on. Says someone who still drives a V6 every day....
I have about six 19-21 inch Trinitron and shadow-mask CRT monitors in mint or near-mint condition. I will be using them in 10 years regardless of what improvements come about in the future. I anticipate that PC games will continue to be made for a generation that I'm no longer a part of. As a result I'll probably still be playing PC games that look best and play best on an analog display. Just like other forms of popular culture, it seems PC gamers over 50 like me are being left behind. 😞 That's okay, because I have a virtually unlimited catalog of PC games and console games that will require the remainder of my natural life just to make a dent in. My CRTs will be put to much use in the coming years as I near retirement. 😀
 
I forgot to mention I ran those tests at 1080p; even though this LG OLED C2 (EDIT: IT'S A C1, NOT A C2 MODEL) is natively 4K, 1080p looked very good to my eyes, with good scaling.

jbltecnicspro, to my eyes the screen at its best motion quality setting looked a little dimmer than my CRTs, which I use at the maximum luminance their "contrast" setting allows. But you're right, I didn't check whether the TV's corresponding luminance setting was at maximum; it was also a different environment from the one I normally use my CRTs in, and that TV is 55 inches, which can change perception in this regard, so there is a chance of inaccuracy, I agree with you (I wish I had a device to measure luminance in nits). What I am sure I perceived correctly is that the motion quality was barely improved by its best motion mode: at 60hz there was very little difference with the mode on, still noticeably blurry motion. Since motion is so poorly improved on this OLED, it makes sense that its "best" 60hz motion mode could still reach something like 170 nits, because, as you know, on modern BFI-based monitors and TVs, the better the motion quality, the dimmer the screen.

I'm aware you have the XG2431, and I know it both from your testimonials and from a lot of reading I have done of other sources and reviews by people who have had the monitor in their hands.

I remember when I asked you some questions about the XG2431, and I remember your answer:

[Attached screenshot: info.jpg]


After that answer, you contacted me via private message and asked me to keep any further questions about the monitor private. I found that very suspicious; what's wrong with making your impressions of that monitor public? Still, I respected your wish, and to be honest with you, man, no offense, but I suspected Mark Rejhon had contacted you and asked you to stop talking about that monitor in public. So I decided to stop asking you. Besides, what you had already said was enough, and it matches what I have researched from other sources: this monitor is definitely not on par with a CRT monitor. It is clearly dimmer at its best CRT-matching motion quality mode and has more intense flicker in its 60hz flicker mode. And to eliminate its crosstalk (which CRT monitors are completely free of at any refresh rate), as I found Rejhon writing in the comments section of the Bijan Jamshidi XG2431 review, you need high refresh rate headroom (hence a GPU capable of constant high fps), at the cost of flicker even with tweaks. (Didn't Blur Busters supposedly "factory-tune ViewSonic XG2431 monitors to achieve among the world's best motion blur reduction for an LCD, even better than the previous gold standard, NVIDIA ULMB. Blur Busters Approved 2.0 monitors are among the most CRT-motion-clarity desktop LCDs ever invented"???)

[Attached screenshot: 662496_lol.jpg]



"if you are worried about flicker but want lower crosstalk 180hz vertical total 1500 is a good compromise" even a RTX 3090 TI will struggle to achieve constant 180 fps in most modern games even with raytracing off to achieve that good compromise.

jbltecnicspro, I would like to ask you: have you been able to achieve crosstalk-free operation at any refresh rate, with CRT-like 100-nit luminance levels at CRT-clear motion clarity (not close to it, but as clear as or "better" than CRT motion quality, as Rejhon likes to state), without needing your room dark with no lights on to see the screen image (by the way, you don't seem to be using the XG2431's best CRT motion quality mode, PureXP "Ultra"), and with the same flicker perception as a CRT at 60hz single strobe? If you have, I'll believe those points can be considered exaggerated. As a matter of fact, those "exaggerated" sticking points are not mine; they come from people who have actually had the monitor and witnessed it with their own eyes (and in some ways you confirm them too): reviewers like Bijan Jamshidi, namcost (the top XG2431 Amazon reviewer and an owner), and even Rejhon himself (the pic above about crosstalk).



Talking about the OLED C2 (EDIT: IT'S A C1, NOT A C2 MODEL): I also ran a basic, quick input lag test with its latency boost mode enabled (I didn't check with the motion clarity mode enabled), and it gave me the same measurement I get from my CRT monitors, even hooked to a video card with a native VGA analog output, with vertical sync off. Impressive!

The way I measure: all tests with vsync off; record a fighting game (King of Fighters '95 on MAME 0.246b 64-bit with the low-latency setting enabled) with a smartphone at 60 fps, since that game runs at 59 fps, for a closer match; TV set to a 60hz refresh rate; press the controller button multiple times; then replay the video in a frame-by-frame player and compare button presses against character actions. I saw an average of 6 frames of input delay, which is the very measurement I have gotten from an original Neo Geo MVS arcade with a CRT arcade monitor!!
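
For reference, converting those frame counts into milliseconds is straightforward. A small sketch (the per-press counts below are made up for illustration; only the averaged 6-frame figure is from my test):

```python
def lag_from_capture(frame_offsets, capture_fps=60):
    """Average button-to-reaction delay from frame counts in a phone capture.

    Each counted frame between the button press and the on-screen reaction
    is one capture interval (~16.7 ms at 60 fps).
    """
    avg = sum(frame_offsets) / len(frame_offsets)
    return avg, avg * 1000 / capture_fps

# Hypothetical per-press counts averaging ~6 frames, like the KOF '95 result:
frames, ms = lag_from_capture([6, 5, 7, 6, 6])
print(f"~{frames:.1f} frames = ~{ms:.0f} ms button-to-screen")
```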

I also did a test with King of Fighters XV and measured an average of 3 frames of input lag!! The same measurement I get from my PC with the FW900 and Sunix DPU3000. Nice! By the way, I have used this method to compare input lag with and without the Sunix against video cards' native analog outputs, and there is no noticeable input lag increase; amazing for the Sunix DPU3000 (I remember the DPU3000's input lag was measured in this thread and no notable lag was found there either).
Without the TV's latency boost option enabled, the OLED's input lag increased by 2 additional frames.

So from what I tested, this TV's latency is awesome, at least for 60 fps fighting games (I didn't test mouse-driven or first-person shooter games). Nothing to envy in CRTs in this regard, at least with its latency boost mode enabled.





I love my FW900 to the point where it's probably my prized possession if I had to say, as someone who also "rediscovered" CRTs as an adult and realized what I'd been missing out on, but I can't argue with how inefficient they are as far as power consumption, size, weight, etc. Not to mention the legitimately hazardous process of producing them in the first place...

It's kind of like vinyl records and analog sound, or viewing a painting versus a high-res scan of the same painting; human eyes and ears are analog sensors, so there's just something more natural and cozy feeling to me than a digital recreation of an image or sound etc. I'd concede that it's totally possible it's just a bias, or something to do with using them in my youth with fond memories, but I'd happily part with every flat panel I've ever owned over losing my CRTs ^.^

Many people tend to believe that a preference for CRTs, even in 2022, is down to "nostalgia", bias, things like that. Not in my case. Yes, they can consume more power, they're bulky, they're physically ugly, but their motion quality (especially motion clarity) is still unmatched by most monitors and TVs. The C1 OLED (EDIT: IT'S A C1, NOT A C2 MODEL) is the closest thing I have seen, but from a practical standpoint it still struggles: really good motion clarity at 120hz with the best motion setting, but impractical once you realize you need a computer capable of a constant 120 fps to achieve it; even the most expensive ones will struggle to hold that in modern games. And I hate motion blur, because it makes games feel boringly artificial and unnatural; they can have beautiful graphics, but all that magic gets ruined in motion. CRTs do a much better job of making motion feel natural and immersive, and thanks to their flexibility at any refresh rate they can be enjoyed even at lower frame rates like 60, where there is still a lot of worthy content, including modern games, with much lower and more realistic PC hardware requirements. You don't see things getting blurred in reality when you move, which is why blurred monitor motion feels dull and lacking in realism.
 
After that answer, you contacted me via private message and asked me to keep any further questions about the monitor private. I found that very suspicious; what's wrong with making your impressions of that monitor public? Still, I respected your wish, and to be honest with you, man, no offense, but I suspected Mark Rejhon had contacted you and asked you to stop talking about that monitor in public. So I decided to stop asking you.
Yep, you caught me red-handed. Mark contacted me and told me that I would have to sell my ViewSonic stock if I continued to run my mouth. :rolleyes:

No - I asked to continue the conversation in private because I didn't want to turn the FW-900 (and general CRT) thread into the XG-2431 thread. There are lots of people here asking for help with their CRT monitors, and I felt that going down the rabbit hole of running tests, doing this, doing that, reporting back... on an LCD monitor would have derailed this thread off-topic. Since you were the only one who expressed an interest in an in-depth analysis of the monitor, I thought it would be easier to just go one-on-one with you.
jbltecnicspro, I would like to ask you: have you been able to achieve crosstalk-free operation at any refresh rate, with CRT-like 100-nit luminance levels at CRT-clear motion clarity (not close to it, but as clear as or "better" than CRT motion quality, as Rejhon likes to state), without needing your room dark with no lights on to see the screen image (by the way, you don't seem to be using the XG2431's best CRT motion quality mode, PureXP "Ultra"), and with the same flicker perception as a CRT at 60hz single strobe? If you have, I'll believe those points can be considered exaggerated. As a matter of fact, those "exaggerated" sticking points are not mine; they come from people who have actually had the monitor and witnessed it with their own eyes (and in some ways you confirm them too): reviewers like Bijan Jamshidi, namcost (the top XG2431 Amazon reviewer and an owner), and even Rejhon himself (the pic above about crosstalk).
No I have not. 120hz gets me the closest, I feel. There's a way to do QFT but I haven't been able to successfully enable it without crashing my GPU. Not sure why. This was done on two separate machines. So if I get that working then maybe I'll have something that is crosstalk-free.

In terms of exaggeration, let me clarify. Declaring that something has cross talk isn't an exaggeration. Declaring that at its clearest motion, the screen is half as bright as a decent CRT isn't exaggerating. It is what it is. No display is perfect, not even your beloved FW900. What I have a problem with is that you seem to imply that because the XG-2431 falls short of the FW900, it's a crap product. That it's a scam.

Here's reality. No display comes closer to CRT motion clarity than this screen and maybe a handful of others. If you have a quality, working CRT, then this won't replace it. I never claimed it would. I DO have a problem with Mark's advertising like I said. I agree with you 1000% on this. If you do not have a quality, working CRT, then this will get you closer than anything else there is. That's really all there is to it.

Here's my concern. I'm concerned that monitor manufacturers will see opinions like yours, and all the brow-beating and nit-picking. Yes - you read that correctly. And say "fuck it" and stop trying. Be honest with yourself - we're in the minority here. And I would love to be able to buy a brand new monitor that actually replaced my old CRTs in every aspect for less than $5000. For all its faults, the XG-2431 is the first screen that I've seen where a manufacturer made a genuine attempt, and not some half-assed feature slapped on at the last minute. I would very much not like it to be the last of said attempts.

Again - you have a nice CRT? Keep it going for as long as you can. Those babies are awesome, and though there are morons who say it's all nostalgia, we all know better. You're preaching to the choir here. Do I need to remind you that I wrote a calibration guide for the FW900? I know more about these screens than most. You're not talking to someone who is ignorant of how good these displays are. I absolutely miss my GDM monitors, and my FW-900 especially. I've never had another display that topped it, and have never seen one that did except a fully calibrated G90 projector (now THAT's a sight to behold! Take your favorite aspects of your CRT monitor and magnify the size to 110 inches and you'll have an idea).
 
What I have a problem with is that you seem to imply that because the XG2431 falls short of the FW900, it's a crap product. That it's a scam.
"people who prefer CRT will be pleased to know that a Blur Busters Approved monitor, strobed at a Hz well below maximum, can produce a motion experience superior to a CRT."
"ViewSonic XG270 is superior to a Sony FW900 CRT"

Words like those, among the many kinds of false advertising manipulation I have described in this thread, are what make me strongly believe both Blur Busters-certified ViewSonic monitors are a scam, compared not just with the FW900's motion clarity experience but with any other CRT monitor in good condition. In fact, even my ordinary 17-inch Compaq 7550 CRT monitor is so much brighter than those ViewSonics at their CRT-matching motion quality that I can use it, as I do the FW900, in a moderately naturally lit room, or even at night with a light bulb on, with a still clearly bright screen; and it is crosstalk-free at every refresh rate it supports, from 48 to 140hz, with no "fuzzy" issues at the maximum contrast/luminance levels it can achieve (I can record and upload video proof of this if someone is interested).

I also see you struggling to get crosstalk-free operation and to use the supposedly supported QFT tweaks, even on different computers, after all the time you have already had the XG-2431 in your hands, and you have only been able to get close to CRT motion quality at 120hz.
So does it really make any kind of sense to say the XG2431 "can produce a superior motion experience than a CRT" when it cannot reach decent brightness levels that don't require a dark room at its best CRT motion quality, needs 120 fps to provide CRT motion quality (so a GPU capable of a constant 120 fps), has worse flickering than any CRT monitor (not just the FW900) for 60hz CRT-clarity 60 fps content, and has crosstalk?

I know no display is perfect; I'm not pretending to say otherwise, or to hold the FW900 up as perfect. My whole issue is with Rejhon and his false advertising campaign for his ViewSonic-certified monitors and against CRTs: all the falsity I have described that can be observed on the main Blur Busters advertising pages for the XG270, the XG2431, and the Blur Busters Approved program. Sure, he (Rejhon) comes to the forums to give all kinds of nonsense explanations for his actions, even calling me "silly", but you can see he refuses to correct his main advertising pages with accurate, honest, real information, and keeps the misinformation and falseness up.

Here's my concern. I'm concerned that monitor manufacturers will see opinions like yours, and all the brow-beating and nit-picking. Yes - you read that correctly. And say "fuck it" and stop trying. Be honest with yourself - we're in the minority here.
Man, I opened my eyes to reality a long time ago, and it's evident manufacturers are simply not interested in improving motion quality on modern monitors because, as you said, we CRT motion clarity enthusiasts are a tiny minority in the entire world; what I say or don't say in the forums will have no relevance to that. I can safely say I am the only one in my entire country who cares about motion clarity. The few guys I have seen still using CRTs besides me (and those are CRT TVs) do so because they prefer how their retro consoles look on them; they don't care about motion clarity when I ask them about it. And so it is in the vast majority of the world.

I gave up on this topic a long time ago and concluded the best thing is to stick with my CRTs for whatever life is left in them. Sure, 60hz single-strobe options emerged, but as I suspected, I wasn't happy when new monitors with 60hz single strobe were announced, because I knew what flaws were coming with them: worse flicker than a CRT, crosstalk, a dimmed screen; far from providing the motion experience that CRT monitors can, let alone replacing them.

Now look at the old LightBoost from many years ago: how much improvement since then? Still the same notable brightness loss for CRT-like motion clarity, still crosstalk that cannot be fully eliminated, or only at a few high refresh rates. OK, better colors, but only because better-color IPS-like monitors now support BFI. This poor improvement is evidently due to the lack of motivation and worthwhile investment from manufacturers, given the lack of demand for those improvements. Good luck with the impossible task of convincing thousands of millions of monitor enthusiasts in the entire world not to buy monitors or TVs until manufacturers improve motion clarity and we finally get the truly flexible, real-world-usable CRT motion clarity experience we have been seeking for decades.



IMPORTANT EDIT ABOUT THE LG OLED TV I REPORTED TESTING: MY FRIEND CORRECTED ME THAT HIS MODEL IS A "C1", NOT A "C2"!
 
"people who prefer CRT will be pleased to know that a Blur Busters Approved monitor, strobed at a Hz well below maximum, can produce a motion experience superior to a CRT."
"ViewSonic XG270 is superior to a Sony FW900 CRT"

Words like those, among the many kinds of false advertising manipulation I have described in this thread, are what make me strongly believe both Blur Busters-certified ViewSonic monitors are a scam, compared not just with the FW900's motion clarity experience but with any other CRT monitor in good condition. In fact, even my ordinary 17-inch Compaq 7550 CRT monitor is so much brighter than those ViewSonics at their CRT-matching motion quality that I can use it, as I do the FW900, in a moderately naturally lit room, or even at night with a light bulb on, with a still clearly bright screen; and it is crosstalk-free at every refresh rate it supports, from 48 to 140hz, with no "fuzzy" issues at the maximum contrast/luminance levels it can achieve (I can record and upload video proof of this if someone is interested).
It's called marketing. Marketing sucks and I don't like it either, but that's what it is. Buyer beware. Because on one hand, in one specific aspect, I would actually agree it's better than the FW900's motion clarity. In the sweet spot on the shorter pulse lengths, its clarity is better than CRT. There's just all the other BS baggage that comes with that though :D I get your point and I agree with you.

I also see you struggling to get crosstalk-free operation and to use the supposedly supported QFT tweaks, even on different computers, after all the time you have already had the XG-2431 in your hands, and you have only been able to get close to CRT motion quality at 120hz.
So does it really make any kind of sense to say the XG2431 "can produce a superior motion experience than a CRT" when it cannot reach decent brightness levels that don't require a dark room at its best CRT motion quality, needs 120 fps to provide CRT motion quality (so a GPU capable of a constant 120 fps), has worse flickering than any CRT monitor (not just the FW900) for 60hz CRT-clarity 60 fps content, and has crosstalk?
"My struggle" and all the time I've already had... :D Please don't make assumptions here. I'm a dad with *several* small children. If I can't get something like this working in an hour or less, without a clear solution in the future, then I'm not going to spend a whole lot of time pursuing it - especially if it's for a hobby. Truth is I've tried it maybe a few times and couldn't get it working. Haven't tried it since.

I know no display is perfect; I'm not pretending to say otherwise, or to hold the FW900 up as perfect. My whole issue is with Rejhon and his false advertising campaign for his ViewSonic-certified monitors and against CRTs: all the falsity I have described that can be observed on the main Blur Busters advertising pages for the XG270, the XG2431, and the Blur Busters Approved program. Sure, he (Rejhon) comes to the forums to give all kinds of nonsense explanations for his actions, even calling me "silly", but you can see he refuses to correct his main advertising pages with accurate, honest, real information, and keeps the misinformation and falseness up.
Sounds like your real beef here is with him. And I see where you're coming from. Flipside of the coin is that without him, I wouldn't have the XG-2431.
Man, I opened my eyes to reality a long time ago, and it's evident manufacturers are simply not interested in improving motion quality on modern monitors because, as you said, we CRT motion clarity enthusiasts are a tiny minority in the entire world; what I say or don't say in the forums will have no relevance to that. I can safely say I am the only one in my entire country who cares about motion clarity. The few guys I have seen still using CRTs besides me (and those are CRT TVs) do so because they prefer how their retro consoles look on them; they don't care about motion clarity when I ask them about it. And so it is in the vast majority of the world.

I gave up on this topic a long time ago and concluded the best thing is to stick with my CRTs for whatever life is left in them. Sure, 60hz single-strobe options emerged, but as I suspected, I wasn't happy when new monitors with 60hz single strobe were announced, because I knew what flaws were coming with them: worse flicker than a CRT, crosstalk, a dimmed screen; far from providing the motion experience that CRT monitors can, let alone replacing them.

Now look at the old LightBoost from many years ago: how much improvement since then? Still the same notable brightness loss for CRT-like motion clarity, still crosstalk that cannot be fully eliminated, or only at a few high refresh rates. OK, better colors, but only because better-color IPS-like monitors now support BFI. This poor improvement is evidently due to the lack of motivation and worthwhile investment from manufacturers, given the lack of demand for those improvements. Good luck with the impossible task of convincing thousands of millions of monitor enthusiasts in the entire world not to buy monitors or TVs until manufacturers improve motion clarity and we finally get the truly flexible, real-world-usable CRT motion clarity experience we have been seeking for decades.



IMPORTANT EDIT ABOUT THE LG OLED TV I REPORTED TESTING: MY FRIEND CORRECTED ME THAT HIS MODEL IS A "C1", NOT A "C2"!
So my response is that the XG is a definite improvement on my Samsung VA monitor with strobing. But I wouldn't necessarily equate that to advancements in the technology. My biggest problem is that the Samsung locks a couple of controls in BFI mode. I would like for it to do 60hz single strobe but I understand that requires a tighter tuning than just simply adjusting brightness (which is the control that's locked in BFI mode by the way - it's way too bright in that mode).

I'll also encourage you to feel lucky and blessed that you still have a quality CRT. I made the choice to give mine up when we decided to help out a family member who was terminally ill. I also decided not to pursue getting any more, as they're way too expensive for quality units and it's not something I can really do with my kids. My life is different now. I still wish I had mine, but alas I don't. I agree with you though; it seems that the only way we can get a screen with decent motion clarity is to crowdsource one. OLED screens with adjustable roll or a scan-out implementation should give us something closer to what CRT did. Or hell - a rear-projection laser display, which literally scans out the image like a CRT did, but with lasers. That would do it too.
 
Just compiling a few more unique-looking HDMI > VGA adapters that have not been community-tested. The ones that aren't that *one* specific widely-used little 'dongle' style design are always intriguing to me...
Also worth noting that some of these are quite new and could have newer chipsets we are not aware of.

~~~~~~~~~~~~~~~~~~~~

https://www.newegg.com/p/2VR-001J-00020?Item=9SIA7253E98173
^ audio breaks out in a diff. location than other dongle ones

https://www.newegg.com/p/2VR-00JS-00001?Item=9SIA57ZB721964

https://www.newegg.com/p/2VR-001J-00042?Item=9SIA7254MW7685
^ different shape and PSU connector than other dongle ones

https://www.globalmediapro.com/dp/A2K8V6/ASK-HDCN0012M1-HDMI-to-HDMIVGAAudio-Decoder/

https://www.newegg.com/bytecc-hm201/p/N82E16812270601?Item=N82E16812270601
^ has a very different chipset than all others, see my previous post

https://www.newegg.com/p/2VR-00EB-00004?Item=9SIB7DVJ3W0798

https://www.newegg.com/p/2VR-065A-00DR4?Item=9SIAP9WHHX8449

https://www.newegg.com/p/2S7-04RG-000M5?Item=9SIA441ADU5326
^ this is an older design that's become uncommon, probably Lontium LT8511

https://www.aliexpress.com/item/2251832309640536.html?gatewayAdapt=4itemAdapt
^ this one is VERY intriguing to me

https://www.newegg.com/c2g-40714/p/N82E16886970009?Item=9SIB18AFEP3665

https://www.newegg.com/p/1YN-00N6-000H9?Item=9SIAYWHEX04752
^ have seen this one for cheaper on Amazon

https://www.newegg.com/p/1B4-09RP-00R59?Item=9SIAWNEECX6682
^ have seen ones similar to this on AliEx for very, very cheap

https://www.aliexpress.com/item/2251832761974946.html?gatewayAdapt=4itemAdapt
^ super-weird UGreen... a bit on the older side.

https://www.aliexpress.com/item/2251801509268091.html
^ HDFury 3 clone...? Maybe, maybe not

~~~~~~~~~~~~~~~~~~~~

If anyone takes the plunge and tries any of these for science, that would be awesome! You never know when we'll find another HDMI DAC chipset that can pump out ~400MHz like the LK7112.
 
"people who prefer CRT will be pleased to know that a Blur Busters Approved monitor, strobed at a Hz well below maximum, can produce a motion experience superior to a CRT."
"ViewSonic XG270 is superior to a Sony FW900 CRT"

Words like those, among the many kinds of false advertising manipulation I have described in this thread, are what make me strongly believe both Blur Busters-certified ViewSonic monitors are a scam, compared not just with the FW900's motion clarity experience but with any other CRT monitor in good condition. In fact, even my ordinary 17-inch Compaq 7550 CRT monitor is so much brighter than those ViewSonics at their CRT-matching motion quality that I can use it, as I do the FW900, in a moderately naturally lit room, or even at night with a light bulb on, with a still clearly bright screen; and it is crosstalk-free at every refresh rate it supports, from 48 to 140hz, with no "fuzzy" issues at the maximum contrast/luminance levels it can achieve (I can record and upload video proof of this if someone is interested).

It's just a matter of line-item perspective -- if your priority is motion clarity instead of colors/brightness -- I'll give readers a heads-up about my post here containing my reply to 3dfan's biased perspective:
https://hardforum.com/threads/inter...z-crt-for-8000.2018866/page-3#post-1045361267
[I can crosspost it here if needed]

Squarewave single-strobe 60Hz certainly is a polarizing feature. Many love it, and many hate it. He's already pitched his camp on the opposite side.
At the end of the day, it's a feature that exists that people can choose to use or not.

Needless to say, I'll leave it at that. I've already clearly pointed out I'm all about line-items, while that person's post is all about the aggregate. Like youtubers or streamers, I am a minor 'celebrity' of sorts in a different niche, so I am due for barbs from my detractors and subject to spin-amplification of any (incorrectly) perceived negatives. Go out, have a latte, admire nature, play a game, or whatever. Readers can come to their own judgements.

Things are easily quoted out of context.

I get it. People are passionate about CRTs. That's the common ground. We certainly still have the inability to match ALL checkboxes (blacks, texture, resolution independence, softer flicker for a Hz, etc) despite LCDs having beat CRTs in the motion-clarity department.

Some people certainly still make the false claim that we haven't yet (from an end-result / human-vision / pursuit-camera point of view) on the motion clarity checkbox. I've certainly never lied about that. Sure, CRTs have a rise edge of microseconds/nanoseconds (far faster than human vision needs, but scientifically that part is not the cause of CRT's superlative motion clarity -- it's the short peak pixel visibility time). A detractor will frequently wave that in my face, but that's ignoring the falling edge (aka GtG on LCD, or phosphor decay on CRT). The fall edge (phosphor decay) can still be slower than certain crosstalkless strobed LCDs (where the fall-edge GtG is completely hidden in the dark period of a strobe). Yes, there's the brightness tradeoff of a strobed LCD and the squarewaveness, but the motion clarity beats CRT. And not everyone even runs their CRTs at over 100 nits brightness -- you might, but not everyone does. I can keep going on.

It really shows with human vision and pursuit cameras, on many tubes: when you're doing high-resolution content (1080p+), the actual real-life measured motion resolution of some of the best big-budget strobed LCDs now exceeds that of the majority of CRT tubes (like the Quest 2 VR LCD, which is zero-crosstalk straight out of the box, with no tuning needed like the XG2431 needs to reach the zero-crosstalk, far-better-than-LightBoost state).

If someone hasn't purchased enough of the best post-2020 strobed LCDs (both VR and non-VR), and is just an armchair reviews-parrot, they are definitely not equipped to be a credible detractor.

So my response is that the XG is a definite improvement on my Samsung VA monitor with strobing. But I wouldn't necessarily equate that to advancements in the technology. My biggest problem is that the Samsung locks a couple of controls in BFI mode. I would like for it to do 60hz single strobe but I understand that requires a tighter tuning than just simply adjusting brightness (which is the control that's locked in BFI mode by the way - it's way too bright in that mode).
In my experience, VA panels are still too slow to hide LCD GtG in the VBI more fully. Additional adjustments to VA strobing help, but the max Hz of a crosstalk threshold (e.g. sub-1%) is much lower than it is with TN or Fast IPS, due to the highly inconsistent LCD GtG heatmap of VA panels.

So adjustments (even as many as the XG2431 has, which is more than the XG270) will greatly improve strobing on certain VA panels, but having the same adjustments doesn't bring VA up to the same strobe quality. You need newer, faster-GtG panels, combined with a higher max refresh rate (headroom) and the additional strobe adjustments, to go vastly better than LightBoost. VA panels still have too-slow, too-inconsistent GtG heatmaps to get decent crosstalk-free strobing.

High-Hz OLED is finally hitting the market soon, so who knows which racehorse in the refresh rate race will pull ahead in CRT emulation. I think OLED/MicroLED has a bigger chance of producing algorithms that tick off more CRT checkboxes beyond motion clarity (which has certainly long since been scientifically dethroned, especially if you are a Quest 2 owner). The thrill of eventual lagless rolling-scan algorithms and various kinds of flicker-mitigation algorithms awaits.

While OLEDs have less peak brightness than a water-cooled backlight mod [DIY YouTube Hack], and may be unable to strobe as brightly (at first), they have near-zero GtG, to the point where a 170 Hz OLED has slightly better sample-and-hold motion clarity than a 240Hz LCD -- and the blacks/colors checkboxes are ticked, at only a slight tradeoff in motion clarity (which may not matter as much, as long as it's low enough, if your priority is colors/blacks). The best LG OLED BFI setting (~4ms MPRT) has more than 10x the motion blur of an Oculus Quest 2 VR LCD (0.3ms MPRT), while simultaneously having equally clean, crosstalkless strobing (no leading/trailing artifacts). From a real-world, to-human-eyes (and also to-pursuit-camera) motion clarity checkbox standpoint, CRT ranks somewhere in between the best BFI'd OLED and the best strobed LCD. But neither will check all CRT-matching checkboxes (e.g. LCD's worse colors, or OLED's worse motion blur).
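
For readers who want the arithmetic behind that 10x figure, here's a minimal sketch of the persistence rule of thumb (blur trail width scales linearly with MPRT for eye-tracked motion; the 960 px/s speed is just a common pursuit-camera test convention):

```python
def blur_px(mprt_ms, speed_px_per_s=960):
    """Perceived blur-trail width for eye-tracked motion.

    For a tracked moving object, blur scales linearly with persistence:
    a pixel lit for the whole MPRT smears across the retina as the eye
    sweeps past it.
    """
    return speed_px_per_s * mprt_ms / 1000

# ~4 ms MPRT (LG OLED max BFI) vs ~0.3 ms MPRT (Quest 2 strobed LCD):
for name, mprt in (("4 ms BFI OLED", 4.0), ("0.3 ms strobed LCD", 0.3)):
    print(f"{name}: ~{blur_px(mprt):.1f} px of motion blur")
```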

The package deal (e.g. better colors) can lead someone to prefer a specific display technology for their specific use cases. You might be happy with motion clarity that's only 75% better than a common garden-variety 60Hz LCD, on a display that can do perfect blacks and DCI-P3 color (e.g. LG C1 at max BFI setting). The image is very clean, and LG OLED TVs are fantastic for use as a desktop monitor too.

I am actually excited to see what the new Micro-OLEDs rumored for the Quest 3 (4?) and rumored to be in the Apple Reality VR headset will pull off -- the rumor is that they finally found out how to strobe them brightly enough for ultra-short MPRTs, making strobed "HDR" possible. They may actually use longer pulse widths for brighter pixels, which is an interesting compromise (even human vision ghosts with super-bright light sources -- the vision trails you see when you dart your eyes past the sun or an ultra-bright streetlamp) -- so a variable-MPRT-per-pixel approach is a way to make OLEDs even brighter. There's a patent somewhere (either LG or Samsung). CRTs even had more motion blur for brighter colors, due to the longer phosphor decay of brighter colors, so ironically this simulates CRT better!

Certainly a legit approach, as a compromise to allow brighter strobing while keeping perfect blacks, ultrahigh 4K resolution, and great colors that exceed CRT gamut. Technically, you can do that with a FALD+MicroLED strobed backlight too, even while rolling simultaneously, though at a more zoned level, so there will be some development there too. But it appears several OLED factories are spooling up as we speak, and a boom of desktop OLEDs will hit the market over the entirety of the 2020s. Now, back to the Micro-OLEDs used for upcoming VR headsets.

The new upcoming Micro-OLED displays aren't desktop sized but will probably show superlative motion clarity specs. They might even become the benchmark display to beat for matching many CRT checkboxes (if you're OK with viewing a virtual CRT tube on a virtual desk, while inside virtual reality). Display technology is continuously improving for all panel technologies, but the OLED horse has apparently finally surged and sped up quite a bit and is something to watch in the next 3 years.

But the package deal shifts around (better blacks and colors), and overall, with more engineering improvements, OLED can eventually simulate CRTs much better than LCDs can. I hope OLED/MicroLED engineering improvements happen rapidly throughout this decade and next. Everyone's priority list of CRT checkboxes to tick off is different, and there is not yet a display panel that is Jack-of-All-Trades enough to tick them all off -- but brute Hz and HDR should allow very accurate temporal CRT electron beam simulation algorithms, as discussed in some of my previous posts (i.e. simulating a 60Hz CRT electron beam on a 1000Hz OLED). Exciting times indeed!

I cannot overstate how massively things have improved since the LightBoost days -- many will unscientifically & falsely claim otherwise -- but try to borrow a friend's Quest 2 or another high-resolution crosstalkless modern VR-headset LCD to mic-drop this "LCD motion clarity has finally beat CRT" matter; and they managed to do it without much brightness loss.

Technology is improving with all panel tech, whether it be OLED or LCD. 'Nuff said.
 
Last edited:
--- PSA: Informational ---

Since people are not moving things to private message (as others suggested), I am jumping in to be involved, after having respectfully held back. So here's my famous wall of text to address all this technology.

Even my post is arguably more on-topic than some of those recent posts, in the vein of "Are there upcoming technologies that will eventually match the Sony FW900 in one checkbox? Or all checkboxes? How long is the FW900 safe for, if I'm worried about more than just motion clarity?" Etc.

IMPORTANT: This is not /intended/ to be an advertisement, but an informative reply that also doubles as counter-argument to some misleading information that 3dfan has posted... Skip over the post if not interested in factual corrections.

While certain well-tuned strobed LCDs have long passed CRTs in motion resolution, if you're an all-checkboxes CRT person, CRTs are safe for quite a while. Sure.

But seeing how 3dfan is surgically targeting in an unreasonable way -- in an attempt to falsely discredit the proven strobe science in an unnecessarily broad-spectrum way... I'm way more pragmatic.


"if you are worried about flicker but want lower crosstalk 180hz vertical total 1500 is a good compromise" even a RTX 3090 TI will struggle to achieve constant 180 fps in most modern games even with raytracing off to achieve that good compromise.
As a correction to the out-of-context claim that "even a RTX 3090 Ti will struggle...":

It's also important to note that it's a continuum of ever-decreasing crosstalk.
If your panel has support for Strobe Utility re-tuning (BenQ, ViewSonic) as well as an Overdrive Gain adjustment (Factory Menu on BenQ, or Utility on ViewSonic) then after retuning --
- 179 Hz has slightly less crosstalk than 180 Hz
- 178 Hz has slightly less crosstalk than 179 Hz
- and so on. Assuming you've retuned as fully as possible to the LCD's maximum potential.

Assuming the Hz is properly re-tuned, it is a complete continuum of ever-decreasing strobe crosstalk as you gain more Hz headroom. So you can choose any Hz between ~59Hz - ~241Hz and then re-tune it. You can do 120Hz or 110Hz, or you can do 140Hz. While the XG270 only has tuning presets and can't be retuned at in-between Hz, the XG2431 can -- it took a lot of convincing of the manufacturer to enable broad-spectrum tunability.

Typically, at the factory, only certain Hz are perfectly tuned (e.g. NVIDIA ULMB 85 Hz, 100 Hz, 120 Hz), but you can now tune in-between refresh rates more perfectly than NVIDIA ULMB has tuned 85/100/120.

While, Hz-for-Hz, squarewave strobing is more flickery than CRT, most people find a 100Hz digital strobe less flickery than a 60Hz CRT. So you target a specific Hz, and then retune using Strobe Utility for zero-crosstalk operation.

The Large Vertical Total is based off the horizontal scan rate of the 240 Hz mode, so the lower your target Hz, the bigger the VT: recompute your new Large Vertical Total as (existing VT at 240Hz) x 240 / TargetHz, and create a new QFT mode based on that. You get a bigger VT for every Hz decrement, hiding more LCD GtG in the VBI between refresh cycles, until GtG98% or GtG99% or GtG100% successfully fits your needs. At low Hz, top/center/bottom can go fully strobe-crosstalk-free. At 120Hz QFT + 1/240sec scanout, you have up to ~4.2ms of VBI time to hide LCD GtG. By keeping scanout at 1/240sec with this QFT computation, every custom lower QFT Hz (larger VT) hides LCD GtG better between refresh cycles.
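For anyone who wants to sanity-check that arithmetic, here is a minimal sketch (the base VT of 1125 is a hypothetical example -- read your monitor's real max-Hz timings out of ToastyX CRU):

Code:
BASE_HZ = 240     # panel's max refresh rate; scanout speed stays at 1/240 sec
BASE_VT = 1125    # hypothetical vertical total of the native 240 Hz mode

def qft_vertical_total(target_hz):
    # Horizontal scan rate is held constant, so lower Hz = proportionally bigger VT
    return BASE_VT * BASE_HZ / target_hz

def vbi_milliseconds(target_hz):
    # Idle time between 1/240 sec scanouts, available for hiding LCD GtG
    return (1.0 / target_hz - 1.0 / BASE_HZ) * 1000

for hz in (120, 100, 85, 60):
    print(f"{hz} Hz QFT: VT ~{qft_vertical_total(hz):.0f}, "
          f"~{vbi_milliseconds(hz):.1f} ms to hide GtG")
# 120 Hz QFT: VT ~2250, ~4.2 ms to hide GtG
# 100 Hz QFT: VT ~2700, ~5.8 ms to hide GtG
# 85 Hz QFT: VT ~3176, ~7.6 ms to hide GtG
# 60 Hz QFT: VT ~4500, ~12.5 ms to hide GtG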

Even without Large VTs, crosstalk progressively decreases the lower in Hz you go -- just not as fast as with Large VTs. Even RTINGS's non-QFT (non-large-VT) pursuit photos are still pretty good, but you can get even better than RTINGS' pursuit imagery with QFT modes (Large VTs):

(example reviewer that tests multiple strobe Hz)
[Image: RTINGS XG2431 pursuit photos at multiple strobe refresh rates]


Where RTINGS reviews the XG2431, they show multiple pursuit photos for the multiple strobe settings, and you can see major differences. But the same pattern arises -- the lower the Hz, the less crosstalk, regardless of whether you use QFT or not. While RTINGS did not test QFT, it illustrates the relationship between Hz and strobe crosstalk. The effect is massively amplified (faster decreases of strobe crosstalk as you lower Hz) if you use QFT (Large VT).

The same technique is also used on the Quest 2 VR LCD, as discovered at DisplayWeek -- it apparently scans out faster than its Hz (possibly as fast as 1/240sec; the exact scanout speed wasn't measured) -- so it is also using refresh rate headroom:

https://uploadvr.com/quest-2-lcd-display-detailed-specs/

Specifically this slide:

[Slide: Quest 2 LCD scanout/illumination timing diagram]


While they do things differently (due to one LCD showing two images), the diagram clearly shows QFT behavior in action, with big gaps between scanouts -- VBIs were historically 5% or less of a refresh cycle. So other parties have independently found that large VBIs reduce strobe crosstalk too, whether done internally (scan conversion, like LightBoost) or externally (better, since it's also QFT = faster transmission of the refresh cycle over the cable = reduced lag).

The end result is an LCD that inches ever closer to (or exceeds) the motion resolution of a CRT such as a Sony FW900 (in the motion clarity department), as you work around those compromises. Now, for a specific person, you may reach something you love but find another attribute you hate (e.g. LCD greys).

But that's not the argument being made here -- the fact is somebody is falsely claiming that LCDs are unable to exceed the motion resolution of a CRT, in an attempt to discredit me. So, here, I come to the defense. CRT can be a superior option if you are concerned about all the checkboxes, but Blur Busters, as the namesake suggests, specializes in display motion blur, and is very surgically focused on it.

This is just longtime strobe science, and once you have roughly 2:1 to 3:1 refresh rate headroom, you can pretty much go zero-crosstalk on some LCDs. But you don't have to go all the way, if you want a sweet spot compromise between your GPU's frame rate, your preferred Hz, your brightness, whatever attributes are important to you. You aren't limited to 180 Hz.

So, if one wants CRT-clarity (ignoring other CRT checkboxes like blacks / squarewave flicker / etc) at 100Hz, then a 240Hz-scanout-capable panel is ideal for this, since the 100Hz 10ms scanout can be accelerated to 4.2ms (1/240sec), leaving a VBI of 5.8ms, which is considerably longer than manufacturer-suggested LCD GtG numbers.

With excellent temperature-compensated overdrive tuning on a "1ms GtG panel" (TN or IPS), much of the GtG heatmap hides within 5.8ms, eliminating most strobe crosstalk. Add more headroom, e.g. 80Hz on a 240Hz LCD (12.5ms refresh cycle minus 4.2ms scanout = 8.3ms of VBI to settle your LCD pixels in the dark, then a short strobe at the end). So it's a complete continuum -- lower your Hz slightly, and you have more VBI headroom (keeping the horizontal scan rate unchanged); retuned, it looks better than the slightly higher Hz mode. That's why Blur Busters Approved 2.0 monitors now have a requirement of any-Hz tuning -- to give users more Hz choices.

Either way, strobed desktop monitors should get similar crosstalk-reducing features over the long term, made much easier to use.

In the long term, a monitor should have a user-friendly "Quick Frame Transport: ON/OFF" toggle, which would load QFT EDIDs when ON, to avoid the need to buffer refresh cycles (like LightBoost does -- a very laggy strobe backlight) and to spare users a helluva lot of manual optimizing to reduce crosstalk.

Strobe Utility is like the strobe equivalent of a colorimeter -- a strobe tuning utility instead of a color tuning utility. LCDs have minor factory (& temperature) variances panel-to-panel for strobe, not just for color, since GtG can be slightly faster/slower in different batches, or because of temperature, etc. Account for this in a utility, and one can get better-than-factory strobe tuning.

Different people have different flicker-sensitivity thresholds. For a specific individual, the flicker-compensation for squarewave global strobe may be +20 Hz (e.g. 80Hz strobing is roughly as comfortable as a 60Hz CRT for one person), while for others it's at least 2x Hz (e.g. needing 120 Hz).

The context of my original reply of 180 Hz was "I can accept a slight amount of strobe crosstalk if it's not hugely noticeable; I would like to use the highest possible Hz that doesn't have super-ugly strobe crosstalk" -- which is not the same question as "I want something that mimics my Sony FW900 motion clarity at 100Hz". The context of my original answer needs to be taken into account -- Right Tool For The Right Job. It's okay to say you don't like the XG2431, but this borders on finding excuses to hate on a product or business, creating artificial means of hating a product -- one that is generally widely acclaimed by its users (a popular monitor will often have a higher bad-review total than a less popular model).
Choose your sweet-spot Hz for your purpose, and strobe-tune to it for the clearest motion:

The strobe tuning is generic, so any monitor with "pulse width + pulse position + overdrive gain" (PW + PP + OD), given sufficient tuning granularity such as 100-level overdrive, will have (almost) exactly the same strobe-quality-improvement behaviors (not just the XG2431) -- but only if the LCD provides those features. The great news is that RTINGS has started to note whether a monitor includes strobe tuning capability:


(screenshot clip that shows RTINGS now mentioning Pulse adjustments for all future computer monitors they review)

Wider strobe tuning range means a very wide brightness range. Because tuning permits anywhere from 1% to 40% of a refresh cycle, this produces a wide brightness range. One person may be happy with the motion clarity at 100-150 nits, and others may not be -- you can still get clearer motion than an LG OLED at max BFI while staying north of 100 nits, if you wanted. But if your package deal includes colors and perfect blacks, then sure, the deal's off. Those who have milked the XG2431 to the max (PW+PP+OD+QFT+headroom) remark that it's the clearest-motion LCD they've seen short of certain other specimens (e.g. Quest 2 VR LCD or Valve Index 2 VR LCD).

Typically, a detractor will use low brightness as an excuse to bash the monitor, when in reality the user has a choice of many strobe tuning adjustments that make it brighter/dimmer, as a tradeoff between motion clarity and strobe brightness.

Unlocking the refresh rate range of strobe tuning risks exposing lower-quality strobed Hz (e.g. 240Hz has more strobe crosstalk than 144Hz) -- some vendors lock strobe Hz, but Blur Busters unlocks it.

Unlocking the PWM range of strobe tuning risks exposing too-dim strobe settings, but it gives a wide brightness choice, from very bright strobing through very dim strobing, to help users find their preferred sweet spot.

About other quality attributes: I highly recommend the LG OLEDs when color/blacks are more important, and they use a form of rolling scan as part of their BFI, which mitigates flicker more. However, this does not dispute the fact that the top 1% cherry-picked LCDs, combined with improved tuning support, can achieve superlative motion clarity nowadays -- submillisecond MPRTs. Not everyone has the liberty of viewing thousands of LCDs to make honest callouts on what produces the best motion-clarity LCDs.

Reviewers who actually test the generic "PW+PP+OD" strobe tuning science I am talking about are now changing their review procedures (more Hz testing, and checking for strobe tuning capability). RTINGS is only one of them. Most users don't test strobing to such depths.

The generic strobe tuning standard -- the trio of adjustments (PW + PP + OD) -- is not widely implemented yet, nor is QFT, which helps even further; I would like to see all strobed-LCD manufacturers implement them. Not all manufacturers fully understand how to improve strobe quality.

(Image for glossary purposes -- strobe tuning with PW+PP+OD as well as QFT(Large VT))

Only monitors with all four adjustments can get noticeably better than LCDs without any of them. Only monitors with all four adjustments can get better motion resolution than most CRTs (especially with a large refresh rate headroom margin of 3:1 or greater).

Nobody has an exclusivity on strobing. Manufacturers should implement QFT concurrently with PW+PP+OD strobe tuning adjustment capability, in even better quality, for next-gen strobe quality. Also, one interesting behavior: PW+PP+OD has a much bigger effect at lower Hz than at max Hz, so people who play with these adjustments at max Hz will often conclude "meh -- why bother". But at lower Hz, the strobe tuning magic appears -- PW+PP+OD tuning really shines with large vertical totals.
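For a concrete mental model, here's a minimal sketch of that four-way adjustment space. This is not any vendor's actual API -- the field names and example values are my own illustration, loosely following the XG2431 adjustment counts quoted later in this post:

Code:
from dataclasses import dataclass

@dataclass
class StrobeTuning:
    pulse_width: int      # PW: strobe flash length (brightness vs. clarity)
    pulse_phase: int      # PP: where the flash lands relative to scanout
    overdrive_gain: int   # OD: GtG acceleration used while strobing
    vertical_total: int   # QFT Large VT: scanlines per refresh cycle

# Hypothetical example: a retuned 120 Hz QFT mode on a 240 Hz-class panel
my_tuning = StrobeTuning(pulse_width=10, pulse_phase=85,
                         overdrive_gain=60, vertical_total=2250)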

The FW900 is safe for a long time yet, if you're concerned about far more than just motion clarity. But that doesn't stop Blur Busters from trying to chip away at the line-items over the long term. I'm rather excited about the upcoming new high-Hz OLEDs (even if MPRT will probably temporarily take a slight backseat, while other attributes like near-zero GtG, colors, and blacks fill other checkboxes).

I'm looking forward to seeing more manufacturers implementing easier PW+PP+OD+QFT adjustments that are beneficial to LCD strobe. Reviewers are starting to notice over the years, and more of them are starting to educate end users about the existence of strobe tuning (whether ours or others) -- thanks to the trailblazing I do to lift all boats.

Only LCDs capable of retuning with 100%-range PW+PP+OD+QFT (either at the factory or by user retuning) can legitimately, easily exceed CRT motion resolution. A few LCDs, such as the Quest 2 VR LCD, qualify, while boilerplate scalers/TCONs may only allow the OEM to adjust 2-3 of the 4.

A common engineering problem with strobe backlights is that some scalers are unable to overlap the strobe partially over the VBI. This matters because of GtG lag: LCD GtG and scaler/TCON line-buffering often produce a tape-delay behavior, where it is necessary to turn on the strobe backlight in the middle of the signal VBI and turn it off shortly after the signal VBI -- real-world pixel response needs the strobe pulse aligned to a weird part of the refresh cycle.


(Non-QFT animation, but it illustrates the 100%-range Pulse Phase adjustment available in only a few panels, such as the 23.8" Innolux 240Hz IPS. Not all of BenQ's panels can do it either!)

Most backlight timing controllers can only keep pulses inside the refresh cycle, not overlapping refresh cycles. Panels containing that hardware limitation CANNOT go zero-crosstalk -- because they can't do what the above animation does.

Optimal tuning sometimes requires the pulse to partially overlap two refresh cycles in an optimal manner.

For example, microcontroller strobe-timing limitations that prevent 100%-range pulse phase will often produce unsolvable strobe crosstalk in QFT modes, because you can't position the strobe flash where you optimally need it to be (to hide real-world pixel response).
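A toy model of why that happens (all numbers hypothetical): when the optimal flash instant lands past the end of the refresh cycle, only a 100%-range, VBI-overlapping pulse phase control can reach it.

Code:
refresh_ms = 1000 / 120      # 120 Hz refresh cycle = 8.33 ms
scanout_ms = 1000 / 240      # QFT: refresh cycle scanned out in 4.17 ms
gtg_settle_ms = 5.0          # assumed worst-case GtG settling after the last line

flash_at = scanout_ms + gtg_settle_ms    # 9.17 ms into an 8.33 ms cycle
overlap = flash_at - refresh_ms
if overlap > 0:
    # A controller limited to "inside this refresh cycle" cannot reach this
    print(f"Flash must spill {overlap:.2f} ms into the next refresh cycle")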

Blur Busters had a meeting-room presentation to explain why the backlight timing controller needed to be modified to permit 100%-range VBI-overlapping strobe-phase control, removing a crosstalk-reduction limitation that other panels had.

I actually go onsite for this sort of stuff to do TestUFO-style powerpoints -- here's an image:

[Photo: on-site meeting presentation]


So, I really do have to do some on-site educating from time to time.

A crosstalk-free (top AND center AND bottom) strobe backlight requires the optimal-five conditions: the three settings (PW+PP+OD), custom timings (QFT), and enough refresh rate headroom (roughly 3:1, depending on panel).

Most reviewers don't tune to all these conditions, and they don't really need to -- because manufacturers should (but still don't) tune factory-inbuilt QFT modes concurrently with PW+PP+OD. A funny outlier is some newer LCD-based VR headsets, which had to do it out of sheer necessity (since motion clarity artifacts are nauseating in VR).

QFT looks like this at the signal level:
[Diagram: Quick Frame Transport / Large Vertical Total signal timing]

QFT = Large VT = hide more LCD GtG in VBI between refresh cycles = great reason to use refresh rate headroom with strobed LCDs, no matter what LCD technology -- it's generic laws of physics.

In the case of the XG2431 as a strobe tunability example, there are 100 strobe OD settings, 100 strobe PP settings, and 40 strobe PW settings, and over 3000 different vertical totals possible (QFT + Hz dependent), ignoring all the possible scan rates (I'm assuming you always use the 240Hz mode's max scan rate, regardless of which QFT / Large VT refresh rate you use).

100 x 100 x 40 x 3000 = literally over 1.2 billion strobe tuning possibilities in Blur Busters Approved 2.0, when you multiply these together to compensate for everything (Hz, temperature, panel variances, etc) and find your favourite zero-crosstalk mode.
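The multiplication checks out:

Code:
od, pp, pw = 100, 100, 40    # overdrive, pulse phase, pulse width levels
vts = 3000                   # usable vertical totals (QFT + Hz dependent)
print(f"{od * pp * pw * vts:,} combinations")   # 1,200,000,000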

Not everyone has the time to iterate through all possibilities, but I designed step-by-step instructions to get you pretty close to perfect for a given Hz (google "XG2431 strobe utility" as an example) in less than 5 minutes of work after the initial learning experience. Few manufacturers are willing to let users have such control.

Also, some reviewers have used Strobe Utility, but I haven't heard of a reviewer that uses Strobe Utility + QFT concurrently. (That's why I'm lobbying some manufacturers to add a "Quick Frame Transport: ON/OFF" setting -- Large VTs are also a method of reducing input lag, by transmitting frames faster over the video cable.) So you see forum posts getting better results than some of the reviewers did.

Mind you -- Twitter advocacy or HardForum posts do not help; Blur Busters had to actually visit headquarters and ally with manufacturers to effect real industry change.

Only people who call colorimeters a scam will call strobe tuning a scam. Not everyone wants to adjust their monitor to such professional detail -- I get it. This is all proven science that is being adopted by the industry, even by those not working with Blur Busters.

Blur Busters' work is kind of trailblazing a new internal industry standard in strobe tuning (one that can get superior to ULMB), as well as increasing reviewer coverage of DIY strobe tuning -- but it certainly doesn't solve LCD blacks or other shortcomings of LCD versus CRT.

Nonetheless, it illustrates how tough it was to successfully exceed CRT motion resolution with an LCD -- less than 1% of LCDs on the market can do it, and never at their max Hz. They certainly still fail other attributes (e.g. lack of CRT blacks, or comfort of low-Hz flicker), but the importance of those attributes varies from person to person.

It is not possible to satisfy everyone. Even CRT doesn't satisfy everyone -- some like the perfect geometry of an LCD and wish a specific CRT attribute were included -- so there is an overlapping Venn diagram that varies from human to human. As time progresses, LCD manages to overlap a bit more and more. The XG2431 covers more of the Venn diagram than the XG270 does, because of its support for Strobe Utility (creating >1 billion total combinations of strobe tunings when combined with functional VTs). If, say, the XG2431 is not good enough for you in attributes other than motion clarity, wait until more of your Venn diagram (colors, blacks, flicker sensitivity, tunability, brightness achieved at your preferred tunings, etc) is met by future technologies. Be pragmatic about the pros and cons, and your goals, and you're fine.

Fair is fair.
 
Last edited:
QFT on/off would be an instant buy from me. Especially since you could use it for consoles. Is there a thread over at your forums for getting QFT to work? I haven’t been able to get it working yet and I’d love to see a crosstalk-less 60hz.
 
QFT on/off would be an instant buy from me.
I'm working to do more internal lobbying to add a user-friendly QFT feature (plug-n-play QFT DisplayIDs / E-EDIDs). It's a long thankless slog for future models -- but stay tuned (pun intended).

Especially since you could use it for consoles. Is there a thread over at your forums for getting QFT to work? I haven’t been able to get it working yet and I’d love to see a crosstalk-less 60hz.
There's a QFT HOWTO thread on the Blur Busters Forums but I'll post a TL;DR for ToastyX CRU users:

TL;DR Quick Frame Transport (QFT) Mini-HOWTO
(Monitor-independent if the panel is at least undocumented-QFT-compatible; not XG2431-specific):
1. Start with your max-Hz mode in ToastyX CRU
2. Lock the radio button on the horizontal scan rate to keep it at the 240Hz scanout speed
3. Don't edit the refresh rate directly (only increase VT to decrease refresh rate). Increase the VT and let ToastyX autocompute a lower Hz. Keep increasing VT to lower the Hz.
Note: Some people increase the Back Porch, the Front Porch, or the VT box to increase the VT -- some monitors prefer a different method of increasing VT, since VT contains three different things (Vertical Back Porch, Vertical Sync, Vertical Front Porch). Most of the time, keep the Sync setting unchanged and increase either the Front Porch or the VT box (easiest) or the Back Porch (better, though some drivers hate it). This is sometimes better done in a CEA-861 extension block if supported, since you get much more VT adjustment range than with classic EDID blocks -- classic EDID doesn't have enough bits for unusually large porch numbers.
4. (Optional, but necessary for some modes) In some cases you may only be able to decrease down to a certain Hz (e.g. 64 Hz) before the screen blacks out, at which point you can move the radio button to lock the VT, and instead edit the refresh rate downwards while maintaining the last-working VT.
5. Now you have created a QFT mode with an unusually large blanking interval (also called "Large Vertical Totals")

Then you've got your QFT mode, such as a 60Hz fixed-Hz QFT mode that transmits refresh cycles over the video cable in 1/240sec (e.g. 1080p in an approximately 4500-scanline refresh cycle). A compatible horizontal-scanrate multisync scaler/TCON in an LCD (such as the Innolux 23.8" IPS) will scan out in sync, creating your requisite large VBI for more easily hiding LCD GtG. (A beneficial side effect: it reduces center-screen strobe lag too, because the strobe backlight can flash sooner -- LCD GtG begins sooner and finishes sooner, plus a little wait time depending on your configured strobe phase to hide more real-world GtG, and then the monitor flashes the backlight. You're still lower-lag than non-QFT 60Hz at the end of the day, while simultaneously having less strobe crosstalk due to more complete LCD GtG by the time the backlight is flashed.) A QFT 60Hz mode has an over-12-millisecond VBI (more than 10x longer than the manufacturer's 'claimed' 1ms GtG), a blanking interval 3x taller than the vertical resolution! A worked example of the arithmetic follows below.
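Here's that 60Hz example as a minimal sketch (the porch/sync numbers are hypothetical -- substitute your own base mode from CRU):

Code:
# Hypothetical 1080p base timing -- substitute your monitor's real numbers:
active, vsync, back_porch, front_porch = 1080, 5, 36, 4
base_vt = active + front_porch + vsync + back_porch     # 1125 lines
h_scanrate = 240 * base_vt                              # 270,000 lines/sec

target_hz = 60
target_vt = round(h_scanrate / target_hz)               # 4500 lines
new_front_porch = target_vt - active - vsync - back_porch

vbi_lines = target_vt - active                          # 3420 blank lines
vbi_ms = vbi_lines / h_scanrate * 1000                  # ~12.7 ms
print(f"60 Hz QFT: VT {target_vt}, front porch {new_front_porch}, "
      f"VBI {vbi_lines} lines (~{vbi_ms:.1f} ms)")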

Note: QFT behavior on CRTs varies a lot. Gigantically large VBIs (more than 50% of vertical resolution) usually do not work well on most CRTs, because large VBIs (relative to the active image) cause the CRT image to become vertically compressed beyond the range of the vertical-height adjustment to compensate.

EDIT:
UPDATE.... This HOWTO is now semi-obsolete!
- ToastyX on September 2nd, 2022 just added Quick Frame Transport to ToastyX CRU, and created a new thread on the Blur Busters Forum about this feature. This should make QFT experimenting much easier!
 
Last edited:
Or hell - a rear projection laser display, which literally scans out the image like a CRT did - but with lasers. That would do it too.
Still not sure why this hasn't been done yet. With the tech we have nowadays, I understand DLP, LCD, and LCoS are cheaper and more practical. But why wouldn't a CRT-like laser projector simply be superior? I'd help crowdfund a short-throw laser projector that formed the image with scan lines. Understandably, the whole bending of the laser beam to produce the same effect as an electron beam needs to be worked out, but piezoelectric tech has come a long way and might be good for this application. Something like how they made the fan in this video work, maybe? No idea really, but that's why engineers get paid all that money, right?
 
Still not sure why this hasn't been done yet. With the tech we have nowadays, I understand DLP, LCD, and LCoS are cheaper and more practical. But why wouldn't a CRT-like laser projector simply be superior? I'd help crowdfund a short-throw laser projector that formed the image with scan lines. Understandably, the whole bending of the laser beam to produce the same effect as an electron beam needs to be worked out, but piezoelectric tech has come a long way and might be good for this application. Something like how they made the fan in this video work, maybe? No idea really, but that's why engineers get paid all that money, right?
Something like this might work, but just because it might work is not a good reason to mass-produce it.
Issues I can imagine, without even seeing such a display:
- flicker, which would be much harsher than CRT
- it would need pretty strong lasers, since the laser's power is spread over a large screen surface
- having (quite powerful) lasers would make the screen dangerous for humans, e.g. when front-glass integrity is compromised
- in case of failure, the full power of our powerful laser might go into one spot, increasing fire risk (and melting someone's eyes)
- complexity/cost of manufacturing
- susceptibility to failure

And then you look at the products which sold: multiple frames of input lag. If motion clarity were more important to people, we would have better motion clarity. It is only very important to a small number of people, and even for some of them it is better to have a sample & hold display.
 
So I have a rather pointless and dumb tinker-quest I've been on lately. I made this post on Blurbusters (shameless plug) a little while ago, and they didn't seem to think it made sense or would work, but I figured I might as well run it by you fine, fine people and see what you think.

I play a lot of console-based video games that don't have solid or any emulation alternatives, such as those on Nintendo Switch and PS4, and they are stuck on those consoles at 30fps. I understand these are doubled to meet standard 60hz output timings, and this creates double-image artifacts on our beloved FW900 and other CRTs.

The only 'practical' thing I can think of to solve this issue is to introduce BFI artificially, as some sort of post-processing option, but I'm not aware of a video processing box that offers it as a feature, and even if it did, BFI on a 60hz signal that already has natural CRT flickering is pretty awful to my eyeballs. It did occur to me later that some video devices can do framerate conversion similar to what game consoles and movie players already do, such that a 60hz input signal can be output as 120hz simply by doubling up each frame. It was also suggested to me that I could use the flip flop logic gates often sourced for retro scanline generators by blanking out alternating horizontal lines in rhythm with horizontal sync, but instead use vertical sync to blank out alternating frames. In theory, this could work to get rid of 60hz flicker for 60fps games (essentially 120hz BFI?), but it also raises the issue of an original 30fps source having its frames QUADRUPLED, and then cut in half again for the black frame insertion. So each frame would be on-screen for half as long as the original 60hz source signal, being interrupted twice as often by total blackness on our blessed FW900. The circuit stuff is a little over my head, so all I've gotten so far in my tinkering is a darkened image and a wavy vertical line on my test CRT, haha, but I'm still trying.

Anyway, just curious if you guys thought this is even worth pursuing or is just stupid. With a basic ADC, this could still be useful for those who have high refresh LCD screens that didn't come with BFI as an option, though obviously an all-digital solution would probably be easier in that situation.
 
So I have a rather pointless and dumb tinker-quest I've been on lately. I made this post on Blurbusters (shameless plug) a little while ago, and they didn't seem to think it made sense or would work, but I figured I might as well run it by you fine, fine people and see what you think.

I play a lot of console-based video games that don't have solid or any emulation alternatives, such as those on Nintendo Switch and PS4, and they are stuck on those consoles at 30fps. I understand these are doubled to meet standard 60hz output timings, and this creates double-image artifacts on our beloved FW900 and other CRTs.

The only 'practical' thing I can think of to solve this issue is to introduce BFI artificially, as some sort of post-processing option, but I'm not aware of a video processing box that offers it as a feature, and even if it did, BFI on a 60hz signal that already has natural CRT flickering is pretty awful to my eyeballs. It did occur to me later that some video devices can do framerate conversion similar to what game consoles and movie players already do, such that a 60hz input signal can be output as 120hz simply by doubling up each frame. It was also suggested to me that I could use the flip flop logic gates often sourced for retro scanline generators by blanking out alternating horizontal lines in rhythm with horizontal sync, but instead use vertical sync to blank out alternating frames. In theory, this could work to get rid of 60hz flicker for 60fps games (essentially 120hz BFI?), but it also raises the issue of an original 30fps source having its frames QUADRUPLED, and then cut in half again for the black frame insertion. So each frame would be on-screen for half as long as the original 60hz source signal, being interrupted twice as often by total blackness on our blessed FW900. The circuit stuff is a little over my head, so all I've gotten so far in my tinkering is a darkened image and a wavy vertical line on my test CRT, haha, but I'm still trying.

Anyway, just curious if you guys thought this is even worth pursuing or is just stupid. With a basic ADC, this could still be useful for those who have high refresh LCD screens that didn't come with BFI as an option, though obviously an all-digital solution would probably be easier in that situation.
It reduces flicker yes, but it introduces a duplicate image effect, so it isn't a Holy Grail.

Multi-strobe definitely creates duplicate images. This is confirmed:

[Diagram: image duplicates produced by multi-strobing]


1 - Motion blur is proportional to pixel visibility time. For squarewave strobing at framerate=Hz, motion blur = pulse width. For non-strobed, motion blur = frametime.
Educational Animation Demo Link:
TestUFO Variable Persistence BFI
(*IMPORTANT: Not for strobed displays or CRTs; run this on a sample-and-hold display of 120Hz or higher. If you only have 60 Hz, it will be super-flickery)

2 - To avoid duplicate image artifacts, flicker must be contiguous for a unique frame, to avoid duplicate images.
Educational Animation Demo Link: TestUFO Black Frame Insertion With Duplicate Images
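To put example numbers on points 1 and 2 above (the motion speed and pulse widths here are illustrative assumptions, not measurements):

Code:
speed = 960              # assumed motion speed in pixels/sec (example only)

# Point 2: 30 fps content double-strobed at 60 Hz flashes each unique
# frame twice, 1/60 sec apart -> a visible duplicate image offset by:
print(speed / 60, "px between duplicate images")        # 16.0 px

# Point 1: for framerate=Hz strobing, motion blur equals the pulse width:
for pulse_ms in (0.3, 1.0, 4.0):    # VR-LCD-class, strobed LCD, OLED BFI
    print(f"{pulse_ms} ms pulse -> {speed * pulse_ms / 1000:.2f} px of blur")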

From the "Research" tab on the Blur Busters website, I have provided these two images as a reference how to get identical display motion blur on a strobed display versus impulsed display. (Important: CRTs impulse so briefly that no commercially available unstrobed sample-and-hold display can yet match the motion blur of CRT).

[Two reference charts: achieving identical motion blur on strobed vs. impulsed displays]


This is all confirmed and now known (for a decade) display motion blur science.

This is even demonstrable in software-based BFI -- just click the links above. The more refresh rate the better, because BFI persistence can only be simulated in refresh-cycle increments. So if your display is 144Hz, the persistence control of software-based BFI can only occur in 1/144sec increments. The TestUFO animations become more educational the higher in refresh rate you go.
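A quick sketch of those refresh-cycle increments -- the achievable persistence values for 60fps content at a few display refresh rates:

Code:
def persistence_choices_ms(display_hz, content_fps=60):
    # Software BFI shows each unique frame for 1..N whole refresh cycles
    visible_cycles = display_hz // content_fps
    return [round(1000 * n / display_hz, 2) for n in range(1, visible_cycles + 1)]

for hz in (120, 144, 240, 360):
    print(hz, "Hz ->", persistence_choices_ms(hz), "ms persistence steps")
# 120 Hz -> [8.33, 16.67]
# 144 Hz -> [6.94, 13.89]
# 240 Hz -> [4.17, 8.33, 12.5, 16.67]
# 360 Hz -> [2.78, 5.56, 8.33, 11.11, 13.89, 16.67]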

Blur-free 60fps absolutely, unavoidably has to modulate light output (with a contiguous light-output peak per unique image, aka frame). So because you're stuck with 60 light-output peaks (flickers) per second, you have to use various means of mitigation, as follows:

How Do You Fix 60 Hz Flicker As Much As Possible? (aka How To Simulate a CRT Better)

The main 60 Hz flicker workarounds are:
(A) Soften the leading edge and/or trailing edge. CRTs have phosphor decay, so the trailing edge is softened slightly (less harsh flicker).
(B) Make sure photons are continually hitting human eyes by using a rolling strobe (like a CRT) to soften the leading and trailing edges of the flicker, at the cost of a bit more blur/ghosting/phosphor-decay effect. That's why I'm a fan of future CRT simulators.

Note that there are other factors (brightness, image size, viewing distance, ambient lighting, flicker sensitivities between humans, etc), but the above is what CRTs naturally did: a rolling flicker with a decay effect. This is what makes 60 Hz CRT flicker a lot more tolerable than 60 Hz squarewave. Sensitivities vary a lot -- there are people who can't stand a 60 Hz CRT, and on the opposite side of the spectrum, there are people who tolerate 60 Hz global strobing.

Some displays do (B), like the LG OLED rolling BFI, but that doesn't fix (A). So the current (2017-2022) LG OLED rolling strobe is less flickery but has way more motion blur (over 10x more) than the best strobed LCDs. Many love the LG OLED (as do I) as a compromise, however. But it is not (yet) a Holy Grail.

That's why I am a big fan of future CRT electron beam simulators (a rolling strobe with fadebehind): once you have enough brute refresh rate, you can simulate at finer granularities. The higher the refresh rate, the shorter-persistence a CRT tube can be simulated (since simulated persistence comes in refreshtime increments). If only the LG OLEDs could do 1000Hz+....
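As a rough illustration of what such an electron-beam simulator does per output refresh -- a toy sketch, not Blur Busters' actual algorithm; the refresh rates and decay constant are assumptions, and it ignores the few leftover rows from 1080/16:

Code:
import numpy as np

OUT_HZ, IN_HZ, HEIGHT = 960, 60, 1080
SLICES = OUT_HZ // IN_HZ     # 16 output refreshes per 60 Hz input frame
DECAY = 0.35                 # assumed per-refresh phosphor-fade multiplier

def beam_masks():
    """Per-output-refresh brightness multiplier for every scanline row."""
    band = HEIGHT // SLICES          # rows freshly 'scanned' each refresh
    row_gain = np.zeros(HEIGHT)
    masks = np.empty((SLICES, HEIGHT))
    for s in range(SLICES):
        row_gain *= DECAY                         # fadebehind of older rows
        row_gain[s * band:(s + 1) * band] = 1.0   # beam passes this band
        masks[s] = row_gain
    return masks   # multiply each 1/960 sec output frame's rows by masks[s]

print(beam_masks().shape)    # (16, 1080)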

Long-term, I'm interested in seeing an open-source Windows Indirect Display Driver exist (based on that Microsoft sample) to run various kinds of shader algorithms such as:
- software-based BFI (for 120Hz displays)
- rolling-scan simulators (for ultrahigh Hz displays, 240Hz+ OLED or 360Hz+ LCD)
- software simulated VRR (like testufo.com/vrr -- algorithm only works on sample and hold display; doesn't work on CRT)
- software-based LCD overdrive superior to manufacturer overdrive (e.g. allow in-between overdrive settings, and/or select different overdrive curves).
- etc.

An ADC isn't critical -- to omit a video processor box or ADC, a simple GPU shader running in a modified Microsoft-sample Windows Indirect Display Driver, committed to GitHub, would do the job.
 
Last edited: