24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Just compiling a few more unique-looking HDMI-to-VGA adapters that have not been community-tested. The ones that aren't that *one* specific, widely-used little 'dongle'-style design are always intriguing to me...
Also worth noting that some of these are quite new and could have newer chipsets we are not aware of.

~~~~~~~~~~~~~~~~~~~~

https://www.newegg.com/p/2VR-001J-00020?Item=9SIA7253E98173
^ audio breaks out in a diff. location than other dongle ones

https://www.newegg.com/p/2VR-00JS-00001?Item=9SIA57ZB721964

https://www.newegg.com/p/2VR-001J-00042?Item=9SIA7254MW7685
^ different shape and PSU connector than other dongle ones

https://www.globalmediapro.com/dp/A2K8V6/ASK-HDCN0012M1-HDMI-to-HDMIVGAAudio-Decoder/

https://www.newegg.com/bytecc-hm201/p/N82E16812270601?Item=N82E16812270601
^ has a very different chipset than all others, see my previous post

https://www.newegg.com/p/2VR-00EB-00004?Item=9SIB7DVJ3W0798

https://www.newegg.com/p/2VR-065A-00DR4?Item=9SIAP9WHHX8449

https://www.newegg.com/p/2S7-04RG-000M5?Item=9SIA441ADU5326
^ this is an older design that's become uncommon, probably Lontium LT8511

https://www.newegg.com/p/2VR-001J-00048?Item=9SIA7254ZG6324
^ this one is VERY intriguing to me

https://www.newegg.com/c2g-40714/p/N82E16886970009?Item=9SIB18AFEP3665

https://www.newegg.com/p/1YN-00N6-000H9?Item=9SIAYWHEX04752
^ have seen this one for cheaper on Amazon

https://www.newegg.com/p/1B4-09RP-00R59?Item=9SIAWNEECX6682
^ I've seen ones similar to this on AliExpress for very, very cheap

https://www.newegg.com/black-ugreen...port-mal/p/1DG-0038-00069?Item=9SIA50M5PR2599
^ super-weird UGreen... a bit on the older side.

https://www.aliexpress.com/item/2251801509268091.html
^ HDFury 3 clone...? Maybe, maybe not

~~~~~~~~~~~~~~~~~~~~

If anyone takes the plunge and tries any of these for science, that would be awesome! You never know when we'll find another HDMI DAC chipset that can pump out ~400MHz like the LK7112.
Many of those links are missing the item name. I think it's better if you also write the item names outside the links, so if they go out of stock they can be searched for elsewhere ;)
 
Go out, have a latte, admire nature, play a game, or whatever. Readers can come to their own judgements.
Please excuse me for disrupting your misinformation-oriented marketing strategy, but readers like me, and readers in general, deserve to be informed correctly, transparently and ethically, so we can come to our own judgements. It's evident you wanted to fool readers into believing that CRT monitors have no technical advantage over your ViewSonic, even in the motion quality regard; again misinforming them with claims like CRTs having a "bad", merely "nostalgic" 60Hz flicker, being so "dim" they can only reach 50 nits, and being way smaller than today's "huge" monitors, all to justify the bad 60Hz flicker of the XG2431, as already mentioned in earlier comments.

Again: considering myself a CRT monitor enthusiast, especially for their motion quality, and having the right to give my personal judgement in this forum thread, which is in fact CRT-monitor related, and based on these statements from Blur Busters in their Monitor Certification Program:

"people who prefer CRT will be pleased to know that a Blur Busters Approved monitor, strobed at a Hz well below maximum, can produce a motion experience superior to a CRT."
"ViewSonic XG270 is superior to a Sony FW900 CRT"

As a person who prefers CRTs, I feel addressed by this and express my disagreement: I can hardly see any Blur Busters-certified ViewSonic monitor providing a "superior motion experience" over a CRT, any CRT, not just the FW900.


I cannot overstate how massively things have improved since the LightBoost days -- many will unscientifically & falsely claim otherwise

This is an FAQ taken from a nine-year-old Blur Busters article about LightBoost pros and cons, here:

View attachment 507886
Now compare that with the reality of the XG2431's cons:
- Reduced brightness: I see the same occurring with the XG2431 in its PureXP Ultra strobe mode, or in the custom mode that matches CRT clarity, as seen in one of its reviews.
- Degradation of color quality: even though the XG2431 is an IPS, I can also see in that review that the low brightness makes it lose color vividness when using its best CRT-motion-matching mode.
- Flicker: still more intense flicker than a CRT at the same refresh rate, also confirmed to be more intense than CRTs by XG2431 users in its 60Hz mode.
- Requires a powerful GPU to get full benefits: just like LightBoost. XG2431 users have confirmed it performs best at 120Hz, and even you, in the comments section of an XG2431 video review by a different reviewer, suggest such a high refresh rate for that:
View attachment 507887
So, after reading those facts, I personally find it hard to agree with the "how massively things have improved since LightBoost" comment.



Typically, a detractor will use low brightness as an excuse to bash the monitor, when in reality the user has a choice of many strobe tuning adjustments that make it brighter or dimmer, as a tradeoff between motion clarity and strobe brightness.

as the word "tradeoff " stands, user will need to sacrifice motion clarity for brightness, honesty those strobe tuning adjustments doesnt make sence if also those degrade the motion quality out from the advertised CRT quality motion range.



While certain well-tuned strobed LCDs have long since passed CRTs in motion resolution, if you're an all-checkboxes CRT person, CRTs are safe for quite a while. Sure.
The FW900 is safe for a long time yet if you're concerned about far more than just motion clarity.
Strongly disagree here.
I have been using CRT monitors for years, currently an FW900 and an ordinary Compaq 7550 17-inch CRT. After witnessing all the flaws the XG2431 suffers from in its motion quality, flaws that also affect its motion experience (more intense flicker than a CRT at the same frequency, crosstalk, much lower brightness than a CRT at matching motion quality, much higher FPS requirements), a lot of tradeoffs I don't see in CRT monitors, I can safely say the "concerns" go further than just the checkboxes you mention, like
blacks, texture, resolution independence, softer flicker for a given Hz...

First, there is a myth about the FW900: after comparing it to an ordinary CRT like the Compaq 7550 (which supports refresh rates from 48Hz to 140Hz), their motion behaves the same, same flicker perception, both bright enough to be used in a room with moderate natural light, or at night under a light bulb, with still-good luminance and color vividness, and 100% crosstalk-free at any refresh rate they support.

Now, again:
- CRT monitors, any CRT monitor, not just the FW900, are 100% free of the motion-experience-ruining crosstalk artifact at any refresh rate. The XG2431, as you said, seems only able to "reduce" its crosstalk at refresh rates as high as 120Hz (so requiring a powerful system to hold a constant 120 FPS), and so far it seems the XG2431 cannot be crosstalk-free at 60Hz, from what XG2431 users have seen, even with tweaks and testing on different computers.
I have even been reading arguments from you justifying the monitor's inability to remove crosstalk, with arguments like "panel temperature / lottery / variances, due to winter or whatever season, cold room, etc.", so unless it is proven with real facts, not just lengthy words, whether the XG2431 can really be "zero* strobe crosstalk" even with tweaks at 60Hz remains pure marketing BS. By the way, CRT monitors are 100% crosstalk-free at any refresh rate, no matter whether it's cold or hot, whatever season it is.

Let's assume it's true that the XG2431 can improve 60Hz crosstalk with tweaks, what about 60Hz console usage? The user cannot apply those tweaks on a console, so there's unavoidable crosstalk, much lower brightness and more intense flicker on the XG2431, whereas a CRT would provide 100% crosstalk-free, bright, less intensely flickering 60Hz console gaming, since it is currently possible to successfully connect a 60Hz console to a CRT via an external adapter.

- CRT monitors have a softer flicker, especially useful at 60Hz, which makes the 60Hz motion experience more enjoyable than on the XG2431.

- CRT monitors can achieve better brightness and more vivid colors without sacrificing motion quality.

So in brief: even in the motion quality aspect, CRTs give a better motion experience thanks to being 100% free of the crosstalk motion artifact, and they are brighter without sacrificing colors or motion clarity at any refresh rate the user wants, especially at the popular and still widely used 60Hz, where the XG2431, compared to any CRT monitor in its CRT-matching motion-quality mode, sacrifices brightness and colors and suffers more intense flicker and crosstalk, all aspects that affect the overall motion experience. And this, again, is without taking into account the other checkboxes mentioned.



The fact is, somebody is trying to discredit me by falsely claiming that LCDs are unable to exceed the motion resolution of a CRT.

But we actually managed to beat CRT motion blur while simultaneously having zero* strobe crosstalk (*only after QFT + Strobe Utility recalibration) -- less motion blur because of the lack of phosphor decay.

I have strongly disagreed with you about ViewSonics providing a "superior motion experience" compared to a CRT monitor, and I also find the words "exceed" and "less motion blur" rather exaggerated.

I see you tend to use CRT phosphor decay as the only argument to make CRTs look "beaten" in the motion quality regard. To my eyes, all the CRT monitors I have seen produce a faint, fast-vanishing decay that is barely noticeable, and only in certain situations such as high-contrast moving scenes; in many other scenarios I don't notice it at all. It is so insignificant that I frequently forget it even exists. Also, the motion clarity, or "motion resolution" or whatever it is called, of the moving object is not affected, since the decay appears at its side, so the moving object remains lifelike motion-clear.
If phosphor decay were as bad as the XG2431's motion flaws (again, without mentioning the other "checkboxes"), this thread would not have been active for so long. How many CRT users do you see complaining about phosphor decay, by the way?
 
(Before I begin, it's worth re-reading my earlier rebuttal to 3dfan's hypocritical post.)
This is an FAQ taken from a nine-year-old Blur Busters article about LightBoost pros and cons, here:
View attachment 507886
That's correct. I have repeated these facts in the XG2431 Strobe Utility HOWTO.
Degradation of brightness also does not necessarily mean degradation of color gamut -- this is a matter of interpretation.
LightBoost intentionally clipped the color gamut, while the XG2431 preserves its color gamut (it only loses brightness) in PureXP+.

Now compare that with the reality of the XG2431's cons:
- Reduced brightness: I see the same occurring with the XG2431 in its PureXP Ultra strobe mode, or in the custom mode that matches CRT clarity, as seen in one of its reviews.
- Degradation of color quality: even though the XG2431 is an IPS, I can also see in that review that the low brightness makes it lose color vividness when using its best CRT-motion-matching mode.
- Flicker: still more intense flicker than a CRT at the same refresh rate, also confirmed to be more intense than CRTs by XG2431 users in its 60Hz mode.
- Requires a powerful GPU to get full benefits: just like LightBoost. XG2431 users have confirmed it performs best at 120Hz, and even you, in the comments section of an XG2431 video review by a different reviewer, suggest such a high refresh rate for that:
View attachment 507887
So, after reading those facts, I personally find it hard to agree with the "how massively things have improved since LightBoost" comment.
These are correct. Strobing necessarily has some tradeoffs, but they are now much more minor -- e.g. color gamut.

I don't think you ever directly compared LightBoost with an XG2431?
Also, most reviewers don't compare LightBoost directly with the XG2431.
However, every single user who has spent lots of time with LightBoost almost unanimously says the XG2431 is superior after a Strobe Utility recalibration.

LightBoost only supported 100Hz and 120Hz, and has more strobe crosstalk than a QFT-calibrated XG2431.
If you compare the contrast ratio of LightBoost to that of the XG2431, the XG2431's strobed contrast ratio is greatly superior.

As the word "tradeoff " stands, user will need to sacrifice motion clarity for brightness, honesty those strobe tuning adjustments doesnt make sence if also those degrade the motion quality out from the advertised CRT quality motion range.

Strongly disagree here.
I have been using CRT monitors for years, currently an FW900 and an ordinary Compaq 7550 17-inch CRT. After witnessing all the flaws the XG2431 suffers from in its motion quality, flaws that also affect its motion experience (more intense flicker than a CRT at the same frequency, crosstalk, much lower brightness than a CRT at matching motion quality, much higher FPS requirements), a lot of tradeoffs I don't see in CRT monitors, I can safely say the "concerns" go further than just the checkboxes you mention, like
blacks, texture, resolution independence, softer flicker for a given Hz...

I'm talking about motion clarity, which can indeed be superior after tuning.
Motion clarity doesn't have anything to do with blacks, texture, resolution independence, or softer flicker.

You're simply putting words in my mouth by conflating "motion clarity" with "motion quality". Everybody's definition of "motion quality" is different (subjective), while "motion clarity" is quantitative.
I have not lied about motion clarity.
I have been honest about motion clarity.

With Strobe Utility, you can create a mode much worse than LightBoost, and you can create a mode much better than LightBoost. If you follow the instructions like many users do, you can get crosstalk-free modes (top-to-bottom) like this from the Pursuit Camera Forum:

1662599176440.png


(Smartphone camera exposure of 1 strobe of testufo.com/crosstalk, from XG2431 Pursuit Camera Forum in the Blur Busters Forums, page 3).

This is not the only one, of course.

You can maintain this top/center/bottom crosstalk-freeness (to >GtG99%, not only GtG90%) up to roughly ~100Hz with the maximal QFT timings, but you need to do all five tunings (Pulse Width + Pulse Phase + Overdrive Gain + Hz headroom + QFT) in order to pull off this crosstalk-free behavior.

It even retains the same color gamut as non-strobed operation, without the gamut-squeezing (the mandatory "16-235"-lookalike behavior of LightBoost, which it does to create LCD overdrive headroom above white and below black, to reduce crosstalk on early LCDs). That is no longer necessary if you utilize QFT instead. Look at the old LightBoost reverse-engineering article for why it degraded color quality in ways that the XG2431 does not.

LightBoost was literally a 1 or 2 (pick a number) out of 10 in color quality.
Whereas XG2431 would be closer to a 6 or 7 (pick a number) out of 10 in color quality. Some say 10, some say 5, or whatever -- but regardless, it is many times better than LightBoost.
The point is, it's not a binary, black-and-white matter.

It is now improved to the point where the main color-gamut problem is blacks -- e.g. the XG2431 is now able to reach a similar color gamut to a "black-level-raised" FW900 (i.e. CRT blacks intentionally brightened to match LCD blacks, and contrast intentionally reduced to 1000:1, while maintaining the best-possible CRT colors). That is something LightBoost could not do, as its color rendition was extremely unsaturated. You wouldn't gimp a CRT like this, but it illustrates how minimal the color-quality degradation has become on the XG2431 compared to LightBoost.
Anyone who directly compares LightBoost vs the XG2431 confirms it's massively better than LightBoost.

Most people and reviewers only master 3 or 4 out of the 5, which still gets better than LightBoost. If you master all 5, it becomes vastly better than LightBoost.

In addition to this, with the right modes, you can still beat CRT motion resolution (aka motion clarity), being able to read tiny 6-point street-label text at www.testufo.com/map#pps=3000 at 3000 pixels/sec during 1920x1080p at 1:1 pixel mapping and 100% OS DPI -- no browser zoom -- while still having triple-digit nits. A Sony FW900 can't achieve that motion resolution in that specific TestUFO test, for example -- the text gets slightly fuzzier at 3000 pixels/sec than stationary if you stick to 1:1 pixel mapping (100% OS DPI, browser unzoomed). Whereas at some pulse widths on the XG2431, the clarity is identical LCD-square-pixel sharpness between stationary and moving. Shorter pulse widths may not be important for VGA emulator users, but they are important for higher-resolution PC games -- like the "LightBoost 10%" lovers of yesteryear (the XG2431 PW adjustment range is much wider -- it can go much shorter or much longer than the LightBoost% adjustment).

There will be those who prefer FW900 and those who prefer XG2431 depending on their priorities.

Motion clarity (only the motion resolution) isn't as important to everyone as motion quality that includes all attributes (blacks, texture, etc). Everybody's priorities are different, and I haven't misrepresented that.
When people say "motion quality" (which some translations render as specifically only "motion clarity"), it can be misunderstood in some cultures/languages to include all attributes (that might be part of your beef, 3dfan) -- so the clarification (pun intended) would be "motion clarity".

Only a few displays (e.g. Quest 2) do all 5 automatically, darn near perfectly, so it's quite unfortunate that a lot of manual work is needed to make the XG2431 even better. However, those who did the work (while also being familiar with LightBoost) universally say it's vastly superior to LightBoost, with much less lag as well (LightBoost buffers the refresh cycle for processing) -- 1-2 refresh cycles less input lag.

The bottom line is everybody's sensitivities and priorities are different. If your priority is CRT motion clarity (and only that line item) in a desktop-LCD panel, it's already very hard to beat the XG2431 now (at least during 2021-2022 nothing beat it).

Also, I should point out your earlier post, 3dfan, which I have already replied to and rebutted.

NOTE: Edited to add more links, including the LightBoost reverse-engineering link, and corrections to typos/terminology.
 
as the word "tradeoff " stands, user will need to sacrifice motion clarity for brightness, honesty those strobe tuning adjustments doesnt make sence if also those degrade the motion quality out from the advertised CRT quality motion range.
You're also forgetting that the tradeoffs have a positive edge to the double-edged sword too (PW + PP + OD + QFT + Hz headroom). For example, you couldn't do that on an XG270, but you now can with the XG2431.

Strobe Utility can also get you brighter at the same crosstalk.

For example, adding OD retuning and adding QFT to get more brightness at the same strobe crosstalk as a factory PureXP mode (e.g. 120Hz).

QFT makes Strobe Utility tuning mandatory, as the factory strobe tuning profiles were tuned only for the plug-and-play EDID modes (all VESA timing formulas are non-QFT, unfortunately).

With QFT, some users were able to more than double strobe brightness while maintaining the same strobe crosstalk as the factory 120Hz mode. In other words, brighter than LightBoost while still having less strobe crosstalk than LightBoost 120Hz, and with a better color gamut (darn near 100% NTSC for any color above LCD black) than LightBoost ever had.

If you're a motion-clarity videophile rather than a colors videophile, learning Strobe Utility is an asset for a wide variety of purposes:
- Reduce strobe crosstalk while keeping same brightness
- Or get brighter while maintaining same strobe crosstalk
- Or prioritize zero strobe crosstalk
- Or hybrid compromise (still brighter than LightBoost + less crosstalk than LightBoost)
- Or reduce strobe lag even further (retuned QFT mode with early-as-possible strobe timing to your preferred crosstalk tolerance)
- Etc.

Determined power users often spend more time studying / learning / adjusting Strobe Utility than a reviewer does (who may be paid only a flat rate per monitor and have to work fast on higher priority tests).

This is no different from becoming a super-expert at a colorimeter. Many print companies also spend more time with a colorimeter than a monitor reviewer does, because of the criticality of matching print to screen.

I honestly wish all of this were more plug-and-play so people could get maximum strobe quality, but panel variances + manufacturers sticking to VESA formulas (which are still all non-QFT) + temperature variances (cold rooms in winter vs hot rooms in summer = different LCD ghosting) + other factors get in the way.

But Strobe Utility is no harder for strobe fans to learn than the CRT convergence factory menus are for a CRT fan (e.g. 25-point convergence in a CRT projector).

I used to own an NEC XG135 CRT projector and could match ISF calibration myself. Things like focus, astigmatism, keystone, bow, pincushion, linearity, zoned convergence, etc. (not necessarily in that order -- it's been a long time, mind you). Strobe fans wanted more tuning control, and I provided that capability. While the factory modes may be meh to a person like you (3dfan), the XG2431 shines when you understand the tradeoffs properly. When you have many adjustments (via Utility and Timings), like the 5 (PW + PP + OD + QFT + Hz headroom options), you have complex interactions between adjustments, just like you did with all the zillions of adjustments on a CRT tube. Once you become an expert on the tradeoffs, you can more easily achieve higher-quality tradeoffs than you could with fewer adjustments.

Thinking ahead -- eventually, it would be nice to develop a Strobe Utility Wizard that autotunes (via an attached photodiode) based on your selected goals. The same photodiodes used in NVIDIA LDAT (and the like) could be commandeered as GtG/crosstalk sensors, if it is possible to get sufficient API access before LCDs sunset (giving way to tech such as ultra-high-Hz OLED/MicroLED).

The bottom line is that you've got a cornucopia of CRT adjustments, and some panels now provide a cornucopia of strobe adjustments.

Ultimately, this talk could ideally be forked off to its own thread.
 
A whole bunch of cool stuff including tests and whatnot

I remember seeing a lot of this on your site, but thanks for posting it again here to refresh my memory! It's pretty cool to hear from Mr. Chief Blur Buster himself! 😎

For some reason (custom resolutions?), my FW900/Windows 10 PC does NOT like the various tests on there... Sometimes they work fine, and other times it can't seem to tell what refresh my monitor actually is (usually showing about half). Not sure what I'm doing wrong, but they're definitely handy since I can just intuit the rest anyway.

An ADC isn't critical -- to omit a video processor box or ADC, a simple GPU shader running in a modified Microsoft-sample Windows Indirect Display Driver, committed to GitHub, would do the job.

I have absolutely no idea how I would go about doing that, but am willing to learn and tinker if you can point me in the right direction. My original idea was for some type of box in order to keep things simple for me because I'm an idiot, but I'm quickly realizing this isn't going to be a simple project...

Do you (or anyone here, really) have any suggestions for 30fps content at 60hz, particularly on the FW900? I'm not having a bad experience with it at all, but if there's a fairly simple way to try and smooth out or reduce the duplication artifacts I will at least humor and check it out. I was suggested to tinker with things like a capture card and PC software such as Smooth Video Project or AviSynth to stream gameplay (to myself) and apply video filters/post-processing in "real time." I know the latency introduced here will likely make any game unplayable, but it was mostly as a proof of concept.

Anyways, thanks for your time and your elaborate answer. :)
 
Do you (or anyone here, really) have any suggestions for 30fps content at 60hz, particularly on the FW900? I'm not having a bad experience with it at all, but if there's a fairly simple way to try and smooth out or reduce the duplication artifacts I will at least humor and check it out. I was suggested to tinker with things like a capture card and PC software such as Smooth Video Project or AviSynth to stream gameplay (to myself) and apply video filters/post-processing in "real time." I know the latency introduced here will likely make any game unplayable, but it was mostly as a proof of concept.

Anyways, thanks for your time and your elaborate answer. :)
Real-time smoothing with good latency can be done, but not with any reasonable amount of latency using existing technologies developed for video. Streaming itself adds latency, let alone running AviSynth on top, and even more so programs designed for video where latency was no consideration at all (you can always delay audio if that becomes an issue).

IMHO you can play with it by making a recording at 30fps and re-encoding (or watching) that video at 60fps, to see what kinds of artifacts are visible and which settings reduce them. For real-time smoothing you should, IMHO, look elsewhere.

BTW, the best smoothing latency-wise I have used was the Oculus Rift. There are artifacts, but at times it is not obvious at all that you are running the game at 45fps vs 90fps. What surprised me the most was that this 45→90fps smoothing was there when I played Doom (2016) through some program that runs any game in VR, which for this game could only produce a 2D image, hence no stereoscopic 3D. Still, at 45fps the game was smooth and responsive, with smoothing artifacts. If something like this could run any game on any display with this level of latency, it would IMHO be very usable for 30fps-only games on a CRT.
 
It seems people are quite comfortable with OLED being superior to CRT in various image-quality aspects, as this was always expected to happen, but when an LCD scores a win in something like motion clarity it is completely unimaginable.
What next... someone measures some LCD at 60Hz and proves it has less input lag than a CRT? As far-fetched as it sounds, it is actually possible!
 
1 - Motion blur is proportional to pixel visibility time. For squarewave strobing at framerate=Hz, motion blur = pulse width. For non-strobed, motion blur = frametime.
Educational Animation Demo Link:
TestUFO Variable Persistence BFI
(*IMPORTANT: Not for strobed displays or CRTs; run this on a sample-and-hold display of 120Hz or higher. If you only have 60 Hz, it will be super-flickery)

2 - To avoid duplicate-image artifacts, the flicker must be contiguous for each unique frame.
Educational Animation Demo Link: TestUFO Black Frame Insertion With Duplicate Images

I've been thinking about this a lot today... If motion blur = pulse width, then shortening the pulses by increasing framerate reduces motion blur, which makes perfect sense, since that's exactly why we want consistently smooth, high framerates while gaming and such... Probably obvious to everyone else here, but I'm still learning 😛 This makes me even more curious about my idea of frame-doubling 60Hz to 120Hz and then blanking half the frames.

This "flicker must be contiguous for a unique frame" line is bugging me though... For 60hz signals, we have 60hz flicker (duh), but for 24/30fps content our flicker is still in sync with each frame despite the fact that we now have two of every unique frame. Even though CRTs are strobed displays, I still see afterimages/double image artifacts while playing 30fps games, which I suppose COULD be a product of certain specific game rendering processes and not part of a typical 30fps > 60hz output conversion. It's very easy to see when slowly panning a camera in a 3D action game, for example. But the 60hz signal is still flickering at 60hz, so I would think that flicker would be taking care of the afterimages in a way that a traditional LCD can't. This makes me think that even though the CRT is indeed a strobing display, the duplicate image artifacts I'm seeing are still a product of the 30fps frame duplication (unless, like I said before, it's simply part of the types of games I happen to play a lot). OR, I'll admit that what I'm seeing could also be some type of phosphor retention artifact since I don't have a trained eye for this stuff like you guys do.

This brings me back to my idea of 120Hz with BFI; even though each unique 30fps frame is now quadrupled for the 120Hz signal, I imagine the 120Hz flicker, combined with half of those copied frames being blacked out, would reduce some artifacts? For more proper math than my very well made graphic I posted on Blur Busters (also attached here): a unique frame in a 60fps source lasts about 16.67ms, and therefore a unique frame from a 30fps game in a 60Hz signal lasts 33.33ms (not factoring in CRT strobe time because I'm dumb and math is hard). Further doubling to a 120Hz output doesn't really change the time that the image is on-screen, but blanking half the frames does cut the time that each frame is on screen by half. Even if the unique frame still has one duplicate, it only lasts 8.33ms without interruption instead of the original 33.33ms (a whopping 75% drop), but then again, CRT strobing kind of already does this to a degree. Not sure exactly what my point was with all this, just needed to get my thoughts down somewhere. :p
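
Putting rough numbers on that, a quick sketch of the same arithmetic (my assumptions: ideal square-wave presentation, no phosphor decay, and BFI that blanks every other refresh cycle):

```python
# Rough per-frame visibility arithmetic for the cases discussed above.
# Assumptions: ideal square-wave presentation, no phosphor decay,
# and BFI implemented as blanking every other refresh cycle.

def visibility_ms(content_fps, refresh_hz, bfi=False):
    """Longest continuous time one unique frame stays lit, in milliseconds.
    With alternating BFI, a frame is never lit for more than one refresh in a row."""
    refresh_ms = 1000.0 / refresh_hz
    repeats = refresh_hz / content_fps   # refresh cycles showing the same unique frame
    return refresh_ms if bfi else repeats * refresh_ms

print(visibility_ms(60, 60))             # 16.67 ms: 60 fps at 60 Hz
print(visibility_ms(30, 60))             # 33.33 ms: 30 fps at 60 Hz (duplicate frames)
print(visibility_ms(30, 120, bfi=True))  #  8.33 ms: 30 fps at 120 Hz, half the refreshes blanked
```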
 

Attachments: BFI Illustration.png (89.2 KB)
Still not sure why this hasn't been done yet. With the tech we have nowadays, I understand DLP, LCD, and LCoS are cheaper and more practical. But why wouldn't a CRT-like laser projector simply be superior? I'd help crowdfund a short-throw laser projector that formed the image with scan lines. Understandably, the whole business of bending the laser beam to produce the same effect as an electron beam needs to be worked out, but piezoelectric tech has come a long way and might be good for this application. Something like how they made the fan in this video work, maybe? No idea really, but that's why engineers get paid all that money, right?
Sony actually made one a while back. I never bought one because I was trying to get a good home theater setup going and it would NOT have worked for me. I'm kicking myself though because it's no longer available. I would have loved to see one in action because I'm 99% certain it had no motion blur. https://www.projectorreviews.com/sony/sony-mp-cl1-laser-projector-review/
 
Update #2. Just created an 85Hz mode and it's about as crosstalk-free as 60Hz. There's a slight hint of it when using specific patterns, but once I dive into UT2004, for example, I don't even see it. Period. Other than the lack of contrast, this is about as close to my old FW900 as I can possibly expect with an LCD. Bravo, bravo, bravo! 10/10. Very happy now. If this monitor had 3000:1 contrast then it would be perfect for me. Otherwise, I'm more than satisfied with this solution.
 
Update #2. Just created an 85Hz mode and it's about as crosstalk-free as 60Hz. There's a slight hint of it when using specific patterns, but once I dive into UT2004, for example, I don't even see it. Period. Other than the lack of contrast, this is about as close to my old FW900 as I can possibly expect with an LCD. Bravo, bravo, bravo! 10/10. Very happy now. If this monitor had 3000:1 contrast then it would be perfect for me. Otherwise, I'm more than satisfied with this solution.
What LCD are you using again?
 
Chief Blur Buster - finally got it working! HOLY SHIT!!! 60hz is crosstalk free!!! Gonna play some Streets of Rage 4 now.
Excellent!

Can you screenshot your 60Hz and your 85Hz ToastyX CRU and ViewSonic Strobe Utility (written by me) settings for others to copy?

Update #2. Just created an 85Hz mode and it's about as crosstalk-free as 60Hz. There's a slight hint of it when using specific patterns, but once I dive into UT2004, for example, I don't even see it. Period. Other than the lack of contrast, this is about as close to my old FW900 as I can possibly expect with an LCD. Bravo, bravo, bravo! 10/10. Very happy now. If this monitor had 3000:1 contrast then it would be perfect for me. Otherwise, I'm more than satisfied with this solution.
85 Hz is a great compromise! Much less flicker, as long as you’re not needing to accommodate legacy 60fps content.

OD Gain will need a bit of re-tuning on a panel lottery basis / panel temperature, but all other settings are reproducible (mostly) on all other XG2431s, at least for ultratall crosstalk-free zones taller than the screen (thanks to ultratall VBIs).

ToastyX’s new “semi plug and play” QFT is a godsend ("Vertical total calculator"). Turns QFT into much easier technology. May even convince some manufacturers to add plug-n-play QFT EDIDs. Then reviewers will finally discover this. 😊

While I successfully convinced ViewSonic to do (A) 60Hz single-strobe, with single-strobe support in 0.001Hz increments from ~59 to ~241Hz, and (B) unlocked end-user strobe calibration…
…I was not yet able to convince ViewSonic to include plug-and-play Quick Frame Transport (fast scanout + humongous VBIs in the signal & at the panel, for more complete LCD GtG settling time between strobes). But ToastyX making QFT easier in their new version helps a lot.

Sadly, no reviewer has done the five-calibration strobe-backlight retuning combo -- Pulse Width + Pulse Phase + Overdrive Gain + QFT + Refresh Rate Headroom (Hz far below max Hz) -- all of which are concurrently required for full-screen-height crosstalk-free strobing on the XG2431. But ToastyX's automatic QFT calculator just made the QFT part a hell of a lot easier (cramming LCD GtG between strobes into a giant 8ms-13ms VBI).

Meanwhile, mind if you post your screenshots of 60 Hz and of 85 Hz?

You will need to retune Strobe Utility for now, as it does not yet support memorizing multiple QFT tunings (there's only one PureXP Custom slot). So write down your settings when switching between custom QFT refresh rates.

If QFT becomes popular enough, I am considering the idea of monitoring the refresh rate and rewriting the saved Strobe Utility values during a refresh rate switch — so that multiple QFT refresh rates can be memorized; in effect, a PC-side, software-based equivalent of multiple "PureXP Custom" strobe tuning memories that are each QFT-refresh-rate specific.

Cheers,
Chief Blur Buster
 
It seems people are quite comfortable with OLED being superior to CRT in various image-quality aspects, as this was always expected to happen, but when an LCD scores a win in something like motion clarity it is completely unimaginable.
What next... someone measures some LCD at 60Hz and proves it has less input lag than a CRT? As far-fetched as it sounds, it is actually possible!
First, a disclaimer, since it's common for people here to shoot down "X is better than Y" without nuancing it to line items: none of these technologies blows away all of the others in every single checkbox.

All these displays (LCD, OLED, CRT) have their separate respective pros and cons, and they have their fan followings. Retro display looks have to be preserved, and that’s why I’m a fan of multiple approaches on the burners (hardware based, software based).

Now, with that disclaimer prefaced… I want to mention that there are emulators (without RunAhead algorithms) that have less lag than the original machine, because of ultrafast scanout.

For example, a 360Hz LCD can refresh (scan out) a "60Hz" emulator refresh cycle in 1/360sec. So even if there's a bit of lag rendering the emulator image, it can blast the frame from the C++ program to photon emissions in just a few milliseconds (for all pixels — first to last), while a classical 60Hz signal on any display (CRT, LCD, OLED, whatever) scans out over the cable from first pixel to last pixel in 1/60sec. (60fps at 360Hz is de facto software-based QFT, as long as the software is accurately framepacing the frames — 360/60 = exactly every 6 refresh cycles — to prevent stutter and other side effects.)

Most esports LCDs now refresh top to bottom in a subrefresh-latency manner (essentially beamraced display processing, where the signal scanout is synchronized to the panel scanout). You can see this in a high-speed video [YouTube video of testufo.com/scanout] of an old BenQ XL2720Z, a panel already known to have subrefresh Present()-to-photons latency, as they had begun engineering for that at the time -- using line buffers instead of full framebuffering of refresh cycles in the monitor's scaler.

So most of the lag is LCD GtG lag, though GtG90% means the pixel is already 90% visible about a millisecond or two after it arrives at the LCD. Thus, Present()-to-photons-hitting-eyeballs, when well timed and fully input-delayed (hair-trigger), can be as little as ~2-3ms on some LCDs (more laggy than a CRT, but still far less than 1/60sec) for the first pixel row of a refresh cycle (or, for VSYNC OFF, the first pixel row of a frameslice). And because of the ultrafast scanout (top-to-bottom sweep), it finishes the refresh cycle far ahead of the original CRT.

This is where the twist happens, where you can simply use an ultrafast scanout (say, 4ms instead of 16.7ms) to overcome that LCD lag disadvantage (vs CRT). Basically piggybacking on the sheer brute Hz (and ultrafast scanout), for lower-Hz use cases.

So that's why some 60Hz emulators, in some cases, manage to have less lag than their original machines when run on ultra-high-Hz monitors (e.g. 360Hz monitors), as long as they use a good low-lag swapchain workflow in their programming plus input-delay techniques (this is even with RunAhead=OFF, the algorithm that RetroArch invented).

This is not specific to LCDs. You can do it with a CRT too; some CRTs manage crazy high refresh rates at extremely low resolutions (e.g. 240p), so you could also have an emulator generate less input lag there. You have the emulator processing lag, but as long as the ultrafast scanout (4.2ms for 240Hz, 2.8ms for 360Hz) can out-race a 16.7ms scanout, you have CRTs/LCDs that already have less lag than the original arcade machine today, simply by piggybacking on ultrafast scanouts (whether via VRR, non-VRR, QFT, or simply running 60fps at a perfect 4:4 pulldown at 240Hz, or 6:6 at 360Hz).

Related — I helped some emulator developers add beamraced sync (where the real raster is synchronized to the emulator raster, via VSYNC OFF frameslice beam racing). This includes the WinUAE main branch, the CLK (Tom Harte) main branch, and an early GroovyMAME alpha. This has a bit more lag than the ultrafast scanout technique (though you can also beamrace 2x as fast at 120Hz, e.g. simulate a 1/60sec emulator refresh cycle in 1/120sec surges at a time — WinUAE is capable of this weird beamraced sync mode). In the original-Hz = emulator-Hz situation, it almost matches the lag of an original machine rendered out to a CRT of the same Hz — aka FPGA, except in a Windows EXE — because of beamraced emulator rendering and output. Beamraced emulator output (emuraster=realraster) benefits all displays, including precious FW900s as well as modern LCDs.

Nonetheless, if you don't use the ultrafast scanout technique (scanning out a "60Hz" refresh cycle very fast over the cable, on panels that can refresh in realtime via rolling line-buffering instead of framebuffering), and you also avoid algorithms like RunAhead, you won't be able to get less lag than the original machine on an original CRT.

However, most emulator users don't realize that an ultra-high-Hz display is a good conduit for ultrafast "60Hz" frame delivery of any kind, on any display technology (including CRT) — to overcome the classical situation of taking 1/60sec for the first through last pixel to finish refreshing. Take the 240Hz display example: a 1/240sec scanout finishes a "60Hz" emulator refresh cycle 12.5ms sooner than a 1/60sec scanout (1/60sec - 1/240sec = 12.5ms). These double-digit millisecond savings can be sufficient to overcome other lags — including the modem/transceiver/micropacket dejitter latency of HDMI/DP port electronics (including the lag of an adaptor/scaler like an HDFury), plus the pixel response lag if it's not an instant-response display.

This technique works perfectly fine on a Sony FW900 too, since it can do refresh rates well above 60Hz.
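
To put the scanout savings in one place, here is the arithmetic for the ideal case (it ignores the GtG, dejitter-buffer and adaptor latencies mentioned above):

```python
# How much sooner the *last* scanline of a "60 Hz" frame finishes when the same
# frame is delivered over a faster scanout (ideal case, signal timing only).

def scanout_savings_ms(panel_hz, baseline_hz=60.0):
    return 1000.0 / baseline_hz - 1000.0 / panel_hz

for hz in (120, 240, 360):
    print(f"{hz:>3} Hz scanout: last pixel arrives {scanout_savings_ms(hz):.1f} ms "
          "earlier than a classical 1/60 s scanout")
# 120 Hz -> 8.3 ms, 240 Hz -> 12.5 ms, 360 Hz -> 13.9 ms
```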
 
...including precious FW900s as well as modern LCDs.

...This technique works perfectly fine on a Sony FW900 since it can do refresh rates well above 60Hz.

Yes, praise the almighty FW900 some more... That's what I live for! 😎

Also, thanks for mentioning FPGA solutions on there -- a lot of this is completely over my head (as it should be, since you're THE GUY for blur-busting, unless your name is a LIE), but I was wondering how FPGA systems might be affected by or otherwise take advantage of a lot of these concepts.
 
So, last post on this because I don't want to take the thread off topic:

For the Viewsonic XG2431

1. Download CRU 1.5.2
2. Create a new 240hz mode (see screenshots below)
a. Click a detailed resolution and click the Add... button.
1663000486019.png

b. Set your refresh rate to 240hz. When you're done your specs should match my screenshot.
1663000368799.png

3. Next, select "Vertical total calculator" and lower your refresh rate to whatever you desire. In this example I'm using 60Hz (see the sketch after this post for what this calculation does).
1663000406500.png

4. Rinse and repeat for each frequency you want (keeping in mind the limited number of Detailed Resolution slots). I edited 60Hz, 85Hz, and 120Hz modes for crosstalk-free operation. 120Hz isn't completely crosstalk-free, but it's far better than the default.
5. Click "OK" on CRU. It'll close the window
6. Run the "restart" or "restart64" application. If done correctly, your monitor will flicker on and off for a little bit and then Windows will come back and will tell you that the driver was successfully restarted. Yay!
7. Select your desired mode by selecting the resolution and refresh rate in the Windows display control menu and run the Viewsonic Strobe Utility. If you've done everything above correctly your Vertical Total should be YYYYUGEE!. Well... Bigger than normal.
1663000737700.png


8. For the most part you're going to be tuning Strobe Pulse Phase and Overdrive Gain. You'll adjust Strobe Pulse Phase so that there is no crosstalk, then you'll adjust the Overdrive Gain to remove the overdrive artifacts. Strobe Pulse Width is a "set it and forget it" setting; that's the setting which trades off brightness for clarity. Find a good balance that you like and leave it. This is my 60Hz tuning. As Chief said, you'll have to retune this every time you switch QFT modes. A pain in the ass, kinda. But if you write it down it's a quick edit. Hope this helps.
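
For reference, a rough sketch of what the "Vertical total calculator" in step 3 is doing, as I understand it: keep the 240Hz mode's pixel clock and horizontal total, then stretch the vertical total until the refresh rate drops to the target, so all the extra lines become blanking time for the LCD to finish GtG between strobes. The pixel clock and horizontal total below are placeholders, not my actual CRU values:

```python
# Sketch of a QFT "vertical total calculator": hold pixel clock and horizontal
# total fixed, grow the vertical total until the refresh rate equals the target.
# Placeholder numbers only -- substitute the values shown in your own CRU mode.

def qft_vertical_total(pixel_clock_hz, h_total, target_hz):
    """Vertical total so that pixel_clock / (h_total * v_total) = target_hz."""
    return round(pixel_clock_hz / (h_total * target_hz))

pixel_clock = 586_000_000   # placeholder ~240 Hz-class pixel clock
h_total     = 2200          # placeholder horizontal total
v_active    = 1080

for hz in (60, 85, 120):
    vt = qft_vertical_total(pixel_clock, h_total, hz)
    print(f"{hz:>3} Hz: vertical total {vt} ({vt - v_active} lines of blanking)")
```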
 
So, last post on this because I don't want to take the thread off topic:

For the Viewsonic XG2431
[step-by-step CRU/QFT instructions quoted from the post above]
Excellent post JBL!
 
Excellent post JBL!
Thanks! It truly is something to behold. Being able to have something so close to my old FW-900, and from a relatively inexpensive LCD, is something special indeed. I estimate my MPRT is around 2ms, so not the razor sharpness of a 1ms CRT, but it's close enough. In Doom 3, for example, I can strafe left and right and still read all the scrolling text perfectly fine. Without the strobing it's unreadable.
 
Be advised that the optimal Strobe Pulse Phase changes every time you change Strobe Pulse Width. Overdrive Gain 7 is a bit strong for 60Hz, but can be useful if you've got a fairly cool/cold room. I usually use Overdrive Gain 2-4 at 60Hz. Don't forget to warm up your LCD for at least an hour before tuning, as warmer LCDs have faster pixel response and require a little less Overdrive Gain for the same crosstalk-freeness.

Either way, Pulse Width 17 is definitely a good compromise for 60Hz. It will be brighter than the FW900, but with a (not much) higher MPRT (standard 10%-90% cutoff). You are probably currently getting a picture brighter (lumens) and clearer (MPRT) than LG OLED BFI -- give or take -- and with way lower lag than LG OLED BFI, to boot. 85+ Hz will have significantly better MPRT at the same Pulse Width setting, since Pulse Width is approximately a percentage of the refresh cycle, so higher Hz = a shorter pulse for the same setting.
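
To put numbers on the "pulse width as a fraction of the refresh cycle" point, a back-of-envelope sketch (the 15% duty cycle below is purely illustrative; it is not a claim about what the Strobe Utility value 17 maps to):

```python
# Persistence if the strobe pulse is a fixed fraction of the refresh period:
# the same fractional setting gets shorter (clearer) as the refresh rate rises.

def persistence_ms(pulse_fraction, refresh_hz):
    return pulse_fraction * 1000.0 / refresh_hz

for hz in (60, 85, 120):
    print(f"{hz:>3} Hz at ~15% pulse width: {persistence_ms(0.15, hz):.2f} ms of persistence")
# 60 Hz -> 2.50 ms, 85 Hz -> 1.76 ms, 120 Hz -> 1.25 ms
```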

I don't want to have a monopoly on QFT info, and QFT works on many displays (albeit those displays won't have such complete DIY strobe-tuning features). QFT is useful for strobing because most strobing implementations do not support VRR (an easier form of QFT) -- and also because doing a plain 4:4 pulldown (e.g. a 60fps cap at 240Hz) on native VESA CVT / CVT-R timings via framerate capping will often cause a quadruple-image effect from 240Hz strobing of 60fps content.

"strobing and QFT" is a good marriage that hits two birds with one stone (crosstalk reduction + lag reduction) --
But it is still very little known among reviewers, and little known among manufacturers -- especially ones who program some strobe modes in an afterthought-manner.

Meanwhile, mind if you post the 85 Hz QFT screenshots too? You might want to crosspost your screenshots to the ToastyX QFT HOWTO thread on the Blur Busters Forums.

___

...Addendum: I'm thinking of ways to automate this...
Since Pulse Width keeps the leading edge stationary but changes the position of the trailing edge, Pulse Width affects the centering of the strobe relative to the panel VBI (plus a minor tape-delay effect from scaler/GtG lag), so you will need to re-tune Pulse Phase every time you change Pulse Width. Longer term, I'm considering an "Autocalculate" button that uses quadratic regression to curve-fit a strobe-tuning scatter plot (manually tuned in 10Hz increments). (For those who forgot algebra, quadratic regression generates a y = ax^2 + bx + c type of formula to draw a curve through a scatterplot.) After tuning 60...240 in 10Hz increments, I have enough data to create a curve-fit autocalculate formula to compute near-optimal numbers for any Hz and any VT! I'd need separate cubic and/or quadratic regression formulas (as appropriate), linked to the most sensitive variables (Hz, VT and PW as input variables, generating PP and OD as output variables). Then any Hz could automatically strobe-tune every time you switch Hz or change VTs, if you enable an [X] Automatic checkbox. It would still need a minor manual bias per panel (lottery / temperature), at least for the OD Gain setting, but it can get pretty close automatically, in a way that monitor manufacturers don't do yet. This is a lot of work for a niche feature, though. But there's a definite curve when I scatterplot the tuning presets, which means it can be automated with a curve-fit algorithm.
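
For what it's worth, the curve-fit part is simple enough to sketch (illustrative numbers only, not real XG2431 tuning data; numpy.polyfit performs the quadratic regression):

```python
# Minimal sketch of the curve-fit idea: fit a quadratic through manually tuned
# Pulse Phase points at a few refresh rates, then interpolate a starting value
# for any Hz. The data points below are hypothetical, not measured values.
import numpy as np

hz_tuned    = np.array([60, 80, 100, 120, 160, 200, 240], dtype=float)  # hypothetical tuning runs
phase_tuned = np.array([30, 27, 25, 24, 22, 21, 21], dtype=float)       # hypothetical Pulse Phase values

a, b, c = np.polyfit(hz_tuned, phase_tuned, 2)   # quadratic regression: y = a*x^2 + b*x + c

def suggested_phase(hz):
    return a * hz**2 + b * hz + c

print(f"Suggested Pulse Phase starting point at 85 Hz: {suggested_phase(85):.1f}")
# A per-panel (lottery/temperature) bias would still be applied on top, mainly for OD Gain.
```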
 
Also, thanks for mentioning FPGA solutions on there -- a lot of this is completely over my head (as it should be, since you're THE GUY for blur-busting, unless your name is a LIE), but I was wondering how FPGA systems might be affected by or otherwise take advantage of a lot of these concepts.
Short Answer:
Not possible for unmodified FPGA implementations of the original machine.
Possible with a semi-complex modification to the FPGA implementation of the original machine, to temporarily time-warp during active refresh cycles.

For a 240Hz signal (and a panel that refreshes in sync with the horizontal scanrate), the scanout is physically 4x faster top-to-bottom than 60Hz. So to synchronize the rasters, you need time-accelerated execution, such as 1/240sec executions of 1/60sec refresh cycles. It's as if the emulator or FPGA is temporarily in a time bubble where time passes 4x faster than usual for the duration of a visible refresh cycle's scanout. And then, while in the VBI, time is temporarily at a standstill until just before the new real-world refresh cycle that most closely aligns temporally with the original machine's refresh cycle (e.g. resume paused FPGA execution every 1/60sec).

Long answer:
It takes some complex clock acrobatics to convert to QFT: you intentionally speed up the FPGA (run it in a 4x time-warped mode for the 4x faster scanout) to keep the number of emulated CPU cycles per scanline the same during the accelerated-scanrate situation. Basically, surge-execute one emulator refresh cycle at a speedup factor proportional to the faster scanout.

WinUAE actually already supports "lagless VSYNC", which is based on the algorithm I helped Toni with: VSYNC OFF frameslice beam racing to emulate a front buffer, with a performance-jitter safety margin. When lagless VSYNC (beamraced sync) is enabled in that emulator, it beamraces emuraster=realraster (in a physical-screen-surface manner, appropriately resolution-scaled) plus a jitter safety margin that is conveniently hidden by rendering the emulator raster to the visible front buffer slightly vertically below the real raster.

Now for FPGA... FPGA implementations typically just spray out the scanlines in real time like the original machine -- so you need to upclock the FPGA temporarily whenever you are outside the VBI, to keep the FPGA outputting scanlines quicker at the higher horizontal scanrate of QFT. If you kept it permanently upclocked, 240Hz would run things 4x faster than the original machine, so you only upclock 4x (the FPGA implementation at 4x its original clockspeed) during the 1/240sec active scanout and pause (temporary FPGA halt) for the remaining 3/240sec of each 1/60sec cycle. This would work whether you choose to do QFT at the timings level, or choose to generate a 4:4 pulldown (via standard EDID CVT-R, 60fps at 240Hz; this requires a custom scaler stage to be as low-lag as QFT, framebuffering the 60Hz FPGA output during its beamraced first refresh cycle and re-outputting it 3 more times to generate the 4:4 pulldown).

In a fast-scanout situation, the emulator doesn't run faster continually -- it just surge-executes per refresh cycle. There are still 59.94 refresh cycles per second (NTSC) or 50 (PAL), but since a beamraced-sync implementation in an emulator polls the raster on the GPU via D3DKMTGetScanLine() (or estimates it as a time offset between VBlanks -- you can estimate it for cross-platform beam racing!), it will emulate the emulated raster faster if the real raster is physically moving faster. If the scanout is physically 2x faster top-to-bottom, then the beamraced implementation (emulator or FPGA) executes at a 2x faster clockspeed until hitting somewhere inside the VBI, where emulator (or FPGA) execution is paused until just before the next ultrafast display scanout.

But the implementation (emulator or FPGA) needs to support that time-warp trick to be compatible with QFT signals. It is not an easy programming task for someone unfamiliar with beamracing the original classic machine AND beamracing modern GPUs (a la Tearline Jedi), AND who has the programming skill to tie the two together in a synchronized way. But it has been done successfully in emulator software, and it is at least "in theory" possible with a highly modified FPGA that can vary its clockspeed depending on whether it is in-VBI or out-of-VBI. At the end of the day, you can get less input lag than the original machine, if your fast scanout is sufficient to overcome all the other overheads.

Needless to say, I've done a lot of algorithmic work on this already, being familiar with raster interrupt concepts myself.

UPDATE: Confirmed! Not theory anymore. WinUAE's unmodified "lagless vsync" setting appears to work fine with QFT signals. Pretty neat. It just synchronizes to whatever scanout velocity is reported by the Windows raster API call D3DKMTGetScanLine(). It even appears to already briefly surge-execute the Amiga emulation at 2x velocity during 120Hz QFT. Fantastic. Toni is one smart developer who futureproofed his beamraced emu=real raster implementation. (He didn't know about QFT, but tweaked the "lagless VSYNC" implementation to be VRR-compatible and 120Hz+ non-QFT compatible, i.e. 60fps at 120Hz 2:2 pulldown. QFT 120Hz at double VT is the same scanout velocity, except it is only 60 fast-scanout refresh cycles separated by longer VBIs.) This is the correct, natural way to synchronize an emulator raster to a high-scanrate output signal. So this confirms the above post works just fine!
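
For rough numbers on the surge-execution schedule described above, a simple arithmetic sketch (ideal case, ignoring the source signal's own VBI):

```python
# How fast to surge-execute during the active scanout, and how long to pause,
# per emulated 60 Hz refresh cycle delivered over a faster-scanout signal.

def surge_schedule(panel_hz, emu_hz=60.0):
    speedup   = panel_hz / emu_hz            # e.g. 4x at 240 Hz
    active_ms = 1000.0 / panel_hz            # surge-execute one emulated frame this fast
    pause_ms  = 1000.0 / emu_hz - active_ms  # then idle until the next aligned refresh
    return speedup, active_ms, pause_ms

for hz in (120, 240, 360):
    s, a, p = surge_schedule(hz)
    print(f"{hz:>3} Hz: execute at {s:.0f}x for {a:.2f} ms, pause for {p:.2f} ms")
# 240 Hz: execute at 4x for 4.17 ms, pause for 12.50 ms
```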
 
UPDATE: Confirmed! Not theory anymore. WinUAE's unmodified "lagless vsync" setting appears to work fine with QFT signals. [...]
WinUAE is a great piece of emulation software. Kudos to the author.

As for FPGA, it would be possible to get lower input lag than native, but that is hardly the point of it. FPGAs are great because you get exactly the original timings. Just hook up an old CRT and play like on original hardware, for ultra cheap (compared to the price of some original systems today, the cost of something like a MiSTer can be pretty much dismissed), and with more modern, better controllers like the DualSense.

MiSTer works great with VGA CRTs too, having amazing CRT emulation. In fact, I compared the image on a PVM and a VGA CRT, and other than slightly different effects when moving my eyes, they looked pretty much the same. By moving my eyes rapidly I could spot that the electron beam moves differently on a 15kHz CRT than on a VGA CRT at something like 1600x1200. The image is of course only identical when the output resolution in MiSTer is set high enough not to see the VGA CRT's own scan lines. I didn't do any tests with photodiodes or anything, but to my eyes the monitors were perfectly in sync.

Oh, and FPGAs do not skip frames. Emulators on fixed-refresh displays can always do that. Maybe better-coded emulators like WinUAE, which care about individual scanlines, do it less, but this is the state of the current emulation scene.
 
Agreed. That's why I like keeping the emuraster-realraster configured to original-machine sync (same relative scanout velocity of raster as original machine).

My algorithm was probably the first time a software-emulator was able to do per-pixel original-latency simulation (to within the currently configured error margin of frame slice size + jitter error margin).

While Retroarch's RunAhead got all the publicity -- my algorithm actually preserves original per-pixel latency better for the entire screen surface -- even for mid-screen input reads inside a raster interrupt routine!

You can get a software-based emulator on a PC with almost exactly the same per-pixel original CRT timings, with beam raced emuraster-realraster synchronization. You can use a VGA output and a real tube, and get original-machine-like latency per scanline, with a software based PC emulator such as WinUAE that implemented my algorithm.

WinUAE can do a configurable number of frameslices per refresh cycle, so the more frameslices, the more perfectly WinUAE matches an FPGA or real original Amiga. With 2000fps frameslice beamraced front buffer simulation via VSYNC OFF on an ultra-low-lag 60Hz OLED, you can get within an error margin of +1/2000sec of original timings per pixel.
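
To make the algorithm concrete, here is a stripped-down sketch of one beam-raced refresh cycle. This is NOT WinUAE's actual source, just an illustration of the general loop; the function names are hypothetical placeholders:

Code:
// Hypothetical placeholders for the platform/emulator-specific pieces:
int  get_real_scanline();          // e.g. via D3DKMTGetScanLine()
void emulate_scanlines(int n);     // run the emulated machine for n scanlines
void blit_slice(int sliceIndex);   // copy the new slice into the framebuffer
void present_vsync_off();          // Present() with VSYNC OFF

const int SLICES_PER_REFRESH  = 10;                       // more = less lag
const int EMU_LINES_PER_SLICE = 480 / SLICES_PER_REFRESH; // NTSC visible lines

void beam_raced_refresh(int real_visible_lines)
{
    const int real_lines_per_slice = real_visible_lines / SLICES_PER_REFRESH;

    for (int s = 0; s < SLICES_PER_REFRESH; ++s) {
        // Chase the real beam from roughly one slice behind: wait until the
        // beam has entered the slice ABOVE the one about to be appended.
        // That one-slice gap is the jitter margin -- it keeps the VSYNC OFF
        // tearline inside rows that are identical in both frame versions,
        // so the tearline is invisible.
        const int chase_line = (s - 1) * real_lines_per_slice;
        while (get_real_scanline() < chase_line) { /* spin or micro-sleep */ }

        emulate_scanlines(EMU_LINES_PER_SLICE);  // surge-execute the emulator
        blit_slice(s);                           // append slice s to the frame
        present_vsync_off();                     // raster-chasing present
    }
    // Net effect: emulated pixels reach the glass ~1-2 frameslices after
    // their input was read, shrinking as SLICES_PER_REFRESH grows.
}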

Now that being said, a big advantage of the RTX 4000 series is that they have roughly 1 terabyte/sec of memory bandwidth, so you could probably do 15,000 frameslices per second (NTSC scanrate!) and perhaps get almost down to atomic one-scanline VSYNC OFF frameslices. It would be a fun exercise to see if the jitter margin can be squeezed down to only 2-3 scanlines (to hide raster artifacts during performance imperfections of emuraster:realraster sync).

But with newer GPUs that lack VGA outputs, you are stuck with the minor latency of an HDMI-to-VGA adaptor. However, this latency may very well be less than that of low-granularity frameslicing (e.g. 600 frameslices per second).

The firehose performance of new GPUs opens the door to ultra-precise emuraster:realraster synchronization. The performance improvement of newer GPUs now vastly outweighs the latency overhead of an ultra-low-lag HDMI-to-VGA adaptor if you want to use the Lagless VSYNC feature -- 600 frameslices/sec on an old GTX 760 is outweighed by the prospect of 6,000+ frameslices/sec on a newer RTX GPU (1/6000sec tapedelay latency!), and the thought of matching NTSC scanrate at 15,000+ VSYNC OFF frameslices/sec!

That's the neat thing about my scalably flexible emuraster-realraster algorithm -- you can keep shrinking the slices and the jitter error margin until they're down to one CRT scanline! And if you keep the original scanout timings, you even preserve the original latency gradient along the vertical dimension of the display.

QFT is certainly purely optional here; I'm just impressed by how well Toni's algorithm adapts to best-effort beamracing of any Hz, any VRR, any QFT. But it can also be configured to preserve relative-to-original latency perfectly (with only a tapedelay effect of one frameslice's worth of time).

Tip for FW900 Users using Lagless VSYNC on WinUAE
(Achieving original per-pixel latency preservation):


To preserve original per-pixel Amiga latency on a Sony FW900 with WinUAE, I recommend using NVIDIA Custom Resolution or ToastyX CRU to keep the active:total scanline ratio at NTSC's 480:525 -- in other words, a VBI of (525-480)/480ths of the active vertical resolution -- if you've configured WinUAE to scale a 480i/240p NTSC emulation to the entire height of your display.

So if you're using 1080p, use a VBI of roughly 101 scanlines (VT 1181). That keeps your VBI:TOTAL ratio at NTSC's 45:525, scaled to roughly 101:1181 for a modern 1080p signal -- a scaling-compensated VBI size!

Or if you're double-scanning (square pixel), e.g. 960p, you want to scale 480:525 up to 960:1050 -- basically a Vertical Total of 1050 with a 960p signal, which means 90 scanlines of VBI in NVIDIA Custom Resolution -- to preserve the original beam-raced NTSC timings in a scaling-compensated way.
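
If you want to recompute that for other vertical resolutions (or for PAL, as mentioned below), it's just a ratio. A trivial, purely illustrative calculator -- not from WinUAE or CRU, just the arithmetic, rounded to whole scanlines:

Code:
#include <cmath>
#include <cstdio>

// Scaling-compensated VBI: keep active:total equal to the emulated machine's
// ratio (NTSC 480:525, PAL 576:625) so the real beam stays proportionally in
// sync with the emulated beam.
static int scaled_vbi(int active_lines, int emu_active, int emu_total)
{
    const double total = active_lines * (double)emu_total / emu_active;
    return (int)std::lround(total) - active_lines;
}

int main()
{
    std::printf("1080p, NTSC ratio: VBI %d\n", scaled_vbi(1080, 480, 525)); // ~101 -> VT ~1181
    std::printf(" 960p, NTSC ratio: VBI %d\n", scaled_vbi(960,  480, 525)); //   90 -> VT 1050
    std::printf("1080p, PAL  ratio: VBI %d\n", scaled_vbi(1080, 576, 625)); //  ~92 -> VT ~1172
    return 0;
}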

That will keep the latencies properly original-scaled to the original NTSC signal, during beam-racing emuraster:realraster algorithms. Any scaling overlaps are nicely confined within the jitter zone (the safety margin where raster artifacts are guaranteed invisible), so scaling works fine during beam racing!

Recompute accordingly if you're using a PAL resolution, or if you're showing more or less of the top and bottom borders; you need to make sure the vertical velocity of the emulated raster is correctly scaled to the real raster of the actual display, with whatever settings you are using.

You will want to maximize the number of frameslices as much as you can, to get as close as possible to original per-pixel Amiga latency (clone of FPGA latency, clone of original Amiga latency) -- including all latency behaviors such as scanout latency.

Your input lag with frameslice beam racing will generally average between 1 and 2 frameslices: 1 frameslice of lag from the frameslice itself (emulating and presenting one slice's worth of scanlines), plus between 0 and 1 frameslice of lag from the jitter margin (which hides tearlines as long as they stay within the height of the previous frameslice). So the more frameslices per second, the lower your tapedelay latency relative to an original machine. It's very much a subrefresh latency, since you're rendering the emulator while the display is already scanning out the current refresh cycle.
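
To put rough example numbers on that (back-of-the-envelope, not measurements): at 600 frameslices/sec each slice lasts about 1.7ms, so the added tapedelay averages roughly 1.7-3.3ms; at 6,000 frameslices/sec it shrinks to about 0.17-0.33ms; and at ~15,000 frameslices/sec (about one NTSC scanline per slice) it is on the order of 0.1ms.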

But at the end of the day -- the more GPU memory bandwidth you have to Present() more VSYNC OFF frameslices during frameslice beam racing -- the more it converges, virtually perfectly, to original per-pixel original-machine latency.

Original per-pixel input latency was unprecedented in a software emulator until Toni implemented my suggested algorithm in WinUAE.
 
Interesting video. I was expecting another poorly researched, misinforming, business-oriented, subscribers-and-likes-chasing YouTuber video using the CRT topic, but this individual mentions various valid points that I, as a CRT monitor enthusiast and current user of them for both retro and modern gaming content, personally identify with.
He just needs a small form OLED. Never thought about giving more Depth.

If you have any particular OLED recommendation, I would really appreciate it. From what I have researched, small-form OLEDs (assuming you meant OLED monitors) are practically non-existent; the few that exist seem very expensive, most of them are 60Hz max, still suffer poor motion quality, and have no motion-quality-improvement tech (BFI).

So far the only OLED display I have been able to see in person that is worth it for gaming in action is a friend's LG OLED C1, the 55-inch version, but its motion clarity at 60Hz still left me disappointed. At 60Hz, even with its BFI setting at its best, motion was noticeably blurred, and with more noticeable flicker than CRT monitors at 60Hz. It was at 120Hz, paired with its best BFI setting, that it really felt like CRT monitor motion quality; still not quite as clear in motion tests as a CRT, but close enough in games that it really felt like playing on a CRT. Unfortunately, that comes at the cost of requiring a constant-120fps system, and a screen way too big for my tastes. There seems to be a smaller 42-inch model, but that is still too big for my liking, and its poor 60Hz motion clarity is a no-thanks for me.

The LG CX seems to have better motion quality at 60Hz than the C1 from what I have read, but at the cost of badly reduced brightness of about 60 nits, and thus reduced color vividness due to such low brightness (way lower than peak CRT monitors, which can achieve 100+ nits), increased input lag, more intense flicker than a CRT at 60Hz, and a minimum size of 48 inches. So those CX OLED TVs are a no-thanks for me as well.

So in brief, I prefer to play on my CRTs, even modern games, because on them I can play at refreshes from as low as 60Hz (even 55Hz for some MAME-emulated games) up to 90Hz, which is enough for me, with soft enough flicker, full brightness, excellent latency, good blacks, and zero crosstalk across the whole screen.

If you know an OLED that:

- is between 22.5 and 32 inches in size
- can produce CRT-like motion clarity (no noticeable motion blurring) at 60 and 90Hz, with flicker at least as soft as CRT monitors
- delivers 100+ nits while keeping that CRT-quality motion at those refresh rates
- has CRT-like latency at those refresh rates and that motion quality
- has zero crosstalk across the whole screen (top, center, bottom) at those frequencies, under those CRT-like motion clarity, brightness and latency conditions
- plus, obviously, the blacks and excellent viewing angles I am aware OLEDs are able to achieve

then please let me know, since so far CRT monitors are the only ones I have seen able to achieve all of those in one single monitor.
 
Sony PVM OLEDs can do what you’re asking except the motion blur is only 7.5ms persistence. Basically it’s a plasma with infinite contrast. And they’re stupid expensive, even on the used market.
 
Thanks, but, well... 7.5ms persistence is still quite blurred motion, not far from the ~8ms of still-blurred motion at 120Hz on non-strobed modern displays from what I have seen, even on the C1 OLED non-strobed at 120Hz that I personally experienced.
My eyes have not seen a single bit of motion blur on any CRT monitor I have used (not just the FW900) at any refresh rate they support; if I am not mistaken, CRT persistence is around 1ms.

Unfortunately my experience with plasmas is very limited; I have never really been able to experience their motion quality from what I remember, but if plasma is also 7.5ms, then even those seem not quite at the good motion quality of CRTs either.
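(If I understand the persistence math correctly, blur width is roughly persistence multiplied by eye-tracking speed, so at about 960 pixels/sec of tracking, 7.5ms smears each pixel across roughly 7 pixels, while a ~1ms CRT keeps it to about 1 pixel; that would also explain why 7.5ms looks so close to non-strobed 120Hz sample-and-hold at ~8.3ms.)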
 
Nothing really is. My response was kind of rhetorical :). Plasma is fine if you’re used to LCD big screen gaming. It’s a breath of fresh air, as 50-60% motion blur reduction is much better than nothing.
 

Thanks for this video, it is really well done and hits on some really great points.

I had a 19" Samsung SyncMaster 997DF up until 2008 (well after most people had converted to LCD's) and it was an amazing monitor. I truly believed that it just felt and looked better. There was just this nice soft-glow to the text that I actually found nicer on the eyes that the artificial feeling I get from using even today's LED backlit LCD's. Also, when scrolling webpages or text slowly, it was so easy to read on a CRT (zero motion blur whatsoever). I bought a new ASUS 24" LCD display this year with a 5mS response time an while I admit that it does handle text scrolling better than the older LCD's moving text is still nowhere near as readable as it was on that old CRT. Moving text was pleasant to read on a CRT, even if it is readable on a modern LCD I would still not describe it as pleasant (as a result, I just changed my viewing habits to accommodate the monitor).

Honestly I would probably still be using a CRT if it wasn't for the health concerns that people kept harping on. I kept getting hounded by family about how there are X-Rays and that CRT's cause vision impairment. Even on this forum (back in 2005) I had arguments like this with people whenever they found out that I was using a CRT:
https://hardforum.com/threads/sucks...00-60-or-1920x1200-52.953882/#post-1028261072

I am a pretty health-conscious person and that was the main reason why I converted to LCDs. Honestly, though, I would love to see a more rigorous analysis done on the health effects of more recent CRTs, because LCDs aren't without their own set of problems with blue light exposure. I mean, when most people remember a CRT they are usually thinking about that old white box from the early 90's in their classroom that had a flicker and gave you an outright headache after an hour of use. I don't have those memories of the later CRTs I used (e.g. TCO '03 compliant units). I would also like to know about the radiation output of the newer models and see just how much of a concern that still is (it would be a big consideration for me if I were to possibly go back to a CRT - I know that a lot of shielding got added to the glass to address that concern in later years).
 
many items from those links are missing with no item name, i think its better if you also write items name outside the links, so in case they are out of stock they can be searched elsewhere ;)

That is an interesting list they posted. I'm still using the StarTech DP2VGAHD20 at 2235 x 1397 @ 83Hz, and it has been flawless. Has anyone offered to start doing that list, yet?
 
You're driving that vertical pretty hard… I think the limit is 160Hz though, right?
Ah yeah, you're right. The limits are:
Horizontal: 121kHz
Vertical: 160Hz

I was only curious to see if I could test the StarTech that high for someone, but not if it can damage the FW.
 
It won't let you go beyond the limit. Even then, Unkle Vito warned against pushing it that hard. Just because you can rev to 6000 RPM doesn't mean you need to stay there.
 
It seems the advice Vito gave here has been thrown to the winds as I've seen many posters here driving these units way past the golden resolution of 1920 x 1200@85hz. These units can of course accept higher resolutions but the question is should you?
 
Hell if I know. I can see both sides of the argument. Preserve to enjoy forever, or throw caution to the wind - yolo! Lol
 
Guys, this is dumb. A deflection circuit isn't like a car's engine. The monitor wouldn't allow you to run 800x600 160hz if it wasn't safe.

I've been running my LaCie at crazy shit like 1920x1440p 90hz and 2880x2160p 60hz, 1024x768p 160hz, for years.

Don't be afraid to take full advantage of your monitor's capabilities.

I mean, I keep the desktop at something less intense, like 1600x1200 @ 85hz, but for games I go all-out.
 
I’m all ears. Curious to know if the circuit gets hotter at higher refreshes. If not then I’d say you have a point. If only a little I’d still say you have a point. :)
 