24" Widescreen CRT (FW900) From Ebay arrived,Comments.

me neither :)

First thing I'm gonna do is take a photo of the phosphor layer of the unit with anti-glare on and anti-glare off (after calibrating both). It will be interesting to see whether the unit with the anti-glare off has a clearer image.
 
this is what my camera could get:

Glossy LCD:
[image: 3qfjHpD.png]


G520P with AG (looks dull because it's ~40% darker):
[image: JOqp6ka.png]
 
This setup will allow me to move the camera along the focal axis with very good precision (perhaps a micron or less), which will let me get perfectly focused images. That's critical.

With a stage like that, there's probably going to be a large range of adjustment where the focus is indistinguishable from perfect. For instance, the tolerance for good focus (i.e. the depth of field) may be ±1 mm, while a full turn of the knob may only move your camera 0.5 mm or something like that.

Keep in mind that for every lens there's generally an optimal aperture setting for sharpness (usually around f/8.0, I think). It's probably easiest to figure out empirically for your setup.
 
With the reverse lens setup, the lens automatically sits at its widest aperture, so the depth of field is ridiculously narrow. I figured out how to adjust the aperture even with the lens reversed, but I'm cool leaving it wide open, as this results in the least amount of diffraction. At the widest aperture, I think the DoF is quite a bit less than 1 mm, based on the fact that even applying slight pressure to the stack of books the camera was resting on caused a massive change of focus during my last session. I use live view on a laptop with a zoomed-in image, so I get a very clear real-time view of the focus quality.
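To put a rough number on that, here's a back-of-the-envelope depth-of-field estimate in Python (the f-number, magnification, and circle of confusion below are assumptions for illustration, not measurements of my setup):

[CODE]
# Rough depth-of-field estimate for a reversed lens at high magnification.
# Close-up approximation: total DoF ~= 2 * N * c * (m + 1) / m**2
#   N = f-number, c = acceptable circle of confusion (mm), m = magnification.
# All numbers below are assumed for illustration, not measured values.

def macro_dof_mm(f_number: float, coc_mm: float, magnification: float) -> float:
    """Approximate total depth of field (mm) for close-up photography."""
    return 2.0 * f_number * coc_mm * (magnification + 1.0) / magnification ** 2

if __name__ == "__main__":
    # Assumed setup: lens wide open around f/2, roughly 1:1 magnification,
    # circle of confusion of ~5 microns (about one sensor pixel).
    dof = macro_dof_mm(f_number=2.0, coc_mm=0.005, magnification=1.0)
    print(f"Estimated total DoF: {dof * 1000:.0f} microns")  # ~40 microns
[/CODE]

If numbers anywhere in that ballpark hold, the total DoF is on the order of tens of microns, which would explain why a slight nudge to the book stack wrecks the focus.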

If it turns out that the depth of field is sufficiently narrow, I'll be able to actually measure absolute distance across the image (by pre-measuring a standard, such as a stage micrometer).

I'm also hoping to get my hands on a Ronchi ruling. The main purpose will be to see whether the image I get, at the same magnification I'm using for the CRT imaging, shows sharp transitions between the lines (plus I can also use it for distance calibration).
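Once the micrometer (or ruling) is photographed at the working magnification, the distance calibration itself is just a ratio; a minimal sketch, with placeholder numbers:

[CODE]
# Turn pixel measurements into real distances using a photographed standard.
# All numbers are placeholders for illustration: a stage micrometer with
# 0.1 mm (100 micron) divisions whose divisions land 23.3 pixels apart.

PIXELS_PER_DIVISION = 23.3     # measured from the micrometer photo (assumed)
MICRONS_PER_DIVISION = 100.0   # division spacing of the stage micrometer (assumed)

MICRONS_PER_PIXEL = MICRONS_PER_DIVISION / PIXELS_PER_DIVISION

def pixels_to_microns(pixels: float) -> float:
    """Scale a distance measured in image pixels to microns."""
    return pixels * MICRONS_PER_PIXEL

if __name__ == "__main__":
    print(f"scale: {MICRONS_PER_PIXEL:.2f} um/px")
    # e.g. a stripe-to-stripe spacing that measures 53 px in the photo:
    print(f"53 px -> {pixels_to_microns(53):.0f} um")
[/CODE]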

cool images - I'll be calibrating all the test units to the same spec, so brightness will be controlled for. The VGA splitter I got will come in handy for these tests.
 
I've been casually monitoring this thread, curious to see what made it keep popping up in the Display section.
Although I do have fond memories of playing on an old 17" at my parents' house, I never felt the need to jump on this bandwagon and hunt down an FW900 or another CRT myself.

Over on Reddit a guy posted the usual "CRT is so much better" reply in a monitor thread, which I've seen him do on multiple occasions. This time I decided to type out a reply about CRTs being deprecated and impossible to find nowadays, which sent me to the largest used-goods site in my country to prove my point. "CRT" gave me only two results, old cheap things - however, "FW900" did not give zero results as I had anticipated.

Yesterday I picked up my very own FW900, a 2000 model, which the seller had used for graphic design for a couple of years. It has been in room-temperature storage since 2003.
Paid the equivalent of about $100 for it - a fair price for satisfying my curiosity.

[image: U4cQ7kv.jpg]

[image: jgoPl3S.jpg]

[image: Axvw9Vi.jpg]


This thing complements my X-star@96hz really well. I'll have to get a graphics card with a DVI-D port as well as the analog one to use them at the same time, though.
How do I make it do 1200p@96hz correctly? When I set the res in CRU I get a working but squashed image.
 
wow congrats, sounds like you got a great deal!

Sounds like you just need to readjust the geometry, using the OSD controls. You can adjust the width, height, and position of the image using these controls (click the menu button to the left of the power button, and you should be able to find it).

Also, if you really want to make the most out of this monitor (and maintain its health and life span), I highly recommend getting a colorimeter and a WinDAS cable, and following the white point balance guide
 
Sad to say, after 11 years my Visionmaster Pro 451 has started to go blurry, and since I don't have the time to fix it myself, or the desire to spend the kind of money to get it fixed, I have decided to retire it. Probably won't chuck it, as some day I might want a toy to play with (when I'm old and have some money and time to burn :) ).

I replaced it with a cheap-ass Asus TN 23", because I'd rather not get too invested in a screen while all these new potentially awesome technologies (Lightboost, FreeSync, G-Sync, OLED, 120 Hz IPS, or something else?) are right now a bad thing to bet on, and I don't feel like dropping $700 on a loser (and I hate buying the latest crap, because you're signing up to be a beta-tester :( ).

I paid $450 for my last CRT, so I'm not afraid to lay out the big bucks, but I want to throw my money at something that will work for the 5-10 years I expect a high-end monitor to last me.

For now, I'll just live with the cheapo TN 1080p 1 ms screen. It's pretty impressive what you can get for $150: two digital inputs to hook up my two computers, and running in sRGB mode with Windows calibration it's not too far off from my CRT when I sit right in front of it, and I avoid any trouble with the crap pulsed backlight by letting the graphics card set the brightness. Twitch FPS games are also surprisingly smooth compared to my slow-as-dirt IPS TV, and the 60 Hz doesn't bother me because I already play with vsync off.

I'm glad I was able to hold out for so long before replacing my CRT, because LCDs have gotten a whole fucking lot better. I figure this cheapo will tide me over until the market figures out which new whiz-bang refresh rate tech will win :D
 
Still rocking my NEC MultiSync FE1250+ today. My wife thinks I'm crazy.

Running 100 hz on desktop and 120 hz in games.
 
Still rocking my NEC MultiSync FE1250+ today. My wife thinks I'm crazy.

Running 100 hz on desktop and 120 hz in games.

Nice. Drag a FW900 to a LAN Party someday.

I bet all the youngsters won't be able to decide whether you're a LOON using a stone-age CRT, or whether to approach the beast with curiosity.
 
I was that reddit user. I am bobthetrucker on reddit. Are you using CRT standard timings in CRU?
 
I was that reddit user. I am bobthetrucker on reddit. Are you using CRT standard timings in CRU?

No ill will with my response on Reddit; I just think it's weird that you keep posting in a forum with no other CRT enthusiasts. The form factor and age of these monitors make them fairly unpopular in the monitor forum.

I got it working with the CRT standard timings, yes.
 
On the good news side of things, I was able to get my $150 POS working at 1080p @ 74 Hz by using the reduced-blanking sync option. Nice to see how much bandwidth the original DVI timing spec wasted on blanking, because it means I can get within a stone's throw of my 85 Hz CRT :D
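Rough numbers on what reduced blanking buys, using the generic relation pixel clock = Htotal x Vtotal x refresh (the CVT-style totals below are ballpark assumptions, not the exact mode my monitor negotiates):

[CODE]
# Why reduced blanking helps: pixel clock = Htotal * Vtotal * refresh.
# Totals below are ballpark CVT-style numbers for 1920x1080, used only
# to show the scale of the difference.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

if __name__ == "__main__":
    # 1920x1080 @ 74 Hz with conventional (CVT-like) blanking:
    print(f"standard blanking: {pixel_clock_mhz(2560, 1120, 74):.0f} MHz")  # ~212 MHz
    # Same mode with reduced blanking (much narrower horizontal blank):
    print(f"reduced blanking:  {pixel_clock_mhz(2080, 1120, 74):.0f} MHz")  # ~172 MHz
[/CODE]

Shrinking the blanking shaves roughly 40 MHz off the required pixel clock, which is the wasted bandwidth I'm talking about.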

I am very disappointed that SED got killed, because now that scaling chips have gotten so good, I wouldn't have cared that the resolution is fixed. OLED is the only potential savior here, since every other LCD tech has some annoying flaw. It's going to be tough to unseat LED LCD at these prices, but hopefully Samsung and LG are hard-headed enough to make it work. The price of a 1080p WOLED set has fallen by 3x over the past two years (and it has since been succeeded by a 4K WOLED display, proving OLED can compete with LCD on density in TVs), and the size, resolution, and color fidelity of portable OLED devices have risen sharply in the last 3 years (720p at 4.7" to 2560x1600 at 10.5"), so there is hope :D
 
pretty sure lcds have been using bilinear (and maybe bicubic) scaling since always
 
pretty sure lcds have been using bilinear (and maybe bicubic) scaling since always

Not the ones from 11 years ago (when I was last shopping around for a monitor). I distinctly recall 17/19" 5:4 monitors blurring the living fuck out of 4:3 resolutions, even ones close to native (i.e. 1280x960). That was quite inconvenient when we were using a splitter to drive the 4:3 1024x768 projector in a conference room :(

Remember, that was a time when the LCD industry was first exploring the speedup they could get with 6-bit panels that looked like crap:

http://www.anandtech.com/show/1325/4

They also hadn't even heard of overdrive chips to speed up GtG, and lots of cheaper panels only came with VGA inputs (forcing you to use their crappy A2Ds, which have also gotten more accurate since then).

My cheapo $150 2015 monitor has all those features standard: two digital inputs, overdrive (without overshoot), a 6-bit panel with decent dithering, and a quality scaler, because they're tried-and-true cheap technologies 11 years later :)

You can see why I opted for the CRT in 2004. Unless you were doing lots of coding work and spent $600 on a 20" 1600x1200 LCD, you were wasting your money.
 
This thing complements my X-star@96hz really well. I'll have to get a graphics card with a DVI-D port as well as the analog one to use them at the same time, though.
How do I make it do 1200p@96hz correctly? When I set the res in CRU I get a working but squashed image.

For now I use these timings, but I'm curious how it should be done properly.

[image: Screenshot 2015-03-03 15.39.16.png]


Actually, I tried to achieve the best timings by eye, and found that cutting a few pixels here and there resulted in a good image with no visible distortions.

I found that lowering the sync width reduces the flickering of vertical lines, but I couldn't really figure out what this value actually is :p


Image that might be helpful:

[image: scansize.jpg]


Also a valuable article, I guess:

http://www.linuxjournal.com/article/1089?page=0,2
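In the same spirit as that article, here's a rough sketch in Python of how the CRU numbers hang together (the porch and sync values below are illustrative guesses, not a known-good FW900 mode):

[CODE]
# How CRU/modeline numbers relate: total = active + front porch + sync + back porch,
# pixel clock = Htotal * Vtotal * refresh, horizontal frequency = pixel clock / Htotal.
# The porch/sync values below are illustrative guesses for 1920x1200 @ 96 Hz.

def mode_summary(h_active, h_front, h_sync, h_back,
                 v_active, v_front, v_sync, v_back, refresh_hz):
    h_total = h_active + h_front + h_sync + h_back
    v_total = v_active + v_front + v_sync + v_back
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    h_freq_khz = pixel_clock_mhz * 1e3 / h_total
    return h_total, v_total, pixel_clock_mhz, h_freq_khz

if __name__ == "__main__":
    h_tot, v_tot, pclk, hfreq = mode_summary(
        1920, 128, 208, 336,   # horizontal: active, front porch, sync, back porch (assumed)
        1200, 3, 6, 38,        # vertical: active, front porch, sync, back porch (assumed)
        96)
    print(f"{h_tot}x{v_tot} total, pixel clock {pclk:.1f} MHz, Hfreq {hfreq:.1f} kHz")
[/CODE]

The main thing to watch is the horizontal frequency: if I have the spec right, the FW900 tops out around 121 kHz, so whatever totals you pick have to keep Hfreq under that at 96 Hz.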

Could someone share their CRU timings? UV maybe?
 
I've only ever used the preset timing formulas; I never mess around with manual timings. Then again, I don't run 1920x1200 at 96 Hz (I use 85 Hz).

You might find this post interesting though:

http://hardforum.com/showthread.php?p=1040440823#post1040440823

(for extra enrichment, be sure to go through the tutorial that this post links to)

Also see posts on this page:

http://hardforum.com/showthread.php?t=952788&page=506

I'm almost positive that Vito doesn't run at 96 Hz, so you're better off asking someone who has actually done 96 Hz successfully (I think XoR has?)
 
I've only ever used the preset timing formulas; I never mess around with manual timings. Then again, I don't run 1920x1200 at 96 Hz (I use 85 Hz).

You might find this post interesting though:

http://hardforum.com/showthread.php?p=1040440823#post1040440823

(for extra enrichment, be sure to go through the tutorial that this post links to)

Also see posts on this page:

http://hardforum.com/showthread.php?t=952788&page=506

I'm almost positive that Vito doesn't run at 96 Hz, so you're better off asking someone who has actually done 96 Hz successfully (I think XoR has?)

Correct!
 
I notice on older, used-up CRTs that I get that the "porches" are not straight with the bezel. I usually don't stretch the picture to the bezel, because the porch gets all wavy the closer I stretch it to the bezel.

I wonder what the outer edge-to-edge width of the FW900 is? Are folks achieving 2560x1600 on FW900s?
 
I wonder if the 390X and TitanX will have RAMDACs... If they do, I just might upgrade from my 980 Classies to a 390X.
 
I wonder if the 390X and TitanX will have RAMDACs... If they do, I just might upgrade from my 980 Classies to a 390X.

Not sure I'd bet on the AMD card having DVI-I; the R9 290X doesn't seem to come with a DVI-I port in the default configuration:

http://hardforum.com/showthread.php?t=1800043

At the same time it's confusing because the newer R9 285 DOES have a DVI-I port. So, who knows? Maybe it's a one-off, or maybe it's a trend?

I'm both pissed and happy as hell that my CRT died, because after Maxwell I wouldn't have had an easy way to drive the damn thing. VGA has been ditched by AMD in the 290, and is scheduled to be dropped by Intel in Skylake. Every other discrete card should follow shortly.
 
My guess is that the 380 will have DVI-I and the 390 will only have DVI-D, kinda like with 280 and 290 last time. Just basing that on the fact that the 285 has DVI-I even though it uses the newest chipset.
I am curious to see if Titan X will have it, I couldn't find a picture that gives a clear view of the connector.

This is a trend though, and one day there will be no more VGA output on video cards. Our only hope is for the HD Fury guys to develop a high-pixel-clock DAC. I don't think there is any other company that makes high-fidelity video DACs. All the other products I've seen are just cheap converters with no videophile-oriented features.
 
wtf does wccft mean?

That's the source of the second pic; they often make up shit or state rumors as facts to get clicks.
That AMD won't add a RAMDAC again to their new Fiji chips is a given; I'm glad that Nvidia keeps at least one DVI-I port.
 
Thanks for the Titan X pic, but I think you and I know that the Wccft pic is basically worthless. But you're right that it's a safe bet it won't have DVI-I, though I think it's still possible for the 380 to have it.
 
I fear that even the 380(X) won't have it. If they just rename the Hawaii chip they won't add a RAMDAC, and if they design a new chip they most likely won't include it again. We'll probably only find it on lower-end cards from now on; honestly I'm very surprised that it's still around on the new Nvidia cards. I just bought a DisplayPort-to-VGA active adapter with (allegedly) 260 MHz bandwidth, let's see how it works out.
 
Why not just get DisplayPort to VGA adapters? Also, does anyone know how wide the FW900 is?

Most current ones don't have enough bandwidth. There is a select group of VGA monitors that require over 200 MHz of bandwidth to drive, and most of these converters are just aimed at powering old VGA-only flat panels (topping out at 1600x1200 or 1080p at 60 Hz).

That, and the filtering quality of these converters can be a lot worse than what you get on an expensive video card. High-quality 400 MHz VGA has been a solved problem for a decade, but it's not a cheap thing. It's a tough sell to charge more for higher quality that few will need.
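For a sense of scale, a quick comparison of pixel clocks (the totals below are rough estimates with typical blanking, not exact modelines):

[CODE]
# Ballpark pixel clocks: what cheap DP-to-VGA converters target vs what a
# high-refresh CRT mode needs.  Totals include blanking and are rough
# estimates, not exact modelines.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

modes = {
    "1600x1200 @ 60 Hz (old LCD)": (2160, 1250, 60),
    "1920x1080 @ 60 Hz (old LCD)": (2200, 1125, 60),
    "1920x1200 @ 96 Hz (CRT)":     (2592, 1247, 96),
}

for name, (h, v, hz) in modes.items():
    print(f"{name}: ~{pixel_clock_mhz(h, v, hz):.0f} MHz")
# ~162 MHz and ~148 MHz vs ~310 MHz: most converters stop right around
# the first two numbers, which is why they can't drive these monitors.
[/CODE]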
 
High-quality 400 MHz VGA has been a solved problem for a decade, but it's not a cheap thing. It's a tough sell to charge more for higher quality that few will need.

But surely they don't cost too much to implement in graphics cards, right? Otherwise I'd think Nvidia would probably have taken out analog video long ago.
 
But surely they don't cost too much to implement in graphics cards, right? Otherwise I'd think Nvidia would probably have taken out analog video long ago.

No, because they order the parts in quantity and make the cards in quantity.

Something like this maybe adds less than a dollar to the BoM if you're selling 100,000 to 1 million units, but probably costs several times that for a smaller run. These adapters probably sell fewer than 100k units, and sell for a fraction of the price that a solid video card does, so it suddenly becomes price-sensitive.

It's also made more complex because people don't just want A GRAPHICS CARD WITH A RAMDAC, they want an adapter that takes a common output from their favorite graphics card and converts it to a high-bandwidth VGA signal. Small runs + high-bandwidth interaction with DisplayPort = $$$.
 
It's also made more complex because people don't just want A GRAPHICS CARD WITH A RAMDAC, they want an adapter that takes a common output from their favorite graphics card and converts it to a high-bandwidth VGA signal. Small runs + high-bandwidth interaction with DisplayPort = $$$.

What's the advantage of having an adapter that converts a common output to VGA, vs. having "native" ramdac?
 
What's the advantage of having an adapter that converts a common output to VGA, vs. having "native" ramdac?

If you don't convert a standardized output to drive a RAMDAC, then you have to provide the rest of the graphics processor, memory, and drivers to drive your custom RAMDAC. Not cheap, and not the best angle for enticing customers who want performance and compatibility with numerous software titles.

Much easier to just make a high-bandwidth converter, which works with all existing graphics cards, but of course you have to find enough customers willing to pony up the premium price it will cost :D How many people out there will actually buy one of these, instead of just settling for the cheapest VGA adapter they can find on Amazon? It's got to cost a whole lot more to make something with far more bandwidth, since far fewer people will pay for it!

This is why it's such a problem with all the major graphics vendors ditching VGA - the only cheap and plentiful way to connect VGA was through that connector.
 
Thanks. Just so I understand: when you talk about compatibility with software titles, are you referring to HDCP?

Also, do you know anything about whether HDMI has more input lag compared to VGA (not talking about conversion here, just driving a signal via HDMI vs. driving it via VGA)?
 
Most current ones don't have enough bandwidth. There is a select group of VGA monitors that require over 200 MHz of bandwidth to drive, and most of these converters are just aimed at powering old VGA-only flat panels (topping out at 1600x1200 or 1080p at 60 Hz).

That, and the filtering quality of these converters can be a lot worse than what you get on an expensive video card. High-quality 400 MHz VGA has been a solved problem for a decade, but it's not a cheap thing. It's a tough sell to charge more for higher quality that few will need.

Is THAT why I have issues on my second monitor via a DisplayPort-to-VGA active adapter? Maybe that's why it's capping at 60 Hz!!!

Is there anything faster on the Nvidia side than the 770 4GB with dual DVI-I?

For SOME reason I don't have this saved in my bookmarks, but there exists a crowdfunded and/or small-time CUSTOM DisplayPort-to-VGA adapter for enthusiasts and professionals; you can do anything from changing the firmware and embedding custom EDID data to overclocking it! Too bad I lost the dang link.
 