What was your least favorite graphics card?

GeForce SDR. I got it for Christmas, and shortly thereafter the DDR version came out. The 3dfx Voodoo 1 was also kinda lame, since I spent $200 on it, I still needed a regular video card, and it had that terrible loop-back cable.
 
HD 6950, purchased when AMD was in the absolute pits of bad drivers. I've had dozens of graphics cards, and it's by far the most forgettable.

Favorite card? My 8600GT, which was my gateway drug into PC building because of how slow it really was.
 
GeForce FX 5900.

There were certain games where the GeForce4 Ti 4200 was faster. Worst video card purchase I ever made.
 
I know what you mean about early Nvidia cards having really bad image quality. I personally had an Nvidia Riva 128 (predecessor to the TNT lineup) to go along with my 3dfx Voodoo 2. I couldn't believe my eyes at how different things looked when I eventually went to a new card 3 or 4 years later. I used to degauss my monitor on occasion thinking it was the monitor, but those old Nvidia cards were just poorly designed. I even read reviews saying they had bad image quality, but ignored them, thinking it couldn't be THAT bad.

My worst GPU would be a tossup between that old Nvidia Riva 128 and the XFX 7970 DD card that had tons of graphical anomalies right out of the box (quickly returned).

Don't forget the Riva 128ZX ;)
 
Would have to be my GTX 590. It was advertised as a 3GB card, but since it was two downclocked GTX 580s in SLI, it really only had 1.5GB of usable VRAM. Along with the inherent problems that come with SLI, it was just not a GPU that was worth $700 back in 2011.
 
Had many, many cards. Some stayed longer than others, but I was never really disappointed with any of them. I usually made considered choices.

Maybe the biggest mistake was buying two 7900GTXs for SLI and then finding out I really only needed one. The other stayed in its box till I sold it off on eBay much later.
 
GT 710, because it's so damn slow. It runs surviv.io well but barely averages 60 FPS on zombie rush.
 
Would have to be my GTX 590. It was advertised as a 3GB card, but since it was two downclocked GTX 580s in SLI, it really only had 1.5GB of usable VRAM. Along with the inherent problems that come with SLI, it was just not a GPU that was worth $700 back in 2011.
My robotics lab's GTX 580 refuses to die.
 
I'll jump on the Voodoo5 5500 bandwagon; it really was not a great card performance-wise. Luckily, I bought it several years after 3dfx was defunct, and by that point the card was worthless. I got it for $60 from a buddy in high school in 2002.

Surprisingly though, it ran most games well into 2005 if you didn't crank the resolution up too high. It ran the original Half-Life 2 and CS:S decently at 1024x768 because the original Source engine still had DX 6 support. That was dropped in Ep1 in favor of DX 7, and dropped completely in the Orange Box engine, which had a minimum of DX 8.1. The 2013 SteamPipe update dropped anything below DX 9 because of the cross-platform support added with their OpenGL wrapper emulating DX 9 calls.

Another contender was Nvidia's Tesla architecture; it was like Lassie and kept coming home. Originally released in their 8000 series, it came back for the 9000 series, the 100 series, some 200 series cards (GTS 250, GTS 240), and finally some 300 series parts (GT 330). It was not advertised as being recycled in those subsequent generations, and it burned many a gamer.

S3TC

The negative was that S3TC wasn't more broadly adopted.

S3TC was widely adopted by the industry and integrated into both DirectX 6.0 and OpenGL 1.3, and it still exists today as DXT and DDS texture compression:
https://en.wikipedia.org/wiki/S3_Texture_Compression
https://en.wikipedia.org/wiki/DirectDraw_Surface

While S3TC did allow higher-resolution textures to be used, it had the drawback of being an extremely lossy texture compression format. If you didn't have the source texture stored in a lossless format like PNG or TGA, editing became practically impossible, because the texture would grow progressively more corrupt each time you edited it. This is an especially annoying problem in Half-Life 2 and Source engine games in general: almost all of their textures use DXT1 and DXT5, which are extremely lossy, so any texture edits that do need to be made end up looking very poor. The biggest problem is that S3TC gives more color precision to green, so compressed textures tend to shift toward green.
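To put numbers on it, here's a rough Python sketch of what a DXT1 block boils down to (a naive illustration I threw together, not S3's or Valve's actual encoder; real encoders search much harder for good endpoints):

# Naive DXT1-style compression of one 4x4 RGB tile: 48 bytes of raw
# pixels collapse into 8 bytes (two RGB565 endpoints + 16 2-bit indices).

def to_rgb565(r, g, b):
    # Quantize 8-bit channels to 5:6:5 - note green keeps an extra bit.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    # Expand 5:6:5 back to 8 bits per channel with bit replication.
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def compress_block(pixels):
    # pixels: 16 (r, g, b) tuples covering a 4x4 tile.
    # Crude endpoint pick: the darkest and brightest texels.
    lo, hi = min(pixels, key=sum), max(pixels, key=sum)
    c0, c1 = to_rgb565(*hi), to_rgb565(*lo)
    e0, e1 = from_rgb565(c0), from_rgb565(c1)
    # The entire tile must make do with this 4-color palette:
    palette = [e0, e1,
               tuple((2 * a + b) // 3 for a, b in zip(e0, e1)),
               tuple((a + 2 * b) // 3 for a, b in zip(e0, e1))]
    def nearest(p):
        return min(range(4), key=lambda i: sum((x - y) ** 2 for x, y in zip(p, palette[i])))
    return c0, c1, [nearest(p) for p in pixels]  # 16 + 16 + 32 bits

Every texel snaps to one of just four colors, so each decompress-edit-recompress pass stacks fresh quantization error on top of the old; that's the progressive corruption I mean.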

It made sense at the time to use texture compression to allow for higher resolution textures on video cards with limited amounts of RAM, but today it's just a painful legacy technology that needs to go away and be replaced with something better.
 
Let's see... the entire Radeon X1000 lineup. I tried the X1500, X1600, X1650 Pro, X1800, and X1950 Pro, and they all sucked. I hated them so much that I didn't buy another Radeon card until the Vega 56 I bought in early 2019.

You missed out BIG time on the X1300 then. I had that card in a lofty 256MB/PCI configuration, and it was definitely capable of running 3D Pinball at 12 FPS.

Though one of the most underwhelming cards I've owned was probably the GTX 560 Ti; that thing sucks :/
 
ATI Radeon 9800XT.
Built-by-ATI, purchased it directly and shipped from ATI Markham, Ontario.

As I recall they had warnings all over the place to remind you to plug in the 4-pin molex auxiliary power cable, as that sort of thing was in its infancy.

A close runner-up to the above was the 9800 Pro AIW that I bought and used for a DIY DVR build to record Star Trek: Voyager reruns that aired on my local cable station at 0200 every weekday.

Reading that all back I realize I sound like a big-time loser...
 
ATI Radeon 9800XT.
Built-by-ATI, purchased it directly and shipped from ATI Markham, Ontario.

As I recall they had warnings all over the place to remind you to plug in the 4-pin molex auxiliary power cable, as that sort of thing was in its infancy.

A close runner-up to the above was the 9800 Pro AIW that I bought and used for a DIY DVR build to record Star Trek: Voyager reruns that aired on my local cable station at 0200 every weekday.

Reading that all back I realize I sound like a big-time loser...

These were your least favorite?
 
These were your least favorite?
Most favorite.
My bad. Reading comprehension...
Sorry. You can all kick my ass if you think I deserve it.

I guess my least favorite would be the nVidia GeForce 210 that I have sitting in my closet.
NOT because it's woefully underpowered (because I only bought it as an emergency boot card) BUT because it has a screamy little fan that makes WAAAY too much noise for its own good.
 
Most favorite.
My bad. Reading comprehension...
Sorry. You can all kick my ass if you think I deserve it.

I guess my least favorite would be the nVidia GeForce 210 that I have sitting in my closet.
NOT because it's woefully underpowered (because I only bought it as an emergency boot card) BUT because it has a screamy little fan that makes WAAAY too much noise for its own good.

Oh, geez. Let me guess, it sings like heaven's most irritating, off-key angel the entire time you use it? As an emergency backup, it'd work... It'll just goad you into slapping in something better as fast as financially possible.
 
While S3TC did allow higher-resolution textures to be used, it had the drawback of being an extremely lossy texture compression format. If you didn't have the source texture stored in a lossless format like PNG or TGA, editing became practically impossible, because the texture would grow progressively more corrupt each time you edited it. This is an especially annoying problem in Half-Life 2 and Source engine games in general: almost all of their textures use DXT1 and DXT5, which are extremely lossy, so any texture edits that do need to be made end up looking very poor. The biggest problem is that S3TC gives more color precision to green, so compressed textures tend to shift toward green.

It made sense at the time to use texture compression to allow for higher resolution textures on video cards with limited amounts of RAM, but today it's just a painful legacy technology that needs to go away and be replaced with something better.

The reason for that green push is how RGB is handled for 16-bit color in (nearly all) operating systems. Human eyes are most receptive to the color green - it's thought that this is an adaptation from our hunter-gatherer days, when picking the right shade of ripe or slightly under-ripened fruit or vegetable could be the difference between a good meal and a very bad one. Thus green got the extra bit, which is also why some alpha blending effects skewed vaguely greenish in the days before 32-bit color. As a side note, classic Mac OS set RGBA as 5:5:5:1, so green didn't get preference, and the spare bit was allocated to give slightly more flexibility in handling transparency. That yielded some distinct visual artifacts for Mac games of the time.
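A minimal sketch of the two packings, in case anyone wants to see the bit math (the layouts follow the descriptions above and are illustrative, not any particular OS's actual definitions):

# 16-bit pixel packing: PC-style RGB565 vs the 5:5:5:1 RGBA layout above.
# Green gets 64 levels in 565; in 5551 every color channel gets 32 levels
# and the spare bit becomes an all-or-nothing alpha mask.

def pack_rgb565(r, g, b):
    # 5 bits red, 6 bits green, 5 bits blue
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def pack_rgba5551(r, g, b, a):
    # 5 bits per color channel, 1 bit of alpha (fully opaque or fully clear)
    return ((r >> 3) << 11) | ((g >> 3) << 6) | ((b >> 3) << 1) | (1 if a >= 128 else 0)

# Nearby 8-bit greens stay distinct with 6 bits but collapse at 5:
for g in (100, 104, 108, 112):
    print(g,
          (pack_rgb565(0, g, 0) >> 5) & 0x3F,         # 6-bit green: 25, 26, 27, 28
          (pack_rgba5551(0, g, 0, 255) >> 6) & 0x1F)  # 5-bit green: 12, 13, 13, 14

That 104/108 collision in the 5-bit column is the banding the extra green bit was buying on the PC side, and giving it up for the alpha bit is part of why 16-bit gradients looked different on the Mac.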

S3TC was a late-'90s development. The decision to accommodate the hardware of the day with aggressive compression, making the most of limited bandwidth and framebuffer size, hobbled the spec's ability to deliver a high-fidelity experience. These days you'd use PVRTC for most mobile platforms and ASTC or 3Dc for everything else, unless there's some big development I haven't heard about.

I will also corroborate the 3Dfx VSA parts being underwhelming. The Voodoo4 I picked up a few years after the company went bankrupt didn't reliably beat a Radeon 7000 and was stuck in the DirectX 6 doldrums. At least they finally supported more blending modes.
 
All the hype on the Voodoo cards back in the day. The performance was underwhelming and the company went bankrupt.
 
All the hype on the Voodoo cards back in the day. The performance was underwhelming and the company went bankrupt.

Until near the end, performance was the main thing they had going for them, honestly. Prior to Voodoo4/5 they didn't support 32-bit color, textures beyond 256x256, or certain blending modes without burning another rendering pass, but 3Dfx cards were fast at painting pixels onto triangles and had vast market penetration. 3Dfx was also second only to Nvidia in working with developers to fix issues, so they had some goodwill, even if a lot of their advantage was tied up in Glide and MiniGL drivers. But, like a lot of Web 1.0 tech companies, they burned cash irresponsibly and didn't invest enough in their next-gen Sage design. There are a lot of things they should have done to prepare the company for the future, but at this point I don't know whether a duopoly between 3Dfx and Nvidia would have been a more favorable outcome than the current Nvidia-AMD scenario. And it's water under a very distant bridge anyway.

You describe it so well it's almost like I bought it from you...

I lived through the early '00s and those first terrible actively cooled graphics cards. Dinky low-TDP designs like your GT 210 / 8400GS, with a fan plopped on to save a few cents on a few cc's of machined aluminum, are a throwback.
 
Unfortunately, my current card: the PowerColor RX 5700 XT. No fault of PowerColor; it's a well-built card and cooler.

So many driver issues; most of my time with it has gone to figuring out which "features" I have to disable for driver stability.
 
Unfortunately, my current card: the PowerColor RX 5700 XT. No fault of PowerColor; it's a well-built card and cooler.

So many driver issues; most of my time with it has gone to figuring out which "features" I have to disable for driver stability.
I've heard others say this, and it's why I have remained with NVIDIA. I never had issues with my older RX 580 and 590 cards, but I was scared off of the RX 5500/5700.
 
I've had a total of 2 black screens so far in the 3 months I've had my 3900X/5700 setup. Win+Ctrl+Shift+B (which restarts the graphics driver) can fix that. Or I can RDP into it and reboot that way (it doesn't actually freeze my PC when it black screens).
I have also had a weird BSOD one time when I decided to launch some random free game off Steam while I already had another game running... Don't know if it was GPU-related, though.
I'm surprised my PC has been as stable as it is lol, as I run a Minecraft server, an ARK server, a mail server VM, some VMs for school courses (not 24/7), and file shares, and I often play games with all of that running lol.
I basically use my own PC as a server. I love having 32GB of RAM and 24 threads lol.
 
I've heard others say this, and it's why I have remained with NVIDIA. I never had issues with my older RX 580 and 590 cards, but I was scared off of the RX 5500/5700.

I had the opposite experience, oddly enough - Polaris was always weird for me, but Navi has been perfect so far. Drivers are not AMD's strong suit.
 
The only card I had major issues with was the MSI 7850 Twin Frozr OC. The first card had artifacts out of the box, and the replacement's cooling steadily deteriorated over its life until it died.
 
6800 Ultra. I ran a pair of them when SLI was a thing.
They got me into the notion that water cooling keeps a GPU under a certain temp threshold for better boost duration, or for holding the stock max clock 24/7.
This was back when my 570J was the king of single-core consumer CPUs.
 