NVIDIA GeForce FX Named Best Graphics Processor of 2002

Man, does that ever bring me back.
Oh yeah, I remember the LC was 'Low Cost' and lacked the FPU, and EC was 'Embedded Controller' and lacked both the FPU and PMMU - they were all still very expensive, though!

Those systems were built like tanks, and to this day I've always wanted a Quadra 700, since it was the one Dennis Nedry used in the first Jurassic Park movie.
Forgot all about the Mac clones back then - those are actually pretty tough to find now and can go for a pretty penny.

I actually don't remember ever seeing Copland with its multithreading capabilities back then - don't know how I missed out on that one.
So it was basically competing against OS 8 & 9, then?

You might appreciate this if you haven't seen it already:
https://hardforum.com/threads/red-falcons-retrocomputing-thread.1841374/

Cheers! :D

The LC was $3,000 when we got it, with a 12-inch color display. We had a dot matrix printer and a 2400 baud modem. I later paid for a 14.4k modem to play BBS door games faster. I think Prodigy was my first online experience.

Great thread on the Quadra setup!

Copland was Apple's failed attempt to make a “modern” OS. If it had come to be, it would have had all the features of Windows 95/98 with System 7 backward compatibility. After it failed, Apple bought NeXT for its OS, though they were also considering BeOS.

A lot of the look and some of the features were released with Mac OS 8.
 
Doom 3 was August of '04 and the 6000 series was mid-'04. I vaguely remember a bit of a rivalry over Doom 3 between the X800 and 6800 crowds.
That's right!
Back when SLI and Crossfire were both in their very first iteration, no less.

Games back then were still single-threaded as well - the AMD FX-55 (and later FX-57) was the king of single-core CPUs back in those days.
 
Oh wow, the memories... I was smart enough to steer my parents toward the Radeon 9600 XT in 2003, though what I really wanted, of course, was a 9700/9800, which offered a considerable performance boost for only a few dollars more.

Then again, that crappy Compaq was held back by an AMD Athlon XP 1800+, 512 MB of RAM, and a 5,400 RPM hard drive. I don't think I need to describe the leap I felt going from that thing to an Intel Core 2 Quad Q6600, 2 GB of RAM (later the full 8 GB), a GeForce 8800 GT, then a GTX 480, and currently a GTX 760 (albeit a hand-me-down after I moved on to the i7-4770K and GTX 980). I could now Alt-Tab without bringing the entire system to a screeching halt, for starters!

Funny thing is, the GeForce FX line has seen some vindication from the VOGONS crowd, because while those cards absolutely suck at anything involving Direct3D 9/Shader Model 2.0, they're actually pretty good backwards compatibility-wise, down to still supporting paletted textures, table fog, and all those other weird legacy features that started being omitted on the GeForce 6 series and later (and generally never worked right on ATI Radeons or other vendors' GPUs).
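If anyone wants to check what a given card/driver combo still exposes, a D3D9 caps probe is only a few lines. Rough sketch, assuming the old DirectX 9 SDK headers (paletted textures show up as P8 texture format support, table fog as a raster cap):

[code]
// Build on Windows with: cl probe.cpp d3d9.lib
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Paletted textures: exposed as support for the P8 texture format.
    HRESULT p8 = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        0, D3DRTYPE_TEXTURE, D3DFMT_P8);
    std::printf("Paletted textures (P8): %s\n",
                SUCCEEDED(p8) ? "yes" : "no");

    // Table fog: a raster cap rather than a texture format.
    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT,
                                     D3DDEVTYPE_HAL, &caps))) {
        std::printf("Table fog: %s\n",
                    (caps.RasterCaps & D3DPRASTERCAPS_FOGTABLE)
                        ? "yes" : "no");
    }

    d3d->Release();
    return 0;
}
[/code]

Per the above, FX-class drivers report yes to both, while GeForce 6 and later started dropping them.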

They can even theoretically support shadow buffers in Splinter Cell, but only with a very specific driver version meant for GeForce 4-class cards; on anything else, you get either broken shadows or dynamic shadows inferior to the original Xbox version! And people say modern PC ports are terrible... trust me, early/mid-2000s PC ports were usually much worse, if only because native PC games set a much higher bar back then.

Doom 3 was August of '04 and the 6000 series was mid-'04. I vaguely remember a bit of a rivalry over Doom 3 between the X800 and 6800 crowds.
Funny thing is, I don't remember much of the rivalry in question, but Doom 3 sidesteps the biggest disparity between the GeForce 6800 and Radeon X800/X850: only the former supports Direct3D 9.0c/Shader Model 3.0.

This was a selling point at the time: Splinter Cell: Chaos Theory, Far Cry, and TES IV: Oblivion quickly got patched to implement (or even shipped with) HDR rendering that could only be done with SM 3.0 support, giving NVIDIA's GPUs a visual advantage until ATI shipped the Radeon X1800. In hindsight, though, it could be argued that GPUs didn't really have enough grunt to do SM 3.0 without a significant framerate penalty until the next generation. (Worth noting is that the HDR in question only applies to the 3D renderer; display HDR is a much more modern concept from a consumer perspective, and it makes me wonder if all those old DX9c-era games could be patched to map their in-engine HDR more accurately onto a modern HDR display.)
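For the curious, "in-engine HDR" just means the renderer keeps scene luminance in floating point and only tone-maps it down to the display's range at the very end. Here's a toy C++ sketch of the idea (made-up white point, not any particular game's code): the Reinhard curve stands in for the kind of squeeze-to-SDR step those SM 3.0-era games did, and the PQ encode is roughly what a hypothetical modern-HDR-display patch would hand to an HDR10 screen instead.

[code]
#include <cmath>
#include <cstdio>

// Scene-referred luminance (nits) -> display-referred value.
// SDR path: classic Reinhard tone map, squeezing everything into [0,1].
// The 200-nit white point is an arbitrary pick for this sketch.
float tonemap_sdr(float nits, float white_point = 200.0f) {
    float x = nits / white_point;
    return x / (1.0f + x);
}

// HDR path: SMPTE ST 2084 (PQ) encode, the transfer function HDR10
// displays expect. A patched game could feed its scene-referred values
// through this instead of crushing them with the SDR curve above.
float pq_encode(float nits) {
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    float y  = std::fmin(std::fmax(nits / 10000.0f, 0.0f), 1.0f);
    float ym = std::pow(y, m1);
    return std::pow((c1 + c2 * ym) / (1.0f + c3 * ym), m2);
}

int main() {
    float highlight = 1000.0f; // a bright highlight the engine computed
    std::printf("SDR tonemapped: %.3f\n", tonemap_sdr(highlight));
    std::printf("PQ-encoded:     %.3f\n", pq_encode(highlight));
}
[/code]

The scene data those old games computed was already "HDR" in that sense; it's only the final squeeze-to-SDR step that a patch would have to swap out.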

Doom 3, however, is an OpenGL game, so the point's kinda moot. On top of that, it's said to be one of the few games of the time that the GeForce FX didn't completely suck at, for that exact reason.

That's right!
Back when SLI and Crossfire were both in their very first iteration, no less.

Games back then were still single-threaded as well - the AMD FX-55 (and later FX-57) was the king of single-core CPUs back in those days.
I sure don't miss CrossFire master/slave cards, though SLI seemed plenty janky back then too.

I still have a magazine lying around somewhere that highlighted one of the SLI issues with Far Cry, showing a screenshot where only part of the frame on screen had a shader effect applied. This was because SFR (Split-Frame Rendering) was still an option back then, whereas modern SLI is entirely assumed to be AFR (Alternate Frame Rendering), if it's even supported.

I do wonder if revisiting SFR is the way to go for modern VR headsets, where VR SLI/CrossFire adoption in software is almost nonexistent and raw frame captures already show each eye's buffer on the two halves of a single frame. Lack of GPU grunt is one of the big things holding back VR in general right now, and the fact that you can't just double up on GPUs, and we still don't have anything meaningfully faster than a 1080 Ti for a sane price (Titan V doesn't count), isn't helping one bit.
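Just to make the per-eye idea concrete, here's a toy sketch (plain C++, with a hypothetical renderEye() standing in for a real per-GPU command submit; nothing vendor-specific). The point of an SFR-style VR split is that both GPUs work on the *same* frame, one eye each, so unlike AFR you don't pay an extra frame of latency:

[code]
#include <cstdio>
#include <thread>

// Stand-in for submitting one eye's view to a specific GPU. In a real
// VR SLI/CrossFire path this would be a per-device command submit;
// here it just labels the work.
void renderEye(int gpu, const char* eye, int frame) {
    std::printf("frame %d: GPU%d renders %s eye\n", frame, gpu, eye);
}

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        // SFR-style split for VR: both GPUs work on the same frame,
        // one eye each.
        std::thread left (renderEye, 0, "left",  frame);
        std::thread right(renderEye, 1, "right", frame);
        left.join();
        right.join();

        // AFR, by contrast, would hand the *whole* frame to
        // GPU (frame % 2) - more throughput, but an extra frame
        // of latency, which is poison for VR.
    }
}
[/code]

Since each eye is a full render of the same scene anyway, the split is far more natural for VR than carving up a monitor's frame ever was.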
 