Should the 5700xt be considered one of the greatest of all time?

I confused the driver issues with Voodoo Rush.

Performance of the Banshee, though, was lower than the V2's due to its lack of a second texture unit. Image quality on both was considered equal.


https://www.anandtech.com/show/197/3
Yeah, "Rush" was more problematic.
Though I am not sure if that was down to drivers (as in Windows drivers) or to the fact that many early Glide games were DOS games with baked-in Glide support, and those did not run well on the Rush. BTW, with the V2 most DOS games ran well, albeit only in 640x480; the only case where I had issues was Carmageddon. And while that did irritate me at the time, I have to say I prefer the original software rendering in that game, especially the "hi-res" 640x480 mode, which runs perfectly fine on anything faster than a Pentium 1.

As for image quality, I had the V2, V3 and Banshee at different points in time, so I have some experience with those cards.
Certainly when going from a V2 12MB to a V3 3000 the image quality improved, with a much cleaner look on the V3.
Right after the V3 I had a Banshee (money issues, profitable exchange) and its image looked very similar to the V2's. Maybe the dithering was slightly more visible because of the filter used, but the dithering itself definitely looked more like it did on the V3 than on the V2. I am not sure if the Banshee had the same kind of 2x2 filtering the V3 later did, probably not, but it definitely used better dithering, and all 3dfx cards had some kind of special filter on them. The most visible difference between the V3 and the Banshee for me, however, was that while on the V3 there was no point in running at anything other than 1024x768, or at the very least 800x600, the poor Banshee was enough slower that the resolution had to go one notch lower, so 800x600 was the typical resolution (exactly how I used the Voodoo2) and in more extreme cases 640x480 - though I do not remember ever actually using 640x480 in games on any 3dfx card, other than on those designed for the Voodoo1 which could not go any higher, and other than to see how much better performance could get on a better 3dfx card.

After that I got my hands on a V2 8MB, and then even a second one for testing SLI (SLI testing was the whole reason for borrowing these cards from people :) ). When comparing the V2 against the Banshee it was immediately clear that the V2 image was slightly rougher. After adding the second V2 and enabling SLI the image got a kind of scanline look to it (probably why they named it Scan-Line Interleave 🙃), with every other line being slightly different in brightness, but of course performance was enough to forego 800x600 and run everything at 1024x768 at a higher framerate, so overall it worked great.
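Roughly, the interleaving works like this (a Python sketch of the idea only, not of how the hardware actually merges the two analog outputs - the resolution and brightness values are made up):

```python
import numpy as np

# Sketch of Voodoo2 SLI ("Scan-Line Interleave"): each card renders every other
# scanline, and the final frame alternates rows from the two boards.
H, W = 768, 1024
card_a = np.full((H // 2, W), 100, dtype=np.uint8)  # even scanlines, card A
card_b = np.full((H // 2, W), 98,  dtype=np.uint8)  # odd scanlines, card B (slightly dimmer output)

frame = np.empty((H, W), dtype=np.uint8)
frame[0::2] = card_a   # rows 0, 2, 4, ... from card A
frame[1::2] = card_b   # rows 1, 3, 5, ... from card B

# If the two cards' DAC outputs don't match exactly, every other line differs
# slightly in brightness - the "scanline look" described above.
print(frame[:4, 0])    # -> [100  98 100  98]
```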

BTW, one thing to note: the Voodoo2, especially the 8MB variants, did run out of memory in newer games, and in those cases it ran slower than even the Banshee.
I am not sure if there is any game still playable on 3dfx cards (pretty much anything DX8 was not, and some DX7 games might be too much for old 3dfx hardware with its 256x256 texture limit and other limitations) which had this issue on a 12MB Voodoo2, but for the 8MB cards a good example is Quake 3, which could not be played at the max texture setting even with V2s in SLI, while it ran fine on the Banshee.
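Some back-of-the-envelope math on why the 8MB cards hit that wall, assuming I remember the Voodoo2 memory layout right (4MB framebuffer plus dedicated texture RAM split between the two TMUs); the numbers are illustrative, not measured:

```python
# Rough texture-memory budget for the two Voodoo2 variants.
def per_tmu_texture_mb(card_total_mb, framebuffer_mb=4, tmus=2):
    return (card_total_mb - framebuffer_mb) / tmus

tex_256_16bit_kb = 256 * 256 * 2 / 1024      # one 256x256 16-bit texture = 128 KB
with_mips_kb = tex_256_16bit_kb * 4 / 3      # mipmap chain adds roughly a third

for total in (8, 12):
    per_tmu = per_tmu_texture_mb(total)
    fit = int(per_tmu * 1024 / with_mips_kb)
    print(f"{total}MB card: {per_tmu:.0f}MB per TMU, ~{fit} max-size textures resident")

# Anything beyond that has to be swapped in over the PCI bus, which is when the
# 8MB V2 (even in SLI) falls behind a Banshee with its unified memory pool.
```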

Overall I was very pleased with the Banshee, and apart from the fact that I had it after the much faster Voodoo3 3000, its performance never disappointed me. I also had the opportunity to test it against a Riva TNT2 M64, and while the M64 was faster in something like Quake 3, it looked quite a lot worse at 16-bit and ran much slower at 32-bit, so cheap vs. cheap the Banshee was IMHO the better accelerator to have.
 
Don't kid yourself. AMD wasn't trying to make a midrange card with the 5700XT. It was shooting for the high end, but the architecture's inefficiency reared its ugly head and it just wasn't able to compete. The idea is always to build an ultra-high-end card and then scale it back for different price points, as that's generally the best way to go from both a manufacturing and a technical standpoint. AMD also talked about scaling etc., which, given its horrendous power consumption, was never in the cards. Pun intended. There was never going to be a scaled-up version of this. This was the high end; it just failed to measure up to NVIDIA's midrange at the time. This was also the end of the line for that architecture, as the architecture behind the 6800/6800XT and 6900XT is almost entirely new. There is a good reason for that.
They could have scaled up Navi to compete with even the 2080 Ti if they had really wanted to.
There was, however, no point in doing that: the cost would have been huge, without the modern features they could never truly claim to have the best GPU, and it is the mid-range that generates the highest profits anyway.

Is the 6800 entirely new? Navi looks like something in between GCN and RDNA2... perhaps it's RDNA1 🙃
 
Yeah, "Rush" was more problematic.
Though I am not sure if that is due to drivers as in Windows drivers or because many early Glide games were DOS games with baked Glide support and these did not run well on Rush. BTW. With V2 most DOS games ran well, albeit in only 640x480 and only case I had issues was Carmageddon, though overall while it did irritate me at the time I have to say that I prefer original software rendering in this game, especially "hi-res" 640x480 mode - which on anything faster than Pentium 1 runs perfectly fine.

As for image quality I had V2, V3 and Banshee on different points in time so I have some experience with those cards.
Certainly when going from V2 12MB to V3 3000 the image quality improved with much cleaner look on V3.
Right after V3 I had Banshee (money issues, profitable exchange) and it looked very similar to how it did on V2. Maybe dithering was slightly more visible because of the filter used but dithering itself was definitely looking more like it did on V3 than V2. I am not sure if Banshee had the same kind of 2x2 filtering as V3 later did, probably not, but it definitely used better dithering and all 3dfx cards did have some kind of special filter on them. The most visible difference between V3 and Banshee for me however was that while on V3 there was no point in running at anything than 1024x768 or at the very least 800x600 the poor Banshee was slower enough that resolution had to be set one notch lower so 800x600 was typical resolution (so exactly how I used Voodoo2) and in more extreme cases 640x480 - though I do not remember actually ever using 640x480 in games on any 3dfx card other than those which were designed for Voodoo1 and could not change to anything higher and other than to see how much better performance could get with better 3dfx card.

After that I got my hands on V2 8MB and then even second one for testing SLI (the SLI testing was the purpose to borrow these cards from people :) ) When comparing V2 vs Banshee it was immediately clear that V2 image was slightly rougher. After adding second V2 and enabling SLI the image got kinda scanline look to it (probably why they named it Scan-Line Interleave 🙃) with half of lines being slightly different brightness but of course performance was enough to forego 800x600 and run everything at 1024x768 at higher framerate so overall it worked great.

BTW. One thing to note: Voodoo2 especially 8MB variants in newer games did run out of memory and in these cases ran slower than even Banshee.
I am not sure if there is any game which could still be played on 3dfx cards (pretty much anything DX8 was not, also some DX7 games might be too much for old 3dfx with 256x256 texture limit and other limitations) which had this issue on 12MB Voodoo2 but for 8MB the good example is Quake 3 which could not be played at max texture setting, even with V2 in SLI, while it ran fine on Banshee.

Overall I was very pleased with Banshee and other than the fact I had it after much faster Voodoo3 3000 I was very pleased with its performance. I also had opportunity to test it against Riva TNT2 M64 and while M64 was faster in something like Quake3 it looked quite a lot worse at 16bit and ran much slower at 32bit so cheap vs cheap Banshee was imho better accelerator to have.

Sorry about confusing the two.

You’re bringing me back to the days of trying to decide between an original PowerVR and a Voodoo 1. I had an HP P120 desktop... which I recently repurchased, LOL... but I ended up with a RIVA 128 and then a V2 alongside the RIVA on a P2-300. My HP was stolen. I later had a Falcon NW with V2 SLI.

I was thinking about the image quality issue and you reminded me of something: the external pass-through dongle. I had forgotten that the external dongle would lower image quality, and the Banshee didn’t need one. That might be part of the issue. It got even worse if you had an MPEG playback card (remember those?!?). Also, CRTs were more finicky about that stuff - Trinitrons especially.

Funny thing is, I just dropped a V2 into a Pentium MMX 200 machine. I think it’s the same Voodoo 2 I used in my P2-300... like exactly the same card.

I actually still have two new V2s sealed in the box - STB models from after 3dfx bought them.

The Voodoo brand died off, I think, due to the lack of a hardware Transform and Lighting (T&L) engine. My father had a Voodoo 5, but I recall it not playing Warcraft 3.

He threw that old PC out, much to my great sadness.

I want to say I have a pair of V3-3500 TV units that are new white-box models.

Good times.
 
The Voodoo brand died off, I think, due to the lack of a hardware Transform and Lighting (T&L) engine. My father had a Voodoo 5, but I recall it not playing Warcraft 3.
Yes, the T&L unit and "the world's first GPU" that the GeForce 256 was... Other than in Nvidia's marketing materials, and in some games enabling a kind of bump mapping (DOT3 bump mapping) which did not strictly require T&L, I do not remember it adding that much to the overall experience. 3DMark ran much better with it enabled, but not much else did. Maybe Max Payne, as it might have run on the same engine, but I am actually not sure. GPU tech, like CPU tech, skyrocketed at that time, and hardware upgrades were often mandatory to keep up with the newest game releases.
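For anyone curious, DOT3 boils down to a clamped per-pixel dot product done in the texture combiners rather than in a T&L unit. A rough Python sketch with made-up sample values:

```python
import numpy as np

# DOT3 bump mapping in a nutshell: both the per-pixel normal (from a normal map)
# and the light direction (fed in as a color) are stored in the [0,1] range,
# expanded to [-1,1], then dotted and clamped. It's just a per-pixel blend op.
def expand(v):                                  # [0,1] -> [-1,1]
    return 2.0 * (v - 0.5)

normal_map_texel = np.array([0.5, 0.5, 1.0])    # "flat" normal pointing out of the surface
light_color      = np.array([0.7, 0.7, 0.85])   # light direction encoded as a color

n = expand(normal_map_texel)
l = expand(light_color)
intensity = max(np.dot(n, l), 0.0)              # clamped dot product = diffuse bump lighting
print(round(float(intensity), 3))               # -> 0.7
```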

I had a GF256 32MB SDR as my second 3D accelerator, my first graphics card with 3D acceleration on board, and my first real GPU. It was a decent card, this whole "GeForce" thing, and where it worked it performed admirably. Geometry with T&L in the 3DMark 2001 tests ran better on the GF256 with its 4x1 setup than on, say, the GeForce2 MX 400, which was a stripped-down part but had a 2x2 architecture (two pixel pipelines with two texture units each). In games, though, the higher-clocked GF2 MX 400 models could outperform the old GF256. A friend got an MX 400 card and we did a bunch of comparisons. I swapped my GF256 for a Voodoo3 (and some money back) as the Voodoo ran most games better, especially those based on the Unreal engine and many others with Glide support, while newer games still ran pretty decently on it compared to the effectively mandatory 32-bit color on the GF256 (16-bit sucked on that card as much as it did on the Riva TNT/2), for which the card didn't quite have the required memory bandwidth yet.
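The 4x1 vs 2x2 difference in rough fill-rate terms (clock speeds quoted from memory, so treat them as approximate):

```python
# Rough fill-rate comparison; clocks are from memory and may be slightly off.
cards = {
    "GeForce 256":     {"mhz": 120, "pipes": 4, "tmus_per_pipe": 1},
    "GeForce2 MX 400": {"mhz": 200, "pipes": 2, "tmus_per_pipe": 2},
}
for name, c in cards.items():
    pixel_rate = c["mhz"] * c["pipes"]              # Mpixels/s
    texel_rate = pixel_rate * c["tmus_per_pipe"]    # Mtexels/s
    print(f"{name}: {pixel_rate} Mpix/s, {texel_rate} Mtex/s")

# Single-texture passes favor the 4x1 GF256; dual-textured games (most of them
# by then) favor the 2x2 MX, which is why the MX 400 could pull ahead in games
# while the GF256 still won the geometry/T&L tests in 3DMark.
```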

Those were quite crazy times. Graphics card tech skyrocketed, pretty much the same as CPU speeds. One year was enough to make a Glorious PC Gaming Master Race rig obsolete. At least hardware was much less expensive back then than it is today.

Did Warcraft 3 really require T&L though? I remember a friend playing the game on his Riva TNT2 M64, and that card had even fewer features than the Voodoo4 4500, and thus also the V5 5500. Maybe a driver issue?

BTW, boxed 3dfx cards are getting more and more expensive. Hold on to them as your retirement fund 🙃
 
So they made a 40 CU part with a small die to compete at the high end? Yeah... no. Whether or not they tried to scale it up and it didn't work is not an argument about whether the 5700 XT was a good card. The 5700 XT was not meant to compete with a 2080; they may have hoped they could scale the architecture up or down or whatever, but those imaginary parts are not the 5700 XT. The question is about the 5700 XT, not some mythical part that was never made and couldn't compete. It's more likely they were focused on the CPU side and didn't want huge dies taking up space at TSMC with (at the time) questionable defect rates. The card that was actually released was the 5700 XT, and it competed well with what it was intended to compete with, which is the discussion here - not other cards that were never produced from this architecture, but the actual 5700 XT.
My point still stands: the 5700 XT wasn't meant to compete at the high end (the 5700 XT itself, not the architecture). The argument isn't about how or why AMD didn't compete, it's about the 5700 XT and its position. It was decent $/perf and that's about it. It aged pretty well for what it was but, in my opinion, is not as long-lived a card as others.
AMD was focused on the RDNA2 core for its next card releases and the next-gen consoles.
Both Sony and M$ wanted a GPU with all the newest features, like ray tracing and other optimization options such as Variable Rate Shading, available on their upcoming consoles, so there was no point in giving first-gen Navi any more R&D attention.
Also, in 2019 AMD released the Radeon VII, which was their top-end GPU, and it being GCN didn't really matter because its feature set was the same as Navi's.
 
I've always seen the 5700 XT as an overpriced RX 480 replacement, priced high because the market allowed it. It wasn't anything much besides being what was available at the performance level I wanted. Didn't have the same feel as the 290/290X situation.
 
It was decent $/perf and that's about it. It aged pretty well for what it was but, in my opinion, is not as long-lived a card as others.
I don't know. The 5700 XT as a mid-range part has always seemed overpriced to me. It seemed to me AMD wanted to price it like a high-end part. Remember the backlash they got over the announced pricing, and how they had to lower it by $50?
 
I don't know. The 5700 XT as a mid-range part has always seemed overpriced to me. It seemed to me AMD wanted to price it like a high-end part. Remember the backlash they got over the announced pricing, and how they had to lower it by $50?
Overpriced compared to what? It was ~20% cheaper than its competition, which was ~5% faster. The fact that they didn't finalize the price until the competition announced theirs doesn't change the price it was released at. It wasn't the "backlash" they got; it was Nvidia's pricing that determined how much they could charge. If Nvidia had priced higher, AMD wouldn't have dropped prices (they are both trying to maximize profits); they price relative to the market, which they did. As I said, I feel their $/perf (the actual price, not the fantasy of what they claimed prior to release) was decent. Great? I don't think I'd say that, but decent? I would say it was decent. You can disagree, I'm just stating my opinion based on the numbers for that card compared to its competition.
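Worked out as perf per dollar, using the launch MSRPs as I remember them ($399 for the 5700 XT, $499 for the 2070 Super - treat those as assumptions) and the ~5% performance gap mentioned above:

```python
# Relative value math for the 5700 XT vs its competition (illustrative numbers).
xt_price, super_price = 399, 499
xt_perf,  super_perf  = 1.00, 1.05      # relative performance, per the post

xt_value    = xt_perf / xt_price
super_value = super_perf / super_price
print(f"5700 XT is {(1 - xt_price / super_price) * 100:.0f}% cheaper")   # ~20%
print(f"perf/$ advantage: {(xt_value / super_value - 1) * 100:.0f}%")    # ~19%
```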
 
ATI/AMD has had a lot of market-shaping products, but I wouldn't even put the 5700 XT on the list. It's competitive because they cut its price to compete. That's it.

The 9700 Pro, 4870, or the 7970 were significantly more impactful.
Agreed, at least those cards made a huge dent in Nvidia's crown. The 5700 XT is an impressive card in that it easily trounced Vega for far less, but it's not on the same level as the cards mentioned.
 
The launch event images of the 5700 XT Anniversary Edition had "RX 690" on the card. These were designed, even up until they were making early promotional materials, to be spiritual successors to Polaris. I remember laughing quite hard when they announced $379, $449, and $499 launch prices for them and kept switching the feed to all the red-shirted fans applauding like this was a good thing. I only paid $300 for my 5700 and I don't think I would've paid much more than that.
 
+1 vote for the ATI All-in-Wonder series cards! (debuted the Radeon name, I believe) (sorry for the late post)

A transparent "always on top" TV window superimposed over your favorite game FTW!!!

Pre-2K, when TV and Blockbuster were all she wrote in terms of content selection, and don't forget the magic of importing and editing video for the first time (home movies!)... the Swiss Army knife of GPUs. Mine was 32MB on the PCI bus... but they made better ones. Ran CS 1.5 well for how weak it sounds.

It was cheap! Hooked up to a CRT monitor, a 27" TV, and a VCR (or SNES or Genesis). May have had FM radio, can't remember...

I'd like to see feature innovation return, like... removable RAM, wireless monitors, standardized cooling fans, lol, 16x slots on smartphones...

The ground... it's gone sour!!

The 5700 may go down in history as the last "bargain" enthusiast video card, because the 6700 XT at 192-bit for $479 = facepalm!!! Not a legend though, IMO.
I've been surprised before though! (fingers crossed)
 
I paid $399 for the XFX 5700 XT THICC last year to have as a backup card in my old computer. I can't get over the fact that people are paying upwards of $800 for a used 5700 XT.
 
I paid $399 for the XFX 5700 XT THICC last year to have as a backup card in my old computer. I can't get over the fact that people are paying upwards of $800 for a used 5700 XT.

Yes, but that's not "normal" pricing. People are also paying $500 for 1080s, $400 for RX 580s, etc. It's all an ROI calculation for ETH miners. The 5700 XT is a fantastic mining card. I remember back in the day I sold my 5870 to a BTC miner for more than I paid for it and thought I got a deal. Turns out he was fleecing me... lol. I should have been mining back then.
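The kind of ROI math the miners are doing, roughly. Every number here is an assumption (hashrate, power draw, payout rate, electricity cost), so plug in your own:

```python
# Rough miner payback estimate for a used 5700 XT; all inputs are assumptions.
card_price_usd  = 800      # what people are paying for a used 5700 XT
hashrate_mhs    = 50       # rough Ethash figure for a tuned 5700 XT
power_w         = 120
usd_per_mhs_day = 0.07     # assumed daily payout per MH/s at some ETH price
usd_per_kwh     = 0.12

daily_revenue = hashrate_mhs * usd_per_mhs_day
daily_power   = power_w / 1000 * 24 * usd_per_kwh
daily_profit  = daily_revenue - daily_power
print(f"~${daily_profit:.2f}/day, payback in ~{card_price_usd / daily_profit:.0f} days")
```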
 

Except that the Radeon AIW series was a pain to work with due to their shit drivers. I can't tell you how many systems I fought with over the years that had them. I don't think I ever installed one and saw it work like it was supposed to; always one facet of it failed to work properly, more often than not the TV tuner part.

They were nifty, but they were a niche item for sure. And let's be honest, the video card part was a Radeon, which was often a nightmare where actual gaming was concerned. Once again due to shit drivers.

I fought with these things a lot. I think of them as a great idea at the time, but their execution left a lot to be desired.
 
There was a version of the AIW Radeon X800XL that I remember was pretty legit. It was basically an X800XT w/ AIW features. Not sure why some of them were made that way.

My list:

3dfx Voodoo1/2 (just for the fact that it was so mind-blowing at the time)
Nvidia 8800GTX/GT (still running a variety of them over my network: arcades, spare boxes, etc.)
ATi 9700/9800 (these also seemed to last a long time for me)
 
I got to build one rig with an All-In-Wonder. For my sister in law when she started college. I forget the specs of the rest of the rig, but she got an All-In-Wonder 9700 Pro, Klipsch 5.1 Pro Media speaker set, and a 19" LCD monitor. F'n sweet rig back then. They ended up spilling drinks on it less than a year in, and ATI honored the warranty and let me RMA the card.

I think I had a 9500 modded to a 9500 Pro at the time (does that sound right?), with a separate PCI TV tuner card instead of an AIW.
 

The X1900 AIW had zero issues for me back in the day, and I got it for a ridiculous $200 through some weird clearance thing when the X1900 was still considered very competitive. Maybe the earlier AIW GPUs had issues, but that X1900 was a monster for me on water.
 
"Famous" cards as I remember them.

Geforce 4200ti - priced as a low tier product, mid range performance
Radeon 9800 Pro - Great performance for a long time
Geforce 6600gt - almost identical performance to the more expensive 6800 vanilla
Geforce 8800gt - king of value/midrange for several years
Geforce 560ti - Great value again, but very power hungry for a mid range card
Radeon 7950/7970 - These things are still relevant today, despite their age now
Geforce 1080ti - I own one, and several others I know do to, despite being several years old, It still holds up well on VR and modern games

"Infamous" cards
Geforce FX (5th gen) series (Especially the fx 5200) - Horrible performance, DX9 issues, the Radeon 9xxx series spanked it.
Intel GMA900 - Was around on nearly every laptop and netbook for years and years. Claimed to support DX9, but always required special drivers/patching to be DX9 functional. Obsolete the day it came out.
 
"Famous" cards as I remember them.

Geforce 4200ti - priced as a low tier product, mid range performance
Radeon 9800 Pro - Great performance for a long time
Geforce 6600gt - almost identical performance to the more expensive 6800 vanilla
Geforce 8800gt - king of value/midrange for several years
Geforce 560ti - Great value again, but very power hungry for a mid range card
Radeon 7950/7970 - These things are still relevant today, despite their age now
Geforce 1080ti - I own one, and several others I know do to, despite being several years old, It still holds up well on VR and modern games

"Infamous" cards
Geforce FX (5th gen) series (Especially the fx 5200) - Horrible performance, DX9 issues, the Radeon 9xxx series spanked it.
Intel GMA900 - Was around on nearly every laptop and netbook for years and years. Claimed to support DX9, but always required special drivers/patching to be DX9 functional. Obsolete the day it came out.
I remember the GMA900 having issues, but the derivative GMA950 had a very brief moment of relevance: when it came out it was the first Intel iGPU that was actually capable of gaming, sort of. It ran UT2K4, Red Orchestra, STALKER, and Halo OK on my systems, but it very quickly became obsolete, and now that crappy iGPU perf is barely enough for web browsing lol

The GeForce 6600 GT was a legend, love that one. The only issue was the AGP versions using the PCIe-to-AGP bridge, which ran hot as hell and was usually undercooled. I resorted to hacking up an old Slot 1 heatsink to cool mine; there was a thread on 2CPU about it but all that's gone now (RIP).

F**k the GeForce FX, that's about all I have to say about that. Epic hype followed by epic fail, and they caught teenage me hook, line, and sinker with it, causing me to waste my summer money on a disappointing card.
 
No, because my 5700 XT did the random blank/black screen thing all the time. The only way to fix it for the longest time was to unplug and replug the DisplayPort cable. It took almost a year for that thing to finally get drivers that actually fixed the problem.

My favorite all-star GPUs have been the 8800 GT, the 290X, and the 1080 Ti.
 
I made a totally rad wall plaque for my brother for Xmas 2016.

An 8800 GTS (dead, $10, dude said he had reflowed it several times)
on a cedar wood plank; there were some convenient holes in the PCB for small brass screws (classy).

Hand-etched with my Dremel:
"all in all you're just another brick on the wall"

He said his GF didn't like it and tossed it when he moved! Man, I am still mad he didn't at least return it... took all night to make!

Need to make another!
Need to make it so the fan spins up when someone steps on the welcome mat lol...
 
ATI 9700 Pro is king.
This was the card that changed the way I looked at graphics cards forever. With it anything was possible, and it almost felt like I was committing a crime turning on 8xAA and not seeing the framerate tank.

The 5700 XT will be remembered as a card that ALMOST brought value back to the midrange gaming segment. If it had been a $300 card, AMD might have had something game-changing; otherwise, it was just another product for Nvidia to outsell 40:1 in every segment.

I found it to outperform my expectations across the board and trade blows all the way up to the 2080 in actual in-game experience. The price was attractive, yet still not quite enough to lure away Nvidia fans, which is what AMD needs to do in order to re-establish themselves as a competitor and not just fight for scraps. Unfortunately, with the number of other silicon commitments they have, I doubt they will ever produce enough GPUs to really reclaim their former glory.
 
I doubt they will ever produce enough GPUs to really reclaim their former glory.
And that was ATI's glory. AMD hasn't cared to really throw resources into graphics like they have on the CPU side. That seems to be changing; we'll see how it goes.

And I don't think Nvidia really sees Radeon as a competing product anymore. That may change if RTG can put out an effective DLSS equivalent and really beef up their RT in the next generation. Until that happens, Nvidia still has a leg up on feature offerings over any AMD options right now... in my opinion. AMD is also shooting themselves in the foot a bit on pricing. Yes, I know MSRPs are fantasy land right now, but supposing they weren't, AMD doesn't have feature parity with Nvidia's offerings. The RX 6000 cards are way overpriced in my opinion. If all you want is rasterization performance, then they are great options, but they still lack features that many care about (not just RT/DLSS), and to me that makes them a badly priced product relative to the competition. Again, if all you care about is raster and you don't use a lot of other features, then it's fine, I guess.
 
One point AMD has going for it is the hardware scheduler, which lets them get around 20% more out of your CPU than Nvidia in DX12.

Not a high-end consideration, but with mid- to low-tier CPUs this should be a deal breaker, if it were more widely known.
 