ATI Technologies: Gone But Not Forgotten

erek

R300 and RV770 were some of the best

“However, with a significant amount of GPU development work now carried out by AMD itself in various locations worldwide, it's no longer accurate to label a graphics chip (whether in a Radeon card or the latest console processor) as an ATI product.


The ATI brand has now been absent for over a decade, but its legacy and the fond memories of its old products endure. The ATI Technologies branch continues to contribute to the graphics field nearly 40 years after its inception, and you could say it's still kicking out the goods. Gone but not forgotten? No, not gone and not forgotten.”


Source: https://www.techspot.com/article/2689-ati-technologies-history/
 
I've always felt that there's some potential here... as GPUs become less and less focused on gaming and try harder and harder to become generalized computational devices, to release a separate product line that focuses only on gaming. For AMD, they could release cards under the ATI brand. For Nvidia, they could release products under the 3dfx brand. Strip out the BS and release cards that are 100% focused on gaming. No mining, no AI, etc. Maybe even bring back Crossfire/SLI. I know that's just a pipe dream, but oh well.
 
It's interesting to consider their history. Small company in Ontario, Canada. Started putting out video cards long before we had 3D accelerators. Branched out into that as well, but man, they were completely half-assed compared to the competition. What kept them afloat was selling to system builders. They made some deals and sold their cards in large numbers for far less.

And then came the original Radeon, where they started to get serious... it wasn't anything particularly wonderful, but still not bad. It also utilized some new rendering methods to increase performance... and then not that long after came the R300 we all know and love.
 
It's da RAGE, it tellz ya, da RAGE :D

as in Rage128, I can still recall the stench mine made when I overclocked it by 25%....ah those were the dayz !
 
It's da RAGE, it tellz ya, da RAGE :D

as in Rage128, I can still recall the stench mine made when I overclocked it by 25%....ah those were the dayz !
I wanted a Rage 128 back then... Ended up with a Voodoo 3. Guess I can't complain.
 
Makes you wonder what kind of GPUs they'd be putting out if they were still around and not owned by AMD today - better? Worse? Would they have gone out of business?
 
I've always felt that there's some potential here... as GPUs become less and less focused on gaming and try harder and harder to become generalized computational devices, to release a separate product line that focuses only on gaming. For AMD, they could release cards under the ATI brand. For Nvidia, they could release products under the 3dfx brand. Strip out the BS and release cards that are 100% focused on gaming. No mining, no AI, etc. Maybe even bring back Crossfire/SLI. I know that's just a pipe dream, but oh well.
Unless you want to go back to what graphics looked like 20 years ago, it's not really possible. Gaming has evolved from a chain of fixed-function operations in DX9 and prior to generalized compute, with an emphasis on processing data streamed sequentially from memory.

Other than FP64, which was nerfed into the ground on consumer cards years ago, there's really nothing in the GPU itself that isn't actually needed for gaming purposes.
 
I had more BSODs caused by whatever Nvidia .dll problems... than I ever had problems with ATI drivers...
Same here, the ONLY time any Rage cards of mine caused a BSOD was when I was pushing them to absolutely insane overclocks....

OTOH, nGreediya cards of that era, well, let's just leave that for a separate thread :)
 
When I was new I used ATi and loved them! The fact that they were Canadian made me feel good too. But then they were bought by AMD, and I tried Nvidia... never looked back until recently. I would still probably buy an Nvidia GPU.
 
VGA Wonder and Mach64 baby!
Mach64 was a nice chip. Also, ATI had really functional OS/2 drivers for the Mach64, which almost no one else offered at the time.
Got me by until me and everyone else gave up on OS/2. Sad / not sad.
 
I've always felt that there's some potential here... as GPUs become less and less focused on gaming and try harder and harder to become generalized computational devices, to release a separate product line that focuses only on gaming. For AMD, they could release cards under the ATI brand. For Nvidia, they could release products under the 3dfx brand. Strip out the BS and release cards that are 100% focused on gaming. No mining, no AI, etc. Maybe even bring back Crossfire/SLI. I know that's just a pipe dream, but oh well.
Hell yeah, it'd drum up nostalgia having ATi GPUs again amongst us old-timers. Have the AI/Instinct line still be AMD. Multi-GPU, I think, was killed because MS didn't want PC ports with SLI/Crossfire support to upstage their baby, the Xbox. At least that's my thought. Look at what MS has been doing, gobbling up major publishers and studios to pad out their first-party lineup to squeeze out Sony, and by extension Nintendo. Because when you leave extreme high-end features on the table for devs (*cough* SLI *cough, cough*), they usually go untouched. Lowest common denominator, baby. That's the law of the land. No point in making a game that 90% of people can't at least run at low settings, pubs say...
 
Multi-GPU, I think, was killed because MS didn't want PC ports with SLI/Crossfire support to upstage their baby, the Xbox.
SLI and CrossFire were killed because the drivers no longer handled them in DX12, leaving game devs to support multi-GPU themselves rather than AMD and Nvidia. Game devs didn't see enough benefit for the cost/effort involved to make it work in their titles, so here we are without it. Microsoft had nothing to do with killing it.
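To make that concrete, here's a rough, minimal sketch of what "game devs support multi-GPU themselves" means under DX12's explicit multi-adapter model (illustrative C++ against the standard DXGI/D3D12 headers only, not a working renderer): the application enumerates every adapter and creates its own device on each one, and from there the game, not the driver, has to split the frame, schedule the work, and shuffle resources between the GPUs.

#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

int main() {
    // Under DX12 the app, not the driver, discovers the GPUs and decides
    // how (or whether) to use more than one. Error checks omitted for brevity.
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // One ID3D12Device per physical GPU. Nothing like the old driver-side
        // AFR happens automatically any more: the engine has to build command
        // lists for each device and copy shared resources across adapters
        // itself (e.g. via cross-adapter heaps).
        ComPtr<ID3D12Device> device;
        D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                          IID_PPV_ARGS(&device));
    }
}

All of that per-adapter bookkeeping is exactly the cost/effort studios decided wasn't worth it.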
 
SLI and CrossFire were killed because the drivers no longer handled them in DX12, leaving game devs to support multi-GPU themselves rather than AMD and Nvidia. Game devs didn't see enough benefit for the cost/effort involved to make it work in their titles, so here we are without it. Microsoft had nothing to do with killing it.
No attitude here or snark to preface; I am aware that it is now up to developers to implement explicit multi-adapter (or whatever the other option for mixing different GPUs was called?) in DirectX 12 and DX12 Ultimate. My line of thought is that Microsoft had everything to do with it because the PC was outstripping even the Xbox One X by far. Who wrote the spec? Microsoft did. #tinfoilhat
 