Anyone seriously considering the 4070 Ti?

Would you choose the 4070 Ti over the 7900 XT? If so, why?
Yes, choose the 4070 Ti. It has better upscaling in far more games (DLSS); lower power draw; frame generation, which is taking off in some demanding games - a big deal, since it can roughly double the frame rate; usable ray-tracing speeds; enough VRAM; better video encoding (NVENC - higher quality at smaller file sizes); and arguably better drivers for most people. Plus, Nvidia cards keep better resale value if you want to upgrade later.
 
Yes, choose the 4070 Ti. It has better upscaling in far more games (DLSS); lower power draw; frame generation, which is taking off in some demanding games - a big deal, since it can roughly double the frame rate; usable ray-tracing speeds; enough VRAM; better video encoding (NVENC - higher quality at smaller file sizes); and arguably better drivers for most people. Plus, Nvidia cards keep better resale value if you want to upgrade later.
I hear you - and the other reason I'm interested in the 4070 Ti is its superior compute and video-editing performance. That said, here are my impressions; I'm not an expert and probably haven't researched it as much as you guys have:

4070 Ti
- Pros: better video encoding; lower power consumption; DLSS (I'm not sure what to compare that to - FSR? I concede I'm not familiar with it); arguably better/more stable drivers.
- Cons: low VRAM - this really needed to be 16GB at minimum; some argue it's a crippled/"cut-down" card - a 192-bit bus? C'mon Nvidia, why?!? (Nvidia claims it doesn't need to be wider.) The price is pretty hard for this tier of card, although it has fallen somewhat in my country. RT hurts performance, so I'm not sure it's really usable - hence asking you. :) Lots of ppl say you need a 4080 or (ideally) a 4090 to really use ray tracing - for it to be worth buying a card for RT.

7900 XT
- Pros: lots of VRAM - 20GB is pretty decent/acceptable; good rasterization performance - supposedly 'beats' the 4070 Ti/4070 at 4K; not horrible at video editing, although that is debatable; FSR; with its 20GB of VRAM, this card might 'age well' (it's argued).
- Cons: high power consumption - including an idle-power problem, particularly with more than one monitor and especially if the two have different refresh rates (has this issue been solved yet?); pretty mediocre/disappointing compute performance; it's debatable whether its still-maturing ray tracing will compete with Nvidia's (e.g. HIP-RT vs OptiX).

Those are pretty much the oft-repeated pros/cons and features of the cards, but they don't help me much in determining which one to get, if I can get one. :) The scenario for me: I'm selling my card (for $700 CAD), and I'd need about $500 on top to get either of these cards - taxes put them both around $1200 - so they'll be close enough in price to make it an 'either card' buy. Depending on make/model, there are maybe $20-$50 differences, so negligible. Also, I've seen sales from time to time, so various makes/models will be 'good deals' on either the Nvidia or AMD side. I suspect both have been reduced in price a smidgen so that they sit at similar price points. Thoughts?
 
I just don’t think the 12GB cards for $600-plus are worth it… You will be shopping for another one in two years. Now, if you like to keep current, then it’s not a big deal really. Buying what you need for right now usually works out. The consensus is that if you use software that heavily favors Nvidia, then just look there. Forget AMD. If I had to have team green for professional use, it’s the 4080 or 4090 for the VRAM, IMO. Some projects just use more than 12GB.

What I did:
I picked up a reference 7900 XT open box for under $700 all in and it has been working great. Idle power consumption is definitely higher than other cards. Ray tracing in Cyberpunk at 1440p max with FSR looks good and is super smooth. I don’t have too many other games with RT, and my opinion on the tech is that yes, it’s awesome, and no, we aren’t there yet for the masses - meaning it’s costing a lot of fps and cash to really get into it. I don’t have any software use cases for CUDA or other Nvidia-specific features. AMD or Nvidia cards that game well at 1440p with lots of eye candy fit my use case, so I have no blind loyalty either way.
 
And then people fellate Nvidia for this. Like, they feel proud of Nvidia's ability to just nail the consumer base to the wall.

Must be shareholders, because as a middle-class worker, it's revolting.
The days of $400 video cards are gone. Thank you, nV, for that.

Ah well. It was fun while it lasted. I want to go back to 1993 and live life in a loop: when it gets to 8/2001, reset it back to 1993. Ah, the good days. I loved PC gaming in the '90s. I think I'm gonna fire up Unreal Tournament right now just to have a blast!

If only there were a competitor to Nvidia that could save us from such evil...
 
I just don’t think the 12GB cards for $600-plus are worth it… You will be shopping for another one in two years. Now, if you like to keep current, then it’s not a big deal really. Buying what you need for right now usually works out. The consensus is that if you use software that heavily favors Nvidia, then just look there. Forget AMD. If I had to have team green for professional use, it’s the 4080 or 4090 for the VRAM, IMO. Some projects just use more than 12GB.
Exactly - that's the main problem I have with the 4070 Ti. In 2023, only having 12GB of VRAM for $800 seems crazy to me and severely limits the card's potential. For me it might not be a big deal because I swap hardware fairly often, but most people hang onto hardware for years. I currently run a 4090, so I am very familiar with DLSS, frame generation and other Nvidia benefits, but I have also owned a 7900 XTX, and my son currently has a 7900 XT, so I am very familiar with AMD's offerings as well. Based on those experiences, at the same price, I would get the 7900 XT over the 4070 Ti almost every time (except in maybe a very specific use case).
 
I bought a 4070 Ti for my nephew. He is so happy with it. He is upgrading to a 13700K, a Z790 motherboard and DDR5 RAM because he has it.

Be kind, the universe is watching.

I bought a 4090 and am not buying anything else for a long time... I mean, really. Shit gets expensive quick.
 
I mean, I get it. My buddy does a total new build every 5 years or so and always buys a flagship GPU. He has yet to ever upgrade his GPU before building a whole new rig. Naturally, he went 4090 this time. I tinker too much and swap parts more often, so I go cheaper. What’s funny is, with pricing being what it is today, that approach may mean 2 or 3 $600 GPUs for me before he retires his 4090.
 
Exactly - that's the main problem I have with the 4070 Ti. In 2023, only having 12GB of VRAM for $800 seems crazy to me and severely limits the card's potential. For me it might not be a big deal because I swap hardware fairly often, but most people hang onto hardware for years. I currently run a 4090, so I am very familiar with DLSS, frame generation and other Nvidia benefits, but I have also owned a 7900 XTX, and my son currently has a 7900 XT, so I am very familiar with AMD's offerings as well. Based on those experiences, at the same price, I would get the 7900 XT over the 4070 Ti almost every time (except in maybe a very specific use case).
The new AMD cards are so power hungry, though - and they always seem to have some driver or hardware problem relating to power consumption - look at the idle power with multiple monitors, especially if the monitors have differing refresh rates. Nvidia is evil, but it's hard to argue that their cards aren't technically superior. The problem is Nvidia being evil and greedy - they CHOOSE to omit larger VRAM quantities, and they don't care about their shoddy software 'centers' compared to AMD's. But their architecture is more efficient. Look at this:
https://www.tomshardware.com/news/amd-rx-7900-xtx-matches-rtx-4090-at-700w
https://www.pcgamer.com/nvidia-geforce-rtx-4090-founders-edition-review-performance-benchmarks/

https://www.tomshardware.com/reviews/amd-radeon-rx-7900-xtx-and-xt-review-shooting-for-the-top/8

How do you justify picking an AMD card if you also want the card for video editing/compute performance? AMD just isn't there - but it's good at almost everything else, which is... mostly gaming?
But to get a decent card with a sufficient amount of VRAM, you are looking at the 4090 on the Nvidia side, and it's insanely expensive outside the USA - in Canada, Australia and Europe, it's like a billion dollars! It's considerably more expensive even after accounting for currency conversion. The 7900 XT cards are expensive too - and they mostly get you AV1 encoding/decoding and improved gaming performance; Blender performance is still meh, and video editing performance is pretty good, but that's it. There are also accusations/allegations of driver and software crashes. The 4090, at its current price, is unattainable for a lot of ppl despite being a good card, architecture-wise.
 
All I can say is I am happy with the 4070 card; mind you, I am moving from a 2070 Super.
The difference in shading and the look of RTX games is very good.
I am just talking about Quake II RTX and Portal RTX, but wow, very impressed.
The quality of RTX is a huge jump from the 20-series cards.
 
Value proposition aside, the new cards - even the "lesser" ones - are not really horrible. If you must buy the latest and greatest, and Nvidia, and you can justify the cost, the cards should do well.

Money is not a problem for many, especially enthusiasts. Of course, in that case, I say buy the 4090 (or a future 4090 Ti?)... I mean, why not?
 
Can I bump this? :) Money is a problem for me - I have enough for a used 3090, but I sometimes see 4070 Ti cards for sale for only about $100-$200 more. I hate that they're only 12GB, and that's probably my main complaint with them. If I want a card for productivity and a bit of gaming, 12GB seems kinda low? Would getting one be a mistake? The Ada cards are way better for power efficiency, and they'd have a longer warranty too, being newer - plus less chance of it being a mining card, and so on. AV1 encoding/decoding - that could be useful.
I want it for Blender, DaVinci Resolve and some gaming, but the 12GB of VRAM worries me. The 4080 has 16GB, but it's so much more $$ - $700 more (used) - and the 4090, forget it. :-(
The 7900 XT is actually more expensive than the 4070 Ti, but a used one could be doable. A 3090, 4070 Ti or 7900 XT I could afford in a little while. The other two cards have 20GB and 24GB of VRAM, respectively, and would thus be more versatile. I dunno if I need the VRAM, but if one does, having too little leaves you screwed, for the most part. If you have ample VRAM or extra, well... obviously, it's better to have more. Damn Nvidia for the low VRAM!!! Bastards! That card should have at least 16GB, no?!?
 
Can I bump this? :) Money is a problem for me - I have enough for a used 3090, but I sometimes see 4070 Ti cards for sale for only about $100-$200 more. I hate that they're only 12GB, and that's probably my main complaint with them. If I want a card for productivity and a bit of gaming, 12GB seems kinda low? Would getting one be a mistake? The Ada cards are way better for power efficiency, and they'd have a longer warranty too, being newer - plus less chance of it being a mining card, and so on. AV1 encoding/decoding - that could be useful.
I want it for Blender, DaVinci Resolve and some gaming, but the 12GB of VRAM worries me. The 4080 has 16GB, but it's so much more $$ - $700 more (used) - and the 4090, forget it. :-(
The 7900 XT is actually more expensive than the 4070 Ti, but a used one could be doable. A 3090, 4070 Ti or 7900 XT I could afford in a little while. The other two cards have 20GB and 24GB of VRAM, respectively, and would thus be more versatile. I dunno if I need the VRAM, but if one does, having too little leaves you screwed, for the most part. If you have ample VRAM or extra, well... obviously, it's better to have more. Damn Nvidia for the low VRAM!!! Bastards! That card should have at least 16GB, no?!?
Get a 3090, yes the 4070 ti is a BS card.
 
Can I bump this? :) Money is a problem for me - I have enough for a used 3090, but I sometimes see 4070 Ti cards for sale for only about $100-$200 more. I hate that they're only 12GB, and that's probably my main complaint with them. If I want a card for productivity and a bit of gaming, 12GB seems kinda low? Would getting one be a mistake? The Ada cards are way better for power efficiency, and they'd have a longer warranty too, being newer - plus less chance of it being a mining card, and so on. AV1 encoding/decoding - that could be useful.
I want it for Blender, DaVinci Resolve and some gaming, but the 12GB of VRAM worries me. The 4080 has 16GB, but it's so much more $$ - $700 more (used) - and the 4090, forget it. :-(
The 7900 XT is actually more expensive than the 4070 Ti, but a used one could be doable. A 3090, 4070 Ti or 7900 XT I could afford in a little while. The other two cards have 20GB and 24GB of VRAM, respectively, and would thus be more versatile. I dunno if I need the VRAM, but if one does, having too little leaves you screwed, for the most part. If you have ample VRAM or extra, well... obviously, it's better to have more. Damn Nvidia for the low VRAM!!! Bastards! That card should have at least 16GB, no?!?
You can get a 4080 for $1100 new. If you can get a used 4070 Ti for $700 less, that would make it $500, which is a hell of a deal that you should take.
 
I know this was not the most awesome launch ever, but I have started to think about what to replace my stepson's aging 2060 Super with. The 4070 Ti might be a little pricey, but have people warmed up to the sub-4090 4000 series, or are they still generally considered shit?

And if so, what are people upgrading to?
 
I know this was not the most awesome launch ever, but I have started to think about what to replace my stepson's aging 2060 Super with. The 4070 Ti might be a little pricey, but have people warmed up to the sub-4090 4000 series, or are they still generally considered shit?

And if so, what are people upgrading to?
I think the only problem with the non-4090 40 series is the overpricing. If the 4080 debuted at $999 and the 4070Ti at $699, they'd be considered fantastic price/performance deals. I got my 4080 for $1070 (w/tax) brand new, so I'm :D.
 
I know this was not the most awesome launch ever, but I have started to think about what to replace my stepson's aging 2060 Super with. The 4070 Ti might be a little pricey, but have people warmed up to the sub-4090 4000 series, or are they still generally considered shit?

And if so, what are people upgrading to?

I bought a vanilla 4070. It is a little underpowered at 1440p, but I got it for right around $500, which is a nice upgrade over something like a 2060.
 
I bought a vanilla 4070. It is a little underpowered at 1440p, but I got it for right around $500, which is a nice upgrade over something like a 2060.

Yeah, I think the 4070 is probably the lowest I can go, due to needing x16 lanes.

I am considering doing an end-of-life refresh on my stepson's machine.

He has an MSI B350 Tomahawk I originally paired with a Ryzen 5 1600X back in 2017 and later upgraded to a Ryzen 7 3800X. For this end-of-life refresh I'm thinking of dropping in a Ryzen 7 5800X3D, which should work nicely. (I have to say that MSI B350 Tomahawk is a real champ. Great board that has lasted a long time, and it only cost me just north of $100 new. Once this last iteration is done, it will owe me absolutely nothing.)

The issue then is, since it is a B350 board, it will still be on PCIe Gen 3, which should be fine for an x16-capable GPU, but one of the x8 models might be problematic. x8 Gen 4 may be sufficient, but x8 Gen 3 might be an issue. So the 4070 (non-Ti) is probably the lowest model I can go with, unless I go previous gen (maybe a 3080 or a 3080 Ti if I can find them).
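For a rough sense of why an x8 card on a Gen 3 board can pinch, here is the raw link math (a sketch only: it assumes the standard 128b/130b line coding shared by Gen 3 and Gen 4, and real-world throughput is lower still once protocol overhead is added):

```python
# Raw effective PCIe link throughput, accounting for 128b/130b line coding.
def pcie_gbps(gen: int, lanes: int) -> float:
    gt_per_lane = {3: 8.0, 4: 16.0}[gen]          # giga-transfers/s per lane
    return gt_per_lane * (128 / 130) / 8 * lanes  # -> GB/s for the whole link

print(f"Gen 3 x16: {pcie_gbps(3, 16):.1f} GB/s")  # 15.8
print(f"Gen 4 x8:  {pcie_gbps(4, 8):.1f} GB/s")   # 15.8 -- same as Gen 3 x16
print(f"Gen 3 x8:  {pcie_gbps(3, 8):.1f} GB/s")   # 7.9  -- half the link, hence the concern
```

So an x8 card loses nothing on a Gen 4 slot, but on this Gen 3 board it would run at half the bandwidth of an x16 card, which is exactly the scenario above.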

Since it is not my system, and I'm buying them for my stepson, I am leaning towards latest gen, because previous gen is often used, and no one likes to get a previously opened retail package as a birthday gift :p
 
You can get a 4080 for $1100 new. If you can get a used 4070 Ti for $700 less, that would make it $500, which is a hell of a deal that you should take.
Oh, sorry, this is CAD - $800 CAD = $586 USD. Used 4080s range from $1300 to $1500 ($950-$1100 USD) - however, these are almost all Zotac Trinity cards. Dunno if those should be avoided, but that's what I notice. :) Used 3090s are often overpriced, but the lowest price I've found is $700 CAD ($513 USD).
 
I know this was not the most awesome launch ever, but I have started to think about what to replace my stepson's aging 2060 Super with. The 4070 Ti might be a little pricey, but have people warmed up to the sub-4090 4000 series, or are they still generally considered shit?

And if so, what are people upgrading to?
Here are my thoughts on the 4070 and others from another thread:

Eh, it's not bad at all. $100+ cheaper than what the 3080 was, more vram, slightly faster, runs much cooler with lower wattage, and has frame generation. What's not to like?

All of the 4xxx series is pretty good, really. Especially the 4090, which is $100 more than the 3090 but 75%+ faster in raster and even more in ray tracing, has frame gen, uses around the same power but comes with a huge honking cooler, etc.

The 4080 is pretty much a linear step down from that in price/perf, and the 4070 Ti is a good value for its price.
 
Interesting. I hadn't kept up. I didn't realize that frame generation wasn't implemented on previous-generation RTX cards.
Nope. Their optical flow hardware is too slow for it, per Nvidia at least. The previous gen does support everything else, though, including the new ray-tracing stuff in DLSS 3.5. :)
 
He's working in areas where he needs high amounts of VRAM; not everything is about gaming.
I understand the recommendations for the 4070 Ti - lower power consumption, more features, new/latest gen, decent performance compared to the 3090 - better in some areas, beaten in others, but pretty close overall. But, yes, the VRAM is a factor in content creation, compute/GPGPU and video editing - at least potentially - and also for ML/AI, whether you do it now or just in case...
Unfortunately, Nvidia has crippled most of their current-gen cards - whether it's a narrower bus, reduced VRAM, etc. They did well on efficiency, but the gimping of their cards is still shady BS - done so that you look at the 4090 if you 'want everything.' Pretty shady and greedy of them.
I am thinking of just getting a 3090, calling it a day, and maybe upgrading later when the 4090 hits the used market in bigger numbers - when the 50 series comes out or something. Although the 4070 Ti and 7900 XT are still intriguing for different reasons, a second-hand 3090 is also cheaper than the other cards. However, I didn't think I'd find a used 4070 Ti at the price I did - it's not much more than a 3090.

The 4070 Ti, though, is also 'beaten' by the 3090 in some 4K gaming - interesting - although they're pretty close most of the time.
 
I understand the recommendations for the 4070 Ti - lower power consumption, more features, new/latest gen, decent performance compared to the 3090 - better in some areas, beaten in others, but pretty close overall. But, yes, the VRAM is a factor in content creation, compute/GPGPU and video editing - at least potentially - and also for ML/AI, whether you do it now or just in case...
Unfortunately, Nvidia has crippled most of their current-gen cards - whether it's a narrower bus, reduced VRAM, etc. They did well on efficiency, but the gimping of their cards is still shady BS - done so that you look at the 4090 if you 'want everything.' Pretty shady and greedy of them.
I am thinking of just getting a 3090, calling it a day, and maybe upgrading later when the 4090 hits the used market in bigger numbers - when the 50 series comes out or something. Although the 4070 Ti and 7900 XT are still intriguing for different reasons, a second-hand 3090 is also cheaper than the other cards. However, I didn't think I'd find a used 4070 Ti at the price I did - it's not much more than a 3090.

The 4070 Ti, though, is also 'beaten' by the 3090 in some 4K gaming - interesting - although they're pretty close most of the time.
It’s not so much gimped as that you’re comparing an old flagship card to a new tier-3 or tier-4 card.
 
The only issue with the 4070 Ti is that it's 800 bucks with 12GB of VRAM. Performance, power needs, features -- all great. Price is dogshit. End of story.
 
I like my 4070 Ti. Big deal if it only has 12GB. Don't like it? Buy something else, right? I play at 4K/60 with mine just fine. It gets over 20K in CB 2024. Yes, it is expensive... but so is everything else.

I got the Zotac Trinity OC; it was hundreds cheaper than a 7900. Good deal. Heck, it was cheaper than my 3070 Ti, lol.
 
The only issue with the 4070 Ti is that it's 800 bucks with 12GB of VRAM. Performance, power needs, features -- all great. Price is dogshit. End of story.
Yeah, what makes it interesting is the second-hand market. I’ve seen some good deals like OP has, and at around $600 or so it’s a no-brainer.
 
Yeah, what makes it interesting is the second-hand market. I’ve seen some good deals like OP has, and at around $600 or so it’s a no-brainer.
The used prices for both the 3090 and 4070 Ti aren't awful. They might not be 'steals,' but they're not bad. That's why I can't decide between them: an older power hog with lots of VRAM - 24GB is a big difference compared to 12GB - versus the 4070 Ti's better power efficiency, newer gen and more features (DLSS 3, AV1) - even if one doesn't use them, they're 'nice to have.' I have an 850W PSU, and although both cards can use it, I'd want to undervolt the 3090, whereas the 4070 Ti I could just leave stock.
I think the 3090 (not gimped, 24GB) has some advantages in some 4K gaming, while the 4070 Ti is usually as good (or better), but not always - not in every game.
For productivity, the 4070 Ti should be good enough for most things, but it might lack the memory for certain things, too. This should be a 16GB card! LOL! I'm tempted to go with the VRAM - but, although the 3090 is cheaper, it's only $100 cheaper (comparing two second-hand cards). However, the 4070 Ti price I found is only one card, although used ones might come down in price in time.
 
The used prices for both the 3090 and 4070 Ti aren't awful. They might not be 'steals,' but they're not bad. That's why I can't decide between them: an older power hog with lots of VRAM - 24GB is a big difference compared to 12GB - versus the 4070 Ti's better power efficiency, newer gen and more features (DLSS 3, AV1) - even if one doesn't use them, they're 'nice to have.' I have an 850W PSU, and although both cards can use it, I'd want to undervolt the 3090, whereas the 4070 Ti I could just leave stock.
I think the 3090 (not gimped, 24GB) has some advantages in some 4K gaming, while the 4070 Ti is usually as good (or better), but not always - not in every game.
For productivity, the 4070 Ti should be good enough for most things, but it might lack the memory for certain things, too. This should be a 16GB card! LOL! I'm tempted to go with the VRAM - but, although the 3090 is cheaper, it's only $100 cheaper (comparing two second-hand cards). However, the 4070 Ti price I found is only one card, although used ones might come down in price in time.
AMD has some compelling options with more VRAM up and down the price stack. Worth a shot if you don't need CUDA.
 
AMD has some compelling options with more VRAM up and down the price stack. Worth a shot if you don't need CUDA.
IMHO, only the 7900 series - aka RDNA 3 - and those are rare to find used/second-hand; new, they are about $1100 CAD, so quite a bit more than a used 4070 Ti, although they're often the same price. The AMD card is a lot better in Blender, however - well, if you use HIP-RT - although more benchmarks/testing are needed. They're all good at video editing, although in DaVinci Resolve the AMD cards don't offer H.265 encoding.
 
IMHO, only the 7900 series - aka RDNA 3 - and those are rare to find used/second-hand; new, they are about $1100 CAD, so quite a bit more than a used 4070 Ti, although they're often the same price. The AMD card is a lot better in Blender, however - well, if you use HIP-RT - although more benchmarks/testing are needed. They're all good at video editing, although in DaVinci Resolve the AMD cards don't offer H.265 encoding.
Wow, they came down quite a bit since I last looked. The 7900 XT was like $1400 with tax from Amazon; I got my 4070 Ti for $1026 with tax. I am in Manitoba.
 
Wow, they came down quite a bit since I last looked. The 7900 XT was like $1400 with tax from Amazon; I got my 4070 Ti for $1026 with tax. I am in Manitoba.
Well, I was stating used prices - so, no tax. :) The 7900 XT pre-tax is $1100 (CC) and $1200 (for the same cards on Amazon). At CC, the cheapest 4070 Ti is about $1K (open box); brand new it's ~$1040 + tax on Amazon. I guess the AMD card prices came down.
 
I know this was not the most awesome launch ever, but I have started to think about what to replace my stepson's aging 2060 Super with. The 4070 Ti might be a little pricey, but have people warmed up to the sub-4090 4000 series, or are they still generally considered shit?

And if so, what are people upgrading to?
Overpriced is what the rest of the 40 series is, as well as misbranded. Going by prior generational precedent for performance expectations per tier (i.e. the "70 card"), the 4070 Ti should be the "4070," while the 4070 should be the "4060 Ti." Take the 4070 Ti, for example: it meets or beats last gen's flagship, similar to how the 3070, 2070, 1070, etc. did in their respective generations.

The 4070 barely matching a 3080 in some things is not that, so at best it's a 4060 Ti branded incorrectly and, as such, priced wrong.
 
Overpriced is what the rest of the 40 series is, as well as misbranded. Going by prior generational precedent for performance expectations per tier (i.e. the "70 card"), the 4070 Ti should be the "4070," while the 4070 should be the "4060 Ti." Take the 4070 Ti, for example: it meets or beats last gen's flagship, similar to how the 3070, 2070, 1070, etc. did in their respective generations.

The 4070 barely matching a 3080 in some things is not that, so at best it's a 4060 Ti branded incorrectly and, as such, priced wrong.
While all this may be true, the 4000 series is still far from "shit." But yeah, good point in the sense that NVIDIA has essentially ratcheted up the milking of the consumer once again.

Depending on budget - lots of options from Intel, AMD, and NVIDIA. If I were to be buying today for my kid I'd lean hard on the RTX 4070 and see if I can snag a used or open box RTX 4070 Ti for a similar price or a bit more (I live near Micro Center). But hey, I bought my kid an RTX 4090 so I'm no Einstein.

I'd stay away from used 3xxx series / 6xxx series and stick with used/open box "current gen" cards (due to mining risk).

AMD has some awesome stuff if you are willing to go that route. 6950 XT, 6800 XT / 7800 XT...

I just got an Intel A380 for $100 and I'm super happy with that one. Plex monster.
 
While all this may be true, the 4000 series is still far from "shit." But yeah, good point in the sense that NVIDIA has essentially ratcheted up the milking of the consumer once again.

Depending on budget - lots of options from Intel, AMD, and NVIDIA. If I were to be buying today for my kid I'd lean hard on the RTX 4070 and see if I can snag a used or open box RTX 4070 Ti for a similar price or a bit more (I live near Micro Center). But hey, I bought my kid an RTX 4090 so I'm no Einstein.

I'd stay away from used 3xxx series and stick with used/open box "current gen" cards.

AMD has some awesome stuff if you are willing to go that route. 6950 XT, 6800 XT / 7800 XT...

I just got an Intel A380 for $100 and I'm super happy with that one. Plex monster.
Why would you stay away from the used 3xxx series but be willing to go with the 6000 series, used or new?
The problem with the 4xxx series is the low amount of VRAM until you go up to the 4080, and even then it's just barely adequate. As many have said before, Nvidia is trying to force people up to their flagship 4090 if they want 'everything, with enough VRAM.'
 
Why would you stay away from the used 3xxx series but be willing to go with the 6000 series, used or new?
The problem with the 4xxx series is the low amount of VRAM until you go up to the 4080, and even then it's just barely adequate. As many have said before, Nvidia is trying to force people up to their flagship 4090 if they want 'everything, with enough VRAM.'
Good point - edited my post. Used 3xxx and used 6xxx have the mining risk. Coming from a miner, I know that risk is low; I'd just rather not deal with it.

So yeah, get a 7800 XT, 7900 XT, 7900 XTX on team red or an RTX 4070, RTX 4070 Ti in that range.
 
Good point - edited my post. Used 3xxx and used 6xxx have the mining risk. Coming from a miner, I know that risk is low; I'd just rather not deal with it.

So yeah, get a 7800 XT, 7900 XT, 7900 XTX on team red or an RTX 4070, RTX 4070 Ti in that range.
Well, I was worried about getting a mining card - I bought a 3080 a while ago and it was fine; in fact, it's the best GPU I've ever had so far - really quiet, ran cool, good temps (tried gaming). That said, yeah, I was lucky once... I'm still concerned if I decide to get a 3090 or 3090 Ti, but it'll be easier for me to buy one of those, as it'll be a lot cheaper. However, I do find a few 'cheap' <$1K 4070 Tis, so getting one of those is doable - but I wonder about the much lower VRAM. For gaming it wouldn't be too bad, although some gamers say more games will be tuned for (or require) higher VRAM in the future?

Also, I've read that it's possible to undervolt a 3090 (and 3090 Ti?) to 300-350W? That would be something, at least?
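For what it's worth, the blunt version of that is a power-limit cap rather than a true undervolt (a real undervolt means tuning the voltage/frequency curve in a tool like MSI Afterburner, which usually recovers more performance per watt). A sketch of the simpler route, assuming a stock nvidia-smi install; `power_limit_cmd` is a hypothetical helper name, while `--power-limit` is nvidia-smi's actual flag:

```python
def power_limit_cmd(watts: int) -> list[str]:
    """Build the nvidia-smi invocation that caps board power draw.

    nvidia-smi's -pl/--power-limit flag takes a wattage, needs
    admin/root rights, and the driver clamps the value to the
    card's supported min/max range.
    """
    return ["nvidia-smi", "--power-limit", str(watts)]

# e.g. cap a ~350 W-stock 3090 to 300 W (run with admin rights):
# import subprocess
# subprocess.run(power_limit_cmd(300), check=True)
print(" ".join(power_limit_cmd(300)))
```

The cap does not persist across reboots on all setups, so people typically script it at startup.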
 
Well, I was worried about getting a mining card - I bought a 3080 a while ago and it was fine; in fact, it's the best GPU I've ever had so far - really quiet, ran cool, good temps (tried gaming). That said, yeah, I was lucky once... I'm still concerned if I decide to get a 3090 or 3090 Ti, but it'll be easier for me to buy one of those, as it'll be a lot cheaper. However, I do find a few 'cheap' <$1K 4070 Tis, so getting one of those is doable - but I wonder about the much lower VRAM. For gaming it wouldn't be too bad, although some gamers say more games will be tuned for (or require) higher VRAM in the future?

Also, I've read that it's possible to undervolt a 3090 (and 3090 Ti?) to 300-350W? That would be something, at least?
3090 Ti would be the safest bet of all as it was super expensive and released at the end of the mining boom.
 