Help me choose: RTX 3060 12 GB or RTX 4060 8 GB

johnb35
n00b · Joined Jul 28, 2023 · Messages: 4
Hi all, I had a GTX 1060 6GB and it died last month. I'm hoping to buy a new GPU in the coming weeks and I'm a bit confused about which one to go with. Please help me choose: RTX 3060 12 GB or RTX 4060 8 GB?

Will the 8 GB on the 4060 be an issue in the future? If I go with the 3060 12 GB, will it be enough for the next 4 years? I play most games at the max settings possible at 1080p, so I'd like to choose a card I can keep playing like that on for at least the next few years.

The price difference between the two cards is less than 20 USD in my country.

My specs:
Intel Core i5-10400
32 GB DDR4 RAM
FSP AURUM S 600W
 
12GB will be better than 8GB for the next 4 years on max settings, everything else ignored. The writing is on the wall that 8GB is not going to be enough to last the next 4 years as I'm sure you've seen Steve at Hardware Unboxed on Youtube attest to many, many times.

I have personal experience and learned a lesson about lack of video card memory: I had a GTX690 which has 2 GPUs with 2 GB on each (basically it was SLI on a single card) which meant it really only had 2 GB of usable memory. In 2012 when I first got it, it was incredible. By 2016, however, it was struggling in some games due to lack of VRAM (ignoring that SLI was going away). The horsepower of the GPUs was there, but the lack of VRAM was hindering performance and causing stuttering on the highest settings. I then upgraded to the GTX1080 and then the GTX1080Ti which has 11GB VRAM in 2017. Now in mid-2023, I'm still using the GTX1080Ti and I have to say, the 11GB of VRAM has allowed me to completely ignore any VRAM concerns, even in the latest games. Now it's just the GPU horsepower that is the limiting factor as games are more demanding and so things are just slower, but not due to VRAM. It's a much better spot to be in, IMO. My next video card will have 16GB or more for this reason. I tend to upgrade every few generations rather than jump yearly so this is more important for me.
 
You don't have to ask if 8 GB will be an issue in the future when it's already an issue right now in some cases. Go look at the Hardware Unboxed review and you will see that even at 1080p a couple of games run into issues. Also, the clowns claiming 8 GB is not an issue at all are forgetting that some textures won't even load properly, so you can't just look at FPS numbers.
 
I wouldn't buy any card with 8GB or less anymore.

3060 12GB
RX 6750XT
RX 6800
4070
4060 Ti 16GB

These should be your only considerations for budget cards. A 3060 12GB would be a good match for your CPU. Four years is a lot to ask of anything that isn't a 4090.
 
Easy peasy answer here. Go with the 3060. I'm starting to think I should've gone with that instead of my 3060 Ti. Mainly I didn't want to go 192-bit. But anyway.
 
8GB is already starting to be a problem now. It will only get worse later on.

 
Everybody points to VRAM, but why not a 3060 Ti on your list, which has a 256-bit memory bus compared to 192-bit on the 3060?
 
Everybody points to VRAM, but why not a 3060 Ti on your list, which has a 256-bit memory bus compared to 192-bit on the 3060?
What difference does the wider bus make if it still has the same 8gb of vram?
 
So in other words, you have no clue what you are talking about. Bus width does not matter if the limitation is the amount of VRAM. :rolleyes:
Well, it's better at all times to have a wider bus, but it would still suck when you overrun the frame buffer, just suck less than with a narrower bus. The problem games addressed the issue to some extent for 1440p+ resolutions, but their solution was texture degradation. Who's to say the next batch of titles can run on 8GB at 1440p+ at all? I would shoot for 16GB+ now. I paid $400 for a 12GB 3060 last December, and I regret it, but I needed HDMI 2.1 and it was the cheapest locally.
 
Well, it's better at all times to have a wider bus, but it would still suck when you overrun the frame buffer, just suck less than with a narrower bus. The problem games addressed the issue to some extent for 1440p+ resolutions, but their solution was texture degradation. Who's to say the next batch of titles can run on 8GB at 1440p+ at all? I would shoot for 16GB+ now. I paid $400 for a 12GB 3060 last December, and I regret it, but I needed HDMI 2.1 and it was the cheapest locally.
Where are you coming up with such nonsense? If the limitation is the amount of vram itself then the bus width is irrelevant.
 
Where are you coming up with such nonsense? If the limitation is the amount of vram itself then the bus width is irrelevant.
No it's not. Just like system RAM, data will be swapped out. Some engines handle it better, but most don't for obvious reasons.
 
No it's not. Just like system RAM, data will be swapped out. Some engines handle it better, but most don't for obvious reasons.
AGAIN, if the actual limitation is not having enough VRAM, then the bus width means nothing. You could put a 512-bit bus on the 4060, but in situations where 8GB was not enough it would perform the same as it does now with the 128-bit bus, because bandwidth was NOT the limiting factor in that situation.
 
The 4060 is a much faster card. The market has a LOT of 8GB cards in gamers' hands, including last gen's stellar sellers the 3070/3070 Ti, and that is not lost on software makers.
You might have to drop a top graphical option here and there, mostly in AAA titles, but you won't have dog-shit low FPS in comparison to the 3060. As DLSS 3 adoption improves, the lead over the 3060 12GB will only grow.
 
The 4060 is a much faster card. The market has a LOT of 8GB cards in gamers' hands, including last gen's stellar sellers the 3070/3070 Ti, and that is not lost on software makers.
You might have to drop a top graphical option here and there, mostly in AAA titles, but you won't have dog-shit low FPS in comparison to the 3060. As DLSS 3 adoption improves, the lead over the 3060 12GB will only grow.
The "4060 is a much faster card"?? The 3060 has "dogshit low fps" compared to the 4060??

The 4060 is only 18% faster overall than the 3060 at 1080p and 15% faster at 1440p in the TechPowerUp review.
 
The "4060 is a much faster card"?? The 3060 has "dogshit low fps" compared to the 4060??

The 4060 is only 18% faster overall than the 3060 at 1080p and 15% faster at 1440p in the TechPowerUp review.
Yes, that is faster. Add in DLSS 3 and it is much faster. Of course, "cheap" cards like these should be considered top 1080p cards today and not puffed up to 1440p with sad FPS. No?
If you need better FPS, maybe consider an 8GB 3070 Ti?
 
AGAIN, if the actual limitation is not having enough VRAM, then the bus width means nothing. You could put a 512-bit bus on the 4060, but in situations where 8GB was not enough it would perform the same as it does now with the 128-bit bus, because bandwidth was NOT the limiting factor in that situation.
You obviously have a lot to learn about game APIs and hardware. How do you think the data gets into the frame buffer? Do you realize there are buffer-culling commands? Buffer overwrites? These operations perform better over a wider bus. It's important when the buffer is mostly full. Once the buffer is full, that's a problem. A good engine with good coders won't allow that to happen: it regularly culls unused textures from the buffer and overwrites them. This requires use of the bus on the card, just as when loading initial textures. There are other commands in newer APIs to free up buffer and bandwidth as well. You can see this behaviour in a buffer overrun: a card with higher bandwidth will recover sooner and drop fewer frames than the narrow-bus card. It's a known issue at higher resolutions in some modern titles that push the detected buffer capacity too far. There is memory allocation AND memory utilization.

This is why a 4060 Ti with a narrower bus performs similarly to a 3060 Ti, both with 8GB. It's also why a 4060 Ti with 16GB performs about the same as the 8GB version.
 
Yes, that is faster. Add in DLSS 3 and it is much faster. Of course, "cheap" cards like these should be considered top 1080p cards today and not puffed up to 1440p with sad FPS. No?
If you need better FPS, maybe consider an 8GB 3070 Ti?
Not everything supports DLSS. The 4060 in any flavor is a waste of money; until the 16GB drops, or the non-Ti gets a 16GB version for the price of the 8GB, it's just trash.
 
You obviously have a lot to learn about game APIs and hardware. How do you think the data gets into the frame buffer? Do you realize there are buffer-culling commands? Buffer overwrites? These operations perform better over a wider bus. It's important when the buffer is mostly full. Once the buffer is full, that's a problem. A good engine with good coders won't allow that to happen: it regularly culls unused textures from the buffer and overwrites them. This requires use of the bus on the card, just as when loading initial textures. There are other commands in newer APIs to free up buffer and bandwidth as well. You can see this behaviour in a buffer overrun: a card with higher bandwidth will recover sooner and drop fewer frames than the narrow-bus card. It's a known issue at higher resolutions in some modern titles that push the detected buffer capacity too far.
You clearly can't comprehend that if VRAM is the actual limitation, then the goddamn bus width does not matter. Why do you think all the talk and testing and patches have mentioned VRAM size, NOT bus width or bandwidth? Not one reputable review site has ever been stupid enough to say that if you run out of VRAM it will be better on a card with a wider bus. What you are saying is about as dumb as trying to buy an item that costs 100 bucks when you only have 80 bucks and thinking you are better off having four 20-dollar bills than eight 10-dollar bills. Bus width is just as irrelevant when capacity is the limitation in a given situation.
 
I have always wondered why the 3060 12GB has never really taken its place over 8GB cards. My best guess is that Nvidia is holding it back in the drivers to make the new products look better than they really are in benchmarks.
 
You clearly can't comprehend that if VRAM is the actual limitation, then the goddamn bus width does not matter. Why do you think all the talk and testing and patches have mentioned VRAM size, NOT bus width or bandwidth? Not one reputable review site has ever been stupid enough to say that if you run out of VRAM it will be better on a card with a wider bus. What you are saying is about as dumb as trying to buy an item that costs 100 bucks when you only have 80 bucks and thinking you are better off having four 20-dollar bills than eight 10-dollar bills. Bus width is just as irrelevant when capacity is the limitation in a given situation.

Bus width doesn't matter? You can't be that stupid. If the bus width/bandwidth were fast enough, you wouldn't need nearly as much VRAM. That's Nvidia's whole point with the cache on the 4060, and they were proven right with the 16GB version. Operations don't just stop when the buffer fills up: textures are swapped, shaders are swapped, sh*t is happening on the bus all the time. If you are too stupid to research the facts, or too arrogant to admit you are wrong, then you get blocked for spamming the same stupid and ignorant statement.

Bus width would not "fix" those titles, but a faster/wider bus would still perform better in those cases than a slower/narrower bus. JUST like I said in my post that started your ignorant rant.
 
Bus width doesn't matter? You can't be that stupid. If the bus width/bandwidth were fast enough, you wouldn't need nearly as much VRAM. That's Nvidia's whole point with the cache on the 4060, and they were proven right with the 16GB version. Operations don't just stop when the buffer fills up: textures are swapped, shaders are swapped, sh*t is happening on the bus all the time. If you are too stupid to research the facts, or too arrogant to admit you are wrong, then you get blocked for spamming the same stupid and ignorant statement.
You are fucking clueless. If a game truly needs 10GB of VRAM at certain settings and you only have 8GB, then the damn bus width is not going to help anything, as the limitation is the capacity. If a game runs out of VRAM, then the game will reduce visuals, stutter, or crash no matter what the fucking bus width is, you ignorant clown. Even Steve from Hardware Unboxed mentioned that bus width is irrelevant if the true limitation is the amount of VRAM.
 
Hi all, I had a GTX 1060 6GB and it died last month. I'm hoping to buy a new GPU in the coming weeks and I'm a bit confused about which one to go with. Please help me choose: RTX 3060 12 GB or RTX 4060 8 GB?

Will the 8 GB on the 4060 be an issue in the future? If I go with the 3060 12 GB, will it be enough for the next 4 years? I play most games at the max settings possible at 1080p, so I'd like to choose a card I can keep playing like that on for at least the next few years.

The price difference between the two cards is less than 20 USD in my country.

My specs:
Intel Core i5-10400
32 GB DDR4 RAM
FSP AURUM S 600W

Neither option. Buying a 3060 is a waste given it's already a gen-old, barely midrange card that isn't worth the price, and the 4060 is a waste of perfectly good silicon. I can't think of a reason to recommend anyone buy it.

If you want something that is going to last 4+ years for 1080p, you will need at minimum 12 GB of VRAM. 16 would be better.

Given where we are with GPUs, that means either:

RX 6700 XT / 6750 XT (~$320ish)
RX 6800 / 6800 XT / 6900 XT / 6950 XT (~$450-550)
RTX 4070 and above ($600+)
Intel Arc A770 16GB (warning: early adopters be wary) (~$330ish)

Honestly, if you want something to last 4+ years, you're going to need to shell out some cash. ~$500ish will get a 6800 XT, which blows the 4060 Ti 16GB out of the water.

And for the same price (~$300ish) as the 4060, the 6700 XT / 6750 XT is just faster with more VRAM.

Right now, if you're not planning to spend over a grand, AMD is the best bang for the buck when it comes to pure rasterization. I wouldn't buy an Nvidia GPU this generation unless you're looking at the 4070 or higher ($600+ GPUs), and those will probably also have issues in a few years as they are only 12 GB cards.

Basically, Nvidia's lineup for the RTX 40 series is stupid.

Oh, and depending on timing, we're waiting on AMD to release the 7700 and 7800, which are slated for 12 and 16 GB respectively, anticipated for later this summer. It will be very interesting to see how those slot into the market.
 
Here we go with that game again; I don't know what area of the game they benchmarked.
 
You are fucking clueless. If a game truly needs 10GB of VRAM at certain settings and you only have 8GB, then the damn bus width is not going to help anything, as the limitation is the capacity. If a game runs out of VRAM, then the game will reduce visuals, stutter, or crash no matter what the fucking bus width is, you ignorant clown. Even Steve from Hardware Unboxed mentioned that bus width is irrelevant if the true limitation is the amount of VRAM.

If the game needs 10 and you have 8, how do you swap those extra 2GB in and out of memory? You do it via the bus. Now, obviously a 16GB card with a slightly narrower bus will perform better than an 8GB card with a slightly wider bus in this scenario, BUT if you have two 8GB cards, the one with the wider bus will suffer less from having to swap 2GB worth of assets in and out of VRAM. The exception is the rare game that simply crashes if you exceed your frame buffer, but that's not very common. That said, a 3060 Ti still makes absolutely no sense if your main concern is running out of VRAM, so that suggestion given the topic is dumb.
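To put rough numbers on that swapping penalty, here's a back-of-the-envelope Python sketch. All bandwidth figures are assumptions (theoretical PCIe 4.0 x16 throughput and nominal GDDR6 rates), not measurements from either card:

```python
# Rough time to stream 2 GB of overflow assets between system RAM and VRAM.
# Figures below are assumed theoretical peaks, not measured numbers.
PCIE4_X16_GBPS = 31.5      # GB/s, theoretical PCIe 4.0 x16 link
VRAM_192BIT_GBPS = 360.0   # GB/s, e.g. 192-bit GDDR6 at 15 Gbps/pin
VRAM_128BIT_GBPS = 272.0   # GB/s, e.g. 128-bit GDDR6 at 17 Gbps/pin

def transfer_ms(gigabytes: float, gbps: float) -> float:
    """Milliseconds to move `gigabytes` over a link running at `gbps` GB/s."""
    return gigabytes / gbps * 1000.0

overflow_gb = 2.0  # assets that don't fit in an 8 GB buffer
print(f"PCIe hop:         {transfer_ms(overflow_gb, PCIE4_X16_GBPS):.1f} ms")
print(f"192-bit VRAM hop: {transfer_ms(overflow_gb, VRAM_192BIT_GBPS):.1f} ms")
print(f"128-bit VRAM hop: {transfer_ms(overflow_gb, VRAM_128BIT_GBPS):.1f} ms")
```

At these assumed rates, the PCIe hop costs ~63 ms (several frame-times at 60 FPS), while either VRAM bus moves the same data in well under 10 ms, which is why the overflow itself hurts far more than the bus-width difference.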
 
Not everything supports DLSS. The 4060 in any flavor is a waste of money; until the 16GB drops, or the non-Ti gets a 16GB version for the price of the 8GB, it's just trash.
Stupid overpriced cards with extra RAM are still stupid slow cards. Sure, go ahead and pay extra $$ for more RAM on a potato card so you can see ultra textures at shit frame rates.
 
Stupid overpriced cards with extra RAM are still stupid slow cards. Sure, go ahead and pay extra $$ for more RAM on a potato card so you can see ultra textures at shit frame rates.
16GB would still be useful to max certain titles at 1080p that struggle with 8GB. Higher settings at 1440p would be possible too. What they are doing is obvious, the same as always: filling gaps in the product stack by crippling the memory bus of better cards.
 
16GB would still be useful to max certain titles at 1080p that struggle with 8GB. Higher settings at 1440p would be possible too. What they are doing is obvious, the same as always: filling gaps in the product stack by crippling the memory bus of better cards.
Well, DUH! All I am saying is that shit cards with extra RAM are still shit. Going from ultra to high on a setting isn't the end of the world, since you will actually still get top FPS, but it doesn't work the other way around.
Pick your poison.
 
Well, DUH! All I am saying is that shit cards with extra RAM are still shit. Going from ultra to high on a setting isn't the end of the world, since you will actually still get top FPS, but it doesn't work the other way around.
Pick your poison.
RTX eats a lot of VRAM. There is a legit reason for 16GB on a slow bus; it's just only present in about three titles right now. Still not worth $500 for three games, though. The shit cards would not be so shitty at about $100 less.
 
If the game needs 10 and you have 8, how do you swap those extra 2GB in and out of memory? You do it via the bus. Now, obviously a 16GB card with a slightly narrower bus will perform better than an 8GB card with a slightly wider bus in this scenario, BUT if you have two 8GB cards, the one with the wider bus will suffer less from having to swap 2GB worth of assets in and out of VRAM. The exception is the rare game that simply crashes if you exceed your frame buffer, but that's not very common. That said, a 3060 Ti still makes absolutely no sense if your main concern is running out of VRAM, so that suggestion given the topic is dumb.

Which bus?

When you're trying to pull data through a 31.5GB/s PCIe Gen4 x16 bus, the VRAM bus with ~10x that bandwidth isn't going to be the bottleneck.
 
Which bus?

When you're trying to pull data through a 31.5GB/s PCIe Gen4 x16 bus, the VRAM bus with ~10x that bandwidth isn't going to be the bottleneck.
Technically... both. Assets not stored in VRAM have to traverse the PCIe bus from main memory and then still need to traverse the VRAM bus for processing by the GPU. At this point, we could also argue about the main memory bus. That's a fair argument if you're chasing the "biggest" bottleneck, which no one is disputing: the main bottleneck is far and away running out of VRAM. But being the biggest bottleneck doesn't make it the only bottleneck, which is an important distinction.
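For what it's worth, the peak bandwidth figures being argued about here fall straight out of bus width times per-pin data rate. A quick sketch, where the per-pin rates are nominal GDDR6 figures and an assumption on my part rather than verified card specs:

```python
# Peak VRAM bandwidth = (bus width in bits / 8 bits per byte) * per-pin rate.
# Per-pin rates below are assumed nominal GDDR6 figures, not verified specs.
def vram_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_bits / 8 * gbps_per_pin

bw_192bit = vram_bandwidth_gbs(192, 15.0)  # 3060-style 192-bit bus
bw_128bit = vram_bandwidth_gbs(128, 17.0)  # 4060-style 128-bit bus
pcie4_x16 = 31.5                           # GB/s, theoretical PCIe 4.0 x16

print(f"192-bit bus: {bw_192bit:.0f} GB/s ({bw_192bit / pcie4_x16:.1f}x PCIe)")
print(f"128-bit bus: {bw_128bit:.0f} GB/s ({bw_128bit / pcie4_x16:.1f}x PCIe)")
```

So even the narrower 128-bit bus is still roughly an order of magnitude faster than the PCIe link, which supports the point that once assets spill into system RAM, the PCIe hop is what dominates.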
 
Both the RX 67xx series and the RTX 3060 have 192-bit buses, so in addition to more VRAM, both cards also have wider buses than the 4060's 128-bit. The only thing the 4060 has going for it is DLSS 3, and honestly I don't think DLSS 3 is worth the sacrifices to memory, but that's just my opinion.



Lol, given that the 3060 can sometimes get almost double the FPS of the 4060 in this game, you would need DLSS 3 just to match the performance of a 3060. I expect this trend to continue moving forward. Imagine losing to your last-gen counterpart and needing fake frames just to even the score.
 
Love these threads where OP's first post is a simple question and the thread devolves immediately into the weeds about edge case scenarios where one card can have a significant difference over the other or other mostly irrelevant architecture workings.

OP, go with whatever is cheapest in that class of GPUs and you'll likely be fine for a while, even with 8GB at 1080p. You're not going to be running a lot of RT features (which use a bit more VRAM) on that class of GPU anyways if you favor higher performance.

Right now, I think the best-value card in that class is the RX 6700 (non-XT) with 10 GB of VRAM, which you can usually find for well under $300 (it was $270 new for a while a couple of months ago). Otherwise, the 12GB 3060 would be fine as well for the foreseeable future. Worst case, if it's not, you sell the card toward a higher-end card that meets your expectations, if your budget allows.

The midrange 40-series cards are fine cards too, just bad values at their current prices and most are generally avoiding them.
 
12GB will be better than 8GB for the next 4 years on max settings, everything else ignored. The writing is on the wall that 8GB is not going to be enough to last the next 4 years as I'm sure you've seen Steve at Hardware Unboxed on Youtube attest to many, many times.

I have personal experience and learned a lesson about lack of video card memory: I had a GTX690 which has 2 GPUs with 2 GB on each (basically it was SLI on a single card) which meant it really only had 2 GB of usable memory. In 2012 when I first got it, it was incredible. By 2016, however, it was struggling in some games due to lack of VRAM (ignoring that SLI was going away). The horsepower of the GPUs was there, but the lack of VRAM was hindering performance and causing stuttering on the highest settings. I then upgraded to the GTX1080 and then the GTX1080Ti which has 11GB VRAM in 2017. Now in mid-2023, I'm still using the GTX1080Ti and I have to say, the 11GB of VRAM has allowed me to completely ignore any VRAM concerns, even in the latest games. Now it's just the GPU horsepower that is the limiting factor as games are more demanding and so things are just slower, but not due to VRAM. It's a much better spot to be in, IMO. My next video card will have 16GB or more for this reason. I tend to upgrade every few generations rather than jump yearly so this is more important for me.
Hell, I was having VRAM issues on my GTX 670 in 2013 just due to Skyrim mods.
 
Lol, given that the 3060 can sometimes get almost double the FPS of the 4060 in this game, you would need DLSS 3 just to match the performance of a 3060. I expect this trend to continue moving forward. Imagine losing to your last-gen counterpart and needing fake frames just to even the score.
Simply due to Nvidia's greed. Full stop. Silicon must be diverted to the AI line of business at the cost of all the others; they will make their money there. We are now the peasants who will eat the scraps.
 