6700 Speculation & Rumors (NOT 6800/6900x)

BenWah

Weaksauce
Joined
Jan 21, 2014
Messages
98
Very little is known about the 6700 cards yet.
Since almost all discussion is about the 6800 & 6900, I created this thread for speculation, rumors, and crumbs of info specific to 6700.

They're based on the Navi 22 GPU, rumored to have 12 GB of GDDR6 memory.
Wccftech posted speculation that there will be two 6700 cards.
I'd love to find out TDP, card size, guesses on release date, etc.
 
40CU
at least 2Ghz, probably more, possibly much more
12GB GDDR6 on 192bit bus
2-fan reference design, much smaller than 6800/6900
6700XT probably between 2080S and 2080Ti
6700 probably around 2080S
no idea on power, likely 200-225W for 6700XT and 175-200W for 6700
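A quick sanity check on what that rumored 192-bit bus implies for bandwidth (the 16 Gbps chip speed is my assumption; the 5700 XT figures are known):

```python
# Peak GDDR6 bandwidth: (bus width in bits / 8) bytes per transfer * per-pin data rate
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gbps

# Rumored 6700 XT: 192-bit bus, assuming 16 Gbps GDDR6
print(bandwidth_gbs(192, 16))  # 384.0 GB/s
# Known 5700 XT for comparison: 256-bit bus, 14 Gbps GDDR6
print(bandwidth_gbs(256, 14))  # 448.0 GB/s
```

So on raw numbers the rumored card would have less bandwidth than a 5700 XT, which is presumably where Infinity Cache comes in.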
 
The PS5 is 36 CU, which is half a 6800 XT, with a 2.23 GHz sustained boost. I would expect at least that as the starting spec for a 67** card.
 
The 6800 is shown to be faster than the 2080 Ti / 3070 in AMD's benchmarks, so the 6700 cards would need to fit in the range above the RX 5700 XT / 2070 Super, up to the 2080 Ti.
 
$275 for 6700; $350 for 6700xt.

192 bit very likely, 12GB, possibly 6GB for the 6700.

AMD is really going to eat Nvidia's lunch at this price point with its Infinity Cache. Nvidia will probably use GA106, likely with 6 GB, and be bandwidth starved unless they use GDDR6X. Once production costs go down, Nvidia may offer a cut-down GA104 to compete.

I see the 6700 series as the next Polaris, in that it will be the go-to midrange card for the next 2-3 years: RX 470 4 GB to RX 590 8 GB then, 6700 6 GB to 7700 XT 12 GB now.
 
I saw an article the other day that the RTX 3060 could be released mid-November.

I hope that's true so AMD will release the mid-range 6700 and 6600 cards sooner.
I really don't need to spend $500-600 on gpu for my wife's computer. Sure it'd be nice, but I doubt she'd notice the difference between a 6700 and 6800 since we really don't play demanding games.
 
Not sure how a $350 6700XT would make sense... seeing that the 6800 is $579. That's quite a huge hole in the product lineup.

It would make more sense to hit the $499 and $399 price points. Then the 6600 series can cover the $249/$299 area, and finally the 6500 series can cover the $149/$199 area.
 
For some reason, when talking about AMD, people just feel like AMD should price lower than anything any other company would charge. It's an unrealistic expectation. The 6700 XT will probably be as fast as the 3070 and will/should be priced appropriately. The 6700 will be slightly slower and should be slightly cheaper.
 
This is what I foresee:

3070 - $499
6700XT - $499

3060 Ti - $399
6700 - $399

3060 - $349 (unless Nvidia stops being greedy and prices it at $299 where it should be)
6600XT - $349
6600 - $299

3050 Ti - $249
6500XT - $249

6500 - $199
3050 - $199

Personally I'm hoping to spend no more than $299, so if Nvidia remains greedy as usual, I'm not giving them that extra $30 and AMD gets my money. I'm sure they'll be quite comparable in performance. Wouldn't be surprised if the 3060 came with 6GB vs 6700's 8GB.
 
I saw an article the other day that the RTX 3060 could be released mid-November. I hope that's true so AMD will release the mid-range 6700 and 6600 cards sooner.
Looks like that isn't happening!

"NVIDIA GeForce RTX 3060 Ti Graphics Card’s Launch Reportedly Pushed Back to December 2nd"
https://wccftech.com/nvidia-geforce-rtx-3060-ti-graphics-cards-launch-december/
 
Specs allegedly leaked here:

https://www.tweaktown.com/images/news/1/6/16907_1_full.jpg
 
This is what I foresee: 3070 and 6700XT at $499, 3060 Ti and 6700 at $399, 3060 and 6600XT at $349, 6600 at $299, on down to the 6500 and 3050 at $199. Personally I'm hoping to spend no more than $299, so if Nvidia remains greedy as usual, AMD gets my money.

I hope there's a 75W or 100W card in there.
 
TDP estimates from the same source:

186-211W for the 6700 XT
146-156W for the 6700
 
I hope both the 6700 and the 3060 (non-Ti) get released soon; I'm guessing nothing is happening until early January. Then we'll see which one is worth my money. I'm genuinely conflicted: AMD tends to give you more VRAM (high textures everywhere!), but if DLSS keeps catching on, the performance improvements are hard to ignore.
 
GA104 is ~400mm2, and it will be competing against Navi 22, which may be under 300mm2 (the actual size isn't known; Navi 10 with 40 CUs was around 250mm2). Navi 22 is supposed to have 40 CUs, the same number as the 5700 XT. The 3060 Ti does not beat the 5700 XT (the full 40 CU Navi chip) by more than 20%. If Navi 22 delivers 50% or better performance per watt, I expect the 6700 XT to beat the 3060 Ti by as much as the 5700 XT beat the 2060 Super. The smaller chip size also allows better yields on the same process and a larger quantity of chips. The rumour of over 2,700 MHz clocks is also very interesting, both for overclocking and for what the gaming clock will be set at. Once the initial buying frenzy for the new generation of consoles dies down, I would expect plenty of PC GPUs to become available from AMD.
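The yield point can be made concrete with the standard first-order dies-per-wafer approximation (the die sizes below are the rough guesses from this post, not confirmed figures):

```python
import math

# First-order dies-per-wafer estimate for a 300 mm wafer:
# wafer area / die area, minus an edge-loss correction term.
# Ignores scribe lines and defect density, so treat it as a rough ratio only.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(400))  # ~143 candidate dies for a GA104-class chip
print(dies_per_wafer(300))  # ~197 for a sub-300mm2 Navi 22 guess
```

Roughly 35-40% more candidate dies per wafer for the smaller chip, before yield differences are even counted.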
 
I can now see a good reason for AMD pricing the 6800 at $579, somewhat close to the 6800 XT for the reference models: it allows the 6700 XT a larger price range for the AIB partners, similar to the big gap between the 6800 XT and 6900 XT. Speculation: the 6700 XT may catch up to the 3070, in other words, or even beat it.
 
Even if the 6700 XT is at 3060 Ti performance, though I expect it to be a little under, it needs to be $350 or lower.

The Nvidia cards just have too many features for AMD to simply match perf/$, i.e. NVENC, DLSS, better RTX.
 
Better DXR, you mean. RTX is just Nvidia's name for its acceleration approach.

The DLSS argument is only momentarily valid, as we should get a DirectML equivalent in a few months.

NVENC is irrelevant to anyone who doesn’t process videos or stream.

Nvidia Broadcast might actually be a legitimate advantage, given how much everyone is Zooming around these days.

To me, perhaps the most relevant factor is VRAM/price. I'm sick of running out of it, and I fear Nvidia will release the 3060 with a garbage 6 GB. If the expected comparable 6700 non-XT gives me 8 or more for a similar price, that might just do it. I'm amazed people haven't revolted over 8gb 3080s, considering the asking price.
 
To me, perhaps the most relevant factor is VRAM/price. I'm sick of running out of it, and I fear nvidia will release the 3060 with a garbage 6g. If the expected comparable 6700 non xt gives me 8 or more for a similar price, that might just do it. I'm amazed people haven't revolted over 8gb 3080s, considering the asking price.

The 6700 non-XT will be either 6 GB or 12 GB given a 192-bit bus. Perhaps there will be both versions, akin to the existing RX 570 4GB/8GB, though most got the lower-VRAM version due to the incredible value.

The 6GB version will do just fine for most at 1080p ultra / 1440p medium.

The 3080 is 10 GB, and I know of no scenario where this has caused an ACTUAL AND MEASURABLE performance hit.
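The bus-width-to-capacity arithmetic behind the 6 GB / 12 GB claim is straightforward, assuming standard 32-bit GDDR6 chips in the common 8 Gb (1 GB) and 16 Gb (2 GB) densities:

```python
# Each GDDR6 chip occupies a 32-bit slice of the bus, so the chip count is fixed
# by the bus width; capacity then depends only on the chip density used.
def vram_options(bus_bits: int, densities_gb=(1, 2)) -> list:
    """Possible VRAM capacities (GB) for a given bus width."""
    chips = bus_bits // 32
    return [chips * d for d in densities_gb]

print(vram_options(192))  # [6, 12]: why a 192-bit 6700 is either 6 GB or 12 GB
print(vram_options(160))  # [5, 10]: a cut-down 160-bit bus would give 10 GB
```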
 
Ah, you're right, the 3080 is indeed 10GB. It's the 3070 that's 8GB, which is still too low for $500 in 2020. As for your 6GB comment, that's highly debatable. VRAM is not just dependent on resolution. Is 6GB enough for current 1080p games? Sure. But this type of card is not just to play current and older games; it's meant to play the games coming out in 2021, 2022, and perhaps even a bit beyond. With the release of new consoles, games are going to change their expected minimums; loading tons of assets/textures is the first noticeable thing in the announced PS5/XBSX games. So even at 1080p, 6GB will be quite tight. Plenty of games now take 8 and 10 GB of VRAM when available, and it's about to get a lot heavier when games designed with the next-gen consoles in mind come out.
 
Textures matter more than resolution, that's for sure.

A few things on that:
- New memory technology is making each GB go further, especially with AMD's new tech in RDNA
- Look at the past to anticipate the future. A 4 GB 5500 XT gets 4 FPS at 1080p in Godfall; a 4 GB RX 570 would likely do similar. An 8 GB RX 580, however, gets 31 FPS. HUGE difference for sure, but do you really want to play at even 30 FPS?
- My bet is that if both were set to a more manageable 1080p medium, both would get closer to 60 FPS. It's all about matching settings as game demands increase.

If they can drop a 6 GB 6700 for $250, a 12 GB 6700 for $300, and a 12 GB 6700 XT for $350, that would make for a solid lineup.
 
- Look at the past to anticipate the future.
Ah, you make some good points, but that line is where we totally differ. Past performance never necessarily reflects future developments, if you ask me. Especially considering that these new consoles are the first ever to be SSD-based, and the fact that they're NVMe is even more meaningful: orders of magnitude more capable than an HDD. That is very much going to change how games are made, freeing developers from loading specific game-level assets and letting them focus more on what's on screen at any given moment. This will likely create a push for more detailed worlds, again hammering memory (though if more and more is passed directly from storage to GPU, VRAM might not be hit in this specific way). My point being: the more objects you have on screen, the more textures you'll need to load, so if you want to keep the quality high, you'd better have more VRAM. We'll see how this evolves!

We certainly agree, though, that at $250 I'd take no less than 6GB, and when it comes to $300... that 3060 had better come out with 8GB, otherwise I'm not even considering it. Your suggestion of a 6700 XT for $350 seems unlikely though; that'd leave a $230 gap between it and the 6800. I think the 6700 XT will go for $399, the 6700 for $320 at 12GB, and the 6700 6GB at $250. That creates space at $200 and below for the 6600 series (given that everything seems to have trickled up in price in the past few generations).

Frankly, I'm also quite burned by Nvidia: when I bought my 1060 3GB, nowhere did they say it was a very different GPU than the 1060 6GB in shader count. I found out afterwards, when my performance didn't match the reviews, and by then I was out of the return window. So, while I'm keen on Nvidia's features, I certainly don't forgive this kind of shenanigan, and I'm more than ready to go back to AMD. A 6GB 6700 at $250 is very tempting, but if I can get the 12GB for $300, that's a no-brainer for me this time. Better to have too much than too little; I'm not making that mistake again.
 
^^ Tough break on the 3 GB 1060, that was pretty shady.

The 3060 will be interesting. Very likely it will be a smaller die with a 192-bit interface.

This means either 6 GB of GDDR6X or 12 GB of GDDR6, but most likely the former, as Nvidia really needs the bandwidth to compete now that AMD has the huge Infinity Cache to compensate.
 
Even if the 6700xt is at 3060ti performance, though I expect it to be a little under, it needs to be $350 or lower.
I do not follow your logic on performance; it will most likely beat if not cream the 3060 Ti. Just looking at the data below, across 18 games at 1440p, the 3060 Ti is 21% faster than a 5700 XT. Are you saying a 40 CU RDNA2 6700 XT will be less than a 21% performance increase over a 40 CU 5700 XT, with a 50% or more performance/watt improvement? I expect it to be closer to the 3070, if not matching/beating it, since the 3070 is only 10% over the 3060 Ti.

[attached image: 3060ti.jpg, 18-game 1440p benchmark chart]
 
Well, using your graphic, the 6800 is 50% faster than the 5700 XT while having 50% more CUs, and that is ignoring the clock differences.

So I guess I am not sure why you think 40 CUs of RDNA2 will be that much faster than the 40 in the 5700 XT.

Performance degrades when you scale up, and the 6700 XT should have higher clocks, but it seems very unlikely to beat a 3060 Ti.
 
Well, we will see, but that would be a pretty poor performance improvement given faster RAM, faster clock speeds, and Infinity Cache as well.
 
Well, using your graphics, the 6800 is 50% faster than the 5700xt while having 50% more CUs and that is ignoring the clock differences..... Performance degrades when you scale up
The same graphic shows the 6800 XT being just 10% faster with 20% more CUs; like you said, the scaling could be far from perfect, and losing 33% of the CUs could mean losing only 15-20% of the performance.
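That back-of-envelope can be written down as a toy power-law fit to the two data points quoted in this exchange (purely illustrative; real scaling also depends on clocks, bandwidth, and cache):

```python
import math

# Toy model: perf ~ CU**alpha. Fit alpha from the chart numbers quoted above:
# the 6800 XT (72 CU) is ~10% faster than the 6800 (60 CU),
# i.e. 1.2x the CUs for 1.1x the performance.
alpha = math.log(1.10) / math.log(72 / 60)

# Estimated performance of a hypothetical 40 CU part relative to the 60 CU 6800
rel = (40 / 60) ** alpha
print(f"alpha ~ {alpha:.2f}; a 40 CU part lands at ~{rel:.0%} of a 6800")
```

Under that (very crude) fit, cutting a third of the CUs costs roughly 19% of the performance, which matches the 15-20% guess above.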
 
Well we will see but that would be a pretty poor performance improvement with faster ram, faster clock speeds with Infinity Cache as well.

RAM will actually be slower, since it is allegedly going down to a 192-bit bus, but yeah, Infinity Cache could more than offset that.
 
The same graphic show the 6800xt being just 10% faster with 20% more CU, like you said the scaling could be far from perfect and loosing 33% of the CU could mean loosing only 15-20% of the performance.
Yes, plus clock speed differences, if it's 10% faster with 200 MHz+ higher clocks or something. We just have to wait and see. To me it looks to be more around 3070 performance, as a guess.
 

AMD Radeon RX 6700 XT Reportedly a "Budget Card," Weaker Than NVIDIA GeForce RTX 3060 Ti


"The Radeon RX 6700 XT will reportedly feature 40 Compute Units, 12 GB of memory on a 192-bit bus, boost clocks of 2.35 to 2.5 GHz, a GPU power of around 200 watts, and less Infinity Cache than the flagship RDNA 2 models. Moore’s claims that it’ll be faster than the Radeon RX 5700 XT (obviously) but will be unable to beat the GeForce RTX 3060 Ti in the rasterization department. It could, however, cost less than $350."

"The Radeon RX 6700, on the other hand, is said to perform very similarly to the Radeon RX 5700 XT. This standard model will supposedly feature a 150 to 180 watt TDP and various potential memory configurations (e.g., 10 GB of memory on a 160-bit bus). Moore’s predicts that the Radeon RX 6700 could be priced at under $300 to trade blows with NVIDIA’s highly probable GeForce RTX 3060."

https://www.thefpsreview.com/2020/1...-card-weaker-than-nvidia-geforce-rtx-3060-ti/
 
Interesting. I was hoping for better competitiveness from AMD this time around, because at this point I really dislike giving Nvidia money. But you have to buy the better-value card, and right now Nvidia seems to be it for the $300 market... one more week and we'll see.
 
The 6700 XT and 6700 should fall above and below the 5700 XT, respectively, if my information is correct. Of course pricing will be key here for what will surely be an excellent 1440p card.
 