Finally.......a 3070 will I be happy about it?

narsbars

My number finally came up in the EVGA lottery. I got an EVGA 08G-P5-3755-KL GeForce RTX 3070 XC3 ULTRA GAMING.

This LHR card will be replacing/upgrading a GTX 1080 Ti FE. I had asked for opinions on this all the way back in Jan 2021. Will I see the difference? I do regular gaming, Cyberpunk and some FPS titles. I only use the onboard "legal" OC.
Will I be disappointed?
Look for the 1080 Ti in the WTS section in about ten days :):)
 

Well, in Cyberpunk you'll be able to turn on at least some level of Raytracing (how much depends on the resolution you play at and the FPS you are comfortable with) and that makes a pretty decent difference in that title alone.


Anecdote: I had a Radeon VII in my main rig until last month. It played everything I wanted it to just fine. 4K was playable for most games though some setting reductions had to occur to make it "smooth" on a lot of different titles. At 2K resolution, no compromises needed to be made at all. The Radeon VII is roughly comparable in performance and features to the 1080Ti you are coming from...

I recently had the opportunity to acquire a reference Radeon 6800XT and EK waterblock for pretty close to retail ($1080 shipped, actually). It is night and day. I can run everything I play maxed out at 4K. Raytracing in Control is amazing. The "smooth" is very very smooth. Yes, I feel safe in saying you will be satisfied.
 
You are right. I did not mention FPS. Like any gamer I would like to play at 4K and my monitor caps out at 60 FPS. My primary goal is the highest possible quality settings first at 2K, followed by dreaming about 4K. Will any current games play at 4K? Everything I read says a 3070 won't deliver a reliable 4K with the highest settings.
 

If you're actually using a 2K display, you should be able to max everything out, no problem. As far as 4K is concerned, the 3070 is still pretty performant, especially in raster mode. It should be just fine. 4K Raytracing might be a little more challenging on a 3070, but that really depends on the game - and most games offer different levels of raytracing. Then again if the game is popular enough to warrant nVidia supporting it, there is also DLSS which will get you to 4K through upscaling. Overall, you'll be coming out ahead.
 
I am using an early 4K LG display.
 
It will be a notable, but not earth-shattering upgrade from your 1080 ti. You're looking at ~30% raster performance improvement.

As an example, in real terms, you'll be able to play CP2077 at 1440p ultra with somewhat reasonable performance, e.g. 40-60 FPS, whereas a 1080 Ti would be nearly unplayable. Similarly, you'll be able to play many games at 4K with some reduced settings on that 3070, whereas the 1080 Ti would be a non-starter at 4K. You're not going to be enabling VRAM-heavy settings (e.g. high AA) to the point where the reduced VRAM on the 3070 vs. the 1080 Ti matters.

As for RT, it's really going to be a tech demo on the 3070, since once you raise RT settings (e.g. "high") to where it's really noticeable, you won't get acceptable FPS (in the handful of games where RT has real benefit). I'd rather play highest settings at 1440p without RT. Similarly, on DLSS -- and I've said my piece here -- you're better off playing "clean" at a lower resolution than upscaling.
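For a rough sense of what that ~30% figure means in practice, here's a trivial back-of-the-envelope sketch in Python; the baseline FPS numbers are purely hypothetical and only illustrate the scaling, not any measured result:

# Hypothetical raster uplift from a GTX 1080 Ti to an RTX 3070.
# The ~30% figure comes from the post above; the baseline FPS values are made up.
uplift = 1.30  # ~30% faster in pure rasterization

for baseline_fps in (30, 45, 60):  # hypothetical 1080 Ti frame rates
    estimated_3070 = baseline_fps * uplift
    print(f"1080 Ti at {baseline_fps} FPS -> roughly {estimated_3070:.0f} FPS on a 3070")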
 
The RTX 3070 is a solid upgrade from a GTX 1080 Ti. An RTX 3070 is roughly equivalent to an RTX 2080 Ti if I am not mistaken, although it has less memory than the 2080 Ti does. But when it comes to Cyberpunk 2077, don't get your hopes up.

At the time of its release, the RTX 3090 was only capable of around 22FPS in Cyberpunk 2077 at 4K native with the ray tracing ultra preset and without DLSS being turned on. Let that sink in. 22FPS, on an RTX 3090 with a Ryzen 9 5950X. I got about that on a Core i9 10900K and a watercooled RTX 3090 give or take a couple FPS. Now, Cyberpunk 2077 has had some "optimization" by CDPR (basically, shorter draw distances) and DLSS performance quality has improved with newer libraries. With an RTX 2080 Ti, I used to get around 41FPS in Cyberpunk 2077 with DLSS set to performance and the ray tracing ultra preset at 3440x1440 which is a little easier than 3840x2160.

On my RTX 3090 FE, I get about 55FPS average using a Core i9 10900K with the ray tracing ultra preset and DLSS set to balanced. I haven't tried it with my Core i9 12900K. I've fired the game up and it felt smoother, but I didn't have the FPS counter enabled, so I am not 100% sure it's improved at all. You are mostly GPU limited at 4K, but CPU performance does factor in somewhat.
 
Similarly, on DLSS -- and I've said my piece here -- you're better off playing "clean" at a lower resolution than upscaling.
I'm going to disagree with that last part. When I had the RTX 2080 Ti, I tried running at 2560x1440 and 1920x1080 (and still do for benchmarking during reviews) on a 4K display and I'd rather run with DLSS enabled. Running a lower resolution that's essentially stretched to fit the display gives the image a "fuzzy" look. That's true anytime you run a non-native resolution on an LCD display of any kind. With DLSS, it's not as good as running native at 4K without it but it's somewhere in between. It's true DLSS can produce some weird pop-in effects or artifacting at times, but for the most part it works very well and the performance trade off is usually worth it. At least, running on balanced or something like that. The ultra-performance mode for DLSS is not quite as good but in that mode there is more of an argument to be made for running at 2560x1440 and upscaling.
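To make that "somewhere in between" point concrete, here's a quick sketch of the internal render resolutions DLSS works from at a 4K output, using the commonly published DLSS 2.x per-axis scale factors; exact values can vary by game and DLSS version:

# Approximate DLSS 2.x internal render resolutions for a 3840x2160 output.
# Scale factors are the commonly cited per-axis values; individual games may differ.
output_w, output_h = 3840, 2160
modes = {
    "Quality":           0.667,
    "Balanced":          0.580,
    "Performance":       0.500,
    "Ultra Performance": 0.333,
}

for mode, scale in modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode:>17}: renders ~{w}x{h}, then upscales to {output_w}x{output_h}")

# e.g. Performance mode renders at roughly 1920x1080 internally, so the result sits
# between a stretched native 1080p image and native 4K, which matches the post above.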
 
That's a fair point. If you were starting fresh on monitors and had a choice between a 1440p and 4K display to complement a 3070, I'd pick the 1440p display. If you're "stuck" with a 4K display and a 3070, running at a moderate DLSS setting at 4K is probably the best option so you're not dealing with scaling "fuzziness" at 1440p.

And again, I've given my 2c on DLSS: to me, it's like you're applying an astigmatism to play at a higher resolution, which gets worse as you move the setting/sampling up. "Balanced" in CP is somewhat passable.
 
You caught it. I have no choice in monitors. I have a 4K display limited to a 60Hz refresh rate. My eyes are bad enough, and it's funny your comment mentions astigmatism, as I have a bad case in both eyes and can't wear contacts. Can I get some more input for a newbie to the 3000 series?
I usually just let the game pick the settings and then mildly tweak to try for a little more quality. From what you are saying, I have to move into 2022 and learn some new settings. I am a little confused about your view on DLSS. I don't want any more fuzzies than my eyes already provide.
 
I am still on 1080p and I wanted a 6700 XT, but none were in stock at the time. Coming from a reference RX 5700 that was pulling a 27,000 GPU score in FS on a 3700X/X470 setup, I understood my limits with a 3070, as the memory is GDDR6 rather than the GDDR6X used on the upper-tier Nvidia cards. It pulls 34,000 in FS on the same setup, which is PCI Express 3.0 @ 16x, but as others say, the RT/DLSS side is more geared toward 1080p and 2K gaming.

I had no issue with my FreeSync display and the G-Sync driver using DisplayPort @ 165Hz; it's been a smooth transition moving from an AMD GPU to an Nvidia GPU. Also, the Nvidia recorder works just as well as AMD ReLive if you like recording your gameplay.

That was while testing my new 3600MHz memory sticks.

 
I did a similar jump, 1080 Ti to 3070 FE; I had to camp out at Best Buy to get it, but looking back it was definitely worth it at $500. The performance boost is about 35% depending on the game, but the minimum frames/smoothness was/is a big improvement. You said you will use the "legal OC"; do you mean what comes stock, or adjusting the slider in Afterburner/Precision? When I first got my card, adjusting the slider caused temperatures to jump a lot more than I was expecting. However, I was able to overclock/undervolt, which kept clock speeds around 1920MHz and dropped power by about 40 watts. Inside an NCASE M1 my temps dropped from the 80s to mid 60s/low 70s; anything with RTX would cause temps to go to the mid 70s.
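If you go the undervolt route (the curve editing itself happens in Afterburner/Precision), it helps to log clocks, power, and temperature while you test. A minimal sketch of one way to do that from Python, assuming the nvidia-ml-py (pynvml) package is installed and the card you care about is GPU 0:

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the 3070 is GPU 0

try:
    for _ in range(30):  # sample once a second while a game or benchmark runs
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0  # NVML reports milliwatts
        temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"core {clock_mhz} MHz | {power_w:.0f} W | {temp_c} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()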
 
I'm also a bad-eye-having person (w/ astigmatism etc) and decided on a budget 4K60 monitor (Acer CB282K) instead of 1440P165 for my recent 3070Ti build, knowing full well that I wouldn't be able to run many games at 4K even with DLSS. I do a lot of non-gaming stuff (photo, video, 3D modeling, CAD, spreadsheets, the works) that benefit from more screen area and I figured that 8.2MP is enough spatial resolution that I can more or less run games at whatever resolution I need to for 60fps with the effects settings I'd like and with my crappy eyesight they'll look ok scaled to panel rez. Also taking into account that my working stance puts my eyes ~1.5-2ft from the monitor while in Gaming Comfort Position it's more like 3ft. So far it's actually working pretty well.
When I can run something at native 4K I do notice the incredible microdetail and edge sharpness, but 1440P has been completely usable, just with lowered microdetail and a loss of edge sharpness that can be recovered quite a lot with aggressive Image Sharpening. For 1080P, I use Integer Scaling and it looks like native 1080P. I wouldn't recommend this setup to someone with good eyesight, but for those of us who can't see sh*t even with glasses it's totally workable.
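The reason 1080P integer-scales so cleanly on a 4K panel while 1440P needs sharpening help is just the scale-factor math; a quick sketch using the panel and render resolutions from this post:

# Why 1080p integer-scales cleanly on a 3840x2160 panel but 1440p does not.
panel_w, panel_h = 3840, 2160

for render_w, render_h in ((1920, 1080), (2560, 1440)):
    scale = panel_w / render_w  # same factor vertically, since the aspect ratios match
    if scale.is_integer():
        note = "exact integer: each rendered pixel maps to a clean block of panel pixels"
    else:
        note = "non-integer: pixels have to be interpolated, hence the softness"
    print(f"{render_w}x{render_h} -> {scale:g}x scale ({note})")

# 1920x1080 scales by exactly 2x (one rendered pixel -> a 2x2 block of panel pixels),
# while 2560x1440 scales by 1.5x and has to be resampled to fit.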

Some general impressions of what resolutions get 60fps at High settings.
This is on a 2GHz core / 21GT/s memory 3070 Ti, but a regular 3070 will still be generally in the same performance class.
(Note: "Raytracing" means either RTX or ReShade RTGI; the performance hit is about the same.)
6K Native- Fall Guys

4K Native- Mirror's Edge: Catalyst, DOOM (2016), Doom Eternal, Mad Max, Civ 6, Metro Exodus (old version), Far Cry 5

4K Quality DLSS- Doom Eternal + Raytracing

4K Performance DLSS- Control, Cyberpunk 2077, Metro Exodus + Raytracing (Enhanced Edition), Shadow of the Tomb Raider, Quake II RTX, Red Dead Redemption 2

1440P Native- Mirror's Edge: Catalyst + Raytracing, Far Cry 5 + Raytracing, Mad Max + Raytracing

1440P Balanced DLSS- Cyberpunk 2077 + Raytracing, RDR2 + Raytracing, Control + Raytracing, Shadow of the Tomb Raider + Raytracing

1080P Quality DLSS- Cyberpunk 2077 + RTX and ReShade Raytracing
 

I feel you, man... I have an astigmatism in each eye (not uncommon), but each eye has a different astigmatism orientation (pretty uncommon); there is no pair of glasses in this world that is gonna fix that. So my solution is to use a 55" LG TV mounted hanging down from the rafters at 4K res with 200% zoom at 120Hz refresh utilizing FreeSync, which I sit ~6 feet from. My man cave is the basement, so it works really well.
 
Depends on what you paid for the 3070 and what you think you can get for a 1080ti. I wouldn't expect any earth shattering difference... I play on 4k all the time on my 1080ti. RTX cores would be nice for demanding games via DLSS though.
 
I went from an EVGA GTX 680 Classified to a 3070 Ti and the upgrade is massive. I don't know if you will be happy with your 3070, but I sure as fuck am, as I haven't played games so smoothly and nicely for a good few years now.
 
Well, your upgrade was much bigger than the OP's. While your GTX 680 was the high end at the time, you waited long enough that a more mid-range card became a huge upgrade. You skipped four generations of GPUs and a lot more releases than that.
 

What can I say, I honestly didn't feel the need to upgrade during those years, as the games I played and still play worked really well on my 680, but I did start to notice slowdowns when some of my games got updates. Hence the reason I tried to get a 3080 or 3080 Ti, but there were absolutely none to be had; in the end I had to go to fleabay for my card.

You wanna buy a non castrated 680 gtx classified ? 😀
 

680 GTX is still an ok 1080p card, you can still get a couple hundred bucks for it I bet. I'd say list it in the for sale forum but you might not meet the posting requirements yet.

I mean we live in a world where 1050ti's are $300 new, and I bet the performance of a 680 GTX is in the same 1080p ballpark.
 
You wanna buy a non castrated 680 gtx classified ? 😀
It's cool that it's an EVGA GTX 680 Classified Edition, but hell no. I've got a couple of GTX 680 4GB cards floating around here somewhere. I also have better cards than that I'm not using.
 
Sell those things, Dan! I bet you could get $200 each for GTX 680's.... people are desperate. You will never get more money for those than today. I sold a 680 in 2020 for $60, like a fool.

Not only will you make $$$, you are performing a community service to the GPU needy. I'm looking at the Microcenter site right now, and the best thing you can buy under $200 is a GT 1030, which is a giant turd of a card.

The only downside is lack of new driver support, but should be fine for older games.
 
It's cool that it's an EVGA GTX 680 Classified Edition, but hell no. I've got a couple of GTX 680 4GB cards floating around here somewhere. I also have better cards than that I'm not using.

I was joking about selling it 😀 but it is the version that has hardwired settings for water overclocking and another for LN2 overclocking, and it has the adapter for attaching the EVBot overclocking device, but then Nvidia had to come along and forced EVGA to break off the EVBot connector on later GTX 680s.

For all I know, that version of the GTX 680 could be worth some money some day due to the fact it has crazy built-in hardware OC settings.
 
Sell those things, Dan! I bet you could get $200 each for GTX 680's.... people are desperate.
I should. I have a bunch of old GPUs. GTX 1080 Ti, two Titan X's, 780 Ti, AMD HD 6970, 2080 Ti, and plenty of others.
 
I run a 3070 on 1440@144 and it's been fine. No complaints.

Me too, I've just bought a Samsung Odyssey G7 32-inch 2560x1440 and my 3070 Ti handles the resolution pretty well, but I've only just bought it and things could change with games I haven't played yet. What's weird is that the monitor has its own built-in crosshair, and I can't think of a reason why I would ever enable it.
 
The RTX 3070 is a solid upgrade from a GTX 1080 Ti. An RTX 3070 is roughly equivalent to an RTX 2080 Ti if I am not mistaken. Although, it has less memory than the 2080 Ti does. But when it comes to Cyberpunk 2077, don't get your hopes up.

At the time of its release, the RTX 3090 was only capable of around 22FPS in Cyberpunk 2077 at 4K native with the ray tracing ultra preset and without DLSS being turned on. Let that sink in. 22FPS, on an RTX 3090 with a Ryzen 9 5950X. I got about that on a Core i9 10900K and a watercooled RTX 3090 give or take a couple FPS. Now, Cyberpunk 2077 has had some "optimization" by CDPR (basically, shorter draw distances) and DLSS performance quality has improved with newer libraries. With an RTX 2080 Ti, I used to get around 41FPS in Cyberpunk 2077 with DLSS set to performance and the ray tracing ultra preset at 3440x1440 which is a little easier than 3840x2160.

On my RTX 3090 FE, I get about 55FPS average using a Core i9 10900K with the ray tracing ultra preset and DLSS set to balanced. I haven't tried it with my Core i9 12900K. I've fired the game up and it felt smoother, but I didn't have the FPS counter enabled, so I am not 100% sure its improved at all. You are mostly GPU limited at 4K, but CPU performance does factor in somewhat.

3080 Ti FE here, undervolted; I have locked my game to 60fps and it sits comfortably at that all day long in game with Psycho RT Lighting and SSR, everything else on Ultra/highest, at 3440x1440. The difference on mine is that I'm on a 12700KF (stock). I see no reason why a non-Ti 3070 should not get solid frames close to or at 60fps at 2K resolution like the OP is proposing.

Keep in mind that at 3440x1440 I am seeing 9GB+ of VRAM utilisation in Cyberpunk 2077 and that's with DLSS set to Balanced. So 4K on an 8GB card is out of the question especially if running with RTX on even with DLSS.
 
You are comparing 3440x1440 to 3840x2160 and shouldn't. While 3440x1440 isn't trivial, it's not as demanding as 3840x2160 is. The reason I say a 3070 can't get 60FPS in Cyberpunk 2077 with ray tracing set to ultra (even with DLSS) at 4K is because it can't. I didn't actually say anything about running at 2560x1440 in terms of frame rates. I simply said it would look like shit. He can probably do it at 2560x1440 with decent frame rates but, as I said, it will look like shit on a 4K display. Running non-native resolutions always looks terrible on LCDs. If you are dead set on having ray tracing enabled in Cyberpunk 2077, your choices are to set the resolution to 2560x1440 and upscale, or set it to 3840x2160 and use DLSS.
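For context on why 3440x1440 is meaningfully lighter than 3840x2160, the pixel counts alone tell the story; a quick worked sketch:

# Pixel-count comparison between ultrawide 1440p and 4K.
resolutions = {"3440x1440": (3440, 1440), "3840x2160": (3840, 2160)}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f} megapixels")

ratio = pixels["3840x2160"] / pixels["3440x1440"]
print(f"4K pushes ~{ratio:.0%} of the ultrawide pixel count, i.e. about {ratio - 1:.0%} more pixels per frame")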

In my opinion, you are better off turning off ray tracing entirely and running at the native resolution of the monitor than running it at 2560x1440 and upscaling with ray tracing enabled. There is nothing stopping the OP from trying both. As for video memory, I already mentioned that the 2080 Ti had more than the RTX 3070 does. I think over time the RTX 2080 Ti will ultimately age better because of that, but that's another topic. I only brought up the RTX 2080 Ti because, outside of ray tracing, it and the 3070 are similar performers. Keep in mind that while your setup may be utilizing 9GB of VRAM, that does not mean it's required. I can make Cyberpunk 2077 pull 13-14GB of VRAM if I use a console command to allow that. However, it does nothing for performance. There is a big difference between VRAM allocation and VRAM utilization. The tools we have measure the former, not the latter.
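To illustrate the allocation-vs-utilization distinction: the number an overlay shows is the driver's bookkeeping of how much VRAM has been reserved, which (as the post says) is not the same as how much the game actively touches per frame. A minimal sketch of reading that allocation figure, assuming the nvidia-ml-py (pynvml) package and GPU 0:

import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the card of interest is GPU 0

mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
# 'used' here is allocated/reserved VRAM across all processes -- the same kind of
# number an OSD shows. A game can allocate far more than it actually needs each frame,
# so a high reading alone doesn't prove the card is short on memory.
print(f"allocated: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")

pynvml.nvmlShutdown()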
 
No, but the OP did say they are planning on running at 2K with the hope of future 4K, which is mostly what my comment was in reference to. I know there is no hope of 4K at a decent FPS with RT on, even with DLSS, hence why I mentioned the 1440p range, which is entirely possible on a 4K display as the OP already knows.

Utilisation is definitely accurate too. I'm using the GeForce Experience stats OSD and tested using some high-resolution 4K and 8K texture mods for Cyberpunk 2077; one of the packs is the asphalt and rock/brick materials. With all of the texture mods installed, the VRAM use jumped to almost 12GB, which was causing some stutter in areas of the game where those textures were heavily used, but as soon as I removed two of the texture packs that totalled a few hundred MB each, the VRAM use dropped to around 10GB and the stutters stopped.
 