New Rig Video Card help

Mottski

n00b · Joined: Mar 11, 2021 · Messages: 5
Hi,

I am building a new 7800X3D-centered rig. This will be my first new computer since 2012, so I am out of the loop on things like ray tracing and whether I need it or not. I have been reading and watching videos, but there is so much contradictory info.

I will be looking to play BG3 and other RPGs (Elden Ring, the upcoming Elder Scrolls VI) as my primary games. Other games will be sims like MechWarrior 5 (and MW5: Clans when it comes out), sports games like Madden, and first-person games (Star Wars titles like Jedi: Survivor and the upcoming Outlaws, plus some shooters). My other uses will include 3D printer slicing software (Cura, Lychee), some infrequent light CAD for when I need to create STLs (if they are not already available for 3D printing), surfing, productivity work (Office, Photoshop, Acrobat), streaming, etc. Currently using two 24" 1080p monitors, but I will likely upgrade within the year to bigger screens at 1440p.

I am stuck between a 7900 XT at $699.99 or a 4070 Ti Super at $799.99-$840.99. With the info above, do I even care about ray tracing? And what recommendation do you have, and why?

Thanks!

-Matt
 
Do you need ray tracing? No. Do you want it? Totally up to the individual. I'm much more of a performance/gameplay type person than a graphics person, so more often than not I've turned it off. But it can make your GPU choice much simpler: if you really care about it, you should pretty much go the Nvidia route.
 
RT is up to you. I personally don't like it; it makes everything look wet, even when it shouldn't be. Either card will serve you well, and both do have RT...
Interesting comment, Pend; wet/glossy is how the Crytek RT demo looks to me. Same on Cyberpunk.
 
At a $100 price difference I believe the NV card is a better choice. Less power usage, DLSS/Frame Gen, better RT performance, Cuda support.

If you were someone who upgraded often, I would say save the money and buy the 7900xt. Since you appear to keep your stuff a long time, I believe the NV card will serve you better over time.
 
Interesting comment, Pend; wet/glossy is how the Crytek RT demo looks to me. Same on Cyberpunk.

RT looks good for screenshots, like static images. Cyberpunk especially looks fantastic with RT/PT on. That being said, having played 200 hours of that game on a 4090, it really isn't worth it. Even with the 4090 at 1440p I could only get 70-80 fps with frame gen on at settings that looked amazing, and at playable settings it didn't look that great, especially when you're running around and not really paying attention.
 
At a $100 price difference I believe the NV card is a better choice. Less power usage, DLSS/Frame Gen, better RT performance, Cuda support.
I agree with this as well. The Nvidia card will be better over the long run.
 
RT looks good for screenshots, like static images. Cyberpunk especially looks fantastic with RT/PT on. That being said, having played 200 hours of that game on a 4090, it really isn't worth it. Even with the 4090 at 1440p I could only get 70-80 fps with frame gen on at settings that looked amazing, and at playable settings it didn't look that great, especially when you're running around and not really paying attention.

I'm going to second this. While I think ray tracing is really cool and has great potential, this is basically my experience using it. Also, it really depends on the title. Certain titles do it very well, but a lot of those titles already look amazing enough as it is with raster that the added benefit isn't worth the framerate cost. There are a lot of games I play that have a ray tracing option, and I turn it on and my framerate tanks but I barely notice the visual upgrade, and I almost never notice it when I'm concentrating on playing the objective and not standing still admiring the scenery. I find I notice ray tracing mostly in games like Quake II RTX, DOOM, or Minecraft, where the raster graphics are already not that intense. That said, I think it will be a big benefit to graphical fidelity eventually, but I won't worry about it until another generation or two when the technology catches up.

When it comes to whether or not this feature matters, look at your current and planned games list. If RT in particular is really important to you, then go with Nvidia. In my case, AMD had better raster performance, cost less, and I wasn't going to be making use of ray tracing because I game at 4K and the only card that you can really do it with at this point is the 4090, which I wasn't willing to pay for, and even then it takes a big hit to the framerate. Anyone else is free to disagree, just sharing my point of view.
 
Hi,

I am building a new 7800X3D-centered rig. This will be my first new computer since 2012, so I am out of the loop on things like ray tracing and whether I need it or not. I have been reading and watching videos, but there is so much contradictory info.

I will be looking to play BG3 and other RPGs (Elden Ring, the upcoming Elder Scrolls VI) as my primary games. Other games will be sims like MechWarrior 5 (and MW5: Clans when it comes out), sports games like Madden, and first-person games (Star Wars titles like Jedi: Survivor and the upcoming Outlaws, plus some shooters). My other uses will include 3D printer slicing software (Cura, Lychee), some infrequent light CAD for when I need to create STLs (if they are not already available for 3D printing), surfing, productivity work (Office, Photoshop, Acrobat), streaming, etc. Currently using two 24" 1080p monitors, but I will likely upgrade within the year to bigger screens at 1440p.

I am stuck between a 7900 XT at $699.99 or a 4070 Ti Super at $799.99-$840.99. With the info above, do I even care about ray tracing? And what recommendation do you have, and why?

Thanks!

-Matt
If I may offer a differing opinion.

The 4070 Ti Super reviews have shown it to be a lackluster product. https://www.techspot.com/review/2793-nvidia-geforce-rtx-4070-ti-super/ It's not a 4080, and it sounds like you don't upgrade very often.

Personally, I would look into the 7900 XTX.

But if you're at this price range, I'd actually argue that the 7900 XT would end up being better in the long run. It's cheaper, has more VRAM, and has better out-of-the-box raster performance. RT is pretty but isn't needed unless you're willing to sacrifice frames, and you can still get decent framerates with RT and FSR 2.0 enabled if you really want the eye candy.

I do have an XTX and play some of the same games (like BG3), and it works great at 1440p. But really it just depends on the ecosystem. Nvidia's ecosystem is a walled-garden approach; AMD is more open. Both companies like to abandon products with new tech. Given that the chips underneath the 7900 XT are a new way of doing things compared to the existing monolithic design, I took the chance that they will only get better over time. While with Nvidia, what it is today is going to be all it ever is.
 

I'm going to second this. While I think ray tracing is really cool and has great potential, this is basically my experience using it. Also, it really depends on the title. Certain titles do it very well, but a lot of those titles already look amazing enough as it is with raster that the added benefit isn't worth the framerate cost. There are a lot of games I play that have a ray tracing option, and I turn it on and my framerate tanks but I barely notice the visual upgrade, and I almost never notice it when I'm concentrating on playing the objective and not standing still admiring the scenery. I find I notice ray tracing mostly in games like Quake II RTX, DOOM, or Minecraft, where the raster graphics are already not that intense. That said, I think it will be a big benefit to graphical fidelity eventually, but I won't worry about it until another generation or two when the technology catches up.

When it comes to whether or not this feature matters, look at your current and planned games list. If RT in particular is really important to you, then go with Nvidia. In my case, AMD had better raster performance, cost less, and I wasn't going to be making use of ray tracing because I game at 4K and the only card that you can really do it with at this point is the 4090, which I wasn't willing to pay for, and even then it takes a big hit to the framerate. Anyone else is free to disagree, just sharing my point of view.
But aren't frame generation and DLSS scaling making up the difference? Aside from Cyberpunk, what games aren't able to hit 144?
 
While with Nvidia, what it is today is going to be all it ever is.
Recent history does not necessarily match that. DLSS "3.5" was made available on Turing, and around the Lovelace launch, Ampere got a nice performance uplift from all the work that went into the drivers for that launch:
https://www-computerbase-de.transla...uto&_x_tr_tl=en&_x_tr_hl=en-US&_x_tr_pto=wapp

Nvidia Reflex launched in 2020 and works on GeForce 900 series cards and up.

Maybe part of DLSS 4 or other new tech (in-VRAM texture inference, for example, has been worked on for a very long time) will be available on some of today's Nvidia GPUs.

From a quick Google search, Cura and Lychee do not for now use much of the GPU (outside of regular display), so you can probably go ahead with AMD without issue. At 1440p, the 7900 GRE could be an interesting option depending on the rebates.

The only little asterisk is the upcoming Elder Scrolls VI. Nothing is known yet, but if it launches to run well on the Xbox Series X, as it kind of must, then a 7900 XT, 7900 GRE, 3080, or 6800 XT should not have any issues. If it takes so long that it only runs well on the next Xbox, that's far enough away not to be a big deal; planning for 2029 can be a bit of a fool's errand.
 
But aren't frame generation and DLSS scaling making up the difference? Aside from Cyberpunk, what games aren't able to hit 144?

It helps, but it's not a silver bullet. Otherwise, why not just buy a 4060 and let DLSS and frame gen make everything playable? Because native performance matters; you don't get anything for free. Plenty of games are not able to hit 144 ray traced with DLSS on with various Nvidia GPUs if you're not gaming at 1080p.

AMD does have FMF and FSR as well (these also work on Nvidia GPUs if you want to use them). Granted, DLSS is technically superior, but having used FSR and FMF, I personally feel they work well enough, although I'm also not taking screenshots of both side by side and then analyzing the snot out of it trying to find this, that, and the other artifact. Both look fine to me, typically, but both have limitations. Frame generation introduces input latency, for example.
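
A rough illustration of that last point, using a simplified model with made-up numbers of my own (not measurements from any specific game or GPU): frame generation interpolates between real frames, so the displayed frame rate roughly doubles while input is still sampled at the underlying render rate, plus roughly one real frame of buffering.

Code:
# Simplified model (my own made-up numbers) of why frame generation raises
# displayed FPS without improving input latency.
render_fps = 60
render_frame_ms = 1000 / render_fps       # ~16.7 ms per "real" rendered frame

displayed_fps = render_fps * 2            # one interpolated frame between real ones
added_buffer_ms = render_frame_ms         # roughly a frame held back to interpolate

print(f"Displayed frame rate: {displayed_fps} fps")
print(f"Input still sampled at ~{render_fps} Hz, plus ~{added_buffer_ms:.1f} ms "
      f"of extra latency from buffering")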
 
LukeTbk I am referring to rendering native. Any upscaling tech (DLSS or FSR) has its downsides compared to native rendering (latency, loss of fidelity, etc.), and personally, I spend a lot of money to have things look pretty, not to get a slightly higher framerate that doesn't look as good.

Basically, the 7900 series is using a fundamentally new architecture design. It's an MCM GPU. This will be how things are done in the future, as it simply is not feasible to keep making giant monolithic designs.

But because we are at the beginning of this, the 7900 series in particular is going to receive the biggest boost over time as the driver team learns how to better optimize for this new architecture.
 
am referring to rendering native
Ah, right. Even if the idea that AMD drivers are inferior, and therefore have more room to grow, may be less true than in the past, it is still a little bit true.

not to get a slightly higher framerate that doesn't look as good.
That tends to be how any option for a higher framerate works; DLSS is not special in that regard, outside the fact that it gives a lot more frame rate. I am not so sure it will look worse if you choose a very moderate framerate bump; it will probably be case by case. Compare 62 fps DLSS Quality image quality vs 60 fps native rendering image quality (this is the reason why almost every console game chooses upscaling instead of native: they subjectively find it looks better that way). I would imagine in a lot of cases the DLSS image will look better; we tend to only look at examples where people choose a giant frame rate jump over native.
 
That tends to be how any option for a higher framerate works; DLSS is not special in that regard, outside the fact that it gives a lot more frame rate. I am not so sure it will look worse if you choose a very moderate framerate bump; it will probably be case by case. Compare 62 fps DLSS Quality image quality vs 60 fps native rendering image quality (this is the reason why almost every console game chooses upscaling instead of native: they subjectively find it looks better that way). I would imagine in a lot of cases the DLSS image will look better; we tend to only look at examples where people choose a giant frame rate jump over native.

I'm not getting what you're saying here. If I'm reading this right, I'm thinking you and I have a fundamental difference in understanding of what tech like DLSS and FSR are and why they are used. If you think about it, what DLSS and FSR do is 'make up' the missing pixels. Say you're doing 4K image output: when you render native 4K, you're rendering ~8.3 million pixels every frame, 4 times the number of pixels at 1080p. In a very simple example of how these techs work, they effectively render at 1080p, stick that on the screen, and then the 'magic' upscaling tech fills in the missing pixels. DLSS and FSR do this in different ways, but in essence it's 'guessing' what those pixels are without rendering them. It's an educated guess. This is a very simplified example, as different versions of DLSS and FSR render more or fewer pixels and guess at more or fewer pixels, but it's still a good explanation of how the tech works.

This is how it increases frame rate: using this example, you're having to render 1/4 the number of pixels of native 4K, but you're outputting a 4K picture. Therefore, in theory, you can render up to 4x as many frames in a given amount of time (obviously not quite that many, due to the overhead of FSR and DLSS). But you get the idea; see the quick calculation below.
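
A back-of-envelope sketch of those numbers (my own illustration, not anything from the DLSS or FSR documentation):

Code:
# Pixel counts per frame at common resolutions, and the theoretical
# (overhead-free) upper bound on the speedup from rendering at a lower
# internal resolution and upscaling to the output resolution.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name: str) -> int:
    width, height = RESOLUTIONS[name]
    return width * height

internal, output = "1080p", "4K"
ratio = pixel_count(output) / pixel_count(internal)   # = 4.0

print(f"{output}: {pixel_count(output):,} pixels per frame")       # 8,294,400 (~8.3M)
print(f"{internal}: {pixel_count(internal):,} pixels per frame")   # 2,073,600 (~2.1M)
print(f"Theoretical max speedup from upscaling: {ratio:.1f}x")

In practice the uplift is well under that 4x, both because the upscale pass itself costs time and because a lot of per-frame work (game logic, geometry) doesn't scale with pixel count.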

The reasons that consoles use upscaling are different from the reasons we would have in the PC world. On consoles, they are prioritizing a consistent frame rate (e.g., locked 30 or 60 fps) at a given resolution WITHIN a specific thermal and performance limit. That limit is the point. Supporting 4K with the power of the existing consoles necessitates the use of technology like DLSS and FSR in order to maintain that consistent framerate; otherwise, the raw power of the consoles is simply not enough to maintain that rate and customers would have a sub-par experience.
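
To illustrate the "hold the frame rate, flex the pixels" approach consoles commonly take (often alongside FSR-style upscaling), here's a toy sketch of a dynamic resolution scaling control loop; the thresholds, names, and numbers are made up for illustration, not taken from any real engine:

Code:
# Toy sketch (made-up thresholds) of dynamic resolution scaling: keep a locked
# 60 fps target by adjusting how many pixels are rendered each frame, then
# upscale to the fixed output resolution.
TARGET_FRAME_MS = 1000.0 / 60.0          # 16.7 ms budget for a locked 60 fps

def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the internal render scale up or down based on the last frame time."""
    if last_frame_ms > TARGET_FRAME_MS * 0.95:     # close to blowing the budget
        scale -= 0.05                              # render fewer pixels next frame
    elif last_frame_ms < TARGET_FRAME_MS * 0.75:   # plenty of headroom
        scale += 0.05                              # claw back some image quality
    return max(0.5, min(1.0, scale))               # clamp to 50-100% of output res

# Example: a heavy 18 ms frame pushes the scale down from 0.80 to 0.75.
print(adjust_render_scale(0.80, 18.0))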

There is no situation where an image that uses an upscaling tech will have more detail and be better than rendering native. That is not possible.
 
There is no situation where an image that uses an upscaling tech will have more detail and be better than rendering native. That is not possible.
That's not what's interesting; the question is whether there are situations where a 60 fps DLSS render looks better than a 60 fps native render. There are even cases where, if you do not try to reach 90 fps from 60 fps with DLSS but just reach 70, DLSS at 70 will look better than native at 60 to some people.

To reach 60 fps at the native resolution, a list of compromises will be needed.

As for the claim: yes, if the upscaler has knowledge of what the renderer is trying to do, it can have more detail. An easy example: imagine I have a Word document, I print it, and I scan it; it would be possible for a process to give me back the original quality by detecting the font, the letters, etc. It is easy to imagine (and we can see the same occur) text in the 3D render being better in the DLSS render than in the native one, because it detects that it is text. The same is easy to imagine for brick walls, power lines, and other common objects where a reconstruction can beat native, and as the model's training knowledge of the world gets bigger, the number of such cases can grow. But that was not what I was talking about.

at a given resolution WITHIN a specific thermal and performance limit.
PCs have always ended up at a given output resolution since the end of the CRT, and they have always had a performance limit; there is not really a difference.
 
LukeTbk To me, what you're describing is not upscaling tech; it's AI enhancement of existing images. To my understanding, upscaling is a way to render a frame at a lower resolution and 'upscale' it to display natively at a higher resolution. It does this by using various techniques to 'fill in the details' that are missed by not rendering at the higher resolution. The higher-resolution image is still the native image, the best-quality image, as it is the natively rendered source.

Now, using AI enhancement on images is a separate conversation. That's specifically getting into the idea that, for a given rendered resolution, more detail can be added than was originally in the native rendering, say enhancing the texture detail through AI from a 1K texture to a 4K texture. That is not upscaling per se with respect to resolution; that's using AI to enhance the game itself and improve the quality regardless of how it is rendered.

I'm not aware of DLSS or FSR doing the latter; if so, can you point me to a whitepaper on it? I'd love to have a read if someone is working on that tech for real-time image enhancement.

As for consoles, my point was that they are more limited while the PC is not. Specifically, a PC can pull up to 1440W on a 15-amp circuit; a console is rarely going to draw what, more than 200W, maybe 300W?
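
For anyone wondering where that 1440W figure comes from, it's presumably a North American 120V/15A circuit with the usual 80% continuous-load derating:

Code:
# Presumed math behind the ~1440 W figure: a 120 V / 15 A circuit,
# derated to 80% for continuous loads.
volts, amps, continuous_factor = 120, 15, 0.8
print(volts * amps)                      # 1800 W absolute circuit limit
print(volts * amps * continuous_factor)  # 1440 W safe continuous draw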

Such a drastic difference in available power necessarily limits processing power, which means console optimization has a very different prioritization structure.
 
To me, what you're describing is not upscaling tech; it's AI enhancement of existing images
That is what AI upscalers try to do: they train on a large dataset of very high quality (16K) images, compare them against what the real-time, in-game-like low-resolution versions look like, and learn how to get from the low-resolution game imagery to the very high 16K detail.

So sometimes it can "know" that the 3D engine was trying to render a stair step, a brick wall, or a power line in the distance, but the many compromises (some 24-bit float rounding limitation, etc.) made the wire look stepped or disappear, and it reconstructs it by knowing what it should look like.
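
To make that training idea concrete, here is a minimal, conceptual sketch of how a super-resolution model can be trained on paired low/high-resolution patches. This is a toy of my own, not the actual DLSS/XeSS pipeline; the network, data, and hyperparameters are all made up for illustration.

Code:
# Conceptual sketch: learn a mapping from low-resolution renders to their
# high-resolution "ground truth" versions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy 2x super-resolution network: a few convolutions plus a pixel shuffle."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
            nn.PixelShuffle(2),                        # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in training data: random "ground truth" patches and their downscaled
# counterparts. A real upscaler trains on huge sets of pristine high-res renders.
for step in range(100):
    high_res = torch.rand(8, 3, 128, 128)                   # pretend these are pristine crops
    low_res = F.interpolate(high_res, scale_factor=0.5,     # simulated low-res in-game render
                            mode="bilinear", align_corners=False)
    prediction = model(low_res)                              # network's reconstructed high-res
    loss = F.l1_loss(prediction, high_res)                   # penalize missing/incorrect detail
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()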

Now, using AI enhancement on images is a separate conversation.
Upscaling is more and more linked to it; DLSS, XeSS, and the upcoming Sony system are all based on it.

Such a drastic difference in available power necessarily limits processing power, which means console optimization has a very different prioritization structure.
I think one big difference is that they know in advance what hardware they have. Probably, right now, for most games an upscaled 60 fps almost always looks better (especially at console sitting distance) than what they can achieve at native 4K 60 when they optimise their image. On PC it is harder to optimise that way, but I imagine many games look better at upscaled 4K 60 on an RTX 3070 (say, that last Ubisoft Avatar game, or Cyberpunk) than at the settings you would need to choose to be able to run them at native 4K 60.
 
LukeTbk OK, I see where you're coming from now. Sadly though, no, upscalers don't work that way. You don't get a 'better' image by rendering at a lower resolution and then using an upscaler to fill in the rest. They are not actually improving the textures, they are just filling in pixels lost due to the upscaling. For examples of some of the latest improvements, I'd recommend the latest video by Hardware Unboxed:


https://youtu.be/mF_rlgyvuBg?si=IyhaQUELEambzcl2
 
They are not actually improving the textures, they are just filling in pixels lost due to the upscaling
For FSR, yes; not for DLSS and XeSS.

And we are having two different conversations: can DLSS improve image quality over native, versus, at the same FPS, can a DLSS-upscaled image look better than the native-resolution one? For a lot of game/platform/target-resolution combinations, the answer to the latter can be yes (and that's why almost all console games do it).
 