RTX is a joke


erek

RTX is just a gimmick, long live rasterization or Real Ray/Path Tracing

"In the review-in-progress, I noted that Atomic Heart looks good and performs extremely well. It is possible that Mundfish decided to pull ray tracing out of the final build because the performance would have suffered for most PC users (those without access to the RTX 40 GPUs, for instance), but this contradicts earlier statements where we were told that ray tracing performance was already good years ago. The reviewer guide provided by NVIDIA also mentions an 'RT Ultra' preset that doesn't exist in the build I checked out, again pointing to a recent decision to remove the ray traced features from the game.

At any rate, there's no doubt communication about its availability at launch should have been clearer and its timing more appropriate than it has been from both Mundfish and NVIDIA. It's also the first time something like this happened, as far as memory goes; hopefully, it doesn't become a trend."



Atomic Heart Ditches Ray Tracing on PC at the Very Last Minute

 
how is it a gimmick?...there are a lot of RT games available...yes it's annoying that they heavily marketed Atomic Heart around RT and didn't deliver it at launch but it's almost the norm nowadays- Doom Eternal RT support came 10+ months after the game launched, Elden Ring is coming on 1 year and still no RT which they promised would be coming post-release and most recently A Plague Tale: Requiem launched with no RT support even though it was also heavily promoted prior to release (an RT patch arrived 3 months later)

which is why you should never pre-order games...wait until at least Day 2 before buying after reading reviews...no pre-order bonuses are worth it
 
I know it's probably the worst example to use, but in World of Warcraft it almost looks better without it. I turn it on, and I can't really tell what has changed. I can see something altering, but I almost prefer it off.

It's weird.
 
how is it a gimmick?

It's a little bit of a scam if you ask me.

Developers don't need RT to make visually stunning games. A well designed raster game can look every bit as good as the best RT title.

But RT has become a marketing need due to the hype. As we learned in this story, people will complain when they don't get RT, even if it isn't necessary to make a game look good. Sometimes (especially in the RTX 2000 days) there was also pressure and/or coercion from Nvidia.

There is also an argument to be made that it might be easier to develop levels etc. with RT than it is to create light maps etc. like you have to in raster. Even so, we are talking about something that helps Nvidia or the dev, not something that benefits the user.

(This is also more of a future state, as in today's games you can't just make them 100% RT, so you are still going to need light maps, etc.)

Once the developers use RT in the game - however - GPU requirements go through the roof and Nvidia profits, as kids feel they NEED an RTX GPU to get the most out of their games.

So yeah, it's a bit of Nvidia RDF utilized in full swing to sell kids thousand-dollar GPUs.
 
It's a little bit of a scam if you ask me.

Developers don't need RT to make visually stunning games. A well designed raster game can look every bit as good as the best RT title

I agree that games can look great without RT...but RT can push visuals to another level...take a look at Metro Exodus: Enhanced Edition, Cyberpunk 2077, Dying Light 2
 
Yeah, it depends on the game, but I love the feature overall. CP2077 looks amazing, Portal RTX (real path tracing) looks amazing, even BF2042 with RTAO looks way better than without it (IMO anyway), Control was another amazing RT game... and those are just the ones I can think of off the top of my head that I have played.

I don't think it should be removed from games because "many people will suffer using it". Screw them, turn it off then, some of us are graphics whores and buy the best to see the best! It's not very Personal Computer of them to simply remove a feature because many can't use it, it is more console of them.
 
It joins A Plague Tale on the list of games with a lot of Nvidia RTX marketing and no RT at launch.
 
how is it a gimmick?...there are a lot of RT games available...yes it's annoying that they heavily marketed Atomic Heart around RT and didn't deliver it at launch but it's almost the norm nowadays- Doom Eternal RT support came 10+ months after the game launched, Elden Ring is coming on 1 year and still no RT which they promised would be coming post-release and most recently A Plague Tale: Requiem launched with no RT support even though it was also heavily promoted prior to release (an RT patch arrived 3 months later)

which is why you should never pre-order games...wait until at least Day 2 before buying after reading reviews...no pre-order bonuses are worth it
Look at the source. WCCF Tech thrives on clickbait.
 
I agree that games can look great without RT...but RT can push visuals to another level...take a look at Metro Exodus: Enhanced Edition, Cyberpunk 2077, Dying Light 2

I have played all of those.

I played Cyberpunk back when I had a 6900 XT (which turned RT modes into a 20 fps slide show), I played Metro Exodus on my old Pascal Titan, and I am playing through Dying Light now.

I now have a 4090, and I have gone back and played the titles with and without RT, and honestly, the differences are marginal as all hell. I wouldn't even be able to tell RT is on unless I compared them side by side.

To me - for instance - Cyberpunk just looked slightly less dark with RT enabled. The reflections and lighting didn't really appear very different outside of that.

And that marginal improvement came at a HUGE performance hit.

And I'd even argue that you could accomplish that small improvement in visuals using raster graphics as well, but they just don't, because they want to give you a reason to turn RT on.

It's more like raster modes are being gimped to make RT look better, than anything else, and even then the difference is minor.
 
And that marginal improvement came at a HUGE performance hit.

And I'd even argue that you could accomplish that small improvement in visuals using raster graphics as well, but they just don't, because they want to give you a reason to turn RT on.

It's more like raster modes are being gimped to make RT look better, than anything else, and even then the difference is minor.

every new graphics technology lowers performance- tessellation, AA etc...but as GPUs get more powerful and developers implement them better then everyone benefits...are we not supposed to make advancements in graphics?...should we keep using older technology?...plus RT is like most other graphics settings- it can be disabled
 
I agree that games can look great without RT...but RT can push visuals to another level...take a look at Metro Exodus: Enhanced Edition, Cyberpunk 2077, Dying Light 2
Also, it makes things easier to look good, and time matters. While it is easy to just say "Oh, they should just spend more time making the game look good," what you are REALLY saying is they need to spend more MONEY making it look good, and games are already getting pretty damn expensive. If technology can save artist and dev time, then it's something that could be quite useful. Good examples are lighting, shadows, and reflections. These are things that require separate work with rasterization if you want to have them and have them look good; they don't inherently exist. With RT they do. Put an object and a light in a scene, and the light will cast shadows. If the material is reflective, there will be reflections. It just happens as part of the algorithm and looks right. Makes it much easier for designers.
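To make that concrete, here's a toy sketch in Python (purely illustrative, nothing like a real engine's code, and the scene values are made up): a shadow isn't something an artist authors, it's just the answer to "does anything block the line from this surface point to the light?", and a reflection is just one more ray fired from the hit point.

```python
import math

# Toy scene: one sphere floating above a floor, one point light overhead.
# (Hypothetical values; a real renderer does this per pixel, per light.)
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 2.0, 0.0), 1.0
LIGHT_POS = (0.0, 5.0, 0.0)

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def normalize(a):
    length = math.sqrt(dot(a, a))
    return (a[0] / length, a[1] / length, a[2] / length)

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for the nearest t > 0.
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return False
    return (-b - math.sqrt(disc)) / 2.0 > 1e-4

# Shadow test for two points on the floor: cast a ray toward the light and
# see if the sphere is in the way. No shadow maps, no baked lightmaps.
for point in [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]:
    to_light = normalize(sub(LIGHT_POS, point))
    blocked = ray_hits_sphere(point, to_light, SPHERE_CENTER, SPHERE_RADIUS)
    print(point, "-> in shadow" if blocked else "-> lit")

# Reflections work the same way: fire one more ray from the hit point in the
# mirror direction and shade whatever it hits.
```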

There's a reason why Hollywood uses various RT algorithms, and not rasterization, to render movies, even non-realistic ones like Wall-E, and it isn't because they love spending lots of time and money on render farms.

Also as you say it can really elevate a good game. Control looks good with just rasterization but it looks great with RT. Same with RE: Village. There are effects that we just don't have a good way of doing with rasterization. Really good reflections for example. Both of those have reflections, shiny and diffuse, in the rasterization only mode and they are good, but they just don't look as good or as real as the RT reflections.

As for the title: it is extremely silly clickbait. nVidia's RTX does legit speed up raytracing a lot. The RT cores accelerate the BVH traversal and ray/triangle intersection tests, and the tensor cores help with the denoising and upscaling. It is legit new hardware that helps make ray tracing faster and easier to do in realtime.
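For a rough sense of what that dedicated hardware is replacing, here's a toy version of the axis-aligned bounding-box "slab" test that BVH traversal runs enormous numbers of times per frame (illustrative Python only; the example ray and box are made up, and the hardware obviously doesn't execute anything like this).

```python
def ray_hits_aabb(origin, direction, box_min, box_max):
    """Slab test: does a ray hit an axis-aligned bounding box?

    This little test (plus ray/triangle intersection) is the inner loop of BVH
    traversal, i.e. the work RT cores move off the shader cores and into
    dedicated hardware. Assumes non-zero direction components for brevity.
    """
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        t1, t2 = (lo - o) / d, (hi - o) / d
        t_near = max(t_near, min(t1, t2))
        t_far = min(t_far, max(t1, t2))
    return t_near <= t_far and t_far >= 0.0

# A ray from the origin, angled slightly off the +Z axis, against a box a
# couple of units in front of it, then against the same box from off to the side:
print(ray_hits_aabb((0.0, 0.0, 0.0), (0.1, 0.1, 1.0),
                    (-1.0, -1.0, 2.0), (1.0, 1.0, 3.0)))   # True
print(ray_hits_aabb((5.0, 0.0, 0.0), (0.1, 0.1, 1.0),
                    (-1.0, -1.0, 2.0), (1.0, 1.0, 3.0)))   # False
```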
 
It's a cool tech and will be the future in some form, but I still believe it's overhyped and still isn't "good" overall. The 7900 XTX launch convinced me that all previous RT was in fact overhyped, because suddenly last-gen RT performance on last-gen games was considered "crap" because AMD was doing it.
 
It's a cool tech and will be the future in some form, but I still believe it's overhyped and still isn't "good" overall. The 7900 XTX launch convinced me that all previous RT was in fact overhyped, because suddenly last-gen RT performance on last-gen games was considered "crap" because AMD was doing it.
Well, to be fair, any old cards are crap if you're an uber enthusiast. :)

I think RTX is awesome, especially with Fortnite. It's bonkers.
 
Well, to be fair, any old cards are crap if you're an uber enthusiast. :)

I think RTX is awesome, especially with Fortnite. It's bonkers.
Serious Fortnite players would not even use RTX as they would want higher framerates. So that's a pretty bad example.
 
Well, to be fair, any old cards are crap if you're an uber enthusiast. :)

I think RTX is awesome, especially with Fortnite. It's bonkers.
I think the most bonkers thing about Fortnite's visuals is not actually that they look great. But...there's a "software" mode, where it runs everything on the shading/compute portion of the GPU, rather than the Ray Tracing portion. Looks as good as anything and it runs equally well on Nvidia and AMD.
 
I think the most bonkers thing about Fortnite's visuals is not actually that they look great. But...there's a "software" mode, where it runs everything on the shading/compute portion of the GPU, rather than the Ray Tracing portion. Looks as good as anything and it runs equally well on Nvidia and AMD.
That's pretty neat. I'm always in favor of hardware-agnostic solutions.
 
Eh. I am old. I just throw tons of CPU and GPU at it and I can still push my 165Hz monitor. It's a beautiful game and definitely a showcase for UE.
Also, not everyone is super competitive, even in competitive games. I know that the hardcore stance is that everyone should care about max FPS, minimum lag, no graphics, etc. However, in reality people play how they like and not everyone cares about being the best. For a lot of people, it probably doesn't matter either. You can go nuts optimizing your setup only to discover it is your own reaction time that is the big issue and you still aren't ranking up.
 
I think the most bonkers thing about Fortnite's visuals is not actually that they look great. But...there's a "software" mode, where it runs everything on the shading/compute portion of the GPU, rather than the Ray Tracing portion. Looks as good as anything and it runs equally well on Nvidia and AMD.
I'd love to see more on this!
 
That's pretty neat. I'm always in favor of hardware-agnostic solutions.

I'd love to see more on this!
Time stamped for recent benchmarks


Here's a comparison. For the "software" mode, all of the lighting and global illumination looks exactly the same. The thing you lose, compared to the "hardware" Ray Tracing mode, is the character reflections. However, Ray Traced reflections are usually one of the "cheaper" RT effects. So theoretically, you could do reflections on the 'hardware' and everything else in "software", and still have good performance, even on AMD.
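Rough back-of-envelope on why reflections are the comparatively cheap thing to move onto the RT hardware (the sample counts below are hypothetical, not measured from Fortnite or any engine): mirror-ish reflections need on the order of one ray per pixel, while brute-force diffuse GI would need many, which is why the GI side gets approximated with probes and screen traces in the "software" path.

```python
# Illustrative ray budgets at 1440p (hypothetical sample counts, not real engine data).
width, height = 2560, 1440
pixels = width * height

reflection_rays = pixels * 1        # ~1 reflection ray per pixel for mirror-ish surfaces
gi_rays_bruteforce = pixels * 64    # e.g. 64 diffuse bounce samples per pixel

print(f"reflection rays per frame : {reflection_rays:,}")
print(f"brute-force GI rays/frame : {gi_rays_bruteforce:,}")
print(f"GI needs ~{gi_rays_bruteforce // reflection_rays}x more rays, hence probes/screen traces instead")
```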

 
every new graphics technology lowers performance- tessellation, AA etc...but as GPUs get more powerful and developers implement them better then everyone benefits...are we not supposed to make advancements in graphics?...should we keep using older technology?...plus RT is like most other graphics settings- it can be disabled

I'm all for advancement, but as I've said previously, thus far RT hasn't really done much in the way of advancement.

You have to practically go side by side pixel-peeping to notice the difference in most cases. Is that worth a 60-80% performance hit?

And in most cases I'm not sure RT would even be needed in order to achieve the small differences we are seeing. It could be done without RT at a lower performance hit, so they actually wind up just gimping the raster version to make the RT version look better.

And that's exactly the problem with turning it off. Sure you can, but then you are not getting the very best raster can deliver. You are getting the raster version that was gimped to make the RT version look better.

I got frustrated with this, and it was one of the reasons I bought a 4090. Not because I feel that the industry really needs RT, but because RT is being forced on me and unless I want a gimped experience, I need hardware that allows me to enable it.

As a kid I'd play games for hours on my old 286 which I only got in 1991, the year the 486 launched.

These days, I have way too little downtime to enjoy games, so when I do get the time, I don't want to deal with gimped nonsense.
 
I'm all for advancement, but as I've said previously, thus far RT hasn't really done much in the way of advancement.

You have to practically go side by side pixel-peeping to notice the difference in most cases. Is that worth a 60-80% performance hit?

And in most cases I'm not sure RT would even be needed in order to achieve the small differences we are seeing. It could be done without RT at a lower performance hit, so they actually wind up just gimping the raster version to make the RT version look better.

DLSS and FSR were created specifically to lower that performance hit...RT is still in its early stages...in a few years we won't even need DLSS
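Quick arithmetic on why the upscalers take so much of the edge off: ray and shading cost scales roughly with the number of internally rendered pixels, so rendering at 1440p and upscaling to 4K traces well under half the rays (illustrative only; the actual internal resolution depends on the DLSS/FSR quality mode).

```python
# Rendered-pixel ratio when upscaling 1440p -> 4K (illustrative quality-mode numbers).
native_4k = 3840 * 2160
internal = 2560 * 1440
print(f"internal pixels: {internal / native_4k:.0%} of native 4K")  # ~44%
```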

as far as visual differences...I do notice...reflections in particular look much better with RT...global illumination as well completely changes the light rendering...yes you can say that you need to look at side-by-side comparisons to notice but you can say the same thing with a lot of advanced graphics settings
 
They need to use ray tracing (and other new technologies) to enhance gameplay instead of just enhancing visuals.

If you gain an advantage, say, like if you could better tell that an enemy was around a corner because of a realistic shadow or reflection from a puddle, then it would be worth it.
 
You have to practically go side by side pixel-peeping to notice the difference in most cases. Is that worth a 60-80% performance hit?
This has been my experience as well and I haven't even enabled RT effects since I got a card which is capable of some RT. I've looked at the reviews and the pictures and even some videos including side by side comparisons and I either can't tell a difference or the difference is so small that I couldn't care less. In some cases I'd even say the RT effects look worse.

I have yet to see any instance where RT is some sort of must have. I'm sure it will be useful eventually but we have yet to get near that point and I think when we do people are going to find the visual difference is little to nothing. The advantage of using it will be on the developer side, not the consumer side.
 
Here's a comparison. For the "software" mode, all of the lighting and Global illumination, looks exactly the same. The thing you lose, compared to the "hardware" Ray Tracing mode, are the character reflections. However, Ray Traced reflections are usually one of the "cheaper" RT effects. So theoretically, you could do reflections on the 'hardware' and everything else "sofware", and still have good performance, even on AMD.
You lose shadows for some dynamic objects, shadow LOD is worse (it looks like they pop in more), and light bleeding is more of an issue. It is true you do not lose that much; the performance cost to go from software to hardware Lumen seems to be quite small.

Metro Exodus RT edition runs about 50% faster on an AMD card than Fortnite Lumen does.

They need to use ray tracing (and other new technologies) to enhance gameplay instead of just enhancing visuals.

I think for some games, realism itself can enhance the experience without changing anything about the gameplay, say Detroit: Become Human, The Last of Us, anything cinematic.

Otherwise, using the AI power of the card for enemy AI, IK solutions and so on could maybe be more interesting, which is what AMD looks like it will be pushing for.
 
when you do look at side-by-side comparisons with RT On and Off, there can be a huge difference...take a look at the Dying Light 2 video that Digital Foundry put out when the game was first released...I timestamped it...there's an almost night/day difference with RT enabled...

 
It's clear to me that raytracing is the future.

Let's not pretend like the world ends in 2024.

There will be an RTX 5000 and an RTX 6000, and at that point, do you think raytracing will be considered a mere gimmick? No, it will just be a tried-and-true standard that every game easily implements natively, without even wondering if it's draining resources. An RTX 7000 series is, at the very least, going to happen, and although there's no way to know how powerful it will actually be, it will certainly arrive in a future where raytracing is the default standard, with AMD and Intel having embraced it as well and made raytracing simply the norm in every new game.
 
Time stamped for recent benchmarks


Here's a comparison. For the "software" mode, all of the lighting and global illumination looks exactly the same. The thing you lose, compared to the "hardware" Ray Tracing mode, is the character reflections. However, Ray Traced reflections are usually one of the "cheaper" RT effects. So theoretically, you could do reflections on the 'hardware' and everything else in "software", and still have good performance, even on AMD.


Here is a better comparison
 
Also, not everyone is super competitive, even in competitive games. I know that the hardcore stance is that everyone should care about max FPS, minimum lag, no graphics, etc. However, in reality people play how they like and not everyone cares about being the best. For a lot of people, it probably doesn't matter either. You can go nuts optimizing your setup only to discover it is your own reaction time that is the big issue and you still aren't ranking up.
Too true... I play COD:MW2 on max settings, full AA at 4K... granted my 4090 runs that at 144 FPS now, but even on my 3090, I did not care and would play with 110 FPS or so, didn't matter to me. I'm all about the visuals and still do perfectly fine in MP games.
 
I wonder how the gaming world survived without RT and its streamers/fanbois pushing and pushing ad nauseam $1000-$3000 graphics cards?
I wonder how the gaming world survived without cards with shaders? I wonder how the gaming world survived without 3D cards? I wonder how the gaming world survived without SVGA?

The answer is, with less graphical fidelity. I don't understand why gamers, particularly on a site called [H]ardforum, are opposed to progress. The idea behind RT is to push gaming graphics to a new level, why is that a bad thing?

Like all technologies, it is difficult at first and of limited use. Shaders were the same way. When the GeForce 3 came out we finally got cards with programmable pipelines (sort of, not fully). However, they were pretty weak. You could use them in a game, but it had to be done sparingly. Maybe for some shiny water, or Dot3 bump mapping on a bumpy surface. But you know what happened? Each generation they improved, the cards got more powerful, and what you could do with shaders got better. As time went on, it got to the point where you couldn't find a game not using them. Even graphically simpler games used them, because they're useful and the hardware is powerful and present even on low-end cards.
 
Ray tracing can, if well implemented, make things look slightly more realistic. However, while the physics and math behind it are amazing, the apparent visual effect is often at best about as noticeable and gimmicky as what TressFX did for hair in some games. Ray tracing is nowhere near as transformative as high-end rasterization, tessellation, or even older anti-aliasing technology like MSAA. And even the performance hit from running 8xAA is often a pittance compared to the penalty imposed by ray tracing, especially when Nvidia is using the lighting tech as an excuse to push people onto the newest, most expensive hardware.

Just look at how the remake of Portal with ray tracing, as brought to you by Nvidia, crawls on anything less than a 4090. PORTAL!
 
It's clear to me that raytracing is the future.

Let's not pretend like the world ends in 2024.

There will be an RTX 5000 and an RTX 6000, and at that point, do you think raytracing will be considered a mere gimmick? No, it will just be a tried-and-true standard that every game easily implements natively, without even wondering if it's draining resources. An RTX 7000 series is, at the very least, going to happen, and although there's no way to know how powerful it will actually be, it will certainly arrive in a future where raytracing is the default standard, with AMD and Intel having embraced it as well and made raytracing simply the norm in every new game.
There is no future for raytracing until low and midrange cards can run it without unacceptable performance losses and that's not happening anytime soon. Two generations at the very minimum before that might happen and I'd say closer to three more generations.

If you don't believe me, look at the current landscape. People go nuts over the 4090 despite the price. They call it the future of RT. The very same people were saying the same thing about the nVidia 3000 series. The Radeon RX 7000 series has the same RT performance as the nVidia 3000 series, but it's somehow complete crap since it's AMD.

How long do you think it's going to take until 4090 RT performance makes it to the low end? With the amount of hardware the 4090 throws at RT it would be close to never factoring in the costs of silicon.

Until the low to midrange can run something acceptably, universal adoption of new graphics technology doesn't happen because most companies aren't stupid enough to waste money on something where the market is only a very small percentage of the normal market. That's a great way to make sure your product doesn't sell. It's important to note that the low to midrange cards capable of RT don't currently exist. That's also the market that is slowest to upgrade and holds onto cards the longest. Not only do the cards need to exist (which they currently don't) they also need to be on the market for years before enough of them become ubiquitous.

Thinking RT performance can be solved just like previous graphics technologies is a pipe dream. Practically all the graphics technologies we find normal today were simple to integrate into the current raster pipeline. RT is outside of that pipeline and currently requires separate, dedicated hardware. It's a completely different scenario.
 
There is no future for raytracing until low and midrange cards can run it without unacceptable performance losses and that's not happening anytime soon. Two generations at the very minimum before that might happen and I'd say closer to three more generations.

If you don't believe me, look at the current landscape. People go nuts over the 4090 despite the price. They call it the future of RT. The very same people were saying the same thing about the nVidia 3000 series. The Radeon RX 7000 series has the same RT performance as the nVidia 3000 series, but it's somehow complete crap since it's AMD.

How long do you think it's going to take until 4090 RT performance makes it to the low end? With the amount of hardware the 4090 throws at RT it would be close to never factoring in the costs of silicon.

Until the low to midrange can run something acceptably, universal adoption of new graphics technology doesn't happen because most companies aren't stupid enough to waste money on something where the market is only a very small percentage of the normal market. That's a great way to make sure your product doesn't sell. It's important to note that the low to midrange cards capable of RT don't currently exist. That's also the market that is slowest to upgrade and holds onto cards the longest. Not only do the cards need to exist (which they currently don't) they also need to be on the market for years before enough of them become ubiquitous.

Thinking RT performance can be solved just like previous graphics technologies is a pipe dream. Practically all the graphics technologies we find normal today were simple to integrate into the current raster pipeline. RT is outside of that pipeline and currently requires separate, dedicated hardware. It's a completely different scenario.

I tend to believe this.

Until this happens we won't get real full scene RT. Just a raster base with a sprinkling of RT on top.

Once RT becomes usable in the entire product lineup, you might start seeing devs ditching raster altogether for a whole-scene RT render.

When this happens we will likely see some of the real visual benefits of RT.

I suspect it will also be easier on level designers, who would then only have to set light sources and models and let the RT do the work. Less work with light maps and that sort of thing.

When this happens, full scene RT will have a much higher GPU requirement than we have today. Even the 4090 would likely struggle with it, which means it will take even longer until it trickles down to low to medium range cards.
 