AMD Could Tease DLSS 3-rivaling FSR 3.0 at GDC 2023

"Tough shit" ≠ holy grail

Still a holy grail, even if you have to sit on it and spin 🤷‍♂️

So you're arguing ray tracing isn't and hasn't been a holy grail of computer graphics for over 50 years, because you or your APU can't run it well?

Come back with an argument that makes sense at least please.
 
People still use Intel 4th and 6th gen CPUs because CPUs don't age as badly as GPUs. You could pick up an R7 7700G and use that as a basis, so when you do want better graphics performance you can just add a faster GPU. To justify the price, AMD would need decently fast GPU performance, because traditionally their G products are slower than the X products. And if the GPU performance is bad enough that you need to buy a discrete GPU anyway, why go for a G product at all, other than because you can't currently afford a card?
Except if you go back that far there's no Smart Access Memory and no ReBAR. Use any of the entry-level cards with that off and performance tanks. So it's not really a fun option.
 
It is and has been whether you want to call it one or not lol

It's computationally expensive. Holy shit, stop the presses. So is 3D over 2D. Ray tracing is simulating photons; I'd be surprised if it wasn't computationally expensive. You don't actually make a point here.

That's why we have upscaling, whether you like that too or not.

Holy grail, now that's funny. If you want to simulate photons, then you'd better get a Holodeck, and that to me is the Holy Grail.
 
So why does that mean we should drop 'hackish' ray tracing for raster?

How do you propose we learn to run and get to the promised land of real-time full ray tracing/path tracing without first walking with today's crutched, hacked-together real-time ray tracing? How do we get to the promised land if we just use raster?

Why argue for technological hindrance?
The RT model can be effective for lighting calculations, or at least somewhat closer to natural lighting. Real lighting, the Mother Nature type, has a virtually infinite number of photons scattering around, being absorbed, being emitted. The gist is that real-time RT (laughable at this stage) is a gross, not-even-close-to-accurate approximation of lighting; it can be better than the DX models but it also has issues. The RT used in movies like Avatar can take hours per frame on over 3,000 CPUs; sorry, the 4090 does not have the memory or the processing power for that at this stage.

Now, ray tracing has been used in games for years, decades: baking the lighting into texture maps with way more rays per pixel than current-generation hardware can do in real time. Way more accurate light maps to light a scene. The catch is that this can be very time consuming, though very performant once baked. Plus it's not as dynamic, needing tricks for changes in the scene.
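To make that ray-budget gap concrete, here's a rough toy sketch (entirely my own, the cosine-weighted sampler and the fake "sky" are assumptions, not from any engine) of how an offline baker averages many rays into a single lightmap texel, a per-texel budget real-time RT can't touch per pixel:

Code:
import math
import random

def sample_hemisphere():
    # Cosine-weighted direction in the hemisphere around a +Z normal.
    u1, u2 = random.random(), random.random()
    r = math.sqrt(u1)
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def sky_radiance(direction):
    # Toy "scene": brighter looking straight up, darker toward the horizon.
    return max(direction[2], 0.0)

def bake_texel(rays):
    # Average incoming radiance over many hemisphere samples, like an offline baker.
    return sum(sky_radiance(sample_hemisphere()) for _ in range(rays)) / rays

if __name__ == "__main__":
    for n in (4, 64, 4096):   # few rays = noisy texel, many rays = converged texel
        print(n, round(bake_texel(n), 4))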

Of course I want more RT used, but at this stage it is very limited, even for the 4090. For most games, turning it off you're not missing much but gaining a hell of a lot of fps for smooth, fast gameplay.
 
Of course I want more RT used, but at this stage it is very limited, even for the 4090. For most games, turning it off you're not missing much but gaining a hell of a lot of fps for smooth, fast gameplay.

Right, so keep pumping out more games with it and keep it coming on cards gen after gen, so it can improve one gen at a time, game after game, and get us there 👍

Can't get to that part without doing this part first

Gotta learn to walk before you can run

Can't get to where you want if we all just drop ray tracing and never use it, it comes on no more cards, and no games come out with it anymore lol
 
Right, so keep pumping out more games with it and keep it coming on cards gen after gen, so it can improve one gen at a time, game after game, and get us there 👍

Can't get to that part without doing this part first

Gotta learn to walk before you can run

Can't get to where you want if we all just drop ray tracing and never use it, it comes on no more cards, and no games come out with it anymore lol
Well, if AI can come up with approximate lighting with less processing needed, yep, keep it rolling along. RT is a brute-force method; be prepared to pay the piper for it. Nvidia and others are constantly experimenting and pushing ahead. See what happens.
 
Well, if AI can come up with approximate lighting with less processing needed, yep, keep it rolling along. RT is a brute-force method; be prepared to pay the piper for it. Nvidia and others are constantly experimenting and pushing ahead. See what happens.

I dunno if you saw the DF/Nvidia interview video where they talked about all that a while back, but they said AI volumetrics/particles (fog, smoke grenade explosions/clouds, clouds, sparks, muzzle flash, fire, etc.) are gonna be the next AI-accelerated feature, like ray reconstruction. What I wanna see is: is that stuff gonna get more advanced looking, is it gonna look the same as high/ultra and just offload/free up raster performance for elsewhere, or both? Stuff's changing, in weird and cool ways.

 
Now, ray tracing has been used in games for years, decades: baking the lighting into texture maps with way more rays per pixel than current-generation hardware can do in real time. Way more accurate light maps to light a scene.

What has baked-in lighting got to do with real-time ray tracing? What you are describing is simply classic rasterization. And which game with baked-in lighting has ever reached the level of path tracing + RR in Cyberpunk?

I think you need a little bit more context about ray tracing vs rasterization:

https://qr.ae/pKGOpb
 
Are the ray-traced lights manipulated by developers, and are they real, or again the developer's vision of how they should be traced? :)
And if they are the developer's vision, what's the difference from the raster kind of their vision?
And if we just get more of the same, but not really the same, rather a slightly worse version because of DLSS/FSR upscaling, what's the point of doing it?
 
What has baked-in lighting got to do with real-time ray tracing? What you are describing is simply classic rasterization. And which game with baked-in lighting has ever reached the level of path tracing + RR in Cyberpunk?

I think you need a little bit more context about ray tracing vs rasterization:

https://qr.ae/pKGOpb
Yes, ray tracing has been used for a while in making game content. Real-time does not equate to real good. Just because RT is used in frame-to-frame rendering does not magically make it super-special eye-popping. In many cases it looks off, contrived, low quality, blurry in motion, performs poorly, needs other techniques with their own set of artifacts, etc. In other words, "real-time RT" as a term means crap; in most games it is a hybrid solution, with rasterization doing what it has always done.
 
"Real-time RT" as a term means crap; in most games it is a hybrid solution, with rasterization doing what it has always done.
Agree. RT has been used in movies earlier because the action is predictable.

But dynamic lighting as used in games takes a heavy toll on performance. Ex: hardware Lumen in Unreal Engine games. Many might prefer 4K with baked in lighting over 1440p real time lighting, I guess.
 
Ray tracing is more accurate and correct from a math and physics standpoint than raster (and thus in looks too, except for any errors, which can and do happen with raster lighting and rendering as well). That's why it's been a holy grail, that's why we wanted it, that's why we want it, that's why we use it.

Edit: because it's in a computer and manipulable, you can still have different aesthetics as the creator intended with ray-traced rendering, just like raster, whether going for ultra-realistic looks or Team Fortress 2 looks.
 
In other words, "real-time RT" as a term means crap; in most games it is a hybrid solution, with rasterization doing what it has always done.

For you it means crap. For people with capable hardware it's already a very promising and exciting glimpse into the future. Besides its own flaws, CP on a 4090 with path tracing + RR at 4K DLSS Quality is undeniably the best-looking game right now. I can understand folks with below-4090 hardware though. It looks significantly worse at lower base resolutions. Just like raw TAA image quality looks like a blurry mess at 1080p/1440p in my eyes.

Ex: hardware Lumen in Unreal Engine games. Many might prefer 4K with baked in lighting over 1440p real time lighting, I guess.

If the baked-in lighting is made decent enough, there might be many games where I would prefer it over 1440p real-time lighting too. There is definitely a hard limit for me where I can't accept a too-blurry image anymore. I wonder how Unreal Engine 5 games will evolve. It wasn't a good start so far in terms of the visuals/performance ratio, especially on consoles. As a PS5 user, I think a hardware refresh in the form of Pro models is badly needed for UE5 games / more games with RT in general.
 
Could this thread get any further off the tracks?

Overlapping technologies

Like how discussion of normal maps can be about textures/shaders/shadows/lighting all the same

Ray tracing, upscaling and frame gen are different, yet also related and tied together (at this time, until you don't need upscaling and/or frame gen to get performance with ray tracing; and even without ray tracing at all they can still be used).
 

You mean a 1060, a 6 GB VRAM card. Must be lies, it's only a 6 GB card, and everyone knows you have to have 16 GB or it's trash... (lol).

FSR 3 will be a bit of a win for AMD, I think, since it works on any GPU, at least for old and low-tier cards. Of course it didn't look amazing, or feel really smooth. But if you are a gamer playing on a 6-year-old or low-tier GPU, that's expected. You know you have to turn down settings.
 
For you it means crap. For people with capable hardware it's already a very promising and exciting glimpse into the future. Besides its own flaws, CP on a 4090 with path tracing + RR at 4K DLSS Quality is undeniably the best-looking game right now. I can understand folks with below-4090 hardware though. It looks significantly worse at lower base resolutions. Just like raw TAA image quality looks like a blurry mess at 1080p/1440p in my eyes.



If the baked-in lighting is made decent enough, there might be many games where I would prefer it over 1440p real-time lighting too. There is definitely a hard limit for me where I can't accept a too-blurry image anymore. I wonder how Unreal Engine 5 games will evolve. It wasn't a good start so far in terms of the visuals/performance ratio, especially on consoles. As a PS5 user, I think a hardware refresh in the form of Pro models is badly needed for UE5 games / more games with RT in general.
Well, needing a 4090 to get something useful as in a quality improvement is not good. Hence FSR 3, to get lower-resolution upscaled + frame-generated frames to play smoothly; basically for RT it is blah. Just because it has ray tracing does not automatically mean it is more accurate. The more samples or rays simulated, the more accurate: 10,000 samples and up per pixel for fully realistic RT. The few samples a modern real-time RT GPU does per frame are not accurate, hence the need for denoisers, which basically blend samples, lower the sharpness of textures and fine detail, and can create shimmering.

Adding extra interpolated frames for RT, as in FSR or DLSS frame generation, degrades accuracy; upscaling also degrades accuracy.

Maybe some folks believe just saying the words "real-time RT" somehow magically makes everything better. Real-time RT means it can do RT calculations per frame; it says nothing about accuracy, whether it will look better, perform better, and so on.
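A quick toy to illustrate that point (all numbers below are assumptions I made up, including the 30% chance of a ray reaching the light): pixel noise only falls off roughly as 1/sqrt(samples), which is why a couple of rays per pixel lean so hard on a denoiser while offline renders throw thousands of samples at it:

Code:
import random
import statistics

def noisy_pixel(samples):
    # Pretend each ray has a 30% chance of reaching the light (pure assumption).
    hits = sum(1 for _ in range(samples) if random.random() < 0.3)
    return hits / samples

def pixel_noise(samples, trials=2000):
    # Spread of the pixel estimate across many renders of the same pixel.
    return statistics.stdev(noisy_pixel(samples) for _ in range(trials))

if __name__ == "__main__":
    for spp in (1, 4, 64, 1024):
        print(f"{spp:5d} spp -> noise ~ {pixel_noise(spp):.4f}")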
 
Well, needing a 4090 to get something useful as in a quality improvement is not good. Hence FSR 3, to get lower-resolution upscaled + frame-generated frames to play smoothly; basically for RT it is blah. Just because it has ray tracing does not automatically mean it is more accurate. The more samples or rays simulated, the more accurate: 10,000 samples and up per pixel for fully realistic RT. The few samples a modern real-time RT GPU does per frame are not accurate, hence the need for denoisers, which basically blend samples, lower the sharpness of textures and fine detail, and can create shimmering.

Adding extra interpolated frames for RT, as in FSR or DLSS frame generation, degrades accuracy; upscaling also degrades accuracy.

Maybe some folks believe just saying the words "real-time RT" somehow magically makes everything better. Real-time RT means it can do RT calculations per frame; it says nothing about accuracy, whether it will look better, perform better, and so on.

Right, but be honest and realistic here... is your complaint that they didn't come out of the gate with full-fat path tracing, 100% no hacks or cheats or upscaling or denoising or frame gen needed, at 8K 500 FPS native capability, back in 2018?

How do you get to that level of performance without going through what we have had and have in the RTX 20, RTX 30 and RTX 40 series, and whatever comes in the future, until we get to that point? Honestly and realistically?
 
Right, but be honest and realistic here... is your complaint that they didn't come out of the gate with full-fat path tracing, 100% no hacks or cheats or upscaling or denoising or frame gen needed, at 8K 500 FPS native capability, back in 2018?

How do you get to that level of performance without going through what we have had and have in the RTX 20, RTX 30 and RTX 40 series, and whatever comes in the future, until we get to that point? Honestly and realistically?

I'll chip in!

One of the issues I have with hardware RT is that it is REALLY rigid and can ONLY make rendering slower.

I was hoping that hardware-accelerated ray tracing would mean we could have an intermediary step where a lot of the already RT-like operations (such as 2.5D AO calculations, raymarching volumetrics, etc.) could be written to be accelerated on the dedicated RT hardware to make existing render pipelines run even faster... but you can't. RT can ONLY be slower. It can't make anything run faster unless it's specifically accelerating triangle-BVH ray intersection, which is REALLY REALLY slow, but because dedicated hardware was designed around it, it can be done fast-ish.

Think of Lumen in UE5. It uses software RT and gets it running in real time (slow, but at least frames per second and not seconds per frame) by using a few tricks here and there, but it does it decently, and it's quite amazing that Epic was able to pull this off. Wouldn't it be amazing if you could use dedicated RT hardware to accelerate this and make it run faster? Well, guess what: Epic implemented a hardware RT (DXR) version of Lumen that uses dedicated RT acceleration, and it's slower.

It's technically more accurate and can produce more realistic reflections... but it's slower. The dedicated RT hardware can't accelerate any part of the software Lumen pipeline. That to me is a tragedy. Imagine being able to double the rendering speed of UE5.

This has made it so that DXR/hardware RT is a completely separate universe from literally anything that isn't triangle-BVH intersection. There isn't a way to make the hardware RT make your graphics run faster; there's no way to harness the (legitimately amazing) power of DXR unless you SPECIFICALLY want to do triangle-BVH intersection, which, because it's such a long, inefficient, brute-force process, is absolutely absurd to think of doing without dedicated hardware.

RT as we know it in games can ONLY be slower.
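For anyone wondering what "triangle-BVH intersection" actually boils down to, here's a rough sketch (the triangle and rays below are made up for illustration, this isn't DXR code) of the Möller-Trumbore ray-triangle test; doing this brute force against every triangle for every ray is exactly the kind of work the dedicated units and the BVH exist to cut down:

Code:
def ray_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    # Moller-Trumbore intersection: returns hit distance t, or None on a miss.
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b): return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
    def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    h = cross(direction, e2)
    a = dot(e1, h)
    if abs(a) < eps:                 # ray parallel to the triangle plane
        return None
    f = 1.0 / a
    s = sub(origin, v0)
    u = f * dot(s, h)
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = f * dot(direction, q)
    if v < 0.0 or u + v > 1.0:
        return None
    t = f * dot(e2, q)
    return t if t > eps else None

if __name__ == "__main__":
    tri = ((0, 0, 5), (1, 0, 5), (0, 1, 5))
    print(ray_triangle((0.1, 0.1, 0), (0, 0, 1), *tri))   # ~5.0: hit
    print(ray_triangle((2.0, 2.0, 0), (0, 0, 1), *tri))   # None: miss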
 
I'll chip in!

One of the issues I have with hardware RT is that it is REALLY rigid and can ONLY make rendering slower.

I was hoping that hardware-accelerated ray tracing would mean we could have an intermediary step where a lot of the already RT-like operations (such as 2.5D AO calculations, raymarching volumetrics, etc.) could be written to be accelerated on the dedicated RT hardware to make existing render pipelines run even faster... but you can't. RT can ONLY be slower. It can't make anything run faster unless it's specifically accelerating triangle-BVH ray intersection, which is REALLY REALLY slow, but because dedicated hardware was designed around it, it can be done fast-ish.

Think of Lumen in UE5. It uses software RT and gets it running in real time (slow, but at least frames per second and not seconds per frame) by using a few tricks here and there, but it does it decently, and it's quite amazing that Epic was able to pull this off. Wouldn't it be amazing if you could use dedicated RT hardware to accelerate this and make it run faster? Well, guess what: Epic implemented a hardware RT (DXR) version of Lumen that uses dedicated RT acceleration, and it's slower.

It's technically more accurate and can produce more realistic reflections... but it's slower. The dedicated RT hardware can't accelerate any part of the software Lumen pipeline. That to me is a tragedy. Imagine being able to double the rendering speed of UE5.

This has made it so that DXR/hardware RT is a completely separate universe from literally anything that isn't triangle-BVH intersection. There isn't a way to make the hardware RT make your graphics run faster; there's no way to harness the (legitimately amazing) power of DXR unless you SPECIFICALLY want to do triangle-BVH intersection, which, because it's such a long, inefficient, brute-force process, is absolutely absurd to think of doing without dedicated hardware.

RT as we know it in games can ONLY be slower.


What time frame exactly is/was acceptable? RTX 20 series: 'ok, first generation, understandable', but by the RTX 30 series: 'What?! No 8K 500 FPS path tracing without upscaling or frame gen?! LMFAO FAIL!!!!!!!!!!!!!'?

What were you expecting, by when, and how? Is it 'Don't know/don't care, just what I want!'?

Edit: It's been attempted in computer graphics for over 50 years; it came to real-time gaming for the first time 3 generations ago. Keep that in mind when setting time frames. It's a demanding feature, the Crysis of features. Should it have debuted only once it was as resource-intensive as sprites? How do you achieve that without this stage? I really don't understand how you guys expect what you expect out of thin air.

Do you still worry this much about tessellation since its introduction, for example? Or is it a proprietary thing again with this here too? I'm just trying to imagine the frame of mind/the position of the argument being presented and I can't; that's what I need help with: what is the problem specifically, besides 'slow'? We got that. People played with anti-aliasing off all the time too when it came out.
 
What time frame exactly is/was acceptable? RTX 20 series: 'ok, first generation, understandable', but by the RTX 30 series: 'What?! No 8K 500 FPS path tracing without upscaling or frame gen?! LMFAO FAIL!!!!!!!!!!!!!'?

What were you expecting, by when, and how? Is it 'Don't know/don't care, just what I want!'?
I think my issue is not that the hardware isn't fast enough to do a fully path-traced 4K game at 120 FPS, but rather that the hardware is so rigid in its purpose that it can't be used to make anything faster; it can only make things prettier and slower. And it will always be that way.

Cyberpunk without RT looks better than any game released in 2018 with RT, and some games in the future will look better without RT than Cyberpunk does WITH full path tracing. And those games with RT will run much more slowly than without. DXR will always be a setting you turn off if you want amazing visuals but value smooth gameplay in the newest titles on the newest hardware.

You might think 'well, duh!' but like I said, if RT hardware could be used to make all ray operations faster, it could be used as a net positive to make games prettier AND faster. And right now we have to choose one.

Let's look at another hardware-accelerated feature introduced in our lifetime: hardware tessellation, introduced in DX11. Sure, the first generation of cards struggled a bit to run it smoothly with the indulgent software designed to highlight it and over-use it, but now it's so ubiquitous that it does not make sense NOT to feature it. Existing pipelines and render techniques got a huge boost to performance, and thus allowed for better visuals with enhanced geometry budgets, because it makes a lot of geometrically heavy workloads easier and faster: it was a net positive that, if implemented correctly, could make a game look better and run faster. Both.


But for 3 generations we have had the option of: make the game look 10-20% prettier but run 50-70% slower.

And on top of that, some implementations of DXR make the game look different side by side but not necessarily... better.

I like DXR and I'm happy it's a thing... But I don't think it's at an acceptable performance level and I doubt it really ever will be.
 
Holy grail, now that's funny. If you want to simulate photons, then you'd better get a Holodeck, and that to me is the Holy Grail.
Anytime someone references Star Trek or Star Wars as any form of holy grail tech, I am embarrassed for humanity.
 
And it will always be that way.

As in, you think you'll never be getting more performance than you are today? Even in 5 years/3 more generations/RTX 7000s, for the sake of argument? Or, sure, even if you were doing it at 8,000 FPS native with no upscaling, you could be getting 9,000 FPS if it weren't for ray tracing? The latter could be said about every single graphics feature, no?
 
Anytime someone references Star Trek or Star Wars as any form of holy grail tech, I am embarrassed for humanity.

I am embarrassed for humanity when people think ray tracing is somehow simulating photons, when what it's actually simulating is light rays with a limited number of bounces. But thanks for the post...
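If it helps, here's a deliberately dumb toy (every number in it is an assumption, not from any renderer) of what "a limited number of bounces" means in practice: the path just gets cut off at a hard cap and whatever light would have arrived from deeper bounces is thrown away:

Code:
import random

MAX_BOUNCES = 3   # real-time budgets cap bounce counts very low (3 is my assumption)

def trace(depth=0):
    # Follow a single ray: it either reaches an emitter, bounces again,
    # or gets cut off at the cap with its remaining energy discarded.
    if depth >= MAX_BOUNCES:
        return 0.0
    if random.random() < 0.2:         # fake "hit a light" event (assumption)
        return 1.0
    return 0.7 * trace(depth + 1)     # fake surface albedo of 0.7 (assumption)

if __name__ == "__main__":
    samples = 10_000
    print(sum(trace() for _ in range(samples)) / samples)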
 
I think my issue is not that the hardware isn't fast enough to do a fully path-traced 4K game at 120 FPS, but rather that the hardware is so rigid in its purpose that it can't be used to make anything faster; it can only make things prettier and slower. And it will always be that way.

Cyberpunk without RT looks better than any game released in 2018 with RT, and some games in the future will look better without RT than Cyberpunk does WITH full path tracing. And those games with RT will run much more slowly than without. DXR will always be a setting you turn off if you want amazing visuals but value smooth gameplay in the newest titles on the newest hardware.

You might think 'well, duh!' but like I said, if RT hardware could be used to make all ray operations faster, it could be used as a net positive to make games prettier AND faster. And right now we have to choose one.

Let's look at another hardware-accelerated feature introduced in our lifetime: hardware tessellation, introduced in DX11. Sure, the first generation of cards struggled a bit to run it smoothly with the indulgent software designed to highlight it and over-use it, but now it's so ubiquitous that it does not make sense NOT to feature it. Existing pipelines and render techniques got a huge boost to performance, and thus allowed for better visuals with enhanced geometry budgets, because it makes a lot of geometrically heavy workloads easier and faster: it was a net positive that, if implemented correctly, could make a game look better and run faster. Both.


But for 3 generations we have had the option of: make the game look 10-20% prettier but run 50-70% slower.

And on top of that, some implementations of DXR make the game look different side by side but not necessarily... better.

I like DXR and I'm happy it's a thing... But I don't think it's at an acceptable performance level and I doubt it really ever will be.
I don't understand what you're trying to say. Do you expect ray tracing to make games faster instead of slower?
 
Well, needing a 4090 to get something useful as in a quality improvement is not good. Hence FSR 3, to get lower-resolution upscaled + frame-generated frames to play smoothly; basically for RT it is blah. Just because it has ray tracing does not automatically mean it is more accurate. The more samples or rays simulated, the more accurate: 10,000 samples and up per pixel for fully realistic RT. The few samples a modern real-time RT GPU does per frame are not accurate, hence the need for denoisers, which basically blend samples, lower the sharpness of textures and fine detail, and can create shimmering.

Adding extra interpolated frames for RT, as in FSR or DLSS frame generation, degrades accuracy; upscaling also degrades accuracy.

Maybe some folks believe just saying the words "real-time RT" somehow magically makes everything better. Real-time RT means it can do RT calculations per frame; it says nothing about accuracy, whether it will look better, perform better, and so on.
Ding ding ding. All of this. Just saying RT doesn't mean it's accurate. The number of rays calculated greatly affects accuracy.
 
Ding ding ding. All of this. Just saying RT doesn't mean it's accurate. The number of rays calculated greatly affects accuracy.

We're excited to see what mind-blowing insights you guys will share with us next. Or maybe I missed that today is Captain Obvious day?

Edit: now say raster is more real, I'm sure one of you wants to lol

I'm waiting too. Haha.
 
and can ONLY make rendering slower.
Isn't there a crossover with raster where RT ends up faster as complexity grows, RT having a log2 shape while raster is linear and ends up slower?

Like John Carmack said 10 years ago:

Because ray tracing involves a log2 scale of the number of primitives, while rasterization is linear, it appears that highly complex scenes will render faster with ray tracing, but it turns out that the constant factors are so different that no dataset that fits in memory actually crosses the time order threshold.

Really soon, before the end of this century, it could become the other way around, once games are made with an unlimited number of individual objects with infinite geometry at ridiculous resolutions.
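Back-of-the-envelope version of that argument (every constant below is made up by me, not a measurement): raster cost grows roughly linearly with triangle count, RT grows roughly with log2 per ray, but RT's constant factor is so much larger that the crossover lands far beyond any scene that fits in memory today:

Code:
import math

RASTER_COST_PER_TRI = 1.0        # arbitrary units, pure assumption
RT_COST_PER_BVH_STEP = 2000.0    # cost of one BVH level per ray, pure assumption
RAYS_PER_FRAME = 8_000_000       # roughly one primary ray per 4K pixel, assumption

def raster_cost(tris):
    return RASTER_COST_PER_TRI * tris                                       # linear

def rt_cost(tris):
    return RT_COST_PER_BVH_STEP * RAYS_PER_FRAME * math.log2(max(tris, 2))  # ~log2

if __name__ == "__main__":
    # With these made-up constants the crossover sits in the hundreds of billions
    # of triangles, i.e. past anything that fits in memory.
    for tris in (10**6, 10**9, 10**12, 10**15):
        print(f"{tris:>18,d} tris  raster={raster_cost(tris):10.3g}  rt={rt_cost(tris):10.3g}")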
 
Quite frankly, I was excited to read that Nvidia is pushing for purely AI-generated frames within 10 years and no rasterization. I absolutely love being on the cutting edge of tech and graphics. Path tracing and AI are the future of gaming... holding on to "rasterization" is like holding on to "Glide" from the Voodoo days. It's on its way out as better solutions come along, get developed, and push the envelope further. It's not always going to be pain free, but it's exciting to see and be a part of (for me anyway).
 
Isn't there a crossover with raster where RT ends up faster as complexity grows, RT having a log2 shape while raster is linear and ends up slower?

Like John Carmack said 10 years ago:

Because ray tracing involves a log2 scale of the number of primitives, while rasterization is linear, it appears that highly complex scenes will render faster with ray tracing, but it turns out that the constant factors are so different that no dataset that fits in memory actually crosses the time order threshold.

Really soon, before the end of this century, it could become the other way around.

This will definitely be the case.

Certain scenarios are horrible for raster and aren't as big a deal in ray tracing. Games are just careful to not go there in the first place.

Obviously the end game is to just have artists throw _anything_ in the engine and it works with good performance while being physically correct. RT is the only way that happens.
 