How many of you actually use ray tracing in the games you play?

Do you use Ray Tracing in the games you play?

  • I only play games with Ray Tracing options.

    Votes: 28 (17.5%)
  • I will sometimes enable Ray Tracing in games.

    Votes: 57 (35.6%)
  • I will enable Ray Tracing momentarily to check it out, then turn it off to get better resolution/performance.

    Votes: 30 (18.8%)
  • I prefer not to use Ray Tracing.

    Votes: 30 (18.8%)
  • My GPU doesn’t even support it. 🤷🏻‍♂️

    Votes: 15 (9.4%)

  • Total voters: 160
There were about a half-dozen games worth playing in the past few years since RT became a thing anyway.
"Worth it" is such a subjective notion; what makes it worth it? Is it worth playing with Ultra textures instead of High? There are tons of graphical options in games that make barely noticeable changes. RT is much more significant than those, yet nobody argues that having those options is not worth it.

And even if there was only one game where RT is "worth it", so what? That changes nothing, I'd still turn it on in that one game.
There was only one game at one point with per pixel shading. Did that make Pixel Shaders worthless? Not in the slightest.

When new graphics features are created, there are always a few people mad, either because their HW doesn't support it or because it's too slow on it. So they cope by trying to argue that it is not worth having anyway.

It changes everything. If you don't have a game you want to play that uses RT in a way that looks objectively great... then there is no point in buying hardware for a feature you have little interest in using or that won't impact your gaming experience in a meaningful way.

I mean I get where you're coming from... the feature is cool af. And some of us want to have the power to use it. But RT's usefulness isn't there for every gamer or genre yet. As adoption increases that will change, but we are only just finally starting to get there.
 
I look at the equation like this...

You can largely quantify at this point how much extra you're paying for RT performance and how many games actually support it in a meaningful way. And you can quantify which of those games, where RT makes a notable difference, you yourself play.
When I buy a new GPU I don't buy it for one specific feature set. If I wanted to buy one now it would inevitably support RT, so I don't understand how you would pay extra for it; it is not an optional extra.
On the other hand, you can't quantify how much enjoyment RT actually offers -- and this is where arguments for/against one card or another fall down, and the gnashing of teeth begins.

For me, there are literally 2-3 games I play that implemented RT in a meaningful way, and even so, I don't really think it makes playing a game materially more enjoyable.
Then you missed my point, because that's exactly what I was pointing out: the same argument can be made for any graphical option. Does AA or AF make a game materially more enjoyable?
You also missed my other point, where I said that if there is even one game where it is meaningful, I'd rather have it than not have it.
Further, considering AMD does do RT (just not at the level an equivalent nV card does), the decision waters are even muddier. For me, ultimate RT performance is not worth paying 10-20% more for. For someone else, the proposition and outcome may be different.
I think when you are already paying 1000+ for a GPU, paying 10-20% more to have the full package is a no-brainer. I'd rather pay 1100 or 1200 and have everything than pay 1000 for a compromised experience.
 
It changes everything. If you don't have a game you want to play that uses RT in a way that looks objectively great... then there is no point in buying hardware for a feature you have little interest in using or that won't impact your gaming experience in a meaningful way.
Of course moving the goalpost changes everything. First it was half dozen games, now it is zero?
I mean I get where you're coming from... the feature is cool af. And some of us want to have the power to use it. But RT's usefulness isn't there for every gamer or genre yet. As adoption increases that will change, but we are only just finally starting to get there.
The same argument could have been made for everything, including 3D accelerator HW to begin with. If you choose not to be an early adopter that's fine, but why try to make a value judgment on it by saying it is not useful?
 
Fair enough.

Personally I just didn't see enough to justify spending the extra just for RT.

*sorry for the late edit*
 
I look at the equation like this...

You can largely quantify at this point how much extra you're paying for RT performance and how many games actually support it in a meaningful way. And you can quantify which of those games, where RT makes a notable difference, you yourself play.

On the other hand, you can't quantify how much enjoyment RT actually offers -- and this is where arguments for/against one card or another fall down, and the gnashing of teeth begins.

For me, there are literally 2-3 games I play that implemented RT in a meaningful way, and even so, I don't really think it makes playing a game materially more enjoyable. Further, considering AMD does do RT (just not at the level an equivalent nV card does), the decision waters are even muddier. For me, ultimate RT performance is not worth paying 10-20% more for. For someone else, the proposition and outcome may be different.
This.

And to make things even muddier, high refresh monitors and a greater focus on game fluidity are more mainstream today than RT, even outside of competitive gaming, so in some cases RT might be more valuable turned off, given the performance hit to the overall gameplay experience. In some cases, RT can make the game look worse. Shocking, yes, but having some objects ray traced with great realism, like glass, while the rest of the game is not as "real", creates a "Roger Rabbit" feeling. It looks cool on the objects that are ray traced, but out of place compared to the rest of the game world. Like some of the water puddles in Control that have a different level of realism compared to the ground and game world. In some cases, the game might look worse in certain parts with ray tracing off, but more consistent overall.

RT has great potential when it comes to game realism in the future, but I feel that the importance of RT in current games is overblown when people make general statements about it.

Personally I appreciate that ray tracing is evolving and becoming better, but as a feature, it's far down the list when it comes to importance for me in current games (framerate, resolution, and general fidelity like AA/AF/textures etc., including HDR, come first). If there is performance budget for it after all other bases are covered (the game looks good otherwise and runs fluidly), then RT can be used if it is implemented at least decently, and even then I might turn it off if I think there might be severe framerate dips. Perhaps when it becomes more mainstream and better implemented in games, it will get higher priority for me.

If it's a game type, like adventure games, that doesn't benefit from refresh rates higher than 60 fps, and I have a steady 60 fps with RT on and it looks better with RT, of course, leave it on, why not! Some might feel this is wrong, but I would also consider using DLSS or FSR if it looks OK in that game and I can get RT with it on. Again, DLSS and FSR are also game features and I don't care if it's upscaling or "fake frames"/frame generation. As long as the end result is better with it on, I do that and don't care how it got there. Each game is different and gameplay in the end is what matters.

Not directly comparable, but I am more excited about good HDR implementations than ray tracing at the moment. It has much more impact when viewed on a decent screen that supports HDR, so if I were to buy a new system on a budget, I would put anything extra there. Especially since some of the fake HDR from consoles, for games that don't natively support it, is now available in Windows.
 
It also takes a while for engines to get good native implementations. UE4's RT implementation isn't really all that awesome performance-wise, so very, very few developers used it at all (it also didn't show up until UE 4.22). UE5's Lumen RT implementation, though, is awesome, so that will likely speed up adoption.
Ray tracing in UE4 is the definition of tacked on. The engine was never made to support it, which is why it suffers in both performance and image quality in games that have it available.
 
This.

And to make things even muddier, high refresh monitors and a greater focus on game fluidity are more mainstream today than RT, even outside of competitive gaming, so in some cases RT might be more valuable turned off, given the performance hit to the overall gameplay experience. In some cases, RT can make the game look worse. Shocking, yes, but having some objects ray traced with great realism, like glass, while the rest of the game is not as "real", creates a "Roger Rabbit" feeling. It looks cool on the objects that are ray traced, but out of place compared to the rest of the game world. Like some of the water puddles in Control that have a different level of realism compared to the ground and game world. In some cases, the game might look worse in certain parts with ray tracing off, but more consistent overall.
What? Which games look worse with RT? I never had the faintest feeling that Control was made worse by too-realistic puddles. Or too-unrealistic ones? I can't tell which from your description.
RT has great potential when it comes to game realism in the future, but I feel that the importance of RT in current games is overblown when people make general statements about it.
You can't make blanket statements like this, in some games it is not that significant, while in others it is like playing another game with RT on.
Personally I appreciate that ray tracing is evolving and becoming better, but as a feature, it's far down the list when it comes to importance for me in current games (framerate, resolution, and general fidelity like AA/AF/textures etc., including HDR, come first). If there is performance budget for it after all other bases are covered (the game looks good otherwise and runs fluidly), then RT can be used if it is implemented at least decently, and even then I might turn it off if I think there might be severe framerate dips. Perhaps when it becomes more mainstream and better implemented in games, it will get higher priority for me.
That's not an argument against ray tracing. When SSAA was first becoming a thing, it had a similar performance impact to RT now, so it was the first thing people turned off if performance wasn't good. Before that it was shadows, reflections, particle effects, etc. Ray tracing is the newest thing; it will trickle down too, and we will have something new to bitch about.
If it's a game type, like adventure games, that doesn't benefit from refresh rates higher than 60 fps, and I have a steady 60 fps with RT on and it looks better with RT, of course, leave it on, why not! Some might feel this is wrong, but I would also consider using DLSS or FSR if it looks OK in that game and I can get RT with it on. Again, DLSS and FSR are also game features and I don't care if it's upscaling or "fake frames"/frame generation. As long as the end result is better with it on, I do that and don't care how it got there. Each game is different and gameplay in the end is what matters.
I have always valued graphics fidelity above frame rate. I know this upsets a lot of FPS elitists, but I just don't give a frack about getting more than 45 FPS. I grew up playing F1GP2 at 15 FPS; this is three times better already.
Not directly comparable, but I am more excited about good HDR implementations than ray tracing at the moment. It has much more impact when viewed on a decent screen that supports HDR, so if I were to buy a new system on a budget, I would put anything extra there. Especially since some of the fake HDR from consoles, for games that don't natively support it, is now available in Windows.
Now there is something that makes games look patently worse, and movies too. Extreme oversaturation combined with burnt out highlights and crushed blacks are not my thing.
 
What? Which games look worse with RT? I never had the faintest feeling that Control was made worse by too-realistic puddles. Or too-unrealistic ones? I can't tell which from your description.

You can't make blanket statements like this, in some games it is not that significant, while in others it is like playing another game with RT on.

That's not an argument against ray tracing. When SSAA was first becoming a thing, it had a similar performance impact to RT now, so it was the first thing people turned off if performance wasn't good. Before that it was shadows, reflections, particle effects, etc. Ray tracing is the newest thing; it will trickle down too, and we will have something new to bitch about.

I have always valued graphics fidelity above frame rate. I know this upsets a lot of FPS elitists, but I just don't give a frack about getting more than 45 FPS. I grew up playing F1GP2 at 15 FPS; this is three times better already.

Now there is something that makes games look patently worse, and movies too. Extreme oversaturation combined with burnt out highlights and crushed blacks are not my thing.
I mean no disrespect, but I really don't have time to answer by microquote, even though microquotes might be more precise. :)

Ray traced objects in games, especially glass, can have too much realism compared to the rest of the game world, and though they look cool, they become out of place. My "Roger Rabbit" reference was to the movie "Who Framed Roger Rabbit", where you have cartoon characters combined with real people. For Control, I found this as an example (around the 5:18 mark):

As you can see, the puddle sticks out like an object that seems out of place. It looks cool and all, but it sticks out compared to the rest of the scene. I chose Control as an example because it is one of the games where RT is implemented more significantly and beautifully in other parts.

It's VERY easy to argue, as a blanket statement, that the importance of ray tracing in current games is overblown.
First of all, we agree that in some games ray tracing is not implemented significantly. I would argue that this goes for most games that support RT to begin with.
I also think we can agree that the performance hit is massive with RT on, so turning it on is a potential sacrifice that impacts gameplay directly.
We might also agree that high refresh rate screens are more available and have become more mainstream?
In addition, gamers today have a much higher focus on fluidity in gameplay, with VRR and high refresh rates, outside of competitive gaming. FPS matters more, and to more people, than just a niche of competitive gamers. For some, it's a choice between RT and higher resolution.
Some also have issues with using upscaling or frame generation to combat the performance hit, due to imperfect implementations or input lag (I am not one of those; frankly I like the option to upscale, especially since I use VR a lot and I am not that sensitive to input lag).

Most gamers don't have the GPU power to turn on RT, and of those that do, many have reasons to turn it off to avoid the performance penalty, preferring higher resolutions or refresh rates. The games where RT is implemented in any significant way are few and far between, and even then many would argue that having it off doesn't impact gameplay to a degree where it's a dealbreaker. The importance of ray tracing is made out to be more significant in current games than it really is, generally speaking. That people want to turn it down or off, like they did with SSAA, is not an argument for ray tracing. Nobody is claiming that ray tracing itself is a bad idea, I hope, and personally, I like that ray tracing is evolving. But at this stage, with current games, the focus upon its importance is overblown.

I get that you value graphics fidelity over framerate. Many do, including me before, so I see where you are coming from. I am not an FPS elitist, but 45 FPS would be too slow for me in most games today. I seek a balance instead, where gameplay and fidelity are good enough and, most of all, consistent, so I can focus on the game itself. Some games don't have high fidelity to begin with, but I still love them for the gameplay. :)

As for HDR, you should give it a second try. ME: Andromeda on an OLED screen with Dolby Vision looks very good, and the same goes especially for cartoonish games with Windows AutoHDR.
 
I use RT only if it's worth it. Control, for example, is a game that shows quite a bit better image quality with it. However, the game itself is a pain in the neck and I have no patience for that, so I'd say most titles I play do not offer or take any real advantage of RT as of today. I hope newer games make better and more rational use of such a nice visual resource.
 
I mean no disrespect, but I really don't have time to answer by microquote, even though microquotes might be more precise. :)
It's not that hard, you just press enter after the quote you want to reply to separately and type, it takes no extra time at all.
Ray traced objects in games, especially glass, can have too much realism compared to the rest of the game world, and though they look cool, they become out of place. My "Roger Rabbit" reference was to the movie "Who Framed Roger Rabbit", where you have cartoon characters combined with real people. For Control, I found this as an example (around the 5:18 mark):
As you can see, the puddle sticks out like an object that seems out of place. It looks cool and all, but it sticks out compared to the rest of the scene. I chose Control as an example because it is one of the games where RT is implemented more significantly and beautifully in other parts.
No need to repeat, I understood the argument. I just never saw this as something that would bother me even remotely during my playthrough of Control.

It's VERY easy to argue, as a blanket statement, that the importance of ray tracing in current games is overblown.
You can't just repeat the original statement as a defense of it.
First of all, we agree that in some games ray tracing is not implemented significantly. I would argue that this goes for most games that support RT to begin with.
One does not follow from the other. Just because some games have it as an afterthought does not mean it applies to all or most of them.
If there is just one game that looks significantly better with it, then your blanket statement is instantly invalidated. And I have played many games that greatly benefited from RT, including Control, which is probably the least of the bunch, but still enough for me to use it and take the performance hit rather than not use it.
I also think we can agree that the performance hit is massive with RT on, so turning it on is a potential sacrifice that impacts gameplay directly.
If it gets unplayable obviously I'll turn it off. But I've not seen a game where I'd rather not have it, even if the performance was still good with it.
We might also agree that high refresh rate screens are more available and have become more mainstream?
No, I definitely don't agree with that. High refresh rate screens have been available for a long time, going back to the CRT era (where it actually mattered for more reasons than just FPS).
High refresh rate screens are a niche, not an avenue of progress. I've tried it myself and decided it was not something I wanted. I used a 144Hz screen for three years, only to switch to a 75Hz one, and I never looked back, even for a second.
In addition, gamers today have a much higher focus on fluidity in gameplay, with VRR and high refresh rates, outside of competitive gaming. FPS matters more, and to more people, than just a niche of competitive gamers. For some, it's a choice between RT and higher resolution.
Some also have issues with using upscaling or frame generation to combat the performance hit, due to imperfect implementations or input lag (I am not one of those; frankly I like the option to upscale, especially since I use VR a lot and I am not that sensitive to input lag).
That's all well and good, but then by the same logic why can't you concede that to some people RT is more important than running at 100+ FPS? I mean, many console games are still targeting 30 FPS, so clearly people are OK with that.
Most gamers don't have the GPU power to turn on RT, and of those that do, many have reasons to turn it off to avoid the performance penalty, preferring higher resolutions or refresh rates. The games where RT is implemented in any significant way are few and far between, and even then many would argue that having it off doesn't impact gameplay to a degree where it's a dealbreaker. The importance of ray tracing is made out to be more significant in current games than it really is, generally speaking. That people want to turn it down or off, like they did with SSAA, is not an argument for ray tracing. Nobody is claiming that ray tracing itself is a bad idea, I hope, and personally, I like that ray tracing is evolving. But at this stage, with current games, the focus upon its importance is overblown.
"I don't have a GPU to run it fast enough" does not make it overblown. This is exactly the cope I was talking about, people not wanting to admit that their GPU is not fast enough would rather argue that RT is not worth it and overblown, etc.
Why is it so hard to say "RT is great, I just can't run it with my GPU"? I mean, this applies to me also. I think Hogwarts Legacy looked brilliant with RT, but I just couldn't get it to remain playable on my 2080 Ti. Thus I was forced to play the game without it. So of course it is not a deal breaker, I never argued that.
As for HDR, you should give it a second try. ME: Andromeda on an OLED screen with Dolby Vision looks very good, and the same goes especially for cartoonish games with Windows AutoHDR.
Just as with high refresh rate, HDR is not something I'm willing to pay double for in a screen. An OLED with a similar size and feature set to my current screen doesn't even exist. I prioritize my preferred resolution and screen real estate.
 
It's not that hard, you just press enter after the quote you want to reply to separately and type, it takes no extra time at all.

No need to repeat, I understood the argument. I just never saw this as something that would bother me even remotely during my playthrough of Control.


You can't just repeat the original statement as a defense of it.

One does not follow from the other. Just because some games have it as an afterthought does not mean it applies to all or most of them.
If there is just one game that looks significantly better with it, then your blanket statement is instantly invalidated. And I have played many games that greatly benefited from RT, including Control, which is probably the least of the bunch, but still enough for me to use it and take the performance hit rather than not use it.

If it gets unplayable obviously I'll turn it off. But I've not seen a game where I'd rather not have it, even if the performance was still good with it.

No, I definitely don't agree with that. High refresh rate screens have been available for a long time, going back to the CRT era (where it actually mattered for more reasons than just FPS).
High refresh rate screens are a niche, not an avenue of progress. I've tried it myself and decided it was not something I wanted. I used a 144Hz screen for three years, only to switch to a 75Hz one, and I never looked back, even for a second.

That's all well and good, but then by the same logic why can't you concede that to some people RT is more important than running at 100+ FPS? I mean, many console games are still targeting 30 FPS, so clearly people are OK with that.

"I don't have a GPU to run it fast enough" does not make it overblown. This is exactly the cope I was talking about, people not wanting to admit that their GPU is not fast enough would rather argue that RT is not worth it and overblown, etc.
Why is it so hard to say RT is great, I just can't run it on with my GPU? I mean this applies to me also. I think Hogwarts Legacy looked brilliant with RT, but I just couldn't get it to remain playable on my 2080Ti. Thus I was forced to play the game without it. So of course it is not a deal breaker, I never argued that.

Just as with high refresh rate, HDR is not something I'm willing to pay double for in a screen. An OLED with a similar size and feature set to my current screen doesn't even exist. I prioritize my preferred resolution and screen real estate.
I think you are missing my whole point, so let me boil it down a bit: ray tracing is available in some games, while only a few of them have any significant implementation of it. In addition to that, even when one of the few games with significant ray tracing is appealing enough for a gamer to buy, there are still sacrifices to be made to enable it that the gamer might not want to make (like resolution, refresh rate, or input lag with DLSS/FSR).

Ray tracing is not a killer feature or game changer (in the future it can be, but not before it's more mainstream, I think), not even for the subset of the subset of the subset of gamers that are capable of running it with decent framerates at 1080p and up in games of today, generally speaking, compared to the overblown focus that's been on the feature. The proof is in the pudding when people with more than capable hardware often decide to turn it off, for the reasons I stated above.

I get it, ray tracing might be important to some, like it is for you. Some, like you, also game fine at 45 fps, don't like high refresh monitors, etc.
We are all different and that's not a bad thing. However, going beyond personal taste, in general the importance of ray tracing in games today is overblown.
Also, gaming monitors today are heavily marketed on features like refresh rate and VRR capabilities. That has more focus than ever before, and even consoles are starting to push out more 120 fps games.

I play VR more than "pancake" games and have had all the Oculus developer kits since the beginning. My last 4-5 GPUs have all been Nvidia due to VR support, since it's important enough to me that I make my purchasing decisions based upon it. VR IS in many aspects a game changer, unlike RT in today's games. In addition, there are games that are made only for VR, and many games have gotten VR support. Still, I can look beyond what's important to me.

It's good you acknowledge that no RT is not a dealbreaker, as I have stated from the beginning that it's not something that never adds anything to the visuals. However, as I said originally:

RT has great potential when it comes to game realism in the future, but I feel that the importance of RT in current games is overblown when people make general statements about it.
 
Speaking of Control, there's a puzzle in there where you need to look at a computer monitor to see which way some doors need to be positioned in order to open the entrance to the area. It's pretty obscure to begin with, even more so with ray-traced reflections on. I had to turn them off for a quick minute in order to see wth I was doing. Interestingly, the mission is called 'The Mirror' lol
 
I think you are missing my whole point, so let me boil it down a bit: ray tracing is available in some games, while only a few of them have any significant implementation of it. In addition to that,
It not being in every game is not the same as it being overblown as a general rule. You are trying to argue that it is overblown, but you're actually just arguing that it's not in all games.
But even that is a stretch, as every new game I played recently had it, and there was only one game where the implementation felt like an afterthought and didn't add much graphically: Marvel's Midnight Suns. That was the exception, not the rule.
even when one of the few games with significant ray tracing is appealing enough for a gamer to buy, there are still sacrifices to be made to enable it that the gamer might not want to make (like resolution, refresh rate, or input lag with DLSS/FSR).
To buy it? What do you mean, buy what? RT comes at a performance cost; that was never questioned. I don't know why the goalposts suddenly moved from graphical benefits to the performance hit. The performance cost and the effect's significance being overblown are two separate things.
Ray tracing is not a killer feature or game changer (in the future it can be, but not before it's more mainstream, I think),
It's interesting that you'd say that, considering that real-time ray tracing has been considered the holy grail since the idea was first conceived. I think I first heard about it in the 90s. It is the ultimate game changer; becoming mainstream has no bearing on that. Although I'd really like to see statistics on how many people with compatible video cards play games with RT on vs. how many turn it off. I believe it is much more mainstream than you think.
not even for the subset of the subset of the subset of gamers that are capable of running it with decent framerates at 1080p and up in games of today, generally speaking, compared to the overblown focus that's been on the feature. The proof is in the pudding when people with more than capable hardware often decide to turn it off, for the reasons I stated above.
Really? I just bought a lower mid-range AMD video card for my second PC, and Shadow of the Tomb Raider runs at a 70 FPS average on it with ray tracing on at 1080p.
I get it, ray tracing might be important to some, like it is for you. Some, like you, also game fine at 45 fps, don't like high refresh monitors, etc.
I didn't say I don't like high refresh rate monitors, I just don't need one; the benefits didn't justify the cost. Especially considering that I always crank up the graphics to the point where most new games barely exceed 60 fps.
We are all different and that's not a bad thing. However, going beyond personal taste, in general the importance of ray tracing in games today is overblown.
You say you go beyond personal taste, then state your own personal opinion. You think it is overblown, and as an opinion that's fine, but you come across as if you are stating a fact, not an opinion.
Also, gaming monitors today are heavily marketed on features like refresh rate and VRR capabilities. That has more focus than ever before, and even consoles are starting to push out more 120 fps games.
I don't understand why we can't have both; why does one have to beat out the other? But even with VRR, you need to pay extra if you want a decent implementation.
I play VR more than "pancake" games and have had all the Oculus developer kits since the beginning. My last 4-5 GPUs have all been Nvidia due to VR support, since it's important enough to me that I make my purchasing decisions based upon it. VR IS in many aspects a game changer, unlike RT in today's games. In addition, there are games that are made only for VR, and many games have gotten VR support. Still, I can look beyond what's important to me.
VR is an entirely different horse race; you can't have it without a high refresh rate and high FPS. So of course RT is an issue for VR. I don't even know if there are any VR RT games.
However, I think VR is another niche that will never become mainstream, unlike RT effects in games.
It's good you acknowledge that no RT is not a dealbreaker, as I have stated from the beginning that it's not something that never adds anything to the visuals. However, as I said originally:
I really don't get this, how would it be a deal breaker in the first place?

I just think it adds a lot to certain games; I never meant it as a general rule. How could I? I haven't played all the games that support some sort of RT; there are probably some where the benefits are small to nonexistent. That does not make the technology as a whole overblown.
 
I have a 3090 @1440P and have run into four games with RT: Control (left it on), Doom Eternal (turned it off - didn't do anything but make the guns shiny), Portal RTX (LOL the DLSS artifacts to make it playable - also looked stupid with weird glows and other issues), and Elden Ring (broken).

I'm still waiting for it to show up in another game I actually want to play - right now that might end up being Metro Exodus, as I finally did 2033 back in February, and will probably hit the others later on this year, but that's it. Nothing else even twigged my radar on it. RT is a neat idea that isn't mainstream or fully built out yet - in a couple of years it'll be everywhere, but it's still bleeding edge right now and a lot of games have either no implementation or a crappy implementation, much like the early days of AA/AF/T&L/etc. Personally, I don't care if the game has it or not - I can enjoy a game without it, and I can enjoy a game with it - if it's done well. Good games are good games regardless - heck, I'm replaying Wing Commander 1 right now, which is both 10-15 FPS (because of how the engine works) and sprite-based. Still a good game.
 