NVIDIA Bans Reviewer for Concentrating on Rasterization Instead of Ray Tracing

Not saying what NVIDIA did was right by any stretch, but ray tracing performance is what made me decide to wait for a 3080 Ti over a 6900XT. Raster performance is close between the two, but the latter has half the ray tracing performance based on the 3090 numbers. Ray tracing is a purchase consideration for me going forward, so it should be getting more attention in reviews.
 
Not saying what NVIDIA did was right by any stretch, but ray tracing performance is what made me decide to wait for a 3080 Ti over a 6900XT. Raster performance is close between the two, but the latter has half the ray tracing performance based on the 3090 numbers. Ray tracing is a purchase consideration for me going forward, so it should be getting more attention in reviews.
Holy shit. Welcome back man.
 
I might be skimming too fast, but for example:


I just read the title and clicked around the benchmarks a bit, and I don't see a single mention of DLSS, not one benchmark comparing the 3090 to the 3080/2080 Ti with RTX on, etc...

I imagine it could simply have been a case of "that's not a product our target audience cares about, so we won't take the time to seriously review it for games," rather than specifically not wanting to talk about the extra features for any particular reason. Just saving time.

When they compare a particular model rather than doing the initial launch review, they also tend to skip the DLSS/RT benchmarks, assuming the people watching already saw the FE review at launch and are just interested in the OC/noise differences between models and the difference in regular raster performance.

Cyberpunk they also split into two videos, one of which didn't show the RT/DLSS performance (though timing-wise that was after Nvidia's decision was made, I would imagine).

Is there a requirement that a review site consider DLSS? Why? Most people consider it a garbage feature that should be burned anyway. If your card can't maintain 60 fps with specific features turned on at the resolution of a modern monitor, then it's not capable of powering those features >.< Silly AI scaling isn't a solution I'm interested in... I play 100s of games, not 5. But anyway, you're probably right they didn't go into super detail in their 3090 review, and why did they have to? If Nvidia is including bullet-point talking points that must be included in their reviews, then they are just infomercials. But seriously, the 3090 review wasn't their 3000-series launch video, their 3080 review was... the 3090 is a known entity, it's X amount faster, you don't have to rerun a week of game tests to check that.

And the 3090 review is just an extension of their 3080 review; it's not a drastically different card... I mean come on, they didn't do as many benchmarks reviewing the 6900 from AMD either, because it was covered in their 6000-series launch reviews.

Here is their 3080 review (their 3000-series launch-day video), where they clearly cover RT and DLSS and compare it to the 2000-series cards.


And ya, what could have made Nvidia mad other than some personal-attack oddness... perhaps saying... ya, the 3080 is better at RT because it's a faster card, not because the cores are improved. lol
 
Not saying what NVIDIA did was right by any stretch, but ray tracing performance is what made me decide to wait for a 3080 Ti over a 6900XT. Raster performance is close between the two, but the latter has half the ray tracing performance based on the 3090 numbers. Ray tracing is a purchase consideration for me going forward, so it should be getting more attention in reviews.
By the time even 5% of your personal game catalog supports RT of any kind... the 3080 TI will be a dinosaur. (Might as well wait for the 4000 or 7000 cards ;) half kidding.) Your point is valid: if all else is equal, sure, consider other stuff like RT... but at this point RT is not something anyone should seriously consider a game changer imo. Unless someone tells me they have really put 1000 hours into Control... or 100+ hours into Cyberpunk (low-blow joke there, seeing as numbers just came out showing Cyberpunk has bled 75% of its PC players in the last few weeks)... if you're that gamer, then ok, RT performance is important.

Both NV and AMD have features that make their products stand out. For myself, I run AMD... and the most underrated feature, one that I use basically every day, is Radeon Chill. (Nvidia doesn't have it, and no, it's not just Vsync.) My 5700 XT, like most cards, can be loud enough to be noticeable when playing newer AAA-type stuff... I mean it's not a leaf blower or anything, I have a decent aftermarket card and a good cooling setup. Still, at full fan you (and your wife) know it's on. However (I assume like most gamers) I have a handful of older titles I play all the time (MMOs, DOTA, etc.), and none of those games push my hardware in ANY way... so I use Radeon Chill, and in almost all my older titles (and a lot of newer, less demanding titles) I can run them at ultra settings with Chill enabled, running silky smooth, AND 95% of the time my GPU fan doesn't even turn on. That one feature alone has brought me back to team AMD exclusively. (I just retired my last Nvidia card; I still had a very old 750 Ti in a backup machine at a house I'm only at a few days a month... upgraded another PC here and put my RX 470 in that machine. lol) Chill is a killer must-have feature imo... going through my Steam library, I figure probably 80% of my titles I can run with Chill at max settings with zero fan noise. I'll take that real-world feature over something like RT that I can only use in a couple of games any day. (And ya, I am disappointed that Cyberpunk doesn't look like it's the game that makes me want RT right now... I think most of us wanted it to be. Perhaps it still will be in a year, who knows.)
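The idea behind Chill is simple enough to sketch if you're curious. This isn't AMD's actual implementation, and the FPS values and function names below are just made up for illustration, but conceptually it's an input-aware frame limiter: keep the cap low while your mouse/keyboard are quiet, raise it when you're actually playing, and the GPU (and fan) never works harder than the game needs.

import time

# Conceptual sketch of an input-aware frame limiter (the idea behind Radeon
# Chill). The FPS values and function names are illustrative, not AMD's.
FPS_IDLE = 40      # cap while mouse/keyboard have been quiet
FPS_ACTIVE = 144   # cap while the player is actively moving

def target_frame_time(seconds_since_last_input):
    """Pick a frame-time budget based on how recently the player gave input."""
    fps = FPS_ACTIVE if seconds_since_last_input < 0.5 else FPS_IDLE
    return 1.0 / fps

def run_frame(render, seconds_since_last_input):
    """Render one frame, then sleep out the rest of the frame-time budget."""
    start = time.perf_counter()
    render()                                   # draw the frame as usual
    elapsed = time.perf_counter() - start
    budget = target_frame_time(seconds_since_last_input)
    if elapsed < budget:
        time.sleep(budget - elapsed)           # GPU and fans get to idle here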

Also on RT, there is the other very legit argument... that AMD's solution is going to be the developer target for the next 4-5 years anyway. Chances are PC games hitting in the next few years are going to be targeting AMD hardware specifically. I'm not saying it will for sure run faster on AMD hardware... but it's as likely as a wave of RTX-enabled games hitting the market.
 
This is ridiculous to the point that it's almost hyperbolic.

My 1080ti has been a great card for years, but compared to a 3080 or 6800xt the difference in raster is 50-100 PERCENT.

We're talking going from 60 to 90-120 fps.
I think it's great you can still game on a 1080 Ti, a card that's what, over 4 years old? A 5700 XT plays just fine as well.

Of course newer stuff is faster.

Don't believe the persistent hate on RT and DLSS.

A 2080 Ti plays Cyberpunk at 3440x1440 with DLSS quality at 35 fps to 50 fps. Turn DLSS off, 18 fps to 50 fps. Mainly pointing out the low-end numbers. From my experience, with a G-Sync LCD, if FPS is above 30, it's going to play great. I was running around playing it with DLSS off for a while this morning, and it was smooth and actually playable even when showing 18 fps. Kinda annoying when you turn and can see the frames tho. Keep it above 30 via settings and it is fine, RT is fine, DLSS is fine, as long as you have some sort of adaptive sync. Turn cascaded shadows to low, texture quality to one step below Psycho (Ultra, I think), DLSS to quality, and everything else to max. It's mostly fine with Psycho on texture quality, but I couldn't see a difference, and it does get a bit higher fps with it one step down.
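Rough back-of-the-envelope on why adaptive sync matters so much at those frame rates (simplified numbers, not measurements from my rig):

import math

# Why ~35 fps feels smoother with adaptive sync than with fixed-refresh vsync.
RENDER_FPS = 35
FIXED_REFRESH_HZ = 60

frame_time_ms = 1000 / RENDER_FPS             # ~28.6 ms to render each frame
tick_ms = 1000 / FIXED_REFRESH_HZ             # 16.7 ms between fixed refreshes

# Fixed 60 Hz + vsync: a finished frame waits for the next refresh tick, so a
# 28.6 ms frame isn't shown until the 33.3 ms mark (and pacing jitters when
# render times straddle a tick boundary).
vsync_present_ms = math.ceil(frame_time_ms / tick_ms) * tick_ms

# Adaptive sync (G-Sync/FreeSync): the panel refreshes when the frame is
# ready, so every frame is shown after ~28.6 ms, sooner and far more evenly.
vrr_present_ms = frame_time_ms

print(f"vsync: {vsync_present_ms:.1f} ms per frame, VRR: {vrr_present_ms:.1f} ms")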
 
But anyway, you're probably right they didn't go into super detail in their 3090 review, and why did they have to? If Nvidia is including bullet-point talking points that must be included in their reviews, then they are just infomercials.
Well yes, that's why they send cards/drivers to people before launch: to get free infomercials. People agree to do them in exchange for access, and enter a dance of power once they become big enough (where there is potentially a bigger cost to not giving them early access, even if you get bad reviews for a generation; something I imagine Apple does not worry too much about, since they have a strong list of good people they use, while smaller companies need to be way more generous).

They don't have to do it, obviously (that's why they didn't: because they didn't have to), in the same way Nvidia doesn't have to provide early drivers and cards to them. That's why the uproar was a good thing and needed: because Nvidia really doesn't have to, and they can pressure the coverage as much as they legally can, while on the other side the more powerful reviewers and their audience can pressure Nvidia not to manipulate the reviews too much. It is an uneasy balance to keep in the current world. If the reviews were made by the NY Times/Financial Times, which don't think for a second about their relationship with hardware makers for their financial situation, it would not be an issue at all. But people who want any form of access, and for whom that access becomes part of their business model, don't have that luxury.

They did more than go over RT/DLSS quickly elsewhere; we'd maybe need to listen carefully, but it seems to me ray tracing and DLSS were not even mentioned at all during the review of the 3090 card. For all the talk of RT being useless, if you buy a 3090 the chances are high you are interested in the highest eye candy and RT, and a good proportion of people upgrading for the Cyberpunk launch were interested in RT/DLSS; that was a large proportion of the people buying cards this fall. I feel like, as on most occasions like this, people spread the false information that they always included those features in all their reviews and that what NVIDIA disliked was only the way they did it, which made the situation much more fun to cover that way.

As for clearly covering it in X other videos, I think that's part of the confusion between them/their fans and Nvidia. From their point of view, and it is perfectly normal, their complete body of work perfectly and totally covers those features. From the point of view of an Nvidia marketing person, who correctly considers that any given video may be watched by someone who only watches that one video from them, each video is its own ad for the product and can (needs to) be judged at least a little bit in a vacuum, on its own.
 
I think it's great you can still game on a 1080 Ti, a card that's what, over 4 years old? A 5700 XT plays just fine as well.

Of course newer stuff is faster.

Don't believe the persistent hate on RT and DLSS.

A 2080 Ti plays Cyberpunk at 3440x1440 with DLSS quality at 35 fps to 50 fps. Turn DLSS off, 18 fps to 50 fps. Mainly pointing out the low-end numbers. From my experience, with a G-Sync LCD, if FPS is above 30, it's going to play great. I was running around playing it with DLSS off for a while this morning, and it was smooth and actually playable even when showing 18 fps. Kinda annoying when you turn and can see the frames tho. Keep it above 30 via settings and it is fine, RT is fine, DLSS is fine, as long as you have some sort of adaptive sync. Turn cascaded shadows to low, texture quality to one step below Psycho (Ultra, I think), DLSS to quality, and everything else to max. It's mostly fine with Psycho on texture quality, but I couldn't see a difference, and it does get a bit higher fps with it one step down.

It was a great purchase, but we clearly have different views on what's a great experience.
I'll take drastically improved raster in every title over upscaled 1440 in a handful.

I'm currently trying to buy a replacement, with no success so far, but I'd prefer to play Cyberpunk at higher than 40-50 fps, even with VRR.

It's the hyperbolic way you try to dismiss the huge raster difference; that 40-50 would be almost 100 fps at the same settings I'm playing at now. That's a difference that should matter more than RT/DLSS in a few titles, especially since it applies to every title.
 
Is there a requirement that a review site consider DLSS? Why? Most people consider it a garbage feature that should be burned anyway. If your card can't maintain 60 fps with specific features turned on at the resolution of a modern monitor, then it's not capable of powering those features >.< Silly AI scaling isn't a solution I'm interested in... I play 100s of games, not 5. But anyway, you're probably right they didn't go into super detail in their 3090 review, and why did they have to? If Nvidia is including bullet-point talking points that must be included in their reviews, then they are just infomercials. But seriously, the 3090 review wasn't their 3000-series launch video, their 3080 review was... the 3090 is a known entity, it's X amount faster, you don't have to rerun a week of game tests to check that.

And the 3090 review is just an extension of their 3080 review; it's not a drastically different card... I mean come on, they didn't do as many benchmarks reviewing the 6900 from AMD either, because it was covered in their 6000-series launch reviews.

Here is their 3080 review (their 3000-series launch-day video), where they clearly cover RT and DLSS and compare it to the 2000-series cards.


And ya, what could have made Nvidia mad other than some personal-attack oddness... perhaps saying... ya, the 3080 is better at RT because it's a faster card, not because the cores are improved. lol

No, most people don't consider DLSS garbage. It gives you a lot of performance for a minor PQ hit. Most people will be hard-pressed to see the difference.
 
No, most people don't consider DLSS garbage. It gives you a lot of performance for a minor PQ hit. Most people will be hard-pressed to see the difference.

You forgot the big asterisk...

* Only if it is supported in the game you're playing. Which, quite frankly, is a pitifully small number of games for them to be advertising it as a selling feature.

Edit:
I'm not against the technology. I just don't see it being widespread, and a lot of the time the games they add DLSS to are games I wouldn't play. The last game I played through (AC: Valhalla) didn't support either RTX or DLSS and actually plays better on an RX 6800. I short-changed myself playing on an Nvidia card instead of swapping in my GF's 6800 while I was playing.
 
You forgot the big asterisk...

* Only if it is supported in the game you're playing. Which, quite frankly, is a pitifully small number of games for them to be advertising it as a selling feature.
Nvidia has their puppets talk like DLSS is a solution we should all be happy to have. It has MAJOR issues...
1) It requires Nvidia DGX-1 servers to train on images of games... meaning developers have to work directly with Nvidia... and I don't personally believe that is good for any of us.
2) It REQUIRES training. So it will never work with the VAST majority of games. Most developers aren't going to want to deal with Nvidia every time they release a DLC or add-on.
3) See above... it will always have problems with mods both big and small, including things like ReShade, etc.
4) No matter how good it looks, it's never perfect... it's still upscaling, because bottom line, the hardware isn't ready to turn on RT... outside of making RT playable, DLSS is iffy for the most part. 4K FPS on the 3070 and up seems good enough to me that upscaling isn't really much of a selling feature anyway. Perhaps DLSS made more sense last gen, when 2070s were still pretty iffy in some AAA 4K titles.

Now I know Nvidia has been working on fixing it... and I have seen it doing its thing, and I admit 2.0 looks better than the first version, which was complete crap. Still, it's noticeable in games where you're actually looking around at the eye candy... and frankly, if you're NOT looking around at the eye candy, why not just turn a few other features down and play at native resolution? So to sum it up, it's a tech that will NEVER see widespread implementation... that is just true on its face, so review sites that tell me it's a feature I must consider because it's a "game changer" are freaking paid adverts. It will never find its way into more than a handful of titles.

Also, they seem to overlook the obvious imo... if you are going top end, 3080/3090 or even last-gen 2080 and 2070-type cards, play at native: if you're not trying to use RT, chances are 90% of gamers are already over their monitor's refresh rate, because 4K and even 1440p monitors don't tend to have insane refresh rates quite yet. Sure, perhaps it's a good tech for gamers buying 2060/3060-type hardware... but I would suggest most people in that budget range are still gaming at 1080p (or low-refresh 1440p), and again... they can play at native, not counting RT.

In the end, DLSS is a temporary fix for slow RT hardware. There is no future where this tech continues to exist as it is. Perhaps a DLSS 3 has some crazy pre-compile? Or something that can work without external help and doesn't require constant ongoing dev support. To be clear, I can see some potential in the idea of the tech... but the current implementation is no future killer feature. It's likely to die off... but there is an outside chance they refine it into something that may help gamers squeeze an extra year or two out of their old cards. (Which is why I am solidly in the camp that it will probably just die off, because why create features that allow gamers to play newer AAA titles with their old cards. lol)
 
No, most people don't consider DLSS garbage. It gives you a lot of performance for a minor PQ hit. Most people will be hard-pressed to see the difference.
For the record, that may be true in one or two titles... where you're not likely to stop and look around much.
 
It was a great purchase, but we clearly have different views on what's a great experience.
I'll take drastically improved raster in every title over upscaled 1440 in a handful.

I'm currently trying to buy a replacement, with no success so far, but I'd prefer to play Cyberpunk at higher than 40-50 fps, even with VRR.

It's the hyperbolic way you try to dismiss the huge raster difference; that 40-50 would be almost 100 fps at the same settings I'm playing at now. That's a difference that should matter more than RT/DLSS in a few titles, especially since it applies to every title.
It's completely fine to disagree on those points.

If you have a card that can do RT, you should try it. I think for it to be as good an experience as I've had, you need adaptive sync as well. Cyberpunk is not a game that plays like Quake 3. It plays fine at 35 fps minimums. It seems that many are making assumptions about playability and enjoyment for things they have no direct experience with.
 
It's completely fine to disagree on those points.

If you have a card that can do RT, you should try it. I think for it to be as good an experience as I've had, you need adaptive sync as well. Cyberpunk is not a game that plays like Quake 3. It plays fine at 35 fps minimums. It seems that many are making assumptions about playability and enjoyment for things they have no direct experience with.

Completely fine to disagree, everyone has different priorities.

I've played over 50 hrs of Cyberpunk, and I've done it on a 1440p 144 Hz FreeSync monitor with G-Sync compatibility turned on.

VRR is a drastic improvement vs. the old days of choosing between stutter and tearing, but I don't think even 45 fps with VRR enabled compares to 100+ fps. It's an opinion, but one that I think most PC gamers share.
 
I thought that was only DLSS 1?
No, the AI still needs to be trained, but it doesn't take as long thanks to improvements in the algorithm. And the AI models have been integrated into the drivers now instead of needing to be pushed through the game's code, meaning if there are further improvements to the AI model for a specific game then all you need is the updated driver.
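Purely a conceptual sketch of the packaging difference being described here, not NVIDIA's real API, and the file paths and class names are made up: the 1.x-style approach shipped weights trained per title, while a 2.x-style approach uses one generalized network the driver can update, fed extra per-frame inputs like motion vectors.

def load_model(path):
    """Stub standing in for 'load a neural network from disk'."""
    def model(*inputs):
        return inputs[0]          # placeholder: pretend this is the upscaled frame
    return model

class PerGameUpscaler:            # roughly how 1.x-style upscaling was packaged
    def __init__(self, game_id):
        # weights trained per title (on the vendor's side) and shipped with the game
        self.model = load_model(f"game_assets/upscaler/{game_id}.bin")

    def upscale(self, low_res_frame):
        return self.model(low_res_frame)

class GenericUpscaler:            # roughly how 2.x-style upscaling is packaged
    def __init__(self):
        # one generalized model, delivered and updated via the driver
        self.model = load_model("driver/upscaler/generic_model.bin")

    def upscale(self, low_res_frame, motion_vectors, depth):
        # per-frame inputs are what let a single network cover many games
        return self.model(low_res_frame, motion_vectors, depth)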
 
No, the AI still needs to be trained, but it doesn't take as long thanks to improvements in the algorithm. And the AI models have been integrated into the drivers now instead of needing to be pushed through the game's code, meaning if there are further improvements to the AI model for a specific game then all you need is the updated driver.
Ya, that was my understanding... instead of tons of data from one game... they have a generic "4K game textures" file... and the information for the driver is still trained from game images, but fewer and lower-res ones. Which makes life much easier for everyone.

I read it as sort of the same way some 4K TV manufacturers do higher-end upscaling and AI sharpening. Such as, say, Sony, who train on a ton of generic 4K material so any source can be AI-sharpened with their "reality engine".

No doubt 2.0 is much better than 1.0, no one disagrees on that I'm sure. Still, it requires Nvidia servers and developer work for training... unless I'm mistaken. Wouldn't be the first time. lol
 
By the time even 5% of your personal game catalog supports RT of any kind... the 3080 TI will be a dinosaur.

Ray tracing and DLSS are just tack-on features. The image quality impact in Control wasn't that big, especially in motion. In other games it can be worse.

But with Control, I had the option of using neither and getting 70 or so fps, or using ray tracing & DLSS and getting around 55 or so, or just DLSS to get around 90-95.

It will come down to each game, but you can determine for yourself if the performance jump is worthwhile. In Control, I preferred ray tracing and DLSS overall to the other two options. Playing MW, I can't really see any difference ray tracing makes, so I'd rather turn it off, seeing that it is an online FPS.
 