I look at it this way: what is more important, better rasterization per $ or RT per $? Better rasterization helps nearly every game. Better RT, assuming it is even usable and makes a noticeable difference, only helps in RT games. Most Turing RTX owners never or rarely used RT for a number of reasons: very poor performance, they couldn't really tell a difference, or they had to degrade other settings and IQ to use it, giving an overall worse gaming experience. Anyway, I ended up with the conclusion that rasterization/$ is king.
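To make the perf-per-dollar comparison concrete, here's a minimal sketch of that math. All FPS and price figures below are entirely hypothetical placeholders, not real benchmark numbers for any card:

```python
# Hypothetical perf-per-dollar sketch; all numbers are made-up placeholders.
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second delivered per dollar spent."""
    return avg_fps / price_usd

# Placeholder cards: average raster FPS, average RT FPS, and price.
cards = {
    "Card A": {"raster_fps": 120.0, "rt_fps": 60.0, "price": 650.0},
    "Card B": {"raster_fps": 100.0, "rt_fps": 75.0, "price": 700.0},
}

for name, c in cards.items():
    raster = perf_per_dollar(c["raster_fps"], c["price"])
    rt = perf_per_dollar(c["rt_fps"], c["price"])
    print(f"{name}: raster/$ = {raster:.3f}, RT/$ = {rt:.3f}")
```

The point of the sketch: a card can win raster/$ while losing RT/$ (or vice versa), so which metric you weight decides the "winner".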
With the above benchmark, where the driver version etc. are unclear, the 6800 is 33% faster in that RT test, but will that even be significant for RT at all? We need some real games that are optimized for both AMD and Nvidia RT to get a better view.