Brackle · Old Timer · Joined Jun 19, 2003 · 8,568 messages
> Because DLSS is good, FSR isn't

Then don't play an AMD-sponsored game.
I don't understand why people are complaining that DLSS wasn't added. It's an AMD-sponsored game. I don't think I have ever seen people complain about FSR not being added to a game... lol
> I feel like DLSS being always as good or better should make this extremely easy to understand. As well as the other way around you are talking about.

DLSS is far superior to FSR, no doubt. But complaining about why a feature from a company that didn't sponsor the game isn't in the game is just idiotic, imo.
It is also a title for which DLSS 3 frame generation would make a lot of sense, since it is heavily CPU limited.
> Then don't play an AMD-sponsored game.

So now we've come to the point where if a game is AMD-sponsored it's going to be bad. Kind of a bad look that just about anything attached to AMD sucks, and it just helps to further cement NV as the "good" brand.
> Because DLSS is superior to FSR. And apparently FSR is implemented badly in this game. And also with DLSS you can swap out DLLs to improve performance / image quality.

It is, but this isn't an Nvidia-sponsored game. People who expected DLSS to be in the game are going a little overboard.
> So now we've come to the point where if a game is AMD-sponsored it's going to be bad. Kind of a bad look that just about anything attached to AMD sucks.

No, there have been plenty of Nvidia games recently that released in bad shape. The reason this game is getting more press is that it's highly anticipated.
> I don't understand why people are complaining that DLSS wasn't added. It's an AMD-sponsored game. I don't think I have ever seen people complain about FSR not being added to a game... lol

Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.
> DLSS is far superior to FSR, no doubt. But complaining about why a feature from a company that didn't sponsor the game isn't in the game is just idiotic, imo.

Or in this case, actively removed from the game. Modders have already got DLSS 3 added and working on top of FSR 2, but Denuvo is preventing them from adding DLSS 2 in addition to FSR. So hopefully they figure that out shortly.
> Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.

Exactly this.
> Or in this case, actively removed from the game. Modders have already got DLSS 3 added and working on top of FSR 2, but Denuvo is preventing them from adding DLSS 2 in addition to FSR.

Happens all the time. Look at Atomic Heart. It was supposed to be the latest and greatest Nvidia feature game, only for them to remove parts of it before release.
> Happens all the time. Look at Atomic Heart. It was supposed to be the latest and greatest Nvidia feature game, only for them to remove parts of it before release.

Never forget Arkham Knight, the NVIDIA-sponsored PhysX poster child that wouldn't work properly with a lot of the PhysX features enabled and lost performance with SLI enabled. Most of the game's issues were fixed about a year after launch, but SLI support never was implemented.
> Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.

It seems like just about every game releases with both DLSS and FSR nowadays, so the lack of either shows some piss-poor direction, which likely came from the C-suite (and possibly a deal with AMD).
> It seems like just about every game releases with both DLSS and FSR nowadays, so the lack of either shows some piss-poor direction, which likely came from the C-suite (and possibly a deal with AMD).

Probably is. AMD now has money to throw around to keep games/software exclusive. Not saying it's right, but we know Nvidia does it, so I would expect that's what AMD did for Jedi: Survivor.
Speaking of performance, Star Wars Jedi: Survivor is a very CPU-intensive game, especially with ray tracing enabled, as the CPU usage is mostly single-threaded on PC due to a very poor DirectX 12 implementation in Unreal Engine 4, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game at 1440p and below. We've seen these issues before in other recent Unreal Engine 4 games, such as The Callisto Protocol, Hogwarts Legacy, or Gotham Knights. At 4K, there is no such issue, and with FSR 2.2 enabled, you can expect around 40% more performance in "Quality" mode with all graphics settings maxed out.
> Curious how they get a 40% increase. 91 from 56 is an increase of 62.5%.

35 is about 40% of 91 (38.5%). In English and many other languages, talking about increases can be confusing, and people easily mix up the calculation, because something 60% faster one way will be about 40% slower the other way around.
> 35 is about 40% of 91 (38.5%). In English and many other languages, talking about increases can be confusing, and people easily mix up the calculation, because something 60% faster one way will be about 40% slower the other way around.

Indeed. The increase is starting from 56 FPS, so you need to divide the difference by 56. I don't think it's so much an "English" thing as it is people just being bad at math.
> oh you know what's super fucking annoying? the game compiles shaders every boot up. ffs

Yeah, I'm wondering why that's all of a sudden becoming a thing. The only time I've ever had to deal with compiling shaders was running Zelda: Breath of the Wild on PC, which eventually had shader-compile files you could download so you didn't have to worry about it, but by the time all the shaders you needed were compiled it was something in the 3000s, I want to say. But the thing is, if you were playing the game on Nintendo hardware that wasn't even a thing!?
> also with DLSS you can swap out DLLs to improve performance / image quality

DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands.
> Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.

But FSR isn't proprietary. Then again, I couldn't give a f*** if they support either. We shouldn't have to use upscaling/blurring bullsh** to be able to run games on $3000 computers. That sh**'s a cop-out.
> DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands.

No. You can replace the DLSS DLLs in the game folder with newer (or older) versions to fine-tune the performance and quality of the DLSS implementation in the game. It's become quite common to do and generally works great for games that ship with an old version of DLSS. https://www.techpowerup.com/download/nvidia-dlss-dll/
> Yeah, I'm wondering why that's all of a sudden becoming a thing. The only time I've ever had to deal with compiling shaders was running Zelda: Breath of the Wild on PC [...] if you were playing the game on Nintendo hardware that wasn't even a thing!?

Compiling the shaders on first launch, after a hardware change, or after a major update is best practice. Compiling on every launch is a holdover from a rapid development model where they expect things to change every damned day, so they just leave it on. It is also a RAM thing: the levels are huge, so they try to break the shaders down into zones from a map, and as you approach boundaries it starts compiling again. That makes sense to a degree, but it should be done on a separate CPU core, not the main ones the game is running on, so that part is just bad programming on their side.
Why all of a sudden in the last few months are we getting games that aren't coming with pre-compiled shaders? Unless it's like someone else said: if you let the game compile shaders for close to 1.5 to 2 hours, by the time you actually get to playing you are past the point of being able to use Steam's 2-hour refund policy?
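The "separate core" complaint above boils down to a standard pattern: submit shader compilation to a background worker and fall back to a placeholder until the result is ready, instead of stalling the thread the game runs on. A toy sketch of that idea (the shader names and the fake compile step are purely illustrative, not any engine's real API):

```python
from concurrent.futures import ThreadPoolExecutor

def compile_shader(name):
    # Stand-in for an expensive driver compile call.
    return f"{name}:binary"

class ShaderCache:
    def __init__(self, workers=2):
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.ready = {}    # name -> compiled shader
        self.pending = {}  # name -> Future still compiling

    def request(self, name):
        # Kick off compilation in the background; never block the caller.
        if name not in self.ready and name not in self.pending:
            self.pending[name] = self.pool.submit(compile_shader, name)

    def get(self, name):
        # Return the compiled shader if ready; None means "draw with a
        # fallback shader this frame" instead of hitching the game thread.
        fut = self.pending.get(name)
        if fut is not None and fut.done():
            self.ready[name] = fut.result()
            del self.pending[name]
        return self.ready.get(name)

cache = ShaderCache()
cache.request("zone_boundary_rock")  # fired as the player approaches a zone
```

Zone-boundary compilation as described above would just call `request()` early, while the player is still far away, so `get()` is ready by the time the shader is drawn.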
> Indeed. The increase is starting from 56 FPS, so you need to divide the difference by 56. I don't think it's so much an "English" thing as it is people just being bad at math.

If I have 100 fps and now I have 130, that is an increase of 30% (130/100 − 1). 100 × 1.30 = 130.
> No. You can replace the DLSS DLLs in the game folder with newer (or older) versions to fine-tune the performance and quality of the DLSS implementation in the game. It's become quite common to do and generally works great for games that ship with an old version of DLSS.

Newer versions of GeForce Experience will also detect any games with older versions and update them as well.
Unfortunately FSR is hard-coded and can't be modified from outside of the game.
> How is this possibly an increase of 40%? 56 × 1.4 is 78.4 fps.

For the reason, see the message that the post you quoted was itself quoting.
> If I have 100 fps and now I have 130, that is an increase of 30% (130/100 − 1). 100 × 1.30 = 130. In the Jedi FSR 2.2 example, you start at 56 fps and now you have 91. 91/56 is 1.625; minus 1, that's 62.5%. How is this possibly an increase of 40%? 56 × 1.4 is 78.4 fps.

I think this is more a case where their example, where yes they are getting a 60+% performance increase, is outside the norm. With their setup that is what they get, but their hardware falls outside the curve; more "normal" systems out there can expect to see something in the 40% range.
> If I have 100 fps and now I have 130, that is an increase of 30% (130/100 − 1). 100 × 1.30 = 130. In the Jedi FSR 2.2 example, you start at 56 fps and now you have 91. 91/56 is 1.625; minus 1, that's 62.5%. How is this possibly an increase of 40%? 56 × 1.4 is 78.4 fps.

We're both saying the same thing. It's a simple formula that can be written many ways. Stating it the way I did just simplifies the language, but the way I describe it would be written as (91 − 56) / 56.
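The back-and-forth above is easy to check directly: a relative increase is always measured against the starting value, which is why the same 35 fps gap is +62.5% in one direction but only about −38.5% in the other. A quick check using the thread's own numbers:

```python
def percent_increase(old, new):
    """Relative change from old to new, as a percentage of the starting value."""
    return (new - old) / old * 100

# Jedi: Survivor FSR 2.2 example from the thread: 56 fps -> 91 fps
print(round(percent_increase(56, 91), 1))   # 62.5

# The reverse direction uses the other baseline: 91 fps -> 56 fps
print(round(percent_increase(91, 56), 1))   # -38.5

# A true 40% increase from 56 fps would only reach 78.4 fps
print(round(56 * 1.4, 1))                   # 78.4
```

So TechPowerUp's "around 40%" figure is not what these two data points show; as noted above, it may describe more typical systems rather than this particular 56 → 91 result.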
> DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands. But FSR isn't proprietary. Then again, I couldn't give a f*** if they support either. We shouldn't have to use upscaling/blurring bullsh** to be able to run games on $3000 computers. That sh**'s a cop-out.

FSR and DLSS are often needed at 4K with ray tracing enabled. It's not a cop-out when the technology to push modern games at 60+ FPS with full ray tracing features enabled doesn't exist. People seem to think there is some magic optimization developers could do to make that happen, but that's simply not the case. Obviously, this game can run better than it does, and we've seen evidence of that. Not only could it implement DLSS and FSR 2.2, but it could also be coded to behave properly with E-cores on Intel's 12th and 13th generation CPUs.
> So the areas where it struggles the most are newer Intel processors with E-cores, and it doesn't make good use of NVIDIA GPUs. Interesting.

Fascinating that they would gimp 85 percent of the GPU market unless... there was some reason I can't quite... grasp...
> Fascinating that they would gimp 85 percent of the GPU market unless... there was some reason I can't quite... grasp...

I'm pretty sure everyone's struggling to run this, from how it sounds. And it sounds like it doesn't even matter what OS you run it on; it runs like crap. Hardware Unboxed said they're going to do an in-depth CPU/GPU benchmark once EA gets their sh** straightened out.
And the "compiling shaders" thing, to me, sounds like a new low, even for EA. Why would anyone think "because it runs bad on all systems, that means it's AMD sabotage"? I would think AMD would want all these games running well on all their hardware, not only to build their brand, but to keep things running well on consoles, because those are a couple of really big customers.
> Man, a whole lotta whining in this thread about Nvidia features not being implemented in the game. Guess what: vote with your wallet then and don't buy the game. Otherwise you made your choice and you need to live with it. Clearly they were right: you would buy the game anyway without them spending dollars on implementing it. Pretty sure they would notice if most Nvidia video card owners did not buy the game. Of course, then you would have to realize most of those Nvidia cards are not even capable of running DLSS, let alone ray tracing, as they are way too weak to do so, or are not supported on the older hardware.

They actually spent dollars to take it out.
> They actually spent dollars to take it out.

Still waiting for some proof on that. I think you mentioned "digging up a reddit post" about it, and that was it.
> Still waiting for some proof on that. I think you mentioned "digging up a reddit post" about it, and that was it.

You literally have to work to remove DLSS from Unreal Engine after version 4.26; it is baked in there.