Jedi Survivor is the best showcase of a looming problem for PC players

I don't understand why people are complaining that DLSS wasn't added. It's an AMD-sponsored game.

I don't think I have ever seen people complain about FSR not being added to a game... lol

FSR is fine IMO if it's done well but DLSS is at least marginally superior to FSR. And apparently FSR is implemented particularly badly in this game. And also with DLSS you can swap out DLLs to improve performance / image quality.
 
I feel like DLSS always being as good or better should make this extremely easy to understand, as well as the other way around that you are talking about.

It is also a title for which DLSS 3 would make a lot of sense (it's heavily CPU-limited).
DLSS is far superior to FSR, no doubt. But bitching about why a feature from the company that didn't sponsor the game isn't in the game is just idiotic IMO.
 
So now we've come to the point where if a game is AMD-sponsored it's going to be bad. Kind of a bad look that just about anything attached to AMD seems to suck.
No, there have been plenty of Nvidia games recently that released in bad shape. The reason this game is getting more press is because it's highly anticipated.
 
I don't understand why people are complaining that DLSS wasn't added. It's an AMD-sponsored game.

I don't think I have ever seen people complain about FSR not being added to a game... lol
Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.
 
DLSS is far superior to FSR, no doubt. But bitching about why a feature from the company that didn't sponsor the game isn't in the game is just idiotic IMO.
Or in this case, actively removed from the game. Modders have already got DLSS 3 added and working on top of FSR 2, but Denuvo is preventing them from adding DLSS 2 in addition to FSR. So hopefully they figure that out shortly.
 
Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.
Exactly this.
 
Or in this case, actively removed from the game. Modders have already got DLSS 3 added and working on top of FSR 2, but Denuvo is preventing them from adding DLSS 2 in addition to FSR. So hopefully they figure that out shortly.
Happens all the time. Look at Atomic Heart... it was supposed to be the latest and greatest Nvidia feature game, only for them to remove parts of it before release.
 
Happens all the time. Look at Atomic Heart... it was supposed to be the latest and greatest Nvidia feature game, only for them to remove parts of it before release.
Never forget Arkham Knight, the NVIDIA-sponsored PhysX poster child that wouldn't work properly with a lot of the PhysX features enabled and lost performance with SLI enabled. Most of the game's issues were fixed about a year after launch, but SLI was never implemented.
 
Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.
It seems like just about every game releases with both DLSS and FSR nowadays, so the lack of either shows some piss-poor direction, which likely came from the C-suite (and possibly a deal with AMD).
 
It seems like just about every game releases with both DLSS and FSR nowadays, so the lack of either shows some piss-poor direction, which likely came from the C-suite (and possibly a deal with AMD).
Probably is. AMD now has money to throw around to keep games/software exclusive. Not saying it's right, but we know Nvidia does it, so I would expect that's what AMD did for Jedi Survivor.

Problem is, the launch wasn't all that great. I can expect performance issues on PCs, since everyone has different hardware. But when consoles are having issues... well, that's bad, because everyone owns the same freaking hardware. That's more on the developer than on AMD, though.
 
TPU has an FSR 2.2 review:

https://www.techpowerup.com/review/star-wars-jedi-survivor-fsr-2-2/

Speaking of performance, Star Wars Jedi: Survivor is a very CPU intensive game, especially with ray tracing enabled, as the CPU usage is mostly single-threaded on PC due to a very poor implementation of Unreal Engine 4 DirectX 12, and high-powered GPUs such as the GeForce RTX 4080 can end up CPU bottlenecked in some sequences of the game at 1440p and below. We've seen these issues before in other recent Unreal Engine 4 games, such as The Callisto Protocol, Hogwarts Legacy or Gotham Knights. At 4K, there is no such issue, and with FSR 2.2 enabled, you can expect around 40% more performance in "Quality" mode with all graphics settings maxed out.

At 4K, FPS increases from 56 FPS to 91 FPS (Quality).
At 1440p and 1080p, there is no increase in FPS.

Curious how they get a 40% increase. 91 from 56 is an increase of 62.5%.
 
Curious how they get a 40% increase. 91 from 56 is an increase of 62.5%.
35 is about 40% of 91 (38.5%). In English and many other languages, talk about increases can be confusing and people can easily mix up the calculation, because something 60% faster one way will be about 40% slower the other way around.
 
35 is about 40% of 91 (38.5%). In English and many other languages, talk about increases can be confusing and people can easily mix up the calculation, because something 60% faster one way will be about 40% slower the other way around.
Indeed. The increase is starting from 56 FPS, so you need to divide the difference by 56. I don't think it's so much an "English" thing as it is people just being bad at math.
 
Oh, you know what's super fucking annoying? The game compiles shaders on every boot-up. FFS.
Yeah, I'm wondering why that's all of a sudden becoming a thing? The only time I've ever had to deal with compiling shaders was when I was running Zelda: Breath of the Wild on PC, which eventually had pre-compiled shader files you could download so you didn't have to worry about it, but by the time all the shaders you needed were compiled it was something in the 3000s, I want to say. But the thing is, if you were playing the game on Nintendo hardware that wasn't even a thing!?

Why, all of a sudden in the last few months, are we getting games that aren't coming with pre-compiled shaders?! Unless it's like someone else said: that way, if you let the game compile shaders for close to 1.5 to 2 hours, by the time you actually get into the game you are past the point of being able to use the Steam refund policy of 2 hours?!
 
also with DLSS you can swap out DLLs to improve performance / image quality
DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands.
Perhaps they should complain when FSR isn't available in a game. I'm not a fan of a game leaning so heavily into one GPU brand or another outside of cases where they are merely supporting a new feature or technology that AMD or NVIDIA doesn't have support for yet. I prefer games to support FSR and DLSS whenever possible.
But FSR isn't proprietary. And I couldn't give a f*** if they support either. We shouldn't have to use upscaling / blurring bullsh** to be able to run games on $3000 computers. That sh**'s a cop-out.
 
DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands.
No. You can replace the DLSS DLLs in the game folder with newer (or older) versions to fine tune the performance and quality of the DLSS implementation in the game. It's become quite common to do and generally works great for games that ship with an old version of DLSS. https://www.techpowerup.com/download/nvidia-dlss-dll/

Unfortunately FSR is hard-coded and can't be modified from outside of the game.
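
For anyone unfamiliar with the swap, here's a rough sketch of what it boils down to (just an illustration, not an official tool; it assumes the game ships its DLSS library as nvngx_dlss.dll somewhere under the install folder, and the paths below are made-up examples, so adjust them for your own setup):

```python
# Rough sketch of a manual DLSS DLL swap (illustrative only, not an official tool).
# Assumes the game's DLSS library is named "nvngx_dlss.dll"; paths are hypothetical examples.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Jedi Survivor")       # hypothetical install location
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLL grabbed from the TPU page above

for target in game_dir.rglob("nvngx_dlss.dll"):
    backup = target.with_name(target.name + ".bak")
    if not backup.exists():
        shutil.copy2(target, backup)             # keep the original so you can roll back
    shutil.copy2(new_dll, target)                # drop in the newer version
    print(f"Replaced {target} (backup at {backup})")
```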
 
Yeah, I'm wondering why that's all of a sudden becoming a thing? The only time I've ever had to deal with compiling shaders was when I was running Zelda: Breath of the Wild on PC, which eventually had pre-compiled shader files you could download so you didn't have to worry about it, but by the time all the shaders you needed were compiled it was something in the 3000s, I want to say. But the thing is, if you were playing the game on Nintendo hardware that wasn't even a thing!?

Why, all of a sudden in the last few months, are we getting games that aren't coming with pre-compiled shaders?! Unless it's like someone else said: that way, if you let the game compile shaders for close to 1.5 to 2 hours, by the time you actually get into the game you are past the point of being able to use the Steam refund policy of 2 hours?!
Compiling the shaders on first launch, after a hardware change, or after a major update is best practice. Compiling on every launch is a holdover from a rapid development model where they expect things to change every damned day, so they just leave it on. It's also a RAM thing: the levels are huge, so they break the shaders into zones from a map, and as you approach a boundary it starts compiling again. That makes sense to a degree, but it should be done on a separate CPU core, not the main ones the game is running on, so that is just bad programming on their side.
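
Just to illustrate the "do it off the main thread" point, here's a toy sketch of the idea (not engine code, just the general pattern: a worker thread compiles shaders for the next zone while the main loop keeps going with whatever is already cached):

```python
# Toy illustration (not engine code): compile shaders for upcoming zones on a
# background thread so the main render loop never blocks on compilation.
import threading
import queue
import time

compile_queue = queue.Queue()
compiled = set()
lock = threading.Lock()

def compile_worker():
    while True:
        shader = compile_queue.get()
        time.sleep(0.05)                      # stand-in for an expensive compile
        with lock:
            compiled.add(shader)
        compile_queue.task_done()

threading.Thread(target=compile_worker, daemon=True).start()

def approach_zone(zone_shaders):
    """Queue the next zone's shaders as the player nears a boundary."""
    for s in zone_shaders:
        with lock:
            if s not in compiled:
                compile_queue.put(s)

# The main loop keeps drawing with whatever has finished compiling so far.
approach_zone(["rock", "grass", "water"])
for frame in range(5):
    with lock:
        ready = sorted(compiled)
    print(f"frame {frame}: rendering with {ready}")
    time.sleep(0.02)
```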
 
Indeed. The increase is starting from 56 FPS, so you need to divide the difference by 56. I don't think it's so much an "English" thing as it is people just being bad at math.
If I have 100 fps, and now I have 130, that is an increase of 30% (130/100-1). 100 * 1.30 = 130

In the Jedi FSR 2.2 example, you start at 56 FPS and now you have 91. 91/56 is 1.625, minus 1 is 0.625, or 62.5%.

How is this possibly an increase of 40%? 56 * 1.4 is 78.4 FPS.
 
No. You can replace the DLSS DLLs in the game folder with newer (or older) versions to fine tune the performance and quality of the DLSS implementation in the game. It's become quite common to do and generally works great for games that ship with an old version of DLSS. https://www.techpowerup.com/download/nvidia-dlss-dll/

Unfortunately FSR is hard-coded and can't be modified from outside of the game.
Newer versions of GeForce Experience will also detect any games with older versions and update them as well.
 
If I have 100 fps, and now I have 130, that is an increase of 30% (130/100-1). 100 * 1.30 = 130

In the Jedi FSR 2.2 example, you start at 56 FPS and now you have 91. 91/56 is 1.625, minus 1 is 0.625, or 62.5%.

How is this possibly an increase of 40%? 56 * 1.4 is 78.4 FPS.
I think this is more a case where their example, where yes, they are getting a 60+% performance increase, is outside the norm. With their setup that is what they get, but their hardware falls outside the curve; more "normal" systems out there can expect to see something in the 40% range.
At least that is how I interpreted the article.
 
If I have 100 fps, and now I have 130, that is an increase of 30% (130/100-1). 100 * 1.30 = 130

In the Jedi FSR 2.2 example, you start at 56 FPS and now you have 91. 91/56 is 1.625, minus 1 is 0.625, or 62.5%.

How is this possibly an increase of 40%? 56 * 1.4 is 78.4 FPS.
We're both saying the same thing. It's a simple formula that can be written many ways. Stating it the way I did just simplifies the language, but the way I describe would be written as (91 - 56) / 56.
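
Put as a quick snippet with the 4K "Quality" numbers quoted from the review above, both directions of the math look like this:

```python
# The two ways people end up stating it, using the figures quoted above
baseline = 56   # FPS without upscaling
with_fsr = 91   # FPS with FSR 2.2 Quality

increase = (with_fsr - baseline) / baseline   # 0.625 -> FSR is 62.5% faster
decrease = (with_fsr - baseline) / with_fsr   # ~0.385 -> native is ~38.5% slower

print(f"{increase:.1%} faster with FSR, {decrease:.1%} slower without it")
```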
 
DLLs? You mean drivers? So you're saying you can update your drivers to improve performance/quality? Because you can do that with all brands.

But FSR isn't proprietary. And I couldn't give a f*** if they support either. We shouldn't have to use upscaling / blurring bullsh** to be able to run games on $3000 computers. That sh**'s a cop-out.
FSR and DLSS are often needed at 4K with ray tracing enabled. It's not a cop-out when the technology to push modern games at 60 FPS+ with full ray tracing features enabled doesn't exist. People seem to think there is some magic optimization developers could do in order to make that happen, but that's simply not the case. Obviously, this game can run better than it does and we've seen evidence of that. Not only could it implement DLSS and FSR 2.2, but it could also be coded to behave properly with E-cores on Intel's 12th and 13th generation CPUs.

And if developers didn't bring us games that had visuals that stressed modern hardware there would be no drive to upgrade.
 
Fascinating that they would gimp 85 percent of the GPU market unless... there was some reason I can't quite... grasp... ;)
I'm pretty sure everyone's struggling to run this, from how it sounds. And it sounds like it doesn't even matter what OS you run it on, it runs like crap. Hardware Unboxed said they're gonna do an in-depth CPU/GPU benchmark once EA gets their sh** straightened out.

And with the "compiling shaders" thing: that, to me, sounds like a new low, even for EA. Why would anyone think "because it runs badly on all systems, that means it's AMD sabotage"? I would think AMD would want all these games running well on all their hardware, not only to build their brand, but to keep things running well on the consoles, because those are a couple of really big customers.
 
I'm pretty sure everyone's struggling to run this, from how it sounds. And it sounds like it doesn't even matter what OS you run it on, it runs like crap. Hardware Unboxed said they're gonna do an in-depth CPU/GPU benchmark once EA gets their sh** straightened out.

And with the "compiling shaders" thing: that, to me, sounds like a new low, even for EA. Why would anyone think "because it runs badly on all systems, that means it's AMD sabotage"? I would think AMD would want all these games running well on all their hardware, not only to build their brand, but to keep things running well on the consoles, because those are a couple of really big customers.

You would think, but as we've seen over the years/decades, 'execution of plans' isn't really the part AMD does best.
 
Man, a whole lotta whining in this thread about Nvidia features not being implemented in the game. Guess what: vote with your wallet then and don't buy the game. Otherwise you made your choice and you need to live with it; clearly they were right and you would buy the game anyway without them spending dollars on implementing it. Pretty sure they would notice if most Nvidia video card owners did not buy the game. Of course, then you would have to realize most of those Nvidia cards are not even capable of running DLSS, let alone ray tracing, as they are way too weak to do so, or it isn't supported on the older hardware.
 
Man, a whole lotta whining in this thread about Nvidia features not being implemented in the game. Guess what: vote with your wallet then and don't buy the game. Otherwise you made your choice and you need to live with it; clearly they were right and you would buy the game anyway without them spending dollars on implementing it. Pretty sure they would notice if most Nvidia video card owners did not buy the game. Of course, then you would have to realize most of those Nvidia cards are not even capable of running DLSS, let alone ray tracing, as they are way too weak to do so, or it isn't supported on the older hardware.
They actually spent dollars to take it out.
But the biggest gripe the community has is that this game uses the same engine as the previous one, yet launched with all the same bugs and flaws the first one had that they then spent six months patching out.
This game literally repeats all the same mistakes of the first, only worse, as though they didn't learn a single thing the first time. It shares the same texture loading issues, the same shitty menu problems, the same TAA bugs, the same hardcoded INI settings, the same audio glitches, and the same resource and CPU-thread problems, and they are blaming all the exact same things. Only back then it was people using new hardware with Windows 7 when we should have been using Windows 10 for the best experience.
 
Still waiting for some proof on that. I think you mentioned "digging up a reddit post" about it and that was it.
You literally have to work to remove DLSS from Unreal Engine after version 4.26; it is cooked in there.
I was trying to find proof that removing DLSS is a requirement of the AMD sponsorship, and all I can find is anecdotal, nothing concrete: lots of people on Reddit saying it is a requirement for AMD sponsorship, but digging through it, it's all the same "well, this is an AMD-sponsored game and it's not there, so AMD must have paid to remove it."
The modding community is pretty adamant that the only reason they haven't gotten it in there already is that Denuvo is keeping them out, so they have to get past that first.
 
Compiling the shaders on first launch, after a hardware change, or after a major update is best practice. Compiling on every launch is a holdover from a rapid development model where they expect things to change every damned day, so they just leave it on. It's also a RAM thing: the levels are huge, so they break the shaders into zones from a map, and as you approach a boundary it starts compiling again. That makes sense to a degree, but it should be done on a separate CPU core, not the main ones the game is running on, so that is just bad programming on their side.

The engine had two issues. It always compiled the shader at draw time, I think, so you could load a model... and still hitch once it actually needed to get rendered the first time. And second, it always stalled to do this. It didn't matter if it was on another thread; the engine won't skip drawing if a shader isn't ready, it just waits every time and you get the stutter.

The latest UE5 finally makes this an option, where it can just not render if it's not ready, and you maybe get a pop-in a few frames later instead of the stutter.
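
A crude way to picture the difference (purely illustrative, not actual UE code): the old behaviour blocks the frame until the compile finishes, while the new option just draws without that object and lets it pop in later.

```python
# Purely illustrative, not Unreal code: "stall" blocks the frame on a shader
# compile (the hitch), while "skip" draws the frame without that object instead.
import time

def draw_object(obj, shaders_ready, mode="stall"):
    if obj not in shaders_ready:
        if mode == "stall":
            time.sleep(0.2)                 # frame waits for the compile -> visible hitch
            shaders_ready.add(obj)
        else:                               # "skip": leave the object out this frame
            return f"frame drawn without {obj} (pops in once compiled)"
    return f"frame drawn with {obj}"

print(draw_object("new_enemy", set(), mode="stall"))  # stutters while compiling
print(draw_object("new_enemy", set(), mode="skip"))   # no stutter, object appears later
```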
 