Jedi Survivor is the best showcase of a looming problem for PC players

AMD is obviously sponsoring these games and getting developers to purposely use more VRAM because they know that most Nvidia mainstream cards don't have more than 12GB VRAM...it's something that will take a few months to fix for Nvidia owners and in that time AMD will prosper...the first few months of release are always the most important in terms of buzz
 
That's pretty bad tbh. My wife was playing Hogwarts on her PC and I was like wtf, why does it look like slow motion? At some point her display had been set to 30Hz. "Slide show" is a total exaggeration, but 30 fps does just look slower. It's why I find TV content at 60 fps so... uncanny. It looks like it moves way too fast when you're so used to standard film fps.
It won't kill you to adjust to it; I know when FPS drops below 30 I can totally see that. It's not a perfect solution, it's just a means to ignore some of the issues and get into the game. I always thought the odd TV content was due to TVs running that simulated 120Hz motion smoothing more than it being native 60.
 
AMD is obviously sponsoring these games and getting developers to purposely use more VRAM because they know that most Nvidia mainstream cards don't have more than 12GB VRAM
Maybe, but that argument doesn't hold much merit when people are complaining about their 24GB 4090s. I agree it could hamper Nvidia cards, though, and I see why you say that.
 
It won't kill you to adjust to it; I know when FPS drops below 30 I can totally see that. It's not a perfect solution, it's just a means to ignore some of the issues and get into the game. I always thought the odd TV content was due to TVs running that simulated 120Hz motion smoothing more than it being native 60.
No one (with decent specs) should have to drop their refresh rate to 30Hz. Insta-refund if performance is that bad imo.
 
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I am bummed because I was looking forward to this one.
Don't you think both AMD and Nvidia want games to get more demanding so we buy more graphics cards? Not a new tactic. If it's malicious, it's in favor of both AMD and Nvidia.
TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR 2; apparently part of the AMD sponsorship requires DLSS not to be present
Ray-Tracing features have been removed or limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB of RAM basically required for smooth texture streaming
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.
I don't see a problem with this. AMD sponsored the game and got features that favor AMD. Nvidia does this all the time, so now you know how AMD users feel when Nvidia sponsors a game and it runs like crap on AMD cards.
  • 4090 doesn't mean you automatically get the best frame rate in every game.
  • Having DLSS would be nice but again AMD sponsored it and this is what you get. Be glad you can use FSR
  • Why do you want Ray-Tracing when the game runs this badly?
  • Get a bigger SSD or buy a HDD?
  • Get more RAM? If you can afford a 4090, then 32GB of RAM is a drop in the bucket.
Games are starting to get more demanding, and while this isn't a good PC port, this seems to be the way things are going. I'm surprised you didn't mention the VRAM issue, as this game demands a lot of VRAM as well. It lists the RX 580 as the minimum, with a GTX 1070 as well, so it clearly favors AMD. What do you expect when Nvidia puts 8GB of VRAM in a lot of their cards, which is the same amount as an R9 390 from 2015?
 
Don't you think both AMD and Nvidia want games to get more demanding so we buy more graphics cards? Not a new tactic. If it's malicious, it's in favor of both AMD and Nvidia.

I don't see a problem with this. AMD sponsored the game and got features that favor AMD. Nvidia does this all the time, so now you know how AMD users feel when Nvidia sponsors a game and it runs like crap on AMD cards.
  • 4090 doesn't mean you automatically get the best frame rate in every game.
  • Having DLSS would be nice but again AMD sponsored it and this is what you get. Be glad you can use FSR
  • Why do you want Ray-Tracing when the game runs this badly?
  • Get a bigger SSD or buy a HDD?
  • Get more RAM? If you can afford a 4090, then 32GB of RAM is a drop in the bucket.
Games are starting to get more demanding, and while this isn't a good PC port, this seems to be the way things are going. I'm surprised you didn't mention the VRAM issue, as this game demands a lot of VRAM as well. It lists the RX 580 as the minimum, with a GTX 1070 as well, so it clearly favors AMD. What do you expect when Nvidia puts 8GB of VRAM in a lot of their cards, which is the same amount as an R9 390 from 2015?
An 8GB RX 5700 XT outperforms the 4090 by a significant amount, so VRAM isn't the problem.
 
FSR shouldn't even be a thing with how it performs. It's more of a placebo thing to stall adoption of DLSS while AMD's actual DLSS competitor is still in development.
 
has it been confirmed that the ray tracing features have been removed (PC)?...if so is it coming later in a post launch patch?...same as Atomic Heart?
 
Almost seems like development should be done on the high-end platform and dumbed down from there, rather than scaled up. But what do I know lol

This is the issue... as far as texture space and texture loading capabilities go, the PS5 IS the superior platform.
If we want to have equal performance on PC in the future... a big part of the solution is 24GB GPUs. YES, you need a 24GB GPU to compete with a 16GB PS5... the PS5 has dedicated texture decompression hardware and can move textures efficiently enough that the only way PCs are going to compete in that regard is to simply load an extra 8GB of textures early.

Also, where did anyone read that DLSS or RT being minimized was per AMD? lol. FSR makes sense for developers to focus on... it works on everything, including Nvidia. I hate to break it to people, but if Nvidia isn't PAYING you to implement DLSS, why would you bother?
 
AMD is obviously sponsoring these games and getting developers to purposely use more VRAM because they know that most Nvidia mainstream cards don't have more than 12GB VRAM...it's something that will take a few months to fix for Nvidia owners and in that time AMD will prosper...the first few months of release are always the most important in terms of buzz

OR, crazy idea: developers are finally using the superior texture moving and dedicated texture decompression chips in the PS5. PC GPUs are going to pretty much require 24GB going forward for full-on Ultra textures. For High quality, 16GB is probably going to be the absolute minimum, and it still won't perform as well as a PS5 in that regard at least.

This last gen of consoles has some very powerful texture hardware baked in... and it's just now starting to get used. Nvidia also isn't stupid; they have known that for some time... perhaps they figured they could throw enough money around in the game development world to keep developers from going hard on their use of those capabilities. 14-16GB of texture RAM on a PS5 is probably approximately equal to 24GB on a PC. The PC will also have to preload, while the PS5 can stream textures back and forth with pretty minimal impact.
 
has it been confirmed that the ray tracing features have been removed (PC)?...if so is it coming later in a post launch patch?...same as Atomic Heart?

I read that RT is a part of the game at launch...but there's only an On/Off toggle for RT...so it doesn't even list the RT features they are using...AMD almost always ships a half-assed, inferior version of RT because they know their cards can't handle it...so they nerf RT and make sure that VRAM usage is close to 16GB to make their hardware look better
 
Well, I can set my AMD graphics card to limit fps to 30, just did that. No Hz alteration necessary

Funny you say that about the slide show. My cousin plays shooters online competitively at 144Hz. Had him over when Star Wars Squadrons released and had the game running at 30Hz on a 4K display in my front room on a GTX 1080; he couldn't tell the difference even when he moved to my main rig at the time with its 2080 Ti at 60 fps.

It's all drama with the high Hz gamers all the time
I'm not sure how anyone can argue 144Hz vs. 30Hz these days... if you game all the time at 144Hz, you will absolutely notice 30Hz. If I set my monitor to 60Hz from 144Hz, it feels slow as hell, and that's even with G-Sync. I'm using Hz and FPS interchangeably here because I can drive the FPS up to 144 easily.

I'm not even talking competitively; it feels absolutely sluggish and slow in most games. Can you get "used" to it? Sure, I played CP2077 at release on a 2080 Ti at 4K and averaged 45 FPS half the time; I got used to it and enjoyed the game. But I've been playing far too long at 144 to go back to sub-80 FPS, which is about my cutoff for enjoyment and a smooth feeling.

Same with Nvidia too, BTW: you can set frame caps per game really easily, and even match the refresh rate if you want with a G-Sync + V-Sync combo.
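For what it's worth, under the hood a frame cap is nothing exotic: render a frame, then wait out the rest of the frame budget. Driver-level caps (AMD's and Nvidia's per-game limiters) pace this more intelligently, but a minimal in-application sketch, with a hypothetical render_frame callback, looks roughly like this:

```cpp
// Minimal sketch of what a frame cap does under the hood: render, then sleep
// until the next frame deadline. Driver caps are smarter about where in the
// pipeline they pace, but the budget math is the same.
#include <chrono>
#include <thread>

void run_capped_loop(double target_fps, bool (*render_frame)())
{
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::duration<double>(1.0 / target_fps);

    auto next_deadline = clock::now() + frame_budget;
    while (render_frame())                            // draw one frame
    {
        std::this_thread::sleep_until(next_deadline); // burn the leftover frame time
        next_deadline += frame_budget;                // schedule the next frame
    }
}
```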
 
Citation on that one? Again, wasn't in the articles posted.
That might be a translation bug; it's from the article these ones are talking about, from the German review site. It says the game wasn't a problem for their 5700, but that may be the CPU.
https://www.gamestar.de/spiele/star-wars-jedi-survivor,15448.html

https://www.dsogaming.com/news/star...major-cpu-and-vram-optimization-issues-on-pc/

This site says the console reviews aren't any better; it's a stuttering mess there too.

https://www.techpowerup.com/307861/star-wars-jedi-survivor-has-major-cpu-and-gpu-issues#comments

This one says that GPU utilization was always 35-60%.
 
This is the issue... as far as texture space and texture loading capabilities go, the PS5 IS the superior platform.
If we want to have equal performance on PC in the future... a big part of the solution is 24GB GPUs. YES, you need a 24GB GPU to compete with a 16GB PS5... the PS5 has dedicated texture decompression hardware and can move textures efficiently enough that the only way PCs are going to compete in that regard is to simply load an extra 8GB of textures early.

Also, where did anyone read that DLSS or RT being minimized was per AMD? lol. FSR makes sense for developers to focus on... it works on everything, including Nvidia. I hate to break it to people, but if Nvidia isn't PAYING you to implement DLSS, why would you bother?
In this case they took features that are already part of UE4 and spent time, money, and resources to disable and remove them.

And no, it's not using the Kraken textures. Looks like basic block compression.
 
An 8GB RX 5700 XT outperforms the 4090 by a significant amount, so VRAM isn't the problem.

What are you comparing there, though... are you comparing a 5700 XT with ultra textures turned on? I assume that probably isn't even possible. Or did you see someone post a 5700 XT at medium settings vs a 4090 at ultra or something?
 
What are you comparing there, though... are you comparing a 5700 XT with ultra textures turned on? I assume that probably isn't even possible. Or did you see someone post a 5700 XT at medium settings vs a 4090 at ultra or something?
I think it was a translation thing; they were referring to the CPU being a 5700, not the GPU. The original article about the bad performance is all from gamestar.de, and I think Google Translate got the better of me there.
 
In this case they took features that are already part of UE4 and spent time, money, and resources to disable and remove them.

RT is a feature of the engine, but you don't just turn it on... developing for it still requires the developer to do the work. It's not like RT is zero developer input or something. DLSS, again, may be supported by Unreal 4... but it's not automatic.

I know this might be a shock to people... a great many developers DON'T like DLSS. It's not that it's bad tech... it just needs to be implemented for a specific company's hardware for one market. FSR is easier to implement and works for consoles and PC regardless of the GPU installed (including not just Nvidia but also Intel).

I don't see them doing anything here that Nvidia hasn't done x100 in the past. There was never a reason for developers to not include FSR either... but they would often skip the day of work and zero capital investment needed because they were "The Way It's Meant to Be Played" money takers. All they did here was refuse to spend the weeks it would take to implement DLSS... which is galling when FSR works with Nvidia hardware just fine.
 
  • Why do you want Ray-Tracing when the game runs this badly?
The game has RT, and the reflections seem completely broken if it is off (the game probably assumes it would be on).

I know this might be a shock to people... a great many developers DON'T like DLSS. It's not that it's bad tech... it just needs to be implemented for a specific company's hardware for one market. FSR is easier to implement and works for consoles and PC regardless of the GPU installed (including not just Nvidia but also Intel).
How much easier is FSR 2.x (not 1) to implement than XeSS or DLSS 2.x? Exactly the same inputs are required for all of them; that distinction was from the FSR 1 era.

So much so that there are community-made mods to use FSR 2.0 in DLSS 2 titles and vice versa.
 
RT is a feature of the engine, but you don't just turn it on... developing for it still requires the developer to do the work. It's not like RT is zero developer input or something. DLSS, again, may be supported by Unreal 4... but it's not automatic.

I know this might be a shock to people... a great many developers DON'T like DLSS. It's not that it's bad tech... it just needs to be implemented for a specific company's hardware for one market. FSR is easier to implement and works for consoles and PC regardless of the GPU installed (including not just Nvidia but also Intel).

I don't see them doing anything here that Nvidia hasn't done x100 in the past. There was never a reason for developers to not include FSR either... but they would often skip the day of work and zero capital investment needed because they were "The Way It's Meant to Be Played" money takers. All they did here was refuse to spend the weeks it would take to implement DLSS... which is galling when FSR works with Nvidia hardware just fine.
I thought it was in the first one, but it's a fan mod that turns it on, as well as fixing the audio and cutscene problems and implementing the full RT effects.
 
The game has RT, and the reflections seem completely broken if it is off (the game probably assumes it would be on).


How much easier is FSR 2.x (not 1) to implement than XeSS or DLSS 2.x? Exactly the same inputs are required for all of them; that distinction was from the FSR 1 era.
Well, I don't know about FSR, but DLSS is native in UE 4.26 onward.
It's a checkbox.

Edit: apparently you need to unzip it into the engine's plugins folder for the checkbox to appear. YouTube tutorials on how to add DLSS 2 to your project are abundant; most are 5 minutes or less.

It has the exact same install process as FSR 2.
https://gpuopen.com/learn/ue-fsr2/
 
OR, crazy idea: developers are finally using the superior texture moving and dedicated texture decompression chips in the PS5. PC GPUs are going to pretty much require 24GB going forward for full-on Ultra textures. For High quality, 16GB is probably going to be the absolute minimum, and it still won't perform as well as a PS5 in that regard at least.

This last gen of consoles has some very powerful texture hardware baked in... and it's just now starting to get used. Nvidia also isn't stupid; they have known that for some time... perhaps they figured they could throw enough money around in the game development world to keep developers from going hard on their use of those capabilities. 14-16GB of texture RAM on a PS5 is probably approximately equal to 24GB on a PC. The PC will also have to preload, while the PS5 can stream textures back and forth with pretty minimal impact.
Nothing wrong here of course, but another perspective: it is possible to do a much better job of working-set management.
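To put some meat on "working-set management": at its core it is just keeping textures under a VRAM budget and evicting whatever hasn't been touched recently, streaming it back in on demand. A minimal sketch (hypothetical names, not from any actual engine; real streamers track per-mip residency rather than whole textures) could look like this:

```cpp
// Hypothetical LRU residency cache: keep textures under a VRAM budget by
// evicting whatever was used least recently. Real engines do this per mip
// level and stream data back in asynchronously, but the bookkeeping is similar.
#include <cstdint>
#include <list>
#include <unordered_map>

class TextureResidencyCache {
public:
    explicit TextureResidencyCache(uint64_t budget_bytes) : budget_(budget_bytes) {}

    // Called whenever a texture is needed this frame.
    void Touch(uint32_t texture_id, uint64_t size_bytes) {
        auto it = lookup_.find(texture_id);
        if (it != lookup_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // already resident: move to front
            return;
        }
        lru_.push_front({texture_id, size_bytes});        // newly resident
        lookup_[texture_id] = lru_.begin();
        used_ += size_bytes;
        EvictIfNeeded();
    }

private:
    struct Entry { uint32_t id; uint64_t size; };

    void EvictIfNeeded() {
        while (used_ > budget_ && !lru_.empty()) {
            const Entry& victim = lru_.back();            // least recently used texture
            used_ -= victim.size;
            lookup_.erase(victim.id);
            lru_.pop_back();                              // a real engine would free the GPU memory here
        }
    }

    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<Entry> lru_;
    std::unordered_map<uint32_t, std::list<Entry>::iterator> lookup_;
};
```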
 
How much easier is FSR 2.x (not 1) to implement than XeSS or DLSS 2.x? Exactly the same inputs are required for all of them; that distinction was from the FSR 1 era.
If you have two competing standards that do the same thing, but one only works for one particular brand, and in some cases for the latest product of that brand, then it's more than twice as much work.
So much so that there are community-made mods to use FSR 2.0 in DLSS 2 titles and vice versa.
That's because Nvidia also sponsored games where they wanted only DLSS.
An 8GB RX 5700 XT outperforms the 4090 by a significant amount, so VRAM isn't the problem.
I cannot find any benchmarks to back this up. There will be benchmarks, but none that I can find right now.
OR, crazy idea: developers are finally using the superior texture moving and dedicated texture decompression chips in the PS5. PC GPUs are going to pretty much require 24GB going forward for full-on Ultra textures. For High quality, 16GB is probably going to be the absolute minimum, and it still won't perform as well as a PS5 in that regard at least.
You can take that crazy idea and throw it out. Jedi Survivor runs like crap on the PS5 too; it's just that PC gamers are upset that they have to play a game at 30fps, while the PS5 plays it at 24fps and console gamers think that's fine.
This last gen of consoles has some very powerful texture hardware baked in...
It's the same hardware that's in AMD's RX 6000 GPUs. Nothing to see here, move along.
 
If you have two competing standards that do the same thing, but one only works for one particular brand, and in some cases for the latest product of that brand, then it's more than twice as much work.
Not sure I follow you. I make my game engine (or my code that uses Unity/Unreal) feed FSR 2.x the current frame buffer, the previous frame buffer, and the motion vectors.

Now if I want to support XeSS/DLSS, which want the same current buffer, previous frame buffer, and motion vectors, most of the work is already done. How would it be more than twice the work to add it (or vice versa)? Not sure I follow what you are saying at all.

That's because Nvidia also sponsored games where they wanted only DLSS.
I would imagine that (with how little change is needed for FSR to work in games where you can enable DLSS in the options menu) this is the case, yes, which seems to go exactly against your previous statement that it is more than twice the work to support both.
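To make that concrete, here's a rough sketch of the kind of engine-side abstraction being described above (hypothetical types and names, not the actual FSR 2 / DLSS / XeSS SDK entry points): gather the shared per-frame inputs once, then hand them to whichever backend is active.

```cpp
// Hypothetical engine-side abstraction: all three temporal upscalers consume
// essentially the same per-frame inputs, so the engine collects them once and
// passes them to whichever backend the user picked. The SDK-specific dispatch
// calls are stubbed out here.
#include <cstdint>

struct UpscalerInputs {
    void*    color;               // current low-res color buffer
    void*    depth;               // current depth buffer
    void*    motion_vectors;      // per-pixel motion vectors
    float    jitter_x, jitter_y;  // sub-pixel camera jitter for this frame
    uint32_t render_width, render_height;
    uint32_t output_width, output_height;
    float    delta_time_ms;
};

enum class UpscalerBackend { FSR2, DLSS2, XeSS };

class Upscaler {
public:
    explicit Upscaler(UpscalerBackend backend) : backend_(backend) {}

    void Dispatch(const UpscalerInputs& in, void* output) {
        switch (backend_) {
        case UpscalerBackend::FSR2:  DispatchFsr2(in, output);  break;  // FSR 2 dispatch in the real SDK
        case UpscalerBackend::DLSS2: DispatchDlss(in, output);  break;  // NGX evaluate call in the real SDK
        case UpscalerBackend::XeSS:  DispatchXess(in, output);  break;  // XeSS execute call in the real SDK
        }
    }

private:
    // Stubs: each backend would translate the same fields into its own
    // dispatch parameters, which is the bulk of the per-backend work once
    // the first upscaler has been integrated.
    void DispatchFsr2(const UpscalerInputs&, void*) {}
    void DispatchDlss(const UpscalerInputs&, void*) {}
    void DispatchXess(const UpscalerInputs&, void*) {}

    UpscalerBackend backend_;
};
```

Once the motion vectors and jitter are plumbed through for the first backend, supporting another is mostly translating the same struct into that SDK's parameters, which lines up with the "most of the work is already done" point above.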
 
I'd need to see an official statement by the studio to believe that.

It's funny, because it's the exact opposite of the argument made about "The Way It's Meant to Be Played" for years now:
"No, no, no, Nvidia isn't stopping them from implementing native FSR, or any of the AMD FidelityFX tech. It's up to the developer."

Now, I don't know if Nvidia ever really told any developers to skip AMD stuff... or, with this game, if AMD asked them to skip Nvidia stuff.
I do know, though, that when a developer is given a bunch of money and direct support in the form of on-site Nvidia/AMD staff working on their project, they probably aren't going to bite the hand that feeds them.
 
Out of curiosity, what is the reason someone would choose to develop in UE4 instead of 5? Is there that drastic a learning curve? Not that it would fix this problem; I am merely curious…
 
Out of curiosity, what is the reason someone would choose to develop in UE4 instead of 5? Is there that drastic a learning curve? Not that it would fix this problem; I am merely curious…
Games like this were in development before UE5 and its associated toolkits were ready for prime time.
 
This is the issue... as far as texture space and texture loading capabilities go, the PS5 IS the superior platform.
If we want to have equal performance on PC in the future... a big part of the solution is 24GB GPUs
Considering that an NVMe 5.0 drive's bandwidth will be more than twice that of a PS5 and 4x that of an Xbox (i.e. you will be achieving PS5 speed even without decompression), it could come down to the pricing of SSDs versus large VRAM, I suppose.
If decompression gets better and game installs get larger, then almost no matter how much VRAM you have, an issue could arise versus tech that uses the SSD as a higher-level cache.

GDeflate texture bandwidth from a regular NVMe 4 drive is higher than a PS5's, but with how busy the GPU already is when you game, I am not sure how much room there is left to use it for decompressing textures too. When you run the DirectStorage GPU decompression demo, GPU usage goes quite high in Windows Task Manager (if that is accurate), and obviously in that scene-load demo you are loading assets as fast as you can without any attempt to let the GPU do anything else. Still, I wonder if that is part of the reason why it has yet to happen in a single title on the PC side.
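For context on what GPU decompression looks like on the PC side, here is a rough sketch of a DirectStorage read request with GDeflate, loosely based on the public DirectStorage 1.1 samples; the file name, sizes, and destination buffer are placeholders, and device/resource/fence setup plus error handling are omitted, so treat it as an illustration rather than production code.

```cpp
// Rough sketch: ask DirectStorage to read a compressed blob from disk and
// have the GPU decompress it (GDeflate) straight into a D3D12 buffer.
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadCompressedBlob(ID3D12Device* device, ID3D12Resource* destBuffer,
                        uint32_t compressedSize, uint32_t uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"textures.pak", IID_PPV_ARGS(&file));   // placeholder path

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE; // GPU decompresses on the way in
    request.Source.File.Source        = file.Get();
    request.Source.File.Offset        = 0;
    request.Source.File.Size          = compressedSize;
    request.UncompressedSize          = uncompressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();   // real code would also enqueue a fence signal and wait on it
}
```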
 
On the plus side, maybe patches have helped pre-release? Although I hate the idea of streamers and never watch any of them, a few of them who have access look to have decently smooth gameplay at 4K with 4090s according to reports... guess I will find out tomorrow for sure... I have 2 hours' worth of time to try before my return window expires... lol
 
Been scratching around online for benchmarks comparing performance between the RTX 40 and 7900 GPUs but haven't found anything yet... has anyone found some?
 
What baffles me is why people are complaining about how much VRAM it's using. Do you know how long it's taken for devs to finally use VRAM? There have been so many games that don't even use 10GB+ at 4K... If you've got a 24GB card, there is 14GB of VRAM not even being used.

I am all for devs using more VRAM! Please actually use better/more textures. Keep moving forward, not backwards.
 
I love how every time a high-profile game gets released and runs poorly on PC it's all doom and gloom from the media and community. It's the new "PC gaming is dying" schtick. You know what the common denominator is with most of these releases?

Unreal Engine 4.

If a game uses UE4, then expect it to run poorly out of the gate.
 