Jedi Survivor is the best showcase of a looming problem for PC players

Lakados

[H]F Junkie
Joined
Feb 3, 2014
Messages
10,389
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I'm bummed because I was looking forward to this one.

TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB RAM basically required for smooth texture population
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.
 
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I'm bummed because I was looking forward to this one.

TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB RAM basically required for smooth texture population
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.
Almost seems like development should be done on the high-end platform and dumbed down, rather than scaled up. But what do I know lol
 
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I'm bummed because I was looking forward to this one.

TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB RAM basically required for smooth texture population
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.
Dang a 4090 struggling?
Not having DLSS is not great, but FSR2 is an OK alternative that works well enough. (I personally don't care about upscaling or nitpicking between the two)
Missing ray-tracing features sounds like a necessity if a 4090 is already struggling lol.
155GB install, sadly par for the course nowadays (especially with UE5 games coming with Nanite...)
32GB RAM required; going to see more of this with install sizes ballooning.

UE5 changed the game drastically and put more of the "optimization" work on the engine itself rather than on developers. So hopefully UE5 will make ports better.

That said, Jedi Survivor is UE4, and anybody who knows UE4 could see this coming a mile away, since UE4 optimization relies more on the developer's competence or time constraints... (I say this as someone who likes to mess around in Unreal Engine 4/5)
 
Last edited:
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I'm bummed because I was looking forward to this one.

TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB RAM basically required for smooth texture population
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.
It's not malicious; they just put all the dev time into the consoles. Like the previous title and its PC issues (shitloads of them), it's likely geared to run at 30 FPS, so I'm just gonna dial my system into those settings, sit back, and play the game the way it was intended to be played: like a console.

If it runs like ass at 30 FPS, then I may start throwing shit around the room and swearing. :eek:

I am very curious to see how it will run on my 13900K and 7900 XTX; if it has any AMD graphical optimizations in it, it may run perfectly.
 
I don't know enough to judge this, but wouldn't it be possible for a game made to run well on an Xbox Series S to run well on a powerful PC more easily than the other way around?

The game runs on an Xbox Series S.
Set your refresh to 30 Hz and 90% of the problems likely go away instantly on PC.
 
It's not malicious; they just put all the dev time into the consoles. Like the previous title and its PC issues (shitloads of them), it's likely geared to run at 30 FPS, so I'm just gonna dial my system into those settings, sit back, and play the game the way it was intended to be played: like a console.

If it runs like ass at 30 FPS, then I may start throwing shit around the room and swearing. :eek:

I am very curious to see how it will run on my 13900K and 7900 XTX; if it has any AMD graphical optimizations in it, it may run perfectly.
Here's the rub: a 5700 XT currently outperforms a 4090 in this game. So your 7900 XTX is likely fine.
 
Here's the rub: a 5700 XT currently outperforms a 4090 in this game. So your 7900 XTX is likely fine.
It kinda makes sense. They focused on consoles, which are much cheaper to adopt than PCs capable of running the game properly. If it's anything like the first game, it will be months before the PC issues get ironed out. I had already beaten the first game by the time that happened, and I never really went back. The story was pretty good, but it was very linear with little replayability. I couldn't give a shit about all the zillion skins that were added to the game later. I think the janky gameplay is what kept me from ever returning to the title. Now, going into the second game knowing how the first one ran and why, plus the info from you guys on here, it will likely be a very different experience for me.
 
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)

Who knew AMD backing a game would be even worse than Nvidia backing a game. The problem is FSR isn't as good as DLSS. Although I see why AMD is doing this: they are behind on the technological front, so taking away that advantage by paying developers is an avenue that works. The problem is it hurts the game itself. I can't recall any time Nvidia paid developers to remove features; typically they just paid and helped them add extra stuff that wasn't necessary, like overdone PhysX, HairWorks, or ray tracing.

If only the people who were so upset about games being sold on Epic were as upset about graphical features being removed and performance curtailed on certain hardware, maybe we would see some change on this front.

With luck they will have a massive day-one patch to help with some of the issues, but bad PC ports seem to be standard now. Add in high costs and I think PC gaming is going to see a bit of a decline in the coming years.
 
Who knew AMD backing a game would be even worse than Nvidia backing a game. The problem is FSR isn't as good as DLSS. Although I see why AMD is doing this: they are behind on the technological front, so taking away that advantage by paying developers is an avenue that works. The problem is it hurts the game itself. I can't recall any time Nvidia paid developers to remove features; typically they just paid and helped them add extra stuff that wasn't necessary, like overdone PhysX, HairWorks, or ray tracing.

If only the people who were so upset about games being sold on Epic were as upset about graphical features being removed and performance curtailed on certain hardware, maybe we would see some change on this front.

With luck they will have a massive day-one patch to help with some of the issues, but bad PC ports seem to be standard now. Add in high costs and I think PC gaming is going to see a bit of a decline in the coming years.
Is there an article on this? Nothing in the two articles posted mentions anything about it.
 
Is there an article on this? Nothing in the two articles posted mentions anything about it.
About AMD specifically sabotaging Nvidia and paying off the developer to have their own optimized exclusive title? Nothing says that. It's just optimized for AMD. That makes sense, since the game was designed to run on AMD console hardware. It's a no-brainer.
 
Most games are designed around the limitations of consoles. The PC ports are usually shit until they are patched to function properly.
Right. I'm talking more about the "paying to specifically remove features" part.
I have seen nothing to indicate this in any way, shape, or form, and I was just searching for articles about it (and found nothing). This is butthurt Nvidia adopters who paid way the hell too much for their video cards.
 
Last edited:
Plenty of games run better on one video card vs. another. It's been this way for as long as I can recall. However, in this instance, because the game doesn't support DLSS at launch, people have drawn the conclusion that AMD paid to have the features removed and the game sabotaged for Nvidia hardware...

There's a finite amount of time that any developer has to optimize for specific video cards. Some games get Nvidia (most) and some get AMD. As I have said before, it was designed to run on consoles. What are the processor and graphics cards of consoles? AMD. Enough said.

PC gamers will see patches over the next year that fix most of their issues.
 
Is there an article on this? Nothing in the two articles posted mentions anything about it.
I'll dig out the Reddit posts on the topic, but they removed stuff that comes standard in UE4 and features that were available in the previous game which also used UE4.
 
Plenty of games run better on one video card vs. another. It's been this way for as long as I can recall. However, in this instance, because the game doesn't support DLSS at launch, people have drawn the conclusion that AMD paid to have the features removed and the game sabotaged for Nvidia hardware...
I mean, I'm playing X4: Foundations atm and it only has FSR 1 as an option, no DLSS or FSR 2. Some devs just don't get around to adding it. Some games, like Satisfactory, state they don't have it because they don't want to add hardware-specific features.
 
I'll dig out the Reddit posts on the topic, but they removed stuff that comes standard in UE4 and features that were available in the previous game which also used UE4.
Reddit is not official. Reddit is a horseshit festival of hearsay and speculation. You can't trust anything you read on there. It's almost worse than Twitter.

As far as them stripping the options for DLSS, the tech has to be optimized for the game, no? If they didn't have the dev cycle to get that done, having a broken in-menu option that doesn't function properly doesn't make sense (I would pull it until it was properly added). I totally agree with NightReaver above.
 
Just another garbage port.

 
Once you have motion vectors in place in your game, adding DLSS or XeSS support shouldn't be a big deal, and this is a heavily partnered title, so I get the suspicion.

It would be a fair assumption that when they made the latest Cyberpunk path-tracing Monte Carlo RT affair, they didn't care much whether it ran well on RDNA 2 and 3.


As far as them stripping the options for DLSS, the tech has to be optimized for the game, no? If they didn't have the dev cycle to get that done, having a broken in-menu option that doesn't function properly doesn't make sense (I would pull it until it was properly added). I totally agree with @NightReaver above.
That was more of a 1.x thing; the big jump in 2.x was generalized training. From what I understand, there's not much difference between FSR 2 and DLSS 2.x: both need motion vectors from you, plus the current and previous frame buffers. I'm not sure there's much, if any, difference from an integrator's standpoint:
So DLSS 2.x and FSR 2.0 both use the same basic inputs for their upscaling algorithms: multiple frames of data, motion vectors, and depth buffers. However, FSR 2.0 processes all this data through a customized Lanczos upscaling algorithm instead whereas DLSS relies on a deep learning algorithm.
https://www.tomshardware.com/news/amd-fsr2-deathloop-vs-dlss
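
For what it's worth, here's a rough sketch of the per-frame data an engine hands either temporal upscaler -- the type and field names below are made up for illustration, not the actual FSR 2 or DLSS SDK structs:

[CODE]
#include <cstdio>

// Stand-in for an engine texture handle; purely illustrative.
struct Texture { int id; };

// Hypothetical per-frame inputs a temporal upscaler (FSR 2 or DLSS 2) consumes.
struct UpscalerInputs {
    Texture color;            // current frame, rendered at the lower internal res
    Texture depth;            // matching depth buffer
    Texture motionVectors;    // per-pixel motion relative to the previous frame
    float   jitterX, jitterY; // sub-pixel camera jitter applied this frame
    bool    resetHistory;     // set on camera cuts so stale history gets dropped
};

// Sketch of a dispatch: whichever upscaler is integrated gets the same data,
// which is why wiring up one usually means the plumbing for the other exists.
void dispatchUpscaler(const UpscalerInputs& in, Texture& upscaledOut) {
    upscaledOut.id = in.color.id; // a real SDK would record GPU work here
    std::printf("upscale dispatched (reset=%d)\n", in.resetHistory);
}

int main() {
    UpscalerInputs frame{{1}, {2}, {3}, 0.25f, -0.25f, false};
    Texture out{0};
    dispatchUpscaler(frame, out);
    return 0;
}
[/CODE]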
 
Last edited:
I'm fairly certain EA isn't trying to sabotage their own game. I think they wanted to release it, start making money, and then add in the other stuff as necessary, just like any other game. I've never seen anything properly tested and stable prior to launch other than ancient games like X-COM (and even it had patches, you just didn't realize they existed: Terror from the Deep had a game-breaking bug where you couldn't capture one of the commanders necessary to win the damn game, and most people didn't realize there was even a patch for it). Old shit was also way less complicated.

Give it time; the game will likely end up running pretty well on Nvidia, if not amazingly.
 
Once you have motion vectors in place in your game, adding DLSS or XeSS support shouldn't be a big deal, and this is a heavily partnered title, so I get the suspicion.

It would be a fair assumption that when they made the latest Cyberpunk path-tracing Monte Carlo RT affair, they didn't care much whether it ran well on RDNA 2 and 3.



That was more of a 1.x thing; the big jump in 2.x was generalized training. From what I understand, there's not much difference between FSR 2 and DLSS 2.x: both need motion vectors from you, plus the current and previous frame buffers. I'm not sure there's much, if any, difference from an integrator's standpoint:
So DLSS 2.x and FSR 2.0 both use the same basic inputs for their upscaling algorithms: multiple frames of data, motion vectors, and depth buffers. However, FSR 2.0 processes all this data through a customized Lanczos upscaling algorithm instead whereas DLSS relies on a deep learning algorithm.
https://www.tomshardware.com/news/amd-fsr2-deathloop-vs-dlss
They're not exactly the same thing:
"But instead of using a supercomputer and tens of thousands (millions?) of frames worth of data to train an algorithm, and then relying on special purpose tensor cores to run, AMD's FSR 2.0 simply uses the GPU shader cores, allowing the tech to work on virtually any GPU."

I think you essentially said that in other words... (y)
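
For anyone curious what the Lanczos part actually is: the kernel is a tiny formula, L(x) = sinc(x) * sinc(x/a) inside the support radius a, zero outside. A quick sketch assuming a = 2 (FSR 2's real filter is a customized variant, not this exact code):

[CODE]
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

// Normalized sinc: sin(pi*x) / (pi*x), with sinc(0) = 1.
double sinc(double x) {
    if (x == 0.0) return 1.0;
    return std::sin(kPi * x) / (kPi * x);
}

// Lanczos kernel with support radius a:
// L(x) = sinc(x) * sinc(x / a) for |x| < a, 0 otherwise.
double lanczos(double x, double a) {
    if (std::fabs(x) >= a) return 0.0;
    return sinc(x) * sinc(x / a);
}

int main() {
    // Print sample weights across the support to show the falloff:
    // nearby samples get large positive weights, farther ones go negative.
    for (double x = 0.0; x <= 2.0; x += 0.25)
        std::printf("L(%.2f) = %+.4f\n", x, lanczos(x, 2.0));
    return 0;
}
[/CODE]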
 
Almost seems like development should be done on the high-end platform and dumbed down, rather than scaled up. But what do I know lol
I'm not in games, but working in mobile software, if the company targeted the low end, the high end tends to just work; if the company targets the high end, the low end is never going to be good. Gaming is, of course, different, but I don't think "make it for high-end PC and then demake it for consoles" is going to work so well; better to make it for consoles and test it on a handful of PCs during development.
 
I think you essentially said that in other words... (y)
Yes, but the training is a general one; it doesn't need to have been run on your specific title like in the past.

There have been many years of 16K renders being compared against lower-quality ones for training, and that's what makes it work now on a new game it was never trained on.
 
Yes, but the training is a general one; it doesn't need to have been run on your specific title like in the past.

There have been many years of 16K renders being compared against lower-quality ones for training, and that's what makes it work now on a new game it was never trained on.
You know way more than me about this. I defer to your knowledge sir.
 
Based on the reviews, the game has been returned on Steam. There's no reason a 4090 shouldn't be king of 4K in this game; instead it stumbles at 1440p? LoL. Poorly optimized, and features removed to make AMD look good? That's not what PC gaming is about. I have no "need" to play this game, so skip it I shall, I guess. Didn't want to pay $70 anyway; yay for returns.
 
Based on the reviews, the game has been returned on Steam. There's no reason a 4090 shouldn't be king of 4K in this game; instead it stumbles at 1440p? LoL. Poorly optimized, and features removed to make AMD look good? That's not what PC gaming is about. I have no "need" to play this game, so skip it I shall, I guess. Didn't want to pay $70 anyway; yay for returns.
No dude, you're entering conspiracy-theory territory. The game was designed to run on consoles that are built from AMD hardware. It will run on some bullshit old AMD graphics cards because... it was designed to run on the weak AMD hardware used in consoles.

PC gaming is dead and has been for 20 years (I'm not trying to be a dick here; I haven't seen many PC games that pushed the boundaries of hardware come out in my lifetime. Maybe Star Citizen, if they ever finish it... but that's about it). I just refuse to accept that and keep playing games on my PC. Companies don't generally make games for PCs anymore; they make 'em for consoles. You can go return the game; I, however, am gonna enjoy the shit out of it when I get home. Gonna dial in the 30 Hz settings to limit the FPS to console levels and kill me some dark Jedi.

I have not seen this many butthurt people in a while... It's rather amusing to see people who paid a fuckton of money for the fastest video card on the planet whining... Seriously starting to crack me up.
 
Last edited:
No dude, you're entering conspiracy-theory territory. The game was designed to run on consoles that are built from AMD hardware. It will run on some bullshit old AMD graphics cards because... it was designed to run on the weak AMD hardware used in consoles.

PC gaming is dead and has been for 20 years (I'm not trying to be a dick here; I haven't seen many PC games that pushed the boundaries of hardware come out in my lifetime. Maybe Star Citizen, if they ever finish it... but that's about it). I just refuse to accept that and keep playing games on my PC. Companies don't generally make games for PCs anymore; they make 'em for consoles. You can go return the game; I, however, am gonna enjoy the shit out of it when I get home. Gonna dial in the 30 Hz settings to limit the FPS to console levels and kill me some dark Jedi.
I mean, I get it, most games are 100% developed for consoles, but a lot of the games I play still work amazingly on PC. I will never buy another console; my last one was an SNES... lol. PC gamer ever since. I love the raw power a PC can put out; I just wish more devs took advantage of it. Other games that are ports work great for me: COD:MW2, BF2042, CP2077 is amazing on PC now, Forza Horizon 5 is solid too... this one just appears (up front, anyway) to be a rushed, sloppy release for money that they will patch up later. Maybe by then it will be on sale... lol.
 
I mean, I get it, most games are 100% developed for consoles, but a lot of the games I play still work amazingly on PC. I will never buy another console; my last one was an SNES... lol. PC gamer ever since. I love the raw power a PC can put out; I just wish more devs took advantage of it. Other games that are ports work great for me: COD:MW2, BF2042, CP2077 is amazing on PC now, Forza Horizon 5 is solid too... this one just appears (up front, anyway) to be a rushed, sloppy release for money that they will patch up later. Maybe by then it will be on sale... lol.
If you want to play the game, at least wait for the day-one patch, which is tomorrow, before you return it. I love PC because I can do anything & everything on it. The first game ran like complete ass on my 2080 Ti, and I still enjoyed it. This one is supposed to be stellar; dial your FPS down to 30 and have fun.

Why do I say 30? We watch movies at 24 FPS. It still looks good at 30, it's not a competitive game, and the consoles were all designed to run at 30 FPS. Which means the combat system is likely tied to 30 FPS, like the first game's was on day one (they fixed this later, months down the road). 30 FPS is fine; it's not immersion-breaking in any way. It's also about the minimum FPS the 4090 seems to barely manage (HAHAHAHAHAHAHAH!!!!!)... So no need to cry, just dial that stupid expensive $2,000+ video card down to 30 Hz and enjoy the game. I'm gonna do the same with my $1,000 7900 XTX.
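
For reference, a 30 FPS cap is conceptually trivial: render the frame, then burn off whatever is left of the 33.3 ms budget. In-game and driver-level limiters pace frames better than a naive sleep, but here's a minimal sketch of the idea:

[CODE]
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto kFrameBudget = std::chrono::microseconds(33333); // ~30 FPS

    for (int frame = 0; frame < 5; ++frame) {
        const auto start = clock::now();

        // ... simulate + render the frame here ...
        std::printf("frame %d\n", frame);

        // Sleep off whatever is left of the 33.3 ms budget.
        const auto elapsed = clock::now() - start;
        if (elapsed < kFrameBudget)
            std::this_thread::sleep_for(kFrameBudget - elapsed);
    }
    return 0;
}
[/CODE]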
 
Last edited:
most games are 100% developed for consoles
A certain type of console game, maybe, especially big-budget ones.

But I would suspect most games are made for mobile, followed by PC, then console.

In 2022, around 11,000 new games were released on Steam; my guess is the vast majority were not 100% developed for consoles and then ported.
 
A certain type of console game, maybe, especially big-budget ones.

But I would suspect most games are made for mobile, followed by PC, then console.

In 2022, around 11,000 new games were released on Steam; my guess is the vast majority were not 100% developed for consoles and then ported.
You're right, there's an ecosystem of indie developers out there that just make PC games. All the Pixel Cade stuff, roguelikes, anime/manga/hentai stuff, minor devs releasing their first mech game, etc. But the big companies go after the consoles, and we get the ports of those. For a while in the past, PC didn't even get ports. Many games, including Halo, were console exclusives. I had to get a damn Xbox just to play Halo; pissed me off.

A lot of the stuff that hits Steam is shit. But sometimes there's a gem in there.
 
If you want to play the game, at least wait for the day-one patch, which is tomorrow, before you return it. I love PC because I can do anything & everything on it. The first game ran like complete ass on my 2080 Ti, and I still enjoyed it. This one is supposed to be stellar; dial your FPS down to 30 and have fun.

Why do I say 30? We watch movies at 24 FPS. It still looks good at 30, it's not a competitive game, and the consoles were all designed to run at 30 FPS. Which means the combat system is likely tied to 30 FPS, like the first game's was on day one (they fixed this later, months down the road). 30 FPS is fine; it's not immersion-breaking in any way. It's also about the minimum FPS the 4090 seems to barely manage (HAHAHAHAHAHAHAH!!!!!)... So no need to cry, just dial that stupid expensive $2,000+ video card down to 30 Hz and enjoy the game. I'm gonna do the same with my $1,000 7900 XTX.
30 Hz will make my eyes bleed... lol. I got the first game late in the cycle and ran it on my 2080 Ti at the time at 80+ FPS. There is no way I'd find it acceptable to dial the FPS down that low, but on the "plus" side, the issues look to stem from poor CPU optimization, as the 4090 in the video was only at around 40% GPU utilization. So maybe there is hope they fix that in the game, or Nvidia has a new driver tomorrow to address it. But yeah, almost every game I play, fully maxed at 4K, gives me a solid locked 144 FPS on my monitor, so 30 FPS will be an absolute slideshow and unbearable for me, as I am VERY used to 100+ FPS, even back when I had my 3090 before this card.

Sadly, I doubt the game will ever see DLSS (or Frame Gen), which it would be perfect for... but maybe we will get lucky and someone will figure out how to hack it into the game... lol.
 
https://www.nme.com/news/gaming-news/star-wars-jedi-survivor-pc-performance-low-fps-issues-3436303
https://kotaku.com/star-wars-jedi-survivor-pc-performance-fps-reviews-1850382672

At this point I don't know if these sorts of PC ports are malicious or just straight-up incompetent. Glad I didn't pre-order, but I'm bummed because I was looking forward to this one.

TLDR;
4090 struggles to maintain 60fps at 1440p
No DLSS, only FSR2; apparently the AMD sponsorship requires that DLSS not be present
Ray-tracing features have been removed and limited to the bare minimum (again, part of the AMD sponsorship)
155GB install
32GB RAM basically required for smooth texture population
I call this a looming problem because while PC ports have traditionally not been good, I struggle to remember a time when they were consistently this bad.

AMD sponsorship, what a laugh. It was made for consoles, thus AMD must have paid the developer off to make my Nvidia card suck? Nope, it was just optimized for consoles and obviously not for PC. AMD users just won't pay as heavy a price, since their hardware is similar to the consoles'. I am sure a patch will come out shortly, once the devs see what is causing the issue on Nvidia hardware. Also, you can use FSR on Nvidia, so maybe they felt that was good enough. And 16 GB of RAM has kind of been the minimum you wanted for a gaming PC for a while now.

Also, is it bad ports, or just games not playing as well on Nvidia as you think they should? Perhaps Nvidia made some bad choices which are now haunting owners?
 
Sounds like it's an Nvidia problem. Remember, people used to bitch about Nvidia-sponsored games running poorly on AMD cards... roles are reversed, and now people are blaming the developer? lol, good times
 
30 Hz will make my eyes bleed... lol. I got the first game late in the cycle and ran it on my 2080 Ti at the time at 80+ FPS. There is no way I'd find it acceptable to dial the FPS down that low, but on the "plus" side, the issues look to stem from poor CPU optimization, as the 4090 in the video was only at around 40% GPU utilization. So maybe there is hope they fix that in the game, or Nvidia has a new driver tomorrow to address it. But yeah, almost every game I play, fully maxed at 4K, gives me a solid locked 144 FPS on my monitor, so 30 FPS will be an absolute slideshow and unbearable for me, as I am VERY used to 100+ FPS, even back when I had my 3090 before this card.

Sadly, I doubt the game will ever see DLSS (or Frame Gen), which it would be perfect for... but maybe we will get lucky and someone will figure out how to hack it into the game... lol.
Well, I can set my AMD graphics card to limit FPS to 30; just did that. No Hz alteration necessary.

Funny you say that about the slideshow. My cousin plays competitively at 144 Hz in online shooters. Had him over when Star Wars: Squadrons released and had the game running at 30 Hz on a 4K display in my front room on a GTX 1080; he couldn't tell the difference, even when he moved to my main rig at the time with its 2080 Ti at 60 FPS.

It's all drama with the high-Hz gamers, all the time.
 
30 Hz will make my eyes bleed... lol. I got the first game late in the cycle and ran it on my 2080 Ti at the time at 80+ FPS. There is no way I'd find it acceptable to dial the FPS down that low, but on the "plus" side, the issues look to stem from poor CPU optimization, as the 4090 in the video was only at around 40% GPU utilization. So maybe there is hope they fix that in the game, or Nvidia has a new driver tomorrow to address it. But yeah, almost every game I play, fully maxed at 4K, gives me a solid locked 144 FPS on my monitor, so 30 FPS will be an absolute slideshow and unbearable for me, as I am VERY used to 100+ FPS, even back when I had my 3090 before this card.

Sadly, I doubt the game will ever see DLSS (or Frame Gen), which it would be perfect for... but maybe we will get lucky and someone will figure out how to hack it into the game... lol.
How did you even get it to run? Game doesn't release until 9 PM on the West Coast.
 
he couldn't tell the difference, even when he moved to my main rig at the time with its 2080 Ti at 60 FPS.
That's pretty bad, tbh. My wife was playing Hogwarts on her PC and I was like, wtf, why does it look like slow motion? At some point her display had been set to 30 Hz. "Slide show" is a total exaggeration, but 30 FPS does just look slower. It's why I find TV content at 60 FPS so... uncanny. It looks like it moves way too fast when you're so used to standard film FPS.
 