Jedi Survivor is the best showcase of a looming problem for PC players

Everything depends on game settings.

Yes.

Could the RTX 20 and 30 series do DLSS 3.0? Most likely yes, but Nvidia is locking older gen products out of it. DLSS is superior but it's also superior at fragmenting users out of tech that could be done on any GPU. FSR works on all cards, including yours and even older cards like mine.
They say it requires an optical flow accelerator of some sort; it doesn't surprise me that new tech requires new hardware. Still, my old-ass 2080 Ti can use DLSS 2, which is better than any current (and probably future) version of FSR. My old-ass card can use DLSS AND FSR, while AMD owners are stuck with only FSR even on cards older than mine; it's not like anyone is choosing FSR over DLSS...

He also said it's basically nothing over Pascal. It's closer to 35% faster at 4K, which is still basically nothing.

For most people 33% isn't enough to justify hundreds or, in your case, thousands of dollars, hence the poor GPU sales. If I'm gaming at 40fps and buy a GPU that's 33% faster, then I'm running at roughly 53fps, for what is hundreds of dollars. For that kind of increase I'm better off waiting for a proper upgrade. For me, a proper upgrade would mean buying a 6700 XT, which is cheap but at best offers double the frame rate of my Vega 56 for what is now $350 new and under $300 used on eBay. Speaking of patience, I wouldn't be surprised if Jedi Survivor's poor frame rate is attributable to Denuvo. Remember that whole fiasco with Resident Evil 8 and DRM killing performance? It wouldn't shock me if new games run terribly just because of the new Denuvo. Most of these new games haven't been cracked yet, so who knows what performance-killing effects Denuvo has on them?
Yes, but the 7X00 series is even closer to the 6X00 series than that, so why does it matter?
 
It's playing mostly fine for me with a 4090. I turned FSR off. Ray tracing is on, and everything else is set to Epic. I'm using adaptive sync, which keeps it smooth as long as FPS doesn't drop below 30 (I believe that's the lowest FPS at which adaptive sync works).

There are times when you are running along and the details on structures etc. have to fill in as you get close, but so far it hasn't looked horrible. I'm still on the Coruscant section and have only played for about 2 hours.

The Nvidia driver released April 18 says it is for the best experience in Jedi Survivor.

Maybe the youtube reviewers were using older versions of the game plus older drivers.

I ordered the Limited Run physical copy that comes with a lightsaber. The thing is heavy and looks great! EDIT: Added 1 more picture with the blade installed.

[Photos of the Limited Run lightsaber, including one with the blade installed]

Pretty sad to see all of the AMD fans who are not upset that the game "Optimized by AMD" wasn't performing as expected on nVidia hardware... when nVidia was accused of hampering AMD with Hairworks in The Witcher 3 (like 7 years ago), it was all "Nvidia is making it run bad on AMD on purpose!", which was not true; AMD cards just had shitty tessellation performance that gen. The next gen of AMD cards corrected that, and AMD cards have played fine with Hairworks/The Way It's Meant To Be Played games since.
So the accusation was wrong.
Here I am not convinced that the same accusation in reverse is necessarily true, but if it turns out to be, it looks pretty shitty for the "can't do any wrong" AMD...

And if the situation were reversed, the forum would be alight with torches and pitchforks (AMD fanboys) out for blood from the evil Nvidia. But now they are making excuses/rejoicing? I guess they only oppose it when nVidia supposedly does it...

That 5700 cards were playing better than a 4090 is pretty sus; there is no way the 4090 doesn't have the power to easily run this game. There isn't even any crazy amount of texture detail; in fact, the game looks the same as Fallen Order to me, except for larger levels. It definitely does have some graphical glitches to be worked out, and it has occasionally had some hitching. Feels like a game engine issue to me, but I am not finding it to be as horrible as the YouTube reviewers' experience was.
Again, I turned FSR completely off, so maybe that helps improve quality.

****************************************************************
Played some more. Tried various FSR settings.
From the behavior, FSR is broken in this game... so much for "works on everything".

Settings I played with initially:
[Screenshots of the in-game graphics settings used]

I took some screenshots, but they are saving as .jxr files? Opening them turns on the Afterburner screen overlay (as if I'm playing a game) and covers up the overlay that was captured in the screenshot (see above), and the overlay toggle isn't working, so it's hard to show the behavior/FPS/VRAM usage. But as long as FPS is above 30, it feels pretty good with adaptive sync.
It dipped as low as 18fps in a cutscene. The cutscenes are rendered by the engine and it seamlessly transitions into and out of them, so it's not playing Bink video or anything like that. I like the way the cutscenes play in that respect.

It's playable, but disable FSR for sure.

Performance observed:
FSR Disabled: typical FPS 40 to 44, but can drop during cutscenes.
FSR Ultra Performance: FPS 40 to 44, and looks like complete ass.

FSR is broken as hell in this game (at least on Nvidia GPUs), or only runs shitty on Nvidia GPUs, either because:
1) on purpose by AMD, or
2) incompetence by AMD (if they assisted in implementation), or
3) incompetence by the game devs, or
4) they had a timeline to get the game out, so here we are with a buggy release, or
5) FSR as a tech just sucks. I haven't ever used it in any other games. Since DLSS is superior, I use that if FPS needs a boost, or just turn it off like I did here.

The way the game behaved after turning FSR on: same FPS, complete shit on the screen, and even after turning it back off, the graphics quality suffered. So I recommend disabling FSR, exiting the game to desktop, then relaunching it.

***********************************************************************************************
Played a few more hours. Sometimes during cutscenes there is skipping, and FPS can be anywhere from 44 down to 18 at the settings I posted above. With adaptive sync, it is still fun for the most part. It probably needs a few patches to smooth it out. Fallen Order played flawlessly on a 2080 Ti; there's no reason they shouldn't be able to get this running well on PC. It almost seems like it's artificially capping FPS; maybe the consoles were too weak to do anything over 40fps, and some limit is still applied. Or it's AMD sabotage.

Game is still pretty fun, so give it a go after a patch or 2.
 
You would think that High Performance would simply perform better than Balanced, but I think that with the hybrid CPUs it perhaps leads to situations where threads that shouldn't be on E-cores get stuck on them.
I certainly noticed earlier this year when doing folding@home that it insisted on doing CPU folding only on the E-cores of my 12600K unless I set process affinity to the P-cores (Windows 11; not sure what power plan I was using).
 
Yeah... You can. You're not supposed to, but you can.

You are supposed to use that to bang it out so you can circle back around and do the cleanup and optimization. The idea being that you have the trained monkeys bang out the bulk of the mind-numbing work with pre-approved and tested snippets, then have the trained, experienced team follow behind to make sure all the parts fit together properly.

It doesn't help that Blueprint and other things like it have gotten a lot "better" while dealing with low-level APIs has gotten more complex, and the people experienced enough at that sort of work retire out or move on to less stressful, better-paying pastures.

So Blueprint lets you get something functional out faster than ever, but cleaning it up is more complicated than ever. That's where some AI tools are being introduced, but troubleshooting their output is interesting, from what I've been told.

Basically programming talent is moving on and fewer good programmers are interested in building games because they have more options than ever before.
I'm not sure that that's the case; we hired two people recently and were flooded with resumes, really inspired and talented people too. It might be the case that talented people don't want to work for large corporations that are designed to crush workers. Maybe the more talented people work on smaller projects for smaller businesses (my company is 15 people + some contractors). If I had a choice between being a dull web developer who doesn't have to sleep at his desk and being a game dev, I know what I'd choose (I took option 3: make less money but have a flexible schedule and still do game dev).

I will say, we don't really work in Unreal, so I'm not sure about the pros and cons of Blueprints. I've messed around with it, but I'm pretty strictly a Unity dev. I can tell you for sure, though, that just because we're programming in C# doesn't mean a game is optimized (I've personally written some horrible C# code, haha). A lot of the best people we get have made mobile games in 3D that are networked. Those are honestly the biggest challenges for the most part: server-authoritative gameplay that feels good, and being able to run well on a mobile device. If you can do that, you're a solid gameplay dev (obviously there are other standards that are more specific, but who cares; also I don't want to think about interviewing people). People who are specialists that do things like write compute shaders, game engine dev, etc. are a whole different ball game (I've learned a little bit of HLSL but it's not my thing yet).

Unity kind of abstracts most of the texture optimization away and mostly gives you parameters you can set. The same thing is true of compression (the algorithms are pretty well standardized). But there are multiple things happening: you basically set up their texture streaming, atlas your textures, and bake your occlusion culling. Of course there's always LODs and settings, both of which you make on your own; for example, settings can influence the parameters of Unity's texture streaming however you see fit (or you can just buy a damn package that does most of the heavy lifting for you, god knows I don't need to write another ResolutionManager script). There are a bunch of other asset-level techniques now with Addressable assets and memory management; I'm not sure if they're relevant to the new DOTS programming stack, though (they absolutely should be used for standard Unity dev).
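To make that concrete, here's a minimal sketch of the kind of settings-driven texture streaming tweak I mean, using Unity's built-in mipmap streaming parameters; the class name and the budget numbers are made up for illustration, not from any real project:

using UnityEngine;

// Hypothetical example: map a user-facing quality tier onto Unity's
// built-in texture streaming parameters instead of hand-rolling it.
public static class TextureStreamingQuality
{
    // Streaming memory budgets in MB per tier (illustrative values only).
    static readonly float[] BudgetsMb = { 512f, 1024f, 2048f, 4096f };

    public static void Apply(int tier)
    {
        tier = Mathf.Clamp(tier, 0, BudgetsMb.Length - 1);

        // Make sure mipmap streaming is on, then set how much memory it may use.
        QualitySettings.streamingMipmapsActive = true;
        QualitySettings.streamingMipmapsMemoryBudget = BudgetsMb[tier];

        // Drop top mip levels on lower tiers (0 = full-resolution textures).
        QualitySettings.masterTextureLimit = BudgetsMb.Length - 1 - tier;
    }
}

On top of that you'd still atlas textures and bake occlusion culling, but the point is that it's mostly parameters you set, not low-level streaming code.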


Anyway, I'm not sure what the hell is going on with this game; I'm kind of shocked it's having this many problems on a very well-established game engine. You shouldn't need to do a ton of low-level coding to handle texture streaming, but they may have a bunch of custom engine code we're not privy to, because they're a massive studio and wanted to stretch what UE4 can do. I know one thing for sure: base UE4 has very well-established methods for handling texture data. It does seem like maybe the day-one patch fixed the VRAM usage, but the performance just isn't there, which is very, very odd. I'm guessing the game currently has very specific requirements for the hardware it likes to run on; I'm an idiot and by no means smart enough to be a game engine dev, so we'll just have to wait and see.
 
They say it requires an optical flow accelerator of some sort; it doesn't surprise me that new tech requires new hardware.
The problem with closed source standards is that nobody knows. FSR isn't closed.
Yes, but the 7X00 series is even closer to the 6X00 series than that, so why does it matter?
Nobody is buying them either. Incremental improvements, but with a huge markup in price.
The problem with the 6700 XT is that even 12GB may not support 1080p ultra going forward.

Maybe you should switch your target to a 16GB 6800 for $400.
(For example, this open-box ASRock is $420.)

https://www.newegg.com/asrock-radeo...4930048R?Item=N82E16814930048R&quicklink=true
Maybe but I'm still waiting for the GPU market to crash. Despite the prices dropping it hasn't crashed yet. Patience is a virtue. The Vega 56 is getting old but it still plays games just fine at 1080P. It'll be interesting to see how it plays Jedi Survivor, especially under Linux. Maybe I'll get lucky and it's one of those games that runs better on Linux than Windows, like Elden Ring.
 
For those with NVidia, force Re-Bar on using nvidiaprofileinspector.... I found it gave me a 15+ FPS boost.

Also, try this out....

https://www.reddit.com/r/FallenOrder/comments/13265ps/potential_fixes_for_improved_performance/
This fix has worked amazingly well, getting over 60fps now.

For a 4090 you can turn settings higher than in the post above (he has a 3080):
Override AA in the Nvidia Control Panel, use game specific settings for Jedi Survivor and use these:
[Screenshot of the Nvidia Control Panel anti-aliasing override settings for Jedi Survivor]


Then edit the GameUserSettings.ini file in C:\Users\<your username>\AppData\Local\SwGame\Saved\Config\WindowsNoEditor\
These are the settings to set. AA is set to 0 since it's being overridden (setting it to 0 probably doesn't matter). ResolutionQuality is likely the primary fix, followed by correcting all of the resolutions (3 of the 4 sets were lower than my screen resolution) and setting it to use DesiredScreenHeight. The Fullscreen modes I'm not sure about; the game was already in full screen, but they haven't hurt anything. Once done, make the file read-only or the game will lower ResolutionQuality back to 50 (there's a small script for that at the end of this post).

[/Script/Engine.GameUserSettings]
bUseDesiredScreenHeight=True

[ScalabilityGroups]
sg.ResolutionQuality=100.00000
sg.ViewDistanceQuality=3
sg.AntiAliasingQuality=0
sg.ShadowQuality=3
sg.PostProcessQuality=3
sg.TextureQuality=3
sg.EffectsQuality=3
sg.FoliageQuality=3
sg.ShadingQuality=3

[/Script/SwGame.SWGameUserSettings]
Gamma=1.700000
DeficiencyType=NormalVision
DeficiencySeverity=0
bUseVSync=True
bUseDynamicResolution=False
ResolutionSizeX=3440
ResolutionSizeY=1440
LastUserConfirmedResolutionSizeX=3440
LastUserConfirmedResolutionSizeY=1440

WindowPosX=-1
WindowPosY=-1
FullscreenMode=1
LastConfirmedFullscreenMode=1
PreferredFullscreenMode=1

Version=9
AudioQualityLevel=0
LastConfirmedAudioQualityLevel=0
FrameRateLimit=237.00000
DesiredScreenWidth=3440
DesiredScreenHeight=1440
LastUserConfirmedDesiredScreenWidth=3440
LastUserConfirmedDesiredScreenHeight=1440


I kept ray tracing enabled. FSR Quality looked OK but didn't give any noticeable performance boost, so I went back to Disabled. FPS is now 55 to 65. There's likely more work the game needs, but this is quick and easy and makes a significant difference, considering previously I was seeing 18 to 44fps.
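If you'd rather script the read-only step than click through file properties, here's a minimal C# sketch; it assumes the path from earlier in this post, with the username still a placeholder you need to fill in:

using System;
using System.IO;

class MakeSettingsReadOnly
{
    static void Main()
    {
        // Path from the post above; replace <your username> with your actual account name.
        string path = @"C:\Users\<your username>\AppData\Local\SwGame\Saved\Config\WindowsNoEditor\GameUserSettings.ini";

        // Add the read-only attribute so the game can't rewrite ResolutionQuality back to 50.
        File.SetAttributes(path, File.GetAttributes(path) | FileAttributes.ReadOnly);

        Console.WriteLine("GameUserSettings.ini is now read-only.");
    }
}

Clear the attribute again (or untick read-only in Explorer) if you ever want the game to change its own settings.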
 
Seems to be running fine for me (30-50fps) with everything set to Epic on my 6700XT at 4K resolution. I get the feeling, though, that maybe the textures aren't loading correctly when I have FSR off, so I've left it on since it looks better. (Maybe I just like the boost in sharpness.)

I had the occasional stutter when I first opened it, but then I had to reopen it since I forgot to turn on HDR, and I don't notice it as much now. I've only just loaded up the starting area, though (1st hour). Maybe I'll have to dial things down when I get further in.

Hopefully they fix the issues with Nvidia cards for you guys sooner rather than later.
 
Seems to be running fine for me (30-50fps) with everything set to Epic on my 6700XT at 4K resolution. I get the feeling, though, that maybe the textures aren't loading correctly when I have FSR off, so I've left it on since it looks better. (Maybe I just like the boost in sharpness.)

I had the occasional stutter when I first opened it, but then I had to reopen it since I forgot to turn on HDR, and I don't notice it as much now. I've only just loaded up the starting area, though (1st hour). Maybe I'll have to dial things down when I get further in.

Hopefully they fix the issues with Nvidia cards for you guys sooner rather than later.
There is a bug that has to do with how settings are applied: it renders at 50% resolution unless you have FSR enabled or reset your settings.
 
Here are some tips from DSO Gaming:


"Now before continuing, we should detail the PC system we used for our initial tests. So, for the following benchmark, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000Mhz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit, and the GeForce 531.61 driver. Yes, you read that right. Although we haven’t upgraded yet to the latest NVIDIA driver, the game runs fine"

" how were we able to achieve such smooth framerates? For starters, we’ve disabled SMT (that’s Hyper-Threading for Intel owners). Since the game does not appear to take advantage of a lot of CPU threads, it makes sense to disable SMT for a much better experience"

"What’s crucial to note here, though, is that the game has major issues when changing its graphics settings. At the end of the video, we highlight this issue. Upon disabling and re-enabling RT, our GPU usage, in the exact same scene, drops from 98% to 77%. And that’s precisely why a lot of PC gamers cannot see any performance increase when changing graphics settings. In order to fix this GPU utilization issue, you’ll have to close the game and re-launch it."

https://www.dsogaming.com/articles/...t-90fps-at-4k-with-epic-settings-ray-tracing/
 
Could people have gone from code to Blueprint over time?
Maybe. The string of terribly optimized late 2022/2023 titles:
https://www.polygon.com/23680398/redfall-30-fps-quality-mode-performance-xbox
Arkane and Bethesda say that a performance mode will arrive post-launch

That's a game with a summer 2022 release date that had already been delayed.


Does it have some link to the massive change in workflow of 2020-2021?
https://www.vg247.com/arkane-studios-covid-19

Games released now would be games they thought they'd release in summer-fall 2022, with their peak large-team dev cycle in 2020-2021?

https://financialpost.com/fp-work/covid-is-forcing-video-game-companies-to-rethink-remote-work
https://www.videogameschronicle.com...fully-embracing-digital-first-remote-working/

And having to let go of the easier management tool that was crunch time:
https://www.gamesindustry.biz/respawn-emphasises-its-no-crunch-policy
(Respawn being the Jedi Survivor studio)

That said, it's not like there isn't a long history of issues with games on release day, but it does not seem to be limited to port issues this time; even on the fully predictable consoles, games seem to have a lot of issues.
 
Here are some tips from DSO Gaming:


"Now before continuing, we should detail the PC system we used for our initial tests. So, for the following benchmark, we used an AMD Ryzen 9 7950X3D, 32GB of DDR5 at 6000Mhz, and NVIDIA’s RTX 4090. We also used Windows 10 64-bit, and the GeForce 531.61 driver. Yes, you read that right. Although we haven’t upgraded yet to the latest NVIDIA driver, the game runs fine"

" how were we able to achieve such smooth framerates? For starters, we’ve disabled SMT (that’s Hyper-Threading for Intel owners). Since the game does not appear to take advantage of a lot of CPU threads, it makes sense to disable SMT for a much better experience"

"What’s crucial to note here, though, is that the game has major issues when changing its graphics settings. At the end of the video, we highlight this issue. Upon disabling and re-enabling RT, our GPU usage, in the exact same scene, drops from 98% to 77%. And that’s precisely why a lot of PC gamers cannot see any performance increase when changing graphics settings. In order to fix this GPU utilization issue, you’ll have to close the game and re-launch it."

https://www.dsogaming.com/articles/...t-90fps-at-4k-with-epic-settings-ray-tracing/
This is literally every ray-tracing game I've played, ever. They all recommend you exit the game after enabling ray tracing.
 
The game does not stay that close to 60 fps on console despite running at around 1000p in performance mode, and it is a 30fps title at 1440p on the PS5/Xbox Series X.

[TechPowerUp charts: average FPS at 1920x1080, 2560x1440, and 2560x1440 with RT]



I imagine the worst moments are worse on PC, plus the shader-compilation stutter issues, but in terms of average performance:

It seems like a 3070 offers 50% more performance than the consoles (if the consoles' quality settings are the equivalent of all-out Epic in the PC settings) when you have a really strong CPU.

Seems to be the usual "a 6700 XT + good CPU is a bit stronger than the latest consoles" affair?

Or maybe the average FPS numbers are meaningless for benchmarking a new title, but the 1% lows from TechPowerUp are much better on PC than consoles as well:
[TechPowerUp chart: minimum FPS at 2560x1440 with RT]


At higher settings than the console quality mode, a 3070/6700 XT's minimum fps seems to beat the console framerate, and this was with:
We tested the public release version of Jedi Survivor, not a press preview version.

Which maybe took care of a lot of things, or maybe the issues don't even show up in the 1% lows and we will need to start looking at the 0.01% lows.
 
The crux of the issue here is a shoddy port, likely with a primarily console focus (a discussion about "acceptable" console performance standards is its own post) and then not enough time given to properly QA things for PC. Honestly, I'd think it would be preferable to design it for PC and then adapt to the limitations of consoles, but a lot of cross-platform titles seem to do the opposite, for several reasons that would add a considerable amount of quasi-off-topic discussion. The fact that so many "fixes" come down to relatively limited mods, driver updates and more makes me think this is just a poor port, as opposed to an AMD-directed or negligent impediment. (Note that the bit above about disabling SMT brings another issue into contention, especially on a 7950X3D: was the Xbox Game Bar active and the game itself properly flagged as a game to be shunted to the 3D cache cores? If not, that may be why SMT off seemed to give a performance benefit, among other potential issues. I am also curious how much improvement could come from Unreal Engine tweaks; messing with a handful of ini's has made big differences in the past and also fixed problems.)

The fact that AMD cards are in a slightly better position comes down to the fact that most of the consoles were designed with AMD GPUs, so it's more of a side effect, but even on AMD GPUs the PC version of the game isn't perfect or performing as it should given the hardware, once again pointing to a poor port. Bad PC ports of multiplatform games have come in many forms over the years. When it's a big Western publisher, you tend to get what you see here. For a long time, if it was a Japanese publisher, you ended up with a lack of optimization for PC at all and instead titles that replicated exactly the console settings and norms; a good example is a capped framerate, sometimes with game physics tied to said framerate, so if the game runs above 60 or even 30 FPS "weird" things start to happen! A developer named Kaldaien is known for having made a ton of fixes and upgrades, and for even patching issues that major publishers said couldn't be done, moving from disparate FOSS fixes to his now unified Special K platform - https://special-k.info/ . In any case, bad ports are common enough, though frustrating.

As far as it being AMD vs Nvidia in handling these issues, I would think it very suspect to suggest that not providing DLSS or otherwise cutting features is a requirement for a "partnered" game. I've seen quite a few over the years that leaned closer to AMD and offered FSR yet supported DLSS for those who had Nvidia cards. The Last of Us Part 1 is currently a "get a free copy for buying an AMD GPU" partner title and it offers both DLSS and FSR 2.x support, and Forza Horizon 5 is an RDNA3 / FSR 2.x example title on AMD's RDNA3 page, but it offers DLSS as well. It all seems to come down to developer decisions, and I'm guessing that may be the case here, especially given the rush (I'll get to that later). If AMD actually did prohibit the game from supporting DLSS, I'd object to that as well, but looking over the historical comparisons between AMD and Nvidia, it's obvious that the latter is way more invested in proprietary lockdown. It's not just a matter of developing something that suits their cards, but oftentimes an active impediment to the competition. Even in cases where they could make an open standard and still thrive, they rarely choose to do so. AMD, on the other hand, while not perfect, has made far more attempts to support openness: open-source Linux drivers that give an experience on par with, or sometimes superior to, the proprietary Windows ones; FOSS releases of FSR and similar technologies; FreeSync / VESA Adaptive-Sync / VRR being an open spec with universal compatibility; and more. Add to that the multiple times Nvidia has done something shifty with their own cards/to their own users (the 970 3.5GB issue among others), the vastly increased pricing of the 4000 series (where ironically the 4090 is the least overpriced vs the previous generation), etc., and it can get frustrating how Nvidia is still treated as "the default" (like Intel in the CPU arena) and bad behavior is ignored, whereas even the tiniest suspicious thing is jumped on when it's AMD. I haven't even gotten into the whole seemingly industry-wide kowtow to "ray tracing is now the only thing that matters!" in the NV 3000 vs AMD 6000 era, when AMD's 6000 series competed well even on the high end when it came to standard, rasterized performance. Legitimate criticism is fine and none of these companies are perfect, but let's not pretend there aren't some clear differences between their approaches.

Overall it seems that the biggest issue here is just shoddy porting, which seems to be a commonality for big Western AAA published titles these days. They do tend to eventually get the fixes they needed post-launch, but a lot of this comes down to the "delay, crunch, release, fix" loop that causes problems for anyone who's not a game company exec or major shareholder. Jedi Survivor is developed by Respawn (who are splitting their time with the stupidly profitable battle royale Apex Legends) and published by EA, who apparently told Respawn to hurry up and get Jedi Survivor out. There were plans to give it a few more months before release for what seems to be badly needed QA and polish, but EA didn't want all the PR buzz taken up by the late-spring/early-summer releases, notably The Legend of Zelda: Tears of the Kingdom coming in late May and Diablo 4 arriving in June. This happens with frequency, and rarely are the players the beneficiaries of anything except a buggy game and questions about when it will be improved.

There are also other compounding issues, such as how publishers are becoming increasingly eager to simply invest in much simpler Skinner-box money-printing "live service" gacha titles, so every time a title like this has problems (even if caused by those same greedy execs) we all lose out. Jedi Fallen Order was a pleasant surprise to EA in that it did as well as it did, doing "Star Wars + 3D Metroidvania with a pinch of Dark Souls"; perhaps a bit safe and derivative, but it was a single-player, buy-to-play title. Thanks to it doing so well, its sequel was greenlit with a higher budget and expanded mechanics (we're in the age of Elden Ring now). Jedi Survivor is likely to eventually be fixed and, on the back of the Star Wars license, its predecessor's performance, and decent potential in its own right, will likely do well, especially over the long tail. But I am concerned that a fickle and greedy EA will take from this "learning opportunity" the lesson "I should have just gotten Respawn to make more $18 skins for Apex Legends and run more $200+ collection events." Still, that's another sort of problem, and for now we'll just need to see how Jedi Survivor evolves. Me? I'll probably pick up Fallen Order on Steam on sale instead and give it a spot in my ever-increasing backlog.
 
The fact that AMD cards are in a slightly better position comes down to the fact that most of the consoles were designed with AMD GPUs, so it's more of a side effect,
How much is that really the case?

[TechPowerUp charts: minimum FPS with RT at 2560x1440 and average FPS at 2560x1440]


RT on, RT off, it seems to be somewhat in line with the usual pecking order.
 
How much is that really the case?

[TechPowerUp charts: minimum FPS with RT at 2560x1440 and average FPS at 2560x1440]


RT on, RT off, it seems to be somewhat in line with the usual pecking order.
I was talking about it in terms of the anomaly and what was supported initially, prior to the patches or updates, when people were talking about not being able to get playable performance (i.e. stuck at 30-45 FPS) on a 4090 or whatnot. Same thing with lacking DLSS (something not present on any of the major consoles, exclusive to PC and NV GPU users) vs FSR (capable of being utilized on PS5 and XBSX in addition to PC, regardless of GPU, etc.), especially given the rushed nature of the release. What you depict there suggests that, at least in certain circumstances, things are more or less as expected.
 
After watching this, I'd just refund the game and forget it exists. This game is single-threaded for the most part. They're not doing any kind of parallelism at all. It's not getting fixed unless they fundamentally alter their code base. It's busted on consoles too apparently (not shocked) so it's not a bad port it's just a bad game. Sorry to anyone who was looking forward to playing this game.

 
After watching this, I'd just refund the game and forget it exists. This game is single-threaded for the most part. They're not doing any kind of parallelism at all. It's not getting fixed unless they fundamentally alter their code base. It's busted on consoles too apparently (not shocked) so it's not a bad port it's just a bad game. Sorry to anyone who was looking forward to playing this game.


Pfft. Listening to Digital Foundry on anything is just like calling Nvidia's marketing department. Notice how they never trash the Switch, a console so gimped it can only play indie titles from Steam.

It's pretty damn obvious that any game not sponsored by Nvidia is all of a sudden a bad game, bad port etc.

My advice? Stop listening to Nvidia's marketing department. I'm playing the game now and it's awesome.
 
Pfft. Listening to Digital Foundry on anything is just like calling Nvidia's marketing department. Notice how they never trash the Switch, a console so gimped it can only play indie titles from Steam.

It's pretty damn obvious that any game not sponsored by Nvidia is all of a sudden a bad game, bad port etc.

My advice? Stop listening to Nvidia's marketing department. I'm playing the game now and it's awesome.
Yup. Seems to be a decent number of people here playing the game just fine even without $1000 GPUs. DF is... funny at times.
 
Pfft. Listening to Digital Foundry on anything is just like calling Nvidia's marketing department. Notice how they never trash the Switch, a console so gimped it can only play indie titles from Steam.

It's pretty damn obvious that any game not sponsored by Nvidia is all of a sudden a bad game, bad port etc.

My advice? Stop listening to Nvidia's marketing department. I'm playing the game now and it's awesome.
They empirically show the game running on two threads, which means it's utilizing one CPU core. I don't care about any of that crap.
 
They empirically show the game running on two threads, which means it's utilizing one CPU core. I don't care about any of that crap.
That guy has lost his marbles. They have been testing lots of stuff on their mid-range system which has an AMD CPU. I haven't heard them trashing AMD hardware in any of the reviews I've watched.
 
Did they ever fix that in the first one? I own but haven't played Fallen Order because of that.

I'll take Fallen Order off your hands if you don't want it. ;)

Really though, it was mostly fine in the first game. I don't think they fixed it, but you just don't parry. I finished the game, but there was an aggravating boss fight, especially the last one. Parrying worked so badly that I was partway through the game before I realized how bugged it was; it was tied to FPS, if I recall. Really it was the endgame boss fight that was aggravating; it took me many tries because I assumed it was designed around parrying, which I essentially never used yet was still able to beat the game. I enjoyed it, but it could have been better.
 
I can't recall any time Nvidia paid developers to remove features; typically they just paid and helped them add extra stuff that wasn't necessary, like overdone PhysX, Hairworks, or ray tracing.

I do recall, in times past, that Nvidia sponsored titles were also guilty of doing things to nerf the competition. One example that comes to mind is Arkham Asylum using an insane, and frankly unnecessary, amount of tessellation at a time when Nvidia was significantly better at that than AMD, giving Nvidia cards a huge advantage in the title. Whether or not Nvidia paid them to do that, I have no idea, but it was hard not to conclude that it was done for a reason.
 
Pfft. Listening to Digital Foundry on anything is just like calling Nvidia's marketing department. Notice how they never trash the Switch, a console so gimped it can only play indie titles from Steam.

It's pretty damn obvious that any game not sponsored by Nvidia is all of a sudden a bad game, bad port etc.

My advice? Stop listening to Nvidia's marketing department. I'm playing the game now and it's awesome.

Sounds like it’s a driver issue for Nvidia. Typical Nvidia driver issues. They need a better driver team, like AMD…
 
lol, speaking of which... if you do not want the game to look like poop and you have a 4090 gaming at 4K, disable FSR, then proceed with the tricks I listed previously (including making the file read-only). Then in game use the Nvidia sharpening filter, set to 50% sharpening and 20% film grain... the game looks fantastic, and you're actually using 90%+ of the GPU because you are running 4K and will be more GPU-bound, which also helps a ton with stuttering, plus the Re-Bar trick.

Another point, if you have a 5900X, 5950X, 7900X(3D) or 7950X(3D)... set the affinity of the jedisurvivor.exe to only CCD0. This gave me a surprising 8~10 FPS boost in many spots, probably because the game will spread across 2 CCDs causing processing delays. Keep it all on one CCD if you can for this specific game.
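If you'd rather not set that by hand in Task Manager every launch, here's a minimal C# sketch of the idea; the 0xFFFF mask assumes CCD0 is the first 16 logical processors (8 cores with SMT, as on a 5950X/7950X), and the process name is assumed from the exe mentioned above, so adjust both for your setup:

using System;
using System.Diagnostics;
using System.Linq;

class PinToCcd0
{
    static void Main()
    {
        // Logical processors 0-15 = CCD0 on a 16-core part with SMT (adjust the mask for your CPU).
        const long ccd0Mask = 0xFFFF;

        // Process name is the exe name without ".exe" (assumed from the post above).
        var game = Process.GetProcessesByName("JediSurvivor").FirstOrDefault();
        if (game == null)
        {
            Console.WriteLine("Game process not found; start the game first.");
            return;
        }

        // Restrict the game's threads to CCD0 only.
        game.ProcessorAffinity = (IntPtr)ccd0Mask;
        Console.WriteLine($"Pinned {game.ProcessName} (PID {game.Id}) to CCD0.");
    }
}

The affinity resets when the game exits, so you'd rerun this (or use Task Manager) each session.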
 
Another point, if you have a 5900X, 5950X, 7900X(3D) or 7950X(3D)... set the affinity of the jedisurvivor.exe to only CCD0. This gave me a surprising 8~10 FPS boost in many spots, probably because the game will spread across 2 CCDs causing processing delays. Keep it all on one CCD if you can for this specific game.

Long boot times, fixes like this that you may have to do on a game-by-game basis, high prices, some issues with ASUS boards... AM5 still looks kind of underwhelming to me. Even with the initial Ryzen boards, most of the issues were fixed 6-7 months in.
 
I don't link YouTube videos often, but here is one running this on a 3050 at 1440p. Now, the rest of the specs are up there: a 13900K and 32GB of system RAM. This really brings home how this title doesn't use a lot of threads, so it seems to benefit from a strong CPU, and it needs 32GB of system RAM as well. My takeaway is that yes, it can run decently on even an entry-level Nvidia GPU despite being an AMD-sponsored title, but, it being a shoddy port, lots of system RAM and an overkill CPU help a ton.

 
I don't link YouTube videos often, but here is one running this on a 3050 at 1440p. Now, the rest of the specs are up there: a 13900K and 32GB of system RAM. This really brings home how this title doesn't use a lot of threads, so it seems to benefit from a strong CPU, and it needs 32GB of system RAM as well. My takeaway is that yes, it can run decently on even an entry-level Nvidia GPU despite being an AMD-sponsored title, but, it being a shoddy port, lots of system RAM and an overkill CPU help a ton.


It needs more than 16GB of system RAM at 1440p High with FSR Performance on a 3050!?

Have they listed the minimum requirement as 32GB of system RAM??

(Timestamped below)

 
Another point, if you have a 5900X, 5950X, 7900X(3D) or 7950X(3D)... set the affinity of the jedisurvivor.exe to only CCD0. This gave me a surprising 8~10 FPS boost in many spots, probably because the game will spread across 2 CCDs causing processing delays. Keep it all on one CCD if you can for this specific game.
That part really makes me scratch my head. Not that the game would run better on one CCD, that makes sense, but that this is an AMD-sponsored game and they didn't have the game itself ask the OS to do that. I can understand (though not defend) why it might not be as well optimized for Intel's new heterogeneous cores, but you'd think a game sponsored by AMD would be set up to work well with their own CPUs.
 
I'm about 17 Hours into the game. I restarted it so my buddy could see the intro. Frame rates at 4K, all settings maxed, have been nearly flawless. After disabling my E cores, the game has been almost perfect. That's not my problem.

My problem is that some of the jumping puzzles are so fucking unforgiving and insane that it doesn't matter what you have the difficulty set to. With no save points in sight, I couldn't take a breather or I would fall to my death. There is a point on the laboratory moon where I started swearing, then started yelling and swearing even more. Frame rates were perfect; the FUCKING jumping puzzles are so absurd that you just wonder how damn hard it would have been to just create a map with normal traversal and a great story. The greatest difficulty in the game is getting from point A to point B. I am having trouble typing this, my fingers are locked up, and after the last experience I will likely never revisit the game. Here's an example: you had to jump to one cable, then to the next cable while in the air, or you hit an electric spot that kills you; you have to do this frequently, and if you miss you have to do it all over again. On the lab moon... you have to maintain constant jumping: jump to pipes, jump to other pipes while being electrocuted and shot at, run-jump, grapple line, then wall run and jump, then opposite wall run and jump, then back to the other wall run and jump, then the other wall, then grapple, then wall run and jump, and if you miss any of this couple-minute sequence of hell you have to start it over. Thankfully there was an area where I could get extra force shards, so I took a break, but restarting the sequence there is harder after you stopped to get the extras, and there is no save point in sight. The end of this grapple-then-jump-and-hope-like-hell-you-timed-it-perfectly-or-you-fall-to-your-death sequence culminates in a perfect leap that forces you to wall jump back and forth to traverse vertically, only for you (as near as I can tell) to hit steam vents every damn time and fall to your death before finishing the wall-jump climb. There is no save point in this sequence of satan. I rage quit the game after playing for about 13 hours. Much of it was good, but the jumping puzzles destroy your immersion in nearly EVERY area you encounter them in. Then there are the slippery slopes all over the game artificially barring you from going certain places until you finally realize you can subdue a mount (when they have been around you for hours)...

Only Cal has to do all the hard shit. Bode has a Jetpack and Merrin can fucking teleport...

Fuck this shit. Love the core gameplay and the story, but this shit is too much for my nearly 50 year old ass.
 