Jedi Survivor is the best showcase of a looming problem for PC players

Being available for you to install is NOT the same as being open source. One of the key aspects of open source software is that it can be altered freely. You cannot alter DLSS to run on AMD hardware without nVidia's permission.
One other big aspect is being able to compile it yourself so it runs on a range of smaller or less popular platforms, but again, I am really not sure what that has to do with what was being discussed here.

It is not like there is a list of popular FSR 2 forks out there, or that altering the code was relevant to the subject.

Take XeSS: I am not sure if they made it open source over time (it seems to still be only the headers and not the code on GitHub), but the API is an open standard, which would be more important here, since it lets a competitor do the upscaling when a player turns DLSS on in a game (or whatever the option might simply be called, like "advanced upscaling").

Whether the internal source code of the upscaler, optimized for your hardware, is open source or not: would the studio making a game particularly care? What does it have to do with any of this?
 
Being available for you to install is NOT the same as being open source. One of the key aspects of open source software is that it can be altered freely. You cannot alter DLSS to run on AMD hardware without nVidia's permission.
Not sure that is exactly correct. DLSS uses specialized compute cores. AMD GPUs don't have them, so they can't do DLSS.

AMD's FSR is 'open source' (glowing halo, soft angel music plays) (and also probably free to use) because it's good marketing and because it wouldn't get much use otherwise. It's how they try to keep up yet are always a day late and a dollar short. I'm sorry you bought into AMD's underdog marketing... but it's the underdog for a reason. The tech is years behind DLSS, and may never reach parity. It works completely differently. But it's cool that they are trying, and tried a different approach.

Open source doesn't really mean all that much, at least not what people seem to infer from it. You can use AMD's 'open source' FSR, but you are not likely to get much help from them... ever. Same story with AMD's other 'open source' (glowing halo, soft angel music plays) stuff, which gets little support from them.

Altering DLSS wouldn't make it work on AMD; it's completely different tech. "I need something to mow the yard..." AMD: "I have a blender?" Expecting Nvidia to teach AMD how to (pick your tech term) is ludicrous. They are competitors... Nothing AMD has made "open source" has taught Nvidia anything.

Keep trying to take that hill though... you never know.
 
There seem to be only the headers in there, with precompiled .dll and .lib files; the API is public, not the code (which was my whole point), i.e. if you are AMD or NVIDIA you can make your own XeSS that will run in games that only support XeSS, if you want.
Yeah, the XeSS implementation itself is currently closed and only the inputs and outputs are open. Intel hopes to change that with 2.0; the only reason it isn't open now is that they fear third parties running off with it and creating incompatible implementations, muddying the waters. Intel developers talked about that a while ago, but it still makes XeSS the most "open" of the available options.

https://wccftech.com/intel-xess-interview-karthik-vaidyanathan/
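To make the distinction concrete, here is a minimal sketch of what "open API, closed implementation" looks like in practice. Everything below is hypothetical and illustrative only; these are not the actual XeSS symbols or headers, just an example of how a public header can document the inputs and outputs while the algorithm itself stays inside a precompiled binary.

```cpp
// upscaler_api.h -- hypothetical public header (NOT the real XeSS API).
// Anyone can read and target this interface.
#pragma once
#include <cstdint>

struct UpscaleParams {
    uint32_t render_width;   // internal (lower) render resolution
    uint32_t render_height;
    uint32_t output_width;   // target display resolution
    uint32_t output_height;
    float    jitter_x;       // per-frame camera jitter
    float    jitter_y;
};

// Declarations only: the header documents what goes in and what comes out.
// The actual implementation ships as a precompiled upscaler.dll / .lib,
// so the algorithm stays closed even though the API is public.
extern "C" int  upscaler_init(const UpscaleParams* params);
extern "C" int  upscaler_evaluate(const void* color, const void* depth,
                                  const void* motion_vectors, void* output);
extern "C" void upscaler_shutdown();
```

With an interface like that published, an engine links against the .lib and ships the .dll, and another vendor (or a modder) could in principle supply their own library implementing the same entry points, without ever seeing or changing the original code.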
 
Not sure that is exactly correct. DLSS uses specialized compute cores. AMD GPUs don't have them, so they can't do DLSS.
AMD proved that wrong with FSR, and Intel with XeSS. DLSS isn't a technology; it's a brand name for techniques that already existed in software as part of the rendering process. Hell, DLSS is based on the work of Pixar more than anything. Also, people have hacked DLSS to run on Nvidia products without tensor cores. So no, that's not the case.
AMD's FSR is 'open source' (glowing halo, soft angel music plays) (and also probably free to use) because it's good marketing and because it wouldn't get much use otherwise. It's how they try to keep up yet are always a day late and a dollar short. I'm sorry you bought into AMD's underdog marketing... but it's the underdog for a reason. The tech is years behind DLSS, and may never reach parity. It works completely differently. But it's cool that they are trying, and tried a different approach.
If it truly were years behind, you wouldn't have reviewers lying about its prowess. There's not a single issue that affects FSR that doesn't affect DLSS as well, and in many cases it can be worse. As you can see below:
[attached image: DLSS vs. FSR comparison screenshot]

Open source doesn't really mean all that much, at least not what people seem to infer from it.
That's why Nvidia is working on open-sourcing their drivers, and people on this site try to convince us that DLSS is open source when it's not. Right?
You can use AMD's 'open source' FSR, but you are not likely to get much help from them... ever. Same story with AMD's other 'open source' (glowing halo, soft angel music plays) stuff, which gets little support from them.

Altering DLSS wouldn't make it work on AMD; it's completely different tech. "I need something to mow the yard..." AMD: "I have a blender?" Expecting Nvidia to teach AMD how to (pick your tech term) is ludicrous. They are competitors... Nothing AMD has made "open source" has taught Nvidia anything.

Keep trying to take that hill though... you never know.
It's a nice narrative that you've got going here, but all I see is "I like paying high video card prices and I want everyone else to as well."
 
It is not like there is a list of popular FSR 2 forks out there, or that altering the code was relevant to the subject.
Incorrect. Valve does A LOT of work on drivers and FSR. You don't need to fork to work on open source code, which is why I said there's a difference between compiling something yourself and altering code at will. Valve alters quite a bit.

[attached screenshot: Valve's FSR code changes]


You can't do what Valve is doing here w/o that open source framework.
 
Incorrect
I really fail to see the relevance of what follows to whether Jedi Survivor has a DLSS option in its own menu.

FSR can be injected into pretty much any title, as it requires no visual information beyond the current frame to work (FSR 2.x is more of a challenge and more relevant to a DLSS discussion). Did they need to actually change the code of FSR itself? They do not seem to have a fork on GitHub, though it could be private or elsewhere; I am really not sure it was needed. There were little apps released at the launch of FSR that let people run all their games with FSR on.

Maybe, maybe not (or maybe they did not need to, but found a way to do the actual upscaling better than AMD, or it is simply that in the Linux world, with its many distros, you need the code to compile for it to work), but what does that have to do with FSR 2.x, DLSS, or XeSS being supported or not in an Unreal 4 Windows title people just made?

I feel like the plot of the conversation was lost and people are talking about something else. The FSR main branch has had only 3 commits in its entire lifetime, all by the same user, the latest on November 15.
 
Incorrect. Valve does A LOT of work on drivers and FSR. You don't need to fork to work on open source code, which is why I said there's a difference between compiling something yourself and altering code at will. Valve alters quite a bit.

[attached screenshot: Valve's FSR code changes]

You can't do what Valve is doing here w/o that open source framework.
That is rather cool; I wonder if FSR 3 will be included later. Frankly, I've seen faster improvements with FSR than with DLSS. DLSS development was going rather slowly until AMD released FSR; then Nvidia started to put a faster pace on DLSS development. That is one aspect as well: unless Nvidia keeps developing DLSS, it will stall, while FSR does not depend on any one company or group to advance.
 
I really fail to see the relevance of what follows to whether Jedi Survivor has a DLSS option in its own menu.

FSR can be injected into pretty much any title, as it requires no visual information beyond the current frame to work (FSR 2.x is more of a challenge and more relevant to a DLSS discussion). Did they need to actually change the code of FSR itself? That is something little apps let people do as well: run FSR on any title.

Maybe, maybe not (or maybe they did not need to, but found a way to do the actual upscaling better than AMD, or it is simply that in the Linux world, with its many distros, you need the code to compile for it to work), but what does that have to do with FSR 2.x, DLSS, or XeSS being supported or not in an Unreal 4 Windows title people just made?

I feel like the plot of the conversation was lost and people are talking about something else.
Why even bother with DLSS when FSR 2 is already available? Is it a night-and-day difference overall, or will most people even notice any difference? Of course some of us can if we get nit-picky with blown-up images that don't represent real gameplay. The biggest issue I have with FSR, and previously with DLSS, is motion stability of the image; second would be blurriness. Both come down to trade-offs between performance, image quality, and what gives a better gaming experience. If Nvidia wants DLSS in titles when it is only beholden to their hardware, they can pay the developer money to use it.
 
Why even bother with DLSS when FSR 2 is already available? Is it a night-and-day difference overall, or will most people even notice any difference?
It is a cost-benefit analysis like anything else; it often gives a significantly better result: https://youtu.be/1WM_w7TBbj0?t=1381

If it is a lot of bother, obviously let it go. DLSS-supporting cards are quite common, but not 80% of the market the way Nvidia is overall, since you need a Turing or better card. But if Unreal 4 is doing it for you, and you already feed it everything DLSS/XeSS need in order to make FSR 2.0 work (to the point that modders can get DLSS working on it quite fast), the question becomes why not; obviously, pleasing a major partner is an excellent answer.
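On the "you already feed it everything DLSS/XeSS need" point: FSR 2, DLSS, and XeSS all consume roughly the same per-frame data (low-resolution color, depth, motion vectors, camera jitter), so once an engine gathers those for one upscaler, adding another is mostly routing. Here is a hedged sketch of that idea; the types and dispatch functions below are hypothetical stand-ins, not the real FidelityFX, NGX, or XeSS SDK calls.

```cpp
#include <stdexcept>

// Hypothetical per-frame inputs; the real SDKs take equivalent data
// through their own structs and GPU resource handles.
struct FrameInputs {
    void* color;           // low-resolution color buffer
    void* depth;           // depth buffer
    void* motion_vectors;  // per-pixel motion vectors
    float jitter_x, jitter_y;
};

enum class Upscaler { FSR2, DLSS, XeSS };

// Stub hooks standing in for the vendor SDK calls an engine would make.
void dispatch_fsr2(const FrameInputs&) { /* FidelityFX calls would go here */ }
void dispatch_dlss(const FrameInputs&) { /* NGX calls would go here        */ }
void dispatch_xess(const FrameInputs&) { /* XeSS calls would go here       */ }

// Once the engine produces FrameInputs for any one upscaler,
// supporting the others is largely a matter of routing.
void upscale(Upscaler choice, const FrameInputs& in) {
    switch (choice) {
        case Upscaler::FSR2: dispatch_fsr2(in); break;
        case Upscaler::DLSS: dispatch_dlss(in); break;
        case Upscaler::XeSS: dispatch_xess(in); break;
        default: throw std::runtime_error("unknown upscaler");
    }
}
```

That shared-inputs overlap is roughly why engine plugins and modder swaps between upscalers tend to appear so quickly once any one of them is wired up.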
 
Why even bother with DLSS when FSR 2 is already available? Is it a night-and-day difference overall, or will most people even notice any difference? Of course some of us can if we get nit-picky with blown-up images that don't represent real gameplay. The biggest issue I have with FSR, and previously with DLSS, is motion stability of the image; second would be blurriness. Both come down to trade-offs between performance, image quality, and what gives a better gaming experience. If Nvidia wants DLSS in titles when it is only beholden to their hardware, they can pay the developer money to use it.
Ya, but don't you understand: DLSS makes them feel special. Nvidia is a designer brand. You can't challenge someone who has paid 30-300% more money for a designer brand. It has to be better: it cost them more, it's exclusive... being able to use DLSS makes you part of the special club. DLSS is their exclusive gate... they have to freeze-frame every game trying to find a frame or two where DLSS is "obviously" better. Frozen frames showing the opposite either don't matter, are irrelevant, or are frames you wouldn't notice, of course.

I can't see the difference between DLSS and FSR in use... that is just reality. I have no doubt we can find specific instances that are subjectively better either way. Overall, though, I just don't see any difference when playing a game with them on.
 
It is a cost-benefit analysis like anything else; it often gives a significantly better result: https://youtu.be/1WM_w7TBbj0?t=1381

If it is a lot of bother, obviously let it go. DLSS-supporting cards are quite common, but not 80% of the market the way Nvidia is overall, since you need a Turing or better card. But if Unreal 4 is doing it for you, and you already feed it everything DLSS/XeSS need in order to make FSR 2.0 work (to the point that modders can get DLSS working on it quite fast), the question becomes why not; obviously, pleasing a major partner is an excellent answer.
If I were a developer making games for the consoles (PS5/Xbox Series) and PCs -> I would want the least amount of code necessary to maintain and update while supporting the most hardware configurations I can. If modders can add in DLSS -> great, I won't have to bother.
Ya, but don't you understand: DLSS makes them feel special. Nvidia is a designer brand. You can't challenge someone who has paid 30-300% more money for a designer brand. It has to be better: it cost them more, it's exclusive... being able to use DLSS makes you part of the special club. DLSS is their exclusive gate... they have to freeze-frame every game trying to find a frame or two where DLSS is "obviously" better. Frozen frames showing the opposite either don't matter, are irrelevant, or are frames you wouldn't notice, of course.

I can't see the difference between DLSS and FSR in use... that is just reality. I have no doubt we can find specific instances that are subjectively better either way. Overall, though, I just don't see any difference when playing a game with them on.
Exactly. I see issues with both, but when playing the game I don't even care; if the performance makes the game smooth and more engaging, great.
 
It is a cost-benefit analysis like anything else; it often gives a significantly better result: https://youtu.be/1WM_w7TBbj0?t=1381

If it is a lot of bother, obviously let it go. DLSS-supporting cards are quite common, but not 80% of the market the way Nvidia is overall, since you need a Turing or better card. But if Unreal 4 is doing it for you, and you already feed it everything DLSS/XeSS need in order to make FSR 2.0 work (to the point that modders can get DLSS working on it quite fast), the question becomes why not; obviously, pleasing a major partner is an excellent answer.
Yeah, the cost-benefit analysis. What is the cost-benefit analysis of using something which works on only a fraction of the cards out there versus using something that works on almost every card out there?

As it is, each new major version of DLSS seems to work on fewer and fewer cards.
 
My first thought is to play console games on consoles. If a port isn't optimized well for PC, play it on its original platform. Owning a PS5 is a good idea simply because of all the extraordinary exclusives.
 
My first thought is to play console games on consoles. If a port isn't optimized well for PC, play it on its original platform. Owning a PS5 is a good idea simply because of all the extraordinary exclusives.

This is pretty shit advice in this case considering it ran like trash half the time there as well.
 
This is pretty shit advice in this case considering it ran like trash half the time there as well.
It's been perfectly fine on PS5 actually. Set to quality instead of performance if it matters.
 
It's been perfectly fine on PS5 actually. Set to quality instead of performance if it matters.

Oh so I have to make the concession of capping my framerate at 30 and still dropping under 1080p resolution to reach "perfectly fine," I see.

Let alone the performance mode that could be under 720p and still float in the 40 fps range or worse in extremes.
 
It's been perfectly fine on PS5 actually. Set to quality instead of performance if it matters.
If under-1080p at 15-30 fps is perfectly fine, or 45-60 fps at 720p in performance mode, then has it not been perfectly fine on PC as well?

Because it likely has more VRAM than the card you were going to buy.
A 6700 XT has more VRAM than an Xbox Series X, and in most cases a PS5.
 
Yes, and an Xbox Series X has 10 GB of fast memory, while a PS5 has 16 GB of shared memory for everything, which tends to leave room to dedicate around 10-12 GB to "VRAM"-like usage; thus a 6700 XT has more VRAM than an Xbox Series X and usually a PS5.
The PS5 can allocate anything the engine doesn't use. Figure 4 GB for the engine; that leaves 12 GB, which is the same amount as a 6700.
 
I'm just over here like... man, wish my 4090 would get me over 100 FPS at 4K in this game with RT on, 70~90 FPS average looks too slow... :ROFLMAO: Can't wait for the DLSS mod bugs to be fixed and fully released though, I refuse to use FSR in this game... the motion blur is horrid with it. I'd use DLSS 3.0 Frame Gen in a heartbeat with DLAA.
 
Oh, btw, the game crashed again during a cinematic. That makes two more in 4 hours of total gameplay.
 
Finally got a chance to run this game after the patch. It stutters now where it didn't before. Still playable, but they actually made it worse. I haven't had any crashes with the new patch, but I didn't before either. It seems that is an issue with some players now.
 
The PS5 can allocate anything the engine doesn't use. Figure 4 GB for the engine; that leaves 12 GB, which is the same amount as a 6700.
It uses about 3 GB for the OS, then still needs to hold regular engine and game code before any amount goes to VRAM. So staknhalo and LukeTbk are right: you get maybe 10 GB for VRAM typically, if that. Plus the beautiful 1000p 20 fps!
 
It uses about 3 GB for the OS, then still needs to hold regular engine and game code before any amount goes to VRAM. So staknhalo and LukeTbk are right: you get maybe 10 GB for VRAM typically, if that.
That 3 GB number is a Reddit number, not confirmed by Sony at all. It's got an extra 512 MB on top of that 16 GB for the OS, by the way. But even if we take your word for it, the worst-case scenario is still 10 GB, which is more than what a 3070 8 GB would give you. How you Nvidia faithful think the console is no faster than a 3050 is beyond me, but then again y'all like high video card prices, so who knows what's circling up there.
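For what it's worth, here is the arithmetic the last few posts are arguing over, with the disputed figures treated purely as the posters' assumptions (none of them are confirmed by Sony):

```cpp
#include <cstdio>

int main() {
    // Figures claimed earlier in the thread, not official specifications.
    const double total_gddr6 = 16.0;  // PS5 unified GDDR6 pool (GB)
    const double os_claim    = 3.0;   // OS reservation claimed above (GB)
    const double engine      = 4.0;   // engine/game-code footprint claimed above (GB)

    // Deducting only the engine footprint ("figure 4GB, that leaves 12GB"):
    std::printf("16 - 4     = %.0f GB VRAM-like\n", total_gddr6 - engine);        // 12 GB
    // Deducting both the claimed OS reservation and the engine footprint:
    std::printf("16 - 3 - 4 = %.0f GB VRAM-like\n",
                total_gddr6 - os_claim - engine);                                  // 9 GB
    return 0;
}
```

Which is why the thread lands anywhere between "maybe 10 GB, if that" and 12 GB, depending on whose deductions you accept.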

Plus the beautiful 1000p 20 fps!
So now DLSS/FSR is bad? Make up your minds. That's literally how DLSS and FSR work: both render below native resolution. Remember?
 
I mean, of course it's bad when it's fucking upsampling from 658p. That's like 800,000 fewer pixels than 1080p.

The internal resolution of this game is fucking _terrible_.
 
When did [H] become a place where a console that renders at 720p or less and has to upscale to get to 1080p is somehow a better way to play games than a PC at native 1440p or 2160p?

If you can't afford the fastest GPU, that's fine. You buy what you can and overclock it. That's what [H] is supposed to be, not this fucking mess of a thread.

If you own AMD, that's fine too. You are not going to convince me that 'open source' (glowing halo, angel music plays) is some magic cure-all for whatever you are currently bitching about. Designing a new GPU and new technologies costs money. RadCoder might be able to make nice mods and fixes for games, but he isn't designing new GPU tech. That takes $$. AMD hasn't come up with a new tech in over 15 years... they just keep copying what Nvidia does. No one cares if you can't afford or don't like Nvidia. Games wouldn't look as good as they do today without them, but I'm sure you will keep propping up AMD...

Btw, consoles have shared RAM, so the OS and the running game are in there taking up space. A PC is far more capable, expandable, and has more total RAM, even for people using 8 GB graphics cards. 8 GB of DDR plus 8 GB of VRAM is on the low end for a PC, but equals the high end for a console.

One thing Jedi Survivor is doing right is that it uses all the available VRAM presented to it. I've seen it using over 21 GB of VRAM the last time I looked. Most other games don't.
It's primarily single-threaded on the CPU, so GPUs are being CPU-bottlenecked... hence the shitshow of a launch. It's inexcusable, given how long multi-core CPUs have been the norm. Hopefully it gets patched soon. FSR (or DLSS, if the game had it) is just a band-aid on a big-ass wound. The game engine has bigger problems.
Other than performance and crashes, the game looks great and I haven't hit any gameplay bugs. It's fun too. It just needed 6 more months of polish and it would have met expectations.
 