Jedi Survivor is the best showcase of a looming problem for PC players

I installed the patch, but by that time I was down in the sewers after Coruscant. FPS is now in the 70s-90s, but it's impossible to know whether that's because of the level or the patch. The AMD overlay still shows CPU utilization at 25%, so nothing fixed there.
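Worth noting: a low overall CPU number doesn't by itself rule out a CPU bottleneck, since overlays like this report an average across every hardware thread. A rough back-of-the-envelope sketch (the thread count and per-thread loads below are assumed purely for illustration):

```python
# Rough illustration with made-up numbers: an overall CPU readout of ~25%
# is still consistent with the game's main/render threads being pegged,
# because the overlay averages utilization across all hardware threads.

hardware_threads = 16                              # assumption: 8-core/16-thread CPU
per_thread_load = [100, 100, 80, 60] + [5] * 12    # hypothetical per-thread utilization (%)

overall = sum(per_thread_load) / hardware_threads
print(f"overall CPU utilization: {overall:.0f}%")  # -> 25%, yet two threads sit at 100%
```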
 
You would think so, but as we've seen over the years/decades, 'execution of plans' isn't really the part AMD does best.
What, like FreeSync or Ryzen? edit2: or 64-bit or multi-core processors?

edit: or APUs? I guess gaming still rocks as long as you've got an Intel GPU (integrated or non-integrated) backing you up?

edit3: what has Intel come up with in that time frame besides frivolous lawsuits that went nowhere and a bunch of security holes that can't be patched?

edit4: but they did come out with processors that beat the competition out of the gate, regardless of whether they were insecure/cheating the public at large. So hey, we've got to give them at least that, right?
 
What, like FreeSync or Ryzen? edit2: or 64-bit or multi-core processors?

edit: or APUs? I guess gaming still rocks as long as you've got an Intel GPU (integrated or non-integrated) backing you up?

edit3: what has Intel come up with in that time frame besides frivolous lawsuits that went nowhere and a bunch of security holes that can't be patched?

edit4: but they did come out with processors that beat the competition out of the gate, regardless of whether they were insecure/cheating the public at large. So hey, we've got to give them at least that, right?

Yet look at all the good that has done for them as a company today: eternally second fiddle.

They have their CPUs, and even that ain't saving the day for them if you check their earnings from today.
 
But hey, maybe it's a good thing things happened the way they did, because if MS had made a totally new OS to support hardware hyper-threading, like the route AMD was headed down, things might not be like they are now and we could be paying a premium for AMD parts. So maybe nature has once again balanced itself out to the benefit of modern life.
 
What, like FreeSync or Ryzen? edit2: or 64-bit or multi-core processors?

edit: or APUs? I guess gaming still rocks as long as you've got an Intel GPU (integrated or non-integrated) backing you up?

edit3: what has Intel come up with in that time frame besides frivolous lawsuits that went nowhere and a bunch of security holes that can't be patched?

edit4: but they did come out with processors that beat the competition out of the gate, regardless of whether they were insecure/cheating the public at large. So hey, we've got to give them at least that, right?
FreeSync is AMD’s branding of a standard that already existed.

AMD did deliver the first multi-core CPU with the Athlon 64 X2, but it was an attempt to replicate Intel's Hyper-Threading without violating patents.

Intel has had integrated graphics longer than AMD has. Yes, AMD was the first to market them as cheap gaming alternatives, but I don't know anybody who has used one and been happy, with the exception of the 5700G and emulation.

Be careful with the security flaws. Intel has more known ones because for decades people have been actively hunting for them; there was no point looking for AMD's because AMD didn't command enough market share to be worth anybody's time. That's changed now: with AMD holding a sizeable presence in the datacenter, you'd better believe there are teams of people working to break AMD the same way they have Intel.
Not sure what frivolous lawsuits you're talking about, but I'm always in need of a laugh, so I want to read about those because I bet some are just WTF levels of stupid.
 
FreeSync is AMD’s branding of a standard that already existed
What, G-Sync? Or is that the standard you think is better?

AMD did deliver the first multi-core CPU with the Athlon 64 X2, but it was an attempt to replicate Intel's Hyper-Threading without violating patents.
C'mon... two cores on the same die was meant to replicate a single core that got 10-15% more performance in multi-threaded loads from SMT, versus the 90-100% gain from having two actual cores on the same multi-threaded workload?

Be careful with the security flaws. Intel has more known ones because for decades people have been actively hunting for them
Well, no, because back then AMD wasn't cheating benchmarks by running their processors in an insecure state. edit: that mess is still biting Intel in the as_ to this day!
 
What, G-Sync? Or is that the standard you think is better?
Adaptive sync has existed as a thing since the '60s; it was needed to make CRT tubes work correctly with different voltages. Matrox and SiS cards had the feature in the '90s, but support was spotty. There were forums on how to modify monitors to make it work better, but it was very hit and miss.
G-Sync was just Nvidia's way of making it work reliably; most monitors used shit display drivers and were very loose with their QC on the actual panels used.

G-Sync was more of a certification standard than just a variable-refresh-rate thing.

Now that display controllers have caught up, there is little need for the hardware component that was distinct to the early models. Now it serves as a quality seal: the panel and features will meet a minimum quality bar.

I prefer G-Sync's clarity; they have a very well-defined guide on what it takes to pass testing and be G-Sync Certified. But you pay for that. FreeSync Premium Pro is the direct comparison, and between the two they are nearly indistinguishable in quality and price. But I really like that VESA has worked to get multiple tiers in place, so there is a much wider price range for FreeSync; you can get adaptive refresh, which is what most people want it for, while having options for a display that better fits your budget.
C'mon... two cores on the same die was meant to replicate a single core that got 10-15% more performance in multi-threaded loads from SMT, versus the 90-100% gain from having two actual cores on the same multi-threaded workload?
Basically, yeah. AMD was in very bad shape at that time; they didn't get SMT working until 2008, a full 6 years after Intel released Hyper-Threading. Hyper-Threading was first introduced by Intel for their Xeon lineup, and it made a big difference there; it was immensely popular for servers, especially those running in "Small Business" setups, because it got similar day-to-day performance to their dual-CPU systems with only one socket.

Well, no, because back then AMD wasn't cheating benchmarks by running their processors in an insecure state. edit: that mess is still biting Intel in the as_ to this day!
I can't really answer this because I don't know what it means to “cheat benchmarks,” and I certainly can't say for certain that they knew at the time that what they were doing was going to have any security implications. I mean, there was a time when the very idea of adding out-of-order execution was considered “cheating”; I remember back in the '90s people complaining about how the SPARC processors were “cheating” because they weren't executing instructions in the order given. And Intel was cheating when they followed with the Pentium Pro, but by the time Cyrix got around to adding it, it was a good thing. So I would need some specific examples of these cheats.
But yes, their security flaws will haunt them for decades. It is still very common to see 6th-gen parts in operation; the flaws that affect them are known and can be mitigated on the network and firewall side, severely limiting the ability to actually exploit them. And it could be argued that by the time any intruder had the ability to exploit those vulnerabilities, they would have needed access to so many other critical systems that those specific vulnerabilities are basically moot. But they will be there for decades to come, and we can just hope AMD doesn't have anything similar, because I expect the same longevity from the Zen and Epyc series as well.
 
Yes, Sony paid for Oodle and built a hardware Kraken implementation, and Microsoft made its own BCPack algorithm that was designed from the ground up only for game textures, plus a hardware accelerator. Who says BCPack doesn't touch Kraken?

That sentence should raise an eyebrow:
This trend is not universal -- Digital Foundry notes that the entire Mass Effect Legendary Edition on Xbox Series S|X is 88GB, while the PS5 version is 101GB -- but where it appears, it typically makes a dramatic difference.
Mass Effect LE is not an Xbox Series/PS5 title. It's running previous-gen BC on current-gen consoles, which would exclude it from any current-gen tech implementation in its code.
 


Makes you wonder what they really gain by pushing out a broken product. Unless it's just a test of how dumb the population really is and how many people will still pre-order games after receiving broken product after broken product?
 
Mass Effect LE is not an Xbox Series/PS5 title. It's running previous-gen BC on current-gen consoles, which would exclude it from any current-gen tech implementation in its code.
ME:LE runs natively on gen 9 consoles and always has. It does not run through backward compatibility.
 
Makes you wonder what they really gain by pushing out a broken product. Unless it's just a test of how dumb the population really is and how many people will still pre-order games after receiving broken product after broken product?
A 50% type of increase so close to launch is quite something, but I would not underestimate how much motivation, from top to bottom and bottom to top, is gained by the employees once the product is in the wild. Who knows how long it would have taken to get done without a release (if ever).

A bit like how in sports it is hard to practice like the real thing; it can be hard to test and debug in practice like the real thing.

Of note, the 3070 with 8 GB of VRAM and a strong CPU is a good tier above console performance in this title.
 
A 50% type of increase so close to launch is quite something, but I would not underestimate how much motivation, from top to bottom and bottom to top, is gained by the employees once the product is in the wild. Who knows how long it would have taken to get done without a release (if ever).

A bit like how in sports it is hard to practice like the real thing; it can be hard to test and debug in practice like the real thing.

Of note, the 3070 with 8 GB of VRAM and a strong CPU is a good tier above console performance in this title.
I bet it was a management thing; they wanted it out ASAP, probably to meet some Disney contract agreement about a pre-May 4th launch date.
The fact they could patch so quickly means they had that stuff on standby. Maybe it was a countermeasure against pirates, maybe they went gold with the wrong build, maybe they simply missed a lot of really simple shit in the final days.
At this stage I’ve given up trying to guess.
 
I bet it was a management thing; they wanted it out ASAP, probably to meet some Disney contract agreement about a pre-May 4th launch date.
The fact they could patch so quickly means they had that stuff on standby. Maybe it was a countermeasure against pirates, maybe they went gold with the wrong build, maybe they simply missed a lot of really simple shit in the final days.
At this stage I’ve given up trying to guess.
It's almost always management pushing things out the door before they're ready. Your average developer doesn't really have any say. I think we're at a point where games take 3+ years to properly develop across all platforms, and these large corporate entities don't want to hear it, so we get much lower standards for MVPs. Say what you will about Blizzard (they're a horrible company), but when D4 comes out, aside from server issues the game will run well.
 
It's almost always management pushing things out the door before they're ready. Your average developer doesn't really have any say. I think we're at a point where games take 3+ years to properly develop across all platforms, and these large corporate entities don't want to hear it, so we get much lower standards for MVPs. Say what you will about Blizzard (they're a horrible company), but when D4 comes out, aside from server issues the game will run well.
That, and costs have increased to some 3x what they were, according to the EA/Sony discovery documents that were presented to the EU team.
 
It's almost always management pushing things out the door before they're ready. Your average developer doesn't really have any say. I think we're at a point where games take 3+ years to properly develop across all platforms, and these large corporate entities don't want to hear it, so we get much lower standards for MVPs. Say what you will about Blizzard (they're a horrible company), but when D4 comes out, aside from server issues the game will run well.
That's typically been true, but they also rarely (if ever) push graphical boundaries. I'm still hoping they let us zoom out more in D4; that was my biggest gripe from the beta.
 
I think I tripped the DRM when I fired up Jedi Survivor on my new 7800X3D build. I get a straight return to the EA app after starting the game, with no indication of any errors. The log is not really much help, but from what I can interpret, I get a license-failed error and not much else to go on.
 
That, and costs have increased to some 3x what they were, according to the EA/Sony discovery documents that were presented to the EU team.
Yeah, we're basically headed off a cliff. I think it's why so many companies are dropping building their own engines and tools and just getting UE5; it's just too damn expensive to compete with those tools any longer. The amount of engineering investment is astronomical, and on top of it, asset quality has to grow alongside your tools. I think AI could help in the art and programming departments for game dev, but people need to understand it still needs a capable human driving it, correcting it, and making its code/art production-ready. I'm working on a piece of architectural software in Unity, and I used GPT-4 to generate a ton of stuff to get me started on a recent portion where I had to swap out a bunch of shaders and generate assets. It absolutely cut my work time in about half, but under no circumstance could it have done what I needed it to with just one prompt and been fitted to our code base. Basically, the industry is building better tools, and I think particularly for AAA games they're required. It's why games have become just like Hollywood: no one is willing to take a risk anymore and make new IPs, and the bar for success keeps getting higher.
 
Yeah, we're basically headed off a cliff. I think it's why so many companies are dropping building their own engines and tools and just getting UE5; it's just too damn expensive to compete with those tools any longer. The amount of engineering investment is astronomical, and on top of it, asset quality has to grow alongside your tools. I think AI could help in the art and programming departments for game dev, but people need to understand it still needs a capable human driving it, correcting it, and making its code/art production-ready. I'm working on a piece of architectural software in Unity, and I used GPT-4 to generate a ton of stuff to get me started on a recent portion where I had to swap out a bunch of shaders and generate assets. It absolutely cut my work time in about half, but under no circumstance could it have done what I needed it to with just one prompt and been fitted to our code base. Basically, the industry is building better tools, and I think particularly for AAA games they're required. It's why games have become just like Hollywood: no one is willing to take a risk anymore and make new IPs, and the bar for success keeps getting higher.
It's honestly interesting that costs for AAA games have increased so much, and for me personally I feel like the entertainment value has dropped, but I think that's largely due to publishers trying to expand the market for these hugely expensive games to more casual gamers. At the same time, the quality of what can be produced by smaller, even solo, developers with a fraction of the cost is amazing. I think it's more comparable to the music industry than movies. There has been an independent element in movies for quite a while, and technology doesn't really seem to have disrupted things to the same degree as it has music. For music, renting a studio is expensive; outsourcing production, mastering, manufacturing, and distribution of physical media is expensive. Technology has disrupted that to the point where you can do it all yourself at home with a modest investment, even as a hobby with no real or pressing need to recoup costs.
 
It's honestly interesting that costs for AAA games have increased so much, and for me personally I feel like the entertainment value has dropped, but I think that's largely due to publishers trying to expand the market for these hugely expensive games to more casual gamers. At the same time, the quality of what can be produced by smaller, even solo, developers with a fraction of the cost is amazing. I think it's more comparable to the music industry than movies. There has been an independent element in movies for quite a while, and technology doesn't really seem to have disrupted things to the same degree as it has music. For music, renting a studio is expensive; outsourcing production, mastering, manufacturing, and distribution of physical media is expensive. Technology has disrupted that to the point where you can do it all yourself at home with a modest investment, even as a hobby with no real or pressing need to recoup costs.
If you take away the shoddy release of Jedi Survivor, including all the menu issues and performance issues... the game is absolutely amazing. The story, graphics, combat, and cutscenes are all top notch. This game has great value in that department.

It's everything else that's the issue, so we can blame the developers for that for sure. But the writing, to me, was top notch.
 
Even Sierra back in the day was making King's Quest 3, 4, 5, 6, 7, etc., and was subsidizing its riskier new titles by putting the cost of its latest engine into those sure-to-sell games.

Franchises, sequels, reboots, and remakes have been with us since I was born, Mario Bros 3 and so on. It is quite natural for games, especially when the tech was advancing fast: I wanted to make SimCity 2000, but because of the limitations I made SimCity, and I still had a lot more to do and say in that game type.

As costs go up, taking a Mario Bros 2 or Zelda 2 type of risk gets rarer, but like Hollywood, they still try to make a new IP (or turn an existing IP into a new movie IP) from time to time.

The upcoming Starfield is an obvious example, Cyberpunk before that.

Some of them don't have much, if any, originality in how they play, their genre, or the world they are in (say Cyberpunk, or Elden Ring), but they will have some in some aspect.
 
Does he ever get it on with that white Nightsister? The one from the first one?

Still mad they killed the hot Arab chick from the first one.
 
It's honestly interesting that costs for AAA games have increased so much, and for me personally I feel like the entertainment value has dropped, but I think that's largely due to publishers trying to expand the market for these hugely expensive games to more casual gamers. At the same time, the quality of what can be produced by smaller, even solo, developers with a fraction of the cost is amazing. I think it's more comparable to the music industry than movies. There has been an independent element in movies for quite a while, and technology doesn't really seem to have disrupted things to the same degree as it has music. For music, renting a studio is expensive; outsourcing production, mastering, manufacturing, and distribution of physical media is expensive. Technology has disrupted that to the point where you can do it all yourself at home with a modest investment, even as a hobby with no real or pressing need to recoup costs.
No, I think this is spot on; I was merely referring to the AAA industry. But you're totally right in regard to single-A and AA titles; it's amazing what's being made by small groups of passionate people. It still takes significant effort and a huge amount of time and talent to produce things worthwhile. I think people are okay with shorter, more focused games these days too, so there's also that benefit. I've had my fill of pretty but bland, skin-deep open-world games.
 
Does the Sith Arab chick Force ghost hook up with the white Nightsister?

Asking for a friend
Nope. She is totally gone. But the one fugly Nightsister from the first game, first duel, shows up with more metal than meat in the second.
 
I think I tripped the DRM when I fired up Jedi Survivor on my new 7800X3D build. I get a straight return to the EA app after starting the game, with no indication of any errors. The log is not really much help, but from what I can interpret, I get a license-failed error and not much else to go on.
If you change your hardware configuration five or six times compared to your initial EA account info, EA is likely to lock you out of your account for this game.

One of the Hardware Unboxed presenters was complaining about this in his review of the game and launch day problems.
 


Makes you wonder what they really gain by pushing out a broken product. Unless it's just a test of how dumb the population really is and how many people will still pre-order games after receiving broken product after broken product?

I think they'd have enough data to analyze after this long. I'll still pre-order certain things but it's infrequent now and I won't repeat the mistake if a company burns me. I can't fathom why most people don't seem to care and go back for more.
 
Correct. FSR is open source, right?
In both cases, I'm not sure you need the code.

The Nvidia license seems to fully allow using DLSS in your 3D application (as long as it is not a life-or-death application):
https://github.com/NVIDIA/DLSS/blob/main/LICENSE.txt
https://www.rockpapershotgun.com/nvidias-dlss-is-now-available-to-any-developer-who-wants-it
But Nvidia have chosen to be a benevolent graphics god and have released the latest SDK without any restrictions. AMD did the same thing last week, releasing their FidelityFX Super Resolution tech (or FSR to its friends) as a free download. (July 22, 2021)

You can compile NVIDIA's path-traced, DLSS projects on your computer.
 
In both cases, I'm not sure you need the code.

The Nvidia license seems to fully allow using DLSS in your 3D application (as long as it is not a life-or-death application):
https://github.com/NVIDIA/DLSS/blob/main/LICENSE.txt
https://www.rockpapershotgun.com/nvidias-dlss-is-now-available-to-any-developer-who-wants-it
But Nvidia have chosen to be a benevolent graphics god and have released the latest SDK without any restrictions. AMD did the same thing last week, releasing their FidelityFX Super Resolution tech (or FSR to its friends) as a free download. (July 22, 2021)

You can compile NVIDIA's path-traced, DLSS projects on your computer.
DLSS cannot be used by AMD. FSR can be used by any vendor. DLSS is not open source.
 
Correct. FSR is open source, right?
Yes, it's all on GitHub.

In both cases, I'm not sure you need the code.

The Nvidia license seems to fully allow using DLSS in your 3D application (as long as it is not a life-or-death application):
https://github.com/NVIDIA/DLSS/blob/main/LICENSE.txt
https://www.rockpapershotgun.com/nvidias-dlss-is-now-available-to-any-developer-who-wants-it
But Nvidia have chosen to be a benevolent graphics god and have released the latest SDK without any restrictions. AMD did the same thing last week, releasing their FidelityFX Super Resolution tech (or FSR to its friends) as a free download. (July 22, 2021)

You can compile NVIDIA's path-traced, DLSS projects on your computer.

The SDK is just a precompiled library with Windows and Linux builds. It's everything you need to implement it, but the actual implementation is opaque: there's no code for you to see besides the headers.
 
The SDK is just a precompiled library with Windows and Linux builds. It's everything you need to implement it, but the actual implementation is opaque: there's no code for you to see besides the headers.
Seems like we're saying exactly the same thing?

DLSS cannot be used by AMD. FSR can be used by any vendor. DLSS is not open source.
Yes, that's all well established, but I'm not sure of the link with the original point.
 
Cyberpunk 2077 is Nvidia sponsored and it has AMD FSR.
It has one of the lowest implementations of FSR, while the Nvidia side gets the latest version of DLSS. Nvidia is extremely anti-competitive, and zero Nvidia users care about it. So there shouldn't be any complaints if you end up on the losing side of that deal.
 
Yes, that's all well established, but I'm not sure of the link with the original point.
Being available for you to install is NOT the same as something being open source. One of the key aspects of open-source software is that it can be altered freely. You cannot alter DLSS to run on AMD hardware without Nvidia's permission.
 
It has one of the lowest implementations of FSR, while the Nvidia side gets the latest version of DLSS. Nvidia is extremely anti-competitive, and zero Nvidia users care about it. So there shouldn't be any complaints if you end up on the losing side of that deal.
I don't see how not using a competitor's tech is being anti-competitive.
 