Xar
Limp Gawd
Joined: Dec 15, 2022
Messages: 231
> Is it just me, or does anyone else see short clips of this game and immediately think they're looking at Crysis?

It's probably just you. I re-watched it 3 times and found nothing alike at all.
> Same, but just to stop morons from complaining about optimization, they should have obvious names, with a description that clearly states "these settings are not intended to be playable today at launch, and are included for the benefit of future players."
> Call them Future+, Future++ and Future+++ or something.
> Of course, one might argue, they could just launch without them present and patch them in when the time comes, but we all know major patches are unlikely more than a year or two after launch.
> Also, presets are just presets. One could just leave the custom settings for those who really want to crank things up down the line, and not mess with the presets at all.
Morons still complain when a game comes out requiring AVX instructions and clearly stating it in the requirements, despite both Intel and AMD having supported them for at least 12 years at this point. Some people absolutely refuse to upgrade their CPU.
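That refuse-to-start behavior is cheap to implement, for what it's worth. A minimal sketch of such a startup gate, assuming GCC or Clang on x86 (the __builtin_cpu_supports() builtin; MSVC would need __cpuid() from <intrin.h> instead):

Code:
/* Hypothetical startup gate: refuse to launch without AVX.
 * Assumes GCC/Clang on x86; __builtin_cpu_supports() queries
 * the CPUID feature bits at run time. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    if (!__builtin_cpu_supports("avx")) {
        fprintf(stderr,
                "This game requires a CPU with AVX support "
                "(roughly Sandy Bridge / Bulldozer, 2011, or newer). "
                "Please upgrade your CPU.\n");
        return EXIT_FAILURE;
    }
    puts("AVX present, continuing startup...");
    /* ...rest of engine initialization... */
    return EXIT_SUCCESS;
}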
> The industry needs to stop enabling this shit. If you need a PC for just email and a web browser, that is fine; use whatever old thing you have around. It can even be a good experience. But if you want to play games, you should be required to have a mid-range or higher machine from the last 5 or so years.
> Anything less than, say, a Ryzen 5 2600X with a Radeon RX 580, or maybe a Core i5-9600 with a GTX 1060.
> These should be the requirements for "very low" settings in 2023. Nothing lower than this should work at all in any game. They should just refuse to start and come up with a system dialogue box telling you to upgrade.
> It's frustrating that people who refuse to upgrade are holding the rest of us back. I blame the "we need every last player to make ends meet on our shitty free-to-play games with loot boxes and microtransactions" business model for enabling this crap.
> Upgrade to at least mid-grade hardware once every 5 years, or no soup for you. That should be the industry-wide policy.

I don't think it should be that extreme. I think developers just need to simply ignore the people who complain while admitting they don't meet the published minimum system requirements, on the grounds that "other games play fine" on their PC.
> These should be the requirements for "very low" settings in 2023. Nothing lower than this should work at all in any game. They should just refuse to start and come up with a system dialogue box telling you to upgrade.

I imagine that's not what you mean as a general rule.
> Given the length of development for new titles, a game that started development on 1/1/24 should probably have a 4070 Ti as the minimum requirement. Unfortunately, what really happens is that almost no one makes their own engine anymore, so they pull in a 5-year-old engine to start, and the software just stagnates.

If you want the game to run on a PS5 (assuming you think you will release it in time for a PS5 release to make sense), you may as well make a game that can run on a PS5: a 2070 Super or 6700 or so at minimum, maybe a touch higher, especially if you do not have a Series S to run on. And if it runs on a PS5, it is hard to imagine it not running perfectly fine (i.e., at least console-fine: 35 fps at 1080p, high details) on a 6700 XT or 3060 class of card.
> The industry needs to stop enabling this shit. If you need a PC for just email and a web browser, that is fine; use whatever old thing you have around. It can even be a good experience. But if you want to play games, you should be required to have a mid-range or higher machine from the last 5 or so years.
Agreed, and even the now-old AMD Jaguar CPU, a low-power/embedded CPU from 2013 used in the PS4, thin clients, etc., has AVX instructions.
While Sandy Bridge and older CPUs can run most software and other games just fine, missing CPU instructions that modern software and operating systems require for essential functions cannot be ignored.
Even modern iterations of pfSense require AES-NI instructions.
Many people complained about this during the changeover a few years ago, but every x86-64 CPU since 2011 has had these instructions, so if yours doesn't, it is certainly time to upgrade.
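For reference, that kind of AES-NI gate is a single CPUID leaf. A sketch of how such a check works, assuming GCC/Clang's <cpuid.h> helper (leaf 1, ECX bit 25 is the AES-NI flag in both Intel's and AMD's manuals):

Code:
/* Illustrative AES-NI probe, the kind of check the pfSense
 * requirement implies. CPUID leaf 1 returns feature flags;
 * ECX bit 25 indicates AES-NI support. */
#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        puts("CPUID leaf 1 not available");
        return 1;
    }
    printf("AES-NI: %s\n", (ecx & (1u << 25)) ? "yes" : "no");
    return 0;
}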
To be fair, I'm all in favor of staying current, but I also question the value of AVX, especially considering how much it tends to drop the clocks (or blast us with extra heat).
For 30 years we have been hearing that it is all about reduced instruction sets; heck, all modern x86 CPUs do is decode instructions to RISC-like micro-ops for processing, yet they keep adding additional instructions.
> yet they keep adding additional instructions.
Apparently SIMD can be just so much faster that it's felt to be worth it. I haven't looked in years, but I recall seeing some articles doing timing analysis years ago that suggested the gains could be pretty big.
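To make the SIMD point concrete, here is a toy sketch, assuming AVX and GCC/Clang intrinsics: the vector loop retires eight float additions per iteration instead of one, which is where those big timing-analysis gains come from (memory bandwidth permitting).

Code:
/* Toy comparison: scalar vs. AVX array add.
 * Build with: gcc -O2 -mavx add.c */
#include <immintrin.h>
#include <stddef.h>

/* One addition per iteration. */
void add_scalar(const float *a, const float *b, float *out, size_t n)
{
    for (size_t i = 0; i < n; i++)
        out[i] = a[i] + b[i];
}

/* Eight additions per iteration; leftover elements done scalar. */
void add_avx(const float *a, const float *b, float *out, size_t n)
{
    size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; i++)
        out[i] = a[i] + b[i];
}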
Just imagine how big the improvements could be if we got rid of the instruction decode overhead instead and moved the work done by these special-purpose instructions to a competent compiler?
> Just imagine how big the improvements could be if we got rid of the instruction decode overhead instead and moved the work done by these special-purpose instructions to a competent compiler?

We've tried that, for a long time; it doesn't work. You can just build specialized silicon to be faster than more general-purpose silicon, which is why we keep doing it. If you think you know how, then by all means please do; it would be amazingly useful. But people have been trying ever since RISC became a thing CS professors couldn't shut up about, and to this day we build specialized silicon to make shit faster. Some of it, like AVX, is still general-use but specialized in how it works, like doing large vectors. Some is completely specialized, like H.264 or AES hardware that does only that one algorithm but does it real fast with a small amount of silicon.
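Compilers do already split the difference here, for what it's worth: hand one a plain scalar loop with the right flags and it will emit those special-purpose vector instructions on its own. A small sketch (GCC flags; an illustration, not a benchmark):

Code:
/* Plain scalar code; compiled with
 *   gcc -O3 -mavx2 scale.c
 * the loop is typically auto-vectorized into AVX2 instructions
 * (verify with -fopt-info-vec or by reading the assembly). */
void scale(float *x, float s, int n)
{
    for (int i = 0; i < n; i++)
        x[i] *= s;
}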
> Just imagine how big the improvements could be if we got rid of the instruction decode overhead instead and moved the work done by these special-purpose instructions to a competent compiler?

Even the ARM guys seem to disagree. I dunno.
> To be fair, I'm all in favor of staying current, but I also question the value of AVX, especially considering how much it tends to drop the clocks (or blast us with extra heat).

AVX-512 does this, but I haven't heard of AVX or AVX2 doing so.
> For 30 years we have been hearing that it is all about reduced instruction sets; heck, all modern x86 CPUs do is decode instructions to RISC-like micro-ops for processing, yet they keep adding additional instructions.

Every RISC CPU in the last 20 years has been adding more and more instructions as well, as needed for each platform and use-case scenario.
My recommendation:
Grab the 6700 XT/6750 XT/6800 before they run out of stock!!