Alan Wake 2

I'll have a chance to play this all week since I'm off for 9 days, unless I call in today, which would be a really good idea.
 
At least with the new titles coming out, Intel is working on drivers for every new release just like AMD and Nvidia, which is what it's going to take from a driver team.
 
The AMD Adrenalin 23.20.17.05 drivers were tested by Ancient Gameplays and look to give a 6-10 FPS boost on most AMD cards. They've actually been available for a few days, but I hadn't seen anyone specifically test them yet, and they aren't easy to find without a direct link.
They also help clean up some flickering on textures.
https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-20-17-05-alan-wake-2

video has performance comparisons at the end:

View: https://www.youtube.com/watch?v=fyXDEvdqMmo
 
I'm kinda wondering how RE4 Remake and Alan Wake II compare graphically/gameplay-wise, because it's hard not to be impressed by the RE4 intro sequence. They did an awesome job recreating that. I watched a bit into the beginning of AW and honestly it just seems kinda meh, plus the dead fat naked dude on the picnic table thing...like wtf. Not trying to piss on anyone's parade if you're enjoying the game, just making my own observations. Is AW2 trying to steal a little of the recent RE remakes' thunder? It's got a very similar HUD to RE4 and the inventory is also very similar in general. Combat seems a little more action oriented (or is it?), with more impactful gunplay. I don't remember much of the original so these things may have been that way. All I remember is a whole lotta flashlight and walking around.

Someone mentioned the visuals don't warrant the performance hit and I kinda have to agree. When you consider RE4 can look as good as it does running 4k native, max everything, on last-gen flagship cards and maintain a steady 60fps (also consider there are probably like 20+ Ganados on screen at a time during the village sequence), it makes AW seem a little questionable. You're getting sub-30fps with 4090s (non-upscaled, max settings) in a tiny little scene staring at a guy talking to him. Really...this is good? Even Lies of P has a very beautimously crafted game world, and scene vs scene could probably look as good as anything in AW2, yet can probably run at 3x the framerate too.

There's most definitely a little more going on under the hood in AW2 with the newer RT stuff, but it's way overdone and the performance is unacceptable imo. Like literally walking through a cattle-chute path in a forest section and still laughable performance. Does the environment open up a bit later on and become much more voluminous, and then how the hell does that run? It seems things could've been toned down a bit and made to run better with a just-as-nice-looking end result. Devs in general need to find their stride with all this new tech being introduced and stop piling shit on and on, to the point you're running a slideshow on the best of hardware. I don't like upscaling res in my games and most likely won't ever use the option. I don't plan on playing AW2 anyway as I found the OG pretty boring tbh, but I'm still interested in the tech regardless.
 
I’m enjoying the hell out of this game (playing on PS5). Perfect time of year for a game like this.
 
She wasn't in the last game...
I'm specifically referring to the last game to reference the Alan Wake "universe", which was Quantum Break. All I'm saying is that for the same reason they felt the need to recast the character with someone who looks completely different, I believe they should've kept the casting as similar as possible to maintain continuity.

And after finally trying the game, I had an extremely hard time making it past the intro. The game's optimization is crap and it doesn't actually even look that good. Remedy really lost their touch since the days of Max Payne. Maybe it's a good thing Rockstar made Max Payne 3...
 
It's a 20 hour game. Can be done in 2-3 days if you commit.
I don't agree with this. Like most things, this is only if you just power through the main story and don't do any of the other things along the way.

I'm nearly at 20 hours right now and I'm only halfway through the game. (Chapter 5 in both realities)
 
I'm specifically referring to the last game to reference the Alan Wake "universe", which was Quantum Break. All I'm saying is that for the same reason they felt the need to recast the character with someone who looks completely different, I believe they should've kept the casting as similar as possible to maintain continuity.
I'm sorry, but in the greater context of such a deliberately weird and malleable story, I'm not sure how I could disagree more with your reasoning.
And after finally trying the game, I had an extremely hard time making it past the intro. The game's optimization is crap and it doesn't actually even look that good. Remedy really lost their touch since the days of Max Payne. Maybe it's a good thing Rockstar made Max Payne 3...
I wish that this game ran better too. I'm with you there. I'd love to disable everything related to ray tracing because it looks like an awful smeary mess at the lower settings. However, Rockstar has been just as guilty of questionable performance decisions in the recent past as well.
 
Anyone wishing the game ran better just needs to turn off raytracing. This is not the game to choose as your battle for one that is not 'optimized' well. It runs perfectly fine given the immense level of detail. If you don't have a 40 series card you're basically not going to be able to use the raytracing in this game. Even anything less than a 4080 is going to struggle though.
 
Alan Wake 2 uses Remedy's Northlight engine. I wonder if previous games used the same engine; it seems like there's tons more detail compared to Control.
Every game Remedy has made since Quantum Break except Alan Wake Remastered uses the Northlight Engine. That includes Quantum Break, Control, and this game. Alan Wake Remastered still uses the "Alan Wake" engine.
 
I wasn't monitoring framerates but I had everything maxed including RT at 4k, DLSS quality and frame gen, but it was buttery smooth with my 4080.
 
Even anything less than a 4080 is going to struggle though.
exactly. That's kinda my point in general I guess. People are paying $1600+ for the best of the best current-gen video cards to play current-gen games and it can't even be done without upscaling. I find that absurd. Did you want to run AW2 @ 4k maxed without upscaling? Well you can do that......2 years from now when Nvidia releases the 5000 series....for another $1800. That's right, for just another small fee, you'll now be able to run this 2-year-old game truly maxed out at acceptable framerates. fuck all that bs.
 
exactly. That's kinda my point in general I guess. People are paying $1600+ for the best of the best current-gen video cards to play current-gen games and it can't even be done without upscaling. I find that absurd. Did you want to run AW2 @ 4k maxed without upscaling? Well you can do that......2 years from now when Nvidia releases the 5000 series....for another $1800. That's right, for just another small fee, you'll now be able to run this 2-year-old game truly maxed out at acceptable framerates. fuck all that bs.
You can turn down the graphics, you know. One of the videos in this thread shows it at low, it still looks great.

To me it seems like basically just jealousy from people over the naming of settings. Gamers are saying "I wanna max out the settings, if I can't the game sucks!", not evaluating how the game actually looks or runs at a given setting, just getting mad because they can't have everything turned up to max. Would it make you happy if they renamed low to ultra, and then called ultra "impossible" or "ludicrous" or something like that?

Have a look at the videos in this thread. The optimization guide gets it running at about 67fps on a 3060Ti. The playthrough review has some footage of the game in low, and it still looks damn good. Digital Foundry tested it vs the PS5 version and found out that PS5 quality mode stacks up about equal to the PC's medium preset.

To me, it looks like the game is extremely good-looking and atmospheric, and takes advantage of modern hardware. It just also has some "moon shot" settings that are at, or even a little beyond, what we have right now.

So just turn it down, and enjoy it. Not the first time a game has been like that. Crysis was famous for it. It doesn't matter what the settings are called.
 
Gamers are saying "I wanna max out the settings, if I can't the game sucks!" Not evaluating how the game actually looks or runs at a given setting,
It will change fast, I think. It must already be really rare for someone to actually think like that; in reality it's more a fear that it's a sign the game won't scale down and will never perform well, because there is a correlation.

The "games should not have settings only useful for future hardware, they would be better off not letting you choose them" type of talk will not survive. The "upscaling is stupid" talk as well; people will ask for and reward good dynamic upscaling real soon.

Especially if game engines (or the AMD/Nvidia/Intel utilities) become good at pre-selecting settings for you when you give them a performance target preference. I fully understand not wanting to have to go online to find a list of settings that make sense for you to have a good experience.
 
Anyone wishing the game ran better just needs to turn off raytracing. This is not the game to choose as your battle for one that is not 'optimized' well. It runs perfectly fine given the immense level of detail. If you don't have a 40 series card you're basically not going to be able to use the raytracing in this game. Even anything less than a 4080 is going to struggle though.
no! if i cant do 4k/144/hdr/full rt/no dlss, its a piece of unoptimized shit ;)
 
It will change fast, I think. It must already be really rare for someone to actually think like that; in reality it's more a fear that it's a sign the game won't scale down and will never perform well, because there is a correlation.
But people without the highest end hardware often have to turn down the settings. That has long been a thing. It is going to look ok too, since consoles are basically never able to run at max settings, so they are going to make sure it looks good for them.
The "games should not have settings only useful for future hardware, they would be better off not letting you choose them" type of talk will not survive.

Why? What's wrong with supporting new technologies? It can help give a game more replayability, make it age less. Likewise, games have to start trying to use new tech for it to ever develop. If they just forever stay on older technology, we never get better visuals.

Also I'd note that it seems like pathtracing works fine on a 4090. It's intense, but seems like you can max that out and the game is playable on a 4090. So it doesn't require future hardware, it just requires the best of today's hardware.

upscaling is stupid as well, people will ask for and reward good dynamic upscaling real soon.

The whining about upscaling is really silly. If you want higher resolutions, get used to it. We can't just make GPUs massively more powerful. Look at how much power the 4090 draws already. Even assuming you could double the performance by making it twice as big, are you interested in a 900-watt GPU?

You still maintain the option you used to have: Turn down the resolution. You now just have an additional one: Turn down the render resolution but use a clever neural net to try and restore as much of the image quality as possible. If upscaling isn't for you, turn down the rez. Or turn down the detail and run at a higher rez. Or deal with a slower frame rate. You have options, choose the one that's right for you.
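FWIW, the render-resolution arithmetic behind those upscaler presets is trivial. A toy Python sketch, using the commonly cited per-axis scale factors for the quality/balanced/performance modes (these factors are my assumption of typical values, not anything official):

```python
# Toy sketch: internal render resolution implied by an upscaler's
# per-axis scale factor. The factors below are the commonly cited
# values for quality/balanced/performance presets (assumptions).
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the (width, height) actually rendered before upscaling."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4k output with a quality preset renders internally at about 1440p:
print(render_resolution(3840, 2160, "quality"))  # (2560, 1440)
```

So a "quality" mode at 4k is really pushing about 3.7M pixels per frame instead of 8.3M, which is where most of the speedup comes from.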


Really it just sounds like sour grapes: people who don't have the best hardware whining that they can't max the settings, even though the game looks good with lower settings and runs well. I guess they could have just not put in the higher settings, but why? What does that do, except protect the egos of people who want to feel like they are running things at the highest settings?
 
Anyone wishing the game ran better just needs to turn off raytracing. This is not the game to choose as your battle for one that is not 'optimized' well. It runs perfectly fine given the immense level of detail. If you don't have a 40 series card you're basically not going to be able to use the raytracing in this game. Even anything less than a 4080 is going to struggle though.
Also by all accounts, the non-RT lighting is very good. They did a real good job on it and the RT lighting is a minor upgrade. A noticeable one, but not one that is night and day as it seems to be in Cyberpunk. They seem to have worked hard on making the non-RT lighting really good in this game.
 
Why? What's wrong with supporting new technologies?
I am not sure what problem people have with it, but like I said, my prediction is it will not survive; it will become perfectly accepted. Those views will become quite marginal in the Unreal 5 era, and they won't make sense with path tracing: games should let people throw 4 or 6 rays per pixel with 5 bounces if they want. How much more code is that? And once RT cores get powerful enough, voilà, better lighting.
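The "rays per pixel with bounces" point is easy to put numbers on. A back-of-envelope sketch (purely illustrative arithmetic; real path tracers terminate and vary rays adaptively):

```python
# Back-of-envelope ray budget: total rays per frame scales linearly
# with pixel count, samples per pixel, and bounce depth. Real path
# tracers terminate rays adaptively, so this is an upper-bound sketch.
def rays_per_frame(width: int, height: int, spp: int, bounces: int) -> int:
    return width * height * spp * bounces

native_4k = rays_per_frame(3840, 2160, 4, 5)  # ~166M rays per frame
upscaled = rays_per_frame(2560, 1440, 4, 5)   # ~74M at a 1440p render res

# Dropping the render resolution from 4k to 1440p cuts the ray count 2.25x:
print(native_4k / upscaled)  # 2.25
```

which is exactly why resolution (and thus upscaling) matters so much more for ray tracing than for raster.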

You still maintain the option you used to have: Turn down the resolution. You now just have an additional one: Turn down the render resolution but use a clever neural net to try and restore as much of the image quality as possible. If upscaling isn't for you, turn down the rez. Or turn down the detail and run at a higher rez. Or deal with a slower frame rate. You have options, choose the one that's right for you.

That option kind of disappeared a long time ago with the shift from CRT to LCD; it became a question of who does the upscaling. With an LCD, you cannot opt out of upscaling if you do not run at native resolution.
 
exactly. That's kinda my point in general I guess. People are paying $1600+ for the best of the best current-gen video cards to play current-gen games and it can't even be done without upscaling. I find that absurd. Did you want to run AW2 @ 4k maxed without upscaling? Well you can do that......2 years from now when Nvidia releases the 5000 series....for another $1800. That's right, for just another small fee, you'll now be able to run this 2-year-old game truly maxed out at acceptable framerates. fuck all that bs.
You can run AW2 @ 4k max settings without upscaling on a 4090 just fine currently. Although it would be silly NOT to use upscaling when it looks visually identical but performs better.

I'll never understand the folks who are absolutely obsessed with "native" resolution. Dead Space remake is the only game I can think of that I have played in years where native actually looked better than DLSS Quality (or FSR Quality for crappy games without DLSS support).
 
Reminds me of Crysis, at release the best hardware available couldn't maintain a playable framerate in that either. The big difference is that Crysis looked massively better than most other games when it released. Alan Wake 2 is not such a big step forward.

The fact that a $1600 GPU can only manage about 30FPS at native 4k tells me that we're still a long way from mainstream 4k gaming though. While TVs have decidedly moved on to 4k, it's too much for all but the most expensive PCs. If you're rendering at 1440p anyway, it'll probably look the same or better on a native 1440p panel than upscaled for a 4k one.
 
You can run AW2 @ 4k max settings without upscaling on a 4090 just fine currently. Although it would be silly NOT to use upscaling when it looks visually identical but performs better.
Depends on what people mean by both "max settings" and "just fine". With the new ray reconstruction RT affair, does it not look even better in some ways to use upscaling now (until Nvidia adds DLAA + RR support)?

[attached chart: min-fps-pt-3840-2160.png, minimum FPS with path tracing at 3840x2160]
 
^"just fine" rofl
You can turn down the graphics, you know.
people don't spend that kind of money to turn settings down...
and the point about Crysis is not even comparable; flagship cards back then were a fraction of the price of what they are these days, like around $500 or so, which was easy enough for anyone to afford back then.
 
^"just fine" rofl

people don't spend that kind of money to turn settings down...
So... That's just ego then. "I spent $X which means it hurts my ego if I have to turn details down!" What you are really demanding is that they simply don't offer an option, because you don't feel like the performance on your hardware is good enough. You don't have to use it, the game looks and works great without path tracing, but you want it taken away anyhow because it hurts your ego that it is there and the performance isn't what you'd like.

Because that's literally what you are asking for if you say "Take away an option, and rename the next level down to maximum." It isn't possible to just magically make PT faster on all hardware.
 
That's minimum fps, not average...
It's also basically the worst-case scenario. You want 4k, with no upscaling, with everything maxed, with path tracing on... Well, for path tracing, that's pretty unrealistic. If you don't want to do upscaling, probably best to stick with raster, particularly if you run a 4k display. There they show a 73 FPS average and 63 FPS 1% minimum at max settings, no upscaling. Sounds quite nicely playable to me. If you want path tracing, really you need to be looking at upscaling, both since more pixels are the hardest thing, performance-wise, for ray tracers (there is a reason old RT demos used to always be so low-rez), and because you need a denoiser ANYHOW, so you might as well use DLSS 3.5 and get your upscaling and denoising combined. With that, it looks like you can get 63 FPS with DLSS quality, path tracing, maxed settings, at 4k.
 
That's minimum fps, not average...
I remember in the Daniel_Owen vid when he set it to 4k max it tanked to the 20's.
"Take away an option, and rename the next level down to maximum."
That's what they probably should've done tbh. Because you know in a year or so, when they go to sell you the game all over again as the complete version with all the DLCs, they could've included those max settings as well for a little visual upgrade. They'd be cut and dried, ready to go; they wouldn't even have to do any work. And by that time the next-gen cards could possibly be out, or just around the corner anyway.
 
^"just fine" rofl

people don't spend that kind of money to turn settings down...
and the point about Crysis is not even comparable; flagship cards back then were a fraction of the price of what they are these days, like around $500 or so, which was easy enough for anyone to afford back then.

The 4090 can do ~60 fps native 4k WITH path tracing (which is bleeding edge technology and insanely demanding in terms of GPU performance) and FG. Is 60 fps not considered "just fine" anymore?

...and before you go "but but but!!! frame gen is fake frames!!!!!!! it doesn't count!!!" you never said anything about not using FG. Native res with FG is still native res. FG is a setting in the game, so let's max that setting out!

Just because your ego is too big to turn down settings (or use visually identical looking upscaling technology) to obtain the frame rates you desire, doesn't mean that other people won't happily do it.

Regarding the Crysis situation, the flagship card at the time (8800 Ultra) cost $830, which is over $1200 in today's dollars. More reasonably you could say the 8800 GTX was the flagship, which "only" cost $600 (over $900 in today's dollars). Prices weren't as egregious as they are today, but they weren't exactly cheap either. Most people weren't going out buying top-of-the-line graphics cards like they were an impulse buy at the Target checkout line.
 
That's what they probably should've done tbh. Because you know in a year or so, when they go to sell you the game all over again as the complete version with all the DLCs, they could've included those max settings as well for a little visual upgrade. They'd be cut and dried, ready to go; they wouldn't even have to do any work. And by that time the next-gen cards could possibly be out, or just around the corner anyway.
...or they could just offer the option now since:

1) It works well on 4090s, even at 4k, and is usable on lower end cards with lower screen resolutions.

2) Provides a nice bonus for replayability later, when you load it up in a couple years and want to run through on newer hardware.

3) It IN NO WAY HURTS TO HAVE IT. Seriously. You can just not use it. That has the same effect on you as it not being there.


I really don't see a good reason for them to have not included this. Now if the game was only path traced, or if it looked like shit rasterized? Then ya, that's a problem. However it looks fantastic in its rasterized mode. The addition of PT doesn't hurt that.
 
perhaps in the Daniel_Owen vid he didn't have frame gen enabled?
To be fair though, Remedy's recent games have always run a little poorly at launch. yeah dlss
I agree with the comments, it's all good. I have no ego about this, I'm not even running a 4090 (I'm not even playing this game lol!!!), but if I was I would feel a little bitter about the performance, absolutely.
 
perhaps in the Daniel_Owen vid he didn't have frame gen enabled?
To be fair though, Remedy's recent games have always run a little poorly at launch. yeah dlss
I agree with the comments, it's all good. I have no ego about this, I'm not even running a 4090 (I'm not even playing this game lol!!!), but if I was I would feel a little bitter about the performance, absolutely.
Why? The game is stunning, and truly next-gen from a technology standpoint. 32 FPS native w/ max on the 4090 is fine. DLSS quality gets you 64FPS, and once you add frame-gen that's an effective 128hz.

This is why I've said again and again that this game is basically perfectly tuned for the 4090 at 4k, and the 4080 at 1440. Each one gets you effectively 60FPS at the resolution buyers buy the card for.

And yes, I do consider DLSS quality to be acceptable. It has very minimal IQ loss and basically doubles frames. Frame gen when your base FPS is already 60FPS is perfectly fine as well, because your input latency is perfectly fine in a game like this w/ 60FPS, but you get the benefits of an effective double hz to the display with zero IQ loss.

I disagree that Remedy's games have run poorly at launch. Control was perfectly fine; it's just that it was a first-gen DLSS title with the original DLSS 1.0, which generally wasn't very good from either an IQ or performance perspective. It was DLSS 2.0 that made it more playable with raytracing enabled, which really had nothing to do with Remedy, as that was an Nvidia thing.

Finally, I'll point out this is like the only AAA title this year that has basically launched bug-free, and runs exactly the way people expected it would with all the path tracing enabled.
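The "64 FPS base, effective 128 Hz" math above is simple enough to sanity-check. A trivial sketch (assumes exactly one generated frame per rendered frame, which is how current single-frame frame generation is generally described):

```python
# Trivial sketch of the frame-generation claim: one interpolated frame
# per rendered frame doubles the displayed frame rate, while input
# latency still tracks the rendered (base) frame time.
def effective_fps(base_fps: float, frame_gen: bool = True) -> float:
    return base_fps * 2 if frame_gen else base_fps

def frame_time_ms(fps: float) -> float:
    return 1000 / fps

print(effective_fps(64))            # 128 frames shown per second
print(round(frame_time_ms(64), 1))  # 15.6 ms between rendered frames
```

i.e. the display gets 128 Hz worth of frames, but the game still samples your input on a roughly 15.6 ms cadence, which is why a 60FPS base feels fine here.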
 
In this case, higher IQ in many ways I think (because of ray reconstruction)
Also more fine detail in areas, even in non-ray-reconstruction cases. I've seen a number of games where DLSS quality reveals fine details in distant geometry (like wires and such) that you don't see as clearly at native rez. Now of course it introduces some minor artifacts too, but I find on quality it ends up being a tradeoff: while there may be some minor downgrades over native, there are often some minor upgrades too.
 