Starfield

Since people are asking:

Solid enough story. The pacing is up to the player. Like others have said, we know about the technical complaints so far. My biggest quibble, really, is some of the menus, and especially inventory management. I'm hoping refinements can happen. I remember when No Man's Sky first came out: the menus were abysmal, and Hello Games fixed them. If they can do that, then Bethesda can do it. Otherwise, maybe we can get DarnifiedUI for Starfield. ;)

Since someone else mentioned it, let me take a moment to plug The Outer Worlds yet again. It's tight and fun, and it doesn't try to be more than what it is.
 
OK, so it wasn't just me wondering at the start why my OLED TV was gray instead of black in those mines?

Wasn't expecting Metro Exodus levels of badassness in dark areas, but man did Starfield look bad.

Requested a refund from Steam. Thing is, I'm only an hour in, but I still want to keep playing. The story, the characters, and what little gunplay I experienced did leave me wanting more. I have more desire to continue Starfield than I had to keep going about 6 hours into Cyberpunk.

Will likely buy it again very soon from a third party to save a few $. Maybe even today. I guess this is why Bethesda doesn't bother making a new engine: they know we'll keep playing their continually polished turds indefinitely.
 
Proof that AMD paid off Bethesda, and put a $13B company into the corner. /s

I have no idea who this guy is, but I really like how he moves his camera around with his finger, pointing to exactly what he's referring to.
Haha. His production values and way of doing things are definitely DIY, but it works! He's been around for a couple of years, but I feel like this year he's hit a bit of a groove and is a good alternative resource.
Starfield on a potato.
Minimum spec: R5 2600X and a GTX 1070 Ti.
Honestly plays a lot better than I thought it would.


View: https://www.youtube.com/watch?v=kfCCKCeEzUU

Well, it's running at half of 1080p! I would expect a 1070 Ti to be cool with that ;)
https://steamcommunity.com/app/1716740/discussions/0/3824173464656784022/?ctp=2#c3824173464658978011

How to turn on Auto-HDR in Win11. It works. Looks a little better but still screwed up.

I also find it a bit odd that there's no exclusive full-screen mode in the game. Wouldn't that improve performance?
Borderless window can perform just as well as exclusive full-screen if the game presents using something called the "flip model," which I believe Microsoft has been attempting to force on any windowed game in Windows 11 for the last few months. Some games do it natively already, and the SpecialK mod adds it for a few games. Otherwise, borderless window is nice because your game doesn't hang or crash when alt-tabbing or doing stuff in the taskbar.
https://devblogs.microsoft.com/directx/dxgi-flip-model/
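For the curious, the "flip model" just means the swap chain is created with one of DXGI's FLIP_* swap effects, which lets DWM scan the game's buffers out directly instead of copying them. Here's a minimal D3D11 sketch of what that looks like (my own illustration with a hypothetical helper, assuming an existing device and window; error handling omitted):

#include <d3d11.h>
#include <dxgi1_2.h>

// Hypothetical helper: creates a flip-model swap chain for an existing device/window.
// With DXGI_SWAP_EFFECT_FLIP_DISCARD, a borderless-window game can be promoted to
// independent flip and present about as efficiently as exclusive fullscreen.
IDXGISwapChain1* CreateFlipModelSwapChain(ID3D11Device* device, HWND hwnd)
{
    IDXGIDevice*   dxgiDevice = nullptr;
    IDXGIAdapter*  adapter    = nullptr;
    IDXGIFactory2* factory    = nullptr;
    device->QueryInterface(IID_PPV_ARGS(&dxgiDevice));
    dxgiDevice->GetAdapter(&adapter);
    adapter->GetParent(IID_PPV_ARGS(&factory));

    DXGI_SWAP_CHAIN_DESC1 desc = {};              // Width/Height left 0 = use the window's size
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;                    // flip model disallows multisampled swap chains
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 2;                    // flip model requires at least two buffers
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_DISCARD;   // the "flip model" part

    IDXGISwapChain1* swapChain = nullptr;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain);

    factory->Release();
    adapter->Release();
    dxgiDevice->Release();
    return swapChain;
}

The old bitblt path is what you get with DXGI_SWAP_EFFECT_DISCARD; the rest of the renderer stays the same, which is presumably why a mod like SpecialK can retrofit it.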
Folks running Starfield on NVidia GPUs, I'm curious what your performance / GPU utilization / GPU power consumption are like, specifically at Ultra settings and native res.

Based on some things I'm seeing around (such as Daniel Owen's video) and my own little bit of testing, it seems that Ultra quality settings may result in broken performance on NV cards in some scenarios...

Peep this (not actually spoilers)



Note
- The 7900 XT is performing ~30% better than the 3090 Ti and could go even further on a faster CPU, as it's hitting a CPU limit on the 5950X
- The 3090 Ti is only consuming 300 W despite reporting 99% utilization at 1075 mV. That's a big red flag for me; it should be more like 400-500 W at that voltage at 99% utilization
- This seems consistent across the few areas of the game I've visited so far
- Odd performance behavior persists when the Radeon GPU is disabled in Device Manager and all Radeon-related background software is closed
- Behavior persists after clearing shader caches and reinstalling the game
- Both GPUs are using latest (as of 9/1) drivers, Windows 11 fully updated, Steam version of Starfield

So is there something wrong with maxing settings at native res on NV GPUs in the current version of the game? I'd chalk this up to a fluke on my system, the NV and AMD drivers conflicting or something, but I have yet to see this behavior in any other game, and some of the early YT vids seem to corroborate it.
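If anyone wants to pull directly comparable numbers: nvidia-smi --query-gpu=utilization.gpu,power.draw --format=csv -l 1 logs both from the command line, or here's a rough NVML sketch of the same readings (my own snippet, assuming a single NVIDIA GPU at index 0; link against nvml.lib from the driver/CUDA package):

#include <nvml.h>
#include <cstdio>

int main()
{
    nvmlInit();                                  // load the driver-side NVML library
    nvmlDevice_t dev;
    nvmlDeviceGetHandleByIndex(0, &dev);         // GPU 0; bump the index on multi-GPU rigs

    nvmlUtilization_t util;                      // .gpu and .memory, both in percent
    unsigned int milliwatts;                     // board power draw in milliwatts
    nvmlDeviceGetUtilizationRates(dev, &util);
    nvmlDeviceGetPowerUsage(dev, &milliwatts);
    std::printf("GPU util: %u%%  power: %.0f W\n", util.gpu, milliwatts / 1000.0);

    nvmlShutdown();
    return 0;
}

Keep in mind the "99% utilization" NVML/nvidia-smi reports just means the GPU had work queued the whole time; low power at 99% util usually points to the shader cores not actually being saturated, which would fit the numbers above.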
There are problems for Nvidia in this game. But... the 7900 XTX SHOULD be performing a lot better than a 3090 Ti, as a rule. The 7900 XT is closer in equivalent performance to a 3090 Ti.
 
There is literally like no immersion to this game. You are locked into a lot of tiny cubes on the planets and in space, all linked together with string/loading screens.

You cannot move your spacecraft on a planet; you can only touch down and take off via cutscenes, and in space you can fly about 2 kilometers in any direction. That's the entirety of movement in this game, besides walking. This in 2023.

Makes me appreciate Star Citizen more.
I'm enjoying it. At least it's a full-fledged game, unlike Star Citizen.

That being said, I do wish we'd get more of the complexity of Elite Dangerous and the way it handles the ship/planet stuff, combined with the more full-featured campaign and building systems that this game has.
 
I'm enjoying it. At least it's a full-fledged game, unlike Star Citizen.

That being said, I do wish we'd get more of the complexity of Elite Dangerous and the way it handles the ship/planet stuff, combined with the more full-featured campaign and building systems that this game has.
With coherent leadership and management, SC really could've been the defining PC space title. What we're left with is a landscape of well-meaning and talented developers taking a stab at it but, due to the time and resources required, always having to settle for some subset of The Ultimate Space Game: a baby-pool-sized interpretation of space. Thank you for coming to my TED talk.
 
Starfield on a potato.
Minimum spec: R5 2600X and a GTX 1070 Ti.
Honestly plays a lot better than I thought it would.


View: https://www.youtube.com/watch?v=kfCCKCeEzUU


I don't understand why this weirdo is walking his PC parts around his garden, but this video provides some good data points.

It looks like on his Ryzen 5 2600X and GTX 1070 Ti, the GPU was the limiting factor. The CPU looked like it had plenty more frames left in it.

Maybe I shouldn't worry about my Zen 2 Threadripper after all.
 
I'm willing to admit that it's possibly because the HDR features on my screen aren't that amazing. It's an Asus XG438Q, a 43" 4K VA panel with DisplayHDR 600...

But it was a screen that got top reviews when I bought it just a little while ago... uhh. I guess that was 4 years ago, but still! In the grand scheme of things, what is 4 years?

Anyway, I don't use Windows 11, so I don't have any of the AutoHDR features. For games that natively support HDR, I use the feature. Cyberpunk 2077 was one of these, but, at least in Windows 10, it requires turning on HDR for the desktop (otherwise the HDR settings don't show up in game, and it doesn't use HDR), which makes white windows (like the file manager) blast so bright that they sear my eyeballs. In general it makes the desktop experience mostly unusable, or at the very least very uncomfortable, so I wind up turning it on, starting the game, and turning it off again afterward. That gets old, and I often forget to enable it before starting the game, don't notice until I've been playing for half an hour, and by then don't feel like quitting to enable it.

I guess to me HDR is more of a marginal feature than a game-changing one: on or off, it doesn't make a huge difference in game to me, but it does completely ruin my 2D/desktop experience, and for this reason I almost always keep it off.

And yes, this may be in part because I have refused to "upgrade" to Windows 11, and in part because my monitor is 4 years old, but still.

Monitors used to be the one part of our hobby that would last nearly indefinitely. I can't bring myself to replace mine just for HDR, a feature my screen purportedly already has...

At some point I may invest in a 42" LG C<insert number here> but it hasn't been a priority yet, in part because the possibility of burn-in concerns me. I still use this computer for work more than I do for games or movies. We are probably talking 50 hours a week for work work (MS Office, Stats software, email, etc.) with static windows menus and window decorations all day every day. Then maybe another 20-30 hours a week of home productivity and web browsing. I maybe squeeze in 4 hours of games per week.

So for this reason, the screen I pick HAS TO be compatible with productivity style static windows without side effects. Everything else is secondary. I'm not going to get a dedicated screen just for games, when I only have time to spend a few hours per week on games.



I haven't read up enough to know the difference between FALD and other forms of local dimming, but I know my monitor has local dimming, and I usually turn it off. The screen segments are way too large, and it usually just winds up looking worse.



I'm with you on the "too bright for my eyes" part in many cases. Usually not in games, but definitely on the desktop, which is where I spend most of my time.
The XG438Q, if that is the display you're referring to, is edge-lit and DisplayHDR 600. It only has 8 dimming zones, which project vertically across the height of the display. That is the second-worst form of local dimming, made even worse by the low zone count on such a large display. Imagine trying to highlight individual stars in space: on an edge-lit panel you get a fat beam of light stretching across the screen to pick out one star, washing out everything else in its way.

FALD replaces the single backlight with an array of lights, each lighting up a small area of the screen. This form of local dimming is more accurate, since bright spots can be better isolated, and the localized lighting also means the panel can generally get brighter. The higher the number of zones, the better. Mini-LED FALD uses a much denser array of smaller LEDs for even more accuracy, but the drawback is blooming, which needs to be compensated for in some way.

Dual-layer LCD is a hybrid solution that puts a second LCD panel behind the primary one; it mirrors the output image but displays only the luminance channel, boosting or toning down the brightness of the primary image. It can be more accurate and support brightness levels even higher than FALD, but latency is an issue. The reference displays that video producers use, which reach the 10,000-nit Dolby specification, take this approach, and they're extremely expensive; their latency can also run to hundreds of milliseconds.

OLED is a self-emissive display, where each individual pixel adjusts its brightness independently. As a result, it can be the most accurate HDR display, provided the signal processing adheres to the EOTF curve. The drawback, of course, is peak brightness: the diodes are sensitive and prone to burning themselves out when too much voltage is applied. Current OLED displays all employ varying degrees of automatic brightness limiting (ABL) to minimize the speed and effect of burn-in.

TL;DR
A display with FALD local dimming is the minimum you need for a good HDR experience, and the zone count required scales with screen size; 576 would be the floor for a 27-32" monitor, in my opinion. I would pick FALD if peak brightness is your primary concern, OLED for picture quality.
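To put a number on "adheres to the EOTF curve": HDR10 signals are encoded with the SMPTE ST 2084 PQ curve, and an HDR display is accurate to the extent its measured luminance tracks it. A quick sketch of the curve itself, with constants straight from the ST 2084 spec (my own illustration, not from this thread):

#include <cmath>
#include <algorithm>

// SMPTE ST 2084 "PQ" EOTF: maps a normalized HDR code value (0..1)
// to absolute luminance in cd/m^2 (nits), up to 10,000 nits.
double PqEotf(double encoded)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;

    double p   = std::pow(encoded, 1.0 / m2);
    double num = std::max(p - c1, 0.0);
    double den = c2 - c3 * p;
    return 10000.0 * std::pow(num / den, 1.0 / m1);
}
// e.g. PqEotf(0.58) comes out around 201 nits, right by the ~203-nit HDR reference white.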
In Documents/My Games/Starfield, create a text file and name it "StarfieldCustom.ini".


Here is a sample config, from a satisfied user:


[Display]
fDefault1stPersonFOV=90
fDefaultFOV=90
fDefaultWorldFOV=90
fFPWorldFOV=90.0000
fTPWorldFOV=90.0000

[Camera]
fDefault1stPersonFOV=90
fDefaultFOV=90
fDefaultWorldFOV=90
fFPWorldFOV=90.0000
fTPWorldFOV=90.0000
If 90 is the baseline for 4:3, 106.6667 is what you should use for 16:9.
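For what it's worth, assuming the engine treats these as horizontal FOV values and scales Hor+ (my assumption; Bethesda hasn't documented the keys), the standard conversion is:

hFOV_new = 2 * arctan( tan(hFOV_old / 2) * (AR_new / AR_old) )
         = 2 * arctan( tan(90°/2) * (16/9) / (4/3) )
         = 2 * arctan(4/3)
         ≈ 106.26°

So 106.6667 is within a hair of the trig-correct value, and the same formula gives roughly 120.5° for 21:9 ultrawide.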
 
There are problems for Nvidia in this game. But... the 7900 XTX SHOULD be performing a lot better than a 3090 Ti, as a rule. The 7900 XT is closer in equivalent performance to a 3090 Ti.
I'm using the 7900 XT, not the XTX, hence the confusion... Usually the 7900 XT has been very close to the 3090 Ti (maybe a little faster sometimes), which is why the significant gap was such a surprise, coupled with the suspiciously low TBP on the GeForce card.
 
The XG438Q, if that is the display you're referring to, is edge-lit and DisplayHDR 600. It only has 8 dimming zones, which project vertically across the height of the display. That is the second-worst form of local dimming, made even worse by the low zone count on such a large display. Imagine trying to highlight individual stars in space: on an edge-lit panel you get a fat beam of light stretching across the screen to pick out one star, washing out everything else in its way.

FALD replaces the single backlight with an array of lights, each lighting up a small area of the screen. This form of local dimming is more accurate, since bright spots can be better isolated, and the localized lighting also means the panel can generally get brighter. The higher the number of zones, the better. Mini-LED FALD uses a much denser array of smaller LEDs for even more accuracy, but the drawback is blooming, which needs to be compensated for in some way.

Dual-layer LCD is a hybrid solution that puts a second LCD panel behind the primary one; it mirrors the output image but displays only the luminance channel, boosting or toning down the brightness of the primary image. It can be more accurate and support brightness levels even higher than FALD, but latency is an issue. The reference displays that video producers use, which reach the 10,000-nit Dolby specification, take this approach, and they're extremely expensive; their latency can also run to hundreds of milliseconds.

OLED is a self-emissive display, where each individual pixel adjusts its brightness independently. As a result, it can be the most accurate HDR display, provided the signal processing adheres to the EOTF curve. The drawback, of course, is peak brightness: the diodes are sensitive and prone to burning themselves out when too much voltage is applied. Current OLED displays all employ varying degrees of automatic brightness limiting (ABL) to minimize the speed and effect of burn-in.

TL;DR
A display with FALD local dimming is the minimum you need for a good HDR experience, and the zone count required scales with screen size; 576 would be the floor for a 27-32" monitor, in my opinion. I would pick FALD if peak brightness is your primary concern, OLED for picture quality.

If 90 is the baseline for 4:3, 106.6667 is what you should use for 16:9.

Even edge-lit SDR monitors are garbage (edge lighting is a garbage form of backlighting compared to direct backlighting).
 
I'm using the 7900 XT, not the XTX, hence the confusion... Usually the 7900 XT has been very close to the 3090 Ti (maybe a little faster sometimes), which is why the significant gap was such a surprise, coupled with the suspiciously low TBP on the GeForce card.
Ah... for some reason I thought you said XTX!

There are a few games where AMD's cards perform unusually well. But, Starfield will inevitably get some patches and driver optimizations. We'll see how things end up in 2 months.
 
Proof that AMD paid off Bethesda, and put a $13B company into the corner. /s

Have no idea who this guy is, but I really like how he moves his camera around with his finger pointer to exactly what he is referring to.

It drives me insane that he thinks his face should be in the center of a video comparing performance and image quality.
 
The complexity of this game is pretty amazing once you really dig into it. Almost overwhelming. I honestly don't mind the space travel system they implemented given the sheer amount of locations with quests, and the fact that there is a whole outpost/crafting/resource meta-game you can do as well.
 
No HDR or Auto HDR on PC, HDR is broken on console, and the SDR color gamut is broken. On console, HDR works in the main menu and loading screens, but not the game itself. Elevated black level is a serious issue on both PC and console. This is retarded. I don't understand how this continues to be an issue in the industry.


View: https://www.youtube.com/watch?v=xi-C3PzKdsI

Refunded the game before seeing these comments. Wish I could check how much better it may look on my C1.
 
HDR works in the main menu and loading screens, but not the game itself. Elevated black level is a serious issue on both PC and console. This is retarded. I don't understand how this continues to be an issue in the industry.

That's what I saw on PC. The menus looked fine, so I thought Auto HDR worked, until the game started. It just looked wrong. SDR is still kinda weird, but not as bad as Auto HDR? That's just my impression, with no real data to back it up.
 
I thought I’d have a glance at the Steam forums for this game… just for the hell of it… and literally the first three threads that I saw were titled… “Not enough gays!” “This game is anti-White” and my personal favourite, “Starfailed”.

LOL.
 
So I'm looking at 40fps on my RTX 2070 at 1440p...if I'm lucky?

Yikes.
 
Now that I've got DLSS and AutoHDR working, I decided to keep the game. That said, this game has the worst graphics-quality-to-performance ratio of any game I've ever played.
 
Now that I've got DLSS and AutoHDR working, I decided to keep the game. That said, this game has the worst graphics-quality-to-performance ratio of any game I've ever played.
The graphics seem kinda mid to me tbh. Not bad per se, but not quite what I'd expect from a 2023 AAA game that pushes high-end systems to the max. Cyberpunk 2077, Elden Ring, and Control (to name a few) all look better to me, and run better too.

Starfield... Just looks and runs like a Bethesda game? If that makes sense?

Ultimately I don't think that matters so much, though, aside from the poor performance... People tend to play Bethesda games very much for the gameplay, characters, and story. Look at Fallout: New Vegas; it looks like absolute ass even compared to contemporary games, but people still love it and play it.
 
The graphics seem kinda mid to me tbh. Not bad per se, but not quite what I'd expect from a 2023 AAA game that pushes high-end systems to the max. Cyberpunk 2077, Elden Ring, and Control (to name a few) all look better to me, and run better too.

Starfield... Just looks and runs like a Bethesda game? If that makes sense?

Ultimately I don't think that matters so much, though, aside from the poor performance... People tend to play Bethesda games very much for the gameplay, characters, and story. Look at Fallout: New Vegas; it looks like absolute ass even compared to contemporary games, but people still love it and play it.
I firmly believe that it's a 20-80 mixture. People buy Bethesda games for the platform, then mod the game to their own taste. The REAL game is downloading and testing mods until someday you finally have a game you like playing.

It's true & I think Bethesda knows it.
 
It drives me insane that he thinks his face should be in the center of a video comparing performance and image quality.
We used to read about PCs in Maximum PC back in the day; it never said anything about all these PC experts falling from the sky 20 years in the future making YouTube videos. They need to look up MOHAA and read up on my era of PC gaming.
 
Anybody playing on a 3070 Ti at 1440p? I've been trying to decide if I want to get this now or wait a bit; poor performance would probably be the only big dealbreaker for me.
 
If that's the case, I simply won't buy it. I may see what the 7800 XT does in terms of performance before I make my ultimate decision.
It's still early days.
I'm sure performance will be a bit more palatable after a few patches from Bethesda and driver updates from AMD and Nvidia.
 
It's still early days.
I'm sure performance will be a bit more palatable after a few patches from Bethesda and driver updates from AMD and Nvidia.
No offense in any way, shape, or form, but listen to what you just said: "early days." They've had 10 fucking years!

Has it really gotten to the point where these companies have conditioned us to have such low expectations that it's become the norm to expect shoddy performance from a launch title that costs $70 at minimum?!

Holy fucking shit!
 