Which RTX 4090 card are you planning or considering getting?

Individual use cases differ between users; what you subjectively consider "overkill" is subjectively perfect for another user.
Assuming that your definition suits all users is a flawed assumption.

I would like more computational power than the 4090 provides, but alas, that is not an option yet.
Different goals make blanket statements an exercise in futility.
Really? I did not realize that.
 
I mean, it's pretty simple. For 1440p @ 165Hz a 4090 is serious overkill; you're not utilizing the card @ 100% like you often were with the 3090.
Exactly this. If his room is cooler and he's playing the same things as before the upgrade, then his room IS cooler. So what if the card is overkill - you don't buy something like this to only ever play what you currently do. If a more demanding game makes his video card work harder (maybe at 4K), then his room will get warmer.
 
There are different kinds of overkill. As already pointed out, it really depends on the games you're playing, but I would say in the majority of games you're literally getting no better playable performance at 1440p with a 4090 over what a 3090 can deliver, and in fact there are cases where you could actually get worse performance due to all the overhead. So the overall picture is: yes, a 4090 is pretty dumb for 1440p, but there are a few cases where you could actually make it worthwhile as long as you have a top-of-the-line CPU. And I hope people that have a 1440p monitor have at least enough sense to run most games with DSR and actually get some use out of their card. I know I sure as hell wouldn't spend over $1,600 to play all my games at 1440p and watch a 4090 twiddle its thumbs.
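For anyone unsure what DSR actually renders at: the factor multiplies the native pixel count, so each axis scales by the square root of the factor. A quick sketch of the math (stock NVIDIA factors; the driver's exact rounding may differ slightly):

```python
import math

# DSR factors multiply the native *pixel count*; each axis scales
# by the square root of the factor. Rounding here is approximate --
# the driver rounds to slightly different values.
native_w, native_h = 2560, 1440  # 1440p
for factor in (1.78, 2.25, 3.00, 4.00):  # standard DSR options
    scale = math.sqrt(factor)
    print(f"{factor:.2f}x DSR -> ~{round(native_w * scale)}x{round(native_h * scale)}")
```

So 2.25x DSR at 1440p is a full 4K render, and 4.00x is 5120x2880 - plenty for a 4090 to chew on.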
 
Haha. Just such a stretch to try to prove a very odd point.

I had a 3090 Ti previously - never has my 4090 been worse, even on my 5800X3D/4090 rig that I play on at 1080p/360Hz. This is NVIDIA - not AMD. ;)
 
Exactly this. If his room is cooler and he's playing the same things as before the upgrade, then his room IS cooler. So what if the card is overkill - you don't buy something like this to only ever play what you currently do. If a more demanding game makes his video card work harder (maybe at 4K), then his room will get warmer.


This actually makes a lot of sense. I have spent more time than I care to admit trying to understand why I was feeling the difference after being told here that my room isn't cooler. I can understand people telling me my theory was wrong, but the results just made that hard to accept. My room is cooler, lol.

Cooling my PC is really important to me, so it always stood out to me how warm my room was getting with the 3090: it was making all my temps go up, which drives me nuts, so I would open the window to get my ambient down. Even a couple degrees is very noticeable to me because I am a bit too obsessed with looking at temps. But it also just feels different when you lower the air temps a degree or two; I am too cold in my house at 71, at 72 it's juuuust right. I am playing the exact games I was before: Halo Infinite, Stray, Ace Combat 7, Street Fighter 5, to name a few. Probably not the most demanding games. Anyway, at least it makes sense to me now. I honestly was expecting the card to be much warmer overall because of the wattage specs.
 
Haha. Just such a stretch to try to prove a very odd point.

I had a 3090 Ti previously - never has my 4090 been worse, even on my 5800X3D/4090 rig that I play on at 1080p/360Hz. This is NVIDIA - not AMD. ;)
You being ignorant on the subject doesn't change any facts. Go look at the TechPowerUp review and you will see damn well a 4090 can actually go backwards at 1440p compared to other cards due to CPU overhead. Anyone not living under a rock has known this has been a discussion ever since the last generation, when Hardware Unboxed showed that with a slower CPU the 3090 and 3090 Ti can actually fall well behind much, much slower AMD cards at 1440p, because those handled CPU-limited situations better. And at 1080p you're just living in a goddamn fantasyland if you think performance can't go backwards due to CPU limitations.

Again, it's going to come down to people's individual games, their settings, and their personal uses and needs, as well as the rest of their system configuration. If you have a top-of-the-line CPU then of course it's a hell of a lot more justifiable and you can get some good use out of it, but a 4090 is way more suitable to 4K gaming for the vast majority of games and people's setups.
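The mechanics behind that are easy to sketch: each frame is gated by the slower of the CPU stage (game work plus driver overhead) and the GPU stage, so a faster GPU buys nothing once you're CPU-bound, and heavier driver overhead can actually push you backwards. The millisecond figures below are made up purely to show the shape of it, not benchmarks:

```python
def fps(cpu_ms: float, gpu_ms: float, driver_ms: float = 0.0) -> float:
    """Toy model: frame time is the slower of the CPU stage
    (game work + driver overhead) and the GPU stage."""
    return 1000.0 / max(cpu_ms + driver_ms, gpu_ms)

# Hypothetical 1440p numbers with a mid-range CPU (illustrative only):
print(fps(8.0, 10.0, 0.2))  # slower GPU, GPU-bound:  100 FPS
print(fps(8.0, 7.0, 0.5))   # faster GPU, CPU-bound: ~118 FPS
print(fps(8.0, 7.0, 2.5))   # same fast GPU but heavier driver overhead:
                            # ~95 FPS -- behind the "slower" card
```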
 
Please enlighten me. The TechPowerUp review that I found shows the 4090 on top of all other cards at 1440p and even 1080p. Unless you're talking about an individual game and not the overall average? Just odd.
 
Yes, I'm just talking about individual games in the 4090 review. There are games where the performance is essentially the same from 1080p to 4K, and then a couple of cases where a 4090 will actually drop a few percent behind some slower cards at 1440p, and especially at 1080p. That's why I keep trying to make it clear it really depends on the games you're playing and your exact setup to determine what your experience will be. Just throwing the fastest GPU at a game at a lower resolution does not always yield benefits.
 
Okaaay. Keep moving those goalposts.

I get so tired of the "xxxx resolution is a waste for xxxx" crew. Let people live with their fast cards. It has happened forever.

Sure, if I use a Core 2 Duo and a 720p monitor with my 4090 and play Warcraft II, it'll be a waste. C'mon man. Common sense.
 
I have not moved the goalposts a single time, and I've been trying to make it clear over and over and over that it depends on your setup and the individual games. And your Core 2 Duo comment is just beyond fucking asinine; you know damn well nobody is testing with a CPU even remotely that outdated and slow. Again, go do some damn research, because there can be a massive performance increase just going from a 5800X to a 5800X3D in some games, never mind something like a 3600X, which is actually a fairly common CPU that people are using even with a 4090. Hell, I've even seen people with a 4770K running a 4090 and ignorantly thinking they don't have a CPU bottleneck.
 
You said this: "I would say in the majority of games you're literally getting no better playable performance at 1440p with a 4090 over what a 3090 can deliver" - hence why I responded. The average FPS shown at TechPowerUp (which you referenced) proves that to be false.
 
Do you know what playable performance means? For example, if someone had a 1440p 120Hz monitor and a 5800X CPU, then there would be no playable performance difference using a 4090 over a 3090 in the majority of games at that resolution, as the 3090 would already deliver that performance if there was not a game or CPU limitation. So again, it depends on the games you are playing and your current setup, but for most people that would be the case. There is absolutely no way to go through every single individual case scenario here. That said, it doesn't take much sense to figure out the 4090 is much more suitable to 4K gaming if you want to get real use out of the card. At 1440p there are just a lot more variables that can come into play that will affect performance.
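To put that in concrete terms (numbers invented just to illustrate the point):

```python
def delivered_fps(gpu_fps: float, cpu_fps: float, refresh_hz: float) -> float:
    """What you actually see: the GPU can't exceed what the CPU feeds it,
    and on a capped/synced display you won't see more than the refresh rate."""
    return min(gpu_fps, cpu_fps, refresh_hz)

# Hypothetical 1440p/120Hz setup where the CPU tops out around 140 FPS:
print(delivered_fps(gpu_fps=160, cpu_fps=140, refresh_hz=120))  # 3090-class: 120
print(delivered_fps(gpu_fps=240, cpu_fps=140, refresh_hz=120))  # 4090-class: still 120
```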
 
Completely agree with you.
 
This actually makes a lot of sense. I have spent more time than I care to admit trying to understand why I was feeling the difference after being told here that my room isn't cooler. I can understand people telling me my theory was wrong, but the results just made that hard to accept. My room is cooler, lol.

Cooling my PC is really important to me, so it always stood out to me how warm my room was getting with the 3090: it was making all my temps go up, which drives me nuts, so I would open the window to get my ambient down. Even a couple degrees is very noticeable to me because I am a bit too obsessed with looking at temps. But it also just feels different when you lower the air temps a degree or two; I am too cold in my house at 71, at 72 it's juuuust right. I am playing the exact games I was before: Halo Infinite, Stray, Ace Combat 7, Street Fighter 5, to name a few. Probably not the most demanding games. Anyway, at least it makes sense to me now. I honestly was expecting the card to be much warmer overall because of the wattage specs.
It's simple. Your 4090 is giving you double the frames per watt, and you're running a fixed number of frames. So you roughly halved your previous GPU's heat output.

People are just too busy screaming "600 watt card! Why aren't they more efficient?" to recognize that efficiency did vastly improve, but that improvement got poured back into roughly doubling the performance at default settings. And that's totally user-optional - you can run it undervolted or frame-capped and it's barely warm.
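Back-of-the-envelope version, with made-up wattage numbers just to show the shape of the math:

```python
# Toy numbers, not measurements: at a fixed frame rate, power draw
# (and therefore heat dumped into the room) scales inversely with
# efficiency in frames per watt.
target_fps = 90
old_fpw = target_fps / 350.0  # e.g. an older card drawing ~350 W for 90 FPS
new_fpw = 2 * old_fpw         # roughly double the frames per watt

print(target_fps / old_fpw)   # 350.0 W
print(target_fps / new_fpw)   # 175.0 W -- about half the heat output
```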

A VR game maxed my 3090 at 100% GPU utilization and 65-67C at a fixed 90 FPS. I dropped in a 4090 FE, and the GPU temp is a steady 39C at the same settings because it's snoring through the workload.

These GPUs are monsters.
 
Decided to go with a 4090 since I couldn't snag a 7900 XTX at launch. Got a new case (Fractal Meshify 2 RGB) arriving tomorrow, and I'll get the new Seasonic ATX 3.0 PSU that comes out on the 21st. Now I just have to find a 4090.

I'm waiting on the same PSU or the MSI MEG.

Probably going to be just as hard to find as the 4090 at this point.
 
Think MC will carry it on day 1?
Not sure, but Newegg is supposed to have the gold 1200 watt model on the 21st. Pricing is a lot more than what Seasonic listed in September. Newegg right now doesn't have the platinum model. I think the gold will be fine for me.
 
I hear ya. I just think everything below the 4090 is way overpriced for the performance. I ignore the AMD fanboi crap. It's the same every generation: NV dominates the top and they trade blows below that. Obviously NV is winning the war and the minority don't like it. AMD needs to step it up with their crap launch drivers and get with it on the RT front. The market has spoken.

I really don't care if AMD doesn't want to compete with a $1,600 card; that's not where you make money. As a business, compete up to $1k. If they have a halo product, sure, price it like the 4090, but I'm not sure that's required to be successful. Nvidia is just going to keep increasing prices on you, so you can't really keep up that game. I could say the same about Nvidia fans: they will pay up for top performance even if they charge $2k next time for the 5090. Can I afford it right now? Sure, but I don't expect everyone else to be caring about $1,500+ cards. They seem to be selling a shitload of 7900 XTXs just fine, and with the chiplet design the margins are probably decent.
 
Same here with my Gigabyte 4090 OC. Very well-built card that runs cooler than my previous EVGA 3090 Ti, which still blows my mind. Thinking of mounting it vertically, but I haven't fully decided yet.
I tried mounting my 4090 vertically, and though it only ran a little hotter, I went back to the horizontal installation. These cards weren't really meant to be mounted vertically, IMO.
 
I hear ya. I just think everything below the 4090 is way overpriced for the performance. I ignore the AMD fanboi crap. It's the same every generation: NV dominates the top and they trade blows below that. Obviously NV is winning the war and the minority don't like it. AMD needs to step it up with their crap launch drivers and get with it on the RT front. The market has spoken.
Totally agree with what you said here. I've been in the hobby for over 25 years and the fanboi stuff is more rampant than ever.
 
I remember about 20+ years ago going to E3. Nvidia was there, and a friend and I told the Nvidia people, "We're team green, Nvidiots for life!" He looked so sad; not sure he'd heard the term Nvidiot before. I do still prefer Nvidia, but only because I spend enough that they tend to be the only option. AMD still has lots of catching up to do...
 
Nvidia was my first card and they have been trouble-free for me for years. I did try ATI/AMD back in the 9700 and X800 days, when the Omega drivers were a great alternative driver suite, and those were great cards. I tried AMD again with another card (can't remember the model) a while back, had a ton of issues with it, returned the card, and have stayed with Nvidia ever since. I shouldn't be so stubborn, but I usually just stick with what works for me. That was the only piece of hardware I have ever had to send back as defective in all my years of building PCs.
 
This was exactly my story as well. Like EXACTLY the same lol.
 
That's similar to my experience as well.

The only Radeon GPUs that I own right now are the ones in my PS5 and Steam Deck, and I haven't had any problems with those. Probably because Sony and Valve take care of the software side of things.
 
Not sure, but Newegg is supposed to have the gold 1200 watt model on the 21st. Pricing is a lot more than what Seasonic listed in September. Newegg right now doesn't have the platinum model. I think the gold will be fine for me.

Were you able to find out? Think I read it's now supposed to be Q1 2023.

These PSUs are the new 4090s.
 
I went to purchase it today, and Newegg just removed the 12/21 ETA; it now just says out of stock. I haven't found any word on when it will become available. Bummer - I still need the PSU and a 4090 to finish my upgrades. I really don't want to compromise and get a different PSU if I can avoid it.
 

Yeah, I just left MC and they had like three 4090s on the shelves. I think they had one of the MSI MEG 1300W PSUs a couple of weeks ago, but that was it (and Newegg is now giving an ETA of 12/22 for that one).
 
In another thread, apparently Seasonic has pushed it back to January. Since I can't even get a 4090, I will just wait.
 
I ended up returning the MSI MPG A1000G PCIE5 1000W ATX 80 Plus Gold PSU and instead just used a CableMod adapter on an existing EVGA P2 1000W PSU. Good enough.
 
My MSI MEG 1300W backorder on Newegg just flipped to packaging... so it appears they got some stock in!
Awesome! The only thing that I don't like is the heat wrap around the 16-pin cable, whereas Seasonic leaves it all as single braided cables (no wrap).
 