NVIDIA rumored to be preparing GeForce RTX 4080/4070 SUPER cards

I'm agnostic, and was expecting to get an XTX, but I grabbed one of the last EVGA 3080s with a waterblock because A: it was MSRP, and B: free waterblock. That said, the 4xxx series has been slightly tempting, but the Supers did zero to interest me. I'm waiting for the 5xxx and (whatever AMD), and really hope AMD can bring it; open source drives stuff forward. Remember PhysX?
 
...really hope AMD can bring it, open source drives stuff forward. Remember PhysX?

Basically every single UE game made for the last couple years has used PhysX. Several Unity games as well. It's built into both engines, along with some others.
 
And as I think you're alluding to, it's not as if the 7900 XTX performs poorly in RT. It does compared to nVidia's Lovelace, but it's very comparable to 3090-level performance, which was the top of the previous gen. And a 3090 owner today (if they could magically swap cards for free) would greatly benefit from the raster on a 7900 XTX, which they would be more than happy to have...

why are you comparing the current AMD flagship GPU to Nvidia's last gen flagship?...why not make a direct comparison to the 4090?...we all know why...it's also not just about straight frame rates...you also have to take DLSS vs FSR into consideration...not just in terms of game support but image quality...Nvidia is the best option if you care about the best image quality and performance

you also keep talking about upcoming UE5 titles but neglect to mention upcoming RT and path tracing titles...do you really think RT game support is slowing down and doesn't require beefy hardware?...most AAA games for the foreseeable future are going to support RT...path tracing is much more rare and niche but AMD is getting annihilated there...the main selling point of next gen GPUs and beyond is RT performance...otherwise you can still use a 3080 and get excellent raster performance

you act like Cyberpunk 2077 and Alan Wake 2 are the end of hardware intensive RT enabled games
 
why are you comparing the current AMD flagship GPU to Nvidia's last gen flagship?...why not make a direct comparison to the 4090?...
Because the 4090 is a card that costs twice as much if you can even buy it.

I’ve said multiple times: at $1000 or more, sure, buy nVidia. Anything less than that, if dollar value matters at all, AMD wins.
we all know why...it's also not just about straight frame rates...you also have to take DLSS vs FSR into consideration...not just in terms of game support but image quality...Nvidia is the best option if you care about the best image quality and performance
Do you even read my posts? Even the one that you’re quoting here? I have never denied that nVidia has an IQ advantage. I feel like you’re missing the point: the point is whether or not there is enough ROI there to make it worth buying an nVidia card.

I feel like you should read the rest of the post you just quoted. Because you’ve selected two paragraphs out of all that, and I addressed all of those things.
you also keep talking about upcoming UE5 titles but neglect to mention upcoming RT and path tracing titles...do you really think RT game support is slowing down and doesn't require beefy hardware?...most AAA games for the foreseeable future are going to support RT...path tracing is much more rare and niche but AMD is getting annihilated there
I’ve addressed all this too. I think for your argument to make sense you’d have to show that there are going to be a bunch of engines that are incredibly taxing with their RT implementations.

And frankly there aren’t that many devs developing their own game engines and far fewer that use RT, let alone path tracing.

From Tekken 8 to Witcher Next, those games are all using UE5. Feel free to let me know all the games that are coming that feature heavy RT implementations. Most are going to be using light implementations that will just trade blows, like the rest of the titles I listed above. That’s just factual. And you can blame the consoles for that if you want.
you act like Cyberpunk 2077 and Alan Wake 2 are the end of hardware intensive RT enabled games
They pretty much are. You missed Ratchet and Clank and Fortnite in hardware RT enabled mode. But in the rest of the RT titles you can list, the 7900XTX and 4080 run neck and neck.
RE4, RE Village, Spiderman, Jedi Survivor, Avatar, etc.

I feel like there isn’t a point in talking to you about this stuff because the only reason you can misconstrue my positions is because you literally aren’t reading what they are. You posted an HU video earlier in the thread and I literally laid all this stuff out. Piece by piece. Precept by precept.

If you want to have an unbalanced view on me, nVidia, and AMD, that’s your prerogative. But it’s not as if HU (or anyone else’s) “dollar per frame” graph is particularly hard to read.
 
If you want to have an unbalanced view on me, nVidia, and AMD, that’s your prerogative. But it’s not as if HU (or anyone else’s) “dollar per frame” graph is particularly hard to read.

I've said it multiple times that if all you're interested in is raster performance then the better buy is AMD...I'm talking strictly about RT in terms of technology and image quality...of course AMD is the better value, again no one is saying otherwise...you're missing my point

the only games that feature 'light' implementations of RT are AMD sponsored games and the reason for that is because AMD GPUs can't handle a heavy RT workload...RT is the only reason to buy a next-gen GPU, otherwise an older gen card will still raster perfectly fine...Alan Wake 2 literally just came out 3 months ago...the Phantom Liberty expansion came out 4 months ago...so to say that the era of heavy RT and PT enabled games is over is silly

you act like UE5 titles are not hardware intensive with Lumen, Nanite etc...look at the recent Lords of the Fallen game...developers are also taking shortcuts by using DLSS/FSR as a substitute for actual optimization
 
I've said it multiple times that if all you're interested in is raster performance then the better buy is AMD...I'm talking strictly about RT in terms of technology and image quality...of course AMD is the better value, again no one is saying otherwise...you're missing my point

the only games that feature 'light' implementations of RT are AMD sponsored games and the reason for that is because AMD GPUs can't handle a heavy RT workload...RT is the only reason to buy a next-gen GPU, otherwise an older gen card will still raster perfectly fine...Alan Wake 2 literally just came out 3 months ago...so to say that the era of heavy RT and PT enabled games is over is silly
Light RT is going to be a majority of the titles for the next 10 years. Considering only the top of the graphics card product stack is a common mistake.

Even if AMD graphics cards cease to exist tomorrow, people aren’t rolling with 4080s and 4090s. Not most of them. Or 3090s or 3090Tis either.

For game devs the target will always be the median. Pull up the Steam survey and you can see for yourself that a majority of cards owned by real people are not cards that will perform well with heavy RT.

Then there are consoles on top of that. Agree or not, the PS5 and Pro have a big impact on game development. There will be zero PT PS5 games. That’s just facts.

Which is why I simply asked you to feel free to list all the heavy RT games that are coming. The reality is we may get 1 or 2 of those a year. If that. Meanwhile every other game also exists. Settings can be turned down for minimal IQ impact (see Alan Wake 2 and Fortnite for obvious examples with great software RT). And dollar-to-performance for both RT and raster favors AMD below $1000, especially when considering the whole field and, yes, “settings”.

Some devs like CDPR and Remedy will absolutely target the top. But it has to be painfully obvious even to you that there are scores of games made per year and we got 2 in 3 years that are making you come to this conclusion. CP2077 is a 2020 game. AW2 came out in 2023. Ratchet and Clank is a PS5 port… from 2020? Anyway. So nothing from 2021 or 2022 even registered.

Avatar, the other big “RT title” that came out in 2023 and won DF’s best graphics of the year, is neck and neck on both AMD and nVidia GPUs. If Ubisoft uses the same engine for the next AC games, then AMD will continue to perform perfectly well in them.
you act like UE5 titles are not hardware intensive with Lumen, Nanite etc...look at the recent Lords of the Fallen game
Do I need to post Daniel Owen’s video again? 7900XTX is ahead of 4080 in 5 out of 6 titles.
 
Which is why I simply asked you to feel free to list all the heavy RT games that are coming. The reality is we may get 1 or 2 of those a year. If that. Meanwhile every other game also exists. Settings can be turned down for minimal IQ impact (see Alan Wake 2 and Fortnite for obvious examples with great software RT). And dollar-to-performance for both RT and raster favors AMD below $1000, especially when considering the whole field and, yes, “settings”

so now you're saying we may get 1-2 heavy RT titles a year when you previously said that CP2077 and AW2 were the end...do you think developers are going to announce ahead of time that their games are going to be brutal to run?...no...we only find out post-release...was Alan Wake 2 expected pre-release to be so heavy?...I believe they only announced path tracing support a few weeks before release

Steam surveys etc are not valid for high end gamers...of course the majority are still on 1080s or below just like the majority still game at 1080p...so games should cater to that demographic?

why are more displays coming with 240+ Hz, QD-OLED gaming panels etc?...hardware is being pushed more than ever...consoles will always be the baseline but consoles have also become more PC-like with mid-cycle refreshes...isn't the PS5 Pro rumored to be coming out?
 
so now you're saying we may get 1-2 heavy RT titles a year when you previously said that CP2077 and AW2 were the end...
I never said that. I’ve only ever said that you’re basing your position off of a minority of games. And games with PT will continue to be that for a while.
do you think developers are going to announce ahead of time that their games are going to be brutal to run?...no...we only find out post-release...was Alan Wake 2 pre-release expected to be so heavy?...I believe they only announced path tracing support a few weeks before release
This is basically dodging the question.
Steam surveys etc are not valid for high end gamers...of course the majority are still on 1080s or below just like the majority still game at 1080p...so games should cater to that demographic?
Yes. If you’re a dev that’s who you’re targeting.
why are more displays coming with 240Hz, QD-OLED gaming panels etc?...hardware is being pushed more than ever...consoles will always be the baseline but consoles have also become more PC-like with mid-cycle refreshes...isn't the PS5 Pro rumored to be coming out?
I mentioned the PS5 Pro (again, it still won’t have path tracing though, and even if it did it would be targeting AMD hardware and/or low RT implementations). But this whole section of your post is “obvious”. You don't need to bring up monitors: we're talking about top-end consumer hardware with the 4090. It's a $1600 MSRP graphics card; it costs more than a 42" LG C3 or Sony A90K. Tech companies are gonna push tech.

A majority of titles are not targeting that card though, I figured even saying that would be redundant or obvious.

Also, you should look up monitor stats. Those move pretty slowly even with enthusiasts. You think everyone has or is going to have 4K 240Hz OLED monitors? 1080p monitors are still nearly 60% of what most gamers are on. 2.5K is ~16%. 4K is <4%.

EDIT: Maybe the reason you keep arguing with me is that I’m pragmatic and you’re idealistic. You're reaching for an idealized future where everyone is playing on top-end GPUs and every dev is building games for PC and including the best tech. But it's never been that way. Sure, there are outliers, the id Softwares (John Carmack specifically) and Epics of the world. But that has always been the minority.

Devs for the most part target systems that can play their games today. AW2 is going to have a near 0% player count in 4 years. So will CP2077. So while having "cool tech" is cool or whatever, it's a lot of work to build something 90%+ of your audience will never experience. It's actually a better use of their time to optimize their games so that the maximum number of people get the best experience possible. Because that is something players will remember. It's not like I'm telling you these things to offend you or something. I'm simply stating the reality of the situation. Which also says a lot about why even CP2077 and AW2 have spent so much time on software solutions, so that both can look really good for the many people who will never see the game with that path tracing switch turned on.

Is PT cool? Yes. Does path tracing look amazing? Absolutely yes. Do I think there will be a bunch of games with top-end PT over the next 2 hardware generations when a majority of people can't get playable framerates? No, absolutely not.

So if you're Joe Average Gamer, focusing on PT is a mistake. It is still definitely for people who can afford $1000+ video cards. But if your goal is a card with reasonable dollar value, one that will generally speaking play more games longer for less, and (apparently it still has to be mentioned) will compete well with light RT implementations, then AMD makes a lot of sense. If you can afford a $1000 video card in the first place (which, as all the numbers will show you, is basically <4% of all gamers), then sure, nVidia is absolutely the way to go. No question. The 4080 has "enough" raster and can play these future-looking path tracing titles better. And look better while doing it.
 
Basically every single UE game made for the last couple years has used PhysX. Several Unity games as well. It's built into both engines, along with some others.
Yeah, the open CPU branch. Nvidia's highly optimized advanced (proprietary / owned) version? Dead.
 
Epic has also ditched PhysX in UE5 for their own homegrown physics engine (Chaos).
I would think that was necessary in order to have better control over resources. Bolting components together, especially black-box code, makes building optimized, coordinated programs almost impossible.
 
Devs for the most part target systems that can play their games today. AW2 is going to have a near 0% player count in 4 years. So will CP2077. So while having "cool tech" is cool or whatever, it's a lot of work to build something 90%+ of your audience will never experience...

that makes no sense whatsoever...as the years go by, mid and even lower-end GPUs will be able to run games like Cyberpunk and Alan Wake 2 with path tracing enabled...it's a future tech...has Witcher 3 gotten less popular over the years?...are people still playing Crysis today?...Deus Ex Mankind Divided, Half-Life 2?...Frontiers of Pandora has a secret graphics preset that is meant for future GPUs
 
that makes no sense whatsoever...as the years go by, mid and even lower-end GPUs will be able to run games like Cyberpunk and Alan Wake 2 with path tracing enabled...it's a future tech...has Witcher 3 gotten less popular over the years?...are people still playing Crysis today?...Deus Ex Mankind Divided, Half-Life 2?...Frontiers of Pandora has a secret graphics preset that is meant for future GPUs
4090 at 4K native, Alan Wake 2, High quality preset, high-quality ray tracing -> 28 fps. I really don't think that in the near or even somewhat distant future, low-end and mid-range GPUs will have double the performance of a 4090 (basically double the transistors or clock speed), which is what it would take to make PT common even many years from now. DLSS and frame generation can make it more playable, but I don't see frame generation being usable unless the base frame rate is already rather high, as in 60+, preferably much higher. Unless AMD/Nvidia start making multi-die GPU solutions that work, I see a wall with path tracing or any real ray-traced lighting solution in games. The performance hit relative to the image quality gain is hard to justify. Plus, many types of games (fast-paced, MP, VR) need the FPS to be playable; FPS can be the overriding feature for a game, trumping RT image quality gains.
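As a rough back-of-the-envelope on those numbers (the 28 fps baseline is from the post above; the target frame rates and the linear-scaling assumption are mine, and real scaling is rarely linear):

```python
# Sketch: how much raw GPU uplift would native 4K path tracing need?
# Baseline from the post: a 4090 runs Alan Wake 2 at 4K native with
# high-quality path tracing at ~28 fps. Assumes performance scales
# linearly with GPU throughput, which is an optimistic simplification.
BASELINE_FPS = 28

def required_uplift(target_fps: float, baseline: float = BASELINE_FPS) -> float:
    """Multiplier over the baseline card needed to hit target_fps."""
    return target_fps / baseline

print(f"60 fps needs ~{required_uplift(60):.2f}x a 4090")
print(f"120 fps needs ~{required_uplift(120):.2f}x a 4090")
```

Even under that optimistic assumption, a steady 60 fps needs more than 2x a 4090's throughput, which supports the point that mid-range cards won't be running native PT any time soon.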
 
that makes no sense whatsoever...as the years go by, mid and even lower-end GPUs will be able to run games like Cyberpunk and Alan Wake 2 with path tracing enabled...it's a future tech...has Witcher 3 gotten less popular over the years?...are people still playing Crysis today?...Deus Ex Mankind Divided, Half-Life 2?...Frontiers of Pandora has a secret graphics preset that is meant for future GPUs
Your logic is flipped.

The vast majority play the game at launch or near launch.

These are not tech demos. They’re games.

There will always be outliers. I personally tend to play games late because I care more about sales than hype. Even including people like me, I would say my 90% number is generous.
 
that makes no sense whatsoever...as the years go by, mid and even lower-end GPUs will be able to run games like Cyberpunk and Alan Wake 2 with path tracing enabled...it's a future tech...has Witcher 3 gotten less popular over the years?...are people still playing Crysis today?...Deus Ex Mankind Divided, Half-Life 2?...Frontiers of Pandora has a secret graphics preset that is meant for future GPUs

People will of course play games for many years, but the number of people playing Crysis now compared to 14 years ago is a lot lower.
 
Basically every single UE game made for the last couple years has used PhysX. Several Unity games as well. It's built into both engines, along with some others.
UE5 doesn't have PhysX, but it was the preferred physics engine in UE4. Epic decided to minimize the amount of third-party software present in UE5.
Yeah, the open CPU branch. Nvidia's highly optimized advanced (proprietary / owned) version? Dead.
NVIDIA released PhysX under the BSD-3 license long before GPU-accelerated physics fell by the wayside. You can still implement GPU-accelerated physics in the current version should you choose to. Support for it still exists in the drivers, including for current products.
 
GeForce RTX 4080 SUPER reviews rescheduled to January 31st

Nvidia has opted to delay reviews for the RTX 4080 Super by one day...this decision is unrelated to any driver issues...the delay is due to problems with RTX 4080 Super Founders Edition card shipments to reviewers who did not receive their samples on time, with some only receiving them this weekend and many even just today...

https://videocardz.com/newz/geforce-rtx-4080-super-reviews-rescheduled-to-january-31st
 
GeForce RTX 4080 SUPER reviews rescheduled to January 31st

Nvidia has opted to delay reviews for the RTX 4080 Super by one day...this decision is unrelated to any driver issues...the delay is due to problems with RTX 4080 Super Founders Edition card shipments to reviewers who did not receive their samples on time, with some only receiving them this weekend and many even just today...

https://videocardz.com/newz/geforce-rtx-4080-super-reviews-rescheduled-to-january-31st
I understand why they did that (fairness) but I doubt anyone is going to care much. This is just going to be another small performance bump in all likelihood.
 
I understand why they did that (fairness) but I doubt anyone is going to care much. This is just going to be another small performance bump in all likelihood.

someone coming from a 2000 series GPU might care...it's a minor bump in performance only if you already have a 30 or 40 series card
 
4080 Super is pretty much a $200 cheaper 4080 non-Super
Yeah that's effectively what I was saying. That's why reviews probably won't tell us much we don't already know. I'd love to see a larger performance bump but ~5-10% (over the 4080) seems highly likely. I guess the only real thing to see is how close to MSRP the AIB models will be.
 
ASUS RTX 4080 Super Noctua OC Edition GPU: 4-Slot Cooler & Up To 2640 MHz

https://www.asus.com/motherboards-components/graphics-cards/asus/rtx4080s-o16g-noctua/

 
https://videocardz.com/newz/nvidia-...3dmark-similar-gaming-performance-to-rtx-4080

So fucking retarded. Only thing going for this is the $200 discount but not in territories where Nvidia continues to price gouge. Honestly if a person has already waited this long to upgrade to a flagship card they might as well keep waiting for Blackwell.
Blackwell may be delayed. If it’s significantly delayed, then what makes sense changes.

https://www.thefpsreview.com/2024/0...-gb-following-delay-of-geforce-rtx-50-series/
 
Almost makes me wonder why they bothered. I suppose a new product/review cycle at the revised price gives it better visibility than just a regular price cut.

At least the 4070 Super got a nice bump.
Well, the alternative is what AMD has done, right? Not a fucking thing coming from that camp except the trash-tier 7600XT, which is the worst price/performance anyone has seen, like, ever. No refresh even rumored and no price cuts to even compete with NV's refresh.

So as a company, I can't even blame NV for this lackluster refresh cycle. They are competing with themselves as far as anyone can tell.
 
latest rumors have Nvidia considering a 4090 Super or 4090 Titan for the end of 2024 as a stopgap until Blackwell in 2025
 
I just built a new PC last weekend and I'm currently using the 3080ti out of my older PC in it. I play on an LGC2 and I like having all the bells and whistles on and don't mind using DLSS to give me a boost. I intended to grab a 4080 Super when they drop tomorrow and put the 3080ti back but now I'm not sure. I'm contemplating just grabbing a 4060/4060ti for the living room rig for casual gaming and movies and sticking with the 3080ti in my primary for now. If the 5000 series launches this Christmas I don't mind waiting but having to wait until summer(ish) of 2025 feels like such a long time.
 
And the source is suspect: a random Chinese site that sourced it independently before the embargo lifted…

I'm expecting basically the same when the review embargo lifts in a few hours...at most maybe a 2-3% uplift with the 4080 Super...4070 Super was around 15%...4070 Ti Super around 6-8%...so don't go in expecting some massive improvement with the 4080 Super
 