RTX 4xxx / RX 7xxx speculation

I moved to couch gaming on a big screen a while back. When I redo my office I'll have the ability to switch between km/monitor and controller/TV.

For now I'm all controller and TV.
Same. For FPS games such as Battlefield, it's keyboard and mouse on the local monitor.

For other games it's Nvidia Shield GameStream to the living room with a controller and TV/surround sound.
 
For years I remember getting a new graphics card for each new Battlefield game that came out. This is the first time I'm not upgrading my card on a yearly basis, because there's really no game coming out that I'm excited about.
 
Apparently the next gen will be some 80-120% faster than current gen.
MLID just put out a video. I trust his sources, especially when he claims the leak is very, very likely correct.
I will sell my 3080 in a few months, since I rarely game during the summer, and if needed I've got my PS5, which I use more often anyway.
Then I hope to snatch either Nvidia's or AMD's top-tier card at MSRP.
 
Apparently the next gen will be some 80-120% faster than current gen.

Would that be incredibly good on the high-end side of things, even for a 2+ year cycle?

If that way of calculating it is somewhat right, it would be the biggest jump since at least
https://www.reddit.com/r/nvidia/comments/im923c/an_analysis_of_generational_performance_gains_for/


Series       x60        x70       x80
400-Series   +15.38%    N/A       +56.25%
500-Series   +19.05%    +25.00%   +16.28%
600-Series   +41.15%    +36.99%   +29.87%
700-Series   +25.00%    +13.64%   +26.58%
900-Series   +11.11%    +41.18%   +31.58%
10-Series    +104.08%   +66.67%   +69.49%
20-Series    +56.25%    +33.33%   +31.58%
AVERAGE      +38.86%    +36.14%   +37.38%

For a reference:
https://www.techspot.com/review/2099-geforce-rtx-3080/

At release time, at least, the 3080 was on average 47% above the 2080 at 1440p and 68% at 4K, and those numbers were a historically very good gen-to-gen jump (if MSRP pricing had actually happened).

120% wouldn't quite be double the Ampere jump, but it wouldn't be too far from it.
 
Apparently the next gen will be some 80-120% faster than current gen.
MLID just put out a video. I trust his sources, especially when he claims the leak is very, very likely correct.
I will sell my 3080 in a few months, since I rarely game during the summer, and if needed I've got my PS5, which I use more often anyway.
Then I hope to snatch either Nvidia's or AMD's top-tier card at MSRP.
That will most likely be for RT. Raster won't see a 100% improvement.
 
Apparently the next gen will be some 80-120% faster than current gen.
MLID just put out a video. I trust his sources, especially when he claims the leak is very, very likely correct.
I will sell my 3080 in a few months, since I rarely game during the summer, and if needed I've got my PS5, which I use more often anyway.
Then I hope to snatch either Nvidia's or AMD's top-tier card at MSRP.

The top SKU has 70% more shaders. The gains beyond that come from cranking the TDP. For people who like less noise and will be focusing on SFF builds and/or undervolting, the ceiling on next-gen gains is much lower.
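A quick back-of-the-envelope on that, if you want numbers (a Python sketch assuming perf scales roughly as shaders x clock; the ratios are illustrative, not from any leak):

```python
# Back-of-the-envelope: how much of a rumored ~2x gain the extra
# shaders alone could explain, and how much would have to come from
# TDP-driven clocks. All ratios here are illustrative assumptions.

shader_ratio = 1.70   # top SKU rumored at 70% more shaders
target_gain  = 2.00   # rumored best-case 2x raster uplift

# If perf ~ shaders * clock, the clock uplift needed on top:
required_clock_ratio = target_gain / shader_ratio
print(f"Clock uplift needed beyond shaders: {required_clock_ratio:.2f}x")
# -> ~1.18x, i.e. roughly 18% higher clocks, which is exactly where
#    the cranked TDP comes in. Cap the power (SFF / undervolt) and
#    you're left with at most the ~1.7x from shaders, likely less,
#    since shader-count scaling is rarely perfectly linear.
```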
 
Here you go if you want to watch:


8nm Samsung is pretty bad compared to 7nm TSMC, and Lovelace is using 5nm TSMC.
With a cranked-up TDP and Nvidia's own version of Infinity Cache, I can see it happening, especially since the competition is heating up.
MLID is even claiming that Nvidia is doing this just to stay on par with AMD's high end. HIGH END, not even AMD's enthusiast cards.
I have no reason not to trust his leaks, as he was spot-on with his current-gen leaks.

I just watched their stream, and MLID is confident they will be 2x rasterization performance, with 80% as the worst-case scenario.
 
Really, really hopeful for these cards. Still sitting with a 2070m; desktop prices, for the first time ever, made me not upgrade the tower.
 
Here you go if you want to watch:


8nm Samsung is pretty bad compared to 7nm TSMC, and Lovelace is using 5nm TSMC.
With a cranked-up TDP and Nvidia's own version of Infinity Cache, I can see it happening, especially since the competition is heating up.
MLID is even claiming that Nvidia is doing this just to stay on par with AMD's high end. HIGH END, not even AMD's enthusiast cards.
I have no reason not to trust his leaks, as he was spot-on with his current-gen leaks.

I just watched their stream, and MLID is confident they will be 2x rasterization performance, with 80% as the worst-case scenario.

I'll believe it when I see it.
 
The top SKU has 70% more shaders. The gains beyond that come from cranking the TDP. For people who like less noise and will be focusing on SFF builds and/or undervolting, the ceiling on next-gen gains is much lower.
You can't directly compare shader counts from one gen to another and extrapolate performance estimates that way. Shaders always work differently from gen to gen, as each new gen is a new architecture.

Personally I think MLID is full of crap most of the time.
 
You can't directly compare shader counts from one gen to another and extrapolate performance estimates that way. Shaders always work differently from gen to gen, as each new gen is a new architecture.

Personally I think MLID is full of crap most of the time.
Pretty interesting combination of opinions you got there...
 


4090 halo parts may use 650 watts, 7900 over 800.

If this is right, then people are going to need to run their PCs on multiple common household circuits in North America. You might be able to get a PC on one circuit, but still have to run your display, sound, etc. on another. Maybe squeeze it all onto one if you've got 15 amps?
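Rough math on that (a Python sketch; the 15 A / 120 V circuit and the NEC 80% continuous-load rule are standard for North America, but every component figure below is an illustrative guess):

```python
# Rough power budget for one 15 A / 120 V North American branch circuit.
# The 0.8 factor is the NEC continuous-load derating; component draws
# are illustrative guesses, not measurements.

circuit_watts  = 15 * 120 * 0.8   # ~1440 W usable for continuous loads
gpu_watts      = 650              # rumored 4090 halo part
cpu_watts      = 250              # high-end CPU under load
rest_of_pc     = 100              # board, RAM, drives, fans, pump
psu_efficiency = 0.90             # roughly 80 Plus Gold at this load

wall_draw = (gpu_watts + cpu_watts + rest_of_pc) / psu_efficiency
print(f"PC at the wall: ~{wall_draw:.0f} W of a {circuit_watts:.0f} W budget")
print(f"Left for display, sound, etc.: ~{circuit_watts - wall_draw:.0f} W")
# -> ~1111 W for the PC and ~329 W of headroom: one big display plus
#    a receiver could already crowd a single 15 A circuit.
```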
 


4090 halo parts may use 650 watts, 7900 over 800.

If this is right, then people are going to need to run their PCs on multiple common household circuits in North America. You might be able to get a PC on one circuit, but still have to run your display, sound, etc. on another. Maybe squeeze it all onto one if you've got 15 amps?

If those power consumption numbers are true, it would essentially remove SFX/SFX-L mITX builds from the high end.
 
I don't care how good the performance is, I don't want some 800-watt behemoth in my computer. Absurd.
 
Hopefully the wattage is just a silly rumor. Very few people have the ventilation to handle a 600 W card. E.g. even in an airflow case you would have to run the case fans at max speed or cook your motherboard. Case air would probably be in the high 30s or low 40s with very good ventilation, and could get into the high 40s with average ventilation.

If the 4080 has a 35% raster and 80% ray-tracing performance increase with 350 W power draw, then I might bite, if there are any new games on the horizon that I want to play. Not getting a card with 500 W power draw at stock, though.
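For a sense of scale on those case-air numbers: the steady-state rise over ambient is roughly heat divided by (air mass flow x heat capacity). A Python sketch, where the net-airflow figures are illustrative guesses (rated fan CFM always overstates real through-case flow):

```python
# Steady-state case-air temperature rise over ambient:
#   dT = Q / (m_dot * cp),  with m_dot from net through-case airflow.
# The CFM figures below are illustrative guesses, not measurements.

CP_AIR      = 1005       # J/(kg*K), specific heat of air
RHO_AIR     = 1.2        # kg/m^3, density of air
M3S_PER_CFM = 0.000472   # m^3/s per CFM

def case_air_rise(heat_watts, net_cfm):
    m_dot = net_cfm * M3S_PER_CFM * RHO_AIR   # kg/s of air moved
    return heat_watts / (m_dot * CP_AIR)      # degrees C over ambient

for cfm in (150, 75, 40):   # very good / average / poor net airflow
    print(f"{cfm:>3} CFM: +{case_air_rise(600, cfm):.0f} C over ambient")
# -> about +7 C, +14 C, and +26 C for 600 W dumped into the case; with
#    a ~23 C room that spans low-30s to high-40s case air, in line
#    with the ventilation worry above.
```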
 
Remember how up in arms everyone was when the 3dfx Voodoo 5 series needed extra power? (4-pin Molex connector for the 5500, external power supply for the 6000.)
What's old is new again.

But pushing up against the 15 amp limit is a whole other issue.

Assuming these power draws are real, I predict that in less than a month there will be lawsuits claiming "Nvidia burned my house down!" because Johnny daisy-chained five extension cords to power his new card.

I don't think they will want to take the risk. There is a whole lotta stupid and substandard wiring out there.
 
Yeah, why is this even happening? Who thought it was a good idea to triple the power draw lol.
Brute-forcing performance, possibly, but yeah, definitely a weird way to go. If true, I wouldn't touch a card with that kind of power draw with a 10-foot pole.

Funny, Maxwell gave me more performance for fewer watts. Do that again, Nvidia.
 
Remember how up in arms everyone was when the 3dfx Voodoo 5 series needed extra power? (4-pin Molex connector for the 5500, external power supply for the 6000.)
What's old is new again.

But pushing up against the 15 amp limit is a whole other issue.

Assuming these power draws are real, I predict that in less than a month there will be lawsuits claiming "Nvidia burned my house down!" because Johnny daisy-chained five extension cords to power his new card.

I don't think they will want to take the risk. There is a whole lotta stupid and substandard wiring out there.
There is a huge difference, though, between needing a Molex and having a power draw that makes cooling very difficult without lots of noise or a massive custom loop. Most people run their cards on air, and the heat is dumped inside the case. A 200 W card is easy to deal with, an air-cooled 350 W card becomes a bit of a challenge to keep case temps at a reasonable level, and 600 W is a whole different ballgame. E.g. you might have a 5-degree case temp rise with 200 W, a 12-degree rise with a 350 W card, and a 20+ degree rise with a 600 W card due to your cooling becoming overloaded.

I also wonder how they would cool a 600 W card, as it would probably need a fat push/pull 360 rad or a 2+ kg air cooler plus massive case airflow, at minimum, to keep the temps in check. Cards drawing 600 W basically mean you can forget sub-40 dBA systems under full load, unless you are on a massive custom loop.
 
Yeah, I wasn't even thinking about cooling.

I always wanted to put a liquid cooling loop outside, buried 3-4 feet down, where the temp is a constant 53-ish degrees Fahrenheit.

But with the price of copper and stainless these days, I don't think I could afford to anymore.
 
There is a huge difference, though, between needing a Molex and having a power draw that makes cooling very difficult without lots of noise or a massive custom loop. Most people run their cards on air, and the heat is dumped inside the case. A 200 W card is easy to deal with, an air-cooled 350 W card becomes a bit of a challenge to keep case temps at a reasonable level, and 600 W is a whole different ballgame. E.g. you might have a 5-degree case temp rise with 200 W, a 12-degree rise with a 350 W card, and a 20+ degree rise with a 600 W card due to your cooling becoming overloaded.

I also wonder how they would cool a 600 W card, as it would probably need a fat push/pull 360 rad or a 2+ kg air cooler plus massive case airflow, at minimum, to keep the temps in check. Cards drawing 600 W basically mean you can forget sub-40 dBA systems under full load, unless you are on a massive custom loop.
Even so, with 600 W to dissipate, would even a 360 radiator do it? I doubt a 120 or even a 240 is really going to cut it. Are they really going to try to push a thick 360 rad in a mainstream product?

As for an air cooler, you'd need something massive, I'd assume. I mean, they are really having to engineer thermal solutions just to manage 350 W, and even then you'll notice so many cards now are taller, longer, and just overall way bigger than they used to be. Imagine twice the power draw now.
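For ballpark sizing (a Python sketch; the watts-per-section figure is a common community rule of thumb, roughly 100-150 W per 120 mm of radiator at moderate fan speeds, assumed here rather than taken from any spec):

```python
# Rule-of-thumb radiator sizing: ~100-150 W dissipated per 120 mm
# section at moderate fan speeds and a sane coolant delta. The
# watts-per-section heuristic is an assumption, not a spec.

def sections_needed(heat_watts, watts_per_120mm=125):
    return -(-heat_watts // watts_per_120mm)   # ceiling division

for gpu_w in (350, 600):
    n = sections_needed(gpu_w)
    print(f"{gpu_w} W card: ~{n} x 120 mm sections ({n * 120} mm of rad)")
# -> 350 W lands around a 360 (3 sections); 600 W wants ~5 sections,
#    i.e. a thick push/pull 360 or a 480/560 -- hard to ship as a
#    mainstream AIO.
```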
 


4090 halo parts may use 650 watts, 7900 over 800.

If this is right, then people are going to need to run their PCs on multiple common household circuits in North America. You might be able to get a PC on one circuit, but still have to run your display, sound, etc. on another. Maybe squeeze it all onto one if you've got 15 amps?

It really seems like that would be a good time to go back to calling it a Titan. When you buy something with a name like that, you kind of check a mental box that says "I may need to consult an electrician."

I suspect these are trial runs to push boundaries, and probably not what will appear, but we'll see!
 
Apparently the DGX Station A100's compressor cooling system is able to cool 4x A100s with 80 GB of VRAM (300 W rated cards), so the market for putting 1,000+ W of GPU in a box with a low-noise, in-the-box cooling solution is certainly already out there, which makes the rumor of 550-600+ W somewhat plausible for a very niche product. But those tend to be somewhat separate product lines, and I have a hard time seeing it on a "regular" xx80 card. On the ridiculously priced xx90 editions, though, I'm not sure why there would be a limit; many of those buyers just want the option to put as many watts into the thing as they want, and if I can undervolt and get good performance at 300 W, I certainly don't mind having 650 W available when needed.

Would Dell/HP even be able to sell those in California, even as work-type machines?
 
Just bought a 3080 Ti. Paid a lot for it. Marked up $100, plus forced into a bundle by Antonline. If the 4060 comes out next week with almost similar perf, I will be hurt.
 
Just bought a 3080 Ti. Paid a lot for it. Marked up $100, plus forced into a bundle by Antonline. If the 4060 comes out next week with almost similar perf, I will be hurt.
I hear they're not launching the 4xxx series until September, meaning they won't be available until next spring, if that.
 
Just bought a 3080 Ti. Paid a lot for it. Marked up $100, plus forced into a bundle by Antonline. If the 4060 comes out next week with almost similar perf, I will be hurt.

Didn't Nvidia announce they're going to be dropping prices 8 to 12% and that their board partners would be passing that on to consumers?
 
You people are crazy if you think they're going to release 600 W+ cards. It is not going to happen. Maybe they will push it to 500 W.
 
Brute-forcing performance, possibly, but yeah, definitely a weird way to go. If true, I wouldn't touch a card with that kind of power draw with a 10-foot pole.

Funny, Maxwell gave me more performance for fewer watts. Do that again, Nvidia.
Not the first time AMD went for brute force to compete, e.g. the 290X/Hawaii.

I highly doubt, however, that it will go anywhere near 500 W -- that's outlandish.
 
You people are crazy if you think they're going to release 600 W+ cards. It is not going to happen. Maybe they will push it to 500 W.

The speculation is for halo parts, not standard stock. AIB special editions and factory water-cooled models.

And I have to wonder, with all three companies adopting chiplets going forward, won't they have to start planting the seeds of multi-circuit gaming PCs at some point?

We know the 7900 will have at least three chips, with some people saying four: two for graphics, one interconnect, and maybe an FPGA-ish part. Why not double things for the generation after that? Why not triple?

Obviously these won't be video cards for regular users. We're talking machines for running racing and flight sims, not CoD: Bleh Ops. These people are already building rooms around their hardware; they can handle a PC that requires their electrician to chime in. He's already there for the hydraulics.
 
The speculation is for halo parts, not standard stock. AIB special editions and factory water-cooled models.

And I have to wonder, with all three companies adopting chiplets going forward, won't they have to start planting the seeds of multi-circuit gaming PCs at some point?

We know the 7900 will have at least three chips, with some people saying four: two for graphics, one interconnect, and maybe an FPGA-ish part. Why not double things for the generation after that? Why not triple?

Obviously these won't be video cards for regular users. We're talking machines for running racing and flight sims, not CoD: Bleh Ops. These people are already building rooms around their hardware; they can handle a PC that requires their electrician to chime in. He's already there for the hydraulics.
I already have over 4.8 kW ready at 240 V and a 10x12 computer room cooled by a ton-and-a-half A/C, and I can easily double that capacity if needed for the next generation of GPUs :D. I was looking ahead to future gaming a long time ago.
 
Not the first time AMD went for brute force to compete, e.g. the 290X/Hawaii.

I highly doubt, however, that it will go anywhere near 500 W -- that's outlandish.
Yep, each company ebbs and flows, going from high-performing, efficient cards to brute-forcing performance via raw power and then back again.
 
Not the first time AMD went for brute force to compete, e.g. the 290X/Hawaii.

I highly doubt, however, that it will go anywhere near 500 W -- that's outlandish.

I think you mean the Fury series? Perf per watt of the 290X was much closer to the 780/Titan than the Fury X was to the Titan X or 980 Ti.
 
I think you mean the Fury series? Perf per watt of the 290X was much closer to the 780/Titan than the Fury X was to the Titan X or 980 Ti.
I meant the 290X, whose contemporary competitor was the GTX 780 or GTX 780 Ti. While the 290X drew around 300 W at load, the 780 and 780 Ti pulled around 230 W and 260 W respectively. The 290X sat between the 780 and 780 Ti, so about 50 W more power usage than nV's competition. Moreover, the story got worse when the 290X/390X (Hawaii) remained AMD's top card (not counting the 295X2) during a good part of Maxwell's (e.g. 980) lifetime.

The Fury X drew a comparatively low amount of power, at least at stock, much less than its supposed competitor, the 980 Ti (although its performance was worse).
 
I think you mean Fury series? Perf per watt of the 290X was much closer to the 780/Titan than the Fury X was to the Titan X or 980 Ti.
No, they definitely brute-forced it with the 290X. 95 C die temps under load were considered "normal and by design." They did beat the OG Titan with it, though, hence the release of the 780 Ti shortly thereafter.
 