RTX 3xxx performance speculation

Seems less prone to failure to me. If any unintentional stress gets put on the power connector, it's getting put on a plastic adapter instead of a connector soldered to the PCB.

As someone who's fixed a lot of laptop power connectors, the ones hard-soldered to the motherboard are a lot more prone to breaking, and a lot more of a pain in the ass to replace, than the ones on a cable. I understand we're talking different use cases: one is plugged and unplugged a lot and exposed to a lot of abuse, the other is generally plugged in and left alone... but the logic carries nonetheless that Gigabyte's solution is probably less prone to failure, and a lot easier to repair if that connector does somehow fail.

I guess that's true, though I have not heard of many PCI-E power connectors failing on GPUs.
 
That power adapter thing is...bizarre. Why didn't they just put the power connectors on the top of the actual PCB like everyone else? Seems like just more potential points of failure. Unlikely to fail, I would imagine, but still.
My guess is that due to how tall these cards are, they were worried about leaving room for the plug/wires in an "average" size case. Sure, a 200mm card can easily have a connector on top... 250mm+? I dunno the measurements of the real cards, but they seem tall, and it most likely had some bearing on this decision.
 
Why does it matter though? You already made your intentions clear: you want a card, and you want it day one. You're going to get FE reviews before the cards go on sale, giving you the opportunity to see real-world performance data from trusted outlets. This allows you to make an informed purchasing decision on whether a 3080 is right for you. Beyond that, I said this before, and I'll say it again... anyone who plans on being choosy and chasing a specific AIB on Thursday is in for a bad time. All the reviews in the world won't change the fact that you're going to have to settle for the first card that you can get into a shopping cart. If you're after a very specific card, you're either going to have to be very lucky or very patient. So what does an AIB review change if, even after reading every AIB review under the sun, you probably cannot get the card you've decided is right for you? How would your Thursday morning play out any differently if you had read a few reviews first?

as long as I can get a new card before Cyberpunk 2077 I'm fine with being a bit patient...I'm a fan of Gigabyte cards so I'm leaning towards waiting but I'm interested in reading reviews on the cooling of the Founders cards...plus I'm building a new Zen 3 rig so I'll be waiting a few weeks regardless
 
I'm using a Fractal Design Meshify C and it ended up being a little smaller than I thought it would be. That being said, it still has a decent amount of space and my MSI Ventus 2080 Ti fits comfortably in it, and a 3080 would fit fine. The 3090 should also fit, but only with a few millimeters to spare. Makes me miss my HAF X, lol!

I have the same case and I was eyeballing the fit. I don't think it will fit a 3090 and an AIO that's front mounted.
 
I have the same case and I was eyeballing the fit. I don't think it will fit a 3090 and an AIO that's front mounted.
Yeah, no way it fits with a front mounted radiator. I mounted mine at the top specifically to reserve more room for larger GPUs.
 
Yeah, no way it fits with a front mounted radiator. I mounted mine at the top specifically to reserve more room for larger GPUs.

I thought about this, but that pretty much limits you to a 240 rad. Yeah, it has mounting holes up top for a 280, but there are gonna be MB/RAM clearance problems. I want to mount a front 280 or perhaps 360, and I think there will be room for it with the 3080. Will there be enough room for a fat AIO like the Arctic LF2? That is yet to be determined.
 
I thought about this, but that pretty much limits you to a 240 rad. Yeah, it has mounting holes up top for a 280, but there are gonna be MB/RAM clearance problems. I want to mount a front 280 or perhaps 360, and I think there will be room for it with the 3080. Will there be enough room for a fat AIO like the Arctic LF2? That is yet to be determined.
You are correct in that I am limited to a 240 rad. I looked into getting a 280 rad but I found others online who tried mounting one at the top of their Meshify C case but couldn't make it work so I went 240 to be safe. The 3080 could possibly fit with a front mounted rad but it would be close if it fits at all.
 
so the Founders Edition cards really are the best binned?...I thought that was just a rumor in regards to previous generations like Turing?...performance is also better than AIB cards?...if so then I might try and snag a Founders 3080
 
so the Founders Edition cards really are the best binned?...I thought that was just a rumor in regards to previous generations like Turing?...performance is also better than AIB cards?...if so then I might try and snag a Founders 3080

Looks like that is the case. Or if you want a better-binned Ampere, you will have to pay more from the AIBs.

 
Looks like that is the case. Or if you want a better-binned Ampere, you will have to pay more from the AIBs.

but which AIB cards are the best binned?...only the very top end model from Asus, Gigabyte, MSI etc?...how would one know before purchasing?...
 
Correct, the more expensive cards will be better binned.

yes but will only the top end, most expensive model be the best binned or will it be the top 2-3 models from Gigabyte's 3080 lineup for example?...I usually don't buy the very top end card but maybe a step or 2 down from that- for example I've bought Gigabyte's G1 Gaming cards for the last few generations...EVGA has like 6-8 different models for each card
 
yes but will only the top end, most expensive model be the best binned or will it be the top 2-3 models from Gigabyte's 3080 lineup for example?...I usually don't buy the very top end card but maybe a step or 2 down from that- for example I've bought Gigabyte's G1 Gaming cards for the last few generations...EVGA has like 6-8 different models for each card
Since (reportedly) only 10% of the dies will be the top binning, I find it doubtful that we'll see any but the most expensive of any AiB's lineup with them. And that's if Nvidia isn't keeping all of that 10% for their own FE line - which would suck since their shrouds look REALLY, REALLY, REALLY hard to take apart without damaging it (for watercooling).
 
Since (reportedly) only 10% of the dies will be the top binning, I find it doubtful that we'll see any but the most expensive of any AiB's lineup with them.

if the 3080 Founders' unique cooling system is a winner I might try and get one...at least this way I'm guaranteed to get one of the best-binned cards...the aftermarket cooling is the main reason why I always went with AIB cards
 
Since (reportedly) only 10% of the dies will be the top binning, I find it doubtful that we'll see any but the most expensive of any AiB's lineup with them. And that's if Nvidia isn't keeping all of that 10% for their own FE line - which would suck since their shrouds look REALLY, REALLY, REALLY hard to take apart without damaging it (for watercooling).
I saw it talked about that they were not going to warranty the FE models if taken apart. Not sure if that is true or not.
 
I saw it talked about that they were not going to warranty the FE models if taken apart. Not sure if that is true or not.
I saw this also, but we don't know if this rep has all the information. It doesn't bode well unless Nvidia denies it outright.
 

Hasn't that always been their official policy, even though they still (at least sometimes) honored the warranty if you put it back together?
 
Hasn't that always been their official policy, even though they still (at least sometimes) honored the warranty if you put it back together?
Someone on here recently sent in their waterblocked Titan RTX and they replaced it. Just make sure you put all the stock parts back and they'll take it.
 
but which AIB cards are the best binned?...only the very top end model from Asus, Gigabyte, MSI etc?...how would one know before purchasing?...

There are no guarantees.

You make educated guesses based on rumors and hope you luck out and win the silicon lottery.

I wish they were more forthcoming with this stuff, but sadly not.

I don't think you can even read ASIC quality anymore.

As always, the manufacturers win when they keep their customers in the dark as much as possible :/
 
There are no guarantees.

You make educated guesses based on rumors and hope you luck out and win the silicon lottery.

I wish they were more forthcoming with this stuff, but sadly not.

I don't think you can even read ASIC quality anymore.

As always, the manufacturers win when they keep their customers in the dark as much as possible :/

I doubt it matters to most people. I expect rates of GPU OC'ing are rather low.
 
I doubt it matters to most people. I expect rates of GPU OC'ing are rather low.

I guess. But that makes me sad.

That people don't want to be bothered with making the most out of their purchases.

That they are either ignorant or lazy enough that they don't want to gain free performance.

In the information age, ignorance is a choice.
 
No reviews in Europe.

Review NDAs always end at the same time globally; they're never staggered. In times past, some reviewers have claimed ignorance and lost their access privileges. This time it will be by the book, likely 9am EDT.
 
Review NDAs always end at the same time globally; they're never staggered. In times past, some reviewers have claimed ignorance and lost their access privileges. This time it will be by the book, likely 9am EDT.
I was just messing, there were no reviews at that particular time in Europe :)
He must not have seen that reviews were pushed back.
I guessed where he was because his post was a few minutes earlier than mine, just after 11pm on the 13th (I'm in the UK), but for him it was already the 14th; he is +1hr.
 
There are no guarantees.

You make educated guesses based on rumors and hope you luck out and win the silicon lottery.

I wish they were more forthcoming with this stuff, but sadly not.

I don't think you can even read ASIC quality anymore.

As always, the manufacturers win when they keep their customers in the dark as much as possible :/

with CPUs you can see the stepping/revision...I wish GPUs had something similar
 
Since (reportedly) only 10% of the dies will be the top binning, I find it doubtful that we'll see any but the most expensive of any AiB's lineup with them. And that's if Nvidia isn't keeping all of that 10% for their own FE line - which would suck since their shrouds look REALLY, REALLY, REALLY hard to take apart without damaging it (for watercooling).
Yes, you will get pricing and the best binning for the reviews, then you will buy something slower for more money from the AIBs when they run out quickly. It's a win for Nvidia, as first-day benchmarks and MSRP are what everyone is going to go by and base recommendations on. It's really a genius move; if only it wasn't completely manipulative ;). I'm sure a select few will get in on the FE cards so Nvidia can claim they are available, but most will just pick up the first thing in stock.
 
I guess. But that makes me sad.

That people don't want to be bothered with making the most out of their purchases.

That they are either ignorant or lazy enough that they don't want to gain free performance.

In the information age, ignorance is a choice.
Most people don't understand enough to do it safely, or don't want to bother monitoring and configuring temperatures. Not everyone wants to build custom loops or chance overheating these things. Some people just like to buy, install, and enjoy. Obviously a lot of us on this forum and some others are exceptions, but by and large I would be surprised if even 50% of the people posting in these forums overclock their GPU.
 
https://videocardz.com/newz/alleged-geforce-rtx-3080-graphics-card-test-leaks-online

So this, coupled with the other few performance leaks, looks like the 3080 will be on avg. ~30% above the 2080 Ti in games. Seems to be in line with the historically accurate formula: take Nvidia's marketed improvement, then subtract 30%.

Amazing that people will still look at a 30% generational increase as Moses parting the sea when, prior to Turing, this was par for the course (and even a bit anaemic of an increase).
 
Most people don't understand enough to do it safely, or don't want to bother monitoring and configuring temperatures. Not everyone wants to build custom loops or chance overheating these things. Some people just like to buy, install, and enjoy. Obviously a lot of us on this forum and some others are exceptions, but by and large I would be surprised if even 50% of the people posting in these forums overclock their GPU.

Well, that, and the fact that the results aren't worth it at all. The journey of overclocking is fun, but the destination isn't very exciting, especially now with boost mechanisms that do most of the overclocking for you anyway.
 
Well, that, and the fact that the results aren't worth it at all. The journey of overclocking is fun, but the destination isn't very exciting, especially now with boost mechanisms that do most of the overclocking for you anyway.
Yeah, it depends on the person and the goals, but it's not as worth it as it used to be, that's for sure. No more doubling your performance, haha. Now it's like, YES!!!! I got 3 more % out of it and only had to double my power draw. That said, some enjoy the journey, but it's getting less and less appealing. Nowadays you just about need water cooling to maintain "factory" clocks, as they will throttle with any form of stock cooling ("they" as in most anything at this point: Intel, AMD, and NVIDIA).
 
I guess. But that makes me sad.

That people don't want to be bothered with making the most out of their purchases.

That they are either ignorant or lazy enough that they don't want to gain free performance.

In the information age, ignorance is a choice.

GPUs these days tend to already be at diminishing returns for clock speed. You may end up increasing heat (and noise) by 30% to squeak out a 5% performance gain that will be undetectable while playing games.

So mostly OC'ing a GPU makes things you can notice (heat and/or noise) worse, for a performance increase you can't actually notice. It's a poor tradeoff.
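A quick back-of-the-envelope sketch of that tradeoff, using the illustrative 30%/5% figures from the post above (not measured data):

```python
# Fractional change in performance-per-watt after an overclock.
# The 5% perf gain and 30% power increase are the illustrative
# numbers from the post above, not measurements of any real card.

def perf_per_watt_change(perf_gain, power_increase):
    """Return the fractional change in perf-per-watt after an OC."""
    return (1 + perf_gain) / (1 + power_increase) - 1

change = perf_per_watt_change(0.05, 0.30)
print(f"perf/watt change: {change:.1%}")  # about -19%
```

In other words, even though absolute performance goes up slightly, efficiency drops by roughly a fifth, which is the "poor tradeoff" being described.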
 
https://videocardz.com/newz/alleged-geforce-rtx-3080-graphics-card-test-leaks-online

So this, coupled with the other few performance leaks, looks like the 3080 will be on avg. ~30% above the 2080 Ti in games. Seems to be in line with the historically accurate formula: take Nvidia's marketed improvement, then subtract 30%.

Amazing that people will still look at a 30% generational increase as Moses parting the sea when, prior to Turing, this was par for the course (and even a bit anaemic of an increase).

I think the ~30% improvement plus the $$$ price drop versus the overpriced 2080 Tis makes the 3080 look like a decent release. However, I still think all these GPUs are freaking overpriced. The top dog 3090 should be 699... 3080 499... etc...
 
GPUs these days tend to already be at diminishing returns for clock speed. You may end up increasing heat (and noise) by 30% to squeak out a 5% performance gain that will be undetectable while playing games.

So mostly OC'ing a GPU makes things you can notice (heat and/or noise) worse, for a performance increase you can't actually notice. It's a poor tradeoff.

so is it best to buy FE and plain vanilla AIB cards vs. the OC'd $$$ AIB offerings? Myself, to be honest, I'm not planning on overclocking, so maybe something cheap and quiet would be the way to go... I might even undervolt it if possible, just like I did with my 1070.
 
https://videocardz.com/newz/alleged-geforce-rtx-3080-graphics-card-test-leaks-online

So this, coupled with the other few performance leaks, looks like the 3080 will be on avg. ~30% above the 2080 Ti in games. Seems to be in line with the historically accurate formula: take Nvidia's marketed improvement, then subtract 30%.

Amazing that people will still look at a 30% generational increase as Moses parting the sea when, prior to Turing, this was par for the course (and even a bit anaemic of an increase).


That's because you are comparing across tiers: old x80 Ti vs. new x80.

Compare it to the old x80 to new x80 and the performance gain is in the 70-80% range, and that is what people are excited about.

These kinds of gains were NOT normal before Turing. They happen maybe once in a decade: world-beating gains, a GPU series that becomes legend.

Pascal had gains like this, but no other series within a decade of Pascal did. The 8800 GTX came close but still didn't match Pascal's gains, and the 8800 GTX seems to be the previous high mark people remember, and it was 9-10 years before Pascal. So nearly a decade between killer releases, and even the 8800 GTX doesn't measure up to Ampere's gains.

Ampere looks to deliver gains similar to Pascal, so on the order of the best people have ever seen. Definitely worth the excitement you are seeing.
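The tier math here works because relative gains compound multiplicatively. A rough sketch, using the thread's leaked ~30% figure for 3080-over-2080-Ti and an assumed similar ~30% gap for 2080-Ti-over-2080:

```python
# Successive relative performance gains multiply rather than add.

def compound(*gains):
    """Combine a chain of fractional performance gains into one."""
    total = 1.0
    for g in gains:
        total *= 1 + g
    return total - 1

# ~30% (3080 vs 2080 Ti, per the leaks) on top of an
# assumed ~30% (2080 Ti vs plain 2080):
print(f"3080 vs 2080: +{compound(0.30, 0.30):.0%}")  # about +69%
```

That lands right around the 70-80% cross-generation figure people are excited about, without either individual step looking extraordinary on its own.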
 