Which RTX 4090 card are you planning or considering getting?

You guys think my ASUS TUF is a dud?

I can only get it to run at 2880 MHz core and 12,000 MHz memory.

Also, my scores seem to be lower than most people's. I'm doing 2709X in Port Royal and about 29K in Time Spy. Superposition is 33K or something.

What gives?
Drop a 5800X3D in your board and I bet that helps. Better yet, grab a 7XXX/13900K with DDR5 and a new motherboard. You'll see an increase in score.
 
Installed it and didn't have much time to play yet, but I ran Fire Strike Ultra and Time Spy. Boy, the card is quiet and runs cool, lmao. Blown away by how good the temps are. I do have a Lian Li base with intake fans under it. Quietest system right now. I have my AIO 360 fans set to spin up only when my 7700X hits 80°C+, and in games it never does; it stays in the 50-60s mostly.


So does this card not do 165 Hz via the HDMI port? My 6600 XT allowed me that option. I have the Odyssey Ark 55.
 

Bummer though. Not getting 4K 165 Hz over HDMI with this card on my Ark 55. I did with the 6600 XT, which is kinda weird as both have HDMI 2.1. I love my super-smooth browsing lmao. 120 Hz is cool, but bummed about not getting 165 Hz.

If RDNA 3 does it, I might go that route. Really bummed out on this one lmao.
 
Bummer though. Not getting 4K 165 Hz over HDMI with this card on my Ark 55. I did with the 6600 XT, which is kinda weird as both have HDMI 2.1. I love my super-smooth browsing lmao. 120 Hz is cool, but bummed about not getting 165 Hz.

If RDNA 3 does it, I might go that route. Really bummed out on this one lmao.
See if you can add it in CRU. 4K 165 Hz would require reduced color depth/chroma subsampling or DSC though, so you might also need to make sure those are enabled. I don't have anything that uses DSC, so I don't know if there's some special gotcha to enabling it.
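To see why, a quick back-of-the-envelope bandwidth check helps (a sketch only: the ~10% blanking overhead and the ~42.7 Gbit/s effective FRL payload are approximations, not spec-exact figures):

```python
# Rough uncompressed data-rate estimate for 4K 165 Hz vs. HDMI 2.1 capacity.
# Assumes RGB / 4:4:4 with ~10% blanking overhead; HDMI 2.1 FRL carries
# 48 Gbit/s raw, leaving roughly 42.7 Gbit/s of payload after 16b/18b coding.

def data_rate_gbps(width, height, refresh_hz, bits_per_channel,
                   blanking_overhead=1.10):
    bits_per_pixel = bits_per_channel * 3  # one channel each for R, G, B
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

HDMI_21_EFFECTIVE = 48 * 16 / 18  # ~42.7 Gbit/s usable

eight_bit = data_rate_gbps(3840, 2160, 165, 8)   # ~36 Gbit/s: fits uncompressed
ten_bit   = data_rate_gbps(3840, 2160, 165, 10)  # ~45 Gbit/s: over the limit
```

So 8-bit RGB squeaks through uncompressed, while 10-bit (what you'd want for HDR) is over the line, which is exactly where DSC or chroma subsampling comes in.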
 
You guys think my ASUS TUF is a dud?

I can only get it to run at 2880 MHz core and 12,000 MHz memory.

Also, my scores seem to be lower than most people's. I'm doing 2709X in Port Royal and about 29K in Time Spy. Superposition is 33K or something.

What gives?

That's definitely a dud. I've been game-stable so far at an even 3 GHz. The highest I've been able to complete benchmarks at is 3090 MHz. Seems like most of them over on OCN are at least close to that, some a little lower at 3045 or 3060 MHz for benching, and a small number can go over 3100.
 
That's definitely a dud. I've been game-stable so far at an even 3 GHz. The highest I've been able to complete benchmarks at is 3090 MHz. Seems like most of them over on OCN are at least close to that, some a little lower at 3045 or 3060 MHz for benching, and a small number can go over 3100.
On my FE I've been able to get around 2950 MHz without any extra power. I haven't tried to push it any higher or give it more juice. Did about +250 on the core.
 
See if you can add it in CRU. 4K 165 Hz would require reduced color depth/chroma subsampling or DSC though, so you might also need to make sure those are enabled. I don't have anything that uses DSC, so I don't know if there's some special gotcha to enabling it.

It's weird. While searching the Samsung forums I found another user with a similar issue on a different card, and someone replied that it lets them set 165 Hz. Talk about a high-end monitor that won't run at full capacity with the latest and fastest card that could actually take advantage of the higher refresh rate, lmao.
 
Had to return the Suprim X Liquid. The HDMI port was broken and the card couldn't hold a Time Spy Extreme run without crashing. Got a Strix 4090 instead (mated to a 12900K and DDR5 Z5 kits running stable at 6400 MHz XMP 1).

I am somewhat new to overclocking GPUs. I am using ASUS GPU Tweak III. With Tweak III's built-in OC Scanner function, you target 120% power target (and 100% voltage target) and let it establish a stable OC profile.

In my case it seems to be 2910 MHz? I think? Because when I run the Speed Way benchmark I get a steady 2910 MHz on the OSD (while running the established OC Scanner profile). So does that mean 2910 is the max stable clock it can run?

This 2910 seems kind of low, no? Or is it because I am relying on OC Scanner?

Take a look at my Speed Way test details (the run passed with 99.2% stress-test frame stability).
 

Which way is the BIOS switch toggled? If it's in performance mode, you should be able to easily hit 3K.
Had to return the Suprim X Liquid. The HDMI port was broken and the card couldn't hold a Time Spy Extreme run without crashing. Got a Strix 4090 instead (mated to a 12900K and DDR5 Z5 kits running stable at 6400 MHz XMP 1).

I am somewhat new to overclocking GPUs. I am using ASUS GPU Tweak III. With Tweak III's built-in OC Scanner function, you target 120% power target (and 100% voltage target) and let it establish a stable OC profile.

In my case it seems to be 2910 MHz? I think? Because when I run the Speed Way benchmark I get a steady 2910 MHz on the OSD (while running the established OC Scanner profile). So does that mean 2910 is the max stable clock it can run?

This 2910 seems kind of low, no? Or is it because I am relying on OC Scanner?

Take a look at my Speed Way test details (the run passed with 99.2% stress-test frame stability).
 
Stock, with the P switch on and 120% power (no OC), my Strix card sits at around 2600 MHz. So you still need to dial it in to go past 3K.

I would start with +150 and test from there. It should go past 3000. As you get past 3000, use smaller increments.
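The start-coarse-then-refine routine above could be sketched as a simple loop (purely illustrative: `apply_core_offset` and `run_benchmark` are hypothetical stand-ins for the GPU Tweak III slider and a Time Spy/Speed Way stability run):

```python
def find_stable_offset(apply_core_offset, run_benchmark,
                       start=150, coarse=50, fine=15, limit=400):
    """Step the core offset up in coarse jumps, then in smaller ones."""
    offset, step, best = start, coarse, 0
    while offset <= limit:
        apply_core_offset(offset)
        if run_benchmark():      # True = run completed, no crash or artifacts
            best = offset
            if best >= 250:      # past the easy gains, use smaller increments
                step = fine
            offset += step
        else:
            break                # first failure: stop climbing
    apply_core_offset(best)      # settle on the last known-good offset
    return best
```

In practice you would also want to confirm the final offset with longer benchmark loops and real games, since a single clean pass is weak evidence of stability.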
 
Looks great! Did you get the cable directly from Corsair? I've yet to see it in stock on their site.
Thank you, sir! Yeah - got lucky F5ing on launch day. It's stiff AF - solves one problem and semi-creates another, lol. Good luck!
 
Stock, with the P switch on and 120% power (no OC), my Strix card sits at around 2600 MHz. So you still need to dial it in to go past 3K.

I would start with +250 and test from there. It should go past 3000. As you get past 3000, use smaller increments.
Hi, thanks for the reply! So here is my current situation, as per the attached Speed Way 20-loop test (passed with 98.9% frame-rate stability).

Passes Port Royal with 27,466… I was hoping to cross 28K hah, but could my SP85 AI-overclocked 12900K be holding it back?

Nonetheless, if you look at the attached picture, it says the average frequency is 3046 MHz and the GPU clock is 3075 MHz (this is with +220 applied; at +225 Port Royal doesn't start). 1725 MHz on the VRAM is stable as well (a smidge more and Port Royal doesn't start; at 1725 it passes the 20-loop Speed Way). Funny thing is, if I increase either the memory or GPU clock even a smidge more, Port Royal does not even start, but at these limits it passed any and every stress test. Weird, because I'd imagine a band of increasing instability.

Secondly, I am incredibly confused by the GPU Tweak terminology, but that's probably just me not understanding how it works. With no OC it says "Boost Clock" at 2640 on mine… seems decent enough. But then the offset is applied on top of the Boost Clock? Like, with this stable +220 it says 2860 MHz, yet the test shows I am at 3060… why?
 

Hi, thanks for the reply! So here is my current situation, as per the attached Speed Way 20-loop test (passed with 98.9% frame-rate stability).

Passes Port Royal with 27,466… I was hoping to cross 28K hah, but could my SP85 AI-overclocked 12900K be holding it back?

Nonetheless, if you look at the attached picture, it says the average frequency is 3046 MHz and the GPU clock is 3075 MHz (this is with +220 applied; at +225 Port Royal doesn't start). 1725 MHz on the VRAM is stable as well (a smidge more and Port Royal doesn't start; at 1725 it passes the 20-loop Speed Way). Funny thing is, if I increase either the memory or GPU clock even a smidge more, Port Royal does not even start, but at these limits it passed any and every stress test. Weird, because I'd imagine a band of increasing instability.

Secondly, I am incredibly confused by the GPU Tweak terminology, but that's probably just me not understanding how it works. With no OC it says "Boost Clock" at 2640 on mine… seems decent enough. But then the offset is applied on top of the Boost Clock? Like, with this stable +220 it says 2860 MHz, yet the test shows I am at 3060… why?
I'm no expert... but happy to help based on my experience with the card. Granted, I play online FPS so I do not OC very hard.

First off, within GPU Tweak III: max "Power Target" (120%) and "GPU Voltage" (100%), and turn off "0dB Fan" (on the right). That's the "safe" mode I play Fortnite with. Gets me about 2900 MHz and it's rock solid.

From there, you want to try adjusting the "Clock" sliders. I'd isolate "GPU Boost Clock" first; once you get that dialed in, then work on "Memory Clock".

Modern NVIDIA GPUs have "Boost Clock", which is basically the card grabbing any headroom it sees. So when you max power and voltage at the start, that's why it auto-OCs to a decent, safe spot (you can ignore ASUS's "OC Scanner", IMO).

So when you add "GPU Boost Clock" you're overriding what it will hit by default with the headroom it sees as "safe" (this all started after the GTX 5xx Fermi 2.0 series had people blowing up their cards; NVIDIA started adding training wheels). So when I say go "+150" (I said "+250", but that's maybe too aggressive), start from there: try that, play games, run benches, and when you're comfortable with the stability, try inching it forward. That's all OCing is. It isn't very sexy. It takes time, patience, and validation. For me, as an online FPS player, it is much worse to crash than if I were playing a single-player game, so I am pretty chill with that. But when I'm benching I will let it rip.

EDIT: based on the results you're talking about, sounds like you got a great card! Just keep hammering for stability. If it's auto-hitting over 3000, that's awesome.
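For what it's worth, the "2860 vs. 3060" puzzle in the question comes down to GPU Boost stacking on top of the listed number. Here is a sketch using the figures from these posts (NVIDIA doesn't publicly document the exact boost behavior, so treat the breakdown as illustrative):

```python
# Numbers taken from the posts above; everything here is illustrative.
rated_boost = 2640   # MHz: "Boost Clock" GPU Tweak III shows with no OC
offset      = 220    # MHz: the manual offset that proved stable
listed      = rated_boost + offset    # 2860 MHz, what GPU Tweak III displays

# GPU Boost then climbs past the listed clock on its own whenever power,
# voltage, and thermal headroom allow, which is why the OSD reads higher.
observed_average = 3046               # MHz, from the Speed Way run
auto_headroom = observed_average - listed  # ~186 MHz added automatically
```

So the slider moves the floor the boost algorithm starts from; the number you see under load is that floor plus whatever headroom the card finds on its own.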
 
I'm no expert... but happy to help based on my experience with the card. Granted, I play online FPS so I do not OC very hard.

First off, within GPU Tweak III: max "Power Target" (120%) and "GPU Voltage" (100%), and turn off "0dB Fan" (on the right). That's the "safe" mode I play Fortnite with. Gets me about 2900 MHz and it's rock solid.

From there, you want to try adjusting the "Clock" sliders. I'd isolate "GPU Boost Clock" first; once you get that dialed in, then work on "Memory Clock".

Modern NVIDIA GPUs have "Boost Clock", which is basically the card grabbing any headroom it sees. So when you max power and voltage at the start, that's why it auto-OCs to a decent, safe spot (you can ignore ASUS's "OC Scanner", IMO).

So when you add "GPU Boost Clock" you're overriding what it will hit by default with the headroom it sees as "safe" (this all started after the GTX 5xx Fermi 2.0 series had people blowing up their cards; NVIDIA started adding training wheels). So when I say go "+150" (I said "+250", but that's maybe too aggressive), start from there: try that, play games, run benches, and when you're comfortable with the stability, try inching it forward. That's all OCing is. It isn't very sexy. It takes time, patience, and validation. For me, as an online FPS player, it is much worse to crash than if I were playing a single-player game, so I am pretty chill with that. But when I'm benching I will let it rip.

EDIT: based on the results you're talking about, sounds like you got a great card! Just keep hammering for stability. If it's auto-hitting over 3000, that's awesome.
Oh, that's a great response! Thank you!

Yes, the result I quoted above was through manual OC'ing (patiently upticking the boost clock first, testing it, and then doing the same with the memory clock); I dumped OC Scanner. If I leave it to auto-OC, it goes up to 2915 MHz max.
 
Oh, that's a great response! Thank you!

Yes, the result I quoted above was through manual OC'ing (patiently upticking the boost clock first, testing it, and then doing the same with the memory clock); I dumped OC Scanner. If I leave it to auto-OC, it goes up to 2915 MHz max.
Awesome.

And - my apologies. I responded a bit hastily without fully understanding everything that you outlined (was in between Fortnite with my son, bedtime stuff with my other sons - Dad life, lol). You did an awesome job with that OC, dude! Now just validate the stability. :)
 
Awesome.

And - my apologies. I responded a bit hastily without fully understanding everything that you outlined (was in between Fortnite with my son, bedtime stuff with my other sons - Dad life, lol). You did an awesome job with that OC, dude! Now just validate the stability. :)
Oh, no need to apologize, friend! The "dad life" bit made me laugh. Same here. We have two little ones and, frankly, I don't have time to play long, drawn-out video games… so I have taken to overclocking; low-commitment thrills 😂
 
I have a Gigabyte 3080 OC 10GB LHR card. I play at 3440x1440. Would upgrading to a 4080 or 4090 be worth it? My monitor is 144 Hz.
 
I have a Gigabyte 3080 OC 10GB LHR card. I play at 3440x1440. Would upgrading to a 4080 or 4090 be worth it? My monitor is 144 Hz.
Up to you. I personally game at that resolution but at 175 Hz, and I'd like more power than what my 7950X / 4090 combo delivers. Lol
 
From what I've seen, yes. On the 3090s Gigabyte cards had a boatload of outputs, but it seems like they switched to the standard 3x DP + 1x HDMI this time.
Thanks! I think I'll hold out for an Asus card, as the dual HDMI would be a great benefit since all of my displays are 4K TVs. While I've had success using HDMI-to-DP cables for 4K @ 60 Hz at 10 ft or less, my newest display (Samsung 43" Q90nb) is running 4K @ 144 Hz over a 40 ft cable. If I want to repeat that, I think it will be difficult to find a conversion cable that can do 4K @ 144 Hz.
 
The age demographic of the forum members is shifting older, and they don't want to take risks with expensive products anymore?
Maybe, but I believe it's more the "custom water cooling is too expensive" crowd that shouts down WC, IMO. Although if you're in the market for a 4090, I'm not sure you'd be in that too-expensive-to-justify crowd.
 
Well, looking at how you're not supposed to get any bend, I guess I'm screwed lmao. Good thing I won't be gaming till the holidays. RDNA 3 it might be. It's crazy how shitty this adapter situation is. After I saw the chart showing how you're not supposed to bend it, it's so hard to avoid with how the connector sits to the side lmao.

 

Well, looking at how you're not supposed to get any bend, I guess I'm screwed lmao. Good thing I won't be gaming till the holidays. RDNA 3 it might be. It's crazy how shitty this adapter situation is. After I saw the chart showing how you're not supposed to bend it, it's so hard to avoid with how the connector sits to the side lmao.


You'll be fine, dude. If you're worried about it, grab more slack and let it bend later in the cable. Otherwise, alternative cables are coming.
 
You'll be fine, dude. If you're worried about it, grab more slack and let it bend later in the cable. Otherwise, alternative cables are coming.
The side panel is the issue; I can't have it on and avoid the bend. This is the Corsair cable direct to the PSU, not the included adapter. Maybe the 90-degree adapter will fix it, if it fits and has enough clearance.
 
Thanks! I think I'll hold out for an Asus card as the dual HDMI would be a great benefit as all of my displays are 4k TVs and while I've had success using HDMI to DP cables for 10ft or less 4k @ 60Hz my newest display (Samsung 43" Q90nb) is running 4k @ 144Hz over a 40ft cable. If I want to repeat that I think it will be difficult to find a conversion cable that can do 4k @ 144Hz.
Yup, this was another major reason I defaulted to the Asus Strix. I have a Samsung Q950A hooked up to my computer, as well as two OLEDs… and a G9 Neo. HDMI is the only connection which allows 12bpc color on the G9 Neo (which has a huge impact on how colors look with HDR). Secondly, the Q950A is useless if I feed video into it (can only get 4K 60 Hz), so it takes up an HDMI spot since it acts as an independent sound system.

All in all, I don't know why HDMI is becoming so unpopular. My use case is all HDMI; I can't get above 10bpc on the Neo using DP.
 
The side panel is the issue; I can't have it on and avoid the bend. This is the Corsair cable direct to the PSU, not the included adapter. Maybe the 90-degree adapter will fix it, if it fits and has enough clearance.
I have the same cable. Extra slack can help, or don't use the side panel. Also, feel free to live with it: one report out of thousands of builds (many people out there are absolute morons compared to us). I feel pretty good about it.
 
Yup, this was another major reason I defaulted to the Asus Strix. I have a Samsung Q950A hooked up to my computer, as well as two OLEDs… and a G9 Neo. HDMI is the only connection which allows 12bpc color on the G9 Neo (which has a huge impact on how colors look with HDR). Secondly, the Q950A is useless if I feed video into it (can only get 4K 60 Hz), so it takes up an HDMI spot since it acts as an independent sound system.

All in all, I don't know why HDMI is becoming so unpopular. My use case is all HDMI; I can't get above 10bpc on the Neo using DP.
I hear you. I set up a second gaming station connected to my main system that just happens to be on the opposite side of my office. 4K @ 60 Hz was no issue for longer cable runs, but I went through multiple cables trying to find one that could do 4K @ 144 Hz over a 40 ft length. Ended up with this AAXY 8K 40' HDMI.

Having done 4K TVs since 2015, I am surprised that with the smaller sets (below 50") being so popular for PC use, the manufacturers by and large haven't really pushed to add DP connections.

Oh well - adapt and enjoy!
 