Dual 3090 Ti SLI/NVLink early benching

Venturi (Limp Gawd) · Joined Nov 23, 2004 · Messages: 264
Well, here it is in its first few passes, before any optimizations, at 4K:
dual 3090 Ti FE, NVLink.



So here are some early benchmarks: temps, power draw, etc., without optimizations. See the system spec below for the PC configuration, and yes... it's an SFF case.

[attached screenshots: benchmark results, temps, power draw]


System spec:
2x RTX 3090 Ti Founders Edition with SLI/NVLink bridge
2x Xeon Platinum 8280L (56 cores/112 threads), ASUS C621 Sage dual-socket board, BIOS 6605
1.5 TB DDR4 ECC LRDIMM RAM; 1600 W silent digital power supply
Data drive: 4x Micron 9300 Max in VROC RAID 0 (51.2 TB volume); OS drive: Sabrent Rocket 4 Plus (8 TB)
ASUS PA32UCG-K monitor, TT SFF case, Windows Server 2022 Datacenter (customized) & Ubuntu (customized)

This is a 3-minute test; measurements were taken about two-thirds of the way through.

[attached screenshots: test run, info panel, NVLink view]
 
Game benchmark


Using 2x 3090 Ti FE, NVLink:
RDR2 at 4K (3840x2160), all eye-candy settings maxed (except no motion blur)

[attached screenshots: RDR2 benchmark results and settings]
 
147 fps? So do you have DLSS on? You don't seem to have a screenshot of that setting. I'm having a hard time believing that, without DLSS, a dual 3090 Ti setup is nearly three times faster than a single 3080 Ti.

EDIT: Yeah, there is no way that could be without DLSS on. I get 54 fps on your settings without any DLSS on a 3080 Ti, and a single 3090 Ti would only be about 12% faster, which would be around 62 fps; so even with magically perfect scaling you would only get 125 fps or so, yet you are getting 147 fps.
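Restating that estimate as arithmetic, using the same numbers as above (nothing new):

```latex
54~\text{fps} \times 1.12 \approx 62~\text{fps (single 3090 Ti)}
\qquad
2 \times 62~\text{fps} = 124 \approx 125~\text{fps (perfect 2-way scaling)} \;<\; 147~\text{fps claimed}
```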
 
147 fps? So do you have DLSS on? You don't seem to have a screenshot of that setting. I'm having a hard time believing that, without DLSS, a dual 3090 Ti setup is nearly three times faster than a single 3080 Ti.

EDIT: Yeah, there is no way that could be without DLSS on. I get 54 fps on your settings without any DLSS on a 3080 Ti, and a single 3090 Ti would only be about 12% faster, which would be around 62 fps; so even with magically perfect scaling you would only get 125 fps or so, yet you are getting 147 fps.
Sir, it's a dual 3090 Ti setup, and Red Dead Redemption 2's Vulkan renderer is mGPU-enabled.

And absolutely NO DLSS; I don't even remember seeing that as a choice.
I loathe DLSS. Why use a 4K, 5K, or 8K monitor just to have it down-sampled to a lower resolution? If that is the case, then just play at the lower resolution.

Edit: just checked; on my version (1436) there isn't a DLSS selection to make.
 
So, benchmarking: a tough call on what can be a standard.
I went looking for something that is truly free in its max configuration; can do mGPU, ray tracing, and rasterization; lets you select object counts; runs on Vulkan, DX, and Metal; and can do compute and CUDA.

I also wanted it to report GPU load, temps, and power, and to be OS-agnostic.

All I could find is GravityMark.

https://gravitymark.tellusim.com

I ran some benches with both the 3090 cards and the 3090 Ti cards:

https://gravitymark.tellusim.com/leaderboard/
 
Sir, it's a dual 3090 Ti setup, and Red Dead Redemption 2's Vulkan renderer is mGPU-enabled.

And absolutely NO DLSS; I don't even remember seeing that as a choice.
I loathe DLSS. Why use a 4K, 5K, or 8K monitor just to have it down-sampled to a lower resolution? If that is the case, then just play at the lower resolution.

Edit: just checked; on my version (1436) there isn't a DLSS selection to make.
You clearly have no understanding of how DLSS actually works, based on what you said there. And then I guess my 3080 Ti is broken, because if you're getting 147 fps with two 3090 Ti cards then you would be getting at least 73 fps with one card, making it 36% faster than my slightly overclocked 3080 Ti. And as I mentioned, a 3090 Ti should only be about 12% faster.
 
Bench your card in GravityMark and see how it compares to other 3080 Ti setups. Maybe it is not optimized, or 147 fps is about right for scaling in an mGPU system.

You clearly have no understanding of how DLSS actually works, based on what you said there. And then I guess my 3080 Ti is broken, because if you're getting 147 fps with two 3090 Ti cards then you would be getting at least 73 fps with one card, making it 36% faster than my slightly overclocked 3080 Ti. And as I mentioned, a 3090 Ti should only be about 12% faster.

The bench is free and probably the fairest bench tool out there; it seems unbiased. Run it and let the software post the result.
Maybe it can help you with some optimizations.

I don't know why your RDR2 numbers are what they are.
 
I've tested my card in numerous other games and synthetic benchmarks, and it's performing exactly as it should, if not better. Can you run the RDR2 bench with just a single 3090 Ti?
 
I've tested my card in numerous other games and synthetic benchmarks, and it's performing exactly as it should, if not better. Can you run the RDR2 bench with just a single 3090 Ti?
I could; out of curiosity I'll try it this weekend.
 
I'll have to fire up RDR2 and test again... I feel like at 4K, max settings, my overclocked 3090 would average in the 70s. It's been a long time though.
 
TechPowerUp only shows a 9% difference in RDR2 at 4K between a custom 3090 Ti and an FE 3080 Ti. I am just on my phone now, but I believe it showed 65 fps for the FE 3080 Ti; they are using custom settings though, not fully maxed-out settings.
 
Well, with NO overclocking:

Some of the optimizations I use:
Windows Server 2022 with security mitigations disabled (registry FeatureSettingsOverride/mask set to 3; a sketch follows below)
no pagefile; antivirus removed or turned off; shader cache disabled in the NVIDIA Control Panel
on the motherboard I have optimized the PCIe lanes, MMIO, per-CPU options, and RAM
I use VROC Premium with direct CPU store writes for the apps drive, via the "CPU storage" BIOS setting
for benches I use Profile Inspector bench settings, with max-performance power per app/game
Profile Inspector also sets better GPU scaling and the NVLink/SLI bits, so the GPUs don't have to traverse the inter-socket link (QPI/UPI)
also, RDR2 is an mGPU Vulkan title.
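As a sketch of that first item: the mitigation override is the documented FeatureSettingsOverride/FeatureSettingsOverrideMask pair under the Memory Management key; setting both DWORDs to 3 disables the Spectre v2 and Meltdown mitigations. It needs admin rights and a reboot, and it trades security for speed. This is a minimal sketch of the documented registry change, not the exact steps I used:

```cpp
// Sketch: set the documented Windows mitigation-override registry values.
// Both DWORDs = 3 disables the Spectre v2 / Meltdown mitigations.
// Run as Administrator; takes effect after a reboot. Security trade-off!
#include <windows.h>
#include <cstdio>

int main() {
    HKEY key;
    LONG rc = RegCreateKeyExW(
        HKEY_LOCAL_MACHINE,
        L"SYSTEM\\CurrentControlSet\\Control\\Session Manager\\Memory Management",
        0, nullptr, 0, KEY_SET_VALUE, nullptr, &key, nullptr);
    if (rc != ERROR_SUCCESS) { std::printf("open failed: %ld\n", rc); return 1; }

    DWORD three = 3;
    RegSetValueExW(key, L"FeatureSettingsOverride", 0, REG_DWORD,
                   reinterpret_cast<const BYTE*>(&three), sizeof(three));
    RegSetValueExW(key, L"FeatureSettingsOverrideMask", 0, REG_DWORD,
                   reinterpret_cast<const BYTE*>(&three), sizeof(three));
    RegCloseKey(key);
    std::puts("Mitigation override set; reboot to apply.");
    return 0;
}
```

Two `reg add` commands or a .reg file do the same thing; the code just shows which key and values are involved.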

RDR2 version 1436.28 has no DLSS

Here are the other benchmarks on the system, using actual benchmark software:

https://gravitymark.tellusim.com/leaderboard/

I ran 2K, 4K, and 4K ray-traced; at 4K with ray tracing it was over 331 fps.

Strange Brigade (not as popular) is also an mGPU Vulkan title, but at 4K I get 317 fps.

For mGPU Vulkan, disable the shader cache and use Profile Inspector to optimize specifically for Vulkan titles.

Lastly, throttling: under max load I have not seen the GPUs get higher than 68C.

[attached screenshots: GPU monitoring, benchmark results, drive speed]


Hope all this helps. I take these rig builds very seriously.

My wife's PC is almost the same configuration; I've learned to optimize along the way, and the other PC has appropriately similar scores. I use both PCs at the same time to share the load for my dissertation.

[attached photo: the second PC]
 
Hi,

So I wasn't happy with the temps. I re-padded and repasted both 3090 Ti cards, using the Gelid pads and the pink Kryonaut Extreme. It will need a few days to settle down, but it seems to be about a 7C improvement on the RAM and about 2C on the GPU/GPU hotspot.

On the 3090 Ti FE there isn't any reason to use 2 or 3 mm pads; 1 mm to 1.5 mm seems to work best. I did do some creative padding on the backside to move some of the heat off the PCB to the outer shell.

I'll keep an eye on it and see if I get an extra 1C out of it as it settles.

I watched a few videos of the AIB partner cards being taken apart, and they seem an easier disassembly than the 3090 Ti FE. The whole process took me 1.5 hours per card; the bulk was spent cleaning, then selecting and cutting the pads to the right size. 1.5 mm was slightly too thick and 1 mm seemed too thin, and the Gelid pads don't compress that well, so it was trial and error.

I agonized over the TIM. I had Noctua NT-H2, Gelid Extreme, and Kryonaut gray and pink; best guess was the Kryonaut pink.
 
I had to resort to making custom cables, designed and built for the 3090 Ti. After prototypes, I'm having a batch made; delivery in a few weeks.

Does anyone else need cables for a Type 4 interface PSU?
I can add more to the run.

[attached photos: custom cables]
 
How many games actually support SLI anymore?
I think SLI is rather dead; the shift now is toward apps and games that are specifically written to be multi-GPU, which is anything but simple. The DX12 magic bullet of mGPU didn't really take hold. Vulkan scales much better than DX12 and seems better embraced in many indie developments.

With that said, in pure SLI I would usually avoid TAA (medium/high) and favor SMAA or FXAA. There are bumps: I had to learn a lot of Profile Inspector tricks and settings to get the best out of it, something no one should have to go through, and likewise some NVLink Inspector tricks to get the full effect; again, not something folks should have to do. I have had good SLI experiences in many titles, and poor SLI experiences in even more. After a while one learns which engines suck at it and which are OK (for example, Unity sucks in SLI and now in NVLink; UE3 was great in SLI, but UE4 required skullduggery, etc.). So SLI was a mixed bag, depending on how much one wanted to believe it was great: it was great in maybe 25% of games, so-so in 20%, and best not to say much about the rest.

Some smooth mGPU titles: RDR2, SOTTR, ROTTR, Sniper Elite 4, Zombie Army 4, Gears, X4, Strange Brigade, Quake 2 RTX, the last three Hitman games, Deus Ex: Mankind Divided, Echo, Ashes of the Singularity, and some of the Civilization games.
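For context on what "specifically written to be multi-GPU" looks like in Vulkan: explicit mGPU goes through device groups (core since Vulkan 1.1), and when SLI/NVLink is enabled in the driver, the two linked cards show up as a single group. A minimal sketch to check what the driver exposes; this is the standard Vulkan API, not something specific to this thread:

```cpp
// Enumerate Vulkan device groups: with SLI/NVLink enabled in the driver,
// two linked GPUs should appear as one group with physicalDeviceCount == 2.
// Build (Linux): g++ devgroups.cpp -lvulkan
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_1;  // device groups are core in 1.1
    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;

    VkInstance inst;
    if (vkCreateInstance(&ci, nullptr, &inst) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(inst, &count, nullptr);
    std::vector<VkPhysicalDeviceGroupProperties> groups(count);
    for (auto& g : groups)
        g.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    vkEnumeratePhysicalDeviceGroups(inst, &count, groups.data());

    for (uint32_t i = 0; i < count; ++i)
        std::printf("group %u: %u physical device(s)\n",
                    i, groups[i].physicalDeviceCount);

    vkDestroyInstance(inst, nullptr);
    return 0;
}
```

An engine then creates one logical device over the whole group and fans work out per GPU, which is why it only works in titles written for it.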

Where is the carrot? In order to get 4K, 5K, and 8K playable with max eye-candy settings, it's going to require more than one GPU for years to come. Example: Cyberpunk 2077. The graphics are "OK", but the frames with RT and max candy (Psycho settings) stink at 4K: 44 fps on a 3090 Ti in the built-in benchmark (ver. 1.52). This game would benefit from an mGPU makeover, but it's single-card only. All that horsepower, and 44 fps...

Also, even when using the 3.9 DLSS on Quality in Cyberpunk, there is a significant drop in texture clarity, resolution, and definition. So the answer to getting frames in an RT game is not DLSS, if one is trying to max the visual experience at 4K (we don't speak of 8K and Cyberpunk). Sometimes it seems DLSS is there just to give the illusion that ray tracing is playable at higher resolutions.

I've had great success improving Star Citizen / Squadron 42 (another behemoth). But the field of mGPU games is thin. Hopefully, as monitors grow in resolution, there will be more effort on mGPU. But even the future RTX 4090 is not going to provide decent 5K and 8K gaming as a single card with max eye candy and RT and no DLSS.

Unfortunately, my rig also has to do my dissertation, so that's why it exists, and that is where it gets its workout.
 
It's interesting that you have one processor exhausting into the other. Would it make more sense to have them both exhaust outwards, or inwards?

I also find it interesting that you only use one relatively small monitor, but I guess whatever works for you!
 
I discovered SLI was dead when I went from two 1080s to a 2080 Ti... and so on and so forth... I haven't looked back since. I think you're going to see DLSS improved rather than any effort toward SLI/NVLink and mGPU. There is no money in it for game developers, and people who can afford two video cards are few and far between outside of forums like these. It only took hold back in the day because NVIDIA was helping the developers, because it sold more cards.

I do miss the look of two video cards in my setups though... lol.
 
It's interesting that you have one processor exhausting into the other. Would it make more sense to have them both exhaust outwards, or inwards?

I also find it interesting that you only use one relatively small monitor, but I guess whatever works for you!
Lol, it's a 32-inch 4K ASUS PA32UCG with 1,152 local dimming zones; you've set a new standard if that is small.

I tried the various available coolers and orientations; this final version gave the most cooling. Again, under max load the CPUs never go higher than 47C, and they idle in the low-to-mid 20C range... so it's working well.
 
I discovered SLI was dead when I went from two 1080s to a 2080 Ti... and so on and so forth... I haven't looked back since. I think you're going to see DLSS improved rather than any effort toward SLI/NVLink and mGPU. There is no money in it for game developers, and people who can afford two video cards are few and far between outside of forums like these. It only took hold back in the day because NVIDIA was helping the developers, because it sold more cards.

I do miss the look of two video cards in my setups though... lol.
I miss the look of four video cards...
 
I had a system with an R9 295X2 and an R9 290X in CrossFire-X/TriFire. Probably my favorite computer I ever built. I built it for BF4 at 4K in 2013 on the recommendation of an [H] article. Loved it. Running BF4 with Mantle on a 4K monitor... good times. A shame the article is no longer posted; I would love to read it again.

BTW, pretty kick-ass system you have there. I wish SLI/CrossFire were worth it anymore.
 
I had a system with an R9 295X2 and an R9 290X in CrossFire-X/TriFire. Probably my favorite computer I ever built. I built it for BF4 at 4K in 2013 on the recommendation of an [H] article. Loved it. Running BF4 with Mantle on a 4K monitor... good times. A shame the article is no longer posted; I would love to read it again.

BTW, pretty kick-ass system you have there. I wish SLI/CrossFire were worth it anymore.

Apparently this next gen is the old multi-GPU'er's dream: SLI/CrossFire power on a single board.

I'd be excited if my 3080 didn't already crush everything.

I am sure OP will try to mGPU that too, if history is any lesson, lol.
 
Apparently this next gen is the old multi-GPU'er's dream: SLI/CrossFire power on a single board.

I'd be excited if my 3080 didn't already crush everything.

I am sure OP will try to mGPU that too, if history is any lesson, lol.

I have read the same thing. It will be really neat to see this happen. I am going to skip this round of cards and wait for them to work out any bugs and driver issues. I am done with release-day purchases, lol.
 
Apparently this next gen is the old multi-GPU'er's dream: SLI/CrossFire power on a single board.

I'd be excited if my 3080 didn't already crush everything.

I am sure OP will try to mGPU that too, if history is any lesson, lol.
I think OP uses them for compute work.
 
Interesting and logical factoid:

I repasted the cards with Kryonaut Extreme, the overpriced pink, which improved temps by several C.

As you know, all the VRAM on the 3090 Ti is on the same side as the GPU, and I re-padded the VRAM as well.

The result: when the GPU is maxing out at 69-71C, that is exactly the same temperature as the VRAM.

For example, with the GPU at 69C under load, 69C is also the exact temp of the VRAM.
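If anyone wants to log this over a run, here is a minimal NVML polling sketch. Note it reads core temperature and board power only; as far as I know NVML does not expose the GDDR6X memory-junction temperature, so the VRAM readings above came from a monitoring tool like HWiNFO:

```cpp
// Poll GPU core temperature and board power for each card via NVML.
// Build (Linux): g++ gputemps.cpp -lnvidia-ml
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;

    unsigned int n = 0;
    nvmlDeviceGetCount(&n);
    for (unsigned int i = 0; i < n; ++i) {
        nvmlDevice_t dev;
        if (nvmlDeviceGetHandleByIndex(i, &dev) != NVML_SUCCESS) continue;

        unsigned int tempC = 0;
        nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &tempC);

        unsigned int mW = 0;  // power draw reported in milliwatts
        nvmlDeviceGetPowerUsage(dev, &mW);

        std::printf("GPU %u: %u C, %.1f W\n", i, tempC, mW / 1000.0);
    }

    nvmlShutdown();
    return 0;
}
```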

Here is a pic of how the thermal pads came originally from NVIDIA. I replaced them with 3 mm Gelid pads that I squished down to 2.5 mm in a pasta maker for even, precise thickness.

[attached photo: factory thermal pads]
 
It's interesting that you have one processor exhausting into the other. Would it make more sense to have them both exhaust outwards, or inwards?

I also find it interesting that you only use one relatively small monitor, but I guess whatever works for you!
Pretty standard for enterprise server/workstation boards. At the pressure those fans tend to push, it doesn't matter.
 
Sir, it's a dual 3090 Ti setup, and Red Dead Redemption 2's Vulkan renderer is mGPU-enabled.

And absolutely NO DLSS; I don't even remember seeing that as a choice.
I loathe DLSS. Why use a 4K, 5K, or 8K monitor just to have it down-sampled to a lower resolution? If that is the case, then just play at the lower resolution.

Edit: just checked; on my version (1436) there isn't a DLSS selection to make.
Negative LOD bias.
 