Do triple GPU setups further increase input latency?

Bop

I've been contemplating going for a triple GPU setup when the new ATi/nVidia cards are released (and when I get a better CPU to ease the bottleneck). It is known that a dual card setup increases input latency over a single GPU, but does it necessarily follow that a third GPU will increase it further? If true, there could potentially be ~33.3 ms of added latency at 60 fps (two extra frames at 16.7 ms each)!
 
"It is known that a dual card setup increases input latency over a single GPU"

I've never heard that. Source?
 
I would think that if your GPUs are all running at 100%, each frame will be displayed as soon as the GPU rendering it has finished. So however long your GPU takes to render a frame is your input lag. With two GPUs producing 50 fps combined, each GPU renders 25 fps, so each frame takes 40 ms; but since the two GPUs alternate, you get a frame every 20 ms. Your input lag would still be 40 ms, even though you get a frame every 20 ms. A single GPU spitting out 50 fps would have just a 20 ms input lag.

That is my understanding, although it could be wrong (I usually am).
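
Here's a quick Python sketch of that arithmetic (illustrative only, assuming ideal alternate-frame-rendering scaling; the numbers are the ones from the post above):

```python
# Sketch of the AFR latency arithmetic from the post above, assuming
# ideal alternate-frame-rendering scaling (an illustration, not a benchmark).

def afr_timing(per_gpu_frame_ms: float, num_gpus: int) -> tuple[float, float]:
    """Return (input_lag_ms, frame_interval_ms) for an ideal AFR setup.

    Each GPU still needs the full per_gpu_frame_ms to finish its frame,
    so input lag stays at per_gpu_frame_ms; the GPUs interleave their
    output, so a new frame appears every per_gpu_frame_ms / num_gpus.
    """
    input_lag_ms = per_gpu_frame_ms
    frame_interval_ms = per_gpu_frame_ms / num_gpus
    return input_lag_ms, frame_interval_ms

# The example from the post: two GPUs that each take 40 ms per frame.
lag, interval = afr_timing(per_gpu_frame_ms=40.0, num_gpus=2)
print(f"2-GPU AFR:  {lag:.0f} ms input lag, frame every {interval:.0f} ms "
      f"({1000 / interval:.0f} fps)")

# A single GPU producing the same 50 fps renders each frame in 20 ms.
lag, interval = afr_timing(per_gpu_frame_ms=20.0, num_gpus=1)
print(f"Single GPU: {lag:.0f} ms input lag, frame every {interval:.0f} ms "
      f"({1000 / interval:.0f} fps)")
```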
 
I've never heard of this and highly doubt it's true, because I use a 37" LCD TV as a monitor, and because of that I know about and have experienced 30-40 ms of input lag on my display before. When I wanted to upgrade in size (went from a 32" Sony to the 37" Panasonic I use now), I did extensive input lag testing on more than one LCD (using a good camera, a timer program, and a CRT monitor as a control) and ended up with the Panasonic because it's bar none the fastest LCD TV made in recent years, at about 20 ms of lag vs. a CRT.

I recently upgraded my machine from one 560 Ti to two in SLI and noticed no change in input lag. If going to SLI had added even 20 ms of lag, stacked on top of the 20 ms of my display's own lag it would have been very noticeable in my mouse movements in game.

Also, BF3 has a performance graph you can turn on that tells you how long your CPU and GPU subsystems are taking to render each frame, measured in ms, and both of mine are well under 20 ms. My GPU usually takes anywhere from 2-7 ms to render; the CPU is the bottleneck most of the time, taking anywhere from 5-15 ms.
 
I would say just the opposite. I went from CrossFire 5850s to TriFire 5970/5850.

I did notice microstuttering at times with CF. TF completely eliminated it (and got me >100 fps in BF3).

I would say in my experience the microstuttering can "feel" like momentary input lag.
 
Here is a good article on input lag:

http://www.anandtech.com/show/2803/2

At the end he speculated that multi-GPU setups may add to input lag, but they never got around to testing it with the high-speed camera. What I got out of it is exactly what I thought, though.

The factors in input lag are (summed up in the sketch after this list):

Your input device (a good gaming mouse has almost no lag; a cheap one can have as much as 10 ms of lag)

Your CPU (not usually a major source, and the faster it is the better)

Your GPU (a major source; its performance is directly tied to the frame rate: higher performance, less lag, which means SLI etc. should lower lag, IMO)

Frame transmission (this is fixed by your display's refresh rate; a 60 Hz display = 16.7 ms of lag)

V-sync (here is a good reason not to run v-sync, and especially not to run triple buffering: v-sync will add another 16.7 ms of lag and triple buffering will add 33.3 ms of lag, except apparently on nVidia cards, because they employ a better method of triple buffering or something)

Finally, your display: most gaming monitors these days will have a total lag figure under 10 ms, but in reality no less than 2-4 ms. This is not the same thing as pixel response time, but pixel response time is a part of this figure.
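
To make the list concrete, here's a rough back-of-the-envelope sum in Python, treating each factor as a worst-case additive delay at 60 Hz (the per-source numbers are illustrative guesses, not measurements):

```python
# Rough back-of-the-envelope sum of the latency sources listed above,
# treating each as a worst-case additive delay (illustrative numbers only).

REFRESH_HZ = 60
FRAME_MS = 1000 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

sources_ms = {
    "input device (gaming mouse)": 1.0,       # a cheap mouse can be ~10 ms
    "CPU frame time":              5.0,
    "GPU frame time":              10.0,      # falls as GPU performance rises
    "frame transmission":          FRAME_MS,  # fixed by the refresh rate
    "v-sync penalty":              FRAME_MS,  # 0 with v-sync off
    "triple buffering penalty":    FRAME_MS,  # the extra buffered frame
    "display processing":          10.0,      # gaming monitors: often < 10 ms
}

total = sum(sources_ms.values())
for name, ms in sources_ms.items():
    print(f"{name:30s} {ms:5.1f} ms")
print(f"{'worst-case total':30s} {total:5.1f} ms")
```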

He also said, and I have seen this first hand with crappy console ports, that how a specific game works and is designed has a major impact on input lag regardless of many of these factors. Off the top of my head, I can think of Alice: Madness Returns as a game with some really major input lag that anyone will feel right away, because it was poorly coded.

Games where input lag matters (FPS games) should be, and often are, coded to have low input lag. It's up to you to make sure you get the highest frame rates possible and have low-input-lag devices at your disposal to do the rest of the job. This is yet another argument for not using v-sync, and I'm willing to bet that real anti-aliasing and some other types of eye candy in games produce input lag as well.
 
Here is some interesting info from nVidia on this as well:

http://developer.download.nvidia.com/whitepapers/2011/SLI_Best_Practices_2011_Feb.pdf

One additional detail worth noting is that while input latency is the same on SLI AFR configurations as it is on single GPU configurations (each frame will take as long to be complete), inter-frame latency is reduced due to parallelism, so the application appears more responsive. For example, if a typical frame takes 30 ms to render, in a 2-GPU AFR configuration with perfect scaling the latency between those frames is only 15 ms. Thus, increasing the number of frames buffered in SLI does not increase input lag.
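
To visualize the whitepaper's 30 ms example, here's a small Python sketch of ideal 2-GPU AFR frame pacing (hypothetical, perfect scaling assumed): each frame still takes 30 ms from start to display, but frames arrive every 15 ms.

```python
# Sketch of the whitepaper's example: each frame takes 30 ms to render,
# but with 2-GPU AFR the GPUs start frames 15 ms apart, so frames arrive
# every 15 ms while each individual frame still takes 30 ms (the input lag).
# Ideal scaling assumed; illustrative only.

FRAME_RENDER_MS = 30.0
NUM_GPUS = 2
N_FRAMES = 6

finish_times = []
for frame in range(N_FRAMES):
    # With perfect AFR pacing, frame n starts render_ms / num_gpus after frame n-1.
    start = frame * FRAME_RENDER_MS / NUM_GPUS
    finish = start + FRAME_RENDER_MS
    finish_times.append(finish)
    print(f"frame {frame}: starts {start:5.1f} ms, displayed {finish:5.1f} ms "
          f"(latency {FRAME_RENDER_MS:.0f} ms)")

intervals = [b - a for a, b in zip(finish_times, finish_times[1:])]
print(f"inter-frame interval: {intervals[0]:.1f} ms "
      f"-> {1000 / intervals[0]:.0f} fps")
```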
 
I just stumbled upon that article as well. Very interesting. It appears, if I understood it correctly, that a third card will not add extra latency, and the tests from THG show it will also reduce microstutter. If that is the case, it looks like I will be a triple GPU owner from the next upgrade on out. :)
 