I know the mantra that cursors kill performance. In this case, though, the dataset is enormous: one table has a few dozen rows, and the other has millions or more, and the current production process iterates through the multi-million-row recordset once for each of those dozens. I know exactly how to replace the cursor with a join between the two tables, but the question is: will the resulting single temp table, dozens of times larger, negate the advantage of removing the cursor over just a couple of dozen passes? As a point of reference, the current process takes about 4 hours with 14 iterations.
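To make the tradeoff concrete, here is a minimal sketch of the two shapes being compared, using Python's sqlite3 on a scaled-down, entirely hypothetical schema (the table names `params` and `facts`, the columns, and the `value > threshold` predicate are all invented for illustration; the real tables and join condition will differ):

```python
import sqlite3

# `params` stands in for the dozens-row table, `facts` for the
# multi-million-row table (scaled down here so the sketch runs quickly).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE params (param_id INTEGER PRIMARY KEY, threshold INTEGER);
    CREATE TABLE facts  (fact_id  INTEGER PRIMARY KEY, value INTEGER);
""")
conn.executemany("INSERT INTO params VALUES (?, ?)",
                 [(i, i * 10) for i in range(1, 4)])
conn.executemany("INSERT INTO facts VALUES (?, ?)",
                 [(i, i % 50) for i in range(1, 1001)])

# Cursor-style: one full pass over `facts` per row in `params`.
loop_counts = {}
for param_id, threshold in conn.execute("SELECT param_id, threshold FROM params"):
    (n,) = conn.execute(
        "SELECT COUNT(*) FROM facts WHERE value > ?", (threshold,)
    ).fetchone()
    loop_counts[param_id] = n

# Set-based: a single join producing one (larger) intermediate result,
# but only one logical pass over `facts`.
join_counts = dict(conn.execute("""
    SELECT p.param_id, COUNT(f.fact_id)
    FROM params p
    LEFT JOIN facts f ON f.value > p.threshold
    GROUP BY p.param_id
"""))

print(loop_counts == join_counts)
```

Both approaches produce identical results; the question in the post is purely about which one the engine executes faster at scale, where the join's intermediate result is roughly (rows in the big table) x (rows in the small table) wide rather than a few dozen sequential scans.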