Help me here, readers, because I can go off the deep end with metrics.
Assume for each of your major lawsuits that you have an annual budget as well as an estimate of both the amount at risk and the likelihood of your incurring that risk (settlement or judgment).
Could you not compare their “importance burn rates”? Divide the product of the at-risk amount and the likelihood by the annual budget. (Don’t tell me you have ranges, can’t be sure, depends on the judge and other side, not quantifiable – work with me on this!)
An example: You have budgeted to spend $600,000 this year on a case where $70 million is at risk, but with only a 40% chance that you will end up paying that amount – we are leaving out any consideration of timing and net present value. You divide $28 million (40% of $70 million) by $600,000 and find you have roughly $47 at risk for every defense dollar. If you budget $1 million for a case with an adjusted risk of $8 million, you have $8 at risk for every defense dollar. Do this calculation for each case; won’t the ratios indicate directionally whether your spend rates are in line with your financial risks? Don’t wide disparities suggest examining whether your allocation of defense dollars is rational?
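For readers who want to tinker, here is a minimal sketch of the arithmetic in Python; the function name and case labels are my own, and the figures are simply the hypothetical examples above.

def importance_burn_rate(amount_at_risk, likelihood, annual_budget):
    # Dollars at risk per defense dollar: (amount at risk x likelihood) / budget.
    return (amount_at_risk * likelihood) / annual_budget

# Hypothetical cases drawn from the examples above.
# Case B's $8 million is already risk-adjusted, so its likelihood is set to 1.0.
cases = {
    "Case A": (70_000_000, 0.40, 600_000),
    "Case B": (8_000_000, 1.00, 1_000_000),
}

for name, (at_risk, prob, budget) in cases.items():
    ratio = importance_burn_rate(at_risk, prob, budget)
    print(f"{name}: ${ratio:,.0f} at risk per defense dollar")
    # Prints roughly $47 for Case A and $8 for Case B.

Run it for every major case and sort by the ratio; the outliers at either end are the budgets worth a second look.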