Friday, June 19, 2015

A Sharpe Rule for Compensation?

As we think about the extent to which compensation incentives caused the financial mess, we might start with a basic fact: annual internal rate of return is a bad metric for evaluating performance. The largest problem with it is that it rewards return without punishing risk.

When firms use leverage to invest, they increase the risk they are taking on. The value of a firm's assets divided by its equity gives a rough multiple of risk created by leverage.

Suppose a firm has $100 in assets, and its return in one year can be either -$5 or $15, each with 50 percent probability. The expected return to the firm is 5 percent, with a standard deviation of 10 percent.

Now suppose it can borrow half the money to purchase the assets at an interest rate of 3 percent. Its expected return is now higher, because after paying $1.50 of interest on the borrowed $50 it will expect to earn $3.50 (13.50*.5 + (-6.50)*.5) on a $50 equity investment, or seven percent. Thus positive leverage gooses the return.

But now the standard deviation (or risk) of the investment is 20 percent (the investment produces a return swing of plus or minus $10 on a $50 equity investment). So while the return has improved, so too has the risk of the investment. Compensation strategies based on return alone would fail to recognize the risk.
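A minimal sketch of the arithmetic above, assuming the post's numbers ($100 of assets returning -$5 or +$15 with equal probability, financed either entirely with equity or half with debt at 3 percent); the function name and structure are illustrative, not from the original post:

```python
import statistics

asset_outcomes = [-5.0, 15.0]  # equally likely dollar returns on $100 of assets


def equity_return_stats(equity, debt, rate):
    """Expected percent return and standard deviation to equity holders."""
    returns = [(outcome - debt * rate) / equity for outcome in asset_outcomes]
    return statistics.mean(returns), statistics.pstdev(returns)


# Unlevered: $100 of equity, no debt.
print(equity_return_stats(equity=100.0, debt=0.0, rate=0.03))   # (0.05, 0.10)

# Levered: $50 of equity, $50 borrowed at 3 percent.
# Leverage multiple = assets / equity = 100 / 50 = 2, and the risk doubles with it.
print(equity_return_stats(equity=50.0, debt=50.0, rate=0.03))   # (0.07, 0.20)
```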

This would not be the case if compensation were tied to a company's Sharpe Ratio. The Sharpe Ratio is the corporate return less a risk-free rate, divided by the standard deviation of the return. In our case, taking the risk-free rate to be the 3 percent borrowing rate, the Sharpe Ratio in the first instance is .2 ((.05 - .03)/.10). In the second case, it is also .2 ((.07 - .03)/.20). If the Sharpe Ratio were used to determine compensation, managers would not be rewarded for goosing returns via leverage. And we could avoid all kinds of future trouble.
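A quick sketch of that comparison, reusing the figures computed above and assuming a 3 percent risk-free rate (the helper name is hypothetical):

```python
RISK_FREE = 0.03


def sharpe(expected_return, std_dev, risk_free=RISK_FREE):
    """Excess return earned per unit of risk taken."""
    return (expected_return - risk_free) / std_dev


print(sharpe(0.05, 0.10))  # 0.2 for the unlevered firm
print(sharpe(0.07, 0.20))  # 0.2 for the levered firm: no credit for leverage alone
```

The levered firm's higher raw return is exactly offset by its higher risk, so a Sharpe-based bonus would not pay managers for simply piling on debt.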
