The Sharpe ratio is a measure in finance that attempts to “risk-adjust” the expected returns of a security. Consider two securities. Security A will pay out 15% with a 50% probability and 5% with a 50% probability. Security B has a 50% chance of paying out 20% and a 50% chance of paying out nothing. While the expected return of the two securities is identical (10% each), Security A has the superior Sharpe ratio because its payouts are less dispersed.
Formally, the Sharpe ratio is the expected difference between the return of the asset and the return of a “risk-free” asset, divided by the standard deviation of the asset's return. Suppose the risk-free rate (often proxied by the Treasury rate) is 2%. Using the values from the example above, the numerator of the Sharpe ratio is 8% for both securities. But the standard deviation of Security A's return is 5%, while the standard deviation of Security B's return is 10%. As a result, the Sharpe ratio of Security A is 1.6, while the Sharpe ratio of Security B is 0.8.
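The arithmetic above can be sketched in a few lines of Python. This is an illustrative computation for securities with discrete outcomes; the function name and the payout/probability representation are assumptions made here, not a standard library API:

```python
def sharpe_ratio(payouts, probabilities, risk_free_rate):
    """Sharpe ratio of a discrete-outcome security:
    (expected return - risk-free rate) / standard deviation of return."""
    expected = sum(p * r for p, r in zip(probabilities, payouts))
    variance = sum(p * (r - expected) ** 2 for p, r in zip(probabilities, payouts))
    return (expected - risk_free_rate) / variance ** 0.5

# Security A: 15% or 5%, each with 50% probability
a = sharpe_ratio([0.15, 0.05], [0.5, 0.5], 0.02)  # ≈ 1.6
# Security B: 20% or 0%, each with 50% probability
b = sharpe_ratio([0.20, 0.00], [0.5, 0.5], 0.02)  # ≈ 0.8
```

Both securities share the same numerator (10% − 2% = 8%), so the ranking is driven entirely by the denominator: Security A's smaller standard deviation gives it the higher ratio.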