Sharpe Ratio
The Sharpe Ratio is a measure of risk-adjusted return that helps investors compare the return of an investment to the risk taken to achieve it. Introduced by William F. Sharpe in 1966, it is calculated by subtracting the risk-free rate from the portfolio's rate of return and dividing the result by the standard deviation of the portfolio's returns. The higher the Sharpe Ratio, the better the portfolio's risk-adjusted performance.
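The calculation described above can be sketched in a few lines of Python. The return series and risk-free rate below are hypothetical example figures, not data from the article:

```python
import statistics

def sharpe_ratio(returns, risk_free_rate):
    """Mean excess return over the risk-free rate, divided by the
    standard deviation of the return series."""
    excess = statistics.mean(returns) - risk_free_rate
    return excess / statistics.stdev(returns)

# Hypothetical monthly returns (as decimals) and a 0.2% monthly risk-free rate
monthly_returns = [0.012, -0.004, 0.021, 0.008, -0.010, 0.015]
ratio = sharpe_ratio(monthly_returns, 0.002)
```

Note that `statistics.stdev` computes the sample standard deviation; in practice the ratio is often annualized by scaling with the square root of the number of periods per year.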
History of the Sharpe Ratio
William F. Sharpe introduced the ratio in 1966, originally calling it the "reward-to-variability ratio," as a way to measure a portfolio's performance relative to its risk. Sharpe, a professor of finance at Stanford University, was awarded the Nobel Memorial Prize in Economic Sciences in 1990 for his work on asset pricing. The Sharpe Ratio has since become one of the most widely used measures of risk-adjusted return and is routinely used by investors to evaluate portfolio performance.
Table of Comparisons
Portfolio | Return | Risk-Free Rate | Standard Deviation | Sharpe Ratio |
---|---|---|---|---|
Portfolio A | 10% | 2% | 5% | 1.6 |
Portfolio B | 12% | 2% | 7% | 1.43 |
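As a quick check, the ratios in the table follow directly from the formula. A minimal sketch using the percentage figures above:

```python
def sharpe(portfolio_return, risk_free_rate, std_dev):
    # (portfolio return - risk-free rate) / standard deviation of returns
    return (portfolio_return - risk_free_rate) / std_dev

# Figures from the comparison table, in percent
portfolio_a = sharpe(10, 2, 5)            # (10 - 2) / 5 = 1.6
portfolio_b = round(sharpe(12, 2, 7), 2)  # (12 - 2) / 7 ≈ 1.43
```

Portfolio A earns less than Portfolio B in absolute terms, yet its higher Sharpe Ratio indicates better risk-adjusted performance.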
Summary
The Sharpe Ratio measures risk-adjusted return: a portfolio's excess return over the risk-free rate, divided by the standard deviation of its returns. A higher ratio indicates better risk-adjusted performance, which allows investors to compare portfolios with different levels of risk on an equal footing. For more information about the Sharpe Ratio, investors can visit websites such as Investopedia, Morningstar, and Bloomberg.
See Also
- Alpha
- Beta
- Treynor Ratio
- Jensen’s Alpha
- Information Ratio
- Sortino Ratio
- Calmar Ratio
- Upside Potential Ratio
- Omega Ratio
- Value at Risk