What Is a Risk-Adjusted Return?
A risk-adjusted return is a calculation of the profit or potential profit from an investment that takes into account the degree of risk that must be accepted in order to achieve it. The risk is measured in comparison to that of a virtually risk-free investment—usually U.S. Treasuries.
- A risk-adjusted return measures an investment's return after taking into account the degree of risk that was taken to achieve it.
- There are several methods of risk-adjusting performance, such as the Sharpe ratio and Treynor ratio, with each yielding a slightly different result.
- In any case, the purpose of risk-adjusted return is to help investors determine whether the risk taken was worth the expected reward.
Understanding Risk-Adjusted Return
The risk-adjusted return measures the profit your investment has made relative to the amount of risk it carried over a given period of time. If two or more investments delivered the same return over a given period, the one with the lowest risk has the better risk-adjusted return.
Some common risk measures used in investing include alpha, beta, R-squared, standard deviation, and the Sharpe ratio. When comparing two or more potential investments, an investor should apply the same risk measure to each investment under consideration in order to get a relative performance perspective.
Different risk measurements give investors very different analytical results, so it is important to be clear on what type of risk-adjusted return is being considered.
Examples of Risk-Adjusted Return Methods
The Sharpe ratio measures the profit of an investment that exceeds the risk-free rate, per unit of standard deviation. It is calculated by taking the return of the investment, subtracting the risk-free rate, and dividing this result by the investment's standard deviation.
All else equal, a higher Sharpe ratio is better. The standard deviation shows the volatility of an investment's returns relative to its average return, with greater standard deviations reflecting more widely dispersed returns and smaller standard deviations implying returns clustered more tightly around the average. The risk-free rate used is the yield on a no-risk investment, such as a Treasury bond (T-bond), for the relevant period of time.
For example, say Mutual Fund A returned 12% over the past year and had a standard deviation of 10%, Mutual Fund B returned 10% and had a standard deviation of 7%, and the risk-free rate over the time period was 3%. The Sharpe ratios would be calculated as follows:
- Mutual Fund A: (12% - 3%) / 10% = 0.9
- Mutual Fund B: (10% - 3%) / 7% = 1
Even though Mutual Fund A had a higher return, Mutual Fund B had a higher risk-adjusted return, meaning that it gained more per unit of total risk than Mutual Fund A.
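The Sharpe ratio arithmetic above can be sketched in a few lines of Python; the `sharpe_ratio` helper name is just for illustration:

```python
def sharpe_ratio(annual_return, risk_free_rate, std_dev):
    """Excess return over the risk-free rate, per unit of total risk."""
    return (annual_return - risk_free_rate) / std_dev

# Figures from the example above, expressed as decimals
fund_a = sharpe_ratio(0.12, 0.03, 0.10)  # 0.9
fund_b = sharpe_ratio(0.10, 0.03, 0.07)  # 1.0

print(f"Fund A: {fund_a:.2f}, Fund B: {fund_b:.2f}")
```

In practice, the return and standard deviation would be estimated from a series of periodic returns rather than supplied as single figures.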
The Treynor ratio is calculated the same way as the Sharpe ratio, but it uses the investment's beta—its sensitivity to market movements—in the denominator instead of the standard deviation. Using the previous fund example, and assuming that each of the funds has a beta of 0.75, the calculations are as follows:
- Mutual Fund A: (12% - 3%) / 0.75 = 0.12
- Mutual Fund B: (10% - 3%) / 0.75 ≈ 0.093
Here, Mutual Fund A has a higher Treynor ratio, meaning that the fund is earning more return per unit of systematic risk than Fund B.
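The Treynor calculation differs from the Sharpe sketch only in its denominator; again, the `treynor_ratio` helper name is illustrative:

```python
def treynor_ratio(annual_return, risk_free_rate, beta):
    """Excess return over the risk-free rate, per unit of systematic risk (beta)."""
    return (annual_return - risk_free_rate) / beta

# Same funds as above, each with an assumed beta of 0.75
fund_a = treynor_ratio(0.12, 0.03, 0.75)  # 0.12
fund_b = treynor_ratio(0.10, 0.03, 0.75)  # ~0.093

print(f"Fund A: {fund_a:.3f}, Fund B: {fund_b:.3f}")
```

Note that the ranking flips relative to the Sharpe example: with identical betas, the fund with the higher raw return wins on a systematic-risk basis.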
Risk avoidance is not always a good thing in investing, so beware of over-reacting to these numbers, especially if the timeline being measured is short. In strong markets, a mutual fund with lower risk than its benchmark can limit the real performance the investor wants to see, and greater risks can mean greater rewards over the long term.
A fund that takes on more risk than its benchmark may experience better returns. Historically, higher-risk mutual funds have tended to accrue greater losses during volatile periods, but they are also likely to outperform their benchmarks over full market cycles.