Understanding Risk-Adjusted Returns

We have often written about the importance of understanding your personal risk preferences as an investor. Likewise, when analyzing a potential investment, it is important to understand the return it could yield. Because both of these metrics are critical when evaluating new investment options (investors generally don't want to step outside their risk comfort zones, and they want to know their prospects for making money before committing), many investors like to consider them at the same time. To do so, they frequently look to an indicator called the risk-adjusted return.

Put simply, a risk-adjusted return refines the usual measure of an investment's return by accounting for the risk taken to produce it. It measures how much an investment has earned over a period of time relative to the amount of risk it took on during that same period. The indicator can be applied to individual securities or other assets, and some investors even calculate risk-adjusted returns for their entire portfolios to confirm that they are staying within their risk preferences. Once calculated, risk-adjusted returns are expressed as numbers or ratings that allow different investment options to be compared side by side.

Clearly, these metrics can be extremely helpful when picking new investments. However, risk-adjusted returns can become tricky, because different risk measures can be used in the calculation, and they can yield very different results. It is therefore important that investors know which risk measure a given calculation uses, so that their indicators and analyses remain consistent going forward. Below are two common risk-adjusted return calculations:

1. Sharpe Ratio

The Sharpe ratio measures an investment's excess return, meaning any return earned above the risk-free rate (the rate on an investment considered to carry no risk, such as a Treasury security), relative to the standard deviation of the investment's returns, which captures how volatile those returns have been around their average over time. This ratio helps investors weigh return against volatility, giving them a better sense of the performance they can expect from a single stock relative to its past performance.
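The calculation described above can be sketched in a few lines of Python. The return series and the 2% risk-free rate below are purely hypothetical, chosen only to illustrate the arithmetic:

```python
import statistics

def sharpe_ratio(returns, risk_free_rate):
    """Sharpe ratio: (average return - risk-free rate) / standard deviation.

    `returns` is a list of periodic returns expressed as decimals
    (e.g. 0.08 for 8%); all values here are hypothetical.
    """
    mean_return = statistics.mean(returns)
    volatility = statistics.stdev(returns)  # sample standard deviation of returns
    return (mean_return - risk_free_rate) / volatility

# Hypothetical annual returns for a single stock over five years
annual_returns = [0.12, 0.08, -0.04, 0.15, 0.07]
print(round(sharpe_ratio(annual_returns, 0.02), 2))  # → 0.77
```

A higher value means more excess return was earned per unit of volatility, which is why the ratio lends itself to side-by-side comparison of investments.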

2. Treynor Ratio

The Treynor ratio is calculated in the same way as the Sharpe ratio, but uses the investment's beta in place of its standard deviation. Beta compares a stock's risk to the market as a whole, rather than to the stock's own past volatility. The Treynor ratio may therefore be better suited to investors focused on systematic (market) risk, whereas the Sharpe ratio reflects total risk, including a stock's own (unsystematic) volatility.
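The substitution of beta for standard deviation can be sketched as follows. The 10% return, 2% risk-free rate, and beta of 1.25 are hypothetical inputs used only to show the formula:

```python
def treynor_ratio(investment_return, risk_free_rate, beta):
    """Treynor ratio: (investment return - risk-free rate) / beta.

    Same excess-return numerator as the Sharpe ratio, but divided by beta,
    the investment's risk relative to the overall market. All inputs are
    decimals; the values used below are hypothetical.
    """
    return (investment_return - risk_free_rate) / beta

# A hypothetical stock returning 10% with a beta of 1.25, against a 2% risk-free rate
print(round(treynor_ratio(0.10, 0.02, 1.25), 3))  # → 0.064
```

Because beta measures sensitivity to market movements, two stocks with identical volatility can still have different Treynor ratios if one moves more closely with the market than the other.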

Ultimately, if you are an investor who values their risk preferences and also aims to maximize their returns, you might find it valuable to look more into risk-adjusted return calculations!