Investors who are concerned about market volatility should examine their investment choices from all angles when constructing a portfolio – evaluating not only by return, but risk, too.
There are a variety of risk measures that may come in handy. Of course, numbers don’t tell the whole story, but they may help you determine whether owning a particular investment is consistent with your personal risk tolerance.
5 Ways To Measure Risk
There are numerous ways your advisor can quantify investment risk and relate it to your risk tolerance, the level of risk you as an investor are willing to take in order to achieve your financial goals. Here are five common measures advisors use to assess your current risk and help determine an appropriate risk tolerance.
1. Alpha

Alpha is a measure of investment performance that factors in the risk associated with the specific security or portfolio, rather than the overall market (or correlated benchmark). It is a way of calculating so-called "excess return": the portion of investment performance that exceeds the expectations set by the market as well as the security's or portfolio's inherent price sensitivity to the market.
Alpha is a common way to assess an active manager’s performance as it measures portfolio return in excess of a benchmark index. In this regard, a portfolio manager’s added value is his/her ability to generate “alpha.”
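As a rough illustration, one common formulation is Jensen's alpha, which compares a portfolio's return to a CAPM-style expectation built from its beta. This is a minimal sketch, and all figures below are hypothetical, not real market data:

```python
# Minimal sketch of Jensen's alpha under CAPM-style assumptions.
# All inputs are hypothetical annualized returns.

def jensens_alpha(portfolio_return, risk_free_rate, beta, market_return):
    # Expected return given the portfolio's market sensitivity (beta)
    expected = risk_free_rate + beta * (market_return - risk_free_rate)
    # Alpha is performance above (or below) that expectation
    return portfolio_return - expected

# Hypothetical: portfolio returned 12%, market 10%, risk-free rate 3%, beta 1.1
print(f"{jensens_alpha(0.12, 0.03, 1.1, 0.10):.2%}")  # 1.30%
```

A positive result means the manager earned more than the portfolio's market exposure alone would predict; zero means the return was fully explained by that exposure.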
2. Beta

Beta is the statistical measure of the relative volatility of a security (such as a stock or mutual fund) compared to the market as a whole. The beta for the market (usually represented by the S&P 500) is 1.00. A security with a beta above 1.0 is considered to be more volatile (or risky) than the market. One with a beta of less than 1.0 is considered to be less volatile.
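The underlying calculation is the covariance of the security's returns with the market's returns, divided by the variance of the market's returns. A minimal sketch with made-up monthly figures:

```python
# Sketch: estimating beta from paired periodic returns (hypothetical data).
# beta = covariance(security, market) / variance(market)

def beta(security_returns, market_returns):
    n = len(market_returns)
    mean_s = sum(security_returns) / n
    mean_m = sum(market_returns) / n
    cov = sum((s - mean_s) * (m - mean_m)
              for s, m in zip(security_returns, market_returns)) / n
    var_m = sum((m - mean_m) ** 2 for m in market_returns) / n
    return cov / var_m

market = [0.01, -0.02, 0.03, 0.015, -0.01]   # hypothetical market monthly returns
stock = [0.02, -0.035, 0.05, 0.02, -0.02]    # hypothetical stock monthly returns

print(round(beta(stock, market), 2))  # 1.7 -- more volatile than the market
```

By construction, the market measured against itself has a beta of exactly 1.0.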
3. R-squared

R-squared (R2) quantifies how much of a fund's performance can be attributed to the performance of a benchmark index. The value of R2 ranges between 0 and 1 and measures the proportion of a fund's variation that is due to variation in the benchmark. For example, for a fund with an R2 of 0.70, 70% of the fund's variation can be attributed to variation in the benchmark.
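A small sketch, assuming R2 is computed as the squared correlation between fund and benchmark returns (the standard result for a simple one-factor regression); the return series are hypothetical:

```python
# Sketch: R-squared as the squared correlation of fund vs. benchmark returns.

def r_squared(fund_returns, benchmark_returns):
    n = len(fund_returns)
    mean_f = sum(fund_returns) / n
    mean_b = sum(benchmark_returns) / n
    cov = sum((f - mean_f) * (b - mean_b)
              for f, b in zip(fund_returns, benchmark_returns))
    var_f = sum((f - mean_f) ** 2 for f in fund_returns)
    var_b = sum((b - mean_b) ** 2 for b in benchmark_returns)
    return cov ** 2 / (var_f * var_b)

# Hypothetical fund that always moves exactly half as far as its benchmark:
fund = [0.01, -0.02, 0.03]
benchmark = [0.02, -0.04, 0.06]
print(round(r_squared(fund, benchmark), 2))  # 1.0
```

Note that R2 measures how fully the benchmark explains the fund's movements, not how large those movements are: the fund above tracks its benchmark perfectly (R2 near 1.0) even though it moves only half as much.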
4. Sharpe ratio
The Sharpe ratio is a tool for measuring how well the return of an investment rewards the investor given the amount of risk taken.
For example, a Sharpe ratio of 1 indicates one unit of return per unit of risk, 2 indicates two units of return per unit of risk, and so on. A negative value indicates that the investment underperformed a risk-free alternative, meaning the risk taken was not rewarded.
The Sharpe ratio is useful in examining risk and return, because although an investment may earn higher returns than its peers, it is only a good investment if those higher returns do not come with too much additional risk. The higher a portfolio’s Sharpe ratio, the better its risk-adjusted performance has been.
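In its classic form, the Sharpe ratio is the mean return in excess of the risk-free rate, divided by the standard deviation of returns. A minimal sketch with hypothetical periodic returns:

```python
# Sketch: Sharpe ratio = (mean return - risk-free rate) / std dev of returns.
# Returns and risk-free rate below are hypothetical per-period figures.

def sharpe_ratio(returns, risk_free_rate):
    n = len(returns)
    mean = sum(returns) / n
    excess = mean - risk_free_rate
    # Sample standard deviation of the periodic returns
    std = (sum((r - mean) ** 2 for r in returns) / (n - 1)) ** 0.5
    return excess / std

# Hypothetical returns of 5%, 1%, 3% with a 1% risk-free rate per period
print(round(sharpe_ratio([0.05, 0.01, 0.03], 0.01), 2))  # 1.0
```

Two investments with the same average return can have very different Sharpe ratios if one's returns are much more volatile, which is the point of risk-adjusting.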
5. Standard deviation
Standard deviation is a measure of investment risk that looks at how much an investment’s return has fluctuated from its own longer-term average.
Higher standard deviation typically indicates greater volatility, but not necessarily greater risk. That is because while standard deviation quantifies the dispersion of returns, it does not differentiate between gains and losses; it rewards consistency, even consistency of poor returns.

For instance, if an investment declined 2% a month for a series of months, its standard deviation would be near zero. But if an investment earned 8% one month and 12% the next, it would have a much higher standard deviation, even though by most accounts it would be the preferred investment.
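The two hypothetical return streams above can be checked directly (a sketch using the population standard deviation; the numbers are illustrative only):

```python
# Sketch: population standard deviation of two hypothetical return streams.

def std_dev(returns):
    n = len(returns)
    mean = sum(returns) / n
    return (sum((r - mean) ** 2 for r in returns) / n) ** 0.5

steady_loss = [-0.02] * 6                            # loses 2% every month
uneven_gain = [0.08, 0.12, 0.08, 0.12, 0.08, 0.12]   # alternates +8% and +12%

print(round(std_dev(steady_loss), 4))  # 0.0
print(round(std_dev(uneven_gain), 4))  # 0.02
```

The consistently losing investment scores a standard deviation of zero, while the strongly gaining one scores higher, which is exactly why standard deviation measures volatility rather than risk of loss.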
Using a variety of risk measures may give you a more complete picture than any single gauge. Your financial advisor can help you decide which ones will serve your needs and assess the risks and potential rewards associated with your portfolio.
Measure Risk with Clarity
Explore your unique risk tolerance with Clarity Wealth Development. Click here to schedule a complimentary consultation with a member of our team today.