# Value At Risk - VaR

## What is 'Value At Risk - VaR'

Value at risk (VaR) is a statistical technique used to measure and quantify the level of financial risk within a firm or investment portfolio over a specific time frame. This metric is most commonly used by investment and commercial banks to determine the extent and probability of potential losses in their institutional portfolios. VaR calculations can be applied to specific positions, to portfolios as a whole, or to measure firm-wide risk exposure.

## BREAKING DOWN 'Value At Risk - VaR'

VaR modeling determines the potential for loss in the entity being assessed, as well as the probability that the defined loss will occur. VaR is measured by assessing three components: the amount of potential loss, the probability of that loss occurring and the time frame. For example, a financial firm may determine an asset has a 3% one-month VaR of 2%, representing a 3% chance of the asset declining in value by at least 2% during the one-month time frame. Put another way, a decline of 2% or more would be expected in roughly three of every 100 months.
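One common way to compute such a figure is the parametric (variance-covariance) method, which assumes returns are normally distributed. The sketch below uses Python's standard-library `statistics.NormalDist`; the mean and volatility inputs are hypothetical, chosen so the result lands near the 2% figure in the example above.

```python
from statistics import NormalDist

def parametric_var(mean: float, stdev: float, confidence: float) -> float:
    """One-period parametric (variance-covariance) VaR.

    Returns the loss threshold (as a positive fraction of value) that is
    expected to be breached with probability 1 - confidence, assuming
    normally distributed returns.
    """
    z = NormalDist().inv_cdf(1 - confidence)   # e.g. z ~ -1.88 at 97%
    return -(mean + z * stdev)

# Hypothetical asset: 0.5% mean monthly return, 1.3% monthly volatility.
# A 97% confidence level corresponds to the "3% chance" in the text.
var_97 = parametric_var(mean=0.005, stdev=0.013, confidence=0.97)
```

Here `var_97` comes out at roughly 0.019, i.e. about a 2% one-month loss threshold with a 3% chance of being exceeded, matching the shape of the example above.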

## Applying VaR

Investment banks commonly apply VaR modeling to firm-wide risk due to the potential for independent trading desks to expose the firm to highly correlated assets unintentionally. Employing a firm-wide VaR assessment allows for the determination of the cumulative risks from aggregated positions held by different trading desks and departments within the institution. Using the data provided by VaR modeling, financial institutions can determine whether they have sufficient capital reserves in place to cover losses or whether higher-than-acceptable risks require concentrated holdings to be reduced.

## Problems With VaR Calculations

There is no standard protocol for the statistics used to determine asset, portfolio or firm-wide risk. For example, statistics pulled arbitrarily from a period of low volatility may understate the potential for risk events to occur, as well as the potential magnitude. Risk may be further understated using normal distribution probabilities, which generally do not account for extreme or black swan events.
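The sampling problem can be illustrated with a historical-simulation VaR, which takes an empirical loss quantile with no distributional assumption. The return series below is fabricated for illustration: a quiet stretch of small moves, plus a few extreme days that a calm-period sample would miss entirely.

```python
def historical_var(returns, confidence=0.95):
    """Historical-simulation VaR: the empirical loss quantile of
    observed returns, as a positive fraction. No normality assumed."""
    losses = sorted(-r for r in returns)        # losses as positives
    k = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[k]

# Hypothetical daily returns: 96 quiet days plus 4 crash days.
calm = [0.001, -0.002, 0.0015, -0.001] * 24
crashes = [-0.08, -0.12, -0.05, -0.09]
full_history = calm + crashes

# A VaR estimated only from the calm window sees no tail at all,
# while the same estimator over the full history picks up the crashes.
calm_var = historical_var(calm, confidence=0.99)
full_var = historical_var(full_history, confidence=0.99)
```

The calm-window estimate stays near 0.2% while the full-history estimate jumps to 12%: the same estimator, fed an arbitrarily chosen low-volatility window, understates both the likelihood and the magnitude of tail losses.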

The assessment of potential loss represents the minimum loss at the given confidence level, not the worst case in the range of outcomes. For example, a 95% VaR of 20% represents an expectation of losing at least 20% on one of every 20 days, on average. In this calculation, a loss of 50% on such a day still validates the risk assessment.
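This floor-not-ceiling property is easy to see numerically. The sketch below (with a fabricated return series) computes both the historical VaR threshold and the average loss beyond it; that tail average is often called expected shortfall or conditional VaR, and it is what actually captures magnitude.

```python
def var_and_tail_average(returns, confidence=0.95):
    """Return (VaR threshold, average loss beyond the threshold).
    The second number is often called expected shortfall."""
    losses = sorted(-r for r in returns)        # losses as positives
    k = min(int(confidence * len(losses)), len(losses) - 1)
    tail = losses[k:]                           # worst (1 - conf) outcomes
    return losses[k], sum(tail) / len(tail)

# Hypothetical 100-day history: mostly mild moves, one 50% collapse.
returns = [0.001] * 60 + [-0.01] * 35 + [-0.03, -0.04, -0.05, -0.06, -0.50]
var_95, tail_avg = var_and_tail_average(returns, confidence=0.95)
```

Here the 95% VaR is just 3%, yet the average loss on the worst 5% of days is over 13%, driven by a single 50% day that "still validates" the VaR number while being far more damaging than the threshold suggests.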

These problems were exposed in the financial crisis of 2008, as relatively benign VaR calculations understated the potential occurrence of risk events posed by portfolios of subprime mortgages. Risk magnitude was also underestimated, which resulted in extreme leverage ratios within subprime portfolios. As a result, the underestimations of occurrence and risk magnitude left institutions unable to cover billions of dollars in losses as subprime mortgage values collapsed.