What Is the Residual Sum of Squares (RSS)?

The residual sum of squares (RSS) is a statistical technique used to measure the amount of variance in a data set that is not explained by a regression model. In other words, it estimates the variance in the residuals, or error term.

Linear regression is a measurement that helps determine the strength of the relationship between a dependent variable and one or more other factors, known as independent or explanatory variables.

Key Takeaways

  • A residual sum of squares (RSS) measures the level of variance in the error term, or residuals, of a regression model.
  • Ideally, the sum of squared residuals should be lower than the sum of squares from the regression model's inputs.
  • The RSS is used by financial analysts in estimating the validity of their econometric models.

The Formula for the Residual Sum of Squares (RSS)

RSS = Σᵢ₌₁ⁿ (yi - f(xi))²

where:

  • yi = the ith value of the variable to be predicted
  • f(xi) = predicted value of yi
  • n = the number of observations (the upper limit of summation)
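
As a minimal sketch of how this formula works in practice, the following Python snippet computes the RSS for a small set of made-up observations and an assumed prediction function (both purely illustrative):

```python
# Illustrative only: observed values (yi) and predictions f(xi)
# from a hypothetical fitted line y = 2x + 1.
observed = [3.1, 4.9, 7.2, 9.0, 11.1]             # yi
predicted = [2 * x + 1 for x in [1, 2, 3, 4, 5]]  # f(xi)

# RSS: sum the squared differences between observed and predicted values.
rss = sum((y - f) ** 2 for y, f in zip(observed, predicted))
print(f"RSS = {rss:.2f}")
```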

Understanding the Residual Sum of Squares (RSS)

In general terms, the sum of squares is a statistical technique used in regression analysis to determine the dispersion of data points. In a regression analysis, the goal is to determine how well a data series can be fitted to a function that might help to explain how the data series was generated. The sum of squares is used as a mathematical way to find the function that best fits (varies least) from the data.

The RSS measures the amount of error remaining between the regression function and the data set after the model has been run. A smaller RSS figure represents a regression function that fits the data more closely.

The RSS, also known as the sum of squared residuals, essentially determines how well a regression model explains or represents the data in the model.

Residual Sum of Squares (RSS) vs. Residual Standard Error (RSE)

The residual standard error (RSE) is another statistical term, used to describe the standard deviation of the differences between observed values and the values predicted by a regression model. It is a goodness-of-fit measure that can be used to analyze how well a set of data points fits the model.

RSE is computed by dividing the RSS by the number of observations in the sample less 2, and then taking the square root: RSE = √[RSS/(n - 2)]
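
A short continuation of the earlier sketch, using the same hypothetical numbers rather than real data:

```python
# Illustrative only: RSE derived from the RSS computed above.
# Dividing by n - 2 reflects the two estimated parameters
# (slope and intercept) of a simple linear regression.
rss = 0.07  # hypothetical residual sum of squares
n = 5       # hypothetical number of observations

rse = (rss / (n - 2)) ** 0.5  # square root of RSS / (n - 2)
print(f"RSE = {rse:.4f}")
```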

The Residual Sum of Squares (RSS), Finance, and Econometrics

Financial markets have become increasingly quantitative; in search of an edge, many investors now use advanced statistical techniques to aid their decisions. Big data, machine learning, and artificial intelligence applications further necessitate the use of statistical properties to guide contemporary investment strategies. The residual sum of squares, or RSS statistic, is one of many statistical measures enjoying a renaissance.

Investors and portfolio managers use statistical models to track an investment's price and predict its future movements. This kind of study, called regression analysis, might involve analyzing the relationship in price movements between a commodity and the stocks of companies engaged in producing that commodity.

Any model is likely to show differences between its predicted values and the actual results. Although much of that variation may be explained by the regression analysis, the RSS represents the variation, or error, that is not explained.

Since a sufficiently complex regression function can be made to fit virtually any data set closely, further study is necessary to determine whether the regression function is, in fact, useful in explaining the variance of the data set. That said, a lower RSS is generally preferable in any model, since it means there is less unexplained variation in the data set. In other words, the lower the sum of squared residuals, the better the regression model is at explaining the data.
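
To make the point concrete, here is a hedged sketch that fits a simple least-squares line to made-up price data with NumPy and compares the RSS to the total variation in the data (all numbers are illustrative assumptions, not market data):

```python
import numpy as np

# Made-up data: a commodity price (x) and a related stock price (y).
x = np.array([10.0, 12.0, 14.0, 16.0, 18.0, 20.0])
y = np.array([25.1, 29.8, 35.2, 39.9, 45.3, 49.8])

# Fit a simple least-squares line and compute the model's RSS.
slope, intercept = np.polyfit(x, y, 1)
predicted = slope * x + intercept
rss = np.sum((y - predicted) ** 2)

# Compare against the total variation around the mean to see how much
# of it the fitted line leaves unexplained.
tss = np.sum((y - y.mean()) ** 2)
print(f"RSS = {rss:.3f}, total sum of squares = {tss:.3f}, "
      f"unexplained share = {rss / tss:.2%}")
```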