What Is a Rounding Error?
A rounding error, or round-off error, is a mathematical miscalculation or quantization error caused by rounding a number to an integer or to fewer decimal places. Put simply, it is the difference between the result of a mathematical algorithm that uses exact arithmetic and the result of the same algorithm using a slightly less precise, rounded version of the same number or numbers. The significance of a rounding error depends on the circumstances.
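That gap between exact arithmetic and rounded arithmetic can be seen in miniature in any language that uses binary floating-point numbers; the Python snippet below is purely illustrative:

```python
# Neither 0.1 nor 0.2 has an exact binary representation, so both are
# rounded the moment they are stored. Their sum therefore differs from
# the exact answer, 0.3, by a tiny rounding error.
computed = 0.1 + 0.2
print(computed)          # 0.30000000000000004
print(computed == 0.3)   # False
print(computed - 0.3)    # the rounding error itself
```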
While it is inconsequential enough to be ignored in most cases, a rounding error can have a cumulative effect in today's computerized financial environment, in which case it may need to be rectified. A rounding error is especially problematic when rounded input is used in a series of calculations, causing the error to compound and, at times, to dominate the result.
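A minimal sketch of that compounding effect, using Python's `decimal` module and hypothetical invoice figures:

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical invoice: 1,000 line items of $0.3333 each.
item = Decimal("0.3333")
exact_total = item * 1000                                   # 333.3000

# Round each item to the cent *before* summing, as a careless
# spreadsheet formula might. The per-item error then compounds
# across all 1,000 lines.
rounded_item = item.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
rounded_total = rounded_item * 1000                         # 330.00

print(exact_total - rounded_total)   # a $3.30 discrepancy
```

A third-of-a-cent error per line is invisible on any single item, but summed over a thousand lines it becomes a visible dollar amount.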
The term "rounding error" is also sometimes used informally to describe an amount that is not material to a very large company.
How a Rounding Error Works
Financial statements of many companies routinely carry the warning that "numbers may not add up due to rounding." In such cases, the apparent discrepancy is caused only by the quirks of the financial spreadsheet and does not need rectification.
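A sketch of why such footnotes appear: three equal business segments, each rounded independently to one decimal place, no longer sum to 100% (Python's `decimal` module keeps the arithmetic exact):

```python
from decimal import Decimal

# Three equal segments each account for 33.333...% of revenue.
# Rounding each share to one decimal place before summing yields
# a column that totals 99.9%, not 100.0%.
share = (Decimal(100) / Decimal(3)).quantize(Decimal("0.1"))
print(share)       # 33.3
print(share * 3)   # 99.9
```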
Example of a Rounding Error
Consider a situation in which a financial institution mistakenly rounds off interest rates on mortgage loans in a given month, charging its customers rates of 4% and 5% instead of 3.60% and 4.70%, respectively. In this case, the rounding error could affect tens of thousands of customers, and the magnitude of the error would cost the institution hundreds of thousands of dollars to correct the transactions and rectify the mistake.
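The per-customer impact of such an error can be sketched as follows; the $250,000 balance is a hypothetical figure, not one stated in the example above:

```python
# Hypothetical: one month's interest on a $250,000 mortgage balance,
# charged at the erroneously rounded 4% rate instead of 3.60%.
balance = 250_000
correct_rate = 0.036   # 3.60% annual
rounded_rate = 0.04    # 4.00% annual

correct_interest = balance * correct_rate / 12
charged_interest = balance * rounded_rate / 12
overcharge = charged_interest - correct_interest

print(round(overcharge, 2))   # roughly $83.33 per customer, per month
```

Multiplied across tens of thousands of affected accounts, a monthly overcharge of this size quickly becomes a substantial sum to identify, refund, and document.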
The explosion of big data and related advanced data science applications has only amplified the possibility of rounding errors. Often a rounding error occurs simply by chance: it is inherently unpredictable or otherwise difficult to control for, which is one reason "clean" data is so hard to come by in big data applications. At other times, a rounding error occurs when a researcher unknowingly rounds a variable to too few decimal places.
Classic Rounding Error
The classic example of a rounding error is the story of Edward Lorenz. Around 1960, Lorenz, a professor at MIT, was entering numbers into an early computer program that simulated weather patterns when he changed a single value from 0.506127 to 0.506. To his surprise, that tiny alteration drastically transformed the whole pattern his program produced, affecting the accuracy of more than two months' worth of simulated weather.
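Lorenz's weather model is not reproduced here, but the same sensitivity can be sketched with the logistic map, a standard toy model of chaos; the two starting values mirror the ones Lorenz used:

```python
# Iterate the chaotic logistic map x -> 4x(1 - x) from two starting
# points that differ only from the fourth decimal place onward,
# echoing Lorenz's change from 0.506127 to 0.506.
def logistic(x0, steps=50):
    x = x0
    for _ in range(steps):
        x = 4.0 * x * (1.0 - x)
    return x

full = logistic(0.506127)
truncated = logistic(0.506)
print(full, truncated)   # after 50 steps the two trajectories
                         # typically bear no resemblance to each other
```

One step barely separates the two trajectories, but because the map roughly doubles any small difference on each iteration, the initial 0.000127 gap grows to order one within a few dozen steps.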
The unexpected result led Lorenz to a powerful insight into the way nature works: small changes can have large consequences. The idea came to be known as the "butterfly effect" after Lorenz suggested that the flap of a butterfly's wings might ultimately cause a tornado. And the butterfly effect, also known as "sensitive dependence on initial conditions," has a profound corollary: forecasting the future can be nearly impossible. Today, the butterfly effect is studied within the broader field of chaos theory. Further extensions of these effects are recognized in Benoit Mandelbrot's research into fractals and the "randomness" of financial markets.