Runge's phenomenon
In the mathematical field of numerical analysis, Runge's phenomenon is a problem of oscillation at the edges of an interval that occurs when using polynomial interpolation with polynomials of high degree over a set of equidistant interpolation points. It was discovered by Carle David Tolmé Runge when exploring the behaviour of errors when using polynomial interpolation to approximate certain functions.
Problem
Consider the function:

$$f(x) = \frac{1}{1 + 25x^2}.$$

Runge found that if this function is interpolated at equidistant points $x_i$ between −1 and 1 such that:

$$x_i = \frac{2i}{n} - 1, \qquad i \in \{0, 1, \dots, n\},$$

with a polynomial $P_n(x)$ of degree at most $n$, the resulting interpolation oscillates toward the ends of the interval, i.e. close to −1 and 1. It can even be proven that the interpolation error tends toward infinity as the degree of the polynomial increases:

$$\lim_{n \to \infty} \left( \max_{-1 \le x \le 1} \left| f(x) - P_n(x) \right| \right) = \infty.$$
However, the Weierstrass approximation theorem guarantees that some sequence of approximating polynomials exists for which the error goes to zero; the divergence above shows that interpolation at equidistant points fails to produce such a sequence. High-degree polynomial interpolation at equidistant points can therefore be dangerous.
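The divergence is easy to observe numerically. The following is a minimal sketch, assuming NumPy and SciPy are available; the function name `runge` and the degrees tried are illustrative choices, and SciPy's `BarycentricInterpolator` is used only as a numerically stable way to evaluate the degree-$n$ interpolating polynomial.

```python
# Reproduce Runge's phenomenon: the maximum error of polynomial
# interpolation at equidistant nodes on [-1, 1] grows with the degree n.
import numpy as np
from scipy.interpolate import BarycentricInterpolator

def runge(x):
    """Runge's function f(x) = 1 / (1 + 25 x^2)."""
    return 1.0 / (1.0 + 25.0 * x**2)

x = np.linspace(-1.0, 1.0, 2001)   # dense grid for measuring the error
f = runge(x)

for n in (5, 10, 15, 20):
    nodes = np.linspace(-1.0, 1.0, n + 1)            # equidistant x_i = 2i/n - 1
    p = BarycentricInterpolator(nodes, runge(nodes))  # degree-n interpolant
    err = np.abs(f - p(x)).max()
    print(f"n = {n:2d}   max |f(x) - P_n(x)| = {err:.3f}")
```

Run as written, the printed maximum error should grow rapidly with $n$, in line with the limit stated above.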
Solutions to the problem of Runge's phenomenon
The oscillation can be minimized by using Chebyshev nodes, $x_k = \cos\left(\frac{2k+1}{2(n+1)}\pi\right)$ for $k = 0, \dots, n$, instead of equidistant nodes. Because these nodes cluster near the ends of the interval, the maximum error is guaranteed to diminish with increasing polynomial order. The phenomenon demonstrates that high-degree polynomials are generally unsuitable for interpolation at equidistant points. The problem can also be avoided by using spline curves, which are piecewise polynomials: to decrease the interpolation error, one increases the number of polynomial pieces used to construct the spline instead of the degree of the polynomials. Both remedies are compared in the sketch below.
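A minimal sketch of both remedies, again assuming NumPy and SciPy; `BarycentricInterpolator` and `CubicSpline` are used as convenient stand-ins for evaluating the Chebyshev interpolant and building a piecewise-cubic spline, not as the only possible implementations.

```python
# Compare the two remedies on Runge's function: interpolation at
# Chebyshev nodes, and a cubic spline through equidistant points.
import numpy as np
from scipy.interpolate import BarycentricInterpolator, CubicSpline

def runge(x):
    return 1.0 / (1.0 + 25.0 * x**2)

x = np.linspace(-1.0, 1.0, 2001)   # dense grid for measuring the error
f = runge(x)

for n in (5, 10, 15, 20):
    # Chebyshev nodes x_k = cos((2k+1)pi / (2(n+1))) cluster near +/-1,
    # exactly where the equidistant interpolant oscillates.
    k = np.arange(n + 1)
    cheb = np.cos((2 * k + 1) * np.pi / (2 * (n + 1)))
    p = BarycentricInterpolator(cheb, runge(cheb))

    # Cubic spline on equidistant points: the degree stays fixed at 3,
    # and only the number of pieces grows with n.
    equi = np.linspace(-1.0, 1.0, n + 1)
    s = CubicSpline(equi, runge(equi))

    print(f"n = {n:2d}   Chebyshev max err = {np.abs(f - p(x)).max():.4f}"
          f"   spline max err = {np.abs(f - s(x)).max():.4f}")
```

In contrast to the equidistant case, both printed error columns should shrink as $n$ grows.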
See also
- Compare with the Gibbs phenomenon for sinusoidal basis functions