Hausdorff moment problem
In mathematics, the Hausdorff moment problem, named after Felix Hausdorff, asks for necessary and sufficient conditions that a given sequence { μ_n : n = 0, 1, 2, ... } be the sequence of moments

    \mu_n = \int_0^1 x^n \, dF(x)

of some probability distribution, with cumulative distribution function F, supported on the closed unit interval [0, 1].
In 1921, Hausdorff showed that { μ_n : n = 0, 1, 2, ... } is such a moment sequence if and only if all of the differences

    (-1)^k (\Delta^k \mu)_n \ge 0, \qquad n, k = 0, 1, 2, \dots

are non-negative, where Δ is the difference operator given by

    (\Delta \mu)_n = \mu_{n+1} - \mu_n.

Sequences satisfying this condition are called completely monotone.
For example, it is necessary to have

    (\Delta^2 \mu)_0 = \mu_2 - 2\mu_1 + \mu_0 \ge 0.

When one considers that this is the same as

    \int_0^1 (1 - x)^2 \, dF(x) \ge 0,

or, generally,

    (-1)^k (\Delta^k \mu)_n = \int_0^1 x^n (1 - x)^k \, dF(x) \ge 0,

then the necessity of these conditions becomes obvious.
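As a quick sanity check (not part of the original article), the criterion can be verified numerically for the moments μ_n = 1/(n + 1) of the uniform distribution on [0, 1]. The sketch below computes the differences (−1)^k (Δ^k μ)_n with exact rational arithmetic and compares them against the integral identity above, which for the uniform distribution evaluates to the Beta integral ∫₀¹ xⁿ(1−x)^k dx = n! k! / (n + k + 1)!. The function name `hausdorff_differences` is illustrative, not standard.

```python
from fractions import Fraction
from math import factorial

def hausdorff_differences(mu, k_max):
    """Return {(k, n): (-1)^k (Delta^k mu)_n} for k <= k_max,
    where (Delta mu)_n = mu[n+1] - mu[n] is the forward difference."""
    results = {}
    row = list(mu)
    for k in range(k_max + 1):
        for n, value in enumerate(row):
            results[(k, n)] = (-1) ** k * value
        # Next row of forward differences has one fewer entry.
        row = [row[n + 1] - row[n] for n in range(len(row) - 1)]
    return results

# Moments of the uniform distribution on [0, 1]: mu_n = 1/(n+1).
mu = [Fraction(1, n + 1) for n in range(8)]
diffs = hausdorff_differences(mu, k_max=4)

# Hausdorff's criterion: every difference is non-negative.
assert all(v >= 0 for v in diffs.values())

# Integral identity: (-1)^k (Delta^k mu)_n = int_0^1 x^n (1-x)^k dx
#                  = n! k! / (n + k + 1)!  for the uniform distribution.
for (k, n), v in diffs.items():
    assert v == Fraction(factorial(n) * factorial(k), factorial(n + k + 1))

print("all differences non-negative and equal to the Beta integral")
```

For instance, the entry (k, n) = (2, 0) is μ_2 − 2μ_1 + μ_0 = 1/3 − 1 + 1 = 1/3, matching 0!·2!/3! = 1/3 from the integral side.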
References
- Hausdorff, F. "Summationsmethoden und Momentfolgen. I." Mathematische Zeitschrift 9, 74–109, 1921.
- Hausdorff, F. "Summationsmethoden und Momentfolgen. II." Mathematische Zeitschrift 9, 280–299, 1921.
- Shohat, J. A.; Tamarkin, J. D. The Problem of Moments. American Mathematical Society, New York, 1943.