An inequality on location and scale parameters
For probability distributions having an expected value and a median, the mean (i.e., the expected value) and the median can never differ from each other by more than one standard deviation. To express this in mathematical notation, let μ, m, and σ be respectively the mean, the median, and the standard deviation. Then

|μ − m| ≤ σ.
(There is no need to assume that the variance exists, i.e., is finite; unlike the situation with the expected value, saying the variance exists is equivalent to saying it is finite. If the variance is infinite, the inequality is trivially true, since its right-hand side is then infinite.)
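As a quick numerical illustration (an addition to the article, not part of it), the inequality can be checked for a few standard distributions; the sketch below assumes SciPy is available and uses its frozen distribution objects.

# Numerical check of |mean - median| <= standard deviation
# for a few standard distributions (a sketch, assuming SciPy is installed).
from scipy import stats

distributions = {
    "exponential(1)": stats.expon(),      # mean 1, median ln 2, std 1
    "lognormal(s=1)": stats.lognorm(1),   # heavily skewed
    "chi-squared(3)": stats.chi2(3),
    "normal(0, 1)":   stats.norm(),       # mean = median, so the gap is 0
}

for name, dist in distributions.items():
    mu, m, sigma = dist.mean(), dist.median(), dist.std()
    gap = abs(mu - m)
    print(f"{name:15s}  |mu - m| = {gap:.4f}   sigma = {sigma:.4f}   "
          f"inequality holds: {gap <= sigma}")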
Proof
This proof uses Jensen's inequality twice. We have
|μ − m| = |E(X − m)|
        ≤ E(|X − m|)
        ≤ E(|X − μ|)
        = E(√((X − μ)²))
        ≤ √(E((X − μ)²))
        = σ.
The first inequality comes from (the convex version of) Jensen's inequality applied to the absolute value function, which is convex. The second comes from the fact that the median minimizes the absolute deviation function a ↦ E(|X − a|).
The third inequality comes from (the concave version of) Jensen's inequality applied to the square root function, which is concave. Q.E.D.
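To make each step of the chain concrete, here is a small Monte Carlo sketch (an addition, not from the article) that estimates every intermediate quantity for an exponential(1) distribution using NumPy; the sample-based values are approximations standing in for the exact expectations.

# Monte Carlo sketch of the chain
#   |mu - m| = |E(X - m)| <= E|X - m| <= E|X - mu|
#            = E sqrt((X - mu)^2) <= sqrt(E(X - mu)^2) = sigma
# for an exponential(1) sample (illustrative estimates only).
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

mu, m, sigma = x.mean(), np.median(x), x.std()

abs_dev_from_m  = np.abs(x - m).mean()    # E|X - m|
abs_dev_from_mu = np.abs(x - mu).mean()   # E|X - mu| = E sqrt((X - mu)^2)

print(f"|mu - m|          = {abs(mu - m):.4f}")
print(f"E|X - m|          = {abs_dev_from_m:.4f}")
print(f"E|X - mu|         = {abs_dev_from_mu:.4f}")
print(f"sqrt(E(X - mu)^2) = {sigma:.4f}")
# Each printed value should be (approximately) no larger than the next one.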
Alternative proof
The one-tailed version of Chebyshev's inequality is

Pr(X ≥ μ + kσ) ≤ 1/(1 + k²)   for any k > 0.
Letting k = 1 gives Pr(X ≥ μ + σ) ≤ 1/2, and applying the same bound to −X (whose mean is −μ and whose standard deviation is also σ) gives Pr(X ≤ μ − σ) ≤ 1/2. Since at most half of the probability mass lies at or above μ + σ and at most half lies at or below μ − σ, any median must lie in the interval [μ − σ, μ + σ]; that is, the median is within one standard deviation of the mean.
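As an illustrative sketch (again an addition, assuming SciPy), the two tail bounds used above can be checked directly for a skewed distribution; the chi-squared(3) choice here is just an example.

# Check Pr(X >= mu + sigma) <= 1/2 and Pr(X <= mu - sigma) <= 1/2
# using SciPy's exact CDF and survival function (a sketch, not exhaustive).
from scipy import stats

dist = stats.chi2(3)
mu, m, sigma = dist.mean(), dist.median(), dist.std()

upper_tail = dist.sf(mu + sigma)    # Pr(X >= mu + sigma) for a continuous X
lower_tail = dist.cdf(mu - sigma)   # Pr(X <= mu - sigma)

print(f"Pr(X >= mu + sigma) = {upper_tail:.4f}  (should be <= 0.5)")
print(f"Pr(X <= mu - sigma) = {lower_tail:.4f}  (should be <= 0.5)")
print(f"median {m:.4f} lies in [{mu - sigma:.4f}, {mu + sigma:.4f}]")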