##### Derivation of Chebyshev’s inequality and its application to prove the (weak) LLN

Chebyshev’s inequality is a direct consequence of Markov’s inequality. It states that if X is a random variable with mean μ and variance σ², then for every r > 0:

P(|X – μ| >= r) <= σ²/r²

Proof: The events {|X – μ| >= r} and {(X – μ)² >= r²} coincide, so they have the same probability. Since (X – μ)² is a non-negative random variable, we can apply Markov’s inequality with a = r², obtaining:

P(|X – μ| >= r) = P((X – μ)² >= r²) <= E[(X – μ)²]/r² = σ²/r²
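The bound can be checked numerically. The sketch below draws from an Exp(1) distribution (which has μ = 1 and σ² = 1) and compares the empirical probability of a large deviation with the Chebyshev bound; the distribution, sample size, and threshold r are illustrative choices, not from the text.

```python
import random

random.seed(0)

mu, sigma2 = 1.0, 1.0            # Exp(1): mean 1, variance 1
n_samples = 100_000
samples = [random.expovariate(1.0) for _ in range(n_samples)]

r = 2.0
# Empirical estimate of P(|X - mu| >= r)
empirical = sum(1 for x in samples if abs(x - mu) >= r) / n_samples
# Chebyshev's bound: sigma^2 / r^2
bound = sigma2 / r**2

print(empirical, bound)
```

The empirical probability (about 0.05 here, since P(X >= 3) = e⁻³ for Exp(1)) sits well below the bound of 0.25 — Chebyshev is valid for every distribution with finite variance, so it is often loose for any particular one.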

Chebyshev’s inequality can also be used to prove the weak law of large numbers. Let X1, X2, …, Xn be a sequence of i.i.d. (independent and identically distributed) random variables with mean μ. Then for every ε > 0,

P(|(X1 + X2 + … + Xn)/n – μ| > ε) -> 0 as n -> ∞

Proof: We prove the result under the additional hypothesis that the Xi have finite variance σ². By the properties of mean and variance we have:

E[(X1 + X2 + … + Xn)/n] = μ and Var((X1 + X2 + … + Xn)/n) = σ²/n
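These two identities can be verified empirically: repeatedly draw n i.i.d. samples, compute their average, and look at the mean and variance of those averages. A sketch using Exp(1) again (μ = 1, σ² = 1); the sizes are illustrative assumptions.

```python
import random

random.seed(1)

mu, sigma2, n = 1.0, 1.0, 50     # Exp(1): mu = 1, sigma^2 = 1
trials = 20_000

# Sample mean of n i.i.d. draws, repeated many times
means = [sum(random.expovariate(1.0) for _ in range(n)) / n
         for _ in range(trials)]

avg = sum(means) / trials                          # should approach mu
var = sum((m - avg) ** 2 for m in means) / trials  # should approach sigma2/n

print(avg, var, sigma2 / n)
```

With n = 50 the variance of the sample mean comes out near σ²/n = 0.02, while its mean stays near μ = 1.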

Applying Chebyshev’s inequality to the random variable R = (X1 + X2 + … + Xn)/n, with r = ε, we have:

P(|(X1 + X2 + … + Xn)/n – μ| > ε) <= σ²/(nε²)

Since σ²/(nε²) -> 0 as n -> ∞, the law is proved.
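The convergence can be seen in simulation: the empirical probability that the sample mean deviates from μ by more than ε shrinks as n grows, and stays under the Chebyshev bound σ²/(nε²). A sketch, again with Exp(1); the choices of ε, n, and trial count are illustrative assumptions.

```python
import random

random.seed(2)

mu, sigma2, eps = 1.0, 1.0, 0.2   # Exp(1): mu = 1, sigma^2 = 1
trials = 5_000

def deviation_prob(n):
    """Empirical P(|(X1 + ... + Xn)/n - mu| > eps)."""
    hits = 0
    for _ in range(trials):
        mean = sum(random.expovariate(1.0) for _ in range(n)) / n
        if abs(mean - mu) > eps:
            hits += 1
    return hits / trials

probs = {n: deviation_prob(n) for n in (10, 50, 250)}
for n, p in probs.items():
    # Empirical deviation probability vs the Chebyshev bound sigma^2/(n*eps^2)
    print(n, p, sigma2 / (n * eps**2))
```

For n = 10 the bound exceeds 1 and is trivially satisfied; by n = 250 both the bound (0.1) and the empirical probability are small, which is exactly the statement of the weak law.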


Published in Statistics
