
Weak Law of Large Numbers

Also known as Bernoulli's Theorem. Let $x_1$, ..., $x_n$ be a sequence of independent and identically distributed random variables, each having Mean $\left\langle{x_i}\right\rangle{} = \mu$ and finite Standard Deviation $\sigma$. Define a new variable

\begin{displaymath}
x \equiv {x_1+\ldots +x_n\over n}.
\end{displaymath} (1)

Then the expected value of the sample mean $x$ equals the population Mean $\mu$ of each variable for every $n$:
\begin{displaymath}
\left\langle{x}\right\rangle{} = \left\langle{{x_1+\ldots+x_n\over n}}\right\rangle{} = {1\over n}\left(\left\langle{x_1}\right\rangle{}+\ldots+\left\langle{x_n}\right\rangle{}\right) = {n\mu\over n} = \mu.
\end{displaymath} (2)


Because the $x_i$ are independent, the Variance of $x$ splits across the terms:
\begin{eqnarray*}
\mathop{\rm var}\nolimits(x) &=& \mathop{\rm var}\nolimits\left({x_1+\ldots+x_n\over n}\right) \\
&=& \mathop{\rm var}\nolimits\left({x_1\over n}\right)+\ldots+\mathop{\rm var}\nolimits\left({x_n\over n}\right) \\
&=& {\sigma^2\over n^2}+\ldots+{\sigma^2\over n^2} = {\sigma^2\over n}.
\end{eqnarray*} (3)
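The shrinking variance of the sample mean can be checked empirically. The following is a minimal simulation sketch, not part of the original article; the choice of uniform$(0,1)$ draws (for which $\sigma^2 = 1/12$), the sample size, and the seed are arbitrary.

```python
import random

# Empirically check var(x) = sigma^2 / n for the sample mean of n
# i.i.d. uniform(0, 1) draws, where sigma^2 = 1/12.
random.seed(0)

n = 50          # sample size
trials = 20000  # number of independent sample means

means = []
for _ in range(trials):
    draws = [random.random() for _ in range(n)]
    means.append(sum(draws) / n)

grand_mean = sum(means) / trials
emp_var = sum((m - grand_mean) ** 2 for m in means) / trials

print(emp_var)          # empirical variance of the sample mean
print((1.0 / 12.0) / n)  # theoretical sigma^2 / n = 1/600
```

The empirical variance of the 20000 simulated sample means should land close to $\sigma^2/n = 1/600$.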

Therefore, by the Chebyshev Inequality, for all $\epsilon > 0$,
\begin{displaymath}
P(\vert x-\mu \vert \geq \epsilon) \leq {\mathop{\rm var}\nolimits (x)\over \epsilon^2}= {\sigma^2\over n\epsilon^2}.
\end{displaymath} (4)
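The Chebyshev bound in (4) can likewise be compared against simulated tail frequencies. This is a hedged sketch under the same arbitrary assumptions (uniform$(0,1)$ draws, so $\mu = 1/2$ and $\sigma^2 = 1/12$; $n$, $\epsilon$, and the seed are illustrative choices).

```python
import random

# Compare the empirical tail probability P(|x - mu| >= eps) with the
# Chebyshev bound sigma^2 / (n * eps^2) for uniform(0, 1) draws.
random.seed(1)

n, trials, eps = 30, 10000, 0.1
mu, var = 0.5, 1.0 / 12.0

hits = 0
for _ in range(trials):
    mean = sum(random.random() for _ in range(n)) / n
    if abs(mean - mu) >= eps:
        hits += 1

empirical = hits / trials
bound = var / (n * eps * eps)  # Chebyshev upper bound
print(empirical, bound)
```

The Chebyshev inequality only guarantees an upper bound; the simulated frequency is typically well below it.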

As $n\to\infty$, it then follows that
\begin{displaymath}
\lim_{n\to \infty}P(\vert x-\mu \vert \geq \epsilon ) = 0
\end{displaymath} (5)

for $\epsilon$ arbitrarily small; i.e., the sample Mean converges in probability to the population Mean as $n\to\infty$.
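This convergence is easy to observe numerically. The following is a small illustrative sketch, not part of the original article; the uniform$(0,1)$ distribution (for which $\mu = 1/2$) and the seed are arbitrary choices.

```python
import random

# As n grows, the sample mean of n uniform(0, 1) draws (mu = 1/2)
# drifts toward mu, illustrating the weak law of large numbers.
random.seed(3)
mu = 0.5

devs = []
for n in (10, 1000, 100000):
    mean = sum(random.random() for _ in range(n)) / n
    devs.append(abs(mean - mu))
    print(n, devs[-1])  # deviation |mean - mu| shrinks as n grows
```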


Stated another way, if an event occurs $x$ times in $s$ Trials and if $P$ is the probability of success in a single Trial, then the probability that $\vert x/s-P\vert<\epsilon$, for $\epsilon$ an arbitrary Positive quantity, approaches 1 as $s\to\infty$.
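The Bernoulli-trials form of the statement can be sketched as follows. This is a minimal simulation, not part of the original article; the success probability $P = 0.3$, the trial counts, and the seed are arbitrary.

```python
import random

# Success frequency x/s concentrates around the per-trial success
# probability P as the number of trials s grows.
random.seed(2)
P = 0.3

for s in (100, 10000, 1000000):
    x = sum(1 for _ in range(s) if random.random() < P)
    print(s, abs(x / s - P))  # deviation of frequency from P
```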

See also Law of Truly Large Numbers, Strong Law of Large Numbers




© 1996-9 Eric W. Weisstein
1999-05-26