Let $x_1, x_2, \ldots, x_N$ be a set of $N$ Independent random variates, and let each $x_i$ have an arbitrary probability distribution $P(x_i)$ with Mean $\mu_i$ and a finite Variance ${\sigma_i}^2$. Then the normal form variate
\begin{displaymath}
X_{\rm norm} \equiv {\sum_{i=1}^N x_i - \sum_{i=1}^N \mu_i\over\sqrt{\sum_{i=1}^N {\sigma_i}^2}}
\end{displaymath}
(1)
has a limiting distribution which is Normal (Gaussian) with Mean $\mu = 0$ and Variance $\sigma^2 = 1$. If conversion to normal form is not performed, then the variate
\begin{displaymath}
X \equiv {1\over N}\sum_{i=1}^N x_i
\end{displaymath}
(2)
is Normally Distributed with $\mu_X = \mu$ and $\sigma_X = \sigma/\sqrt{N}$ (taking the $x_i$ here to be identically distributed with common Mean $\mu$ and Variance $\sigma^2$). To prove this, consider the Inverse Fourier Transform of $P_X(f)$,
\begin{displaymath}
F^{-1}[P_X(f)] = \int_{-\infty}^\infty e^{2\pi ifx}\,P(x)\,dx = \left\langle e^{2\pi ifx}\right\rangle = \sum_{n=0}^\infty {(2\pi if)^n\langle x^n\rangle\over n!}.
\end{displaymath}
(3)
Now write
\begin{eqnarray*}
\left\langle e^{2\pi ifx}\right\rangle &=& \left\langle e^{2\pi if(x_1+x_2+\cdots+x_N)/N}\right\rangle\\
&=& \left\langle e^{2\pi ifx_1/N}e^{2\pi ifx_2/N}\cdots e^{2\pi ifx_N/N}\right\rangle\\
&=& \left\langle e^{2\pi ifx_1/N}\right\rangle\left\langle e^{2\pi ifx_2/N}\right\rangle\cdots\left\langle e^{2\pi ifx_N/N}\right\rangle\\
&=& \left[\sum_{n=0}^\infty {(2\pi if/N)^n\langle x^n\rangle\over n!}\right]^N\\
&=& \left[1+{2\pi if\langle x\rangle\over N}+{(2\pi if)^2\langle x^2\rangle\over 2N^2}+\ldots\right]^N,
\end{eqnarray*}
(4)
where the third line uses the Independence of the $x_i$ and the fourth their identical distribution,
so we have
\begin{displaymath}
\left\langle e^{2\pi ifx}\right\rangle = \exp\left\{N\ln\left[1+{2\pi if\langle x\rangle\over N}+{(2\pi if)^2\langle x^2\rangle\over 2N^2}+\ldots\right]\right\}.
\end{displaymath}
(5)
Now expand
\begin{displaymath}
\ln(1+x)=x-{\textstyle{1\over 2}}x^2+{\textstyle{1\over 3}}x^3-\ldots,
\end{displaymath}
(6)
so
\begin{eqnarray*}
\left\langle e^{2\pi ifx}\right\rangle &=& \exp\left\{N\left[{2\pi if\langle x\rangle\over N}+{(2\pi if)^2\langle x^2\rangle\over 2N^2}-{1\over 2}\left({2\pi if\langle x\rangle\over N}\right)^2+\ldots\right]\right\}\\
&=& \exp\left[2\pi if\langle x\rangle+{(2\pi if)^2\over 2N}\left(\langle x^2\rangle-\langle x\rangle^2\right)+{\mathcal O}(N^{-2})\right]\\
&=& \exp\left[2\pi if\mu-{(2\pi f)^2\sigma^2\over 2N}+{\mathcal O}(N^{-2})\right],
\end{eqnarray*}
(7)
since
\begin{displaymath}
\mu\equiv\langle x\rangle
\end{displaymath}
(8)
and
\begin{displaymath}
\sigma^2\equiv\langle x^2\rangle-\langle x\rangle^2.
\end{displaymath}
(9)
Taking the Fourier Transform,
\begin{displaymath}
P_X = \int_{-\infty}^\infty e^{-2\pi ifx}\left\langle e^{2\pi ifx}\right\rangle\,df = \int_{-\infty}^\infty e^{2\pi if(\mu-x)-(2\pi f)^2\sigma^2/2N}\,df.
\end{displaymath}
(10)
This is of the form
\begin{displaymath}
\int_{-\infty}^\infty e^{iaf-bf^2}\,df,
\end{displaymath}
(11)
where $a = 2\pi(\mu-x)$ and $b = 2\pi^2\sigma^2/N$. But, from Abramowitz and Stegun (1972, p. 302, equation 7.4.6),
\begin{displaymath}
\int_{-\infty}^\infty e^{iaf-bf^2}\,df = e^{-a^2/4b}\sqrt{\pi\over b}.
\end{displaymath}
(12)
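This identity is a standard Gaussian integral and can be checked by completing the square in the exponent (a verification sketched here, not part of the quoted reference):
\begin{displaymath}
iaf-bf^2 = -b\left(f-{ia\over 2b}\right)^2-{a^2\over 4b},
\end{displaymath}
so that
\begin{displaymath}
\int_{-\infty}^\infty e^{iaf-bf^2}\,df = e^{-a^2/4b}\int_{-\infty}^\infty e^{-bu^2}\,du = e^{-a^2/4b}\sqrt{\pi\over b},
\end{displaymath}
where the substitution $u = f - ia/2b$ (a contour shift, justified for $b>0$) reduces the expression to the Gaussian integral $\int_{-\infty}^\infty e^{-bu^2}\,du = \sqrt{\pi/b}$.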
Therefore,
\begin{eqnarray*}
P_X &=& e^{-[2\pi(\mu-x)]^2/(8\pi^2\sigma^2/N)}\sqrt{\pi N\over 2\pi^2\sigma^2}\\
&=& \sqrt{N\over 2\pi\sigma^2}\,e^{-N(\mu-x)^2/2\sigma^2}.
\end{eqnarray*}
(13)
But $\sigma_X = \sigma/\sqrt{N}$ and $\mu_X = \mu$, so
\begin{displaymath}
P_X = {1\over\sigma_X\sqrt{2\pi}}\, e^{-(\mu_X-x)^2/2{\sigma_X}^2}.
\end{displaymath}
(14)
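The result in (14) can be checked numerically by drawing many sample means and comparing their empirical mean, spread, and one-sigma mass with the Normal predictions. The sketch below makes arbitrary choices not taken from the article (exponential variates, $N = 50$, 100000 trials):

```python
# Numerical sanity check of Eq. (14): the mean of N iid variates is
# approximately Normal with mu_X = mu and sigma_X = sigma/sqrt(N).
# Distribution, N, and trial count are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

N = 50            # variates averaged per trial
trials = 100_000  # number of sample means drawn

# Exponential with scale 2 has mu = 2 and sigma = 2.
mu, sigma = 2.0, 2.0
means = rng.exponential(scale=2.0, size=(trials, N)).mean(axis=1)

mu_X = mu
sigma_X = sigma / np.sqrt(N)

print(f"sample mean of X: {means.mean():.4f}  (predicted {mu_X})")
print(f"sample std  of X: {means.std():.4f}  (predicted {sigma_X:.4f})")

# A Normal distribution puts ~68.27% of its mass within one sigma of the mean.
frac = np.mean(np.abs(means - mu_X) < sigma_X)
print(f"fraction within one sigma_X: {frac:.3f}  (Normal predicts 0.683)")
```

The empirical standard deviation tracks $\sigma/\sqrt{N}$, and the one-sigma fraction lands near the Normal value even though the underlying exponential distribution is strongly skewed.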
The ``fuzzy'' central limit theorem says that data which are influenced by many small and unrelated random effects are
approximately Normally Distributed.
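The "fuzzy" statement can also be illustrated empirically: summing many small, independent, differently distributed effects yields a nearly symmetric, nearly mesokurtic total. The component distributions below are arbitrary choices for illustration, not from the article:

```python
# Illustration of the "fuzzy" central limit theorem: a quantity built from
# many small, unrelated random effects is approximately Normally distributed.
# The mix of uniform, exponential, and two-point effects is an arbitrary choice.
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000

# Thirty small independent effects of three unrelated types, each
# centered so that it contributes mean zero.
effects = []
for _ in range(10):
    effects.append(rng.uniform(-1.0, 1.0, trials))        # flat effect
    effects.append(rng.exponential(0.3, trials) - 0.3)    # skewed effect
    effects.append(rng.choice([-0.5, 0.5], trials))       # two-point effect
total = np.sum(effects, axis=0)

# Standardize and compare shape statistics with the Normal values (both 0).
z = (total - total.mean()) / total.std()
skew = np.mean(z**3)
excess_kurt = np.mean(z**4) - 3.0
print(f"skewness        ~ {skew:.3f}  (Normal: 0)")
print(f"excess kurtosis ~ {excess_kurt:.3f}  (Normal: 0)")
```

Even though one component type is skewed and another is discrete, both shape statistics of the sum come out close to zero, as the fuzzy central limit theorem suggests.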
See also Lindeberg Condition, Lindeberg-Feller Central Limit Theorem, Lyapunov Condition.
References
Abramowitz, M. and Stegun, C. A. (Eds.).
Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 9th printing.
New York: Dover, 1972.
Spiegel, M. R. Theory and Problems of Probability and Statistics.
New York: McGraw-Hill, pp. 112-113, 1992.
Zabell, S. L. ``Alan Turing and the Central Limit Theorem.'' Amer. Math. Monthly 102, 483-494, 1995.
© 1996-9 Eric W. Weisstein
1999-05-26