
Exponential Distribution

\begin{figure}\begin{center}\BoxedEPSF{ExponentialDistribution.epsf scaled 650}\end{center}\end{figure}

Given a Poisson Distribution with rate of change $\lambda$, the distribution of waiting times between successive changes (with $k = 0$) is

\begin{displaymath}
D(x) \equiv P(X \leq x) = 1-P(X > x) = 1 - {(\lambda x)^0e^{-\lambda x}\over 0!} = 1 - e^{-\lambda x}
\end{displaymath} (1)

\begin{displaymath}
P(x) = D'(x) = \lambda e^{-\lambda x},
\end{displaymath} (2)

which is normalized since
\begin{displaymath}
\int_0^\infty P(x)\,dx = \lambda \int_0^\infty e^{-\lambda x}\,dx = -\left[e^{-\lambda x}\right]_0^\infty = -(0-1) = 1.
\end{displaymath} (3)
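As a minimal numerical sanity check of Eqs. (1)-(3) (plain Python; the rate $\lambda = 2$ here is an arbitrary choice), the density should be the derivative of the distribution function and should integrate to unity:

```python
import math

lam = 2.0  # arbitrary rate parameter lambda

def cdf(x):
    """D(x) = 1 - exp(-lambda*x), the waiting-time distribution function."""
    return 1.0 - math.exp(-lam * x)

def pdf(x):
    """P(x) = lambda * exp(-lambda*x), the waiting-time density."""
    return lam * math.exp(-lam * x)

# The density should match the central-difference derivative of the CDF ...
h = 1e-6
for x in [0.1, 0.5, 1.0, 3.0]:
    numeric = (cdf(x + h) - cdf(x - h)) / (2 * h)
    assert abs(numeric - pdf(x)) < 1e-5

# ... and should integrate to 1 (trapezoidal rule on a truncated range).
n, upper = 100000, 20.0
dx = upper / n
total = sum(0.5 * (pdf(i * dx) + pdf((i + 1) * dx)) * dx for i in range(n))
assert abs(total - 1.0) < 1e-6
```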

This is the only continuous Memoryless Random Distribution. Define the Mean waiting time between successive changes as $\theta\equiv\lambda^{-1}$. Then
\begin{displaymath}
P(x) = \cases{
{1\over \theta} e^{-x/\theta} & $x \geq 0$\cr
0 & $x < 0$.\cr}
\end{displaymath} (4)
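The memoryless property noted above can be checked directly from the survival function $P(X > x) = e^{-x/\theta}$: the waiting time beyond $s$, given that no change has occurred by $s$, has the same distribution as the original waiting time. A short check (plain Python; $\theta = 1.5$ is an arbitrary choice):

```python
import math

theta = 1.5  # arbitrary mean waiting time

def survival(x):
    """P(X > x) = exp(-x/theta) for the exponential distribution."""
    return math.exp(-x / theta)

# Memorylessness: P(X > s + t | X > s) = P(X > t) for all s, t >= 0.
for s in [0.0, 0.5, 2.0]:
    for t in [0.1, 1.0, 4.0]:
        conditional = survival(s + t) / survival(s)
        assert abs(conditional - survival(t)) < 1e-12
```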

The Moment-Generating Function is
\begin{displaymath}
M(t) = \int_0^\infty e^{tx}\left({1\over \theta}\right)e^{-x/\theta}\, dx
= {1\over \theta }\int_0^\infty e^{-(1-\theta t)x/\theta}\, dx
= \left[-{e^{-(1-\theta t)x/\theta} \over 1-\theta t}\right]_0^\infty = {1\over 1-\theta t}
\end{displaymath} (5)

for $t < 1/\theta$, and

\begin{displaymath}
M'(t) = {\theta \over (1-\theta t)^2}
\end{displaymath} (6)

\begin{displaymath}
M''(t) = {2\theta ^2\over (1-\theta t)^3},
\end{displaymath} (7)

so
\begin{displaymath}
R(t) \equiv \ln M(t) = - \ln (1-\theta t)
\end{displaymath} (8)

\begin{displaymath}
R'(t) = {\theta \over 1-\theta t}
\end{displaymath} (9)

\begin{displaymath}
R''(t) = {\theta^2\over (1-\theta t)^2}
\end{displaymath} (10)

\begin{displaymath}
\mu = R'(0) = \theta
\end{displaymath} (11)

\begin{displaymath}
\sigma^2 = R''(0) = \theta^2.
\end{displaymath} (12)
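These formulas lend themselves to a quick numerical cross-check (plain Python; $\theta = 0.8$ and $t = 0.5$ are arbitrary values with $t < 1/\theta$): the integral defining $M(t)$ is approximated by the trapezoidal rule, and finite differences of $R(t)$ at $t = 0$ recover the Mean and Variance.

```python
import math

theta = 0.8   # arbitrary mean; t must satisfy t < 1/theta
t = 0.5

# Trapezoidal approximation of M(t) = int_0^inf e^{tx} (1/theta) e^{-x/theta} dx.
n, upper = 200000, 60.0
dx = upper / n
integral = 0.0
for i in range(n):
    x0, x1 = i * dx, (i + 1) * dx
    f0 = math.exp(t * x0) * math.exp(-x0 / theta) / theta
    f1 = math.exp(t * x1) * math.exp(-x1 / theta) / theta
    integral += 0.5 * (f0 + f1) * dx
assert abs(integral - 1.0 / (1.0 - theta * t)) < 1e-4

# Cumulant generating function R(t) = -ln(1 - theta*t):
# central differences at t = 0 give mu = theta and sigma^2 = theta^2.
R = lambda u: -math.log(1.0 - theta * u)
h = 1e-4
mu = (R(h) - R(-h)) / (2 * h)
var = (R(h) - 2 * R(0.0) + R(-h)) / h**2
assert abs(mu - theta) < 1e-6
assert abs(var - theta**2) < 1e-6
```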

The Skewness and Kurtosis are given by
\begin{displaymath}
\gamma_1 = 2
\end{displaymath} (13)

\begin{displaymath}
\gamma_2 = 6.
\end{displaymath} (14)
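These values follow from the raw moments of the exponential distribution, $\left\langle{x^n}\right\rangle = n!\,\theta^n$ (a standard fact, obtainable from repeated differentiation of $M(t)$). A short check (plain Python; $\theta = 2.3$ is arbitrary):

```python
import math

theta = 2.3  # arbitrary mean

# Raw moments of the exponential distribution: <x^n> = n! * theta^n.
m = [math.factorial(n) * theta**n for n in range(5)]

# Central moments mu_k = <(x - mu)^k> via the binomial expansion.
mu = m[1]
mu2 = m[2] - mu**2
mu3 = m[3] - 3 * mu * m[2] + 2 * mu**3
mu4 = m[4] - 4 * mu * m[3] + 6 * mu**2 * m[2] - 3 * mu**4

gamma1 = mu3 / mu2**1.5        # skewness
gamma2 = mu4 / mu2**2 - 3.0    # kurtosis excess
assert abs(gamma1 - 2.0) < 1e-9
assert abs(gamma2 - 6.0) < 1e-9
```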

The Mean and Variance can also be computed directly. Writing $s \equiv \theta$,
\begin{displaymath}
\left\langle{x}\right\rangle{} \equiv \int_0^\infty xP(x)\, dx = {1\over s} \int_0^\infty xe^{-x/s}\, dx.
\end{displaymath} (15)

Use the integral
\begin{displaymath}
\int xe^{ax}\,dx = {e^{ax}\over a^2} (ax-1)
\end{displaymath} (16)

to obtain
\begin{displaymath}
\left\langle{x}\right\rangle{} = {1\over s}\left[{e^{-x/s}\over\left({-{1\over s}}\right)^2} \left\{\left({-{1\over s}}\right)x-1\right\}\right]^\infty_0
= -s\left[e^{-x/s} \left(1+{x\over s}\right)\right]^\infty_0
= -s(0-1) = s.
\end{displaymath} (17)

Now, to find
\begin{displaymath}
\left\langle{x^2}\right\rangle{} = {1\over s}\int_0^\infty x^2e^{-x/s}\,dx,
\end{displaymath} (18)

use the integral
\begin{displaymath}
\int x^2e^{ax}\,dx = {e^{ax}\over a^3} (2-2ax+a^2x^2)
\end{displaymath} (19)


with $a = -1/s$ to obtain
\begin{displaymath}
\left\langle{x^2}\right\rangle{} = {1\over s}\left[{e^{-x/s}\over\left({-{1\over s}}\right)^3} \left(2+{2\over s} x+{1\over s^2} x^2\right)\right]_0^\infty
= -s^2(0-2) = 2s^2,
\end{displaymath} (20)

giving
\begin{displaymath}
\sigma^2 \equiv \left\langle{x^2}\right\rangle{}-{\left\langle{x}\right\rangle{}}^2 = 2s^2-s^2 = s^2
\end{displaymath} (21)

\begin{displaymath}
\sigma \equiv \sqrt{{\rm var}(x)} = s.
\end{displaymath} (22)
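The direct computation above can be confirmed numerically by approximating the moment integrals with the trapezoidal rule on a truncated range (plain Python; $s = 1.7$ is an arbitrary choice):

```python
import math

s = 1.7  # arbitrary mean waiting time

def moment(k, n=200000, upper=80.0):
    """Trapezoidal approximation of <x^k> = (1/s) int_0^inf x^k e^{-x/s} dx."""
    dx = upper / n
    total = 0.0
    for i in range(n):
        x0, x1 = i * dx, (i + 1) * dx
        f0 = x0**k * math.exp(-x0 / s) / s
        f1 = x1**k * math.exp(-x1 / s) / s
        total += 0.5 * (f0 + f1) * dx
    return total

assert abs(moment(1) - s) < 1e-3           # <x> = s
assert abs(moment(2) - 2 * s**2) < 1e-3    # <x^2> = 2 s^2
var = moment(2) - moment(1)**2
assert abs(var - s**2) < 1e-3              # sigma^2 = s^2
```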


If a generalized exponential probability function is defined by

\begin{displaymath}
P_{(\alpha,\beta)}(x)={1\over\beta}e^{-(x-\alpha)/\beta} \qquad {\rm for\ } x \geq \alpha,
\end{displaymath} (23)

then the Characteristic Function is
\begin{displaymath}
\phi(t)={e^{i\alpha t}\over 1-i\beta t},
\end{displaymath} (24)

and the Mean, Variance, Skewness, and Kurtosis are
\begin{displaymath}
\mu = \alpha+\beta
\end{displaymath} (25)

\begin{displaymath}
\sigma^2 = \beta^2
\end{displaymath} (26)

\begin{displaymath}
\gamma_1 = 2
\end{displaymath} (27)

\begin{displaymath}
\gamma_2 = 6.
\end{displaymath} (28)
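The generalized distribution can be sampled by inverting its distribution function: if $u$ is uniform on $(0,1)$, then $x = \alpha - \beta\ln(1-u)$ has density $P_{(\alpha,\beta)}$. A seeded simulation (plain Python; $\alpha = 1$, $\beta = 2$ are arbitrary choices) reproduces the stated Mean and Variance to sampling accuracy:

```python
import math
import random

alpha, beta = 1.0, 2.0   # arbitrary location and scale parameters
random.seed(12345)       # fixed seed for reproducibility

# Inverse-CDF sampling: x = alpha - beta*ln(1-u) has density
# (1/beta) * exp(-(x-alpha)/beta) on x >= alpha.
n = 200000
samples = [alpha - beta * math.log(1.0 - random.random()) for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
assert abs(mean - (alpha + beta)) < 0.05   # mu = alpha + beta
assert abs(var - beta**2) < 0.15           # sigma^2 = beta^2
```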

See also Double Exponential Distribution






© 1996-9 Eric W. Weisstein
1999-05-25