
Gamma Distribution

\begin{figure}\begin{center}\BoxedEPSF{GammaDistribution.epsf scaled 650}\end{center}\end{figure}

A general type of statistical Distribution which is related to the Beta Distribution and arises naturally in processes for which the waiting times between Poisson Distributed events are relevant. Gamma distributions have two free parameters, labeled $\alpha$ and $\theta$; a few combinations of these are illustrated above.


Given a Poisson Distribution with rate $\lambda$, the Distribution Function $D(x)$ giving the waiting time until the $h$th change is

$\displaystyle D(x)$ $\textstyle =$ $\displaystyle P(X \leq x) = 1-P(X > x)$  
  $\textstyle =$ $\displaystyle 1 - \sum_{k=0}^{h-1} {(\lambda x)^ke^{-\lambda x}\over k!}$  
  $\textstyle =$ $\displaystyle 1 - e^{-\lambda x} \sum_{k=0}^{h-1} { (\lambda x)^k\over k!}$ (1)

for $x\geq 0$. The probability function $P(x)$ is then obtained by differentiating $D(x)$,
$\displaystyle P(x)$ $\textstyle =$ $\displaystyle D'(x)$  
  $\textstyle =$ $\displaystyle \lambda e^{-\lambda x} \sum_{k=0}^{h-1} { (\lambda x)^k\over k!}
- e^{-\lambda x} \sum_{k=0}^{h-1} { k(\lambda x)^{k-1}\lambda\over k!}$  
  $\textstyle =$ $\displaystyle \lambda e^{-\lambda x} + \lambda e^{-\lambda x} \sum_{k=1}^{h-1} { (\lambda x)^k\over k!}
- e^{-\lambda x} \sum_{k=1}^{h-1} { k(\lambda x)^{k-1}\lambda \over k!}$  
  $\textstyle =$ $\displaystyle \lambda e^{-\lambda x} - \lambda e^{-\lambda x} \sum_{k=1}^{h-1}
\left[{{ k(\lambda x)^{k-1}\over k!} - { (\lambda x)^k\over k!}}\right]$  
  $\textstyle =$ $\displaystyle \lambda e^{-\lambda x}\left\{{ 1 - \sum_{k=1}^{h-1} \left[{{ (\lambda x)^{k-1}\over (k-1)!}
- { (\lambda x)^k\over k!}}\right]}\right\}$  
  $\textstyle =$ $\displaystyle \lambda e^{-\lambda x}\left\{{ 1 - \left[{1 - { (\lambda x)^{h-1}\over (h-1)!}}\right]}\right\}
= { \lambda (\lambda x)^{h-1}\over (h-1)!} e^{-\lambda x},$  
      (2)

where the sum in the penultimate line telescopes.

Now let $\alpha \equiv h$ and define $\theta \equiv 1/\lambda$ to be the mean time between changes. Then the above equation can be written
\begin{displaymath}
P(x) = \cases{
{x^{\alpha-1}e^{-x/\theta}\over \Gamma (\alpha)\theta^\alpha} & $0 \leq x < \infty$\cr
0 & $x < 0$.\cr}
\end{displaymath} (3)
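This waiting-time characterization can be checked numerically. The sketch below (the rate $\lambda = 2$ and $h = 3$ are arbitrary illustrative choices) sums $h$ exponential inter-arrival times and compares the sample mean and variance against the gamma predictions $\alpha\theta$ and $\alpha\theta^2$.

```python
import random

random.seed(0)

# Illustrative (assumed) parameters: rate lambda = 2, waiting for the h = 3rd change.
lam, h, n = 2.0, 3, 200_000
theta, alpha = 1 / lam, h

# The waiting time to the h-th change is a sum of h exponential inter-arrival times.
waits = [sum(random.expovariate(lam) for _ in range(h)) for _ in range(n)]

mean = sum(waits) / n
var = sum((w - mean) ** 2 for w in waits) / n

# The gamma distribution predicts mean = alpha*theta = 1.5 and
# variance = alpha*theta^2 = 0.75.
```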

The Characteristic Function describing this distribution is
\begin{displaymath}
\phi(t)=(1-i\theta t)^{-\alpha},
\end{displaymath} (4)

and the Moment-Generating Function is
$\displaystyle M(t)$ $\textstyle =$ $\displaystyle \int_0^\infty {e^{tx}x^{\alpha-1}e^{-x/\theta}\,dx\over \Gamma (\alpha)\theta^\alpha}$  
  $\textstyle =$ $\displaystyle \int_0^\infty {x^{\alpha-1}e^{-(1-\theta t)x/\theta}\,dx\over \Gamma (\alpha)\theta^\alpha}.$ (5)

In order to find the Moments of the distribution, let
$\displaystyle y$ $\textstyle \equiv$ $\displaystyle {(1-\theta t)x\over\theta}$ (6)
$\displaystyle dy$ $\textstyle =$ $\displaystyle {1-\theta t\over\theta}\,dx,$ (7)

so
$\displaystyle M(t)$ $\textstyle =$ $\displaystyle \int_0^\infty\left({ \theta y\over 1-\theta t}\right)^{\alpha-1}
{e^{-y}\over \Gamma (\alpha)\theta^\alpha} { \theta\,dy\over 1-\theta t}$  
  $\textstyle =$ $\displaystyle {1\over(1-\theta t)^\alpha \Gamma(\alpha)} \int^\infty_0 y^{\alpha-1}e^{-y}\,dy$  
  $\textstyle =$ $\displaystyle {1\over(1-\theta t)^\alpha},$ (8)

and the logarithmic Moment-Generating function is
$\displaystyle R(t)$ $\textstyle \equiv$ $\displaystyle \ln M(t) = -\alpha \ln(1-\theta t)$ (9)
$\displaystyle R'(t)$ $\textstyle =$ $\displaystyle { \alpha\theta\over 1-\theta t}$ (10)
$\displaystyle R''(t)$ $\textstyle =$ $\displaystyle { \alpha\theta^2\over (1-\theta t)^2}.$ (11)

The Mean, Variance, Skewness, and Kurtosis are then
$\displaystyle \mu$ $\textstyle =$ $\displaystyle R'(0) = \alpha\theta$ (12)
$\displaystyle \sigma^2$ $\textstyle =$ $\displaystyle R''(0) = \alpha\theta^2$ (13)
$\displaystyle \gamma_1$ $\textstyle =$ $\displaystyle {2\over\sqrt{\alpha}}$ (14)
$\displaystyle \gamma_2$ $\textstyle =$ $\displaystyle {6\over\alpha}.$ (15)
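These four formulas can be verified by simulation; the following minimal sketch (with illustrative choices $\alpha = 4$, $\theta = 1.5$) uses Python's standard-library `random.gammavariate`, whose second argument is the scale $\theta$.

```python
import random

random.seed(1)

# Illustrative (assumed) parameters.
alpha, theta, n = 4.0, 1.5, 400_000

# random.gammavariate takes the shape alpha and the scale theta.
xs = [random.gammavariate(alpha, theta) for _ in range(n)]

m = sum(xs) / n                            # predicted alpha*theta = 6
c2 = sum((x - m) ** 2 for x in xs) / n     # predicted alpha*theta^2 = 9
c3 = sum((x - m) ** 3 for x in xs) / n
c4 = sum((x - m) ** 4 for x in xs) / n

g1 = c3 / c2 ** 1.5       # sample skewness, predicted 2/sqrt(alpha) = 1
g2 = c4 / c2 ** 2 - 3     # sample excess kurtosis, predicted 6/alpha = 1.5
```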


The gamma distribution is closely related to other statistical distributions. If $X_1$, $X_2$, ..., $X_n$ are independent random variates with a gamma distribution having parameters $(\alpha_1,
\theta)$, $(\alpha_2, \theta)$, ..., $(\alpha_n, \theta)$, then $\sum_{i=1}^n X_i$ is distributed as gamma with parameters

$\displaystyle \alpha$ $\textstyle =$ $\displaystyle \sum_{i=1}^n\alpha_i$ (16)
$\displaystyle \theta$ $\textstyle =$ $\displaystyle \theta.$ (17)
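A quick Monte Carlo sketch of this additivity property (the shapes $1$, $2.5$, $0.5$ and common scale $\theta = 2$ are arbitrary choices): the sum should behave as a gamma variate with $\alpha = 4$ and $\theta = 2$, i.e. mean $8$ and variance $16$.

```python
import random

random.seed(2)

# Illustrative (assumed) shapes sharing a common scale theta = 2.
theta, alphas, n = 2.0, [1.0, 2.5, 0.5], 200_000

sums = [sum(random.gammavariate(a, theta) for a in alphas) for _ in range(n)]

mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
# Gamma(alpha = 1 + 2.5 + 0.5 = 4, theta = 2) predicts mean 8 and variance 16.
```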

Also, if $X_1$ and $X_2$ are independent random variates with a gamma distribution having parameters $(\alpha_1,
\theta)$ and $(\alpha_2, \theta)$, then $X_1/(X_1+X_2)$ is a Beta Distribution variate with parameters $(\alpha_1,
\alpha_2)$. Both can be derived as follows from the joint density of $X_1$ and $X_2$ (in standard form, with $\theta=1$),
\begin{displaymath}
P(x_1,x_2)={1\over\Gamma(\alpha_1)\Gamma(\alpha_2)} e^{-(x_1+x_2)}{x_1}^{\alpha_1-1}{x_2}^{\alpha_2-1}.
\end{displaymath} (18)

Let
\begin{displaymath}
u=x_1+x_2 \qquad x_1=uv
\end{displaymath} (19)


\begin{displaymath}
v={x_1\over x_1+x_2} \qquad x_2=u(1-v),
\end{displaymath} (20)

then the Jacobian is
\begin{displaymath}
J\left({x_1,x_2\over u,v}\right)= \left\vert\matrix{v & u\cr 1-v & -u\cr}\right\vert = -u,
\end{displaymath} (21)

so
\begin{displaymath}
g(u,v)\,du\,dv=f(x_1,x_2)\,dx_1\,dx_2=f(x_1,x_2)\,u\,du\,dv.
\end{displaymath} (22)


$\displaystyle g(u,v)$ $\textstyle =$ $\displaystyle {u\over\Gamma(\alpha_1)\Gamma(\alpha_2)}e^{-u}(uv)^{\alpha_1-1}u^{\alpha_2-1}(1-v)^{\alpha_2-1}$  
  $\textstyle =$ $\displaystyle {1\over\Gamma(\alpha_1)\Gamma(\alpha_2)} e^{-u}u^{\alpha_1+\alpha_2-1}v^{\alpha_1-1}(1-v)^{\alpha_2-1}.$ (23)

The sum $X_1+X_2$ therefore has the distribution
\begin{displaymath}
f(u)=f(x_1+x_2)=\int_0^1 g(u,v)\,dv={e^{-u}u^{\alpha_1+\alpha_2-1}\over \Gamma(\alpha_1+\alpha_2)},
\end{displaymath} (24)

which is a gamma distribution, and the ratio $X_1/(X_1+X_2)$ has the distribution
$\displaystyle h(v)$ $\textstyle =$ $\displaystyle h\left({x_1\over x_1+x_2}\right)=\int_0^\infty g(u,v)\,du$  
  $\textstyle =$ $\displaystyle {v^{\alpha_1-1}(1-v)^{\alpha_2-1}\over B(\alpha_1, \alpha_2)},$ (25)

where $B$ is the Beta Function, which is a Beta Distribution.
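This Beta Distribution relation can be sanity-checked numerically (the parameters $\alpha_1 = 2$, $\alpha_2 = 3$, $\theta = 1.7$ are arbitrary; the common scale cancels in the ratio), using the standard Beta mean $\alpha_1/(\alpha_1+\alpha_2)$ and variance $\alpha_1\alpha_2/[(\alpha_1+\alpha_2)^2(\alpha_1+\alpha_2+1)]$.

```python
import random

random.seed(3)

# Illustrative (assumed) parameters; the common scale theta cancels in the ratio.
a1, a2, theta, n = 2.0, 3.0, 1.7, 200_000

vs = []
for _ in range(n):
    x1 = random.gammavariate(a1, theta)
    x2 = random.gammavariate(a2, theta)
    vs.append(x1 / (x1 + x2))

mean = sum(vs) / n
var = sum((v - mean) ** 2 for v in vs) / n
# Beta(2, 3) predicts mean a1/(a1+a2) = 0.4 and
# variance a1*a2/((a1+a2)^2 (a1+a2+1)) = 0.04.
```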


If $X$ and $Y$ are gamma variates with parameters $\alpha_1$ and $\alpha_2$, then $X/Y$ is a variate with a Beta Prime Distribution with parameters $\alpha_1$ and $\alpha_2$. Let

\begin{displaymath}
u=x+y\qquad v={x\over y},
\end{displaymath} (26)

then the Jacobian is
\begin{displaymath}
J\left({u,v\over x,y}\right)= \left\vert\matrix{1 & 1\cr {1\over y} & -{x\over y^2}\cr}\right\vert = -{x+y\over y^2} =-{(1+v)^2\over u},
\end{displaymath} (27)

so
\begin{displaymath}
dx\,dy={u\over(1+v)^2}\,du\,dv
\end{displaymath} (28)


$\displaystyle g(u,v)$ $\textstyle =$ $\displaystyle {1\over\Gamma(\alpha_1)\Gamma(\alpha_2)}e^{-u}\left({uv\over 1+v}\right)^{\alpha_1-1}\left({u\over 1+v}\right)^{\alpha_2-1} {u\over(1+v)^2}$  
  $\textstyle =$ $\displaystyle {1\over\Gamma(\alpha_1)\Gamma(\alpha_2)} e^{-u}u^{\alpha_1+\alpha_2-1}v^{\alpha_1-1}(1+v)^{-\alpha_1-\alpha_2}.$ (29)

The ratio $X/Y$ therefore has the distribution
\begin{displaymath}
h(v)=\int_0^\infty g(u,v)\,du={v^{\alpha_1-1}(1+v)^{-\alpha_1-\alpha_2}\over B(\alpha_1,\alpha_2)},
\end{displaymath} (30)

which is a Beta Prime Distribution with parameters $(\alpha_1,
\alpha_2)$.
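A simulation sketch of the Beta Prime relation (the shapes $\alpha_1 = 2$, $\alpha_2 = 4$ are illustrative, with standard scale $\theta = 1$), using the known Beta Prime mean $\alpha_1/(\alpha_2-1)$, which is finite for $\alpha_2 > 1$:

```python
import random

random.seed(4)

# Illustrative (assumed) shapes; standard scale theta = 1.
a1, a2, n = 2.0, 4.0, 200_000

ratios = [random.gammavariate(a1, 1.0) / random.gammavariate(a2, 1.0)
          for _ in range(n)]

mean = sum(ratios) / n
# The Beta Prime mean is a1/(a2 - 1) = 2/3.
```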


The ``standard form'' of the gamma distribution is given by letting $y\equiv x/\theta$, so $dy=dx/\theta$ and

$\displaystyle P(y)\,dy$ $\textstyle =$ $\displaystyle {x^{\alpha-1}e^{-x/\theta}\over \Gamma(\alpha)\theta^\alpha}\,dx
= {(\theta y)^{\alpha-1}e^{-y}\over \Gamma(\alpha)\theta^\alpha} \,(\theta\,dy)$  
  $\textstyle =$ $\displaystyle {y^{\alpha-1}e^{-y}\over\Gamma(\alpha)}\,dy,$ (31)

so the Moments about 0 are
\begin{displaymath}
\nu_r = {1\over \Gamma(\alpha)} \int_0^\infty e^{-x}x^{\alpha+r-1}\,dx = {\Gamma(\alpha+r)\over \Gamma(\alpha)}
= (\alpha)_r,
\end{displaymath} (32)

where $(\alpha)_r$ is the Pochhammer Symbol. The Mean $\mu_1$ and the Moments $\mu_2$, $\mu_3$, $\mu_4$ about the Mean are then
$\displaystyle \mu_1$ $\textstyle =$ $\displaystyle \alpha$ (33)
$\displaystyle \mu_2$ $\textstyle =$ $\displaystyle \alpha$ (34)
$\displaystyle \mu_3$ $\textstyle =$ $\displaystyle 2\alpha$ (35)
$\displaystyle \mu_4$ $\textstyle =$ $\displaystyle 3\alpha^2+6\alpha.$ (36)
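The raw-moment formula $\nu_r=(\alpha)_r$ above can be sanity-checked by simulation in the standard form (the shape $\alpha = 2$ is an illustrative choice, giving $(2)_1 = 2$, $(2)_2 = 6$, $(2)_3 = 24$):

```python
import math
import random

random.seed(5)

# Illustrative (assumed) shape, standard form (theta = 1).
alpha, n = 2.0, 300_000
xs = [random.gammavariate(alpha, 1.0) for _ in range(n)]

# Sample raw moments E[X^r] versus the Pochhammer symbol
# (alpha)_r = Gamma(alpha + r) / Gamma(alpha).
raw = {r: sum(x ** r for x in xs) / n for r in (1, 2, 3)}
poch = {r: math.gamma(alpha + r) / math.gamma(alpha) for r in (1, 2, 3)}
# Predictions: (2)_1 = 2, (2)_2 = 6, (2)_3 = 24.
```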

The Moment-Generating Function is
\begin{displaymath}
M(t)={1\over(1-t)^\alpha},
\end{displaymath} (37)

and the Cumulant-Generating Function is
\begin{displaymath}
K(t)=-\alpha\ln(1-t) = \alpha\left({t+{\textstyle{1\over 2}}t^2+{\textstyle{1\over 3}} t^3+\ldots}\right),
\end{displaymath} (38)

so the Cumulants are
\begin{displaymath}
\kappa_r = \alpha \Gamma(r).
\end{displaymath} (39)

If $x$ is a Normal variate with Mean $\mu$ and Standard Deviation $\sigma$, then
\begin{displaymath}
y\equiv {(x-\mu)^2\over 2\sigma^2}
\end{displaymath} (40)

is a standard gamma variate with parameter $\alpha=1/2$.
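This connection to the Normal distribution can also be checked by simulation (the values $\mu = 1$, $\sigma = 2$ are illustrative): a standard gamma variate with $\alpha = 1/2$ has mean $1/2$ and variance $1/2$.

```python
import random

random.seed(6)

# Illustrative (assumed) normal parameters.
mu, sigma, n = 1.0, 2.0, 200_000

ys = [(random.gauss(mu, sigma) - mu) ** 2 / (2 * sigma ** 2) for _ in range(n)]

mean = sum(ys) / n
var = sum((y - mean) ** 2 for y in ys) / n
# A standard gamma variate with alpha = 1/2 has mean 1/2 and variance 1/2.
```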

See also Beta Distribution, Chi-Squared Distribution






© 1996-9 Eric W. Weisstein
1999-05-25