
Entropy

In physics, the word entropy denotes a physical quantity measuring the amount of ``disorder'' in a system. In mathematics, a more abstract definition is used. The (Shannon) entropy of a variable $X$ is defined as

\begin{displaymath}
H(X)\equiv -\sum_x p(x)\ln [p(x)],
\end{displaymath}

where $p(x)$ is the probability that $X$ is in the state $x$, and $p\ln p$ is defined as 0 if $p=0$. The joint entropy of variables $X_1$, ..., $X_n$ is then defined by


\begin{displaymath}
H(X_1, \ldots, X_n)\equiv -\sum_{x_1}\cdots \sum_{x_n} p(x_1, \ldots, x_n)\ln[p(x_1,\ldots,x_n)].
\end{displaymath}
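For example, a fair coin with $p({\rm heads})=p({\rm tails})=1/2$ has entropy

\begin{displaymath}
H(X)=-\left({\textstyle{1\over 2}}\ln{\textstyle{1\over 2}}+{\textstyle{1\over 2}}\ln{\textstyle{1\over 2}}\right)=\ln 2\approx 0.693.
\end{displaymath}

For two independent fair coins, $p(x_1,x_2)=1/4$ for each of the four outcomes, so

\begin{displaymath}
H(X_1,X_2)=-4\cdot{\textstyle{1\over 4}}\ln{\textstyle{1\over 4}}=2\ln 2,
\end{displaymath}

illustrating that the joint entropy of independent variables is the sum of their individual entropies.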

See also Kolmogorov Entropy, Kolmogorov-Sinai Entropy, Maximum Entropy Method, Metric Entropy, Ornstein's Theorem, Redundancy, Shannon Entropy, Topological Entropy





