
Kolmogorov Entropy

Also known as Metric Entropy. Divide Phase Space into $D$-dimensional Hypercubes of Content $\epsilon^D$. Let $P_{i_0,\ldots,i_n}$ be the probability that a trajectory is in Hypercube $i_0$ at $t=0$, $i_1$ at $t=T$, $i_2$ at $t=2T$, etc. Then define

\begin{displaymath}
K_n = -\sum_{i_0,\ldots,i_n} P_{i_0,\ldots,i_n}\ln P_{i_0,\ldots,i_n},
\end{displaymath} (1)
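For a map observed at a fixed time step, the block entropies $K_n$ can be estimated directly from a coarse-grained trajectory. The following is a minimal numerical sketch (an illustration, not part of the definition above), assuming the logistic map $x\mapsto 4x(1-x)$ with the two-cell partition at $x=1/2$ (so $\epsilon=1/2$); for this partition the increments $K_{n+1}-K_n$ approach $\ln 2$.

```python
import math
from collections import Counter

def block_entropy(symbols, n):
    """K_n = -sum P(i_0,...,i_n) ln P(i_0,...,i_n) over length-(n+1) blocks."""
    blocks = [tuple(symbols[i:i + n + 1]) for i in range(len(symbols) - n)]
    total = len(blocks)
    return -sum(c / total * math.log(c / total)
                for c in Counter(blocks).values())

# Coarse-grain a trajectory of the logistic map x -> 4x(1-x)
# using the two cells [0, 1/2) and [1/2, 1].
x, symbols = 0.3, []
for _ in range(200000):
    x = 4.0 * x * (1.0 - x)
    symbols.append(0 if x < 0.5 else 1)

K1 = block_entropy(symbols, 1)
K2 = block_entropy(symbols, 2)
print(K2 - K1)  # ~ ln 2 = 0.6931... for this partition
```

The probabilities $P_{i_0,\ldots,i_n}$ are estimated as relative frequencies of symbol blocks along the orbit; the difference $K_2-K_1$ already sits close to the limiting rate here because the symbolic dynamics of this map is effectively a fair coin flip.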

where $K_{n+1}-K_n$ is the information needed to predict which Hypercube the trajectory will be in at $(n+1)T$, given the trajectory up to $nT$. The Kolmogorov entropy is then defined by
\begin{displaymath}
K\equiv \lim_{T\to 0}\lim_{\epsilon\to 0^+}\lim_{N\to\infty} {1\over NT} \sum_{n=0}^{N-1} (K_{n+1}-K_n).
\end{displaymath} (2)

The Kolmogorov entropy is related to Lyapunov Characteristic Exponents by
\begin{displaymath}
h_K = \int_P \sum_{\sigma_i>0} \sigma_i\,d\mu.
\end{displaymath} (3)
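Equation (3) can be checked numerically in a simple case. A minimal sketch (an assumed example, not drawn from the text above): for the fully chaotic logistic map $x\mapsto 4x(1-x)$, which is ergodic with a single positive Lyapunov exponent $\sigma_1=\ln 2$, the integral in (3) reduces to $h_K=\sigma_1$.

```python
import math

# Lyapunov exponent of the logistic map f(x) = 4x(1-x), with f'(x) = 4 - 8x.
# For an ergodic map with one positive exponent, Eq. (3) gives h_K = sigma_1.
x = 0.3
for _ in range(1000):          # discard the transient
    x = 4.0 * x * (1.0 - x)

n, acc = 200000, 0.0
for _ in range(n):
    acc += math.log(abs(4.0 - 8.0 * x))
    x = 4.0 * x * (1.0 - x)

sigma = acc / n
print(sigma)  # ~ ln 2 = 0.6931..., so h_K = ln 2 by Eq. (3)
```

The exponent is the orbit average of $\ln|f'(x)|$, and the value $\ln 2$ agrees with the metric entropy of this map, as equation (3) requires.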

See also Hypercube, Lyapunov Characteristic Exponent.






© 1996-9 Eric W. Weisstein
1999-05-26