
Redundancy


The redundancy of $n$ random variables $X_1, \ldots, X_n$ is defined as

\begin{displaymath}
R(X_1, \ldots, X_n) \equiv \sum_{i=1}^n H(X_i) - H(X_1, \ldots, X_n),
\end{displaymath}

where $H(X_i)$ is the Entropy of $X_i$ and $H(X_1, \ldots, X_n)$ is the joint Entropy. The linear redundancy is defined as

\begin{displaymath}
L(X_1, \ldots, X_n) \equiv -{\textstyle{1\over 2}}\sum_{i=1}^n \ln\sigma_i,
\end{displaymath}

where the $\sigma_i$ are the Eigenvalues of the correlation matrix of $X_1, \ldots, X_n$.
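The two definitions above can be sketched numerically. The following is a minimal illustration, assuming a made-up joint distribution for two binary variables and an illustrative correlation coefficient $\rho = 0.8$ (neither comes from the text); for $n = 2$, $R$ reduces to the mutual information, and a $2\times 2$ correlation matrix $\bigl[\begin{smallmatrix}1 & \rho \\ \rho & 1\end{smallmatrix}\bigr]$ has eigenvalues $1 \pm \rho$, so no eigenvalue routine is needed.

```python
import math

# Hypothetical joint distribution of two binary variables X1, X2
# (illustrative values only, not from the text).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def entropy(dist):
    """Shannon entropy in nats: H = -sum_p p ln p."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# Marginal distributions of X1 and X2.
p1, p2 = {}, {}
for (x1, x2), p in joint.items():
    p1[x1] = p1.get(x1, 0.0) + p
    p2[x2] = p2.get(x2, 0.0) + p

# Redundancy R = sum_i H(X_i) - H(X_1, ..., X_n); for n = 2 this
# equals the mutual information between X1 and X2.
R = entropy(p1) + entropy(p2) - entropy(joint)

# Linear redundancy L = -(1/2) sum_i ln(sigma_i), where sigma_i are
# the eigenvalues of the correlation matrix.  For [[1, rho], [rho, 1]]
# the eigenvalues are 1 + rho and 1 - rho, so L = -(1/2) ln(1 - rho^2).
rho = 0.8  # illustrative correlation coefficient
eigenvalues = [1 + rho, 1 - rho]
L = -0.5 * sum(math.log(s) for s in eigenvalues)

print(f"R = {R:.6f} nats")  # positive, since X1 and X2 are dependent
print(f"L = {L:.6f}")
```

Both quantities vanish when the variables are independent: the joint entropy then equals the sum of the marginal entropies, and the correlation matrix is the identity, whose eigenvalues are all 1.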

See also Predictability


References

Fraser, A. M. ``Reconstructing Attractors from Scalar Time Series: A Comparison of Singular System and Redundancy Criteria.'' Physica D 34, 391-404, 1989.

Palus, M. ``Identifying and Quantifying Chaos by Using Information-Theoretic Functionals.'' In Time Series Prediction: Forecasting the Future and Understanding the Past (Ed. A. S. Weigend and N. A. Gershenfeld). Proc. NATO Advanced Research Workshop on Comparative Time Series Analysis held in Santa Fe, NM, May 14-17, 1992. Reading, MA: Addison-Wesley, pp. 387-413, 1994.




© 1996-9 Eric W. Weisstein
1999-05-25