
Symmetric Matrix

A symmetric matrix is a Square Matrix which satisfies ${\hbox{\sf A}}^{\rm T} = {\hbox{\sf A}}$, where ${\hbox{\sf A}}^{\rm T}$ denotes the Transpose, so that $a_{ij} = a_{ji}$ for all $i$ and $j$. When ${\hbox{\sf A}}$ is nonsingular, this also implies

\begin{displaymath}
{\hbox{\sf A}}^{-1}{\hbox{\sf A}}^{\rm T} = {\hbox{\sf I}},
\end{displaymath} (1)

where I is the Identity Matrix. Written explicitly,
\begin{displaymath}
\left[{\matrix{
a_{11} & a_{12} & \cdots & a_{1n}\cr
a_{21} & a_{22} & \cdots & a_{2n}\cr
\vdots & \vdots & \ddots & \vdots\cr
a_{n1} & a_{n2} & \cdots & a_{nn}\cr}}\right]
=\left[{\matrix{
a_{11} & a_{21} & \cdots & a_{n1}\cr
a_{12} & a_{22} & \cdots & a_{n2}\cr
\vdots & \vdots & \ddots & \vdots\cr
a_{1n} & a_{2n} & \cdots & a_{nn}\cr}}\right].
\end{displaymath} (2)
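
As a minimal numerical sketch (assuming NumPy and an arbitrary example matrix), the condition $a_{ij} = a_{ji}$ and the consequence (1) can be checked directly for a small nonsingular symmetric matrix.

import numpy as np

# An arbitrary nonsingular symmetric 3x3 example matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Defining condition: A equals its transpose, i.e. a_ij = a_ji.
assert np.array_equal(A, A.T)

# Equation (1), valid when A is nonsingular: A^{-1} A^T = I.
assert np.allclose(np.linalg.inv(A) @ A.T, np.eye(3))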

The symmetric part of any Matrix may be obtained from
\begin{displaymath}
{\hbox{\sf A}}_s = {\textstyle{1\over 2}}({\hbox{\sf A}}+{\hbox{\sf A}}^{\rm T}).
\end{displaymath} (3)
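
A short sketch of formula (3), again assuming NumPy: the symmetric part of an arbitrary square matrix is itself symmetric, and the remainder ${\hbox{\sf A}}-{\hbox{\sf A}}_s$ is antisymmetric.

import numpy as np

# An arbitrary non-symmetric square matrix.
A = np.array([[1.0, 4.0],
              [2.0, 3.0]])

# Symmetric part A_s = (A + A^T)/2, as in equation (3).
A_s = 0.5 * (A + A.T)
assert np.array_equal(A_s, A_s.T)

# The remainder is the antisymmetric part: (A - A_s)^T = -(A - A_s).
assert np.allclose((A - A_s).T, -(A - A_s))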

A real Matrix ${\hbox{\sf A}}$ is symmetric if and only if it can be expressed in the form
\begin{displaymath}
{\hbox{\sf A}}={\hbox{\sf Q}}{\hbox{\sf D}}{\hbox{\sf Q}}^{\rm T},
\end{displaymath} (4)

where ${\hbox{\sf Q}}$ is an Orthogonal Matrix and ${\hbox{\sf D}}$ is a Diagonal Matrix. This is equivalent to the Matrix equation
\begin{displaymath}
{\hbox{\sf A}}{\hbox{\sf Q}}={\hbox{\sf Q}}{\hbox{\sf D}},
\end{displaymath} (5)

which is equivalent to
\begin{displaymath}
{\hbox{\sf A}}{\bf Q}_n=\lambda_n{\bf Q}_n
\end{displaymath} (6)

for all $n$, where $\lambda_n=D_{nn}$. Therefore, the diagonal elements of ${\hbox{\sf D}}$ are the Eigenvalues of ${\hbox{\sf A}}$, and the columns of ${\hbox{\sf Q}}$ are the corresponding Eigenvectors.
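
As an illustrative sketch of equations (4)-(6), assuming NumPy: numpy.linalg.eigh computes the eigenvalues $\lambda_n$ and an orthogonal matrix of eigenvectors ${\hbox{\sf Q}}$ of a real symmetric matrix, from which the decomposition can be verified on the same example matrix as above.

import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized to symmetric (Hermitian) matrices and returns
# the eigenvalues together with an orthonormal set of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)
D = np.diag(eigenvalues)

assert np.allclose(Q @ Q.T, np.eye(3))   # Q is orthogonal
assert np.allclose(A, Q @ D @ Q.T)       # equation (4)
assert np.allclose(A @ Q, Q @ D)         # equation (5); columnwise, equation (6)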

See also Antisymmetric Matrix, Skew Symmetric Matrix





