
7.1 Diagonalization of Symmetric Matrices

Symmetric matrix: matrix \( A \) such that \( A^T = A \)

\[ A = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \quad A^T = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \]
\[ A = \begin{bmatrix} 1 & 3 \\ 3 & 1 \end{bmatrix} \quad A^T = \begin{bmatrix} 1 & 3 \\ 3 & 1 \end{bmatrix} \]

Try diagonalizing \( A = \begin{bmatrix} 1 & 3 \\ 3 & 1 \end{bmatrix} = PDP^{-1} \)

\[ \det(A - \lambda I) = 0 \implies \begin{vmatrix} 1 - \lambda & 3 \\ 3 & 1 - \lambda \end{vmatrix} = 0 \]
\[ (1 - \lambda)^2 - 9 = 0 \]

\( 1 - \lambda = 3 \quad \text{or} \quad 1 - \lambda = -3 \)

\( \lambda = -2, \quad \lambda = 4 \)
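These eigenvalues can be checked numerically (a NumPy sketch, not part of the original notes; `numpy` is assumed to be available):

```python
import numpy as np

A = np.array([[1.0, 3.0],
              [3.0, 1.0]])

# eigvalsh is the eigenvalue routine specialized for symmetric
# (Hermitian) matrices; it returns eigenvalues in ascending order.
print(np.linalg.eigvalsh(A))  # eigenvalues: -2 and 4
```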


Eigenvectors

\( \lambda = -2 \): solve \( (A - \lambda I)\vec{v} = (A + 2I)\vec{v} = \vec{0} \)

\[ \begin{bmatrix} 3 & 3 & | & 0 \\ 3 & 3 & | & 0 \end{bmatrix} \implies \vec{v} = \begin{bmatrix} 1 \\ -1 \end{bmatrix} \to \begin{bmatrix} 1/\sqrt{2} \\ -1/\sqrt{2} \end{bmatrix} \]

\( \lambda = 4 \): solve \( (A - \lambda I)\vec{v} = (A - 4I)\vec{v} = \vec{0} \)

\[ \begin{bmatrix} -3 & 3 & | & 0 \\ 3 & -3 & | & 0 \end{bmatrix} \implies \vec{v} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \to \begin{bmatrix} 1/\sqrt{2} \\ 1/\sqrt{2} \end{bmatrix} \]

Note: the eigenvectors from distinct eigenvalues are orthogonal \( \rightarrow \) always true for symmetric matrices.
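A quick check of this orthogonality claim (a NumPy sketch, not part of the notes):

```python
import numpy as np

# Unit eigenvectors found above for lambda = -2 and lambda = 4.
v_neg2 = np.array([1.0, -1.0]) / np.sqrt(2)
v_pos4 = np.array([1.0, 1.0]) / np.sqrt(2)

# Their dot product is zero, i.e. they are orthogonal.
assert np.isclose(v_neg2 @ v_pos4, 0.0)
```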

\[ P = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix} \]
\[ P^T = \begin{bmatrix} 1/\sqrt{2} & -1/\sqrt{2} \\ 1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix} \]

The columns of \( P \) are orthonormal.

When a square matrix has orthonormal columns, it is an orthogonal matrix.

\( P^T P = P P^T = I \)

\( \rightarrow \) If a matrix is orthogonal, its transpose equals its inverse.


Orthogonal Diagonalization

\[ A = PDP^{-1} = PDP^T \]\[ \begin{bmatrix} 1 & 3 \\ 3 & 1 \end{bmatrix} = \begin{bmatrix} 1/\sqrt{2} & 1/\sqrt{2} \\ -1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix} \begin{bmatrix} -2 & 0 \\ 0 & 4 \end{bmatrix} \begin{bmatrix} 1/\sqrt{2} & -1/\sqrt{2} \\ 1/\sqrt{2} & 1/\sqrt{2} \end{bmatrix} \]
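This factorization can be sanity-checked numerically (a NumPy sketch; \( P \) and \( D \) are taken from the notes above):

```python
import numpy as np

s = 1 / np.sqrt(2)
P = np.array([[ s, s],
              [-s, s]])
D = np.diag([-2.0, 4.0])

# P has orthonormal columns, so P^T P = I and P^{-1} = P^T.
assert np.allclose(P.T @ P, np.eye(2))

# Reassembling P D P^T recovers the original symmetric matrix A.
A = P @ D @ P.T
assert np.allclose(A, [[1.0, 3.0], [3.0, 1.0]])
```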

If a matrix can be diagonalized such that \( P \) is orthogonal, then we say the matrix is orthogonally diagonalizable.

Symmetry and Diagonalizability

If \( A = A^T \), then \( A \) is orthogonally diagonalizable (this is the spectral theorem). But is the converse true? (If a matrix is orthogonally diagonalizable, is it always symmetric?)

If \( A = PDP^T \), is \( A = A^T \)?

\[ A^T = (PDP^T)^T = (P^T)^T D^T P^T \]\[ = PDP^T = A \quad (\text{since } (P^T)^T = P \text{ and } D^T = D \text{ for diagonal } D) \]\[ \implies A^T = A, \text{ so yes} \]
\[ \text{symmetric} \longleftrightarrow \text{orthogonally diagonalizable} \]

Repeated Eigenvalues

Example

\[ A = \begin{bmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{bmatrix} \]\[ \lambda = -1, -1, 2 \]

Eigenvector for \( \lambda = 2 \):

\[ \vec{v}_1 = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \to \begin{bmatrix} 1/\sqrt{3} \\ 1/\sqrt{3} \\ 1/\sqrt{3} \end{bmatrix} \]

For \( \lambda = -1 \): \( (A - \lambda I)\vec{v} = \vec{0} \)

From the result above, we know that if \( A \) is symmetric, it is orthogonally diagonalizable, which means the dimension of each eigenspace must equal the multiplicity of the corresponding eigenvalue.

So here, since \( \lambda = -1 \) has multiplicity two, the eigenspace for \( \lambda = -1 \) is guaranteed to be two-dimensional, i.e. to contain two linearly independent eigenvectors.

\[ \begin{bmatrix} 1 & 1 & 1 & | & 0 \\ 1 & 1 & 1 & | & 0 \\ 1 & 1 & 1 & | & 0 \end{bmatrix} \sim \begin{bmatrix} 1 & 1 & 1 & | & 0 \\ 0 & 0 & 0 & | & 0 \\ 0 & 0 & 0 & | & 0 \end{bmatrix} \]

\( x_2, x_3 \) free
\( x_1 = -x_2 - x_3 \)


Gram-Schmidt Process for Orthogonalization

\[ \vec{x} = x_2 \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} + x_3 \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \]
\[ \vec{v}_2 = \begin{bmatrix} -1 \\ 1 \\ 0 \end{bmatrix} \to \begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix} \] \[ \vec{v}_3 = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \to \begin{bmatrix} -1/\sqrt{2} \\ 0 \\ 1/\sqrt{2} \end{bmatrix} \]
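That these really are eigenvectors for \( \lambda = -1 \) can be checked directly (a NumPy sketch, not part of the notes):

```python
import numpy as np

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# Basis vectors read off from the free variables x2 and x3
# (normalized, as in the notes).
v2 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)
v3 = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2)

# Each satisfies A v = -v, so both lie in the lambda = -1 eigenspace.
for v in (v2, v3):
    assert np.allclose(A @ v, -v)
```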

\( \vec{v}_1 \) is orthogonal to \( \vec{v}_2 \) and \( \vec{v}_3 \), and this is ALWAYS true because \( \vec{v}_1 \) and \( \{ \vec{v}_2, \vec{v}_3 \} \) are from distinct eigenvalues.

\( P \) must have orthonormal columns. But \( \vec{v}_2 \) is not orthogonal to \( \vec{v}_3 \).

Perform the Gram-Schmidt process to replace \( \vec{v}_3 \) with a vector orthogonal to \( \vec{v}_2 \):

\[ \vec{u}_3 = \vec{v}_3 - \frac{\langle \vec{v}_3, \vec{v}_2 \rangle}{\langle \vec{v}_2, \vec{v}_2 \rangle} \vec{v}_2 = \begin{bmatrix} -1/\sqrt{2} \\ 0 \\ 1/\sqrt{2} \end{bmatrix} - \frac{1}{2} \begin{bmatrix} -1/\sqrt{2} \\ 1/\sqrt{2} \\ 0 \end{bmatrix} \] \[ = \begin{bmatrix} -\frac{1}{2\sqrt{2}} \\ -\frac{1}{2\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix} \]
\[ \|\vec{u}_3\| = \sqrt{\tfrac{1}{8} + \tfrac{1}{8} + \tfrac{1}{2}} = \frac{\sqrt{6}}{2\sqrt{2}} = \frac{\sqrt{3}}{2} \]
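The same Gram-Schmidt step, carried out numerically (a NumPy sketch, not part of the notes):

```python
import numpy as np

v2 = np.array([-1.0, 1.0, 0.0]) / np.sqrt(2)
v3 = np.array([-1.0, 0.0, 1.0]) / np.sqrt(2)

# Subtract from v3 its projection onto v2 ...
u3 = v3 - (v3 @ v2) / (v2 @ v2) * v2

# ... which makes u3 orthogonal to v2.
assert np.isclose(v2 @ u3, 0.0)

# Normalizing u3 gives w3 = (-1, -1, 2) / sqrt(6).
w3 = u3 / np.linalg.norm(u3)
assert np.allclose(w3, np.array([-1.0, -1.0, 2.0]) / np.sqrt(6))
```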
\[ \vec{w}_3 = \frac{\vec{u}_3}{\|\vec{u}_3\|} = \begin{bmatrix} -\frac{1}{\sqrt{6}} \\ -\frac{1}{\sqrt{6}} \\ \frac{2}{\sqrt{6}} \end{bmatrix} \]

So now \( \{ \vec{v}_1, \vec{v}_2, \vec{w}_3 \} \) is orthonormal and forms the columns of \( P \).

\[ P = \begin{bmatrix} \frac{1}{\sqrt{3}} & -\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{3}} & \frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{6}} \\ \frac{1}{\sqrt{3}} & 0 & \frac{2}{\sqrt{6}} \end{bmatrix} \]
\( P \) is orthogonal, \( P^T = P^{-1} \)
\[ D = \begin{bmatrix} 2 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & -1 \end{bmatrix} \]
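Putting the pieces together, the full factorization can be verified numerically (a NumPy sketch; \( P \) and \( D \) as above):

```python
import numpy as np

s2, s3, s6 = np.sqrt(2), np.sqrt(3), np.sqrt(6)
P = np.array([[1/s3, -1/s2, -1/s6],
              [1/s3,  1/s2, -1/s6],
              [1/s3,  0.0,   2/s6]])
D = np.diag([2.0, -1.0, -1.0])

A = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])

# P is orthogonal: P^T P = I, so P^{-1} = P^T ...
assert np.allclose(P.T @ P, np.eye(3))

# ... and P D P^T reproduces A.
assert np.allclose(P @ D @ P.T, A)
```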