Section 5.2 Orthogonal Diagonalization

Theorem: The following conditions are equivalent for an [latex]n\times n[/latex] matrix [latex]U[/latex].

1. [latex]U[/latex] is invertible and [latex]U^{-1}=U^{T}[/latex].

2. The rows of [latex]U[/latex] are orthonormal.

3. The columns of [latex]U[/latex] are orthonormal.

 

Proof: The columns of [latex]U[/latex] are orthonormal if and only if [latex]U^{T}U=I[/latex], which says exactly that [latex]U[/latex] is invertible with [latex]U^{-1}=U^{T}[/latex]. In that case [latex]UU^{T}=I[/latex] as well, and this says that the rows of [latex]U[/latex] are orthonormal.

 

Definition: An orthogonal matrix is a square invertible matrix [latex]U[/latex] such that  [latex]U^{-1}=U^{T}[/latex].
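As a quick numerical illustration (added for reference, not part of the original text), the definition can be checked with NumPy; the rotation matrix below is a hypothetical example of an orthogonal matrix.

```python
import numpy as np

# Hypothetical example: a 2x2 rotation matrix is orthogonal.
theta = np.pi / 6
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Columns are orthonormal, so U^T U = I and U^{-1} = U^T.
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(np.linalg.inv(U), U.T)
```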

 

 

Definition: A symmetric matrix is a matrix [latex]A[/latex] such that [latex]A=A^{T}[/latex].

 

Remark: Such a matrix is necessarily square. Its main diagonal entries are arbitrary, but its other entries occur in pairs, on opposite sides of the main diagonal.

 

Theorem: If [latex]A[/latex] is symmetric, then any two eigenvectors from different eigenspaces are orthogonal.

 

Proof: Suppose [latex]A\overrightarrow{v_{1}}=\lambda_{1}\overrightarrow{v_{1}}[/latex] and [latex]A\overrightarrow{v_{2}}=\lambda_{2}\overrightarrow{v_{2}}[/latex] with [latex]\lambda_{1}\neq\lambda_{2}[/latex]. Since [latex]A=A^{T}[/latex], we have [latex]\lambda_{1}\overrightarrow{v_{1}}\cdot\overrightarrow{v_{2}}=(A\overrightarrow{v_{1}})\cdot\overrightarrow{v_{2}}=\overrightarrow{v_{1}}\cdot(A\overrightarrow{v_{2}})=\lambda_{2}\overrightarrow{v_{1}}\cdot\overrightarrow{v_{2}}[/latex]. Hence [latex](\lambda_{1}-\lambda_{2})\overrightarrow{v_{1}}\cdot\overrightarrow{v_{2}}=0[/latex], and since [latex]\lambda_{1}\neq\lambda_{2}[/latex] we conclude [latex]\overrightarrow{v_{1}}\cdot\overrightarrow{v_{2}}=0[/latex].

 

 

Example 1: Find the eigenspaces of [latex]A=\left[\begin{array}{cc} 16 & -4\\ -4 & 1 \end{array}\right][/latex] and verify that eigenvectors from different eigenspaces are orthogonal.
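A possible solution sketch (added for reference): the characteristic polynomial is [latex]\det(A-\lambda I)=(16-\lambda)(1-\lambda)-16=\lambda^{2}-17\lambda=\lambda(\lambda-17)[/latex], so the eigenvalues are [latex]\lambda_{1}=0[/latex] and [latex]\lambda_{2}=17[/latex]. Solving [latex](A-\lambda I)\overrightarrow{v}=\overrightarrow{0}[/latex] gives eigenvectors [latex]\overrightarrow{v_{1}}=\left[\begin{array}{c} 1\\ 4 \end{array}\right][/latex] and [latex]\overrightarrow{v_{2}}=\left[\begin{array}{c} -4\\ 1 \end{array}\right][/latex], and indeed [latex]\overrightarrow{v_{1}}\cdot\overrightarrow{v_{2}}=(1)(-4)+(4)(1)=0[/latex].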

 

 

Exercise 1: Find the eigenspaces of [latex]A=\left[\begin{array}{cc} -7 & 24\\ 24 & 7 \end{array}\right][/latex] and verify that eigenvectors from different eigenspaces are orthogonal.

 

Definition: An [latex]n\times n[/latex] matrix [latex]A[/latex] is said to be orthogonally diagonalizable if there exist an orthogonal matrix [latex]P[/latex] (so that [latex]P^{-1}=P^{T}[/latex] and the columns of [latex]P[/latex] are orthonormal) and a diagonal matrix [latex]D[/latex] such that [latex]A=PDP^{T}=PDP^{-1}[/latex].

 

Remark: Such a diagonalization requires [latex]n[/latex] orthonormal (hence linearly independent) eigenvectors. If [latex]A[/latex] is orthogonally diagonalizable, then [latex]A^{T}=(PDP^{T})^{T}=(P^{T})^{T}D^{T}P^{T}=PDP^{T}=A[/latex],

i.e. [latex]A[/latex] is symmetric.

 

Theorem: An [latex]n\times n[/latex] matrix [latex]A[/latex] is orthogonally diagonalizable if and only if [latex]A[/latex] is a symmetric matrix.

 

Example 2: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{cc} 3 & 1\\ 1 & 3 \end{array}\right][/latex].
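A possible solution sketch (added for reference): [latex]\det(A-\lambda I)=(3-\lambda)^{2}-1=(\lambda-2)(\lambda-4)[/latex], so the eigenvalues are [latex]2[/latex] and [latex]4[/latex], with unit eigenvectors [latex]\frac{1}{\sqrt{2}}\left[\begin{array}{c} 1\\ -1 \end{array}\right][/latex] and [latex]\frac{1}{\sqrt{2}}\left[\begin{array}{c} 1\\ 1 \end{array}\right][/latex]. Then [latex]A=PDP^{T}[/latex] with [latex]P=\frac{1}{\sqrt{2}}\left[\begin{array}{cc} 1 & 1\\ -1 & 1 \end{array}\right][/latex] and [latex]D=\left[\begin{array}{cc} 2 & 0\\ 0 & 4 \end{array}\right][/latex].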

 

 

Exercise 2: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{cc} 1 & 5\\ 5 & 1 \end{array}\right][/latex].

 

Example 3: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{ccc} 3 & -2 & 4\\ -2 & 6 & 2\\ 4 & 2 & 3 \end{array}\right][/latex].
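A numerical check of Example 3 (an added illustration using NumPy's `eigh`, which returns real eigenvalues and orthonormal eigenvectors for a symmetric matrix):

```python
import numpy as np

A = np.array([[3., -2., 4.],
              [-2., 6., 2.],
              [4., 2., 3.]])

# eigh is designed for symmetric matrices: eigenvalues come back real
# (in ascending order) and the eigenvector matrix P has orthonormal columns.
w, P = np.linalg.eigh(A)

assert np.allclose(w, [-2., 7., 7.])         # eigenvalues -2 and 7 (multiplicity 2)
assert np.allclose(P.T @ P, np.eye(3))       # P is orthogonal
assert np.allclose(P @ np.diag(w) @ P.T, A)  # A = P D P^T
```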

 

 

Exercise 3: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{ccc} 5 & -4 & -2\\ -4 & 5 & 2\\ -2 & 2 & 2 \end{array}\right][/latex].

 

Remark: The set of eigenvalues of a matrix [latex]A[/latex] is sometimes called the spectrum of [latex]A[/latex], and the following description of the eigenvalues is called a spectral theorem.

 

Theorem: The Spectral Theorem for Symmetric Matrices

An [latex]n\times n[/latex] symmetric matrix [latex]A[/latex] has the following properties:

(a) [latex]A[/latex] has [latex]n[/latex] real eigenvalues, counting multiplicities.

(b) The dimension of the eigenspace for each eigenvalue [latex]\lambda[/latex] equals the multiplicity of [latex]\lambda[/latex] as a root of the characteristic equation.

(c) The eigenspaces are mutually orthogonal, in the sense that eigenvectors corresponding to different eigenvalues are orthogonal.

(d) [latex]A[/latex] is orthogonally diagonalizable.
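The properties above can be observed numerically (an added illustration; the symmetric matrix below is generated arbitrarily from a fixed seed):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
S = (M + M.T) / 2           # symmetrize: S is a 5x5 symmetric matrix

w, P = np.linalg.eigh(S)    # spectral decomposition S = P D P^T

assert np.all(np.isreal(w))                  # (a) all eigenvalues are real
assert np.allclose(P.T @ P, np.eye(5))       # (c) eigenvectors are orthonormal
assert np.allclose(P @ np.diag(w) @ P.T, S)  # (d) S is orthogonally diagonalizable
```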

 

 

Example 4: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{cccc} 2 & 0 & 0 & 0\\ 0 & 1 & 0 & 1\\ 0 & 0 & 2 & 0\\ 0 & 1 & 0 & 1 \end{array}\right][/latex].
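A numerical check of Example 4 (added for reference; note the eigenvalue of multiplicity three, whose eigenspace must be given an orthonormal basis):

```python
import numpy as np

A = np.array([[2., 0., 0., 0.],
              [0., 1., 0., 1.],
              [0., 0., 2., 0.],
              [0., 1., 0., 1.]])

w, P = np.linalg.eigh(A)                     # eigenvalues in ascending order

assert np.allclose(w, [0., 2., 2., 2.])      # eigenvalue 2 has multiplicity 3
assert np.allclose(P.T @ P, np.eye(4))       # P is orthogonal
assert np.allclose(P @ np.diag(w) @ P.T, A)  # A = P D P^T
```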

 

 

Exercise 4: Orthogonally diagonalize the matrix [latex]A=\left[\begin{array}{cccc} 1 & 0 & 0 & 1\\ 0 & 3 & 0 & 0\\ 0 & 0 & 3 & 0\\ 1 & 0 & 0 & 1 \end{array}\right].[/latex]

 

GroupWork 1: True or False.

 

a. An [latex]n\times n[/latex] matrix that is orthogonally diagonalizable must be symmetric.

 

b. If [latex]A=A^{T}[/latex] and if vectors [latex]\overrightarrow{u}[/latex] and [latex]\overrightarrow{v}[/latex] satisfy
[latex]A\overrightarrow{u}=3\overrightarrow{u}[/latex] and [latex]A\overrightarrow{v}=4\overrightarrow{v}[/latex] then
[latex]\overrightarrow{u}\cdot\overrightarrow{v}=0[/latex].

 

c. An [latex]n\times n[/latex] symmetric matrix has [latex]n[/latex] distinct real eigenvalues.

 

d. Every symmetric matrix is orthogonally diagonalizable.

 

e. If [latex]B=PDP^{T}[/latex], where [latex]P^{T}=P^{-1}[/latex] and [latex]D[/latex] is a diagonal matrix, then [latex]B[/latex] is a symmetric matrix.

 

f. The dimension of an eigenspace of a symmetric matrix equals the multiplicity of the corresponding eigenvalue.

 

GroupWork 2: Show that if [latex]A[/latex] and [latex]B[/latex] are orthogonal matrices then [latex]AB[/latex] is also an orthogonal matrix.
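One possible approach (an added hint): since [latex]A^{-1}=A^{T}[/latex] and [latex]B^{-1}=B^{T}[/latex], we have [latex](AB)^{T}=B^{T}A^{T}=B^{-1}A^{-1}=(AB)^{-1}[/latex], so [latex]AB[/latex] is orthogonal.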

 

GroupWork 3: Suppose [latex]A[/latex] is invertible and orthogonally diagonalizable. Show that [latex]A^{-1}[/latex] is also orthogonally diagonalizable.

 

GroupWork 4: Prove the statement or give a counterexample.

 

a. An orthogonal matrix is orthogonally diagonalizable.

 

b. An orthogonal matrix is invertible.

 

c. An invertible matrix is orthogonal.

 

d. If a matrix is diagonalizable then it is symmetric.

 

GroupWork 5: Suppose [latex]A[/latex] is a symmetric [latex]n\times n[/latex] matrix and [latex]B[/latex] is any [latex]n\times m[/latex] matrix. Show that [latex]B^{T}AB[/latex], [latex]B^{T}B[/latex], and [latex]BB^{T}[/latex] are symmetric matrices.
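One possible approach (an added hint): since [latex]A^{T}=A[/latex], [latex](B^{T}AB)^{T}=B^{T}A^{T}(B^{T})^{T}=B^{T}AB[/latex]; similarly [latex](B^{T}B)^{T}=B^{T}(B^{T})^{T}=B^{T}B[/latex] and [latex](BB^{T})^{T}=BB^{T}[/latex].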

License


Matrices Copyright © 2019 by Kuei-Nuan Lin is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
