Section 3.3 Diagonalization and Eigenvalues

3.3A

 

How do we find the matrix [latex]A^{100}[/latex]? One application: sometimes we need to apply the same process many times and see what the result becomes.

Example: Consider the evolution of the population of a species of birds. Because the numbers of males and females are nearly equal, we count only females. We assume that each female remains a juvenile for one year and then becomes an adult, and that only adults have offspring. We make three assumptions about reproduction and survival rates:

1. The number of juvenile females hatched in any year is twice the number of adult females alive the year before (we say the reproduction rate is 2).

2. Half of the adult females in any year survive to the next year (the adult survival rate is 1/2).

3. One-quarter of the juvenile females in any year survive into adulthood (the juvenile survival rate is 1/4).

If there were 100 adult females and 40 juvenile females alive initially, compute the population of females [latex]k[/latex] years later.
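One way to set up the computation (a sketch; the symbols [latex]j_k[/latex] and [latex]a_k[/latex] are our own notation for the numbers of juvenile and adult females after [latex]k[/latex] years): the three assumptions give [latex]j_{k+1} = 2a_k[/latex] and [latex]a_{k+1} = \frac{1}{4}j_k + \frac{1}{2}a_k[/latex], i.e. [latex]\begin{bmatrix}j_{k+1}\\a_{k+1}\end{bmatrix} = \begin{bmatrix}0 & 2\\1/4 & 1/2\end{bmatrix}\begin{bmatrix}j_k\\a_k\end{bmatrix}[/latex] with [latex]\begin{bmatrix}j_0\\a_0\end{bmatrix} = \begin{bmatrix}40\\100\end{bmatrix}[/latex]. So the population after [latex]k[/latex] years is [latex]\begin{bmatrix}j_k\\a_k\end{bmatrix} = A^k\begin{bmatrix}40\\100\end{bmatrix}[/latex], where [latex]A = \begin{bmatrix}0 & 2\\1/4 & 1/2\end{bmatrix}[/latex], which is why we want an efficient way to compute [latex]A^k[/latex].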

Definition: An eigenvector of an [latex]n \times n[/latex] matrix [latex]A[/latex] is a nonzero vector [latex]\vec{x}[/latex] such that for some scalar [latex]\lambda[/latex], [latex]A\vec{x} = \lambda\vec{x}[/latex]. A scalar [latex]\lambda[/latex] is called an eigenvalue of [latex]A[/latex] if there is a nontrivial solution [latex]\vec{x}[/latex] of [latex]A\vec{x} = \lambda\vec{x}[/latex]; such an [latex]\vec{x}[/latex] is called an eigenvector corresponding to [latex]\lambda[/latex].
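For instance (a quick check of the definition): if [latex]A = \begin{bmatrix}1 & 6\\5 & 2\end{bmatrix}[/latex] and [latex]\vec{x} = \begin{bmatrix}6\\-5\end{bmatrix}[/latex], then [latex]A\vec{x} = \begin{bmatrix}-24\\20\end{bmatrix} = -4\vec{x}[/latex], so [latex]\vec{x}[/latex] is an eigenvector of [latex]A[/latex] corresponding to the eigenvalue [latex]-4[/latex].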

 

Remark: [latex]\lambda[/latex] is an eigenvalue of an [latex]n \times n[/latex] matrix [latex]A[/latex] if and only if the equation [latex](A - I\lambda)\vec{x} = 0[/latex] has a nontrivial solution.

 

 

Example 1: Show that [latex]-2[/latex] is an eigenvalue of matrix [latex]\begin{bmatrix}-1 & 2\\3 & 4\end{bmatrix}[/latex], and find the corresponding eigenvectors.
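A possible solution sketch: [latex]A - (-2)I = \begin{bmatrix}1 & 2\\3 & 6\end{bmatrix}[/latex], and its second row is [latex]3[/latex] times the first, so [latex](A + 2I)\vec{x} = 0[/latex] has nontrivial solutions and [latex]-2[/latex] is an eigenvalue. Solving [latex]x_1 + 2x_2 = 0[/latex] gives the eigenvectors [latex]x_2\begin{bmatrix}-2\\1\end{bmatrix}[/latex] with [latex]x_2 \neq 0[/latex].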

 

 

Exercise 1: Show that [latex]3[/latex] is an eigenvalue of matrix [latex]\begin{bmatrix}2 & 3\\-2 & 9\end{bmatrix}[/latex], and find the corresponding eigenvectors.

 

Example 2: Show that [latex]-3[/latex] is an eigenvalue of matrix [latex]\begin{bmatrix}1 & 1 & 2\\3 & 0 & 6\\-2 & 2 & 1\end{bmatrix}[/latex] and find eigenvectors corresponding to the eigenvalue [latex]-3[/latex].
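A possible solution sketch: [latex]A - (-3)I = \begin{bmatrix}4 & 1 & 2\\3 & 3 & 6\\-2 & 2 & 4\end{bmatrix}[/latex] row reduces to [latex]\begin{bmatrix}1 & 1 & 2\\0 & 1 & 2\\0 & 0 & 0\end{bmatrix}[/latex], which has a free variable, so [latex]-3[/latex] is an eigenvalue. Back substitution gives [latex]x_1 = 0[/latex] and [latex]x_2 = -2x_3[/latex], so the eigenvectors are [latex]x_3\begin{bmatrix}0\\-2\\1\end{bmatrix}[/latex] with [latex]x_3 \neq 0[/latex].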

 

 

Exercise 2: Show that [latex]4[/latex] is an eigenvalue of matrix [latex]\begin{bmatrix}6 & -4 & 6\\1 & 2 & 3\\-3 & 6 & -5\end{bmatrix}[/latex] and find eigenvectors corresponding to the eigenvalue [latex]4[/latex].

 

Question: How do we find the eigenvalues?

Theorem: The eigenvalues of a triangular matrix are the entries on its main diagonal.

Proof:
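One possible argument (using the determinant facts recorded later in this subsection): if [latex]A[/latex] is triangular with diagonal entries [latex]a_{11}, \ldots, a_{nn}[/latex], then [latex]A - \lambda I[/latex] is also triangular with diagonal entries [latex]a_{11} - \lambda, \ldots, a_{nn} - \lambda[/latex], so det [latex](A - \lambda I) = (a_{11} - \lambda)\cdots(a_{nn} - \lambda)[/latex]. This is zero, i.e. [latex](A - \lambda I)\vec{x} = 0[/latex] has a nontrivial solution, exactly when [latex]\lambda[/latex] equals one of the diagonal entries.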

 

Remark: Unfortunately, we cannot row reduce a non-triangular matrix to echelon (triangular) form to find the eigenvalues of a matrix [latex]A[/latex], because row operations generally change the eigenvalues. For example, [latex]-5[/latex] is an eigenvalue of the matrix [latex]\begin{bmatrix}-4 & -3\\4 & -17\end{bmatrix}[/latex], but its echelon form is [latex]\begin{bmatrix}-4 & -3\\0 & -20\end{bmatrix}[/latex], which has eigenvalues [latex]-4[/latex] and [latex]-20[/latex].

 

 

Example 3: Find an eigenvalue of matrix [latex]A = \begin{bmatrix}3 & 3 & 4\\0 & 2 & 3\\0 & 0 & -1\end{bmatrix}[/latex], and find an eigenvector [latex]\vec{x}[/latex] corresponding to the eigenvalue you find. What is [latex]A^{10}\vec{x}[/latex]?
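A possible solution sketch: [latex]A[/latex] is triangular, so [latex]3[/latex] is an eigenvalue. Solving [latex](A - 3I)\vec{x} = \begin{bmatrix}0 & 3 & 4\\0 & -1 & 3\\0 & 0 & -4\end{bmatrix}\vec{x} = 0[/latex] forces [latex]x_2 = x_3 = 0[/latex] with [latex]x_1[/latex] free, so [latex]\vec{x} = \begin{bmatrix}1\\0\\0\end{bmatrix}[/latex] is a corresponding eigenvector. Since [latex]A\vec{x} = 3\vec{x}[/latex], we get [latex]A^{10}\vec{x} = 3^{10}\vec{x} = \begin{bmatrix}59049\\0\\0\end{bmatrix}[/latex].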

 

 

Exercise 3: Find an eigenvalue of matrix [latex]A = \begin{bmatrix}-2 & 4 & 3\\0 & 1 & 2\\0 & 0 & 1\end{bmatrix}[/latex], and find an eigenvector [latex]\vec{x}[/latex] corresponding to the eigenvalue you find. What is [latex]A^{20}\vec{x}[/latex]?

 

Remark: [latex]0[/latex] is an eigenvalue of a matrix [latex]A[/latex] if and only if [latex]A\vec{x} = 0\vec{x}[/latex] has nontrivial solutions, which happens if and only if [latex]A[/latex] is not invertible.

 

Theorem: Let [latex]A[/latex] be an [latex]n \times n[/latex] matrix. Then [latex]A[/latex] is invertible if and only if:

a. The number [latex]0[/latex] is not an eigenvalue of [latex]A[/latex].

b. The determinant of [latex]A[/latex] is not zero.

 

Theorem: Let [latex]A[/latex] and [latex]B[/latex] be [latex]n \times n[/latex] matrices.

a. [latex]A[/latex] is invertible if and only if det [latex]A \neq 0[/latex].

b. det [latex]AB =[/latex] det [latex]A[/latex]det [latex]B[/latex]

c. det [latex]A^{T} =[/latex] det [latex]A[/latex]

d. If [latex]A[/latex] is triangular, then det [latex]A[/latex] is the product of the entries on the main diagonal of [latex]A[/latex].

e. A row replacement operation on [latex]A[/latex] does not change the determinant. A row interchange changes the sign of the determinant. A row scaling also scales the determinant by the same scalar factor.
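For instance (a quick numerical check of property b): if [latex]A = \begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}[/latex] and [latex]B = \begin{bmatrix}0 & 1\\1 & 0\end{bmatrix}[/latex], then det [latex]A = -2[/latex], det [latex]B = -1[/latex], and [latex]AB = \begin{bmatrix}2 & 1\\4 & 3\end{bmatrix}[/latex] has det [latex]AB = 2 = (-2)(-1)[/latex].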

 

 

Question: How do we find eigenvalues of a matrix when the matrix is not a triangular matrix?

Definition: det [latex](A - xI) = 0[/latex] is called the characteristic equation of [latex]A[/latex]. det [latex](A - xI)[/latex] is a polynomial of degree [latex]n[/latex] over the real numbers; hence we call det [latex](A - xI)[/latex] the characteristic polynomial of [latex]A[/latex].

 

Remark: [latex]\lambda[/latex] is an eigenvalue of an [latex]n \times n[/latex] matrix [latex]A[/latex] if and only if det [latex](A - I\lambda) = 0[/latex], i.e., [latex]\lambda[/latex] is a root of the characteristic equation det [latex](A - xI) = 0[/latex].
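For instance: if [latex]A = \begin{bmatrix}2 & 1\\1 & 2\end{bmatrix}[/latex], then det [latex](A - xI) = (2-x)^2 - 1 = (x-1)(x-3)[/latex], so the characteristic equation [latex](x-1)(x-3) = 0[/latex] gives the eigenvalues [latex]1[/latex] and [latex]3[/latex].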

 

Definition: An eigenvalue [latex]\lambda[/latex] of a matrix [latex]A[/latex] is said to have multiplicity [latex]s[/latex] if [latex](\lambda - x)^s[/latex] is a factor of det [latex](A - xI)[/latex] but [latex](\lambda - x)^{s+1}[/latex] is not a factor of det [latex](A - xI)[/latex].

 

 

Example 4: The characteristic polynomial of a [latex]6 \times 6[/latex] matrix is [latex]x^6 - 4x^4 - x^5 + 4x^3[/latex]. Find the eigenvalues and their multiplicities.
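A possible solution sketch: [latex]x^6 - x^5 - 4x^4 + 4x^3 = x^3(x^3 - x^2 - 4x + 4) = x^3(x-1)(x-2)(x+2)[/latex], so the eigenvalues are [latex]0[/latex] with multiplicity [latex]3[/latex], and [latex]1, 2, -2[/latex], each with multiplicity [latex]1[/latex].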

 

 

Exercise 4: The characteristic polynomial of a [latex]4 \times 4[/latex] matrix is [latex]x^4 - 4x^3 + 4x^2[/latex]. Find the eigenvalues and their multiplicities.

 

Example 5: Find the characteristic equation of [latex]A = \begin{bmatrix}-4 & 1 & -3\\0 & 1 & 1\\3 & 0 & 3\end{bmatrix}[/latex], and find the eigenvalues of [latex]A[/latex] and their multiplicities.
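A possible solution sketch: expanding det [latex](A - xI)[/latex] along the first column, det [latex](A - xI) = (-4-x)(1-x)(3-x) + 3\left(1 + 3(1-x)\right) = -x^3 + 4x = -x(x-2)(x+2)[/latex], so the characteristic equation is [latex]-x(x-2)(x+2) = 0[/latex] and the eigenvalues are [latex]0, 2, -2[/latex], each with multiplicity [latex]1[/latex].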

 

 

Exercise 5: Find the characteristic equation of [latex]A = \begin{bmatrix}0 & 1 & 0\\3 & 0 & 1\\2 & 0 & 0\end{bmatrix}[/latex], and find the eigenvalues of [latex]A[/latex] and their multiplicities.

 

GroupWork 1: True or False.

a. If [latex]A\vec{x} = \lambda\vec{x}[/latex] for some vector [latex]\vec{x}[/latex], then [latex]\lambda[/latex] is an eigenvalue of [latex]A[/latex].

 

b. A matrix [latex]A[/latex] is not invertible if and only if [latex]0[/latex] is an eigenvalue of [latex]A[/latex].

 

c. A number [latex]c[/latex] is an eigenvalue of [latex]A[/latex] if and only if [latex](A - cI)\vec{x} = 0[/latex] has a nontrivial solution.

 

d. Finding an eigenvector of [latex]A[/latex] may be difficult, but checking whether a given vector is an eigenvector is easy.

 

e. To find the eigenvalues of [latex]A[/latex], reduce [latex]A[/latex] to echelon form.

 

GroupWork 2: Let [latex]\lambda[/latex] be an eigenvalue of an invertible matrix [latex]A[/latex]. Show that [latex]\lambda^{-1}[/latex] is an eigenvalue of [latex]A^{-1}[/latex].

 

GroupWork 3: Show that [latex]\lambda[/latex] is an eigenvalue of a matrix [latex]A[/latex] if and only if [latex]\lambda[/latex] is an eigenvalue of [latex]A^{T}[/latex]. (Hint: [latex]A[/latex] and [latex]A^{T}[/latex] have the same characteristic polynomial.)

 

GroupWork 4: True or False. Justify each answer:

a. The eigenvalues of a matrix are on its main diagonal.

 

b. The determinant of [latex]A[/latex] is the product of the main diagonal entries.

 

c. An elementary row operation on a matrix [latex]A[/latex] does not change the determinant of [latex]A[/latex].

 

d. det [latex]A[/latex] det [latex]B[/latex] = det [latex]AB[/latex].

 

e. det [latex]A^T[/latex] = −det [latex]A[/latex].

 

f. A row replacement operation on [latex]A[/latex] does not change the eigenvalues.

 

g. If [latex]x + 5[/latex] is a factor of the characteristic polynomial of [latex]A[/latex] then [latex]5[/latex] is an eigenvalue.

 

GroupWork 5: Find [latex]h[/latex] in the matrix [latex]A = \begin{bmatrix}1 & 1 & -2 & 3\\0 & 2 & h & 0\\0 & 0 & 1 & 5\\0 & 0 & 0 & 2\end{bmatrix}[/latex] such that [latex]\lambda = 1[/latex] has two basic eigenvectors, i.e., [latex](A - I\lambda)[/latex] has two free variables.

 

3.3B

 

Remember that we are trying to find a way to compute [latex]A^k[/latex] for a large number [latex]k[/latex].

 

Example 1: Find [latex]A^k[/latex] if [latex]A = \begin{bmatrix}1 & -2\\-5 & 4\end{bmatrix}[/latex] and [latex]A = PDP^{-1}[/latex], where [latex]P = \begin{bmatrix}1 & 2\\1 & -5\end{bmatrix}[/latex] and [latex]D = \begin{bmatrix}-1 & 0\\0 & 6\end{bmatrix}[/latex].
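A possible solution sketch: [latex]A^k = (PDP^{-1})(PDP^{-1})\cdots(PDP^{-1}) = PD^kP^{-1}[/latex], since the inner factors [latex]P^{-1}P[/latex] cancel. Here [latex]D^k = \begin{bmatrix}(-1)^k & 0\\0 & 6^k\end{bmatrix}[/latex] and [latex]P^{-1} = \frac{1}{7}\begin{bmatrix}5 & 2\\1 & -1\end{bmatrix}[/latex], so [latex]A^k = \frac{1}{7}\begin{bmatrix}5(-1)^k + 2\cdot 6^k & 2(-1)^k - 2\cdot 6^k\\5(-1)^k - 5\cdot 6^k & 2(-1)^k + 5\cdot 6^k\end{bmatrix}[/latex]; setting [latex]k = 1[/latex] recovers [latex]A[/latex].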

 

 

Exercise 1: Find [latex]A^k[/latex] if [latex]A = \begin{bmatrix}-1 & 2\\2 & 2\end{bmatrix}[/latex] and [latex]A = PDP^{-1}[/latex], where [latex]P = \begin{bmatrix}1 & 2\\2 & -1\end{bmatrix}[/latex] and [latex]D = \begin{bmatrix}3 & 0\\0 & -2\end{bmatrix}[/latex].

 

Definition: A square matrix [latex]A[/latex] is said to be diagonalizable if [latex]P^{-1}AP = D[/latex] for some invertible matrix [latex]P[/latex] and some diagonal matrix [latex]D[/latex] (equivalently, [latex]AP = PD[/latex]).

 

Theorem: An [latex]n \times n[/latex] matrix [latex]A[/latex] is diagonalizable if and only if [latex]A[/latex] has [latex]n[/latex] eigenvectors, [latex]\vec{x_{1}},\cdots,\vec{x_{n}}[/latex] such that the matrix [latex]P = \begin{bmatrix}\vec{x_{1}},\vec{x_{2}},\cdots,\vec{x_{n}}\end{bmatrix}[/latex] is invertible.

 

Remark: If [latex]A = PDP^{-1}[/latex] with [latex]D[/latex] a diagonal matrix, the diagonal entries of [latex]D[/latex] are eigenvalues of [latex]A[/latex] that correspond, respectively, to the eigenvectors in [latex]P[/latex].

 

 

Example 2: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}3 & -1\\2 & 6\end{bmatrix}[/latex]
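A possible solution sketch: det [latex](A - xI) = (3-x)(6-x) + 2 = (x-4)(x-5)[/latex], so the eigenvalues are [latex]4[/latex] and [latex]5[/latex], with corresponding eigenvectors [latex]\begin{bmatrix}1\\-1\end{bmatrix}[/latex] and [latex]\begin{bmatrix}1\\-2\end{bmatrix}[/latex]. Since these form an invertible matrix, [latex]A = PDP^{-1}[/latex] with [latex]P = \begin{bmatrix}1 & 1\\-1 & -2\end{bmatrix}[/latex] and [latex]D = \begin{bmatrix}4 & 0\\0 & 5\end{bmatrix}[/latex].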

 

 

Exercise 2: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}-2 & 2\\7 & 3\end{bmatrix}[/latex]

 

Theorem: An [latex]n \times n[/latex] matrix with [latex]n[/latex] distinct eigenvalues is diagonalizable.

 

Remark: It is not necessary for an [latex]n \times n[/latex] matrix to have [latex]n[/latex] distinct eigenvalues in order to be diagonalizable. The above theorem provides a sufficient condition for a matrix to be diagonalizable. (see Example 3).

 

 

Example 3: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}1 & 0 & 0\\0 & 2 & 0\\3 & 0 & 2\end{bmatrix}[/latex].
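A possible solution sketch: [latex]A[/latex] is triangular, so its eigenvalues are [latex]1[/latex] (multiplicity [latex]1[/latex]) and [latex]2[/latex] (multiplicity [latex]2[/latex]). For [latex]\lambda = 2[/latex], [latex](A - 2I)\vec{x} = 0[/latex] forces [latex]x_1 = 0[/latex] with [latex]x_2, x_3[/latex] free, giving two basic eigenvectors [latex]\begin{bmatrix}0\\1\\0\end{bmatrix}[/latex] and [latex]\begin{bmatrix}0\\0\\1\end{bmatrix}[/latex]; for [latex]\lambda = 1[/latex] we get [latex]\begin{bmatrix}1\\0\\-3\end{bmatrix}[/latex]. Hence [latex]A = PDP^{-1}[/latex] with [latex]P = \begin{bmatrix}1 & 0 & 0\\0 & 1 & 0\\-3 & 0 & 1\end{bmatrix}[/latex] and [latex]D = \begin{bmatrix}1 & 0 & 0\\0 & 2 & 0\\0 & 0 & 2\end{bmatrix}[/latex], even though [latex]A[/latex] has only two distinct eigenvalues.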

 

 

Exercise 3: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}3 & 0 & -1\\0 & 3 & 0\\0 & 0 & 2\end{bmatrix}[/latex].

 

Theorem: A square matrix [latex]A[/latex] is diagonalizable if and only if every eigenvalue [latex]\lambda[/latex] of multiplicity [latex]m[/latex] yields exactly [latex]m[/latex] basic eigenvectors; that is, if and only if the general solution of the system [latex](A - I\lambda)\vec{x} = 0[/latex] has exactly [latex]m[/latex] parameters ([latex]A - I\lambda[/latex] has [latex]m[/latex] free variables).

 

Example 4: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}1 & 1 & 4 & 5\\0 & 1 & 2 & 6\\0 & 0 & -3 & 1\\0 & 0 & 0 & -3\end{bmatrix}[/latex].
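A possible solution sketch: [latex]A[/latex] is triangular with eigenvalues [latex]1[/latex] and [latex]-3[/latex], each of multiplicity [latex]2[/latex]. For [latex]\lambda = 1[/latex], the matrix [latex]A - I[/latex] has pivots in columns [latex]2, 3, 4[/latex], so only [latex]x_1[/latex] is free and [latex]\lambda = 1[/latex] yields only one basic eigenvector. Since [latex]1 < 2[/latex], [latex]A[/latex] is not diagonalizable.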

 

 

Exercise 4: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}2 & 0 & 0 & 0\\1 & 2 & 0 & 0\\1 & 1 & 3 & 0\\0 & 0 & 2 & -1\end{bmatrix}[/latex].

 

Example 5: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}3 & 0 & 4 & 5\\0 & 3 & 2 & 6\\0 & 0 & -2 & 0\\0 & 0 & 0 & 4\end{bmatrix}[/latex].
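A possible solution sketch: [latex]A[/latex] is triangular with eigenvalues [latex]3[/latex] (multiplicity [latex]2[/latex]), [latex]-2[/latex], and [latex]4[/latex]. For [latex]\lambda = 3[/latex], [latex]A - 3I[/latex] has two free variables ([latex]x_1[/latex] and [latex]x_2[/latex]), giving the two basic eigenvectors [latex]\begin{bmatrix}1\\0\\0\\0\end{bmatrix}[/latex] and [latex]\begin{bmatrix}0\\1\\0\\0\end{bmatrix}[/latex]; [latex]\lambda = -2[/latex] and [latex]\lambda = 4[/latex] give [latex]\begin{bmatrix}-4\\-2\\5\\0\end{bmatrix}[/latex] and [latex]\begin{bmatrix}5\\6\\0\\1\end{bmatrix}[/latex], respectively. So [latex]A[/latex] is diagonalizable, with [latex]D = \begin{bmatrix}3 & 0 & 0 & 0\\0 & 3 & 0 & 0\\0 & 0 & -2 & 0\\0 & 0 & 0 & 4\end{bmatrix}[/latex] and [latex]P[/latex] having these eigenvectors as columns.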

 

 

Exercise 5: Diagonalize the following matrix, if possible. [latex]A = \begin{bmatrix}2 & 0 & 0 & 0\\0 & 2 & 0 & 0\\1 & 0 & 3 & 0\\0 & 1 & 2 & -1\end{bmatrix}[/latex].

 

GroupWork 1: True or False. All matrices are [latex]n \times n[/latex].

a. [latex]A[/latex] is diagonalizable if [latex]A = PDP^{-1}[/latex] for some matrix [latex]D[/latex] and some invertible matrix [latex]P[/latex].

 

b. If [latex]\mathbb{R}^n[/latex] has [latex]n[/latex] eigenvectors of [latex]A[/latex] that form an invertible matrix, then [latex]A[/latex] is diagonalizable.

 

c. [latex]A[/latex] is diagonalizable if and only if [latex]A[/latex] has [latex]n[/latex] eigenvalues, counting multiplicities.

 

d. If [latex]A[/latex] is invertible, then [latex]A[/latex] is diagonalizable.

 

e. If [latex]\lambda[/latex] is an eigenvalue of [latex]A[/latex], then [latex]\lambda^2[/latex] is an eigenvalue of [latex]A^2[/latex].

 

GroupWork 2: Show that if [latex]A[/latex] is both diagonalizable and invertible, then so is [latex]A^{-1}[/latex].

 

GroupWork 3: Show that [latex]A[/latex] is diagonalizable if and only if [latex]A^T[/latex] is diagonalizable.

 

GroupWork 4: True or False. All matrices are [latex]n \times n[/latex]. Justify each answer.

a. [latex]A[/latex] is diagonalizable if [latex]A[/latex] has [latex]n[/latex] eigenvectors.

 

b. [latex]A[/latex] is diagonalizable if [latex]A[/latex] has [latex]n[/latex] distinct eigenvectors.

 

c. If [latex]AP = DP[/latex], with [latex]D[/latex] diagonal, then the nonzero columns of [latex]P[/latex] must be eigenvectors of [latex]A[/latex].

 

d. If [latex]A[/latex] is diagonalizable, then [latex]A[/latex] is invertible.

 

e. If [latex]A[/latex] and [latex]B[/latex] are diagonalizable matrices, then their sum [latex]A + B[/latex] is diagonalizable.

 

GroupWork 5: Construct a nonzero [latex]2 \times 2[/latex] matrix that is diagonalizable but not invertible.

 

GroupWork 6: Construct a nonzero [latex]2 \times 2[/latex] matrix that is invertible but not diagonalizable.

License


Matrices Copyright © 2019 by Kuei-Nuan Lin is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
