Section 4.2 Independence and Dimension

Definition: An indexed set of vectors [latex]\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] in [latex]\mathbb{R}^n[/latex] is said to be linearly independent if the vector equation [latex]x_{1}\vec{v_{1}} + \cdots + x_{p}\vec{v_{p}} = \vec{0}[/latex] in [latex]\mathbb{R}^n[/latex] has only the trivial solution. [latex]\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] in [latex]\mathbb{R}^n[/latex] is said to be linearly dependent if there are weights [latex]c_{1}, \cdots, c_{p}[/latex], not all zero, such that [latex]c_{1}\vec{v_{1}} + \cdots + c_{p}\vec{v_{p}} = \vec{0}[/latex]. The equation [latex]c_{1}\vec{v_{1}} + \cdots + c_{p}\vec{v_{p}} = \vec{0}[/latex] is called a linear dependence relation among [latex]\vec{v_{1}}, \cdots, \vec{v_{p}}[/latex].

 

Example 1: Determine if the set [latex]\begin{Bmatrix}\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}\end{Bmatrix}[/latex] is linearly independent. If possible, find a linear dependence relation among [latex]\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}[/latex]. [latex]\vec{v_{1}} = \begin{bmatrix}-1\\2\\3\end{bmatrix}[/latex], [latex]\vec{v_{2}} = \begin{bmatrix}4\\-1\\9\end{bmatrix}[/latex], and [latex]\vec{v_{3}} = \begin{bmatrix}2\\-4\\-6\end{bmatrix}[/latex].
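A solution sketch for Example 1: observe that [latex]\vec{v_{3}} = -2\vec{v_{1}}[/latex], so the set is linearly dependent, and one linear dependence relation is [latex]2\vec{v_{1}} + 0\vec{v_{2}} + \vec{v_{3}} = \vec{0}[/latex].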

 

Exercise 1: Determine if the set [latex]\begin{Bmatrix}\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}\end{Bmatrix}[/latex] is linearly independent. If possible, find a linear dependence relation among [latex]\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}[/latex]. [latex]\vec{v_{1}} = \begin{bmatrix}-2\\2\\-3\end{bmatrix}[/latex], [latex]\vec{v_{2}} = \begin{bmatrix}6\\-1\\4\end{bmatrix}[/latex], and [latex]\vec{v_{3}} = \begin{bmatrix}4\\-4\\6\end{bmatrix}[/latex].

 

Note: 1. Given a matrix [latex]A = \begin{bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{bmatrix}[/latex] with [latex]p[/latex] columns, the matrix equation [latex]A\vec{x} = \vec{0}[/latex] can be written as [latex]x_{1}\vec{v_{1}} + \cdots + x_{p}\vec{v_{p}} = \vec{0}[/latex]. Then each linear dependence relation among the columns of [latex]A[/latex] corresponds to a nontrivial solution of [latex]A\vec{x} = \vec{0}[/latex]. Hence the columns of matrix [latex]A[/latex] are linearly independent if and only if the equation [latex]A\vec{x} = \vec{0}[/latex] has only the trivial solution.
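For instance, with [latex]A = \begin{bmatrix}1 & 2\\2 & 4\end{bmatrix}[/latex], the nontrivial solution [latex]\vec{x} = \begin{bmatrix}-2\\1\end{bmatrix}[/latex] of [latex]A\vec{x} = \vec{0}[/latex] gives the linear dependence relation [latex]-2\begin{bmatrix}1\\2\end{bmatrix} + \begin{bmatrix}2\\4\end{bmatrix} = \vec{0}[/latex] among the columns of [latex]A[/latex].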

 

2. A set containing only one vector is linearly independent if and only if that vector is not the zero vector. The set containing only the zero vector is linearly dependent.

 

3. A set of two vectors is linearly independent if and only if neither vector is a multiple of the other.

 

Example 2: Show that the columns of [latex]A[/latex] form a linearly independent set. [latex]A = \begin{bmatrix}2 & 0 & 1\\1 & -1 & 0\\-1 & 2 & 1\end{bmatrix}[/latex].
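A sketch of one approach for Example 2: row reducing [latex]A[/latex] produces a pivot in every column (equivalently, [latex]\det A = -1 \neq 0[/latex]), so [latex]A\vec{x} = \vec{0}[/latex] has only the trivial solution and the columns are linearly independent.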

 

Exercise 2: Show that the columns of [latex]A[/latex] form a linearly independent set. [latex]A = \begin{bmatrix}1 & 2 & 1\\0 & -1 & 2\\-1 & -2 & 0\end{bmatrix}[/latex].

 

Theorem: If [latex]S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] is a linearly independent set of vectors in [latex]\mathbb{R}^n[/latex], then every vector in Span[latex]\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] has a unique representation as a linear combination of the [latex]\vec{v_{i}}[/latex].

 

Note: Geometrically, any two vectors in [latex]\mathbb{R}^n[/latex] with [latex]n > 1[/latex] that are not multiples of each other span a plane (they are not collinear). Any three vectors form a linearly independent set if they do not lie in a common plane (they are not coplanar).

 

Theorem: If a set contains more vectors than there are entries in each vector, then the set is linearly dependent. That is, any set [latex]S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] in [latex]\mathbb{R}^n[/latex] is linearly dependent if [latex]p > n[/latex].

 

Theorem: If [latex]S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}[/latex] in [latex]\mathbb{R}^n[/latex] contains the zero vector, then it is linearly dependent.

 

Example 3: Use inspection to decide if each vector set is linearly independent. State the reasoning. (A reasoning sketch follows part (c).)

(a) [latex]\begin{bmatrix}2\\1\\-1\end{bmatrix}[/latex], [latex]\begin{bmatrix}-4\\-2\\2\end{bmatrix}[/latex].

(b) [latex]\begin{bmatrix}2\\1\\-1\end{bmatrix}[/latex], [latex]\begin{bmatrix}3\\4\\5\end{bmatrix}[/latex], [latex]\begin{bmatrix}0\\0\\0\end{bmatrix}[/latex].

(c) [latex]\begin{bmatrix}2\\1\\-1\end{bmatrix}[/latex], [latex]\begin{bmatrix}3\\4\\5\end{bmatrix}[/latex], [latex]\begin{bmatrix}-2\\6\\-7\end{bmatrix}[/latex], [latex]\begin{bmatrix}0\\2\\1\end{bmatrix}[/latex].
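A reasoning sketch for Example 3: in (a) the second vector is [latex]-2[/latex] times the first, so the set is linearly dependent; in (b) the set contains the zero vector, so it is linearly dependent; in (c) there are four vectors in [latex]\mathbb{R}^3[/latex], so by the theorem above ([latex]p > n[/latex]) the set is linearly dependent.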

 

Exercise 3: Use inspection to decide if each vector set is linearly independent. State the reasoning.

(a) [latex]\begin{bmatrix}2\\1\end{bmatrix}[/latex], [latex]\begin{bmatrix}-4\\-2\end{bmatrix}[/latex], [latex]\begin{bmatrix}4\\3\end{bmatrix}[/latex].

(b) [latex]\begin{bmatrix}2\\1\\0\\-1\end{bmatrix}[/latex], [latex]\begin{bmatrix}3\\2\\1\\5\end{bmatrix}[/latex], [latex]\begin{bmatrix}0\\0\\0\\0\end{bmatrix}[/latex], [latex]\begin{bmatrix}2\\1\\-1\\3\end{bmatrix}[/latex].

(c) [latex]\begin{bmatrix}2\\1\\-1\end{bmatrix}[/latex], [latex]\begin{bmatrix}3\\-6\\9\end{bmatrix}[/latex], [latex]\begin{bmatrix}-2\\4\\-6\end{bmatrix}[/latex].

 

Theorem: The following are equivalent for an [latex]n \times n[/latex] matrix [latex]A[/latex]:

1. [latex]A[/latex] is invertible.

2. The columns of [latex]A[/latex] are linearly independent.

3. The columns of [latex]A[/latex] span [latex]\mathbb{R}^n[/latex].

4. The rows of [latex]A[/latex] are linearly independent.

5. The rows of [latex]A[/latex] span the set of all [latex]1 \times n[/latex] rows.

 

Example 4: Find the value of [latex]h[/latex] such that the columns of [latex]A = \begin{bmatrix}1 & -3 & 2\\1 & 2 & h\\-5 & -5 & 6\end{bmatrix}[/latex] are linearly dependent.
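A sketch of one approach for Example 4: the columns are linearly dependent exactly when [latex]\det A = 0[/latex]. Expanding along the first row, [latex]\det A = 1(12 + 5h) + 3(6 + 5h) + 2(-5 + 10) = 20h + 40[/latex], so the columns are linearly dependent when [latex]h = -2[/latex].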

 

Exercise 4: Find the value of [latex]h[/latex] such that the columns of [latex]A = \begin{bmatrix}2 & -3 & h\\1 & -2 & 2\\-5 & 1 & 6\end{bmatrix}[/latex] are linearly dependent.

 

Definition: A basis for a subspace [latex]H[/latex] of [latex]\mathbb{R}^n[/latex] is a linearly independent set in [latex]H[/latex] that spans [latex]H[/latex].

 

Fact: The columns of an invertible [latex]n \times n[/latex] matrix form a basis of [latex]\mathbb{R}^n[/latex] because they are linearly independent and span [latex]\mathbb{R}^n[/latex].

 

Definition: The [latex]n \times n[/latex] identity matrix has columns [latex]\vec{e_{1}} = \begin{bmatrix}1\\0\\0\\\vdots\\0\end{bmatrix}, \vec{e_{2}} = \begin{bmatrix}0\\1\\0\\\vdots\\0\end{bmatrix}, \cdots, \vec{e_{n}} = \begin{bmatrix}0\\0\\0\\\vdots\\1\end{bmatrix}[/latex], which form a basis of [latex]\mathbb{R}^n[/latex]. The set [latex]\begin{Bmatrix}\vec{e_{1}}, \cdots, \vec{e_{n}}\end{Bmatrix}[/latex] is called the standard basis of [latex]\mathbb{R}^n[/latex].

 

Theorem: The pivot columns of a matrix [latex]A[/latex] form a basis for the column space of [latex]A[/latex].

 

Definition: The dimension of a nonzero subspace [latex]H[/latex], denoted by dim[latex]H[/latex], is the number of vectors in any basis for [latex]H[/latex]. The dimension of the zero subspace [latex]\begin{Bmatrix}\vec{0}\end{Bmatrix}[/latex] is defined to be zero.

 

The Basis Theorem: Let [latex]H[/latex] be a [latex]p[/latex]-dimensional subspace of [latex]\mathbb{R}^n[/latex]. Any linearly independent set of exactly [latex]p[/latex] elements in [latex]H[/latex] is automatically a basis for [latex]H[/latex]. Also, any set of [latex]p[/latex] elements of [latex]H[/latex] that spans [latex]H[/latex] is automatically a basis for [latex]H[/latex].

 

Example 5: Find a basis and calculate the dimension of the following subspaces of [latex]\mathbb{R}^4[/latex].

 

[latex]U=\left\{\begin{bmatrix}a\\a+b\\a-c\\b\end{bmatrix}|a,b,c\text{ in }\mathbb{R}\right\}[/latex]

[latex]V=\left\{\begin{bmatrix}a\\b\\c\\d\end{bmatrix}|a + b - c + 2d = 0;\ a,b,c,d\text{ in }\mathbb{R}\right\}[/latex]
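A solution sketch for Example 5: every vector in [latex]U[/latex] can be written as [latex]a\begin{bmatrix}1\\1\\1\\0\end{bmatrix} + b\begin{bmatrix}0\\1\\0\\1\end{bmatrix} + c\begin{bmatrix}0\\0\\-1\\0\end{bmatrix}[/latex], and these three vectors are linearly independent, so they form a basis of [latex]U[/latex] and dim[latex]U = 3[/latex]. For [latex]V[/latex], solving the constraint for [latex]a[/latex] gives [latex]a = -b + c - 2d[/latex], so every vector in [latex]V[/latex] is [latex]b\begin{bmatrix}-1\\1\\0\\0\end{bmatrix} + c\begin{bmatrix}1\\0\\1\\0\end{bmatrix} + d\begin{bmatrix}-2\\0\\0\\1\end{bmatrix}[/latex]; these three vectors are linearly independent, so dim[latex]V = 3[/latex].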

 

Exercise 5: Find a basis and calculate the dimension of the following subspaces of [latex]\mathbb{R}^4[/latex].

 

[latex]U=\left\{\begin{bmatrix}a\\a+b\\a-2c\\c\end{bmatrix}|a,b,c\text{ in }\mathbb{R}\right\}[/latex]

[latex]V=\left\{\begin{bmatrix}a\\b\\c\\d\end{bmatrix}|a - 2b + c + d = 0;\ a,b,c,d\text{ in }\mathbb{R}\right\}[/latex]

 

Group Work 1: Mark each statement True or False. Justify each answer.

a. If [latex]B[/latex] is an echelon form of a matrix [latex]A[/latex], then the pivot columns of [latex]B[/latex] form a basis for Col[latex]A[/latex].

 

b. Row operations do not affect linear dependence relations among the
columns of a matrix.

 

c. The columns of a matrix [latex]A[/latex] are linearly independent if the equation [latex]A\vec{x} = \vec{0}[/latex] has the trivial solution.

 

d. The columns of any [latex]4 \times 5[/latex] matrix are linearly dependent.

 

e. If [latex]\vec{u}[/latex] and [latex]\vec{v}[/latex] are linearly independent and if [latex]\vec{w}[/latex] is in Span[latex]\left\{ \vec{u},\vec{v}\right \}[/latex] then [latex]\left\{ \vec{u},\vec{v},\vec{w}\right \}[/latex] is linearly dependent.

 

f. If three vectors in [latex]\mathbb{R}^3[/latex] lie on the same plane, then they are linearly dependent.

 

g. If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent.

 

h. If a set in [latex]\mathbb{R}^n[/latex] is linearly dependent, then it contains more than [latex]n[/latex] vectors.

 

Group Work 2: Describe the possible echelon forms of the matrix.

(a) [latex]A[/latex] is a [latex]2 \times 2[/latex] matrix with linearly independent columns.

 

(b) [latex]A[/latex] is a [latex]4 \times 2[/latex] matrix such that the first column is a multiple of the second column.

 

Group Work 3: In each case, show that the statement is true or give an example showing that it is false.

a. If [latex]\left\{ \vec{u},\vec{v}\right \}[/latex] is independent, then [latex]\left\{ \vec{u},\vec{v}, \vec{u} + \vec{v}\right \}[/latex] is independent.

 

b. If [latex]\left\{ \vec{u},\vec{v},\vec{w}\right \}[/latex] is independent, then [latex]\left\{ \vec{u},\vec{v}\right \}[/latex] is independent.

 

c. If [latex]\left\{ \vec{u},\vec{v}\right \}[/latex] is dependent, then [latex]\left\{ \vec{u},\vec{v},\vec{w}\right \}[/latex] is dependent for any [latex]\vec{w}[/latex].

 

Group Work 4: How many pivot columns must a [latex]6 \times 4[/latex] matrix have if its columns are linearly independent?

Group Work 5: How many pivot columns must a [latex]4 \times 6[/latex] matrix have if its columns span [latex]\mathbb{R}^4[/latex]? Why?

License


Matrices Copyright © by Kuei-Nuan Lin is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
