# Section 4.2 Independence and Dimension

Definition: An indexed set of vectors $\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ in $\mathbb{R}^n$ is said to be linearly independent if the vector equation $x_{1}\vec{v_{1}} + \cdots + x_{p}\vec{v_{p}} = \vec{0}$ has only the trivial solution. The set $\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ in $\mathbb{R}^n$ is said to be linearly dependent if there exist weights $c_{1}, \cdots, c_{p}$, not all zero, such that $c_{1}\vec{v_{1}} + \cdots + c_{p}\vec{v_{p}} = \vec{0}$. The equation $c_{1}\vec{v_{1}} + \cdots + c_{p}\vec{v_{p}} = \vec{0}$ is called a linear dependence relation among $\vec{v_{1}}, \cdots, \vec{v_{p}}$.

Example 1: Determine if the set $\begin{Bmatrix}\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}\end{Bmatrix}$ is linearly independent. If possible, find a linear dependence relation among $\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}$. $\vec{v_{1}} = \begin{bmatrix}-1\\2\\3\end{bmatrix}$, $\vec{v_{2}} = \begin{bmatrix}4\\-1\\9\end{bmatrix}$, and $\vec{v_{3}} = \begin{bmatrix}2\\-4\\-6\end{bmatrix}$.
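Here $\vec{v_{3}} = -2\vec{v_{1}}$, so the set is linearly dependent. The resulting dependence relation can be checked numerically; a minimal sketch in plain Python (not part of the notes):

```python
# Sketch: verify the dependence relation 2*v1 + 0*v2 + 1*v3 = 0,
# which holds because v3 = -2*v1.
v1 = [-1, 2, 3]
v2 = [4, -1, 9]
v3 = [2, -4, -6]

relation = [2 * a + 0 * b + 1 * c for a, b, c in zip(v1, v2, v3)]
print(relation)  # [0, 0, 0] -> nontrivial weights (2, 0, 1) give the zero vector
```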

Exercise 1: Determine if the set $\begin{Bmatrix}\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}\end{Bmatrix}$ is linearly independent. If possible, find a linear dependence relation among $\vec{v_{1}},\vec{v_{2}},\vec{v_{3}}$. $\vec{v_{1}} = \begin{bmatrix}-2\\2\\-3\end{bmatrix}$, $\vec{v_{2}} = \begin{bmatrix}6\\-1\\4\end{bmatrix}$, and $\vec{v_{3}} = \begin{bmatrix}4\\-4\\6\end{bmatrix}$.

Note: 1. Given a matrix $A = \begin{bmatrix}\vec{v_{1}} & \cdots & \vec{v_{p}}\end{bmatrix}$ with $p$ columns, the matrix equation $A\vec{x} = \vec{0}$ can be written as $x_{1}\vec{v_{1}} + \cdots + x_{p}\vec{v_{p}} = \vec{0}$. Then each linear dependence relation among the columns of $A$ corresponds to a nontrivial solution of $A\vec{x} = \vec{0}$. Hence the columns of $A$ are linearly independent if and only if the equation $A\vec{x} = \vec{0}$ has only the trivial solution.

2. A set with only one vector is linearly independent if and only if that vector is not the zero vector. The set $\begin{Bmatrix}\vec{0}\end{Bmatrix}$ is linearly dependent.

3. A set with two vectors is linearly independent if and only if neither vector is a multiple of the other.

Example 2: Show the column set of $A$ is a linearly independent set. $A = \begin{bmatrix}2 & 0 & 1\\1 & -1 & 0\\-1 & 2 & 1\end{bmatrix}$.
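Since $A$ is square, a quick way to confirm independence is a nonzero determinant (equivalently, $A\vec{x} = \vec{0}$ has only the trivial solution). A minimal check in plain Python (a sketch, not part of the notes):

```python
def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[2, 0, 1], [1, -1, 0], [-1, 2, 1]]
d = det3(A)
print(d)  # -1: nonzero, so the columns of A are linearly independent
```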

Exercise 2: Show the column set of $A$ is a linearly independent set. $A = \begin{bmatrix}1 & 2 & 1\\0 & -1 & 2\\-1 & -2 & 0\end{bmatrix}$.

Theorem: If $S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ is a linearly independent set of vectors in $\mathbb{R}^n$, then every vector in Span$\begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ has a unique representation as a linear combination of the $\vec{v_{i}}$.

Note: Geometrically, any two vectors in $\mathbb{R}^n$ with $n > 1$ that are not multiples of each other span a plane (they are not collinear). Three vectors form a linearly independent set if and only if they are not coplanar.

Theorem: If a set contains more vectors than there are entries in each vector, then the set is linearly dependent. That is, any set $S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ in $\mathbb{R}^n$ is linearly dependent if $p > n$.

Theorem: If $S = \begin{Bmatrix}\vec{v_{1}}, \cdots, \vec{v_{p}}\end{Bmatrix}$ in $\mathbb{R}^n$ contains the zero vector, then it is linearly dependent.

Example 3: Use inspection to decide if each vector set is linearly independent. State the reasoning.

(a) $\begin{bmatrix}2\\1\\-1\end{bmatrix}$, $\begin{bmatrix}-4\\-2\\2\end{bmatrix}$.

(b) $\begin{bmatrix}2\\1\\-1\end{bmatrix}$, $\begin{bmatrix}3\\4\\5\end{bmatrix}$, $\begin{bmatrix}0\\0\\0\end{bmatrix}$.

(c) $\begin{bmatrix}2\\1\\-1\end{bmatrix}$, $\begin{bmatrix}3\\4\\5\end{bmatrix}$, $\begin{bmatrix}-2\\6\\-7\end{bmatrix}$, $\begin{bmatrix}0\\2\\1\end{bmatrix}$.
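Each part follows from one of the quick tests above; a sketch of the three checks in plain Python (not part of the notes):

```python
# (a) two vectors, one a scalar multiple of the other -> dependent
u, w = [2, 1, -1], [-4, -2, 2]
a_dependent = all(-2 * x == y for x, y in zip(u, w))   # w = -2*u

# (b) the set contains the zero vector -> dependent
vecs_b = [[2, 1, -1], [3, 4, 5], [0, 0, 0]]
b_dependent = any(all(x == 0 for x in v) for v in vecs_b)

# (c) four vectors in R^3: p = 4 > n = 3 -> dependent
vecs_c = [[2, 1, -1], [3, 4, 5], [-2, 6, -7], [0, 2, 1]]
c_dependent = len(vecs_c) > len(vecs_c[0])

print(a_dependent, b_dependent, c_dependent)  # True True True
```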

Exercise 3: Use inspection to decide if each vector set is linearly independent. State the reasoning.

(a) $\begin{bmatrix}2\\1\end{bmatrix}$, $\begin{bmatrix}-4\\-2\end{bmatrix}$, $\begin{bmatrix}4\\3\end{bmatrix}$.

(b) $\begin{bmatrix}2\\1\\0\\-1\end{bmatrix}$, $\begin{bmatrix}3\\2\\1\\5\end{bmatrix}$, $\begin{bmatrix}0\\0\\0\\0\end{bmatrix}$, $\begin{bmatrix}2\\1\\-1\\3\end{bmatrix}$.

(c) $\begin{bmatrix}2\\1\\-1\end{bmatrix}$, $\begin{bmatrix}3\\-6\\9\end{bmatrix}$, $\begin{bmatrix}-2\\4\\-6\end{bmatrix}$.

Theorem: The following are equivalent for an $n \times n$ matrix $A$:

1. $A$ is invertible.

2. The columns of $A$ are linearly independent.

3. The columns of $A$ span $\mathbb{R}^n$.

4. The rows of $A$ are linearly independent.

5. The rows of $A$ span $\mathbb{R}^n$.

Example 4: Find the value of $h$ such that the columns of $A = \begin{bmatrix}1 & -3 & 2\\1 & 2 & h\\-5 & -5 & 6\end{bmatrix}$ are linearly dependent.
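The columns are dependent exactly when $\det A = 0$. Since $h$ appears in a single entry, $\det A$ is linear in $h$, so it can be recovered from two evaluations and solved exactly. A sketch in plain Python (the `det3` helper is an assumed utility, not from the notes):

```python
from fractions import Fraction

def det3(m):
    # Cofactor expansion along the first row of a 3x3 matrix.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def detA(h):
    return det3([[1, -3, 2], [1, 2, h], [-5, -5, 6]])

d0, d1 = detA(0), detA(1)      # det A = d0 + (d1 - d0)*h, linear in h
h = Fraction(-d0, d1 - d0)     # root of that linear polynomial
print(h)                       # -2, and detA(-2) == 0
```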

Exercise 4: Find the value of $h$ such that the columns of $A = \begin{bmatrix}2 & -3 & h\\1 & -2 & 2\\-5 & 1 & 6\end{bmatrix}$ are linearly dependent.

Definition: A basis for a subspace $H$ of $\mathbb{R}^n$ is a linearly independent set in $H$ that spans $H$.

Fact: The columns of an invertible $n \times n$ matrix form a basis of $\mathbb{R}^n$ because they are linearly independent and span $\mathbb{R}^n$.

Definition: The $n \times n$ identity matrix has columns $\vec{e_{1}} = \begin{bmatrix}1\\0\\0\\\vdots\\0\end{bmatrix}, \vec{e_{2}} = \begin{bmatrix}0\\1\\0\\\vdots\\0\end{bmatrix}, \cdots, \vec{e_{n}} = \begin{bmatrix}0\\0\\0\\\vdots\\1\end{bmatrix}$, which form a basis of $\mathbb{R}^n$. The set $\begin{Bmatrix}\vec{e_{1}}, \cdots, \vec{e_{n}}\end{Bmatrix}$ is called the standard basis of $\mathbb{R}^n$.

Theorem: The pivot columns of a matrix $A$ form a basis for the column space of $A$.
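A sketch illustrating the theorem, using a small made-up matrix (not from the notes): row-reduce, record which columns get pivots, and take those *original* columns of $A$ as a basis for Col$A$.

```python
from fractions import Fraction

# Hypothetical example matrix: column 2 is twice column 1, and
# column 4 is a combination of columns 1 and 3.
A = [[1, 2, 0, 4],
     [2, 4, 1, 3],
     [3, 6, 1, 7]]

# Gauss-Jordan elimination with exact arithmetic, tracking pivot columns.
m = [[Fraction(x) for x in row] for row in A]
pivot_cols, r = [], 0
for c in range(len(m[0])):
    piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
    if piv is None:
        continue
    m[r], m[piv] = m[piv], m[r]
    for i in range(len(m)):
        if i != r and m[i][c] != 0:
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
    pivot_cols.append(c)
    r += 1

# Basis for Col A: the pivot columns of the ORIGINAL matrix A.
basis = [[row[c] for row in A] for c in pivot_cols]
print(pivot_cols, basis)  # [0, 2] [[1, 2, 3], [0, 1, 1]]
```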

Definition: The dimension of a nonzero subspace $H$, denoted by dim$H$, is the number of vectors in any basis for $H$. The dimension of the zero subspace $\begin{Bmatrix}\vec{0}\end{Bmatrix}$ is defined to be zero.

The Basis Theorem: Let $H$ be a $p$-dimensional subspace of $\mathbb{R}^n$. Any linearly independent set of exactly $p$ vectors in $H$ is automatically a basis for $H$. Also, any set of $p$ vectors of $H$ that spans $H$ is automatically a basis for $H$.

Example 5: Find a basis and calculate the dimension of the following subspaces of $\mathbb{R}^4$.

$U=\left\{\begin{bmatrix}a\\a+b\\a-c\\b\end{bmatrix}|a,b,c\text{ in }\mathbb{R}\right\}$

$V=\left\{\begin{bmatrix}a\\b\\c\\d\end{bmatrix}\middle|\;a,b,c,d\text{ in }\mathbb{R},\ a + b - c + 2d = 0\right\}$
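For $U$, each vector can be written $a(1,1,1,0)+b(0,1,0,1)+c(0,0,-1,0)$; for $V$, solving $a = -b + c - 2d$ gives three spanning vectors. A rank computation confirms each spanning set is independent, so both subspaces have dimension 3. A sketch with exact arithmetic in plain Python (the `rank` helper and the two parameterizations are my reading of the definitions, not from the notes):

```python
from fractions import Fraction

def rank(rows):
    """Row-reduce with exact arithmetic and count the pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# U: (a, a+b, a-c, b) = a(1,1,1,0) + b(0,1,0,1) + c(0,0,-1,0)
U_span = [[1, 1, 1, 0], [0, 1, 0, 1], [0, 0, -1, 0]]
# V: a = -b + c - 2d gives one spanning vector per free variable b, c, d
V_span = [[-1, 1, 0, 0], [1, 0, 1, 0], [-2, 0, 0, 1]]

dim_U, dim_V = rank(U_span), rank(V_span)
print(dim_U, dim_V)  # 3 3
```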

Exercise 5: Find a basis and calculate the dimension of the following subspaces of $\mathbb{R}^4$.

$U=\left\{\begin{bmatrix}a\\a+b\\a-2c\\c\end{bmatrix}|a,b,c\text{ in }\mathbb{R}\right\}$

$V=\left\{\begin{bmatrix}a\\b\\c\\d\end{bmatrix}\middle|\;a,b,c,d\text{ in }\mathbb{R},\ a - 2b + c + d = 0\right\}$

Group Work 1: Mark each statement True or False. Justify each answer.

a. If $B$ is an echelon form of a matrix $A$, then the pivot columns of $B$ form a basis for Col$A$.

b. Row operations do not affect linear dependence relations among the
columns of a matrix.

c. The columns of a matrix $A$ are linearly independent if the equation $A\vec{x} = \vec{0}$ has the trivial solution.

d. The columns of any $4 \times 5$ matrix are linearly dependent.

e. If $\vec{u}$ and $\vec{v}$ are linearly independent and if $\vec{w}$ is in Span$\left\{ \vec{u},\vec{v}\right \}$ then $\left\{ \vec{u},\vec{v},\vec{w}\right \}$ is linearly dependent.

f. If three vectors in $\mathbb{R}^3$ lie on the same plane then they are linearly dependent.

g. If a set contains fewer vectors than there are entries in the vectors, then the set is linearly independent.

h. If a set in $\mathbb{R}^n$ is linearly dependent then it contains more than $n$ vectors.

Group Work 2: Describe the possible echelon forms of the matrix.

(a) $A$ is a $2 \times 2$ matrix with linearly independent columns.

(b) $A$ is a $4 \times 2$ matrix such that the first column is a multiple of the second column.

Group Work 3: In each case, show that the statement is true or give an example showing that it is false.

a. If $\left\{ \vec{u},\vec{v}\right \}$ is independent, then $\left\{ \vec{u},\vec{v}, \vec{u} + \vec{v}\right \}$ is independent.

b. If $\left\{ \vec{u},\vec{v},\vec{w}\right \}$ is independent, then $\left\{ \vec{u},\vec{v}\right \}$ is independent.

c. If $\left\{ \vec{u},\vec{v}\right \}$ is dependent, then $\left\{ \vec{u},\vec{v},\vec{w}\right \}$ is dependent for any $\vec{w}$.

Group Work 4: How many pivot columns must a $6 \times 4$ matrix have if its columns are linearly independent?

Group Work 5: How many pivot columns must a $4 \times 6$ matrix have if its columns span $\mathbb{R}^4$? Why?