2.4 Linear independence

In this section, our focus turns to the uniqueness of solutions of a linear system, the second of the two fundamental questions posed in Question 1.4.2. This will lead us to the concept of linear independence.

Reading

Try out the Preview Activity and read Chapter 2 Section 4 in Understanding Linear Algebra by David Austin.

Definition 1: A set of vectors [latex]\{\vec{v}_1,\vec{v}_2,\dots,\vec{v}_n \}[/latex] is called linearly dependent if one of the vectors is a linear combination of the others. Otherwise, the set of vectors is called linearly independent.
Definition 2: A set of vectors [latex]\{\vec{v}_1,\vec{v}_2,\dots,\vec{v}_n \}[/latex] is called linearly dependent if the equation [latex]c_1\vec{v}_1+\dots+c_n\vec{v}_n =\vec{0}[/latex] has more than one solution. If the only solution is [latex]c_1=0, c_2=0, \dots, c_n=0[/latex], then the set of vectors is called linearly independent.

These two definitions are equivalent: if some [latex]\vec{v}_i[/latex] is a linear combination of the other vectors, moving it to the other side of that equation produces a solution of [latex]c_1\vec{v}_1+\dots+c_n\vec{v}_n =\vec{0}[/latex] with [latex]c_i=-1\neq 0[/latex], and conversely any solution with some [latex]c_i\neq 0[/latex] lets us solve for [latex]\vec{v}_i[/latex] in terms of the others.
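Definition 2 can be tested computationally. The following sketch (assuming NumPy is available; the vectors are chosen only for illustration) uses the fact that the homogeneous equation has only the trivial solution exactly when the rank of the matrix whose columns are the vectors equals the number of vectors.

```python
import numpy as np

# Three vectors in R^3, stored as the columns of a matrix A.
v1 = np.array([1, 0, 1])
v2 = np.array([0, 1, 1])
v3 = np.array([1, 1, 2])   # note v3 = v1 + v2, so the set is dependent
A = np.column_stack([v1, v2, v3])

# The columns are linearly independent exactly when the only solution
# of A c = 0 is c = 0, i.e. when rank(A) equals the number of columns.
independent = np.linalg.matrix_rank(A) == A.shape[1]
print(independent)  # False: v3 is a linear combination of v1 and v2
```

Replacing [latex]\vec{v}_3[/latex] with a vector that is not a combination of the first two, such as [latex](0,0,1)[/latex], makes the test print True.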


Proposition 2.4.2.

The columns of a matrix are linearly independent if and only if in its reduced row echelon form, every column contains a pivot.
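Proposition 2.4.2 can be checked directly by row reducing. A sketch (assuming SymPy is available; the matrix is an illustrative example, not taken from the text):

```python
import sympy as sp

A = sp.Matrix([[1, 0, 1],
               [0, 1, 1],
               [1, 1, 2]])

# rref() returns the reduced row echelon form together with the
# indices of the columns that contain a pivot.
R, pivot_cols = A.rref()

# By Proposition 2.4.2, the columns of A are linearly independent
# if and only if every column contains a pivot.
independent = len(pivot_cols) == A.cols
print(pivot_cols)   # (0, 1): the third column has no pivot
print(independent)  # False
```

Here the reduced row echelon form has pivots only in the first two columns, so the columns of this matrix are linearly dependent, consistent with the third column being the sum of the first two.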


License


Math 220, Matrices Copyright © 2018 by Kristen Pueschel is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
