PAGE 1

4.3 Linear Combinations and Linear Independence

vectors \( \vec{v}_1, \vec{v}_2, \vec{v}_3, \dots, \vec{v}_n \)

they are linearly independent if and only if

\[ c_1\vec{v}_1 + c_2\vec{v}_2 + \dots + c_n\vec{v}_n = \vec{0} \text{ implies } \underline{\text{ALL } c_i \text{ are zero}} \]

if these vectors are the columns of a matrix, then the vectors are linearly indp if and only if there are as many pivots as there are columns

otherwise, the equation

\[ \underbrace{[\vec{v}_1 \, \vec{v}_2 \, \dots \, \vec{v}_n]}_{\text{matrix w/ } \vec{v}_i \text{ as columns}} \begin{bmatrix} c_1 \\ c_2 \\ \vdots \\ c_n \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{bmatrix} \]

will have more than one solution; linearly indp means the unique solution \( c_1 = c_2 = \dots = c_n = 0 \) (the trivial solution)
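The pivot test above can be sketched in numpy (a helper of my own naming, not from the notes; numpy's matrix rank equals the number of pivots):

```python
import numpy as np

def is_independent(vectors):
    """Return True if the given vectors (as columns) are linearly independent."""
    A = np.column_stack(vectors)
    # Independent <=> as many pivots (rank) as columns.
    return np.linalg.matrix_rank(A) == A.shape[1]

print(is_independent([[1, 3], [2, 4]]))  # True  (two pivots, two columns)
print(is_independent([[1, 1], [2, 2]]))  # False (only one pivot)
```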

PAGE 2

for example, \( \vec{v}_1 = \begin{bmatrix} 1 \\ 3 \end{bmatrix} \) \( \vec{v}_2 = \begin{bmatrix} 2 \\ 4 \end{bmatrix} \)

\[ [\vec{v}_1 \, \vec{v}_2] = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \xrightarrow{\text{reduction}} \begin{bmatrix} \boxed{1} & 2 \\ 0 & \boxed{-2} \end{bmatrix} \text{ two pivots} \]

that means

\[ \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]
\[ \begin{bmatrix} 1 & 2 & 0 \\ 3 & 4 & 0 \end{bmatrix} \to \begin{bmatrix} \overset{c_1}{\boxed{1}} & \overset{c_2}{2} & 0 \\ 0 & \boxed{-2} & 0 \end{bmatrix} \]

both \( c_1 \) and \( c_2 \) are zero and only zero (no free variables)

so, \( \begin{bmatrix} 1 \\ 3 \end{bmatrix} \) and \( \begin{bmatrix} 2 \\ 4 \end{bmatrix} \) are linearly indp

Given vectors, to find out whether they are linearly indp, put them in as cols of a matrix, then count pivots
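The recipe above, applied to this page's example as a small numpy sketch (values from the notes; the single row operation is done explicitly):

```python
import numpy as np

# Columns are v1 = [1,3] and v2 = [2,4].
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A[1] -= 3 * A[0]                 # R2 <- R2 - 3*R1, the reduction step
print(A)                         # [[ 1.  2.] [ 0. -2.]] -> two pivots

# Two pivots, two columns: only the trivial solution, so v1, v2 are indp.
rank = np.linalg.matrix_rank(A)
print(rank)                      # 2
```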

PAGE 3

for example, \( \vec{v_1} = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix}, \vec{v_2} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \vec{v_3} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}, \vec{v_4} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \)

\[ \begin{bmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 2 \\ 1 & 0 & 1 & 3 \end{bmatrix} \rightarrow \dots \rightarrow \begin{bmatrix} 1 & 0 & 0 & 1 \\ 0 & 1 & 0 & 2 \\ 0 & 0 & 1 & 2 \end{bmatrix} \]

3 pivots, 4 vectors \( \rightarrow c_1\vec{v_1} + c_2\vec{v_2} + c_3\vec{v_3} + c_4\vec{v_4} = \vec{0} \)
has free variables \( \rightarrow \) multiple solutions

So, these vectors are NOT linearly indp

  • this set of FOUR vectors is dependent
    (one can be expressed as linear combo of others)
  • BUT, a subset of these four may be indp
    (e.g. \( \vec{v_1}, \vec{v_2}, \vec{v_3} \) ARE indp)

notice the max # of pivots is the # of rows \( \rightarrow \) # of components in each vector

\( \rightarrow \) if there are more vectors than # of components, the vectors are automatically linearly dependent
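The automatic-dependence rule can be checked on the four vectors above (a numpy sketch, not part of the notes; the explicit combination is read off the reduced fourth column \([1, 2, 2]^T\)):

```python
import numpy as np

# 4 vectors in R^3 give a 3x4 matrix, so at most 3 pivots:
# they MUST be dependent.
v1, v2, v3, v4 = [1, 0, 1], [0, 1, 0], [0, 0, 1], [1, 2, 3]
A = np.column_stack([v1, v2, v3, v4])    # 3x4 matrix
rank = np.linalg.matrix_rank(A)
print(rank)                              # 3 pivots, 4 columns -> dependent

# The dependence made explicit: v4 = 1*v1 + 2*v2 + 2*v3.
print(np.allclose(A[:, :3] @ np.array([1, 2, 2]), v4))  # True
```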

PAGE 4

BUT, if there are as many vectors as components, or fewer, then the vectors may or may not be indp.

for example, \( \vec{v_1} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \vec{v_2} = \begin{bmatrix} 2 \\ 2 \end{bmatrix} \)

two \( \mathbb{R}^2 \) vectors (same # components and vectors)

\[ \begin{bmatrix} 1 & 2 \\ 1 & 2 \end{bmatrix} \rightarrow \begin{bmatrix} 1 & 2 \\ 0 & 0 \end{bmatrix} \]

one pivot so NOT indp

if \( [\vec{v_1} \, \vec{v_2} \, \dots \, \vec{v_n}] \) is square, then indp \( \Leftrightarrow \) determinant is NONzero
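The determinant test can be sketched in numpy (matrices reused from the earlier examples; this is a check on the pivot count, not a replacement for it):

```python
import numpy as np

# Nonzero determinant <=> independent columns (square matrices only).
A = np.array([[1, 2],
              [3, 4]])   # columns [1,3] and [2,4]: det = 1*4 - 2*3 = -2
B = np.array([[1, 2],
              [1, 2]])   # columns [1,1] and [2,2]: det = 0 -> dependent
print(np.linalg.det(A))  # approximately -2.0 -> independent
print(np.linalg.det(B))  # approximately  0.0 -> dependent
```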

PAGE 5

Example: Linear Independence of Vectors

Consider the following three vectors in \(\mathbb{R}^4\):

\[ \vec{v}_1 = \begin{bmatrix} -1 \\ -17 \\ -3 \\ 9 \end{bmatrix}, \quad \vec{v}_2 = \begin{bmatrix} 14 \\ 7 \\ 2 \\ -2 \end{bmatrix}, \quad \vec{v}_3 = \begin{bmatrix} 15 \\ 5 \\ 1 \\ -2 \end{bmatrix} \]

Since we have 3 \(\mathbb{R}^4\) vectors, they may or may not be independent.

\[ \begin{bmatrix} -1 & 14 & 15 \\ -17 & 7 & 5 \\ -3 & 2 & 1 \\ 9 & -2 & -2 \end{bmatrix} \rightarrow \dots \rightarrow \begin{bmatrix} \boxed{-1} & 14 & 15 \\ 0 & \boxed{-231} & -250 \\ 0 & 0 & \boxed{11} \\ 0 & 0 & 0 \end{bmatrix} \]

3 pivots

A row of zeros often suggests free variables, but free variables are assigned to variables whose COLUMNS have no pivot.

Here, every column has a pivot, so despite the zero row, there are NO free variables.

3 pivots, 3 vectors \(\rightarrow\) they ARE linearly independent.
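The worked example can be double-checked with numpy (a sketch, not part of the notes; the rank equals the number of pivots, so the zero row does not matter):

```python
import numpy as np

# The 4x3 matrix of the three R^4 vectors above.
A = np.array([[ -1,  14,  15],
              [-17,   7,   5],
              [ -3,   2,   1],
              [  9,  -2,  -2]])
rank = np.linalg.matrix_rank(A)
print(rank)   # 3 pivots, 3 columns -> linearly independent
```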

PAGE 6

The Span of a Set of Vectors

Next, we discuss the span of a set of vectors.

Span: all the "things" you can make with the given vectors, i.e., the set of all their linear combinations.

Often we talk about the space spanned by the vectors.

Example in \(\mathbb{R}^2\)

For example, consider the standard basis vectors:

\[ \vec{i} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad \vec{j} = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \]

It is clear we can make every possible \(\mathbb{R}^2\) vector using linear combinations of these:

\[ \begin{bmatrix} a \\ b \end{bmatrix} = a \begin{bmatrix} 1 \\ 0 \end{bmatrix} + b \begin{bmatrix} 0 \\ 1 \end{bmatrix} \]

So, we say that \(\vec{i}\) and \(\vec{j}\) span \(\mathbb{R}^2\).

Example in \(\mathbb{R}^3\)

Likewise, the vectors:

\[ \vec{i} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \quad \vec{j} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \quad \vec{k} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \text{ span } \mathbb{R}^3 \]

Written in set notation:

\[ \text{span} \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} = \mathbb{R}^3 \]
PAGE 7

Linear Algebra: Spanning Sets

Example 1: Linearly Dependent Vectors

What is \( \text{span} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -2 \\ -2 \end{bmatrix} \right\} \)?

span: linear combos

\[ a \begin{bmatrix} 1 \\ 1 \end{bmatrix} + b \begin{bmatrix} -2 \\ -2 \end{bmatrix} = (a - 2b) \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

because \( \begin{bmatrix} -2 \\ -2 \end{bmatrix} \) is a multiple of \( \begin{bmatrix} 1 \\ 1 \end{bmatrix} \).

\[ \text{span} \left\{ \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \begin{bmatrix} -2 \\ -2 \end{bmatrix} \right\} = \left\{ k \begin{bmatrix} 1 \\ 1 \end{bmatrix} : k \in \mathbb{R} \right\} \text{ (a line through the origin)} \]
[Figure: a 2D coordinate graph showing the vectors \(\begin{bmatrix} 1 \\ 1 \end{bmatrix}\) and \(\begin{bmatrix} -2 \\ -2 \end{bmatrix}\) lying on the same dashed line through the origin.]

Example 2: Redundant Spanning Sets

What about \( \text{span} \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 2 \end{bmatrix} \right\} \)?

It is still \( \mathbb{R}^2 \). \( \begin{bmatrix} 1 \\ 2 \end{bmatrix} \) is not needed but the set still spans \( \mathbb{R}^2 \).
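The redundancy can be sketched in numpy (target vector \([7, 5]^T\) is my own choice for illustration): the matrix has a pivot in every ROW (rank 2 = number of components), so every target in \(\mathbb{R}^2\) is reachable.

```python
import numpy as np

# Columns are the three spanning vectors above; the matrix is 2x3.
A = np.column_stack([[1, 0], [0, 1], [1, 2]])
print(np.linalg.matrix_rank(A))   # 2 -> the set spans R^2

# Any target, e.g. [7, 5], is reachable; lstsq finds one combination
# (with more vectors than needed, there are infinitely many).
coeffs, *_ = np.linalg.lstsq(A, np.array([7, 5]), rcond=None)
print(np.allclose(A @ coeffs, [7, 5]))   # True
```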

We can have more vectors than we need in a spanning set.

PAGE 8

The minimum we need appears to be the number of components in each vector.

To span \( \mathbb{R}^n \), we need at least \( n \) vectors.

More precisely, the set must contain \( n \) linearly independent vectors.

Spanning \( \mathbb{R}^2 \) with Extra Vectors

\[ \mathbb{R}^2: \left\{ \underbrace{\begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}}_{2 \text{ indp}}, \underbrace{\begin{bmatrix} 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 3 \\ 4 \end{bmatrix}, \begin{bmatrix} 5 \\ 6 \end{bmatrix}, \begin{bmatrix} \pi \\ e \end{bmatrix}}_{\text{extras}} \right\} \]

Minimal Spanning Set

Also ok:

\[ \mathbb{R}^2: \left\{ \underbrace{\begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 2 \end{bmatrix}}_{\text{indp}} \right\} \]
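The minimal set can be checked with the determinant test from earlier (a numpy sketch, not part of the notes): nonzero determinant means the two vectors are independent, and two independent vectors span \(\mathbb{R}^2\).

```python
import numpy as np

# Columns are [1,0] and [1,2]; the matrix is triangular, det = 1*2 = 2.
A = np.column_stack([[1, 0], [1, 2]])
det = np.linalg.det(A)
print(det)   # approximately 2.0 (nonzero -> indp -> spans R^2)
```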