PAGE 1

"Hw 6" and "Hw 7" are due together

1.7 Linear Independence

A set of vectors \( \{ \vec{v}_1, \vec{v}_2, \dots, \vec{v}_n \} \) is linearly independent if

\[ x_1\vec{v}_1 + x_2\vec{v}_2 + \dots + x_n\vec{v}_n = \vec{0} \]

has only the trivial solution \( x_1 = x_2 = \dots = x_n = 0 \).

Example:

\( \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right\} \)

\[ x_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + x_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]

This holds only when \( x_1 = x_2 = 0 \),

so this set of vectors is linearly independent.

This means if \( A\vec{x} = \vec{0} \) has only the trivial solution,

then the columns of \( A \) are linearly independent.
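
As a quick sanity check (my addition, not from the notes; a minimal sketch assuming NumPy is installed), we can test whether \( A\vec{x} = \vec{0} \) has only the trivial solution by comparing the rank of \( A \) to its number of columns:

```python
# Not part of the notes: columns are independent exactly when rank(A) equals
# the number of columns (no free variables in A x = 0).
import numpy as np

# Columns of A are the vectors from the example above.
A = np.array([[1, 0],
              [0, 1]])

print(np.linalg.matrix_rank(A) == A.shape[1])   # True -> columns independent
```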

PAGE 2

A set of vectors \( \{ \vec{v}_1, \vec{v}_2, \dots, \vec{v}_n \} \) is linearly dependent if the set is not linearly independent.

Example:

\[ x_1 \begin{bmatrix} 1 \\ 0 \end{bmatrix} + x_2 \begin{bmatrix} 0 \\ 1 \end{bmatrix} + x_3 \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]

In addition to \( x_1 = x_2 = x_3 = 0 \),

\( x_1 = 1, x_2 = 1, x_3 = -1 \) is also a solution.

So \( \left\{ \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \begin{bmatrix} 1 \\ 1 \end{bmatrix} \right\} \) is linearly dependent.

If \( A = \begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix} \), then \( A\vec{x} = \vec{0} \) has a nontrivial solution.

\[ \begin{bmatrix} 1 & 0 & 1 & | & 0 \\ 0 & 1 & 1 & | & 0 \end{bmatrix} \]
(columns \( \vec{a}_1, \vec{a}_2, \vec{a}_3 \))

two pivots, three variables

  • \( x_3 \) free
  • \( x_2 = -x_3 \)
  • \( x_1 = -x_3 \)
PAGE 3

Linear Dependence and Independence

One possible solution: \( x_1 = -1, x_2 = -1, x_3 = 1 \)

\[ (-1) \vec{a}_1 + (-1) \vec{a}_2 + (1) \vec{a}_3 = \vec{0} \]

This is a linear dependence relation.
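
A short check of this relation (my addition, assuming SymPy is available; `nullspace` is SymPy's built-in null-space routine):

```python
# Not part of the notes: recover the dependence relation from the null space of A.
from sympy import Matrix

A = Matrix([[1, 0, 1],
            [0, 1, 1]])        # columns a1, a2, a3

null_vec = A.nullspace()[0]    # a basis vector for the solutions of A x = 0
print(null_vec.T)              # Matrix([[-1, -1, 1]])
print(A * null_vec)            # Matrix([[0], [0]]), confirming the relation
```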

Inspection for Dependence

Sometimes we can tell that vectors are dependent by inspection.

\[ \left\{ \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \begin{bmatrix} -2 \\ -4 \end{bmatrix} \right\} \]

\[ (-2) \begin{bmatrix} 1 \\ 2 \end{bmatrix} = \begin{bmatrix} -2 \\ -4 \end{bmatrix} \]

One is a multiple of another \( \implies \) dependent.

\[ \left\{ \begin{bmatrix} 1 \\ 2 \end{bmatrix}, \begin{bmatrix} 1 \\ 3 \end{bmatrix} \right\} \]

These are not multiples of one another. We can check whether they are independent by solving \( A\vec{x} = \vec{0} \).

\[ \text{or, } x_1 \begin{bmatrix} 1 \\ 2 \end{bmatrix} + x_2 \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]

If \( x_2 \neq 0 \), then \( \begin{bmatrix} 1 \\ 3 \end{bmatrix} = -\frac{x_1}{x_2} \begin{bmatrix} 1 \\ 2 \end{bmatrix} \)

But if this is true, then the vectors are multiples of one another, which is FALSE. So, \( x_2 \) must be zero.

PAGE 4

...which means \( x_1 = 0 \), so independent.
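
A quick numeric confirmation (my addition; assumes NumPy, and uses the 2D "parallel" test on the entries):

```python
# Not part of the notes: neither vector is a multiple of the other, so independent.
import numpy as np

v1 = np.array([1, 2])
v2 = np.array([1, 3])

# v1 and v2 are parallel exactly when 1*3 - 2*1 == 0.
print(v1[0] * v2[1] - v1[1] * v2[0])                      # 1, so not parallel
print(np.linalg.matrix_rank(np.column_stack([v1, v2])))   # 2 -> independent
```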

Sets Containing the Zero Vector

If a set contains \( \vec{0} \), it MUST be a dependent set.

\[ \{ \vec{a_1}, \vec{a_2}, \vec{0} \} \]

For this set to be independent, \( x_1 \vec{a}_1 + x_2 \vec{a}_2 + x_3 \vec{0} = \vec{0} \) would need \( x_1 = x_2 = x_3 = 0 \) to be its only solution.

But \( x_1 = 0, x_2 = 0, x_3 = 1 \) is also a solution, since \( x_3 \vec{0} = \vec{0} \) no matter what \( x_3 \) is.

This solution is nontrivial, so the set IS dependent.
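
(My addition, assuming NumPy and two arbitrary sample vectors \( \vec{a}_1, \vec{a}_2 \):) a zero column always forces a nontrivial solution:

```python
# Not part of the notes: any set containing the zero vector is dependent.
import numpy as np

a1 = np.array([1, 4])                  # arbitrary sample vectors
a2 = np.array([2, 3])
A = np.column_stack([a1, a2, np.zeros(2)])

print(np.linalg.matrix_rank(A) < A.shape[1])   # True -> dependent
print(A @ np.array([0, 0, 1]))                 # [0. 0.]: the nontrivial solution above
```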

PAGE 5

Linear Dependence and Linear Combinations

If a set is dependent, not every vector has to be a linear combo of the others.

Example

Consider the following vectors:

\[ \vec{u} = \begin{bmatrix} 3 \\ 2 \\ -4 \end{bmatrix}, \vec{v} = \begin{bmatrix} -6 \\ 1 \\ 7 \end{bmatrix}, \vec{w} = \begin{bmatrix} 0 \\ -5 \\ 2 \end{bmatrix}, \vec{z} = \begin{bmatrix} 3 \\ 7 \\ -5 \end{bmatrix} \]

To determine if the set is independent, we solve the homogeneous system \( [\vec{u} \; \vec{v} \; \vec{w} \; \vec{z}] \vec{x} = \vec{0} \). Let \( A = [\vec{u} \; \vec{v} \; \vec{w} \; \vec{z}] \).

\[ \begin{bmatrix} 3 & -6 & 0 & 3 & 0 \\ 2 & 1 & -5 & 7 & 0 \\ -4 & 7 & 2 & -5 & 0 \end{bmatrix} \sim \dots \sim \begin{bmatrix} 1 & 0 & 0 & 3 & 0 \\ 0 & 1 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \end{bmatrix} \]

\( x_4 \) is free

\( x_3 = 0 \)

\( x_2 = -x_4 \)

\( x_1 = -3x_4 \)

So the set is dependent.

Dependence Relation

Let \( x_4 = 1 \); then \( x_1 = -3, x_2 = -1, x_3 = 0 \).

\[ (-3)\vec{u} + (-1)\vec{v} + (1)\vec{z} = \vec{0} \]

Since \( x_3 = 0 \) in every solution, \( \vec{w} \) is not a linear combo of the others.
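
A SymPy check of this row reduction and the resulting relation (my addition; not part of the original notes):

```python
# Not part of the notes: verify the rref and the dependence relation exactly.
from sympy import Matrix

u = Matrix([3, 2, -4])
v = Matrix([-6, 1, 7])
w = Matrix([0, -5, 2])
z = Matrix([3, 7, -5])

A = Matrix.hstack(u, v, w, z)
print(A.rref()[0])      # rows [1, 0, 0, 3], [0, 1, 0, 1], [0, 0, 1, 0]
print(3*u + v == z)     # True: z = 3u + v is the dependence relation above
```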

PAGE 6

Theorem on Vector Set Size

If a set of \( n \times 1 \) vectors (that is, vectors in \( \mathbb{R}^n \)) contains more than \( n \) vectors, then the set is dependent.

Why?

If \( n = 3 \) and the set has four vectors, then the reduced row echelon form (rref) of the augmented matrix for \( A\vec{x} = \vec{0} \) (where the columns of \( A \) are the vectors in the set) might look like, for example:

\[ \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 \end{bmatrix} \]

or

\[ \begin{bmatrix} 1 & 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & 1 & 0 & 0 \end{bmatrix} \]

In both cases, there will be at least one free variable because there are more columns than pivot positions, leading to non-trivial solutions and thus linear dependence.
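
A numerical illustration (my addition; assumes NumPy and uses randomly chosen sample vectors): four vectors in \( \mathbb{R}^3 \) can give at most three pivots, so at least one variable is always free.

```python
# Not part of the notes: four random vectors in R^3 are always dependent.
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 6, size=(3, 4))   # columns = four random vectors in R^3

rank = np.linalg.matrix_rank(A)
print(rank)                 # at most 3
print(rank < A.shape[1])    # True: fewer pivots than columns -> dependent
```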

PAGE 7

1.8 Introduction to Linear Transformations

\( A\vec{x} = \vec{b} \) is a system, but we can also look at it as a transformation of \( \vec{x} \) into \( \vec{b} \) by the matrix \( A \).

\[ \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 3 \\ 2 \end{bmatrix} \]
[Figure: the vector \( \begin{bmatrix} 2 \\ 3 \end{bmatrix} \) reflected across the line \( x_2 = x_1 \) to become \( \begin{bmatrix} 3 \\ 2 \end{bmatrix} \).]

This transformation swaps the \( x_1 \) and \( x_2 \) coordinates of a 2D vector (a reflection across the line \( x_2 = x_1 \)).
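
A quick check of this example (my addition; assumes NumPy):

```python
# Not part of the notes: the matrix [[0, 1], [1, 0]] swaps the two coordinates.
import numpy as np

A = np.array([[0, 1],
              [1, 0]])
x = np.array([2, 3])

print(A @ x)   # [3 2]
```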

Notation

\[ T(\vec{x}) = A\vec{x} \quad \text{or} \quad \vec{x} \mapsto A\vec{x} \]

  • Input \( \vec{x} \): comes from the domain of \( T \)
  • Output \( T(\vec{x}) \): lies in the codomain of \( T \); the set of all outputs is the range of \( T \)

PAGE 8

An \( m \times n \) matrix \( A \) transforms an \( n \)-vector into an \( m \)-vector: \( T: \mathbb{R}^n \to \mathbb{R}^m \)

\[ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 0 & 1 \end{bmatrix} \quad (3 \times 2) \]

For \( A\vec{x} \) to make sense, \( \vec{x} \) must be \( 2 \times 1 \) (the number of entries of \( \vec{x} \) has to match the number of columns of \( A \)).

\[ \begin{bmatrix} 1 & 2 \\ 3 & 4 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} -1 \\ -5 \end{bmatrix} = \begin{bmatrix} -11 \\ -23 \\ -5 \end{bmatrix} \]

\( T: \mathbb{R}^2 \to \mathbb{R}^3 \)

"image" of \( T \)

PAGE 9

Linear Transformations and Systems

If \( A = \begin{bmatrix} 1 & -7 & -26 \\ -4 & 22 & 80 \end{bmatrix} \) and \( \vec{b} = \begin{bmatrix} -3 \\ 6 \end{bmatrix} \)

\( T: \mathbb{R}^3 \to \mathbb{R}^2 \)

How many vectors in \( \mathbb{R}^3 \) can be transformed into \( \vec{b} \)?

\( T(\vec{x}) = A\vec{x} \)

\( A\vec{x} = \vec{b} \), \( \vec{x} = ? \)

this is just a system!

\[ \begin{bmatrix} 1 & -7 & -26 & -3 \\ -4 & 22 & 80 & 6 \end{bmatrix} \]

\[ \sim \dots \sim \begin{bmatrix} 1 & 0 & 2 & 4 \\ 0 & 1 & 4 & 1 \end{bmatrix} \]

\( x_3 \) is free

\( x_2 = 1 - 4x_3 \)

\( x_1 = 4 - 2x_3 \)

\[ \vec{x} = \begin{bmatrix} 4 \\ 1 \\ 0 \end{bmatrix} + x_3 \begin{bmatrix} -2 \\ -4 \\ 1 \end{bmatrix} \]

infinitely many \( \vec{x} \) turn into \( \vec{b} \)
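
A check (my addition; assumes NumPy) that every vector of this form really does land on \( \vec{b} \):

```python
# Not part of the notes: A x = b for every value of the free variable x3.
import numpy as np

A = np.array([[1, -7, -26],
              [-4, 22, 80]])
b = np.array([-3, 6])

for x3 in [-2.0, 0.0, 1.0, 5.0]:
    x = np.array([4, 1, 0]) + x3 * np.array([-2, -4, 1])
    print(np.allclose(A @ x, b))   # True every time
```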

PAGE 10
[Figure: mapping diagram from the domain to the codomain, showing many different vectors \( \vec{x} \) all mapping to the same point \( \vec{b} \).]

Basic Properties of Linear Transformations


\( T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v}) \)

\[ A \begin{bmatrix} 3 \\ 1 \\ 3 \end{bmatrix} = A \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} + A \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} \]

(here \( A \) is \( 3 \times 3 \) and the vectors are \( 3 \times 1 \))

\( T(c\vec{u}) = cT(\vec{u}) \)

\( T(\vec{0}) = \vec{0} \)

an easy way to rule out linearity: if \( T(\vec{0}) \neq \vec{0} \), then \( T \) is not linear

The transformation \( \vec{x} \mapsto A\vec{x} \) is ALWAYS linear.

\( T(c\vec{u} + d\vec{v}) = cT(\vec{u}) + dT(\vec{v}) \)
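
A numeric spot-check of these properties (my addition; assumes NumPy, with arbitrary random sample values for \( A \), \( \vec{u} \), \( \vec{v} \), \( c \), \( d \)):

```python
# Not part of the notes: matrix transformations satisfy all the properties above.
import numpy as np

rng = np.random.default_rng(1)
A = rng.integers(-3, 4, size=(2, 3))
u = rng.integers(-3, 4, size=3)
v = rng.integers(-3, 4, size=3)
c, d = 2, -5

print(np.allclose(A @ (u + v), A @ u + A @ v))            # T(u+v) = T(u) + T(v)
print(np.allclose(A @ (c*u), c * (A @ u)))                # T(cu) = cT(u)
print(A @ np.zeros(3))                                    # T(0) = 0
print(np.allclose(A @ (c*u + d*v), c*(A@u) + d*(A@v)))    # T(cu+dv) = cT(u) + dT(v)
```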