PAGE 1

4.1 The Vector Space \(\mathbb{R}^3\)

3D coordinate system

a point \((a, b, c)\) can be seen as the tip of the vector from the origin to that point

\((a, b, c) \longleftrightarrow \begin{bmatrix} a \\ b \\ c \end{bmatrix} = a\vec{i} + b\vec{j} + c\vec{k}\)

this is (part of) why we can add/subtract vectors or multiply them by a scalar

(each component is a number and we can do all the above w/ numbers)
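The componentwise rules can be sketched in a few lines of Python (a minimal sketch; the helper names `add` and `scale` are ours, not from the notes):

```python
def add(u, v):
    """Add two vectors componentwise."""
    return [ui + vi for ui, vi in zip(u, v)]

def scale(c, u):
    """Multiply a vector by the scalar c, componentwise."""
    return [c * ui for ui in u]

u = [1, 2, 3]   # a*i + b*j + c*k with a=1, b=2, c=3
v = [4, 0, -1]

print(add(u, v))    # componentwise sum: [5, 2, 2]
print(scale(2, u))  # componentwise scalar multiple: [2, 4, 6]
```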

[Figure: a 3D coordinate system with \(x\), \(y\), and \(z\) axes; a vector points from the origin to the point \((a, b, c)\).]

all possible vectors \(\begin{bmatrix} a \\ b \\ c \end{bmatrix}\) are contained in the vector space \(\mathbb{R}^3\)

inside \(\mathbb{R}^3\) there is a copy of \(\mathbb{R}^2\) (the xy-plane): vectors with third component \(0\)

\(\begin{bmatrix} a \\ b \\ 0 \end{bmatrix} \longleftrightarrow \begin{bmatrix} a \\ b \end{bmatrix}\)

PAGE 2

two \(\mathbb{R}^n\) vectors are linearly dependent if one is a scalar multiple of the other

\(\vec{u}, \vec{v}\) are two \(\mathbb{R}^n\) vectors

linearly dependent if \(\vec{u} = c\vec{v}\) or \(\vec{v} = d\vec{u}\) for some scalar \(c\) or \(d\)

example: \(\vec{u} = \begin{bmatrix} 1 \\ 1 \end{bmatrix}\) \(\vec{v} = \begin{bmatrix} -2 \\ -2 \end{bmatrix}\)

\(-2\vec{u} = \vec{v}\) so linearly dependent
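The scalar-multiple test for a pair of vectors can be sketched in Python with exact fractions (the function name `is_dependent_pair` is ours):

```python
from fractions import Fraction

def is_dependent_pair(u, v):
    """True iff one of u, v is a scalar multiple of the other
    (the definition of linear dependence for a pair of vectors)."""
    # The zero vector is a scalar multiple of anything.
    if all(x == 0 for x in u) or all(x == 0 for x in v):
        return True
    # v = c*u requires the ratio v_i / u_i to be the same for every i.
    ratios = set()
    for ui, vi in zip(u, v):
        if ui == 0:
            if vi != 0:
                return False  # no single c can work here
        else:
            ratios.add(Fraction(vi, ui))
    return len(ratios) <= 1

print(is_dependent_pair([1, 1], [-2, -2]))  # True: v = -2u
print(is_dependent_pair([1, 0], [0, 1]))    # False: i, j independent
```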

if not linearly dependent, then they are linearly independent

\(\vec{u} \neq c\vec{v}\) for every scalar \(c\) (and likewise \(\vec{v} \neq d\vec{u}\))

or, \(a\vec{u} + b\vec{v} = \vec{0}\) implies \(a = b = 0\) only

for example, \(\vec{i} = \begin{bmatrix} 1 \\ 0 \end{bmatrix}\) \(\vec{j} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}\)

they are linearly independent because

\(a\vec{i} + b\vec{j} = \begin{bmatrix} a \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ b \end{bmatrix} = \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}\) is possible if and only if \(a = b = 0\)

PAGE 3

Linear Independence and Determinants

\[ \vec{u} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \quad \vec{v} = \begin{bmatrix} 2 \\ -2 \end{bmatrix} \]

Clearly, not multiples of each other \( \rightarrow \) linearly indp

So, \( a\vec{u} + b\vec{v} = \vec{0} \) is possible only when \( a = b = 0 \)

\[ a \begin{bmatrix} 1 \\ 1 \end{bmatrix} + b \begin{bmatrix} 2 \\ -2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]\[ \underbrace{\begin{bmatrix} 1 & 2 \\ 1 & -2 \end{bmatrix}}_{A} \begin{bmatrix} a \\ b \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \]

notice \( \det A = (1)(-2) - (2)(1) = -4 \neq 0 \)

if \( \vec{u}, \vec{v} \) are indp, then when put in as columns of a matrix \( A \), \( \det A \neq 0 \)

\[ A\vec{x} = \vec{b} \]\[ \vec{x} = A^{-1}\vec{b} \]

\( A^{-1} \) does not exist if \( \det A = 0 \)

if cols are indp, then \( \det A \neq 0 \) so \( A^{-1} \) exists

if cols of \( A \) are indp then \( A^{-1} \) exists
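For two vectors in \(\mathbb{R}^2\), this determinant test is one line of code (a sketch; the name `det2` is ours):

```python
def det2(u, v):
    """Determinant of the 2x2 matrix whose columns are u and v."""
    return u[0] * v[1] - u[1] * v[0]

# Columns u = [1, 1], v = [2, -2] from the example above:
print(det2([1, 1], [2, -2]))  # -4, nonzero => columns independent
```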

PAGE 4

three or more vectors are linearly dependent if at least one can be written as a linear combination of the others

\( \vec{u}, \vec{v}, \vec{w} \) if dep then \( \vec{u} = \underbrace{a\vec{v} + b\vec{w}}_{\text{linear combination}} \)

or \( \vec{v} = c\vec{u} + d\vec{w} \)

and so on

if not, then they are linearly independent

equivalently, \( a\vec{u} + b\vec{v} + c\vec{w} = \vec{0} \) if and only if \( a = b = c = 0 \)

Example

for example, \( \vec{i} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \quad \vec{j} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \quad \vec{k} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \)

\[ a\vec{i} + b\vec{j} + c\vec{k} = \vec{0} \text{ if and only if } a = b = c = 0 \]

so, \( \vec{i}, \vec{j}, \vec{k} \) are linearly indp
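The same determinant test works for three vectors in \(\mathbb{R}^3\); a sketch of a determinant by cofactor expansion along the first row (the function name `det` is ours):

```python
def det(M):
    """Determinant by cofactor expansion along the first row
    (fine for the small matrices in these notes)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += (-1) ** j * M[0][j] * det(minor)
    return total

# Columns i, j, k form the identity matrix; det = 1 != 0,
# confirming i, j, k are linearly independent.
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(det(I3))  # 1
```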

PAGE 5

Linear Independence and Linear Combinations

\[ \vec{u} = \begin{bmatrix} 4 \\ 3 \end{bmatrix}, \quad \vec{v} = \begin{bmatrix} 7 \\ 8 \end{bmatrix}, \quad \vec{w} = \begin{bmatrix} -1 \\ 2 \end{bmatrix} \]

Are they linearly independent?

If not, express one as a linear combination of the others.

Let's try to express \( \vec{u} \) as a linear combo of \( \vec{v} \) and \( \vec{w} \) (if not possible, then they must be independent).

\[ \vec{u} = a\vec{v} + b\vec{w} \quad \text{find } a, b \]
\[ \begin{bmatrix} 4 \\ 3 \end{bmatrix} = \begin{bmatrix} 7 & -1 \\ 8 & 2 \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} \]
\[ \left[ \begin{array}{cc|c} 7 & -1 & 4 \\ 8 & 2 & 3 \end{array} \right] \quad \begin{array}{l} \text{solve to find } a, b \\ \text{using, for example, Gaussian elimination} \end{array} \]
\[ \rightarrow \dots \rightarrow \quad \begin{aligned} a &= \frac{1}{2} \\ b &= -\frac{1}{2} \end{aligned} \]

So \( \vec{u} = \frac{1}{2}\vec{v} - \frac{1}{2}\vec{w} \)

so NOT independent.
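The 2x2 system above can be solved exactly, e.g. by Cramer's rule (a sketch; the name `solve2` is ours, and we use `fractions` for exact arithmetic):

```python
from fractions import Fraction

def solve2(A, b):
    """Solve a 2x2 system A x = b by Cramer's rule.
    Assumes det A != 0."""
    d = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    x = Fraction(b[0] * A[1][1] - A[0][1] * b[1], d)
    y = Fraction(A[0][0] * b[1] - b[0] * A[1][0], d)
    return x, y

# u = a*v + b*w, with v and w as the columns of the matrix:
A = [[7, -1], [8, 2]]
b = [4, 3]
a_, b_ = solve2(A, b)
print(a_, b_)  # 1/2 -1/2
```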

PAGE 6
example
\[ \vec{u} = \begin{bmatrix} 4 \\ 0 \\ 1 \end{bmatrix}, \quad \vec{v} = \begin{bmatrix} -5 \\ 1 \\ -1 \end{bmatrix}, \quad \vec{w} = \begin{bmatrix} 0 \\ -4 \\ -1 \end{bmatrix} \]
  • Independent?
  • Express one as linear combo of other two?

Linearly independent if \( a\vec{u} + b\vec{v} + c\vec{w} = \vec{0} \rightarrow a = b = c = 0 \)

\[ \begin{bmatrix} 4 & -5 & 0 \\ 0 & 1 & -4 \\ 1 & -1 & -1 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \]

Independent if \( A^{-1} \) exists \( \leftrightarrow \det A \neq 0 \)

\( \rightarrow \) but the determinant test alone does not tell us \( a, b, c \) when the vectors are not independent
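As a check of the determinant test for this example, a sketch using the rule of Sarrus (the name `det3` is ours):

```python
def det3(M):
    """3x3 determinant via the rule of Sarrus."""
    return (M[0][0] * M[1][1] * M[2][2]
          + M[0][1] * M[1][2] * M[2][0]
          + M[0][2] * M[1][0] * M[2][1]
          - M[0][2] * M[1][1] * M[2][0]
          - M[0][0] * M[1][2] * M[2][1]
          - M[0][1] * M[1][0] * M[2][2])

# Columns are u, v, w from this example:
A = [[4, -5, 0], [0, 1, -4], [1, -1, -1]]
print(det3(A))  # 0: A is not invertible, so the columns are dependent
```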

But Gaussian elimination can give us \( a, b, c \) even if not independent:

\[ \left[ \begin{array}{ccc|c} 4 & -5 & 0 & 0 \\ 0 & 1 & -4 & 0 \\ 1 & -1 & -1 & 0 \end{array} \right] \]
PAGE 7
row ops→ ⋯ →
\[\begin{bmatrix}1 & -1 & -1 & 0 \\ 0 & 1 & -4 & 0 \\ 0 & 0 & 0 & 0\end{bmatrix}\]

↳ existence of a free variable → infinitely many solutions

\(a = b = c = 0\) is NOT the only solution

so, \(\vec{u}, \vec{v}, \vec{w}\) are NOT indp

now write one as linear combo of other two

\[\left[\begin{array}{ccc|c} \overset{a}{\boxed{1}} & \overset{b}{-1} & \overset{c}{-1} & 0 \\ 0 & \boxed{1} & -4 & 0 \\ 0 & 0 & 0 & 0 \end{array}\right]\]

(columns correspond to \(a, b, c\); pivots boxed — no pivot in the \(c\) column)

\(c\) is free → \(c = r\)

row 2: \(b - 4c = 0 \rightarrow b = 4r\)

row 1: \(a - b - c = 0 \rightarrow a = 4r + r = 5r\)

\[a\vec{u} + b\vec{v} + c\vec{w} = \vec{0}\]\[5r\vec{u} + 4r\vec{v} + r\vec{w} = \vec{0}\]

choose \(r = 1\)

\(5\vec{u} + 4\vec{v} + \vec{w} = \vec{0}\)

or
\[\vec{w} = -5\vec{u} - 4\vec{v}\]
or

\(\vec{u} = -\frac{4}{5}\vec{v} - \frac{1}{5}\vec{w}\)
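The dependences found above are easy to verify directly (a quick numerical check; the variable names are ours):

```python
from fractions import Fraction

u = [4, 0, 1]
v = [-5, 1, -1]
w = [0, -4, -1]

# 5u + 4v + w should be the zero vector (r = 1 above):
combo = [5 * ui + 4 * vi + wi for ui, vi, wi in zip(u, v, w)]
print(combo)  # [0, 0, 0]

# and u = -(4/5)v - (1/5)w, checked with exact fractions:
u_check = [Fraction(-4, 5) * vi + Fraction(-1, 5) * wi
           for vi, wi in zip(v, w)]
print(u_check == u)  # True
```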