PAGE 1

1.9 The Matrix of a Linear Transformation

\[ T: \mathbb{R}^m \to \mathbb{R}^n \quad T(\vec{x}) = A\vec{x} \]

What is the minimum we need to know about \( T \) to find \( A \)?

For example, if all we know is \( T\left( \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} \right) = \begin{bmatrix} 13 \\ 13 \end{bmatrix} \), can we determine \( A \)?

It turns out we just need to know the images of the standard unit vectors \( \vec{e_1}, \vec{e_2}, \dots, \vec{e_m} \) under \( T \).

\[ T(\vec{e_1}), T(\vec{e_2}), \dots, T(\vec{e_m}) \Rightarrow \text{then we know } A \]

\( \mathbb{R}^3 \): \( \vec{e_1} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \), \( \vec{e_2} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \), \( \vec{e_3} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \)

(sometimes written \( \vec{e_1} = (1, 0, 0) \), \( \vec{e_2} = (0, 1, 0) \), etc.)

\( \vec{e_i} \)'s are columns of identity matrix \( \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \)
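As a quick check (not part of the notes, just a NumPy sketch), the columns of the identity matrix are exactly the standard unit vectors:

```python
import numpy as np

# Columns of the 3x3 identity matrix are the standard unit vectors e_1, e_2, e_3.
I = np.eye(3)
e1, e2, e3 = I[:, 0], I[:, 1], I[:, 2]
print(e1)  # [1. 0. 0.]
```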

PAGE 2

Suppose we know:

\[ T\left( \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \right) = \begin{bmatrix} 3 \\ 2 \end{bmatrix} \quad T\left( \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \right) = \begin{bmatrix} -1 \\ -2 \end{bmatrix} \quad T\left( \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right) = \begin{bmatrix} 4 \\ 5 \end{bmatrix} \]

What is \( A \) such that \( T(\vec{x}) = A\vec{x} \)?

Since \( \vec{e_1}, \vec{e_2}, \vec{e_3} \) span \( \mathbb{R}^3 \), ANY vector in \( \mathbb{R}^3 \) is a linear combo of \( \vec{e_1}, \vec{e_2}, \vec{e_3} \).

\[ \vec{b} = x_1 \vec{e_1} + x_2 \vec{e_2} + x_3 \vec{e_3} \quad \text{for ANY } \vec{b} \text{ in } \mathbb{R}^3 \]

Because \( T \) is a linear transformation,

  • \( T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v}) \)
  • \( T(c\vec{u}) = c T(\vec{u}) \)
\[ T(\vec{b}) = T(x_1 \vec{e_1} + x_2 \vec{e_2} + x_3 \vec{e_3}) \]
\[ = T(x_1 \vec{e_1}) + T(x_2 \vec{e_2}) + T(x_3 \vec{e_3}) \]
\[ = x_1 T(\vec{e_1}) + x_2 T(\vec{e_2}) + x_3 T(\vec{e_3}) \]
PAGE 3
\[ = \underbrace{\begin{bmatrix} T(\vec{e_1}) & T(\vec{e_2}) & T(\vec{e_3}) \end{bmatrix}}_{A} \underbrace{\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}}_{\vec{x}} \]
\[ = \underbrace{\begin{bmatrix} 3 & -1 & 4 \\ 2 & -2 & 5 \end{bmatrix}}_{A} \vec{x} \]

\( A \rightarrow \) "standard matrix of the linear transformation \( T \)"

\( A \) is the matrix whose columns are \( T(\vec{e_1}), T(\vec{e_2}), \dots, T(\vec{e_m}) \)
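The construction above can be sketched in NumPy (a check, not part of the notes): stack the images of the \( \vec{e_i} \) as columns, and matrix multiplication reproduces \( T \). Note this \( A \) also explains the page-1 example, since \( A \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix} = \begin{bmatrix} 13 \\ 13 \end{bmatrix} \).

```python
import numpy as np

# Images of the standard unit vectors under T (from the example above)
Te1 = np.array([3, 2])
Te2 = np.array([-1, -2])
Te3 = np.array([4, 5])

# The standard matrix A has these images as its columns.
A = np.column_stack([Te1, Te2, Te3])

# Any x in R^3 is then mapped by matrix multiplication:
x = np.array([1, 2, 3])
print(A @ x)  # = 1*Te1 + 2*Te2 + 3*Te3 -> [13 13]
```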

PAGE 4

Some example transformations

Example: \( T: \mathbb{R}^2 \to \mathbb{R}^2 \)

\( T \) rotates a vector counterclockwise about the origin by \( \frac{\pi}{4} \) radians, then reflects it about the vertical (\( x_2 \)) axis.

Find \( A \).

Find how \( \vec{e_1} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \) and \( \vec{e_2} = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \) are transformed.

(Figure: \( (1, 0) \) rotated by \( \pi/4 \) to \( (\sqrt{2}/2, \sqrt{2}/2) \), then reflected to \( (-\sqrt{2}/2, \sqrt{2}/2) \).)
\[ T(\vec{e_1}) = \begin{bmatrix} -\sqrt{2}/2 \\ \sqrt{2}/2 \end{bmatrix} \]
(Figure: \( (0, 1) \) rotated by \( \pi/4 \) to \( (-\sqrt{2}/2, \sqrt{2}/2) \), then reflected to \( (\sqrt{2}/2, \sqrt{2}/2) \).)
\[ T(\vec{e_2}) = \begin{bmatrix} \sqrt{2}/2 \\ \sqrt{2}/2 \end{bmatrix} \]
\[ A = \begin{bmatrix} -\sqrt{2}/2 & \sqrt{2}/2 \\ \sqrt{2}/2 & \sqrt{2}/2 \end{bmatrix} \]
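The same \( A \) can be checked numerically (a sketch, not part of the notes): compose the standard rotation and reflection matrices, applying the rotation first.

```python
import numpy as np

theta = np.pi / 4
# Counterclockwise rotation about the origin by pi/4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
# Reflection about the vertical (x2) axis: (x1, x2) -> (-x1, x2)
F = np.array([[-1, 0],
              [ 0, 1]])
# Rotate first, then reflect, so A = F R (rightmost matrix acts first).
A = F @ R
print(np.round(A, 4))  # [[-0.7071  0.7071]
                       #  [ 0.7071  0.7071]]
```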
PAGE 5

See pp. 74–76 for some standard transformation matrices: reflection, contraction/expansion, shear, projection.

Definition: Onto Mapping

A mapping \( T: \mathbb{R}^n \to \mathbb{R}^m \) is said to be onto \( \mathbb{R}^m \) if each \( \vec{b} \) in \( \mathbb{R}^m \) is the image of at least one \( \vec{x} \) in \( \mathbb{R}^n \).

Another way to say this: for any \( \vec{b} \) in \( \mathbb{R}^m \) there is a solution to \( A\vec{x} = \vec{b} \).

Example

\[ A = \begin{bmatrix} 1 & 0 & 5 \\ 0 & 1 & 0 \\ 2 & 4 & -1 \\ 3 & 5 & 2 \end{bmatrix} \quad T: \mathbb{R}^3 \to \mathbb{R}^4 \]

Is \( T \) onto \( \mathbb{R}^4 \)?

Equivalent question: does \( A\vec{x} = \vec{b} \) have at least one solution for every vector \( \vec{b} \) in \( \mathbb{R}^4 \)?

PAGE 6

Analysis of the Augmented Matrix

\[ \text{Augmented matrix: } \begin{bmatrix} 1 & 0 & 5 & b_1 \\ 0 & 1 & 0 & b_2 \\ 2 & 4 & -1 & b_3 \\ 3 & 5 & 2 & b_4 \end{bmatrix} \]

Note: the coefficient matrix has only 3 columns, so it can have at most 3 pivot positions.

So after row reduction, the bottom row of the augmented matrix will have the form

\[ \begin{bmatrix} 0 & 0 & 0 & \ast \end{bmatrix} \]

where \( \ast \) is some expression involving \( b_1, b_2, b_3, b_4 \).

Since \( \vec{b} \) is arbitrary, this entry can be nonzero, and in that case there is no solution.

So \( T \) is NOT onto \( \mathbb{R}^4 \).
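The pivot count argument can be sketched numerically (not part of the notes; uses NumPy's rank function as a stand-in for counting pivots):

```python
import numpy as np

A = np.array([[1, 0,  5],
              [0, 1,  0],
              [2, 4, -1],
              [3, 5,  2]])

# Onto R^4 would require a pivot in every one of the 4 rows,
# i.e. rank(A) = 4 -- impossible, since A has only 3 columns.
print(np.linalg.matrix_rank(A))  # 3, so T is not onto R^4
```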

PAGE 7

Linear Transformations: Onto and One-to-One

Example: Onto Mapping

\[ A = \begin{bmatrix} 1 & -4 & 8 & 1 \\ 0 & 2 & -1 & 3 \\ 0 & 0 & 0 & 5 \end{bmatrix} \]

Consider the transformation \( T: \mathbb{R}^4 \to \mathbb{R}^3 \). Is it onto \( \mathbb{R}^3 \)?

There are 3 pivots. Since there is a pivot in every row, there is always a solution to \( A\vec{x} = \vec{b} \).

So, yes, \( T \) is onto \( \mathbb{R}^3 \).
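A numerical sketch of the same check (not part of the notes): a pivot in every row is equivalent to the rank equaling the number of rows.

```python
import numpy as np

A = np.array([[1, -4,  8, 1],
              [0,  2, -1, 3],
              [0,  0,  0, 5]])

# A pivot in every row <=> rank(A) = number of rows (3),
# so A x = b is consistent for every b in R^3.
print(np.linalg.matrix_rank(A) == A.shape[0])  # True: T is onto R^3
```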


Definition: One-to-One Transformation

If \( T: \mathbb{R}^n \to \mathbb{R}^m \), \( T \) is one-to-one if each \( \vec{b} \) in \( \mathbb{R}^m \) is the image of at most one \( \vec{x} \) in \( \mathbb{R}^n \).

This is NOT one-to-one.

(Figure: mapping diagram — two points in the domain map to the same point in the codomain.)
PAGE 8

Example: One-to-One Analysis

\[ A = \begin{bmatrix} 1 & -4 & 8 & 1 \\ 0 & 2 & -1 & 3 \\ 0 & 0 & 0 & 5 \end{bmatrix} \quad \text{is } T(\vec{x}) = A\vec{x} \text{ one-to-one?} \]
\[ \begin{bmatrix} 1 & -4 & 8 & 1 & b_1 \\ 0 & 2 & -1 & 3 & b_2 \\ 0 & 0 & 0 & 5 & b_3 \end{bmatrix} \]

4 variables, 3 pivots, one free variable \( \Rightarrow \) whenever \( A\vec{x} = \vec{b} \) is consistent, it has infinitely many solutions, not a unique one.

Not one-to-one.
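We can exhibit the failure concretely (a sketch, not part of the notes): the free variable \( x_3 \) gives a nonzero solution of \( A\vec{x} = \vec{0} \), so two different inputs share an image.

```python
import numpy as np

A = np.array([[1, -4,  8, 1],
              [0,  2, -1, 3],
              [0,  0,  0, 5]])

# Column 3 has no pivot, so x3 is free. Back-substituting with
# x3 = 2 and x4 = 0 gives x2 = 1 and x1 = -12.
x = np.array([-12, 1, 2, 0])
print(A @ x)  # [0 0 0] -- a nontrivial null vector, so T is not one-to-one
```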


Theorem

\( T: \mathbb{R}^n \to \mathbb{R}^m \) is one-to-one if and only if \( T(\vec{x}) = A\vec{x} = \vec{0} \) has only the trivial solution.

Why?

Recall the solution set of \( A\vec{x} = \vec{b} \) is a shifted (translated) copy of the solution set of \( A\vec{x} = \vec{0} \). So if \( A\vec{x} = \vec{0} \) has only the trivial solution, then \( A\vec{x} = \vec{b} \) can have at most one solution.

PAGE 9

Summary of Linear Transformations

Now we can summarize everything:

\( T: \mathbb{R}^n \to \mathbb{R}^m \) is onto \( \mathbb{R}^m \) if and only if the columns of \( A \) span \( \mathbb{R}^m \)

\( T: \mathbb{R}^n \to \mathbb{R}^m \) is one-to-one if and only if the columns of \( A \) are linearly independent
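Both conditions in the summary reduce to a rank check, which can be sketched as two small helper functions (`is_onto` and `is_one_to_one` are hypothetical names, not standard library functions; NumPy's rank stands in for counting pivots):

```python
import numpy as np

def is_onto(A):
    # Columns span R^m <=> rank(A) = m (number of rows)
    return np.linalg.matrix_rank(A) == A.shape[0]

def is_one_to_one(A):
    # Columns linearly independent <=> rank(A) = n (number of columns)
    return np.linalg.matrix_rank(A) == A.shape[1]

A = np.array([[1, -4,  8, 1],
              [0,  2, -1, 3],
              [0,  0,  0, 5]])
print(is_onto(A), is_one_to_one(A))  # True False
```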