PAGE 1

2.2 + 2.3 The Inverse of a Matrix

Scalar Inverse

\[ 5^{-1} \cdot 5 = 1 \quad \text{and} \quad 5 \cdot 5^{-1} = 1 \]

\( 5^{-1} \) is the inverse of the scalar 5.

Matrix Inverse

\[ A^{-1}A = I \quad \text{and} \quad AA^{-1} = I \]

\( A^{-1} \) is the inverse of matrix \( A \).

\( A \) must be square.

  • NOT every square matrix has an inverse.
  • If \( A \) has an inverse, \( A \) is invertible.
  • If \( A \) does not, \( A \) is singular.

(An invertible matrix is often called nonsingular.)

PAGE 2

2x2 Case

\[ A = \begin{bmatrix} a & b \\ c & d \end{bmatrix} \]

If \( ad - bc \neq 0 \) then:

\[ A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \]

If \( ad - bc = 0 \), then \( A \) is NOT invertible.

\( ad - bc \) is the determinant of \( A \).
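The 2x2 formula translates directly into code. A minimal sketch in Python (the function name `inv2x2` is my own, not from the notes):

```python
def inv2x2(a, b, c, d):
    """Return the inverse of [[a, b], [c, d]] via the 2x2 formula,
    or None when the determinant ad - bc is zero (A is singular)."""
    det = a * d - b * c
    if det == 0:
        return None  # not invertible
    # swap the diagonal, negate the off-diagonal, divide by the determinant
    return [[d / det, -b / det],
            [-c / det, a / det]]
```

For example, `inv2x2(4, 3, 2, 1)` returns `[[-0.5, 1.5], [1.0, -2.0]]`, matching the worked example below.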

Example

\[ A = \begin{bmatrix} 4 & 3 \\ 2 & 1 \end{bmatrix} \]

Determinant: \( (4)(1) - (3)(2) = -2 \neq 0 \), so \( A^{-1} \) exists.

\[ A^{-1} = \frac{1}{-2} \begin{bmatrix} 1 & -3 \\ -2 & 4 \end{bmatrix} = \begin{bmatrix} -1/2 & 3/2 \\ 1 & -2 \end{bmatrix} \]

Check:

\[ AA^{-1} = \begin{bmatrix} 4 & 3 \\ 2 & 1 \end{bmatrix} \begin{bmatrix} -1/2 & 3/2 \\ 1 & -2 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I \]
\[ A^{-1}A = \begin{bmatrix} -1/2 & 3/2 \\ 1 & -2 \end{bmatrix} \begin{bmatrix} 4 & 3 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I \]
PAGE 3

Unique Solutions and Matrix Inverses

If \( A^{-1} \) exists, then \( A\vec{x} = \vec{b} \) has a unique solution.

because

\[ A^{-1}A\vec{x} = A^{-1}\vec{b} \]
\[ I\vec{x} = A^{-1}\vec{b} \]
\[ \vec{x} = A^{-1}\vec{b} \]

pre-multiplying both sides by \( A^{-1} \)

Order is important: matrix multiplication is not commutative, so \( A^{-1} \) must be applied on the left of both sides.

Example

Consider the system of equations:

\[ 8x_1 + 6x_2 = 2 \]
\[ 5x_1 + 4x_2 = -1 \]

This can be written in matrix form \( A\vec{x} = \vec{b} \):

\[ \underbrace{\begin{bmatrix} 8 & 6 \\ 5 & 4 \end{bmatrix}}_{A} \underbrace{\begin{bmatrix} x_1 \\ x_2 \end{bmatrix}}_{\vec{x}} = \underbrace{\begin{bmatrix} 2 \\ -1 \end{bmatrix}}_{\vec{b}} \]

First, find the inverse \( A^{-1} \):

\[ A^{-1} = \frac{1}{32-30} \begin{bmatrix} 4 & -6 \\ -5 & 8 \end{bmatrix} = \begin{bmatrix} 2 & -3 \\ -5/2 & 4 \end{bmatrix} \]

Then, solve for \( \vec{x} \):

\[ \vec{x} = A^{-1}\vec{b} = \begin{bmatrix} 2 & -3 \\ -5/2 & 4 \end{bmatrix} \begin{bmatrix} 2 \\ -1 \end{bmatrix} = \begin{bmatrix} 7 \\ -9 \end{bmatrix} \]
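The same solution can be checked numerically. A quick sketch with NumPy (not part of the original notes):

```python
import numpy as np

A = np.array([[8.0, 6.0],
              [5.0, 4.0]])
b = np.array([2.0, -1.0])

# x = A^{-1} b, as in the worked example
x = np.linalg.inv(A) @ b

# In practice np.linalg.solve is preferred: it solves A x = b
# directly by elimination, without forming the inverse explicitly.
x_solve = np.linalg.solve(A, b)
```

Both computations give \( \vec{x} = (7, -9) \), agreeing with the hand calculation.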
PAGE 4

Existence of Solutions

  • If \( A^{-1} \) does not exist, then \( A\vec{x} = \vec{b} \) has either no solution or infinitely many solutions.
  • If \( A^{-1} \) exists, then every \( \vec{b} \) in \( \mathbb{R}^n \) is a linear combination of the columns of \( A \).

A more general method to find \( A^{-1} \) (any \( n \times n \))

Form an augmented matrix \( [A \quad I] \), then row reduce the left half to \( I \); the resulting right half is \( A^{-1} \).

(If the left half cannot be reduced to \( I \), then \( A^{-1} \) does not exist)

\[ [A \quad I] \sim \dots \sim [I \quad A^{-1}] \]
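This augmented-matrix procedure can be sketched in code. A minimal Python implementation of the \( [A \mid I] \sim [I \mid A^{-1}] \) reduction with partial pivoting (the function name and tolerance are my own choices, not from the notes):

```python
import numpy as np

def gauss_jordan_inverse(A, tol=1e-12):
    """Row-reduce [A | I] to [I | A^{-1}]; return None if A is singular."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])            # form the augmented matrix [A | I]
    for col in range(n):
        # partial pivoting: pick the largest entry in this column
        pivot = col + np.argmax(np.abs(M[col:, col]))
        if abs(M[pivot, col]) < tol:
            return None                      # left half cannot be reduced to I
        M[[col, pivot]] = M[[pivot, col]]    # swap rows
        M[col] /= M[col, col]                # scale the pivot row to get a 1
        for r in range(n):                   # clear the rest of the column
            if r != col:
                M[r] -= M[r, col] * M[col]
    return M[:, n:]                          # right half is now A^{-1}
```

Running it on the \( 2 \times 2 \) example below reproduces the same inverse; running it on the singular matrix of the later example returns `None`.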
PAGE 5

Example: Finding the Inverse of a 2x2 Matrix

Given the matrix:

\[ A = \begin{bmatrix} 4 & 3 \\ 2 & 1 \end{bmatrix} \]

We augment the matrix with the identity matrix and perform row operations to find the inverse:

\[ \begin{bmatrix} 4 & 3 & \vdots & 1 & 0 \\ 2 & 1 & \vdots & 0 & 1 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 2 & 1 & \vdots & 0 & 1 \\ 4 & 3 & \vdots & 1 & 0 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 2 & 1 & \vdots & 0 & 1 \\ 0 & 1 & \vdots & 1 & -2 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 2 & 0 & \vdots & -1 & 3 \\ 0 & 1 & \vdots & 1 & -2 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 1 & 0 & \vdots & -1/2 & 3/2 \\ 0 & 1 & \vdots & 1 & -2 \end{bmatrix} \]

Resulting Inverse Matrix \( A^{-1} \):

\[ A^{-1} = \begin{bmatrix} -1/2 & 3/2 \\ 1 & -2 \end{bmatrix} \]
PAGE 6

Example: Finding the Inverse of a 3x3 Matrix

Given the matrix:

\[ A = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \end{bmatrix} \]

Augmenting with the identity matrix:

\[ \begin{bmatrix} 1 & 0 & 0 & \vdots & 1 & 0 & 0 \\ 1 & 1 & 0 & \vdots & 0 & 1 & 0 \\ 1 & 1 & 1 & \vdots & 0 & 0 & 1 \end{bmatrix} \]

Performing row operations:

\[ \sim \begin{bmatrix} 1 & 0 & 0 & \vdots & 1 & 0 & 0 \\ 0 & 1 & 0 & \vdots & -1 & 1 & 0 \\ 0 & 1 & 1 & \vdots & -1 & 0 & 1 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 1 & 0 & 0 & \vdots & 1 & 0 & 0 \\ 0 & 1 & 0 & \vdots & -1 & 1 & 0 \\ 0 & 0 & 1 & \vdots & 0 & -1 & 1 \end{bmatrix} \]

The final inverse matrix is:

\[ A^{-1} = \begin{bmatrix} 1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & -1 & 1 \end{bmatrix} \]
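The result can be verified numerically; a quick NumPy check (not part of the original notes):

```python
import numpy as np

A = np.array([[1, 0, 0],
              [1, 1, 0],
              [1, 1, 1]], dtype=float)
A_inv = np.array([[ 1,  0, 0],
                  [-1,  1, 0],
                  [ 0, -1, 1]], dtype=float)

# Both products should give the 3x3 identity matrix
check1 = A @ A_inv
check2 = A_inv @ A
```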
PAGE 7

Matrix Inversion Example

Example

Consider the matrix A:

\[ A = \begin{bmatrix} 1 & -2 & -1 \\ -1 & 5 & 6 \\ 5 & -4 & 5 \end{bmatrix} \]

We attempt to find the inverse by augmenting with the identity matrix \( I \):

\[ \begin{bmatrix} 1 & -2 & -1 & 1 & 0 & 0 \\ -1 & 5 & 6 & 0 & 1 & 0 \\ 5 & -4 & 5 & 0 & 0 & 1 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 1 & -2 & -1 & 1 & 0 & 0 \\ 0 & 3 & 5 & 1 & 1 & 0 \\ 0 & 6 & 10 & -5 & 0 & 1 \end{bmatrix} \]
\[ \sim \begin{bmatrix} 1 & -2 & -1 & 1 & 0 & 0 \\ 0 & 3 & 5 & 1 & 1 & 0 \\ 0 & 0 & 0 & -7 & -2 & 1 \end{bmatrix} \]

The third row of the left half is all zeros, so the left half can never be reduced to \( I \).

Therefore \( A \) is singular.
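Numerically, singularity shows up as a rank deficiency. A quick check with NumPy (my own illustration, not from the notes):

```python
import numpy as np

A = np.array([[ 1, -2, -1],
              [-1,  5,  6],
              [ 5, -4,  5]], dtype=float)

# Fewer than 3 pivots <=> rank < 3 <=> A is singular
rank = np.linalg.matrix_rank(A)   # 2 for this matrix
det = np.linalg.det(A)            # approximately 0
```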

PAGE 8

The above examples indicate that if \( A \) is row-equivalent to \( I \) then \( A^{-1} \) exists.

This also means if an \( n \times n \) matrix has \( n \) pivots, then it is invertible.

From 1.4 we know if an \( n \times n \) matrix \( A \) has \( n \) pivots, then the columns of \( A \) span \( \mathbb{R}^n \) and the columns are linearly independent.

\( \rightarrow \) if columns of \( A \) are linearly independent, then \( A^{-1} \) exists.

PAGE 9

P. 114

Theorem 8: The Invertible Matrix Theorem

Let \( A \) be a square \( n \times n \) matrix. Then the following statements are equivalent. That is, for a given \( A \), the statements are either all true or all false.

  a. \( A \) is an invertible matrix.
  b. \( A \) is row equivalent to the \( n \times n \) identity matrix.
  c. \( A \) has \( n \) pivot positions.
  d. The equation \( A\mathbf{x} = \mathbf{0} \) has only the trivial solution.
  e. The columns of \( A \) form a linearly independent set.
  f. The linear transformation \( \mathbf{x} \mapsto A\mathbf{x} \) is one-to-one.
  g. The equation \( A\mathbf{x} = \mathbf{b} \) has at least one solution for each \( \mathbf{b} \) in \( \mathbb{R}^n \).
  h. The columns of \( A \) span \( \mathbb{R}^n \).
  i. The linear transformation \( \mathbf{x} \mapsto A\mathbf{x} \) maps \( \mathbb{R}^n \) onto \( \mathbb{R}^n \).
  j. There is an \( n \times n \) matrix \( C \) such that \( CA = I \).
  k. There is an \( n \times n \) matrix \( D \) such that \( AD = I \).
  l. \( A^T \) is an invertible matrix.
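Several of these equivalent statements can be checked numerically for a concrete matrix. A sketch with NumPy using the invertible matrix from the earlier example (the variable names are mine):

```python
import numpy as np

A = np.array([[4.0, 3.0],
              [2.0, 1.0]])
n = A.shape[0]

c_pivots = np.linalg.matrix_rank(A) == n          # (c) A has n pivot positions
l_transpose = np.linalg.matrix_rank(A.T) == n     # (l) A^T is invertible

D = np.linalg.inv(A)
k_right_inverse = np.allclose(A @ D, np.eye(n))   # (k) AD = I
j_left_inverse = np.allclose(D @ A, np.eye(n))    # (j) CA = I, with C = D
```

Since \( A \) is invertible, all of these come out `True`; for a singular matrix they would all come out false together, as the theorem requires.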
PAGE 10

Why \( [A \quad I] \sim \dots \sim [I \quad A^{-1}] \) works

\[ \left[ \begin{array}{cc:cc} 1 & 2 & 1 & 0 \\ 3 & 4 & 0 & 1 \end{array} \right] \sim \dots \]

is the same as solving \( [A \mid \vec{e_1}] \) and \( [A \mid \vec{e_2}] \) at the same time, then put solutions as columns.

If the solution to \( [A \mid \vec{e_1}] \) is \( \vec{b_1} \) and the solution to \( [A \mid \vec{e_2}] \) is \( \vec{b_2} \), then that means:

\[ A\vec{b_1} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \quad \text{and} \quad A\vec{b_2} = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \]

Then for \( B = [\vec{b_1} \quad \vec{b_2}] \):

\[ AB = [A\vec{b_1} \quad A\vec{b_2}] = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \quad \text{and so } B \text{ must be } A^{-1} \]
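The column-by-column argument above can be carried out directly in code: solve \( A\vec{b_i} = \vec{e_i} \) for each standard basis vector and stack the solutions as columns. A NumPy sketch using the matrix from this page:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
n = A.shape[0]

# Solve A b_i = e_i for each standard basis vector e_i,
# then place the solutions as the columns of B
cols = [np.linalg.solve(A, e) for e in np.eye(n)]
B = np.column_stack(cols)

# By construction AB = [Ab_1  Ab_2] = I, so B is A^{-1}
```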
PAGE 11

Elementary matrix

is a matrix that results when one elementary row operation is performed on an identity matrix.

\[ I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \]
\[ \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} \quad R_3 \to R_3 + R_1 \]

This is an elementary matrix. So is the row-swap matrix

\[ \begin{bmatrix} 0 & 1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \quad R_1 \leftrightarrow R_2 \]

All elementary matrices are row equivalent to \( I \), so they are all invertible.
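Left-multiplying by an elementary matrix performs the corresponding row operation. A short NumPy illustration (the test matrix is my own example):

```python
import numpy as np

# Elementary matrix: I3 with the row operation R3 -> R3 + R1 applied
E = np.eye(3)
E[2, 0] = 1.0

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])

# E @ A applies the same row operation to A: row 3 becomes row 3 + row 1
EA = E @ A

# E is invertible, and E^{-1} is the elementary matrix that
# undoes the operation (R3 -> R3 - R1)
E_inv = np.linalg.inv(E)
```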