PAGE 1

5.5 (continued)

From last time:

\[ \vec{x}' = \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix} \vec{x} \]
\[ \vec{x} = c_1 e^{2t} \begin{bmatrix} 1 \\ 0 \end{bmatrix} + c_2 e^{2t} \left( t \begin{bmatrix} 1 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix} \right) \]

Sketch of the phase diagram

\( \vec{x} \to \vec{0} \) as \( t \to -\infty \). Increasing \( t \): move away.

Near the origin, \( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \) dominates in \( t \begin{bmatrix} 1 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \end{bmatrix} \).

Leave origin along \( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \) (true eigenvector)

As \( t \) increases, \( \begin{bmatrix} 1 \\ 0 \end{bmatrix} \) still dominates but \( \begin{bmatrix} 0 \\ 1 \end{bmatrix} \) still contributes (both multiplied by \( e^{2t} \)).
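The structure above can be verified numerically. A minimal NumPy sketch (not part of the original notes, assuming NumPy is available) checking the defect, the eigenvector chain, and that the second solution really satisfies \( \vec{x}' = A\vec{x} \):

```python
import numpy as np

# The 2x2 system from above: lambda = 2 is a double eigenvalue
# with only one true eigenvector, [1, 0]^T.
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])
lam = 2.0
N = A - lam * np.eye(2)

# Defect of one: N is not the zero matrix, but N^2 is.
assert not np.allclose(N, 0)
assert np.allclose(N @ N, 0)

# Chain: N v1 = 0 (true eigenvector), N v2 = v1 (generalized).
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
assert np.allclose(N @ v1, 0)
assert np.allclose(N @ v2, v1)

# Check x(t) = e^{2t}(t v1 + v2) satisfies x' = A x at a sample t:
# product rule gives x'(t) = e^{2t}(2 (t v1 + v2) + v1).
t = 0.7
x = np.exp(lam * t) * (t * v1 + v2)
xprime = np.exp(lam * t) * (lam * (t * v1 + v2) + v1)
assert np.allclose(xprime, A @ x)
print("2x2 defective chain verified")
```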

PAGE 2

Phase Portrait Analysis and $3 \times 3$ Systems

[Figure: phase portrait in the $(x_1, x_2)$-plane. Green trajectories approach the $x_1$-axis asymptotically near the origin; red dashed lines of negative slope fill the plane.]
  • $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ : true eigenvector (along the $x_1$-axis)
  • $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$ : generalized eigenvector
  • red : sum of the two

Above the $x_1$-axis: as $t$ increases, we move left on the red dashed line and on to the next "level".

Below the $x_1$-axis: as $t \to \infty$, we move right.

Key point: only the true eigenvector is "visible" (asymptote near origin)

Now let's look at $3 \times 3$ systems ($\lambda$ repeats 3 times)

Possibilities:

  1. Full set of eigenvectors (matrix is complete)
  2. Missing one eigenvector (defect of one)
  3. Missing two eigenvectors (defect of two)
PAGE 3

Case 3: Missing Two Eigenvectors

Let's start with case 3 (missing two), the easier case.

\[ \vec{x}' = \begin{bmatrix} 2 & 1 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{bmatrix} \vec{x} \quad \lambda = 2, 2, 2 \quad \vec{v} = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \text{ missing two} \]

Find Two Generalized Eigenvectors

Let \( \vec{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \)

Then, just like for \( 2 \times 2 \) systems,

\[ \begin{aligned} (A - \lambda I) \vec{v}_2 &= \vec{v}_1 \\ (A - \lambda I) \vec{v}_3 &= \vec{v}_2 \end{aligned} \]

\[ (A - \lambda I) \vec{v}_2 = \vec{v}_1 \rightarrow \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \vec{v}_2 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \]
\[ \vec{v}_2 = \begin{bmatrix} a \\ 1 \\ 0 \end{bmatrix} \; (a \text{ free}); \text{ pick } a = 0: \quad \vec{v}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \]

Repeat for \( \vec{v}_3 \):

\[ \vec{v}_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \]
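The chain conditions can be checked numerically. A quick NumPy sketch (an addition, assuming NumPy) for this defect-two example:

```python
import numpy as np

# 3x3 example with lambda = 2 repeated three times and two
# missing eigenvectors (defect of two).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
N = A - 2.0 * np.eye(3)

v1 = np.array([1.0, 0.0, 0.0])   # true eigenvector
v2 = np.array([0.0, 1.0, 0.0])   # first generalized eigenvector
v3 = np.array([0.0, 0.0, 1.0])   # second generalized eigenvector

# Chain conditions: N v1 = 0, N v2 = v1, N v3 = v2.
assert np.allclose(N @ v1, 0)
assert np.allclose(N @ v2, v1)
assert np.allclose(N @ v3, v2)
print("chain verified")
```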
PAGE 4

Solutions for Systems of Differential Equations

  • Solution 1: \( e^{\lambda t} \vec{v}_1 \)
  • Solution 2: \( e^{\lambda t} (t \vec{v}_1 + \vec{v}_2) \)
  • Solution 3: \( e^{\lambda t} \left( \frac{1}{2} t^2 \vec{v}_1 + t \vec{v}_2 + \vec{v}_3 \right) \)

For this example, the general solution is

\[ \vec{x} = c_1 e^{2t} \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + c_2 e^{2t} \left( t \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \right) + c_3 e^{2t} \left( \frac{1}{2} t^2 \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} + t \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} + \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right) \]
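The third solution can be checked against the ODE directly. A small NumPy sketch (an addition, assuming NumPy) verifying \( \vec{x}_3' = A\vec{x}_3 \) at several sample times:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 2.0]])
lam = 2.0
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([0.0, 0.0, 1.0])

def x3(t):
    # third solution: e^{lam t} (t^2/2 v1 + t v2 + v3)
    return np.exp(lam * t) * (0.5 * t**2 * v1 + t * v2 + v3)

def x3_prime(t):
    # product rule: lam * x3(t) + e^{lam t} (t v1 + v2)
    return lam * x3(t) + np.exp(lam * t) * (t * v1 + v2)

for t in (-1.0, 0.0, 0.5, 2.0):
    assert np.allclose(x3_prime(t), A @ x3(t))
print("solution 3 satisfies x' = A x")
```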

alternative method:

From
\[ (A - \lambda I) \vec{v}_1 = \vec{0} \]
\[ (A - \lambda I) \vec{v}_2 = \vec{v}_1 \]
\[ (A - \lambda I) \vec{v}_3 = \vec{v}_2 \]

multiply the 3rd equation by \( (A - \lambda I) \):
\[ (A - \lambda I)^2 \vec{v}_3 = (A - \lambda I) \vec{v}_2 = \vec{v}_1 \]

one more time

\[ (A - \lambda I)^3 \vec{v}_3 = (A - \lambda I) \vec{v}_1 = \vec{0} \]

If we are missing \( k \) eigenvectors (defect of \( k \)),

then \( (A - \lambda I)^{k+1} = 0 \) (the zero matrix)

PAGE 5

Matrix Nilpotency and Generalized Eigenvectors

Given the matrix difference:

\[ A - \lambda I = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \]

Calculating the third power of this matrix:

\[ (A - \lambda I)^3 = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \]

\[ (A - \lambda I)^{k+1} = 0 \]

This is ALWAYS the zero matrix for a defect of \( k \): \( A - \lambda I \) is nilpotent of order \( k+1 \).
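This nilpotency is easy to confirm numerically. A short NumPy sketch (an addition, assuming NumPy), using `matrix_power`:

```python
import numpy as np

# A - lambda*I for the 3x3 example (defect k = 2)
N = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

# N^{k+1} = N^3 should be the zero matrix, while N and N^2 are not.
assert not np.allclose(N, 0)
assert not np.allclose(np.linalg.matrix_power(N, 2), 0)
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
print("N is nilpotent of order 3")
```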

Revisiting the Generalized Eigenvector Equation

Consider the equation:

\[ (A - \lambda I)^3 \vec{v}_3 = \vec{0} \]

Since \( (A - \lambda I)^3 \) is the zero matrix, we have:

\[ \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \vec{v}_3 = \vec{0} \]

Therefore, \( \vec{v}_3 \) is arbitrary*.

* Pick any \( \vec{v}_3 \neq \vec{0} \); it must also be linearly independent of the true eigenvector.

True eigenvector: \[ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \]

PAGE 6

Generalized Eigenvector Chains and Solutions

So, let's pick \( \vec{v}_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \) (other choices, such as \( \begin{bmatrix} 0 \\ 0 \\ \pi \end{bmatrix} \), are ok too).

Now rebuild the entire chain, including \( \vec{v}_1 \):

\[ (A - \lambda I) \vec{v}_3 = \vec{v}_2 \quad \dots \quad \vec{v}_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \]
\[ (A - \lambda I) \vec{v}_2 = \vec{v}_1 \quad \dots \quad \vec{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \]

Here, \( \vec{v}_1 \) happens to match the true eigenvector we found earlier, but that is NOT always the case.

General Solutions

  • Solution 1: \( e^{\lambda t} \vec{v}_1 \)
  • Solution 2: \( e^{\lambda t} (t \vec{v}_1 + \vec{v}_2) \)
  • Solution 3: \( e^{\lambda t} (\frac{1}{2} t^2 \vec{v}_1 + t \vec{v}_2 + \vec{v}_3) \)

I call the 1st method "stepping up"

I call the 2nd method "stepping down"

The case of missing one eigenvector is annoying with stepping up.

PAGE 7

Solving a System of Differential Equations with a Defective Matrix

\[ \vec{x}' = \begin{bmatrix} 5 & -3 & -2 \\ 8 & -5 & -4 \\ -4 & 3 & 3 \end{bmatrix} \vec{x} \]

Eigenvalues:

\( \lambda = 1, 1, 1 \)

Eigenvectors (only two, for a triple eigenvalue):

\[ \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix}, \quad \begin{bmatrix} 0 \\ 2 \\ -3 \end{bmatrix} \]

need \( \vec{v}_3 \)

The defect is one, so:

\[ (A - \lambda I)^{1+1} \vec{v}_3 = \vec{0} \]
\[ (A - \lambda I)^{1+1} = 0 \quad \text{(0 matrix)} \]

so,

\[ \underbrace{(A - \lambda I)^2}_{\text{zeros}} \vec{v}_3 = \vec{0} \quad \rightarrow \quad \vec{v}_3 \text{ is arbitrary} \]

Conditions for \( \vec{v}_3 \):

  • \( \vec{v}_3 \neq 0 \)
  • \( \vec{v}_3 \) is independent from \( \vec{v}_1 \) and \( \vec{v}_2 \)

Possible choices for \( \vec{v}_3 \):

\[ \vec{v}_3 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} \text{ or } \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} \text{ or } \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} \text{ or } \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \]
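The defect-one structure of this matrix can be confirmed numerically. A NumPy sketch (an addition, assuming NumPy) checking that \( (A - I)^2 \) is the zero matrix and that both true eigenvectors are killed by \( A - I \):

```python
import numpy as np

# Defect-one example: lambda = 1 repeats three times,
# with two true eigenvectors.
A = np.array([[ 5.0, -3.0, -2.0],
              [ 8.0, -5.0, -4.0],
              [-4.0,  3.0,  3.0]])
N = A - np.eye(3)

# Defect of one: N is nonzero, but N^2 is the zero matrix.
assert not np.allclose(N, 0)
assert np.allclose(N @ N, 0)

# Both true eigenvectors satisfy (A - I) v = 0.
for v in (np.array([1.0, 0.0, 2.0]), np.array([0.0, 2.0, -3.0])):
    assert np.allclose(N @ v, 0)
print("(A - I)^2 = 0; defect-one structure confirmed")
```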
PAGE 8

Generalized Eigenvector Chains

Let's go with:

\[ \vec{v}_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \]

Rebuild the chain:

\[ (A - \lambda I) \vec{v}_3 = \vec{v}_2 \quad \dots \quad \vec{v}_2 = \begin{bmatrix} -2 \\ -4 \\ 2 \end{bmatrix} \]

When we do \( (A - \lambda I) \vec{v}_2 = \vec{v}_1 \):

\[ \begin{bmatrix} 4 & -3 & -2 \\ 8 & -6 & -4 \\ -4 & 3 & 2 \end{bmatrix} \begin{bmatrix} -2 \\ -4 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \]

Not allowed: \( \vec{v}_1 \) can't be \( \vec{0} \).

This means the \( \vec{v}_2 \) we found is itself a true eigenvector: it lies in the eigenspace spanned by the two true eigenvectors, so the chain stops here.

Pick either true eigenvector to be \( \vec{v}_1 \):

\[ \vec{v}_1 = \begin{bmatrix} 1 \\ 0 \\ 2 \end{bmatrix} \]

Solutions:

  • \( e^{\lambda t} \vec{v}_1 \)
  • \( e^{\lambda t} \vec{v}_2 \)
  • \( e^{\lambda t} (t \vec{v}_2 + \vec{v}_3) \)
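The stepping-down chain and all three solutions can be verified at once. A closing NumPy sketch (an addition, assuming NumPy):

```python
import numpy as np

A = np.array([[ 5.0, -3.0, -2.0],
              [ 8.0, -5.0, -4.0],
              [-4.0,  3.0,  3.0]])
lam = 1.0
N = A - lam * np.eye(3)

# Stepping down from v3 = [0, 0, 1]^T:
v3 = np.array([0.0, 0.0, 1.0])
v2 = N @ v3                       # gives [-2, -4, 2]^T
assert np.allclose(v2, [-2.0, -4.0, 2.0])
assert np.allclose(N @ v2, 0)     # v2 is a true eigenvector: chain stops

# Pick one of the true eigenvectors as v1:
v1 = np.array([1.0, 0.0, 2.0])

# Check all three solutions satisfy x' = A x at a sample t.
t = 0.3
e = np.exp(lam * t)
sols = [
    (e * v1,            lam * e * v1),                     # x1, x1'
    (e * v2,            lam * e * v2),                     # x2, x2'
    (e * (t * v2 + v3), e * (lam * (t * v2 + v3) + v2)),   # x3, x3'
]
for x, xp in sols:
    assert np.allclose(xp, A @ x)
print("all three solutions verified")
```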