
7.5 Homogeneous Systems with Constant Coefficients

We want to solve \( \vec{x}' = A\vec{x} \), where \( \vec{x} = \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} \) and \( A \) is a constant \( n \times n \) matrix.

Its solution closely parallels the scalar equation, e.g. \( x' = ax \) (or \( y' = ay \))
\( \hookrightarrow \) solution \( x = ce^{at} \)

If the scalar \( a \) is replaced by a matrix \( A \), what is \( e^{At} \)?

How do we solve \( \vec{x}' = A\vec{x} \)?
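One concrete way to give \( e^{At} \) meaning, sketched here as a preview (this numpy snippet is illustrative, not part of the notes): for a diagonalizable \( A \), \( e^{At} = V e^{\Lambda t} V^{-1} \), where the columns of \( V \) are eigenvectors and \( \Lambda \) is the diagonal matrix of eigenvalues. Using the 2×2 matrix from the worked example below:

```python
import numpy as np

# For a diagonalizable A, e^{At} = V e^{Lambda t} V^{-1},
# where the columns of V are eigenvectors of A.
A = np.array([[0.0, 1.0],
              [-6.0, -5.0]])   # the example matrix used below
lam, V = np.linalg.eig(A)

def expAt(t):
    """Matrix exponential e^{At} via the eigendecomposition of A."""
    return V @ np.diag(np.exp(lam * t)) @ np.linalg.inv(V)

print(expAt(0.0))  # at t = 0 this is the 2x2 identity matrix
```

One can check that \( e^{At} \) satisfies the matrix version of the ODE, \( \frac{d}{dt} e^{At} = A e^{At} \), which is why \( \vec{x}(t) = e^{At}\vec{x}(0) \) solves the system.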

Consider the second-order equation:
\( y'' + 5y' + 6y = 0 \) with initial conditions \( y(0) = 1, y'(0) = 2 \)

Characteristic equation:
\( r^2 + 5r + 6 = 0 \)
\( (r + 2)(r + 3) = 0 \implies r = -2, -3 \)

General solution:
\( y = c_1 e^{-2t} + c_2 e^{-3t} \)

\[ y(0) = 1 \rightarrow 1 = c_1 + c_2 \]

Derivative of solution:
\( y' = -2c_1 e^{-2t} - 3c_2 e^{-3t} \)

\[ y'(0) = 2 \rightarrow 2 = -2c_1 - 3c_2 \]
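The two initial conditions form a 2×2 linear system for \( c_1, c_2 \). A quick numerical check (an illustrative numpy sketch, not part of the original notes):

```python
import numpy as np

# The initial conditions give a 2x2 linear system for c1, c2:
#    c1 +   c2 = 1
# -2*c1 - 3*c2 = 2
M = np.array([[1.0, 1.0],
              [-2.0, -3.0]])
b = np.array([1.0, 2.0])
c1, c2 = np.linalg.solve(M, b)
print(c1, c2)  # c1 = 5, c2 = -4
```

So \( y = 5e^{-2t} - 4e^{-3t} \) solves the initial value problem.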

We turn it into a system of two first-order equations:

\( x_1 = y \)

\( x_2 = y' \)

\( x_1' = x_2 \) (definition of \( x_1, x_2 \))

\( x_2' = -6x_1 - 5x_2 \) (from \( y'' = -6y - 5y' \))

\[ \begin{bmatrix} x_1' \\ x_2' \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \]

This system, \( \vec{x}' = A\vec{x} \), is equivalent to \( y'' + 5y' + 6y = 0 \), so:
\( x_1 = y = c_1 e^{-2t} + c_2 e^{-3t} \)
\( x_2 = y' = -2c_1 e^{-2t} - 3c_2 e^{-3t} \)

Solution:

\[ \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = c_1 e^{-2t} \begin{bmatrix} 1 \\ -2 \end{bmatrix} + c_2 e^{-3t} \begin{bmatrix} 1 \\ -3 \end{bmatrix} \]

Note: \( -2 \) and \( -3 \) in the exponents are the roots of the characteristic eq.

What are these vectors?

must be somehow related to \( A = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \)


Eigenvalues and Eigenvectors

\[ A = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \quad \text{eigenvalues: } \begin{vmatrix} -\lambda & 1 \\ -6 & -5-\lambda \end{vmatrix} = 0 \]

Expanding the determinant:

\[ (-\lambda)(-5-\lambda) + 6 = 0 \]
\[ \lambda^2 + 5\lambda + 6 = 0 \quad \text{characteristic eq.} \]

Solving for \( \lambda \):

\[ \lambda = -2, \lambda = -3 \]

eigenvalues \( \longleftrightarrow \) roots of char. eq.

Eigenvectors

To find eigenvectors, solve \( (A - \lambda I)\vec{v} = \vec{0} \).

\( \lambda = -2 \)

\[ \begin{bmatrix} 2 & 1 & | & 0 \\ -6 & -3 & | & 0 \end{bmatrix} \rightarrow \begin{bmatrix} 2 & 1 & | & 0 \\ 0 & 0 & | & 0 \end{bmatrix} \]

Let \( x_2 = r \); then \( 2x_1 + x_2 = 0 \) gives \( x_1 = -\frac{r}{2} \).

\[ \vec{v} = \begin{bmatrix} -\frac{r}{2} \\ r \end{bmatrix} = -\frac{r}{2} \begin{bmatrix} 1 \\ -2 \end{bmatrix} \rightarrow \text{Eigenvector: } \begin{bmatrix} 1 \\ -2 \end{bmatrix} \]

\( \lambda = -3 \)

\[ \begin{bmatrix} 3 & 1 & | & 0 \\ -6 & -2 & | & 0 \end{bmatrix} \rightarrow \begin{bmatrix} 3 & 1 & | & 0 \\ 0 & 0 & | & 0 \end{bmatrix} \]
From \( 3x_1 + x_2 = 0 \), taking \( x_1 = 1 \) gives \( x_2 = -3 \):
\[ \vec{v} = \begin{bmatrix} 1 \\ -3 \end{bmatrix} \]
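These hand computations can be cross-checked numerically; a small numpy sketch (illustrative, not part of the notes):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-6.0, -5.0]])
eigvals, eigvecs = np.linalg.eig(A)
print(np.sort(eigvals))        # eigenvalues -3 and -2

# Each column of eigvecs is a unit-length eigenvector; rescale so
# the first entry is 1 to compare with the hand-computed vectors.
for lam, v in zip(eigvals, eigvecs.T):
    print(lam, v / v[0])       # (-2, [1, -2]) and (-3, [1, -3])
```

Note that `np.linalg.eig` returns normalized eigenvectors in an unspecified order; any nonzero rescaling of \( \begin{bmatrix} 1 \\ -2 \end{bmatrix} \) or \( \begin{bmatrix} 1 \\ -3 \end{bmatrix} \) is an equally valid eigenvector.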

General Solution

Assuming \( A \) has \( n \) linearly independent eigenvectors \( \vec{v}_1, \dots, \vec{v}_n \) with eigenvalues \( \lambda_1, \dots, \lambda_n \) (e.g., when the eigenvalues are distinct), the solution of \( \vec{x}' = A\vec{x} \) is:

\[ \vec{x} = c_1 e^{\lambda_1 t} \vec{v}_1 + c_2 e^{\lambda_2 t} \vec{v}_2 + \dots + c_n e^{\lambda_n t} \vec{v}_n \]
\[ \vec{x}' = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \vec{x} \quad \vec{x} = c_1 e^{-2t} \begin{bmatrix} 1 \\ -2 \end{bmatrix} + c_2 e^{-3t} \begin{bmatrix} 1 \\ -3 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \]
\[ \begin{aligned} x_1(t) &= c_1 e^{-2t} + c_2 e^{-3t} \quad (y) \\ x_2(t) &= -2c_1 e^{-2t} - 3c_2 e^{-3t} \quad (y') \end{aligned} \quad \vec{x}(0) = \begin{bmatrix} 1 \\ 2 \end{bmatrix} \]
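As a numerical sanity check (illustrative numpy sketch; here \( c_1 = 5, c_2 = -4 \) follow from solving the initial-condition system \( c_1 + c_2 = 1 \), \( -2c_1 - 3c_2 = 2 \)):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-6.0, -5.0]])
v1 = np.array([1.0, -2.0])     # eigenvector for lambda = -2
v2 = np.array([1.0, -3.0])     # eigenvector for lambda = -3
c1, c2 = 5.0, -4.0             # from x(0) = [1, 2]

def x(t):
    """The eigenvector solution x(t) = c1 e^{-2t} v1 + c2 e^{-3t} v2."""
    return c1 * np.exp(-2*t) * v1 + c2 * np.exp(-3*t) * v2

print(x(0.0))                  # [1, 2], matching the initial condition
# check x' = A x at a sample time via a central finite difference
t0, h = 0.7, 1e-6
print((x(t0 + h) - x(t0 - h)) / (2*h) - A @ x(t0))  # approximately [0, 0]
```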

We can graph \( x_1(t) \) and \( x_2(t) \) separately, but often we get more information from a graph of \( x_1 \) vs. \( x_2 \) \( \rightarrow \) a phase diagram (slope-field-like).

[Figure: a 2D coordinate system with \( x_1 \) and \( x_2 \) axes; small arrows indicate the direction of the vector field at various points.]
\[ \vec{x}' = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \vec{x} \]

For each choice of \( \vec{x} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} \), we get \( \vec{x}' \):

  • \( \vec{x} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \implies \vec{x}' = \begin{bmatrix} 0 & 1 \\ -6 & -5 \end{bmatrix} \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ -6 \end{bmatrix} \) (downward)
  • \( \vec{x} = \begin{bmatrix} 0 \\ 1 \end{bmatrix} \implies \vec{x}' = \begin{bmatrix} 1 \\ -5 \end{bmatrix} \)
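Sampling the field this way can be automated; a minimal numpy sketch evaluating \( \vec{x}' = A\vec{x} \) at a few points (the same two points as above, plus two more):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-6.0, -5.0]])
# direction of the field x' = A x at a few sample points
for p in ([1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0]):
    p = np.array(p)
    print(p, '->', A @ p)   # e.g. [1, 0] -> [0, -6]
```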

Qualitative Sketching of Solutions

Very easy to sketch (qualitatively) if we have the solution:

\[ \vec{x} = c_1 e^{-2t} \begin{bmatrix} 1 \\ -2 \end{bmatrix} + c_2 e^{-3t} \begin{bmatrix} 1 \\ -3 \end{bmatrix} \]

Note that as \( t \to \infty \), \( \vec{x} \to \vec{0} \): solutions go to the origin as \( t \) increases.

For large positive \( t \), \( e^{-2t} \gg e^{-3t} \) (both tend to 0, but \( e^{-3t} \) dies faster). So solutions approach the origin along \( \begin{bmatrix} 1 \\ -2 \end{bmatrix} \), the eigenvector attached to \( \lambda = -2 \).

For large negative \( t \), \( e^{-3t} \gg e^{-2t} \). So, far from the origin, solutions follow \( \begin{bmatrix} 1 \\ -3 \end{bmatrix} \).

[Figure: phase portrait in the \( x_1 \)-\( x_2 \) plane showing trajectories curving toward the origin, tangent to the eigenvector lines.]

Solutions that start on the line through \( \begin{bmatrix} 1 \\ -2 \end{bmatrix} \) or through \( \begin{bmatrix} 1 \\ -3 \end{bmatrix} \) stay on that line and go to the origin over time.

A typical solution initially follows \( \begin{bmatrix} 1 \\ -3 \end{bmatrix} \) (for large negative \( t \)), then turns to follow \( \begin{bmatrix} 1 \\ -2 \end{bmatrix} \) on its way to the origin.

The origin here is called a nodal sink: solutions are attracted to it.


If both eigenvalues are positive, we get the same qualitative picture with all directions reversed → the origin is a nodal source.

Example: Saddle Point

Let's look at:

\[ \vec{x}' = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \vec{x} \]

\( \lambda = -1, \quad 3 \)

\( \vec{v} = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad \begin{bmatrix} 1 \\ 1 \end{bmatrix} \)

\( \vec{x} = c_1 e^{-t} \begin{bmatrix} 1 \\ -1 \end{bmatrix} + c_2 e^{3t} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \)
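A quick numerical cross-check of these eigenvalues and eigenvectors (illustrative numpy sketch, not part of the notes):

```python
import numpy as np

B = np.array([[1.0, 2.0],
              [2.0, 1.0]])
eigvals, eigvecs = np.linalg.eig(B)
print(np.sort(eigvals))        # eigenvalues -1 and 3

# rescale each (normalized) eigenvector so its first entry is 1
for lam, v in zip(eigvals, eigvecs.T):
    print(lam, v / v[0])       # (-1, [1, -1]) and (3, [1, 1])
```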

\( \vec{x} \to \vec{0} \) as \( t \to \infty \) only if \( c_2 = 0 \), i.e., only if we start on the line through \( \begin{bmatrix} 1 \\ -1 \end{bmatrix} \). Solutions starting on the line through \( \begin{bmatrix} 1 \\ 1 \end{bmatrix} \) move straight away from the origin, and every other solution eventually runs off along \( \begin{bmatrix} 1 \\ 1 \end{bmatrix} \).

[Figure: phase portrait showing a saddle point at the origin, with hyperbolic trajectories and the eigenvector lines as axes; arrows indicate the direction of increasing \( t \).]

Here, the origin is a saddle point.