
5.7 Applications to Differential Equations

HW 28+29 due together

Basic Differential Equation Example

Consider the basic differential equation: \( x'(t) = a x(t) \), where \( x \) is a scalar function of \( t \).

A solution is a function \( x(t) \) that satisfies the D.E. The equation

\( x'(t) = a x(t) \)

has solution \( x(t) = C e^{at} \), where \( C \) is an arbitrary constant.

Check: \( x'(t) = C \cdot a e^{at} = a \cdot \underbrace{C e^{at}}_{x(t)} \)
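The same check can be done numerically. This is a minimal sketch (the code, constants, and tolerance are illustrative, not part of the notes): a centered finite difference of \( x(t) = C e^{at} \) should match \( a\,x(t) \).

```python
import numpy as np

# Check x'(t) = a x(t) for x(t) = C e^{a t} via a centered difference.
a, C = 2.0, 3.0                  # arbitrary constants for the check

def x(t):
    return C * np.exp(a * t)

t, h = 0.7, 1e-6
deriv = (x(t + h) - x(t - h)) / (2 * h)   # numerical x'(t)
assert abs(deriv - a * x(t)) < 1e-4       # matches a * x(t)
```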

System of First-Order Linear Differential Equations

Now consider a system of first-order linear D.E.'s:

\[ \begin{aligned} x_1' &= a_{11} x_1 + a_{12} x_2 + \dots + a_{1n} x_n \\ x_2' &= a_{21} x_1 + a_{22} x_2 + \dots + a_{2n} x_n \\ &\vdots \\ x_n' &= a_{n1} x_1 + a_{n2} x_2 + \dots + a_{nn} x_n \end{aligned} \]

Simple Example:

\[ \begin{aligned} x_1' &= x_1 + 2x_2 \\ x_2' &= 3x_1 + 4x_2 \end{aligned} \]

Matrix Representation

The system can be written in vector form as:

\[ \begin{bmatrix} x_1' \\ \vdots \\ x_n' \end{bmatrix} = x_1 \begin{bmatrix} a_{11} \\ \vdots \\ a_{n1} \end{bmatrix} + x_2 \begin{bmatrix} a_{12} \\ \vdots \\ a_{n2} \end{bmatrix} + \dots + x_n \begin{bmatrix} a_{1n} \\ \vdots \\ a_{nn} \end{bmatrix} \]
\[ = \begin{bmatrix} a_{11} & a_{12} & \dots & a_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \dots & a_{nn} \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} \]

\( \vec{x}' = A \vec{x} \)

Example

\[ \begin{aligned} x_1' &= x_1 + 2x_2 \\ x_2' &= 3x_1 + 4x_2 \end{aligned} \] \( \Rightarrow \) \[ \vec{x}' = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \vec{x} \]

How to solve this?

\( \vec{x} = ? \)


Systems of Linear Differential Equations

Simple Case:

\[ \begin{cases} x_1' = x_1 \\ x_2' = 2x_2 \end{cases} \Rightarrow \vec{x}' = \begin{bmatrix} 1 & 0 \\ 0 & 2 \end{bmatrix} \vec{x} \]

Each equation involves only one unknown, so solve each separately with single-variable calculus:

\[ \begin{cases} x_1 = C_1 e^t \\ x_2 = C_2 e^{2t} \end{cases} \]

As vector equation:

\[ \vec{x} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = C_1 e^t \begin{bmatrix} 1 \\ 0 \end{bmatrix} + C_2 e^{2t} \begin{bmatrix} 0 \\ 1 \end{bmatrix} \]

Fundamental solutions (or eigenfunctions)

Each is a solution of \( \vec{x}' = A\vec{x} \) and a linear combination of them is also a solution.

Each solution is of the form:

\[ \vec{x} = e^{\lambda t} \vec{v} \]

What are \( \lambda \) and \( \vec{v} \)?

\[ \vec{x}' = A\vec{x} \quad \text{solution: } \vec{x} = e^{\lambda t} \vec{v} \]

Then \( \vec{x}' = \lambda e^{\lambda t} \vec{v} \).

\[ \lambda e^{\lambda t} \vec{v} = A e^{\lambda t} \vec{v} \quad (e^{\lambda t} \neq 0) \]

\( \lambda \vec{v} = A \vec{v} \)

So \( \lambda, \vec{v} \) are the eigenvalue/eigenvector pair of \( A \).

  • If \( A \) is \( 2 \times 2 \) (with 2 linearly independent eigenvectors), there are 2 fundamental solutions.
  • If \( A \) is \( n \times n \) (with \( n \) linearly independent eigenvectors), there are \( n \) fundamental solutions.

Each is \( e^{\lambda t} \vec{v} \).
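This recipe can be sketched numerically; the code below is an illustration (NumPy usage and tolerances are mine, not from the notes). For each eigenpair \( (\lambda, \vec{v}) \) of the earlier example matrix, it checks by finite differences that \( \vec{x}(t) = e^{\lambda t} \vec{v} \) satisfies \( \vec{x}' = A\vec{x} \).

```python
import numpy as np

# For each eigenpair (lam, v) of A, x(t) = e^{lam t} v solves x' = A x.
A = np.array([[1.0, 2.0], [3.0, 4.0]])   # matrix from the earlier example
lam, V = np.linalg.eig(A)                # eigenvectors are columns of V

t, h = 0.1, 1e-6
for i in range(2):
    def x(s, i=i):
        return np.exp(lam[i] * s) * V[:, i]
    deriv = (x(t + h) - x(t - h)) / (2 * h)   # numerical x'(t)
    assert np.allclose(deriv, A @ x(t), atol=1e-3)
```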

Cases:

  • \( \lambda \)'s are distinct
  • \( \lambda \)'s are complex
  • \( \lambda \)'s repeated (we won't look at this in 5.7)

Example: Systems of Linear Differential Equations

\[ \vec{x}' = \underbrace{\begin{bmatrix} 9 & -2 \\ 6 & 1 \end{bmatrix}}_{A} \vec{x} \quad \begin{matrix} x_1' = 9x_1 - 2x_2 \\ x_2' = 6x_1 + x_2 \end{matrix} \]

Eigenvalues and Eigenvectors:

\[ \lambda = 3, \quad 7 \]\[ \vec{v} = \begin{bmatrix} 1 \\ 3 \end{bmatrix}, \quad \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

2 Fundamental Solutions:

\[ \vec{x}_1 = e^{3t} \begin{bmatrix} 1 \\ 3 \end{bmatrix} \]\[ \vec{x}_2 = e^{7t} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

General Solution:

\[ \vec{x} = c_1 \vec{x}_1 + c_2 \vec{x}_2 = c_1 e^{3t} \begin{bmatrix} 1 \\ 3 \end{bmatrix} + c_2 e^{7t} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

\( c_1, c_2 \) come from initial conditions, e.g. \( \vec{x}(0) = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \):

\[ \begin{bmatrix} 1 \\ 0 \end{bmatrix} = c_1 \begin{bmatrix} 1 \\ 3 \end{bmatrix} + c_2 \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

Solving gives \( c_1 = -\frac{1}{2}, \quad c_2 = \frac{3}{2} \).
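This example can be verified numerically. A minimal NumPy sketch (the code is illustrative, not part of the notes): confirm the eigenpairs and recover \( c_1, c_2 \) by solving \( V\vec{c} = \vec{x}(0) \), where the columns of \( V \) are the eigenvectors.

```python
import numpy as np

A = np.array([[9.0, -2.0], [6.0, 1.0]])
V = np.array([[1.0, 1.0],            # eigenvectors [1, 3] and [1, 1]
              [3.0, 1.0]])           # stored as the columns of V
lams = np.array([3.0, 7.0])

# Confirm A v = lam v for each eigenpair.
for i in range(2):
    assert np.allclose(A @ V[:, i], lams[i] * V[:, i])

# The initial condition determines the constants: solve V c = x(0).
c = np.linalg.solve(V, np.array([1.0, 0.0]))   # c = [-1/2, 3/2]
```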


Note: \( \vec{x} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} \) is also a solution.

The origin \( \begin{bmatrix} 0 \\ 0 \end{bmatrix} \) is known as an equilibrium solution.

Classification of the Origin

  • If both \( \lambda \)'s are positive, solutions move away from the origin; the origin is a source (or repeller).
  • If both \( \lambda \)'s are negative, solutions move toward the origin; the origin is a sink (or attractor).
  • If the \( \lambda \)'s have mixed signs, the origin is a saddle point (toward \( \vec{0} \) in some directions, away in others).

In this example:

\[ \lambda = 3, \quad \vec{v} = \begin{bmatrix} 1 \\ 3 \end{bmatrix} \]\[ \lambda = 7, \quad \vec{v} = \begin{bmatrix} 1 \\ 1 \end{bmatrix} \]

Along each eigenvector direction, trajectories point away from the origin if \( \lambda > 0 \) and toward it if \( \lambda < 0 \).

[Phase portrait for this system on the \( x_1 \)–\( x_2 \) axes: trajectories move away from the origin along the eigenvector directions \( \begin{bmatrix} 1 \\ 3 \end{bmatrix} \) and \( \begin{bmatrix} 1 \\ 1 \end{bmatrix} \).]

Complex \(\lambda\)'s: Solutions are Spirals

Example

\[ \vec{x}' = \begin{bmatrix} -8 & 10 \\ -1 & -2 \end{bmatrix} \vec{x} \]

Eigenvalues and Eigenvectors:

\[ \lambda = -5 + i, \quad -5 - i \]\[ \vec{v} = \begin{bmatrix} 3 - i \\ 1 \end{bmatrix}, \quad \begin{bmatrix} 3 + i \\ 1 \end{bmatrix} \]

Fundamental Solutions

\[ \vec{x}_1 = e^{(-5+i)t} \begin{bmatrix} 3 - i \\ 1 \end{bmatrix} \]\[ \vec{x}_2 = e^{(-5-i)t} \begin{bmatrix} 3 + i \\ 1 \end{bmatrix} \]

These solutions are complex-valued.

General Solution

\[ \vec{x} = c_1 e^{(-5+i)t} \begin{bmatrix} 3 - i \\ 1 \end{bmatrix} + c_2 e^{(-5-i)t} \begin{bmatrix} 3 + i \\ 1 \end{bmatrix} \]

The solution \( \vec{x} \) itself is real-valued, but the two terms on the right are complex-valued.


This solution is inconvenient in many applications; we need a real-valued equivalent.

\[ \begin{aligned} \vec{x}_1 &= e^{(-5+i)t} \begin{bmatrix} 3 - i \\ 1 \end{bmatrix} \\ &= e^{-5t} e^{it} \begin{bmatrix} 3 - i \\ 1 \end{bmatrix} \quad \text{where } e^{it} = \cos t + i \sin t \\ &= e^{-5t} (\cos t + i \sin t) \begin{bmatrix} 3 - i \\ 1 \end{bmatrix} \\ &= e^{-5t} \begin{bmatrix} 3 \cos t + \sin t - i \cos t + 3i \sin t \\ \cos t + i \sin t \end{bmatrix} \\ \vec{x}_1 &= e^{-5t} \left( \begin{bmatrix} 3 \cos t + \sin t \\ \cos t \end{bmatrix} + i \begin{bmatrix} 3 \sin t - \cos t \\ \sin t \end{bmatrix} \right) \end{aligned} \]

Repeat with \(\vec{x}_2\):

\[ \vec{x}_2 = e^{-5t} \left( \begin{bmatrix} 3 \cos t + \sin t \\ \cos t \end{bmatrix} - i \begin{bmatrix} 3 \sin t - \cos t \\ \sin t \end{bmatrix} \right) \]

Real-Valued Fundamental Solutions

Let

\[ \vec{u} = \text{Re}(\vec{x}_1) = e^{-5t} \begin{bmatrix} 3 \cos t + \sin t \\ \cos t \end{bmatrix} \] \[ \vec{v} = \text{Im}(\vec{x}_1) = e^{-5t} \begin{bmatrix} 3 \sin t - \cos t \\ \sin t \end{bmatrix} \]

Each is now a real-valued fundamental solution.

General Solution:

\[ \vec{x} = c_1 e^{-5t} \begin{bmatrix} 3 \cos t + \sin t \\ \cos t \end{bmatrix} + c_2 e^{-5t} \begin{bmatrix} 3 \sin t - \cos t \\ \sin t \end{bmatrix} \]

Everything is real.
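Because \( A \) is real, the real and imaginary parts of a complex solution each solve \( \vec{x}' = A\vec{x} \) on their own. A minimal numerical sketch of this (code and tolerances are illustrative, not from the notes): check that \( \vec{u}(t) = \text{Re}(\vec{x}_1) \) satisfies the system.

```python
import numpy as np

A = np.array([[-8.0, 10.0], [-1.0, -2.0]])

def u(t):
    # Real part of the complex fundamental solution x1 derived above.
    return np.exp(-5 * t) * np.array([3 * np.cos(t) + np.sin(t),
                                      np.cos(t)])

t, h = 0.3, 1e-6
deriv = (u(t + h) - u(t - h)) / (2 * h)   # numerical u'(t)
assert np.allclose(deriv, A @ u(t), atol=1e-4)
```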

Stability and Spiraling Behavior

If \( \lambda = a \pm ib \):

  • If \( a \) (\( \text{Re}(\lambda) \)) is positive, solutions spiral away from \( \vec{0} \).
  • If \( a \) (\( \text{Re}(\lambda) \)) is negative, solutions spiral into \( \vec{0} \).

3x3 System Example

\[ A = \begin{bmatrix} -3 & -10 & 0 \\ 6 & 5 & 6 \\ -1 & 7 & -4 \end{bmatrix} \]

Eigenvalues:

\( \lambda = -3, -1, 2 \)

Eigenvectors:

\[ \vec{v} = \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}, \begin{bmatrix} -5 \\ 1 \\ 4 \end{bmatrix}, \begin{bmatrix} -4 \\ 2 \\ 3 \end{bmatrix} \]

General Solution:

\[ \vec{x} = c_1 e^{-3t} \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} + c_2 e^{-t} \begin{bmatrix} -5 \\ 1 \\ 4 \end{bmatrix} + c_3 e^{2t} \begin{bmatrix} -4 \\ 2 \\ 3 \end{bmatrix} \]

Mixed signs, so \( \vec{0} \) is a saddle point.

  • If \( c_3 = 0 \), solutions go to \( \vec{0} \) as \( t \to \infty \).
  • If \( c_1 = c_2 = 0 \), solutions go away from \( \vec{0} \) as \( t \to \infty \).
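A quick numerical check of the eigen-data (NumPy usage is assumed, not part of the notes; note the \( \lambda = -3 \) eigenvector is \( [-1, 0, 1]^T \), which satisfies \( A\vec{v} = -3\vec{v} \)):

```python
import numpy as np

A = np.array([[-3.0, -10.0,  0.0],
              [ 6.0,   5.0,  6.0],
              [-1.0,   7.0, -4.0]])

# Confirm A v = lam v for each claimed eigenpair.
pairs = [(-3.0, [-1.0, 0.0, 1.0]),
         (-1.0, [-5.0, 1.0, 4.0]),
         ( 2.0, [-4.0, 2.0, 3.0])]
for lam, v in pairs:
    v = np.array(v)
    assert np.allclose(A @ v, lam * v)
```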

Decoupling Differential Equations

\[ \vec{x}' = \underbrace{\begin{bmatrix} 9 & -2 \\ 6 & 1 \end{bmatrix}}_{A} \vec{x} \qquad \begin{aligned} x_1' &= 9x_1 - 2x_2 \\ x_2' &= 6x_1 + x_2 \end{aligned} \]

"Coupled" because \( x_i' \) depends on other \( x \)'s.

Decoupling by Diagonalizing A

These can be decoupled by diagonalizing \( A \).

\[ \begin{aligned} \vec{x}' &= A \vec{x} \\ &= PDP^{-1} \vec{x} \end{aligned} \]

Let \( \vec{y} = P^{-1} \vec{x} \)

So \( P \vec{y} = \vec{x} \)

Differentiating (\( P \) is constant): \( P \vec{y}' = \vec{x}' \), i.e. \( \vec{y}' = P^{-1} \vec{x}' \).

\[ \begin{aligned} \vec{x}' &= PD \vec{y} \\ P^{-1} \vec{x}' &= P^{-1} P D \vec{y} \\ \vec{y}' &= D \vec{y} \end{aligned} \]

Note: \( D \) is diagonal.

\( \vec{y}' = D \vec{y} \) is a decoupled system: each \( y_i' = \lambda_i y_i \) can be solved on its own. Solve for \( \vec{y} \), then recover \( \vec{x} = P \vec{y} \).
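The decoupling recipe can be sketched numerically with the page-5 matrix. Only the matrices come from the notes; the NumPy code, the evaluation time \( t = 0.5 \), and the initial condition are illustrative.

```python
import numpy as np

A = np.array([[9.0, -2.0], [6.0, 1.0]])
P = np.array([[1.0, 1.0], [3.0, 1.0]])   # eigenvectors as columns
D = np.diag([3.0, 7.0])                  # eigenvalues on the diagonal
assert np.allclose(P @ D @ np.linalg.inv(P), A)   # A = P D P^{-1}

# Decoupled system: y_i' = lam_i y_i, so y_i(t) = y_i(0) e^{lam_i t}.
x0 = np.array([1.0, 0.0])
y0 = np.linalg.solve(P, x0)              # y(0) = P^{-1} x(0)
t = 0.5
x_t = P @ (y0 * np.exp(np.diag(D) * t))  # x(t) = P y(t)
```

This returns the same value as the general solution \( c_1 e^{3t} \begin{bmatrix} 1 \\ 3 \end{bmatrix} + c_2 e^{7t} \begin{bmatrix} 1 \\ 1 \end{bmatrix} \) with \( c_1 = -\frac{1}{2}, c_2 = \frac{3}{2} \).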