An Educated Guess Solution for a Homogeneous System of Differential Equations with Constant Coefficients.
\begin{align*}
\begin{cases}
x_1'& =x_1-x_2 \\
x_2'& =2x_1+4x_2 \\
\end{cases}
\end{align*}
This system can be written in matrix notation as
\begin{align}
\mathbf{x}'=
\left[
\begin{array}{rr}
1 & -1 \\
2 & 4
\end{array}
\right]
\mathbf{x}
\qquad
\text{ or }
\qquad
\mathbf{x}'=A\mathbf{x}
\text{ where }
A=\left[
\begin{array}{rr}
1 & -1 \\
2 & 4
\end{array}
\right]\tag{✶}
\end{align}
Our understanding of functions and their derivatives might lead us to make an educated guess that
\(x_i=v_ie^{\lambda t}\) for some scalars
\(v_1\text{,}\) \(v_2\text{,}\) and
\(\lambda\text{.}\) Why? Differentiating \(v_ie^{\lambda t}\) simply multiplies it by \(\lambda\text{,}\) and each right-hand side of the system is a linear combination of \(x_1\) and \(x_2\text{,}\) so exponential functions give both sides of each equation the same form.
So we are hoping to find a solution to
(✶) of the form
\begin{gather*}
\left[
\begin{array}{r}
x_1 \\
x_2
\end{array}
\right]
=
\left[
\begin{array}{r}
v_1 \\
v_2
\end{array}
\right]
e^{\lambda t}
\qquad
\mathbf{x}=\mathbf{v}e^{\lambda t}
\end{gather*}
Substituting this potential solution into
(✶) will place some restrictions on
\(\mathbf{v}\) and
\(\lambda\text{.}\)
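Carrying out the substitution: since \(\mathbf{x}=\mathbf{v}e^{\lambda t}\) has derivative \(\mathbf{x}'=\lambda\mathbf{v}e^{\lambda t}\text{,}\) the equation \(\mathbf{x}'=A\mathbf{x}\) becomes
\begin{align*}
\lambda\mathbf{v}e^{\lambda t} &= A\mathbf{v}e^{\lambda t}\\
\lambda\mathbf{v} &= A\mathbf{v}\\
(A-\lambda I)\mathbf{v} &= \mathbf{0}
\end{align*}
where we may cancel the scalar factor \(e^{\lambda t}\) because it is never zero.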
We are looking for a set of 2
linearly independent solutions (
\(\mathbf{x_1}\) and
\(\mathbf{x_2}\)) to
\begin{align*}
\mathbf{x}'=A\mathbf{x}
\text{ where }
A=\left[
\begin{array}{rr}
1 & -1 \\
2 & 4
\end{array}
\right]
\end{align*}
We are hoping to find solutions of the form
\(\mathbf{x}=\mathbf{v}e^{\lambda t}\text{.}\) We now know that these solutions have to satisfy
\begin{gather}
(A-\lambda I)\mathbf{v}=\mathbf{0}\tag{#}
\end{gather}
Since \(\mathbf{v}=\mathbf{0}\) gives only the trivial solution \(\mathbf{x}=\mathbf{0}\text{,}\) which can never be part of a linearly independent set, we need to make sure that
\(\mathbf{v} \neq \mathbf{0}\text{.}\) In other words, we seek
nontrivial solutions to
(#).
Theorem 162.
The system of equations
\begin{gather*}
(A-\lambda I)\mathbf{v}=\mathbf{0}
\end{gather*}
has a nontrivial solution if and only if
\begin{gather*}
\det(A-\lambda I) = 0
\end{gather*}
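In the \(2\times 2\) case, this determinant condition is the familiar characteristic equation:
\begin{gather*}
\det\left[
\begin{array}{cc}
a-\lambda & b \\
c & d-\lambda
\end{array}
\right]
=(a-\lambda)(d-\lambda)-bc = 0
\end{gather*}
which is a quadratic in \(\lambda\text{,}\) so a \(2\times 2\) matrix has at most two distinct eigenvalues.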
Example 161 continued: Continuing our efforts to find solutions of the form
\(\mathbf{x}=\mathbf{v}e^{\lambda t}\) to
\begin{align*}
\mathbf{x}'=A\mathbf{x}
\text{ where }
A=\left[
\begin{array}{rr}
1 & -1 \\
2 & 4
\end{array}
\right]
\end{align*}
we now know that we seek \(\lambda\) values that satisfy
\begin{gather*}
\det(A-\lambda I)=(1-\lambda)(4-\lambda)-(-1)(2)=\lambda^2-5\lambda+6=(\lambda-3)(\lambda-2)=0.
\end{gather*}
Having found that we require
\(\lambda=3\) or
\(\lambda=2\text{,}\) we now seek nonzero vectors
\(\mathbf{v}\) for each
\(\lambda\) value that satisfy
\begin{align*}
(A-\lambda I)\mathbf{v} = \mathbf{0}\\
\left[
\begin{array}{cc}
1-\lambda & -1 \\
2 & 4-\lambda
\end{array}
\right]\mathbf{v} = \mathbf{0}
\end{align*}
-
We have only one equation for two variables. This means that we have one degree of freedom. In other words, we can freely choose a value for one of these variables. The other variable is then forced by the value we choose for the first.
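For instance, taking \(\lambda=3\) gives
\begin{align*}
\left[
\begin{array}{rr}
-2 & -1 \\
2 & 1
\end{array}
\right]\mathbf{v} = \mathbf{0}
\end{align*}
Both rows encode the single equation \(2v_1+v_2=0\text{.}\) Choosing \(v_1=1\) forces \(v_2=-2\text{,}\) giving the solution
\begin{gather*}
\mathbf{x_1}=\left[
\begin{array}{r}
1 \\
-2
\end{array}
\right]e^{3t}.
\end{gather*}
Similarly, \(\lambda=2\) leaves the single equation \(v_1+v_2=0\text{,}\) giving
\begin{gather*}
\mathbf{x_2}=\left[
\begin{array}{r}
1 \\
-1
\end{array}
\right]e^{2t}.
\end{gather*}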
Show that
\(\mathbf{x_1}\) and
\(\mathbf{x_2}\) are linearly independent.
So we now see that the general solution to
\begin{align*}
\mathbf{x}'=A\mathbf{x}
\text{ where }
A=\left[
\begin{array}{rr}
1 & -1 \\
2 & 4
\end{array}
\right]
\end{align*}
is given by
\begin{gather*}
\mathbf{x}(t)=c_1\left[
\begin{array}{r}
1 \\
-2
\end{array}
\right]e^{3t}
+c_2\left[
\begin{array}{r}
1 \\
-1
\end{array}
\right]e^{2t}.
\end{gather*}
Eigenvalues, Eigenvectors, and Solutions of Homogeneous Systems of Differential Equations with Constant Coefficients.
Definition 163.
The number \(\lambda\) (real, complex, zero, or nonzero) is an eigenvalue of the matrix \(A\) if
\begin{gather*}
\det(A-\lambda I) = 0
\end{gather*}
If \(\lambda\) is an eigenvalue of \(A\text{,}\) then any nonzero vector \(\mathbf{v}\) that satisfies
\begin{gather*}
A\mathbf{v}=\lambda \mathbf{v}
\end{gather*}
or, equivalently,
\begin{gather*}
(A-\lambda I)\mathbf{v} = \mathbf{0}
\end{gather*}
is called an eigenvector of \(A\) associated with the eigenvalue \(\lambda\text{.}\)
In our very first example, the solutions we found to
\begin{gather*}
\mathbf{x}'=A\mathbf{x}
\end{gather*}
were of the form
\begin{gather*}
\mathbf{v}e^{\lambda t}
\end{gather*}
where \(\lambda\) was an eigenvalue of \(A\) and \(\mathbf{v}\) was an associated eigenvector.
Theorem 164.
If \(\lambda\) is an eigenvalue of the constant coefficient matrix \(A\) and \(\mathbf{v}\) is an eigenvector of \(A\) associated with \(\lambda\text{,}\) then
\begin{gather*}
\mathbf{x}(t)=\mathbf{v}e^{\lambda t}
\end{gather*}
is a solution of the linear homogeneous system of differential equations with constant coefficients
\begin{gather*}
\mathbf{x}'=A\mathbf{x}.
\end{gather*}
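Why this works: if \(A\mathbf{v}=\lambda\mathbf{v}\text{,}\) then differentiating gives
\begin{gather*}
\mathbf{x}'(t)=\lambda\mathbf{v}e^{\lambda t}=(A\mathbf{v})e^{\lambda t}=A\big(\mathbf{v}e^{\lambda t}\big)=A\mathbf{x}(t).
\end{gather*}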
You should look at the Compartmental Analysis subsection of Example 2 about multiple brine tanks in Section 5.2 of your textbook.
Definition 168.
The
complex conjugate of
\(\lambda=p+qi\) is
\(\overline{\lambda}=p-qi\text{.}\)
The
complex conjugate of a matrix \(A=[a_{ij}]\) is
\(\overline{A}=[\overline{a_{ij}}]\text{.}\)
Properties of the Complex Conjugate
-
If
\(r\) is a real number then
\(\overline{r}=r\text{.}\)
-
\(\displaystyle \overline{\alpha+\beta}=\overline{\alpha}+\overline{\beta}\)
-
\(\displaystyle \overline{\alpha-\beta}=\overline{\alpha}-\overline{\beta}\)
-
\(\displaystyle \overline{\alpha\beta}=\overline{\alpha}\overline{\beta}\)
-
\(\overline{\left(\frac{\alpha}{\beta}\right)}=
\frac{\overline{\alpha}}{\overline{\beta}}\text{,}\) provided
\(\beta \neq 0\)
Theorem 169.
Let \(A \in \mathbb{R}^{n\times n}\text{.}\) If there are real vector functions \(\mathbf{p}(t)\) and \(\mathbf{q}(t)\) such that
\begin{gather*}
\mathbf{v}(t)=\mathbf{p}(t)\pm i\mathbf{q}(t)
\end{gather*}
are solutions to
\begin{gather}
\mathbf{x}'=A\mathbf{x}\tag{#}
\end{gather}
then
\(\mathbf{p}(t)\) and
\(\mathbf{q}(t)\) are also solutions of
(#).
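This follows because linear combinations of solutions of a homogeneous linear system are again solutions: writing the two given solutions as \(\mathbf{p}(t)+i\mathbf{q}(t)\) and \(\mathbf{p}(t)-i\mathbf{q}(t)\text{,}\)
\begin{gather*}
\mathbf{p}(t)=\tfrac{1}{2}\big[(\mathbf{p}(t)+i\mathbf{q}(t))+(\mathbf{p}(t)-i\mathbf{q}(t))\big]
\qquad\text{and}\qquad
\mathbf{q}(t)=\tfrac{1}{2i}\big[(\mathbf{p}(t)+i\mathbf{q}(t))-(\mathbf{p}(t)-i\mathbf{q}(t))\big].
\end{gather*}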
Theorem 170.
Let
\(A \in \mathbb{R}^{n\times n}\text{.}\) If
\(\alpha+i\beta\) is an eigenvalue of
\(A\) with eigenvector
\(\mathbf{a}+i\mathbf{b}\text{,}\) then
\(\alpha-i\beta\) is an eigenvalue of
\(A\) with eigenvector
\(\mathbf{a}-i\mathbf{b}\text{.}\)
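For instance, for the illustrative matrix
\begin{align*}
A=\left[
\begin{array}{rr}
0 & -1 \\
1 & 0
\end{array}
\right]
\end{align*}
the eigenvalue \(\lambda=0+1\cdot i\) has eigenvector
\(\left[\begin{array}{r} 1 \\ 0 \end{array}\right]+i\left[\begin{array}{r} 0 \\ -1 \end{array}\right]
=\left[\begin{array}{r} 1 \\ -i \end{array}\right]\text{,}\)
and the theorem tells us, with no further computation, that \(\lambda=0-1\cdot i\) has eigenvector
\(\left[\begin{array}{r} 1 \\ 0 \end{array}\right]-i\left[\begin{array}{r} 0 \\ -1 \end{array}\right]
=\left[\begin{array}{r} 1 \\ i \end{array}\right]\text{.}\)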
Eigenvalues with multiplicity.
So far, we have been able to find general solutions to \(\mathbf{x}'=A\mathbf{x}\) because the eigenvalues have been distinct and, consequently, we have been able to find a set of \(n\) linearly independent eigenvectors. But what happens when the eigenvalues are not distinct? Said another way, what happens when some of the eigenvalues have multiplicity \(>1\text{?}\) Sometimes this is not a problem. Consider the matrix
\begin{align*}
A=\left[
\begin{array}{rr}
5 & 0 \\
0 & 5
\end{array}
\right]
\end{align*}
It is not too hard to see that this matrix has only one eigenvalue, namely \(\lambda=5\) of multiplicity 2, but we can use this eigenvalue to find 2 linearly independent eigenvectors associated with \(\lambda = 5\text{.}\)
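Indeed, here
\begin{align*}
A-5I=\left[
\begin{array}{rr}
0 & 0 \\
0 & 0
\end{array}
\right]
\end{align*}
so \((A-5I)\mathbf{v}=\mathbf{0}\) is satisfied by every vector; in particular the linearly independent vectors
\(\left[\begin{array}{r} 1 \\ 0 \end{array}\right]\) and
\(\left[\begin{array}{r} 0 \\ 1 \end{array}\right]\)
are both eigenvectors associated with \(\lambda=5\text{.}\)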
But what happens when an eigenvalue
\(\lambda\) has multiplicity
\(k\text{,}\) but there are fewer than
\(k\) linearly independent eigenvectors for
\(\lambda\text{?}\) The next example shows this situation.
Example 174. Defective eigenvalues.
Find a general solution for \(\mathbf{x}'=A\mathbf{x}\) where
\begin{align*}
A=\left[
\begin{array}{rr}
-7 & 18 \\
-2 & 5
\end{array}
\right]
\end{align*}
-
At home, you should check that the characteristic equation, \(\det(A-\lambda I) = 0\text{,}\) is
\begin{gather*}
(\lambda+1)^2=0.
\end{gather*}
Verify that \((A-(-1)I)\mathbf{v}=(A+I)\mathbf{v}=\mathbf{0}\) has one degree of freedom, and find one solution, \(\mathbf{x_1}\text{,}\) of \(\mathbf{x}'=A\mathbf{x}\text{.}\)
-
We might suspect that we could find a second solution of
\(\mathbf{x}'=A\mathbf{x}\) that has the form
\(\displaystyle \mathbf{x_2}= \left[\begin{array}{c}
u_1 \\ u_2 \end{array}\right]te^{-t}\text{.}\) Show that if
\(\mathbf{x_2}\) has this form, then
\(\mathbf{x_2} = \displaystyle \left[\begin{array}{c}
0 \\ 0 \end{array}\right]\text{.}\)
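One way to organize these checks:
\begin{align*}
\det(A-\lambda I) &= (-7-\lambda)(5-\lambda)-(18)(-2)\\
&= \lambda^2+2\lambda+1=(\lambda+1)^2
\end{align*}
so \(\lambda=-1\) is the only eigenvalue and has multiplicity 2. The system \((A+I)\mathbf{v}=\mathbf{0}\) reduces to the single equation \(v_1-3v_2=0\text{,}\) so one choice is
\begin{gather*}
\mathbf{x_1}=\left[
\begin{array}{r}
3 \\
1
\end{array}
\right]e^{-t}.
\end{gather*}
For the guess \(\mathbf{x_2}=\mathbf{u}te^{-t}\text{,}\) substituting into \(\mathbf{x}'=A\mathbf{x}\) gives
\begin{gather*}
\mathbf{u}e^{-t}-\mathbf{u}te^{-t}=A\mathbf{u}te^{-t},
\end{gather*}
and matching the \(e^{-t}\) terms with no factor of \(t\) forces \(\mathbf{u}=\mathbf{0}\text{.}\)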
Definition 177.
Let \(\lambda\) be an eigenvalue of \(A\) of multiplicity \(k>1\text{.}\)
-
If
\(\lambda\) has
\(k\) linearly independent eigenvectors, then
\(\lambda\) is a
complete eigenvalue.
-
If
\(\lambda\) has only
\(p<k\) linearly independent eigenvectors, then
\(\lambda\) is a
defective eigenvalue and its
deficit is
\(k-p\text{.}\)
Procedure for Defective Eigenvalues of Multiplicity 2
-
Find an eigenvector \(\mathbf{v_1}\text{:}\)
\begin{equation*}
(A-\lambda I)\mathbf{v_1} = \mathbf{0}\text{.}
\end{equation*}
-
Solve
\begin{equation*}
(A-\lambda I)\mathbf{v_2} = \mathbf{v_1}\text{.}
\end{equation*}
-
Then
\begin{gather*}
\mathbf{x_1}(t)=\mathbf{v_1}e^{\lambda t}
\end{gather*}
and
\begin{gather*}
\mathbf{x_2}(t)=(\mathbf{v_1}t+\mathbf{v_2})e^{\lambda t}
\end{gather*}
are linearly independent solutions of \(\mathbf{x}'=A\mathbf{x}\text{.}\)
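For instance, continuing Example 174, where one valid eigenvector choice for \(\lambda=-1\) is \(\mathbf{v_1}=\left[\begin{array}{r} 3 \\ 1 \end{array}\right]\text{,}\) solving
\begin{gather*}
(A+I)\mathbf{v_2}=\left[
\begin{array}{rr}
-6 & 18 \\
-2 & 6
\end{array}
\right]\mathbf{v_2}
=\left[
\begin{array}{r}
3 \\
1
\end{array}
\right]
\end{gather*}
gives, for example, \(\mathbf{v_2}=\left[\begin{array}{r} -1/2 \\ 0 \end{array}\right]\text{,}\) so
\begin{gather*}
\mathbf{x_1}(t)=\left[
\begin{array}{r}
3 \\
1
\end{array}
\right]e^{-t}
\qquad\text{and}\qquad
\mathbf{x_2}(t)=\left(\left[
\begin{array}{r}
3 \\
1
\end{array}
\right]t+\left[
\begin{array}{r}
-1/2 \\
0
\end{array}
\right]\right)e^{-t}.
\end{gather*}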