Handout Lesson 14, General Solutions of Linear Equations

Textbook Section(s).

This lesson is based on Section 3.2 of your textbook by Edwards, Penney, and Calvis.

Minor Tweaks.

In the last lesson, we looked at second-order linear differential equations. Today we extend those results to \(n\)th-order linear differential equations. In this section, I am going to start by including several important definitions and theorems directly from your textbook so that we can quickly discuss how these compare with definitions and theorems that we learned in the last class.

Definition 97.

An \(n\)th-order differential equation in the function \(y(x)\) is a differential equation equivalent to an equation of the form
\begin{equation*} F(x,y,y^{(1)},y^{(2)},\dots, y^{(n)})=0\text{.} \end{equation*}

Definition 98.

An \(n\)th-order differential equation is linear if it is equivalent to an equation of the form
\begin{gather} P_0(x)y^{(n)}+P_1(x)y^{(n-1)}+\dots+P_{n-1}(x)y'+P_n(x)y=F(x)\tag{✢} \end{gather}
(In order for this to be an \(n\)th-order differential equation, we require that \(P_0(x)\) not be the zero function.)
Unless otherwise noted, you should assume that \(P_i(x)\) (\(i\in\set{0,1,\dots,n}\)) and \(F(x)\) are continuous on an open interval \(I\text{.}\)
Provided that \(P_0(x) \neq 0\text{,}\) we can rewrite (✢) as
\begin{gather} y^{(n)}+p_1(x)y^{(n-1)}+\dots+p_{n-1}(x)y'+p_n(x)y=f(x)\tag{✢✢} \end{gather}
The homogeneous linear equation associated with (✢✢) is
\begin{equation*} y^{(n)}+p_1(x)y^{(n-1)}+\dots+p_{n-1}(x)y'+p_n(x)y=0\text{.} \end{equation*}
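For instance (an illustrative example, not one from the textbook's exercises), the third-order equation
\begin{equation*} y^{(3)}+xy'-2y=e^{x} \end{equation*}
is linear and nonhomogeneous, and its associated homogeneous equation is
\begin{equation*} y^{(3)}+xy'-2y=0\text{.} \end{equation*}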
Here are some definitions and theorems from your textbook about \(n\)th-order linear differential equations.

Definition 101.

The \(n\) functions \(f_1,f_2, \dots f_n\) are linearly dependent on the interval \(I\) if there exist constants \(c_1, c_2, \dots, c_n\text{,}\) not all zero, such that the linear combination
\begin{equation*} c_1f_1+c_2f_2+\dots+c_nf_n \end{equation*}
is equal to the zero function on \(I\text{.}\) In other words,
\begin{equation*} c_1f_1(x)+c_2f_2(x)+\dots+c_nf_n(x)=0 \end{equation*}
for all \(x \in I\text{.}\)
If \(f_1, f_2, \dots, f_n\) are not linearly dependent, then they are linearly independent.

Example 102. Verifying linear dependence.

(Exercise 2 from Section 3.2 of your textbook)
Prove that the functions are linearly dependent by finding a non-trivial linear combination of the functions that is identically zero.
\begin{equation*} f(x)=5, \quad g(x)=2-3x^2, \quad h(x)=10+15x^2 \end{equation*}
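One possible combination (a sketch; try to recover the reasoning before checking): if \(c_1f+c_2g+c_3h\) is the zero function, then the constant terms and the \(x^2\) terms must cancel separately, giving \(5c_1+2c_2+10c_3=0\) and \(-3c_2+15c_3=0\text{.}\) So \(c_2=5c_3\) and \(c_1=-4c_3\text{;}\) taking \(c_3=1\) gives
\begin{equation*} -4f(x)+5g(x)+h(x)=-20+(10-15x^2)+(10+15x^2)=0 \end{equation*}
for all \(x\text{,}\) which shows the functions are linearly dependent.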

Definition 103. Wronskian.

The Wronskian, \(W(f_1,f_2,\dots,f_n)\text{,}\) of the functions \(f_1, f_2, \dots, f_n\) is defined by
\begin{equation*} W(f_1,f_2,\dots,f_n)=\left\lvert \begin{array}{cccc} f_1(x) & f_2(x) & \dots & f_n(x) \\ f_1'(x) & f_2'(x) & \dots & f_n'(x) \\ \vdots & \vdots & &\vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \dots & f_n^{(n-1)}(x) \\ \end{array} \right\rvert \end{equation*}
provided that these functions can be differentiated \(n-1\) times.
Note: \(W(f_1,f_2,\dots,f_n)\) is used to denote a function of \(x\text{,}\) so you may also see the Wronskian written as \(W(x)\) if we wish to emphasize that it can be evaluated at \(x\text{.}\)
In this class, we will only be calculating \(2 \times 2\) or \(3 \times 3\) determinants. You will learn more about determinants in linear algebra.
We looked at \(2 \times 2\) determinants in the last class.
\(\left\lvert \begin{array}{cc} a & b \\ c & d \end{array} \right\rvert \)
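As a reminder (a formula you can verify against last class's notes),
\begin{equation*} \left\lvert \begin{array}{cc} a & b \\ c & d \end{array} \right\rvert = ad-bc\text{.} \end{equation*}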
Today, we will encounter several \(3\times 3\) determinants. I will show you two different, but equivalent ways to calculate a \(3 \times 3\) determinant. This should be a review from Calculus II.
\(\left\lvert \begin{array}{ccc} a & b & c \\ d & e & f \\ g & h & i \\ \end{array} \right\rvert \)
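For reference, here is a sketch of both approaches; they should match what we write out in class. Expanding along the first row (cofactor expansion),
\begin{equation*} \left\lvert \begin{array}{ccc} a & b & c \\ d & e & f \\ g & h & i \\ \end{array} \right\rvert = a\left\lvert \begin{array}{cc} e & f \\ h & i \end{array} \right\rvert - b\left\lvert \begin{array}{cc} d & f \\ g & i \end{array} \right\rvert + c\left\lvert \begin{array}{cc} d & e \\ g & h \end{array} \right\rvert\text{,} \end{equation*}
and multiplying out each \(2\times 2\) determinant gives the equivalent diagonal (Sarrus) formula
\begin{equation*} aei+bfg+cdh-ceg-afh-bdi\text{.} \end{equation*}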

Example 105. The Wronskian and independence.

(Exercise 9 from Section 3.2 of your textbook)
Use the Wronskian to show that the functions are linearly independent on the indicated interval.
\begin{equation*} f(x)=e^x, \quad g(x)=\cos(x), \quad h(x)=\sin(x), \quad x \in \mathbb{R} \end{equation*}
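A sketch of the computation (check the derivatives and the determinant expansion yourself): expanding along the first column,
\begin{equation*} W(x)=\left\lvert \begin{array}{ccc} e^x & \cos(x) & \sin(x) \\ e^x & -\sin(x) & \cos(x) \\ e^x & -\cos(x) & -\sin(x) \\ \end{array} \right\rvert = e^x\left(\sin^2(x)+\cos^2(x)\right)-e^x\left(0\right)+e^x\left(\cos^2(x)+\sin^2(x)\right)=2e^x\text{.} \end{equation*}
Since \(2e^x \neq 0\) for every \(x\in\mathbb{R}\text{,}\) the functions are linearly independent on \(\mathbb{R}\text{.}\)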

Example 107. An IVP.

(Exercise 14 from Section 3.2 of your textbook)
The functions \(y_1=e^x\text{,}\) \(y_2=e^{2x}\text{,}\) and \(y_3=e^{3x}\) are linearly independent solutions of the third-order linear differential equation
\begin{equation*} y^{(3)}-6y''+11y'-6y=0. \end{equation*}
(Verify the linear independence at home!)
Find a particular solution satisfying \(y(0)=0\text{,}\) \(y'(0)=0\text{,}\) \(y''(0)=3\text{.}\)
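A sketch of the solution (the arithmetic is worth doing yourself): every solution of the differential equation has the form \(y=c_1e^x+c_2e^{2x}+c_3e^{3x}\text{,}\) and the initial conditions give the linear system
\begin{gather*} c_1+c_2+c_3=0\\ c_1+2c_2+3c_3=0\\ c_1+4c_2+9c_3=3 \end{gather*}
whose solution is \(c_1=\frac{3}{2}\text{,}\) \(c_2=-3\text{,}\) \(c_3=\frac{3}{2}\text{.}\) So the desired particular solution is \(y(x)=\frac{3}{2}e^x-3e^{2x}+\frac{3}{2}e^{3x}\text{.}\)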

What’s New?

New Item 1: Trivial initial conditions.

The Existence and Uniqueness Theorem for Linear Equations implies that if the coefficient functions of
\begin{equation*} y^{(n)}+p_1(x)y^{(n-1)}+\dots+p_{n-1}(x)y'+p_n(x)y=0 \end{equation*}
are continuous at \(a\text{,}\) then the only solution of this homogeneous differential equation that satisfies the trivial initial conditions
\begin{equation*} y(a)=y'(a)=\dots = y^{(n-1)}(a)=0 \end{equation*}
is the trivial solution \(y\equiv 0\text{.}\)
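For example (an illustration, not from the textbook): every solution of \(y''+y=0\) has the form \(y=c_1\cos(x)+c_2\sin(x)\text{.}\) Imposing the trivial initial conditions \(y(0)=0\) and \(y'(0)=0\) forces \(c_1=0\) and \(c_2=0\text{,}\) so the only solution of that IVP is \(y\equiv 0\text{,}\) just as the theorem predicts.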

New Item 2: Nonhomogeneous equations.

We now consider nonhomogeneous equations, as shown in (#), and how their solutions relate to their associated homogeneous equations, as shown in (†).
\begin{gather} y^{(n)}+p_1(x)y^{(n-1)}+\dots+p_{n-1}(x)y'+p_n(x)y=f(x)\tag{#}\\ y^{(n)}+p_1(x)y^{(n-1)}+\dots+p_{n-1}(x)y'+p_n(x)y=0\tag{†} \end{gather}
The corresponding theorem in your textbook essentially says that any solution of (#) can be written as the sum of a specified particular solution of (#) and one of the solutions of the associated homogeneous equation given by (†).
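In symbols, the statement amounts to this: if \(y_p\) is one particular solution of (#) and \(y_1, y_2, \dots, y_n\) are \(n\) linearly independent solutions of (†), then every solution of (#) can be written as
\begin{equation*} y=c_1y_1+c_2y_2+\dots+c_ny_n+y_p \end{equation*}
for some choice of the constants \(c_1, c_2, \dots, c_n\text{.}\)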

New Item 3: Reduction of Order.

Recall that when the characteristic equation
\begin{equation*} ar^2+br+c=0 \end{equation*}
for the homogeneous equation
\begin{equation*} ay''+by'+cy=0 \end{equation*}
had a repeated real root \(r_1\) (i.e. \(b^2-4ac=0\text{,}\) so the only solution of the characteristic equation was the real number \(r_1\)), then both \(y_1=e^{r_1x}\) and \(y_2=xe^{r_1x}\) were solutions of our homogeneous differential equation. Notice that the second solution is the product of the first solution and a function of \(x\text{.}\) It is not uncommon for a second solution to have this form, and the reduction of order technique exploits this: it takes a known solution and looks for a second solution that is the product of the known solution and a function of \(x\text{.}\) Let’s demonstrate this technique with an example.

Example 109. Reduction of order.

(Exercise 37 from Section 3.2 of your textbook)
It can be verified that \(y_1(x)=x^3\) is a solution of the differential equation
\begin{equation*} x^2y''-5xy'+9y=0, \quad x>0 \end{equation*}
Use the method of reduction of order to find a second linearly independent solution of this equation.
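A sketch of the technique applied here (fill in the differentiation details yourself): look for a second solution of the form \(y_2(x)=v(x)\,x^3\text{.}\) Then \(y_2'=v'x^3+3vx^2\) and \(y_2''=v''x^3+6v'x^2+6vx\text{,}\) so substituting into the differential equation gives
\begin{equation*} x^2\left(v''x^3+6v'x^2+6vx\right)-5x\left(v'x^3+3vx^2\right)+9vx^3=v''x^5+v'x^4=0\text{.} \end{equation*}
Dividing by \(x^4\) (remember \(x>0\)) leaves \(xv''+v'=0\text{,}\) a first-order equation in \(v'\) whose solutions are \(v'=C/x\text{.}\) Taking \(C=1\) gives \(v(x)=\ln(x)\text{,}\) so \(y_2(x)=x^3\ln(x)\) is a second, linearly independent solution.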