Differential Equations Classification
- Ordinary/Partial
- An equation involving only ordinary derivatives with respect to a single independent variable is called an ordinary differential equation.
- An equation involving partial derivatives with respect to more than one independent variable is a partial differential equation.
- Linear/Non-linear
- Linear: One in which the dependent variable \(y\) and its derivatives appear only in additive combinations of their first powers. It may be written in the form
\[ a_{n}(x) \frac{d^{n} y}{d x^{n}}+a_{n-1}(x) \frac{d^{n-1} y}{d x^{n-1}}+\ldots+a_{1}(x) \frac{d y}{d x}+a_{0}(x) y=F(x) \]
- Order
- The order of the highest-order derivative present in the equation.
- Homogeneous/Non-homogeneous
- Homogeneous: One that has no terms involving only the independent variable \(x\) (or constants); for the linear form above, \(F(x)=0\).
Explicit/Implicit Solutions
An explicit solution is a function which, when substituted for the dependent variable, satisfies the equation for every value of the independent variable in a given interval.
An implicit solution is a relation that defines one or more explicit solutions.
Initial Value Problems
Definition
By an Initial Value Problem (IVP) we mean an \(n\)th-order differential equation
\[ F\left(x, y, \frac{d y}{d x}, \ldots, \frac{d^{n} y}{d x^{n}}\right)=0 \]
and the \(n\) initial conditions
\[ y\left(x_{0}\right)=y_{0}, \frac{d y}{d x}\left(x_{0}\right)=y_{1}, \ldots, \frac{d^{n-1} y}{d x^{n-1}}\left(x_{0}\right)=y_{n-1} \]
where \(y_{i}, i=0, \ldots, n-1\) are given numbers.
Solution of an Initial Value Problem
By an explicit solution of the Initial Value Problem defined above we mean a function \(y=y(x)\) that satisfies the differential equation and all of the initial conditions above.
Separable Equations
Determining whether an equation is separable
If the right-hand side of the equation \(\frac{d y}{d x}=f(x, y)\) can be expressed as a function \(g(x)\) that depends only on \(x\) times a function \(p(y)\) that depends only on \(y\), then the differential equation is called separable.
In other words, a first-order equation is separable if it can be written in the following form.
\[ \frac{d y}{d x}=g(x) p(y) \]
Solving Separable Equations
\[ \frac{d y}{d x}=g(x) p(y) \Longleftrightarrow \int \frac{1}{p(y)} d y=\int g(x) d x \]
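As a quick check of this recipe, here is a minimal sympy sketch for the illustrative separable equation \(\frac{dy}{dx} = x\,y\) (so \(g(x)=x\), \(p(y)=y\); the equation is an illustrative choice, not from the notes):

```python
# dy/dx = x*y separates to  dy/y = x dx  =>  ln|y| = x**2/2 + C.
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

sol = sp.dsolve(sp.Eq(y(x).diff(x), x * y(x)), y(x))
print(sol)   # y(x) = C1*exp(x**2/2), matching the separated integrals
```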
Solving Linear Equations with an Integrating Factor
We consider equations of the form
\[ \frac{d y}{d x}+P(x) y=Q(x) \]
The key idea: multiply both sides by a function \(\mu(x)\) to get
\[ \mu(x) \frac{d y}{d x}+\mu(x) P(x) y=\mu(x) Q(x) \]
and hope that we can determine \(\mu(x)\) so that the terms \(\mu(x) \frac{d y}{d x}\) and \(\mu(x) P(x) y\) combine as
\[ \mu(x) \frac{d y}{d x}+\mu(x) P(x) y=\frac{d}{d x}(\mu(x) y) \]
Expanding the right-hand side gives \(\mu(x) \frac{d y}{d x}+\mu^{\prime}(x) y\), so matching it with the left-hand side requires \(\mu(x)\) to satisfy
\[ \mu^{\prime}(x)=\mu(x) P(x) \Longrightarrow \mu(x)=e^{\int P(x) d x} \]
and thus we can write down the general solution of the original equation as
\[ y(x)=\frac{1}{\mu(x)}\left(\int \mu(x) Q(x) d x + C\right) \]
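A minimal sympy sketch of this formula for the illustrative equation \(y^{\prime}+2y=e^{x}\) (so \(P(x)=2\), \(Q(x)=e^{x}\); both are illustrative choices):

```python
# Integrating-factor recipe: mu = exp(∫P dx), y = (∫mu*Q dx + C)/mu.
import sympy as sp

x, C = sp.symbols('x C')
P, Q = sp.Integer(2), sp.exp(x)

mu = sp.exp(sp.integrate(P, x))             # mu(x) = exp(2x)
y = (sp.integrate(mu * Q, x) + C) / mu      # y = exp(x)/3 + C*exp(-2x)
print(sp.simplify(y))
print(sp.simplify(y.diff(x) + P*y - Q))     # 0, so the equation is satisfied
```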
Checking for Exactness
\(M(x, y) d x+N(x, y) d y=0\) is an exact equation iff
\[ \frac{\partial M}{\partial y}(x, y)=\frac{\partial N}{\partial x}(x, y) \quad \forall (x, y) \in \mathbb{R}^{2} \]
Exact Differential Equations: The Algorithm
If \(M(x, y) d x+N(x, y) d y=0\) is exact (verify with the criterion above), then there exists \(F(x, y)\) with \(\frac{\partial F}{\partial x}=M\) and \(\frac{\partial F}{\partial y}=N\), and the solutions are given implicitly by \(F(x, y)=C\).
- \(\frac{\partial F}{\partial x}=M \Longrightarrow F(x, y)=\int M(x, y) d x+g(y)\)
- Take the partial derivative of the above with respect to \(y\), set it equal to \(N\), and solve for \(g^{\prime}(y)\)
- Integrate \(g^{\prime}(y)\) to get \(g(y)\) and substitute back into the expression above to obtain \(F(x, y)\); the solutions are given by \(F(x, y)=C\) (a worked sketch follows)
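A minimal sympy sketch of this algorithm for the illustrative exact equation \((2xy+1)\,dx + x^{2}\,dy = 0\) (an illustrative choice, not from the notes):

```python
# Exact-equation algorithm: F = ∫M dx + g(y), then fix g(y) by matching F_y = N.
import sympy as sp

x, y = sp.symbols('x y')
M, N = 2*x*y + 1, x**2

assert sp.simplify(sp.diff(M, y) - sp.diff(N, x)) == 0   # exactness check

Fx = sp.integrate(M, x)                     # ∫M dx = x**2*y + x  (g(y) pending)
gprime = sp.simplify(N - sp.diff(Fx, y))    # g'(y) = N - ∂/∂y ∫M dx = 0 here
F = Fx + sp.integrate(gprime, y)
print(F)                                    # x**2*y + x; solution: F(x, y) = C
```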
Special Integrating Factors
\[ \begin{aligned} &\frac{\frac{\partial M}{\partial y}-\frac{\partial N}{\partial x}}{N}=T(x) \Longrightarrow \mu(x, y)=\mu(x)=\exp \left[\int T(x) d x\right] \\ &\frac{\frac{\partial N}{\partial x}-\frac{\partial M}{\partial y}}{M}=S(y) \Longrightarrow \mu(x, y)=\mu(y)=\exp \left[\int S(y) d y\right] \end{aligned} \]
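For example (a standard illustrative case, not from the notes), for \((3xy+y^{2})\,dx+(x^{2}+xy)\,dy=0\) with \(M=3xy+y^{2}\), \(N=x^{2}+xy\):
\[ \frac{\frac{\partial M}{\partial y}-\frac{\partial N}{\partial x}}{N}=\frac{(3x+2y)-(2x+y)}{x(x+y)}=\frac{1}{x}=T(x) \Longrightarrow \mu(x)=e^{\int dx/x}=x \]
and multiplying through by \(\mu(x)=x\) yields the exact equation \((3x^{2}y+xy^{2})\,dx+(x^{3}+x^{2}y)\,dy=0\).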
Solutions Lost or Gained
- Lost: when multiplying the equation by \(\frac {p(x, y)} {q(x, y)}\), check whether \(q(x, y)=0\) defines a solution of the original equation; if so, it may be lost.
- Gained: when multiplying by \(\mu(x, y)\), identify those solutions of \(\mu(x, y)=0\) that are not solutions of the original equation; they are extraneous.
Solving Second Order Homogeneous Linear Equations
To solve
\[ a y^{\prime \prime} + b y^{\prime} + c y = 0, \]
solve the auxiliary (characteristic) equation
\[ a r^{2}+b r+c=0 \]
There are three cases
- Two distinct real roots \(r_1 \neq r_2\) → \(y(x) = C_1 e^{r_1 x} + C_2 e^{r_2 x}\)
- Repeated real root \(r_1 = r_2 = r\) → \(y(x)=C_{1} e^{r x}+C_{2} x e^{r x}\)
- Complex conjugate roots \(\alpha \pm \beta i\) → two linearly independent solutions are \(e^{\alpha x} \cos \beta x\) and \(e^{\alpha x} \sin \beta x\), and a general solution is
\[ y(x)=c_{1} e^{\alpha x} \cos \beta x+c_{2} e^{\alpha x} \sin \beta x \]
where \(c_{1}\) and \(c_{2}\) are arbitrary constants (a sympy check of all three cases follows).
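A minimal sympy sketch confirming the three cases (the particular equations are illustrative choices, not from the notes):

```python
# One equation per auxiliary-equation case, solved with dsolve.
import sympy as sp

t = sp.symbols('t')
y = sp.Function('y')

# r**2 - 3r + 2 = 0: distinct real roots 1, 2 -> C1*e^t + C2*e^(2t)
print(sp.dsolve(y(t).diff(t, 2) - 3*y(t).diff(t) + 2*y(t), y(t)))
# r**2 - 2r + 1 = 0: double root 1 -> (C1 + C2*t)*e^t
print(sp.dsolve(y(t).diff(t, 2) - 2*y(t).diff(t) + y(t), y(t)))
# r**2 + 2r + 5 = 0: roots -1 ± 2i -> e^(-t)*(C1*sin 2t + C2*cos 2t)
print(sp.dsolve(y(t).diff(t, 2) + 2*y(t).diff(t) + 5*y(t), y(t)))
```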
Theorem (Wronskian & linear independence)
The Wronskian of \(y_{1}(x), y_{2}(x)\) is defined to be
\[ W\left(y_{1}, y_{2}\right):=y_{1} y_{2}^{\prime}-y_{2} y_{1}^{\prime} \]
If \(y_{1}, y_{2}\) are solutions of a linear homogeneous equation on an interval \(I\), then \(y_{1}, y_{2}\) are linearly dependent on \(I\) iff \(W\left(y_{1}, y_{2}\right)(x)=0, \forall x \in I\).
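For instance, a one-line sympy check on the repeated-root pair \(e^{t}, t e^{t}\) (an illustrative choice):

```python
# W(e^t, t*e^t) = e^{2t}, which never vanishes, so the pair is independent.
import sympy as sp

t = sp.symbols('t')
print(sp.simplify(sp.wronskian([sp.exp(t), t*sp.exp(t)], t)))   # exp(2*t)
```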
Undetermined coefficients
First, solve the homogeneous differential equation.
To find a particular solution to \(a y^{\prime \prime}+b y^{\prime}+c y=C t^{m} e^{\alpha t} \cos \beta t\) or \(a y^{\prime \prime}+b y^{\prime}+c y=C t^{m} e^{\alpha t} \sin \beta t\), use the trial solution
\[ y_{p}(t)=t^{s}\left(A_{m} t^{m}+\cdots+A_{1} t+A_{0}\right) e^{\alpha t} \cos \beta t+t^{s}\left(B_{m} t^{m}+\cdots+B_{1} t+B_{0}\right) e^{\alpha t} \sin \beta t \]
with
- \(s=0\) if \(\alpha+i \beta\) is not a root of the associated auxiliary equation and
- \(s=1\) if \(\alpha+i \beta\) is a simple root of the associated auxiliary equation, and
- \(s=2\) if \(\alpha+i \beta\) is a double root of the associated auxiliary equation (possible only when \(\beta=0\)).
Determine the values of \(m, \alpha, \beta\), and \(s\) to build the trial solution. Substitute the trial solution into the given differential equation and determine the unknown coefficients \(A_{j}\) and \(B_{j}\) by equating the coefficients of like terms.
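A minimal sympy sketch of this procedure for the illustrative equation \(y^{\prime \prime}-3y^{\prime}+2y=e^{t}\) (here \(m=0\), \(\alpha=1\), \(\beta=0\), and \(s=1\) since \(1\) is a simple root of \(r^{2}-3r+2\); the equation is an illustrative choice):

```python
# Trial solution y_p = A0 * t * e^t; solve for A0 by matching coefficients.
import sympy as sp

t, A0 = sp.symbols('t A0')
yp = A0 * t * sp.exp(t)
residual = sp.expand(yp.diff(t, 2) - 3*yp.diff(t) + 2*yp - sp.exp(t))
print(sp.solve(residual, A0))   # [-1], so y_p = -t*e^t
```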
Variation of parameters
To determine a particular solution to the non-homogeneous differential equation \(a y^{\prime \prime}+b y^{\prime}+c y=f\), let \(y_1, y_2\) be two linearly independent solutions of the corresponding homogeneous equation and take a particular solution of the form \(y_{p}(t)=v_{1}(t) y_{1}(t)+v_{2}(t) y_{2}(t)\) for some functions \(v_{1}(t)\) and \(v_{2}(t)\).
Determine \(v_{1}(t)\) and \(v_{2}(t)\) by solving the system below for \(v_{1}^{\prime}(t)\) and \(v_{2}^{\prime}(t)\) and integrating.
\[ \begin{aligned} y_{1} v_{1}^{\prime}+y_{2} v_{2}^{\prime} &=0 \\ y_{1}^{\prime} v_{1}^{\prime}+y_{2}^{\prime} v_{2}^{\prime} &=\frac{f}{a} \end{aligned} \]
Integrate to find \(v_1\) and \(v_2\). Alternatively, the general formulas for \(v_{1}, v_{2}\) are
\[ v_{1}=\int \frac{-f y_{2}}{a W\left(y_{1}, y_{2}\right)} d x, \quad v_{2}=\int \frac{f y_{1}}{a W\left(y_{1}, y_{2}\right)} d x \]
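A minimal sympy sketch of these formulas for the illustrative equation \(y^{\prime \prime}+y=\sec t\) (so \(a=1\), \(f=\sec t\), \(y_{1}=\cos t\), \(y_{2}=\sin t\); the equation is an illustrative choice):

```python
# Variation of parameters: v1 = ∫ -f*y2/(a*W), v2 = ∫ f*y1/(a*W).
import sympy as sp

t = sp.symbols('t')
y1, y2 = sp.cos(t), sp.sin(t)
f, a = sp.sec(t), 1

W = sp.simplify(sp.wronskian([y1, y2], t))   # = 1
v1 = sp.integrate(-f * y2 / (a * W), t)      # = log(cos(t))
v2 = sp.integrate(f * y1 / (a * W), t)       # = t
print(sp.simplify(v1 * y1 + v2 * y2))        # cos(t)*log(cos(t)) + t*sin(t)
```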
Method for solving Cauchy Euler Equations
A linear second order equation that can be expressed in the form
\[ a x^{2} y^{\prime \prime}(x)+b x y^{\prime}(x)+c y(x)=f(x) \]
where \(a, b, c\) are constants, is called a Cauchy-Euler, or equidimensional equation.
Substitute \(y=x^{r}\) to get the auxiliary equation
\[ a r^{2}+(b-a) r+c=0 \]
If this equation has
- two real roots \(r_{1} \neq r_{2}\) then \(x^{r_{1}}, x^{r_{2}}\) are two solutions.
- one repeated real root \(r_{1}=r_{2}=r\) then \(x^{r}, x^{r} \ln x\) are two solutions.
- two complex conjugate roots \(\alpha \pm \beta i\) then \(x^{\alpha} \cos (\beta \ln x), x^{\alpha} \sin (\beta \ln x)\) are two solutions (a sympy check follows).
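A quick sympy check on an illustrative Cauchy-Euler equation, \(x^{2} y^{\prime \prime}-2y=0\) (auxiliary equation \(r^{2}-r-2=0\), roots \(-1\) and \(2\); the equation is an illustrative choice):

```python
# dsolve recognizes the Euler equation and returns y = C1/x + C2*x**2.
import sympy as sp

x = sp.symbols('x', positive=True)
y = sp.Function('y')
print(sp.dsolve(x**2 * y(x).diff(x, 2) - 2*y(x), y(x)))
```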
Linear Algebra
Singular Matrix → \(\det=0\), no inverse
Eigenvalues and Eigenvectors
- Compute \(\lambda\) and \(v\) such that \(A v=\lambda v \Rightarrow(A-\lambda I) v=0\).
- For nontrivial solutions we need \(|A-\lambda I|=0\). Solve for \(\lambda\).
- Plug each computed eigenvalue \(\lambda\) into \((A-\lambda I) v=0\) to compute an eigenvector \(v\) associated with \(\lambda\) (see the numeric sketch below).
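A minimal numeric sketch of these steps with numpy (the matrix is an illustrative choice; numpy computes all eigenpairs at once rather than root-by-root):

```python
# Eigenpairs of a 2x2 matrix; |A - lambda*I| = 0 gives lambda = 5, 2 here.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)      # columns of eigvecs are eigenvectors
print(eigvals)                           # 5 and 2 (order may vary)
for lam, v in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ v, lam * v))   # True: A v = lambda v
```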
Wronskian
The Wronskian of two differentiable functions \(f\) and \(g\) is \(W(f, g)=f g^{\prime}-g f^{\prime}\).
More generally, for \(n\) real- or complex-valued functions \(f_{1}, \ldots, f_{n}\), which are \(n-1\) times differentiable on an interval \(I\), the Wronskian \(W\left(f_{1}, \ldots, f_{n}\right)\) as a function on \(I\) is defined by
\[ W\left(f_{1}, \ldots, f_{n}\right)(x)=\left|\begin{array}{cccc} f_{1}(x) & f_{2}(x) & \cdots & f_{n}(x) \\ f_{1}^{\prime}(x) & f_{2}^{\prime}(x) & \cdots & f_{n}^{\prime}(x) \\ \vdots & \vdots & \ddots & \vdots \\ f_{1}^{(n-1)}(x) & f_{2}^{(n-1)}(x) & \cdots & f_{n}^{(n-1)}(x) \end{array}\right| \]
For vector-valued functions, the Wronskian is the determinant of the matrix whose columns are the functions.
On an interval \(I\) where the entries of \(A(t)\) are continuous, let \(x_1\) and \(x_{2}\) be two solutions of \(x^{\prime}=A(t) x\) and \(W(t)\) their Wronskian. Then either
- \(W(t) \equiv 0\) on \(I\), and \(x_{1}\) and \(x_{2}\) are linearly dependent on \(I\), or
- \(W(t)\) is never 0 on \(I\), and \(x_{1}\) and \(x_{2}\) are linearly independent on \(I\).
Solving \(x^\prime(t)=\mathbf Ax(t)\)
Suppose the \(n \times n\) matrix \(\mathbf A\) has \(n\) linearly independent eigenvectors \(\mathbf{u_1, u_2, \ldots, u_n}\). Let \(r_i\) be the eigenvalue corresponding to \(\mathbf u_i\). Then \(\{\mathbf{e^{r_1t} u_1, e^{r_2t} u_2, \ldots, e^{r_nt} u_n}\}\) is a fundamental solution set. Consequently, a general solution is \(\mathbf x(t) = c_1 \mathbf{e^{r_1t} u_1} + c_2 \mathbf{e^{r_2t} u_2} + \ldots + c_n \mathbf{e^{r_nt}u_n}\), where \(c_1, c_2, \ldots, c_n\) are arbitrary constants.
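A minimal sympy sketch assembling a general solution this way (the \(2 \times 2\) matrix is an illustrative choice):

```python
# Build {e^{r t} u} from the eigenpairs of A and superpose with constants.
import sympy as sp

t, c1, c2 = sp.symbols('t c1 c2')
A = sp.Matrix([[4, 1], [2, 3]])

modes = []
for lam, mult, vecs in A.eigenvects():   # (eigenvalue, multiplicity, basis)
    for v in vecs:
        modes.append(sp.exp(lam * t) * v)

x = c1 * modes[0] + c2 * modes[1]        # general solution for n = 2
assert sp.simplify(x.diff(t) - A * x) == sp.zeros(2, 1)
print(x.T)
```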
Same eigenvalues \(\lambda_1 = \lambda_2 = \ldots = \lambda_n\)
- Try finding \(n\) linearly independent eigenvectors \(\vec v_1, \vec v_2, \ldots, \vec v_n\)
- If found, do the same as above
- If not, do this
- Start with an eigenvector \(\vec v_1\) (a solution to \((\mathbf A - \lambda \mathbf I)\vec v=0\))
- Find \(\vec v_2\) from \((\mathbf A - \lambda \mathbf I)\vec v=\vec v_1\)
- Repeat to find \(\vec v_n\) from \((\mathbf A - \lambda \mathbf I)\vec v=\vec v_{n-1}\)
- The solutions are:
- \(\vec x_1 = \vec v_1 e^{\lambda t}\)
- \(\vec x_2= t \vec x_1 + \vec v_2 e^{\lambda t}\) (note that \(\vec x_1\) is from the previous line)
- …
- \(\vec{x}_{n}=e^{\lambda t}\left(\vec{v}_{n}+t \vec{v}_{n-1}+\cdots+\frac{t^{n-1}}{(n-1) !} \vec{v}_{1}\right)\)
- The general solution is a superposition of the above (a worked sketch for \(n=2\) follows).
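A minimal sympy sketch of the defective \(n=2\) case (the matrix \(\mathbf A\) and vectors are illustrative choices, not from the notes):

```python
# Defective case: A = [[1, 1], [0, 1]] has lambda = 1 (double) but only
# one eigenvector v1 = (1, 0); v2 solves (A - lambda*I) v2 = v1.
import sympy as sp

t = sp.symbols('t')
A = sp.Matrix([[1, 1], [0, 1]])
v1 = sp.Matrix([1, 0])
v2 = sp.Matrix([0, 1])                    # satisfies (A - I) v2 = v1
assert (A - sp.eye(2)) * v2 == v1

x1 = sp.exp(t) * v1                       # x1 = v1 e^{lambda t}
x2 = t * x1 + sp.exp(t) * v2              # x2 = t x1 + v2 e^{lambda t}
# Both satisfy x' = A x:
assert sp.simplify(x1.diff(t) - A * x1) == sp.zeros(2, 1)
assert sp.simplify(x2.diff(t) - A * x2) == sp.zeros(2, 1)
```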
Complex Eigenvalues
If the real matrix \(A\) has complex conjugate eigenvalues \(\alpha \pm i \beta\) with corresponding (complex conjugate) eigenvectors \(\mathrm{a} \pm i \mathrm{~b}\), then two linearly independent complex vector solutions to \(x^{\prime}(t)=A x(t)\) are
\[ \mathrm{x}_{1}=(\mathrm{a}+i \mathrm{~b}) e^{(\alpha+i \beta) t}, \quad \mathrm{x}_{2}=(\mathrm{a}-i \mathrm{~b}) e^{(\alpha-i \beta) t} \]
or, equivalently, the following two real vector solutions:
\[ \mathrm{x}_{1}=e^{\alpha t} \cos \beta t \mathrm{a}-e^{\alpha t} \sin \beta t \mathrm{~b}, \quad \mathrm{x}_{2}=e^{\alpha t} \sin \beta t \mathrm{a}+e^{\alpha t} \cos \beta t \mathrm{~b} \]
Solving \(x^\prime(t)=\mathbf Ax(t) + \vec f(t)\)
- Let \(\vec x_g = c_1 \vec x_1 + c_2 \vec x_2 + \ldots + c_n \vec x_n\) be a general solution to the homogeneous system and \(\vec x_p\) a particular solution to the non-homogeneous system; the general solution is \(\vec x = \vec x_g + \vec x_p\).
If \(\vec f\) contains | Guess |
---|---|
\(t^k (k \in \mathbb Z^+)\) | \(\vec a_{k} t^{k} + \vec a_{k-1} t^{k-1} + \ldots + \vec a_1 t + \vec a_0\) |
\(\sin t\) | \(\vec a_1 \sin t+\vec a_2 \cos t\) |
\(e^{\lambda t}\) (not solution to homo.) | \(\vec a_1 e^{\lambda t}\) |
\(e^{\lambda t}\) (solution to homo.) | \((\vec a_{n} t^{n} + \vec a_{n-1} t^{n-1} + \ldots + \vec a_1 t + \vec a_0) e^{\lambda t}\) |
The Matrix Exponential Function
\(\vec x^{\prime} = \mathbf P \vec x\) has solution \(\vec x = e^{t\mathbf P} \vec c\).
To calculate \(e^{t \mathbf P}\):
- If \(\mathbf P\) is diagonal, \(\mathbf P = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{pmatrix}\), then \(e^{t \mathbf P}=\begin{pmatrix} e^{t \lambda_1} & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & e^{t \lambda_n} \end{pmatrix}\)
- Otherwise, let \((\lambda_1, \vec v_1), \cdots, (\lambda_n, \vec v_n)\) be eigenpairs of \(\mathbf P\)
- \(e^{t\mathbf P}=\mathbf E e^{t \mathbf D} \mathbf E^{-1}\)
- \(\mathbf E = \begin{pmatrix} | & & | \\ \vec v_1 & \cdots & \vec v_n \\ | & & | \end{pmatrix}\), \(\mathbf D = \begin{pmatrix} \lambda_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & \lambda_n \end{pmatrix}\)
- Non-diagonalizable
- Not enough linearly independent eigenvectors
- Cannot be written as \(\mathbf E \mathbf D \mathbf E^{-1}\) for some diagonal matrix \(\mathbf D\)
- Use the “Same eigenvalues” procedure above (generalized eigenvectors)
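A minimal sympy sketch of the diagonalization route, checked against sympy's built-in matrix exponential (the matrix is an illustrative diagonalizable choice):

```python
# e^{tP} = E e^{tD} E^{-1} for diagonalizable P, verified against Matrix.exp().
import sympy as sp

t = sp.symbols('t')
P = sp.Matrix([[4, 1], [2, 3]])
E, D = P.diagonalize()                   # P = E D E^{-1}
exp_tP = E * (t * D).exp() * E.inv()     # e^{tD} is diagonal, easy to exponentiate
assert sp.simplify(exp_tP - (t * P).exp()) == sp.zeros(2, 2)
print(sp.simplify(exp_tP))
```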
Laplace Transform
Table of Laplace Transforms
Definition: \(\mathcal{L}\{f(t)\}=F(s)=\int_{0}^{\infty} e^{-s t} f(t) d t\)
\(f(t)\) | \(F(s)\) |
---|---|
\(C\) | \(\frac C s\) |
\(t\) | \(\frac 1 {s^2}\) |
\(t^2\) | \(\frac 2 {s^3}\) |
\(t^n\) | \(\frac {n!} {s^{n+1}}\) |
\(e^{-at}\) | \(\frac 1 {s+a}\) |
\(\sin \omega t\) | \(\frac{\omega}{s^{2}+\omega^{2}}\) |
\(\cos \omega t\) | \(\frac{s}{s^{2}+\omega^{2}}\) |
\(\sinh \omega t\) | \(\frac{\omega}{s^{2}-\omega^{2}}\) |
\(\cosh \omega t\) | \(\frac{s}{s^{2}-\omega^{2}}\) |
\(u(t-a)\) | \(\frac{e^{-a s}}{s}\) |
\(g^{\prime}(t)\) | \(s G(s)-g(0)\) |
\(g^{\prime \prime}(t)\) | \(s^{2} G(s)-s g(0)-g^{\prime}(0)\) |
\(g^{\prime \prime \prime}(t)\) | \(s^{3} G(s)-s^{2} g(0)-s g^{\prime}(0)-g^{\prime \prime}(0)\) |
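A few of these entries can be spot-checked with sympy (a minimal sketch; `noconds=True` suppresses the convergence conditions sympy otherwise returns):

```python
# Verify table entries for t^n, sin(omega*t), and exp(-a*t).
import sympy as sp

t, s = sp.symbols('t s', positive=True)
a, w = sp.symbols('a omega', positive=True)

print(sp.laplace_transform(t**3, t, s, noconds=True))         # 6/s**4
print(sp.laplace_transform(sp.sin(w*t), t, s, noconds=True))  # omega/(omega**2 + s**2)
print(sp.laplace_transform(sp.exp(-a*t), t, s, noconds=True)) # 1/(a + s)
```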
Solving DE with Laplace Transforms
- Take the Laplace transform of both sides of the equation.
- Use the properties of the Laplace transform and the initial conditions to obtain an equation for the Laplace transform of the solution and then solve this equation for the transform.
- Determine the inverse Laplace transform of the solution by looking it up in a table or by using a suitable method (such as partial fractions) in combination with the table.
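As an end-to-end illustration of these steps, here is a minimal sympy sketch for the illustrative IVP \(y^{\prime \prime}+y=0\), \(y(0)=0\), \(y^{\prime}(0)=1\) (solution \(\sin t\); the IVP is an illustrative choice):

```python
# Laplace-solve y'' + y = 0, y(0) = 0, y'(0) = 1.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
Y = sp.symbols('Y')

# Steps 1-2: L{y''} = s^2 Y - s*y(0) - y'(0), so the transformed equation is:
eq = sp.Eq(s**2 * Y - s*0 - 1 + Y, 0)
Ysol = sp.solve(eq, Y)[0]                        # Y = 1/(s**2 + 1)
# Step 3: invert (table entry: omega/(s^2 + omega^2) <-> sin(omega*t))
print(sp.inverse_laplace_transform(Ysol, s, t))  # sin(t)
```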