Theory of Differential Equations

Definitions

Definition (Order)
The order of the highest derivative appearing in the equation.

Definition (Linear)

The dependent variable and its derivatives appear only to the first power and are not multiplied together or composed with nonlinear functions; the coefficients may depend only on the independent variable.

\[\underbrace{\frac{d^2 y}{d t^2}}_{\text{linear}} \qquad \underbrace{\cos(x) \frac{dy}{dx}}_{\text{linear}} \qquad \underbrace{\frac{dy}{dt} \frac{d^3 y}{dt^3}}_{\text{non-linear}} \qquad \underbrace{y' = e^y}_{\text{non-linear}} \qquad \underbrace{y \frac{dy}{dx}}_{\text{non-linear}}\]

Definition (Autonomous)
The independent variable does not appear explicitly in the equation.

Definition (Non-autonomous)
The independent variable appears explicitly in the equation.

Definition (Ansatz)
Our initial guess for the form of a solution, e.g. $y_p = A \cos(t) + B \sin(t)$.

Definition (Indicial Equation)
The equation for the index $r$ that arises when the Frobenius method is applied; for a second-order ODE it is a quadratic in $r$.

Definition (Analytic)
A function is analytic at a point if it can be expressed as a convergent power series in a neighborhood of that point.

Definition (Ordinary Point)
A point $x_0$ is an ordinary point of $y'' + p(x)y' + q(x)y = 0$ if $p(x)$ and $q(x)$ are both analytic at $x_0$.

Definition (Regular Singular Point)
If $P(x) = (x-x_0)p(x)$ and $Q(x) = (x-x_0)^2 q(x)$ are both analytic at $x_0$.

Definition (Irregular Singular Point)
A singular point that is not regular.
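
Example
As an illustration (the equations here are chosen for demonstration): Bessel's equation in standard form, $y'' + \frac{1}{x}y' + \left(1 - \frac{\nu^2}{x^2}\right)y = 0$, has $x\,p(x) = 1$ and $x^2 q(x) = x^2 - \nu^2$, both analytic at $0$, so $x = 0$ is a regular singular point. By contrast, for $y'' + \frac{1}{x^3}y = 0$ we have $x^2 q(x) = \frac{1}{x}$, which is not analytic at $0$, so $x = 0$ is an irregular singular point.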

Definition (Mean Convergence)
A sequence of functions $f_n$ converges in mean to $f$ on $[a,b]$ if $\lim_{n \to \infty} \int^b_a |f_n(x) - f(x)|^2 \, dx = 0$.

Definition (Pointwise Convergence)
A sequence of functions $f_n$ converges pointwise to $f$ on $[a,b]$ if $\lim_{n \to \infty} f_n(x) = f(x)$ for every $x \in [a,b]$.

Definition (Uniform Convergence)
A sequence of functions $f_n$ converges uniformly to $f$ on $[a,b]$ if $\lim_{n \to \infty} \sup_{x \in [a,b]} |f_n(x) - f(x)| = 0$.
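
Example
A standard illustrative case: $f_n(x) = x^n$ on $[0,1]$ converges pointwise to $f(x) = 0$ for $x \in [0,1)$ and $f(1) = 1$, but not uniformly, since $\sup_{x \in [0,1]} |f_n(x) - f(x)| = 1$ for every $n$. It does converge in mean, since $\int^1_0 x^{2n} \, dx = \frac{1}{2n+1} \to 0$. On $[0, \tfrac{1}{2}]$ the convergence is uniform, because $\sup |f_n(x)| = 2^{-n} \to 0$.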

Definition (Equilibrium Point)
A point where the derivative of the dependent variable with respect to the independent variable is zero.

Definition (Stable Node)
Trajectories approach the equilibrium point from all directions and eigenvalues are real and negative.

Definition (Unstable Bicritical Node (Star))
Trajectories move away from the equilibrium point along straight lines in all directions; the eigenvalues are real, positive, and equal, with two linearly independent eigenvectors.

Definition (Stable Centre)
Trajectories orbit around the equilibrium point with eigenvalues that are purely imaginary.

Definition (Unstable Saddle Point)
Trajectories approach the equilibrium point in one direction and move away in another, with eigenvalues having opposite signs.

Definition (Unstable Focus)
Trajectories spiral away from the equilibrium point with eigenvalues having positive real parts and non-zero imaginary parts.

Solving Methods

First Order

Definition (Standard Form)
\[\frac{dy}{dx} = f(x, y)\]

Definition (Separable)
\[\frac{dy}{dx} = f(x) g(y) \implies \int \frac{dy}{g(y)} = \int f(x) \, dx\]
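
Example
A minimal worked case (the equation is chosen here for illustration): for $\frac{dy}{dx} = xy$, separation gives \[\int \frac{dy}{y} = \int x \, dx \implies \ln|y| = \frac{x^2}{2} + c \implies y = C e^{x^2/2}.\]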

Definition (Reduction to Separable)
\[\frac{dy}{dx} = f\left(\frac{y}{x}\right)\] with substitution: $y(x) = x v(x)$.
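
Example
For illustration, take $\frac{dy}{dx} = \frac{y}{x} + 1$. With $y = xv$ we have $\frac{dy}{dx} = v + x\frac{dv}{dx}$, so \[v + x \frac{dv}{dx} = v + 1 \implies x\frac{dv}{dx} = 1 \implies v = \ln|x| + C \implies y = x\ln|x| + Cx.\]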

Definition (Linear Standard Form)
\[\frac{dy}{dx} + p(x) y = q(x)\]

Definition (Integrating Factor)
Note that the coefficient of $y'(x)$ must be 1. \[\phi(x) = \exp\left(\int p(x) \, dx\right)\] Multiplying the Linear Standard Form by $\phi(x)$ yields: \[\frac{d}{dx}(\phi y) = \phi(x) q(x) \implies y = \phi^{-1} \int \phi q(x) \, dx\]
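
Example
A quick illustrative check (equation chosen for demonstration): for $y' + \frac{1}{x}y = x$ with $x > 0$, $\phi(x) = \exp\left(\int \frac{dx}{x}\right) = x$, so \[\frac{d}{dx}(xy) = x^2 \implies xy = \frac{x^3}{3} + C \implies y = \frac{x^2}{3} + \frac{C}{x}.\]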

Definition (Exact)
A first-order ODE is exact if it can be written in the form: \[M(x,y) \, dx + N(x,y) \, dy = 0\] where $\frac{\partial M}{\partial y} = \frac{\partial N}{\partial x}$. The solution is then given by: $F(x,y) = C$ where $F(x,y)$ satisfies $\frac{\partial F}{\partial x} = M(x,y)$ and $\frac{\partial F}{\partial y} = N(x,y)$.
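
Example
An illustrative exact equation: $2xy \, dx + (x^2 + \cos y) \, dy = 0$ has $\frac{\partial M}{\partial y} = 2x = \frac{\partial N}{\partial x}$. Integrating $F_x = 2xy$ gives $F = x^2 y + h(y)$; then $F_y = x^2 + h'(y) = x^2 + \cos y$ gives $h(y) = \sin y$, so the solution is $x^2 y + \sin y = C$.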

Second Order

Definition (Standard Form)
\[y'' + p(x)y' + q(x)y = r(x)\]

Definition (Reducible to First Order)
\[\frac{d^2 y}{dx^2} + f\left(y, \frac{dy}{dx}\right) = 0\] is reducible to the first-order ODE \[p \frac{dp}{dy} + f(y, p) = 0\] with substitution $p = \frac{dy}{dx}$.
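
Example
For illustration, the autonomous equation $y'' + y = 0$: with $p = \frac{dy}{dx}$, \[p\frac{dp}{dy} + y = 0 \implies \frac{p^2}{2} + \frac{y^2}{2} = C,\] i.e. $(y')^2 + y^2$ is constant along solutions, consistent with $y = A\cos x + B\sin x$.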

Definition (Constant Coefficients)
When $p(x)$ and $q(x)$ are constants: \[y'' + a_1 y' + a_0 y = 0\]

Definition (Homogeneous (Constant Coefficients))

Solve the characteristic equation: \[\lambda^2 + a_1 \lambda + a_0 = 0\] Cases:

  • $\lambda_1, \lambda_2$ are real and distinct
  • $\lambda_1, \lambda_2$ are real and coincide (same)
  • $\lambda_1, \lambda_2$ are complex conjugates

In each case, the solution of $y(x)$ becomes:

  • $y(x) = C e^{\lambda_1 x} + D e^{\lambda_2 x}$
  • $y(x) = C e^{\lambda_1 x} + D x e^{\lambda_1 x}$
  • $y(x) = e^{\alpha x}(A \cos(\beta x) + B \sin(\beta x))$, where $\lambda_{1,2} = \alpha \pm i\beta$, by Euler's formula
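
Example
One instance of each case above (the particular equations are illustrative):

  • $y'' - 3y' + 2y = 0$: $\lambda^2 - 3\lambda + 2 = 0 \implies \lambda = 1, 2$, so $y = Ce^{x} + De^{2x}$
  • $y'' - 2y' + y = 0$: $\lambda = 1$ (repeated), so $y = Ce^{x} + Dxe^{x}$
  • $y'' + 2y' + 5y = 0$: $\lambda = -1 \pm 2i$, so $y = e^{-x}(A\cos(2x) + B\sin(2x))$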

Definition (Inhomogeneous → Method of Undetermined Coefficients)

\[y(x) = y_h(x) + y_p(x)\] Guesses for $y_p(x)$:

  • For $r(x) = P_n(x)$ (polynomial of degree $n$), try $y_p(x) = Q_n(x)$
  • For $r(x) = e^{\alpha x}$, try $y_p(x) = A e^{\alpha x}$
  • For $r(x) = \sin(\beta x)$ or $r(x) = \cos(\beta x)$, try $y_p(x) = A \sin(\beta x) + B \cos(\beta x)$
  • For products of the above forms, try products of the corresponding forms
  • If $y_p(x)$ is already a solution of the homogeneous equation, multiply by $x$ or $x^k$ until linearly independent
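
Example
For illustration (simple right-hand sides chosen here): for $y'' + y = e^{2x}$, guess $y_p = Ae^{2x}$; then $4Ae^{2x} + Ae^{2x} = e^{2x}$ gives $A = \frac{1}{5}$. By contrast, for $y'' + y = \cos x$ the naive guess already solves the homogeneous equation, so try $y_p = x(A\cos x + B\sin x)$, which yields $y_p = \frac{x}{2}\sin x$.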

Theorem (Variation of Parameters)

This method works for any 2nd order inhomogeneous ODE if the complementary solution is known.

The general solution of the 2nd order inhomogeneous ODE: \[y'' + b_1(x) y' + b_0(x) y = f(x)\] is given by $y(x) = u_1(x) y_1(x) + u_2(x) y_2(x)$

where $y_1$ and $y_2$ are linearly independent solutions of the homogeneous ODE such that the Wronskian $W(x) \neq 0$ and \[u_1(x) = -\int \frac{y_2(x)f(x)}{W(x)} \, dx\] and \[u_2(x) = \int \frac{y_1(x)f(x)}{W(x)} \, dx\]
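
Example
A standard illustrative application: $y'' + y = \sec x$, with $y_1 = \cos x$, $y_2 = \sin x$ and $W = y_1 y_2' - y_2 y_1' = 1$. Then \[u_1 = -\int \sin x \sec x \, dx = \ln|\cos x|, \qquad u_2 = \int \cos x \sec x \, dx = x,\] so $y_p = \cos x \ln|\cos x| + x \sin x$.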

Definition (Power Series Method)

Note that we use this approach because the second-order standard form is not, in general, solvable in terms of elementary functions.

Pick ansatz of the form \[y = \sum^{\infty}_{n=0} a_n z^n\] and take derivatives as required. For example: \[\frac{dy}{dz} = \sum^{\infty}_{n=1} n a_n z^{n-1}, \quad \frac{d^2 y}{dz^2} = \sum^{\infty}_{n=2} n(n-1) a_n z^{n-2}\] and substitute them into the ODE. Then solve by rearranging indices as necessary to obtain a recurrence relation. Apply the initial conditions and then guess the closed-form solution of the recurrence relation. Change back to the original variables if required.

If $x_0$ is an ordinary point of the differential equation \[y'' + p(x)y' + q(x)y = 0\] then the general solution in a neighbourhood $|x - x_0| < R$ may be represented as a power series.
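
Example
A minimal worked run (the equation is chosen for illustration): for $y'' + y = 0$ with $y = \sum a_n z^n$, substitution gives \[\sum^{\infty}_{n=0} \left[(n+2)(n+1)a_{n+2} + a_n\right] z^n = 0 \implies a_{n+2} = -\frac{a_n}{(n+2)(n+1)},\] from which $a_{2k} = \frac{(-1)^k a_0}{(2k)!}$ and $a_{2k+1} = \frac{(-1)^k a_1}{(2k+1)!}$, i.e. $y = a_0 \cos z + a_1 \sin z$.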

Theorem (Method of Frobenius)
If $x_0 = 0$ is a regular singular point of the differential equation \[y'' + p(x)y' + q(x)y = 0\] then there exists at least one series solution of the form \[y(x) = x^r \sum^{\infty}_{n=0} c_n x^n = \sum^{\infty}_{n=0} c_n x^{n+r}, \quad c_0 \neq 0\] for some constant $r$ (index).

Definition (General Indicial Equation)
\[r(r-1) + p_0 r + q_0 = 0\] where $p_0 = \lim_{x \to x_0} (x - x_0) p(x)$ and $q_0 = \lim_{x \to x_0} (x - x_0)^2 q(x)$.
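
Example
For illustration, Bessel's equation $y'' + \frac{1}{x}y' + \left(1 - \frac{\nu^2}{x^2}\right)y = 0$ has $p_0 = \lim_{x \to 0} x\,p(x) = 1$ and $q_0 = \lim_{x \to 0} x^2 q(x) = -\nu^2$, so the indicial equation is \[r(r-1) + r - \nu^2 = 0 \implies r = \pm\nu.\]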

n-th Order

Remark
An $n^{\text{th}}$ order linear homogeneous ODE admits $n$ linearly independent solutions.

Definition (Power Series Expansion)

For an $n^{\text{th}}$ order linear ODE with variable coefficients: \[a_n(x) y^{(n)} + a_{n-1}(x) y^{(n-1)} + \dots + a_1(x) y' + a_0(x) y = f(x)\]

We assume a solution of the form: \[y(x) = \sum^{\infty}_{k=0} c_k (x-x_0)^k\]

Taking derivatives and substituting yields a recurrence relation for coefficients $c_k$, typically allowing us to determine $c_n$ in terms of $c_0, c_1, \dots, c_{n-1}$.

Definition (Reduction of Order)

Any $n^{\text{th}}$ order ODE can be formulated as a system of $n$ first order ODEs.

For $y^{(n)} = f(x, y, y', \dots, y^{(n-1)})$, set $y_i = y^{(i-1)}$ for $i = 1,2,\dots,n$ to obtain: \[y_i' = y_{i+1} \text{ for } i = 1,2,\dots,n-1\] \[y_n' = f(x, y_1, y_2, \dots, y_n)\]
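
Example
As an illustrative conversion: $y'' + 3y' + 2y = 0$ with $y_1 = y$, $y_2 = y'$ becomes \[\begin{pmatrix} y_1 \\ y_2 \end{pmatrix}' = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix},\] a linear first-order system whose matrix has eigenvalues $-1$ and $-2$, matching the characteristic roots of the original ODE.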

Partial Differential Equations

Definition (Standard Form (Linear, Homogeneous, 2nd Order PDE))

\[A \frac{\partial^2 u}{\partial x^2} + B \frac{\partial^2 u}{\partial x \partial y} + C \frac{\partial^2 u}{\partial y^2} + D\frac{\partial u}{\partial x} + E\frac{\partial u}{\partial y} + F u = 0\]

  • Parabolic equation: $B^2 - 4AC = 0$ (Heat Equation)
  • Hyperbolic equation: $B^2 - 4AC > 0$ (Wave Equation)
  • Elliptic equation: $B^2 - 4AC < 0$ (Laplace Equation)

Definition (Separation of Variables)
\[U(x,y) = X(x) Y(y)\] then $U_x = Y X'$ and $U_y = Y' X$. Rewrite the PDE with these substitutions, then divide through by $XY$. Each side then depends on only one variable, so both must equal a separation constant; this yields ODEs that can be solved separately.
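
Example
For the heat equation $\frac{\partial^2 u}{\partial x^2} = \frac{\partial u}{\partial t}$ (as defined later in these notes), the ansatz $u(x,t) = X(x)T(t)$ gives $X''T = XT'$, so \[\frac{X''}{X} = \frac{T'}{T} = -\lambda,\] leading to the pair of ODEs $X'' + \lambda X = 0$ and $T' + \lambda T = 0$, i.e. $T(t) = Ce^{-\lambda t}$.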

Definition (Change of Variables)

When a PDE is difficult to solve directly, changing variables can transform it into a simpler form.

For a second-order PDE, the transformation $u = u(\xi, \eta)$ where $\xi = \xi(x,y)$ and $\eta = \eta(x,y)$ requires computing: \[\frac{\partial u}{\partial x} = \frac{\partial u}{\partial \xi}\frac{\partial \xi}{\partial x} + \frac{\partial u}{\partial \eta}\frac{\partial \eta}{\partial x}\] \[\frac{\partial u}{\partial y} = \frac{\partial u}{\partial \xi}\frac{\partial \xi}{\partial y} + \frac{\partial u}{\partial \eta}\frac{\partial \eta}{\partial y}\]

And similarly for second-order derivatives. The canonical transformations are:

  • For hyperbolic: $\xi = x + y, \eta = x - y$ (characteristic coordinates)
  • For parabolic: $\xi = x, \eta = y - f(x)$ (transformation along characteristics)
  • For elliptic: $\xi = x + iy, \eta = x - iy$ (complex characteristics)
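
Example
A worked illustration of the hyperbolic case: the wave equation $u_{xx} = \frac{1}{c^2}u_{tt}$ under the characteristic coordinates $\xi = x + ct$, $\eta = x - ct$ transforms to \[\frac{\partial^2 u}{\partial \xi \, \partial \eta} = 0 \implies u = F(\xi) + G(\eta) = F(x + ct) + G(x - ct),\] which is d'Alembert's general solution.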

Systems / Dynamical Systems

Definition (Eigenvalue Classification)
  • $\lambda_2 < \lambda_1 < 0 \implies$ stable node
  • $0 < \lambda_1 < \lambda_2 \implies$ unstable node
  • $\lambda_1 = \lambda_2, \lambda_1 > 0 \implies$ unstable star
  • $\lambda_1 = \lambda_2, \lambda_1 < 0 \implies$ stable star
  • $\lambda_1 < 0 < \lambda_2 \implies$ saddle point (unstable)
  • $\operatorname{Re}(\lambda_1) = 0$, $\operatorname{Im}(\lambda_1) \neq 0 \implies$ centre, stable
  • $\operatorname{Re}(\lambda_1) < 0 \implies$ stable focus
  • $\operatorname{Re}(\lambda_1) > 0 \implies$ unstable focus
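
Example
A worked classification (the matrix is chosen here for illustration): for $\dot{\mathbf{x}} = \begin{pmatrix} 0 & 1 \\ -2 & -3 \end{pmatrix}\mathbf{x}$, the characteristic equation $\lambda^2 + 3\lambda + 2 = 0$ gives $\lambda_1 = -1$, $\lambda_2 = -2$, so $\lambda_2 < \lambda_1 < 0$ and the origin is a stable node.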

Definition (Real Canonical Form)

For a linear system $\dot{\mathbf{x}} = \mathbf{A} \mathbf{x}$, the real canonical form depends on the eigenvalues of $\mathbf{A}$:

  • Real distinct eigenvalues $\lambda_1 \neq \lambda_2$:

\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}\]

  • Real repeated eigenvalues $\lambda_1 = \lambda_2$ with linearly independent eigenvectors:

\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_1 \end{pmatrix}\]

  • Real repeated eigenvalues $\lambda_1 = \lambda_2$ with one linearly independent eigenvector:

\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 1 \\ 0 & \lambda_1 \end{pmatrix}\]

  • Complex conjugate eigenvalues $\lambda = \alpha \pm i\beta$:

\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}\]

Functions

Definition (Wronskian)
\[W(f_1, f_2, \dots, f_n)(x) = \begin{vmatrix} f_1(x) & f_2(x) & \dots & f_n(x) \\ f_1'(x) & f_2'(x) & \dots & f_n'(x) \\ \vdots & \vdots & \ddots & \vdots \\ f_1^{(n-1)}(x) & f_2^{(n-1)}(x) & \dots & f_n^{(n-1)}(x) \end{vmatrix}\] Note that if a set of functions is linearly dependent, then its Wronskian will equal 0.
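
Example
A quick illustrative computation: \[W(e^x, e^{2x}) = \begin{vmatrix} e^x & e^{2x} \\ e^x & 2e^{2x} \end{vmatrix} = 2e^{3x} - e^{3x} = e^{3x} \neq 0,\] so $e^x$ and $e^{2x}$ are linearly independent.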

Remark

Power Series, Taylor Series and Maclaurin Series:

  • Power Series: $\sum_{n=0}^{\infty} a_n (x - a)^n$
  • Taylor Series: $\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x - a)^n$
  • Maclaurin Series: $\sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n$

Maclaurin is a special case of Taylor (at $a=0$), and Taylor is a special case of Power Series.

Definition (Orthogonality)
A set of functions $\{\phi_n\}_{n=1,2,3,\dots}$ is said to be orthogonal on the interval $[a,b]$ with respect to the inner product defined by \[(f, g)_w = \int^b_a w(x)f(x)g(x) \, dx\] with weight function $w(x) > 0$, if $(\phi_n,\phi_m)_w = 0$ for $m \neq n$.

Definition (Orthonormality)
A set $\{\phi_n\}_{n=1,2,3,\dots}$ is orthonormal when in addition to being orthogonal, $(\phi_n,\phi_n) = 1$, for $n = 1,2,3,\dots$.

Definition (Cauchy-Euler Equation)
\[x^2 y'' + a_1 x y' + a_0 y = 0\] You can solve this either by letting $x = e^t$ or by using the ansatz $y = x^\lambda$. The characteristic equation is $\lambda^2 + (a_1 - 1) \lambda + a_0 = 0$. For the inhomogeneous case, the substitution $x = e^t$ yields a constant-coefficient equation, after which the method of undetermined coefficients applies.
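
Example
For illustration: $x^2 y'' + x y' - y = 0$ has $a_1 = 1$, $a_0 = -1$, so the characteristic equation is $\lambda^2 - 1 = 0$, giving $\lambda = \pm 1$ and hence $y = Cx + \frac{D}{x}$.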

Definition (Legendre's Equation)
\[(1 - x^2)y'' - 2x y' + n(n+1)y = 0\] For non-negative integer $n$, the solutions that are bounded on $[-1,1]$ are the Legendre polynomials $P_n(x)$.

Definition (Bessel's Equation)

\[x^2 y'' + x y' + (x^2 - \nu^2) y = 0\]

Bessel function of the first kind of order $\alpha$: \[J_\alpha(x) = \sum^{\infty}_{m=0} \frac{(-1)^m}{\Gamma(m+1)\Gamma(m+\alpha+1)} \left(\frac{x}{2}\right)^{2m+\alpha}\] implies \[\frac{d}{dx} \left[x^\alpha J_\alpha(x)\right] = x^\alpha J_{\alpha-1}(x)\] implies \[\int^r_0 x^n J_{n-1}(x) \, dx = r^n J_n(r) \text{ for } n = 1, 2, 3, \dots\]

The DE admits solutions:

  • Case 1: $2\nu \notin \mathbb{Z}$: $J_{\nu}(x)$ and $J_{-\nu}(x)$ are linearly independent, so $y(x) = A J_{\nu}(x) + B J_{-\nu}(x)$
  • Case 2: $2\nu \in \mathbb{Z}$ but $\nu \notin \mathbb{Z}$ (half-integer order): $J_{\nu}(x)$ and $J_{-\nu}(x)$ are still linearly independent, so $y(x) = A J_{\nu}(x) + B J_{-\nu}(x)$
  • Case 3: $\nu \in \mathbb{Z}$: $J_{\nu}(x)$ and $J_{-\nu}(x)$ are linearly dependent, so $y(x) = A J_{\nu}(x) + B Y_{\nu}(x)$, where $Y_{\nu}$ is the Bessel function of the second kind

Definition (Laguerre's Equation)
\[x y'' + (1-x)y' + n y = 0\]

Definition (Hermite's Equation)
\[y'' - 2 x y' + 2 n y = 0\]

Definition (Sturm-Liouville Form)
\[(p y')' + (q + \lambda r) y = 0\] Note that Bessel, Laguerre, Hermite and Legendre equations can all be written in this form. Furthermore, any 2nd order linear homogeneous ODE $y'' + a_1(x)y' + [a_2(x) + \lambda a_3(x)]y = 0$ may be written in this form.
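
Example
A minimal illustration of the conversion: multiplying $y'' + a_1(x)y' + [a_2(x) + \lambda a_3(x)]y = 0$ by $p(x) = \exp\left(\int a_1(x) \, dx\right)$ gives \[(p y')' + (p a_2 + \lambda p a_3) y = 0,\] which is Sturm-Liouville form with $q = p\,a_2$ and $r = p\,a_3$. For instance, Legendre's equation is already in this form: $\left[(1 - x^2)y'\right]' + n(n+1)y = 0$.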

Definition (Heat Equation (PDE))
\[\frac{\partial^2 u}{\partial x^2} = \frac{\partial u}{\partial t}\]

Definition (Wave Equation (PDE))
\[\frac{\partial^2 u}{\partial x^2} = \frac{1}{c^2} \frac{\partial^2 u}{\partial t^2}\]

Definition (Laplace's Equation (PDE))
\[\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} = 0\]

Definition (Fourier Series)
\[y(x) = \frac{a_0}{2} + \sum^{\infty}_{n=1} (a_n \cos(n x) + b_n \sin(n x)), \quad x \in [-\pi, \pi]\] \[a_n = \frac{1}{\pi} \int^{\pi}_{-\pi} y(x) \cos(nx) \, dx, \quad n = 0, 1, 2, \dots\] \[b_n = \frac{1}{\pi} \int^{\pi}_{-\pi} y(x) \sin(nx) \, dx, \quad n = 1, 2, \dots\]
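
Example
A standard worked case (the function is chosen here for illustration): for $y(x) = x$ on $[-\pi, \pi]$, all $a_n = 0$ by oddness, and \[b_n = \frac{1}{\pi} \int^{\pi}_{-\pi} x \sin(nx) \, dx = \frac{2(-1)^{n+1}}{n}, \quad\text{so}\quad x \sim 2\sum^{\infty}_{n=1} \frac{(-1)^{n+1}}{n} \sin(nx).\]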

Definition (Parseval's Identity)
\[\frac{\|f\|^2}{L} = \frac{1}{L} \int^{L}_{-L} f^2 \, dx = \frac{a_0^2}{2} + \sum^{\infty}_{n=1} (a_n^2 + b_n^2)\]
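
Example
Continuing the illustrative series for $f(x) = x$ on $[-\pi, \pi]$ (so $L = \pi$, $a_n = 0$, $b_n = \frac{2(-1)^{n+1}}{n}$): \[\frac{1}{\pi} \int^{\pi}_{-\pi} x^2 \, dx = \frac{2\pi^2}{3} = \sum^{\infty}_{n=1} \frac{4}{n^2} \implies \sum^{\infty}_{n=1} \frac{1}{n^2} = \frac{\pi^2}{6}.\]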