Theory of Differential Equations
Definitions
An ODE is linear when the dependent variable and its derivatives appear only linearly: no powers, products, or nonlinear functions of $y$ and its derivatives. Coefficients may still depend on the independent variable.
\[\underbrace{\frac{d^2 y}{dt^2}}_{\text{linear}} \quad \underbrace{\cos(x) \frac{dy}{dx}}_{\text{linear}} \quad \underbrace{\frac{dy}{dt} \frac{d^3 y}{dt^3}}_{\text{non-linear}} \quad \underbrace{y' = e^y}_{\text{non-linear}} \quad \underbrace{y \frac{dy}{dx}}_{\text{non-linear}}\]
Solving Methods
First Order
Second Order
For the homogeneous constant-coefficient ODE $y'' + a_1 y' + a_0 y = 0$, solve the characteristic equation: \[\lambda^2 + a_1 \lambda + a_0 = 0\] Cases (a worked example follows the list):
- $\lambda_1, \lambda_2$ are real and distinct
- $\lambda_1, \lambda_2$ are real and repeated (coincident)
- $\lambda_1, \lambda_2$ are complex conjugates
In each case respectively, the general solution $y(x)$ is:
- $y(x) = C e^{\lambda_1 x} + D e^{\lambda_2 x}$
- $y(x) = C e^{\lambda_1 x} + D x e^{\lambda_1 x}$
- $y(x) = e^{\alpha x}(A \cos(\beta x) + B \sin(\beta x))$, where $\lambda_{1,2} = \alpha \pm i\beta$, by Euler's formula
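For example, $y'' - 3y' + 2y = 0$ has characteristic equation $\lambda^2 - 3\lambda + 2 = (\lambda - 1)(\lambda - 2) = 0$, so $\lambda_1 = 1$ and $\lambda_2 = 2$ are real and distinct, and \[y(x) = C e^{x} + D e^{2x}.\]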
Method of undetermined coefficients: write the general solution as \[y(x) = y_h(x) + y_p(x)\] where $y_h$ solves the homogeneous equation. Guesses for the particular solution $y_p(x)$, by the form of the forcing term $r(x)$ (a worked example follows the list):
- For $r(x) = P_n(x)$ (polynomial of degree $n$), try $y_p(x) = Q_n(x)$
- For $r(x) = e^{\alpha x}$, try $y_p(x) = A e^{\alpha x}$
- For $r(x) = \sin(\beta x)$ or $r(x) = \cos(\beta x)$, try $y_p(x) = A \sin(\beta x) + B \cos(\beta x)$
- For products of the above forms, try products of the corresponding forms
- If $y_p(x)$ is already a solution of the homogeneous equation, multiply by $x$ or $x^k$ until linearly independent
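For example, for $y'' + y = e^{2x}$ the guess is $y_p(x) = A e^{2x}$; substituting gives $4A e^{2x} + A e^{2x} = e^{2x}$, so $A = \tfrac{1}{5}$ and \[y(x) = C \cos(x) + D \sin(x) + \tfrac{1}{5} e^{2x}.\]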
Variation of parameters works for any 2nd order linear inhomogeneous ODE, provided two linearly independent solutions of the homogeneous equation (the complementary solution) are known.
The general solution of the 2nd order inhomogeneous ODE: \[y'' + b_1(x) y' + b_0(x) y = f(x)\] is given by $y(x) = u_1(x) y_1(x) + u_2(x) y_2(x)$
where $y_1$ and $y_2$ are linearly independent solutions of the homogeneous ODE, so that the Wronskian $W(x) = y_1(x) y_2'(x) - y_1'(x) y_2(x) \neq 0$, and \[u_1(x) = -\int \frac{y_2(x)f(x)}{W(x)} \, dx\] and \[u_2(x) = \int \frac{y_1(x)f(x)}{W(x)} \, dx\]
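For example, for $y'' + y = \sec(x)$ on $|x| < \pi/2$: take $y_1 = \cos(x)$, $y_2 = \sin(x)$, so $W(x) = \cos^2(x) + \sin^2(x) = 1$. Then $u_1 = -\int \sin(x)\sec(x) \, dx = \ln|\cos(x)|$ and $u_2 = \int \cos(x)\sec(x) \, dx = x$, giving the particular solution \[y_p(x) = \cos(x)\ln|\cos(x)| + x\sin(x).\]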
Note that we turn to power series methods because the second order ODE in standard form is not, in general, solvable in terms of elementary functions.
Pick ansatz of the form \[y = \sum^{\infty}_{n=0} a_n z^n\] and take derivatives as required. For example: \[\frac{dy}{dz} = \sum^{\infty}_{n=1} n a_n z^{n-1}, \quad \frac{d^2 y}{dz^2} = \sum^{\infty}_{n=2} n(n-1) a_n z^{n-2}\] and substitute them into the ODE. Then solve by rearranging indices as necessary to obtain a recurrence relation. Apply the initial conditions and then guess the closed-form solution of the recurrence relation. Change back to the original variables if required.
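For example, substituting the ansatz into $y'' + y = 0$ gives \[\sum^{\infty}_{n=0} \left[(n+2)(n+1) a_{n+2} + a_n\right] z^n = 0,\] so the recurrence relation is $a_{n+2} = -\frac{a_n}{(n+2)(n+1)}$. Its closed-form solution is $a_{2m} = \frac{(-1)^m a_0}{(2m)!}$ and $a_{2m+1} = \frac{(-1)^m a_1}{(2m+1)!}$, i.e. $y = a_0 \cos(z) + a_1 \sin(z)$.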
If $x_0$ is an ordinary point of the differential equation \[y'' + p(x)y' + q(x)y = 0\] then the general solution in a neighbourhood $|x - x_0| < R$ may be represented as a power series.
n-th Order
For an $n^{\text{th}}$ order linear ODE with variable coefficients: \[a_n(x) y^{(n)} + a_{n-1}(x) y^{(n-1)} + \dots + a_1(x) y' + a_0(x) y = f(x)\]
We assume a solution of the form: \[y(x) = \sum^{\infty}_{k=0} c_k (x-x_0)^k\]
Taking derivatives and substituting yields a recurrence relation for the coefficients, typically determining $c_k$ for $k \geq n$ in terms of the $n$ free coefficients $c_0, c_1, \dots, c_{n-1}$.
Any $n^{\text{th}}$ order ODE can be formulated as a system of $n$ first order ODEs.
For $y^{(n)} = f(x, y, y', \dots, y^{(n-1)})$, set $y_i = y^{(i-1)}$ for $i = 1,2,\dots,n$ to obtain: \[y_i' = y_{i+1} \text{ for } i = 1,2,\dots,n-1\] \[y_n' = f(x, y_1, y_2, \dots, y_n)\]
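For example, $y'' + y = 0$ becomes, with $y_1 = y$ and $y_2 = y'$: \[y_1' = y_2, \quad y_2' = -y_1, \quad \text{i.e.} \quad \dot{\mathbf{x}} = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix} \mathbf{x}.\]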
Partial Differential Equations
The general second-order linear PDE in two independent variables: \[A \frac{\partial^2 u}{\partial x^2} + B \frac{\partial^2 u}{\partial x \partial y} + C \frac{\partial^2 u}{\partial y^2} + D\frac{\partial u}{\partial x} + E\frac{\partial u}{\partial y} + F u = 0\]
- Parabolic equation: $B^2 - 4AC = 0$ (Heat Equation)
- Hyperbolic equation: $B^2 - 4AC > 0$ (Wave Equation)
- Elliptic equation: $B^2 - 4AC < 0$ (Laplace Equation)
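For example, writing the wave equation as $c^2 u_{xx} - u_{tt} = 0$ with $y = t$ gives $A = c^2$, $B = 0$, $C = -1$, so $B^2 - 4AC = 4c^2 > 0$: hyperbolic. The heat equation $u_t = k u_{xx}$ has $A = k$, $B = C = 0$, so $B^2 - 4AC = 0$: parabolic. Laplace's equation $u_{xx} + u_{yy} = 0$ has $A = C = 1$, $B = 0$, so $B^2 - 4AC = -4 < 0$: elliptic.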
When a PDE is difficult to solve directly, changing variables can transform it into a simpler form.
For a second-order PDE, the transformation $u = u(\xi, \eta)$ where $\xi = \xi(x,y)$ and $\eta = \eta(x,y)$ requires computing: \[\frac{\partial u}{\partial x} = \frac{\partial u}{\partial \xi}\frac{\partial \xi}{\partial x} + \frac{\partial u}{\partial \eta}\frac{\partial \eta}{\partial x}\] \[\frac{\partial u}{\partial y} = \frac{\partial u}{\partial \xi}\frac{\partial \xi}{\partial y} + \frac{\partial u}{\partial \eta}\frac{\partial \eta}{\partial y}\]
And similarly for second-order derivatives. The canonical transformations are:
- For hyperbolic: $\xi = x + y, \eta = x - y$ (characteristic coordinates)
- For parabolic: $\xi = x, \eta = y - f(x)$ (transformation along characteristics)
- For elliptic: $\xi = x + iy, \eta = x - iy$ (complex characteristics)
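For example, applying the hyperbolic transformation $\xi = x + ct$, $\eta = x - ct$ to the wave equation $u_{tt} = c^2 u_{xx}$ reduces it to the canonical form $u_{\xi\eta} = 0$, whose general solution is d'Alembert's: \[u(x,t) = F(x + ct) + G(x - ct).\]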
Systems / Dynamical Systems
The fixed point of the linear system $\dot{\mathbf{x}} = \mathbf{A}\mathbf{x}$ is classified by the eigenvalues $\lambda_1, \lambda_2$ of $\mathbf{A}$ (a computational sketch follows the list):
- $\lambda_2 < \lambda_1 < 0 \implies$ stable node
- $0 < \lambda_1 < \lambda_2 \implies$ unstable node
- $\lambda_1 = \lambda_2$, $\lambda_1 > 0 \implies$ unstable star (given two independent eigenvectors)
- $\lambda_1 = \lambda_2$, $\lambda_1 < 0 \implies$ stable star (given two independent eigenvectors)
- $\lambda_1 < 0 < \lambda_2 \implies$ saddle point, unstable
- $\operatorname{Re}(\lambda_{1,2}) = 0$, complex pair $\implies$ centre, neutrally stable
- $\operatorname{Re}(\lambda_{1,2}) < 0$, complex pair $\implies$ stable focus
- $\operatorname{Re}(\lambda_{1,2}) > 0$, complex pair $\implies$ unstable focus
For a linear system $\dot{\mathbf{x}} = \mathbf{A} \mathbf{x}$, the real canonical form depends on the eigenvalues of $\mathbf{A}$:
- Real distinct eigenvalues $\lambda_1 \neq \lambda_2$:
\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_2 \end{pmatrix}\]
- Real repeated eigenvalues $\lambda_1 = \lambda_2$ with linearly independent eigenvectors:
\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 0 \\ 0 & \lambda_1 \end{pmatrix}\]
- Real repeated eigenvalues $\lambda_1 = \lambda_2$ with one linearly independent eigenvector:
\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \lambda_1 & 1 \\ 0 & \lambda_1 \end{pmatrix}\]
- Complex conjugate eigenvalues $\lambda = \alpha \pm i\beta$:
\[\mathbf{A}_{\text{canonical}} = \begin{pmatrix} \alpha & \beta \\ -\beta & \alpha \end{pmatrix}\]
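For example, $\mathbf{A} = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has eigenvalues $\lambda_1 = 1$, $\lambda_2 = 3$ with eigenvectors $(1,-1)^T$, $(1,1)^T$; taking $\mathbf{P}$ with these eigenvectors as columns yields \[\mathbf{P}^{-1}\mathbf{A}\mathbf{P} = \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix}\] (the first canonical form), and the origin is an unstable node.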
Functions
Power Series, Taylor Series and Maclaurin Series:
- Power Series: $\sum_{n=0}^{\infty} a_n (x - a)^n$
- Taylor Series: $\sum_{n=0}^{\infty} \frac{f^{(n)}(a)}{n!} (x - a)^n$
- Maclaurin Series: $\sum_{n=0}^{\infty} \frac{f^{(n)}(0)}{n!} x^n$
Maclaurin is a special case of Taylor (at $a = 0$), and a Taylor series is a special case of a power series (with $a_n = f^{(n)}(a)/n!$).
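For example, $f(x) = e^x$ has $f^{(n)}(0) = 1$ for every $n$, so its Maclaurin series is \[e^x = \sum_{n=0}^{\infty} \frac{x^n}{n!} = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots\]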
Bessel's equation of order $\nu$: \[x^2 y'' + x y' + (x^2 - \nu^2) y = 0\]
Bessel function of the first kind of order $\alpha$: \[J_\alpha(x) = \sum^{\infty}_{m=0} \frac{(-1)^m}{\Gamma(m+1)\Gamma(m+\alpha+1)} \left(\frac{x}{2}\right)^{2m+\alpha}\] from which \[\frac{d}{dx} \left[x^\alpha J_\alpha(x)\right] = x^\alpha J_{\alpha-1}(x)\] and hence, setting $\alpha = n$ and integrating from $0$ to $r$, \[\int^r_0 x^n J_{n-1}(x) \, dx = r^n J_n(r) \text{ for } n = 1, 2, 3, \dots\]
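For instance, with $n = 1$: $\frac{d}{dx}\left[x J_1(x)\right] = x J_0(x)$, so \[\int^r_0 x J_0(x) \, dx = r J_1(r).\]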
Bessel's equation admits the following solutions:
- Case 1: $2\nu \notin \mathbb{Z}$: $J_{\nu}(x)$ and $J_{-\nu}(x)$ are linearly independent, so $y(x) = A J_{\nu}(x) + B J_{-\nu}(x)$
- Case 2: $2\nu \in \mathbb{Z}$ but $\nu \notin \mathbb{Z}$ (half-integer order): $J_{\nu}(x)$ and $J_{-\nu}(x)$ remain linearly independent, so $y(x) = A J_{\nu}(x) + B J_{-\nu}(x)$ still holds
- Case 3: $\nu \in \mathbb{Z}$: $J_{-\nu}(x) = (-1)^{\nu} J_{\nu}(x)$, so $J_{\nu}(x)$ and $J_{-\nu}(x)$ are linearly dependent and $y(x) = A J_{\nu}(x) + B Y_{\nu}(x)$, where $Y_{\nu}$ is the Bessel function of the second kind
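As an instance of Case 2, for $\nu = \tfrac{1}{2}$ the series reduces to elementary functions: \[J_{1/2}(x) = \sqrt{\frac{2}{\pi x}} \sin(x), \qquad J_{-1/2}(x) = \sqrt{\frac{2}{\pi x}} \cos(x),\] which are clearly linearly independent.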
Backlinks
1. Calculus /wiki/mathematics/calculus/