## Picard's Theorem

* First-order differential equations have the form $y'(x)=f(x,y)$, along with an initial data point $y(a)=b$ to determine the solution.

### Iterative Approximations

* The goal is to construct a sequence of functions $(y_{n})$ that converges to a solution (a small symbolic sketch follows the existence theorem below). In particular, the sequence is $\begin{align*} y_{0}&=b\\ y_{n+1}(x)&=b+ \int_{a}^x f(t,y_{n}(t)) \, dt \end{align*}$
* The difference between successive iterates $(e_{n})$ is given by $\begin{align*} e_{0}(x)&=b\\ e_{n+1}(x)&=y_{n+1}(x)-y_{n}(x)\\ &=\int_{a}^x f(t,y_{n}(t))-f(t,y_{n-1}(t)) \, dt \end{align*}$
* To bound this difference, we need to limit the integrand: a function $f(x,y)$ satisfies **the Lipschitz condition** in a region $R \subset \mathbb{R}^2$ if it is **Lipschitz continuous** in the region: $\exists L: \forall (x,u),(x,v) \in R, |f(x,u)-f(x,v)|\le L|u-v|$
* In particular, if $\partial f / \partial y$ exists and is bounded by $L$, Lipschitz continuity follows (with the same $L$): otherwise the MVT would produce a point where $|\partial f/\partial y|>L$, a contradiction.
* With the Lipschitz condition, $|f(t,y_{n})-f(t,y_{n-1})|\le L|e_{n}|$.

### Picard's Existence Theorem

* Given $f$ on the rectangle $R=[a \pm h] \times [b \pm k]$, the **Picard conditions** are:
    * $P(1a)$: $f$ is continuous on $R$ and bounded by $M$;
    * $P(1b)$: $Mh \le k$;
    * $P(2)$: $f$ satisfies the Lipschitz condition on $R$ with constant $L$.
* Then **Picard's existence theorem** guarantees a solution $y$ to $y'(x)=f(x,y),\, y(a)=b$ with $y:[a-h,a+h] \to [b-k,b+k]$.
* Proof - Step 1: the iterates $y_{n}$ are well-defined, continuous, and $|y_{n}-b| \le k$ (so they stay within the rectangle).
> Continuity and well-definedness come from the FTC and continuity theorems. $|y_{n+1}-b|=\big|\int _{a}^{x} f(t,y_{n}(t)) \, dt\big| \le \big|\int_{a}^{x} |f(t,y_{n}(t))| \, dt\big| \le M|x-a| \le Mh \le k$.
* Proof - Step 2: the differences $e_{n}$ diminish, as $|e_{n}(x)| \le L^{n-1}M |x-a|^n / n!$. Hence $|e_{n}| \le \frac{L^{n-1}Mh^n}{n!}$.
> By the Lipschitz condition, $|e_{n+1}(x)| \le L \big|\int _{a}^{x} |e_{n}(t)| \, dt\big|$, then induct on $n$.
* Proof - Step 3: the iterates $y_{n}= \sum_{k=0}^{n}e_{k}$ converge uniformly to $y_{\infty}$, which is a solution to the problem.
> Uniform convergence of $y_{n}$ is given by the Weierstrass M-test. Furthermore, $f(t,y_{n}) \xrightarrow{u} f(t, y_{\infty})$ because $\sup|f(t,y_{n})-f(t,y_{\infty})| \le \sup L|y_{n}-y_{\infty}| \to 0$. Hence $y_{\infty}=\lim_{ n \to \infty }\left(b+\int_{a}^{x} f(t,y_{n}) \, dt\right)=b+\int _{a}^{x} \lim_{ n \to \infty } f(t,y_{n})\, dt=b+\int _{a}^{x} f(t,y_{\infty}) \, dt$ and differentiating w.r.t. $x$ gives $y'_{\infty}=f(x,y_{\infty}(x))$. Exchange of integral and limit is legal due to uniform convergence.
* Picard's conditions $P(1a)$ and $P(1b)$ force the solution to satisfy $y([a \pm h]) \subseteq [b \pm k]$. Graphically, $y$ exits through the left and right edges of the rectangle $R$ (endpoints included), rather than through the top and bottom.
> Intuition: $M$ is a bound on $f=y'$, so the change of $y$ over $[a,x]$ is bounded by $M|x-a| \le Mh \le k$.
>
> Proof: for contradiction suppose otherwise, so continuity and the IVT give some $x \in [a \pm h]$ with $y(x)=b+k$ (or $b-k$). But then $k=y-b=\int _{a}^{x}f \, dt \le\big|\int _{a}^{x}|f| \, dt\big| \le M|x-a| < Mh \le k$, a contradiction. The penultimate inequality is strict since the theorem only allows the graph to reach the top/bottom corners at the endpoints, so exiting through the top or bottom would mean $x \ne a \pm h$, i.e. $|x-a|<h$.
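To make the iteration concrete, here is a minimal symbolic sketch (not part of the notes; the helper name `picard_iterates` and the example IVP $y'=y$, $y(0)=1$ are chosen purely for illustration). For this IVP the iterates are exactly the Taylor partial sums of $e^{x}$.

```python
# A minimal sketch of the Picard iteration y_{n+1}(x) = b + \int_a^x f(t, y_n(t)) dt,
# computed symbolically with sympy for the example IVP y' = y, y(0) = 1.
import sympy as sp

x, t = sp.symbols("x t")

def picard_iterates(f, a, b, n):
    """Return the Picard iterates y_0, ..., y_n as sympy expressions in x."""
    y = sp.Integer(b)                      # y_0 is the constant function b
    iterates = [y]
    for _ in range(n):
        integrand = f(t, y.subs(x, t))     # f(t, y_k(t))
        y = b + sp.integrate(integrand, (t, a, x))
        iterates.append(sp.expand(y))
    return iterates

if __name__ == "__main__":
    # f(x, y) = y, so L = 1 is a Lipschitz constant on any rectangle.
    for k, yk in enumerate(picard_iterates(lambda t_, y_: y_, a=0, b=1, n=4)):
        print(f"y_{k}(x) =", yk)
```

The printed iterates $1,\ 1+x,\ 1+x+x^{2}/2,\dots$ illustrate the uniform convergence to $e^{x}$ guaranteed by the theorem.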
### Uniqueness of the Solution

* **Gronwall's Inequality** (for constant $r,b$): if $r,b \ge 0$ and $v$ is continuous and non-negative, then $\begin{align*}\text{if}&&& v(x) \le b+r \bigg| \int_{a}^{x} v(s) \, ds \bigg|\\ \text{then }&&&v(x) \le be^{r|x-a|} \end{align*}$
> Proof: define $V(x)= \int _{a}^{x}v(s) \, ds$, non-negative for $x \ge a$ since $v$ is; then $V' \le b+rV$. Using the integrating factor $e^{-rx}$, $\begin{align*} (V'-rV)e^{-rx} &\le be^{-rx}\\ \frac{d}{dx}(Ve^{-rx}) &\le be^{-rx}\\ Ve^{-rx} &\le \int _{a}^{x}be^{-rt} \, dt = \frac{b}{r}(e^{-ra}-e^{-rx}) \\ V &\le \frac{b}{r}(e^{r(x-a)}-1) \end{align*}$ and substituting into the assumption gives the result. The case $x \le a$ follows by symmetry.
* Given the conditions of Picard's existence theorem, the solution is unique. Precisely, if $y_{1},y_{2}$ are both solutions to the system and $d=y_{1}-y_{2}$, then $|d|=0$ on the whole interval $[a \pm h]$.
> Proof: $\begin{align*} |d|=\bigg|\int _{a}^{x}f(t,y_{1})-f(t,y_{2}) \, dt\bigg| &\le\bigg|\int _{a}^{x}|f(t,y_{1})-f(t,y_{2})| \, dt\bigg|\\ &\le L\bigg|\int _{a}^x |y_{1}-y_{2}|\, dt\bigg|=L\bigg|\int _{a}^x |d|\, dt\bigg| \end{align*}$ Then Gronwall's inequality (with $b=0$) gives $|d|=0$, since $|d|$ is a continuous function.
* More generally, the solutions are **continuously dependent** on the initial data $y(a)=b$: given initial values $b_{1}, b_{2}$ and corresponding solutions $y_{1}, y_{2}$, there is $\forall \epsilon>0, \exists \delta >0: |b_{1}-b_{2}| < \delta \Rightarrow \sup|y_{1}-y_{2}| < \epsilon$
> Proof: applying Gronwall's inequality to $v=|y_{1}-y_{2}|$, there is $v \le |b_{1} - b_{2}| + L \bigg | \int _{a}^{x}v\, dt \bigg|$, giving $|y_{1}-y_{2}| \le |b_{1}-b_{2}|e^{Lh}$. So in fact the solution depends Lipschitz-continuously on the initial value.

### Global Solutions

* To extend the local solutions given above, we need a stronger condition $(P3)$, the **global Lipschitz condition**, which means Lipschitz continuity over $[a \pm h] \times \mathbb{R}$.
    * *That is, the Lipschitz inequality holds for all $y$.*
* Given $(P1a)$ and $(P3)$, there is a unique solution $y:[a \pm h] \to \mathbb{R}$.
> Proof: everything is done the same as in the local case, replacing $(P1b)$ and $(P2)$ with $(P3)$ where necessary.
* If $(P3)$ is satisfied for every $h$, then the solution is unique on arbitrarily large intervals, hence is global.

### Proof via the Contraction Mapping Theorem

* Let $\mathcal{C}_{h,k}$ denote the metric space of continuous functions $y:[a \pm h] \to [b \pm k]$ equipped with the supremum metric.
* A map $T:\mathcal{C}_{h,k} \to \mathcal{C}_{h,k}$ is a **contraction** if there is $K< 1:\forall y_{1},y_{2} \in \mathcal{C}_{h,k},\, \|T(y_{1})-T(y_{2})\| \le K\|y_{1}-y_{2}\|$
* The **contraction mapping theorem** states that given a contraction $T:X \to X$ with $X$ complete, there exists a unique $y \in X: Ty=y$.
* In this case, working with $T(y)=b+\int _{a}^x f(t,y)\, dt$, the differential equation becomes $y=T(y)$; if $T$ is a contraction, the solution is guaranteed to exist and to be unique.
* Restricting the interval to $[a \pm \eta]$, i.e. working in $\mathcal{C}_{\eta,k}$, with $\eta>0$ such that $M\eta \le k$ and $L\eta<1$, $T$ is a contraction. Hence there is a unique solution $y:[a \pm \eta] \to [b \pm k]$.
> Proof: $M\eta\le k$ guarantees $T$ maps $\mathcal{C}_{\eta,k}$ into itself; $L \eta<1$ guarantees contraction with constant $K=L \eta$.
* We can then extend the result to the entire $R=[a \pm h] \times [b \pm k]$ by shifting the box: $[(a+ i\eta) \pm \eta] \times [b \pm k]$ for $i \in \mathbb{Z}: |i|\eta<h$, and the solutions in each box can be pieced together.
* We can use a smaller $\eta$ (which still satisfies the conditions) so that $h$ is a multiple of it, and the region $R$ is then covered exactly by an integer number of $\eta$-boxes.
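To illustrate the fixed-point viewpoint numerically, here is a rough sketch (my own; the name `picard_fixed_point`, the grid size, the tolerance, and the example IVP $y'=y$, $y(0)=1$ are illustrative assumptions, not from the notes). It iterates the operator $T(y)(x)=b+\int_a^x f(t,y(t))\,dt$ on a grid until the supremum-metric distance between successive iterates is negligible.

```python
# Iterate the Picard operator T(y)(x) = b + \int_a^x f(t, y(t)) dt to its fixed
# point on a grid over [a - eta, a + eta], using the sup metric as stopping rule.
import numpy as np

def picard_fixed_point(f, a, b, eta, n_grid=2001, tol=1e-12, max_iter=200):
    xs = np.linspace(a - eta, a + eta, n_grid)
    y = np.full_like(xs, float(b))            # start from the constant iterate y_0 = b
    i0 = n_grid // 2                          # index of the grid point x = a
    for _ in range(max_iter):
        g = f(xs, y)                          # integrand f(t, y(t)) on the grid
        # cumulative trapezoidal integral, shifted so that it vanishes at x = a
        integral = np.concatenate(([0.0], np.cumsum((g[1:] + g[:-1]) / 2 * np.diff(xs))))
        integral -= integral[i0]
        Ty = b + integral
        if np.max(np.abs(Ty - y)) < tol:      # sup-metric distance ||T(y) - y||
            return xs, Ty
        y = Ty
    return xs, y

if __name__ == "__main__":
    # y' = y, y(0) = 1 on [-0.5, 0.5]; here L*eta = 0.5 < 1, so T is a contraction.
    xs, y = picard_fixed_point(lambda x, y: y, a=0.0, b=1.0, eta=0.5)
    print("max error vs exp(x):", np.max(np.abs(y - np.exp(xs))))
```

The remaining error is the trapezoidal discretization error, not a failure of the contraction argument.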
## Plane Autonomous Systems

### Definitions

* A **plane autonomous system** is a system of ODEs for $x(t), y(t)$ of the form $\begin{align*} \dot{x}(t) &= X(x,y)\\ \dot{y}(t) &= Y(x,y) \end{align*}$ Note that both derivatives are determined by the location $(x,y)$, and $t$ is only involved indirectly.
* Given initial data $(x(0),y(0))=(a,b)$, the solution curve is called a **trajectory** or a **phase path**. It is unique up to a shift in time: $(\tilde{x}(t),\tilde{y}(t))=(x(t+t_{0}),y(t+t_{0}))$ also solves the system.
    * Hence two phase paths can only intersect if they trace out the same curve.
* A **nullcline** is the collection of points where one of the derivatives is $0$.

### Critical Points and Linearization

* A point $(a,b)$ is a **critical point** if $X(a,b)=Y(a,b)=0$. It represents a stationary point of the system.
* A critical point $r_{\text{crit}}$ is **stable** if for every $\epsilon>0$ there is $\delta>0$ such that any starting point $r_{0}=(x(0),y(0))$ with $d(r_{0},r_{\text{crit}})<\delta$ has its trajectory stay within $\epsilon$ of $r_{\text{crit}}$ for all $t>0$.
* Writing $(x,y)=(a,b)+\mathbf{Z}(t)$ gives the linearized system $\mathbf{\dot{Z}}=\underbrace{\begin{pmatrix}X_{x}&X_{y}\\Y_{x}&Y_{y}\end{pmatrix}\bigg|_{(x,y)=(a,b)}}_{M}\mathbf{ Z}$
* Then the solution $\mathbf{Z}$ has the following forms:
    * Case 1a, the eigenvalues $\lambda_{1,2}$ are distinct reals, with eigenvectors $v_{1,2}$: $\mathbf{Z}=c_{1}e^{\lambda_{1}t}v_{1}+c_{2}e^{\lambda_{2}t}v_{2}$
    * Case 1b, the eigenvalues are distinct complex conjugates $\mu \pm i\nu$, and their eigenvectors are conjugates $v,\bar{v}$ too: $\mathbf{Z}=c\,e^{(\mu+i\nu)t}v+\bar{c}\,e^{(\mu-i\nu)t}\bar{v}$ Note that the second scaling factor must be the conjugate of the first, so that the solution is real.
    * Case 2, $\lambda=\lambda_{1}=\lambda_{2}$ and $M=\lambda I$: $\mathbf{Z}= \mathbf{C}e^{\lambda t}$
    * Case 3, $\lambda=\lambda_{1}=\lambda_{2}$, but $M-\lambda I \ne 0$. Let $v$ be an eigenvector and $u$ satisfy $(M-\lambda I)u=v$: $\mathbf{Z}=(c_{1}u+(c_{2}+c_{1}t)v)e^{\lambda t}$

### Stability and Classification of Critical Points

* Case 1, $0<\lambda_{1}<\lambda_{2}$: the exponentials blow up, but $e^{\lambda_{2}t}$ dominates, so $\mathbf{Z}$ aligns with the direction of $v_{2}$ as $t \to \infty$. The CP is an **unstable node**.
* Case 1, $0>\lambda_{2}>\lambda_{1}$: the exponentials go to $0$, but $e^{\lambda_{2}t}$ decays more slowly. The CP is a **stable node**.
* Case 1, $\lambda_{1}<0<\lambda_{2}$: a **saddle**.
* Case 1b, $\mu = 0$: a **center**, around which the trajectories rotate periodically.
* Case 1b, $\mu \ne 0$: a **stable/unstable spiral** (stable if $\mu<0$, unstable if $\mu>0$).
* Case 2: the CP is a **star**, stable if $\lambda<0$, and unstable otherwise.
* Case 3: the CP is an **inflected node**, with the same stability criterion as the star.
* In the case of spirals and centers, the direction of the spin depends on the sign of $X_{y}$: evaluating $\dot{x}=X$ at $x=a,\,y=b+\eta,\,\eta>0$ small gives a horizontal velocity $\approx X_{y}\eta$ just above the CP, so $X_{y}>0$ gives a clockwise spin, and $X_{y} <0$ gives a counterclockwise spin.
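The case analysis above can be mechanized. The following is a small illustrative helper (the function name, tolerance, and example matrices are my own; degenerate cases with a zero eigenvalue are not handled) that classifies a critical point from the eigenvalues of the linearization matrix $M$.

```python
# Classify a critical point of Z' = M Z from the eigenvalues of M.
import numpy as np

def classify_critical_point(M, tol=1e-9):
    """Return the type of the critical point of the linearized system Z' = M Z."""
    A = np.asarray(M, dtype=float)
    lam = np.linalg.eigvals(A)
    re, im = lam.real, lam.imag
    if np.all(np.abs(im) > tol):                    # Case 1b: complex conjugate pair
        if np.all(np.abs(re) < tol):
            return "center"
        return "stable spiral" if re[0] < 0 else "unstable spiral"
    if abs(lam[0] - lam[1]) < tol:                  # Cases 2/3: repeated real eigenvalue
        kind = "star" if np.allclose(A, re[0] * np.eye(2)) else "inflected node"
        return f"{'stable' if re[0] < 0 else 'unstable'} {kind}"
    if re[0] * re[1] < 0:                           # Case 1: real, opposite signs
        return "saddle"
    return "stable node" if np.all(re < 0) else "unstable node"

if __name__ == "__main__":
    print(classify_critical_point([[0, 1], [-1, 0]]))   # center
    print(classify_critical_point([[2, 0], [0, -1]]))   # saddle
    print(classify_critical_point([[-1, 0], [0, -3]]))  # stable node
```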
### Bendixson-Dulac Theorem and Periodic Solutions

*Big idea: Bendixson-Dulac is a criterion to rule out periodic solutions.*

* **Bendixson-Dulac Theorem**: given the system $\dot{x}=X,\dot{y}=Y$, there can be no periodic solutions (other than a fixed point) in the simply connected region $R$ if there is a continuously differentiable function $\phi$ with $\rho=\frac{\partial}{\partial x}(\phi X)+\frac{\partial}{\partial y}(\phi Y)>0\,\,\, \forall (x,y) \in R$ In particular, the handier result when we take $\phi$ to be constant: there can be no periodic solutions if $\rho=\frac{\partial X}{\partial x}+ \frac{\partial Y}{\partial y}=\text{tr}(M)\, \text{ has a fixed sign}$ For example, the damped linear oscillator $\dot{x}=y$, $\dot{y}=-x-y$ has $\rho=0+(-1)=-1<0$ everywhere, so it admits no periodic orbits.
> Proof: if for contradiction there is a periodic solution with trajectory $C$ and interior $D$, there is $\begin{align*} \iint_{D} \rho \,dA&=\iint_{D} \frac{\partial}{\partial x}(\phi X)+ \frac{\partial}{\partial y}(\phi Y)\,dA\\ &=\oint_{C} -\phi Y\, dx+\phi X\, dy & \text{by Green's Theorem}\\ &= \oint_{C}\phi(-\dot{ y}\,dx+\dot{x}\, dy)\\ &= \oint_{C} \phi(-\dot{y}\dot{x}+\dot{x}\dot{y})\, dt=0 \end{align*}$ contradicting the positivity (fixed sign) of $\rho$.

## First Order Semi-Linear PDEs

* A first order semi-linear PDE for an unknown $z=f(x,y)$ is: $P(x,y)\frac{\partial z}{\partial x}+Q(x,y) \frac{\partial z}{\partial y}=R(x,y,z)$
* The **solution surface** is the set $\Sigma=\{ (x,y,z) \in \mathbb{R}^{3}: z=f(x,y) \}$. It has normal vector $\mathbf{n}=(-f_{x},-f_{y},1)$.
* So writing $\mathbf{t}=(P,Q,R)$, the PDE becomes $\mathbf{t} \cdot \mathbf{n}=0$, and $\mathbf{t}$ is a tangent vector to the solution surface.
* A **characteristic (curve)** is a curve $\rho(t)=(x(t),y(t),z(t))\subset\Sigma$ that moves along $\mathbf{t}$.
* The **initial curve** is the initial data (say at $t=0$), parametrised in the form $\gamma(s)=(\gamma_{1}(s),\gamma_{2}(s),\gamma_{3}(\gamma_{1},\gamma_{2}))$, and the solution surface is then the union of all the characteristics that start from a point on $\gamma$. We can hence solve for the characteristic curve $\rho_{s}(t)$ that goes through $\gamma(s)$ for a fixed $s$, and thereby parametrize the solution surface as $\Sigma=\sigma(s,t)=\rho_{s}(t)$.

### Solving for the Characteristics

* To find the characteristic $\rho(t)=(x(t),y(t),z(t))$, since it goes along $\mathbf{t}=(P,Q,R)$, the curve must satisfy $\begin{align*} \dot{x}&= P(x,y)\\ \dot{y}&= Q(x,y)\\ \dot{z}&= R(x,y,z) \end{align*}$ Hence the coordinates $(x,y)$ form a plane autonomous system, whose solution is the **characteristic projection**.
* Solving the system gives $\rho(t)=(x(t),y(t),z(t))$, from which we can write $z=f(x,y)$ as a function of $x,y$ (a numerical sketch of this procedure is given at the end of these notes).

### Domain of Definition

* The **domain of definition** is the region in $\mathbb{R}^2$ over which the solution is uniquely determined.
* The domain of definition is covered by the projections of the characteristics starting on the initial curve $\gamma:[s_{0},s_{1}] \to \mathbb{R}^3$.
* Since characteristic projections starting at different points of the initial curve don't intersect, the domain of definition is bounded by the projections of the characteristics through the endpoints of the initial curve, $\sigma(s_{0},t)$ and $\sigma(s_{1},t)$.
* Potential problems for existence of solutions:
    * Over-determined system: if a characteristic projection crosses the data curve more than once (say at $(x,y)|_{s_{a}}$ and $(x,y)|_{s_{b}}$), the data might not be consistent: the solution $z_{a}(x,y)$ found with the data at $\gamma(s_{a})$ might not pass through $\gamma(s_{b})$. In this case, we might need to trim the data curve.
    * More extremely, if the initial curve $\gamma$ is itself a characteristic, then any curve $C$ that (1) intersects $\gamma$, and (2) has a projection not tangent to the characteristic projections, can be treated as initial data and hence generates a solution surface $\Sigma_{C}$. Choosing different curves $C$ therefore provides an infinity of solutions.
    * Lack of **Cauchy data**: Cauchy data requires a non-zero Jacobian on the data curve: $J(s,0)=\det \begin{pmatrix} x_{t}&y_{t} \\ x_{s}&y_{s} \end{pmatrix}=\det \begin{pmatrix} P&Q \\ \partial _{s}\gamma_{1} & \partial_{s} \gamma_{2} \end{pmatrix} \ne 0$ which requires the data curve not to be tangent to any of the characteristic projections through it. Without Cauchy data, it might be impossible to solve for $z$ as a function of $x,y$.
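To close, here is a minimal numerical sketch of the characteristic construction (my own illustration; the example PDE $z_{x}+z_{y}=z$ with data $z(s,0)=\sin s$, and the helper name `characteristic`, are assumptions, not from the notes). It traces the characteristics $\dot{x}=P$, $\dot{y}=Q$, $\dot{z}=R$ from points on the initial curve and checks the result against the exact solution $z=\sin(x-y)\,e^{y}$ obtained by hand from the same characteristics.

```python
# Trace characteristics of the example semi-linear PDE  z_x + z_y = z
# starting from the initial curve gamma(s) = (s, 0, sin s).
import numpy as np
from scipy.integrate import solve_ivp

P = lambda x, y: 1.0          # coefficient of z_x
Q = lambda x, y: 1.0          # coefficient of z_y
R = lambda x, y, z: z         # right-hand side

def characteristic(s, t_final=1.0, n=50):
    """Integrate (x, y, z) along the characteristic through gamma(s)."""
    rhs = lambda t, u: [P(u[0], u[1]), Q(u[0], u[1]), R(u[0], u[1], u[2])]
    sol = solve_ivp(rhs, (0.0, t_final), [s, 0.0, np.sin(s)],
                    t_eval=np.linspace(0.0, t_final, n), rtol=1e-10, atol=1e-12)
    return sol.y              # rows: x(t), y(t), z(t)

if __name__ == "__main__":
    errs = []
    for s in np.linspace(-2.0, 2.0, 9):          # sample points on the initial curve
        x, y, z = characteristic(s)
        errs.append(np.max(np.abs(z - np.sin(x - y) * np.exp(y))))
    print("max deviation from z = sin(x - y) e^y:", max(errs))
```

Here the characteristic projections are the parallel lines $x-y=\text{const}$, so the data curve $y=0$ is nowhere tangent to them and the Cauchy data condition holds.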