Throughout this note we assume that the random processes $(X_{n})$ and stopping times $\tau$ are adapted to a [[Filtrations and Stopping Times#^fb40a5|filtration]] $(\mathcal{F}_{n})$.
> [!definition|*] Martingales
> A stochastic process $(X_{n})$ is a **martingale** (with respect to $(\mathcal{F}_{n})$) if:
> - It is adapted to the filtration $(\mathcal{F}_{n})$.
> - It is integrable (i.e. $\forall n,~\mathbb{E}[| X_{n} |]<\infty$).
> - It is conditionally stationary: $\mathbb{E}[X_{n+1} ~|~\mathcal{F}_{n}]=X_{n}.$
>
> It is a **supermartingale** or a **submartingale** if $\mathbb{E}[X_{n+1} ~|~ \mathcal{F}_{n}]\le X_{n}$ or $\ge X_{n}$ respectively (keeping the other two conditions).

> [!examples] Examples of martingales and super/submartingales
> Examples of martingales include:
> - Sum of independent, $0$-mean, integrable RVs,
> - Product of independent, $1$-mean, non-negative RVs,
> - The process $(\mathbb{E}[Y ~|~ \mathcal{F}_{n}])$ of accumulating knowledge about $Y$, if $Y$ integrable.
>
> Examples of submartingales/supermartingales include:
> - If $f:\mathbb{R} \to \mathbb{R}$ is convex and $(X_{n})$ is a martingale, then $(f(X_{n}))$ is a submartingale if it's integrable, by Jensen's inequality.
> - If $(X_{n})$ represents your wealth in an ordinary (unfavourable) casino, then $(X_{n})$ is a supermartingale: the increments satisfy $\mathbb{E}[X_{n+1}-X_{n} ~|~ \mathcal{F}_{n}] \le 0$.
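As a quick numerical sanity check of the first two examples (a minimal `numpy` sketch; the $\pm 1$ steps and $U[0.9,1.1]$ factors are arbitrary choices), we can at least verify the necessary condition that $\mathbb{E}[X_{n}]$ stays constant in $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n = 100_000, 50

# Sum of independent 0-mean RVs (here: +-1 coin flips).
walk = np.cumsum(rng.choice([-1.0, 1.0], size=(paths, n)), axis=1)

# Product of independent 1-mean non-negative RVs (here: Uniform[0.9, 1.1]).
prod = np.cumprod(rng.uniform(0.9, 1.1, size=(paths, n)), axis=1)

# Martingales have constant expectation E[X_n] = E[X_0]; check a few times.
print(walk.mean(axis=0)[[0, 24, 49]])   # all ~0
print(prod.mean(axis=0)[[0, 24, 49]])   # all ~1
```
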
> [!warning] Martingales depend on the filtration
> If $(X_{n})$ and $(Y_{n})$ are martingales in different filtrations (e.g. their respective natural filtrations), *their sum is not necessarily a martingale* in its natural filtration, especially when they are not independent:
>
> Consider a martingale $(X_{n})$ and $(Y_{n}):= (X_{n+1})$; then in the natural filtration $(\mathcal{G}_{n}):= (\mathcal{F}_{n+1})$ of $(X_{n}+Y_{n})=(X_{n}+X_{n+1})$, $\mathbb{E}[X_{n}+X_{n+1} ~|~ \mathcal{G}_{n-1}]=X_{n} + \mathbb{E}[X_{n+1} ~|~ \mathcal{F}_{n}]=2X_{n},$not the $X_{n-1}+X_{n}$ that the martingale property requires.
A **martingale difference** is an adapted, integrable sequence $(Y_{n})$ with zero conditional expectation: $\mathbb{E}[Y_{n} ~|~ \mathcal{F}_{n-1}]=0$.
- A stochastic process $(X_{n})$ is a martingale if and only if $(\Delta X_{n}):= (X_{n}-X_{n-1})_{n \ge 1}$ is a martingale difference in $(\mathcal{F}_{n})_{n \ge 1}$, and $X_{0} \in m\mathcal{F}_{0}$ is integrable.
## Predictable Processes
> [!definition|*] Predictable Processes
> A stochastic process $(V_{n})$ adapted to $(\mathcal{F}_{n})$ is **predictable** or **previsible** if $\forall n \ge 1,~V_{n} \in m \mathcal{F}_{n-1}$. That is, its values can be determined before "turn" $n$ comes around.
Examples of predictable processes include:
- A sequence of constants,
- Any gambling strategy that doesn't involve foreseeing the outcome of the upcoming game.
Predictable processes are in a sense opposite to martingales: one is known one step ahead, while the other's changes have one-step-ahead conditional mean $0$.
- In particular, if $X=(X_{n})$ is both a martingale and a predictable process, then it is constant: $X_{n}=X_{0}~\mathrm{a.s.}$ for all $n \ge 0$ -- the change is known ahead of time and has mean $0$, so it must be $0~ (\mathrm{a.s.})$.
> [!proof]- Proof of Predictable Martingales being Constants
> Suppose the martingale is adapted to the filtration $(\mathcal{F}_{n})$, then $X_{n-1}=\mathbb{E}[X_{n} ~|~\mathcal{F}_{n-1}]=X_{n},$where the first equality is the martingale property, and the second predictability; induction then gives $X_{n}=X_{0}~\mathrm{a.s.}$
According to Doob, any integrable adapted process $(X_{n})$ can be decomposed into a baseline, a martingale and a predictable process:
> [!theorem|*] Doob's Decomposition Theorem
> If $X=(X_{n})$ is an integrable process adapted to $(\mathcal{F}_{n})$, then there exist a martingale $M=(M_{n})$ and a predictable process $A=(A_{n})$, both null at $0$ (meaning that $M_{0}=A_{0}=0$) such that $X=X_{0}+M+A.$Furthermore, this decomposition is unique up to $\mathrm{a.s.}$ equality.
>
> > [!proof]-
> > We explicitly construct these processes. Consider $\begin{align*}
> A_{n}&:= \sum_{i=1}^{n} \mathbb{E}[X_{i}-X_{i-1} ~|~ \mathcal{F}_{i-1}],\\
> M_{n}&:= X_{n}-X_{0}-A_{n},
> \end{align*}$i.e. $A$ collects the predictable drift of $X$, and $M$ is what is left. We verify this is the decomposition:
> > - $A$ is predictable since every term in the sum is $\in \mathcal{F}_{i-1} \subseteq \mathcal{F}_{n-1}$ by construction.
> > - $M$ is a martingale since it is integrable as $X$ is, and $\begin{align*}
> \mathbb{E}[M_{n} ~|~ \mathcal{F}_{n-1}]&= \mathbb{E}[X_{n} - X_{0}-A_{n} ~|~ \mathcal{F}_{n-1}]\\
> &= \mathbb{E}[X_{n} ~|~ \mathcal{F}_{n-1}]-X_{0}-A_{n}\\
&= \mathbb{E}[X_{n} ~|~ \mathcal{F}_{n-1}]-X_{0}-A_{n-1}-\mathbb{E}[X_{n}-X_{n-1} ~|~ \mathcal{F}_{n-1}]\\
&= \mathbb{E}[X_{n-1} ~|~ \mathcal{F}_{n-1}] - X_{0}-A_{n-1}\\
> &= X_{n-1}-X_{0}-A_{n-1}\\[0.4em]
> &= M_{n-1}.\end{align*}$
> >
> > Lastly, $\mathrm{a.s.}$ equality holds since if $M,A$ and $M',A'$ both satisfy the requirements, then $X-X_{0}=A+M = A' + M'$, so $A-A'=M'-M$ is a predictable martingale null at $0$, i.e. a sequence equalling $0~\mathrm{a.s.}$ for all $n$.
If furthermore the predictable part $A$ is $\mathrm{a.s.}$ non-increasing/non-decreasing, then the original process $X$ is a super/submartingale, and conversely.
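To see the construction in action, here is a minimal simulation sketch (the example process is my choice, not from the text above): for a simple random walk $(S_{n})$, the process $X_{n}:= | S_{n} |$ is adapted and integrable, and its predictable increments are $\mathbb{E}[| S_{i} |-| S_{i-1} | ~|~ \mathcal{F}_{i-1}]=\mathbf{1}_{S_{i-1}=0}$, so $A_{n}$ counts the visits to $0$ and $M_{n}=| S_{n} |-A_{n}$ should be a martingale:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n = 200_000, 100
steps = rng.choice([-1, 1], size=(paths, n))
S = np.concatenate([np.zeros((paths, 1), dtype=int),
                    np.cumsum(steps, axis=1)], axis=1)   # S_0 = 0

# Doob decomposition of X_n = |S_n|: the predictable increment
# E[|S_i| - |S_{i-1}| | F_{i-1}] equals 1 iff S_{i-1} = 0, else 0.
X = np.abs(S)
A = np.cumsum(S[:, :-1] == 0, axis=1)    # A_n: visits to 0 before time n
M = X[:, 1:] - X[:, :1] - A              # M_n = X_n - X_0 - A_n

print(M.mean(axis=0)[[0, 49, 99]])       # all ~0, as a martingale should be
```
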
One application of the result is to produce martingales by **compensating** a non-martingale process with its predictable part:
- For example, if $(X_{n})$ is a sequence of $0$-mean independent RVs in $\mathcal{L}^{2}$, then $\left( \sum_{i=1}^{n}X_{i}^{2} \right)$ is a submartingale, but compensating with $(A_{n})=\left( \sum_{i=1}^{n}\mathrm{Var}(X_{i}) \right)$ gives a martingale $\left( \sum_{i=1}^{n}X_{i}^{2}-\mathrm{Var}(X_{i}) \right)$.
- Applied in reverse, studying $(A_{n})$ gives information about the expected increment in a process: for example, decomposing $(X_{n}^{2})=(X_{0}^{2}+M_{n}+A_{n})$ for a martingale $X \in \mathcal{L}^{2}$ gives the **quadratic variation $(A_{n})=: \left< X \right>$**, where $A_{n+1}-A_{n}=\mathbb{E}[X_{n+1}^{2}-X_{n}^{2}~|~ \mathcal{F}_{n}]=\mathbb{E}[(X_{n+1}-X_{n})^{2} ~|~ \mathcal{F}_{n}],$i.e. the conditional variance of the increments. Here $(A_{n})$ is an $\mathrm{a.s.}$ non-decreasing process, and can be interpreted as the variance accumulated since time $n=0$ (see the sketch below).
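A minimal numerical sketch of compensation (the walk is my choice): for the simple random walk, each increment has conditional variance $1$, so $\left< X \right>_{n}=n$ and $X_{n}^{2}-n$ should have constant ($0$) mean:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n = 200_000, 100

# Simple random walk: conditional variance of each increment is exactly 1,
# so the compensator of X_n^2 is <X>_n = n.
X = np.cumsum(rng.choice([-1.0, 1.0], size=(paths, n)), axis=1)
M = X**2 - np.arange(1, n + 1)                 # M_n = X_n^2 - <X>_n

print((X[:, [0, 49, 99]]**2).mean(axis=0))     # E[X_n^2] grows like n
print(M.mean(axis=0)[[0, 49, 99]])             # ~0: compensated to a martingale
```
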
## Martingale Transforms
> [!definition|*] Martingale Transforms
> If $V:=(V_{n})_{n \ge 1}$ is a predictable process and $X:=(X_{n})$ is a martingale adapted to the same filtration, then the **martingale transform** of $(X_{n})$ by $(V_{n})$ is $(V \circ X)_{n \ge 1}:= \left( \sum_{i=1}^{n}V_{i}(X_{i}-X_{i-1}) \right)_{n \ge 1}.$
This can be interpreted as gambling $\$V_{i}$ on the $i$th round, and $X$ is the total number of wins minus losses, so the $i$th game leads to a win/loss of $\$V_{i}(X_{i}-X_{i-1})$, and $(V \circ X)$ is the cumulative change in wealth since game start.
*Transformations preserve martingales*: if $V$ is predictable and $X$ is a martingale, then $(V \circ X)$ is a martingale, provided it is integrable.
- One special case: when each $V_{n}$ is bounded, $(V \circ X)$ is automatically integrable.
> [!proof]-
> Assume $Y:=(V \circ X)$ is integrable. It's obviously adapted to $(\mathcal{F}_{n})$, so it remains to check conditional stationarity: since $V_{n} \in m\mathcal{F}_{n-1}$ and $V_{n}(X_{n}-X_{n-1})$ is integrable, we may take $V_{n}$ out of the conditional expectation to get $\begin{align*}
\mathbb{E}[Y_{n}-Y_{n-1}~|~ \mathcal{F}_{n-1}]&= \mathbb{E}[V_{n}(X_{n}-X_{n-1}) ~|~ \mathcal{F}_{n-1}]\\
&= V_{n} \mathbb{E}[X_{n} ~|~ \mathcal{F}_{n-1}] - V_{n} \mathbb{E}[X_{n-1} ~|~ \mathcal{F}_{n-1}] & \left[\substack{\text{taking out what} \\ \text{is known}}\right]\\
&= V_{n} X_{n-1} - V_{n}X_{n-1}\\
&= 0.
\end{align*}$
*Non-negative transformations preserve submartingales and supermartingales*: if $V \ge 0$ is predictable and $X$ a submartingale (resp. supermartingale), then so is $(V \circ X)$ if it is integrable.
- This means that in an unfavorable game (so $X$ is a supermartingale) that does not allow shorting the outcome, no betting strategy wins on average; the sketch below illustrates why predictability is the crucial assumption.
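Here is a minimal sketch of both points (the walk and the two strategies are arbitrary choices): a predictable bet leaves the mean at $0$, while a "clairvoyant" bet that peeks at the outcome of round $i$ (hence not predictable) makes money:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n = 200_000, 100
dX = rng.choice([-1.0, 1.0], size=(paths, n))     # martingale increments
X = np.cumsum(dX, axis=1)
X_prev = np.concatenate([np.zeros((paths, 1)), X[:, :-1]], axis=1)

# Predictable bet: V_i depends only on X_{i-1}, known before round i.
V = np.where(X_prev < 0, 2.0, 1.0)
print((V * dX).sum(axis=1).mean())                # ~0: (V o X) is a martingale

# Clairvoyant bet: V_i looks at the outcome of round i -- not predictable.
V_cheat = (dX > 0).astype(float)
print((V_cheat * dX).sum(axis=1).mean())          # ~n/2: transform property fails
```
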
## Stopped Martingales
> [!definition|*] Stopped Martingales
> If $X$ is a martingale and $\tau$ a [[Filtrations and Stopping Times#Stopping Time and Stopped Processes|stopping time]], both adapted to the filtration $(\mathcal{F}_{n})$, then the **stopped martingale** $X^\tau$ is $X^{\tau}_{n}:=X_{n \land \tau}, \text{ adapted to } \mathcal{F}_{n \land \tau},$where $n \land \tau:=\min(n, \tau)$. That is, $X$ stops changing after $\tau$.
>
> Stopped super/submartingales are similarly defined.
*Stopped martingales are martingales*: the stopped martingale satisfies $X^{\tau}_{n}=X_{0}+(C \circ X)_{n}$ for the bounded predictable process $C_{n}:= \mathbf{1}_{\tau \ge n}$ (predictable since $\{ \tau \ge n \}=\{ \tau \le n-1 \}^{c} \in \mathcal{F}_{n-1}$), so $X^{\tau}$ is a martingale if $X$ is one. Similar results hold for super/submartingales.
As a corollary, if $X$ is a martingale, then $\forall n,~\mathbb{E}[X^{\tau}_{n}]=\mathbb{E}[X_{0}],$and $\le, \ge$ for super/submartingales respectively.
- Hence for supermartingales, no stopping time can stop losses on average.
> [!warning]
> This does not imply that $\mathbb{E}[X_{\tau}]=\mathbb{E}[X_{0}]$: consider the simple random walk $(X_{n})$ with $X_{0}=0$ and the stopping time $\tau:= \inf \{t ~|~ X_{t} =1\}$. Although $\tau <\infty~ \mathrm{a.s.}$, obviously $1=\mathbb{E}[X_{\tau}] \ne \mathbb{E}[X_{0}]=0$.
Doob's theorems explore cases where this equality holds.
> [!theorem|*] Doob's Optional Stopping Theorem
> If $(X_{n})$ is a supermartingale and $\tau$ a stopping time, then $\mathbb{E}[X_{\tau}] \le \mathbb{E}[X_{0}]$ if any of the following holds:
> - $\tau$ is bounded;
> - $\tau$ is $\mathrm{a.s.}$ finite and $(X_{n})$ is bounded ($\exists K: | X_{n} | \le K$ for all $n,\omega$);
> - $\tau$ is $\mathrm{a.s.}$ finite and $(X_{n})$ is non-negative;
> - $\mathbb{E}[\tau] < \infty$ and $\mathbb{E}[| X_{n+1} - X_{n}| ~\big|~ \mathcal{F}_{n}]\le K$ for some constant $K$, for all $n,\omega$.
>
> If $(X_{n})$ is instead a martingale, the equality holds.
> >[!proof]-
> > $[\tau \text{ bounded by }N]$ Since $\mathbb{E}[X_{\tau \land n}]=\mathbb{E}[X^{\tau}_{n}]\leq \mathbb{E}[X_{0}]$ for any $n$, setting $n=N$ gives the result.
> >
> > $[\tau <\infty ~\mathrm{a.s.}, ~(X_{n})\text{ bounded}]$ $X_{\tau \land n}\to X_{\tau}$, so $(X_{n})$ bounded allows DCT/BCT to give $\mathbb{E}[X_{\tau\land n}] \to \mathbb{E}[X_{\tau}]$ and the same result holds.
> >
> > $[\tau < \infty ~\mathrm{a.s.}, (X_{n}) \ge 0]$ Since $(X_{n})$ is non-negative, Fatou gives $\mathbb{E}[X_{\tau}]=\mathbb{E}[\underset{n \to \infty}{\lim\inf} X_{\tau \land n}] \le \underset{n \to \infty}{\lim\inf}\mathbb{E}[X_{\tau \land n}] \le \mathbb{E}[X_{0}]$.
> >
> > $[\mathbb{E}[\tau]<\infty, \text{bounded expected increment}]$ triangle inequality: $\begin{align*}
| X_{\tau \land n}-X_{0} | &\le \sum^{\tau \land n}_{i=1}| X_{i}-X_{i-1} | \le \sum^{\infty}_{i=1} \mathbf{1}_{\tau \ge i}| X_{i}-X_{i-1} |\\
\mathbb{E}[| X_{\tau \land n}-X_{0} |] &\le \mathbb{E}\left[ \sum^{\infty}_{i=1} \mathbf{1}_{\tau \ge i}| X_{i}-X_{i-1} | \right]\\
&= \sum^{\infty}_{i=1} \mathbb{E}[\mathbf{1}_{\tau \ge i} | X_{i}-X_{i-1} |] &&\text{[MCT]}\\
&= \sum^{\infty}_{i=1} \mathbb{E}\big[ \mathbb{E}[\mathbf{1}_{\tau \ge i} | X_{i}-X_{i-1} | ~|~ \mathcal{F}_{i-1}] \big] &&\text{[Tower]}\\
&= \sum^{\infty}_{i=1} \mathbb{E}\big[\mathbf{1}_{\tau \ge i} \mathbb{E}[| X_{i}-X_{i-1} | ~|~ \mathcal{F}_{i-1}] \big]&&\left[\substack{\text{taking out} \\ \text{what is known}}\right]\\
&\le K \sum_{i=1}^{\infty}\mathbb{E}[\mathbf{1}_{\tau \ge i}]\\[0.4em] &= K\mathbb{E}[\tau] <\infty. \end{align*}$Then DCT applies, giving $\mathbb{E}[ X_{\tau \land n} -X_{0}] \to \mathbb{E}[ X_{\tau} - X_{0} ]$; each term on the left is $\le 0$ by the stopped-supermartingale result, so $\mathbb{E}[X_{\tau}] \le \mathbb{E}[X_{0}]$.
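A numerical sketch of the theorem in the bounded case (gambler's-ruin stopping of the simple random walk; the levels $-a, b$ are arbitrary choices): here $X^{\tau}$ is bounded, so $\mathbb{E}[X_{\tau}]=\mathbb{E}[X_{0}]=0$ exactly, which also pins down the hit probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, a, b, horizon = 20_000, 5, 10, 1_000

steps = rng.choice(np.array([-1, 1], dtype=np.int32), size=(paths, horizon))
X = np.cumsum(steps, axis=1, dtype=np.int32)

# tau = first exit time of (-a, b); the stopped process is bounded, so
# optional stopping gives E[X_tau] = E[X_0] = 0.
exited = (X <= -a) | (X >= b)
tau = exited.argmax(axis=1)              # first exit index (a.s. < horizon here)
X_tau = X[np.arange(paths), tau]

print(X_tau.mean())                      # ~0
print((X_tau == b).mean(), a / (a + b))  # P[hit b first] matches a/(a+b)
```
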
> [!theorem|*] Doob's Optional Sampling Theorem
> If $\tau \le \rho$ are two bounded stopping times, and $(X_{t})$ a martingale on the same filtration, then $\mathbb{E}[X_{\rho} ~|~ \mathcal{F}_{\tau}]=X_{\tau}~ \mathrm{a.s.}$
>
> > [!proof]
> > Check the defining relationship of conditional expectation for $X_{\tau}$: that $\mathbb{E}[X_{\rho}\mathbf{1}_{A}]=\mathbb{E}[X_{\tau}\mathbf{1}_{A}]$ over any $A \in \mathcal{F}_{\tau}$ (measurability of $X_{\tau}$ w.r.t. $\mathcal{F}_{\tau}$ is routine).
> >
> > If $\rho$ is constant $\rho \equiv n$, then $\tau \le n$, and we can sum over all its possible values to get $\begin{align*}
\mathbb{E}[X_{n}\mathbf{1}_{A}]&= \sum_{t=0}^{n}\mathbb{E}[X_{n}\mathbf{1}_{\{ \tau = t \} \cap A}]\\
&= \sum_{t=0}^{n}\mathbb{E}[\mathbb{E}[X_{n}\mathbf{1}_{\{ \tau = t \} \cap A} ~|~ \mathcal{F}_{t}]] & [\text{tower rule}]\\
&= \sum_{t=0}^{n}\mathbb{E}[\mathbf{1}_{\{ \tau = t \} \cap A} \cdot \mathbb{E}[X_{n} ~|~ \mathcal{F}_{t}]] & [\text{def. of }\mathcal{F}_{\tau}]\\
&= \sum_{t=0}^{n}\mathbb{E}[X_{t}\mathbf{1}_{\{ \tau = t \} \cap A}]\\
&= \mathbb{E}[X_{\tau} \mathbf{1}_{A}].
\end{align*}$
Now generalize to non-constant $\rho$: consider the martingale $(Y_{t}):= (X_{\rho \land t}-X_{\tau \land t})$: $\begin{align*}
0 &= Y_{\tau \land n} = \mathbb{E}[Y_{n} ~|~ \mathcal{F}_{\tau \land n}] =\mathbb{E}[X_{\rho \land n} ~|~ \mathcal{F}_{\tau \land n}]-X_{\tau \land n},
\end{align*}$where the second equality follows from above, but with $\tau \land n$ instead of $\tau$. Now take $n \ge \rho,\tau$ (possible since $\rho, \tau$ bounded) to get the result.
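A rough numerical check (a sketch; the walk, the capped hitting times, and the event $A$ are all arbitrary choices): for bounded stopping times $\tau \le \rho$ and any $A \in \mathcal{F}_{\tau}$, the theorem forces $\mathbb{E}[X_{\rho}\mathbf{1}_{A}]=\mathbb{E}[X_{\tau}\mathbf{1}_{A}]$:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, horizon = 200_000, 40
steps = rng.choice(np.array([-1, 1], dtype=np.int32), size=(paths, horizon))
X = np.concatenate([np.zeros((paths, 1), dtype=np.int32),
                    np.cumsum(steps, axis=1, dtype=np.int32)], axis=1)

def capped_hit(level, cap):
    """min(first time |X| reaches level, cap): a bounded stopping time."""
    hit = np.abs(X[:, : cap + 1]) >= level
    return np.where(hit.any(axis=1), hit.argmax(axis=1), cap)

idx = np.arange(paths)
tau = capped_hit(3, 20)
rho = np.maximum(capped_hit(5, 40), tau)   # stopping times with tau <= rho
A = (tau <= 10) & (X[idx, tau] == 3)       # both conditions are F_tau events

print((X[idx, rho] * A).mean(), (X[idx, tau] * A).mean())  # approximately equal
```
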
> [!exposition] Application for computing expected time for a pattern to appear
>
> Consider a sequence of iid. variables $(X_{n})$ taking values $0,1,\dots$ with probabilities $\mathbb{P}[X_{1}=i]=:p_{i}>0$. Let $\tau$ be the first time the finite pattern $i_{1},\dots,i_{k}$ appears, i.e. the first $n$ such that $(X_{n-k+1}, \dots, X_{n})=(i_{1},\dots,i_{k})$.
>
> Obviously $\tau<\infty~\mathrm{a.s.}$ (cf. [[Measure Theory#Borel-Cantelli Lemmas|the monkey with a typewriter]]), and we wish to compute $\mathbb{E}[\tau]$.
>
> Imagine a casino offering fair bets, and a sequence of gamblers arriving with a starting fortune of $\$1$. For simplicity assume that the pattern is $1,\dots,k$.
>
> The $n$th gambler bets that the pattern will appear in the next $k$ rolls, i.e. $X_{n}=1, X_{n+1}=2,\dots,X_{n+k-1}=k,$and their fortune is calculated after each roll. They go all in for each roll, so their fortune goes $1 \to p_{1}^{-1} \to (p_{1}p_{2})^{-1} \to \cdots \to (p_{1}\dots p_{k})^{-1}$if they weren't wiped out halfway through.
>
> ---
>
> Since the casino is offering fair bets, its net profit $(M_{n})$ after the $n$th roll is a martingale, where $M_{n}=n-\text{total fortune of gamblers still in the game}.$Note that if $n \le \tau$, only the last $k$ gamblers can possibly still be gambling. Now, since $\mathbb{E}[\tau]<\infty$ (by a geometric-trials comparison) and the increments of $(M_{n})$ are bounded, Doob's optional stopping theorem applies: $0=\mathbb{E}[M_{\tau}]=\mathbb{E}[\tau]-\left( \substack{\text{fortune of gamblers}\\ \text{ when the pattern appears}} \right),$where in the case of the pattern $12\cdots k$, only the $(\tau-k+1)$th gambler is still in (with a fortune of $(p_{1}p_{2}\cdots p_{k})^{-1}$). Therefore $\mathbb{E}[\tau]=(p_{1}\cdots p_{k})^{-1}$.
>
> For patterns with repetition like $01201$, the $(\tau-1)$th and the $(\tau-4)$th are in, with a fortune of $(p_{0}p_{1})^{-1}$ and $(p_{0}^{2}p_{1}^{2}p_{2})^{-1}$ respectively, giving $\mathbb{E}[\tau_{01201}]=\frac{1}{p_{0}p_{1}}+\frac{1}{p_{0}^{2}p_{1}^{2}p_{2}}.$
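A Monte Carlo sanity check of the $01201$ example (a sketch; the distribution $(p_{i})$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.5, 0.3, 0.2])            # P[X=0], P[X=1], P[X=2]
pattern = np.array([0, 1, 2, 0, 1])      # the pattern "01201"
trials, horizon, k = 10_000, 4096, len(pattern)

X = rng.choice(len(p), size=(trials, horizon), p=p).astype(np.int8)

# match[i, t] is True iff the pattern occupies positions t..t+k-1 of trial i.
match = np.ones((trials, horizon - k + 1), dtype=bool)
for j in range(k):
    match &= X[:, j : horizon - k + 1 + j] == pattern[j]

tau = match.argmax(axis=1) + k           # time at which the pattern completes
exact = 1 / (p[0] * p[1]) + 1 / (p[0]**2 * p[1]**2 * p[2])
print(tau.mean(), exact)                 # both ~228.9
```
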
## Upcrossings
Consider trading an asset whose price at time $n$ is $(X_{n})$, with the strategy of fixing two prices $a<b$, buying when the price dips below $a$, and holding until it rises above $b$, when we sell:
![[Upcrossings.png#invert|w60|center]]
<p style="text-align:center;">Filled circles indicate holding a position.</p>
Then by time $N$, we earn at least $(b-a)$ times the number of times $X$ crossed the gap, minus the loss on any position still held.
> [!definition|*] Upcrossings
> Fix $a < b$ and stochastic process $(X_{n})$ adapted to filtration $(\mathcal{F}_{n})$. Recursively define the sequences $\rho, \tau$:
> - $\rho_{k}$ are the times we buy, i.e. the first time $X$ dips to $a$ or below since we last sold.
> - $\tau_{k}$ are the times we sell, i.e. the first time $X$ reaches $b$ or above since we last bought.
> - Default $\tau_{0}=0$ and $\inf\emptyset=\infty$, then for $k \ge 1$, $\begin{align*}
\rho_{k}:&= \inf \{ n \ge \tau_{k-1} ~|~ X_{n} \le a \}\\
\tau_{k} :&= \inf \{ n \ge \rho_{k} ~|~ X_{n} \ge b\}
\end{align*}$
>
>The **number of upcrossings** by time $n$ is then $U_{n}([a,b], X):= \max \{ k ~|~ \tau_{k} \le n\},$i.e. the number of times we could have sold with that strategy. Further denote $U([a,b],X):= \limsup_{n} U_{n}([a,b],X)$.
The earnings $Y_{n}$ from this strategy by time $n$ then satisfy $Y_{n} \ge \underbrace{(b-a)\cdot U_{n}([a,b],X)}_{\substack{\text{lower bound of} \\ \text{past profits}}} - \underbrace{(X_{n}-a)_{-}}_{\substack{\text{upper bound of} \\ \text{current loss}}}.$
- The first term is a lower bound on the profit from the strategy: we could have earned more if $(X_{n})$ overshot $b$ when we sold or undershot $a$ when we bought.
- The second term is the (exaggerated) potential loss due to holding a position while the price drops below $a$; if we bought at less than $a$, the actual loss is less than that.
However, note that (just like any investing strategy) we can *write the PnL $Y_{n}$ as a martingale transform*: $\begin{align*}
Y_{n}&= \sum_{i=1}^{n}(X_{i}-X_{i-1})\cdot \underset{=: C_{n}}{\mathbf{1}_{\text{holding a position}}} &\left( \substack{\text{this is true for any strategy} \\ \text{ where the position is } \in \{ 0,1 \}} \right)\\
&= (C \circ X),
\end{align*}$if we can show that $C$ is predictable: indeed, this is true since $\begin{align*}
C_{n}&= \begin{cases}
\mathbf{1}_{\text{has not sold}} &\text{if held a position at }(n-1) \\
\mathbf{1}_{\text{has bought}} &\text{if held no position at }(n-1)
\end{cases}\\[0.4em]
&= \mathbf{1}_{X_{n-1} < b} \cdot\mathbf{1}_{C_{n-1}=1} + \mathbf{1}_{X_{n-1} \le a} \cdot\mathbf{1}_{C_{n-1}=0},
\end{align*}$where each term is $\in \mathcal{F}_{n-1}$.
> [!proof]- Alternative formulation of $C$
> Alternatively, with $k$ being the number of upcrossings completed by time $n-1$, $\begin{align*}
C_{n}&= \mathbf{1}\{ \text{mid-upcrossing at time } n-1\}\\
&= \mathbf{1} \{ \rho_{k+1} \le n-1 \text{ AND } \tau_{k+1} > n-1\} \in m\mathcal{F}_{n-1},
\end{align*}$as $\rho,\tau$ are stopping times, confirming that $C$ is predictable.
Therefore, $(C_{n})$ is a non-negative, bounded, predictable process, so $(C \circ X)$ is a martingale if $X$ is a martingale, and a supermartingale if $X$ is a supermartingale. *This strategy does not beat the game*.
One consequence is:
> [!theorem|*] Doob's Upcrossing Lemma
> If $X$ is a supermartingale, then for any $a < b$ $\mathbb{E}[(a-X_{n})_{+}]\ge (b-a)\cdot\mathbb{E}[U_{n}([a,b], X)],$that is, *the loss of being stuck with a long position outweighs the profit of the strategy on average*.
- Similarly, the "bad" strategy of buying high (at $b$) and selling low (at $a$) cannot beat a submartingale, so if $X$ is a submartingale instead,
$\mathbb{E}[(X_{n}-b)_{+}]\ge (b-a)\cdot\mathbb{E}[D_{n}([a,b], X)],$where $D_{n}([a,b],X)$ counts the completed *downcrossings* from $b$ to $a$ by time $n$; since $D_{n} \ge U_{n}-1$, this controls upcrossings just as well. The supermartingale version is checked numerically in the sketch below.
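Below is a simulation sketch of the strategy and the supermartingale version of the lemma, for the simple random walk (a martingale, hence also a supermartingale; the interval $[a,b]$ is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n, a, b = 50_000, 200, -2.0, 2.0

X = np.concatenate([np.zeros((paths, 1)),
                    np.cumsum(rng.choice([-1.0, 1.0], size=(paths, n)), axis=1)],
                   axis=1)                       # X_0 = 0

C = np.zeros(paths, dtype=bool)                  # C_t: holding during round t
U = np.zeros(paths)                              # completed upcrossings
for t in range(1, n + 1):
    # the recursion above: decided from time t-1 information only (predictable)
    C = (C & (X[:, t - 1] < b)) | (~C & (X[:, t - 1] <= a))
    U += C & (X[:, t] >= b)                      # an upcrossing completes at t

# Doob's upcrossing lemma: E[(a - X_n)_+] >= (b - a) E[U_n].
print(np.maximum(a - X[:, n], 0).mean(), (b - a) * U.mean())
```
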
## Martingale Convergence
Upcrossing gives a way to measure the oscillation of a sequence, hence its convergence or divergence:
> [!lemma|*] Upcrossing and Convergence
> If $X=(X_{n})$ are random variables $\Omega \to \mathbb{R}$, then $X_{n}$ converges $\mathrm{a.s.}$ to a limit $X_{\infty} \in [-\infty,\infty]$ if and only if $\forall a,b \in \mathbb{R}, a<b, ~U([a,b],X) < \infty ~\mathrm{a.s.}$I.e., *a sequence of random variables converges if and only if it does not oscillate about any pair $(a,b)$*.
>
> >[!proof]-
> >We prove the contrapositive (i.e. divergence if and only if infinite upcrossing).
> >
> >$[\Rightarrow]$ If the limit does not exist in $[-\infty,\infty]$, then $\underset{n}{\lim\inf}~X_{n}\lneq\underset{n}{\lim\sup}~X_{n}$, so we can choose $a,b$ with $\underset{n}{\lim\inf}~X_{n}<a<b<\underset{n}{\lim\sup}~X_{n}$; then $(X_{n} <a)~\mathrm{i.o.}$ and $(X_{n}>b)~\mathrm{i.o.}$, so $U([a,b],X)=\infty$.
> >$[\Leftarrow]$ If $U([a,b],X)=\infty$ for some $a<b$, then $(X_{n} \le a)~\mathrm{i.o.}$ and $(X_{n} \ge b)~\mathrm{i.o.}$, so $\underset{n}{\lim\inf}~X_{n} \le a<b \le \underset{n}{\lim\sup}~X_{n}$, i.e. the limit does not exist.
Combined with Doob's upcrossing lemma, which bounds the (expected) number of upcrossings, it guarantees the convergence of $\mathcal{L}^{1}$-bounded sub/supermartingales.
> [!theorem|*] Doob's Forward Convergence Theorem
> If $X=(X_{n})$ is a sub/supermartingale, and it is bounded in $\mathcal{L}^{1}$ (i.e. $\exists K: \mathbb{E}[| X_{n} |] \le K~\forall n$), then $(X_{n}) \xrightarrow{\mathrm{a.s.}} X_{\infty}\in \mathcal{L}^{1}$.
> > [!warning]
> > This is point-wise convergence, with no guarantee that $(X_{n}) \to X_{\infty}$ in $\mathcal{L}^{1}$.
>
> >[!proof]-
> >Fix any $a,b \in \mathbb{Q},a<b$. The big idea is that *an infinite oscillation between $a,b$ would allow for infinite profit $(b-a)U_{n}$ that cannot be balanced by the risk of drawdown $(X_{n}-a)_{-}$, because $X$ is bounded in $\mathcal{L}^{1}$*. This would contradict Doob's upcrossing lemma.
> >
> > Such oscillation happens with probability $
\mathbb{P}[\underset{n}{\lim\inf}~X_{n}<a<b<\underset{n}{\lim\sup}~X_{n}]\le \mathbb{P}[U([a,b],X)= \infty]. ~~~~~~(\ast)$But by Doob's upcrossing lemma, $\mathbb{E}[U_{n}([a,b],X)] \le \frac{1}{b-a}\mathbb{E}[(X_{n}-a)_{-}] \le \frac{1}{b-a}\mathbb{E}[| X_{n} |+| a |]\le \frac{K+| a |}{b-a}<\infty$by assumption, so by MCT, letting $n \to \infty$ gives $\mathbb{E}[U([a,b],X)]< \infty$, and so the right side of $(\ast)$ must be $0$. Now taking the countable union over all $(a,b) \in \mathbb{Q}^{2}$ proves that $\Big( \not\exists a,b \in \mathbb{Q}~\mathrm{s.t.}~ U([a,b],X)=\infty \Big) \text{ is true }\mathrm{a.s.},$so the previous lemma gives $(X_{n}) \to X_{\infty}~\mathrm{a.s.}$. Finally, Fatou gives $\mathbb{E}[| X_{\infty} |] \le \underset{n}{\lim\inf}~\mathbb{E}[| X_{n} |] \le K$, so $X_{\infty} \in \mathcal{L}^{1}$ (in particular, the limit is $\mathrm{a.s.}$ finite).
Obvious corollaries include:
- If $X \ge 0$ is a supermartingale, then it has an $\mathrm{a.s.}$ limit $X_{\infty}$: it is automatically bounded in $\mathcal{L}^{1}$, since $\mathbb{E}[| X_{n} |]=\mathbb{E}[X_{n}] \le \mathbb{E}[X_{0}]$.
- A bounded martingale converges to some integrable limit $\mathrm{a.s.}$ (see the sketch below).
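For a concrete picture, the classic Pólya urn (not introduced above; swapped in here as an illustration) has a red-ball fraction that is a martingale with values in $[0,1]$, so the theorem says each path settles to its own random limit; a sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Polya urn: draw a ball uniformly at random, return it together with one
# more ball of the same colour. The red fraction is a bounded martingale.
for path in range(3):
    red, total, snapshots = 1, 2, []
    for n in range(1, 30_001):
        red += rng.random() < red / total    # the draw is red w.p. red/total
        total += 1
        if n in (100, 1_000, 30_000):
            snapshots.append(round(red / total, 3))
    print(snapshots)   # within a path the values settle; across paths they differ
```
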
## Martingales in $\mathcal{L}^{2}$
If $X=(X_{n})$ is a martingale in $\mathcal{L}^{2}$ (not just in $\mathcal{L}^{1}$), it has the property $\mathbb{E}[X_{n}^{2}]=\mathbb{E}[X_{0}^{2}]+\sum_{i=1}^{n}\mathbb{E}[(X_{i}-X_{i-1})^{2}].$This is because of the **orthogonality** of martingale increments: written with the inner product $\left< X_{i}, X_{j} \right>:= \mathbb{E}[X_{i}X_{j}]$, the equation becomes $\| X_{n} \|_{2}^{2}\overset{?}{=}\sum_{i=0}^{n}\| X_{i}-X_{i-1} \|_{2}^{2},$where $X_{-1}:=0$ for convenience.
> [!proof]- Rest of the proof
> Expanding $\mathrm{LHS}=\|\sum_{i=0}^{n} (X_{i}-X_{i-1}) \|_{2 }^{2}$ gives $\begin{align*}
\mathrm{LHS}&= \sum_{i=0}^{n}\| X_{i}-X_{i-1} \|_{2}^{2} + \sum_{i \ne j} \left< X_{i}-X_{i-1}, X_{j}-X_{j-1} \right>,
\end{align*}$so it remains to prove that the inner product is $0$ whenever $i \ne j$. WLOG assume $i < j$, and first note that expected martingale increments $\mathbb{E}[X_{j}-X_{j-1} ~|~ \mathcal{F_{i}}]$ are $0$ for any $i < j$: $\begin{align*}
\mathbb{E}[X_{j}-X_{j-1} ~|~ \mathcal{F_{i}}]&= \mathbb{E}[\mathbb{E}[X_{j}-X_{j-1} ~|~ \mathcal{F_{j-1}}] ~|~ \mathcal{F_{i}}] & [\text{towering}]\\
&= \mathbb{E}[X_{j-1}-X_{j-1} ~|~ \mathcal{F}_{i}] &\left[ \substack{\text{def. of martingales} \\ \text{\& conditional exps.}} \right]\\
&= 0.
\end{align*}$So the inner product is $\begin{align*}
\mathbb{E}[&(X_{i}-X_{i-1}) (X_{j}-X_{j-1})]\\
&= \mathbb{E}[\mathbb{E}[(X_{i}-X_{i-1}) (X_{j}-X_{j-1}) ~|~ \mathcal{F}_{i}]] &\left[ \text{towering} \right] \\[0.4em]
&= \mathbb{E}[(X_{i}-X_{i-1}) \cdot \mathbb{E}[X_{j}-X_{j-1} ~|~ \mathcal{F}_{i}]] &\left[ \substack{\text{taking out} \\ \text{what is known}} \right]\\[0.4em]
&= \mathbb{E}[(X_{i}-X_{i-1}) \cdot 0] & [\text{previous result}]\\
&= 0.
\end{align*}$Hence all the inner product terms are $0$.
> [!theorem|*] Martingale Convergence in $\mathcal{L}^2$
> If $X=(X_{n})$ is a martingale and $\forall n,~X_{n} \in \mathcal{L}^{2}$, then $X$ is bounded in $\mathcal{L}^{2}$ if and only if $\sum_{i=1}^{\infty}\mathbb{E}[(X_{i}-X_{i-1})^{2}] < \infty$, and in that case $(X_{n}) \to X_{\infty}~\mathrm{a.s.} \text{ and in }\mathcal{L}^{2}.$
> >[!proof]-
> >The equivalence is immediate from the previous result, and if it holds, monotonicity of $\mathcal{L}^{p}$ norms (on a probability space) guarantees that $X$ is bounded in $\mathcal{L}^{1}$ as well, so Doob's forward convergence theorem gives $(X_{n}) \to X_{\infty}~\mathrm{a.s.}$.
> >
> >Convergence in $\mathcal{L}^{2}$ follows from Fatou: $\begin{align*}
\mathbb{E}[(X_{\infty}-X_{n})^{2}] &= \mathbb{E}[\liminf_{r}(X_{n+r}-X_{n})^{2}]\\
&\le \liminf_{r}\mathbb{E}[(X_{n+r}-X_{n})^{2}] &[\text{Fatou}]\\
&= \liminf_{r} \sum_{i=n+1}^{n+r}\mathbb{E}[(X_{i}-X_{i-1})^{2}]\\
&= \sum_{i=n+1}^{\infty}\mathbb{E}[(X_{i}-X_{i-1})^{2}]\\[0.4em]
&\to 0 \text{ when }n \to \infty. &\left[ \substack{\text{Since the sum} \\ \text{is finite.}} \right]
\end{align*}$
One important consequence: if $(Y_{n})$ is a sequence of *independent* zero-mean RVs, then
- If $\sum_{n}\sigma^{2}_{n}<\infty$ where $\sigma^{2}_{n}:= \mathrm{Var}(Y_{n})=\mathbb{E}[Y_{n}^{2}]$, then the partial sums $S_{n}:=\sum_{i=1}^{n}Y_{i}$ converge to a RV $\mathrm{a.s.}$ (see the sketch below).
- Conversely, if $S_{n}\to S_{\infty}~\mathrm{a.s.}$, and we further assume $| Y_{n} | \le K$ for some constant $K$, then $\sum_{n}\sigma^{2}_{n} < \infty$.
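A final sketch contrasting the two regimes (the symmetric $\pm$ signs and the rates are arbitrary choices): with $\sigma_{i}=1/i$ the variances are summable and the partial sums freeze, while $\sigma_{i}=1/\sqrt{ i }$ gives $\sum_{i}\sigma_{i}^{2}=\infty$ and (consistently with the converse, since here $| Y_{i} | \le 1$) the sums keep wandering:

```python
import numpy as np

rng = np.random.default_rng(0)
paths, n = 5, 100_000
signs = rng.choice([-1.0, 1.0], size=(paths, n))
i = np.arange(1, n + 1)

S_conv = np.cumsum(signs / i, axis=1)          # sum of Var = sum 1/i^2 < oo
S_div = np.cumsum(signs / np.sqrt(i), axis=1)  # sum of Var = sum 1/i  = oo

# Convergent case: late snapshots barely move. Divergent case: they wander.
print(S_conv[:, [999, 9_999, 99_999]].round(3))
print(S_div[:, [999, 9_999, 99_999]].round(3))
```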