> [!definition|*] Brownian Motions
> A stochastic process $(X_{t})$ in continuous time $t \in [0,\infty)$ is a **Brownian motion** if it satisfies:
> - $X_{0}=0$,
> - The sample path $t \mapsto X_{t}$ is continuous $\mathrm{a.s.}$,
> - $(X_{t})$ has Gaussian increments, i.e. for $0\le s <t$, $X_{t}-X_{s} \sim N(0, t-s),$
> - Increments of $(X_{t})$ over disjoint intervals are independent, i.e. for any finite $k$ and intervals $(s_{1},t_{1}),\dots,(s_{k},t_{k})$ with $s_{1}<t_{1}\le s_{2}<t_{2}\le \dots \le s_{k}<t_{k}$, the increments $X_{t_{1}}-X_{s_{1}},\dots,X_{t_{k}}-X_{s_{k}}$ are (jointly) independent.
In particular, for $s \le t$, $\mathrm{Cov}(X_{s},X_{t})=\mathrm{Cov}(X_{s},\,X_{s}+(X_{t}-X_{s}))=\mathrm{Var}(X_{s})=s=\min(s,t)$, since the second leg $X_{t}-X_{s}$ is independent of the first leg $X_{s}$.
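As a quick numerical sanity check of this covariance (not part of the definition; the grid sizes and the times $s=0.3$, $t=0.7$ below are arbitrary choices), one can simulate discretized Brownian paths:

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps = 20_000, 1000
dt = 1.0 / n_steps

# Each row is one path on [0, 1]: cumulative sums of N(0, dt) increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)

# paths[:, k] approximates X at time (k + 1) * dt; pick s = 0.3, t = 0.7.
X_s, X_t = paths[:, 299], paths[:, 699]
cov_est = np.mean(X_s * X_t)   # both legs have mean 0
print(cov_est)                 # should be close to min(0.3, 0.7) = 0.3
```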
## Construction using Wavelets
One way of constructing a Brownian motion is by using [[Orthogonal Wavelets|Haar wavelets]] $\{ \phi_{n} \}_{0}^{\infty}$ to write the process as a weighted sum of independent Gaussian variables $\{ Z_{n} \}_{0}^\infty$.
(Haar) wavelets offer the following ideal properties:
- They form an orthonormal basis of $L^{2}[0,1]$, so any $L^{2}$ function, in particular the indicators $\mathbf{1}_{[0,t]}$ used below, can be expanded in them.
- They are localized to some interval $(k2^{-j}, (k+1)2^{-j})$, so the corresponding $Z_{n}$ will only be turned on over that small interval.
To find the right way of combining them, we start with the following lemma:
> [!lemma|*] When Gaussian Process is Brownian
> If $(X_{t})$ is a Gaussian process that starts at $0$, has $\mathrm{a.s.}$ continuous paths, and has $\mathrm{Cov}(X_{s},X_{t})=\min(s,t)$, then it is a Brownian motion.
Therefore, assuming for now that the resulting sum is a Gaussian process (shown below), we only need a construction that satisfies the covariance requirement.
Now we attempt to write $\min(s,t)$ in the wavelet basis: for $s,t \in [0,1]$, since $\min(s,t)=\int_{0}^{1} \mathbf{1}_{[0,s]}\mathbf{1}_{[0,t]} ~ dx$, decomposing the two indicators gives $\begin{align*}
\min(s,t)&= \int_{0}^{1} \left( \sum_{n}\left< \mathbf{1}_{[0,s]},\phi_{n} \right>\phi_{n} \right)\left( \sum_{n}\left< \mathbf{1}_{[0,t]},\phi_{n} \right>\phi_{n} \right) ~ dx \\
&= \sum_{n}\left< \mathbf{1}_{[0,s]},\phi_{n} \right>\left< \mathbf{1}_{[0,t]},\phi_{n} \right> \\
&= \sum_{n}\left( \int _{0}^{s}\phi_{n}(x) ~ dx \right)\left( \int _{0}^{t} \phi_{n}(x)~ dx \right) .
\end{align*}$where the cross terms vanish since $\int_{0}^{1}\phi_{m}\phi_{n} ~ dx=\mathbf{1}\{ m=n \}$ by orthonormality, and the exchanges of integration and summation are justified by DCT.
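This expansion can be checked numerically with an explicit Haar basis on a dyadic grid (a sketch; the grid size `N`, truncation level `J`, and the pair $s=0.3$, $t=0.7$ are arbitrary choices):

```python
import numpy as np

N = 2**12                        # dyadic grid on [0, 1]
x = (np.arange(N) + 0.5) / N     # midpoint rule for the integrals

def haar(j, k, x):
    """L^2-normalized Haar wavelet psi_{j,k}: +2^{j/2} on the left half
    of [k 2^-j, (k+1) 2^-j], -2^{j/2} on the right half, 0 elsewhere."""
    lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
    return 2**(j / 2) * (((lo <= x) & (x < mid)).astype(float)
                         - ((mid <= x) & (x < hi)).astype(float))

# Basis: constant scaling function phi_0 = 1, then wavelets up to level J.
J = 10
basis = [np.ones(N)] + [haar(j, k, x) for j in range(J) for k in range(2**j)]

s, t = 0.3, 0.7
ind_s, ind_t = (x <= s).astype(float), (x <= t).astype(float)
# sum_n <1_[0,s], phi_n><1_[0,t], phi_n>, inner products as Riemann sums.
total = sum((ind_s @ phi / N) * (ind_t @ phi / N) for phi in basis)
print(total)                     # close to min(0.3, 0.7) = 0.3
```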
Therefore, taking $\{ Z_{n} \}$ to be i.i.d. $N(0,1)$, we are tempted to write $X_{t}\overset{?}{:=}\sum_{n}Z_{n}\int _{0}^{t} \phi_{n}(x)~dx,$so that the covariances are $\begin{align*}
\mathrm{Cov}(X_{t},X_{s})&= \mathbb{E}[X_{t}X_{s}]\\
&\overset{?}{=} \sum_{n}\underbrace{\mathbb{E}[Z_{n}^{2}]}_{=1}\left( \int _{0}^{t}\dots \right)\left( \int _{0}^{s} \dots \right)\\
&= \min(t,s).
\end{align*}$Recall that $Z_{m},Z_{n}$ are independent for $m\ne n$, so the cross terms $\mathbb{E}[Z_{m}Z_{n}]=0$ vanish. The equality marked with $?$ needs to be justified by the uniform convergence of the series defining $X_{t}$.
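Truncating the candidate sum at a finite wavelet level gives a simulable sketch of the construction (the name `schauder`, the grid size, and the level `J` are illustrative choices, not from the text): the integrals $\int_0^t \phi_{n}$ of the Haar wavelets are triangular "tent" functions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, J = 1024, 10                  # grid points on [0, 1], truncation level
t = np.arange(N + 1) / N

def schauder(j, k, t):
    """Integral of the Haar wavelet psi_{j,k} from 0 to t: a triangular
    tent of height 2^(-j/2 - 1) peaking at (k + 0.5) / 2^j."""
    lo, mid, hi = k / 2**j, (k + 0.5) / 2**j, (k + 1) / 2**j
    up = np.clip(t, lo, mid) - lo        # rising part, slope +2^{j/2}
    down = np.clip(t, mid, hi) - mid     # falling part, slope -2^{j/2}
    return 2**(j / 2) * (up - down)

# X_t = Z_0 * t (scaling-function term) + sum_{j,k} Z_{j,k} * tent_{j,k}(t).
X = rng.normal() * t
for j in range(J):
    for k in range(2**j):
        X += rng.normal() * schauder(j, k, t)

print(X[0], X[-1])   # X_0 = 0 exactly; all tents vanish at t = 1
```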
### Justification of Uniform Convergence
To have uniform convergence, we need to bound the behavior of the Gaussian variables $\{ Z_{n} \}_{1}^{\infty}$ with the following lemma:
> [!lemma|*] Gaussian Variables are $O(\sqrt{ \log n })$
> If $\{ Z_{n} \}_{1}^{\infty}$ is a sequence of independent $N(0,1)$ RVs, then their magnitudes $\{ |Z_{n}| \}$ are $O(\sqrt{ \log n })$, i.e. given $\omega$, we define $C:=\sup_{n \ge 2} \frac{|Z_{n}|}{\sqrt{ \log n }},$and we have $C<\infty~\mathrm{a.s.}$.
>
> > [!proof]-
> > For each $Z_{n}$, the probability that its magnitude exceeds $z>1$ is $\begin{align}
> > \mathbb{P}[|Z_{n}|\ge z]&= \sqrt{ \frac{2}{\pi} }\int _{z}^{\infty}e^{-x^{2} / 2} ~dx \\
> > &\le \sqrt{ \frac{2}{\pi} }\int _{z}^{\infty}xe^{-x^{2} / 2}~dx \\
> > &= e^{-z^{2} / 2} \sqrt{ \frac{2}{\pi} },
> > \end{align}$where the inequality uses $x \ge z > 1$. Therefore, by choosing $z=2 \sqrt{ \alpha\log n }$ for any $\alpha > 1$, we have $\mathbb{P}[|Z_{n}| \ge z]\le n^ {-2\alpha}\sqrt{ \frac{2}{\pi} }.$
> > Since these probabilities have a finite sum (as $2\alpha > 1$), [[Measure Theory#^db0190|BC2]] guarantees that the events $\{ |Z_{n}| \ge 2\sqrt{ \alpha\log n } \}$ happen for only finitely many $n$, $\mathrm{a.s.}$. Therefore, the supremum $C$ is $\mathrm{a.s.}$ finite.
>
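As an informal numerical illustration of the lemma (the sample size is an arbitrary choice): over a million i.i.d. draws, the ratio $|Z_{n}| / \sqrt{ \log n }$ stays modest.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
Z = rng.normal(size=n)           # Z_1, ..., Z_n, i.i.d. N(0, 1)

idx = np.arange(2, n + 1)        # start at n = 2 so that log n > 0
ratios = np.abs(Z[1:]) / np.sqrt(np.log(idx))
C_emp = ratios.max()
print(C_emp)                     # finite; typically a small single-digit value
```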
Now we can show that $X_{t}$ defined above converges uniformly:
> [!proof]
> Note that at a particular $t$, only one of the $2^{j}$ wavelets of width $2^{-j}$ is "active" (i.e. has $t$ in its support), for each $j=0,1,\dots$. Let them be indexed as $\phi_{n_{0}},\phi_{n_{1}},\dots$.
>
> Therefore, at a fixed $t$, the series reduces to $X_{t}= \sum_{j}Z_{n_{j}}\int_{0}^{t}\phi_{n_{j}}(x)~dx.$Its tail sums (i.e. $X_{t}$ minus the partial sums) are $\sum_{j=J}^{\infty}Z_{n_{j}}\int_{0}^{t}\phi_{n_{j}}(x)~dx,$with magnitude bounded by $\begin{align}
> &\le\sum_{j\ge J} |Z_{n_{j}}|\left| \int_{0}^{t} \phi_{n_{j}}(x) ~dx \right| \\
> &\le C \sum_{j \ge J} 2^{-j / 2-1} \sqrt{ \log n_{j} }\\
> &\le C \sum_{j \ge J} 2^{-j / 2 - 1}\sqrt{ j+1 } \\
> & \to 0 \text{ as } J \to \infty,
> \end{align}$where the second inequality uses $|Z_{n_{j}}|\le C\sqrt{ \log n_{j} }$ from the lemma together with $\left| \int_{0}^{t}\phi_{n_{j}} \right| \le 2^{j / 2}\cdot 2^{-j-1}=2^{-j / 2-1}$ (as $\phi_{n_{j}}$ has magnitude $2^{j / 2}$ on a support of width $2^{-j}$), and the third uses $n_{j}\le 2^{j+1}$, so $\sqrt{ \log n_{j} }\le \sqrt{ (j+1)\log 2 }\le \sqrt{ j+1 }$. The last sum vanishes as $J \to \infty$ (e.g. since $\sqrt{ j+1 }\le 2\cdot 2^{j / 4}$, its terms are at most $2^{-j / 4}$), and the bound is uniform (independent of $t$), so the convergence of the series is uniform.
### Showing that the Sum is a Gaussian Process