A **Poisson process** is a stochastic process $(N_{t})_{t \ge 0}$, where $N_{t}$ counts the number of events that have occurred up to time $t$.
- The $k$th **arrival time** is $T_{k}=\inf \{ t\ge 0:N_{t}\ge k \}$, and the $k$th **interarrival time** is $Y_{k}=T_{k}-T_{k-1}$ (with the convention $T_{0}=0$).
- The **rate** of the process is a constant $\lambda>0$.
Poisson processes have two equivalent definitions. Each is useful for different problems.
- *Exponential interarrival times*: a counting process is Poisson with rate $\lambda$ if its interarrival times are iid $\text{Exp}(\lambda)$.
- *Poisson increments*: a counting process $N_{t}$ is Poisson with rate $\lambda$ if: its increments over disjoint time intervals are independent, and over any interval $[t,t+d)$ its increment is $(N_{t+d}-N_{t}) \sim\text{Poisson}(\lambda d)$.
It is a special case of [[Continuous Time Markov Chains]] where the only possible transition from any state $k$ is to $k+1$ (at rate $\lambda$).
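The equivalence can be checked empirically. Below is a minimal simulation sketch (not from the notes; it assumes `numpy`, and `arrival_times` is an illustrative helper): paths are generated from iid exponential interarrival times, and the increment $N_{t+d}-N_{t}$ is compared against $\text{Poisson}(\lambda d)$, whose mean and variance are both $\lambda d$.
```python
import numpy as np

rng = np.random.default_rng(0)

def arrival_times(lam: float, t_max: float, rng) -> np.ndarray:
    """Arrival times on [0, t_max], built from iid Exp(lam) interarrival times."""
    times = []
    t = rng.exponential(1 / lam)
    while t <= t_max:
        times.append(t)
        t += rng.exponential(1 / lam)
    return np.array(times)

lam, t, d = 2.0, 5.0, 1.5
# Sample the increment N_{t+d} - N_t over many independent paths.
increments = np.array(
    [np.sum(arrival_times(lam, t + d, rng) > t) for _ in range(20_000)]
)
print(increments.mean(), increments.var())  # both should be close to lam * d = 3.0
```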
### Properties
> [!theorem|*] Uniform Jump Times upon Conditioning
> Conditioned on the event $\{ N_{t} = n \}$, the first $n$ arrival times $T_{1},\dots,T_{n}$ are jointly distributed as the order statistics of $U_{1},\dots,U_{n} \overset{\mathrm{iid.}}{\sim} U[0,t]$.
> > [!proof]-
> > The first $n+1$ interarrival times $Z_{1},\dots,Z_{n+1} \overset{\mathrm{iid.}}{\sim} \mathrm{Exp}(\lambda)$ have joint density $f_{\mathbf{Z}}(\mathbf{z})=\lambda^{n+1}\exp\left( -\lambda \sum\nolimits_{i}z_{i} \right)\cdot \mathbf{1}_{\forall i,~z_{i} \ge 0 },$so a change of variables to $T_{1},\dots,T_{n+1}$ (with Jacobian $1$) gives the joint density $f_{\mathbf{T}}(\mathbf{t})=\lambda^{n+1} \exp(-\lambda t_{n+1}) \cdot \mathbf{1}_{0 \le t_{1} \le \cdots \le t_{n+1}}.$Now conditioning on the event $\{ N_{t}=n \}$, any region $A \subseteq [0,t]^{n}$ satisfies $\begin{align*}
\mathbb{P}&[(T_{1},\dots,T_{n}) \in A ~|~ N_{t}=n]\\[0.8em]
&= \frac{\mathbb{P}[(T_{1},\dots,T_{n}) \in A,~ T_{n+1}>t]}{\mathbb{P}[N_{t}=n]}\\
&= e^{\lambda t} (\lambda t)^{-n} n! \cdot \int _{A}\lambda^{n+1} \mathbf{1}_{0 \le t_{1} \le \cdots \le t_{n}} ~ d\mathbf{t}_{-(n+1)} \int_{t}^\infty \exp(-\lambda t_{n+1}) ~ dt_{n+1} \\
&= \int _{A} t^{-n} n! \cdot\mathbf{1}_{0 \le t_{1} \le \cdots \le t_{n}} ~d\mathbf{t}_{-(n+1)},
\end{align*}$which is precisely the joint density of the order statistics of $n$ iid $U[0,t]$ samples.
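A quick empirical illustration of this theorem (a sketch under the same assumptions as the earlier code, plus `scipy` for the test): pool the arrival times of paths with exactly $N_{t}=n$ arrivals and compare them to $U[0,t]$.
```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam, t, n = 1.0, 10.0, 8

pooled = []
for _ in range(50_000):
    # Arrival times on [0, t] from cumulative Exp(lam) interarrivals.
    times = np.cumsum(rng.exponential(1 / lam, size=4 * int(lam * t) + 50))
    arrivals = times[times <= t]
    if len(arrivals) == n:  # condition on the event {N_t = n}
        pooled.extend(arrivals)

# Rescale to [0, 1]; a large p-value is consistent with uniformity.
print(stats.kstest(np.array(pooled) / t, "uniform"))
```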
> [!theorem|*] Superposition of Poisson Processes
> Given two independent Poisson counting processes $X_{t}$ and $Y_{t}$ with rates $\lambda$ and $\mu$, their sum $Z_{t}=X_{t}+Y_{t}$, called their **superposition**, is also a Poisson process with rate $(\lambda+\mu)$.
> - That is, *the sum of independent Poisson processes is still Poisson.*
>
> > [!proof]-
> > Verify that $Z_{t}$ satisfies the Poisson increment definition: its increments over disjoint intervals are independent (inherited from $X$ and $Y$), and each increment is a sum of independent Poisson variables, as computed below.
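> > For completeness, the increment distribution follows from the convolution of two independent Poissons (a quick check using the binomial theorem): $\mathbb{P}[Z_{t+d}-Z_{t}=n]=\sum_{k=0}^{n} e^{-\lambda d}\frac{(\lambda d)^{k}}{k!}\cdot e^{-\mu d}\frac{(\mu d)^{n-k}}{(n-k)!}=\frac{e^{-(\lambda+\mu)d}}{n!}\sum_{k=0}^{n}{n\choose k}(\lambda d)^{k}(\mu d)^{n-k}=e^{-(\lambda+\mu)d}\frac{((\lambda+\mu)d)^{n}}{n!},$which is exactly the $\text{Poisson}((\lambda+\mu)d)$ mass function.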
> [!theorem|*] Thinning of Poisson Processes
> Given a Poisson process $(X_{t})$ with rate $\lambda$, if each occurrence is counted by $(Y_{t})$ with probability $p$ (independently of everything else) and by $(Z_{t})$ otherwise, then the **thinned processes** $Y,Z$ are still Poisson processes, with rates $\lambda p$ and $\lambda(1-p)$ respectively. Moreover, $(Y_{t}),(Z_{t})$ are independent.
> - *That is, a randomly sampled Poisson process is still Poisson.*
>
> > [!proof]- Proof with exponential interarrival time
> > *The lecture notes have a proof using the Poisson increment definition, while the following is an alternative using the exponential interarrival time definition.*
> >
> > After the thinning, each interarrival time of $(Y_{t})$ is $\tilde Y=\sum_{k={1}}^{N} Y_{k},$where $Y_{k}$ are the interarrival times of $(X_{t})$ and $N\sim \text{Geo}(p)$ counts the original arrivals up to and including the first kept one.
> > Then $M_{\tilde{Y}}(t)=G_{N}\big(M_{Y_{k}}(t)\big)$ as shown in [[Generating Functions#^247049|this theorem]]. We can then verify that $\tilde Y$ has the same mgf as $\text{Exp}(p\lambda)$, as computed below.
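> > Concretely (a quick check, using the standard forms $M_{Y_{k}}(t)=\frac{\lambda}{\lambda-t}$ for $t<\lambda$ and $G_{N}(s)=\frac{ps}{1-(1-p)s}$): $M_{\tilde{Y}}(t)=\frac{p\cdot\frac{\lambda}{\lambda-t}}{1-(1-p)\frac{\lambda}{\lambda-t}}=\frac{p\lambda}{(\lambda-t)-(1-p)\lambda}=\frac{p\lambda}{p\lambda-t},$which is indeed the mgf of $\text{Exp}(p\lambda)$.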
>
> > [!proof]- Proof with infinitesimal definition
> > Obviously the thinned process $(Y_{t})$ is still a right-continuous counting process with independent increments, so it suffices to confirm that $\begin{align*}
> \mathbb{P}[Y_{t+h}-Y_{t}=0]&\overset{?}{=} 1-p \lambda h + o(h);\\
> \mathbb{P}[Y_{t+h}-Y_{t}=1] &\overset{?}{=} p \lambda h + o(h).
> \end{align*}$Now using infinitesimal definition on $(X_{t})$, $\begin{align*}
> \mathbb{P}[Y_{t+h}-Y_{t}=0]=~ &\mathbb{P}[X_{t+h}-X_{t}=0]\\
> &+\mathbb{P}[X_{t+h}-X_{t}=1 \text{ AND was discarded}]\\
> &+ \mathbb{P}[X_{t+h}-X_{t} \ge 2 \text{ AND all discarded}]\\[0.4em]
> \to ~& (1-\lambda h + o(h)) + (1-p)(\lambda h +o(h)) +O(h^{2})\\
> =~&1-p \lambda h + o(h).
> \end{align*}$
> Similarly, the probability of $Y_{t+h}-Y_{t}=1$ is $\begin{align*}
> \mathbb{P}[Y_{t+h}-Y_{t}=1]= ~&\mathbb{P}[X_{t+h}-X_{t}=1\text{ AND not discarded}] \\
> &+ \mathbb{P}[X_{t+h}-X_{t}=k \ge2 \text{ AND }(k-1) \text{ discarded}]\\[0.4em]
> \to ~& p(\lambda h + o(h)) + O(h^{2})\\
> =~& p \lambda h + o(h).
> \end{align*}$
>
> > [!proof]- Proof of independence (assuming Poisson)
> > The joint mass function of $Y_{t},Z_{t}$ is $\begin{align*}
\mathbb{P}&[Y_{t}=y,~ Z_{t}=z]\\
&= \mathbb{P}[Y_{t}=y,~ X_{t}=Y_{t}+Z_{t}=y+z]\\
&= \mathbb{P}[Y_{t}=y ~|~ X_{t}=y+z] \cdot \mathbb{P}[X_{t}=y+z]\\
&= {y+z\choose y} p^{y}(1-p)^{z} \cdot \frac{e^{-\lambda t}(\lambda t)^{y+z}}{(y+z)!}\\
&= e^{-p \lambda t}\frac{(p \lambda t)^{y}}{y!} \cdot e^{-(1-p)\lambda t}\frac{((1-p)\lambda t)^{z}}{z!},
\end{align*}$which factorises into the $\text{Poisson}(p\lambda t)$ and $\text{Poisson}((1-p)\lambda t)$ mass functions, so $Y_{t},Z_{t}$ are independent.
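As a final sanity check, a minimal thinning simulation (a sketch under the same `numpy` assumptions as the earlier code): the kept and discarded counts should have means $p\lambda t$ and $(1-p)\lambda t$, with near-zero correlation consistent with independence.
```python
import numpy as np

rng = np.random.default_rng(2)
lam, p, t = 3.0, 0.4, 5.0

ys, zs = [], []
for _ in range(20_000):
    # One path on [0, t]: cumulative sums of Exp(lam) interarrivals.
    times = np.cumsum(rng.exponential(1 / lam, size=int(8 * lam * t) + 50))
    arrivals = times[times <= t]
    keep = rng.random(len(arrivals)) < p  # independent coin flip per arrival
    ys.append(np.sum(keep))               # Y_t, the kept count
    zs.append(len(arrivals) - np.sum(keep))  # Z_t, the discarded count

ys, zs = np.array(ys), np.array(zs)
print(ys.mean(), zs.mean())       # ≈ p*lam*t = 6.0 and (1-p)*lam*t = 9.0
print(np.corrcoef(ys, zs)[0, 1])  # ≈ 0, consistent with independence
```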
### Examples in the Notes
- Example 7.3 (Geiger counter): basic application of the Poisson increment definition.
- Example 7.8 (call center): longer applications of both definitions.
- Example 7.9 (genetics): basic application of the Poisson increment definition with a huge chunk of context.