### Memorylessness
> [!theorem|*] Memorylessness Property
> The exponential distribution is **memoryless**: if $Z \sim \mathrm{Exp}(\lambda)$ and $l \ge 0$, then $Z-l ~|~Z > l$ is also $\mathrm{Exp}(\lambda)$.
>
> > [!proof]-
> > For $z > 0$, the distribution of $Z-l ~|~ Z>l$ is given by $\begin{align*}
\mathbb{P}[Z -l > z ~|~ Z > l] &= \frac{\mathbb{P}[Z>z+l]}{\mathbb{P}[Z >l]} = \frac{\exp(-\lambda (z+l))}{\exp(-\lambda l)}=\exp(-\lambda z).
\end{align*}$
> > Therefore $Z-l ~|~ Z>l$ is $\mathrm{Exp}(\lambda)$.
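As a quick numerical sanity check (not part of the formal content), the following Python sketch estimates $\mathbb{P}[Z-l > z ~|~ Z>l]$ by simulation and compares it to the memoryless prediction $\exp(-\lambda z)$; the rate and thresholds are arbitrary illustrative choices.

```python
import math
import random

random.seed(0)

lam, l, z = 1.5, 0.7, 0.4      # rate and thresholds: arbitrary illustrative values
n = 200_000

# Sample Z ~ Exp(lam) and keep only the draws with Z > l.
samples = [random.expovariate(lam) for _ in range(n)]
survivors = [s for s in samples if s > l]

# Memorylessness predicts P[Z - l > z | Z > l] = exp(-lam * z).
empirical = sum(s - l > z for s in survivors) / len(survivors)
predicted = math.exp(-lam * z)
print(f"empirical {empirical:.3f}  vs  predicted {predicted:.3f}")
```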

> [!theorem|*] Extended Memorylessness Property
> If $L \ge 0$ is a random variable with density $f_{L}$, independent of $Z$, then the same holds: $Z -L ~|~ Z>L \sim \mathrm{Exp}(\lambda)$.
>
> > [!proof]-
> > Fix $z > 0$. Integrating over the density $f_{L}$ of $L$, $\begin{align*}
\mathbb{P}[Z-L > z ~|~ Z -L > 0]&= \frac{\int _{0}^{\infty}\int _{l+z}^{\infty} f_{L}(l)\, \lambda \exp(-\lambda s) ~ ds ~dl }{\int _{0}^{\infty}\int _{l}^{\infty} f_{L}(l)\, \lambda \exp(-\lambda s) ~ ds ~dl}\\
&= \frac{\int _{0}^{\infty} f_{L}(l) \exp(-\lambda(l+z)) ~ dl }{\int _{0}^{\infty} f_{L}(l)\exp(-\lambda l) ~ dl }\\
&= \exp(-\lambda z).
\end{align*}$
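The extended property can be checked the same way, now drawing the threshold $L$ from an independent exponential; the choice of distribution for $L$ is arbitrary, since the theorem only requires independence.

```python
import math
import random

random.seed(1)

lam, mu, z = 2.0, 1.0, 0.5     # rates for Z and L, and a tail point; arbitrary choices
n = 300_000

# Independent draws of (Z, L); L ~ Exp(mu) is just one convenient choice.
pairs = [(random.expovariate(lam), random.expovariate(mu)) for _ in range(n)]

# Condition on {Z > L} and look at the overshoot Z - L.
overshoots = [zv - lv for zv, lv in pairs if zv > lv]

# The extended memoryless property predicts P[Z - L > z | Z > L] = exp(-lam * z).
empirical = sum(o > z for o in overshoots) / len(overshoots)
predicted = math.exp(-lam * z)
print(f"empirical {empirical:.3f}  vs  predicted {predicted:.3f}")
```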
### Finiteness of Sums
> [!theorem|*] Finiteness of Sum of Exponentials
> If $Z_{1},\dots$ are independent exponentials with rates $\lambda_{1},\dots$, then $\begin{align*}
&\left( \sum_{i=1}^{\infty}Z_{i} < \infty ~\mathrm{a.s.} \right)\iff \left( \sum_{i=1}^{\infty} \mathbb{E}[Z_{i}]=\sum_{i=1}^{\infty} \frac{1}{\lambda_{i}} < \infty \right);\\[0.4em]
&\left( \sum_{i=1}^{\infty}Z_{i} = \infty ~\mathrm{a.s.} \right)\iff \left( \sum_{i=1}^{\infty} \mathbb{E}[Z_{i}]=\sum_{i=1}^{\infty} \frac{1}{\lambda_{i}} = \infty \right).
\end{align*}$That is, *their sum is $\mathrm{a.s.}$ finite if and only if the sum of their means is finite; otherwise it is $\mathrm{a.s.}$ infinite, and there is no middle ground.*

> [!proof]
> For the *finite* case, MCT gives $\mathbb{E}\left[ \sum_{i}Z_{i} \right]=\sum_{i}\mathbb{E}[Z_{i}]$, so both are finite. But a random variable ($\sum_{i}Z_{i}$) must be $\mathrm{a.s.}$ finite in order to have a finite mean.
>
> For the *infinite* case, consider $\mathbb{E}\left[ \exp\left( -\sum_{i}Z_{i} \right) \right]$, which is $0$ if and only if the sum is $\mathrm{a.s.}$ infinite. Recall that the exponential distribution (of rate $\lambda$) has moment generating function $s \mapsto \lambda / (\lambda-s)$, so by independence, taking $s=-1$ gives $\mathbb{E}\left[ \exp\left( -\sum_{i}Z_{i} \right) \right]=\prod_{i} \frac{\lambda_{i}}{\lambda_{i}+1}=\prod_{i}\left( 1+ \frac{1}{\lambda_{i}} \right)^{-1}.$Now $\mathrm{RHS}=0$ because its inverse is $\prod_{i}(1+ 1 / \lambda_{i}) \ge \sum_{i} 1 / \lambda_{i}=\infty$.
>
> The inequality follows by expanding the product (over partial sums if rigor is needed): every term $1 / \lambda_{i}$ of the sum appears in the expansion, and all remaining expansion terms are non-negative.
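The key identity in the proof, $\mathbb{E}\left[ \exp\left( -\sum_{i}Z_{i} \right) \right]=\prod_{i} \lambda_{i} / (\lambda_{i}+1)$, can be verified numerically for a finite family of rates; the rates below are arbitrary illustrative values.

```python
import math
import random

random.seed(2)

rates = [1.0, 2.0, 3.0, 4.0]   # finitely many rates; arbitrary illustrative values
n = 200_000

# Monte Carlo estimate of E[exp(-(Z_1 + ... + Z_m))] for independent Z_i ~ Exp(rate_i).
estimate = sum(
    math.exp(-sum(random.expovariate(r) for r in rates))
    for _ in range(n)
) / n

# Closed form from the MGF at s = -1: prod_i rate_i / (rate_i + 1).
closed_form = math.prod(r / (r + 1) for r in rates)
print(f"estimate {estimate:.4f}  vs  closed form {closed_form:.4f}")
```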
### Competing Exponentials
> [!theorem|*] Competing Exponentials
> Suppose $I=\{ 1,\dots \}$ is an index set (finite or countably infinite), and $(Z_{i})_{i \in I}$ are independent exponentials of rates $(\lambda_{i})_{i \in I}$. For intuition, consider an array of alarm clocks, where the $i$th clock goes off at time $Z_{i}$.
>
> Let $M:= \inf_{i \in I}Z_{i}$, i.e. the first time a clock rings, and let $K$ be the index of the winning clock. Then:
> - As a whole, the clocks ring at time $M \sim \mathrm{Exp}\left( \sum_{i}\lambda_{i} \right)$.
> - There is $\mathrm{a.s.}$ a winner: $\exists k:M=Z_{k}$ holds $\mathrm{a.s.}$, and the winner is $\mathrm{a.s.}$ unique.
> - Clock $j$ wins with probability $\mathbb{P}[K=j]=\lambda_{j} / \sum_{i}\lambda_{i}$.
>
> Moreover, the match essentially restarts after the first ring: conditioning on $\{ K=k\}$,
> - The other clocks ($j \ne k$) each need another $Z_{j}-M ~|~ K=k \sim \mathrm{Exp}(\lambda_{j})$ amount of time to ring.
> - Those residual times $\{ Z_{j}-M ~:~j \ne k \}$ are independent after conditioning.
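The clock picture lends itself directly to simulation. The sketch below (with arbitrary rates) checks both the winning probabilities $\lambda_{j} / \sum_{i}\lambda_{i}$ and the mean of the minimum, which should be $1 / \sum_{i}\lambda_{i}$ since $M \sim \mathrm{Exp}\left( \sum_{i}\lambda_{i} \right)$.

```python
import random
from collections import Counter

random.seed(3)

rates = [0.5, 1.5, 3.0]        # clock rates; arbitrary illustrative values
total = sum(rates)
n = 200_000

wins = Counter()
min_times = []
for _ in range(n):
    # Each clock i rings at an independent Exp(rates[i]) time.
    times = [random.expovariate(r) for r in rates]
    k = min(range(len(rates)), key=times.__getitem__)  # index of the winner
    wins[k] += 1
    min_times.append(times[k])

# Predicted: P[K = j] = rates[j] / total, and M ~ Exp(total) so E[M] = 1 / total.
win_probs = [wins[j] / n for j in range(len(rates))]
mean_min = sum(min_times) / n
print("win probs:", [f"{p:.3f}" for p in win_probs], " mean of M:", f"{mean_min:.3f}")
```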