> [!note] Context
> Construction and interpretations of a **continuous time Markov chain (CTMC)** can be found in [[Continuous Time Markov Chains|this note]]. This page assumes the Markov chain $(X_{t})_{t \ge 0}$ is minimal.
>
> The analogs from discrete time Markov chains (DTMC) are introduced in [[Discrete Time Markov Chains|this note]].

## Class Structures

In the context of CTMC, if $i,j$ are two of its states, **$i$ leads to $j$** (denoted $i \to j$) if $\mathbb{P}[X_{t}=j \text{ for some }t>0 \,|\, X_{0}=i]>0,$ and $i,j$ **communicate** (denoted $i \leftrightarrow j$) if they lead to each other.

- Other notions like communicating classes, absorbing states, etc., follow naturally.
- Class structures are naturally inherited from DTMCs: in particular, *the classes of a CTMC $(X_{t})$ equal those of its jump chain $(Y_{n})$.*

On top of that, the notion of communication can also be expressed in terms of the rate matrix $Q$:

> [!theorem|*] Communication via rate matrices
> For a CTMC with rate matrix $Q$, if $q_{ij}>0$, then $p_{ij}(t)>0$ for all $t>0$.
>
> > [!proof]-
> > Let $X_{0}=i$, so $p_{ij}(t)=\mathbb{P}_{i}[X_{t}=j]$, and use the jump chain - holding time definition of CTMC:
> > $p_{ij}(t)\ge \mathbb{P}_{i}[Z_{1}<t,Y_{1}=j,Z_{2}>t] = (1-e^{-q_{i}t})\pi_{ij}\cdot e^{-q_{j}t}.$
> > The right-hand side is $>0$ if $q_{ij}>0$, since then $q_{i}>0$ and $\pi_{ij}>0$.

However, *any two communicating states $i \leftrightarrow j$ can reach each other over any period of time*. More precisely, the following are equivalent:

- $(\mathrm{I})$: $i \to j$ in the jump chain $(Y_{n})$.
- $(\mathrm{II})$: $p_{ij}(t)>0$ for some $t>0$.
- $(\mathrm{III})$: $p_{ij}(t)>0$ for all $t > 0$.
- $(\mathrm{IV})$: There is a path $i_{0},i_{1},\dots,i_{n}$ where $i_{0}=i,i_{n}=j$ such that $q_{i_{0},i_{1}}\cdot q_{i_{1},i_{2}}\cdots q_{i_{n-1},i_{n}} > 0$.

One important consequence is that *periodicity is impossible in CTMC*.

> [!proof]-
> $(\mathrm{III})$ implies $(\mathrm{II})$ trivially, and $(\mathrm{II})$ implies $(\mathrm{I})$ since any positive-probability path of the original CTMC from $i$ to $j$ passes through a jump-chain path from $i$ to $j$.
> $(\mathrm{IV}) \iff (\mathrm{I})$ since $q_{ij}>0 \iff \pi_{ij}>0$, so an analogous path exists for the jump chain.
> $(\mathrm{IV})\Longrightarrow (\mathrm{III})$ since for any $t>0$, $p_{ij}(t)\ge p_{i_{0},i_{1}}(t / n) \cdot p_{i_{1},i_{2}}(t / n)\cdots p_{i_{n-1},i_{n}}(t / n) >0,$ where every term in the product is nonzero by the rate-matrix characterization of communication ($q_{ij}>0 \Rightarrow p_{ij}(t)>0$ for all $t>0$).

## Hitting Times and Probabilities

Similar to DTMC, the **hitting time** in CTMC of a set $A \subset \mathbb{S}$ is given by $D^{A}:= \inf \{ t:X_{t} \in A \}.$ Moreover, if $H^{A} \in \mathbb{N}$ is the hitting time of $A$ by the jump chain $(Y_{n})$, then $\begin{gather} D^{A} = T_{H^{A}}, \\ \{ H^{A} < \infty \} = \{ D^{A} < \infty \}, \end{gather}$ where $T_{n}=\sum_{i=1}^{n}Z_{i}$ is the $n$th jump time of $(X_{t})$, and the equality of events holds since a finite sum of exponentials is $\mathrm{a.s.}$ finite.

The **expected hitting times** are $k^{A}_{i}:= \mathbb{E}[D^{A} \,|\, X_{0}=i]$. They satisfy the following equations:

> [!theorem|*] Expected Hitting Time Equations
> The expected hitting times $k^{A}_{i}, i \in \mathbb{S}$ of a CTMC with jump matrix $\Pi$ satisfy $k_{i}^{A}= \begin{cases} 0 &\text{if }i \in A, \\[0.4em] {q_{i}}^{-1} + \sum_{j \ne i} \pi_{ij} k^{A}_{j} &\text{otherwise}. \end{cases}$ In other words, if $i \not \in A$, then the expected hitting time is the expected amount of time before the next jump, i.e. $q_{i}^{-1}$, plus the weighted average of the expected hitting times after its next jump.
>
> Written in terms of the generator $Q$, $\begin{cases} k_{i}^{A}=0 &\text{if }i \in A, \\[0.4em] \sum_{j \in \mathbb{S}} q_{ij} k^{A}_{j}=-1 &\text{otherwise}. \end{cases}$
> Moreover, $(k_{i}^{A})$ is the minimal non-negative solution to the system.
>
> > [!proof]- Proof for the cases $i \not \in A$
> > Formulation with $\Pi$: condition on the first jump, then use the Markov property.
> > Formulation with $Q$ only: multiply both sides by $q_{i}=-q_{ii}$ and rearrange.

The **hitting probability** of a set $A$ from starting state $i$ is defined to be $h_{i}^{A}:= \mathbb{P}[H^{A} < \infty \,|\, X_{0}=i]$ (equivalently, with $D^{A}$ in place of $H^{A}$).

> [!theorem|*] Hitting Probability Equations
> The hitting probabilities $h_{i}^{A},i \in \mathbb{S}$ of a CTMC with jump matrix $\Pi$ satisfy $h^{A}_{i}=\begin{cases} 1 &\text{if }i \in A, \\[0.4em] \sum_{j \ne i} \pi_{ij} h_{j}^{A} &\text{if }i \not \in A. \end{cases}$ Equivalently, with generator $Q$, multiply through by $q_{i}$ (assuming $i$ is not absorbing, so $q_{i}\ne0$): $\begin{cases} h_{i}^{A}= 1 &\text{if }i \in A, \\[0.4em] \sum_{j \in \mathbb{S}} q_{ij}h_{j}^{A}=0 &\text{otherwise}. \end{cases}$ Moreover, $(h_{i}^{A})$ is the minimal non-negative solution to the system.

## Recurrence and Transience

Similar to DTMC, recurrence and transience of a CTMC are defined via the set of times spent in a state $i$: $\{ t \,|\, X_{t}=i \}.$ A state $i$ is:

- **Transient** if $\mathbb{P}[\{ t \,|\, X_{t}=i \} \text{ is unbounded}]=0$,
- **Recurrent** if $\mathbb{P}[\{ t \,|\, X_{t}=i \} \text{ is unbounded}]=1$; in particular, a (minimal) CTMC cannot have recurrent states if it explodes.

*Recurrence/transience in the CTMC $(X_{t})$ is equivalent to that in its jump chain $(Y_{n})$*, and as a result all properties follow from DTMC:

- A state is either recurrent or transient.
- *Recurrence and transience are class properties.*

Similar to DTMC, recurrence/transience in CTMC can be characterized by **return times**: under $X_{0}=i$, $R_{i}:= \inf \{ t \ge T_{1} \,:\,X_{t}=i \},$ that is, the first time $t$ after the first jump $T_{1}$ leaving $i$ where $X_{t}=i$ again.

> [!theorem|*] Recurrence and Transience with Return Times
> For a state $i \in \mathbb{S}$, the following trichotomy holds:
> - If $q_{i}=0$, then $i$ is absorbing, hence recurrent.
> - If $q_{i} > 0$ and $\mathbb{P}[R_{i} < \infty]=1$, then $i$ is recurrent, and $\int _{0}^{\infty}p_{ii}(t) \, dt = \infty$.
> - If $q_{i} > 0$ and $\mathbb{P}[R_{i}< \infty]<1$, then $i$ is transient, and $\int _{0}^{\infty}p_{ii}(t) \, dt < \infty$.
>
> > [!proof]-
> > Note that if $(S_{i}, i \in \mathbb{S})$ are the return times of the jump chain $(Y_{n})$, then $S_{i}<\infty \iff R_{i} < \infty,$ so CTMC transience/recurrence follows from $\mathbb{P}[R_{i}<\infty]=\mathbb{P}[S_{i}<\infty]$ determining DTMC transience/recurrence of the jump chain.
> >
> > For the integrals, we prove the following identity: $\int _{0}^{\infty}p_{ii}(t) \, dt=\frac{1}{q_{i}} \sum_{n=0}^{\infty}\pi_{ii}^{(n)}, $ where $\pi_{ii}^{(n)}$ is the jump chain's $n$-step transition probability $i \to i$. This is true by exchanging expectations and integrations (Fubini's theorem), noting that $Z_{n+1}$ is the holding time at $Y_{n}$: $\begin{align*} \int _{0}^{\infty}p_{ii}(t) \, dt&= \mathbb{E}\left[ \int \mathbf{1}\{X_{t}=i\} \, dt \,|\,X_{0}=i \right]\\[0.4em] &= \mathbb{E}\left[ \sum_{n=0}^{\infty} Z_{n+1}\mathbf{1}\{Y_{n}=i\} \,|\,X_{0}=i \right]\\[0.4em] &= \sum_{n=0}^{\infty} \mathbb{E}[Z_{n+1} \,|\, Y_{n}=i] \cdot \mathbb{P}[Y_{n}=i \,|\,Y_{0}=i]\\ &= \sum_{n=0}^{\infty} \frac{1}{q_{i}} \pi_{ii}^{(n)}. \end{align*} $
> > Therefore, recurrence/transience of $i$ (in the jump chain) makes the $\mathrm{RHS}$ infinite/finite, and the result follows.

> [!warning]- Positive recurrence does not translate between a CTMC and its jump chain
> Consider $(X_{t})$ with jump chain $(Y_{n})$ that is a random walk on $\mathbb{N}$ (reflected at $0$) with equal probability of going left or right. Then the jump chain is null recurrent.
> However, if the transition rates of $(X_{t})$ are fast enough, it can be positive recurrent: if for example $q_{i}=2^{i}$, detailed balance gives the invariant distribution $\eta=\left( \frac{1}{2}, \frac{1}{4}, \frac{1}{8},\dots \right).$ Since $(X_{t})$ is non-explosive (otherwise contradicting recurrence of $(Y_{n})$), existence of $\eta$ means that $(X_{t})$ is positive recurrent.
>
> Conversely, if $(Y_{n})$ has probability of going up equal to $\frac{\lambda}{\lambda+\mu}$, $\lambda < \mu$, then it is positive recurrent. However, choosing $q_{i}=\rho^{i}$ where $\rho \le \frac{\lambda}{\mu}$ gives the invariant measure infinite mass, hence implying $(X_{t})$ is null recurrent.

That being said, positive recurrence is still a class property in CTMC, although not directly inherited from DTMC properties: if $i \leftrightarrow j$, then $i$ positive recurrent $\iff$ $j$ positive recurrent.

> [!proof]-
> Let $H_{k}:= \inf\{ t \ge T_{1}~|~ X_{t}=k\}$ be the hitting time of $k \in \mathbb{S}$ (after the first jump). Then by definition $\mathbb{E}[R_{i}]=\mathbb{E}_{i}[H_{i}]$; assume it is finite, and we shall show that $\mathbb{E}_{j}[H_{j}]<\infty$ too.
>
> Now $\mathbb{E}_{j}[H_{j}] \le \mathbb{E}_{j}[H_{i}]+\mathbb{E}_{i}[H_{j}]$, as the latter is the expected length of a round trip $j \to i \to j$. So we prove each term in the $\mathrm{RHS}$ is finite.
>
> $\mathbb{E}_{j}[H_{i}]$ also appears in $\mathbb{E}_{i}[H_{i}]$ if we condition on $\{ H_{i} > H_{j} \}$ (an event of positive probability since $i \to j$), i.e. $(X_{t})$ visits $j$ before going back to $i$: by the strong Markov property, $ \mathbb{E}_{i}[H_{i} ~|~ H_{i} > H_{j}]= \mathbb{E}_{i}[H_{j} ~|~ H_{i} > H_{j}] + \mathbb{E}_{j}[H_{i}],$ therefore $\mathbb{E}_{j}[H_{i}] \le \mathbb{E}_{i}[H_{i} ~|~ H_{i}>H_{j}] < \infty$ due to finiteness of $\mathbb{E}_{i}[H_{i}]$.
> As for $\mathbb{E}_{i}[H_{j}]$, let $p:= \mathbb{P}_{i}[H_{j}< H_{i}]$ (positive since $i \to j$); then by the strong Markov property, the number of times $(X_{t})$ returns to $i$ before hitting $j$ is geometric with success probability $p$, with mean $\frac{1}{p}-1$, giving $\begin{align*} \mathbb{E}_{i}[H_{j}] &= \left( \frac{1}{p}-1 \right)\mathbb{E}_{i}[H_{i} ~|~ H_{i}<H_{j}] + \mathbb{E}_{i}[H_{j} ~|~ H_{j} < H_{i}]\\ &\le \left( \frac{1}{p}-1 \right)\mathbb{E}_{i}[H_{i} ~|~ H_{i}<H_{j}] + \mathbb{E}_{i}[H_{i} ~|~ H_{j} < H_{i}]\\ &< \infty, \end{align*}$ again because $\mathbb{E}_{i}[H_{i}]$ is finite.

*Transience/recurrence is preserved by fixed-interval sampling*: if $(W_{n}):= (X_{nh})$ is the DTMC obtained by sampling at a fixed interval $h > 0$, then $i$ is transient in $(W_{n})$ if and only if it is transient in $(X_{t})$ (hence the same holds for recurrence).

> [!proof]-
> Obviously $(X_{t})$ transience implies $(W_{n})$ transience.
>
> For the converse, note that $(W_{n})$ has transition matrix $P(h)$. Therefore it suffices to prove that $\sum_{n=1}^{\infty} p_{ii}(nh)<\infty \overset{?}{\Longrightarrow} \int _{0}^{\infty}p_{ii}(t) \, dt<\infty.$ Now $p_{ii}(t)$ has the following bound when $t \in [nh, (n+1)h)$: $\begin{align*} p_{ii}((n+1)h) &\ge p_{ii}(t) \cdot \mathbb{P}[ X \text{ stays at }i \text{ over }[t, (n+1)h) \,|\, X_{t}=i ]\\ &= p_{ii}(t) \cdot \exp(-q_{i}((n+1)h-t))\\ &\ge p_{ii}(t) \cdot \exp(-q_{i}h), \end{align*}$ where the $\exp(-q_{i}((n+1)h-t))$ comes from $\mathbb{P}[\mathrm{Exp}(q_{i}) \ge (n+1)h-t]$, due to memorylessness of the exponential holding times. Now integrating this bound over each interval $[nh, (n+1)h)$ gives $\int _{0}^{\infty}p_{ii}(t) \, dt \le h\exp(q_{i}h) \cdot\sum_{n=1}^{\infty}p_{ii}(nh) = h\exp(q_{i}h) \sum_{n=1}^{\infty} \mathbb{P}[W_{n}=i \,|\, W_{0}=i],$ which is finite if $i$ is transient for $(W_{n})$.
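The occupation-time identity $\int_{0}^{\infty}p_{ii}(t)\,dt=\frac{1}{q_{i}}\sum_{n}\pi_{ii}^{(n)}$ from the trichotomy proof can be checked numerically. Below is a minimal sketch (assuming NumPy; the generator is a toy example of my choosing, not from this note): for an absorbing chain, $(-Q_{TT})^{-1}$ restricted to the transient states is the expected-occupation-time matrix, and $(I-\Pi_{TT})^{-1}$ counts expected jump-chain visits.

```python
import numpy as np

# Toy absorbing CTMC on {0, 1, 2} (state 2 absorbing), chosen only for
# illustration: states 0 and 1 are transient, so both sides are finite.
Q = np.array([[-3.0,  2.0, 1.0],
              [ 1.0, -2.0, 1.0],
              [ 0.0,  0.0, 0.0]])
q = -np.diag(Q)              # holding rates q_i
T = [0, 1]                   # transient states

# Integral side: G = (-Q_TT)^{-1} is the expected-occupation-time matrix,
# so G[i, i] = ∫_0^∞ p_ii(t) dt for transient i.
G = np.linalg.inv(-Q[np.ix_(T, T)])

# Jump-chain side: π_ij = q_ij / q_i off the diagonal, and
# N = (I - Π_TT)^{-1} counts expected visits, so N[i, i] = Σ_{n≥0} π_ii^(n).
Pi = np.zeros_like(Q)
for i in T:                  # the absorbing state never jumps
    Pi[i] = Q[i] / q[i]
    Pi[i, i] = 0.0
NTT = np.linalg.inv(np.eye(len(T)) - Pi[np.ix_(T, T)])

# Check the identity ∫_0^∞ p_ii(t) dt = (1/q_i) Σ_n π_ii^(n) numerically.
for i in T:
    print(f"state {i}: {G[i, i]:.4f} == {NTT[i, i] / q[i]:.4f}")
    assert abs(G[i, i] - NTT[i, i] / q[i]) < 1e-12
```

Both sides agree state by state, matching the claim that divergence of the integral and of the jump-chain sum go hand in hand.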
## Invariant Distribution and Convergence

> [!definition|*] Invariant Measures and Distributions
> If a CTMC $(X_{t})$ has state space $\mathbb{S}$ and transition matrix $P(t)$, then a measure $\eta$ on $\mathbb{S}$ is an **invariant measure** if $\forall t \ge 0,~~\eta P(t)=\eta;$ if it is a probability measure (i.e. it has mass $1$), then it is an **invariant distribution** of $(X_{t})$.

> [!theorem|*] Invariance with respect to the generator
> A measure $\eta$ on $\mathbb{S}$ is invariant with respect to $P(t)$ if and only if $\eta Q=\mathbf{0}.$
>
> > [!proof]- Proof for finite state spaces
> > By the backward equation $P'(t)=QP(t)$, we have $\frac{ d }{ d t }(\eta P(t))=\eta Q P(t)$. If $\eta Q=\mathbf{0}$, this derivative vanishes, so $\eta P(t)=\eta P(0)=\eta$ for all $t$; conversely, if $\eta P(t)=\eta$ for all $t$, differentiating at $t=0$ gives $\eta Q=\mathbf{0}$.
> >
> > The exchange of derivative and summation in claiming $(\eta P)'=\eta P'$ is justified in finite state spaces.

### Existence of Invariant Measures

Since recurrence and irreducibility of $Q$ and $\Pi$ are equivalent (as long as $\mathbb{S}$ is not a singleton), the DTMC results on the jump chain having an invariant measure carry over to the CTMC $(X_{t})$:

> [!theorem|*] Recurrence and Existence of Invariant Measures
> If $(X_{t})$ is irreducible and recurrent, then it has an invariant measure $\eta$, unique up to scaling. In that case, it can be **normalized** into a distribution if and only if $(X_{t})$ is positive recurrent.

> [!theorem|*] Positive Recurrence is Equivalent to Existence of the Invariant Distribution
> If $(X_{t})$ is irreducible, then the following are equivalent:
> - $(X_{t})$ is positive recurrent.
> - $(X_{t})$ is non-explosive and has a unique invariant distribution $\eta$.
>
> In that case, the invariant distribution satisfies $\eta_{i}= \frac{1}{m_{i}q_{i}},$ where $m_{i}:= \mathbb{E}_{i}[H_{i}]$ is the expected time to return to $i$ after starting there.
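The identity $\eta_{i}=\frac{1}{m_{i}q_{i}}$ ties together two linear systems from this page: $\eta Q=\mathbf{0}$ and the expected-hitting-time equations. A minimal numerical sketch (assuming NumPy; the birth-death generator is a toy example of my choosing) solves both and compares:

```python
import numpy as np

# Toy truncated birth-death chain on {0,1,2,3}: up-rate 1, down-rate 2.
Q = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 2.0, -3.0,  1.0,  0.0],
              [ 0.0,  2.0, -3.0,  1.0],
              [ 0.0,  0.0,  2.0, -2.0]])
n = Q.shape[0]
q = -np.diag(Q)

# Invariant distribution: solve ηQ = 0 together with Ση = 1
# (least squares on the stacked, overdetermined but consistent system).
A = np.vstack([Q.T, np.ones(n)])
b = np.zeros(n + 1); b[-1] = 1.0
eta = np.linalg.lstsq(A, b, rcond=None)[0]

# Expected return times m_i via the expected-hitting-time equations:
# with A = {i}, k_j solves Σ_l q_jl k_l = -1 for j ≠ i (and k_i = 0),
# then m_i = 1/q_i + Σ_{j≠i} π_ij k_j.
for i in range(n):
    others = [j for j in range(n) if j != i]
    k = np.linalg.solve(Q[np.ix_(others, others)], -np.ones(n - 1))
    pi_row = Q[i, others] / q[i]          # jump probabilities out of i
    m_i = 1.0 / q[i] + pi_row @ k
    assert abs(eta[i] - 1.0 / (m_i * q[i])) < 1e-10
    print(f"eta_{i} = {eta[i]:.4f} = 1/(m_{i} q_{i}),  m_{i} = {m_i:.4f}")
```

For this chain detailed balance gives $\eta \propto (1, \frac{1}{2}, \frac{1}{4}, \frac{1}{8})$, and the assertion confirms the return-time formula state by state.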
> [!warning] Existence of an invariant distribution does not imply recurrence in case of explosion
> Consider the random walk in the warning above: if $q_{i,i+1}=2^{i}\lambda$ and $q_{i,i-1}=2^{i}\mu$, $1<\frac{\lambda}{\mu}<2$, then solving [[Time Reversal of Markov Chains#Reversibility and Detailed Balance|detailed balance]] gives an invariant distribution, but in that case $(X_{t})$ is clearly transient and explosive.

### Convergence to Invariant Distributions

If $(X_{t})$ is minimal, irreducible and positive recurrent with invariant distribution $\xi$, then:

- From any initial distribution, $\mathbb{P}[X_{t}=i] \to \xi_{i}$ as $t \to \infty$ for all states $i \in \mathbb{S}$.
- **Ergodic theorem**: the proportion of time $(X_{t})$ spends at $i$ converges to $\xi_{i}$: $\frac{1}{t}\int_{0}^{t} \mathbf{1}\{ X_{s}=i \} \, ds \to \xi_{i}. $

> [!proof]- Proof of convergence to invariant distribution
> Consider $(X_{t}') \sim \mathrm{Markov}(\nu, Q)$ with arbitrary initial distribution $\nu$, and $(X_{t}^{\ast}) \sim \mathrm{Markov}(\xi, Q)$ already in equilibrium.
>
> The big idea is to show that $T:= \inf \{ t \,|\,X'_{t}=X^{\ast}_{t}\}$ is $\mathrm{a.s.}$ finite, so we can splice $(X_{t}'), (X^{\ast}_{t})$ together at $T$ to obtain a $\mathrm{Markov}(\nu, Q)$ process that converges to $\xi$.
>
> First, $T$ is $\mathrm{a.s.}$ finite since the process $(X_{t}', X^{\ast}_{t}) \in \mathbb{S}^{2}$ is irreducible (as $p_{ij}(t)>0$ for all $t$) and has the stationary distribution with mass $\xi_{i}\xi_{j}$ at the state $(i, j)$. Therefore it is recurrent, implying that $T\le\inf_{i} H_{(i,i)}$ is $\mathrm{a.s.}$ finite.
>
> Now consider $X_{t}:= \begin{cases} X'_{t} &\text{if }t< T, \\ X^{\ast}_{t} & \text{otherwise.} \end{cases}$ By the strong Markov property, $(X_{t}) \sim \mathrm{Markov}(\nu, Q)$, and it converges to $\xi$: $\mathbb{P}[X_{t}=i]=\mathbb{P}[X'_{t}=i , T> t] + \mathbb{P}[X^{\ast}_{t}=i, T \le t].$ The former converges to $0$ as $t \to \infty$ since $T$ is $\mathrm{a.s.}$ finite, and the latter converges to $\xi_{i}$ since $\mathbb{P}[X^{\ast}_{t}=i]=\xi_{i}$.
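The convergence $\mathbb{P}[X_{t}=i]\to\xi_{i}$ can be observed numerically by computing $P(t)=e^{Qt}$. A minimal sketch (assuming NumPy; the small generator is a toy example of my choosing) uses *uniformization* — a standard technique not introduced in this note: with $\lambda \ge \max_{i} q_{i}$, the matrix $\bar{P}=I+Q/\lambda$ is stochastic and $P(t)=\sum_{k\ge 0} e^{-\lambda t}\frac{(\lambda t)^{k}}{k!}\bar{P}^{k}$.

```python
import numpy as np
from math import exp

# Small irreducible toy generator (for illustration only).
Q = np.array([[-1.0,  1.0,  0.0],
              [ 2.0, -3.0,  1.0],
              [ 0.0,  2.0, -2.0]])

def transition_matrix(Q, t, terms=200):
    """P(t) = e^{Qt} via uniformization: with λ ≥ max_i q_i and
    P̄ = I + Q/λ stochastic, P(t) = Σ_k e^{-λt} (λt)^k / k! · P̄^k."""
    lam = max(-np.diag(Q)) + 1.0
    Pbar = np.eye(Q.shape[0]) + Q / lam
    P, Pk, w = np.zeros_like(Q), np.eye(Q.shape[0]), exp(-lam * t)
    for k in range(terms):          # w carries e^{-λt}(λt)^k / k!
        P += w * Pk
        Pk = Pk @ Pbar
        w *= lam * t / (k + 1)
    return P

# Invariant distribution ξ from ξQ = 0, Σξ = 1.
A = np.vstack([Q.T, np.ones(3)])
xi = np.linalg.lstsq(A, np.array([0.0, 0.0, 0.0, 1.0]), rcond=None)[0]

# From a point mass at state 0, the law of X_t approaches ξ.
nu = np.array([1.0, 0.0, 0.0])
for t in [0.5, 2.0, 10.0]:
    print(f"t = {t:5.1f}: P[X_t = .] = {np.round(nu @ transition_matrix(Q, t), 4)}")
```

The truncation level `terms` is an assumption: it must comfortably exceed $\lambda t$ for the Poisson weights to have decayed, which holds for the times used here.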