
Markov Chain

Let $ S$ be a discrete state space and let $ (X_t,\ t = 0, 1,\ldots)$ be a discrete-time stochastic process taking values in $ S$. Then $ (X_t)$ is called a Markov chain if, for every $ t' < t$,

$\displaystyle \mathbb{P}(X_t = x_t \vert X_s = x_s,\ s \le t') = \mathbb{P}(X_t = x_t \vert X_{t'} = x_{t'}).
$

Let $ \mathbf{P}(x,y)$ be a transition probability, that is, $ \mathbf{P}(x,\cdot)$ is a probability distribution on $ S$ for each $ x\in S$. Then one can construct a Markov chain $ X_t$ so that

$\displaystyle \mathbb{P}(X_t = y \vert X_{t-1} = x ) = \mathbf{P}(x,y).
$
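The construction above can be carried out directly on a computer: repeatedly sample the next state from the row $ \mathbf{P}(X_{t-1},\cdot)$. Below is a minimal Python sketch; the 3-state matrix `P` is a hypothetical example, not one taken from the text.

```python
import numpy as np

# Hypothetical 3-state transition matrix: row x is the distribution P(x, .),
# so every row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

def simulate_chain(P, x0, n_steps, seed=0):
    """Sample X_0, ..., X_{n_steps}, drawing X_t from the row P(X_{t-1}, .)."""
    rng = np.random.default_rng(seed)
    states = np.arange(P.shape[0])
    path = [x0]
    for _ in range(n_steps):
        path.append(int(rng.choice(states, p=P[path[-1]])))
    return path

path = simulate_chain(P, x0=0, n_steps=10)
```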

Stationary distribution. Let $ \pi$ be a probability distribution on $ S$. Then $ \pi$ is said to be stationary for $ \mathbf{P}$ if

$\displaystyle \sum_{x\in S} \pi(x) \mathbf{P}(x,y) = \pi(y)
$

for all $ y\in S$.
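The stationarity condition says that $ \pi$ is a left eigenvector of the transition matrix with eigenvalue 1, so on a finite state space it can be found numerically. A sketch, using the same kind of hypothetical 3-state matrix as an example:

```python
import numpy as np

# Hypothetical transition matrix; rows are the distributions P(x, .).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# The condition sum_x pi(x) P(x,y) = pi(y) reads pi P = pi: pi is a left
# eigenvector of P with eigenvalue 1, normalized so that it sums to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
```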

A Markov chain $ \tilde{\mathbf{P}}$ is called the time reversal of $ \mathbf{P}$ if the detailed balance condition

$\displaystyle \pi(x) \mathbf{P}(x,y) = \pi(y) \tilde{\mathbf{P}}(y,x)
$

holds between $ \mathbf{P}$ and $ \tilde{\mathbf{P}}$. Moreover, $ \mathbf{P}$ is called a reversible Markov chain when $ \mathbf{P} = \tilde{\mathbf{P}}$. In particular, if the probability distribution $ \pi$ satisfies the detailed balance condition between the two Markov chains $ \mathbf{P}$ and $ \tilde{\mathbf{P}}$, then $ \pi$ is stationary for both $ \mathbf{P}$ and $ \tilde{\mathbf{P}}$: summing the detailed balance equation over $ x$ yields $ \sum_{x\in S} \pi(x) \mathbf{P}(x,y) = \pi(y)$, and summing over $ y$ yields $ \sum_{y\in S} \pi(y) \tilde{\mathbf{P}}(y,x) = \pi(x)$.
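Detailed balance for a reversible chain can be checked numerically: it says the matrix of probability flows $ \pi(x)\mathbf{P}(x,y)$ is symmetric. The birth-death chain below is a hypothetical example chosen because a chain with only nearest-neighbour moves is always reversible:

```python
import numpy as np

# Hypothetical birth-death chain: only nearest-neighbour moves are allowed,
# which forces reversibility with respect to its stationary distribution.
P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Stationary distribution as the left eigenvector for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()

# Detailed balance pi(x) P(x,y) = pi(y) P(y,x) means the flow matrix
# flows[x, y] = pi(x) P(x, y) is symmetric.
flows = pi[:, None] * P
assert np.allclose(flows, flows.T)
```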

Irreducibility and aperiodicity. Setting $ \mathbf{P}^1 = \mathbf{P}$, we can define $ \mathbf{P}^n(x,y)$ inductively by

$\displaystyle \mathbf{P}^n(x,y) = \sum_{z\in S} \mathbf{P}^{n-1}(x,z)
\mathbf{P}(z,y) .
$

In general, if we have transition probabilities $ \mathbf{P}$ and $ \mathbf{Q}$, then $ \mathbf{P} \mathbf{Q}$ is the transition probability defined by

$\displaystyle (\mathbf{P} \mathbf{Q})(x,y) := \sum_{z\in S} \mathbf{P}(x,z) \mathbf{Q}(z,y).
$

We can easily see that $ \mathbb{P}(X_{t+n} = y \vert X_t = x ) = \mathbf{P}^n(x,y)$.
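Since $ \mathbf{P}^n$ is literally the $ n$-th matrix power of the transition matrix, the inductive definition can be verified in a few lines (again with a hypothetical 3-state matrix):

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# P^n(x, y) = P(X_{t+n} = y | X_t = x) is the n-th matrix power of P.
P3 = np.linalg.matrix_power(P, 3)

# Consistent with the inductive definition P^n = P^{n-1} P ...
assert np.allclose(P3, (P @ P) @ P)
# ... and each row of P^n is still a probability distribution on S.
assert np.allclose(P3.sum(axis=1), 1.0)
```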

$ \mathbf{P}$ is said to be aperiodic if

$\displaystyle \gcd\{n \ge 1 : \mathbf{P}^n(x,x) > 0\} = 1
$

for every $ x\in S$. $ \mathbf{P}$ is said to be irreducible if for any $ x,y\in S$, there is an $ n$ such that $ \mathbf{P}^n(x,y) > 0$.

Ergodicity and limit theorem. We call $ (X_t)$ (and its transition probability $ \mathbf{P}$) an ergodic Markov chain if $ \mathbf{P}$ is irreducible and aperiodic. Now suppose that we have devised an ergodic Markov chain whose stationary distribution is $ \pi$. Then we have the following convergence theorem: If $ \mathbf{P}$ is irreducible and aperiodic (for instance on a finite state space $ S$, or more generally when the chain is positive recurrent), then there is a unique stationary distribution $ \pi$ such that

$\displaystyle \mathbf{P}^n(x,y) \to \pi(y)$    as $\displaystyle n \to \infty.
$

In terms of the sample path, this implies that

$\displaystyle X_t \stackrel{\mathcal{L}}{\longrightarrow} \pi$    as $\displaystyle t \longrightarrow \infty,
$

that is, $ X_t$ converges in law (in distribution) to $ \pi$.
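The convergence theorem is easy to observe numerically: every row of $ \mathbf{P}^n$ approaches the same vector $ \pi$, so the limit forgets the initial state $ x$. A sketch using the hypothetical 3-state matrix from the earlier examples:

```python
import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# For an irreducible aperiodic chain, P^n(x, .) -> pi as n -> infinity.
Pn = np.linalg.matrix_power(P, 50)

# All rows of P^50 agree (to numerical precision): the limit is
# independent of the starting state x.
assert np.allclose(Pn, Pn[0])

# The common row is the unique stationary distribution: pi P = pi.
pi = Pn[0]
assert np.allclose(pi @ P, pi)
```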


© TTU Mathematics