## Markov Chain

Let $S$ be a discrete state space and let $X = (X_t)_{t = 0, 1, 2, \dots}$ be a discrete time stochastic process taking its values in $S$. Then $X$ is called a Markov chain if
$$
\mathbb{P}(X_{t+1} = y \mid X_t = x_t, X_{t-1} = x_{t-1}, \dots, X_0 = x_0) = \mathbb{P}(X_{t+1} = y \mid X_t = x_t).
$$

Let $P = (P(x, y))_{x, y \in S}$ be a transition probability, that is, $P(x, \cdot)$ is a probability distribution on $S$ for each $x \in S$. Then one can construct a Markov chain $X$ so that
$$
\mathbb{P}(X_{t+1} = y \mid X_t = x) = P(x, y).
$$
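As an illustration, a chain with a given transition probability can be simulated by sampling $X_{t+1}$ from the distribution $P(X_t, \cdot)$ at each step. A minimal sketch in Python, using a hypothetical 3-state transition matrix (the numbers are illustrative, not from the text):

```python
import random

# Illustrative transition probability P(x, y) on S = {0, 1, 2};
# each row P(x, .) is a probability distribution (rows sum to 1).
P = [
    [0.5, 0.25, 0.25],
    [0.2, 0.6,  0.2 ],
    [0.3, 0.3,  0.4 ],
]

def step(x, rng):
    """Sample X_{t+1} from the distribution P(x, .) by inverse transform."""
    u = rng.random()
    cum = 0.0
    for y, p in enumerate(P[x]):
        cum += p
        if u < cum:
            return y
    return len(P) - 1  # guard against floating-point round-off

def simulate(x0, n, seed=0):
    """Generate a sample path X_0, X_1, ..., X_n starting at x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

path = simulate(x0=0, n=10)
```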

Stationary distribution. Let $\pi$ be a probability distribution on $S$. $\pi$ is said to be stationary (with respect to $P$) if
$$
\sum_{x \in S} \pi(x) P(x, y) = \pi(y)
$$
for all $y \in S$.
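For a finite chain the stationarity condition can be checked numerically. The sketch below uses an illustrative 3-state matrix and power iteration (repeatedly pushing a distribution through the chain) to approximate $\pi$, then verifies $\sum_x \pi(x) P(x, y) = \pi(y)$:

```python
# Illustrative 3-state transition matrix (rows sum to 1).
P = [
    [0.5, 0.25, 0.25],
    [0.2, 0.6,  0.2 ],
    [0.3, 0.3,  0.4 ],
]
n = len(P)

def one_step(mu):
    """Push a distribution through the chain: (mu P)(y) = sum_x mu(x) P(x, y)."""
    return [sum(mu[x] * P[x][y] for x in range(n)) for y in range(n)]

# Power iteration from the uniform distribution; for an ergodic chain
# mu P^k converges to the stationary distribution pi.
pi = [1.0 / n] * n
for _ in range(1000):
    pi = one_step(pi)

# Stationarity check: pi P should equal pi (up to floating-point error).
residual = max(abs(a - b) for a, b in zip(one_step(pi), pi))
```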

A Markov chain $\tilde{X}$ with transition probability $\tilde{P}$ is called the time-reversed chain of $X$ if the detailed balance condition
$$
\pi(x) P(x, y) = \pi(y) \tilde{P}(y, x), \qquad x, y \in S,
$$
holds between $P$ and $\tilde{P}$. Moreover, $X$ is called a reversible Markov chain when $\tilde{P} = P$. In particular, if the probability distribution $\pi$ satisfies the detailed balance between the two Markov chains $P$ and $\tilde{P}$, then $\pi$ becomes stationary for both $P$ and $\tilde{P}$: summing the condition over $x$ gives $\sum_{x} \pi(x) P(x, y) = \pi(y)$, and summing over $y$ gives $\sum_{y} \pi(y) \tilde{P}(y, x) = \pi(x)$.

Irreducibility and aperiodicity. We can define $P^n$ inductively by
$$
P^1 = P, \qquad P^{n+1}(x, y) = \sum_{z \in S} P^n(x, z) P(z, y).
$$
In general, if we have transition probabilities $P$ and $Q$, then $PQ$ is the transition probability defined by
$$
(PQ)(x, y) = \sum_{z \in S} P(x, z) Q(z, y).
$$
We can easily see that $P^{m+n} = P^m P^n$.
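The composition rule and the identity $P^{m+n} = P^m P^n$ are just matrix multiplication, which can be checked directly on a small illustrative chain:

```python
def compose(A, B):
    """Compose two transition probabilities: (AB)(x, y) = sum_z A(x, z) B(z, y)."""
    n = len(A)
    return [[sum(A[x][z] * B[z][y] for z in range(n)) for y in range(n)]
            for x in range(n)]

def power(P, n):
    """The n-step transition probability P^n, defined inductively."""
    Q = P
    for _ in range(n - 1):
        Q = compose(Q, P)
    return Q

P = [[0.5, 0.5],
     [0.2, 0.8]]

# Check P^{2+3} = P^2 P^3 entrywise.
lhs = power(P, 5)
rhs = compose(power(P, 2), power(P, 3))
diff = max(abs(lhs[x][y] - rhs[x][y]) for x in range(2) for y in range(2))
```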

$P$ is said to be aperiodic if
$$
\gcd\{n \ge 1 : P^n(x, x) > 0\} = 1
$$
for any $x \in S$. $P$ is said to be irreducible if for any $x, y \in S$, there is an $n \ge 1$ such that $P^n(x, y) > 0$.
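For a finite state space, $P$ is irreducible and aperiodic exactly when some power $P^n$ has all entries strictly positive, and by Wielandt's theorem it suffices to check $n \le (|S|-1)^2 + 1$. That gives a simple, if naive, test (a sketch, not an efficient algorithm):

```python
def compose(A, B):
    """(AB)(x, y) = sum_z A(x, z) B(z, y)."""
    n = len(A)
    return [[sum(A[x][z] * B[z][y] for z in range(n)) for y in range(n)]
            for x in range(n)]

def is_ergodic(P):
    """True iff some P^n with n <= (|S|-1)^2 + 1 is entrywise positive,
    i.e. the finite chain is irreducible and aperiodic."""
    n = len(P)
    Q = P
    for _ in range((n - 1) ** 2 + 1):
        if all(q > 0 for row in Q for q in row):
            return True
        Q = compose(Q, P)
    return False

flip = [[0.0, 1.0],   # deterministic flip: irreducible but has period 2
        [1.0, 0.0]]
lazy = [[0.5, 0.5],   # a self-loop at state 0 breaks the periodicity
        [1.0, 0.0]]
```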

Ergodicity and limit theorem. We call $X$ (and its transition probability $P$) an ergodic Markov chain if $P$ is irreducible and aperiodic. Now suppose that we have devised an ergodic Markov chain whose stationary distribution is $\pi$. Then we have the following convergence theorem: If $P$ is irreducible and aperiodic, then there is a unique stationary distribution $\pi$ such that
$$
P^n(x, y) \to \pi(y)
$$
as $n \to \infty$, for every $x, y \in S$.

In terms of the sample path it implies that
$$
\frac{1}{n} \sum_{t=1}^{n} f(X_t) \to \sum_{x \in S} \pi(x) f(x) \quad \text{almost surely}
$$
as $n \to \infty$, for any bounded function $f$ on $S$.
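Both limits can be observed numerically on a small illustrative chain: the rows of $P^n$ collapse to a common distribution $\pi$, and a long sample-path average of $f(X_t)$ lands near the stationary average $\sum_x \pi(x) f(x)$. A sketch (matrix, $f$, and run length are all illustrative choices):

```python
import random

P = [[0.5, 0.25, 0.25],
     [0.2, 0.6,  0.2 ],
     [0.3, 0.3,  0.4 ]]
n = len(P)

def compose(A, B):
    return [[sum(A[x][z] * B[z][y] for z in range(n)) for y in range(n)]
            for x in range(n)]

# P^60: every row approaches the same stationary distribution pi.
Q = P
for _ in range(59):
    Q = compose(Q, P)
row_spread = max(abs(Q[0][y] - Q[x][y]) for x in range(n) for y in range(n))
pi = Q[0]

# Sample-path average of f(X_t) versus the stationary average of f.
f = [1.0, 2.0, 3.0]
rng = random.Random(1)
x, total, steps = 0, 0.0, 100_000
for _ in range(steps):
    u, cum = rng.random(), 0.0
    for y in range(n):
        cum += P[x][y]
        if u < cum:
            x = y
            break
    total += f[x]
time_avg = total / steps
space_avg = sum(pi[y] * f[y] for y in range(n))
```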