## Markov Chain

Let $S$ be a *discrete* state space, and let $(X_n)_{n=0,1,2,\ldots}$ be a discrete-time stochastic process taking its values on $S$. Then $(X_n)$ is called a

*Markov chain* if

$$P(X_{n+1} = y \mid X_0 = x_0, \ldots, X_{n-1} = x_{n-1}, X_n = x) = P(X_{n+1} = y \mid X_n = x) = p(x, y)$$

for all states $x_0, \ldots, x_{n-1}, x, y \in S$. The function $p(x, y)$ is called the *transition probability*; that is, $p(x, \cdot)$ is a probability distribution on $S$ for each $x \in S$. Conversely, given such a $p$, one can construct a Markov chain $(X_n)$ so that

$$P(X_{n+1} = y \mid X_n = x) = p(x, y).$$
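The construction can be sketched numerically. A minimal simulation in NumPy, assuming a hypothetical 3-state chain whose transition matrix `P` holds $p(x, y)$ in row $x$, column $y$ (the matrix itself is an illustrative assumption, not from the text):

```python
import numpy as np

# Hypothetical 3-state chain on S = {0, 1, 2}; row x of P is the
# distribution p(x, .), so every row must sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

def simulate(P, x0, n, rng):
    """Generate X_0, ..., X_n with P(X_{k+1} = y | X_k = x) = P[x, y]."""
    path = [x0]
    for _ in range(n):
        # Draw the next state from the row of the current state.
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

rng = np.random.default_rng(0)
path = simulate(P, x0=0, n=10, rng=rng)
```

Because the next state depends only on the current state, the update uses just `P[path[-1]]`, which is exactly the Markov property above.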

Stationary distribution.
Let $\pi$ be a probability distribution on $S$.
Then $\pi$ is said to be *stationary* if

$$\sum_{x \in S} \pi(x)\, p(x, y) = \pi(y) \quad \text{for every } y \in S.$$
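For a finite state space the stationarity equation is a linear system in the unknowns $\pi(x)$, so $\pi$ can be computed directly. A sketch in NumPy, using a hypothetical 3-state transition matrix `P` (an assumption for illustration):

```python
import numpy as np

# Hypothetical transition matrix on S = {0, 1, 2}.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

# pi is stationary iff pi P = pi and sum(pi) = 1: solve the
# overdetermined system (P^T - I) pi = 0 plus a normalization row.
n = len(P)
A = np.vstack([P.T - np.eye(n), np.ones(n)])
b = np.concatenate([np.zeros(n), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
```

The solution `pi` then satisfies the defining equation $\sum_x \pi(x)\,p(x,y) = \pi(y)$ up to numerical precision.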

A Markov chain $(\tilde{X}_n)$ with transition probability $q(x, y)$
is called *time-reversed* (relative to $\pi$) if
the detailed balance condition

$$\pi(x)\, p(x, y) = \pi(y)\, q(y, x) \quad \text{for all } x, y \in S$$

holds, and we call $(X_n)$ a *reversible* Markov chain when $q = p$. In particular, if the probability distribution $\pi$ satisfies the detailed balance between the two transition probabilities $p$ and $q$, then $\pi$ becomes stationary for both $p$ and $q$: summing the detailed balance condition over $x$ gives $\sum_{x} \pi(x)\, p(x, y) = \pi(y) \sum_{x} q(y, x) = \pi(y)$, and symmetrically for $q$.
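A classical reversible example is the random walk on a weighted graph: with symmetric weights $w(x, y)$, the chain $p(x, y) = w(x, y)/\deg(x)$, where $\deg(x) = \sum_y w(x, y)$, satisfies detailed balance with $\pi(x) \propto \deg(x)$. A numerical check of this standard fact (the weights below are hypothetical; NumPy assumed):

```python
import numpy as np

# Hypothetical symmetric edge weights on a 3-vertex graph.
W = np.array([[0.0, 2.0, 1.0],
              [2.0, 0.0, 3.0],
              [1.0, 3.0, 0.0]])
deg = W.sum(axis=1)
P = W / deg[:, None]        # p(x, y) = w(x, y) / deg(x)
pi = deg / deg.sum()        # pi(x) proportional to deg(x)

# Detailed balance pi(x) p(x, y) = pi(y) p(y, x) says that the
# matrix D with entries pi(x) p(x, y) is symmetric.
D = pi[:, None] * P
```

Here $\pi(x)\,p(x,y) = w(x,y)/\sum_z \deg(z)$, which is symmetric in $x, y$ because $w$ is, so `D` equals its transpose and, by the summation argument above, `pi` is stationary.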

Irreducibility and aperiodicity. We can define the $n$-step transition probability $p^n(x, y)$ inductively by

$$p^1(x, y) = p(x, y), \qquad p^{n+1}(x, y) = \sum_{z \in S} p^n(x, z)\, p(z, y).$$

The chain is said to be *aperiodic* if

$$\gcd\{n \ge 1 : p^n(x, x) > 0\} = 1 \quad \text{for every } x \in S,$$

and *irreducible* if for any $x, y \in S$, there is an $n \ge 1$ such that $p^n(x, y) > 0$.
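For a finite chain both conditions can be tested at once: irreducibility together with aperiodicity is equivalent to some matrix power $P^n$ being entrywise positive (a primitive matrix), and by Wielandt's bound it suffices to check $n = (|S|-1)^2 + 1$. A sketch in NumPy (the example matrices are hypothetical):

```python
import numpy as np

def is_ergodic(P):
    """Check primitivity: P^n entrywise positive for Wielandt's n."""
    n = (len(P) - 1) ** 2 + 1
    return bool((np.linalg.matrix_power(P, n) > 0).all())

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])   # irreducible and aperiodic

Q = np.array([[0.0, 1.0],
              [1.0, 0.0]])        # irreducible but has period 2
```

For `Q`, the powers alternate between `Q` and the identity, so no single power is entrywise positive and `is_ergodic(Q)` is false.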

Ergodicity and limit theorem.
We call $(X_n)$ (and its transition probability $p$) an *ergodic*
Markov chain if $(X_n)$ is irreducible and aperiodic.
Now suppose that we have devised an ergodic Markov chain $(X_n)$
whose stationary distribution is $\pi$.
Then we have the following convergence theorem:
If $(X_n)$ is irreducible and aperiodic, then there is a unique
stationary distribution $\pi$ such that

$$p^n(x, y) \to \pi(y) \quad \text{for every } x, y \in S$$

as $n \to \infty$.
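The theorem can be observed numerically: since $p^n(x, y)$ is the $(x, y)$ entry of the matrix power $P^n$, every row of $P^n$ flattens onto the same vector $\pi$. A sketch with a hypothetical ergodic 3-state chain in NumPy:

```python
import numpy as np

# Hypothetical ergodic chain (all entries positive).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

# p^n(x, y) is the (x, y) entry of P^n; for large n all rows of P^n
# should agree, and that common row is the stationary distribution.
Pn = np.linalg.matrix_power(P, 50)
pi = Pn[0]                  # any row serves as an estimate of pi
```

By step 50 the rows agree to machine precision, so the limit no longer depends on the starting state $x$.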

In terms of the sample path $X_1, X_2, \ldots$, it implies that

$$\frac{1}{n} \,\#\{k \le n : X_k = y\} \to \pi(y) \quad \text{almost surely for every } y \in S$$

as $n \to \infty$; that is, the long-run fraction of time the chain spends in state $y$ converges to $\pi(y)$.
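This sample-path statement can be checked by simulation: the empirical state frequencies along one long trajectory should approach the stationary distribution. A sketch in NumPy (hypothetical 3-state chain, fixed seed):

```python
import numpy as np

# Hypothetical ergodic chain.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.4, 0.4]])

rng = np.random.default_rng(0)
n = 200_000
x = 0
counts = np.zeros(3)
for _ in range(n):
    x = rng.choice(3, p=P[x])   # one Markov step
    counts[x] += 1
freq = counts / n               # fraction of time spent in each state

# Reference value of pi from the matrix-power limit.
pi = np.linalg.matrix_power(P, 50)[0]
```

A single run suffices: unlike an i.i.d. law of large numbers, the averaging here is along one dependent trajectory, which is exactly what ergodicity licenses.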

© TTU Mathematics