CHAPTER 1

Introduction to Finite Markov Chains

1.1. Finite Markov Chains

A finite Markov chain is a process which moves among the elements of a finite set Ω in the following manner: when at x ∈ Ω, the next position is chosen according to a fixed probability distribution P(x, ·). More precisely, a sequence of random variables (X0, X1, . . .) is a Markov chain with state space Ω and transition matrix P if for all x, y ∈ Ω, all t ≥ 1, and all events Ht−1 = ⋂_{s=0}^{t−1} {Xs = xs} satisfying P(Ht−1 ∩ {Xt = x}) > 0, we have

P{Xt+1 = y | Ht−1 ∩ {Xt = x}} = P{Xt+1 = y | Xt = x} = P(x, y).   (1.1)

Equation (1.1), often called the Markov property, means that the conditional probability of proceeding from state x to state y is the same, no matter what sequence x0, x1, . . . , xt−1 of states precedes the current state x. This is exactly why the |Ω| × |Ω| matrix P suffices to describe the transitions.

The x-th row of P is the distribution P(x, ·). Thus P is stochastic, that is, its entries are all non-negative and Σ_{y∈Ω} P(x, y) = 1 for all x ∈ Ω.
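These two conditions are easy to check and simulate directly. The following sketch uses a small hypothetical transition matrix (the states "a", "b", "c" and their probabilities are illustrative, not from the text): `is_stochastic` verifies that every row of P is a probability distribution, and `step` samples the next state from P(x, ·).

```python
import random

# A hypothetical 3-state transition matrix P; each row P[x] is the
# distribution P(x, ·), so entries are non-negative and rows sum to 1.
P = {
    "a": {"a": 0.5, "b": 0.25, "c": 0.25},
    "b": {"a": 0.0, "b": 0.5, "c": 0.5},
    "c": {"a": 1.0, "b": 0.0, "c": 0.0},
}

def is_stochastic(P, tol=1e-12):
    """Check that all entries are non-negative and each row sums to 1."""
    return all(
        all(p >= 0 for p in row.values()) and abs(sum(row.values()) - 1) < tol
        for row in P.values()
    )

def step(P, x, rng=random):
    """Sample the next state from the distribution P(x, ·)."""
    states = list(P[x])
    weights = [P[x][y] for y in states]
    return rng.choices(states, weights=weights, k=1)[0]

print(is_stochastic(P))  # True
```

Note that `step` needs only the current state x, never the earlier history; this is the Markov property (1.1) expressed as code.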

Example 1.1. A certain frog lives in a pond with two lily pads, east and west. A long time ago, he found two coins at the bottom of the pond and brought one up to each lily pad. Every morning, the frog decides whether to jump by tossing the current lily pad's coin. If the coin lands heads up, the frog jumps to the other lily pad. If the coin lands tails up, he remains where he is.

Let Ω = {e, w}, and let (X0, X1, . . . ) be the sequence of lily pads occupied by the frog on Sunday, Monday, . . . . Given the source of the coins, we should not assume that they are fair! Say the coin on the east pad has probability p of landing

Figure 1.1. A randomly jumping frog. Whenever he tosses heads, he jumps to the other lily pad.
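The frog's behavior can be sketched as a two-state chain. Below, p is the heads-probability of the east coin as in the example; the heads-probability q of the west coin is an assumption (the specific values 0.3 and 0.6 are purely illustrative). Simulating many mornings and counting visits to the east pad gives an empirical long-run fraction, which for a two-state chain should approach q/(p + q).

```python
import random

# Frog chain on Ω = {e, w}: east coin lands heads with probability p (as in
# the example); we assume, hypothetically, the west coin lands heads with
# probability q. Heads means "jump to the other pad", tails means "stay".
p, q = 0.3, 0.6  # illustrative values, not from the text

P = {
    "e": {"e": 1 - p, "w": p},
    "w": {"e": q, "w": 1 - q},
}

def simulate(P, x0, steps, rng):
    """Return the trajectory (X0, X1, ..., X_steps) of the chain."""
    traj = [x0]
    for _ in range(steps):
        x = traj[-1]
        states = list(P[x])
        traj.append(rng.choices(states, weights=[P[x][y] for y in states])[0])
    return traj

rng = random.Random(0)
traj = simulate(P, "e", 100_000, rng)
frac_east = traj.count("e") / len(traj)
print(round(frac_east, 2))  # close to q/(p + q) ≈ 0.67
```

The simulated fraction of mornings spent on the east pad anticipates the notion of a stationary distribution, which is developed later in the chapter.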

