Figure 1.5. The directed graph associated to a Markov chain. A directed edge is drawn from $v$ to $w$ if and only if $P(v, w) > 0$. Here there is one essential class, which consists of the filled vertices.

We can interchange the order of summation in the first sum, obtaining
\[
\pi P(C) = \sum_{y \in C} \pi(y) \sum_{z \in C} P(y, z) + \sum_{z \in C} \sum_{y \notin C} \pi(y) P(y, z).
\]
For $y \in C$ we have $\sum_{z \in C} P(y, z) = 1$, so
\[
\pi P(C) = \pi(C) + \sum_{z \in C} \sum_{y \notin C} \pi(y) P(y, z). \tag{1.34}
\]
Since $\pi$ is invariant, $\pi P(C) = \pi(C)$. In view of (1.34) we must have $\pi(y) P(y, z) = 0$ for all $y \notin C$ and $z \in C$.

Suppose that $y_0$ is inessential. The proof of Lemma 1.24 shows that there is a sequence of states $y_0, y_1, y_2, \ldots, y_r$ satisfying $P(y_{i-1}, y_i) > 0$, the states $y_0, y_1, \ldots, y_{r-1}$ are inessential, and $y_r \in C$, where $C$ is an essential communicating class. Since $P(y_{r-1}, y_r) > 0$ and we just proved that $\pi(y_{r-1}) P(y_{r-1}, y_r) = 0$, it follows that $\pi(y_{r-1}) = 0$.

If $\pi(y_k) = 0$, then
\[
0 = \pi(y_k) = \sum_{y \in \Omega} \pi(y) P(y, y_k).
\]
This implies $\pi(y) P(y, y_k) = 0$ for all $y$. In particular, $\pi(y_{k-1}) = 0$. By induction backwards along the sequence, we find that $\pi(y_0) = 0$.

Finally, we conclude with the following proposition:

Proposition 1.26. The stationary distribution $\pi$ for a transition matrix $P$ is unique if and only if there is a unique essential communicating class.

Proof. Suppose that there is a unique essential communicating class $C$. We write $P|_C$ for the restriction of the matrix $P$ to the states in $C$. Suppose $x \in C$ and $P(x, y) > 0$. Then since $x$ is essential and $x \to y$, it must be that $y \to x$ also, whence $y \in C$. Thus every row of $P|_C$ sums to $1$, so $P|_C$ is a transition matrix, which clearly must be irreducible on $C$. Therefore, there exists a unique stationary distribution $\pi_C$ for $P|_C$. Let $\pi$ be a probability on $\Omega$ with $\pi = \pi P$. By Proposition 1.25, $\pi(y) = 0$ for all $y \notin C$.
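To make these propositions concrete, here is a small numerical check (a sketch assuming NumPy; the 4-state matrix below is an invented example, not one from the text). States 0 and 1 are inessential and leak into the essential class $\{2, 3\}$, so any stationary distribution must vanish on them:

```python
import numpy as np

# Hypothetical 4-state chain: states 0 and 1 are inessential,
# {2, 3} is the unique essential communicating class.
P = np.array([
    [0.5, 0.2, 0.3, 0.0],
    [0.1, 0.6, 0.0, 0.3],
    [0.0, 0.0, 0.4, 0.6],
    [0.0, 0.0, 0.7, 0.3],
])

# A stationary distribution pi satisfies pi P = pi, i.e. pi is a
# left eigenvector of P with eigenvalue 1.
evals, evecs = np.linalg.eig(P.T)
k = np.argmin(np.abs(evals - 1.0))  # eigenvalue closest to 1
pi = np.real(evecs[:, k])
pi /= pi.sum()                      # normalize to a probability

print(np.round(pi, 4))  # ~ [0, 0, 0.5385, 0.4615] = (0, 0, 7/13, 6/13)
```

As Proposition 1.25 predicts, $\pi$ puts zero mass on the inessential states 0 and 1, and on $\{2, 3\}$ it agrees with the unique stationary distribution of the restricted chain $P|_C$.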
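Conversely, the "only if" direction of Proposition 1.26 can be checked the same way (again an invented example): with two essential communicating classes, $\{0, 1\}$ and $\{2, 3\}$, each class carries its own stationary distribution, and every convex combination of the two is also stationary, so uniqueness fails:

```python
import numpy as np

# Hypothetical chain with TWO essential communicating classes,
# {0, 1} and {2, 3}: the matrix is block diagonal.
P = np.array([
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.5, 0.5],
    [0.0, 0.0, 0.5, 0.5],
])

pi1 = np.array([0.5, 0.5, 0.0, 0.0])  # stationary on the first class
pi2 = np.array([0.0, 0.0, 0.5, 0.5])  # stationary on the second class

# pi1, pi2, and any convex combination all satisfy pi P = pi.
for pi in (pi1, pi2, 0.3 * pi1 + 0.7 * pi2):
    print(np.allclose(pi @ P, pi))    # True, True, True
```

Since stationarity is preserved under convex combinations, every mixture $\alpha \pi_1 + (1 - \alpha) \pi_2$ is stationary, which is why a unique essential class is necessary for uniqueness of $\pi$.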