Transition Probabilities in a Markov Chain


(theorem)

In a Markov chain, the existence of stationary one-step transition probabilities implies that, for every $n = 0, 1, 2, \dots$ and every $t = 0, 1, 2, \dots$:

$$P(X_{t+n} = j \mid X_t = i) = P(X_n = j \mid X_0 = i)$$

We call these the $n$-step transition probabilities.

It can be handy to arrange these probabilities in a transition matrix, whose $(i, j)$ entry is the probability of moving from state $i$ to state $j$ in $n$ steps.
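To make this concrete, here is a minimal sketch in Python with NumPy; the two-state chain and its probabilities are invented purely for illustration. It uses the standard fact that the $n$-step probabilities are the entries of the $n$-th matrix power of the one-step matrix.

```python
import numpy as np

# One-step transition matrix of a hypothetical two-state chain:
# row i lists the probabilities of moving from state i to each state j.
P = np.array([
    [0.9, 0.1],  # from state 0: stay with prob. 0.9, jump to 1 with prob. 0.1
    [0.5, 0.5],  # from state 1: jump to 0 with prob. 0.5, stay with prob. 0.5
])

# n-step transition probabilities: entries of the n-th matrix power of P.
n = 3
P_n = np.linalg.matrix_power(P, n)

print(P_n)        # P_n[i, j] = probability of going from i to j in n steps
print(P_n[0, 1])  # e.g. the 3-step probability of moving from state 0 to 1
```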

Simpler Notation

To simplify notation, we’ll agree on:

$$p_{ij}^{(n)} = P(X_{t+n} = j \mid X_t = i)$$

Note that when $n = 1$, then $p_{ij}^{(1)} = p_{ij}$, the one-step transition probability.

Observe that, when $n = 0$, we have $p_{ij}^{(0)} = P(X_t = j \mid X_t = i)$, which is trivially $1$ when $i = j$ and $0$ otherwise.
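In matrix form, this says the 0-step transition matrix is the identity. A quick numerical check, again with an invented two-state matrix:

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])  # hypothetical one-step transition matrix

# n = 0: zero steps leave the chain where it is, so P^0 is the identity.
assert np.allclose(np.linalg.matrix_power(P, 0), np.eye(2))

# n = 1: the 1-step probabilities are the entries of P itself.
assert np.allclose(np.linalg.matrix_power(P, 1), P)
```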

Properties

(theorem)

Derived from the properties of a probability measure, we have that:

  • $p_{ij}^{(n)} \ge 0$, for every $i$, $j$ and $n = 0, 1, 2, \dots$
  • $\sum_j p_{ij}^{(n)} = 1$, for every $i$ and $n = 0, 1, 2, \dots$
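Together, these say that each row of the $n$-step transition matrix is a probability distribution over the states. A small sanity check on an invented three-state chain:

```python
import numpy as np

P = np.array([[0.2, 0.8, 0.0],
              [0.1, 0.6, 0.3],
              [0.0, 0.4, 0.6]])  # hypothetical three-state chain

for n in range(5):
    P_n = np.linalg.matrix_power(P, n)
    # Every n-step probability is nonnegative ...
    assert (P_n >= 0).all()
    # ... and each row sums to 1: in n steps the chain must land somewhere.
    assert np.allclose(P_n.sum(axis=1), 1.0)
```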