Topics: Markov Chain


In a Markov chain, we say a transition occurs when the process moves from one state to another. The probability of transitioning from one state to another is of special interest.

We can define one-step transition probabilities, which are the probabilities of moving from one state to another in a single step: p_ij = P(X_{t+1} = j | X_t = i).
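A minimal sketch of one-step transition probabilities, using a hypothetical two-state chain encoded as a matrix whose rows are conditional distributions (the states, labels, and numbers here are illustrative, not from the notes):

```python
import numpy as np

# Hypothetical two-state chain (states 0 and 1).
# Row i holds the conditional distribution of the next state given
# that the current state is i, so each row must sum to 1.
P = np.array([
    [0.7, 0.3],   # P(next=0 | current=0), P(next=1 | current=0)
    [0.4, 0.6],   # P(next=0 | current=1), P(next=1 | current=1)
])

# Sanity check: every row is a valid probability distribution.
assert np.allclose(P.sum(axis=1), 1.0)

# One-step transition probability p_01 = P(X_{t+1}=1 | X_t=0).
p_01 = P[0, 1]
```

Indexing the matrix as `P[i, j]` reads directly as "the probability of going from state i to state j in one step."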

We can also define stationary transition probabilities, which arise when the transition probabilities in a Markov chain do not change over time; in that case the same p_ij applies at every step.

The existence of stationary transition probabilities also allows us to define n-step transition probabilities: the probability of going from state i to state j in exactly n steps.
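Under stationarity, the n-step transition probabilities are the entries of the n-th matrix power of the one-step matrix. A sketch, reusing the hypothetical two-state chain from above, which also checks the sum-over-intermediate-states identity (Chapman–Kolmogorov) for n = 2:

```python
import numpy as np

# Same hypothetical one-step matrix as before.
P = np.array([
    [0.7, 0.3],
    [0.4, 0.6],
])

# n-step transition matrix: entry (i, j) of P^n is the probability
# of going from state i to state j in exactly n steps.
P2 = np.linalg.matrix_power(P, 2)

# Chapman-Kolmogorov check for p_01^(2): sum over the intermediate
# state k visited after the first step.
p2_01_by_hand = sum(P[0, k] * P[k, 1] for k in range(2))
assert np.isclose(P2[0, 1], p2_01_by_hand)
```

The matrix-power form is why stationarity matters here: without it, each step would need its own matrix and a simple power would not apply.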

All of these transition probabilities are conditional in the sense that they assume a specific present state. We can additionally obtain the unconditional probabilities: the probability that the process is in a given state at a given time, with no present state specified. These are obtained by combining a distribution over the initial state with the n-step transition probabilities.
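A sketch of the unconditional state probabilities, assuming (hypothetically) that the chain starts in each state with equal probability; the distribution after n steps is the initial row vector multiplied by the n-th power of the one-step matrix:

```python
import numpy as np

# Same hypothetical one-step matrix as in the earlier sketches.
P = np.array([
    [0.7, 0.3],
    [0.4, 0.6],
])

# Assumed initial distribution over states (not conditional on anything).
pi0 = np.array([0.5, 0.5])

# Unconditional distribution after one step: pi1[j] = sum_i pi0[i] * p_ij.
pi1 = pi0 @ P

# After n steps, multiply by P^n instead.
pi3 = pi0 @ np.linalg.matrix_power(P, 3)

# Each result is itself a probability distribution.
assert np.isclose(pi1.sum(), 1.0)
assert np.isclose(pi3.sum(), 1.0)
```

The conditional n-step probabilities answer "given that we are in state i now, where will we be?", while these unconditional probabilities answer "where will the process be?" with the present state averaged out over the initial distribution.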