Topics: Markov Chain - Transition Probabilities in a Markov Chain - n Step Transition Probability


(definition)

Given a Markov chain with $m$ total states, we can handily write its $n$-step transition probabilities in a transition matrix $P^{(n)}$, which is an $m \times m$ matrix of the form:

$$P^{(n)} = \begin{pmatrix} p^{(n)}_{11} & p^{(n)}_{12} & \cdots & p^{(n)}_{1m} \\ p^{(n)}_{21} & p^{(n)}_{22} & \cdots & p^{(n)}_{2m} \\ \vdots & \vdots & \ddots & \vdots \\ p^{(n)}_{m1} & p^{(n)}_{m2} & \cdots & p^{(n)}_{mm} \end{pmatrix}$$

That is, the $(i, j)$th element of the transition matrix is the transition probability $p^{(n)}_{ij}$ from state $i$ to state $j$ in $n$ steps.

$n$-step Transition Matrix
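As a minimal sketch of the definition, here is a one-step transition matrix for a made-up two-state chain (the states and probabilities are illustrative assumptions, not from the text), built with NumPy:

```python
import numpy as np

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# Row i holds the probabilities of moving from state i to each state j
# in one step, so every row must sum to 1.
P = np.array([
    [0.9, 0.1],  # sunny -> sunny, sunny -> rainy
    [0.5, 0.5],  # rainy -> sunny, rainy -> rainy
])

# The (i, j)th element is the one-step transition probability i -> j.
assert np.allclose(P.sum(axis=1), 1.0)
```

Each row is a probability distribution over the next state, which is why the rows must sum to one.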

(theorem)

From the Chapman-Kolmogorov equation it follows that:

$$P^{(n)} = P^n$$

That is, the $n$-step transition matrix is the $n$th power of the one-step transition matrix $P$: matrix exponentiation.
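The theorem can be checked numerically. This sketch (reusing the hypothetical two-state chain above; the numbers are assumptions for illustration) computes the $n$-step matrix as a matrix power and verifies a Chapman-Kolmogorov identity:

```python
import numpy as np

# Hypothetical one-step transition matrix (illustrative numbers).
P = np.array([
    [0.9, 0.1],
    [0.5, 0.5],
])

# n-step transition matrix via matrix exponentiation: P^(n) = P^n.
n = 3
P_n = np.linalg.matrix_power(P, n)

# Chapman-Kolmogorov in matrix form: P^(m+n) = P^(m) P^(n),
# checked here with m = 1, n = 2.
assert np.allclose(P_n, P @ np.linalg.matrix_power(P, 2))

# Rows of P^(n) are still probability distributions.
assert np.allclose(P_n.sum(axis=1), 1.0)
```

`np.linalg.matrix_power` repeatedly squares the matrix, so it is both a direct transcription of the theorem and an efficient way to compute $n$-step probabilities for large $n$.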