Topics: Transition Probabilities in a Markov Chain - Markov Chain (definition)
Given a Markov chain $\{X_t,\ t = 0, 1, 2, \ldots\}$ and its one-step transition probabilities, if, for every pair of states $i, j$ and every time $t = 0, 1, 2, \ldots$, we have that:

$$P(X_{t+1} = j \mid X_t = i) = P(X_1 = j \mid X_0 = i) = p_{ij}$$
…then we say the one-step transition probabilities are stationary. That is, the transition probabilities do not change across time: the probability of moving from state $i$ to state $j$ in one step is the same value $p_{ij}$ no matter when the step occurs.
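As a concrete illustration, here is a minimal sketch in Python (the two-state chain, its matrix `P`, and the helper `simulate` are hypothetical examples, not from the original text) showing what stationarity means in practice: the same one-step transition matrix is applied at every time step.

```python
import numpy as np

# Hypothetical two-state chain (states 0 and 1) with stationary
# one-step transition probabilities: row i gives P(X_{t+1} = j | X_t = i).
P = np.array([
    [0.9, 0.1],  # from state 0: stay with prob 0.9, leave with prob 0.1
    [0.5, 0.5],  # from state 1: each destination with prob 0.5
])

rng = np.random.default_rng(0)

def simulate(p, start, steps):
    """Simulate the chain. The same matrix p is used at every step,
    which is exactly what stationary transition probabilities mean."""
    state = start
    path = [state]
    for _ in range(steps):
        # P(X_{t+1} = j | X_t = i) = p[i][j], independent of t
        state = rng.choice(len(p), p=p[state])
        path.append(int(state))
    return path

print(simulate(P, start=0, steps=10))
```

If the probabilities were not stationary, the matrix would have to depend on the time index $t$, say `P[t]`, and each step of the loop would draw from a different distribution.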