Topics: Regular Markov Chain - Markov Chain
In a regular Markov chain, it happens that, regardless of the initial state, the probability of being in a given state approaches a constant once the number of steps is sufficiently high.
(fundamental theorem)
For every regular Markov chain, the following limit (of the $n$-step transition probabilities) exists and is the same regardless of the initial state $i$:

$$\lim_{n \to \infty} p_{ij}^{(n)} = \pi_j$$
Furthermore, we have that $\pi_j > 0$ and the following steady-state equations are satisfied:
- $\pi_j = \sum_{i=0}^{M} \pi_i \, p_{ij}$, for $j = 0, 1, \dots, M$
- $\sum_{j=0}^{M} \pi_j = 1$
The values $\pi_j$ are called steady-state probabilities, while the vector $\boldsymbol{\pi} = (\pi_0, \pi_1, \dots, \pi_M)$ that they form is called the stationary distribution vector.
In other words, the probability of being in state $j$ tends to $\pi_j$ as $n \to \infty$, regardless of the initial state.
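As a quick illustration (a small hypothetical two-state chain, not part of the original notes), consider the transition matrix

$$\mathbf{P} = \begin{pmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{pmatrix}.$$

The steady-state equations $\pi_0 = 0.9\,\pi_0 + 0.5\,\pi_1$ and $\pi_0 + \pi_1 = 1$ give $\pi_0 = 5/6$ and $\pi_1 = 1/6$: after enough steps, the chain is in state $0$ about $83\%$ of the time, no matter where it started.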
Calculation of Steady State Probabilities
(observation)
We can calculate all of the $\pi_j$ in a given chain by formulating all of the corresponding steady-state equations and then solving the resulting equation system.
Notice that this resulting equation system has $M+2$ equations (the $M+1$ steady-state equations plus the normalization equation) but only $M+1$ unknowns $\pi_0, \dots, \pi_M$. Since this system has a unique solution, one of its equations must be redundant.
Thus, we may find it easier to solve the system that consists of all equations but one (the redundant one). Note that the equation establishing that all $\pi_j$ must add up to $1$ is never redundant, so it can never be the one discarded.
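As a sketch of this procedure (using NumPy and the hypothetical two-state matrix from the example above, not part of the original notes), we can drop one redundant steady-state equation, keep the normalization equation, and solve the resulting square system:

```python
import numpy as np

# Hypothetical 2-state regular chain (same matrix as the worked example above).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
n_states = P.shape[0]

# Steady-state equations: pi = pi P, i.e. (P^T - I) pi = 0.
# One of these equations is redundant, so we overwrite one row with the
# normalization equation sum(pi) = 1 to obtain a square, nonsingular system.
A = P.T - np.eye(n_states)
A[-1, :] = 1.0
b = np.zeros(n_states)
b[-1] = 1.0

pi = np.linalg.solve(A, b)
print(pi)  # approximately [0.8333, 0.1667], i.e. (5/6, 1/6)
```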
Transient States Tend to 0
(corollary)
If $j$ is a transient state, then:

$$\lim_{n \to \infty} p_{ij}^{(n)} = 0 \quad \text{for every initial state } i$$
…since, the more steps we take, the more likely it is that the chain has left the (transient) set containing $j$, after which it can never return to it.
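To see this numerically (a hypothetical three-state chain, not from the original notes, in which state 0 is transient because the chain can never re-enter it once it leaves), the column of the $n$-step transition matrix corresponding to the transient state shrinks toward 0:

```python
import numpy as np

# Hypothetical chain where state 0 is transient: once left, never re-entered.
P = np.array([[0.5, 0.25, 0.25],
              [0.0, 0.70, 0.30],
              [0.0, 0.40, 0.60]])

for n in (1, 5, 20):
    # Column 0 of P^n holds the probabilities p_{i0}^(n) of being in state 0.
    print(n, np.linalg.matrix_power(P, n)[:, 0])
# Here p_{00}^(n) = 0.5**n -> 0, and p_{10}^(n) = p_{20}^(n) = 0 for every n.
```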
Limit of the Transition Matrix
(corollary)
From the previous theorem, it's also possible to observe that, given the $n$-step transition matrix $\mathbf{P}^{(n)} = \mathbf{P}^{n}$ of the regular Markov chain:

$$\lim_{n \to \infty} \mathbf{P}^{n} = \begin{pmatrix} \pi_0 & \pi_1 & \cdots & \pi_M \\ \pi_0 & \pi_1 & \cdots & \pi_M \\ \vdots & \vdots & \ddots & \vdots \\ \pi_0 & \pi_1 & \cdots & \pi_M \end{pmatrix}$$

that is, every row of the limit matrix equals the stationary distribution vector $\boldsymbol{\pi}$.
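A quick numerical check (again with the hypothetical two-state matrix used in the earlier examples, not part of the original notes) shows the rows of $\mathbf{P}^n$ converging to the stationary distribution vector:

```python
import numpy as np

# Same hypothetical 2-state regular chain as in the earlier examples.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# For large n, every row of P^n is (approximately) the stationary vector pi.
print(np.linalg.matrix_power(P, 50))
# [[0.8333... 0.1666...]
#  [0.8333... 0.1666...]]
```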