Topics: Markov Chain - Ergodic Markov Chain
(definition)
A regular Markov chain is an ergodic Markov chain that is not cyclic (i.e. it is aperiodic). Equivalently, a chain with transition matrix P is regular if some power P^n has all strictly positive entries.
In a regular Markov chain, the probability of being in a given state converges to a constant value as the number of steps grows, regardless of the initial state. As such, regular Markov chains are characterised by a stationary distribution: a probability vector pi satisfying pi P = pi, whose entries are these constant steady-state probabilities.
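To make this concrete, here is a minimal sketch in Python using NumPy, with a made-up 3-state transition matrix (an illustrative assumption, not from the notes). It finds the stationary distribution as a left eigenvector of P for eigenvalue 1 and checks that the rows of P^n all converge to it, as regularity predicts.

```python
import numpy as np

# Hypothetical 3-state chain. Every entry of P is positive, so P^1
# already has all strictly positive entries and the chain is regular.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])  # row-stochastic: each row sums to 1

# The stationary distribution pi satisfies pi P = pi, i.e. pi is a
# left eigenvector of P (a right eigenvector of P^T) for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmax(np.isclose(eigvals, 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()  # normalise so the entries sum to 1

print("stationary distribution:", pi)

# Regularity implies every row of P^n converges to pi,
# no matter which state the chain starts in.
print(np.linalg.matrix_power(P, 50))  # each row is approximately pi
```

Using the eigenvector of P^T is one standard way to compute pi; repeated squaring of P (power iteration) would give the same answer and also illustrates the convergence directly.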