Topics: Ergodic Markov Chain - Markov Chain
(definition)
A cyclic Markov chain is an ergodic Markov chain in which each state can only be entered at fixed periodic intervals (every d steps for some period d > 1).
An ergodic Markov chain that is not cyclic is called regular: some power of its transition matrix has all strictly positive entries.
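The distinction can be checked numerically: a chain is regular exactly when some power of its transition matrix is strictly positive, while a cyclic chain never reaches an all-positive power. A minimal sketch, using two hypothetical 2-state matrices chosen for illustration:

```python
import numpy as np

# Cyclic chain (period 2): the chain alternates 0 -> 1 -> 0 -> ...,
# so each state is only re-entered at even steps.
P_cyclic = np.array([[0.0, 1.0],
                     [1.0, 0.0]])

# Regular chain: ergodic and aperiodic (the self-loop at state 0
# breaks the periodicity).
P_regular = np.array([[0.5, 0.5],
                      [1.0, 0.0]])

def is_regular(P, max_power=50):
    """True if some power P^k has all strictly positive entries."""
    Q = np.eye(len(P))
    for _ in range(max_power):
        Q = Q @ P
        if np.all(Q > 0):
            return True
    return False

print(is_regular(P_cyclic))   # False: powers alternate between P and I
print(is_regular(P_regular))  # True: P^2 is already all-positive
```

For `P_cyclic`, the powers cycle between the identity and P itself, so a zero entry always remains; for `P_regular`, squaring the matrix already yields all-positive entries.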