Topics: State Set - Stochastic Process


(definition)

Given a Markov chain with state set S, let E be a subset of S and let E' be its complement in S.

If each state in E can be reached from any other state in E, but no state in E' can be reached from any state in E, then we say that E is an ergodic set (or recurrent set).

Thus, in a given ergodic set, all states communicate with each other. Once an ergodic set has been reached, it is no longer possible to leave it, but it is still possible to move among the states of that set.

An ergodic state is an element of an ergodic set. A state that is not ergodic is called transient. An ergodic set consisting of a single state is said to be absorbing.
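As an illustration (a sketch of mine, not part of the original text), the classification above can be computed for a finite chain directly from its transition matrix: find the communicating classes, then keep the closed ones as the ergodic sets. The function name `ergodic_sets` and the example matrix are my own choices.

```python
def ergodic_sets(P, tol=1e-12):
    """Split the states of a finite Markov chain into ergodic sets and
    transient states, given its row-stochastic transition matrix P."""
    n = len(P)
    # Directed transition graph: an edge i -> j iff P[i][j] > 0.
    adj = [[j for j in range(n) if P[i][j] > tol] for i in range(n)]

    def reachable(i):
        # States reachable from i (including i itself), by depth-first search.
        seen, stack = {i}, [i]
        while stack:
            for v in adj[stack.pop()]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen

    reach = [reachable(i) for i in range(n)]

    # Communicating classes: i and j communicate iff each reaches the other.
    classes, assigned = [], set()
    for i in range(n):
        if i not in assigned:
            cls = frozenset(j for j in reach[i] if i in reach[j])
            classes.append(cls)
            assigned |= cls

    # A class is ergodic (recurrent) iff it is closed: from inside it,
    # no state outside it can be reached.
    ergodic = [c for c in classes if all(reach[i] <= c for i in c)]
    transient = sorted(set(range(n)) - set().union(*ergodic))
    return ergodic, transient


# Example chain: {1, 2} is an ergodic pair, state 3 is absorbing
# (a singleton ergodic set), and states 0 and 4 are transient.
P = [[0.5, 0.25, 0.0, 0.25, 0.0],
     [0.0, 0.0,  1.0, 0.0,  0.0],
     [0.0, 1.0,  0.0, 0.0,  0.0],
     [0.0, 0.0,  0.0, 1.0,  0.0],
     [0.5, 0.0,  0.0, 0.0,  0.5]]
sets_, transient = ergodic_sets(P)
# sets_ -> [frozenset({1, 2}), frozenset({3})], transient -> [0, 4]
```

Note that the classification depends only on which entries of P are nonzero, not on their exact values, since reachability is a property of the transition graph.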

Every finite Markov chain has at least one ergodic set, and it may have several. By contrast, a Markov chain may have no transient states at all.
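To make the last remark concrete (an illustrative example of mine, not from the original text): the two-state chain that deterministically swaps its states has the whole state set as its single ergodic set, so neither state is transient.

```python
# A two-state chain that deterministically alternates between its states.
P = [[0.0, 1.0],
     [1.0, 0.0]]

n = len(P)
# One-step reachability suffices here: each state moves directly to the other.
reaches = [[P[i][j] > 0 or i == j for j in range(n)] for i in range(n)]

# All states communicate, and no state lies outside {0, 1}: the whole
# state set is one ergodic set, hence there are no transient states.
all_communicate = all(reaches[i][j] for i in range(n) for j in range(n))
```

Any permutation-style chain behaves the same way: since every state is revisited forever, recurrence holds everywhere and the transient part is empty.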