Topics: State Set - Stochastic Process


(definition)

Let $T$ be a subset of the state set $S$ and let $T^c = S \setminus T$ be its complement in $S$.

If each state in $T$ can be reached from any other state in $T$, and it is possible to move from at least one state in $T$ to a state in $T^c$, then we call $T$ a transitory set (also called a transient set).

A transitory state is an element of a transitory set. A state that is not transitory is ergodic.

In simpler terms, a transitory state is one from which there is a positive probability of leaving and never coming back (for instance, once the chain reaches an ergodic set, it stays there).

A Markov chain may have no transitory states at all. By contrast, a finite Markov chain must have at least one ergodic set.
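
To make the classification concrete, here is a minimal Python sketch; the 4-state chain and its transition matrix `P` are made-up illustration data, not taken from the source. A state is ergodic exactly when every state reachable from it can reach it back; otherwise the chain can leave it and never return, making it transitory.

```python
from itertools import product

# Hypothetical 4-state chain: states 0 and 1 communicate and can leak
# into {2, 3}; states 2 and 3 form a closed, communicating (ergodic) set.
P = [
    [0.5, 0.5, 0.0, 0.0],
    [0.4, 0.3, 0.3, 0.0],
    [0.0, 0.0, 0.2, 0.8],
    [0.0, 0.0, 0.6, 0.4],
]
n = len(P)

# reachable[i][j]: state j can be reached from state i in zero or more
# steps (transitive closure of the transition graph, Floyd-Warshall style).
reachable = [[i == j or P[i][j] > 0 for j in range(n)] for i in range(n)]
for k, i, j in product(range(n), repeat=3):  # k varies slowest, as required
    if reachable[i][k] and reachable[k][j]:
        reachable[i][j] = True

for i in range(n):
    # Ergodic iff every state reachable from i can also reach i back.
    ergodic = all(reachable[j][i] for j in range(n) if reachable[i][j])
    print(f"state {i}: {'ergodic' if ergodic else 'transitory'}")
```

Running it prints `transitory` for states 0 and 1 and `ergodic` for states 2 and 3, matching the remark above: this chain happens to have a transitory set, and it also has (at least) one ergodic set.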