Topics: Markov Chain - Absorbing Set


(definition)

In a Markov chain, given an absorbing state $k$, the probability of eventually reaching $k$ when we start from another state $i$ is called the probability of absorption to $k$. This probability is denoted by $a_{ik}$.
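For example (a small illustrative chain, not one taken from these notes): in a gambler's-ruin chain on states $\{0, 1, 2, 3\}$ where $0$ and $3$ are absorbing and the chain moves one step up or down with probability $1/2$ from the interior states, $a_{13}$ is the probability that, starting from state $1$, the chain eventually gets stuck in state $3$ rather than in state $0$.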

(theorem)

The probability of absorption to $k$ when starting at $i$ is given by:

$$a_{ik} = \sum_{j} p_{ij}\, a_{jk}$$

…where $a_{kk} = 1$.
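As a minimal numerical sketch of this theorem (the gambler's-ruin chain, the matrix $P$ below, and the grouping of the equations into blocks $Q$ and $R$ are assumptions made for illustration, not part of these notes), the first-step equations $a_{ik} = \sum_j p_{ij}\, a_{jk}$ over the transient states can be collected into the linear system $(I - Q)B = R$ and solved directly:

```python
import numpy as np

# Gambler's-ruin chain on states {0, 1, 2, 3} (illustrative assumption):
# states 0 and 3 are absorbing; from 1 and 2 the chain moves one step
# up or down with probability 1/2 each.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],
])

absorbing = [i for i in range(len(P)) if P[i, i] == 1.0]
transient = [i for i in range(len(P)) if i not in absorbing]

# First-step analysis in matrix form: Q is the transient-to-transient block,
# R the transient-to-absorbing block, and the absorption probabilities
# B[row, col] solve (I - Q) B = R.
Q = P[np.ix_(transient, transient)]
R = P[np.ix_(transient, absorbing)]
B = np.linalg.solve(np.eye(len(transient)) - Q, R)

for row, i in enumerate(transient):
    for col, k in enumerate(absorbing):
        print(f"a_{{{i}{k}}} = {B[row, col]:.3f}")
# Prints a_{10} = 0.667, a_{13} = 0.333, a_{20} = 0.333, a_{23} = 0.667.
```

Here $a_{13} = 1/3$ agrees with the classical gambler's-ruin answer for a fair game starting one unit away from ruin out of three.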

(theorem)

Let $k_1, k_2, \ldots, k_m$ be all absorbing states in a Markov chain. Then, for a fixed state $i$, it follows that:

$$\sum_{r=1}^{m} a_{i k_r} = 1$$

In other words, the result of adding up all possible absorption probabilities when starting at a given state is simply $1$.
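In the gambler's-ruin sketch above, for instance, $a_{10} + a_{13} = 2/3 + 1/3 = 1$ and $a_{20} + a_{23} = 1/3 + 2/3 = 1$, so the absorption probabilities from each starting state indeed sum to $1$.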

(observation)

Note that we’ll have $a_{jk} = 0$ if the state $j$ is absorbing and different from $k$: once the chain enters $j$, it stays there forever, so it can never reach $k$.