Topics: Absorbing Set - Markov Chain
(definition)
A Markov chain in which every non-transient state is absorbing (a state i is absorbing if p_ii = 1, so the chain can never leave it) is called an absorbing Markov chain. Equivalently, the chain has at least one absorbing state, and from every state an absorbing state is reachable.
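The definition can be checked mechanically for a finite chain. Below is a minimal sketch (not from the source) that tests whether a row-stochastic transition matrix `P` describes an absorbing chain: it looks for states with a self-transition probability of 1, then verifies every state can reach one of them via a Warshall-style transitive closure.

```python
import numpy as np

def is_absorbing_chain(P, tol=1e-12):
    """Return True if the finite Markov chain with transition matrix P
    is absorbing: at least one absorbing state exists, and every state
    can reach some absorbing state."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    # Absorbing states have p_ii = 1 (within tolerance).
    absorbing = [i for i in range(n) if abs(P[i, i] - 1.0) < tol]
    if not absorbing:
        return False
    # One-step reachability, then transitive closure (Floyd-Warshall style).
    reach = P > tol
    for k in range(n):
        reach = reach | (reach[:, [k]] & reach[[k], :])
    return all(any(reach[i, a] or i == a for a in absorbing)
               for i in range(n))

# Gambler's-ruin example: states 0 and 3 are absorbing,
# states 1 and 2 are transient but can reach them.
P = [[1.0, 0.0, 0.0, 0.0],
     [0.5, 0.0, 0.5, 0.0],
     [0.0, 0.5, 0.0, 0.5],
     [0.0, 0.0, 0.0, 1.0]]
print(is_absorbing_chain(P))  # True
```

A chain with no absorbing state, such as a two-state cycle, fails the test even though every state is recurrent.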