Topics: Markov Chain - Stationary Process - Stationary Transition Probability
(definition)
A finite and stationary Markov chain is a Markov chain that has a finite state set and whose transition probabilities are stationary, i.e. they do not depend on the time step.
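As a minimal formal statement of stationarity (the notation here is assumed, not taken from the source), the one-step transition probabilities are the same at every time index:

```latex
% Stationary transition probabilities: the probability of moving
% from state i to state j in one step does not depend on n.
\Pr(X_{n+1} = j \mid X_n = i) = p_{ij}
\quad \text{for all } n \ge 0 \text{ and all states } i, j.
```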
A given finite and stationary Markov chain is fully characterised by three components (illustrated in the sketch after this list):
- Its finite state set
- Its one-step transition matrix
- Its initial probability vector
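To make the three components concrete, here is a minimal sketch of a two-state finite, stationary Markov chain in Python with NumPy. The state names, probability values, and variable names are illustrative assumptions, not taken from the source.

```python
import numpy as np

# Hypothetical two-state chain: finite state set, one-step transition
# matrix, and initial probability vector (all values illustrative).
states = ["sunny", "rainy"]

# P[i, j] = probability of moving from state i to state j in one step;
# each row sums to 1, so P is a stochastic matrix. Because the chain is
# stationary, the same P applies at every time step.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Initial probability vector: start in "sunny" with certainty.
pi0 = np.array([1.0, 0.0])

# With stationary transition probabilities, the distribution after
# n steps is pi0 @ P^n.
n = 3
pi_n = pi0 @ np.linalg.matrix_power(P, n)
print(dict(zip(states, pi_n)))
```

The three objects in the sketch correspond one-to-one with the list above: `states` is the finite state set, `P` the one-step transition matrix, and `pi0` the initial probability vector.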