Topics: Types of Stochastic Processes
(definition)
A Markov process is a stochastic process satisfying the Markov property: given the present state of the system, the future states are independent of the past states. In other words, only the present state influences what happens next; how the system arrived at that state is irrelevant.
Example
Each day, we record the mood of a person (happy, sad, indifferent, etc.). Today's mood depends only on yesterday's mood.
Each random variable in the sequence is therefore conditioned only on the immediately preceding state.
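The mood example can be sketched as a small Markov chain. The transition probabilities below are made-up values for illustration; each row gives the distribution of tomorrow's mood conditioned only on today's mood, which is exactly the Markov property:

```python
import random

# Hypothetical transition probabilities (not from the source):
# row = today's mood, columns = tomorrow's mood.
TRANSITIONS = {
    "happy":       {"happy": 0.6, "sad": 0.1, "indifferent": 0.3},
    "sad":         {"happy": 0.3, "sad": 0.4, "indifferent": 0.3},
    "indifferent": {"happy": 0.4, "sad": 0.2, "indifferent": 0.4},
}

def next_mood(today, rng):
    """Sample tomorrow's mood using only today's mood (Markov property)."""
    moods = list(TRANSITIONS[today])
    weights = [TRANSITIONS[today][m] for m in moods]
    return rng.choices(moods, weights=weights)[0]

def simulate(start, days, seed=0):
    """Simulate a sequence of moods starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(days):
        chain.append(next_mood(chain[-1], rng))
    return chain

print(simulate("happy", 7))
```

Note that `next_mood` receives only the current state: the rest of the history never enters the computation, which is the defining feature of the process.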