Topics: Types of Stochastic Processes


(definition)

A Markov process is a stochastic process with the Markov property: given the present state of the system, the future states are independent of the past states. Only the present state influences the immediate future state; how the system arrived at the present state does not matter.
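
In symbols, for a discrete-time process X_0, X_1, X_2, ... (the discrete-time case is one common setting; continuous-time versions exist as well), the Markov property reads:

P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n)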

The distribution of each variable is therefore conditioned only on the immediately preceding state.
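
To make the definition concrete, here is a minimal simulation sketch of a discrete-time Markov chain in Python. The two-state "weather" chain, its state names, and its transition probabilities are illustrative assumptions, not taken from the text; the point is that each step is sampled using only the current state.

import random

# Illustrative (assumed) transition probabilities: P(next state | current state).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (Markov property)."""
    states = list(TRANSITIONS[current].keys())
    probs = list(TRANSITIONS[current].values())
    return random.choices(states, weights=probs, k=1)[0]

def simulate(start, steps):
    """Generate a trajectory of the chain."""
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 10))

Note that simulate keeps the full path only for display; next_state never reads it, which is exactly the Markov property in action.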