Topics: Stochastic Processes


(definition)

Let $\{X_n : n \in \mathbb{N}\}$ be a stochastic process with $\mathbb{Z}$ as its state set. Note that it has a discrete parameter set $\mathbb{N}$.

Let $p, q, r \geq 0$ with $p + q + r = 1$, and let, for any $n \in \mathbb{N}$ and any $i \in \mathbb{Z}$:

$$P(X_{n+1} = i + 1 \mid X_n = i) = p, \qquad P(X_{n+1} = i - 1 \mid X_n = i) = q, \qquad P(X_{n+1} = i \mid X_n = i) = r$$

We call such a stochastic process a random walk. When $r = 0$ (equivalently, $p + q = 1$), we call it a simple random walk.

Note that these probabilities do not depend on the specific time $n$; as such, we say that they are homogeneous across time. A simple random walk is a Markov process.
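The one-step transition can be sketched as a small sampler. This is an illustrative sketch, not part of the notes: the function name `step` and the concrete probability values are assumptions; it follows the convention above of moving up with probability $p$, down with probability $q$, and staying put otherwise.

```python
import random

def step(i, p, q, r):
    """Sample X_{n+1} given X_n = i.

    Time-homogeneity: this rule is the same for every n.
    Moves +1 with probability p, -1 with probability q,
    and stays at i with probability r (assumes p + q + r == 1).
    """
    u = random.random()
    if u < p:
        return i + 1
    elif u < p + q:
        return i - 1
    return i
```

Because the next state depends only on the current state $i$ (and not on how the walk got there), repeated calls to `step` realize a Markov process.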

We can deduce several formulae that make it easier to obtain the transition probabilities in a random walk.

Formal Definition

(definition)

Let $\{Z_i : i \geq 1\}$ be a sequence of iid random variables such that, for any $i \geq 1$:

$$P(Z_i = 1) = p, \qquad P(Z_i = -1) = q, \qquad P(Z_i = 0) = r$$

…where $p + q + r = 1$.

With that, we define a random walk as the stochastic process $\{X_n : n \in \mathbb{N}\}$ where $X_0 = 0$ and, for $n \geq 1$:

$$X_n = Z_1 + Z_2 + \cdots + Z_n$$

Recall that when $r = 0$, we call it a simple random walk.
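The partial-sum construction above translates directly into code. The sketch below is illustrative (the function name `random_walk` and its parameters are assumptions, not from the notes): it draws iid steps $Z_i$ and accumulates them, starting from $X_0 = 0$.

```python
import random

def random_walk(n, p, q, seed=None):
    """Return the trajectory X_0, X_1, ..., X_n as a list.

    Each iid step Z_i is +1 with probability p, -1 with probability q,
    and 0 with probability r = 1 - p - q, matching the formal definition.
    """
    rng = random.Random(seed)
    x = [0]  # X_0 = 0
    for _ in range(n):
        u = rng.random()
        z = 1 if u < p else (-1 if u < p + q else 0)
        x.append(x[-1] + z)  # X_n = X_{n-1} + Z_n
    return x
```

Setting $q = 1 - p$ forces $r = 0$, so every increment is $\pm 1$ and the trajectory is a simple random walk.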

Symmetrical and Asymmetrical Walks

(definition)

Recall that $p + q + r = 1$. When $p = q$, we say that the random walk is symmetrical. When $p \neq q$, we say it is asymmetrical.

Expected Value and Variance

(theorem)

For any integer $n \geq 1$, we have:

$$E(X_n) = n(p - q), \qquad \operatorname{Var}(X_n) = n\left[(p + q) - (p - q)^2\right]$$

This follows from writing $X_n = \sum_{i=1}^{n} Z_i$, since $E(Z_i) = p - q$ and $E(Z_i^2) = p + q$, and the $Z_i$ are independent.
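Both formulas can be sanity-checked by simulation. The sketch below is illustrative (the function name, trial count, and the values $p = 0.5$, $q = 0.3$, $n = 10$ are assumptions): it estimates $E(X_n)$ and $\operatorname{Var}(X_n)$ from repeated walks and compares them with $n(p - q)$ and $n[(p + q) - (p - q)^2]$.

```python
import random

def empirical_mean_var(n, p, q, trials=20000, seed=42):
    """Monte Carlo estimates of E(X_n) and Var(X_n).

    Steps follow the same convention as the definition:
    +1 w.p. p, -1 w.p. q, 0 w.p. r = 1 - p - q.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(trials):
        x = 0
        for _ in range(n):
            u = rng.random()
            x += 1 if u < p else (-1 if u < p + q else 0)
        samples.append(x)
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var
```

For example, with $p = 0.5$, $q = 0.3$, $n = 10$, the theorem predicts $E(X_{10}) = 10(0.2) = 2$ and $\operatorname{Var}(X_{10}) = 10(0.8 - 0.04) = 7.6$; the estimates should land close to both.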