Unit 1: Events and Probability

Experiments, Sample Spaces and Events

A random experiment, denoted $\varepsilon$, is something we do that can produce various results and whose result cannot be predicted in advance.

A sample space, denoted $\Omega$, is the set that contains all the possible results of an experiment. Sample spaces are called espacios muestrales in Spanish.

An event, denoted by an uppercase letter of the Latin alphabet (e.g. $A \subseteq \Omega$), is a subset of a sample space.

Frequency and Probability

We can define the relative frequency and the probability of a given event: the relative frequency is the proportion of repetitions of the experiment in which the event occurs, while the probability, which is of special importance in this subject (as the name suggests), can be interpreted as the value that relative frequency stabilises around as the number of repetitions grows.
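
A minimal sketch of that stabilisation, assuming a simulated fair coin (so the probability of heads is 0.5):

```python
import random

def relative_frequency(n_trials, p_heads=0.5, seed=42):
    """Relative frequency of 'heads' over n_trials simulated coin flips."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(n_trials) if rng.random() < p_heads)
    return heads / n_trials

for n in (10, 100, 10_000, 1_000_000):
    print(n, relative_frequency(n))  # the frequencies approach 0.5 as n grows
```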

σ-algebras and Probability Measures

σ-algebras (denoted $\mathcal{F}$) are families of subsets of the sample space that help us define probability measures: functions (denoted $P$) that let us measure the probability of a given event $A \in \mathcal{F}$.

Probability measures are also of special importance, which is why it is important to know their properties.
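
For reference, these are the standard (Kolmogorov) axioms that a probability measure satisfies, followed by two of the properties that follow from them (using the notation reconstructed above):

$$P(A) \ge 0 \;\text{ for all } A \in \mathcal{F}, \qquad P(\Omega) = 1, \qquad P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr) = \sum_{i=1}^{\infty} P(A_i) \;\text{ for pairwise disjoint } A_i,$$

$$P(A^c) = 1 - P(A), \qquad P(A \cup B) = P(A) + P(B) - P(A \cap B).$$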

Combinations and Permutations

When selecting items from a set, we can have:

  • A combination, when we don’t care about the order in which we select them.
  • A permutation, when we do care about the order in which we select them (the sketch after this list contrasts the two).
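
A quick sketch of the difference, using Python’s itertools on a small hypothetical three-element set:

```python
from itertools import combinations, permutations

items = ["a", "b", "c"]

# Combinations: order ignored, so ('a', 'b') and ('b', 'a') count as one selection.
print(list(combinations(items, 2)))   # 3 selections

# Permutations: order matters, so ('a', 'b') and ('b', 'a') are different selections.
print(list(permutations(items, 2)))   # 6 ordered selections
```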

Principles of Counting

When selecting one item from either of two disjoint sets, the number of ways in which we can do so can be calculated by adding the cardinalities of both sets, according to the addition principle.

On the other hand, when selecting one item from one set and one item from another, the number of ways in which we can do so can be calculated by multiplying the cardinalities of both sets, according to the multiplication principle.
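
A small numerical sketch of both principles, using a hypothetical menu whose starters and mains form two disjoint sets:

```python
starters = {"soup", "salad", "bruschetta"}      # set A
mains = {"pasta", "steak", "fish", "risotto"}   # set B, disjoint from A

# Addition principle: choose one dish from A *or* from B.
ways_one_dish = len(starters) + len(mains)      # 3 + 4 = 7

# Multiplication principle: choose one starter *and* one main.
ways_two_courses = len(starters) * len(mains)   # 3 * 4 = 12

print(ways_one_dish, ways_two_courses)
```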

Number of Combinations and Permutations

The number of permutations we can form depends on whether we have permutations with no repetitions, with repetitions, or with groups of equal elements.

The number of combinations is, in fact, the number of permutations with no repetitions divided by $k!$, the number of ways of ordering the $k$ selected elements.
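
In symbols, and checked numerically with Python’s math module for the hypothetical values $n = 5$, $k = 3$:

$$\binom{n}{k} = \frac{P(n, k)}{k!} = \frac{n!}{k!\,(n - k)!}$$

```python
from math import comb, perm, factorial

n, k = 5, 3
print(perm(n, k))                  # 60: ordered selections of k out of n, no repetition
print(perm(n, k) // factorial(k))  # 10: divide out the k! orderings of each selection
print(comb(n, k))                  # 10: the same value, computed directly
```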

More Complex Events

When calculating the probability of an event $A$ given the occurrence of another event $B$, we say we have a conditional probability, written $P(A \mid B)$.

We can express the probability of the intersection of various events as the product of several conditional probabilities. This is known as the product rule.

When the occurrence of an event $A$ doesn’t affect the occurrence of another event $B$, we say that $A$ and $B$ are independent. In that case, it also holds that $B$ doesn’t affect $A$.
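
In symbols (using the standard definitions, with $P(B) > 0$ wherever an event is conditioned on):

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}, \qquad P(A_1 \cap A_2 \cap A_3) = P(A_1)\,P(A_2 \mid A_1)\,P(A_3 \mid A_1 \cap A_2),$$

$$A \text{ and } B \text{ are independent} \iff P(A \cap B) = P(A)\,P(B).$$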

Partitions, Total Probability and Bayes’ Theorem

We can partition a sample space into disjoint events whose union is the whole space. From such a partition, we can obtain:

  • The total probability formula, which helps us obtain the probability of any event in the sample space given a partition of the sample space.

  • Bayes’ theorem, which helps us obtain the probability of an event given the probability of other related events (see the numerical sketch after this list).
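
A small numerical sketch of both results, assuming a hypothetical factory where two machines partition the sample space and $A$ is the event “the produced item is defective”:

```python
# P(B_i): probability of each block of the partition (hypothetical values).
p_machine = {"machine_1": 0.6, "machine_2": 0.4}
# P(A | B_i): probability of a defect given which machine produced the item.
p_defect_given = {"machine_1": 0.02, "machine_2": 0.05}

# Total probability: P(A) = sum over i of P(A | B_i) * P(B_i).
p_defect = sum(p_defect_given[m] * p_machine[m] for m in p_machine)

# Bayes' theorem: P(B_1 | A) = P(A | B_1) * P(B_1) / P(A).
p_m1_given_defect = p_defect_given["machine_1"] * p_machine["machine_1"] / p_defect

print(p_defect)           # 0.032
print(p_m1_given_defect)  # 0.375
```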

Unit 2: Random Variables

There are various useful theorems that describe how a probability measure behaves with respect to limits of sequences of events (for example, its continuity for increasing or decreasing sequences of events).

Random Variables

A random variable is a function $X : \Omega \to \mathbb{R}$ that assigns a real number to each outcome of an experiment, allowing us to mathematically formalise the concept of randomness. Random variables basically “translate” randomness into mathematical terms we can manipulate and study.

There are two types of random variables: discrete ones, which take values in a finite or countable set, and continuous ones, which take values over intervals of the real line.

An indicator random variable is simply a random variable that tells us whether the outcome of the experiment belongs to a given event, taking the value 1 if it does and 0 otherwise.
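
A minimal sketch of both ideas for a hypothetical die-roll experiment with $\Omega = \{1, \dots, 6\}$:

```python
import random

omega = range(1, 7)              # sample space of a die roll

def X(outcome):                  # a random variable: a function from outcomes to real numbers
    return outcome ** 2          # e.g. the square of the face shown

def indicator_even(outcome):     # indicator random variable of the event "the face is even"
    return 1 if outcome % 2 == 0 else 0

outcome = random.choice(omega)
print(outcome, X(outcome), indicator_even(outcome))
```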

Density Functions

A density function is a function that allows us to specify the probability of the random variable taking a specific value (in the discrete case, where it is often called a probability mass function) or of it falling within a particular range of values (in the continuous case, though indirectly, via integration).
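
In symbols, using the usual notation for a discrete mass function $p_X$ and a continuous density $f_X$:

$$p_X(x) = P(X = x), \qquad P(a \le X \le b) = \int_a^b f_X(x)\,dx.$$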

Distribution Functions

A distribution function, $F_X(x) = P(X \le x)$, is a function that, in general, can be seen as the function that “accumulates” the values of a density function.

Distribution functions help us find the probability that the given random variable falls within a specific interval of values.
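
In symbols, the accumulation in each case and the usual way an interval probability is obtained:

$$F_X(x) = \sum_{t \le x} p_X(t) \;\text{ (discrete)}, \qquad F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt \;\text{ (continuous)}, \qquad P(a < X \le b) = F_X(b) - F_X(a).$$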

Moments

The expected value of a random variable corresponds, roughly, to the concept of the arithmetic mean of the outcomes of the random variable. The expected value has several useful properties, such as linearity.

When given a variable $Y$ that is defined in terms of another one (e.g. $Y = g(X)$), we can easily calculate its expected value with the law of the unconscious statistician.
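
In symbols, assuming $Y = g(X)$:

$$E[g(X)] = \sum_{x} g(x)\,p_X(x) \;\text{ (discrete)}, \qquad E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx \;\text{ (continuous)}.$$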

A moment of a random variable is a special value related to the behaviour of its probability distribution. Moments have an order ($k$), and are centred either around the variable’s expected value (central moments) or around the origin (raw moments).

The moment of order 2 around the variable’s expected value is better known as its variance.
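
In symbols, the raw and central moments of order $k$, and the variance as the central moment of order 2:

$$m_k = E[X^k], \qquad \mu_k = E\bigl[(X - E[X])^k\bigr], \qquad \operatorname{Var}(X) = \mu_2 = E[X^2] - (E[X])^2.$$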

Other Special Functions

The moment-generating function of a random variable is, as its name suggests, a function that lets us easily calculate its moments around the origin: differentiating it $k$ times and then evaluating it at $t = 0$ yields the moment of order $k$.

Moment generating functions have various properties.

The characteristic function of a random variable is a function that’s very similar to the moment-generating function, except that it involves complex numbers. Characteristic functions also have various properties.
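
In symbols, the moment-generating function, how a moment of order $k$ is recovered from it, and the characteristic function:

$$M_X(t) = E\bigl[e^{tX}\bigr], \qquad E[X^k] = \frac{d^k}{dt^k} M_X(t)\Big|_{t=0}, \qquad \varphi_X(t) = E\bigl[e^{itX}\bigr].$$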

Unit 3: Notable Distributions

There are many notable probability distributions. Studying these distributions can help us solve problems more easily.

Discrete Distributions

Let $X$ be a discrete random variable.

  1. $X$ has a discrete uniform distribution when every one of $X$’s possible values is equally probable.

  2. $X$ has a Bernoulli distribution when it models the success or failure of a single experiment.

  3. $X$ has a binomial distribution when it models the number of successes over multiple independent experiments of the same nature.

  4. $X$ has a geometric distribution when it models the repetition of multiple success/failure experiments (of the same nature) until success is achieved. Beware: there is a standard geometric distribution and a very similar shifted geometric distribution (they differ in whether the successful trial itself is counted).

  5. $X$ has a Poisson distribution when it models the number of events that occur in a given time or space interval. (A short numerical sketch of these distributions follows the list.)
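
A short numerical sketch, assuming SciPy is available and using hypothetical parameter values:

```python
from scipy.stats import randint, bernoulli, binom, geom, poisson

print(randint.pmf(3, 1, 7))   # discrete uniform on {1, ..., 6}: P(X = 3) = 1/6
print(bernoulli.pmf(1, 0.3))  # Bernoulli(p = 0.3): P(X = 1) = 0.3
print(binom.pmf(2, 10, 0.3))  # binomial(n = 10, p = 0.3): P(exactly 2 successes)
print(geom.pmf(4, 0.3))       # geometric(p = 0.3): P(first success on the 4th trial)
print(poisson.pmf(0, 2.5))    # Poisson(lambda = 2.5): P(no events in the interval)
```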

Continuous Distributions

Let $X$ be a continuous random variable.

  1. $X$ has a continuous uniform distribution when all intervals of a given length have the same probability.

  2. A very common continuous distribution is the normal distribution. Many random variables that model natural, social or psychological phenomena have an (at least approximately) normal distribution. (A short numerical sketch of both distributions follows the list.)
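
A short numerical sketch, again assuming SciPy and hypothetical parameters:

```python
from scipy.stats import uniform, norm

# Continuous uniform on [0, 10]: every sub-interval of length 4 has probability 0.4.
print(uniform.cdf(7, loc=0, scale=10) - uniform.cdf(3, loc=0, scale=10))    # 0.4

# Normal with mean 170 and standard deviation 10: roughly 68% of the mass within one sd.
print(norm.cdf(180, loc=170, scale=10) - norm.cdf(160, loc=170, scale=10))  # ~0.6827
```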

Unit 4: Random Vectors

A random vector is the concept of a random variable generalised to multiple dimensions.

Random vectors have a (joint) density function and can also have marginal and conditional density functions.
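
In symbols, for a continuous random vector $(X, Y)$ with joint density $f_{X,Y}$, the marginal and conditional densities are obtained as:

$$f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\,dy, \qquad f_{Y \mid X}(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)} \quad \text{(for } f_X(x) > 0\text{)}.$$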