Topics: Probability - Characteristic Function


***(theorem)***

Let $X$ be any random variable. Then, with respect to its characteristic function $\varphi_{X}(t) = \mathbb{E}[e^{itX}]$, we have that:

  1. The characteristic function of any random variable depends only on its distribution.

  2. The characteristic function takes values within the closed unit disk (i.e. $|\varphi_{X}(t)| \leq 1$ for all $t \in \mathbb{R}$).

  3. Let $Y$ be another random variable independent from $X$. Then $\varphi_{X+Y}(t) = \varphi_{X}(t) \, \varphi_{Y}(t)$.

  4. If $X$ and $Y$ are random variables such that $\varphi_{X} = \varphi_{Y}$, then $X$ and $Y$ have the same distribution.

  5. The characteristic function is a continuous function.
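
As a quick illustration of property 2 (a standard Bernoulli example, added here for concreteness and not part of the original statement): for $X \sim \mathrm{Bernoulli}(p)$,

$$
\varphi_{X}(t) = \mathbb{E}[e^{itX}] = (1-p) + p\,e^{it}, \qquad |\varphi_{X}(t)| \leq (1-p) + p\,|e^{it}| = 1,
$$

where the bound follows from the triangle inequality.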

# With Respect to Moments

***(theorem)***

Let $X$ be a random variable with a finite $n$th moment. Then:

  1. The $n$th derivative of $\varphi_{X}$ exists and is continuous.

  2. Evaluating the $n$th derivative at $t = 0$ recovers the $n$th moment:

$$
\begin{align*}
\left. \frac{d^n \varphi_{X}(t)}{dt^n} \right|_{t=0} &= i^n \mathbb{E}[X^n] \\[1em]
&\implies \\[1em]
\mathbb{E}[X^n] &= \frac{1}{i^n} \left. \frac{d^n \varphi_{X}(t)}{dt^n} \right|_{t=0}
\end{align*}
$$
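
A minimal symbolic check of this formula, assuming `sympy` is available and using the known closed form $\varphi_{X}(t) = e^{-t^2/2}$ for a standard normal (a worked sketch added here, not part of the original note):

```python
import sympy as sp

t = sp.symbols("t", real=True)

# Characteristic function of a standard normal X ~ N(0, 1)
phi = sp.exp(-t**2 / 2)

# E[X^n] = (1 / i^n) * d^n/dt^n phi(t), evaluated at t = 0
for n in range(1, 5):
    moment = sp.simplify(sp.diff(phi, t, n).subs(t, 0) / sp.I**n)
    print(f"E[X^{n}] = {moment}")  # expected moments of N(0, 1): 0, 1, 0, 3
```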

# With Respect to Linear Functions

***(theorem)***

Let $a, b \in \mathbb{R}$ and $Y = aX+b$ with $X$ a [[Random Variable|random variable]]. Then:

$$
\varphi_{Y}(t) = e^{ibt} \varphi_{X}(at)
$$
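
For instance (a standard worked example, added for illustration): if $X \sim N(0, 1)$, so that $\varphi_{X}(t) = e^{-t^2/2}$, then for $Y = aX + b$,

$$
\varphi_{Y}(t) = e^{ibt} \varphi_{X}(at) = e^{ibt - a^2 t^2 / 2},
$$

which is exactly the characteristic function of $N(b, a^2)$.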

# With Respect to Independent Random Variables

***(theorem)***

Let $X_{1}, \dots, X_{n}$ be [[Event Independence|independent]] random variables. Then:

$$
\varphi_{\sum \limits_{k=1}^{n} X_{k}}(t) = \prod_{k=1}^{n} \varphi_{X_k}(t)
$$
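
A minimal numerical sanity check of this product identity, assuming `numpy` is available (the distributions, the value of $t$, and the sample size are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_samples, t = 1_000_000, 0.7

# Two independent samples: X1 ~ Exponential(1), X2 ~ N(0, 1)
x1 = rng.exponential(size=n_samples)
x2 = rng.normal(size=n_samples)

# Empirical characteristic functions via phi_X(t) ~= mean(exp(i t X))
lhs = np.mean(np.exp(1j * t * (x1 + x2)))                          # phi_{X1 + X2}(t)
rhs = np.mean(np.exp(1j * t * x1)) * np.mean(np.exp(1j * t * x2))  # product of factors
print(lhs, rhs)  # should agree up to Monte Carlo error
```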

> [!warning]- The converse is not true
>
> Note that the converse is **not** true. That is:
>
> $$
> \begin{align*}
> \varphi_{X_1 + X_2}(t) &= \varphi_{X_1}(t) \varphi_{X_2}(t) \\[1em]
> &\centernot \implies \\[1em]
> X_1 \text{ and } X_2 &\text{ are independent}
> \end{align*}
> $$
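
A standard counterexample (added here for completeness): take $X_1 = X_2 = X$ with $X \sim \mathrm{Cauchy}(0, 1)$, whose characteristic function is $\varphi_{X}(t) = e^{-|t|}$. Then, by the linear-function theorem above,

$$
\varphi_{X_1 + X_2}(t) = \varphi_{2X}(t) = \varphi_{X}(2t) = e^{-2|t|} = \varphi_{X_1}(t) \, \varphi_{X_2}(t),
$$

even though $X_1$ and $X_2$ are clearly not independent.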