Moment-Generating Function
title: Moment-Generating Function (Definition)
The moment-generating function (mgf) of a [[Random Variable|r.v.]] $X$ is:
$$M_X(t) = E(e^{tX})$$
- $E(e^{tX})$ is a function of $t$ ($t$ is a dummy variable)
- $E(e^{tX})$ needs to be finite on an open interval containing $0$; otherwise we say the mgf doesn't exist
Ex: Find the mgf of a Bernoulli Distribution. For $X \sim \text{Bern}(p)$, sum over the support $\{0, 1\}$:
$$M_X(t) = E(e^{tX}) = e^{t \cdot 0}(1 - p) + e^{t \cdot 1}p = pe^t + 1 - p$$
This is finite for all $t$, so the mgf exists.
Ex: Find the mgf of a Uniform Distribution. For $X \sim \text{Unif}(a, b)$, integrate against the density $\frac{1}{b-a}$:
$$M_X(t) = \int_a^b \frac{e^{tx}}{b - a}\,dx = \frac{e^{tb} - e^{ta}}{t(b - a)} \quad (t \neq 0), \qquad M_X(0) = 1$$
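Both computations can be sanity-checked by evaluating $E(e^{tX})$ symbolically. A minimal sketch, assuming the `sympy` library is available (variable names are illustrative):
```python
# A minimal sketch (assuming sympy): computing the two mgfs above
# straight from the definition M_X(t) = E(e^{tX}).
import sympy as sp

t, x = sp.symbols("t x", real=True)
p = sp.Symbol("p", positive=True)
a, b = sp.symbols("a b", real=True)

# Bernoulli(p): a two-term weighted sum over the support {0, 1}
mgf_bern = (1 - p) * sp.exp(t * 0) + p * sp.exp(t * 1)
print(sp.expand(mgf_bern))            # p*exp(t) - p + 1

# Uniform(a, b): integrate e^{tx} against the density 1/(b - a);
# sympy may return a Piecewise whose extra branch covers t = 0
mgf_unif = sp.integrate(sp.exp(t * x) / (b - a), (x, a, b))
print(sp.simplify(mgf_unif))
```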
Recall that the $n$-th moment of a [[Random Variable|r.v.]] $X$ is $E(X^n)$ (see the numeric sketch after the list):
- 1st moment: $E(X)$ → the mean
- 2nd moment: $E(X^2)$ → Variance, via $\text{Var}(X) = E(X^2) - (E(X))^2$
- 3rd moment: $E(X^3)$ → Skewness
- 4th moment: $E(X^4)$ → Kurtosis (how “fat” the tail is)
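A minimal numeric sketch of how these moments combine, using a fair six-sided die as a stand-in distribution (the die is purely illustrative, not from the notes above):
```python
# How the first four moments combine into mean, variance,
# skewness, and kurtosis, for a small concrete example.
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

def raw_moment(n):
    """n-th moment E(X^n) as a probability-weighted sum."""
    return sum(p * x**n for x, p in zip(outcomes, probs))

mean = raw_moment(1)
var = raw_moment(2) - mean**2          # E(X^2) - E(X)^2
std = var**0.5
skew = sum(p * ((x - mean) / std) ** 3 for x, p in zip(outcomes, probs))
kurt = sum(p * ((x - mean) / std) ** 4 for x, p in zip(outcomes, probs))
print(mean, var, skew, kurt)           # 3.5, ~2.917, 0.0 (symmetric), ~1.73
```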
Why is the mgf important? Because we have the following results.
title: Result 1
We can calculate the $n$-th moment of a [[Random Variable|r.v.]] $X$ by evaluating the $n$-th derivative of the [[Moment-Generating Function|mgf]] at $t = 0$:
$$E(X^n) = M_X^{(n)}(0)$$
MGFs allow us to replace “messy” integration with “cleaner” derivatives.
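A minimal sketch of Result 1 in action, assuming `sympy`, applied to the Bernoulli mgf derived above:
```python
# A minimal sketch (assuming sympy): differentiate the Bernoulli mgf
# n times and evaluate at t = 0 to read off E(X^n).
import sympy as sp

t = sp.Symbol("t", real=True)
p = sp.Symbol("p", positive=True)
mgf = p * sp.exp(t) + 1 - p            # Bernoulli(p) mgf from the example above

for n in range(1, 5):
    moment = sp.simplify(sp.diff(mgf, t, n).subs(t, 0))
    print(f"E(X^{n}) = {moment}")      # every raw moment is p, since X^n = X on {0, 1}

# Variance via the 2nd moment: Var(X) = E(X^2) - E(X)^2 = p(1 - p)
m1, m2 = [sp.diff(mgf, t, k).subs(t, 0) for k in (1, 2)]
print(sp.factor(m2 - m1**2))           # p*(1 - p)
```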
title: Result 2: The mgf determines the distribution
If two [[Random Variable|r.v.]]s have mgfs that are finite and equal on an open interval containing $0$, then they have the same distribution. E.g. if you can show $M_X(t) = e^{\lambda(e^t - 1)}$, you may conclude $X \sim \text{Pois}(\lambda)$.
title: Result 3: The mgf of a sum of independent r.v.s
If [[Random Variable|r.v.]]s $X$ and $Y$ are independent, then $M_{X+Y}(t) = M_X(t) \cdot M_Y(t)$
This is useful because we are often interested in sums and averages in [[Model|Modelling]].
Example: Find the mgf of a Binomial Distribution, i.e. find the mgf of $X \sim \text{Bin}(n, p)$, where $X = X_1 + X_2 + \cdots + X_n$ and the $X_i \sim \text{Bern}(p)$ are independent. We know from the Bernoulli example above that $M_{X_i}(t) = pe^t + 1 - p$ for each $i$.
Using Result 3 (applied repeatedly across the $n$ independent terms), we get
$$M_X(t) = \prod_{i=1}^{n} M_{X_i}(t) = \left(pe^t + 1 - p\right)^n$$
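A minimal sketch, assuming `sympy`, that checks this example: the $n$-fold product of Bernoulli mgfs matches the closed-form binomial mgf, and Result 1 recovers the mean $np$ (the concrete $n = 5$ is just for the check):
```python
# A minimal sketch (assuming sympy): by Result 3, the mgf of a sum of
# n iid Bernoulli(p) r.v.s is the n-fold product of the Bernoulli mgf,
# and it should equal the Binomial(n, p) mgf (p*e^t + 1 - p)^n.
import sympy as sp

t = sp.Symbol("t", real=True)
p = sp.Symbol("p", positive=True)
n = 5                                   # a concrete n for the check

mgf_bern = p * sp.exp(t) + 1 - p
mgf_sum = sp.prod([mgf_bern] * n)       # Result 3: independence => product
mgf_bin = (p * sp.exp(t) + 1 - p) ** n

print(sp.simplify(mgf_sum - mgf_bin))   # 0: the two mgfs agree

# Result 1 applied to the binomial mgf recovers the mean n*p:
print(sp.simplify(sp.diff(mgf_bin, t).subs(t, 0)))   # 5*p
```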