5.7 Moment generating functions
If X is a continuous random variable, then the analogue of the probability
generating function is the moment generating function:

Definition (Moment generating function). The moment generating function of
a random variable X is

    m(θ) = E[e^{θX}].
For those θ for which m(θ) is finite, we have

    m(θ) = ∫_{−∞}^{∞} e^{θx} f(x) dx.
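As a quick numerical sketch (not part of the notes), the two expressions for
m(θ) can be compared for a particular distribution. Here X is taken to be
exponential and the values of λ and θ are assumed purely for illustration,
using numpy and scipy.

```python
import numpy as np
from scipy.integrate import quad

lam, theta = 2.0, 0.5          # assumed illustrative parameters, with theta < lam so m(theta) is finite
rng = np.random.default_rng(0)

# Monte Carlo estimate of E[e^{theta X}] for X with density f(x) = lam * e^{-lam x}
samples = rng.exponential(scale=1 / lam, size=10**6)
mc_estimate = np.mean(np.exp(theta * samples))

# Direct evaluation of the integral of e^{theta x} f(x) over [0, infinity)
integral, _ = quad(lambda x: np.exp(theta * x) * lam * np.exp(-lam * x), 0, np.inf)

print(mc_estimate, integral)   # both should be close to lam / (lam - theta)
```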
We can prove results similar to those we had for probability generating
functions. We will assume the following without proof:
Theorem. The mgf determines the distribution of X provided m(θ) is finite
for all θ in some interval containing the origin.
Definition (Moment). The rth moment of X is E[X^r].
Theorem. The rth moment of X is the coefficient of θ^r/r! in the power series
expansion of m(θ), and is

    E[X^r] = (d^r/dθ^r) m(θ) |_{θ=0} = m^{(r)}(0).
Proof. We have

    e^{θX} = 1 + θX + (θ²/2!) X² + ··· .

So

    m(θ) = E[e^{θX}] = 1 + θ E[X] + (θ²/2!) E[X²] + ···
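As a small symbolic sketch of the theorem (sympy is an assumed tool here, not
part of the notes), one can expand an mgf as a power series in θ and read off
E[X^r] as r! times the coefficient of θ^r. The mgf λ/(λ − θ) used below is
that of the exponential distribution, derived in the example that follows.

```python
import sympy as sp

theta, lam = sp.symbols('theta lambda', positive=True)
m = lam / (lam - theta)                     # mgf of X ~ E(lambda), derived in the next example

expansion = sp.series(m, theta, 0, 4).removeO()
for r in range(4):
    moment = sp.factorial(r) * expansion.coeff(theta, r)   # E[X^r] = r! * (coefficient of theta^r)
    print(r, sp.simplify(moment))                          # prints r!/lambda^r
```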
Example. Let X ∼ E(λ). Then its mgf is

    E[e^{θX}] = ∫_0^∞ e^{θx} λe^{−λx} dx = λ ∫_0^∞ e^{−(λ−θ)x} dx = λ/(λ − θ),
where 0 < θ < λ. So

    E[X] = m′(0) = λ/(λ − θ)² |_{θ=0} = 1/λ.

Also,

    E[X²] = m″(0) = 2λ/(λ − θ)³ |_{θ=0} = 2/λ².
So

    var(X) = E[X²] − E[X]² = 2/λ² − 1/λ² = 1/λ².
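As a sanity check of this example (again using sympy as an assumed tool, not
part of the notes), the same moments can be recovered by differentiating the
mgf symbolically and evaluating at θ = 0.

```python
import sympy as sp

theta, lam = sp.symbols('theta lambda', positive=True)
m = lam / (lam - theta)                      # mgf of X ~ E(lambda) from the example above

EX  = sp.diff(m, theta, 1).subs(theta, 0)    # m'(0)  = 1/lambda
EX2 = sp.diff(m, theta, 2).subs(theta, 0)    # m''(0) = 2/lambda^2
var = sp.simplify(EX2 - EX**2)               # 1/lambda^2
print(EX, EX2, var)
```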
Theorem. If X and Y are independent random variables with moment generating
functions m_X(θ), m_Y(θ), then X + Y has mgf m_{X+Y}(θ) = m_X(θ) m_Y(θ).
Proof.

    E[e^{θ(X+Y)}] = E[e^{θX} e^{θY}] = E[e^{θX}] E[e^{θY}] = m_X(θ) m_Y(θ).
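A minimal Monte Carlo sketch of this theorem (illustrative parameters assumed,
not taken from the notes): for independent X, Y ∼ E(λ), the sample average of
e^{θ(X+Y)} should be close to m_X(θ) m_Y(θ) = (λ/(λ − θ))².

```python
import numpy as np

lam, theta = 2.0, 0.5                            # assumed illustrative parameters, theta < lam
rng = np.random.default_rng(1)

x = rng.exponential(scale=1 / lam, size=10**6)   # X ~ E(lam)
y = rng.exponential(scale=1 / lam, size=10**6)   # Y ~ E(lam), independent of X

lhs = np.mean(np.exp(theta * (x + y)))           # Monte Carlo estimate of E[e^{theta (X+Y)}]
rhs = (lam / (lam - theta)) ** 2                 # m_X(theta) * m_Y(theta)
print(lhs, rhs)                                  # should agree to a couple of decimal places
```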