6 More distributions
IA Probability
6.4 More on the normal distribution
Proposition. The moment generating function of $N(\mu, \sigma^2)$ is
\[
  E[e^{\theta X}] = \exp\left(\theta\mu + \frac{1}{2}\theta^2\sigma^2\right).
\]
Proof.
\[
  E[e^{\theta X}] = \int_{-\infty}^\infty e^{\theta x}\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\frac{(x - \mu)^2}{\sigma^2}}\;\mathrm{d}x.
\]
Substitute $z = \frac{x - \mu}{\sigma}$. Then
\begin{align*}
  E[e^{\theta X}] &= \int_{-\infty}^\infty \frac{1}{\sqrt{2\pi}}\, e^{\theta(\mu + \sigma z)}\, e^{-\frac{1}{2}z^2}\;\mathrm{d}z\\
  &= e^{\theta\mu + \frac{1}{2}\theta^2\sigma^2} \int_{-\infty}^\infty \underbrace{\frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}(z - \theta\sigma)^2}}_{\text{pdf of } N(\sigma\theta,\, 1)}\;\mathrm{d}z\\
  &= e^{\theta\mu + \frac{1}{2}\theta^2\sigma^2}.
\end{align*}
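The closed form can be checked numerically. The sketch below (not part of the notes; the values of $\mu$, $\sigma$, $\theta$ are arbitrary test choices) integrates $e^{\theta x}$ against the $N(\mu, \sigma^2)$ pdf with a midpoint Riemann sum and compares against $\exp(\theta\mu + \frac{1}{2}\theta^2\sigma^2)$:

```python
import math

# Arbitrary test parameters for N(mu, sigma^2) and the mgf argument theta.
mu, sigma, theta = 1.0, 2.0, 0.3

def pdf(x):
    """Density of N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

# Midpoint-rule approximation of E[e^{theta X}] over a wide interval
# (the integrand is negligible outside mu +/- 12 sigma).
n, lo, hi = 200_000, mu - 12 * sigma, mu + 12 * sigma
dx = (hi - lo) / n
numeric_mgf = sum(
    math.exp(theta * (lo + (i + 0.5) * dx)) * pdf(lo + (i + 0.5) * dx)
    for i in range(n)
) * dx

closed_form = math.exp(theta * mu + 0.5 * theta ** 2 * sigma ** 2)
print(numeric_mgf, closed_form)  # agree to several decimal places
```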
Theorem. Suppose $X, Y$ are independent random variables with $X \sim N(\mu_1, \sigma_1^2)$ and $Y \sim N(\mu_2, \sigma_2^2)$. Then
(i) $X + Y \sim N(\mu_1 + \mu_2,\, \sigma_1^2 + \sigma_2^2)$.
(ii) $aX \sim N(a\mu_1,\, a^2\sigma_1^2)$.
Proof.
(i) By independence,
\begin{align*}
  E[e^{\theta(X + Y)}] &= E[e^{\theta X}] \cdot E[e^{\theta Y}]\\
  &= e^{\mu_1\theta + \frac{1}{2}\sigma_1^2\theta^2} \cdot e^{\mu_2\theta + \frac{1}{2}\sigma_2^2\theta^2}\\
  &= e^{(\mu_1 + \mu_2)\theta + \frac{1}{2}(\sigma_1^2 + \sigma_2^2)\theta^2},
\end{align*}
which is the mgf of $N(\mu_1 + \mu_2,\, \sigma_1^2 + \sigma_2^2)$.
(ii)
\begin{align*}
  E[e^{\theta(aX)}] &= E[e^{(\theta a)X}]\\
  &= e^{\mu_1(a\theta) + \frac{1}{2}\sigma_1^2(a\theta)^2}\\
  &= e^{(a\mu_1)\theta + \frac{1}{2}(a^2\sigma_1^2)\theta^2},
\end{align*}
which is the mgf of $N(a\mu_1,\, a^2\sigma_1^2)$.
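Part (i) is easy to see empirically. A small simulation sketch (not from the notes; the parameter values are arbitrary) draws independent normals with Python's standard library and checks that the sum has mean $\mu_1 + \mu_2$ and variance $\sigma_1^2 + \sigma_2^2$:

```python
import random
import statistics

random.seed(0)  # reproducible draws

# Arbitrary test parameters: X ~ N(mu1, s1^2), Y ~ N(mu2, s2^2), independent.
mu1, s1 = 1.0, 2.0
mu2, s2 = -3.0, 1.5

samples = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(200_000)]

print(statistics.mean(samples))      # close to mu1 + mu2 = -2
print(statistics.variance(samples))  # close to s1^2 + s2^2 = 6.25
```

The simulation only checks the mean and variance, of course; the content of the theorem is that the sum is again normal, which the mgf argument establishes exactly.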
Finally, suppose $X \sim N(0, 1)$. Write $\varphi(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}$ for its pdf. It would be very difficult to find a closed form for its cumulative distribution function, but we can find an upper bound for the tail:
\begin{align*}
  P(X \geq x) &= \int_x^\infty \varphi(t)\;\mathrm{d}t\\
  &\leq \int_x^\infty \left(1 + \frac{1}{t^2}\right)\varphi(t)\;\mathrm{d}t\\
  &= \frac{1}{x}\varphi(x).
\end{align*}
To see that the last step works, differentiate $\frac{1}{x}\varphi(x)$ and check that you get $-\left(1 + \frac{1}{x^2}\right)\varphi(x)$ — minus the integrand, exactly as required for an integral with lower limit $x$. So
\[
  P(X \geq x) \leq \frac{1}{x} \cdot \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}x^2}.
\]
Then
\[
  \log P(X \geq x) \sim -\frac{1}{2}x^2.
\]
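Both the bound and the asymptotic can be checked numerically. The sketch below (not from the notes) uses the exact tail $P(X \geq x) = \frac{1}{2}\operatorname{erfc}(x/\sqrt{2})$, available in Python's `math` module, to verify the bound at a few points and to see that $\log P(X \geq x) / (-\frac{1}{2}x^2)$ approaches $1$:

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def tail(x):
    """Exact standard normal tail P(X >= x) via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

# The bound P(X >= x) <= phi(x)/x holds for all x > 0.
for x in [0.5, 1.0, 2.0, 4.0]:
    assert tail(x) <= phi(x) / x

# The asymptotic: log P(X >= x) / (-x^2/2) tends to 1 as x grows.
x = 10.0
ratio = math.log(tail(x)) / (-0.5 * x * x)
print(ratio)  # close to 1
```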