5.3 Jointly distributed random variables
Definition (Joint distribution). Two random variables $X, Y$ have joint distribution $F: \mathbb{R}^2 \to [0, 1]$ defined by
\[
  F(x, y) = P(X \leq x, Y \leq y).
\]
The marginal distribution of $X$ is
\[
  F_X(x) = P(X \leq x) = P(X \leq x, Y < \infty) = F(x, \infty) = \lim_{y \to \infty} F(x, y).
\]
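As a quick numerical illustration of the marginal cdf as a limit, take the (hypothetical, chosen only for this sketch) joint cdf $F(x, y) = (1 - e^{-x})(1 - e^{-y})$ of two independent $\mathrm{Exp}(1)$ variables; the Python sketch below shows $F(x, y)$ approaching the marginal $F_X(x) = 1 - e^{-x}$ as $y$ grows.
\begin{verbatim}
import numpy as np

def F(x, y):
    # hypothetical joint cdf of two independent Exp(1) variables
    return (1 - np.exp(-x)) * (1 - np.exp(-y))

x = 1.5
for y in (1.0, 5.0, 20.0):
    print(y, F(x, y))          # increases towards the marginal value
print(1 - np.exp(-x))          # F_X(1.5), approximately 0.7769
\end{verbatim}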
Definition (Jointly distributed random variables). We say $X_1, \cdots, X_n$ are jointly distributed continuous random variables and have joint pdf $f$ if for any set $A \subseteq \mathbb{R}^n$,
\[
  P((X_1, \cdots, X_n) \in A) = \int_{(x_1, \cdots, x_n) \in A} f(x_1, \cdots, x_n)\;\mathrm{d}x_1 \cdots \mathrm{d}x_n,
\]
where $f(x_1, \cdots, x_n) \geq 0$ and
\[
  \int_{\mathbb{R}^n} f(x_1, \cdots, x_n)\;\mathrm{d}x_1 \cdots \mathrm{d}x_n = 1.
\]
Example. In the case where $n = 2$,
\[
  F(x, y) = P(X \leq x, Y \leq y) = \int_{-\infty}^x \int_{-\infty}^y f(u, v)\;\mathrm{d}v\;\mathrm{d}u.
\]
If $F$ is differentiable, then
\[
  f(x, y) = \frac{\partial^2}{\partial x\, \partial y} F(x, y).
\]
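This relationship between $F$ and $f$ can be checked numerically. The Python sketch below assumes the hypothetical joint pdf $f(x, y) = 4xy$ on $[0, 1]^2$, whose cdf there is $F(x, y) = x^2 y^2$: it recovers $F$ by double integration and then recovers $f$ from $F$ by a finite-difference approximation to the mixed second partial derivative.
\begin{verbatim}
from scipy import integrate

def f(x, y):
    # hypothetical joint pdf: f(x, y) = 4xy on the unit square
    return 4 * x * y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

def F(x, y):
    # F(x, y) = int_{-inf}^{x} int_{-inf}^{y} f(u, v) dv du
    val, _ = integrate.dblquad(lambda v, u: f(u, v),
                               0, x, lambda u: 0, lambda u: y)
    return val

x, y, h = 0.6, 0.7, 1e-2
print(F(x, y), (x * y) ** 2)        # both approximately 0.1764

# recover f(x, y) as the mixed second partial of F (central differences)
mixed = (F(x + h, y + h) - F(x + h, y - h)
         - F(x - h, y + h) + F(x - h, y - h)) / (4 * h * h)
print(mixed, f(x, y))               # both approximately 1.68
\end{verbatim}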
Theorem. If $X$ and $Y$ are jointly continuous random variables, then they are individually continuous random variables.

Proof. We prove this by showing that $X$ has a density function. We know that
\begin{align*}
  P(X \in A) &= P(X \in A, Y \in (-\infty, +\infty))\\
  &= \int_{x \in A} \int_{-\infty}^{\infty} f(x, y)\;\mathrm{d}y\;\mathrm{d}x\\
  &= \int_{x \in A} f_X(x)\;\mathrm{d}x.
\end{align*}
So
\[
  f_X(x) = \int_{-\infty}^{\infty} f(x, y)\;\mathrm{d}y
\]
is the (marginal) pdf of $X$.
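As a sketch of this marginalisation step, take the hypothetical joint pdf $f(x, y) = x + y$ on $[0, 1]^2$ (chosen only for illustration); integrating out $y$ gives $f_X(x) = x + \frac{1}{2}$, which the snippet below reproduces numerically.
\begin{verbatim}
from scipy import integrate

def f(x, y):
    # hypothetical joint pdf: f(x, y) = x + y on the unit square
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

def f_X(x):
    # f_X(x) = int_{-inf}^{inf} f(x, y) dy
    val, _ = integrate.quad(lambda y: f(x, y), 0, 1)
    return val

for x in (0.2, 0.5, 0.9):
    print(x, f_X(x), x + 0.5)       # numerical and exact marginals agree

total, _ = integrate.quad(f_X, 0, 1)
print(total)                        # the marginal pdf integrates to 1
\end{verbatim}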
Definition (Independent continuous random variables). Continuous random variables $X_1, \cdots, X_n$ are independent if
\[
  P(X_1 \in A_1, X_2 \in A_2, \cdots, X_n \in A_n) = P(X_1 \in A_1)P(X_2 \in A_2) \cdots P(X_n \in A_n)
\]
for all $A_i \subseteq \Omega_{X_i}$.
If we let $F_{X_i}$ and $f_{X_i}$ be the cdf and pdf of $X_i$, then the conditions
\[
  F(x_1, \cdots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n)
\]
and
\[
  f(x_1, \cdots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n)
\]
are each individually equivalent to the definition above.

To show that two (or more) random variables are independent, we only have to factorize the joint pdf into factors that each involve only one variable.
Example. If $(X_1, X_2)$ takes a random value from $[0, 1] \times [0, 1]$, then $f(x_1, x_2) = 1$. Then we can see that $f(x_1, x_2) = 1 \cdot 1 = f_{X_1}(x_1) \cdot f_{X_2}(x_2)$. So $X_1$ and $X_2$ are independent.
On the other hand, if $(Y_1, Y_2)$ takes a random value from $[0, 1] \times [0, 1]$ with the restriction that $Y_1 \leq Y_2$, then they are not independent, since the joint pdf
\[
  f(y_1, y_2) = 2\,I[y_1 \leq y_2]
\]
cannot be split into a factor involving only $y_1$ times a factor involving only $y_2$.
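A Monte Carlo sketch of the contrast between these two examples, using the arbitrary test events $A_1 = \{\text{first coordinate} \leq \frac{1}{2}\}$ and $A_2 = \{\text{second coordinate} \leq \frac{1}{2}\}$: on the square the product rule holds, while on the triangle it fails.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# (X1, X2) uniform on the unit square: independent
x1, x2 = rng.random(n), rng.random(n)
print(np.mean((x1 <= 0.5) & (x2 <= 0.5)),
      np.mean(x1 <= 0.5) * np.mean(x2 <= 0.5))   # both approx 0.25

# (Y1, Y2) uniform on {y1 <= y2}: not independent
u, v = rng.random(n), rng.random(n)
y1, y2 = np.minimum(u, v), np.maximum(u, v)  # sorted pair is uniform on the triangle
print(np.mean((y1 <= 0.5) & (y2 <= 0.5)),
      np.mean(y1 <= 0.5) * np.mean(y2 <= 0.5))   # approx 0.25 vs 0.1875
\end{verbatim}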
Proposition. For independent continuous random variables $X_i$,

(i) $E\left[\prod X_i\right] = \prod E[X_i]$

(ii) $\operatorname{var}\left(\sum X_i\right) = \sum \operatorname{var}(X_i)$
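A Monte Carlo sketch of the proposition, with an arbitrary choice of independent continuous variables $X_1 \sim \mathrm{Exp}(1)$, $X_2 \sim U[0, 1]$, $X_3 \sim N(2, 1)$ (picked purely for illustration):
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x1 = rng.exponential(1.0, n)        # E = 1,   var = 1
x2 = rng.random(n)                  # E = 1/2, var = 1/12
x3 = rng.normal(2.0, 1.0, n)        # E = 2,   var = 1

# (i)  E[prod X_i] = prod E[X_i]
print(np.mean(x1 * x2 * x3), 1 * 0.5 * 2)        # both approx 1.0

# (ii) var(sum X_i) = sum var(X_i)
print(np.var(x1 + x2 + x3), 1 + 1/12 + 1)        # both approx 2.083
\end{verbatim}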