2.5 Tail events
Finally, we are going to quickly look at tail events. These are events that depend
only on the asymptotic behaviour of a sequence of random variables.
Definition (Tail $\sigma$-algebra). Let $(X_n)$ be a sequence of random variables. We let
\[
  \mathcal{T}_n = \sigma(X_{n+1}, X_{n+2}, \cdots),
\]
and
\[
  \mathcal{T} = \bigcap_n \mathcal{T}_n.
\]
Then $\mathcal{T}$ is the tail $\sigma$-algebra.
Then $\mathcal{T}$-measurable events and random variables only depend on the asymptotic behaviour of the $X_n$'s.
Example. Let $(X_n)$ be a sequence of real-valued random variables. Then
\[
  \limsup_{n \to \infty} \frac{1}{n} \sum_{j=1}^n X_j, \quad
  \liminf_{n \to \infty} \frac{1}{n} \sum_{j=1}^n X_j
\]
are $\mathcal{T}$-measurable random variables. Finally,
\[
  \left\{ \lim_{n \to \infty} \frac{1}{n} \sum_{j=1}^n X_j \text{ exists} \right\} \in \mathcal{T},
\]
since this is just the set of all points where the previous two things agree.
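To see why the first of these is $\mathcal{T}$-measurable, here is a quick check (the $\liminf$ is handled in exactly the same way). For any fixed $k$, the first $k$ summands contribute $\frac{1}{n}\sum_{j=1}^k X_j \to 0$ as $n \to \infty$, so
\[
  \limsup_{n \to \infty} \frac{1}{n} \sum_{j=1}^n X_j
  = \limsup_{n \to \infty} \left( \frac{1}{n} \sum_{j=1}^{k} X_j + \frac{1}{n} \sum_{j=k+1}^{n} X_j \right)
  = \limsup_{n \to \infty} \frac{1}{n} \sum_{j=k+1}^{n} X_j.
\]
The right-hand side is measurable with respect to $\sigma(X_{k+1}, X_{k+2}, \cdots) = \mathcal{T}_k$, and $k$ was arbitrary, so the $\limsup$ is measurable with respect to $\bigcap_k \mathcal{T}_k = \mathcal{T}$.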
Theorem (Kolmogorov 0-1 law). Let $(X_n)$ be a sequence of independent (real-valued) random variables. If $A \in \mathcal{T}$, then $\mathbb{P}[A] = 0$ or $1$.

Moreover, if $X$ is a $\mathcal{T}$-measurable random variable, then there exists a constant $c$ such that
\[
  \mathbb{P}[X = c] = 1.
\]
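For instance, combining this with the example above: if the $X_n$ are independent, then
\[
  \limsup_{n \to \infty} \frac{1}{n} \sum_{j=1}^n X_j
\]
is almost surely equal to a single deterministic constant (possibly $\pm\infty$), even though the theorem gives us no way of computing what that constant is.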
Proof. The proof is very funny the first time we see it. We are going to prove the theorem by checking something that seems very strange. We are going to show that if $A \in \mathcal{T}$, then $A$ is independent of $A$. It then follows that
\[
  \mathbb{P}[A] = \mathbb{P}[A \cap A] = \mathbb{P}[A]\,\mathbb{P}[A],
\]
so $\mathbb{P}[A] = 0$ or $1$. In fact, we are going to prove that $\mathcal{T}$ is independent of $\mathcal{T}$.
Let
\[
  \mathcal{F}_n = \sigma(X_1, \cdots, X_n).
\]
This $\sigma$-algebra is generated by the $\pi$-system of events of the form
\[
  A = \{X_1 \leq x_1, \cdots, X_n \leq x_n\}.
\]
Similarly, $\mathcal{T}_n = \sigma(X_{n+1}, X_{n+2}, \cdots)$ is generated by the $\pi$-system of events of the form
\[
  B = \{X_{n+1} \leq x_{n+1}, \cdots, X_{n+k} \leq x_{n+k}\},
\]
where $k$ is any natural number.
Since the $X_n$ are independent, we know that for any such $A$ and $B$, we have
\[
  \mathbb{P}[A \cap B] = \mathbb{P}[A]\,\mathbb{P}[B].
\]
Since this is true for all such $A$ and $B$, and independence of generating $\pi$-systems implies independence of the $\sigma$-algebras they generate, it follows that $\mathcal{F}_n$ is independent of $\mathcal{T}_n$.
Since $\mathcal{T} = \bigcap_k \mathcal{T}_k \subseteq \mathcal{T}_n$ for each $n$, we know $\mathcal{F}_n$ is independent of $\mathcal{T}$.
Now $\bigcup_k \mathcal{F}_k$ is a $\pi$-system, which generates the $\sigma$-algebra $\mathcal{F}_\infty = \sigma(X_1, X_2, \cdots)$.
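(Indeed, if $A \in \mathcal{F}_m$ and $B \in \mathcal{F}_n$ with $m \leq n$, then $A, B \in \mathcal{F}_n$ because the $\mathcal{F}_k$ are increasing, so $A \cap B \in \mathcal{F}_n \subseteq \bigcup_k \mathcal{F}_k$; this is why the union is a $\pi$-system.)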
We know that if $A \in \bigcup_n \mathcal{F}_n$, then there has to exist an index $n$ such that $A \in \mathcal{F}_n$. So $A$ is independent of $\mathcal{T}$. So, again by the $\pi$-system argument, $\mathcal{F}_\infty$ is independent of $\mathcal{T}$.
Finally, note that $\mathcal{T} \subseteq \mathcal{F}_\infty$. So $\mathcal{T}$ is independent of $\mathcal{T}$.
To find the constant, suppose that $X$ is $\mathcal{T}$-measurable. Then
\[
  \mathbb{P}[X \leq x] \in \{0, 1\}
\]
for all $x \in \mathbb{R}$, since $\{X \leq x\} \in \mathcal{T}$.
Now take
\[
  c = \inf\{x \in \mathbb{R} : \mathbb{P}[X \leq x] = 1\}.
\]
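(As a sanity check on this choice: $x \mapsto \mathbb{P}[X \leq x]$ is a nondecreasing $\{0, 1\}$-valued function tending to $0$ as $x \to -\infty$ and to $1$ as $x \to \infty$, so the set on the right is non-empty and bounded below, and $c$ is a well-defined real number. Moreover, by the definition of the infimum, for every $n$ we have
\[
  \mathbb{P}\left[X \leq c - \tfrac{1}{n}\right] = 0
  \quad \text{and} \quad
  \mathbb{P}\left[X \leq c + \tfrac{1}{n}\right] = 1,
\]
so taking limits along these increasing and decreasing sequences of events gives $\mathbb{P}[X < c] = 0$ and $\mathbb{P}[X \leq c] = 1$.)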
Then with this particular choice of $c$, it is easy to see that $\mathbb{P}[X = c] = 1$. This completes the proof of the theorem.