5 Brownian motion

III Advanced Probability

5.1 Basic properties of Brownian motion
Definition (Brownian motion). A continuous process $(B_t)_{t \geq 0}$ taking values in $\mathbb{R}^d$ is called a Brownian motion in $\mathbb{R}^d$ started at $x \in \mathbb{R}^d$ if
(i) $B_0 = x$ almost surely.
(ii) For all $s < t$, the increment $B_t - B_s \sim N(0, (t - s)I)$.
(iii) Increments are independent. More precisely, for all $t_1 < t_2 < \cdots < t_k$, the random variables
\[
  B_{t_1},\; B_{t_2} - B_{t_1},\; \ldots,\; B_{t_k} - B_{t_{k-1}}
\]
are independent.

If $B_0 = 0$, then we call it a standard Brownian motion.

We always assume our Brownian motion is standard.
Theorem (Wiener's theorem). There exists a Brownian motion on some probability space.
Proof. We first prove existence on $[0, 1]$ and in $d = 1$. We wish to apply Kolmogorov's criterion.

Recall that $D_n$ are the dyadic numbers. Let $(Z_d)_{d \in D}$ be iid $N(0, 1)$ random variables on some probability space. We will define a process on $D_n$ inductively on $n$ with the required properties. We wlog assume $x = 0$.

In step 0, we put $B_0 = 0$ and $B_1 = Z_1$.

Assume that we have already constructed $(B_d)_{d \in D_{n-1}}$ satisfying the properties. Take $d \in D_n \setminus D_{n-1}$, and set $d^\pm = d \pm 2^{-n}$. These are the two consecutive numbers in $D_{n-1}$ such that $d^- < d < d^+$. Define
\[
  B_d = \frac{B_{d^+} + B_{d^-}}{2} + \frac{1}{2^{(n+1)/2}} Z_d.
\]
The condition (i) is trivially satisfied. We now have to check the other two conditions. Consider
\[
  B_{d^+} - B_d = \frac{B_{d^+} - B_{d^-}}{2} - \frac{1}{2^{(n+1)/2}} Z_d,\qquad
  B_d - B_{d^-} = \underbrace{\frac{B_{d^+} - B_{d^-}}{2}}_{N} + \underbrace{\frac{1}{2^{(n+1)/2}} Z_d}_{N'}.
\]
Notice that $N$ and $N'$ are normal with variance
\[
  \operatorname{var}(N') = \operatorname{var}(N) = \frac{1}{2^{n+1}}.
\]
In particular, we have
\[
  \operatorname{cov}(N - N', N + N') = \operatorname{var}(N) - \operatorname{var}(N') = 0.
\]
So $B_{d^+} - B_d$ and $B_d - B_{d^-}$ are independent.

Now note that the vector of increments of $(B_d)_{d \in D_n}$ between consecutive numbers in $D_n$ is Gaussian, since after dotting with any vector, we obtain a linear combination of independent Gaussians. Thus, to prove independence, it suffices to prove that pairwise correlation vanishes.
We already proved this for the case of increments between $B_d$ and $B_{d^\pm}$, and this is the only case that is tricky, since they both involve the same $Z_d$. The other cases are straightforward, and are left as an exercise for the reader.
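As an illustration (not part of the notes), the inductive midpoint construction above can be sketched numerically. This is a minimal sketch; the function name `levy_brownian` and the `levels` parameter are our own choices:

```python
import numpy as np

def levy_brownian(levels, rng=None):
    """Build Brownian motion on the dyadics of [0, 1] by midpoint
    refinement: at each new dyadic point d of level n, set
    B_d = (B_{d-} + B_{d+}) / 2 + 2^{-(n+1)/2} * Z_d."""
    rng = np.random.default_rng(rng)
    # Step 0: B_0 = 0, B_1 = Z_1 ~ N(0, 1).
    B = np.array([0.0, rng.standard_normal()])
    for n in range(1, levels + 1):
        # Midpoints of the level-(n-1) grid, plus independent noise.
        mid = (B[:-1] + B[1:]) / 2
        mid += 2 ** (-(n + 1) / 2) * rng.standard_normal(len(mid))
        # Interleave old grid values with the new midpoints.
        new = np.empty(2 * len(B) - 1)
        new[0::2] = B
        new[1::2] = mid
        B = new
    return B  # values at k / 2^levels, k = 0, ..., 2^levels

path = levy_brownian(10)
```

By the computation above, each increment over a spacing $2^{-n}$ at the final level has variance exactly $2^{-n}$, which can be checked empirically on the returned path.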
Inductively, we can construct $(B_d)_{d \in D}$, satisfying (i), (ii) and (iii). Note that for all $s, t \in D$, we have
\[
  \mathbb{E}|B_t - B_s|^p = |t - s|^{p/2}\, \mathbb{E}|N|^p
\]
for $N \sim N(0, 1)$. Since $\mathbb{E}|N|^p < \infty$ for all $p$, by Kolmogorov's criterion, we can extend $(B_d)_{d \in D}$ to $(B_t)_{t \in [0, 1]}$. In fact, this is $\alpha$-Hölder continuous for all $\alpha < \frac{1}{2}$.
Since this is a continuous process and satisfies the desired properties on
a dense set, it remains to show that the properties are preserved by taking
continuous limits.
Take $0 \leq t_1 < t_2 < \cdots < t_m \leq 1$, and $0 \leq t_1^n < t_2^n < \cdots < t_m^n \leq 1$ such that $t_i^n \in D_n$ and $t_i^n \to t_i$ as $n \to \infty$ for $i = 1, \ldots, m$.
We now apply Lévy's convergence theorem. Recall that if $X$ is a random variable in $\mathbb{R}^d$ and $X \sim N(0, \Sigma)$, then
\[
  \varphi_X(u) = \exp\left(-\frac{1}{2} u^T \Sigma u\right).
\]
Since $(B_t)_{t \in [0, 1]}$ is continuous, we have
\[
  \varphi_{(B_{t_2^n} - B_{t_1^n}, \ldots, B_{t_m^n} - B_{t_{m-1}^n})}(u)
  = \exp\left(-\frac{1}{2} u^T \Sigma u\right)
  = \exp\left(-\frac{1}{2} \sum_{i=1}^{m-1} (t_{i+1}^n - t_i^n) u_i^2\right).
\]
We know this converges, as $n \to \infty$, to $\exp\left(-\frac{1}{2} \sum_{i=1}^{m-1} (t_{i+1} - t_i) u_i^2\right)$.
By Lévy's convergence theorem, the law of $(B_{t_2} - B_{t_1}, B_{t_3} - B_{t_2}, \ldots, B_{t_m} - B_{t_{m-1}})$ is Gaussian with the right covariance. This implies that (ii) and (iii) hold on $[0, 1]$.
To extend the time to $[0, \infty)$, we define independent Brownian motions $(B_t^i)_{t \in [0, 1], i \in \mathbb{N}}$ and define
\[
  B_t = \sum_{i=0}^{\lfloor t \rfloor - 1} B_1^i + B_{t - \lfloor t \rfloor}^{\lfloor t \rfloor}.
\]
To extend to $\mathbb{R}^d$, take the product of $d$ many independent one-dimensional Brownian motions.
Lemma. Brownian motion is a Gaussian process, i.e. for any $0 \leq t_1 < t_2 < \cdots < t_m \leq 1$, the vector $(B_{t_1}, B_{t_2}, \ldots, B_{t_m})$ is Gaussian with covariance
\[
  \operatorname{cov}(B_{t_1}, B_{t_2}) = t_1 \wedge t_2.
\]
Proof. We know $(B_{t_1}, B_{t_2} - B_{t_1}, \ldots, B_{t_m} - B_{t_{m-1}})$ is Gaussian. Thus, the vector $(B_{t_1}, \ldots, B_{t_m})$ is its image under a linear isomorphism, so it is Gaussian. To compute the covariance, for $s \leq t$, we have
\[
  \operatorname{cov}(B_s, B_t) = \mathbb{E} B_s B_t
  = \mathbb{E} B_s B_t - \mathbb{E} B_s^2 + \mathbb{E} B_s^2
  = \mathbb{E} B_s (B_t - B_s) + \mathbb{E} B_s^2 = s.
\]
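To make the covariance formula concrete, here is a small Monte Carlo sanity check (illustrative only; the sample size, seed, and the choice $s = 0.3$, $t = 0.7$ are arbitrary):

```python
import numpy as np

# Simulate (B_s, B_t) for s = 0.3, t = 0.7 using independent increments:
# B_s ~ N(0, s) and B_t - B_s ~ N(0, t - s), independent of B_s.
rng = np.random.default_rng(1)
n = 200_000
s, t = 0.3, 0.7
B_s = np.sqrt(s) * rng.standard_normal(n)
B_t = B_s + np.sqrt(t - s) * rng.standard_normal(n)
cov = np.mean(B_s * B_t)  # both are centred, so this estimates cov(B_s, B_t)
print(cov)  # should be close to s ^ t = 0.3
```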
Proposition (Invariance properties). Let $(B_t)_{t \geq 0}$ be a standard Brownian motion in $\mathbb{R}^d$.
(i) If $U$ is an orthogonal matrix, then $(U B_t)_{t \geq 0}$ is a standard Brownian motion.
(ii) Brownian scaling: if $a > 0$, then $(a^{-1/2} B_{at})_{t \geq 0}$ is a standard Brownian motion. This is known as a random fractal property.
(iii) (Simple) Markov property: for all $s \geq 0$, the process $(B_{t+s} - B_s)_{t \geq 0}$ is a standard Brownian motion, independent of $\mathcal{F}_s^B$.
(iv) Time inversion: define a process
\[
  X_t = \begin{cases} 0 & t = 0\\ t B_{1/t} & t > 0 \end{cases}.
\]
Then $(X_t)_{t \geq 0}$ is a standard Brownian motion.
Proof. Only (iv) requires proof. It is enough to prove that $X_t$ is continuous and has the right finite-dimensional distributions. We have
\[
  (X_{t_1}, \ldots, X_{t_m}) = (t_1 B_{1/t_1}, \ldots, t_m B_{1/t_m}).
\]
The right-hand side is the image of $(B_{1/t_1}, \ldots, B_{1/t_m})$ under a linear isomorphism. So it is Gaussian. If $s \leq t$, then the covariance is
\[
  \operatorname{cov}(s B_{1/s}, t B_{1/t}) = st \operatorname{cov}(B_{1/s}, B_{1/t}) = st \left(\frac{1}{s} \wedge \frac{1}{t}\right) = s = s \wedge t.
\]
Continuity is obvious for $t > 0$. To prove continuity at $0$, we already proved that $(X_q)_{q > 0, q \in \mathbb{Q}}$ has the same law (as a process) as Brownian motion. By continuity of $X_t$ for positive $t$, we have
\[
  \mathbb{P}\left(\lim_{q \in \mathbb{Q}^+,\, q \to 0} X_q = 0\right) = \mathbb{P}\left(\lim_{q \in \mathbb{Q}^+,\, q \to 0} B_q = 0\right) = 1
\]
by continuity of $B$.
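As an illustrative numerical check of the covariance computation in (iv) (the sample size, seed, and the choice $s = 0.5$, $t = 2$ are arbitrary):

```python
import numpy as np

# For X_t = t * B_{1/t}, cov(X_s, X_t) should equal s ^ t.
# Take s = 0.5, t = 2.0, so the relevant Brownian times are
# 1/t = 0.5 and 1/s = 2.0, simulated via independent increments.
rng = np.random.default_rng(2)
n = 200_000
s, t = 0.5, 2.0
B_inv_t = np.sqrt(1 / t) * rng.standard_normal(n)                    # B_{1/t}
B_inv_s = B_inv_t + np.sqrt(1 / s - 1 / t) * rng.standard_normal(n)  # B_{1/s}
X_s, X_t = s * B_inv_s, t * B_inv_t
cov_est = np.mean(X_s * X_t)
print(cov_est)  # should be close to s ^ t = 0.5
```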
Using the natural filtration, we have

Theorem. For all $s \geq 0$, the process $(B_{t+s} - B_s)_{t \geq 0}$ is independent of $\mathcal{F}_s^+$.
Proof. Take a sequence $s_n \to s$ such that $s_n > s$ for all $n$. By continuity,
\[
  B_{t+s} - B_s = \lim_{n \to \infty} (B_{t+s_n} - B_{s_n})
\]
almost surely. Now each of the $B_{t+s_n} - B_{s_n}$ is independent of $\mathcal{F}_s^+$, and hence so is the limit.
Theorem (Blumenthal's 0-1 law). The $\sigma$-algebra $\mathcal{F}_0^+$ is trivial, i.e. if $A \in \mathcal{F}_0^+$, then $\mathbb{P}(A) \in \{0, 1\}$.
Proof. Apply our previous theorem. Take $A \in \mathcal{F}_0^+$. Then $A \in \sigma(B_s : s \geq 0)$. So $A$ is independent of itself, and hence $\mathbb{P}(A) = \mathbb{P}(A)^2 \in \{0, 1\}$.
Proposition.
(i) If $d = 1$, then
\begin{align*}
  1 &= \mathbb{P}(\inf\{t \geq 0 : B_t > 0\} = 0)\\
    &= \mathbb{P}(\inf\{t \geq 0 : B_t < 0\} = 0)\\
    &= \mathbb{P}(\inf\{t > 0 : B_t = 0\} = 0).
\end{align*}
(ii) For any $d \geq 1$, we have
\[
  \lim_{t \to \infty} \frac{B_t}{t} = 0
\]
almost surely.
(iii) If we define
\[
  S_t = \sup_{0 \leq s \leq t} B_s, \quad I_t = \inf_{0 \leq s \leq t} B_s,
\]
then $S_\infty = \infty$ and $I_\infty = -\infty$ almost surely.
(iv) If $A$ is open in $\mathbb{R}^d$, then the cone of $A$ is $C_A = \{tx : x \in A, t > 0\}$. Then $\inf\{t \geq 0 : B_t \in C_A\} = 0$ almost surely.
Thus, Brownian motion is pretty chaotic.
Proof.
(i) It suffices to prove the first equality. Note that the event $\{\inf\{t \geq 0 : B_t > 0\} = 0\}$ lies in $\mathcal{F}_0^+$, hence is trivial. Moreover, for any finite $t$, the probability that $B_t > 0$ is $\frac{1}{2}$. Then take a sequence $t_n$ such that $t_n \to 0$, and apply Fatou to conclude that the probability is positive.
(ii) Follows from the previous one since $t B_{1/t}$ is a Brownian motion.
(iii) By scale invariance, because $S_\infty$ and $a S_\infty$ have the same distribution for all $a > 0$.
(iv) Same as (i).
Theorem (Strong Markov property). Let $(B_t)_{t \geq 0}$ be a standard Brownian motion in $\mathbb{R}^d$, and let $T$ be an almost-surely finite stopping time with respect to $(\mathcal{F}_t^+)_{t \geq 0}$. Then
\[
  \tilde{B}_t = B_{T+t} - B_T
\]
is a standard Brownian motion with respect to $(\mathcal{F}_{T+t}^+)_{t \geq 0}$ that is independent of $\mathcal{F}_T^+$.
Proof. Let $T_n = 2^{-n} \lceil 2^n T \rceil$. We first prove the statement for $T_n$. We let
\[
  B_t^{(k)} = B_{t + k/2^n} - B_{k/2^n}.
\]
This is then a standard Brownian motion independent of $\mathcal{F}_{k/2^n}^+$ by the simple Markov property. Let
\[
  B^*_t = B_{t + T_n} - B_{T_n}.
\]
Let $\mathcal{A}$ be the $\sigma$-algebra on $C = C([0, \infty), \mathbb{R}^d)$, and $A \in \mathcal{A}$. Let $E \in \mathcal{F}_{T_n}^+$. The claim that $B^*$ is a standard Brownian motion independent of $E$ can be concisely captured in the equality
\[
  \mathbb{P}(\{B^* \in A\} \cap E) = \mathbb{P}(\{B \in A\})\,\mathbb{P}(E). \quad (*)
\]
Taking $E = \Omega$ tells us $B^*$ and $B$ have the same law, and then taking general $E$ tells us $B^*$ is independent of $\mathcal{F}_{T_n}^+$.

It is a straightforward computation to prove $(*)$. Indeed, we have
\[
  \mathbb{P}(\{B^* \in A\} \cap E) = \sum_{k=0}^{\infty} \mathbb{P}\left(\{B^{(k)} \in A\} \cap E \cap \left\{T_n = \frac{k}{2^n}\right\}\right).
\]
Since $E \in \mathcal{F}_{T_n}^+$, we know $E \cap \{T_n = k/2^n\} \in \mathcal{F}_{k/2^n}^+$. So by the simple Markov property, this is equal to
\[
  = \sum_{k=0}^{\infty} \mathbb{P}(\{B^{(k)} \in A\})\,\mathbb{P}\left(E \cap \left\{T_n = \frac{k}{2^n}\right\}\right).
\]
But we know $B^{(k)}$ is a standard Brownian motion. So this is equal to
\[
  = \sum_{k=0}^{\infty} \mathbb{P}(\{B \in A\})\,\mathbb{P}\left(E \cap \left\{T_n = \frac{k}{2^n}\right\}\right) = \mathbb{P}(\{B \in A\})\,\mathbb{P}(E).
\]
So we are done.

Now as $n \to \infty$, the increments of $B^*$ converge almost surely to the increments of $\tilde{B}$, since $B$ is continuous and $T_n \searrow T$ almost surely. But the processes $B^*$ constructed from the different $T_n$ all have the same distribution, and almost sure convergence implies convergence in distribution. So $\tilde{B}$ is a standard Brownian motion. Being independent of $\mathcal{F}_T^+$ is clear.
We know that we can reset our process any time we like, and we also know
that we have a bunch of invariance properties. We can combine these to prove
some nice results.
Theorem (Reflection principle). Let $(B_t)_{t \geq 0}$ and $T$ be as above. Then the reflected process $(\tilde{B}_t)_{t \geq 0}$ defined by
\[
  \tilde{B}_t = B_t \mathbf{1}_{t < T} + (2 B_T - B_t) \mathbf{1}_{t \geq T}
\]
is a standard Brownian motion.
Of course, the fact that we are reflecting is not important. We can apply any
operation that preserves the law. This theorem is “obvious”, but we can be a
bit more careful in writing down a proof.
Proof. By the strong Markov property, we know
\[
  B_t^T = B_{T+t} - B_T
\]
and $-B_t^T$ are standard Brownian motions independent of $\mathcal{F}_T^+$. This implies that the pairs of random variables
\[
  P_1 = ((B_t)_{0 \leq t \leq T}, (B_t^T)_{t \geq 0}), \quad P_2 = ((B_t)_{0 \leq t \leq T}, (-B_t^T)_{t \geq 0})
\]
taking values in $C \times C$ have the same law on $C \times C$ with the product $\sigma$-algebra.

Define the concatenation map $\psi_T : C \times C \to C$ by
\[
  \psi_T(X, Y)_t = X_t \mathbf{1}_{t < T} + (X_T + Y_{t - T}) \mathbf{1}_{t \geq T}.
\]
Assuming $Y_0 = 0$, the resulting process is continuous.

Notice that $\psi_T$ is a measurable map, which we can prove by approximations of $T$ by discrete stopping times. We then conclude that $\psi_T(P_1) = B$ has the same law as $\psi_T(P_2) = \tilde{B}$.
Corollary. Let $(B_t)_{t \geq 0}$ be a standard Brownian motion in $d = 1$. Let $b > 0$ and $a \leq b$. Let
\[
  S_t = \sup_{0 \leq s \leq t} B_s.
\]
Then
\[
  \mathbb{P}(S_t \geq b, B_t \leq a) = \mathbb{P}(B_t \geq 2b - a).
\]
Proof. Consider the stopping time $T$ given by the first hitting time of $b$. Since $S_\infty = \infty$, we know $T$ is finite almost surely. Let $(\tilde{B}_t)_{t \geq 0}$ be the reflected process. Then
\[
  \{S_t \geq b, B_t \leq a\} = \{\tilde{B}_t \geq 2b - a\}.
\]
Corollary. The law of $S_t$ is equal to the law of $|B_t|$.

Proof. Apply the previous corollary with $b = a$ to get
\begin{align*}
  \mathbb{P}(S_t \geq a) &= \mathbb{P}(S_t \geq a, B_t < a) + \mathbb{P}(S_t \geq a, B_t \geq a)\\
  &= \mathbb{P}(B_t \geq a) + \mathbb{P}(B_t \geq a)\\
  &= \mathbb{P}(B_t \geq a) + \mathbb{P}(B_t \leq -a)\\
  &= \mathbb{P}(|B_t| \geq a).
\end{align*}
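The identity in law $S_t \sim |B_t|$ can also be checked by simulation. Below is an illustrative sketch using a fine random-walk discretisation (the step count, sample size, seed, and threshold are arbitrary choices; note the discrete maximum slightly undershoots the true supremum):

```python
import numpy as np

# Estimate P(S_1 >= 1) and P(|B_1| >= 1); both should be close to
# 2 * (1 - Phi(1)), about 0.317, by the corollary above.
rng = np.random.default_rng(3)
n_paths, n_steps = 50_000, 2_000
dt = 1.0 / n_steps
B = np.zeros(n_paths)   # current value of each path
S = np.zeros(n_paths)   # running maximum of each path
for _ in range(n_steps):
    B += np.sqrt(dt) * rng.standard_normal(n_paths)
    np.maximum(S, B, out=S)  # update running maxima in place
print(np.mean(S >= 1.0), np.mean(np.abs(B) >= 1.0))
```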
Proposition. Let $d = 1$ and $(B_t)_{t \geq 0}$ be a standard Brownian motion. Then the following processes are $(\mathcal{F}_t^+)_{t \geq 0}$-martingales:
(i) $(B_t)_{t \geq 0}$
(ii) $(B_t^2 - t)_{t \geq 0}$
(iii) $\left(\exp\left(u B_t - \frac{u^2 t}{2}\right)\right)_{t \geq 0}$ for $u \in \mathbb{R}$.
Proof.
(i) Using the fact that $B_t - B_s$ is independent of $\mathcal{F}_s^+$, we know
\[
  \mathbb{E}(B_t - B_s \mid \mathcal{F}_s^+) = \mathbb{E}(B_t - B_s) = 0.
\]
(ii) We have
\[
  \mathbb{E}(B_t^2 - t \mid \mathcal{F}_s^+) = \mathbb{E}((B_t - B_s)^2 \mid \mathcal{F}_s^+) - \mathbb{E}(B_s^2 \mid \mathcal{F}_s^+) + 2\mathbb{E}(B_t B_s \mid \mathcal{F}_s^+) - t.
\]
We know $B_t - B_s$ is independent of $\mathcal{F}_s^+$, and so the first term is equal to $\operatorname{var}(B_t - B_s) = (t - s)$, and we can simplify to get
\[
  = (t - s) - B_s^2 + 2 B_s^2 - t = B_s^2 - s.
\]
(iii) Similar.
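For (iii), the constant-mean property $\mathbb{E}\left[\exp\left(u B_t - \frac{u^2 t}{2}\right)\right] = 1$ can at least be checked numerically, since $B_t \sim N(0, t)$. An illustrative sketch (sample size, seed, and the parameters $t$, $u$ are arbitrary):

```python
import numpy as np

# The exponential martingale evaluated at a fixed time t should have
# mean 1 for any u, because E[exp(u B_t)] = exp(u^2 t / 2).
rng = np.random.default_rng(4)
n = 1_000_000
t, u = 2.0, 0.8
B_t = np.sqrt(t) * rng.standard_normal(n)
M = np.exp(u * B_t - u**2 * t / 2)
print(M.mean())  # should be close to 1
```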