2 Semi-martingales

III Stochastic Calculus and Applications



2.2 Local martingale
From now on, we assume that $(\Omega, \mathcal{F}, (\mathcal{F}_t)_t, \mathbb{P})$ satisfies the usual conditions, namely that

(i) $\mathcal{F}_0$ contains all $\mathbb{P}$-null sets;

(ii) $(\mathcal{F}_t)_t$ is right-continuous, i.e. $\mathcal{F}_t = \mathcal{F}_{t+} = \bigcap_{s > t} \mathcal{F}_s$ for all $t \geq 0$.
We recall some of the properties of continuous martingales.
Theorem (Optional stopping theorem). Let $X$ be a càdlàg adapted integrable process. Then the following are equivalent:

(i) $X$ is a martingale, i.e. $X_t \in L^1$ for every $t$, and $\mathbb{E}(X_t \mid \mathcal{F}_s) = X_s$ for all $t > s$.
(ii) The stopped process $X^T = (X^T_t) = (X_{T \wedge t})$ is a martingale for all stopping times $T$.

(iii) For all stopping times $T, S$ with $T$ bounded, $X_T \in L^1$ and $\mathbb{E}(X_T \mid \mathcal{F}_S) = X_{T \wedge S}$ almost surely.

(iv) For all bounded stopping times $T$, $X_T \in L^1$ and $\mathbb{E}(X_T) = \mathbb{E}(X_0)$.
For X uniformly integrable, (iii) and (iv) hold for all stopping times.
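Property (iv) is easy to test numerically. The following Monte Carlo sketch (an illustration of the theorem, not part of the notes; names such as `n_paths` and `level` are made up) uses a simple random walk as the martingale and the bounded stopping time $T = \min(\text{first hit of level } 5,\ 200)$:

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo check of (iv) for a simple random walk martingale X.
# T = min(first time the walk reaches `level`, n_steps) is a bounded
# stopping time, so the theorem predicts E(X_T) = E(X_0) = 0.
n_paths, n_steps, level = 20_000, 200, 5
steps = rng.choice([-1, 1], size=(n_paths, n_steps))
walk = np.concatenate(
    [np.zeros((n_paths, 1), dtype=int), steps.cumsum(axis=1)], axis=1
)

hit = walk >= level
# argmax gives the first True per row; paths that never hit are capped at n_steps.
T = np.where(hit.any(axis=1), hit.argmax(axis=1), n_steps)
X_T = walk[np.arange(n_paths), T]

print(X_T.mean())  # close to 0, up to Monte Carlo error
```

Note that without the cap the hitting time of level 5 is unbounded (and in fact has infinite mean), which is exactly why (iv) insists on bounded stopping times.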
In practice, most of our results will be first proven for bounded martingales,
or perhaps square integrable ones. The point is that the square-integrable
martingales form a Hilbert space, and Hilbert space techniques can help us say
something useful about these martingales. To get something about a general
martingale $M$, we can apply a cutoff $T_n = \inf\{t > 0 : M_t \geq n\}$, and then $M^{T_n}$ will be a martingale for all $n$. We can then take the limit $n \to \infty$ to recover something about the martingale itself.
But if we are doing this, we might as well weaken the martingale condition a bit: we only need the $M^{T_n}$ to be martingales. Of course, we aren't doing this just for fun. In general, martingales will not always be closed under the operations we are interested in, but local (or maybe semi-) martingales will be.
In general, we define
Definition (Local martingale). A càdlàg adapted process $X$ is a local martingale if there exists a sequence of stopping times $T_n$ such that $T_n \to \infty$ almost surely, and $X^{T_n}$ is a martingale for every $n$. We say the sequence $(T_n)$ reduces $X$.
Example.

(i) Every martingale is a local martingale, since by the optional stopping theorem, we can take $T_n = n$.

(ii) Let $(B_t)$ be a standard 3d Brownian motion on $\mathbb{R}^3$. Then
$$(X_t)_{t \geq 1} = \left(\frac{1}{|B_t|}\right)_{t \geq 1}$$
is a local martingale but not a martingale.

To see this, first note that
$$\sup_{t \geq 1} \mathbb{E} X_t^2 < \infty, \qquad \mathbb{E} X_t \to 0.$$
Since $\mathbb{E} X_t \to 0$ and $X_t \geq 0$, we know $X$ cannot be a martingale. However, we can check that it is a local martingale. Recall that for any $f \in C^2_b$,
$$M^f_t = f(B_t) - f(B_1) - \frac{1}{2} \int_1^t \Delta f(B_s)\, \mathrm{d}s$$
is a martingale. Moreover, $\Delta \frac{1}{|x|} = 0$ for all $x \neq 0$. Thus, if $\frac{1}{|x|}$ didn't have a singularity at $0$, this would have told us $X_t$ is a martingale. Thus, we are safe if we try to bound $|B_s|$ away from zero.
Let
$$T_n = \inf\left\{t \geq 1 : |B_t| < \frac{1}{n}\right\},$$
and pick $f_n \in C^2_b$ such that $f_n(x) = \frac{1}{|x|}$ for $|x| \geq \frac{1}{n}$. Then $X^{T_n}_t - X^{T_n}_1 = M^{f_n}_{t \wedge T_n}$. So $X^{T_n}$ is a martingale.
It remains to show that $T_n \to \infty$, and this follows from the fact that $\mathbb{E} X_t \to 0$.
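The failure of the martingale property is easy to see numerically: a martingale has constant mean, but here $\mathbb{E} X_t$ visibly decays. A quick Monte Carlo sketch (the helper `mean_inverse_norm` is a made-up name for this illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_inverse_norm(t, n_samples=200_000):
    """Monte Carlo estimate of E[1/|B_t|] for standard 3d Brownian motion.

    Since B_t has the same law as sqrt(t) * N(0, I_3), we can sample the
    endpoint directly instead of simulating a whole path.
    """
    b = np.sqrt(t) * rng.standard_normal((n_samples, 3))
    return float(np.mean(1.0 / np.linalg.norm(b, axis=1)))

# By Brownian scaling, E[1/|B_t|] = E[1/|B_1|] / sqrt(t) -> 0 as t -> infinity,
# even though X_t = 1/|B_t| is a (nonnegative) local martingale.
for t in [1.0, 4.0, 16.0, 64.0]:
    print(t, mean_inverse_norm(t))
```

For standard 3d Brownian motion started at $0$, $|B_1|$ has the chi distribution with three degrees of freedom, so $\mathbb{E} X_1 = \sqrt{2/\pi} \approx 0.8$ and $\mathbb{E} X_t = \sqrt{2/\pi}/\sqrt{t}$.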
Proposition. Let $X$ be a local martingale with $X_t \geq 0$ for all $t$. Then $X$ is a supermartingale.
Proof. Let $(T_n)$ be a reducing sequence for $X$. Then, by the conditional Fatou lemma,
$$\mathbb{E}(X_t \mid \mathcal{F}_s) = \mathbb{E}\Big(\liminf_{n \to \infty} X_{t \wedge T_n} \,\Big|\, \mathcal{F}_s\Big) \leq \liminf_{n \to \infty} \mathbb{E}(X_{t \wedge T_n} \mid \mathcal{F}_s) = \liminf_{n \to \infty} X_{s \wedge T_n} = X_s.$$
Recall the following result from Advanced Probability:
Proposition. Let $X \in L^1(\Omega, \mathcal{F}, \mathbb{P})$. Then the set
$$\chi = \{\mathbb{E}(X \mid \mathcal{G}) : \mathcal{G} \subseteq \mathcal{F} \text{ a sub-}\sigma\text{-algebra}\}$$
is uniformly integrable, i.e.
$$\sup_{Y \in \chi} \mathbb{E}\big(|Y| \mathbf{1}_{|Y| > \lambda}\big) \to 0 \quad \text{as } \lambda \to \infty.$$
Recall also the following important result about uniformly integrable random
variables:
Theorem (Vitali theorem). $X_n \to X$ in $L^1$ if and only if $(X_n)$ is uniformly integrable and $X_n \to X$ in probability.
With these, we can state the following characterization of martingales in
terms of local martingales:
Proposition. The following are equivalent:
(i) X is a martingale.
(ii) $X$ is a local martingale, and for all $t \geq 0$, the set
$$\chi_t = \{X_T : T \text{ is a stopping time with } T \leq t\}$$
is uniformly integrable.
Proof.

(i) $\Rightarrow$ (ii): Let $X$ be a martingale. Then by the optional stopping theorem, $X_T = \mathbb{E}(X_t \mid \mathcal{F}_T)$ for any bounded stopping time $T \leq t$. So $\chi_t$ is uniformly integrable by the previous proposition.
(ii) $\Rightarrow$ (i): Let $X$ be a local martingale with reducing sequence $(T_n)$, and assume that the sets $\chi_t$ are uniformly integrable for all $t \geq 0$. By the optional stopping theorem, it suffices to show that $\mathbb{E}(X_T) = \mathbb{E}(X_0)$ for any bounded stopping time $T$.
So let $T$ be a bounded stopping time, say $T \leq t$. Then
$$\mathbb{E}(X_0) = \mathbb{E}(X^{T_n}_0) = \mathbb{E}(X^{T_n}_T) = \mathbb{E}(X_{T \wedge T_n})$$
for all $n$. Now $T \wedge T_n$ is a stopping time $\leq t$, so $\{X_{T \wedge T_n}\}$ is uniformly integrable by assumption. Moreover, $T_n \wedge T \to T$ almost surely as $n \to \infty$, hence $X_{T \wedge T_n} \to X_T$ in probability. Hence by Vitali, $X_{T \wedge T_n} \to X_T$ in $L^1$. So $\mathbb{E}(X_T) = \mathbb{E}(X_0)$.
Corollary. Let $X$ be a local martingale. If $Z \in L^1$ is such that $|X_t| \leq Z$ for all $t$, then $X$ is a martingale. In particular, every bounded local martingale is a martingale.
The definition of a local martingale does not give us control over what the reducing sequence $(T_n)$ is. In particular, it is not necessarily true that $X^{T_n}$ will be bounded, which is a helpful property to have. Fortunately, we have the following proposition:
Proposition. Let $X$ be a continuous local martingale with $X_0 = 0$. Define
$$S_n = \inf\{t \geq 0 : |X_t| = n\}.$$
Then $S_n$ is a stopping time, $S_n \to \infty$, and $X^{S_n}$ is a bounded martingale. In particular, $(S_n)$ reduces $X$.
Proof. It is clear that $S_n$ is a stopping time, since (if it is not clear)
$$\{S_n \leq t\} = \bigcap_{k \in \mathbb{N}} \left\{\sup_{s \leq t} |X_s| > n - \frac{1}{k}\right\} = \bigcap_{k \in \mathbb{N}} \bigcup_{s < t,\, s \in \mathbb{Q}} \left\{|X_s| > n - \frac{1}{k}\right\} \in \mathcal{F}_t.
$$
It is also clear that $S_n \to \infty$, since
$$\sup_{s \leq t} |X_s| < n \implies S_n \geq t,$$
and by continuity and compactness, $\sup_{s \leq t} |X_s|$ is finite for every $(\omega, t)$.
Finally, we show that $X^{S_n}$ is a martingale. If $(T_m)$ is a reducing sequence for $X$, then by the optional stopping theorem, $X^{T_m \wedge S_n} = (X^{T_m})^{S_n}$ is a martingale for every $m$, so $X^{S_n}$ is a local martingale. But it is also bounded by $n$. So it is a martingale.
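On a simulated path one can watch the $S_n$ at work. A discretised sketch (an illustration only: on a grid the hitting time can overshoot slightly, so the bound $|X^{S_n}| \leq n$ is checked at the pre-hit indices; `first_exit_index` is a made-up helper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretised Brownian motion on [0, 50] as a stand-in for a continuous
# local martingale X with X_0 = 0 (an illustration, not the general case).
dt = 1e-3
n_steps = 50_000
x = np.concatenate(
    [[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n_steps))]
)

def first_exit_index(path, n):
    """Grid version of S_n = inf{t >= 0 : |X_t| = n}."""
    hits = np.flatnonzero(np.abs(path) >= n)
    return hits[0] if hits.size else len(path)  # len(path) plays "infinity"

for n in [1, 2, 3]:
    k = first_exit_index(x, n)
    stopped = x[: k + 1] if k < len(x) else x  # the stopped path X^{S_n}
    # Before the hitting index the stopped path is bounded by n -- this is
    # exactly what upgrades X^{S_n} from a local martingale to a martingale.
    assert np.all(np.abs(stopped[:-1]) < n)
    print(n, k)
```

The indices $k$ are nondecreasing in $n$, mirroring $S_1 \leq S_2 \leq \cdots \to \infty$.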
An important and useful theorem is the following:
Theorem. Let $X$ be a continuous local martingale with $X_0 = 0$. If $X$ is also a finite variation process, then $X_t = 0$ for all $t$.
This would rule out interpreting $\int H_s \,\mathrm{d}X_s$ as a Lebesgue–Stieltjes integral for $X$ a non-zero continuous local martingale. In particular, we cannot take $X$ to be Brownian motion. Instead, we have to develop a new theory of integration for continuous local martingales, namely the Itô integral.

On the other hand, this theorem is very useful. We will later want to define the stochastic integral with respect to the sum of a continuous local martingale and a finite variation process, which is the appropriate generality for our theorems to make good sense. This theorem tells us there is a unique way to decompose a process as a sum of a finite variation process and a continuous local martingale (if it can be done). So we can simply define this stochastic integral by using the Lebesgue–Stieltjes integral on the finite variation part and the Itô integral on the continuous local martingale part.
Proof. Let $X$ be a finite-variation continuous local martingale with $X_0 = 0$. Since $X$ is finite variation, we can define the total variation process $(V_t)$ corresponding to $X$, and let
$$S_n = \inf\{t \geq 0 : V_t \geq n\} = \inf\left\{t \geq 0 : \int_0^t |\mathrm{d}X_s| \geq n\right\}.$$
Then $S_n$ is a stopping time, and $S_n \to \infty$ since $X$ is assumed to be finite variation. Moreover, by optional stopping, $X^{S_n}$ is a local martingale, and is also bounded, since
$$|X^{S_n}_t| \leq \int_0^{t \wedge S_n} |\mathrm{d}X_s| \leq n.$$
So $X^{S_n}$ is in fact a martingale.
We claim its $L^2$-norm vanishes. Let $0 = t_0 < t_1 < \cdots < t_k = t$ be a subdivision of $[0, t]$. Using the fact that $X^{S_n}$ is a martingale and has orthogonal increments, we can write
$$\mathbb{E}\big((X^{S_n}_t)^2\big) = \sum_{i=1}^k \mathbb{E}\big((X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}})^2\big).$$
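The orthogonality of increments invoked here deserves one line of justification (a standard computation, spelled out for completeness): for $i < j$, conditioning on $\mathcal{F}_{t_{j-1}}$ and using that $X^{S_n}$ is a martingale gives

```latex
\mathbb{E}\Big[(X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}})(X^{S_n}_{t_j} - X^{S_n}_{t_{j-1}})\Big]
  = \mathbb{E}\Big[(X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}})
      \underbrace{\mathbb{E}\big(X^{S_n}_{t_j} - X^{S_n}_{t_{j-1}} \,\big|\, \mathcal{F}_{t_{j-1}}\big)}_{=\,0}\Big]
  = 0,
```

so expanding the square of $X^{S_n}_t = \sum_{i=1}^k (X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}})$ and taking expectations leaves only the diagonal terms.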
Observe that $X^{S_n}$ is finite variation, but the right-hand side is summing the square of the variation, which ought to vanish when we take the limit $\max_i |t_i - t_{i-1}| \to 0$. Indeed, we can compute
$$\begin{aligned}
\mathbb{E}\big((X^{S_n}_t)^2\big) &= \sum_{i=1}^k \mathbb{E}\big((X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}})^2\big) \\
&\leq \mathbb{E}\left(\max_{1 \leq i \leq k} \big|X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}}\big| \sum_{i=1}^k \big|X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}}\big|\right) \\
&\leq \mathbb{E}\left(\max_{1 \leq i \leq k} \big|X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}}\big| \cdot V_{t \wedge S_n}\right) \\
&\leq \mathbb{E}\left(\max_{1 \leq i \leq k} \big|X^{S_n}_{t_i} - X^{S_n}_{t_{i-1}}\big| \cdot n\right).
\end{aligned}$$
Of course, the first factor is also bounded by the total variation, hence by $n$. Moreover, we can make further subdivisions so that the mesh size tends to zero, and then the first factor vanishes in the limit by continuity. So by dominated convergence, we must have $\mathbb{E}\big((X^{S_n}_t)^2\big) = 0$. So $X^{S_n}_t = 0$ almost surely for all $n$. So $X_t = 0$ for all $t$, almost surely.
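The mechanism of the proof is visible numerically on the prototypical non-example. For Brownian motion on $[0, 1]$ (a continuous local martingale that is not finite variation), the sum of squared increments over a subdivision stabilises near $t = 1$ as the mesh shrinks, while the total variation blows up; for a finite variation process the total variation stays bounded, which is what forces the sum of squares, and hence the $L^2$-norm above, to vanish. A rough sketch with made-up variable names:

```python
import numpy as np

rng = np.random.default_rng(2)

# One Brownian path on [0, 1] sampled on a fine grid; coarser subdivisions
# are obtained by subsampling the same path.
N = 2**18
fine = np.concatenate(
    [[0.0], np.cumsum(np.sqrt(1.0 / N) * rng.standard_normal(N))]
)

for step in [2**10, 2**6, 2**2, 1]:  # mesh size = step / N, shrinking
    inc = np.diff(fine[::step])
    total_variation = np.abs(inc).sum()  # grows without bound as mesh -> 0
    sum_of_squares = (inc**2).sum()      # settles near t = 1
    print(step, total_variation, sum_of_squares)
```

The stable limit of the sum of squares is the quadratic variation, which will reappear as $\langle X \rangle_t$ when the Itô integral is constructed.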