3 The stochastic integral

III Stochastic Calculus and Applications

3.4 Extension to semi-martingales
Definition (Locally bounded previsible process). A previsible process $H$ is locally bounded if for all $t \geq 0$, we have
$$\sup_{s \leq t} |H_s| < \infty \quad \text{a.s.}$$
Fact.
(i) Any adapted continuous process is locally bounded.
(ii) If $H$ is locally bounded and $A$ is a finite variation process, then for all $t \geq 0$, we have
$$\int_0^t |H_s| \, |dA_s| < \infty \quad \text{a.s.}$$
Now if $X = X_0 + M + A$ is a semi-martingale, where $X_0 \in \mathcal{F}_0$, $M$ is a continuous local martingale and $A$ is a finite variation process, we want to define $\int H_s \, dX_s$. We already know what it means to integrate with respect to $dM_s$ and $dA_s$, using the Itô integral and the finite variation integral respectively, and $X_0$ doesn't change, so we can ignore it.
Definition (Stochastic integral). Let $X = X_0 + M + A$ be a continuous semi-martingale, and $H$ a locally bounded previsible process. Then the stochastic integral $H \cdot X$ is the continuous semi-martingale defined by
$$H \cdot X = H \cdot M + H \cdot A,$$
and we write
$$(H \cdot X)_t = \int_0^t H_s \, dX_s.$$
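Since the stochastic integral is defined part by part, a numerical sketch can make the definition concrete. Everything concrete below is an assumption made for illustration, not part of the notes: the grid, the toy finite variation part $A_t = t^2$, and the integrand $H = M$ (a simulated Brownian path, sampled at left endpoints so that the sum mimics a previsible integrand).

```python
import numpy as np

rng = np.random.default_rng(0)

# Time grid on [0, 1]
n = 100_000
t = np.linspace(0.0, 1.0, n + 1)
dt = 1.0 / n

# Martingale part M: a Brownian path; finite variation part A_t = t^2
dM = rng.normal(0.0, np.sqrt(dt), n)
M = np.concatenate([[0.0], np.cumsum(dM)])
dA = np.diff(t**2)

X = M + t**2            # semi-martingale with X_0 = 0

# Locally bounded previsible integrand: H_s = M_s at left endpoints
H = M[:-1]

# The definition: H·X = H·M + H·A, each part a left-endpoint sum here
H_dot_M = float(np.sum(H * dM))     # discretised Itô integral
H_dot_A = float(np.sum(H * dA))     # Lebesgue–Stieltjes part
H_dot_X = H_dot_M + H_dot_A

# Summing H directly against the increments of X gives the same number
assert np.isclose(H_dot_X, np.sum(H * np.diff(X)))
```

The final check works because the increments of $X$ are just the increments of $M$ plus those of $A$, which is exactly the linearity the definition exploits.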
Proposition.
(i) $(H, X) \mapsto H \cdot X$ is bilinear.
(ii) $H \cdot (K \cdot X) = (HK) \cdot X$ if $H$ and $K$ are locally bounded.
(iii) $(H \cdot X)^T = (H 1_{[0,T]}) \cdot X = H \cdot X^T$ for every stopping time $T$.
(iv) If $X$ is a continuous local martingale (resp. a finite variation process), then so is $H \cdot X$.
(v) If $H = \sum_{i=1}^n H_{i-1} 1_{(t_{i-1}, t_i]}$ and $H_{i-1} \in \mathcal{F}_{t_{i-1}}$ (not necessarily bounded), then
$$(H \cdot X)_t = \sum_{i=1}^n H_{i-1} \big(X_{t_i \wedge t} - X_{t_{i-1} \wedge t}\big).$$
Proof. (i) to (iv) follow from the analogous properties for $H \cdot M$ and $H \cdot A$. The last part is also true by definition if the $H_{i-1}$ are uniformly bounded. If $H_{i-1}$ is not bounded, then the finite variation part is still fine, since for each fixed $\omega \in \Omega$, $H_{i-1}(\omega)$ is a fixed number. For the martingale part, set
$$T_n = \inf\{t \geq 0 : |H_t| \geq n\}.$$
Then the $T_n$ are stopping times with $T_n \to \infty$, and $H 1_{[0,T_n]} \in \mathcal{E}$. Thus
$$(H \cdot M)_{t \wedge T_n} = \sum_{i=1}^n H_{i-1} \big(X_{t_i \wedge t \wedge T_n} - X_{t_{i-1} \wedge t \wedge T_n}\big).$$
Then take the limit $n \to \infty$.
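Property (v) can be checked numerically on a single simulated path. In the sketch below, all concrete choices are assumptions made for illustration: a four-piece simple process with $H_{i-1} = B_{t_{i-1}}$ (so each coefficient is $\mathcal{F}_{t_{i-1}}$-measurable), and evaluation time $t = 0.6$ part-way through a piece. The formula $\sum_i H_{i-1}(X_{t_i \wedge t} - X_{t_{i-1} \wedge t})$ agrees with summing $H$ against the fine-grid increments of the path.

```python
import numpy as np

rng = np.random.default_rng(1)

# One Brownian path B on a dyadic grid over [0, 1]
n = 8192
grid = np.arange(n + 1) / n
dB = rng.normal(0.0, np.sqrt(1.0 / n), n)
B = np.concatenate([[0.0], np.cumsum(dB)])

def B_at(u):
    """B at time u, for u on (or rounded to) the grid."""
    return B[int(round(u * n))]

# Simple previsible process H = sum_i H_{i-1} 1_{(t_{i-1}, t_i]},
# with H_{i-1} = B_{t_{i-1}}, known at time t_{i-1}
ts = [0.0, 0.25, 0.5, 0.75, 1.0]
Hs = [B_at(u) for u in ts[:-1]]

t = 0.6   # evaluate (H·B)_t part-way through the third piece

# Formula (v): (H·B)_t = sum_i H_{i-1} (B_{t_i ∧ t} − B_{t_{i-1} ∧ t})
formula = sum(h * (B_at(min(ti, t)) - B_at(min(ti_prev, t)))
              for h, ti_prev, ti in zip(Hs, ts[:-1], ts[1:]))

# The same value, summing H against the fine-grid increments of B up to t
k = int(round(t * n))
idx = np.searchsorted(ts, grid[:k], side="right") - 1  # piece containing each s
direct = float(np.sum(np.array(Hs)[idx] * dB[:k]))
```

The two sums telescope to the same quantity, which is the point of (v): for a simple previsible integrand the stochastic integral really is the obvious discrete sum, stopped at $t$.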
Before we get to Itô's formula, we need a few more useful properties:
Proposition (Stochastic dominated convergence theorem). Let $X$ be a continuous semi-martingale. Let $H, H^n$ be previsible and locally bounded, and let $K$ be previsible and non-negative. Let $t > 0$. Suppose
(i) $H^n_s \to H_s$ as $n \to \infty$ for all $s \in [0, t]$.
(ii) $|H^n_s| \leq K_s$ for all $s \in [0, t]$ and $n \in \mathbb{N}$.
(iii) $\int_0^t K_s^2 \, d\langle M \rangle_s < \infty$ and $\int_0^t K_s \, |dA_s| < \infty$ (note that both conditions hold if $K$ is locally bounded).
Then
$$\int_0^t H^n_s \, dX_s \to \int_0^t H_s \, dX_s$$
in probability.
Proof. For the finite variation part, the convergence follows from the usual dominated convergence theorem. For the martingale part, we set
$$T_m = \inf\left\{ t \geq 0 : \int_0^t K_s^2 \, d\langle M \rangle_s \geq m \right\}.$$
So we have
$$\mathbb{E}\left[\left( \int_0^{T_m \wedge t} H^n_s \, dM_s - \int_0^{T_m \wedge t} H_s \, dM_s \right)^2\right] = \mathbb{E}\left[ \int_0^{T_m \wedge t} (H^n_s - H_s)^2 \, d\langle M \rangle_s \right] \to 0,$$
using the usual dominated convergence theorem, since $\int_0^{T_m \wedge t} K_s^2 \, d\langle M \rangle_s \leq m$. Since $T_m \wedge t = t$ eventually as $m \to \infty$ almost surely, hence in probability, we are done.
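The theorem can be watched numerically on one path. Everything concrete in the sketch below is an assumption for illustration: we take the semi-martingale to be purely a Brownian path $B$ (no finite variation part, to keep it short), with $H^n_s = B_s(1 - e^{-ns})$, which converges pointwise to $H_s = B_s$ and is dominated by $K_s = |B_s|$.

```python
import numpy as np

rng = np.random.default_rng(3)

# One Brownian path, X = M = B (an assumption to keep the example short)
m = 100_000
dt = 1.0 / m
s = np.arange(m) * dt                  # left endpoints of the grid cells
dB = rng.normal(0.0, np.sqrt(dt), m)
B = np.concatenate([[0.0], np.cumsum(dB)])[:-1]   # B at the left endpoints

limit = float(np.sum(B * dB))          # discretised ∫_0^1 B_s dB_s

# H^n_s = B_s (1 − e^{−n s}) → B_s pointwise, and |H^n_s| ≤ K_s = |B_s|
diffs = [abs(float(np.sum(B * (1.0 - np.exp(-n * s)) * dB)) - limit)
         for n in (1, 10, 100, 1000)]
# diffs typically shrink toward 0, matching convergence in probability
```

Note the convergence asserted by the theorem is in probability, so on a single path the errors only typically decrease; the dominating process $K$ is what rules out mass escaping as $n$ grows.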
Proposition. Let $X$ be a continuous semi-martingale, and let $H$ be an adapted bounded left-continuous process. Then for every sequence of subdivisions $0 = t_0^{(m)} < t_1^{(m)} < \cdots < t_{n_m}^{(m)} = t$ of $[0, t]$ with $\max_i |t_i^{(m)} - t_{i-1}^{(m)}| \to 0$, we have
$$\int_0^t H_s \, dX_s = \lim_{m \to \infty} \sum_{i=1}^{n_m} H_{t_{i-1}^{(m)}} \big(X_{t_i^{(m)}} - X_{t_{i-1}^{(m)}}\big)$$
in probability.
Proof. We have already proved this for the Lebesgue–Stieltjes integral, and all we used there was dominated convergence. So the same proof works, using the stochastic dominated convergence theorem in place of the usual one.
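As a closing numerical illustration (again a sketch with assumed parameters, not part of the notes), take $X = B$ a simulated Brownian path and $H = B$, which is adapted and continuous, hence left-continuous. The standard Itô identity $\int_0^1 B_s \, dB_s = (B_1^2 - 1)/2$ gives a target value, and the left-endpoint sums of the proposition approach it as the mesh shrinks.

```python
import numpy as np

rng = np.random.default_rng(2)

# One Brownian path on a very fine dyadic grid over [0, 1]
N = 2**20
dB = rng.normal(0.0, np.sqrt(1.0 / N), N)
B = np.concatenate([[0.0], np.cumsum(dB)])

# Standard Itô identity for this path: ∫_0^1 B_s dB_s = (B_1^2 − 1)/2
target = (B[-1] ** 2 - 1.0) / 2.0

def left_sum(step):
    """Left-endpoint Riemann sum over the subdivision t_i = i * step / N."""
    Bc = B[::step]                  # the path sampled on a coarser grid
    return float(np.sum(Bc[:-1] * np.diff(Bc)))

# Errors for meshes 2^-8, 2^-12, 2^-16, 2^-20; they typically shrink to 0
errors = [abs(left_sum(2**k) - target) for k in (12, 8, 4, 0)]
```

Using left endpoints is essential: right-endpoint or midpoint sums converge to different limits (Stratonovich-type corrections), which is exactly why previsibility of the integrand matters throughout this section.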