Stochastic Calculus Notes 4/5


MSc Financial Mathematics - SMM302

4 Itô Integrals and Itô Calculus

4.1 Motivation

A diffusion is a continuous-time Markov process $X_t$ with continuous sample paths such that
\[
E[X_{t+dt} \mid \mathcal{F}_t] = X_t + \mu(t, X_t)\, dt, \qquad \operatorname{Var}[X_{t+dt} \mid \mathcal{F}_t] = \sigma(t, X_t)^2\, dt.
\]
We can write this in the form

\[
X_{t+dt} = X_t + \mu(t, X_t)\, dt + \sigma(t, X_t)\, (W_{t+dt} - W_t),
\]
with $W$ a Brownian motion. We would like to have an explicit representation for $X_t$. It is tempting to convert this to
\[
\frac{dX}{dt} = \mu(t, X_t) + \sigma(t, X_t)\, \frac{dW}{dt}
\]
and deduce that
\[
X_t = X_0 + \int_0^t \mu(s, X_s)\, ds + \int_0^t \sigma(s, X_s)\, \frac{dW}{ds}\, ds.
\]

But this does not work. The problem is related to a point already mentioned in Unit 3, and is explained in the following.

Theorem 1 For any given $t$, the path of a Brownian motion is almost surely not differentiable at $t$.

Proof. Assume $W$ is right differentiable at $t$ with derivative $a$; then for every $\varepsilon > 0$ there exists $h > 0$ small enough such that
\[
\left| \frac{W_{t+h} - W_t}{h} - a \right| < \varepsilon.
\]
Then, in virtue of the properties of the increments of the Brownian motion,
\[
P\left( \left| \frac{W_{t+h} - W_t}{h} - a \right| < \varepsilon \right) = P\left( \left| \frac{W_h}{h} - a \right| < \varepsilon \right). \tag{1}
\]
Use now the time-inversion property of Brownian motions, setting $Z_s = s W_{1/s}$ with $s = 1/h$. Then (1) can be written as
\[
P\left( |Z_s - a| < \varepsilon \right).
\]
As $h \to 0$ we have $s \to \infty$, and the event that $Z$ stays within $\varepsilon$ of $a$ for all sufficiently large $s$ has probability zero, since $Z$ is itself a Brownian motion and hence almost surely unbounded. We conclude that the original assumption, i.e. that $W$ is differentiable at $t$, also has probability 0.

The theorem can in fact be extended to prove that, almost surely, the Brownian motion is nowhere differentiable.

Although we cannot define $dW/dt$, we can nevertheless define $\int_0^t f_s\, dW_s$. This is the Itô integral.
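To see the issue numerically, one can sample the difference quotient $(W_{t+h} - W_t)/h$ for smaller and smaller $h$: its standard deviation grows like $1/\sqrt{h}$, so the quotient does not settle towards any derivative. A minimal Python sketch, assuming NumPy is available (the step sizes, sample size and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
for h in [1e-1, 1e-2, 1e-3, 1e-4, 1e-5]:
    # The increment W_{t+h} - W_t is N(0, h), so it can be sampled directly.
    increments = rng.normal(0.0, np.sqrt(h), size=10_000)
    quotients = increments / h          # difference quotients (W_{t+h} - W_t)/h
    print(f"h = {h:.0e}:  std of (W(t+h) - W(t))/h = {quotients.std():.1f}")
# The standard deviation behaves like 1/sqrt(h), so the quotient has no limit as h -> 0.
```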


4.2 The construction of the Itô integral

The first step in the construction of the Itô integral (or stochastic integral) is to consider a fairly elementary class of integrand functions, $f$. Hence, consider a partition of the time interval $[0, t]$ such that $0 = t_0 < t_1 < \dots < t_j < \dots < t_n = t$.

Definition 2 (Simple function) Let $\mathcal{H}$ be the class of functions $f_t = f(t, \omega) : [0, \infty) \times \Omega \to \mathbb{R}$ such that

i) $f(t, \omega)$ is $\mathcal{F}_t$-adapted,

ii) $E\left[ \int_0^\infty f_t^2\, dt \right] < \infty$.

Then, a function $f \in \mathcal{H}$ is called simple if it has the form
\[
f_t = \sum_j e_j\, \mathbf{1}_{[t_j, t_{j+1})}(t),
\]
for $\mathcal{F}_{t_j}$-measurable (possibly random) variables $e_j$ such that $E\left[ e_j^2 \right] < \infty$.

Now we can define the Itô integral for simple functions.

Definition 3 Let $f \in \mathcal{H}$ be a simple function and $W$ a standard Brownian motion. Then
\[
I(f) = \int_0^\infty f_t\, dW_t := \sum_{j=0}^\infty e_j \left( W_{t_{j+1}} - W_{t_j} \right)
\]
is the Itô integral of $f$ with respect to $W$. We define the finite-time integral $I_t(f)$ by the simple expedient of multiplying $f$ by an indicator function of time, i.e.
\[
I_t(f) = \int_0^t f_s\, dW_s = \int_0^\infty \mathbf{1}_{(0,t]}(s)\, f_s\, dW_s.
\]

Example 1 Consider the integral
\[
I_t(f) = \int_0^t dW_s.
\]
In this case, the integrand is the constant function $f \equiv 1$, which is trivially simple. To calculate the integral, use the definition above:
\[
I_t(f) = \int_0^t dW_s = \sum_{j=0}^{n-1} \left( W_{t_{j+1}} - W_{t_j} \right) = W_t,
\]
which follows by recognising the telescoping sum for $W$.


Example 2 To have an intuitive explanation of what the stochastic integral does, regard $W_t$ as the price per share of an equity at time $t$ (although with a small caveat... can you guess why?), and think of the sequence of time points in the partition of $[0, t]$ as the trading dates in the asset. The quantity $e_j$ is then the position (i.e. the number of shares) taken in the asset at each trading date and held to the next one. In such a situation, you can calculate the gain from your trading strategy over each interval $[t_j, t_{j+1}]$, which is given by
\[
e_j \left( W_{t_{j+1}} - W_{t_j} \right).
\]
The total gain over the time period $[0, t]$ is given by $I_t(f)$.
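A minimal Python sketch of this trading interpretation, assuming NumPy is available: one Brownian path is drawn, a position is chosen at the left endpoint of each subinterval (so the integrand is adapted), and the gains $e_j (W_{t_{j+1}} - W_{t_j})$ are accumulated. The grid size and the position rule are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(1)
n, t = 1_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)      # increments W_{t_{j+1}} - W_{t_j}
W = np.concatenate(([0.0], np.cumsum(dW)))     # path W_{t_0}, ..., W_{t_n}

# Position e_j chosen at t_j using only information available at t_j (here: sign of W_{t_j}),
# so f is a simple, adapted integrand.
e = np.sign(W[:-1])

gain = np.sum(e * dW)                          # I_t(f) = sum_j e_j (W_{t_{j+1}} - W_{t_j})
print(f"total trading gain I_t(f) = {gain:.4f}")
```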

4.2.1 Properties of the Itô integral

The stochastic integral has a number of interesting and useful properties, which are collected in the following.

Proposition 4

i) $E[I(f)] = 0$.

ii) $E\left[ I^2(f) \right] = E\left[ \int_0^\infty f_t^2\, dt \right]$ (Itô isometry).

iii) The Itô integral is Gaussian, for any deterministic $e_j$.

iv) $I_t(f)$ is a $\mathbb{P}$-martingale.

Proof.

i) Using the definition of the Itô integral and the properties of the conditional expectation, we get
\begin{align*}
E[I(f)] &= E\left[ \int_0^\infty f_t\, dW_t \right]
= E\left[ \sum_{j=0}^\infty e_j \left( W_{t_{j+1}} - W_{t_j} \right) \right] \\
&= \sum_{j=0}^\infty E\left[ e_j\, E\left[ W_{t_{j+1}} - W_{t_j} \mid \mathcal{F}_{t_j} \right] \right]
= \sum_{j=0}^\infty E\left[ e_j\, E\left[ W_{t_{j+1}} - W_{t_j} \right] \right] = 0.
\end{align*}

ii) Since $e_j$ is adapted and the increments of a Brownian motion are independent of the past,
\[
E\left[ e_j^2 \left( W_{t_{j+1}} - W_{t_j} \right)^2 \right] = (t_{j+1} - t_j)\, E\left[ e_j^2 \right], \tag{2}
\]


where the last equality follows from the tower property and the distribution of the increments of the Wiener process. Also, for any $t_i < t_j$,
\begin{align*}
E\left[ e_i e_j \left( W_{t_{i+1}} - W_{t_i} \right) \left( W_{t_{j+1}} - W_{t_j} \right) \right]
&= E\left[ E\left[ e_i e_j \left( W_{t_{i+1}} - W_{t_i} \right) \left( W_{t_{j+1}} - W_{t_j} \right) \mid \mathcal{F}_{t_j} \right] \right] \\
&= E\left[ e_i e_j \left( W_{t_{i+1}} - W_{t_i} \right) E\left[ W_{t_{j+1}} - W_{t_j} \mid \mathcal{F}_{t_j} \right] \right] \\
&= E\left[ e_i e_j \left( W_{t_{i+1}} - W_{t_i} \right) E\left[ W_{t_{j+1}} - W_{t_j} \right] \right] \\
&= 0. \tag{3}
\end{align*}
Hence
\begin{align*}
E\left[ I^2(f) \right] &= E\left[ \left( \int_0^\infty f_t\, dW_t \right)^2 \right]
= E\left[ \left( \sum_{j=0}^\infty e_j \left( W_{t_{j+1}} - W_{t_j} \right) \right)^2 \right] \\
&= E\left[ \sum_{j=0}^\infty e_j^2 \left( W_{t_{j+1}} - W_{t_j} \right)^2 \right]
+ E\left[ \sum_i \sum_{j \neq i} e_i e_j \left( W_{t_{i+1}} - W_{t_i} \right) \left( W_{t_{j+1}} - W_{t_j} \right) \right] \\
&= \sum_{j=0}^\infty (t_{j+1} - t_j)\, E\left[ e_j^2 \right],
\end{align*}
where the last equality follows from (2) and (3). Therefore
\[
E\left[ I^2(f) \right] = \sum_{j=0}^\infty (t_{j+1} - t_j)\, E\left[ e_j^2 \right]
= E\left[ \sum_{j=0}^\infty e_j^2 (t_{j+1} - t_j) \right]
= E\left[ \int_0^\infty f_t^2\, dt \right].
\]

iii) It follows from the definitions.

iv) Let $s = t_k < t_l = t$. Then
\begin{align*}
I_t(f) &= \sum_{j=0}^{l-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) \\
&= \sum_{j=0}^{k-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) + \sum_{j=k}^{l-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right).
\end{align*}

Now calculate
\begin{align*}
E\left[ I_t(f) \mid \mathcal{F}_s \right]
&= E\left[ \sum_{j=0}^{l-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) \,\Big|\, \mathcal{F}_{t_k} \right] \\
&= E\left[ \sum_{j=0}^{k-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) \,\Big|\, \mathcal{F}_{t_k} \right]
+ E\left[ \sum_{j=k}^{l-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) \,\Big|\, \mathcal{F}_{t_k} \right].
\end{align*}
The first term is equal to
\[
\sum_{j=0}^{k-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) = I_s(f).
\]
The second term, using the tower property, is
\begin{align*}
E\left[ \sum_{j=k}^{l-1} e_j \left( W_{t_{j+1}} - W_{t_j} \right) \,\Big|\, \mathcal{F}_{t_k} \right]
&= \sum_{j=k}^{l-1} E\left[ E\left[ e_j \left( W_{t_{j+1}} - W_{t_j} \right) \mid \mathcal{F}_{t_j} \right] \,\Big|\, \mathcal{F}_{t_k} \right] \\
&= \sum_{j=k}^{l-1} E\left[ e_j\, E\left[ W_{t_{j+1}} - W_{t_j} \mid \mathcal{F}_{t_j} \right] \,\Big|\, \mathcal{F}_{t_k} \right] \\
&= 0,
\end{align*}
as the increments of the Brownian motion have zero mean. Therefore
\[
E\left[ I_t(f) \mid \mathcal{F}_s \right] = I_s(f).
\]
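Properties i) and ii) lend themselves to a quick Monte Carlo sanity check: simulate many paths, form the discrete integral $\sum_j e_j (W_{t_{j+1}} - W_{t_j})$ for an adapted integrand, and compare its sample mean and second moment with $0$ and $E[\int_0^t f_s^2\, ds]$. The Python sketch below uses the illustrative choice $f_t = W_t$ on $[0, 1]$, for which $E[\int_0^1 W_s^2\, ds] = 1/2$; the grid and seed are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps, t = 50_000, 200, 1.0
dt = t / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])

# Adapted integrand f_t = W_t, evaluated at the left endpoint of each subinterval.
I = np.sum(W[:, :-1] * dW, axis=1)

print("E[I(f)]   ~", I.mean())        # should be close to 0
print("E[I(f)^2] ~", (I**2).mean())   # Ito isometry: E[int_0^1 W_s^2 ds] = 1/2
print("exact     =", t**2 / 2)
```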

Exercise 1 Consider the simple processes $f$ and $g$ defined as follows:
\[
f_t = \begin{cases} 1/2 & \text{if } t \in (0, 1/2] \\ -1 & \text{if } t \in (1/2, 1] \end{cases}
\qquad
g_t = \begin{cases} 1 & \text{if } t \in (0, 1/2] \\ 2 & \text{if } t \in (1/2, 3/4] \\ -1 & \text{if } t \in (3/4, 1] \end{cases}
\]
and such that $f_t = g_t = 0$ if $t > 1$. Define $I_1(f) = \int_0^1 f_t\, dW_t$ and $I_1(g) = \int_0^1 g_t\, dW_t$.

a) Calculate $I_1(f)$ and $I_1(g)$.

b) Show that $E\left[ I_1(f)\, I_1(g) \right] = 0$.

4.2.2 Extension of the Itô integral to a more general class of functions

So far we have worked with integrals defined for simple functions. The next step is to extend the definition of the stochastic integral to a more general class of functions. For the Itô isometry to work, we still need functions in the set $\mathcal{H}$, i.e. $\mathcal{F}_t$-adapted and such that $E\left[ \int_0^\infty f_t^2\, dt \right] < \infty$. In order to achieve this, we use the following.

Theorem 5 Let $f_t = f(t, \omega) \in \mathcal{H}$ be an arbitrary process and $f_t^{(n)}$ a sequence of simple processes such that
\[
\lim_{n \to \infty} E\left[ \int_0^\infty \left( f_t^{(n)} - f_t \right)^2 dt \right] = 0.
\]
Then
\[
\int_0^\infty f_t^{(n)}\, dW_t
\]
converges in mean square as $n \to \infty$ to a limit which depends only on $f_t$ and not on the approximating sequence chosen.

Proof. For the convergence in mean square we need to check that any two sequences, $f_t^{(m)}$ and $f_t^{(n)}$, of simple processes have the same limit, i.e. that
\[
\lim_{m, n \to \infty} E\left[ \left( I\left( f^{(n)} \right) - I\left( f^{(m)} \right) \right)^2 \right] = 0.
\]
Note that
\[
E\left[ \left( I\left( f^{(n)} \right) - I\left( f^{(m)} \right) \right)^2 \right]
= E\left[ \left( I\left( f^{(n)} - f^{(m)} \right) \right)^2 \right]
= E\left[ \left( \int_0^\infty \left( f^{(n)} - f^{(m)} \right) dW_t \right)^2 \right].
\]
Since $f^{(n)} - f^{(m)}$ is a simple function,
\begin{align*}
E\left[ \left( I\left( f^{(n)} \right) - I\left( f^{(m)} \right) \right)^2 \right]
&= E\left[ \int_0^\infty \left( f^{(n)} - f^{(m)} \right)^2 dt \right] \\
&= E\left[ \int_0^\infty \left( f^{(n)} - f_t + f_t - f^{(m)} \right)^2 dt \right] \\
&\le 2\, E\left[ \int_0^\infty \left( f^{(n)} - f_t \right)^2 + \left( f_t - f^{(m)} \right)^2 dt \right] \\
&= 2\, E\left[ \int_0^\infty \left( f^{(n)} - f_t \right)^2 dt \right] + 2\, E\left[ \int_0^\infty \left( f_t - f^{(m)} \right)^2 dt \right] \\
&\to 0 \quad \text{as } m, n \to \infty.
\end{align*}


Definition 6 (Stochastic integral) Let $f_t = f(t, \omega) \in \mathcal{H}$ be an arbitrary process. Then the stochastic integral of $f$ is defined by
\[
\int_0^\infty f_t\, dW_t = \lim_{n \to \infty} \int_0^\infty f_t^{(n)}\, dW_t,
\]
where $f_t^{(n)}$ is a sequence of simple functions satisfying the condition of Theorem 5.

The properties of the Itô integral hold also for the more general stochastic integral defined above. The proof is based on the same limit procedure used in Theorem 5, but we do not explore the issue here.

Example 3 Consider the integral
\[
I_T(f) = \int_0^T W_t\, dW_t.
\]

To calculate it, we need to choose an approximating sequence for the function $f_t = W_t$. The simplest choice is
\[
f_t^{(n)} = \begin{cases}
W_{t_0} = 0 & \text{for } 0 \le t < t_1 \\
W_{t_1} & \text{for } t_1 \le t < t_2 \\
\quad \vdots \\
W_{t_{n-1}} & \text{for } t_{n-1} \le t < t_n = T.
\end{cases}
\]
You can easily check that the sequence $f_t^{(n)}$ satisfies the requirements of Theorem 5. Then, in virtue of Definition 6, we can calculate the stochastic integral as follows:

\begin{align*}
I_T(f) = \int_0^T W_t\, dW_t
&= \lim_{n \to \infty} \int_0^T f_t^{(n)}\, dW_t \\
&= \lim_{n \to \infty} \sum_{i=0}^{n-1} W_{t_i} \left( W_{t_{i+1}} - W_{t_i} \right)
= \lim_{n \to \infty} \left( \sum_{i=0}^{n-1} W_{t_{i+1}} W_{t_i} - \sum_{i=0}^{n-1} W_{t_i}^2 \right) \\
&= \lim_{n \to \infty} \left( \sum_{i=0}^{n-1} W_{t_{i+1}} W_{t_i} - \frac{1}{2} \sum_{i=0}^{n-1} W_{t_i}^2 - \frac{1}{2} \sum_{i=1}^{n} W_{t_i}^2 + \frac{1}{2} W_{t_n}^2 \right),
\end{align*}
where the last equality follows from the fact that $W_{t_0} = 0$. Then, set $k = i - 1$; it follows that
\begin{align*}
I_T(f) &= \lim_{n \to \infty} \left( \sum_{i=0}^{n-1} W_{t_{i+1}} W_{t_i} - \frac{1}{2} \sum_{i=0}^{n-1} W_{t_i}^2 - \frac{1}{2} \sum_{k=0}^{n-1} W_{t_{k+1}}^2 + \frac{1}{2} W_{t_n}^2 \right) \\
&= \frac{W_T^2}{2} - \frac{1}{2} \lim_{n \to \infty} \sum_{i=0}^{n-1} \left( W_{t_{i+1}} - W_{t_i} \right)^2 = \frac{W_T^2}{2} - \frac{T}{2}.
\end{align*}

Therefore
\[
\int_0^T W_t\, dW_t = \frac{W_T^2}{2} - \frac{T}{2}.
\]

Note that, for the case of any deterministic, continuous and differentiable function $g(t)$ such that $g(0) = 0$,
\[
\int_0^T g(t)\, dg(t) = \frac{1}{2} g^2(T).
\]
The example shows that the same rule does not apply to Brownian motions, and this is because of the non-zero quadratic variation property. The term $T/2$ is, in fact, referred to as the Itô term. Finally, you can check (easily) that the process $I_t(f) = \int_0^t W_s\, dW_s = \frac{W_t^2}{2} - \frac{t}{2}$ is indeed a martingale.
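The left-endpoint (Itô) sum can also be checked numerically: on a fine grid, $\sum_i W_{t_i}(W_{t_{i+1}} - W_{t_i})$ should be close to $W_T^2/2 - T/2$ path by path. The following Python sketch, assuming NumPy is available, is one way to see this (grid size and seed are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(3)
n, T = 100_000, 1.0
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)
W = np.concatenate(([0.0], np.cumsum(dW)))

ito_sum = np.sum(W[:-1] * dW)          # left-endpoint (Ito) approximating sum
exact = W[-1]**2 / 2 - T / 2           # closed form W_T^2/2 - T/2

print(f"Ito sum     = {ito_sum:.5f}")
print(f"closed form = {exact:.5f}")    # the two agree up to discretisation error
```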

Exercise 2 Let $Y_t := \int_0^t f_s\, dW_s$ for $f \in \mathcal{H}$. Show that, for $0 \le t_1 \le t_2 \le t_3 \le t_4$,
\[
E\left[ \left( Y_{t_4} - Y_{t_3} \right) \left( Y_{t_2} - Y_{t_1} \right) \right] = 0.
\]

Exercise 3 Consider a one-dimensional standard Brownian motion $W$ and let $b = \{ b_t : t \in [0, T] \}$ be a deterministic process such that
\[
E\left[ \int_0^T |b_t|^2\, dt \right] < \infty.
\]
Show that
\[
E\left[ e^{\int_0^t b_s\, dW_s} \right] = e^{\frac{1}{2} \int_0^t b_s^2\, ds}
\]
for any $0 \le t \le T$.

Exercise 4 Let $W$ be a standard one-dimensional Brownian motion. Let $T$ be a fixed positive number and let $\Pi = \{ t_0, t_1, \dots, t_n \}$ be a partition of $[0, T]$ such that $0 = t_0 < t_1 < \dots < t_n = T$. Define $\tau_n = \sup_{1 \le i \le n} (t_i - t_{i-1})$ and, for each $i$, let $t_i^* = \frac{t_{i+1} + t_i}{2}$ be the midpoint of the interval $[t_i, t_{i+1}]$.

a) Show that
\[
V^2(\Pi/2) := \sum_{i=0}^{n-1} \left( W_{t_i^*} - W_{t_i} \right)^2 \to \frac{T}{2}
\]
as $\tau_n \to 0$. [Hint: you might want to use the time-scale property of the Brownian motion seen in Question 25, and then use the quadratic variation result.]

b) The Stratonovich integral of $W$ with respect to $W$ is defined to be
\[
\int_0^T W_t \circ dW_t = \lim_{\tau_n \to 0} \sum_{i=0}^{n-1} W_{t_i^*} \left( W_{t_{i+1}} - W_{t_i} \right).
\]

Show that
\[
\int_0^T W_t \circ dW_t = \frac{W_T^2}{2}.
\]
[Hint: write the approximating sum that defines the Stratonovich integral as the sum of an approximating sum for the Itô integral over $\Pi/2 = \{ t_0, t_0^*, t_1, t_1^*, \dots, t_{n-1}^*, t_n \}$ and $V^2(\Pi/2)$.]

c) Why is the Stratonovich integral inappropriate for financial applications?

The definition of the stochastic integral for any $\mathcal{H}$-process can be extended further to the case of measurable processes $f$ such that

\[
\int_0^T f_t^2\, dt < \infty \quad \text{a.s.}
\]
In this case, the stochastic integral is still a linear mapping and a continuous (local) martingale, but the Itô isometry no longer holds, as the integral
\[
E\left[ \int_0^T f_t^2\, dt \right]
\]
may not exist.

Let $\mathcal{S} := \left\{ f_t : f_t \in \mathcal{F}_t,\ \int_0^T f_t^2\, dt < \infty \right\}$. Then $\mathcal{H} \subseteq \mathcal{S}$.

Proposition 7 Let $f \in \mathcal{S}$. Then $\int_0^t f_u\, dW_u$ is a local martingale.

Proof. Let
\[
T_n = \inf \left\{ T \ge 0 : \int_0^T f_t^2\, dt \ge n \right\}, \qquad n \in \mathbb{N}.
\]
Then $T_n$ is a stopping time. Consider now the truncated process $f_t^{(n)} = f_t\, \mathbf{1}_{(0, T_n]}$. Then $f_t^{(n)}$ is $\mathcal{F}_t$-measurable. Also
\[
\int_0^\infty \left( f_t^{(n)} \right)^2 dt = \int_0^\infty f_t^2\, \mathbf{1}_{(0, T_n]}\, dt = \int_0^{T_n} f_t^2\, dt \le n
\]
by definition of $T_n$. Hence $f_t^{(n)} \in \mathcal{H}$. This implies that
\[
\int_0^{t \wedge T_n} f_u\, dW_u = \int_0^t f_u^{(n)}\, dW_u
\]
is a martingale.


4.3 Itô processes and stochastic calculus

Definition 8 (Itô processes) Let $W$ be a one-dimensional standard Brownian motion on $(\Omega, \mathcal{F}, \mathbb{P}, \mathcal{F}_t)$. A one-dimensional Itô process is a stochastic process $X$ on $(\Omega, \mathcal{F}, \mathbb{P}, \mathcal{F}_t)$ of the form
\[
X_t = X_0 + \int_0^t b_s\, ds + \int_0^t \sigma_s\, dW_s, \tag{4}
\]
where $X_0$ is an $\mathcal{F}_0$-measurable random variable, $b_t$ is an $\mathcal{F}_t$-adapted process such that
\[
\int_0^t |b_s|\, ds < \infty \quad \text{a.s.} \quad \forall t \ge 0,
\]
and $\sigma_t$ is an $\mathcal{F}_t$-adapted process such that
\[
\int_0^t |\sigma_s|^2\, ds < \infty \quad \text{a.s.} \quad \forall t \ge 0.
\]

The shorthand notation for the above is
\[
dX_t = b_t\, dt + \sigma_t\, dW_t,
\]
which is also called a stochastic differential. We will be encountering stochastic differentials where $b_t$ and $\sigma_t$ are functions of $X_t$ as well as of $t$, of the form
\[
dX_t = b(t, X_t)\, dt + \sigma(t, X_t)\, dW_t.
\]
Once an initial condition on the process at the origin is given, these are known as stochastic differential equations (SDEs). In general, we will be interested in solving these SDEs; however, the previous examples have shown that solving stochastic integrals using some approximating sequence might not be such an easy task. To solve SDEs we will need a special technique, called Itô's formula (or Itô's Lemma).

Lemma 9 Let $X$ be a one-dimensional Itô process described by the stochastic differential $dX_t = b_t\, dt + \sigma_t\, dW_t$. Let $f(t, x)$ be a twice continuously differentiable function on $[0, \infty) \times \mathbb{R}$. Then $Y_t = f(t, X_t)$ is again an Itô process and
\[
dY_t = \frac{\partial f}{\partial t}\, dt + \frac{\partial f}{\partial X}\, dX_t + \frac{1}{2} \frac{\partial^2 f}{\partial X^2}\, (dX_t)^2. \tag{5}
\]

Proof. Use the Taylor expansion for a function of two variables, i.e., for $\tilde{t}$ close to $t$ and $X_{\tilde{t}}$ close to $X_t$ we can write
\begin{align*}
f(t, X_t) = f\left( \tilde{t}, X_{\tilde{t}} \right)
&+ \frac{\partial f}{\partial t}\left( t - \tilde{t} \right) + \frac{\partial f}{\partial X}\left( X_t - X_{\tilde{t}} \right)
+ \frac{1}{2} \frac{\partial^2 f}{\partial t^2}\left( t - \tilde{t} \right)^2 \\
&+ \frac{1}{2} \frac{\partial^2 f}{\partial X^2}\left( X_t - X_{\tilde{t}} \right)^2
+ \frac{\partial^2 f}{\partial t \partial X}\left( t - \tilde{t} \right)\left( X_t - X_{\tilde{t}} \right) + \dots
\end{align*}


This can be rewritten (symbolically) in differential form as
\[
df(t, X_t) = \frac{\partial f}{\partial t}\, dt + \frac{\partial f}{\partial X}\, dX_t
+ \frac{1}{2} \frac{\partial^2 f}{\partial t^2}\, dt^2
+ \frac{1}{2} \frac{\partial^2 f}{\partial X^2}\, dX_t^2
+ \frac{\partial^2 f}{\partial t \partial X}\, dt\, dX_t + \dots
\]
The required result follows from the properties of the quadratic variation and the cross variation.

Example 4

1. If $Y_t = W_t^n$, where $W$ is a standard one-dimensional Brownian motion, then
\[
dY_t = n W_t^{n-1}\, dW_t + \frac{1}{2} n (n - 1) W_t^{n-2}\, dt.
\]

2. Consider an Itô process $X$ described by the following stochastic differential equation
\[
dX_t = \mu\, dt + \sigma\, dW_t, \qquad \mu \in \mathbb{R},\ \sigma \in \mathbb{R}^+.
\]
Define $Y_t = e^{X_t}$. The stochastic differential equation of the process $Y$ is then
\begin{align*}
dY_t &= Y_t\, dX_t + \frac{1}{2} Y_t\, (dX_t)^2 \\
&= Y_t \left( \mu\, dt + \sigma\, dW_t \right) + \frac{\sigma^2}{2} Y_t\, dt \\
&= \left( \mu + \frac{\sigma^2}{2} \right) Y_t\, dt + \sigma Y_t\, dW_t.
\end{align*}

3. Let
\[
dX_t = \mu\, dt + \sigma\, dW_t, \qquad \mu \in \mathbb{R},\ \sigma \in \mathbb{R}^+,
\]
\[
dY_t = a\, dt + b\, dW_t, \qquad a \in \mathbb{R},\ b \in \mathbb{R}^+,
\]
and $Z_t = X_t Y_t$. Then
\begin{align*}
dZ_t &= Y_t\, dX_t + X_t\, dY_t + dX_t\, dY_t \\
&= \left( a X_t + \mu Y_t + \sigma b \right) dt + \left( b X_t + \sigma Y_t \right) dW_t.
\end{align*}

Exercise 5 Compute the stochastic differential of $W_t^2$ and $W_t^4$.

Exercise 6 For each of the following processes, state whether it is an Itô process:

a) The process $X_t$ which satisfies $dX_t = (W_{t+1} - W_t)\, dW_t$

b) The process $X_t = \dfrac{1}{1 + W_t}$

c) The process $X_t = \int_0^t W_u\, du$
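For routine applications of (5), the bookkeeping can be automated symbolically. The sketch below is an illustrative helper (written here, not part of any standard library) that, assuming sympy is available, computes the drift and diffusion of $Y_t = f(t, X_t)$ for a given $f$, $b$ and $\sigma$, and reproduces part 2 of Example 4.

```python
import sympy as sp

t, x, mu, sigma = sp.symbols("t x mu sigma", positive=True)

def ito_sde(f, b, s):
    """Given dX = b dt + s dW and Y = f(t, X), return the (drift, diffusion) of dY by Ito's lemma."""
    drift = sp.diff(f, t) + b * sp.diff(f, x) + sp.Rational(1, 2) * s**2 * sp.diff(f, x, 2)
    diffusion = s * sp.diff(f, x)
    return sp.simplify(drift), sp.simplify(diffusion)

# Example 4, part 2: Y = exp(X) with dX = mu dt + sigma dW
drift, diffusion = ito_sde(sp.exp(x), mu, sigma)
print("dY_t =", drift, "dt +", diffusion, "dW_t")   # (mu + sigma^2/2) e^x dt + sigma e^x dW
```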


4.4 Stochastic differential equations

Definition 10 Let $W$ be a one-dimensional standard Brownian motion on $(\Omega, \mathcal{F}, \mathbb{P}, \mathcal{F}_t)$. Then
\[
\begin{cases}
dX_t = b(t, X_t)\, dt + \sigma(t, X_t)\, dW_t \\
X_0 \in \mathcal{F}_0
\end{cases}
\]
is a stochastic differential equation. The solution is an $\mathcal{F}_t$-measurable process $X$ such that
\[
X_t = X_0 + \int_0^t b(s, X_s)\, ds + \int_0^t \sigma(s, X_s)\, dW_s.
\]

The existence and uniqueness conditions for such equations are contained in the following theorem, which we state without proof.

Theorem 11 Under the following Lipschitz and growth conditions:
\begin{align*}
|b(t, x) - b(t, y)| + |\sigma(t, x) - \sigma(t, y)| &\le D |x - y| \qquad \forall x, y \in \mathbb{R},\ t \in [0, T], \\
|b(t, x)| + |\sigma(t, x)| &\le C (1 + |x|) \qquad \forall x \in \mathbb{R},\ t \in [0, T],
\end{align*}
the stochastic differential equation
\[
\begin{cases}
dX_t = b(t, X_t)\, dt + \sigma(t, X_t)\, dW_t \\
X_0 \in \mathcal{F}_0
\end{cases}
\]
admits a unique solution $X_t$.

The key tool to find the solution to a stochastic differential equation is Itô's lemma.

Example 5

1. Consider the following stochastic differential equation
\[
dX_t = \sigma X_t\, dW_t.
\]
We can try to find the solution working in analogy with the ordinary differential equations case. So, let's try $Y_t = \ln X_t$. Then, applying Itô's lemma, we get
\begin{align*}
dY_t &= \frac{1}{X_t}\, dX_t - \frac{1}{2} \frac{1}{X_t^2}\, (dX_t)^2 \\
&= \sigma\, dW_t - \frac{\sigma^2}{2}\, dt.
\end{align*}
Now, integrating both sides returns
\[
X_t = X_0\, e^{\sigma W_t - \frac{\sigma^2}{2} t}. \tag{6}
\]
If you are not convinced, you can verify that (6) is indeed the solution to the given stochastic differential equation by using Itô's lemma again. In other words, calculate the stochastic differential of $X$ from (6):
\begin{align*}
dX_t &= -\frac{\sigma^2}{2} X_t\, dt + \sigma X_t\, dW_t + \frac{\sigma^2}{2} X_t\, dt \\
&= \sigma X_t\, dW_t.
\end{align*}

2. Consider the following stochastic differential equation
\[
dX_t = \mu X_t\, dt + \sigma X_t\, dW_t.
\]
Let's try again with $Y_t = \ln X_t$. Then
\[
dY_t = \left( \mu - \frac{\sigma^2}{2} \right) dt + \sigma\, dW_t.
\]
Integrating both sides,
\[
X_t = X_0\, e^{\left( \mu - \frac{\sigma^2}{2} \right) t + \sigma W_t}.
\]

3. An Ornstein-Uhlenbeck process is a stochastic process defined by the following
\[
dX_t = \mu X_t\, dt + \sigma\, dW_t. \tag{7}
\]
In this case, the analogy with the ODE case is not obvious. When this is the case, we might look for help in the auxiliary function method. Let's consider the homogeneous part of the stochastic differential equation (7)
\[
dX_t = \mu X_t\, dt.
\]
Its solution is
\[
\hat{X}_t = X_0\, e^{\mu t}.
\]
Construct now the auxiliary function
\[
F_t = X_0\, \frac{X_t}{\hat{X}_t} = X_t\, e^{-\mu t},
\]
and compute its stochastic differential using Itô's lemma:
\[
dF_t = -\mu F_t\, dt + e^{-\mu t}\, dX_t = \sigma e^{-\mu t}\, dW_t.
\]
Hence
\[
X_t\, e^{-\mu t} = X_0 + \sigma \int_0^t e^{-\mu s}\, dW_s,
\]
i.e.
\[
X_t = X_0\, e^{\mu t} + \sigma \int_0^t e^{\mu (t - s)}\, dW_s.
\]
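When a closed-form solution such as (6) is not available, SDEs of this type are commonly simulated with the Euler-Maruyama scheme, $X_{t_{j+1}} = X_{t_j} + b(t_j, X_{t_j})\Delta t + \sigma(t_j, X_{t_j})\Delta W_j$; this scheme is not part of the notes above and is shown only as a sketch. The Python code below applies it to the geometric Brownian motion of part 2 and compares the endpoint with the exact solution driven by the same increments; parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, x0 = 0.1, 0.2, 1.0
n, t = 10_000, 1.0
dt = t / n
dW = rng.normal(0.0, np.sqrt(dt), size=n)

# Euler-Maruyama for dX = mu*X dt + sigma*X dW
x = x0
for dw in dW:
    x += mu * x * dt + sigma * x * dw

# Exact solution X_t = X_0 exp((mu - sigma^2/2) t + sigma W_t), same Brownian path
W_t = dW.sum()
exact = x0 * np.exp((mu - 0.5 * sigma**2) * t + sigma * W_t)

print(f"Euler-Maruyama X_T = {x:.5f},  exact X_T = {exact:.5f}")
```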

Exercise 7 Show that the Ornstein-Uhlenbeck process defined in Unit 4 is Gaussian with
\[
E(X_t) = e^{\mu t}\, E(X_0), \qquad \operatorname{Var}(X_t) = \frac{\sigma^2}{2\mu} \left( e^{2\mu t} - 1 \right).
\]


Exercise 8 Solve the following stochastic differential equations:

1. Mean-reverting OU process
\[
dX_t = -b X_t\, dt + \sigma\, dW_t; \qquad X_0 = x \in \mathbb{R}.
\]

2.
\[
\begin{cases}
dX_t = (a X_t + b)\, dt + (\sigma X_t + \beta)\, dW_t \\
X_0 = x \in \mathbb{R}
\end{cases}
\]
where $a$, $b$, $\sigma$ and $\beta$ are given real constants.

3.
\[
\begin{cases}
dX_t = \dfrac{1}{X_t}\, dt + \alpha X_t\, dW_t \\
X_0 = x \in \mathbb{R}_{++}
\end{cases}
\]
where $\alpha$ is a given real constant.

Exercise 9 Consider the following stochastic differential equation:
\[
dr_t = a (b - r_t)\, dt + \sigma\, dW_t, \qquad r_0 \in \mathbb{R}. \tag{8}
\]

a) Compute the stochastic differential $d\left( e^{at} r_t \right)$.

b) Using the previous result, obtain the solution $r_t$ of equation (8).

c) Determine the expectation, variance and law of $r_t$.

d) Show that for each $t > 0$ the random variable $r_t$ takes negative values with positive probability.

e) Consider the stochastic process
\[
R_t = \int_0^t r_s\, ds,
\]
where $r_t$ is the process satisfying the stochastic differential equation (8). Show that $R_t$ is Gaussian with
\begin{align*}
E(R_t) &= bt + (r_0 - b) \left( \frac{1 - e^{-at}}{a} \right), \\
\operatorname{Var}(R_t) &= \frac{\sigma^2}{a^2} \left( t - \frac{3}{2a} + \frac{2 e^{-at}}{a} - \frac{e^{-2at}}{2a} \right).
\end{align*}

f) Use the previous results to show that the function
\[
P(0, t) := E\left[ e^{-\int_0^t r_s\, ds} \right]
\]
has closed form
\begin{align*}
P(0, t) &= A(t)\, e^{-B(t) r_0}, \\
B(t) &= \frac{1 - e^{-at}}{a}, \\
\ln A(t) &= \left( B(t) - t \right) \left( b - \frac{\sigma^2}{2 a^2} \right) - \frac{\sigma^2 B(t)^2}{4a}.
\end{align*}

Exercise 10 Consider the equation
\[
dY_t = \frac{b - Y_t}{1 - t}\, dt + dW_t,
\]
where $b \in \mathbb{R}$ and $Y_0 = a \in \mathbb{R}$.

a) Show that
\[
Y_t = a (1 - t) + bt + (1 - t) \int_0^t \frac{dW_s}{1 - s}.
\]
The process $Y$ is called the Brownian bridge.

b) Consider another process $X$ defined by the following SDE:
\[
dX_t = \frac{\beta - X_t}{1 - t}\, dt + dW_t,
\]
where $\beta \in \mathbb{R}$ and $X_0 = \alpha \in \mathbb{R}$. Under which conditions is $X_t \ge Y_t$ a.s.? State any general result you use.

Exercise 11 Suppose there exist positive constants $a$, $b$ and $c$ such that
\[
M_f(T) = f(X_T) - f(X_0) - a \int_0^T \frac{\partial f}{\partial x}(X_t)\, dt + b \int_0^T X_t\, \frac{\partial f}{\partial x}(X_t)\, dt - c \int_0^T \frac{\partial^2 f}{\partial x^2}(X_t)\, dt
\]
is a martingale for each $C^2$ function $f$. Show that $X_t$ is an Ornstein-Uhlenbeck process.

4.5 Moments of Itô processes

Even when it is impossible to solve an Itô equation to find an explicit expression for the process in terms of a stochastic integral, it may be possible to find its moments.

If we define $m_t^{(1)} = E[X_t \mid X_0 = a]$, then
\[
m_{t+dt}^{(1)} - m_t^{(1)} = E[dX_t \mid X_0 = a] = E[b(t, X_t) \mid X_0 = a]\, dt.
\]
In other words, $\dfrac{dm^{(1)}}{dt} = E[b]$. If $b(t, X_t)$ is a linear function of $X_t$, we have an ODE for $m^{(1)}$.

If we want $m_t^{(2)} = E[X_t^2 \mid X_0 = a]$, then we must first define $Y_t = X_t^2$, so that
\[
dY_t = 2 X_t \left[ b(t, X_t)\, dt + \sigma(t, X_t)\, dW_t \right] + \sigma^2(t, X_t)\, dt.
\]
Hence
\[
\frac{dm^{(2)}}{dt} = E\left[ 2 X_t\, b(t, X_t) + \sigma^2(t, X_t) \mid X_0 = a \right],
\]
which again will be soluble for appropriate $b$ and $\sigma$.

Example 6 As an example which illustrates the principle, consider
\[
dX_t = -\alpha X_t\, dt + \sigma \sqrt{X_t}\, dW_t, \qquad X_0 = x \in \mathbb{R}.
\]

Define $m_t^{(1)} = E[X_t]$, $m_t^{(2)} = E[X_t^2]$. Now
\[
m_{t+dt}^{(1)} - m_t^{(1)} = E[dX_t] = -\alpha E[X_t]\, dt = -\alpha m_t^{(1)}\, dt,
\]
implying that $dm^{(1)}/dt = -\alpha m^{(1)}$ and therefore $m_t^{(1)} = X_0 e^{-\alpha t}$. To find the second moment, define $Y_t = X_t^2$. Then
\[
dY_t = 2 X_t\, dX_t + (dX_t)^2 = -2\alpha X_t^2\, dt + 2\sigma X_t^{3/2}\, dW_t + \sigma^2 X_t\, dt,
\]
from which it follows that
\[
\frac{dm^{(2)}}{dt} = -2\alpha m^{(2)} + \sigma^2 m^{(1)},
\]
with solution
\[
m_t^{(2)} = \frac{\sigma^2}{\alpha} X_0\, e^{-\alpha t} + \left( X_0^2 - \frac{\sigma^2}{\alpha} X_0 \right) e^{-2\alpha t}.
\]

Exercise 12 The Brownian bridge is a stochastic process $\{ X_t : 0 \le t \le 1 \}$ with $X_0 = 0$ satisfying
\[
dX_t = -\frac{1}{1 - t} X_t\, dt + dW_t,
\]
where $W_t$ is a standard Brownian motion.

a) Find a differential equation satisfied by $m^{(1)}(t) = E(X_t)$ for $0 < t < 1$, and also by $E[X_t \mid X_{t_0} = a]$ for $t_0 < t < 1$.

b) Show that $E[X_t] = 0$ for $0 < t < 1$ and that
\[
E[X_t \mid X_{t_0} = a] = \frac{1 - t}{1 - t_0}\, a.
\]

c) Let $Y_t = X_t^2$. Use Itô's Lemma to find an expression for $dY_t$.

d) Find a differential equation satisfied by $m^{(2)}(t) = E(X_t^2)$ for $0 < t < 1$ and solve it.

e) What can be said about the distribution of $X_1$?


4.6 Steady-state distribution

Some processes have a steady-state (equilibrium) distribution: if $X_0$ is random with density $\pi(x)$, then $X_t$ has the same density function for all $t$. To investigate this, we can look at the moment generating function $E[e^{sX_t}]$.

First, it should be stressed that there can be no steady-state distribution if $b$ or $\sigma$ depends explicitly on $t$. Assume they do not. Let $Y_t = e^{sX_t}$. Then
\[
dY_t = s e^{sX_t} \left[ b(X_t)\, dt + \sigma(X_t)\, dW_t \right] + \frac{1}{2} s^2 e^{sX_t} \sigma^2(X_t)\, dt.
\]
Define $M(t, s) = E[e^{sX_t}]$. Then
\[
\frac{\partial}{\partial t} M(t, s) = \frac{1}{dt} E[dY_t]
= E\left[ s e^{sX_t} b(X_t) + \frac{1}{2} s^2 e^{sX_t} \sigma^2(X_t) \right].
\]
If $X_0$ has the steady-state density $\pi(x)$, then $M(t, s)$ is the same for all $t$, so that, for every $s$,
\[
0 = \int s\, e^{sx} \left[ b(x) + \frac{1}{2} s\, \sigma^2(x) \right] \pi(x)\, dx.
\]
The Laplace transform of a function is zero only if the function is zero, i.e. $\int_a^b e^{sx} h(x)\, dx = 0$ for all $s$ only if $h(x) = 0$ for all $x \in (a, b)$. Also,
\[
\int_a^b e^{sx} \frac{dh}{dx}\, dx = \left[ h(x) e^{sx} \right]_a^b - s \int_a^b e^{sx} h(x)\, dx.
\]
Therefore
\[
0 = \int_a^b e^{sx} \left[ b(x)\pi(x) - \frac{1}{2} \frac{d}{dx}\left( \sigma^2(x)\pi(x) \right) \right] dx
+ \frac{1}{2} \left[ \sigma^2(b)\pi(b) e^{sb} - \sigma^2(a)\pi(a) e^{sa} \right].
\]
So, for a steady-state distribution to exist, we require that
\[
b(x)\pi(x) = \frac{1}{2} \frac{d}{dx}\left( \sigma^2(x)\pi(x) \right)
\]
for $a < x < b$ and that $\sigma^2(b)\pi(b) = \sigma^2(a)\pi(a) = 0$.

Example 7 (Ornstein-Uhlenbeck process) Consider the Ornstein-Uhlenbeck process defined by
\[
dX_t = -\alpha X_t\, dt + \sigma\, dW_t,
\]


so that $b(x) = -\alpha x$ and $\sigma^2(x) = \sigma^2$. So we require
\[
\sigma^2 \frac{d\pi}{dx} = -2\alpha x\, \pi(x),
\]
giving $\pi(x) = K \exp\left( -\dfrac{\alpha x^2}{\sigma^2} \right)$. We recognise this as the density of a Normal distribution with mean 0 and variance $\sigma^2/(2\alpha)$.
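This stationary density can be checked by simulation: run the OU process for a horizon long relative to $1/\alpha$, starting from an arbitrary point, and compare the sample mean and variance of the terminal values with $0$ and $\sigma^2/(2\alpha)$. The Python sketch below uses the exact Gaussian transition of the OU process; the parameter values and starting point are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
alpha, sigma = 2.0, 0.5
n_paths, n_steps, dt = 50_000, 400, 0.05       # horizon 20, long relative to 1/alpha

x = np.full(n_paths, 3.0)                      # start far from the stationary mean
decay = np.exp(-alpha * dt)
cond_std = sigma * np.sqrt((1 - decay**2) / (2 * alpha))
for _ in range(n_steps):
    # Exact OU transition: X_{t+dt} | X_t is Gaussian with mean X_t*decay and std cond_std
    x = x * decay + cond_std * rng.normal(size=n_paths)

print(f"sample mean     = {x.mean():+.4f}   (stationary mean 0)")
print(f"sample variance = {x.var():.4f}   (sigma^2/(2*alpha) = {sigma**2 / (2 * alpha):.4f})")
```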

One possible use of the steady-state distribution is to show that, if $a < X_0 < b$, then $a < X_t < b$ for all $t > 0$.

Remark. A stochastic process which possesses a stationary distribution has constant variance. We have seen that the variance of a martingale increases with time. Therefore no martingale possesses a stationary distribution.

Exercise 13 Try to find the steady-state distribution of a geometric Brownian motion
\[
dX_t = \alpha X_t\, dt + \sigma X_t\, dW_t.
\]
You should find that there is no density function $\pi(x)$ which satisfies the differential equation you derive. Why is this not surprising?

4.7 The Brownian Bridge and stratified Monte Carlo

Definition 12 Let $W_t$ be a Brownian motion. Fix $s > 0$ and $T > 0$ with $s < T$, $a \in \mathbb{R}$ and $b \in \mathbb{R}$. We define the Brownian bridge from $a$ to $b$ on $[s, T]$ to be the process
\[
B_t^{(a,s) \to (b,T)} = a + \frac{t - s}{T - s} (b - a) + (W_t - W_s) - \frac{t - s}{T - s} (W_T - W_s)
\]
for all $t \in [s, T]$.

The function $a + \frac{t - s}{T - s}(b - a)$, as a function of $t$, is the line from $(s, a)$ to $(T, b)$; when we add to this line the Brownian bridge from 0 to 0 on $[s, T]$, we obtain a process that begins at $a$ at time $s$ and terminates at $b$ at time $T$.

You can easily show that the process $B_t^{(a,s) \to (b,T)}$ has the following properties:

• Mean: $a + \dfrac{t - s}{T - s}(b - a)$

• Variance: $\dfrac{(t - s)(T - t)}{T - s}$

• Covariance: $\dfrac{(t \wedge z - s)(T - t \vee z)}{T - s}$

Note that the Brownian bridge cannot be written as a stochastic integral of a deterministic integrand, since the variance of the Brownian bridge is a non-monotone function of time, whilst all stochastic integrals have variance which is non-decreasing in $t$. However, we can obtain a process with the same distribution as $B_t^{(a,s) \to (b,T)}$ as a scaled stochastic integral:
\[
Y_t = a + \frac{t - s}{T - s}(b - a) + (T - t) \int_s^t \frac{1}{T - u}\, dW_u.
\]

4.7.1 Simulating trajectories of the Brownian motion - part 2

The primary use for the Brownian bridge in finance is as an aid to Monte Carlo simulation, since the Brownian bridge $B_t^{(a,s) \to (b,T)}$ represents a Brownian motion on the time interval $[s, T]$, starting at $W_s = a$ and conditioned to arrive at $b$ at time $T$. To see this, consider a time partition such that $t_i < t_j < t_k$ and let
\[
X = W_{t_j} - W_{t_i}; \qquad Y = W_{t_k} - W_{t_j}; \qquad Z = W_{t_k} - W_{t_i} = X + Y.
\]
It follows that $X$ and $Y$ are independent; moreover, $X \sim N(0, \sigma_X^2)$, $Y \sim N(0, \sigma_Y^2)$ and $Z \sim N(0, \sigma_Z^2)$, where
\[
\sigma_X^2 = t_j - t_i; \qquad \sigma_Y^2 = t_k - t_j; \qquad \sigma_Z^2 = t_k - t_i = \sigma_X^2 + \sigma_Y^2.
\]
Then, the conditional density of $X$ given $Z$ is
\[
f_{X \mid Z}(x) = \frac{f_X(x)\, f_Y(z - x)}{f_Z(z)} = \frac{1}{B\sqrt{2\pi}}\, e^{-\frac{1}{2} \left( \frac{x - Az}{B} \right)^2},
\]
where $A = \sigma_X^2 / \sigma_Z^2$ and $B = \sigma_X \sigma_Y / \sigma_Z$. Hence, we can claim that, conditioning on the knowledge of the process value at time $t_k$,
\[
W_{t_j} - W_{t_i} \sim N(Az, B^2),
\]
from which it follows that
\[
W_{t_j} = \frac{t_k - t_j}{t_k - t_i}\, W_{t_i} + \frac{t_j - t_i}{t_k - t_i}\, W_{t_k} + \sqrt{\frac{(t_k - t_j)(t_j - t_i)}{t_k - t_i}}\, \varepsilon, \qquad \varepsilon \sim N(0, 1).
\]
But this is the Brownian bridge from $W_{t_i}$ to $W_{t_k}$ on $[t_i, t_k]$.

Given this property of the Brownian bridge, we can use this process together with stratification to generate trajectories of the Brownian motion which are better spread over the probability space. The idea is that you first generate a stratified sample from a normal distribution with mean 0 and variance $T$. This gives you a random sample of the Brownian motion at the end of the observation period. Then, you "fill in" the values that you need to generate the full trajectory between time 0 and time $T$ using the Brownian bridge.


[Figure 1: 10 stratified sample trajectories of the Wiener process $W_t$, the arithmetic Brownian motion $B_t = \mu t + \sigma W_t$ and the geometric Brownian motion $X_t = e^{(\mu - \sigma^2/2) t + \sigma W_t}$. Parameter set: $T = 1$ year; $\mu = 0.1$ p.a.; $\sigma = 0.2$ p.a.]


Figure 1 shows sample trajectories for the Wiener process, the arithmetic Brownian motion and the geometric Brownian motion generated using this technique. If you compare them with the trajectories illustrated in Figure 3.3, you can see that the stratification approach with the Brownian bridge reduces the variance of your Monte Carlo estimate.
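A compact version of this procedure in Python, assuming NumPy and SciPy are available: draw the terminal value $W_T$ by stratified sampling (one uniform draw per equiprobable stratum, mapped through the inverse normal CDF), then fill in each path with the conditional Brownian bridge formula derived above. The grid size and number of strata are illustrative choices.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)
T, n_steps, n_strata = 1.0, 252, 10
times = np.linspace(0.0, T, n_steps + 1)

# 1) Stratified sample of W_T ~ N(0, T): one uniform draw in each of n_strata equal-probability bins.
u = (np.arange(n_strata) + rng.uniform(size=n_strata)) / n_strata
W_T = norm.ppf(u) * np.sqrt(T)

# 2) Fill in each path with the Brownian bridge: W_{t+dt} | (W_t, W_T) is Gaussian.
paths = np.zeros((n_strata, n_steps + 1))
paths[:, -1] = W_T
for j in range(n_steps - 1):
    t, t_next = times[j], times[j + 1]
    paths[:, j + 1] = ((T - t_next) / (T - t)) * paths[:, j] \
        + ((t_next - t) / (T - t)) * W_T \
        + np.sqrt((T - t_next) * (t_next - t) / (T - t)) * rng.normal(size=n_strata)

print("terminal values (stratified):", np.round(paths[:, -1], 3))
```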

4.8 Further sample exam questions

1. Consider the following stochastic differential equation
\[
dY_t = r\, dt + \alpha Y_t\, dW_t \tag{9}
\]
where $W = \{ W_t : t \ge 0 \}$ is a one-dimensional standard Brownian motion and $Y_0$, $r$, $\alpha$ are real constants.

a) Use Itô's Lemma to show that
\[
d\left( e^{-\alpha W_t + \frac{\alpha^2}{2} t}\, Y_t \right) = e^{-\alpha W_t + \frac{\alpha^2}{2} t}\, r\, dt
\]
and hence solve equation (9) to obtain an explicit expression for $Y_t$.

b) Consider another process $X$ defined by the following stochastic differential equation
\[
dX_t = \gamma\, dt + \alpha X_t\, dW_t
\]
where $W$ is the same one-dimensional standard Brownian motion defining the process $Y$, and $X_0$, $\gamma$, $\alpha$ are real constants. State sufficient conditions under which $X_t \ge Y_t$ a.s. State clearly any general result you use.

2. Consider the process $Y_t = \dfrac{W_t^2}{2}$.

a) Use Itô's lemma to calculate $dY_t$.

b) Use the previous result to show that
\[
\int_0^t W_u\, dW_u = \frac{W_t^2}{2} - \frac{t}{2}.
\]

3. If $X_t$ satisfies the SDE
\[
dX_t = \lambda (b - X_t)\, dt + \sigma \sqrt{X_t}\, dW_t,
\]
find the SDE satisfied by

a) $Y_t = \sqrt{X_t}$

b) $Z_t = e^{\lambda t} (b - X_t)$

Show that $Z$ is a local martingale. Why might it not be a martingale?

4. Suppose that $X_0 = 1$ and
\[
dX_t = dt + 2\sqrt{X_t}\, dW_t.
\]
Find and solve an SDE for $Y_t = \sqrt{X_t}$ and deduce the solution $X_t$.

5. Suppose that $X_0 = 0$ and
\[
dX_t = \left( t + \frac{X_t}{t} \right) dt + 2\sqrt{t X_t}\, dW_t.
\]
Obtain an SDE for $Y_t = \sqrt{X_t / t}$ and hence solve for $X_t$.

6. Consider the stochastic process
\[
X_t = X_s\, e^{k(t - s)} + \sigma \int_s^t e^{k(t - u)}\, dW_u,
\]
where $k, \sigma \in \mathbb{R}_{++}$ and $s < t$.

a) Derive the stochastic differential equation of which the process $X$ is a solution.

b) State whether the process $X$ is a martingale, justifying your answer.

d) Giving your reasons, state whether the process $X$ has a stationary distribution.
