
Course structure

1. Motivation for jump processes in finance


2. Basic notions
● Definitions, properties and first examples

3. Lévy-Ito decomposition and path properties


4. Financial modelling with Lévy processes
5. Monte Carlo simulations for Lévy processes
6. Stochastic calculus with Lévy processes
7. Measure transformations for Lévy processes
8. Option pricing within exponential Lévy models

1
2. Basic notions

 Let (Ω, ℱ, ℙ) be a probability space


● Ω: sample space
● ℱ: sigma-algebra on Ω
● ℙ: probability measure on (Ω, ℱ)
● General framework for studying random variables (…)
 Filtration or information flow on (Ω, ℱ, ℙ):
● Increasing family of sigma-algebras (ℱ𝑡)𝑡∈[0,𝑇]:
∀ 0 ≤ 𝑠 ≤ 𝑡: ℱ𝑠 ⊆ ℱ𝑡 ⊆ ℱ
● When considering stochastic processes, we need the concept of filtration
to properly define the notions of information, causality and
predictability
● As time passes, information is progressively revealed to the
observer, and quantities viewed as stochastic at t=0 are not necessarily
stochastic at t>0 (e.g. if their value is revealed to the observer by the information
available at t)  concept of filtration
● ℱ𝑡 represents the information available (known) at time t
2
2. Basic notions

 Adapted process
● A stochastic process (Xt)tϵ[0,T] is said to be adapted to the filtration
(𝓕𝒕) if, for each tϵ[0,T], Xt is ℱ𝑡-measurable
(i.e. the value of Xt is revealed at time t)
 Filtration generated by a stochastic process
 Stopping time
 Cadlag function
● A function 𝑓: [0, 𝑇] → ℝ𝑑 is cadlag if it is right-continuous with existing left
limits (continu à droite, limite à gauche):
∀𝑡 ∈ [0, 𝑇]: 𝑓(𝑡−) = lim_{𝑠→𝑡, 𝑠<𝑡} 𝑓(𝑠) and 𝑓(𝑡+) = lim_{𝑠→𝑡, 𝑠>𝑡} 𝑓(𝑠) both exist,
and 𝑓(𝑡) = 𝑓(𝑡+)
● If 𝑓 is cadlag and discontinuous at t, its jump at t is denoted by Δ𝑓(𝑡):

Δ𝑓(𝑡) = 𝑓(𝑡) − 𝑓(𝑡−)
3
2. Basic notions

 Cadlag process: process with cadlag trajectories


 Caglad: continu à gauche, limite à droite  predictable process
 Why model market prices with cadlag rather than caglad processes?
● If f is a cadlag function (appearing in practice as a sample path of a stochastic
process), its value at a discontinuity point is the value after the jump,
not the value before: 𝑓(𝑡𝑖) = 𝑓(𝑡𝑖+)
● Since t is interpreted as time, right-continuity means that the value 𝑓(𝑡𝑖) at
any time 𝑡𝑖 is not foreseeable by following the trajectory up to time 𝑡𝑖  a
discontinuity/jump is seen as a sudden, unanticipated event
● If the sample path is caglad, it means that an observer approaching t along the
sample path could predict the occurrence of a jump at t
● In the context of financial modeling, jumps represent sudden events,
motivating the choice of cadlag trajectories for the modeling processes
● More pragmatically: if there is a jump at some instant, nobody in the market
will consider that the value at that instant is the value before the jump (…)

4
2. Basic notions: Definition and first examples

 Definition: A Lévy process is a cadlag stochastic process (Xt)
defined on (Ω, ℱ, ℙ) with X0 = 0 and satisfying the following properties:
● Independent increments: for any increasing sequence of instants t0,
t1, …, tn, the increments
X(t0), X(t1) - X(t0),…, X(tn) - X(tn-1)
are independent random variables
● Stationary increments: for all h>0, 𝑡 ≥ 0, the law of X(t+h) – X(t) does
not depend on t
● Stochastic continuity: for all ε>0, for all 𝑡 ≥ 0,

lim_{h→0} ℙ(|X(t+h) − X(t)| ≥ ε) = 0

5
2. Basic notions: Definition and first examples

 First example: Brownian motion


● Stochastic process (Xt) defined on (Ω, ℱ, ℙ) with X0=0 satisfying the
following properties:
 Independent increments

 Stationary increments

 Gaussian increments: for all t, h > 0, Xt+h − Xt ~ 𝑁(0, 𝜎²ℎ)


 Continuity of sample paths
[Figure: sample path of a Brownian motion X(t) over t ∈ [0, 5]]
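A minimal simulation sketch of such a path (an illustrative assumption: Python with NumPy, independent Gaussian increments on a regular grid; step size, σ and seed are arbitrary choices):

import numpy as np

rng = np.random.default_rng(0)
sigma, T, n = 1.0, 5.0, 1000            # volatility, horizon, number of steps
dt = T / n
t = np.linspace(0.0, T, n + 1)
# independent, stationary Gaussian increments: X(t+dt) - X(t) ~ N(0, sigma^2 * dt)
dX = rng.normal(0.0, sigma * np.sqrt(dt), size=n)
X = np.concatenate(([0.0], np.cumsum(dX)))   # X(0) = 0
print(t[-1], X[-1])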

6
2. Basic notions: Definition and first examples

 ! The 3rd condition (stochastic continuity) does not imply
continuity of sample paths !

 The idea is to exclude processes having jumps systematically at
deterministic instants
● Corresponding e.g. to calendar effects

 Stochastic continuity only means that the probability of observing
a jump at a particular instant t is zero; it does not imply continuity of the
sample paths

7
2. Basic notions: Definition and first examples
Poisson process

 ! The 3rd condition does not imply continuity of sample paths !


● Counter-example: Poisson process
 Let (τn) be a sequence of independent exponential variables with
parameter λ and Tn = Σi=1,…,n τi
 The counting process N(t) defined by:

𝑁(𝑡) = Σ_{𝑛≥1} 𝕀_{𝑇𝑛 ≤ 𝑡} = #{𝑛 ∈ ℕ: 𝑇𝑛 ≤ 𝑡}
is called a Poisson process with intensity λ.
 Typically used for modelling rare events:

[Diagram: inter-arrival times τ1, τ2, τ3, τ4 between the jump times; at the illustrated instant t, N(t) = 3, i.e. the number of occurrences of the event up to t equals 3]
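A simulation sketch built directly from this definition (Python/NumPy assumed; λ, the horizon and the buffer size are arbitrary choices): draw i.i.d. exponential inter-arrival times τi, cumulate them into the jump times Tn, and count how many fall below t.

import numpy as np

rng = np.random.default_rng(1)
lam, horizon = 5.0, 2.0
# i.i.d. exponential inter-arrival times (draw more than needed, then truncate)
tau = rng.exponential(scale=1.0 / lam, size=int(10 * lam * horizon) + 50)
jump_times = np.cumsum(tau)
jump_times = jump_times[jump_times <= horizon]

def N(t, jumps=jump_times):
    # N(t) = #{n : T_n <= t}
    return int(np.searchsorted(jumps, t, side="right"))

print(N(1.0), N(2.0))   # counts of events up to t = 1 and t = 2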
8
2. Basic notions: Definition and first examples
Poisson Process

 Proposition. Let (Nt)tϵ[0,T] be a Poisson process. Then:


1. For each t>0, 𝑁𝑡 < ∞ a.s. (𝑁𝑡(𝜔) is well defined)
2. For almost every ω, the path 𝑡 ↦ 𝑁𝑡(𝜔) is piecewise constant and increases
by jumps of size 1
3. The sample paths 𝑡 ↦ 𝑁𝑡 are cadlag
4. For any fixed t>0, 𝑁𝑡− = 𝑁𝑡 with probability 1 (and hence, by 3, Nt is
a.s. continuous at t),
i.e. the probability of having a jump at the given instant t is zero

5. 𝑁𝑡 is continuous in probability: ∀𝑡 > 0, 𝑁𝑠 → 𝑁𝑡 in probability as 𝑠 → 𝑡
6. For any t>0, the r.v. 𝑁𝑡 follows a Poisson distribution with parameter λt:
∀𝑛 ∈ ℕ: ℙ(𝑁𝑡 = 𝑛) = 𝑒^{−𝜆𝑡}(𝜆𝑡)^𝑛 / 𝑛!

9
2. Basic notions: Definition and first examples
Poisson Process

 Proposition. Let (Nt)tϵ[0,T] be a Poisson process. Then:


7. The characteristic function of 𝑁𝑡 is given by:
𝐸[𝑒^{𝑖𝑢𝑁𝑡}] = exp(𝜆𝑡(𝑒^{𝑖𝑢} − 1)), ∀𝑢 ∈ ℝ
8. (Nt ) has independent increments
9. (Nt) has stationary increments
10. (Nt) has the Markov property:
∀𝑡 > 𝑠: 𝐸[𝑓(𝑁𝑡) | 𝑁𝑢, 𝑢 ≤ 𝑠] = 𝐸[𝑓(𝑁𝑡) | 𝑁𝑠]
for any measurable bounded function f

 Idea of the proof


10
2. Basic notions: Definition and first examples
Poisson Process

 Sample path of a Poisson process:

[Figure: sample path of a Poisson process N(t) over 2 years]
11
2. Basic notions: Definition and first examples
Poisson Process

 Link with counting processes


● Definition : counting process

● A Poisson process is a counting process

 Proposition: Let (Xt) be a counting process with independent and
stationary increments. Then (Xt) is a Poisson process.

● Idea of the proof

12
2. Basic notions: Definition and first examples
Poisson Process

 Example 3: Compensated Poisson process


Ñ(𝑡) = 𝑁(𝑡) − 𝜆𝑡,
where 𝑁(𝑡) is a Poisson process with intensity 𝜆

● We see immediately that this process has independent and stationary
increments, and is moreover a martingale

● Characteristic function:
Φ_{Ñ𝑡}(𝑧) = 𝔼[𝑒^{𝑖𝑧Ñ𝑡}] = 𝑒^{𝜆𝑡(𝑒^{𝑖𝑧} − 1 − 𝑖𝑧)}, 𝑧 ∈ ℝ

● « λt » is called the compensator of (Nt)
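A quick numerical check of these properties (a sketch assuming Python/NumPy; λ, t and the sample size are arbitrary): at a fixed time t, Ñt = Nt − λt has mean 0 and variance λt.

import numpy as np

rng = np.random.default_rng(2)
lam, t, n_paths = 5.0, 2.0, 100_000
Nt = rng.poisson(lam * t, size=n_paths)      # N(t) ~ Poisson(lambda * t)
Ntilde = Nt - lam * t                        # compensated value at time t
print(Ntilde.mean())                         # ~ 0
print(Ntilde.var(), lam * t)                 # ~ lambda * t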

13
2. Basic notions: Definition and first examples
Poisson Process

 Example 3: Compensated Poisson process


● Illustration of the rescaled process Ñ(𝒕)/√𝝀 and comparison with a standard
Brownian motion W(t)
[Figures: sample path of a compensated Poisson process (λ = 5) and sample path of a standard Brownian motion, over 10 years]

Var(Ñ(𝑡)/√𝜆) = 𝑡 = Var(𝑊(𝑡))
14
2. Basic notions: Definition and first examples
Poisson Process

 Example 3: Compensated Poisson process


● Illustration with 20-year simulations, same intensity (𝜆 = 5)

[Figure: 20-year sample path of a standard Brownian motion W(t)]

15
2. Basic notions: Definition and first examples
Poisson Process

 Example 3: Compensated Poisson process


● Result: Ñ𝑡/√𝜆 → 𝑊𝑡 in distribution as 𝜆 → ∞
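A sketch of an empirical check of this convergence (Python/NumPy assumed; the values of λ and the sample size are arbitrary): for large λ, samples of Ñt/√λ should be close to N(0, t), the marginal law of Wt.

import numpy as np

rng = np.random.default_rng(3)
t, n_samples = 1.0, 200_000
for lam in (1.0, 10.0, 1000.0):
    Ntilde = rng.poisson(lam * t, size=n_samples) - lam * t
    Z = Ntilde / np.sqrt(lam)
    # mean ~ 0, variance ~ t; the skewness 1/sqrt(lambda*t) vanishes as lambda grows
    skew = ((Z - Z.mean()) ** 3).mean() / Z.std() ** 3
    print(lam, Z.mean(), Z.var(), skew)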

16
2. Basic notions: Definition and first examples
Compound Poisson Process

 Example 4: Compound Poisson process


● Definition: a compound Poisson process with intensity λ and jump size
distribution F is a stochastic process (Xt) defined by :
𝑋𝑡 = Σ_{𝑖=1}^{𝑁𝑡} 𝑌𝑖,
where
 jump sizes 𝑌𝑖 are i.i.d. with distribution F, and
 (𝑁𝑡 ) is a Poisson process with intensity λ

● Properties:
 Trajectories are cadlag, piecewise constant

 Jump times (𝑇𝑖 )𝑖≥1 are partial sums of i.i.d. exponential variables , as in a
Poisson process
 The Poisson process is a particular case of a compound Poisson process: it
corresponds to the case 𝑌𝑖 ≡ 1 for all i.
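A simulation sketch of a compound Poisson path (Python/NumPy assumed; the Gaussian jump-size law N(2, 2), read here with variance 2, is only an illustrative choice matching the later figures):

import numpy as np

rng = np.random.default_rng(4)
lam, horizon = 10.0, 2.0
# jump times of the driving Poisson process
tau = rng.exponential(1.0 / lam, size=int(10 * lam * horizon) + 50)
jump_times = np.cumsum(tau)
jump_times = jump_times[jump_times <= horizon]
# i.i.d. jump sizes Y_i with distribution F (illustrative choice: N(2, 2))
Y = rng.normal(loc=2.0, scale=np.sqrt(2.0), size=jump_times.size)

def X(t):
    # X(t) = sum of the Y_i whose jump time T_i is <= t
    return Y[:np.searchsorted(jump_times, t, side="right")].sum()

print(X(1.0), X(2.0))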

18
2. Basic notions: Definition and first examples
Compound Poisson Process

 Example 4: Compound Poisson process


● One can see that it is a Lévy process.
● Moreover, one can show that it is the only Lévy process with piecewise
constant trajectories:

Proposition:
(𝑋𝑡)𝑡≥0 is a compound Poisson process
⟺
(𝑋𝑡)𝑡≥0 is a Lévy process and its sample paths are
piecewise constant functions

Proof:

19
2. Basic notions: Definition and first examples
Compound Poisson Process

 Example 4: Compound Poisson process


● Illustration

[Figures: sample path of a Poisson process and sample path of a compound Poisson process (λ = 10, Y ~ N(2,2)), over 2 years]

20
2. Basic notions: Definition and first examples
Compound Poisson Process

 Example 4: Compound Poisson process


● Illustration

[Figures: sample path of a compound Poisson process (λ = 10, Y ~ N(2,2)) and sample path of a jump-diffusion, over 2 years]

21
2. Basic notions: Definition and first examples
From random walk to Levy processes

 Definition: Infinite divisibility


● A distribution F on ℝd is said to be infinitely divisible if, for any
integer n ≥ 2, there exist n i.i.d. random variables Y1, …, Yn such that
Y1 + … + Yn ∼ F
 If X(t) is a Lévy process, we can sample it at regular time intervals
0, Δ, 2Δ, …, nΔ = t and consider its increments over these intervals:
𝑌𝑖 = 𝑋((𝑖+1)Δ) − 𝑋(𝑖Δ), 𝑖 = 0, …, 𝑛−1

 The Yi are i.i.d. random variables with the same distribution as XΔ.


 We can consider the associated random walk (a discrete-time process):

𝑆𝑘 = 𝑆𝑘(Δ) = Σ_{𝑖=0}^{𝑘−1} 𝑌𝑖

22
2. Basic notions: Definition and first examples
From random walk to Levy processes

 As 𝑆𝑛(Δ) = 𝑋(𝑛Δ) = 𝑋(𝑡), we see that we can decompose X(t) into n i.i.d.
parts:
𝑆𝑛(Δ) = 𝑋(𝑡) = Σ_{𝑖=0}^{𝑛−1} 𝑌𝑖
(each 𝑌𝑖 having the same distribution as X(t/n))

 the distribution of X(t) is infinitely divisible:


If (X(t)) is a Levy process, then for every t,
X(t) has an infinitely divisible distribution

 This restricts the possible choices for the distribution of X(t)
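A sketch of this decomposition in the Poisson case (Python/NumPy assumed; λ, t, n are arbitrary): X(t) ~ Poisson(λt) has the same law as the sum of n i.i.d. copies of X(t/n) ~ Poisson(λt/n), which is exactly what infinite divisibility asserts.

import numpy as np

rng = np.random.default_rng(5)
lam, t, n, n_samples = 5.0, 2.0, 8, 200_000
direct = rng.poisson(lam * t, size=n_samples)                            # X(t) sampled directly
increments = rng.poisson(lam * t / n, size=(n_samples, n)).sum(axis=1)   # sum of n i.i.d. X(t/n)
print(direct.mean(), increments.mean())   # both ~ lambda * t
print(direct.var(), increments.var())     # both ~ lambda * t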

23
2. Basic notions: Definition and first examples
From random walk to Levy processes

 Now, one can see that the converse is also true : every infinitely
divisible distribution F leads to the definition of a Lévy process
X(t) such that X(1) ~ F

 Proposition: Let (Xt) be a Lévy process. Then for all t, Xt has an
infinitely divisible distribution. Moreover, for any infinitely
divisible distribution F, there exists a Lévy process (Xt) such that
the distribution of X1 is F.

24
2. Basic notions: Definition and first examples
From random walk to Levy processes

  natural way to construct Lévy processes: start from an infinitely
divisible distribution and use it to construct the increments:

 For every infinitely divisible distribution with characteristic
function ϕ(u), we can define an associated Lévy process that:
● Starts at 0: X(0) = 0
● Has independent and stationary increments
● Has increments X(t+s) − X(s) over [s, s+t] with characteristic function (ϕ(u))^t

 this leads to the introduction of many Levy processes
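A numerical sketch of the (ϕ(u))^t rule (Python/NumPy assumed), using the Gamma(a, b) distribution, whose characteristic function is ϕ(u) = (1 − iu/b)^{−a}: raising ϕ to the power t gives the characteristic function of Gamma(at, b), i.e. the law of the increment of the associated Lévy process over a time interval of length t.

import numpy as np

a, b, t = 2.0, 3.0, 1.7
u = np.linspace(-5.0, 5.0, 11)
phi = (1.0 - 1j * u / b) ** (-a)                 # char. function of Gamma(a, b) = law of X(1)
phi_t = phi ** t                                  # candidate char. function of X(t)
phi_gamma_at = (1.0 - 1j * u / b) ** (-a * t)     # char. function of Gamma(a*t, b)
print(np.max(np.abs(phi_t - phi_gamma_at)))       # ~ 0: same distribution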

25
2. Basic notions: Definition and first examples

1. Brownian motion
2. Poisson process
3. Compensated Poisson process
4. Compound Poisson process:
5. Gamma process (G)
6. Variance Gamma process (VG)
7. Inverse Gaussian process (IG)
8. Normal Inverse Gaussian process (NIG)
9. Generalized Inverse Gaussian process (GIG)
10. Meixner
 …
26
2. Basic notions: Definition and first examples
Example: Gamma process

 Gamma distribution:

b a a 1
f gamma ( x; a, b)  x exp(  xb), x  0
( a )
 a: shape parameter, b : intensity parameter
 If 𝑋𝑖 ∼ Γ(𝑎𝑖, 𝑏) are independent, then Σ_{𝑖=1}^{𝑛} 𝑋𝑖 ∼ Γ(Σ_{𝑖=1}^{𝑛} 𝑎𝑖, 𝑏)
 Gamma process: (X(t))t≥0 such that X(t) ~ Gamma(at, b)
 Scaling property: if X ~ Γ(a, b), then for any c>0, cX ~ Γ(a, b/c)

Gamma(a, b)
Mean a/b
Variance a/b²
Skewness 2/a^{1/2}
Kurtosis 3(1 + 2/a)
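A simulation sketch of a Gamma process path (Python/NumPy assumed): since X(t) ~ Gamma(at, b) and increments are independent, the increment over a step Δ is Gamma(aΔ, b); note that NumPy's gamma sampler uses a (shape, scale) convention, so scale = 1/b.

import numpy as np

rng = np.random.default_rng(6)
a, b = 10.0, 20.0          # shape and intensity (rate) parameters
T, n = 1.0, 1000
dt = T / n
dX = rng.gamma(shape=a * dt, scale=1.0 / b, size=n)   # independent Gamma(a*dt, b) increments
X = np.concatenate(([0.0], np.cumsum(dX)))            # increasing sample path
print(X[-1], a * T / b)                               # X(T) has mean a*T/b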
27
2. Basic notions: Definition and first examples
Example: Gamma process

[Figure: Gamma densities for a = 1, b = 0.1 to 1 (in steps of 0.1)]

28
2. Basic notions: Definition and first examples
Example: Gamma process

Alternative parameterization:
 Gamma process 𝐺𝑡 (𝜇, 𝜈) with mean parameter 𝜇 and variance
parameter 𝜈:
● Lévy process following a Gamma distribution with mean 𝜇𝑡 and variance
𝜈𝑡
 We parameterize here with the mean and the variance of G1

 Properties:
● Discontinuous sample paths
● Increasing process (!!)
●  can be used as a subordinator for defining new Lévy processes

29
2. Basic notions: Definition and first examples
Example: Variance Gamma process

 Process defined from the characteristic function of the VG law,
as an explicit function of 3 parameters σ > 0, θ ∈ ℝ, ν > 0:
VG(σ, ν, θ)
 It is also possible to define it as a Gamma time-changed
Brownian motion with drift:
● Let G(t) be a Gamma process of parameters a=b=1/ν >0
● Let W(t) be a standard Brownian motion
● Let 𝜎 > 0, 𝜃 ∈ ℝ
● Then the variance gamma process XVG(t) with parameters σ,θ,ν is
defined as:
𝑋^{𝑉𝐺}(𝑡) = 𝜃𝐺(𝑡) + 𝜎𝑊(𝐺(𝑡))
 Brownian motion with drift where the physical deterministic time « t »
is replaced by a Gamma process G(t)

  subordination of a Brownian motion by a Gamma process
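A simulation sketch of this subordinated construction (Python/NumPy assumed; the parameter values and grid are arbitrary): the subordinator G has a = b = 1/ν, so its increment over a step Δ is Gamma with shape Δ/ν and scale ν (mean Δ, variance νΔ), and the VG increment is then conditionally Gaussian.

import numpy as np

rng = np.random.default_rng(7)
sigma, theta, nu = 0.2, -0.1, 0.5
T, n = 1.0, 1000
dt = T / n
# Gamma subordinator increments: shape dt/nu, scale nu  (mean dt, variance nu*dt)
dG = rng.gamma(shape=dt / nu, scale=nu, size=n)
# X(t+dt) - X(t) = theta*dG + sigma*(W(G+dG) - W(G)),  and  W(G+dG) - W(G) | dG ~ N(0, dG)
dX = theta * dG + sigma * np.sqrt(dG) * rng.standard_normal(n)
X_VG = np.concatenate(([0.0], np.cumsum(dX)))
print(X_VG[-1], theta * T)   # E[X_VG(T)] = theta * T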


30
2. Basic notions: Definition and first examples
Example: Variance Gamma process

 Properties:
● When θ=0, then the distribution is symmetric (no skewness)
● θ>0  positive skewness
● Parameter ν controls the kurtosis

Moment      VG(σ, ν, θ)                              VG(σ, ν, 0) (symmetric case)
Mean        θ                                        0
Variance    σ² + νθ²                                 σ²
Skewness    θν(3σ² + 2νθ²)/(σ² + νθ²)^{3/2}          0
Kurtosis    3(1 + 2ν − νσ⁴(σ² + νθ²)^{−2})           3(1 + ν)

● Explicit expression for the density function

31
2. Basic notions: Definition and first examples
Example: Variance Gamma process

 Alternative definition :
● C, G, M (Carr, Geman, Madan) parameterization:

C = 1/ν
G = ( (¼θ²ν² + ½σ²ν)^{1/2} − ½θν )^{−1}
M = ( (¼θ²ν² + ½σ²ν)^{1/2} + ½θν )^{−1}

with C, G, M > 0

●  X^{VG}(t) = G^{(1)}(t) − G^{(2)}(t), where

 G^{(1)}(t) ~ Gamma(C, M)
 G^{(2)}(t) ~ Gamma(C, G)
 G^{(1)}, G^{(2)} independent
 defined as the difference of two independent Gamma processes

● Possibility to get moments in terms of this new parameterization
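A sketch of the same process built as a difference of two Gamma variables at t = 1, with C, G, M computed from (σ, ν, θ) as above (Python/NumPy assumed; the sanity checks use the moments of the previous slides: mean θ and variance σ² + νθ²).

import numpy as np

rng = np.random.default_rng(8)
sigma, theta, nu = 0.2, -0.1, 0.5
# Carr-Geman-Madan parameterization
root = np.sqrt(theta**2 * nu**2 / 4.0 + sigma**2 * nu / 2.0)
C = 1.0 / nu
G = 1.0 / (root - theta * nu / 2.0)
M = 1.0 / (root + theta * nu / 2.0)

n_samples = 200_000
# X_VG(1) = G1(1) - G2(1), with G1(1) ~ Gamma(C, M) and G2(1) ~ Gamma(C, G) (shape C, rates M and G)
G1 = rng.gamma(shape=C, scale=1.0 / M, size=n_samples)
G2 = rng.gamma(shape=C, scale=1.0 / G, size=n_samples)
X1 = G1 - G2
print(X1.mean(), theta)                      # ~ theta
print(X1.var(), sigma**2 + nu * theta**2)    # ~ sigma^2 + nu*theta^2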

32
2. Basic notions: Definition and first examples
Example: Variance Gamma process

 There is an analytical expression for the VG density in terms
of the C, G, M parameterization
 G − M controls the skewness:
● G = M: symmetric case
● G < M: negative skewness, G > M: positive skewness

 C controls the kurtosis


 C-G-M = Carr-Geman-Madan

33
2. Basic notions: Definition and first examples
Example: Variance Gamma process

 In Summary: 3 possible definitions

● Definition in terms of the law of increments: Variance-
Gamma law (through the VG characteristic function)
● Subordination of a Brownian motion with drift by a Gamma
process
● Difference of two independent Gamma processes (with Carr-
Geman-Madan parameterization)

34
2. Basic notions: Definition and first examples
Example: Inverse Gaussian process

 Lévy process constructed from the Inverse Gaussian distribution:

𝑓_IG(𝑥; 𝑎, 𝑏) = (𝑎/√(2𝜋)) exp(𝑎𝑏) 𝑥^{−3/2} exp(−½(𝑎²𝑥^{−1} + 𝑏²𝑥)), 𝑥 > 0

 Properties of the IG distribution:


● « a » is a shape parameter, « b » an intensity parameter (the mean is a/b)
● the support of the distribution is (0,∞)
● Infinitely divisible distribution
● Can be seen as the first time a Brownian motion with positive drift b>0
(W(s) + bs, s≥0) reaches a positive level a>0.
● Also called Wald distribution
● The term « inverse Gaussian » comes from the inverse relationship
between its cumulant generating function (= logarithm of the
characteristic function) and that of the Gaussian distribution

35
2. Basic notions: Definition and first examples
Example: Inverse Gaussian process

𝑓_IG(𝑥; 𝛿, 𝛾) = (𝛿/√(2𝜋)) exp(𝛿𝛾) 𝑥^{−3/2} exp(−½(𝛿²𝑥^{−1} + 𝛾²𝑥)), 𝑥 > 0

36
2. Basic notions: Definition and first examples
Example: Inverse Gaussian process

 Scaling property:

if X ~ IG(a, b), then for any c>0, cX ~ IG(c^{1/2}a, b/c^{1/2}).

 Moments:

IG(a, b)
Mean a/b
Variance a/b³
Skewness 3(ab)^{−1/2}
Kurtosis 3(1 + 5/(ab))
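A simulation sketch of an IG process path (Python/NumPy assumed): X(t) ~ IG(at, b), so the increment over a step Δ is IG(aΔ, b); NumPy's wald sampler uses a (mean, scale) parameterization, and matching the mean a/b and variance a/b³ of IG(a, b) gives mean = aΔ/b and scale = (aΔ)² (this identification is part of the sketch's assumptions).

import numpy as np

rng = np.random.default_rng(9)
a, b = 1.0, 2.0
T, n = 1.0, 1000
dt = T / n
# IG(a*dt, b) increment  <->  Wald(mean = a*dt/b, scale = (a*dt)^2)
dX = rng.wald(mean=a * dt / b, scale=(a * dt) ** 2, size=n)
X_IG = np.concatenate(([0.0], np.cumsum(dX)))   # increasing process (a subordinator)
print(X_IG[-1], a * T / b)                      # mean of X(T) is a*T/b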

37
2. Basic notions: Definition and first examples
Example: Normal Inverse Gaussian process

 Lévy process constructed from the Normal Inverse Gaussian
distribution (NIG)
 Infinitely divisible distribution
 Density function:

𝑓_NIG(𝑥; 𝛼, 𝛽, 𝛿) = (𝛼𝛿/𝜋) exp(𝛿√(𝛼² − 𝛽²) + 𝛽𝑥) K₁(𝛼√(𝛿² + 𝑥²)) / √(𝛿² + 𝑥²)

where K₁ is the modified Bessel function of the second kind of order 1

 The process can also be seen as a Brownian motion with drift
evaluated at a stochastic time following an Inverse Gaussian
process (subordination)
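A simulation sketch of this construction (Python/NumPy assumed): per step, draw an Inverse Gaussian time increment and then a conditionally Gaussian increment with drift β. The identification of the subordinator as I(t) ~ IG(δt, √(α² − β²)) in the IG(a, b) convention of the previous slides is an assumption of this sketch.

import numpy as np

rng = np.random.default_rng(10)
alpha, beta, delta = 15.0, -3.0, 0.5
gam = np.sqrt(alpha**2 - beta**2)
T, n = 1.0, 1000
dt = T / n
# assumed IG subordinator increments: IG(delta*dt, gam) <-> Wald(mean = delta*dt/gam, scale = (delta*dt)^2)
dI = rng.wald(mean=delta * dt / gam, scale=(delta * dt) ** 2, size=n)
# NIG increment: beta * dI + (W(I+dI) - W(I)), conditionally N(beta*dI, dI)
dX = beta * dI + np.sqrt(dI) * rng.standard_normal(n)
X_NIG = np.concatenate(([0.0], np.cumsum(dX)))
print(X_NIG[-1], delta * beta * T / gam)   # E[X_NIG(T)] = delta*beta*T / sqrt(alpha^2 - beta^2)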

38
2. Basic notions: Definition and first examples
Example: Normal Inverse Gaussian process

 Parameters (density as on the previous slide):
● α: tail parameter
● β: asymmetry parameter
● δ: scale parameter

 Moments:

NIG(α, β, δ)
Mean 𝛿𝛽/√(𝛼² − 𝛽²)
Variance 𝛿𝛼²(𝛼² − 𝛽²)^{−3/2}
Skewness 3𝛽𝛼^{−1}𝛿^{−1/2}(𝛼² − 𝛽²)^{−1/4}
Kurtosis 3(1 + (𝛼² + 4𝛽²)/(𝛿𝛼²√(𝛼² − 𝛽²)))

39
2. Basic notions: Definition and first examples
Example: Generalized Inverse Gaussian process

 Generalization of the IG process


 Density function

𝑓_GIG(𝑥; 𝜆, 𝑎, 𝑏) = ((𝑏/𝑎)^𝜆 / (2K_𝜆(𝑎𝑏))) 𝑥^{𝜆−1} exp(−½(𝑎²𝑥^{−1} + 𝑏²𝑥)), 𝑥 > 0
 Moments:

GIG(λ, a, b)
Mean 𝑎K_{𝜆+1}(𝑎𝑏)/(𝑏K_𝜆(𝑎𝑏))
Variance 𝑎²𝑏^{−2}K_𝜆^{−2}(𝑎𝑏)(K_{𝜆+2}(𝑎𝑏)K_𝜆(𝑎𝑏) − K_{𝜆+1}²(𝑎𝑏))

40
2. Basic notions: Definition and first examples
Example: Meixner process

 Defined from the Meixner distribution
● Infinitely divisible distribution
● Has no Brownian component
● Density function:

𝑓_Meixner(𝑥; 𝛼, 𝛽, 𝛿) = ((2cos(𝛽/2))^{2𝛿} / (2𝛼𝜋Γ(2𝛿))) exp(𝛽𝑥/𝛼) |Γ(𝛿 + 𝑖𝑥/𝛼)|²

where 𝛼 > 0, −𝜋 < 𝛽 < 𝜋, 𝛿 > 0.

41
2. Basic notions: Definition and first examples
Example: Meixner process

 Moments:

Meixner(α, β, δ)
Mean 𝛼𝛿 tan(𝛽/2)
Variance ½𝛼²𝛿 cos^{−2}(𝛽/2)
Skewness sin(𝛽/2)√(2/𝛿)
Kurtosis 3 + (2 − cos 𝛽)/𝛿

42
2. Basic notions: Definition and first examples
Example: CGMY process

 CGMY = Carr-Geman-Madan-Yor
 Characteristic function:

𝜙_CGMY(𝑢; 𝐶, 𝐺, 𝑀, 𝑌)
= exp( 𝐶Γ(−𝑌)[ (𝑀 − 𝑖𝑢)^𝑌 − 𝑀^𝑌 + (𝐺 + 𝑖𝑢)^𝑌 − 𝐺^𝑌 ] )

where C, G, M > 0 and Y < 2

 Analytical expressions for the mean, variance, skewness and
kurtosis ([Sch] pg 61)
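A sketch of this characteristic function as a Python function (SciPy's gamma function assumed available; parameter values are arbitrary, and ϕ(0) = 1 serves as a basic sanity check):

import numpy as np
from scipy.special import gamma as Gamma

def phi_cgmy(u, C, G, M, Y):
    # exp( C * Gamma(-Y) * ((M - iu)^Y - M^Y + (G + iu)^Y - G^Y) ), for Y < 2 and Y not in {0, 1}
    u = np.asarray(u, dtype=complex)
    return np.exp(C * Gamma(-Y) * ((M - 1j * u) ** Y - M ** Y + (G + 1j * u) ** Y - G ** Y))

print(phi_cgmy(0.0, C=1.0, G=5.0, M=5.0, Y=0.5))   # 1 (+0j)
print(phi_cgmy(1.0, C=1.0, G=5.0, M=5.0, Y=0.5))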

43
2. Basic notions: Definition and first examples
Example: CGMY process

44
2. Basic notions: Definition and first examples
Example: CGMY process

 Pure jump process, no Brownian part

 Path behaviour determined by the « Y » parameter


● Y < 0  finite number of jumps in any finite time interval (finite activity)
● 0 ≤ Y < 2  infinite number of jumps in any finite time interval (infinite activity)

 Appears as a generalisation of the VG process:


● If Y=0, the CGMY process becomes a VG process (with the corresponding
parameters C,G,M)

  also called « Generalised VG »

45
