
MSc Financial Mathematics - SMM302

2 Stochastic Processes
Stochastic processes are, roughly speaking, nothing but random variables evolving over
time; for this reason we need the whole machinery presented in the previous chapter.
However, since we now have an additional dimension to take care of, namely time, we
need to adjust our probability space slightly; this is done in the next section. We will
then introduce the most common processes you will deal with over the rest of the year.

2.1 Some introductory definitions


Definition 1 (Filtration) Let Ω be a nonempty set. Let T be a fixed positive number,
and assume that for each t ∈ [0, T ] there is a σ-algebra Ft . Assume further that Fs ⊂ Ft
for all 0 ≤ s < t ≤ T . Then we call the collection of σ-algebras Ft , 0 ≤ t ≤ T , a filtration,
and (Ω, F , P, Ft ) a filtered probability space.

We can consider the filtration Ft as the set of information available at time t. In other
words we can consider (F t )t≥0 as describing the flow of information over time, where we
suppose that we do not lose information as time passes (this is why we say Fs ⊂ Ft for
s < t).

Example 1 Consider Example 1.3.3. The sets F (1) and F (2) contain the information
learned by observing the first one and the first two movements of the stock price,
respectively. Alternatively, you can consider these two sets as containing the information
gained after 1 period of time and after 2 periods of time, respectively. If so, then we can
index those σ-algebras by time, and {F0 , F1 , F2 } is an example of a filtration.

Hence, a filtration tells us the information we will have at future times; more precisely,
when we get to time t, we will know for each set in Ft whether the true ω lies in that set.

Definition 2 (Stochastic process) A stochastic process is a family X = (Xt : t ≥ 0)


of random variables defined on the filtered probability space (Ω, F , P, Ft).

Example 2 (Random walk) 1. A (simple) symmetric random walk is a stochastic
process S defined as Sn = Σ_{i=1}^n Xi , where {Xi }i∈N is a sequence of independent and
identically distributed (i.i.d.) random variables such that

Xi =  +1  with probability 1/2
      −1  with probability 1/2.
2. The evolution of the stock price described in Example 1.7 is a stochastic process
defined as
Sn = S0 · 2^{Σ_{i=1}^n Xi} ,    S0 = 4,

where {Xi }i∈N is a sequence of independent and identically distributed (i.i.d.) random
variables such that

Xi =  +1  with probability p
      −1  with probability q = 1 − p.
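Purely as an illustration (not part of the original notes), the following Python sketch simulates one path of each of the two processes above; the horizon n = 10 and the probability p are arbitrary example choices.

import numpy as np

rng = np.random.default_rng(42)

def random_walk(n, p=0.5):
    # S_n = X_1 + ... + X_n with X_i = +1 w.p. p and -1 w.p. 1 - p; S_0 = 0
    steps = rng.choice([1, -1], size=n, p=[p, 1 - p])
    return np.concatenate(([0], np.cumsum(steps)))

def stock_price(n, s0=4, p=0.5):
    # S_n = S_0 * 2^(X_1 + ... + X_n): the price doubles w.p. p and halves w.p. 1 - p
    steps = rng.choice([1, -1], size=n, p=[p, 1 - p])
    return s0 * 2.0 ** np.concatenate(([0], np.cumsum(steps)))

print(random_walk(10))        # one path of the simple symmetric random walk
print(stock_price(10))        # one path of the binomial stock-price model of Example 1.7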

We say that X is adapted to Ft if Xt is Ft - measurable ∀t ≥ 0.


We use Xt as shorthand notation for X (t, ω). Note that for each fixed ω ∈ Ω, X(·, ω)
denotes the path of the stochastic process X.

Definition 3 (Natural filtration) Let (Ω, F , P, Ft) be a filtered probability space and
X be a stochastic process adapted to the filtration (F t )t≥0 . The natural filtration of X is
defined by the set of σ-algebras Ft = σ (Xs : 0 ≤ s ≤ t) , 0 ≤ t < ∞.

Example 3 The natural filtration of the process S defined in Example 2.2 is given by
the sequence {F0 , F1 , F2 , ...} discussed in Example 2.1.

2.2 Classes of processes


2.2.1 Martingale
A stochastic process X = (Xt : t ≥ 0) is a martingale relative to (F ,P) if:

1. X is adapted and E |Xt | < ∞ ∀t ≥ 0;

2. E [Xt |F s ] = Xs ∀s ≤ t.

A property of martingales that comes as consequence of the definition is the following:

• E [Xt − Xs |Fs ] = 0 ∀s ≤ t

The martingale properties show that a martingale is a random process whose future
variations are completely unpredictable given the current information set. For this reason,
the best forecast of the change in X over an arbitrary interval is zero. A martingale
represents a fair game: given the knowledge we have, on average the payoff produced by
the bet equals what we invested in it.

Example 4 Let Y be an integrable random variable. Then the stochastic process Xt =
E [Y |Ft ], t ≥ 0, is a martingale relative to (F , P). In fact, E |Xt | < ∞ by definition of
conditional expectation. Also, for all s ≤ t

E [Xt |Fs ] = E [E(Y |Ft ) |Fs ]
           = E (Y |Fs ) = Xs ,

where the last-but-one equality follows from the tower property.

Example 5 A symmetric random walk is a martingale (very easy to check: E |Sn | ≤ n < ∞, and E [Sn+1 |Fn ] = Sn + E [Xn+1 ] = Sn , since the increments are independent of the past and have zero mean).
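As a rough numerical sanity check of the martingale property (a sketch only, with arbitrary values of n, m and the number of simulations): fix one realised history up to time n, simulate many continuations, and compare the sample average of S_{n+m} with Sn.

import numpy as np

rng = np.random.default_rng(0)

n, m, trials = 5, 20, 200_000
history = rng.choice([1, -1], size=n)     # one realised path X_1, ..., X_n
Sn = history.sum()

# many independent continuations X_{n+1}, ..., X_{n+m}
future_sums = rng.choice([1, -1], size=(trials, m)).sum(axis=1)

print(Sn)                                  # S_n
print((Sn + future_sums).mean())           # sample estimate of E[S_{n+m} | F_n]; close to S_n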



Example 6 In a simple discrete time model for the price of a share, the change in price
at time t, Xt , is assumed to be independent of anything that has happened before time t
and to have distribution

Xt =  +1  with probability p
      −1  with probability q
       0  with probability r = 1 − p − q.

Let S0 = n ∈ N be the original price of the share, Sm = S0 + Σ_{t=1}^m Xt be the price after m
periods, and define

Ym = (q/p)^{Sm} .
Then Ym is a martingale. To check, proceed as follows: as p, q > 0 and Ym ≥ 0,

E |Ym | = E [(q/p)^{Sm}] = E [(q/p)^{S0 + Σ_{t=1}^m Xt}]
        = (q/p)^n ( E [(q/p)^{X1}] )^m
        = (q/p)^n [ (q/p)^{1} p + (q/p)^{−1} q + (q/p)^{0} r ]^m
        = (q/p)^n (q + p + r)^m = (q/p)^n < ∞,

where the second line uses the independence of the Xt (and S0 = n).

Further,

E [Ym | Fm−1 ] = E [(q/p)^{Sm} | Fm−1 ] = E [(q/p)^{Sm−1 + Xm} | Fm−1 ]
             = Ym−1 E [(q/p)^{Xm} | Fm−1 ] = Ym−1 E [(q/p)^{Xm}] = Ym−1 ,

since Xm is independent of Fm−1 and, as computed above, E [(q/p)^{Xm}] = q + p + r = 1.
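The martingale property implies E (Ym ) = E (Y0 ) = (q/p)^n for every m, which can be checked by simulation; the sketch below (not part of the notes) uses arbitrary example values of p, q, n and m.

import numpy as np

rng = np.random.default_rng(1)

p, q, n = 0.3, 0.5, 10                     # P(X = +1) = p, P(X = -1) = q, S_0 = n
r = 1 - p - q
m, trials = 15, 500_000

X = rng.choice([1, -1, 0], size=(trials, m), p=[p, q, r])
S = n + X.cumsum(axis=1)                   # S_1, ..., S_m for each simulated path
Y = (q / p) ** S

print((q / p) ** n)                        # theoretical value of E(Y_m) for every m
print(Y.mean(axis=0)[[0, 4, 14]])          # sample means of Y_1, Y_5, Y_15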

Further remark: a martingale is always defined with respect to some information set and
with respect to some probability measure. If we change the information content and/or
the probabilities associated with the process, the process under consideration may cease
to be a martingale. The opposite is also true: given a process which does not behave like
a martingale, we may be able to modify the relevant probability measure and convert the
process into a martingale. This will be illustrated later on in the course when we talk
about the martingale problem and the Girsanov theorem.

Exercise 1 Let Ft be an increasing family of σ-algebras for t ≥ 0. Show that for every
random variable h, if E |h| < ∞, then Mt := E [h|Ft ] is an Ft -martingale for t ≥ 0.
Exercise 2 Define vt = V ar(Mt ), where {Mt : t ≥ 0} is a zero-mean martingale. Show
that vt is non-decreasing.

2.2.2 Super/sub martingales


Consider the usual filtered probability space. A stochastic process X is called a
• supermartingale if Xt ∈ Ft , E |Xt | < ∞ and E (Xt |F s ) ≤ Xs ∀0 ≤ s ≤ t < ∞ ;
• submartingale if Xt ∈ Ft , E |Xt | < ∞ and E (Xt |F s ) ≥ Xs ∀0 ≤ s ≤ t < ∞.
Exercise 3 Let S be a supermartingale. Show that
E (S0 ) ≥ E (ST ) .
Exercise 4 Let S be a non-negative supermartingale such that E (S0 ) ≤ E (ST ). Show
that S is a martingale.

2.2.3 Stopping time


Consider the usual filtered probability space. A stopping time with respect to the filtration
Ft , t ≥ 0, is a random variable τ : Ω → [0, ∞) such that {τ ≤ t} := {ω ∈ Ω : τ (ω) ≤ t} ∈
Ft ∀t ≥ 0.
The measurability property means that the decision whether to stop a stochastic
process at time t ∈ [0, ∞) depends on the information up to time t only.
Example 7 Let τ1 and τ2 be two stopping times. Then τ1 ∧ τ2 = min {τ1 , τ2 } is a stopping
time. In fact, by definition {τ1 ≤ t} ∈ Ft and {τ2 ≤ t} ∈ Ft . Hence:

{τ1 ∧ τ2 ≤ t} = {τ1 ≤ t, or τ2 ≤ t, or both}
              = {τ1 ≤ t} ∪ {τ2 ≤ t} ∈ Ft ,

since σ-algebras are closed under (countable) unions.
Example 8 Let Sm be a random walk, i.e. a process {Sm : m = 0, 1, . . .} such that
Sm = Sm−1 + εm , where the εm are independent and identically distributed random variables.
Consider an interval [a, b] ⊂ R. Then:

τ = inf {m : Sm ∈ [a, b]} ,

i.e. the time of the first entry of S into [a, b], is a stopping time with respect to the natural
filtration of S.
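A short sketch of how such a first-entry time can be computed from a simulated path; the choice of ±1 increments and of the interval [10, 12] is purely illustrative and not part of the notes.

import numpy as np

rng = np.random.default_rng(2)

def first_entry_time(path, a, b):
    # first index m with path[m] in [a, b]; None if the path never enters the interval
    hits = np.where((path >= a) & (path <= b))[0]
    return int(hits[0]) if hits.size > 0 else None

eps = rng.choice([1, -1], size=1000)              # i.i.d. increments
S = np.concatenate(([0], eps.cumsum()))            # random walk started at S_0 = 0

# tau is determined by the path up to tau itself only, so it is a stopping time
print(first_entry_time(S, 10, 12))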
Example 9 Suppose St is the stock price at time t. Let τ = sup {t : St ≥ 100}, i.e. the
latest date at which St ≥ 100. Then τ is not a stopping time: deciding whether {τ ≤ t} has
occurred requires knowledge of the price after time t, since the event St ≥ 100 may still
occur later than t.
Exercise 5 Consider the random time τ where X first achieves its maximum in the
interval [0, T ] . Is τ a stopping time with respect to the flow of information obtained from
X?

2.2.4 Local martingale


Let (Ω, F , P, Ft ) be a filtered probability space and X be a stochastic process such that
X0 = 0. If there is a non-decreasing sequence {τn : n ∈ N} of stopping times with

P ( lim_{n→∞} τn = ∞ ) = 1

such that the stopped process X^{τn}, defined by X^{τn}_t = X_{τn ∧ t} , is a martingale, i.e.
E [X_{τn ∧ t} |Fs ] = X_{τn ∧ s} , s ≤ t, then we say that X is a local martingale.

2.2.5 Gaussian process


Consider a time partition of [0, ∞), say {t1 , t2 , ..., tn }; collect the values of the process
X at each time point, so you get the set (Xt1 , Xt2 , ..., Xtn ). This is a random vector with
some distribution, F (t1 , t2 , ..., tn ) say. Now consider all the possible time partitions of
[0, ∞) and repeat the exercise. You get the class of all such distributions, which is called
the class of the finite-dimensional distributions of X.

Definition 4 A process X is Gaussian if all its finite-dimensional distributions are Gaus-


sian; such a process can be specified by:

1. a measurable function µ = µt with E (Xt ) = µt , the mean function;

2. a non-negative definite function σ (s, t) = cov (Xs , Xt ), the covariance function.
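For illustration only: on any finite time partition, a Gaussian process can be sampled by drawing from the corresponding multivariate normal distribution. The mean function µt = 0 and the covariance function σ(s, t) = min(s, t) used below are example choices, not part of the notes.

import numpy as np

rng = np.random.default_rng(3)

t = np.array([0.1, 0.5, 1.0, 2.0])       # a finite time partition {t1, ..., tn}
mu = np.zeros_like(t)                     # mean function mu_t = 0 (example choice)
cov = np.minimum.outer(t, t)              # covariance sigma(s, t) = min(s, t) (example choice)

# one draw of the random vector (X_{t1}, ..., X_{tn})
x = rng.multivariate_normal(mu, cov)
print(x)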

2.2.6 Markov process


Definition 5 A process X is Markov if for each t, each A ∈ σ (Xs : s > t) and B ∈
σ (Xs : s < t):

P (A |Xt , B ) = P (A |Xt ) = P (A |Ft ) ,

where Ft = σ (Xs : s ≤ t).

In other words, if you know where you are at time t, how you got there does not matter
as far as predicting the future is concerned. Equivalently, past and future are conditionally
independent given the present.
Any process with independent increments is Markov. In fact, for B ∈ σ (Xτ : τ ≤ s) and
E a set in the state space,

P (Xt ∈ E | {Xs = x} ∩ B ) = P (Xt − Xs + x ∈ E | {Xs = x} ∩ B )
                            = P (Xt − Xs + x ∈ E |Xs = x ) ,

since the increment Xt − Xs is independent of the past.

We distinguish Markov processes according to the nature of the state space, i.e. the
set of values that the random variables are capable of taking, and to whether we are
observing the process discretely in time or continuously in time. The classification works
as represented in the following table.

State space / Time          DISCRETE TIME     CONTINUOUS TIME

DISCRETE STATE SPACE        Markov chain      Markov jump process

CONTINUOUS STATE SPACE      (general) Markov process

Definition 6 A Markov chain is a sequence of random variables X0 , X1 , X2 , . . . taking
values in a discrete state space, with the following property:

P (Xn = j |X0 = i0 , X1 = i1 , ..., Xm = i) = P (Xn = j |Xm = i) = p_{ij}^{(m,n)} .

p_{ij}^{(m,n)} is called the transition probability, as it denotes the probability of a transition from
state i at time m to state j at time n.
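A minimal sketch (not part of the notes) of simulating a time-homogeneous Markov chain, i.e. one whose one-step transition probabilities do not depend on the time index, from a transition matrix; the matrix below is an arbitrary example.

import numpy as np

rng = np.random.default_rng(4)

# one-step transition matrix: P[i, j] = P(X_{m+1} = j | X_m = i)   (example values)
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

def simulate_chain(P, i0, n):
    # simulate X_0 = i0, X_1, ..., X_n using the rows of P as conditional distributions
    states = [i0]
    for _ in range(n):
        states.append(int(rng.choice(len(P), p=P[states[-1]])))
    return states

print(simulate_chain(P, i0=0, n=20))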

2.2.7 Point processes and the Poisson process


Consider a random series of events occurring in time. These events may be, for example,
the emission of radioactive particles, the arrival of telephone calls at an exchange, stock
market crashes, or devaluations.
Define S as the state space (the underlying space in which the points sit). A Poisson
process can then be defined via a countable subset N of S; in particular, let the random
variable Tn , n ∈ N, be the time at which the nth event occurs; then N = {T1 , T2 , T3 , . . .}.
Since the sequence {Tn }n∈N is random, N itself is random.
Generally a more accurate description of N can be obtained using a count function
N (A), which counts the number of events occurring in a prespecified set A ⊆ S. N
represents a point process, i.e. an adapted and increasing process whose jumps are equal
to 1 (i.e. the jump process ∆N takes only the values 0 and 1). For simplicity, set A = (0, t]
and write N (A) = Nt .
A Poisson process is a special kind of point process: it is such that the inter-arrival
times between consecutive events are exponentially distributed with rate λ. More precisely:

Definition 7 (Poisson process) A Poisson process on (Ω, F , P, Ft ) is an adapted point
process N such that

1. for any 0 ≤ s < t < ∞, Nt − Ns is independent of Fs , i.e. N has independent
increments;

2. Nt has a Poisson distribution with parameter νt , that is

P [Nt = n] = e^{−νt} (νt)^n / n! ,

and N is continuous in probability.

Figure 1: The Poisson process and the Poisson counter.

Figure 2: The compound Poisson process.



Note that νt is a non-decreasing function and is generally given in terms of a rate or intensity
(rate if S has dimension 1, intensity for multidimensional S). This is a positive function
λ on S such that

νt = ∫_0^t λτ dτ .

If λ is a constant, νt = λt. In this case, we have a time-homogeneous Poisson process, which has
the additional property

3. for any s < t and u < v such that t − s = v − u, the distribution of Nt − Ns is the
same as that of Nv − Nu , i.e. N has stationary increments.

The idea of the Poisson process is represented in Figure 1.
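The description in terms of exponential inter-arrival times suggests a direct simulation of the arrival times Tn and of the counter Nt ; a sketch follows (not part of the notes), with λ and t chosen purely for illustration.

import numpy as np

rng = np.random.default_rng(5)

lam, t = 2.0, 10.0                                  # rate and horizon (example values)

# inter-arrival times T_n - T_{n-1} are i.i.d. Exponential(lam); cumulative sums give T_1, T_2, ...
gaps = rng.exponential(scale=1.0 / lam, size=100)   # 100 draws are more than enough for lam * t = 20
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals <= t]

Nt = arrivals.size                                  # N_t = #{n : T_n <= t}
print(arrivals[:5])                                 # first few arrival times
print(Nt)                                           # on average close to lam * t = 20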


Further properties of the Poisson process:

MGF You go through the calculations in a very similar fashion to the case of the Poisson
random variable:

MN (k; t) = E [e^{k Nt}] = Σ_{n=0}^∞ e^{kn} e^{−λt} (λt)^n / n!
          = e^{−λt} Σ_{n=0}^∞ (λt e^k)^n / n!
          = e^{−λt} e^{λt e^k} = e^{λt (e^k − 1)} .

Note that you can differentiate MN (k) with respect to k to obtain the mean and
the variance of the process.

Mean So, let’s differentiate:




E (Nt ) = ∂/∂k MN (k; t) |_{k=0} = λt e^{k + λt(e^k − 1)} |_{k=0} = λt.

Variance And differentiate again:

∂²/∂k² MN (k; t) |_{k=0} = λt e^{k + λt(e^k − 1)} (1 + λt e^k) |_{k=0} = λt + (λt)² ,

which implies that

Var (Nt ) = E (Nt²) − [E (Nt )]² = λt + (λt)² − (λt)² = λt.
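These closed-form expressions can also be verified numerically; the sketch below (not part of the notes) compares a Monte Carlo estimate of the MGF, mean and variance with the formulas above, for arbitrary example values of λ, t and k.

import numpy as np

rng = np.random.default_rng(6)

lam, t, k = 1.5, 4.0, 0.3
N = rng.poisson(lam * t, size=1_000_000)            # samples of N_t ~ Poisson(lam * t)

print(np.exp(k * N).mean())                         # Monte Carlo estimate of M_N(k; t)
print(np.exp(lam * t * (np.exp(k) - 1)))            # closed form e^{lam t (e^k - 1)}
print(N.mean(), N.var(), lam * t)                   # mean and variance, both close to lam * t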

As just seen, the Poisson process counts the number of points in a certain subset of
S. Often each point may be labelled with some quantity (the size of a market crash, for
example), itself random, say X. This gives what is called a marked Poisson process or
compound Poisson process

N̄t = Σ_{k=1}^{Nt} Xk ,

where {Xk }k∈N is a sequence of independent and identically distributed random variables.
A possible representation of the compound Poisson process is given in Figure 2.
Also in this case, you can calculate the MGF from which you can extract mean and
variance. Assume that EX = µ and V ar (X) = σ 2 .
MGF In this case, the conditional expectation will be of great help:

M_{N̄} (h; t) = E [e^{h N̄t}] = E [ E ( e^{h Σ_{k=1}^{Nt} Xk} | Nt ) ]
             = E [ Π_{k=1}^{Nt} E (e^{h Xk}) ] = E [ (MX (h))^{Nt} ]
             = e^{λt (MX (h) − 1)} ,

where the second line uses the independence of the Xk (among themselves and of Nt ), and
the last equality follows from the Poisson MGF derived above with e^k replaced by MX (h).

Mean Just for the sake of doing something different, instead of differentiating the MGF,
we can use the cumulant generating function

K_{N̄} (h; t) = ln M_{N̄} (h; t) = λt (MX (h) − 1) .

Then

E ( N̄t ) = ∂/∂h K_{N̄} (h; t) |_{h=0} = λt ∂/∂h MX (h) |_{h=0} = λµt.

Variance Similarly,

Var ( N̄t ) = ∂²/∂h² K_{N̄} (h; t) |_{h=0} = λt ∂²/∂h² MX (h) |_{h=0} = λ (µ² + σ²) t.
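The mean and variance formulas for the compound Poisson process can likewise be checked by simulation; in the sketch below (not part of the notes) the marks Xk are taken to be normal with mean µ and variance σ², purely as an example.

import numpy as np

rng = np.random.default_rng(7)

lam, t, mu, sigma = 2.0, 5.0, 1.0, 0.5
trials = 200_000

N = rng.poisson(lam * t, size=trials)               # number of jumps in (0, t] for each trial
# compound sum of N i.i.d. marks per trial; marks ~ Normal(mu, sigma^2) as an example
Y = np.array([rng.normal(mu, sigma, size=k).sum() for k in N])

print(Y.mean(), lam * mu * t)                       # ~ lam * mu * t
print(Y.var(), lam * (mu**2 + sigma**2) * t)        # ~ lam * (mu^2 + sigma^2) * t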

Exercise 6 1. Calculate the characteristic function of the Poisson process and the
compound Poisson process.
2. Use the previous results and the properties of the characteristic function to derive
the mean and the variance of the Poisson process.
3. Repeat the same exercise for the compound Poisson process.
Exercise 7 Let N be a Poisson process with rate λ and Ft the associated filtration.
a) Calculate the mean of the process N and its variance.
b) Let MN be the moment generating function of the process N , i.e. MN (k) = E [e^{k Nt}],
for k ∈ R. Show that

MN (k) = e^{λt (e^k − 1)} .

c) Write down the conditional distribution of N (t + s) − N (t) given Ft , where s > 0,
and use your answer to find

E [ θ^{N(t+s)} |Ft ] .

d) Find a process of the form M (t) = η (t) θ^{N(t)} which is a martingale.


e) Consider a compound Poisson process. Find an expression for its mean, variance and
the moment generating function.

2.2.8 Other classes of processes


A class of processes that is becoming increasingly important in finance is the class of
Lévy processes.

Definition 8 An adapted, càdlàg (continu à droite, limite à gauche) process L := {Lt : t ≥ 0}
with L0 = 0 is a Lévy process if

1. L has increments independent of the past, i.e. Lt − Ls ⊥ Fs ∀ 0 ≤ s < t < ∞;

2. L has stationary increments, i.e. Lt − Ls has the same distribution as Lt−s ;

3. L is continuous in probability, i.e. lim_{s→t} Ls = Lt in probability.

Lévy processes can be thought of as a mixture of a jump process, like for example the
Poisson process (but not only), and a diffusion (Gaussian) process called Brownian motion
or Wiener process. Lévy processes actually represent a generalization of both the Poisson
process and the Brownian motion, since the distribution followed by the increments is not
specified. We will study Lévy processes in more detail next term; the Brownian motion
is instead the subject of the next unit, and in fact of the remainder of this module.
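Purely as an illustration of this "diffusion plus jumps" picture, and ahead of the formal treatment of Brownian motion later in the module, the sketch below simulates one path of a Gaussian diffusion part plus a compound Poisson jump part on a discrete time grid. All parameter values are arbitrary, and this is only one simple example, not a general Lévy process.

import numpy as np

rng = np.random.default_rng(8)

T, n = 1.0, 1000
dt = T / n
sigma, lam, jump_mu, jump_sigma = 0.2, 5.0, 0.0, 0.1       # example parameters

# diffusion part: independent Gaussian increments with variance sigma^2 * dt
diffusion = sigma * np.sqrt(dt) * rng.normal(size=n)

# jump part: a Poisson number of jumps per step, each with a Normal size (example choice)
n_jumps = rng.poisson(lam * dt, size=n)
jumps = np.array([rng.normal(jump_mu, jump_sigma, size=k).sum() for k in n_jumps])

L = np.concatenate(([0.0], np.cumsum(diffusion + jumps)))   # L_0 = 0
print(L[-1])                                                 # value of the path at time T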
We conclude this section by briefly mentioning that the most general class of stochastic
processes for which rules of calculus have been developed so far is the class of semimartin-
gales. Again, we will encounter these processes in Term 2.
