Stochastic Processes
Introduction
Let ξ denote the random outcome of an experiment. To every such
outcome suppose a waveform X(t, ξ) is assigned. The collection of
such waveforms {X(t, ξ_k)} forms a stochastic process. The set {ξ_k}
and the time index t can be continuous or discrete (countably
infinite or finite) as well.
Fig. 14.1: a family of waveforms X(t, ξ_k) plotted against t.
For fixed ξ_i ∈ S (the set of
all experimental outcomes), X(t, ξ_i) is a specific time function.
For fixed t = t₁, X₁ = X(t₁, ξ) is a random variable, and a pair of
samples X₁ = X(t₁, ξ), X₂ = X(t₂, ξ) has the second-order joint distribution
F_X(x₁, x₂, t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}  (14-4)
with joint density
f_X(x₁, x₂, t₁, t₂) = ∂²F_X(x₁, x₂, t₁, t₂) / ∂x₁ ∂x₂.
Properties: The autocorrelation R_XX(t₁, t₂) = E{X(t₁)X*(t₂)} is
nonnegative definite. In particular, for z = ∫_{−T}^{T} X(t) dt,
E[|z|²] = ∫_{−T}^{T} ∫_{−T}^{T} E{X(t₁) X*(t₂)} dt₁ dt₂
        = ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t₁, t₂) dt₁ dt₂.  (14-10)
PILLAI/Cha
Example 14.2: Let
X(t) = a cos(ω₀t + φ),  φ ~ U(0, 2π).  (14-11)
This gives
μ_X(t) = E{X(t)} = a E{cos(ω₀t + φ)}
       = a cos ω₀t · E{cos φ} − a sin ω₀t · E{sin φ} = 0,  (14-12)
since E{cos φ} = (1/2π) ∫₀^{2π} cos φ dφ = 0 = E{sin φ}.
Similarly
R_XX(t₁, t₂) = a² E{cos(ω₀t₁ + φ) cos(ω₀t₂ + φ)}
             = (a²/2) E{cos ω₀(t₁ − t₂) + cos(ω₀(t₁ + t₂) + 2φ)}
             = (a²/2) cos ω₀(t₁ − t₂).  (14-13)
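The zero mean (14-12) and the autocorrelation (14-13) of the random-phase cosine can be checked by Monte Carlo simulation over the ensemble of φ values; the amplitude, frequency, and sample times below are arbitrary choices, not values from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
a, w0 = 2.0, 3.0                 # arbitrary amplitude and frequency
phi = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

t1, t2 = 0.7, 1.9                # two arbitrary sample times
x1 = a * np.cos(w0 * t1 + phi)   # ensemble of X(t1) values
x2 = a * np.cos(w0 * t2 + phi)   # ensemble of X(t2) values

mean_est = x1.mean()                              # ~0 by (14-12)
R_est = np.mean(x1 * x2)                          # ensemble average
R_theory = 0.5 * a**2 * np.cos(w0 * (t1 - t2))    # (14-13)
print(mean_est, R_est, R_theory)
```

Note that the empirical autocorrelation depends on t₁ and t₂ only through t₁ − t₂, as (14-13) predicts.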
Stationary Stochastic Processes
Stationary processes exhibit statistical properties that are
invariant to shifts of the time index. Thus, for example, second-order
stationarity implies that the statistical properties of the pairs
{X(t1) , X(t2) } and {X(t1+c) , X(t2+c)} are the same for any c.
Similarly first-order stationarity implies that the statistical properties
of X(ti) and X(ti+c) are the same for any c.
In strict terms, the statistical properties are governed by the
joint probability density function. Hence a process is nth-order
Strict-Sense Stationary (S.S.S) if
f_X(x₁, x₂, …, xₙ; t₁, t₂, …, tₙ) = f_X(x₁, x₂, …, xₙ; t₁ + c, t₂ + c, …, tₙ + c)
(14-14)
for any c, where the left side represents the joint density function of
the random variables X₁ = X(t₁), X₂ = X(t₂), …, Xₙ = X(tₙ) and
the right side corresponds to the joint density function of the random
variables X₁′ = X(t₁ + c), X₂′ = X(t₂ + c), …, Xₙ′ = X(tₙ + c).
A process X(t) is said to be strict-sense stationary if (14-14) is
true for all tᵢ, i = 1, 2, …, n, n = 1, 2, …, and any c.
For a first-order strict-sense stationary process,
from (14-14) we have
f_X(x, t) = f_X(x, t + c)  (14-15)
and hence the above integral reduces to
σ_z² = (1/4T²) ∫_{−2T}^{2T} R_XX(τ) (2T − |τ|) dτ
     = (1/2T) ∫_{−2T}^{2T} R_XX(τ) (1 − |τ|/2T) dτ.  (14-24)
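The identity behind (14-24) — the double integral of R_XX(t₁ − t₂) over the square [−T, T]² collapsing to a single integral with the triangular weight (1 − |τ|/2T) — can be checked numerically. The exponential autocorrelation below is an arbitrary stand-in, not one from the text:

```python
import numpy as np

# A hypothetical w.s.s. autocorrelation, used only to check the identity
R = lambda tau: np.exp(-np.abs(tau))
T = 1.5
n = 400

# Left side: (1/4T^2) * double integral of R(t1 - t2) over [-T, T]^2
dt = 2 * T / n
t = -T + (np.arange(n) + 0.5) * dt          # midpoint grid on [-T, T]
T1, T2 = np.meshgrid(t, t)
lhs = R(T1 - T2).sum() * dt * dt / (4 * T**2)

# Right side: (1/2T) * integral of R(tau)(1 - |tau|/2T) over [-2T, 2T]
dtau = 4 * T / (2 * n)
tau = -2 * T + (np.arange(2 * n) + 0.5) * dtau
rhs = (R(tau) * (1 - np.abs(tau) / (2 * T))).sum() * dtau / (2 * T)

print(lhs, rhs)   # the two midpoint-rule estimates agree
```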
Systems with Stochastic Inputs
A deterministic system1 transforms each input waveform X (t , i ) into
an output waveform Y (t , i ) T [ X (t , i )] by operating only on the
time variable t. Thus a set of realizations at the input corresponding
to a process X(t) generates a new set of realizations {Y (t , )} at the
output associated with a new process Y(t).
Fig. 14.3: an input realization X(t, ξᵢ) is mapped by the system T[·]
into an output realization Y(t, ξᵢ).
Our goal is to study the output process statistics in terms of the input
process statistics and the system function.
¹ A stochastic system, on the other hand, operates on both the variables t and ξ.
Deterministic Systems
Linear Time-Invariant (LTI) systems: the output is the convolution of
the input with the system impulse response h(t),
Y(t) = ∫_{−∞}^{∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ.
Memoryless Systems:
The output Y(t) in this case depends only on the present value of the
input X(t), i.e.,
Y(t) = g{X(t)}.  (14-25)
Proof: verify it, or see page 397 of the book.
Linear Systems: L[] represents a linear system if
L{a₁X(t₁) + a₂X(t₂)} = a₁L{X(t₁)} + a₂L{X(t₂)}.  (14-28)
Let
Y (t ) L{ X (t )} (14-29)
represent the output of a linear system.
Time-Invariant System: L[] represents a time-invariant system if
Y(t) = L{X(t)}  ⟹  L{X(t − t₀)} = Y(t − t₀),  (14-30)
i.e., a shift in the input results in the same shift in the output.
If L[] satisfies both (14-28) and (14-30), then it corresponds to
a linear time-invariant (LTI) system.
LTI systems can be uniquely represented in terms of their output to
a delta function: the input δ(t) produces the output h(t), the impulse
response of the system (Fig. 14.6). For an arbitrary input X(t) the
output is
Y(t) = ∫_{−∞}^{∞} h(t − τ) X(τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ.  (14-31)
Eq. (14-31) follows by expressing X(t) as
X(t) = ∫_{−∞}^{∞} X(τ) δ(t − τ) dτ  (14-32)
and applying (14-28) and (14-30) to Y(t) = L{X(t)}. Thus
Y(t) = L{X(t)} = L{∫_{−∞}^{∞} X(τ) δ(t − τ) dτ}
     = ∫_{−∞}^{∞} L{X(τ) δ(t − τ)} dτ      (by linearity)
     = ∫_{−∞}^{∞} X(τ) L{δ(t − τ)} dτ      (by time-invariance)
     = ∫_{−∞}^{∞} X(τ) h(t − τ) dτ = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ.  (14-33)
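Equation (14-31) has the discrete-time counterpart Y(n) = Σ_k h(k) X(n − k). A minimal numerical check (with arbitrary toy sequences) that the explicit double loop agrees with the library convolution:

```python
import numpy as np

h = np.array([0.5, 0.3, 0.2])               # impulse response (arbitrary)
x = np.array([1.0, -1.0, 2.0, 0.0, 1.0])    # one input realization

# y[n] = sum_k h[k] x[n - k], the discrete form of (14-31)
y_direct = np.zeros(len(x) + len(h) - 1)
for n in range(len(y_direct)):
    for k in range(len(h)):
        if 0 <= n - k < len(x):
            y_direct[n] += h[k] * x[n - k]

y_np = np.convolve(h, x)                    # the same sum via the library routine
print(np.allclose(y_direct, y_np))          # True
```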
Output Statistics: Using (14-33), the mean of the output process
is given by
μ_Y(t) = E{Y(t)} = E{∫_{−∞}^{∞} X(τ) h(t − τ) dτ}
       = ∫_{−∞}^{∞} μ_X(τ) h(t − τ) dτ = μ_X(t) ⊛ h(t).  (14-34)
Fig. 14.7: (a) X(t) → h(t) → Y(t); (b) the correlations follow the
cascade R_XX(t₁, t₂) → h*(t₂) → R_XY(t₁, t₂) → h(t₁) → R_YY(t₁, t₂).
In particular, if X(t) is wide-sense stationary, then we have μ_X(t) = μ_X,
so that from (14-34)
μ_Y(t) = μ_X ∫_{−∞}^{∞} h(τ) dτ = μ_X · c, a constant.  (14-38)
Fig. 14.8:
(a) a wide-sense stationary process X(t) applied to an LTI system h(t)
    yields a wide-sense stationary output Y(t);
(b) a strict-sense stationary input yields a strict-sense stationary
    output (see text for proof);
(c) a Gaussian process (also stationary) applied to a linear system
    yields a Gaussian process (also stationary).
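The constancy of the output mean in (14-38) can be illustrated in discrete time, where the integral of h becomes the sum of the taps; the impulse response and input mean below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
h = np.array([0.4, 0.3, 0.2, 0.1])        # arbitrary impulse response
mu_x = 2.0                                # constant input mean

x = mu_x + rng.standard_normal(500_000)   # w.s.s. input realization
y = np.convolve(x, h, mode='valid')       # steady-state output samples

print(y.mean(), mu_x * h.sum())           # both ~2.0, as in (14-38)
```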
White Noise Process:
W(t) is said to be a white noise process if
R_WW(t₁, t₂) = q(t₁) δ(t₁ − t₂),  (14-42)
i.e., E[W(t₁) W*(t₂)] = 0 unless t₁ = t₂.
W(t) is said to be wide-sense stationary (w.s.s.) white noise
if E[W(t)] = constant, and
R_WW(t₁, t₂) = q δ(t₁ − t₂) = q δ(τ).  (14-43)
The output autocorrelation of an LTI system driven by such noise involves
ρ(τ) = h(τ) ⊛ h*(−τ) = ∫_{−∞}^{∞} h(α + τ) h*(α) dα.  (14-46)
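In discrete time, filtering w.s.s. white noise of strength q through taps h gives output autocorrelation q·ρ(τ) with ρ(τ) = Σ_α h(α + τ) h(α), the discrete analogue of (14-46). A simulation check with arbitrary taps (not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(2)
q = 1.0
h = np.array([1.0, 0.6, 0.3])               # arbitrary filter taps

w = rng.normal(0.0, np.sqrt(q), 2_000_000)  # w.s.s. white noise, R_WW = q*delta
y = np.convolve(w, h, mode='valid')

# Discrete analogue of (14-46): rho(tau) = sum_a h(a + tau) h(a)
rho0 = np.dot(h, h)                         # tau = 0 -> 1.45
rho1 = np.dot(h[1:], h[:-1])                # tau = 1 -> 0.78

R0_est = np.mean(y * y)                     # empirical R_YY(0)
R1_est = np.mean(y[1:] * y[:-1])            # empirical R_YY(1)
print(R0_est, q * rho0)
print(R1_est, q * rho1)
```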
Auto Regressive Moving Average (ARMA) Processes
or
H(z) = Σ_{k=0}^{∞} h(k) z^{−k} = X(z)/W(z)
     = (b₀ + b₁ z^{−1} + b₂ z^{−2} + ⋯ + b_q z^{−q}) / (1 + a₁ z^{−1} + a₂ z^{−2} + ⋯ + a_p z^{−p})
     = B(z)/A(z)  (14-70)
represents the transfer function of the associated system response {h(n)}
in Fig. 14.12, so that
X(n) = Σ_{k=0}^{∞} h(n − k) W(k).  (14-71)
Notice that the transfer function H(z) in (14-70) is rational with p poles
and q zeros that determine the model order of the underlying system.
From (14-68), the output undergoes regression over p of its previous
values, and at the same time a moving average of the input based on
W(n), W(n − 1), …, W(n − q), i.e., over (q + 1) values, is added to it,
thus generating an Auto Regressive Moving Average (ARMA(p, q))
process X(n). Generally the input {W(n)} represents a sequence of
uncorrelated random variables of zero mean and constant variance σ_W²,
so that
R_WW(n) = σ_W² δ(n).  (14-72)
If, in addition, {W(n)} is normally distributed then the output {X(n)}
also represents a strict-sense stationary normal process.
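As a sketch (the coefficients below are arbitrary, not values from the text), (14-70) can be simulated with scipy.signal.lfilter, whose (b, a) arguments are exactly the B(z) and A(z) polynomials; the explicit loop spells out the equivalent difference equation of (14-68):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
b = [1.0, 0.5]          # B(z) = b0 + b1 z^-1            (q = 1)
a = [1.0, -0.6, 0.2]    # A(z) = 1 + a1 z^-1 + a2 z^-2   (p = 2)

w = rng.standard_normal(10_000)   # uncorrelated input, sigma_W^2 = 1
x = lfilter(b, a, w)              # ARMA(2, 1) realization, H(z) = B(z)/A(z)

# The same thing written as a difference equation:
# x[n] = w[n] + 0.5 w[n-1] + 0.6 x[n-1] - 0.2 x[n-2]
x_manual = np.zeros_like(w)
for n in range(len(w)):
    x_manual[n] = (w[n] + 0.5 * (w[n - 1] if n >= 1 else 0.0)
                   + 0.6 * (x_manual[n - 1] if n >= 1 else 0.0)
                   - 0.2 * (x_manual[n - 2] if n >= 2 else 0.0))

print(np.allclose(x, x_manual))   # True
```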
If q = 0, then (14-68) represents an AR(p) process (all-pole
process), and if p = 0, then (14-68) represents an MA(q)
process (all-zero process). Next, we shall discuss AR(1) and AR(2)
processes through explicit calculations.
AR(1) process: An AR(1) process has the form (see (14-68))
X(n) = a X(n − 1) + W(n)  (14-73)
and from (14-70) the corresponding system transfer function is
H(z) = 1/(1 − a z^{−1}) = Σ_{n=0}^{∞} aⁿ z^{−n},  |a| < 1,  (14-74)
and its normalized autocorrelation is
given by
ρ_X(n) = R_XX(n)/R_XX(0) = a^{|n|},  |n| ≥ 0.  (14-77)
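The a^{|n|} decay in (14-77) can be checked against a simulated AR(1) realization of (14-73); the coefficient a = 0.7 is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(4)
a = 0.7                          # arbitrary AR(1) coefficient, |a| < 1
N = 300_000
w = rng.standard_normal(N)

x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]   # X(n) = a X(n-1) + W(n), (14-73)
x = x[1000:]                     # discard the start-up transient

R0 = np.mean(x * x)
rhos = [np.mean(x[lag:] * x[:-lag]) / R0 for lag in (1, 2, 3)]
print(rhos)                      # ~ [0.7, 0.49, 0.343] = a^|n|, per (14-77)
```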
It is instructive to compare the AR(1) model discussed above with one
obtained by superimposing a random component on it, which may be an
error term associated with observing a first-order AR process X(n). Thus
Y(n) = X(n) + V(n)  (14-78)
where X(n) ~ AR(1) as in (14-73), and V(n) is an uncorrelated random
sequence with zero mean and variance σ_V² that is also uncorrelated
with {W(n)}. Then
ρ_Y(n) = R_YY(n)/R_YY(0) = { 1,        n = 0
                             c a^{|n|}, n = 1, 2, …  (14-80)
where
c = σ_W² / (σ_W² + σ_V²(1 − a²)) < 1.  (14-81)
Eqs. (14-77) and (14-80) demonstrate the effect of superimposing
an error sequence on an AR(1) model. For non-zero lags, the
autocorrelation of the observed sequence {Y(n)} is reduced by a constant
factor compared to that of the original process {X(n)}.
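The constant reduction factor (14-80)-(14-81) can be verified the same way; the values of a, σ_W, σ_V below are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
a, s_w, s_v = 0.7, 1.0, 0.8     # assumed AR(1) and observation-noise parameters
N = 500_000

w = s_w * rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]
y = x + s_v * rng.standard_normal(N)   # Y(n) = X(n) + V(n), (14-78)
y = y[1000:]

rho1 = np.mean(y[1:] * y[:-1]) / np.mean(y * y)   # empirical rho_Y(1)
c = s_w**2 / (s_w**2 + s_v**2 * (1 - a**2))       # (14-81)
print(rho1, c * a)                                # both ~0.53, per (14-80)
```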
From (14-78), the superimposed error sequence V(n) only affects the
zero-lag term, since ρ_X(0) = ρ_Y(0) = 1. Also
R_XX(n) = σ_W² Σ_{k=0}^{∞} h(k) h*(k + n)
        = σ_W² [ |b₁|² λ₁^{*n}/(1 − |λ₁|²) + b₁ b₂* λ₂^{*n}/(1 − λ₁ λ₂*)
               + b₁* b₂ λ₁^{*n}/(1 − λ₁* λ₂) + |b₂|² λ₂^{*n}/(1 − |λ₂|²) ]  (14-89)
where we have made use of (14-85). From (14-89), the normalized
output autocorrelations may be expressed as
ρ_X(n) = R_XX(n)/R_XX(0) = c₁ λ₁^{*n} + c₂ λ₂^{*n}  (14-90)
where c1 and c2 are appropriate constants.
Damped Exponentials: When the second-order system in
(14-83)-(14-85) is real and corresponds to a damped exponential
response, the poles are complex conjugates, which gives a₁² − 4a₂ < 0
in (14-83). Thus
λ₁ = r e^{jθ},  λ₂ = λ₁*,  r < 1.  (14-91)
In that case c₁ = c₂* = c e^{jφ} in (14-90), so that the normalized
correlations there reduce to
ρ_X(n) = 2 Re{c₁ λ₁^{*n}} = 2c rⁿ cos(nθ − φ).  (14-92)
But from (14-86)
λ₁ + λ₂ = 2r cos θ = −a₁,  λ₁λ₂ = r² = a₂ < 1,  (14-93)
and hence 2r sin θ = √(4a₂ − a₁²) > 0, which gives
tan θ = −√(4a₂ − a₁²) / a₁.  (14-94)
Also from (14-88)
ρ_X(1) = −a₁ ρ_X(0) − a₂ ρ_X(−1) = −a₁ − a₂ ρ_X(1),
so that
ρ_X(1) = −a₁/(1 + a₂) = 2cr cos(θ − φ),  (14-95)
where the latter form is obtained from (14-92) with n = 1. But ρ_X(0) = 1
in (14-92) gives
2c cos φ = 1, or c = 1/(2 cos φ).  (14-96)
Substituting (14-96) into (14-92) and (14-95) we obtain the normalized
output autocorrelations to be
ρ_X(n) = (a₂)^{n/2} cos(nθ − φ)/cos φ,  a₂ < 1,  (14-97)
where φ satisfies
cos(θ − φ)/cos φ = (−a₁/(1 + a₂)) · (1/√a₂).  (14-98)
Thus the normalized autocorrelations of a damped second-order
system with real coefficients subject to random uncorrelated
impulses satisfy (14-97).
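The damped-oscillation formula (14-97), with θ from (14-94) and φ from (14-98), can be checked against a simulated AR(2) process; the coefficients below are an arbitrary choice satisfying a₁² − 4a₂ < 0, and φ is solved by expanding cos(θ − φ) in (14-98):

```python
import numpy as np

rng = np.random.default_rng(6)
a1, a2 = -1.0, 0.5              # assumed coefficients; a1**2 - 4*a2 < 0
N = 500_000

w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(2, N):
    x[n] = -a1 * x[n - 1] - a2 * x[n - 2] + w[n]   # AR(2) recursion
x = x[1000:]                    # discard the start-up transient

# theta from (14-94); phi from (14-98) via cos(theta - phi) expansion
theta = np.arctan2(np.sqrt(4 * a2 - a1**2), -a1)
k = -a1 / ((1 + a2) * np.sqrt(a2))
phi = np.arctan((k - np.cos(theta)) / np.sin(theta))

R0 = np.mean(x * x)
emp = [np.mean(x[n:] * x[:-n]) / R0 for n in (1, 2, 3)]
thy = [a2 ** (n / 2) * np.cos(n * theta - phi) / np.cos(phi) for n in (1, 2, 3)]
print(emp)
print(thy)    # approximately [0.667, 0.167, -0.167]
```

Note the sign alternation: the autocorrelation oscillates at angle θ while decaying like a₂^{n/2}, exactly the damped-exponential behavior (14-97) describes.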
H_n = [ h₀    h₁    ⋯  hₙ
        h₁    h₂    ⋯  hₙ₊₁
        ⋮     ⋮        ⋮
        hₙ   hₙ₊₁   ⋯  h₂ₙ ]  (14-100)
i.e., in the case of rational systems, for all sufficiently large n, the
Hankel matrices Hₙ in (14-100) all have the same rank.
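The constant-rank property can be illustrated numerically for a small rational system (the coefficients are arbitrary, with p = 2 poles); the impulse-response samples come from scipy.signal.lfilter:

```python
import numpy as np
from scipy.signal import lfilter

# A small rational system with p = 2 poles (coefficients are arbitrary)
b = [1.0, 0.5]
a = [1.0, -1.0, 0.5]

imp = np.zeros(40)
imp[0] = 1.0
h = lfilter(b, a, imp)            # impulse response samples h(0), h(1), ...

def hankel_rank(h, n):
    # H_n has (i, j) entry h(i + j), i, j = 0, ..., n, as in (14-100)
    H = np.array([[h[i + j] for j in range(n + 1)] for i in range(n + 1)])
    return np.linalg.matrix_rank(H)

ranks = [hankel_rank(h, n) for n in range(1, 8)]
print(ranks)                      # the rank saturates at p = 2
```

The rank stabilizes because, for a rational H(z), the samples h(n) eventually obey the order-p recursion fixed by A(z), so every later row of Hₙ is a linear combination of the p rows before it.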
¹ Among other things, "God created the integers and the rest is the work of man." (Leopold Kronecker)
PILLAI/Cha