Module 2

Lecture 14: Random Vibrations & Failure Analysis
Stochastic Processes-1
Sayan Gupta

Department of Applied Mechanics

Indian Institute of Technology Madras

Stochastic Processes
We define a stochastic process (or a random process) as a random variable that evolves with time. Thus, a random process can be viewed as a parametrized family of random variables.

A random process is denoted as $X(t, \omega)$, where $t$ is the time parameter and $\omega$ denotes an outcome in the sample space.

[Figure 1: an ensemble of sample time histories, with the cross-section at a fixed time $t_1$ marked.]

When $t$ is fixed and $\omega$ is allowed to vary (red line in Fig. 1), we obtain a random variable, i.e., $X(t = t_1, \omega) = X(\omega) = X$.

When $\omega$ is fixed and $t$ varies, we get a sample time history, i.e., $X(t, \omega = \omega_i) = x_i(t)$. Here, $i$ refers to the $i$-th time history.

When both $t$ and $\omega$ are fixed, $X(t = t_k, \omega = \omega_i) = x_{ki}$ is just a deterministic number.
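To make the three viewpoints concrete, here is a minimal Python sketch (an assumed illustrative process, not part of the lecture): each row of the array is one sample time history, each column is one random variable.

```python
# A minimal sketch (assumed example): X(t, omega) = A(omega) * sin(t),
# where each outcome omega fixes a value of the random amplitude A.
import numpy as np

rng = np.random.default_rng(42)
n_realizations, n_t = 5, 100
t = np.linspace(0.0, 10.0, n_t)
A = rng.normal(0.0, 1.0, size=n_realizations)   # one draw per outcome omega_i

X = A[:, None] * np.sin(t[None, :])  # rows: omega fixed -> sample histories x_i(t)

x_3 = X[3, :]         # omega fixed, t varies: the 4th sample time history
X_at_t1 = X[:, 20]    # t fixed, omega varies: samples of the random variable X(t1)
x_3_at_t1 = X[3, 20]  # both fixed: a deterministic number x_{ki}
```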

Classification of a Random Process

Let $X(t, \omega)$ be a random process.

If $X(t, \omega)$ is a discrete random variable, then $X(t, \omega)$ is a random process with a discrete state space.

On the other hand, if $X(t, \omega)$ is a continuous random variable, the state space of the random process is continuous.

If the parameter $t$ is discrete, then $X(t, \omega)$ is a discrete-parameter random process.

Conversely, $X(t, \omega)$ is a continuous-parameter random process when the parameter $t$ is continuous.

Thus, random processes can be

discrete state, discrete parameter

discrete state, continuous parameter

continuous state, discrete parameter

continuous state, continuous parameter
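For example (illustrative additions, not from the original slides): a random walk with unit steps is discrete state, discrete parameter; a Poisson counting process is discrete state, continuous parameter; daily-sampled temperature readings form a continuous state, discrete parameter process; and Brownian motion is continuous state, continuous parameter.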



Classification of a Random Process

Usually, in mathematical representations of random processes, $X(t, \omega)$ is written as $X(t)$, with the dependence on the sample space $\omega$ assumed to be implied.

The parameter of a random process could be $t$, indicating that the process evolves in time, or $x$, indicating that the process evolves in space. Usually, a process that evolves in space is called a random field.

A random process/random field could have multiple parameters. For example, $X(x, y)$ represents a 2-dimensional random field that evolves along the $x$ and $y$ coordinates.

For example, the wind force acting on a wind turbine (Fig. 2) varies in space (along the height of the wind turbine mast) as well as in time, and can be denoted as $F(x, t)$.

[Figure 2: wind turbine subjected to spatially and temporally varying wind load.]

Description of a Random Process

A random process $X(t)$ could be described in terms of the PDF of $X(t)$ with $t$ as a parameter, e.g.,
$$P_{X(t)}(x; t) = P[X(t) \le x]$$
is a first-order probability distribution function with $t$ as a parameter.

To describe the process $X(t)$ so that the dependence between the time instants $t_1$ and $t_2$ is explicit, we need to describe the process in terms of the 2-dimensional joint distribution
$$P_{X(t_1)X(t_2)}(x_1, x_2; t_1, t_2) = P[\{X(t_1) \le x_1\} \cap \{X(t_2) \le x_2\}]$$

[Figure 3: two sample time histories with the time instants $t_1$ and $t_2$ marked.]

Thus, to completely characterize a random process $X(t)$ for all $t \in [0, T]$, we would need to specify the dependence between all time instants. This, in turn, would require the knowledge of an infinite-dimensional joint distribution.

This is too stringent a requirement and is not feasible. We therefore need to look for alternative descriptors of a random process.

Description of a Random Process: Statistical properties

The mean of a random process can be expressed as
$$\mu_X(t) = \int_{-\infty}^{\infty} x\, p_X(x; t)\, dx = E[X(t)] \quad \text{...(8.1)}$$

The auto-correlation of a random process is
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x_1 x_2\, p_{X(t_1)X(t_2)}(x_1, x_2; t_1, t_2)\, dx_1\, dx_2 \quad \text{...(8.2)}$$

When $t_1 = t_2 = t$,
$$R_{XX}(t, t) = E[X(t)^2] \quad \text{...(8.3)}$$
which is the mean-square value of the process (and equals the variance $\sigma_X^2(t)$ when the mean is zero).

It must be noted here that

the statistical properties (mean, autocorrelation function and variance) are, in general, time varying;

the averaging here is carried out across the ensemble (the $\omega$ space) and not across time.
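As a concrete illustration of ensemble averaging, the following Python sketch (an assumed example process, not part of the lecture) estimates Eqs. (8.1) and (8.2) by averaging across realizations at fixed time instants:

```python
# A minimal sketch: ensemble estimates of the mean and auto-correlation of the
# assumed process X(t) = A*cos(w*t), with A ~ Normal(mean 1.0, std 0.5).
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_t = 10_000, 200           # ensemble size, time grid size
t = np.linspace(0.0, 10.0, n_t)
w = 2.0                                # deterministic frequency (assumed)
A = rng.normal(1.0, 0.5, size=n_samples)

X = A[:, None] * np.cos(w * t[None, :])   # each row is one sample time history

mu_hat = X.mean(axis=0)                   # ensemble mean, Eq. (8.1)
R_hat = X.T @ X / n_samples               # R(t1, t2) = E[X(t1)X(t2)], Eq. (8.2)

# Check against the closed form mu_X(t) = E[A] cos(w t)
print(np.allclose(mu_hat, 1.0 * np.cos(w * t), atol=0.03))
```

Note that the averaging runs down the columns (across $\omega$), never along the rows (across time).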

Description of a Random Process: Statistical properties

The auto-covariance function is given by
$$C_{XX}(t_1, t_2) = E[(X(t_1) - \mu_X(t_1))(X(t_2) - \mu_X(t_2))] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (x_1 - \mu_1)(x_2 - \mu_2)\, p_{X_1 X_2}(x_1, x_2; t_1, t_2)\, dx_1\, dx_2 \quad \text{...(8.4)}$$

The autocorrelation coefficient
$$r_{XX}(t_1, t_2) = \frac{C_{XX}(t_1, t_2)}{\sigma_X(t_1)\,\sigma_X(t_2)} \quad \text{...(8.5)}$$
can be shown to have the property
$$|r_{XX}(t_1, t_2)| \le 1 \quad \text{...(8.6)}$$

Prove this as an exercise.
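(A hint, added here for completeness: Eq. (8.6) follows from the Cauchy–Schwarz inequality, $|E[UV]|^2 \le E[U^2]\,E[V^2]$, applied to the centered variables $U = X(t_1) - \mu_X(t_1)$ and $V = X(t_2) - \mu_X(t_2)$.)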

Description of a Random Process: Statistical properties

Remarks

$R_{XX}(t_1, t_2) = C_{XX}(t_1, t_2)$ if $\mu_X(t_1) = \mu_X(t_2) = 0$; else

$C_{XX}(t_1, t_2) = R_{XX}(t_1, t_2) - \mu_X(t_1)\,\mu_X(t_2)$

$\sigma_X^2(t) = C_{XX}(t, t)$

Consider two random processes, $X(t)$ and $Y(t)$, such that
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\, p_X(x; t)\, dx \quad \text{...(8.7)}$$
$$\mu_Y(t) = E[Y(t)] = \int_{-\infty}^{\infty} y\, p_Y(y; t)\, dy \quad \text{...(8.8)}$$

Then the cross-correlation function is given by
$$R_{XY}(t_1, t_2) = E[X(t_1)Y(t_2)] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} x y\, p_{XY}(x, y; t_1, t_2)\, dx\, dy \quad \text{...(8.9)}$$

Description of a Random Process: Statistical properties

The cross-covariance function can be expressed as
$$C_{XY}(t_1, t_2) = E[(X(t_1) - \mu_X(t_1))(Y(t_2) - \mu_Y(t_2))] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (x - \mu_X(t_1))(y - \mu_Y(t_2))\, p_{XY}(x, y; t_1, t_2)\, dx\, dy$$
$$= R_{XY}(t_1, t_2) - \mu_X(t_1)\,\mu_Y(t_2) \quad \text{...(8.10)}$$

and the cross-correlation coefficient is
$$r_{XY}(t_1, t_2) = \frac{C_{XY}(t_1, t_2)}{\sigma_X(t_1)\,\sigma_Y(t_2)} \quad \text{...(8.11)}$$

Remarks

1. $X(t)$ and $Y(t)$ are mutually orthogonal if $R_{XY}(t_1, t_2) = 0 \;\; \forall\, t_1, t_2$.

2. $X(t)$ and $Y(t)$ are uncorrelated if $C_{XY}(t_1, t_2) = 0 \;\; \forall\, t_1, t_2$.

Example Problems

Problem 1:
Let $X(t) = f(t)$, where $f(t)$ is a deterministic function. Find the mean and the auto-correlation function.

Solution:
$$\mu_X(t) = E[X(t)] = \int_{-\infty}^{\infty} f(t)\, p_X(x; t)\, dx = f(t)$$

Hence, the mean of the process is equal to the deterministic function $f(t)$.

The auto-correlation function is given by
$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[f(t_1)f(t_2)] = f(t_1)\,f(t_2)$$

Example Problems

Problem 2:
Let $X(t) = A \cos \omega t$, where $A$ is a random variable and $\omega$ is a deterministic constant. Find the mean and the auto-correlation function of the process.

Solution:
$$\mu_X(t) = E[X(t)] = E[A \cos \omega t] = E[A] \cos \omega t = \mu_A \cos \omega t$$
where $\mu_A = E[A]$ is the mean value of the random variable $A$.

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[A^2 \cos \omega t_1 \cos \omega t_2] = E[A^2] \cos \omega t_1 \cos \omega t_2 = (\sigma_A^2 + \mu_A^2) \cos \omega t_1 \cos \omega t_2$$

Here, we use the property
$$\sigma_A^2 = E[(A - \mu_A)^2] = E[A^2] - \mu_A^2.$$

Example Problems

Problem 3:
Let $X(t) = A \cos(\omega t + \Phi)$, where $A$ is a random variable with pdf $p_A(a)$, and $\Phi$ is a uniformly distributed random variable in $[-\pi, \pi]$. Find the mean and the auto-correlation function.

Solution:
$$\mu_X(t) = E[A \cos(\omega t + \Phi)] = E[A(\cos \omega t \cos \Phi - \sin \omega t \sin \Phi)] = E[A \cos \Phi] \cos \omega t - E[A \sin \Phi] \sin \omega t \quad \text{...(8.12)}$$

Now, if $A$ and $\Phi$ are assumed to be independent,
$$E[A \cos \Phi] = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} a \cos s\, p_{A\Phi}(a, s)\, da\, ds = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} a \cos s\, p_A(a)\, p_\Phi(s)\, da\, ds = E[A]\, E[\cos \Phi] \quad \text{...(8.13)}$$

Example Problems

Similarly,
$$E[A \sin \Phi] = E[A]\, E[\sin \Phi]$$

Now,
$$E[\cos \Phi] = \int_{-\infty}^{\infty} \cos s\, p_\Phi(s)\, ds = \frac{1}{2\pi}\int_{-\pi}^{\pi} \cos s\, ds = 0.$$

Similarly, $E[\sin \Phi] = 0$. Substituting in Eq. (8.12), $\mu_X(t) = 0$. ...(8.14)

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[A^2 \cos(\omega t_1 + \Phi)\cos(\omega t_2 + \Phi)] = E[A^2]\, E[\cos(\omega t_1 + \Phi)\cos(\omega t_2 + \Phi)]$$
$$= \frac{1}{2} E[A^2]\, E\big[\cos\{\omega(t_1 - t_2)\} + \cos\{\omega(t_1 + t_2) + 2\Phi\}\big] = \frac{1}{2} E[A^2] \cos\{\omega(t_1 - t_2)\} + \frac{1}{2} E[A^2]\, E\big[\cos\{\omega(t_1 + t_2) + 2\Phi\}\big] \quad \text{...(8.15)}$$

Example Problems

Now,
$$E[\cos\{\omega(t_1 + t_2) + 2\Phi\}] = E[\cos\{\omega(t_1 + t_2)\}\cos(2\Phi) - \sin\{\omega(t_1 + t_2)\}\sin(2\Phi)]$$
$$= \cos[\omega(t_1 + t_2)]\, E[\cos(2\Phi)] - \sin[\omega(t_1 + t_2)]\, E[\sin(2\Phi)] = 0. \quad \text{...(8.16)}$$

Substituting the results from Eq. (8.16) into Eq. (8.15), we get
$$R_{XX}(t_1, t_2) = \frac{1}{2}\, E[A^2] \cos[\omega(t_1 - t_2)].$$

When $t_1 = t_2 = t$,
$$R_{XX}(t, t) = \frac{1}{2}\, E[A^2] = \sigma_X^2$$
is the variance of the process (recall that $\mu_X(t) = 0$ here).

Note that for this example, both the mean and the variance of the process are independent of time.
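The result of Problem 3 is easy to verify numerically. The following Monte Carlo sketch (illustrative; the Rayleigh amplitude distribution and frequency are assumptions, not from the lecture) checks Eqs. (8.14)-(8.16):

```python
# A minimal Monte Carlo check of Problem 3: for X(t) = A cos(w t + Phi) with
# Phi ~ Uniform(-pi, pi) independent of A, mu_X(t) = 0 and
# R_XX(t1, t2) = 0.5 E[A^2] cos(w (t1 - t2)).
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
w = 3.0                                   # assumed frequency
A = rng.rayleigh(scale=1.0, size=n)       # any pdf p_A(a) works; Rayleigh assumed
Phi = rng.uniform(-np.pi, np.pi, size=n)

t1, t2 = 0.7, 2.1
X1 = A * np.cos(w * t1 + Phi)
X2 = A * np.cos(w * t2 + Phi)

print(X1.mean())                                    # ~ 0, Eq. (8.14)
print((X1 * X2).mean())                             # estimate of R_XX(t1, t2)
print(0.5 * np.mean(A**2) * np.cos(w * (t1 - t2)))  # closed form from Eq. (8.15)
```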

Stationary Processes

If the statistical properties of a process at time $t_1$ and at time $t_1 + c$ are identical for any value of $c$, then the process is said to be stationary with respect to the concerned statistical property.

For example, if for a random process $X(t)$,
$$E[X(t)] = E[X(t + c)]$$
for any value of $c$, then $X(t)$ is said to be stationary in mean.

If $E[X(t)] = E[X(t + c)]$ for any value of $c$, it implies that no matter at what time instant the averaging is carried out across the ensemble, the mean remains the same, i.e., the mean is a constant and is independent of time. This, then, is a mean-stationary random process.

Stationary Processes

Similarly, if $E[X^2(t)] = E[X^2(t + c)]$ for any value of $c$, then the process is stationary in second moment.

If $E[\{X(t) - \mu_X(t)\}^2] = E[\{X(t + a) - \mu_X(t + a)\}^2]$ for any value of $a$, i.e., $\sigma_X^2(t) = \sigma_X^2(t + a)$ for any value of $a$, then the process is stationary in variance.

Similarly, one can define stationarity in terms of the 1st order pdf, i.e., if $p_X(x; t) = p_X(x; t + c)$ for any value of $c$, then the 1st order pdf of the process does not depend on $t$ and the process $X(t)$ is said to be first order strong sense stationary (1st order SSS).

Strong Sense Stationary Processes

Strong sense stationarity of a process implies stationarity in terms of its probability density function.

1st order SSS: $p_X(x; t) = p_X(x; t + c) = p_X(x) \;\; \forall c$

2nd order SSS: $p_{X(t_1)X(t_2)}(x_1, x_2; t_1, t_2) = p_{X(t_1)X(t_2)}(x_1, x_2; t_1 + c, t_2 + c) = p_{X_1 X_2}(x_1, x_2; t_2 - t_1) \;\; \forall c$, i.e., the joint pdf depends only on the time lag.

In general, $n$-th order SSS implies
$$p_{\mathbf{X}}(x_1, \ldots, x_n; t_1, \ldots, t_n) = p_{\mathbf{X}}(x_1, \ldots, x_n; t_1 + c, \ldots, t_n + c) \;\; \forall c$$
where $\mathbf{X}$ is an $n$-dimensional vector. Note that this implies that stationarity holds for all orders $m \le n$.

This is a stringent condition for stationarity and is hence termed strict (or strong) sense stationarity (SSS).

Weak Sense Stationary Processes

When the stationarity of a random process is defined in terms of weaker conditions, such as moment stationarity, the process is said to be a weak sense stationary (WSS) random process.

The following are examples of conditions under which a process $X(t)$ is said to be weak sense stationary:

$\mu_X(t) = \mu_X(t + c) = \mu_X \;\; \forall c$

$R_{XX}(t_1, t_2) = E[X(t_1)X(t_1 + \tau)] = E[X(t_1 + c)X(t_1 + \tau + c)] \;\; \forall c$

Here, it is obvious that $R_{XX}(t_1, t_2)$ does not depend on $(t_1, t_2)$ individually but only on the difference $\tau = t_2 - t_1$.

Thus, a process that is stationary in correlation can be represented as
$$R_{XX}(t_1, t_2) = R_{XX}(t_2 - t_1) = R_{XX}(\tau) \quad \text{...(8.17)}$$
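The shift-invariance in Eq. (8.17) can be checked numerically. A minimal sketch (reusing the assumed Problem-3 process, which is stationary in correlation): ensemble estimates of $E[X(t_1)X(t_2)]$ for two pairs of instants with the same lag should agree.

```python
# Check that E[X(t1) X(t1 + tau)] does not change when both instants are
# shifted by the same amount c (process assumed: X(t) = A cos(w t + Phi)).
import numpy as np

rng = np.random.default_rng(2)
n, w = 200_000, 3.0
A = rng.rayleigh(1.0, n)
Phi = rng.uniform(-np.pi, np.pi, n)
X = lambda t: A * np.cos(w * t + Phi)     # one value per ensemble member

tau = 0.5
print(np.mean(X(1.0) * X(1.0 + tau)))     # R_XX estimated at (t1, t1 + tau)
print(np.mean(X(4.2) * X(4.2 + tau)))     # same lag, shifted instants: ~ equal
```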

Stationary Processes

For a process stationary in correlation, $\tau$ in Eq. (8.17) refers to the lag, as shown in Fig. 4. Here, $\tau$ is the distance between two time instants, i.e., the distance between the two red lines.

[Figure 4: ensemble of sample time histories with two time instants, a lag $\tau$ apart, marked by red lines.]

The ensemble average for a process stationary in correlation remains the same as long as the distance between the two time instants remains the same.

Thus, with respect to Fig. 4, the ensemble averages $\langle X(t_1)X(t_2) \rangle$ and $\langle X(t_1 + c)X(t_2 + c) \rangle$ remain identical.

Stationary Processes

A typical plot of the variation of $R_{XX}(\tau)$ with respect to $\tau$ for a process that is stationary in correlation is shown in Fig. 5.

[Figure 5: $R_{XX}(\tau)$ decaying with the lag $\tau$, with the correlation length $\tau_c$ marked.]

Here, for a positively correlated process, it is reasonable to expect that as the distance between the two time instants $t_1$ and $t_2$ increases, the dependence between $X(t_1)$ and $X(t_2)$ decreases, and $R_{XX}(\tau) \to 0$ as $\tau \to \tau_c$.

Here, $\tau_c$ is the critical value of $\tau$ such that, if $\tau > \tau_c$, $X(t_1)$ and $X(t_1 + \tau)$ can be assumed to be uncorrelated. In such situations, $\tau_c$ is referred to as the correlation length of the process.

A stochastic process is called a $\tau_c$-correlated process if
$$R_{XX}(t_1, t_2) = R_{XX}(t_1 - t_2) = 0 \quad \text{for} \quad |t_1 - t_2| \ge \tau_c \quad \text{...(8.18)}$$
i.e., $\tau_c$ is the correlation length of the process $X(t)$.
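To see a decaying correlation structure of the kind sketched in Fig. 5, here is a minimal sketch with an assumed discrete-time AR(1) process (not from the lecture), whose correlation falls off geometrically with the lag:

```python
# AR(1) process x[k+1] = a x[k] + noise has R_XX(tau) proportional to a^|tau|,
# so its correlation decays with lag and a practical correlation length exists.
import numpy as np

rng = np.random.default_rng(3)
a, n_steps, n_ens = 0.9, 400, 2_000
x = np.zeros((n_ens, n_steps))
for k in range(n_steps - 1):
    x[:, k + 1] = a * x[:, k] + rng.normal(0.0, 1.0, n_ens)

# Ensemble-based estimate of R_XX(tau) at a reference instant in steady state
k0 = 300
for tau in (0, 5, 10, 20, 40):
    r = np.mean(x[:, k0] * x[:, k0 + tau])
    print(tau, r)   # decays roughly like a**tau / (1 - a**2)
```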

Stationary Processes

The mean and variance of a 1st order SSS random process:
$$\mu_X(t) = \int_{-\infty}^{\infty} x\, p_X(x; t)\, dx = \int_{-\infty}^{\infty} x\, p_X(x)\, dx = \mu_X \quad \text{...(8.19)}$$
i.e., the mean is independent of time.

Similarly,
$$\sigma_X^2(t) = \int_{-\infty}^{\infty} (x - \mu_X(t))^2\, p_X(x; t)\, dx = \int_{-\infty}^{\infty} (x - \mu_X)^2\, p_X(x)\, dx = \sigma_X^2 \quad \text{...(8.20)}$$
i.e., the variance is also independent of time.

Stationary Processes

Now, a second order SSS process is also first order SSS.

The covariance of a second order SSS process is expressed as
$$C_{XX} = \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (x_1 - \mu_X(t_1))(x_2 - \mu_X(t_2))\, p_{XX}(x_1, x_2; t_1, t_2)\, dx_1\, dx_2$$
$$= \int_{-\infty}^{\infty}\!\!\int_{-\infty}^{\infty} (x_1 - \mu_X)(x_2 - \mu_X)\, p_{XX}(x_1, x_2; t_2 - t_1)\, dx_1\, dx_2 = C_{XX}(t_2 - t_1) \quad \text{...(8.21)}$$

It is obvious that, since 1st order SSS is implied,
$$\mu_X(t_1) = \mu_X(t_2) = \mu_X.$$

Here, $X(t_i) = X_i$ is a random variable.

Stationary Processes

Remarks:

$X(t)$ is said to be a second order WSS process if

the mean of the process is independent of time, i.e., $\mu_X(t) = \mu_X$,

and the covariance function depends only on the time lag, i.e., $C_{XX}(t_1, t_2) = C_{XX}(t_2 - t_1)$.

The default notion of stationarity of a random process is 2nd order WSS, implying the above two conditions.

Stationarity in random fields is referred to as homogeneity; such fields are called homogeneous fields.

Ergodic Process

[Figure 6: ensemble of sample time histories; the ensemble average runs across realizations (green line), the temporal average runs along one realization (red line).]

Let $x(t)$ be a sample realization of a stationary process $X(t)$.

The temporal average is
$$T_{ave}[X(t)] = \frac{1}{2T}\int_{-T}^{T} X(t)\, dt \quad \text{...(8.22)}$$

The ensemble average is
$$E[X(t)] = \int_{-\infty}^{\infty} x\, p_X(x; t)\, dx \quad \text{...(8.23)}$$

With respect to Fig. 6, we see that the ensemble average is an average along the ensemble (green line), while the temporal average is an average along time (red line).

From Eq. (8.22), it is obvious that $T_{ave}[X(t)] = \mu_T$ is a random variable. The mean of $\mu_T$ can be obtained as
$$E[\mu_T] = \frac{1}{2T}\int_{-T}^{T} E[X(t)]\, dt = E[X(t)] = \mu_X \quad \text{...(8.24)}$$

Ergodic Process

The variance of the random variable $\mu_T$ is
$$\sigma_{\mu_T}^2 = \frac{1}{4T^2}\int_{-T}^{T}\!\!\int_{-T}^{T} E[(X(t_1) - \mu)(X(t_2) - \mu)]\, dt_1\, dt_2 = \frac{1}{4T^2}\int_{-T}^{T}\!\!\int_{-T}^{T} C(t_1, t_2)\, dt_1\, dt_2$$
$$= \frac{1}{4T^2}\int_{-2T}^{2T} (2T - |\tau|)\, C(\tau)\, d\tau = \frac{1}{T}\int_{0}^{2T} \Big[1 - \frac{\tau}{2T}\Big]\big[R(\tau) - \mu^2\big]\, d\tau \quad \text{...(8.25)}$$

It is obvious from Eq. (8.25) that, as $T \to \infty$ (provided the covariance $C(\tau)$ decays sufficiently fast with the lag),
$$\lim_{T \to \infty} \frac{1}{T}\int_{0}^{2T} \Big[1 - \frac{\tau}{2T}\Big]\big[R(\tau) - \mu^2\big]\, d\tau \to 0 \quad \text{...(8.26)}$$

Ergodic Process

The result of Eq. (8.26) implies that, as the sample time history of a realization of the random process becomes infinitely long, the statistical fluctuation of the temporal mean goes to zero. In other words, the temporal mean ceases to be a random variable.

Furthermore, from Eqs. (8.23) and (8.24), it follows that the temporal mean and the ensemble mean are equal. Thus, the process $X(t)$ is said to be ergodic in mean.

Similarly, ergodicity in the 2nd moment implies
$$T_{ave}[X^2(t)] = E[X^2(t)]$$
and ergodicity in auto-correlation implies
$$T_{ave}[X(t)X(t + \tau)] = E[X(t)X(t + \tau)] = R_{XX}(\tau)$$
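A minimal numerical sketch of ergodicity in mean (an assumed example, not from the lecture): for $X(t) = a_0\cos(\omega t + \Phi)$ with a fixed amplitude $a_0$ and $\Phi$ uniform on $[-\pi, \pi]$, the time average of one long realization matches the ensemble mean.

```python
# One long sample time history of x(t) = a0 cos(w t + phi), with phi a single
# draw of Phi ~ Uniform(-pi, pi): its temporal mean approaches E[X(t)] = 0.
import numpy as np

rng = np.random.default_rng(4)
w, a0 = 3.0, 2.0
t = np.linspace(0.0, 2_000.0, 400_000)

phi = rng.uniform(-np.pi, np.pi)          # one fixed outcome omega_i
x = a0 * np.cos(w * t + phi)              # a single long sample time history

temporal_mean = x.mean()                  # T_ave[X(t)], Eq. (8.22)
ensemble_mean = 0.0                       # E[X(t)] = 0, as in Problem 3

print(temporal_mean, ensemble_mean)       # temporal mean ~ 0 for large T
```

By contrast, if the amplitude $A$ were random, the temporal average of $X^2(t)$ along one realization would converge to $a^2/2$ for that realization's value of $a$, which in general differs from $E[X^2(t)] = E[A^2]/2$: such a process is stationary but not ergodic in the second moment, anticipating the remark below.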

Ergodic Process

Remarks:

Ergodicity of a random process with respect to a statistical parameter implies the equivalence of temporal and ensemble averages.

All ergodic processes are necessarily stationary processes; however, stationary processes need not be ergodic.

The practical implication of ergodicity is that if a long sample time history of an ergodic random process is available, one can, in principle, obtain all the relevant statistical information by computing temporal statistics in lieu of ensemble statistics.
