
PTSP UNIT VI: Questions & Answers

UNIT 6
1. Explain the classification of random processes with neat sketches.
Ans:
A continuous random process is one in which the random variable X is continuous and the time t can take any value between t₁ and t₂.





Example: Temperature as a function of time.
If the random variable X can assume only certain specified values while t is continuous, the process is called a discrete random process.
Example: The voltage available at the output of a switch due to its random opening and closing.








If the future values of a sample function can be predicted from knowledge of its past values, the random process is called a deterministic random process.

Example: X(t) = A cos(ωt + θ)


If the future values of a sample function cannot be predicted from knowledge of its past values, the random process is called a non-deterministic random process.
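As an aside (not part of the original answer), a minimal Python sketch of the idea, assuming numpy and matplotlib are available: each sample function of X(t) = A cos(ωt + θ) is fully determined once its particular A and θ are drawn, which is what makes the process deterministic. The numeric value chosen for ω and the ranges used for A and θ are illustrative assumptions only.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 500)          # time axis
omega = 2 * np.pi * 2.0                 # fixed angular frequency (assumed value)

for _ in range(3):                      # three sample functions of the ensemble
    A = rng.uniform(0.5, 1.5)           # random amplitude, fixed per sample function
    theta = rng.uniform(-np.pi, np.pi)  # random phase, fixed per sample function
    plt.plot(t, A * np.cos(omega * t + theta))

plt.xlabel("t"); plt.ylabel("X(t)")
plt.title("Sample functions of a deterministic random process")
plt.show()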





2. Define and explain autocorrelation and state its properties.
Ans:
The autocorrelation function measures the interdependency (regularity, similarity or relatedness) between a delayed version and an undelayed version of the same random process, expressed as a function of the delay τ.
Let us consider the process x(t) shown in the figure below.






Two measurements are made on this process, at t = t₁ and at t = t₂, with a delay of τ between them. The interdependency between these two measurements is given by the correlation of x(t) at t = t₁ with x(t) at t = t₂:

R_XX(τ) = E[ x(t)|_{t=t₁} · x(t)|_{t=t₂} ]
        = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t₁) x(t₂) dt₁
        = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t₁) x(t₁ + τ) dt₁        (taking t₂ = t₁ + τ)

In general,

R_XX(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t₁) x(t₁ + τ) dt₁

Properties of Autocorrelation:
1. The autocorrelation is an even function of the delay τ: R_XX(−τ) = R_XX(τ).
2. The autocorrelation at the origin is the mean-square value (average power): R_XX(0) = E[X²(t)].
3. The autocorrelation is always maximum at the origin: |R_XX(τ)| ≤ R_XX(0).
4. The autocorrelation and the power spectral density form a Fourier transform pair.
5. For a stationary process, the autocorrelation is independent of a time shift; it depends only on the delay τ.
6. If X(t) is ergodic with no periodic components, then lim_{|τ|→∞} R_XX(τ) = (E[X])², the square of the mean value.
7. If X(t) is ergodic, zero-mean and has no periodic components, then lim_{|τ|→∞} R_XX(τ) = 0.
A numerical illustration of some of these properties is sketched below.
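A minimal numerical sketch of properties 2, 3 and 7, assuming numpy is available. The first-order moving-average process used here is a hypothetical example chosen only because its autocorrelation is easy to reason about (R_XX(0) = 0.5, R_XX(1) = 0.25, R_XX(k) = 0 for k ≥ 2):

import numpy as np

rng = np.random.default_rng(1)
N = 200_000
w = rng.normal(0.0, 1.0, N + 1)
x = 0.5 * (w[1:] + w[:-1])          # zero-mean process with a one-lag memory

def autocorr(x, lag):
    # Time-average estimate of R_XX(lag) from one long sample function
    return np.mean(x[:N - lag] * x[lag:N])

R = [autocorr(x, k) for k in range(6)]
print("R_XX(0) (mean-square value):", round(R[0], 4))   # property 2: about 0.5
print("R_XX(k), k = 0..5:", np.round(R, 4))             # property 3: maximum at k = 0
# Property 7: the process is zero-mean with no periodic component, so R_XX(k) decays to 0.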

3. Define cross correlation and state its properties.
Ans:
The cross-correlation measures the interdependency or relatedness between a delayed version of one process x(t) and an undelayed version of another process y(t), expressed as a function of the delay τ. The cross-correlation of the processes x(t) and y(t) can be expressed as
R_XY(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) y(t + τ) dt
Properties of Cross-Correlation:
1. The cross-correlation is not, in general, an even function of the delay τ; it satisfies the symmetry relation R_XY(−τ) = R_YX(τ).
2. The cross-correlation need not be maximum at the origin.
3. The cross-correlation is zero when the processes X(t) and Y(t) are independent and at least one of them has zero mean.
4. If X(t) and Y(t) are two random processes and R_XX(τ) and R_YY(τ) are their respective autocorrelation functions, then
   |R_XY(τ)| ≤ √( R_XX(0) · R_YY(0) )
5. If X(t) and Y(t) are two random processes, then
   |R_XY(τ)| ≤ (1/2) [ R_XX(0) + R_YY(0) ]
6. If the random processes X(t) and Y(t) are independent, then R_XY(τ) = E(X) · E(Y).
7. Two random processes X(t) and Y(t) are said to be uncorrelated if their cross-correlation function is equal to the product of their mean values.
A numerical illustration of properties 2 and 4 is sketched below.
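A minimal numerical sketch of properties 2 and 4, assuming numpy is available. The two processes below are hypothetical, chosen so that y(t) is x(t) delayed by 5 samples plus independent noise: the cross-correlation then peaks away from the origin while still respecting the bound of property 4.

import numpy as np

rng = np.random.default_rng(2)
N = 200_000
w = rng.normal(size=N + 5)
x = w[5:]                                  # x(t)
y = w[:-5] + 0.3 * rng.normal(size=N)      # y(t) = x(t - 5) + independent noise

def crosscorr(x, y, lag):
    # Time-average estimate of R_XY(lag) = E[x(t) y(t + lag)], lag >= 0
    return np.mean(x[:N - lag] * y[lag:N])

bound = np.sqrt(np.mean(x * x) * np.mean(y * y))   # sqrt(R_XX(0) * R_YY(0))
for lag in (0, 3, 5, 7):
    Rxy = crosscorr(x, y, lag)
    print(f"lag={lag}: R_XY={Rxy:+.4f}  |R_XY| <= {bound:.4f}: {abs(Rxy) <= bound}")
# The peak occurs at lag = 5, not at the origin (property 2), and the bound of
# property 4 holds at every lag.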

4. Discuss the Gaussian random process and state its properties.
Ans:
Consider a random process X(t) on the interval t = 0 to t = T. Associating a weighting function g(t) with the random process X(t) and integrating the product g(t)X(t) between the limits 0 and T, a random variable Y is obtained, defined by

Y = ∫_0^T g(t) X(t) dt

If the weighting function g(t) is such that the mean-square value of the random variable Y is finite, and if Y is a Gaussian-distributed random variable for every such g(t), then X(t) is called a Gaussian process.
In other words, the process X(t) is a Gaussian process if every linear functional of X(t) is a Gaussian random variable.
As we know, a random variable Y possesses a Gaussian distribution if its probability density function is of the form

p_Y(y) = ( 1 / (√(2π) σ_Y) ) exp[ −(y − m_Y)² / (2σ_Y²) ]

where m_Y is the mean and σ_Y² is the variance of the random variable Y.
Properties of a Gaussian Process:
1. If a Gaussian process X(t) is applied to a stable linear filter, then the random process Y(t) developed at the output of the filter is also Gaussian.

2. Consider the set of random variables (samples) X(t₁), X(t₂), …, X(tₙ) obtained by observing a random process X(t) at the instants t₁, t₂, …, tₙ. If the process X(t) is Gaussian, then this set of random variables is jointly Gaussian for any n, with its n-fold joint p.d.f. completely determined by the set of means

m_X(t_i) = E[X(t_i)],   i = 1, 2, …, n

and the set of autocovariance functions

C_XX(t_k, t_i) = E[ (X(t_k) − E[X(t_k)]) (X(t_i) − E[X(t_i)]) ],   k, i = 1, 2, …, n

3. If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict
sense.
4. If the set of random variables X(t₁), X(t₂), …, X(tₙ) obtained from a Gaussian process is uncorrelated, then they are statistically independent.

A numerical sketch of property 1 is given below.
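A minimal numerical sketch of property 1, assuming numpy is available. The FIR impulse response h is an arbitrary illustrative choice of a stable linear filter, and sample skewness and excess kurtosis near zero are used only as a rough indicator that the filtered output remains Gaussian:

import numpy as np

rng = np.random.default_rng(3)
N = 500_000
x = rng.normal(0.0, 1.0, N)             # Gaussian input process (white noise)
h = np.array([0.2, 0.5, 0.2, 0.1])      # impulse response of a stable linear (FIR) filter
y = np.convolve(x, h, mode="valid")     # filter output

def skew_kurt(v):
    # Sample skewness and excess kurtosis; both are near 0 for Gaussian data
    z = (v - v.mean()) / v.std()
    return np.mean(z**3), np.mean(z**4) - 3.0

print("input  skew, excess kurtosis:", np.round(skew_kurt(x), 3))
print("output skew, excess kurtosis:", np.round(skew_kurt(y), 3))
# Both pairs stay close to (0, 0), consistent with the output remaining Gaussian.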
5. Briefly discuss the Poisson random process.
Ans: A Poisson process is a discrete random process which describes the number of times an event has occurred as a function of time. Examples: arrivals of customers at a bank check-out register, occurrences of lightning strikes over a particular locality, failures of components in a system, and so on.
To define the Poisson process, two conditions are required:
1. Only one event occurs at a time.
2. The occurrence times are statistically independent, so that the number of events occurring in any given time interval is independent of the number in any other non-overlapping time interval.
A simulation sketch based on these two conditions is given below.
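A minimal simulation sketch, assuming numpy is available; the rate λ = 3 and the observation time are arbitrary illustrative values. Inter-arrival times of a Poisson process are independent exponential random variables, and the counts in disjoint unit intervals then come out approximately Poisson(λ), with mean close to variance:

import numpy as np

rng = np.random.default_rng(4)
lam = 3.0                       # average arrival rate (events per unit time), assumed value
T = 1000.0                      # total observation time

# Inter-arrival times of a Poisson process are independent exponential(lam) variables.
inter_arrivals = rng.exponential(1.0 / lam, size=int(lam * T * 2))
arrival_times = np.cumsum(inter_arrivals)
arrival_times = arrival_times[arrival_times <= T]

# Counts in disjoint unit-length intervals are approximately independent Poisson(lam) variables.
counts = np.histogram(arrival_times, bins=np.arange(0.0, T + 1.0))[0]
print("empirical mean count per unit interval:", counts.mean())   # about lam
print("empirical variance of the count:       ", counts.var())    # about lam (Poisson property)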












6. Consider a random process X(t) = A cos ωt, where ω is a constant and A is a random variable uniformly distributed over (0, 1). Find the autocorrelation and autocovariance of X(t).
Sol:
We have f_A(A) = 1 on (0, 1).

R_XX(t₁, t₂) = E[ X(t₁) · X(t₂) ]
            = E[ A cos ωt₁ · A cos ωt₂ ]
            = cos ωt₁ cos ωt₂ · E[A²]

Consider

E[A²] = ∫_0^1 A² f_A(A) dA = [ A³/3 ]_0^1 = 1/3

Therefore

R_XX(t₁, t₂) = (1/3) cos ωt₁ cos ωt₂
Autocovariance:

C_XX(t₁, t₂) = E[ (X(t₁) − E[X(t₁)]) (X(t₂) − E[X(t₂)]) ]
            = R_XX(t₁, t₂) − E[X(t₁)] · E[X(t₂)]

E[X(t₁)] = E[A cos ωt₁] = cos ωt₁ · E[A]

We have

E[A] = ∫_0^1 A f_A(A) dA = [ A²/2 ]_0^1 = 1/2

Therefore E[X(t₁)] = (1/2) cos ωt₁
and E[X(t₂)] = (1/2) cos ωt₂.

Therefore

C_XX(t₁, t₂) = (1/3) cos ωt₁ cos ωt₂ − (1/4) cos ωt₁ cos ωt₂
            = (1/12) cos ωt₁ cos ωt₂

A Monte Carlo check of these results is sketched below.
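A minimal Monte Carlo check, assuming numpy is available; the values chosen for ω, t₁ and t₂ are arbitrary illustrative assumptions:

import numpy as np

rng = np.random.default_rng(5)
omega = 2.0                        # assumed value of the constant omega
t1, t2 = 0.3, 1.1                  # two arbitrary observation instants
A = rng.uniform(0.0, 1.0, 1_000_000)

X1 = A * np.cos(omega * t1)
X2 = A * np.cos(omega * t2)

R_mc = np.mean(X1 * X2)                         # estimate of E[X(t1) X(t2)]
C_mc = R_mc - np.mean(X1) * np.mean(X2)         # R_XX - E[X(t1)] E[X(t2)]

R_th = (1/3) * np.cos(omega * t1) * np.cos(omega * t2)
C_th = (1/12) * np.cos(omega * t1) * np.cos(omega * t2)
print("R_XX: monte carlo =", round(R_mc, 4), " theory =", round(R_th, 4))
print("C_XX: monte carlo =", round(C_mc, 4), " theory =", round(C_th, 4))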

7. If X(t) is a stationary random process having mean 3 and autocorrelation function R_XX(τ) = 9 + 2e^{−|τ|}, find the mean and variance of the random variable Y = ∫_0^2 X(t) dt.
Sol:
E[X(t)] = 3 and R_XX(τ) = 9 + 2e^{−|τ|}

Y = ∫_0^2 X(t) dt

E[Y] = E[ ∫_0^2 X(t) dt ] = ∫_0^2 E[X(t)] dt = 3 ∫_0^2 dt = 6

Var(Y) = E(Y²) − [E(Y)]²

Let Z(t) be a wide-sense stationary random process and define the random variable

Y = ∫_a^{a+T} Z(t) dt,   where T > 0.

Then

E(Y²) = ∫_{−T}^{T} (T − |τ|) R_ZZ(τ) dτ

Using the above relation with Y = ∫_0^2 X(t) dt, i.e. a = 0 and T = 2,

E(Y²) = ∫_{−2}^{2} (2 − |τ|) R_XX(τ) dτ

With R_XX(τ) = 9 + 2e^{−|τ|},

E(Y²) = ∫_{−2}^{0} (2 + τ)(9 + 2e^{τ}) dτ + ∫_{0}^{2} (2 − τ)(9 + 2e^{−τ}) dτ
      = [ 18τ + (9/2)τ² + 4e^{τ} + 2e^{τ}(τ − 1) ]_{−2}^{0} + [ 18τ − (9/2)τ² − 4e^{−τ} + 2e^{−τ}(τ + 1) ]_{0}^{2}
      = (20 + 2e^{−2}) + (20 + 2e^{−2})
      = 40 + 4e^{−2}

Therefore E(Y²) = 40 + 4e^{−2} ≈ 40.541
Var(Y) = E(Y²) − [E(Y)]² ≈ 40.541 − 36 = 4.541

A numerical check of E(Y²) is sketched below.
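A minimal numerical check of E(Y²) = ∫_{−2}^{2} (2 − |τ|) R_XX(τ) dτ, assuming numpy is available; a simple Riemann-sum approximation of the integral is used:

import numpy as np

tau = np.linspace(-2.0, 2.0, 400_001)
dtau = tau[1] - tau[0]
R = 9.0 + 2.0 * np.exp(-np.abs(tau))        # given autocorrelation function
integrand = (2.0 - np.abs(tau)) * R

EY2 = np.sum(integrand) * dtau              # Riemann-sum approximation of the integral
EY = 6.0                                    # E[Y] = 3 * 2
print("E[Y^2] ~", round(EY2, 3))            # about 40.54
print("Var(Y) ~", round(EY2 - EY**2, 3))    # about 4.54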
8. A random process is defined as X(t) = A sin(ωt + θ), where A is a constant and θ is a random variable uniformly distributed over (−π, π). Check X(t) for stationarity.
Ans:
Since θ is a uniform random variable over (−π, π),

f(θ) = 1/(2π)   for −π ≤ θ ≤ π
     = 0        elsewhere

Consider

E[X(t)] = ∫_{−π}^{π} A sin(ωt + θ) f(θ) dθ
        = (A/2π) ∫_{−π}^{π} sin(ωt + θ) dθ
        = −(A/2π) [ cos(ωt + θ) ]_{−π}^{π} = 0

Consider

R_XX(τ) = E[ X(t) · X(t + τ) ]
        = E[ A sin(ωt + θ) · A sin(ω(t + τ) + θ) ]
        = (A²/2) · E[ cos ωτ − cos(2ωt + ωτ + 2θ) ]
        = (A²/2) cos ωτ − (A²/2) · E[ cos(2ωt + ωτ + 2θ) ]

E[ cos(2ωt + ωτ + 2θ) ] = ∫_{−π}^{π} cos(2ωt + ωτ + 2θ) f(θ) dθ
                        = (1/2π) [ sin(2ωt + ωτ + 2θ) / 2 ]_{−π}^{π} = 0

Therefore R_XX(τ) = (A²/2) cos ωτ

Since the mean of the process is independent of time and the autocorrelation function is independent of time, depending only on the delay τ, the process is a (wide-sense) stationary process. A Monte Carlo check is sketched below.
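A minimal Monte Carlo check, assuming numpy is available; the numeric values of A, ω and τ are arbitrary illustrative assumptions. The ensemble mean and the ensemble autocorrelation are estimated at several absolute times t and should not depend on t:

import numpy as np

rng = np.random.default_rng(6)
A, omega = 2.0, 2 * np.pi                        # assumed values for the constants A and omega
theta = rng.uniform(-np.pi, np.pi, 1_000_000)    # random phase, one value per realization

def X(t):
    return A * np.sin(omega * t + theta)

tau = 0.13
for t in (0.0, 0.4, 0.9):                        # ensemble averages at several absolute times t
    mean = np.mean(X(t))
    Rxx = np.mean(X(t) * X(t + tau))
    print(f"t={t}: E[X(t)]={mean:+.3f}, R_XX={Rxx:.3f}, theory={(A**2/2)*np.cos(omega*tau):.3f}")
# The mean stays near 0 and R_XX depends only on tau, consistent with wide-sense stationarity.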


9. A random process is defined as X(t) = A cos(ω_c t + θ), where θ is a uniform random variable over (0, 2π). Verify that the process is ergodic in the mean sense and in the autocorrelation sense.
Ans:
Consider E[X(t)] = E[ A cos(ω_c t + θ) ]
                 = ∫_0^{2π} A cos(ω_c t + θ) f(θ) dθ
                 = (1/2π) ∫_0^{2π} A cos(ω_c t + θ) dθ
                 = 0

Consider the time average

⟨X(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(t) dt
       = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ω_c t + θ) dt
       = 0

We have ⟨X(t)⟩ = E[X(t)]. Hence, the process is ergodic in the mean sense.


Consider R_XX(τ) = E[ X(t) · X(t + τ) ]
                 = E[ A cos(ω_c t + θ) · A cos(ω_c (t + τ) + θ) ]
                 = (A²/2) cos ω_c τ

Consider the autocorrelation as a time average:

⟨X(t) X(t + τ)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(t) X(t + τ) dt
               = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} A cos(ω_c t + θ) · A cos(ω_c (t + τ) + θ) dt
               = lim_{T→∞} (1/T) · (A²/2) [ ∫_{−T/2}^{T/2} cos(2ω_c t + ω_c τ + 2θ) dt + ∫_{−T/2}^{T/2} cos ω_c τ dt ]
               = (A²/2) cos ω_c τ

Thus, the process is ergodic in the autocorrelation sense also. A single-realization time-average check is sketched below.
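A minimal single-realization check, assuming numpy is available; the values of A, f_c, τ and the averaging window are arbitrary illustrative assumptions. One fixed phase θ is drawn, and the time averages over that single sample function are compared with the ensemble values computed above:

import numpy as np

rng = np.random.default_rng(7)
A, fc = 1.5, 3.0                        # assumed amplitude and frequency values
wc = 2 * np.pi * fc
theta = rng.uniform(0.0, 2 * np.pi)     # ONE fixed phase: a single realization of the process

T = 200.0                               # long averaging window
t = np.linspace(0.0, T, 2_000_001)
x = A * np.cos(wc * t + theta)

tau = 0.07
shift = int(round(tau / (t[1] - t[0])))
time_mean = np.mean(x)
time_Rxx = np.mean(x[:-shift] * x[shift:])

print("time-average mean:", round(time_mean, 4), " (ensemble mean = 0)")
print("time-average R_XX(tau):", round(time_Rxx, 4),
      " (ensemble value =", round((A**2 / 2) * np.cos(wc * tau), 4), ")")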

10. Consider two random processes X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt, where A and B are uncorrelated, zero-mean random variables with the same variance and ω is a constant. Show that X(t) and Y(t) are jointly stationary.
Ans:
Consider E[X(t)] = E(A) · cos ωt + E(B) · sin ωt = 0

Consider R_XX(t, t + τ) = E[ X(t) · X(t + τ) ]
 = E[ (A cos ωt + B sin ωt)(A cos ω(t + τ) + B sin ω(t + τ)) ]
 = E[ A² cos ωt cos ω(t + τ) + AB cos ωt sin ω(t + τ) + AB sin ωt cos ω(t + τ) + B² sin ωt sin ω(t + τ) ]
 = E(A²) cos ωt cos ω(t + τ) + E(AB) sin(2ωt + ωτ) + E(B²) sin ωt sin ω(t + τ)

Since A and B are zero-mean, E(A²) = Var(A) and E(B²) = Var(B); since their variances are equal, E(A²) = E(B²).
Since A and B are uncorrelated, ρ_AB = Cov(A, B)/(σ_A σ_B) = 0
⇒ Cov(A, B) = 0
⇒ E(AB) = E(A) · E(B)
Since A and B are zero-mean, E(AB) = 0.

R_XX(t, t + τ) = E(A²) · [ cos ωt cos ω(t + τ) + sin ωt sin ω(t + τ) ]
              = E(A²) · cos ωτ

⇒ X(t) is a (wide-sense) stationary process.
Similarly, Y(t) can also be verified to be stationary.

Consider R_XY(t, t + τ) = E[ X(t) · Y(t + τ) ]
 = E[ (A cos ωt + B sin ωt)(B cos ω(t + τ) − A sin ω(t + τ)) ]
 = E(AB) cos ωt cos ω(t + τ) − E(A²) cos ωt sin ω(t + τ) + E(B²) sin ωt cos ω(t + τ) − E(AB) sin ωt sin ω(t + τ)
 = E(A²) [ sin ωt cos ω(t + τ) − cos ωt sin ω(t + τ) ]
 = −E(A²) sin ωτ

Since R_XY(t, t + τ) is independent of time and is a function of τ only, and X(t) and Y(t) are each stationary, X(t) and Y(t) are jointly stationary processes. A symbolic check of these results is sketched below.
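A minimal symbolic check of these two results, assuming sympy is available. A and B are treated as symbols, and the expectation is taken by replacing A² and B² with a common σ² and AB with 0 (zero-mean, uncorrelated, equal variance):

import sympy as sp

t, tau, w = sp.symbols('t tau omega', real=True)
A, B = sp.symbols('A B', real=True)
sigma2 = sp.symbols('sigma2', positive=True)

X = A * sp.cos(w * t) + B * sp.sin(w * t)
Y = B * sp.cos(w * t) - A * sp.sin(w * t)

def expectation(expr):
    # E[A^2] = E[B^2] = sigma2 and E[AB] = 0 for zero-mean, uncorrelated A, B of equal variance
    e = sp.expand(expr)
    e = e.subs({A**2: sigma2, B**2: sigma2, A*B: 0})
    return sp.simplify(e)

Rxx = expectation(X * X.subs(t, t + tau))
Rxy = expectation(X * Y.subs(t, t + tau))
print("R_XX(t, t+tau) =", Rxx)
print("R_XY(t, t+tau) =", Rxy)
print("R_XX equals sigma2*cos(omega*tau):", sp.simplify(Rxx - sigma2 * sp.cos(w * tau)) == 0)
print("R_XY equals -sigma2*sin(omega*tau):", sp.simplify(Rxy + sigma2 * sp.sin(w * tau)) == 0)
# Neither result depends on t, confirming joint stationarity.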

11. A random process X(t) is defined as X(t) = A cos 2πf_c t, where A is a Gaussian-distributed random variable with zero mean and variance σ_A². This random process is applied to an ideal integrator, which produces the output Y(t) = ∫_0^t X(z) dz. Then,
(a) Check Y(t) for stationarity.
(b) Check Y(t) for ergodicity.
Ans:
(a) E[Y(t)] = E[ ∫_0^t X(z) dz ] = ∫_0^t E[X(z)] dz

E[X(z)] = E[A cos 2πf_c z] = E(A) · cos 2πf_c z = 0

⇒ E[Y(t)] = 0

Consider R_YY(t, t + τ) = E[ Y(t) · Y(t + τ) ]

Y(t) = ∫_0^t X(z) dz = ∫_0^t A cos 2πf_c z dz = [ A sin 2πf_c z / (2πf_c) ]_0^t = (A / 2πf_c) · sin 2πf_c t

R_YY(t, t + τ) = E[ (A / 2πf_c) sin 2πf_c t · (A / 2πf_c) sin 2πf_c (t + τ) ]
              = ( E(A²) / (2πf_c)² ) · sin 2πf_c t · sin 2πf_c (t + τ)

Since R_YY(t, t + τ) is not independent of the time t, Y(t) is not a stationary process.


(b) Every ergodic process is necessarily a stationary process. Since Y(t) was shown in part (a) to be non-stationary, Y(t) is not an ergodic process. A simulation check of the non-stationarity found in part (a) is sketched below.
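A minimal Monte Carlo check of part (a), assuming numpy is available; the values of f_c, σ_A and τ are arbitrary illustrative assumptions, and the closed form Y(t) = A sin(2πf_c t)/(2πf_c) derived above is used directly:

import numpy as np

rng = np.random.default_rng(8)
fc, sigma_A = 1.0, 1.0                      # assumed carrier frequency and std of A
A = rng.normal(0.0, sigma_A, 1_000_000)     # Gaussian amplitude, one value per realization

def Y(t):
    # Closed form of the integrator output: Y(t) = A sin(2*pi*fc*t) / (2*pi*fc)
    return A * np.sin(2 * np.pi * fc * t) / (2 * np.pi * fc)

tau = 0.1
for t in (0.15, 0.35, 0.6):
    Ryy = np.mean(Y(t) * Y(t + tau))
    print(f"t={t}: R_YY(t, t+tau) = {Ryy:.5f}")
# The value changes with t for fixed tau, confirming that Y(t) is not stationary.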