
Probability and Random Variables

o Method for generating random variables with a specified probability distribution function
o Gaussian and Markov processes
o Characterization of stationary random processes
o Linearly filtered random processes
o P.S.D. and autocorrelation

Generation of random variables (r.v.)


o Simulate effects of noise signals
o Model random phenomena in the real world
o Study the effects through simulation of communication systems

Uniform r.v. A: a number between 0 and 1 with equal probability

0 <= A <=1
o The random variable (r.v.) A has the range 0 to 1

o The uniform probability density function (pdf) for the r.v. A is denoted by f(A)

pdf : Probability density function f(A)

The average value or mean value of the r.v. A is

$$m_A = E[A] = \int_0^1 A\, f(A)\, dA = \frac{1}{2}$$

The CDF (PDF) can be defined for any r.v., while a pdf in the usual sense exists only for continuous r.v.s.

Probability Distribution Function (PDF) or Cumulative distribution function (CDF) F(A) of a uniformly distributed r.v. A
The integral of the probability density function (pdf) is called the Probability Distribution Function (PDF) or Cumulative Distribution Function (CDF) of the r.v. A. It is the area under f(A) and is denoted by F(A) or F_X(x), i.e. F_X(x) = P(X <= x).

For any r.v. this area must always equal unity; it is the maximum value of the distribution function.

For the uniform random variable A the distribution function is F(A) = A, so

0 <= F(A) <= 1 for 0 <= A <= 1

Uniformly distributed noise in the interval (b, b+1) can be generated by using the output A of the random number generator and shifting it by an amount b:

B = A + b, where B is a new r.v. with mean m_B = b + 1/2

If b = -1/2, then the r.v. B is uniformly distributed in the interval (-1/2, 1/2), as in the sketch below.
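A minimal sketch of this shift in Python, assuming NumPy's default generator as the source of uniform (0, 1) numbers:

```python
import numpy as np

rng = np.random.default_rng()

def uniform_noise(b, n):
    """Generate n samples of noise uniformly distributed on (b, b + 1)."""
    a = rng.random(n)   # A ~ uniform on (0, 1) from the random number generator
    return a + b        # B = A + b has mean m_B = b + 1/2

# Example: b = -1/2 gives noise uniform on (-1/2, 1/2); the sample mean is near 0
samples = uniform_noise(-0.5, 100_000)
print(samples.mean())
```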

Generating the r.v. with other PDF

A uniformly distributed r.v. A in the range (0, 1) can be used to generate a r.v. C with another PDF F(C). Since the range of F(C) is the interval (0, 1):
o Begin by generating a uniformly distributed r.v. A in the range (0, 1)
o Set F(C) = A and hence C = F^{-1}(A)
o This is the inverse mapping from A to C (see the sketch below)
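A minimal sketch of this inverse mapping, using the exponential distribution F(C) = 1 - exp(-λC) as an illustrative target (the notes do not specify a particular PDF):

```python
import numpy as np

rng = np.random.default_rng()

def exponential_by_inversion(lam, n):
    """Inverse-CDF method: for F(C) = 1 - exp(-lam*C), setting F(C) = A gives C = -ln(1 - A)/lam."""
    a = rng.random(n)                  # A ~ uniform on (0, 1)
    return -np.log(1.0 - a) / lam      # C = F^{-1}(A)

samples = exponential_by_inversion(lam=2.0, n=100_000)
print(samples.mean())                  # approximately 1/lam = 0.5
```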

Gaussian Process

A random process X(t) is a Gaussian process if for all n and all (t_1, t_2, ..., t_n) the r.v.s (X(t_1), ..., X(t_n)) have a joint Gaussian density function f(x), given by

$$f(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} (\det C)^{1/2}} \exp\left(-\frac{1}{2}(\mathbf{x} - \mathbf{m})^T C^{-1} (\mathbf{x} - \mathbf{m})\right)$$

where
x = (x_1, x_2, ..., x_n) is the vector of n r.v.s,
m = E[X] is the mean vector,
C is the n x n covariance matrix,
(·)^T denotes transpose, and C^{-1} is the inverse of the covariance matrix C.

o At any time instant t_0, the r.v. X(t_0) is Gaussian
o At any two points t_1, t_2, the r.v.s (X(t_1), X(t_2)) are distributed according to a two-dimensional Gaussian r.v.
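A minimal sketch of sampling a Gaussian process at n time instants, assuming a hypothetical exponential covariance function C(t1, t2) = exp(-a|t1 - t2|) chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng()

# Hypothetical covariance function (exponential kernel), chosen only for illustration
def cov(t1, t2, a=1.0):
    return np.exp(-a * abs(t1 - t2))

t = np.linspace(0.0, 5.0, 50)                # sampling instants t_1, ..., t_n
m = np.zeros(len(t))                         # mean vector m = E[X]
C = np.array([[cov(ti, tj) for tj in t] for ti in t])   # n x n covariance matrix

# One realization of (X(t_1), ..., X(t_n)): jointly Gaussian with mean m and covariance C
x = rng.multivariate_normal(m, C)
```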

Properties
1. For a Gaussian process, knowledge of the mean m and the covariance C provides a complete statistical description of the process.
2. If the Gaussian process X(t) is passed through an LTI system, the output of the system is also a Gaussian process. The effect of the system on X(t) is simply reflected by a change in the mean value and the covariance of X(t).

Markov Process
o A Markov process X(t) is a random process whose past has no influence on the future if its present is specified; i.e., if t_n > t_{n-1}, then

$$P\left(X(t_n) \le x_n \mid X(t),\ t \le t_{n-1}\right) = P\left(X(t_n) \le x_n \mid X(t_{n-1})\right)$$

In other words, if t_1 < t_2 < ... < t_n, then

$$P\left(X(t_n) \le x_n \mid X(t_{n-1}), X(t_{n-2}), \ldots, X(t_1)\right) = P\left(X(t_n) \le x_n \mid X(t_{n-1})\right)$$

Gauss-Markov Process
A Gauss-Markov process X(t) is a Markov process whose probability density function is Gaussian.
o The simplest method for generating a Markov process is by means of the simple recursive formula

$$X_n = \rho X_{n-1} + W_n$$

where {W_n} is a sequence of zero-mean i.i.d. (white) r.v.s and ρ is a parameter that determines the degree of correlation between X_n and X_{n-1}.
o If the sequence {W_n} is Gaussian, then the resulting process X(t) is a Gauss-Markov process.
[Figure: Gauss-Markov sequence (left); autocorrelation of the Gauss-Markov process (right)]
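A minimal sketch of this recursion in Python, assuming Gaussian W_n with unit variance and an illustrative correlation parameter ρ = 0.95 (these values are not specified in the notes):

```python
import numpy as np

rng = np.random.default_rng()

def gauss_markov(n, rho=0.95, sigma_w=1.0, x0=0.0):
    """Generate a Gauss-Markov sequence via X_n = rho * X_{n-1} + W_n."""
    w = rng.normal(0.0, sigma_w, n)    # zero-mean i.i.d. Gaussian (white) sequence W_n
    x = np.empty(n)
    prev = x0
    for i in range(n):
        prev = rho * prev + w[i]       # recursive formula
        x[i] = prev
    return x

x = gauss_markov(1000)
```

The closer ρ is to 1, the stronger the correlation between successive samples and the more slowly the autocorrelation decays.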

Power Spectrum of Random Processes and White processes

A stationary process X(t) is characterized in the frequency domain by its power spectrum S_X(f), which is the Fourier transform of the autocorrelation function R_X(τ) of the random process. That is

$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau)\, e^{-j 2\pi f \tau}\, d\tau$$

The autocorrelation function of a stationary random process X(t) is obtained from the power spectrum by means of the inverse Fourier transform:

$$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f)\, e^{j 2\pi f \tau}\, df$$
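A rough numerical illustration of this transform pair (a sketch, not part of the original notes), using the Gauss-Markov sequence from the earlier sketch and the discrete-time version of the relation:

```python
import numpy as np

rng = np.random.default_rng()

# Samples of a stationary process: the Gauss-Markov sequence from the sketch above
rho, n = 0.95, 50_000
w = rng.normal(0.0, 1.0, n)
x = np.zeros(n)
for i in range(1, n):
    x[i] = rho * x[i - 1] + w[i]

# Estimate the autocorrelation R_X(m) for lags m = 0 .. maxlag
maxlag = 100
R = np.array([np.mean(x[: n - m] * x[m:]) for m in range(maxlag + 1)])

# Power spectrum via the discrete-time version of the Fourier-transform relation:
# S_X(f) ~ R(0) + 2 * sum_{m>=1} R(m) * cos(2*pi*f*m), with f in cycles/sample
f = np.linspace(-0.5, 0.5, 513)
m = np.arange(1, maxlag + 1)
S = R[0] + 2 * (R[1:] * np.cos(2 * np.pi * np.outer(f, m))).sum(axis=1)
```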

White Process

A random process X(t) is called a white process if it has a flat power spectrum, i.e. if S_X(f) is constant for all f.
o Information sources are modeled as the output of LTI systems driven by a white process.

o If S_X(f) = C > 0 for all f, then the total power is

$$P_X = \int_{-\infty}^{\infty} S_X(f)\, df = \int_{-\infty}^{\infty} C\, df = \infty$$

i.e., the total power is infinite.

No real physical process can have infinite power, and therefore a white process may not be a meaningful physical process.

However, thermal noise can be modeled for all practical purposes as a white noise process with power spectrum equal to kT/2, where k is Boltzmann's constant and T is the ambient temperature. The value kT is denoted by N_0, so the PSD of thermal noise is

$$S_X(f) = \frac{N_0}{2}$$

which is referred to as the two-sided power spectral density.

For a white random process X(t) with power spectrum S_X(f) = N_0/2, the autocorrelation function is

$$R_X(\tau) = \int_{-\infty}^{\infty} \frac{N_0}{2}\, e^{j 2\pi f \tau}\, df = \frac{N_0}{2}\,\delta(\tau)$$

where δ(τ) is the unit impulse; for all τ ≠ 0 we have R_X(τ) = 0.

o If we sample a white process at two points t_1 and t_2, the resulting r.v.s will be uncorrelated.

o If, in addition to being white, the random process is also Gaussian, the sampled r.v.s will be statistically independent Gaussian r.v.s (see the sketch below).
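A quick numerical illustration of this point (a sketch under assumed unit-variance samples, not from the notes): samples of a discrete-time white Gaussian sequence taken at two different times are uncorrelated.

```python
import numpy as np

rng = np.random.default_rng()

# Discrete-time white Gaussian sequence; samples taken at two different times
# (here separated by a lag k) should be uncorrelated.
x = rng.normal(0.0, 1.0, 100_000)
k = 5
print(np.corrcoef(x[:-k], x[k:])[0, 1])   # approximately 0
```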

Linear Filtering of Random Processes


o Suppose that a stationary random process X(t) is passed through an LTI filter that is characterized in the time domain by its impulse response h(t) and in the frequency domain by its frequency response H(f)

o It follows that the output of the linear filter is the random process

$$Y(t) = \int_{-\infty}^{\infty} X(\tau)\, h(t - \tau)\, d\tau$$

o The mean value of Y(t) is

$$m_Y = m_X \int_{-\infty}^{\infty} h(t)\, dt = m_X H(0)$$

where H(0) is the frequency response H(f) of the filter evaluated at f = 0
o The autocorrelation function of Y(t) is

$$R_Y(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} R_X(\tau + s - t)\, h(s)\, h(t)\, ds\, dt$$

o In the frequency domain, the power spectrum of the output Y(t) is related to the power spectrum of the input process X(t) and the frequency response of the linear filter by the expression

$$S_Y(f) = S_X(f)\, |H(f)|^2$$

This is easily seen by taking the Fourier transform of the autocorrelation relation for R_Y(τ) above.
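A rough numerical check of this relation, using SciPy and assuming a white Gaussian input and a 4th-order Butterworth lowpass filter (both chosen only for illustration):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng()
fs = 1.0                                    # normalized sampling rate

# White (approximately flat-spectrum) Gaussian input process X
x = rng.normal(0.0, 1.0, 200_000)

# Illustrative LTI filter: 4th-order Butterworth lowpass
b, a = signal.butter(4, 0.1)
y = signal.lfilter(b, a, x)                 # output process Y

# Estimated power spectra of input and output
f, Sx = signal.welch(x, fs=fs, nperseg=1024)
_, Sy = signal.welch(y, fs=fs, nperseg=1024)

# Squared magnitude of the filter's frequency response on the same grid
_, H = signal.freqz(b, a, worN=f, fs=fs)

# S_Y(f) should track |H(f)|^2 * S_X(f); compare in the passband
band = (f > 0) & (f < 0.04)
print(np.mean(Sy[band] / (np.abs(H[band]) ** 2 * Sx[band])))   # close to 1
```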

Low pass and Band Pass Processes


o Random signals can also be characterized as lowpass or bandpass random processes.

Definition
o A random process is called lowpass if its power spectrum is large in the vicinity of f = 0 and small (approaching 0) at high frequencies. In other words, a lowpass random process has most of its power concentrated at low frequencies.

Definition
o A lowpass random process X(t) is bandlimited if its power spectrum S_X(f) = 0 for |f| > B; the parameter B is called the bandwidth of the random process.

Definition
o A random process is called bandpass if its power spectrum is large in a band of frequencies centered in the neighborhood of a central frequency f_0 and relatively small outside this band of frequencies.

o A bandpass random process is called narrowband if its bandwidth B is much smaller than the central frequency f_0.

Properties
o Bandpass processes are suitable for representing modulated signals.
o The information-bearing signal is usually a lowpass random process that modulates a carrier for transmission over a bandpass (narrowband) communication channel.
o The modulated signal is a bandpass random process.
o A bandpass random process X(t) can be represented as

$$X(t) = X_c(t)\cos(2\pi f_0 t) - X_s(t)\sin(2\pi f_0 t)$$

where X_c(t) and X_s(t) are called the in-phase and quadrature components of X(t) and are lowpass processes (a sketch follows below).
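A minimal sketch of this representation, assuming illustrative values fs = 1000 Hz, f0 = 100 Hz, and lowpass components obtained by filtering white Gaussian noise (none of these choices come from the notes):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng()
fs, f0, n = 1000.0, 100.0, 10_000           # sample rate, carrier, length (illustrative)
t = np.arange(n) / fs

# Lowpass in-phase and quadrature components: white Gaussian noise through a lowpass filter
b, a = signal.butter(4, 20.0, fs=fs)        # 20 Hz cutoff << f0, so the result is narrowband
xc = signal.lfilter(b, a, rng.normal(size=n))
xs = signal.lfilter(b, a, rng.normal(size=n))

# Bandpass process X(t) = Xc(t) cos(2*pi*f0*t) - Xs(t) sin(2*pi*f0*t)
x = xc * np.cos(2 * np.pi * f0 * t) - xs * np.sin(2 * np.pi * f0 * t)
```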

Theorem:
o If X(t) is a zero-mean stationary bandpass random process, the processes X_c(t) and X_s(t) are also zero-mean, jointly stationary processes.
o The autocorrelation functions of X_c(t) and X_s(t) are identical and may be expressed as

$$R_{X_c}(\tau) = R_{X_s}(\tau) = R_X(\tau)\cos(2\pi f_0 \tau) + \hat{R}_X(\tau)\sin(2\pi f_0 \tau)$$

where R_X(τ) is the autocorrelation function of the bandpass process X(t) and \hat{R}_X(τ) is the Hilbert transform of R_X(τ), defined as

$$\hat{R}_X(\tau) = \frac{1}{\pi}\int_{-\infty}^{\infty} \frac{R_X(t)}{\tau - t}\, dt$$

o The cross-correlation function of X_c(t) and X_s(t), R_{X_c X_s}(τ) = E[X_c(t+τ) X_s(t)], is expressed as

$$R_{X_c X_s}(\tau) = R_X(\tau)\sin(2\pi f_0 \tau) - \hat{R}_X(\tau)\cos(2\pi f_0 \tau)$$

o The autocorrelation function of the bandpass process X(t) is expressed in terms of the autocorrelation function R_{X_c}(τ) and the cross-correlation function R_{X_c X_s}(τ) as

$$R_X(\tau) = R_{X_c}(\tau)\cos(2\pi f_0 \tau) + R_{X_c X_s}(\tau)\sin(2\pi f_0 \tau)$$
