1 / 35
Outline
Probability & Random Variables
Random Vectors
Random (Stochastic) Processes
Stationarity
Power Spectral Densities for Real WSS Processes
Conclusion
2 / 35
We do not know the underlying distributions completely . . . all we have are sample moments such as E[X], var(X) and higher-order central moments E[(X − μ_X)^k], k > 2.
We know the distributions, but integration in closed form is not possible (e.g., the Gaussian pdf)
Question: How do we solve this? We use approximation techniques to establish upper and/or lower bounds on probabilities . . .
3 / 35
At-Home Activity: Read up on the following inequalities/bounds:
Markov inequality
Tchebycheff (Chebyshev) inequality
Chernoff inequality
Strong Law of Large Numbers (SLLN)
Weak Law of Large Numbers (WLLN)
Central Limit Theorem (CLT)
4 / 35
When would you use one approximation technique over another?
Which gives a tighter bound: Markov, Tchebycheff or Chernoff?
What is the difference between the SLLN and WLLN?
Express the CLT in your own words.
SLLN, WLLN, and CLT assume IID RVs. What if the RVs are dependent or not identically distributed?
Can you recognize when to use the approximation techniques if given a problem to solve?
Can you apply the appropriate technique(s) when you recognize that one is necessary?
5 / 35
Random Vectors
Transpose
Matrix sum
Matrix product
Trace of a matrix
Norm of a vector
Vector inner & outer products
Block matrices & operations
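The matrix and vector operations listed above can be exercised in a few lines of NumPy (a minimal sketch; the matrices A, B and vectors x, y are assumed illustrative values, not from the notes):

```python
import numpy as np

# Hypothetical small operands chosen only to illustrate each operation.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

At = A.T                    # transpose
S = A + B                   # matrix sum
P = A @ B                   # matrix product
tr = np.trace(A)            # trace: sum of diagonal entries, 1 + 4 = 5
norm_x = np.linalg.norm(x)  # Euclidean norm of x: sqrt(1^2 + 2^2)
inner = x @ y               # inner product: 1*3 + 2*4 = 11 (a scalar)
outer = np.outer(x, y)      # outer product: a 2x2 rank-one matrix
```

Block matrices can be assembled with `np.block`, which stitches sub-matrices into one array the same way block-matrix notation does on paper.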
6 / 35
Random Vectors
7 / 35
Random Vectors
Example 1: Write out the correlation, covariance, cross-correlation & cross-covariance matrices for the n-dimensional random vectors X = [X1, . . . , Xn]^T and Y = [Y1, . . . , Yn]^T.
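The four matrices in Example 1 can be estimated from data as sample averages (a sketch; the dimension n, sample size N, and the particular distributions of X and Y are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 3, 100_000                          # vector dimension and sample count (assumed)
X = rng.normal(size=(N, n))                # N realizations of X, one per row
Y = X + 0.5 * rng.normal(size=(N, n))      # a Y correlated with X, for illustration

mX, mY = X.mean(axis=0), Y.mean(axis=0)    # sample mean vectors

R_X  = (X.T @ X) / N                       # correlation matrix   E[X X^T]
C_X  = R_X - np.outer(mX, mX)              # covariance matrix    E[(X - mX)(X - mX)^T]
R_XY = (X.T @ Y) / N                       # cross-correlation    E[X Y^T]
C_XY = R_XY - np.outer(mX, mY)             # cross-covariance     E[(X - mX)(Y - mY)^T]
```

Note the structural facts the sample estimates inherit: C_X is symmetric, while C_XY generally is not (its transpose is C_YX).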
8 / 35
Conceptual Definition of Random (Stochastic) Processes: Mathematical model of an empirical process whose development is governed by probability laws . . . Let's look at some examples . . .
9 / 35
For each fixed t = tk, what is X(tk, ω)?
For a fixed sample point, ω = ωi, what is X(t, ωi)?
For fixed t = tk & ω = ωi, what is X(tk, ωi)?
Be sure that you know the definitions of the following terms for random processes: ensemble, member function, sample function, realization
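The three views above can be made concrete with a small simulation (a sketch; the process X(t, ω) = A(ω) cos(2π·5·t) with A ~ N(0, 1), the time grid, and the ensemble size are all assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 101)     # time grid (assumed)
n_members = 500                    # ensemble size (assumed)

# Hypothetical process: X(t, omega) = A(omega) * cos(2*pi*5*t), A ~ N(0, 1).
A = rng.normal(size=n_members)
ensemble = A[:, None] * np.cos(2 * np.pi * 5 * t)  # each row is one member function

x_fixed_t = ensemble[:, 10]   # fixed t = t_k: a random variable (one value per omega)
x_fixed_w = ensemble[3]       # fixed omega = omega_i: a deterministic sample function
x_point   = ensemble[3, 10]   # both fixed: a single number
```

The full array is the ensemble; each row is a member function (realization); each column is the random variable X(tk, ·).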
ECNG 6700 - Stochastic Processes, Detection and Estimation 16 / 35
Classification by value: Continuous-valued vs Discrete-valued RPs
17 / 35
Stationarity: Stationary (Strictly, Wide Sense), Cyclostationary, Nonstationary
Real- vs Complex-valued:
Real-valued bandpass RP: Z(t) = A(t) cos[2π fc t + Θ(t)]
Z(t) = Re{A(t) e^{jΘ(t)} e^{j2π fc t}} = Re{W(t) e^{j2π fc t}}
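The complex-envelope identity above can be checked numerically (a minimal sketch; the carrier frequency fc and the particular envelope A(t) and phase Θ(t) are assumed illustrative choices):

```python
import numpy as np

fc = 10.0                                  # carrier frequency (assumed)
t = np.linspace(0.0, 1.0, 1000)
A = 1.0 + 0.3 * np.cos(2 * np.pi * t)      # slowly varying envelope (assumed)
theta = 0.5 * np.sin(2 * np.pi * t)        # slowly varying phase (assumed)

# Bandpass signal written directly...
z_direct = A * np.cos(2 * np.pi * fc * t + theta)

# ...and via its complex envelope W(t) = A(t) e^{j theta(t)}
W = A * np.exp(1j * theta)
z_envelope = np.real(W * np.exp(1j * 2 * np.pi * fc * t))
```

The two constructions agree sample-for-sample, which is exactly the identity Z(t) = Re{W(t) e^{j2π fc t}}.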
18 / 35
19 / 35
20 / 35
Independent RPs: X(t) and Y(t) are independent if, for all n, m and all sample times,
P[X(t1) ≤ x1, . . . , X(tn) ≤ xn, Y(t1) ≤ y1, . . . , Y(tm) ≤ ym] = {∏_{i=1}^{n} P[X(ti) ≤ xi]} · {∏_{j=1}^{m} P[Y(tj) ≤ yj]}
Markov Processes
Gaussian (Normal) Processes
Independent increments (e.g., Wiener Process, Poisson Process)
You should really just get a better idea of the key characteristics of these processes!!!
23 / 35
Stationarity
The distribution function describing the process is invariant under a translation of time/space:
For all t1, . . . , tk and t1 + τ, . . . , tk + τ ∈ T, & all k = 1, 2, . . .,
P[X(t1) ≤ x1, . . . , X(tk) ≤ xk] = P[X(t1 + τ) ≤ x1, . . . , X(tk + τ) ≤ xk]
If stationary for k ≤ N but not k > N, then X(t) is an N-th order stationary process
SSS Properties:
Constant mean over index: E[X(t)] = μ_X = constant
Autocorrelation depends only on the time difference, not the actual time samples: E[X(t1)X(t2)] = RXX(t2 − t1)
Jointly SSS: Joint distributions of X(t) and Y(t) are invariant to shifts in time
24 / 35
Stationarity
Properties:
1. Average power: RXX(0) = E[X²(t)] ≥ 0
2. RXX(τ) is an even function of τ: RXX(τ) = RXX(−τ)
3. |RXX(τ)| ≤ RXX(0)
4. If X(t) contains a periodic component, then RXX(τ) will also
5. If lim_{τ→∞} RXX(τ) = C, then C = μ_X²
6. If RXX(T0) = RXX(0) for some T0 ≠ 0, then RXX(τ) is periodic with period T0
7. If RXX(0) < ∞ and RXX(τ) is continuous at τ = 0, then RXX(τ) is continuous for every τ
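Several of these properties are easy to observe in a sample autocorrelation (a sketch; the process is assumed to be unit-variance white Gaussian noise, and the biased estimator below is one standard choice):

```python
import numpy as np

rng = np.random.default_rng(2)
N = 200_000
x = rng.normal(size=N)        # unit-variance white Gaussian noise: a WSS process

def autocorr(x, max_lag):
    """Biased sample autocorrelation R_XX(tau) for tau = 0 .. max_lag."""
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

R = autocorr(x, 20)
# Property 1: R_XX(0) = E[X^2(t)] >= 0 (here ~1 for unit-variance noise)
# Property 3: |R_XX(tau)| <= R_XX(0) for every lag
# Property 2 (evenness) holds by construction for this estimator: R(-k) = R(k)
```

For white noise the estimate drops to roughly zero at all nonzero lags, consistent with samples at different times being uncorrelated.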
27 / 35
Properties:
1. RXY(τ) = RYX(−τ)
2. |RXY(τ)| ≤ √(RXX(0) RYY(0))
3. |RXY(τ)| ≤ ½ [RXX(0) + RYY(0)]
28 / 35
F{·} is the Fourier transform operator
Autocorrelation can be retrieved using the inverse transform on the PSD: RXX(τ) = F⁻¹{SXX(f)} = ∫ SXX(f) e^{j2πfτ} df
Properties:
1. For X(t) real, RXX(τ) is even and hence SXX(f) is also even: SXX(−f) = SXX(f)
2. If X(t) has periodic components, then SXX(f) will have impulses
29 / 35
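The inverse-transform relation can be checked numerically on a known pair (a sketch using the standard pair RXX(τ) = e^{−|τ|} ↔ SXX(f) = 2 / (1 + (2πf)²); the integration grid and limits are assumed):

```python
import numpy as np

# Frequency grid for numerical integration (assumed limits; S decays like 1/f^2)
f = np.linspace(-50.0, 50.0, 200_001)
df = f[1] - f[0]
S = 2.0 / (1.0 + (2.0 * np.pi * f) ** 2)   # PSD of the pair R(tau) = e^{-|tau|}

def R_from_psd(tau):
    # R_XX(tau) = integral of S_XX(f) e^{j 2 pi f tau} df; since S is even,
    # the imaginary (sine) part cancels and only the cosine term survives.
    return np.sum(S * np.cos(2.0 * np.pi * f * tau)) * df

R0 = R_from_psd(0.0)   # should approximate e^0 = 1 (the total average power)
R1 = R_from_psd(1.0)   # should approximate e^{-1} ~ 0.368
```

The small residual error comes from truncating the integration at |f| = 50 rather than infinity.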
[Figure: SXX(f) for a lowpass spectrum occupying |f| < B, and for a bandpass spectrum occupying fc − B/2 < |f| < fc + B/2]
Power in the frequency band [f1, f2] of a real process (counting negative frequencies):
P = ∫_{−f2}^{−f1} SXX(f) df + ∫_{f1}^{f2} SXX(f) df = 2 ∫_{f1}^{f2} SXX(f) df
30 / 35
Cross-correlation can be retrieved using the inverse transform on the CPSD: RXY(τ) = F⁻¹{SXY(f)} = ∫ SXY(f) e^{j2πfτ} df
Properties:
1. Im{SXY(f)} is an odd function of f (for real X(t), Y(t))
RP Examples
Questions:
1. In a communication system, the carrier signal at the receiver is modeled by Xt = cos(2π ft + Θ), where Θ ∼ Uniform[−π, π]. Find the mean function and the correlation function of Xt.
HINT: cos A cos B = ½ [cos(A + B) + cos(A − B)].
2. Determine the stationarity of the above process.
3. What happens if the frequency or amplitude are random, as opposed to the phase? What happens if more than one is random?
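A Monte Carlo check of question 1 (a sketch; the carrier frequency, the evaluation times, and the trial count are assumed illustrative values): averaging over the random phase should give E[Xt] = 0 and RX(τ) = ½ cos(2πfτ).

```python
import numpy as np

rng = np.random.default_rng(3)
f = 2.0                                        # carrier frequency (assumed)
n_trials = 400_000
theta = rng.uniform(-np.pi, np.pi, size=n_trials)

def X(t):
    # One sample of X_t per realization of the random phase theta
    return np.cos(2 * np.pi * f * t + theta)

mean_est = X(0.3).mean()                       # E[X_t]: should be ~0 for any t
tau = 0.1
R_est = (X(0.0) * X(tau)).mean()               # R_X(tau): should be ~0.5*cos(2*pi*f*tau)
R_true = 0.5 * np.cos(2 * np.pi * f * tau)
```

Since the mean is constant and the correlation depends only on τ, the simulation is consistent with the process being WSS.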
32 / 35
RP Examples
Questions:
1. Find the PSD of the RP with autocorrelation function RXX(τ) = 1 − |τ|/T for |τ| < T, and 0 otherwise.
2. Find the PSD and effective bandwidth of the RP with autocorrelation function RXX(τ) = A e^{−α|τ|}, A, α > 0.
3. The PSD of a zero-mean Gaussian RP is given by SXX(f) = 1 for |f| < 500 Hz, and 0 otherwise. Find RXX(τ) and show that X(t) and X(t + 1) are uncorrelated and hence independent. Uncorrelated jointly Gaussian RVs are also independent!
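For question 3, inverting the rectangular PSD gives RXX(τ) = ∫_{−500}^{500} e^{j2πfτ} df = 1000 sinc(1000τ), using the normalized sinc(x) = sin(πx)/(πx). This can be evaluated directly (a sketch of the check only, not the full derivation):

```python
import numpy as np

# R_XX(tau) for S_XX(f) = 1 on |f| < 500 Hz, else 0.
# np.sinc is the normalized sinc: np.sinc(x) = sin(pi*x)/(pi*x).
def R(tau):
    return 1000.0 * np.sinc(1000.0 * tau)

R0 = R(0.0)        # total average power = integral of S_XX = 1000
R_at_1 = R(1.0)    # sinc hits a zero crossing: X(t) and X(t+1) are uncorrelated
```

R(1) = 0 because sin(1000π) = 0, so X(t) and X(t + 1) are uncorrelated; since the process is Gaussian and zero mean, they are also independent.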
33 / 35
Conclusion
We covered:
Concluded Random Variable fundamentals
Introduction to Random Processes
Recommended Reading: Kay - Sections 9.1-9.3, 9.8, 11.8, 15.1-15.5, 16.1-16.7, 17.1-17.4, 17.6-17.8
Your goals for next class:
Continue working with MATLAB and Simulink
Revise in-class exercises based on today's discussions and ask questions in the next class
Start HW3 and ask questions in the next class
Review notes on RPs Part II in prep for next class
34 / 35
Q&A
Thank You
Questions????
35 / 35