
Adaptive Signal Processing

Prof. Dr. Essam Sourour

Chapter 2: Fundamentals of Adaptive Filtering


- Signal Representation
- The Correlation Matrix
- Wiener Filtering
- Linearly Constrained Wiener Filter
- Mean-Square Error Surface
- Bias and Consistency (short)
- Newton Algorithm (short)
- Steepest-Descent Algorithm
- Applications

2.2 Signal Representation: 2.2.1 Deterministic Signal


- Defined by its parameters; not random
- Adaptation is not needed
- Example: a sequence x(k) defined for k = 0, 1, 2, ...
- Convolution: y(k) = sum_n x(n) h(k - n)

- Z-transform, in its region of convergence: X(z) = sum_k x(k) z^(-k)
- Discrete-Time Fourier Transform (DTFT): X(e^jw) = sum_k x(k) e^(-jwk)
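The convolution sum for a deterministic signal can be checked numerically. A minimal NumPy sketch; the exponential signal 0.9^k and the filter taps are illustrative choices, not from the slides:

```python
import numpy as np

# Illustrative deterministic signal: x(k) = 0.9^k, k = 0..9
k = np.arange(10)
x = 0.9 ** k

# Illustrative FIR impulse response h(k)
h = np.array([1.0, 0.5, 0.25])

# Convolution y(k) = sum_n h(n) x(k - n)
y = np.convolve(x, h)

# Verify one output sample against the convolution sum written out
y2_direct = h[0] * x[2] + h[1] * x[1] + h[2] * x[0]
```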

2.2.2 Random Signals


- Random variable X: a random outcome for each experiment
- Stochastic process X(k): the time evolution of a random variable
- Denote by x(k) a sample of the process X(k) at time k; hence x(k) is a random variable
- Distribution function (or Cumulative Distribution Function, CDF): F_x(k)(y) = P(x(k) <= y)

2.2.2 Random Signals, cont


- The Probability Density Function (pdf) is the derivative of the CDF: p_x(k)(y) = dF_x(k)(y)/dy
- The mean, or expected value: m_x(k) = E[x(k)] = integral of y p_x(k)(y) dy
- The autocorrelation function: r(k, l) = E[x(k) x*(l)] = double integral of y z p_{x(k),x(l)}(y, z) dy dz
- Note that p_{x(k),x(l)}(y, z) is the joint probability density function of the random variables x(k) and x(l)

2.2.2 Random Signals, cont


- The joint pdf: p_{x(k),x(l)}(y, z)
- The joint CDF: F_{x(k),x(l)}(y, z) = P(x(k) <= y, x(l) <= z)
- The auto-covariance function: c(k, l) = E[(x(k) - m_x(k))(x(l) - m_x(l))*] = r(k, l) - m_x(k) m_x(l)*
- At k = l the auto-covariance is the variance of x(k): sigma_x(k)^2 = c(k, k) = E[|x(k) - m_x(k)|^2]
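These moments can be estimated from samples. A small NumPy sketch (the mean 2 and standard deviation 3 are arbitrary illustrative values) verifying that the lag-zero autocorrelation equals the variance plus the squared mean:

```python
import numpy as np

rng = np.random.default_rng(0)
x = 2.0 + 3.0 * rng.standard_normal(200_000)  # mean 2, std 3 (illustrative)

mean_est = x.mean()                     # estimate of m_x
var_est = ((x - mean_est) ** 2).mean()  # auto-covariance at k = l (variance)
r0_est = (x ** 2).mean()                # autocorrelation at k = l

# Relationship being illustrated: r(k, k) = sigma^2 + |m_x|^2
```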

2.2.2 Random Signals, cont


- The Gaussian pdf: p_x(k)(y) = (1 / sqrt(2 pi sigma_x(k)^2)) exp(-(y - m_x(k))^2 / (2 sigma_x(k)^2))
- Conditional CDF: F_x(k)(y | B) = P(x(k) <= y | B)
- Define p_x(k)(y | B) as the conditional pdf of x(k), conditioned on the event B
- Conditional mean: E[x(k) | B] = integral of y p_x(k)(y | B) dy
- Conditional variance: E[|x(k) - E[x(k) | B]|^2 | B]

2.2.2 Random Signals, cont


- Wide-sense stationary (WSS) process: the mean and the autocorrelation function do not depend on the time index k
- Nth-order stationary process: the moments up to order N do not depend on the time index k
- Strict-sense stationary process: the pdf does not depend on the time index k

2.2.2 Random Signals, cont


- For a complex random signal x(k) = x_r(k) + j x_i(k), with y = y_r + j y_i and z = z_r + j z_i
- Define p_{x_r(k),x_i(k)}(y, z) as the joint pdf of x_r(k) and x_i(k)
- The complex mean: m_x(k) = E[x(k)] = E[x_r(k)] + j E[x_i(k)]
- The autocorrelation function: r(k, l) = E[x(k) x*(l)]
- The autocovariance function: c(k, l) = E[(x(k) - m_x(k))(x(l) - m_x(l))*]

2.2.2.1 Autoregressive Moving Average Process


- The ARMA process is described by the linear difference equation: y(k) + sum_{i=1..N} a_i y(k-i) = sum_{j=0..M} b_j x(k-j)
- The input x(k) is white noise; the output y(k) is colored (correlated) noise
- Special case b_0 = 1 and b_j = 0 for j > 0: the autoregressive (AR) process; the output depends on the current input and past outputs
- Special case a_i = 0 for all i: the moving-average (MA) process; the output depends on the current and past inputs
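A first-order AR process makes the white-in / colored-out distinction concrete. A sketch with an illustrative coefficient a_1 = -0.8, so that y(k) = 0.8 y(k-1) + x(k):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
x = rng.standard_normal(N)   # white-noise input
a1 = -0.8                    # illustrative AR coefficient

# AR(1): y(k) + a1*y(k-1) = x(k)  =>  y(k) = 0.8*y(k-1) + x(k)
y = np.empty(N)
y[0] = x[0]
for k in range(1, N):
    y[k] = -a1 * y[k - 1] + x[k]

# Lag-1 correlation: near 0 for the white input, near 0.8 for the output
rho_in = np.corrcoef(x[:-1], x[1:])[0, 1]
rho_out = np.corrcoef(y[:-1], y[1:])[0, 1]
```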

2.2.2.2 Markov Process


- A Markov process: the output depends only on the most recent past
- Example, where n(k) is white noise: x(k) = a x(k-1) + n(k)
- An Mth-order Markov process depends on the past M outputs; an Mth-order AR process is an Mth-order Markov process
- The conditional CDF depends only on the past M outputs: F(x(k) | x(k-1), x(k-2), ...) = F(x(k) | x(k-1), ..., x(k-M))

2.2.2.4 Power Spectral Density for WSS Process


- Applying the DTFT to a WSS random process, we get a random output in the frequency domain
- For a WSS process, the DTFT of the autocorrelation function gives the power spectral density: R_x(e^jw) = sum_l r(l) e^(-jwl)
- The mean square of the random variable is given by the integral of the power spectral density: E[|x(k)|^2] = r(0) = (1/(2 pi)) integral over [-pi, pi] of R_x(e^jw) dw
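The mean-square relation can be checked numerically with a PSD estimate. A sketch using unit-variance white noise and an averaged periodogram (Bartlett's method; the segment length 256 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(2 ** 16)     # white noise, unit variance

# Bartlett-style PSD estimate: average periodograms of 256-sample segments
L = 256
segments = x.reshape(-1, L)
psd = (np.abs(np.fft.fft(segments, axis=1)) ** 2 / L).mean(axis=0)

# Mean square E[x^2] equals the average of the PSD over frequency
ms_time = (x ** 2).mean()
ms_freq = psd.mean()
```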

2.2.2.4 Power Spectral Density for WSS, cont


- If a random signal x(k) is applied to a linear time-invariant filter h(k) with output y(k), the following relations hold:
  r_y(l) = r_x(l) * r_h(l), so R_y(e^jw) = R_x(e^jw) |H(e^jw)|^2
  r_yx(l) = h(l) * r_x(l), so R_yx(e^jw) = H(e^jw) R_x(e^jw)
- In these equations r_h(l) is the deterministic autocorrelation of the impulse response, r_h(l) = h(l) * h*(-l)
- r_yx(l) is the cross-correlation of y(k) and x(k), and R_yx(e^jw) is the cross-power spectral density
- The power spectral density is periodic with period 2 pi
- R_x(e^jw) is real, since the autocorrelation function is conjugate-even (r(-l) = r*(l))

2.2.3 Ergodicity for WSS Process


- A stochastic process is called ergodic if its time statistics equal its ensemble statistics
- The time average: m_hat = (1/N) sum_{k=0..N-1} x(k)
- The time average is itself a random variable; its mean should equal the true mean, and its variance should go to zero for large N
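The behavior of the time average can be demonstrated empirically. A sketch with an i.i.d. Gaussian process (ergodic in the mean; the ensemble mean 1.5 and the lengths 10 and 1000 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
true_mean = 1.5              # illustrative ensemble mean

def time_averages(N, trials=2000):
    """Time average over N samples, for many independent realizations."""
    x = true_mean + rng.standard_normal((trials, N))
    return x.mean(axis=1)

m_short = time_averages(10)
m_long = time_averages(1000)

# The time average is itself random: its mean matches the true mean,
# and its variance shrinks as N grows
```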

2.3 The Correlation Matrix


- Usually the input of the adaptive filter is a vector x(k) = [x_0(k) x_1(k) ... x_N(k)]^T
- The correlation matrix R is defined by R = E[x(k) x^H(k)]

The characteristics of R are very important in adaptive filters
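These characteristics can be illustrated by estimating R from a signal passed through a delay line. A sketch; the coloring filter [1, 0.7] and the three-tap delay line are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(4)
K, N = 50_000, 2                       # N+1 = 3 taps in the delay line
x = np.convolve(rng.standard_normal(K), [1.0, 0.7])[:K]  # colored signal

# Row i holds x(k - i): stack delayed copies and average outer products
rows = np.array([x[N - i : K - i] for i in range(N + 1)])
R = rows @ rows.T / rows.shape[1]      # estimate of E[x(k) x^H(k)]

symmetric = np.allclose(R, R.T)        # Hermitian (symmetric for real data)
toeplitz_gap = abs(R[0, 1] - R[1, 2])  # equal-lag diagonals should match
w = rng.standard_normal(N + 1)
quad_form = w @ R @ w                  # >= 0: positive semidefinite
```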

2.3 The Correlation Matrix Characteristics


- R is positive semidefinite. A matrix is called positive semidefinite if for any column vector w we have w^H R w >= 0
- R is (N+1) by (N+1) and Hermitian
- R is Toeplitz (diagonal-constant): the elements along any diagonal are the same

2.3 The Correlation Matrix Characteristics, cont


- Conjugate symmetry: R^H = R, i.e. r(k, l) = r*(l, k)
- Any matrix R has eigenvalues lambda_i and corresponding eigenvectors q_i satisfying R q_i = lambda_i q_i

Properties of Eigenvalues and Eigenvectors of R


- The characteristic equation of R, det(R - lambda I) = 0, gives N+1 solutions for the eigenvalues: lambda_0, lambda_1, ..., lambda_N
- The eigenvalues are real and non-negative
- The eigenvalues of R^m are lambda_i^m
- If the N+1 eigenvectors q_i form a matrix Q with the q_i as columns, we have Q^H R Q = Lambda, a diagonal matrix of the eigenvalues

Properties of Eigenvalues and Eigenvectors of R


- The eigenvectors q_0, q_1, q_2, ..., q_N are linearly independent
- The eigenvectors are orthogonal: q_i^H q_j = 0 for i != j
- The Hermitian matrix R can be diagonalized as R = Q Lambda Q^H, with Q^H Q = I
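The diagonalization can be verified numerically with a small symmetric Toeplitz matrix (the entries below are illustrative):

```python
import numpy as np

# Illustrative symmetric Toeplitz autocorrelation matrix
R = np.array([[1.49, 0.70, 0.00],
              [0.70, 1.49, 0.70],
              [0.00, 0.70, 1.49]])

lam, Q = np.linalg.eigh(R)           # eigenvalues (ascending), eigenvectors in columns

orthonormal = np.allclose(Q.T @ Q, np.eye(3))  # q_i^H q_j = delta_ij
R_rebuilt = Q @ np.diag(lam) @ Q.T             # R = Q Lambda Q^H
```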

Properties of Eigenvalues and Eigenvectors of R


- The eigenvectors of R and R^-1 are equal; the eigenvalues of R^-1 are the inverses 1/lambda_i
- R can be written as the spectral decomposition R = sum_{i=0..N} lambda_i q_i q_i^H

Properties of Eigenvalues and Eigenvectors of R


- The sum of the eigenvalues of R is equal to the trace of R, and the product of the eigenvalues of R is equal to the determinant of R
- The Rayleigh quotient, defined as (w^H R w) / (w^H w), is bounded by lambda_min <= (w^H R w) / (w^H w) <= lambda_max
- The norm of the matrix, ||R||, is defined in the textbook as the spectral norm: ||R||^2 is the largest eigenvalue of R^H R (for Hermitian R, ||R|| = lambda_max)
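A quick numerical check of the trace, determinant, and Rayleigh-quotient properties on a small illustrative Hermitian matrix:

```python
import numpy as np

R = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # illustrative Hermitian matrix
lam = np.linalg.eigvalsh(R)           # eigenvalues in ascending order

trace_matches = np.isclose(lam.sum(), np.trace(R))
det_matches = np.isclose(lam.prod(), np.linalg.det(R))

# Rayleigh quotient lies between the smallest and largest eigenvalues
rng = np.random.default_rng(5)
w = rng.standard_normal(2)
rayleigh = (w @ R @ w) / (w @ w)
```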

Example 2.1

Wiener Filter
- We need the output of a system to equal a desired reference signal d(k)
- Assume a linear combiner: y(k) = w^H x(k) = sum_{i=0..N} w_i* x_i(k)
- The input signal and the adaptive-filter coefficients are given by the vectors x(k) = [x_0(k) x_1(k) ... x_N(k)]^T and w = [w_0 w_1 ... w_N]^T

Wiener Filter, cont


- Note that in many applications the input vector is a delayed version of a single signal: x(k) = [x(k) x(k-1) ... x(k-N)]^T
- In this case the Wiener filter is an FIR filter: y(k) = sum_{i=0..N} w_i* x(k-i)

Wiener Filter, cont


- For both the linear combiner and the FIR filter, we minimize the objective function, the mean-square error (MSE): xi = E[|e(k)|^2], where e(k) = d(k) - y(k)
- For fixed filter coefficients: xi = E[|d(k)|^2] - w^H p - p^H w + w^H R w

Wiener Filter, cont


- The vector p is the cross-correlation between the desired and input signals: p = E[d*(k) x(k)]
- R is the input-signal autocorrelation matrix: R = E[x(k) x^H(k)]
- We need to find the tap-weight coefficients that minimize the MSE; assume R and p are known
- The gradient vector of the MSE function is g_w = 2(R w - p) (for real signals, g_w = -2p + 2Rw)

Wiener Filter, cont


Setting the gradient vector g_w to zero, we get the Wiener filter solution: w_o = R^-1 p
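The Wiener solution can be illustrated by identifying an unknown FIR system from its input and noisy output; a sketch in which the system taps w_true and the noise level are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(6)
K = 200_000
w_true = np.array([0.5, -0.3, 0.2])   # illustrative unknown FIR system
x = rng.standard_normal(K)
d = np.convolve(x, w_true)[:K] + 0.01 * rng.standard_normal(K)

# Delay-line input vectors x(k) = [x(k) x(k-1) x(k-2)]^T, for k = N..K-1
N = len(w_true) - 1
rows = np.array([x[N - i : K - i] for i in range(N + 1)])
R = rows @ rows.T / rows.shape[1]     # autocorrelation matrix estimate
p = rows @ d[N:] / rows.shape[1]      # cross-correlation vector estimate

w_o = np.linalg.solve(R, p)           # Wiener solution w_o = R^-1 p
```

With white input, R is close to the identity and w_o recovers the unknown taps up to estimation noise.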
