
Recursive Least Squares Parameter Estimation for Linear Steady State and Dynamic Models

Thomas F. Edgar Department of Chemical Engineering University of Texas Austin, TX 78712

Thomas F. Edgar (UT-Austin) | RLS – Linear Models | Virtual Control Book 12/06

Outline

• Static model, sequential estimation

• Multivariate sequential estimation

• Example

• Dynamic discrete-time model

• Closed-loop estimation


Least Squares Parameter Estimation

Linear Time Series Models

ref: PC Young, Control Engr., p. 119, Oct, 1969

scalar example (no dynamics)

model:  y = a x

data:   y* = a x + ε    (ε: error)

least squares estimate of a, â:

min_â Σ_{i=1}^{k} (â x_i − y_i*)²    (1)

Simple Example

The analytical solution for the minimum (least squares) estimate is

â_k = [Σ_{i=1}^{k} x_i²]⁻¹ [Σ_{i=1}^{k} x_i y_i*] = p_k b_k    (2)

p_k and b_k are functions of the number of samples k.

This is the non-sequential (non-recursive) form.


Sequential or Recursive Form

To update p_k, b_k, and â_k when a new data point (x_k, y_k*) arrives, in Eq. (2) let

p_k⁻¹ = Σ_{i=1}^{k} x_i² = p_{k−1}⁻¹ + x_k²    (3)

b_k = Σ_{i=1}^{k} x_i y_i* = b_{k−1} + x_k y_k*    (4)


Recursive Form for Parameter Estimation

â_k = â_{k−1} − K_k (â_{k−1} x_k − y_k*)    (5)

where (â_{k−1} x_k − y_k*) is the estimation error and

K_k = p_{k−1} x_k (1 + p_{k−1} x_k²)⁻¹    (6)

To start the algorithm, need initial estimates â_0 and p_0. To update p,

p_k = p_{k−1} − p_{k−1}² x_k² (1 + p_{k−1} x_k²)⁻¹    (7)

(Set p_0 = large positive number)

Eq. (7) shows p_k is decreasing with k.
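As a concrete sketch, the scalar recursion of Eqs. (5)–(7) can be coded directly. The data below and the choices â₀ = 0, p₀ = 10⁶ are illustrative, not from the slides:

```python
# Scalar recursive least squares, Eqs. (5)-(7): estimate a in y = a*x.
def rls_scalar(xs, ys, a0=0.0, p0=1e6):
    a_hat, p = a0, p0
    for x, y in zip(xs, ys):
        K = p * x / (1.0 + p * x**2)                # gain, Eq. (6)
        a_hat = a_hat - K * (a_hat * x - y)         # estimate update, Eq. (5)
        p = p - p**2 * x**2 / (1.0 + p * x**2)      # covariance update, Eq. (7)
    return a_hat, p

# Noise-free data generated from a true slope of 2
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0 * x for x in xs]
a_hat, p = rls_scalar(xs, ys)
```

With a large p₀ the recursion reproduces the batch estimate of Eq. (2), and p shrinks with each sample, as Eq. (7) predicts.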


Estimating Multiple Parameters (Steady State Model)

y = xᵀa = a_1 x_1 + a_2 x_2 + … + a_n x_n    (8)

min_â Σ_{i=1}^{k} (x_iᵀ â − y_i*)²

(non-sequential solution requires an n × n matrix inverse)

To obtain a recursive form for â, let

P_k⁻¹ = P_{k−1}⁻¹ + x_k x_kᵀ    (9)

B_k = B_{k−1} + x_k y_k*    (10)


Recursive Solution

â_k = â_{k−1} − P_{k−1} x_k (1 + x_kᵀ P_{k−1} x_k)⁻¹ (x_kᵀ â_{k−1} − y_k*)    (11)

where (x_kᵀ â_{k−1} − y_k*) is the estimation error and K_k = P_{k−1} x_k (1 + x_kᵀ P_{k−1} x_k)⁻¹ is the gain.

P_k = P_{k−1} − P_{k−1} x_k (1 + x_kᵀ P_{k−1} x_k)⁻¹ x_kᵀ P_{k−1}    (12)

Need to assume â_0 (vector) and P_0 (diagonal matrix) with large P_ii:

P_0 = diag(P_11, P_22, …, P_nn)

Simple Example (Estimate Slope & Intercept)

Linear parametric model:  y = a_1 + a_2 u

input, u:   0     1    2    3    4    10   12   18
output, y:  5.71  9    15   19   20   45   55   78
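A sketch of the vector recursion of Eqs. (11)–(12) applied to this data, with regressor x = [1, u]ᵀ; the initializations â₀ = 0 and P₀ = 10⁶ I are illustrative choices:

```python
import numpy as np

# Vector RLS, Eqs. (11)-(12): estimate intercept a1 and slope a2 in y = a1 + a2*u.
u = np.array([0, 1, 2, 3, 4, 10, 12, 18], dtype=float)
y = np.array([5.71, 9, 15, 19, 20, 45, 55, 78])

a_hat = np.zeros(2)      # initial guess [a1, a2]
P = 1e6 * np.eye(2)      # P0: diagonal with large entries

for uk, yk in zip(u, y):
    x = np.array([1.0, uk])                              # regressor [1, u]
    denom = 1.0 + x @ P @ x
    a_hat = a_hat - (P @ x / denom) * (x @ a_hat - yk)   # Eq. (11)
    P = P - np.outer(P @ x, x @ P) / denom               # Eq. (12)
```

After the eight samples the recursion should land near the batch least-squares fit for this data (roughly a₁ ≈ 5.7, a₂ ≈ 4.0).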


Sequential vs. Non-sequential Estimation of a_2 Only (a_1 = 0)
[figure]

Covariance Matrix vs. Number of Samples Using Eq. (12)
[figure]


Sequential Estimation of â_1 and â_2 Using Eq. (11)
[figure]


Application to Digital Model and Feedback Control

Linear Discrete Model with Time Delay:

y(t) = a_1 y(t−1) + a_2 y(t−2) + … + a_n y(t−n) + b_1 u(t−1−N) + b_2 u(t−2−N) + … + b_r u(t−r−N) + d    (13)

y: output    u: input    d: disturbance    N: time delay


Recursive Least Squares Solution

y(t) = φᵀ(t−1) θ(t−1) + ε(t)    (14)

where

φᵀ(t−1) = [y(t−1), y(t−2), …, y(t−n), u(t−1−N), …, u(t−r−N), 1]
θᵀ(t−1) = [a_1, a_2, …, a_n, b_1, b_2, …, b_r, d]

min_θ̂ Σ_{i=1}^{t} [φᵀ(i−1) θ̂(i) − y(i)]²    (15)

("least squares"; φᵀθ̂ is the predicted value of y)

θ̂(t) = θ̂(t−1) + P(t) φ(t−1) [y(t) − φᵀ(t−1) θ̂(t−1)]    (16)
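The recursion of Eqs. (14)–(17) can be sketched for a first-order case (n = 1, r = 1, N = 0), i.e. y(t) = a₁y(t−1) + b₁u(t−1) + d. The true parameter values and random input below are illustrative:

```python
import numpy as np

# RLS for a first-order discrete model: y(t) = a1*y(t-1) + b1*u(t-1) + d
rng = np.random.default_rng(0)
a1, b1, d = 0.7, 1.5, 0.2                 # "true" plant parameters (illustrative)

u = rng.uniform(-1, 1, 200)               # exciting input sequence
y = np.zeros(201)
for t in range(1, 201):
    y[t] = a1 * y[t-1] + b1 * u[t-1] + d  # noise-free simulation

theta = np.zeros(3)                       # estimate of [a1, b1, d]
P = 1e6 * np.eye(3)
for t in range(1, 201):
    phi = np.array([y[t-1], u[t-1], 1.0])          # regressor phi(t-1), Eq. (14)
    denom = 1.0 + phi @ P @ phi
    K = P @ phi / denom                            # gain, Eq. (18)
    theta = theta + K * (y[t] - phi @ theta)       # Eqs. (16)/(20)
    P = P - np.outer(P @ phi, phi @ P) / denom     # Eq. (17)
```

Because the data are noise-free and the input is persistently exciting, θ̂ converges to the true [a₁, b₁, d].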


Recursive Least Squares Solution

P(t) = P(t−1) − P(t−1) φ(t−1) [1 + φᵀ(t−1) P(t−1) φ(t−1)]⁻¹ φᵀ(t−1) P(t−1)    (17)

K(t) = P(t−1) φ(t−1) [1 + φᵀ(t−1) P(t−1) φ(t−1)]⁻¹    (18)

P(t) = [I − K(t) φᵀ(t−1)] P(t−1)    (19)

θ̂(t) = θ̂(t−1) + K(t) [y(t) − ŷ(t)]    (20)

K: Kalman filter gain


Closed-Loop RLS Estimation

• There are three practical considerations in implementing parameter estimation algorithms:

- covariance resetting

- variable forgetting factor

- use of perturbation signal


Enhance sensitivity of least squares estimation algorithms with a forgetting factor λ:

J = Σ_{i=1}^{t} λ^{t−i} [y(i) − φᵀ(i−1) θ̂]²    (21)

θ̂(t) = θ̂(t−1) + P(t) φ(t−1) [y(t) − φᵀ(t−1) θ̂(t−1)]    (22)

P(t) = (1/λ) [P(t−1) − P(t−1) φ(t−1) φᵀ(t−1) P(t−1) / (λ + φᵀ(t−1) P(t−1) φ(t−1))]    (23)

λ < 1 prevents elements of P from becoming too small (improves sensitivity), but noise may lead to incorrect parameter estimates.

0 < λ ≤ 1.0
λ = 1.0: all data weighted equally
λ ≈ 0.98 typical
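A sketch of the forgetting-factor form, Eqs. (22)–(23), on a scalar model y = a·u whose parameter a drifts; the drift scenario and starting values below are illustrative:

```python
import numpy as np

# Scalar RLS with forgetting factor, Eqs. (22)-(23), tracking a drifting parameter.
rng = np.random.default_rng(1)
lam = 0.98                                # typical forgetting factor

u = rng.uniform(-1, 1, 400)
a_true = np.linspace(2.0, 3.0, 400)       # parameter drifts from 2 to 3
y = a_true * u                            # noise-free scalar model y = a*u

theta, P = 0.0, 1e4
for t in range(400):
    phi = u[t]
    denom = lam + phi * P * phi
    theta = theta + (P * phi / denom) * (y[t] - phi * theta)  # Eq. (22)
    P = (P - (P * phi) ** 2 / denom) / lam                    # Eq. (23)
```

Discounting old data lets θ̂ follow the drift (it finishes near the final value 3, lagging slightly), whereas λ = 1 would pull the estimate toward the whole-record average of about 2.5.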


Closed-Loop Estimation (RLS)

Perturbation signal is added to process input (via set point) to excite process dynamics

large signal: good parameter estimates but large errors in process output

small signal: better control but more sensitivity to noise

Guidelines: Vogel and Edgar, Comp. Chem. Engr., Vol. 12, pp. 15-26 (1988)

1. set the forgetting factor λ = 1.0

2. use covariance resetting (add a diagonal matrix D to P when tr(P) becomes small)


3. use a PRBS perturbation signal only when the estimation error is large and P is not small; vary the PRBS amplitude in proportion to tr(P)

4. P(0) = 10⁴ I

5. filter new parameter estimates

θ_c(k) = α θ_c(k−1) + (1 − α) θ̂(k)

α: tuning parameter (θ_c is used by the controller)

6. Use other diagnostic checks such as the sign of the computed process gain
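Guidelines 2 and 5 can be sketched as small helpers; the threshold tr_min, reset magnitude delta, and filter constant alpha below are illustrative tuning choices, not values from the text:

```python
import numpy as np

def reset_covariance(P, tr_min=0.1, delta=100.0):
    """Guideline 2: add a diagonal matrix D = delta*I to P when tr(P) is small."""
    if np.trace(P) < tr_min:
        P = P + delta * np.eye(P.shape[0])
    return P

def filter_estimate(theta_c_prev, theta_hat, alpha=0.9):
    """Guideline 5: theta_c(k) = alpha*theta_c(k-1) + (1 - alpha)*theta_hat(k)."""
    return alpha * theta_c_prev + (1.0 - alpha) * theta_hat

P = reset_covariance(0.01 * np.eye(2))                 # trace 0.02 triggers a reset
theta_c = filter_estimate(np.array([1.0, 0.0]),        # previous controller params
                          np.array([2.0, 1.0]))        # new raw estimate
```

Filtering smooths abrupt jumps in θ̂ before the controller uses them, at the cost of slower adaptation.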
