
Chapter 7: Stability

We are familiar with the intuitive concept of stability in the context of the poles of the system's transfer function:

$$H(s) = \frac{K(s - z_1)\cdots(s - z_m)}{(s - p_1)\cdots(s - p_n)}$$
If the $p_i$'s are:                          The system is:
- all in the left half-plane                 stable
- any in the right half-plane                unstable
- in the left half-plane, except for
  possibly non-repeated poles on the
  Im-axis                                    marginally stable
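These pole-location rules can be checked numerically. The sketch below is a minimal illustration assuming NumPy; the function name `classify_poles` and the simple pairwise repeated-root check are our own illustrative choices, not part of the original notes:

```python
import numpy as np

def classify_poles(den):
    """Classify continuous-time stability from the transfer-function
    denominator coefficients (highest power of s first)."""
    poles = np.roots(den)
    re = poles.real
    if np.any(re > 1e-9):
        return "unstable"
    # Poles on the imaginary axis: marginally stable only if non-repeated.
    on_axis = poles[np.isclose(re, 0.0)]
    if len(on_axis) == 0:
        return "stable"
    for i, p in enumerate(on_axis):          # crude repeated-root check
        for q in on_axis[i + 1:]:
            if np.isclose(p, q):
                return "unstable"
    return "marginally stable"

print(classify_poles([1, 3, 2]))     # s^2 + 3s + 2: poles -1, -2 -> stable
print(classify_poles([1, 0, 1]))     # s^2 + 1: poles +/-j -> marginally stable
print(classify_poles([1, -1, -2]))   # s^2 - s - 2: pole at +2 -> unstable
```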
These rules do not always account for different types of stability, nor for state-variable descriptions, except that we know the eigenvalues of a SISO system's A-matrix to be the poles of the transfer function through the relationship:

$$H(s) = C(sI - A)^{-1}B + D = C\,\frac{\operatorname{adj}(sI - A)}{\det(sI - A)}\,B + D$$

The determinant $\det(sI - A)$ is the same expression used to find the eigenvalues.
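This relationship is easy to verify numerically. A minimal sketch, assuming SciPy's `ss2tf` and an arbitrarily chosen SISO model (the matrix values are illustrative, not from the notes):

```python
import numpy as np
from scipy.signal import ss2tf

# An arbitrary SISO state-space model (values are illustrative).
A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.array([[0.0]])

num, den = ss2tf(A, B, C, D)        # den holds the det(sI - A) coefficients
poles = np.sort(np.roots(den))
eigs = np.sort(np.linalg.eigvals(A))

# With no pole-zero cancellation, the eigenvalues of A are the poles of H(s).
print(poles, eigs)
```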
Next we'll define several useful types of stability and discuss tests for them in terms of state-variable descriptions, which apply especially to nonlinear systems.
DEFINITION: For a system given by a general equation (not necessarily linear)

$$\dot{x}(t) = f(x(t), u(t), t),$$

$x_e$ is an equilibrium point if, in the absence of any input ($u(t) \equiv 0$), $x(t) = x_e$ for all $t \ge t_0$. Because $x_e$ is constant, this implies that

$$\dot{x}(t) = 0 = f(x_e, 0, t)$$

(Or, in discrete-time, $x(k+1) - x(k) = 0$ for all $k$.)

"equilibrium point" = "critical point"
Note that for LTI (Linear Time-Invariant) systems, this results in:

$$A x_e = 0$$

So either $x_e = 0$ (when A is full rank), or $x_e \in N(A)$, in which case A has at least one zero eigenvalue and there are an infinite number of equilibrium points lying in a subspace that must, of course, pass through the origin. We will consider only A's of full rank, so the only equilibrium point considered is $x_e = 0$.
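This case split can be illustrated numerically. A sketch assuming SciPy's `null_space` helper, with two arbitrarily chosen A matrices:

```python
import numpy as np
from scipy.linalg import null_space

# Full-rank A: the only equilibrium of x' = Ax is x_e = 0.
A_full = np.array([[-1.0, 0.0], [0.0, -4.0]])
print(null_space(A_full).shape)     # (2, 0): only the trivial solution

# Rank-deficient A (a zero eigenvalue): a whole subspace of equilibria
# through the origin, spanned by the null-space basis.
A_def = np.array([[1.0, -1.0], [1.0, -1.0]])
ns = null_space(A_def)
print(ns.shape)                     # (2, 1): a line of equilibria
print(np.allclose(A_def @ ns, 0))   # every point on it satisfies A x_e = 0
```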
It is the equilibrium point $x_e$ that we will classify as stable or unstable; i.e., does a system stray from its equilibrium, and if so, how far?
Definitions of different types of stability:

[Figure: a ball resting on various surfaces under a force F, illustrating unstable, marginally stable, and stable equilibria.]
Denote the solution of a zero-input system of equations as $\phi(t;\, x_0, t_0)$ (which is just $\Phi(t, t_0)\,x_0$ for our linear systems), and let the origin $x_e = 0$ be the equilibrium point of interest.
DEFINITION: The origin is a stable equilibrium if for any $\varepsilon > 0$ there exists a $\delta(\varepsilon, t_0) > 0$ such that if $\|x(t_0)\| < \delta$, then $\|x(t)\| < \varepsilon$ for all $t > t_0$.

This is called stability "in the sense of Lyapunov" (i.s.L.).
DEFINITION: The origin is asymptotically stable if it is stable and there exists an $\varepsilon'(t_0) > 0$ such that whenever $\|x(t_0)\| < \varepsilon'(t_0)$, then

$$\lim_{t \to \infty} x(t) = 0$$
[Figure: trajectories of a two-variable system in the $(x_1, x_2)$ plane with $u(t) \equiv 0$, illustrating unstable, stable i.s.L., and asymptotically stable behavior.]
Extra Conditions:
- If $\delta$ (or $\varepsilon'$) is independent of the initial time, then the stability is called "uniform."
- If $\delta$ (or $\varepsilon'$) can be chosen arbitrarily large, then the equilibrium point is globally stable, or stable in the large.
DEFINITION: If there is a fixed constant M such that $\|u\| \le M$ for every t (or k), then the input is bounded. If for every bounded input and every initial condition $x(t_0)$ there exists a scalar $N_s(M, t_0, x(t_0)) > 0$ such that $\|x(t)\| \le N_s$, then the system is BIBS (bounded input, bounded state) stable.
DEFINITION: If the input u is bounded above by a constant M and there exists a scalar N such that for all time the system output satisfies $\|y\| \le N$, the system is BIBO (bounded input, bounded output) stable.
Now we consider linear systems. Recall that the solution of a linear system can be written:

$$x(t) = \Phi(t, t_0)\,x(t_0) + \int_{t_0}^{t} \Phi(t, \tau)B(\tau)u(\tau)\,d\tau$$

$$y(t) = C(t)\Phi(t, t_0)\,x(t_0) + C(t)\int_{t_0}^{t} \Phi(t, \tau)B(\tau)u(\tau)\,d\tau + D(t)u(t)$$

or, using the properties of the delta function to absorb the D-term into the integral,

$$y(t) = C(t)\Phi(t, t_0)\,x(t_0) + \int_{t_0}^{t} \left[C(t)\Phi(t, \tau)B(\tau) + D(\tau)\,\delta(t - \tau)\right]u(\tau)\,d\tau = C(t)\Phi(t, t_0)\,x(t_0) + \int_{t_0}^{t} H(t, \tau)u(\tau)\,d\tau$$

where $H(t, \tau)$ is the "weighting matrix," also known as the impulse response matrix of the system.
Consider the zero-input solution for the state variables:

$$x(t) = \Phi(t, t_0)\,x(t_0)$$

By the Cauchy-Schwarz inequality,

$$\|x(t)\| \le \|\Phi(t, t_0)\|\,\|x(t_0)\|$$

If there is a bound $\kappa(t_0)$ on the state-transition matrix such that $\|\Phi(t, t_0)\| \le \kappa(t_0)$, then by choosing

$$\delta = \frac{\varepsilon}{\kappa(t_0)}$$

we can prove stability i.s.L. by the definition. This is a necessary and sufficient condition for stability i.s.L. of the zero equilibrium state. (Proof of necessity is by contradiction.)

If, in addition, $\Phi(t, t_0) \to 0$ as $t \to \infty$ (with $u(t) = 0$), then the equilibrium is asymptotically stable.
Now reconsider the input-related types of stability, BIBS and BIBO.

All linear systems that are asymptotically stable are also exponentially stable. This means that the norm of the state-transition matrix is bounded by an exponential function as it tends to zero.
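This exponential envelope can be seen numerically. A sketch assuming SciPy's `expm`; the matrix A is an illustrative choice, and the rate `alpha` is taken from its slowest eigenvalue:

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # eigenvalues -1, -2: asymptotically stable
alpha = 1.0                                 # decay rate of the slowest mode

# ||e^{A t}|| should sit under an exponential envelope c * e^{-alpha t}.
ts = np.linspace(0.0, 10.0, 101)
norms = np.array([np.linalg.norm(expm(A * t), 2) for t in ts])
ratios = norms / np.exp(-alpha * ts)

print(ratios.max())    # a finite envelope constant c
print(norms[-1])       # the norm itself tends to zero
```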
Theorem: A system is BIBS stable iff there exists a finite fixed value $N(t_0)$ such that

$$\int_{t_0}^{t} \|\Phi(t, \tau)B(\tau)\|\,d\tau \le N(t_0) < \infty \quad \text{for all } t \ge t_0.$$

To see sufficiency, assume $\|u(t)\| \le M < \infty$ and apply the triangle and Cauchy-Schwarz inequalities, then the boundedness of the input:

$$\|x(t)\| = \left\|\Phi(t, t_0)\,x(t_0) + \int_{t_0}^{t} \Phi(t, \tau)B(\tau)u(\tau)\,d\tau\right\|$$
$$\le \|\Phi(t, t_0)\|\,\|x(t_0)\| + \int_{t_0}^{t} \|\Phi(t, \tau)B(\tau)\|\,\|u(\tau)\|\,d\tau$$
$$\le \|\Phi(t, t_0)\|\,\|x(t_0)\| + M\,N(t_0) < \infty$$
Stability i.s.L. is necessary for BIBS stability, which we can see by setting $u(t) = 0$ above. So as long as $\|\Phi(t, t_0)\| < \infty$ and

$$\int_{t_0}^{t} \|\Phi(t, \tau)B(\tau)\|\,d\tau \le N(t_0) < \infty,$$

the state will be bounded.
For BIBO stability, we consider any initial conditions as having resulted from some bounded past input, so that we may write (from the solution on slide 243, with $t_0 \to -\infty$ and $\|u(t)\| \le M < \infty$):

$$y(t) = \int_{-\infty}^{t} H(t, \tau)u(\tau)\,d\tau$$

Then, because we are considering only bounded inputs, we can conclude that

$$\int_{-\infty}^{t} \|H(t, \tau)\|\,d\tau \le N_H < \infty$$

is necessary and sufficient for BIBO stability.

Recall that we can find the 2-norm of a matrix M by taking the square root of the largest eigenvalue of $M^T M$.
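As a quick check of that 2-norm fact, a sketch assuming NumPy (the random test matrix is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))   # an arbitrary test matrix

# 2-norm of M = sqrt of the largest eigenvalue of M^T M.
two_norm = np.sqrt(np.max(np.linalg.eigvalsh(M.T @ M)))
print(two_norm, np.linalg.norm(M, 2))   # the two computations agree
```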
In practice, this integral test is not often used. Instead, it can be shown that a time-invariant system is BIBO stable iff all of the poles of the reduced transfer function are in the open left-half complex plane or, for discrete-time systems, inside the open unit circle.

Note that eigenvalues and poles are not necessarily the same thing (e.g., for time-varying systems)!
Time-Invariant systems:

$$\Phi(t, t_0) = e^{A(t - t_0)} \quad \text{(continuous time)}$$
$$\Phi(k, 0) = A^k \quad \text{(discrete time)}$$
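These forms are easy to exercise numerically. A sketch assuming SciPy's `expm`, with illustrative A matrices:

```python
import numpy as np
from scipy.linalg import expm

# Continuous time: Phi(t, t0) = e^{A (t - t0)}  (A chosen arbitrarily).
A = np.array([[-1.0, 0.0], [0.0, -4.0]])
t0, t1, t = 0.5, 1.0, 2.0
Phi = expm(A * (t - t0))

# Semigroup property of the state-transition matrix:
# Phi(t, t0) = Phi(t, t1) @ Phi(t1, t0).
print(np.allclose(Phi, expm(A * (t - t1)) @ expm(A * (t1 - t0))))

# Discrete time: Phi(k, 0) = A^k.
Ad = np.array([[0.5, 0.1], [0.0, 0.2]])
Phi_d = np.linalg.matrix_power(Ad, 4)
print(np.allclose(Phi_d, Ad @ Ad @ Ad @ Ad))
```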
The behavior of these state-transition matrices is obvious from the forms above. A summary of zero-input stability properties in terms of the eigenvalues of the A-matrix:

Continuous time ($\dot{x} = Ax$):
- Unstable if any $\mathrm{Re}(\lambda_i) > 0$, or if $\mathrm{Re}(\lambda_i) = 0$ with $\lambda_i$ repeated.
- Stable i.s.L. if all $\mathrm{Re}(\lambda_i) \le 0$, with any $\lambda_i$ on the imaginary axis non-repeated.
- Asymptotically stable if all $\mathrm{Re}(\lambda_i) < 0$.

Discrete time ($x(k+1) = Ax(k)$):
- Unstable if any $|\lambda_i| > 1$, or if $|\lambda_i| = 1$ with $\lambda_i$ repeated.
- Stable i.s.L. if all $|\lambda_i| \le 1$, with any $\lambda_i$ on the unit circle non-repeated.
- Asymptotically stable if all $|\lambda_i| < 1$.
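This classification can be encoded directly. The helper below is our own sketch assuming NumPy; like the table, it treats any repeated boundary eigenvalue as unstable, which is a simplification (a full treatment would examine the Jordan blocks instead):

```python
import numpy as np

def classify(A, discrete=False, tol=1e-9):
    """Zero-input stability of x' = Ax (or x(k+1) = Ax(k)) from eigenvalues."""
    lam = np.linalg.eigvals(A)
    # Distance past the stability boundary: Re(lambda) or |lambda| - 1.
    margin = np.abs(lam) - 1.0 if discrete else lam.real
    if np.any(margin > tol):
        return "unstable"
    boundary = lam[np.abs(margin) <= tol]
    for i, p in enumerate(boundary):        # crude repeated-eigenvalue check
        for q in boundary[i + 1:]:
            if np.isclose(p, q):
                return "unstable"
    return "stable i.s.L." if len(boundary) > 0 else "asymptotically stable"

print(classify(np.array([[-1.0, 0.0], [0.0, -4.0]])))               # asymptotically stable
print(classify(np.array([[0.0, 1.0], [-1.0, 0.0]])))                # stable i.s.L. (+/- j)
print(classify(np.array([[0.5, 0.0], [0.0, 0.9]]), discrete=True))  # asymptotically stable
```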
Modal Stability and Classification of Equilibria:
Recall that each eigenvector corresponds to an eigenvalue,
and that the eigenvectors span invariant subspaces in the
state-space. Then we can talk about the stability of these
subspaces by considering the value of the eigenvalue
corresponding to each such subspace.
Recall the phase portraits in the last chapter that illustrated
the "phase planes" of some 2-D systems. Consider how
they are constructed from their modes:
$$A = \begin{bmatrix} -1 & 0 \\ 0 & -4 \end{bmatrix}, \qquad \lambda = \{-1, -4\}, \qquad M = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$$

Stable Node. Eigenvectors = invariant subspaces.
- Large motion along the vertical eigenvector $e_2$, toward the origin: the fast mode, corresponding to $\lambda = -4$.
- Small motion along the horizontal eigenvector $e_1$, toward the origin: the slow mode, corresponding to $\lambda = -1$.

[Figure: phase portrait in the $(x_1, x_2)$ plane.]
$$A = \begin{bmatrix} -3 & 2 \\ 1 & -2 \end{bmatrix}, \qquad \lambda = \{-1, -4\}, \qquad M = \begin{bmatrix} 1 & 2 \\ 1 & -1 \end{bmatrix}$$

Stable Node. Eigenvectors = invariant subspaces.
- Large motion along the second eigenvector $e_2$, toward the origin: the fast mode, corresponding to $\lambda = -4$.
- Large motion along the first eigenvector $e_1$, toward the origin: the slow mode, corresponding to $\lambda = -1$.

[Figure: phase portrait in the $(x_1, x_2)$ plane.]
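The modal picture can be confirmed numerically. A sketch assuming NumPy; the non-diagonal matrix `A2` below is our own illustrative choice of a system with the same eigenvalues but skewed eigenvectors:

```python
import numpy as np

# A diagonal example: eigenvectors along the coordinate axes.
A1 = np.array([[-1.0, 0.0], [0.0, -4.0]])
lam1, M1 = np.linalg.eig(A1)
print(lam1)            # -1 (slow mode) and -4 (fast mode)

# A non-diagonal matrix with the same eigenvalues (illustrative choice):
A2 = np.array([[-3.0, 2.0], [1.0, -2.0]])
lam2, M2 = np.linalg.eig(A2)
print(np.sort(lam2))   # again {-4, -1}: same modes, skewed eigenvectors

# Invariance: a state on an eigenvector is mapped along that eigenvector.
v = M2[:, 0]
print(np.allclose(A2 @ v, lam2[0] * v))
```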
Similar pictures can be drawn for 3-D systems, where the
invariant subspaces might then be planes also.
Higher dimensions work the same way, but are difficult to
imagine.
Lyapunov Stability Methods: This can be a powerful stability
testing technique. It applies to many nonlinear systems as
well.
Consider the (unforced) mechanical system with mass M, damper B, and spring K:

$$M\ddot{x} + B\dot{x} + Kx = 0$$

The energy stored in this system is:

$$V(x, \dot{x}) = \tfrac{1}{2}M\dot{x}^2 + \tfrac{1}{2}Kx^2$$

Normally we expect a passive system to have positive energy, which would require (sufficiently) that $M, K > 0$.
If the system is "stable," this energy should dissipate, so the rate of change of energy over time should be negative:

$$\dot{V}(x, \dot{x}) = M\dot{x}\ddot{x} + Kx\dot{x}$$

Evaluating this "along the trajectories of the system," i.e., substituting $\ddot{x} = -\tfrac{B}{M}\dot{x} - \tfrac{K}{M}x$ from the differential equation:

$$\dot{V}(x, \dot{x}) = M\dot{x}\left(-\tfrac{B}{M}\dot{x} - \tfrac{K}{M}x\right) + Kx\dot{x} = -B\dot{x}^2 - Kx\dot{x} + Kx\dot{x} = -B\dot{x}^2$$
So if we expect this passive physical system to dissipate positive energy, then we must have $B > 0$. All three of the constants should therefore be positive ($M, B, K > 0$) in order for the energy to be positive and the change in energy to be negative. This is the idea behind Lyapunov's Direct Method.

What conditions would we ask of the coefficients of $Ms^2 + Bs + K$ for a stable system? (Hint: apply the Routh-Hurwitz test, or examine the roots via the quadratic formula.)

Usually, though, the state equations have little intuitive sense of "energy" to them, so we generalize the terminology.
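The energy argument can be checked by simulation. A sketch assuming SciPy's `solve_ivp`, with arbitrary positive values chosen for M, B, K:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (assumed) values: all three constants positive.
M, B, K = 1.0, 0.5, 2.0

def f(t, s):
    """Mass-spring-damper M x'' + B x' + K x = 0 as a first-order system."""
    x, xdot = s
    return [xdot, -(B * xdot + K * x) / M]

sol = solve_ivp(f, [0.0, 20.0], [1.0, 0.0], rtol=1e-9, atol=1e-12, max_step=0.01)
x, xdot = sol.y

V = 0.5 * M * xdot**2 + 0.5 * K * x**2   # stored energy
Vdot = -B * xdot**2                      # rate of change derived above

print(np.max(np.diff(V)))   # energy never increases (up to solver tolerance)
print(V[-1] / V[0])         # and has decayed substantially
```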
Notation: A function $V(x)$ of a state vector is positive definite if, in a neighborhood of an equilibrium point, $V(0) = 0$ and $V(x) > 0$ for $x \ne 0$ ($V(x) \ge 0$ for positive semidefinite). Reverse the signs to get negative (semi)definite.
Lyapunov Theorem: For any system $\dot{x} = f(x)$ with equilibrium point $f(0) = 0$, if a positive definite function $V(x)$ can be found such that $\dot{V}(x)$ is negative semidefinite, then the system is stable in the sense of Lyapunov. The function $V(x)$ is then called a "Lyapunov Function."

If the derivative $\dot{V}(x)$ is negative definite, then the origin is asymptotically stable.
Notes: What these theorems do and don't say:
1. The neighborhood in which the Lyapunov function is
defined is the only* neighborhood in which the initial
condition can lie for which stability is guaranteed. A
system might be unstable if we leave this neighborhood.
2. The theorems do not say how to find such Lyapunov
functions. Doing so generally takes lots of experience
and trial and error (especially for nonlinear systems).
3. If we cannot find a Lyapunov function (our candidates do
not work), this does not imply the system is unstable!!! (It
just means we haven't shown it is!)
Theorem: If a system is globally asymptotically stable, then a Lyapunov function exists that is valid for all $x \ne 0$.

*When linear systems are stable, they are globally stable.
Linear time-invariant systems $\dot{x} = Ax$ give some interesting results. Let's examine Lyapunov's method: we choose a quadratic form as a "candidate" Lyapunov function,

$$V(x) = x^T P x,$$

with P being real, symmetric, and positive definite (so V(x) is obviously PD). Now compute the derivative along trajectories:

$$\dot{V}(x) = \dot{x}^T P x + x^T P \dot{x} = x^T A^T P x + x^T P A x = x^T\left[A^T P + P A\right]x$$

We want this to be a negative definite function, so . . .
$$A^T P + P A = -Q$$

where Q is some positive definite matrix. This is a famous equation called a Lyapunov equation.

So we can guess a P matrix and compute Q to see if it is positive definite (or semidefinite, for i.s.L. stability). A more common (and reliable) technique is to choose a positive definite Q and solve the Lyapunov equation for P (which is often not easy by hand). Any Q will do, so $Q = I$ is often used.
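Numerically, solving the Lyapunov equation is one SciPy call. A sketch with an arbitrarily chosen stable A; note that SciPy's `solve_continuous_lyapunov(a, q)` solves $aX + Xa^H = q$, so we pass $A^T$ and $-Q$:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # eigenvalues -1, -2: stable
Q = np.eye(2)                               # any positive definite Q will do

# Solve A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

print(np.linalg.eigvalsh(P))            # all positive: P is positive definite
print(np.allclose(A.T @ P + P @ A, -Q)) # the equation is satisfied
```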
Theorem: A system with system matrix A is asymptotically stable iff the solution P of the Lyapunov equation is positive definite whenever Q is positive definite.

Note that any p.d. Q will show whether a system is stable in this approach, whereas choosing a P and solving for Q will show neither stability nor instability if Q turns out not to be positive definite.
Interestingly, the Lyapunov equation has a closed-form solution, but it is of little help, because it requires knowledge of the state-transition matrix for the LTI system:

$$P = \int_0^\infty e^{A^T t}\, Q\, e^{A t}\, dt$$
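The closed-form integral can be checked against the direct solution. A sketch assuming SciPy, with an arbitrary stable A and a truncated trapezoid-rule approximation of the infinite integral:

```python
import numpy as np
from scipy.linalg import expm, solve_continuous_lyapunov

A = np.array([[-1.0, 0.5], [0.0, -2.0]])   # an arbitrary stable matrix
Q = np.eye(2)

# Truncated trapezoid-rule approximation of P = int_0^inf e^{A^T t} Q e^{A t} dt.
ts = np.linspace(0.0, 40.0, 4001)          # the integrand decays like e^{-2t}
dt = ts[1] - ts[0]
integrand = np.array([expm(A.T * t) @ Q @ expm(A * t) for t in ts])
w = np.full(len(ts), dt)
w[0] = w[-1] = dt / 2                      # trapezoid end weights
P_int = np.tensordot(w, integrand, axes=1)

# Direct solution of A^T P + P A = -Q for comparison.
P_lyap = solve_continuous_lyapunov(A.T, -Q)
print(np.allclose(P_int, P_lyap, atol=1e-4))
```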
Example: Prove that the nonlinear system

$$\dot{x}_1 = x_2$$
$$\dot{x}_2 = -a_1 x_1 - a_2 x_2 (b_1 x_1 + b_2 x_2)^2$$

is asymptotically stable. Consider the candidate Lyapunov function:

$$V(x) = a_1 x_1^2 + x_2^2, \qquad a_1, a_2 > 0$$
So V(x) is obviously positive definite. First find the equilibria:

$$0 = x_2, \qquad 0 = -a_1 x_1 - a_2 x_2 (b_1 x_1 + b_2 x_2)^2$$

The only equilibrium is $x_{1e} = x_{2e} = 0$.
Now compute the derivative along trajectories:

$$\dot{V}(x) = 2a_1 x_1 \dot{x}_1 + 2x_2 \dot{x}_2$$
$$= 2a_1 x_1 x_2 + 2x_2\left(-a_1 x_1 - a_2 x_2 (b_1 x_1 + b_2 x_2)^2\right)$$
$$= 2a_1 x_1 x_2 - 2a_1 x_1 x_2 - 2a_2 x_2^2 (b_1 x_1 + b_2 x_2)^2$$
$$= -2a_2 x_2^2 (b_1 x_1 + b_2 x_2)^2$$

This is obviously

$$\dot{V}(x)\ \begin{cases} < 0 & \text{if } x_2 \ne 0 \\ = 0 & \text{if } x_2 = 0 \end{cases}$$

So at any time that $x_2(t) \ne 0$, the system is asymptotically stable. If $x_2(t) = 0$, the system is stable i.s.L. This is the "region of stability."
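The conclusion can be spot-checked by simulation. A sketch assuming SciPy's `solve_ivp`, with illustrative parameter values ($a_1 = a_2 = b_1 = 1$, $b_2 = 2$) and an arbitrary initial condition:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative (assumed) parameters, all positive as the example requires.
a1, a2, b1, b2 = 1.0, 1.0, 1.0, 2.0

def f(t, s):
    x1, x2 = s
    return [x2, -a1 * x1 - a2 * x2 * (b1 * x1 + b2 * x2) ** 2]

sol = solve_ivp(f, [0.0, 50.0], [1.0, 0.5], rtol=1e-9, atol=1e-12, max_step=0.05)
x1, x2 = sol.y

V = a1 * x1**2 + x2**2                               # candidate Lyapunov function
Vdot = -2.0 * a2 * x2**2 * (b1 * x1 + b2 * x2) ** 2  # derivative derived above

print(np.max(np.diff(V)))   # V never increases along the trajectory
print(V[-1] < V[0])         # and has decreased overall
```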
Stability of time-varying systems:
Interestingly, if we freeze time and compute the
eigenvalues of A(t) at that instant, the results do not tell
us whether the system is stable, except in special
cases wherein the eigenvalues are changing "slowly."
Also, when a time-varying system has an unstable
eigenvalue, that does not necessarily imply that the
system is unstable overall!! (unless all of them are
unstable).
Lyapunov techniques are better for time-varying systems,
but are still very difficult.