1.1
In absolute (i.e. unconstrained) optimization there are second-order sufficient conditions in terms of the Hessian matrix
$$D^2F = \begin{pmatrix} F_{x_1 x_1} & F_{x_1 x_2} & \cdots & F_{x_1 x_n} \\ F_{x_2 x_1} & F_{x_2 x_2} & \cdots & F_{x_2 x_n} \\ \vdots & \vdots & & \vdots \\ F_{x_n x_1} & F_{x_n x_2} & \cdots & F_{x_n x_n} \end{pmatrix}.$$

1. Max. Suppose
$$\frac{\partial F}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n,$$
and the leading principal minors of $D^2F$ alternate in sign, starting from negative:
$$F_{x_1 x_1} < 0, \quad \begin{vmatrix} F_{x_1 x_1} & F_{x_1 x_2} \\ F_{x_2 x_1} & F_{x_2 x_2} \end{vmatrix} > 0, \quad \begin{vmatrix} F_{x_1 x_1} & F_{x_1 x_2} & F_{x_1 x_3} \\ F_{x_2 x_1} & F_{x_2 x_2} & F_{x_2 x_3} \\ F_{x_3 x_1} & F_{x_3 x_2} & F_{x_3 x_3} \end{vmatrix} < 0, \ \ldots$$
In other words, $Df(x^*) = 0$ and $D^2 f(x^*) < 0$ (negative definite). Then $x^*$ is a max.

2. Min. Suppose
$$\frac{\partial F}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n,$$
and all leading principal minors of $D^2F$ are positive:
$$F_{x_1 x_1} > 0, \quad \begin{vmatrix} F_{x_1 x_1} & F_{x_1 x_2} \\ F_{x_2 x_1} & F_{x_2 x_2} \end{vmatrix} > 0, \quad \begin{vmatrix} F_{x_1 x_1} & F_{x_1 x_2} & F_{x_1 x_3} \\ F_{x_2 x_1} & F_{x_2 x_2} & F_{x_2 x_3} \\ F_{x_3 x_1} & F_{x_3 x_2} & F_{x_3 x_3} \end{vmatrix} > 0, \ \ldots$$
In other words, $Df(x^*) = 0$ and $D^2 f(x^*) > 0$ (positive definite). Then $x^*$ is a min.

3. Saddle. Suppose
$$\frac{\partial F}{\partial x_i} = 0, \quad i = 1, 2, \ldots, n,$$
but $D^2 f(x^*)$ is indefinite (nonsingular, fitting neither of the two sign patterns). Then $x^*$ is a saddle point.
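These sign tests are easy to automate. The sketch below (plain numpy, with a helper name of my own choosing) classifies a critical point by the leading principal minors of the Hessian, exactly as in the three cases above.

```python
import numpy as np

def classify_critical_point(H, tol=1e-12):
    """Classify a critical point (all first partials = 0) from the
    Hessian H using the leading-principal-minor tests above."""
    n = H.shape[0]
    minors = [np.linalg.det(H[:k, :k]) for k in range(1, n + 1)]
    if all(m > tol for m in minors):                      # +, +, +, ...
        return "min"
    if all((-1) ** k * m > tol for k, m in enumerate(minors, 1)):
        return "max"                                      # -, +, -, ...
    if abs(minors[-1]) > tol:                             # nonsingular, indefinite
        return "saddle"
    return "inconclusive"

# F(x, y) = x^2 - y^2 at (0, 0): Hessian diag(2, -2), a saddle.
print(classify_critical_point(np.diag([2.0, -2.0])))  # saddle
```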
1.2
Recall one special case of constrained optimization, where the objective function is a quadratic form and all constraints are linear:
$$f(x_1, \ldots, x_n) = \sum_{i,j} a_{ij} x_i x_j,$$
with $m$ linear constraints whose coefficient matrix is $B = (B_{ij})$. The bordered matrix is
$$H = \begin{pmatrix}
0 & \cdots & 0 & B_{11} & \cdots & B_{1n} \\
\vdots & & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & B_{m1} & \cdots & B_{mn} \\
B_{11} & \cdots & B_{m1} & a_{11} & \cdots & a_{1n} \\
\vdots & & \vdots & \vdots & & \vdots \\
B_{1n} & \cdots & B_{mn} & a_{n1} & \cdots & a_{nn}
\end{pmatrix} = \begin{pmatrix} 0 & B \\ B^T & A \end{pmatrix}.$$
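This block structure can be formed directly with `numpy.block`; the particular `B` and `A` below are hypothetical placeholders, not taken from the text.

```python
import numpy as np

B = np.array([[3.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])   # m x n matrix of constraint coefficients
A = np.eye(3)                      # symmetric matrix (a_ij) of the quadratic form

# H = [[0, B], [B^T, A]] as in the display above
H = np.block([[np.zeros((2, 2)), B],
              [B.T, A]])
print(H.shape)                     # (5, 5)
```

Since `A` is symmetric, `H` is symmetric as well, which is a quick consistency check on the assembly.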
1.3
Consider now the problem of maximizing f (x1 , ..., xn ) on the constraint set
$$C_h = \{x \in \mathbb{R}^n : h_i(x) = c_i, \ i = 1, \ldots, k\}.$$
As usual we consider the Lagrangian
$$L(x_1, \ldots, x_n, \lambda_1, \ldots, \lambda_k) = f(x_1, \ldots, x_n) - \sum_{i=1}^{k} \lambda_i \,(h_i(x) - c_i),$$
whose bordered Hessian is
$$H = \begin{pmatrix}
0 & \cdots & 0 & \dfrac{\partial h_1}{\partial x_1} & \cdots & \dfrac{\partial h_1}{\partial x_n} \\
\vdots & & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & \dfrac{\partial h_k}{\partial x_1} & \cdots & \dfrac{\partial h_k}{\partial x_n} \\
\dfrac{\partial h_1}{\partial x_1} & \cdots & \dfrac{\partial h_k}{\partial x_1} & \dfrac{\partial^2 L}{\partial x_1^2} & \cdots & \dfrac{\partial^2 L}{\partial x_1 \partial x_n} \\
\vdots & & \vdots & \vdots & & \vdots \\
\dfrac{\partial h_1}{\partial x_n} & \cdots & \dfrac{\partial h_k}{\partial x_n} & \dfrac{\partial^2 L}{\partial x_n \partial x_1} & \cdots & \dfrac{\partial^2 L}{\partial x_n^2}
\end{pmatrix}.$$
Example 1. Find the extremum of f(x, y) = xy subject to h(x, y) = x + y = 6. The first order conditions give the critical point x = 3, y = 3 (with λ = 3), and the bordered Hessian is
$$H = \begin{pmatrix} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{pmatrix}.$$
Here n = 2, k = 1, so we have to check just n − k = 2 − 1 = 1 last leading principal minors, so just H itself. Calculation shows that det H = 2 > 0 has the sign of (−1)² = (−1)ⁿ, so our critical point (x = 3, y = 3) is a max.
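As a sanity check, this bordered Hessian can be assembled and its determinant verified numerically; the constraint gradient (1, 1) and the Hessian of the Lagrangian used below are read off from the matrix above.

```python
import numpy as np

grad_h = np.array([1.0, 1.0])          # gradient of h(x, y) = x + y
hess_L = np.array([[0.0, 1.0],         # Hessian of L in (x, y)
                   [1.0, 0.0]])

# border the Hessian of L with the constraint gradient
H = np.block([[np.zeros((1, 1)), grad_h[None, :]],
              [grad_h[:, None], hess_L]])
print(np.linalg.det(H))                # ~2.0, matching det H = 2 above
```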
Example 2. Find the extremum of F(x, y, z) = x^2 + y^2 + z^2 subject to h₁(x, y, z) = 3x + y + z = 5, h₂(x, y, z) = x + y + z = 1.
Solution. The Lagrangian here is
$$L(x, y, z, \lambda_1, \lambda_2) = x^2 + y^2 + z^2 - \lambda_1 (3x + y + z - 5) - \lambda_2 (x + y + z - 1).$$
The first order conditions give the solution
$$x = 2, \quad y = -\frac{1}{2}, \quad z = -\frac{1}{2}, \quad \lambda_1 = \frac{5}{2}, \quad \lambda_2 = -\frac{7}{2}.$$
Now it is time to switch to the bordered Hessian in order to tell whether it is a maximum, a minimum or neither:
$$H = \begin{pmatrix}
0 & 0 & 3 & 1 & 1 \\
0 & 0 & 1 & 1 & 1 \\
3 & 1 & 2 & 0 & 0 \\
1 & 1 & 0 & 2 & 0 \\
1 & 1 & 0 & 0 & 2
\end{pmatrix}.$$
Here n = 3, k = 2, so we have to check just n − k = 3 − 2 = 1 last leading principal minor, i.e. H itself. Calculation shows det H = 16 > 0, which has the sign of (−1)ᵏ = (−1)², so the critical point is a min.
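Because the first order conditions of Example 2 are linear, both the solution and det H can be checked numerically; this is a verification sketch, not part of the original solution.

```python
import numpy as np

# FOC of L = x^2+y^2+z^2 - l1*(3x+y+z-5) - l2*(x+y+z-1), plus constraints
M = np.array([[2.0, 0.0, 0.0, -3.0, -1.0],   # 2x - 3*l1 - l2 = 0
              [0.0, 2.0, 0.0, -1.0, -1.0],   # 2y -   l1 - l2 = 0
              [0.0, 0.0, 2.0, -1.0, -1.0],   # 2z -   l1 - l2 = 0
              [3.0, 1.0, 1.0,  0.0,  0.0],   # 3x + y + z = 5
              [1.0, 1.0, 1.0,  0.0,  0.0]])  #  x + y + z = 1
x, y, z, l1, l2 = np.linalg.solve(M, np.array([0.0, 0.0, 0.0, 5.0, 1.0]))
print(x, y, z, l1, l2)    # 2.0 -0.5 -0.5 2.5 -3.5

# Bordered Hessian: constraint gradients border the Hessian of L, which is 2I
G = np.array([[3.0, 1.0, 1.0], [1.0, 1.0, 1.0]])
H = np.block([[np.zeros((2, 2)), G], [G.T, 2.0 * np.eye(3)]])
print(np.linalg.det(H))   # ~16.0
```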
Example 3. Find the extremum of f(x, y) = xy subject to h(x, y) = x^2 + y^2 = 2.
Solution. The Lagrangian is L(x, y, λ) = xy − λ(x^2 + y^2 − 2); the first order conditions give, among others, the critical points (x = 1, y = 1, λ = 0.5) and (x = 1, y = −1, λ = −0.5), and the bordered Hessian is
$$H = \begin{pmatrix} 0 & 2x & 2y \\ 2x & -2\lambda & 1 \\ 2y & 1 & -2\lambda \end{pmatrix}.$$
Here n = 2, k = 1, so we have to check just n − k = 2 − 1 = 1 leading principal minor, H₃ = H.
Checking H at (x = 1, y = 1, λ = 0.5) we obtain det H = 16 > 0, that is, it has the sign of (−1)ⁿ = (−1)², so this point is a max.
Checking H at (x = 1, y = −1, λ = −0.5) we obtain det H = −16 < 0, that is, it has the sign of (−1)ᵏ = (−1)¹, so this point is a min.
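Independently of the bordered Hessian, the classification can be checked by scanning the constraint curve. This assumes the underlying problem is f(x, y) = xy on x^2 + y^2 = 2, which is consistent with the critical points (1, 1, λ = 0.5) and (1, −1, λ = −0.5) above.

```python
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 100001)
x = np.sqrt(2.0) * np.cos(t)      # parametrize the circle x^2 + y^2 = 2
y = np.sqrt(2.0) * np.sin(t)
f = x * y                          # equals sin(2t) along the circle
print(f.max(), f.min())            # ~1.0 (e.g. at (1,1)) and ~-1.0 (e.g. at (1,-1))
```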
1.4
For a problem with inequality constraints g₁, ..., g_b (binding at the candidate point) and equality constraints h₁, ..., h_m, the bordered Hessian has the analogous form, with one border row per constraint:
$$H = \begin{pmatrix}
0 & \cdots & 0 & 0 & \cdots & 0 & \dfrac{\partial g_1}{\partial x_1} & \cdots & \dfrac{\partial g_1}{\partial x_n} \\
\vdots & & \vdots & \vdots & & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & 0 & \cdots & 0 & \dfrac{\partial g_b}{\partial x_1} & \cdots & \dfrac{\partial g_b}{\partial x_n} \\
0 & \cdots & 0 & 0 & \cdots & 0 & \dfrac{\partial h_1}{\partial x_1} & \cdots & \dfrac{\partial h_1}{\partial x_n} \\
\vdots & & \vdots & \vdots & & \vdots & \vdots & & \vdots \\
0 & \cdots & 0 & 0 & \cdots & 0 & \dfrac{\partial h_m}{\partial x_1} & \cdots & \dfrac{\partial h_m}{\partial x_n} \\
\dfrac{\partial g_1}{\partial x_1} & \cdots & \dfrac{\partial g_b}{\partial x_1} & \dfrac{\partial h_1}{\partial x_1} & \cdots & \dfrac{\partial h_m}{\partial x_1} & \dfrac{\partial^2 L}{\partial x_1^2} & \cdots & \dfrac{\partial^2 L}{\partial x_1 \partial x_n} \\
\vdots & & \vdots & \vdots & & \vdots & \vdots & & \vdots \\
\dfrac{\partial g_1}{\partial x_n} & \cdots & \dfrac{\partial g_b}{\partial x_n} & \dfrac{\partial h_1}{\partial x_n} & \cdots & \dfrac{\partial h_m}{\partial x_n} & \dfrac{\partial^2 L}{\partial x_n \partial x_1} & \cdots & \dfrac{\partial^2 L}{\partial x_n^2}
\end{pmatrix}.$$
1.5
Back to the problem:
Maximize f(x₁, x₂) subject to h(x₁, x₂) = a.
The shadow price formula here looks as follows: if F(a) denotes the optimal value, then
$$\frac{dF}{da} = \lambda^*(a),$$
that is, the multiplier measures the rate at which the optimal value responds to a relaxation of the constraint.
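A quick numeric illustration on a made-up problem (not from the text): maximize f = xy subject to x + y = a. Then x* = y* = a/2, the multiplier is λ*(a) = a/2, and F(a) = a²/4, so dF/da equals the multiplier.

```python
# Illustration: maximize x*y subject to x + y = a.
# Optimum: x = y = a/2, multiplier mu = a/2, value F(a) = (a/2)^2.
def F(a):
    return (a / 2.0) ** 2

a = 4.0
mu = a / 2.0                                   # shadow price at a = 4
dF_da = (F(a + 1e-6) - F(a - 1e-6)) / 2e-6     # central finite difference
print(dF_da, mu)                               # both ~2.0
```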
1.6
Envelope Theorems*
The above theorems about the meaning of the multiplier are particular cases of so-called Envelope Theorems.
1.6.1
$$\frac{d}{da} f(x^*(a), a) = \frac{\partial}{\partial a} L(x^*(a), \lambda^*(a), a).$$
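The theorem can be checked numerically on a small made-up instance where the objective itself depends on a: maximize f(x, y, a) = a·xy subject to x + y = 2. The maximizer x* = y* = 1 does not move with a, F(a) = a, and the envelope theorem says dF/da = ∂L/∂a = xy evaluated at the optimum.

```python
import numpy as np

def F(a):
    # optimal value of a*x*y on x + y = 2, found by brute-force scan
    y = np.linspace(0.0, 2.0, 20001)
    return (a * (2.0 - y) * y).max()

a = 3.0
dF_da = (F(a + 1e-6) - F(a - 1e-6)) / 2e-6   # central finite difference
partial_L = 1.0 * 1.0                        # dL/da = x*y at x = y = 1
print(dF_da, partial_L)                      # both ~1.0
```

Note that only the explicit partial derivative of L with respect to a is needed; the slow movement of x*(a) (here none at all) contributes nothing to first order.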
Remark. Actually, when f(x, a) = f(x), that is, the objective function does not depend on a, the theorem reduces to the shadow price formula above.
For the problem of maximizing xy subject to x^2 + a y^2 = 2 (so that a = 1 gives the constraint x^2 + y^2 = 2 considered above) we have
$$\frac{d}{da} L(x, y, a) = -\lambda y^2,$$
thus
$$F'(1) = \frac{d}{da} L(x, y, a)\Big|_{(x=1,\ y=1,\ \lambda=0.5)} = -\lambda y^2\Big|_{(x=1,\ y=1,\ \lambda=0.5)} = -0.5 \cdot 1^2 = -0.5,$$
and
$$F(1.1) \approx F(1) + F'(1) \cdot 0.1 = 1 + (-0.5) \cdot 0.1 = 1 - 0.05 = 0.95.$$
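The linear estimate can be compared with the exact optimal value. This assumes the problem behind the numbers above is maximizing xy subject to x^2 + a·y^2 = 2, which reproduces F(1) = 1, λ = 0.5 and F'(1) = −0.5.

```python
import numpy as np

def F(a):
    # optimal value of x*y on the ellipse x^2 + a*y^2 = 2, by scanning
    t = np.linspace(0.0, 2.0 * np.pi, 200001)
    x = np.sqrt(2.0) * np.cos(t)
    y = np.sqrt(2.0 / a) * np.sin(t)
    return (x * y).max()           # exact value is 1/sqrt(a)

print(F(1.0))                      # ~1.0
print(F(1.1))                      # ~0.9535, vs. the linear estimate 0.95
```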
Exercises
1. Write out the bordered Hessian for a constrained optimization problem
with four choice variables and two constraints. Then state specifically the
second-order sufficient condition for a maximum and for a minimum respectively.
2. For the following problems
(i) find stationary values,
(ii) ascertain whether they are min or max,
(iii) find out whether relaxation of the constraint (say, an increase of the budget) will increase or decrease the optimal value,
(iv) and at what rate.
(a) f (x, y) = xy, h(x, y) = x + 2y = 2;
(b) f (x, y) = x(y + 4), h(x, y) = x + y = 8;
(c) f (x, y) = x − 3y − xy, h(x, y) = x + y = 6;
(d) f (x, y) = 7 − y + x^2, h(x, y) = x + y = 0.
3. Find all the stationary points of f (x, y) = x^2 y^2 subject to x^2 + y^2 = 2
and check the second order conditions.
4. Find all the stationary points of f (x, y, z) = x^2 y^2 z^2 subject to x^2 +
y^2 + z^2 = 3 and check the second order conditions.
5. Find the maximum and minimum values of f (x, y) = x^2 + y^2 on the
constraint set h(x, y) = x^2 + xy + y^2 = 3. Redo the problem, this time using
the constraint h(x, y) = 3.3.
Now use the shadow price to estimate the effect of increasing the constraint by 0.3 units
and compare with the previous result.
And now estimate the change of the optimal value when the objective function is changed to f (x, y) = x^2 + 1.2y^2, keeping h = x^2 + xy + y^2 = 3.
6. Find all the stationary points of f (x, y, z) = x + y + z^2 subject to
x + y^2 + z^2 = 1 and y = 0. Check the second order conditions and classify
the minima and maxima.
2
Homework
1. Exercise 1.
2. Exercise 2d.
3. Exercise 4.
4. Exercise 6.
5. Exercise 19.3 from [Simon].