
CHAPTER 13

Optimization
Exercises 13.1
1 (a) The derivative is f'(x) = e^x - e^(-x), which is zero if and only if x = 0. Furthermore,
f'(x) < 0 for x < 0 and f'(x) > 0 for x > 0. Therefore f is unimodal on (-∞, ∞) and has
an absolute minimum at (x, y) = (0, 2).
1 (b) The expression x^6 is positive if and only if x ≠ 0, and equals zero at x = 0. Therefore
(0, 0) is the absolute minimum. The derivative is f'(x) = 6x^5, which is negative for x < 0
and positive for x > 0, so f is unimodal on (-∞, ∞).
1 (c) The derivative f'(x) = 8x^3 + 1 is equal to zero if and only if x = -1/2, is negative for
x < -1/2, and is positive for x > -1/2. Therefore (-1/2, -3/8) is the absolute minimum
of the unimodal function.
1 (d) The derivative is f'(x) = 1 - 1/x, which equals zero if and only if x = 1, is negative for
0 < x < 1, and positive for x > 1. Thus f is unimodal on (0, ∞) and the absolute minimum
occurs at (1, 1).
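The four functions themselves are not quoted above, but they can be inferred from the stated derivatives and minima: (a) f(x) = e^x + e^(-x), (b) f(x) = x^6, (c) f(x) = 2x^4 + x, (d) f(x) = x - ln x. Under that assumption, a quick Python sanity check of the four claimed minima:

```python
import math

# Functions inferred from the derivatives quoted in the solutions above;
# the constant in (a) follows from the stated minimum value f(0) = 2.
cases = [
    (lambda x: math.exp(x) + math.exp(-x), 0.0,  2.0),    # (a)
    (lambda x: x**6,                        0.0,  0.0),    # (b)
    (lambda x: 2*x**4 + x,                 -0.5, -0.375),  # (c)
    (lambda x: x - math.log(x),             1.0,  1.0),    # (d)
]

for f, xstar, ystar in cases:
    assert abs(f(xstar) - ystar) < 1e-12
    # consistent with unimodality: nearby points are strictly larger
    h = 1e-3
    assert f(xstar - h) > f(xstar) < f(xstar + h)
```

Each assertion passes, confirming the (x, y) pairs listed in the solutions.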
2 (a) (, 1)
2 (b) (1, 4)
2 (c) (0, 5)
2 (d) (ln 2, 2 - 2 ln 2)
Computer Problems 13.1
1 (a) The plot shows that the interval [0, 1] contains a relative minimum. According to Theorem
13.2, the number k of Golden Section Search steps needed satisfies

g^k (1 - 0)/2 < 0.5 × 10^(-5),

where g = (√5 - 1)/2. The inequality holds for k ≥ 24. Program 13.1 is an implementation
of Golden Section Search that can be used to find the minimum. The command

>> x=gss(inline('2*x^4+3*x^2-4*x+5'),0,1,24)

results in convergence to the minimum x = 1/2.
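Program 13.1 is MATLAB code; for readers working outside MATLAB, the following Python sketch of Golden Section Search (a reimplementation of the same idea, not the book's program) reproduces both the step count and the minimizer:

```python
import math

def gss(f, a, b, k):
    """Golden Section Search: each of the k steps shrinks the bracket by g."""
    g = (math.sqrt(5) - 1) / 2
    x1 = a + (1 - g) * (b - a)
    x2 = a + g * (b - a)
    f1, f2 = f(x1), f(x2)
    for _ in range(k):
        if f1 < f2:                      # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - g) * (b - a)
            f1 = f(x1)
        else:                            # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + g * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# step count required by the inequality g^k (1 - 0)/2 < 0.5e-5
g = (math.sqrt(5) - 1) / 2
k = math.ceil(math.log(1e-5) / math.log(g))
print(k)                                 # 24

x = gss(lambda t: 2*t**4 + 3*t**2 - 4*t + 5, 0, 1, k)
print(round(x, 4))                       # 0.5
```

After 24 steps the bracket has length g^24 ≈ 10^(-5), so the midpoint is within 0.5 × 10^(-5) of the minimizer, as Theorem 13.2 promises.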
1 (b) From the plot below, there are two relative minima. The intervals [-2.5, -1.5] and [0.5, 1.5]
each contain a minimum. The number of steps needed for each interval is 24, as in (a). The
minima x = -2 and x = 1 are found by gss.
1 (c) Similar to (a). The interval [0, 1] contains a relative minimum. Applying 24 steps of gss
gives the approximation x = 0.47033.
1 (d) Similar to (a). The interval [1, 2] contains a relative minimum. Applying 24 steps of gss
provides the approximation x = 1.43791.
[Figure: plots of the four functions for parts (a)-(d), showing the relative minima and the bracketing intervals listed above.]
2 (a) 1/2
2 (b) -2, 1
2 (c) 0.47033
2 (d) 1.43791
3 (a) The squared distance from the point (x, 1/x) to (2, 3) is D(x) = (x - 2)^2 + (1/x - 3)^2.
The derivative is

D'(x) = 2x - 4 - 2/x^3 + 6/x^2.

Newton's Method applied to find a root of D'(x) converges to r = 0.358555, corresponding
to the nearest point (0.358555, 2.788973) on the curve y = 1/x, at distance √D(r) ≈ 1.6550 from (2, 3).
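A minimal Newton iteration on D'(x) can be sketched in Python; the second derivative below comes from differentiating D'(x) term by term, and the starting guess 0.4 is an arbitrary choice inside the bracketing interval:

```python
def Dp(x):
    """D'(x) = 2x - 4 - 2/x^3 + 6/x^2, as derived above."""
    return 2*x - 4 - 2/x**3 + 6/x**2

def Dpp(x):
    """D''(x) = 2 + 6/x^4 - 12/x^3, differentiating D'(x) term by term."""
    return 2 + 6/x**4 - 12/x**3

x = 0.4                       # starting guess in (0, 1)
for _ in range(50):
    x -= Dp(x) / Dpp(x)       # Newton step for a root of D'

print(x)                      # ≈ 0.358555
```

The iterate settles at r ≈ 0.358555 with D'(r) numerically zero, matching the solution above.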
3 (b) Applying Golden Section Search on the interval [0, 1],

>> x=gss(inline('(x-2)^2+(1/x-3)^2'),0,1,30)
produces x = 0.358555 as in part (a).
4 (a) Newton's Method finds the point of maximum distance to be (0.335281, 0.628079).
4 (b) Golden Section Search finds the same point as in part (a).
5 Program 13.3 can be applied as

x=neldermead(inline('exp(-x(1)^2*x(2)^2)+(x(1)-1)^2+(x(2)-1)^2'),[1;1],1,60)

to converge to (1.20881759, 1.20881759) within 8 decimal places.
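Without MATLAB at hand, the reported minimizer can be cross-checked in Python by confirming that the gradient of f(x1, x2) = e^(-x1^2 x2^2) + (x1 - 1)^2 + (x2 - 1)^2 (the objective encoded in the inline string above) vanishes there, here via central differences:

```python
import math

def f(x, y):
    # objective from the inline string above
    return math.exp(-x**2 * y**2) + (x - 1)**2 + (y - 1)**2

s = 1.20881759               # reported minimizer (both coordinates)
h = 1e-6                     # central-difference step
gx = (f(s + h, s) - f(s - h, s)) / (2 * h)
gy = (f(s, s + h) - f(s, s - h)) / (2 * h)

# both partial derivatives are numerically zero at the reported point
print(abs(gx) < 1e-4, abs(gy) < 1e-4)   # True True
```

The shared value of the two coordinates reflects the symmetry of f under swapping x1 and x2.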
6 (a) (1.132638, 0.465972), (0.465972, 1.132638)
6 (b) (0.67633357728, 0.67633357728)
7 Program 13.3 can be applied as

x=neldermead(inline('100*(x(2)-x(1)^2)^2+(x(1)-1)^2'),[0;0],1,100)

to converge to (1, 1).
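As a cross-check, a compact pure-Python Nelder-Mead (a sketch following the standard reflect/expand/contract/shrink rules, not Sauer's Program 13.3) also drives the Rosenbrock function to (1, 1) from the origin:

```python
def nelder_mead(f, x0, steps=2000, r=0.1):
    """Minimal Nelder-Mead with the standard coefficients 1, 2, 1/2, 1/2."""
    n = len(x0)
    # initial simplex: x0 plus a perturbation of size r along each axis
    simplex = [list(x0)] + [
        [x0[j] + (r if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(steps):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # centroid of every vertex except the worst
        c = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        xr = [2 * c[j] - worst[j] for j in range(n)]          # reflection
        fr = f(xr)
        if fr < f(best):
            xe = [3 * c[j] - 2 * worst[j] for j in range(n)]  # expansion
            simplex[-1] = xe if f(xe) < fr else xr
        elif fr < f(simplex[-2]):
            simplex[-1] = xr                                  # accept reflection
        else:
            if fr < f(worst):    # outside contraction, toward the reflected point
                xc = [(c[j] + xr[j]) / 2 for j in range(n)]
                ok = f(xc) <= fr
            else:                # inside contraction, toward the worst point
                xc = [(c[j] + worst[j]) / 2 for j in range(n)]
                ok = f(xc) < f(worst)
            if ok:
                simplex[-1] = xc
            else:                # shrink every vertex halfway toward the best
                simplex = [best] + [[(p[j] + best[j]) / 2 for j in range(n)]
                                    for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

rosen = lambda x: 100 * (x[1] - x[0]**2)**2 + (x[0] - 1)**2
x = nelder_mead(rosen, [0.0, 0.0])
print(x)   # close to [1, 1]
```

The step budget of 2000 is generous; the method reaches the neighborhood of (1, 1) long before that.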
Exercises 13.2
1 Minimum is (1.2088176, 1.2088176). Different initial conditions will yield answers that differ
by about ε_mach^(1/2).
2 (a) (1.132638, 0.465972), (0.465972, 1.132638)
2 (b) (0.6763, 0.6763)
3 (a) Newton's Method, when applied to the gradient of the Rosenbrock function F(x1, x2) =
100(x2 - x1^2)^2 + (x1 - 1)^2, converges to the minimum (x1, x2) = (1, 1). Newton's Method will
be accurate to machine precision since it is finding a simple root.
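Since the gradient and Hessian of the Rosenbrock function are available in closed form, the two-dimensional Newton iteration is easy to sketch in Python (the starting point (0, 0) is an arbitrary choice):

```python
def grad(x, y):
    # gradient of F(x, y) = 100(y - x^2)^2 + (x - 1)^2
    return -400 * x * (y - x**2) + 2 * (x - 1), 200 * (y - x**2)

def hess(x, y):
    # Hessian entries H11, H12 (= H21), H22
    return 1200 * x**2 - 400 * y + 2, -400 * x, 200.0

x, y = 0.0, 0.0                        # starting guess
for _ in range(20):
    g0, g1 = grad(x, y)
    h11, h12, h22 = hess(x, y)
    det = h11 * h22 - h12 * h12        # 2x2 determinant
    x += (-h22 * g0 + h12 * g1) / det  # solve H * delta = -g by Cramer's rule
    y += (h12 * g0 - h11 * g1) / det

print(x, y)   # 1.0 1.0
```

From (0, 0) this iteration happens to land on (1, 1) exactly after two steps, consistent with the machine-precision accuracy claimed above.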
3 (b) Steepest Descent also converges to (1, 1), but with only about 8 digits of accuracy in double
precision, since the error is of size ε_mach^(1/2).
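The ε_mach^(1/2) limit is a property of any smooth minimum: f is locally quadratic there, so perturbing x by about 10^(-9) changes f by about 10^(-18), which is invisible in double precision. A one-line Python illustration on a generic quadratic (not the Rosenbrock function itself):

```python
f = lambda x: 1 + (x - 1)**2   # generic smooth function with minimum at x = 1

# a 1e-9 error in x changes f by ~1e-18, far below machine epsilon (~2.2e-16),
# so function values cannot distinguish the two points
print(f(1 + 1e-9) == f(1.0))   # True
```

Any method that compares only function values, such as Steepest Descent with a line search, therefore cannot resolve the minimizer beyond roughly 8 decimal digits.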
4 (a) (1.132638, 0.465972), (0.465972, 1.132638)
4 (b) (0.6763, 0.6763)
5 (a) Implement Conjugate Gradient Search as on page 596, using Successive Parabolic Interpolation
as the one-dimensional minimizer. Depending on the initial guess, the method converges to the
minimum (1.132638, 0.465972) or to the minimum (0.465972, 1.132638).
5 (b) Similar to (a). Conjugate Gradient Search with SPI converges, depending on the initial guess,
to one of the two minima (0.6763, 0.6763) and (-0.6763, -0.6763).
6 (a) (1.29034, 0.41771)
6 (b) (0.84545, 0.70095)
