
Title:

Nonlinear Optimization (Unconstrained) - Direct Search Methods

Submitted to
Dr. Abdullahil Azeem
Professor, IPE Dept.
BUET
Submitted by
Group-3
Course: AEM 6112 Quantitative Analysis

SL Roll Name
1 0419082122 Rahul Carnalious Snal
2 0419082124 Rajib Debnath
3 0419082120 Sujit Saha
4 0419082102 Kh Abdullah Al Masud
5 0419082107 Md. Mahadi Hassan
6 1015082106 Rasel Alam
Contents
• Introduction
• Golden Section Search Method
• Dichotomous Search Method
• Problem
• Results and Plot
• MATLAB Code
• Conclusion
Introduction
• The solution methods of nonlinear programming
can generally be classified as either direct or indirect
algorithms.
• The idea of a direct search method for unconstrained
nonlinear optimization is to progressively narrow an
interval of uncertainty that is known to include the
optimum solution point.
• There are two commonly used direct search methods:
◦ Dichotomous Search Method
◦ Golden Section Search Method
Golden Section Search Method
• The Golden Section Search Method chooses x1 and x2
such that one of the two function evaluations in each step
can be reused in the next step.
• The golden ratio is the ratio r satisfying r² = 1 − r, that is,
r = (√5 − 1)/2 ≈ 0.618. With this choice, when the interval
shrinks by the factor r, the surviving test point already sits
at the correct position inside the new interval.
• Golden Section Search Method to maximize f(x) over the interval
a ≤ x ≤ b:

• STEP 1: Initialize: choose a tolerance t > 0.

• STEP 2: Set r = (√5 − 1)/2 and define the test points:
x1 = a + (1 − r)(b − a), x2 = a + r(b − a)

• STEP 3: Calculate f(x1) and f(x2).


• STEP 4: (For a maximization problem)
If f(x1) ≤ f(x2), then set a = x1, x1 = x2;
else set b = x2, x2 = x1. Find the new x1 or x2
using the formulas in Step 2.

• STEP 5: If the length of the new interval from
Step 4 is less than the tolerance t specified,
then stop. Otherwise go back to Step 3.

• STEP 6: Estimate x* as the midpoint of the final
interval and compute MAX = f(x*).
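
The deck gives MATLAB code only for the dichotomous method, so here is a minimal sketch of the golden section steps above. It is our illustration, not from the slides: the variable names (r, t, f1, f2, xstar) are ours, and for concreteness it uses the same function f(x) = -x^2 - 2*x that appears later in the Problem section.

% Golden section search to maximize f(x) on [a,b] -- illustrative sketch
f = @(x) -x.^2 - 2*x;       % example function from the Problem section
a = -3;  b = 2;             % initial interval of uncertainty
t = 0.0001;                 % STEP 1: tolerance on the interval length
r = (sqrt(5) - 1)/2;        % golden ratio, about 0.618

x1 = a + (1 - r)*(b - a);   % STEP 2: test points
x2 = a + r*(b - a);
f1 = f(x1);  f2 = f(x2);    % STEP 3

while (b - a) > t                        % STEP 5
    if f1 <= f2                          % STEP 4: optimum lies in [x1, b]
        a = x1;
        x1 = x2;  f1 = f2;               % reuse the previous evaluation
        x2 = a + r*(b - a);  f2 = f(x2); % only one new evaluation
    else                                 % optimum lies in [a, x2]
        b = x2;
        x2 = x1;  f2 = f1;               % reuse the previous evaluation
        x1 = a + (1 - r)*(b - a);  f1 = f(x1);
    end
end

xstar = (a + b)/2;                       % STEP 6
fprintf('At x = %.5f , Maximum value = %.5f\n', xstar, f(xstar));

Note how each pass through the loop computes exactly one new function value; the other is carried over, which is the reuse property stated at the start of this section.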
• To minimize a function y = f(x), either maximize −y or
reverse the direction of the inequality in Step 4.

• The advantage of the Golden Section Search Method is
that only one new test point must be computed at each
successive iteration.

• The length of the new interval of uncertainty is 61.8% of the
length of the previous interval of uncertainty, so starting from
an interval of length L, about ln(t/L)/ln(0.618) iterations are
needed to reach a tolerance t.
• How small should the tolerance be?
◦ If x* is the optimal point, then a Taylor expansion about x* gives
f(x) ≈ f(x*) + ½ f″(x*)(x − x*)²
◦ If we want the second term to be a fraction ε of the
first term, then
|x − x*| ≈ √ε · √( 2|f(x*)| / |f″(x*)| )
◦ As a rule of thumb, we need t to be no smaller than about
√ε_machine · |x*|, since ε cannot usefully be made smaller than
machine precision.
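
As a quick numeric check of this rule (our illustration, not from the slides), in IEEE double precision MATLAB reports:

eps        % machine epsilon: ans = 2.2204e-16
sqrt(eps)  % ans = 1.4901e-08 -- tolerances far below ~1e-8*|x*| buy nothing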


Dichotomous Search Method Algorithm

• This algorithm seeks the maximum of a unimodal
function f(x) over the interval a ≤ x ≤ b, which is
known to include the optimum point x*, where f(x) is
maximum.

• This method starts with I0 = (a, b) representing the
initial interval of uncertainty.
Steps
1. Input equation f(x)
2. Input interval [a, b]
3. Input δ [a smaller value is better]
4. Input ∆
∆ is the stopping threshold on |f(x1) − f(x2)|
5. Assign
• xL = a
• xR = b
• N = 0 [N is the iteration counter]
6. N = N + 1
7. x1 = (xL + xR)/2 − δ
8. x2 = (xL + xR)/2 + δ
9. If |f(x1) − f(x2)| ≤ ∆, go to Step 14
10. If f(x1) > f(x2), set xR = x2
11. If f(x1) < f(x2), set xL = x1
12. If f(x1) = f(x2), set xL = x1 and xR = x2
13. Go to Step 6
14. x* = (x1 + x2)/2
15. Max = f(x*)
16. Print x*, Max
Problem:

• Maximize f(x) = –x^2 – 2x over the interval
–3 ≤ x ≤ 2. Assume ∆ = 0.0001 and choose δ = 0.001.
(Analytically, f′(x) = –2x – 2 = 0 gives x* = –1 with
f(–1) = 1, so the search should land close to these values.)
Results
n=1, x1=-0.50100, x2=-0.49900, f(x1)=0.75100, f(x2)=0.74900
n=2, x1=-1.75050, x2=-1.74850, f(x1)=0.43675, f(x2)=0.43975
n=3, x1=-1.12575, x2=-1.12375, f(x1)=0.98419, f(x2)=0.98469
n=4, x1=-0.81337, x2=-0.81137, f(x1)=0.96517, f(x2)=0.96442
n=5, x1=-0.96956, x2=-0.96756, f(x1)=0.99907, f(x2)=0.99895
n=6, x1=-1.04766, x2=-1.04566, f(x1)=0.99773, f(x2)=0.99792
n=7, x1=-1.00861, x2=-1.00661, f(x1)=0.99993, f(x2)=0.99996

At x = -1.00761 , Maximum value = 0.99994


Plot

[Figure: plot of f(x) = –x^2 – 2x over –3 ≤ x ≤ 2, with the maximum marked at (-1.00761, 0.99994); y-axis from -8 to 1. Title: Maximization Problem.]
MATLAB Code
(Using the Dichotomous Search Method Algorithm)
clear all;
clc;

f = @(x) -x.^2 - 2*x;   % Objective function to maximize (anonymous function;
                        % avoids the Symbolic Math Toolbox and double() casts)
a = -3;                 % Left limit
b = 2;                  % Right limit
delta = 0.001;          % Offset delta: x1 and x2 straddle the midpoint by delta
tol = 0.0001;           % Stopping threshold Delta on |f(x1) - f(x2)|
                        % (renamed from "diff", which shadows a MATLAB built-in)

xL = a;
xR = b;
n = 1;
while true
    x1 = (xL + xR)/2 - delta;
    x2 = (xL + xR)/2 + delta;
    fprintf('n=%d, x1=%.5f, x2=%.5f, f(x1)=%.5f, f(x2)=%.5f\n', ...
            n, x1, x2, f(x1), f(x2));
    n = n + 1;
    if abs(f(x1) - f(x2)) <= tol
        break;
    end

    if f(x1) > f(x2)
        xR = x2;            % Maximum lies to the left of x2
    elseif f(x1) < f(x2)
        xL = x1;            % Maximum lies to the right of x1
    else
        xL = x1;            % Equal values: narrow from both sides
        xR = x2;
    end
end

xstar = (x1 + x2)/2;
fmax = f(xstar);            % "max"/"min" shadow built-ins, so use fmax
fprintf('At x = %.5f , Maximum value = %.5f\n', xstar, fmax);

x = -3:.1:2;
plot(x, f(x));              % Plot the objective over the interval
Conclusion

• The golden section search method converges more rapidly than the
dichotomous method because, in the dichotomous method, the narrowing
of the interval of uncertainty slows down appreciably as the interval
approaches 2δ (each iteration maps an interval of length I to one of
length I/2 + δ). In addition, each iteration of the golden section method
requires half the computations, because the method always recycles one of
the evaluations from the immediately preceding iteration.
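
To put numbers on this (our arithmetic, not from the slides): per new function evaluation, the golden section method shrinks the interval by the factor r ≈ 0.618, while the dichotomous method, which roughly halves the interval but spends two new evaluations per iteration, achieves only √(1/2) ≈ 0.707 per evaluation.

% Interval reduction per new function evaluation (illustrative comparison)
golden      = (sqrt(5) - 1)/2;  % one new evaluation per iteration: 0.6180
dichotomous = sqrt(1/2);        % two evaluations to halve the interval: 0.7071
fprintf('golden: %.4f, dichotomous: %.4f per evaluation\n', golden, dichotomous);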
