
Supplement to Chapter 5: LINEAR PROGRAMMING (Simplex)

Linear programming is a mathematical approach that lacks the visual features of the graphical approach, but it can handle more than two variables and is thus much more useful for solving real problems, which often involve a large number of variables. Linear programming models are used to help operations managers make decisions in many different areas, including allocation of scarce resources, assignment problems, transportation problems, and blending problems. Linear programming models are mathematical representations of constrained optimization problems. Their characteristics can be grouped into two categories: components and assumptions.

Four components provide the structure of a linear programming model:
1. Objective
2. Decision variables
3. Constraints
4. Parameters

Objective. The goal of an LP model: maximization or minimization. A maximization objective might involve profits, revenues, efficiency, or rate of return; a minimization objective might involve cost, time, distance traveled, or scrap.
Objective function. Mathematical statement of profit, cost, etc., per unit of output or input.
Decision variables. Amounts of either inputs or outputs.
Constraints. Limitations that restrict the available alternatives. The three types of constraints are less than or equal to (≤), greater than or equal to (≥), and equal to (=). A ≤ constraint implies an upper limit on the amount of some scarce resource. A ≥ constraint specifies a minimum that must be achieved in the final solution. The = constraint is more restrictive in the sense that it specifies exactly what a decision variable should equal.
Feasible solution space. The set of all feasible combinations of decision variables as defined by the constraints.

An LP model consists of a mathematical statement of the objective and a mathematical statement of each constraint. These statements consist of symbols that represent the decision variables and of numerical values called parameters.

In order for a linear programming model to be used effectively, certain assumptions must be satisfied:
1. Linearity. The impact of decision variables is linear in the constraints and in the objective function.
2. Divisibility. Non-integer values of decision variables are acceptable.
3. Certainty. Values of parameters are known and constant.
4. Non-negativity. Negative values of decision variables are unacceptable.

Model Formulation
Begin by identifying the decision variables. Very often, decision variables are the quantity of something, such as x1 = the quantity of Product 1. Generally, decision variables have profits, costs, times, or a similar measure of value associated with them. Knowing this can help you identify the decision variables in a problem.
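To make the components concrete, here is a minimal sketch in Python of a hypothetical two-product mix model (the products, profit figures, and resource amounts are invented for illustration, not taken from the text): maximize 60x1 + 50x2, subject to 4x1 + 10x2 ≤ 100 assembly hours, 2x1 + 1x2 ≤ 22 inspection hours, and 3x1 + 3x2 ≤ 39 cubic feet of storage, with x1, x2 ≥ 0. The arrays simply hold the parameters; the sketches in the later sections solve this same model graphically and with the simplex method.

import numpy as np

# Objective-function coefficients (profit per unit of x1 and x2).
c = np.array([60.0, 50.0])

# Constraint coefficients: one row per constraint, one column per decision variable.
A = np.array([[4.0, 10.0],   # assembly hours used per unit
              [2.0,  1.0],   # inspection hours used per unit
              [3.0,  3.0]])  # storage space used per unit

# Right-hand-side parameters: the available amount of each resource.
b = np.array([100.0, 22.0, 39.0])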

Constraints are restrictions or requirements on one or more decision variables. They refer to available amounts of resources such as labor, material, or machine time, or to minimum requirements, such as "make at least 10 units of Product 1." It can be helpful to give a name to each constraint. Some of the different kinds of constraints you will encounter are:
1. A constraint that refers to one or more decision variables.
2. A constraint that specifies a ratio.
3. A constraint that specifies a percentage of one or more other variables.

Graphical linear programming. A graphical method for finding optimal solutions to two-variable problems.

Outline. The graphical method of linear programming plots the constraints on a graph and identifies an area that satisfies all of the constraints. This area is referred to as the feasible solution space. Next, the objective function is plotted and used to identify the optimal point in the feasible solution space. The coordinates of the point can sometimes be read directly from the graph, although generally an algebraic determination of the coordinates is necessary.

The general procedure followed in the graphical approach:
1. Set up the objective function and the constraints in mathematical format.
2. Plot the constraints.
3. Identify the feasible solution space.
4. Plot the objective function.
5. Determine the optimum solution.

To solve:
1. Identify the decision variables.
2. Formulate the objective function.
3. Identify and formulate the constraints.
4. Add the non-negativity constraints.

Plotting constraints:
1. Replace the inequality sign with an equal sign. This transforms the constraint into the equation of a straight line.
2. Determine where the line intersects each axis.
a. To find where it crosses the x2 axis, set x1 equal to zero and solve the equation for the value of x2.
b. To find where it crosses the x1 axis, set x2 equal to zero and solve the equation for the value of x1.
3. Mark these intersections on the axes, and connect them with a straight line.
4. Indicate by shading whether the inequality is greater than or less than.
5. Repeat steps 1 through 4 for each constraint.

Identifying the Feasible Solution Space
The feasible solution space is the set of all points that satisfy all constraints.

Plotting the Objective Function Line
The objective function is actually a family of lines; the lines are parallel, but each represents a different amount.

Solutions and Corner Points
The feasible solution space in graphical linear programming is a polygon. Moreover, the solution to any problem will be at one of the corner points of the polygon. In rare instances, the objective function will be parallel to one of the constraint lines. When this happens, every combination of x1 and x2 on the segment of the constraint that touches the feasible solution space represents an optimal solution.
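The corner-point property can be checked numerically. The sketch below, assuming the same hypothetical product-mix numbers introduced earlier, intersects each pair of constraint lines (including the two axes), keeps only the intersections that satisfy every constraint, and evaluates the objective function at each feasible corner point; the largest value identifies the optimum.

from itertools import combinations
import numpy as np

# Constraint lines a1*x1 + a2*x2 = b: the three resource limits plus the two axes.
A = np.array([[4.0, 10.0],   # assembly
              [2.0,  1.0],   # inspection
              [3.0,  3.0],   # storage
              [1.0,  0.0],   # the line x1 = 0 (the x2 axis)
              [0.0,  1.0]])  # the line x2 = 0 (the x1 axis)
b = np.array([100.0, 22.0, 39.0, 0.0, 0.0])
c = np.array([60.0, 50.0])   # objective-function coefficients

corners = []
for i, j in combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-9:                  # parallel lines never intersect
        continue
    p = np.linalg.solve(M, b[[i, j]])                 # intersection of the two lines
    # Keep the point only if it satisfies all three resource limits and non-negativity.
    if np.all(A[:3] @ p <= b[:3] + 1e-9) and np.all(p >= -1e-9):
        corners.append(p)

best = max(corners, key=lambda p: c @ p)              # evaluate Z at every feasible corner
print("optimal corner:", best, "objective value:", c @ best)   # roughly [9, 4] and 740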

Minimization
Two important differences from maximization: one is that the constraints are usually greater than or equal to (≥) instead of less than or equal to (≤). The other difference is that the optimal point is the one closest to the origin.

Binding constraint. A constraint that forms the optimal corner point of the feasible solution space.
Surplus. When the optimal values of the decision variables are substituted into a ≥ constraint and the resulting value exceeds the right-side value.
Slack. When the optimal values of the decision variables are substituted into a ≤ constraint and the resulting value is less than the right-side value.

Simplex Method. A linear programming algorithm that can solve problems having more than two decision variables. The simplex technique involves generating a series of solutions in tabular form, called tableaus. A tableau is one in a series of solutions in tabular form, each corresponding to a corner point of the feasible solution space. The process:
1. Set up the initial tableau.
2. Develop a revised tableau using the information contained in the first tableau.
3. Inspect it to see if it is optimal.
4. Repeat steps 2 and 3 until no further improvement is possible.

Setting Up the Initial Tableau
Obtaining the initial tableau is a two-step process. First, we must rewrite the constraints to make them equalities and modify the objective function slightly. Then we put this information into a table and perform a few computations that are needed to complete the table.

The Test for Optimality
If all the values in the C-Z row of a tableau are zero or negative, the optimal solution has been obtained.

Developing the Second Tableau
Row pivot values. The numbers in the column of the entering variable in the initial tableau, used to determine which variable will leave the solution.
New pivot row. The row of the leaving variable in the second tableau; the foundation on which to develop the other rows. The procedure:
1. Find the value that is at the intersection of the constraint row and the entering-variable column.
2. Multiply each value in the new pivot row by this value.
3. Subtract the resulting values, column by column, from the current row values.

Developing the Third Tableau
1. Determine the entering variable: find the column with the largest positive value in the C-Z row.
2. Determine the leaving variable: divide the solution quantity in each row by the row pivot value.
3. Divide each value in the row of the leaving variable by the row pivot value to obtain the new pivot-row values.
4. Compute values for the x2 row: multiply each new pivot-row value by the x2 row pivot value and subtract the product from the corresponding current values.
5. Compute new Z row values.
6. Compute the C-Z row values.
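The tableau steps above can be condensed into a short program. The following is a rough sketch of the maximization procedure for problems with only ≤ constraints (slack variables in the starting solution, entering variable taken from the largest positive C-Z value, leaving variable from the smallest positive ratio, then the pivot). It is written for the same hypothetical product-mix numbers and is not meant as a robust solver.

import numpy as np

def simplex_max(c, A, b):
    """Tableau simplex for: maximize c @ x subject to A @ x <= b, x >= 0."""
    m, n = A.shape
    # Initial tableau: decision-variable columns, slack-variable columns, solution quantities.
    T = np.hstack([A.astype(float), np.eye(m), b.reshape(-1, 1).astype(float)])
    C = np.concatenate([c.astype(float), np.zeros(m)])   # objective row (slacks get 0)
    basis = list(range(n, n + m))                         # slack variables start in the solution

    while True:
        Z = C[basis] @ T[:, :-1]                 # Z row
        C_minus_Z = C - Z                        # C-Z row
        if np.all(C_minus_Z <= 1e-9):            # optimality test: no positive values left
            break
        col = int(np.argmax(C_minus_Z))          # entering variable: largest positive C-Z value
        ratios = np.full(m, np.inf)
        rows_ok = T[:, col] > 1e-9
        ratios[rows_ok] = T[rows_ok, -1] / T[rows_ok, col]
        row = int(np.argmin(ratios))             # leaving variable: smallest positive ratio
        T[row] /= T[row, col]                    # form the new pivot row
        for r in range(m):                       # update the remaining constraint rows
            if r != row:
                T[r] -= T[r, col] * T[row]
        basis[row] = col

    x = np.zeros(n + m)
    x[basis] = T[:, -1]
    return x[:n], float(C[basis] @ T[:, -1])     # decision-variable values, objective value

c = np.array([60.0, 50.0])
A = np.array([[4.0, 10.0], [2.0, 1.0], [3.0, 3.0]])
b = np.array([100.0, 22.0, 39.0])
print(simplex_max(c, A, b))                      # expected: roughly ([9, 4], 740.0)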

Handling ≥ and = Constraints
Artificial variable. A variable added when an equality constraint is present, to permit development of an initial solution.

Summary of Maximization Procedure
1. Set up the initial tableau.
a. Rewrite the constraints so that they become equalities; add a slack variable to each constraint.
b. Rewrite the objective function to include the slack variables. Give slack variables coefficients of 0.
c. Put the objective coefficients and constraint coefficients into tableau form.
d. Compute values for the Z row: multiply the values in each constraint row by the row's C value, and add the results within each column.
e. Compute values for the C-Z row.
2. Set up subsequent tableaus.
a. Determine the entering variable. If a tie exists, choose one column arbitrarily.
b. Determine the leaving variable: divide each constraint row's solution quantity by the row's pivot value; the smallest positive ratio indicates the leaving variable. If a tie occurs, divide the values in each row by the row pivot value, beginning with the slack columns and then the other columns, moving left to right. The leaving variable is indicated by the lowest ratio in the first column with unequal ratios.
c. Form the new pivot row of the next tableau: divide each number in the leaving row by the row's pivot value. Enter these values in the next tableau in the same row positions.
d. Compute new values for the remaining constraint rows: for each row, multiply the values in the new pivot row by the constraint row's pivot value, and subtract the resulting values, column by column, from the original row values. Enter these in the new tableau in the same positions as the original row.
e. Compute values for the Z and C-Z rows.
f. Check to see if any values in the C-Z row are positive; if they are, repeat steps 2a through 2f. Otherwise, the optimal solution has been obtained.

Minimization Problems
There are a few differences from maximization problems. One is the need to adjust for ≥ constraints, which require both artificial variables and surplus variables. A second major difference is the test for the optimum: a solution is optimal if there are no negative values in the C-Z row.

Sensitivity Analysis
Range of optimality. The range of values of an objective-function coefficient over which the solution quantities of all the decision variables remain the same.
Range of feasibility. The range of values for the right-hand side of a constraint over which the shadow price remains the same.
Shadow prices. Values indicating how much a one-unit change in the original amount of a constraint would change the final value of the objective function.

Computer Solutions
LINDO. A widely used linear-programming package.
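The shadow-price definition can be checked by brute force: increase one constraint's right-hand side by a single unit (staying within the range of feasibility), re-solve, and compare objective values. The sketch below does this for the hypothetical product-mix problem, using scipy.optimize.linprog only as a convenient stand-in for a package such as LINDO, which reports shadow prices directly.

import numpy as np
from scipy.optimize import linprog

c = np.array([60.0, 50.0])                       # maximize 60*x1 + 50*x2
A_ub = np.array([[4.0, 10.0], [2.0, 1.0], [3.0, 3.0]])
b_ub = np.array([100.0, 22.0, 39.0])

base = linprog(-c, A_ub=A_ub, b_ub=b_ub)         # linprog minimizes, so negate c
for i, name in enumerate(["assembly", "inspection", "storage"]):
    bumped_b = b_ub.copy()
    bumped_b[i] += 1.0                           # one extra unit of this resource
    bumped = linprog(-c, A_ub=A_ub, b_ub=bumped_b)
    shadow = -bumped.fun - (-base.fun)           # resulting change in the maximum profit
    print(name, "shadow price:", round(shadow, 2))

With these invented numbers, the assembly constraint has slack at the optimum, so its shadow price is 0; the binding inspection and storage constraints have positive shadow prices (about 10 and 13.33).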
