Foreword .......... xix
|
|
I  Linear Programming .......... 1

1  An Introduction to Linear Programming .......... 3
1.1  The Basic Linear Programming Problem Formulation .......... 3
1.1.1  A Prototype Example: The Blending Problem .......... 4
1.1.2  Maple's LPSolve Command .......... 7
1.1.3  The Matrix Inequality Form of an LP .......... 8
Exercises .......... 10
1.2  Linear Programming: A Graphical Perspective in R^2 .......... 13
Exercises .......... 17
1.3  Basic Feasible Solutions .......... 19
Exercises .......... 25
|
|
2  The Simplex Algorithm .......... 29
2.1  The Simplex Algorithm .......... 29
2.1.1  An Overview of the Algorithm .......... 29
2.1.2  A Step-by-Step Analysis of the Process .......... 30
2.1.3  Solving Minimization Problems .......... 33
2.1.4  A Step-by-Step Maple Implementation of the Simplex Algorithm .......... 34
Exercises .......... 38
2.2  Alternative Optimal/Unbounded Solutions and Degeneracy .......... 39
2.2.1  Alternative Optimal Solutions .......... 40
2.2.2  Unbounded Solutions .......... 41
2.2.3 .......... 42
Exercises .......... 45
2.3  Excess and Artificial Variables: The Big M Method .......... 47
Exercises .......... 53
2.4  A Partitioned Matrix View of the Simplex Method .......... 54
2.4.1  Partitioned Matrices .......... 54
2.4.2  Partitioned Matrices with Maple .......... 55
2.4.3  The Simplex Algorithm as Partitioned Matrix Multiplication .......... 56
Exercises .......... 61
2.5  The Revised Simplex Algorithm .......... 62
2.5.1 .......... 62
2.5.2  Observations about the Simplex Algorithm .......... 63
2.5.3  An Outline of the Method .......... 63
2.5.4  Application to the FuelPro LP .......... 64
Exercises .......... 67
2.6  Moving beyond the Simplex Method: An Interior Point Algorithm .......... 68
2.6.1  The Origin of the Interior Point Algorithm .......... 68
2.6.2  The Projected Gradient .......... 69
2.6.3 .......... 72
2.6.4  Summary of the Method .......... 75
2.6.5  Application of the Method to the FuelPro LP .......... 75
2.6.6  A Maple Implementation of the Interior Point Algorithm .......... 76
Exercises .......... 79
|
3  Standard Applications of Linear Programming .......... 81
3.1 .......... 81
3.1.1  Eating for Cheap on a Very Limited Menu .......... 81
3.1.2  The Problem Formulation and Solution, with Help from Maple .......... 82
Exercises .......... 85
3.2  Transportation and Transshipment Problems .......... 85
3.2.1  A Coal Distribution Problem .......... 85
3.2.2  The Integrality of the Transportation Problem Solution .......... 87
3.2.3  Coal Distribution with Transshipment .......... 89
Exercises .......... 91
3.3 .......... 92
3.3.1  The Minimum Cost Network Flow Problem Formulation .......... 92
3.3.2  Formulating and Solving the Minimum Cost Network Flow Problem with Maple .......... 94
3.3.3  The Shortest Path Problem .......... 95
3.3.4  Maximum Flow Problems .......... 98
Exercises .......... 99
|
4  Duality and Sensitivity Analysis .......... 103
4.1  Duality .......... 103
4.1.1 .......... 103
4.1.2  Weak and Strong Duality .......... 105
4.1.3  An Economic Interpretation of Duality .......... 110
4.1.4  A Final Note on the Dual of an Arbitrary LP .......... 111
4.1.5  The Zero-Sum Matrix Game .......... 112
Exercises .......... 116
4.2  Sensitivity Analysis .......... 119
4.2.1  Sensitivity to an Objective Coefficient .......... 121
4.2.2  Sensitivity to Constraint Bounds .......... 125
4.2.3  Sensitivity to Entries in the Coefficient Matrix A .......... 130
4.2.4  Performing Sensitivity Analysis with Maple .......... 133
Exercises .......... 135
4.3  The Dual Simplex Method .......... 137
4.3.1  Overview of the Method .......... 138
4.3.2 .......... 139
Exercises .......... 143
|
5  Integer Linear Programming .......... 145
5.1  An Introduction to Integer Linear Programming and the Branch and Bound Method .......... 145
5.1.1 .......... 145
5.1.2  The Relaxation of an ILP .......... 146
5.1.3  The Branch and Bound Method .......... 147
5.1.4  Practicing the Branch and Bound Method with Maple .......... 154
5.1.5  Binary and Mixed Integer Linear Programming .......... 155
5.1.6  Solving ILPs Directly with Maple .......... 156
5.1.7  An Application of Integer Linear Programming: The Traveling Salesperson Problem .......... 157
Exercises .......... 162
5.2  The Cutting Plane Algorithm .......... 167
5.2.1 .......... 167
5.2.2 .......... 168
5.2.3  A Step-by-Step Maple Implementation of the Cutting Plane Algorithm .......... 172
5.2.4  Comparison with the Branch and Bound Method .......... 175
Exercises .......... 175
|
|
II  Nonlinear Programming .......... 177
|
6  Algebraic Methods for Unconstrained Problems .......... 179
6.1  Nonlinear Programming: An Overview .......... 179
6.1.1  The General Nonlinear Programming Model .......... 179
6.1.2  Plotting Feasible Regions and Solving NLPs with Maple .......... 180
6.1.3  A Prototype NLP Example .......... 183
Exercises .......... 185
6.2  Differentiability and a Necessary First-Order Condition .......... 187
6.2.1 .......... 188
6.2.2  Necessary Conditions for Local Maxima or Minima .......... 190
Exercises .......... 193
6.3  Convexity and a Sufficient First-Order Condition .......... 193
6.3.1 .......... 194
6.3.2  Testing for Convexity .......... 196
6.3.3  Convexity and the Global Optimal Solutions Theorem .......... 199
6.3.4  Solving the Unconstrained NLP for Differentiable, Convex Functions .......... 200
6.3.5  Multiple Linear Regression .......... 201
Exercises .......... 204
6.4  Sufficient Conditions for Local and Global Optimal Solutions .......... 206
6.4.1 .......... 207
6.4.2  Positive Definite Quadratic Forms .......... 209
6.4.3  Second-Order Differentiability and the Hessian Matrix .......... 210
6.4.4  Using Maple to Classify Critical Points for the Unconstrained NLP .......... 218
6.4.5  The Zero-Sum Matrix Game, Revisited .......... 219
Exercises .......... 222
|
7  Numeric Tools for Unconstrained NLPs .......... 225
7.1  The Steepest Descent Method .......... 225
7.1.1 .......... 225
7.1.2  A Maple Implementation of the Steepest Descent Method .......... 229
7.1.3  A Sufficient Condition for Convergence .......... 231
7.1.4  The Rate of Convergence .......... 234
Exercises .......... 236
7.2  Newton's Method .......... 238
7.2.1  Shortcomings of the Steepest Descent Method .......... 238
7.2.2 .......... 239
7.2.3  A Maple Implementation of Newton's Method .......... 241
7.2.4  Convergence Issues and Comparison with the Steepest Descent Method .......... 243
Exercises .......... 247
7.3  The Levenberg-Marquardt Algorithm .......... 249
7.3.1  Interpolating between the Steepest Descent and Newton Methods .......... 249
7.3.2  The Levenberg Method .......... 250
7.3.3  The Levenberg-Marquardt Algorithm .......... 251
7.3.4  A Maple Implementation of the Levenberg-Marquardt Algorithm .......... 253
7.3.5  Nonlinear Regression .......... 255
7.3.6  Maple's Global Optimization Toolbox .......... 257
Exercises .......... 258
|
8  Methods for Constrained Nonlinear Problems .......... 261
8.1  The Lagrangian Function and Lagrange Multipliers .......... 261
8.1.1  Some Convenient Notation .......... 262
8.1.2  The Karush-Kuhn-Tucker Theorem .......... 263
8.1.3  Interpreting the Multiplier .......... 267
Exercises .......... 269
8.2 .......... 272
8.2.1  Solving Convex NLPs .......... 273
Exercises .......... 276
8.3  Saddle Point Criteria .......... 278
8.3.1  The Restricted Lagrangian .......... 278
8.3.2  Saddle Point Optimality Criteria .......... 280
Exercises .......... 282
8.4  Quadratic Programming .......... 284
8.4.1  Problems with Equality-Type Constraints Only .......... 284
8.4.2  Inequality Constraints .......... 289
8.4.3  Maple's QPSolve Command .......... 291
8.4.4 .......... 293
Exercises .......... 297
8.5  Sequential Quadratic Programming .......... 300
8.5.1  Method Derivation for Equality-Type Constraints .......... 300
8.5.2  The Convergence Issue .......... 306
8.5.3  Inequality-Type Constraints .......... 306
8.5.4  A Maple Implementation of the Sequential Quadratic Programming Technique .......... 310
8.5.5  An Improved Version of the SQPT .......... 312
Exercises .......... 315
|
|
A .......... 319
A.1  Excavating and Leveling a Large Land Tract .......... 319
A.2  The Juice Logistics Model .......... 322
A.3  Work Scheduling with Overtime .......... 325
A.4  Diagnosing Breast Cancer with a Linear Classifier .......... 327
A.5  The Markowitz Portfolio Model .......... 330
A.6  A Game Theory Model of a Predator-Prey Habitat .......... 334
B  Important Results from Linear Algebra .......... 337
B.1 .......... 337
B.2  The Invertible Matrix Theorem .......... 337
B.3 .......... 338
B.4  Positive Definite Matrices .......... 338
B.5 .......... 339
B.6  The Rank-Nullity Theorem .......... 339
B.7 .......... 339
B.8 .......... 340
C  Getting Started with Maple .......... 341
C.1  The Worksheet Structure .......... 341
C.2  Arithmetic Calculations and Built-in Operations .......... 343
C.3  Expressions and Functions .......... 344
C.4  Arrays, Lists, Sequences, and Sums .......... 347
C.5  Matrix Algebra and the LinearAlgebra Package .......... 349
C.6  Plot Structures with Maple .......... 353
D  Summary of Maple Commands .......... 363
Bibliography .......... 383
Index .......... 387