Preface: Nonlinear Optimization: Models and Applications | xiii
Acknowledgments | xix
Author | xxi

1 Introduction to Optimization Models | 1 | (34)
| 1 | (3)
| 2 | (1)
1.1.2 Applications of Optimization | 3 | (1)
| 3 | (1)
1.2 Classifying Optimization Problems | 4 | (4)
1.3 Review of Mathematical Programming with Excel Technology | 8 | (17)
1.3.1 Excel Using the Solver | 10 | (10)
1.3.2 Examples for Integer, Mixed-Integer, and Nonlinear Optimization | 20 | (5)
| 25 | (1)
1.5 Review of the Simplex Method in Excel Using Revised Simplex | 26 | (7)
1.5.1 Steps of the Simplex Method | 29 | (4)
References and Suggested Further Reading | 33 | (2)

2 Review of Differential Calculus | 35 | (16)
| 35 | (4)
| 39 | (1)
| 40 | (3)
2.3.1 Increasing and Decreasing Functions | 43 | (1)
| 43 | (1)
2.4 Convex and Concave Functions | 43 | (6)
| 48 | (1)
References and Suggested Reading | 49 | (2)

3 Single-Variable Unconstrained Optimization | 51 | (20)
| 51 | (1)
3.2 Single-Variable Optimization and Basic Theory | 52 | (2)
3.3 Basic Applications of Max-Min Theory | 54 | (5)
| 57 | (2)
3.4 Applied Single-Variable Optimization Models | 59 | (10)
| 65 | (3)
| 68 | (1)
References and Suggested Reading | 69 | (2)

4 Numerical Search Techniques in Single-Variable Optimization | 71 | (32)
4.1 Single-Variable Techniques | 71 | (20)
4.1.1 Unrestricted Search | 73 | (1)
| 74 | (1)
| 74 | (2)
4.1.4 Golden Section Search | 76 | (1)
4.1.5 Finding the Maximum of a Function over an Interval with Golden Section | 77 | (2)
4.1.6 Golden Section Search with Technology | 79 | (6)
4.1.6.1 Excel Golden Search | 79 | (1)
4.1.6.2 Maple Golden Search | 80 | (2)
4.1.6.3 MATLAB Golden Search | 82 | (3)
4.1.7 Illustrious Examples with Technology | 85 | (3)
| 88 | (3)
4.1.8.1 Finding the Maximum of a Function over an Interval with the Fibonacci Method | 88 | (3)
4.2 Interpolation with Derivatives: Newton's Method | 91 | (10)
4.2.1 Finding the Critical Points (Roots) of a Function | 91 | (1)
4.2.2 The Basic Application | 92 | (2)
4.2.3 Newton's Method to Find Critical Points with Technology | 94 | (1)
4.2.4 Excel: Newton's Method | 94 | (1)
4.2.5 Maple: Newton's Method | 94 | (2)
4.2.6 Newton's Method for Critical Points with MATLAB | 96 | (2)
4.2.7 The Bisection Method with Derivatives | 98 | (2)
| 100 | (1)
| 100 | (1)
References and Suggested Further Readings | 101 | (2)

5 Review of Multivariable Differential Calculus | 103 | (12)
5.1 Introduction: Basic Theory and Partial Differentiation | 103 | (6)
5.2 Directional Derivatives and the Gradient | 109 | (4)
| 113 | (1)
References and Suggested Reading | 114 | (1)

6 Models Using Unconstrained Optimization: Maximization and Minimization with Several Variables | 115 | (28)
| 115 | (2)
| 117 | (11)
6.3 Unconstrained Optimization | 128 | (11)
| 136 | (3)
| 139 | (2)
| 140 | (1)
References and Suggested Further Reading | 141 | (2)

7 Multivariate Optimization Search Techniques | 143 | (30)
| 143 | (1)
7.2 Gradient Search Methods | 143 | (8)
7.3 Examples of Gradient Search | 151 | (7)
7.4 Modified Newton's Method | 158 | (8)
7.4.1 Modified Newton with Technology | 162 | (4)
| 166 | (1)
7.5 Comparisons of Methods | 166 | (4)
7.5.1 Maple Code for Steepest Ascent Method (See Fox and Richardson) | 166 | (2)
7.5.2 Newton's Method for Optimization in Maple | 168 | (2)
| 170 | (1)
| 171 | (1)
References and Suggested Reading | 171 | (2)

8 Optimization with Equality Constraints | 173 | (22)
| 173 | (1)
8.2 Equality Constraints Method of Lagrange Multipliers | 173 | (1)
8.3 Introduction and Basic Theory | 174 | (2)
8.4 Graphical Interpretation of Lagrange Multipliers | 176 | (2)
8.5 Computational Method of Lagrange Multipliers | 178 | (10)
Lagrange Method with Technology | 180 | (8)
8.6 Applications with Lagrange Multipliers | 188 | (6)
| 191 | (2)
| 193 | (1)
References and Suggested Reading | 194 | (1)

9 Inequality Constraints: Necessary/Sufficient Kuhn-Tucker Conditions (KTC) | 195 | (36)
| 195 | (1)
9.2 Basic Theory of Constrained Optimization | 196 | (4)
9.2.1 Necessary and Sufficient Conditions | 197 | (3)
9.3 Geometric Interpretation of KTC | 200 | (4)
9.3.1 Spanning Cones (Optional) | 200 | (4)
9.4 Computational KTC with Maple | 204 | (14)
9.5 Modeling and Application with KTC | 218 | (11)
| 225 | (3)
| 228 | (1)
| 228 | (1)
References and Suggested Reading | 229 | (2)

10 Specialized Nonlinear Optimization Methods | 231 | (28)
| 231 | (3)
10.1.1 Numerical and Heuristic Methods | 231 | (3)
| 234 | (1)
10.2 Method of Feasible Directions | 234 | (5)
| 238 | (1)
10.3 Quadratic Programming | 239 | (8)
| 246 | (1)
10.4 Separable Programming | 247 | (10)
10.4.1 Adjacency Assumptions | 248 | (1)
10.4.2 Linearization Property | 248 | (9)
| 257 | (1)
References and Suggested Reading | 257 | (2)

11 Dynamic Programming | 259 | (20)
11.1 Introduction: Basic Concepts and Theory | 259 | (3)
11.1.1 Characteristics of Dynamic Programming | 261 | (1)
| 261 | (1)
| 262 | (2)
11.3 Modeling and Applications of Continuous DP | 264 | (3)
| 266 | (1)
11.4 Models of Discrete Dynamic Programming | 267 | (3)
11.5 Modeling and Applications of Discrete DP | 270 | (8)
| 276 | (2)
References and Suggested Readings | 278 | (1)

12 Data Analysis with Regression Models, Advanced Regression Models, and Machine Learning through Optimization | 279 | (82)
12.1 Introduction and Machine Learning | 279 | (3)
| 280 | (2)
12.1.1.1 Data Cleaning and Breakdown | 281 | (1)
| 282 | (1)
| 282 | (1)
12.2 The Different Curve Fitting Criterion | 282 | (5)
12.2.1 Fitting Criterion 1: Least Squares | 282 | (2)
12.2.2 Fitting Criterion 2: Minimize the Sum of the Absolute Deviations | 284 | (1)
12.2.3 Fitting Criterion 3: Chebyshev's Criterion or Minimize the Largest Error | 285 | (1)
| 285 | (2)
12.3 Introduction to Simple Linear and Polynomial Regression | 287 | (4)
| 288 | (1)
12.3.2 Regression in Maple | 289 | (1)
| 290 | (1)
| 291 | (1)
12.4 Diagnostics in Regression | 291 | (5)
12.4.1 Example for the Common Sense Test | 294 | (2)
12.4.1.1 Exponential Decay Example | 294 | (2)
12.4.2 Multiple Linear Regression | 296 | (1)
| 296 | (1)
12.5 Nonlinear Regression through Optimization | 296 | (26)
12.5.1 Exponential Regression | 297 | (10)
12.5.1.1 Newton-Raphson Algorithm | 298 | (9)
12.5.2 Sine Regression Using Optimization | 307 | (5)
12.5.3 Illustrative Examples | 312 | (10)
12.5.3.1 Nonlinear Regression (Exponential Decay) | 312 | (10)
| 322 | (1)
12.6 One-Predictor Logistic and One-Predictor Poisson Regression Models | 322 | (37)
12.6.1 Logistic Regression and Poisson Regression with Technology | 323 | (12)
12.6.1.1 Logistic Regression with Technology | 323 | (7)
12.6.1.2 Simple Poisson Regression with Technology | 330 | (5)
12.6.2 Logistic Regression Illustrious Examples | 335 | (2)
12.6.3 Poisson Regression Discussion and Examples | 337 | (6)
12.6.3.1 Normality Assumption Lost | 338 | (4)
12.6.3.2 Estimates of Regression Coefficients | 342 | (1)
12.6.4 Illustrative Poisson Regression Examples | 343 | (11)
| 343 | (11)
| 354 | (4)
| 358 | (1)
12.7 Conclusions and Summary | 359 | (1)
References and Suggested Reading | 359 | (2)

Answers to Selected Problems | 361 | (28)
Index | 389