
Linear and Nonlinear Programming with Maple: An Interactive, Applications-Based Approach [Paperback]

  • Format: Paperback / softback, 413 pages, height x width: 234x156 mm, weight: 793 g
  • Publication date: 07-Oct-2019
  • Publisher: Chapman & Hall/CRC
  • ISBN-10: 0367384779
  • ISBN-13: 9780367384777

Helps Students Understand Mathematical Programming Principles and Solve Real-World Applications



Supplies sufficient mathematical rigor while remaining accessible to undergraduates
Integrating a hands-on learning approach, a strong linear algebra focus, Maple™ software, and real-world applications, Linear and Nonlinear Programming with Maple™: An Interactive, Applications-Based Approach introduces undergraduate students to the mathematical concepts and principles underlying linear and nonlinear programming. The text fills the gap between management science books that lack mathematical detail and rigor and graduate-level texts on mathematical programming.



Essential linear algebra tools
Throughout the text, topics from a first linear algebra course, such as the invertible matrix theorem, linear independence, transpose properties, and eigenvalues, play a prominent role in the discussion. The book emphasizes partitioned matrices and uses them to describe the simplex algorithm in terms of matrix multiplication. This perspective leads to streamlined approaches for constructing the revised simplex method, developing duality theory, and approaching the process of sensitivity analysis. The book also discusses some intermediate linear algebra topics, including the spectral theorem and matrix norms.
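As a rough illustration of this partitioned-matrix viewpoint, the following Maple sketch (with small hypothetical data, not an example taken from the text) extracts the basic and nonbasic blocks of a constraint matrix and forms the products that a tableau update requires:

    with(LinearAlgebra):
    A := Matrix([[1, 1, 1, 0], [2, 1, 0, 1]]):  # constraint matrix [N | I] with slack columns
    b := Vector([4, 6]):
    Basis := [3, 4]:                             # indices of the current basic columns
    B := A[.., Basis]:                           # basis submatrix B
    N := A[.., [1, 2]]:                          # nonbasic submatrix N
    xB := LinearSolve(B, b);                     # current basic solution, B^(-1).b
    W := LinearSolve(B, N);                      # updated nonbasic block, B^(-1).N

Computing B^(-1)b and B^(-1)N in this way reflects the kind of matrix-multiplication description of a simplex step on which the book builds.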



Maple enhances conceptual understanding and helps tackle problems
Assuming no prior experience with Maple, the author provides sufficient instruction for students unfamiliar with the software. He also includes a summary of Maple commands as well as Maple worksheets in the text and online. By using Maple’s symbolic computing components, numeric capabilities, graphical versatility, and intuitive programming structures, students will acquire a deep conceptual understanding of major mathematical programming principles, along with the ability to solve moderately sized real-world problems.
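For instance, a small illustrative session (the LP below is made up for demonstration and does not appear in the book) shows how little code a Maple solution requires:

    with(Optimization):
    # Maximize 3x + 2y subject to x + y <= 4 and 2x + y <= 6, with x, y >= 0.
    LPSolve(3*x + 2*y, {x + y <= 4, 2*x + y <= 6},
            assume = nonnegative, maximize = true);

LPSolve returns the optimal objective value together with the point that attains it.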

List of Figures
List of Tables
Foreword

I Linear Programming

1 An Introduction to Linear Programming
1.1 The Basic Linear Programming Problem Formulation
1.1.1 A Prototype Example: The Blending Problem
1.1.2 Maple's LPSolve Command
1.1.3 The Matrix Inequality Form of an LP
Exercises
1.2 Linear Programming: A Graphical Perspective in R²
Exercises
1.3 Basic Feasible Solutions
Exercises

2 The Simplex Algorithm
2.1 The Simplex Algorithm
2.1.1 An Overview of the Algorithm
2.1.2 A Step-by-Step Analysis of the Process
2.1.3 Solving Minimization Problems
2.1.4 A Step-by-Step Maple Implementation of the Simplex Algorithm
Exercises
2.2 Alternative Optimal/Unbounded Solutions and Degeneracy
2.2.1 Alternative Optimal Solutions
2.2.2 Unbounded Solutions
2.2.3 Degeneracy
Exercises
2.3 Excess and Artificial Variables: The Big M Method
Exercises
2.4 A Partitioned Matrix View of the Simplex Method
2.4.1 Partitioned Matrices
2.4.2 Partitioned Matrices with Maple
2.4.3 The Simplex Algorithm as Partitioned Matrix Multiplication
Exercises
2.5 The Revised Simplex Algorithm
2.5.1 Notation
2.5.2 Observations about the Simplex Algorithm
2.5.3 An Outline of the Method
2.5.4 Application to the FuelPro LP
Exercises
2.6 Moving beyond the Simplex Method: An Interior Point Algorithm
2.6.1 The Origin of the Interior Point Algorithm
2.6.2 The Projected Gradient
2.6.3 Affine Scaling
2.6.4 Summary of the Method
2.6.5 Application of the Method to the FuelPro LP
2.6.6 A Maple Implementation of the Interior Point Algorithm
Exercises

3 Standard Applications of Linear Programming
3.1 The Diet Problem
3.1.1 Eating for Cheap on a Very Limited Menu
3.1.2 The Problem Formulation and Solution, with Help from Maple
Exercises
3.2 Transportation and Transshipment Problems
3.2.1 A Coal Distribution Problem
3.2.2 The Integrality of the Transportation Problem Solution
3.2.3 Coal Distribution with Transshipment
Exercises
3.3 Basic Network Models
3.3.1 The Minimum Cost Network Flow Problem Formulation
3.3.2 Formulating and Solving the Minimum Cost Network Flow Problem with Maple
3.3.3 The Shortest Path Problem
3.3.4 Maximum Flow Problems
Exercises

4 Duality and Sensitivity Analysis
4.1 Duality
4.1.1 The Dual of an LP
4.1.2 Weak and Strong Duality
4.1.3 An Economic Interpretation of Duality
4.1.4 A Final Note on the Dual of an Arbitrary LP
4.1.5 The Zero-Sum Matrix Game
Exercises
4.2 Sensitivity Analysis
4.2.1 Sensitivity to an Objective Coefficient
4.2.2 Sensitivity to Constraint Bounds
4.2.3 Sensitivity to Entries in the Coefficient Matrix A
4.2.4 Performing Sensitivity Analysis with Maple
Exercises
4.3 The Dual Simplex Method
4.3.1 Overview of the Method
4.3.2 A Simple Example
Exercises

5 Integer Linear Programming
5.1 An Introduction to Integer Linear Programming and the Branch and Bound Method
5.1.1 A Simple Example
5.1.2 The Relaxation of an ILP
5.1.3 The Branch and Bound Method
5.1.4 Practicing the Branch and Bound Method with Maple
5.1.5 Binary and Mixed Integer Linear Programming
5.1.6 Solving ILPs Directly with Maple
5.1.7 An Application of Integer Linear Programming: The Traveling Salesperson Problem
Exercises
5.2 The Cutting Plane Algorithm
5.2.1 Motivation
5.2.2 The Algorithm
5.2.3 A Step-by-Step Maple Implementation of the Cutting Plane Algorithm
5.2.4 Comparison with the Branch and Bound Method
Exercises

II Nonlinear Programming

6 Algebraic Methods for Unconstrained Problems
6.1 Nonlinear Programming: An Overview
6.1.1 The General Nonlinear Programming Model
6.1.2 Plotting Feasible Regions and Solving NLPs with Maple
6.1.3 A Prototype NLP Example
Exercises
6.2 Differentiability and a Necessary First-Order Condition
6.2.1 Differentiability
6.2.2 Necessary Conditions for Local Maxima or Minima
Exercises
6.3 Convexity and a Sufficient First-Order Condition
6.3.1 Convexity
6.3.2 Testing for Convexity
6.3.3 Convexity and the Global Optimal Solutions Theorem
6.3.4 Solving the Unconstrained NLP for Differentiable, Convex Functions
6.3.5 Multiple Linear Regression
Exercises
6.4 Sufficient Conditions for Local and Global Optimal Solutions
6.4.1 Quadratic Forms
6.4.2 Positive Definite Quadratic Forms
6.4.3 Second-Order Differentiability and the Hessian Matrix
6.4.4 Using Maple to Classify Critical Points for the Unconstrained NLP
6.4.5 The Zero-Sum Matrix Game, Revisited
Exercises

7 Numeric Tools for Unconstrained NLPs
7.1 The Steepest Descent Method
7.1.1 Method Derivation
7.1.2 A Maple Implementation of the Steepest Descent Method
7.1.3 A Sufficient Condition for Convergence
7.1.4 The Rate of Convergence
Exercises
7.2 Newton's Method
7.2.1 Shortcomings of the Steepest Descent Method
7.2.2 Method Derivation
7.2.3 A Maple Implementation of Newton's Method
7.2.4 Convergence Issues and Comparison with the Steepest Descent Method
Exercises
7.3 The Levenberg-Marquardt Algorithm
7.3.1 Interpolating between the Steepest Descent and Newton Methods
7.3.2 The Levenberg Method
7.3.3 The Levenberg-Marquardt Algorithm
7.3.4 A Maple Implementation of the Levenberg-Marquardt Algorithm
7.3.5 Nonlinear Regression
7.3.6 Maple's Global Optimization Toolbox
Exercises

8 Methods for Constrained Nonlinear Problems
8.1 The Lagrangian Function and Lagrange Multipliers
8.1.1 Some Convenient Notation
8.1.2 The Karush-Kuhn-Tucker Theorem
8.1.3 Interpreting the Multiplier
Exercises
8.2 Convex NLPs
8.2.1 Solving Convex NLPs
Exercises
8.3 Saddle Point Criteria
8.3.1 The Restricted Lagrangian
8.3.2 Saddle Point Optimality Criteria
Exercises
8.4 Quadratic Programming
8.4.1 Problems with Equality-Type Constraints Only
8.4.2 Inequality Constraints
8.4.3 Maple's QPSolve Command
8.4.4 The Bimatrix Game
Exercises
8.5 Sequential Quadratic Programming
8.5.1 Method Derivation for Equality-Type Constraints
8.5.2 The Convergence Issue
8.5.3 Inequality-Type Constraints
8.5.4 A Maple Implementation of the Sequential Quadratic Programming Technique
8.5.5 An Improved Version of the SQPT
Exercises

A Projects
A.1 Excavating and Leveling a Large Land Tract
A.2 The Juice Logistics Model
A.3 Work Scheduling with Overtime
A.4 Diagnosing Breast Cancer with a Linear Classifier
A.5 The Markowitz Portfolio Model
A.6 A Game Theory Model of a Predator-Prey Habitat

B Important Results from Linear Algebra
B.1 Linear Independence
B.2 The Invertible Matrix Theorem
B.3 Transpose Properties
B.4 Positive Definite Matrices
B.5 Cramer's Rule
B.6 The Rank-Nullity Theorem
B.7 The Spectral Theorem
B.8 Matrix Norms

C Getting Started with Maple
C.1 The Worksheet Structure
C.2 Arithmetic Calculations and Built-in Operations
C.3 Expressions and Functions
C.4 Arrays, Lists, Sequences, and Sums
C.5 Matrix Algebra and the LinearAlgebra Package
C.6 Plot Structures with Maple

D Summary of Maple Commands

Bibliography
Index
Paul E. Fishback is a professor in the Department of Mathematics at Grand Valley State University in Allendale, Michigan, USA.