
Introduction to Methods for Nonlinear Optimization 1st ed. 2023 [Paperback]

  • Format: Paperback / softback, XV + 723 pages, height x width: 235x155 mm, weight: 1369 g, 46 illustrations (32 in color, 14 black and white)
  • Series: UNITEXT 152
  • Publication date: 28-May-2023
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3031267893
  • ISBN-13: 9783031267895
  • Paperback
  • Price: €71.86*
  • * the price is final, i.e. no further discounts apply
  • Regular price: €84.54
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping

This book has two main objectives:
•  to provide a concise introduction to nonlinear optimization methods, which can be used as a textbook at a graduate or upper undergraduate level;
•  to collect and organize selected important topics on optimization algorithms, not easily found in textbooks, which can provide material for advanced courses or can serve as a reference text for self-study and research.
The basic material on unconstrained and constrained optimization is organized into two blocks of chapters:
•   basic theory and optimality conditions
•   unconstrained and constrained algorithms.
These topics are treated in short chapters that contain the most important results in theory and algorithms, in a way that, in the authors’ experience, is suitable for introductory courses.
A third block of chapters addresses methods that are of increasing interest for solving difficult optimization problems. Difficulty is typically due to high nonlinearity of the objective function, ill-conditioning of the Hessian matrix, lack of information on first-order derivatives, or the need to solve large-scale problems.
Various key subjects are addressed in the book, including exact penalty functions and exact augmented Lagrangian functions, nonmonotone methods, decomposition algorithms, and derivative-free methods for nonlinear equations and optimization problems.
The appendices at the end of the book offer a review of the essential mathematical background, including an introduction to convex analysis that can form part of an introductory course.
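To give a concrete flavor of one family of techniques mentioned above, the following is a minimal Python sketch of a simple (non-exact) quadratic penalty approach for an equality-constrained problem. The example problem, parameter values, and the use of scipy.optimize.minimize are illustrative assumptions and are not taken from the book.

    import numpy as np
    from scipy.optimize import minimize

    def f(x):
        # Objective: f(x) = x0^2 + 2*x1^2 (hypothetical example problem)
        return x[0]**2 + 2.0 * x[1]**2

    def h(x):
        # Equality constraint: h(x) = x0 + x1 - 1 = 0
        return x[0] + x[1] - 1.0

    def penalized(x, mu):
        # Quadratic penalty function: f(x) + (mu/2) * h(x)^2
        return f(x) + 0.5 * mu * h(x)**2

    x = np.zeros(2)   # starting point
    mu = 1.0          # initial penalty parameter
    for _ in range(10):
        # Minimize the penalized function for the current mu,
        # warm-starting from the previous solution.
        res = minimize(penalized, x, args=(mu,), method="BFGS")
        x = res.x
        if abs(h(x)) <= 1e-8:
            break
        mu *= 10.0    # tighten the penalty and repeat

    print("approximate solution:", x, "constraint violation:", h(x))

Unlike this simple scheme, the exact penalty and exact augmented Lagrangian functions treated in the book recover a solution of the constrained problem for a finite value of the penalty parameter, without driving it to infinity.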

Reviews

The authors deliver a well-written, well-structured, detailed textbook on the basics and a description of the theory, solution methods and applications in nonlinear continuous optimization. This book can be used well for introductory courses in nonlinear optimization. (Jan-J. Rückmann, Mathematical Reviews, July 2024)

Contents

1 Introduction
2 Fundamental definitions and basic existence results
3 Optimality conditions for unconstrained problems in R^n
4 Optimality conditions for problems with convex feasible set
5 Optimality conditions for Nonlinear Programming
6 Duality theory
7 Optimality conditions based on theorems of the alternative
8 Basic concepts on optimization algorithms
9 Unconstrained optimization algorithms
10 Line search methods
11 Gradient method
12 Conjugate direction methods
13 Newton's method
14 Trust region methods
15 Quasi-Newton Methods
16 Methods for nonlinear equations
17 Methods for least squares problems
18 Methods for large-scale optimization
19 Derivative-free methods for unconstrained optimization
20 Methods for problems with convex feasible set
21 Penalty and augmented Lagrangian methods
22 SQP methods
23 Introduction to interior point methods
24 Nonmonotone methods
25 Spectral gradient methods
26 Decomposition methods
Appendix A: Basic concepts of linear algebra and analysis
Appendix B: Differentiation in R^n
Appendix C: Introduction to convex analysis
Prof. Luigi Grippo was formerly a full professor of operations research at the University of Rome "La Sapienza", where he taught courses on operations research, optimization algorithms, approximation methods, mathematical programming, and machine learning. His research work has been mainly concerned with methods for nonlinear optimization and machine learning. He has published more than 40 papers in international journals and has served as an associate editor of the journal Optimization Methods and Software.

Prof. Marco Sciandrone is a full professor of Operations Research at the University of Rome La Sapienza. He teaches courses on operations research, continuous optimization, and machine learning. His research interests include nonlinear optimization and machine learning. He has published about 60 papers in international journals. He is an associate editor of the journals Optimization Methods and Software and 4OR. He was one of the founders of DEIX srl, a start-up of the University of Rome La Sapienza.