
Lectures on Nonsmooth Optimization [Hardcover]

  • Format: Hardback, XIII + 560 pages, height x width: 235x155 mm
  • Series: Texts in Applied Mathematics 82
  • Publication date: 04-Jul-2025
  • Publisher: Springer International Publishing AG
  • ISBN-10: 3031914163
  • ISBN-13: 9783031914164
  • Hardcover
  • Price: 159,88 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 188,09 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks

This book provides an in-depth exploration of nonsmooth optimization, covering foundational algorithms, theoretical insights, and a wide range of applications. Nonsmooth optimization, characterized by nondifferentiable objective functions or constraints, plays a crucial role across various fields, including machine learning, imaging, inverse problems, statistics, optimal control, and engineering. Its scope and relevance continue to expand, as many real-world problems are inherently nonsmooth or benefit significantly from nonsmooth regularization techniques. The book covers a variety of algorithms, both foundational and recent, for solving nonsmooth optimization problems. It first introduces basic facts on convex analysis and subdifferential calculus; various algorithms are then discussed, including subgradient methods, mirror descent methods, proximal algorithms, the alternating direction method of multipliers, primal-dual splitting methods, and semismooth Newton methods. Moreover, error bound conditions are discussed and the derivation of linear convergence is illustrated. A dedicated chapter treats first-order methods for nonconvex optimization problems satisfying the Kurdyka-Łojasiewicz condition. The book also addresses the rapid evolution of stochastic algorithms for large-scale optimization. It is written for a wide-ranging audience, including senior undergraduates, graduate students, researchers, and practitioners who are interested in gaining a comprehensive understanding of nonsmooth optimization.
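To give a flavor of the proximal algorithms mentioned above, the following sketch (not taken from the book) applies the proximal gradient method to the lasso problem min_x 0.5‖Ax − b‖² + λ‖x‖₁, whose nonsmooth ℓ₁ term has a closed-form proximal operator (soft-thresholding). The function and variable names are illustrative choices, not the book's notation.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iter=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient
    # step on the smooth part with the proximal map of the l1 term.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                  # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small demo: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[[2, 7]] = [1.5, -2.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L, L = Lipschitz const. of grad
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
```

The step size 1/L, with L the largest eigenvalue of AᵀA, guarantees convergence; the recovered `x_hat` is sparse, with its support matching `x_true` up to the shrinkage bias of the ℓ₁ penalty.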

  • Preface
  • Introduction
  • Convex sets and convex functions
  • Subgradient and mirror descent methods
  • Proximal algorithms
  • Karush-Kuhn-Tucker theory and Lagrangian duality
  • ADMM: alternating direction method of multipliers
  • Primal-dual splitting algorithms
  • Error bound conditions and linear convergence
  • Optimization with Kurdyka-Łojasiewicz property
  • Semismooth Newton methods
  • Stochastic algorithms
  • References
  • Index
Qinian Jin graduated from Anhui Normal University in China with a bachelor's degree and obtained his PhD from the Department of Mathematics at Rutgers University, New Brunswick, USA. He joined the Mathematical Sciences Institute at the Australian National University in 2011. His research has been supported by the Australian Research Council (ARC), which awarded him a Future Fellowship. His research interests cover inverse problems, numerical analysis, optimization, partial differential equations, and geometric analysis. In particular, his recent research focuses on using nonsmooth optimization techniques to design algorithms for solving ill-posed inverse problems. He has published about 70 papers in international journals.