
Algorithms for Convex Optimization [Paperback]

Nisheeth K. Vishnoi (Yale University, Connecticut)
  • Format: Paperback / softback, 200 pages, height x width x thickness: 228x150x20 mm, weight: 520 g, Worked examples or Exercises
  • Publication date: 07-Oct-2021
  • Publisher: Cambridge University Press
  • ISBN-10: 1108741770
  • ISBN-13: 9781108741774
Algorithms for convex optimization are the workhorses of the data-driven technological advances in machine learning and artificial intelligence. This concise, modern guide to deriving these algorithms is self-contained and accessible to advanced students, practitioners, and researchers in computer science, operations research, and data science.

In the last few years, algorithms for convex optimization have revolutionized algorithm design, both for discrete and continuous optimization problems. For problems like maximum flow, maximum matching, and submodular function minimization, the fastest algorithms involve essential methods such as gradient descent, mirror descent, interior point methods, and ellipsoid methods. The goal of this self-contained book is to enable researchers and professionals in computer science, data science, and machine learning to gain an in-depth understanding of these algorithms. The text emphasizes how to derive key algorithms for convex optimization from first principles and how to establish precise running time bounds. This modern text explains the success of these algorithms in problems of discrete optimization, as well as how these methods have significantly pushed the state of the art of convex optimization itself.
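To give a flavor of the simplest of the methods listed above, here is a minimal sketch of gradient descent on a smooth convex objective. The least-squares objective, step size, and iteration count below are illustrative assumptions for this sketch and are not taken from the book.

import numpy as np

# Minimal gradient descent sketch (illustrative, not the book's code):
# repeatedly step against the gradient of a smooth convex function.
def gradient_descent(grad, x0, step_size, num_steps):
    x = x0
    for _ in range(num_steps):
        x = x - step_size * grad(x)
    return x

# Hypothetical example: minimize f(x) = ||Ax - b||^2, whose gradient is 2 A^T (Ax - b).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
grad_f = lambda x: 2.0 * A.T @ (A @ x - b)

x_min = gradient_descent(grad_f, x0=np.zeros(2), step_size=0.1, num_steps=200)
print(x_min)  # converges toward the minimizer [0.5, 1.0]

The step size here is deliberately kept below 2/L, where L is the Lipschitz constant of the gradient (here L = 8), which is the kind of condition the book's running time analyses make precise.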

Reviews

'The field of mathematical programming has two major themes: linear programming and convex programming. The far-reaching impact of the first theory in computer science, game theory and engineering is well known. We are now witnessing the growth of the second theory as it finds its way into diverse fields such as machine learning, mathematical economics and quantum computing. This much-awaited book with its unique approach, steeped in the modern theory of algorithms, will go a long way in making this happen.' Vijay V. Vazirani, Distinguished Professor at University of California, Irvine

'I had thought that there is no need for new books about convex optimization but this book proves me wrong. It treats both classic and cutting-edge topics with an unparalleled mix of clarity and rigor, building intuitions about key ideas and algorithms driving the field. A must read for anyone interested in optimization!' Aleksander Madry, Massachusetts Institute of Technology

'Vishnoi's book provides an exceptionally good introduction to convex optimization for students and researchers in computer science, operations research, and discrete optimization. The book gives a comprehensive introduction to classical results as well as to some of the most recent developments. Concepts and ideas are introduced from first principles, conveying helpful intuitions. There is significant emphasis on bridging continuous and discrete optimization, in particular, on recent breakthroughs on flow problems using convex optimization methods; the book starts with an enlightening overview of the interplay between these areas.' László Végh, LSE

'Recommended.' M. Bona, Choice Connect

Other information

A concise, accessible guide to the modern optimization methods that are transforming computer science, data science, and machine learning.
Preface xi
Acknowledgments xiv
Notation xv
1 Bridging Continuous and Discrete Optimization 1
1.1 An Example: The Maximum Flow Problem 2
1.2 Linear Programming 8
1.3 Fast and Exact Algorithms via Interior Point Methods 12
1.4 Ellipsoid Method beyond Succinct Linear Programs 13
2 Preliminaries 17
2.1 Derivatives, Gradients, and Hessians 17
2.2 Fundamental Theorem of Calculus 19
2.3 Taylor Approximation 19
2.4 Linear Algebra, Matrices, and Eigenvalues 20
2.5 The Cauchy-Schwarz Inequality 23
2.6 Norms 24
2.7 Euclidean Topology 25
2.8 Dynamical Systems 25
2.9 Graphs 27
2.10 Exercises 29
3 Convexity 35
3.1 Convex Sets 35
3.2 Convex Functions 36
3.3 The Usefulness of Convexity 42
3.4 Exercises 46
4 Convex Optimization and Efficiency 49
4.1 Convex Programs 49
4.2 Computational Models 51
4.3 Membership Problem for Convex Sets 53
4.4 Solution Concepts for Optimization Problems 59
4.5 The Notion of Polynomial Time for Convex Optimization 63
4.6 Exercises 65
5 Duality and Optimality 69
5.1 Lagrangian Duality 70
5.2 The Conjugate Function 74
5.3 KKT Optimality Conditions 76
5.4 Proof of Strong Duality under Slater's Condition 78
5.5 Exercises 79
6 Gradient Descent 84
6.1 The Setup 84
6.2 Gradient Descent 85
6.3 Analysis When the Gradient Is Lipschitz Continuous 89
6.4 Application: The Maximum Flow Problem 96
6.5 Exercises 101
7 Mirror Descent and the Multiplicative Weights Update 108
7.1 Beyond the Lipschitz Gradient Condition 108
7.2 A Local Optimization Principle and Regularizers 110
7.3 Exponential Gradient Descent 112
7.4 Mirror Descent 121
7.5 Multiplicative Weights Update 125
7.6 Application: Perfect Matching in Bipartite Graphs 126
7.7 Exercises 132
8 Accelerated Gradient Descent 143
8.1 The Setup 143
8.2 Main Result on Accelerated Gradient Descent 144
8.3 Proof Strategy: Estimate Sequences 145
8.4 Construction of an Estimate Sequence 147
8.5 The Algorithm and Its Analysis 152
8.6 An Algorithm for Strongly Convex and Smooth Functions 153
8.7 Application: Linear System of Equations 155
8.8 Exercises 156
9 Newton's Method 160
9.1 Finding a Root of a Univariate Function 160
9.2 Newton's Method for Multivariate Functions 164
9.3 Newton's Method for Unconstrained Optimization 165
9.4 First Take on the Analysis 167
9.5 Newton's Method as Steepest Descent 170
9.6 Analysis Based on a Local Norm 175
9.7 Analysis Based on the Euclidean Norm 181
9.8 Exercises 182
10 An Interior Point Method for Linear Programming 185
10.1 Linear Programming 185
10.2 Constrained Optimization via Barrier Functions 187
10.3 The Logarithmic Barrier Function 188
10.4 The Central Path 189
10.5 A Path-Following Algorithm for Linear Programming 190
10.6 Analysis of the Path-Following Algorithm 194
10.7 Exercises 208
11 Variants of Interior Point Method and Self-Concordance 215
11.1 The Minimum Cost Flow Problem 215
11.2 An IPM for Linear Programming in Standard Form 219
11.3 Application: The Minimum Cost Flow Problem 226
11.4 Self-Concordant Barriers 230
11.5 Linear Programming Using Self-Concordant Barriers 232
11.6 Semidefinite Programming Using Self-Concordant Barriers 239
11.7 Convex Optimization Using Self-Concordant Barriers 242
11.8 Exercises 242
12 Ellipsoid Method for Linear Programming 248
12.1 0-1-Polytopes with Exponentially Many Constraints 248
12.2 Cutting Plane Methods 252
12.3 Ellipsoid Method 258
12.4 Analysis of Volume Drop and Efficiency for Ellipsoids 261
12.5 Application: Linear Optimization over 0-1-Polytopes 269
12.6 Exercises 273
13 Ellipsoid Method for Convex Optimization 279
13.1 Convex Optimization Using the Ellipsoid Method? 279
13.2 Application: Submodular Function Minimization 281
13.3 Application: The Maximum Entropy Problem 289
13.4 Convex Optimization Using the Ellipsoid Method 294
13.5 Variants of Cutting Plane Method 301
13.6 Exercises 304
Bibliography 310
Index 319
Nisheeth K. Vishnoi is a Professor of Computer Science at Yale University. His research areas include theoretical computer science, optimization, and machine learning. He is a recipient of the Best Paper Award at IEEE FOCS in 2005, the IBM Research Pat Goldberg Memorial Award in 2006, the Indian National Science Academy Young Scientist Award in 2011, and the Best Paper Award at ACM FAccT in 2019. He was elected an ACM Fellow in 2019. He obtained a bachelor's degree in Computer Science and Engineering from IIT Bombay and a Ph.D. in Algorithms, Combinatorics and Optimization from Georgia Institute of Technology.