
Nonlinear Least Squares for Inverse Problems: Theoretical Foundations and Step-by-Step Guide for Applications, 2010 ed. [Hardcover]

  • Format: Hardback, 360 pages (XIV, 360 p.), height x width: 235x155 mm, weight: 1550 g
  • Series: Scientific Computation
  • Publication date: 19-Oct-2009
  • Publisher: Springer
  • ISBN-10: 904812784X
  • ISBN-13: 9789048127849
  • Hardcover
  • Price: 104,29 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 122,69 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
  • Free shipping
  • Order time: 2-4 weeks

This book provides a step-by-step introduction to the least squares resolution of nonlinear inverse problems. For readers interested in projection onto non-convex sets, it also presents the geometric theory of quasi-convex and strictly quasi-convex sets.



This book provides an introduction to the least squares resolution of nonlinear inverse problems. Its first goal is to develop a geometrical theory for analyzing nonlinear least squares (NLS) problems with respect to their quadratic wellposedness, i.e. both wellposedness and optimizability. These results make it possible to check whether various regularization techniques are applicable to a given problem. The second objective of the book is to present the practical issues that frequently arise when solving NLS problems. Application-oriented readers will find a detailed analysis of the reduction to finite dimensions, the algebraic determination of derivatives (sensitivity functions versus the adjoint method), the determination of the number of retrievable parameters, the choice of parametrization (multiscale, adaptive) and of the optimization step, and the general organization of the inversion code. Special attention is paid to parasitic local minima, which can stop the optimizer far from the global minimum: multiscale parametrization is shown to be an efficient remedy in many cases, and a new condition is given to check both wellposedness and the absence of parasitic local minima.
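As a rough, generic illustration of the kind of problem treated in Part I (this is not code from the book): given a forward map φ and data z, the NLS approach minimizes the misfit ‖φ(a) − z‖² over the admissible parameters a, usually with some regularization. The minimal Python sketch below uses assumed names (forward, a_true, a_prior, eps) and fits a hypothetical two-parameter exponential model with SciPy's Levenberg-Marquardt solver, adding a Tikhonov-type penalty in the spirit of the Levenberg-Marquardt-Tychonov regularization discussed in the book.

```python
import numpy as np
from scipy.optimize import least_squares

# Observation times and a hypothetical forward map: parameters a = (amplitude, decay rate)
# are mapped to predicted observations (all names here are illustrative assumptions).
t = np.linspace(0.0, 5.0, 50)

def forward(a):
    return a[0] * np.exp(-a[1] * t)

# Synthetic noisy data generated from an assumed "true" parameter.
rng = np.random.default_rng(0)
a_true = np.array([2.0, 1.3])
z = forward(a_true) + 0.05 * rng.standard_normal(t.size)

# Tikhonov-type regularization: append weighted parameter residuals to the data misfit,
# so the objective becomes ||forward(a) - z||^2 + eps^2 * ||a - a_prior||^2.
eps = 1e-2
a_prior = np.zeros(2)

def residuals(a):
    return np.concatenate([forward(a) - z, eps * (a - a_prior)])

# Levenberg-Marquardt iteration from an initial guess away from a_true.
result = least_squares(residuals, x0=np.array([1.0, 0.5]), method="lm")
print("estimated parameters:", result.x)  # should be close to a_true
```

Appending weighted parameter residuals is one simple way to damp poorly constrained directions; the weight eps trades data fit against stability and has to be chosen for the problem at hand.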

For readers who are interested in projection onto non-convex sets, Part II of this book presents the geometric theory of quasi-convex and strictly quasi-convex (s.q.c.) sets. S.q.c. sets can be recognized by their finite curvature and limited deflection, and they possess a neighborhood on which the projection is well behaved.
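Geometrically, the NLS solution is the projection of the data z onto the attainable set D = φ(C), the image of the admissible parameters under the forward map. When D is non-convex this projection can be ill-behaved, and a local optimizer may stop at a parasitic stationary point. The toy Python sketch below (assumed map phi and data z, not taken from the book) shows how two different starting points can yield different local "projections".

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical nonlinear forward map phi: parameter x -> point in data space.
# Its image D = phi(C) is a curve, hence a non-convex set (toy example, not from the book).
def phi(x):
    return np.array([np.sin(3.0 * x), x**2])

z = np.array([0.2, 1.5])  # observed data point to be projected onto D

def misfit(x):
    # Squared distance between the data z and the point phi(x) on the attainable set.
    xx = float(np.asarray(x).ravel()[0])
    return float(np.sum((phi(xx) - z) ** 2))

# Local minimization from two different starting points.
for x0 in (-2.0, 2.0):
    res = minimize(misfit, x0=[x0], method="Nelder-Mead")
    print(f"start {x0:5.1f} -> parameter {res.x[0]: .3f}, misfit {res.fun:.4f}")
# With a non-convex attainable set the two runs may stop at different stationary points;
# only one of them is the actual projection (the global minimum of the misfit).
```

The finite curvature and limited deflection conditions of Part II identify neighborhoods of such sets on which parasitic stationary points are excluded and the projection behaves well.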

Throughout the book, each chapter starts with an overview of the presented concepts and results.

Reviews

From the reviews:

"This comprehensive treatise on the nonlinear inverse problem, written by a mathematician with extensive experience in exploration geophysics, deals primarily with the nonlinear least squares (NLS) methods to solve such problems. Chavent has authored a book with appeal both to the practitioner of the arcane art of NLS inversion and to the theorist seeking a rigorous and formal development of what is currently known about this subject." (Sven Treitel, The Leading Edge, April 2010)

"The book is organized so that readers interested in the more practical aspects can easily dip into the appropriate chapters without having to work through the more theoretical details. … The book is recommended for readers who are interested in applying the OLS approach to nonlinear inverse problems. This material is relatively accessible even to readers without a very strong background in analysis. The book will also be of interest to readers who want to learn more about quasi-convex sets and Q-wellposedness." (Brian Borchers, The Mathematical Association of America, July 2010)

Table of contents

Preface  vii
I Nonlinear Least Squares  1
Nonlinear Inverse Problems: Examples and Difficulties  5
Example 1: Inversion of Knott-Zoeppritz Equations  6
An Abstract NLS Inverse Problem  9
Analysis of NLS Problems  10
Wellposedness  10
Optimizability  12
Output Least Squares Identifiability and Quadratically Wellposed Problems  12
Regularization  14
Derivation  20
Example 2: 1D Elliptic Parameter Estimation Problem  21
Example 3: 2D Elliptic Nonlinear Source Estimation Problem  24
Example 4: 2D Elliptic Parameter Estimation Problem  26
Computing Derivatives  29
Setting the Scene  30
The Sensitivity Functions Approach  33
The Adjoint Approach  33
Implementation of the Adjoint Approach  38
Example 1: The Adjoint Knott-Zoeppritz Equations  41
Examples 3 and 4: Discrete Adjoint Equations  46
Discretization Step 1: Choice of a Discretized Forward Map  47
Discretization Step 2: Choice of a Discretized Objective Function  52
Derivation Step 0: Forward Map and Objective Function  52
Derivation Step 1: State-Space Decomposition  53
Derivation Step 2: Lagrangian  54
Derivation Step 3: Adjoint Equation  56
Derivation Step 4: Gradient Equation  58
Examples 3 and 4: Continuous Adjoint Equations  59
Example 5: Differential Equations, Discretized Versus Discrete Gradient  65
Implementing the Discretized Gradient  68
Implementing the Discrete Gradient  68
Example 6: Discrete Marching Problems  73
Choosing a Parameterization  79
Calibration  80
On the Parameter Side  80
On the Data Side  83
Conclusion  84
How Many Parameters Can be Retrieved from the Data?  84
Simulation Versus Optimization Parameters  88
Parameterization by a Closed Form Formula  90
Decomposition on the Singular Basis  91
Multiscale Parameterization  93
Simulation Parameters for a Distributed Parameter  93
Optimization Parameters at Scale k  94
Scale-By-Scale Optimization  95
Examples of Multiscale Bases  105
Summary for Multiscale Parameterization  108
Adaptive Parameterization: Refinement Indicators  108
Definition of Refinement Indicators  109
Multiscale Refinement Indicators  116
Application to Image Segmentation  121
Coarsening Indicators  122
A Refinement/Coarsening Indicators Algorithm  124
Implementation of the Inversion  126
Constraints and Optimization Parameters  126
Gradient with Respect to Optimization Parameters  129
Maximum Projected Curvature: A Descent Step for Nonlinear Least Squares  135
Descent Algorithms  135
Maximum Projected Curvature (MPC) Step  137
Convergence Properties for the Theoretical MPC Step  143
Implementation of the MPC Step  144
Performance of the MPC Step  148
Output Least Squares Identifiability and Quadratically Wellposed NLS Problems  161
The Linear Case  163
Finite Curvature/Limited Deflection Problems  165
Identifiability and Stability of the Linearized Problems  174
A Sufficient Condition for OLS-Identifiability  176
The Case of Finite Dimensional Parameters  179
Four Questions to Q-Wellposedness  182
Case of Finite Dimensional Parameters  183
Case of Infinite Dimensional Parameters  184
Answering the Four Questions  184
Application to Example 2: 1D Parameter Estimation with H1 Observation  191
Linear Stability  193
Deflection Estimate  198
Curvature Estimate  199
Conclusion: OLS-Identifiability  200
Application to Example 4: 2D Parameter Estimation with H1 Observation  200
Regularization of Nonlinear Least Squares Problems  209
Levenberg-Marquardt-Tychonov (LMT) Regularization  209
Linear Problems  211
Finite Curvature/Limited Deflection (FC/LD) Problems  219
General Nonlinear Problems  231
Application to the Nonlinear 2D Source Problem  237
State-Space Regularization  246
Dense Observation: Geometric Approach  248
Incomplete Observation: Soft Analysis  256
Adapted Regularization for Example 4: 2D Parameter Estimation with H1 Observation  259
Which Part of a is Constrained by the Data?  260
How to Control the Unconstrained Part?  262
The Adapted-Regularized Problem  264
Infinite Dimensional Linear Stability and Deflection Estimates  265
Finite Curvature Estimate  267
OLS-Identifiability for the Adapted Regularized Problem  268
II A Generalization of Convex Sets  271
Quasi-Convex Sets  275
Equipping the Set D with Paths  277
Definition and Main Properties of q.c. Sets  281
Strictly Quasi-Convex Sets  299
Definition and Main Properties of s.q.c. Sets  300
Characterization by the Global Radius of Curvature  304
Formula for the Global Radius of Curvature  316
Deflection Conditions for the Strict Quasi-convexity of Sets  321
The General Case: D ⊂ F  327
The Case of an Attainable Set D = φ(C)  337
Bibliography  345
Index  353
Background: Ecole Polytechnique (Paris, 1965), Ecole Nationale Supérieure des Télécommunications (Paris, 1968), Paris-6 University (Ph.D., 1971).

Professor Chavent joined the Faculty of Paris 9-Dauphine in 1971 and is now an emeritus professor of that university. Over the same period, he led a research project at INRIA (Institut National de Recherche en Informatique et en Automatique) focused on industrial inverse problems (oil production and exploration, nuclear reactors, groundwater management).