
E-book: Nonlinear Parameter Optimization Using R Tools

John C. Nash (University of Ottawa, Canada)
  • Format: EPUB+DRM
  • Publication date: 03-Apr-2014
  • Publisher: John Wiley & Sons Inc
  • Language: eng
  • ISBN-13: 9781118883969
  • Price: 72,80 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not permitted

  • Printing:

    not permitted

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Nonlinear Parameter Optimization Using R Tools
John C. Nash, Telfer School of Management, University of Ottawa, Canada

A systematic and comprehensive treatment of optimization software using R

In recent decades, optimization techniques have been streamlined by computational and artificial intelligence methods to analyze more variables, especially under nonlinear, multivariable conditions, more quickly than ever before.

Optimization is an important tool for decision science and for the analysis of physical systems used in engineering. Nonlinear Parameter Optimization with R explores the principal tools available in R for function minimization, optimization, and nonlinear parameter determination and features numerous examples throughout.
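As a flavor of the kind of function minimization the book covers, here is a minimal sketch (not taken from the book itself) using base R's optim() on the classic Rosenbrock test function, with an analytic gradient supplied:

```r
# Rosenbrock "banana" function, a standard nonlinear test problem
rosenbrock <- function(p) {
  (1 - p[1])^2 + 100 * (p[2] - p[1]^2)^2
}

# Its analytic gradient, which helps gradient-based methods such as BFGS
rosenbrock_grad <- function(p) {
  c(-400 * p[1] * (p[2] - p[1]^2) - 2 * (1 - p[1]),
    200 * (p[2] - p[1]^2))
}

# Minimize from the traditional starting point (-1.2, 1)
res <- optim(c(-1.2, 1), rosenbrock, rosenbrock_grad, method = "BFGS")
res$par  # should be close to the true minimum at (1, 1)
```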

Nonlinear Parameter Optimization with R:

  • Provides a comprehensive treatment of optimization techniques
  • Examines optimization problems that arise in statistics and how to solve them using R
  • Enables researchers and practitioners to solve parameter determination problems
  • Presents traditional methods as well as recent developments in R
  • Is supported by an accompanying website featuring R code, examples and datasets

Researchers and practitioners who need to solve parameter determination problems, and who use R but are novices in the field of optimization or function minimization, will benefit from this book. It will also be useful for scientists building and estimating nonlinear models in various fields such as hydrology, sports forecasting, ecology, chemical engineering, pharmacokinetics, agriculture, economics and statistics.
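The nonlinear parameter determination mentioned above can be sketched with nls() from the stats package on synthetic data (the model and values here are illustrative, not an example from the book):

```r
set.seed(42)
x <- 1:10
# Synthetic observations from an exponential growth model y = a * exp(b * x)
y <- 2 * exp(0.3 * x) + rnorm(length(x), sd = 0.1)

# Estimate parameters a and b by nonlinear least squares
fit <- nls(y ~ a * exp(b * x), start = list(a = 1, b = 0.1))
coef(fit)  # estimates should be near the true values a = 2, b = 0.3
```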

Reviews

"The book chapters are enriched by little anecdotes, and the reader obviously benefits from John C. Nash's experience of more than 30 years in the field of nonlinear optimization. This experience translates into many practical recommendations and tweaks. The book provides plenty of code examples and useful code snippets." (Biometrical Journal, 2016)

Contents

Preface
1 Optimization problem tasks and how they arise
1.1 The general optimization problem
1.2 Why the general problem is generally uninteresting
1.3 (Non-)Linearity
1.4 Objective function properties
1.4.1 Sums of squares
1.4.2 Minimax approximation
1.4.3 Problems with multiple minima
1.4.4 Objectives that can only be imprecisely computed
1.5 Constraint types
1.6 Solving sets of equations
1.7 Conditions for optimality
1.8 Other classifications
References
2 Optimization algorithms -- an overview
2.1 Methods that use the gradient
2.2 Newton-like methods
2.3 The promise of Newton's method
2.4 Caution: convergence versus termination
2.5 Difficulties with Newton's method
2.6 Least squares: Gauss--Newton methods
2.7 Quasi-Newton or variable metric method
2.8 Conjugate gradient and related methods
2.9 Other gradient methods
2.10 Derivative-free methods
2.10.1 Numerical approximation of gradients
2.10.2 Approximate and descend
2.10.3 Heuristic search
2.11 Stochastic methods
2.12 Constraint-based methods -- mathematical programming
References
3 Software structure and interfaces
3.1 Perspective
3.2 Issues of choice
3.3 Software issues
3.4 Specifying the objective and constraints to the optimizer
3.5 Communicating exogenous data to problem definition functions
3.5.1 Use of "global" data and variables
3.6 Masked (temporarily fixed) optimization parameters
3.7 Dealing with inadmissible results
3.8 Providing derivatives for functions
3.9 Derivative approximations when there are constraints
3.10 Scaling of parameters and function
3.11 Normal ending of computations
3.12 Termination tests -- abnormal ending
3.13 Output to monitor progress of calculations
3.14 Output of the optimization results
3.15 Controls for the optimizer
3.16 Default control settings
3.17 Measuring performance
3.18 The optimization interface
References
4 One-parameter root-finding problems
4.1 Roots
4.2 Equations in one variable
4.3 Some examples
4.3.1 Exponentially speaking
4.3.2 A normal concern
4.3.3 Little Polly Nomial
4.3.4 A hypothequial question
4.4 Approaches to solving 1D root-finding problems
4.5 What can go wrong?
4.6 Being a smart user of root-finding programs
4.7 Conclusions and extensions
References
5 One-parameter minimization problems
5.1 The optimize() function
5.2 Using a root-finder
5.3 But where is the minimum?
5.4 Ideas for 1D minimizers
5.5 The line-search subproblem
References
6 Nonlinear least squares
6.1 nls() from package stats
6.1.1 A simple example
6.1.2 Regression versus least squares
6.2 A more difficult case
6.3 The structure of the nls() solution
6.4 Concerns with nls()
6.4.1 Small residuals
6.4.2 Robustness -- "singular gradient" woes
6.4.3 Bounds with nls()
6.5 Some ancillary tools for nonlinear least squares
6.5.1 Starting values and self-starting problems
6.5.2 Converting model expressions to sum-of-squares functions
6.5.3 Help for nonlinear regression
6.6 Minimizing R functions that compute sums of squares
6.7 Choosing an approach
6.8 Separable sums of squares problems
6.9 Strategies for nonlinear least squares
References
7 Nonlinear equations
7.1 Packages and methods for nonlinear equations
7.1.1 BB
7.1.2 nleqslv
7.1.3 Using nonlinear least squares
7.1.4 Using function minimization methods
7.2 A simple example to compare approaches
7.3 A statistical example
References
8 Function minimization tools in the base R system
8.1 optim()
8.2 nlm()
8.3 nlminb()
8.4 Using the base optimization tools
References
9 Add-in function minimization packages for R
9.1 Package optimx
9.1.1 Optimizers in optimx
9.1.2 Example use of optimx()
9.2 Some other function minimization packages
9.2.1 nloptr and nloptwrap
9.2.2 trust and trustOptim
9.3 Should we replace optim() routines?
References
10 Calculating and using derivatives
10.1 Why and how
10.2 Analytic derivatives -- by hand
10.3 Analytic derivatives -- tools
10.4 Examples of use of R tools for differentiation
10.5 Simple numerical derivatives
10.6 Improved numerical derivative approximations
10.6.1 The Richardson extrapolation
10.6.2 Complex-step derivative approximations
10.7 Strategy and tactics for derivatives
References
11 Bounds constraints
11.1 Single bound: use of a logarithmic transformation
11.2 Interval bounds: use of a hyperbolic transformation
11.2.1 Example of the tanh transformation
11.2.2 A fly in the ointment
11.3 Setting the objective large when bounds are violated
11.4 An active set approach
11.5 Checking bounds
11.6 The importance of using bounds intelligently
11.6.1 Difficulties in applying bounds constraints
11.7 Post-solution information for bounded problems
Appendix 11.A Function transfinite
References
12 Using masks
12.1 An example
12.2 Specifying the objective
12.3 Masks for nonlinear least squares
12.4 Other approaches to masks
References
13 Handling general constraints
13.1 Equality constraints
13.1.1 Parameter elimination
13.1.2 Which parameter to eliminate?
13.1.3 Scaling and centering?
13.1.4 Nonlinear programming packages
13.1.5 Sequential application of an increasing penalty
13.2 Sumscale problems
13.2.1 Using a projection
13.3 Inequality constraints
13.4 A perspective on penalty function ideas
13.5 Assessment
References
14 Applications of mathematical programming
14.1 Statistical applications of math programming
14.2 R packages for math programming
14.3 Example problem: L1 regression
14.4 Example problem: minimax regression
14.5 Nonlinear quantile regression
14.6 Polynomial approximation
References
15 Global optimization and stochastic methods
15.1 Panorama of methods
15.2 R packages for global and stochastic optimization
15.3 An example problem
15.3.1 Method SANN from optim()
15.3.2 Package GenSA
15.3.3 Packages DEoptim and RcppDE
15.3.4 Package smco
15.3.5 Package soma
15.3.6 Package Rmalschains
15.3.7 Package rgenoud
15.3.8 Package GA
15.3.9 Package gaoptim
15.4 Multiple starting values
References
16 Scaling and reparameterization
16.1 Why scale or reparameterize?
16.2 Formalities of scaling and reparameterization
16.3 Hobbs' weed infestation example
16.4 The KKT conditions and scaling
16.5 Reparameterization of the weeds problem
16.6 Scale change across the parameter space
16.7 Robustness of methods to starting points
16.7.1 Robustness of optimization techniques
16.7.2 Robustness of nonlinear least squares methods
16.8 Strategies for scaling
References
17 Finding the right solution
17.1 Particular requirements
17.1.1 A few integer parameters
17.2 Starting values for iterative methods
17.3 KKT conditions
17.3.1 Unconstrained problems
17.3.2 Constrained problems
17.4 Search tests
References
18 Tuning and terminating methods
18.1 Timing and profiling
18.1.1 rbenchmark
18.1.2 microbenchmark
18.1.3 Calibrating our timings
18.2 Profiling
18.2.1 Trying possible improvements
18.3 More speedups of R computations
18.3.1 Byte-code compiled functions
18.3.2 Avoiding loops
18.3.3 Package upgrades -- an example
18.3.4 Specializing codes
18.4 External language compiled functions
18.4.1 Building an R function using Fortran
18.4.2 Summary of Rayleigh quotient timings
18.5 Deciding when we are finished
18.5.1 Tests for things gone wrong
References
19 Linking R to external optimization tools
19.1 Mechanisms to link R to external software
19.1.1 R functions to call external (sub)programs
19.1.2 File and system call methods
19.1.3 Thin client methods
19.2 Prepackaged links to external optimization tools
19.2.1 NEOS
19.2.2 Automatic Differentiation Model Builder (ADMB)
19.2.3 NLopt
19.2.4 BUGS and related tools
19.3 Strategy for using external tools
References
20 Differential equation models
20.1 The model
20.2 Background
20.3 The likelihood function
20.4 A first try at minimization
20.5 Attempts with optimx
20.6 Using nonlinear least squares
20.7 Commentary
Reference
21 Miscellaneous nonlinear estimation tools for R
21.1 Maximum likelihood
21.2 Generalized nonlinear models
21.3 Systems of equations
21.4 Additional nonlinear least squares tools
21.5 Nonnegative least squares
21.6 Noisy objective functions
21.7 Moving forward
References
Appendix A R packages used in examples
Index