
E-book: Surrogates: Gaussian Process Modeling, Design, and Optimization for the Applied Sciences

Robert B. Gramacy (Virginia Tech Department of Statistics, USA)
  • Format: PDF+DRM
  • Price: 51,99 €*
  • * the price is final, i.e., no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that special software is required to read it. You will also need to create an Adobe ID; more information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (this is a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Surrogates is a graduate textbook, or professional handbook, on topics at the interface between machine learning, spatial statistics, computer simulation, meta-modeling (i.e., emulation), design of experiments, and optimization. Experimentation through simulation, "human out-of-the-loop" statistical support (focusing on the science), management of dynamic processes, online and real-time analysis, automation, and practical application are at the forefront.

Topics include:

  • Gaussian process (GP) regression for flexible nonparametric and nonlinear modeling.
  • Applications to uncertainty quantification, sensitivity analysis, calibration of computer models to field data, sequential design/active learning and (blackbox/Bayesian) optimization under uncertainty.
  • Advanced topics include treed partitioning, local GP approximation, modeling of simulation experiments (e.g., agent-based models) with coupled nonlinear mean and variance (heteroskedastic) models.
  • Treatment appreciates historical response surface methodology (RSM) and canonical examples, but emphasizes contemporary methods and implementation in R at modern scale.
  • Rmarkdown facilitates a fully reproducible tour, complete with motivation from, application to, and illustration with, compelling real-data examples.

Presentation targets numerically competent practitioners in engineering, physical, and biological sciences. Writing is statistical in form, but the subjects are not about statistics. Rather, they’re about prediction and synthesis under uncertainty; about visualization and information, design and decision making, computing and clean code.
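
As a flavor of what GP regression involves, and of the book's code-forward style, here is a minimal, self-contained sketch in base R. It is our own illustration under simple assumptions (a zero-mean GP, a Gaussian kernel with unit lengthscale, and a noisy sinusoid as stand-in data), not code from the book:

    # Minimal GP regression sketch in base R (illustrative; not the book's code)
    set.seed(1)
    eps <- sqrt(.Machine$double.eps)               # jitter for numerical stability
    X <- matrix(seq(0, 2*pi, length=8), ncol=1)    # training inputs
    y <- sin(X) + rnorm(nrow(X), sd=0.1)           # noisy responses
    XX <- matrix(seq(0, 2*pi, length=100), ncol=1) # predictive grid

    K   <- exp(-as.matrix(dist(X))^2)  + diag(eps, nrow(X))  # train covariance
    KXX <- exp(-as.matrix(dist(XX))^2) + diag(eps, nrow(XX)) # test covariance
    KX  <- exp(-outer(XX[,1], X[,1], "-")^2)                 # cross covariance

    Ki  <- solve(K)
    mu  <- KX %*% Ki %*% y            # posterior predictive mean
    Sig <- KXX - KX %*% Ki %*% t(KX)  # posterior predictive covariance

    plot(X, y); lines(XX, mu)         # data with predictive mean overlaid

A real analysis would estimate lengthscale, scale, and noise (nugget) hyperparameters rather than fixing them; that is where the book's Chapter 5 picks up.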

Reviews

"The coverage of this book is unique and important. It focuses on a current area at the edge of applied mathematics and statistics, a domain that really should be substantially better-developed. For researchers and students who already have a solid foundation in statistics and familiarity with R, and want to know more about how statistics can be used in the approximation of complex functions and numerical optimization (i.e. computer experiments), this should be a welcome resource." -Max Morris, Iowa State University, USA

"This book is a fantastic exploration of Gaussian process surrogates and a variety of applications in which they have been utilized. This approach is rapidly expanding in both the statistical and machine learning communities. I particularly enjoyed the applied focus of this book and the ease with which the author enables the reader to follow along, by providing code for each example discussed. In my view, the technical content of the book is well-chosen, and the flow of material should be very well-received by the readership." -Brian J. Williams, Scientist, Los Alamos National Laboratory

"This book offers a good coverage of Gaussian Process for computer simulation experiments. [ . . .] Reading the book for two hours you already forgot the fear you had when you first opened this 543-page long book that is impressively titled as surrogate. The accessible R examples, the tongue-in-the-cheek tone, the pursuit of simplicity (simple but not simpler), greatly reduced the distance between the reader and the system of knowledge presented in this book. On the other hand, the complexity of the subject matter is never lost but a sense of appreciation of the complexity is enhanced during reading, and unsolved mysteries remain open. Is this the reason why Newtons ironically long and complicated comment is quoted in the very beginning of the preface to establish the notion of simplicity? At any rate, there are invitations everywhere in the book that are empowering the readers, not just to use the codes for whatever imminent practical tasks at hand, but also embark on methodological pursuit in future to solve the unsolved mysteries. This is a great book for PhD students for sure, but also a good entry point for anyone who opens the book in a hope to readily use some methods as well." -Shuai Huang, Journal of Quality Technology

"At 543 pages in length, the books coverage is exhaustive, covering a wide variety of well-chosen theoretical and practical topics in computer modeling. The book does a fantastic job exploring the big topics in computer modeling, including prediction/emulation, uncertainty quantification, calibration, sensitivity analysis, and the sequential optimization of computer experiments.[ . . .] In particular, the amount of reproducible code the author provides is really where the book shines. By far the greatest feature of this book is the amount of effort that the author has put into making the concepts understandable via available R code. Almost no equation in the book comes without an accompanying snippet of example code in R. Rather than have an appendix of code in the back of the book, the author wisely peppers the code upfront throughout the text of the chapters making the formulas and examples (down to even the figures) in the book easy to follow along, understand, and replicate. In fact, the entire content of the book is reproducible given the authors choice to use Rmarkdown for all of the writing. Stitching all of the Rmarkdown files together using bookdown, the author has also made the book and code both easily accessible and widely available to readers (see https://bookdown.org/rbg/surrogates/). This style of writing is intentional and makes it both easy and enjoyable for the reader to follow along. The author has clearly made it a priority tomake the book both appealing and useful to its reader. The author understands that a significant portion the readership of the book will be practitioners and thus has made the book very easy to understand and use, while not sacrificing quality of material. As such, the book has the potential to be the "go to" reference for people entering the area. Likewise, the book would serve well as a text for a class (likely a graduate course) on computer experiments." -Tony Pourmohamad, Technometrics

"In conclusion, Surrogates: Gaussian Process Modeling, Design, and Optimization for the Applied Sciences is a book that is a fusion of response surface methodology and associated problems with Gaussian process modelling. Gramacy covers a lot of ground while being very attentive to the various fields of machine learning and statistics that have considered Gaussian processes. He has synthesised the knowledge about these topics in a very interesting and fresh manner. The book is a great introduction to Gaussian processes and their use on large-scale datasets, along with their application to various problems in design of experiments. The R code provided will allow users of the book to be able to implement these methods quickly in practice. I look forward to future editions of the book." -Debashis Ghosh, International Statistical Review

Table of Contents

Preface
1 Historical Perspective
1.1 Response surface methodology
1.1.1 What is it?
1.1.2 Low-order polynomials
1.1.3 General models, inference and sequential design
1.2 Computer experiments
1.2.1 Aircraft wing weight example
1.2.2 Surrogate modeling and design
1.3 A road map
1.4 Homework exercises
2 Four Motivating Datasets
2.1 Rocket booster dynamics
2.1.1 Data
2.1.2 Sequential design and nonstationary surrogate modeling
2.2 Radiative shock hydrodynamics
2.2.1 Data
2.2.2 Computer model calibration
2.3 Predicting satellite drag
2.3.1 Simulating drag
2.3.2 Surrogate drag
2.3.3 Big tpm runs
2.4 Groundwater remediation
2.4.1 Optimization and search
2.4.2 Where from here?
2.5 Data out there
2.6 Homework exercises
3 Steepest Ascent and Ridge Analysis
3.1 Path of steepest ascent
3.1.1 Signs and magnitudes
3.1.2 Confidence regions
3.1.3 Constrained ascent
3.2 Second-order response surfaces
3.2.1 Canonical analysis
3.2.2 Ridge analysis
3.2.3 Sampling properties
3.2.4 Confidence in the stationary point
3.2.5 Intervals on eigenvalues
3.3 Homework exercises
4 Space-filling Design
4.1 Latin hypercube sample
4.1.1 LHS properties
4.1.2 LHS variations and extensions
4.2 Maximin designs
4.2.1 Calculating maximin designs
4.2.2 Sequential maximin design
4.3 Libraries and hybrids
4.4 Homework exercises
5 Gaussian Process Regression
5.1 Gaussian process prior
5.1.1 Gaussian process posterior
5.1.2 Higher dimension?
5.2 GP hyperparameters
5.2.1 Scale
5.2.2 Noise and nuggets
5.2.3 Derivative-based hyperparameter optimization
5.2.4 Lengthscale: rate of decay of correlation
5.2.5 Anisotropic modeling
5.2.6 Library
5.2.7 A bakeoff
5.3 Some interpretation and perspective
5.3.1 Bayesian linear regression?
5.3.2 Latent random field
5.3.3 Stationary kernels
5.3.4 Signal-to-noise
5.4 Challenges and remedies
5.4.1 GP by convolution
5.4.2 Limits of stationarity
5.4.3 Functional and other outputs
5.5 Homework exercises
6 Model-based Design for GPs
6.1 Model-based design
6.1.1 Maximum entropy design
6.1.2 Minimizing predictive uncertainty
6.2 Sequential design/active learning
6.2.1 Whack-a-mole: active learning MacKay
6.2.2 A more aggregate criterion: active learning Cohn
6.2.3 Other sequential criteria
6.3 Fast GP updates
6.4 Homework exercises
7 Optimization
7.1 Surrogate-assisted optimization
7.1.1 A running example
7.1.2 A classical comparator
7.2 Expected improvement
7.2.1 Classic EI illustration
7.2.2 EI on our running example
7.2.3 Conditional improvement
7.2.4 Noisy objectives
7.2.5 Illustrating conditional improvement and noise
7.3 Optimization under constraints
7.3.1 Known constraints
7.3.2 Blackbox binary constraints
7.3.3 Real-valued constraints
7.3.4 Augmented Lagrangian
7.3.5 Augmented Lagrangian Bayesian optimization (ALBO)
7.3.6 ALBO implementation details
7.3.7 ALBO in action
7.3.8 Equality constraints and more
7.4 Homework exercises
8 Calibration and Sensitivity
8.1 Calibration
8.1.1 Kennedy and O'Hagan framework
8.1.2 Modularization
8.1.3 Calibration as optimization
8.1.4 Removing bias
8.1.5 Back to Bayes
8.2 Sensitivity analysis
8.2.1 Uncertainty distribution
8.2.2 Main effects
8.2.3 First-order and total sensitivity
8.2.4 Bayesian sensitivity
8.3 Homework exercises
9 GP Fidelity and Scale
9.1 Compactly supported kernels
9.1.1 Working with CSKs
9.1.2 Sharing load between mean and variance
9.1.3 Practical Bayesian inference and UQ
9.2 Partition models and regression trees
9.2.1 Divide-and-conquer regression
9.2.2 Treed Gaussian process
9.2.3 Regression tree extensions, off-shoots and fix-ups
9.3 Local approximate GPs
9.3.1 Local subdesign
9.3.2 Illustrating LAGP: ALC v. MSPE
9.3.3 Global LAGP surrogate
9.3.4 Global/local multi-resolution effect
9.3.5 Details and variations
9.3.6 Two applications
9.4 Homework exercises
10 Heteroskedasticity
10.1 Replication and stochastic kriging
10.1.1 Woodbury trick
10.1.2 Efficient inference and prediction under replication
10.2 Coupled mean and variance GPs
10.2.1 Latent variance process
10.2.2 Illustrations with hetGP
10.3 Sequential design
10.3.1 Integrated mean-squared prediction error
10.3.2 Lookahead over replication
10.3.3 Examples
10.3.4 Optimization, level sets, calibration and more
10.4 Homework exercises
Appendix
A Numerical Linear Algebra for Fast GPs
A.1 Intel MKL and OSX Accelerate
A.2 Stochastic approximation
B An Experiment Game
B.1 A shiny update to an old game
B.2 Benchmarking play in real-time
Bibliography
Index
About the Author

Robert B. Gramacy is a professor of Statistics in the College of Science at Virginia Tech. His research interests include Bayesian modeling methodology, statistical computing, Monte Carlo inference, nonparametric regression, sequential design, and optimization under uncertainty. Bobby enjoys cycling and ice hockey, and watching his kids grow up too fast.