
E-book: Computational Statistics

Geof H. Givens (Colorado State University, Fort Collins), Jennifer A. Hoeting (Colorado State University, Fort Collins)
  • Format: EPUB+DRM
  • Price: 137,02 €*
  • * the price is final, i.e. no further discounts apply
  • Add to cart
  • Add to wish list
  • This e-book is intended for personal use only. E-books cannot be returned.
  • For libraries

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, you need to install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Givens and Hoeting (both statistics, Colorado State U.) provide graduate students in statistics and related fields, statisticians, and quantitative empirical scientists in other fields most of the information they need to develop a broad working knowledge of modern computational statistics. They emphasize how and why existing methods work, so readers learn to use them effectively both alone and when combined with others in more advanced methods. Without mentioning a date for the first edition, they say they have updated and broadened the second, adding computer code, new MCMC topics that are now popular, more methods for problems where statistical dependency is important, and new support for the open-source statistical language R. Annotation ©2013 Book News, Inc., Portland, OR (booknews.com)

This new edition continues to serve as a comprehensive guide to modern and classical methods of statistical computing. The book comprises four main parts spanning the field:

  • Optimization
  • Integration and Simulation
  • Bootstrapping
  • Density Estimation and Smoothing

Within these sections, each chapter includes a comprehensive introduction and step-by-step implementation summaries to accompany the explanations of key methods. The new edition includes updated coverage of existing topics as well as new topics such as adaptive MCMC and bootstrapping for correlated data. The book website now includes comprehensive R code for the entire book. There are extensive exercises, real examples, and helpful insights about how to use the methods in practice.
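For a flavor of the kind of computing the book teaches, here is a minimal R sketch of the nonparametric bootstrap (the topic of Section 9.2.1): it resamples hypothetical data with replacement to estimate the standard error of the sample median. It is an illustration only, not code from the book or its website.

    # Minimal nonparametric bootstrap sketch (illustrative, not the book's code).
    # Estimate the standard error of the sample median by resampling.
    set.seed(1)
    x <- rnorm(50, mean = 10, sd = 2)   # hypothetical sample of size 50
    B <- 2000                           # number of bootstrap replicates
    medians <- replicate(B, median(sample(x, replace = TRUE)))
    sd(medians)                         # bootstrap estimate of the standard error

The same resampling pattern extends to the block bootstraps for dependent data covered later in Chapter 9.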

Preface xv
Acknowledgments xvii
1 Review 1(20)
1.1 Mathematical Notation 1(1)
1.2 Taylor's Theorem and Mathematical Limit Theory 2(2)
1.3 Statistical Notation and Probability Distributions 4(5)
1.4 Likelihood Inference 9(2)
1.5 Bayesian Inference 11(2)
1.6 Statistical Limit Theory 13(1)
1.7 Markov Chains 14(3)
1.8 Computing 17(4)
PART I OPTIMIZATION
2 Optimization and Solving Nonlinear Equations 21(38)
2.1 Univariate Problems 22(12)
2.1.1 Newton's Method 26(3)
2.1.1.1 Convergence Order 29(1)
2.1.2 Fisher Scoring 30(1)
2.1.3 Secant Method 30(2)
2.1.4 Fixed-Point Iteration 32(1)
2.1.4.1 Scaling 33(1)
2.2 Multivariate Problems 34(25)
2.2.1 Newton's Method and Fisher Scoring 34(2)
2.2.1.1 Iteratively Reweighted Least Squares 36(3)
2.2.2 Newton-Like Methods 39(1)
2.2.2.1 Ascent Algorithms 39(2)
2.2.2.2 Discrete Newton and Fixed-Point Methods 41(1)
2.2.2.3 Quasi-Newton Methods 41(3)
2.2.3 Gauss-Newton Method 44(1)
2.2.4 Nelder-Mead Algorithm 45(7)
2.2.5 Nonlinear Gauss-Seidel Iteration 52(2)
Problems 54(5)
3 Combinatorial Optimization 59(38)
3.1 Hard Problems and NP-Completeness 59(6)
3.1.1 Examples 61(3)
3.1.2 Need for Heuristics 64(1)
3.2 Local Search 65(3)
3.3 Simulated Annealing 68(7)
3.3.1 Practical Issues 70(1)
3.3.1.1 Neighborhoods and Proposals 70(1)
3.3.1.2 Cooling Schedule and Convergence 71(3)
3.3.2 Enhancements 74(1)
3.4 Genetic Algorithms 75(10)
3.4.1 Definitions and the Canonical Algorithm 75(1)
3.4.1.1 Basic Definitions 75(1)
3.4.1.2 Selection Mechanisms and Genetic Operators 76(2)
3.4.1.3 Allele Alphabets and Genotypic Representation 78(1)
3.4.1.4 Initialization, Termination, and Parameter Values 79(1)
3.4.2 Variations 80(1)
3.4.2.1 Fitness 80(1)
3.4.2.2 Selection Mechanisms and Updating Generations 81(1)
3.4.2.3 Genetic Operators and Permutation Chromosomes 82(2)
3.4.3 Initialization and Parameter Values 84(1)
3.4.4 Convergence 84(1)
3.5 Tabu Algorithms 85(12)
3.5.1 Basic Definitions 86(1)
3.5.2 The Tabu List 87(1)
3.5.3 Aspiration Criteria 88(1)
3.5.4 Diversification 89(1)
3.5.5 Intensification 90(1)
3.5.6 Comprehensive Tabu Algorithm 91(1)
Problems 92(5)
4 EM Optimization Methods 97(32)
4.1 Missing Data, Marginalization, and Notation 97(1)
4.2 The EM Algorithm 98(13)
4.2.1 Convergence 102(3)
4.2.2 Usage in Exponential Families 105(1)
4.2.3 Variance Estimation 106(1)
4.2.3.1 Louis's Method 106(2)
4.2.3.2 SEM Algorithm 108(2)
4.2.3.3 Bootstrapping 110(1)
4.2.3.4 Empirical Information 110(1)
4.2.3.5 Numerical Differentiation 111(1)
4.3 EM Variants 111(18)
4.3.1 Improving the E Step 111(1)
4.3.1.1 Monte Carlo EM 111(1)
4.3.2 Improving the M Step 112(1)
4.3.2.1 ECM Algorithm 113(3)
4.3.2.2 EM Gradient Algorithm 116(2)
4.3.3 Acceleration Methods 118(1)
4.3.3.1 Aitken Acceleration 118(1)
4.3.3.2 Quasi-Newton Acceleration 119(2)
Problems 121(8)
PART II INTEGRATION AND SIMULATION
5 Numerical Integration 129(22)
5.1 Newton-Cotes Quadrature 129(10)
5.1.1 Riemann Rule 130(4)
5.1.2 Trapezoidal Rule 134(2)
5.1.3 Simpson's Rule 136(2)
5.1.4 General kth-Degree Rule 138(1)
5.2 Romberg Integration 139(3)
5.3 Gaussian Quadrature 142(4)
5.3.1 Orthogonal Polynomials 143(1)
5.3.2 The Gaussian Quadrature Rule 143(3)
5.4 Frequently Encountered Problems 146(5)
5.4.1 Range of Integration 146(1)
5.4.2 Integrands with Singularities or Other Extreme Behavior 146(1)
5.4.3 Multiple Integrals 147(1)
5.4.4 Adaptive Quadrature 147(1)
5.4.5 Software for Exact Integration 148(1)
Problems 148(3)
6 Simulation and Monte Carlo Integration 151(50)
6.1 Introduction to the Monte Carlo Method 151(1)
6.2 Exact Simulation 152(11)
6.2.1 Generating from Standard Parametric Families 153(1)
6.2.2 Inverse Cumulative Distribution Function 153(2)
6.2.3 Rejection Sampling 155(3)
6.2.3.1 Squeezed Rejection Sampling 158(1)
6.2.3.2 Adaptive Rejection Sampling 159(4)
6.3 Approximate Simulation 163(17)
6.3.1 Sampling Importance Resampling Algorithm 163(4)
6.3.1.1 Adaptive Importance, Bridge, and Path Sampling 167(1)
6.3.2 Sequential Monte Carlo 168(1)
6.3.2.1 Sequential Importance Sampling for Markov Processes 169(1)
6.3.2.2 General Sequential Importance Sampling 170(1)
6.3.2.3 Weight Degeneracy, Rejuvenation, and Effective Sample Size 171(4)
6.3.2.4 Sequential Importance Sampling for Hidden Markov Models 175(4)
6.3.2.5 Particle Filters 179(1)
6.4 Variance Reduction Techniques 180(21)
6.4.1 Importance Sampling 180(6)
6.4.2 Antithetic Sampling 186(3)
6.4.3 Control Variates 189(4)
6.4.4 Rao-Blackwellization 193(2)
Problems 195(6)
7 Markov Chain Monte Carlo 201(36)
7.1 Metropolis-Hastings Algorithm 202(7)
7.1.1 Independence Chains 204(2)
7.1.2 Random Walk Chains 206(3)
7.2 Gibbs Sampling 209(9)
7.2.1 Basic Gibbs Sampler 209(5)
7.2.2 Properties of the Gibbs Sampler 214(2)
7.2.3 Update Ordering 216(1)
7.2.4 Blocking 216(1)
7.2.5 Hybrid Gibbs Sampling 216(2)
7.2.6 Griddy-Gibbs Sampler 218(1)
7.3 Implementation 218(19)
7.3.1 Ensuring Good Mixing and Convergence 219(1)
7.3.1.1 Simple Graphical Diagnostics 219(1)
7.3.1.2 Burn-in and Run Length 220(2)
7.3.1.3 Choice of Proposal 222(1)
7.3.1.4 Reparameterization 223(1)
7.3.1.5 Comparing Chains: Effective Sample Size 224(1)
7.3.1.6 Number of Chains 225(1)
7.3.2 Practical Implementation Advice 226(1)
7.3.3 Using the Results 226(4)
Problems 230(7)
8 Advanced Topics in MCMC 237(50)
8.1 Adaptive MCMC 237(13)
8.1.1 Adaptive Random Walk Metropolis-within-Gibbs Algorithm 238(2)
8.1.2 General Adaptive Metropolis-within-Gibbs Algorithm 240(7)
8.1.3 Adaptive Metropolis Algorithm 247(3)
8.2 Reversible Jump MCMC 250(6)
8.2.1 RJMCMC for Variable Selection in Regression 253(3)
8.3 Auxiliary Variable Methods 256(4)
8.3.1 Simulated Tempering 257(1)
8.3.2 Slice Sampler 258(2)
8.4 Other Metropolis-Hastings Algorithms 260(4)
8.4.1 Hit-and-Run Algorithm 260(1)
8.4.2 Multiple-Try Metropolis-Hastings Algorithm 261(1)
8.4.3 Langevin Metropolis-Hastings Algorithm 262(2)
8.5 Perfect Sampling 264(4)
8.5.1 Coupling from the Past 264(3)
8.5.1.1 Stochastic Monotonicity and Sandwiching 267(1)
8.6 Markov Chain Maximum Likelihood 268(1)
8.7 Example: MCMC for Markov Random Fields 269(18)
8.7.1 Gibbs Sampling for Markov Random Fields 270(4)
8.7.2 Auxiliary Variable Methods for Markov Random Fields 274(3)
8.7.3 Perfect Sampling for Markov Random Fields 277(2)
Problems 279(8)
PART III BOOTSTRAPPING
9 Bootstrapping 287(38)
9.1 The Bootstrap Principle 287(1)
9.2 Basic Methods 288(4)
9.2.1 Nonparametric Bootstrap 288(1)
9.2.2 Parametric Bootstrap 289(1)
9.2.3 Bootstrapping Regression 290(1)
9.2.4 Bootstrap Bias Correction 291(1)
9.3 Bootstrap Inference 292(10)
9.3.1 Percentile Method 292(1)
9.3.1.1 Justification for the Percentile Method 293(1)
9.3.2 Pivoting 294(1)
9.3.2.1 Accelerated Bias-Corrected Percentile Method, BCa 294(2)
9.3.2.2 The Bootstrap t 296(2)
9.3.2.3 Empirical Variance Stabilization 298(1)
9.3.2.4 Nested Bootstrap and Prepivoting 299(2)
9.3.3 Hypothesis Testing 301(1)
9.4 Reducing Monte Carlo Error 302(1)
9.4.1 Balanced Bootstrap 302(1)
9.4.2 Antithetic Bootstrap 302(1)
9.5 Bootstrapping Dependent Data 303(12)
9.5.1 Model-Based Approach 304(1)
9.5.2 Block Bootstrap 304(1)
9.5.2.1 Nonmoving Block Bootstrap 304(2)
9.5.2.2 Moving Block Bootstrap 306(1)
9.5.2.3 Blocks-of-Blocks Bootstrapping 307(2)
9.5.2.4 Centering and Studentizing 309(2)
9.5.2.5 Block Size 311(4)
9.6 Bootstrap Performance 315(1)
9.6.1 Independent Data Case 315(1)
9.6.2 Dependent Data Case 316(1)
9.7 Other Uses of the Bootstrap 316(1)
9.8 Permutation Tests 317(8)
Problems 319(6)
PART IV DENSITY ESTIMATION AND SMOOTHING
10 Nonparametric Density Estimation 325(38)
10.1 Measures of Performance 326(1)
10.2 Kernel Density Estimation 327(14)
10.2.1 Choice of Bandwidth 329(3)
10.2.1.1 Cross-Validation 332(3)
10.2.1.2 Plug-in Methods 335(3)
10.2.1.3 Maximal Smoothing Principle 338(1)
10.2.2 Choice of Kernel 339(1)
10.2.2.1 Epanechnikov Kernel 339(1)
10.2.2.2 Canonical Kernels and Rescalings 340(1)
10.3 Nonkernel Methods 341(4)
10.3.1 Logspline 341(4)
10.4 Multivariate Methods 345(18)
10.4.1 The Nature of the Problem 345(1)
10.4.2 Multivariate Kernel Estimators 346(2)
10.4.3 Adaptive Kernels and Nearest Neighbors 348(1)
10.4.3.1 Nearest Neighbor Approaches 349(1)
10.4.3.2 Variable-Kernel Approaches and Transformations 350(3)
10.4.4 Exploratory Projection Pursuit 353(6)
Problems 359(4)
11 Bivariate Smoothing 363(30)
11.1 Predictor-Response Data 363(2)
11.2 Linear Smoothers 365(12)
11.2.1 Constant-Span Running Mean 366(2)
11.2.1.1 Effect of Span 368(1)
11.2.1.2 Span Selection for Linear Smoothers 369(3)
11.2.2 Running Lines and Running Polynomials 372(2)
11.2.3 Kernel Smoothers 374(1)
11.2.4 Local Regression Smoothing 374(2)
11.2.5 Spline Smoothing 376(1)
11.2.5.1 Choice of Penalty 377(1)
11.3 Comparison of Linear Smoothers 377(2)
11.4 Nonlinear Smoothers 379(5)
11.4.1 Loess 379(2)
11.4.2 Supersmoother 381(3)
11.5 Confidence Bands 384(4)
11.6 General Bivariate Data 388(5)
Problems 389(4)
12 Multivariate Smoothing 393(28)
12.1 Predictor-Response Data 393(20)
12.1.1 Additive Models 394(3)
12.1.2 Generalized Additive Models 397(2)
12.1.3 Other Methods Related to Additive Models 399(1)
12.1.3.1 Projection Pursuit Regression 399(3)
12.1.3.2 Neural Networks 402(1)
12.1.3.3 Alternating Conditional Expectations 403(1)
12.1.3.4 Additivity and Variance Stabilization 404(1)
12.1.4 Tree-Based Methods 405(1)
12.1.4.1 Recursive Partitioning Regression Trees 406(3)
12.1.4.2 Tree Pruning 409(2)
12.1.4.3 Classification Trees 411(1)
12.1.4.4 Other Issues for Tree-Based Methods 412(1)
12.2 General Multivariate Data 413(8)
12.2.1 Principal Curves 413(1)
12.2.1.1 Definition and Motivation 413(2)
12.2.1.2 Estimation 415(1)
12.2.1.3 Span Selection 416(1)
Problems 416(5)
Data Acknowledgments 421(2)
References 423(34)
Index 457
GEOF H. GIVENS, PhD, is Associate Professor in the Department of Statistics at Colorado State University. He serves as Associate Editor for Computational Statistics and Data Analysis. His research interests include statistical problems in wildlife conservation biology, including ecology, population modeling and management, and automated computer face recognition.

JENNIFER A. HOETING, PhD, is Professor in the Department of Statistics at Colorado State University. She is an award-winning teacher who co-leads large research efforts for the National Science Foundation. She has served as associate editor for the Journal of the American Statistical Association and Environmetrics. Her research interests include spatial statistics, Bayesian methods, and model selection.

Givens and Hoeting have taught graduate courses on computational statistics for nearly twenty years, and short courses to leading statisticians and scientists around the world.