
E-book: Sufficient Dimension Reduction: Methods and Applications with R

Bing Li (Pennsylvania State University, University Park, PA)
  • Format: PDF+DRM
  • Price: 59,79 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed
  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID; more information here. The e-book can be read by one user and downloaded to up to six devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

Sufficient dimension reduction is a rapidly developing research field with wide applications in regression diagnostics, data visualization, machine learning, genomics, image processing, pattern recognition, and medicine, all fields that produce large datasets with many variables. Sufficient Dimension Reduction: Methods and Applications with R introduces the basic theories and main methodologies, provides practical and easy-to-use algorithms and computer code to implement them, and surveys recent advances at the frontiers of the field.

Features

  • Provides comprehensive coverage of this emerging research field.
  • Synthesizes a wide variety of dimension reduction methods under a few unifying principles, such as projection in Hilbert spaces, kernel mapping, and von Mises expansion.
  • Reflects the most recent advances, such as nonlinear sufficient dimension reduction, dimension folding for tensorial data, and sufficient dimension reduction for functional data.
  • Includes a set of computer codes written in R that readers can easily implement; see the sketch after this list.
  • Uses real data sets available online to illustrate the usage and power of the described methods.
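To give a flavor of what such code looks like, here is a minimal sketch of sliced inverse regression (SIR), the method developed in Chapter 3. It is our own illustrative sketch, not code from the book, and the function and argument names (sir, nslices, ndir) are hypothetical:

    # A minimal sketch of sliced inverse regression (SIR); illustrative
    # only, not the book's code. Assumes a numeric predictor matrix x
    # with at least two columns and a numeric response vector y.
    sir <- function(x, y, nslices = 5, ndir = 2) {
      n <- nrow(x)
      p <- ncol(x)
      # Standardize the predictors: z = (x - mean) %*% Sigma^(-1/2)
      mu <- colMeans(x)
      eg <- eigen(cov(x), symmetric = TRUE)
      sig_inv_sqrt <- eg$vectors %*% diag(1 / sqrt(eg$values)) %*% t(eg$vectors)
      z <- scale(x, center = mu, scale = FALSE) %*% sig_inv_sqrt
      # Slice the response into groups of roughly equal size
      slices <- cut(rank(y, ties.method = "first"),
                    breaks = nslices, labels = FALSE)
      # Weighted covariance of the within-slice means of z
      m <- matrix(0, p, p)
      for (s in unique(slices)) {
        idx <- slices == s
        zbar <- colMeans(z[idx, , drop = FALSE])
        m <- m + (sum(idx) / n) * tcrossprod(zbar)
      }
      # Leading eigenvectors, mapped back to the scale of the original x
      em <- eigen(m, symmetric = TRUE)
      list(directions = sig_inv_sqrt %*% em$vectors[, 1:ndir, drop = FALSE],
           eigenvalues = em$values)
    }

With such a function, sir(x, y, nslices = 10, ndir = 2)$directions estimates a basis of the central subspace introduced in Chapter 2, and the drop-off in the returned eigenvalues hints at how many directions are needed, the order-determination problem treated in Chapters 9 and 10.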

Sufficient dimension reduction has undergone momentous development in recent years, partly due to the increased demand for techniques to process high-dimensional data, a hallmark of our age of Big Data. This book will serve as the perfect entry into the field for beginning researchers and as a handy reference for advanced ones.

The author

Bing Li obtained his Ph.D. from the University of Chicago. He is currently a Professor of Statistics at the Pennsylvania State University. His research interests cover sufficient dimension reduction, statistical graphical models, functional data analysis, machine learning, estimating equations and quasilikelihood, and robust statistics. He is a fellow of the Institute of Mathematical Statistics and the American Statistical Association. He is an Associate Editor for The Annals of Statistics and the Journal of the American Statistical Association.

Reviews

"...Sufficient Dimension Reduction: Methods and Applications with R is a thorough overview of the key ideas and a detailed reference for advanced researchers...Professor Li gives careful discussions of the relevant details, rendering the text impressively self-contained. But as one would expect from a book based on graduate course notes, this manuscript is mainly accessible to those with advanced training in theoretical statistics...This book serves as an excellent introduction to the field of sufficient dimension reduction, and the depth of presentation and theoretical rigor are impressive. It would, of course, naturally serve as the basis for a deep graduate course, and provides a substantial foundation for anyone hoping to contribute in this thriving area." - Daniel J. McDonald, JASA 2020

Contents

List of Figures xiii
List of Tables xvii
Preface xix
Author xxi
1 Preliminaries 1
1.1 Empirical Distribution and Sample Moments 1
1.2 Principal Component Analysis 2
1.3 Generalized Eigenvalue Problem 3
1.4 Multivariate Linear Regression 3
1.5 Generalized Linear Model 5
1.5.1 Exponential Family 5
1.5.2 Generalized Linear Models 6
1.6 Hilbert Space, Linear Manifold, Linear Subspace 8
1.7 Linear Operator and Projection 10
1.8 The Hilbert Space W(Z) 11
1.9 Coordinate Representation 12
1.10 Generalized Linear Models under Link Violation 13
2 Dimension Reduction Subspaces 17
2.1 Conditional Independence 17
2.2 Sufficient Dimension Reduction Subspace 21
2.3 Transformation Laws of Central Subspace 24
2.4 Fisher Consistency, Unbiasedness, and Exhaustiveness 25
3 Sliced Inverse Regression 27
3.1 Sliced Inverse Regression: Population-Level Development 27
3.2 Limitation of SIR 30
3.3 Estimation, Algorithm, and R-codes 31
3.4 Application: The Big Mac Index 33
4 Parametric and Kernel Inverse Regression 37
4.1 Parametric Inverse Regression 37
4.2 Algorithm, R Codes, and Application 39
4.3 Relation of PIR with SIR 40
4.4 Relation of PIR with Ordinary Least Squares 42
4.5 Kernel Inverse Regression 42
5 Sliced Average Variance Estimate 47
5.1 Motivation 47
5.2 Constant Conditional Variance Assumption 47
5.3 Sliced Average Variance Estimate 49
5.4 Algorithm and R-code 52
5.5 Relation with SIR 55
5.6 The Issue of Exhaustiveness 56
5.7 SIR-II 58
5.8 Case Study: The Pen Digit Data 60
6 Contour Regression and Directional Regression 63
6.1 Contour Directions and Central Subspace 63
6.2 Contour Regression at the Population Level 65
6.3 Algorithm and R Codes for CR 67
6.4 Exhaustiveness of Contour Regression 69
6.5 Directional Regression 70
6.6 Representation of ADR Using Moments 74
6.7 Algorithm and R Codes for DR 76
6.8 Exhaustiveness Relation with SIR and SAVE 77
6.9 Pen Digit Case Study Continued 79
7 Elliptical Distribution and Predictor Transformation 83
7.1 Linear Conditional Mean and Elliptical Distribution 83
7.2 Box-Cox Transformation 88
7.3 Application to the Big Mac Data 92
7.4 Estimating Equations for Handling Non-Ellipticity 94
8 Sufficient Dimension Reduction for Conditional Mean 97
8.1 Central Mean Subspace 97
8.2 Ordinary Least Squares 100
8.3 Principal Hessian Direction 101
8.4 Iterative Hessian Transformation 104
9 Asymptotic Sequential Test for Order Determination 107
9.1 Stochastic Ordering and Von Mises Expansion 107
9.2 Von Mises Expansion and Influence Functions 109
9.3 Influence Functions of Some Statistical Functionals 110
9.4 Random Matrix with Affine Invariant Eigenvalues 112
9.5 Asymptotic Distribution of the Sum of Small Eigenvalues 115
9.6 General Form of the Sequential Tests 117
9.7 Sequential Test for SIR 118
9.8 Sequential Test for PHD 124
9.9 Sequential Test for SAVE 126
9.10 Sequential Test for DR 132
9.11 Applications 139
10 Other Methods for Order Determination 141
10.1 BIC Type Criteria for Order Determination 141
10.2 Bootstrapped Eigenvector Variation 147
10.3 Eigenvalue Magnitude and Eigenvector Variation 150
10.4 Ladle Estimator 152
10.5 Consistency of the Ladle Estimator 156
10.6 Application: Identification of Wine Cultivars 156
11 Forward Regressions for Dimension Reduction 159
11.1 Outer Product of Gradients 160
11.2 Fisher Consistency of Gradient Estimate 163
11.3 Minimum Average Variance Estimate 167
11.4 Refined MAVE and Refined OPG 170
11.5 From Central Mean Subspace to Central Subspace 173
11.6 dOPG and Its Refinement 173
11.7 dMAVE and Its Refinement 178
11.8 Ensemble Estimators 180
11.9 Simulation Studies and Applications 184
11.10 Summary 188
12 Nonlinear Sufficient Dimension Reduction 191
12.1 Reproducing Kernel Hilbert Space 192
12.2 Covariance Operators in RKHS 193
12.3 Coordinate Mapping 199
12.4 Coordinate of Covariance Operators 200
12.5 Kernel Principal Component Analysis 202
12.6 Sufficient and Central σ-Field for Nonlinear SDR 204
12.7 Complete Sub-σ-Field for Nonlinear SDR 206
12.8 Converting σ-Fields to Function Classes for Estimation 208
13 Generalized Sliced Inverse Regression 211
13.1 Regression Operator 212
13.2 Generalized Sliced Inverse Regression 213
13.3 Exhaustiveness and Completeness 215
13.4 Relative Universality 216
13.5 Implementation of GSIR 217
13.6 Precursors and Variations of GSIR 220
13.7 Generalized Cross Validation for Tuning εX and εY 220
13.8 k-Fold Cross Validation for Tuning ρX, ρY, εX, εY 223
13.9 Simulation Studies 225
13.10 Applications 227
13.10.1 Pen Digit Data 227
13.10.2 Face Sculpture Data 228
14 Generalized Sliced Average Variance Estimator 233
14.1 Generalized Sliced Average Variance Estimation 233
14.2 Relation with GSIR 237
14.3 Implementation of GSAVE 239
14.4 Simulation Studies and an Application 248
14.5 Relation between Linear and Nonlinear SDR 251
15 The Broad Scope of Sufficient Dimension Reduction 253
15.1 Sufficient Dimension Reduction for Functional Data 253
15.2 Sufficient Dimension Folding for Tensorial Data 256
15.3 Sufficient Dimension Reduction for Grouped Data 259
15.4 Variable Selection via Sufficient Dimension Reduction 260
15.5 Efficient Dimension Reduction 262
15.6 Partial Dimension Reduction for Categorical Predictors 264
15.7 Measurement Error Problem 265
15.8 SDR via Support Vector Machine 267
15.9 SDR for Multivariate Responses 268
Bibliography 271
Index 281