
E-book: Applied Regularization Methods for the Social Sciences [Taylor & Francis e-book]

Holmes Finch (Ball State University, USA)
  • Taylor & Francis e-book
  • Price: 124,64 €*
  • * price that guarantees access for an unlimited number of concurrent users for an unlimited period
  • Regular price: 178,05 €
  • You save 30%
Researchers in the social sciences often face complex data sets with relatively small samples and many variables (high-dimensional data). Unlike the various technical guides currently on the market, Applied Regularization Methods for the Social Sciences provides an overview of a variety of models alongside clear examples of hands-on application. Each chapter covers a specific application of regularization techniques with a user-friendly technical description, followed by examples that provide a thorough demonstration of the methods in action.

Key Features:

  • Description of regularization methods in a user-friendly and easy-to-read manner
  • Inclusion of regularization-based approaches for a variety of statistical analyses commonly used in the social sciences, including both univariate and multivariate models
  • Fully developed extended examples using multiple software packages, including R, SAS, and SPSS (a brief R sketch in this spirit follows the list)
  • Website containing all datasets and software scripts used in the examples
  • Inclusion of both frequentist and Bayesian regularization approaches
  • Application exercises for each chapter that instructors could use in class, and that independent researchers could use to practice what they have learned from the book
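To give a flavor of the kind of hands-on R example the book describes, here is a minimal lasso regression sketch. It is not taken from the book: it uses simulated data rather than the companion-website datasets, and the widely used glmnet package, which is one common way to fit such models in R.

# Minimal lasso example (illustrative only; simulated data, not from the book)
library(glmnet)

set.seed(1234)
n <- 100   # relatively small sample, as is typical in social science data
p <- 50    # many candidate predictors (a high-dimensional setting)
x <- matrix(rnorm(n * p), nrow = n, ncol = p)
beta <- c(1.5, -2, 0.75, rep(0, p - 3))   # only three predictors truly matter
y <- as.numeric(x %*% beta + rnorm(n))

# Lasso penalty (alpha = 1); tune lambda by 10-fold cross-validation
cv_fit <- cv.glmnet(x, y, alpha = 1, nfolds = 10)

# Coefficients at the lambda that minimizes cross-validated error;
# most are shrunk exactly to zero, which performs variable selection
coef(cv_fit, s = "lambda.min")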
Table of Contents:

1 R 1(10)
  The R Console and R Scripts 1(1)
  R Libraries 2(2)
  Reading and Viewing Data in R 4(2)
  Missing Data 6(1)
  Variable and Data Set Types 7(1)
  Descriptive Statistics and Graphics in R 8(2)
  Summary 10(1)
2 Theoretical Underpinnings of Regularization Methods 11(14)
  The Need for Variable Selection Methods 11(2)
  The Lasso Estimator 13(2)
  The Ridge Estimator 15(1)
  The Elastic Net Estimator 16(1)
  The Adaptive Lasso 16(1)
  The Group Lasso 17(1)
  Bayesian Regularization 18(4)
  Inference for Regularization Methods 22(1)
  Summary 23(2)
3 Regularization Methods for Linear Models 25(46)
  Linear Regression 25(1)
  Fitting Linear Regression Model with R 26(3)
  Assessing Regression Assumptions Using R 29(3)
  Variable Selection without Regularization 32(1)
  Stepwise Regression 33(1)
  Application of Stepwise Regression Using R 34(1)
  Best Subsets Regression 35(1)
  Application of Best Subsets Regression Using R 35(3)
  Regularized Linear Regression 38(1)
  Lasso Regression 38(8)
  Ridge Regression 46(5)
  Elastic Net Regression 51(3)
  Bayesian Lasso Regression 54(7)
  Bayesian Ridge Regression 61(3)
  Adaptive Lasso Regression 64(2)
  Group Lasso Regression 66(2)
  Comparison of Modeling Approaches 68(1)
  Summary 69(2)
4 Regularization Methods for Generalized Linear Models 71(52)
  Logistic Regression for Dichotomous Outcome 72(1)
  Fitting Logistic Regression with R 72(3)
  Regularization with Logistic Regression for Dichotomous Outcomes 75(1)
  Logistic Regression with the Lasso Penalty 75(3)
  Logistic Regression with the Ridge Penalty 78(2)
  Penalized Logistic Regression with the Bayesian Estimator 80(3)
  Adaptive Lasso for Dichotomous Logistic Regression 83(3)
  Grouped Regularization for Dichotomous Logistic Regression 86(2)
  Logistic Regression for Ordinal Outcome 88(3)
  Regularized Ordinal Logistic Regression 91(5)
  Regression Models for Count Data 96(3)
  Regularized Count Regression 99(5)
  Cox Proportional Hazards Model 104(9)
  Regularized Cox Regression 113(8)
  Summary 121(2)
5 Regularization Methods for Multivariate Linear Models 123(30)
  Standard Multivariate Regression 124(4)
  Regularized Multivariate Regression 128(1)
  Regularized Multivariate Regression in R 129(8)
  Standard Canonical Correlation 137(2)
  Standard CCA in R 139(3)
  Regularized Canonical Correlation 142(3)
  Standard Linear Discriminant Analysis 145(3)
  Regularized Linear Discriminant Analysis 148(3)
  Summary 151(2)
6 Regularization Methods for Cluster Analysis and Principal Components Analysis 153(38)
  K-means 153(1)
  Determining the Number of Clusters to Retain 154(12)
  Regularized K-means Cluster Analysis 166(12)
  Determining the Number of Clusters to Retain 178(1)
  Hierarchical Cluster Analysis 179(5)
  Regularized Hierarchical Cluster Analysis 184(6)
  Summary 190(1)
7 Regularization Methods for Latent Variable Models 191(58)
  Factor Analysis 192(1)
  Common Factor Model 193(1)
  Exploratory Factor Analysis 193(1)
  Factor Extraction 194(1)
  Factor Rotation 194(11)
  Sparse Estimation via Nonconcave Penalized Likelihood in Factor Analysis Model (FANC) 205(3)
  Confirmatory Factor Analysis 208(1)
  Model Parameter Estimation 208(2)
  Assessing Model Fit 210(6)
  RegSEM 216(5)
  Penfa 221(5)
  Structural Equation Modeling 226(4)
  RegSEM Structural Model 230(7)
  2-Stage Least Squares Estimation for SEM 237(2)
  Standard 2SLS with R 239(2)
  Regularized 2SLS 241(7)
  Summary 248(1)
8 Regularization Methods for Multilevel Models 249(36)
  Multilevel Linear Regression 250(1)
  Random Intercept Model 250(2)
  Random Coefficients Model 252(1)
  Regularized Multilevel Regression Model 253(1)
  Fitting the Multilevel Lasso in R 254(5)
  Multilevel Logistic Regression Model 259(1)
  Random Intercept Logistic Regression 260(3)
  Regularized Multilevel Logistic Regression 263(7)
  MGLM for an Ordinal Outcome Variable 270(1)
  Random Intercept Logistic Regression 270(3)
  Multilevel Count Regression Model 273(1)
  MGLM for Count Data 274(1)
  Random Intercept Poisson Regression 274(2)
  Random Coefficient Poisson Regression 276(7)
  Summary 283(2)
References 285(4)
Index 289
Holmes Finch is the George and Frances Ball Distinguished Professor of Educational Psychology at Ball State University and a professor of statistics and psychometrics. His research interests include structural equation modeling, item response theory, educational and psychological measurement, multilevel modeling, machine learning, and robust multivariate inference. In addition to conducting research in statistics, he regularly collaborates with colleagues in fields such as educational psychology, neuropsychology, and exercise physiology.