Part I Introduction

1 Background  3 (23)
1.1 Bayesian Statistical Modeling: The Frequency of Use  3 (3)
1.2 The Key Impediments within Bayesian Statistics  6 (3)
1.3 Benefits of Bayesian Statistics within SEM  9 (3)
1.3.1 A Recap: Why Bayesian SEM?  12 (1)
1.4 Mastering the SEM Basics: Precursors to Bayesian SEM  12 (8)
1.4.1 The Fundamentals of SEM Diagrams and Terminology  13 (4)
1.4.2  17 (2)
1.4.3 Additional Comments about Notation  19 (1)
1.5 Datasets Used in the Chapter Examples  20 (6)
1.5.1  21 (1)
1.5.2 Early Childhood Longitudinal Survey-Kindergarten Class  21 (1)
1.5.3 Holzinger and Swineford (1939)  21 (1)
1.5.4 IPIP 50: Big Five Questionnaire  22 (1)
1.5.5 Lakaev Academic Stress Response Scale  23 (1)
1.5.6 Political Democracy  23 (1)
1.5.7 Program for International Student Assessment  24 (1)
1.5.8 Youth Risk Behavior Survey  25 (1)

2 Basic Elements of Bayesian Statistics  26 (63)
2.1 A Brief Introduction to Bayesian Statistics  26 (1)
2.2  27 (2)
2.3 Comparing Frequentist and Bayesian Estimation  29 (2)
2.4 The Bayesian Research Circle  31 (1)
2.5  32 (2)
2.6  34 (9)
2.6.1  35 (1)
2.6.2  35 (1)
2.6.3 The Inverse Gamma Prior  35 (1)
2.6.4  36 (1)
2.6.5 The Inverse Wishart Prior  36 (1)
2.6.6  36 (1)
2.6.7  37 (1)
2.6.8 The Dirichlet Prior  37 (1)
2.6.9 Different Levels of Informativeness for Prior Distributions  38 (1)
2.6.10  39 (3)
2.6.11 Prior Predictive Checking  42 (1)
2.7 The Likelihood (Frequentist and Bayesian Perspectives)  43 (2)
2.8  45 (10)
2.8.1 An Introduction to Markov Chain Monte Carlo Methods  45 (2)
2.8.2 Sampling Algorithms  47 (5)
2.8.3  52 (1)
2.8.4  53 (1)
2.8.5 The Number of Markov Chains  53 (1)
2.8.6 A Note about Starting Values  54 (1)
2.8.7  54 (1)
2.9  55 (7)
2.9.1 Posterior Summary Statistics  55 (1)
2.9.2  56 (1)
2.9.3 Effective Sample Size  56 (1)
2.9.4  57 (1)
2.9.5 Autocorrelation Plots  57 (1)
2.9.6 Posterior Histogram and Density Plots  57 (1)
2.9.7 HDI Histogram and Density Plots  57 (1)
2.9.8  58 (1)
2.9.9 Sensitivity Analysis  58 (4)
2.10  62 (9)
2.11 Chapter Summary  71 (5)
2.11.1 Major Take-Home Points  71 (2)
2.11.2 Notation Referenced  73 (2)
2.11.3 Annotated Bibliography of Select Resources  75 (1)

Appendix 2.A Getting Started with R  76 (13)

Part II Measurement Models and Related Issues

3 The Confirmatory Factor Analysis Model  89 (49)
3.1 Introduction to Bayesian CFA  89 (2)
3.2 The Model and Notation  91 (5)
3.2.1 Handling Indeterminacies in CFA  93 (3)
3.3 The Bayesian Form of the CFA Model  96 (5)
3.3.1 Additional Information about the (Inverse) Wishart Prior  97 (3)
3.3.2 Alternative Priors for Covariance Matrices  100 (1)
3.3.3 Alternative Priors for Variances  100 (1)
3.3.4 Alternative Priors for Factor Loadings  101 (1)
3.4 Example 1: Basic CFA Model  101 (19)
3.5 Example 2: Implementing Near-Zero Priors for Cross-Loadings  120 (4)
3.6 How to Write Up Bayesian CFA Results  124 (4)
3.6.1 Hypothetical Data Analysis Plan  125 (1)
3.6.2 Hypothetical Results Section  125 (2)
3.6.3 Discussion Points Relevant to the Analysis  127 (1)
3.7 Chapter Summary  128 (10)
3.7.1 Major Take-Home Points  128 (3)
3.7.2 Notation Referenced  131 (1)
3.7.3 Annotated Bibliography of Select Resources  132 (1)
3.7.4 Example Code for Mplus  133 (3)
3.7.5 Example Code for R  136 (2)

4 Multiple-Group Models  138 (31)
4.1 A Brief Introduction to Multiple-Group Models  138 (1)
4.2 Introduction to the Multiple-Group CFA Model (with Mean Differences)  139 (1)
4.3 The Model and Notation  140 (2)
4.4 The Bayesian Form of the Multiple-Group CFA Model  142 (2)
4.5 Example 1: Using a Mean-Difference, Multiple-Group CFA Model to Assess for School Differences  144 (9)
4.6 Introduction to the MIMIC Model  153 (1)
4.7 The Model and Notation  153 (1)
4.8 The Bayesian Form of the MIMIC Model  154 (2)
4.9 Example 2: Using the MIMIC Model to Assess for School Differences  156 (2)
4.10 How to Write Up Bayesian Multiple-Group Model Results with Mean Differences  158 (3)
4.10.1 Hypothetical Data Analysis Plan  158 (1)
4.10.2 Hypothetical Results Section  159 (1)
4.10.3 Discussion Points Relevant to the Analysis  160 (1)
4.11 Chapter Summary  161 (8)
4.11.1 Major Take-Home Points  162 (1)
4.11.2 Notation Referenced  163 (2)
4.11.3 Annotated Bibliography of Select Resources  165 (1)
4.11.4 Example Code for Mplus  166 (1)
4.11.5 Example Code for R  167 (2)

5 Measurement Invariance Testing  169 (30)
5.1 A Brief Introduction to MI in SEM  169 (4)
5.1.1 Stages of Traditional MI Testing  170 (2)
5.1.2 Challenges within Traditional MI Testing  172 (1)
5.2 Bayesian Approximate MI  173 (1)
5.3 The Model and Notation  174 (2)
5.4 Priors within Bayesian Approximate MI  176 (2)
5.5 Example: Illustrating Bayesian Approximate MI for School Differences  178 (8)
5.5.1 Results for the Conventional MI Tests  181 (1)
5.5.2 Results for the Bayesian Approximate MI Tests  182 (2)
5.5.3 Results Comparing Latent Means across Approaches  184 (2)
5.6 How to Write Up Bayesian Approximate MI Results  186 (4)
5.6.1 Hypothetical Data Analysis Plan  187 (1)
5.6.2 Hypothetical Analytic Procedure  188 (1)
5.6.3 Hypothetical Results Section  189 (1)
5.6.4 Discussion Points Relevant to the Analysis  190 (1)
5.7 Chapter Summary  190 (9)
5.7.1 Major Take-Home Points  190 (2)
5.7.2 Notation Referenced  192 (1)
5.7.3 Annotated Bibliography of Select Resources  193 (1)
5.7.4 Example Code for Mplus  194 (1)
5.7.5 Example Code for R  195 (4)

Part III Extending the Structural Model

6 The General Structural Equation Model  199 (29)
6.1 Introduction to Bayesian SEM  199 (2)
6.2 The Model and Notation  201 (2)
6.3 The Bayesian Form of SEM  203 (1)
6.4 Example: Revisiting Bollen's (1989) Political Democracy Example  204 (9)
6.4.1 Motivation for This Example  205 (1)
6.4.2 The Current Example  206 (7)
6.5 How to Write Up Bayesian SEM Results  213 (3)
6.5.1 Hypothetical Data Analysis Plan  213 (1)
6.5.2 Hypothetical Results Section  214 (1)
6.5.3 Discussion Points Relevant to the Analysis  215 (1)
6.6 Chapter Summary  216 (8)
6.6.1 Major Take-Home Points  217 (2)
6.6.2 Notation Referenced  219 (2)
6.6.3 Annotated Bibliography of Select Resources  221 (1)
6.6.4 Example Code for Mplus  222 (1)
6.6.5 Example Code for R  223 (1)

Appendix 6.A Causal Inference and Mediation Analysis  224 (4)

7 Multilevel Structural Equation Modeling  228 (47)
7.1  228 (5)
7.1.1  230 (2)
7.1.2  232 (1)
7.2 Extending MSEM into the Bayesian Context  233 (2)
7.3 The Model and Notation  235 (3)
7.4 The Bayesian Form of MSEM  238 (5)
7.5 Example 1: A Two-Level CFA with Continuous Items  243 (4)
7.5.1 Implementation of Example 1  244 (2)
7.5.2  246 (1)
7.6 Example 2: A Three-Level CFA with Categorical Items  247 (11)
7.6.1 Implementation of Example 2  253 (1)
7.6.2  253 (5)
7.7 How to Write Up Bayesian MSEM Results  258 (3)
7.7.1 Hypothetical Data Analysis Plan  258 (1)
7.7.2 Hypothetical Results Section  259 (1)
7.7.3 Discussion Points Relevant to the Analysis  260 (1)
7.8 Chapter Summary  261 (14)
7.8.1 Major Take-Home Points  262 (2)
7.8.2 Notation Referenced  264 (3)
7.8.3 Annotated Bibliography of Select Resources  267 (1)
7.8.4 Example Code for Mplus  268 (1)
7.8.5 Example Code for R  268 (7)

Part IV Longitudinal and Mixture Models

8 The Latent Growth Curve Model  275 (33)
8.1 Introduction to Bayesian LGCM  275 (1)
8.2 The Model and Notation  276 (4)
8.2.1 Extensions of the LGCM  279 (1)
8.3 The Bayesian Form of the LGCM  280 (3)
8.3.1 Alternative Priors for the Factor Variances and Covariances  281 (2)
8.4 Example 1: Bayesian Estimation of the LGCM Using ECLS-K Reading Data  283 (4)
8.5 Example 2: Extending the Example to Include Separation Strategy Priors  287 (4)
8.6 Example 3: Extending the Framework to Assessing MI over Time  291 (6)
8.7 How to Write Up Bayesian LGCM Results  297 (2)
8.7.1 Hypothetical Data Analysis Plan  297 (1)
8.7.2 Hypothetical Results Section  298 (1)
8.7.3 Discussion Points Relevant to the Analysis  299 (1)
8.8 Chapter Summary  299 (9)
8.8.1 Major Take-Home Points  300 (2)
8.8.2 Notation Referenced  302 (2)
8.8.3 Annotated Bibliography of Select Resources  304 (1)
8.8.4 Example Code for Mplus  305 (1)
8.8.5 Example Code for R  305 (3)

9 The Latent Class Model  308 (46)
9.1 A Brief Introduction to Mixture Models  308 (1)
9.2 Introduction to Bayesian LCA  309 (1)
9.3 The Model and Notation  310 (3)
9.3.1 Introducing the Issue of Class Separation  312 (1)
9.4 The Bayesian Form of the LCA Model  313 (2)
9.4.1 Adding Flexibility to the LCA Model  314 (1)
9.5 Mixture Models, Label Switching, and Possible Solutions  315 (6)
9.5.1 Identifiability Constraints  319 (1)
9.5.2 Relabeling Algorithms  320 (1)
9.5.3 Label Invariant Loss Functions  321 (1)
9.5.4 Final Thoughts on Label Switching  321 (1)
9.6 Example: A Demonstration of Bayesian LCA  321 (19)
9.6.1 Motivation for This Example  322 (2)
9.6.2 The Current Example  324 (16)
9.7 How to Write Up Bayesian LCA Results  340 (4)
9.7.1 Hypothetical Data Analysis Plan  340 (1)
9.7.2 Hypothetical Results Section  341 (2)
9.7.3 Discussion Points Relevant to the Analysis  343 (1)
9.8 Chapter Summary  344 (10)
9.8.1 Major Take-Home Points  344 (2)
9.8.2 Notation Referenced  346 (1)
9.8.3 Annotated Bibliography of Select Resources  347 (1)
9.8.4 Example Code for Mplus  348 (4)
9.8.5 Example Code for R  352 (2)

10 The Latent Growth Mixture Model  354 (39)
10.1 Introduction to Bayesian LGMM  354 (2)
10.2 The Model and Notation  356 (7)
10.2.1 Concerns with Class Separation  359 (4)
10.3 The Bayesian Form of the LGMM  363 (3)
10.3.1 Alternative Priors for Factor Means  365 (1)
10.3.2 Alternative Priors for the Measurement Error Covariance Matrix  365 (1)
10.3.3 Alternative Priors for the Factor Covariance Matrix  365 (1)
10.3.4 Handling Label Switching in LGMMs  365 (1)
10.4 Example: Comparing Different Prior Conditions in an LGMM  366 (12)
10.5 How to Write Up Bayesian LGMM Results  378 (3)
10.5.1 Hypothetical Data Analysis Plan  378 (1)
10.5.2 Hypothetical Results Section  379 (2)
10.5.3 Discussion Points Relevant to the Analysis  381 (1)
10.6 Chapter Summary  381 (12)
10.6.1 Major Take-Home Points  382 (2)
10.6.2 Notation Referenced  384 (2)
10.6.3 Annotated Bibliography of Select Resources  386 (1)
10.6.4 Example Code for Mplus  387 (1)
10.6.5 Example Code for R  387 (6)

Part V Special Topics

11 Model Assessment  393 (41)
11.1 Model Comparison and Cross-Validation  395 (9)
11.1.1  395 (3)
11.1.2 The Bayesian Information Criterion  398 (2)
11.1.3 The Deviance Information Criterion  400 (2)
11.1.4 The Widely Applicable Information Criterion  402 (1)
11.1.5 Leave-One-Out Cross-Validation  403 (1)
11.2  404 (7)
11.2.1 Posterior Predictive Model Checking  404 (5)
11.2.2 Missing Data and the PPC Procedure  409 (1)
11.2.3 Testing Near-Zero Parameters through the PPPP  410 (1)
11.3 Bayesian Approximate Fit  411 (5)
11.3.1 Bayesian Root Mean Square Error of Approximation  412 (1)
11.3.2 Bayesian Tucker-Lewis Index  413 (1)
11.3.3 Bayesian Normed Fit Index  414 (1)
11.3.4 Bayesian Comparative Fit Index  414 (1)
11.3.5 Implementation of These Indices  415 (1)
11.4 Example 1: Illustrating the PPC and the PPPP for CFA  416 (3)
11.5 Example 2: Illustrating Bayesian Approximate Fit for CFA  419 (3)
11.6 How to Write Up Bayesian Approximate Fit Results  422 (3)
11.6.1 Hypothetical Data Analysis Plan  422 (1)
11.6.2 Hypothetical Results Section  423 (2)
11.6.3 Discussion Points Relevant to the Analysis  425 (1)
11.7 Chapter Summary  425 (9)
11.7.1 Major Take-Home Points  425 (2)
11.7.2 Notation Referenced  427 (4)
11.7.3 Annotated Bibliography of Select Resources  431 (1)
11.7.4 Example Code for Mplus  432 (1)
11.7.5 Example Code for R  432 (2)

12 Important Points to Consider  434 (39)
12.1 Implementation and Reporting of Bayesian Results  434 (2)
12.1.1 Priors Implemented  435 (1)
12.1.2  435 (1)
12.1.3 Sensitivity Analysis  435 (1)
12.1.4 How Should We Interpret These Findings?  436 (1)
12.2 Points to Check Prior to Data Analysis  436 (7)
12.2.1 Is Your Model Formulated "Correctly"?  436 (4)
12.2.2 Do You Understand the Priors?  440 (3)
12.3 Points to Check after Initial Data Analysis, but before Interpretation of Results  443 (13)
12.3.1  443 (5)
12.3.2 Does Convergence Remain after Doubling the Number of Iterations?  448 (2)
12.3.3 Is There Ample Information in the Posterior Histogram?  450 (2)
12.3.4 Is There a Strong Degree of Autocorrelation in the Posterior?  452 (3)
12.3.5 Does the Posterior Make Substantive Sense?  455 (1)
12.4 Understanding the Influence of Priors  456 (6)
12.4.1 Examining the Influence of Priors on Multivariate Parameters (e.g., Covariance Matrices)  457 (3)
12.4.2 Comparing the Original Prior to Other Diffuse or Subjective Priors  460 (2)
12.5 Incorporating Model Fit or Model Comparison  462 (1)
12.6 Interpreting Model Results the "Bayesian Way"  463 (1)
12.7 How to Write Up Bayesian Results  464 (5)
12.7.1 (Hypothetical) Results for Bayesian Two-Factor CFA  465 (4)
12.8 How to Review Bayesian Work  469 (1)
12.9 Chapter Summary and Looking Forward  470 (3)

Glossary  473 (9)
References  482 (17)
Author Index  499 (5)
Subject Index  504 (17)
About the Author  521