Preface  xvii
Acknowledgments  xix
Acronyms  xxi
|
Introduction to Bayesian Inference  1
    Introduction: Bayesian modeling in the 21st century  1
    Definition of statistical models  3
    Model-based Bayesian inference  4
    Inference using conjugate prior distributions  7
        Inference for the Poisson rate of count data  7
        Inference for the success probability of binomial data  8
        Inference for the mean of normal data with known variance  9
        Inference for the mean and variance of normal data  11
        Inference for normal regression models  12
        Other conjugate prior distributions  14
|
Markov Chain Monte Carlo Algorithms in Bayesian Inference  31
    Simulation, Monte Carlo integration, and their implementation in Bayesian inference  31
    Markov chain Monte Carlo methods  35
        Terminology and implementation details  37
    The Metropolis-Hastings algorithm  42
    Componentwise Metropolis-Hastings  45
    A simple example using the slice sampler  77
    Summary and closing remarks  81
|
WinBUGS Software: Introduction, Setup, and Basic Analysis  83
    Introduction and historical background  83
    Downloading and installing WinBUGS  84
    A short description of the menus  85
    Preliminaries on using WinBUGS  88
        Code structure and types of parameters/nodes  88
        Scalar, vector, matrix, and array nodes  89
    Building Bayesian models in WinBUGS  93
        Using the for syntax and array, matrix, and vector calculations  97
        Use of parentheses, brackets, and curly braces in WinBUGS  98
        Differences between WinBUGS and R/S-Plus syntax  98
        Model specification in WinBUGS  99
        Data and initial value specification  100
        An example of a complete model specification  107
    Compiling the model and simulating values  108
    Basic output analysis using the sample monitor tool  117
    Summarizing the procedure  120
    Chapter summary and concluding comments  121
|
WinBUGS Software: Illustration, Results, and Further Analysis  125
    A complete example of running MCMC in WinBUGS for a simple model  125
        Compiling and running the model  127
        MCMC output analysis and results  129
    Further output analysis using the inference menu  132
        Calculation of correlations  136
        Evaluation and ranking of individuals  138
        Calculation of the deviance information criterion  140
    Generation of multiple chains  141
    The Gelman-Rubin convergence diagnostic  143
    Changing the properties of a figure  145
        General graphical options  145
        Special graphical options  145
    Monitoring the acceptance rate of the Metropolis-Hastings algorithm  148
    Saving the current state of the chain  149
    Setting the starting seed number  149
    Running the model as a script  149
    Summary and concluding remarks  149
|
Introduction to Bayesian Models: Normal Models  151
    General modeling principles  151
    Model specification in normal regression models  152
        Specifying the likelihood  153
        Specifying a simple independent prior distribution  154
        Interpretation of the regression coefficients  154
        A regression example using WinBUGS  157
    Using vectors and multivariate priors in normal regression models  161
        Defining the model using matrices  161
        Prior distributions for normal regression models  162
        Multivariate normal priors in WinBUGS  163
        Continuation of Example 5.1  164
    Analysis of variance models  167
        Parametrization and parameter interpretation  168
        One-way ANOVA model in WinBUGS  169
        A one-way ANOVA example using WinBUGS  171
        Multifactor analysis of variance  184
|
Incorporating Categorical Variables in Normal Models and Further Modeling Issues  189
    Analysis of variance models using dummy variables  191
    Analysis of covariance models  195
        Models using one quantitative variable and one qualitative variable  197
    Slope ratio analysis: Models with a common intercept and different slopes  212
    Comparison of the two approaches  217
    Extending the simple ANCOVA model  218
    Using binary indicators to specify models in multiple regression  219
    Selection of variables using the deviance information criterion (DIC)  219
|
Introduction to Generalized Linear Models: Binomial and Poisson Data  229
    Common distributions as members of the exponential family  231
    Common generalized linear models  236
    Interpretation of GLM coefficients  238
    The posterior distribution of a generalized linear model  241
    GLM specification in WinBUGS  242
    Poisson regression models  242
        Interpretation of Poisson log-linear parameters  242
        A simple Poisson regression example  245
        A Poisson regression model for modeling football data  249
    Interpretation of model parameters in binomial response models  257
    Models for contingency tables  269
|
Models for Positive Continuous Data, Count Data, and Other GLM-Based Extensions  275
    Models with nonstandard distributions  275
        Specification of an arbitrary likelihood using the zeros-ones trick  276
        The inverse Gaussian model  277
    Models for positive continuous response variables  279
    Additional models for count data  282
        The negative binomial model  283
        The generalized Poisson model  286
        The bivariate Poisson model  291
        The Poisson difference model  293
    Further GLM-based models and extensions  296
        Additional models and further reading  300
|
Bayesian Hierarchical Models  305
    A simple motivating example  306
    Why use a hierarchical model?  307
    Other advantages and characteristics  308
    Introducing random effects in performance parameters  313
    Poisson mixture models for count data  315
    The use of hierarchical models in meta-analysis  318
    The generalized linear mixed model formulation  320
        A hierarchical normal model: A simple crossover trial  321
        Logit GLMM for correlated binary responses  325
        Poisson log-linear GLMMs for correlated count data  333
    Discussion, closing remarks, and further reading  338
|
The Predictive Distribution and Model Checking  341
    Prediction within the Bayesian framework  341
    Using posterior predictive densities for model evaluation and checking  342
    Cross-validation predictive densities  344
    Estimating the predictive distribution for future or missing observations using MCMC  344
        A simple example: Estimating missing observations  345
        An example of Bayesian prediction using a simple model  347
    Using the predictive distribution for model checking  354
        Comparison of actual and predictive frequencies for discrete data  354
        Comparison of cumulative frequencies for predictive and actual values for continuous data  357
        Comparison of ordered predictive and actual values for continuous data  358
        Estimation of the posterior predictive ordinate  359
        Checking individual observations using residuals  362
        Checking structural assumptions of the model  365
        Checking the goodness-of-fit of a model  368
    Using cross-validation predictive densities for model checking, evaluation, and comparison  375
        Estimating the conditional predictive ordinate  375
        Generating values from the leave-one-out cross-validatory predictive distributions  377
    Illustration of a complete predictive analysis: Normal regression models  378
        Checking structural assumptions of the model  378
        Detailed checks based on residual analysis  379
        Overall goodness-of-fit of the model  380
        Implementation using WinBUGS  380
        Summary of the model checking procedure  386
|
Bayesian Model and Variable Evaluation  389
    Prior predictive distributions as measures of model comparison: Posterior model odds and Bayes factors  389
    Sensitivity of the posterior model probabilities: The Lindley-Bartlett paradox  391
    Computation of the marginal likelihood  392
        Approximations based on the normal distribution  392
        Sampling from the prior: A naive Monte Carlo estimator  392
        Sampling from the posterior: The harmonic mean estimator  393
        Importance sampling estimators  394
        Bridge sampling estimators  394
        Chib's marginal likelihood estimator  395
        Additional details and further reading  397
    Computation of the marginal likelihood using WinBUGS  397
        A normal regression example with a conjugate normal-inverse gamma prior  403
    Bayesian variable selection using Gibbs-based methods  405
        Prior distributions for variable selection in GLMs  406
        Other Gibbs-based methods for variable selection  410
    Posterior inference using the output of Bayesian variable selection samplers  412
    Implementation of Gibbs variable selection in WinBUGS using an illustrative example  414
    Reversible jump MCMC (RJMCMC)  420
    Using posterior predictive densities for model evaluation  421
        Estimation from an MCMC output  423
        A simple example in WinBUGS  424
    The Bayes information criterion (BIC)  425
    The Akaike information criterion (AIC)  426
    Calculation of penalized deviance measures from the MCMC output  428
    Implementation in WinBUGS  428
    A simple example in WinBUGS  429
    Discussion and further reading  432
|
Appendix A: Model Specification via Directed Acyclic Graphs: The DOODLE Menu  435
    A.1 Introduction: Starting with DOODLE  435
|
Appendix B: The Batch Mode: Running a Model in the Background Using Scripts  443
    B.2 Basic commands: Compiling and running the model  444
|
Appendix C: Checking Convergence Using CODA/BOA  447
    C.2 A short historical review  448
    C.3 Diagnostics implemented by CODA/BOA  448
        C.3.1 The Geweke diagnostic  448
        C.3.2 The Gelman-Rubin diagnostic  449
        C.3.3 The Raftery-Lewis diagnostic  449
        C.3.4 The Heidelberger-Welch diagnostic  449
    C.4 A first look at CODA/BOA  450
    C.5.1 Illustration in CODA  453
    C.5.2 Illustration in BOA  457
|
Appendix D: Notation Summary  461
    D.2 Subscripts and indices  462
    D.4 Random variables and data  463
    D.6 Special functions, vectors, and matrices  464
    D.8 Distribution-related notation  465
    D.9 Notation used in ANOVA and ANCOVA  466
    D.10 Variable and model specification  466
    D.11 Deviance information criterion (DIC)  466
|
|