
E-book: Bayesian Modeling Using WinBUGS

(Athens University of Economics and Business, Greece)
  • Format: EPUB+DRM
  • Price: €172.84*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You must also create an Adobe ID. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on mobile devices (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

A hands-on introduction to the principles of Bayesian modeling using WinBUGS

Bayesian Modeling Using WinBUGS provides an accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author offers readers a smooth introduction to the principles of Bayesian modeling, with detailed guidance on the practical implementation of the key ideas.

The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including:

  • Markov chain Monte Carlo algorithms in Bayesian inference
  • Generalized linear models
  • Bayesian hierarchical models
  • Predictive distributions and model checking
  • Bayesian model and variable evaluation
Computational notes and screen captures illustrate the use of both WinBUGS and R to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts, and all data sets and code are available on the book's related Web site.
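To suggest the flavor of the WinBUGS code the book teaches, here is a minimal sketch of a normal regression model of the kind covered in its chapters on normal models. It is not taken from the book; the node names (y, x, beta0, beta1, tau) and prior settings are illustrative:

```
model {
    for (i in 1:n) {
        y[i] ~ dnorm(mu[i], tau)       # likelihood: normal, parameterized by precision tau
        mu[i] <- beta0 + beta1 * x[i]  # linear predictor
    }
    # vague (low-precision) priors
    beta0 ~ dnorm(0.0, 1.0E-4)
    beta1 ~ dnorm(0.0, 1.0E-4)
    tau ~ dgamma(0.001, 0.001)
    sigma2 <- 1 / tau                  # error variance as a derived node
}
```

In WinBUGS such a model is compiled together with a data list and initial values, after which MCMC samples of the monitored nodes are drawn and summarized with the Sample Monitor Tool — the workflow the book walks through step by step.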

Requiring only a working knowledge of probability theory and statistics, Bayesian Modeling Using WinBUGS serves as an excellent textbook for courses on Bayesian statistics at the upper-undergraduate and graduate levels. It is also a valuable reference for researchers and practitioners in the fields of statistics, actuarial science, medicine, and the social sciences who use WinBUGS in their everyday work.
Preface xvii
Acknowledgments xix
Acronyms xxi
Introduction to Bayesian Inference
1(30)
Introduction: Bayesian modeling in the 21st century
1(2)
Definition of statistical models
3(1)
Bayes theorem
3(1)
Model-based Bayesian inference
4(3)
Inference using conjugate prior distributions
7(17)
Inference for the Poisson rate of count data
7(1)
Inference for the success probability of binomial data
8(1)
Inference for the mean of normal data with known variance
9(2)
Inference for the mean and variance of normal data
11(1)
Inference for normal regression models
12(2)
Other conjugate prior distributions
14(1)
Illustrative examples
14(10)
Nonconjugate analysis
24(7)
Problems
27(4)
Markov Chain Monte Carlo Algorithms in Bayesian Inference
31(52)
Simulation, Monte Carlo integration, and their implementation in Bayesian inference
31(4)
Markov chain Monte Carlo methods
35(7)
The algorithm
36(1)
Terminology and implementation details
37(5)
Popular MCMC algorithms
42(39)
The Metropolis-Hastings algorithm
42(3)
Componentwise Metropolis-Hastings
45(26)
The Gibbs sampler
71(5)
Metropolis within Gibbs
76(1)
The slice Gibbs sampler
76(1)
A simple example using the slice sampler
77(4)
Summary and closing remarks
81(2)
Problems
81(2)
WinBUGS Software: Introduction, Setup, and Basic Analysis
83(42)
Introduction and historical background
83(1)
The WinBUGS environment
84(4)
Downloading and installing WinBUGS
84(1)
A short description of the menus
85(3)
Preliminaries on using WinBUGS
88(5)
Code structure and type of parameters/nodes
88(1)
Scalar, vector, matrix, and array nodes
89(4)
Building Bayesian models in WinBUGS
93(15)
Function description
93(4)
Using the for syntax and array, matrix, and vector calculations
97(1)
Use of parentheses, brackets and curly braces in WinBUGS
98(1)
Differences between WinBUGS and R/Splus syntax
98(1)
Model specification in WinBUGS
99(1)
Data and initial value specification
100(7)
An example of a complete model specification
107(1)
Data transformations
108(1)
Compiling the model and simulating values
108(9)
Basic output analysis using the sample monitor tool
117(3)
Summarizing the procedure
120(1)
Chapter summary and concluding comments
121(4)
Problems
121(4)
WinBUGS Software: Illustration, Results, and Further Analysis
125(26)
A complete example of running MCMC in WinBUGS for a simple model
125(7)
The model
125(2)
Data and initial values
127(1)
Compiling and running the model
127(2)
MCMC output analysis and results
129(3)
Further output analysis using the inference menu
132(9)
Comparison of nodes
133(3)
Calculation of correlations
136(1)
Using the summary tool
137(1)
Evaluation and ranking of individuals
138(2)
Calculation of deviance information criterion
140(1)
Multiple chains
141(4)
Generation of multiple chains
141(1)
Output analysis
142(1)
The Gelman-Rubin convergence diagnostic
143(2)
Changing the properties of a figure
145(3)
General graphical options
145(1)
Special graphical options
145(3)
Other tools and menus
148(1)
The node info tool
148(1)
Monitoring the acceptance rate of the Metropolis-Hastings algorithm
148(1)
Saving the current state of the chain
149(1)
Setting the starting seed number
149(1)
Running the model as a script
149(1)
Summary and concluding remarks
149(2)
Problems
150(1)
Introduction to Bayesian Models: Normal Models
151(38)
General modeling principles
151(1)
Model specification in normal regression models
152(9)
Specifying the likelihood
153(1)
Specifying a simple independent prior distribution
154(1)
Interpretation of the regression coefficients
154(3)
A regression example using WinBUGS
157(4)
Using vectors and multivariate priors in normal regression models
161(6)
Defining the model using matrices
161(1)
Prior distributions for normal regression models
162(1)
Multivariate normal priors in WinBUGS
163(1)
Continuation of Example 5.1
164(3)
Analysis of variance models
167(22)
The one-way ANOVA model
167(1)
Parametrization and parameter interpretation
168(1)
One-way ANOVA model in WinBUGS
169(2)
A one-way ANOVA example using WinBUGS
171(2)
Two-way ANOVA models
173(11)
Multifactor analysis of variance
184(1)
Problems
184(5)
Incorporating Categorical Variables in Normal Models and Further Modeling Issues
189(40)
Analysis of variance models using dummy variables
191(4)
Analysis of covariance models
195(8)
Models using one quantitative variable and one qualitative variable
197(1)
The parallel lines model
197(4)
The separate lines model
201(2)
A bioassay example
203(15)
Parallel lines analysis
204(8)
Slope ratio analysis: Models with common intercept and different slope
212(5)
Comparison of the two approaches
217(1)
Further modeling issues
218(8)
Extending the simple ANCOVA model
218(1)
Using binary indicators to specify models in multiple regression
219(1)
Selection of variables using the deviance information criterion (DIC)
219(7)
Closing remarks
226(3)
Problems
226(3)
Introduction to Generalized Linear Models: Binomial and Poisson Data
229(46)
Introduction
229(10)
The exponential family
230(1)
Common distributions as members of the exponential family
231(3)
Link functions
234(2)
Common generalized linear models
236(2)
Interpretation of GLM coefficients
238(1)
Prior distributions
239(2)
Posterior inference
241(1)
The posterior distribution of a generalized linear model
241(1)
GLM specification in WinBUGS
242(1)
Poisson regression models
242(13)
Interpretation of Poisson log-linear parameters
242(3)
A simple Poisson regression example
245(4)
A Poisson regression model for modeling football data
249(6)
Binomial response models
255(14)
Interpretation of model parameters in binomial response models
257(6)
A simple example
263(6)
Models for contingency tables
269(6)
Problems
270(5)
Models for Positive Continuous Data, Count Data, and Other GLM-Based Extensions
275(30)
Models with nonstandard distributions
275(4)
Specification of arbitrary likelihood using the zeros-ones trick
276(1)
The inverse Gaussian model
277(2)
Models for positive continuous response variables
279(3)
The gamma model
279(1)
Other models
280(1)
An example
281(1)
Additional models for count data
282(14)
The negative binomial model
283(3)
The generalized Poisson model
286(2)
Zero inflated models
288(3)
The bivariate Poisson model
291(2)
The Poisson difference model
293(3)
Further GLM-based models and extensions
296(9)
Survival analysis models
297(1)
Multinomial models
298(2)
Additional models and further reading
300(1)
Problems
301(4)
Bayesian Hierarchical Models
305(36)
Introduction
305(3)
A simple motivating example
306(1)
Why use a hierarchical model?
307(1)
Other advantages and characteristics
308(1)
Some simple examples
308(12)
Repeated measures data
308(5)
Introducing random effects in performance parameters
313(2)
Poisson mixture models for count data
315(3)
The use of hierarchical models in meta-analysis
318(2)
The generalized linear mixed model formulation
320(18)
A hierarchical normal model: A simple crossover trial
321(4)
Logit GLMM for correlated binary responses
325(8)
Poisson log-linear GLMMs for correlated count data
333(7)
Discussion, closing remarks, and further reading
338(3)
Problems
340(1)
The Predictive Distribution and Model Checking
341(48)
Introduction
341(3)
Prediction within Bayesian framework
341(1)
Using posterior predictive densities for model evaluation and checking
342(2)
Cross-validation predictive densities
344(1)
Estimating the predictive distribution for future or missing observations using MCMC
344(10)
A simple example: Estimating missing observations
345(2)
An example of Bayesian prediction using a simple model
347(7)
Using the predictive distribution for model checking
354(21)
Comparison of actual and predictive frequencies for discrete data
354(3)
Comparison of cumulative frequencies for predictive and actual values for continuous data
357(1)
Comparison of ordered predictive and actual values for continuous data
358(1)
Estimation of the posterior predictive ordinate
359(3)
Checking individual observations using residuals
362(3)
Checking structural assumptions of the model
365(3)
Checking the goodness-of-fit of a model
368(7)
Using cross-validation predictive densities for model checking, evaluation, and comparison
375(3)
Estimating the conditional predictive ordinate
375(2)
Generating values from the leave-one-out cross-validatory predictive distributions
377(1)
Illustration of a complete predictive analysis: Normal regression models
378(9)
Checking structural assumptions of the model
378(1)
Detailed checks based on residual analysis
379(1)
Overall goodness-of-fit of the model
380(1)
Implementation using WinBUGS
380(3)
An illustrative example
383(3)
Summary of the model checking procedure
386(1)
Discussion
387(2)
Problems
387(2)
Bayesian Model and Variable Evaluation
389(46)
Prior predictive distributions as measures of model comparison: Posterior model odds and Bayes factors
389(2)
Sensitivity of the posterior model probabilities: The Lindley-Bartlett paradox
391(1)
Computation of the marginal likelihood
392(5)
Approximations based on the normal distribution
392(1)
Sampling from the prior: A naive Monte Carlo estimator
392(1)
Sampling from the posterior: The harmonic mean estimator
393(1)
Importance sampling estimators
394(1)
Bridge sampling estimators
394(1)
Chib's marginal likelihood estimator
395(2)
Additional details and further reading
397(1)
Computation of the marginal likelihood using WinBUGS
397(8)
A beta-binomial example
399(4)
A normal regression example with conjugate normal-inverse gamma prior
403(2)
Bayesian variable selection using Gibbs-based methods
405(7)
Prior distributions for variable selection in GLM
406(3)
Gibbs variable selection
409(1)
Other Gibbs-based methods for variable selection
410(2)
Posterior inference using the output of Bayesian variable selection samplers
412(2)
Implementation of Gibbs variable selection in WinBUGS using an illustrative example
414(5)
The Carlin-Chib method
419(1)
Reversible jump MCMC (RJMCMC)
420(1)
Using posterior predictive densities for model evaluation
421(3)
Estimation from an MCMC output
423(1)
A simple example in WinBUGS
424(1)
Information criteria
424(8)
The Bayes information criterion (BIC)
425(1)
The Akaike information criterion (AIC)
426(1)
Other criteria
427(1)
Calculation of penalized deviance measures from the MCMC output
428(1)
Implementation in WinBUGS
428(1)
A simple example in WinBUGS
429(3)
Discussion and further reading
432(3)
Problems
432(3)
Appendix A: Model Specification via Directed Acyclic Graphs: The DOODLE Menu
435(8)
A.1 Introduction: Starting with DOODLE
435(1)
A.2 Nodes
436(2)
A.3 Edges
438(1)
A.4 Panels
438(1)
A.5 A simple example
439(4)
Appendix B: The Batch Mode: Running a Model in the Background Using Scripts
443(4)
B.1 Introduction
443(1)
B.2 Basic commands: Compiling and running the model
444(3)
Appendix C: Checking Convergence Using CODA/BOA
447(14)
C.1 Introduction
447(1)
C.2 A short historical review
448(1)
C.3 Diagnostics implemented by CODA/BOA
448(2)
C.3.1 The Geweke diagnostic
448(1)
C.3.2 The Gelman-Rubin diagnostic
449(1)
C.3.3 The Raftery-Lewis diagnostic
449(1)
C.3.4 The Heidelberger-Welch diagnostic
449(1)
C.3.5 Final remarks
450(1)
C.4 A first look at CODA/BOA
450(3)
C.4.1 CODA
450(1)
C.4.2 BOA
451(2)
C.5 A simple example
453(8)
C.5.1 Illustration in CODA
453(4)
C.5.2 Illustration in BOA
457(4)
Appendix D: Notation Summary
461(1)
D.1 MCMC
461(1)
D.2 Subscripts and indices
462(1)
D.3 Parameters
462(1)
D.4 Random variables and data
463(1)
D.5 Sample estimates
463(1)
D.6 Special functions, vectors, and matrices
464(1)
D.7 Distributions
464(1)
D.8 Distribution-related notation
465(1)
D.9 Notation used in ANOVA and ANCOVA
466(1)
D.10 Variable and model specification
466(1)
D.11 Deviance information criterion (DIC)
466(1)
D.12 Predictive measures
467
Ioannis Ntzoufras, PhD, is Assistant Professor of Statistics at Athens University of Economics and Business (Greece). Dr. Ntzoufras has published numerous journal articles in his areas of research interest, which include Bayesian statistics, statistical analysis and programming, and generalized linear models.