
E-book: The BUGS Book: A Practical Introduction to Bayesian Analysis

(MRC Biostatistics Unit, Cambridge, UK)
  • Format: PDF+DRM
  • Price: 57,19 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not permitted

  • Printing: not permitted

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorised with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This e-book cannot be read on an Amazon Kindle.

"Preface. History. Markov chain Monte Carlo (MCMC) methods, in which plausible values for unknown quantities are simulated from their appropriate probability distribution, have revolutionised the practice of statistics. For more than 20 years the BUGS project has been at the forefront of this movement. The BUGS project began in Cambridge, in 1989, just as Alan Gelfand and Adrian Smith were working 80 miles away in Nottingham on their classic Gibbs sampler paper (Gelfand and Smith, 1990) that kicked off the revolution. But we never communicated (except through the intermediate node of David Clayton) and whereas the Gelfand-Smith approach used image-processing as inspiration, the philosophy behind BUGS was rooted more in techniques for handling uncertaintyin artificial intelligence using directed graphical models and what came to be called Bayesian networks (Pearl, 1988). Lunn et al. (2009b) lay out all this history in greater detail. Some people have accused Markov chain Monte Carlo methods of being slow, but nothing could compare with the time it has taken this book to be written! The first proposal dates from 1995, but things got in the way, as they do, and it needed a vigorous new generation of researchers to finally get it finished. It is slightly galling that much of the current book could have been written in the mid-1990s, since the basic ideas of the software, the language for model description, and indeed some of the examples are unchanged. Nevertheless there have been important developments in the extended gestational period of the book, for example techniques for model criticism and comparison, implementation of differential equations and nonparametric techniques, and the ability to run BUGS code within a range of alternative programs"--



Reviews

"This is a beautiful bookit was a pleasure, and indeed great fun to read. The authors succeeded in writing a very nicely readable yet concise and carefully balanced text. It contains a lot of motivation, detailed explanations, necessary pieces of underlying theory, references to useful book-length treatments of various topics, and examples of the code illustrating how to implement concrete models in the BUGS language efficiently. this book also has a substantial pedagogical value. By reading this book carefully, redoing the examples, and thinking about them, one can learn a lot not only about BUGS, but also about Bayesian methods and statistics in general. highly recommended to a wide audience, from students of statistics [ to] practicing statisticians to researchers from various fields." ISCB News, 57, June 2014

" truly demonstrates the power and flexibility of the BUGS software and its broad range of applications, and that makes this book highly relevant not only for beginners but for advanced users as well. a notable addition to the growing range of introductory Bayesian textbooks that have been published within the last decade. It is unique in its focus on explicating state-of-the-art computational Bayesian strategies in the WinBUGS software. Thus, practitioners may use it as an excellent, didactically enhanced BUGS manual that, unlike ordinary software manuals, presents detailed explanations of the underlying models with references to relevant literature [ and] worked examples, including excerpts of WinBUGS code, as well as graphical illustrations of results and critical discussions. No doubt, The BUGS Book will become a classic Bayesian textbook and provide invaluable guidance to practicing statisticians, academics, and students alike." Renate Meyer, Journal of Biopharmaceutical Statistics, 2014

"In this book the developers of BUGS reveal the power of the BUGS software and how it can be used in Bayesian statistical modeling and inference. Many people will find it very useful for self-learning or as a supplement for a Bayesian inference course." William M. Bolstad, Australian & New Zealand Journal of Statistics, 2013

"If a book has ever been so much desired in the world of statistics, it is for sure this one. the tens of thousands of users of WinBUGS are indebted to the leading team of the BUGS project for having eventually succeeded in finalizing the writing of this book and for making sure that the long-held expectations are not dashed. it reflects very well the aims and spirit of the BUGS project and is meant to be a manual for anyone who would like to apply Bayesian methods to real-world problems. strikes the right distance between advanced theory and pure practice. I especially like the numerous examples given in the successive chapters which always help readers to figure out what is going on and give them new ideas to improve their BUGS skills. The BUGS Book is not only a major textbook on a topical subject, but it is also a mandatory one for all statisticians willing to learn and analyze data with Bayesian statistics at any level. It will be the companion and reference book for all users (beginners or advanced) of the BUGS software. I have no doubt it will meet the same success as BUGS and become very soon a classic in the literature of computational Bayesian statistics." Jean-Louis Fouley, CHANCE, 2013

" a two-in-one product that provides the reader with both a BUGS manual and a Bayesian analysis textbook, a combination that will likely appeal to many potential readers. The strength of The BUGS Book is its rich collection of ambitiously constructed and thematically arranged examples, which often come with snippets of code and printouts, as well as illustrative plots and diagrams. great value to many readers seeking to familiarize themselves with BUGS and its capabilities." Joakim Ekström, Journal of Statistical Software, January 2013

"MCMC freed Bayes from the shackles of conjugate priors and the curse of dimensionality; BUGS then brought MCMC-Bayes to the masses, yielding an astonishing explosion in the number, quality, and complexity of Bayesian inference over a vast array of application areas, from finance to medicine to data mining. The most anticipated applied Bayesian text of the last 20 years, The BUGS Book is like a wonderful album by an established rock supergroup: the pressure to deliver a high-quality product was enormous, but the authors have created a masterpiece well worth the wait. The book offers the perfect mix of basic probability calculus, Bayes and MCMC basics, an incredibly broad array of useful statistical models, and a BUGS tutorial and user manual complete with all the tricks one would expect from the team that invented the language. BUGS is the dominant Bayesian software package of the post-MCMC era, and this book ensures it will remain so for years to come by providing accessible yet comprehensive instruction in its proper use. A must-own for any working applied statistical modeler." Bradley P. Carlin, Professor and Head of Division of Biostatistics, University of Minnesota, Minneapolis, USA "This is a beautiful bookit was a pleasure, and indeed great fun to read. The authors succeeded in writing a very nicely readable yet concise and carefully balanced text. It contains a lot of motivation, detailed explanations, necessary pieces of underlying theory, references to useful book-length treatments of various topics, and examples of the code illustrating how to implement concrete models in the BUGS language efficiently. this book also has a substantial pedagogical value. By reading this book carefully, redoing the examples, and thinking about them, one can learn a lot not only about BUGS, but also about Bayesian methods and statistics in general. highly recommended to a wide audience, from students of statistics [ to] practicing statisticians to researchers from various fields." ISCB News, 57, June 2014

" truly demonstrates the power and flexibility of the BUGS software and its broad range of applications, and that makes this book highly relevant not only for beginners but for advanced users as well. a notable addition to the growing range of introductory Bayesian textbooks that have been published within the last decade. It is unique in its focus on explicating state-of-the-art computational Bayesian strategies in the WinBUGS software. Thus, practitioners may use it as an excellent, didactically enhanced BUGS manual that, unlike ordinary software manuals, presents detailed explanations of the underlying models with references to relevant literature [ and] worked examples, including excerpts of WinBUGS code, as well as graphical illustrations of results and critical discussions. No doubt, The BUGS Book will become a classic Bayesian textbook and provide invaluable guidance to practicing statisticians, academics, and students alike." Renate Meyer, Journal of Biopharmaceutical Statistics, 2014

"In this book the developers of BUGS reveal the power of the BUGS software and how it can be used in Bayesian statistical modeling and inference. Many people will find it very useful for self-learning or as a supplement for a Bayesian inference course." William M. Bolstad, Australian & New Zealand Journal of Statistics, 2013

"If a book has ever been so much desired in the world of statistics, it is for sure this one. the tens of thousands of users of WinBUGS are indebted to the leading team of the BUGS project for having eventually succeeded in finalizing the writing of this book and for making sure that the long-held expectations are not dashed. it reflects very well the aims and spirit of the BUGS project and is meant to be a manual for anyone who would like to apply Bayesian methods to real-world problems. strikes the right distance between advanced theory and pure practice. I especially like the numerous examples given in the successive chapters which always help readers to figure out what is going on and give them new ideas to improve their BUGS skills. The BUGS Book is not only a major textbook on a topical subject, but it is also a mandatory one for all statisticians willing to learn and analyze data with Bayesian statistics at any level. It will be the companion and reference book for all users (beginners or advanced) of the BUGS software. I have no doubt it will meet the same success as BUGS and become very soon a classic in the literature of computational Bayesian statistics." Jean-Louis Fouley, CHANCE, 2013

" a two-in-one product that provides the reader with both a BUGS manual and a Bayesian analysis textbook, a combination that will likely appeal to many potential readers. The strength of The BUGS Book is its rich collection of ambitiously constructed and thematically arranged examples, which often come with snippets of code and printouts, as well as illustrative plots and diagrams. great value to many readers seeking to familiarize themselves with BUGS and its capabilities." Joakim Ekström, Journal of Statistical Software, January 2013

"MCMC freed Bayes from the shackles of conjugate priors and the curse of dimensionality; BUGS then brought MCMC-Bayes to the masses, yielding an astonishing explosion in the number, quality, and complexity of Bayesian inference over a vast array of application areas, from finance to medicine to data mining. The most anticipated applied Bayesian text of the last 20 years, The BUGS Book is like a wonderful album by an established rock supergroup: the pressure to deliver a high-quality product was enormous, but the authors have created a masterpiece well worth the wait. The book offers the perfect mix of basic probability calculus, Bayes and MCMC basics, an incredibly broad array of useful statistical models, and a BUGS tutorial and user manual complete with all the tricks one would expect from the team that invented the language. BUGS is the dominant Bayesian software package of the post-MCMC era, and this book ensures it will remain so for years to come by providing accessible yet comprehensive instruction in its proper use. A must-own for any working applied statistical modeler." Bradley P. Carlin, Professor and Head of Division of Biostatistics, University of Minnesota, Minneapolis, USA

Table of contents

Preface xiii
1 Introduction: Probability and parameters 1(12)
1.1 Probability 1(4)
1.2 Probability distributions 5(2)
1.3 Calculating properties of probability distributions 7(1)
1.4 Monte Carlo integration 8(5)
2 Monte Carlo simulations using BUGS 13(20)
2.1 Introduction to BUGS 13(8)
2.1.1 Background 13(1)
2.1.2 Directed graphical models 13(2)
2.1.3 The BUGS language 15(1)
2.1.4 Running BUGS models 16(1)
2.1.5 Running WinBUGS for a simple example 17(4)
2.2 DoodleBUGS 21(1)
2.3 Using BUGS to simulate from distributions 22(2)
2.4 Transformations of random variables 24(2)
2.5 Complex calculations using Monte Carlo 26(1)
2.6 Multivariate Monte Carlo analysis 27(2)
2.7 Predictions with unknown parameters 29(4)
3 Introduction to Bayesian inference 33(24)
3.1 Bayesian learning 33(3)
3.1.1 Bayes' theorem for observable quantities 33(1)
3.1.2 Bayesian inference for parameters 34(2)
3.2 Posterior predictive distributions 36(1)
3.3 Conjugate Bayesian inference 36(9)
3.3.1 Binomial data 37(4)
3.3.2 Normal data with unknown mean, known variance 41(4)
3.4 Inference about a discrete parameter 45(4)
3.5 Combinations of conjugate analyses 49(2)
3.6 Bayesian and classical methods 51(6)
3.6.1 Likelihood-based inference 52(1)
3.6.2 Exchangeability 52(1)
3.6.3 Long-run properties of Bayesian methods 53(1)
3.6.4 Model-based vs procedural methods 54(1)
3.6.5 The "likelihood principle" 55(2)
4 Introduction to Markov chain Monte Carlo methods 57(24)
4.1 Bayesian computation 57(5)
4.1.1 Single-parameter models 57(2)
4.1.2 Multi-parameter models 59(2)
4.1.3 Monte Carlo integration for evaluating posterior integrals 61(1)
4.2 Markov chain Monte Carlo methods 62(8)
4.2.1 Gibbs sampling 63(1)
4.2.2 Gibbs sampling and directed graphical models 64(4)
4.2.3 Derivation of full conditional distributions in BUGS 68(1)
4.2.4 Other MCMC methods 68(2)
4.3 Initial values 70(1)
4.4 Convergence 71(6)
4.4.1 Detecting convergence/stationarity by eye 72(1)
4.4.2 Formal detection of convergence/stationarity 73(4)
4.5 Efficiency and accuracy 77(2)
4.5.1 Monte Carlo standard error of the posterior mean 77(1)
4.5.2 Accuracy of the whole posterior 78(1)
4.6 Beyond MCMC 79(2)
5 Prior distributions 81(22)
5.1 Different purposes of priors 81(1)
5.2 Vague, "objective," and "reference" priors 82(7)
5.2.1 Introduction 82(1)
5.2.2 Discrete uniform distributions 83(1)
5.2.3 Continuous uniform distributions and Jeffreys prior 83(1)
5.2.4 Location parameters 84(1)
5.2.5 Proportions 84(1)
5.2.6 Counts and rates 85(2)
5.2.7 Scale parameters 87(1)
5.2.8 Distributions on the positive integers 88(1)
5.2.9 More complex situations 89(1)
5.3 Representation of informative priors 89(6)
5.3.1 Elicitation of pure judgement 90(3)
5.3.2 Discounting previous data 93(2)
5.4 Mixture of prior distributions 95(2)
5.5 Sensitivity analysis 97(6)
6 Regression models 103(18)
6.1 Linear regression with normal errors 103(4)
6.2 Linear regression with non-normal errors 107(2)
6.3 Non-linear regression with normal errors 109(3)
6.4 Multivariate responses 112(2)
6.5 Generalised linear regression models 114(4)
6.6 Inference on functions of parameters 118(1)
6.7 Further reading 119(2)
7 Categorical data 121(16)
7.1 2 × 2 tables 121(5)
7.1.1 Tables with one margin fixed 122(3)
7.1.2 Case-control studies 125(1)
7.1.3 Tables with both margins fixed 126(1)
7.2 Multinomial models 126(6)
7.2.1 Conjugate analysis 126(2)
7.2.2 Non-conjugate analysis --- parameter constraints 128(1)
7.2.3 Categorical data with covariates 129(2)
7.2.4 Multinomial and Poisson regression equivalence 131(1)
7.2.5 Contingency tables 132(1)
7.3 Ordinal regression 132(2)
7.4 Further reading 134(3)
8 Model checking and comparison 137(48)
8.1 Introduction 137(1)
8.2 Deviance 138(2)
8.3 Residuals 140(7)
8.3.1 Standardised Pearson residuals 140(2)
8.3.2 Multivariate residuals 142(1)
8.3.3 Observed p-values for distributional shape 143(2)
8.3.4 Deviance residuals and tests of fit 145(2)
8.4 Predictive checks and Bayesian p-values 147(10)
8.4.1 Interpreting discrepancy statistics --- how big is big? 147(1)
8.4.2 Out-of-sample prediction 148(1)
8.4.3 Checking functions based on data alone 148(4)
8.4.4 Checking functions based on data and parameters 152(3)
8.4.5 Goodness of fit for grouped data 155(2)
8.5 Model assessment by embedding in larger models 157(2)
8.6 Model comparison using deviances 159(10)
8.6.1 pD: The effective number of parameters 159(2)
8.6.2 Issues with pD 161(3)
8.6.3 Alternative measures of the effective number of parameters 164(1)
8.6.4 DIC for model comparison 165(2)
8.6.5 How and why does WinBUGS partition DIC and pD? 167(1)
8.6.6 Alternatives to DIC 168(1)
8.7 Bayes factors 169(4)
8.7.1 Lindley-Bartlett paradox in model selection 171(1)
8.7.2 Computing marginal likelihoods 172(1)
8.8 Model uncertainty 173(4)
8.8.1 Bayesian model averaging 173(1)
8.8.2 MCMC sampling over a space of models 173(2)
8.8.3 Model averaging when all models are wrong 175(1)
8.8.4 Model expansion 176(1)
8.9 Discussion on model comparison 177(1)
8.10 Prior-data conflict 178(7)
8.10.1 Identification of prior-data conflict 179(1)
8.10.2 Accommodation of prior-data conflict 180(5)
9 Issues in Modelling 185(34)
9.1 Missing data 185(8)
9.1.1 Missing response data 186(3)
9.1.2 Missing covariate data 189(4)
9.2 Prediction 193(2)
9.3 Measurement error 195(6)
9.4 Cutting feedback 201(3)
9.5 New distributions 204(2)
9.5.1 Specifying a new sampling distribution 204(1)
9.5.2 Specifying a new prior distribution 205(1)
9.6 Censored, truncated, and grouped observations 206(5)
9.6.1 Censored observations 206(2)
9.6.2 Truncated sampling distributions 208(1)
9.6.3 Grouped, rounded, or interval-censored data 209(2)
9.7 Constrained parameters 211(3)
9.7.1 Univariate fully specified prior distributions 211(1)
9.7.2 Multivariate fully specified prior distributions 211(3)
9.7.3 Prior distributions with unknown parameters 214(1)
9.8 Bootstrapping 214(1)
9.9 Ranking 215(4)
10 Hierarchical models 219(34)
10.1 Exchangeability 219(4)
10.2 Priors 223(4)
10.2.1 Unit-specific parameters 223(1)
10.2.2 Parameter constraints 223(2)
10.2.3 Priors for variance components 225(2)
10.3 Hierarchical regression models 227(10)
10.3.1 Data formatting 230(7)
10.4 Hierarchical models for variances 237(3)
10.5 Redundant parameterisations 240(2)
10.6 More general formulations 242(1)
10.7 Checking of hierarchical models 242(7)
10.8 Comparison of hierarchical models 249(3)
10.8.1 "Focus": The crucial element of model comparison in hierarchical models 250(2)
10.9 Further resources 252(1)
11 Specialised models 253(44)
11.1 Time-to-event data 253(4)
11.1.1 Parametric survival regression 254(3)
11.2 Time series models 257(5)
11.3 Spatial models 262(11)
11.3.1 Intrinsic conditionally autoregressive (CAR) models 263(1)
11.3.2 Supplying map polygon data to WinBUGS and creating adjacency matrices 264(4)
11.3.3 Multivariate CAR models 268(1)
11.3.4 Proper CAR model 269(1)
11.3.5 Poisson-gamma moving average models 269(1)
11.3.6 Geostatistical models 270(3)
11.4 Evidence synthesis 273(5)
11.4.1 Meta-analysis 273(1)
11.4.2 Generalised evidence synthesis 274(4)
11.5 Differential equation and pharmacokinetic models 278(2)
11.6 Finite mixture and latent class models 280(6)
11.6.1 Mixture models using an explicit likelihood 283(3)
11.7 Piecewise parametric models 286(5)
11.7.1 Change-point models 286(2)
11.7.2 Splines 288(1)
11.7.3 Semiparametric survival models 288(3)
11.8 Bayesian nonparametric models 291(6)
11.8.1 Dirichlet process mixtures 293(1)
11.8.2 Stick-breaking implementation 293(4)
12 Different implementations of BUGS 297(32)
12.1 Introduction --- BUGS engines and interfaces 297(1)
12.2 Expert systems and MCMC methods 298(1)
12.3 Classic BUGS 299(1)
12.4 WinBUGS 300(15)
12.4.1 Using WinBUGS: compound documents 301(1)
12.4.2 Formatting data 301(3)
12.4.3 Using the WinBUGS graphical interface 304(4)
12.4.4 Doodles 308(1)
12.4.5 Scripting 308(2)
12.4.6 Interfaces with other software 310(1)
12.4.7 R2WinBUGS 311(2)
12.4.8 WBDev 313(2)
12.5 OpenBUGS 315(5)
12.5.1 Differences from WinBUGS 317(1)
12.5.2 OpenBUGS on Linux 317(1)
12.5.3 BRugs 318(1)
12.5.4 Parallel computation 319(1)
12.6 JAGS 320(9)
12.6.1 Extensibility: modules 321(1)
12.6.2 Language differences 321(3)
12.6.3 Other differences from WinBUGS 324(1)
12.6.4 Running JAGS from the command line 325(1)
12.6.5 Running JAGS from R 326(3)
Appendix A BUGS language syntax 329(8)
A.1 Introduction 329(1)
A.2 Distributions 329(2)
A.2.1 Standard distributions 329(1)
A.2.2 Censoring and truncation 330(1)
A.2.3 Non-standard distributions 331(1)
A.3 Deterministic functions 331(1)
A.3.1 Standard functions 331(1)
A.3.2 Special functions 331(1)
A.3.3 Add-on functions 332(1)
A.4 Repetition 332(1)
A.5 Multivariate quantities 333(1)
A.6 Indexing 334(1)
A.6.1 Functions as indices 334(1)
A.6.2 Implicit indexing 334(1)
A.6.3 Nested indexing 334(1)
A.7 Data transformations 335(1)
A.8 Commenting 335(2)
Appendix B Functions in BUGS 337(6)
B.1 Standard functions 337(1)
B.2 Trigonometric functions 337(1)
B.3 Matrix algebra 337(3)
B.4 Distribution utilities and model checking 340(1)
B.5 Functionals and differential equations 341(1)
B.6 Miscellaneous 342(1)
Appendix C Distributions in BUGS 343(14)
C.1 Continuous univariate, unrestricted range 343(2)
C.2 Continuous univariate, restricted to be positive 345(4)
C.3 Continuous univariate, restricted to a finite interval 349(1)
C.4 Continuous multivariate distributions 350(1)
C.5 Discrete univariate distributions 351(3)
C.6 Discrete multivariate distributions 354(3)
Bibliography 357(16)
Index 373
David Lunn, Chris Jackson, Nicky Best, Andrew Thomas, David Spiegelhalter