1 Introduction to Bayesian Statistics
1.1 The Frequentist Approach to Statistics
1.2 The Bayesian Approach to Statistics
1.3 Comparing Likelihood and Bayesian Approaches to Statistics
1.4 Computational Bayesian Statistics
1.5 Purpose and Organization of This Book
2 Monte Carlo Sampling from the Posterior
2.1 Acceptance-Rejection Sampling
2.2 Sampling-Importance-Resampling
2.3 Adaptive Rejection Sampling from a Log-Concave Distribution
2.4 Why Direct Methods Are Inefficient for High-Dimension Parameter Space
3 Bayesian Inference
3.1 Bayesian Inference from the Numerical Posterior
3.2 Bayesian Inference from a Posterior Random Sample
4 Bayesian Statistics Using Conjugate Priors
4.1 One-Dimensional Exponential Family of Densities
4.2 Distributions for Count Data
4.3 Distributions for Waiting Times
4.4 Normally Distributed Observations with Known Variance
4.5 Normally Distributed Observations with Known Mean
4.6 Normally Distributed Observations with Unknown Mean and Variance
4.7 Multivariate Normal Observations with Known Covariance Matrix
4.8 Observations from Normal Linear Regression Model
Appendix: Proof of Poisson Process Theorem
5 Markov Chains
5.1 Stochastic Processes
5.2 Markov Chains
5.3 Time-Invariant Markov Chains with Finite State Space
5.4 Classification of States of a Markov Chain
5.5 Sampling from a Markov Chain
5.6 Time-Reversible Markov Chains and Detailed Balance
5.7 Markov Chains with Continuous State Space
6 Markov Chain Monte Carlo Sampling from the Posterior
6.1 Metropolis-Hastings Algorithm for a Single Parameter
6.2 Metropolis-Hastings Algorithm for Multiple Parameters
6.3 Blockwise Metropolis-Hastings Algorithm
7 Statistical Inference from a Markov Chain Monte Carlo Sample
7.1 Mixing Properties of the Chain
7.2 Finding a Heavy-Tailed Matched Curvature Candidate Density
7.3 Obtaining an Approximate Random Sample for Inference
Appendix: Procedure for Finding the Matched Curvature Candidate Density for a Multivariate Parameter
8 Logistic Regression
8.1 Logistic Regression Model
8.2 Computational Bayesian Approach to the Logistic Regression Model
8.3 Modelling with the Multiple Logistic Regression Model
9 Poisson Regression and Proportional Hazards Model
9.1 Poisson Regression Model
9.2 Computational Approach to Poisson Regression Model
9.3 The Proportional Hazards Model
9.4 Computational Bayesian Approach to Proportional Hazards Model
10 Gibbs Sampling and Hierarchical Models
10.1 Gibbs Sampling Procedure
10.2 The Gibbs Sampler for the Normal Distribution
10.3 Hierarchical Models and Gibbs Sampling
10.4 Modelling Related Populations with Hierarchical Models
Appendix: Proof that the Improper Jeffreys' Prior Distribution for the Hypervariance Can Lead to an Improper Posterior
11 Going Forward with Markov Chain Monte Carlo
A Using the Included Minitab Macros
B Using the Included R Functions
Topic Index