Preface  xi
|
1 Probability: A Measurement of Uncertainty  1
  1.1 Introduction  1
  1.2 The Classical View of a Probability  2
  1.3 The Frequency View of a Probability  4
  1.4 The Subjective View of a Probability  6
  1.5 The Sample Space  9
  1.6 Assigning Probabilities  12
  1.7 Events and Event Operations  15
  1.8 The Three Probability Axioms  16
  1.9 The Complement and Addition Properties  18
  1.10 Exercises  19
|
2 Counting Methods  33
  2.1 Introduction: Rolling Dice, Yahtzee, and Roulette  33
  2.2 Equally Likely Outcomes  34
  2.3 The Multiplication Counting Rule  35
  2.4 Permutations  37
  2.5 Combinations  39
  2.6 Arrangements of Non-Distinct Objects  42
  2.7 Playing Yahtzee  46
  2.8 Exercises  49
|
3 Conditional Probability  57
  3.1 Introduction: The Three Card Problem  57
  3.2 In Everyday Life  60
  3.3 In a Two-Way Table  62
  3.4 Definition and the Multiplication Rule  65
  3.5 The Multiplication Rule under Independence  69
  3.6 Learning Using Bayes' Rule  75
  3.7 R Example: Learning about a Spinner  78
  3.8 Exercises  83
|
4 Discrete Distributions  97
  4.1 Introduction: The Hat Check Problem  97
  4.2 Random Variable and Probability Distribution  98
  4.3 Summarizing a Probability Distribution  102
  4.4 Standard Deviation of a Probability Distribution  104
  4.5 Coin-Tossing Distributions  110
    4.5.1 Binomial probabilities  111
    4.5.2 Binomial computations  115
    4.5.3 Mean and standard deviation of a binomial  117
    4.5.4 Negative binomial experiments  118
  4.6 Exercises  121
|
5 Continuous Distributions  137
  5.1 Introduction: A Baseball Spinner Game  137
  5.2 The Uniform Distribution  139
  5.3 Probability Density: Waiting for a Bus  143
  5.4 The Cumulative Distribution Function  146
  5.5 Summarizing a Continuous Random Variable  149
  5.6 Normal Distribution  151
  5.7 Binomial Probabilities and the Normal Curve  157
  5.8 Sampling Distribution of the Mean  161
  5.9 Exercises  169
|
6 Joint Probability Distributions  185
  6.1 Introduction  185
  6.2 Joint Probability Mass Function: Sampling from a Box  185
  6.3 Multinomial Experiments  191
  6.4 Joint Density Functions  195
  6.5 Independence and Measuring Association  200
  6.6 Flipping a Random Coin: The Beta-Binomial Distribution  202
  6.7 Bivariate Normal Distribution  205
  6.8 Exercises  209
|
7 Learning about a Binomial Probability  217
  7.1 Introduction: Thinking Subjectively about a Proportion  217
  7.2 Bayesian Inference with Discrete Priors  221
    7.2.1 Example: students' dining preference  221
    7.2.2 Discrete prior distributions for proportion p  221
    7.2.3 Likelihood of proportion p  224
    7.2.4 Posterior distribution for proportion p  225
    7.2.5 Inference: students' dining preference  227
    7.2.6 Discussion: using a discrete prior  228
  7.3 Continuous Priors  229
    7.3.1 The beta distribution and probabilities  231
    7.3.2 Choosing a beta density to represent prior opinion  234
  7.4 Updating the Beta Prior  237
    7.4.1 Bayes' rule calculation  238
    7.4.2 From beta prior to beta posterior: conjugate priors  239
  7.5 Bayesian Inferences with Continuous Priors  242
    7.5.1 Bayesian hypothesis testing  243
    7.5.2 Bayesian credible intervals  244
    7.5.3 Bayesian prediction  247
  7.6 Predictive Checking  250
  7.7 Exercises  256
|
8 Modeling Measurement and Count Data  267
  8.1 Introduction  267
  8.2 Modeling Measurements  267
    8.2.1 Examples  267
    8.2.2 The general approach  269
    8.2.3 Outline of the chapter  270
  8.3 Bayesian Inference with Discrete Priors  271
    8.3.1 Example: Roger Federer's time-to-serve  271
    8.3.2 Simplification of the likelihood  275
    8.3.3 Inference: Federer's time-to-serve  278
  8.4 Continuous Priors  278
    8.4.1 The normal prior for mean μ  278
    8.4.2 Choosing a normal prior  279
  8.5 Updating the Normal Prior  281
    8.5.1 Introduction  281
    8.5.2 A quick peek at the update procedure  282
    8.5.3 Bayes' rule calculation  285
    8.5.4 Conjugate normal prior  286
  8.6 Bayesian Inferences for Continuous Normal Mean  288
    8.6.1 Bayesian hypothesis testing and credible interval  288
    8.6.2 Bayesian prediction  290
  8.7 Posterior Predictive Checking  292
  8.8 Modeling Count Data  294
    8.8.1 Examples  295
    8.8.2 The Poisson distribution  296
    8.8.3 Bayesian inferences  297
    8.8.4 Case study: Learning about website counts  300
  8.9 Exercises  301
|
9 Simulation by Markov Chain Monte Carlo  313
  9.1 Introduction  313
    9.1.1 The Bayesian computation problem  313
    9.1.2 Choosing a prior  313
    9.1.3 The two-parameter normal problem  315
    9.1.4 Overview of the chapter  316
  9.2 Markov Chains  317
    9.2.1 Definition  317
    9.2.2 Some properties  318
    9.2.3 Simulating a Markov chain  319
  9.3 The Metropolis Algorithm  320
    9.3.1 Example: Walking on a number line  320
    9.3.2 The general algorithm  323
    9.3.3 A general function for the Metropolis algorithm  326
  9.4 Example: Cauchy-Normal Problem  326
    9.4.1 Choice of starting value and proposal region  327
    9.4.2 Collecting the simulated draws  329
  9.5 Gibbs Sampling  330
    9.5.1 Bivariate discrete distribution  330
    9.5.2 Beta-binomial sampling  332
    9.5.3 Normal sampling - both parameters unknown  333
  9.6 MCMC Inputs and Diagnostics  338
    9.6.1 Burn-in, starting values, and multiple chains  338
    9.6.2 Diagnostics  338
    9.6.3 Graphs and summaries  339
  9.7 Using JAGS  341
    9.7.1 Normal sampling model  342
    9.7.2 Multiple chains  345
    9.7.3 Posterior predictive checking  347
    9.7.4 Comparing two proportions  349
  9.8 Exercises  354
|
10 Bayesian Hierarchical Modeling  365
  10.1 Introduction  365
    10.1.1 Observations in groups  365
    10.1.2 Example: standardized test scores  366
    10.1.3 Separate estimates?  366
    10.1.4 Combined estimates?  367
    10.1.5 A two-stage prior leading to compromise estimates  367
  10.2 Hierarchical Normal Modeling  369
    10.2.1 Example: ratings of animation movies  369
    10.2.2 A hierarchical Normal model with random σ  370
    10.2.3 Inference through MCMC  374
  10.3 Hierarchical Beta-Binomial Modeling  381
    10.3.1 Example: Deaths after heart attacks  381
    10.3.2 A hierarchical beta-binomial model  381
    10.3.3 Inference through MCMC  385
  10.4 Exercises  393
|
11 Simple Linear Regression  409
  11.1 Introduction  409
  11.2 Example: Prices and Areas of House Sales  412
  11.3 A Simple Linear Regression Model  413
  11.4 A Weakly Informative Prior  414
  11.5 Posterior Analysis  415
  11.6 Inference through MCMC  416
  11.7 Bayesian Inferences with Simple Linear Regression  420
    11.7.1 Simulate fits from the regression model  420
    11.7.2 Learning about the expected response  421
    11.7.3 Prediction of future response  423
    11.7.4 Posterior predictive model checking  425
  11.8 Informative Prior  427
    11.8.1 Standardization  428
    11.8.2 Prior distributions  429
    11.8.3 Posterior Analysis  431
  11.9 A Conditional Means Prior  433
  11.10 Exercises  437
|
12 Bayesian Multiple Regression and Logistic Models  449
  12.1 Introduction  449
  12.2 Bayesian Multiple Linear Regression  449
    12.2.1 Example: expenditures of U.S. households  449
    12.2.2 A multiple linear regression model  451
    12.2.3 Weakly informative priors and inference through MCMC  453
    12.2.4 Prediction  457
  12.3 Comparing Regression Models  459
  12.4 Bayesian Logistic Regression  465
    12.4.1 Example: U.S. women labor participation  465
    12.4.2 A logistic regression model  467
    12.4.3 Conditional means priors and inference through MCMC  469
    12.4.4 Prediction  475
  12.5 Exercises  477
|
13 Case Studies  487
  13.1 Introduction  487
  13.2 Federalist Papers Study  488
    13.2.1 Introduction  488
    13.2.2 Data on word use  488
    13.2.3 Poisson density sampling  489
    13.2.4 Negative binomial sampling  491
    13.2.5 Comparison of rates for two authors  494
    13.2.6 Which words distinguish the two authors?  496
  13.3 Career Trajectories  497
    13.3.1 Introduction  497
    13.3.2 Measuring hitting performance in baseball  498
    13.3.3 A hitter's career trajectory  498
    13.3.4 Estimating a single trajectory  499
    13.3.5 Estimating many trajectories by a hierarchical model  502
  13.4 Latent Class Modeling  505
    13.4.1 Two classes of test takers  505
    13.4.2 A latent class model with two classes  509
    13.4.3 Disputed authorship of the Federalist Papers  514
  13.5 Exercises  517
|
14 Appendices  525
  14.1 Appendix A: The constant in the beta posterior  525
  14.2 Appendix B: The posterior predictive distribution  526
  14.3 Appendix C: Comparing Bayesian models  527

Bibliography  529
Index  531