
E-book: Introduction to Bayesian Inference, Methods and Computation

  • Format: EPUB+DRM
  • Publication date: 17-Oct-2021
  • Publisher: Springer Nature Switzerland AG
  • Language: English
  • ISBN-13: 9783030828080
  • Price: 67,91 €*
  • * The price is final, i.e. no further discounts apply.
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste): not allowed

  • Printing: not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software to read it. You also need to create an Adobe ID (more information here). The e-book can be read by 1 user and downloaded to up to 6 devices (all authorised with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

These lecture notes provide a rapid, accessible introduction to Bayesian statistical methods. The course ranges from the fundamental philosophy and principles of Bayesian inference, including the reasoning behind the prior/likelihood model construction that characterises Bayesian methods, through to advanced topics such as nonparametrics, Gaussian processes and latent factor models. These advanced modelling techniques can be applied directly using the Python and Stan code samples integrated into the main text. Importantly, the reader will learn methods for assessing model fit and for choosing between rival modelling approaches.
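
The description above notes that the book's modelling techniques are accompanied by Python and Stan code samples integrated into the text. As a flavour of the prior-to-posterior reasoning involved, the following minimal Python sketch (an illustration written for this listing, not an excerpt from the book) carries out a conjugate Beta-Binomial update; the prior hyperparameters and the observed data are illustrative assumptions.

    # Conjugate Beta-Binomial update: a Beta(a, b) prior on a success probability
    # combined with Binomial data gives a Beta posterior in closed form.
    import numpy as np
    from scipy import stats

    # Illustrative assumptions (not from the book): a Beta(2, 2) prior,
    # and 7 observed successes out of 10 trials.
    a_prior, b_prior = 2.0, 2.0
    successes, trials = 7, 10

    # Prior-to-posterior update via conjugacy.
    a_post = a_prior + successes
    b_post = b_prior + (trials - successes)
    posterior = stats.beta(a_post, b_post)

    # Posterior summaries: mean and a 95% equal-tailed credible interval.
    print("Posterior mean:", posterior.mean())
    print("95% credible interval:", posterior.interval(0.95))

    # Simple Monte Carlo check of the analytic mean, in the spirit of Chapter 5.
    rng = np.random.default_rng(0)
    samples = posterior.rvs(size=10_000, random_state=rng)
    print("Monte Carlo estimate of the mean:", samples.mean())

The same posterior could equally be explored by sampling, for example with Stan via PyStan, or with PyMC, the packages surveyed in Chapter 6.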

Table of contents

1 Uncertainty and Decisions
  1.1 Subjective Uncertainty and Possibilities
    1.1.1 Subjectivism
    1.1.2 Subjective Uncertainty
    1.1.3 Possible Outcomes and Events
  1.2 Decisions: Actions, Outcomes, Consequences
    1.2.1 Elements of a Decision Problem
    1.2.2 Preferences on Actions
  1.3 Subjective Probability
    1.3.1 Standard Events
    1.3.2 Equivalent Standard Events
    1.3.3 Definition of Subjective Probability
    1.3.4 Contrast with Frequentist Probability
    1.3.5 Conditional Probability
    1.3.6 Updating Beliefs: Bayes Theorem
  1.4 Utility
    1.4.1 Principle of Maximising Expected Utility
    1.4.2 Utilities for Bounded Decision Problems
    1.4.3 Utilities for Unbounded Decision Problems
    1.4.4 Randomised Strategies
    1.4.5 Conditional Probability as a Consequence of Coherence
  1.5 Estimation and Prediction
    1.5.1 Continuous Random Variables and Decision Spaces
    1.5.2 Estimation and Loss Functions
    1.5.3 Prediction
2 Prior and Likelihood Representation
  2.1 Exchangeability and Infinite Exchangeability
  2.2 De Finetti's Representation Theorem
  2.3 Prior, Likelihood and Posterior
    2.3.1 Prior Elicitation
    2.3.2 Non-informative Priors
    2.3.3 Hyperpriors
    2.3.4 Mixture Priors
    2.3.5 Bayesian Paradigm for Prior to Posterior Reporting
    2.3.6 Asymptotic Consistency
    2.3.7 Asymptotic Normality
3 Graphical Modelling and Hierarchical Models
  3.1 Graphs
    3.1.1 Specifying a Graph
    3.1.2 Neighbourhoods of Graph Nodes
    3.1.3 Paths, Cycles and Directed Acyclic Graphs
    3.1.4 Cliques and Separation
  3.2 Graphical Models
    3.2.1 Belief Networks
    3.2.2 Markov Networks
    3.2.3 Factor Graphs
  3.3 Hierarchical Models
4 Parametric Models
  4.1 Parametric Modelling
  4.2 Conjugate Models
  4.3 Exponential Families
  4.4 Non-conjugate Models
  4.5 Posterior Summaries for Parametric Models
    4.5.1 Marginal Distributions
    4.5.2 Credible Regions
5 Computational Inference
  5.1 Intractable Integrals in Bayesian Inference
  5.2 Monte Carlo Estimation
    5.2.1 Standard Error
    5.2.2 Estimation Under a Loss Function
    5.2.3 Importance Sampling
    5.2.4 Normalising Constant Estimation
  5.3 Markov Chain Monte Carlo
    5.3.1 Technical Requirements of Markov Chains in MCMC
    5.3.2 Gibbs Sampling
    5.3.3 Metropolis-Hastings Algorithm
  5.4 Hamiltonian Markov Chain Monte Carlo
  5.5 Analytic Approximations
    5.5.1 Normal Approximation
    5.5.2 Laplace Approximations
    5.5.3 Variational Inference
  5.6 Further Topics
6 Bayesian Software Packages
  6.1 Illustrative Statistical Model
  6.2 Stan
    6.2.1 PyStan
  6.3 Other Software Libraries
    6.3.1 PyMC
    6.3.2 Edward
7 Criticism and Model Choice
  7.1 Model Uncertainty
  7.2 Model Averaging
  7.3 Model Selection
    7.3.1 Selecting From a Set of Models
    7.3.2 Pairwise Comparisons: Bayes Factors
    7.3.3 Bayesian Information Criterion
  7.4 Posterior Predictive Checking
    7.4.1 Posterior Predictive p-Values
    7.4.2 Monte Carlo Estimation
    7.4.3 PPC with Stan
8 Linear Models
  8.1 Parametric Regression
  8.2 Bayes Linear Model
    8.2.1 Conjugate Prior
    8.2.2 Reference Prior
  8.3 Generalisation of the Linear Model
    8.3.1 General Basis Functions
  8.4 Generalised Linear Models
    8.4.1 Poisson Regression
    8.4.2 Logistic Regression
9 Nonparametric Models
  9.1 Random Probability Measures
  9.2 Dirichlet Processes
    9.2.1 Discrete Base Measure
  9.3 Polya Trees
    9.3.1 Continuous Random Measures
  9.4 Partition Models
    9.4.1 Partition Models: Bayesian Histograms
    9.4.2 Bayesian Histograms with Equal Bin Widths
10 Nonparametric Regression
  10.1 Nonparametric Regression Modelling
  10.2 Gaussian Processes
    10.2.1 Normal Errors
    10.2.2 Inference
  10.3 Spline Models
    10.3.1 Spline Regression with Equally Spaced Knots
  10.4 Partition Regression Models
    10.4.1 Changepoint Models
    10.4.2 Classification and Regression Trees
11 Clustering and Latent Factor Models
  11.1 Mixture Models
    11.1.1 Finite Mixture Models
    11.1.2 Dirichlet Process Mixture Models
  11.2 Mixed-Membership Models
    11.2.1 Latent Dirichlet Allocation
    11.2.2 Hierarchical Dirichlet Processes
  11.3 Latent Factor Models
    11.3.1 Stan Implementation
Appendix A Conjugate Parametric Models
Appendix B Solutions to Exercises
Glossary
References
Index

Professor Nick Heard received his PhD from the Department of Mathematics at Imperial College London in 2001 and currently holds the position of Chair in Statistics at Imperial. His research interests include developing statistical models for cyber-security applications, finding community structure in large dynamic networks, and clustering and changepoint analysis, in each case using computational Bayesian methods.