
E-book: Bayesian Programming

Pierre Bessière (CNRS, Grenoble, France), Emmanuel Mazer, Juan-Manuel Ahuactzin, Kamel Mekhnacha (Probayes Inc., Montbonnot, France)
  • Format: PDF+DRM
  • Price: 57.19 €*
  • * the price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you need to install special software to read it. You also need to create an Adobe ID; more information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, install Adobe Digital Editions (a free application designed specifically for reading e-books; not to be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

This book proposes Bayesian programming as a systematic and generic method for building subjective probabilistic models. The presentation is intended for non-specialists and assumes only a basic foundation in mathematics. The topics include incompleteness and uncertainty, the importance of conditional independence, Bayesian programming with coherence variables, the Bayesian programming conditional statement, and Bayesian inference algorithms revisited. Appendix-like final chapters contain frequently asked questions and frequently argued matters, and a glossary. Annotation ©2014 Ringgold, Inc., Portland, OR (protoview.com)

Probability as an Alternative to Boolean Logic
While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data.
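
To make the contrast concrete, the following minimal Python sketch (not taken from the book; the prior and the two likelihoods are invented for illustration) shows how Bayes' rule turns one uncertain clue about an email into a graded degree of belief, where Boolean logic would need complete and certain premises:

```python
# Illustrative assumptions, not values from the book.
p_spam = 0.4                 # prior: P(spam)
p_word_given_spam = 0.8      # likelihood: P(word appears | spam)
p_word_given_ham = 0.05      # likelihood: P(word appears | not spam)

# Marginal probability of observing the word at all.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

# Posterior belief by Bayes' theorem: P(spam | word appears).
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(f"P(spam | word) = {p_spam_given_word:.3f}")   # ~0.914
```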

Decision-Making Tools and Methods for Incomplete and Uncertain Data
Emphasizing probability as an alternative to Boolean logic, Bayesian Programming covers new methods to build probabilistic programs for real-world applications. Written by the team who designed and implemented an efficient probabilistic inference engine to interpret Bayesian programs, the book offers many Python examples that are also available on a supplementary website together with an interpreter that allows readers to experiment with this new approach to programming.
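
The book's own examples run on that inference engine; as a rough, hypothetical sketch of the pattern they follow, a Bayesian program pairs a description (a joint distribution built from a decomposition and parametric forms) with a question (a conditional distribution to be inferred). In plain Python, with made-up names and numbers:

```python
# Description: decomposition P(C, E) = P(C) * P(E | C), for a Boolean
# "cause" C and "effect" E. All probabilities are illustrative.
p_c = {True: 0.3, False: 0.7}                   # parametric form for P(C)
p_e_given_c = {True: {True: 0.9, False: 0.1},   # P(E | C = True)
               False: {True: 0.2, False: 0.8}}  # P(E | C = False)

def joint(c, e):
    """Joint probability defined by the decomposition."""
    return p_c[c] * p_e_given_c[c][e]

def question(e_obs):
    """Answer the question P(C | E = e_obs) by summing and normalizing."""
    scores = {c: joint(c, e_obs) for c in (True, False)}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(question(True))  # P(C | E = True): {True: ~0.66, False: ~0.34}
```

The same description can answer many different questions; only the conditioning changes.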

Principles and Modeling
Only requiring a basic foundation in mathematics, the first two parts of the book present a new methodology for building subjective probabilistic models. The authors introduce the principles of Bayesian programming and discuss good practices for probabilistic modeling. Numerous simple examples highlight the application of Bayesian modeling in different fields.
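
As a hedged illustration of one such practice, the identification step (fitting the free parameters of the parametric forms from experimental data, a recurring theme in the book) can be as simple as smoothed counting; the data and variable names below are invented for the demo:

```python
# Estimate the free parameter P(E = True | C) of a parametric form from
# (c, e) observations, with a +1 (Laplace) prior so unseen cases stay
# uncertain rather than impossible. Data is made up for this sketch.
data = [(True, True), (True, True), (True, False),
        (False, False), (False, False), (False, True)]

def identify(observations):
    counts = {True: [1, 1], False: [1, 1]}  # [e_true, e_false], smoothed
    for c, e in observations:
        counts[c][0 if e else 1] += 1
    return {c: n_true / (n_true + n_false)
            for c, (n_true, n_false) in counts.items()}

print(identify(data))  # estimated P(E = True | C): {True: 0.6, False: 0.4}
```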

Formalism and Algorithms
The third part synthesizes existing work on Bayesian inference algorithms since an efficient Bayesian inference engine is needed to automate the probabilistic calculus in Bayesian programs. Many bibliographic references are included for readers who would like more details on the formalism of Bayesian programming, the main probabilistic models, general purpose algorithms for Bayesian inference, and learning problems.
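
For example, rejection sampling, one of the numerical methods surveyed in Chapter 14, approximates a conditional distribution by drawing samples from the joint distribution and keeping only those consistent with the evidence. The toy two-variable model below is an illustrative assumption, not one of the book's examples:

```python
import random

def sample_joint():
    """Draw one (c, e) sample from the joint P(C) * P(E | C)."""
    c = random.random() < 0.3                  # P(C = True) = 0.3
    e = random.random() < (0.9 if c else 0.2)  # P(E = True | C)
    return c, e

def rejection_query(e_obs, n=100_000):
    """Monte Carlo estimate of P(C = True | E = e_obs)."""
    kept = [c for c, e in (sample_joint() for _ in range(n)) if e == e_obs]
    return sum(kept) / len(kept)

random.seed(0)
print(rejection_query(True))  # close to the exact value of about 0.66
```

With enough samples the estimate converges to the exact posterior, at a cost that grows as the observed evidence becomes less probable.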

FAQs
Along with a glossary, the fourth part contains answers to frequently asked questions. The authors compare Bayesian programming and possibility theories, discuss the computational complexity of Bayesian inference, cover the irreducibility of incompleteness, and address the subjectivist versus objectivist epistemology of probability.

The First Steps toward a Bayesian Computer
A new modeling methodology, new inference algorithms, new programming languages, and new hardware are all needed to create a complete Bayesian computing framework. Focusing on the methodology and algorithms, this book describes the first steps toward reaching that goal. It encourages readers to explore emerging areas, such as bio-inspired computing, and develop new programming languages and hardware architectures.

Reviews

"Bayesian Programming comprises a methodology, a programming language, and a set of tools for developing and applying complex models. The approach is described in great detail, with many worked examples backed up by an online code repository. Unlike other books that tend to focus almost entirely on mathematics, this one gives equal time to conceptual and methodological guidance for the model-builder. It grapples with the knotty problems that arise in practice, some of which do not yet have clear solutions." From the Foreword by Stuart Russell, University of California, Berkeley

"The book has many worked examples backed up by an online code repository. The book provides a contibution on conceptual and methodological guidelines for model-builders. The authors discuss the problem how to build a Bayesian computer. The book has an excellent bibliography." Nirode C. Mohanty, in Zentralblatt MATH 1281

Foreword xv
Preface xvii
1 Introduction 1(14)
1.1 Probability an alternative to logic 1(4)
1.2 A need for a new computing paradigm 5(1)
1.3 A need for a new modeling methodology 5(3)
1.4 A need for new inference algorithms 8(2)
1.5 A need for a new programming language and new hardware 10(1)
1.6 A place for numerous controversies 11(1)
1.7 Running real programs as exercises 12(3)
I Bayesian Programming Principles 15(76)
2 Basic Concepts 17(18)
2.1 Variable 18(1)
2.2 Probability 18(1)
2.3 The normalization postulate 19(1)
2.4 Conditional probability 19(1)
2.5 Variable conjunction 20(1)
2.6 The conjunction postulate (Bayes theorem) 20(1)
2.7 Syllogisms 21(1)
2.8 The marginalization rule 22(1)
2.9 Joint distribution and questions 23(2)
2.10 Decomposition 25(1)
2.11 Parametric forms 26(2)
2.12 Identification 28(1)
2.13 Specification = Variables + Decomposition + Parametric forms 29(1)
2.14 Description = Specification + Identification 29(1)
2.15 Question 29(2)
2.16 Bayesian program = Description + Question 31(1)
2.17 Results 32(3)
3 Incompleteness and Uncertainty 35(12)
3.1 Observing a water treatment unit 35(5)
3.1.1 The elementary water treatment unit 36(2)
3.1.2 Experimentation and uncertainty 38(2)
3.2 Lessons, comments, and notes 40(7)
3.2.1 The effect of incompleteness 40(1)
3.2.2 The effect of inaccuracy 41(1)
3.2.3 Not taking into account the effect of ignored variables may lead to wrong decisions 42(1)
3.2.4 From incompleteness to uncertainty 43(4)
4 Description = Specification + Identification 47(18)
4.1 Pushing objects and following contours 48(8)
4.1.1 The Khepera robot 48(1)
4.1.2 Pushing objects 49(4)
4.1.3 Following contours 53(3)
4.2 Description of a water treatment unit 56(4)
4.2.1 Specification 56(3)
4.2.2 Identification 59(1)
4.2.3 Bayesian program 59(1)
4.2.4 Results 60(1)
4.3 Lessons, comments, and notes 60(5)
4.3.1 Description = Specification + Identification 60(1)
4.3.2 Specification = Variables + Decomposition + Forms 61(1)
4.3.3 Learning is a means to transform incompleteness into uncertainty 62(3)
5 The Importance of Conditional Independence 65(10)
5.1 Water treatment center Bayesian model 65(1)
5.2 Description of the water treatment center 66(5)
5.2.1 Specification 66(4)
5.2.2 Identification 70(1)
5.2.3 Bayesian program 71(1)
5.3 Lessons, comments, and notes 71(4)
5.3.1 Independence versus conditional independence 71(2)
5.3.2 The importance of conditional independence 73(2)
6 Bayesian Program = Description + Question 75(16)
6.1 Water treatment center Bayesian model (end) 76(1)
6.2 Forward simulation of a single unit 76(2)
6.2.1 Question 77(1)
6.2.2 Results 78(1)
6.3 Forward simulation of the water treatment center 78(3)
6.3.1 Question 78(2)
6.3.2 Results 80(1)
6.4 Control of the water treatment center 81(4)
6.4.1 Question (1) 81(1)
6.4.2 Results (1) 81(1)
6.4.3 Question (2) 82(2)
6.4.4 Results (2) 84(1)
6.5 Diagnosis 85(2)
6.5.1 Question 86(1)
6.5.2 Results 86(1)
6.6 Lessons, comments, and notes 87(6)
6.6.1 Bayesian Program = Description + Question 87(1)
6.6.2 The essence of Bayesian inference 88(1)
6.6.3 No inverse or direct problem 89(1)
6.6.4 No ill-posed problem 89(2)
II Bayesian Programming Cookbook 91(106)
7 Information Fusion 93(28)
7.1 "Naive" Bayes sensor fusion 94(8)
7.1.1 Statement of the problem 94(1)
7.1.2 Bayesian program 94(2)
7.1.3 Instance and results 96(6)
7.2 Relaxing the conditional independence fundamental hypothesis 102(3)
7.2.1 Statement of the problem 102(1)
7.2.2 Bayesian program 103(1)
7.2.3 Instance and results 103(2)
7.3 Classification 105(3)
7.3.1 Statement of the problem 105(1)
7.3.2 Bayesian program 106(1)
7.3.3 Instance and results 106(2)
7.4 Ancillary clues 108(5)
7.4.1 Statement of the problem 108(1)
7.4.2 Bayesian program 108(2)
7.4.3 Instance and results 110(3)
7.5 Sensor fusion with false alarm 113(3)
7.5.1 Statement of the problem 113(1)
7.5.2 Bayesian program 114(1)
7.5.3 Instance and results 114(2)
7.6 Inverse programming 116(5)
7.6.1 Statement of the problem 116(1)
7.6.2 Bayesian program 117(1)
7.6.3 Instance and results 118(3)
8 Bayesian Programming with Coherence Variables 121(32)
8.1 Basic example with Boolean variables 122(3)
8.1.1 Statement of the problem 122(1)
8.1.2 Bayesian program 123(1)
8.1.3 Instance and results 124(1)
8.2 Basic example with discrete variables 125(5)
8.2.1 Statement of the problem 125(1)
8.2.2 Bayesian program 126(1)
8.2.3 Instance and results 126(4)
8.3 Checking the semantic of Λ 130(2)
8.3.1 Statement of the problem 130(1)
8.3.2 Bayesian program 130(1)
8.3.3 Instance and results 131(1)
8.4 Information fusion revisited using coherence variables 132(9)
8.4.1 Statement of the problems 132(3)
8.4.2 Bayesian program 135(1)
8.4.3 Instance and results 135(6)
8.5 Reasoning with soft evidence 141(4)
8.5.1 Statement of the problem 141(1)
8.5.2 Bayesian program 142(1)
8.5.3 Instance and results 143(2)
8.6 Switch 145(2)
8.6.1 Statement of the problem 145(1)
8.6.2 Bayesian program 145(1)
8.6.3 Instance and results 146(1)
8.7 Cycles 147(6)
8.7.1 Statement of the problem 147(1)
8.7.2 Bayesian program 148(1)
8.7.3 Instance and results 148(5)
9 Bayesian Programming Subroutines 153(18)
9.1 The sprinkler model 154(5)
9.1.1 Statement of the problem 154(2)
9.1.2 Bayesian program 156(1)
9.1.3 Instance and results 156(3)
9.2 Calling subroutines conditioned by values 159(3)
9.2.1 Statement of the problem 159(1)
9.2.2 Bayesian program 159(1)
9.2.3 Instance and results 160(2)
9.3 Water treatment center revisited (final) 162(1)
9.3.1 Statement of the problem 162(1)
9.3.2 Bayesian program 162(1)
9.4 Fusion of subroutines 163(2)
9.4.1 Statement of the problem 163(1)
9.4.2 Bayesian program 163(2)
9.5 Superposition 165(6)
9.5.1 Statement of the problem 165(1)
9.5.2 Bayesian program 165(1)
9.5.3 Instance and results 166(5)
10 Bayesian Programming Conditional Statement 171(12)
10.1 Bayesian if-then-else 172(7)
10.1.1 Statement of the problem 172(1)
10.1.2 Bayesian program 173(3)
10.1.3 Instance and results 176(3)
10.2 Behavior recognition 179(1)
10.2.1 Statement of the problem 179(1)
10.2.2 Bayesian program 179(1)
10.2.3 Instance and results 179(1)
10.3 Mixture of models and model recognition 180(3)
11 Bayesian Programming Iteration 183(14)
11.1 Generic iteration 184(2)
11.1.1 Statement of the problem 184(1)
11.1.2 Bayesian program 184(1)
11.1.3 Instance and results 185(1)
11.2 Generic Bayesian filters 186(5)
11.2.1 Statement of the problem 186(1)
11.2.2 Bayesian program 186(2)
11.2.3 Instance and results 188(3)
11.3 Markov localization 191(8)
11.3.1 Statement of the problem 191(1)
11.3.2 Bayesian program 192(1)
11.3.3 Instance and results 192(5)
III Bayesian Programming Formalism and Algorithms 197(112)
12 Bayesian Programming Formalism 199(10)
12.1 Logical propositions 200(1)
12.2 Probability of a proposition 200(1)
12.3 Normalization and conjunction postulates 200(1)
12.4 Disjunction rule for propositions 201(1)
12.5 Discrete variables 201(1)
12.6 Variable conjunction 202(1)
12.7 Probability on variables 202(1)
12.8 Conjunction rule for variables 202(1)
12.9 Normalization rule for variables 203(1)
12.10 Marginalization rule 203(1)
12.11 Bayesian program 203(1)
12.12 Description 204(1)
12.13 Specification 204(2)
12.14 Questions 206(1)
12.15 Inference 206(3)
13 Bayesian Models Revisited 209(38)
13.1 General purpose probabilistic models 210(10)
13.1.1 Graphical models and Bayesian networks 210(3)
13.1.2 Recursive Bayesian estimation 213(4)
13.1.3 Mixture models 217(2)
13.1.4 Maximum entropy approaches 219(1)
13.2 Engineering oriented probabilistic models 220(5)
13.2.1 Sensor fusion 220(2)
13.2.2 Classification 222(1)
13.2.3 Pattern recognition 222(1)
13.2.4 Sequence recognition 222(1)
13.2.5 Markov localization 223(1)
13.2.6 Markov decision processes 224(1)
13.3 Cognitive oriented probabilistic models 225(22)
13.3.1 Ambiguities 226(3)
13.3.2 Fusion, multimodality, conflicts 229(6)
13.3.3 Modularity, hierarchies 235(6)
13.3.4 Loops 241(6)
14 Bayesian Inference Algorithms Revisited 247(34)
14.1 Stating the problem 248(2)
14.2 Symbolic computation 250(16)
14.2.1 Exact symbolic computation 250(15)
14.2.2 Approximate symbolic computation 265(1)
14.3 Numerical computation 266(5)
14.3.1 Sampling high-dimensional distributions 267(1)
14.3.2 Forward sampling 267(1)
14.3.3 Importance sampling 268(1)
14.3.4 Rejection sampling 268(1)
14.3.5 Gibbs sampling 269(1)
14.3.6 Metropolis algorithm 269(1)
14.3.7 Numerical estimation of high-dimensional integrals 270(1)
14.4 Approximate inference in ProBT 271(10)
14.4.1 Approximation in computing marginalization 271(2)
14.4.2 Approximation in sampling distributions 273(1)
14.4.3 Approximation in computing MAP 274(7)
15 Bayesian Learning Revisited 281(28)
15.1 Parameter identification 282(8)
15.1.1 Problem statement 282(1)
15.1.2 Bayesian parametric estimation 283(2)
15.1.3 Maximum likelihood (ML) 285(2)
15.1.4 Bayesian estimator and conjugate laws 287(3)
15.2 Expectation-Maximization (EM) 290(12)
15.2.1 EM and classification 293(4)
15.2.2 EM and HMM 297(4)
15.2.3 Model selection 301(1)
15.3 Learning structure of Bayesian networks 302(9)
15.3.1 Directed minimum spanning tree algorithm: DMST 304(1)
15.3.2 Score-based algorithms 305(4)
IV Frequently Asked Questions - Frequently Argued Matters 309(32)
16 Frequently Asked Questions and Frequently Argued Matters 311(20)
16.1 Alternative Bayesian inference engines 312(1)
16.2 Bayesian programming applications 313(3)
16.3 Bayesian programming versus Bayesian networks 316(1)
16.4 Bayesian programming versus Bayesian modeling 317(1)
16.5 Bayesian programming versus possibility theories 318(1)
16.6 Bayesian programming versus probabilistic programming 318(1)
16.7 Computational complexity of Bayesian inference 319(1)
16.8 Cox theorem 320(1)
16.9 Discrete versus continuous variables 321(1)
16.10 Incompleteness irreducibility 322(2)
16.11 Maximum entropy principle justifications 324(2)
16.12 Noise or ignorance? 326(1)
16.13 Objectivism versus subjectivism controversy and the "mind projection fallacy" 326(3)
16.14 Unknown distribution 329(2)
17 Glossary 331(10)
17.1 Bayesian filter 331(1)
17.2 Bayesian inference 332(1)
17.3 Bayesian network 333(1)
17.4 Bayesian program 334(1)
17.5 Coherence variable 335(1)
17.6 Conditional statement 335(1)
17.7 Decomposition 336(1)
17.8 Description 336(1)
17.9 Forms 337(1)
17.10 Incompleteness 337(1)
17.11 Mixture 337(1)
17.12 Noise 338(1)
17.13 Preliminary knowledge 338(1)
17.14 Question 339(1)
17.15 Specification 339(1)
17.16 Subroutines 340(1)
17.17 Variable 340(1)
Bibliography 341(18)
Index 359
Pierre Bessiere is with CNRS, the French National Centre for Scientific Research. Juan-Manuel Ahuactzin, Kamel Mekhnacha, and Emmanuel Mazer are with Probayes Inc., France.