Foreword  xv
Preface  xvii

1 Introduction  1
1.1 Probability an alternative to logic  1
1.2 A need for a new computing paradigm  5
1.3 A need for a new modeling methodology  5
1.4 A need for new inference algorithms  8
1.5 A need for a new programming language and new hardware  10
1.6 A place for numerous controversies  11
1.7 Running real programs as exercises  12

I Bayesian Programming Principles  15

2 Basic Concepts  17
2.1 Variable  18
2.2 Probability  18
2.3 The normalization postulate  19
2.4 Conditional probability  19
2.5 Variable conjunction  20
2.6 The conjunction postulate (Bayes theorem)  20
2.7 Syllogisms  21
2.8 The marginalization rule  22
2.9 Joint distribution and questions  23
2.10 Decomposition  25
2.11 Parametric forms  26
2.12 Identification  28
2.13 Specification = Variables + Decomposition + Parametric forms  29
2.14 Description = Specification + Identification  29
2.15 Question  29
2.16 Bayesian program = Description + Question  31
2.17  32

3 Incompleteness and Uncertainty  35
3.1 Observing a water treatment unit  35
3.1.1 The elementary water treatment unit  36
3.1.2 Experimentation and uncertainty  38
3.2 Lessons, comments, and notes  40
3.2.1 The effect of incompleteness  40
3.2.2 The effect of inaccuracy  41
3.2.3 Not taking into account the effect of ignored variables may lead to wrong decisions  42
3.2.4 From incompleteness to uncertainty  43

4 Description = Specification + Identification  47
4.1 Pushing objects and following contours  48
4.1.1 The Khepera robot  48
4.1.2 Pushing objects  49
4.1.3 Following contours  53
4.2 Description of a water treatment unit  56
4.2.1  56
4.2.2  59
4.2.3  59
4.2.4  60
4.3 Lessons, comments, and notes  60
4.3.1 Description = Specification + Identification  60
4.3.2 Specification = Variables + Decomposition + Forms  61
4.3.3 Learning is a means to transform incompleteness into uncertainty  62

5 The Importance of Conditional Independence  65
5.1 Water treatment center Bayesian model  65
5.2 Description of the water treatment center  66
5.2.1  66
5.2.2  70
5.2.3  71
5.3 Lessons, comments, and notes  71
5.3.1 Independence versus conditional independence  71
5.3.2 The importance of conditional independence  73

6 Bayesian Program = Description + Question  75
6.1 Water treatment center Bayesian model (end)  76
6.2 Forward simulation of a single unit  76
6.2.1  77
6.2.2  78
6.3 Forward simulation of the water treatment center  78
6.3.1  78
6.3.2  80
6.4 Control of the water treatment center  81
6.4.1  81
6.4.2  81
6.4.3  82
6.4.4  84
6.5  85
6.5.1  86
6.5.2  86
6.6 Lessons, comments, and notes  87
6.6.1 Bayesian Program = Description + Question  87
6.6.2 The essence of Bayesian inference  88
6.6.3 No inverse or direct problem  89
6.6.4 No ill-posed problem  89

II Bayesian Programming Cookbook  91

7 Information Fusion  93
7.1 "Naive" Bayes sensor fusion |
|
|
94 | (8) |
|
7.1.1 Statement of the problem |
|
|
94 | (1) |
|
|
94 | (2) |
|
7.1.3 Instance and results |
|
|
96 | (6) |
|
7.2 Relaxing the conditional independence fundamental hypothesis |
|
|
102 | (3) |
|
7.2.1 Statement of the problem |
|
|
102 | (1) |
|
|
103 | (1) |
|
7.2.3 Instance and results |
|
|
103 | (2) |
7.3 Classification  105
7.3.1 Statement of the problem  105
7.3.2 Bayesian program  106
7.3.3 Instance and results  106
7.4 Ancillary clues  108
7.4.1 Statement of the problem  108
7.4.2 Bayesian program  108
7.4.3 Instance and results  110
7.5 Sensor fusion with false alarm  113
7.5.1 Statement of the problem  113
7.5.2 Bayesian program  114
7.5.3 Instance and results  114
7.6 Inverse programming  116
7.6.1 Statement of the problem  116
7.6.2 Bayesian program  117
7.6.3 Instance and results  118

8 Bayesian Programming with Coherence Variables  121
8.1 Basic example with Boolean variables  122
8.1.1 Statement of the problem  122
8.1.2 Bayesian program  123
8.1.3 Instance and results  124
8.2 Basic example with discrete variables  125
8.2.1 Statement of the problem  125
8.2.2 Bayesian program  126
8.2.3 Instance and results  126
8.3 Checking the semantic of Λ  130
8.3.1 Statement of the problem  130
8.3.2 Bayesian program  130
8.3.3 Instance and results  131
8.4 Information fusion revisited using coherence variables  132
8.4.1 Statement of the problems  132
8.4.2 Bayesian program  135
8.4.3 Instance and results  135
8.5 Reasoning with soft evidence  141
8.5.1 Statement of the problem  141
8.5.2 Bayesian program  142
8.5.3 Instance and results  143
8.6  145
8.6.1 Statement of the problem  145
8.6.2 Bayesian program  145
8.6.3 Instance and results  146
8.7  147
8.7.1 Statement of the problem  147
8.7.2 Bayesian program  148
8.7.3 Instance and results  148

9 Bayesian Programming Subroutines  153
9.1  154
9.1.1 Statement of the problem  154
9.1.2 Bayesian program  156
9.1.3 Instance and results  156
9.2 Calling subroutines conditioned by values  159
9.2.1 Statement of the problem  159
9.2.2 Bayesian program  159
9.2.3 Instance and results  160
9.3 Water treatment center revisited (final)  162
9.3.1 Statement of the problem  162
9.3.2 Bayesian program  162
9.4 Fusion of subroutines  163
9.4.1 Statement of the problem  163
9.4.2 Bayesian program  163
9.5  165
9.5.1 Statement of the problem  165
9.5.2 Bayesian program  165
9.5.3 Instance and results  166

10 Bayesian Programming Conditional Statement  171
10.1 Bayesian if-then-else  172
10.1.1 Statement of the problem  172
10.1.2 Bayesian program  173
10.1.3 Instance and results  176
10.2 Behavior recognition  179
10.2.1 Statement of the problem  179
10.2.2 Bayesian program  179
10.2.3 Instance and results  179
10.3 Mixture of models and model recognition  180

11 Bayesian Programming Iteration  183
11.1  184
11.1.1 Statement of the problem  184
11.1.2 Bayesian program  184
11.1.3 Instance and results  185
11.2 Generic Bayesian filters  186
11.2.1 Statement of the problem  186
11.2.2 Bayesian program  186
11.2.3 Instance and results  188
11.3  191
11.3.1 Statement of the problem  191
11.3.2 Bayesian program  192
11.3.3 Instance and results  192

III Bayesian Programming Formalism and Algorithms  197

12 Bayesian Programming Formalism  199
12.1 Logical propositions  200
12.2 Probability of a proposition  200
12.3 Normalization and conjunction postulates  200
12.4 Disjunction rule for propositions  201
12.5 Discrete variables  201
12.6 Variable conjunction  202
12.7 Probability on variables  202
12.8 Conjunction rule for variables  202
12.9 Normalization rule for variables  203
12.10 Marginalization rule  203
12.11  203
12.12  204
12.13  204
12.14  206
12.15  206

13 Bayesian Models Revisited  209
13.1 General purpose probabilistic models  210
13.1.1 Graphical models and Bayesian networks  210
13.1.2 Recursive Bayesian estimation  213
13.1.3 Mixture models  217
13.1.4 Maximum entropy approaches  219
13.2 Engineering oriented probabilistic models  220
13.2.1 Sensor fusion  220
13.2.2 Classification  222
13.2.3 Pattern recognition  222
13.2.4 Sequence recognition  222
13.2.5 Markov localization  223
13.2.6 Markov decision processes  224
13.3 Cognitive oriented probabilistic models  225
13.3.1  226
13.3.2 Fusion, multimodality, conflicts  229
13.3.3 Modularity, hierarchies  235
13.3.4  241

14 Bayesian Inference Algorithms Revisited  247
14.1  248
14.2 Symbolic computation  250
14.2.1 Exact symbolic computation  250
14.2.2 Approximate symbolic computation  265
14.3 Numerical computation  266
14.3.1 Sampling high-dimensional distributions  267
14.3.2 Forward sampling  267
14.3.3 Importance sampling  268
14.3.4 Rejection sampling  268
14.3.5 Gibbs sampling  269
14.3.6 Metropolis algorithm  269
14.3.7 Numerical estimation of high-dimensional integrals  270
14.4 Approximate inference in ProBT  271
14.4.1 Approximation in computing marginalization  271
14.4.2 Approximation in sampling distributions  273
14.4.3 Approximation in computing MAP  274

15 Bayesian Learning Revisited  281
15.1 Parameter identification  282
15.1.1  282
15.1.2 Bayesian parametric estimation  283
15.1.3 Maximum likelihood (ML)  285
15.1.4 Bayesian estimator and conjugate laws  287
15.2 Expectation-Maximization (EM)  290
15.2.1 EM and classification  293
15.2.2  297
15.2.3  301
15.3 Learning structure of Bayesian networks  302
15.3.1 Directed minimum spanning tree algorithm: DMST  304
15.3.2 Score-based algorithms  305

IV Frequently Asked Questions - Frequently Argued Matters  309

16 Frequently Asked Questions and Frequently Argued Matters  311
16.1 Alternative Bayesian inference engines  312
16.2 Bayesian programming applications  313
16.3 Bayesian programming versus Bayesian networks  316
16.4 Bayesian programming versus Bayesian modeling  317
16.5 Bayesian programming versus possibility theories  318
16.6 Bayesian programming versus probabilistic programming  318
16.7 Computational complexity of Bayesian inference  319
16.8 Cox theorem  320
16.9 Discrete versus continuous variables  321
16.10 Incompleteness irreducibility  322
16.11 Maximum entropy principle justifications  324
16.12 Noise or ignorance?  326
16.13 Objectivism versus subjectivism controversy and the "mind projection fallacy"  326
16.14 Unknown distribution  329

17 Glossary  331
17.1  331
17.2  332
17.3  333
17.4  334
17.5  335
17.6 Conditional statement  335
17.7  336
17.8  336
17.9  337
17.10  337
17.11  337
17.12  338
17.13 Preliminary knowledge  338
17.14  339
17.15  339
17.16  340
17.17  340

Bibliography  341
Index  359