About the Author  xxiii
To the Instructor  xxv
To the Reader  xxxi

I Fundamentals of Probability  1
|
1 Basic Probability Models  3
1.1 Example: Bus Ridership  3
1.2 A "Notebook" View: the Notion of a Repeatable Experiment  4
1.2.1 Theoretical Approaches  5
1.2.2 A More Intuitive Approach  5
1.3 Our Definitions  7
1.4 "Mailing Tubes"  11
1.5 Example: Bus Ridership Model (cont'd.)  11
1.6 Example: ALOHA Network  14
1.6.1 ALOHA Network Model Summary  16
1.6.2 ALOHA Network Computations  16
1.7 ALOHA in the Notebook Context  19
1.8 Example: A Simple Board Game  20
1.9 Bayes' Rule  23
1.9.1 General Principle  23
1.9.2 Example: Document Classification  23
1.10 Random Graph Models  24
1.10.1 Example: Preferential Attachment Model  25
1.11 Combinatorics-Based Computation  26
1.11.1 Which Is More Likely in Five Cards, One King or Two Hearts?  26
1.11.2 Example: Random Groups of Students  27
1.11.3 Example: Lottery Tickets  27
1.11.4 Example: Gaps between Numbers  28
1.11.5 Multinomial Coefficients  29
1.11.6 Example: Probability of Getting Four Aces in a Bridge Hand  30
1.12 Exercises  31

2 Monte Carlo Simulation  35
2.1 Example: Rolling Dice  35
2.1.1 First Improvement  36
2.1.2 Second Improvement  37
2.1.3 Third Improvement  38
2.2 Example: Dice Problem  39
2.3 Use of runif() for Simulating Events  39
2.4 Example: Bus Ridership (cont'd.)  40
2.5 Example: Board Game (cont'd.)  40
2.6 Example: Broken Rod  41
2.7 How Long Should We Run the Simulation?  42
2.8 Computational Complements  42
2.8.1 More on the replicate() Function  42
2.9 Exercises  43

3 Discrete Random Variables: Expected Value  45
3.1 Random Variables  45
3.2 Discrete Random Variables  46
3.3 Independent Random Variables  46
3.4 Example: The Monty Hall Problem  47
3.5 Expected Value  50
3.5.1 Generality - Not Just for Discrete Random Variables  50
3.5.2 Misnomer  50
3.5.3 Definition and Notebook View  50
3.6 Properties of Expected Value  51
3.6.1 Computational Formula  51
3.6.2 Further Properties of Expected Value  54
3.7 Example: Bus Ridership  58
3.8 Example: Predicting Product Demand  58
3.9 Expected Values via Simulation  59
3.10 Casinos, Insurance Companies and "Sum Users," Compared to Others  60
3.11 Mathematical Complements  61
3.11.1 Proof of Property E  61
3.12 Exercises  62

4 Discrete Random Variables: Variance  65
4.1 Variance  65
4.1.1 Definition  65
4.1.2 Central Importance of the Concept of Variance  69
4.1.3 Intuition Regarding the Size of Var(X)  69
4.1.3.1 Chebychev's Inequality  69
4.1.3.2 The Coefficient of Variation  70
4.2 A Useful Fact  71
4.3 Covariance  72
4.4 Indicator Random Variables, and Their Means and Variances  74
4.4.1 Example: Return Time for Library Books, Version I  75
4.4.2 Example: Return Time for Library Books, Version II  76
4.4.3 Example: Indicator Variables in a Committee Problem  77
4.5 Skewness  79
4.6 Mathematical Complements  79
4.6.1 Proof of Chebychev's Inequality  79
4.7 Exercises  81

5 Discrete Parametric Distribution Families  83
5.1 Distributions  83
5.1.1 Example: Toss Coin Until First Head  84
5.1.2 Example: Sum of Two Dice  85
5.1.3 Example: Watts-Strogatz Random Graph Model  85
5.1.3.1 The Model  85
5.2 Parametric Families of Distributions  86
5.3 The Case of Importance to Us: Parametric Families of pmfs  86
5.4 Distributions Based on Bernoulli Trials  88
5.4.1 The Geometric Family of Distributions  88
5.4.1.1 R Functions  91
5.4.1.2 Example: A Parking Space Problem  92
5.4.2 The Binomial Family of Distributions  94
5.4.2.1 R Functions  95
5.4.2.2 Example: Parking Space Model  96
5.4.3 The Negative Binomial Family of Distributions  96
|
97 | (1) |
|
5.4.3.2 Example: Backup Batteries  98
5.5 Two Major Non-Bernoulli Models  98
5.5.1 The Poisson Family of Distributions  99
5.5.1.1 R Functions  99
5.5.1.2 Example: Broken Rod  100
5.5.2 The Power Law Family of Distributions  100
5.5.2.1 The Model  100
5.5.3 Fitting the Poisson and Power Law Models to Data  102
5.5.3.1 …  102
5.5.3.2 Straight-Line Graphical Test for the Power Law  103
5.5.3.3 Example: DNC E-mail Data  103
5.6 Further Examples  106
5.6.1 Example: The Bus Ridership Problem  106
5.6.2 Example: Analysis of Social Networks  107
5.7 Computational Complements  108
5.7.1 Graphics and Visualization in R  108
5.8 Exercises  109

6 Continuous Probability Models  113
6.1 A Random Dart  113
6.2 Individual Values Now Have Probability Zero  114
6.3 But Now We Have a Problem  115
6.4 Our Way Out of the Problem: Cumulative Distribution Functions  115
6.4.1 CDFs  115
6.4.2 Non-Discrete, Non-Continuous Distributions  119
6.5 Density Functions  119
6.5.1 Properties of Densities  120
6.5.2 Intuitive Meaning of Densities  122
6.5.3 Expected Values  122
6.6 A First Example  123
6.7 Famous Parametric Families of Continuous Distributions  124
6.7.1 The Uniform Distributions  125
6.7.1.1 Density and Properties  125
6.7.1.2 R Functions  125
6.7.1.3 Example: Modeling of Disk Performance  126
6.7.1.4 Example: Modeling of Denial-of-Service Attack  126
6.7.2 The Normal (Gaussian) Family of Continuous Distributions  127
6.7.2.1 Density and Properties  127
6.7.2.2 R Functions  127
6.7.2.3 Importance in Modeling  128
6.7.3 The Exponential Family of Distributions  128
6.7.3.1 Density and Properties  128
6.7.3.2 R Functions  128
6.7.3.3 Example: Garage Parking Fees  129
6.7.3.4 Memoryless Property of Exponential Distributions  130
6.7.3.5 Importance in Modeling  131
6.7.4 The Gamma Family of Distributions  131
6.7.4.1 Density and Properties  132
6.7.4.2 Example: Network Buffer  133
6.7.4.3 Importance in Modeling  133
6.7.5 The Beta Family of Distributions  134
6.7.5.1 Density Etc.  134
6.7.5.2 Importance in Modeling  138
6.8 Mathematical Complements  138
6.8.1 Hazard Functions  138
6.8.2 Duality of the Exponential Family with the Poisson Family  139
6.9 Computational Complements  141
6.9.1 R's integrate() Function  141
6.9.2 Inverse Method for Sampling from a Density  141
6.9.3 Sampling from a Poisson Distribution  142
6.10 Exercises  143

II Fundamentals of Statistics  147

7 Statistics: Prologue  149
7.1 Importance of This Chapter  150
7.2 Sampling Distributions  150
7.2.1 Random Samples  150
7.3 The Sample Mean - a Random Variable  152
7.3.1 Toy Population Example  152
7.3.2 Expected Value and Variance of X̄  153
7.3.3 Toy Population Example Again  154
7.3.4 …  155
7.3.5 …  155
7.4 Simple Random Sample Case  156
7.5 The Sample Variance  157
7.5.1 Intuitive Estimation of σ²  157
7.5.2 …  158
7.5.3 Special Case: X Is an Indicator Variable  158
7.6 To Divide by n or n-1?  159
7.6.1 Statistical Bias  159
7.7 The Concept of a "Standard Error"  161
7.8 Example: Pima Diabetes Study  162
7.9 Don't Forget: Sample ≠ Population!  164
7.10 …  164
7.10.1 …  164
7.10.2 Infinite Populations?  164
7.11 Observational Studies  165
7.12 Computational Complements  165
7.12.1 The *apply() Functions  165
7.12.1.1 R's apply() Function  166
7.12.1.2 The lapply() and sapply() Functions  166
7.12.1.3 The split() and tapply() Functions  167
7.12.2 Outliers/Errors in the Data  168
7.13 Exercises  170

8 Fitting Continuous Models  171
8.1 Why Fit a Parametric Model?  171
8.2 Model-Free Estimation of a Density from Sample Data  172
8.2.1 …  172
8.2.2 …  173
8.2.3 …  174
8.2.3.1 The Bias-Variance Tradeoff  175
8.2.3.2 The Bias-Variance Tradeoff in the Histogram Case  176
8.2.3.3 A General Issue: Choosing the Degree of Smoothing  178
8.3 Advanced Methods for Model-Free Density Estimation  180
8.4 Parameter Estimation  181
8.4.1 Method of Moments  181
8.4.2 …  182
8.4.3 The Method of Maximum Likelihood  183
8.4.4 Example: Humidity Data  185
8.5 MM vs. MLE  187
8.6 Assessment of Goodness of Fit  187
8.7 The Bayesian Philosophy  189
8.7.1 …  190
8.7.2 Arguments For and Against  190
8.8 Mathematical Complements  191
8.8.1 Details of Kernel Density Estimators  191
8.9 Computational Complements  192
8.9.1 …  192
8.9.2 …  193
8.9.2.1 The gmm() Function  193
8.9.2.2 Example: Bodyfat Data  193
8.10 Exercises  194

9 The Family of Normal Distributions  197
9.1 Density and Properties  197
9.1.1 Closure under Affine Transformation  198
9.1.2 Closure under Independent Summation  199
9.1.3 A Mystery  200
9.2 R Functions  200
9.3 The Standard Normal Distribution  200
9.4 Evaluating Normal cdfs  201
9.5 Example: Network Intrusion  202
9.6 Example: Class Enrollment Size  203
9.7 The Central Limit Theorem  204
9.7.1 Example: Cumulative Roundoff Error  205
9.7.2 Example: Coin Tosses  205
9.7.3 Example: Museum Demonstration  206
9.7.4 A Bit of Insight into the Mystery  207
9.8 X̄ Is Approximately Normal  207
9.8.1 Approximate Distribution of X̄  207
9.8.2 Improved Assessment of Accuracy of X̄  208
9.9 Importance in Modeling  209
9.10 The Chi-Squared Family of Distributions  210
9.10.1 Density and Properties  210
9.10.2 Example: Error in Pin Placement  211
9.10.3 Importance in Modeling  211
9.10.4 Relation to Gamma Family  212
9.11 Mathematical Complements  212
9.11.1 Convergence in Distribution, and the Precisely-Stated CLT  212
9.12 Computational Complements  213
9.12.1 Example: Generating Normal Random Numbers  213
9.13 Exercises  214

10 Introduction to Statistical Inference  217
10.1 The Role of Normal Distributions  217
10.2 Confidence Intervals for Means  218
10.2.1 Basic Formulation  218
10.3 Example: Pima Diabetes Study  220
10.4 Example: Humidity Data  221
10.5 Meaning of Confidence Intervals  221
10.5.1 A Weight Survey in Davis  221
10.6 Confidence Intervals for Proportions  223
10.6.1 Example: Machine Classification of Forest Covers  224
10.7 The Student-t Distribution  226
10.8 Introduction to Significance Tests  227
10.9 The Proverbial Fair Coin  228
10.10 The Basics  229
10.11 General Normal Testing  231
10.12 The Notion of "p-Values"  231
10.13 What's Random and What Is Not  232
10.14 Example: The Forest Cover Data  232
10.15 Problems with Significance Testing  234
10.15.1 History of Significance Testing  234
10.15.2 …  235
10.15.3 Alternative Approach  236
10.16 The Problem of "P-hacking"  237
10.16.1 A Thought Experiment  238
10.16.2 Multiple Inference Methods  238
10.17 Philosophy of Statistics  239
10.17.1 More about Interpretation of CIs  239
10.17.1.1 The Bayesian View of Confidence Intervals  241
10.18 Exercises  241

III Multivariate Analysis  243

11 Multivariate Distributions  245
11.1 Multivariate Distributions: Discrete  245
11.1.1 Example: Marbles in a Bag  245
11.2 Multivariate Distributions: Continuous  246
11.2.1 Motivation and Definition  246
11.2.2 Use of Multivariate Densities in Finding Probabilities and Expected Values  247
11.2.3 Example: Train Rendezvous  247
11.3 Measuring Co-variation  248
11.3.1 Covariance  248
11.3.2 Example: The Committee Example Again  250
11.3.3 Correlation  251
11.4 …  252
11.5 Sets of Independent Random Variables  252
11.5.1 Mailing Tubes  252
11.5.1.1 Expected Values Factor  253
11.5.1.2 Covariance Is 0  253
11.5.1.3 Variances Add  253
11.6 Matrix Formulations  254
11.6.1 Mailing Tubes: Mean Vectors  254
11.6.2 Covariance Matrices  254
11.6.3 Mailing Tubes: Covariance Matrices  255
11.7 Sample Estimate of Covariance Matrix  256
11.7.1 Example: Pima Data  257
11.8 Mathematical Complements  257
11.8.1 Convolution  257
11.8.1.1 Example: Backup Battery  258
11.8.2 Transform Methods  259
11.8.2.1 Generating Functions  259
11.8.2.2 Sums of Independent Poisson Random Variables Are Poisson Distributed  261
11.9 Exercises  262

12 The Multivariate Normal Family of Distributions  265
12.1 Densities of Multivariate Normals  265
12.2 Geometric Interpretation  266
12.3 R Functions  269
12.4 Special Case: New Variable Is a Single Linear Combination of a Random Vector  270
12.5 Properties of Multivariate Normal Distributions  270
12.6 The Multivariate Central Limit Theorem  272
12.7 Exercises  273

13 Mixture Distributions  275
13.1 Iterated Expectations  276
13.1.1 Conditional Distributions  277
13.1.2 The Theorem  277
13.1.3 Example: Flipping Coins with Bonuses  279
13.1.4 Conditional Expectation as a Random Variable  280
13.1.5 What about Variance?  280
13.2 A Closer Look at Mixture Distributions  281
13.2.1 Derivation of Mean and Variance  281
13.2.2 Estimation of Parameters  283
13.2.2.1 Example: Old Faithful Estimation  283
…  284
…  285

14 Multivariate Description and Dimension Reduction  287
14.1 What Is Overfitting Anyway?  288
14.1.1 "Desperate for Data"  288
14.1.2 Known Distribution  289
14.1.3 …  289
14.1.4 The Bias/Variance Tradeoff: Concrete Illustration  290
14.1.5 …  292
14.2 Principal Components Analysis  293
14.2.1 Intuition  293
14.2.2 Properties of PCA  295
14.2.3 Example: Turkish Teaching Evaluations  296
14.3 The Log-Linear Model  297
14.3.1 Example: Hair Color, Eye Color and Gender  297
14.3.2 Dimension of Our Data  299
14.3.3 Estimating the Parameters  299
14.4 Mathematical Complements  300
14.4.1 Statistical Derivation of PCA  300
14.5 Computational Complements  302
14.5.1 R Tables  302
14.5.2 Some Details on Log-Linear Models  302
14.5.2.1 Parameter Estimation  303
14.5.2.2 The loglin() Function  304
14.5.2.3 Informal Assessment of Fit  305
14.6 Exercises  306

15 Predictive Modeling  309
15.1 Example: Heritage Health Prize  309
15.2 The Goals: Prediction and Description  310
15.2.1 Terminology  310
15.3 What Does "Relationship" Mean?  311
15.3.1 Precise Definition  311
15.3.2 Parametric Models for the Regression Function m()  313
15.4 Estimation in Linear Parametric Regression Models  314
15.5 Example: Baseball Data  315
15.5.1 R Code  316
15.6 Multiple Regression: More Than One Predictor Variable  319
15.7 Example: Baseball Data (cont'd.)  320
15.8 Interaction Terms  321
15.9 Parametric Estimation  322
15.9.1 Meaning of "Linear"  322
15.9.2 Random-X and Fixed-X Regression  322
15.9.3 Point Estimates and Matrix Formulation  323
15.9.4 Approximate Confidence Intervals  326
15.10 Example: Baseball Data (cont'd.)  328
15.11 …  329
15.12 …  330
15.12.1 Classification = Regression  331
15.12.2 Logistic Regression  332
15.12.2.1 The Logistic Model: Motivations  332
15.12.2.2 Estimation and Inference for Logit  334
15.12.3 Example: Forest Cover Data  334
15.12.4 R Code  334
15.12.5 Analysis of the Results  335
15.12.5.1 Multiclass Case  336
15.13 Machine Learning: Neural Networks  336
15.13.1 Example: Predicting Vertebral Abnormalities  336
15.13.2 But What Is Really Going On?  339
15.13.3 …  339
15.14 Computational Complements  340
15.14.1 Computational Details in Section 15.5.1  340
15.14.2 More Regarding glm()  341
15.15 Exercises  342

16 Model Parsimony and Overfitting  343
16.1 What Is Overfitting?  343
16.1.1 Example: Histograms  343
16.1.2 Example: Polynomial Regression  344
16.2 Can Anything Be Done about It?  345
16.2.1 Cross-Validation  345
16.3 Predictor Subset Selection  346
16.4 Exercises  347

17 Introduction to Discrete Time Markov Chains  349
17.1 Matrix Formulation  350
17.2 Example: Die Game  351
17.3 Long-Run State Probabilities  352
17.3.1 Stationary Distribution  353
17.3.2 Calculation of π  354
17.3.3 Simulation Calculation of π  355
17.4 Example: 3-Heads-in-a-Row Game  356
17.5 Example: Bus Ridership Problem  358
17.6 Hidden Markov Models  359
17.6.1 Example: Bus Ridership  360
17.6.2 …  361
17.7 Google PageRank  361
17.8 Computational Complements  361
17.8.1 Initializing a Matrix to All 0s  361
17.9 Exercises  362

IV Appendices  365

A R Quick Start  367
A.1 Correspondences  367
A.2 Starting R  368
A.3 First Sample Programming Session  369
A.4 Vectorization  372
A.5 Second Sample Programming Session  372
A.6 Recycling  374
A.7 More on Vectorization  374
A.8 Default Argument Values  375
A.9 The R List Type  376
A.9.1 The Basics  376
A.9.2 S3 Classes  377
A.10 Data Frames  378
A.11 Online Help  380
A.12 Debugging in R  380

B Matrix Algebra  383
B.1 Terminology and Notation  383
B.1.1 Matrix Addition and Multiplication  383
B.2 Matrix Transpose  385
B.3 Linear Independence  385
B.4 Eigenvalues and Eigenvectors  385
B.5 Mathematical Complements  386
B.5.1 Matrix Derivatives  386

Bibliography  391
Index  395