Introduction to Regression Models  1 (25)
Period of Oscillation of a Pendulum  3 (1)
Salary of College Teachers  3 (2)
Urea Formaldehyde Foam Insulation  7 (2)
Important Reasons for Modeling  16 (1)
Data Plots and Empirical Modeling  17 (4)
An Iterative Model Building Approach  21 (1)
Objectives of the Analysis  27 (1)
Maximum Likelihood Estimation  27 (1)
Fitted Values, Residuals, and the Estimate of σ2  29 (2)
Consequences of the Least Squares Fit  30 (1)
Least Squares Calculations for the Hardness Example  31 (1)
Properties of Least Squares Estimates  31 (2)
Expected Values of Least Squares Estimates  32 (1)
Variances of Least Squares Estimates  32 (1)
Inferences about the Regression Parameters  33 (5)
Inference about μ0 = β0 + β1x0  36 (1)
Hardness Example Continued  37 (1)
Hardness Example Continued  40 (1)
Analysis of Variance Approach to Regression  41 (5)
Coefficient of Determination: R2  44 (1)
Hardness Example Continued  45 (1)
Regression through the Origin  51 (1)
Appendix: Univariate Distributions  53 (3)
A Review of Matrix Algebra and Important Results on Random Vectors  67 (20)
Matrix Approach to Simple Linear Regression  74 (2)
Vectors of Random Variables  76 (4)
The Multivariate Normal Distribution  80 (2)
Important Results on Quadratic Forms  82 (5)
Multiple Linear Regression Model  87 (57)
A Geometric Interpretation of Least Squares  91 (4)
Useful Properties of Estimates and Other Related Vectors  95 (4)
Preliminary Discussion of Residuals  99 (3)
Confidence Intervals and Tests of Hypotheses for a Single Parameter  102 (6)
Prediction of a New Observation  108 (4)
The Additional Sum of Squares Principle  112 (9)
Test of a Set of Linear Hypotheses  114 (6)
Joint Confidence Regions for Several Parameters  120 (1)
The Analysis of Variance and the Coefficient of Determination, R2  121 (4)
Coefficient of Determination, R2  124 (1)
Generalized Least Squares  125 (19)
Generalized Least Squares Estimation  128 (1)
Appendix: Proofs of Results  130 (4)
Specification Issues in Regression Models  144 (25)
Systems of Straight Lines  147 (4)
Comparison of Several "Treatments"  151 (3)
X Matrices with Nearly Linear-Dependent Columns  154 (6)
Detection of Multicollinearity  158 (1)
Guarding against Multicollinearity  159 (1)
X Matrices with Orthogonal Columns  160 (9)
Residuals and Residual Plots  170 (3)
Checking the Normality Assumption  177 (1)
Serial Correlation among the Errors  177 (5)
Example: Residual Plots for the UFFI Data  182 (1)
The Effect of Individual Cases  182 (17)
Leverage and Influence Measures  186 (6)
Assessing the Adequacy of the Functional Form: Testing for Lack of Fit  199 (6)
Lack-of-Fit Test with More Than One Independent Variable  204 (1)
Variance-Stabilizing Transformations  205 (17)
Several Useful Results for Logarithmic Transformations  207 (1)
R2, Radj, S2, and Akaike's Information Criterion  227 (4)
Case Studies in Linear Regression  245 (43)
Educational Achievement of Iowa Students  245 (8)
Analysis of Achievement Scores  246 (3)
Analysis of Teacher Salaries  249 (3)
Predicting the Price of Bordeaux Wine  253 (5)
Factors Influencing the Auction Price of Iowa Cows  258 (8)
Predicting U.S. Presidential Elections  266 (14)
A Purely Economics-Based Model Proposed by Ray Fair  266 (8)
Prediction Models Proposed by Political Scientists  274 (4)
Student Projects Leading to Additional Case Studies  280 (8)
Nonlinear Regression Models  288 (19)
Overview of Useful Deterministic Models, With Emphasis on Nonlinear Growth Curve Models  289 (4)
Nonlinear Regression Models  293 (1)
Inference in the Nonlinear Regression Model  294 (5)
The Newton-Raphson Method of Determining the Minimum of a Function  294 (2)
Application to Nonlinear Regression: Newton-Raphson and Gauss-Newton Methods  296 (1)
Standard Errors of the Maximum Likelihood Estimates  297 (2)
Example 1: Loss in Chlorine Concentration  299 (3)
Example 2: Shoot Length of Plants Exposed to Gamma Irradiation  302 (3)
Regression Models for Time Series Situations  307 (36)
A Brief Introduction to Time Series Models  307 (6)
First-Order Autoregressive Model  307 (3)
Second-Order Autoregressive Model  311 (1)
Summary of Time Series Models  312 (1)
The Effects of Ignoring the Autocorrelation in the Errors  313 (3)
Inefficiency of Least Squares Estimation  313 (2)
Spurious Regression Results When Nonstationary Errors Are Involved  315 (1)
The Estimation of Combined Regression Time Series Models  316 (4)
A Regression Model with First-Order Autoregressive Errors  316 (3)
Regression Model with Noisy Random Walk Errors  319 (1)
Forecasting With Combined Regression Time Series Models  320 (4)
Forecasts from the Regression Model with First-Order Autoregressive Errors  321 (1)
Forecasts from the Regression Model with Errors Following a Noisy Random Walk  322 (1)
Forecasts When the Explanatory Variable Must Be Forecast  323 (1)
Model-Building Strategy and Example  324 (3)
Cointegration and Regression with Time Series Data: An Example  327 (16)
Interpretation of the Parameters  346 (4)
Relationship between Logistic Regression and the Analysis of 2 × 2 Contingency Tables  347 (2)
Another Advantage of Modeling Odds Ratios by Logistic Regression  349 (1)
Estimation of the Parameters  350 (3)
Standard Errors of the Maximum Likelihood Estimates  355 (1)
Model-Building Approach in the Context of Logistic Regression Models  356 (4)
Preliminary Model Specification  356 (1)
Example 1: Death Penalty and Race of the Victim  360 (4)
Example 2: Hand Washing in Bathrooms at the University of Iowa  364 (2)
Example 3: Survival of the Donner Party  366 (6)
Other Models for Binary Responses  373 (1)
Modeling Responses with More Than Two Categorical Outcomes  374 (8)
Generalized Linear Models and Poisson Regression  382 (21)
Estimation of the Parameters in the Poisson Regression Model  383 (2)
Inference in the Poisson Regression Model  385 (2)
Standard Errors of the Maximum Likelihood Estimates and the Wald Tests  386 (1)
Example 1: Mating Success of African Elephants  389 (6)
Example 2: Reproduction of Ceriodaphnia Organisms  395 (4)
Answers to Selected Exercises  403 (11)
Statistical Tables  414 (7)
References  421 (4)
List of Data Files  425 (2)
Index  427