Preface  xiii

1 (20)
  What Is Regression Analysis?  1 (1)
  Publicly Available Data Sets  2 (1)
  Selected Applications of Regression Analysis  3 (4)
  Steps in Regression Analysis  7 (10)
  Scope and Organization of the Book  17 (4)
  18 (3)

21 (30)
  21 (1)
  Covariance and Correlation Coefficient  21 (5)
  Example: Computer Repair Data  26 (3)
  The Simple Linear Regression Model  29 (1)
  30 (3)
  33 (4)
  37 (1)
  38 (2)
  Measuring the Quality of Fit  40 (3)
  Regression Line Through the Origin  43 (1)
  Trivial Regression Models  44 (1)
  45 (6)

Multiple Linear Regression  51 (34)
  51 (1)
  Description of the Data and Model  51 (1)
  Example: Supervisor Performance Data  52 (3)
  55 (1)
  Interpretations of Regression Coefficients  56 (3)
  Properties of the Least Squares Estimators  59 (1)
  Multiple Correlation Coefficient  60 (1)
  Inference for Individual Regression Coefficients  61 (2)
  Tests of Hypotheses in a Linear Model  63 (11)
  74 (1)
  74 (11)
  75 (5)
  80 (5)

Regression Diagnostics: Detection of Model Violations  85 (38)
  85 (1)
  The Standard Regression Assumptions  85 (3)
  Various Types of Residuals  88 (2)
  90 (3)
  Graphs Before Fitting a Model  93 (4)
  Graphs After Fitting a Model  97 (1)
  Checking Linearity and Normality Assumptions  97 (1)
  Leverage, Influence, and Outliers  98 (5)
  103 (4)
  The Potential-Residual Plot  107 (1)
  What to Do with the Outliers?  108 (1)
  Role of Variables in a Regression Equation  109 (4)
  Effects of an Additional Predictor  113 (3)
  116 (7)

Qualitative Variables as Predictors  123 (30)
  123 (1)
  124 (3)
  127 (4)
  Systems of Regression Equations  131 (9)
  Other Applications of Indicator Variables  140 (1)
  141 (1)
  Stability of Regression Parameters Over Time  142 (11)
  144 (9)

Transformation of Variables  153 (28)
  153 (2)
  Transformations to Achieve Linearity  155 (2)
  Bacteria Deaths Due to X-Ray Radiation  157 (4)
  Transformations to Stabilize Variance  161 (5)
  Detection of Heteroscedastic Errors  166 (2)
  Removal of Heteroscedasticity  168 (2)
  170 (1)
  Logarithmic Transformation of Data  170 (4)
  174 (2)
  176 (5)

181 (20)
  181 (1)
  182 (3)
  185 (2)
  Education Expenditure Data  187 (10)
  Fitting a Dose-Response Relationship Curve  197 (4)
  199 (2)

The Problem of Correlated Errors  201 (24)
  Introduction: Autocorrelation  201 (1)
  Consumer Expenditure and Money Stock  202 (2)
  204 (2)
  Removal of Autocorrelation by Transformation  206 (2)
  Iterative Estimation With Autocorrelated Errors  208 (1)
  Autocorrelation and Missing Variables  209 (1)
  Analysis of Housing Starts  210 (4)
  Limitations of Durbin-Watson Statistic  214 (2)
  Indicator Variables to Remove Seasonality  216 (3)
  Regressing Two Time Series  219 (6)
  220 (5)

Analysis of Collinear Data  225 (38)
  225 (1)
  226 (6)
  232 (4)
  Detection of Multicollinearity  236 (6)
  242 (3)
  Principal Components Approach  245 (4)
  249 (3)
  Searching for Linear Functions of the β's  252 (3)
  Computations Using Principal Components  255 (3)
  258 (5)
  258 (2)
  Appendix: Principal Components  260 (3)

Biased Estimation of Regression Coefficients  263 (22)
  263 (1)
  Principal Components Regression  264 (2)
  Removing Dependence Among the Predictors  266 (2)
  Constraints on the Regression Coefficients  268 (1)
  Principal Components Regression: A Caution  269 (2)
  271 (2)
  Estimation by the Ridge Method  273 (3)
  Ridge Regression: Some Remarks  276 (3)
  279 (6)
  279 (2)
  Appendix: Ridge Regression  281 (4)

Variable Selection Procedures  285 (34)
  285 (1)
  Formulation of the Problem  286 (1)
  Consequences of Variables Deletion  286 (2)
  Uses of Regression Equations  288 (1)
  Criteria for Evaluating Equations  289 (2)
  Multicollinearity and Variable Selection  291 (1)
  Evaluating All Possible Equations  291 (1)
  Variable Selection Procedures  292 (2)
  General Remarks on Variable Selection Methods  294 (1)
  A Study of Supervisor Performance  295 (4)
  Variable Selection With Collinear Data  299 (1)
  300 (3)
  Variable Selection Using Ridge Regression  303 (1)
  Selection of Variables in an Air Pollution Study  303 (7)
  A Possible Strategy for Fitting Regression Models  310 (2)
  312 (7)
  312 (4)
  Appendix: Effects of Incorrect Model Specifications  316 (3)

319 (16)
  319 (1)
  Modeling Qualitative Data  320 (1)
  320 (2)
  Example: Estimating Probability of Bankruptcies  322 (3)
  Logistic Regression Diagnostics  325 (2)
  Determination of Variables to Retain  327 (1)
  Judging the Fit of a Logistic Regression  328 (2)
  Classification Problem: Another Approach  330 (5)
  331 (4)

Appendix: Statistical Tables  335 (12)
References  347 (8)
Index  355