
Introduction to Regression Modeling (with CD-ROM), New edition

Bovas Abraham (University of Waterloo), Johannes Ledolter (University of Iowa)
  • Format: Mixed media, 448 pages, height × width × thickness: 240×194×23 mm, weight: 892 g; contains 1 hardback and 1 CD-ROM
  • Series: Duxbury Applied
  • Publication date: 20-Jan-2005
  • Publisher: Duxbury Press
  • ISBN-10: 0534420753
  • ISBN-13: 9780534420758
This textbook describes the linear regression model with a single predictor variable, regression models containing several explanatory variables, nonlinear models, regression models with time series errors, and logistic and Poisson regression models. Students should already have completed courses in statistics and linear algebra. The CD-ROM contains data sets in several formats. Annotation ©2004 Book News, Inc., Portland, OR (booknews.com)

Using a data-driven approach, this book is an exciting blend of theory and interesting regression applications. Students learn the theory behind regression while actively applying it. Working with many case studies, projects, and exercises from areas such as engineering, business, the social sciences, and the physical sciences, students discover the purpose of regression and learn how, when, and where regression models work. The book covers the analysis of observational data as well as of data that arise from designed experiments. Special emphasis is given to the difficulties that arise when working with observational data, such as problems caused by multicollinearity and "messy" data situations that violate some of the usual regression assumptions. Throughout the text, students learn regression modeling by solving exercises that emphasize theoretical concepts, by analyzing real data sets, and by working on projects that require them to identify a problem of interest and collect data relevant to the problem's solution. The book goes beyond linear regression by covering nonlinear models, regression models with time series errors, and logistic and Poisson regression models.
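To give a flavor of the book's opening topic, the least squares fit of a simple linear regression can be sketched in a few lines of Python. This is a generic illustration using made-up data, not material from the text or its CD-ROM; the function name `ols_fit` is our own.

```python
# Ordinary least squares for the simple linear regression model
# y = b0 + b1*x + error. Illustrative only: the data are made up.

def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Classic closed-form estimates: slope = Sxy / Sxx.
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]
b0, b1 = ols_fit(x, y)
print(b0, b1)  # intercept 0.15, slope 1.95 for these data
```

The same formulas appear, with the hardness example, in Chapter 2 of the book; statistical software or a library routine would of course be used for real analyses.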


1. Introduction to Regression Models. 2. Simple Linear Regression. 3. A Review of Matrix Algebra and Important Results on Random Vectors. 4. Multiple Linear Regression Model. 5. Specification Issues in Regression Models. 6. Model Checking. 7. Model Selection. 8. Case Studies in Linear Regression. 9. Nonlinear Regression Models. 10. Regression Models for Time Series Situations. 11. Logistic Regression. 12. Generalized Linear Models and Poisson Regression. Brief Answers to Selected Exercises. Statistical Tables. References.

1. Introduction to Regression Models, 1
   Introduction, 1
   Examples, 2
      Payout of an Investment, 2
      Period of Oscillation of a Pendulum, 3
      Salary of College Teachers, 3
      Hardness Data, 5
      Urea Formaldehyde Foam Insulation, 7
      Oral Contraceptive Data, 9
      Gas Consumption Data, 11
   A General Model, 15
   Important Reasons for Modeling, 16
   Data Plots and Empirical Modeling, 17
   An Iterative Model Building Approach, 21
   Some Comments on Data, 22
   Exercises, 23
2. Simple Linear Regression, 26
   The Model, 26
   Important Assumptions, 26
   Objectives of the Analysis, 27
   Estimation of Parameters, 27
      Maximum Likelihood Estimation, 27
      Least Squares Estimation, 28
   Fitted Values, Residuals, and the Estimate of σ², 29
      Consequences of the Least Squares Fit, 30
      Estimation of σ², 30
      Least Squares Calculations for the Hardness Example, 31
   Properties of Least Squares Estimates, 31
      Expected Values of Least Squares Estimates, 32
      Variances of Least Squares Estimates, 32
   Inferences about the Regression Parameters, 33
      Inference about β₁, 33
      Inference about μ₀ = β₀ + β₁x₀, 36
      Hardness Example Continued, 37
   Prediction, 38
      Hardness Example Continued, 40
   Analysis of Variance Approach to Regression, 41
      Coefficient of Determination: R², 44
      Hardness Example Continued, 45
      Another Example, 46
   Related Models, 51
      Regression through the Origin, 51
      The Case of Random x's, 52
   Appendix: Univariate Distributions, 53
   Exercises, 56
3. A Review of Matrix Algebra and Important Results on Random Vectors, 67
   Review of Matrix Algebra, 67
   Matrix Approach to Simple Linear Regression, 74
      Least Squares, 74
      Hardness Example, 76
   Vectors of Random Variables, 76
   The Multivariate Normal Distribution, 80
   Important Results on Quadratic Forms, 82
   Exercises, 83
4. Multiple Linear Regression Model, 87
   Introduction, 87
      Two Examples, 87
      The General Linear Model, 89
   Estimation of the Model, 90
      A Geometric Interpretation of Least Squares, 91
      Useful Properties of Estimates and Other Related Vectors, 95
      Preliminary Discussion of Residuals, 99
   Statistical Inference, 102
      Confidence Intervals and Tests of Hypotheses for a Single Parameter, 102
      Prediction of a New Observation, 108
   The Additional Sum of Squares Principle, 112
      Introduction, 112
      Test of a Set of Linear Hypotheses, 114
      Joint Confidence Regions for Several Parameters, 120
   The Analysis of Variance and the Coefficient of Determination, R², 121
      Coefficient of Determination, R², 124
   Generalized Least Squares, 125
      Introduction, 125
      Generalized Least Squares Estimation, 128
      Weighted Least Squares, 129
   Appendix: Proofs of Results, 130
   Exercises, 134
5. Specification Issues in Regression Models, 144
   Elementary Special Cases, 144
      One-Sample Problem, 144
      Two-Sample Problem, 145
      Polynomial Models, 146
   Systems of Straight Lines, 147
   Comparison of Several "Treatments", 151
   X Matrices with Nearly Linear-Dependent Columns, 154
      Detection of Multicollinearity, 158
      Guarding against Multicollinearity, 159
   X Matrices with Orthogonal Columns, 160
   Exercises, 163
6. Model Checking, 169
   Introduction, 169
   Residual Analysis, 170
      Residuals and Residual Plots, 170
      Added Variable Plots, 173
      Checking the Normality Assumption, 177
      Serial Correlation among the Errors, 177
      Example: Residual Plots for the UFFI Data, 182
   The Effect of Individual Cases, 182
      Outliers, 182
      Leverage and Influence Measures, 186
      Example: Forestry Data, 192
   Assessing the Adequacy of the Functional Form: Testing for Lack of Fit, 199
      Lack-of-Fit Test with More Than One Independent Variable, 204
   Variance-Stabilizing Transformations, 205
      Box-Cox Transformations, 206
      Several Useful Results for Logarithmic Transformations, 207
   Appendix, 208
   Exercises, 210
7. Model Selection, 222
   Introduction, 222
   All Possible Regressions, 227
      R², R²adj, S², and Akaike's Information Criterion, 227
      Cp Statistic, 231
      PRESS Statistic, 233
   Automatic Methods, 234
      Forward Selection, 234
      Backward Elimination, 236
      Stepwise Regression, 237
   Exercises, 238
8. Case Studies in Linear Regression, 245
   Educational Achievement of Iowa Students, 245
      Analysis of Achievement Scores, 246
      Analysis of Teacher Salaries, 249
      Concluding Comments, 252
   Predicting the Price of Bordeaux Wine, 253
   Factors Influencing the Auction Price of Iowa Cows, 258
   Predicting U.S. Presidential Elections, 266
      A Purely Economics-Based Model Proposed by Ray Fair, 266
      Prediction Models Proposed by Political Scientists, 274
      Concluding Comments, 278
   Student Projects Leading to Additional Case Studies, 280
   Exercises, 283
9. Nonlinear Regression Models, 288
   Introduction, 288
   Overview of Useful Deterministic Models, with Emphasis on Nonlinear Growth Curve Models, 289
   Nonlinear Regression Models, 293
   Inference in the Nonlinear Regression Model, 294
      The Newton-Raphson Method of Determining the Minimum of a Function, 294
      Application to Nonlinear Regression: Newton-Raphson and Gauss-Newton Methods, 296
      Standard Errors of the Maximum Likelihood Estimates, 297
      Implementation Issues, 299
   Examples, 299
      Example 1: Loss in Chlorine Concentration, 299
      Example 2: Shoot Length of Plants Exposed to Gamma Irradiation, 302
   Exercises, 305
10. Regression Models for Time Series Situations, 307
   A Brief Introduction to Time Series Models, 307
      First-Order Autoregressive Model, 307
      Random Walk Model, 310
      Second-Order Autoregressive Model, 311
      Noisy Random Walk Model, 311
      Summary of Time Series Models, 312
      Remark, 313
   The Effects of Ignoring the Autocorrelation in the Errors, 313
      Inefficiency of Least Squares Estimation, 313
      Spurious Regression Results When Nonstationary Errors Are Involved, 315
   The Estimation of Combined Regression Time Series Models, 316
      A Regression Model with First-Order Autoregressive Errors, 316
      Regression Model with Noisy Random Walk Errors, 319
   Forecasting with Combined Regression Time Series Models, 320
      Forecasts from the Regression Model with First-Order Autoregressive Errors, 321
      Forecasts from the Regression Model with Errors Following a Noisy Random Walk, 322
      Forecasts When the Explanatory Variable Must Be Forecast, 323
   Model-Building Strategy and Example, 324
   Cointegration and Regression with Time Series Data: An Example, 327
   Exercises, 334
11. Logistic Regression, 343
   The Model, 343
   Interpretation of the Parameters, 346
      Relationship between Logistic Regression and the Analysis of 2 × 2 Contingency Tables, 347
      Another Advantage of Modeling Odds Ratios by Logistic Regression, 349
   Estimation of the Parameters, 350
   Inference, 353
      Likelihood Ratio Tests, 353
      Deviance, 354
      Standard Errors of the Maximum Likelihood Estimates, 355
   Model-Building Approach in the Context of Logistic Regression Models, 356
      Preliminary Model Specification, 356
      Assessing Model Fit, 357
      Model Diagnostics, 359
   Examples, 360
      Example 1: Death Penalty and Race of the Victim, 360
      Example 2: Hand Washing in Bathrooms at the University of Iowa, 364
      Example 3: Survival of the Donner Party, 366
   Overdispersion, 372
   Other Models for Binary Responses, 373
   Modeling Responses with More Than Two Categorical Outcomes, 374
   Exercises, 377
12. Generalized Linear Models and Poisson Regression, 382
   The Model, 382
   Estimation of the Parameters in the Poisson Regression Model, 383
   Inference in the Poisson Regression Model, 385
      Likelihood Ratio Tests, 385
      Standard Errors of the Maximum Likelihood Estimates and the Wald Tests, 386
   Overdispersion, 387
   Examples, 389
      Example 1: Mating Success of African Elephants, 389
      Example 2: Reproduction of Ceriodaphnia Organisms, 395
   Exercises, 399
Answers to Selected Exercises, 403
Statistical Tables, 414
References, 421
List of Data Files, 425
Index, 427


Bovas Abraham is the former Director of the Institute for Improvement in Quality and Productivity, and is also a professor in the Department of Statistics and Actuarial Science at the University of Waterloo. Bovas received his Ph.D. from the University of Wisconsin, Madison. He has held visiting positions at the University of Wisconsin, the University of Iowa, and the University of Western Australia. He is the author of the book STATISTICAL METHODS FOR FORECASTING (with Johannes Ledolter), published by Wiley in 1983, and the editor of the volume QUALITY IMPROVEMENT THROUGH STATISTICAL METHODS, published by Birkhauser in 1998.

Johannes Ledolter is the John F. Murray Professor of Management Sciences at the University of Iowa, and a Professor at the Vienna University of Economics and Business Administration. His graduate degrees are in Statistics (M.S. and Ph.D. from the University of Wisconsin, and M.S. from the University of Vienna). He has held visiting positions at Princeton University and Yale University. He is the author of four books: STATISTICAL METHODS FOR FORECASTING (with Bovas Abraham), published by Wiley in 1983; STATISTICS FOR ENGINEERS AND PHYSICAL SCIENTISTS (2nd edition, with Robert V. Hogg), published by Macmillan in 1991; STATISTICAL QUALITY CONTROL (with Claude W. Burrill), published by Wiley in 1999; and ACHIEVING QUALITY THROUGH CONTINUAL IMPROVEMENT (with Claude W. Burrill), published by Wiley in 1999.