|
|
|
|
1 | (14)
    Statistical Procedures Described in the Book  1 | (14)
    2 | (2)
    4 | (1)
    5 | (2)
    General Loglinear Analysis  7 | (1)
    8 | (1)
    9 | (1)
    10 | (2)
    12 | (1)
    13 | (1)
    14 | (1)
|
2 Getting to Know IBM SPSS Statistics  15 | (22)
    16 | (1)
    16 | (3)
    Windows of IBM SPSS Statistics  19 | (1)
    20 | (5)
    IBM SPSS Statistics Viewer  25 | (9)
    27 | (1)
    28 | (5)
    33 | (1)
    34 | (3)
|
|
37 | (16)
    37 | (2)
    39 | (1)
    Getting Data into IBM SPSS Statistics  39 | (11)
    Using IBM SPSS Statistics Data Files  39 | (2)
    Using Spreadsheet and Database Files  41 | (1)
    Feeding Text Files to the Text Wizard  42 | (1)
    42 | (7)
    49 | (1)
    50 | (1)
    50 | (1)
    50 | (1)
    Saving Time When Dealing with Your Data  50 | (3)
    Selecting Cases for Analyses  51 | (1)
    Repeating the Analysis for Different Groups of Cases  51 | (2)
|
|
53 | (14)
    Checking Variable Definitions  53 | (4)
    54 | (1)
    Using Define Variable Properties  55 | (2)
    57 | (2)
    Eliminating Duplicate Cases  57 | (2)
    59 | (1)
    59 | (7)
    59 | (1)
    60 | (1)
    Looking At the Distribution of Values  61 | (2)
    Looking At Combinations of Variables  63 | (3)
    66 | (1)
|
|
67 | (16)
    67 | (8)
    One Size Fits All: Unconditional Transformation  68 | (3)
    If and Then: Conditional Transformations  71 | (1)
    72 | (3)
    Changing the Coding Scheme  75 | (8)
    77 | (1)
    Changing a String Variable to a Numeric Variable  78 | (1)
    79 | (1)
    80 | (3)
|
|
83 | (28)
    83 | (1)
    84 | (1)
    Newspaper Reading: The Example  84 | (8)
    Examining Tables and Charts of Counts  84 | (4)
    Examining Two-Way Tables of Counts  88 | (4)
    Summarize Scale Variables  92 | (12)
    93 | (3)
    Variability and Central Tendency  96 | (4)
    Plotting Pairs of Variables  100 | (1)
    101 | (1)
    102 | (2)
    104 | (2)
    105 | (1)
    106 | (1)
    107 | (4)
|
|
111 | (16)
    111 | (1)
    112 | (3)
    Defining Samples and Populations  112 | (1)
    112 | (1)
    Creating a Good Experimental Design  112 | (1)
    Dealing with Missing Data  113 | (2)
    115 | (7)
    Step 1: Specifying the Null Hypothesis and the Alternative Hypothesis  115 | (1)
    Prelude to the Remaining Steps  116 | (1)
    Step 2: Selecting the Appropriate Statistical Procedure  117 | (1)
    Step 3: Checking Whether Your Data Meet the Required Assumptions  117 | (2)
    Step 4: Assuming That the Null Hypothesis Is True  119 | (1)
    Step 5: Calculating the Observed Significance Level  119 | (2)
    Step 6: Deciding Whether to Reject the Null Hypothesis  121 | (1)
    Calculating Confidence Intervals  122 | (4)
    Reporting Your Results Correctly  123 | (2)
    125 | (1)
    Commonly Used Tests for Popular Hypotheses  126 | (1)
    126 | (1)
|
|
127 | (16)
    127 | (1)
    Deciding Which T Test to Use  128 | (5)
    128 | (2)
    130 | (1)
    Two-Independent-Samples T Test  131 | (2)
    Analyzing Truancy Data: The Example  133 | (9)
    134 | (3)
    137 | (3)
    Two-Independent-Samples T Test  140 | (2)
    142 | (1)
|
9 One-Way Analysis of Variance  143 | (20)
    143 | (1)
    144 | (1)
    144 | (9)
    144 | (1)
    145 | (2)
    147 | (4)
    151 | (1)
    152 | (1)
    Pinpointing the Differences  153 | (8)
    Atoning for Many Comparisons  154 | (2)
    Contrasts: Testing Linear Combinations of Means  156 | (5)
    161 | (2)
|
|
163 | (34)
    163 | (1)
    164 | (1)
    Chi-Square Test: Are Two Variables Independent?  164 | (8)
    165 | (3)
    168 | (3)
    Measuring Change: McNemar Test  171 | (1)
    How Strongly Are Two Variables Related?  172 | (14)
    Measures of Association for Nominal Variables  173 | (3)
    Proportional Reduction in Error Measures  176 | (3)
    179 | (4)
    183 | (1)
    Measures Based on Correlation Coefficients  183 | (1)
    183 | (3)
    Measuring Risk in 2-by-2 Tables  186 | (7)
    Measuring the Relative Risk  186 | (2)
    Calculating the Odds Ratio  188 | (2)
    190 | (1)
    Testing Hypotheses about the Odds Ratios  191 | (2)
    Megatip: Entering Tables Directly  193 | (1)
    194 | (3)
|
|
197 | (20)
    197 | (1)
    198 | (1)
    198 | (8)
    199 | (2)
    Examining the Scatterplot Matrix  201 | (1)
    Using the Pearson Correlation Coefficient  202 | (2)
    Testing Hypotheses about Correlation Coefficients  204 | (2)
    206 | (2)
    Basing Correlation Coefficients on Ranks  208 | (1)
    Using Partial Correlation Coefficients  208 | (6)
    Calculating Partial Correlation Coefficients  209 | (2)
    Testing Hypotheses about Partial Correlation Coefficients  211 | (1)
    Steps for Calculating the Partial Correlation Coefficient  212 | (1)
    213 | (1)
    Megatip: Identifying Points in a Scatterplot  214 | (2)
    Labeling Individual Points with the Default Variable Value  215 | (1)
    Labeling All Points with the Default Variable Value  215 | (1)
    Changing the Variables Used to Identify Points  215 | (1)
    216 | (1)
|
12 Bivariate Linear Regression  217 | (20)
    217 | (1)
    218 | (1)
    Predicting Body Fat: The Example  218 | (7)
    218 | (1)
    Calculating the Least-Squares Regression Line  219 | (1)
    220 | (1)
    Determining How Well the Line Fits  221 | (1)
    Testing Hypotheses about the Population Regression Line  221 | (4)
    Searching for Violations of Assumptions  225 | (5)
    Looking for Unusual Points  230 | (2)
    Dealing with Violations of the Assumptions  232 | (2)
    Coaxing a Nonlinear Relationship to Linearity  232 | (1)
    Coping with Non-Normality  233 | (1)
    234 | (1)
    Adding More Circumferences: Multiple Linear Regression  234 | (1)
    235 | (2)
|
13 Multiple Linear Regression  237 | (38)
    238 | (1)
    238 | (1)
    Reading Scores: The Example  238 | (2)
    239 | (1)
    Multiple Linear Regression Model  240 | (14)
    241 | (2)
    Estimating the Coefficients of the Model  243 | (3)
    Assumptions for Testing Hypotheses  246 | (1)
    246 | (5)
    Including Categorical Variables  251 | (1)
    Comparing Two Models: Change in R-Square  252 | (1)
    Many Paths Lead to the Same Hypothesis  253 | (1)
    Using an Automated Method for Building a Model  254 | (6)
    255 | (1)
    Stepwise Selection: An Example  256 | (3)
    Calculating Predicted Values  259 | (1)
    After the Model Is Selected  260 | (1)
    Checking for Violations of Regression Assumptions  261 | (4)
    261 | (4)
    Looking for Unusual Observations  265 | (8)
    Identifying Large Residuals  265 | (1)
    Identifying Unusual Values of Independent Variables  266 | (1)
    Looking for Influential Points  267 | (3)
    270 | (1)
    271 | (1)
    272 | (1)
    273 | (2)
|
14 Automatic Linear Modeling  275 | (16)
    All-Possible-Subsets Regression Models  276 | (8)
    277 | (1)
    277 | (2)
    Comparing Observed and Expected Values  279 | (1)
    Examining the Coefficients  280 | (2)
    282 | (1)
    Looking for Influential Points  283 | (1)
    284 | (7)
    Automatic Data Preparation  285 | (1)
    285 | (2)
    Evaluating the Predictors  287 | (4)
|
|
291 |
    292 | (1)
    292 | (1)
    Predicting Internet Use: The Example  293 | (6)
    293 | (3)
    Examining Descriptive Statistics  296 | (3)
    Calculating the Discriminant Function  299 | (7)
    Discriminant Function Coefficients  300 | (6)
    Testing Hypotheses about the Discriminant Function  306 | (3)
    307 | (1)
    Testing Equality of Discriminant Function Means  308 | (1)
    Classifying Cases into Groups  309 | (5)
    309 | (3)
    Examining the Probability of Group Membership  312 | (2)
    314 | (6)
    314 | (3)
    317 | (3)
    320 | (1)
    Discriminant Analysis with Four Groups  320 | (6)
    Discriminant Function Coefficients  321 | (3)
    Testing Equality of Discriminant Function Means  324 | (1)
    Pairwise Differences between Groups  325 | (1)
    326 | (5)
    Classification Function Coefficients  327 | (4)
|
|
331 | |