Summary  1

    Need for Fundamental Change  15
|
2 Assessments to Meet the Goals of the Framework  25
    The Framework's Vision for K-12 Science Education  25
    Dimensions of the Framework  28
    Dimension 1: Scientific and Engineering Practices  29
    Dimension 2: Crosscutting Concepts  30
    Dimension 3: Disciplinary Core Ideas  31
    Integration: Three-Dimensional Science Learning  31
    Learning Progressions: Developing Proficiency Over Time  33
    Supporting Connections Across Disciplines  37
    Example 1: What Is Going on Inside Me?  38
    Assessing Three-Dimensional Learning  44
    Assessing the Development of Three-Dimensional Learning Over Time  44
    Breadth and Depth of Content  45
|
3 Assessment Design and Validation  47
    Assessment as a Process of Evidentiary Reasoning  48
    Construct-Centered Approaches to Assessment Design  50
    Illustrations of Task-Design Approaches  56
    Evidence-Centered Design – Example 2: Pinball Car Task  56
    Construct Modeling: Measuring Silkworms  65
    Conclusion and Recommendation  80
|
|
4 Classroom Assessment  83
    Assessment Purposes: Formative or Summative  84
    Characteristics of NGSS-Aligned Assessments  86
    Variation in Assessment Activities  87
    Tasks with Multiple Components  88
    Learning as a Progression  90
    Example 3: Measuring Silkworms  93
    Example 4: Behavior of Air  94
    Example 5: Movement of Water  99
    Example 6: Biodiversity in the Schoolyard  104
    Lessons from the Examples  123
    Types of Assessment Activities  123
    Conclusions and Recommendations  129
|
5 Assessment for Monitoring  133
    Current Science Monitoring Assessments  135
    Including Performance Tasks in Monitoring Assessments  137
    Measurement and Implementation Issues  138
    Implications for Assessment of the NGSS  142
    Two Classes of Design Options  147
    On-Demand Assessment Components  147
    Classroom-Embedded Assessment Components  165
    Collections of Performance Tasks  166
    Maintaining the Quality of Classroom-Embedded Components  171
    Taking Advantage of Technology  175
    Variations in Item-Response Formats  177
    Assessing Challenging Constructs  183
    Conclusions and Recommendations  189
|
6 Designing an Assessment System  193
    Rationale for a Systems Approach  194
    Value of a System of Assessments  196
    Curriculum and Instruction  196
    Communicating Assessment Results  199
    Indicators of Opportunity to Learn  203
    Understanding the System Components and Their Uses  205
    Examples of Alternative Science Assessment Systems  207
    Conclusions and Recommendations  212
|
7 Implementing a Science Assessment System  217

B Biographical Sketches of Committee Members and Staff  259