Acknowledgments  xxi
Introduction  xxiii
About the Author  xxv
|
SECTION 1 SOFTWARE QUALITY IN PERSPECTIVE |
|
|
|
A Brief History of Software Testing  3
Historical Software Testing and Development Parallels  6
Evolution of Automated Testing Tools  8
Static Capture/Replay Tools (without Scripting Language)  10
Static Capture/Replay Tools (with Scripting Language)  10
Variable Capture/Replay Tools  10
|
Quality Assurance Framework  13
Prevention versus Detection  14
Verification versus Validation  15
Software Quality Assurance  16
Components of Quality Assurance  17
Software Configuration Management  19
Elements of Software Configuration Management  20
Software Quality Assurance Plan  23
Steps to Develop and Implement a Software Quality Assurance Plan  23
Obtain Management Acceptance  25
Obtain Development Acceptance  25
Plan for Implementation of the SQA Plan  26
Capability Maturity Model (CMM)  29
Malcolm Baldrige National Quality Award  34
|
Overview of Testing Techniques  39
Black-Box Testing (Functional)  39
White-Box Testing (Structural)  40
Gray-Box Testing (Functional and Structural)  41
Manual versus Automated Testing  41
Static versus Dynamic Testing  41
Taxonomy of Software Testing Techniques  42
|
Transforming Requirements to Testable Test Cases  51
Software Requirements as the Basis of Testing  51
Requirement Quality Factors  52
Numerical Method for Evaluating Requirement Quality  54
Process for Creating Test Cases from Good Requirements  55
Write Test Case Descriptions and Objectives  62
Transforming Use Cases to Test Cases  64
Write the Detailed Use Case Text  64
Identify Use Case Scenarios  66
Generating the Test Cases  66
|
What to Do When Requirements Are Nonexistent or Poor?  68
The Art of Ad Hoc Testing  68
Advantages and Disadvantages of Ad Hoc Testing  71
The Art of Exploratory Testing  72
Exploratory Testing Process  72
Advantages and Disadvantages of Exploratory Testing  73
|
Quality through Continuous Improvement Process  75
Contribution of W. Edwards Deming  75
Role of Statistical Methods  76
Deming's 14 Quality Principles  77
Create Constancy of Purpose  77
Adopt the New Philosophy  78
Cease Dependence on Mass Inspection  78
End the Practice of Awarding Business on Price Tag Alone  79
Improve Constantly and Ceaselessly the System of Production and Service  79
Institute Training and Retraining  79
Institute Leadership  80
Drive Out Fear  80
Break Down Barriers between Staff Areas  81
Eliminate Slogans, Exhortations, and Targets for the Workforce  81
Eliminate Numerical Goals  81
Remove Barriers to Pride of Workmanship  82
Institute a Vigorous Program of Education and Retraining  82
Take Action to Accomplish the Transformation  82
Continuous Improvement through the Plan, Do, Check, Act Process  83
Going around the PDCA Circle  84
|
SECTION 2 WATERFALL TESTING REVIEW

Waterfall Development Methodology  87
Continuous Improvement "Phased" Approach  88
Psychology of Life-Cycle Testing  89
Software Testing as a Continuous Improvement Process  89
The Testing Bible: Software Test Plan  92
Major Steps in Developing a Test Plan  93
Define the Test Objectives  93
Develop the Test Approach  93
Define the Test Environment  95
Develop the Test Specifications  95
Review and Approve the Test Plan  95
Components of a Test Plan  95
Technical Reviews as a Continuous Improvement Process  96
Motivation for Technical Reviews  101
Steps for an Effective Review  105
Plan for the Review Process  105
Develop the Review Agenda  106
Static Testing the Requirements  107
Testing the Requirements with Ambiguity Reviews  108
Testing the Requirements with Technical Reviews  109
Inspections and Walkthroughs  109
Requirements Traceability Matrix  110
Building the System/Acceptance Test Plan  111
Static Testing the Logical Design  115
Data Model, Process Model, and the Linkage  115
Testing the Logical Design with Technical Reviews  117
Refining the System/Acceptance Test Plan  118
Static Testing the Physical Design  121
Testing the Physical Design with Technical Reviews  121
Creating Integration Test Cases  122
Methodology for Integration Testing  123
Reconcile Interfaces for Completeness  124
Create Integration Test Conditions  124
Evaluate the Completeness of Integration Test Conditions  124
Static Testing the Program Unit Design  127
Testing the Program Unit Design with Technical Reviews  127
Static Testing and Dynamic Testing the Code  131
Testing Coding with Technical Reviews  131
|
SECTION 3 SPIRAL (AGILE) SOFTWARE TESTING METHODOLOGY: PLAN, DO, CHECK, ACT

Development Methodology Overview  139
Limitations of Life-Cycle Development  139
The Client/Server Challenge  140
Psychology of Client/Server Spiral Testing  141
The New School of Thought  141
Tester/Developer Perceptions  142
Project Goal: Integrate QA and Development  143
Iterative/Spiral Development Methodology  144
Methodology for Developing Prototypes  148
Demonstrate Prototypes to Management  149
Demonstrate Prototype to Users  150
Revise and Finalize Specifications  150
Develop the Production System  151
Continuous Improvement "Spiral" Testing Approach  151
Information Gathering (Plan)  155
Prepare for the Interview  156
Identify the Participants  156
Understand the Project Objectives  159
Understand the Project Status  160
Understand the Project Plans  160
Understand the Project Development Methodology  161
Identify the High-Level Business Requirements  161
Identifying and Weighting Risk Attributes  164
Confirm the Interview Findings  165
Define the High-Level Functional Requirements (in Scope)  170
Identify Manual/Automated Test Types  171
Identify the Test Exit Criteria  171
Establish Regression Test Strategy  172
Define the Test Deliverables  174
Establish a Test Environment  177
Establish Defect Recording/Tracking Procedures  182
Establish Change Request Procedures  184
Establish Version Control Procedures  185
Define Configuration Build Procedures  186
Define Project Issue Resolution Procedures  186
Establish Reporting Procedures  187
Define Approval Procedures  187
Define the Metric Objectives  188
Schedule/Conduct the Review  194
Refine the Functional Test Requirements  195
Build a Function/Test Matrix  200
Ten Guidelines for Good GUI Design  200
Identify the Application GUI Components  202
Define the System/Acceptance Tests  203
Identify Potential System Tests  203
Design System Fragment Tests  205
Identify Potential Acceptance Tests  206
Schedule/Prepare for Review  206
Script the Manual/Automated GUI/Function Tests  209
Script the Manual/Automated System Fragment Tests  210
Review/Approve Test Development  210
Schedule/Prepare for Review  210
Test Coverage through Traceability  213
Use Cases and Traceability  214
Test Execution/Evaluation (Do/Check)  217
Regression Test the Manual/Automated Spiral Fixes  217
Execute the Manual/Automated New Spiral Tests  219
Document the Spiral Test Defects  219
Identify Requirement Changes  221
Prepare for the Next Spiral (Act)  223
Update the Function/GUI Tests  223
Update the System Fragment Tests  225
Update the Acceptance Tests  225
Reassess the Team, Procedures, and Test Environment  225
Review the Test Control Procedures  226
Update the Test Environment  227
Publish Interim Test Report  227
Publish the Metric Graphics  227
Test Case Execution Status  227
Conduct the System Test (Act)  233
Complete System Test Plan  233
Finalize the System Test Types  233
Finalize System Test Schedule  235
Organize the System Test Team  235
Establish the System Test Environment  238
Install the System Test Tools  239
Complete System Test Cases  239
Design/Script the Performance Tests  239
Design/Script the Security Tests  242
A Security Design Strategy  242
Design/Script the Volume Tests  243
Design/Script the Stress Tests  243
Design/Script the Compatibility Tests  244
Design/Script the Conversion Tests  245
Design/Script the Usability Tests  246
Design/Script the Documentation Tests  246
Design/Script the Backup Tests  247
Design/Script the Recovery Tests  248
Design/Script the Installation Tests  248
Design/Script Other System Test Types  249
Review/Approve System Tests  250
Schedule/Conduct the Review  250
Regression Test the System Fixes  251
Execute the New System Tests  251
Document the System Defects  251
Conduct Acceptance Testing  253
Complete Acceptance Test Planning  253
Finalize the Acceptance Test Types  253
Finalize the Acceptance Test Schedule  255
Organize the Acceptance Test Team  255
Establish the Acceptance Test Environment  256
Install Acceptance Test Tools  256
Complete Acceptance Test Cases  256
Identify the System-Level Test Cases  257
Design/Script Additional Acceptance Tests  257
Review/Approve Acceptance Test Plan  257
Schedule/Conduct the Review  257
Execute the Acceptance Tests  258
Regression Test the Acceptance Fixes  258
Execute the New Acceptance Tests  259
Document the Acceptance Defects  259
Summarize/Report Test Results  261
Ensure All Tests Were Executed/Resolved  261
Consolidate Test Defects by Test Number  261
Post Remaining Defects to a Matrix  262
Prepare Final Test Report  263
Prepare the Project Overview  263
Summarize the Test Activities  263
Analyze/Create Metric Graphics  263
Functions Tested and Not Tested  267
System Testing Defect Types  268
Acceptance Testing Defect Types  268
Develop Findings/Recommendations  269
Review/Approve the Final Test Report  272
Schedule/Conduct the Review  272
Publish the Final Test Report  273
|
SECTION 4 PROJECT MANAGEMENT METHODOLOGY

The Project Management Framework  279
Product Quality and Project Quality  279
Components of the Project Framework  280
The Project Framework and Continuous Quality Improvement  280
The Project Framework Phases  281
Executing, Monitoring, and Controlling Phases  282
Scoping the Project to Ensure Product Quality  283
Product Scope and Project Scope  283
The Role of the Project Manager in Quality Management  285
The Role of the Test Manager in Quality Management  286
Avoid Duplication and Repetition  287
Validate the Test Environment  287
Advice for the Test Manager  288
Communicate Issues as They Arise  288
Always Update Your Business Knowledge  289
Learn the New Testing Technologies and Tools  289
The Benefits of Quality Project Management and the Project Framework  290
Project Quality Management  291
Project Quality Management Processes  291
Identifying the High-Level Project Activities  292
Estimating the Test Work Effort  292
Effort Estimation: Model Project  294
The Defect Management Process  301
Quality Control and Defect Management  301
Defect Discovery and Classification  301
Integrated Testing and Development  309
Quality Control and Integrated Testing  309
Identify the Tasks to Integrate  310
Customize Test Steps and Tasks  311
Select Integration Points  311
Modify the Development Methodology  312
Test Methodology Training  312
Incorporate Defect Recording  313
Test Management Constraints  315
Organizational Architecture  315
Traits of a Well-Established Quality Organization  315
Division of Responsibilities  316
Organizational Relationships  317
Using the Project Framework Where No Quality Infrastructure Exists  317
Ad Hoc Testing and the Project Framework  318
Using a Traceability/Validation Matrix  319
|
SECTION 5 EMERGING SPECIALIZED AREAS IN TESTING

Test Process and Automation Assessment  323
Process Evaluation Methodology  324
Identify the Key Elements  324
Gather and Analyze the Information  325
Requirements Definition Maturity  326
Test Effort Estimation Maturity  328
Test Design and Execution Maturity  328
Regression Testing Maturity  329
Document and Present Findings  330
Test Automation Assessment  330
Identify the Applications to Automate  332
Identify the Best Test Automation Tool  332
Test Automation Framework  334
Basic Features of an Automation Framework  335
Define the Folder Structure  335
Modularize Scripts/Test Data to Increase Robustness  336
Reuse Generic Functions and Application-Specific Function Libraries  336
Develop Scripting Guidelines and Review Checklists  336
Define Error Handling and Recovery Functions  337
Define the Maintenance Process  337
Standard Automation Frameworks  337
Performance Testing Approach  344
Knowledge Acquisition Process  345
Identifying the Scope of Security Testing  352
Test Case Generation and Execution  353
Types of Security Testing  353
Goals of Usability Testing  359
Guidelines for Usability Testing  361
Accessibility Testing and Section 508  361
Agile User Stories Contrasted to Formal Requirements  371
Testing Center of Excellence  377
Test Automation Framework  382
Continuous Competency Development  382
Determine the Economic Trade-Offs  384
Determine the Selection Criteria  385
Project Management and Monitoring  385
Implementing the On-Site/Offshore Model  388
Benefits of On-Site/Offshore Methodology  392
On-Site/Offshore Model Challenges  393
Future of the Onshore/Offshore Model  394
|
SECTION 6 MODERN SOFTWARE TESTING TOOLS

Automated Capture/Replay Testing Tools  399
Necessary and Sufficient Conditions  400
Test Data Generation Strategies  401
Generating Data Based on the Database  403
A Cutting-Edge Test Case Generator Based on Requirements  404
Taxonomy of Software Testing Tools  409
Testing Tool Selection Checklist  409
Commercial Vendor Tool Descriptions  410
Open-Source Freeware Vendor Tools  410
When You Should Consider Test Automation  410
When You Should Not Consider Test Automation  428
Methodology to Evaluate Automated Testing Tools  431
Define Your Test Requirements  431
Conduct Selection Activities for Informal Procurement  432
Develop the Acquisition Plan  432
Define Selection Criteria  432
Conduct the Candidate Review  433
Conduct Selection Activities for Formal Procurement  434
Develop the Acquisition Plan  434
Create the Technical Requirements Document  434
Generate the Request for Proposal  434
Perform the Technical Evaluation  435
Create the Evaluation Plan  436
Create the Tool Manager's Plan  436
Perform the Acceptance Test  437
Use the Tool in the Operating Environment  438
Write the Evaluation Report  439
Determine Whether Goals Have Been Met  439
|
|
|
Appendix A: Spiral (Agile) Testing Methodology  443
Appendix B: Software Quality Assurance Plan  453
Appendix C: Requirements Specification  455
Appendix D: Change Request Form  457
Appendix E: Test Templates  459
Appendix F  493
Appendix G: Software Testing Techniques  557
Bibliography  629
Glossary  633
Index  641