
E-book: Software Testing and Continuous Quality Improvement, 3rd edition [Taylor & Francis e-book]

William E. Lewis (President and CEO, Smartware Technologies, Inc., Plano, Texas, USA)
  • Format: 684 pages; 156 tables (black and white); 101 illustrations (black and white)
  • Publication date: 22 Dec 2008
  • Publisher: Auerbach
  • ISBN-13: 9781315181400
  • Taylor & Francis e-book
  • Price: €184.65*
  • * this price provides access for an unlimited number of concurrent users for an unlimited period
  • Regular price: €263.78
  • You save 30%
It is often assumed that software testing is based on clearly defined requirements and software development standards. In practice, however, testing is typically performed against changing, and sometimes inaccurate, requirements. The third edition of this bestseller, Software Testing and Continuous Quality Improvement, provides a continuous quality framework for the software testing process within both traditionally structured and unstructured environments. This framework aids in creating meaningful test cases for systems with evolving requirements.

This completely revised reference provides a comprehensive look at software testing as part of the project management process, emphasizing testing and quality goals early in development. Building on the success of previous editions, the text explains testing in a Service-Oriented Architecture (SOA) environment, the building blocks of a Testing Center of Excellence (COE), and how to test in an agile development environment. The fully updated sections on test effort estimation place greater emphasis on testing metrics. The book also examines all aspects of functional testing and looks at the relationship between changing business strategies and changes to applications under development.

Includes New Chapters on Process, Application, and Organizational Metrics

All IT organizations face software testing issues, but most are unprepared to manage them. Software Testing and Continuous Quality Improvement, Third Edition is enhanced with an up-to-date listing of free software tools and a question-and-answer checklist for choosing the best tools for your organization. It equips you with everything you need to effectively address testing issues in the most beneficial way for your business.
Acknowledgments xxi
Introduction xxiii
About the Author xxv
SECTION 1 SOFTWARE QUALITY IN PERSPECTIVE
A Brief History of Software Testing
3(10)
Historical Software Testing and Development Parallels
6(2)
Extreme Programming
8(1)
Evolution of Automated Testing Tools
8(5)
Static Capture/Replay Tools (without Scripting Language)
10(1)
Static Capture/Replay Tools (with Scripting Language)
10(1)
Variable Capture/Replay Tools
10(3)
Quality Assurance Framework
13(26)
What Is Quality?
13(1)
Prevention versus Detection
14(1)
Verification versus Validation
15(1)
Software Quality Assurance
16(1)
Components of Quality Assurance
17(1)
Software Testing
17(1)
Quality Control
18(5)
Software Configuration Management
19(1)
Elements of Software Configuration Management
20(3)
Software Quality Assurance Plan
23(3)
Steps to Develop and Implement a Software Quality Assurance Plan
23(1)
Document the Plan
23(2)
Obtain Management Acceptance
25(1)
Obtain Development Acceptance
25(1)
Plan for Implementation of the SQA Plan
26(1)
Execute the SQA Plan
26(1)
Quality Standards
26(11)
Sarbanes-Oxley
26(3)
ISO9000
29(1)
Capability Maturity Model (CMM)
29(1)
Initial
30(1)
Repeatable
31(1)
Defined
31(1)
Managed
32(1)
Optimized
32(1)
People CMM
33(1)
CMMI
33(1)
Malcolm Baldrige National Quality Award
34(3)
Notes
37(2)
Overview of Testing Techniques
39(12)
Black-Box Testing (Functional)
39(1)
White-Box Testing (Structural)
40(1)
Gray-Box Testing (Functional and Structural)
41(1)
Manual versus Automated Testing
41(1)
Static versus Dynamic Testing
41(1)
Taxonomy of Software Testing Techniques
42(9)
Transforming Requirements to Testable Test Cases
51(24)
Introduction
51(1)
Software Requirements as the Basis of Testing
51(1)
Requirement Quality Factors
52(2)
Understandable
52(1)
Necessary
53(1)
Modifiable
53(1)
Nonredundant
53(1)
Terse
54(1)
Testable
54(1)
Traceable
54(1)
Within Scope
54(1)
Numerical Method for Evaluating Requirement Quality
54(1)
Process for Creating Test Cases from Good Requirements
55(9)
Review the Requirements
55(3)
Write a Test Plan
58(1)
Identify the Test Suite
58(1)
Name the Test Cases
59(3)
Write Test Case Descriptions and Objectives
62(1)
Create the Test Cases
62(1)
Review the Test Cases
63(1)
Transforming Use Cases to Test Cases
64(4)
Draw a Use Case Diagram
64(1)
Write the Detailed Use Case Text
64(2)
Identify Use Case Scenarios
66(1)
Generating the Test Cases
66(2)
Generating Test Data
68(1)
Summary
68(1)
What to Do When Requirements Are Nonexistent or Poor?
68(7)
Ad Hoc Testing
68(1)
The Art of Ad Hoc Testing
68(3)
Advantages and Disadvantages of Ad Hoc Testing
71(1)
Exploratory Testing
72(1)
The Art of Exploratory Testing
72(1)
Exploratory Testing Process
72(1)
Advantages and Disadvantages of Exploratory Testing
73(2)
Quality through Continuous Improvement Process
75(12)
Contribution of Edward Deming
75(1)
Role of Statistical Methods
76(1)
Cause-and-Effect Diagram
76(1)
Flowchart
76(1)
Pareto Chart
76(1)
Run Chart
77(1)
Histogram
77(1)
Scatter Diagram
77(1)
Control Chart
77(1)
Deming's 14 Quality Principles
77(6)
Create Constancy of Purpose
77(1)
Adopt the New Philosophy
78(1)
Cease Dependence on Mass Inspection
78(1)
End the Practice of Awarding Business on Price Tag Alone
79(1)
Improve Constantly and Ceaselessly the System of Production and Service
79(1)
Institute Training and Retraining
79(1)
Institute Leadership
80(1)
Drive Out Fear
80(1)
Break Down Barriers between Staff Areas
81(1)
Eliminate Slogans, Exhortations, and Targets for the Workforce
81(1)
Eliminate Numerical Goals
81(1)
Remove Barriers to Pride of Workmanship
82(1)
Institute a Vigorous Program of Education and Retraining
82(1)
Take Action to Accomplish the Transformation
82(1)
Continuous Improvement through the Plan, Do, Check, Act Process
83(1)
Going around the PDCA Circle
84(3)
SECTION 2 WATERFALL TESTING REVIEW
Overview
87(20)
Waterfall Development Methodology
87(1)
Continuous Improvement "Phased" Approach
88(1)
Psychology of Life-Cycle Testing
89(1)
Software Testing as a Continuous Improvement Process
89(3)
The Testing Bible: Software Test Plan
92(1)
Major Steps in Developing a Test Plan
93(2)
Define the Test Objectives
93(1)
Develop the Test Approach
93(2)
Define the Test Environment
95(1)
Develop the Test Specifications
95(1)
Schedule the Test
95(1)
Review and Approve the Test Plan
95(1)
Components of a Test Plan
95(1)
Technical Reviews as a Continuous Improvement Process
96(5)
Motivation for Technical Reviews
101(1)
Types of Reviews
101(2)
Structured Walkthroughs
101(1)
Inspections
102(1)
Participant Roles
103(2)
Steps for an Effective Review
105(2)
Plan for the Review Process
105(1)
Schedule the Review
105(1)
Develop the Review Agenda
106(1)
Create a Review Report
106(1)
Static Testing the Requirements
107(8)
Testing the Requirements with Ambiguity Reviews
108(1)
Testing the Requirements with Technical Reviews
109(1)
Inspections and Walkthroughs
109(1)
Checklists
109(1)
Methodology Checklist
109(1)
Requirements Traceability Matrix
110(1)
Building the System/Acceptance Test Plan
111(4)
Static Testing the Logical Design
115(6)
Data Model, Process Model, and the Linkage
115(2)
Testing the Logical Design with Technical Reviews
117(1)
Refining the System/Acceptance Test Plan
118(3)
Static Testing the Physical Design
121(6)
Testing the Physical Design with Technical Reviews
121(1)
Creating Integration Test Cases
122(1)
Methodology for Integration Testing
123(4)
Identify Unit Interfaces
123(1)
Reconcile Interfaces for Completeness
124(1)
Create Integration Test Conditions
124(1)
Evaluate the Completeness of Integration Test Conditions
124(3)
Static Testing the Program Unit Design
127(4)
Testing the Program Unit Design with Technical Reviews
127(1)
Sequence
127(1)
Selection
128(1)
Iteration
128(1)
Creating Unit Test Cases
128(3)
Static Testing and Dynamic Testing the Code
131(8)
Testing Coding with Technical Reviews
131(1)
Executing the Test Plan
132(1)
Unit Testing
133(1)
Integration Testing
134(1)
System Testing
134(1)
Acceptance Testing
134(1)
Defect Recording
135(4)
SECTION 3 SPIRAL (AGILE) SOFTWARE TESTING METHODOLOGY: PLAN, DO, CHECK, ACT
Development Methodology Overview
139(16)
Limitations of Life-Cycle Development
139(1)
The Client/Server Challenge
140(1)
Psychology of Client/Server Spiral Testing
141(5)
The New School of Thought
141(1)
Tester/Developer Perceptions
142(1)
Project Goal: Integrate QA and Development
143(1)
Iterative/Spiral Development Methodology
144(2)
Role of JADs
146(1)
Role of Prototyping
146(2)
Methodology for Developing Prototypes
148(3)
Develop the Prototype
148(1)
Demonstrate Prototypes to Management
149(1)
Demonstrate Prototype to Users
150(1)
Revise and Finalize Specifications
150(1)
Develop the Production System
151(1)
Continuous Improvement "Spiral" Testing Approach
151(4)
Information Gathering (Plan)
155(12)
Prepare for the Interview
156(1)
Identify the Participants
156(1)
Define the Agenda
156(1)
Conduct the Interview
156(9)
Understand the Project
158(1)
Understand the Project Objectives
159(1)
Understand the Project Status
160(1)
Understand the Project Plans
160(1)
Understand the Project Development Methodology
161(1)
Identify the High-Level Business Requirements
161(1)
Perform Risk Analysis
162(1)
Computer Risk Analysis
163(1)
Judgment and Instinct
163(1)
Dollar Estimation
163(1)
Identifying and Weighting Risk Attributes
164(1)
Summarize the Findings
165(2)
Summarize the Interview
165(1)
Confirm the Interview Findings
165(2)
Test Planning (Plan)
167(28)
Build a Test Plan
168(20)
Prepare an Introduction
168(2)
Define the High-Level Functional Requirements (in Scope)
170(1)
Identify Manual/Automated Test Types
171(1)
Identify the Test Exit Criteria
171(1)
Establish Regression Test Strategy
172(2)
Define the Test Deliverables
174(1)
Organize the Test Team
175(2)
Establish a Test Environment
177(1)
Define the Dependencies
177(1)
Create a Test Schedule
178(1)
Select the Test Tools
178(4)
Establish Defect Recording/Tracking Procedures
182(2)
Establish Change Request Procedures
184(1)
Establish Version Control Procedures
185(1)
Define Configuration Build Procedures
186(1)
Define Project Issue Resolution Procedures
186(1)
Establish Reporting Procedures
187(1)
Define Approval Procedures
187(1)
Define the Metric Objectives
188(6)
Define the Metrics
188(1)
Define the Metric Points
189(5)
Review/Approve the Plan
194(1)
Schedule/Conduct the Review
194(1)
Obtain Approvals
194(1)
Test Case Design (Do)
195(14)
Design Function Tests
195(5)
Refine the Functional Test Requirements
195(5)
Build a Function/Test Matrix
200(1)
Design GUI Tests
200(3)
Ten Guidelines for Good GUI Design
200(2)
Identify the Application GUI Components
202(1)
Define the GUI Tests
202(1)
Define the System/Acceptance Tests
203(3)
Identify Potential System Tests
203(2)
Design System Fragment Tests
205(1)
Identify Potential Acceptance Tests
206(1)
Review/Approve Design
206(3)
Schedule/Prepare for Review
206(1)
Obtain Approvals
206(3)
Test Development (Do)
209(4)
Develop Test Scripts
209(1)
Script the Manual/Automated GUI/Function Tests
209(1)
Script the Manual/Automated System Fragment Tests
210(1)
Review/Approve Test Development
210(3)
Schedule/Prepare for Review
210(2)
Obtain Approvals
212(1)
Test Coverage through Traceability
213(4)
Use Cases and Traceability
214(2)
Summary
216(1)
Test Execution/Evaluation (Do/Check)
217(6)
Setup and Testing
217(2)
Regression Test the Manual/Automated Spiral Fixes
217(2)
Execute the Manual/Automated New Spiral Tests
219(1)
Document the Spiral Test Defects
219(1)
Evaluation
219(1)
Analyze the Metrics
219(1)
Publish Interim Report
220(3)
Refine the Test Schedule
220(1)
Identify Requirement Changes
221(2)
Prepare for the Next Spiral (Act)
223(10)
Refine the Tests
223(2)
Update the Function/GUI Tests
223(2)
Update the System Fragment Tests
225(1)
Update the Acceptance Tests
225(1)
Reassess the Team, Procedures, and Test Environment
225(2)
Evaluate the Test Team
225(1)
Review the Test Control Procedures
226(1)
Update the Test Environment
227(1)
Publish Interim Test Report
227(6)
Publish the Metric Graphics
227(1)
Test Case Execution Status
227(1)
Defect Gap Analysis
228(1)
Defect Severity Status
228(1)
Test Burnout Tracking
228(5)
Conduct the System Test (Act)
233(20)
Complete System Test Plan
233(6)
Finalize the System Test Types
233(2)
Finalize System Test Schedule
235(1)
Organize the System Test Team
235(3)
Establish the System Test Environment
238(1)
Install the System Test Tools
239(1)
Complete System Test Cases
239(11)
Design/Script the Performance Tests
239(1)
Monitoring Approach
240(1)
Probe Approach
241(1)
Test Drivers
241(1)
Design/Script the Security Tests
242(1)
A Security Design Strategy
242(1)
Design/Script the Volume Tests
243(1)
Design/Script the Stress Tests
243(1)
Design/Script the Compatibility Tests
244(1)
Design/Script the Conversion Tests
245(1)
Design/Script the Usability Tests
246(1)
Design/Script the Documentation Tests
246(1)
Design/Script the Backup Tests
247(1)
Design/Script the Recovery Tests
248(1)
Design/Script the Installation Tests
248(1)
Design/Script Other System Test Types
249(1)
Review/Approve System Tests
250(1)
Schedule/Conduct the Review
250(1)
Obtain Approvals
250(1)
Execute the System Tests
251(2)
Regression Test the System Fixes
251(1)
Execute the New System Tests
251(1)
Document the System Defects
251(2)
Conduct Acceptance Testing
253(8)
Complete Acceptance Test Planning
253(3)
Finalize the Acceptance Test Types
253(2)
Finalize the Acceptance Test Schedule
255(1)
Organize the Acceptance Test Team
255(1)
Establish the Acceptance Test Environment
256(1)
Install Acceptance Test Tools
256(1)
Complete Acceptance Test Cases
256(1)
Identify the System-Level Test Cases
257(1)
Design/Script Additional Acceptance Tests
257(1)
Review/Approve Acceptance Test Plan
257(1)
Schedule/Conduct the Review
257(1)
Obtain Approvals
258(1)
Execute the Acceptance Tests
258(3)
Regression Test the Acceptance Fixes
258(1)
Execute the New Acceptance Tests
259(1)
Document the Acceptance Defects
259(2)
Summarize/Report Test Results
261(18)
Perform Data Reduction
261(2)
Ensure All Tests Were Executed/Resolved
261(1)
Consolidate Test Defects by Test Number
261(1)
Post Remaining Defects to a Matrix
262(1)
Prepare Final Test Report
263(9)
Prepare the Project Overview
263(1)
Summarize the Test Activities
263(1)
Analyze/Create Metric Graphics
263(1)
Defects by Function
264(1)
Defects by Tester
264(1)
Defect Gap Analysis
264(1)
Defect Severity Status
264(1)
Test Burnout Tracking
264(2)
Root Cause Analysis
266(1)
Defects by How Found
266(1)
Defects by Who Found
267(1)
Functions Tested and Not Tested
267(1)
System Testing Defect Types
268(1)
Acceptance Testing Defect Types
268(1)
Develop Findings/Recommendations
269(3)
Review/Approve the Final Test Report
272(7)
Schedule/Conduct the Review
272(1)
Obtain Approvals
273(1)
Publish the Final Test Report
273(6)
SECTION 4 PROJECT MANAGEMENT METHODOLOGY
The Project Management Framework
279(12)
The Project Framework
279(1)
Product Quality and Project Quality
279(1)
Components of the Project Framework
280(1)
The Project Framework and Continuous Quality Improvement
280(1)
The Project Framework Phases
281(2)
Initiation Phase
281(1)
Planning Phase
282(1)
Executing, Monitoring, and Controlling Phases
282(1)
Implement Phase
283(1)
Scoping the Project to Ensure Product Quality
283(1)
Product Scope and Project Scope
283(1)
The Project Charter
284(1)
The Scope Statement
285(1)
The Role of the Project Manager in Quality Management
285(1)
The Role of the Test Manager in Quality Management
286(2)
Analyze the Requirements
286(1)
Perform a Gap Analysis
286(1)
Avoid Duplication and Repetition
287(1)
Define the Test Data
287(1)
Validate the Test Environment
287(1)
Analyze the Test Results
288(1)
Deliver the Quality
288(1)
Advice for the Test Manager
288(2)
Request Help from Others
288(1)
Communicate Issues as They Arise
288(1)
Always Update Your Business Knowledge
289(1)
Learn the New Testing Technologies and Tools
289(1)
Improve the Process
289(1)
Create a Knowledge Base
289(1)
The Benefits of the Quality Project Management and the Project Framework
290(1)
Project Quality Management
291(10)
Project Quality Management Processes
291(1)
Quality Planning
292(1)
Identifying the High-Level Project Activities
292(1)
Estimating the Test Work Effort
292(1)
Test Planning
293(1)
Effort Estimation: Model Project
294(2)
Quality Standards
296(5)
The Defect Management Process
301(8)
Quality Control and Defect Management
301(1)
Defect Discovery and Classification
301(1)
Defect Priority
302(1)
Defect Category
303(1)
Defect Tracking
303(1)
Defect Reporting
304(1)
Defect Summary
304(1)
Defect Meetings
305(1)
Defect Metrics
305(1)
Quality Standards
306(3)
Integrated Testing and Development
309(6)
Quality Control and Integrated Testing
309(1)
Integrated Testing
309(1)
Organize the Test Team
310(1)
Identify the Tasks to Integrate
310(1)
Customize Test Steps and Tasks
311(1)
Select Integration Points
311(1)
Modify the Development Methodology
312(1)
Test Methodology Training
312(1)
Incorporate Defect Recording
313(1)
The Integrated Team
313(2)
Test Management Constraints
315(8)
Organizational Architecture
315(1)
Traits of a Well-Established Quality Organization
315(1)
Division of Responsibilities
316(1)
Organizational Relationships
317(1)
Using the Project Framework Where No Quality Infrastructure Exists
317(1)
Ad Hoc Testing and the Project Framework
318(1)
Using a Traceability/Validation Matrix
319(1)
Reporting the Progress
319(4)
SECTION 5 EMERGING SPECIALIZED AREAS IN TESTING
Test Process and Automation Assessment
323(20)
Test Process Assessment
323(1)
Process Evaluation Methodology
324(6)
Identify the Key Elements
324(1)
Gather and Analyze the Information
325(1)
Analyze Test Maturity
326(1)
The Requirements Definition Maturity
326(1)
Test Strategy Maturity
327(1)
Test Effort Estimation Maturity
328(1)
Test Design and Execution Maturity
328(1)
Regression Testing Maturity
329(1)
Test Automation Maturity
329(1)
Document and Present Findings
330(1)
Test Automation Assessment
330(4)
Identify the Applications to Automate
332(1)
Identify the Best Test Automation Tool
332(1)
Test Scripting Approach
333(1)
Test Execution Approach
333(1)
Test Script Maintenance
334(1)
Test Automation Framework
334(9)
Basic Features of an Automation Framework
335(1)
Define the Folder Structure
335(1)
Modularize Scripts/Test Data to Increase Robustness
336(1)
Reuse Generic Functions and Application-Specific Function Libraries
336(1)
Develop Scripting Guidelines and Review Checklists
336(1)
Define Error Handling and Recovery Functions
337(1)
Define the Maintenance Process
337(1)
Standard Automation Frameworks
337(1)
Data-Driven Framework
338(1)
Modular Framework
338(1)
Keyword-Driven Framework
339(2)
Hybrid Framework
341(2)
Nonfunctional Testing
343(24)
Performance Testing
343(1)
Load Testing
344(1)
Stress Testing
344(1)
Volume Testing
344(1)
Performance Monitoring
344(1)
Performance Testing Approach
344(1)
Knowledge Acquisition Process
345(1)
Test Development
346(4)
Performance Deliverables
350(1)
Security Testing
351(2)
Identifying the Scope of Security Testing
352(1)
Test Case Generation and Execution
353(1)
Types of Security Testing
353(5)
Network Scanning
353(1)
Purpose
354(1)
Tools
354(1)
Approach
354(1)
Vulnerability Scanning
354(1)
Purpose
355(1)
Tools
355(1)
Approach
355(1)
Password Cracking
355(1)
Tools
356(1)
Log Reviews
356(1)
Approach
356(1)
File Integrity Checkers
356(1)
Purpose
356(1)
Tools
357(1)
Virus Detectors
357(1)
Tools
357(1)
Approach
357(1)
Penetration Testing
357(1)
Purpose
358(1)
Approach
358(1)
Usability Testing
358(1)
Goals of Usability Testing
359(5)
Approach and Execution
360(1)
Guidelines for Usability Testing
361(1)
Accessibility Testing and Section 508
361(3)
Compliance Testing
364(3)
SOA Testing
367(4)
Key Steps of SOA Testing
368(3)
Agile Testing
371(6)
Agile User Stories Contrasted to Formal Requirements
371(1)
What Is a User Story?
372(1)
Agile Planning
372(2)
Types of Agile Testing
374(1)
Compliance Testing
375(2)
Testing Center of Excellence
377(6)
Industry Best Processes
381(1)
Testing Metrics
381(1)
Operating Model
381(1)
Test Automation Framework
382(1)
Continuous Competency Development
382(1)
On-Site/Offshore Model
383(16)
Analysis
384(1)
Determine the Economic Trade-Offs
384(1)
Determine the Selection Criteria
385(1)
Project Management and Monitoring
385(1)
Outsourcing Methodology
385(3)
On-Site Activities
386(1)
Offshore Activities
387(1)
Implementing the On-Site/Offshore Model
388(1)
Knowledge Transfer
388(1)
Detailed Design
388(1)
Milestone-Based Transfer
388(1)
Steady State
389(1)
Application Management
389(1)
Prerequisites
389(3)
Relationship Model
389(2)
Standards
391(1)
Benefits of On-Site/Offshore Methodology
392(2)
On-Site/Offshore Model Challenges
393(1)
Out of Sight
393(1)
Establish Transparency
394(1)
Security Considerations
394(1)
Project Monitoring
394(1)
Management Overhead
394(1)
Cultural Differences
394(1)
Software Licensing
394(1)
Future of the Onshore/Offshore Model
394(5)
SECTION 6 MODERN SOFTWARE TESTING TOOLS
Software Testing Trends
399(10)
Automated Capture/Replay Testing Tools
399(1)
Test Case Builder Tools
400(1)
Necessary and Sufficient Conditions
400(1)
Test Data Generation Strategies
401(8)
Sampling from Production
401(1)
Starting from Scratch
402(1)
Seeding the Data
402(1)
Generating Data Based on the Database
403(1)
A Cutting-Edge Test Case Generator Based on Requirements
404(5)
Taxonomy of Software Testing Tools
409(22)
Testing Tool Selection Checklist
409(1)
Commercial Vendor Tool Descriptions
410(1)
Open-Source Freeware Vendor Tools
410(1)
When You Should Consider Test Automation
410(18)
When You Should Not Consider Test Automation
428(3)
Methodology to Evaluate Automated Testing Tools
431(198)
Define Your Test Requirements
431(1)
Set Tool Objectives
432(1)
Conduct Selection Activities for Informal Procurement
432(2)
Develop the Acquisition Plan
432(1)
Define Selection Criteria
432(1)
Identify Candidate Tools
433(1)
Conduct the Candidate Review
433(1)
Score the Candidates
433(1)
Select the Tool
434(1)
Conduct Selection Activities for Formal Procurement
434(2)
Develop the Acquisition Plan
434(1)
Create the Technical Requirements Document
434(1)
Review Requirements
434(1)
Generate the Request for Proposal
434(1)
Solicit Proposals
435(1)
Perform the Technical Evaluation
435(1)
Select a Tool Source
435(1)
Procure the Testing Tool
436(1)
Create the Evaluation Plan
436(1)
Create the Tool Manager's Plan
436(1)
Create the Training Plan
437(1)
Receive the Tool
437(1)
Perform the Acceptance Test
437(1)
Conduct Orientation
437(1)
Implement Modifications
438(1)
Train Tool Users
438(1)
Use the Tool in the Operating Environment
438(1)
Write the Evaluation Report
439(1)
Determine Whether Goals Have Been Met
439(4)
SECTION 7 APPENDICES
Appendix A: Spiral (Agile) Testing Methodology
443(10)
Appendix B: Software Quality Assurance Plan
453(2)
Appendix C: Requirements Specification
455(2)
Appendix D: Change Request Form
457(2)
Appendix E: Test Templates
459(34)
Appendix F: Checklists
493(64)
Appendix G: Software Testing Techniques
557(72)
Bibliography 629(4)
Glossary 633(8)
Index 641
William E. Lewis