Preface  xix
    Features and Organization  xxi
    Practice Descriptions  xxiii
    Intended Audience  xxiv
    Acknowledgments  xxiv
    Permissions  xxvi
    Disclaimer  xxvi

The Case for Automated Defect Prevention  1 (18)
    Introduction  1 (2)
    What Are the Goals of ADP?  3 (5)
        People: Stimulated and Satisfied  3 (1)
        Product: High Quality  4 (1)
        Organization: Increased Productivity and Operational Efficiency  5 (1)
        Process: Controlled, Improved, and Sustainable  6 (1)
        Project: Managed through Informed Decision Making  7 (1)
    What Is ADP?  8 (3)
        Principles  8 (1)
        Practices  9 (1)
        Policies  9 (1)
        Defect Prevention Mindset  10 (1)
    11 (1)
    From the Waterfall to Modern Software Development Process Models  11 (2)
    Acronyms  13 (1)
    Glossary  13 (2)
    References  15 (1)
    Exercises  16 (3)

Principles of Automated Defect Prevention  19 (34)
    Introduction  19 (2)
    Defect Prevention: Definition and Benefits  21 (3)
    Historical Perspective: Defect Analysis and Prevention in the Auto Industry---What Happened to Deming?  24 (2)
    Principles of Automated Defect Prevention  26 (12)
        Principle 1---Establishment of Infrastructure: "Build a Strong Foundation through Integration of People and Technology"  26 (2)
        Principle 2---Application of General Best Practices: "Learn from Others' Mistakes"  28 (1)
        Principle 3---Customization of Best Practices: "Learn from Your Own Mistakes and Improve the Process"  29 (2)
        Principle 4---Measurement and Tracking of Project Status: "Understand the Past and Present to Make Decisions about the Future"  31 (1)
        Principle 5---Automation: "Let the Computer Do It"  32 (4)
        Principle 6---Incremental Implementation of ADP's Practices and Policies  36 (2)
    Automated Defect Prevention-Based Software Development Process Model  38 (3)
    Examples  41 (7)
        Focus on Root Cause Analysis of a Defect  41 (3)
        44 (1)
        Focus on Customized Best Practice  45 (2)
        Focus on Measurements of Project Status  47 (1)
    Acronyms  48 (1)
    Glossary  48 (2)
    References  50 (1)
    Exercises  51 (2)

Initial Planning and Infrastructure  53 (32)
    Introduction  53 (1)
    Initial Software Development Plan  54 (2)
        54 (1)
        54 (1)
        55 (1)
        55 (1)
    Best Practices for Creating People Infrastructure  56 (7)
        56 (2)
        Determining a Location for Each Group's Infrastructure  58 (1)
        58 (3)
        Establishing a Training Program  61 (1)
        Cultivating a Positive Group Culture  61 (2)
    Best Practices for Creating Technology Infrastructure  63 (12)
        Automated Reporting System  63 (3)
        Policy for Use of Automated Reporting System  66 (2)
        Minimum Technology Infrastructure  68 (4)
        Intermediate Technology Infrastructure  72 (1)
        Expanded Technology Infrastructure  73 (2)
    Integrating People and Technology  75 (2)
    Human Factors and Concerns  77 (1)
    Examples  78 (2)
        78 (1)
        Focus on Reports Generated by the Minimum Infrastructure  79 (1)
    Acronyms  80 (1)
    Glossary  81 (1)
    References  82 (1)
    Exercises  83 (2)

Requirements Specification and Management  85 (34)
    Introduction  85 (2)
    Best Practices for Gathering and Organizing Requirements  87 (18)
        Creating the Product Vision and Scope Document  87 (2)
        Gathering and Organizing Requirements  89 (4)
        Prioritizing Requirements  93 (2)
        95 (3)
        Creating a Prototype to Elicit Requirements  98 (2)
        Creating Conceptual Test Cases  100 (1)
        Requirements Documents Inspection  101 (2)
        Managing Changing Requirements  103 (2)
    Best Practices in Different Environments  105 (2)
        Existing versus New Software Project  105 (1)
        In-House versus Outsourced Development Teams  106 (1)
    Policy for Use of the Requirements Management System  107 (3)
        Measurements Related to Requirements Management System  109 (1)
        Tracking of Data Related to the Requirements Management System  110 (1)
    Examples  110 (5)
        Focus on Customized Best Practice  110 (2)
        Focus on Monitoring and Managing Requirement Priorities  112 (2)
        114 (1)
    Acronyms  115 (1)
    Glossary  115 (1)
    References  116 (1)
    Exercises  116 (3)

Extended Planning and Infrastructure  119 (46)
    Introduction  119 (1)
    Software Development Plan  120 (1)
    Defining Project Objectives  121 (3)
    Defining Project Artifacts and Deliverables  124 (4)
        The Vision and Scope Document and Project Objectives  124 (1)
        SRS, Describing the Product Key Features  125 (1)
        Architectural and Detailed Design Documents and Models  125 (1)
        List of COTS (Commercial Off-the-Shelf Components) Used  125 (1)
        Source and Executable Code  126 (1)
        126 (1)
        126 (1)
        Periodic Reports Generated by the Reporting System  126 (1)
        127 (1)
        User and Operational Manuals  127 (1)
        127 (1)
    Selecting a Software Development Process Model  128 (1)
    Defining Defect Prevention Process  129 (1)
    129 (3)
    132 (1)
    Defining Work Breakdown Structure---An Iterative Approach  132 (3)
    Best Practices for Estimating Project Effort  135 (11)
        Estimation by Using Elements of Wideband Delphi  137 (1)
        Estimation by Using Effort Analogy  138 (3)
        Estimation by Using Parametric Models  141 (4)
        Estimations of Using COTS and Code Reuse  145 (1)
        Quality of Estimation and the Iterative Adjustments of Estimates  145 (1)
    Best Practices for Preparing the Schedule  146 (3)
    Measurement and Tracking for Estimation  149 (1)
    Identifying Additional Resource Requirements  150 (7)
        Extending the Technology Infrastructure  151 (5)
        Extending the People Infrastructure  156 (1)
    Examples  157 (3)
        Focus on the Root Cause of a Project Scheduling Problem  157 (1)
        Focus on Organizing and Tracking Artifacts  158 (1)
        Focus on Scheduling and Tracking Milestones  158 (2)
    Acronyms  160 (1)
    Glossary  160 (2)
    References  162 (1)
    Exercises  163 (2)

Architectural and Detailed Design  165 (42)
    Introduction  165 (3)
    Best Practices for Design of System Functionality and Its Quality Attributes  168 (21)
        Identifying Critical Attributes of Architectural Design  168 (4)
        Defining the Policies for Design of Functional and Nonfunctional Requirements  172 (3)
        175 (3)
        Service-Oriented Architecture  178 (1)
        Mapping Requirements to Modules  178 (3)
        Designing Module Interfaces  181 (1)
        Modeling Modules and Their Interfaces  182 (3)
        Defining Application Logic  185 (1)
        186 (1)
        Design Document Storage and Inspection  187 (1)
        Managing Changes in Design  188 (1)
    Best Practices for Design of Graphical User Interface  189 (9)
        Identifying Critical Attributes of User Interface Design  190 (3)
        Defining the User Interface Design Policy  193 (2)
        Identifying Architectural Patterns Applicable to the User Interface Design  195 (1)
        Creating Categories of Actions  195 (1)
        Dividing Actions into Screens  196 (1)
        Prototyping the Interface  197 (1)
        197 (1)
    Examples  198 (2)
        Focus on Module Assignments and Design Progress  198 (1)
        Focus on the Number of Use Cases per Module  198 (1)
        Focus on Module Implementation Overview  199 (1)
        Focus on Customized Best Practice for GUI Design  199 (1)
    Acronyms  200 (1)
    Glossary  201 (3)
    References  204 (1)
    Exercises  205 (2)

Construction  207 (42)
    Introduction  207 (2)
    Best Practices for Code Construction  209 (20)
        Applying Coding Standards throughout Development  210 (15)
        Applying the Test-First Approach at the Service and Module Implementation Level  225 (1)
        Implementing Service Contracts and/or Module Interfaces before Their Internal Functionality  226 (1)
        Applying Test Driven Development for Algorithmically Complex and Critical Code Units  227 (1)
        Conducting White Box Unit Testing after Implementing Each Unit and before Checking the Code into the Source Control System  228 (1)
        Verifying Code Consistency with the Requirements and Design  228 (1)
    Policy for Use of the Code Source Control System  229 (8)
        Measurements Related to Source Control  234 (2)
        Tracking of Source Control Data  236 (1)
    Policy for Use of Automated Build  237 (4)
        Measurements Related to Automated Builds  240 (1)
        Tracking of Data Related to Automated Builds  241 (1)
    Examples  241 (2)
        Focus on a Customized Coding Standard Policy  241 (1)
        Focus on Features/Tests Reports  242 (1)
    Acronyms  243 (1)
    Glossary  244 (1)
    References  245 (2)
    Exercises  247 (2)

Testing and Defect Prevention  249 (38)
    Introduction  249 (1)
    Best Practices for Testing and Code Review  250 (21)
        Conducting White Box Unit Testing: Bottom-Up Approach  251 (5)
        Conducting Black Box Testing and Verifying the Convergence of Top-Down and Bottom-Up Tests  256 (4)
        Conducting Code Reviews as a Testing Activity  260 (3)
        Conducting Integration Testing  263 (2)
        Conducting System Testing  265 (3)
        Conducting Regression Testing  268 (2)
        Conducting Acceptance Testing  270 (1)
    Defect Analysis and Prevention  271 (2)
    Policy for Use of Problem Tracking System  273 (5)
        Measurements of Data Related to the Problem Tracking System  277 (1)
        Tracking of Data Related to the Problem Tracking System  277 (1)
    Policy for Use of Regression Testing System  278 (1)
        Measurements Related to the Regression Testing System  279 (1)
        Tracking of Data Related to the Regression Testing System  279 (1)
    Examples  279 (4)
        Focus on Defect Tracking Reports  279 (1)
        Focus on Test Type Reports  280 (1)
        Example of a Root Cause Analysis of a Design and Testing Defect  280 (3)
    Acronyms  283 (1)
    Glossary  283 (2)
    References  285 (1)
    Exercises  286 (1)

Trend Analysis and Deployment  287 (24)
    Introduction  287 (1)
    Trends in Process Control  288 (2)
        288 (1)
        288 (1)
        289 (1)
    Trends in Project Progress  290 (11)
        Analyzing Features/Requirements Implementation Status  290 (4)
        Analyzing Source Code Growth  294 (2)
        296 (3)
        299 (2)
        Analyzing Cost and Schedule  301 (1)
    Best Practices for Deployment and Transition  301 (8)
        Deployment to a Staging System  301 (2)
        Automation of the Deployment Process  303 (1)
        Assessing Release Readiness  304 (3)
        Release: Deployment to the Production System  307 (1)
        307 (2)
    Acronyms  309 (1)
    Glossary  309 (1)
    References  309 (1)
    Exercises  310 (1)

Managing External Factors  311 (30)
    Introduction  311 (1)
    Best Practices for Managing Outsourced Projects  312 (10)
        Establishing a Software Development Outsource Process  313 (1)
        Phase 0: Decision to Outsource  314 (1)
        315 (4)
        319 (2)
        321 (1)
    Best Practices for Facilitating IT Regulatory Compliance  322 (6)
        Section 508 of the U.S. Rehabilitation Act  322 (1)
        Sarbanes-Oxley Act of 2002  323 (5)
    Best Practices for Implementation of CMMI  328 (9)
        Capability Maturity Model Integration (CMMI)  329 (1)
        330 (1)
        Putting Staged Representation-Based Improvement into Practice Using ADP  330 (6)
        Putting Continuous Representation-Based Improvement into Practice Using ADP  336 (1)
    Acronyms  337 (1)
    Glossary  337 (1)
    References  338 (1)
    Exercises  339 (2)

Case Study: Automation as an Agent of Change  341 (12)
    Case Study: Implementing Java Coding Standards in a Financial Application  341 (11)
        Introduction  341 (1)
        342 (2)
        344 (4)
        348 (3)
        The Bottom Line Results---Facilitating Change  351 (1)
    Acronyms  352 (1)
    Glossary  352 (1)
    References  352 (1)

APPENDIX A: A BRIEF SURVEY OF MODERN SOFTWARE DEVELOPMENT PROCESS MODELS  353 (8)
    Introduction  353 (1)
    Rapid Application Development (RAD) and Rapid Prototyping  353 (2)
    355 (1)
    355 (2)
    Object-Oriented Unified Process  357 (1)
    Extreme and Agile Programming  358 (1)
    359 (2)

APPENDIX B: MARS POLAR LANDER (MPL): LOSS AND LESSONS  361 (8)
    Introduction  361 (1)
    362 (2)
    364 (2)
    366 (3)

APPENDIX C: SERVICE-ORIENTED ARCHITECTURE: EXAMPLE OF AN IMPLEMENTATION WITH ADP BEST PRACTICES  369 (22)
    Introduction  369 (1)
    Web Service Creation: Initial Planning and Requirements  369 (2)
        Functional Requirements  370 (1)
        Nonfunctional Requirements  371 (1)
    Web Service Creation: Extended Planning and Design  371 (4)
        371 (2)
        373 (1)
        374 (1)
    Web Service Creation: Construction and Testing, Stage 1---Module Implementation  375 (6)
        Applying Coding Standards  376 (1)
        Implementing Interfaces and Applying a Test-First Approach for Modules and Submodules  376 (1)
        Generating White Box JUnit Tests  377 (1)
        Gradually Implementing the Submodule until All JUnit Tests Pass and Converge with the Original Black Box Tests  378 (2)
        Checking Verified Tests into the Source Control System and Running Nightly Regression Tests  380 (1)
    Web Service Creation: Construction and Testing, Stage 2---The WSDL Document Implementation  381 (3)
        Creating and Deploying the WSDL Document on the Staging Server as Part of the Nightly Build Process  381 (2)
        Avoiding Inline Schemas when XML Validation Is Required  383 (1)
        Avoiding Cyclical Referencing when Using Inline Schemas  383 (1)
        Verifying WSDL Document for XML Validity  384 (1)
        Avoiding "Encoded" Coding Style by Checking Interoperability  384 (1)
        Creating Regression Tests for the WSDL Documents and Schemas to Detect Undesired Changes  384 (1)
    Web Service Creation: Server Deployment  384 (2)
        Deploying the Web Service to a Staging Server as Part of the Nightly Build Process  384 (1)
        Executing Web Service Tests That Verify the Functionality of the Web Service  385 (1)
        Creating Scenario-Based Tests and Incorporating Them into the Nightly Test Process  386 (1)
        386 (1)
    Web Service Creation: Client Deployment  386 (1)
        Implementing the Client According to the WSDL Document Specification  386 (1)
        Using Server Stubs to Test Client Functionality---Deploying the Server Stub as Part of the Nightly Deployment Process  387 (1)
        Adding Client Tests into the Nightly Test Process  387 (1)
    Web Service Creation: Verifying Security  387 (2)
        Determining the Desired Level of Security  387 (1)
        Deploying Security-Enabled Web Service on Additional Port of the Staging Server  388 (1)
        Leveraging Existing Tests: Modifying Them to Test for Security and Incorporating Them into the Nightly Test Process  388 (1)
    Web Service Creation: Verifying Performance through Continuous Performance/Load Testing  389 (2)
        Starting Load Testing as Early as Possible and Incorporating It into the Nightly Test Process  389 (1)
        Using Results of Load Tests to Determine Final Deployment Configuration  389 (2)

APPENDIX D: AJAX BEST PRACTICE: CONTINUOUS TESTING  391 (4)
    Introduction  391 (1)
    Ajax Development and Testing Challenges  392 (1)
    392 (3)

APPENDIX E: SOFTWARE ENGINEERING TOOLS  395 (6)

Glossary  401 (14)
Index  415