Contents

CHAPTER 1  Software Assurance  1
    1.1.2 The Impossibility of Exhaustive Testing  5
    1.2.1 Achieving Quality Is Easier Than Measuring Quality  6
    1.2.2 Direct and Indirect Measurement  8
    1.2.4 Static and Dynamic Analyses  10
    1.3 The Process and Product Debate  12
    1.3.1 Industry Process Standards  12
    1.3.2 Clean Pipes and Dirty Water  14
    1.3.3 Why Measure Product?  17
    1.4 Fault Injection and Software Assessment  18
    1.4.1 Traditional Process Improvement  19
    1.4.2 Predicting Future Quality  20
    1.4.3 Fault Injection in the Physical World  21
    1.4.4 Fault Injection in the Virtual World  23
    1.4.5 Software Inoculation  26

CHAPTER 2  Setting the Stage  29
    2.1.2 Fault Injection in Hardware Design Languages  30
    2.1.3 Low-Level Software Fault Injection  33
    2.2 Basic Definitions and Relations  34
    2.2.1 Fault Injection as Nontraditional Testing  34
    2.2.2 Absolute Correctness Is a Red Herring  36
    2.2.3 Defining Some Terms  37

CHAPTER 3  Fault-Injection Fundamentals  45
    3.1 The Three Fundamentals  45
    3.1.4 Atomic Operators and Data Spaces  50
    3.2.1 Theorizing about Anomaly Spaces  53
    3.2.2 Using Software Standards to Partition the "All Problems" Space  60
    3.2.3 Simulating Distributed Faults  63
    3.3 Implementation Issues  63
    3.3.1 When to Apply Fault Injection  64
    3.3.2 Pseudorandom Number Generation  64
    3.3.5 Statistics and Result Collection  74

CHAPTER 4  Software Mutation  77
    4.1 Getting the Most from Software Testing  77
    4.2 Code Mutation versus State Mutation  80
    4.3 Compiling or Interpreting Mutants  84
    4.4 Assembly and Object Language Mutation  86
    4.5 Creating Mutation Operators  86
    4.6.1 Syntactically Incorrect Mutants  92
    4.6.2 Operand Replacement Operators  93
    4.6.3 Statement Modification Operators  95
    4.6.4 Expression Modification Operators  102
    4.6.7 Comparison of Ada, C, and Fortran-77 Operators  109

CHAPTER 5  Software Testability  117
    5.2 Reliability versus Testability  121
    5.3 Fault Tolerance versus Testability  122
    5.4 Testability Anomaly Spaces  122
    5.5 Fault Injection-Based Testability: The PIE Algorithm  124
    5.5.1 Propagation Analysis  125
    5.5.4 Understanding the Results of PIE  140
    5.5.5 Using PIE to Generate "Better" Test Cases  140
    5.5.6 PIE versus Formal Verification  141
    5.6 Timing and Ordering Fault-Injection Mechanisms  142
    5.7 Miscellaneous Parameters Affecting Testability Analysis and Mutation Testing  149

CHAPTER 6  Software Safety  159
    6.1 Making Software More Tolerant of Faults  162
    6.1.2 N-Version Programming  164
    6.1.3 Consensus Recovery Block  168
    6.1.4 Analyzing Software for Fault Tolerance with Fault Trees  169
    6.1.6 Safety-Critical Partitioning  175
    6.2 Software Tolerance Measurement with EPA  177
    6.2.1 Extended Propagation Analysis (EPA)  177
    6.3 Simulating Diverse Faulty Components  185
    6.3.1 Simulating a Faulty Specification for N-Version Systems  188
    6.3.2 Simulating Ambiguous Specifications for N-Version Systems  190
    6.4 EPA-Based Fault-Injection Tool Architecture  191
    6.5 Mutation-Based Fault-Injection Tool Architecture  196
    6.6 Process-Based, Debugger-Based, and Processor-Based Fault Injection  196
    6.7 Electromagnetic Radiation Perturbation Functions for EPA  197
    6.8 EPA as a Redesign Heuristic  200

CHAPTER 7  Applied Safety Assessment  205
    7.3 Magneto Stereotaxis System  212
    7.4 Advanced Automated Train Control System  216
    7.5 OECD Halden Reactor Project, Norway  218

CHAPTER 8  Information Security  227
    8.1 The State of Security Assessment  229
    8.2 AVA Theoretical Model  233
    8.2.1 Simulating Code Weaknesses  234
    8.2.2 Test-Case Generation  234
    8.2.3 Specifying Intrusions Using Predicates  235
    8.2.4 Limiting the Information Available from AVA  236
    8.3 httpd Security Analysis Results  238
    8.4 Buffer Overflow with AVA  244
    8.4.2 An AVA Perturbation for Buffer Overflow  248

CHAPTER 9  Maintenance and Reuse  253
    9.1.2 Parnas's Uses Hierarchy  255
    9.1.3 The Year 2000 Problem  256
    9.1.4 COTS and Life Expectancy  260
    9.2.1 Applying Fault Injection to Legacy Code Testing  262
    9.2.2 Applying Fault Injection to COTS Systems  266
    9.2.3 Software Component Composition  280
    9.2.4 Interface Propagation Analysis  281
    9.2.5 Why Not Fault Inject Straight into Executables?  284

CHAPTER 10  Advanced Fault Injection  289
    10.1 Profiles and Environments: The Keys to Good Behavior  290
    10.1.1 Operational Profiles  290
    10.1.2 Inverted Operational Profiles  291
    10.1.3 Modified Operational Profiles  293
    10.2.1 Testability-Based Assertion Localization  299
    10.2.2 Cleansing Assertions  303

CHAPTER 11  Inoculating Real-World Software  319
    11.1 Selling Management on Fault Injection  319
    11.2 Developing a Winning Game Plan  324
    11.2.1 Avoid the Fruit Mix-up  324
    11.2.2 Consider the Life Cycle  325
    11.2.3 Software Standards  329
    11.2.4 Regulations and Certification  330
    11.3 Five Steps to Getting Started with Fault Injection  334
    11.4 What's the Bottom Line?  335

Appendix  339
Index  347