
Ebook: Common System and Software Testing Pitfalls: How to Prevent and Mitigate Them: Descriptions, Symptoms, Consequences, Causes, and Recommendations

  • Format: PDF+DRM
  • Price: €24.37*
  • * The price is final, i.e. no further discounts apply.
  • This ebook is for personal use only. Ebooks cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not permitted

  • Printing:

    not permitted

  • Usage:

    Digital rights management (DRM)
    The publisher has released this ebook in encrypted form, which means that you must install special software to read it. You will also need to create an Adobe ID. More information is available here. The ebook can be read by 1 user and downloaded to up to 6 devices (all authorized with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), install the free PocketBook Reader app (iOS / Android).

    To read on a PC or Mac, install Adobe Digital Editions. (This is a free application designed specifically for reading ebooks. It should not be confused with Adobe Reader, which is probably already installed on your computer.)

    This ebook cannot be read on an Amazon Kindle.

Don's book is a very good addition both to the testing literature and to the literature on quality assurance and software engineering. [It] is likely to become a standard for test training as well as a good reference for professional testers and developers. I would also recommend this book as background material for negotiating outsourced software contracts. I often work as an expert witness in litigation for software with very poor quality, and this book might well reduce or eliminate these lawsuits.

Capers Jones, VP and CTO, Namcook Analytics LLC

Software and system testers repeatedly fall victim to the same pitfalls. Think of them as anti-patterns: mistakes that make testing far less effective and efficient than it ought to be. In Common System and Software Testing Pitfalls, Donald G. Firesmith catalogs 92 of these pitfalls. Drawing on his 35 years of software and system engineering experience, Firesmith shows testers, technical managers, and other stakeholders how to avoid falling into these pitfalls, recognize when they have already fallen in, and escape while minimizing their negative consequences.

 

Firesmith writes for testing professionals and other stakeholders involved in large or medium-sized projects. His anti-patterns and solutions address both pure software applications and software-reliant systems, encompassing heterogeneous subsystems, hardware, software, data, facilities, material, and personnel. For each pitfall, he identifies its applicability, characteristic symptoms, potential negative consequences, and causes, and offers specific, actionable recommendations for avoiding it or limiting its consequences.

 

This guide will help you

  • Pinpoint testing processes that need improvement before, during, and after the project
  • Improve shared understanding and collaboration among all project participants
  • Develop, review, and optimize future project testing programs
  • Make your test documentation far more useful
  • Identify testing risks and appropriate risk-mitigation strategies
  • Categorize testing problems for metrics collection, analysis, and reporting
  • Train new testers, QA specialists, and other project stakeholders

 

With 92 common testing pitfalls organized into 14 categories, this taxonomy of testing pitfalls should be relatively complete. However, in spite of its comprehensiveness, it is also quite likely that additional pitfalls and even missing categories of pitfalls will be identified over time as testers read this book and compare it to their personal experiences. As an enhancement to the print edition, the author has provided the following location on the web where readers can find major additions and modifications to this taxonomy of pitfalls: http://donald.firesmith.net/home/common-testing-pitfalls

 

Please send any recommended changes and additions to dgf (at) sei (dot) cmu (dot) edu, and the author will consider them for publication both on the website and in future editions of this book.

 

Reviews

Firesmith's collection of actionable practices for real-world, non-trivial testing and the processes in which they're applied is comprehensive and uniquely valuable. Nothing published about software testing in recent years provides anything like it.

Robert V. Binder, robertvbinder.com

 

Don's compilation of real-world testing problems, symptoms, and solutions is the most comprehensive available. You can use it early in your project to prevent these problems. Or you can use it at the end, as a ready list of costly lessons you could have avoided, had you used it early on. I'm afraid this book's publication will undermine a lot of excuses for repeating these mistakes.

Vince Alcalde, National Australia Bank

 

Excellent, Excellent, Excellent! This book should be mandatory reading for anyone involved in product development. Donald's book addresses the pitfalls that need to be understood and allowed for in all product development verification and validation planning. While the focus of the book is on software projects, most of the pitfalls are equally applicable to any size project that involves both hardware and software.

Louis S. Wheatcraft, Requirements Experts Inc.

 

The potential impact of this book cannot be overstressed. Software systems that are not adequately tested do not adequately evolve. I highly recommend this book as a must-read for people directly involved in the development and management of software-intensive systems.

Dr. Kenneth E. Nidiffer, Director of Strategic Plans for Government Programs, Software Engineering Institute, Carnegie Mellon University

 

Common System and Software Testing Pitfalls identifies realistic testing pitfalls. More importantly, it also identifies solutions for avoiding them on your next project. Every manager should read this book and follow the recommendations.

Barry Stanly, Enterprise Technology Alliance

 

Whether you are a novice tester or a seasoned professional, you will find this book to be a valuable resource. The information on how to identify and prevent problem areas is clear, concise and, most importantly, actionable.

Allison Yeager, Blackbaud

 

First of all, this is great material! It contains probably all of the testing problems I have faced in my career and some that I wasn't aware of. . . . Thank you for the opportunity to read this superb material!

Alexandru Cosma, Frequentis

 

As a tester, I consider Common System and Software Testing Pitfalls by Donald Firesmith to be a must-read book for all testers and QA engineers.

Thanh Huynh, LogiGear

 

Your book provides very good insight and knowledge. After working in IT for over thirty years, and focusing on software testing the past thirteen years, I still learned more tips and best practices in software testing.

Tom Zalewski, Texas State Government

 

This book is essential for the people in the cyber security business . . . I can see it becoming a classic. Don has done a great job.

Michael Hom, Compass360 Consulting

 

Awesome work. Very mature.

Alejandro Salado, Kaiser, Threde Gmbh

 

All in all, a great document.

Peter Bolin, Revolution IT Pty Ltd.

 

Foreword xiii
Preface xvii
Scope xviii
Intended Audience xviii
How to Use This Book and Its Contents xix
Organization of This Book xx
Acknowledgments xxi
About the Author xxiii
1 Overview 1(12)
1.1 What Is Testing? 1(1)
1.2 Testing and the V Models 2(3)
1.3 What Is a Defect? 5(2)
1.4 Why Is Testing Critical? 7(2)
1.5 The Limitations of Testing 9(1)
1.6 What Is a Testing Pitfall? 10(1)
1.7 Categorizing Pitfalls 11(1)
1.8 Pitfall Specifications 11(2)
2 Brief Overviews of the Testing Pitfalls 13(12)
2.1 General Testing Pitfalls 13(7)
2.1.1 Test Planning and Scheduling Pitfalls 13(1)
2.1.2 Stakeholder Involvement and Commitment Pitfalls 14(1)
2.1.3 Management-Related Testing Pitfalls 14(1)
2.1.4 Staffing Pitfalls 15(1)
2.1.5 Test-Process Pitfalls 16(1)
2.1.6 Test Tools and Environments Pitfalls 17(1)
2.1.7 Test Communication Pitfalls 18(1)
2.1.8 Requirements-Related Testing Pitfalls 19(1)
2.2 Test-Type-Specific Pitfalls 20(5)
2.2.1 Unit Testing Pitfalls 20(1)
2.2.2 Integration Testing Pitfalls 20(1)
2.2.3 Specialty Engineering Testing Pitfalls 21(1)
2.2.4 System Testing Pitfalls 22(1)
2.2.5 System of Systems (SoS) Testing Pitfalls 22(1)
2.2.6 Regression Testing Pitfalls 23(2)
3 Detailed Descriptions of the Testing Pitfalls 25(216)
3.1 Common Negative Consequences 25(1)
3.2 General Recommendations 26(2)
3.3 General Testing Pitfalls 28(136)
3.3.1 Test Planning and Scheduling Pitfalls 28(16)
No Separate Test Planning Documentation (GEN-TPS-1) 28(3)
Incomplete Test Planning (GEN-TPS-2) 31(4)
Test Plans Ignored (GEN-TPS-3) 35(2)
Test-Case Documents as Test Plans (GEN-TPS-4) 37(2)
Inadequate Test Schedule (GEN-TPS-5) 39(3)
Testing at the End (GEN-TPS-6) 42(2)
3.3.2 Stakeholder Involvement and Commitment Pitfalls 44(7)
Wrong Testing Mindset (GEN-SIC-1) 44(3)
Unrealistic Testing Expectations (GEN-SIC-2) 47(2)
Lack of Stakeholder Commitment to Testing (GEN-SIC-3) 49(2)
3.3.3 Management-Related Testing Pitfalls 51(14)
Inadequate Test Resources (GEN-MGMT-1) 52(5)
Inappropriate External Pressures (GEN-MGMT-2)
Inadequate Test-Related Risk Management (GEN-MGMT-3) 57(2)
Inadequate Test Metrics (GEN-MGMT-4) 59(2)
Inconvenient Test Results Ignored (GEN-MGMT-5) 61(3)
Test Lessons Learned Ignored (GEN-MGMT-6) 64(1)
3.3.4 Staffing Pitfalls 65(10)
Lack of Independence (GEN-STF-1) 66(2)
Unclear Testing Responsibilities (GEN-STF-2) 68(1)
Inadequate Testing Expertise (GEN-STF-3) 69(3)
Developers Responsible for All Testing (GEN-STF-4) 72(2)
Testers Responsible for All Testing (GEN-STF-5) 74(1)
3.3.5 Test Process Pitfalls 75(31)
Testing and Engineering Processes Not Integrated (GEN-PRO-1) 76(1)
One-Size-Fits-All Testing (GEN-PRO-2) 77(3)
Inadequate Test Prioritization (GEN-PRO-3) 80(2)
Functionality Testing Overemphasized (GEN-PRO-4) 82(3)
Black-Box System Testing Overemphasized (GEN-PRO-5) 85(1)
Black-Box System Testing Underemphasized (GEN-PRO-6) 86(2)
Too Immature for Testing (GEN-PRO-7) 88(2)
Inadequate Evaluations of Test Assets (GEN-PRO-8) 90(2)
Inadequate Maintenance of Test Assets (GEN-PRO-9) 92(2)
Testing as a Phase (GEN-PRO-10) 94(2)
Testers Not Involved Early (GEN-PRO-11) 96(2)
Incomplete Testing (GEN-PRO-12) 98(2)
No Operational Testing (GEN-PRO-13) 100(2)
Inadequate Test Data (GEN-PRO-14) 102(2)
Test-Type Confusion (GEN-PRO-15) 104(2)
3.3.6 Test Tools and Environments Pitfalls 106(25)
Over-Reliance on Manual Testing (GEN-TTE-1) 106(2)
Over-Reliance on Testing Tools (GEN-TTE-2) 108(2)
Too Many Target Platforms (GEN-TTE-3) 110(2)
Target Platform Difficult to Access (GEN-TTE-4) 112(2)
Inadequate Test Environments (GEN-TTE-5) 114(4)
Poor Fidelity of Test Environments (GEN-TTE-6) 118(4)
Inadequate Test Environment Quality (GEN-TTE-7) 122(2)
Test Assets Not Delivered (GEN-TTE-8) 124(2)
Inadequate Test Configuration Management (GEN-TTE-9) 126(3)
Developers Ignore Testability (GEN-TTE-10) 129(2)
3.3.7 Test Communication Pitfalls 131(12)
Inadequate Architecture or Design Documentation (GEN-COM-1) 131(3)
Inadequate Defect Reports (GEN-COM-2) 134(2)
Inadequate Test Documentation (GEN-COM-3) 136(3)
Source Documents Not Maintained (GEN-COM-4) 139(1)
Inadequate Communication Concerning Testing (GEN-COM-5) 140(3)
3.3.8 Requirements-Related Testing Pitfalls 143(21)
Ambiguous Requirements (GEN-REQ-1) 144(3)
Obsolete Requirements (GEN-REQ-2) 147(3)
Missing Requirements (GEN-REQ-3) 150(2)
Incomplete Requirements (GEN-REQ-4) 152(2)
Incorrect Requirements (GEN-REQ-5) 154(2)
Requirements Churn (GEN-REQ-6) 156(3)
Improperly Derived Requirements (GEN-REQ-7) 159(2)
Verification Methods Not Properly Specified (GEN-REQ-8) 161(1)
Lack of Requirements Trace (GEN-REQ-9) 162(2)
3.4 Test-Type-Specific Pitfalls 164(77)
3.4.1 Unit Testing Pitfalls 164(5)
Testing Does Not Drive Design and Implementation (TTS-UNT-1) 165(2)
Conflict of Interest (TTS-UNT-2) 167(2)
3.4.2 Integration Testing Pitfalls 169(8)
Integration Decreases Testability Ignored (TTS-INT-1) 169(3)
Inadequate Self-Monitoring (TTS-INT-2) 172(1)
Unavailable Components (TTS-INT-3) 173(2)
System Testing as Integration Testing (TTS-INT-4) 175(2)
3.4.3 Specialty Engineering Testing Pitfalls 177(29)
Inadequate Capacity Testing (TTS-SPC-1) 178(3)
Inadequate Concurrency Testing (TTS-SPC-2) 181(2)
Inadequate Internationalization Testing (TTS-SPC-3) 183(2)
Inadequate Interoperability Testing (TTS-SPC-4) 185(3)
Inadequate Performance Testing (TTS-SPC-5) 188(2)
Inadequate Reliability Testing (TTS-SPC-6) 190(3)
Inadequate Robustness Testing (TTS-SPC-7) 193(4)
Inadequate Safety Testing (TTS-SPC-8) 197(3)
Inadequate Security Testing (TTS-SPC-9) 200(3)
Inadequate Usability Testing (TTS-SPC-10) 203(3)
3.4.4 System Testing Pitfalls 206(5)
Test Hooks Remain (TTS-SYS-1) 206(2)
Lack of Test Hooks (TTS-SYS-2) 208(1)
Inadequate End-to-End Testing (TTS-SYS-3) 209(2)
3.4.5 System of Systems (SoS) Testing Pitfalls 211(14)
Inadequate SoS Test Planning (TTS-SoS-1) 212(1)
Unclear SoS Testing Responsibilities (TTS-SoS-2) 213(2)
Inadequate Resources for SoS Testing (TTS-SoS-3) 215(2)
SoS Testing Not Properly Scheduled (TTS-SoS-4) 217(2)
Inadequate SoS Requirements (TTS-SoS-5) 219(1)
Inadequate Support from Individual System Projects (TTS-SoS-6) 220(2)
Inadequate Defect Tracking Across Projects (TTS-SoS-7) 222(2)
Finger-Pointing (TTS-SoS-8) 224(1)
3.4.6 Regression Testing Pitfalls 225(16)
Inadequate Regression Test Automation (TTS-REG-1) 225(3)
Regression Testing Not Performed (TTS-REG-2) 228(3)
Inadequate Scope of Regression Testing (TTS-REG-3) 231(3)
Only Low-Level Regression Tests (TTS-REG-4) 234(2)
Test Resources Not Delivered for Maintenance (TTS-REG-5) 236(1)
Only Functional Regression Testing (TTS-REG-6) 237(4)
4 Conclusion 241(2)
4.1 Future Work 241(1)
4.2 Maintaining the Lists of Pitfalls 242(1)
A Glossary 243(10)
B Acronyms 253(2)
C Notes 255(14)
D References 269(2)
E Planning Checklist 271(8)
Index 279
Donald G. Firesmith is a senior member of the technical staff in the Software Solutions Division at the Software Engineering Institute (SEI). There, he helps the U.S. Department of Defense and other agencies acquire large, complex, software-reliant systems. An internationally recognized software and systems engineering expert, he has published books on requirements engineering, architecture engineering, situational method engineering, testing, and object-oriented development.