This book describes how to build and implement an automated testing regime for software development. It presents a detailed account of the principles of automated testing, practical techniques for designing a good automated testing regime, and advice on choosing and applying off-the-shelf testing tools to specific needs. This sound and practical introduction to automated testing comes from two authors well known for their seminars, consultancy and training in the field.
Preface

Part One: Techniques for Automating Test Execution

1 Test automation context
  1.1 Introduction
  1.2 Testing and test automation are different
  1.3 The V-model
  1.4 Tool support for life-cycle testing
  1.5 The promise of test automation
  1.6 Common problems of test automation
  1.7 Test activities
  1.8 Automate test design?
  1.9 The limitations of automating software testing
2 Capture Replay is Not Test Automation
  2.1 An example application: Scribble
  2.2 The manual test process: what is to be automated
  2.3 Automating Test Execution: inputs
  2.4 Automating Test Result Comparison
  2.5 The next steps in evolving test automation
  2.6 Conclusion: Automated is not automatic
3 Scripting techniques
  3.1 Introduction
  3.2 Scripting techniques
  3.3 Script pre-processing
4 Automated comparison
  4.1 Verification, comparison and automation
  4.2 What do comparators do?
  4.3 Dynamic comparison
  4.4 Post-execution comparison
  4.5 Simple comparison
  4.6 Complex comparison
  4.7 Test sensitivity
  4.8 Comparing different types of outcome
  4.9 Comparison filters
  4.10 Comparison guidelines
5 Testware Architecture
  5.1 What is testware architecture?
  5.2 Key issues to be resolved
  5.3 An Approach
  5.4 Might this be Overkill?
6 Automating Pre- and Post-Processing
  6.1 What are Pre- and Post-Processing?
  6.2 Pre- and Post-Processing
  6.3 What should happen after test case execution
  6.4 Implementation Issues
7 Building maintainable tests
  7.1 Problems in maintaining automated tests
  7.2 Attributes of test maintenance
  7.3 The conspiracy
  7.4 Strategy and tactics
8 Metrics
  8.1 Why measure testing and test automation?
  8.2 What can we measure?
  8.3 Objectives for testing and test automation
  8.4 Attributes of software testing
  8.5 Attributes of test automation
  8.6 Which is the best test automation regime?
  8.7 Should I really measure all these?
  8.8 Summary
  8.9 Answer to DDP Exercise
9 Other Issues
  9.1 Which Tests to Automate (first)?
  9.2 Selecting which tests to run when
  9.3 Order of test execution
  9.4 Test status
  9.5 Designing software for (automated) testability
  9.6 Synchronization
  9.7 Monitoring progress of automated tests
  9.8 Tailoring your own regime around your tools
10 Choosing a tool to automate testing
  10.1 Introduction to Chapters 10 and 11
  10.2 Where to start in selecting tools: your requirements, not the tool market
  10.3 The tool selection project
  10.4 The tool selection team
  10.5 Identifying your requirements
  10.6 Identifying your constraints
  10.7 Build or buy?
  10.8 Identifying what is available on the market
  10.9 Evaluating the shortlisted candidate tools
  10.10 Making the decision
11 Implementing tools within the organization
  11.1 What could go wrong?
  11.2 Importance of managing the implementation process
  11.3 Roles in the implementation/change process
  11.4 Management commitment
  11.5 Preparation
  11.6 Pilot project
  11.7 Planned phased installation or roll-out
  11.8 Special problems in implementing
  11.9 People issues
  11.10 Conclusion
12 Racal-Redac Case History
  12.1 Introduction
  12.2 Background
  12.3 Solutions
  12.4 Integration to Test Automation
  12.5 System Test Automation
  12.6 The Results Achieved
  12.7 Summary of the case history up to 1991
  12.8 What happened next?
13 The Evolution of an Automated Software Test System
  13.1 Introduction
  13.2 Background
  13.3 Gremlin 1
  13.4 Gremlin 2.0: A Step Beyond Capture/Replay
  13.5 Finding The Real Problem
  13.6 Lessons Learned
14 Experiences with Test Automation
  14.1 Background
  14.2 Planning, preparation and eventual success
  14.3 Benefits of test automation
  14.4 Lessons learned
  14.5 The way forward
15 Automating System Testing in a VMS Environment
  15.1 Background
  15.2 The first attempt at automation
  15.3 New tool selection and evaluation
  15.4 Implementation of V-Test
  15.5 Conclusion
16 Automated Testing of an Electronic Stock Exchange
  16.1 Background
  16.2 The System and Testing
  16.3 Test Automation Requirements
  16.4 Test tool selection
  16.5 Implementation
  16.6 Maturity and Maintenance
  16.7 Our results
17 Insurance quotation systems tested automatically every month
  17.1 Background: the UK insurance industry
  17.2 The Brief, or how I became involved
  17.3 Why automation?
  17.4 Our testing strategy
  17.5 Selecting a test automation tool
  17.6 Some decisions about our test automation plans
  17.7 The Test Plan
  17.8 Some additional issues we encountered
  17.9 A telling tale: tester versus automator
  17.10 In Summary
18 Three Generations of Test Automation at ISS
  18.1 Introduction
  18.2 The Software Under Test
  18.3 First Generation
  18.4 Second Generation
  18.5 Third Generation
  18.6 Three Generations - A summary
19 Test Automation Failures - Lessons to be Learned
  19.1 Introduction
  19.2 The projects
  19.3 Problems
  19.4 Recommendations
  19.5 Pilot Project
  19.6 Epilogue
20 An Unexpected Application of Test Automation
  20.1 Introduction and Background
  20.2 Helping the Background
  20.3 Doing the testing
  20.4 Automated Testing
  20.5 The results
21 Implementing test automation in an Independent Test Unit
  21.1 Introduction and Background
  21.2 The evaluation process
  21.3 The implementation phase
  21.4 The Deployment of the tool
  21.5 How QARun has been used
  21.6 Problems we have experienced
  21.7 The benefits achieved in two years
  21.8 Conclusion
22 Testing with Action Words
  22.1 Introduction and Background
  22.2 Test clusters
  22.3 The navigation
  22.4 The test development life cycle
  22.5 Applicability for other types of tests
  22.6 Templates: meta clusters
23 Regression testing at ABN AMRO Bank Development International
  23.1 Background
  23.2 Problems with conventional testing
  23.3 Pilot project using TestFrame
  23.4 Regression test project
  23.5 Spin-offs
  23.6 Future
24 A Test Automation Journey
  24.1 Introduction
  24.2 The five generations of testware development
  24.3 RadStar
  24.4 Window-centric Scenario Libraries
  24.5 Business Object Scenarios
  24.6 Mixing Business Object Scenarios with existing tests
  24.7 Re-use versus Repeatability
  24.8 Conclusion
25 A Test Automation Journey
  25.1 Introduction
  25.2 First Steps
  25.3 An Off-the-Shelf Automated Test Foundation: RadStar
  25.4 How we have implemented automated testing using RadStar
  25.5 Payback
26 Extracts from The Automated Testing Handbook
  26.1 Introduction to this chapter
  26.2 Introduction to the Handbook
  26.3 Fundamentals of Test Automation
  26.4 Test Process and People
  26.5 Test Execution: Analyzing Results
  26.6 Test Metrics
  26.7 More information about the Handbook
27 Building Maintainable GUI Tests
  27.1 Introduction and Background
  27.2 Cost Drivers
  27.3 Test Planning and Design
  27.4 Well Behaved Test Cases
  27.5 Encapsulated Test Set-up
  27.6 Putting it All Together
28 Test Automation Experience at Microsoft
  28.1 History
  28.2 Batch Files
  28.3 Capture/Playback tools
  28.4 Scripting Language
  28.5 Cosmetic Dialog Box Testing
  28.6 Help testing tool
  28.7 Tools to randomize test execution
  28.8 What should I automate first?
  28.9 My Top Ten list for a successful test automation strategy

References
Glossary
Index
"This is the first comprehensive treatment of software test automation issues, strategies and tactics ever published. It provides the equivalent of two or three years of on-the-job experience. Every important aspect of test automation is covered, with enough information to help the reader approach the subject with the right balance of caution and confidence. I'm delighted with what Fewster & Graham have done." --James Bach, Test Design Consultant

"The most authoritative book available on this subject, a must read for every software testing professional!" --Jeffrey M. Voas, Chief Scientist, Reliable Software Technologies, VA

Product details

ISBN: 9780201331400
Published: 1999-06-28
Publisher: Addison Wesley
Weight: 893 g
Height: 234 mm
Width: 156 mm
Depth: 30 mm
Age level: 06, P
Language: English
Format: Paperback
Number of pages: 600

Biographical note

Dorothy Graham and Mark Fewster are the principal consultant partners of Grove Consultants, which provides consultancy and training in software testing, test automation, and Inspection. Mark Fewster developed the test automation design techniques that are the primary subject of this book, and has been refining and applying his ideas through consultancy with a wide variety of clients since 1991. Dorothy Graham is the originator and co-author of the CAST Report (Computer Aided Software Testing tools), published by Cambridge Market Intelligence, and the co-author of Software Inspection, published by Addison-Wesley in 1993. Both authors are popular and sought-after speakers at international conferences and workshops on software testing.