[English Material] The Case for Automated Software Testing

Posted on 2008-10-7 16:00:41
The Case for Automated Software Testing
Bernie Gauf and Elfriede Dustin, IDT
ABSTRACT
This article discusses Automated Software Testing as a proposed solution to the ever-increasing testing problem. The proposed solution is backed by a review of the current testing problem, supported by results of a recent IDT survey.
Automated Software Testing refers to the “Application and implementation of software technology to allow for automation throughout the entire software testing lifecycle (STL) with the goal to improve the STL efficiencies and effectiveness.”
We discuss the importance of Automated Software Testing as part of the System Engineering Lifecycle and describe the return on investment (ROI) of some efforts undertaken thus far, as well as other benefits. Additionally, this article discusses some automated software testing pitfalls to avoid and how to accomplish successful automated software testing.
PROBLEM: TOO MUCH TIME IS SPENT ON SOFTWARE TESTING!
Too much time is spent on software testing. As software programs increase in complexity, testing times only seem to have increased. As stated by Hailpern and Santhanam: "... debugging, testing, and verification activities can easily range from 50 to 75 percent of the total development cost" [1].
One recent testing improvement initiative is the establishment of a task force to improve development test and evaluation. A "Memorandum for Chairman, Defense Science Board" with the subject "Terms of Reference – Defense Science Board (DSB) Task Force on Development Test and Evaluation (DT&E)" states that "approximately 50% of programs entering Initial Operational Test and Evaluation (IOT&E) in recent years have not been evaluated as Operationally Effective or Operationally Suitable." The memorandum, dated 2007, requested that the "DSB establish a task force to examine T&E roles and responsibilities, policy and practices, and recommend changes that may contribute to improved success in IOT&E along with quicker delivery of improved capability and sustainability to Warfighters."
Evidence of another improvement initiative appeared in the Washington Post, June 3, 2007: “The IRS has launched an initiative to enhance and expand current testing by integrating industry best testing practices to gain efficiencies that improve our overall testing processes.”
The outcome of a recent software testing survey [2] conducted by IDT, LLC [3] backs up the findings related to long software test timelines and high testing time percentages relative to the rest of the software engineering lifecycle. The survey's goal was to determine software testing related issues in order to derive needed solutions, while reaching as many software testers (across a wide demographic) as possible: it was sent to tens of thousands of test engineers, posted on Quality Assurance (QA) user sites, and advertised on various Government tech sites. So far, there are nearly 280 responses from all over the world: 74% of the responses are from the U.S. and 26% are from other countries, such as India, Pakistan, Canada, South Africa, China, and countries in Europe. More than 50% of survey respondents work for organizations of 1,000 or more employees.
The survey contained various software testing related questions; the responses to "Time currently spent on testing in relationship to overall software development lifecycle" are listed in Table 1: almost 50% of respondents state that 30-50% of time is spent on software testing in relation to the overall software development lifecycle, and nearly 25% state that more than 50% of time is spent on it.
AUTOMATED SOFTWARE TESTING AS PART OF THE SYSTEM ENGINEERING LIFECYCLE
Automated Software Testing success is increased if it is implemented as part of the system engineering lifecycle. This includes developer involvement: starting with automated unit testing and integration testing, then building on those initial tests to automate the system testing. Additionally, automated testing as part of the system engineering lifecycle requires stakeholder understanding of what Automated Software Testing entails. Developers need to keep application testability issues in mind when developing software. They need to understand, for example, how a change in a GUI control/widget implementation could affect existing automated scripts, or how logging is required for test results evaluation. Project managers need to include Automated Software Testing efforts in schedules and budgets. Test managers need to hire qualified Automated Software Testing personnel, and so forth. Figure 1 shows the Automated Testing Lifecycle that parallels the system engineering lifecycle [4].
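To illustrate the testability point, here is a minimal sketch, assuming a web application and using Selenium WebDriver for Python purely as an example; the URL and element ID are hypothetical. Scripts that locate GUI controls by stable identifiers survive widget implementation changes that would break recorded screen coordinates:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("http://example.com/login")  # hypothetical application under test

# Brittle: a recorded script that clicks a hard-coded screen position
# breaks as soon as the layout or widget implementation changes:
# driver.execute_script("document.elementFromPoint(412, 310).click()")

# Resilient: locate the control by a stable identifier agreed on with the
# developers, so refactoring the widget does not invalidate the script.
driver.find_element(By.ID, "login-button").click()

driver.quit()
```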

Automated Software Testing can be effectively applied to all software testing phases that run in parallel to the system engineering lifecycle, such as an automated requirements traceability matrix (RTM) built with a Requirements Management System during the requirements phase; automated build verification processes that include automated unit tests during the development phase; defect tracking, test status reporting, and metrics collection during the testing phase; configuration management throughout all phases; and so forth.
AUTOMATED SOFTWARE TESTING RETURN ON INVESTMENT (ROI)
If Automated Software Testing is implemented effectively, it will contribute to solving the ever-increasing software testing problem. An Automated Software Testing ROI was recently demonstrated on an application for the Navy. Here is a high-level description of one part of this effort:
Versions of a component used for communications onboard Navy ships and in other DoD settings are delivered to Navy labs from vendors for testing and verification prior to release to their respective programs and, ultimately, to war-fighters. Each component consists of nearly one million lines of code of considerable complexity. Currently, it takes several months to thoroughly test multi-vendor component versions for performance, quality, and functionality. An initial Automated Software Testing implementation and its ROI analysis have shown that substantial time savings can be achieved (see Figure 2).
Figure 2 shows the initial findings: based on the actual results of the initial component testing, one can project a 97% reduction in test days over the course of ten years. Implementing Automated Testing to conduct testing in new and innovative ways, shortening the testing and certification timeline while maintaining or improving product quality, can achieve a significant reduction in overall software development costs.
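To make the ROI arithmetic concrete, the following minimal sketch computes a projected reduction in test days; every figure in it is a hypothetical placeholder, not data from the Navy effort described above:

```python
# Hypothetical illustration of automated-testing ROI arithmetic.
# None of these figures come from the Navy effort described above.
manual_test_days_per_release = 60     # assumed manual effort per release
automated_test_days_per_release = 5   # assumed effort once automated
releases_per_year = 4
years = 10
automation_investment_days = 200      # assumed one-time cost to build suite

manual_total = manual_test_days_per_release * releases_per_year * years
automated_total = (automated_test_days_per_release * releases_per_year * years
                   + automation_investment_days)

savings = manual_total - automated_total
roi = savings / automation_investment_days
reduction = 1 - automated_total / manual_total

print(f"Test days saved over {years} years: {savings}")
print(f"ROI (savings / investment): {roi:.1f}x")
print(f"Reduction in test days: {reduction:.0%}")
```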



Ideally, automation in 10 years would include self-testable automated components. Even today, there are many reasons why the STL should be automated. The quality of the test effort is improved through automated regression testing, build verification testing, multi-platform compatibility tests, and the easier reproduction of software problems, since automated testing removes the human error involved in recreating test steps. Test procedure development, test execution, test result analysis, documentation, and problem status tracking should all be reduced with automated testing, enabling the overall test effort and schedule to be reduced. Since Automated Software Testing applies to all phases of the STL, this includes an automated requirements traceability matrix (i.e., traceability from requirements to design, development, test cases, etc.), automated test environment setup, automated testing, automated defect tracking, and so on.
Most importantly, some tests can hardly be accomplished through manual testing at all, such as memory leak detection, stress or performance testing, and achieving high test coverage with a large amount of test data input.
The challenges described related to testing complex software systems, and the desire to reduce the cost and schedule associated with testing, are not unique to the DoD. Commercial businesses, large and small, are also faced with increasingly large and sophisticated software projects while, at the same time, they want to deliver new and more capable products to market faster at the lowest possible cost. In response to these challenges, automated testing tools and methodologies have been developed and continue to emerge. In addition, the emphasis on iterative, incremental development approaches, where incremental software builds are produced and repetitive incremental testing is required, has further contributed to the growth of automated test tools and capabilities.
A report published by the IDC Software Research Group, "Worldwide Distributed Automated Software Quality Tools 2005-2009 Forecast and 2004 Vendor Shares" [5], begins by stating that the "automated software quality tools market was once again the growth leader across application life-cycle markets". The report goes on to state, "The criticality of software to business, the increasing complexity of software applications and systems, and the relentless business pressures for quality, productivity, and faster time to market have all been positive drivers (resulting in growth in the market) and will continue to be in the foreseeable future."
The IDC Software Research Group attributes automated software quality tools growth primarily to businesses’ desire for higher quality, increased productivity, and faster time to market.
BENEFITS OF AUTOMATED TESTING THAT SHOULD ALSO BE CONSIDERED
Types of tests that manual testing cannot accomplish effectively, if at all – such as Concurrency, Soak, Memory Leak or Performance testing:
Concurrency testing uncovers concurrent user-access issues, while soak and persistence testing often uncovers memory leaks when the application runs over a period of time. Automated testing tools allow these types of tests to run in a stand-alone fashion, whereas conducting them manually is very time-consuming and resource-intensive.
Using test automation, automated scripts can be run over an extended period of time to determine whether there is any performance degradation or memory leak. At the same time, timing statements can be inserted to track the performance timing of each event tested.
We can kick off the automated scripts on numerous PCs to simulate concurrency testing, i.e., having numerous users access the same application resources at the same time while monitoring for any potential issues.
Automated testing tools should be used for these types of testing efforts to make them more feasible.
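As a minimal sketch of such a harness, the following simulates concurrent users and records per-operation timings over a soak period; perform_transaction is a hypothetical stand-in for whatever operation the application under test exposes:

```python
import threading
import time

def perform_transaction(user_id):
    """Hypothetical stand-in for one operation against the application under test."""
    time.sleep(0.01)  # replace with a real request or UI action

def soak_worker(user_id, duration_s, timings):
    """Repeatedly exercise the application, recording per-operation timings."""
    end = time.time() + duration_s
    while time.time() < end:
        start = time.perf_counter()
        perform_transaction(user_id)
        timings.append(time.perf_counter() - start)

# Simulate 10 concurrent users; a real soak run would last hours or days.
timings = []
threads = [threading.Thread(target=soak_worker, args=(uid, 5, timings))
           for uid in range(10)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# A rising average across successive runs suggests degradation or a leak.
print(f"{len(timings)} operations, average {sum(timings) / len(timings):.4f}s")
```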
Effective Smoke (or Build Verification) Testing
Whenever a new software build or release is received, a test (generally referred to as a "smoke test") is run to verify that previously working functionality still works. An entire smoke test can take numerous hours to complete, only to determine that a faulty build was received; that testing time is wasted, because the build has to be rejected and testing has to start all over again.
If the smoke test is automated, the smoke test scripts could be run by the developers to verify the build quality before it is handed over to the testing team, saving valuable testing time and cost.
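A minimal sketch of such an automated smoke test, written here as a pytest-style build verification script; the myapp binary and its flags are hypothetical placeholders for the real build's entry points:

```python
# smoke_test.py -- hypothetical build verification checks, runnable with pytest.
import subprocess

def test_application_starts():
    # If the build cannot even report its version, reject it immediately.
    result = subprocess.run(["myapp", "--version"], capture_output=True)
    assert result.returncode == 0

def test_core_workflow_still_works():
    # Exercise one previously working end-to-end path before the full
    # test cycle begins, so a faulty build is caught within minutes.
    result = subprocess.run(["myapp", "--self-check"], capture_output=True)
    assert result.returncode == 0
```

Developers can run this script themselves before handing a build over, which is exactly the time-saving step described above.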
Standalone (Lights-Out) Testing
Automated testing tools can be programmed to kick off a script at a specific time.
Consider that automated testing can run standalone and, if needed, be kicked off automatically overnight; the testers can simply analyze the results of the automated test when they are back in the office the next day.
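A sketch of such a lights-out kickoff using only the Python standard library (in practice a cron job or Windows Task Scheduler entry accomplishes the same thing); the test command and log file name are assumptions:

```python
import datetime
import subprocess
import time

def seconds_until(hour):
    """Seconds from now until the next occurrence of the given hour."""
    now = datetime.datetime.now()
    run_at = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if run_at <= now:
        run_at += datetime.timedelta(days=1)
    return (run_at - now).total_seconds()

# Sleep until 2 AM, kick off the suite unattended, and save the output
# so testers can analyze the results the next morning.
time.sleep(seconds_until(2))
with open("nightly_results.log", "w") as log:
    subprocess.run(["pytest", "regression_suite/"], stdout=log, stderr=log)
```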
Increased repeatability
Often a test executed manually uncovers a defect, only for the team to find that the test cannot be repeated: the tester forgot which combination of test steps led to the error message and is unable to reproduce the defect. Automated testing scripts take the guesswork out of test repeatability.
Testers can focus on advanced issues
As tests are automated, most system issues are uncovered. The automated script can be baselined and rerun for regression testing purposes, which generally yields fewer new defects than testing and automating new functionality. Testers can focus on newer or more advanced areas, where the most defects can be uncovered, while the automated testing scripts verify the regression testing areas. New features are incrementally added to the automated regression test suite.
Higher functional test coverage
Automated Testing allows for an increase in the number of test case data combinations beyond what manual testing could cover. Data-driven testing allows numerous test data combinations to be executed using one automated script. For example, during one of our prototype efforts we wanted to baseline numerous charts and routes used in an application we were testing. To automate this test efficiently, we only needed to write one test script that calls up and baselines the numerous charts and routes and runs a bitmap comparison against a recorded baseline.
Additionally, if a chart is off by just one pixel, the naked eye would probably have a difficult time detecting that difference during manual test analysis; the automated bitmap comparison feature in the testing tool, however, points out the difference immediately. Therefore, the accuracy of an automated test is higher in most cases.
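A sketch of that data-driven bitmap comparison, here using the Pillow imaging library rather than the case study's actual tool; the chart names, file paths, and capture function are hypothetical:

```python
from PIL import Image, ImageChops

# Hypothetical data set: one script drives every chart through the same check.
CHARTS = ["harbor_approach", "coastal_route", "open_ocean"]

def capture_chart(name):
    """Hypothetical stand-in: render the named chart and return a screenshot."""
    return Image.open(f"captures/{name}.png")

for name in CHARTS:
    baseline = Image.open(f"baselines/{name}.png")
    current = capture_chart(name)
    # getbbox() returns None when the two images are pixel-identical,
    # catching even single-pixel deviations the naked eye would miss.
    diff = ImageChops.difference(baseline, current)
    status = "PASS" if diff.getbbox() is None else "FAIL"
    print(f"{name}: {status}")
```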
AUTOMATED SOFTWARE TESTING PITFALLS
The above list is just a subset of the automated testing options and potential benefits. With so many benefits, why are so few automated testing efforts underway, or even successful? There are various reasons why automated testing efforts can fail, and over many years of experience in automated testing many lessons learned have accumulated [6]. Here are just a few of the common mistakes programs make when implementing automated testing:
Treating automated testing as a side activity: It is important that automated testing is not treated as a side activity, i.e., asking a tester to automate whenever he or she gets free time. Testers rarely have free time, and deadlines are always looming. Automated testing requires a mini development lifecycle with test requirements, test design, and test implementation and verification. Automated testing cannot succeed as an effort squeezed into spare moments.
Thinking anyone can automate a test: Testing requires skill, and automated testing requires software development skills. The automation effort is only successful if implemented using the appropriate expertise [7].
A structured approach to automated testing is necessary to help steer the test team away from some of the common test program mistakes below:
Implementing an automated test tool without a testing process in place, resulting in an ad hoc, non-repeatable, non-measurable test program
Implementing a test design without following any design standards, resulting in the creation of test scripts that are not repeatable and therefore not reusable for incremental software builds
Using the wrong tool
Initiating test tool implementation too late in the application development lifecycle, not allowing sufficient time for tool setup and the test tool introduction process (i.e., the learning curve)
Initiating test engineer involvement too late in the application development lifecycle, resulting in a poor understanding of the application and system design, which in turn results in incomplete testing
Not including software developers, who need to keep automated testing in mind when they change the code: developers should understand the impact their code changes could have on an automated testing framework and consider alternatives as appropriate
Automated testing enables rapid regression testing, while comprehensive manual regression testing is almost prohibitively time-consuming to conduct.
Often the mistake is made of assuming that a "manual" tester can pick up an automated testing tool and simply hit record and playback. However, much more is involved, and a development background is required. Automated Software Testing, when done effectively, should be treated as a software development effort that includes test requirements, automated test design, script development, and automated script verification.
HOW TO AUTOMATE SOFTWARE TESTING
Automated Testing can be accomplished using vendor-provided tools, open-source tools, in-house developed tools, or a combination of these:
Vendor-provided automated testing tools generally mimic the actions of the test engineer via the tool's "recording" feature. During testing, the engineer uses the keyboard and mouse to perform test steps, while the recording feature captures all keystrokes, saving the recorded baselines and test results in the form of an automated test script. During subsequent playback, the scripts compare the latest test output against the previous baseline. Testing tools generally have built-in test functions, code modules, DLLs, and code libraries that the test engineer can reuse. Most test tools provide for non-intrusive testing, i.e., they interact with the application under test without affecting its behavior, as if the test tool were not involved. Vendor-provided test tools use a variety of test scripting languages, e.g., JavaScript, VBScript, C, or vendor-proprietary languages, and various storage mechanisms, with generally no specific standard applied across the vendor community. This type of automation can be the most tedious and time-consuming, with possibly the lowest return on investment, in an environment where the application under test is still constantly changing.
The problem with this type of "record/playback" automation is that the script baselines contain hard-coded values: if the test engineer clicks on today's date as part of her test steps, today's date is recorded and baselined, and playing the script back tomorrow (on the subsequent date) will fail. The hard-coded values need to be replaced with variables. The tool-generated scripts generally require considerable modification and coding expertise, such as an understanding of reusable functions and libraries, looping constructs, conditional statements, etc. Software development knowledge is required to use vendor-provided automated testing tools effectively.
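For example, here is a minimal sketch of that fix; select_date is a hypothetical wrapper standing in for the tool's recorded date-picker action:

```python
import datetime

def select_date(date_string):
    """Hypothetical wrapper around the test tool's date-picker action."""
    print(f"selecting {date_string}")

# As recorded: the tool baselines the literal date the engineer clicked,
# so the script breaks on every subsequent day.
# select_date("2008-10-07")

# After modification: compute the value at run time instead.
today = datetime.date.today().isoformat()
select_date(today)
```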
Open-source testing tools [8] come in various flavors: they are based on various technologies, come with different levels of capabilities and features, and can be applied to various phases of the software testing lifecycle. Many of these tools are increasingly mature and stable enough to be safely implemented in an enterprise test environment. Implementing open-source testing tool solutions can be a viable option, especially when vendor-provided tools don't support the software engineering environment under test while the open-source tool provides the compatible features required.
In-house developed software test automation efforts are still common and are often necessary when vendor-provided or open-source tools don't meet the automated testing needs. Developing automated test scripts is a software development effort and requires a mini software development lifecycle.
The most successful automated testing environments develop a framework of automated tests with reusable components that is continuously maintained and to which new capability is added.

While 73% of survey respondents believe Automated Testing is beneficial, 54% listed "lack of time" or "lack of budget" as the major reason for not automating their software testing efforts. Yet there never seems to be a lack of time or budget when a regression test has to be rerun manually yet again after another showstopper has been uncovered and fixed, no matter how long the manual regression cycle takes, how many testers it requires, or how often it has already been run. Isn't it time to automate?
The second-highest percentage of survey respondents listed "lack of expertise" as the reason for not automating their software testing efforts. Yet various companies provide automated testing services, and a vast pool of automated test expertise exists that could be drawn from.
30% of survey respondents listed the regression testing phase as the most time-consuming.
The automated testing payoff is highest during regression testing, because by the time regression testing takes place the application area under test has generally stabilized, initial tests have been run, and defects have been removed. Automated test scripts can then be rerun with minimal maintenance or other involvement.
Too much time is spent on software testing. Automated hardware testing and the associated standards are prevalent in the commercial sector and have been employed successfully at various DoD organizations for many years. We need to bring software testing up to par with hardware testing, including its quick turnaround times. Implementing efficient and effective Automated Software Testing is a major step in that direction.
REFERENCES
[1] Hailpern and Santhanam, "Software Debugging, Testing, and Verification," 2002, http://www.research.ibm.com/journal/sj/411/hailpern.pdf
[2] http://www.surveymonkey.com/s.asp?u=267783652245
[3] http://www.idtus.com
[4] For a detailed explanation of how Automated Testing parallels the engineering lifecycle, see Dustin et al., "Automated Software Testing," Addison-Wesley, 1999.
[5] D. Hendrick, IDC, July 2006.
[6] Dustin, "Lessons in Test Automation," 2001, http://www.stickyminds.com/s.asp?F=S5010_MAGAZINE_62
[7] Dustin et al., "Automated Software Testing," Addison-Wesley, 1999.
[8] See http://www.opensourcetesting.com for information on various open-source testing tools.
ABOUT THE AUTHORS
Elfriede Dustin works at Innovative Defense Technologies (IDT), http://www.idtus.com, an Arlington-based software testing consulting company, and is currently working on an effort to bring automated software testing to a branch of the DoD. Elfriede is lead author of the book "Automated Software Testing," which describes the Automated Testing Lifecycle Methodology (ATLM), a process that has been implemented at numerous companies. Elfriede is also the author of various white papers and of the book "Effective Software Testing," and co-author of "The Art of Software Security Testing" and "Quality Web Systems," books that have been translated into many languages and are available worldwide. Dustin has been responsible for implementing automated testing, or has served as the lead consultant/manager/director guiding the implementation of automated and manual software testing efforts, at various commercial and Government agencies.
Bernie Gauf is President and Chief Technologist of IDT. Mr. Gauf has twenty years of experience in leading the design, development, and delivery of innovative solutions for the DoD. His experience includes the development and production of systems for passive and active sonar, electronic warfare, command and control, and computer-based training and simulation for these systems. Mr. Gauf is currently leading IDT's efforts in developing automated testing strategies and an automated testing framework suitable for DoD systems. Mr. Gauf has been invited to participate in numerous DoD panels associated with the use of COTS technology, middleware technology, and Open Architecture.
Prior to his employment at IDT, Mr. Gauf was one of the founding employees of Digital System Resources, Inc. (DSR), a system integration and software company specializing in technology critical to national security and a recognized leader in providing state-of-the-art, high-quality products. DSR became one of the top 100 largest prime Department of Defense contractors for Research, Development, Test, and Evaluation through the successful transition of transformational technologies for the DoD.
AUTHOR CONTACT INFORMATION
Email: edustin@idtus.com
Email: bgauf@idtus.com
Posted on 2010-1-4 05:48:43
Maybe it's classic
Posted on 2010-9-6 21:24:21
Looks like I really need to improve my English, otherwise even the best article is wasted on me...
Posted on 2012-2-6 11:38:17
Leaving a marker so this is easy to find next time!
