
Methodical Approach to Creating a Test Automation Strategy

Authors:
• Abhijit Nadgonda
• Abhay Joshi
• Ramanath Shanbhag



1. Abstract
2. Test Automation – State of the Art
   a. Automation without a Proper Strategy
   b. Resulting Problems
      2.b.1. Management Issues
      2.b.2. Automation Development Issues
   c. Need for Automation Strategy
   d. Goals of Automation Strategy
3. Methodology for Building Automation Strategy
4. Case Study: Automation of Graphics Embedded UI
5. Concluding Remarks
6. Authors' Biographies


1.ABSTRACT
In today's environment of shrinking cycle times and mounting budget pressures, test automation has
become an increasingly critical and strategic necessity. Software test automation can decrease the
overall cost of testing and improve software quality. Yet while test automation raises people's hopes,
it often frustrates and disappoints them. Most testing organizations have not been able to achieve the
full potential of test automation, and many groups that implement test automation programs run into a
number of common obstacles. These problems can lead to test automation plans being scrapped entirely,
with the tools purchased for automation becoming expensive "shelf-ware". Often teams continue their
automation effort, building up huge costs of maintaining large suites of automated test scripts that are of
questionable value.

Many teams acquire a test automation tool and begin automating right away, with little thought about
how to structure their automation so that it is scalable and maintainable. Little attention is given
to managing the test scripts and test results, creating reusable functions, separating data from tests, and
other key issues that allow a test automation effort to succeed. After some time, the team realizes that
it has hundreds or thousands of test scripts, and thousands of separate test result files. The
combined work of maintaining the existing scripts while continuing to automate new ones requires an
ever-larger team, with higher costs and no additional benefit. As teams drive towards their goal of
automating as many existing test cases as possible, they often do not consider what will happen to the
automated tests when the application under test (AUT) undergoes a significant change.

The need for an automation strategy is quite clear in these situations. For example, automation is a
waste of effort if the tests will never be repeated. Automation also requires a proper methodology for
architecting and implementing it; writing automation for a user interface that is not yet stable, for
example, is clearly a throwaway effort. People sometimes view automation as just another programming
project, which is another recipe for disaster: automation requires a different and more rigorous
treatment than typical programming projects.

In this paper we discuss Aztecsoft iTest's Test Automation Strategy Service, which offers a complete
solution to this challenge. We have rich experience executing product-focused test automation and have
carried out test automation projects in a number of very demanding situations. Using this practical
experience we have designed a systematic methodology for test automation. In the strategy phase, we
understand the test organization, existing test practices, testing problems, and past successes. Next we
help define test automation goals for critical areas (i.e. where automation is the only solution) and
identify where cost and time benefits can be obtained. We analyze the functional test coverage and
propose, where applicable, additional automated test cases to improve it. We then put together a plan to
implement automation that will deliver the greatest ROI. The plan includes a well-defined test
automation vision, answers the what, when, and how of automation, and proposes a prioritized test
automation schedule. The strategy also includes recommendations on test automation tools and harnesses
to assist with the "how" question.

2.TEST AUTOMATION – STATE OF THE ART


Test automation is widely viewed as the most desirable testing methodology because of the benefits it
offers: cost savings, time savings, and reliability. With product life cycles getting squeezed and IT
budgets shrinking, test automation has become a critical goal for IT organizations, and one that must be
planned carefully.

a. Automation without a Proper Strategy:


Test organizations typically undertake automation activity as soon as test cases are in place.
They identify a popular tool or, worse, just pick up a license that has been lying around, hire
a group of programmers, train them on the tool if necessary, and hand them the test cases that
need to be automated. The overall work is estimated using a simple formula involving the number
of test cases and the time taken to automate a few sample test cases. The programmers usually
have no idea about the business goals of the product, or even the business logic of the
application, and they start coding the test cases one by one. No attention is given to automation
architecture and design, and very little of the software development process is followed.

b. Resulting Problems:

2.b.1. Management Issues
o Estimation of the effort turns out to be wildly off track.
o Lack of a proper design causes problems with work distribution. It results in code that is
not maintainable, with duplicated functionality and modules that do not interface well.
o Delivery deadlines are missed, and the cost of the project goes through the roof.
o Automation programmers constantly ask questions about the test cases and how the
application works, disrupting the application developers.
o Delayed availability of automation forces manual test passes to continue, increasing the
cost of testing.
o Inadequacies in the tool are discovered, creating the need for custom modules or even
replacement of the tool itself midway through the project.

2.b.2. Automation Development Issues


o Programmers write incompatible code, often with inconsistent coding standards.
o Many test cases get dropped because they cannot be automated. A lot of time may be
wasted due to this churn.
o Test cases are sent back to the test case developers for redesign or clarification because it is
not clear what they do. Sometimes poorly designed test cases are automated without such
clarification, resulting in automation that is useless.
o There is no design or user documentation, resulting in automation code that cannot be
handed over to anyone else. The lab engineer who runs the automation needs a lot of
support.
o Automation programmers resist the selected tool because they have their own
preferences.

c. Need for Automation Strategy:


Through the Automation Strategy service we first understand the customer's test organization, test
practices, testing problems, and past successes. Next we understand, or help define, test automation
goals for critical areas (i.e. where automation is the only solution) and identify where cost and time
benefits can be obtained. We also analyze functional test coverage and propose, where applicable,
additional automated test cases to improve it. We then put together a plan to implement automation
that will deliver the greatest ROI:
• A well-defined test automation vision
o What and how much to automate
o When to automate and how
o Prioritized test automation plan


d. Goals of Automation Strategy:


- Select the tool that is best suited from cost, technology, and other standpoints
- Define an ROI that is practical (an illustrative break-even sketch follows this list)
- Define dependencies, such as the impact of changes in the development plan, features, etc.
- Make recommendations on the timeline for automation
- Select appropriate features and test cases for automation
- Provide a realistic budget for automation
- Provide a high-level design that is scalable and allows future addition of test cases
- Provide a plan to manage and maintain the automation (post-delivery)
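
To make the ROI goal concrete, the illustrative sketch below compares the cumulative cost of manual
regression passes with the cost of building, running, and maintaining automation, and reports when
automation breaks even. Every figure is an assumption chosen only for illustration; none is drawn from
a real project.

# Hypothetical ROI sketch: every figure below is an assumption used only for illustration.
manual_cost_per_run = 40       # hours of manual effort per full regression pass (assumed)
automated_cost_per_run = 2     # hours to run and triage the automated pass (assumed)
automation_build_cost = 400    # one-time hours to build the automation (assumed)
maintenance_per_run = 4        # hours of script maintenance per pass (assumed)

def cumulative_costs(runs):
    # Cumulative cost of executing 'runs' regression passes manually vs. with automation.
    manual = manual_cost_per_run * runs
    automated = automation_build_cost + (automated_cost_per_run + maintenance_per_run) * runs
    return manual, automated

for runs in (5, 10, 15, 20):
    manual, automated = cumulative_costs(runs)
    print(f"{runs} passes: manual={manual} hrs, automated={automated} hrs, saving={manual - automated} hrs")

# With these assumed figures, automation breaks even at roughly 12 regression passes.
# A practical ROI definition states such a break-even point alongside the number of passes
# the product is actually expected to need.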

3.METHODOLOGY FOR BUILDING AUTOMATION STRATEGY

1. Understanding the product and its life cycle


Review of the product architecture, its published interfaces, its data flow, and its build frequency
and process.

2. Understanding the test effort


Review of the test effort to understand how much of it is manual and how much is automated. Review of
the test environment: test bed, test tools, test documentation, etc.

3. Understanding the test practices


Review of current testing methods and standards for the application. Review of existing test cases to
check whether they are well written and easy to understand and maintain.

4. Test Automation scope definition


Understand the trouble spots in existing automation, if any, or analyze other automation possibilities
and determine their ROI. Determine the dependencies and limitations of each target automation
candidate. Select test cases based on feasibility, and add new test cases to improve test coverage.


5. Tool selection or design


Considering the rich variety of automated test tools, selecting the most suitable tool is a key decision.
Aztecsoft-iTest's tool selection process ensures that the most appropriate tool is selected and that the
return on investment is achieved.

Selection process – commercial / open source tools:


Based on our past experience and independent evaluations, we have comprehensive data on the
capabilities of all major commercial and open source automation tools. Using this knowledge
base we shortlist one or more automation tools. A short (2-3 week) proof of concept (POC)
project is then executed with each shortlisted tool to confirm the selection.
If more than one tool is viable, the final choice is made by weighing the following parameters
(a simple weighted-scoring sketch follows this list):
a. License cost
b. Maintenance cost
c. Required skills and their availability
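
As an illustration of how these parameters might be weighed, the sketch below applies a simple
weighted scoring to two shortlisted tools. The tool names, weights, and scores are hypothetical
assumptions rather than recommendations; in practice the weights come from the customer's own
priorities.

# Hypothetical weighted-scoring sketch for choosing between shortlisted tools.
# Weights and 1-5 scores are assumptions for illustration only.
weights = {"license_cost": 0.4, "maintenance_cost": 0.3, "skills_availability": 0.3}

candidates = {
    "Tool A": {"license_cost": 2, "maintenance_cost": 4, "skills_availability": 5},
    "Tool B": {"license_cost": 5, "maintenance_cost": 3, "skills_availability": 3},
}

def weighted_score(scores):
    # Higher score means a better fit on that criterion (e.g. a lower license cost scores higher).
    return sum(weights[criterion] * value for criterion, value in scores.items())

for name, scores in candidates.items():
    print(f"{name}: {weighted_score(scores):.2f}")

best = max(candidates, key=lambda name: weighted_score(candidates[name]))
print(f"Selected tool under this illustrative scoring: {best}")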

Below are the categories selected for tool evaluation:


• Platform & Browser Support
• Recording
• Playback
• Validation
i. Functionality
ii. Response time
• Web Services Testing - Features
• Protocol Support
• Data-driven Testing (illustrated in the sketch after this list)
• Attachments support
• Others
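
To make the Data-driven Testing category concrete, the sketch below shows the pattern we look for when
evaluating a tool: test data kept separate from test logic, so new cases can be added without touching
code. It uses pytest's parametrize feature; the login_user function and its credentials are hypothetical
stand-ins for whatever the automation framework actually drives.

import pytest

def login_user(username, password):
    # Hypothetical stand-in for code that would drive the application under test;
    # it only mimics a rule so that the example is self-contained and runnable.
    return username == "valid_user" and password == "correct_pass"

# Test data lives apart from the test logic; adding a row adds a test case.
LOGIN_CASES = [
    ("valid_user", "correct_pass", True),
    ("valid_user", "wrong_pass", False),
    ("unknown_user", "correct_pass", False),
]

@pytest.mark.parametrize("username, password, expected", LOGIN_CASES)
def test_login(username, password, expected):
    assert login_user(username, password) == expected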

Customized:
Sometimes it is difficult to find a tool that supports the functionality and test environment we
need. In such cases a customized test framework must be developed to address the requirements.
This may mean simply extending an existing tool (which is usually the case) or building a
completely new tool from scratch. A detailed architecture and design of the required
tool capability is developed, with the associated timeline and cost estimate. Aztecsoft-iTest has
developed several customized test frameworks by extending a variety of commercial tools and, in some
cases, by developing them from scratch.

6. Effort estimate
Teams tend to forget many important tasks while estimating automation effort. For example, the need
for cross-platform compatibility is often not considered, and automation maintenance is another commonly
ignored factor. We have developed suitable templates for effort estimation. A subset of the template is
shown below:

Task | Applicable for this project? | Total time (hrs) | Total time (days)

Ramp up phase
Understanding the application | | 0 |
Evaluation of an automation tool and trial | | 0 |
Understanding the automation tool / language | | 0 |
... | | 0 | 0
Total time for Ramp up | | 0 | 0

Test case development
Writing test cases | | 0 |
Total time for Test case development | | 0 | 0

Test Harness Development or Extension
Design | | 0 |
Coding | | 0 |
Logging code development | | 0 |
Code review + integration of review comments | | 0 |
... | | 0 |
Total time for Test Harness Development or Extension | | 0 | 0

Test automation development
UI Automation: test case coding + unit testing | | 0 | 0
API Automation: test case coding + unit testing | | 0 | 0
Test case automation code review (by customer) + integration of review comments | | 0 |
Test Lab setup time (setting up servers, operating systems, hardware and software configurations) | | 0 |
... | | 0 |
Total time for Test automation development | | 0 | 0

Delivery
Preparing / updating build release documents | | 0 |
Test code check-in | | 0 |
Total time for Delivery | | 0 | 0

Maintenance (to support multiple builds)
Number of builds expected to be supported | | |
Test Harness changes | | NA |
Automation code changes | | NA |
... | | NA |
Total maintenance time | | 0 | 0


4.CASE STUDY: AUTOMATION OF GRAPHICS EMBEDDED UI

Challenges
• Customer did not have any prior exposure to test automation
• Customer had unrealistic expectations from test automation
• Customer needed automation in place as soon as possible
• Customer did not have test cases ready for test automation
• Customer did not have bandwidth to respond to automation development team queries
• Customer did not have a dedicated team to maintain automation

Scope
• Understand the client's business goals and products
• Understand the client's product development life cycle
• Understand the client's expectations about test automation
• Assist the client in selecting a test automation tool
• Prioritize products for test automation development
• Plan test automation activities for each product
• Develop a test automation framework based on the selected tool
• Develop, test and deploy test automation scripts using the framework

Solutions
• Assisted the customer in selecting a test automation tool
• Educated the customer on what to expect from automation in order to extract maximum ROI
• Assisted the customer in prioritizing product features for test automation development
• Identified the gap between expected automation and feasible automation using the selected
automation tool, and developed extensible components for the automation tool to bridge the gap
• Separated functional, UI and validation test cases to write efficient and reusable test automation code
• Developed a functionality vs. test scenarios matrix to map automation code to functional coverage
• Elaborated scenario functionality step by step to make scenarios granular
• Modified test cases to make them clearer and more granular for use while writing test automation code
• Developed a test automation framework based on the selected tool
• Developed, tested and deployed test automation scripts for the selected products
• Trained the customer team to set up the test automation environment and to execute and debug
test automation code

Team
• Offshore
o 1 Test Manager
o 5 Test Leads
o 5 Senior Automation Engineers
o 20 Automation Engineers


5.CONCLUDING REMARKS
It should be clear from the preceding discussion that the success of automation projects can be greatly
improved by spending some time up front devising a proper automation strategy. We have presented in
this paper a methodical approach for building a practical automation strategy.
One challenge that still needs to be resolved is how to reconcile the need for an automation strategy
with rapid development environments such as Agile, in which some of the questions asked while building
a strategy are best answered further downstream in the product life cycle. We are currently working on
extending our strategy approach to cover such environments.

6.AUTHORS' BIOGRAPHIES
Abhijit Nadgonda has 9+ years of experience in Software Quality Testing and Analysis. Abhijit is working
with Aztecsoft as a Senior Test Architect and has managed multiple testing projects; he is presently
working on R&D, Training, Sales support, and Delivery Excellence activities related to Software Testing.
Abhijit's specialization is in Test Automation, and he regularly trains Microsoft's technical Software
Development Engineers at their offices worldwide. Abhijit has a Bachelor's degree in Computer Science
Engineering from Shivaji University, India.

Abhay Joshi is presently Advisor to Aztecsoft iTest – a leading organization offering independent testing
services. Aztecsoft iTest was formerly Disha Technologies (which Abhay co-founded) – a company that
pioneered independent software testing for the ISV segment. Prior to becoming Advisor, Abhay was Head
of the Aztecsoft iTest Practice Group, and was responsible for R&D, Training, Process, Sales support, and
Delivery Excellence activities related to Software Testing. Prior to starting Disha Technologies in 1997,
Abhay worked as a developer in the areas of TCP/IP stack development, Secure UNIX systems, and
Storage. Abhay has a master’s degree in Computer Engineering from Syracuse University, USA, and a
Bachelor's degree in Instrumentation & Control Engineering from Pune University, India. He has a total
of 18 years of software industry experience.

Ramanath Shanbhag is currently working at Aztecsoft as a Deputy General Manager, performing the role of
Delivery Head (Offshore Project Delivery Management) for multiple projects, and also working on
Aztecsoft iTest Delivery Excellence activities, Test Process Standardization, and Technical Sales
support. Before joining Aztecsoft Technologies in 2004, Ramanath worked at Microsoft Corporation
(Redmond, WA, USA), Aditi Technologies (Bellevue, WA, USA), and Accord Software Solutions
(Bangalore, India) in various testing roles. He has a Bachelor's degree in Computer Science Engineering
from Karnataka University, India, and a total of 11 years of software industry experience.
