
ENTERPRISE OPERATION SERVICES (EOS) TEST STRATEGY
ORGANIZATIONAL INFRASTRUCTURE & IFS IMPLEMENTATION: POC & PILOT RELEASE

11/15/2019

Table of Contents
1. Document Control
   1.1 Version History
   1.1.1 Approvals
2. Introduction
   2.1 Document Purpose
   2.2 Definitions
   2.3 References
3. Project Overview
   3.1 Business Benefits
   3.2 The Solution
   3.3 IFS Implementation
   3.4 IFS Integration
4. Workflows in Scope
   4.1 Day 0 PoC
   4.2 Day 1 Pilot
5. Implementation Plan
   5.1 Day 0 POC / Day 1 PILOT
6. Milestone Schedule
7. Assumptions
8. Test Scope
9. High-Level Test Objectives
   9.1 User Story Testing
   9.2 Feature / Workflow Testing
   9.3 SIT Test
   9.4 UAT Test
10. Test Scope Exclusions
11. Test Phases
12. Testing Approach
   12.1 Integrated Test Methodology
   12.2 Phased Approach: Test Levels
      12.2.1 Static Testing
      12.2.2 User Story / Feature Testing
      12.2.3 Workflow Testing
      12.2.4 System Integration Test
      12.2.5 User Acceptance Test
      12.2.6 Install & Back-out Test
   12.3 Test Phases (Testing Delivery)
      12.3.1 Test Initiation (PDP – Concept & Requirements)
      12.3.2 Test Planning (PDP – Requirements / Analysis & Design Phase)
      12.3.3 Test Execution (PDP – Build Phase: Unit Test Level)
      12.3.4 Test Closure (PDP – Post Implementation Phase)
      12.3.5 Regression Automation
      12.3.6 Performance Testing
      12.3.7 Security Testing
13. Defect Management Process
   13.1 Defect Detection, Analysis and Resolution
   13.2 Role vs. Status & Required Fields
   13.3 Responsibilities
   13.4 Severity Code Definitions
   13.5 Priority Code Definitions
14. Test Environment Requirements
15. Dependencies
16. Gating Approach
   16.1 User Story Testing
      16.1.1 Entry Criteria
      16.1.2 Exit Criteria
   16.2 Feature / Workflow Testing
      16.2.1 Entry Criteria
      16.2.2 Exit Criteria
17. Testing Organization
   17.1 Structure
   17.2 Roles & Responsibilities
   17.3 Communications Plan
18. Resource Estimates
19. Testing Risks
20. Deliverables
21. Test Metrics
22. Testing Tools


1. Document Control

1.1 Version History

Version | Date | Change Description (include list of reviewers) | Author
0.1 | 06/09/2011 | First Draft | Prasanna Kumaran B
0.2 | 07/09/2011 | Updated comments | Prasanna Kumaran B

1.1.1 Approvals
Document versions are approved by the following people:

Title | Name / Date | Feedback Received Date
QA Head | Leon Johnson |
EOS IFS Implementation, Program Manager | Sue Leistico |


2. Introduction

2.1 Document Purpose

The purpose of this document is to define the preliminary test scope and objectives, high-level approach, testing organization, and management controls for IFS Implementation Day 1.
This Test Strategy is the initial planning document that provides a framework for estimating and approving the total test effort for IFS Implementation Day 1.
It is intended to provide a controlling framework for test planning, test design, and test execution across all applications participating in these releases.
It has been developed during the release design phase and reflects the current implementation plans.
The subsequent Detailed Test Plan(s) produced by each of the associated testing teams must conform to the content of this strategy and will confirm and elaborate on the high-level approach documented here.
This document is to be signed off by all authorized signatories. It is intended as a planning tool, not a living document: any changes to risks, issues, resource needs, etc. will be managed through the standard planning and tracking process and reflected in subsequent test documents (e.g. DTPs).

2.2 Definitions
In this document:
Acronym | Definition
DTP | Detailed Test Plan

2.3 References

Document | Description
Solution Architecture Document |
Workflow Business Case |

3. Project Overview
The Enterprise Operation Services (EOS) project is the implementation of a new shared service model throughout the LTS Enterprise. The goal is to implement and enable processes that improve the advisor experience, control risk, leverage operations, and create growth.
The plan is to accomplish this through two primary initiatives:
1) the readiness of the operations organization ("the people"), and
2) the synchronization and automation of common procedures and policies across all Ladenburg Independent Advisory and Brokerage (IAB) firms ("the process and the technology"). The latter will be facilitated by the integration of a new rules-based workflow solution from IFS Automation and is the focus of this document.

3.1 Business Benefits

 Create the organizational infrastructure and prepare the resources to support an enterprise consolidated operation;
 Ensure that all IABs, departments, and functions are included in this process;
 Strive to use industry standard data requirements and rules;
 Minimize (strive to eliminate) the incorporation of exceptions in our workflows;
 Minimize disruption to the salesforce;
 Retain scope and deliver on time;
 Resolve issues quickly and minimize risk effectively;
 Provide regular, consistent and accurate communication to all stakeholders;
 Be consistent in our execution across all workstreams.


3.2 The Solution

3.3 IFS Implementation


3.4 IFS Integration

IFS has adopted the Agile methodology to deliver the new workflows and interfaces. The EOS IFS Implementation will deliver across two releases in 2019-20. The project scope is currently split and managed in PoC and Pilot phases.

PoC (Day 0) Scope: KMS (Pershing only) for Asset Movement, Transfers & Maintenance

Pilot (Day 1) Scope: KMS (Pershing only) for Asset Movement, Transfers & Maintenance, and Account Opening

The test strategy is built to handle the immediate testing requirements and support delivery of Pilot (Day 1) in March 2020. Additionally, it is expected to be robust enough to grow as additional waves are implemented. Under the Agile methodology, requirements and solution are grouped into blocks called sprints. These sprints will be configured and tested individually, as they are completely independent of other sprints.


4. Workflows in Scope
4.1 Day 0 PoC
 Asset Movement: FedWires (Outgoing)
 Asset Movement: Check Disbursement
 Asset Movement: Receipts and Disbursements
 Asset Movement: IRA Contributions/Distributions
 Asset Movement: Transfers (Incoming)
 Asset Movement: Standing and Periodic Instructions (Cash Only)
 Maintenance: Standing and Periodic Instructions (Cash Only)
 Journals (Semi-Automated): Advisor data entry available for cash and securities.
Security Journals will drop to a queue for manual processing.

4.2 Day 1 Pilot


 Asset Movement: FedWires (Outgoing)
 Asset Movement: Check Disbursement
 Asset Movement: Receipts and Disbursements
 Asset Movement: IRA Contributions/Distributions
 Asset Movement: Transfers (Incoming)
 Asset Movement: Standing and Periodic Instructions (Cash Only)
 Maintenance: Standing and Periodic Instructions (Cash Only)
 Account Opening (80% of Registrations)
 Brokerage and Direct Sponsor
 Journals (Fully-Automated): Dependent on Pershing’s resolution to NASDAQ pricing
issue


5. Implementation Plan
The Agile methodology has been chosen as the delivery method and is depicted below:

[Diagram: Agile delivery method, Sprint 1 through Sprint n; Sprint 5 delivers the Day 0 POC and Sprint n delivers the Day 1 PILOT.]

The approach involves breaking scope into manageable pieces of delivery and recognizes the effort required to build and deploy an enterprise-wide solution.
Each agile sprint consists of the following windows:
 Elaboration
 Construction
 Testing


5.1 Day 0 POC / Day 1 PILOT

The constructed build will be turned over for testing, and valid feedback received from the QA and Business Teams will be incorporated into the application. This iterative cycle will continue into the Acceptance Testing phase until Go-Live sign-off.

6. Milestone Schedule
The following table documents the major testing milestones and target dates:

Milestone | Date
Day 0 POC: Asset Movement, Maintenance, Transfers | 01/15/2020
Test Case Preparation (Day 0 POC) | 11/11/2019 to 12/06/2019
Feature Level / Workflow Testing (Day 0 POC) | 12/09/2019 to 12/27/2019
System Integration Testing (Day 0 POC) | 12/30/2019 to 01/08/2020
User Acceptance Testing (Day 0 POC) | 01/09/2020 to 01/15/2020
Day 1 PILOT: Account Opening, Asset Movement, Maintenance, Transfers | 01/15/2020
Test Case Preparation (Day 1 PILOT) | 11/11/2019 to 12/06/2019
Feature Level / Workflow Testing (Day 1 PILOT) | 12/09/2019 to 12/27/2019
System Integration Testing (Day 1 PILOT) | 12/30/2019 to 01/08/2020
User Acceptance Testing (Day 1 PILOT) | 01/09/2020 to 01/15/2020

7. Assumptions

No. | Assumption
1 | Feature / Workflow Testing will take place in the IFS environment; SIT will take place in the SAI environment.
2 | Workflow priority for testing / delivery will be given by the Business Team and will be factored into both the Project Management Plan and this Test Strategy.
3 | A Business Analyst / SME will be available to support the test team during the test planning phase.
4 | Each release will comprise multiple code deploys to Production. The code deploys made to production will be the same as those deployed to the SIT/UAT, Performance, and Security test environments.
5 | Individual workflows are completely independent, meaning changes in one workflow cannot affect another concurrent workflow.


8. Test Scope
The majority of the requirements and workflows for each sprint are leveraged 'Out of the Box' (OOTB) from the IFS product. OOTB workflows will be customized using configuration scripts to derive SAI-specific workflows. As such, the major scope of testing for each release is acceptance of these customized workflows.
In addition to User Story feature testing, the QA team will perform the following testing:

Task Name | Focus | Owner / Responsibility | Accountability
User Story Testing | Functionality | IFS Team | IFS Development Team
Feature Testing | Functionality | QA Team | QA Team
Workflow Testing | Process & Solution | QA Team | QA Team
SIT | Application and Solution | QA Team | QA Team
UAT | Application and Solution | Business Users | Business Team
Regression Automation | Process & Solution | QA Team | QA Team
Performance Testing | Operational Readiness | QA Team | IFS / Technology Team
Install & Backout | Process and Operational Acceptance | Environment Team / QA Team | Environment Team
Security Testing | Application and Operational Acceptance | Infrastructure Team / QA Team | Infrastructure Team

9. High-Level Test Objectives


Across all test phases, the general objectives of testing activities for the IFS Implementation project are:
 Verify that the system satisfies the requirements as specified in the Business Requirements, UI, Process Flow, and Business Rules documentation
 Validate that the functionality performs as expected
 Ensure that all the system functionality (in production) that supports business processes will still operate successfully after the release
 Verify the synchronization and automation of common procedures and policies across all Ladenburg Independent Advisory and Brokerage (IAB) firms, starting with KMS in the Day 0 and Day 1 implementations.
Testing is conducted as a risk-mitigation exercise: it aims to ensure that the delivered solution is as error-free as possible and of high enough quality that any undetected errors are not serious enough to pose a major risk in production.


In accordance with the above, the EOS-IFS Implementation Test Strategy has been built to meet the following key high-level objectives:

Functional Coverage

Ensure the key functional elements of the solution work as expected, e.g. that the functionality of the workflows behaves as designed.

New / Changed Interfaces

Ensure that all new or changed interfaces can be sent, received, and processed as expected.

Non-Functional

Ensure that all non-functional requirements (e.g. performance, security, and operational readiness, covered later in this strategy) are proven.

These overall high-level test objectives are met differently through the various test phases:

9.1 User Story Testing

User Story Testing will be performed by the IFS development team to prove that individual configuration changes behave as expected.

9.2 Feature / Workflow Testing

As the majority of the testing required relates to business verification of workflow configuration, features / workflows will be picked up by the QA team as they become available, for functional testing that validates the process flow. This will happen in the IFS environment but will be conducted by the QA Team.

9.3 SIT Test

As the majority of testing relates to business verification, SIT is required primarily to prove that the interfaces to IFS, and from IFS to SAI downstream systems, work as expected. Only limited testing of workflows is required to prove the data flow from origin system to end system. The objective of the SIT phase is to ensure that the applications interact to produce the expected 'end-to-end' result.

SIT will also focus on ensuring that any interface changes have not impacted other applications.

9.4 UAT Test

The objective of the UAT phase is again to ensure that the applications interact to produce the expected 'end-to-end' result, but the testing will be performed by Business Users. UAT will also focus on ensuring that any interface changes have not impacted other applications.


10. Test Scope Exclusions

 Manual business operational changes (if any) introduced as part of the IFS Implementation do not fall under the scope of this document.
 IFS product capability will not be tested as part of this engagement.
 Database back-up and restoration validation, if needed, is not within the Testing Delivery scope.

11. Test Phases

The proposed Integrated Test methodology as it relates to projects is fully integrated with the Project Delivery Process (PDP). The testing process will emphasize a phased approach consisting of Test Initiation, Test Planning, Test Execution, and Test Closure activities as test stages. It will also have Unit, ST, SIT, UAT, and Implementation test levels, bringing full value to the phased-approach methodology. All deliverables created for the test process must be provided to the Project Manager to satisfy PDP requirements.

Test Initiation occurs during the Concept & Requirements phase of the project. Test Planning occurs primarily during the Analysis & Design phase, though activities begin in Requirements and continue through Build. During Build, Test Planning activities are focused on building the test cases and scripts that are outlined in the Test Plan.

Test Plans and Test Cases/Scripts are created for each level of testing as required. Test Closure activities happen during the Post-Implementation phase, but can begin during the Implementation phase.


12. Testing Approach

The proposed test methodology will go hand in hand with the delivery methodology (Project Delivery Process), complementing its systematic approach to planning, designing, implementing, and controlling the phases of a project to achieve organizational goals consistently. The methodology will address effective management of project effort, time, cost, and resources to achieve superior quality, higher productivity, reliable delivery performance, customer satisfaction, and improved business performance.

12.1 Integrated Test Methodology

12.2 Phased Approach: Test Levels

A Level of Test indicates a unique combination of code build, environment, and testing scope. A Type of Test is a specific technique or focus of testing; for example, functional testing focuses on system functionality, whereas structural testing focuses on how the system is built (e.g., connectivity on a network).

 Static Testing
 User Story Testing
 Workflow Testing
 Systems Integration Testing

 User Acceptance Testing
 Install and Back-out Testing

12.2.1 Static Testing

Static Testing involves verifying the requirements (BRD, FRD) and design of the system before
actual build and construction takes place.

As part of requirements study, the QA team will review the requirements against the following
criteria –

• Correctness: Validate the requirement reflects the user objective, standard or regulation
• Completeness: All necessary elements are included
• Clarity: Requirements are stated in a precise and measurable way
• Consistency: No internal/external contradictions within/between work products
• Testability: Confirm it is possible to create a test for the requirement: that an expected
result can be known

Deliverable* | Responsible | Accountable
Variance Documentation | Business Analyst, Tester | Project Manager

12.2.2 User Story / Feature Testing

User Story Testing is a test of a high-level description of a goal a customer wants to achieve with the application, exercised in isolation. This level of testing confirms that the code operates correctly as an isolated unit.

User Story Integration Testing (USIT) may be performed by the QA team to ensure that the User Stories or Features work together to meet the functional criteria.

User Story Testing will be conducted during the Detailed Design phase of the IFS configuration / development cycle.

No specific test cases will be written for individual User Stories; however, the QA Team can leverage their workflow test cases for execution.


Primary Activities:

• Ensure User Stories satisfy their goal
• Register feedback with IFS
• Produce a User Story Testing Execution Report summarizing results

Deliverable* | Responsible | Accountable
User Story / Feature Testing Results | QA Team | Test Manager
Feedback | QA Team | Test Manager
User Story / Feature Test Summary Report | QA Team | Test Manager

Note: All deliverables are provided by the Accountable person to the Test and/or Project Manager

Entry Criteria:

 Design completed and aligned with requirements
 Completed User Story construction according to development standards
 Unit test scripts (& data) defined and documented
 Modules installed to a stable test environment (IFS)
 Release note with User Stories eligible for testing

Exit Criteria:

 All User Stories are executed
 Main functions tested, functional, and stable
 User Story Test Summary produced
 Outstanding variances (feedback) logged

12.2.3 Workflow Testing

Workflow Testing verifies that the workflow components perform both functionally and operationally as designed (the workflow or process accurately reflects the business process). Workflow Testing is conducted within a stream for components that may not yet have been integrated with other components of the end-to-end solution.


Scenarios exercise various combinations of functionality that may or may not include interface testing, depending on the application being tested.

This approach will find defects earlier in the testing SDLC, thus mitigating risk to the SIT test phase.

The goal is to confirm that a system is technically and functionally sound and meets quality requirements. Testing covers all combined parts of a system.

Focus is on full functional validation (all FRD / BRD requirements) of new and changed features within the project scope:

 User Interface
 Project Interfaces (if any)
 Data format and integrity
 Limited end-to-end function testing without upstream & downstream systems where possible

Primary Activities:

• Write specific test cases targeting end-to-end navigation of the workflow
• Test cases will cover both positive and negative test types
• Test cases will cover all key permutations of business data, processes, and procedures
• Execute workflow test cases and record results
• Record defects for any variance
• Execute regression & performance test scripts and record results during PILOT implementation
• Ensure results meet the Exit Criteria in the Test Plan
• Produce daily/weekly Test Execution Reports summarizing test results

Deliverable* | Responsible | Accountable
Functional Test Cases | Tester | Test Lead
System Test Results & Defects | Tester | Test Lead
Test Execution Report | Test Lead | Test Manager
Traceability Matrix | Test Lead | Test Manager
Regression Automation Scripts* | Automation Lead | Test Manager
Performance Scripts* | Performance Analyst | Test Manager

* Will be part of PILOT implementation deliverables


Entry Criteria:

 Release / build note with details of workflows eligible for testing: implemented features / changes, outstanding development work, defect fixes, CRs included in the build
 Smoke tested; main functions working and stable; no test-blocking defects
 Completion and sign-off of the project test plan
 Test cases and data completed
 Traceability Matrix created
 Defect handling procedure and tools in place

Exit Criteria:

 Review and sign-off of test results by project manager, development, and test manager
 Test case pass rate > 85% (see the sketch after this list)
 All workflows tested in at least 1 cycle
 Outstanding defects and workarounds documented
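
The pass-rate gate above lends itself to a mechanical check. The following is a minimal sketch, assuming a simple per-case result record; the structures, names, and threshold wiring are illustrative, not part of any mandated tool.

```python
# Minimal sketch: checking the workflow-testing exit gate (pass rate > 85%).
# Illustrative only; record structure and names are assumptions.
from dataclasses import dataclass


@dataclass
class TestResult:
    case_id: str
    workflow: str
    passed: bool


def pass_rate(results: list[TestResult]) -> float:
    """Percentage of executed test cases that passed."""
    return 100.0 * sum(r.passed for r in results) / len(results) if results else 0.0


def gate_met(results: list[TestResult], threshold: float = 85.0) -> bool:
    """Exit criterion: results recorded and pass rate above the threshold."""
    return bool(results) and pass_rate(results) > threshold


# Example: 6 of 7 cases passed -> 85.7% pass rate, gate met.
results = [
    TestResult("TC-01", "FedWires (Outgoing)", True),
    TestResult("TC-02", "Check Disbursement", True),
    TestResult("TC-03", "Receipts and Disbursements", True),
    TestResult("TC-04", "IRA Contributions/Distributions", False),
    TestResult("TC-05", "Transfers (Incoming)", True),
    TestResult("TC-06", "Standing and Periodic Instructions", True),
    TestResult("TC-07", "Journals", True),
]
print(f"Pass rate: {pass_rate(results):.1f}%, gate met: {gate_met(results)}")
```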

12.2.4 System Integration Test

Systems Integration Testing confirms successful operation from origin to end point. All scenarios will be analyzed to identify the touch points from an end-to-end data flow perspective. These scenarios will form the scope of SIT.

The goal is to validate that interfaces are correct so systems can perform together end-to-end and meet quality requirements.

Focus is on defect discovery through execution of end-to-end business scenarios and transactions, data flows, and system flows.

Additionally, in this test phase there is a strong focus on regression testing and upstream & downstream testing.

Primary Activities:

• Write specific test cases targeting end-to-end operation from origin to endpoint, covering upstream and downstream systems
• Test cases will cover both positive and negative test types


• Execute integration test cases and publish results
• Record defects for any variance
• Produce daily/weekly Test Execution Reports summarizing test results and quality risk assessment

Deliverable* | Responsible | Accountable
SIT Test Cases | Tester | Test Lead
System Integration Test Results & Defects | Tester | Test Lead
SIT Execution Report | Test Lead | Test Manager
Traceability Matrix – Updated for SIT | Test Lead | Test Manager
Regression Automation Scripts* | Automation Lead | Test Manager
Performance Scripts* | Performance Analyst | Test Manager

* Will be part of PILOT implementation deliverables

Entry Criteria (Gate 1):

 Workflow Testing completed; all outstanding defects logged
 SIT test cases designed and reviewed
 Test data files designed and reviewed
 Defect review meetings in place
 SIT environment deployed and smoke tested; build labels communicated
 Downstream and upstream systems ready for integration / regression testing

Exit Criteria:

 Successful execution of all SIT test cases (an illustrative end-to-end check is sketched after this list)
 Test case pass rate > 95%
 All outstanding defects logged, reviewed, and accepted by the business
 All interfaces working as per requirement specifications
 Downstream and upstream systems can receive, provide, and process interface data and files
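
To illustrate what one SIT end-to-end check could look like in practice, the sketch below submits a transaction at the origin system and polls the downstream system until the record arrives. All URLs, payload fields, statuses, and timings are hypothetical placeholders, not the actual IFS or SAI interfaces.

```python
# Illustrative SIT end-to-end check: origin system -> downstream system.
# Every endpoint and field name here is an invented placeholder.
import time

import requests

ORIGIN_URL = "https://ifs-sit.example.com/api/workflows/transfers"     # hypothetical
DOWNSTREAM_URL = "https://sai-sit.example.com/api/transactions/{ref}"  # hypothetical


def submit_transfer(account: str, amount: float) -> str:
    """Create a transfer in the origin system and return its reference id."""
    resp = requests.post(ORIGIN_URL, json={"account": account, "amount": amount}, timeout=30)
    resp.raise_for_status()
    return resp.json()["referenceId"]


def wait_for_downstream(ref: str, attempts: int = 20, delay_s: float = 15.0) -> dict:
    """Poll the downstream system until the transaction becomes visible."""
    for _ in range(attempts):
        resp = requests.get(DOWNSTREAM_URL.format(ref=ref), timeout=30)
        if resp.status_code == 200:
            return resp.json()
        time.sleep(delay_s)
    raise AssertionError(f"Transaction {ref} never reached the downstream system")


if __name__ == "__main__":
    ref = submit_transfer(account="TEST-0001", amount=250.00)
    record = wait_for_downstream(ref)
    assert record["status"] == "PROCESSED", record  # expected end-to-end result
```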

12.2.5 User Acceptance Test

User Acceptance Testing (UAT) is where the users validate that business requirements and usability criteria have been met by the delivered solution, ensuring that business needs are satisfied. UAT is a Business, not a Technology, responsibility.

User Acceptance Testing will be the accountability of the Business to plan and execute, with support from the individual application teams and test teams as required.

The standard defect management process will be followed for any defects identified during this phase of testing.

The QA Team will support the Business Team by sharing the SIT test cases and assisting with execution, defect re-testing, and regression.

12.2.6 Install & Back-out Test

The objective of back-out verification is to ensure each application can back out the changes installed and return the software to its current production version.

Each application team will perform an install and back-out verification to ensure the application can be installed and backed out successfully. Test scripts verifying high-level functionality will be prepared and executed to confirm successful installation.

Verify that migration to Production was successful and user requirements are satisfied.

Primary Activities:

• Execute implementation test cases, wellness checks, and post-implementation checklist activities, and record results
• Finalize the Test Summary Report summarizing all test results and the mitigation plan for any defects on delivery (includes consolidation of the final Test Results Report)
• Participate in the project Lessons Learned session


Deliverable* | Responsible | Accountable
I&B Test Cases | QA Team |
I&B Test Results | QA Team | Business Test Manager
I&B Test Defects | QA Team | Business Test Manager
I&B Test Summary Report | Business Test Manager | Business Test Manager

12.3 Test Phases (Testing Delivery)

12.3.1 Test Initiation (PDP – Concept & Requirements)

Communicate how testing for the project will be managed. Test initiation is a set of initial kick-off meetings within a project or release with the following objectives:

 To provide a functional walkthrough of the key new and changed features
 To provide an architectural walkthrough of the key new and changed system features
 To provide an initial, high-level understanding of the end-to-end system / application scope (e.g. regression / downstream / integration testing with other applications)
 To provide a high-level plan and milestones
 To define the key drivers (owners) and main stakeholders of the test effort and the related budget (constraints)

12.3.1.1 Primary Activities

• Provide a high-level overview of how the standard test process will be applied for this particular project
• Roughly estimate resources: people, environments, test data
• Create a preliminary project trace matrix, to include Business Requirements, Business Use Cases, and the Requirements Document
• Identify high-level risks that could impact the ability to adequately test


12.3.1.2 Deliverables

Deliverable* | Responsible | Accountable
Test Strategy | Test Manager | Test Manager
Preliminary Traceability Matrix | Testers | Test Manager

Test Strategy objectives:
 Definition of test scope, test approach, and test phases on the project level
 Expectations towards all release stakeholders
 Test resource plan, roles & responsibilities
 Scope of testing
 Test tool requirements and strategy
 Regression test requirements (if they exist in prod)
 Test data plan
 Includes User Story, Workflow Testing, and SIT approach

Preliminary Traceability Matrix objectives:
 Predominantly requirements traceability

12.3.2 Test Planning (PDP – Requirements / Analysis & Design Phase)

 Test planning produces two deliverables: the Test Plan and the test schedule (project plan)
 The detailed test planning must contain the test phases that will be executed as per the default strategy, and per test phase:
1. List all key test activities per test stage (planning, preparation, execution)
2. Duration (number of cycles) and timing of each test stage
3. Key delivery milestones
4. Resource (ramp-up) plan
Inputs to test planning (to determine the optimal amount of testing):
• Budget / resource availability
• Available test schedule / program milestones
• Planned test strategy
• Development delivery schedules


• Availability of test tools
• Test scope
• Level / complexity of functional scope
• Level / complexity of non-functional scope
• Level / complexity of integration testing of all release constituents
• Level of regression testing
• Level of test automation available

Test design is the set of test execution preparation activities to support efficient test execution
cycles.

• Test cases / scenarios must be documented
• Test cases must be reviewed by the functional (BAs, Ops, SMEs) and technology groups for completeness and correctness
• Supported by a coverage matrix
• Test design must be aligned with the test strategy
• Test design must be shared between the participating projects of the release (e.g. test data sharing)
• Tools must be created to support test data creation & validation
• Tools must be designed to support test result validation
• Test environments must be set up and shaken out

12.3.2.1 Primary Activities

• Decompose requirements and design to determine appropriate test coverage
• Specifically identify and secure resources required for testing: people, environments, test data
• Schedule and execute a review of the Test Plan
• Build test cases & scripts in accordance with Test Plans
• Update the Trace Matrix with test cases/scripts to ensure full coverage (a Trace Matrix is used to ensure all requirements have been addressed and that test cases provide adequate coverage of the requirements and the design solution); a sketch of such a coverage check follows this list
• Build, modify, or request environments, test tools, and test data, and train QA and testing staff if needed
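
As a minimal sketch of the coverage check mentioned above, assuming the trace matrix can be exported as a requirement-to-test-case mapping (all identifiers below are invented for illustration):

```python
# Minimal trace-matrix coverage check: flag requirements with no test cases.
trace_matrix: dict[str, set[str]] = {
    "BRD-001 FedWire outgoing":   {"TC-101", "TC-102"},
    "BRD-002 Check disbursement": {"TC-110"},
    "BRD-003 IRA distributions":  set(),  # not yet covered
}


def coverage_report(matrix: dict[str, set[str]]) -> None:
    """Print overall coverage and list any uncovered requirements."""
    covered = {req for req, cases in matrix.items() if cases}
    print(f"Coverage: {len(covered)}/{len(matrix)} requirements")
    for req in sorted(set(matrix) - covered):
        print(f"  NOT COVERED: {req}")


coverage_report(trace_matrix)
```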


12.3.2.2 Deliverables

Deliverable* | Responsible | Accountable
Detailed Test Plan | Test Lead | Test Manager
Test Cases/Scripts for Unit, System, SIT, UAT & Deploy & Back-out Levels | Business Users | Associated Test Leads
Trace Matrix | Application Test Lead | Application Test Manager
Updated Trace Matrix | Test Lead | Test Manager

The Detailed Test Plan is a high-level documentation of what will be covered in testing and how it will be covered (types of test, schedules, resources, etc.). Items that will not be tested must be specifically stated in the Test Plan, along with information on the potential risk of not testing them and what mitigation activities are planned to manage that risk.

12.3.2.3 Delivery Objectives

Detailed Test Plan:
 Definition of test scope, test approach, and test phases on the project level
 Expectations towards all release stakeholders
 Test resource plan, roles & responsibilities (BIO) (responsibility matrix)
 Scope of upstream and downstream system testing
 Test tool requirements and strategy
 Regression test requirements (if they exist in prod)
 Test data plan
 Migration test requirements
 Includes Dev, SIT, FT and UAT
 Test execution procedures: test sequence, validation points, Go/No-go decisions, suspension criteria, meeting schedules

Updated Trace Matrix:
 Coverage of FRD / BRD by test cases

Project Test Schedule:
 Number of test cycles and timeline
 Detailed test cycle execution schedules

Test Scenarios:
 Test coverage based on operational and business scenarios

Test Cases / Test Scripts:
 Detailed step-by-step instructions to execute a test scenario

Coverage Matrix:
 Coverage of FRD / BRD by test cases

Test Data:
 Manufactured data versus production data
 Reference data
 Data load
 Environment refresh

Test Tool Set-up and Development:
 Default test tools versus specific, custom-built test tools

Defect Handling Tool and Process:
 Set up and agree on the process, tool, and workflow for encountered defects

12.3.3 Test Execution (PDP – Build Phase: Unit Test Level)

Confirm that the system modules or components meet functionality and quality requirements.

 Test execution must be well prepared, through detailed test cycle execution plans that are communicated and shared between all test project participants
 Test execution procedures must be prepared and validated
 Environment shakeout and environment clean-up procedures must be clear
 The defect handling tool and process must be set up before test execution starts
 Test reports must be agreed and defined with business and technology management before test execution starts
 Any test-supporting tools must be installed and validated before test execution starts
 A code release management procedure must be agreed with the application development teams
 Resources must be identified and aligned
 Test cases (coverage) are reviewed and signed off by stakeholders
 Test execution is organized in test cycles. A test cycle is a logical set of test cases that will be executed together, according to a specified execution plan, by a given set of resources.


12.3.3.1 Primary Activities

• Execute unit test cases and record results
• Complete the Unit Test Checklist
• Ensure results meet the Exit Criteria for Unit Test in the Test Plan
• Produce daily/weekly Test Execution Reports summarizing test results and quality risk assessment

12.3.3.2 Deliverables

Deliverable* | Responsible | Accountable
Applicable Level Test Results | Test Lead | Test Manager
Applicable Level Test Defects | Test Lead | Test Manager
Applicable Level Test Summary Report | Test Lead | Test Manager

Note: All deliverables are provided by the Accountable person to the Test and/or Project Manager. The primary activities and deliverables for each level of test are the same, but the test scope and environment vary.

12.3.3.3 Test Execution Deliverables and Objectives

Test cycle execution:
 Execution of test cases
 Reviewing gating criteria

Defect log:
 Logging of defects

Release management procedures:
 Structured builds
 Release notes

Test reporting:
 Test results
 Test progress
 (Defect) metrics

Test sign-off:
 Receive sign-off from all stakeholders against pre-defined criteria
 Sign-off per gate: SIT ready, UAT ready, UAT exit, Production ready

12.3.4 Test Closure (PDP – Post Implementation Phase)

Test closure activities are to be performed after a project / release completes testing:

 Test Completion Report
1. Final results
2. List of outstanding defects
3. Test data storage
 Regression Automation suite
1. Test scripts
2. Test data
3. Frameworks
 Performance scripting
1. List of outstanding defects
2. Performance report
3. Scripts
 Lessons learned

12.3.4.1 Deliverables

Deliverable* | Responsible | Accountable
Test Closure Report | Application Test Manager / Business Test Manager | QA Director

12.3.4.2 Test Closure Deliverables and Objectives

Test Closure Report:
 Summary documentation of test execution and defect metrics

Regression Test Suite:
 Build the default regression test suite (test cases + test data + execution procedures)

Test Automation Requirements and Plan:
 Identify test automation requirements for future releases

Lessons Learned:
 Proposed process improvements
 Lessons learned for the next release

Production Support Hand-over:
 Hand over open defects to production support
 Agree on involvement of the QA team in the 3 levels of support

12.3.5 Regression Automation

Regression Testing is the testing of existing functionality after changes to a system have been made. It may reuse old tests (from a previous phase of system development) or may require new ones to be created, depending on the changing use of the system.

It consists of the execution of a suite of automated/manual tests which have been created, and are maintained, with the specific aims of:

 proving that existing application, system, or platform capabilities are unaffected by the changes being delivered
 identifying any damage to existing system capabilities caused by subsequent changes

Regression Automation covers the use of tools to automate the test execution process that might otherwise be undertaken manually.

Although automation testing is part of the testing lifecycle, the process of creating a suite of automated tests has its own lifecycle. As with any project lifecycle, test automation goes through the various phases of a software development lifecycle.

The steps involved in the development and execution of testing through automation are covered below:


 When to automate
 What to automate
 Regression tests
 Automation testing process

12.3.5.1 Test Automation Approach

Analysis & Planning:
 Feasibility study & PoC (if required)
 Scope evaluation for the various activities: ST, SIT, UAT, Regression
 Regression profiling: profiling of the test cases based on test case completeness, test case flow, test data variant conditions, and repeatability
 Effort analysis for key activities
 Create a detailed project plan addressing resourcing, deliverables, and schedule

Define & Design:
 Define the key criteria for framework set-up, including reusability, easy maintenance, and scalability across applications
 Develop the individual automation test scripts using the automation framework
 Integrate the individual test script units
 Dry run the script execution

Execute:
 Execute the automated test scripts
 Validate the automation scripts and analyze the execution reports
 Provide the execution results to the LTS Tech Team

Maintain:
 Maintain the test scripts and test data sheets by conducting impact analysis
 Modify the automation framework components (if required)

Deliverables by phase:

Analysis & Planning:
 Automation feasibility report
 Detailed automation plan (covering scope, test cases considered, effort estimations, and deliverables)

Define & Design:
 Enhancement to the framework design document (if required)
 Automation test scripts and data sheets

Execute:
 Automation execution reports
 Automation metrics

Maintain:
 Modified automated test scripts and data sheets (updated)
 Impact analysis document (created)

12.3.5.2 Regression Profiling

Profiling the test cases allows easy maintenance and gives the business better confidence in determining which test cases are regression-worthy. These test cases can be componentized based on their priority, business criticality, automatability, and facilitation.

Entry Criteria

 Signed-off and issued Test Plan with the Automation Approach
 An available functional system test environment for test execution, with suitable hardware and software components
 An available instance of all software components supporting test automation
 Available vendor support, either external or onsite
 Manual tests prepared prior to automation of tests
 Test data for test execution


Exit Criteria

 All planned cycles of automation tests completed successfully
 All defects / variances identified and logged
 A signed-off and issued Test Completion Report

Primary Activities

• Conduct a feasibility assessment of using test automation
• Define the scope of test automation
• Identify tools and frameworks for automation
• Conduct a proof of concept, if required, to understand feasibility
• Estimate the effort required to automate tests
• Analyze application functionality and manual test cases
• Identify manual test cases that can be automated
• Create traceability between manual test cases and automation test scripts
• Develop automation test scripts (a sketch of one such data-driven case follows this list)
• Unit test and debug automation test scripts
• Test the application by executing the automation scripts
• Analyze the results and log defects
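
For illustration, a data-driven regression case derived from a manual workflow test might look like the sketch below. pytest is assumed here only as an example runner, and the WorkflowClient driver is a hypothetical stand-in for whatever interface (UI or API) the automation framework wraps.

```python
# Illustrative data-driven regression case. The driver below is a stub;
# a real implementation would wrap the application under test.
import pytest


class WorkflowClient:
    """Hypothetical driver for the workflow under test."""

    def submit(self, workflow: str, payload: dict) -> str:
        # Stubbed behavior standing in for the real application.
        return "APPROVED" if payload.get("amount", 0) <= 50_000 else "QUEUED"


@pytest.fixture
def client() -> WorkflowClient:
    return WorkflowClient()


@pytest.mark.parametrize(
    "workflow, payload, expected",
    [
        ("transfers_incoming", {"amount": 1_000}, "APPROVED"),
        ("fedwire_outgoing", {"amount": 75_000}, "QUEUED"),  # over-limit drops to a queue
    ],
)
def test_workflow_regression(client, workflow, payload, expected):
    assert client.submit(workflow, payload) == expected
```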

12.3.5.3 Deliverables

Deliverable* | Responsible | Accountable
Automation Feasibility Report | Automation Lead | Test Manager
Traceability Matrix | Automation Lead | Test Manager
Test Automation Scripts | Automation Lead | Test Manager
Execution Log and Results | Automation Lead | Test Manager
Test Automation Summary Report | Automation Lead | Test Manager

Test Automation Deliverables and Objectives:

Deliverable* | Objective | Automation Phase
Automation Feasibility Report | Scope; feasibility activity list; automation solution | Analysis & Planning
Traceability Matrix | Requirements vs. test cases vs. automation scripts | Analysis & Planning
Automation Test Plan | Scope & approach; effort & resource estimates | Analysis & Planning
Automation Framework | Framework overview; framework components | Define & Design
Automation Coding Standard | Naming standards; code commenting conventions | Define & Design
Test Automation Scripts | Test coverage | Define & Design
Execution Log and Results | Execution of test cases; test results | Execute
Test Automation Summary Report | Test results; test progress; (defect) metrics | Execute
Automation Closure Report | Summary of project automation; summary of defects; failure summary; lessons learnt | Maintain
Impact Analysis / Effort Savings | % automation coverage | Maintain

12.3.6 Performance Testing

Performance testing usually occurs towards the end of the testing process, in a performance-specific test environment, because performance testing solutions tend to be expensive, require specialized skill sets, and require specific hardware and environments. This is a problem because APIs have service level agreements (SLAs) that must be met in order to release an application; if performance testing is left to the very last moment, failures to meet the SLAs can cause major release delays.

[Diagram: test pyramid (GUI, Web Services, API) with performance and endurance testing applied at the API layer.]

The approach is therefore to performance test early: shift-left performance testing through API calls. The approach below reflects this shift-left strategy.
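
A minimal sketch of such a shift-left check is shown below: call an API endpoint concurrently and compare latency percentiles against an assumed SLA. The endpoint, load profile, and SLA figure are placeholders, not measured requirements.

```python
# Shift-left performance smoke test at the API layer (illustrative).
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

ENDPOINT = "https://ifs-sit.example.com/api/health"  # hypothetical
SLA_P95_MS = 500                                     # assumed SLA figure


def timed_call(_: int) -> float:
    """Return the latency of one GET request, in milliseconds."""
    start = time.perf_counter()
    requests.get(ENDPOINT, timeout=10)
    return (time.perf_counter() - start) * 1000.0


with ThreadPoolExecutor(max_workers=10) as pool:
    latencies = sorted(pool.map(timed_call, range(100)))

p95 = latencies[int(len(latencies) * 0.95) - 1]
print(f"median={statistics.median(latencies):.0f}ms p95={p95:.0f}ms")
assert p95 <= SLA_P95_MS, "p95 latency breaches the assumed SLA"
```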


12.3.6.1 Performance Test Approach

[Diagram: performance test approach (to be updated from the earlier SAI approach).]

Entry Criteria

 Signed-off and issued Test Plan with the Performance Test Approach
 An available dedicated performance test environment for execution, with suitable hardware and software components
 An available instance of all software components supporting performance testing
 Available vendor support, either external or onsite
 Test data for test execution

Exit Criteria

 All planned cycles of performance tests completed successfully
 All defects / variances identified and logged
 A signed-off and issued Performance Test Completion Report

Primary Activities

• Identify tools and frameworks for performance testing
• Define the scope of performance testing
• Conduct a proof of concept, if required, to understand feasibility
• Estimate the effort required to create the performance test scripts
• Analyze performance requirements
• Create traceability between performance requirements & scripts
• Develop performance test scripts
• Unit test and debug performance test scripts
• Test the application by executing the performance scripts
• Analyze the results and log defects


12.3.6.2 Deliverables

Deliverable* | Responsible | Accountable
Performance Tool Selection Report | Performance Test Lead | Test Manager
Traceability Matrix | Performance Test Lead | Test Manager
Performance Test Plan | Performance Test Lead | Test Manager
Performance Framework | Performance Test Lead | Test Manager
Performance Coding Standard | Performance Test Lead | Test Manager
Performance Scripts | Performance Test Lead | Test Manager
Performance Log and Results | Performance Test Lead | Test Manager
Performance Test Summary Report | Performance Test Lead | Test Manager
Performance Test Closure Report | Performance Test Lead | Test Manager


Performance Test Deliverables and Objectives:

Deliverable* | Objective | Phase
Performance Tool Selection Report | Scope; feasibility activity list; feature support | Analysis
Traceability Matrix | Requirements vs. performance scripts | Analysis
Performance Test Plan | Scope & approach; effort & resource estimates | Analysis
Performance Framework | Framework overview; framework components | Design
Performance Coding Standard | Naming standards; rendezvous, ramp-up & ramp-down practices | Design
Performance Scripts | Test coverage | Design
Performance Log and Results | Execution of test cases; test results | Execution
Performance Test Summary Report | Test results; test progress; (defect) metrics | Execution
Performance Test Closure Report | Summary of project testing; summary of defects; failure summary; lessons learnt | Maintenance


12.3.7 Security Testing

Security testing requirements are formalized by a Security Impact Assessment (SIA). The SIA will determine which of the security testing techniques below are applicable for each release within this program.

Each participating application will be required to make its own assessment based on the changes being made. This assessment must be completed prior to the MTP due date to ensure it is covered as part of test planning. The application teams are responsible for planning and engaging the security testing arising from their assessment.

The LTS Information Security Team (ISO) will be responsible for directing and leading security testing overall.


12.3.7.1 QA's Security Testing Scope

The following vulnerabilities will be validated by the QA team as part of security testing; an illustrative probe for two of them follows the list.

 SQL Injection
 Cross Site Scripting
 Broken Authentication and Session Management
 Insecure Direct Object References
 Cross Site Request Forgery
 Security Misconfiguration
 Insecure Cryptographic Storage
 Failure to restrict URL Access
 Insufficient Transport Layer Protection
 Un-validated Redirects and Forwards
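
For two of the items above (SQL injection and cross-site scripting), a naive black-box probe might look like the sketch below. This is illustrative only; authoritative coverage will come from the IS team's tooling, and the target URL and parameter are hypothetical.

```python
# Naive security probes for SQL injection and reflected XSS (illustrative).
import requests

TARGET = "https://ifs-sit.example.com/search"  # hypothetical


def probe_sql_injection() -> bool:
    """A classic tautology payload should not trigger a DB error or leak SQL text."""
    resp = requests.get(TARGET, params={"q": "' OR '1'='1"}, timeout=10)
    return resp.status_code != 500 and "sql" not in resp.text.lower()


def probe_reflected_xss() -> bool:
    """A script payload echoed back unescaped indicates reflected XSS."""
    payload = "<script>alert(1)</script>"
    resp = requests.get(TARGET, params={"q": payload}, timeout=10)
    return payload not in resp.text


for name, probe in [("SQL injection", probe_sql_injection), ("Reflected XSS", probe_reflected_xss)]:
    print(f"{name}: {'pass' if probe() else 'POTENTIAL VULNERABILITY'}")
```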

The QA Team will work with Securities America's Information Security Team to execute the tests and ensure the security requirements are met.

All entry & exit criteria will be defined in accordance with the LTS IS Team, including deliverables and their ownership.

13. Defect Management Process

Any actual test result that does not match the expected result will be raised as a defect. Defects will be captured in TFS.

The standard defect management process as defined in this Test Strategy will be followed throughout all phases of testing. Each defect record must be clear about the issue, cause, fix, and re-tests. No defects are to be left unfixed without assessment from the team and the project, and without formal approval from the project.


13.1 Defect Detection, Analysis and Resolution

This process covers defect detection, analysis, fix order, defect tracking, and defect resolution. The diagram below shows the movement of a defect through the various statuses.

[Diagram: defect status flow.]


13.2 Role vs. Status & Required Fields

NEW
 Process: Defect is identified by Testers. The Test Lead reviews the defect for duplicity, validity, information details, etc.
 Required defect fields: Summary (Subject), Description, Test Phase, Build Version, Severity, Status, Detected on date, Detected By, Assigned To, Reproducible

OPEN
 Process: Defect is assigned to the owning group/developer. The Dev Lead reviews the defect and assigns it to a developer, who works on the code fix.
 Required defect fields: Status, Assigned To, Priority

REJECTED
 Process: Defect has been reviewed, evaluated and deemed to be a non-issue. The Test Lead/Manager moves the defect to Rejected status.
 Required defect fields: Status, Resolution, Priority, Root Cause, Defect Type

FIXED
 Process: Defect repair is complete and unit tested. The defect will be promoted to the test environment.
 Required defect fields: Status

DEFERRED
 Process: A management decision deems that the defect will not be resolved in the current release. The Test Lead/Manager moves the defect to Deferred status.
 Required defect fields: Status, Assigned To

REOPEN
 Process: Defect was fixed but failed again during re-test.
 Required defect fields: Status, Assigned To (if the defect details are to be collected)

MONITOR
 Process: Defect cannot be reproduced consistently.
 Required defect fields: Status, Assigned To

RETEST
 Process: The defect fix is in the test environment and ready to be re-tested by the defect originator. The Test Lead/Manager moves the defect to Retest status when the build containing the defect fixes is deployed.
 Required defect fields: Status, Assigned To

READY TO CLOSE
 Process: The defect problem is resolved and validated by the defect submitter. The Tester moves the defect to Ready to Close status.
 Required defect fields: Status, Assigned To

CLOSED
 Process: The defect problem is resolved. All field inputs are validated by the Test Lead/Manager for correctness.
 Required defect fields: Status
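The two matrices above imply a defect lifecycle. A minimal sketch of that state machine follows; the exact transition set (for example, whether MONITOR can move straight to READY TO CLOSE) is an assumption drawn from the descriptions above, and TFS itself would enforce the real rules.

```python
# Hedged sketch of the defect workflow implied by the status matrices.
# The allowed-transition set is an assumption, not a TFS export.
ALLOWED_TRANSITIONS = {
    "NEW": {"OPEN", "REJECTED"},
    "OPEN": {"FIXED", "REJECTED", "DEFERRED"},
    "FIXED": {"RETEST"},
    "RETEST": {"READY TO CLOSE", "REOPEN", "MONITOR"},
    "REOPEN": {"OPEN"},
    "MONITOR": {"RETEST", "READY TO CLOSE"},
    "READY TO CLOSE": {"CLOSED"},
    "REJECTED": set(),   # terminal
    "DEFERRED": set(),   # terminal for the current release
    "CLOSED": set(),     # terminal
}


def move(defect: dict, new_status: str) -> dict:
    """Apply a status change only if the workflow permits it."""
    current = defect["status"]
    if new_status not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Illegal transition {current} -> {new_status}")
    defect["status"] = new_status
    return defect


# Example usage:
bug = {"id": 101, "status": "NEW"}
move(bug, "OPEN")    # triage assigns the defect
move(bug, "FIXED")   # developer completes and unit-tests the fix
```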

13.3 Responsibilities


13.4 Severity Code Definitions


Severity code definitions will be as per those defined in this Test Strategy:

http://sharepoint.apps.anz/communities/ProcessBank/PB%20Process%20Artefacts/GTST006%20Defect%20Severity%20Definitions.ppt

Assuming the upgrade is considered a “new” system, the following definitions apply:


13.5 Priority Code Definitions


The priority of an issue reflects its impact on the testing schedule and is used to determine the order in which problems within a severity level should be addressed.

14. Test Environment Requirements


The environments will need to support the implementation of the IFS customized workflows.

The following non-production environments will be required:

 IFS Test Environment (for User Story / Feature Testing)
 IFS Environment (for Workflow Testing)
 SAI Acceptance Environment (for Systems Integration Testing)
 SAI Acceptance Environment (for UAT)
 Dedicated QA region for Performance & Security Testing activities

The Production Environment will be required only for Install & Back-Out Testing.


15. Dependencies

1. Dependency: Resource availability as per request.
   Management: An RRA has been raised for an additional technology test resource, located in Melbourne, to commence as soon as possible.

2. Dependency: An SME will be required to verify completeness of the IFS Workflow and base framework.

3. Dependency: An infrastructure resource will be required to validate completeness of the infrastructure build.

16. Gating Approach


A quality gate approach will be employed between the application teams and the project for all phases of testing, with the intention of reducing the impact of test risks being transferred between phases.

The project and each application team will use standard, pre-defined entry and exit criteria for their respective test activities. For any criterion not met, the potential risks and mitigation strategies are to be outlined. All standard Process Bank entry and exit documentation is to be used. A minimal sketch of such a gate check appears below.

Approvals for all gating documents will follow Process Bank requirements.

The decision to override a quality gate will be based on overall project objectives and the potential risks.
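The gate logic itself is simple: every criterion must be met, and any unmet criterion must be surfaced with its risk and mitigation. The criterion names below are illustrative examples, not the project's actual checklist items.

```python
# Hedged sketch of a quality-gate evaluation. Criteria names are
# hypothetical; the real items come from the entry/exit checklists.
ENTRY_CRITERIA = {
    "test_environment_ready": True,
    "test_scripts_reviewed_and_approved": True,
    "build_deployed_to_test": False,
}


def evaluate_gate(criteria: dict) -> bool:
    """Return True only when every criterion is met; report the rest."""
    unmet = [name for name, met in criteria.items() if not met]
    for name in unmet:
        print(f"NOT MET: {name} - document risk and mitigation strategy")
    return not unmet


if __name__ == "__main__":
    if not evaluate_gate(ENTRY_CRITERIA):
        print("Gate failed: formal override approval required to proceed.")
```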

16.1 User Story Testing


16.1.1 Entry Criteria
16.1.2 Exit Criteria

16.2 Feature / Workflow Testing


16.2.1 Entry Criteria
16.2.2 Exit Criteria


17. Testing Organization


17.1 Structure


17.2 Roles & Responsibilities

S No.  Role  Responsibilities

1  QA Strategist
 Complete ownership of the quality of the product delivered.
 Identify areas that need to be tested, design test strategies that target those areas, and set acceptance criteria for each test.
 Work closely with test engineers, emphasizing the importance of responsible, quality testing practices.
 Monitor the continued quality of technology products and find ways to increase their performance.
 Deliver quality metrics, measurement and progress reporting.
 Act as the first point of contact for all levels of testing activities.
 Liaise with customer functional teams when required to ensure the progression of ST, SIT and UAT activities.
 Define the Test Strategy / Plan.
 Ensure all application test activities track to the agreed schedule.
 Act as the escalation point for overall testing.
 Obtain sign-offs for key deliverables.

2  Sr. QA Analyst
 Review test steps once completed by application teams.
 Scripting and test execution.
 Defect management.
 Daily/weekly status reporting.
 Escalation point for Test Analysts.

3  QA Analyst
 Complete test design steps.
 Execution of test cases.
 Raising and retesting defects.

4  Sr. Test Automation Engineer
 Prepare the Automation Risk Assessment.
 Manage and produce an Automation Test Plan.
 Scripting and execution of test cases.
 Manage and produce test deliverables.
 Manage defects.

5  Test Automation Engineer
 Scripting and execution of test cases.
 Manage and produce test deliverables.
 Manage defects.

6  Performance Tester
 Prepare the Performance Risk Assessment.
 Manage and produce a Performance Test Plan.
 Scripting and execution of test cases.
 Manage and produce test deliverables.
 Manage defects.

7  Security Tester
 Prepare the Security Risk Assessment.
 Manage and produce a Security Test Plan.
 Scripting and execution of test cases.
 Manage and produce test deliverables.
 Manage defects.

8  QA Manager
 Manage conflicts/issues that impact testing.
 Ensure quality metrics, measurement and progress reporting.
 Manage the testing schedule for all testing phases.
 Daily/weekly status reporting at the application level.
 Engage the Test Automation, Performance and Security test teams for all releases.
 Escalation point for Test Leads/Test Analysts.
 Define the Detailed Test Plan for each release.
17.3 Communications Plan

The following communications will take place between the Sensiple QA Team and the Project Stakeholders:

 Weekly Testing meetings
 Weekly status reports
 Daily stand-up meetings
 Daily Defect meetings
 Test Management - weekly status reports

18. Resource Estimates


Resource estimates for the different types of testing are as below:


S No.  Role                           Location

1      QA Strategist / Test Manager   Onsite
2      Sr. QA Analyst                 Offshore
3      QA Analyst                     Offshore
4      Sr. Test Automation Engineer   Offshore
5      Test Automation Engineer       Offshore
6      Performance Tester             Offshore
7      Security Tester                Offshore
8      QA Manager                     Offshore

19. Testing Risks

ART Ref. No. PR000294-RART192
Description: If the performance of virtualisation is not properly assessed, there is a risk that virtualisation of hardware will not be possible due to a number of factors (refer to the Journal), which may lead to the need to procure additional hardware.
Probability: Low | Impact: High
Mitigation Strategy: Ensure that the assessment is completed at the earliest.
Owner: Prasanna Kumaran | Status: Open

ART Ref. No. PR000294-RART193
Description: If testing requirements are not properly defined during Evaluate, there is a risk of insufficient test support from the Managed Services team, which may lead to the need for additional testing support and an extension of the testing phase.
Probability: Low | Impact: High
Mitigation Strategy: Raise the RRA once DIRC approval is sorted and track progress.
Owner: Prasanna Kumaran | Status: Open

20. Deliverables

Deliverable: Test Strategy / Master Test Plan
Description: The Master Test Plan documents the overall test scope, objectives and requirements for the software project at an application release level, and ensures each phase has a clear focus and aligns to the Project Delivery Strategy.
When created: At the commencement of test planning.

Deliverable: Detailed Test Plan
Description: The Detailed Test Plan documents the testing approach for a particular phase of testing.
When created: Each phase of testing.

Deliverable: Test Scenarios
Description: Test Scenarios document WHAT needs to be tested. They are documented for each phase of testing to ensure all facets of a specification or change have been covered.
When created: Each phase of testing, excluding those phases managed by the Project.

Deliverable: Test Scripts
Description: Test Scripts document HOW to test a test condition. They contain detailed instructions on how to execute each step of the test, along with a description of the expected system response to each action, including its applicable pre-condition.
When created: Each phase of testing, excluding those phases managed by the Project.

Deliverable: Entry Criteria Checklists
Description: The Entry Criteria Checklist assesses the readiness of the team to commence test execution, and is completed for each phase of testing.
When created: Each phase of testing.

Deliverable: Exit Criteria Checklists
Description: The Exit Criteria Checklist assesses the readiness to proceed into the next phase of testing, and is completed for each phase of testing.
When created: Each phase of testing.

Deliverable: Test Summary Report
Description: The Test Summary Report (TSR) contains a summary of testing performed and shows how actual testing maps back to planned testing (e.g. the DTP), clearly outlining any variances.
When created: Updated and approved at each phase of testing.

Deliverable: Security / Performance Test Report
Description: This report documents the findings of all Security / Performance Testing undertaken for a project or initiative. It details any outstanding security/performance defects and provides a risk assessment.
When created: Upon completion of Security / Performance Testing.

21. Test Metrics

Metric Type: Test Design - Productivity
 No. of test cases created - Preparation
 Actual time taken (Person Days) - Preparation
 No. of test cases scripted (Test Automation) - Define & Design
 No. of test cases reviewed - Preparation

Metric Type: Test Design - Quality
 No. of test cases re-created - Preparation / Execution
 Total test cases passed - Execution
 Total test cases failed - Execution

Metric Type: Test Execution - Productivity
 No. of test cases executed - Execution
 Actual time taken (Person Days) - Execution
 No. of test cases run (Test Automation) - Execution
 Average time for defect repairs - Execution
 Number of defects found - Execution
 Total effort spent (in hours) - Execution

Metric Type: Defect Efficiency
 Defect Density / Defect Age - Execution
 Defect Detection Ratio - Execution
 Defect Leakage - Closure
 Defect Rejection Ratio - Closure
 Defects Deferred Percentage - Closure

Metric Type: Test Effectiveness
 Total number of defects found in ST - Closure
 Total number of defects found in UAT - Closure
 Total number of defects found in ST + UAT - Closure
 Requirement Creep Percentage - Execution / Closure
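Several of the defect metrics above are ratios computed at closure. The following sketch shows commonly used formulas for them; the project's exact definitions (for example, whether leakage is measured against UAT or production defects) may differ, so treat these as illustrative assumptions.

```python
# Hedged sketch: common formulas behind the defect metrics above.
def defect_detection_ratio(found_in_st: int, found_in_uat: int) -> float:
    """Percentage of all defects caught before UAT (ST effectiveness)."""
    total = found_in_st + found_in_uat
    return 100.0 * found_in_st / total if total else 0.0


def defect_leakage(found_in_st: int, found_in_uat: int) -> float:
    """Percentage of defects that escaped ST into UAT."""
    total = found_in_st + found_in_uat
    return 100.0 * found_in_uat / total if total else 0.0


def defect_rejection_ratio(rejected: int, raised: int) -> float:
    """Percentage of raised defects that were rejected as non-issues."""
    return 100.0 * rejected / raised if raised else 0.0


def defect_density(defects: int, test_cases_executed: int) -> float:
    """Defects per executed test case (one common normalization)."""
    return defects / test_cases_executed if test_cases_executed else 0.0


# Example: 45 defects in ST, 5 leaked to UAT -> 90% detection, 10% leakage.
print(defect_detection_ratio(45, 5), defect_leakage(45, 5))
```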



22. Testing Tools
