
Camino Leaf Test Plan

Version 1.0

CSOL 560
Secure Software Design and Development
Team Members:
Nicholas Balich
Tisa Carlos
Douglas Cleary
Micah Geertson
Alejandra Mejia
Tyler Murray
21 OCT 2018

Purpose: The purpose of this document is to provide an in-depth look into the secure software
development testing procedures and lifecycle. This document serves to showcase the unit,
security, and quality assurance testing that has been implemented during the development of the
Camino Leaf interface. Additionally, an in-depth testing strategy, environment plan, and
administrative plan have been developed to establish a baseline standard against which the
application can be held.

Revision History

Date: 22 OCT 2018
Version: 1.0
Author: Camino Leaf Engineering Team
Description: Test plan documentation to support the installation and deployment of Camino Leaf 1.0 software.


Table of Contents
1. Overview
1.1. Purpose
1.2. Scope
2. Testing Summary
2.1. Scope of Testing
2.1.1. In scope
2.1.2. Out of scope
3. Analysis of Scope and Test Focus Areas
3.1. Release Content
3.2. Unit Testing
3.3. Functional Testing
3.4. Regression Testing
4. Other Testing
4.1. Verification Testing
4.2. Validation Testing
5. Test Strategy
5.1. Test level responsibility
5.2. Test Type & Approach
5.3. Build strategy
5.4. Facility, data, and resource provision plan
5.4.1. Test Environment and Requirements
5.4.2. Resources & Skills
5.5. Testing Tools
5.6 Testing Handover Procedure
5.7 Testing Metrics
6. Test Environment Plan
6.1. Test Environment Diagram
6.2. Test Environment Details
6.2.1. Testers
6.3. Establishing Environment
6.4. Environment Control
6.5. Environment Roles and Responsibilities
7. Assumptions and Dependencies
7.1. Assumptions
7.2. Dependencies
8. Entry and Exit Criteria
9. Continuous Software Mitigation
9.1. Product Security Incident Response Team
9.2. PSIRT Organizational Model
9.3. PSIRT Process Model
10. Definitions
11. References
12. Points of Contact


1. Overview

1.1. Purpose
The purpose of this document is to describe the Test Plan for the Camino Leaf software and the
overall framework that will drive this testing. In addition, it covers the following:
● The test scope focus areas and objectives
● The test responsibilities
● The test strategy for the levels and types of test for this release
● The entry and exit criteria
● The basis of the test estimates
● Any risks, issues, assumptions and test dependencies
● The test schedule and major milestones
● The test deliverables

1.2. Scope
This document details the testing that will be performed by the project team for the Camino Leaf
project. It defines the overall testing requirements and provides an integrated view of the project
test activities. Its purpose is to document:
● What will be tested;
● How testing will be performed;
● What resources are needed, and when.

2. Testing Summary

2.1. Scope of Testing


2.1.1. In scope
The scope of testing is to include unit testing, input/output validation testing, integration testing,
and end-user acceptance testing. This includes full-stack functional testing of Camino Leaf to
ensure that data entry and endpoints send and return properly. Load and stress testing will be
performed to evaluate how the system fails under duress. Unit and input/output validation testing
will be conducted to ensure user input is properly formatted and executed when transitioning from
client to server side. Additionally, validation testing will allow the project team to uncover any
system-level error messages that should be suppressed from the user's view to prevent
disclosure of detail that could support malicious intent. Integration testing will be used to validate
that system-level components properly function within various networked environments.

It is assumed that unit testing already provided thorough black box testing, extensive coverage of
source code, and testing of all module interfaces.
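
To illustrate the kind of input/output validation described above, the sketch below shows how such checks might be expressed with Ruby RSpec (the unit-testing tool selected in section 5.5). `PurchaseForm` and `sanitize_error` are hypothetical stand-ins used only for illustration, not actual Camino Leaf modules.

```ruby
# Minimal RSpec sketch of the input/output validation checks described above.
# PurchaseForm and sanitize_error are hypothetical names used for illustration.
require 'rspec'

# A stand-in validator: accepts only a well-formed 10-digit phone number.
class PurchaseForm
  PHONE_PATTERN = /\A\d{10}\z/

  def self.valid_phone?(input)
    !!(input =~ PHONE_PATTERN)
  end
end

# A stand-in error handler that hides system-level detail from the user view.
def sanitize_error(_exception)
  'An error occurred. Please contact your administrator.'
end

RSpec.describe 'client-to-server input validation' do
  it 'accepts correctly formatted input' do
    expect(PurchaseForm.valid_phone?('6195551234')).to be true
  end

  it 'rejects malformed or potentially malicious input' do
    expect(PurchaseForm.valid_phone?("555-1234'; DROP TABLE users;--")).to be false
  end

  it 'suppresses system-level error messages from the user view' do
    message = sanitize_error(RuntimeError.new('PG::ConnectionBad: could not connect'))
    expect(message).not_to match(/PG::|connect/)
  end
end
```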

2.1.2. Out of scope


The following systems are out of scope for this test:

● Firewalls
● Border Routers
● Switches
● External systems/connections


Any test not mentioned in section 2.1.1 will be excluded from this document. This includes, but is
not limited to: scalability testing, hardware acceptance testing, browser compatibility testing,
recovery testing, comparison testing, or exploratory testing.

3. Analysis of Scope and Test Focus Areas

3.1. Release Content


The contents of this document will only be released to the authorized Camino Leaf Team, testers,
and any industry partners with Non-Disclosure Agreements (NDA) in place.

3.2. Unit Testing

Each unit test below lists its function, test objective, evaluation criteria, cross-references (X-Ref), and recorded result (P).

3.2.1 Common Access Card Logon
Test Objective: Successfully logon using government issued Common Access Card (CAC).
Evaluation Criteria: User can successfully logon to the Camino Leaf application using their CAC.
X-Ref: DoDI 8520.02 - Public Key Infrastructure (PKI) and Public Key (PK) Enabling; FR-A24; NFR-2
Result: P

3.2.2 GUI View
Test Objective: Verify GUI loads completely. Correct language is utilized.
Evaluation Criteria: Logon sequence completes successfully. Default start page set to user’s personal dashboard display.
X-Ref: FR-A2; FR-A4
Result: P

3.2.3 Classification Banner
Test Objective: Verify correct classification banner displays on unclassified/classified systems.
Evaluation Criteria: Post-logon, security banner successfully displays across the top portion of the screen. Green/UNCLASSIFIED for NIPRNET systems and Red/SECRET for SIPRNET.
X-Ref: DoDI 8310.01 - Information Technology Standards in the DoD; FR-A3; NFR-5
Result: P

3.2.4 Connection Point Details
Test Objective: Verify identifying information of system used for access is captured and displayed.
Evaluation Criteria: From menu bar, select ‘Connection’ -> ‘Details’. Verify user name and machine details listed as: Connected:, IP:, Device Name:.
X-Ref: DoDI 8310.01 - Information Technology Standards in the DoD; FR-A11; NFR-5
Result: P

3.2.5 Real Time System Status
Test Objective: Review System Dashboard for accuracy.
Evaluation Criteria: Verify all time stamps, alerts, action items, POC information, and feature accessibility.
X-Ref: FR-A2; FR-A27; FR-A46; FR-B1; FR-B11
Result: P


3.2.6 User/Admin Permissions Verification
Test Objective: Verify administrative controls are available only to users designated as System Administrators.
Evaluation Criteria: From connection details, verify user designation after successful logon. Standard users display “user” under identifying indicators. System Administrators display “Admin”.
X-Ref: DoDI 8310.01 - Information Technology Standards in the DoD; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A9; NFR-5
Result: P

3.2.7 Customized Widget Function
Test Objective: Verify users’ capability to prioritize and customize widgets in the GUI.
Evaluation Criteria: From main dashboard display, click the setting “gear” icon next to the individual menu options. Verify configuration options are available. Verify all changes save successfully.
X-Ref: FR-A2; FR-AA9; FR-A48
Result: P

3.2.8 Alert Notification
Test Objective: Verify new alert pop-up displays correctly on login.
Evaluation Criteria: From top menu bar, select ‘Alerts’ -> ‘New Alerts’. Verify timestamp on most recent alert.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; FR-A45; FR-A50; FR-B9; FR-B10
Result: P

3.2.9 Alert Database
Test Objective: Verify previous alerts are stored for review and access.
Evaluation Criteria: From top menu bar, select ‘Alerts’ -> ‘Previous Alerts’. Verify 90 days of alerts are viewable. *Note – Alerts archive is searchable through the Alerts DB and reporting features.*
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; FR-A45; FR-A50; FR-B9; FR-B10
Result: P

3.2.10 IAVA/B/M/TA/CTO Verification
Test Objective: Verify IAVA/B/M/TA/CTO database reflects applicable vulnerabilities.
Evaluation Criteria: From the top menu bar, select ‘IAVM’ and browse through the following choices: CTO, IAVA, IAVB, TA. Verify timestamp and system applicability of available vulnerability notices.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; NIST SP 800-40 rev. 3 - Enterprise Patch Management; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A38; FR-A39; FR-A45; NFR-2
Result: P


3.2.11 Weather Alert Verification
Test Objective: Verify Weather Alert notifications are configured to report on issues in the immediate area.
Evaluation Criteria: From the top menu bar, select ‘Weather Alert’ -> ‘Zone’. Verify Weather alerts are configured to the current zip code; radius is set to 250 miles.
X-Ref: FR-A19; FR-A34
Result: P

3.2.12 National Vulnerability Database (NVD) Synchronization
Test Objective: Verify successful synchronization with NVD.
Evaluation Criteria: From the top menu bar, select ‘NVD’ -> ‘Status’. Verify synchronization settings are configured for daily syncs at 0001 (local). Verify timestamp of last update.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; NIST SP 800-40 rev. 3 - Enterprise Patch Management; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A22; FR-A25; FR-A40; FR-A45; NFR-2
Result: P

3.2.13 DISA Security Technical Implementation Guidance (STIG) Synchronization
Test Objective: Verify successful synchronization with DISA STIG database.
Evaluation Criteria: From the top menu bar, select ‘STIG’ -> ‘Status’. Verify synchronization settings are configured for weekly syncs on SAT, 0001 (local). Verify timestamp of last update.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A23; FR-A41; FR-A45; NFR-2
Result: P

3.2.14 Vendor Validation
Test Objective: Verify vendor look up features.
Evaluation Criteria: From the top menu bar, select ‘SCRM’ -> ‘Vendor’ -> ‘Search’. Verify all available Vendor search options, to include name, location, product, price range, and Cleared Defense Contractor (CDC).
X-Ref: DoDI 4140.01 - Supply Chain Material Management; FR-A14; FR-A15; FR-A20
Result: P


3.2.15 Purchaser / Approver Validation
Test Objective: Review list of valid purchasers / approvers for accuracy. Ensure local POCs are listed with the appropriate title and designation.
Evaluation Criteria: From the top menu bar, select ‘SCRM’ -> ‘Purchases’ -> ‘Authorized Purchasers’ -> ‘Search’. Verify all available Purchaser search options, to include name, location, Unit Identification Code (UIC), approver, role, and Purchase Authorization Type.
X-Ref: DoDI 4140.01 - Supply Chain Material Management; FR-A14; FR-A16; FR-A17; FR-A18
Result: P

3.2.16 Report Menu Functionality
Test Objective: Verify all reporting options display correctly and are selectable by standard users and administrators.
Evaluation Criteria: From the top menu bar, select ‘Reports’. Verify all report options are listed, to include Alert, Vulnerability, STIG, Vendor, and Purchaser. Verify monthly, quarterly, and annual reporting options are available. Verify configuration options based on product, time, and date.
X-Ref: FR-A45; FR-A49; FR-B8; FR-B10; NFR-4
Result: P

3.2.17 Reporting Capabilities
Test Objective: Verify reports successfully queue, load, and display.
Evaluation Criteria: Submit no less than three (3) reports for processing.
X-Ref: FR-A45; FR-A49; FR-B8; FR-B10; NFR-4
Result: P

3.2.18 Report Export Formats
Test Objective: Verify export options for reports.
Evaluation Criteria: Submit report to export in one of each of the following formats: *.CSV, *.PDF, *.TXT, *.XLSX, *.XML.
X-Ref: FR-A45; FR-A47; FR-A49; FR-B8; FR-B10; NFR-4
Result: P

3.2.19 Console Logon
Test Objective: Logon directly to Camino Leaf via the system console.
Evaluation Criteria: Using an external device, connect to the Camino Leaf system rack via the console port to verify connectivity in the event of loss of web access.
X-Ref: FR-A5
Result: P


3.2.20 UPS Integration
Test Objective: Verify UPS status is accessible from system console.
Evaluation Criteria: From the console connection, verify UPS status to include Admin Information, UPS Status, Battery Status, Power Parameters, Logs, and Diagnostics.
X-Ref: FR-A32
Result: P
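
As a concrete illustration of how one of the unit test entries above might be automated, the hedged RSpec sketch below mirrors 3.2.18 (Report Export Formats) and carries the test reference and X-Ref values as example metadata. `ReportExporter` is a hypothetical module name, not part of the actual Camino Leaf codebase.

```ruby
# Illustrative RSpec sketch of unit test 3.2.18 (Report Export Formats).
# ReportExporter is a hypothetical stand-in for the real export logic.
require 'rspec'

module ReportExporter
  SUPPORTED_FORMATS = %w[csv pdf txt xlsx xml].freeze

  def self.export(report_name, format:)
    raise ArgumentError, "unsupported format: #{format}" unless SUPPORTED_FORMATS.include?(format)
    "#{report_name}.#{format}"   # stand-in for real export logic
  end
end

RSpec.describe ReportExporter, ref: '3.2.18', xref: %w[FR-A45 FR-A47 FR-A49 FR-B8 FR-B10 NFR-4] do
  ReportExporter::SUPPORTED_FORMATS.each do |format|
    it "exports a report as .#{format}" do
      expect(described_class.export('monthly_alerts', format: format)).to end_with(".#{format}")
    end
  end

  it 'rejects formats outside the documented list' do
    expect { described_class.export('monthly_alerts', format: 'docx') }.to raise_error(ArgumentError)
  end
end
```

Tagging examples with the requirement references in this way would allow results from a test run to be filtered by requirement when tracked in Jira TestFlo.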

3.3. Functional Testing


Each functional test below lists the requirement under test, the test actions, the expected result, and cross-references (X-Ref).

3.3.1 Separation of User and Administrative Accounts
Action:
- Standard user CAC logon.
- Verify GUI ‘Admin’ menu unavailable.
- Verify user unable to make system changes.
- Escalate to administrative permissions through non-CLO authentication.
- Verify GUI ‘Admin’ is available.
- Verify admin user can make/save changes to settings, i.e. synchronization configurations, location, and add/remove user.
Result: Accounts are successfully separated into user/admin roles.
X-Ref: DoDI 8310.01 - Information Technology Standards in the DoD; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A9; NFR-5

3.3.2 Verify Account Lockout Settings
Action:
- Using non-CLO (username/passwd) account to test.
- Correctly type username; enter incorrect password 3 times in 2 minutes.
- Verify account locks out for 20 mins.
- *Note – In a real-world situation, account lockout after 20 minutes will also lock out the CAC, forcing the user to unlock at the nearest available RAPIDS facility.*
Result: Account lockout policies are set in accordance with operating policy.
X-Ref: DoDI 8310.01 - Information Technology Standards in the DoD; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A7; FR-A8; FR-A9; NFR-5

3.3.3 Verify Vendor Stock
Action:
- Standard user CAC logon.
- Select ‘SCRM’ -> ‘Vendor’ from the top menu bar.
- Select vendor of choice and click ‘View Stock’.
- Verify stock listings include part number, National Stock Number (NSN), price, and quantity.
Result: Vendor stocks are synced with the Camino Leaf application and correctly reporting inventory.
X-Ref: DoDI 4140.01 - Supply Chain Material Management; FR-A14; FR-A15; FR-A20


3.3.4 Purchase Validation
Action:
- Standard user CAC logon.
- Select ‘SCRM’ -> ‘Purchases’ -> ‘Recent Purchases’.
- From the results, highlight one and select ‘View Properties’.
- Verify purchaser and approver information displays successfully.
- Verify purchaser and approver are not the same individual.
Result: Verification that the purchasing management function is accurately capturing purchase details.
X-Ref: DoDI 4140.01 - Supply Chain Material Management; FR-A14; FR-A16; FR-A17; FR-A18

3.3.5 Weather Alert Status
Action:
- Standard user CAC logon.
- Select ‘Weather Alert’ -> ‘Zone’ from the top menu.
- Under zip code range, type ‘92110’ and click update.
- Verify current weather status is available.
- Test remote location; type ‘23454’ (Dam Neck, VA) and click update.
- Verify Weather alerts have updated to reflect the new location.
Result: Weather alert function is successfully reporting current conditions.
X-Ref: FR-A19; FR-A34

3.3.6 Push Alert Functionality
Action:
- Standard user CAC logon.
- Non-CLO escalation to administrative permissions.
- Select ‘Alerts’ -> ‘Push Notification’.
- Click the box next to the alert designated for transmission.
- From the menu on the left, select ‘Individual User’.
- Manually enter the 10-digit telephone number capable of receiving SMS alerts, selected for verification.
- Click transmit.
- When prompted, ‘Are you sure?’, click Yes.
- Verify on the designated phone number the alert was received and all text is intact.
Result: Push alert function to support SMS distribution, FEMA alerts, and Integrated Public Alert and Warning System (IPAWS) requirements is functioning as designed.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; FR-A29; FR-A34; FR-A45; FR-A50; FR-B9; FR-B10


3.3.7 Alerts – Customization
Action:
- Standard user CAC logon.
- From top menu bar, select ‘Alerts’ -> ‘Current View’.
- Click the ‘gear’ icon on the top right to customize the current Alert view.
- Select the identifying criteria required as follows: Alert Type, Alert Originator, Alert Date Range, Key Word, Status (No Action Required, Pending, Unresolved, Resolved).
- Once complete, click ‘Save Changes’ on the bottom right.
- Confirm customized view saved as configured.
Result: Alert view is customizable for each user.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; FR-A45; FR-A50; FR-B9; FR-B10

3.3.8 Reports – Purchase
Action:
- Standard user CAC logon.
- From the top menu bar, select ‘Reports’ -> ‘New Report’ -> ‘Purchases’.
- Input the identifying criteria required. Options available are as follows: Purchaser, Approver, Vendor, Item (Part # / NSN), Location, Date of Purchase, Date of Approval, Status (Pending, Approved, In Transit, Complete).
- Click the ‘Generate Report’ icon on the bottom right.
- Verify report information displays correctly and includes all information selected in the report generation screen.
Result: Purchase reports display correctly.
X-Ref: FR-A45; FR-A49; FR-B8; FR-B10; NFR-4


3.3.9 Reports – Alerts
Action:
- Standard user CAC logon.
- From the top menu bar, select ‘Reports’ -> ‘New Report’ -> ‘Alerts’.
- Input the identifying criteria required. Options available are as follows: Alert Type, Alert Originator, Alert Date Range, Key Word, Status (No Action Required, Pending, Unresolved, Resolved).
- Click the ‘Generate Report’ icon on the bottom right.
- Verify report information displays correctly and includes all information selected in the report generation screen.
Result: Alert reports display correctly.
X-Ref: FR-A45; FR-A49; FR-B8; FR-B10; NFR-4

3.3.10 Reports – Vulnerabilities
Action:
- Standard user CAC logon.
- From the top menu bar, select ‘Reports’ -> ‘New Report’ -> ‘Vulnerabilities’.
- Input the identifying criteria required. Options available are as follows: System, Vulnerability Type (IAVA, IAVB, IAVM, TA, CTO), Release Date, KB Association, CVE Identifier, Key Word, Status (No Action Required, Pending, Unresolved, Resolved).
- Click the ‘Generate Report’ icon on the bottom right.
- Verify report information displays correctly and includes all information selected in the report generation screen.
Result: Vulnerability report displays correctly.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; NIST SP 800-40 rev. 3 - Enterprise Patch Management; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A38; FR-A39; FR-A45; NFR-2


3.3.11 Reports – System Status
Action:
- Standard user CAC logon.
- Non-CLO escalation to administrative permissions.
- From the top menu bar, select ‘Reports’ -> ‘New Report’ -> ‘System Status’.
- Place checkmarks in the following boxes to scan for non-compliance: IAVA, IAVB, IAVM, TA, CTO, STIG, CVSS.
- Click the ‘Generate Report’ icon on the bottom right.
- Review results for cyber compliance.
Result: Monitor system for cyber compliance.
X-Ref: DoDI 8530.01 - Cybersecurity Activities Support to DoD Information Network Operations; NIST SP 800-40 rev. 3 - Enterprise Patch Management; DoDD 8140.01 & DoD 8570.01-M - Cyber Workforce Management Program; FR-A22; FR-A23; FR-A25; FR-A38; FR-A39; FR-A41; FR-A45

3.3.12 Reports – Export Format
Action:
- Standard user CAC logon.
- From the top menu bar, select ‘Reports’ -> ‘Current Reports’.
- Right click on an available report and select export.
- Choose the directory and format. Click ‘Save’.
- From the hosting device, browse to the export directory and open the report to confirm data is readable and in the correct format.
- Repeat for each of the following: *.CSV, *.PDF, *.TXT, *.XLSX, *.XML.
Result: Report data successfully exports in the required format.
X-Ref: FR-A45; FR-A47; FR-A49; FR-B8; FR-B10; NFR-4


3.3.13 Track User Activity
Action:
- Standard user CAC logon.
- Non-CLO escalation to administrative permissions.
- Select ‘Admin’ -> ‘User Activity’ from the top menu bar.
- Filter User Activity based on the following: User, User Details (i.e. access location), Date, Time, Activity, Generated Reports, Push Alerts, Files Accessed, System Changes.
- Review User results.
Result: Successfully track all user actions within the Camino Leaf system.
X-Ref: FR-A6; FR-A11; FR-A12; FR-A13

3.3.14 Data Push via TAXII
Action:
- Standard user CAC logon.
- Non-CLO escalation to administrative permissions.
- *Data report must be executed prior to data push. All system reports (alert, vendor, weather, vulnerability, etc.) are eligible for distribution to subscribers.*
- Select ‘Admin’ -> ‘TAXII’ -> ‘Status Monitor’ to bring up the data distribution monitor in a pop-up window. Leave the window open, on the side.
- Select ‘Admin’ -> ‘TAXII’ -> ‘Data Push’ from the top menu bar.
- From the list of available reports, place a checkmark in the box next to the report(s) designated for distribution.
- Click the ‘Send Now’ icon in the bottom right corner.
- In the TAXII Status Monitor, verify the data selected was transmitted off-site for subscriber viewing.
Result: Verification of TAXII services and off-site distribution.
X-Ref: FR-A42; FR-A44


3.3.15 Searchable Help Index
Action:
- Standard user CAC logon.
- Click the ‘Help’ -> ‘About’ option from the top menu bar.
- Verify Camino Leaf version is listed as 1.0.
- Click ‘Help’ again to view index and searching options.
- In the search box, type common use terms or statements associated with operation, i.e. configuration, how to generate a report, etc.
- Verify the information retrieved matches the search terms provided.
Result: Verification of searchable Help file.
X-Ref: FR-A46
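
The functional procedures above are written from the end-user perspective. A hedged sketch of how one of them, 3.3.5 (Weather Alert Status), might be automated is shown below; `CaminoLeafClient` is a hypothetical test double standing in for a real application session, not part of the delivered software.

```ruby
# Hedged sketch of a functional test in the style of 3.3.5 (Weather Alert Status).
# CaminoLeafClient is a hypothetical stand-in for the real application session.
require 'rspec'

class CaminoLeafClient
  def initialize
    @zone = nil
  end

  def set_weather_zone(zip)
    raise ArgumentError, 'zip code must be 5 digits' unless zip =~ /\A\d{5}\z/
    @zone = zip
  end

  def current_weather
    { zone: @zone, status: 'available' }  # stand-in for a live weather feed
  end
end

RSpec.describe 'Weather Alert Status (3.3.5)' do
  let(:session) { CaminoLeafClient.new }

  it 'reports current conditions for the local zip code' do
    session.set_weather_zone('92110')
    expect(session.current_weather).to include(zone: '92110', status: 'available')
  end

  it 'updates alerts when a remote location is entered' do
    session.set_weather_zone('23454')   # Dam Neck, VA, per the test procedure
    expect(session.current_weather[:zone]).to eq('23454')
  end
end
```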

3.4. Regression Testing


In this first iteration of Camino Leaf, regression testing takes the form of source code review, code
check-in and check-out procedures, and validation in the development environment prior to
production. Unit and functional tests are conducted using a phased approach that allows for
segmented analysis of both code and function, supporting the identification of bugs and/or
vulnerabilities inadvertently introduced in the various development stages. Additionally, the
gradual test progression through the various Camino Leaf functional areas allows developers
to pinpoint dependency issues not identified in unit testing.

As shown in Figure 3-1, system testing is separated into nine essential categories, capturing the
principal elements of Camino Leaf.

Fig. 3-1: Camino Leaf Unit Test Structure


Private APIs, trusted objects, and individual function tests are utilized as each new element is
evaluated for accuracy and performance. Test case results are identified as either pass or fail,
with the results of each logged and compiled for continued analysis. Code check-in and check-
out functions are then utilized, with implemented corrections and changes documented. Once
complete, re-evaluation of the previous object occurs to protect against ripple-effect errors
stemming from alterations to the source code. With the validation of each individual component,
the successful addition is incorporated into a regression test suite, supporting evaluation of the
overall product.
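
A minimal sketch of that last step, assuming RSpec metadata tags are used to mark validated components for the regression suite, is shown below; the tag name and the `AlertArchive` module are illustrative only, not taken from the Camino Leaf codebase.

```ruby
# Minimal sketch of folding a validated component into a regression suite
# using an RSpec metadata tag. AlertArchive is a hypothetical stand-in.
require 'rspec'

module AlertArchive
  def self.retention_days
    90
  end
end

# Once a component passes its unit tests, its examples are tagged so the whole
# regression suite can be re-run after any source-code change, e.g.:
#   rspec --tag regression
RSpec.describe AlertArchive, :regression do
  it 'still retains 90 days of alerts after code changes elsewhere' do
    expect(AlertArchive.retention_days).to eq(90)
  end
end
```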

4. Other Testing

4.1. Verification Testing

Verification testing will be conducted to ensure we are building the software right, that is, that it
conforms to the customer's specified requirements. The diagram below illustrates the verification
testing process that will be used by our software team.

4.2. Validation Testing


Validation testing will be conducted to ensure we built the right product for our customer.
Validation testing will be conducted with the customer in an environment similar to their planned
software deployment, to mimic the validation and acceptance testing conditions as closely as
possible to their real-world environment. This is to ensure a successful validation of the software
and to identify and remediate any deficiencies before the software is put into production.
Validation and acceptance testing will be conducted with the Camino Leaf Solutions software
team and the customer-appointed team.


5. Test Strategy

5.1. Test level responsibility


There will be three levels of testing and two levels of analysis applied to the Camino Leaf project.
Unit testing will be used to ensure that each module of code functions as necessary to complete
its assigned task. Functional testing will be used to ensure the aggregate of modules produces
expected outcomes and results. Regression testing will be used to ensure that each additional
module added to the Camino Leaf project leaves it functional after implementation, with no
negative side effects incurred. Post-development test cycles will include Verification and
Validation phases to verify that the software performs as intended and produces the correct
information for end-user operation.

Test Level            External Party    Project Team
Unit Testing          -                 P
Functional Testing    -                 P
Regression Testing    S                 P

5.2. Test Type & Approach

Test Type: Unit Testing Requirements
Objectives: The objective of Unit Testing is to verify the following:
● Function returns expected results.
● Expose potential bugs in code that affect the tested unit of code.
● Determine whether errors are due to the unit of code or flawed testing logic.
● Ensure unit of code is free of dependencies from other functions.
Unit testing will occur prior to merge requests. Code must pass unit tests prior to submitting a merge request.

Test Type: Functional Testing Requirements
Objectives: The objectives are to verify that the application:
● Meets the defined requirements.
● Functions perform as intended.
● Ensures error handling is properly implemented.
● Ensures enforcement of properly formed requests and return values.
● Testing will be conducted from the end-user perspective.
Functional testing will occur in an iterative and controlled manner, ensuring the solution matches the defined requirements.

Test Type: Regression Testing Requirements
Objectives: The objective of Regression Testing is to verify that code changes and additions do not affect application functionality:
● Code changes or additions will be applied to the development server and vetted prior to inclusion in production code.
● Code will require Senior Developer approval in the code review phase prior to all merge requests for both the development and production codebases.
● All code subject to Regression Testing will be code-locked and insusceptible to any code changes.
Regression Testing will occur prior to all timeline marker submissions and phase completions.
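
One possible way to enforce the rule that code must pass unit tests before a merge request is submitted is a local Git pre-push hook. The sketch below assumes a Bundler/RSpec project; it would be saved as .git/hooks/pre-push and marked executable, and is illustrative rather than a mandated part of the Camino Leaf workflow.

```ruby
#!/usr/bin/env ruby
# Hedged sketch of a local pre-push hook enforcing "code must pass unit tests
# prior to submitting a merge request". Assumes the project uses Bundler + RSpec.
puts 'Running unit tests before push...'

unless system('bundle exec rspec --fail-fast')
  warn 'Unit tests failed; push aborted. Fix the failures before opening a merge request.'
  exit 1
end

puts 'Unit tests passed; proceeding with push.'
```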

5.3. Build strategy

The build strategy for the Camino Leaf project will be based upon an iterative approach driven by
predetermined milestones and phases. Development will move through replicated development
and production environments. Code releases will follow a Continuous Integration/Continuous
Delivery (CI/CD) approach, with code testing and review occurring before code is committed to
the development environment. Successful end-of-phase Functional and Regression testing will
promote code to the production environment.

5.4. Facility, data, and resource provision plan


5.4.1. Test Environment and Requirements
The development and production environments will meet the following minimum requirements per
the Camino Leaf Visions Document:

Camino Leaf Interface


▪ Operating System
● Windows Desktop v.7 or greater
● Mac OS X v.10.6 or greater
● Red Hat v.6 or greater
● Solaris v.10 or greater
▪ Browser
● Microsoft Internet Explorer v.10 or greater
● Google Chrome v.59.0.3071 or greater
● Mozilla Firefox v.53.0.3 or greater
▪ CPU Information
● CPU Speed: a minimum of 2.4 GHZ
● RAM: a minimum of 4GB, recommended 8GB
● Free Disk Space: 20 GB
▪ Cascading Style Sheets v.3 (CSS3)
▪ Hyper Text Markup Language v.5 (HTML5)
▪ JavaScript 5 or greater
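
As a simple illustration, the sketch below compares a reported host profile against the CPU, RAM, and disk minimums listed above. The `host` values are examples only, and how the profile is gathered is left to whatever inventory tooling the team uses.

```ruby
# Illustrative check of a host profile against the documented minimums.
# The host hash values are example figures, not real inventory data.
MINIMUMS = {
  cpu_ghz:      2.4,
  ram_gb:       4,
  free_disk_gb: 20
}.freeze

def meets_minimums?(host)
  MINIMUMS.all? { |key, required| host.fetch(key, 0) >= required }
end

host = { cpu_ghz: 3.1, ram_gb: 8, free_disk_gb: 120 }   # example host profile
puts meets_minimums?(host) ? 'Host meets Camino Leaf minimums' : 'Host below minimums'
```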

5.4.2. Resources & Skills


● A resource with PostgreSQL skills.
● A resource with REST API knowledge to validate URIs.
● A resource with Front End Web Development skills.
● A resource with Ruby RSpec unit testing abilities.


5.5. Testing Tools


The following tools will be used for testing:

Process Tool

Test case creation Ruby RSpec + Atom IDE

Test case tracking Jira’s TestFlo

Test case execution GitLab Runner + Ruby RSpec + Docker Container

Test case management Jira’s TestFlo

Defect management ServiceNow Ticketing System

5.6 Testing Handover Procedure

Code releases will follow a Continuous Integration/Continuous Delivery (CI/CD) approach, with
code testing and review occurring before code is committed to the development environment. Upon a
successful testing phase, the code will be promoted through the workflow to any subsequent testing
that is still required. Successful end-of-phase Functional and Regression testing will promote code
to the production environment.

5.7 Testing Metrics

Jira’s TestFlo tracking system will be utilized to track tests that occur against the codebase. If any
module of code fails unit testing more than three times, the submission will be rejected and removed
from project submissions. A ticket will be opened for developers to modify and re-submit functional
code and restart the workflow process. Should any milestone code submission have a success rate
of less than 90%, developer team leads will be notified and milestones will need to be re-evaluated.
Developers responsible for submitting unsuccessful code beyond the aforementioned thresholds will
need to submit design documents to developer team leads for review and revision.
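
These thresholds reduce to a small amount of arithmetic. The sketch below shows one way they might be encoded; the method names and example figures are chosen purely for illustration.

```ruby
# Sketch of the section 5.7 thresholds: a module is rejected after more than
# three unit-test failures, and a milestone success rate below 90% notifies
# the developer team leads. Names and figures are illustrative.
MAX_FAILURES = 3
MIN_SUCCESS_RATE = 0.90

def module_action(failure_count)
  failure_count > MAX_FAILURES ? :reject_and_open_ticket : :continue_workflow
end

def milestone_action(passed, total)
  rate = passed.to_f / total
  rate < MIN_SUCCESS_RATE ? :notify_team_leads : :milestone_ok
end

puts module_action(4)            # => reject_and_open_ticket
puts milestone_action(43, 50)    # => notify_team_leads (86% is below 90%)
```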


6. Test Environment Plan

6.1. Test Environment Diagram


6.2. Test Environment Details


6.2.1. Testers


6.3. Establishing Environment


Task: Hardware Acquisition
Requirements: All parts and peripherals acquired by end date.
Responsibility: Nicholas Balich
Start Date: 10/15/18   End Date: 10/22/18

Task: System Configuration
Requirements: System built and operational by end date.
Responsibility: Douglas Cleary
Start Date: 10/23/18   End Date: 10/30/18

Task: Environment Configuration
Requirements: Environment (Operating System and Applications) installed and configured by end date.
Responsibility: Micah Geertson
Start Date: 10/31/18   End Date: 11/06/18

Task: System Networking
Requirements: Network connectivity operational and stable by end date.
Responsibility: Alejandra Mejia
Start Date: 11/07/18   End Date: 11/14/18

6.4. Environment Control


Software will be checked in and checked out via Jira and Release Assistant daily.
Modifications or additions will be submitted to GitLab as pull requests and will contain all
necessary unit tests associated with the changes implemented. All code and environment
variable modifications that do not contain sufficient unit tests will be rejected. Developer
Team Leads will assign environment access based on each developer’s role during the
current software phase. Developers will ensure all testing is submitted to Jira TestFlo and
tracked throughout the remainder of each phase.
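
A hedged sketch of the "sufficient unit tests" check is shown below: it assumes the conventional Ruby lib/ and spec/ layout and simply verifies that a change set touching source files also touches at least one spec file. The paths and example change set are illustrative.

```ruby
# Illustrative check that a change set includes unit tests alongside source changes.
# Assumes a conventional lib/ and spec/ layout; paths below are examples only.
def sufficient_tests?(changed_files)
  source = changed_files.select { |f| f.start_with?('lib/') && f.end_with?('.rb') }
  specs  = changed_files.select { |f| f.start_with?('spec/') && f.end_with?('_spec.rb') }
  source.empty? || !specs.empty?
end

changed = ['lib/camino_leaf/alerts.rb', 'spec/alerts_spec.rb']   # example change set
puts sufficient_tests?(changed) ? 'Change set accepted' : 'Rejected: add unit tests'
```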

6.5. Environment Roles and Responsibilities


Role: Release Manager
Staff Member: Nicholas Balich
Responsibilities: Will maintain the code check-in and check-out process daily and ensure the latest release of the application is documented and controlled.

Role: Test Manager
Staff Member: Douglas Cleary
Responsibilities: Responsible for coordinating testing plans and test adherence with Developer Team Leads.

Role: Project Manager
Staff Member: Alejandra Mejia
Responsibilities: Ensures environments are properly maintained and reconfigured if no longer compliant. Ensures environments are compliant during milestone and phase achievements.


7. Assumptions and Dependencies

7.1. Assumptions
We will have a virtual environment that can support testing at our SSCPAC facility. Using that
platform as a way to baseline our configuration will assist with the integration process. Test
Managers and Developer Team Leads will be onsite to support and train users and to resolve any
issues as they arise.

7.2. Dependencies
The Test Manager will have full-time access to the systems testing environment during the phases of
testing. The test environment will be configured and monitored by the Test Manager, and all testing
will begin by automating deployments and release patches to the team members before the live
version. As continuous integration to continuous deployment proceeds, the team will increase code
coverage as it builds onto the application. Doing so will allow for a seamless deployment, because
all new changes are tested before being automatically released live.

Developer Team Leads will assign environment access based on each developer’s role during the
current software phase. When access is permitted, the developer can access any computer within the
network. The Project Manager will decide which attributes of the system to use for the rule
constraints; this will help prevent conflicts in rule setting and privilege leaking. Staff members will
have access to some of the same data, but they have different roles to perform in relation to the
data and the application’s function. Individuals in each group will have different job duties that will
be identified using several types of attributes. To avoid confusion in roles, specific role codes and
access profile rules are maintained for each user. When storing the data, the data owner
identifies attributes that describe the purpose of the data and the information in the document. At
the time of production, the user position codes and user profiles are matched with document
attributes to determine the user’s access rights.


8. Entry and Exit Criteria


9. Continuous Software Mitigation

9.1. Product Security Incident Response Team


Camino Leaf Solutions has established a PSIRT that focuses on the identification,
assessment, and disposition of the risks associated with security vulnerabilities within
our software, including offerings, solutions, components, and/or services.

9.2. PSIRT Organizational Model


Camino Leaf Solutions PSIRT uses a Hybrid Model because of the different levels of security
classifications of information we may encounter from working with our government customers
across multiple agencies. This allows us to distribute issues across personnel throughout our
organization to the level of resolution needed for each PSIRT Service Area. PSIRT Service Areas
are broken out into categories of servicing customers to advance the relationships with our
customers (e.g. PSIRT Service Area 1 - Military Service Branches, Area 2 - Intelligence Agencies,
Area 3 - Critical Infrastructure).


9.3. PSIRT Process Model

The diagram below focuses on the identification, assessment, and disposition of the vulnerabilities
found. Vulnerability intake identifies the vulnerability affecting the software. Issue triage
prioritizes each bug based on severity, frequency, and risk. Issue analysis identifies solutions for
the bugs. Issue remediation is where a plan is devised for how to proceed with fixing the bugs.
Patch release & comms delivers the set of changes that update, fix, or improve the program and
communicates them to stakeholders.
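
To make the triage step concrete, the sketch below computes a simple priority score from the severity, frequency, and risk factors named above; the weights and bands are illustrative assumptions, not Camino Leaf policy.

```ruby
# Illustrative issue-triage scoring from severity, frequency, and risk.
# Weights and bands are assumptions chosen for demonstration only.
def triage_score(severity:, frequency:, risk:)
  # Each factor on a 1-5 scale; a higher score means earlier remediation.
  (severity * 0.5) + (frequency * 0.2) + (risk * 0.3)
end

def triage_band(score)
  if score >= 4.0
    'Critical - immediate remediation'
  elsif score >= 3.0
    'High - next patch release'
  else
    'Routine - scheduled maintenance'
  end
end

score = triage_score(severity: 5, frequency: 3, risk: 4)   # => 4.3
puts triage_band(score)                                     # => Critical - immediate remediation
```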


10. Definitions
The following acronyms and terms have been used throughout this document.

API (Application Programming Interface): A software intermediary that allows two applications to talk to each other.

GUI (Graphical User Interface): A user interface that includes graphical elements, such as windows, icons, and buttons.

RT (Regression Testing): The process of testing changes to computer programs to make sure that the older programming still works with the new changes.

UT (Unit Testing): A software development process in which the smallest testable parts of an application are individually and independently scrutinized for proper operation.

FT (Functional Testing): A quality assurance process and a type of black-box testing that bases its test cases on the specifications of the software component under test.

SSCPAC: SPAWAR Systems Center Pacific.

PSIRT: Product Security Incident Response Team.

CI/CD: Continuous Integration and Continuous Delivery.

JIRA: Used for bug tracking, issue tracking, and project management.

GitLab: A web-based Git-repository manager providing wiki, issue tracking, and CI/CD pipeline features, using an open-source license, developed by GitLab Inc.


11. References
The following documents have been used to assist in the creation of this document.

11.1 PSIRT Services Framework, version 1.0 (IBM Product Security Incident Response Team).
11.2 Stakeholders Documentation, version 1.0 (acquisition notes gathered from interviews on 14 SEPT 2018 at SPAWAR San Diego, Old Town Complex).
11.3 National Institute of Standards and Technology Special Publication (NIST SP) 800-40, Rev. 3 (Enterprise Patch Management).
11.4 National Institute of Standards and Technology Special Publication (NIST SP) 800-171, Rev. 1 (Protecting Controlled Unclassified Information).
11.5 US Department of Defense, Instruction, DoDI 8510.01, Ver. 1 (Risk Management Framework (RMF) for DoD Information Technology).
11.6 US Department of Defense, Instruction, DoDI 4140.01, version 1.0 (Supply Chain Material Management).
11.7 US Department of Defense, Instruction, DoDI 8551.01, Change 1 (Ports, Protocols, and Services Management).
11.8 US Department of Defense, Manual, DoD-M 4100.39, version 1.0 (Federal Logistics Information Systems (FLIS) Procedures).
11.9 US Department of Defense, Instruction, DoDI 8520.02, Change 2 (Public Key Infrastructure (PKI) and Public Key (PK) Enabling).
11.10 US Department of Defense, Manual, DoD-M 8570.01, version 1.0 (Information Assurance Workforce Improvement Program).
11.11 US Department of Defense, Office of Economic Adjustment, version 1.0 (DFARS).


12. Points of Contact


The following people can be contacted in reference to this document.

Primary Contact
Name: Alejandra Mejia
Title/Organisation: Project Manager/Camino Leaf Solutions
Phone: (916) 123-4567
Email: amejia@cls.org

Secondary Contact
Name: Douglas Cleary
Title/Organisation: Test Manager/Camino Leaf Solutions
Phone: (916) 712-3456
Email: dcleary@cls.org
