
Spell out the Project Name

(Project Acronym)

Testing Specifications and Procedures

Month DD, YYYY

UNITED STATES PATENT AND TRADEMARK OFFICE


INFORMATION TECHNOLOGY TESTING BRANCH

TABLE OF CONTENTS

Qualification Statement

1 SCOPE
1.1 System Overview
1.2 Overview
1.3 Performance Testing
1.4 Security Test and Evaluation
1.5 System Compatibility Testing

2 REFERENCED DOCUMENTS
2.1 Regulatory References
2.2 Department of Commerce Document References
2.3 Project Specific Document References
2.4 Commercial Document References

3 TEST SPECIFICATIONS AND PROCEDURES
3.1 Features to be Tested
3.2 Features Not to be Tested
3.3 Approach Refinements
3.4 Feature Pass/Fail Criteria
3.5 Input Specifications
3.6 Output Specifications
3.7 Environmental Needs
3.7.1 Hardware
3.7.2 Software
3.7.3 Other Needs
3.8 Test Specifications and Procedures

APPENDIX A ACRONYMS AND ABBREVIATIONS
APPENDIX B [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX

LIST OF TABLES
Table 3-1 Severity Rankings for Discrepancies

GENERAL comments on the Header and Footer: The footer date on the right should be the same as the cover date, which is also the deliverable date.

A slash and a version number follow the date. Assign the version number based on the iteration of the document (e.g., if this is the first time the document is being written, it is version 1; if you are revising a version 1 document that has been delivered to USPTO, it is version 2, and so on). The version number is NOT the AIS version number, which appears in the left header.


QUALIFICATION STATEMENT

The paragraph below is required.

All product names and services identified throughout this document are trademarks or registered
trademarks of their respective companies.


1 SCOPE

1.1 System Overview

The purpose of the United States Patent and Trademark Office (USPTO) [project name and version #] project is to enhance the existing system and further address customer needs. Add anything that provides further insight into the project.

1.2 Overview

The Testing [Specifications or Scenarios] and Procedures contains the scope, referenced documents, test [specifications or scenarios], and test procedures. It provides a detailed description of each test [specification or scenario] and the [requirement(s) or Use Case(s)] it tests. Each requirement from the [RTMx Report (RTMx) or Use Cases] includes a unique identification (ID) and specified functionality. The test procedures describe the step-by-step actions, expected results, and any special conditions necessary for testing.

During System Integration Testing, the System Development Lead (SDL) edits the Testing
[Specifications or Scenarios] and Procedures, which the testing staff then uses as a basis for
conducting Functional Qualification Testing (FQT) and determining the system’s ability to meet
requirements. The testing staff will update the Testing [Specifications or Scenarios] and
Procedures as needed during the FQT process.

1.3 Performance Testing

This document does not address Performance Testing for [project acronym and version].

1.4 Security Test and Evaluation

This document does not address Certification and Accreditation (C&A) Security Test and
Evaluation (ST&E) testing for [project acronym and version].

1.5 System Compatibility Testing

This document does not address System Compatibility Testing (SCT) for [project acronym and
version].


2 REFERENCED DOCUMENTS

The following documents are either referenced in this document or were used in its preparation:

2.1 Regulatory References

Section 508 of the Rehabilitation Act (29 U.S.C. 794d), as amended by the Military Construction Appropriations Act for Fiscal Year 2001 (P.L. 106-246), July 2000.

2.2 Department of Commerce Document References

Technical Reference Model, Version 7.0, February 2003.

Systems Development Life Cycle, Version 1.00, June 5, 2007.

Quality Assurance for Automated Information Systems Technical Standard and Guideline, IT-212.2-04, January 30, 1997.

Testing Technical Standards and Guidelines, USPTO IT-212.3-01, October 2001.

Accommodation Technology Compliance (ATC) Functional Qualification Testing Specifications and Procedures, June 1, 2001.

2.3 Project Specific Document References

Use the exact title of the AIS document rather than writing the skeleton/shortcut titles shown below.

Keep month, day, and year together by using non-breaking spaces [CTRL + SHIFT + Space] and non-breaking hyphens [CTRL + SHIFT + hyphen].

Do not write a shortcut name such as "QA Plan for ABC System"; rather, write "Quality Assurance Plan for Act Baseline Cat 2.5 (ABC 2.5), January 23, 2006."

Requirements Specification for [name of AIS and version], Month XX, YYYY.

Requirements Traceability Matrix for [name of AIS and version], Month XX, YYYY.

Concept of Operations for [name of AIS and version], Month XX, YYYY.

Detailed Design Document for [name of AIS and version], Month XX, YYYY.

QA Plan for [name of AIS and version], Month XX, YYYY.

Test Plan for [name of AIS], Month XX, YYYY.


Configuration Management Plan for [name of AIS and version], Month XX, YYYY.

Office of Procurement Check List for Processing EIT Request (Electronic Information
Technology), http://ptoweb/ptointranet/cio/section508/508.htm.

2.4 Commercial Document References


3 TEST SPECIFICATIONS AND PROCEDURES

3.1 Features to be Tested

Specific functionality and features of [project acronym and version] to be tested are included in
Appendix B, [Requirements or Use Cases] To Test Allocation Matrix.

The principal features to be tested are categorized into the following areas:
a) Accessibility (Section 508).
b) External Interfaces.
c) Performance.
d) Important feature.
e) Important feature.
f) Important feature.

3.2 Features Not to be Tested

[Copy the FIRST paragraph from Section 5.5 of the Test Plan.] [Testing will also not include product features that have not been modified from the previous version.]

3.3 Approach Refinements

There are no changes to the testing approach that was defined in the [name of AIS] Test Plan.

[OR]

[Describe changes that have been made to the test approach. In particular, was Section 508 testing planned or added? Are automated scripts now expected to be used to test system functionality? Are Performance Testing scripts now planned? Has the expected location of testing changed?]

3.4 Feature Pass/Fail Criteria

Any discrepancies identified are classified as one of three types defined in Table 3-1:

Table 3-1 Severity Rankings for Discrepancies

Severity | Description
Critical | Discrepancies that halt further program execution. Example: run-time errors that cause the system to lock up in an unexplained or unexpected manner.
Major | Discrepancies that cause the application not to perform as functionally required. Example: inability to print a report.
Minor | Discrepancies that are not considered critical or major. Examples: misspellings on a screen, ambiguous Help messages.
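Where discrepancy tracking is automated, the rankings in Table 3-1 can be encoded directly in the test-reporting tooling. The following Python sketch is purely illustrative; the Severity enum and Discrepancy record are hypothetical names, not part of any USPTO system:

from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    # Severity rankings from Table 3-1.
    CRITICAL = "Critical"   # halts further program execution
    MAJOR = "Major"         # application does not perform as functionally required
    MINOR = "Minor"         # neither critical nor major

@dataclass
class Discrepancy:
    test_case: str
    description: str
    severity: Severity

# Example classifications drawn from the table's examples.
findings = [
    Discrepancy("Test Case 1", "Run-time error locks up the system unexpectedly",
                Severity.CRITICAL),
    Discrepancy("Test Case 1", "Misspelling on the Workload screen",
                Severity.MINOR),
]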


3.5 Input Specifications

See the Operator Action column in Section 3.8 for the detailed input specifications.

3.6 Output Specifications

See the Expected Results column in Section 3.8 for the expected outputs of each operator action.

3.7 Environmental Needs

Copy the hardware and software from the Test Plan unless you have more current
information.

Include Testing Support software (Bobby, JAWS, etc.) in the Software subsection.

The SDL will supply the test environment and is responsible for building the test database.

3.7.1 Hardware

3.7.2 Software

3.7.3 Other Needs


a) System IDs and passwords.
b) PTOnet.
c) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
d) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
e) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
f) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
g) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
h) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
i) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.
j) Interface with [System name] using a [full or partial copy/production version] and [similar/production] hardware.


3.8 Test Specifications and Procedures

Examples of Testing [Specifications or Scenarios] and Procedures are provided below.

Remember to include Testing Support software (Bobby, JAWS, etc.) in the Prerequisites or Setup.

Test Name: Test Case 1: Create Workload Navigator Basic Pie Chart Test Case.
Description: The Pie and Text Chart features should display independently for Assigned and Unassigned cases in the Workload screen.
Requirement(s): TICRS-2.
Prerequisites: The user is logged on as an SLIE/BA.
Setup: The tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1 | Log on to the system as an SLIE user with a valid ID. | The user is able to log in to the system, and the Workload screen appears for the SLIE. | |
2 | Click the 'Assigned' tab. | The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspensions cases appear in the Workload screen. | |
3 | Click 'Pie Chart.' | The Pie Chart screen appears. | |
4 | Click 'Close.' | The Suspension cases screen is displayed. | |
5 | Click the 'Unassigned' tab. | The 'Unassigned' work appears. | |


6 | | | |
7 | | | |
8 | | | |
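Where automated scripts are planned (see Section 3.3), the tabular procedure above maps naturally onto a scripted structure. The sketch below is a hypothetical Python illustration of that mapping, not a prescribed USPTO format; all class and field names are assumptions:

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TestStep:
    number: int
    operator_action: str
    expected_results: str
    observed_results: str = ""      # recorded during execution
    passed: Optional[bool] = None   # None until the step is run

@dataclass
class TestCase:
    name: str
    requirements: list
    steps: list = field(default_factory=list)

# Encode the first two steps of Test Case 1 as data.
pie_chart_case = TestCase(
    name="Test Case 1: Create Workload Navigator Basic Pie Chart",
    requirements=["TICRS-2"],
    steps=[
        TestStep(1, "Log on to the system as an SLIE user with a valid ID.",
                 "The Workload screen appears for the SLIE."),
        TestStep(2, "Click the 'Assigned' tab.",
                 "The Assigned work for the SLIE's Law Office appears."),
    ],
)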


Test Name: Test Case 2: Prosecution History Webpage.
Description: A Prosecution History Webpage link, provided by USPTO, will be available in the Case Details screen.
Requirement(s): TICRS-4.
Prerequisites: The user is logged on as an LIE/SLIE/BA.
Setup: The tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1 | Log on to the system as an SLIE user with a valid ID. | The user is able to log in to the system, and the Workload screen appears for the SLIE. | |
2 | Click the 'Assigned' tab. | The Assigned work for the Law Office to which the SLIE is assigned appears on screen by default. The Suspensions cases appear in the Workload screen. | |
3 | | | |
4 | | | |
5 | | | |
6 | | | |


Test Name: Test Case 3: Display List of All TICRS Documents.
Description: A Case Details screen tab will provide users access to view or display a list of all TICRS documents.
Requirement(s): TICRS-6.
Prerequisites: The user is logged on as an LIE/SLIE/BA.
Setup: The tester must point to the Mock P drive in the test environment. Verify that the most recent file is currently in Trademarks\FAST 2.1\BIN\Fast Application.exe of the current CM Build. Create a desktop shortcut to the FAST executable file from Mock P and launch the FAST executable.
Map to the TICRS drive(s):

Steps:
Step | Operator Action | Expected Results | Observed Results | Pass/Fail
1 | Log on to the system as an SLIE user with a valid user ID. | The user is able to log in to the system, and the Workload screen appears for the SLIE. | |
2 | Click the 'Work List' tab. | The 'Work List' screen appears, and the Suspensions cases category page is displayed by default. | |
3 | | | |
4 | | | |
5 | | | |


APPENDIX A ACRONYMS AND ABBREVIATIONS

ATC Accommodation Technology Compliance
C&A Certification and Accreditation
CD Change Document
CM Configuration Management
COTS Commercial Off-The-Shelf
EIT Electronic Information Technology
FQT Functional Qualification Testing
GUI Graphical User Interface
HP Hewlett-Packard
ID Identification
OCIO Office of the Chief Information Officer
PTOnet USPTO's campus-wide network
QA Quality Assurance
RTMx RTMx Report
SCT System Compatibility Testing
SDL System Development Lead
ST&E Security Test and Evaluation
USPTO United States Patent and Trademark Office


APPENDIX B [REQUIREMENTS or USE CASES] TO TEST ALLOCATION MATRIX

Requirement Number | Requirement Description | Test Case Where Verified
RE-V1.0001 | The system shall provide for the initial receipt and initial scanning of reexamination applications in OIPE when a final filing date is established, and will provide in OIPE the ability to create, maintain, and print Contents entries for the electronic file corresponding to the separate papers in the paper reexamination application. | 1
RE-V1.0002 | The system shall provide the ability to preclude reexamination applications from automated initial classification. | 1
RE-V1.0003 | The system shall provide the ability to retrieve an "as filed" version of the reexamination electronic file wrapper. | 2
RE-V1.0004 | The system shall provide the ability to view and print an electronic Contents listing of all papers for each electronic file folder in the REPS system. | 2
RE-V1.0036 | The system shall provide an interface with the Revenue Accounting Management System (RAM), so that at any time during a REPS session, a user may enter or modify fee information. | 3
RE-V1.0037 | The system shall distinguish between a reexam requester and other types of filers when the bibliographic data is captured. Additionally, the system shall determine the patent number on which the reexamination request is based from the bibliographic data entered on the form. | Deferred
RE-V1.0067 | The PSR printer must print a separator sheet containing this information as the first page of the print job. | 3
RE-V1.0095 | PTCS will have the capability to print an electronic CD-R contents list to be stored on the CD-R as a file. | 3
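One practical use of this matrix is a coverage check: every requirement should be allocated to at least one test case or explicitly marked Deferred. Below is a minimal Python sketch of such a check, using data mirroring the rows above; the dictionary layout is an assumption for illustration, not a USPTO format:

# Allocation matrix: requirement number -> test case where verified.
allocation = {
    "RE-V1.0001": "1",
    "RE-V1.0002": "1",
    "RE-V1.0003": "2",
    "RE-V1.0004": "2",
    "RE-V1.0036": "3",
    "RE-V1.0037": "Deferred",
    "RE-V1.0067": "3",
    "RE-V1.0095": "3",
}

# Requirements with no test case allocation are coverage gaps;
# deferred requirements are reported separately for follow-up.
unallocated = [req for req, tc in allocation.items() if not tc]
deferred = [req for req, tc in allocation.items() if tc == "Deferred"]

print("Unallocated requirements:", unallocated or "none")
print("Deferred requirements:", deferred or "none")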

