
ANSI Std C12.23-200x
COMPLIANCE TESTING
FOR STANDARD PROTOCOL C12.18

Copyright © 200X by the National Electrical Manufacturers Association


1300 North 17th Street, Suite 1847
Rosslyn, VA 22209, USA
All rights reserved

This is an unapproved draft of a proposed ANSI Standard, subject to change. Permission is
hereby granted for ANSI Standards Committee participants to reproduce this document for
purposes of ANSI standardization activities. Use of information contained in this unapproved
draft is at your own risk.

Modified October 20, 2005

Standard Version 0.0

Document Version 1.0

Table of Contents
1. INTRODUCTION..................................................................................................................................4

2. SCOPE.....................................................................................................................................................4

3. DOCUMENT SYNTAX.........................................................................................................................4

4. REFERENCES.......................................................................................................................................4

5. DEFINITIONS........................................................................................................................................4
5.1. PSEM................................................................................................................................................4
5.2. TABLE...............................................................................................................................................4
5.2.1. ANSI.........................................................................................................................................5
5.2.2. Application Layer.....................................................................................................................5
5.2.3. AMR..........................................................................................................................................5
5.2.4. Baud Rate.................................................................................................................................5
5.2.5. Catastrophic Failure a.k.a. Fatal Failure...............................................................................5
5.2.6. CRC..........................................................................................................................................5
5.2.7. Data Link Layer.......................................................................................................................5
5.2.8. Device Under Test....................................................................................................................5
5.2.9. DUT..........................................................................................................................................5
5.2.10. FLC..........................................................................................................................................5
5.2.11. Function Under Test................................................................................................................6
5.2.12. FUT..........................................................................................................................................6
5.2.13. Half Duplex..............................................................................................................................6
5.2.14. Meter Under Test.....................................................................................................................6
5.2.15. MUT.........................................................................................................................................6
5.2.16. Octet.........................................................................................................................................6
5.2.17. Packet.......................................................................................................................................6
5.2.18. Packet Envelope.......................................................................................................................6
5.2.19. PSEM........................................................................................................................................6
5.2.20. Retrieve....................................................................................................................................6
5.2.21. Test...........................................................................................................................................7
5.2.22. Test Application.......................................................................................................................7
5.2.23. Write.........................................................................................................................................7
6. PRODUCT SUBMISSION AND VERIFICATION CRITERIA.......................................................7
6.1. RATIONALE.......................................................................................................................7
6.2. SUBMISSION RULES...........................................................................................................................7
6.3. VERIFICATION GOALS.......................................................................................................................7
6.4. VERIFICATION NON-GOALS................................................................................8
6.5. INTERPRETATION OF THE STANDARDS..............................................................................................8
6.6. OTHER CRITERIA...............................................................................................................................8
6.6.1. Example 1.................................................................................................................................8
6.6.2. Example 2.................................................................................................................................9
6.6.3. Example 3.................................................................................................................................9
6.6.4. Example 4...............................................................................................................................10
7. GENERAL CONFORMANCE STATEMENT.................................................................................10
7.1. THE TEST APPLICATION..................................................................................................................10
7.2. THE TEST APPLICATION AS A REFERENCE ANSI C12.19/IEEE-1377 IMPLEMENTATION.............11
7.3. TEST RESULTS AND RANKING CATEGORIES...................................................................................11

7.3.1. Not Applicable........................................................................................................................11
7.3.2. Not Tested...............................................................................................................................11
7.3.3. Conforming............................................................................................................................11
7.3.4. Conforming with Discrepancy...............................................................................................11
7.3.5. Non-conforming.....................................................................................................................11
7.3.6. Non-conforming, Rejected......................................................................................................12
7.3.7. Test Remarks..........................................................................................................................12
8. VERIFICATION OBJECTIVES FOR ANSI C12.18 DEVICES....................................................12
8.1. ANSI STANDARD C12.19-1997 / IEEE-1377 “UTILITY INDUSTRY STANDARD TABLES”.............12
8.1.1. Type of Tables........................................................................................................................12
8.1.2. Operations on Tables.............................................................................................................12
8.2. ANSI STANDARD C12.18-1996 “PROTOCOL SPECIFICATION FOR ANSI TYPE 2 OPTICAL PORT” 13
8.2.1. Basic Implementation Assumptions........................................................................................13
8.2.2. States and Services.................................................................................................................13
9. TEST PROCEDURES FOR ANSI C12.18........................................................................................14
9.1. TEST PROCEDURES FOR THE PHYSICAL LAYER...............................................................................14
9.1.1. Transmitter Characteristics (Section 4.8.3.2)........................................................................14
9.2. TEST PROCEDURES FOR THE DATA LINK LAYER............................................................................15
9.2.1. Pre-Test General Set-up Requirements..................................................................................15
9.2.2. Basic Sanity Check.................................................................................................................16
9.2.3. Immunity to Random Noise....................................................................................................16
9.2.4. ACK Returned Due to Reception of a Duplicate Packet........................................................17
9.2.5. NAK Returned Due to CRC Error..........................................................................................17
9.2.6. NAK Returned Due to Inter-Character Timeout....................................................................17
9.2.7. NAK Returned Due to Long Packet.......................................................................................18
9.2.8. NAK Returned Due to Invalid Multi-Packet Flag Setting......................................................19
9.2.9. NAK Returned Due to Invalid First-Packet Flag Setting.......................................................19
9.2.10. NAK Returned Due to an Invalid Sequence Number.............................................................19
9.2.11. Retransmission Triggered by a Response Timeout................................................................20
9.2.12. Retransmission Triggered by a NAK......................................................................................20
9.2.13. Link Termination After Third Retry Attempt (NAK)..............................................................21
9.2.14. Link Termination After Third Retry Attempt (Response Timeout).........................................21
9.2.15. Link Termination Due to Channel Traffic Timeout................................................................22
9.2.16. Sensitivity to the Initial State of the Toggle Bit......................................................................23
9.2.17. Default Packet Size Validation (Read)...................................................................................23
9.2.18. Default Packet Size Validation (Write)..................................................................................24
9.3. PSEM TEST PROCEDURES..............................................................................................................24
9.3.1. PSEM Service Transaction....................................................................................................24
9.3.2. Identification Service (Response)...........................................................................................25
9.3.3. Negotiate Service (Change Baud Rate)..................................................................................25
9.3.4. Negotiate Service (Number of Packets and Packet Size).......................................................26
9.3.5. Negotiate Service (Honors Negotiated Limits)......................................................................26
9.3.6. Negotiate Service (Establish Maximum Packet Limits).........................................................27
9.3.7. Wait Service...........................................................................................................................28
9.3.8. Terminate Service Working....................................................................................................28
9.3.9. Service Sequence State...........................................................................................................29
ANNEX A - TEST REPORTS (NORMATIVE).......................................................................................30
A.1 TESTS SUMMARY SHEET......................................................................................................................30
A.2 DETAIL RESULTS OF “TEST PROCEDURES FOR ANSI STANDARD C12.18-1996”................................31
A.2.1 Results of “Test procedures for the Data Link Layer”................................................................31
A.2.2 Results of “PSEM Test Procedures”...........................................................................................31
ANNEX B - Verification Codes (Normative)................................................................................................31

TO DO: Create a requirements matrix and qualify each test based on this matrix.
TO DO: Add examples of real message formats.
TO DO: Format the definitions section.
TO DO: Add a list of test equipment and traceability to Standards.

1. Introduction
Give some background about this standard, what it has to accomplish, the referenced standards and
how it is envisioned to be implemented in the industry.
Add text on pre-requisites and the order of tests.

2. Scope
This document is a collection of compliance test procedures that aim to validate the implementation
correctness of ANSI C12.19-1997/IEEE-1377 devices that communicate using the ANSI C12.18-1996
Standard communication protocol.

3. Document Syntax
Explain the various annotations used in this document.

4. REFERENCES
ANSI Std C12.18-1996: Protocol Specification for ANSI Type 2 Optical Port

ISO 7498/1: OSI Reference Model

ISO 3309-1993(E): Information technology - Telecommunications and information
exchange between systems - High-level data link control (HDLC)
procedures - Frame structure, Annex A, Explanatory Notes on
Implementation of the Frame Checking Sequence

ANSI Std C12.19-1997: Utility Industry End Device Data Tables.

5. Definitions
For the purposes of this document, the following definitions are made for terms and syntax used throughout
this document.

5.1. PSEM

Protocol Specification for Electric Metering, as referenced in ANSI C12.18-1996

5.2. Table

Functionally related data elements, grouped together into a single data structure for transport as defined by
ANSI standard C12.19-1997

5.2.1. ANSI
American National Standards Institute. The primary organization for fostering the
development of technology standards in the United States, ANSI works with industry
groups and is the U.S. member of the International Organization for Standardization (ISO)
and the International Electrotechnical Commission (IEC).

5.2.2. Application Layer


The application layer (OSI layer 7) is the layer at which a user application interfaces with
a communication network.

5.2.3. AMR
AMR is an acronym for automated meter reading.

5.2.4. Baud Rate


Baud was the prevalent measure for data transmission speed until replaced by a more
accurate term, bps (bits per second). One baud is one electronic state change per
second. Since a single state change can involve more than a single bit of data, the bps
unit of measurement has replaced it as a better expression of data transmission speed.
The Standards use the terms Baud, Baud Rate, bps and bits per second
interchangeably.

5.2.5. Catastrophic Failure a.k.a. Fatal Failure


A failure that results in temporary or permanent disruption of communication, corruption
of the protocol, delivery of wrong data, misrepresentation of data format, or incorrect
placement of data with respect to the Test Application reference model.

5.2.6. CRC
CRC is an acronym for “Cyclic Redundancy Check”. The CRC is used to validate the
integrity of the transmitted packets.

5.2.7. Data Link Layer


The data link layer (OSI layer 2) provides the synchronization of data, error detection
and correction procedures used during data transmission.

5.2.8. Device Under Test


A C12.19 Utility Industry Standard Tables driven device, using C12.18, C12.21 or
optionally C12.22, in any combination. This device and its accessories are being tested
for successful communication and data exchange with the Test Application. The tests are
carried out in sequence, one Function Under Test at a time.

5.2.9. DUT
DUT is an acronym for “Device Under Test”.

5.2.10. FLC
ANSI C12.19/IEEE-1377 Function Limiting Control tables (x0/x1, where x is the decade
number 0-240).

5.2.11. Function Under Test
A software, firmware, hardware or protocol component test that is applied to a Device
Under Test in order to validate the correct performance of a component. The test will
establish whether the component meets or does not meet interface, state, timing, data
values, data representation, data placement, protocol or implementation requirements in
reference to the Test Application. A “Conforming”, “Conforming with Discrepancy”, “Non-
Conforming” or “Non-conforming, Rejected” grade shall be assigned following the test of
each Function Under Test.

5.2.12. FUT
FUT is an acronym for “Function Under Test”.

5.2.13. Half Duplex


Half-duplex data transmission means that data can be transmitted in both directions on a
signal carrier, but not at the same time. One station can send data on the line and then,
shortly afterwards, receive data on the same line from the other station.

5.2.14. Meter Under Test


A Device Under Test that is a sensor measuring water, gas or electricity.

5.2.15. MUT
MUT is an acronym for “Meter Under Test”.

5.2.16. Octet
An octet (from the Latin “octo” or “eight”) is a sequence of eight bits. An octet is an eight-
bit byte.

5.2.17. Packet
A packet is the unit of data that is routed between an origin and a destination. In an ANSI
C12.18 DUT, the individual packets may be fragments of a given service request. When
they have all arrived, they are reassembled into the original message at the receiving
end.
An ANSI C12.18 packet begins with the start-of-packet octet, <stp> = 0xEE, and ends
with a two-octet CRC based on the CCITT Standard polynomial (x^16 + x^12 + x^5 + 1).
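
As an informal illustration only (not part of the Standard's normative text), the following Python
sketch computes a 16-bit check value using the ISO 3309 (HDLC) frame checking sequence convention:
register preset to all ones, the reflected form 0x8408 of the CCITT polynomial applied octet by
octet, and the result complemented. The preset, complement and the set of octets covered are
assumptions taken from ISO 3309 Annex A and should be confirmed against the CRC definition in
ANSI C12.18 before use.

    # Sketch of the ISO 3309 / CCITT frame checking sequence; the reflected
    # polynomial 0x8408 corresponds to x^16 + x^12 + x^5 + 1.
    def crc16_ccitt(octets: bytes) -> int:
        crc = 0xFFFF                          # preset to all ones (ISO 3309 convention)
        for octet in octets:
            crc ^= octet
            for _ in range(8):
                if crc & 0x0001:
                    crc = (crc >> 1) ^ 0x8408
                else:
                    crc >>= 1
        return ~crc & 0xFFFF                  # ones complement of the result

    # Hypothetical usage: compute the check value over an arbitrary example
    # sequence of octets; the octets actually covered and the transmission
    # order of the two CRC octets are those specified by C12.18.
    example = bytes([0xEE, 0x00, 0x00, 0x00, 0x00, 0x01, 0x20])
    fcs = crc16_ccitt(example)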

5.2.18. Packet Envelope


The packet’s start-of-packet octet, <stp>, and the end of packet CRC.

5.2.19. PSEM
PSEM is an acronym for “Protocol Specification for Electric Metering” – USA.
PSEM is an acronym for “Protocol Specification for Electronic Metering” – Canada.

5.2.20. Retrieve
To cause the transmission from the DUT to the test application of an entire or partial
DUT table. The act of data retrieval may require the invocation of some or all of the
following PSEM services: Identification, Negotiate, Logon, Wait, Security, Read and
Logoff.

5.2.21. Test
A test is the execution of one element of a FUT procedure by the test application.

5.2.22. Test Application


The test application is a software application built to enable Manufacturers, Utilities and
meter verifiers to migrate their AMR and testing environment to ANSI C12.19 Standard-
based technology. It can read and program any ANSI C12.19-based device that uses
ANSI C12.18 Standard communication protocol. It also accepts and creates table
definition files, which describe the architecture and state of any ANSI C12.19 device. It
may also be a diagnostic and test tool, which is a reference application used to validate
device design assumptions and implementation details.

5.2.23. Write
To cause the transmission from the test application to the DUT of an entire or partial DUT
table. The act of data writing may, in addition to the above, require the invocation of
some or all of the following PSEM services: Identification, Negotiate, Logon, Wait, Security,
Read, Write and Logoff. The DUT may require the invocation of the Procedure 2, “Save
Configuration”, prior to Logoff for it to retain the changes.

6. Product Submission and Verification Criteria

6.1. Rationale
This document defines a suite of tests with well-defined acceptance criteria that promotes
interoperability among devices that use the referenced standards.

6.2. Submission rules


Manufacturer consent shall be obtained for each compliance procedure applied to the FUT.
Test procedures that do not meet with manufacturer approval, or are withdrawn by the
manufacturer, prior to test initiation, shall be entered as “Not Tested”, and an explanation for the
result shall be entered in the exception field of the report.
The test suite outlined in this document performs an assessment of the DUT and its ability to
conform to the said reference test application. The following is a summary of the general
components that will, and will not be assessed as part of these conformance tests.

6.3. Verification Goals


The tester shall perform an assessment of the following:
1. Completeness of submission of DUT for testing (e.g. meter, connectors, accessories,
manuals, supporting electronic files, etc.)
2. Configuration and state disclosure of the DUT prior to testing (passwords, special processes,
jumper settings, mode settings, etc.)
3. Readiness of the DUT for testing (e.g. is it in metering state, is it sealed, does it need to be set
up? If so, how?).

4. Compatibility with Test Application communication reference interface (C12.18) at the physical
layer.
5. Compatibility with Test Application communication reference implementation (C12.18) at the
data link layer.
6. Compatibility with Test Application protocol reference implementation (C12.18) at the
application layer (PSEM).
7. General assessment of firmware stability, solely derived from performing the tests described
above.

6.4. Verification Non-Goals


The tester shall not perform an assessment of the following:
1. DUT accuracy tests (including time, interval, and metrological registers).
2. DUT reliability tests (drop test, vibration test, electromagnetic susceptibility, weather,
temperature, etc.).
3. Hardware interface quality, accuracy and signaling strengths (except where it clearly does not
work reliably).
4. Device factory configuration and setup.
5. Implementation logic (assessment of device functions implemented, their delivery logic or
value to customer).
6. Evaluation of device performance, efficiency and effective delivery of advertised functionality to
the marketplace.

6.5. Interpretation of the Standards


Explain how this Standard is applied and how other Standards are interpreted. Explain how
ambiguity is addressed or not addressed. Give an example.

6.6. Other criteria


1. Where the standard is clear and unambiguous, produce a clear pass/fail result.
2. Where the standard is unambiguously clear in its intent, but does not provide implementation
details for the required functionality, apply common sense.
3. Where the standard is vague or there is room for multiple interpretations, the test will be written
to handle the alternatives and expose the variations.
4. Where the standard says nothing or is so vague that one cannot formulate a test, do not test.
Do we want to revise these examples to be specific to C12.18?

6.6.1. Example 1
This example describes a test result where the Standard is clear and unambiguous.
Description

EVENT_INHIBIT_OVF_FLAG is defined as follows in table 71.

FALSE: Event Log is not inhibiting new entries when an overflow condition exists.
TRUE: Event Log is inhibiting new entries when an overflow condition exists.

OVERFLOW_FLAG is defined as follows in table 76.

FALSE: Overflow has not occurred.
TRUE: An attempt was made to enter an event such that the number of unread entries would
have exceeded the actual number of possible entries in the log.
Conclusion
The standard is very clear in its expression of the intended implementation of the
OVERFLOW_FLAG status as restricted by the EVENT_INHIBIT_OVF_FLAG control flag.
Therefore, one cannot argue against a test that validates the behavior of an end device, following
excessive excitation that will cause the log to overflow. The compliant log shall not overflow if
EVENT_INHIBIT_OVF_FLAG control flag is true, and it shall overflow if
EVENT_INHIBIT_OVF_FLAG control flag is false. Regardless of the state of
EVENT_INHIBIT_OVF_FLAG control flag, the OVERFLOW_FLAG shall be set.

6.6.2. Example 2
This example describes a test result where the standard is unambiguously clear in its intent but
does not provide implementation details for the intent.
Description

LIST_TYPE_FLAG is defined as follows in table 26.


FALSE FIFO (First In First Out) as placed in self read list.
TRUE Circular list as placed in self read list.

LIST_TYPE is defined as follows in table 63.


0 FIFO (First In First Out) as placed in load profile storage.
1 Circular list as placed in load profile storage.

LIST_TYPE is defined as follows in table 74.


0 FIFO - as placed in log.
1 Circular - as placed in log.

Conclusion
The acronym FIFO is not defined in the Standard. However, the standard qualifies the term FIFO in
table 26 and table 63 with the definition "First In First Out". Using (flawed) logic one can argue
that one cannot make any assumption about what the term FIFO means in Table 74, since the Standard
is silent on that point (reasoning similar to the argument that manufacturer tables do not have
FLCs). So what would one make of FIFO in table 74? Does FIFO mean "First In First Out" or "Fill In
Faulty Order" or "First In Final Out"? One could argue that a test for FIFO in Table 74 cannot be
carried out on this basis.
Reasonable people would say that it is rather obvious that based on prior use of the term FIFO, the
Standard is clear about the intended use of the definition "First In First Out" and the compliance
Standard shall perform the tests accordingly.
Caveat: The term Circular is not defined in the Standard. Does this mean that C12.23 cannot have a
test for Circular vs. FIFO, since we do not know what it really means in terms of placement or
ordering of entries in a log? If so, how can one build an AMR system to read a log based on the
C12.19 Standard?

6.6.3. Example 3
This example describes a test where the standard is vague or there is room for "possibilities";
the test will be written to cope with these possibilities and expose the variations.
Description
The value of the element LAST_ENTRY_SEQ_NBR, 0..4,294,967,295, the sequence number of
the newest valid entry in the log, is not clearly described by C12.19.
It could retain its value after a reset list pointers procedure, or it could be reset. However,
there are only two possible outcomes following the execution of a reset list pointers procedure:
(a) LAST_ENTRY_SEQ_NBR retains its value.
(b) LAST_ENTRY_SEQ_NBR does not retain its value.
Conclusion
The Standard is clear that following execution of a reset list pointer procedure, all elements that
describe the content (extent) of the list shall be consistent with that of the empty list. Therefore, one
can design a test that validates the reset action to create an empty list and also report the value of
LAST_ENTRY_SEQ_NBR before and after the test. This will expose the actual implementation
model of the DUT. Implementation variations of the LAST_ENTRY_SEQ_NBR shall not result in a
failure of a DUT, unless there is a country-specific clause regarding a specific desired behavior
(e.g. for Canada).

6.6.4. Example 4
This example describes a test where the standard says nothing or is so vague that one cannot
formulate a test.
Description
EVENT_LOG_CTRL_TBL (Table 75) defines the Event Log codes to be written to the Event Log. It
also defines which specific procedures and/or table writes are to be acknowledged in the
Event Log. For a specific procedure or table to be acknowledged, three independent tests shall all
be true:
The procedure or table shall be used in the end device, per the GEN_CONFIG_TBL (Table 00).
The appropriate Event code shall be used, per Table 75.
The procedure or table shall be configured to be acknowledged, per Table 75.
Conclusion
It is rather clear that table 75 contains the list of events used by the event logger. But what is an
“appropriate event code”? The standard does not define it and there is no indication of how one
should interpret “appropriate”. For that reason one cannot define a test for the use of appropriate
event codes.
Caveat: It is possible that some countries may provide definitions (e.g. Canada) for what is
appropriate. If such definitions are mandated, they can be entered into the country-specific clause.

7. General Conformance Statement

7.1. The Test Application


The test application is a software application built to enable Manufacturers, Utilities and meter
verifiers to migrate their AMR and testing environment to ANSI C12.19 Standard-based
technology. It can read and program any ANSI C12.19-based device that uses ANSI C12.18

Standard communication protocol. It also accepts and creates table definition files, which
describe the architecture and state of any ANSI C12.19 device. It may also be a diagnostic
and test tool, which is a reference application used to validate device design assumptions and
implementation details.

7.2. The Test Application as a Reference ANSI C12.19/IEEE-1377 Implementation

The test application will be designed to communicate the data tables according to the ANSI
Standard C12.19-1997 / IEEE-1377, “Utility Industry Standard Tables”, although for the
purposes of testing ANSI C12.18 compliance, it will only need to access a subset of the tables
implemented in any DUT. It shall implement ANSI Standard C12.18-1996, “Protocol
Specification for ANSI Type 2 Optical Port”.
The software compliance tests described in this Standard shall validate the ability of a Device
Under Test (DUT) to communicate correctly and reliably with the test application being the
reference ANSI Standard C12.19-1997 / IEEE-1377 application.

7.3. Test Results and Ranking Categories


Tests performed on the DUT, based on the compliance procedures, shall be used to rank
DUTs on a per Function Under Test (FUT) basis according to the following scheme:

7.3.1. Not Applicable


A “not applicable” reported result is one that cannot be carried out or should not be
carried out given the configuration or test settings of the DUT. The tester makes the
determination of inapplicability during the initial evaluation of the DUT features when
preparing the DUT for the tests.

7.3.2. Not Tested


A “not tested” reported result is one that was not carried out or should not be carried
out, as determined during any of the following stages:
(a) Customer initiated request prior to test initiation during acceptance of test
procedures.
(b) Initial evaluation of the DUT features, in preparation for test, by the tester.
(c) Subsequent evaluation by the tester of the DUT’s ability to be subjected to a test.

7.3.3. Conforming
The DUT was found fully compliant with the Test Application reference implementation
according to the test procedure applied in a given test and the expected results for the
related FUT.

7.3.4. Conforming with Discrepancy


The DUT was found to be partially, marginally or non-compliant with Test Application
reference implementation according to the procedure applied in a given test and
expected results for the related FUT. The discrepancy is considered a non-fatal failure of
the DUT for the FUT.

7.3.5. Non-conforming
The DUT was found to be partially, marginally or non-compliant with Test Application
reference implementation according to the procedure applied in a given test and

expected results for the related FUT. The discrepancy is considered a fatal failure of the
DUT for the FUT.
Do we need to include guidelines for distinguishing between conforming with
discrepancy and non-conforming?

7.3.6. Non-conforming, Rejected


The DUT was rated non-conforming in a qualification test.

7.3.7. Test Remarks


For each of the “Conforming with Discrepancy” or “Non-conforming” test results, the
tester shall enter informative remarks in the test report comment column to indicate the
cause of conformance discrepancy.
The tester shall not disclose, as part of the test report, the reason for a “Not Tested”,
when it is customer initiated. Such explanations shall be provided as an attachment to
the report or as directed by the customer.

8. Verification Objectives for ANSI C12.18 Devices

8.1. ANSI Standard C12.19-1997 / IEEE-1377, “Utility Industry Standard Tables”

The primary objective of this Standard is to establish an interoperable data format for gas,
water and electricity meters for the purpose of control and transport of data records between
the utility and a metering device. The Standard aims to maximize the degree of interoperability,
recognizing that it is not possible to define a universal data format that is suitable for all
devices. It defines mechanisms for the promotion and establishment of interoperability at the
test application level through the introduction of pre-defined data structures, also known as
Tables.

8.1.1. Type of Tables


The tables are transferred to and from the DUT. These are grouped into two major types:
“Standard Tables” and “Manufacturer Tables”. Each type is capable of having up to 2040
tables. The “Standard Tables” are in turn organized by function into groups of 10 tables,
known as decades. The Standard decades include configuration tables, data source
selection tables, metrological register tables, display control tables, access security
tables, clock and time-of-use tables, load profile tables, history and event log tables and
user-defined tables.

8.1.2. Operations on Tables


The tables are structured in such a way that they simplify the interface of the test
application, using only Table Read and Table Write services. The Test Application shall
exclusively use the read/write services to set up the DUT, if necessary, and perform the
tests.

Table 0, General Configuration Table


The Standard aims to collect tables into functional groups beginning with Table 0,
General Configuration Table. The Test Application shall be able to read and interpret

Standard Table 0 in order to determine the presence of other tables and their write
access modes for the purposes of conducting other tests.

8.2. ANSI Standard C12.18-1996, “Protocol Specification for ANSI Type 2 Optical Port”

8.2.1. Basic Implementation Assumptions
The ANSI C12.18-1996 Standard describes the protocol mechanism used to exchange
table data (ANSI C12.19-1997) between a DUT and the test application. The ANSI
C12.18-1996 Standard specifies the physical interface, data link layer interface and
application layer.
The physical interface is a simple light emitting diode (LED) transmitting a logical zero (0)
when the LED is “on” and a logical one (1) when the LED is “off”. The wavelength of the
radiated signal is between 800nm and 1000nm. The data link layer closely resembles
conventional EIA-232 systems. It is limited to asynchronous half-duplex data.
At the data link layer the data bits are grouped into octets that are transmitted as a serial
bit stream, least significant bit first. Each octet is preceded by a start bit (logical zero) and
followed by a stop bit (logical one). The default bit transmission rate is 9,600 bits per
second (used to initiate all tests). Other bit rates between 300 and 56,200 bits per
second may be supported.
Information is interchanged as variable length packets (default value is 64 octets in
length). Packet header parameters used within the application layer and the data link
layer are transmitted most significant octet first. However, table data is transmitted in
accordance with the DATA_ORDER table element defined in Table 0, the General
Configuration Table.
The protocol is based on the OSI model. It uses layer 7 (application), layer 2 (data link)
and layer 1 (physical) of the 7-layer OSI stack. Layers 6 (presentation), 5 (session), 4
(transport) and 3 (network) are not used.
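
For illustration, the Python sketch below opens a serial channel with the default line settings
described above and packs a two-octet header field most significant octet first. It assumes the
pyserial package and a hypothetical port name for the optical probe; it is a convenience sketch,
not part of the Standard.

    import struct
    import serial  # pyserial, assumed to be available

    # Default line settings: 9,600 bps, 8 data bits, no parity, one stop bit,
    # half duplex over the Type 2 optical port.
    channel = serial.Serial(
        port="/dev/ttyUSB0",              # hypothetical optical-probe port name
        baudrate=9600,
        bytesize=serial.EIGHTBITS,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        timeout=2.0,                      # read timeout chosen by the test application
    )

    # Packet header parameters are transmitted most significant octet first,
    # so a two-octet header field such as <length> is packed big-endian.
    length_field = struct.pack(">H", 1)   # e.g. a one-octet payload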

8.2.2. States and Services


Layer 7 of ANSI C12.18-1996 is also known as “Protocol Specification for Electric
Metering” (PSEM). The test application shall assess the correct implementation of PSEM
protocol as follows:

Base State
The Base State is an idle state, where the DUT waits for the establishment of a
communication link, followed by the initiation of the Identification Service by the test
application.

Identification Service
The Identification Service establishes the identity of the end device by returning the
revision and version numbers of the protocol implementation. The Identification
Service can assist the test application in determining the DUT PSEM protocol
Standard compliance level.
Being the only valid service available in the Base State, the Identification Service must
be the first service requested after the establishment of a physical connection or after
the execution of the Terminate Service request or following any other condition that
causes the DUT to reenter the Base State.
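
As a sketch only: the Identification request carries the PSEM request code 0x20 and, being the
only valid service in the Base State, it is the first request the test application issues whenever
the DUT enters or re-enters that state. The helper names send_request and expect_ok are
hypothetical placeholders for the test application's packet layer, not part of the Standard.

    IDENTIFICATION = bytes([0x20])   # PSEM Identification request code
    TERMINATE = bytes([0x21])        # PSEM Terminate request code

    def identification_first(send_request, expect_ok):
        # Identification must precede every other service request while the
        # DUT is in the Base State.
        send_request(IDENTIFICATION)
        expect_ok()                  # response carries the std/ver/rev information
        # ... Negotiate, Logon and other services may follow in later states ...
        send_request(TERMINATE)
        expect_ok()                  # DUT returns to the Base State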

Base State Initial Channel Setting
The test application expects the DUT to have its channel parameters set as follows,
while in the Base State.

Data rate: 9600 bits per second
Packet size (Data Link layer): 64 octets
Maximum number of Data Link packets per Application layer message: 1 packet
Channel traffic time-out: 6 seconds
Transmission retries: 3 retries

Table: ANSI C12.18-1996 Base State default communication settings
Upon entry into the Base State, all DUTs are expected to reset their communication
settings to the defaults above.
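
The same defaults, expressed as configuration constants for a test application (a convenience
representation only, not a structure defined by the Standard):

    # ANSI C12.18-1996 Base State defaults assumed by the test application.
    BASE_STATE_DEFAULTS = {
        "data_rate_bps": 9600,            # bits per second
        "packet_size_octets": 64,         # data link packet size
        "packets_per_message": 1,         # data link packets per application message
        "channel_traffic_timeout_s": 6,   # seconds of inactivity before link termination
        "transmission_retries": 3,        # retry attempts
    }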

Identify State
The Identify (ID) State is entered from the Base State following successful completion
of the Identification Service by the DUT. This service request is initiated by the test
application.
The ID State is an intermediate communication state bridging the Base State and the
Session State. While in the ID State, the test application will negotiate or re-negotiate
the communication link characteristics. The services a DUT is expected to support in
the Identify State are Negotiate, Wait and Terminate.

Session State
The Session State is entered from the ID State as a result of successful completion of
a Logon request. A session represents an environment where a set of requests is
exchanged between the test application and a DUT. These requests are expected to
have something in common, such as user ID and security clearance. PSEM provides
facilities for changing the access permission using the Security Service and
terminating the session by invoking the Logoff Service. The services supported in
Session State are Read, Write, Security, Wait, Terminate and Logoff.

9. Test procedures for ANSI C12.18

9.1. Test procedures for the Physical Layer

TO DO: Add test for “Dimension” (Section 4.8.1)


TO DO: Expose any issues on probe attachment.

TO DO: Add test for “Receiver Characteristics” (Section 4.8.3.3)


TO DO: Add test for “Environmental Lighting Conditions” (Section 4.8.3.3)
9.1.1. Transmitter Characteristics (Section 4.8.3.2)
Objective
Verify the transmitter characteristics of the DUT optical port. This is a DUT qualification
procedure.

Description
Using a calibrated photometer/radiometer and probe, position the probe 10mm from
the surface of the optical port and measure the luminance when the LED is on and
when the LED is off. See C12.18 section 4.8.3.3 for a diagram of the test setup.
Move the probe to a distance of 25mm and repeat the measurements.

Possible Test Results

Result Description: Luminance is between 250 and 7500 µW/cm2 when the probe is 10mm from the
surface of the optical port and the LED is on.
Classification: Conforming

Result Description: Luminance is less than 10 µW/cm2 when the probe is 10mm from the surface of
the optical port and the LED is off.
Classification: Conforming

Result Description: Luminance is between 85 and 7500 µW/cm2 when the probe is 25mm from the
surface of the optical port and the LED is on.
Classification: Conforming

Result Description: Luminance is less than 10 µW/cm2 when the probe is 25mm from the surface of
the optical port and the LED is off.
Classification: Conforming

Result Description: Otherwise.
Classification: Non-conforming, Rejected

TO DO:
- A C12.18 Application level sequence of transactions shall be defined to cause the end-device
to send data, and to enable the testing of luminance, wavelength and sensitivity of the ANSI
Type 2 optical port.
- Test procedures and generic test equipment shall be described to set various luminance and
wavelength acceptance/emission levels to establish the operational limits of test 1 above.
- Create a list of generic test equipment properties in an Appendix.

9.2. Test procedures for the Data Link Layer


9.2.1. Pre-Test General Set-up Requirements
The DUT as accepted for test shall be minimally configured as follows:
(a) It shall be programmed, configured and ready for metering.
(b) If a physical seal is used to prohibit tampering with the DUT internal working
mechanisms then the DUT shall be sealed.
(c) The clock shall be set to track time correctly in the time zone where the
verifications take place.
(d) Data sources, demand-registers, cumulative registers and profile-recording
functions shall be active and ready for test.
(e) History loggers and Event loggers shall be active and ready for test.
(f) When DUT programming is not possible, while in “metering mode”, Standard
Procedure 6, “Change End Device Mode” shall be used by the test application to
alternate between “metering mode” and “metering mode + meter-shop mode” to
enable DUT programming. Otherwise, the test application shall use the
mechanisms provided by the manufacturer to change its operating mode so that

the DUT can be programmed in preparation for test. However, this will result in a
Non-Conforming classification for Change Mode FUT.
(g) The presence of Standard Procedure 2, “Save Configuration”, is an indication to
the test application that a “Save Configuration” procedure needs to be requested
following DUT programming and prior to Logoff. Otherwise, its presence shall
result in an implied Conforming with discrepancy rating of this FUT.
(h) The clock shall be set using Standard Procedure 10, “Set Date and/or Time” or by
writing directly to Table 52, “Clock”, when Standard Procedure 10 is not available.
(i) A procedure shall be provided by the Manufacturer to re-establish the initial, “as
delivered”, configuration for use by the tester. This procedure shall utilize
Standard or Manufacturer tables only.

9.2.2. Basic Sanity Check


Objective
This is a DUT qualification procedure.
The procedure verifies that the device can complete a simple test application
transaction. A test application transaction is initiated by sending an Identification
request to the DUT.

Description
With the DUT in the Base State, the test application initiates an Identification request to the
DUT and records the transaction results.

Possible Test Results

Result Description: The Identification request completes with no errors.
Classification: Conforming

Result Description: The Identification request does not complete or it completes with an error.
Classification: Non-conforming, Rejected

9.2.3. Immunity to Random Noise


Objective
TO DO: Do this test at the ID state to avoid conflict with multi-protocol devices
Validate the capacity of the DUT to reject faulty data. This test establishes the
behavior of the DUT upon receipt of invalid octets outside the packet envelope. Invalid
octet values range from 0x00 to 0xFF except for 0x06, 0x15 and 0xEE.

Description
With the DUT in the Base State or following receipt of an ACK or a valid response, a
random collection of invalid octets is sent by the test application to the DUT. The
response channel is monitored for a NAK or any other code.
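
One possible way for the test application to build the noise burst, shown as a sketch: draw random
octets from the full 0x00-0xFF range while excluding the three values listed above as meaningful
outside a packet envelope (<ack> = 0x06, <nak> = 0x15 and <stp> = 0xEE).

    import random

    RESERVED = {0x06, 0x15, 0xEE}   # <ack>, <nak>, <stp>

    def random_noise(count: int) -> bytes:
        # Octets that are invalid outside the packet envelope.
        pool = [value for value in range(0x100) if value not in RESERVED]
        return bytes(random.choice(pool) for _ in range(count))

    noise = random_noise(32)        # sent to the DUT; no response is expected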

Possible Test Results
Result Description Classification
DUT does not respond at all. Conforming
DUT responds with a NAK or any other code. Non-conforming

9.2.4. ACK Returned Due to Reception of a Duplicate Packet


Objective
Affirm that the DUT returns an ACK upon receipt of a duplicate packet from the test
application.

Description
Two consecutive packets are issued by the test application. Both packets are
constructed to be identical having the same CRC, reserved byte value and toggle bit
state. The DUT response channel is monitored for one ACK in response to each of
the packets.

Possible Test Results


Result Description Classification
DUT responds with two ACKs. Conforming
DUT does not respond with two ACKs Non-conforming
Caveats
The request or data received in the first packet shall be processed by the DUT. The
DUT shall ignore the request or data received in the second.

9.2.5. NAK Returned Due to CRC Error


Objective
Validate that the DUT issues a NAK to the test application upon receipt of a packet
that is corrupt with a bad CRC.

Description
Two consecutive packets are sent by the test application to the DUT. The first packet
sent contains an invalid CRC. The second packet sent is valid and contains no
corruption of any kind. The DUT response channel is monitored by the test application
for NAK followed by an ACK.
This test is performed twice, once with a corruption in the data payload of the first
packet and a second time with a corruption of the first packet’s CRC itself.
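
A sketch of one way to derive the two corrupted variants from a known-good packet; pkt is assumed
to be a complete, valid C12.18 packet whose final two octets are the CRC and whose header occupies
the first six octets.

    def corrupt_crc(pkt: bytes) -> bytes:
        # Flip one bit in the CRC field so it no longer matches the packet contents.
        bad = bytearray(pkt)
        bad[-1] ^= 0x01
        return bytes(bad)

    def corrupt_payload(pkt: bytes) -> bytes:
        # Flip one bit in the payload while leaving the original CRC untouched.
        bad = bytearray(pkt)
        bad[6] ^= 0x01    # first payload octet, assuming a six-octet header
        return bytes(bad)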

Possible Test Results


Result Description: DUT issues a NAK in response to the first packet, then an ACK in response to
the second packet.
Classification: Conforming

Result Description: All other responses.
Classification: Non-conforming

9.2.6. NAK Returned Due to Inter-Character Timeout
Objective
Confirm that the inter-character timeout is implemented correctly.

Description
An incomplete packet is sent by the test application to the DUT. The DUT response
channel is monitored for a NAK.
Need to clarify what constitutes an incomplete packet. Should the DUT NAK a
“packet” that contains only an 0xEE and one additional byte, for example?

Possible Test Results


Result Description: DUT issues a NAK to the test application 500 to 2000 msec following the
receipt of the last octet.
Classification: Conforming

Result Description: DUT does not issue a NAK to the test application within 500 to 2000 msec
following the receipt of the last octet.
Classification: Non-conforming

Result Description: DUT issues a NAK to the test application earlier than 500 msec following the
receipt of the last octet.
Classification: Non-conforming

Result Description: DUT issues a NAK to the test application more than 2000 + 100 msec following
the receipt of the last octet.
Classification: Non-conforming
Caveats
This procedure partially validates the acknowledgement timeout, which is set to 2
seconds by the Standard.

9.2.7. NAK Returned Due to Long Packet


Objective
Validate the behavior of the DUT when it receives a packet that is larger than
expected according to the packet’s length field.

Description
A packet with a payload size greater than one octet is composed by the test
application. The test application then sets the payload length field to a value that is
smaller than the actual payload size. It computes the correct CRC so that all other
fields remain valid. The test application sends the long packet to the DUT and
monitors the DUT response channel for a NAK.

Possible Test Results


Result Description: DUT issues a NAK to the test application after 0.175 msec following the
receipt of the bad CRC field, but not later than 2000 msec following the receipt of the last octet
of the corrupt test packet.
Classification: Conforming

Result Description: Any other response.
Classification: Non-conforming

Caveats
For this procedure to work near the 0.175 msec range, the test application may need
to use a full duplex ANSI Type 2 optical probe.
Is this test valid as described?

9.2.8. NAK Returned Due to Invalid Multi-Packet Flag Setting


Objective
Validate that the multiple-packet flag of the packets in a multi-packet message
operates correctly.

Description
The test application sends one packet (or the first packet in a multi-packet
transmission) to the DUT with the multiple-packet flag (bit 7 of the Control field) set to
1 and the first-packet flag (bit 6 of the Control field) set to 0. The test application then
monitors the DUT response channel for a NAK.

Possible Test Results


Result Description Classification
DUT responds with a NAK Conforming
Any other response. Non-conforming

9.2.9. NAK Returned Due to Invalid First-Packet Flag Setting


Objective
Validate that the first-packet flag for packets of a multi-packet message operates
correctly.

Description
The test application sends one packet (or the first packet in a multi-packet
transmission) to the DUT with the multiple-packet flag (bit 7 of the Control field) set to
0 and the first-packet flag (bit 6 of the Control field) set to 1. The test application then
monitors the DUT response channel for a NAK.

Possible Test Results


Result Description Classification
The DUT responds with a NAK Conforming
Any other response. Non-conforming

9.2.10. NAK Returned Due to an Invalid Sequence Number


Objective
Confirm that a NAK is returned to the test application from the DUT when a packet
with an invalid sequence number is received by the DUT in a multi-packet
transmission sequence.

Description
The test application initiates a multi-packet transmission to the DUT. The sequence of
packets is constructed such that the second packet in the sequence contains an
invalid sequence number; the third and all subsequent packets in the same sequence

contain correct sequence numbers. The DUT response channel is monitored for a
NAK on the second packet in the sequence and an ACK for all others.

Possible Test Results


Result Description: The DUT responds with an ACK on the first packet, a NAK for the second packet,
an ACK on the third packet and all other packets in the sequence, and the entire transaction
completes without errors.
Classification: Conforming

Result Description: Any other ACK/NAK combination, or the transaction completes with error.
Classification: Non-conforming

9.2.11. Retransmission Triggered by a Response Timeout


Objective
Demonstrate the ability of the DUT to retry packet transmission within the correct
response time-out limits upon loss of an ACK response from the test application.

Description
The test application initiates a transaction that triggers the transmission of a packet
from the DUT. Upon receipt of the valid packet from the DUT, the test application
does not respond with an expected ACK. The test application then monitors the DUT
response channel for 2 seconds (±250 msec), awaiting the retransmission of the
unacknowledged packet. Upon receipt of the retransmitted packet, it completes the
transaction.

Possible Test Results


Result Description: The DUT retransmits the unacknowledged packet within the allotted time frame.
The retransmitted packet is an exact duplicate of the first unacknowledged packet. The transaction
completes without error.
Classification: Conforming

Result Description: Any other ACK/NAK combination, failure to retransmit, or the transaction
completes with error.
Classification: Non-conforming

9.2.12. Retransmission Triggered by a NAK


Objective
Confirm that the DUT handles the reception of NAK from the test application correctly.

Description
The test application initiates a transaction that triggers the transmission of a packet
from the DUT. Upon receipt of the valid packet from the DUT, the test application
responds with a NAK. The test application then monitors the DUT response channel
for 500 msec., awaiting the retransmission of the negatively acknowledged packet.
Then it completes the transaction.

Possible Test Results
Result Description: The DUT retransmits the negatively acknowledged packet within the allotted
time frame. The retransmitted packet is an exact duplicate of the negatively acknowledged packet.
The transaction completes without error.
Classification: Conforming

Result Description: Any other ACK/NAK combination, failure to retransmit, or the transaction
completes with error.
Classification: Non-conforming

9.2.13. Link Termination After Third Retry Attempt (NAK)


Objective
Establish the DUT’s ability to terminate the link and return to the Base State after the
third failed transmission retry attempt due to the receipt of a NAK.

Description
The test application performs two tests: one while the DUT is in the ID State (following
an Identification Service request) and another while the DUT is in the Session State
(following Identification and Logon Service requests).
a) ID State Tests
While the DUT is in the ID State, the test application initiates a Wait request. It
then responds four times with a NAK (one for each received packet from the
DUT). Following the fourth NAK the test application issues an Identification
request; then it monitors the DUT response channel for an <ok>.
b) Session State Tests
While the DUT is in the Session State, the test application initiates a Read table 0
request. It then responds four times with a NAK (one for each received packet
from the DUT). Following the fourth NAK, the test application issues an
Identification request; then it monitors the DUT response channel for <ok>.

Possible Test Results


Result Description: The DUT does not retry transmission three times (it does not send four packets
in total).
Classification: Non-conforming

Result Description: DUT does not respond with an <ok> to the final Identification request.
Classification: Non-conforming

Result Description: DUT processes the final Identification request without error.
Classification: Conforming

9.2.14. Link Termination After Third Retry Attempt (Response Timeout)


Objective
Establish the DUT’s ability to terminate the link and return to the Base State after the
third failed transmission retry attempt following response timeout.

Description
The test application performs two tests: one while the DUT is in the ID State
(following an Identification Service request) and another while the DUT is in the
Session State (following Identification and Logon Service requests).
(a) ID State Tests
While the DUT is in the ID State, the test application initiates a Wait Service
request. It then sits idle and it monitors the DUT’s attempts to retransmit the
unacknowledged response packet three times. Following the third retry attempt,
the test application waits 2 seconds; then it issues an Identification request and
monitors the DUT response channel for an <ok>.
(b) Session State Tests
While the DUT is in the Session State, the test application initiates a Read table 0
request. It then sits idle and it monitors the DUT’s attempts to retransmit the
unacknowledged response packet three times. Following the third retry attempt,
the test application waits 2 seconds; then it issues an Identification request and
monitors the DUT response channel for <ok>.

Possible Test Results


Result Description: The DUT does not retry transmission three times (it does not send four packets
in total).
Classification: Non-conforming

Result Description: DUT does not respond with an <ok> to the final Identification request.
Classification: Non-conforming

Result Description: DUT processes the final Identification request without error.
Classification: Conforming

9.2.15. Link Termination Due to Channel Traffic Timeout


Objective
Validate the DUT’s ability to terminate the link and return to the Base State as a result
of channel traffic timeout. This test is performed while the DUT is in the ID and
Session States.

Description
(a) ID State Tests
The test application initiates an Identification service request to the DUT. Then it
remains idle for a period of 6 seconds (the channel traffic timeout). Following that, it
issues a Wait service request to the DUT. The DUT response channel is
monitored for an “invalid service sequence state” error code <isss>.
(b) Session State Tests
The test application initiates an Identification service request to the DUT followed
by a Logon service request. Then it idles by not initiating any more transactions for
a period of 6 seconds (the channel traffic timeout). It then issues a Read table 0
request to the DUT. The DUT response channel is monitored for an “invalid
service sequence state” error code <isss>.

Possible Test Results
Result Description: The DUT processes the Identification request without error, but rejects the
Wait request with an <isss> error code.
Classification: Conforming

Result Description: The DUT does not process the Identification request, or it accepts the Wait
request without error.
Classification: Non-conforming

9.2.16. Sensitivity to the Initial State of the Toggle Bit


Objective
Confirm that the DUT performs correctly and equivalently when receiving a first
packet with its toggle bit (bit 5 of the Control field) arbitrarily set to 0 or 1.

Description
The test application initiates an Identification Service request to the DUT immediately
followed by a Terminate request. This sequence is initiated twice; the first time the test
application sets the toggle bit of the first packet to 1. Subsequent packets have their
toggle bit successively toggled (flipped from 0 to 1 or from 1 to 0) relative to the state
of the toggle bit of the previous packet. The test application monitors the DUT
response channel to check for errors.
The test is repeated, but this time with the toggle bit of the first packet set to 0.
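Generating the two packet sequences requires direct control over the Control field. The informative sketch below frames a C12.18 packet with an explicit toggle bit; the service codes, the CRC parameters and coverage, and the byte ordering shown are assumptions of the sketch and must be confirmed against the Standard before use.

    TOGGLE_BIT = 0x20   # bit 5 of the Control field (per this procedure)

    def c12_crc(data: bytes) -> int:
        """Packet CRC, implemented here as the reflected CCITT CRC
        (polynomial x^16 + x^12 + x^5 + 1, initial value 0xFFFF, final
        one's complement). This is one common reading of C12.18; verify it
        against the Standard's definition."""
        crc = 0xFFFF
        for byte in data:
            crc ^= byte
            for _ in range(8):
                crc = (crc >> 1) ^ 0x8408 if crc & 1 else crc >> 1
        return ~crc & 0xFFFF

    def build_packet(payload: bytes, toggle: int, identity: int = 0,
                     seq: int = 0, ctrl_extra: int = 0) -> bytes:
        """Frame one packet as <stp><identity><ctrl><seq-nbr><length><data><crc>.
        The big-endian length and little-endian CRC are assumptions."""
        ctrl = (ctrl_extra | (TOGGLE_BIT if toggle else 0)) & 0xFF
        frame = (bytes([0xEE, identity, ctrl, seq & 0xFF])
                 + len(payload).to_bytes(2, "big") + payload)
        return frame + c12_crc(frame).to_bytes(2, "little")

    # First pass: the first packet's toggle bit is set to 1, then flipped.
    identification = build_packet(b"\x20", toggle=1)   # 0x20: Identification (assumed code)
    terminate      = build_packet(b"\x21", toggle=0)   # 0x21: Terminate (assumed code)
    # Second pass: repeat with the first packet's toggle bit set to 0.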

Possible Test Results


Result Description | Classification
The DUT processes the Identification and Terminate requests without error. | Conforming
The DUT fails to process the Identification or Terminate requests without error. | Non-conforming

9.2.17. Default Packet Size Validation (Read)


Objective
Establish the capability of the DUT to assume and honor the default values of the
number of packets and packet size (64 octets per packet, 1 packet per request or
response) using a Read request.

Description
The test application initiates a Logon sequence to the DUT without invoking the
Negotiate service (does not adjust the channel parameters). Following Logon, a Read
Service is issued for a table that contains more octets than the default packet size of
64 octets can accommodate. The test application monitors the DUT response
channel for the reception of either a maximum of 55 octets of data, or an <onp> error
code.
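A pass/fail decision for this test only needs the response code, the amount of data returned, and the number of packets used. The informative sketch below assumes hypothetical helpers for the Logon sequence and for the Read, and assumes 0x00 and 0x04 as the <ok> and <onp> code values.

    OK_CODE = 0x00   # <ok>  (assumed value)
    ONP     = 0x04   # <onp> "operation not possible" (assumed value)

    def default_packet_read_test(logon_without_negotiate, read_large_table):
        """logon_without_negotiate() performs Identification plus Logon only;
        read_large_table() returns (resp_code, data_octets, packets_used).
        Both are hypothetical test-harness helpers."""
        logon_without_negotiate()
        resp, data, packets = read_large_table()
        if resp == ONP:
            return "conforming"        # DUT refused the oversized read
        if resp == OK_CODE and packets == 1 and len(data) <= 55:
            return "conforming"        # response fits the 64 octet default packet
        return "non-conforming"        # larger packet or more than one packet used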

Possible Test Results
Result Description | Classification
The DUT returns exactly 55 octets. | Conforming
The DUT responds with an <onp> error. | Conforming
The DUT delivers the table using either more than one packet or a larger packet. | Non-conforming

9.2.18. Default Packet Size Validation (Write)


Objective
TO DO: Review this test for the case where Write is not supported by the device.
Establish the capability of the DUT to assume and enforce the default values of the
number of packets and packet size (64 octets per packet, 1 packet per request or
response) for a Write request.

Description
The following test is performed twice.
(a) The test application initiates a Logon sequence to the DUT without invoking the
Negotiate service (does not adjust the channel parameters). Following Logon, a
Write Service is initiated on a table that cannot be contained in one packet using
the default values. In the first test the application generates a packet that is larger
than the default. The test application monitors the DUT response channel for the
reception of a NAK.
(b) The test application generates more than one packet to accommodate the excess
size. The test application monitors the DUT response channel for the reception of
a NAK.

Possible Test Results


Result Description | Classification
The DUT returns NAK. | Conforming
The DUT processes the requests. | Conforming with discrepancy

9.3. PSEM Test Procedures


9.3.1. PSEM Service Transaction
Objective
This is a DUT qualifying test procedure.
The procedure tests the general “sanity” of the DUT and verifies that it can handle a
typical PSEM sequence of service requests.

Description
The following sequence of PSEM services is executed: Identification, Negotiate,
Logon, Security, Read table 0, Write-table(Remove), Logoff, then Terminate.
The test application communicates with the DUT. Throughout the communication
sequence it monitors PSEM responses looking for the reception of error codes or
improperly formatted responses.
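The qualifying sequence can be driven by a simple loop that stops at the first error. In the informative sketch below, request(name) is a hypothetical helper that issues the named PSEM service with suitable parameters and returns the <resp> code; 0x00 is the assumed <ok> value.

    def psem_sanity_sequence(request):
        """Run the qualifying sequence and report the first failing service."""
        sequence = ["identification", "negotiate", "logon", "security",
                    "read_table_0", "write_table", "logoff", "terminate"]
        for name in sequence:
            resp = request(name)
            if resp != 0x00:   # anything other than <ok> fails the procedure
                return f"non-conforming: {name} returned {resp:#04x}"
        return "conforming"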

Possible Test Results
Result Description | Classification
No errors were triggered by the DUT or the test application. | Conforming
An error condition was triggered by the DUT or the test application. | Non-conforming, rejected
Caveats
For this procedure to work, the test application needs to know a DUT password.
Alternatively, the tables selected need to be accessible at the default security level.

9.3.2. Identification Service (Response)


Objective
Verify the validity of information returned by the DUT Identification response.

Description
The test application initiates an Identification request to the DUT and verifies that the
response from the DUT contains valid information that is consistent with the C12.19-
1997 Standard/Revision numbers.
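Checking the response amounts to parsing a handful of octets. The informative sketch below assumes the response layout <ok><std><ver><rev>... as one reading of C12.18; the <rsvd> value listed in the table below is not examined here.

    def check_identification_response(resp: bytes) -> bool:
        """Validate the Identification response against the expected values."""
        if len(resp) < 4 or resp[0] != 0x00:    # first octet must be <ok> (assumed value)
            return False
        std, ver, rev = resp[1], resp[2], resp[3]
        # Expected by this procedure: <std> = 0 (ANSI C12.18), <ver> = 1, <rev> = 0.
        return (std, ver, rev) == (0, 1, 0)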

Possible Test Results


Result Description | Classification
The Identification response indicates that <std> = 0, <ver> = 1, <rev> = 0 and <rsvd> = 0. | Conforming
Anything else. | Non-conforming

9.3.3. Negotiate Service (Change Baud Rate)


Objective
Validate the capability to change the baud rate without side effects or synchronization
problems.

Description
The baud rate change test is repeated for each of the transmission bit rates
identified in the Standard. These include 300, 600, 1200, 2400, 4800, 9600, 14400,
19200, 28800 and 56200 bits per second.
The test application shall attempt to negotiate each of the above bit rates using the
Negotiate service. The DUT shall return in its negotiated response the baud rate it will
use in all subsequent transactions. If the baud rate in the response is different from
the one used to communicate the Negotiate request (first Negotiation is
communicated at 9600 bits per second), the channel speed shall be adjusted
accordingly by the test application prior to the initiation of the next service request
(which will be the next Negotiate service request). The channel is monitored by the
test application for errors or channel noise.
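The sweep lends itself to a small loop that always follows the rate granted by the DUT before sending the next request. The informative sketch below assumes a pyserial-style port whose baudrate attribute can be changed on the fly, and a hypothetical negotiate_baud() helper that sends the Negotiate request and returns the rate taken from the DUT response.

    # Bit rates listed in the procedure above; the first Negotiate is sent at 9600.
    BAUD_RATES = [300, 600, 1200, 2400, 4800, 9600, 14400, 19200, 28800, 56200]

    def baud_rate_sweep(port, negotiate_baud):
        """negotiate_baud(port, rate) is a hypothetical helper that issues a
        Negotiate request carrying `rate` and returns the rate the DUT granted."""
        results = {}
        for rate in BAUD_RATES:
            granted = negotiate_baud(port, rate)
            if granted != port.baudrate:
                port.baudrate = granted   # follow the DUT before the next request
            results[rate] = granted       # record what was actually negotiated
        return results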

Possible Test Results
Result Description | Classification
No errors were triggered by the DUT or the test application following the baud rate change. | Conforming
The Negotiate service resulted in error codes. | Conforming with discrepancy
Cannot communicate following the baud rate change. | Non-conforming
Caveats
The 14400 and 28800 bit rates are not standard computer serial port rates; they are
common in modem communications. As such, the test application may not test the
DUT at these rates.

9.3.4. Negotiate Service (Number of Packets and Packet Size)


Objective
Validate the capability of the DUT to accept minimal changes to the number of packets
and packet size used in the data link layer to transfer data payloads.

Description
DUT transmission channel limits are tested after the completion of a Logon sequence,
which includes a Negotiation service request (while in ID State), setting a limit of 2
packets, each having a maximum length of 65 octets.

Possible Test Results


Result Description | Classification
DUT responds with actual number of packets supported and packet size. These should be less than or equal to the number and size requested. | Conforming
NAK or any other error condition. | Conforming with discrepancy

9.3.5. Negotiate Service (Honors Negotiated Limits)


Objective
Validate the capability of the DUT to honor the negotiated packet limits. During
negotiations, the test application conveys its desired packet size and count limits to
the DUT. The DUT responds with its own limits, attempting to match the test
application limits to the best of its designed ability. Thereafter, a real application is
expected to honor the DUT limits, and the DUT is expected to honor the limits
conveyed to it by the real application. The sequence of tests validates the DUT packet
generation behavior for consistency with the above.

Description
Following successful completion of the Negotiate Service (Number of Packets and
Packet Size) procedure, the DUT is tested for honoring the negotiated limits.
(a) The test application initiates a read to the DUT. The process is repeated three
times, so that:
(i) The retrieved table size can only be assembled for transmission by the DUT
using the smallest (packet-size x number of packets) setting of either the test
application or the DUT.

(ii) The retrieved table size can only be assembled for transmission by the DUT
using the smallest (packet-size x number of packets) setting that is greater
than the DUT limits, but less than or equal to that of the test application.
(iii) The retrieved table size can only be assembled for transmission by the DUT
using a larger (packet-size x number of packets) setting than the test
application.
(b) The test application initiates table writes to the DUT. The process is repeated a
number of times, so that:
(i) The number of packets and packet size are within the DUT conveyed limits.
(ii) The number of packets is within the DUT conveyed limits, but the packet size
is too large.
(iii) The packet size is within the DUT conveyed limits, but the number of packets
is too large.
(iv) Both the number of packets and the packet size exactly equal the DUT
conveyed limits.

Possible Test Results


Result Description | Classification
Table data retrieval from the DUT operates within the test application packet limits. | Conforming
Table data retrieval from the DUT operates within the DUT packet limits and the test application packet limits. | Conforming
A NAK was generated by the DUT when it received a data payload that does not fit within the negotiated DUT packet limits. | Conforming
An <onp> was generated by the DUT when the requested data payload would not fit within the negotiated packet limits of the DUT or the test application. ??? | Conforming
The DUT accepts data payloads that exceed the DUT limits. | Conforming with discrepancy
The DUT delivers data payloads that exceed the test application limits. | Non-conforming
Unexpected error code generated. | Non-conforming

9.3.6. Negotiate Service (Establish Maximum Packet Limits)


Objective
Identify the maximum packet size and maximum number of packets supported by the
DUT.

Description
DUT transmission channel maximum limits are negotiated by the test application. This
includes the execution of a Negotiation service (while in ID State) by the test
application requesting channel settings of a maximum of 255 packets of 8192 octets
each.

The test application repeats test 9.3.5, “Negotiate Service (Honors Negotiated
Limits)”, above, and reports the negotiated DUT channel limits.

Caveats
The number of packets and the largest packet size used by the DUT to transmit the
large tables are recorded.
Possible results?

9.3.7. Wait Service


Objective
Confirm the ability of the DUT to accept the Wait service to override the default
channel traffic timeout.

Description
This procedure invokes four tests:
(a) Initiate an Identification request to put the DUT in the ID State. A Wait service is
initiated by the test application requesting a 20-second wait period. The test
application idles for 18 seconds; then it initiates another Wait service request. The
DUT response channel is monitored for an unexpected “Invalid Service
Sequence State” error code <isss>.
(b) Repeat test (a) above, but this time the test application idles for 22 seconds and
monitors the response channel for an expected “Invalid Service Sequence State”
error code <isss>.
(c) Repeat tests (a) and (b) above, except that the initial state is set to the Session
State.
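The timing checks above map directly onto two sleep-and-probe passes per state, as sketched below (informative). enter_state() and send_wait(seconds) are hypothetical test-harness helpers; 0x00 and 0x0A are the assumed <ok> and <isss> code values.

    import time

    OK_CODE, ISSS = 0x00, 0x0A   # assumed <ok> and <isss> code values

    def wait_service_test(enter_state, send_wait):
        """enter_state() puts the DUT in the ID or Session State;
        send_wait(seconds) issues a Wait request and returns the <resp> code."""
        # Tests (a)/(c): the second Wait arrives inside the 20 second window.
        enter_state()
        if send_wait(20) != OK_CODE:
            return False
        time.sleep(18)
        within_window_ok = (send_wait(20) != ISSS)     # <isss> here is unexpected
        # Tests (b)/(c): the second Wait arrives after the window has expired.
        enter_state()
        if send_wait(20) != OK_CODE:
            return False
        time.sleep(22)
        after_window_rejected = (send_wait(20) == ISSS)  # <isss> here is expected
        return within_window_ok and after_window_rejected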

Possible Test Results


Result Description | Classification
The DUT responds with an <isss> after 18 seconds of idle time. | Non-conforming
The DUT does not respond with an <isss> after 22 seconds of idle time. | Non-conforming
Otherwise. | Conforming

9.3.8. Terminate Service Working


Objective
Confirm that the Terminate service is accepted in all DUT protocol states, thus placing
the DUT back into Base State.

Description
This test is performed in each of the possible states, the Base, ID, and Session states.
For each of these states, a Terminate service is initiated. After acceptance of the
terminate service an Identification Service is initiated. The channel is monitored during
the execution of both the Terminate and the Identification service for detection of any
errors.

Possible Test Results
Result Description | Classification
Identification service request executes without error. | Conforming
The DUT responded with an <isss>. | Non-conforming

9.3.9. Service Sequence State


Objective
Verify that the services correctly implement the service sequence state diagram.

Description
This test is performed in each of the possible states: the Base, ID and Session states.

In each of these states, the FUT-related service is initiated. The channel is monitored
for the reception of errors, and responses that do not match the following table are
reported.
Service BASE state ID state SESSION state
Identification <ok> <isss> <isss>
Wait <isss> <ok> <ok>
Negotiate <isss> <ok> <isss>
Terminate <isss> <ok> <ok>
Logon <isss> <ok> <isss>
Logoff <isss> <isss> <ok>
Security <isss> <isss> <ok>
Read <isss> <isss> <ok>
Write <isss> <isss> <ok>
Partial read <isss> <isss> <ok>
Partial write <isss> <isss> <ok>
Valid return codes for associated services and related states
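The table above is easy to capture as data, which lets the test application iterate over every state and service pair mechanically. In the informative sketch below, enter_state(name) and issue_service(name) are hypothetical test-harness helpers, and 0x00 / 0x0A are the assumed <ok> / <isss> code values.

    OK_CODE, ISSS = 0x00, 0x0A   # assumed <ok> and <isss> code values

    # Expected <resp> code for each service in each protocol state,
    # transcribed from the table above.
    EXPECTED = {
        "identification": {"base": OK_CODE, "id": ISSS,    "session": ISSS},
        "wait":           {"base": ISSS,    "id": OK_CODE, "session": OK_CODE},
        "negotiate":      {"base": ISSS,    "id": OK_CODE, "session": ISSS},
        "terminate":      {"base": ISSS,    "id": OK_CODE, "session": OK_CODE},
        "logon":          {"base": ISSS,    "id": OK_CODE, "session": ISSS},
        "logoff":         {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
        "security":       {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
        "read":           {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
        "write":          {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
        "partial_read":   {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
        "partial_write":  {"base": ISSS,    "id": ISSS,    "session": OK_CODE},
    }

    def service_sequence_state_test(enter_state, issue_service):
        """Drive every (state, service) pair and collect any mismatches."""
        failures = []
        for service, per_state in EXPECTED.items():
            for state, expected in per_state.items():
                enter_state(state)
                actual = issue_service(service)
                if actual != expected:
                    failures.append((state, service, expected, actual))
        return failures   # an empty list corresponds to a Conforming result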

Possible Test Results


Result Description | Classification
Result codes are consistent with the expected codes listed in the table above. | Conforming
Result codes are inconsistent with the expected codes. | Non-conforming
Testing against informative annex?

Annex A - Test Reports (Normative)

A.1 Tests Summary Sheet


Test request information
Test report number:
Requested by:
Date device received:
Date device accepted for testing:
Date test completed:

Device information
Manufacturer name:
Device model:
Hardware version and revision:
Firmware version and revision:
C12.19 NEMA device class:

Customer contact information
Name:
Address:

Office phone number:


Pager phone number:
Mobile phone number:
Fax number:
Email address:

Result Scores Summary Code


Conforming: C
Conforming with discrepancy: CWD
Non-conforming: NC
Not applicable: NA
Not tested: NT

General comments

A.2 Detail Results of “Test procedures for ANSI C12.18”
A.2.1 Results of “Test procedures for the Data Link Layer”
Section Test name Result Code Comment #
9.2.2 Basic Sanity Check
9.2.3 Immunity to Random Noise
9.2.4 ACK Returned Due to Reception of a Duplicate Packet
9.2.5 NAK Returned Due to CRC Error
9.2.6 NAK Returned Due to Inter-Character Timeout
9.2.7 NAK Returned Due to Long Packet
9.2.8 NAK Returned Due to Invalid Multi-Packet Flag Setting
9.2.9 NAK Returned Due to Invalid First-Packet Flag Setting
9.2.10 NAK Returned Due to an Invalid Sequence Number
9.2.11 Retransmission Triggered by a Response Timeout
9.2.12 Retransmission Triggered by a NAK
9.2.13 Link Termination After Third Retry Attempt (NAK)
9.2.14 Link Termination After Third Retry Attempt (Response Timeout)
9.2.15 Link Termination Due to Channel Traffic Timeout
9.2.16 Sensitivity to the Initial State of the Toggle Bit
9.2.17 Default Packet Size Validation (Read)
9.2.18 Default Packet Size Validation (Write)

A.2.2 Results of “PSEM Test Procedures”


Section Test name Result Code Comment #
9.3.1 PSEM Service Transaction
9.3.2 Identification Service (Response)
9.3.3 Negotiate Service (Change Baud Rate)
9.3.4 Negotiate Service (Number of Packets and Packet Size)
9.3.5 Negotiate Service (Honors Negotiated Limits)
9.3.6 Negotiate Service (Establish Maximum Packet Limits)
9.3.7 Wait Service
9.3.8 Terminate Service Working
9.3.9 Service Sequence State

Annex B - Verification Codes (Normative)


This annex provides a description of the various verification codes (stickers) and logos associated with
the codes that may be issued by a tester for the selected device.
