
Technical

Instructions

Reference: PTC Orbe/MES-E&A/RDT


Replaces: TM-31.505
January 2004

Important
The content of this document is a trade secret. It may not
be reproduced, distributed, or disclosed to third parties
without proper authorization.
All rights belong to Nestec Ltd. 1800 Vevey, Switzerland

Instrument Calibration
Program Implementation
Guide

Nestec S.A. 2004 - Nestlé PTC Orbe - MES-E&A Department

Table of contents

PURPOSE AND SCOPE
GENERAL REQUIREMENTS
ROLES AND RESPONSIBILITIES
IMPLEMENTATION
    Definition of Calibration Program Elements
    Measuring Loop Inventory and Calibration Quality Requirements
    Calibration Procedures & Calibration-related Forms
    Scheduling System
    Labeling/Tagging System
    Measurement Traceability
    Test and Reference Standards
    Personnel Training
    Calibration Certificates
    Documentation
    Calibration Software
    Periodic Review of Calibration Program
APPENDICES
    Table of contents
    I. Calibration Class Classification
    II. Calibration Interval Definition Guide
    III. Calibration Interval Quick Reference Guide
    IV. Calibration Certificate (Example)
    V. Calibration Procedure (Example)
    VI. Measuring Loop Inventory (Example)
    VII. Calibration Report (Example)
    VIII. Instrument Calibration Schedule Form (Example)
    IX. Test and Reference Measurement List (Example)
    X. Definitions
    XI. Standardization Bodies
    XII. Bibliography and references

PURPOSE AND SCOPE

Foreword    Operational Excellence through line optimization, while maintaining product quality and ensuring safety, is the driving force in our factories today. It has led to demanding operational requirements that are established against measurable benchmarks.
In our manufacturing lines, some of these benchmarks are based on measurable physical quantities. To make sure that the benchmarks are realistic, the accuracy of the instruments used must be regularly checked and maintained.
An effective instrument calibration program is therefore essential. Such a program ensures that the calibratable instruments and measuring systems used in factories are compared and adjusted to metrology standards of higher accuracy.

Purpose and content of the document    This document serves as a reference for those who are responsible for putting in place and maintaining an instrument calibration program. It will help them to:
Better understand the meaning and purpose of instrument calibration.
Understand how to set up, implement, and maintain an instrument calibration program.
Know how to ensure instrument calibration traceability in accordance with the requirements of the Nestlé Quality System.

Target Audience    This document is addressed to: Market Chief Engineers, PTC and R&D Engineering Heads, Factory Engineers, Factory Electricity & Automation Heads, and Project Engineers.

Field of application    This document applies to the preparation and implementation of a calibration program in Nestlé factories. It covers field-installed instruments as well as the test and reference instruments and measuring devices in the workshops.
This document is not a detailed guide on how to perform the actual calibration of each particular instrument, and it does not cover how to manage and organize instrument calibration databases. Detailed calibration procedures are therefore not treated here, and instrument calibration database management is handled separately, either by a dedicated software package or by a manual filing system.
It does not cover instruments used in general or in-line laboratories for quality and analytical checks, which are covered by dedicated Nestec instructions. Instruments used for Alternative Methods are covered by a separate Nestec document, CP-31.504.
This document is not intended to replace any effectively implemented local Instrument Calibration Program. Existing programs may, however, be adapted to ensure that all the calibration program elements mentioned in this document are satisfactorily met.

Instrument Calibration as an NQS Element    Instrument calibration is element 4.8 of the Nestlé Quality System (NQS) and is part of the requirements for achieving the First Priority Level. It has been given this priority because the reliability of a manufacturing process depends largely on the accuracy of the instruments used for the application.
For this reason, an effective calibration program has to be put in place in compliance with the NQS.

GENERAL REQUIREMENTS

General Rules The calibration program should address the following:


Ensure product safety and quality.
Ensure safety of workplace and environment by ensuring the accuracy and
reliability of measuring systems used in monitoring safety parameters.
Ensure the accuracy and reliability of measuring systems used to measure
benchmarks for line optimization.

Basic Elements In implementing an instrument calibration program, the following basic elements
have to be complied with:
Measuring Loop Inventory and Calibration Quality Requirements
Calibration procedures and calibration-related forms
Scheduling system
Labeling/tagging system
Measurement traceability
Adequacy of calibration equipment (test and reference instruments or
working standards)
Personnel Training
Calibration certificates
Documentation/Instrument calibration filing system and/or calibration
software
Periodic Review of Calibration Program


ROLES AND RESPONSIBILITIES

Factory Engineer    The Factory Engineer takes overall responsibility for ensuring that an effective and sustainable instrument calibration program is implemented in their plant.

Factory Electricity & Automation    The factory Electricity & Automation (E&A) group is responsible for the following:
Management and maintenance of the instrument calibration program
Creation and maintenance of the measuring loop inventory
Creation, update, and safekeeping of instrument calibration historical records and certificates
Compliance with the Calibration Program Quality Requirements
Planning, scheduling and performing instrument calibrations
Definition of the Calibration Program Quality Requirements together with Quality, Manufacturing and Safety
Safeguarding of test and reference instruments

Quality Assurance    The Quality Assurance Department is responsible for the following:
Definition of critical measuring loops which are affecting food safety and food quality, particularly the Critical Control Points and Control Points, in close coordination with the Manufacturing Group
Providing assistance in finalizing the Calibration Quality Requirements
Setup and implementation of internal quality audits for instrument calibration.

Manufacturing The Manufacturing Department is responsible for the following:


Definition of critical measuring loops which are affecting operational
performance.
Providing assistance in the definition of critical measuring loops which are
affecting food safety and food quality, particularly the Critical Control Points
and Control Points
Providing assistance in finalizing the Calibration Quality Requirements
Implementation of operational procedures to quickly and periodically
validate the accuracy and reliability of critical measuring loops before the
next calibration schedule.
Provide assistance in the setup and implementation of internal quality
audits for instrument calibration.

Safety The Safety Officer is responsible for the following:


Definition of critical measuring loops which are affecting safety and
environmental protection.
Providing assistance in finalizing the Calibration Quality Requirements
Provide assistance in the setup and implementation of internal quality
audits for instrument calibration.

Approving Technical Authority    These are the people responsible for the approval of critical calibration quality requirements such as the Calibration Interval and the Maximum Permissible Error in Measurement.
Calibration Class Definition:
Quality, Operational & Legal Requirements Measurements: QA Manager, Engineering Manager, Manufacturing Manager
Safety Parameter Measurements: Safety Officer, Engineering Manager, Manufacturing Manager
Calibration Interval (definition and revisions):
Calibration Class A: QA Manager, Engineering Manager, Manufacturing Manager, Safety Officer
Calibration Class B: QA Manager, Engineering Manager, Manufacturing Manager, Safety Officer
Calibration Class C: Manufacturing Manager, Engineering Manager, Safety Officer

IMPLEMENTATION

Definition of Calibration Program Elements

Purpose The various elements for the instrument calibration program must be defined
properly before starting the actual calibration. This is important to ensure that:
The prioritization of instruments or measuring loops is well defined
Calibration intervals and calibration quality requirements are set
Instrument calibration schedule is well established
Calibration procedures and forms are well defined

Measuring Loop Inventory and Calibration Quality Requirements

Procedure In the preparation of Calibration Quality Requirements, the following procedure may
be followed:
1. Measuring Loop Inventory: the list of measuring loops (or chains) in the manufacturing process (including test and reference instruments) has to be prepared, complete with all the relevant information relating to the instrument(s) in each measuring loop.
2. Assignment of Calibration Class: together with the responsible persons, the measuring loops (or instruments) then have to be classified according to their calibration class. The calibration class depends on the various requirements from quality, operational, and safety perspectives.
3. Definition of calibration interval: taking into consideration the calibration class, degree of utilization, environmental factors and the manufacturer's recommendations, the calibration interval has to be set.
4. Maximum Permissible Error in Measurement: this has to be defined taking into consideration the uncertainty budget plus the allowable tolerance (or margin) in operational requirements, based on the criticality or risk involved in the measurement.
5. Definition of other calibration requirements: other calibration criteria must also be defined, such as the number of calibration points, the reference standard, the maximum number of adjustments allowed, etc.

Measuring Loop Inventory    The Measuring Loop Inventory must include all calibratable instruments or measuring systems, namely:
1. Field-installed measuring instruments (installed in the process lines)
2. Test and reference measuring instruments (reference standards)

Calibration Class    The first step in the classification of the measuring loops is the identification of their functions based on one of the following usage categories:
Category 1: Quality Requirements (Food Safety and Quality Consistency)
Category 2: Operational Requirements (Legal Operational Requirements, Efficiency and Productivity Benchmarks, Process Control)
Category 3: Worker Safety & Environmental Requirements
The following table may be used in assigning a calibration class to measuring loops. Refer to Appendix I for the details of the classification.
Category                   Function                                      Calibration Class
1  Quality                 CCP                                           A
                           Thermal Processing Measuring Loops            A
                           CP                                            B
2  Operational             Legal Manufacturing Requirements              A
                           Production/Usage Monitors, KPI Benchmarks     B
                           Process Control                               C
3  Safety & Environment    Primary Element                               A
                           Secondary Element                             B
                           Monitoring instrument                         C
Calibration Class A instruments in this calibration class have the highest priority
in calibration schedule and have shorter calibration intervals
Calibration Class B instruments in this calibration class have the average priority
in calibration schedule and have longer calibration intervals
Calibration Class C instruments in this calibration class have the lowest priority in
calibration schedule and have less stringent or more flexible calibration intervals
Please take note that:
Test and Reference Instruments (or working standards) must be automatically
assigned the Calibration Class A.
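
As an illustration, the following sketch (Python) encodes the assignment table above as a simple lookup. The category and function labels are taken from the table; the function name itself is purely illustrative and not part of any prescribed tool.

    # Illustrative lookup of the calibration class from category and function.
    CLASS_TABLE = {
        # Category 1: Quality
        ("Quality", "CCP"): "A",
        ("Quality", "Thermal Processing Measuring Loop"): "A",
        ("Quality", "CP"): "B",
        # Category 2: Operational
        ("Operational", "Legal Manufacturing Requirements"): "A",
        ("Operational", "Production/Usage Monitors, KPI Benchmarks"): "B",
        ("Operational", "Process Control"): "C",
        # Category 3: Safety & Environment
        ("Safety & Environment", "Primary Element"): "A",
        ("Safety & Environment", "Secondary Element"): "B",
        ("Safety & Environment", "Monitoring instrument"): "C",
    }

    def calibration_class(category, function, is_reference_standard=False):
        """Return the calibration class (A, B or C) for a measuring loop."""
        if is_reference_standard:
            return "A"  # test and reference instruments are always Class A
        return CLASS_TABLE[(category, function)]

    # Example: a CCP temperature loop falls into Class A
    assert calibration_class("Quality", "CCP") == "A"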

Calibration Interval    The calibration interval is defined during the identification of critical instruments, in collaboration with the manufacturing group, QA, the safety officer, and the manufacturing specialist.
The intervals for measuring loops falling under Calibration Classes A and B must be strictly defined, and compliance with the defined intervals must be strongly enforced.
Instruments under Calibration Class C may take the interval recommended by the manufacturer or may be calibrated on an upon-request basis. The cost of calibration is a major factor to consider for the instruments in this calibration class.
However, in any situation, any local legal requirements for the calibration interval must be strictly complied with. An example is the legal requirements for Thermal Processing operations.
The definition and revision of the calibration interval have to be approved by the Approving Technical Authority.
Please refer to Appendix II for the guide to defining Calibration Intervals.


Review of Calibration Interval    Calibration intervals have to be reviewed periodically. It is recommended to review and update the intervals based on the historical calibration records, particularly the last three calibration records. This is important in order to optimize the calibration interval. Through this review, we can minimize cost by extending calibration intervals, and avoid risk by ensuring the reliability of measurements until the next calibration schedule.
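
As an illustration only, the sketch below (Python) shows one way this review logic could be expressed. The 25% adjustment step and the use of the last three as-found errors are assumptions made for the example; any resulting revision still requires the approval of the Approving Technical Authority.

    def review_interval(current_interval_months, last_errors, max_permissible_error):
        """Suggest a new calibration interval from recent as-found errors."""
        recent = last_errors[-3:]                    # look at the last 3 calibrations
        if any(abs(e) > max_permissible_error for e in recent):
            return max(1, round(current_interval_months * 0.75))   # drift too large: shorten
        if all(abs(e) < 0.5 * max_permissible_error for e in recent):
            return round(current_interval_months * 1.25)           # stable: consider lengthening
        return current_interval_months                              # otherwise keep as is

    # Example: a 12-month interval with a stable history is proposed for extension to 15 months
    print(review_interval(12, [0.2, 0.1, 0.15], max_permissible_error=0.5))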

Maximum Permissible Error in Measurement    The total accuracy of a measurement in industrial applications means the accuracy of the final reading, i.e., the accuracy of the whole measuring loop. The final measured value is often the digital value in the automation system used for process control, process reports, energy and material balance calculations, etc.
In determining the maximum permissible error (uncertainty budget) for industrial applications, it is important to identify the uncertainty contributors. The most common contributors are:

Reference standard
Measuring equipment
Environment
Installation
Metrologist
Calculation or reading errors
Definition of the measurand
Calibration procedure
All of these contribute to the uncertainty of the measured characteristic.

The maximum permissible error has to be defined collectively by the groups involved in the program and approved by the Approving Technical Authority. It is not practical to enforce very tight accuracy requirements on all measuring loops. The maximum permissible error has to be defined taking into consideration the following criteria:
Measurement error, determined considering the uncertainty contributors
Calibration Class
Allowable tolerance (or margin) in manufacturing requirements; for measuring loops under Calibration Classes B and C, a wider permissible error may be tolerated
Accuracy requirements have a direct impact on the calibration cost since they directly affect the calibration interval and the choice of instruments.

Determination of the maximum permissible error in measurement    International norms define rules for calibration (ISO/IEC 17025) and for calculating the uncertainty budget (the ISO Guide to the Expression of Uncertainty in Measurement, GUM). These can also be applied in defining factory process instrumentation calibration standards, but they can be too complex and too strict for process instrumentation calibration requirements.
For process instruments, the following simplified approach can be used to determine the maximum permissible error.
Taking into consideration the various uncertainty contributors, the measurement error (or combined uncertainty of measurement) can be calculated as follows:

ETot = √(E1² + E2² + E3² + … + En²)

where
ETot = the combined error
E1…En = the uncertainty factor for each uncertainty contributor
This means that the contributor with the biggest uncertainty factor will have a significant impact on the combined uncertainty, while contributors with a small uncertainty factor will have an insignificant impact on the combined error.
For industrial applications, we can simplify the model by considering only the major uncertainty contributors:
Let ES be the error due to the sensor or measuring equipment.
Let EL be the uncertainty in reading the indicator (or the display in the HMI) of the system to be checked or calibrated; this value is given by the manufacturer.
Let ET be the uncertainty of the test and reference instrument or standard, as shown in the last calibration report for this system.
Let EC be the uncertainty of any intermediate signal conversion device placed between the measuring equipment and the indicating device, such as analog/digital converters, PLC cards, etc.
Let EG be the overall measurement error (or uncertainty budget).

EG = √(ES² + EL² + ET² + EC²)
The Maximum Permissible Error in Measurement is then calculated using this overall measurement error EG as a reference, plus a certain margin depending on the criticality (Calibration Class) of the measuring loop. For example, the accuracy required for the temperature measurement of a sterilization process is not the same as that required for the hot-air temperature measurement of the Egron.
The Manufacturing Group, in collaboration with Engineering, Quality Assurance and Safety, must define this margin. The margin is normally in the range of 1.1 to 2.0 times EG.
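
The following sketch (Python) illustrates the quadratic combination of the major contributors and the application of the criticality margin; all numeric values are hypothetical.

    from math import sqrt

    def combined_error(*contributors):
        """EG = sqrt(ES^2 + EL^2 + ET^2 + EC^2 + ...)"""
        return sqrt(sum(e ** 2 for e in contributors))

    E_S = 0.30   # sensor / measuring equipment error, e.g. in deg C
    E_L = 0.10   # reading error of the indicator or HMI display
    E_T = 0.05   # uncertainty of the test/reference standard (from its certificate)
    E_C = 0.10   # error of the intermediate signal conversion (A/D converter, PLC card)

    EG = combined_error(E_S, E_L, E_T, E_C)
    margin = 1.5                     # chosen between 1.1 and 2.0 depending on criticality
    max_permissible_error = margin * EG
    print(f"EG = {EG:.2f}, maximum permissible error = {max_permissible_error:.2f}")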

Who are The following groups must be involved in the definition of critical measuring loops
involved and the corresponding calibration intervals and error tolerances.
1. Quality and Operational Parameters
Quality Assurance
Manufacturing Group
Engineering, Electricity & Automation
2. Safety
Safety Officer
Manufacturing Group
Engineering, Electricity & Automation

Calibration Certificate    For test and reference instruments:
Test and reference instruments must be traceable to national or international
standards. It is recommended that these shall be calibrated by authorized external
calibration bodies or by the instrument or device supplier, who are capable of issuing
calibration certificates. The calibration certificates must always be traceable to
national or international standards.
Internal calibration may be allowed, but these must be done using working standards
(or other reference instruments) of higher accuracy and which are traceable to
national and international standards.
For the field instruments:
These field instruments must be calibrated using test and references instruments
that are traceable and with valid calibration certificates.
The calibration certificate must be prepared and signed by the person who performed the actual calibration. It can be printed from the calibration software or prepared using a separate form. Signed copies of these certificates must be put on file.
Please see Appendix IV for an example of the calibration certificate.

Calibration Procedures & Calibration-related Forms

Calibration Procedure    The calibration procedure defines the process of performing the actual calibration. It provides the following information:
The equipment required
The physical medium to be used for calibration
The type, number and settings for input signals
The required accuracy
The test and reference instruments needed
The number of tests required
Other information related to the instrument
The calibration procedure for each type of measuring loop or instrument must be
defined.
Procedures must be clearly and effectively documented in Standard Operating
Procedure (SOP) or Quality Monitoring Scheme (QMS) format.
Please refer to Appendix V for an example of a calibration procedure.

Initial Calibration    An initial calibration is required upon installation, repair or replacement of an instrument. This calibration may be done in-house or may be based on a calibration certificate issued by the manufacturer. In any case, the initial calibration must still be valid until the next calibration schedule of the instrument under consideration.

Adjustment If a measurement or the result of the calibration exceeds the maximum permissible
error, an adjustment or correction has to be made. The appropriate adjustment
procedure must be followed. After any adjustment, a re-calibration has to be made.
Multiple adjustments or re-calibrations may be needed until the instrument can be
restored to the desired accuracy. If the desired accuracy is never attained after
several adjustments, the instrument has to be replaced. The maximum number of
adjustments or re-calibration must also be defined.

Deviation Reporting    For instruments classified under Category 1, Calibration Class A (quality-related CCPs or Thermal Processing instruments), the Quality Assurance Department must be notified immediately of any calibration resulting in a measurement deviation outside the maximum permissible error. This is necessary for QA to make an immediate decision on the disposition of the products manufactured.
A separate procedure must be established in the factory, concerning the handling of
non-conforming instruments.

In-Line Validation    In-line validation is a quick process of checking the reliability of some key instruments at the point of measurement. This can be done in the process through
simple comparison between the measuring instrument and a working standard. The
comparison could be on either instantaneous value or accumulated value of the
parameter. The operator could then request calibration of the instrument if the
measuring error is unacceptable.
Examples of this are: checking temperature sensors with a reference thermometer;
or checking load cells with known test weights. This, however, is not applicable for
some measuring loops.
A separate procedure can also be established in the factory for this purpose.
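
A minimal sketch (Python) of such a quick check is shown below; the tolerance and readings are hypothetical values used only for illustration.

    def inline_validation(process_reading, reference_reading, max_permissible_error):
        """Return True if the loop passes the quick in-line check."""
        deviation = abs(process_reading - reference_reading)
        return deviation <= max_permissible_error

    # Example: a temperature sensor reads 85.7 degC against a reference thermometer at 85.0 degC
    if not inline_validation(85.7, 85.0, max_permissible_error=0.5):
        print("Deviation out of tolerance - request calibration of the measuring loop")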

Methods of Calibration
Individual Instrument Calibration
This is a simple instrument calibration that checks only the accuracy and reliability of
the measuring instrument concerned. The instrument is normally taken out from the
line, and tested and compared to a reference standard. The number of calibration
points, required number of tests, and the method to be used are defined in the
corresponding calibration procedure for the instrument under calibration.
If the error exceeds the maximum permissible error, adjustments may be made and re-calibration done until the error is reduced to an acceptable limit.
This method is sufficient if:

Maximum Permissible Error ≥ 1.5 × the Global Error of the Measuring Loop

the Global Error of the Measuring Loop being the quadratic sum of the errors of the instruments that compose it:

ELoop = √(E1² + E2² + E3² + … + En²)

However, if the Maximum Permissible Error is below the error allowance stated
above, it is recommended to perform the calibration of the measuring loop as a
whole.
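
The decision rule above can be illustrated with the following sketch (Python); the instrument errors used in the example are hypothetical.

    from math import sqrt

    def loop_global_error(instrument_errors):
        """E_Loop = sqrt(E1^2 + E2^2 + ... + En^2)"""
        return sqrt(sum(e ** 2 for e in instrument_errors))

    def individual_calibration_sufficient(max_permissible_error, instrument_errors):
        return max_permissible_error >= 1.5 * loop_global_error(instrument_errors)

    # Example: sensor, transmitter and display errors of 0.3, 0.2 and 0.1 degC
    print(individual_calibration_sufficient(0.8, [0.3, 0.2, 0.1]))   # True: 0.8 >= 1.5 * 0.374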
Measuring Loop Calibration
For a measuring loop consisting of multiple components, such as an instrument, a
transmitter and a display, this set of measuring devices will have to be jointly
calibrated as one measuring system.
The test is made under normal working conditions and with all the instruments correctly installed. The calibration can be done by using a reference standard measuring in parallel with the instrument under test, or by using a signal generator at the point of measurement. The reading on the reference standard is then compared to that of the display in the visualization software, indicator, or stand-alone controller, as the case may be.
The calibration has to demonstrate that the accuracy of the measuring loop is equal to or better than the accuracy required by the manufacturing process.

Measuring Loop Inventory Form    If an instrument inventory is readily available, it can be used as the basis for preparing the Measuring Loop Inventory. The measuring loop number (if different from the instrument tag number) and the calibration class have to be added to it.
You may use the form in Appendix VI as an example when completing the Measuring Loop Inventory. This form indicates the detailed specification of the measuring loop or instrument concerned.

Calibration Report    The calibration report has to be prepared by the technicians after conducting the calibration in the field. This report is used as a reference in preparing the calibration certificate or as an input to the calibration software. It must also be put on file.
Please refer to the document in Appendix VII as an example.
Electronic records may be allowed as long as they comply with the local regulations and have the agreement of the Approving Technical Authority.

Scheduling System

Purpose To sustain the effectiveness of the calibration program, a scheduling system has to
be put in place. The scheduling system will provide the information on which
measuring loop has to be calibrated and at which date. The system must also be
designed so as to properly distribute the calibration activities within one calibration
cycle, taking into consideration the calibration intervals, shutdown schedules,
working standards, and manpower resources.

Computer-assisted scheduling    The calibration schedule can be incorporated in the instrument calibration management software, if one is being used. This scheduling function normally comes integrated in any calibration software.
If calibration software is not used, the calibration schedule can also be included in
the computerized maintenance management system (CMMS) in the factory, wherein
work orders are automatically generated for the instruments that are due for
calibration.
A combination of both is also possible, which means that the calibration historical
records are kept in the instrument calibration management system, while the
calibration schedule is maintained in the CMMS.

Manual Scheduling    The calibration schedule can also be planned manually through the preparation of an Instrument Calibration Schedule. This is a spreadsheet containing the list of measuring loops with their corresponding calibration dates, spread over a six-month or one-year period.
As an example, refer to Appendix VIII.
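
As an illustration, the sketch below (Python) distributes hypothetical loops over a one-year plan using their last calibration month and their interval; it is not part of any prescribed form.

    def build_schedule(loops, months_in_plan=12):
        """Return {month_number: [tags due]} for a one-year plan."""
        schedule = {m: [] for m in range(1, months_in_plan + 1)}
        for tag, last_month, interval in loops:
            month = last_month + interval
            while month <= months_in_plan:
                schedule[month].append(tag)
                month += interval
        return schedule

    # Hypothetical loops: (tag, month of last calibration, interval in months)
    loops = [("LP-T5234", 1, 6), ("LP-P5233", 0, 12), ("LP-FM1B12", 0, 3)]
    for month, due in build_schedule(loops).items():
        if due:
            print(month, due)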

Labeling/Tagging System

Purpose Tagging of instruments or measuring loops is put in place to facilitate identification of


instruments. This tag number will also be used as a key reference in the
instrumentation database. This is also the reference key for traceability.

Tagging Convention    The following rules should be followed in tagging measuring loops or instruments:
Each measuring loop should have a unique tag number. A measuring loop may contain one or more instruments.
The tag number should be linked to the asset number to which the measuring loop belongs.
All instruments belonging to one loop must have the same numeric tag (the letter prefix may vary depending on the function). This is important because the calibration is done on a per-loop basis, and the recording system (whether manual or electronic) is also on a per-loop basis.
For more details on the tagging convention please refer to TM 221.18.

Instrument Tag    Each instrument should be provided with an instrument tag in accordance with the tagging convention mentioned above. The tag can be of stick-on or hook-up type. These are usually mounted on the instrument, if conditions permit. Otherwise, the tags can also be placed on the wall or on a support next to the instrument.
The type of material and printing method have to be specified such that the tag can withstand the environmental conditions where the instrument is installed.
An example is a stick-on instrument tag made of Gravoply (2-layer plastic material for engraving), with engraved characters.
Color-coded tagging may also be applied to differentiate instruments belonging to different calibration classes. This can also be used to highlight the prioritization and criticality of field instruments.

Calibration Tag After each calibration, the instrument has to be fitted with a calibration tag. The
calibration tag should contain the following information:
Loop number
Calibration date
Calibration due
Calibrated by (person who performed the calibration)
An example is a hook-up calibration tag, also made from Gravoply, with this information written on it using a permanent marker.

Calibration Date    The calibration date refers to the date of the actual calibration. This is dependent on the instrument calibration plan, which is based on the calibration interval defined for
each instrument. This has to be indicated in the calibration tag after the actual
calibration has been made. This must also be indicated in the corresponding
calibration report and must be entered in the calibration software. The next
calibration due is calculated from this last calibration date.

Calibration Due The calibration due refers to the next calibration schedule, the maximum of which is
defined by the calibration interval. This must also be indicated in the calibration tag
after the actual calibration. This must also be reflected in the calibration report. If
calibration software is used, the next calibration schedule is calculated automatically
provided the calibration interval has been properly defined and entered into the
system.
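
As an illustration, the next due date can be derived from the last calibration date and the interval as in the following sketch (Python, standard library only); the dates are hypothetical.

    from datetime import date
    import calendar

    def next_due(last_calibration, interval_months):
        """Add interval_months to last_calibration, clamping to the month's last day."""
        month_index = last_calibration.month - 1 + interval_months
        year = last_calibration.year + month_index // 12
        month = month_index % 12 + 1
        day = min(last_calibration.day, calendar.monthrange(year, month)[1])
        return date(year, month, day)

    print(next_due(date(2004, 1, 15), 6))   # 2004-07-15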

Measurement Traceability

Definition & Purpose    According to the internationally recognized VIM definition, traceability is "the property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of comparisons all having stated uncertainties".
This is also an important element of the Nestlé Quality System. Proof of measurement traceability is important to guarantee the quality of our products and processes and to ensure the safety of our workplace and environment.

Traceability The calibration program must be designed such that it provides a way to relate the
results of a measurement or value of a reference standard to higher-level standards.
The program must ensure that the measuring loops (particularly those in Calibration Classes A and B) are traceable to a national or international standard.
Calibration certificates and/or reports must state the traceability to a national or
international standard of measurement and must provide the measurement results
and associated uncertainty of measurement and/or a statement of compliance.
Reference instruments should be regularly sent for calibration to external bodies
capable of providing traceability to national or international reference standards.
Reference materials should, where possible, be traceable to national or international
standards of measurement or reference material.
It is important to note that traceability is the property of the result of a measurement,
not of an instrument or calibration report or laboratory. The measuring loop by which
values are transferred must be clearly understood and under control.

Traceability Diagram    The traceability chain for instrument calibration, with a mass-measurement example at each level:

Level                                     Example
SI Units / International Standards        International Reference Mass Standards
National Standards                        National Reference Mass Standards
Reference Standards                       Local Calibration Laboratories
Working Standards                         Factory Test Weights
Process Instruments                       Load Cell

Test and Reference Standards

Purpose    Test and reference measuring devices (working standards) are essential to facilitate the calibration of field instruments. These devices must be periodically sent to certified external bodies for calibration. The test and measuring devices used for field calibration guarantee the traceability of the calibration of our measuring systems.

Types of The most common type of test and reference measuring devices are as follows:
Instruments
Thermometers
Pressure calibrators
Signal generators (current, voltage, interval, etc.)
Resistance boxes
Multi-function calibrators or meters
Reference weights or masses
You may use the following form in Appendix IX to list the test and reference
measuring devices. This form indicates the detailed specification of these
instruments.

Calibration Certificates    These test and reference instruments and standards must be sent periodically for calibration to legally recognized external certifying bodies. They will determine if an
instrument or standard is still valid for its intended use, and they will also specify the
next calibration due. They must also issue a certificate, duly signed by the person
responsible, specifying the accuracy level of the instrument calibrated. The
certificate is valid until the next calibration due indicated in the certificate.
This certificate must be properly kept in the corresponding folders of the test and
reference measuring device.

Personnel Training

Purpose Instrument calibration involves complicated and highly technical systems and
procedures. In order to have an effective instrument calibration program, it is
imperative to have competent technical staff to manage and implement the program.

Competent Staff    The factory is required to have at least one highly skilled staff member to manage the instrument calibration program. The person(s) implementing the calibration program must be proficient in performing the required tasks, knowledgeable in the traceability requirements mandated by the Nestlé Quality System, and familiar with the safety-related measurements in the plant.
The personnel performing the calibration must have the necessary education,
training, background, and experience.

Training The person(s) managing the calibration program and doing the actual measurement
and calibration must have undergone the appropriate training on metrology and
calibration. The level of competency and the kind of training required will have to be
adopted locally according to legal requirements and available training programs.
The qualifications and training must be documented. On a periodic basis, the
calibration personnel must be updated with the current technology and procedures.

Third-party personnel    If the actual measurement or calibration is sub-contracted to a third party, the person(s) doing the work must provide proof of competence for this kind of task, for example through a copy of a training certificate.
However, the management of the calibration program and the responsibility for ensuring its effectiveness remain with the Engineering Department of the factory. Nestlé personnel who are responsible for the calibration program must supervise the external parties performing the calibration.
This must also be covered by existing sub-contracting policies in the factory.

Calibration Certificates

Purpose    Calibration certificates provide the proof of calibration. They also serve as a reference for the current accuracy of the instrument and the error corrections made, and as the basis for determining the next calibration schedule.

Calibration Certificate for test and reference standards    This certificate must be provided by the authorized external calibration bodies or by the instrument or device supplier. The calibration certificate must always be traceable to national or international reference standards.
This document certifies that the instrument performs within the acceptable error tolerance limits. The certificate must also indicate the expiration date of the calibration. This expiration date is the latest date for the next calibration of the reference standard concerned.
These certificates must be kept in their corresponding folder in the calibration files.

Calibration Certificate for other measuring systems    This certificate is prepared by the technician conducting the actual calibration and is duly signed by the person responsible for calibration. The certificate may be prepared manually using a calibration certificate form or generated automatically by the calibration software.
For measuring loops that are usually sent to the manufacturers or their authorized representatives for calibration, those parties must issue a calibration certificate indicating that the instrument is working within the tolerable limits, together with the corresponding expiration date.
Signed copies of these certificates must be put on file, for traceability purposes.
Please refer to Appendix IV for an example of the calibration certificate.

Documentation

Purpose    To facilitate traceability, it is important to have an efficient filing system that ensures fast and accurate retrieval of calibration information. The use of calibration software facilitates this task. The filing system consists of two parts:
Measuring Loop Inventory database: calibration software is very useful for this purpose
Calibration certificates: this part is mandatory, and each instrument must have a valid calibration certificate on file.

Instrument inventory database    An inventory database of all field instruments and all test and reference instruments must be put in place. The database contains the following information for each measuring loop (a sketch of such a record is shown after the list below):
Tag number
Description of instrument
Location (area where installed)
Operating value or range
Maximum Permissible Error in Measurement
Instrument brand and model
Calibration Class
Calibration Interval
Process Medium
The database should also contain the following information:
Calibration schedule
Maintenance History of the measuring loop or instrument.
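
A minimal sketch (Python) of one such inventory record is shown below; the field names follow the list above and the example values are taken from the Appendix VI example (loop LP-T5234).

    from dataclasses import dataclass, field

    @dataclass
    class MeasuringLoop:
        tag: str
        description: str
        location: str
        operating_range: str
        max_permissible_error: str
        brand_model: str
        calibration_class: str            # A, B or C
        calibration_interval_months: int
        process_medium: str
        calibration_history: list = field(default_factory=list)   # past calibration records

    loop = MeasuringLoop(
        tag="LP-T5234",
        description="Temp. Hot Water Satellite Inlet",
        location="NCPP R. dry 3m2",
        operating_range="70 to 90 degC",
        max_permissible_error="1 degC",
        brand_model="Fortex 211693",
        calibration_class="B",
        calibration_interval_months=12,
        process_medium="Water",
    )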

Calibration Records    Calibration records must be kept per instrument, independently of the loop. Although a common loop number is used for all instruments belonging to one measuring loop, each instrument belonging to that loop should have a unique identifier (for example by adding a suffix to the loop number, or by using the instrument serial number as a reference).
Each measuring loop record should then contain all the data about the loop as well as the calibration history of each instrument belonging to the loop. If an instrument is transferred from one measuring loop to another, its calibration history should go with it, and the change must be reflected in the measuring loop record.

Records    Copies of certificates will be distributed as follows:
Calibration workshop
Production Office
The calibration workshop records must be maintained and stored in a safeguarded area.
The records must be retained for the number of years specified in the record retention policy of the factory concerned.
The records must include:
Identification of the instrument
Location
Calibration History
Traceability Documentation
Test Equipment Used
Who Performed the Calibration
Date of Calibration

Filing of calibration certificates    Calibration certificates have to be properly put on file. Two separate certificate-filing systems have to be set up: one for the field instruments and one for the test and reference instruments.
Filing has to be done by loop number. Past calibration certificates must be kept together with the succeeding ones. The retention period may be defined as needed.

Calibration Software

Purpose    As the number of instruments in the inventory increases, managing the calibration program becomes more difficult. Calibration software helps the user in the following tasks:
Management of the Measuring Loop Inventory database, ensuring fast and efficient storage and retrieval of information
Planning of the instrument calibration schedule, by automatically tracking calibration dates and calibration intervals
Traceability
Printing reports, including calibration certificates

Instrument Calibration Management Software Packages    It is highly recommended to use an Instrument Calibration Management Software Package to manage the calibration database. This software will have the following functions:
Sole repository of all field-installed, test, and reference instruments
Stores all relevant information of the instruments, including the calibration
history.
Provides recommendations in calibration schedule by reporting the
instruments due for calibration.
Provides a portable field calibrator and repository of information, which
could speed up the calibration and recording time.
Facilitates traceability by providing the query functions
Prints calibration-related reports including calibration procedures,
calibration forms and calibration certificates
There are various Instrument Calibration Management Software Packages available
in the market. Among the prominent ones are the following:
QM6 or CMX Calibration Maintenance Management Software. This is
software provided by the company Beamex Oy Ab, Finland. The company
web site is: http://www.beamex.com/
AMS Intelligent Device Manager from Emerson Process Management.
The company web site is: http://www.emersonprocess.com
MET / CAL from Fluke Corporation. The company web site is:
http://www.fluke.com

Periodic Review of Calibration Program

Purpose The periodic review is needed to fine tune and re-validate the following:
Measuring Loop Inventory
Calibration Class
Maximum Permissible Error in Measurement
Calibration Interval
Instrument Calibration Schedule
The Measuring Loop Inventory and the corresponding Calibration Class have to be
reviewed in relation to the changes in the manufacturing requirement or process.
With the use of historical calibration information it is practical to adjust accordingly
the calibration interval and calibration schedule.

Review of The Measuring Loop Inventory and the Calibration Class have to be reviewed
Measuring annually. This is to ensure that changes to existing manufacturing operations are
Loop Inventory taken into account, that no critical measurement is left out, and that retired
instruments are removed from the inventory. It is also important to review the
Calibration Class and Maximum Permissible Error in Measurement in order to
update the prioritization of the instruments.

Calibration Interval Review    The calibration interval must also be reviewed annually. With the help of historical calibration data, we can fine-tune the calibration interval. If the calibration history shows no drift in accuracy, it may be economical to lengthen the interval between calibrations. On the other hand, if the drift is significant, it may be practical to shorten the calibration interval to ensure reliability of measurement.
This review of the calibration interval can also help in assessing the reliability of the instruments used in the processing lines.

Persons to conduct the review    The following groups must be represented in the review process:
Quality Assurance
Safety
Manufacturing
Engineering (Electricity & Automation)
Any revisions in the Calibration Class, Maximum Permissible Error in Measurement
or in Calibration Interval have to be approved by the Approving Technical
Authority as defined above.


APPENDICES
Table of contents

These appendices contain the following documents:


Documents
I. Calibration Class Classification
II. Calibration Interval Definition Guide
III. Calibration Interval Quick Reference Guide
IV. Calibration Certificate (Example)
V. Calibration Procedure (Example)
VI. Measuring Loop Inventory (Example)
VII. Calibration Report (Example)
VIII. Instrument Calibration Schedule Form (Example)
IX. Test and Reference Measurement List (Example)
X. Definitions
XI. Standardization Bodies
XII. Bibliography and references


I. Calibration Class Classification

Calibration Class legend: A - High Priority, B - Medium Priority, C - Low Priority

Category 1: Quality

CCP (Critical Control Point) - Calibration Class A
Refers to measuring loops identified to be critical in ensuring Food Safety, based on the HACCP studies. The proper calibration of these measuring loops is mandatory to comply with Food Safety (First Priority Level of the Nestlé Quality System).
Examples:
Temperature measurement used in the sterilization process
Metal detectors classified as CCP

Thermal Processing Measuring Loops - Calibration Class A
Refers to processes involving sterilization and pasteurization. These are also considered CCPs, but are highlighted here due to some specific applications and strict government requirements in traceability.
Examples:
Temperature measurement and recording in cookers and sterilizers
Sterilization processes in retorts and UHT, where the temperature and flow measurements are critical to ensure the minimum time required

CP (Control Point) - Calibration Class B
Refers to measuring loops that have a direct influence on the quality of the product but do not necessarily affect food safety.
Examples:
Roaster drum temperature measurement
Weigh scales in batching systems
Pressure and temperature measuring loops on evaporators


Category 2: Operational

Legal Operational Requirements - Calibration Class A
Refers to measuring loops installed in the lines that are periodically inspected or used by legal authorities.
Examples:
BOD (Biological Oxygen Demand) monitors on effluents
Flowmeter for waste water effluents

Production/Usage Monitors, KPI Benchmarks - Calibration Class B
Refers to measuring loops used in monitoring production output, material usage, line efficiency and performance, or those used as benchmarks for Key Performance Indicators (KPI).
Examples:
Batch weighers
Level transmitters used for inventory
Energy consumption meters

Process Control - Calibration Class C
Refers to all measuring loops used for controlling or monitoring the process. Their performance may not have a significant impact on the finished product quality, and they are not used for legal requirements or for production/usage reports or KPIs.
Examples:
Steam flow measurement
Process tank level measurement
Process temperature measurement


Category 3: Safety & Environment

Primary Element - Calibration Class A
Refers to all measuring loops which directly trigger the actuation of a safety interlock, such as valve closure, motor stoppage or plant shutdown. This also includes all measuring loops triggering alarms that call for drastic and immediate human intervention for safety reasons, such as room evacuation, manual shower triggering, etc.
Examples:
Egron CO detection system
Egron exhaust air temperature used for Egron safety interlocks
Extraction cell pressure measurement
Oxygen level detectors

Secondary Element - Calibration Class B
Refers to all measuring loops which trigger pre-alarm levels or warning signals (if these are separate from the primary element).
Examples:
High-pressure alarm triggers for pressure vessels such as boiler drums
High-level alarm triggers for tanks containing sensitive media, such as acids, caustic solutions, etc.

Monitoring instrument - Calibration Class C
Refers to all measuring loops installed for environmental protection purposes, but only for monitoring trends.
Examples:
DO (Dissolved Oxygen) and pH meters for the waste water treatment plant
Differential pressure measurement for bag filters


II. Calibration Interval Definition Guide

Factors to consider    Here are the key factors to consider in the initial definition of the calibration interval:
1. Calibration Class
2. Degree of Utilization
3. Environmental Consideration
4. Manufacturer's Recommendation
5. Legal Requirements
However, one must remember that the optimum calibration interval is always based on experience and historical calibration data. It is recommended to review the defined intervals periodically, once sufficient calibration data have been gathered.
The simplified table below can be used as a quick guide for the initial definition of the calibration interval. It provides recommendations on calibration intervals taking into consideration the above-mentioned factors.

Calibration Class    This is the classification defined during the preparation of the Measuring Loop Inventory. The measuring loops can be prioritized as follows:
Class A - High Priority
Class B - Medium Priority
Class C - Low Priority
The interval has to be adapted based on the criticality of the instrument. The greater the risk involved, the more frequent the calibration of the measuring loop should be.

Degree of Utilization    The utilization of the instruments is also a factor that affects their performance. All instruments drift over time even if they are not used, but the frequency and duration of usage can speed up the drift.
We can classify the degree of utilization of instruments into two main groups:
High: the instrument is used (or powered up) 50% or more of the time within one calibration cycle (the time between two calibrations).
Low: the instrument is used (or powered up) less than 50% of the time. This is typically applicable to factories with single-shift operation (provided they cut off the power supply to instruments after the shift). It is also applicable to factories or lines with seasonal operation (used only for a few weeks or months in a given year).

Environmental Consideration    The environment where the instruments are installed can also have an effect on how they behave over time. Examples are temperature, vibration, dust, etc.
We can classify the environmental consideration into two main groups:
Severe: the instruments are subjected to severe vibrations or high ambient temperature.
Normal: normal environmental conditions (within the manufacturer's recommended operating conditions).
There is no clear and strict rule for these groupings; it is up to the judgment of the responsible individuals to decide which conditions are considered severe and which are considered normal for the instrument under consideration.


Manufacturer's Recommendation    The starting point for defining the calibration interval is the recommendation from the manufacturer, particularly for instruments with which we do not yet have much experience. These recommendations are normally conservative estimates, provided the instruments are used within the process and environmental conditions specified for the instrument concerned.
In the absence of a manufacturer's recommendation, this value is based on the best judgment of the calibration technicians, using relevant historical information and experience.

Legal Some countries may have existing laws that regulate calibration requirements for
Requirements certain measuring loops involved in high-risk food manufacturing applications such
as in Thermal Processing. In any case, the calibration interval mandated by these
regulations must be strictly complied with.

Standard Calibration Intervals    To simplify scheduling, it is recommended to standardize the calibration intervals, expressed in number of months. Defining them in days or weeks would be too complicated and unrealistic. For this reason, we recommend limiting the intervals to the following groupings:

1 - Monthly
3 - Every 3 months
6 - Every 6 months
9 - Every 9 months
12 - Every 12 months
24 - Every 24 months
R - Upon Request (Calibration Class C instruments with intervals beyond 24 months)


III. Calibration Interval Quick Reference Guide

Manufacturer's        Utilization   Environmental     Proposed Calibration Interval (months)
Recommendation                      Consideration     Class A     Class B     Class C
(months)
1 High Severe 1 1 1
1 High Normal 1 1 1
1 Low Severe 1 1 3
1 Low Normal 1 1 3
3 High Severe 1 3 3
3 High Normal 1 3 6
3 Low Severe 3 3 6
3 Low Normal 3 3 9
6 High Severe 3 3 9
6 High Normal 3 6 12
6 Low Severe 3 6 12
6 Low Normal 3 9 18
9 High Severe 3 6 12
9 High Normal 3 9 18
9 Low Severe 6 9 18
9 Low Normal 6 12 R
12 High Severe 3 9 18
12 High Normal 6 12 24
12 Low Severe 6 12 R
12 Low Normal 9 18 R
18 High Severe 6 12 R
18 High Normal 9 18 R
18 Low Severe 9 18 R
18 Low Normal 12 18 R
24 High Severe 9 18 R
24 High Normal 12 24 R
24 Low Severe 12 24 R
24 Low Normal 18 24 R
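
As an illustration, a few rows of the table can be encoded as a lookup, as in the following sketch (Python); only a subset of the table is shown and the function name is illustrative.

    QUICK_REFERENCE = {
        # (manufacturer_months, utilization, environment): (Class A, Class B, Class C)
        (3, "High", "Normal"):  (1, 3, 6),
        (6, "High", "Normal"):  (3, 6, 12),
        (12, "High", "Normal"): (6, 12, 24),
        (12, "Low", "Normal"):  (9, 18, "R"),
        (24, "Low", "Normal"):  (18, 24, "R"),
    }

    def proposed_interval(manufacturer_months, utilization, environment, calibration_class):
        row = QUICK_REFERENCE[(manufacturer_months, utilization, environment)]
        return row["ABC".index(calibration_class)]

    print(proposed_interval(12, "High", "Normal", "B"))   # 12 months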


IV. Calibration Certificate (Example)


V. Calibration Procedure (Example)

Calibration Procedure for Temperature Measurements (Thermostatic Oven)

Preparation    1. Ensure that the installation is stopped.


2. Take into account the calibration report and note all its entries.
3. Ensure that the "calibration due" date of the sensor under calibration has not yet
expired.
4. Ensure that the " calibration due" date of the indicator (or display) under
calibration has not yet expired.
5. Ensure that the operating range is correct - if mentioned.
6. Record the influential parameters of the environmental conditions.
7. Set up the arrangement:

[Setup diagram: the sensor of the system to be checked and the sensor of the working standard are placed together in the thermostatic oven; each sensor is connected to its own indicator.]

8. If necessary, extend the probe cable after locating the wiring.

Operational Procedure
1. Set the switch on the AMETEK oven, located above the supply connector, to I.
2. Check that "Read °C" is lit.
3. Press the "Set / Read" button.
4. Check that "Set" is lit.
5. Press the "Temp. set" button and set the first value to be simulated.
6. Press the "Set / Read" button and check that "Read" is lit.
7. Wait until the temperature shown on the indicators is stable within ±0.1 °C.
8. Note the temperature readings on all indicators.
9. Repeat the adjustments and readings at all calibration points.
10. Set the switch to 0.
11. Update the calibration tag.
12. Prepare the calibration report.

VI. Measuring Loop Inventory (Example)

Tag | Description | Location | Operating Value/Range (nominal ±20%) | Eng. Units | Max. Permissible Error | Calibration Class | Calibration Interval (months) | Function | Brand / Model / Serial No. | Process Medium

LP-P5233  | Pressure, Steam Satellite Inlet  | NCPP R. dry. 3 m2   | 4 to 6     | bar  | 0.5 bar | B | 12 | Safety      | Emerson / 14588197       | Steam
LP-T5234  | Temp. Hot Water Satellite Inlet  | NCPP R. dry. 3 m2   | 70 to 90   | °C   | 1 °C    | B | 12 | Quality     | Fortex / 211693          | Water
LP-T6134  | Temp. Hot Water Satellite Outlet | NCPP R. dry. 1.5 m2 | 80 to 100  | °C   | 1 °C    | B | 18 | Operational | Fortex / 211693          | Water
LP-T6136  | Temp. Hot Water Satellite Inlet  | NCPP R. dry. 1.5 m2 | 70 to 90   | °C   | 1 °C    | B | 12 | Quality     | Fortex / 211693          | Water
LP-W17    | Balance Ktron                    | NCPP ZLT Line       | 160 to 240 | kg/h | 2 kg/h  | B | 12 | Operational | Ktron / 130495-01        | Dry mix
LP-F1A15  | Hot Water Flowmeter              | NCPP ZLT Line       | 140 to 200 | kg/h | 2 kg/h  | B | 12 | Operational | Emerson / 0880088197     | Water
LP-FM1B12 | Enzyme BAN Flowmeter             | NCPP ZLT Line       | 4 to 6     | kg/h | 50 g/h  | B | 12 | Quality     | Emerson / 383387         | Enzyme
LP-FM1B16 | Enzyme Cereflo Flowmeter         | NCPP ZLT Line       | 4 to 6     | kg/h | 50 g/h  | B | 12 | Quality     | Emerson / 381747         | Enzyme
LP-FM1B27 | Enzyme AMG Flowmeter             | NCPP ZLT Line       | 4 to 6     | kg/h | 50 g/h  | B | 12 | Quality     | Emerson / 363512         | Enzyme
LP-T1B37  | Temp. tubular Heater             | NCPP CHE Line       | 80 to 100  | °C   | 1 °C    | A | 6  | Quality     | Fortex / 12A8241         | Water
LP-P1121  | Pressure in reactor              | NCPP                | 40 to 50   | bar  | 1 bar   | A | 6  | Safety      | Emerson / 5TE / 21W25    | Nitrogen
LP-FI1B15 | Flowmeter out of reactor         | NCPP                | 130 to 180 | kg/h | 2 kg/h  | B | 12 | Operational | Labom / GA2200 / 436234G | Slurry

Approved by: ________________________ ____________________________ __________________________


Production Manager QA Manager E&A Group
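Where the measuring loop inventory is kept in calibration software rather than on paper, one record per loop can mirror the columns above. The following sketch is purely illustrative; the class name, field names and types are assumptions, not a prescribed data model.

# Illustrative sketch only: one possible representation of a measuring loop inventory record.
from dataclasses import dataclass
from typing import Optional

@dataclass
class MeasuringLoop:
    tag: str                             # e.g. "LP-P5233"
    description: str                     # e.g. "Pressure, Steam Satellite Inlet"
    location: str                        # e.g. "NCPP R. dry. 3 m2"
    range_low: float                     # lower end of the operating range
    range_high: float                    # upper end of the operating range
    eng_units: str                       # e.g. "bar", "°C", "kg/h"
    max_permissible_error: str           # e.g. "0.5 bar"
    calibration_class: str               # "A", "B" or "C"
    calibration_interval: Optional[int]  # months; None for "upon request"
    function: str                        # "Safety", "Quality" or "Operational"
    brand: str
    serial_number: str
    process_medium: str

# Example record taken from the first row of the inventory above.
loop = MeasuringLoop("LP-P5233", "Pressure, Steam Satellite Inlet", "NCPP R. dry. 3 m2",
                     4.0, 6.0, "bar", "0.5 bar", "B", 12, "Safety", "Emerson",
                     "14588197", "Steam")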

VII. Calibration Report (Example)

A - IDENTIFICATION
Apparatus description: TEMPERATURE MEASURING LOOP Tag:
Sensor Manufacturer Supplier Type / Model Serial No. Error Tolerance

Usual location Building Room Floor Installation Zone Tag

Indicator Manufacturer Supplier Type / Model Serial No.


Control room

Other:

B - DESCRIPTION OF OPERATION
Method used: Calibration with physical quantity generator: Thermostatic oven
Procedure used:
Range checked:
Values simulated:
Material used:
Weather station Hygrometer, Barometer, Thermometer
Working standards: | Manufacturer | Type | No. | Accuracy class | Calibration Date (Last / Next)
Sensor

Indicator

C - MEASUREMENTS
Environmental conditions:
Influential parameters  | Readings | Uncertainties
Humidity                | % RH     | ± % RH
Atmospheric pressure    | hPa      | ± hPa
Temperature             | °C       | ± °C

SIMULATED VALUES (°C) | READINGS (°C) FROM INDICATORS
                      | AS FOUND                                 | AS LEFT
Units:                | Reference Stand. | Control Room | Other: | Reference Stand. | Control Room | Other:

Calibration Date: Conducted by: Initials:

D - OVERALL MEASUREMENT ERROR AND RESULTS
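For section D, the overall measurement error at each calibration point can be taken as the difference between the indicator reading and the simulated reference value, compared against the maximum permissible error of the loop. The sketch below illustrates this calculation with invented readings; the function and variable names are assumptions, not part of the report form.

# Illustrative sketch only: per-point "as found" errors checked against the maximum permissible error.
def overall_error(simulated_values, indicator_readings, max_permissible_error):
    """Return the per-point errors (reading minus simulated value) and whether all are within tolerance."""
    errors = [round(reading - simulated, 3)
              for simulated, reading in zip(simulated_values, indicator_readings)]
    within_tolerance = all(abs(e) <= max_permissible_error for e in errors)
    return errors, within_tolerance

# Example: three calibration points on a temperature loop with a 1 °C tolerance.
simulated = [70.0, 80.0, 90.0]   # °C set on the thermostatic oven
readings = [70.3, 80.4, 90.6]    # °C read on the control-room indicator
errors, ok = overall_error(simulated, readings, max_permissible_error=1.0)
print(errors, ok)                # [0.3, 0.4, 0.6] True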

VIII. Instrument Calibration Schedule Form (Example)

PTC Orbe INSTRUMENT CALIBRATION SCHEDULE


Electricity & Automation
Tag No. | Week No. 1 to 53 (Year 2003) | Remarks
For each Tag No. the form provides two rows, PLANNED and ACTUAL, to be marked against the week columns; the example sheet contains thirteen blank tag rows.

Prepared by: _______________________________ Approved by: ___________________________________

IX. Test and Reference Measurement List (Example)

PTC Orbe TEST/REFERENCE INSTRUMENT LIST


Electricity & Automation

NAME OF INSTRUMENT | BRAND / MODEL | SERIAL NO. | RANGE | QTY. | DATE ACQUIRED | COMMENTS

X. Definitions

Accuracy (of measurement) From VIM: closeness of the agreement between the result of a measurement and a true value of the measurand.
Note: accuracy is a qualitative concept (a "true" value can never be perfectly known or indeed defined). The term precision should not be used for "accuracy".

Accuracy (of a measuring instrument) From VIM: ability of a measuring instrument to give responses close to a true value.

Accuracy class From VIM: class of measuring instruments that meet certain metrological requirements intended to keep errors within specified limits.
Note: an accuracy class is usually denoted by a number or symbol adopted by convention, called the class index.

Adjustment From VIM: operation of bringing a measuring instrument into a state of performance
suitable for its use
Note: Adjustment may be automatic, semiautomatic or manual.

As found calibration The result of the measurement before any correction or adjustment is made on an instrument or measuring loop. This is normally the result of the first calibration.

As left calibration The result of the measurement after a correction or adjustment is made on an instrument or measuring loop. If the result of measurement is within the specified limits and no correction or adjustment is required, the results of the "as found" and "as left" calibrations should be the same.

Calibration Calibration is a measurement process that assigns values to the property of an artifact or to the response of an instrument relative to reference standards or to a designated measurement process. The purpose of calibration is to eliminate or reduce bias in the user's measuring loop relative to the reference base.
The International Vocabulary of Basic and General Terms in Metrology (VIM, 1993) defines calibration as follows:
Calibration is a set of operations that establish, under specified conditions, the relationship between values of quantities indicated by a measuring instrument or measuring loop, or values represented by a material measure or reference material, and the corresponding values realized by standards.

Calibration and measurement certificates The result of a calibration may be recorded in a document, sometimes called a calibration certificate or a calibration report. The term can also refer to a document accompanying a certified reference material, stating one or more property values and their uncertainties and confirming that the necessary procedures have been carried out to ensure their validity and traceability.
(VIM and ISO Guide 30: Terms and definitions used in connection with reference materials. See also ISO Guide 31: Contents of certificates of reference materials.)

Calibration Procedure The calibration procedure defines the actions and the materials or devices needed to compare an "unknown" or test item(s) or instrument with reference standards according to a specific algorithm.

Correction From VIM: the value added algebraically to the uncorrected result of a measurement
to compensate for systematic error.
Notes:
1. The correction is equal to the negative of the estimated systematic error.
2. Since the systematic error cannot be known perfectly, the compensation cannot
be complete.

Corrected result Result of a measurement after a correction or adjustment has been made for a measurement error. This is commonly referred to as the result of the second or third calibration, after an adjustment or modification.

Deviation The difference between the measured value and its reference value

Dead band From VIM: maximum interval through which a stimulus may be changed in both
directions without producing a change in response of a measuring instrument

Drift From VIM: slow change of a metrological characteristic of a measuring instrument.

Error (of measurement) From VIM: result of a measurement minus the true value of the measurand. Error is numerically equal to the correction but opposite in sign.
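For clarity, the relations implied by the definitions of error, correction and corrected result above can be written as follows (standard metrology relations, stated here only as an illustration):

% Error, correction and corrected result, as implied by the definitions above
\begin{align*}
  \text{error} &= \text{measured result} - \text{true value} \\
  \text{correction} &= -\,\text{estimated systematic error} \\
  \text{corrected result} &= \text{uncorrected result} + \text{correction}
\end{align*}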

Error (of measuring instrument) From VIM: indication of a measuring instrument minus a true value of the corresponding input quantity.
Notes:
1. Since a true value cannot be determined, in practice a conventional true value is used.
2. This concept applies mainly where the instrument is compared to a reference standard.
3. For a material measure, the indication is the value assigned to it.

Error, random From VIM: result of a measurement minus the mean that would result from an
infinite number of measurements of the same measurand carried out under
repeatability conditions

Error, From VIM: mean that would result from an infinite number of measurements of the
systematic same measurand carried out under repeatability conditions minus a true value of the
measurand

Measurand This refers to a specific physical quantity or force subject to measurement.
Example: TC of a given coffee extract at a temperature of 30 °C.
The specification of a measurand may require statements about quantities such as time, temperature and pressure.

Measurement From VIM: set of operations having the object of determining a value of a quantity

Measuring or working range From VIM: set of values of measurands for which the error of a measuring instrument is intended to lie within specified limits.

Measuring loop or measurement chain Series of elements of a measurement system that constitutes the path of the measurement signal from the input to the output.

Measuring system From VIM: complete set of measuring instruments and other equipment assembled to carry out specified measurements.

Metrology Science of measurement

Nominal range Range of indications obtained through a particular setting in the configuration of a
measuring instrument. In most measuring instruments, this can be adjustable
relative to the application or use.
Note: Nominal range is normally stated in terms of its lower and upper limits, for
example, "6 bars to 10 bars".

Precision The closeness of agreement between independent test or measurement results obtained under stipulated conditions.

Reference conditions From VIM: conditions of use prescribed for testing the performance of a measuring instrument or for intercomparison of results of measurements.

Reference material From VIM: material or substance one or more of whose property values are sufficiently homogeneous and well established to be used for the calibration of an apparatus, the assessment of a measurement method, or for assigning values to materials.

Repeatability (of a measuring instrument) From VIM: ability of a measuring instrument to provide closely similar indications for repeated applications of the same measurand under the same conditions of measurement.
Notes:
1. These conditions include:
   - reduction to a minimum of the variations due to the observer
   - the same measurement procedure
   - the same observer
   - the same measuring equipment, used under the same conditions
   - the same location
   - repetition over a short period of time.
2. Repeatability may be expressed quantitatively in terms of the dispersion characteristics of the indications.

Repeatability (of results of measurements) From VIM: closeness of the agreement between the results of successive measurements of the same measurand carried out under the same conditions of measurement.
Notes:
1. These conditions are called repeatability conditions.
2. Repeatability conditions include:
   - the same measurement procedure
   - the same observer
   - the same measuring instrument, used under the same conditions
   - the same location
   - repetition over a short period of time.
3. Repeatability may be expressed quantitatively in terms of the dispersion characteristics of the results.

Reproducibility (of results of measurements) From VIM: closeness of the agreement between the results of measurements of the same measurand carried out under changed conditions of measurement.
Notes:
1. A valid statement of reproducibility requires specification of the conditions changed.
2. The changed conditions may include:
   - principle of measurement
   - method of measurement
   - observer
   - measuring instrument
   - reference standard
   - location
   - conditions of use
   - time.
3. Reproducibility may be expressed quantitatively in terms of the dispersion characteristics of the results.
4. Results are here usually understood to be corrected results.

Resolution (of a displaying device) From VIM: smallest difference between indications of a displaying device that can be meaningfully distinguished.
Notes:
1. For a digital displaying device, this is the change in the indication when the least significant digit changes by one step.
2. This concept applies also to a recording device.
3. It is important not to confuse the resolution of a display alone with the resolution of a force-measuring system which incorporates a display; the system will have less (poorer) resolution than the display alone.

Response time From VIM: time interval between the instant when a stimulus is subjected to a
specified abrupt change and the instant when the response reaches and remains
within specified limits around its final steady value.

Span The difference between the upper and lower limits of a nominal range.
Example: for a nominal range of -40 mbar to +40 mbar, the span is 80 mbar.
This is sometimes referred to simply as the range.

Stability From VIM: ability of a measuring instrument to maintain constant its metrological
characteristics with time.

Standard, International From VIM: a standard recognized by an international agreement to serve internationally as the basis for assigning values to other standards of the quantity concerned.

Standard, National From VIM: a standard recognized by a national decision to serve in a country as the basis for assigning values to other standards of the quantity concerned.

Standard, Primary From VIM: standard that is designated or widely acknowledged as having the highest metrological qualities and whose value is accepted without reference to other standards of the same quantity.

Standard, Secondary From VIM: standard whose value is assigned by comparison with a primary standard of the same quantity.

Standard, Reference From VIM: standard, generally having the highest metrological quality available at a given location or in a given organization, from which measurements made there are derived.

Standard, Working From VIM: standard that is used routinely to calibrate or check material measures, measuring instruments or reference materials.
Notes:
1. A working standard is usually calibrated against a reference standard.
2. A working standard used routinely to ensure that measurements are being carried out correctly is called a check standard.

Standard uncertainty From VIM: uncertainty of the result of a measurement expressed as a standard deviation.
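As a purely illustrative numerical example (the readings are invented, not taken from this guide), the standard uncertainty associated with the mean of n repeated indications can be estimated from their experimental standard deviation:

% Illustrative example: four repeated indications of 99.8, 100.1, 100.0 and 100.1 °C
% give a mean of 100.0 °C; the standard uncertainty of that mean is estimated as follows.
\begin{align*}
  s &= \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2}
     = \sqrt{\frac{0.04+0.01+0.00+0.01}{3}} \approx 0.14\ ^{\circ}\mathrm{C} \\
  u(\bar{x}) &= \frac{s}{\sqrt{n}} = \frac{0.14}{\sqrt{4}} \approx 0.07\ ^{\circ}\mathrm{C}
\end{align*}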

Traceability According to the internationally recognized VIM definition, traceability is the property of the result of a measurement or the value of a standard whereby it can be related to stated references, usually national or international standards, through an unbroken chain of comparisons all having stated uncertainties. Accordingly, the results of measurements must be traceable to reference standards maintained by local or national agencies, e.g. NIST, BSI, AFNOR, DIN, JISC, etc.
In most cases, the ultimate reference for a measurement result is the definition of the appropriate unit in the International System of Units (SI).

Transparency From VIM: ability of a measuring instrument not to alter the measurand.
Examples:
- a mass balance is transparent;
- a resistance thermometer that heats the medium whose temperature it is intended to measure is not transparent.

Uncertainty of measurement Parameter, associated with the result of a measurement, that characterizes the dispersion of values that could reasonably be attributed to the measurand.

XI. Standardization Bodies

[1] Bureau international des poids et mesures


http://www.bipm.org/links/welcome.shtml
[2] International Organization for Standardization
http://www.iso.ch/iso/en/ISOOnline.frontpage
[3] National Institute of Standards and Technology
http://www.nist.gov/
[4] International Electrotechnical Commission
http://www.iec.ch/index.html
[5] European Committee for Electrotechnical Standardization
http://www.cenelec.org/

XII. Bibliography and references

[1] ISO Guide to the Expression of Uncertainty in Measurement (GUM)

International Organization for Standardization


GUM, 1995
ANSI/NCSL Z540-2-1997
When reporting the result of a measurement of a physical quantity, some quantitative indication of the quality of the result has to be given so that its reliability can be assessed and comparisons can be made. The Guide to the Expression of Uncertainty in Measurement establishes general rules for evaluating and expressing uncertainty in measurement that can be followed at many levels of accuracy and in many fields.
[2] International Vocabulary of Basic and General Terms in Metrology

International Organization for Standardization


VIM, 1993
An international agreement on terminology, prepared as a collaborative
work of experts appointed by BIPM, IEC, IFCC, ISO, IUPAC, IUPAP
and OIML. This vocabulary covers subjects relating to measurement
and includes information on the determination of physical constants
and other fundamental properties of materials and substances. The publication contains 60 pages.
[3] Technical Note on ISO Guide to the Expression of Uncertainty in
Measurement

National Institute of Standards and Technology (NIST)


NIST Technical Note 1297 (TN 1297)
This is a summary of the ISO Guide to the Expression of Uncertainty in
Measurement
[4] SI Guide

International Organization for Standardization


This covers the basics of: the historical background, the principles of the SI, the base units, derived units, multiples and sub-multiples, additional units, printing rules, space and time, periodic phenomena, mechanics, heat, electricity and magnetism, light, acoustics, physical chemistry, atomic and nuclear physics, activity, ionizing radiations and characteristic numbers, and it ends with conversion tables.
[5] C.D. Ehrlich and S.D. Rasberry, Metrological Timelines in Traceability,
Journal of Research of the National Institute of Standards and Technology
103, 93 (1998).
