## 1. What are the benefits of calibration?

Calibration gives you confidence in the measurements you are making. Without it, measurements mean little, if anything. Instruments you depend on to measure product functionality or quality may give you false information, causing you to pass bad products or fail good products. In the end, a comprehensive calibration program increases quality and efficiency by making sure the measurements you rely on mean something.

## 2. Why is calibration so important?

Calibration and traceability are essential to assuring the quality of any design or production process. At the heart of any process is the ability to measure and control. Without calibration and the meaning it gives to measurements of all types, it is difficult to assure that processes are well controlled and that end products meet their specifications. That is why a documented calibration program is a key part of all major quality standards, such as ISO 9000, Q9000 and the FDA GMPs.

## 3. What must be done prior to calibration?

First of all, note down the specifications of the instrument. The following terminology is used in specifying instruments (e.g., a multimeter).

- Number of Digits & Overrange: The “number of digits” specification is the most fundamental, and sometimes the most confusing, characteristic of a multimeter. The number of digits is equal to the maximum number of “9’s” the multimeter can measure or display; this indicates the number of full digits. Most multimeters also have the ability to overrange, which adds a partial or “½” digit.

For example, the HP 34401A multimeter can measure 9.99999 Vdc on the 10 V range. This represents 6 full digits of resolution. The multimeter can also overrange on the 10 V range and measure up to a maximum of 12.00000 Vdc. This corresponds to a 6½-digit measurement with 20% overrange capability.
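The digits-and-overrange arithmetic above can be sketched in a few lines of Python; the helper names are illustrative, not from any instrument library:

```python
def smallest_step(range_v: float, full_digits: int) -> float:
    """Smallest displayable increment on a range, given the full digit count."""
    return range_v / 10 ** full_digits

def max_reading(range_v: float, overrange_pct: float) -> float:
    """Maximum measurable value on a range with the given overrange."""
    return range_v * (1 + overrange_pct / 100)

print(10 - smallest_step(10, 6))  # 9.99999 -> full 6-digit reading on the 10 V range
print(max_reading(10, 20))        # 12.0    -> 20% overrange maximum
```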

- Sensitivity: Sensitivity is the minimum level that the multimeter can detect for a given measurement. Sensitivity defines the ability of the multimeter to respond to small changes in the input level.

For example, suppose you are monitoring a 1 mVdc signal and you want to adjust the level to within 1 µV. To be able to respond to an adjustment this small, this measurement would require a multimeter with a sensitivity of at least 1 µV. You could use a 6½-digit multimeter if it has a 1 Vdc or smaller range. You could also use a 4½-digit multimeter with a 10 mVdc range.

- Resolution: Resolution is the numeric ratio of the maximum displayed value divided by the minimum displayed value on a selected range. Resolution is often expressed in percent, parts-per-million (ppm), counts, or bits.

For example, a 6½-digit multimeter with 20% overrange capability can display a measurement with up to 1,200,000 counts of resolution. This corresponds to about 0.0001% (1 ppm) of full scale, or 21 bits including the sign bit. All four specifications are equivalent.
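As a rough sketch (assumed helper name, not a standard function), the equivalence between counts, ppm of full scale, and bits can be computed directly:

```python
import math

def resolution(full_digits: int, overrange_pct: float):
    """Counts of resolution, one count as ppm of full scale, and bits."""
    counts = round(10 ** full_digits * (1 + overrange_pct / 100))
    ppm = 1e6 / counts                    # one count, as ppm of full scale
    bits = math.ceil(math.log2(counts))   # ~21 bits for 1,200,000 counts
    return counts, ppm, bits

counts, ppm, bits = resolution(6, 20)
print(counts, bits)  # 1200000 21
```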

- Accuracy: Accuracy is a measure of the “exactness” to which the multimeter’s measurement uncertainty can be determined *relative* to the calibration reference used. Absolute accuracy includes the multimeter’s relative accuracy specification plus the known error of the calibration reference relative to national standards. To be meaningful, the accuracy specifications must be accompanied by the conditions under which they are valid. These conditions should include temperature, humidity and time.

A) Transfer accuracy: refers to the error introduced by the multimeter due to noise and short term drift. The error becomes apparent when comparing two nearly-equal signals for the purpose of “transferring” the known accuracy of one device to the other.

B) 24-Hour Accuracy: indicates the multimeter’s relative accuracy over its full measurement range for a short time interval and within a stable environment. Short-term accuracy is usually specified for a 24-hour period and for a ±1 °C temperature range.

C) 90-Day & 1-Year Accuracy: these long-term accuracy specifications are valid for a 23 °C ± 5 °C temperature range. These specifications include the initial calibration errors plus the multimeter’s long-term drift errors.

- Temperature Coefficients: Accuracy is usually specified for a 23 °C ± 5 °C temperature range. This is a common range for many operating environments. You must add additional temperature coefficient errors to the accuracy specification if you are operating the multimeter outside the 23 °C ± 5 °C range (the specification is per °C).
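A minimal sketch of how such a per-degree error might be applied outside the specified band; the function name and the example tempco figures are assumptions for illustration, not values from any datasheet:

```python
def tempco_error_v(reading_v, range_v, tc_reading_pct, tc_range_pct,
                   ambient_c, band_lo=18.0, band_hi=28.0):
    """Extra error in volts: (tempco % of reading + tempco % of range)
    applied once per degree C outside the 23 C +/- 5 C band."""
    if band_lo <= ambient_c <= band_hi:
        return 0.0  # inside the specified band: no additional error
    degrees_out = band_lo - ambient_c if ambient_c < band_lo else ambient_c - band_hi
    per_degree_v = reading_v * tc_reading_pct / 100 + range_v * tc_range_pct / 100
    return degrees_out * per_degree_v

# Hypothetical tempco of (0.0005% of reading + 0.0001% of range) per degree C:
print(tempco_error_v(10, 10, 0.0005, 0.0001, 23))  # 0.0 (inside the band)
```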

## 4. How do you calculate total measurement error?

The multimeter’s accuracy specifications are expressed in the form: ±(% of reading + % of range). In addition to the reading error and range error, you may need to add extra error terms for certain operating conditions.

If you are operating the multimeter outside the specified 23 °C ± 5 °C temperature range, apply an additional *temperature coefficient error*.

For DC voltage, DC current and resistance measurements, you may need to apply an additional *reading speed error or auto zero OFF error.*

For AC voltage and AC current measurements, you may need to apply an additional *low frequency error or crest factor error*.

“% of reading” error: The reading error compensates for inaccuracies that result from the function and range you select, as well as the input signal level. The reading error varies with the input level on the selected range and is expressed as a percent of reading. The following table shows the reading error applied to the multimeter’s 24-hour dc voltage specification.

| Range | Input level | % of reading error (specified by manufacturer) | Reading error voltage |
|-------|-------------|-----------------------------------------------|-----------------------|
| 10 Vdc | 10 Vdc | 0.0015 | 150 µV |
| 10 Vdc | 1 Vdc | 0.0015 | 15 µV |
| 10 Vdc | 0.1 Vdc | 0.0015 | 1.5 µV |

Reading error = 0.0015% × 10 Vdc = 0.000150 V (i.e. 150 µV)

0.0015% × 1 Vdc = 0.000015 V (i.e. 15 µV)

0.0015% × 0.1 Vdc = 0.0000015 V (i.e. 1.5 µV)
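The rows of the table above can be reproduced with a one-line calculation (helper name assumed):

```python
def reading_error_uv(input_v: float, pct_of_reading: float) -> float:
    """Reading error in microvolts: the %-of-reading spec times the input level."""
    return input_v * pct_of_reading / 100 * 1e6

for level in (10.0, 1.0, 0.1):
    print(f"{level} Vdc input -> {reading_error_uv(level, 0.0015):.4g} uV")
```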

“% of range” error: The range error compensates for inaccuracies that result from the function and range you select. The range error contributes a constant error, expressed as a percentage of range, independent of the input signal. The following table shows the range error applied to the multimeter’s 24-hour dc voltage specification.

| Range | Input level | % of range error (specified by manufacturer) | Range error voltage |
|-------|-------------|----------------------------------------------|---------------------|
| 10 Vdc | 10 Vdc | 0.0004 | 40 µV |
| 10 Vdc | 1 Vdc | 0.0004 | 40 µV |
| 10 Vdc | 0.1 Vdc | 0.0004 | 40 µV |

Range error = 0.0004% × 10 Vdc = 0.000040 V (i.e. 40 µV), and is the same for every input level on the range.

“Total Measurement Error”: To compute the total measurement error, add the reading error and the range error. You can then convert the total measurement error to a “percent of input” error or a “ppm (parts-per-million) of input” error as shown below.

Error example: Assume that a 5 Vdc signal is input to the multimeter on the 10 Vdc range. Compute the total measurement error using the 90-day accuracy specification: ±(0.0020% of reading + 0.0005% of range).

Reading error = 0.0020% × 5 Vdc = 100 µV

Range error = 0.0005% × 10 Vdc = 50 µV

Total error = 100 µV + 50 µV = 150 µV

= 30 ppm of 5 Vdc

(1 ppm = 10⁻⁶ = 1/1,000,000)
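The worked example above can be checked with a short sketch (function name assumed):

```python
def total_error(input_v, range_v, pct_reading, pct_range):
    """Total error in volts plus its ppm-of-input equivalent."""
    reading_err = input_v * pct_reading / 100   # % of reading -> volts
    range_err = range_v * pct_range / 100       # % of range -> volts
    total_v = reading_err + range_err
    ppm_of_input = total_v / input_v * 1e6      # express as ppm of the input
    return total_v, ppm_of_input

# 90-day spec: +/-(0.0020% of reading + 0.0005% of range), 5 Vdc on the 10 Vdc range
total_v, ppm = total_error(5.0, 10.0, 0.0020, 0.0005)
print(f"{total_v * 1e6:.0f} uV = {ppm:.0f} ppm of 5 Vdc")  # 150 uV = 30 ppm of 5 Vdc
```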