What is calibration?

It underpins the infrastructure of our modern lives.

MCMC Addict
Jan 18, 2024

In our daily lives, we often measure our body weight directly with a scale and the temperature of a room with a wall thermometer. But a vast number of measurements are being made all the time by sensors, instruments, measuring systems and diagnostic equipment, most of which are invisible in our infrastructure. It is no exaggeration to say that the infrastructure of modern life is underpinned by a wide range of measurement systems whose quality is maintained by regular calibration.

What is calibration?

The International Vocabulary of Metrology (VIM) defines calibration as a two-step operation or process under specified conditions. The first step establishes a relationship between the indications (readings) of the instrument to be calibrated and the corresponding quantity values given by measurement standards. Both the quantity values and the indications should be associated with their measurement uncertainties (MUs). The second step uses this relation to obtain a measurement result from an indication of the instrument.

In some cases, the two-step process cannot be completed by a single laboratory; in such cases, the first step alone may be accepted as calibration, carried out by a calibration laboratory or a calibration unit within an organization. In fact, the current definition of calibration (VIM3) is an extension of the previous one-step definition (VIM2): in the past, the second step was usually considered to occur after calibration. By including the second step in the definition, however, calibration can directly demonstrate the metrological traceability of measurement results (expressed as measured values with their associated MUs) at the calibration site.

The relationship between the indications and the quantity values may be expressed by a statement, a calibration table, a calibration curve or a calibration function. In any case, the relationship should give a one-to-one correspondence between the indications and the quantity values over the range of use. In addition, the relationship should be monotonic, at least over that range. Sometimes the relationship is expressed as a correction factor, added to or multiplied by each indication to compensate for a systematic effect. The factor may differ between ranges of indications or be the same over the whole range; in either case, the factor should have its own associated MU.
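As a small sketch of how a multiplicative correction factor and its uncertainty might be applied in practice (the numbers are purely illustrative, and the first-order, uncorrelated uncertainty propagation is an assumption of this sketch, not part of the VIM definition):

```python
import math

def corrected_value(indication, factor, u_indication, u_factor):
    """Apply a multiplicative correction factor to an indication and
    propagate the standard uncertainties (assumed uncorrelated, first order)."""
    value = factor * indication
    # For a product, relative standard uncertainties add in quadrature.
    u_rel = math.sqrt((u_indication / indication) ** 2
                      + (u_factor / factor) ** 2)
    return value, value * u_rel

# Purely illustrative numbers: an indication of 9.980 corrected by a
# factor of 1.0015 whose standard uncertainty is 0.0004.
value, u = corrected_value(9.980, 1.0015, u_indication=0.002, u_factor=0.0004)
```

The corrected value carries an MU that combines the contribution of the indication itself with that of the correction factor, which is why the factor must be stated with its own uncertainty.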

Resistance measurement bridge for standard platinum resistance thermometers

A calibration case: platinum resistance thermometers

For example, consider the calibration of platinum resistance thermometers with a nominal resistance of 100 Ω, known as Pt 100 sensors, which are used in industry for accurate temperature measurement. In general, the resistance of a metal such as platinum increases with temperature. The resistance R of the sensor as a function of the Celsius temperature t is well described over the range 0 °C to 800 °C by the following empirical equation:

R(t) = R_0 [1 + A·t + B·t² + C·(t − 100 °C)·t³], with C = 0 for t ≥ 0 °C,

where R_0, A, B and C are calibration coefficients to be determined by calibration.

In the first step of its calibration, the sensor is placed in a stable environment whose temperatures are measured precisely and accurately as (t_1, t_2, …, t_n) using a reference thermometer, while the corresponding resistances of the sensor are measured as (R_1, R_2, …, R_n) using a resistance bridge. At each point, the temperature–resistance pair (t_i, R_i) are the quantity values with their units, °C and Ω. If n is greater than or equal to the number of coefficients (3 for the range above 0 °C), all the coefficients can be determined, fixing the functional relationship. Solving the quadratic equation then makes it possible to determine the temperature of an environment by measuring the corresponding resistance (inverse mapping). Including the second step in the calibration means that, immediately after the first step, we should be able to confirm that the temperature determined from a resistance agrees with that measured by the reference thermometer within the associated uncertainty. The second step allows the inverse mapping to be checked for inversion accuracy and for monotonicity over the range.
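The two steps can be sketched in code. The fit below uses noise-free data simulated with nominal IEC 60751 coefficients; both the data and the simple unweighted least-squares fit are illustrative assumptions, not the procedure of any particular laboratory:

```python
import numpy as np

# Hypothetical calibration data for a Pt 100 above 0 °C, where the model
# reduces to R(t) = R_0 * (1 + A*t + B*t**2). Readings are simulated
# (noise-free) with nominal IEC 60751 coefficients for illustration.
R0_true, A_true, B_true = 100.0, 3.9083e-3, -5.775e-7
t_ref = np.array([0.0, 100.0, 200.0, 300.0, 400.0])          # °C
R_meas = R0_true * (1 + A_true * t_ref + B_true * t_ref**2)  # Ω

# First step: determine the coefficients by least squares.
# R = c0 + c1*t + c2*t^2 with c0 = R_0, c1 = R_0*A, c2 = R_0*B.
M = np.vstack([np.ones_like(t_ref), t_ref, t_ref**2]).T
c0, c1, c2 = np.linalg.lstsq(M, R_meas, rcond=None)[0]
R0, A, B = c0, c1 / c0, c2 / c0

# Second step (inverse mapping): solve R_0*B*t^2 + R_0*A*t + (R_0 - R) = 0
# for t and keep the physically meaningful root.
def temperature_from_resistance(R):
    disc = (R0 * A) ** 2 - 4 * R0 * B * (R0 - R)
    return (-R0 * A + np.sqrt(disc)) / (2 * R0 * B)
```

With noise-free data the inverse mapping reproduces the reference temperatures, which is exactly the agreement that the second step is meant to confirm within the associated uncertainty.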

The difference between verification and adjustment

Calibration is frequently confused with verification and adjustment. Verification is a process that ensures that specified requirements of a measuring system, such as maximum permissible errors, are met; it requires calibration. Adjustment is a set of operations performed on a measuring system so that it produces prescribed indications corresponding to the applied quantity values, which are typically derived from measurement standards. Calibration is a prerequisite for both verification and adjustment. However, verification does not necessarily entail adjustment, which is carried out differently depending on an organization's quality assurance policy.

In the case of the Pt 100 sensors mentioned above, there are four accuracy classes (AA, A, B, C) in IEC 60751:2022, defined by tolerance. Pt 100 sensors can be calibrated to verify that they meet the specification of a class; the sensors themselves cannot be adjusted. The classification allows us to ensure that an indicator can support a particular class of sensor. The indicator itself can be calibrated and adjusted by applying a settable reference resistance instead of connecting a Pt 100 sensor.
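A minimal sketch of the verification decision, using the tolerance formulas commonly quoted for the IEC 60751 accuracy classes (the exact coefficients should be checked against the current edition of the standard):

```python
# Tolerance formulas commonly quoted for the IEC 60751 accuracy classes,
# in °C as a function of temperature t (°C). Treat the coefficients as
# illustrative and check them against the current edition of the standard.
TOLERANCE = {
    "AA": lambda t: 0.10 + 0.0017 * abs(t),
    "A":  lambda t: 0.15 + 0.0020 * abs(t),
    "B":  lambda t: 0.30 + 0.0050 * abs(t),
    "C":  lambda t: 0.60 + 0.0100 * abs(t),
}

def verify(t, error, accuracy_class):
    """Verification: is the calibration error at t within the class tolerance?"""
    return abs(error) <= TOLERANCE[accuracy_class](t)
```

Note how verification presupposes calibration: the error fed into the decision is exactly the deviation that a calibration determines.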

Why should we regularly calibrate?

Immediately after an instrument has been calibrated, the user has a high degree of confidence that the measurement results are correct, or at least within specification. However, as the instrument is subjected to stresses during use and storage, confidence that it still meets its specifications decreases. When confidence falls to an unacceptable level, the user is pressured to recalibrate the instrument.

When the user recalibrates the instrument to mitigate this uncertainty growth, they learn whether the instrument is really within specification. If it is out of specification, the cause is a physical aspect of the instrument's uncertainty growth, which I will call 'measurement uncertainty growth (MUG)'. If it is within specification, the earlier loss of confidence was only psychological. MUG is known to be caused by any stress on the instrument, such as storage in a rapidly changing environment, excessive use, damage, and even ageing.

Measurement uncertainty growth (the right-hand side shows the drift of the peak and the increase in the width of the distributions taken from calibrations of 134 Zener voltage standards [Ref. 3])

The MUG can be expressed as changes in probability distributions, manifested in the drift of the peaks and the increase in the width of the distributions over time, as shown in the figures above. The probability of an instrument being out of specification (the shaded area on the distribution curves on the left) therefore increases over time. In practice, the calibration history of many instruments of the same model makes it possible to construct probability distributions that show similar behaviour over time, as in the right-hand figure, which is drawn from my analysis of the calibration history of 134 Zener voltage standards, a secondary reference standard in DC voltage measurement [Ref. 3]. This argument suggests that regular calibration is essential to keep measuring systems within specification.
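A toy model can make this concrete: assume the instrument's error is normally distributed, with a peak that drifts and a width that grows linearly in time (both rates are invented for illustration), and compute the probability of falling outside a ±tolerance band:

```python
import math

def out_of_spec_probability(drift, sigma, tol):
    """Probability that a normally distributed error with mean `drift` and
    standard deviation `sigma` falls outside the band ±tol (both tails)."""
    phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return 1.0 - (phi((tol - drift) / sigma) - phi((-tol - drift) / sigma))

# Invented drift and width-growth rates, in the same (arbitrary) unit as
# the tolerance band; only the qualitative behaviour matters here.
probabilities = [
    out_of_spec_probability(drift=0.02 * year, sigma=0.05 + 0.01 * year, tol=0.15)
    for year in range(4)
]
# The out-of-specification probability grows monotonically with time,
# which is what motivates a finite recalibration interval.
```

In this sketch the shaded-area probability is small right after calibration and grows steadily as the peak drifts and the distribution widens, mirroring the behaviour of the figures above.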

Summary

Calibration is one of the most essential activities supporting our modern infrastructure. What calibration is, as defined in VIM3, has been explained here using the example of a Pt 100 temperature sensor. Even after calibration, instruments must be recalibrated to keep them within specification, because the measurement uncertainty grows over time. This is one reason why we should calibrate regularly. A natural next topic is the calibration interval, which I will take up in the following article.

Reference

  1. The International Vocabulary of Metrology (VIM)
  2. Establishment and Adjustment of Calibration Intervals (NCSLI RP-1, 2010)
  3. H. -S. Shim and S. -N. Park, IEEE Transactions on Instrumentation and Measurement, vol. 71, pp. 1–10 (2022)

Metrological traceability and measurement uncertainty are essential concepts that will be discussed in future articles.
