Think You’ve Got Problems? Try Calibrating a Precision Temperature Sensor


The challenge of calibrating sensors for real-world physical variables is a mixed one. For some sensors, it’s not too difficult to provide a known stimulus to the sensor; for other sensors, doing so is a major problem.

Let’s look at an easier one first: a linear variable differential transformer (LVDT) that accurately measures linear displacement (position) over a range from as little as one centimeter (cm) to perhaps 25 cm, depending on the model. For example, the 02560389-000 LVDT from TE Connectivity Measurement Specialties provides linear displacement measurement over two inches (50.8 millimeters) with 0.25% linearity over the entire stroke range.

 

Figure 1: The model 02560389-000 LVDT from TE Connectivity Measurement Specialties provides accurate position readings over a two-inch range, with 0.25% linearity. (Image source: TE Connectivity Measurement Specialties)
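As a quick back-of-envelope check of what that linearity spec means in absolute terms, here is a rough Python sketch, assuming the 0.25% figure is referenced to the full two-inch stroke, as is typical for such specs:

# Rough check of what a 0.25% linearity spec means in absolute position error,
# assuming the spec is referenced to the full two-inch stroke.
FULL_STROKE_IN = 2.0      # full measurement range, inches
LINEARITY_PCT = 0.25      # nonlinearity, percent of full scale

worst_case_in = FULL_STROKE_IN * LINEARITY_PCT / 100.0
worst_case_mm = worst_case_in * 25.4
print(f"Worst-case nonlinearity: {worst_case_in:.3f} in ({worst_case_mm:.3f} mm)")
# -> Worst-case nonlinearity: 0.005 in (0.127 mm)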

To calibrate the associated analog front-end (AFE) electronics, you can use a precise signal from an instrument such as a ratio transformer, which was developed about 100 years ago and is still used today (Figure 2).

 

Figure 2: This classic-style ratio transformer is used to simulate the LVDT output versus position when calibrating the performance of the sensor’s analog interface circuitry. (Image source: Tegam Inc.)

However, using the ratio transformer doesn’t test the LVDT itself. To do that, you can attach a strain gage extensometer, digital mechanical caliper, or optical caliper to the LVDT, then measure its output at specific benchmark position settings.
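One common way to reduce those benchmark readings to a single linearity number is to fit a best straight line through the reference-position-versus-LVDT points and take the worst deviation as a percentage of full scale. A minimal Python sketch, with purely illustrative numbers rather than measured data:

# Hypothetical end-to-end LVDT linearity check: compare LVDT readings against
# reference positions from a caliper or extensometer, fit a best straight line,
# and report the worst deviation as a percent of full scale.
import numpy as np

# Reference positions (inches, from the caliper) and the LVDT output at each
# benchmark -- illustrative numbers only, not measured data.
ref_pos = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
lvdt_out = np.array([0.000, 0.501, 1.002, 1.499, 2.000])

# Best-fit straight line (slope + offset) through the benchmark points
slope, offset = np.polyfit(ref_pos, lvdt_out, 1)
residuals = lvdt_out - (slope * ref_pos + offset)

full_scale = ref_pos.max() - ref_pos.min()
nonlinearity_pct = 100.0 * np.max(np.abs(residuals)) / full_scale
print(f"Best-fit nonlinearity: {nonlinearity_pct:.3f}% of full scale")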

But what about temperature sensor calibration? Again, it’s fairly easy to create an electrical signal that accurately simulates the nonlinear output of the temperature sensor and check its AFE, but how do you check the temperature sensor itself when you are looking for accuracy to a fraction of a degree? Most standard temperature sensors such as resistance temperature detectors (RTDs), thermistors, solid-state devices, and thermocouples are good “out of the box” to about 1°C to 2°C, but when you get to tenths of a degree absolute accuracy (not the same as resolution, of course), that’s another story.
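For the “simulate the sensor electrically” part, the expected output is well defined. For a platinum RTD, for instance, the standard IEC 60751 Callendar-Van Dusen equation gives the ideal Pt100 resistance at any temperature, which can then be dialed into a precision resistance source to exercise the AFE. A minimal sketch using the standard coefficients (the resistance-source step itself is assumed, not shown):

# Simulate-the-sensor step for an RTD AFE: compute the Pt100 resistance
# expected at a given temperature using the standard IEC 60751
# Callendar-Van Dusen coefficients.
R0 = 100.0          # ohms at 0 degrees C (Pt100)
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12      # only used below 0 degrees C

def pt100_resistance(t_c: float) -> float:
    """Ideal Pt100 resistance (ohms) at temperature t_c in degrees C."""
    if t_c >= 0.0:
        return R0 * (1.0 + A * t_c + B * t_c ** 2)
    return R0 * (1.0 + A * t_c + B * t_c ** 2 + C * (t_c - 100.0) * t_c ** 3)

for t in (-50.0, 0.0, 25.0, 100.0):
    print(f"{t:7.1f} C -> {pt100_resistance(t):8.3f} ohms")
# 0 C -> 100.000 ohms, 100 C -> ~138.51 ohms (the familiar Pt100 values)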

The reality is that you can’t simply set up a basic heater, measure its temperature using a higher accuracy reference system, and then substitute the sensor under evaluation in the same setup. There are just too many ways the details of how you do it can corrupt the comparison. For this reason, users of high-accuracy temperature sensors can:

1) Send the sensor out to a lab such as Ellab A/S, which has the necessary setups, or buy a test setup from a vendor such as Fluke Corp. for in-house use, or

2) Buy a temperature sensor that comes fully calibrated with NIST-traceable documentation from one of the many suppliers of these “better” units.

What if you need to achieve absolute accuracy to 0.1°C, or 0.01°C, or even better than 0.01°C? Perhaps it is hard to believe, but it can be done. Researchers at the National Institute of Standards and Technology (NIST) have developed a thermal-infrared radiation thermometer (TIRT) for the -50°C (-58°F) to 150°C (302°F) range (corresponding to infrared wavelengths between eight and fourteen micrometers), which can measure temperatures with a precision of a few thousandths of a degree Celsius. Even better, it does not require cryogenic cooling, as many other high-performance infrared temperature sensors do.
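To get a feel for why millikelvin-level radiometry in that band is even plausible (this is only an illustrative blackbody estimate, not a description of the NIST instrument), you can integrate Planck’s law over the eight to fourteen micrometer band and see how much the in-band radiance changes per millikelvin:

# Illustrative estimate: integrate Planck's law over the 8-14 micrometer band
# for an ideal blackbody and see how much the in-band radiance changes per mK.
import numpy as np

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def band_radiance(temp_k: float, lam_lo=8e-6, lam_hi=14e-6, n=2000) -> float:
    """Blackbody radiance (W/m^2/sr) integrated over [lam_lo, lam_hi]."""
    lam = np.linspace(lam_lo, lam_hi, n)
    spectral = (2 * H * C**2 / lam**5) / np.expm1(H * C / (lam * KB * temp_k))
    return np.sum((spectral[1:] + spectral[:-1]) / 2 * np.diff(lam))

T = 300.0     # kelvin, about 27 degrees C
dT = 1e-3     # 1 mK step
rel_change = (band_radiance(T + dT) - band_radiance(T)) / band_radiance(T)
print(f"Fractional radiance change per mK near {T} K: {rel_change:.2e}")
# Expect a fractional change on the order of 1e-5 per mK near room temperature.

A relative signal change of roughly a part in 10^5 per millikelvin is measurable, but only with an extremely stable detector, optics, and electronics chain, which is exactly where the design tactics below come in.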

How did they achieve this level of performance? They used the three-part approach common in analog and sensor-related designs:

1) Choose the best, highest performance components available, “aging” them if needed to minimize long-term drift.

2) Employ a design topology that not only minimizes errors but also self-cancels them where possible, such as using matched resistors with identical temperature coefficients on the shared substrate of a differential or instrumentation amplifier (see the sketch after this list).

3) Minimize external sources of induced errors such as electromagnetic (EM) fields or ambient temperature changes.
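To see why item 2 matters, note that the gain of a difference amplifier depends on a resistor ratio; if both resistors sit on one substrate and share the same temperature coefficient, the ratio (and therefore the gain) barely moves with temperature. A small illustrative Python sketch, with made-up but typical tempco values rather than figures from any specific design:

# Error-cancellation idea from item 2: a gain set by a resistor *ratio* is
# insensitive to temperature if both resistors drift with the same tempco.
def gain_after_drift(rf, rin, tc_rf_ppm, tc_rin_ppm, delta_t_c):
    """Gain Rf/Rin after each resistor drifts with its own tempco (ppm/C)."""
    rf_hot = rf * (1 + tc_rf_ppm * 1e-6 * delta_t_c)
    rin_hot = rin * (1 + tc_rin_ppm * 1e-6 * delta_t_c)
    return rf_hot / rin_hot

nominal = 100e3 / 10e3                                    # gain of 10
matched = gain_after_drift(100e3, 10e3, 25, 25, 30)       # both 25 ppm/C
mismatched = gain_after_drift(100e3, 10e3, 25, 100, 30)   # 25 vs 100 ppm/C

print(f"matched:    {1e6 * (matched - nominal) / nominal:+.1f} ppm gain error")
print(f"mismatched: {1e6 * (mismatched - nominal) / nominal:+.1f} ppm gain error")
# Matched parts on one substrate track each other, so the ratio error is ~0;
# discrete resistors with different tempcos do not.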

My first introduction to this trio of tactics was when I read an article by Jim Williams, the late, legendary analog design genius, in a 1976 issue of EDN, “This 30-ppm scale proves that analog designs aren't dead yet.” This scale was designed to meet some very aggressive objectives: it had to be portable, low cost, resolve 0.01 pound over a 300.00-pound full-scale range (that’s the 30 parts per million), never require calibration or adjustment, and have an absolute accuracy within 0.02%. Despite the article’s age (more than 40 years!) and the many changes in components and technologies since it was written, the underlying lessons are still valid.

How did the people at NIST create their thermometer, called the Ambient-Radiation Thermometer (ART) (Figure 3)? The design is described in full detail in their paper with the very modest title “Improvements in the design of thermal infrared radiation thermometers and sensors” published in Optics Express from the Optical Society of America, as well as “Precise Temperature Measurements with Invisible Light” published by NIST.

 

Figure 3: In the NIST Ambient Radiation Thermometer, infrared (IR) light from a fixed-temperature calibrated source enters the thermometer enclosure through a lens (1) and makes its way to the detector output (6), which is routed to an amplifier that boosts the signal levels. (Image source: NIST)

Conclusion

Next time you wonder about the accuracy of your sensor-based readings, be sure to ask yourself: how much of the error is due to the sensor itself, and how much is due to the electronics? And how do you check each one independently?