RTD Sensor Calibration

RTD temperature sensors are popular, highly accurate temperature measuring devices used in a wide variety of industries. RTD stands for “Resistance Temperature Detector,” because the sensor’s output resistance changes as the temperature changes. By measuring the RTD’s resistance, the temperature can be determined accurately.

The highest-accuracy RTD sensors are made from platinum, due to its nearly linear temperature-resistance relationship. RTD sensors made of platinum are called PRTs, “Platinum Resistance Thermometers.” The most common PRT sensor used in the process industry has a resistance of 100 ohms at a temperature of 0 °C (32 °F) and a mean temperature coefficient of 0.00385 Ω/Ω/°C between 0 and 100 °C, which corresponds to a change of roughly 0.385 Ω per °C for a 100 Ω sensor. RTD sensors can also have other temperature coefficients, such as 0.00392 Ω/Ω/°C, or be made from other metals such as copper or nickel alloys.
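
For a standard “385” platinum sensor, the temperature-resistance curve is described by the Callendar-Van Dusen equation of IEC 60751. The short sketch below (Python is used here purely for illustration; the article itself contains no code) computes a Pt100’s resistance at a given temperature using those published coefficients.

```python
# Sketch: Pt100 resistance from temperature using the IEC 60751
# Callendar-Van Dusen equation ("385" curve). R0, A, B and C are the
# standard published coefficients; the function name is illustrative.

R0 = 100.0          # resistance in ohms at 0 degC
A = 3.9083e-3       # 1/degC
B = -5.775e-7       # 1/degC^2
C = -4.183e-12      # 1/degC^4 (used only below 0 degC)

def pt100_resistance(t_c: float) -> float:
    """Resistance in ohms of a Pt100 at temperature t_c (degC)."""
    if t_c >= 0.0:
        return R0 * (1.0 + A * t_c + B * t_c**2)
    return R0 * (1.0 + A * t_c + B * t_c**2 + C * (t_c - 100.0) * t_c**3)

print(pt100_resistance(0.0))    # 100.0 ohms
print(pt100_resistance(100.0))  # about 138.5 ohms
```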

To maintain their high accuracy, RTD sensors require regular calibration, because the sensor can drift when subjected to environmental conditions such as extreme temperatures or excessive vibration. Since RTDs measure very accurately, even a slight drift can create a significant measurement error.

RTD Sensor Measurement

The output resistance of the RTD must be measured to obtain the temperature value. The resistance can be measured in ohms and then converted into temperature using a conversion table or a calculation spreadsheet based on the temperature-resistance curve of that RTD type. Alternatively, a temperature measurement device can be used that automatically converts the measured resistance into a temperature reading; a simple conversion calculation is sketched below.
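
As a minimal sketch of such a conversion, assuming a standard “385” Pt100 and temperatures at or above 0 °C, the quadratic part of the Callendar-Van Dusen equation can be solved directly for temperature:

```python
import math

# Sketch: convert a measured Pt100 resistance into temperature by solving
# the quadratic Callendar-Van Dusen equation for t (valid for t >= 0 degC).
# Coefficients are the standard IEC 60751 "385" values.

R0 = 100.0
A = 3.9083e-3
B = -5.775e-7

def pt100_temperature(r_ohm: float) -> float:
    """Temperature in degC for a measured resistance r_ohm (>= R0)."""
    # Solve R = R0*(1 + A*t + B*t^2) for t with the quadratic formula.
    return (-A + math.sqrt(A**2 - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

print(pt100_temperature(100.0))   # 0.0 degC
print(pt100_temperature(138.51))  # about 100 degC
```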

Most measurement devices determine the sensor’s resistance by sending a low current through the sensor and measuring the voltage produced across it. Using Ohm’s law, the resistance is then calculated by dividing the measured voltage by the value of the current source. When the resistance is measured with only two wires, the voltage is measured over the same wires that carry the current, which produces an error because the resistance of the wires and any connections carrying the current is included in the measurement. This is why most RTDs on the market are available in a 3-wire or 4-wire configuration, which eliminates that error from the measurement.
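
The sketch below illustrates the lead-wire error of a 2-wire hookup using Ohm’s law; the excitation current and lead resistance values are assumptions chosen only for illustration.

```python
# Sketch: Ohm's law measurement and the lead-wire error of a 2-wire hookup.
# The excitation current and lead resistance values are illustrative assumptions.

I_EXC = 1.0e-3          # excitation current in amperes (1 mA is typical)
R_SENSOR = 100.0        # true sensor resistance in ohms (Pt100 at 0 degC)
R_LEAD = 0.25           # resistance of ONE lead wire in ohms (assumed)

# 2-wire: the measured voltage includes the drop across both lead wires.
v_2wire = I_EXC * (R_SENSOR + 2.0 * R_LEAD)
r_2wire = v_2wire / I_EXC              # 100.5 ohms - includes the lead error

# 4-wire: separate sense wires carry essentially no current, so only the
# sensor's own voltage drop is measured.
v_4wire = I_EXC * R_SENSOR
r_4wire = v_4wire / I_EXC              # 100.0 ohms

# Approximate temperature error: a Pt100 changes about 0.385 ohm per degC.
error_degc = (r_2wire - r_4wire) / 0.385
print(f"2-wire error: {r_2wire - r_4wire:.2f} ohm = about {error_degc:.1f} degC")
```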

RTD Accuracy and Classification

RTD probes have different classifications depending on their tolerances. Typically, there are four or five accuracy classes, which indicate how far a probe may deviate from its nominal temperature-resistance curve. The classes are important to ensure the interchangeability of similar probes. The main classes are Class B, Class A, Class AA, 1/3 DIN, and 1/10 DIN. Class B is the classification used for most manufacturing and industrial temperature measurement systems. The 1/10 DIN classification is the most accurate and is used in higher-accuracy quality and calibration laboratories. The applicable tolerances range from Class B, which equals ±(0.3 + 0.005*|t|) °C, or ±0.30 °C at 0 °C, to the 1/10 DIN classification, which is ±1/10*(0.3 + 0.005*|t|) °C, or ±0.03 °C at 0 °C, where t is the temperature in °C.
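
The sketch below calculates the tolerance of each class mentioned above at a given temperature. The Class B and 1/10 DIN formulas come from the text; the Class A and Class AA formulas are the standard IEC 60751 values, and 1/3 DIN is assumed to be one third of the Class B tolerance.

```python
# Sketch: tolerance of a Pt100 probe at temperature t for the accuracy
# classes mentioned above. Class B and 1/10 DIN follow the text; Class A
# and Class AA use the standard IEC 60751 formulas; 1/3 DIN is taken as
# one third of the Class B tolerance.

def rtd_tolerance_degc(t_c: float, cls: str) -> float:
    """Allowed deviation (+/- degC) from the nominal curve at t_c."""
    t = abs(t_c)
    class_b = 0.30 + 0.005 * t
    tolerances = {
        "B": class_b,
        "A": 0.15 + 0.002 * t,
        "AA": 0.10 + 0.0017 * t,
        "1/3 DIN": class_b / 3.0,
        "1/10 DIN": class_b / 10.0,
    }
    return tolerances[cls]

for cls in ("B", "A", "AA", "1/3 DIN", "1/10 DIN"):
    print(f"{cls:>8}: +/-{rtd_tolerance_degc(0.0, cls):.3f} degC at 0 degC")
```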

RTD Sensor Calibration Process

Typically, there are two ways RTDs can be calibrated: by verifying that the sensor stays within the appropriate tolerance class when used with a digital readout, or by determining the sensor’s resistance at several temperature points and calculating coefficients that define the temperature-resistance curve of that individual sensor. RTD sensors cannot be adjusted to match the nominal curve, so any adjustment must be made in the digital readout used for the resistance measurement, or the coefficients must be entered into the device before the sensor is used. Using the coefficients to correct the sensor measurement is the most accurate way to obtain the sensor’s reading; a sketch of such a coefficient fit follows below.
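
As a rough sketch of the second approach, the temperature-resistance pairs below are made-up example values, and a simple least-squares fit is used to determine sensor-specific, Callendar-Van Dusen style coefficients that a readout could then apply:

```python
import numpy as np

# Sketch of the coefficient approach: fit sensor-specific coefficients from
# calibration points. The (temperature, resistance) pairs below are invented
# example data; a real calibration would use measured reference values.
t_ref = np.array([0.0, 50.0, 100.0, 150.0, 200.0])          # degC (reference)
r_meas = np.array([99.98, 119.38, 138.46, 157.26, 175.78])  # ohms (measured)

# Model for t >= 0: R(t) = R0 * (1 + A*t + B*t^2). This is linear in the
# unknowns (R0, R0*A, R0*B), so an ordinary least-squares fit works.
X = np.column_stack([np.ones_like(t_ref), t_ref, t_ref**2])
coef, *_ = np.linalg.lstsq(X, r_meas, rcond=None)

R0_fit = coef[0]
A_fit = coef[1] / R0_fit
B_fit = coef[2] / R0_fit
print(f"R0 = {R0_fit:.4f} ohm, A = {A_fit:.6e}, B = {B_fit:.6e}")

# These sensor-specific coefficients can then be entered into the readout
# so it converts resistance to temperature using this sensor's own curve.
```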

Calibrations are typically performed by comparing the sensor’s output with a more accurate reference RTD placed in the same temperature medium, such as a water or oil bath or a dry-block calibrator, or against a physical intrinsic standard such as fixed-point cells.

When calibrating RTD sensors, four or five different temperatures are checked, depending on the range of the sensor. An ice point check or triple point of water check at 0 °C is always required. This point is also measured several times during the calibration to monitor the stability and repeatability of the sensor. As with all temperature measurements, ample time must be allowed for the sensors to reach equilibrium at the desired temperature before any measurements are taken.
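
As a small illustration of how the repeated zero-point checks might be summarized, the readings below are invented example values; the spread of the converted temperatures gives a quick picture of the sensor’s repeatability:

```python
import statistics

# Sketch: summarizing repeated ice-point (0 degC) checks taken during a
# calibration run. The readings below are illustrative, not real data.
ice_point_ohms = [100.004, 100.006, 100.003, 100.007, 100.005]

# A Pt100 changes about 0.385 ohm per degC near 0 degC, so small resistance
# offsets convert directly into temperature equivalents.
ice_point_degc = [(r - 100.0) / 0.385 for r in ice_point_ohms]

mean_degc = statistics.mean(ice_point_degc)
spread_degc = statistics.stdev(ice_point_degc)
print(f"zero-point offset: {mean_degc:+.4f} degC, "
      f"repeatability (1 sigma): {spread_degc:.4f} degC")
```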
