TC Ltd for Temperature Measurement & Control

RTD Self-Heating: Causes, Effects, and Minimization

When using a Resistance Temperature Detector (RTD), particularly platinum RTDs like the Pt100, it’s essential to understand the phenomenon of self-heating—an unintended rise in the sensor’s temperature caused by the very act of measuring it.

What Is RTD Self-Heating?

To measure resistance, an electrical current must pass through the RTD element. However, this current generates heat within the sensor due to electrical power dissipation, slightly raising the sensor’s temperature above that of its environment. This introduces a measurement error, as the RTD is no longer in perfect thermal equilibrium with the medium it’s intended to sense.

Formula for Heat Generation in RTDs

Power (P) = I² × R

Where:

  • I is the current through the RTD
  • R is the resistance of the RTD
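As a minimal sketch of the formula above (the function name and the example currents are illustrative, not from a specific datasheet), the dissipated power for a nominal 100 Ω Pt100 works out to:

```python
# Sketch: power dissipated in an RTD element by the measurement current.
# P = I^2 * R, with I in amperes and R in ohms, giving P in watts.

def rtd_power(current_a: float, resistance_ohm: float) -> float:
    """Return the power (W) dissipated in the RTD by the excitation current."""
    return current_a ** 2 * resistance_ohm

# A Pt100 has a nominal resistance of 100 ohms at 0 °C.
p_1ma = rtd_power(1e-3, 100.0)   # 1 mA excitation
p_3ma = rtd_power(3e-3, 100.0)   # 3 mA excitation

print(f"1 mA: {p_1ma * 1e6:.0f} µW")  # 100 µW
print(f"3 mA: {p_3ma * 1e6:.0f} µW")  # 900 µW
```

Note that tripling the current from 1 mA to 3 mA increases the dissipated power ninefold, which is why the choice of excitation current matters so much.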

How Significant Is the Self-Heating Error?

The degree of self-heating depends on the RTD’s operating environment, construction, and the applied current. Here are some practical examples for a standard Pt100 sensor:

Environment                 Current   Self-Heating Effect
Water at 0 °C (ice point)   1 mA      ~20 millikelvin (mK)
Still air                   1 mA      ~50 mK
Still air                   3 mA      ~0.5 K (500 mK)
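Sensor datasheets often express this behaviour as a self-heating coefficient S in K/mW, where the temperature rise is simply S multiplied by the dissipated power. The sketch below back-calculates S ≈ 0.5 K/mW from the still-air figures in the table; treat both the function and that coefficient as illustrative assumptions, not specified values.

```python
# Sketch: estimate the self-heating error from a self-heating coefficient.
# Error (K) = dissipated power (mW) x coefficient S (K/mW).

def self_heating_error_k(current_a: float, resistance_ohm: float,
                         coeff_k_per_mw: float) -> float:
    """Temperature rise (K) caused by the measurement current."""
    power_mw = current_a ** 2 * resistance_ohm * 1e3  # watts -> milliwatts
    return power_mw * coeff_k_per_mw

# Pt100 (100 ohm) in still air, assuming S ~ 0.5 K/mW:
print(self_heating_error_k(1e-3, 100.0, 0.5))  # 0.05 K (~50 mK)
print(self_heating_error_k(3e-3, 100.0, 0.5))  # 0.45 K (~0.5 K)
```

The two computed values reproduce the still-air rows of the table, which is a useful sanity check that self-heating scales with the square of the current.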

Factors Influencing Self-Heating

  • Sensor design: Larger elements, or sensors with better thermal coupling between element and surroundings, dissipate heat more effectively.
  • Installation: Good thermal contact between the RTD element and its sheath, as well as between the sheath and the surrounding medium, reduces temperature rise.
  • Ambient conditions: Self-heating is more noticeable in low-convection environments like still air or gases. In high-flow liquids, the effect is often negligible due to rapid heat dissipation.
  • Excitation current: Heating rises with the square of the current (P = I² × R), so doubling the current quadruples the dissipated power. Using the minimum required current helps reduce this effect.

Minimizing RTD Self-Heating Error

  • Use the lowest practical measurement current (typically 0.1 mA to 1 mA for Pt100 sensors).
  • Optimize sensor construction for good thermal conduction from the RTD element to the medium.
  • Improve thermal coupling by ensuring tight installation into thermowells or direct immersion into the process media.
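The first point above can be turned around: given an acceptable self-heating error, you can solve P = I² × R for the largest usable current. The sketch below assumes the same illustrative self-heating coefficient (~0.5 K/mW in still air) used earlier; the function and budget value are hypothetical.

```python
import math

# Sketch: largest excitation current that keeps the self-heating error
# under a chosen budget, assuming a known self-heating coefficient S (K/mW).

def max_current_a(error_budget_k: float, resistance_ohm: float,
                  coeff_k_per_mw: float) -> float:
    """Solve dT = I^2 * R * 1000 * S for the current I (amperes)."""
    power_mw = error_budget_k / coeff_k_per_mw          # allowable power, mW
    return math.sqrt(power_mw * 1e-3 / resistance_ohm)  # back to W, then I

# Pt100 in still air (S ~ 0.5 K/mW) with a 10 mK error budget:
i = max_current_a(0.010, 100.0, 0.5)
print(f"{i * 1e3:.2f} mA")  # 0.45 mA
```

The result (roughly 0.45 mA) falls inside the 0.1 mA to 1 mA range recommended above, which is why that range is a sensible default for Pt100 sensors.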

Quantifying Self-Heating Error

A common method for estimating and correcting self-heating involves:

  1. Measuring the RTD resistance at a constant environmental temperature using two different currents.
  2. Plotting resistance versus the square of the current.
  3. Extrapolating the trend line back to zero current to estimate the true resistance with no self-heating influence.
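The three steps above can be sketched with two readings, since resistance rises approximately linearly with I². The resistance values below are illustrative: they correspond to the ~20 mK ice-point figure from the table earlier, assuming a Pt100 sensitivity of roughly 0.39 Ω/K.

```python
# Sketch: zero-current extrapolation from two readings at different currents.
# Measured resistance is roughly linear in I^2, so fitting a line through
# (I^2, R) and extrapolating to I^2 = 0 removes the self-heating influence.

def zero_current_resistance(i1_a: float, r1_ohm: float,
                            i2_a: float, r2_ohm: float) -> float:
    """Extrapolate two (current, resistance) readings back to zero current."""
    x1, x2 = i1_a ** 2, i2_a ** 2
    slope = (r2_ohm - r1_ohm) / (x2 - x1)  # ohms per A^2 (self-heating slope)
    return r1_ohm - slope * x1             # intercept at I^2 = 0

# Hypothetical Pt100 readings at the ice point:
# 1 mA -> 100.0078 ohm, 3 mA -> 100.0702 ohm (values chosen for illustration)
r0 = zero_current_resistance(1e-3, 100.0078, 3e-3, 100.0702)
print(f"{r0:.4f} ohm")  # 100.0000 ohm
```

With these numbers the extrapolated value lands on the nominal 100 Ω ice-point resistance, showing how the method recovers the true resistance with no self-heating influence.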

Summary

RTD self-heating is a measurable and often correctable source of error. By controlling measurement current and ensuring proper thermal contact, this effect can be minimized—preserving the accuracy and reliability of temperature readings.

Note: The information in this guide is provided for general informational and educational purposes only. While we aim for accuracy, all data, examples, and recommendations are provided “as is” without warranty of any kind. Standards, specifications, and best practices may change over time, so always confirm current requirements before use.

Need help or have a question? We’re here to assist — feel free to contact us.

Further Reading

RTD vs Thermocouple – Choosing the Right Sensor
Explore the features and characteristics of thermocouples and RTDs

RTD Output Tables
View Resistance versus Temperature tables for all Pt100 sensors.

What are the RTD colour codes?
Explore RTD colour codes and wiring configurations.

Next: RTD Standards & Tolerances: Understanding IEC 60751 →