Infrared Thermographic Testing

From Non Destructive Testing (NDT) Wiki



According to Planck's law, all objects above absolute zero emit infrared radiation.

This radiation only becomes visible to the human eye when the temperature is above about 500 °C.

Infrared monitoring equipment has been developed that can detect this emission and display it as a visible image.

The sensitive range of the detectors lies between 2 and 14 microns. The 2–5.6 micron range is generally used to visualize temperatures between 40 °C and 2000 °C, and the 8–14 micron range is used for temperatures between -20 °C and ambient.
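The band choice above can be encoded in a small helper. This is a hypothetical illustration, not a real camera API; the temperature limits are simply the ranges quoted in the text, and the `ambient_c` default is an assumption.

```python
# Hypothetical helper: pick a spectral band (in microns) for a target
# temperature, using the ranges quoted above (2-5.6 um for ~40-2000 C,
# 8-14 um for ~-20 C up to ambient). Not a real instrument API.
def select_band(target_temp_c, ambient_c=20.0):
    """Return the (low, high) band in microns suited to the target, or None."""
    if 40.0 <= target_temp_c <= 2000.0:
        return (2.0, 5.6)       # short-wave band for hot targets
    if -20.0 <= target_temp_c <= ambient_c:
        return (8.0, 14.0)      # long-wave band for near-ambient targets
    return None                 # outside the ranges given in the text
```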

A thermogram taken with an infrared camera records the temperature distribution at the surface of the object at the time of the test.

It is important to take into consideration that this temperature distribution is the result of a dynamic process. Taking a thermogram of this object at an earlier or later time may result in a very different temperature distribution. This is especially true when the object has been heated or cooled.

The detectability of any internal structure, such as voids, delaminations or layer thicknesses, depends on the physical properties (heat capacity, heat conductivity, density, emissivity) of the materials of the test object. Naturally, any interior 'structure' has an effect on the temperature distribution at the surface. When the surface temperature changes, there is a delay before that change is felt at the depth of a defect such as a void, and before the defect's effect in turn appears at the surface. The longer this time delay, the greater the depth of the defect below the surface. Generally, anything deeper than 10 cm will only show a long time (>1 hr) after the temperature change has occurred.
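The depth/delay relationship can be sketched with the standard one-dimensional diffusion estimate, where the characteristic time for a surface change to reach depth z is on the order of z²/α. The diffusivity value used here is a typical literature figure for concrete, not taken from this article.

```python
# Order-of-magnitude delay from one-dimensional heat diffusion: t ~ z**2 / alpha.
# The diffusivity below is a typical textbook value for concrete (assumption).
CONCRETE_DIFFUSIVITY = 1.0e-6   # m^2/s, typical order of magnitude

def characteristic_delay_hours(depth_m, alpha=CONCRETE_DIFFUSIVITY):
    """Rough delay (hours) before a surface change is felt at depth_m."""
    return depth_m ** 2 / alpha / 3600.0
```

For a defect 10 cm deep this gives a delay of a few hours, consistent with the ">1 hr" figure quoted in the text.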

Since the infrared system measures surface temperatures only, the temperatures measured are influenced by three factors:

  1. subsurface configuration.
  2. surface condition.
  3. environment.

As an NDT technique for inspecting concrete, the effect of the subsurface configuration is usually of most interest. All the information revealed by the infrared system relies on the principle that heat cannot be stopped from flowing from warmer to cooler areas; it can only be slowed down by the insulating effects of the material through which it flows. Different construction materials have different insulating abilities, i.e. thermal conductivities. In addition, different types of concrete defect have different thermal conductivity values. For example, an air void has a lower thermal conductivity than the surrounding concrete, so the surface of a section of concrete containing an air void can be expected to have a slightly different temperature from that of a section without one.
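The effect of an air void on heat flow can be illustrated with a simple series-resistance model of one-dimensional steady-state conduction. The conductivity values are typical literature figures, not from this article, and a real void is of course not a uniform layer.

```python
# Crude one-dimensional model: compare steady-state heat flux through sound
# concrete and through concrete containing a thin air layer standing in for
# a void. Conductivities are typical textbook values (assumptions).
K_CONCRETE = 1.7    # W/(m K), typical for concrete
K_AIR = 0.026       # W/(m K), still air

def heat_flux(delta_t, layers):
    """Flux (W/m^2) through layers given as (thickness_m, conductivity) pairs."""
    resistance = sum(t / k for t, k in layers)   # series thermal resistance
    return delta_t / resistance

sound = heat_flux(10.0, [(0.20, K_CONCRETE)])
voided = heat_flux(10.0, [(0.19, K_CONCRETE), (0.01, K_AIR)])
# The thin air layer sharply reduces the flux, so the surface above a void
# heats and cools differently from the surrounding sound concrete.
```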

There are three ways of transferring thermal energy from a warmer to a cooler region:

  1. conduction.
  2. convection.
  3. radiation.

Sound concrete should have the least resistance to conduction of heat, and convection effects should be negligible. The surface, as revealed by the infrared system, should show a uniform temperature over the whole area examined. Poor-quality concrete, however, contains anomalies such as voids and low-density areas that reduce its thermal conductivity without substantially increasing convection effects. In order to have heat energy flow, there must be a heat source. Since concrete testing can involve large areas, the heat source should be both low cost and able to give the concrete surface an even distribution of heat. The sun fulfils both these requirements: allowing the sun to warm the surface of the concrete areas under test will normally supply the required energy.

During night-time hours, the process may be reversed, with the warm ground acting as the heat source. For concrete areas not accessible to sunlight, an alternative is to use the heat storage ability of the earth to draw heat from the concrete under test. The important point is that in order to use infrared thermography, heat must be flowing through the concrete; it does not matter in which direction it flows.

The second important factor to consider when using infrared thermography to measure temperature differentials due to anomalies is the surface condition of the test area. The surface condition has a profound effect on the ability of the surface to transfer energy by radiation. This ability is measured by the emissivity of the material, defined as the ratio of the energy the material radiates to that radiated by a perfect blackbody at the same temperature. A blackbody is a hypothetical radiation source which radiates the maximum energy theoretically possible at a given temperature; its emissivity is therefore 1.0. The emissivity of a material is strictly a surface property. The emissivity value is higher for rough surfaces and lower for smooth surfaces. For example, rough concrete may have an emissivity of 0.95, while shiny metal may have an emissivity of only 0.05. In practical terms, this means that when using thermographic methods to scan large areas of concrete, the engineer must be aware of differing surface textures caused by such things as broom-textured spots, rubber tire tracks, oil spots, or loose sand and dirt on the surface.
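A minimal sketch of why emissivity matters for radiometric readings, using the Stefan-Boltzmann law (total radiated power W = εσT⁴): a camera that assumes blackbody emission under-reads a low-emissivity surface. Real cameras also correct for reflected and atmospheric radiation, which this illustration ignores.

```python
# Stefan-Boltzmann sketch: a blackbody-assuming camera reports the apparent
# temperature satisfying sigma*T_app**4 = eps*sigma*T_true**4. Reflected and
# atmospheric contributions are deliberately ignored here (simplification).
SIGMA = 5.670e-8    # W/(m^2 K^4), Stefan-Boltzmann constant

def apparent_temp_k(true_temp_k, emissivity):
    """Blackbody-equivalent temperature a camera would report."""
    return (emissivity * true_temp_k ** 4) ** 0.25

def corrected_temp_k(apparent_k, emissivity):
    """Recover the true surface temperature from the blackbody reading."""
    return apparent_k / emissivity ** 0.25

# Rough concrete (eps ~ 0.95) reads close to its true temperature, while
# shiny metal (eps ~ 0.05) reads far colder than it really is.
```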

The final factor affecting temperature measurement of a concrete surface is the environmental system that surrounds that surface. Some of the factors that affect surface temperature measurements are:

  • SOLAR RADIATION: testing should be performed during times of the day or night when the solar radiation or lack of solar radiation would produce the most rapid heating or cooling of the concrete surface.
  • CLOUD COVER: clouds will reflect infrared radiation, thereby slowing the heat transfer process to the sky. Therefore, night-time testing should be performed during times of little or no cloud cover in order to allow the most efficient transfer of energy out of the concrete.
  • AMBIENT TEMPERATURE: This should have a negligible effect on the accuracy of the testing, since the important consideration is the rapid heating or cooling of the concrete surface. It does, however, affect the length of time (i.e. the window) during which high-contrast temperature measurements can be made. It is also important to consider whether water is present. Testing while ground temperatures are less than 0 °C should be avoided, since ice can form and fill subsurface voids.
  • WIND SPEED: High gusts of wind have a definite cooling effect and reduce surface temperatures. Measurements should be taken at wind speeds of less than 15 mph (25 km/h).
  • SURFACE MOISTURE: Moisture tends to disperse the surface heat and mask the temperature differences and thus the subsurface anomalies. Tests should not be performed while the concrete surface is covered with standing water or snow.
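The environmental limits above can be collected into a simple pre-test checklist. The thresholds come from the text; the function itself is a hypothetical illustration, not part of any standard.

```python
# Hypothetical pre-test checklist encoding the environmental limits listed
# above. Thresholds are taken from the text; the function is illustrative.
def conditions_ok(wind_kmh, ground_temp_c, surface_wet, heavy_cloud, nighttime):
    """Return True only when all environmental criteria in the text are met."""
    if wind_kmh >= 25:            # wind must be below 15 mph (25 km/h)
        return False
    if ground_temp_c <= 0:        # avoid ice forming in subsurface voids
        return False
    if surface_wet:               # no standing water or snow on the surface
        return False
    if nighttime and heavy_cloud: # night tests need little or no cloud cover
        return False
    return True
```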

Once the proper conditions are established for examination, a relatively large area should be selected for calibration purposes. This should encompass both good and bad concrete areas (i.e. areas with voids, delaminations, cracks, or powdery concrete). Each type of anomaly will display a unique temperature pattern depending on the conditions present. If, for example, the examination is performed at night, most anomalies will be between 0.1 °C and 5 °C cooler than the surrounding solid concrete depending on configuration. A daylight survey will show reversed results, i.e. damaged areas will be warmer than the surrounding sound concrete.
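The calibration idea can be sketched as a toy classifier: flag readings that deviate from the reference area's median by more than a threshold, with the sign of the deviation chosen from the time of the survey. This is purely illustrative; real surveys work on full thermogram images, not lists of spot readings.

```python
# Toy illustration of the night/day contrast: at night anomalies read cooler
# than sound concrete, by day warmer. Flags readings deviating from the
# median of the reference area by more than the 0.1 degC threshold quoted above.
def flag_anomalies(temps, nighttime, threshold=0.1):
    """Return indices of readings that deviate like void/delamination areas."""
    ordered = sorted(temps)
    median = ordered[len(ordered) // 2]
    if nighttime:
        return [i for i, t in enumerate(temps) if median - t > threshold]
    return [i for i, t in enumerate(temps) if t - median > threshold]

# Example night survey: the reading at index 2 is 1.5 degC cooler and is flagged.
readings = [20.0, 20.05, 18.5, 19.95, 20.1]
```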