From 1.4 to 25 micrometers (µm)
Gaithersburg, MD, USA — NIST is about to open the world’s most accurate facility for calibrating infrared (IR) detectors.
It is made possible by the establishment of an extremely precise reference scale for detector performance based on NIST’s newly developed standard detectors with sensitivities two orders of magnitude better than any others of comparable design.
When final quality-system checks now in progress are completed later this year, calibrations will be made available to customers through NIST Measurement Services.
The new NIST Infrared Spectral Comparator Facility (IR-SCF), developed over the past 10 years with major funding from the U.S. Air Force and in collaboration with two leading detector manufacturers, is expected to draw customers from the Department of Defense, NASA, instrument companies, and calibration laboratories, among others.
It will be of particular interest for calibrating sensors viewing the Earth from space for weather and climate-science applications since the emitted infrared and far-infrared radiation from the Earth is critical for understanding the Earth’s energy balance and its response to changing levels of atmospheric greenhouse gases.
Other IR applications include night vision and heat-seeking missiles, industrial monitoring, medical diagnostics, proximity-sensor technology, and machine-to-machine communication in the Internet of Things.
Until the advent of the new system, NIST’s Sensor Science Division offered IR detector calibrations in the 1.4 micrometer (µm) to 14 µm wavelength range. The IR-SCF extends that range to 25 µm and establishes a greatly improved reference scale for responsivity – the ratio of the electrical signal a detector produces to the optical power it receives.
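As a rough illustration (not NIST’s actual data pipeline), responsivity at a given wavelength is simply detector output divided by incident optical power:

```python
def responsivity(signal_volts: float, incident_power_watts: float) -> float:
    """Responsivity in V/W: electrical output divided by optical power in.

    Illustrative only; the signal and power values below are made up.
    """
    if incident_power_watts <= 0:
        raise ValueError("incident power must be positive")
    return signal_volts / incident_power_watts

# Example: 5 mV of signal from 10 µW of IR power gives roughly 500 V/W.
print(responsivity(5e-3, 10e-6))
```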
The updated scale is based on measurements from super-sensitive pyroelectric detectors custom-fabricated to NIST design and sensitivity specifications for use as calibration standards.*
After extensive testing and numerous iterative improvements, the latest of these devices have signal-to-noise performance about 200 times better than previously obtainable.
That sensitivity, in turn, allows the NIST standard detectors to be used at high precision in combination with a monochromator – an instrument that can be adjusted to emit only specific wavelengths.
Monochromators are very accurate, but only extremely sensitive detectors capable of discriminating between very small differences in wavelengths can take full advantage of them.
That combination, plus improved IR output intensity from the monochromator, advanced electronics, a mounting stage that can adjust a detector’s position to within a few micrometers in three dimensions, and other improvements, gives the new facility unmatched capabilities.
The calibration process – which spans many days and can cost upward of $10,000 – begins with a blackbody heat source (typically 1100 °C) that emits IR radiation which is then channeled into the monochromator.
Inside the device’s moisture-controlled enclosure is a rotatable turret assembly containing a selection of different gratings, each of which is optimized for a particular set of wavelengths.
Just as a glass prism spreads white light out into a rainbow, with each color emerging at a slightly different angle, the monochromator grating separates the IR into specific wavelengths accessible at different angles. The resulting radiation is within 1% of the target wavelength.
The IR output beam, less than 3 mm wide, is carefully aimed at the detector under test, which is mounted next to the NIST standard detector.
At each selected wavelength over the appropriate range, one measurement is made with the customer’s detector and then the NIST detector is moved into the beam and takes its own measurement at the same power and wavelength. Comparing the accumulated data across a span of many wavelengths produces the calibration.
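The substitution method described above can be sketched in a few lines. In this hedged illustration (the numbers and function names are hypothetical, not real calibration data), the beam power at each wavelength is the same for both detectors, so the customer detector’s responsivity follows from the signal ratio against the known NIST standard:

```python
def calibrate(wavelengths_um, v_customer, v_reference, r_reference):
    """Substitution calibration sketch: at each wavelength, the customer
    detector's responsivity (V/W) is

        R_cust = (V_cust / V_ref) * R_ref

    since both detectors see the same beam power.
    """
    result = {}
    for wl, vc, vr, rr in zip(wavelengths_um, v_customer, v_reference, r_reference):
        result[wl] = (vc / vr) * rr
    return result

# Illustrative numbers only (not real measurement data):
scale = calibrate(
    wavelengths_um=[2.0, 10.0, 25.0],       # selected wavelengths
    v_customer=[0.8e-3, 1.1e-3, 0.6e-3],    # customer detector signals (V)
    v_reference=[1.0e-3, 1.0e-3, 1.0e-3],   # NIST standard signals (V)
    r_reference=[400.0, 380.0, 350.0],      # known standard responsivity (V/W)
)
```

Repeating this comparison across many wavelengths yields the calibrated responsivity curve for the customer’s detector.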
“This is a major achievement that will allow measurements in the difficult infrared to far-infrared part of the spectrum to approach the accuracy now possible in the visible and ultraviolet spectrum where high-quality standard detectors have been available for years,” says Gerald Fraser, Chief of NIST’s Sensor Science Division.
“The effort will allow R&D teams to better leverage the capabilities of the infrared spectrum for applications that range from infrared astronomy to quality control in manufacturing.”
Contact: George Eppeldauer
* NIST’s standard detectors operate on the pyroelectric effect, in which changing the temperature of certain materials – in this case, by heating them with IR radiation – produces an AC current across the material. Each detector contains a crystal whose electrical condition is continuously monitored as its temperature briefly rises in response to IR radiation, drops as the device is allowed to cool (by briefly turning off the IR beam), and then rises again when the next IR pulse arrives. The pulsed beam is created by a rotating “chopper” that alternately blocks the IR beam and allows it to pass.
The figure of merit for sensitivity in such detectors is called noise equivalent power (NEP, expressed in nW/√Hz), and lower values represent higher sensitivity. A decade ago, NIST used standards-quality pyroelectric detectors with an NEP of 60 nW/√Hz or higher. Before the completion of the IR-SCF, NIST’s detectors had NEP values around 7 nW/√Hz. The new standard detectors have an NEP around 0.2 nW/√Hz, close to the NEP of liquid-helium-cooled silicon composite bolometers.
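NEP connects directly to the smallest optical power a detector can resolve: multiplying NEP by the square root of the measurement bandwidth gives the power level at which signal equals noise. A minimal sketch, using the three NEP values cited above:

```python
import math

def min_detectable_power(nep_w_per_rthz: float, bandwidth_hz: float) -> float:
    """Optical power (W) at which signal-to-noise ratio is 1,
    for a detector with the given NEP and measurement bandwidth."""
    return nep_w_per_rthz * math.sqrt(bandwidth_hz)

# Three generations of NIST standard pyroelectric detectors:
for nep_nw in (60.0, 7.0, 0.2):
    p = min_detectable_power(nep_nw * 1e-9, bandwidth_hz=1.0)
    print(f"NEP {nep_nw} nW/sqrt(Hz) -> {p * 1e9:.1f} nW in a 1 Hz bandwidth")
```

In a 1 Hz bandwidth the minimum detectable power equals the NEP itself, which is why lower NEP translates directly into finer wavelength-by-wavelength comparisons against the monochromator output.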
DISCLAIMER: Certain trade names and company products are depicted in this article for informational purposes only. In no case does such identification imply recommendation or endorsement by the National Institute of Standards and Technology, nor does it imply that the products are necessarily the best available for the purpose.