IR Temperature Detection with sensitivity of 1 degree - possible?

SUMMARY

The discussion centers on designing a system to detect temperature differences of 1 degree Celsius using infrared (IR) technology. Users recommend Fluke thermal imagers for their ability to resolve temperature differences down to 0.5 degrees Celsius, although they are expensive and better suited to relative than absolute temperature readings. For building a custom solution, infrared thermocouples from Omega are suggested, but they come with challenges such as very low signal levels and sensitivity to surface emissivity. The most reliable measurement technique is to take reference readings from both the cold surface and the hot object and use the difference, which cancels many common-mode errors.

PREREQUISITES
  • Understanding of infrared thermography and its applications
  • Familiarity with thermal imaging technology, specifically Fluke Thermal Imagers
  • Knowledge of infrared thermocouples and their operational principles
  • Basic principles of temperature measurement and calibration techniques
NEXT STEPS
  • Research the specifications and capabilities of Fluke Thermal Imagers for temperature detection
  • Explore infrared thermocouples available at Omega and their application in temperature measurement
  • Learn about the calibration process for infrared sensors to improve accuracy
  • Investigate the impact of surface emissivity on infrared temperature readings and how to compensate for it
USEFUL FOR

Engineers, hobbyists, and researchers involved in thermal imaging, temperature measurement, and sensor design will benefit from this discussion.

MIMSAR
Hi,

I'm trying to design a system which will be able to detect a hot object on a cold surface. The difficulty is that the temperature difference is only 1 degree Celsius, and I'd like to use IR from a distance of about 10 cm (further away preferred if possible).

Does anyone with experience of this know of a suitable sensor or system I could use, please? I'm guessing some form of calibration will be needed to tell the system the temperature of the cold surface first.

Thanks,
 
The handheld Fluke Thermal Imagers I've used could detect differences down to 1/2 a degree C. They weren't that great for detecting the absolute temperature, but on a false color display, the relative differences were easy to see.

You might need to shop around. My experience was from a couple of years ago, but the cameras are very expensive. We leased rather than purchased one. We wanted a camera that could export the numeric data so we could do more quantitative analysis of the temperature gradients.
 
Thanks. I've actually used a thermal imager and found it to be quite effective. However, I'm now trying to build my own system tuned to the required temperature and the ability to interface with other components. Are there any suitable sensors/circuits, ideally giving an analogue output of the temperature at the focus?

I considered an IR laser thermometer, but wondered whether its beam would actually heat the object up, in addition to reflecting off it to give a temperature reading?
 
You can buy thermal imaging cameras with USB interfaces like this:

http://www.infraredcamerasinc.com/fix-mounted-thermal-imaging-equipment.html

I've never priced anything like this, but I'd imagine it is pretty expensive.

IR laser thermometers are still passive sensors; the laser is just for aiming. They aren't very accurate, but they're plenty good enough for detecting hot spots on circuit boards or checking whether the AC system is working.

If you need a fairly precise infrared temperature, check the Omega catalog:

http://www.omega.com/search/esearch...ed+Sensor&submit=Search&ori=Infrared+Sensor+
 
I've done this using infrared thermocouples. You can purchase them at Omega.com. There are a couple of downsides to using these:

1. The signal level is tiny - on the order of 50 µV/°C.
2. They have a high impedance - several kilohms - so the leakage current of many standard thermocouple meters will swamp the circuit's output.
3. They're affected by their case temperature - ours was compensated by a constant-temperature jacket.
4. They're affected by the emissivity of the surface. We used a rough surface to help compensate.
5. Ours had a germanium lens; touching the lens was enough to start it corroding.
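To see what point 1 implies for the front-end electronics, here is a quick feasibility calculation. The ~50 µV/°C sensitivity comes from the post above; the 12-bit ADC and 3.3 V reference are assumptions for illustration, not from the thread:

```python
# Back-of-envelope signal-chain check for an infrared thermocouple.
# Sensitivity (~50 uV/degC) is from the post; ADC resolution and
# reference voltage are assumed example values.
SENSITIVITY_V_PER_C = 50e-6   # sensor output, volts per degree C
ADC_BITS = 12                 # assumed ADC resolution
ADC_VREF = 3.3                # assumed ADC full-scale voltage, V
TARGET_RESOLUTION_C = 1.0     # we want to resolve a 1 degC difference

lsb = ADC_VREF / (2 ** ADC_BITS)                     # ADC step size, volts
signal = SENSITIVITY_V_PER_C * TARGET_RESOLUTION_C   # signal for 1 degC
min_gain = lsb / signal                              # gain so 1 degC >= 1 LSB

print(f"ADC LSB: {lsb * 1e6:.1f} uV")
print(f"1 degC signal: {signal * 1e6:.1f} uV")
print(f"Minimum amplifier gain: {min_gain:.0f}")
```

In practice you would want a 1 °C step to span many LSBs rather than one, so a gain of several hundred is more realistic, and the several-kilohm source impedance in point 2 means the amplifier needs a very low input bias current (e.g. an instrumentation amplifier rather than a bare op-amp stage).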

The easiest, most reliable way of taking this measurement is to point the sensor at one surface, take a reference reading, point it at the other surface, take another reading, and then take the difference of the two. This will make up for a great many shortcomings in the system.
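The two-reading technique above can be sketched in a few lines. This is a minimal sketch, not a real driver: the `cold` and `hot` read functions below are simulated stand-ins for whatever sensor interface you end up with, and the shared +0.8 °C offset models a common-mode error (case-temperature drift, a fixed emissivity error) that cancels in the subtraction:

```python
def averaged_reading(read, samples=16):
    """Average several samples from the sensor to beat down noise."""
    return sum(read() for _ in range(samples)) / samples

def temperature_difference(read_cold, read_hot, samples=16):
    """Reference the cold surface, then the target, and report the
    difference. Errors common to both readings largely cancel."""
    return averaged_reading(read_hot, samples) - averaged_reading(read_cold, samples)

# Simulated readings: both surfaces share a +0.8 degC systematic offset,
# which drops out, leaving the true 1.0 degC difference.
cold = lambda: 20.0 + 0.8
hot = lambda: 21.0 + 0.8
print(f"{temperature_difference(cold, hot):.2f} degC")
```

The same structure works with a real sensor by replacing the lambdas with a function that triggers and reads your IR front end.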

Good Luck,

Mike
 
