IR Temperature Detection with sensitivity of 1 degree - possible?

In summary: the original poster wants to detect a hot object on a cold surface where the temperature difference is only 1 degree Celsius, using an IR sensor from a distance of about 10 cm. Replies suggest thermal imaging cameras, which can resolve differences of around half a degree C, and infrared thermocouples, whose drawbacks include a tiny signal level, high output impedance, and sensitivity to case temperature and surface emissivity. The most robust approach offered is a differential measurement: take a reference reading of the cold surface, then a reading of the target, and report the difference.
  • #1
MIMSAR
Hi,

I'm trying to design a system that can detect a hot object on a cold surface. The difficulty is that the temperature difference is only 1 degree Celsius, and I'd like to use IR from a distance of about 10 cm (further away preferred if possible).

Does anyone with experience of this know a suitable sensor or system I could use, please? I'm guessing some form of calibration will be needed to tell the system the temperature of the cold surface first.

Thanks,
 
  • #2
The handheld Fluke Thermal Imagers I've used could detect differences down to 1/2 a degree C. They weren't that great for detecting the absolute temperature, but on a false color display, the relative differences were easy to see.

You might need to shop around. My experience was from a couple of years ago, but the cameras are very expensive. We leased rather than purchased one. We wanted a camera that could export the numeric data so we could do more quantitative analysis of the temperature gradients.
 
  • #3
Thanks. I've actually used a thermal imager and found it to be quite effective. However, I'm now trying to build my own system, tuned to the required temperature and able to interface with other components. Are there any suitable sensors/circuits, ideally giving an analogue output of the temperature at the focus?

I considered an IR laser thermometer, but wondered: would the beam of IR actually heat the object up, in addition to reflecting back to give a temperature reading?
 
  • #4
You can buy thermal imaging cameras with USB interfaces like this:

http://www.infraredcamerasinc.com/fix-mounted-thermal-imaging-equipment.html

I've never priced anything like this, but I'd imagine it is pretty expensive.

IR laser thermometers are still passive sensors; the laser is just for aiming, so it won't measurably heat your target. They aren't very accurate, but they're plenty good enough for detecting hot spots on circuit boards or checking whether the AC system is working.

If you need a fairly precise infrared temperature measurement, check the Omega catalog:

http://www.omega.com/search/esearch...ed+Sensor&submit=Search&ori=Infrared+Sensor+
 
  • #5
I've done this using infrared thermocouples. You can purchase them at Omega.com. There are a few downsides to using these:

1. The signal level is tiny - on the order of 50 µV/°C (see the gain sketch after this list).
2. They have a high impedance - several kilohms; the leakage current of many standard thermocouple meters will swamp the circuit's output.
3. They're affected by their case temperature - ours was compensated by a constant-temperature jacket.
4. They're affected by the emissivity of the surface. We used a rough surface to help compensate.
5. Ours had a germanium lens. Touching the lens was enough to start it corroding.
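
For item 1, a back-of-the-envelope gain calculation may help. The ADC (10-bit, 5 V) and the 0.1 °C target resolution here are my assumptions, not from the thread; only the 50 µV/°C figure comes from the list above. Given item 2's high source impedance, the gain stage would need to be a low-bias-current instrumentation amplifier:

```python
# Back-of-the-envelope only: the ADC (10-bit, 5 V) and the 0.1 degC target
# resolution are assumptions; the 50 uV/degC figure is from item 1 above.
signal_per_degC = 50e-6            # V per degC, from the thermocouple spec
adc_full_scale = 5.0               # V (assumed)
lsb = adc_full_scale / 2**10       # ~4.9 mV per ADC count (assumed 10-bit)

target_resolution_degC = 0.1       # want one ADC count per 0.1 degC
needed_gain = lsb / (signal_per_degC * target_resolution_degC)
print(f"amplifier gain needed: ~{needed_gain:.0f}")   # -> ~977
```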

The easiest, most reliable way of taking this measurement is to point the sensor at one surface, take a reference reading, point it at the other, take another reading, and then take the difference. This will make up for a great many shortcomings in the system.
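
A minimal sketch of that differential procedure, assuming a hypothetical read_celsius() helper (everything here is illustrative, not a specific sensor driver):

```python
import time

def read_celsius():
    """Hypothetical helper: return one (possibly offset) reading from the
    IR sensor in degrees C. Stubbed here; replace with your ADC/driver code."""
    return 20.0  # placeholder value

def average_reading(n=32, settle_s=0.05):
    """Average n samples to beat down noise on the tiny signal."""
    total = 0.0
    for _ in range(n):
        total += read_celsius()
        time.sleep(settle_s)
    return total / n

input("Aim the sensor at the cold (reference) surface, then press Enter")
t_ref = average_reading()

input("Aim the sensor at the target object, then press Enter")
t_obj = average_reading()

# Only the difference is reported; absolute calibration errors common to
# both readings (case temperature drift, offset, etc.) cancel out.
print(f"temperature difference: {t_obj - t_ref:+.2f} degC")
```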

Good Luck,

Mike
 

1. How does IR temperature detection work?

The IR temperature detection method measures the temperature of an object by detecting and analyzing the infrared radiation emitted by the object. This radiation is converted into an electrical signal, which is then processed and translated into a temperature reading. The sensitivity of 1 degree means that the device can detect temperature changes as small as 1 degree.
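
To make that concrete, here is a quick estimate of how much the detected radiation changes for a 1 °C difference near room temperature, using the Stefan-Boltzmann law (the 300 K operating point is an assumption):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

T = 300.0                # object temperature in kelvin (~27 degC, assumed)
dT = 1.0                 # the 1 degC difference we want to resolve

P_cold = SIGMA * T**4            # radiated power per unit area, W/m^2
P_hot = SIGMA * (T + dT)**4
print(f"fractional change in radiated power: {(P_hot - P_cold) / P_cold:.4f}")
# -> ~0.0134, i.e. about 1.3% (matches the small-signal estimate 4*dT/T)
```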

2. What is the advantage of using IR temperature detection with a sensitivity of 1 degree?

The advantage of using IR temperature detection with a sensitivity of 1 degree is that it allows for precise and accurate temperature measurements. This level of sensitivity is especially useful in applications where even small temperature changes can have significant impacts, such as in scientific research or industrial processes.

3. How is the sensitivity of IR temperature detection measured?

The sensitivity of IR temperature detection is typically specified in degrees, often as a fraction of a degree; for imaging cameras it is quoted as the noise-equivalent temperature difference (NETD), usually in millikelvin. This value represents the smallest temperature change that the device can detect and reliably distinguish from noise. A sensitivity of 1 degree means that the device can detect temperature changes as small as 1 degree.

4. Can IR temperature detection with a sensitivity of 1 degree be used in all environments?

IR temperature detection with a sensitivity of 1 degree can be used in a wide range of environments, but it may not be suitable for all applications. Factors such as the distance between the object and the detector, ambient temperature, and the presence of other sources of heat can affect the accuracy of the temperature measurement. It is important to consider these factors when using IR temperature detection with a sensitivity of 1 degree.
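
As an illustration of one such environmental correction, here is the standard greybody emissivity compensation, sketched in Python; the numbers in the example are assumptions:

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def true_temperature_k(t_apparent_k, emissivity, t_background_k):
    """Greybody correction: the sensor sees emitted plus reflected radiation;
    recover the object temperature from the apparent (blackbody) reading."""
    w_seen = SIGMA * t_apparent_k**4
    w_emitted = w_seen - (1.0 - emissivity) * SIGMA * t_background_k**4
    return (w_emitted / (emissivity * SIGMA)) ** 0.25

# Example (all numbers assumed): an apparent 300.5 K reading off a surface
# with emissivity 0.95 in a 295 K room. The corrected value comes out
# slightly higher, since the surface emits less than a blackbody would.
print(f"{true_temperature_k(300.5, 0.95, 295.0):.2f} K")   # -> ~300.78 K
```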

5. Are there any limitations to using IR temperature detection with a sensitivity of 1 degree?

While IR temperature detection with a sensitivity of 1 degree is highly accurate and precise, it is not suitable for all temperature measurement applications. It may not be able to accurately measure temperatures of objects that are too small or too far away, or in environments with high levels of interference. In these cases, other temperature detection methods may be more suitable.
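
One of those distance limits can be quantified with a sensor's distance-to-spot (D:S) ratio: the target must fill the measurement spot. A minimal sketch, where the 10:1 ratio is an assumed, typical hobby-grade value:

```python
def spot_diameter_cm(distance_cm, d_to_s_ratio):
    """Diameter of the measurement spot for a sensor with a given
    distance-to-spot (D:S) ratio."""
    return distance_cm / d_to_s_ratio

# Example: an assumed 10:1 optic at the original poster's 10 cm distance.
print(spot_diameter_cm(10.0, 10.0))   # -> 1.0 cm; the target must fill it
```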
