IR Temperature Detection with a sensitivity of 1 degree - possible?

AI Thread Summary
Designing a system to detect a 1-degree Celsius temperature difference using infrared (IR) technology is feasible, with options like handheld Fluke Thermal Imagers noted for their sensitivity. Users recommend considering thermal imagers with USB interfaces for data export and analysis, although they can be expensive. IR laser thermometers are suggested as passive sensors for detecting hot spots, but they may not provide precise readings. Infrared thermocouples from suppliers like Omega are viable but come with challenges such as low signal levels and sensitivity to surface emissivity. A reliable method involves taking reference readings from both surfaces to accurately measure the temperature difference.
Hi,

I'm trying to design a system that can detect a hot object on a cold surface. The downside is that the temperature difference is only 1 degree Celsius, and I'd like to use IR from a distance of about 10 cm (further away preferred if possible).

Does anyone with experience of this know a suitable sensor or system I could use, please? I'm guessing some form of calibration will be needed to tell the system the temperature of the cold surface first.

Thanks,
 
The handheld Fluke Thermal Imagers I've used could detect differences down to 1/2 a degree C. They weren't that great for detecting the absolute temperature, but on a false color display, the relative differences were easy to see.

You might need to shop around. My experience was from a couple of years ago, but the cameras are very expensive. We leased rather than purchased one. We wanted a camera that could export the numeric data so we could do more quantitative analysis of the temperature gradients.
 
Thanks. I've actually used a thermal imager and found it to be quite effective. However, I'm now trying to build my own system tuned to the required temperature and the ability to interface with other components. Are there any suitable sensors/circuits, ideally giving an analogue output of the temperature at the focus?

I considered an IR laser thermometer, but wondered whether the beam would actually heat the object up, in addition to reflecting to give a temperature reading?
 
You can buy thermal imaging cameras with USB interfaces like this:

http://www.infraredcamerasinc.com/fix-mounted-thermal-imaging-equipment.html

I've never priced anything like this, but I'd imagine it is pretty expensive.

IR laser thermometers are still passive sensors; the laser is just for aiming. They aren't very accurate, but they're plenty good enough for detecting hot spots on circuit boards or checking whether an AC system is working.

If you need a fairly precise infrared temperature, check the Omega catalog:

http://www.omega.com/search/esearch...ed+Sensor&submit=Search&ori=Infrared+Sensor+
 
I've done this using infrared thermocouples. You can purchase them at Omega.com. There are a couple of downsides to using these:

1. The signal level is tiny - on the order of 50 µV/°C.
2. They have a high impedance - several kΩ - so the leakage current of many standard thermocouple meters will swamp the circuit's output.
3. They're affected by their case temperature - ours was compensated by a constant-temperature jacket.
4. They're affected by the emissivity of the surface. We used a rough surface to help compensate.
5. Ours had a germanium lens. Touching the lens was enough to start it corroding.
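To put point 1 in perspective, here is a rough signal-budget sketch. The 50 µV/°C sensitivity and the several-kΩ impedance are the figures quoted above; the ADC full-scale voltage, resolution, and the "100 counts per degree" target are illustrative assumptions, not part of any particular product's spec:

```python
# Rough front-end signal budget for an infrared thermocouple.
# Sensitivity/impedance figures are from the list above; ADC values
# and the counts-per-degree target are assumptions for illustration.

SENSITIVITY_V_PER_C = 50e-6    # ~50 uV/degC from the IR thermocouple
DELTA_T_C = 1.0                # temperature step we want to resolve
SOURCE_IMPEDANCE_OHM = 5e3     # "several k-ohm" source impedance

signal_v = SENSITIVITY_V_PER_C * DELTA_T_C      # 50 uV for a 1 degC step

adc_full_scale_v = 3.3                          # assumed 3.3 V ADC input
adc_bits = 12
adc_lsb_v = adc_full_scale_v / (2 ** adc_bits)  # ~0.8 mV per count

# Gain needed so a 1 degC step spans, say, 100 ADC counts:
target_counts = 100
gain = target_counts * adc_lsb_v / signal_v

print(f"1 degC signal: {signal_v * 1e6:.0f} uV")
print(f"ADC LSB: {adc_lsb_v * 1e3:.3f} mV")
print(f"Required amplifier gain: ~{gain:.0f}")
```

The gain works out to roughly 1600x, which is why an instrumentation amplifier with very high input impedance (and low input bias current) is needed: anything that draws significant current from the several-kΩ source produces exactly the leakage-current problem in point 2.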

The easiest, most reliable way of taking this measurement is to point the sensor at one surface, take a reference reading, point it at the other, take another reading, and subtract one from the other. This will make up for a great many shortcomings in the system.

Good Luck,

Mike