Emissivity and IR Thermometers

Main Question or Discussion Point

Hi

I am trying to get my head around emissivity and was wondering if anyone could help.

There is a "blackbody" with emissivity of 0.97.
We want to use it to check whether some IR thermometers are giving reasonable readings. Unfortunately the emissivity setting of each thermometer is unknown and cannot be altered, so we will only be able to estimate their accuracy.

So if the "blackbody" is set to a temperature of, say, 50°C, will it emit 97% of this temperature? In that case, should the IR thermometer give a reading of around 50*0.97 = 48.5°C if it does not include a correction???

I would be grateful for any help. I have done some reading on the subject, but I am getting more confused, as most websites talk about adjusting the IR thermometer, which is not possible in my situation.

Many thanks

mfb
Mentor
First: The absolute temperature scale is Kelvin, anything proportional to temperature has to be expressed in Kelvin. "97% of the Celsius-value" is meaningless, 97% of -100°C would be -97°C and therefore hotter?

A body with an emissivity of 0.97 will emit 0.97 of the radiation of a perfect blackbody. 50°C corresponds to ~323 K. As radiation scales with the temperature to the 4th power, 3% less radiation means ~3/4% less equivalent temperature, which corresponds to ~323*0.9925 = 320.6 K, or ~47.6°C. By coincidence this is not so far from your value, but your calculation has two errors that partially cancel here. Do not use my values, they are just estimates.
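The arithmetic above can be sketched in a few lines. This is only an illustration of the Stefan–Boltzmann scaling described in the post (radiated power proportional to ε·T⁴); the function name and values are mine, not from any measurement:

```python
# Sketch: apparent temperature that an IR thermometer assuming emissivity 1
# would report for a source at 50 °C with emissivity 0.97.
# From P ∝ ε·T⁴, the equivalent blackbody temperature is T·ε^(1/4).

def apparent_temperature_k(true_temp_k: float, emissivity: float) -> float:
    """Temperature of a perfect blackbody emitting the same total power."""
    return true_temp_k * emissivity ** 0.25

true_t = 50.0 + 273.15  # 50 °C in kelvin
apparent = apparent_temperature_k(true_t, 0.97)
print(f"{apparent:.1f} K = {apparent - 273.15:.1f} °C")  # ~320.7 K, ~47.5 °C
```

Using 323.15 K rather than the rounded 323 K gives ~47.5 °C instead of 47.6 °C; either way it is only an estimate, for the reasons given above.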

Keep in mind that the radiation does not simply scale with temperature; the wavelength distribution gets shifted, too. A perfect blackbody with a real temperature of 47.6°C would emit the same total power, but slightly shifted towards longer wavelengths.
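The spectral shift can be illustrated with Wien's displacement law, λ_peak = b/T. This is a sketch using the standard Wien constant, assuming the two equivalent temperatures worked out above:

```python
# Sketch: peak emission wavelength via Wien's displacement law,
# comparing the true temperature (50 °C) with the equivalent
# blackbody temperature (~320.7 K, ~47.6 °C).

WIEN_B = 2.897771955e-3  # Wien displacement constant, m*K

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength of peak spectral radiance, in micrometres."""
    return WIEN_B / temp_k * 1e6

print(f"{peak_wavelength_um(323.15):.2f} um")  # 50 °C source: ~8.97 um
print(f"{peak_wavelength_um(320.70):.2f} um")  # equivalent blackbody: ~9.04 um
```

The peak moves by only a few hundredths of a micrometre here, but it shows the direction of the shift: lower equivalent temperature, longer peak wavelength.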