jdawg
TL;DR Summary: How is the total emissivity of a real material affected by temperature?

Hello,

I’m trying to better my understanding of how the total emissivity changes with temperature for ceramic materials. My current understanding is that non-metals typically have a high emissivity, that a sanded (roughened) surface results in a higher emissivity, and that spectral emissivity varies greatly with wavelength for non-metals.

For black bodies, you should expect the emitted energy to increase as temperature increases (molecules vibrating faster, therefore emitting more energy).
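One point worth separating out: a blackbody's emissivity is 1 by definition at every temperature; what grows with temperature is the emitted power, per the Stefan–Boltzmann law. A quick numeric sketch of that scaling (standard constant value, plain Python):

```python
# Stefan–Boltzmann law: total hemispherical exitance of a blackbody, M = sigma * T^4.
SIGMA = 5.670374419e-8  # Stefan–Boltzmann constant, W m^-2 K^-4

def blackbody_exitance(T):
    """Total power radiated per unit area (W/m^2) by a blackbody at temperature T (kelvin)."""
    return SIGMA * T**4

# Doubling the temperature multiplies the emitted power by 2^4 = 16,
# even though the blackbody's emissivity stays exactly 1 throughout.
ratio = blackbody_exitance(600.0) / blackbody_exitance(300.0)
```

So "more energy emitted at higher temperature" is about total power, not about emissivity itself.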

Of course, real materials don’t behave as black bodies... how would you expect the total emissivity to change as you increase the temperature in the following scenarios?

- Total emissivity integrated over the X-ray spectrum?
- Total emissivity integrated over the IR?
- Total emissivity integrated over the microwave?
- Total emissivity integrated over the RF?

I was reading a paper on the subject that I can’t seem to find again. It plotted total emissivity as a function of temperature, but the emissivity values dropped as the temperature increased... how is this possible? Does it have to do with the wavelengths they integrated over? I can’t remember what part of the spectrum the paper was looking at. Since a black body’s emission shifts toward higher frequencies as temperature increases, would you expect a smaller fraction of the energy to be emitted at longer wavelengths? Perhaps even a drop in band-limited emissivity values as temperature increases?
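That band effect alone can produce a falling curve: as temperature rises, Wien's law shifts the blackbody peak to shorter wavelengths, so a fixed long-wavelength band captures a shrinking fraction of the total output, and a band-limited "effective emissivity" can drop even if the material's spectral emissivity barely changes. A rough Python sketch, integrating Planck's law with a simple trapezoid rule (the 8–14 µm LWIR band is just an illustrative choice, not from the paper):

```python
import math

# Physical constants (SI)
H = 6.62607015e-34      # Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
KB = 1.380649e-23       # Boltzmann constant, J/K
SIGMA = 5.670374419e-8  # Stefan–Boltzmann constant, W m^-2 K^-4

def planck(lam, T):
    """Blackbody spectral exitance (W m^-3) at wavelength lam (m), temperature T (K)."""
    return (2 * math.pi * H * C**2) / (lam**5 * (math.exp(H * C / (lam * KB * T)) - 1))

def band_fraction(lam_lo, lam_hi, T, n=2000):
    """Fraction of total blackbody power emitted between lam_lo and lam_hi (trapezoid rule)."""
    step = (lam_hi - lam_lo) / n
    total = 0.0
    for i in range(n + 1):
        lam = lam_lo + i * step
        weight = 0.5 if i in (0, n) else 1.0
        total += weight * planck(lam, T)
    return total * step / (SIGMA * T**4)

# Fraction of blackbody emission landing in the 8–14 µm LWIR band:
f_300 = band_fraction(8e-6, 14e-6, 300.0)    # room temperature: roughly a third of the total
f_1000 = band_fraction(8e-6, 14e-6, 1000.0)  # hot ceramic: the band catches far less
```

Here `f_1000` comes out well below `f_300`, so an instrument (or an integration) restricted to that band would report a dropping effective emissivity with temperature for this reason alone.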

Please correct any inconsistencies in my line of thinking. Any information or papers that someone can share with me is greatly appreciated!
