Energy of Infrared Light

  • Thread starter JSGandora
Wikipedia said:
On 11 February 1800, Herschel was testing filters for the sun so he could observe sun spots. When using a red filter he found there was a lot of heat produced. Herschel discovered infrared radiation in sunlight by passing it through a prism and holding a thermometer just beyond the red end of the visible spectrum. This thermometer was meant to be a control to measure the ambient air temperature in the room. He was shocked when it showed a higher temperature than the visible spectrum. Further experimentation led to Herschel's conclusion that there must be an invisible form of light beyond the visible spectrum.
I thought IR light had the least energy, since UV light has the most (shorter wavelengths mean more energy). Yet the article says the IR light raised the air temperature the highest; what is happening here?
 

Drakkith
I thought IR light had the least energy, since UV light has the most (shorter wavelengths mean more energy). Yet the article says the IR light raised the air temperature the highest; what is happening here?
The IR radiation is directly heating the thermometer, not the air. UV light is almost entirely blocked by the atmosphere before ever reaching the surface.
 
JSGandora
Ah, then why does the IR heat the thermometer more than the visible light? E = hc/λ, so shorter wavelength means more energy, and IR has the longest wavelength of them all.
 
I suspect that mercury (assuming a mercury thermometer) might absorb IR radiation more efficiently than visible.
 

Drakkith
Ah, then why does the IR heat the thermometer more than the visible light? E = hc/λ, so shorter wavelength means more energy, and IR has the longest wavelength of them all.
I don't know how effectively visible light heats the thermometer compared to IR, but the point of the article was that the thermometer wasn't in the visible light at all, but in the IR light. Since IR light is invisible, Herschel didn't know it was there, and the increasing temperature of the thermometer enabled him to discover it.
 

DaveC426913
I don't know how effectively visible light heats the thermometer compared to IR, but the point of the article was that the thermometer wasn't in the visible light at all, but in the IR light. Since IR light is invisible, Herschel didn't know it was there, and the increasing temperature of the thermometer enabled him to discover it.
You're missing the point of the OP's question though:
...it showed a higher temperature than the visible spectrum...
 

Do you have a source for that?

Also, from this picture from Wikipedia, it says that IR waves are mostly absorbed by atmospheric gases, which further confuses me.
But Herschel held the thermometer just beyond the red end of the spectrum, and those wavelengths are mostly not absorbed.
 
JSGandora
Wait, what? The thermometer just beyond the red end must be in the infrared region, correct? I hope I'm not misinterpreting your statement.
 
Wait, what? The thermometer just beyond the red end must be in the infrared region, correct? I hope I'm not misinterpreting your statement.
Yes, and if you look at the link you gave, you'll see the atmosphere won't stop much of this near-infrared radiation.
 
JSGandora
You mean this link?
http://upload.wikimedia.org/wikipedi...ic_opacity.svg [Broken]

It says "Most of the infrared spectrum absorbed by atmospheric gases (best observed from space)". I think it is pretty clear in saying that the infrared spectrum is absorbed.
 
You mean this link?
http://upload.wikimedia.org/wikipedi...ic_opacity.svg [Broken]

It says "Most of the infrared spectrum absorbed by atmospheric gases (best observed from space)". I think it is pretty clear in saying that the infrared spectrum is absorbed.
Note the valley just to the right of red light, at 1 μm. That valley is in the infrared and is absorbed less than red light (although more than green light).

Remember that visible light is a small section of the spectrum. Infrared is a huge range in comparison. While much of that range is blocked by the atmosphere, there are specific frequencies that are not. These appear as valleys on that graphic.

Edit to add:
The link you gave only considers how much of each frequency is absorbed. It is also important how much is emitted. This link shows how much actually reaches sea level:
http://upload.wikimedia.org/wikipedia/commons/4/4c/Solar_Spectrum.png

There you see there is still a problem, since the highest point in the infrared is still lower than almost all of the visible light. I can only guess at an explanation:
1. The thermometer absorbed infrared more than visible.
2. The prism absorbed visible more than infrared.
3. The thermometer happened to be near the ends of the visible range, where the intensity falls lower than in the near infrared.

As I said, these are only guesses. I'll be interested to see if anyone else can provide a more certain answer.
 
JSGandora
Oh, so the small valley accounts for it having a higher temperature. Also, maybe the fact that the infrared is more concentrated than the other, more spread-out rays makes it heat the thermometer more than the green to violet light.
 

fluidistic
My thoughts:
Visible light passes through the glass of the thermometer and gets mostly reflected by the mercury, so overall it doesn't heat the thermometer much. However, infrared light is absorbed more strongly than visible light by the glass (I have no idea about the mercury, though). The glass then heats up, and since it's in direct contact with the mercury, the mercury heats up too, and the thermometer shows a greater temperature increase than it does in visible light.
 

Redbelly98
I thought IR light has the least energy since UV light has the most (shorter wavelengths more energy). The article states that the IR light makes the air temperature at the highest, what is happening here?
IR has less energy per photon than visible or UV. So if we had the same number of photons of IR, visible, and UV, then yes, the IR would have the least amount of energy.

But that doesn't apply here -- there are unequal numbers of photons of the different wavelengths, and so if there are enough photons of IR then it can have more energy than is in the visible portion of the spectrum.
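A quick numerical check of the per-photon picture Redbelly98 describes. The wavelengths below are just illustrative values for UV, visible, and near-IR, not anything from Herschel's setup:

```python
# Photon energy E = h*c/lambda: shorter wavelength -> more energy per photon,
# but total energy also depends on how many photons arrive.
h = 6.626e-34  # Planck constant, J*s
c = 2.998e8    # speed of light, m/s

for name, lam_nm in [("UV", 300), ("green", 550), ("near-IR", 1000)]:
    E = h * c / (lam_nm * 1e-9)   # energy of one photon, in joules
    photons_per_joule = 1.0 / E   # photons needed to deliver 1 J
    print(f"{name:8s} {lam_nm:5d} nm  {E:.3e} J/photon  "
          f"{photons_per_joule:.2e} photons per joule")
```

A 1000 nm photon carries about a third of the energy of a 300 nm photon, so a beam of IR needs roughly three times as many photons to deliver the same total energy; nothing stops it from doing so.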
 

A.T.
if there are enough photons of IR then it can have more energy than is in the visible portion of the spectrum.
The irradiance in the visible portion of the spectrum is greater than for IR:

Solar_Spectrum.png


From: http://en.wikipedia.org/wiki/Sunlight

But there are some holes due to filtering by the atmosphere. The prism might also have filtered more visible than IR light.

It should also be noted that the dispersion of light is not proportional to wavelength:

Dispersion-curve.png


Visible range is orange. From: http://en.wikipedia.org/wiki/Dispersion_(optics)

So holding the thermometer in the IR area might capture a wider range of wavelengths than in the visible and UV areas (more area under the first graph due to a wider wavelength range, despite lower irradiance).
 

fluidistic
But there are some holes due to filtering by the atmosphere. The prism might also have filtered more visible than IR light.
I don't think the holes really matter. He placed the thermometer right after the end of the visible spectrum, so just before the O2 hole.
According to wikipedia,
Fused quartz is used in the ultraviolet as normal glasses lose their transparency there.
but I can't find anything about the near infrared.
 

Redbelly98
The irradiance in the visible portion of the spectrum is greater than for IR:

http://upload.wikimedia.org/wikipedia/commons/4/4c/Solar_Spectrum.png" [Broken]
From: http://en.wikipedia.org/wiki/Sunlight

But there are some holes due to filtering by the atmosphere. The prism might also have filtered more visible than IR light.

It should also be noted that the dispersion of light is not proportional to wavelength:

http://upload.wikimedia.org/wikipedia/commons/2/20/Dispersion-curve.png" [Broken]
Visible range is orange. From: http://en.wikipedia.org/wiki/Dispersion_(optics)

So holding the thermometer in the IR area might capture a wider range of wavelengths than in the visible and UV areas (more area under the first graph due to a wider wavelength range, despite lower irradiance).
Yup, I was aware of the spectrum peaking in the visible, but since the OP's main confusion seemed to be about the energy per photon, I just wanted to clear that up. It seemed that nobody else was addressing that.

I thought about the lesser dispersion of n in the IR, and you bring up a good point that the transmission of the prism could have played a role. Another possibility is that the thermometer had a greater absorptivity for near-IR wavelengths -- effectively acting as a filter.
 
Also, from this picture from Wikipedia, it says that IR waves are mostly absorbed by atmospheric gases, which further confuses me.

http://upload.wikimedia.org/wikipedia/commons/3/34/Atmospheric_electromagnetic_opacity.svg
Wiki is correct, but "mostly" doesn't mean all. It is the infrared that escapes absorption by the atmosphere that warms you when you step from the shade into direct sunlight.

Almost all surfaces absorb different proportions of infrared radiation and visible radiation. This is usually measured by their visible light albedos and their infrared albedos.

Glass is largely transparent to visible light and largely opaque to infrared. Thus it transmits the one and absorbs the other. Mercury absorbs both. This is why the thermometer heated up more in the infrared than it did in the visible. It was absorbing more radiant energy per unit of surface area in the infrared zone.

Edit: Idle thought. Was Herschel observing indoors (through a glass window) or outdoors?
 

Redbelly98
Glass actually transmits pretty well up to 2 or 4 micron wavelengths, depending on the specific type of glass. If Herschel was looking just beyond the visible part of the spectrum, it wouldn't matter if he was observing through a glass window or not.

Here are some transmission curves for different glasses; note the expanded vertical scale on the right-hand portion of the graph:

http://www.edmundoptics.com/images/articles/glass-transmission-curve.gif [Broken]

(From http://www.edmundoptics.com/technical-support/optics/optical-glass/?&viewall [Broken] )​
 
Maybe the IR light heated up the air around the thermometer, and then the heat transferred from the air to the glass to the mercury. This could be in addition to many of the other ideas others have mentioned.
 

Drakkith
Maybe the IR light heated up the air around the thermometer, and then the heat transferred from the air to the glass to the mercury. This could be in addition to many of the other ideas others have mentioned.
The amount of infrared light absorbed by the air around the thermometer is minuscule. We can effectively ignore it for this discussion. If it were significant, one would have to ask how all that infrared light got through miles of atmosphere only to be absorbed by 1-2 feet of air.
 

fluidistic
Hey guys, I finally know the answer to this question.
I thought about the lesser dispersion of n in the IR
I think this is it.
Glass is largely transparent to visible light and largely opaque to infrared. Thus it transmits the one and absorbs the other. Mercury absorbs both. This is why the thermometer heated up more in the infrared than it did in the visible. It was absorbing more radiant energy per unit of surface area in the infrared zone.
I'm afraid this isn't the reason, because you can do the experiment with a blackened bulb (blackened with paint, for example) and you still find that green heats up more than blue, red heats up more than green, and infrared heats up more than any "color" in the visible spectrum.
So the mercury and glass aren't the reason; you can replace both and still get the effect.
The "true" reason was posted in 2004 in this forum (see https://www.physicsforums.com/showthread.php?t=14736). Direct link to the website explaining it: http://home.znet.com/schester/calculations/herschel/index.html.
I'll quote in case the website disappears.
In a famous experiment, Sir William Herschel discovered the infrared region of the solar spectrum in the year 1800. He used a glass prism to disperse the sun's rays and a thermometer to record the "temperature" of each of the wavelengths. To his surprise, he found that the highest reading of the thermometer was in a region beyond the reddest rays, and thus discovered the infra-red ("below-red"). See Discovery of Infrared.

On first consideration, this result is surprising. The energy peak of the solar spectrum is at 0.60 micron (orange light), and definitely not in the infrared. So why did Herschel observe the highest reading in the infrared?

The answer turns out to be the experimental design, and a failure to correct for refraction. In Herschel's setup, sunlight is refracted by a prism. The index of refraction of course must vary with wavelength so that the sunlight would be dispersed into its various colors. If the index of refraction varied linearly with wavelength, Herschel would not have needed to correct for that variation, since the wavelengths would be uniformly spaced along his table.

However, since the index of refraction varies non-linearly with wavelength, the wavelengths will not be uniformly spaced along Herschel's measuring table. The actual spacing of the wavelengths versus distance along his table for an incidence angle of 45° from air into glass shows that the infrared region is much more highly concentrated than optical wavelengths. (The plot shows the spacing along the spectrum divided by the distance from the prism. Hence to get the actual spacing in cm or inches, multiply by the distance from the prism in cm or inches.) The relative concentration factor is shown normalized to 0.60 micron.

The net result is that Herschel's observed "temperature" should then peak in the infrared. The energy vs. wavelength plot for the "45° glass" model given above shows that Herschel's observed "temperature" should keep increasing toward longer wavelengths. However, when properly corrected for the relation between wavelength and distance along his table, he should have finally published the "no wavelength concentration" curve shown in that plot!

A good referee should have caught this mistake.
P.S.: Here's a link with a blackened bulb and no mercury, still showing the effect, where you can see that the red color range is much narrower than the blue one: http://coolcosmos.ipac.caltech.edu/cosmic_classroom/classroom_activities/herschel_example.html.
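The non-linear dispersion argument in that quote can be sketched numerically. This is only an illustration: the Cauchy coefficients below are BK7-like guesses rather than Herschel's actual glass, and only the first air-to-glass refraction at 45° is modelled, but it shows the effect — the deflection angle changes much more slowly per unit wavelength in the IR, so a fixed-width thermometer bulb beyond red intercepts a wider slice of the spectrum:

```python
# With a Cauchy dispersion model n(lambda) = A + B/lambda^2, the refraction
# angle varies non-linearly with wavelength: dn/dlambda is large in the blue
# and small in the IR, so the IR end of the spectrum is spatially compressed.
import math

A, B = 1.5046, 0.0042            # illustrative BK7-like Cauchy coefficients (lambda in um)
incidence = math.radians(45.0)   # 45 degree incidence from air into glass

def n(lam_um):
    """Index of refraction at wavelength lam_um (micrometres)."""
    return A + B / lam_um**2

def refraction_angle(lam_um):
    """Snell's law at the first (air -> glass) surface."""
    return math.asin(math.sin(incidence) / n(lam_um))

def spread(lam_um, dlam=0.01):
    """Angular spread (radians) covered by a 10 nm slice of spectrum."""
    return abs(refraction_angle(lam_um + dlam) - refraction_angle(lam_um))

for name, lam in [("blue", 0.45), ("red", 0.65), ("near-IR", 1.00)]:
    print(f"{name:8s} {lam:.2f} um: angle per 10 nm = {spread(lam):.2e} rad")
```

The angular spread per 10 nm drops by roughly an order of magnitude between blue and the near IR, so equal lengths along Herschel's table beyond red collect energy from a far wider wavelength band than equal lengths in the visible.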
 

Drakkith
Ah, that makes sense fluidistic.
 
