What makes a light source "brighter"?

  • Thread starter: iamconfused
  • Tags: Light Source
AI Thread Summary
The discussion centers on the relationship between temperature and light intensity, particularly in the context of blackbody radiation. As temperature increases, Wien's displacement law and the Stefan-Boltzmann law together indicate that the peak wavelength shifts to shorter wavelengths and the total emission increases steeply. Higher temperatures also produce more photons, because molecules and atoms interact more often and with greater energies. The conversation then turns to how CO2 in the atmosphere absorbs and re-emits infrared light, affecting the intensity of radiation that escapes into space. The key correction: this absorption and re-emission is not itself blackbody radiation. CO2 re-emits absorbed IR in all directions, and the portion directed back down warms the surface, which is the greenhouse effect.
iamconfused
Hi all, I am hoping someone could clear up a concept that doesn't make sense to me. I am thinking in terms of a blackbody diagram, with intensity on the y-axis and wavelength on the x-axis. I understand that when you heat something to a higher temperature, the curves shift to shorter wavelengths and become narrower and taller (more intense). I know wavelength is inversely proportional to the energy. So the energy of a particular color of light will always be the same for that particular color. And intensity seems to be energy per unit area and time. So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at. So what makes it more intense at a higher temperature? Is it more photons being released? And if it's more photons per unit area and time, what exactly is making that happen? If you have a higher temperature object emitting light, what about higher temperature makes it release more photons?

To give some context, I'm learning about global warming and how the atmosphere absorbs IR light. Apparently the blackbody spectrum of the Earth will show colder curves because the atmosphere absorbs IR and re-radiates it into space from higher in the atmosphere, where it's colder. I'm wondering why, for a particular frequency of light, being colder makes its intensity go down. What does the molecule do to it?
 
Emissivity.
 
The University of Virginia link above has a lot of information.

When the temperature is increased, two things happen in parallel:

1) Wien's displacement law: λ_peak ∝ 1/T.
When the temperature increases, the peak wavelength goes down and the peak frequency goes up proportionally.
Coupled with E = hc/λ, the energy of a photon at the peak goes up proportionally with T.

2) Stefan-Boltzmann law: E = σT⁴.
When the temperature increases, the total emission goes up as the fourth power of T. This can be a huge factor: even a modest increase in temperature increases the emission by a lot because of the T⁴ dependence.

Effectively, if you (could) increase the temperature of a blackbody emitter by 20 percent, the energy of a photon at the peak frequency goes up by 20 percent, while the total emission becomes 1.2⁴ ≈ 2.07 times the original.
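To make those numbers concrete, here is a minimal sketch in Python (my own illustration; the 300 K starting temperature is an arbitrary example, and the constants are the standard values):

```python
# Minimal sketch: how Wien's law and the Stefan-Boltzmann law scale
# when a blackbody's temperature rises by 20 percent.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.898e-3  # Wien's displacement constant, m K

T1 = 300.0         # starting temperature in K (arbitrary example)
T2 = 1.2 * T1      # 20 percent hotter

lam1, lam2 = WIEN_B / T1, WIEN_B / T2   # peak wavelengths (Wien)
j1, j2 = SIGMA * T1**4, SIGMA * T2**4   # total emission per unit area

print(f"peak wavelength: {lam1 * 1e6:.2f} um -> {lam2 * 1e6:.2f} um")
print(f"peak photon energy up by factor {lam1 / lam2:.2f}")
print(f"total emission up by factor {j2 / j1:.3f} (= 1.2**4)")
```

Running it gives a peak shift from about 9.66 um to 8.05 um, a 1.2x rise in peak photon energy, and a 2.074x rise in total emission.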
 
iamconfused said:
So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at. So what makes it more intense at a higher temperature? Is it more photons being released?

Yes, it's more photons being released per unit of time.

iamconfused said:
And if it's more photons per unit area and time, what exactly is making that happen? If you have a higher temperature object emitting light, what about higher temperature makes it release more photons?

The molecules and atoms are interacting more often and with greater energies, which shifts the spectrum to higher frequencies and generates more photons on average. For example, a molecule with a tail that 'whips' around as it vibrates can interact with the surrounding molecules more often and will have more energy as the temperature increases.

There are other effects as well. Take two charged particles and shoot them towards each other. You'll find that the higher the relative velocity, the more EM radiation the two emit during their interaction and the higher its frequency. A free-electron laser works on this principle. A toy illustration of the velocity dependence is sketched below.
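Here's that toy illustration (my own sketch, not anything from the thread): fire an electron head-on at a fixed like charge, integrate its motion under Coulomb repulsion, and total the energy radiated according to the non-relativistic Larmor formula. Doubling the approach speed lets the electron get closer, accelerate harder, and radiate far more energy:

```python
# Toy model: head-on Coulomb 'bounce' of an electron off a fixed like charge.
# The Larmor formula P = q^2 a^2 / (6 pi eps0 c^3) gives the radiated power;
# a faster approach means a closer turnaround, larger accelerations, and
# much more radiated energy. (Classical, non-relativistic sketch only.)
import math

Q    = 1.602e-19    # elementary charge, C
M_E  = 9.109e-31    # electron mass, kg
K    = 8.988e9      # Coulomb constant, N m^2 C^-2
EPS0 = 8.854e-12    # vacuum permittivity, F/m
C    = 2.998e8      # speed of light, m/s

def radiated_energy(v0, r0=1e-8, dt=1e-19):
    """Integrate the approach and retreat, totaling the Larmor-radiated energy."""
    r, v, e_rad = r0, -v0, 0.0      # start at r0, moving inward
    while r <= r0:                  # loop ends once the electron returns past r0
        a = K * Q**2 / (M_E * r**2)                         # repulsive acceleration
        e_rad += Q**2 * a**2 / (6 * math.pi * EPS0 * C**3) * dt
        v += a * dt                                         # semi-implicit Euler step
        r += v * dt
    return e_rad

for v0 in (1e6, 2e6):  # approach speeds in m/s (well below c)
    print(f"v0 = {v0:.0e} m/s -> radiated energy ~ {radiated_energy(v0):.2e} J")
```

The absolute numbers come out tiny (classical bremsstrahlung is weak at these speeds); the point is how steeply the radiated energy grows with the approach speed.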
 
iamconfused said:
I know wavelength is inversely proportional to the energy... So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at.

Wait, are you talking about the energy of a single photon? Yes, that's inversely proportional to wavelength. But that doesn't mean every source of the same wavelength is dimmer than every source of a shorter wavelength. If my long-wavelength photons are 2 eV and my short-wavelength photons are 3 eV, then 10 long-wavelength photons (20 eV total) have more energy than 5 short-wavelength photons (15 eV total). I can have a red light (long wavelength, low energy photons) which is brighter than a blue light (short wavelength, high energy photons).

I'm pretty sure that as temperature goes up, the amount of energy at EVERY WAVELENGTH in a blackbody goes up. Like so:
https://simple.wikipedia.org/wiki/B...a/File:Black-body_radiation_vs_wavelength.png
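To check this numerically, here's a minimal sketch (my own illustration, not from the linked figure) that evaluates Planck's law at the 500 nm example from the question for several temperatures; the spectral radiance climbs monotonically with temperature:

```python
# Minimal sketch: Planck's law B(lam, T) at a fixed wavelength for several
# temperatures. At every wavelength, a hotter blackbody is brighter.
import math

H  = 6.626e-34   # Planck constant, J s
C  = 2.998e8     # speed of light, m/s
KB = 1.381e-23   # Boltzmann constant, J/K

def planck(lam, T):
    """Spectral radiance in W sr^-1 m^-3 at wavelength lam (m), temperature T (K)."""
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * KB * T))

lam = 500e-9  # the 500 nm example from the question
for T in (3000, 4000, 5000, 6000):
    print(f"T = {T} K: B(500 nm) = {planck(lam, T):.3e} W sr^-1 m^-3")
```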
 
RPinPA said:
Wait, are you talking about the energy of a single photon?

I guess I'm not really thinking about a single photon, but a group of photons of one frequency of light. Let's say we have a group of photons whose frequency is all the same, and that CO2 absorbs that particular frequency. I was told that if the CO2 was warm (such as in a lower part of the atmosphere), the CO2 would absorb those photons and then re-emit them at the same frequency, and that since the CO2 is warm, it would re-emit those photons (with the same frequency) with a higher intensity than if the CO2 had been colder, because warmer blackbody curves have more intensity. So if that same group of photons was absorbed by cold CO2, that cold CO2 would re-emit photons of that frequency with less intensity, because the CO2 was colder.

In the case of the light leaving the Earth's atmosphere (as measured from space), if a group of photons originally emitted from the ground were absorbed in the bottom/warmer part of the atmosphere, they would get re-emitted but are likely to be absorbed again by colder CO2 above, until eventually the radiation escapes into space. So measuring the intensity at that particular frequency would look colder, because those photons were last emitted by colder CO2.

So if the intensity relates to how many photons are being released per second, then I'm imagining that if the bottom of the atmosphere, where it's warm, is emitting say 10 of these photons per second, and those 10 photons get absorbed higher up by colder CO2 and re-emitted, the CO2 would now be releasing fewer than 10 photons per second. It sounds to me like photons are getting lost/destroyed in the system? Or perhaps colder molecules release the photons more slowly or something?
 
iamconfused said:
I was told that if the CO2 was warm (such as in a lower part of the atmosphere), the CO2 would absorb those photons and then re-emit them at the same frequency, and that since the CO2 is warm, it would re-emit those photons (with the same frequency) with a higher intensity than if the CO2 had been colder, because warmer blackbody curves have more intensity. So if that same group of photons was absorbed by cold CO2, that cold CO2 would re-emit photons of that frequency with less intensity, because the CO2 was colder.

No, that's not right. This kind of absorption and re-emission isn't blackbody radiation. The total emission by the CO2 would be the sum of the absorbed light that's been re-emitted plus the inherent blackbody emission (which increases in frequency and intensity with increasing temperature).

What makes CO2 a good insulator, as far as I understand, is that when it absorbs outgoing light from the surface, it re-emits it in all directions. So a portion of this light simply shines back down onto the surface. This in turn increases the temperature of the surface (since it's now absorbing both sunlight and this returned IR light). The net effect here is to raise the temperature of the surface until an equilibrium is reached. Note that the majority of the light from the Sun falls outside the range absorbed by CO2 while, in contrast, a much greater proportion of the Earth's emitted thermal radiation falls within the CO2 absorption region. This is why the 'blanket' effect only works one way, which is to warm the Earth.
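For a feel of the numbers, here is the standard textbook one-layer greenhouse model as a minimal sketch (an idealization, not this thread's own calculation: the layer is assumed transparent to sunlight but absorbs all outgoing IR, re-emitting equally up and down):

```python
# One-layer greenhouse sketch. Energy balance:
#   top of atmosphere: absorbed solar S = sigma * T_atm**4   (what space sees)
#   surface:           S + sigma * T_atm**4 = sigma * T_surf**4
# so T_surf = 2**0.25 * T_eff: the surface warms, while the radiation escaping
# to space comes from the colder layer, not the warmer ground.
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 240.0          # globally averaged absorbed solar flux, W/m^2 (approximate)

T_eff = (S / SIGMA) ** 0.25    # effective temperature with no atmosphere, ~255 K
T_atm = T_eff                  # the layer itself radiates S to space
T_surf = 2 ** 0.25 * T_eff     # equilibrium surface temperature, ~303 K

print(f"effective (no-atmosphere) temperature: {T_eff:.0f} K")
print(f"one-layer surface temperature:         {T_surf:.0f} K")
```

The real atmosphere absorbs only part of the outgoing IR, so the actual surface average (~288 K) sits between the two values, but the sketch captures both effects discussed above: a warmer surface, and emission to space that looks like the colder, higher layer.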
 
