What makes a light source "brighter"?

In summary: a warmer CO2 molecule releases more photons at that particular frequency, and the blackbody diagram shows the curve shifted toward shorter wavelengths. When the temperature is increased, the intensity at any particular frequency goes up because more photons are being released per unit area and time.
  • #1
iamconfused
Hi all, I am hoping someone could clear up a concept that doesn't make sense to me. I am thinking in terms of a blackbody diagram, with intensity on the y-axis and wavelength on the x-axis. I understand that when you heat something to a higher temperature, the curve shifts to shorter wavelengths and becomes narrower and taller (more intense). I know wavelength is inversely proportional to energy, so the energy of a particular color of light will always be the same for that color. And intensity seems to be energy per unit area and time.

So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at. So what makes it more intense at a higher temperature? Is it more photons being released? And if it's more photons per unit area and time, what exactly is making that happen? If you have a higher temperature object emitting light, what about the higher temperature makes it release more photons?

To give some context, I'm learning about global warming and how the atmosphere absorbs IR light. Apparently the blackbody spectrum of the Earth shows colder curves because the atmosphere absorbs IR and re-radiates it into space from higher in the atmosphere, where it's colder. I'm wondering why, for a particular frequency of light, being colder makes its intensity go down. What does the molecule do to it?
 
  • #3
Emissivity.
 
  • #4
The University of Virginia link above has a lot of information.

When the temperature is increased, two things take place in parallel:

1) Wien's displacement law: λ_peak ∝ 1/T
When the temperature increases, the peak wavelength goes down and the peak frequency goes up proportionally. Coupled with E = hc/λ, the average photon energy goes up proportionally.

2) Stefan-Boltzmann law: E = σT⁴
When the temperature increases, the total emission goes up as T⁴. This can be a huge factor: even a slight increase in temperature increases the emission by a lot because of the fourth-power dependence.

Effectively, if you (could) increase the temperature of a blackbody emitter by 20 percent, the average energy of the photons (at the peak frequency) goes up by 20 percent, and the total emission becomes 1.2⁴ ≈ 2.07 times the original.
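Those two scaling laws can be checked with a short Python sketch (the function name is mine, for illustration only):

```python
# Scaling of blackbody emission with temperature (a minimal sketch).
# Wien: lambda_peak ∝ 1/T, so the peak photon energy E = hc/lambda ∝ T.
# Stefan-Boltzmann: total emitted power per unit area ∝ T^4.

def blackbody_scaling(temp_ratio: float) -> tuple[float, float]:
    """Return (peak-photon-energy ratio, total-emission ratio) when the
    absolute temperature is multiplied by temp_ratio."""
    peak_energy_ratio = temp_ratio     # Wien's displacement law
    emission_ratio = temp_ratio ** 4   # Stefan-Boltzmann law
    return peak_energy_ratio, emission_ratio

# 20 percent hotter: peak photon energy up 20%, total emission about 2.07x.
energy, emission = blackbody_scaling(1.20)
print(f"peak photon energy: x{energy:.2f}, total emission: x{emission:.2f}")
```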
 
  • #5
iamconfused said:
So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at. So what makes it more intense at a higher temperature? Is it more photons being released?

Yes, it's more photons being released per unit of time.

iamconfused said:
And if it's more photons per unit area and time, what exactly is making that happen? If you have a higher temperature object emitting light, what about higher temperature makes it release more photons?

The molecules and atoms are interacting more often and with greater energies, which shifts the spectrum to higher frequencies and generates more photons on average. For example, a molecule with a tail that 'whips' around as it vibrates can interact with the surrounding molecules more often and will have more energy as the temperature increases.

There are other effects as well. Take two charged particles and shoot them towards each other. You'll find that the higher the relative velocity, the more EM radiation the two emit during their interaction and the higher the frequency. A free-electron laser works off of this principle.
 
  • #6
iamconfused said:
I know wavelength is inversely proportional to the energy... So just thinking about one particular frequency of light, let's say 500 nm as an example, the energy of 500 nm light should be the same no matter the temperature it's at.

Wait, are you talking about the energy of a single photon? Yes, that's inversely proportional to wavelength. But that doesn't mean every source of the same wavelength is dimmer than every source of a shorter wavelength. If my long-wavelength photons are 2 eV and my short-wavelength photons are 3 eV, then 10 long-wavelength photons (20 eV total) have more energy than 5 short-wavelength photons (15 eV total). I can have a red light (long wavelength, low energy photons) which is brighter than a blue light (short wavelength, high energy photons).

I'm pretty sure that as temperature goes up, the amount of energy at EVERY WAVELENGTH in a blackbody goes up. Like so:
https://simple.wikipedia.org/wiki/B...a/File:Black-body_radiation_vs_wavelength.png
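That claim can be checked directly against Planck's law. Here is a minimal Python sketch (the constants are standard; the function name and the two example temperatures are mine):

```python
import math

# Planck's law: spectral radiance B(lambda, T) in W per (sr · m^3).
H = 6.626e-34   # Planck constant, J·s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def spectral_radiance(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance at one wavelength and temperature."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

# Fixed wavelength (500 nm, as in post #1), two temperatures:
for t in (4000, 5000):
    print(t, "K ->", f"{spectral_radiance(500e-9, t):.3e}")
# The 5000 K value is larger: the hotter curve lies above the cooler one
# at this (and every) wavelength, i.e. more photons per second at 500 nm.
```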
 
  • #7
RPinPA said:
Wait, are you talking about the energy of a single photon?

I guess I'm not really thinking about a single photon, but a group of photons of one frequency. Let's say we have a group of photons whose frequency is all the same, and that CO2 absorbs that particular frequency. I was told that if the CO2 was warm (such as in a lower part of the atmosphere), the CO2 would absorb those photons and then re-emit them at the same frequency, and that since the CO2 is warm, it would re-emit those photons (at the same frequency) with a higher intensity than if the CO2 had been colder, because warmer blackbody curves have more intensity. So if that same group of photons was absorbed by cold CO2, that cold CO2 would re-emit photons of that frequency with less intensity, because the CO2 was colder.

In the case of the light leaving the Earth's atmosphere (as measured from space), if a group of photons originally emitted from the ground is absorbed in the bottom/warmer part of the atmosphere, they get re-emitted but are likely to be absorbed again by colder CO2 above, until eventually the radiation escapes into space. So measuring the intensity at that particular frequency would look colder, because those photons were last absorbed and re-emitted by colder CO2.

So if the intensity relates to how many photons are being released per second, then I'm imagining that if the bottom of the atmosphere, where it's warm, is emitting say 10 of these photons per second, and these 10 photons get absorbed higher up by colder CO2 and re-emitted, then the CO2 would be releasing fewer than 10 photons per second. It sounds to me like photons are getting lost/destroyed in the system? Or perhaps colder molecules release the photons more slowly or something?
 
  • #8
iamconfused said:
I was told that if the CO2 was warm (such as in a lower part of the atmosphere), the CO2 would absorb those photons and then re-emit them at the same frequency, and that since the CO2 is warm, it would re-emit those photons (at the same frequency) with a higher intensity than if the CO2 had been colder, because warmer blackbody curves have more intensity. So if that same group of photons was absorbed by cold CO2, that cold CO2 would re-emit photons of that frequency with less intensity, because the CO2 was colder.

No, that's not right. This kind of absorption and re-emission isn't blackbody radiation. The total emission from the CO2 is the sum of the absorbed light that's been re-emitted plus the inherent blackbody emission (which increases in frequency and intensity with increasing temperature).

What makes CO2 a good insulator, as far as I understand, is that when it absorbs outgoing light from the surface, it re-emits it in all directions. So a portion of this light simply shines back down onto the surface. This in turn increases the temperature of the surface (since it's now absorbing both sunlight and this returned IR light). The net effect here is to raise the temperature of the surface until an equilibrium is reached. Note that the majority of the light from the Sun falls outside the range absorbed by CO2 while, in contrast, a much greater proportion of the Earth's emitted thermal radiation falls within the CO2 absorption region. This is why the 'blanket' effect only works one way, which is to warm the Earth.
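The one-way 'blanket' effect can be illustrated with a toy single-layer atmosphere model. This is only a sketch under strong assumptions (not from this thread): the layer is taken to be transparent to sunlight but fully absorbing in the IR, re-emitting half of what it absorbs upward and half back down:

```python
# Toy single-layer greenhouse model (a sketch, assuming the atmosphere
# layer is transparent to sunlight and a perfect absorber/emitter of IR).
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W·m^-2·K^-4
S = 1361.0         # solar constant, W/m^2
ALBEDO = 0.3       # fraction of sunlight reflected back to space

absorbed = S * (1 - ALBEDO) / 4   # absorbed sunlight averaged over the sphere

# No atmosphere: the surface radiates the absorbed flux straight to space.
t_bare = (absorbed / SIGMA) ** 0.25

# One absorbing layer: half of the surface's IR is returned, so in
# equilibrium the surface must emit twice the absorbed solar flux.
t_layer = (2 * absorbed / SIGMA) ** 0.25

print(f"no atmosphere: {t_bare:.0f} K, one-layer greenhouse: {t_layer:.0f} K")
# -> roughly 255 K vs 303 K: the returned IR raises the surface temperature.
```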
 

FAQ: What makes a light source "brighter"?

1. What is the definition of "brightness" when referring to a light source?

Brightness is the perceived intensity of light emitted from a source. It is a subjective measure and can vary depending on factors such as distance and the sensitivity of the observer's eyes.

2. How does the wattage of a light source affect its brightness?

The wattage of a light source measures its power consumption, not its brightness. In general, though, for sources of the same type, a higher-wattage light source will produce more light and appear brighter than a lower-wattage one.

3. Can the color of a light source impact its perceived brightness?

Yes, the color or wavelength of light affects its perceived brightness. The human eye is most sensitive to green light near 555 nm, so a green source appears brighter than a red or violet source of the same radiant power.

4. How does the distance from a light source impact its brightness?

The farther away an observer is from a light source, the less bright it appears. Light from a point source spreads over a sphere of area 4πr², so intensity falls off as the inverse square of the distance.
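A minimal Python sketch of that inverse-square relationship (the function name is mine, for illustration only):

```python
import math

# Intensity of an isotropic point source at a given distance.
def intensity(power_w: float, distance_m: float) -> float:
    """Power per unit area on a sphere of radius distance_m."""
    return power_w / (4 * math.pi * distance_m ** 2)

# Doubling the distance quarters the intensity:
print(f"{intensity(100, 1.0):.2f} W/m^2")  # ~7.96
print(f"{intensity(100, 2.0):.2f} W/m^2")  # ~1.99
```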

5. Is there a limit to how bright a light source can be?

Yes, there is a limit to how bright a given source can be. Its maximum luminance (luminous intensity per unit area) is set by the physical properties of the emitter; for a thermal source, for example, it is capped by the temperature of the emitting surface.
