Why can you see stars (1/r or 1/r^2 dropoff of power)?

In summary, the intensity of light from a distant star decreases as 1/r^2, consistent with energy conservation over an expanding sphere (a Gauss's-law-style flux argument), even though the electric and magnetic field amplitudes decrease as 1/r. This is because the intensity is proportional to the square of the field.
  • #1
xerxes73
I read somewhere that people can see stars because an electromagnetic wave drops off by 1/r, and therefore the power delivered by the electromagnetic wave stays strong enough to activate the receptors in your eye. I believe this 1/r relation was realized by Maxwell when he was analyzing and combining the Maxwell equations. But, even still, the intensity or flux of light waves should be dropping off by 1/r^2, because (by a Gauss's-law-style flux argument) the light from that distant star all passes through a notional sphere that gets bigger the farther one gets from the star, and the sphere's area grows as r^2 with distance r. So why doesn't the intensity of the light drop off by 1/r^2, or does it? Even though the field itself decreases by 1/r, the intensity still has to follow this 1/r^2 relation.

Your eye being triggered depends on the power delivered by that electromagnetic radiation right? If this is true, what would be the equation for this? Dependent on 1/r or 1/r^2?

If someone could just generally shed some illumination on this overall situation and what is going on here with 1/r versus 1/r^2 and when one or the other matters, I would greatly appreciate it. Thanks! -xerxes73
 
  • #2
xerxes73 said:
So why doesn't the intensity of the light drop off by 1/r^2 or does it? [...] Your eye being triggered depends on the power delivered by that electromagnetic radiation right? If this is true, what would be the equation for this? Dependent on 1/r or 1/r^2?

The intensity is proportional to the square of the field. So if the field goes like 1/r, the intensity goes like 1/r^2.

This is the case for light radiating from a localized source (like a star); the electric and magnetic fields both depend on the distance from the star as
[tex]
E\sim\frac{1}{r}e^{ikr}\;,
[/tex]
where ck is the angular frequency of the light.

As for the intensity:
[tex]
I\sim |E|^2\sim \frac{1}{r^2}\;.
[/tex]
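To make the scaling concrete, here is a minimal numerical sketch (mine, not from the thread) of the statement above: if the field amplitude goes like 1/r, then the intensity |E|^2 goes like 1/r^2, so doubling the distance halves the field but quarters the intensity.

```python
# Quick numerical check that a 1/r field amplitude gives a 1/r^2 intensity.
# E0 and r are in arbitrary units; only the scaling matters here.

def field_amplitude(r, E0=1.0):
    """Far-field amplitude of a spherical wave: |E| ~ E0 / r."""
    return E0 / r

def intensity(r, E0=1.0):
    """Intensity ~ |E|^2, so it scales as 1/r^2."""
    return field_amplitude(r, E0) ** 2

r = 3.0
ratio_field = field_amplitude(2 * r) / field_amplitude(r)    # 0.5
ratio_intensity = intensity(2 * r) / intensity(r)            # 0.25
```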
 
  • #3


The 1/r versus 1/r^2 question comes down to the inverse square law, which states that the intensity of light from a point source (such as a star) decreases as the square of the distance from the source. Note that it is the field amplitude, not the power, that falls off as 1/r. Either way, the farther away we are from a star, the less intense its light appears to us.

Now, when it comes to seeing stars, it is important to understand that our eyes have receptors called rods and cones that are specifically designed to detect light. These receptors are sensitive to the power of the electromagnetic radiation that reaches them. This means that the intensity of the light reaching our eyes is crucial for us to be able to see stars.

So, why can we still see stars even though the intensity of the light decreases with distance? The key is that the two falloffs describe different quantities. The electric and magnetic field amplitudes of the radiated wave fall off as 1/r. The intensity (power per unit area) is proportional to the square of the field amplitude, so it falls off as 1/r^2; equivalently, the emitted photons spread out over a sphere whose area grows as r^2, so the number of photons crossing a fixed area per second also falls as 1/r^2. There is no conflict: the 1/r amplitude and the 1/r^2 intensity are the same physics stated two ways.

In summary, the 1/r falloff applies to the field amplitude and the 1/r^2 falloff applies to the intensity and photon flux; they are two descriptions of the same energy conservation. Stars are visible simply because, even after the 1/r^2 dilution, enough photons per second still reach the receptors in your eye. I hope this explanation helps to shed some light on the situation.
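As a back-of-envelope illustration (my own round numbers, not from the thread; the luminosity, photon energy, and pupil size are assumed illustrative values), here is roughly how many photons per second a Sun-like star 10 light-years away delivers to a dark-adapted pupil:

```python
import math

# Illustrative round values (assumptions, not measured data):
L = 3.8e26           # luminosity of a Sun-like star, W
r = 10 * 9.46e15     # distance: 10 light-years, in meters
E_photon = 3.6e-19   # energy of a ~550 nm visible photon, J
pupil_d = 7e-3       # dark-adapted pupil diameter, m

flux = L / (4 * math.pi * r**2)                # W/m^2 at the observer (1/r^2 law)
pupil_area = math.pi * (pupil_d / 2) ** 2      # m^2
photons_per_s = flux * pupil_area / E_photon   # photons entering the eye per second
```

Even after the 1/r^2 dilution, the eye still receives on the order of a few hundred thousand visible photons per second, comfortably above the handful of photons per integration time that rod cells need to register light.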
 

1. Why do stars appear to be dimmer the further away they are?

This is due to the inverse square law, which states that the intensity of light decreases as the square of the distance from the source increases. In simpler terms, as the distance between an observer and a star increases, the amount of light reaching the observer decreases in proportion to 1/r^2 (quadratically, not exponentially), resulting in a dimmer appearance.
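A tiny sketch of the law just stated (the helper name is my own, for illustration): apparent brightness falls with the square of the distance, so tripling the distance makes the same star appear nine times dimmer.

```python
import math

def apparent_brightness(luminosity, distance):
    """Inverse square law: b = L / (4*pi*r^2), a quadratic (not exponential) falloff."""
    return luminosity / (4 * math.pi * distance**2)

# Same star (L = 1.0, arbitrary units) at distance 1 versus distance 3:
dimming = apparent_brightness(1.0, 1.0) / apparent_brightness(1.0, 3.0)  # 9.0
```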

2. How does the distance to a star affect its brightness?

The distance between a star and its observer has a direct impact on the star's apparent brightness. As that distance increases, the apparent brightness decreases, because the star's light spreads out over a larger area, reducing its intensity at the observer.

3. Why does the brightness of stars vary?

Stars vary in brightness due to a variety of factors, including their size, temperature, and distance from Earth. Generally, larger and hotter stars appear brighter, while smaller and cooler stars appear dimmer. Stars may also vary in brightness over time due to changes in their luminosity, caused by processes such as pulsations, eruptions, or changes in their surface temperature.

4. What is the significance of the 1/r or 1/r^2 dropoff of power in relation to stars?

The 1/r and 1/r^2 dropoffs are a mathematical statement of how radiation from a point source behaves: the field amplitude falls as 1/r while the power per unit area falls as 1/r^2, which is the inverse square law and explains why stars appear dimmer the farther away they are. It is a fundamental principle in physics and is crucial in understanding the behavior of light and other forms of radiation in our universe.

5. How does the inverse square law apply to other astronomical objects besides stars?

The inverse square law applies to any localized source whose emission spreads out in all directions, including planets, moons, and galaxies that emit or reflect light. It also applies to other forms of radiation, such as sound from a small source and the energy flux of gravitational waves. Any roughly isotropic point-like source of energy follows this law, making it a fundamental concept in many areas of science.
