Why can you see stars (1/r or 1/r^2 dropoff of power)?

  • Context: Undergrad
  • Thread starter: xerxes73
  • Tags: Power, Stars

SUMMARY

The discussion centers on the relationship between electromagnetic wave intensity and distance, specifically the 1/r versus 1/r^2 drop-off. It is established that while the electric field of light from a star decreases with distance as 1/r, the intensity, which is proportional to the square of the electric field, decreases as 1/r^2. This is consistent with Gauss' law: the light spreads over a notional sphere whose area grows as r^2 with distance. The confusion comes from conflating the field amplitude, which falls as 1/r, with the power delivered to the eye, which follows the 1/r^2 relationship.

PREREQUISITES
  • Understanding of electromagnetic waves and their properties
  • Familiarity with Maxwell's equations
  • Knowledge of Gauss' law and its implications for light intensity
  • Basic concepts of wave intensity and field strength
NEXT STEPS
  • Study Maxwell's equations in detail to grasp electromagnetic wave behavior
  • Explore Gauss' law and its applications in optics
  • Investigate the mathematical relationship between electric field strength and intensity
  • Learn about the perception of light and how it relates to human vision
USEFUL FOR

Students of physics, optical engineers, and anyone interested in the principles of light propagation and perception.

xerxes73
I read somewhere that people can see stars because an electromagnetic wave drops off by 1/r, so the power delivered by the wave stays strong enough to activate the receptors in your eye. I believe this 1/r relation was realized by Maxwell when he was analyzing and combining his equations. But even so, the intensity or flux of the light should drop off by 1/r^2, because (by Gauss' law) the light from that distant star all passes through a notional sphere that gets bigger the farther one gets from the star; the sphere's area grows as r^2 with distance r. So why doesn't the intensity of the light drop off by 1/r^2, or does it? Even though the field itself decreases by 1/r, the intensity still has to follow this 1/r^2 relation.
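In symbols, the sphere argument above reads as follows (assuming the star radiates a total power P isotropically, which is an idealization):

$$P = I(r)\cdot 4\pi r^2 \quad\Longrightarrow\quad I(r) = \frac{P}{4\pi r^2}\propto \frac{1}{r^2}\;.$$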

Your eye being triggered depends on the power delivered by that electromagnetic radiation, right? If so, what is the equation for this? Does it depend on 1/r or 1/r^2?

If someone could just generally shed some illumination on this overall situation and what is going on here with 1/r versus 1/r^2 and when one or the other matters, I would greatly appreciate it. Thanks! -xerxes73
 

The intensity is proportional to the square of the field, so if the field goes like 1/r, the intensity goes like 1/r^2.

This is the case for light radiating from a localized source (like a star); the electric and magnetic fields both depend on the distance from the star as
$$E \sim \frac{1}{r}\,e^{ikr}\;,$$
where ck is the angular frequency of the light.

As for the intensity:
$$I \sim |E|^2 \sim \frac{1}{r^2}\;.$$
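The two scalings can be checked numerically. A minimal sketch (the amplitude E0 = 1 at r = 1 is a placeholder, not a physical value):

```python
import math

def field_amplitude(r, E0=1.0):
    """|E| ~ E0 / r for a spherical wave far from a localized source."""
    return E0 / r

def intensity(r, E0=1.0):
    """Intensity goes as the square of the field: I ~ |E|^2 ~ 1/r^2."""
    return field_amplitude(r, E0) ** 2

# Doubling the distance halves the field but quarters the intensity.
assert math.isclose(field_amplitude(2.0) / field_amplitude(1.0), 0.5)
assert math.isclose(intensity(2.0) / intensity(1.0), 0.25)

# Gauss-style check: intensity times sphere area (4*pi*r^2) is the total
# power through the sphere, which comes out the same at every radius.
powers = [intensity(r) * 4 * math.pi * r**2 for r in (1.0, 10.0, 100.0)]
assert all(math.isclose(p, powers[0]) for p in powers)
```

The last check is the resolution of the original question: the field falls as 1/r, but the quantity conserved over the expanding sphere (and the quantity your eye responds to, per unit area) is the 1/r^2 intensity.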
 
