Amplitude of light and intensity

SUMMARY

The amplitude of light emitted from a point source decreases as the distance from the source increases: in traveling from R to 2R, the amplitude falls by half. The decrease in amplitude and the decrease in intensity are two views of the same effect, related by the equations Energy = (1/2)kA^2 and Intensity = (Energy/time)/area. The total energy of the light remains constant while it spreads over a larger area, so the intensity (and with it the amplitude) diminishes at greater distances. This occurs because the wavefronts of a point source are spherical, so the power received at any fixed detector decreases as the distance increases.
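In symbols, a short derivation combining the two equations above, assuming only that the total emitted power P is conserved:

$$I(r) = \frac{P}{4\pi r^{2}}, \qquad I \propto A^{2} \;\Rightarrow\; A \propto \frac{1}{r}, \qquad \frac{I(2R)}{I(R)} = \frac{1}{4}, \quad \frac{A(2R)}{A(R)} = \frac{1}{2}.$$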

PREREQUISITES
  • Understanding of light wave properties, including amplitude and intensity
  • Familiarity with the equations for energy and intensity of light
  • Knowledge of point source light behavior and spherical wavefronts
  • Basic principles of energy conservation in wave propagation
NEXT STEPS
  • Study the relationship between amplitude and intensity in electromagnetic waves
  • Explore the concept of spherical wavefronts and their impact on light intensity
  • Learn about energy conservation in wave propagation and its implications
  • Investigate real-world applications of light intensity reduction in optics
USEFUL FOR

Students studying physics, particularly those focusing on optics and wave behavior, as well as educators explaining the principles of light propagation and intensity changes.

hangover

Homework Statement


Light is emitted from a point source. How does amplitude of light change when light has traveled from R to 2R?


Homework Equations


Energy = (1/2)kA^2, where k is a constant and A is the amplitude
Intensity = (Energy/time)/area


The Attempt at a Solution


My teacher said that:
Since area = 4πR^2, the intensity of light decreases when the light travels to 2R, and therefore the amplitude decreases. But isn't the energy constant while the intensity changes? I am confused because he said that the amplitude of light decreases as the light moves farther from the source.

What causes the phenomenon of light becoming dimmer at greater distances: a decrease in intensity, a decrease in amplitude, or both?

Thanks for answering my question.
 
It is the decrease in amplitude that results in the decrease in intensity (since the frequency is constant).
The energy of the light does not decrease as it spreads out; it becomes distributed over a larger area. (For a point or isotropic source, think of the wavefronts as expanding spherical shells.) This is why the intensity (and amplitude) detected farther from the source decreases: the power received at any given point is reduced as the wave spreads out and distributes its energy over a larger region.
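To make the scaling concrete, here is a minimal numerical sketch (the emitted power P and reference distance R are arbitrary placeholders; the constant k in Energy = (1/2)kA^2 cancels in the ratios):

```python
import math

def intensity(power, r):
    """Intensity of an isotropic point source at distance r:
    the emitted power spread over a sphere of area 4*pi*r^2."""
    return power / (4 * math.pi * r**2)

P = 1.0  # emitted power in watts (arbitrary; cancels in the ratios)
R = 1.0  # reference distance in metres (arbitrary)

I_R = intensity(P, R)
I_2R = intensity(P, 2 * R)

# Intensity is proportional to amplitude squared (Energy = (1/2)kA^2),
# so the amplitude scales as the square root of the intensity.
amp_ratio = math.sqrt(I_2R / I_R)

print(f"I(2R)/I(R) = {I_2R / I_R:.2f}")  # 0.25: intensity drops by a factor of 4
print(f"A(2R)/A(R) = {amp_ratio:.2f}")   # 0.50: amplitude halves
```

So the dimming at 2R is one phenomenon described two ways: the conserved energy spreads over four times the area, the intensity falls as 1/R^2, and the amplitude falls as 1/R.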
 
