Amplitude of light and intensity

  • Thread starter hangover
  • #1

Homework Statement


Light is emitted from a point source. How does amplitude of light change when light has travelled from R to 2R?


Homework Equations


Energy = (1/2)kA^2, where k is a constant and A is the amplitude
Intensity = (Energy/time)/area


The Attempt at a Solution


My teacher said that:
Since the wavefront area is 4*pi*R^2, the intensity of the light decreases as it travels from R to 2R, and therefore the amplitude decreases. But isn't the energy constant while the intensity changes? I am confused, because he said that the amplitude of the light decreases as it moves further from the source.

What causes light to become dimmer at greater distances: a decrease in intensity, a decrease in amplitude, or both?

Thanks for answering my question.
 

Answers and Replies

  • #2
It is the decrease in amplitude that results in the decrease in intensity (since the frequency is constant).
The energy of the light does not decrease as it spreads out; it is simply distributed over a larger area. (For a point, i.e. isotropic, source, think of the wavefronts as expanding spherical shells.) This is why the intensity (and hence the amplitude) detected further from the source is lower: the power received at any given point falls as the wave spreads its energy over a larger region.
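A quick numerical sketch of this reasoning (the source power P, reference distance R, and constant k below are made-up illustrative values, not from the problem): intensity falls as 1/r^2 because the same power crosses a sphere of area 4*pi*r^2, and since intensity is proportional to A^2, the amplitude falls as 1/r.

```python
import math

def intensity(power, r):
    """Intensity of an isotropic point source: power spread over a sphere of radius r."""
    return power / (4 * math.pi * r**2)

def amplitude(power, r, k=1.0):
    """Amplitude inferred from intensity ~ (1/2) k A^2 (k is an arbitrary constant)."""
    return math.sqrt(2 * intensity(power, r) / k)

P = 100.0  # hypothetical source power, watts
R = 1.0    # hypothetical reference distance, metres

# Going from R to 2R: intensity drops to 1/4, amplitude drops to 1/2.
print(intensity(P, 2 * R) / intensity(P, R))  # 0.25
print(amplitude(P, 2 * R) / amplitude(P, R))  # 0.5
```

So doubling the distance quarters the intensity but only halves the amplitude, consistent with the reply above: the total energy is conserved, yet the amplitude at any one point still decreases.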
 
