Spherical waves, Optics, phase, homework help please

heycoa
The question is as follows:

A point source emits a spherical wave with λ = 500 nm. If an observer is far away from the source and is only interacting with the light across a small area, one can approximate the local wave as a plane wave. How far from the source must the observer be so that the phase of the wave deviates by less than 36° over an illuminated spot 3.9 cm in diameter?

I honestly don't even know where to start with this one. I have all my homework done except for this problem. Please help!
 
Hello, heycoa. Try to give us some idea of what you understand and don't understand about the problem. Do you see why the phase of the wave differs for different points of the illuminated spot?
 
Well, I don't really understand how the phase differs. It says in the problem that the wave can be treated as a plane wave. I was thinking that the phase should remain the same throughout the lit spot.
 
heycoa said:
Well, I don't really understand how the phase differs. It says in the problem that the wave can be treated as a plane wave.
that's not all it says... it says that it can be treated as a plane wave across a small area. This bit is key.

The problem also tells you what kind of wave it is.
What kind of wave is it?

Can you sketch wave-fronts for such a wave? Do so.
The wavefronts are lines of equal phase.

If you draw a line some distance away from the source of the fronts - you can see how different parts of the line experience a different phase of the wave (not all the line is on the same wavefront).
 
Ok, so it's basically like the flat, square wave-fronts, but when they reach the circular area, the center of the wave-fronts hits the area first?
 
Yes, that's right. For which points of the illuminated spot is there the greatest phase difference?
 
To me it's very difficult to visualize. I am trying to work this out on paper, but I just cannot make the connection between the circular area and the phase difference. I do not know what denotes a difference in phase.
 
I am just making a guess, but is the difference in phase the greatest when the wave is bisecting the circular area? Because more of the wave is covered up than the rest? This seems to be a very abstract way of calculating a phase difference. Maybe I am just slow :(
 
Have you studied the connection between "path difference" and phase difference? Which point(s) of the illuminated spot is (are) closest to the source? Which is (are) farthest from the source?
 
  • #10
If I set an origin in the center of the illuminated area, then the point closest to the source is -1.95 cm, and the furthest would be 1.95 cm. I have not come across "path difference" yet.
 
  • #11
Path difference just refers to the difference in distance that light travels along two different paths. The attached picture shows the illuminated spot from a point source S. Which point of the spot is closest to S and which point(s) is farthest?
 

Attachments

  • Phase difference.jpg
  • #12
A is closest and B is furthest. It seems that I was totally confusing the setup. Also, I guess I did not realize that I could be using ray tracing. For some reason I was stuck with waves in my head.

Thank you, by the way, for spending so much time with me!
 
  • #13
So this means that the longer path will result in a phase shift, since it travels farther. I basically have to find the correct r that gives me a phase difference of 36 degrees. I was totally overthinking the problem!
 
  • #14
Actually, I need to find the correct distance from the source to the illuminated area.
 
  • #15
so can I use the angle between the hypotenuse and d? Does setting that angle to 36 degrees produce the correct phase shift?
 
  • #16
heycoa said:
so can I use the angle between the hypotenuse and d? Does setting that angle to 36 degrees produce the correct phase shift?

No, that's not going to get it.

The attached drawing shows a red wavefront just reaching the center of the spot and the path difference is shown in blue.

If the path difference h - d happened to equal one-half wavelength, then the phase difference between the light reaching A and the light reaching B would be 1/2 of 360° = 180°. (One full wavelength corresponds to a phase difference of 360°.)
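
In symbols, the relation being used here (writing the path difference as h − d) is
$$\Delta\phi = 360^\circ \times \frac{h - d}{\lambda}$$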

What fraction of a wavelength would the path difference need to be to get a phase difference of 36°?
 

Attachments

  • Phase difference path diff.jpg
  • #17
1/10 of the 500 nm wavelength
 
  • #18
Right. So you need to find the distance d such that h - d = λ/10. Hint: express h in terms of d and r using geometry.
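
To spell out the geometry (a sketch, with r = 1.95 cm for the radius of the spot):
$$h = \sqrt{d^2 + r^2}, \qquad \sqrt{d^2 + r^2} - d = \frac{\lambda}{10}$$
Since λ/10 is tiny compared with r (and r is tiny compared with d), you could also expand the square root to get the rough estimate d ≈ r²/(2·(λ/10)) = 5r²/λ, which is handy as a sanity check on the exact answer.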
 
  • #19
I don't know if I did this right, but I considered the path difference to be only 50 nm, then I used the Pythagorean theorem (working in meters): x^2 + 0.0195^2 = (x + 50*10^-9)^2. I solved for x and got 3,802.5 m. Does this appear correct to you?
 
  • #20
In this case x = d and (x + 50*10^-9) = h.
 
  • #21
heycoa said:
I don't know if I did this right, but I considered the path difference to be only 50 nm, then I used the Pythagorean theorem (working in meters): x^2 + 0.0195^2 = (x + 50*10^-9)^2. I solved for x and got 3,802.5 m. Does this appear correct to you?

Yes, I think that's right. (I'm a bit surprised at how large the distance d turns out to be. Over 2 miles! But the calculation appears correct.)

Good work.
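
If you want a quick numerical double-check, here is a short sketch in Python (my own check; the names lam, r, delta, and d are just for illustration):

lam = 500e-9                   # wavelength, in meters
r = 0.039 / 2                  # half of the 3.9 cm spot diameter, in meters
delta = (36.0 / 360.0) * lam   # allowed path difference = lam/10 = 50 nm

# Solve sqrt(d**2 + r**2) - d = delta exactly for d:
d = (r**2 - delta**2) / (2 * delta)
print(d)                       # about 3802.5 (meters), i.e. roughly 3.8 km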
 
  • #22
I cannot thank you enough for your time, patience, and energy in helping me. Thank you very, very much, I really appreciate it. You helped me learn something I should have already known, and you stuck with it. Thanks again, TSny!

P.S. How do I give you points or the medal for helping me?
 
  • #23
Glad to help. ( No points or medals around here. Wouldn't know what to do with 'em if I got 'em :blushing:)
 