
Spherical waves, Optics, phase, homework help please!

  1. Jan 26, 2013 #1
    The question is as follows:

    A point source emits a spherical wave with λ = 500 nm. If an observer is far away from the source and is only interacting with the light across a small area, one can approximate the local wave as a plane wave. How far from the source must the observer be so that the phase of the wave deviates by less than 36° over an illuminated spot 3.9 cm in diameter?

    I honestly don't even know where to start with this one. I have all my homework done except for this problem. Please help!
     
  3. Jan 26, 2013 #2

    TSny

    Homework Helper
    Gold Member

    Hello, heycoa. Try to give us some idea of what you understand and don't understand about the problem. Do you see why the phase of the wave differs for different points of the illuminated spot?
     
  4. Jan 26, 2013 #3
    Well, I don't really understand how the phase differs. It says in the problem that the wave can be treated as a plane wave, so I was thinking that the phase should remain constant throughout the lit spot.
     
  5. Jan 26, 2013 #4

    Simon Bridge

    Science Advisor
    Homework Helper
    Gold Member
    2016 Award

    That's not all it says... it says that it can be treated as a plane wave across a small area. This bit is key.

    The problem also tells you what kind of wave it is.
    What kind of wave is it?

    Can you sketch wave-fronts for such a wave? Do so.
    The wavefronts are lines of equal phase.

    If you draw a line some distance away from the source of the fronts - you can see how different parts of the line experience a different phase of the wave (not all the line is on the same wavefront).
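    The picture described here can also be put into numbers. A minimal sketch (Python; the source-to-line distance and the function name are just illustrative choices, not from the problem) of how the phase of a spherical wave varies along a flat line:

    ```python
    import math

    wavelength = 500e-9           # metres, from the problem statement
    k = 2 * math.pi / wavelength  # wavenumber

    # Hypothetical example distance from the point source to the flat line.
    d = 10.0  # metres

    def phase_lag_deg(x):
        """Accumulated phase lag (degrees, unwrapped) at transverse offset x,
        relative to the centre of the line: a spherical wavefront travels
        sqrt(d^2 + x^2) to reach offset x, but only d to reach the centre."""
        extra_path = math.hypot(d, x) - d
        return math.degrees(k * extra_path)
    ```

    The lag grows with |x|, which is exactly why not all of the line sits on the same wavefront.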
     
  6. Jan 26, 2013 #5
    Ok, so it's basically like flat square wave-fronts, but when they reach the circular area, the center of the wave-fronts hits the area first?
     
  7. Jan 26, 2013 #6

    TSny


    Yes, that's right. For which points of the illuminated spot is there the greatest phase difference?
     
  8. Jan 26, 2013 #7
    To me it's very difficult to visualize. I am trying to work this out on paper, but I just cannot make the connection between the circular area and the phase difference. I do not know what denotes a difference in phase.
     
  9. Jan 26, 2013 #8
    I am just making a guess, but is the difference in phase the greatest when the wave is bisecting the circular area? Because more of the wave is covered up than the rest? This seems to be a very abstract way of calculating a phase difference. Maybe I am just slow :(
     
  10. Jan 26, 2013 #9

    TSny


    Have you studied the connection between "path difference" and phase difference? Which point(s) of the illuminated spot is (are) closest to the source? Which is (are) farthest from the source?
     
  11. Jan 26, 2013 #10
    If I set an origin in the center of the illuminated area, then the point closest to the source is -1.95 cm, and the furthest would be 1.95 cm. I have not come across "path difference" yet.
     
  12. Jan 26, 2013 #11

    TSny


    Path difference just refers to the difference in distance that light travels along two different paths. The attached picture shows the illuminated spot from a point source S. Which point of the spot is closest to S and which point(s) is farthest?
     

    Attached Files: [diagram of the illuminated spot seen from point source S]

  13. Jan 26, 2013 #12
    A is closest and B is furthest. It seems that I was totally confusing the setup. Also, I guess I did not realize that I could be using ray tracing. For some reason I was stuck with waves in my head.

    Thank you, by the way, for spending so much time with me!
     
  14. Jan 26, 2013 #13
    So this means that the longer path will result in a phase shift, since light along it travels further. I basically have to find the correct r that gives me a phase difference of 36 degrees. I was totally overthinking the problem!
     
  15. Jan 26, 2013 #14
    Actually, I need to find the correct distance from the illuminated area.
     
  16. Jan 26, 2013 #15
    So can I use the angle between the hypotenuse and d? Does setting that angle to 36 degrees produce the correct phase shift?
     
  17. Jan 26, 2013 #16

    TSny


    No, that's not going to get it.

    The attached drawing shows a red wavefront just reaching the center of the spot and the path difference is shown in blue.

    If the path difference h - d happened to equal one-half wavelength, then the phase difference between the light reaching A and the light reaching B would be 1/2 of 360° = 180°. (One full wavelength corresponds to a phase difference of 360°.)

    What fraction of a wavelength would the path difference need to be to get a phase difference of 36°?
     

    Attached Files: [drawing of the red wavefront reaching the spot's centre, with the path difference h - d shown in blue]

    Last edited: Jan 26, 2013
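    The wavelength-to-phase bookkeeping in the post above can be sketched numerically (Python; the function names are just for illustration):

    ```python
    wavelength = 500e-9  # metres, from the problem statement

    def path_to_phase_deg(path_difference):
        # One full wavelength of path difference = 360 degrees of phase.
        return 360.0 * path_difference / wavelength

    def phase_to_path(phase_deg):
        # Inverse: the path difference producing a given phase difference.
        return phase_deg / 360.0 * wavelength

    half_wave = path_to_phase_deg(wavelength / 2)  # half a wavelength -> 180 degrees
    required = phase_to_path(36.0)                 # 36 degrees -> lambda/10 = 50 nm
    ```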
  18. Jan 26, 2013 #17
    1/10 of the 500 nm wavelength
     
  19. Jan 26, 2013 #18

    TSny


    Right. So you need to find the distance d such that h - d = λ/10. Hint: express h in terms of d and r using geometry.
     
  20. Jan 26, 2013 #19
    I don't know if I did this right, but I considered the path difference to be only 50 nm, then I used the Pythagorean theorem as follows: x^2 + 0.0195^2 = (x + 50*10^-9)^2. I solved for x and got 3,802.5 m. Does this appear correct to you?
     
  21. Jan 26, 2013 #20
    In this case x = d and (x + 50*10^-9) = h.
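    For what it's worth, that calculation can be checked numerically. A quick sketch (Python), solving d^2 + r^2 = (d + λ/10)^2 for d using the same symbols as the thread:

    ```python
    import math

    wavelength = 500e-9      # metres
    r = 0.039 / 2            # spot radius in metres (3.9 cm diameter)
    delta = wavelength / 10  # allowed path difference for a 36-degree phase shift

    # Expanding (d + delta)^2 = d^2 + r^2 gives r^2 = 2*d*delta + delta^2, so:
    d = (r**2 - delta**2) / (2 * delta)

    # Sanity check against the exact Pythagorean relation: h - d should be delta.
    h = math.hypot(d, r)
    ```

    This comes out at about 3,802.5 m, matching the answer above; the delta^2 term is negligible.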
     