I'm new here and I'm wondering something about diffraction of waves

  1. Jun 9, 2006 #1
    Ok, I read that diffraction only occurs when the wavelength is larger than the obstacle/opening. But then I find an equation looking like:

    sin(theta) = wavelength / (width of opening)

    for determining the diffraction. Now, if the wavelength has to be larger than w for diffraction to occur, why is it that applying this to the above equation gives an error? (Note: if wavelength > w, then the right-hand side is greater than 1, and sin^-1 of any number greater than 1 is undefined.)
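
    To make that concrete, here's a quick numerical check (the numbers are just ones I made up for illustration):

    Code (Python):
        import math

        wavelength = 700e-9   # say, red light at 700 nm (made-up example value)
        width = 500e-9        # a slit narrower than the wavelength

        ratio = wavelength / width   # 1.4 here, i.e. greater than 1
        try:
            print("first minimum at", math.degrees(math.asin(ratio)), "degrees")
        except ValueError:
            print("no angle satisfies sin(theta) =", ratio)   # this branch runs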

    Sorry if the answer is really obvious or something, it's just that this concept is driving me insane, my physics teacher wasn't really able to help much either with the understanding. Please someone respond soon.
     
  3. Jun 9, 2006 #2
    Sorry for the double post, but I should mention that this situation is with a single-slit, not sure if that was obvious or not, just decided to let you all know anyway.
     
  4. Jun 9, 2006 #3

    Doc Al

    Staff: Mentor

    Not so. Light diffracting through a slit creates a pattern with a bright center, surrounded by dark fringes, then further bright areas, and so on. Much of the diffracted light is in that central bright area.

    That equation is for finding the angles of the dark fringes that surround the central maximum of a single-slit diffraction pattern. That defines the width of the central (and most intense) part of the diffracted beam. When the wavelength is much smaller than the slit, that central beam is very narrow--not spread out much. (Of course, if the slit is too wide, you won't see much of a single-slit diffraction pattern.) But when the wavelength is the same size as the slit (or larger), that equation no longer applies, since the spread is so great that there are no dark fringes--the central maximum fills the entire area beyond the slit. So in that sense, diffraction is at its maximum.
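
    Just to put rough numbers on that (the particular wavelength and slit widths below are only example values I'm choosing for illustration):

    Code (Python):
        import math

        def first_minimum_deg(wavelength, slit_width):
            """Angle of the first dark fringe from sin(theta) = wavelength/slit_width,
            or None when wavelength >= slit_width (effectively no dark fringe forms)."""
            ratio = wavelength / slit_width
            if ratio >= 1:
                return None   # at equality the "minimum" would sit right at 90 degrees
            return math.degrees(math.asin(ratio))

        lam = 500e-9   # 500 nm green light, for example
        for a in (50e-6, 5e-6, 1e-6, 500e-9, 250e-9):
            angle = first_minimum_deg(lam, a)
            label = "none -- central maximum fills everything" if angle is None else f"{angle:.1f} degrees"
            print(f"slit width {a*1e9:6.0f} nm -> first dark fringe at {label}")

    The wider the slit is compared to the wavelength, the closer that first dark fringe sits to the centre, i.e. the narrower the central beam.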
     
  5. Jun 9, 2006 #4
    Wow, thank you so much. I didn't think of it like that before at all. So basically you're saying that even when the wavelength is less than the width of the opening, waves are diffracted? Why is that exactly? I mean, if the wavelength is small, how is it being diffracted (bent by the edges of the opening)? Sorry, this seems really easy but I'm always overcomplicating things.
     
  6. Jun 9, 2006 #5

    Doc Al

    Staff: Mentor

  7. Jun 9, 2006 #6
    I checked those links; they seemed helpful enough.

    Alright, well, my physics teacher said that the stuff I was getting into was a year ahead of me, as seems to be the case here.

    So let me get this straight: even if the wavelength is smaller, the waves still diffract? That seems to be the case here; apparently what I read originally was wrong, but now a better relationship is being defined. Decreasing the width of the opening produces a similar effect to increasing the wavelength, right? So after reading what those articles had to say, I have a slightly better understanding of this.

    If the wavelength is larger than the opening, then there are no "dark areas" produced, correct? Now I see why that equation doesn't apply in this case; like I originally thought, it simply wouldn't make sense, considering the equation is used to find the location of the dark areas...

    So then, is it considered a different type of diffraction when the wavelength is larger than the width of the opening? Or just an extreme version of the previous situation?
     
  8. Jun 9, 2006 #7

    Doc Al

    Staff: Mentor

    Sounds like you are getting the idea.

    I'd consider that having the wavelength larger than the slit is just an extreme version of the same kind of single slit diffraction. The key to understanding diffraction is to follow the phase differences of the light from different parts of the original wavefront across the slit. At different angles, the light from different parts of that wavefront must travel different distances to reach the same point on the screen, and thus has different phases: at some angles the interference is constructive, at other angles destructive. But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles.
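
    If it helps, here is a minimal numerical sketch of that phase-summing picture (a toy Huygens model I'm adding for illustration; the wavelength and slit widths are arbitrary example values):

    Code (Python):
        import cmath, math

        def slit_intensity(theta_deg, slit_width, wavelength, n_sources=200):
            """Sum wavelets from n_sources points spread across the slit and return
            the relative intensity at angle theta (far-field approximation)."""
            theta = math.radians(theta_deg)
            total = 0 + 0j
            for i in range(n_sources):
                # position of this point source across the slit, from -a/2 to +a/2
                y = (i / (n_sources - 1) - 0.5) * slit_width
                # extra path length for this source at angle theta is y*sin(theta)
                phase = 2 * math.pi * y * math.sin(theta) / wavelength
                total += cmath.exp(1j * phase)
            return abs(total / n_sources) ** 2   # normalised to 1 at theta = 0

        lam = 500e-9
        for a in (5e-6, 250e-9):   # slit much wider than lambda, then half a wavelength
            profile = [slit_intensity(t, a, lam) for t in range(0, 91, 10)]
            print(f"a = {a*1e9:4.0f} nm:", [f"{p:.2f}" for p in profile])

    With the wide slit the light collapses into a narrow central beam with genuine zeros (dark fringes); with the slit half a wavelength wide, the wavelets never get far enough out of phase to cancel, so there is appreciable light at every angle.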
     
  9. Jun 9, 2006 #8
    "But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles."

    Ok, I understand this all pretty well now I feel, well at least a lot more than before.

    However, this quote puzzles me slightly: why aren't the phase differences large enough in this case? I probably missed something, but if you could clear up that question for me it would be greatly appreciated.
     
  10. Jun 9, 2006 #9

    Doc Al

    Staff: Mentor

    The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees).
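
    For example, with a slit only a quarter of a wavelength wide (a number picked purely for illustration), even at 90 degrees the largest possible path difference is

    [tex]\Delta_{max} = a\sin 90^\circ = a = \frac{\lambda}{4} < \frac{\lambda}{2}[/tex]

    so no two wavelets from the slit can ever be half a wavelength out of step, and no dark fringe can form.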
     
  11. Jun 9, 2006 #10
    Alright, it all seems pretty clear now. Thanks for all the advice. If I think of anything else I'll just ask tomorrow as it's getting late right now.
     
  12. Jun 10, 2006 #11

    jtbell

    Staff: Mentor

    Note that you get diffraction even if the width of the slit is half-infinite, so to speak. For example, if you shine a laser beam so that it grazes the edge of a razor blade, you get the intensity pattern shown in the graph at the top of this page:

    http://scienceworld.wolfram.com/physics/Half-InfiniteScreenDiffraction.html

    In that graph, you can think of the blade as lying along the x-axis (actually it's labeled u in that diagram), extending from x = 0 (the edge) to x = +infinity. If there were no diffraction, the graph for x < 0 would be a straight horizontal line at y = 1, and for x > 0 it would be a straight line along the x-axis (zero intensity).
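
    If you want to play with that curve yourself, here is a rough numerical sketch using the standard Fresnel-integral expression for a straight edge (the sample points are arbitrary, and I've put the illuminated side at positive coordinate, i.e. flipped relative to that page):

    Code (Python):
        import numpy as np
        from scipy.special import fresnel

        # Dimensionless Fresnel coordinate; v > 0 is the illuminated side here.
        v = np.linspace(-3, 5, 9)
        S, C = fresnel(v)                                  # SciPy returns (S(v), C(v))
        intensity = 0.5 * ((C + 0.5)**2 + (S + 0.5)**2)    # relative to the unobstructed beam

        for vi, I in zip(v, intensity):
            print(f"v = {vi:5.1f}   I/I0 = {I:.3f}")
        # Deep in the shadow I -> 0, right at the geometric edge I = 0.25,
        # and on the lit side I oscillates about 1 before settling down.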
     
  13. Jun 10, 2006 #12
    Ok here's one more thing I'm wondering about:

    "The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees)."

    As Doc Al said earlier. I'm wondering how you can figure this out: does that mean that, since destructive interference occurs at a small wavelength, the maximum angle is at least 180 degrees? How do you find the maximum angle? By the way, this should be the last question I have about this. Thanks again.
     
  14. Jun 10, 2006 #13

    Hootenanny

    Staff Emeritus
    Science Advisor
    Gold Member

    I don't think the Doc is referring to an 'angle' as you interpret it there. A phase difference can also be expressed as an angle. For example, if two waves are out of phase by a full wavelength, they can be said to be [itex]\lambda[/itex] metres out of phase, 360° out of phase, or 2[itex]\pi[/itex] radians out of phase. Therefore, Doc Al is referring to two waves that are 180° out of phase, or half a wavelength out of phase.
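
    As a concrete conversion (using an example wavelength I'm picking arbitrarily), the rule is simply [itex]\phi = 360^{\circ} \times \Delta / \lambda[/itex]:

    Code (Python):
        def phase_angle_deg(path_difference, wavelength):
            """Convert a path difference (in metres) into a phase angle in degrees."""
            return 360.0 * path_difference / wavelength

        lam = 500e-9   # example wavelength
        for delta in (lam, lam / 2, lam / 4):
            print(f"{delta/lam:.2f} lambda of path difference -> {phase_angle_deg(delta, lam):.0f} degrees")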

    Does that make sense?
     
  15. Jun 10, 2006 #14
    Yeah, I know about that (learned it in math), but how do you find out that the slit width is the maximum phase difference?
     
  16. Jun 10, 2006 #15
    Sorry for the double post again, but also: how do you know that the max phase difference at that point would be at 90 degrees?
     
  17. Jun 10, 2006 #16

    Hootenanny

    Staff Emeritus
    Science Advisor
    Gold Member

    HINT: Look at the formula:

    [tex]a{\color{red}\sin\theta} = m\lambda[/tex]
     
  18. Jun 10, 2006 #17
    I thought that equation only applied when the wavelength was less than the width of the opening. And Doc Al said that 90 degrees was the maximum for when the wavelength was greater than the width of the opening.
     
  19. Jun 10, 2006 #18

    Hootenanny

    Staff Emeritus
    Science Advisor
    Gold Member

    90 degrees is the maximum, full stop. You cannot diffract something (as far as I know) past 90 degrees. That equation does only apply when the wavelength is less than a; I never said otherwise. You can see from the formula that:

    [tex]\frac{m\lambda}{a}\leq 1[/tex]
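
    For example (with a wavelength and slit widths I'm just picking for illustration), that inequality tells you how many dark fringes can exist at all:

    Code (Python):
        import math

        lam = 633e-9   # e.g. a helium-neon laser
        for a in (5e-6, 1e-6, 633e-9, 300e-9):
            m_max = math.floor(a / lam)   # largest integer m with m*lam/a <= 1
            note = "(no dark fringes at all)" if m_max == 0 else ""
            print(f"a = {a*1e9:6.0f} nm -> highest-order dark fringe: m = {m_max} {note}")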

    I'm sorry, but I don't know exactly what you're asking here.
     
    Last edited: Jun 10, 2006
  20. Jun 10, 2006 #19
    Ok, basically: I know that the maximum for that is 90 degrees, but doesn't destructive interference occur before that point? If that's the case, then don't you need 180 degrees for destructive interference (well, enough to cancel out the waves, I mean)?
     
  21. Jun 10, 2006 #20

    Hootenanny

    Staff Emeritus
    Science Advisor
    Gold Member

    Do you mean when [itex]\lambda = a[/itex]?
     