Why Does Diffraction Occur Even When Wavelength Exceeds Aperture Width?

AI Thread Summary
Diffraction occurs even when the wavelength is smaller than the aperture width, contrary to the common misconception that it happens only when the wavelength exceeds the opening. The equation sin(theta) = wavelength / slit width gives the angles of the dark fringes in a single-slit diffraction pattern, but it no longer applies when the wavelength is larger than the slit width, since the central maximum then fills the entire region beyond the slit. As the slit width decreases or the wavelength increases, diffraction effects become more pronounced, spreading the light more broadly. The phase differences between light waves passing through different parts of the slit determine where interference is constructive or destructive, and when the wavelength is much smaller than the slit the minima fall at small angles, making the diffraction hard to notice. Understanding these principles clarifies why diffraction patterns can be observed under a wide range of conditions.
Byrgg
Ok, I read that diffraction only occurs when the wavelength is larger than the obstacle/opening. But then I find an equation that looks like:

sin(theta) = wavelength / width of opening

for determining the diffraction angle. Now, if the wavelength has to be larger than w for diffraction to occur, why is it that applying this to the above equation gives an error? (Note: if wavelength > w, then the ratio is greater than 1, and sin^-1 of any number greater than 1 is undefined.)

Sorry if the answer is really obvious or something; this concept is driving me insane, and my physics teacher wasn't really able to help much with it either. Please someone respond soon.
 
Sorry for the double post, but I should mention that this situation is with a single slit; not sure if that was obvious or not, just decided to let you all know anyway.
 
Byrgg said:
Ok, I read that diffraction only occurs when the wavelength is larger than the obstacle/opening.
Not so. Light diffracting through a slit creates a pattern of a bright center, surrounded by dark fringes, then further bright areas, and so on. Much of the diffracted light is in that central bright area.

But then I find an equation that looks like:

sin(theta) = wavelength / width of opening
That equation is for finding the angles of the dark fringes that surround the central maximum of a single-slit diffraction pattern. That defines the width of the central (and most intense) part of the diffracted beam. When the wavelength is much smaller than the slit, that central beam is very narrow--not spread out much. (Of course, if the slit is too wide, you won't see much of a single-slit diffraction pattern.) But when the wavelength is the same size as the slit (or larger), that equation no longer applies, since the spread is so great that there are no dark fringes--the central maximum fills the entire area beyond the slit. So in that sense, diffraction is at a maximum.
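A quick numerical sketch (Python; the 650 nm wavelength and the two slit widths are assumed purely for illustration) shows both regimes: when the wavelength is smaller than the slit the equation gives a real angle for the first dark fringe, and when it is larger the arcsine has no solution, which is exactly the "error" in the original question.

```python
import numpy as np

# Illustrative numbers (assumed): a 650 nm wavelength and two slit widths.
wavelength = 650e-9                      # metres
for slit_width in (5e-6, 0.5e-6):        # one slit wider than the wavelength, one narrower
    ratio = wavelength / slit_width
    if ratio <= 1:
        theta = np.degrees(np.arcsin(ratio))
        print(f"slit = {slit_width:.1e} m: first dark fringe near {theta:.1f} degrees")
    else:
        # wavelength > slit width: arcsin is undefined, so no dark fringe exists;
        # the central maximum fills the whole region beyond the slit.
        print(f"slit = {slit_width:.1e} m: wavelength/width = {ratio:.2f} > 1, no dark fringe")
```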
 
Wow, thank you so much. I didn't think of it like that before at all. So basically you're saying that even when the wavelength is less than the width of the opening, waves are diffracted? Why is that exactly? I mean, if the wavelength is small, how is it being diffracted (bent by the edges of the opening)? Sorry, this seems really easy, but I'm always overcomplicating things.
 
I checked those links, they seemed helpful enough.

Alright, well my physics teacher said that the stuff I was getting into was a year ahead of me, as seems the case here.

So let me get this straight: even if the wavelength is smaller, the waves still diffract? That seems to be the case here; apparently what I read originally was wrong, but now it seems that a better relationship is being defined. Decreasing the width of the opening produces a similar effect to increasing the wavelength, right? After reading what those articles had to say, I have a slightly better understanding of this.

If the wavelength is larger than the opening, then there are no "dark areas" produced, correct? Now I see why that equation doesn't apply in this case; like I originally thought, that simply wouldn't make sense in the equation, considering it's used to find the location of dark areas...

So then is it considered a different type of diffraction when the wavelength is larger than the width of the opening? Or just an extreme version of the previous situation?
 
Sounds like you are getting the idea.

I'd consider that having the wavelength larger than the slit is just an extreme version of the same kind of single slit diffraction. The key to understanding diffraction is to follow the phase differences of the light from different parts of the original wavefront across the slit. At different angles, the light from different parts of that wavefront must travel different distances to reach the same point on the screen, and thus has different phases: at some angles the interference is constructive, at other angles destructive. But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles.
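Here is a rough numerical version of that argument (a Python sketch; the 500 nm wavelength and the two slit widths are assumed for illustration). It adds up the phases of many point sources spread across the slit, Huygens-style, and shows that a slit wider than the wavelength produces near-zero intensity at some angles, while a slit narrower than the wavelength never gets anywhere close to cancelling out, even at 90 degrees:

```python
import numpy as np

def relative_intensity(angle_deg, slit_width, wavelength, n_sources=2000):
    """Add up the phases of many point sources spread across the slit (Huygens picture)."""
    x = np.linspace(0.0, slit_width, n_sources)               # positions across the slit
    phase = 2 * np.pi * x * np.sin(np.radians(angle_deg)) / wavelength
    amplitude = abs(np.sum(np.exp(1j * phase))) / n_sources   # normalised so theta = 0 gives 1
    return amplitude**2

wavelength = 500e-9                        # assumed 500 nm light
for slit_width in (2e-6, 0.2e-6):          # slit wider than the wavelength vs. narrower
    dimmest = min(relative_intensity(a, slit_width, wavelength) for a in range(0, 91, 5))
    print(f"slit = {slit_width:.1e} m: dimmest relative intensity from 0 to 90 deg = {dimmest:.3f}")
```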
 
"But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles."

Ok, I understand this all pretty well now I feel, well at least a lot more than before.

However, this quote puzzles me slightly: why aren't the phase differences large enough in this case? I probably missed something, but if you could clear up that question for me it would be greatly appreciated.
 
The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees).
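To put an illustrative number on that (the slit width here is assumed): if the slit width is a = \lambda/4, the largest possible path difference is the slit width itself, \lambda/4, which corresponds to a phase difference of only 90 degrees, well short of the 180 degrees (half a wavelength) needed for a dark fringe, so no complete cancellation can occur at any angle.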
 
  • #10
Alright, it all seems pretty clear now. Thanks for all the advice. If I think of anything else I'll just ask tomorrow as it's getting late right now.
 
  • #11
Note that you get diffraction even if the width of the slit is half-infinite, so to speak. For example, if you shine a laser beam so that it grazes the edge of a razor blade, you get the intensity pattern shown in the graph at the top of this page:

http://scienceworld.wolfram.com/physics/Half-InfiniteScreenDiffraction.html

In that graph, you can think of the blade as lying along the x-axis (actually it's labeled u in that diagram), extending from x = 0 (the edge) to x = +infinity. If there were no diffraction, the graph for x < 0 would be a straight horizontal line at y = 1, and for x > 0 it would be a straight line along the x-axis (zero intensity).
 
  • #12
Ok here's one more thing I'm wondering about:

"The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees)."

As Doc Al previously said. I'm wondering how you can figure this out: does that mean, since destructive interference occurs when the wavelength is low, that the maximum angle is at least 180 degrees? How do you find the maximum angle? By the way, this should be the last question I have about this. Thanks again.
 
  • #13
I don't think the Doc is referring to an 'angle' as you interpret it there. A phase difference can also be expressed as an angle. For example, if two waves are out of phase by a full wavelength, they can be said to be \lambda meters out of phase, 360o out of phase, or 2\pi radians out of phase. Therefore, Doc Al is referring to two waves that are 180o out of phase, or half a wavelength out of phase.

Does that make sense?
 
  • #14
Yeah, I know about that (learned it in math), but how do you find out that the slit width is the maximum path difference?
 
  • #15
Sorry for the double post again, but, also, how do you know that the max phase difference at that point would be 90 degrees?
 
  • #16
HINT: Look at the function;

a{\color{red}\sin\theta} = m\lambda
 
  • #17
I thought that equation only applied when the wavelength was less than the width of the opening. And Doc Al said that 90 degrees was the maximum for when the wavelength was greater than the width of the opening.
 
  • #18
Byrgg said:
I thought that equation only applied when the wavelength was less than the width of the opening. And Doc Al said that 90 degrees was the maximum for when the wavelength was greater than the width of the opening.

90 degrees is the maximum, full stop. You cannot diffract something (as far as I know) past 90 degrees. That equation only applies when the wavelength is less than a; I never said it didn't. You can see from the formula that;

\frac{m\lambda}{a}\leq 1

I'm sorry, but I don't know exactly what you're asking here.
 
  • #19
Ok, basically, I know that the maximum for that is 90 degrees, but doesn't destructive interference occur earlier than that point? If that's the case, then don't you need 180 degrees for destructive interference (well, enough to cancel out the waves, I mean)?
 
  • #20
Byrgg said:
Ok, basically, I know that the maximum for that is 90 degrees, but doesn't destructive interference occur earlier than that point? If that's the case, then don't you need 180 degrees for destructive interference (well, enough to cancel out the waves, I mean)?

Do you mean when \lambda = a?
 
  • #21
Yeah, because then the angle is 90 degrees right? I thought you needed 180 degrees for enough interference to cancel out waves.
 
  • #22
Byrgg said:
Yeah, because then the angle is 90 degrees right? I thought you needed 180 degrees for enough interference to cancel out waves.

The angle of diffraction is 90 degrees, not the phase difference. You need a 180o phase difference for a minimum. However, the maximum angle of diffraction is 90o.
 
  • #23
Oh, I thought the phase difference and the angle were the same number, oops. Ok, thanks for clearing that up, but then how do you determine the phase difference?
 
  • #24
Ok, I think this will be my last post here. Here's something I pulled from a site called "The Physics Classroom":

"In fact, when the wavelength of the waves are smaller than the obstacle or opening, no noticeable diffraction occurs."

I know this was sort of cleared up before, but this is still slightly confusing. If the wavelength is larger than or equal to the barrier/opening, the diffraction angle is 90 degrees, right? So does this quote just mean that those smaller angles (when the wavelength is smaller than the barrier/opening) are not noticeable? Does this mean we can only see diffraction when it's 90 degrees? I might be missing something here, but if someone could clear up this (hopefully final) bit of information, I would be very grateful.
 
  • #25
"When waves (eg pressure waves) pass through openings that have about the same magnitude as (or less than) the wavelength of the incident wave, diffraction will occur."

This is something I just read by another person here in another thread somewhere.

I thought diffraction always occurred when a wave passed through a slit, so what's going on? Someone please explain this to me.
 
  • #26
If the wavelength is equal to or greater than the size of the opening, you will indeed observe no diffraction pattern. Note it is only 90o when a = \lambda; if \lambda > a then the angle of diffraction becomes undefined. Now in reference to your quote;
Physics Classroom said:
"In fact, when the wavelength of the waves are smaller than the obstacle or opening, no noticeable diffraction occurs."
Let us examine the Fraunhofer single slit relationship;

a\sin\theta = m\lambda \Leftrightarrow \theta = \sin^{-1}\left( \frac{m\lambda}{a}\right)

Now, observe what happens as the wavelength gets shorter (tends to zero)

\theta = \lim_{\lambda \rightarrow 0} \sin^{-1}\left( \frac{m\lambda}{a} \right)= 0

So, as the wavelength gets shorter, the angle between successive minima gets progressively smaller, and hence the distance between the minima becomes smaller, leading to a less distinct diffraction pattern. Imagine blowing up a balloon and drawing spots on the surface at 1 mm intervals, then starting to let the air out: what happens to the spots as you deflate the balloon?
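A small numerical check of that limit (a Python sketch; the 2 micrometre slit and the two wavelengths are chosen purely for illustration) shows the minima crowding toward the centre as the wavelength shrinks:

```python
import numpy as np

slit_width = 2e-6                          # assumed 2 micrometre slit
for wavelength in (600e-9, 60e-9):         # shorter wavelength -> minima crowd toward the centre
    angles = [np.degrees(np.arcsin(m * wavelength / slit_width)) for m in (1, 2, 3)]
    print(f"wavelength = {wavelength:.0e} m: minima at "
          + ", ".join(f"{angle:.1f} deg" for angle in angles))
```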

Do you follow?
 
  • #27
Yeah I understand that but what about my last post?
 
  • #28
Byrgg said:
"When waves (eg pressure waves) pass through openings that have about the same magnitude as (or less than) the wavelength of the incident wave, diffraction will occur."

This is something I just read by another person here in another thread somewhere.

I thought diffraction always occurred when a wave passed through a slit, so what's going on? Someone please explain this to me.
I think what the poster meant to say was that noticeable diffraction occurs when the size of the opening is about the same as (or smaller than) the wavelength of the incident wave.
 
  • #29
Then what does noticeable diffraction look like exactly?
 
  • #30
Byrgg said:
Then what does noticeable diffraction look like exactly?
If the diffraction were noticeable, you would see a diffraction pattern on a screen placed a suitable distance away; or, if they were waves in a tank, you would see the waves 'curve'. If the diffraction weren't noticeable, you would just see a single spot of light, or waves that appear to be linear. That doesn't mean diffraction isn't occurring, it just means it isn't noticeable, i.e. we can't observe it. For example, in a double slit, the angle of diffraction between successive maxima would be so small that all the maxima would appear to 'merge' into a single line on the screen.
 
  • #31
Ok, so here's an example: if the wavelength is less than the width of the opening (in a tank, let's say), would you be able to see the curvature of the waves? Also, after that, let's say the wavelength was larger than (or equal to) the width of the opening, what would the curves look like then?
 
  • #32
Byrgg said:
Ok, so here's an example: if the wavelength is less than the width of the opening (in a tank, let's say), would you be able to see the curvature of the waves? Also, after that, let's say the wavelength was larger than (or equal to) the width of the opening, what would the curves look like then?
http://www.ngsir.netfirms.com/englishhtm/Diffraction.htm is a nice applet that illustrates the phenomenon.
 
  • #33
The applet isn't working for some reason.
 
  • #34
Byrgg said:
The applet isn't working for some reason.
I think you need Java installed. To answer your question: if the wavelength were significantly less than the size of the opening, then very little diffraction would occur; you may see some small curvature about the edges of the opening. If the wavelength were greater than or equal to the slit width, you would see significant diffraction; the smaller the opening compared with the wavelength, the smaller the radius of the curve.
 
  • #35
Ok, just to make sure I understand this well enough, could you tell me if I'm right?

When the wavelength is less than the width of the opening, the pattern of bright/dark areas is formed, and the wave appears to be (if it can be seen clearly) a straight line with curved edges. When the wavelength is equal to or greater than the width of the opening, there is no pattern of bright/dark areas, and the wave is basically just a curve?

Is that right, or did I say something incorrect?
 
  • #36
Byrgg said:
Ok, just to make sure I understand this well enough, could you tell me if I'm right?

When the wavelength is less than the width of the opening, the pattern of bright/dark areas is formed, and the wave appears to be (if it can be seen clearly) a straight line with curved edges. When the wavelength is equal to or greater than the width of the opening, there is no pattern of bright/dark areas, and the wave is basically just a curve?

Is that right, or did I say something incorrect?
Sounds like a good summary to me.
 
  • #37
Ok, if the wavelength is less than the width of the opening, then will the edges always be curved, with the middle being a straight line? Or when the wavelength is almost as large as the width of the opening, will it be just one curve?

Also, do the curves always extend all the way back to the wall from the line bisecting the opening (forming a quarter of a circle)?
 
  • #38
Please someone answer.
 
  • #39
Byrgg said:
Ok, if the wavelength is less than the width of the opening, then will the edges always be curved, with the middle being a straight line? Or when the wavelength is almost as large as the width of the opening, will it be just one curve?
Yes, roughly speaking this is the case. There may still be some diffraction, so that the 'centre' wavefronts are not completely linear and may have a slight curve, but if the gap were significantly bigger, then the centre wavefronts would be linear.

If \lambda\approx a, then yes, the wavefront would appear circular.
Byrgg said:
Also, do the curves always extend all the way back to the wall from the line bisecting the opening (forming a quarter of a circle)?
Yes, this is correct. The waves would appear semi-circular if \lambda\approx a, with the centre of the circle located at the centre of a. Quite a good reference for diffraction as explained by Huygens' principle using wavefronts is available at http://physics.tamuk.edu/~suson/html/4323/diffract.html
 
  • #40
Ok, so then if the gap is significantly bigger than the wavelength, do the curves still make the quarter of a circle at the edges?

Basically, if the wave is linear, with the edges curved, do those edges reach back to the wall making the mentioned quarter of a circle?
 
  • #41
Byrgg said:
Ok, so then if the gap is significantly bigger than the wavelength, do the curves still make the quarter of a circle at the edges?

Basically, if the wave is linear, with the edges curved, do those edges reach back to the wall making the mentioned quarter of a circle?
Yes, the curves are approximately circular and extend all the way back to the 'wall' in which the slit is cut. Perhaps these images may help (taken from the same site as the applet);
Slit width is smaller than or comparable to wavelength
http://www.ngsir.netfirms.com/applets/diffraction/diffraction2.png

Slit width is significantly greater than wavelength
http://www.ngsir.netfirms.com/applets/diffraction/diffraction1.png
 
  • #42
Alright, just another confirmation: let's say the wavelength is larger than the slit, so the diffraction angle is 90 degrees, right? This means the waves cannot be diffracted any more, correct? So does increasing the wavelength after this point have any other effect? I think you may have mentioned this before, but could you tell me what happens in this case?
 
  • #43
Byrgg said:
Alright, just another confirmation: let's say the wavelength is larger than the slit, so the diffraction angle is 90 degrees, right? This means the waves cannot be diffracted any more, correct? So does increasing the wavelength after this point have any other effect? I think you may have mentioned this before, but could you tell me what happens in this case?
I think you may be confusing diffraction and interference due to diffraction here, probably my fault for not clarifying the matter earlier. The angle we were referring to previously (using Fraunhofer diffraction) is the angle at which the minima occur due to the interference of diffracted waves. Now, the maximum angle at which this can [theoretically] occur is 90o. It is possible, however, that waves can be diffracted by an angle greater than 90o. I would recommend reading around Huygens' principle.
 
  • #44
When the wavelength is larger than the slit, the "central maximum" of the intensity pattern extends to the limit, which is 90 degrees* (measured from the direction that the wave is traveling). There is essentially no dark fringe, but the intensity does vary. (It's a bit involved, but here's a link if you want to get into the nitty gritty: http://hyperphysics.phy-astr.gsu.edu/hbase/phyopt/sinint.html#c1)

90 degrees is the limit, in the case of single slit diffraction, for a very simple reason: past 90 degrees there's a barrier that blocks the wave! :smile:
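To see the "intensity varies but never reaches zero" behaviour numerically, here is a sketch using the textbook Fraunhofer single-slit formula I/I0 = (sin(beta)/beta)^2 with beta = \pi a \sin\theta / \lambda (the 600 nm wavelength and 400 nm slit are assumed, and the formula is only a rough guide once the slit is narrower than the wavelength):

```python
import numpy as np

def fraunhofer_intensity(angle_deg, slit_width, wavelength):
    """Relative single-slit intensity I/I0 = (sin(beta)/beta)^2, beta = pi*a*sin(theta)/lambda."""
    beta = np.pi * slit_width * np.sin(np.radians(angle_deg)) / wavelength
    return np.sinc(beta / np.pi)**2        # numpy's sinc(x) is sin(pi*x)/(pi*x)

wavelength, slit_width = 600e-9, 400e-9    # assumed numbers, slit narrower than the wavelength
for angle in (0, 30, 60, 90):
    print(f"theta = {angle:2d} deg: I/I0 = {fraunhofer_intensity(angle, slit_width, wavelength):.2f}")
```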
 
  • #45
"90 degrees is the limit, in the case of single slit diffraction, for a very simple reason: past 90 degrees there's a barrier that blocks the wave!"

Ok that's what I thought, but what about when hootenanny said this:

"It is possible however, that waves can be diffracted by an angle greater than 90o. I would recommed reading around hugene's principle."

So in a single slit, could I assume that the maximum diffraction is 90 degrees? Then would hootenanny's post apply to non-single slit diffraction cases only?

Also, since I'm only in gr. 11, if calculating the diffraction in these other cases isn't overly complicated, could someone show me how, or at least give a rough summary? Oh, and I should be able to understand up to gr. 12 material, seeing as gr. 11 is basically done and I have a good mark.
 
  • #46
Someone? Please?
 
  • #47
Oh, and in addition, is 90 degrees the max angle for the minima? I'm sure this is true for single-slit, but does this change for non-single slit?
 
  • #48
Byrgg said:
So in a single slit, could I assume that the maximum diffraction is 90 degrees?
Since the material out of which the slit has been cut extends to each side, how could any light be found at an angle greater than 90 degrees? Yes, the maximum angle at which light could possibly be diffracted is 90 degrees for that simple reason.
Then would hootenanny's post apply to non-single slit diffraction cases only?
Right.

Also, since I'm only in gr. 11, if calculating the diffraction in these other cases isn't overly complicated, could someone show me how, or at least give a rough summary? Oh, and I should be able to understand up to gr. 12 material, seeing as gr. 11 is basically done and I have a good mark.
I applaud your moxie, but until you learn to actually calculate the single slit case--which is not so easy!--I wouldn't worry about other diffraction geometries. (By all means read up on those diffraction patterns--slit, edge, circular hole, disk--but don't think you'll be able to actually calculate those patterns without a lot of study.) The hyperphysics site has some good material. (Check the links I gave before and poke around.)

My recommendations:
(1) Heed Hoot's advice to learn about Huygens' principle.
(2) Study Young's double slit experiment (if you haven't already)--much easier to analyze and it will give you a good understanding of interference. And that will give you a head start in understanding single slit diffraction.
(3) Learn how that single slit equation in your first post is derived.
(4) If you still want more, grab yourself a copy of a decent college physics text. (Halliday and Resnick is standard.) You'll soon see that even at the (first-year) college level, single-slit diffraction is only handled up to a certain point. For example, understanding the derivation of that equation from your first post in this thread is usually as far as you get in some courses. Calculating the intensity pattern--via the method of phasors--is usually done only in higher-level courses. (Depends on the instructor and the text.)
 
  • #49
Ok, well I don't know if I'll go around looking all for answers to everything right now, but what about my last question?

In a non-single slit case, is the maximum angle for the minima still 90 degrees?

Oh yeah, and in a single-slit scenario, is the angle of diffraction the same as the angle of the first minimum? Is it ever the same as the first minimum?
 
  • #50
Byrgg said:
In a non-single slit case, is the maximum angle for the minima still 90 degrees?

When you have diffraction through an aperture, or a set of apertures in a plane barrier, you're usually interested in the light that makes it through to the other side. In this situation, the maximum angle of either a minimum or maximum must be 90 degrees. Greater than 90 degrees, and you're dealing with waves that have been reflected back on the same side of the barrier as the source.

In principle, I suppose one could analyze this situation using similar mathematical tools as with transmission of light through a barrier, but introductory textbooks don't cover this. In fact, I don't think I've seen it in an intermediate-level textbook, either.

One situation where people do discuss "backscattering" is with things like shining light on small targets such as dust grains or molecules. This is a rather advanced topic, though.

Oh yeah, and in a single-slit scenario, is the angle of diffraction the same as the angle of the first minimum? Is it ever the same as the first minimum?

The usual elementary formula for single-slit diffraction, n \lambda = a \sin \theta, where n is an integer > 0, gives the angles of minimum intensity of the diffracted light. If n = 1 it gives you the first minimum on either side of the center.
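As an illustrative worked example (the numbers are assumed): with \lambda = 500 nm and a = 2000 nm, the n = 1 minimum sits at \theta = \sin^{-1}(500/2000) \approx 14.5 degrees, n = 2 at \sin^{-1}(0.5) = 30 degrees, n = 3 at about 48.6 degrees, and n = 4 falls right at 90 degrees; no higher-order minima exist, since n\lambda/a would then exceed 1.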
 