# I'm new here and I'm wondering something about diffraction of waves

## Main Question or Discussion Point

Ok, I read that diffraction only occurs when the wavelength is larger than the obstacle/opening. But then I find an equation looking like:

sin(theta) = wavelength / width of opening

For determining the diffraction. Now if the wavelength has to be larger than w for diffraction to occur, why is it that applying this to the above equation produces an error? (Note: if wavelength > w then the ratio is greater than 1, and sin^-1 of any number greater than 1 is undefined.)

Sorry if the answer is really obvious or something; it's just that this concept is driving me insane, and my physics teacher wasn't really able to help much with it either. Please someone respond soon.

Sorry for the double post, but I should mention that this situation is with a single slit; not sure if that was obvious or not, just decided to let you all know anyway.

Doc Al
Mentor
Byrgg said:
Ok, I read that diffraction only occurs when wavelength is larger than than the obstacle/opening.
Not so. Light diffracting through a slit creates a pattern with a bright center, surrounded by dark fringes, then further bright areas, and so on. Much of the diffracted light is in that central bright area.

But then I find an equation looking like:

sin(theta) = wavelength/width opening
That equation is for finding the angle of the dark fringes that surround the central maximum of a single-slit diffraction pattern. That defines the width of the central (and most intense) part of the diffracted beam. When the wavelength is much smaller than the slit, that central beam is very narrow--not spread out much. (Of course, if the slit is too wide, you won't see much of a single-slit diffraction pattern.) But when the wavelength is the same size as the slit (or larger) that equation no longer applies, since the spread is so great that there are no dark fringes--the central maximum fills the entire area beyond the slit. So in that sense, diffraction is maximum.
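Doc Al's point can be checked numerically. Below is a minimal sketch (Python, with made-up function and variable names, not anything from the thread) that solves a sin(theta) = m*lambda for the dark-fringe angle and returns None when the equation has no solution, which is exactly the "error" case from the original question:

```python
import math

def first_minimum_angle(wavelength, slit_width, m=1):
    """Angle (in degrees) of the m-th dark fringe in single-slit
    diffraction, from a*sin(theta) = m*lambda.
    Returns None when m*lambda > a: no dark fringe exists, and the
    central maximum fills the whole region beyond the slit."""
    ratio = m * wavelength / slit_width
    if ratio > 1:
        return None
    return math.degrees(math.asin(ratio))

# Slit much wider than the wavelength: narrow central beam.
print(first_minimum_angle(500e-9, 50e-6))   # ~0.57 degrees
# Slit comparable to the wavelength: beam spreads to 30 degrees.
print(first_minimum_angle(500e-9, 1e-6))    # 30 degrees
# Wavelength larger than the slit: no dark fringe at all.
print(first_minimum_angle(500e-9, 0.4e-6))  # None
```

Instead of raising an error on arcsin of a number greater than 1, the sketch treats that case as "no dark fringe", which is the physical answer.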

Wow, thank you so much. I didn't think of it like that before at all. So basically you're saying that even when the wavelength is less than the width of the opening, waves are diffracted? Why is that, exactly? I mean, if the wavelength is small, how is it being diffracted (bent by the edges of the opening)? Sorry, this seems really easy, but I'm always overcomplicating things.

Alright, well my physics teacher said that the stuff I was getting into was a year ahead of me, as seems to be the case here.

So let me get this straight, even if the wavelength is smaller, the waves still diffract? That seems to be the case here, apparently what I read originally was wrong, but now it seems that a better relationship is being defined. Decreasing the width of the opening produces a similar effect to increasing the wavelength, right? So then now after reading what those articles had to say I have a slightly better understanding of this.

If the wavelength is larger than the opening, then there are no "dark areas" produced, correct? Now I see why that equation doesn't apply in this case; like I originally thought, that simply wouldn't make sense in the equation, considering it's used to find the location of dark areas...

So then is it considered a different type of diffraction when the wavelength is larger than the width of the opening? Or just an extreme version of the previous situation?

Doc Al
Mentor
Sounds like you are getting the idea.

I'd consider that having the wavelength larger than the slit is just an extreme version of the same kind of single slit diffraction. The key to understanding diffraction is to follow the phase differences of the light from different parts of the original wavefront across the slit. At different angles, the light from different parts of that wavefront must travel different distances to reach the same point on the screen, and thus has different phases: at some angles the interference is constructive, at other angles destructive. But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles.
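That phase-following argument can be sketched numerically: treat the wavefront across the slit as many point sources (a Huygens-style construction) and sum their phases at each angle. This is an illustrative sketch under that assumption, with made-up names, not a calculation from the thread:

```python
import cmath
import math

def slit_intensity(theta_deg, slit_width, wavelength, n=2000):
    """Relative intensity at angle theta, found by summing the phases
    of n point sources spread evenly across the slit.
    Normalized so theta = 0 gives intensity 1."""
    k = 2 * math.pi / wavelength
    s = math.sin(math.radians(theta_deg))
    # Each source at position x across the slit travels an extra
    # distance x*sin(theta), hence carries phase k*x*sin(theta).
    total = sum(cmath.exp(1j * k * (i / (n - 1)) * slit_width * s)
                for i in range(n))
    return abs(total / n) ** 2

wavelength = 500e-9
a = 2e-6  # slit four wavelengths wide
# First dark fringe predicted at sin(theta) = lambda/a = 0.25
theta_min = math.degrees(math.asin(wavelength / a))
print(slit_intensity(0, a, wavelength))          # 1.0 (central maximum)
print(slit_intensity(theta_min, a, wavelength))  # ~0 (the dark fringe)
```

At theta = 0 all the contributions arrive in phase and add up; at the predicted dark-fringe angle the phases wrap through a full cycle across the slit and cancel, which is exactly the bookkeeping Doc Al describes.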

"But if the slit is small compared to the wavelength, then the phase differences will never be large enough to cancel out even at large angles."

Ok, I feel I understand this all pretty well now, or at least a lot more than before.

However, this quote puzzles me slightly: why aren't the phase differences large enough in this case? I probably missed something, but if you could clear up that question for me it would be greatly appreciated.

Doc Al
Mentor
The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees).
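Doc Al's bound can be restated in a line of code: the largest available path difference is the slit width itself (at the 90-degree extreme), so two contributions can only be driven 180° out of phase when the slit is at least half a wavelength wide. A sketch with hypothetical names:

```python
def can_reach_half_wavelength(slit_width, wavelength):
    """The path difference between light from the two edges of the
    slit is at most a*sin(90 deg) = a, the slit width itself.
    A 180-degree phase difference needs a path difference of
    wavelength/2, so cancellation is only possible at all
    when a >= wavelength/2."""
    return slit_width >= wavelength / 2

wavelength = 500e-9
print(can_reach_half_wavelength(2e-6, wavelength))    # True
print(can_reach_half_wavelength(100e-9, wavelength))  # False: the slit
# is only a fifth of a wavelength wide, so no pair of contributions
# can ever be half a wavelength apart in path length.
```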

Alright, it all seems pretty clear now. Thanks for all the advice. If I think of anything else I'll just ask tomorrow as it's getting late right now.

jtbell
Mentor
Note that you get diffraction even if the width of the slit is half-infinite, so to speak. For example, if you shine a laser beam so that it grazes the edge of a razor blade, you get the intensity pattern shown in the graph at the top of this page:

http://scienceworld.wolfram.com/physics/Half-InfiniteScreenDiffraction.html

In that graph, you can think of the blade as lying along the x-axis (actually it's labeled u in that diagram), extending from x = 0 (the edge) to x = +infinity. If there were no diffraction, the graph for x < 0 would be a straight horizontal line at y = 1, and for x > 0 it would be a straight line along the x-axis (zero intensity).

Ok here's one more thing I'm wondering about:

"The difference in path length for light from different parts of the original wavefront cannot be any greater than the width of the slit. (Which would be the path length difference at 90 degrees--the maximum angle.) So, if that slit width is a small fraction of a wavelength, then that maximum phase difference will only be a small fraction of a wavelength. To get destructive interference you need a phase difference of half a wavelength (180 degrees)."

As Doc Al previously said. I'm wondering how you can figure this out; does that mean, since destructive interference occurs when the wavelength is small, that the maximum angle is at least 180 degrees? How do you find the maximum angle? By the way, this should be the last question I have about this. Thanks again.

Hootenanny
Staff Emeritus
Gold Member
I don't think the Doc is referring to an 'angle' as you interpret it there. A phase difference can also be expressed as an angle. For example, if two waves are out of phase by a full wavelength, they can be said to be $\lambda$ meters out of phase, 360° out of phase, or $2\pi$ radians out of phase. Therefore, Doc Al is referring to two waves that are 180° out of phase, or half a wavelength out of phase.

Does that make sense?

Yeah, I know about that (learned it in math), but how do you find out that the slit width is the maximum phase difference?

Sorry for the double post again, but, also, how do you know that the max phase difference at that point would be 90 degrees?

Hootenanny
Staff Emeritus
Gold Member
HINT: Look at the function;

$$a{\color{red}\sin\theta} = m\lambda$$

I thought that equation only applied when the wavelength was less than the width of the opening. And Doc Al said that 90 degrees was the maximum for when the wavelength was greater than the width of the opening.

Hootenanny
Staff Emeritus
Gold Member
Byrgg said:
I thought that equation only applied when the wavelength was less than the width of the opening. And Doc Al said that 90 degrees was the maximum for when the wavelength was greater than the width of the opening.
90 degrees is the maximum, full stop. You cannot diffract something (as far as I know) past 90 degrees. That equation does only apply when the wavelength is less than a; I never said it wasn't. You can see from the formula that:

$$\frac{m\lambda}{a}\leq 1$$
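That inequality also tells you how many dark fringes exist on each side of the center. A sketch (hypothetical names; note the borderline case where m*lambda equals a exactly puts the fringe at 90 degrees, and whether to count it is a matter of convention):

```python
import math

def highest_dark_fringe_order(slit_width, wavelength):
    """Largest integer m satisfying m * wavelength / slit_width <= 1,
    i.e. the number of dark fringes on each side of the center.
    Returns 0 when the wavelength exceeds the slit width: the
    equation a*sin(theta) = m*lambda then has no solution."""
    return math.floor(slit_width / wavelength)

print(highest_dark_fringe_order(2.2e-6, 500e-9))  # 4
print(highest_dark_fringe_order(400e-9, 500e-9))  # 0 -- no dark fringes
```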

Ok, basically, I know that the maximum for that is 90 degrees, but doesn't destructive interference occur before that point? If that's the case, then don't you need 180 degrees for destructive interference (well, enough to cancel out waves, I mean)?

Hootenanny
Staff Emeritus
Gold Member
Byrgg said:
Ok, basically, I know that the maximum for that is 90 degrees, but doesn't destructive interference occur before that point? If that's the case, then don't you need 180 degrees for destructive interference (well, enough to cancel out waves, I mean)?
Do you mean when $\lambda = a$?

Yeah, because then the angle is 90 degrees right? I thought you needed 180 degrees for enough interference to cancel out waves.

Hootenanny
Staff Emeritus
Gold Member
Byrgg said:
Yeah, because then the angle is 90 degrees right? I thought you needed 180 degrees for enough interference to cancel out waves.
The angle of diffraction is 90 degrees, not the phase difference. You need a 180° phase difference for a minimum. However, the maximum angle of diffraction is 90°.

Oh, I thought the phase difference and the angle were the same number, oops. Ok, thanks for clearing that up, but then how do you determine the phase difference?

Ok, I think this will be my last post here. Here's something I pulled from a site known as "The Physics Classroom":

"In fact, when the wavelength of the waves are smaller than the obstacle or opening, no noticeable diffraction occurs."

I know this was sort of cleared up before, but this is slightly confusing. If the wavelength is larger than or equal to the barrier/opening, the diffraction is 90 degrees, right? So does this quote just mean that those smaller angles (when the wavelength is smaller than the barrier/opening) are not noticeable? Does this mean we can only see diffraction when it's 90 degrees? I might be missing something here, but if someone could clear up this (hopefully final) bit of information, I would be very grateful.

"When waves (e.g. pressure waves) pass through openings that have about the same magnitude (or less) as the wavelength of the incident wave, diffraction will occur."

This is something I just read by another person here in another thread somewhere.

I thought diffraction always occurred when a wave passed through a slit. What's going on? Someone please explain this to me.