Explaining Diffraction of Waves & the Role of Wavelength

AI Thread Summary
Diffraction is the bending of waves when they encounter gaps or obstacles, governed primarily by the wavelength relative to the size of the gap or barrier. The role of wavelength is critical: diffraction is most pronounced when the wavelength is comparable to or greater than the gap size. A single wave impulse will still diffract, but its output will differ from that of a continuous wave because the impulse contains a spectrum of frequencies. The discussion also touches on the transition from Rayleigh to Mie theory when the size of an obstacle exceeds the wavelength, clarifying that diffraction does not cease but changes in character. Understanding the mathematics behind these phenomena is essential for a deeper grasp of diffraction.
sebastianhorvath
Hi,
I have a question about the phenomenon of diffraction.
Diffraction is said to be the bending of waves as they pass through a gap or around an object in their path. Diffraction occurs if the wavelength is greater than the width of the gap through which the wave is traveling, or the width of the barrier it is traveling around.
I was wondering why the wavelength plays such a critical role in this. The wavelength is measured from peak to peak of two consecutive waves. If you therefore sent a single wave impulse through a gap, would it have a defined wavelength, and would it therefore diffract? This question has puzzled me for the last two weeks, and no one has been able to come up with a sensible answer so far, so I decided to try asking on some science forums.

Thank you
Sebastian
 
In EM, wavelength sets the scale of the problem. The physical size of an aperture is unimportant; what matters is its size relative to the wavelength, so scaling up the wavelength, for example, has the same effect as decreasing the size of the aperture. To understand exactly how this relationship unfolds, you need to do the maths.
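To make the scaling concrete, here is a minimal Python sketch of the Fraunhofer single-slit pattern (the slit widths and wavelengths are arbitrary illustrative values, and far-field conditions are assumed). Doubling both the slit width and the wavelength leaves the pattern unchanged, because only the ratio a/lambda enters:

import numpy as np

def single_slit_intensity(theta, a, wavelength):
    # Fraunhofer single-slit pattern: I/I0 = sinc^2(a sin(theta) / lambda),
    # where numpy's sinc(x) is sin(pi x)/(pi x).
    return np.sinc(a * np.sin(theta) / wavelength) ** 2

theta = np.linspace(-0.5, 0.5, 5)  # observation angles in radians
print(single_slit_intensity(theta, a=2e-6, wavelength=500e-9))
print(single_slit_intensity(theta, a=4e-6, wavelength=1000e-9))  # identical output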

As for the single impulse scenario: since the behaviour of the wave is wavelength dependent, it makes sense to calculate the spectrum of such a wave. This is typically done via Fourier analysis. The wave will possess a continuous spectrum (since the waveform is finite), with higher frequency (shorter wavelength) components diffracting less than lower frequency components. The bottom line is that the wave will still diffract, but the 'output wave', the wave after the obstacle, will be different to that of the continuous input-wave scenario.
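A short Python sketch of that Fourier step (the Gaussian pulse shape is just an assumed example of a single impulse):

import numpy as np

# One isolated Gaussian impulse, in arbitrary time units.
t = np.linspace(-10.0, 10.0, 4096)
pulse = np.exp(-t**2)

# A finite, single impulse has a continuous spectrum: a whole band of
# frequencies (wavelengths), not one well-defined wavelength.
spectrum = np.abs(np.fft.rfft(pulse))
freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

# Report the band of frequencies whose amplitude exceeds half the peak.
band = freqs[spectrum > 0.5 * spectrum.max()]
print(band.min(), band.max())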

Claude.
 
Thank you for the proof. I will attempt to read past the first page when I have some more time on the weekend.

Also, thanks Claude for your explanation. I can understand how the size of the aperture relative to the wavelength is important, rather than the actual size. But then why is there such a sudden cutoff when the size of the obstacle becomes greater than the wavelength? This really poses the more fundamental question of why diffraction occurs at all, and I would probably need to work through the mathematics of the proof to completely understand the phenomenon.

Sebastian
 
sebastianhorvath said:
But then why is there such a sudden cutoff when the size of the obstacle becomes greater than the wavelength?
Sebastian

What cutoff exactly are you referring to here?

Diffraction of waves by obstacles (rather than apertures) falls within Rayleigh and Mie theories. As I understand it, once the size of the object becomes comparable to the wavelength, the dipole approximation underlying Rayleigh scattering becomes invalid, which is why one needs to invoke Mie theory, a rigorous derivation, to accurately describe the scattered field.
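A rough way to see where the dipole approximation gives way to Mie theory is the size parameter x = 2*pi*a/lambda. Here is a small Python sketch; the 0.3 threshold is a conventional rule of thumb I am assuming, not a hard physical boundary:

import math

def size_parameter(radius, wavelength):
    # Mie size parameter x = 2*pi*a / lambda (a = particle radius).
    return 2 * math.pi * radius / wavelength

# Rayleigh (dipole) scattering is a good approximation only for x << 1;
# for x ~ 1 or larger, the full Mie solution is needed.
for radius in (10e-9, 100e-9, 1e-6):
    x = size_parameter(radius, wavelength=500e-9)
    regime = "Rayleigh" if x < 0.3 else "Mie"
    print(f"a = {radius:.0e} m, x = {x:.2f} -> {regime} regime")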

Claude.
 
sebastianhorvath said:
But then why is there such a sudden cutoff when the size of the obstacle becomes greater than the wavelength?

There is no such sudden cutoff. Light passing a sharp edge diffracts, even though the "other edge" may be very far away. In studying the Fresnel theory of diffraction, it is a standard exercise to calculate the diffraction pattern produced by a barrier that extends to infinity in one direction but is open to infinity in the other. A Google search on "knife edge Fresnel diffraction" turns up lots of lecture notes, lab experiments, etc.
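For anyone who wants to play with it, here is a minimal Python sketch of the knife-edge pattern, using the standard Fresnel-integral result I/I0 = 0.5[(C(v)+0.5)^2 + (S(v)+0.5)^2], where v is the dimensionless Fresnel parameter (negative inside the geometric shadow):

import numpy as np
from scipy.special import fresnel

def knife_edge_intensity(v):
    # Relative intensity behind a straight edge as a function of the
    # Fresnel parameter v; v < 0 lies inside the geometric shadow.
    S, C = fresnel(v)
    return 0.5 * ((C + 0.5) ** 2 + (S + 0.5) ** 2)

v = np.linspace(-3.0, 3.0, 7)
print(knife_edge_intensity(v))
# At the shadow edge (v = 0) the intensity is exactly 1/4 of the
# unobstructed value, and it oscillates on the illuminated side --
# there is no wavelength cutoff anywhere in this result.
print(knife_edge_intensity(np.array([0.0])))  # -> 0.25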
 
I thought that when the size of the barrier becomes greater than the wavelength, diffraction no longer occurs (or is a lot less noticeable). This is what I was referring to as a cutoff; however, maybe I was wrong.

Sebastian
 