In order for diffraction to occur, the slit width must be on the order of the wavelength, correct? I'm puzzled because if the wavelength is measured along the x-axis (the direction of propagation) while the slit width is along the y-axis, I don't see the connection. Is this best described as a quantum mechanical effect? By passing through the slit the position of the wave is well known, resulting in a great uncertainty in the momentum, which leads to the "spreading out" of the wave. What's the classical analog? Huygens' principle?
Diffraction occurs regardless; there is no prior requirement that must be met. In the (theoretical) case of an infinite plane wave incident on a rectangular aperture, the extent of the diffraction (i.e. the far-field angle) depends on (wavelength)/(aperture width), that is, the size of the wavelength compared to the size of the aperture. Diffraction is usually regarded as a classical effect, since diffraction is 'built in' to Maxwell's equations. Claude.
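To put a number on that (wavelength)/(aperture width) ratio, here's a minimal sketch. The wavelength and slit width below are illustrative assumptions, not from the thread; the formula is the standard single-slit result, with the first dark fringe at sin(θ) = λ/a.

```python
# Far-field angle for single-slit diffraction: first minimum at
# sin(theta) = lambda / a. Numbers below are assumed for illustration.
import math

wavelength = 650e-9  # m, a red laser pointer (assumed)
aperture = 100e-6    # m, a 0.1 mm slit (assumed)

# Angle from the slit to the first dark fringe, in radians.
theta = math.asin(wavelength / aperture)
print(f"first-minimum angle: {math.degrees(theta):.3f} degrees")
```

With λ ≪ a, as above, the angle is tiny; shrink the aperture toward λ and the pattern fans out dramatically, which is the sense in which "slit width on the order of the wavelength" matters.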
So say I shine a laser pointer though a doorway, there's still going to be some (albeit slight) diffraction?
If the beam is narrower than the doorway, you're not going to get any diffraction from the doorway. If you shine the beam so it grazes one side of the doorway, then ideally you get "knife-edge diffraction" which is a well-studied situation. The thickness of a real wall will mess things up in practice, though.
So the beam has to be wider than the doorway for diffraction to occur. Does the wavelength play a role if it is not an infinite plane wave hitting the slit, but instead some other sort of wave (spherical, for example)?
A laser beam diffracts even in the absence of any apertures. Why does an aperture cause an infinite plane wave to diffract? Because it TRUNCATES the wavefront. It is this truncation (or 'cutting-off') that results in the diffraction predicted by Maxwell's equations. A laser pointer emits wavefronts that are ALREADY truncated, thus it will diffract. Shining a laser through a doorway will not have an effect because the doorway does not truncate the wavefront any further. Any aperture that DOES truncate the beam further, will cause the laser beam to diffract more than normal. Claude.
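The point that a laser beam diffracts even with no aperture can be made quantitative with the standard Gaussian-beam divergence formula, θ ≈ λ/(πw₀). The waist size below is an assumed, typical value for a pointer.

```python
# Intrinsic far-field divergence of a Gaussian laser beam,
# theta ~ lambda / (pi * w0): present even with no aperture at all,
# because the wavefront is already truncated to waist radius w0.
import math

wavelength = 650e-9  # m, red laser (assumed)
w0 = 0.5e-3          # m, beam waist radius, typical pointer (assumed)

theta = wavelength / (math.pi * w0)  # far-field half-angle, radians
print(f"divergence half-angle: {theta * 1e3:.3f} mrad")
# A doorway (~1 m wide) is vastly wider than w0, so it does not
# truncate the wavefront further and adds no extra diffraction.
```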
So forget about laser beams or electromagnetic waves. What about water waves or sound waves? Obviously these will also diffract if the wavefront is truncated, because they are described by similar (wave) equations. What about my quantum-mechanical viewpoint? Is that another way of viewing the same phenomenon for, say, an electron through a slit?
Which I say is equivalent to knowing the position of the electron better, hence greater uncertainty in the momentum, which "spreads out" the wavefunction.
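The uncertainty-principle argument can be sketched numerically. Confining an electron to a slit of width a gives Δp_y ≥ ħ/(2a), and dividing by the forward momentum p = h/λ gives an angular spread λ/(4πa). The slit width and de Broglie wavelength below are assumed for illustration; the comparison shows the HUP estimate has the same λ/a scaling as the wave-optics result, differing only by an O(1) factor.

```python
# Order-of-magnitude HUP estimate of single-slit spreading, versus the
# wave-optics first-minimum angle. Numbers are assumed for illustration.
import math

h = 6.626e-34        # J s, Planck constant
hbar = h / (2 * math.pi)
a = 1e-9             # m, slit width (assumed)
wavelength = 0.1e-9  # m, electron de Broglie wavelength (assumed)

p = h / wavelength           # forward momentum from p = h/lambda
dp_y = hbar / (2 * a)        # minimum transverse momentum uncertainty
theta_hup = dp_y / p         # HUP angular spread, = lambda / (4*pi*a)
theta_wave = wavelength / a  # wave-optics first-minimum angle

print(f"HUP estimate: {theta_hup:.3e} rad")
print(f"wave optics:  {theta_wave:.3e} rad")
# Same lambda/a scaling; they differ by a factor of 4*pi, which is
# why the argument is suggestive rather than rigorous.
```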
I have seen such arguments made in elementary physics texts, though I find it a little too hand-wavey for my liking. Claude.
Really? What's hand-wavey about it? I guess one would have to do some calculations to make it more concrete. Do you know if this has been done?
I have written about this quite a while back, so if you care to read about it.... https://www.physicsforums.com/journal.php?do=showentry&e=79&enum=24 Zz.
Most of these types of 'derivations' assume that the particles diffract without explaining why; many rely on a circular argument, citing the HUP as being responsible for diffraction when the HUP is the very result being pulled out of the analysis. These textbooks are written for an audience just being introduced to QM, and the argument is there more to illustrate the idea of the HUP. So I can see why they would skip that part or just gloss over it; a rigorous analysis would be far more intimidating for the reader. I'm sure such rigorous derivations exist, but they belong in advanced texts and journal papers. Claude.
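For contrast with the hand-wavy HUP argument, the non-circular route comes straight from wave theory: Fraunhofer diffraction from a slit of width a gives I(θ) ∝ sinc²(πa·sin(θ)/λ), with dark fringes where sin(θ) = mλ/a. A minimal sketch, with assumed numbers:

```python
# Single-slit Fraunhofer intensity pattern, derived from the wave
# equation rather than the HUP. Wavelength and slit width are assumed.
import math

wavelength = 650e-9  # m (assumed)
a = 50e-6            # m, slit width (assumed)

def intensity(theta):
    """Normalized single-slit Fraunhofer intensity, I/I0 = sinc^2(beta)."""
    beta = math.pi * a * math.sin(theta) / wavelength
    if beta == 0:
        return 1.0
    return (math.sin(beta) / beta) ** 2

theta_min = math.asin(wavelength / a)  # predicted first dark fringe
print(f"I(0)              = {intensity(0):.3f}")
print(f"I(first minimum)  = {intensity(theta_min):.2e}")
```

The intensity drops from its central maximum to (numerically) zero exactly at the angle the λ/a argument predicts, with no uncertainty principle invoked anywhere.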