## Relation between diffraction and wavelength

In order for diffraction to occur, the slit width must be on the order of the wavelength, correct? I'm puzzled because if the wave propagates along the x-axis while the slit extends along the y-axis, I don't see the connection. Is this best described as a quantum mechanical effect? By passing through the slit the position of the wave is well known, resulting in a large uncertainty in the momentum, which leads to the "spreading out" of the wave. What's the classical analog? Huygens' principle?
 Recognitions: Science Advisor Diffraction occurs regardless; there is no prior requirement that must be met. In the (theoretical) case of an infinite plane wave incident on a rectangular aperture, the extent of the diffraction (i.e. the far-field angle) depends on (wavelength)/(aperture width), that is, the size of the wavelength compared to the size of the aperture. Diffraction is usually regarded as a classical effect, since diffraction is 'built-in' to Maxwell's equations. Claude.
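To put rough numbers on the (wavelength)/(aperture width) scaling, here is a minimal sketch using the standard Fraunhofer single-slit result, sin θ = λ/a for the first minimum. The wavelength and slit width below are illustrative assumptions, not values from the thread:

```python
import math

# Illustrative values (assumed): a green laser and a 10-micron slit.
wavelength = 532e-9   # m
slit_width = 10e-6    # m

# Fraunhofer single-slit diffraction: first minimum at sin(theta) = lambda / a
sin_theta = wavelength / slit_width
theta_deg = math.degrees(math.asin(sin_theta))
print(f"first-minimum angle for a 10-micron slit: {theta_deg:.2f} deg")  # about 3 deg

# Same wavelength through a ~1 m "doorway": the angle is utterly negligible
theta_door_deg = math.degrees(math.asin(wavelength / 1.0))
print(f"first-minimum angle for a 1 m opening: {theta_door_deg:.2e} deg")
```

The comparison makes Claude's point concrete: diffraction happens in both cases, but when the aperture is vastly larger than the wavelength the far-field angle is immeasurably small.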
 So say I shine a laser pointer through a doorway, there's still going to be some (albeit slight) diffraction?

Mentor

If the beam is narrower than the doorway, you're not going to get any diffraction from the doorway.

If you shine the beam so it grazes one side of the doorway, then ideally you get "knife-edge diffraction" which is a well-studied situation. The thickness of a real wall will mess things up in practice, though.
 So the beam has to be wider than the doorway for diffraction to occur. Does the wavelength play a role if it is not an infinite plane wave hitting the slit, but instead some other sort of wave (spherical, for example)?
 Recognitions: Science Advisor A laser beam diffracts even in the absence of any apertures. Why does an aperture cause an infinite plane wave to diffract? Because it TRUNCATES the wavefront. It is this truncation (or 'cutting-off') that results in the diffraction predicted by Maxwell's equations. A laser pointer emits wavefronts that are ALREADY truncated, thus it will diffract. Shining a laser through a doorway will not have an effect because the doorway does not truncate the wavefront any further. Any aperture that DOES truncate the beam further will cause the laser beam to diffract more than it otherwise would. Claude.
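The claim that a laser diffracts with no aperture at all can be made quantitative with the standard Gaussian-beam divergence formula, θ = λ/(πw₀), where w₀ is the beam waist radius. A minimal sketch; the wavelength and waist values are assumed, typical for a handheld pointer:

```python
import math

# Illustrative values (assumed): red laser pointer with a ~0.5 mm waist radius.
wavelength = 650e-9   # m
w0 = 0.5e-3           # m, beam waist radius

# Far-field half-angle divergence of a Gaussian beam: theta = lambda / (pi * w0)
theta = wavelength / (math.pi * w0)
print(f"divergence half-angle: {theta * 1e3:.3f} mrad")  # roughly 0.4 mrad
```

Note the same λ/(transverse size) structure as the slit formula: the finite beam waist plays the role of the truncating aperture, which is exactly Claude's point about the wavefront being already truncated.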
 So forget about laser beams or electromagnetic waves. What about water waves or sound waves? Obviously these will also diffract if the wavefront is truncated, because they are described by similar (wave) equations. What about my quantum-mechanical viewpoint? Is that another way of viewing the same phenomenon for, say, an electron passing through a slit?
 Recognitions: Science Advisor Yes, you truncate the possible locations where the electron could be.
 Which I say is equivalent to knowing the position of the electron better, therefore greater uncertainty in the momentum, which "spreads out" the wavefunction.
 Recognitions: Science Advisor I have seen such arguments made in elementary physics texts, though I find it a little too hand-wavey for my liking. Claude.
 Really? What's hand-wavey about it? I guess one would have to do some calculations to make it more concrete. Do you know if this has been done?
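The back-of-envelope version of this calculation is easy to do, and it also shows why the argument is only order-of-magnitude: taking Δy = a and the minimum-uncertainty bound Δp_y = ħ/(2a), the predicted spread angle Δp_y/p differs from the wave-optics single-slit scale λ/a by a constant factor of 4π. A minimal sketch; the electron wavelength and slit width are illustrative assumptions:

```python
import math

h = 6.626e-34          # Planck constant, J*s
hbar = h / (2 * math.pi)

# Illustrative values (assumed): ~1 angstrom electron, 10 nm slit.
wavelength = 1e-10     # m, de Broglie wavelength
a = 1e-8               # m, slit width

p = h / wavelength     # de Broglie momentum
dp_y = hbar / (2 * a)  # minimum transverse momentum uncertainty for delta_y = a

theta_hup = dp_y / p          # spread angle from the uncertainty principle
theta_wave = wavelength / a   # single-slit first-minimum angular scale
ratio = theta_wave / theta_hup

print(f"uncertainty estimate: {theta_hup:.3e} rad")
print(f"wave-optics scale:    {theta_wave:.3e} rad")
print(f"ratio: {ratio:.3f}")  # exactly 4*pi ~ 12.57
```

So the uncertainty-principle argument gets the λ/a scaling right but not the numerical factor, and it says nothing about the fringe structure, which is presumably what "hand-wavey" refers to. The full answer requires actually propagating the wavefunction through the slit.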

Mentor
Blog Entries: 27
 Quote by eep Which I say is equivalent to knowing the position of the electron better, therefore greater uncertainty in the momentum, which "spreads out" the wavefunction.

http://www.physicsforums.com/journal...y&e=79&enum=24

Zz.
