Ray12
#1
Mar 1, 2012, 08:25 PM
P: 6
I've searched online and on the forum, but I still can't find an explanation of the mechanism behind why diffraction depends on wavelength.

For example, consider a water wave diffracting around a small boat (one smaller than the wavelength). The degree of diffraction decreases as the boat gets bigger, becoming negligible once the boat is much larger than the wavelength.

Why is this? Is it just an empirical observation that's taken as an axiom, or is there an explanation for it?

(Note: explanations may involve Newtonian mechanics and vector calculus, as I am already familiar with both.)
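
A minimal numerical sketch may make the wavelength dependence concrete. Huygens' principle treats every point on a wavefront as a secondary source; summing those wavelets over an aperture of size a (a slit standing in for the boat-sized obstacle, via Babinet's principle) gives a far-field pattern whose first zero sits near sin(θ) = λ/a. Everything below is illustrative: the slit_pattern function and the parameter values are assumptions for the sketch, not from any particular reference.

Code:
import numpy as np

def slit_pattern(wavelength, width, angles, n_sources=500):
    """Far-field (Fraunhofer) intensity behind a slit of the given width,
    computed by superposing Huygens point sources across the aperture."""
    k = 2 * np.pi / wavelength
    xs = np.linspace(-width / 2, width / 2, n_sources)
    # A source at position x contributes a path difference x*sin(theta)
    # toward angle theta; sum the wavelets and square the magnitude.
    field = np.exp(1j * k * np.outer(np.sin(angles), xs)).sum(axis=1)
    intensity = np.abs(field) ** 2
    return intensity / intensity.max()

a = 1.0                                      # aperture size (arbitrary units)
angles = np.linspace(1e-6, np.pi / 2, 2000)  # observation angles, radians

for lam in (0.2, 0.5, 1.0):                  # wavelengths relative to a
    I = slit_pattern(lam, a, angles)
    # The first zero of the pattern marks the edge of the central lobe,
    # i.e. how far the wave has spread; theory puts it at sin(theta) = lam/a.
    mins = np.where((I[1:-1] < I[:-2]) & (I[1:-1] < I[2:]))[0]
    if mins.size:
        print(f"lambda/a = {lam:.1f}: first zero near "
              f"{angles[mins[0] + 1]:.3f} rad "
              f"(theory: {np.arcsin(lam / a):.3f} rad)")
    else:
        print(f"lambda/a = {lam:.1f}: no zero below 90 degrees; "
              f"the wave spreads across the whole shadow region")

Run as-is, the printed first zero should move outward as λ/a grows, and once λ is comparable to a there is no zero at all below 90 degrees: the wave floods the entire region behind the obstacle, which is exactly the "boat smaller than the wavelength" regime in the example above.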