Mar1-12, 08:25 PM
I've searched online and on the forum but still can't find an explanation or mechanism behind why diffraction is dependent upon wavelength.

For example, consider a water wave diffracting around a small boat (smaller than the wavelength). The degree of diffraction decreases as the boat gets bigger, becoming negligible once the boat is much larger than the wavelength.
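The trend described above can be stated quantitatively. For a slit of width $a$ (by Babinet's principle, the complementary problem to an obstacle of the same size), Fraunhofer diffraction puts the first minimum at $\sin\theta = \lambda/a$: when $a \gg \lambda$ the wave barely spreads, and when $a \le \lambda$ there is no minimum at all and the wave spreads into the full half-space. A minimal Python sketch of this relation (the function name and units are illustrative, not from the original post):

```python
import math

def first_minimum_angle(wavelength, aperture):
    """Angle (radians) of the first diffraction minimum for a slit
    of width `aperture`, from sin(theta) = wavelength / aperture.
    Returns None when aperture <= wavelength: no minimum exists and
    the wave spreads into the entire half-space (strong diffraction)."""
    ratio = wavelength / aperture
    if ratio >= 1.0:
        return None  # slit/obstacle no bigger than a wavelength
    return math.asin(ratio)

wavelength = 1.0  # arbitrary units; only the ratio lambda/a matters
for aperture in [0.5, 1.0, 2.0, 5.0, 20.0]:
    theta = first_minimum_angle(wavelength, aperture)
    if theta is None:
        print(f"a = {aperture:5.1f}: strong diffraction (a <= lambda)")
    else:
        print(f"a = {aperture:5.1f}: central lobe half-angle = "
              f"{math.degrees(theta):6.2f} deg")
```

Running this shows the central lobe narrowing as the aperture grows relative to the wavelength, which is exactly the boat-size dependence described above; the formula itself, however, is a consequence of wave superposition rather than an explanation of it.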

Why is this? Is it just an empirical observation that's taken as an axiom or is there an explanation for this?

(Note: any explanations can involve Newtonian Mechanics and Vector Calculus, as I am already familiar with them.)