Ray12
#1
Mar1-12, 08:25 PM
P: 6
I've searched online and on the forum but still can't find an explanation or mechanism behind why diffraction is dependent upon wavelength.

For example, consider a water wave diffracting around a small boat (smaller than the wavelength). The degree of diffraction decreases as the boat gets bigger, becoming negligible once the boat is much larger than the wavelength.
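For concreteness, the trend I mean is the one in the standard single-slit result sin θ = λ/a (first diffraction minimum for aperture width a): the spreading angle grows as a shrinks toward λ, and no minimum exists at all when a < λ. A quick numerical sketch (units chosen so λ = 1; the specific widths are just illustrative):

```python
import math

WAVELENGTH = 1.0  # arbitrary units

def first_minimum_angle(aperture_width, wavelength=WAVELENGTH):
    """Angle (degrees) of the first single-slit diffraction minimum,
    from sin(theta) = lambda / a. Returns None when a < lambda,
    i.e., no minimum exists and the wave spreads into the whole
    region behind the aperture."""
    ratio = wavelength / aperture_width
    if ratio > 1.0:
        return None
    return math.degrees(math.asin(ratio))

for a in (0.5, 1.0, 2.0, 10.0):
    angle = first_minimum_angle(a)
    label = f"{angle:.1f} deg" if angle is not None else "no minimum (strong spreading)"
    print(f"a = {a:4.1f} x lambda: first minimum at {label}")
```

So the formula reproduces the observation (wide obstacle/aperture relative to λ means narrow spreading), but it doesn't tell me *why* the ratio λ/a is what matters, which is really my question.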

Why is this? Is it just an empirical observation taken as an axiom, or is there an explanation for it?

(Note: any explanations can involve Newtonian Mechanics and Vector Calculus, as I am already familiar with them.)