# Homework Help: Question about optics (diffraction)

I was working through a practice test on optics and was a little confused about how to tackle this problem:

If you illuminate a hole and project the resulting pattern onto a screen, then as you decrease the hole size, the pattern on the screen first gets smaller but ultimately begins to grow larger. Why is this? Derive a formula for the diameter of the screen pattern as a function of the hole size, the distance between the hole and the screen, and the wavelength.

I know the effect is diffraction, but deriving the stated relation is giving me trouble. I was thinking of using the Kirchhoff integral theorem and then carrying out the integral over a rectangular aperture, but maybe that is overkill. Is there an easier way to start thinking about this?
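
For orientation (this is not from the original post), one common back-of-the-envelope approach skips the full Kirchhoff integral: treat the spot diameter as the geometric shadow of the hole plus the diffraction spread, and assume the two contributions simply add. The symbols $D$, $d$, $L$, and $\lambda$ below are my own labels for the pattern diameter, hole diameter, hole-to-screen distance, and wavelength:

```latex
% Sketch under the assumption that the spot diameter is the geometric
% shadow (d) plus the diffraction spread over distance L, using the
% order-of-magnitude half-angle theta ~ lambda / d.
\[
  D(d) \;\approx\; d + 2L\,\frac{\lambda}{d}
\]
% Minimizing over the hole diameter d:
\[
  \frac{dD}{dd} = 1 - \frac{2L\lambda}{d^{2}} = 0
  \quad\Longrightarrow\quad
  d_{\min} = \sqrt{2L\lambda},
  \qquad
  D_{\min} = 2\sqrt{2L\lambda}
\]
```

The $2L\lambda/d$ term growing as $d$ shrinks is what makes the pattern expand again below $d_{\min}$. The numerical prefactor (e.g. $1.22$ for the Airy disk of a circular aperture) depends on the aperture shape and on how you define the pattern's edge, so treat the result as an estimate up to an $O(1)$ factor.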
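If it helps to see the turnaround numerically, here is a minimal Python sketch of the same estimate; the values of `lam` and `L` are arbitrary illustrative choices, not from the problem:

```python
import numpy as np

# Numerical check of the estimate D(d) = d + 2*L*lam/d from the sketch
# above, using illustrative values: green light, screen 1 m from the hole.
lam = 550e-9   # wavelength in metres (hypothetical example value)
L = 1.0        # hole-to-screen distance in metres (hypothetical)

d = np.linspace(1e-5, 5e-3, 10_000)   # candidate hole diameters in metres
D = d + 2 * L * lam / d               # estimated spot diameter on the screen

i = np.argmin(D)
print(f"numerical minimum: d = {d[i]:.3e} m, D = {D[i]:.3e} m")
print(f"analytic  minimum: d = {np.sqrt(2 * L * lam):.3e} m, "
      f"D = {2 * np.sqrt(2 * L * lam):.3e} m")
```

For these values the minimum lands near $d \approx 1\,\mathrm{mm}$, i.e. a hole about a millimetre across gives the smallest spot one metre away.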
