I was working through a practice test on optics and was a little confused about how to tackle this problem:

If you illuminate a hole and project the resulting pattern on a screen, then as you decrease the hole size, the screen pattern initially gets smaller and then ultimately begins to grow larger. Why is this? Derive a formula for the screen pattern diameter as a function of the hole size, the distance between the screen and the hole, and the wavelength.
I know the effect is diffraction, but deriving the mentioned relation is giving me trouble. I was thinking of using the Kirchhoff integral theorem and then carrying out an integral over a rectangular aperture, but maybe that is overkill. Is there an easier way to start thinking about this?
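For context, here is the kind of back-of-the-envelope estimate I suspect the problem wants: add the geometric shadow of the hole to the far-field diffraction spread. The 1.22 factor for a circular aperture and the specific form of the spread are my assumptions, not something given in the problem.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Assumed heuristic: spot diameter = geometric shadow + diffraction spread.
% For a circular hole of diameter $d$, screen distance $L$, wavelength $\lambda$,
% the first Airy minimum lies at half-angle $\theta \approx 1.22\,\lambda/d$.
\begin{align}
  D(d) &\approx d + 2L\theta \approx d + \frac{2.44\,\lambda L}{d}, \\
  \frac{dD}{dd} &= 1 - \frac{2.44\,\lambda L}{d^2} = 0
    \quad\Rightarrow\quad d_{\min} \approx \sqrt{2.44\,\lambda L}.
\end{align}
\end{document}
```

If that is the intended level of rigor, the first term dominates for a large hole (pattern shrinks with the hole) and the second term dominates for a small hole (pattern grows again), which would explain the turnaround without the full Kirchhoff integral. I'd still like to know whether this heuristic is acceptable or whether a proper diffraction-integral derivation is expected.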