Question about optics (diffraction)


Gza

I was working out a practice test on optics and was a little confused about how to tackle this problem:


If you illuminate a hole and project the resulting pattern on a screen: as you decrease the hole size, the screen pattern gets smaller initially and then ultimately begins to grow larger. Why is this? Derive a formula for the screen pattern diameter as a function of the hole size, the distance between the screen and the hole, and the wavelength.


I know the effect is diffraction, but deriving the mentioned relation is giving me trouble. I was thinking of using the Kirchhoff integral theorem and then perhaps carrying out an integral over a rectangular aperture, but maybe that is overkill. Is there an easier way to start thinking about this?
 

Galileo

Science Advisor
Homework Helper
Gza said:
If you illuminate a hole and project the resulting pattern on a screen: as you decrease the hole size, the screen pattern gets smaller initially and then ultimately begins to grow larger. Why is this? Derive a formula for the screen pattern diameter as a function of the hole size, the distance between the screen and the hole, and the wavelength.
Is this considered introductory physics nowadays? :eek: I must be getting old.

Maybe this will help a bit:
http://en.wikipedia.org/wiki/Diffraction
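
For what it's worth, a minimal back-of-the-envelope sketch (assuming a circular hole of diameter $d$, screen distance $L$, and wavelength $\lambda$; no Kirchhoff integral needed): the spot on the screen is roughly the geometric shadow of the hole plus the diffraction spread, where the half-angle to the first Airy minimum is $\sin\theta \approx 1.22\,\lambda/d$,

$$D(d) \;\approx\; d + 2L\tan\theta \;\approx\; d + \frac{2.44\,\lambda L}{d}.$$

For a large hole the geometric term $d$ dominates, so shrinking the hole shrinks the spot; once $d$ gets small enough, the $\lambda L/d$ term takes over and the spot grows again. Setting $dD/dd = 0$ gives the minimum spot size at $d \approx \sqrt{2.44\,\lambda L}$, i.e. roughly $d \sim \sqrt{\lambda L}$.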
 

Gza

Lol, yeah, this was probably a mispost; I got a chance to check the other posts in this section and realized it at that point. Thanks for the reference though, Galileo.
 
