Philip Koeck
Textbook examples usually involve a plane monochromatic wave that is diffracted by a plane grating.
If one places an ideal focusing lens behind the grating one will get a diffraction pattern in the back focal plane of the lens.
The geometric size of this diffraction pattern is proportional to the focal length of the lens.
Now if the incident wave is divergent or convergent, the diffraction pattern instead forms at a distance d from the lens that is larger or smaller, respectively, than the focal length.
There are two things I'm wondering:
1. Is the diffraction pattern still sharply focused, and does it lie on a plane or on a curved surface?
2. Is the size of the diffraction pattern still proportional to the distance d?
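To make the setup above concrete, here is a small sketch (my own illustration, not part of the original question) that locates the distance d using the thin-lens equation. It assumes the standard result that each diffraction order focuses in the plane where the lens images the source point of the incident wave: a real source in front of the lens gives a divergent wave, a virtual source behind the lens gives a convergent wave, and a source at infinity recovers the plane-wave case d = f.

```python
def pattern_distance(f, s):
    """Distance d behind an ideal thin lens of focal length f at which the
    diffraction pattern focuses, assuming the pattern forms in the image
    plane of the illumination source.

    s > 0: real source a distance s in front of the lens (divergent wave)
    s < 0: virtual source a distance |s| behind the lens (convergent wave)
    s = inf: plane-wave illumination, giving d = f
    """
    if s == float('inf'):
        return f
    # Thin-lens (Gaussian) imaging equation: 1/f = 1/s + 1/d
    return f * s / (s - f)

# Divergent illumination: source 0.3 m in front of an f = 0.1 m lens
print(pattern_distance(0.1, 0.3))          # d > f

# Convergent illumination: virtual source 0.3 m behind the lens
print(pattern_distance(0.1, -0.3))         # d < f

# Plane wave: pattern in the back focal plane
print(pattern_distance(0.1, float('inf')))
```

With these numbers the divergent case gives d = 0.15 m (beyond the back focal plane) and the convergent case d = 0.075 m (in front of it), matching the qualitative statement in the question.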