How is diffraction affected by grating thickness?
I would like to hear from someone here who has either theoretical or practical experience with not-too-thin transmission gratings.
I have the following problem. I want to compute the far-field diffraction pattern of an electromagnetic wave (with a wavelength in the visible spectrum) as it passes through a diffraction grating that is 3-5 wavelengths (1-2 micrometers) thick. For accuracy I want to use vector diffraction theory, because the grating period is also only a few wavelengths (a few micrometers). How does a thickness of this extent (3-5 lambda) affect the diffraction pattern? Can I get a reasonable result if I assume the grating to be infinitely thin and neglect its thickness? Or is the discrepancy substantial, so that I should carry out a rigorous calculation with near-field propagation inside the holes of the grating?
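For a first sanity check before committing to a full vector calculation, one common rule of thumb is the Klein-Cook parameter Q = 2·pi·lambda·d / (n·Lambda²), which separates the thin (Raman-Nath, Q ≪ 1) and thick (Bragg, Q ≫ 10) grating regimes. The sketch below evaluates it for numbers like those in the question; the refractive index n = 1.5 and the specific wavelength/period values are my illustrative assumptions, not data from the post:

```python
import math

def klein_cook_Q(wavelength_um, thickness_um, period_um, n=1.5):
    """Klein-Cook parameter Q = 2*pi*lambda*d / (n * Lambda^2).

    Q << 1  -> thin-grating (Raman-Nath) regime: neglecting thickness
               is often a reasonable first approximation.
    Q >> 10 -> thick-grating (Bragg) regime: thickness dominates and
               must be modeled explicitly.
    All lengths in micrometers; n is the refractive index (assumed 1.5).
    """
    return 2 * math.pi * wavelength_um * thickness_um / (n * period_um ** 2)

# Illustrative values: visible light (0.55 um), 1-2 um thickness,
# a few-micrometer period (3 um assumed here)
for d in (1.0, 2.0):
    Q = klein_cook_Q(wavelength_um=0.55, thickness_um=d, period_um=3.0)
    print(f"thickness d = {d} um -> Q = {Q:.2f}")
```

With these assumed numbers Q comes out below 1, i.e. near the thin-grating side but not deep in it, which suggests the infinitely-thin approximation may be borderline; the parameter is only a heuristic, and a rigorous method (e.g. RCWA or FDTD) would be needed to quantify the discrepancy.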
I would really appreciate an answer from anyone who is knowledgeable in optics.
Thank you and have a nice day!