
How is diffraction affected by grating thickness?

  1. Sep 15, 2012 #1
    I would like to talk to someone here who has either theoretical or practical experience with not-too-thin transmission gratings.

    I have the following problem. I want to compute the far-field diffraction pattern of an electromagnetic wave (with a wavelength in the visible spectrum) as it passes through a diffraction grating which is 3-5 wavelengths (1-2 micrometers) thick. For precision I want to use vector diffraction theory, because the grating period is also only a few wavelengths (a few micrometers) in size. How does a thickness of this extent (3-5 lambda) affect the diffraction pattern? Can I get a reasonable result if I assume the grating to be infinitely thin and neglect its thickness? Or is the discrepancy substantial, so that I should carry out a precise calculation with near-field propagation inside the holes of the grating?
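    A rough first check of whether the thin-grating approximation is defensible can be made with the Klein-Cook parameter Q = 2πλd / (nΛ²), a standard criterion separating the thin (Raman-Nath, Q ≪ 1) and thick (Bragg, Q ≫ 1) regimes. The sketch below is only an order-of-magnitude estimate, and the specific numbers (λ = 0.5 µm, thickness d = 1.5 µm, period Λ = 2 µm, refractive index n = 1.5) are illustrative assumptions in the range described above, not values from an actual grating:

    ```python
    import math

    def klein_cook_Q(wavelength, thickness, period, n):
        """Klein-Cook parameter at normal incidence.

        Q << 1 suggests the thin-grating (Raman-Nath) regime;
        Q >> 1 suggests the thick-grating (Bragg) regime.
        """
        return 2 * math.pi * wavelength * thickness / (n * period ** 2)

    # Illustrative values in the range the question describes (all in meters)
    Q = klein_cook_Q(wavelength=0.5e-6, thickness=1.5e-6, period=2.0e-6, n=1.5)
    print(f"Q = {Q:.2f}")  # Q comes out near unity for these parameters
    ```

    For parameters like these Q lands near 1, i.e. in neither limiting regime, which would suggest that neglecting the thickness outright is risky and a rigorous (e.g. RCWA or FDTD) calculation may indeed be warranted.
    
    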

    I would really appreciate an answer from anyone who is competent in optics.

    Thank you and have a nice day!
    Daniel
     