matheyrichs
I'm working on a laser-based holographic interferometry system. Basically, I produce a linear fringe pattern with a Mach–Zehnder style interferometer. When I put a transparent sample (cell cultures) in one of the beam paths, I can determine the phase offset it produces from the deformation of these fringes.
So I've been having trouble getting a strong interference signal at my camera, and a colleague recommended I polarize the laser light before sending it through the system to get stronger fringe intensity. He was absolutely right -- my fringes are much clearer than they used to be. But I don't understand why this works. He told me that when two interfering beams have the same polarization, the signal is better (obviously...), but he couldn't tell me why that's the case. I would think that if you have a superposition of two different polarizations and each one interferes similarly, shouldn't the interference term have the same intensity relative to the background signal??
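To make my confusion concrete, here's a quick numpy sketch of the toy model I have in mind (Jones vectors for the two beams; the function name and the angle parameter are just my own notation). It computes the fringe visibility as a function of the angle between the two beams' linear polarization directions:

```python
import numpy as np

def visibility(theta):
    """Fringe visibility for two unit-amplitude beams whose linear
    polarizations differ by angle theta (Jones-vector toy model)."""
    E1 = np.array([1.0, 0.0])                      # beam 1: horizontal
    E2 = np.array([np.cos(theta), np.sin(theta)])  # beam 2: rotated by theta
    # Sweep the relative phase phi and compute I(phi) = |E1 + E2*exp(i*phi)|^2,
    # summing intensity over the two polarization components.
    phi = np.linspace(0, 2 * np.pi, 1001)
    I = (np.abs(E1[:, None] + E2[:, None] * np.exp(1j * phi)) ** 2).sum(axis=0)
    return (I.max() - I.min()) / (I.max() + I.min())

print(visibility(0.0))        # parallel polarizations: full contrast
print(visibility(np.pi / 2))  # orthogonal polarizations: contrast vanishes
```

Working it out by hand, the intensity comes out as I(phi) = 2 + 2*cos(theta)*cos(phi), so the visibility is just cos(theta) -- full fringes for parallel polarizations, none for orthogonal ones. That much I follow; it's the unpolarized-source case that confuses me.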
Any explanation for why interferometers work better with linearly polarized light will be much appreciated!
Thanks!