jmm5872
If a "line" source of visible light is not really a line but has a width of 1 mm, how far must it be from a double slit that it illuminates in order for the two slits to be reasonably coherent? Assume the slit separation is 0.5 mm.
I approached this problem the same as finding the distance needed for two light sources to be coherent.
Let the line between the two sources be perpendicular to the line from source 1 to the point P, and assume that L1P and L2P are parallel and approximately equal to a length L.
We also need L2P not to exceed L1P by more than half a wavelength.
Then I applied the Pythagorean theorem together with the criterion above: L2P = sqrt(L^2 + (L12)^2) ≈ L + (L12)^2/(2L), and requiring the excess path to be at most λ/2 gives
L = (L12)^2 / λ
where L12 is the separation between the two sources.
But I don't think this is the whole picture. I haven't used the 0.5 mm slit separation anywhere. Since the slit separation is half the source width, does this mean the screen containing the slits can be at half the distance L given by the formula above?
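As a quick sanity check on the formula L = (L12)^2 / λ, here is a short numeric sketch. The problem doesn't give a wavelength, so the 550 nm mid-visible value below is an assumption, not part of the original problem:

```python
# Coherence-distance estimate L = d^2 / lambda, where d is the 1 mm
# width of the "line" source (its two edges treated as independent
# point sources that must stay roughly in phase at the slits).

d = 1e-3              # source width in meters (1 mm, from the problem)
wavelength = 550e-9   # ASSUMED mid-visible wavelength in meters

L = d**2 / wavelength  # minimum source-to-slits distance
print(f"L = {L:.2f} m")
```

For these numbers L comes out to a bit under 2 m, which is a plausible bench-scale answer for a visible-light coherence problem.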