1. The problem statement, all variables and given/known data

An interferometer has one perfect (flat) mirror and one mirror whose thickness varies across its surface. You shine parallel coherent light of wavelength 633 nm and observe an interference pattern: wavy dark and light fringes of varying thickness. The centers of two neighbouring fringes, A and B, are marked, and they are separated by 3 mm. Find the slope of the line between A and B on the uneven mirror's surface with respect to a plane of constant phase of the wave.

2. Relevant equations

y = m(lambda)/2

3. The attempt at a solution

I understand why thick fringes are formed: the source isn't a point source, it's a beam of light. That beam can be broken up into infinitely many point sources, each of which produces its own interference pattern. But I have absolutely no idea how to approach this question. My guess is to find the wavelength that would have resulted in a fringe-center separation of 3 mm and then compare it to 633 nm (maybe the slope comes out of that), but I don't know how. :S Any hint would be greatly appreciated.
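As a sketch of one standard way to read the relevant equation: in a two-beam interferometer, moving from one fringe center to the next corresponds to the round-trip path difference changing by one wavelength, i.e. the mirror height changing by lambda/2. If that applies here (this is an assumption, not a confirmed solution), the slope over the 3 mm separation would be:

```python
import math

# Assumption: between neighbouring fringe centers A and B the mirror
# height changes by lambda/2 (one fringe = one wavelength of round-trip
# path difference). Numbers below come from the problem statement.
wavelength = 633e-9   # m (given)
separation = 3e-3     # m (distance between fringe centers A and B, given)

height_step = wavelength / 2        # change in mirror height between fringes, m
slope = height_step / separation    # dimensionless slope (tangent of the tilt angle)
tilt_deg = math.degrees(math.atan(slope))

print(f"slope = {slope:.4e}")           # about 1.055e-04
print(f"tilt angle = {tilt_deg:.3e} deg")
```

The slope is tiny, which is consistent with interferometry being sensitive to surface variations far smaller than a millimetre.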