Hi optical engineers/physicists,

I'm designing an optical sensor system, and it would be very beneficial if there were a method for determining the difference between two very close (down to tens of femtometres) optical wavelengths. The measured difference can be relative, although absolute would be preferred. The wavelengths will be around 1550 nm.

I thought there would already be accurate interferometric methods for this, but as far as I can see the only technique in use is to measure the beat wavelength created by combining the two signals (completely impractical in this case, where the beat wavelength would be of the order of metres).

Am I missing an obvious or common technique? The question sounds simple, but I cannot seem to find a simple solution at all. Any advice is greatly appreciated, of course.

Thanks,
Lichen

Edit: One way, of course, is to measure the two wavelengths separately and then subtract electronically, but the system is then subject to twice the measurement error. I would rather the differential measurement were performed optically, with photodetection only happening afterwards.
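For scale, the beat implied by a tens-of-femtometre difference near 1550 nm can be sketched numerically. This is only a back-of-the-envelope check of the "order of metres" claim; the 50 fm difference used below is an assumed example value, not a system specification:

```python
# Rough scale of the beat between two optical tones near 1550 nm
# whose wavelengths differ by tens of femtometres (values assumed).
c = 299_792_458.0          # speed of light, m/s

lam = 1550e-9              # carrier wavelength, m
dlam = 50e-15              # wavelength difference, 50 fm (assumed example)

dnu = c * dlam / lam**2    # beat frequency, Hz (a few MHz here)
beat_wavelength = c / dnu  # spatial period of the beat, m
                           # equivalently lam**2 / dlam, a few tens of metres

print(f"beat frequency  : {dnu / 1e6:.2f} MHz")
print(f"beat wavelength : {beat_wavelength:.2f} m")
```

So a 50 fm split gives a beat frequency of roughly 6 MHz, i.e. a beat wavelength of roughly 48 m, consistent with the "metres and beyond" scale mentioned above (a 10 fm split stretches it to hundreds of metres).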