Hey guys,

There's something I've known and applied for a long time: the longer the interferogram, the finer the resolution of the resulting frequency-domain spectrum. But I've never fully understood why. I've always waved it off as having something to do with the uncertainty principle, because that's what I was told many years ago in secondary school. I've read around and can't really find a good explanation. It'd be great if somebody here knew.

Thanks :)

Regards,
Chris.
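To make the question concrete, here's a minimal sketch of the effect I mean (plain NumPy, synthetic data rather than a real interferogram, and the frequencies and durations are just values I picked): two closely spaced frequencies only separate into two peaks in the FFT when the time-domain record is long enough, since the bin spacing is 1/duration.

```python
import numpy as np

fs = 1000.0            # sampling rate, Hz (arbitrary choice)
f1, f2 = 100.0, 101.0  # two frequencies spaced 1 Hz apart

def resolved(duration):
    """Return True if an FFT of a `duration`-second record shows
    two distinct peaks at f1 and f2."""
    t = np.arange(0, duration, 1 / fs)
    signal = np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t)
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    # FFT bin spacing is 1 / duration, independent of the sampling rate
    i1 = np.argmin(np.abs(freqs - f1))
    i2 = np.argmin(np.abs(freqs - f2))
    if i1 == i2:
        return False  # both frequencies land in the same bin
    # call the peaks distinct if there is a clear dip between them
    mid = (i1 + i2) // 2
    return spectrum[mid] < 0.8 * min(spectrum[i1], spectrum[i2])

print(resolved(0.5))  # 0.5 s record: bin spacing 2 Hz > 1 Hz gap -> False
print(resolved(4.0))  # 4 s record: bin spacing 0.25 Hz -> True
```

Doubling the record length halves the bin spacing, which is exactly the "longer interferogram, better resolution" behaviour I'm asking about.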