Interferometry (testing of optics)

In summary, this thread discusses how best to set up an interferometer when measuring the wavefront error of optics. The original poster was taught to tilt out the fringes before measuring, while a colleague suggested leaving several fringes in; the replies explain why some residual tilt improves numerical accuracy and precision when the software decomposes the interferogram into Zernike polynomials. The thread closes with a recommendation of a standard book on optical shop testing.
  • #1
Doc
Hi all,

I'm a mechanical engineer who has been dumped into optical metrology at work without anybody much more knowledgeable than myself to help me out. A previous mentor who left recently (who was our optical expert) always told me when measuring wavefront error of optics to "tilt-out the fringes" before taking measurements with the interferometer. I have been doing this for some recent measurements: getting rid of nearly all of the fringes bar maybe one. However, yesterday a colleague thought it would be better to leave some fringes in, maybe five to six. He didn't know why this was a good idea, and neither do I.

I was curious and did two measurements of an elliptical flat: one after tilting out nearly all of the fringes, and a second after adding fringes back in, maybe around ten. The RMS wavefront error measured was approximately the same (after the software subtracts off the tilt).

My question is: are more or fewer fringes better for this type of measurement? I have seen measurement reports from vendors who sent us the optics, and the interferograms on their reports show roughly five fringes.

Please help!
 

Attachments

  • interferograms.png
  • layout.png
  • #2
My sympathies - I've been in the position of being promoted to expert because the actual experts left a couple of times. It's slightly nerve-wracking.

In a manual setup, I was taught to leave in a few fringes because it's easier for humans. For example, if you've got a plane wave propagating down two arms and you want to zero the path difference, you are looking for the fringe with the strongest contrast. In a setup with no tilt, you have to judge whether the current black screen is blacker than the last black screen, which is difficult for humans. If you are looking at three or four lines, though, spotting the darkest one is more manageable. And if you have a non-plane wave, you'll find that with a bit of tilt the fringes often curve one way on one side of equal path length and the other way on the other side, and spotting the place where the lines switch from one curve to the other is easier than looking for the largest and most uniformly dark central spot. And when you adjust something, you have a better chance of accurately counting how many lines moved across your visual field than how many times your visual field blinked bright and dark.

All of that might be why your colleague feels like lines would be better. It might also be why you get interferograms looking like that - so you can eyeball the results yourself.

However, I get the impression you have a computer-driven system. I rather suspect that none of these constraints apply to that. Or rather, different constraints apply, and the best approach would depend on how the software analyses the interferograms. I must say my experience with interferometry is mostly in the classroom setting, and we used manual kit to learn on (also, computer-controlled kit was out of the university's price range at the time). It might be worth investigating whether your instrumentation manufacturer has a training program. Or seeing if you can buy your former mentor a beer one day?
 
  • #3
Doc said:
[...] My question is: are more or fewer fringes better for this type of measurement? I have seen measurement reports from vendors who sent us the optics, and the interferograms on their reports show roughly five fringes.

'Tilt' wavefront error does not degrade images, which is why the software automatically subtracts it. One reason it's best to leave some residual tilt is numerical accuracy and precision: most likely, the software decomposes the interferogram into Zernike polynomials. If you manually remove all of the tilt, noise can contribute substantially to the polynomial decomposition, resulting in a poor fit. Too much tilt will likewise increase the noise (once the software subtracts the tilt out), so a 'best practice' is to leave some tilt on the optic under test.
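To make that concrete, here's a minimal numpy sketch of the kind of fit I mean: a least-squares Zernike decomposition of a phase map, with piston, tip, and tilt dropped before the RMS is reported. Everything in it (grid size, term ordering, coefficients, noise level) is made up for illustration; it's not your instrument's actual pipeline.

```python
import numpy as np

# Unit-disk aperture grid (size and sampling are illustrative).
n = 256
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r, theta = np.hypot(x, y), np.arctan2(y, x)
mask = r <= 1.0

# First few Zernike terms, unnormalized (ordering conventions vary):
basis = np.stack([
    np.ones_like(r),           # piston
    r * np.cos(theta),         # tip  (x tilt)
    r * np.sin(theta),         # tilt (y tilt)
    2 * r**2 - 1,              # defocus
    r**2 * np.cos(2 * theta),  # astigmatism 0/90
    r**2 * np.sin(2 * theta),  # astigmatism 45
])

def fit_zernikes(phase_map):
    """Least-squares Zernike coefficients over the masked aperture."""
    A = basis[:, mask].T                                  # (n_pixels, n_terms)
    coeffs, *_ = np.linalg.lstsq(A, phase_map[mask], rcond=None)
    return coeffs

# Fake "measurement" in waves: weak astigmatism, deliberate tilt, sensor noise.
rng = np.random.default_rng(0)
truth = 0.05 * basis[4] + 3.0 * basis[1]
measured = truth + 0.01 * rng.standard_normal((n, n))

c = fit_zernikes(measured)
ptt = sum(ci * Zi for ci, Zi in zip(c[:3], basis[:3]))    # piston/tip/tilt fit
residual = measured - ptt
print(f"RMS wavefront error after tilt removal: {residual[mask].std():.4f} waves")
```

Note that in this idealized stage the fit is linear, so the amount of tilt doesn't matter; the tilt sensitivity I'm describing enters earlier, when the phase map is extracted from the raw fringes.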

I highly recommend that you get a copy of Malacara's book "Optical Shop Testing" - it's an indispensable reference for this kind of work.

Does that help?
 
  • #4
Ibix said:
In a manual setup, I was taught to leave in a few fringes because it's easier for humans. [...] It might be worth investigating whether your instrumentation manufacturer has a training program. Or seeing if you can buy your former mentor a beer one day?

Thanks, that all makes sense. Unfortunately the mentor has left the country for work; I have his email, but I don't want to pester him too much with stuff like this.

Thanks again for the response!
 
  • #5
Andy Resnick said:
One reason it's best to leave some residual tilt is numerical accuracy and precision: most likely, the software decomposes the interferogram into Zernike polynomials. [...]

Yes, the software does decompose the interferogram into Zernikes.

Am I, in effect, taking signal out of my data by removing the tilt and consequently making noise more prominent?

Andy Resnick said:
I highly recommend that you get a copy of Malacara's book "Optical Shop Testing" - it's an indispensable reference for this kind of work.

Yes! I have that very book on my desk and have been reading through it this week. Like you say, I have found it incredibly helpful, but I couldn't really find an answer to this particular question in my reading.

Thanks for the help!
 
  • #6
Doc said:
Am I, in effect, taking signal out of my data by removing the tilt and consequently making noise more prominent?

I think that's a reasonable way to think about the issue, definitely. Glad you find Malacara's book helpful!
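
If you want to see one mechanism behind that in code: I don't know which algorithm your software uses, but for static single-shot fringe analysis a classic approach is the Fourier-transform method (Takeda et al.), where the tilt fringes act as a spatial carrier that separates the wavefront signal from the DC term and the conjugate sideband. Here's a toy sketch, with the test wavefront, noise level, and fringe counts all invented for illustration (phase-shifting instruments work differently and don't rely on a carrier):

```python
import numpy as np

n = 512
x = np.linspace(-1, 1, n, endpoint=False)
X, Y = np.meshgrid(x, x)
rng = np.random.default_rng(0)

# "True" wavefront error in radians: astigmatism, kept well below pi
# so no phase unwrapping is needed in this toy.
phi = 2.0 * (X**2 - Y**2)

def interferogram(fringes):
    """Two-beam fringe pattern with `fringes` tilt fringes across the field."""
    carrier = np.pi * fringes * X        # `fringes` fringes over the width-2 field
    return 1 + np.cos(carrier + phi) + 0.05 * rng.standard_normal((n, n))

def recover_phase(I, fringes):
    """Fourier-transform (Takeda-style) demodulation: isolate the +1 sideband
    around the carrier frequency and read the phase off its inverse transform."""
    F = np.fft.fftshift(np.fft.fft2(I))
    f = np.fft.fftshift(np.fft.fftfreq(n, d=2 / n))   # cycles per unit length
    FX, FY = np.meshgrid(f, f)
    f0 = fringes / 2                                  # carrier spatial frequency
    window = (np.abs(FX - f0) < f0 / 2) & (np.abs(FY) < f0 / 2)
    sideband = np.fft.ifft2(np.fft.ifftshift(F * window))
    wrapped = np.angle(sideband) - np.pi * fringes * X   # take the carrier back out
    return (wrapped + np.pi) % (2 * np.pi) - np.pi       # rewrap to [-pi, pi)

for k in (1, 10):
    est = recover_phase(interferogram(k), k)
    err = (est - est.mean()) - (phi - phi.mean())
    print(f"{k:2d} tilt fringes: residual rms = {err.std():.3f} rad")
```

With one fringe of tilt across this aperture, the carrier sits right next to the DC term and the window around it can't hold the signal's bandwidth, so the recovered phase is mostly garbage; with ten fringes the sideband separates cleanly. Push the tilt far higher, though, and you eventually hit detector sampling limits - the 'too much tilt adds noise back' side of the trade-off.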
 

1. What is interferometry?

Interferometry is a technique used in optics to measure the properties of light waves. It involves splitting a light beam and recombining it to create an interference pattern, which can be used to determine characteristics such as wavelength, phase differences, and optical path differences.
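
In symbols, the pattern follows the standard two-beam interference relation (a textbook result, independent of any particular instrument):

$$I = I_1 + I_2 + 2\sqrt{I_1 I_2}\,\cos\Delta\varphi,$$

where $I_1$ and $I_2$ are the intensities of the two recombined beams and $\Delta\varphi = 2\pi\,\mathrm{OPD}/\lambda$ is the phase difference set by the optical path difference. The fringes are contours of constant $\Delta\varphi$.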

2. How is interferometry used in testing optics?

Interferometry is used in testing optics to measure the shape, surface quality, and other properties of lenses, mirrors, and other optical components. By analyzing the interference patterns created by light reflected from or transmitted through the optic under test, precise measurements can be made to ensure the components meet the required specifications.
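
As a standard rule of thumb for double-pass reflective tests (such as a Fizeau interferometer comparing a flat against a reference): the test beam crosses the gap twice, so $N$ fringes of departure from straightness correspond to a surface height error of

$$h = N\,\frac{\lambda}{2}.$$

For example, a quarter-fringe departure at the HeNe wavelength ($\lambda = 632.8\ \text{nm}$) is about 79 nm of surface error. Straight, evenly spaced tilt fringes, by contrast, indicate only alignment, which is why analysis software subtracts tilt before reporting wavefront error.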

3. What are the advantages of using interferometry in optics testing?

Interferometry offers several advantages over traditional methods of optics testing. It is non-contact, meaning there is no risk of damaging the optics, and it can provide highly accurate and precise measurements. It is also a fast and efficient method, allowing for high throughput in manufacturing processes.

4. Are there any limitations to interferometry in optics testing?

While interferometry is a powerful tool for optics testing, it does have some limitations. It requires a stable and controlled environment to produce accurate results, and it is most effective for measuring smooth, highly polished surfaces. Additionally, it may not be suitable for testing certain types of optics, such as highly curved or non-reflective surfaces.

5. What are some real-world applications of interferometry in optics?

Interferometry has a wide range of applications in optics, including astronomy, microscopy, and semiconductor manufacturing. It is also used in the production of precision optical components for devices such as cameras, telescopes, and laser systems. In addition, interferometry is used in the medical field for imaging techniques such as optical coherence tomography.
