Photographing diffraction grating interference patterns

  • Thread starter ChrisXenon
  • #1

Summary:

I need help understanding lenses in this case
If I direct a laser pointer onto a diffraction grating and place a screen beyond it, I see a diffraction pattern - a line of dots - as expected.
If I move the screen further away from the grating, the distance between the dots increases - again as expected.
If I place a camera where the screen was I can photograph the line of dots.

But changing the grating-to-camera distance does not change the distance between the dots in the image.
I cannot understand why, so I'm interested in either the answer or a good place to go and learn the necessary optics theory.

Many thanks
Chris

PS: for some reason Watching threads does not work for me. I do not get notified of replies despite selecting this option.
anyone else seeing this/knows how to fix it?
 

Answers and Replies

  • #2
Charles Link
The camera lens serves to make the far field diffraction pattern occur in the focal plane of the camera lens. Basically the distance ## x ## in the focal plane is given by ## x=f \theta ##, where ## \theta ## is the angle in the far field pattern. This property has its application in diffraction grating spectrometers, where the far field diffraction pattern from a 2" grating can be observed at a distance of a meter or less, using a spherical focusing mirror, where one would otherwise need to go a distance of 50 meters or more to observe a far field pattern from a grating of that size. You might find this Insights article that I recently authored of interest: https://www.physicsforums.com/insights/fundamentals-of-the-diffraction-grating-spectrometer/
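To make the ##x = f \theta## relationship concrete, here is a minimal sketch of the geometry. The grating equation at normal incidence, ##d \sin\theta = m \lambda##, gives the angle, and the lens maps it to ##x = f \tan\theta## in the focal plane (which reduces to ##x = f\theta## for small angles). The 500 lines/mm pitch and 4 mm focal length are illustrative assumptions, not values from this thread:

```python
import math

def spot_position(wavelength_m, lines_per_mm, focal_length_m, order=1):
    """Focal-plane position of diffraction order m.

    Grating equation (normal incidence): d*sin(theta) = m*lambda.
    The lens maps angle to position: x = f*tan(theta), which reduces
    to x = f*theta for small angles.
    """
    d = 1e-3 / lines_per_mm              # groove spacing in metres
    s = order * wavelength_m / d
    if abs(s) >= 1:
        return None                      # this order does not propagate
    theta = math.asin(s)
    return focal_length_m * math.tan(theta)

# Illustrative numbers: 650 nm pointer, 500 lines/mm grating, 4 mm lens.
x = spot_position(650e-9, 500, 4e-3)
# x depends only on f and theta -- the grating-to-camera distance never
# appears, which is why moving the camera does not change the dot spacing.
```

Note that only the focal length sets the scale of the pattern on the sensor, which matches the observation in the original question.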
 
  • #3
ChrisXenon
Charles, thank you so much. I am new to this so your article is difficult for me to understand but I am working on that. If you're inclined to give more time to me, I'll explain a little more, but if not - thanks anyway.

I am trying to improve a simple spectrometer, and my laser pointer setup was to try to understand what's going on. In my test setup there is no concave lens - the light is collimated because it comes from a laser pointer.
In the actual spectrometer there is also no mirror - the incoming light is collimated because it passes through 2 successive slits and because the distance from those to the grating is relatively large (about 40cm). The grating is at maybe 45 degrees to the instrument axis, and the camera is placed very close behind the grating with its optical axis perpendicular to the grating.

The problem is that the spectrum seen by the camera covers only 50% of its available sensor width - so it is wasting sensor resolution. I want to spread it more widely to use all of the sensor. Moving the camera away from the grating does not do that, as discussed, and - thanks to you - I think I now understand why. So how to do this? I can't change the camera lens. If I place the grating and the camera at some non-90-degree relative angle, I CAN spread the spectrum out wider. Of course this will change the relationship between the incoming wavelength and the x-axis position, but I think it does so linearly, so I can calibrate for that in the software. Any thoughts are very welcome.
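On the linearity question: with normal incidence the mapping is ##x = f \tan(\arcsin(m \lambda / d))##, which is not quite linear across the visible band. A quick sketch (the 500 lines/mm pitch and 4 mm focal length are assumptions for illustration, not the actual instrument's values):

```python
import math

def x_of_wavelength(wavelength_m, lines_per_mm=500, focal_length_m=4e-3, order=1):
    """Focal-plane position from d*sin(theta) = m*lambda and x = f*tan(theta)."""
    d = 1e-3 / lines_per_mm              # groove spacing in metres
    theta = math.asin(order * wavelength_m / d)
    return focal_length_m * math.tan(theta)

# Equal 150 nm wavelength steps do not give equal position steps:
x400, x550, x700 = (x_of_wavelength(w * 1e-9) for w in (400, 550, 700))
# (x700 - x550) comes out larger than (x550 - x400), so a linear
# calibration is only an approximation; a per-wavelength lookup is safer.
```

With the grating tilted (e.g. at 45°), the relevant equation becomes ##d(\sin\theta_i + \sin\theta_m) = m\lambda##, which shifts and distorts this mapping further, so calibrating in software against known spectral lines is the robust approach.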

Thanks, Chris
 
  • #4
sophiecentaur
ChrisXenon said: "But changing the grating-to-camera distance does not change the distance between the dots in the image."
Are you using the camera lens or are you working 'prime focus' (directly on the image sensor)?
I recently had a moan on another thread about the need for more specific information about experimental setups. Where is the camera focused? Is it on autofocus or what?
If you provided a simple diagram of your actual setup then we would be all singing from the same hymn sheet.
Everyone has access to Powerpoint-type apps and they will all allow you to produce neat, legible diagrams which can be exported as jpegs. Perfect for PF consumption.
 
  • #5
vanhees71
Don't export diagrams as jpegs, because then Gibbs's phenomenon blurs it out; png is way better!
 
  • #6
sophiecentaur
vanhees71 said: "Don't export diagrams as jpegs, because then Gibbs's phenomenon blurs it out; png is way better!"
You are right. I just suggested jpeg because 'everyone' knows and recognises it. pdf is also very good and scales the text to any size.
 
  • #7
ChrisXenon
sophiecentaur said: "Are you using the camera lens or are you working 'prime focus' (directly on the image sensor)? [...] Where is the camera focused? Is it on autofocus or what?"
Hello Sophie and thanks for your reply.

I've been in the other position - someone is looking for help and is apparently unwilling to invest in a basic Google search or a forum search, or even in writing full sentences to explain a problem they want strangers to solve for them. That's annoying.
In this case, my lack of rigour is born of ignorance not idleness. The idea that the camera might not have a lens or that there would be more than one choice of what to focus on did not occur to me because of my ignorance of the possibilities.

So, there IS a lens, but I didn't say so because I didn't see an alternative. The reason I'm here asking for help is, in a way, the same reason I'm unable or unlikely to give a complete description of my situation - my own ignorance of the topic in general.

Similarly, you ask where the camera is focused. The true but embarrassing answer is "I dunno - I just turn the lens barrel until I get a sharp image of the spectrum". So it's manual focus, and now, since you asked, I have checked focus by removing the camera from the instrument, monitoring what it can see, and finding the distance of sharpest focus. It is a webcam, so its depth of field is large and focus doesn't seem really sharp at any distance, but I'd say it's focused at infinity. With hindsight, I can see that you needed to know this, but it was not omitted through idleness.

Why no diagram? Well, I thought the description was OK but I am happy to provide one below.

I really appreciate how you trod a careful line between giving me useful feedback on how to do better without being unkind - so thanks for that.
 

Attachments

  • #8
sophiecentaur
Thanks for the picture. It would perhaps have been better to post just a line diagram. It isn't at all clear what's what in that mass of stuff which includes the camera. Is that a marked-up photo of your actual setup?
If things are as I think you describe then, with autofocus (or at ∞), the camera is not actually looking at the fringes that you see on a screen. It's doing what your eyes do when they look through a dirty window at a house across the street; the stuff on the glass is out of focus and it will see a sort of virtual image of the collimating slot. That could be why altering the position of the camera is making no difference.
If you can remove the lens then you should get an image on the sensor, direct. I only have experience of a CMOS 'webcam style' camera for astrophotography, which is mostly operated without the lens, and you can unscrew it easily. However, the sensor will be tiny (probably around 10mm'ish) and you may only get a small part of the spectrum on it. You say you want good resolution, so it may require a very narrow slot in the collimator. (The beam from your red pointer is probably actually quite wide, for good pointing to an audience.) Experiment with different slot widths. Two razor blades or equivalent, held in a cardboard frame, perhaps, would allow for easy adjustment. The secret to solving this sort of problem is to experiment, experiment, experiment - and remember / write down results, or you could forget which version of the setup actually worked best.
It is probably not necessary for me to mention the need to cover it all with a black sheet - to exclude extraneous light - sorry if it's telling my grandmother how to suck eggs.
 
  • #9
vanhees71
sophiecentaur said: "You are right. I just suggested jpeg because 'everyone' knows and recognises it. pdf is also very good and scales the text to any size."
Well, pdf is the best in this case, because you get vector graphics (though I have my doubts with M$ products, which had very strange behaviour with vector-graphics formats, at least some decades ago when I still used them).
 
  • #10
Charles Link
The mirror with the focal point at the entrance slit to collimate the light is necessary. Otherwise, I don't see how you would get the full grating illuminated by the incident light, unless the slits are so narrow that they create a wide diffraction pattern. You will get very poor resolution if you are only using a small portion of the grating.
 
Last edited:
  • #11
ChrisXenon
sophiecentaur said: "With autofocus (or at ∞), the camera is not actually looking at the fringes that you see on a screen. [...] If you can remove the lens then you should get an image on the sensor, direct. [...]"
Thanks again Sophie. I've uploaded a diagram. The blue part is the lens barrel and the brown part is the PCB on which the CCD chip and ancillary electronics are mounted. The camera lens is directly behind the grating - maybe 3mm from it. I don't understand your window analogy - the camera is not pointing axially, it is looking at the black interior of the tube. Without the grating it would see a black internal wall.

You drew my attention to CCD versus CMOS. I've been using "CCD" but (gulp) using the term informally to mean "sensor". I've actually no idea if it's CCD or CMOS. It's a cheap webcam and the spec just refers to the "sensor", not specifying the technology.

Removing the lens and imaging on the sensor directly is a fascinating prospect. This sensor is only 3mm across and so, as you say, I don't think I could get a significant part of the spectrum on it. I've just tried to form an image by holding it manually in front of the grating but can't form any kind of image. I may try again with a more controlled support but, as well as the "won't fit" thing, this leaves the sensor open to the air, which might be unwise.

My understanding of the way a grating works is that the incident beam width doesn't matter - the more slits in use, the better the contrast and resolution because of more destructive interference - but from what you're saying that's wrong. Maybe it's only true for a monochromatic source. This spectrometer has a 1mm slit and is currently giving about 1nm resolution; I'm just trying to make it better.

Don't worry about being condescending - I recognise I'm the underling here and am happy to learn.

My experiments are grim because I'm propping things up with Blu-Tack and weights, and things are moving and drooping out of alignment - I need an optical bench. But it IS crystal clear now that the camera/grating distance does not affect the spectrum image size, which is why I posted. Only tilting the grating with respect to the camera stretches the spectrum. Since that's the only parameter I can see that works, I think I will make a custom frame with a large angle and see if it will calibrate correctly or if a software change would be needed. Unless you have other ideas?

Thanks again.
 

Attachments

  • #12
ChrisXenon
Charles Link said: "The mirror with the focal point at the entrance slit to collimate the light is necessary. [...] You will get very poor resolution if you are only using a small portion of the grating."
Thanks Charles. I don't have a mirror and the spectrometer does work. It's giving me spectra between 400 and 1000nm to about 1nm accuracy. But it's wasting the available sensor resolution, which is why I wanted to improve it. It's become increasingly clear that I don't really understand how it works, beyond "light is collimated by slits & distance, then hits grating, splits up, and is captured by webcam", and that lack of understanding is why I can't move forward. I also don't see how the full grating is illuminated, but then I also don't see why it has to be.
 
  • #13
BvU
ChrisXenon said: "I also don't see why it has to be."
The more lines used, the sharper the spectral lines will come out. See hyperphysics (scroll down!)

ChrisXenon said: "But it's wasting the available sensor resolution [...]"
with the ##x=f \theta## from @Charles Link in #2 you can see that a telephoto lens gets you a bigger spread on the camera sensor
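The hyperphysics point about line count can be put in numbers: the chromatic resolving power of a grating is ##\frac{\lambda}{\Delta\lambda} = mN##, where ##N## is the number of illuminated grooves. A quick sketch with assumed figures (a ~1 mm wide beam on a 500 lines/mm grating; illustrative values, not measured ones from this thread):

```python
def resolving_power(order, illuminated_lines):
    """Chromatic resolving power of a diffraction grating: lambda/d_lambda = m*N."""
    return order * illuminated_lines

# A ~1 mm beam on a 500 lines/mm grating illuminates about N = 500 grooves.
R = resolving_power(order=1, illuminated_lines=500)
d_lambda_nm = 550 / R        # smallest resolvable step at 550 nm, in nm
# This comes out near 1 nm -- the same order of magnitude as the resolution
# reported earlier, consistent with only part of the grating being used.
```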
 
Last edited:
  • #14
ChrisXenon
BvU said: "The more lines used, the sharper the spectral lines will come out. [...] with the ##x=f \theta## from @Charles Link in #2 you can see that a telephoto lens gets you a bigger spread on the camera sensor."
Thanks BvU. My spectrometer works reasonably well for me even with a small amount of the grating covered, but this is one of several things I now understand much better from this forum, giving me the opportunity to make a better spectrometer.

I can't change the lens in my webcam, so this won't be an option for me, which is why I'm looking at increasing the grating/camera angle.

Thanks again.
 
  • #15
Charles Link
Tilting the lens/camera axis will only lead to distortion of the image. You might fill more of the focal plane array, but it is almost guaranteed that this will not improve the resolution. One thing that might help is to view the spectrum at order ##m=2 ##.
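The gain from viewing order ##m = 2## can be seen from the angular dispersion of a grating, ##\frac{d\theta}{d\lambda} = \frac{m}{d \cos\theta}##. A minimal sketch (the 500 lines/mm pitch is an assumed value for illustration):

```python
import math

def angular_dispersion(wavelength_m, lines_per_mm, order):
    """d(theta)/d(lambda) in rad/m, from d*sin(theta) = m*lambda."""
    d = 1e-3 / lines_per_mm          # groove spacing in metres
    s = order * wavelength_m / d
    if abs(s) >= 1:
        return None                  # this order does not propagate
    theta = math.asin(s)
    return order / (d * math.cos(theta))

d1 = angular_dispersion(550e-9, 500, order=1)
d2 = angular_dispersion(550e-9, 500, order=2)
# d2 is more than twice d1: second order spreads the spectrum wider
# (at the cost of lower intensity and possible order overlap).
```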
 
  • #16
ChrisXenon
OK thanks again Charles.
 
  • #17
sophiecentaur
@ChrisXenon I did a google search for Webcam spectrograph and there are a lot of hits. This one is an example I found. The successful ones use a collimating lens so you may need to buy one.
 
  • #18
davenn
vanhees71 said: "Don't export diagrams as jpegs, because then Gibbs's phenomenon blurs it out; png is way better!"
except PNGs are bloated in size
 
