How to photograph a diffraction pattern?

In summary: the central spot can be very bright relative to the rest of the pattern when single-slit diffraction is involved, and the sensor may simply have been saturated.
  • #1
jinawee
I want to take a picture of a diffraction pattern directly. If I project it on a wall I see a clear pattern, but when trying to get the pattern on the sensor, I only record a bright green spot. I don't know what the problem is.

I'm using a ~50mm focal length lens focused at infinity. Is this the right setup? Should the photo be taken without the lens?
 
  • #2
Can you change the exposure time or make an HDR image? The central spot can be very bright relative to the others if single-slit interference is relevant.
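For reference, the central-to-side-lobe ratio of an ideal single slit can be checked numerically. This is a minimal sketch in the Fraunhofer limit (an assumption; it is not necessarily the exact Fresnel geometry discussed here):

```python
import numpy as np

# Single-slit Fraunhofer intensity I(beta) = (sin(beta)/beta)^2,
# with beta = pi * a * sin(theta) / lambda.
beta = np.linspace(1e-6, 4 * np.pi, 200000)
intensity = (np.sin(beta) / beta) ** 2

# The first side lobe lies between the first two minima at beta = pi and 2*pi.
mask = (beta > np.pi) & (beta < 2 * np.pi)
first_lobe = intensity[mask].max()

print(f"first side lobe / central peak = {first_lobe:.4f}")  # ~0.047
```

So even the brightest side lobe carries under 5% of the central peak intensity, which is why the centre saturates long before the rest registers.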
 
  • #3
The sensor might already have been saturated. A huge difference between the zeroth diffraction order and the others is normal; that's why diffraction pattern images are sometimes displayed on a log scale.

Reducing the exposure time or placing a filter in front of the lens might help. Remember that if you use a filter, its thickness might slightly change the overall size of the diffraction pattern.
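To illustrate why a log scale helps, here is a toy comparison using the ideal single-slit lobe ratios; the 3-decade display floor is an arbitrary assumption:

```python
import numpy as np

# Toy illustration: ideal single-slit lobe intensities
# (central peak = 1, first side lobe ~0.047, second ~0.017).
intensity = np.array([1.0, 0.047, 0.017])

# Linear 8-bit mapping: the faint lobes get only a few grey levels.
linear_levels = np.round(255 * intensity).astype(int)   # [255, 12, 4]

# Log mapping over 3 decades spreads the lobes across the mid-grey range.
floor = 1e-3  # display floor (assumed); anything below clips to black
log_norm = (np.log10(np.maximum(intensity, floor)) - np.log10(floor)) / -np.log10(floor)
log_levels = np.round(255 * log_norm).astype(int)       # [255, 142, 105]

print(linear_levels, log_levels)
```

On a linear scale the side lobes occupy a handful of grey levels; on the log scale they become clearly visible.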
 
  • #4
Hi jinawee, I've done this at home myself.
I'd recommend you experiment with
  • different exposure times
  • different camera settings (i.e. if you can change exposure compensation, white balance, ISO)
  • project the pattern on a white wall or white paper/screen
  • turn off all unnecessary room lighting and/or do it in the evening/at night, i.e. exclude all superfluous light
 
  • #5
Thanks, I'll give it another try with a longer exposure.
 
  • #6
jinawee said:
I'm using a ~50mm focal length lens focused at infinity. Is this the right setup? Should the photo be taken without the lens?

How are you producing the diffraction pattern? Are you using a prism or a diffraction grating? Any focusing/collimating lenses in your setup? (Other than your imaging camera)
 
  • #7
Drakkith said:
How are you producing the diffraction pattern? Are you using a prism or a diffraction grating? Any focusing/collimating lenses in your setup? (Other than your imaging camera)

I'm studying Fresnel diffraction, so:

Laser pointer -> ~-10mm diverging lens -> Slit -> 2m -> Camera
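As a sanity check on the regime, the Fresnel number N = a²/(λL) can be estimated. The wavelength and slit width below are assumptions; only the 2 m distance comes from the setup above:

```python
# Regime check via the Fresnel number N = a^2 / (lambda * L).
# N >> 1 suggests the near field (Fresnel); N <~ 1 the far field (Fraunhofer).
wavelength = 532e-9   # m, assumed green laser pointer
half_width = 0.25e-3  # m, half of an assumed 0.5 mm slit
distance = 2.0        # m, slit-to-camera distance from the setup

fresnel_number = half_width**2 / (wavelength * distance)
print(f"N = {fresnel_number:.3f}")
```

Note that with a diverging beam the effective propagation distance differs from the geometric one, so this is only an order-of-magnitude check.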
 
  • #8
Is there any chance you could upload a picture of the diffraction pattern on the wall and give us an indication of its size?
 
  • #9
The image I see on the screen is something like this:

[image: diffraction pattern as seen on the screen]


The central line is about 1 mm thick. The diffraction pattern covered the whole lens, so size shouldn't be the problem.

I tried longer exposure times and both open and stopped-down diaphragms, but got nothing. For example,

[image: photo recorded by the camera]


those points might come from the sensor, but it's not what I was looking for. It seems that the bright spot from the laser has some structure:

[images: close-ups of the bright spot from the laser]


Would this be part of the diffraction pattern from the slit?

I was advised to focus the camera at infinity, but since this is a divergent beam, shouldn't I change the focus distance? Anyway, I don't think that's the problem, because I tried several distances.
 

  • #10
Do you have a sketch of the setup?
I was assuming you were photographing a screen, but the last post doesn't sound like that.
 
  • #11
It seems to me that the problem with photographing the diffraction pattern could be the ratio of intensities you are trying to record. You probably have, effectively, only 8-bit resolution, i.e. 256 levels, and unless your optics are very good, lens flare could also be limiting things. Why are you not using a projection on a screen? It could be a good idea to 'mask' the centre spot by using a small hole/slot in a screen to let the peak of the pattern through. You could also double or triple the throw of your projection; the loss of light can easily be dealt with by a longer exposure.
There could also be a problem with the JPEG processing of the image. Could you set the camera to TIFF or RAW format?
 
  • #12
Perhaps HDR might be worth investigating. That is, combining multiple images taken with different exposure.
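A minimal exposure-fusion sketch, assuming a linear sensor response and raw (not gamma-encoded) pixel values; `merge_exposures` and the toy numbers are illustrative, not an established API:

```python
import numpy as np

def merge_exposures(frames, times, sat=0.95):
    """Combine exposures: scale each frame by its exposure time and
    average, keeping only unsaturated pixels (assumed linear sensor)."""
    frames = np.asarray(frames, dtype=float)
    times = np.asarray(times, dtype=float)
    weights = (frames < sat).astype(float)   # ignore clipped pixels
    radiance = frames / times[:, None]       # scale to common units
    return (weights * radiance).sum(0) / np.maximum(weights.sum(0), 1e-12)

# Toy example: the bright pixel clips in the long exposure but is still
# recorded in the short one; the faint pixel needs the long exposure.
short = np.array([0.5, 0.001])   # 1 ms exposure
long_ = np.array([1.0, 0.1])     # 100 ms exposure (first pixel saturated)
print(merge_exposures([short, long_], [0.001, 0.1]))
```

The merged result recovers both the bright and the faint pixel on a common radiance scale, which is the whole point of HDR here.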
 
  • #13
I don't think you want to be in focus. It seems to me that being in focus just means you're imaging the aperture of the slit. The particular diffraction pattern produced on the wall is a result of the light diverging. If you haven't already, you could try going waaaay out of focus and slowly coming back in and observing what happens to the pattern.
 
  • #14
Lord Crc said:
Perhaps HDR might be worth investigating. That is, combining multiple images taken with different exposure.
If it were only a matter of ratio of light levels then that would be a good idea. I suspect HDR could just reveal artefacts of the optics, though.
 
  • #15
Drakkith said:
I don't think you want to be in focus. It seems to me that being in focus just means you're imaging the aperture of the slit. The particular diffraction pattern produced on the wall is a result of the light diverging. If you haven't already, you could try going waaaay out of focus and slowly coming back in and observing what happens to the pattern.
Yes. You really need to be 'focussing' on a real image that's at the position of a notional projection screen. I can't get my head around the consequences of that and, personally, I'd be inclined to put an actual screen in that place! You should get a wider part of the diffraction pattern that way, I think.
 
  • #16
jinawee said:
The image I see on the screen is something like this:
<snip>.

If you put your camera at the screen (no lens on the camera), the CCD should directly record the diffraction pattern.

Alternatively, try putting your camera (with lens attached, focused to infinity) as close to the aperture as you can.
 
  • #17
Andy Resnick said:
If you put your camera at the screen (no lens on the camera), the CCD should directly record the diffraction pattern.

Alternatively, try putting your camera (with lens attached, focused to infinity) as close to the aperture as you can.
I would think that using the camera sensor directly could limit the size of the image?
 
  • #18
First of all, I'm trying to record the image directly because my professor said that was the correct way to measure the intensity. I should compare the measured intensity with the predictions.

Now, HDR doesn't seem relevant, because it consists of fusing the highlights and the shadows into a single image, and I can't even record the shadows to begin with.

I'm confused why it's so easy to get an image of the pattern projected on a screen, but not directly. Why do I observe a bright point from the laser through the camera, but none on the screen?

It seems that placing the screen modifies the relative intensity, but why? Shouldn't it preserve the dynamic range?

I think that the lens itself is part of the problem, since there might be internal reflections. Random phase shifts might be important too. I haven't tried shooting without the lens, to avoid dust getting onto the sensor; I could try with just a piece of glass covering it. Or I could use a webcam without its lens, although that sensor would be really small and I would need some mechanism to move it along a line.

My professor has commented that we could drop the laser and use a home-made spectrometer as a light source. I don't think this is a good idea because it would have quite low intensity and it's hard to calibrate.
 
  • #19
sophiecentaur said:
I would think that using the camera sensor directly could limit the size of the image?

Yes, but since the diffraction pattern is a function of angle, just move the sensor closer to 'see' more of it. To be sure, there are limits (pixel size, spurious patterns from the Bayer filter, etc.), but it's quick and easy.
 
  • #20
jinawee said:
<snip>
I'm confused why it's so easy to get an image of the pattern projected on a screen, but not directly. Why do I observe a bright point from the laser through the camera, but none on the screen?
<snip>

Because, as far as your camera (or eyeball) is concerned, the screen is an object that emits light. Another way to see this is to have your camera (with lens) facing the laser/aperture, about 10 feet away. Place a thin sheet of paper somewhere in between the laser and camera, then focus on the paper- you'll see the diffraction pattern (or whatever). Then, without re-focusing, remove the paper. What do you see now?
 
  • #21
Andy Resnick said:
Because, as far as your camera (or eyeball) is concerned, the screen is an object that emits light. Another way to see this is to have your camera (with lens) facing the laser/aperture, about 10 feet away. Place a thin sheet of paper somewhere in between the laser and camera, then focus on the paper- you'll see the diffraction pattern (or whatever). Then, without re-focusing, remove the paper. What do you see now?
Was that a real question or one for which you know the answer?
I can't help feeling that there could be parts of the diffraction image formed on the paper, well off axis, for which no light could enter the camera lens directly. Doesn't it require some scattering on the paper for that to happen? (I.e. only light from the diffracting aperture that is within the angle subtended by the camera lens aperture will appear as part of the diffraction pattern.) Is your post implying that I am (or my intuition is) wrong?
 
  • #22
sophiecentaur said:
Was that a real question or one for which you know the answer?

I do know the answer- or at least the result. Try it and see.
 
  • #23
Andy Resnick said:
I do know the answer- or at least the result. Try it and see.
It's all right for guys like you but I don't have a laser and an optical bench in my kitchen. :wink:
But the 'overall' diffraction pattern of the system must be modified somewhat if the camera aperture truncates the pattern that exists out in the space between.
 
  • #24
The lens cannot project the diffraction pattern onto the camera sensor, because the pattern is not an aerial image. The lens will only project the appearance of the light source (pinpoint laser). You have two options:
1. Remove the lens from the camera and project the pattern directly onto the image sensor. Adjust the camera-to-slit distance to scale the size of the pattern to fit.
2. Project the pattern onto a diffusing white screen, then photograph it from there. Option 1 should give the better result.
You will still find variations in the light intensity due to phase variations in the laser source. This can be improved by adding a spatial filter directly in front of the laser: a short-focal-length lens, such as a microscope objective, followed by a small pinhole. The distance from lens to pinhole needs to be set very accurately for good transmission efficiency.
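A rough sizing sketch for such a spatial filter, using the common rule of thumb that the pinhole should be about 1.5× the focused Gaussian spot diameter d = 4λf/(πD); every number below is an assumption:

```python
import math

wavelength = 532e-9  # m, assumed green laser
focal = 8e-3         # m, assumed microscope-objective focal length
beam_d = 1.5e-3      # m, assumed collimated beam diameter at the lens

# Focused Gaussian spot diameter, then the ~1.5x rule of thumb.
spot = 4 * wavelength * focal / (math.pi * beam_d)
pinhole = 1.5 * spot
print(f"spot ~{spot*1e6:.1f} um, pinhole ~{pinhole*1e6:.0f} um")
```

With these assumed numbers the pinhole comes out around 5 µm, which shows why the lens-to-pinhole alignment is so critical.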
 
  • #25
I would think that defocusing would project the pattern onto the sensor. The diffraction pattern is, after all, a specific case of unfocused light falling on a surface.
 
  • #26
It shouldn't be necessary to use a camera to find out the result. All you should need to do would be to use your eye directly. But the pupil of the eye is only a couple of mm wide so you would not see anything but a small portion of any pattern. It is easy enough to see some diffraction patterns directly, though. LEDs on cars can be seen through net curtains and other regular woven fabrics and they appear as fringes. However, the diffracting structure is very wide so the eye must be seeing different bits of the pattern, caused by different bits of the fabric weave. You are not looking at just a single hole / pair of slits etc. in that case.
 
  • #27
Drakkith said:
I would think that defocusing would project the pattern onto the sensor. The diffraction pattern is, after all, a specific case of unfocused light falling on a surface.
I just read that again and it made me think. The diffraction pattern that falls on a screen is due to the phase relationships between the various contributions of waves from different parts of the originating 'structure'. The screen reveals the phase relationships by scattering the light and taking away any coherence that existed in that plane. Those phase relationships are entirely different for the (still coherent) light traveling further and arriving at the lens of a camera that would have been focussed on the screen when the screen has been removed - so you will get a totally different diffraction pattern. Adding a lens will again alter the phase relationships but in a coherent way, so you would expect some sort of diffraction pattern - but you won't be actually 'focussing' any image. However, as I have mentioned before, it is only in the unobstructed, direct path through the lens that you will get some portion of the screen pattern. The lens will vignette the pattern and reduce how much of it gets to the sensor, compared with the hole left when you take the lens away.
I think I conclude that the camera should be put quite close to the diffracting structure for a useful result. This will result in a small 'throw' and a consequentially small pattern. But a high res sensor will allow you to magnify that image and get a reasonable result.
 
  • #28
Drakkith said:
I would think that defocusing would project the pattern onto the sensor. The diffraction pattern is, after all, a specific case of unfocused light falling on a surface.

That works *if* the entrance pupil of the camera lens is located at (or really close to) the diffracting aperture. Then, by focusing the lens to infinity, the far-field diffraction pattern of the aperture will be projected onto the sensor (because the lens is 1 focal length away from the sensor).
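Under that geometry the sensor-plane fringe spacing can be estimated as λf/a. A quick sketch with an assumed wavelength and slit width (only the ~50 mm focal length comes from the thread):

```python
# With the lens focused at infinity, angle theta maps to sensor position
# x = f * tan(theta) ~ f * theta. Single-slit minima sit at
# sin(theta) = m * lambda / a, so adjacent dark fringes land
# ~ lambda * f / a apart on the sensor.
wavelength = 532e-9  # m, assumed green laser
focal = 50e-3        # m, the ~50 mm lens from the thread
slit = 0.2e-3        # m, assumed slit width

spacing = wavelength * focal / slit
print(f"fringe spacing on sensor ~{spacing*1e3:.3f} mm")
```

With these assumed numbers the fringes are ~0.13 mm apart, i.e. many pixels wide on a typical sensor, so the pattern should resolve easily.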
 
  • #29
Andy Resnick said:
That works *if* the entrance pupil of the camera lens is located at (or really close to) the diffracting aperture. Then, by focusing the lens to infinity, the far-field diffraction pattern of the aperture will be projected onto the sensor (because the lens is 1 focal length away from the sensor).
That makes sense. Thanks.
 

1. How do I set up my equipment for photographing a diffraction pattern?

To photograph a diffraction pattern, you will need a laser or light source, a diffraction grating, and a camera with manual settings. Set up the light source to shine through the diffraction grating onto a flat surface, and place the camera on a tripod facing the diffraction pattern.

2. What camera settings should I use for capturing a diffraction pattern?

It is best to use manual settings on your camera when photographing a diffraction pattern. Set a low ISO (100-200), a small aperture (f/8 or a higher f-number), and a slow shutter speed (1/10 sec or slower). Adjust these settings as needed for optimal results.

3. How do I know if I have captured a good diffraction pattern?

A good diffraction pattern will have clear, distinct lines that are evenly spaced and symmetric. The colors should also be vibrant and well-defined. If the lines are blurry or uneven, try adjusting your camera settings or the position of the light source.

4. Can I use a smartphone or point-and-shoot camera to photograph a diffraction pattern?

Yes, you can use a smartphone or point-and-shoot camera to capture a diffraction pattern. However, these cameras may not have manual settings, which can make it more challenging to capture a clear image. It is recommended to use a DSLR or mirrorless camera for the best results.

5. Are there any safety precautions I should take when photographing a diffraction pattern?

Yes, it is important to handle lasers and diffraction gratings with caution. Do not look directly into the laser beam, and avoid shining it into anyone's eyes. Wear appropriate protective eyewear if necessary. Also, be aware of any potential fire hazards when using a laser as a light source.
