# Testing theory of diffraction

## Homework Statement

A diffraction grating with 500 lines per mm is held directly in front of the lens of a digital camera, with a bright white light source 8 m away. The image recorded by the camera shows both the light source and the spectrum created by the light source. We know that the CMOS array of the camera has a pitch of 0.0036 mm per pixel, and the separation between the light source in the image and the middle of the "blue" region in the spectrum is about 730 pixels, which translates to a distance of 2.63 mm. Based on this information, find the angle to the first-order maximum of the blue light. Does this angle agree with the theory?

## Homework Equations

d*sin(θ) = m*λ (with m = 1 for the first-order maximum)
d = (1/500) mm = 2000 nm
λ = 475 nm (for blue light)

## The Attempt at a Solution

Using
d*sin(θ) = λ
with d = 2000 nm and λ = 475 nm gives a prediction of about 13.7 degrees for θ.
I'm not sure what the relevant distances are for extracting an angle from the picture. I've tried many permutations, and the closest I can get to 13 degrees is only about 3 degrees. Does anyone have an idea how I can test the equation using the picture?
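As a quick numerical check of that prediction (a sketch in Python; the 475 nm figure for "blue" is the value assumed in the problem):

```python
import math

# Grating: 500 lines per mm -> line spacing d
d = 1e-3 / 500          # 2.0e-6 m = 2000 nm
wavelength = 475e-9     # "blue" light, m

# First-order maximum: d*sin(theta) = lambda
theta = math.degrees(math.asin(wavelength / d))
print(theta)            # about 13.7 degrees
```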

Here is the picture, by the way.

#### Attachments

• tmp2.bmp
After thinking about this some more, I may have a lead on how to solve it. Some information I left out that is probably important: the camera lens/objective has a focal length of 6.5 mm. Using the thin-lens equation with the object distance taken as infinity, the image distance is si = 6.5 mm. The magnification is M = -(si/so), so li = (si/so)*lo, where so is the distance from the lens to the object, si is the distance from the lens to the image, li is a length in the image plane, and lo is the corresponding length in the object plane. With so = 8 m, si = 6.5 mm, and li = 2.63 mm, I find lo = 3.24 m. Then, using the arc length s = r*θ with s = 3.24 m and r = 8 m, I get θ ≈ 23 degrees. That's still a ways off, but I didn't measure the distance from the lens to the light source very carefully.
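That back-projection estimate can be sketched numerically (a rough check, assuming si ≈ f for the distant source, as in the attempt above):

```python
import math

f  = 6.5e-3    # focal length of the camera lens, m
so = 8.0       # object distance (lens to light source), m
li = 2.63e-3   # source-to-blue separation on the sensor, m

si = f                        # image distance ~ f since so >> f
lo = (so / si) * li           # separation projected back to the object plane
theta = math.degrees(lo / so) # arc length s = r*theta with s = lo, r = so
print(lo, theta)              # about 3.24 m and 23.2 degrees
```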
