# Homework Help: Double slit interference

1. Nov 14, 2005

### siifuthun

A vertical screen has two narrow slits separated by distance d. A
second screen, parallel to the first, is a distance L away (L>>d) and
displays the first minimum of the two slit interference pattern a height
h above the horizontal line drawn from the center of the slits to the
second screen. What is the smallest angle through which the second
screen must be tilted to make that minimum become a maximum?

I think what they're asking is: through what angle do we tilt the screen so that a maximum occurs at height h? In that case, if we slant the screen toward the slits, would the angle the screen is tilted through be equal to the angle of the ray from the slits to that point? If so, do we just solve d*sin(theta) = n*lambda?

Or would I need to make the path difference between the two rays equal to one full wavelength, i.e. add another half wavelength by slanting the screen?
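That second idea can at least be checked numerically. The first minimum sits where the path difference is lambda/2 (d*sin(theta) = lambda/2), and the nearest off-center maximum where it is a full lambda (d*sin(theta) = lambda). A quick sketch of the two angles, using made-up values for the wavelength and slit spacing since the problem gives none:

```python
import math

# Assumed illustrative values (the problem statement gives no numbers):
lam = 500e-9   # wavelength: 500 nm (visible light)
d = 0.1e-3     # slit separation: 0.1 mm

# First minimum: path difference lambda/2  ->  d*sin(theta) = lambda/2
theta_min = math.asin(lam / (2 * d))

# First off-center maximum: path difference lambda  ->  d*sin(theta) = lambda
theta_max = math.asin(lam / d)

print(f"first minimum at {math.degrees(theta_min):.4f} deg")
print(f"first maximum at {math.degrees(theta_max):.4f} deg")
print(f"angular gap:     {math.degrees(theta_max - theta_min):.4f} deg")
```

This only gives the angular positions of the fringes as seen from the slits; whether tilting the screen can actually move the point at height h from one angle to the other is the geometric question the thread is debating.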

2. Nov 15, 2005

### Andrew Mason

I don't see how tilting the screen would put a maximum at the position of the first minimum. To reach the first maximum, that point on the screen would have to sit at a greater height above the horizontal line from the centre of the slits to the screen, and tilting the screen only decreases that height.

AM

3. Nov 15, 2005

### siifuthun

Hmm, I'm not sure. I was a bit confused about what the professor was asking; he made up this problem and didn't check it for errors before posting it as practice for the midterm. We're going over it in discussion today, so maybe that will help.