# Homework Help: Double-Slit Interference Problem

Apr 2, 2012

### KendrickLamar

1. The problem statement, all variables and given/known data

A physics instructor wants to produce a double-slit interference pattern large enough for her class to see. For the size of the room, she decides that the distance between successive bright fringes on the screen should be at least 2.81 cm. If the slits have a separation d = 0.0410 mm, what is the minimum distance from the slits to the screen when 590 nm light is used?

2. Relevant equations
L = x·d / (m·λ)

3. The attempt at a solution
L = x·d / (m·λ)

Working in cm, with d = 0.0410 mm = 0.0410 × 10⁻¹ cm and λ = 590 nm = 590 × 10⁻⁷ cm:

L = [(2.81 cm)(0.0410 × 10⁻¹ cm)] / [(1)(590 × 10⁻⁷ cm)] = 195.27 cm
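As a sanity check on the arithmetic, here is the same calculation done entirely in SI units (a quick sketch, not part of the original post; variable names are my own):

```python
# Double-slit fringe spacing: x = m * lam * L / d, solved for L.
# All lengths converted to metres before dividing.
x = 2.81e-2      # fringe spacing: 2.81 cm
d = 0.0410e-3    # slit separation: 0.0410 mm
lam = 590e-9     # wavelength: 590 nm
m = 1            # adjacent bright fringes differ by one order

L = x * d / (m * lam)   # screen distance in metres
print(L)                # ≈ 1.95 m, i.e. about 195 cm
```

This agrees with the 195.27 cm figure above, so the arithmetic itself checks out; a likely culprit is the unit the quiz expects the answer in (cm vs m).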

I don't understand why this is marked wrong. Our online quiz accepts answers within 3% of the correct value. Some of my friends used the exact same formula, plugged in my numbers as well, and got the same answer as me, yet their quizzes were marked correct. So I suspect they just happened to land within the 3% tolerance, and that there is some slight error in my work that is making my answer come out incorrect.

Can anyone help me, please? It's urgent.