1. The problem statement, all variables and given/known data

A laser emitting light with a wavelength of 560 nm is directed at a single slit, producing a diffraction pattern on a screen that is 3.0 m away. The central maximum is 5.0 cm wide.

a) Determine the width of the slit and the distance between adjacent maxima.

b) What would the effect on this pattern be if
i) the width of the slit were smaller?
ii) the screen were moved further away?
iii) a longer wavelength of light were used?

c) How would this pattern differ if the light were shone through a
i) double slit?
ii) diffraction grating?

2. Relevant equations

λ = WΔy / L

λ = 560 nm = 5.60 x 10^-7 m
L = 3.0 m
Δy = 5.0 cm = 0.05 m
W = ?

3. The attempt at a solution

a) 5.60 x 10^-7 m = W(0.05 m) / 3.0 m
W = (5.60 x 10^-7 m)(3.0 m) / 0.05 m = 3.36 x 10^-5 m

I'm thinking that the distance between adjacent maxima and the width of the slit are the same in this case.

b) i) If the width of the slit is smaller, then the angle with the horizontal must increase.
ii) If the screen is moved further away, then the width of the slit is decreased.
iii) If a longer wavelength is used, then the distance between adjacent minima and maxima increases (not sure about this one).

c) i) With a double slit the interference pattern is much clearer.
ii) With a diffraction grating destructive interference might occur.

I need some hints about b) and c), please help.
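As a quick sanity check on the arithmetic in part a), here is a minimal Python sketch of W = λL/Δy. It also prints the value you'd get if Δy in that formula were read as the spacing between adjacent minima (which, for a single slit, is half the full central-maximum width) rather than the full 5.0 cm — which reading is intended is worth confirming, so both are shown as assumptions, not as the answer.

```python
# Sanity check for part a): rearranging lambda = W * dy / L to W = lambda * L / dy

wavelength = 5.60e-7   # m (560 nm)
L = 3.0                # m, slit-to-screen distance
central_width = 0.05   # m, full width of the central maximum (5.0 cm)

# Reading 1: delta_y taken as the full central-maximum width (as in the attempt above)
W_full = wavelength * L / central_width
print(f"W with dy = full central width:  {W_full:.3e} m")   # 3.360e-05 m

# Reading 2 (assumption): delta_y taken as the fringe spacing, i.e. half the
# central maximum for a single slit; the slit comes out twice as wide
W_half = wavelength * L / (central_width / 2)
print(f"W with dy = half central width: {W_half:.3e} m")   # 6.720e-05 m
```

Either way, the arithmetic itself checks out; the open question is only which distance the Δy in the relevant equation refers to.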