1. The problem statement, all variables and given/known data

First I have a general question about single-slit diffraction patterns, which I have never fully understood. My book states that there will be minima when the waves from the top and the bottom of the slit differ by exactly a whole number of wavelengths, because one can then divide the slit into pairs of zones that interfere destructively. Maxima should then occur more or less in between these minima, when the top and bottom waves differ by (k + 0.5) wavelengths (k an integer starting from 0).

Is it correct that the intensity then decreases as I = I0 * (1/(2k+1))? Each time, one can divide the slit into an odd number of zones (3, 5, 7, etc.); pairs of zones still interfere destructively, so only 1/3, 1/5, 1/7, etc. of the intensity remains.

What I don't understand is why one doesn't also consider the situation where pairs of zones differ by k wavelengths. For example, divide the slit into two zones whose top edges differ by one wavelength, so (a/2) sin(theta) = k*lambda. But that reduces to a sin(theta) = 2k*lambda, which overlaps with the formula for the minima! Where am I going wrong?

Then there is one problem I couldn't solve:

1) A plane wave of wavelength 590 nm is incident on a slit of width a = 0.40 mm. A thin converging lens of focal length +70 cm is placed between the slit and a viewing screen and focuses the light on the screen.
(a) How far is the screen from the lens?
(b) Calculate the angle theta of the first diffraction minimum.

2. Relevant equations

Diffraction formulas for the single slit, the lens formula.

3. The attempt at a solution

For the problem: I can't see how the distance can be derived from this information, and I don't understand what the lens does to the interference pattern.
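For part (b), here is a quick numeric check of what I get if I just apply the standard single-slit minimum condition a sin(theta) = m*lambda with m = 1 (the variable names are my own, this is only a sketch of that one formula):

```python
import math

wavelength = 590e-9   # wavelength of the plane wave, in metres (590 nm)
a = 0.40e-3           # slit width, in metres (0.40 mm)

# First single-slit diffraction minimum: a * sin(theta) = 1 * wavelength
theta = math.asin(wavelength / a)

print(f"theta = {theta:.4e} rad = {math.degrees(theta):.4f} deg")
```

Since wavelength/a is tiny, sin(theta) ≈ theta and the angle comes out around 1.5e-3 rad; I'm less sure about part (a) and what the lens changes.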