Imagine plane light waves incident on a single slit, with a screen placed far enough away to simplify the math (the Fraunhofer limit). According to Kirchhoff's diffraction formula, when a very wide slit is doubled in width, the average intensity on the screen (averaged over all diffraction angles) doubles, and so does E_peak (the electric field at the central peak). This all agrees with energy conservation.

However, when a very narrow slit (that is, one much narrower than the wavelength of light) is halved, E_peak halves but the average intensity drops to 1/4, which does not seem to agree with energy conservation! The average intensity should halve if the light let through the slit is halved. The calculated intensity behaves this way because the diffraction pattern begins to lie "off of the screen" (that is, the diffraction pattern is still significant at theta = pi/2). Clearly, edge effects of the slit are becoming important for very narrow slits.

My question is this: is the prediction of Kirchhoff's diffraction formula correct (that is, do edge effects reflect or absorb the extra energy to maintain energy conservation), or is the formula incorrect (that is, do edge effects alter the diffraction pattern on the screen)? Possibly the answer depends on the specifics of the slit. I am not an optics person, so I hope someone can help me! In short: how much energy is expected to pass through a very narrow slit?
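To make the scaling concrete, here is a quick numerical sketch (my own construction, not taken from any optics package) that uses the standard Fraunhofer single-slit amplitude, E(theta) proportional to a·sinc(a·sin(theta)/lambda), as a stand-in for the full Kirchhoff result. It integrates |E|^2 over the forward half-plane and compares how the transmitted power changes when the slit width a is halved:

```python
import numpy as np

def total_power(a, lam=1.0, n=200_001):
    """Power through a slit of width a: integrate the Fraunhofer
    intensity |E(theta)|^2 over all forward angles [-pi/2, pi/2]."""
    theta = np.linspace(-np.pi / 2, np.pi / 2, n)
    # np.sinc(x) = sin(pi*x)/(pi*x), so this is the usual single-slit pattern
    E = a * np.sinc(a * np.sin(theta) / lam)
    return np.sum(E**2) * (theta[1] - theta[0])  # simple Riemann sum

# Narrow slit (a << lambda): halving the width quarters the power.
narrow = total_power(0.01) / total_power(0.005)
# Wide slit (a >> lambda): halving the width roughly halves the power.
wide = total_power(100.0) / total_power(50.0)
print(narrow, wide)  # narrow is ~4, wide is ~2
```

For a >> lambda the transmitted power scales linearly with the width (ratio ~2), while for a << lambda the pattern is nearly isotropic, so the power scales as the width squared (ratio ~4) — which is exactly the apparent energy-conservation puzzle described above.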