I have looked through my optics textbook and many websites about single-slit diffraction. They all end up deriving an equation that looks something like this: I = I0*(sinc(B))^2, where B = (1/2)*k*b*sin(theta), k = wavenumber, b = slit width. Maybe there's something I'm not understanding, but I have a hard time believing that the intensity depends only on the angle. Shouldn't the intensity also decrease as the distance from the slit increases? Thanks.
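For concreteness, here is a minimal numerical sketch of that angular dependence (the wavelength and slit width are arbitrary example values I picked, not from any particular textbook problem):

```python
import numpy as np

# Example values (arbitrary, for illustration): 500 nm light, 10 um slit
wavelength = 500e-9          # m
b = 10e-6                    # slit width, m
k = 2 * np.pi / wavelength   # wavenumber

theta = np.linspace(-0.15, 0.15, 2001)  # angle from the slit axis, rad
B = 0.5 * k * b * np.sin(theta)

# numpy's sinc is the normalized form sin(pi*x)/(pi*x), so pass B/pi
# to get the unnormalized sinc(B) = sin(B)/B used in the optics formula
I_over_I0 = np.sinc(B / np.pi) ** 2

# Central maximum: I/I0 = 1 at theta = 0
print(I_over_I0[theta.size // 2])  # -> 1.0

# First minima occur where B = pi, i.e. sin(theta) = wavelength / b
theta_min = np.arcsin(wavelength / b)
print(theta_min)  # ~0.05 rad for these values
```

Note that this only gives the relative intensity I/I0 as a function of angle; whatever distance dependence there is would have to be hiding in I0.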