I am quite aware of the effect of slit width and wavelength on a single-slit diffraction pattern. However, my teacher has never touched on the effect of a source moving towards or away from the slit, nor can I find any satisfactory or comprehensible explanation of this phenomenon. My intuition tells me that only the intensity changes, but I am not entirely sure.

Could anybody discuss the effect on the diffraction pattern when the source moves away from, towards, or sideways with respect to the slit? And what is the effect of the distance between the slit and the screen on which the wave is incident (will the diffraction pattern become wider, with lower intensity)? Any explanation will be appreciated. Thanks!
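To make the question concrete, here is a small sketch of the relations I have in mind: the small-angle first-minimum half-width y = λL/a, and a nonrelativistic Doppler shift λ' = λ(1 + v/c) for a receding source. All numbers (wavelength, slit width, speeds) are made up purely for illustration, so please correct me if this framing is wrong:

```python
c = 3.0e8      # speed of light, m/s
lam = 500e-9   # rest wavelength of the source, m (illustrative)
a = 50e-6      # slit width, m (illustrative)
L = 1.0        # slit-to-screen distance, m (illustrative)

def first_minimum_halfwidth(lam, a, L):
    """Distance from the central maximum to the first minimum on the
    screen, using the small-angle relation y = lam * L / a."""
    return lam * L / a

def doppler_wavelength(lam, v):
    """Observed wavelength for a source moving at speed v along the line
    of sight (v > 0 receding, v < 0 approaching), in the nonrelativistic
    approximation lam' = lam * (1 + v / c)."""
    return lam * (1 + v / c)

y0 = first_minimum_halfwidth(lam, a, L)
# A receding source red-shifts the light, so the pattern should widen;
# an approaching source blue-shifts it, so the pattern should narrow.
y_away = first_minimum_halfwidth(doppler_wavelength(lam, +0.1 * c), a, L)
y_toward = first_minimum_halfwidth(doppler_wavelength(lam, -0.1 * c), a, L)
print(y0, y_away, y_toward)

# Doubling the slit-to-screen distance doubles the pattern width
# (while the light spreads over a larger area, lowering intensity).
print(first_minimum_halfwidth(lam, a, 2 * L))
```

If this framing is right, a moving source would change the pattern's geometry via the Doppler-shifted wavelength, not just its intensity; that is essentially what I am asking about.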