1. The problem statement, all variables and given/known data

Monochromatic electromagnetic radiation of wavelength λ nm falls on double slits, creating an interference pattern on a screen L m away. Suppose now that the distance between the slits begins to increase at a constant rate: dd/dt = x μm/s. Assume that everything else remains unchanged. Find the rate at which the distance between the central maximum and the first maximum is changing, in cm/s, at the instant the distance between the slits is d μm.

2. Relevant equations

y = nλL/d (position of the nth maximum on the screen; I'm writing y for the fringe position since x is already taken by the given rate)

3. The attempt at a solution

For the first maximum, n = 1, so y = λL/d. Differentiating with respect to time, I got

dy/dt = (λL/d) · (1/(dd/dt))

Is this right? I am confused about the dd/dt part.
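If it helps, here is a minimal related-rates sanity check. Treating the slit separation d as a function of t and applying the chain rule to y = λL/d gives dy/dt = −(λL/d²)(dd/dt). The Python sketch below compares that chain-rule rate against a small finite-difference step; all the numbers in it (λ = 600 nm, L = 2 m, d = 40 μm, dd/dt = 5 μm/s) are made-up illustrative values, not data from the problem.

```python
# Minimal related-rates sanity check for y = lam*L/d with d = d(t).
# All numerical values below are illustrative assumptions, not problem data.

lam = 600e-9    # wavelength, m (assumed 600 nm)
L = 2.0         # slit-to-screen distance, m (assumed)
d = 40e-6       # slit separation at this instant, m (assumed 40 um)
dd_dt = 5e-6    # rate of increase of slit separation, m/s (assumed 5 um/s)

def y(sep):
    """Distance from the central maximum to the first maximum (n = 1)."""
    return lam * L / sep

# Chain rule: dy/dt = (dy/dd) * (dd/dt) = -(lam*L/d**2) * dd_dt
dy_dt_chain = -(lam * L / d**2) * dd_dt

# Finite-difference check: advance d by dd_dt*dt over a tiny time step dt
dt = 1e-6
dy_dt_numeric = (y(d + dd_dt * dt) - y(d)) / dt

print(f"chain rule: {dy_dt_chain * 100:+.4e} cm/s")
print(f"numeric:    {dy_dt_numeric * 100:+.4e} cm/s")
```

With these assumed numbers both lines print about −3.75e−01 cm/s; the minus sign means the first maximum moves toward the central maximum as the slits separate. A dimensional check also rules out (λL/d)·(1/(dd/dt)): its units are m·(s/m) = s, not the required length per time.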