In my physics book, one of the basic quick-quiz checkpoints asks what happens to the central peak in a diffraction envelope when you decrease the wavelength of light (from 650 nm to 450 nm, for reference).

My understanding is that the width of the peak would decrease, while the number of interference fringes within it would stay the same. The number of fringes depends only on the ratio of the slit separation d to the slit width a, via m = d/a, so it is independent of wavelength. The width of the peak decreases because, in the single-slit minimum condition a sin(theta) = m(lambda), a and m are fixed, so decreasing lambda decreases theta, thereby changing the width of the central maximum. The first-order minimum would be closer to the center with a smaller wavelength.

However, my book says the answer is that the width of the peak remains the same (along with the number of interference fringes). This seems inherently wrong. Based on my research on the internet, as well as the interactive picture my book included with the question, I feel like the publishers made a mistake in the answer. Clearly, wavelength affects the diffraction pattern. Right? Please help. My brain hurts.
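If it helps to see the numbers, here is a quick sketch of both conditions. The slit width and slit separation are made-up values (my book doesn't give any), so only the trend matters, not the exact angles:

```python
import math

# Assumed (hypothetical) slit geometry -- not from the book:
a = 2.0e-6   # slit width, 2.0 micrometers
d = 10.0e-6  # slit separation, 10 micrometers

for lam in (650e-9, 450e-9):
    # First diffraction minimum: a sin(theta) = lambda (m = 1)
    theta1 = math.degrees(math.asin(lam / a))
    # Interference maxima inside the central envelope: orders |m| < d/a,
    # so the count is 2*(d/a) - 1 and does not involve lambda at all.
    fringes = 2 * int(d / a) - 1
    print(f"lambda = {lam*1e9:.0f} nm: first minimum at {theta1:.2f} deg, "
          f"{fringes} fringes in the central peak")
```

The smaller wavelength gives a smaller first-minimum angle (a narrower central peak), while the fringe count stays fixed by d/a, which is exactly the behavior I expected.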