1. The problem statement, all variables and given/known data

Two slits spaced d = 0.0720 mm apart are 0.800 m from a screen. Coherent light of wavelength λ passes through the two slits. In their interference pattern on the screen, the distance from the center of the central maximum to the first minimum is 3.00 mm. The intensity at the peak of the central maximum is 0.0500 W/m^2. What is the intensity at a point on the screen that is 1.50 mm from the center of the central maximum?

PS: wavelength calculated: λ = 2.7*10^-7 m

2. Relevant equations

Δy = Dλ/d
I = I_0 [cos(πd sin θ / λ)]^2
sin θ ≈ y/D

3. The attempt at a solution

Δy = Dλ/d
3*10^-3 = 0.8 * λ / (0.072*10^-3)
λ = 2.7*10^-7 m

This should be correct, because my attempt in part (a) (a similar question, just at 2 mm instead of 1.5 mm) gave the right answer.

I = I_0 [cos(πd sin θ / λ)]^2
sin θ ≈ y/D = 1.5*10^-3 / 0.8 = 1.875*10^-3 (i.e. θ = 0.1074°)
I = I_0 [cos(π * 0.072*10^-3 * 1.875*10^-3 / 2.7*10^-7)]^2 = I_0 * 0^2 = 0

In part (a) I used degrees and got it wrong; then I used radians and got the correct answer. But now I'm wrong again. Even if I use the actual angle θ = 0.1074°, I still don't get the correct answer.

Besides the calculation itself: as I remember, destructive interference happens midway between two bright fringes, but now I'm unsure about that too.
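Not part of the original post, but a quick Python sketch to sanity-check the arithmetic above. It reproduces the attempt's wavelength and intensity; the second wavelength (lam2) is only there for comparison, under the assumption that the 3.00 mm point is a *minimum* lying midway between maxima (d sin θ = λ/2) rather than one fringe spacing away:

```python
import math

# Given values from the problem statement
d = 0.0720e-3    # slit separation (m)
D = 0.800        # slit-to-screen distance (m)
y_min = 3.00e-3  # distance from central max to first minimum (m)
I0 = 0.0500      # peak intensity of central maximum (W/m^2)
y = 1.50e-3      # point of interest on the screen (m)

# Wavelength as computed in the attempt, from y_min = D*lam/d
lam = y_min * d / D                  # 2.7e-7 m
phase = math.pi * d * (y / D) / lam  # cosine argument, in radians (pi/2 here)
I = I0 * math.cos(phase) ** 2        # essentially 0, matching the attempt

# Comparison (assumption): if the first minimum instead satisfies
# d*sin(theta) = lam/2, the wavelength and intensity come out differently
lam2 = 2 * d * y_min / D             # 5.4e-7 m
I2 = I0 * math.cos(math.pi * d * (y / D) / lam2) ** 2

print(lam, I, lam2, I2)
```

Note that math.cos expects radians, which is where the degrees-vs-radians slip in part (a) came from.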