1. The problem statement, all variables and given/known data

An interference pattern is produced by light with a wavelength of 580 nm from a distant source, incident on two identical parallel slits separated by a distance (between centers) of 0.480 mm. Let the slits have a width of 0.320 mm. In terms of the intensity I_0 at the center of the central maximum, what is the intensity at the angular position θ_1?

Edit: Apologies in advance for the messy equation below. I'm not quite sure how to use the toolbar above for subscripts and exponents.

2. Relevant equations

I_1 = I_0 * cos^2(π*d*sin(θ_1)/λ) * [sin(π*a*sin(θ_1)/λ) / (π*a*sin(θ_1)/λ)]^2

θ_1 = 1.21*10^-3 rad
d = 0.480*10^-3 m
a = 0.320*10^-3 m
λ = 580*10^-9 m

3. The attempt at a solution

The equation is just the two-slit interference pattern multiplied by the single-slit diffraction pattern, so it's mostly plugging in the variables and solving from there. The answer I got was 0.23 I_0, but that was wrong. My question is: since the original problem states that the distance d is measured between centers, should I subtract the width a from the separation d to find the actual separation between the two slits?
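For reference, here is a quick numerical check of the plug-in, a sketch that simply evaluates the equation above with the listed values (variable names are my own, not from the problem):

```python
import math

# Given values from the problem statement
wavelength = 580e-9   # wavelength λ (m)
d = 0.480e-3          # slit separation, center to center (m)
a = 0.320e-3          # slit width (m)
theta1 = 1.21e-3      # angular position θ_1 (rad)

# Two-slit interference term modulated by the single-slit diffraction envelope
phi = math.pi * d * math.sin(theta1) / wavelength   # interference phase argument
beta = math.pi * a * math.sin(theta1) / wavelength  # diffraction phase argument

intensity_ratio = math.cos(phi) ** 2 * (math.sin(beta) / beta) ** 2
print(f"I_1/I_0 = {intensity_ratio:.3f}")  # roughly 0.17
```

Evaluating the formula as written, with d used directly as the center-to-center separation, gives about 0.17 I_0 rather than 0.23 I_0, which suggests the discrepancy may be arithmetic rather than a wrong d.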