1. The problem statement, all variables and given/known data

Consider a plane wave of wavelength λ incident on a wall at an angle Φ = 30°. There are two slits in the wall separated by a distance d = 10λ. Each slit has width a << λ. Rays emerging from the slits propagate to a distant screen, where an interference/diffraction pattern may be seen.

a. For rays emerging from the slits at an angle θ, calculate the total path length difference in terms of θ, Φ, a, and λ.
b. At what angle θ will we find the "central maximum"?
c. At what angle θ will we find the first interference minimum? Note: there will be a "first minimum" on each side of the central maximum; find one of these.

2. Relevant equations

I'm not really sure. Maybe the equation for the intensity of a two-slit interference/diffraction pattern.

3. The attempt at a solution

The problem is that I have no idea what to make of part (a). For (b) and (c), I'm guessing I'm supposed to find the angles at which the intensity is a maximum and a minimum, but I don't even have the path difference. I can't figure out how the angle of incidence of the wave affects the pattern.
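A quick numerical sketch of where this might lead, assuming the usual oblique-incidence geometry: before the wall one slit's wavefront travels an extra d sin Φ, and after the wall the ray to angle θ travels an extra d sin θ, so the net path difference would be Δ = d(sin θ − sin Φ). This is an assumption about the setup, not something stated in the post:

```python
import math

lam = 1.0                  # wavelength (arbitrary units)
d = 10 * lam               # slit separation, d = 10λ
phi = math.radians(30.0)   # incidence angle Φ = 30°

def path_diff(theta):
    """Assumed net path difference: d*(sin θ - sin Φ)."""
    return d * (math.sin(theta) - math.sin(phi))

# Central maximum: Δ = 0  →  sin θ = sin Φ  →  θ = Φ
theta_central = math.asin(math.sin(phi))

# First interference minimum on one side: Δ = λ/2
theta_min = math.asin(math.sin(phi) + lam / (2 * d))

print(math.degrees(theta_central))  # 30.0
print(math.degrees(theta_min))      # ≈ 33.37
```

Under this geometry the central maximum simply shifts to the incidence angle, and the first minimum sits where sin θ = sin Φ + λ/(2d).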