1. The problem statement, all variables and given/known data

Two narrow slits are illuminated with light of wavelength 500 nm. Adjacent maxima near the center of the interference pattern are separated by 1.5 degrees. How far apart are the slits?

2. Relevant equations

d sin θ = mλ

3. The attempt at a solution

So we're given that:
λ = 500 nm
m = 1.5 degrees

We need to find d, the distance between the two slits. I'm having a difficult time figuring out what to do with the value they give in degrees for how far apart the maxima are. I wasn't sure whether I needed to take the sine of 1.5 degrees and multiply it by my wavelength value, but upon doing this, I got a value of d = 13.1 meters. My book says the answer should be much smaller than this, around 1.98 × 10^-5 m. I'm lost on how you would go about correctly finding this unless I'm missing another equation that I would need? Any help is greatly appreciated.
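As a sanity check on the arithmetic, the single relation d sin θ = mλ is enough if the 1.5 degrees is read as the angle θ between adjacent maxima (which differ by one order, so m = 1) rather than as a value of m. A minimal Python sketch under that assumption:

```python
import math

wavelength = 500e-9        # 500 nm, converted to meters
theta = math.radians(1.5)  # angular spacing of adjacent maxima, treated as θ
m = 1                      # adjacent maxima differ by one order

# Rearranging d * sin(theta) = m * wavelength for the slit separation d
d = m * wavelength / math.sin(theta)
print(d)
```

This gives a slit separation on the order of 10^-5 m, i.e. the same scale as the book's quoted answer, which suggests the issue was dividing by sin θ rather than multiplying by it.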