Wave interference disappearing?
A quick conceptual sort of question. I have a problem with two speakers separated by 80 cm and a movable microphone sliding along a wall 4 meters away. The problem is to find things like the position of the first minimum in sound intensity and the second maximum.
I'm using the conditions that for maxima sin theta = n lambda/d and for minima sin theta = (n + 1/2) lambda/d. The wavelength of the sound turns out to be 0.44 meters, so for the first part of the question I just use some trig and everything is fine.
But the second part of the question asks how my answers change when the distance between the speakers shrinks to 10 cm. Now the wavelength is several times larger than the separation, so those formulas give sines larger than one. My question is: how should I interpret that? Does it mean that once the wavelength-to-separation ratio gets too large, the interference pattern disappears? That's what I'm thinking, but I wasn't sure if there's something about using those conditions that I'm missing (and none of my books seem to come out and say that--is it too obvious to be mentioned explicitly?).
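For what it's worth, here's the quick numerical check I did (a sketch, assuming lambda = 0.44 m and a wall 4 m away as in the problem; the function name is just mine). It finds the wall positions where sin theta = (n + 1/2) lambda/d has a real solution:

```python
import math

WAVELENGTH = 0.44  # m, from the problem
L = 4.0            # m, distance from speakers to the wall

def minima_positions(d, n_max=5):
    """Wall positions y (m) of intensity minima, using sin(theta) = (n + 1/2)*lambda/d.

    Orders where the required sine exceeds 1 have no real angle and are skipped.
    """
    positions = []
    for n in range(n_max):
        s = (n + 0.5) * WAVELENGTH / d
        if s <= 1:  # otherwise no real theta exists for this order
            theta = math.asin(s)
            positions.append(L * math.tan(theta))
    return positions

print(minima_positions(0.80))  # d = 80 cm: a couple of minima exist
print(minima_positions(0.10))  # d = 10 cm: empty, since 0.5*0.44/0.10 = 2.2 > 1
```

With d = 10 cm even the n = 0 minimum would need sin theta = 2.2, so the list comes back empty, which is what made me suspect the pattern just goes away.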