e(ho0n3
[SOLVED] Amplitude of Sound Waves from Two Sources at a Point
Problem. Two sources, A and B, emit sound waves in phase, each of wavelength \lambda and amplitude D_M. Consider a point P that is a distance r_A from A and r_B from B. Show that if r_A and r_B are nearly equal (r_A - r_B \ll r_A), then the amplitude varies approximately with position as
\frac{2D_M}{r_A} \cos\left[\frac{\pi}{\lambda}(r_A - r_B)\right]
Let D(x, t) be the function that describes the displacement of the sound waves at time t and distance x from the source. I figure that the displacement at point P must be D(r_A, t) + D(r_B, t), right? One thing I'm noticing is that the expression for the amplitude given in the problem statement does not vary with time. What gives?
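A sketch of the standard route, under the assumption (not stated explicitly in the problem, but consistent with the r_A in the denominator of the target expression) that each source emits a spherical wave whose amplitude falls off as 1/r, i.e. D(r, t) = \frac{D_M}{r}\sin(kr - \omega t) with k = \frac{2\pi}{\lambda}. Superposing the two waves at P:

D_P = \frac{D_M}{r_A}\sin(k r_A - \omega t) + \frac{D_M}{r_B}\sin(k r_B - \omega t) \approx \frac{D_M}{r_A}\left[\sin(k r_A - \omega t) + \sin(k r_B - \omega t)\right]

where \frac{1}{r_B} has been replaced by \frac{1}{r_A}, justified because r_A - r_B \ll r_A. Applying the identity \sin\alpha + \sin\beta = 2\sin\frac{\alpha + \beta}{2}\cos\frac{\alpha - \beta}{2} and using \frac{k}{2} = \frac{\pi}{\lambda} gives

D_P \approx \frac{2D_M}{r_A}\cos\left[\frac{\pi}{\lambda}(r_A - r_B)\right]\sin\left[\frac{\pi}{\lambda}(r_A + r_B) - \omega t\right]

This also answers the time question: all of the time dependence sits in the traveling-wave factor \sin\left[\frac{\pi}{\lambda}(r_A + r_B) - \omega t\right], and the amplitude is by definition the time-independent coefficient multiplying that factor, which is exactly the expression the problem asks for.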