Saptarshi Sarkar
 Homework Statement:
 A source S of frequency ##f_0## and an observer O, moving with speeds ##v_1## and ##v_2## respectively, are moving away from each other. When they are separated by distance a (t=0), a sound pulse is emitted by the source. Suppose velocity of sound to be ##v_s## and calculate the time ##t_1## that it takes for the pulse to be received by O.
 Relevant Equations:

Total distance the pulse needs to travel:
##D = a + v_1t_1##
Speed of the sound pulse relative to O = ##v_s - v_2##
So,
##t_1 = \frac {a + v_1t_1} {v_s - v_2}##
But the solution should be
##t_1 = \frac a {v_s - v_2}##
I made the following assumptions:
1. I did not consider the frequency as the Doppler shift in frequency was not asked.
2. I did not add the distance the source moved in time ##t_1## to the total distance traveled by the wave as the pulse was emitted at t=0.
Are any of my assumptions wrong?
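As a quick numeric sanity check of the textbook result, here is a minimal sketch (the values are arbitrary, chosen only for illustration): the pulse travels at ##v_s## in the medium once emitted, while O starts at ##a## and recedes at ##v_2##, so the meeting time should satisfy ##v_s t_1 = a + v_2 t_1##.

```python
# Sanity check: once emitted, the pulse moves at v_s in the medium frame,
# independent of the source's later motion. O starts at a and recedes at v_2.
# All numeric values are arbitrary, for illustration only.
v_s = 340.0   # speed of sound (m/s)
v_2 = 20.0    # observer speed (m/s)
a = 100.0     # separation at t = 0 (m)

t1 = a / (v_s - v_2)           # textbook result
pulse_pos = v_s * t1           # pulse position at time t1
observer_pos = a + v_2 * t1    # observer position at time t1

# The two positions coincide at t1, consistent with t1 = a / (v_s - v_2).
print(t1, abs(pulse_pos - observer_pos) < 1e-9)
```

Note that ##v_1## never enters this check: the source's motion after ##t=0## cannot affect a pulse that has already left it.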