Hi all, I was thinking about the Doppler effect today, and I was wondering why the distance between the source and the listener does not affect the frequency heard by the listener.

Consider the Doppler formula fL = fS*(v + vL)/(v + vS), with the positive direction taken from listener to source, and the following situation: a police car with its siren on (emitting at 400Hz) drives at 20m/s behind a criminal's getaway car moving at 15m/s in the same direction.

As the police car catches up to the criminal, the Doppler formula says the frequency heard by the criminal is shifted above 400Hz. The opposite occurs after the police car overtakes the criminal and is now in front (assume both cars keep their constant velocities): the frequency heard by the criminal is now below 400Hz.

What confuses me is when this transition from >400Hz to <400Hz occurs, given that the two cars always travel at their (different) constant velocities. From the formula, it seems that up until the moment the police car overtakes the criminal, the criminal hears a constant frequency above 400Hz. What happens when the police car is side by side with the criminal's car? What frequency does the criminal hear then, exactly 400Hz? And once the police car overtakes, does the criminal suddenly hear a frequency below 400Hz, which then remains constant as the police car travels further and further away?

That just seems non-intuitive to me. Is there a flaw in my logic, or am I applying the Doppler formula incorrectly? I would appreciate any help in understanding this topic. Thanks!
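To make the two regimes concrete, here is a quick numerical sketch of the formula as I'm applying it. The speed of sound v = 343m/s is my assumption (it isn't stated above), and the sign flips come from re-orienting the positive direction (listener to source) before and after the overtake:

```python
# Quick numerical check of the two regimes described above.
# Assumption (not in the original post): speed of sound v = 343 m/s.
# Sign convention: positive direction runs from listener to source.

V_SOUND = 343.0   # m/s, assumed value for the speed of sound
F_SOURCE = 400.0  # Hz, siren frequency

def doppler(f_s, v_listener, v_source, v=V_SOUND):
    """fL = fS * (v + vL) / (v + vS), velocities signed along listener -> source."""
    return f_s * (v + v_listener) / (v + v_source)

# Police car (source, 20 m/s) behind the criminal (listener, 15 m/s):
# the listener->source axis points opposite to the direction of travel,
# so both velocities pick up a minus sign.
f_before = doppler(F_SOURCE, -15.0, -20.0)

# Police car now in front of the criminal: the listener->source axis
# points along the direction of travel, so both velocities are positive.
f_after = doppler(F_SOURCE, +15.0, +20.0)

print(f"before overtaking: {f_before:.1f} Hz")  # above 400 Hz
print(f"after overtaking:  {f_after:.1f} Hz")   # below 400 Hz
```

So with these numbers the criminal hears roughly 406Hz the whole time the police car is behind, and roughly 394Hz the whole time it is in front, which is exactly the jump I find non-intuitive.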