# Simple trig problem has me stumped

Hi

I'm sure this is an easy one, but I've managed to thoroughly confuse myself. Basically I'm trying to come up with a formula to determine how long a signal takes to travel from point a to point b.

Here are the givens:

a and b lie on parallel lines separated by a distance d, and the line drawn from a to b is perpendicular to both lines.

If a and b are at rest, the time it takes a signal to go from a to b is t.

If a and b are both moving in the same direction along their lines at speed t/m, how long would it take a signal from a to reach b?

This seems like a simple right triangle relationship but I can't seem to figure it out.

Any help appreciated.

P.S. This isn't homework. It's for part of a Doppler shift program I'm writing.


HallsofIvy
If the signal is a sound wave in a medium in which the speed of sound is $v_0 = d/t$, set up a coordinate system in which the origin of the signal is at (0, 0) and the receiver is initially at (0, d). If both are moving with velocity u, then at time T the receiver will be at (uT, d). For the signal to be received at that moment, the distance from (0, 0) to (uT, d) must equal $v_0 T = dT/t$. That is,
$$\sqrt{u^2T^2 + d^2} = \frac{dT}{t},$$
so $u^2T^2 + d^2 = d^2T^2/t^2$, which gives
$$\left(\frac{d^2}{t^2} - u^2\right)T^2 = d^2.$$
Solve that for T.
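Solving that last equation gives $T = d/\sqrt{(d/t)^2 - u^2}$, which is real only when $u < d/t$, i.e. when the pair moves slower than the signal. Since the OP mentions writing a program, here is a minimal Python sketch of that formula; the function and variable names are my own, not from the thread:

```python
import math

def signal_time(d, t, u):
    """Time for a signal to cross a perpendicular distance d to a receiver,
    when source and receiver both move at speed u along parallel lines.
    t is the at-rest travel time, so the signal speed is v0 = d / t."""
    v0 = d / t                      # signal speed in the medium
    if u >= v0:
        raise ValueError("u must be less than the signal speed d/t")
    # From (d^2/t^2 - u^2) * T^2 = d^2:
    return d / math.sqrt(v0**2 - u**2)

# Sanity checks: at rest the time is just t; when moving, it is longer.
print(signal_time(1.0, 1.0, 0.0))   # 1.0
print(signal_time(1.0, 1.0, 0.6))   # about 1.25, since 1/sqrt(1 - 0.36) = 1.25
```

Note that the answer reduces to $T = t/\sqrt{1 - u^2 t^2/d^2}$, the familiar Lorentz-style factor, which is a useful cross-check for a Doppler calculation.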