
ktoz


Hi

I'm sure this is an easy one, but I've managed to thoroughly confuse myself. Basically I'm trying to come up with a formula to determine how long a signal takes to travel from point a to point b.

Here are the givens:

a and b lie on parallel lines; the distance between the lines is d, and a line drawn from a to b is perpendicular to both lines.

If a and b are at rest, the time it takes a signal to go from a to b is t.

If a and b are moving in the same direction at speed t/m, how long would it take a signal from a to reach b?

This seems like a simple right triangle relationship but I can't seem to figure it out.

Any help appreciated.

P.S. This isn't homework. It's for part of a Doppler shift program I'm writing.
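For what it's worth, here's the right-triangle setup I think the question is circling: a minimal sketch, assuming the signal travels at some speed s (so s = d/t in the rest case) and both points move at a speed v < s along their lines (the names s, v, and travel_time are mine, not from the post). While the signal is in flight for time t', b slides v·t' sideways, so the signal path is the hypotenuse: (s·t')² = d² + (v·t')², which rearranges to t' = t / sqrt(1 − (v/s)²).

```python
import math

def travel_time(t_rest, v_over_s):
    """Signal travel time when both endpoints move at speed v.

    t_rest   : time for the signal to cross when a and b are at rest
    v_over_s : ratio v/s of the endpoints' speed to the signal speed (< 1)

    Right triangle: (s*t')^2 = d^2 + (v*t')^2  =>  t' = t_rest / sqrt(1 - (v/s)^2)
    """
    if not 0 <= v_over_s < 1:
        raise ValueError("endpoints must move slower than the signal")
    return t_rest / math.sqrt(1.0 - v_over_s ** 2)

# At rest the time is unchanged; at v = 0.6*s it stretches by 1/0.8 = 1.25.
print(travel_time(1.0, 0.0))  # 1.0
print(travel_time(1.0, 0.6))  # 1.25
```

Note this is the same 1/sqrt(1 − v²/c²) factor that shows up in the light-clock derivation of time dilation, which may or may not be relevant depending on whether the signal here is light.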

