Huskies213:
Suppose a plane flies 3470 miles from New York to London at an average speed of 1810 mi/h. The plane then leaves London and flies to Los Angeles, 5460 miles away, at an average speed of 1437 mi/h. Find the average speed for the whole trip.
I know that v_avg = Δx/Δt.
But my question is: how do you actually find the average speed? Is it (distance 1 + distance 2) divided by (speed 1 + speed 2)?
or
Is it distance 1/speed 1 and distance 2/speed 2, with the two answers added together?
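A worked sketch, using only the definition quoted above: average speed is total distance over total time, and each leg's time follows from t = d/v. (So the second guess gives the total time, but you then divide by it rather than stopping there, and the first guess mixes units and isn't a speed at all.)

$$v_{\text{avg}} = \frac{\Delta x}{\Delta t} = \frac{d_1 + d_2}{t_1 + t_2} = \frac{d_1 + d_2}{d_1/v_1 + d_2/v_2}$$

With the numbers from the problem:

$$t_1 = \frac{3470\ \text{mi}}{1810\ \text{mi/h}} \approx 1.92\ \text{h}, \qquad t_2 = \frac{5460\ \text{mi}}{1437\ \text{mi/h}} \approx 3.80\ \text{h}$$

$$v_{\text{avg}} = \frac{3470 + 5460}{1.92 + 3.80}\ \frac{\text{mi}}{\text{h}} \approx \frac{8930\ \text{mi}}{5.72\ \text{h}} \approx 1.56 \times 10^3\ \text{mi/h}$$

Note that this is lower than the simple average of the two speeds (about 1624 mi/h), because the plane spends more time on the slower leg.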