Problem:

A plane flying horizontally at an altitude of 1 mile and a speed of 500 mph passes directly over a radar station. Find the rate at which the distance from the plane to the station is increasing when it is 2 miles away from the station.

d(distance)/dt = sqrt(3)*250

I know this is the correct answer (I looked in the back of the book).
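As a sanity check on the book's answer, here is a small numerical sketch of the usual related-rates setup for this problem (this setup is my assumption, not quoted from the book): with the plane 1 mile up and x the horizontal distance from the point directly above the station, the distance s to the station satisfies s^2 = x^2 + 1, so differentiating gives ds/dt = (x/s) * dx/dt.

```python
import math

# Assumed related-rates setup (illustrative, not from the original post):
#   s^2 = x^2 + 1^2   ->   2s ds/dt = 2x dx/dt   ->   ds/dt = (x/s) * dx/dt

dx_dt = 500.0                 # mph, the plane's horizontal speed
s = 2.0                       # miles, distance to station at the moment of interest
x = math.sqrt(s**2 - 1**2)    # horizontal distance at that moment: sqrt(3)

ds_dt = (x / s) * dx_dt       # = (sqrt(3)/2) * 500 = 250*sqrt(3)
print(ds_dt)                  # ~433.01 mph, matching sqrt(3)*250
```

Note that ds/dt depends on x (and hence on time), which bears on question 2 below.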

Questions:

1. What exactly is d(distance)/dt? Is it the velocity the plane would have to travel along the hypotenuse in order to be at the same place at the same time as if it were traveling along the horizontal?

2. Does this value of d(distance)/dt hold only at this moment in time, and not, say, 2, 3, or 4.5 hours from its starting position?