Hello guys, I just have a quick question about the following problem:

Problem: In an air race, an airplane flies from a point directly over Metropolis to a point directly over Gotham City, and then turns around and flies back to the starting point. The airspeed (that is, the speed of the airplane relative to the air) is constant throughout the flight and equal to v. Gotham City lies a distance D due east of Metropolis. How much time is required for the round trip if a steady wind of speed vw is blowing toward the south?

My answer is: 2*D/sqrt(v^2 - vw^2).

Initially I had 2*D/sqrt(v^2 + vw^2), but that is wrong, since with that reasoning the airplane would end up south of the city it was actually headed for. The reason I put the minus sign in is that the pilot is not allowed to drift off course, so he has to countersteer by pointing the nose slightly north into the wind. The northward component of the airspeed then cancels vw, and only the remaining component, sqrt(v^2 - vw^2), points along the Metropolis-Gotham line, so that is the ground speed in both directions. Is my answer correct?
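Here is a quick numeric sanity check of the formula. The values of v, vw, and D below are made up (the problem gives none), chosen so the square root comes out clean:

```python
import math

# Made-up numbers just to test the formula (not given in the problem):
v = 100.0   # airspeed in m/s
vw = 60.0   # wind speed toward the south, m/s
D = 8000.0  # Metropolis -> Gotham distance, m

# The pilot crabs into the wind: the northward component of the airspeed
# cancels vw, so the eastward (and later westward) ground speed is:
ground_speed = math.sqrt(v**2 - vw**2)   # sqrt(10000 - 3600) = 80 m/s

# Same ground speed both ways, so the round trip takes:
round_trip_time = 2 * D / ground_speed   # 16000 / 80 = 200 s

print(ground_speed, round_trip_time)
```

With these numbers the 3-4-5 triangle makes it easy to see the crab angle: a 60 m/s northward component exactly cancels the wind, leaving 80 m/s along the course line.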