1. The problem statement, all variables and given/known data

A radio telescope, whose two antennas are separated by 55 m, is designed to receive 3.0-MHz radio waves produced by astronomical objects. The received radio waves create 3.0-MHz electronic signals in the telescope's left and right antennas. These signals then travel by equal-length cables to a centrally located amplifier, where they are added together. The telescope can be "pointed" to a certain region of the sky by adding the instantaneous signal from the right antenna to a "time-delayed" signal received by the left antenna a time Δt ago. (This time delay of the left signal can easily be accomplished with the proper electronic circuit.) If a radio astronomer wishes to "view" radio signals arriving from an object oriented at a 12° angle to the vertical, as in the figure (Figure 1), what time delay Δt is necessary?

2. Relevant equations

Path difference between the two antennas: Δx = d sin(θ)
Required time delay: Δt = Δx / v

3. The attempt at a solution

I use Δx = d sin(θ). They give d = 55 m and θ = 12°, so Δx = 55 sin(12°) ≈ 11.4 m, which is the extra distance the incoming wavefront must travel to reach the far antenna. These are radio waves, so they travel at the speed of light, c = 3.0 × 10^8 m/s (not the speed of sound), and they cover that 11.4 m in Δt = (11.4 m) / (3.0 × 10^8 m/s) ≈ 3.8 × 10^-8 s, or about 38 ns, so that is how much they need to delay the signal. Is this valid?
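A quick numerical check of the arithmetic above (my own sketch, not part of the original post; the variable names are mine, and it simply evaluates Δx = d sin θ and Δt = Δx / c for the given numbers):

```python
import math

# Given values from the problem statement
d = 55.0                       # antenna separation, m
theta = math.radians(12.0)     # angle of the source from the vertical
c = 3.0e8                      # speed of radio waves (speed of light), m/s
f = 3.0e6                      # signal frequency, Hz

path_diff = d * math.sin(theta)  # extra path length to the far antenna, m
delay = path_diff / c            # time delay needed to align the two signals, s

print(f"path difference = {path_diff:.2f} m")   # ~11.4 m
print(f"time delay      = {delay:.2e} s")       # ~3.8e-8 s (about 38 ns)
print(f"wavelength      = {c / f:.0f} m")       # 100 m, for reference
```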