# Homework Help: Antennas causing destructive interference

1. The problem statement, all variables and given/known data

Two radio antennas radiating in phase are located at points A and B, 200 m apart. The radio waves have a frequency of 5.80 MHz. A radio receiver is moved out from point B along a line perpendicular to the line connecting A and B.

At what distances from B will there be destructive interference? (Note: the distance of the receiver from the sources is not large in comparison to the separation of the sources, so the equation [itex]d\sin\theta = (m + \tfrac{1}{2})\lambda[/itex] does not apply.)

3. The attempt at a solution

I tried looking at the difference in the two path lengths, because I know that for destructive interference to occur, [itex]\Delta r = (m + \tfrac{1}{2})\lambda[/itex].
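For the numbers, the wavelength follows from [itex]\lambda = c/f[/itex] (taking [itex]c \approx 3.00 \times 10^8~\mathrm{m/s}[/itex]):

[itex]\lambda = \frac{3.00 \times 10^8~\mathrm{m/s}}{5.80 \times 10^6~\mathrm{Hz}} \approx 51.7~\mathrm{m}[/itex]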

I drew a right triangle in which one leg was the 200 m separation, the other leg was [itex]r_2[/itex] (the receiver's distance from B), and the hypotenuse was [itex]r_1[/itex] (its distance from A). I took θ to be the angle between the 200 m side and [itex]r_1[/itex]. So then:

[itex]r_1 = \frac{200}{\cos\theta},~~ r_2 = 200\tan\theta,~~ \Delta r = r_1 - r_2 = 200\,\frac{1 - \sin\theta}{\cos\theta} = (m + \tfrac{1}{2})\lambda[/itex]

But I can't figure out how to solve this equation for θ; the best alternative I've come up with is sketched below. Thanks!
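Writing [itex]x[/itex] for the receiver's distance from B (so here [itex]x = r_2[/itex] and [itex]r_1 = \sqrt{200^2 + x^2}[/itex]), the trig can be avoided by squaring and solving for [itex]x[/itex] directly. This is only a sketch, so the algebra may need checking:

[itex]\sqrt{200^2 + x^2} - x = (m + \tfrac{1}{2})\lambda \;\Rightarrow\; 200^2 + x^2 = \left[(m + \tfrac{1}{2})\lambda + x\right]^2 \;\Rightarrow\; x = \frac{200^2 - \left[(m + \tfrac{1}{2})\lambda\right]^2}{2(m + \tfrac{1}{2})\lambda}[/itex]

Only orders with [itex](m + \tfrac{1}{2})\lambda < 200~\mathrm{m}[/itex] give [itex]x > 0[/itex], since the path difference can never exceed the 200 m separation; with [itex]\lambda \approx 51.7~\mathrm{m}[/itex] that would mean m = 0, 1, 2, 3. As a quick numerical check of that formula (a Python sketch; the variable names are mine):

[code]
# Distances x from B at which the two signals interfere destructively,
# assuming x = (200^2 - delta^2) / (2*delta) with delta = (m + 1/2)*lambda.
c = 3.00e8         # speed of light, m/s
f = 5.80e6         # frequency, Hz
sep = 200.0        # antenna separation, m
lam = c / f        # wavelength, about 51.7 m

m = 0
while (m + 0.5) * lam < sep:      # path difference cannot exceed 200 m
    delta = (m + 0.5) * lam       # required path difference for order m
    x = (sep**2 - delta**2) / (2 * delta)
    print(f"m = {m}: x = {x:.1f} m")
    m += 1
[/code]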
