Hi,

Here's the question:

a) There is a radio telescope 1000 feet in diameter. It is claimed that it can detect a signal that lays down, over the entire surface of the Earth, a power of only one picowatt. What power would be incident on the antenna in this case?

It seems to me that this can be solved simply by taking the ratio of the signal power to the surface area of the Earth, and applying that intensity to the collecting area of the telescope.

As far as I can see, given the radius of the Earth = 6.37 x 10^6 m and 1 foot = 0.3048 m,

[ (1 x 10^-12) / (4 x pi x (6.37 x 10^6)^2) ] x [ pi x (1/2 x 1000 x 0.3048)^2 ]

would give the answer. It actually gives 1.43 x 10^-22 W, whereas the expected answer is 5.6 x 10^-22 W (about 4 times larger).
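For what it's worth, here is a quick numeric sketch of the calculation above, using the values given in the post; it reproduces the 1.43 x 10^-22 W figure and also shows the factor-of-4 discrepancy already noted:

```python
import math

# Values as given in the post (assumptions: SI units throughout)
P_total = 1e-12                      # total power laid down on Earth (W)
R_earth = 6.37e6                     # radius of the Earth (m)
dish_radius = 0.5 * 1000 * 0.3048    # 1000-ft dish radius converted to metres

# Intensity assuming the power is spread over the full spherical surface 4*pi*R^2
intensity = P_total / (4 * math.pi * R_earth**2)   # W/m^2

# Power collected by the dish's circular aperture
dish_area = math.pi * dish_radius**2               # m^2
power = intensity * dish_area                      # W

print(power)      # about 1.43e-22 W, matching the post's result
print(4 * power)  # about 5.7e-22 W, the order of the expected 5.6e-22 W
```

This only restates the arithmetic; it does not resolve where the factor of 4 comes from, which is the question being asked.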

Can anyone spot what I'm doing wrong?

Cheers

**Physics Forums | Science Articles, Homework Help, Discussion**

# Homework Help: Simple ratios dilemma; telescopes and signals from outer space
