# Simple ratios dilemma: telescopes and signals from outer space

1. Oct 6, 2005

### T7

Hi,

Here's the question:

a) There is a radio telescope that is 1000 feet in diameter. It is claimed that it can detect a signal that deposits, over the entire surface of the Earth, a total power of only one picowatt. What power would be incident on the antenna in this case?

It seems to me that this can be solved simply by dividing the signal power by the surface area of the Earth to get the power per unit area, and then multiplying by the collecting area of the telescope.

As far as I can see, given the radius of the Earth $= 6.37 \times 10^6$ m and 1 foot $= 0.3048$ m,

$$\frac{1 \times 10^{-12}}{4\pi \times (6.37 \times 10^6)^2} \times \pi \times \left(\frac{1000 \times 0.3048}{2}\right)^2$$

would give the answer. It actually gives $1.43 \times 10^{-22}$ W, when what is required is $5.6 \times 10^{-22}$ W (4 times larger).

Can anyone spot what I'm doing wrong?

Cheers

2. Oct 6, 2005

### Tom Mattson

Staff Emeritus
You are using the total surface area of the entire Earth ($4\pi r^2$), when you should only be using the cross-sectional area of the Earth ($\pi r^2$).
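The correction can be checked numerically: a signal from a distant source arrives as essentially parallel rays, so it illuminates only the Earth's projected disc, $\pi r^2$, not the full sphere. A short Python sketch using the constants quoted above:

```python
import math

FOOT = 0.3048                  # metres per foot
R_EARTH = 6.37e6               # radius of the Earth, metres
P_TOTAL = 1e-12                # total power deposited on Earth, watts (1 pW)
DISH_DIAMETER = 1000 * FOOT    # telescope diameter, metres

# Power per unit area: the distant signal illuminates only the Earth's
# cross-sectional disc, pi * r^2, not the full surface 4 * pi * r^2.
flux = P_TOTAL / (math.pi * R_EARTH**2)

# Power collected by the dish's circular aperture.
dish_area = math.pi * (DISH_DIAMETER / 2) ** 2
power_on_dish = flux * dish_area

print(f"{power_on_dish:.2e} W")   # prints "5.72e-22 W"
```

This lands on $\approx 5.7 \times 10^{-22}$ W, consistent with the expected $5.6 \times 10^{-22}$ W; dividing by $4\pi r^2$ instead reproduces the OP's value of $1.43 \times 10^{-22}$ W.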

3. Oct 6, 2005

### T7

Yep. Thanks. :)