Simple ratios dilemma; telescopes and signals from outer space

T7

Hi,

Here's the question:

a) There is a radio telescope that is 1000 feet in diameter. It is claimed that it can detect a signal that deposits a total power of only one picowatt over the entire surface of the Earth. What power would be incident on the antenna in this case?


It seems to me that this can be solved simply by taking the ratio of the signal power to the surface area of the Earth (i.e. the power per unit area) and multiplying by the collecting area of the telescope.

As far as I can see, given the radius of the Earth = [itex]6.37 \times 10^6[/itex] m and 1 foot = 0.3048 m,

[tex]\frac{1 \times 10^{-12}}{4 \pi (6.37 \times 10^6)^2} \times \pi \left(\frac{1000 \times 0.3048}{2}\right)^2[/tex]

would give the answer. It actually gives [itex]1.43 \times 10^{-22}[/itex] W, when the required answer is [itex]5.6 \times 10^{-22}[/itex] W (about 4 times larger).

Can anyone spot what I'm doing wrong?

Cheers
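
A quick numerical check of the calculation above (a sketch in Python; the figures are those quoted in the post):

[code]
import math

# Figures quoted in the post above
P_total = 1e-12            # total power spread over the Earth, W
R_earth = 6.37e6           # radius of the Earth, m
D_dish  = 1000 * 0.3048    # 1000 ft dish diameter converted to metres

# Power per unit area if the picowatt is spread over the Earth's full
# surface area, 4*pi*R^2 (the approach taken in the post)
flux = P_total / (4 * math.pi * R_earth**2)

# Power collected by a dish of area pi*(D/2)^2
P_dish = flux * math.pi * (D_dish / 2)**2
print(P_dish)   # ~1.43e-22 W, a factor of 4 below the expected 5.6e-22 W
[/code]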
 

Tom Mattson

You are using the total surface area of the entire Earth ([itex]4\pi r^2[/itex]), when you should only be using the cross-sectional area of the Earth ([itex]\pi r^2[/itex]).
 

T7

Tom Mattson said:
You are using the total surface area of the entire Earth ([itex]4\pi r^2[/itex]), when you should only be using the cross-sectional area of the Earth ([itex]\pi r^2[/itex]).
Yep. Thanks. :)
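
For completeness, the same check with the Earth's cross-sectional area [itex]\pi r^2[/itex] in place of its full surface area (a sketch, same figures as above):

[code]
import math

P_total = 1e-12            # total power intercepted by the Earth, W
R_earth = 6.37e6           # radius of the Earth, m
D_dish  = 1000 * 0.3048    # 1000 ft dish diameter converted to metres

# The signal arrives as an essentially plane wave, so the Earth intercepts
# it over its cross-section pi*R^2, not its full surface area 4*pi*R^2
flux = P_total / (math.pi * R_earth**2)

P_dish = flux * math.pi * (D_dish / 2)**2
print(P_dish)   # ~5.7e-22 W, matching the expected 5.6e-22 W to within rounding
[/code]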
 
