
Limits on radio telescopes

  1. Jun 6, 2005 #1
    More research for scifi stories:

What are the theoretical limits on the size of a radio telescope (assuming great advances in manufacturing techniques and unlimited resources)? I know the resolution depends on how far apart you place the individual components, so could you in theory have one that spanned the entire solar system? Even larger? I know relativity at some point would make synchronizing the signals difficult, but assuming you could compensate for this, what becomes the limiting factor?

In a related question, is it even within the realm of theoretical possibility (assuming amazing advances in technology, again unlimited resources, but no fundamental changes to the laws of physics as we know them) to build some sort of telescope with enough resolution to make out small structures on the surface of a planet 5 light years distant? Would a series of fortunately placed black holes contributing some gravitational lensing change the answer at all?

Is there some specific resolution that reaches a theoretical limit based on the laws of physics (as opposed to the quality of raw materials and production techniques within our reach)? If so, what causes it, and can it be accurately determined?

  3. Jun 10, 2005 #2


    Staff Emeritus
    Science Advisor
    Gold Member

    I'm not an expert in this area, but I suspect the limiting factor would be light-gathering power. Although interferometers are good for high angular resolution, there are still practical limits to how much light you can collect in your dishes. It's not enough that you can distinguish the photons from two objects 1 cm apart on a nearby planet, you also have to collect enough photons to see them.
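To get a feel for why light-gathering power matters, here is a rough back-of-the-envelope sketch in Python. All the source numbers (a 1 GW broadband radio output for the target, a 100 m dish, isotropic emission) are illustrative assumptions I'm inventing for the example, not figures from this thread:

```python
import math

# Hedged sketch: photon-collection estimate for a weak radio source 5 ly away.
# The source power, frequency, and dish size below are illustrative guesses.

LY = 9.461e15              # metres per light-year
d = 5 * LY                 # distance to the target planet

P = 1e9                    # assumed radiated radio power of the "city", watts
nu = 30e9                  # 30 GHz, i.e. a 1 cm wavelength
E_photon = 6.626e-34 * nu  # energy per photon = Planck's constant * frequency

# Flux at the telescope, assuming isotropic emission
flux = P / (4 * math.pi * d**2)   # W / m^2

# Photon arrival rate through a single 100 m diameter dish
A = math.pi * 50.0**2             # collecting area, m^2
rate = flux * A / E_photon        # photons per second
print(f"{rate:.3g} photons/s")    # on the order of ten photons per second
```

Even under these generous assumptions, only a handful of photons per second arrive at a single large dish, which is why integration time and total collecting area, not just baseline, limit what you can see.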

I would think so. A city 10 km across at a distance of 5 light years would subtend an angle of

    [tex]\theta=5 \times 10^{-8}\ arcseconds[/tex]

or about five hundredths of a microarcsecond. The separation of the telescopes would then need to be

[tex]D=\frac{\lambda}{\theta}\simeq 0.3\ \frac{\lambda}{1\ cm}\ AU[/tex]

So at radio wavelengths of around a centimetre, you would need to separate them by roughly a third of the size of the Earth's orbit. Optical wavelengths would require a much smaller separation (a few thousand kilometres, about the size of a country), but it would be much harder to synchronize the phases. The size of the dishes required would depend on how much light the city was emitting and how long you observed it.
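The two estimates above can be reproduced with a few lines of Python. The numbers (10 km target, 5 ly distance, 1 cm and 500 nm wavelengths) are the ones used in this post; the constants are standard values:

```python
import math

# Angular size of a 10 km object at 5 light-years, and the interferometer
# baseline D ~ lambda / theta needed to resolve it at a given wavelength.

LY = 9.461e15             # metres per light-year
RAD_TO_ARCSEC = 206265.0  # arcseconds per radian
AU = 1.496e11             # metres per astronomical unit

def required_baseline(size_m, dist_m, wavelength_m):
    """Baseline needed so the diffraction limit matches theta = size / dist."""
    theta = size_m / dist_m          # angular size in radians
    return wavelength_m / theta      # metres

theta = 1e4 / (5 * LY)
print(f"theta = {theta * RAD_TO_ARCSEC:.1e} arcsec")   # ~5e-8 arcsec

D_radio = required_baseline(1e4, 5 * LY, 0.01)         # 1 cm radio
D_optical = required_baseline(1e4, 5 * LY, 500e-9)     # 500 nm optical
print(f"radio baseline:   {D_radio / AU:.2f} AU")      # ~0.3 AU
print(f"optical baseline: {D_optical / 1000:.0f} km")  # ~2400 km
```

This confirms the scaling: a centimetre-wave array needs a baseline about a third of an AU, while an optical interferometer would only need a country-sized one.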

    Constructing a lensing system out of black holes would, in principle, be subject to the same diffraction constraints as for an ordinary lens or mirror. The only advantage would be that you wouldn't have to synchronize the signals.

    I think you're effectively asking about the validity of electromagnetic theory on various scales. There's no such limit that I know of, but perhaps a particle physicist could give a better answer.
    Last edited: Jun 10, 2005
  4. Jun 10, 2005 #3


    Science Advisor
    Gold Member

Resolution worsens [hence the aperture must increase] as wavelength increases. Radio telescopes achieve far worse resolution than optical or shorter-wavelength instruments. The fact that you can make a radio dish out of chicken wire instead of polished glass should give you a rough idea of the order-of-magnitude difference in resolution.
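To put numbers on that order-of-magnitude difference, here is a quick sketch using the standard circular-aperture diffraction limit, theta ≈ 1.22 λ/D. The 100 m dish size is just an illustrative choice:

```python
# Diffraction-limited resolution theta ~ 1.22 * lambda / D for a single
# 100 m dish at three wavelengths, in arcseconds.

RAD_TO_ARCSEC = 206265.0
D = 100.0  # dish diameter in metres (illustrative)

for name, lam in [("21 cm radio", 0.21),
                  ("1 cm radio", 0.01),
                  ("500 nm optical", 500e-9)]:
    theta = 1.22 * lam / D * RAD_TO_ARCSEC
    print(f"{name:>15}: {theta:.2e} arcsec")
```

The same dish that delivers milliarcsecond-class resolution at optical wavelengths manages only hundreds of arcseconds at 21 cm, which is why radio astronomy leans so heavily on interferometry.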