Do we "lose" information when viewing objects at a distance?

  • Thread starter Josh S
  • #1
Main Question or Discussion Point

Hey Everyone! So disclaimer, I'm not involved with physics professionally or educationally, it's just a hobby I like to think about, so please excuse my ignorance :). Anyway, I had a thought, and I was wondering if you guys could elucidate it.

Imagine that we have a hypothetical telescope that is so accurate and so powerful that it allows us to see small pebbles on planets many light-years away. My question is, is there any distance/magnification at which we start to "lose" information? I.e., is the information (photons in this case) that would allow us to view a rock on a planet 40,000 LY away even available to us here on Earth?

And if not, do we know how much information we start to lose as things become more distant? Is there some kind of formula for this?
 

Answers and Replies

  • #2
phinds
Science Advisor
Insights Author
Gold Member
2019 Award
Light from a distant pebble reaches us only by being reflected off the pebble. A pebble reflects light in all directions, so the fraction of that light that actually arrives here gets smaller and smaller as the distance increases. How do you suppose that affects the answer to your question, "... is there any distance/magnification that we will start to 'lose' information at?" How do you suppose it affects your basic premise (that a telescope could even work at extreme distances)?

Also, some of the light waves from the pebble are likely to encounter space dust, etc., and the likelihood of that happening increases with distance. The light rays that make it to Earth also have to make it through the atmosphere. Why do stars "twinkle", and how does that affect your scenario?
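A rough back-of-the-envelope sketch of that inverse-square dilution. The pebble's photon output and the telescope aperture here are made-up but plausible numbers, not anything from the thread:

```python
import math

# Assumed numbers: a sunlit pebble might scatter on the order of
# 1e17 visible photons per second (a rough guess for illustration).
photons_per_second = 1e17      # assumed photon output of the pebble
distance_ly = 40_000           # distance from the thread's example
aperture_diameter_m = 10.0     # assumed telescope aperture

ly_in_m = 9.4607e15
d = distance_ly * ly_in_m

# Inverse-square law: the photons spread over a sphere of area 4*pi*d^2,
# and the telescope collects only the fraction falling on its aperture.
aperture_area = math.pi * (aperture_diameter_m / 2) ** 2
photons_collected_per_s = photons_per_second * aperture_area / (4 * math.pi * d ** 2)

print(f"photons collected: {photons_collected_per_s:.3e} per second")
```

With these assumptions the result comes out around 1e-24 photons per second, i.e. the expected wait for even a single photon from the pebble is vastly longer than the age of the universe, which is the point of the reply above.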
 
  • #3
You are just talking about optical resolution, I think.
For instance, the camera on NASA's Mars orbiter can resolve objects of around 1 meter in size on Mars's surface into a few pixels.
That's enough for the Curiosity rover to be identified, but not in any detail; it appears as a whitish blob.

Is your question asking whether there is a theoretical limit to optical resolution?
There almost certainly is, but for the time being telescope technologies are advancing rapidly.
 
  • #4
russ_watters
Mentor
Just to complete that last thought, optical resolution is a function of telescope diameter. So the technologies in question are focused on making bigger telescopes or arrays.
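A quick sketch of that diameter dependence, using the standard Rayleigh criterion. The wavelength and the Mars distance are assumed round numbers of my own choosing:

```python
import math

# Rayleigh criterion: the smallest angle a circular aperture can resolve
# is roughly theta = 1.22 * wavelength / diameter (in radians).
wavelength = 550e-9        # green light, metres (assumed)
mars_distance = 5.5e10     # Mars at a close approach, metres (rough)

for diameter in (0.1, 2.4, 10.0, 39.0):   # small scope up to ELT-class
    theta = 1.22 * wavelength / diameter
    # Smallest feature this could separate on Mars, from diffraction
    # alone (ignoring atmosphere, optics quality, exposure, etc.)
    feature = theta * mars_distance
    print(f"D = {diameter:5.1f} m -> theta = {theta:.2e} rad, "
          f"~{feature / 1000:.1f} km on Mars")
```

Even a 39 m aperture only gets down to roughly a kilometre on Mars from Earth, which is consistent with the earlier point that metre-scale detail of the rover needs a camera in Mars orbit.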
 
  • #5
Andy Resnick
Science Advisor
Education Advisor
Insights Author
Imagine that we have a hypothetical telescope that is so accurate and so powerful that it allows us to see small pebbles on planets many light-years away. My question is, is there any distance/magnification at which we start to "lose" information? I.e., is the information (photons in this case) that would allow us to view a rock on a planet 40,000 LY away even available to us here on Earth?

And if not, do we know how much information we start to lose as things become more distant? Is there some kind of formula for this?
Questions like this come up again and again. The problem (IMO) is that a qualitative answer is easy but a quantitative answer is very complicated.

Qualitative: yes, information is lost as light propagates from source to detector, independently of the optical system. This information loss is due to the interaction of light with the medium through which it propagates, and has many names: 'seeing' is one of the more common terms (https://www.repository.cam.ac.uk/handle/1810/251667), but you experience this any time you try to image through turbulence, for example:

http://www.cs.cmu.edu/~ILIM/projects/IM/turbulence.png

Even if the telescope were in space, etc., the light still interacts with the interstellar medium as it propagates, and so information is lost. Quantitatively, you need to figure out the 'cross-spectral coherence matrix' to calculate how much information is lost; the matrix is a measure of how much information you can extract from the optical field and reconstruct in an image.

http://www.nat.vu.nl/~tvisser/PinO.pdf

So quantitatively, information is always lost. Now, if you want to calculate (using the paraxial approximation) how large an entrance pupil you need to set the Rayleigh criterion to 'a stone's width 40,000 LY away', feel free to do so.
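Taking up that invitation with assumed numbers (a 5 cm pebble and green light are my choices, not the poster's), and using the Rayleigh criterion in the paraxial spirit suggested:

```python
import math

# Invert the Rayleigh criterion, D = 1.22 * wavelength / theta, to find
# the entrance pupil needed to resolve a pebble at 40,000 light years.
wavelength = 550e-9              # green light, metres (assumed)
pebble_size = 0.05               # assumed pebble width, metres
distance = 40_000 * 9.4607e15    # 40,000 ly in metres

theta = pebble_size / distance   # angle subtended by the pebble
diameter = 1.22 * wavelength / theta

print(f"theta ~ {theta:.2e} rad")
print(f"required aperture ~ {diameter:.2e} m "
      f"(about {diameter / 9.4607e15:.2f} light years across)")
```

With these assumptions the required pupil comes out around half a light year in diameter, which makes the "hypothetical telescope" rather hypothetical indeed.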
 
  • #6
I think all the previous answers are good, but there is one tiny bit that I think has been missed: the evanescent waves (https://en.wikipedia.org/wiki/Evanescent_field). The point is that even if you had no dust between your detectors and the distant object, and if you had perfect optics, and if you could capture all the light that has been reflected/scattered by the object, you would still lose information unless your detectors were within a sub-wavelength distance of the object. In optics this means you would have to be tens of nanometers away from the object.

The reason is that some of the light can never 'leave' the object, because it is attenuated by exponential (rather than polynomial) decay. Unless you are very close to the object, you will not pick up these exponentially attenuated waves. Imaging the object is therefore always affected by low-pass filtering due to the propagation of light. The evanescent waves are the first to go, then you lose more if there is dust, etc., and finally you lose due to the finite size of your telescope.
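A small sketch of that exponential decay, with an assumed wavelength and an assumed sub-wavelength feature size. A spatial frequency k_x above the free-space wavenumber k = 2π/λ cannot propagate; its field falls off as exp(-κz) with κ = sqrt(k_x² - k²):

```python
import math

wavelength = 500e-9          # assumed visible wavelength, metres
k = 2 * math.pi / wavelength

# A feature twice as fine as the wavelength supports k_x = 2 * k,
# which is beyond the propagating band, so it is evanescent.
k_x = 2 * k
kappa = math.sqrt(k_x ** 2 - k ** 2)   # decay constant, 1/m

for z in (10e-9, 50e-9, 200e-9):       # detector distances from surface
    attenuation = math.exp(-kappa * z)
    print(f"z = {z * 1e9:5.0f} nm -> field reduced to {attenuation:.2e}")
```

With these numbers the field is still mostly intact at 10 nm but is down to about a percent by 200 nm, which is why the post says the detector would have to sit tens of nanometers from the object.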
 
  • #7
I think the Cassini mission to the Saturn system returned a few interesting pics of the Earth-Moon system, detailed enough to see that they are separate objects.
 
