Is there a limit to the amount of info in reflected light?

AI Thread Summary
The discussion centers on the theoretical limits of information contained in reflected light from Earth and how it can be captured by optical devices. It highlights that while sunlight reflects off Earth, the energy density decreases with distance, affecting the amount of recoverable detail. The resolution limit is tied to the wavelength of light, making it theoretically possible to see bacteria, but practical limitations of current telescope technology prevent this. Atmospheric turbulence further complicates the ability to capture detailed images, as it degrades information, although techniques like adaptive optics can help recover some detail. Ultimately, the conversation explores the potential for future advancements in optical technology to enhance our ability to capture and interpret the information in reflected light.
DavidReishi
Sunlight hits our planet, for example, and reflects light outward back into space. Hence photos can be taken of Earth from outer space. But if we disregard technological limits to optics, etc., then in theory how much information does this reflected light contain? Is it rich enough, for instance, to contain microscopic details like bacteria? And is there a limit to the amount of detail that this reflected light contains? Further, does the amount of detail contained in this reflected light remain constant regardless of the distance that the light travels through space?
 
DavidReishi said:
Sunlight hits our planet, for example, and reflects light outward back into space. Hence photos can be taken of Earth from outer space. But if we disregard technological limits to optics, etc., then in theory how much information does this reflected light contain? Is it rich enough, for instance, to contain microscopic details like bacteria? And is there a limit to the amount of detail that this reflected light contains? Further, does the amount of detail contained in this reflected light remain constant regardless of the distance that the light travels through space?
The reflected light spreads out in all directions, so energy density falls with distance, and energy density limits how much information can be recovered. So by using a large telescope to look at Earth, more light can be collected and more detail obtained. But we cannot see things which are smaller than the "beamwidth" given by the telescope aperture. So it will be impossible to see bacteria, as they are even smaller than any telescope aperture.
 
The ultimate resolution limit is proportional to the wavelength of light - you can see bacteria, as they are larger than the wavelength of visible light (a few hundred nanometers). That is just a theoretical possibility, however - there is no practical way to make a telescope that captures a large fraction (>1%) of the light reflected by Earth, and focuses it in a coherent way.
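To put rough numbers on this diffraction limit, here is a minimal sketch (my illustration, not from the thread; it assumes λ = 500 nm and the Rayleigh criterion, and the apertures and distances are made-up examples):

```python
# Minimal sketch of the Rayleigh diffraction limit (assumed lam = 500 nm).
# Smallest resolvable feature at distance L through an aperture of diameter D:
#   theta = 1.22 * lam / D  (angular resolution, radians)
#   feature = theta * L     (linear size at the object)

def smallest_feature(aperture_m, distance_m, wavelength_m=500e-9):
    theta = 1.22 * wavelength_m / aperture_m
    return theta * distance_m

# A 10 m mirror looking down from a 400 km orbit resolves ~2.4 cm features:
print(smallest_feature(10.0, 400e3))    # ~0.024 m
# Aperture needed to resolve a 1 micron bacterium from that orbit:
print(1.22 * 500e-9 * 400e3 / 1e-6)     # ~244,000 m (a 244 km mirror)
```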
 
mfb said:
The ultimate resolution limit is proportional to the wavelength of light - you can see bacteria, as they are larger than the wavelength of visible light (a few hundred nanometers). That is just a theoretical possibility, however - there is no practical way to make a telescope that captures a large fraction (>1%) of the light reflected by Earth, and focuses it in a coherent way.
A microscope can see bacteria, as I agree they are larger than the wavelength of light, but my understanding is that a telescope cannot distinguish anything smaller than its own diameter.
 
DavidReishi said:
But if we disregard technological limits to optics, etc., then in theory how much information does this reflected light contain?

This question- imaging through a turbulent layer of atmosphere- is still an active subject of research. The turbulent air degrades information, but it is possible to recover some of it through speckle imaging, adaptive optics (wavefront sensing), etc.

Roggemann and Welsh's book is an excellent starting place:

https://www.crcpress.com/Imaging-Through-Turbulence/Roggemann-Roggemann-Welsh-Hunt/9780849337871
 
tech99 said:
but my understanding is that a telescope cannot distinguish anything smaller than its own diameter.

no, that isn't correct

I can aim my scope at all sorts of things around my neighbourhood and clearly resolve objects very much smaller than the scope's aperture
... insulators on power poles, a bug crawling up that same pole, leaves on trees ... the list is endless

EDIT: ohhh and to really go directly against your theory ...
I can increase the aperture of the scope and have even better resolution
This is common practice with telescopes, optical and radio.

Dave
 
Is it not so that the largest-scale distributions of galactic clusters are thought to be magnifications of quantum fluctuations during inflation? Information can seem to be meaningless, but it is still information.

 
tech99 said:
A microscope can see bacteria, as I agree they are larger than the wavelength of light, but my understanding is that a telescope cannot distinguish anything smaller than its own diameter.
Telescopes and microscopes are not so different. Our hypothetical oversized telescope would look more like a microscope, as the object it looks at is close to the telescope (relative to its size).
 
I appreciate all the responses, but I think there may be too much hang-up on telescopes. Let me try a different approach and first abstract from the distance/diffusion question if that's possible.

Google satellite images capture the human figure. But what about 250 years from now? What detail will technology be able to capture from the same distance? "But," you might say, "we don't know what kind of technology we'll have in 250 years...we might have completely different forms of light gathering."

Which brings us precisely to my point: the reflected light itself shooting off of the Earth and its objects, and how much detail is actually contained in it (i.e. regardless of whether the light is even "seen").
 
  • #10
DavidReishi said:
there may be too much hang-up on telescopes.
What else, other than a telescope, can be used to see details on the Earth from a distant point in space? If you want detail then you need a 'viewing device' (aka telescope) that has a large aperture. This is because diffraction will impose limits on the acuity of any optics, even when fancy image processing is used over long periods of time.
DavidReishi said:
The reflected light itself shooting off of the Earth and its objects, and how much detail is actually contained in it
The amount of 'recoverable' detail depends entirely on the system that's used to see it with and the amount of extraneous light that is interfering with the 'wanted' image.
 
  • #11
sophiecentaur said:
The amount of 'recoverable' detail depends entirely on the system that's used to see it...

Okay, so what if we put the system that's used to see it on a historically sliding scale, say from the beginning of telescopes to 10,000 years into the future (bear with me)... Doesn't that make it possible to ask the question: well, how much information is actually contained in the reflected light itself? It would be similar to asking: what is the limit of the 'system that's used to see it' beyond which it doesn't matter, because there's no finer detail contained in the light itself to be communicated?
 
  • #12
Also, we can say that the above context of planets, space and the sun is perhaps arbitrary...for now.

So that a question like the following can be asked first. If you stand 10 feet from me holding out a petri dish covered in bacteria, does the light that reflects off the petri dish and travels at 186,000 mi/sec to me contain detail of that bacteria? Forget whether I can see it with my naked eye... Is the information, i.e. light describing the bacteria as distinct from its background, etc., actually making it 10 feet to me? That is, in theory, could a strong enough optical device, even if not invented yet, allow me to see the bacteria?
 
  • #13
DavidReishi said:
Also, we can say that the above context of planets, space and the sun is perhaps arbitrary...for now.

So that a question like the following can be asked first. If you stand 10 feet from me holding out a petri dish covered in bacteria, does the light that reflects off the petri dish and travels at 186,000 mi/sec to me contain detail of that bacteria? Forget whether I can see it with my naked eye... Is the information, i.e. light describing the bacteria as distinct from its background, etc., actually making it 10 feet to me? That is, in theory, could a strong enough optical device, even if not invented yet, allow me to see the bacteria?

As I mentioned earlier, the answer depends on what the air is doing. The movement of air affects the optical path, and if the air is not moving in a deterministic fashion then information is irretrievably lost, at length scales determined by the motion.

Consider looking at something through thermal haze:

http://www.the-digital-picture.com/...mm-f-5-6.3-Di-VC-USD-Lens/Railroad-Bridge.jpg

No lens can 'undo' this type of image degradation. The best we can do is to use many images and computational approaches to guess what the undistorted image is. Looking down from space is easier than looking up from Earth, but I can't easily explain why.

As far as the question, 'Can I resolve a bacterium at 10 feet?' The answer is no. I can demonstrate this by the basic design parameters. Given a bacterium 1 x 3 microns (E. coli) located 3 meters away, my lens needs to have an angular resolution of approximately 1.1111 × 10^-7 radians (6.366×10^-6 degrees, 0.0229 arcsec), corresponding to a lens diameter of 5.5 meters (Rayleigh criterion). So that's kind of silly. But maybe we can be smart and use aperture synthesis to reduce the mass. What about the focal length?

The E. coli needs to span 2 x 6 pixels (since the goal is to resolve, not merely detect); using 3 micron pixels (small, but not unreasonable) gives a linear magnification of 6, and since the object distance is 3 meters, the image distance is 0.5 m, which gives a focal length of 0.43 meters. But maybe we can figure out how to make nm-scale detectors, which would help increase the focal length. Because right now our lens has a numerical aperture of 6.4, meaning we can't image in air. Which is what we wanted to do. So we have to turn to computational approaches, combining many 'partial' images to reconstruct the object field.

If you want to see small things, you have to put your lens close to them. You can be far away (and it's often better to be further away), but the lens itself has to be close.
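For anyone who wants to check the figures above, a short sketch (assuming λ = 500 nm and three resolution elements across the bacterium's 1 µm width, which reproduces the quoted numbers):

```python
import math

# Reproducing the design numbers above. Assumptions: lam = 500 nm, and the
# quoted 1.1111e-7 rad corresponds to three resolution elements across the
# bacterium's 1 micron width at 3 m.
lam = 500e-9            # wavelength, m (assumed)
feature = 1e-6 / 3      # one resolution element, m
L = 3.0                 # object distance, m

theta = feature / L                    # required angular resolution
print(theta)                           # ~1.1111e-7 rad
print(math.degrees(theta))             # ~6.366e-6 degrees
print(math.degrees(theta) * 3600)      # ~0.0229 arcsec
print(1.22 * lam / theta)              # Rayleigh aperture: ~5.5 m
```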
 
  • #14
Andy Resnick said:
As I mentioned earlier, the answer depends on what the air is doing. The movement of air affects the optical path, and if the air is not moving in a deterministic fashion then information is irretrievably lost, at length scales determined by the motion.

Consider looking at something through thermal haze:

http://www.the-digital-picture.com/...mm-f-5-6.3-Di-VC-USD-Lens/Railroad-Bridge.jpg

No lens can 'undo' this type of image degradation. The best we can do is to use many images and computational approaches to guess what the undistorted image is. Looking down from space is easier than looking up from Earth, but I can't easily explain why.

Let's make this a non-factor. Let's say, if I understand your terminology correctly, that it's more to detect than resolve, i.e. that distortion's fine.

As far as the question, 'Can I resolve a bacterium at 10 feet?' The answer is no. I can demonstrate this by the basic design parameters. Given a bacterium 1 x 3 microns (E. coli) located 3 meters away, my lens needs to have an angular resolution of approximately 1.1111 × 10^-7 radians (6.366×10^-6 degrees, 0.0229 arcsec), corresponding to a lens diameter of 5.5 meters (Rayleigh criterion). So that's kind of silly. But maybe we can be smart and use aperture synthesis to reduce the mass. What about the focal length?

What if we're talking not about lenses and the science that goes with them, but about some future and currently unfathomable means of collecting light. The point is, does the same degree of information seen under a microscope actually reach me physically (i.e. not according to perception but according to reality) in the light that travels 10 ft from the petri-dish to me?

If you want to see small things, you have to put your lens close to them. You can be far away (and it's often better to be further away), but the lens itself has to be close.

In older satellite images of Earth, you could make out many prominent features of the landscape, compared to which the human form would be regarded as a "small thing." Yet, nowadays, satellite images easily make out people. It's not that the satellites got closer. The actual information of the human form was in the light that reached the older satellites too. It was just the optical technology that lagged. Right?
 
  • #15
mfb said:
The ultimate resolution limit is proportional to the wavelength of light - you can see bacteria, as they are larger than the wavelength of visible light (a few hundred nanometers).

I didn't notice this before...it answers a large part of the question.

That is just a theoretical possibility, however - there is no practical way to make a telescope that captures a large fraction (>1%) of the light reflected by Earth, and focuses it in a coherent way.

What about simply a human form, as seen in today's satellite images of Earth? The reflected light from Earth that travels towards the optics of a satellite, insofar as it misses the satellite, keeps traveling. Is it not only a matter of technology how much farther and farther away the light could be collected and the same human form still be made out? In other words, does such detail keep traveling in the light itself?
 
  • #16
It is "just" a matter of technology, yes. A telescope twice the size at twice the distance can (in principle) see the same as the closer and smaller telescope. Same for a microscope.
 
  • #17
Previous comments are correct that the achievable resolution depends on the wavelength of the light and the ratio of the distance away to the aperture (diameter of the telescope lens), by Rayleigh's criterion (look for "Angular Resolution" in Wikipedia). However there is also the fact that in order to obtain an image one has to divide the detection area into pixels and effectively count the number of photons landing on each pixel with sufficient statistics to be able to resolve the required intensity variations.

If the photons arrived perfectly evenly in time then in order to resolve a 1% difference in intensity from one pixel to the next you should only need to wait for ~100 photons on each pixel. However their arrival rate is not steady but random (Poissonian), and so in order to have a 1 sigma likelihood of a 1% intensity resolution, you would need to wait for ~10,000 (=100^2) photons to arrive on each pixel (I think!). If you now allow say 6x2 pixels for your bacterium, then you need to wait for 10,000 photons from each of those 12 locations on the bacterium to be scattered from its surface in just the right direction to enter the focussing aperture (which then steers them to land on the right pixel). So exposure time (or shutter speed) sets another fundamental limit to the information that you can obtain from reflected light.

Knowing the brightness of illumination, and knowing the percentage of scattered light that will enter the aperture of your telescope, will allow you to work out how long you have to wait in order to obtain a 1% (~7 bit) brightness resolution from your (so-far) perfect photon detecting sensor array. If your sensor is imperfect and generates approximately the same rate of random thermal activations as the real photon detections, then you will need to wait 4 times longer to obtain the same intensity resolution (I think!). If your random non-photon detection rate (thermal noise in the sensor) is significantly greater than the rate of arrival of signal photons, then you may never be able to resolve an image no matter how long you wait. This is a practical (non-fundamental) limit to your information collection.

So the diameter of a single aperture (or the distance apart of multiple apertures) together with the wavelength determines the angular resolution of your imaging device, while the illumination of the object, the total collecting area of the lens, and the efficiency of your pixel detectors determines how long you will have to wait before an acceptable image can be built up from randomly scattered and randomly arriving photons.
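A minimal sketch of the Poisson-counting argument above (my numbers; the 1-sigma fractional error on N counted photons is 1/√N):

```python
import math

# Poisson counting: 1-sigma fractional error on N photons is 1/sqrt(N),
# so resolving a fractional intensity difference e needs N ~ 1/e^2 photons.

def photons_needed(fractional_error):
    return math.ceil(1.0 / fractional_error**2)

print(photons_needed(0.01))         # 10,000 photons per pixel for 1%
print(12 * photons_needed(0.01))    # ~120,000 photons for a 6x2 pixel bacterium

# With sensor dark counts arriving at roughly the signal rate, the variance
# of the background-subtracted signal grows, so the wait is several times
# longer (the exact factor depends on how the background is estimated).
```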
 
  • #18
Fortunately, photon statistics and angular resolution scale in the same way - at twice the distance the photon flux per area reduces by 1/4, but to keep the angular resolution your mirror area goes up by a factor of 4 as well.
A different way to see this: for a fixed angular resolution you have to cover a fixed fraction of the solid angle, which means you capture a fixed fraction of the emitted photons. Noise is certainly a problem, but illumination should not be an issue: a bacterium in sunlight emits about 10 million photons per (100 nm)² per second. Collect 1% of them and you get 1% resolution within 1/10 of a second.
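Plugging the quoted flux into the earlier Poisson requirement gives the 1/10 s figure (a sketch using the numbers from this post):

```python
# Time to reach 1% (1-sigma) Poisson resolution on one (100 nm)^2 patch,
# using the flux quoted above.
rate_per_patch = 1e7        # photons per (100 nm)^2 per second (from the post)
collected_fraction = 0.01   # collect 1% of the emitted photons
photons_needed = 1e4        # 1/(0.01)^2 photons for 1% resolution

t = photons_needed / (rate_per_patch * collected_fraction)
print(t)                    # 0.1 s, matching the estimate above
```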
 
  • #19
A good calculation. But if you allow multiple apertures - as is done with very long baseline interferometry at microwave frequencies (using the diameter of the Earth as the baseline), and has been done over much lesser distances in the optical - then angular resolution is independent of photon statistics. Collecting 1% of the light scattered from a bacterium using a lens on a satellite or as the OP suggested in "outer space" would be problematic to say the least :smile:
 
  • #20
davenn said:
no, that isn't correct

I can aim my scope at all sorts of things around my neighbourhood and clearly resolve objects very much smaller than the scope's aperture
... insulators on power poles, a bug crawling up that same pole, leaves on trees ... the list is endless

EDIT: ohhh and to really go directly against your theory ...
I can increase the aperture of the scope and have even better resolution
This is common practice with telescopes, optical and radio.

Dave
Davenn, but if the telescope is focused at infinity, then a tiny lamp at the focus produces a parallel beam the diameter of the aperture, and the converse happens when it is "receiving".
I agree that if focussed closer than infinity it can resolve small objects. A radio telescope cannot resolve something smaller than its beamwidth (however that is defined).
 
  • #21
DavidReishi said:
Let's make this a non-factor. Let's say, if I understand your terminology correctly, that it's more to detect than resolve, i.e. that distortion's fine.

What if we're talking not about lenses and the science that goes with them, but about some future and currently unfathomable means of collecting light. The point is, does the same degree of information seen under a microscope actually reach me physically (i.e. not according to perception but according to reality) in the light that travels 10 ft from the petri-dish to me?

In older satellite images of Earth, you could make out many prominent features of the landscape, compared to which the human form would be regarded as a "small thing." Yet, nowadays, satellite images easily make out people. It's not that the satellites got closer. The actual information of the human form was in the light that reached the older satellites too. It was just the optical technology that lagged. Right?

You are clearly trying to get a specific answer. The correct answer is 'no'. Yes, we can make better optics now than in the past. But there are hard physical limits on what is possible, and I've spelled out a few of them for you. Do you think optics is exempt from physical laws?
 
  • #22
jwinter said:
If the photons arrived perfectly evenly in time then in order to resolve a 1% difference in intensity from one pixel to the next you should only need to wait for ~100 photons on each pixel. However their arrival rate is not steady but random (Poissonian), and so in order to have a 1 sigma likelihood of a 1% intensity resolution, you would need to wait for ~10,000 (=100^2) photons to arrive on each pixel (I think!). If you now allow say 6x2 pixels for your bacterium, then you need to wait for 10,000 photons from each of those 12 locations on the bacterium to be scattered from its surface in just the right direction to enter the focussing aperture (which then steers them to land on the right pixel). So exposure time (or shutter speed) sets another fundamental limit to the information that you can obtain from reflected light.

This is mostly well-reasoned. Now calculate how much time it takes to gather those photons given illumination conditions equivalent to solar illuminance (hint: not very long). Exposure time is not relevant here.
 
  • #23
So much has been said that I'd reply to if I wasn't so overwhelmed...but please know that I'm reading the posts thoroughly and in many cases multiple times.

Andy Resnick said:
You are clearly trying to get a specific answer. The correct answer is 'no'. Yes, we can make better optics now than in the past. But there are hard physical limits on what is possible, and I've spelled out a few of them for you. Do you think optics is exempt from physical laws?

No, I don't think that. But, again, I'm not really talking about the optics side of it, but rather about the light itself, and the detail contained in it physically as it travels.

Let's go back to the petri-dish with bacteria held 10 feet from me. Someone else stands to the side and blocks me from seeing the petri-dish with a placard...moving the latter to allow me only the quickest glance of the dish. During that glance, does the light that reflects off the petri-dish, physically travels over to me, and hits me in the face, actually contain in it the same amount of information as that found under a microscope? (...forgetting for a moment that microscopes have their own light-source) Or, during the light's travel to me, is the fine detail that the light contains physically lost from it?
 
  • #24
More time allows us to gather more information. No information is lost in the propagation of light - at least not in theory (this is a very fundamental concept of physics, and not limited to light). In practice, reconstructing what happened gets harder over time and distance (this is another very fundamental concept of physics - entropy).
 
  • #25
Andy Resnick said:
This is mostly well-reasoned. Now calculate how much time it takes to gather those photons given illumination conditions equivalent to solar illuminance (hint: not very long). Exposure time is not relevant here.
If you are attempting to view the Earth from outer space as the OP suggested, with the hope of imaging a bacterium, using any possible-to-create arrangement of widely spaced lensing (to get the resolution), then I think exposure time is going to become relevant!
 
  • #26
mfb said:
No information is lost in the propagation of light - at least not in theory (this is a very fundamental concept of physics, and not limited to light). In practice, reconstructing what happened gets harder over time and distance (this is another very fundamental concept of physics - entropy).

This makes no sense. No information is lost in the propagation of light through vacuum - which is facile and not what this thread is about. The situation discussed here violates the assumptions used in the van Cittert-Zernike theorem.

Information loss is a fundamental aspect of diffuse propagation; it is not something that can be abstracted away. Information is always lost in the propagation of light through any medium. Propagation through fluid media introduces additional loss mechanisms - information loss that cannot be recovered. This may not be introductory physics level material, but it's a very fundamental aspect of physics.

https://www.osapublishing.org/ol/abstract.cfm?uri=ol-28-13-1078
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.174.6512&rep=rep1&type=pdf
https://www.osapublishing.org/josa/abstract.cfm?uri=josa-53-3-317
http://www.intechopen.com/download/pdf/28319
 
  • #27
DavidReishi said:
But, again, I'm not really talking about the optics side of it, but rather about the light itself, and the detail contained in it physically as it travels.

Why do you think the propagation of light through a lossy/scattering medium is not accompanied by a loss of information?
 
  • #28
jwinter said:
If you are attempting to view the Earth from outer space as the OP suggested, with the hope of imaging a bacterium, using any possible-to-create arrangement of widely spaced lensing (to get the resolution), then I think exposure time is going to become relevant!

There is no possible arrangement of lenses. This thread is becoming foolish and devoid of science.
 
  • #29
Andy Resnick said:
Why do you think the propagation of light through a lossy/scattering medium is not accompanied by a loss of information?

This thread is becoming foolish and devoid of science.

I'm sorry if you're becoming frustrated. But these are questions about light and its transmission of information that I'd really like help answering.

I don't think that the propagation of light through a lossy/scattering medium is not accompanied by a loss of information. I don't know what gave you the impression that I thought that. In fact, for the sake of my inquiry I'm fine with the idea of loss of information...as long as it's not total loss of information.

Anyway, what about our petri-dish situation? Is loss of information really an issue at 10 feet? I mean, the light traveling to me from the dish allows my eyes to make out the outline of the dish with perfect clarity. So why would loss be an issue when it comes to the much smaller bacteria at the center of the plate? The visual information of the bacteria, contained in the light, is hopelessly scattered and diffused, but the visual information of the dish's form is transmitted to me, for practical purposes, perfectly?
 
  • #30
DavidReishi said:
I'm sorry if you're becoming frustrated. But these are questions about light and its transmission of information that I'd really like help answering.

I'm becoming frustrated because from my perspective, you have put *zero* effort into understanding my answer.

DavidReishi said:
In fact, for the sake of my inquiry I'm fine with the idea of loss of information...as long as it's not total loss of information.

Obviously the information content does not undergo a discrete change from 100% to 0%. The process is well described by the book I referenced and several of the papers I referenced. Have you done *any* independent reading? Even a book covering fiber optic communications will have material that is applicable here. What exactly have you done to learn the relevant material?

DavidReishi said:
Anyway, what about our petri-dish situation? Is loss of information really an issue at 10 feet? I mean, the light traveling to me from the dish allows my eyes to make out the outline of the dish with perfect clarity. So why would loss be an issue when it comes to the much smaller bacteria at the center of the plate? The visual information of the bacteria, contained in the light, is hopelessly scattered and diffused, but the visual information of the dish's form is transmitted to me, for practical purposes, perfectly?

Do you understand what the Abbe limit means? Do you understand the idea of spatial frequencies, and how that is used to describe blurring? Do you understand how information is encoded in an electromagnetic field?

You began by asking a perfectly valid question, but you have not tried to understand the answer.
 
  • #31
Andy Resnick said:
I'm becoming frustrated because from my perspective, you have put *zero* effort into understanding my answer.

I'm sorry if it seemed like that.

Obviously the information content does not undergo a discrete change from 100% to 0%. The process is well described by the book I referenced and several of the papers I referenced. Have you done *any* independent reading? Even a book covering fiber optic communications will have material that is applicable here. What exactly have you done to learn the relevant material?

Well today I read what I could find about the ability of smaller wavelengths of light, i.e. smaller than bacteria, to penetrate the atmosphere. I was thrilled to learn that the Sun gives off X-rays, but then hugely disappointed to learn, according to NASA, that the eighteen miles of Earth's atmosphere blocks more than 99% of those rays from making it to the Earth's surface.

Do you understand what the Abbe limit means? Do you understand the idea of spatial frequencies, and how that is used to describe blurring? Do you understand how information is encoded in an electromagnetic field?

No, no, no, and no. Are those decisive issues in our 10 ft petri-dish demo? If so, do you mind telling me how?

You began by asking a perfectly valid question, but you have not tried to understand the answer.

I promise, you give me something I can recognize as an answer, and I'll try my hardest to understand it.
 
  • #32
Andy Resnick said:
Information loss is a fundamental aspect of diffuse propagation; it is not something that can be abstracted away. Information is always lost in the propagation of light through any medium. Propagation through fluid media introduces additional loss mechanisms - information loss that cannot be recovered. This may not be introductory physics level material, but it's a very fundamental aspect of physics.
To lose information, you would need several different initial states to end up in the same final state. It would also be a direct violation of CPT symmetry. If you find any situation like that, go and take the Nobel Prize(s)! For black holes this triggered decades of discussion, but without black holes everyone agrees that no information is lost.
Information is lost in a practical sense - it gets too hard to recover experimentally. That's what I wrote. But the information is never lost completely, it is just a recovery problem.
 
  • #33
Andy Resnick said:
There is no possible arrangement of lenses. This thread is becoming foolish and devoid of science.
I accept you would need more than just lenses, since the light from the widely separated detector apertures needs to be combined in a manner which can interfere constructively and destructively (i.e. phase preserving). But it is fundamentally do-able (see "astronomical optical interferometry" in Wikipedia). This is real science.

As I mentioned previously, this is regularly done in the microwave regime using a large fraction of the diameter of the Earth as the baseline and is called "very long baseline interferometry". There is no reason why the same technique could not be done from satellites with much wider separation in space, and there is no fundamental reason why the same technique could not be applied in the optical regime. I believe it should be possible to achieve with technology already available - and photon statistics will be the fundamental problem in obtaining useful results.
 
  • #34
jwinter said:
I accept you would need more than just lenses, since the light from the widely separated detector apertures needs to be combined in a manner which can interfere constructively and destructively (i.e. phase preserving). But it is fundamentally do-able (see "astronomical optical interferometry" in Wikipedia). This is real science.

As I mentioned previously, this is regularly done in the microwave regime using a large fraction of the diameter of the Earth as the baseline and is called "very long baseline interferometry". There is no reason why the same technique could not be done from satellites with much wider separation in space, and there is no fundamental reason why the same technique could not be applied in the optical regime. I believe it should be possible to achieve with technology already available - and photon statistics will be the fundamental problem in obtaining useful results.
If we are in the Radiation Near Field of the array, within the Rayleigh Distance (D²/2λ), we can resolve objects smaller than the array. So yes, an array consisting of two satellites could resolve small objects on Earth. But if we are in the Radiation Far Zone (beyond the Rayleigh Distance), we cannot resolve objects smaller than the aperture. This is because, as Rayleigh pointed out, there is a maximum distance at which a lens or array can be focussed to a point, and beyond this the best we can achieve is to focus at infinity and accept that the beam starts off parallel and then diverges in accordance with diffraction theory. The principle applies also to a large array, such as two satellites.
As far as loss of information is concerned, neglecting atmospheric disturbance, if we completely surround Earth with an antenna, yes we could catch all the photons. But a finite size aperture can only catch a sample of the photons, so there is a loss of information.
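To put numbers on that Rayleigh distance (a sketch with assumed values: λ = 500 nm and two example spans):

```python
# Rayleigh distance D^2 / (2 * lam): within it, an aperture or array of
# span D is in its radiation near field and can focus to a spot smaller
# than itself. Assumed lam = 500 nm; the spans are example values.

def rayleigh_distance(span_m, wavelength_m=500e-9):
    return span_m**2 / (2 * wavelength_m)

print(rayleigh_distance(5.5))     # 5.5 m lens: ~3.0e7 m (~30,000 km)
print(rayleigh_distance(100.0))   # 100 m satellite array: ~1.0e10 m
# Both far exceed a 400 km orbit, so such an array observing Earth would be
# focusing in its near field, consistent with the post above.
```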
 
  • #35
jwinter said:
As I mentioned previously, this is regularly done in the microwave regime using a large fraction of the diameter of the Earth as the baseline and is called "very long baseline interferometry".
The comparison is a bit unfair, as VLBI looks for coherent sources of radiation. That is fine for microwaves, but a bacterium is not a laser - it won't emit coherent radiation. You would need fancy optics, but on a smaller scale, we have that already. A VLT-like telescope in low Earth orbit (~400 km), neglecting issues with orbital motion (horribly unrealistic) and the atmosphere, could resolve structures as small as 0.2 millimeters on the ground. While it won't make nice pictures, it could see some structure in two large adjacent bacteria of type Thiomargarita namibiensis.
 
  • #36
mfb said:
The comparison is a bit unfair, as VLBI looks for coherent sources of radiation.
I don't think this is true (but I could be wrong) - it depends what you mean by "coherent". VLBI accepts and interferes a large band of microwave frequencies, not just a single laser-like source. True, it is a very narrow band compared to light, but not particularly narrow for its microwave regime. The two sources are "coherent" simply because they both come from the same small area of the sky.
mfb said:
... a bacterium is not a laser - it won't emit coherent radiation.
Provided the light is being emitted from the same (small) source area, I think it must be coherent? White light coming from a single-slit (or a bacterium-sized pin-hole!) is coherent enough to interfere when passed through a following double-slit. Incredible as it may seem, light from opposite sides of a distant star is also coherent enough to be interfered (try explaining that!).
mfb said:
A VLT-like telescope ...
Yes! A VLT-like telescope in space with very large separation between the detecting apertures was one idea I had in mind. Thanks for your support! Another possibility would be to mix the received light with that from a laser comb generator so as to "mix it down" to a bunch of microwave frequencies, after which it can be digitized and interfered with its partner detector(s) computationally - as is done with VLBI. I believe this approach is somewhat within reach of today's technology.

As mentioned previously, the angular resolution of such an array is governed by Rayleigh's criterion where the "diameter" in the equation is the separation of the satellites.
 
  • #37
The bacterium emits incoherent light - the coherence length is of the order of the wavelength of the light itself. If we combine the light of multiple mirrors with the same path length, then we can get interference. Get the path length wrong and you won't gain anything from the combination (apart from more light).
This is different with VLBI, where you can record the phase, store it digitally and then combine it in a computer. You won't be able to do that with incoherent visible light, try to measure the phase and you ruin coherence between mirrors.
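A rough number for that coherence length (a sketch; l_c ≈ λ²/Δλ, with assumed values for broadband white light):

```python
# Coherence length l_c ~ lam^2 / delta_lam for broadband light.
# Assumed values: center wavelength 550 nm, bandwidth 300 nm (white light).
lam = 550e-9      # center wavelength, m
dlam = 300e-9     # spectral bandwidth, m

l_c = lam**2 / dlam
print(l_c)        # ~1.0e-6 m: roughly two wavelengths, so the path lengths
                  # of combined mirrors must match to well under a micron
```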
 
  • #38
mfb said:
The bacterium emits incoherent light - the coherence length is of the order of the wavelength of the light itself. If we combine the light of multiple mirrors with the same path length, then we can get interference. Get the path length wrong and you won't gain anything from the combination (apart from more light).
This is different with VLBI, where you can record the phase, store it digitally and then combine it in a computer. You won't be able to do that with incoherent visible light, try to measure the phase and you ruin coherence between mirrors.
I don't believe it is any different to VLBI. In the case of VLBI you record the wideband (= incoherent by your definition?) microwave signal at two or more globally spaced locations together with an atomic clock signal to allow them to be resynchronized for interference calculations later on. The path length is adjusted to be identical just by getting the synchronization exactly right. Neighbouring pixel data is obtained by stepping the synchronization very slightly for each interference accumulation calculation. As I recall you don't even have to know the synchronization very exactly to start with - just sweep it through the right region and the image jumps out at you when you get it right! An extremely stable clock signal is necessary however because the signal is very weak and noisy and so has to be coherently integrated for a relatively long time.

The process of "mixing" light down by multiplying it with very stable carrier light source (laser comb) converts it into the microwave realm while preserving the phase and gives one a synchronizing signal to allow the "path length" to be adjusted exactly right in later calculations. At this point the signal can be recorded electronically and exactly the same process as is done for VLBI can be followed to obtain the image for each microwave bandwidth limited record. Integrating the many images for each small (microwave wide) segment of the optical bandwidth will give a final complete optical image.
 
  • #39
If the microwave signal is incoherent, adding it does not increase the resolution, because the phases your different telescopes measure are not correlated then.
 
  • #40
Since VLBI obviously works, the microwave signal arriving at single-sensor-detectors, positioned at globally spaced locations, and pointed at a distant patch of sky that is many thousands of light-years wide must then be coherent. No?

Since white light from a slit (much bigger than a bacterium), that then diverges into two slits, can be brought back together to interfere, it must also be coherent?

Since the light from a bacterium that diverges widely into a microscope objective lens, can then be brought back together (constructively and destructively combining) to form an image, all of those rays must also be coherent!
 
  • #41
tech99 said:
A radio telescope cannot resolve something smaller than its beamwidth (however that is defined).
The aperture is a major part of what governs the resolving power, but a radio telescope can be much better than just a paraboloid reflector. In fact, the disadvantage that radio astronomy has, due to the large apertures required, is partially offset by the fact that the amplitude and phase of microwave signals can be dealt with by electronics, and the result is that a given aperture can produce better results. (They punch above their weight.) So I would say that the above statement is probably more appropriate for optical telescopes than for radio telescopes.
 
  • #42
jwinter said:
Since VLBI obviously works, the microwave signal arriving at single-sensor-detectors, positioned at globally spaced locations, and pointed at a distant patch of sky that is many thousands of light-years wide must then be coherent. No?
The sources must emit coherent radiation. There can be many individual sources emitting radiation that is coherent. Like many individual lasers.
jwinter said:
Since white light from a slit (much bigger than a bacterium), that then diverges into two slits, can be brought back together to interfere, it must also be coherent?
No - with sunlight (for example) you have to be very careful and overlap the actual radiation (not your measurement values) to see interference. VLBI would not work that way, you would need a worldwide network of RF waveguides, carefully designed to avoid losing coherence.
jwinter said:
Since the light from a bacterium that diverges widely into a microscope objective lens, can then be brought back together (constructively and destructively combining) to form an image, all of those rays must also be coherent!
No, it is similar to the white light.
 
  • #43
mfb said:
To lose information, you would need several different initial states to end up in the same final state. It would also be a direct violation of CPT symmetry. If you find any situation like that, go and take the Nobel Prize(s)!

Now you are just being unreasonable. Google 'diffusion' and "Gibbs paradox". The loss of information is non-reversible.
 
  • #44
DavidReishi said:
I'm sorry if it seemed like that.
No, no, no, and no. Are those decisive issues in our 10 ft petri-dish demo? If so, do you mind telling me how?

I'll take you at your word- let's start over, ok?

First, yes- those three related buzzwords are fundamental concepts that directly address your ability to sufficiently (accurately?) image an object, regardless of your method of imaging. So let's start there, and proceed a little bit at a time.

Please carefully define what "the visual information of the dish's form" means. I'll start you off- "the dish's form is represented by a 3-D optical field, created when incident light illuminates and scatters off of the dish. The 3-D field can be modeled with Kirchhoff's diffraction formula, considering the dish as the illuminated aperture". Now you go from there to "the visual information of the dish's form".

Once you have a quantitative way to describe that information, consider the concept of 'angular spectrum', and think about how that relates to diffraction. What kind of information is diffracted into large angles? What kind of information is diffracted into small angles? As a related topic, think about what Laue/Bragg patterns are and how those images are used to obtain information about crystal structure.

After you have done that, then use Abbe's or Rayleigh's limit to calculate how much of the scattered light, diffracting into the full hemisphere, must be collected to resolve various aspects of the visual information of the dish's form.
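As one concrete entry point to that exercise, a sketch (my assumed numbers) relating the Abbe limit to how much of the scattered hemisphere the optics must collect:

```python
import math

# Abbe limit d = lam / (2 * NA), and the fraction of a hemisphere of
# scattered light collected by a lens of numerical aperture NA in a medium
# of index n: solid-angle fraction = 1 - cos(asin(NA / n)).
# Assumed lam = 500 nm, imaging in air (n = 1).

def abbe_limit(na, wavelength_m=500e-9):
    return wavelength_m / (2 * na)

def hemisphere_fraction(na, n=1.0):
    alpha = math.asin(na / n)       # collection half-angle
    return 1 - math.cos(alpha)

for na in (0.1, 0.5, 0.95):
    print(na, abbe_limit(na), hemisphere_fraction(na))
# NA 0.1 resolves ~2.5 um but collects only ~0.5% of the hemisphere;
# NA 0.95 resolves ~0.26 um but must collect ~69% of it.
```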

That's enough for now, I think...
 
  • #45
jwinter said:
I accept you would need more than just lenses, since the light from the widely separated detector apertures needs to be combined in a manner which can interfere constructively and destructively (i.e. phase preserving).

This is exactly what I posted in #5 and #13 of this thread. Glad we are in agreement!
 
  • #46
Andy Resnick said:
Now you are just being unreasonable. Google 'diffusion' and "Gibbs paradox". The loss of information is non-reversible.
It is not a loss of information. It is a loss of accessible information. If you can prove otherwise, go and get the Nobel Prize. Seriously.

No-cloning theorem and no-deleting theorem tell us that information fundamentally is conserved, and all experiments are in agreement with that.
 
  • #47
mfb said:
The bacterium emits incoherent light - the coherence length is of the order of the wavelength of the light itself. If we combine the light of multiple mirrors with the same path length, then we can get interference. Get the path length wrong and you won't gain anything from the combination (apart from more light).
This is different with VLBI, where you can record the phase, store it digitally and then combine it in a computer. You won't be able to do that with incoherent visible light, try to measure the phase and you ruin coherence between mirrors.
I think the problems mentioned for incoherent light are not fundamental but just problems of implementation of the telescope, which relies on memory. In a general case, if we consider a source which is modulated with noise, a conventional antenna array can still image it, because all elements of the array receive the signal with an identical modulation envelope but just the "carrier" phase differs over 0 - 360 degrees depending on direction.
For an object smaller than the resolution of the telescope, the fact that different parts of the object's surface radiate incoherently is not important, because the distant telescope sees the vector sum, and so it sees a single noise modulated source. For example, a filament lamp can be located by a telescope.
 
  • #48
tech99 said:
For example, a filament lamp can be located by a telescope.
Sure it can be located, but you don't gain much in angular resolution if you take multiple pictures of it at different locations and combine them later (assuming the filament lamp is located so far away that triangulation does not work). To gain in resolution you need the light coming from the lamp to interfere while taking a single combined picture.
 
  • #49
tech99 said:
I think the problems mentioned for incoherent light ...
There is no problem with incoherent light, because light from the same small area, or small angle of view, is coherent. It just has a short coherence length - of the order of the wavelength of light (or of microwaves if we are doing VLBI) - as has been pointed out by others. But that doesn't prevent interference. As we well know, when light is focussed (so that divergent rays are brought back to the same spot over equal path length) we get a good image - i.e. the rays interfere.

There is not a laser (or maser) in space for every pixel on a VLBI image! The microwaves from every pixel area in space are just as "incoherent" as the light from every pixel area on a bacterium. If you "mix" (as in heterodyne - which is phase preserving) light frequencies down to microwave (with a very stable laser comb), then in principle the same process can be done with light as is routinely done with microwaves, and whatever is done obviously works. Others will have to work out how it works for themselves because I am tired of trying to explain things on this thread.
 
  • #50
mfb said:
No-cloning theorem and no-deleting theorem tell us that information fundamentally is conserved, and all experiments are in agreement with that.

Sigh... the experiments discussed here are inverse scattering problems; the above don't apply when the initial state cannot be completely specified.
 