A Question about Imaging and PSFs

  • Thread starter cepheid
  • Tags
    Imaging
  • #1
cepheid
Hi,

If every point in the image plane is convolved with the PSF, why is it that this is only obvious in certain cases?

Take astronomical imaging: for images of bright point sources (e.g., the brightest stars), we see rings, spikes etc. Why do we not see these features for dimmer stars? Furthermore, what about images of extended objects? Why is it that galaxies and nebulae look fine, and don't look like some sort of blurred mess?

Also, what is it fundamentally about everyday/terrestrial imaging that makes it so that these concerns don't seem to matter at all? Why is it that I can feel confident that more pixels = a sharper image, without having to worry about the actual *optics?* One would think that the minuscule lenses included with ever-smaller consumer digital electronics would offer pretty lousy angular resolution.
 
  • #2
cepheid said:
If every point in the image plane is convolved with the PSF, why is it that this is only obvious in certain cases?
It's only a problem for diffraction-limited images, and only obvious for points on a dark background.
Take astronomical imaging: for images of bright point sources (e.g., the brightest stars), we see rings, spikes etc. Why do we not see these features for dimmer stars?
Because they are below the detection limit of the detector. If only 0.1% of the energy goes into the spikes, you might see it for a 6th-magnitude star but not for a 25th-magnitude galaxy.
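The magnitude arithmetic behind this can be checked directly; a quick sketch (the 0.1% figure is the illustrative number from above, not a measured value):

```python
import math

def flux_ratio(m_bright, m_faint):
    """Flux ratio between two objects from their magnitudes
    (each 5 magnitudes of difference is a factor of 100 in flux)."""
    return 10 ** (0.4 * (m_faint - m_bright))

# A 6th-magnitude star vs a 25th-magnitude galaxy:
ratio = flux_ratio(6, 25)
print(f"star/galaxy flux ratio: {ratio:.3g}")  # ~4e7

# Even if only 0.1% of the star's light lands in the spikes,
# that spike flux still dwarfs the entire galaxy's flux:
spike_vs_galaxy = 0.001 * ratio
print(f"spike flux / galaxy flux: {spike_vs_galaxy:.3g}")  # ~4e4
```

So the star's faint diffraction spikes sit tens of thousands of times above the galaxy's total signal, while the corresponding spikes of a faint source fall far below the detector's noise floor.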
Furthermore, what about images of extended objects? Why is it that galaxies and nebulae look fine, and don't look like some sort of blurred mess?
They are a blurred mess at scales below an arcsecond.
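A minimal numerical illustration of why a point source and an extended object look so different after convolution. This uses a hypothetical Gaussian PSF as a stand-in for a real diffraction pattern, with all sizes in pixels chosen purely for illustration:

```python
import numpy as np

N = 64
y, x = np.mgrid[:N, :N]
r2 = (x - N // 2) ** 2 + (y - N // 2) ** 2

# Hypothetical Gaussian PSF (stand-in for a diffraction pattern),
# normalised to unit total energy.
sigma_psf = 2.0
psf = np.exp(-r2 / (2 * sigma_psf**2))
psf /= psf.sum()

def convolve(img, psf):
    """Circular convolution via the FFT: every pixel smeared by the PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(psf))))

# A lone point source vs an extended blob much wider than the PSF.
point = np.zeros((N, N)); point[N // 2, N // 2] = 1.0
extended = np.exp(-r2 / (2 * 12.0**2))

point_img = convolve(point, psf)
extended_img = convolve(extended, psf)

# The point source is reshaped completely by the PSF (its peak drops
# to the PSF's peak value); the extended object, already far wider
# than the PSF, barely changes.
print(point_img.max())                       # light spread over many pixels
print(extended_img.max() / extended.max())   # close to 1: blur hardly visible
```

The point source's image *is* the PSF, so any structure in the PSF is fully on display; for the extended object the PSF only softens sub-PSF-scale detail, which is exactly the "blurred mess below an arcsecond" point.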

Also, what is it fundamentally about everyday/terrestrial imaging that makes it so that these concerns don't seem to matter at all? Why is it that I can feel confident that more pixels = a sharper image, without having to worry about the actual *optics?*
The optics generally aren't diffraction limited, and the scenes are normally confused enough that you don't notice the artifacts. If you are one of the sad bores on photo forums who look at individual pixels in photos of test charts to prove your camera is best, you will.
One would think that the miniscule lenses included with ever smaller consumer digital electronics would offer pretty lousy angular resolution.
They are pretty bad, but the resulting blur, combined with heavy JPEG compression, means you don't see the effects.
 
  • #3
Thank you for the explanations mgb_phys. I just wanted to see if I could trust the physics and apply it to the situation in as straightforward a way as I was attempting to.

I guess for terrestrial imaging, it is, as you said, a question of not ever really having to worry about the kind of angular resolution that you need in astronomy. Nobody worries about why you can't see individual trees in a forest tens of kilometres away. Things look reasonable, like the way you'd expect them to look.

One more thing, if I may. You mentioned that the scenes (obviously much busier than a bunch of bright points on a dark background) are "confused." What exactly does that mean? I have some vague idea that the confusion limit occurs when you're looking deep enough that you see so many sources, it is impossible to distinguish them from the background noise (again speaking in an astronomy-specific context, sorry).
 
  • #4
I didn't mean it in a technical sense. I meant that a random background of trees, people, etc. disguises obvious optical aberrations, whereas bright point sources on an empty background emphasize them.
 
  • #5
Right okay...that makes sense. Thanks for the clarification.
 
  • #6
cepheid said:
Hi,

If every point in the image plane is convolved with the PSF, why is it that this is only obvious in certain cases?

Take astronomical imaging: for images of bright point sources (e.g., the brightest stars), we see rings, spikes etc. Why do we not see these features for dimmer stars? Furthermore, what about images of extended objects? Why is it that galaxies and nebulae look fine, and don't look like some sort of blurred mess?

Also, what is it fundamentally about everyday/terrestrial imaging that makes it so that these concerns don't seem to matter at all? Why is it that I can feel confident that more pixels = a sharper image, without having to worry about the actual *optics?* One would think that the miniscule lenses included with ever smaller consumer digital electronics would offer pretty lousy angular resolution.

Introducing a discrete detector (pixels) means the optical system is no longer strictly shift-invariant, so it is not strictly proper to treat imaging as a convolution operation anymore.

That said, if the pixels are smaller than the PSF, one can approximate the system as linearly shift-invariant. The rings, spikes, etc. are diffractive artifacts of the aperture, and depending on how the overall brightness of the image is scaled, the details of dimmer objects can be lost; note that in order to view these artifacts, there is usually blooming present in the central peak. There's no contradiction between imaging points and extended objects: the diffraction artifacts may be lessened because those "side lobes" are much dimmer than the central peak, and they get washed out when imaging extended objects, so the image simply appears blurry.

Pixelated imaging systems can behave very differently from continuous systems. Aliasing is the main effect people recognize, and the key is proper matching of the pixel size to the PSF, something that is accomplished by adjusting the numerical aperture of the system. An excellent resource on this topic is "Analysis of Sampled Imaging Systems" by Ronald Driggers (SPIE Press). But yes, those little cameras in consumer electronics are quite impressive; I wouldn't mind seeing the optical layout.
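The pixel-to-PSF matching can be made concrete with the standard diffraction numbers. A sketch, where the 550 nm wavelength and f/2.0 lens are purely illustrative assumptions (roughly typical of a small consumer camera):

```python
# For the detector to sample the optics at the Nyquist rate, the pixel
# pitch should be no larger than about half the diffraction-limited
# Airy radius, 1.22 * wavelength * F-number.

wavelength_m = 550e-9   # green light (illustrative assumption)
f_number = 2.0          # small consumer-camera lens (illustrative assumption)

airy_radius_m = 1.22 * wavelength_m * f_number
nyquist_pixel_m = airy_radius_m / 2

print(f"Airy radius:         {airy_radius_m * 1e6:.2f} um")   # ~1.34 um
print(f"Nyquist pixel pitch: {nyquist_pixel_m * 1e6:.2f} um")  # ~0.67 um
```

Pixels much larger than this undersample the PSF (risking aliasing); pixels much smaller buy resolution the optics cannot deliver, which is one reason "more megapixels" eventually stops meaning "sharper image."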
 

Related to A Question about Imaging and PSFs

What is imaging and what is its purpose?

Imaging is the process of creating a visual representation of an object, usually through the use of technology such as cameras or microscopes. Its purpose is to provide a way to see and analyze objects or phenomena that are not visible to the naked eye.

What is a PSF and how does it relate to imaging?

A PSF (Point Spread Function) is a mathematical function that describes how an ideal point of light is spread out in an image. It is an important concept in imaging as it helps to understand the limitations and quality of an imaging system.

How does the PSF affect the resolution and quality of an image?

The PSF directly affects the resolution and quality of an image as it determines how sharp and clear the image will be. A smaller and more concentrated PSF leads to higher resolution and better image quality, while a larger and more spread out PSF results in lower resolution and poorer image quality.

What factors can affect the shape and size of a PSF?

The shape and size of a PSF can be affected by various factors such as the optics of the imaging system, the wavelength of light used, and any aberrations in the system. Other factors include the size and shape of the object being imaged, the distance between the object and the imaging system, and the quality of the image sensor.

How can PSF engineering be used to improve imaging quality?

PSF engineering involves designing and manipulating the PSF of an imaging system to improve its quality. This can be done through various techniques such as using specialized optics, adjusting the wavelength of light, and correcting aberrations. It can also be used to enhance specific features or details in an image, making it a valuable tool in scientific research and imaging technology development.
