A Question about Imaging and PSFs

  1. Jan 20, 2009 #1

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    Hi,

    If every point in the image plane is convolved with the PSF, why is it that this is only obvious in certain cases?

    Take astronomical imaging: for images of bright point sources (e.g., the brightest stars), we see rings, spikes etc. Why do we not see these features for dimmer stars? Furthermore, what about images of extended objects!?! Why is it that galaxies and nebulae look fine, and don't look like some sort of blurred mess?

    Also, what is it fundamentally about everyday/terrestrial imaging that makes it so that these concerns don't seem to matter at all? Why is it that I can feel confident that more pixels = a sharper image, without having to worry about the actual *optics?* One would think that the minuscule lenses included with ever-smaller consumer digital electronics would offer pretty lousy angular resolution.
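
    To make the question concrete, here is a rough numpy/scipy sketch of what I mean by "every point convolved with the PSF" - the Airy-like PSF width and the source brightnesses are just made-up illustrative numbers:

    [code]
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.signal import fftconvolve
    from scipy.special import j1

    # Toy Airy-like PSF for a circular aperture (the width is arbitrary).
    n = 129
    y, x = np.mgrid[-(n // 2):n // 2 + 1, -(n // 2):n // 2 + 1]
    r = np.hypot(x, y) + 1e-9            # avoid 0/0 at the centre
    u = 0.6 * r                          # 0.6 just sets the ring spacing in pixels
    psf = (2.0 * j1(u) / u) ** 2
    psf /= psf.sum()                     # conserve total flux

    # Toy scene: a bright star, a faint star, and a faint extended "galaxy".
    scene = np.zeros((512, 512))
    scene[128, 128] = 1e6                # bright point source
    scene[384, 384] = 1e2                # faint point source
    yy, xx = np.mgrid[:512, :512]
    scene += 50.0 * np.exp(-((xx - 256) ** 2 + (yy - 256) ** 2) / (2 * 30.0 ** 2))

    # Image formation: every point in the scene is replaced by a copy of the PSF.
    image = fftconvolve(scene, psf, mode='same')

    # Log stretch: rings show up around the bright star, while the faint star
    # and the extended blob just look slightly blurred.
    plt.imshow(np.log10(image + 1.0), cmap='gray')
    plt.show()
    [/code]

    In that toy image only the bright point shows obvious rings; the faint point and the extended blob just look a bit soft, which is essentially what I'm asking about.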
     
  2. Jan 20, 2009 #2

    mgb_phys

    Science Advisor
    Homework Helper

    It's only a problem for diffraction-limited images, and only obvious for point sources on a dark background.
    Because they are below the detection limit of the detector. If only 0.1% of the energy goes into the spikes, you might see it for a mag 6 star but not for a mag 25 galaxy.
    They are a blurred mess, but at scales of less than an arcsecond.

    The optics generally aren't diffraction limited, and the scenes are normally confused enough that you don't see them. If you are one of the sad bores on photo forums who look at individual pixels on photos of test charts to prove your camera is best - you will.
    They are pretty bad - but this leads to blurring which, combined with the heavy jpeg compression, means you don't see the effects.
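
    To put rough numbers on the detection-limit point (the limiting magnitude below is just an assumed illustrative value) - putting 0.1% of the light into the spikes is a fixed offset of 7.5 magnitudes:

    [code]
    import math

    spike_fraction = 1e-3      # "0.1% of the energy" going into the spikes
    limiting_mag = 20.0        # assumed detection limit, purely illustrative

    delta_m = -2.5 * math.log10(spike_fraction)   # = 7.5 mag fainter than the source

    for m_source in (6.0, 25.0):
        m_spike = m_source + delta_m
        status = "detectable" if m_spike < limiting_mag else "below the limit"
        print(f"mag {m_source:4.1f} source -> spikes at mag {m_spike:.1f} ({status})")
    [/code]

    So the mag 6 star's spikes come out around mag 13.5, well above that limit, while the mag 25 source's would be around mag 32.5, far below it.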
     
  3. Jan 20, 2009 #3

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    Thank you for the explanations mgb_phys. I just wanted to see if I could trust the physics and apply it to the situation in as straightforward a way as I was attempting to.

    I guess for terrestrial imaging, it is, as you said, a question of not ever really having to worry about the kind of angular resolution that you need in astronomy. Nobody worries about why you can't see individual trees in a forest tens of kilometres away. Things look reasonable, like the way you'd expect them to look.

    One more thing, if I may. You mentioned that the scenes (obviously much busier than a bunch of bright points on a dark background) are "confused." What exactly does that mean? I have some vague idea that the confusion limit occurs when you're looking deep enough that you see so many sources that it becomes impossible to distinguish them from the background noise (again speaking in an astronomy-specific context, sorry).
     
    Last edited: Jan 20, 2009
  4. Jan 20, 2009 #4

    mgb_phys

    Science Advisor
    Homework Helper

    I didn't mean it in a technical sense. I meant that a random background of trees/people etc. disguises obvious optical aberrations, whereas bright point sources on an empty background emphasize them.
     
  5. Jan 20, 2009 #5

    cepheid

    Staff Emeritus
    Science Advisor
    Gold Member

    Right okay...that makes sense. Thanks for the clarification.
     
  6. Jan 21, 2009 #6

    Andy Resnick

    Science Advisor
    Education Advisor

    Introducing a discrete detector (pixels) means the optical system is no longer shift-invariant, so strictly speaking it is no longer proper to treat imaging as a convolution operation.

    That said, if the pixels are smaller than the PSF, one can approximate the system as being linearly shift-invariant. The rings, spikes, etc. are diffractive artifacts of the aperture, and depending on how the overall brightness of the image is scaled, the details of dimmer objects can be lost - note that in order to view these artifacts, there is usually blooming present in the central peak. There's no contradiction between imaging points and extended objects - the diffraction artifacts are lessened by the fact that those "side lobes" are much dimmer than the central peak, and they get washed out when imaging extended objects - the image will simply appear blurry.

    Pixelated imaging systems can behave very differently from continuous systems - aliasing is the main effect people recognize, and the key is proper matching of the pixel size to the PSF, something that is accomplished by adjusting the numerical aperture of the system. An excellent resource for this topic is "Analysis of Sampled Imaging Systems" by Ronald Driggers (SPIE Press). But yes, those little cameras in consumer electronics are quite impressive - I wouldn't mind seeing the optical layout.
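
    As a back-of-the-envelope sketch of that pixel-to-PSF matching (the wavelength, f-number and pixel pitch below are just assumed illustrative values, not any particular camera):

    [code]
    # Pixel-to-PSF matching for a diffraction-limited lens, illustrative numbers.
    wavelength_um = 0.55       # green light, microns
    f_number = 2.8             # working f/#
    pixel_pitch_um = 1.4       # a typical small consumer-sensor pixel

    # The incoherent diffraction cutoff frequency is 1/(lambda * f#), so Nyquist
    # sampling of the PSF needs a pixel pitch of at most lambda * f# / 2.
    airy_radius_um = 1.22 * wavelength_um * f_number
    nyquist_pitch_um = wavelength_um * f_number / 2.0

    print(f"Airy disc radius   : {airy_radius_um:.2f} um")
    print(f"Nyquist pixel pitch: {nyquist_pitch_um:.2f} um")
    if pixel_pitch_um <= nyquist_pitch_um:
        print("PSF is Nyquist sampled or better.")
    else:
        print("Pixels undersample the PSF, so aliasing of fine detail is possible.")
    [/code]

    With these made-up numbers the pixels undersample the diffraction PSF, which is exactly the kind of trade-off that sampling analysis is about.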
     