# Resolution (visual acuity) and screen discreteness

1. Aug 20, 2014

### fog37

Hello Forum,

Resolution is an angular distance (measured in fractions of a degree). The lower the resolution, the better. It represents the ability to distinguish two objects that are very close to each other.

For an optical system, the resolution improves as the diameter D of the lens increases (or as the wavelength of the light used decreases).
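For a circular aperture this relation is the Rayleigh criterion, θ ≈ 1.22 λ/D. A minimal sketch (the function name and example numbers are illustrative, not from the thread):

```python
import math

def rayleigh_angle(wavelength_m: float, aperture_m: float) -> float:
    """Minimum resolvable angular separation in radians for a circular
    aperture (Rayleigh criterion: theta = 1.22 * lambda / D)."""
    return 1.22 * wavelength_m / aperture_m

# Green light (550 nm) through a 10 cm aperture:
print(rayleigh_angle(550e-9, 0.10))  # ~6.7e-6 rad
```

Doubling D halves the resolvable angle, which is why a larger lens (or mirror) resolves finer detail.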

1) If the system has multiple lenses, is it the entrance pupil or the aperture stop that determines the resolution?

Because of the unavoidable diffraction, two object points will be imaged by a circular lens as Airy disks. If the two discs are too large they will overlap and look like a single blobby image: we will not be able to distinguish the two objects separately.

The two point images (Airy discs) form on the imaging screen, which is often assumed to be continuous. What if the imaging screen is discrete (like the human retina) and formed by a finite number of photoreceptors with a certain size and shape?

Does the size of each receptor need to be smaller, larger or equal to the diameter of the Airy disks?

Thanks,
Fog37

2. Aug 20, 2014

### Staff: Mentor

First, the higher the resolution the better: higher resolution means you can distinguish objects that are closer together than a lower-resolution system can.

If the system has multiple lenses, it is usually the diameter of the aperture stop that determines the resolution. For example, my telescope has an aperture stop that is 8 inches in diameter, while the mirror is about 9 inches in diameter. The resolution is the same as that of an 8 inch mirror without an aperture stop. (The reason for the difference is that having the mirror larger than the aperture stop helps reduce vignetting.)

If you are worried about resolution, your detector should have a pixel size approximately 2-3 times smaller than your Airy disk. See here: http://starizona.com/acb/ccd/advtheorynyq.aspx
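At the focal plane, the Airy disk's linear diameter depends only on the wavelength and the focal ratio (f/D), which makes the pixel-size rule easy to check. A minimal sketch, with illustrative numbers (the f/10 system is a hypothetical example, not Drakkith's telescope):

```python
def airy_disk_diameter_um(wavelength_um: float, focal_ratio: float) -> float:
    """Linear diameter of the Airy disk (out to the first minimum) at the
    focal plane: d = 2.44 * lambda * (f/D)."""
    return 2.44 * wavelength_um * focal_ratio

d = airy_disk_diameter_um(0.55, 10.0)  # 550 nm light at f/10
print(d)                               # ~13.4 um
print(d / 3.0, d / 2.0)                # pixels 2-3x smaller: ~4.5 to ~6.7 um
```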

3. Aug 21, 2014

### Andy Resnick

Discrete (sampled) imaging systems can be more difficult to analyze because they are not linear shift-invariant systems: small shifts in object position don't result in corresponding small shifts of the image. Often, the pixel size is designed to be much smaller than the Airy disc, and as Drakkith notes, in that limit 'normal' rules apply. The opposite limit (pixel size much larger than the Airy disc) is rarely relevant.

In between, a variety of imaging artifacts can occur, most notably 'aliasing': the appearance of spurious low-frequency components in the image when the image contains spatial frequencies above what the sampling rate can capture, similar to Moiré patterns. The best (IMO) book to understand this is Vollmerhausen and Driggers' "Analysis of Sampled Imaging Systems" (SPIE Press).
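Aliasing is easy to demonstrate numerically: sampling a frequency above the Nyquist limit produces exactly the same samples as some lower "folded" frequency, so the two are indistinguishable after sampling. A small sketch with made-up numbers:

```python
import math

fs = 10.0                 # sample rate (samples per unit distance)
f_signal = 9.0            # signal frequency above Nyquist (fs / 2 = 5.0)
f_folded = fs - f_signal  # frequency the samples actually represent: 1.0

# Sampling the 9-cycle signal yields exactly the same values as sampling
# a 1-cycle signal of opposite phase: the high frequency "folds" down.
hi = [math.sin(2 * math.pi * f_signal * n / fs) for n in range(20)]
lo = [math.sin(-2 * math.pi * f_folded * n / fs) for n in range(20)]
print(max(abs(a - b) for a, b in zip(hi, lo)))  # ~0: indistinguishable
```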

Note that the retina is not anything like a simple pixelated array- there are multiple processing layers located in the retina that operate on the signal prior to the optic nerve.

4. Aug 21, 2014

### fog37

So, in essence, for the simple case of a single lens and a viewing screen with a finite number of receptors on it, the resolution is determined primarily by the diameter D of the lens and the wavelength of light, and secondarily by the number of receptors on the screen.

If there are too few receptors, we run into the problem of undersampling, which leads to artifacts that compromise the overall resolution.
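The summary above suggests a rough back-of-envelope check: to avoid undersampling, a Nyquist-style rule of thumb wants at least two receptors spanning each Airy-disk diameter. A sketch under that assumption (function name and numbers are illustrative):

```python
import math

def min_receptors(screen_width_um: float, airy_diameter_um: float) -> int:
    """Rough Nyquist-style count: at least two receptors per Airy-disk
    diameter, i.e. receptor pitch <= airy_diameter / 2."""
    return math.ceil(2.0 * screen_width_um / airy_diameter_um)

# A 1 mm screen behind optics producing ~13.4 um Airy disks:
print(min_receptors(1000.0, 13.4))  # 150
```

Fewer receptors than this across the screen means the Airy pattern is undersampled and aliasing artifacts become possible.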

fog37