Can better sensors extract unlimited info from old lens?

  • Context: Graduate
  • Thread starter: nugat
  • Tags: Lens, Nikon, Sensors

Discussion Overview

The discussion revolves around the implications of higher resolution sensors in photography and whether they can extract more information from existing lenses. Participants explore the relationship between sensor resolution, lens performance, and the theoretical limits of image quality, touching on concepts from optics and imaging science.

Discussion Character

  • Debate/contested
  • Technical explanation
  • Conceptual clarification
  • Mathematical reasoning

Main Points Raised

  • Some participants argue that a higher resolution sensor can always extract more resolution from existing lenses, citing Fourier transforms and modulation transfer functions (MTFs) as supporting evidence.
  • Others challenge this view, suggesting that lens performance has inherent limits that cannot be exceeded by simply using a higher resolution sensor.
  • One participant expresses a philosophical perspective, likening the situation to a "quantum Zeno effect," questioning whether there is a limit to the information a lens can provide.
  • Concerns are raised about the meaningfulness of resolution figures without accompanying contrast data, with some advocating for specific performance measures in photography.
  • There is a discussion about the relevance of lens resolution and how it interacts with sensor resolution, with some participants proposing that the optical performance of a system is determined by both components.
  • Participants mention the common issue in astrophotography regarding the matching of camera resolution to telescope capabilities, indicating that higher resolution does not always equate to better image quality.
  • Questions arise about the definition of resolving power in astrophotography, with no consensus on the best metric to use.
  • Photon counting is mentioned, with some participants clarifying its typical applications in imaging methods.

Areas of Agreement / Disagreement

Participants express differing views on whether higher resolution sensors can extract more information from existing lenses, with no consensus reached. Some support the idea while others firmly oppose it, leading to an unresolved debate on the topic.

Contextual Notes

Limitations in the discussion include the dependence on definitions of resolution and performance metrics, as well as the unresolved nature of mathematical models in imaging sciences that relate to the claims made.

nugat
I posted this in the quantum physics forum, but since no comments came, either it belongs here or it is too trivial.
Please comment.

MY POST IN QUANTUM PH. FORUM
...
"
This problem I encountered in my photography forum.
A new model of camera comes out with a much higher megapixel sensor (Nikon D800, 36mpix). I argue that such sensor will need new lenses to be meaningful. Others claim that a given lens performance (here resolution) is always utilized by a new higher resolution sensor. They call up Fourier transforms, modulation transfer functions etc. I am not an engineer, more of an amateur philosopher. To me it smells of Zeno paradox (a "quantum Zeno effect" ?). There sure must be a limit on the amount of information a new ever improving recording medium (sensor) can extract from the same old lens? Or not?
Or does this issue belong to classical physics?
I mean, in my mind I see a lemon running out of juice at some point of the squeeze.
Isn't the information coming through any given lens limited then? No more photons to be converted into electrons? On one hand it's the issue of quantum efficiency of the CMOS sensor, but also the physical information limit of the light field?
So are those Fourier transforms of MTFs accurate descriptions or we are at the point where technology stumbles on quanta?


OPPONENT'S ORIGINAL POST IN PHOTO FORUM

"... As I wrote the formula is an approximation. For accurate results one would need to multiply the MTFs by means of Fourier transforms.
Every image will show more resolution on the D800 with any lens at any aperture than on previous Nikons.
The resolution of a system is a function of the resolution of its parts, not only a single component.
The resolution of a lens plus sensor system can be approximated by:

1/Total resolution = 1/lens resolution + 1/sensor resolution. Or

Total resolution = (lens resolution)x(sensor resolution) / [ (lens resolution) + (sensor resolution) ]. "...
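
The quoted combination formula can be checked numerically. A minimal sketch (the 70 lp/mm lens figure is illustrative, echoing the number used later in the thread, not a measured value):

```python
def system_resolution(lens_lpmm: float, sensor_lpmm: float) -> float:
    """Harmonic combination: 1/R_total = 1/R_lens + 1/R_sensor."""
    return (lens_lpmm * sensor_lpmm) / (lens_lpmm + sensor_lpmm)

# A lens limited to 70 lp/mm paired with ever finer sensors:
for sensor in (50.0, 100.0, 200.0, 1000.0):
    total = system_resolution(70.0, sensor)
    print(f"sensor {sensor:6.0f} lp/mm -> system {total:5.1f} lp/mm")
```

Under this approximation the system figure keeps climbing with sensor resolution but never exceeds the 70 lp/mm of the lens, which is consistent with both positions in the thread: a finer sensor always helps a little, yet the lens sets a hard ceiling.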




MY POSITION IN PHOTO FORUM

"Resolution figures are meaningless without accompanying contrast data.
One has to put somewhere the cutoff line: what is acceptable. In cinematography Zeiss Master Prime delivering 70lp/mm at 70% contrast is the standard bearer (at 25k U$ and several pounds)
In photography I am willing to accept 50% contrast as a minimum performance measure.
If somebody wants to go down to MTF 25 or even 10--his choice. He will be able to point to "extra resolution" on his cranked up monitor viewed at 100%.
The concept of a lens delivering an ever-increasing stream of visual information as the recording medium becomes more and more capable, is philosophically intriguing, to say the least. A "quantum Zeno effect" of sorts."



OPPONENT'S ANSWER IN PHOTO FORUM

"If you would have read the thread I linked to further, you would have found that the approximation is based on multiplications of MTFs by means of Fourier transforms. You can pick any frequency (resolution) you want. Higher frequencies are generally used as a measure of resolution and lower as contrast.

The approximate formula is still valid and serves as a good first approximation to explain the principles of what can be expected for a system of lens plus sensor. The optical performance of a system of two components (lens and sensor) is not determined just by a single component but by both.

You can think of it as two MTFs stacked on top of each other."
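
The "stacked MTFs" idea - multiply the component MTFs at each spatial frequency - can be sketched with toy models. The Gaussian lens MTF and ideal-pixel sinc MTF below are assumed illustrative forms, not data from the thread:

```python
import math

def lens_mtf(f_lpmm: float, f50: float = 70.0) -> float:
    """Toy Gaussian lens MTF with 50% contrast at f50 lp/mm (assumed model)."""
    return math.exp(-math.log(2.0) * (f_lpmm / f50) ** 2)

def sensor_mtf(f_lpmm: float, pitch_mm: float) -> float:
    """Ideal square-pixel aperture MTF: |sinc(f * pitch)|."""
    x = math.pi * f_lpmm * pitch_mm
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# The system MTF is the product of the component MTFs at each frequency,
# so it can never exceed the weaker component.
for f in (10.0, 30.0, 50.0, 70.0):
    total = lens_mtf(f) * sensor_mtf(f, pitch_mm=0.005)  # 5 µm pixel pitch
    print(f"{f:4.0f} lp/mm: lens {lens_mtf(f):.2f} -> system {total:.2f}")
```

Shrinking the pixel pitch pushes the sensor MTF toward 1, so the system MTF approaches - but never exceeds - the lens MTF at any frequency, which is exactly the point in dispute in the thread.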
 
nugat said:
I placed this in quantum department but since no comments came either it belongs here or is too trivial.
Please comment.

<snip>

OPPONENT'S ORIGINAL POST IN PHOTO FORUM

"... Every image will show more resolution on the D800 with any lens at any aperture than on previous Nikons.
<snip>

It's hard for me to figure out what you are asking, but the sentence I pulled out from the post is false. A blurry image will still look blurry at increased sampling. I haven't seen any side-by-side comparisons of the new D800 and (say) D7000, but the claim is that some high-quality lenses will show improved performance on the D800.

What exactly are you asking?
 
There is a strong belief among some photo pundits that a higher resolution sensor will always extract extra resolution from all existing lenses. This is supposedly based on Fourier transforms of modulation transfer functions of the imaging chain.
Is it true?
Do math models of imaging sciences give support to such claims?

I have had an impression that there is something like a lens performance limit independent of the measuring (recording) component of the imaging chain. If a lens delivers 70lp/mm at MTF70 as measured on the MTF equipment, that should be it. No sensor should extract 90lp/mm at MTF70 from this lens.
Am I wrong?
TIA
 
nugat said:
There is a strong belief among some photo pundits that a higher resolution sensor will always extract extra resolution from all existing lenses. This is supposedly based on Fourier transforms of modulation transfer functions of the imaging chain.
Is it true?

No.

nugat said:
Do math models of imaging sciences give support to such claims?

On the contrary, the results demonstrate the claims are false. See, for example, "Analysis of Sampled Imaging Systems":

https://www.amazon.com/dp/0819434892/?tag=pfamazon01-20

nugat said:
I have had an impression that there is something like a lens performance limit independent of the measuring (recording) component of the imaging chain. If a lens delivers 70lp/mm at MTF70 as measured on the MTF equipment, that should be it. No sensor should extract 90lp/mm at MTF70 from this lens.
Am I wrong?
TIA

If I understand you, then you are correct- the lens has become the limiting factor in your system performance.
 
Thanks. I ordered the book. Time to brush up on optoelectronic engineering.
 
Note, this is a common problem in astrophotography. There are online and downloadable calculators for matching a camera with a telescope based on resolution. The short version, though, is that it does you no good to have a camera with a much higher resolution than your telescope.
 
Thank you for the info. Could you pass me any link to such calculators?
It's good to know that the issue is understood and long resolved among professionals.
Out of curiosity, how do you define resolving power of astrophotography glass? In lp/mm?
TIA.
 
There's no one single best metric to define 'resolving power', because that concept is ill-defined. One could specify the point spread function over the exit pupil, but in terms of adaptive optics, it likely makes more sense to specify the actual wavefront at the exit pupil.
 
Like counting all the photons and their energies?
 
nugat said:
Like counting all the photons and their energies?

I don't understand what you mean- typically, photon counting is not used as a wide-field imaging method (although it is used as a confocal-type imaging method).
 
nugat said:
Thank you for the info. Could you pass me any link to such calculators?
It's good to know that the issue is understood and long resolved among professionals.
Out of curiosity, how do you define resolving power of astrophotography glass? In lp/mm?
TIA.

In a "perfect" system it would be the smallest spot the telescope or camera is capable of making based on the diameter of the objective and the wavelength of light. Attempting to decrease pixel sizes to less than about half the size of the Airy disc would be a waste, as it would not get any more detail. Note that real systems will have imperfections and aberrations such as dispersion, coma, astigmatism, and others that will degrade the image even further.

http://en.wikipedia.org/wiki/Airy_disc
http://en.wikipedia.org/wiki/Optical_resolution
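
The diffraction-limited spot size described above can be computed directly. A minimal sketch using the standard Airy-disc formula (the telescope parameters are illustrative assumptions):

```python
def airy_disc_diameter_um(wavelength_nm: float, focal_length_mm: float,
                          aperture_mm: float) -> float:
    """Airy disc diameter (out to the first minimum) at the focal plane:
    d = 2 * 1.22 * lambda * N, where N = focal length / aperture."""
    n = focal_length_mm / aperture_mm
    return 2.0 * 1.22 * (wavelength_nm * 1e-3) * n  # nm -> µm

# Example: a 100 mm aperture, f/10 telescope (1000 mm focal length) in green light.
d = airy_disc_diameter_um(550.0, 1000.0, 100.0)
print(f"Airy disc ~{d:.1f} um; pixels much smaller than ~{d / 2:.1f} um gain little")
```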
 
  • #12
Andy Resnick said:
I don't understand what you mean- typically, photon counting is not used as a wide-field imaging method (although it is used as a confocal-type imaging method).

Excuse my unscientific language, I am just an amateur.
I was thinking of a model that describes resolution in terms of photons registered vs. photons sent out from a strict pattern: how many photons travel through the lens and arrive at the proper positions in space.
Or maybe it's total nonsense.
 
nugat said:
Thank you for the info. Could you pass me any link to such calculators?
http://www.newastro.com/downloads/index.php
http://www.astro.shoregalaxy.com/index_010.htm
http://www.ccd.com/ccd113.html
http://geogdata.csun.edu/~voltaire/pixel.html

There is some debate in these links about just how many pixels you need to cover the telescope's resolution limit, but they all agree that the resolution limit is not dependent on the number of pixels. In other words: one common method for assessing resolution is being able to "split" a double star (see both stars). No amount of additional pixels (beyond a well-matched camera/scope) will change whether a certain telescope can split a certain double star.

Also, in astrophotography there is a very serious downside to too many pixels: the pixels don't actually touch each other, so the more pixels you have, the more of the imaging surface is covered with borders between pixels instead of pixels. That reduces the sensitivity of the camera, in addition to the loss in sensitivity caused by the smaller pixels themselves. This issue of course also exists for regular photography; it just isn't as important.
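
The matching that the linked calculators perform can be approximated in a few lines. A sketch using the standard plate-scale formula and Dawes' empirical double-star limit (the camera and telescope figures are made-up examples):

```python
def pixel_scale_arcsec(pixel_um: float, focal_length_mm: float) -> float:
    """Sky angle per pixel: 206.265 * pixel size (µm) / focal length (mm)."""
    return 206.265 * pixel_um / focal_length_mm

def dawes_limit_arcsec(aperture_mm: float) -> float:
    """Dawes' empirical limit for splitting a double star: 116 / D (mm)."""
    return 116.0 / aperture_mm

# Example: 200 mm aperture, 2000 mm focal length, 4 µm pixels.
scale = pixel_scale_arcsec(4.0, 2000.0)
limit = dawes_limit_arcsec(200.0)
print(f"{scale:.2f} arcsec/pixel against a {limit:.2f} arcsec resolving limit")
# Rule of thumb: ~2-3 pixels across the limit; many more is wasted sampling.
```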
 
There is a counter-argument for more pixels, as a way of reducing the effect of noise in the sensor: spreading the noise over a wider spatial frequency range and filtering out the part that can't possibly be "signal" because it is beyond the resolving power of the optics.

But for "normal" photography I don't think that has much relevance, especially considering that many digital images are filtered through lossy compression schemes (e.g. jpg) in any case.

FWIW, this principle is applied in other digital data acquisition systems - e.g. pro audio recording and processing often uses sampling rates much higher than human hearing can process, and much higher than what is finally stored on CDs or DVDs, to minimize noise.
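
The oversample-then-filter argument can be sketched with a one-dimensional toy signal: white sensor noise spreads over all sampled frequencies, but anything above the optical cutoff can be discarded as pure noise (the cutoff and noise level below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
cutoff = n // 8  # the "optics" pass nothing above this frequency bin

# Band-limited "optical" signal: random content only below the cutoff.
half = np.zeros(n // 2 + 1, dtype=complex)
half[1:cutoff] = rng.standard_normal(cutoff - 1) + 1j * rng.standard_normal(cutoff - 1)
signal = np.fft.irfft(half, n)

noisy = signal + 0.5 * rng.standard_normal(n)  # white sensor noise everywhere

# Zero out frequencies beyond the optical cutoff: they can only be noise.
spectrum = np.fft.rfft(noisy)
spectrum[cutoff:] = 0.0
filtered = np.fft.irfft(spectrum, n)

def rms_error(x):
    return float(np.sqrt(np.mean((x - signal) ** 2)))

print(f"RMS error before filtering: {rms_error(noisy):.3f}, after: {rms_error(filtered):.3f}")
```

Because only about a quarter of the sampled band carries signal here, the filter removes roughly three quarters of the noise power while leaving the signal untouched - the mechanism described in the post above.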
 
