The focal length of the lens equivalent to two thin lenses separated by a distance ##h## is
$$1/f=1/f_1+1/f_2-h/(f_1 f_2)$$
Therefore, supposing that ##f_1>0## and ##f_2>0## (both lenses are convergent), if ##f_1+f_2 < h## then ##1/f = (f_1+f_2-h)/(f_1 f_2) < 0##, so the equivalent lens should be divergent.
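For instance, with the assumed values ##f_1 = f_2 = 10\,\text{cm}## and ##h = 30\,\text{cm}## (numbers chosen purely for illustration), the formula gives
$$1/f = 1/10 + 1/10 - 30/(10\cdot 10) = -0.1\,\text{cm}^{-1},$$
i.e. ##f = -10\,\text{cm}##, a divergent equivalent lens.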
Nevertheless, consider the example in the attached picture.
The two lenses have focal lengths such that ##f_1+f_2 < h##, but the image is real, i.e. the equivalent lens cannot be divergent. I understand the ray diagram, but how can this hold true?