marcnn
Homework Statement
There are three subquestions in this question, all marked bold.
Let's consider a gradient index lens of thickness ##d##, whose refractive index changes with the distance from the axis with the following formula
$$ n(r) = n_1 + a r^2 .$$
Determine the lens's focal length ##f##. Consider only rays incident parallel to the axis whose distance ##r_1## from it is small. Assume that ##\lvert \Delta r \rvert \ll d## and ##\lvert \Delta r \rvert \ll r_1##.
Homework Equations
$$x(y) = d + \sqrt{R^2 - y^2} - R \approx d - \frac {y^2}{2R}$$
$$l(y) = x(y) \cdot n(y) + n_0\sqrt{[f + d - x(y)]^2 + y^2}$$
The Attempt at a Solution
First of all, by Fermat's principle we know that light travels along the path for which the travel time, or equivalently the optical path length ##\int_X^Y n \, ds##, is stationary (minimal).
So I would say that the ray will approach the axis for ##a>0##. But in fact the opposite happens.
**Why?**
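To check which way the ray actually bends, one can integrate the paraxial ray equation ##\frac{d}{dz}\left(n \frac{dr}{dz}\right) = \frac{\partial n}{\partial r}## numerically. A minimal sketch (the values of ##n_1##, ##a##, ##d## and the step size are illustrative, not from the problem statement):

```python
# Paraxial ray trace through the GRIN medium n(r) = n1 + a*r^2,
# integrating d/dz (n dr/dz) = dn/dr with a simple Euler scheme.
n1 = 1.5
a = 0.05        # a > 0: the index grows away from the axis (illustrative value)
d = 1.0         # lens thickness (illustrative value)
r, slope = 0.1, 0.0   # ray enters parallel to the axis at r1 = 0.1
dz = 1e-4
z = 0.0
while z < d:
    n = n1 + a * r**2
    slope += (2 * a * r / n) * dz   # dn/dr = 2*a*r
    r += slope * dz
    z += dz
print(r)   # r ends above 0.1: the ray bends away from the axis for a > 0
```

The trace shows the ray curving toward the region of higher index, i.e. away from the axis when ##a>0##, which matches the claimed behaviour.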
We can use this principle to calculate the focal length. The thickness of the lens as a function of the distance from the axis is
$$x(y) = d + \sqrt{R^2 - y^2} - R \approx d - \frac {y^2}{2R}$$
Let the ray enter the lens at a distance ##y## from the axis at a point ##A##. Let it hit the lens's axis at point ##B##.
If ##f > 0##, then the length of the ray's optical path between points ##A## and ##B## is
$$l(y) = x(y) \cdot n(y) + n_0\sqrt{[f + d - x(y)]^2 + y^2}$$ since ##\lvert \Delta r \rvert \ll r_1##.
But if ##f<0##, then no ray hits the optical axis (only the rays' backward extensions do). It is claimed that in this case
$$l(y) = x(y) \cdot n(y) - n_0\sqrt{[f + d - x(y)]^2 + y^2}$$
**Why do we change the sign in this situation?**
Later on it is claimed that if ##f## is the focal length, then ##l(y)## does not depend on ##y##. **Why?**
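To see what the condition "##l(y)## does not depend on ##y##" actually constrains, it may help to expand ##l(y)## to second order in ##y## symbolically. A sketch with sympy (all symbols are taken positive just to get a clean series; this merely exposes the ##y^2## coefficient of the optical path, whose vanishing is the stated condition):

```python
import sympy as sp

y, d, R, f, a, n0, n1 = sp.symbols('y d R f a n0 n1', positive=True)

x = d - y**2 / (2 * R)          # thickness at height y (given approximation)
n = n1 + a * y**2               # index at height y
l = x * n + n0 * sp.sqrt((f + d - x)**2 + y**2)

# Expand the optical path to O(y^2); l(y) is y-independent to this order
# iff the coefficient of y^2 vanishes.
coeff = sp.simplify(sp.series(l, y, 0, 3).removeO().coeff(y, 2))
print(coeff)
```

The printed coefficient collects the contributions of the surface sag, the index profile, and the outside leg of the path; setting it to zero is one equation in ##f##.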