# Optics: Converging waves

Hi. A spherical wave ##e^{i(kr-\omega t)}## diverging from a single point ##(x=0, y=0, z=-z_0)## can be approximated as a parabolic wave in the paraxial region around the ##z##-axis, i.e. ##k r = k \sqrt{x^2+y^2+z^2} \simeq k \left(z +\frac{x^2+y^2}{2z}\right)##.

OK, then let's say a lens is placed such that its optical axis coincides with the ##z##-axis and its focal points are at ##-z_0## and ##z_0##. In this case, the outgoing parabolic wave from ##-z_0## will be focused into the point ##z_0##. My question is: how is this to be modeled mathematically? Intuitively I would guess that ##k r \simeq k \left( z - \frac{x^2+y^2}{2z}\right)##, but what is ##kr## equal to in the accompanying case of a converging spherical wave? Something like ##e^{i(kr + \omega t)} e^{i \phi}##, where ##\phi## is some phase factor?
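
Here's a numerical sketch of the sign flip I'm guessing at (it assumes the converging wave can be written as ##e^{-ikr'}## with ##r'## the distance to the focus at ##(0,0,z_0)## and the usual ##e^{-i\omega t}## time dependence; that sign convention is my assumption, not something established above):

```python
import numpy as np

# Sketch (assumption): model a wave converging to the focus (0, 0, z0)
# as e^{-i k r'}, where r' is the distance from the field point to the
# focus.  Paraxially, for an observation plane z < z0,
#   r' = sqrt(rho^2 + (z0 - z)^2) ~= (z0 - z) + rho^2 / (2*(z0 - z)),
# i.e. the transverse quadratic phase has the opposite sign to the
# diverging case.  All numbers below are illustrative choices.
k = 2 * np.pi / 0.5e-6   # wavenumber for a 500 nm wavelength
z0 = 0.2                 # focus position on the axis
z = 0.1                  # observation plane before the focus
rho = 1e-3               # 1 mm off axis

r_focus = np.sqrt(rho**2 + (z0 - z)**2)
parax = (z0 - z) + rho**2 / (2 * (z0 - z))

# Paraxial expansion of the converging-wave phase holds to well under
# one phase cycle at these distances.
assert abs(k * (r_focus - parax)) < 2 * np.pi
```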

I would appreciate it if you guys could help me clear this up :)

Thanks


blue_leaf77
Homework Helper
In this case, the outgoing parabolic wave from ##-z_0## will be focused into the point ##z_0##
If this is a converging lens, then the rays will be focused at infinity; that is, the outgoing rays are collimated.

If this is a converging lens, then the rays will be focused at infinity; that is, the outgoing rays are collimated.
Crap, yeah, you're right. I was thinking in terms of rays from the object plane being focused onto the image plane, but I mixed it up. Sorry.

But anyway, do you know the mathematical form of waves converging to a single point?
