# Image formation in terms of electromagnetic waves

1. Nov 26, 2014

### center o bass

In texts on geometrical optics I have read that an image is formed where light rays converge -- without an explanation of why.

Thus I ask, why are images formed where light rays converge? In particular, what is the answer to this question in terms of the wave theory of light (Electrodynamics)?

How is information propagated from an object and depicted on a screen by an electromagnetic wave? What is responsible for the contrasts in the image?

2. Nov 26, 2014

### Staff: Mentor

Light rays are perpendicular to wavefronts. A bunch of light rays leaving an object point, being refracted/reflected, and converging onto an image point, is equivalent to a series of circular wavefronts expanding from an object point, being refracted/reflected, and forming a new set of (semi)circular wavefronts which "shrink" onto an image point.

Electromagnetic fields obey the principle of superposition, so the waves from different object points propagate to different image points without affecting each other's propagation.
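To make this concrete, here is a minimal 1-D wave-optics sketch (the 500 nm wavelength, the distances, and the idealized thin-lens phase are all example assumptions, not anything from this thread): two point sources each send wavelets through an ideal lens, the fields simply add by superposition, and each source still focuses to its own image point.

```python
# Minimal sketch: two point sources imaged by an ideal thin lens,
# computed by summing Huygens wavelets. All parameters are example values.
import numpy as np

wavelength = 500e-9                      # green light
k = 2 * np.pi / wavelength
s_o, s_i = 0.2, 0.2                      # object/image distances (m)
f = 1 / (1 / s_o + 1 / s_i)              # thin-lens equation -> f = 0.1 m

x_lens = np.linspace(-5e-3, 5e-3, 4000)  # 10 mm lens aperture
x_img = np.linspace(-2e-4, 2e-4, 401)    # image-plane coordinates

def field_at_image(x_src):
    # spherical wavelet from the source point to each point on the lens...
    r1 = np.sqrt(s_o**2 + (x_lens - x_src)**2)
    u = np.exp(1j * k * r1)
    # ...times the ideal thin-lens phase...
    u = u * np.exp(-1j * k * x_lens**2 / (2 * f))
    # ...then a Huygens sum from the lens to each image-plane point
    r2 = np.sqrt(s_i**2 + (x_img[:, None] - x_lens[None, :])**2)
    return (u[None, :] * np.exp(1j * k * r2)).sum(axis=1)

u_a = field_at_image(+50e-6)             # source 50 um above the axis
u_b = field_at_image(-50e-6)             # source 50 um below the axis
total = u_a + u_b                        # superposition: the fields just add

# each source images to its own (inverted) point, magnification -1 here
print("upper source focuses near x =", x_img[np.argmax(np.abs(u_a))])
print("lower source focuses near x =", x_img[np.argmax(np.abs(u_b))])
```

The two focused spots appear at the positions ray optics predicts, even though the fields from both sources overlap everywhere in between.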

3. Nov 26, 2014

### center o bass

Thanks! Would you happen to have a reference that explains this in some detail?

4. Nov 26, 2014

### center o bass

But if they converge to an image point, is there not a chance that they will interfere destructively?

5. Nov 26, 2014

### Staff: Mentor

In normal lighting the light is incoherent. With coherent light, like a laser, you can get constructive and destructive interference. I think that this is what gives rise to the perceived "shimmer" (speckle) of laser light.
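A quick sketch of why that happens, with made-up parameters: at each detector pixel, light arrives via many paths with random relative phases. Coherent light sums field amplitudes, so the pixel-to-pixel intensity fluctuates strongly; incoherent light effectively sums intensities, which averages out.

```python
# Speckle toy model: sum random-phase phasors per "pixel".
# All numbers are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_paths = 10000, 200

phases = rng.uniform(0, 2 * np.pi, size=(n_pixels, n_paths))
# coherent: add complex amplitudes, then take intensity
coherent = np.abs(np.exp(1j * phases).sum(axis=1))**2 / n_paths
# incoherent: each path contributes its intensity directly
incoherent = np.full(n_pixels, 1.0)

print("coherent:   mean %.2f, std %.2f" % (coherent.mean(), coherent.std()))
print("incoherent: mean %.2f, std %.2f" % (incoherent.mean(), incoherent.std()))
# coherent std is comparable to the mean (fully developed speckle);
# incoherent std is ~0, i.e. a smooth image.
```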

6. Nov 26, 2014

### Staff: Mentor

The light from a single point source is always in phase with itself when it arrives at the image point. All parts of a single circular wavefront (or "ripple" if you use the analogy of waves on a water surface) leave the source point simultaneously, and arrive at the image point simultaneously.

In the light-ray picture, this corresponds to all the rays from the source point to the image point having the same optical path length.
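A small numeric check of the equal-path-length claim, in the thin-lens paraxial model (the distances are my own example values): a ray from an axial object point crossing the lens at height r has optical path length sqrt(s_o^2 + r^2) + sqrt(s_i^2 + r^2) - r^2/(2f), where the last term is the optical path an ideal thin lens removes at height r relative to its center.

```python
# Check that all rays from object point to image point share one OPL.
# Example values only; valid in the paraxial regime.
import numpy as np

s_o, s_i = 0.3, 0.6                      # object and image distances (m)
f = 1 / (1 / s_o + 1 / s_i)              # thin-lens equation -> f = 0.2 m

r = np.linspace(0, 5e-3, 6)              # ray heights at the lens, up to 5 mm
opl = np.sqrt(s_o**2 + r**2) + np.sqrt(s_i**2 + r**2) - r**2 / (2 * f)

for h, L in zip(r, opl):
    print(f"r = {1e3*h:4.1f} mm   OPL = {L:.12f} m")
# The OPLs agree to within a few nanometers -- a small fraction
# of a wavelength -- so all these rays arrive in phase.
```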

7. Nov 27, 2014

### center o bass

I.e. both rays take a path of least time?

8. Nov 27, 2014

### Staff: Mentor

Right. Or rather, all the valid rays that one can draw from the object point to the image point take the same (least) time. There are infinitely many of them.

9. Nov 27, 2014

### Drakkith

Staff Emeritus
I was under the impression that light from a single point is not completely in phase with itself when it arrives at the focal point, which is why the light forms a circular diffraction pattern, with the maximum intensity at the center of the pattern and falling off from there. Have I misunderstood something?

10. Nov 27, 2014

### Staff: Mentor

Sure, at some level you have to start taking diffraction into account. I was assuming we were dealing with situations in which geometrical (ray) optics is "good enough" that we can use it more or less interchangeably with wave optics. You have to have pretty good lenses and be dealing with very pointlike objects (e.g. in astrophotography) before diffraction effects become noticeable, and even then diffraction patterns are pretty small.
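For a rough sense of scale (example values of my choosing), the radius of the Airy disk at the focus is about 1.22 times the wavelength times the f-number:

```python
# Back-of-envelope size of the diffraction pattern for a typical camera lens.
wavelength = 550e-9        # green light
f_number = 2.8             # example lens setting
airy_radius = 1.22 * wavelength * f_number
print(f"Airy disk radius ~ {airy_radius * 1e6:.1f} micrometers")
# ~1.9 um: comparable to a small camera pixel, invisible in most photographs.
```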

11. Nov 27, 2014

### Drakkith

Staff Emeritus
Awesome. Thanks, JT.

Forgive me if this is too basic an answer:

Well, if you want to deal with waves, imagine that you have a plane wave striking a detector at normal incidence. The energy from the wave is deposited equally across the detector and you have no intensity variation between pixels. If we instead cause that EM wave to converge on the detector, the energy of the wave will be deposited into a small portion of the detector. The pixels that the wave converges on will read a high intensity, while the pixels the wave doesn't converge on will read a low intensity. This variation in pixel values forms an image.
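A minimal sketch of that detector picture (all parameters are example values): propagate the same illuminated aperture to a line of detector "pixels" by summing Huygens wavelets, once with a flat phase (plane wave) and once with an ideal-lens phase converging on the detector.

```python
# Plane wave vs converging wave on a 1-D detector. Example values only.
import numpy as np

wavelength = 500e-9
k = 2 * np.pi / wavelength
z = 0.1                                   # aperture-to-detector distance (m)

x_ap = np.linspace(-5e-3, 5e-3, 8000)     # 10 mm illuminated aperture
x_det = np.linspace(-2e-3, 2e-3, 201)     # detector "pixels"

r = np.sqrt(z**2 + (x_det[:, None] - x_ap[None, :])**2)
kernel = np.exp(1j * k * r)               # Huygens wavelet to each pixel

plane = np.abs(kernel.sum(axis=1))**2     # flat incident phase
lens = np.abs((np.exp(-1j * k * x_ap**2 / (2 * z))[None, :]
               * kernel).sum(axis=1))**2  # phase converging on the detector

print("plane wave:  max/mean pixel intensity = %.1f" % (plane.max() / plane.mean()))
print("converging:  max/mean pixel intensity = %.1f" % (lens.max() / lens.mean()))
# The plane wave spreads its energy nearly evenly across the pixels; the
# converging wave piles it into a few bright pixels. That contrast is the image.
```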