Understanding Fourier Optics: From Spatial to Temporal Transformations

  • Context: Graduate
  • Thread starter: KFC
  • Tags: Fourier Optics
Discussion Overview

The discussion revolves around the principles of Fourier optics, particularly the differences between spatial and temporal transformations, the role of phase in optical systems, and the applicability of Fourier optics in various scenarios, including complex optical systems and near-field conditions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • Some participants express confusion regarding the spatial Fourier transformation in optical systems compared to temporal cases, noting that spatial transformations appear to be direct mappings observable in physical space.
  • It is suggested that the physical image on a screen is a scaled Fourier transform, with scaling dependent on distance and wavelength, and that spatial frequency is distinct from temporal frequency.
  • Participants discuss the complex nature of Fourier transforms, with one noting that while intensity is recorded on film, the phase information is also significant, affecting the interference patterns if altered.
  • One participant raises a question about the implications of changing phase while keeping intensity constant, seeking to understand the resultant differences in the image.
  • Another participant introduces the concept of far-field diffraction patterns and their relationship to angles, emphasizing the importance of understanding the scaling factors involved.
  • Concerns are raised about the limitations of Fourier optics, particularly in near-field scenarios and the complications introduced by CCD detector arrays, which can lead to loss of shift invariance and other issues.

Areas of Agreement / Disagreement

Participants generally agree on the foundational concepts of Fourier optics but express differing views on its limitations and applicability, particularly regarding near-field conditions and the effects of complex media on transformations. The discussion remains unresolved on some technical aspects, particularly concerning the implications of phase changes.

Contextual Notes

Limitations include the assumption of linear shift-invariant systems in Fourier optics, the potential failure of these assumptions in near-field scenarios, and the complications arising from the use of CCD detectors, which may introduce aliasing and spurious responses.

KFC
I am reading an introduction to Fourier optics. In the text, the simplest example is a single-lens system (focal length f) with an object f(x,y) sitting on the front focal plane of the lens, while the image is the corresponding spatial Fourier transform F(\frac{x}{\lambda f}, \frac{y}{\lambda f}), where \lambda is the wavelength.

This spatial transformation is quite confusing when compared with the temporal case. In the temporal case, the Fourier transform of f(t) is F(\omega), which is defined in the frequency domain. Hence, we cannot directly observe it in physical space; we need to feed f(t) into some instrument to observe the frequency spectrum. But for the spatial case, it seems that the Fourier transform is nothing but a rearrangement, mapping, and scaling from the source to the image, and the image is still in physical space because it can be directly observed on the rear focal plane with a screen or film. I wonder if my understanding is correct?

Another question concerns the complex-valued transform. In most cases, the Fourier transform will be complex-valued, and for a lens system, F(\frac{x}{\lambda f}, \frac{y}{\lambda f}) could also be complex, but physically the film will only record the intensity. What about the phase? What is the role of the phase in this case? Put another way: if I keep the intensity at every point of the image unchanged but shift the phase by a constant, or change it randomly, how different will the image be, and in what way?

Thanks.
 
KFC said:
But
for the spatial case, it seems that the Fourier transform is nothing but a rearrangement, mapping, and scaling from the source to the image, and the image is still in physical space because it can be directly observed on the rear focal plane with a screen or film. I wonder if my understanding is correct?

Yes... what's happening is that what is *physically* on the screen is a scaled Fourier transform, with the scaling depending on the distance to the screen and wavelength. (The frequency in the Fourier transform here isn't a temporal frequency... it's a spatial frequency.)
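For concreteness, here is a minimal numerical sketch of that scaling relation u = x / (λ f); the 633 nm wavelength, 100 mm focal length, and 1 mm screen position are just example values, not anything from the thread:

```python
import numpy as np

wavelength = 633e-9   # example wavelength in metres (633 nm, HeNe laser)
f = 0.100             # example focal length in metres (100 mm)

# A point at transverse position x on the rear focal plane corresponds
# to the spatial frequency u = x / (lambda * f) of the object field.
x = 1e-3                     # 1 mm off-axis on the screen
u = x / (wavelength * f)     # spatial frequency in cycles per metre

print(u)   # ~1.58e4 cycles/m, i.e. ~15.8 cycles/mm
```

So a point 1 mm off-axis in the rear focal plane corresponds to a roughly 15.8 cycles/mm component of the object field; double the focal length and the same point reads out half that spatial frequency.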

KFC said:
Another question concerns the complex-valued transform. In most cases, the Fourier transform will be complex-valued, and for a lens system, F(\frac{x}{\lambda f}, \frac{y}{\lambda f}) could also be complex, but physically the film will only record the intensity. What about the phase?

The phase is exactly what you would expect... if you had a measurement device at the screen capable of precisely measuring the electric field of the light, you would find that the fields at two different points on the screen are out of phase by the amount predicted by the phase (the complex argument) of the Fourier transform.

Films and screens can't measure to this precision though, so we end up with the intensity instead, as you noted.
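The "keep the intensity, scramble the phase" question above is easy to play with numerically. A minimal sketch, where the toy 64×64 image and the use of a plain FFT as a stand-in for the transform plane are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a bright square on a dark background.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0

F = np.fft.fft2(img)

# Keep the magnitude |F| (what the film records, up to squaring)
# but replace every phase with a random one.
random_phase = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, F.shape))
scrambled = np.fft.ifft2(np.abs(F) * random_phase).real

# The result bears no resemblance to the square: the phase carries
# the positional/structural information of the image.
err = np.abs(scrambled - img).max()
print(err > 0.3)  # True: large pointwise difference
```

With the magnitudes untouched but the phases randomized, the reconstruction is noise-like: the square is gone. A constant phase shift, by contrast, leaves the intensity pattern unchanged.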
 
KFC said:
I am reading some introduction on Fourier optics.
<snip>

This spatial transformation is quite confusing when compared with the temporal case.
<snip> it seems that the Fourier transform is nothing but a rearrangement, mapping, and scaling from the source to the image, and the image is still in physical space because it can be directly observed on the rear focal plane with a screen or film. I wonder if my understanding is correct?

Another question concerns the complex-valued transform. In most cases, the Fourier transform will be complex-valued, and for a lens system, F(\frac{x}{\lambda f}, \frac{y}{\lambda f}) could also be complex, but physically the film will only record the intensity. What about the phase? What is the role of the phase in this case? Put another way: if I keep the intensity at every point of the image unchanged but shift the phase by a constant, or change it randomly, how different will the image be, and in what way?

Thanks.

Good questions. First, the spatial transform: if you like, the Fourier-conjugate variables are *angles*: absent a screen, the far-field diffraction pattern (the Fourier transform of the aperture) is in terms of angle. The scaling factors relate the image size (in distance) to the angles when a screen is present.

Second, you are correct- the Fourier transform/point spread function/optical transfer function is complex; (nonabsorptive) aberrations are modeled as changes to the phase of the wavefront. This gets into the difference between coherent and incoherent imaging, the reason why the cutoff frequency for imaging is either given by the pupil function directly or the autocorrelation of the pupil function, differences between optical transfer function and modulation transfer function, etc.

An instructive example is to work out the Fourier transform for two pure sinusoidal apertures: one modulated in amplitude (transmission) only and one in phase only. The far-field diffraction patterns are quite different.
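That exercise can also be checked numerically. A sketch using a 1-D FFT as a stand-in for the far-field pattern; the grating frequency and the 1-radian modulation depth are arbitrary choices:

```python
import numpy as np

N = 1024
x = np.arange(N)
u0 = 64 / N          # grating frequency: 64 cycles across the aperture
m = 1.0              # phase modulation depth in radians (arbitrary)

# Pure amplitude (transmission) grating: real, between 0 and 1.
t_amp = 0.5 * (1 + np.cos(2 * np.pi * u0 * x))
# Pure phase grating: unit modulus everywhere.
t_phase = np.exp(1j * m * np.cos(2 * np.pi * u0 * x))

I_amp = np.abs(np.fft.fft(t_amp))**2
I_phase = np.abs(np.fft.fft(t_phase))**2

# Amplitude grating: only the zero order and the two first orders appear.
orders_amp = np.flatnonzero(I_amp > 1e-6 * I_amp.max())
# Phase grating: the Jacobi-Anger expansion puts power into many orders
# (weights J_n(m)), so higher harmonics appear as well.
orders_phase = np.flatnonzero(I_phase > 1e-6 * I_phase.max())

# 3 orders for the amplitude grating; more for the phase grating.
print(orders_amp.size, orders_phase.size)
```

The sinusoidal amplitude grating produces exactly a zero order and two first orders, while the phase grating of the same period spreads power into higher diffraction orders even though its transmission magnitude is uniform.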
 
Thanks, both of you. That clears up my doubts. And it makes sense that the "far-field diffraction pattern (the Fourier transform of the aperture) is in terms of angle". Thanks, Andy.

I still have one question. The text either uses a single-lens system as an example or considers a field propagating in free space. In both cases, the far-field situation is considered. Does this mean Fourier optics is only valid in the far field? I am curious how to apply the Fourier transform if I place a very complicated "transparency" or medium between the object and the image!?
 
Fourier optics analysis is based on the assumption of a linear shift-invariant system. So you can model a very complex optical system under some conditions- Fourier optics will fail for the near field, for example, and must be extended to a vector formalism when the numerical aperture becomes large.

But perhaps the most practical complication is the use of CCD detector arrays- then shift invariance is lost. Aliasing, spurious frequency responses, etc. will occur. However, some analysis is still possible.
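The aliasing point can be illustrated with a 1-D sampling sketch; the 5 µm pixel pitch and the 130 cycles/mm fringe are hypothetical numbers chosen so the fringe sits above the detector's Nyquist limit:

```python
import numpy as np

# A detector with pixel pitch p only resolves spatial frequencies up to
# the Nyquist limit 1/(2p); anything above folds back (aliases).
p = 5e-6                      # assumed pixel pitch: 5 micrometres
nyquist = 1.0 / (2.0 * p)     # 1e5 cycles/m = 100 cycles/mm

N = 200
x = np.arange(N) * p          # pixel positions
u_true = 130e3                # 130 cycles/mm fringe: above Nyquist

samples = np.cos(2.0 * np.pi * u_true * x)
spectrum = np.abs(np.fft.rfft(samples))
u_axis = np.fft.rfftfreq(N, d=p)
u_measured = u_axis[np.argmax(spectrum)]

# The sampling rate is 1/p = 200 cycles/mm, so the fringe shows up
# folded back at 200 - 130 = 70 cycles/mm.
print(u_measured)   # ~70000.0 cycles/m
```

The detector reports a strong 70 cycles/mm component that simply isn't in the optical field, which is why the system stops being a faithful linear shift-invariant channel once the pixel grid undersamples the image.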
 
Thanks a lot. I learned something :)

Andy Resnick said:
Fourier optics analysis is based on the assumption of a linear shift-invariant system. So you can model a very complex optical system under some conditions- Fourier optics will fail for the near field, for example, and must be extended to a vector formalism when the numerical aperture becomes large.

But perhaps the most practical complication is the use of CCD detector arrays- then shift invariance is lost. Aliasing, spurious frequency responses, etc. will occur. However, some analysis is still possible.
 
