# Optics - Obtaining structure of an object using a laser

1. Nov 12, 2014

### The_Foetus

Hi,

Recently I did an experiment to try to discover what some objects look like microscopically, using a laser and looking at their diffraction patterns. We used the fact that the intensity profile you obtain is the Fourier transform of the object you're shining the laser through, so we can recover what the original object looks like by taking the Fourier transform of the image.

However, looking back over it, I'm getting very confused about the mathematics for converting the distances in our Fourier transform to real distances.

We used a camera and MATLAB to do the Fourier transform, and we got a conversion of about 50 pixels/mm for the camera. The difficulty comes when using the coordinates in MATLAB. For instance, for one of our objects, the Fourier transform returned a coordinate distance between peaks of 100 pixels (or 1/pixels?). We used k = (2π/λf)x, but do we get the real distance between peaks to be 100·(λf/2π), or what? Apologies if this sounds a bit confusing; I'll try to compose my sentences better if it's too unclear.
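For what it's worth, here is a minimal sketch of that unit conversion in Python/NumPy (a stand-in for the MATLAB analysis). The wavelength, focal length, and image width are made-up numbers; only the 50 pixels/mm calibration and the 100-pixel peak separation come from the post:

```python
import numpy as np

# Hypothetical numbers standing in for the setup described above.
wavelength = 633e-9    # He-Ne laser wavelength (m), assumed
focal_length = 0.10    # transform-lens focal length f (m), assumed
pixels_per_mm = 50.0   # camera calibration quoted in the post
n_pixels = 1024        # width of the image fed to the FFT, assumed
peak_sep_bins = 100    # FFT peak separation quoted in the post (bins)

pixel_pitch = 1e-3 / pixels_per_mm   # detector pixel size (m)

# In the Fraunhofer geometry, detector position x maps to object spatial
# frequency via k = (2*pi/(lambda*f)) * x, as in the post.  One FFT bin of
# the recorded pattern spans 2*pi/(n_pixels*pixel_pitch) in inverse detector
# length, so a peak separation of m bins corresponds to a real-space distance
#   d = lambda * f * m / (n_pixels * pixel_pitch)
real_distance = wavelength * focal_length * peak_sep_bins / (n_pixels * pixel_pitch)
print(f"real-space distance: {real_distance * 1e6:.1f} um")
```

This is exactly the OP's 100·(λf/2π) guess, multiplied by the per-bin k spacing 2π/(NΔx) of the FFT of the detector image.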

2. Nov 12, 2014

### Andy Resnick

I'm not exactly sure about your setup and data analysis, but if I understand correctly, your image is better thought of as 'reciprocal space' (or angle) rather than 'pixels/mm', and 50 pixels/mm sounds like you have ginormous pixels... Anyhow, it should be 'pixels/steradian' (or 'pixels/radian' in one dimension), and the FFT image would then have units of length (and 1/pixels).

I forget the exact conversion factor, but IIRC, say your sensor subtends 0.01 radian (the actual number will depend on the sensor size and distance from the object). Then each pixel of the FFT image will represent (2π/0.01)·λ units of length. I think... I need to verify this, but my references are at the office.

3. Nov 13, 2014

### M Quack

Believe it or not, this technique is frequently used with X-rays. It is called coherent diffraction imaging. You can get resolutions down to 10 or 20 nanometers with this.

http://en.wikipedia.org/wiki/Coherent_diffraction_imaging

A related technique is Ptychography

http://en.wikipedia.org/wiki/Ptychography

The main problem with the Fourier transform is that you only measure the amplitude of the Fourier signal; the phase is unknown. To do the back-transformation from the diffraction signal to the real-space object you need both amplitude and phase.

One way of recovering the phase is to oversample, i.e. to measure many more amplitudes than are needed for the reconstruction, and use this redundancy to find a phase/amplitude solution that is consistent with all observed amplitudes.
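The phase recovery described above is usually implemented as an iterative projection algorithm. A toy error-reduction loop (the Gerchberg-Saxton/Fienup family), with a made-up object and an assumed-known support, might look like this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy real-space object on a 64x64 grid: a small rectangle (hypothetical).
n = 64
obj = np.zeros((n, n))
obj[28:36, 24:40] = 1.0
support = obj > 0                         # support constraint, assumed known

# Only the Fourier-domain magnitudes are "measured" in the experiment.
measured_amp = np.abs(np.fft.fft2(obj))

# Error reduction: alternately enforce the measured Fourier magnitude and
# the real-space constraints (support and non-negativity), then repeat.
estimate = rng.random((n, n))
for _ in range(200):
    F = np.fft.fft2(estimate)
    F = measured_amp * np.exp(1j * np.angle(F))   # magnitude constraint
    estimate = np.real(np.fft.ifft2(F))
    estimate[~support] = 0.0                      # support constraint
    estimate = np.clip(estimate, 0.0, None)       # non-negativity

err = np.linalg.norm(estimate - obj) / np.linalg.norm(obj)
print(f"relative reconstruction error: {err:.3f}")
```

Production algorithms (hybrid input-output, difference map, etc.) add feedback terms to escape stagnation, but the two alternating constraints are the core idea, and the oversampling is what makes the support constraint informative.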

4. Nov 13, 2014

### Andy Resnick

I think you mean 'intensity' instead of 'amplitude'.

Yes, with the increasing availability of area detectors, this technique (also called static light scattering) is gaining popularity. Unfortunately, I couldn't find the scale factor that provides the proper units: there's the scattering vector q = 4π sin(θ/2)/λ and the corresponding length l = 2π/q, but none of my image processing books provide a clear relationship between a digital image of (say) a Laue pattern and the corresponding real-space length scale obtained by a DFT.

5. Nov 14, 2014

### M Quack

Yes, intensity of the light, which gives you the amplitude of the electric field by taking the square root...

BTW, in Bragg's law the scattering angle (the angle between the incident and exit beams) is usually denoted by 2θ (θ is the Bragg angle); then q = 4π sin(θ)/λ.

To get the scattering angle 2theta you need to know the sample-to-detector distance.

tan(2θ) = (number of pixels) × (pixel size) / (sample-to-detector distance).

"number of pixels" is the distance, in pixels, of your measurement point to the point where the incident beam hits the detector, assuming the detector is perpendicular to the incident beam.
