# Hanbury Brown and Twiss effect explanation

1. Jun 7, 2014

### inkskin

I've visited a great many sites and read papers trying to fully understand this, but I still have some confusion regarding the Hanbury Brown and Twiss effect.

Classically speaking, we treat the light as waves. The math yields a correlation function which boils down to a constant term and a cosine term. Then what?

Quantum mechanically speaking, the wave function reaching the detectors is 1/sqrt(2)*(A(1)B(2)+B(1)A(2)), the A's and B's being the wavefunctions of the photons on paths 1 and 2 at detectors A and B. Is this the reason? That the wavefunction is a superposition of states? I don't quite understand this fully. Or is it because bosons have unit spin, and hence must have a symmetric spatial wavefunction (thus leading to bunching), with the opposite for fermions?

Also, does the light have to be non-coherent for it to work? It has something to do with lasers having a Poisson distribution, and one needing super-Poissonian (non-coherent) light for the experiment. Why?

I realise this is a lot of questions, but I'm utterly confused. Any help would be appreciated. Thanks!

2. Jun 8, 2014

### atyy

The wave function is due to symmetrization for bosons. There is the opposite effect for fermions.

The Hanbury Brown and Twiss effect doesn't work for lasers, because the effect assumes that the two sources are uncorrelated.

Take a look at section IV and V of http://arxiv.org/abs/nucl-th/9804026 .

3. Jun 8, 2014

### Cthugha

The HBT experiment is more or less a clever way to measure the variance of the photon number distribution of your light field. Or using a different wording: One looks for correlations.

To check whether correlations are present, one will typically compare the joint detection rate to the joint detection rate expected if there were no correlations - the latter is just the product of the mean detection rates <n>. Assume for simplicity that we simply sample the same field twice. Then this ratio will be:

$$g^{(2)}=\frac{\langle: n^2 :\rangle}{\langle n \rangle^2}.$$

This is the correlation function. A value of 1 will indicate the absence of correlations. The thing you measure in the HBT-effect is the normal-ordered correlation function (the two ":" are there to indicate that). Normal-ordering of the underlying photon operators just means that you correctly incorporate the effect of the detection of the first photon: the light field will now have one photon less:

$$g^{(2)}=\frac{\langle n (n-1) \rangle}{\langle n \rangle^2}.$$

Now the instantaneous photon number will be the mean photon number with some deviation added:

$$g^{(2)}=\frac{\langle (\langle n \rangle +\delta) (\langle n \rangle +\delta -1) \rangle}{\langle n \rangle^2}.$$

Now you can evaluate all these terms. The mean value of the deviation should vanish. The expectation value of the square of the deviation survives. This is the variance of the photon number distribution. That leaves us with three surviving terms:

$$g^{(2)}=\frac{\langle n \rangle^2 +\langle \delta^2 \rangle- \langle n\rangle }{\langle n \rangle^2}=1+\frac{\langle \delta^2 \rangle}{\langle n \rangle^2}-\frac{1}{\langle n \rangle}.$$
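As a side note, the decomposition above is an exact identity for any photon-number statistics, which is easy to check numerically (a Python sketch with an arbitrary made-up distribution):

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw photon numbers from an arbitrary discrete distribution.
n = rng.integers(0, 10, size=100_000)

mean = n.mean()
var = n.var()  # population variance, <n^2> - <n>^2

# Left-hand side: normal-ordered correlation <n(n-1)> / <n>^2.
g2_direct = np.mean(n * (n - 1)) / mean**2

# Right-hand side: 1 + <delta^2>/<n>^2 - 1/<n>.
g2_decomposed = 1 + var / mean**2 - 1 / mean

print(g2_direct, g2_decomposed)  # the two expressions agree
```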

Now one can perform a sanity check and evaluate this result for three typical states of the light field:

a) A single photon. A non-classical single photon state has no noise at all and a photon number of 1. This will leave us with a g2 of 0. The correlation is negative. If you have exactly one photon and detect it, it is gone and you will not be able to detect another photon afterwards.

b) Thermal light. Thermal light follows the Bose-Einstein distribution. For a mean of <n> it has a variance of <n>^2 + <n>. That leaves us with a value of 2. The detection events are correlated.

c) Coherent light. This follows a Poissonian distribution. For a mean value of <n>, this distribution has a variance of <n> (so a standard deviation of sqrt(<n>)). Inserting that into g2, you will find that the last two terms cancel exactly and you are left with a value of 1. The shot noise contribution exactly cancels the contribution you get by destroying a photon with the first detection. This is related to the fact that coherent states are eigenstates of the photon annihilation operator: they are immune to loss. So the detections will be completely uncorrelated for laser light.
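These three cases are easy to verify numerically (a Python sketch; the mean value and the truncation cutoff are my own arbitrary choices):

```python
import math

def g2(probs):
    """g2 = <n(n-1)> / <n>^2 for a photon-number distribution
    given as a dict {n: probability}."""
    mean = sum(n * p for n, p in probs.items())
    pairs = sum(n * (n - 1) * p for n, p in probs.items())
    return pairs / mean**2

# a) Single-photon Fock state: exactly one photon, no noise.
fock = {1: 1.0}

# b) Thermal (Bose-Einstein, i.e. geometric) distribution with mean nbar.
nbar = 2.0
q = nbar / (1 + nbar)
thermal = {n: (1 - q) * q**n for n in range(120)}

# c) Coherent (Poissonian) distribution with the same mean.
coherent = {n: math.exp(-nbar) * nbar**n / math.factorial(n)
            for n in range(120)}

print(g2(fock), g2(thermal), g2(coherent))  # 0, ~2, ~1
```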

The interpretation of g2 is straightforward. It gives you the probability of detecting a photon pair, normalized to the same probability for a light field of the same mean photon number but with all photons being statistically completely independent.

So the HBT effect does not work for laser light because the two light beams are uncorrelated for laser light, while you need the classical correlations of thermal light to see the HBT effect in terms of bunching.

There are more ways of getting the same result, for example the interference approach first introduced by Fano, but in my opinion it is easier to first digest the simple scenario given above before going on to more complicated explanations.

4. Jun 15, 2014

### inkskin

Thank you so much for your responses. They have been incredibly useful and cleared up some doubts. However, one thing I am still not absolutely certain of is why fermions and bosons act differently. Which property of their wavefunction causes this?
Also, in setting up the HBT experiment, the ideal source would be a single photon source, right? However, if I'm unable to obtain that, can a laser (in spite of being coherent) be attenuated with filters to give very few photons and hence be used as a source? Can any other kind of laser be used? Or some other source? And while collecting the data, experimentally, do they all display bunching?

5. Jun 15, 2014

### UltrafastPED

G. I. Taylor was able to get down to single photon levels by very simple and inexpensive means ... in 1908! See http://spiff.rit.edu/classes/phys314/lectures/dual3/dual3.html

6. Jun 15, 2014

### Cthugha

When you evaluate the probability amplitudes for indistinguishable two-particle detection events (like two indistinguishable photons or electrons being detected at two different detectors), you get an interference term which picks up a sign from the (anti)commutation relations. As the symmetry properties of bosons and fermions are different (symmetric vs. antisymmetric wave functions), this term gets a different sign for bosons and fermions. For bosons, indistinguishable pathways leading to the same final result will interfere constructively, while you get destructive interference for fermions.
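A minimal numeric sketch of this sign difference (the geometry, wavenumbers, and detector positions are invented purely for illustration):

```python
import numpy as np

# Toy model: sources A and B emit with wavenumbers kA and kB;
# amp(k, x) is the single-particle amplitude for reaching a
# detector at position x.
def amp(k, x):
    return np.exp(1j * k * x) / np.sqrt(2)

kA, kB = 10.0, 10.5
x1, x2 = 0.0, 0.8   # detector positions

# Two indistinguishable pathways lead to a joint detection at x1, x2:
# (A -> x1, B -> x2) and (A -> x2, B -> x1).
path1 = amp(kA, x1) * amp(kB, x2)
path2 = amp(kA, x2) * amp(kB, x1)

p_boson = abs(path1 + path2) ** 2            # symmetric: constructive
p_fermion = abs(path1 - path2) ** 2          # antisymmetric: destructive
p_distinguishable = abs(path1) ** 2 + abs(path2) ** 2

print(p_boson, p_distinguishable, p_fermion)
```

For coincident detectors the boson rate is exactly twice the distinguishable-particle rate, while the fermion rate vanishes.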

No, this is pretty much the worst source you can use - at least if you want to see the HBT effect in terms of photon bunching. If you have a look at my last post, you will see that single photons show a g2 of 0, which is antibunching - the exact opposite of the HBT effect. This result is pretty trivial. A single photon will never be detected at two detectors simultaneously, so you will never get simultaneous clicks at the two detectors. So pretty much the only thing an HBT setup can do with single photons is show that they really are single photons. This is a standard way of using the setup in labs all around the world, as this experiment is the gold standard for showing that photons are particles and that one really has a single photon source, but you will not get photon bunching that way.

No, an attenuated laser will never give you single photons. A real single photon source will give you one photon at most at a single time. A weak laser can give you one photon on average, but you can never switch off the fluctuations.

And: no, as I wrote in my last post, only thermal light will show bunching (g2=2). Laser light and single photons will not show that. However, most readily available thermal light sources are spectrally broad and thus have a short coherence time, so typical detector time resolutions will be too coarse to see it.

The most practical approach for creating a good light source to see the HBT effect is a pseudothermal Martienssen lamp. You take some laser light (work safely - it may be dangerous) and focus it onto a ground glass disk. You will see some characteristic speckle pattern. Now mount the ground glass disk on some rotating motor (those designed for model planes work well) and rotate it. You will now see a rotating speckle pattern. If you use a narrow pinhole and just use a very small part of the scattered light, this light field will behave exactly like thermal light and will show photon bunching. The coherence time will depend mostly on how fast the ground glass disk gets rotated. Typically you will get coherence times in the low microsecond range.
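Such a pseudothermal source is also easy to model numerically. A Monte Carlo sketch (the numbers are my own illustrative parameters, not from a real setup): the pinhole intensity of a thermal-like speckle field is exponentially distributed, and photon counting on top of it is Poissonian, which gives g2 close to 2:

```python
import numpy as np

rng = np.random.default_rng(42)

# Intensity through the pinhole of a rotating speckle pattern:
# exponentially distributed, as for thermal light.
mean_intensity = 3.0
intensity = rng.exponential(mean_intensity, size=200_000)

# Photon counting on top of the fluctuating intensity is Poissonian.
n = rng.poisson(intensity)

g2 = np.mean(n * (n - 1)) / np.mean(n) ** 2
print(g2)  # close to 2, the thermal bunching value
```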

7. Jun 16, 2014

### inkskin

Suppose you were NOT to sample the same field, and instead looked at the two detectors for joint detections. Then <:n^2:> wouldn't become <n(n-1)>, right? Because the mean detection rate at each detector is still n. The n-1 comes in after the detection of one photon, right? But for the detection of the second photon we look to the second detector, at which a photon hasn't been detected before, and hence it still gives n. And the first detector also gives n, and it's not necessary to account for the previous detection of a photon there. We're only looking for coincidences at the second detector, now that the first has already detected a photon.

Also, if the value of the correlation function is 2, why does this imply correlation?

8. Jun 16, 2014

### Cthugha

Yes, but the HBT-effect is about autocorrelation. If you take two completely independent light fields - no interaction with the same system, not originating from the same source, not subject to the same noise sources - nothing will happen.

g2 is a direct measure of deviations from statistical independence. If you take the time resolved version g2(tau), you can directly interpret it as the relative probability to detect a second photon at time tau if you already detected one at time zero. This relative probability is normalized to the probability you would get for equivalent light fields of the same intensity and statistically independent photon detection events.

So by definition, a value of g2=1 means that you have no correlations. Any deviation means you will get correlations (g2(tau)>1) or anticorrelations (g2(tau)<1). g2>1 just means that it is more probable to detect a second photon at a time delay tau. Consider some kind of cascaded decay that starts randomly, where some system will go from an excited state to an intermediate state and from there down to the ground state. Let us assume that it will emit a photon of the same energy during both processes.

As the process starts randomly, when repeating it many times you will get some equally distributed mean intensity over time. However, the intermediate state will have some finite lifetime, so the second photon will have a rather fixed delay compared to the first one. If you measure g2, you will find that it has a pretty large value at a delay tau that corresponds to the lifetime of that intermediate state.

Or consider a light field that is incredibly noisy. It has some mean intensity, but the photon number is fluctuating by huge amounts. In that case, if you detect a photon now, the probability that the instantaneous photon number is way above the mean photon number is large and the probability to detect a second photon at short delays will be huge, just because you are still way above the mean photon number. This will give a light field with g2>1 for small delays tau.

9. Jun 16, 2014

### inkskin

Thanks a lot. Lastly, I must ask: the quantum interpretation of this is still not very clear to me. I have read a couple of things about 2 sources and 2 detectors, and the probability of a photon at detector 2 having been released by source 1 or 2, or a superposition of both. But is there any concrete way to show this with operators? How does that work?

10. Jun 16, 2014

### Cthugha

I am not exactly sure what you are asking here. Do you want some deeper understanding of why bosons and fermions give different results? This is a consequence of the spin-statistics theorem (http://en.wikipedia.org/wiki/Spin–statistics_theorem) and the symmetry/antisymmetry of many-body wavefunctions.

Or do you rather want a more intuitive picture of the double detection process? In that case, things are similar to standard QM problems. If you want to calculate the probability for some process happening, you add all the indistinguishable probability amplitudes leading to the same result and square the sum to get the probability (e.g. if there is no way at all to find out whether a photon was emitted by source 1 or 2). Afterwards you add the probabilities (not the amplitudes) for distinguishable events leading to the same final result (e.g. if you know whether a photon was emitted from source 1 or 2).

This is not too different from the double slit. If you know which slit a particle went through, you will not get interference. Only if that information is not available will you get interference terms. The HBT effect is similar, but works "one order higher": instead of interference in the fields, you get interference in the intensities. The quantum interpretation of the HBT effect is given in this paper: http://scitation.aip.org/content/aapt/journal/ajp/29/8/10.1119/1.1937827 (U. Fano, Am. J. Phys. 29, 539 (1961)), but it involves quite a bit of math. I am not sure whether it is appropriate for your level of physics education or not.

11. Jun 16, 2014

### inkskin

So the n here represents the number of photon counts in each arm, am I right? The joint detection rate depends on the ensemble average of detections at each arm, <n^2>. Say one photon has been detected at detector 1. Given this, in a small time tau we are looking at detector 2, which has a probability of detecting any one of the n photons.

So it should still be <n^2>, right? The first detection is irrelevant now, and hence doesn't have to be accounted for. It only acts as a 'timer' of sorts.

12. Jun 16, 2014

### inkskin

Also, you suggested using a pseudothermal Martienssen lamp. Why this? Why won't light from any thermal source suffice to display correlations? Is noise the problem?

13. Jun 16, 2014

### Cthugha

n is the instantaneous photon number without any averaging. It will fluctuate over time. Maybe even strongly so. So when you have 20 photons and one gets detected, you will have 19 left to detect on the other side. When you have only one photon and it gets detected, you will have none left on the other side. What you do now is averaging correctly. You take the photon number distribution of your light field and for each possible photon number you calculate the probability for that photon number to occur, then the probability of a photon being detected at detector one, consider the remaining photons and consider the probability that another one gets detected at detector two.
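This averaging can be written out explicitly for a toy two-valued distribution (a sketch echoing the 20-photon/1-photon numbers above; the 50/50 weighting is my own choice):

```python
# Toy photon-number distribution: half the time 1 photon, half the time 20.
dist = {1: 0.5, 20: 0.5}

mean = sum(n * p for n, p in dist.items())

# Pair-detection rate: detect one photon (proportional to n), then one
# of the n - 1 remaining photons at the other detector.
pair_rate = sum(n * (n - 1) * p for n, p in dist.items())

# Naive rate assuming statistically independent detections.
naive_rate = mean ** 2

g2 = pair_rate / naive_rate
print(g2)  # > 1: the strong fluctuations produce bunching
```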

It does not only act as a timer. Photon detection is a statistical process and the probability that one actually gets detected may be somewhat low. So the probability that a first photon gets detected at all is significantly higher if the instantaneous photon number right now is high. Therefore, the first detection means that you have a rather high probability of having a large photon number right now.

Oh, it will display correlations, but these will vanish on a timescale on the order of the coherence time. For light from the sun this means 100 femtoseconds or so. Typical semiconductor thermal light sources are somewhere in the picosecond range. The typical time resolution of good photo diodes is in the nanosecond range. Trying to measure the correlations that way is like trying to measure the thickness of a hair using a standard ruler. It will not work. You can use an atom based laser (not a semiconductor laser) below threshold, but I doubt you have one available. The Martienssen lamp is cheap and convenient.

14. Jul 18, 2014

### inkskin

Sorry for the delayed response; I was out of town. Thank you for all this - you've been a HUGE help in understanding this.

Why is it that I cannot attenuate a laser enough to generate single photons? I understand that it's a quantum state, but if I were to space- and time-gate a tubelight, then in principle I should be able to!

15. Jul 18, 2014

### Cthugha

For a laser the exact photon number in any time interval is not exactly defined, and dimming the intensity is not a deterministic but a stochastic process.

Let us say for simplicity that you start with around 1000000 photons. The photon number fluctuations will be on the order of the square root of that. Now attenuation is a process similar to putting a beam splitter in. For each photon in the beam you get a certain probability that it will be removed from the beam. You can repeat this up to the point that you will have only one photon or less on average, but due to the stochastic nature of attenuation, some finite probability of having more than one photon present will necessarily remain.
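The stochastic nature of attenuation can be made concrete with a small calculation (a sketch; binomial thinning of a Poisson distribution yields another Poisson distribution, so the two-or-more-photon probability never reaches zero):

```python
import math

# Attenuation: each photon independently survives with probability t.
# A Poisson photon number with mean lam stays Poissonian with mean
# t * lam, for which P(n >= 2) = 1 - exp(-mean) * (1 + mean).
def p_two_or_more(mean):
    return 1 - math.exp(-mean) * (1 + mean)

lam = 1_000_000.0     # bright beam
t = 1e-6              # strong attenuation -> one photon on average
attenuated_mean = t * lam

print(p_two_or_more(attenuated_mean))  # about 0.26: photon pairs remain
```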

For "real" single photons you need a non-linearity. Take a single atom. It has one excited state and one ground state, so it saturates at an excitation number of one. If it emits a photon it returns to the ground state and you need some time to pump it back to the excited state again. During this time no second photon can be emitted. So here the blockade ensures that no second photon can be emitted in terms of a deterministic process, not just a stochastic one.

16. Jul 21, 2014

### inkskin

Okay, I understand that. But if I were to hypothetically gate an already attenuated source from, say, a tubelight, and gate it in time and space such that I allow exactly one quantum state - one photon - then that would be it, right? It may not be physically possible, but theoretically!

17. Jul 21, 2014

### Cthugha

Well, you would need something like an "active gate". As you never know how many photons are present, just opening and closing a gate for some amount of time will not help. So you need a non-linear gate: something that closes if more than one photon is present and transmits light if only one photon is present.

People are working on that. Consider for example a cavity which transmits in a narrow spectral window. If the refractive index of the material inside the cavity changes, the transmission window will change, too. The refractive index depends on intensity, so you can create a gate that transmits small photon numbers, but not large ones. Now if you find some system where even the difference between one and two photons present induces such a shift, you have your non-linear gate. People have done similar stuff, for example in the group of Lukin. See "Quantum nonlinear optics with single photons enabled by strongly interacting atoms", Nature 488, 57–60 (02 August 2012). http://www.nature.com/nature/journal/v488/n7409/full/nature11361.html

There might be a free version available on the ArXiv in case you are interested and do not have a subscription.

Last edited: Jul 21, 2014
18. Jul 21, 2015

### YuryM

>Now you can evaluate all these terms. The mean value of the deviation should vanish. The expectation value of the square of the deviation survives. This is the variance of the photon number distribution. That leaves us with three surviving terms:
>
>$$g^{(2)}=\frac{\langle n \rangle^2 +\langle \delta^2 \rangle- \langle n\rangle }{\langle n \rangle^2}=1+\frac{\langle \delta^2 \rangle}{\langle n \rangle^2}-\frac{1}{\langle n \rangle}.$$
>
>Now one can perform a sanity check and evaluate this result for three typical states of the light field:

What if we send an equal superposition of the 0 and 1 states? \delta is 0.5 then, and so is <n>. We end up with g2 = -1. On the other hand it seems obvious that we should get g2 = 1, since there is no possible cross-correlation.

19. Jul 21, 2015

### Cthugha

No, you do not end up with -1; g2 cannot even become smaller than 0. The first term trivially gives a +1 all the time. As you noted, delta is ±0.5 here and <n> is 0.5, so <δ^2>/<n>^2 = 0.25/0.25 and the second term also is +1. The third term is the negative inverse mean photon number, which gives a -2 here. So in sum you get 0. This is trivially the result you will get for any mixture just containing 0-photon and 1-photon Fock states, as one will never detect photon pairs from this light field.
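Spelled out numerically (a trivial sketch of the arithmetic above):

```python
# Equal mixture of 0-photon and 1-photon states.
probs = {0: 0.5, 1: 0.5}

mean = sum(n * p for n, p in probs.items())               # 0.5
var = sum((n - mean) ** 2 * p for n, p in probs.items())  # 0.25

# Term by term: 1 + <delta^2>/<n>^2 - 1/<n> = 1 + 1 - 2 = 0.
g2 = 1 + var / mean**2 - 1 / mean
print(g2)  # 0.0: no photon pairs are ever detected
```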

20. Jul 21, 2015

### YuryM

Thank you for quick response.
Oops, of course, 0. Or perfect anti-correlation. Is this what it should be? Sure, one never detects photon pairs, and when one detector reads 1 the other reads 0, but if one is 0, the other is not necessarily 1. Sounds like partial anti-correlation.

Actually, I am struggling with the counter-intuitive (to me) fact that a half-mirror does not change g_2 when splitting thermal light at very low photon counts. Whether one exposes two detectors to the same beam, or the beam is split and each half falls onto its own detector, g_2 is the same. I understand that a mirror never changes the photon number distribution or its autocorrelations; however, the fact that it does not change the cross-correlation is strange to me.

Another question - what determines g_2 in the HBT experiment? The photon number distribution is not sufficient, is it? For example, if I feed a thermal, i.e. exponential aka geometric, distribution into a "mirror simulator" (each photon goes either left or right; wave properties are ignored), I get a g2 which goes to 0 as <n> -> 0. Am I making a mistake in my simulations somewhere? Apart from treating photons as particles and forgetting entirely about the wave part of quantum mechanics?
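For comparison, a particle-only Monte Carlo of the mirror can be sketched as follows (assumed parameters; thermal, i.e. geometric, counts split photon by photon, normalized by the unconditional singles rates). The cross-correlation stays near 2 regardless of how small the mean photon number is:

```python
import numpy as np

rng = np.random.default_rng(7)
samples = 1_000_000

def cross_g2(nbar):
    # Thermal (Bose-Einstein / geometric) photon numbers with mean nbar.
    q = nbar / (1 + nbar)
    n = rng.geometric(1 - q, size=samples) - 1  # shift support to {0,1,...}
    # Each photon independently goes left or right at the mirror.
    n_left = rng.binomial(n, 0.5)
    n_right = n - n_left
    return np.mean(n_left * n_right) / (np.mean(n_left) * np.mean(n_right))

print(cross_g2(2.0), cross_g2(0.2))  # both stay near 2
```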