Slit diffraction: time between emission and detection of photon

SUMMARY

The thread examines how the mean detection time, T_{mean}, depends on the number of slits N in a diffraction experiment. The reply argues that as N increases, the probability of detecting the photon at a given screen position rises, so T_{mean} decreases: it is highest for a single slit (N=1) and tends to zero as N \to \infty, where detection becomes near-certain.

PREREQUISITES
  • Understanding of photon emission and detection processes.
  • Familiarity with slit diffraction and interference patterns.
  • Knowledge of probability distributions, particularly Gaussian distributions.
  • Basic principles of experimental physics and data collection methods.
NEXT STEPS
  • Explore the mathematical formulation of slit diffraction patterns.
  • Investigate the role of photon statistics in quantum mechanics.
  • Learn about the implications of increasing slit numbers on interference patterns.
  • Study the experimental setup and data analysis techniques for photon detection experiments.
USEFUL FOR

Physicists, experimental researchers, and students studying quantum mechanics and wave optics will benefit from this discussion, particularly those interested in photon behavior and diffraction phenomena.

m.e.t.a.
For this question I am considering a slit diffraction experiment set up as follows:

{Monochromatic source} ------> {Single slit} ------> {Diffraction grating with N slits} ------> {Screen with small movable detector}

The monochromatic light source emits photons one at a time. The principal interference maximum occurs at position x = 0 on the screen. The detector is placed at some point x on the screen where the probability of detecting the photon is non-zero (and x \ne 0). The detector registers every photon that arrives between positions x and x + \Delta x.

Photons are emitted one by one at a slow rate. Every time a photon is emitted, a stopwatch is started. If the photon is detected at the detector, the stopwatch is stopped and that time measurement, T, is logged. If the photon is not detected, no measurement is recorded and the experiment is run again with a new photon.

The experiment is repeated many times. Finally, a probability distribution is plotted: T vs. probability of T. (I presume that this distribution will be approximately Gaussian in shape, although its exact shape is not important here.) The distribution will be centred around some mean value of T, T_{mean}.
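To make the logging protocol concrete, here is a minimal Monte Carlo sketch of it (added for illustration; the path length, per-photon detection probability, and timing jitter below are assumed values, not anything stated in the post):

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Illustrative assumptions (not from the post): straight-line flight,
# a fixed per-photon detection probability, and Gaussian timing jitter.
path_length = 1.0   # emitter-to-detector distance (m)
c = 3.0e8           # speed of light (m/s)
p_detect = 0.05     # probability a photon lands in [x, x + dx]
jitter = 2e-12      # timing spread of the apparatus (s)

n_runs = 100_000
hit = rng.random(n_runs) < p_detect                     # which runs register a photon
T = path_length / c + rng.normal(0.0, jitter, n_runs)   # stopwatch reading per run
T_logged = T[hit]                                       # undetected runs are discarded

print(f"logged runs: {T_logged.size} of {n_runs}")
print(f"T_mean      = {T_logged.mean():.3e} s")         # centre of the T histogram
```

In this toy model the histogram of T_logged comes out approximately Gaussian around path_length/c, which is the T_mean the post refers to.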

Suppose that the experiment is run three times with different numbers of slits:

(i) N=1

(ii) N=2

(iii) N \to \infty


My question: will T_{mean} vary in each case? If so, how?

(This is a stripped-down version of a longer question I posted a few days ago, https://www.physicsforums.com/showthread.php?p=2695689#post2695689.)
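For reference (a standard textbook result, added here for context rather than taken from the thread): for N identical, equally spaced point slits with spacing d, the far-field interference factor at angle \theta is

I_N(\theta) \propto \left[ \frac{\sin(N\delta/2)}{\sin(\delta/2)} \right]^2, \qquad \delta = \frac{2\pi d}{\lambda} \sin\theta.

For N=1 this factor is constant (only the single-slit envelope remains), for N=2 it reduces to 4\cos^2(\delta/2) (the familiar double-slit fringes), and as N \to \infty the light concentrates into increasingly sharp principal maxima.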
 
Yes, T_{mean} will vary in each case. As the number of slits increases, the diffraction pattern becomes more sharply structured and the probability of detecting the photon at a given position on the screen increases. The mean detection time T_{mean} therefore decreases. As N \to \infty, the probability of detecting the photon becomes effectively 1, so T_{mean} tends to 0.
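As a rough numerical companion to the probability part of this claim, here is a sketch that estimates the fraction of emitted photons landing in the detector window for each N (the wavelength, slit spacing, geometry, and window below are assumed values, and the single-slit envelope is ignored):

```python
import numpy as np

# Assumed, illustrative parameters (not from the thread)
lam = 500e-9              # wavelength (m)
d = 2e-6                  # slit spacing (m)
L = 1.0                   # grating-to-screen distance (m)
x_det, dx = 0.01, 0.002   # detector window [x, x + dx] on the screen (m)

def grating_factor(N, xs):
    """N-slit interference factor [sin(N*delta/2)/sin(delta/2)]^2 for point slits."""
    delta = 2 * np.pi * d * np.sin(np.arctan2(xs, L)) / lam
    num, den = np.sin(N * delta / 2), np.sin(delta / 2)
    out = np.full_like(xs, float(N) ** 2)   # limiting value at principal maxima
    ok = ~np.isclose(den, 0.0)
    out[ok] = (num[ok] / den[ok]) ** 2
    return out

xs = np.linspace(-0.05, 0.05, 20001)        # sampled screen coordinates
for N in (1, 2, 200):                       # 200 stands in for N -> infinity
    I = grating_factor(N, xs)
    win = (xs >= x_det) & (xs <= x_det + dx)
    frac = np.trapz(I[win], xs[win]) / np.trapz(I, xs)
    print(f"N = {N:3d}: fraction of photons landing in the window = {frac:.4f}")
```

Whether this window fraction rises or falls with N depends on where x sits relative to the principal maxima, so the sketch is only meant to show how such a detection probability could be estimated for the three cases.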
 
