Geiger counters and measurement

  • Thread starter: jeeves
  • Tags: Measurement

Summary
The discussion centers on the quantum mechanics of measurement, specifically regarding the state of an unstable particle and a Geiger counter after a period without detection of decay. Two main interpretations are debated: one where the particle remains in a superposition of decayed and non-decayed states, and another where it is definitively in a non-decayed state due to the lack of a counter click. Participants emphasize the importance of entanglement between the particle and the counter, arguing that without measurement, the system cannot be said to collapse to a specific state. The conversation highlights that the observed outcomes will differ based on the state of the system at the time of observation. Ultimately, the state of the system remains a complex interplay of probabilities rather than a definitive condition until a measurement is made.
  • #91
It's indeed a mathematical fact that the exponential-decay law is an approximation. It is derived from first-order perturbation theory, neglecting the re-interaction between the decay products, and leads to Breit-Wigner distributions for the transition amplitudes in the energy representation, which corresponds to the exponential decay law in the time domain. It is precisely this neglect of the re-interaction between the decay products that leads to the exponential decay law, i.e., it neglects the possibility of going back to the original state after the decay. This is a good approximation in many cases, but not, e.g., in cases where the decay is "close to threshold", as demonstrated in the papers cited above. I think the Nature article nicely summarizes this state of affairs.
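For reference, here is the one-line version of that connection (a textbook Wigner-Weisskopf sketch with ##\hbar=1##, not taken from the cited papers): a Breit-Wigner amplitude in the energy representation Fourier-transforms into an exponentially decaying amplitude in time,
$$a(t)=\int_{-\infty}^{\infty}\frac{dE}{2\pi}\,\frac{e^{-iEt}}{E-E_0+i\Gamma/2}=-i\,e^{-iE_0 t}\,e^{-\Gamma t/2}\quad(t>0),$$
so the survival probability is ##|a(t)|^2=e^{-\Gamma t}##. The approximation is visible already here: a physical spectrum is bounded from below, so the integral cannot really run over the whole real line, and the deviations show up at very short and very long times.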
 
  • #92
jeeves said:
We have come to the conclusion in this thread that when the detector is not moving (e.g. it is a cat in a box), non-observation does not count as a "measurement" and will not produce a quantum Zeno effect, because the cat/detector does not directly probe the nucleus, it only interacts with the decay product after the decay.
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system, the statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks. A good quantum theory that is robust in such indirect measurement scenarios would therefore be useful.

[*] By a good quantum theory I mean a suitable dynamics (including the dynamics that entangles the measured system and the measurement device) and a suitable initial state.
 
  • #93
Morbert said:
But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Why do you assume that such a theory does not already exist? As one interacts with the real world, life gets complicated and theories get more complicated.
I am also at a loss as to why one would not expect repeated measurements to render the Breit-Wigner result insufficient, as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
 
  • #94
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system, the statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks. A good quantum theory that is robust in such indirect measurement scenarios would therefore be useful.

You may find Section X of the following paper useful: https://arxiv.org/abs/quant-ph/0611067

To better understand the nature of continuous measurements, we will now consider in detail an example of how a continuous measurement of position arises in a fundamental physical system: a single atom interacting with light. Again, to obtain weak measurements, we do not make projective measurements directly on the atom, but rather we allow the atom to become entangled with an auxiliary quantum system—in this case, the electromagnetic field—and then make projective measurements on the auxiliary system (in this case, using a photodetector). It turns out that this one level of separation between the system and the projective measurement is the key to the structure of the formalism. Adding more elements to the chain of quantum-measurement devices does not change the fundamental structure that we present here.

The authors say that measuring the EM field is only a "weak measurement" of the state of the atom, so it won't lead to a quantum Zeno effect. I guess this is essentially the "you're measuring the decay product, not the atom" claim being worked out with more math.

What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
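(For concreteness, here is my reading of the weak-measurement structure, in notation that may differ from the paper's: over a short interval ##dt## the photodetector either clicks or does not, with Kraus operators
$$\Omega_1=\sqrt{\Gamma\,dt}\,\sigma_-,\qquad \Omega_0=\mathbb{1}-\left(iH+\tfrac{\Gamma}{2}\sigma_+\sigma_-\right)dt,$$
which satisfy ##\Omega_0^\dagger\Omega_0+\Omega_1^\dagger\Omega_1=\mathbb{1}+O(dt^2)##. Neither outcome operator is a projector onto an atomic state, so a single step disturbs the atom only to order ##dt##; that seems to be the sense in which the measurement is "weak.")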
 
  • #95
Morbert said:
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.
Yes, and this is indeed what QT has done since its discovery in 1925 ;-). The Zeno effect is due to interactions of the system with the measurement devices and the "environment". Measurement devices are also nothing special: they consist of the same building blocks of matter as the observed object and obey no special "rules" other than those described by QT. What else should they be, and how else should they behave?
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system, the statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks. A good quantum theory that is robust in such indirect measurement scenarios would therefore be useful.

[*] By a good quantum theory I mean a suitable dynamics (including the dynamics that entangles the measured system and the measurement device) and a suitable initial state.
It's just the standard dynamics described in any valid textbook of quantum theory.
 
  • #96
jeeves said:
What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
I think the fuzziness they are referring to here is fuzziness in the moment the particle decays, as opposed to fuzziness in the energy of the two-level atom. I.e., instead of

"A detector clicking at time ##t## implies the particle decayed at time ##t'##"

We say

"a detector clicking at time ##t## implies the particle probably decayed at time ##t'\pm\Delta t##"

where the 'probably' gets weaker as ##\Delta t## is made smaller. At the end of post #56 I remarked that I did not think a continuous measurement (taking ##\Delta t \rightarrow 0##) could be normalised. They have introduced a fuzziness (weakness of measurement as ##\Delta t\rightarrow 0##) so that a norm-preserving time evolution can be obtained. This would presumably not satisfy Home and Whitaker, since they derive a Zeno effect from non-continuous, discrete sequences of measurements.
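For comparison, the arithmetic behind the discrete Zeno effect (a standard textbook estimate, sketched here rather than taken from Home and Whitaker): if the short-time survival probability is quadratic, ##P(\tau)\approx 1-(\tau/\tau_Z)^2##, then ##n## ideal measurements at intervals ##\tau=T/n## give
$$P(T)\approx\left[1-\left(\frac{T}{n\tau_Z}\right)^2\right]^n\;\xrightarrow{\;n\to\infty\;}\;1,$$
whereas for exponential decay ##P(\tau)=e^{-\Gamma\tau}## the chained result is ##e^{-\Gamma T}## for every ##n##, so no Zeno effect appears however the measurements are spaced.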
hutchphd said:
Why do you assume that such a theory does not already exist? As one interacts with the real world, life gets complicated and theories get more complicated.
I am also at a loss as to why one would not expect repeated measurements to render the Breit-Wigner result insufficient, as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
I wouldn't. I am instead saying that papers like the Whitaker one above are not using a correct quantum theory, and hence they are predicting statistics contingent on a sequence of indirect measurements when there should be no such contingency.
vanhees71 said:
Yes, and this is indeed what QT does since its discovery in 1925 ;-).

It's just the standard dynamics described in any valid textbook of quantum theory.
To be clearer and more formal: by a quantum theory of the system, I mean a dynamics ##U(t)## and an initial state ##\rho_0## such that, if ##\Pi_s## is the "detector has not clicked" projector, then $$\mathrm{Tr}\left[\Pi_s(T)\rho_0\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\Pi_s(T/2)\rho_0\Pi^\dagger_s(T/2)\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\dots\Pi_s(T/n)\rho_0\Pi^\dagger_s(T/n)\dots\Pi^\dagger_s(T)\right]$$Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts by you, DrChinese, et al., is that their dynamics are not actually correct.
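As a toy numerical illustration of this refinement-independence (a sketch under the assumption that each no-click step just multiplies a survival amplitude; the model and numbers are mine, not from any of the papers): for an exponential amplitude the chained result is independent of ##n##, while for a short-time quadratic amplitude refining the chain pushes the survival probability toward 1.

```python
import numpy as np

# Toy check of the chained no-click condition: compare one coarse
# no-click step over time T against n fine-grained steps of length T/n.
# Assumption: each no-click step multiplies the survival amplitude.

gamma = 1.0   # decay rate for the exponential model
tau_z = 1.0   # Zeno time scale for the quadratic short-time model
T = 0.5       # total observation time

def a_exp(t):
    """Survival amplitude for pure exponential decay."""
    return np.exp(-gamma * t / 2)

def a_quad(t):
    """Short-time (quadratic) survival amplitude, valid for t << tau_z."""
    return 1.0 - 0.5 * (t / tau_z) ** 2

for n in (1, 10, 1000):
    dt = T / n
    p_exp = abs(a_exp(dt) ** n) ** 2    # chained exponential steps
    p_quad = abs(a_quad(dt) ** n) ** 2  # chained quadratic steps
    print(f"n = {n:5d}   exponential: {p_exp:.4f}   quadratic: {p_quad:.4f}")
```

The exponential column is the same for every ##n## (the trace equality above holds), while the quadratic column climbs toward 1 as the chain is refined; that is the Zeno effect, and it lives exactly in the regime where the exponential approximation of post #91 breaks down.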
 
  • #97
The time evolution is due to the Hamiltonian of the system, not to some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
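Concretely, for the decaying atom the relevant dynamics is of the standard Wigner-Weisskopf form (a generic sketch; the couplings ##g_k## are left unspecified):
$$H=\omega_0\,\sigma_+\sigma_-+\sum_k\omega_k\,a_k^\dagger a_k+\sum_k\left(g_k\,\sigma_+ a_k+g_k^*\,a_k^\dagger\sigma_-\right),$$
and a detector enters through further terms coupling to the field modes ##a_k##. Treating the atom-field coupling to first order and ignoring re-excitation is what produces the exponential decay law discussed in #91.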
 
  • #98
vanhees71 said:
The time evolution is due to the Hamiltonian of the system not by some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
I'm using the convention ##\Pi_s(t) = U^{-1}(t)\Pi_sU(t)##. And yes, I agree the dynamics have to include the degrees of freedom of the system that becomes entangled with the atom.
 
  • #99
Morbert said:
Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts by you, DrChinese, et al., is that their dynamics are not actually correct.
I've done some more reading and I think the majority of what has been said in this thread so far is not totally correct (including the analysis I gave a few pages ago).

I believe a correct analysis of a decaying atom with a continuous photodetector is given in "Quantum Zeno effect with general measurements" by Koshino and Shimizu, Physics Reports, 2005. It has hundreds of citations, so I conclude that it is "generally accepted physics." An arXiv version is available.

Section 5 contains the main argument. They show that for a continuous photodetector with small enough response time (much smaller than the initial time period where the atom decays non-exponentially) there is indeed a Zeno effect. The reason we do not see Zeno effects in day-to-day experiments is that the response times of typical photodetectors are many orders of magnitude larger than what would be necessary to get these effects. Their argument includes "indirect measurement," and they show how to extend it to non-continuous, discrete measurements at small time intervals.
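To put a rough scale on "small enough" (my back-of-the-envelope reading, not numbers from the paper): if the survival probability is still quadratic at the measurement interval ##\tau##, i.e. ##P(\tau)\approx 1-(\tau/\tau_Z)^2##, then repeated measurements give an effective decay rate
$$\Gamma_{\rm eff}\approx\frac{\tau}{\tau_Z^2},$$
which vanishes as ##\tau\to 0##. For an optical transition the non-exponential window is extremely short (it is set by the inverse bandwidth of the atom-field coupling), while ordinary photodetectors respond many orders of magnitude more slowly, which is why the effect is invisible in day-to-day experiments.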

I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
 
  • #100
jeeves said:
I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
The Koshino paper is interesting, particularly in the way it dissolves the distinction between a direct and an indirect measurement, so we might indeed see a Zeno effect in the Home–Whitaker thought experiment involving the hollow sphere of detectors. The paper shows this would not in fact be paradoxical, because each contraction of the hollow sphere of detectors perturbs the atom: the atom and detector are coupled via the photon field. Different sequences of indirect measurements = different dynamics = different decay statistics.
 
