Geiger counters and measurement

  • Thread starter: jeeves
  • Tags: Measurement
  • #51
I believe that makes sense. Let me see if I understand.

Suppose again we are in the short-time regime where the decay is not memoryless, and I observe after some time has passed that the cat has not died (atom has not decayed). Then do you agree that I have learned something nontrivial about the system? If so, how would you write the state in the alive/undecayed and dead/decayed basis arising from decoherence?
 
  • #52
jeeves said:
I believe that makes sense. Let me see if I understand.

Suppose again we are in the short-time regime where the decay is not memoryless, and I observe after some time has passed that the cat has not died (atom has not decayed). Then do you agree that I have learned something nontrivial about the system?
Yes.
jeeves said:
If so, how would you write the state in the alive/undecayed and dead/decayed basis arising from decoherence?
I guess we start in a known state ##\ket {\psi(0)}##. The system would then evolve according to something like:
$$\ket {\Psi(t)} = a(t) \ket {\psi_d} + b(t) \ket {\psi(t)}$$Where we now, hypothetically, have a superposition into decayed and undecayed states, where the undecayed state itself is a function of time. I suspect at some fundamental level this may break conservation laws, but let's not worry about that.

The rest of the calculation re entanglement would be the same, except the state "detector = no" will be entangled with the time dependent undecayed state ##\psi(t)##.
 
  • #53
PeroK said:
I guess we start in a known state ##\ket {\psi(0)}##. The system would then evolve according to something like:
$$\ket {\Psi(t)} = a(t) \ket {\psi_d} + b(t) \ket {\psi(t)}$$Where we now, hypothetically, have a superposition into decayed and undecayed states, where the undecayed state itself is a function of time. I suspect at some fundamental level this may break conservation laws, but let's not worry about that.
This seems reasonable, but I'm still a bit confused. Suppose I try to apply this formalism to a classic quantum Zeno experiment, where the atom is repeatedly measured at small time intervals in a way that reduces or eliminates the possibility of decay. How does the time-dependent undecayed state formalism predict the Zeno effect? (Recall such an effect is possible only because we are in the non-exponential decay regime.)
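For concreteness, the textbook Zeno argument can be sketched numerically. This is my own illustration, not a model anyone in the thread proposed: I assume a hypothetical "Zeno time" ##\tau_Z## setting the quadratic short-time scale, so the survival probability over a short interval ##\Delta t## is approximately ##1 - (\Delta t/\tau_Z)^2## rather than exponential. Repeated checks then freeze the decay:

```python
import math

tau_z = 1.0   # hypothetical Zeno time scale (assumed value, for illustration)
T = 0.5       # total observation time

def survival_after_n_checks(n):
    # In the non-exponential short-time regime, per-interval survival
    # is ~ 1 - (dt/tau_z)^2; n successive checks multiply these factors.
    dt = T / n
    return (1.0 - (dt / tau_z) ** 2) ** n

# More frequent checks -> survival probability approaches 1 (Zeno freezing)
probs = [survival_after_n_checks(n) for n in (1, 10, 100, 1000)]
```

With a single check the survival probability is 0.75; with 1000 checks over the same total time it exceeds 0.999. An exponential (memoryless) law would give no such dependence on the checking frequency, which is why the effect only exists in the non-exponential regime.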
 
  • #54
jeeves said:
This seems reasonable, but I'm still a bit confused. Suppose I try to apply this formalism to a classic quantum Zeno experiment, where the atom is repeatedly measured at small time intervals in a way that reduces or eliminates the possibility of decay. How does the time-dependent undecayed state formalism predict the Zeno effect? (Recall such an effect is possible only because we are in the non-exponential decay regime.)
I didn't think we were discussing the Quantum Zeno effect! I don't know anything specific about that.
 
  • #55
PS I think @DrChinese already gave a good summary of this:

DrChinese said:
Many things (other than radioactive decay) have a probability of occurring per unit of time. Example: when an electron drops to a lower orbital and emits a photon. I would not normally consider a detection "non-event" to be equivalent to a "continuous series of observations" of the particle in question. (There might be a few cases where it is difficult to suitably define a "non-event" or a "continuous series of observations".)
 
  • #56
I think there is an important distinction between using QM to model if the particle has decayed in some time interval vs modelling the moment the particle decays. Let ##\Pi_\mathrm{d}, \Pi_\mathrm{nd}, \Pi_\mathrm{c}, \Pi_\mathrm{nc}## be the projectors for "decayed", "not decayed", "clicked" and "not clicked" respectively.

Seeing that the detector is in the state "not clicked" at some arbitrary time ##t## constitutes a measurement. It resolves the question of whether or not the particle has decayed in the time interval ##\left[0,t\right)## (assuming the experiment was prepared at time 0). The probability is $$\mathrm{Tr}\left[\rho \Pi_\mathrm{nd}(t)\right]$$ and the alternative (particle has decayed) trivially decoheres: $$\mathrm{Tr}\left[\Pi_\mathrm{d}(t)\rho \Pi_\mathrm{nd}(t)\right] = 0$$ We can also perform a more general measurement: we can measure whether or not the particle decays in some time interval ##\left[t,t+\Delta t\right)## by checking the detector at ##t## and at ##t+\Delta t##. The probability is $$\mathrm{Tr}\left[\Pi_\mathrm{d}(t+\Delta t)\Pi_\mathrm{nc}(t)\rho\Pi_\mathrm{nc}(t)\Pi_\mathrm{d}(t+\Delta t)\right]$$ As before, all alternatives decohere: $$\mathrm{Tr}\left[\Pi_\mathrm{nd}(t+\Delta t)\Pi_\mathrm{nc}(t)\rho\Pi_\mathrm{nc}(t)\Pi_\mathrm{d}(t+\Delta t)\right] = \mathrm{Tr}\left[\Pi_\mathrm{d}(t+\Delta t)\Pi_\mathrm{c}(t)\rho\Pi_\mathrm{nc}(t)\Pi_\mathrm{d}(t+\Delta t)\right] = 0$$ A "continuous measurement" would resolve whether or not the particle decayed precisely at time ##t##, i.e., take the above and let ##\Delta t## go to 0. I don't think this is normalisable, so I don't think QM can model such a continuous measurement.
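As a toy check of the orthogonality that makes the alternatives decohere, here is a plain-Python sketch. This is my own two-dimensional caricature (basis states and numbers are made up for illustration), not the full model above: the cross term ##\mathrm{Tr}[\Pi_\mathrm{d}\,\rho\,\Pi_\mathrm{nd}]## vanishes for any ##\rho## simply because the projectors are orthogonal.

```python
# Toy 2-d state space: basis |nd> = (1, 0), |d> = (0, 1)
Pi_nd = [[1.0, 0.0], [0.0, 0.0]]   # projector onto "not decayed"
Pi_d  = [[0.0, 0.0], [0.0, 1.0]]   # projector onto "decayed"

def matmul(A, B):
    # 2x2 matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

# Density matrix of a pure superposition a|nd> + b|d> with a = 0.8, b = 0.6
psi = [0.8, 0.6]
rho = [[psi[i] * psi[j] for j in range(2)] for i in range(2)]

p_nd = trace(matmul(rho, Pi_nd))                  # Tr[rho Pi_nd] = 0.64
cross = trace(matmul(matmul(Pi_d, rho), Pi_nd))   # orthogonal projectors -> 0
```

The off-diagonal piece of ##\rho## never contributes to ##\mathrm{Tr}[\Pi_\mathrm{d}\,\rho\,\Pi_\mathrm{nd}]##, which is the "trivial decoherence" of the two alternatives.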
 
  • #57
jeeves said:
Suppose again we are in the short-time regime where the decay is not memoryless, and I observe after some time has passed that the cat has not died (atom has not decayed). Then do you agree that I have learned something nontrivial about the system?
I don't. See below.

PeroK said:
Where we now, hypothetically, have a superposition into decayed and undecayed states, where the undecayed state itself is a function of time.
Even if you do this, you still haven't changed the fundamental fact about the "observation" the OP is describing: it's an observation of the cat, not an observation of the atom. And all you're observing about the cat is that it's alive. You can deduce from this that the atom has not decayed, but, if you are including multiple "undecayed states" in your model for the atom, observing that the cat is alive does not tell you which "undecayed state" the atom is in, and therefore does not tell you anything useful about the probability of decay. The only way to know anything useful about the atom's state is either to observe that the cat died--which tells you the atom decayed--or to prepare the atom in a known "undecayed" state (which amounts to observing the atom directly and obtaining the result that the atom is in that particular state). A "quantum Zeno effect" experiment would amount to doing the latter; but just observing that the cat is alive does not.
 
  • #58
The entire question boils down to whether the Geiger counter affects the decaying particle sufficiently before it decays. In more physical terms, the question is whether the presence of the Geiger counter leads to interactions with the decaying nucleus in such a way that it affects the dynamics leading to the decay of the nucleus. This is, FAPP, not the case, because the nuclear forces holding the nucleus together are very strong compared to the long-range interactions with the material of the Geiger counter (i.e., electromagnetic interactions and, though totally academic, in principle gravitation). So the presence of the Geiger counter does not affect the dynamics of the nucleus before the decay, and thus also not its mean lifetime. What interacts with the Geiger counter is the decay product (He nuclei, electrons, or ##\gamma##'s for ##\alpha##, ##\beta##, or ##\gamma## decay), and there is indeed always some probability that the Geiger counter does not register this decay product, but whether it does or not, it doesn't affect the lifetime of the unstable nucleus.

This changes for other systems, where the sheer existence of a measurement device interacts with the observed unstable system in such a way that it affects the dynamics of this system and thus may change the transition probability/aka its lifetime tremendously. An example is to put an atom in some cavity such that a photon of a transition does not "fit" with its frequency in the cavity. Then this transition is suppressed and the lifetime of the corresponding excited state can be very much longer than for an atom in free space.

All this has nothing to do with "collapse" but just with interaction between measurement devices/or any other stuff around an observed quantum object. Imho, "Collapse" should only be discussed in the interpretation subforum, because whether or not you assume a collapse, depends on your personal interpretation of quantum theory. I strongly plead against introducing the collapse at all since it's a kind of Pandora's Box in the context of relativity and causality.
 
  • #59
vanhees71 said:
The entire question boils down to the question, whether the Geiger counter affects the decaying particle sufficiently before it decays. In more physical terms that means the question is, whether the presence of the Geiger counter leads to interactions with the decaying nucleus in such a way that it affects the dynamics leading to the decay of the nucleus.

Thank you, vanhees. Based on your answer, I have the following understanding. Is it correct?

Suppose we have a Schrodinger's cat setup in a transparent box. I use ##A(t)## as the coefficient of the "undecayed" atom state, and ##B(t)## as the coefficient of the "decayed" atom state.

At ##t=0##, we have ##A(0) = 1##.

At ##t=1##, I look at the box at see the cat is alive. My observation of the cat (or the cat itself, or me staring directly at the location of the atom, etc.) has no effect on the evolution of the decaying atom. So the atom is in the same superposition as it would've been had I not looked. That is, we have coefficients ##A(1)## and ##B(1)##.

I keep observing the box. At ##t=2## I see the cat is still alive. The coefficients are now ##A(2)## and ##B(2)##. It is still the case that nothing has interacted with the atom in a meaningful way.

I keep observing the box. At ##t=3##, the atom decays, the poison is released, and I observe the cat die. We now have ##A(3) = 0## and ##B(3) = 1##.

Is this correct so far?

Regarding how I learn of the state of the atom: Should I think of this as the cat being entangled with the decay product, which is in turn entangled with the atom? So, after the particle decays, the cat becomes highly entangled with the atom and knowledge of the cat's status is equivalent to knowledge of the atom's status? And crucially (to avoid quantum Zeno), this entanglement does not appear prior to the decay?
 
  • #60
The cat's state is entangled with the nucleus + decay products after the decay. Without looking at a time ##t## all you know is the probability for the cat to be dead or alive. It's given by the radioactive decay law (to very good accuracy). The survival probability of the nucleus is ##P(t)=\exp(-t/\tau)##, where ##\tau## is the mean lifetime of the mother nucleus.

I'm a follower of the minimal statistical interpretation, since that's all that is needed to use quantum theory as a physical theory. For me, everything beyond this (including the collapse hypothesis) is metaphysics (for some people it seems to take on the status of a kind of religion) and thus beyond the aim of physics as a natural science, which is to find mathematical descriptions of observable (meaning measurable and quantifiable) phenomena.

A probability has an epistemic meaning. It is a measure for the expectation of the cat's state some time ##t## after a given preparation of the system at time ##t=0##. The probabilities are described by quantum theory, i.e., by the time-evolution equation for the statistical operator or (if you deal with pure states) the Schrödinger equation for the corresponding state ket. If you take notice of the state of the cat, nothing specific needs to happen to the cat. All you do is update your (probabilistic) description of the situation, gaining new information about the state of the cat, being either dead or alive when you look. There is no mysterious collapse.

Note that this posting (and imho the entire thread) does not belong to the quantum mechanics forum but to the interpretation subforum ;-)).
 
  • #61
jeeves said:
Suppose we have a Schrodinger's cat setup in a transparent box. I use ##A(t)## as the coefficient of the "undecayed" atom state, and ##B(t)## as the coefficient of the "decayed" atom state.
You are oversimplifying. You need more than two coefficients to describe the situation using the Schrödinger equation. In the description of the decay you need to also include the decay products (typically an escaping electron and anti-neutrino). Whether or not a Geiger counter is present, in the derivation of Fermi's Golden Rule we take the squared modulus of the coefficients of the final states (Born rule) and add them up. This would lead to a decay probability increasing with time like ## t^2 ##, were it not for the density of final states. As time progresses, the number of contributing final states decreases, their distribution in energy becoming ever sharper (the width decreasing proportionally to ## t^{-1} ##). In this way we arrive at a constant decay rate. And for all we know, radioactive decay happens even without Geiger counters present.

What really happens, is that a radioactive nucleus just sits there for a long time, and suddenly decays. A slowly (over the course of minutes, days, or years) evolving wave function exists only in the minds of theoreticians. You should not think that your two coefficients ## A(t), B(t) ## have some direct physical meaning.
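The interplay described above, ##t^2## growth tamed by a narrowing energy distribution, can be checked numerically. This is my own aside under simplifying assumptions (a flat density of final states ##g=1##, units with ##\hbar=1##): the Golden-Rule transition probability ##P(t)=\int g\,[\sin(\omega t/2)/(\omega/2)]^2\,d\omega## grows linearly in ##t##, giving the constant rate ##2\pi g##.

```python
import math

def transition_prob(t, n=20000, wmax=200.0):
    # Midpoint-rule integral of the Golden-Rule kernel over a flat band g = 1:
    # P(t) = integral of [sin(w t/2) / (w/2)]^2 dw
    dw = 2 * wmax / n
    total = 0.0
    for i in range(n):
        w = -wmax + (i + 0.5) * dw
        s = t if w == 0.0 else math.sin(w * t / 2) / (w / 2)
        total += s * s * dw
    return total

# The rate P(t)/t tends to the constant 2*pi*g: exponential decay at
# constant rate emerges despite the t^2 behaviour at very short times.
rates = [transition_prob(t) / t for t in (2.0, 4.0, 8.0)]
```

The kernel has height ##t^2## and width ##\sim 2\pi/t##, so its area, and hence ##P(t)##, scales like ##t##, which is exactly the competition between ##t^2## growth and the ##t^{-1}## narrowing described above.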
 
  • #62
vanhees71 said:
Note that this posting (and imho the entire thread) does not belong to the quantum mechanics forum but to the interpretation subforum ;-)).

I object to this characterization. I do not believe I am asking anything about interpretation. I am simply asking: What is the state of the atom at various times? This is a question with empirically testable consequences. In principle, the atom could be measured, and its state determined. Interpretation has nothing to do with it. (Also, I did not mention collapse in that most recent post.)

WernerQH said:
What really happens, is that a radioactive nucleus just sits there for a long time, and suddenly decays. A slowly (over the course of minutes, days, or years) evolving wave function exists only in the minds of theoreticians. You should not think that your two coefficients ## A(t), B(t) ## have some direct physical meaning.
For similar reasons, I hold that I am not imbuing my coefficients with any physical meaning. I just want to know what the coefficients are at ##t=1,2,3## in the situation I described, and why they are that way. By "why" I mean I am asking for strictly empirical explanations (such as decoherence, lack of interactions, etc.), not anything to do with interpretation.
 
  • #63
jeeves said:
I do not believe I am asking anything about interpretation. I am simply asking: What is the state of the atom at various times?
The problem is that the answer to this question is interpretation-dependent (e.g., some interpretations ascribe no objective character to the state).

An interpretation-independent question that might still serve your purposes could be something like: "Given a preparation of the particle X and a sequence of observations Y that occur at times {t1, t2, t3, ...}, what are the likelihoods of the possible outcomes?"
 
  • #64
Sure, we can gloss "What is the state?" as asking about statistics of identical ensembles. And my question "Why is the state that way?" can be glossed as "How does one derive the empirically observed statistics a priori by reasoning about Schrödinger evolution and entanglement?"

The key point is: We all know what the empirically observed statistics are going to be. I would like to know how to correctly reason about the formalism of quantum mechanics to derive those statistics. (It's easy to reason incorrectly and predict a quantum Zeno effect if, for example, one views watching the cat as a "measurement" of the atom.) I presented some reasoning on the last page and am curious if it is right.
 
  • #65
jeeves said:
Sure, we can gloss "What is the state?" as asking about statistics of identical ensembles. And my question "Why is the state that way?" can be glossed as "How does one derive the empirically observed statistics a priori by reasoning about Schrödinger evolution and entanglement?"

The key point is: We all know what the empirically observed statistics are going to be. I would like to know how to correctly reason about the formalism of quantum mechanics to derive those statistics. (It's easy to reason incorrectly and predict a quantum Zeno effect if, for example, one views watching the cat as a "measurement" of the atom.) I presented some reasoning on the last page and am curious if it is right.
So here is how I would model a decaying particle ##\phi## and a detector ##\psi##:

The particle and detector at some initial time ##t_i## are prepared as $$|\Psi_0\rangle = |\phi_s\rangle|\psi_\text{ready}\rangle$$ with unitary evolution $$U(t)|\Psi_0\rangle = a(t)|\phi_s\rangle|\psi_\text{not clicked}\rangle + b(t)|\phi_d\rangle|\psi_\text{clicked}\rangle$$ where ##s## and ##d## are for "survived" and "decayed" respectively. We can model the behaviour of the detector with a projective decomposition of the identity operator on the detector's state space ##I = E_\text{clicked} + E_\text{not clicked}## . We construct a chain of projectors for whatever temporal resolution we demand to represent the various possible outcomes. A simple example: we can model an outcome like "the detector clicks at time ##T##" as $$E_\text{not clicked}(T-\delta t)\otimes E_\text{clicked}(T+\delta t)$$ which has a probability $$p_a = \mathrm{Tr}\left[E_\text{clicked}(T+\delta t) \otimes E_\text{not clicked}(T-\delta t) |\Psi_0\rangle\langle\Psi_0|E_\text{not clicked}(T-\delta t)\otimes E_\text{clicked}(T+\delta t) \right]$$
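As a sanity check on the chained-projector probabilities, here is a numerical aside of my own. It assumes the simplest case of an ideal detector and exponential survival ##e^{-t/\tau}## (with a made-up ##\tau = 1##), so that "not clicked at ##T##, clicked by ##T+\Delta t##" reduces to a difference of survival probabilities, and the windows partition all possible click times:

```python
import math

tau = 1.0  # assumed mean lifetime, arbitrary units

def p_not_clicked(t):
    # Ideal detector: "not clicked by t" iff the particle survived to t
    return math.exp(-t / tau)

def p_click_in_window(T, dt):
    # Outcome "not clicked at T, clicked by T + dt"
    return p_not_clicked(T) - p_not_clicked(T + dt)

# Summing over disjoint windows covers every possible click time
total = sum(p_click_in_window(i * 0.1, 0.1) for i in range(1000))
```

The window probabilities telescope, so the sum converges to 1 (up to the exponentially small probability of surviving past the last window), which is the consistency one expects from a projective decomposition of the identity.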
 
  • #66
jeeves said:
(It's easy to reason incorrectly and predict a quantum Zeno effect if, for example, one views watching the cat as a "measurement" of the atom.) I presented some reasoning on the last page and am curious if it is right.
IMO, that's why it's not a good idea to try to relate QM directly to macroscopic behaviour. One thing you must do is deconstruct your macroscopic ideas before you can assume they are directly related to underlying QM phenomena. In other words, an "observation" at the macroscopic level may not map very well to measurements of a QM system, as generally understood. For example:

When you are watching a cat, what is actually happening? If the cat is poisoned, how long does it take to react? How long before you know it's dying rather than yawning or whatever?

These are not silly questions once you've taken the step of assuming that watching a cat is equivalent to a continuous measurement of a microscopic system on whose state its life depends!

In answer to my own question, watching a cat is an enormously fuzzy and imprecise set of measurements compared to the sort of precise measurement required in a QM experiment.

We could compare this with the recent experiments regarding the anomalous dipole moment of the muon. The mean lifetime of a muon is about ##2.2## microseconds. You have no ability in watching a cat to pinpoint changes in its macroscopic state over those timescales.

If you rigged up a system where the muon's decay was linked to the death of a cat by some mechanism, then you simply cannot claim that by "continuously" watching the cat you are continuously aware of whether the muon has decayed or not. The muon will have been created and have decayed in a timescale shorter than the timescales of any macroscopic phenomenon. Even if the cat reacted to poison in ##1## second, that is still 500,000 times longer than the muon's lifetime. And your observations of the cat and its reactions do not pin down the lifetime of the muon in any meaningful way.
 
  • #67
jeeves said:
I am simply asking: What is the state of the atom at various times?
And the best answer that basic QM, with no interpretation involved, can give you is that the QM math is not telling you "the state of the atom" in the sense you appear to be using the term. It's just telling you how to predict probabilities for possible observations. There is a mathematical thingie in the machinery used to predict probabilities that is called "the state of the atom" (more precisely the "wave function" or "state vector" of the atom), but basic QM, without adopting any particular interpretation, does not make any claim whatsoever about that mathematical thingie's physical meaning. All basic QM says is that you can use the mathematical thingie to predict probabilities.

jeeves said:
This is a question with empirically testable consequences.
No, it isn't. You can empirically test the predictions for probabilities of possible observations without bringing in any particular QM interpretation, but you cannot empirically test what "the state of the atom" is without bringing in some particular QM interpretation, because different interpretations make different claims about what "the state of the atom" is in any particular situation, even though they all agree on the predictions of probabilities.
 
  • #68
PS watching a cat is very different from continually subjecting an atom to a series of ultraviolet pulses to suppress its evolution to an excited state - as described in a quantum Zeno experiment:

https://en.wikipedia.org/wiki/Quantum_Zeno_effect#Experiments_and_discussion

Fuzzy, imprecise macroscopic measurements cannot be used to interrogate microscopic QM systems in the way you are imagining.
 
  • #69
PeterDonis said:
And the best answer that basic QM, with no interpretation involved, can give you is that the QM math is not telling you "the state of the atom" in the sense you appear to be using the term. It's just telling you how to predict probabilities for possible observations.
Sure. As I said in post #64, I just want to know how to apply the math appropriately to correctly predict the empirically observed outcomes. We make predictions in QM by constructing a wave function and reading off the desired probabilities. I am asking only how to construct the wave function appropriately. I do not intend to give it any special ontological status, or interpret it metaphysically in any way. When I ask "What is the state of the atom?" I am only asking about the appropriate wave function to assign. I will endeavor to be more precise about this distinction.

PeroK said:
PS watching a cat is very different from continually subjecting an atom to a series of ultraviolet pulses to suppress its evolution to an excited state - as described in a quantum Zeno experiment:

https://en.wikipedia.org/wiki/Quantum_Zeno_effect#Experiments_and_discussion

Fuzzy, imprecise macroscopic measurements cannot be used to interrogate microscopic QM systems in the way you are imagining.
I completely agree. I would just like to know how to correctly account for this (for all practical purposes, etc.) when writing down wave functions to predict things.

What you say in post #66 seems consistent with the description sketched in my post #59. (In particular, "You have no ability in watching a cat to pinpoint changes in its macroscopic state over those timescales.") Do you find the description I give there reasonable? (Again, please read "state" in an ontologically minimal way.)
 
  • #70
jeeves said:
I just want to know how to apply the math appropriately to correctly predict the empirically observed outcomes.
You use the known probability of decay per unit time for whatever atom you have in your apparatus, and integrate that over the time since you prepared the atom to get a cumulative probability. In other words, you use the familiar math of radioactive decay that was already known even before QM was discovered (unless you are using a setup that involves QM corrections to the exponential decay law, in which case you have to use the decay probability math that includes those corrections).

You can, of course, write things in terms of a QM wave function with coefficients ##A(t)## and ##B(t)## in front of the "non-decayed" and "decayed" terms, where ##B(t)## is determined by the radioactive decay law and ##A(t)## is determined by normalization (the squared norm of the wave function as a whole must be 1). But this doesn't tell you anything new as far as probabilities go that the radioactive decay law doesn't already tell you. And if you're not picking any particular QM interpretation, you have no reason to even bother writing down the wave function in the first place since you're not assigning it any physical meaning and it doesn't add any ability to predict probabilities that you don't already have.
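The recipe above, ##|B(t)|^2## from the decay law and ##A(t)## from normalization, can be written out directly. A minimal sketch, assuming the pure exponential law with an arbitrary ##\tau = 1## (real phases chosen for simplicity; the formalism only fixes the moduli):

```python
import math

tau = 1.0  # assumed mean lifetime, arbitrary units

def coefficients(t):
    # |B(t)|^2 from the radioactive decay law; A(t) fixed by normalization
    b_sq = 1.0 - math.exp(-t / tau)   # probability the atom has decayed by t
    a = math.sqrt(1.0 - b_sq)         # "undecayed" coefficient
    b = math.sqrt(b_sq)               # "decayed" coefficient
    return a, b

a, b = coefficients(2.0)
```

As noted, this reproduces nothing beyond the decay law itself: the wave function is just a repackaging of the survival probability until an observation forces an update.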
 
  • #71
PeterDonis said:
You can, of course, write things in terms of a QM wave function with coefficients ##A(t)## and ##B(t)## in front of the "non-decayed" and "decayed" terms, where ##B(t)## is determined by the radioactive decay law and ##A(t)## is determined by normalization (the squared norm of the wave function as a whole must be 1). But this doesn't tell you anything new as far as probabilities go that the radioactive decay law doesn't already tell you. And if you're not picking any particular QM interpretation, you have no reason to even bother writing down the wave function in the first place since you're not assigning it any physical meaning and it doesn't add any ability to predict probabilities that you don't already have.
Yes, I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##. In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).

I disagree that if I am not picking a particular interpretation, there is no reason to do this. The reason is to see if my understanding of the QM formalism is correct. This knowledge will be useful for when I consider more complicated situations, where I do not have prior knowledge of the probabilities or a suitable classical approximation.
 
  • #72
jeeves said:
Sure. As I said in post #64, I just want to know how to apply the math appropriately to correctly predict the empirically observed outcomes. We make predictions in QM by constructing a wave function and reading off the desired probabilities. I am asking only how to construct the wave function appropriately.
Usually you would be dealing with some specific microscopic phenomenon. Most introductory texts deal with the hydrogen atom and the prediction of its spectrum. That's a classic solution of the Schrödinger equation for the energy eigenstates (wavefunctions).

But, QM is not just about wavefunctions. If you want to try some genuine QM, there's an accessible and insightful treatment at undergraduate level here:

http://physics.mq.edu.au/~jcresser/Phys304/Handouts/QuantumPhysicsNotes.pdf
 
  • #73
I found the following quote in the paper
Peres, Asher (1989). Quantum limited detectors for weak classical signals. Physical Review D, 39(10), 2943–2950.

He writes of quantum Zeno and continuous measurements that:
This problem is peculiar to quantum systems. It disappears in the semiclassical limit, where eigenvalues become extremely dense. From the quantum point of view, classical measurements are always fuzzy. This is why a watched pot may boil, after all: the observer watching it is unable to resolve the energy levels of the pot. Any hypothetical device which could resolve these energy levels would also radically alter the behavior of the pot. Likewise, the mere presence of a Geiger counter does not prevent a radioactive nucleus from decaying. The Geiger counter does not probe the energy levels of the nucleus (it interacts with decay products whose Hamiltonian has a continuous spectrum). As the preceding calculations show, peculiar quantum effects, such as the Zeno "paradox", occur only when individual levels are resolved (or almost resolved).

This explanation is consistent with Nugatory's (and others) given earlier.

What does the parenthetical "it interacts with decay products whose Hamiltonian has a continuous spectrum" contribute to the explanation? Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur? The Hamiltonian of the decay product is only relevant after the decay happens, and once decay happens, we no longer have to worry about a Zeno effect.
 
  • #74
jeeves said:
Yes, I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##. In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).
With just two coefficients, you are dealing with a two-level system. There is no way that the solution can have any semblance to the decay of a radioactive atom. You cannot just assume coefficients. How would you specify the Hamiltonian? Have you ever done some simple exercises in QM?

jeeves said:
What does the parenthetical "it interacts with decay products whose Hamiltonian has a continuous spectrum" contribute to the explanation? Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur? The Hamiltonian of the decay product is only relevant after the decay happens, and once decay happens, we no longer have to worry about a Zeno effect.
Do you know Fermi's Golden Rule? Have you looked at its derivation? Although the Born rule is usually stated as relating to "measurements", Fermi's Golden Rule is routinely applied in situations where it doesn't make sense to talk about measurements or observations. For example in the calculation of nuclear reaction rates in the interior of stars. And for the derivation of Fermi's Golden Rule it is irrelevant when (or if at all) a measurement is made. "Measurement" and "wave function collapse" still linger prominently in textbooks, but IMHO they are not actually a crucial part of the formalism. John Bell, in his essay "Against Measurement" railed against treating "measurement" as a fundamental process. I agree with him that measurements should be explained in terms of something more fundamental.
 
  • #75
jeeves said:
Why is the fact that the Hamiltonian has continuous spectrum relevant to the argument that a Zeno effect cannot occur?
Because no classical measurement can ever resolve all the energy levels of a continuous spectrum. At least that is the argument Asher Peres gives in your quote:
From the quantum point of view, classical measurements are always fuzzy. This is why a watched pot may boil, after all: the observer watching it is unable to resolve the energy levels of the pot.
...
As the preceding calculations show, peculiar quantum effects, such as the Zeno "paradox" occur only when individual levels are resolved (or almost resolved).

For the more detailed explanation, he refers to "the preceding calculations".
 
  • #76
WernerQH said:
With just two coefficients, you are dealing with a two-level system. There is no way that the solution can have any semblance to the decay of a radioactive atom. You cannot just assume coefficients. How would you specify the Hamiltonian? Have you ever done some simple exercises in QM?
I am merely repeating the analysis in post #12 of this thread. Do you disagree with that analysis?
gentzen said:
For the more detailed explanation, he refers to "the preceding calculations".
The preceding calculations do not seem to have a direct bearing on this particular example. The only relevant comment is:
One can even consider a passive detector,
such as a Geiger counter waiting
for the decay of a nucleus, but this situation does not fit
at all with our definition of a measurement. This setup is
best described as a single metastable system with several
decay channels.
 
  • #77
jeeves said:
I would like to write things in terms of a wave function with coefficients ##A(t)## and ##B(t)##.
I described how to do that in what you quoted, including how to determine ##B(t)## and ##A(t)##.

jeeves said:
In particular, I would like to know whether what I wrote in post #59 is correct (regarding both the coefficients themselves and the reasoning used to reach them).
As far as I can tell, your post #59 is saying that ##A(t)## and ##B(t)## get determined by the method I described in what you quoted (the radioactive decay law for ##B(t)## and normalization for ##A(t)##) until the cat is observed to die; at that point we deduce that the atom has decayed and the state collapses to ##A = 0##, ##B = 1## (the "atom decayed, cat dead" state). This is correct (as long as you are careful not to assign any physical meaning to "the state collapses" and to use it as a mathematical method only, to tell you what state you will now use for future predictions).
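The bookkeeping described above can be sketched in a few lines. This is a toy model, not from the thread; the decay constant is a made-up value:

```python
import math

LAM = 0.1  # hypothetical decay constant, arbitrary units

def coefficients(t):
    """Return (A, B): amplitudes of |undecayed, alive> and |decayed, dead>.

    |B|^2 follows the radioactive decay law; |A|^2 is fixed by normalization.
    """
    b = math.sqrt(1.0 - math.exp(-LAM * t))
    a = math.sqrt(math.exp(-LAM * t))
    return a, b

a, b = coefficients(5.0)
assert abs(a**2 + b**2 - 1.0) < 1e-12  # state stays normalized at all times

# Observing the cat dead: update (purely as a calculational rule, with no
# physical meaning attached) to the "atom decayed, cat dead" state, which
# is then used for all future predictions.
a, b = 0.0, 1.0
```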

jeeves said:
The reason is to see if my understanding of the QM formalism is correct.
The QM formalism in this case is so simple that, as I said before, it doesn't really add anything to your knowledge--including to your knowledge of how to work with the QM formalism for more complicated cases.

The best simple experiment I know of to exercise your QM formalism-fu is an experiment on a pair of entangled particles to test for violations of the Bell inequalities.
 
  • #78
PeterDonis said:
As far as I can tell, your post #59 is saying that ##A(t)## and ##B(t)## get determined by the method I described in what you quoted (the radioactive decay law for ##B(t)## and normalization for ##A(t)##) until the cat is observed to die; at that point we deduce that the atom has decayed and the state collapses to ##A = 0##, ##B = 1## (the "atom decayed, cat dead" state). This is correct (as long as you are careful not to assign any physical meaning to "the state collapses" and to use it as a mathematical method only, to tell you what state you will now use for future predictions).
Great, thank you. This completely answers my original question. I will look into the Bell inequality violations; thank you for that also.

My only remaining worry is about that comment of Peres. As far as I can tell, he seems to be arguing that:
  1. The Geiger counter does not probe the energy levels of the nucleus, only the energy of the decay product after the nucleus has decayed.
  2. The decay product has a continuous spectrum, and Zeno effects only apply in the presence of a discrete spectrum.
  3. Therefore, there can't be a quantum Zeno effect.
But I don't see why we need step two. It seems valid to say that:
  1. The Geiger counter does not probe the energy levels of the nucleus, only the energy of the decay product after the nucleus has decayed.
  2. Since (for all practical purposes) the Geiger counter does not entangle with the nucleus before decay, the counter cannot prevent decay, and there can be no quantum Zeno effect.
So the continuity of the spectrum seems irrelevant.
 
Last edited:
  • #79
jeeves said:
My only remaining worry is about that comment of Peres.
I don't think the Peres comment that was quoted really applies to the Geiger counter case. Or, for that matter, the cat case we have been discussing.

In the case of the counter and the cat, observation of those systems is only an indirect proxy for what we are interested in, namely, whether or not some particular atom has decayed. And because of the way those scenarios are set up, there is no meaningful "quantum Zeno" interaction between the counter, or the cat, and the atom we are interested in, until the atom decays. And that is true regardless of what the atom's energy levels are, whether they are continuous or discrete, how closely spaced they are, etc.

The kind of thing Peres is talking about is something different; he is talking about something like comparing measuring an atom in some unstable/metastable state directly vs. observing whether a cat is alive or dead. In both cases, you get something like a "binary" result (not decayed/decayed, alive/dead), but in the case of the atom, if you are measuring it directly (rather than relying on an indirect proxy like a Geiger counter), your measurement can involve an interaction (say probing the atom with a laser) that does have a "quantum Zeno" effect on the atom. But that is possible only if the states of the atom you are trying to distinguish are "spaced far enough apart", so to speak, that your measurement can reliably tell one from the other (or more precisely can reliably collapse the atom into one or the other).

In the case of observing a cat to see whether it is alive or dead, no such "quantum Zeno" effect is possible--you can't keep a cat alive just by constantly watching it. And that is because, unlike a single atom, a cat has an enormous number of possible states, which are "spaced very close together", so to speak, and what we refer to as "alive" and "dead" are not single states of the cat but huge subspaces of the cat's state space, and our observations cannot reliably force the cat into just one single state; we can't "collapse" the cat into some single desired state in its "alive" state space (which is what we would have to do to have a "quantum Zeno" effect on the cat) by observing it. In fact we can't do that by any means we have at our disposal now or in the foreseeable future.

To put this another way: probing a single atom with a probe laser can have a significant effect on its dynamics, but looking at a cat and seeing that it's alive does not have any significant effect on its dynamics. That's why we can do quantum Zeno experiments with atoms but not with cats.
 
  • Like
Likes PeroK, gentzen, jeeves and 1 other person
  • #80
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?

They establish the paradox in a thought experiment involving a hollow sphere of inward-facing detectors surrounding the atom in question. They set the radius of the sphere to be very large, and define a measurement as a rapid contraction of the sphere, such that if the particle has decayed (survived), the products will be detected (not be detected) during the contraction, and hence one of the two possible energy levels of the atom will be registered. A correlation is therefore established between detector and energy of the atom even if no decay product was detected, and without any ambiguity of continuous measurement. They claim that, using only orthodox quantum theory, the choice of sequences of measurements would affect the decay statistics even though no direct interaction takes place.
 
Last edited:
  • Like
Likes jeeves
  • #81
Morbert said:
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?
Even though I disagree with vanhees71 that this entire thread should have been in the Quantum Interpretations and Foundations subforum, the abstract of that paper feels suspiciously like a typical interpretation paradox:
... Gedanken experiments are outlined which illustrate the key features of the paradox, and its implications for the realist interpretation are discussed.

I know that the abstract also says: "It is demonstrated that collapse of state-vector is not a requirement for the paradox, which is independent of interpretation of quantum theory." But that statement too has more to do with interpretation than with plain "calculate and be happy" quantum mechanics, from my POV.
 
  • #82
Morbert said:
How does Peres's account of the disappearance of the Zeno effect in "detection of decay product" experiments (see post #73) inform the Zeno paradox given by Home and Whitaker?

They establish the paradox in a thought experiment involving a hollow sphere of inward-facing detectors surrounding the atom in question. They set the radius of the sphere to be very large, and define a measurement as a rapid contraction of the sphere, such that if the particle has decayed (survived), the products will be detected (not be detected) during the contraction, and hence one of the two possible energy levels of the atom will be registered. A correlation is therefore established between detector and energy of the atom even if no decay product was detected, and without any ambiguity of continuous measurement. They claim that, using only orthodox quantum theory, the choice of sequences of measurements would affect the decay statistics even though no direct interaction takes place.
I couldn't find a free version of the reference. I did find a similar paper by the same 2 authors:

https://www.fi.muni.cz/usr/buzek/zaujimave/home.pdf
A Conceptual Analysis of Quantum Zeno; Paradox, Measurement, and Experiment

I don't see where they use QM to make a testable prediction that the decay rate is in any way affected by the presence (or absence) of Geiger counters (measurement devices). Given these papers were written 25+ years ago, I would think we would have heard about it if this novel prediction had made a splash. On the other hand, a more recent analysis of the Zeno effect via indirect measurement finds there is no such effect (in contradiction to their claim).

https://arxiv.org/abs/quant-ph/0406191
"We study the quantum Zeno effect in the case of indirect measurement, where the detector does not interact directly with the unstable system. Expanding on the model of Koshino and Shimizu [Phys. Rev. Lett., 92, 030401, (2004)] we consider a realistic Hamiltonian for the detector with a finite bandwidth. We also take explicitly into account the position, the dimensions and the uncertainty in the measurement of the detector. Our results show that the quantum Zeno effect is not expected to occur, except for the unphysical case where the detector and the unstable system overlap."

A quantum system (such as a radioactive isotope) does not change (its statistical decay rate) in any way due to the presence or absence of an indirect measurement system designed to detect a decay product. Interestingly, it does, however, react to gravitational effects (time dilation).
 
  • Informative
  • Like
Likes Morbert and PeroK
  • #83
The Home and Whitaker paper references a 1980 paper by Peres which gives a time-evolution that is "valid for small times" $$|\left(\phi,e^{-iHt}\phi\right)|^2 \approx 1-(\Delta H)^2t^2+\cdots$$They use this in their derivation of the Zeno's paradox, which is peculiar to me. When I use the more general time evolution of an exponential decay, everything works out fine.
 
  • #84
Morbert said:
The Home and Whitaker paper references a 1980 paper by Peres which gives a time-evolution that is "valid for small times" $$|\left(\phi,e^{-iHt}\phi\right)|^2 \approx 1-(\Delta H)^2t^2+\cdots$$They use this in their derivation of the Zeno's paradox, which is peculiar to me. When I use the more general time evolution of an exponential decay, everything works out fine.
Yes, the exponential distribution is memoryless and cannot produce a Zeno effect. The effect relies on the deviations from exponential decay at short times. Quoting Wikipedia:
Unstable quantum systems are predicted to exhibit a short-time deviation from the exponential decay law. This universal phenomenon has led to the prediction that frequent measurements during this nonexponential period could inhibit decay of the system, one form of the quantum Zeno effect.
References are available in the article, and the calculation is done in equations (11) through (15) of the Home–Whitaker article you posted. I think it's also discussed somewhere in Sakurai, but my memory may be unreliable.
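The scaling argument can also be checked numerically. Below is a sketch (with made-up parameter values) comparing the short-time quadratic law, which produces a Zeno effect under repeated measurement, with the memoryless exponential law, which does not:

```python
import math

T = 1.0    # total observation time (arbitrary units)
dH = 0.5   # hypothetical energy spread (Delta H)
lam = 0.5  # hypothetical exponential decay constant

def survival_quadratic(n):
    """Survive n measurements spaced T/n apart under P(t) ~ 1 - (dH*t)^2."""
    return (1.0 - (dH * T / n) ** 2) ** n

def survival_exponential(n):
    """Same, under the memoryless law P(t) = exp(-lam*t)."""
    return math.exp(-lam * T / n) ** n

for n in (1, 10, 100, 1000):
    print(n, survival_quadratic(n), survival_exponential(n))
# survival_quadratic -> 1 as n grows (decay frozen: Zeno effect);
# survival_exponential stays exp(-lam*T) for every n (no effect).
```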
 
  • #85
jeeves said:
1. Yes, the exponential distribution is memoryless and cannot produce a Zeno effect.

2. The effect relies on the deviations from exponential decay at short times. Quoting Wikipedia: ...
1. Agreed.

2. I would not agree that there is such a deviation for radioactive decay. As far as I know, there isn't any indirect quantum measurement effect that would inhibit decay even in that regime ("short" times).

In fact, I am not sure that there is a hypothetical short-time regime (regardless of what Wikipedia says). I'm also not sure what *direct* quantum Zeno effects have been studied at any level regarding radioactive decay.
 
  • #86
I am wondering if the model for time-evolution used in the [Home + Whitaker] papers above just isn't correct with these negative-measurement thought experiments. Assuming like before we have a system (##\phi##) and passive detector (##\psi##) that evolves like so $$U(t)|\phi_0, \psi_0\rangle = \alpha(t)|\phi_s, \psi_s\rangle + \beta(t)|\phi_d, \psi_d\rangle$$ The probability that the atom has survived until some time ##T## is $$\langle\phi_0,\psi_0|U^\dagger(T)|\phi_s,\psi_s\rangle\langle\phi_s,\psi_s|U(T)|\phi_0,\psi_0\rangle = |\alpha(T)|^2$$I can add an identity operator, evolved to some intermediate time ##t## without changing the result like so $$\langle\phi_0,\psi_0|I^\dagger(t)U^\dagger(T)|\phi_s,\psi_s\rangle\langle\phi_s,\psi_s|U(T)I(t)|\phi_0,\psi_0\rangle = |\alpha(T)|^2$$ But the identity operator can be projectively decomposed ##I = |\phi_s,\psi_s\rangle\langle\phi_s,\psi_s| + |\phi_d,\psi_d\rangle\langle\phi_d,\psi_d|## which is exactly what we would do to compute probabilities for an intermediate measurement. This intermediate measurement therefore can't change the probability computed for survival of the atom at ##T##, so long as we have a valid ##U##.
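The step from the decomposed identity to an intermediate measurement hides a subtlety that a toy model makes visible. The sketch below (a bare two-level Rabi system with made-up parameters, not the atom-detector system above) shows that inserting the decomposed identity inside the amplitude keeps the cross terms and reproduces the unmeasured result, whereas an actual projective measurement sums probabilities per branch and in general gives a different answer; the two agree only when the cross terms already vanish, e.g. because the branches are decohered by entanglement with a detector:

```python
import math

OMEGA = 1.0       # hypothetical Rabi frequency
T = math.pi / 2   # total evolution time, chosen so cos(OMEGA*T) = 0

def amp(i, j, t):
    """Amplitude <i|U(t)|j> for U(t) = exp(-i*OMEGA*t*sigma_x)."""
    c, s = math.cos(OMEGA * t), math.sin(OMEGA * t)
    return complex(c) if i == j else -1j * s

# Decomposed identity inserted inside the amplitude (coherent sum):
coherent = abs(sum(amp(0, k, T / 2) * amp(k, 0, T / 2) for k in (0, 1))) ** 2

# Actual projective measurement at T/2 (incoherent sum over branches):
measured = sum(abs(amp(0, k, T / 2)) ** 2 * abs(amp(k, 0, T / 2)) ** 2
               for k in (0, 1))

print(coherent)  # ~0.0, same as no measurement: cos^2(OMEGA*T)
print(measured)  # 0.5, the intermediate projection changed the statistics
```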
 
Last edited:
  • #87
DrChinese said:
https://arxiv.org/abs/quant-ph/0406191
"We study the quantum Zeno effect in the case of indirect measurement, where the detector does not interact directly with the unstable system. Expanding on the model of Koshino and Shimizu [Phys. Rev. Lett., 92, 030401, (2004)] we consider a realistic Hamiltonian for the detector with a finite bandwidth. We also take explicitly into account the position, the dimensions and the uncertainty in the measurement of the detector. Our results show that the quantum Zeno effect is not expected to occur, except for the unphysical case where the detector and the unstable system overlap."

A quantum system (such as a radioactive isotope) does not change (its statistical decay rate) in any way due to the presence or absence of an indirect measurement system designed to detect a decay product. Interestingly, it does, however, react to gravitational effects (time dilation).
Interesting paper. Thanks!
 
  • #88
DrChinese said:
I would not agree that there is such a deviation for radioactive decay. As far as I know, there isn't any indirect quantum measurement effect that would inhibit decay even in that regime ("short" times).

In fact, I am not sure that there is a hypothetical short-time regime (regardless of what Wikipedia says). I'm also not sure what *direct* quantum Zeno effects have been studied at any level regarding radioactive decay.

There must be. An exact exponential decay law at all times is inconsistent with the laws of quantum mechanics. Refer to: Greenland, P. T. Seeking non-exponential decay. Nature 335, 298 (1988).

Further, there is indeed a program of theoretical predictions and attempted experimental tests of deviations from exponential decay in various contexts. For experimental evidence in quantum tunneling, see: Experimental evidence for non-exponential decay in quantum tunnelling, Nature 1997. I don't think experimental evidence has been found for radioactive decay specifically yet, but following the references in those papers (and citations to those paper) will lead you to theoretical work on the issue.

Morbert said:
Interesting paper. Thanks!

My inclination is to say that Home and Whitaker are simply wrong about their thought experiment. We have come to the conclusion in this thread that when the detector is not moving (e.g. it is a cat in a box), non-observation does not count as a "measurement" and will not produce a quantum Zeno effect, because the cat/detector does not directly probe the nucleus, it only interacts with the decay product after the decay.

If you accept that explanation, I am not sure how moving the detector around in space changes anything. Non-detection still won't count as a measurement because non-detection still doesn't produce an interaction with the nucleus (or anything else in that thought experiment).
 
Last edited:
  • #89
jeeves said:
1. An exact exponential decay law at all times is inconsistent with the laws of quantum mechanics. Refer to: Greenland, P. T. Seeking non-exponential decay. Nature 335, 298 (1988).

1. Here is a link to your reference:

https://www.nature.com/articles/335298a0

However, it's not really a good reference for your point, as I question whether Greenland's analysis is generally accepted (not well cited). When Greenland compared his prediction to actual relevant experiment, he found no support:

"Neither isotope revealed any deviation from exponential decay, and nor has any other test."

His actual prediction was: "In fact the decay of an isolated quantum state can never be exponential." So, there's that.

My point in all this is that deviation from exponential decay really has nothing to do with whether or not a Geiger counter type measurement can induce a quantum Zeno effect in a radioactive sample (as the OP seems to suggest). It cannot.

2. I could not find a free link to your second reference. But I found another from a similar author group that might be of interest:

https://arxiv.org/abs/quant-ph/0104035
Observation of the Quantum Zeno and Anti-Zeno effects in an unstable system
"We report the first observation of the Quantum Zeno and Anti-Zeno effects in an unstable system. Cold sodium atoms are trapped in a far-detuned standing wave of light that is accelerated for a controlled duration. For a large acceleration the atoms can escape the trapping potential via tunneling. Initially the number of trapped atoms shows strong non-exponential decay features, evolving into the characteristic exponential decay behavior. We repeatedly measure the number of atoms remaining trapped during the initial period of non-exponential decay. Depending on the frequency of measurements we observe a decay that is suppressed or enhanced as compared to the unperturbed system."

This effect results from direct "measurement", rather than something indirect (like a Geiger counter).
 
  • Like
Likes PeroK, PeterDonis and Lord Jestocost
  • #90
DrChinese said:
1. Here is a link to your reference:

https://www.nature.com/articles/335298a0

However, it's not really a good reference for your point, as I question whether Greenland's analysis is generally accepted (not well cited). When Greenland compared his prediction to actual relevant experiment, he found no support:

"Neither isotope revealed any deviation from exponential decay, and nor has any other test."

His actual prediction was: "In fact the decay of an isolated quantum state can never be exponential." So, there's that.

Here is a link to the second reference: https://web.archive.org/web/2010033...chargement/Optique_Quantique/Raizen_decay.pdf

The Greenland article is a short expository article, and I would not expect it to be highly cited.

Note that the Khalfin paper cited in the paper I just linked (reference 1) predicting corrections to exponential decay has 635 citations (according to Google scholar). This is also discussed in reference 3 (708 citations), and some others. I believe the observation that the decay cannot be exactly exponential is even made in standard graduate textbooks; the Nature article cites Ballentine's book. I conclude that deviations from exponential decay are generally accepted by the physics community.

I agree that direct experimental evidence of these deviations does not seem to exist (to my knowledge). This appears to be a limitation of our experimental abilities (specifically probing short enough time scales), not of our theoretical understanding. I would reconsider my view if there were a paper that claims to access the time scales where quantum Zeno-permitting deviations are predicted for radioactive decay and finds no effect. However, my understanding is that no such paper exists, and my response is then simply that absence of evidence is not evidence of absence.

I agree these deviations are irrelevant to the Geiger counter example.
 
  • #91
It's indeed a mathematical fact that the exponential-decay law is an approximation. It is derived from first-order perturbation theory, neglecting the back-reaction of the decay products, and leads to Breit-Wigner distributions for the transition amplitudes in the energy representation, which corresponds to the exponential decay law in the time domain. It's precisely this neglect of the back-reaction of the decay products that produces the exponential decay law, i.e., it ignores the possibility of returning to the original state after the decay. This is a good approximation in many cases, but not, e.g., when the decay is "close to threshold", as demonstrated in the papers cited above. I think the Nature article nicely summarizes this state of affairs.
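The Fourier pair behind this statement can be written out explicitly (a standard contour-integration exercise, not taken from any of the papers above). A pure Breit-Wigner amplitude in energy gives
$$a(t) = \int_{-\infty}^{\infty} \frac{dE}{2\pi}\, \frac{e^{-iEt}}{E - E_0 + i\Gamma/2} = -i\, e^{-iE_0 t}\, e^{-\Gamma t/2} \qquad (t > 0),$$
so ##|a(t)|^2 = e^{-\Gamma t}##, the exponential law. The approximation is visible in the integration limits: a real Hamiltonian has a spectrum bounded from below, so the Lorentzian cannot extend over all energies, and cutting it off at threshold is what produces the short-time (and very-long-time) deviations from pure exponential decay.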
 
  • Like
  • Informative
Likes protonsarecool, jeeves, hutchphd and 1 other person
  • #92
jeeves said:
We have come to the conclusion in this thread that when the detector is not moving (e.g. it is a cat in a box), non-observation does not count as a "measurement" and will not produce a quantum Zeno effect, because the cat/detector does not directly probe the nucleus, it only interacts with the decay product after the decay.
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

[*] - By good quantum theory I mean suitable dynamics (including the dynamics that entangle measured and measurement device) and initial state
 
Last edited:
  • Like
Likes vanhees71
  • #93
Morbert said:
But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.

Why do you assume that such a theory does not already exist? As one interacts with the real world, life gets complicated and theories get more complicated.
I am also at a loss as to why one would not expect repeated measurement to render the Breit-Wigner result insufficient, as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
 
  • #94
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

You may find Section X of the following paper useful: https://arxiv.org/abs/quant-ph/0611067

To better understand the nature of continuous measurements, we will now consider in detail an example of how a continuous measurement of position arises in a fundamental physical system: a single atom interacting with light. Again, to obtain weak measurements, we do not make projective measurements directly on the atom, but rather we allow the atom to become entangled with an auxiliary quantum system—in this case, the electromagnetic field—and then make projective measurements on the auxiliary system (in this case, using a photodetector). It turns out that this one level of separation between the system and the projective measurement is the key to the structure of the formalism. Adding more elements to the chain of quantum-measurement devices does not change the fundamental structure that we present here.

The authors say that measuring the EM field is only a "weak measurement" of the state of the atom, so it won't lead to quantum Zeno. I guess this is essentially the "you're measuring the decay product, not the atom" claim being detailed with more math.

What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
 
  • #95
Morbert said:
It seems to be the case that observed Zeno effects are indeed due to perturbations on the system that take place during direct measurements (e.g. acceleration of sodium atoms). But I think a good quantum theory of the system to be measured should predict the statistics reproduced by the experiment, regardless of interpretational matters like what is a measurement.
Yes, and this is indeed what QT does since its discovery in 1925 ;-). The Zeno effect is due to interactions within the full system, including the measurement devices and the "environment". Measurement devices are also nothing special: they consist of the same building blocks of matter as the observed object, and they obey no special "rules" other than those described by QT. What else should they be, and how else should they behave?
Morbert said:
Since indirect scenarios like detectors placed around an atom still establish an irreversible correlation between the detector and the microscopic system such that statistics predicted by the theory can be compared against the statistics generated by the detector clicks/non-clicks, a good quantum theory robust against indirect measurement scenarios would be useful.

[*] - By good quantum theory I mean suitable dynamics (including the dynamics that entangle measured and measurement device) and initial state
It's just the standard dynamics described in any valid textbook of quantum theory.
 
  • Like
Likes protonsarecool
  • #96
jeeves said:
What I'm curious about is: Why does measuring the EM field for photons constitute a "weak measurement" (see definition on page 4, left column)? The reasoning in Section X seems valid, but it's not clear to me how the measurement is fuzzy in a way that allows them to avoid quantum Zeno. After all, isn't it the case that knowing the state of the EM field allows you to precisely identify the state of the two-level atom?
I think the fuzziness they are referring to here is fuzziness in the moment the particle decays, as opposed to fuzziness in the energy of the two-level atom. I.e., instead of

"A detector clicking at time ##t## implies the particle decayed at time ##t'##"

We say

"a detector clicking at time ##t## implies the particle probably decayed at time ##t'\pm\Delta t##"

where the 'probably' gets weaker as ##\Delta t## is made smaller. At the end of post #56 I remarked that I did not think a continuous measurement (taking ##\Delta t \rightarrow 0##) could be normalised. They have introduced a fuzziness (weakness of measurement as ##\Delta t\rightarrow 0##) so that a norm-preserving time-evolution can be obtained. This would presumably not satisfy Home+Whitaker, since they derive a Zeno effect from non-continuous, discrete sequences of measurements.
hutchphd said:
Why do you assume that such a theory does not already exist? As one interacts with the real world life gets complicated and theories more complicated.
I am also at a loss as to why one would not expect repeated measurement to render Breit-Wigner result insufficient as previously pointed out by @vanhees71.
How this has anything to do with "interpretation" is certainly beyond my ken.
I wouldn't. I am instead saying that papers like the Whitaker one above are not using a correct quantum theory, and hence they are predicting statistics contingent on a sequence of indirect measurements when there should be no such contingency.
vanhees71 said:
Yes, and this is indeed what QT does since its discovery in 1925 ;-).

It's just the standard dynamics described in any valid textbook of quantum theory.
To be clearer and more formal, by quantum theory of the system, I meant dynamics ##U(t)## and initial state ##\rho_0## such that if ##\Pi_s## is the "detector has not clicked" projector then $$\mathrm{Tr}\left[\Pi_s(T)\rho_0\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\Pi_s(T/2)\rho_0\Pi^\dagger_s(T/2)\Pi^\dagger_s(T)\right] = \mathrm{Tr}\left[\Pi_s(T)\dots\Pi_s(T/n)\rho_0\Pi^\dagger_s(T/n)\dots\Pi^\dagger_s(T)\right]$$Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts by you, DrChinese, et al., is that their dynamics are not actually correct.
 
  • Like
Likes PeroK
  • #97
The time evolution is due to the Hamiltonian of the system not by some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
 
  • Like
Likes hutchphd
  • #98
vanhees71 said:
The time evolution is due to the Hamiltonian of the system not by some projectors. If the interaction between the detector and the system under consideration is relevant for its time evolution, you have to include the corresponding interaction Hamiltonian.
I'm using the convention ##\Pi_s(t) = U^{-1}(t)\Pi_sU(t)##. And yes I agree the dynamics have to include degrees of freedom of the system being entangled with the atom.
 
  • Like
Likes vanhees71
  • #99
Morbert said:
Home and Whitaker were presenting a dynamics where this did not hold, and I think their problem, as corroborated by posts by you, DrChinese, et al., is that their dynamics are not actually correct.
I've done some more reading and I think the majority of what has been said in this thread so far is not totally correct (including the analysis I gave a few pages ago).

I believe a correct analysis of a decaying atom with a continuous photodetector is given in "Quantum Zeno effect with general measurements" by Koshino and Shimizu, Physics Reports, 2005. It has hundreds of citations, so I conclude that it is "generally accepted physics." An arxiv version is here.

Section 5 contains the main argument. They show that for a continuous photodetector with small enough response time (much smaller than the initial time period where the atom decays non-exponentially) there is indeed a Zeno effect. The reason we do not see Zeno effects in day-to-day experiments is that the response times of typical photodetectors are many orders of magnitude larger than what would be necessary to get these effects. Their argument includes "indirect measurement," and they show how to extend it to non-continuous, discrete measurements at small time intervals.

I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
 
  • Like
Likes Morbert
  • #100
jeeves said:
I think with trivial changes you can analyze the Home–Whitaker setup in the same way, and you will indeed see a Zeno effect from those successive indirect measurements if the time interval is small enough.
The Koshino paper is interesting, particularly in the way it dissolves the distinction between a direct and indirect measurement, so we might indeed see a Zeno effect in the Home-Whitaker thought experiment involving the hollow sphere of detectors. The Koshino paper shows this would not in fact be paradoxical because each contraction of the hollow sphere of detectors perturbs the atom, since the atom and detector are coupled via the photon field. Different sequences of indirect measurements = different dynamics = different decay statistics.
 
  • Like
Likes vanhees71
