I "Filming" a quantum measurement

nomadreid
Gold Member
TL;DR Summary
In an article (link in main text), one "films" a strontium ion in an electric field during the microsecond of its wave function "collapse"; a GIF shows the superposition as it changes. But I thought that such a distribution could only be measured using many atoms/trials, not from a single atom. It could perhaps be predicted if the original state had been measured, but that does not seem to be the case here.
The link is https://www.su.se/english/research/scientists-film-a-quantum-measurement-1.487234
How do they arrive at this distribution over time? It does not appear to be saying that this is the prediction (if it were, then why the "filming"?), and measurement of a single atom's superposition is supposed to be impossible, so ...? Obviously there is a basic point I am missing.
 
  • Like
Likes atyy
They repeat the experimental process (preparing the system in state ##|\psi_i \rangle \langle \psi_i|## and then measuring ##|\psi_j \rangle \langle \psi_j|##) 1000 times for each setup ##(i,j)##. The paper the website is referring to is much easier to understand than the webpage ;-)).
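Just to make the logic concrete, here is a toy sketch (my own illustration, not the authors' analysis code; the probability value is made up) of how many repetitions of a single yes/no outcome yield an estimate of the Born-rule probability for one setup:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical Born-rule probability p = |<psi_j|psi_i>|^2 for one setup (i, j);
# the value 0.7 is invented purely for illustration.
p_true = 0.7
n_trials = 1000  # number of repetitions per setup, as mentioned above

# Each repetition gives only a single binary outcome ("click" or "no click").
outcomes = rng.random(n_trials) < p_true

# Only the relative frequency over many repetitions approximates the probability;
# a single run reveals essentially nothing about the underlying distribution.
p_est = outcomes.mean()
p_err = outcomes.std(ddof=1) / np.sqrt(n_trials)
print(f"estimated probability: {p_est:.3f} +/- {p_err:.3f}")
```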
 
  • Like
Likes nomadreid and atyy
Thank you, vanhees71. That makes more sense than what I thought it said.
 
  • Like
Likes vanhees71
I'm often surprised that I don't understand a popular-science news article but am then able to understand at least the concepts from the original paper, even when I'm not an expert in the topic presented. I think popular-science writing is among the most difficult tasks for a researcher in the field, but it's even more difficult for a science journalist who is usually not an expert in what he is trying to describe.
 
  • Like
Likes nomadreid
I have looked at the original paper but it is largely beyond me. However, would I be right in thinking that, since the measurement takes a finite time, there is at least the potential for it to have dynamics beyond those modeled by a projection operator?

Regards Andrew
 
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid but smooth transitions from one state to another.
 
  • Skeptical
  • Like
Likes Demystifier and andrew s 1905
vanhees71 said:
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid but smooth transitions from one state to another.

Thank you. So would a reasonable picture be that the measuring system, when included in, say, the Schrödinger equation / Hamiltonian, adds terms that lead to the transition from one state to the other and entangle them in the process?

Regards Andrew
 
  • Like
Likes vanhees71
Yes, that's the right picture!
 
  • Like
Likes andrew s 1905
vanhees71 said:
Sure, the state never jumps as a function of time. The time evolution is given by the Schrödinger equation for the wave function (or the von Neumann equation for the statistical operator), which is a differential equation in time, leading to a smooth time dependence of its solutions. There are no quantum jumps in modern quantum mechanics, only more or less rapid but smooth transitions from one state to another.
The Schrödinger evolution is smooth and deterministic. But the evolution also has a random non-deterministic component. Is the non-deterministic component of evolution also smooth? If yes, how do we know that?
 
  • #10
Where is a random non-deterministic component in the Schrödinger equation,
$$\mathrm{i} \partial_t \psi(t,\vec{x})=-\frac{1}{2m} \Delta \psi(t,\vec{x}) + V(\vec{x}) \psi(t,\vec{x})?$$
I'm not talking about open quantum systems in some Schrödinger-Langevin approach!
 
  • #11
vanhees71 said:
Where is a random non-deterministic component in the Schrödinger equation
In the Born rule postulate, i.e. in the probabilistic interpretation of the solution of the Schrödinger equation. In measurements, observables acquire definite values. This acquisition of a definite value is a non-deterministic process. My question is: Is this process smooth or not?
 
  • Like
Likes nomadreid
  • #12
The state is given by the wave function, and this is evolving smoothly with time. What's "random" are the outcomes of measurements, not the states!
 
  • Like
Likes nomadreid
  • #13
vanhees71 said:
What's "random" are the outcome of measurements
Right! And my question is: Do measurement outcomes arise smoothly or not?
 
  • #14
How do you define "measurement outcomes arise smoothly"? You have to let the system you want to measure interact with some measurement device, where some macroscopic observable (i.e., some average) delivers the "outcome". This may have some fluctuations around the average value, but they should be negligible on the macroscopic scale of resolution to provide a "unique" "measurement outcome". This macroscopic "pointer observable" also doesn't jump.
 
  • #15
Demystifier said:
In measurements, observables acquire definite values.

It does not look like the experiment under discussion in this thread does this. The process described in the paper maintains quantum coherence, so it is not a "measurement" in the usual sense, and the term "measurement" in the title is really a misnomer.
 
  • Like
Likes nomadreid and vanhees71
  • #16
vanhees71 said:
I'm not talking about open quantum systems

A quantum system that is not open does not interact with anything, which means it can't be measured, since measurement requires interaction.

vanhees71 said:
What's "random" are the outcome of measurements, not the states!

Once you know a measurement outcome, you have to discontinuously change the state. (This process is usually referred to by the "c" word.)
 
  • Like
Likes vanhees71
  • #17
Now we get into a discussion of interpretation again. Mathematically nothing jumps. Also the effective equations for open quantum systems (quantum master equations) are differential equations (like, e.g., the Kadanoff-Baym or Lindblad equations). The "collapse" may look "instantaneous", but in reality it's just a rapid but smooth change.
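As an illustration of that last sentence, here is a minimal sketch (a generic two-level toy model written with QuTiP, not the equations from the paper's supplement; the rate is made up) showing that a Lindblad master equation makes the off-diagonal ("coherence") element decay rapidly but smoothly rather than jump:

```python
import numpy as np
from qutip import basis, mesolve, sigmam, sigmax

# Two-level toy system prepared in an equal superposition.
psi0 = (basis(2, 0) + basis(2, 1)).unit()

H = 0 * sigmax()                      # no coherent driving; only damping
gamma = 5.0                           # made-up coupling/decay rate
c_ops = [np.sqrt(gamma) * sigmam()]   # Lindblad collapse operator

times = np.linspace(0.0, 2.0, 201)
result = mesolve(H, psi0, times, c_ops, e_ops=[sigmax()])

# <sigma_x> tracks the real part of the off-diagonal density-matrix element;
# it decays exponentially (smoothly) on a timescale set by 1/gamma, no jump.
for t, x in zip(times[::50], result.expect[0][::50]):
    print(f"t = {t:4.2f}   <sigma_x> = {x:+.3f}")
```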
 
  • #18
vanhees71 said:
Now we get into a discussion of interpretation again.

No, what I'm referring to is not what story any interpretation tells about the mathematical process referred to by the "c" word. I'm referring to the mathematical process itself, which is part of the 7 Basic Rules (rule 7, the projection postulate).

vanhees71 said:
the effective equations for open quantum systems (quantum master equations) are differential equations (like, e.g., the Kadanoff-Baym or Lindblad equations). The "collapse" may look "instantaneous", but in reality it's just a rapid but smooth change.

Can you point out where in the paper on the experiment under discussion this "smooth process" is described?
 
  • #19
Eq. (5) and (6) of the Supplement. It's a pretty simple, exactly solvable differential equation (obtained in the rotating-wave approximation). Nothing jumps!
 
  • #20
I am trying to follow the discussion. Given
PeterDonis said:
The process described in the paper maintains quantum coherence, so it is not a "measurement" in the usual sense, and the term "measurement" in the title is really a misnomer.

and the fact that the paper defines an "ideal quantum measurement" in the synopsis, what is the difference? (I counted the term "measurement" 55 times in the paper.)

From what has been said above, is a classical QM measurement one where the measuring device is treated classically (modeled by a projection operator), and an "ideal QM" measurement one where the measuring device is also modeled quantum mechanically, or am I totally mistaken?

Regards Andrew
 
  • Like
Likes nomadreid
  • #21
vanhees71 said:
Eq. (5) and (6) of the Supplement. It's a pretty simple, exactly solvable differential equation (obtained in the rotating-wave approximation). Nothing jumps!

Nor is anything measured, in the usual sense of "measurement" where an irreversible record is made, or, equivalently, where decoherence takes place. All that is being described by those equations is a unitary interaction between a laser and a trapped ion. There is no decoherence and no irreversible record being made. That is why I said that the term "measurement" in this context is a misnomer.

The actual "measurement" in this case is the detection of fluorescence photons. The state described by the master equations you refer to only predicts the probability for such detection, just as with any quantum state. The actual detection process itself is not even described in the paper or the supplemental material; the authors, I take it, assume, as is normal in such papers, that everyone already understands how the detection of fluorescence photons works, that it is an irreversible, decoherent process, and that the quantum states they describe in their paper only predict the probabilities for such detections.
 
  • Like
Likes nomadreid, meekerdb and vanhees71
  • #22
andrew s 1905 said:
From what has been said above, is a classical QM measurement one where the measuring device is treated classically (modeled by a projection operator), and an "ideal QM" measurement one where the measuring device is also modeled quantum mechanically

No. See my response to @vanhees71 in post #21.
 
  • #23
nomadreid said:
one "films" a strontium ion in an electric field during the microsecond of its wave function "collapse"

This is a misleading description of what the experiment described in the paper is actually testing. A much better explanation of the point of the experiment is given in the Introduction to the paper:

What is an ideal measurement in quantum mechanics? What are its inner workings? How does the quantum state change because of it? These have been central questions in the development of quantum mechanics [1]. Notably, today’s accepted answer to the latter question is conceptually different from the one given in the first formalization of quantum mechanics by von Neumann [2]. Then, it was thought that an ideal measurement on a quantum system would inevitably destroy all quantum superpositions. Later, Lüders pointed out [3] that certain superpositions should survive, so that a sequence of ideal measurements would preserve quantum coherence. Lüders’s version is the one accepted today.

In other words, what this experiment is demonstrating is simply that a measurement on a quantum system only destroys superpositions (a better term would be "destroys quantum coherence") for the degrees of freedom involved in the measurement. In this case, only one of the three possible states of the trapped ion is involved in the measurement (i.e., has a nonzero interaction with whatever is being used to make the measurement), so the measurement only destroys quantum coherence that involves that one state--quantum coherence between the other two states is preserved. Or, to put it another way, since only one of the three states is "being measured", only quantum coherence involving that state is lost.
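To state the distinction compactly (my own shorthand for what the quoted Introduction describes, not a formula taken from the paper): if only the state ##|0\rangle## of the three-level system couples to the measurement, with projectors ##P_0 = |0\rangle\langle 0|## and ##P_\perp = |1\rangle\langle 1| + |2\rangle\langle 2|##, the Lüders rule updates the (non-selectively measured) state as
$$\rho \;\longrightarrow\; P_0\,\rho\,P_0 + P_\perp\,\rho\,P_\perp ,$$
which kills the matrix elements ##\langle 0|\rho|1\rangle## and ##\langle 0|\rho|2\rangle## but leaves ##\langle 1|\rho|2\rangle## untouched, whereas a readout that resolved all three states would remove every off-diagonal element.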

The "filming" part comes in because the measurement of the one state that is "being measured" (the state called ##|0\rangle## in the paper) only actually counts as a "measurement" if a fluorescence photon is detected; if no fluorescence photon is detected, then no "measurement" has taken place. Since the probability of detection of a fluorescence photon varies according to the interaction strength that is used, one can run the experiment multiple times with different interaction strengths to take what amount to "snapshots" of the process of loss of quantum coherence involving the state ##|0\rangle##. Very high interaction strength corresponds to the limit in which the probability of detection of a fluorescence photon approaches 1, and quantum coherence involving the state ##|0\rangle## is completely lost; very low interaction strength is the limit in which the probability of detection approaches 0, and quantum coherence is not affected at all. In between interaction strengths correspond to partial loss of quantum coherence.

Note, however, that this "partial" loss of quantum coherence is not a property of a single run of the experiment; it is only a property of an ensemble of runs all made with the same (intermediate) interaction strength. In any single run, either a fluorescence photon is detected or it is not. If it is, quantum coherence is lost; if it is not, quantum coherence is not lost. So there is no "partial" loss of coherence in any single run of the experiment, and there is also no "partial measurement" in any single run; no single run involves the measurement process being "caught" part way through. In any single run, either the measurement happens all the way, or it doesn't happen at all.
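As a toy numerical illustration of that all-or-nothing point (my own sketch with made-up numbers, not the authors' analysis), treat each run as either destroying a coherence term (photon detected) or leaving it alone (no photon); only the ensemble average looks "partial":

```python
import numpy as np

rng = np.random.default_rng(0)

coh0 = 0.5       # made-up initial coherence involving |0> (off-diagonal element)
p_detect = 0.4   # made-up detection probability at an intermediate interaction strength
n_runs = 10_000

# Per run: detection destroys the coherence term, no detection leaves it as is
# (a deliberately crude caricature of the point made in the text above).
detected = rng.random(n_runs) < p_detect
coherence_per_run = np.where(detected, 0.0, coh0)

print("values seen in single runs:", sorted(set(coherence_per_run)))
print("ensemble average          :", coherence_per_run.mean())  # ~ (1 - p_detect) * coh0
```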
 
  • Like
Likes nomadreid and meekerdb
  • #24
Of course, the measurement process itself, in this case the detection of the fluorescence photons, is taken for granted. What I don't understand is why you don't consider the detection of these photons a measurement. If I understand the paper right, Fig. 2 shows real measurement results, which demonstrate that in case (d) you indeed have made a FAPP ideal "Lüders measurement" as described in Fig. 1 (b). I don't see why you think it's misleading to call it a measurement (are you implying the authors are cheating?).
 
  • #25
andrew s 1905 said:
So would a reasonable picture be that the measuring system, when included in, say, the Schrödinger equation / Hamiltonian, adds terms that lead to the transition from one state to the other and entangle them in the process?

The paper's use of the term "measurement" in this context is very misleading. In Fig. 1, for example, the "measurement process" is shown as a brief pulse of the 422 nm laser that couples the ##|0\rangle## state of the atom to the photon states in the cavity, giving a nonzero probability of emission of a photon by the atom. But then, later on, we have "fluorescence detection", and in the paper's Conclusion, this step is described as "a measurement of the photon environment":

The system that is subject to the measurement is brought into contact with a macroscopic pointer state by facilitating a strong interaction between state ##|0\rangle## of the system and the photon environment. A measurement of the photon environment then reveals the state of the system.

So we measure "the system" by letting it interact with "the photon environment", and then...measuring the photon environment. In other words, the label "measurement process" for the interaction between the system (the atom) and the photon environment is a misnomer; that's not where the actual "measurement" occurs. The actual "measurement" is the "measurement of the photon environment", which, as I noted in an earlier post, is not described mathematically in the paper at all.
 
  • Like
Likes nomadreid
  • #26
vanhees71 said:
Of course, the measurement process itself, in this case the detection of the fluorescence photons, is taken for granted. What I don't understand is why you don't consider the detection of these photons a measurement.

I never said the detection of the fluorescence photons is not a measurement. I said the exact opposite: that is the measurement. But the 422 nm laser pulse that is labeled "measurement process" in Fig. 1 of the paper is not the detection of the fluorescence photons (the figure makes that obvious, as I described in post #25 just now). So the 422 nm laser pulse is not a measurement. It's just a unitary interaction that enables a (possible--whether or not a fluorescence photon is actually detected is probabilistic) measurement later on in the experiment.

vanhees71 said:
If I understand the paper right, Fig. 2 shows real measurement results,

No, they're not, they're "reconstructed from experimental data", just like it says in the caption of the figure. The "experimental data" is just the record of fluorescence photon detections or non-detections for each run of the experiment. That is the "measurement results". What is shown in Fig. 2 is a model-dependent calculation, just like the figure's caption says. A model-dependent calculation is not the same as "real measurement results".

vanhees71 said:
I don't see, why you think it's misleading to call it a measurement

See my post #25 and my comments above about the 422 nm laser pulse.

vanhees71 said:
(are you implying the authors are cheating?).

No. Such misleading use of the term "measurement" is, unfortunately, common in the literature. But that doesn't make it any less misleading.
 
  • Like
Likes nomadreid
  • #27
vanhees71 said:
How do you define "measurement outcomes arise smoothly"? ... This macroscopic "pointer observable" also doesn't jump.
So the macroscopic pointer evolves with time smoothly but non-deterministically, would you agree?
 
  • Like
Likes vanhees71
  • #28
PeterDonis said:
So we measure "the system" by letting it interact with "the photon environment", and then...measuring the photon environment. In other words, the label "measurement process" for the interaction between the system (the atom) and the photon environment is a misnomer; that's not where the actual "measurement" occurs. The actual "measurement" is the "measurement of the photon environment", which, as I noted in an earlier post, is not described mathematically in the paper at all.
What's described is a FAPP ideal filter measurement. The measurement is through coupling of the measured system to the "photon environment" (a bit strange an expression for coupling to the electromagnetic quantum field). For strong enough coupling you realize an (almost) ideal filter measurement. The subsequent verification that you realized this filter through interaction with the "photon environment" is just a standard measurement, which is well established. It's not the aim of the paper to also describe this measurement mathematically. I'm pretty sure that this is done in the standard literature of the AMO community.
 
  • #29
PeterDonis said:
I never said the detection of the fluorescence photons is not a measurement. I said the exact opposite: that is the measurement. But the 422 nm laser pulse that is labeled "measurement process" in Fig. 1 of the paper is not the detection of the fluorescence photons (the figure makes that obvious, as I described in post #25 just now). So the 422 nm laser pulse is not a measurement. It's just a unitary interaction that enables a (possible--whether or not a fluoresecence photon is actually detected is probabilistic) measurement later on in the experiment.
The 422 nm laser pulse is a measurement. In the case of strong enough intensities it's even an ideal filter measurement, as is demonstrated in the supplement. For not so high intensities it's not an ideal filter measurement but one with some uncertainty, which is quantified by the corresponding transition probabilities. Formally, it's indeed a "measurement", where you don't take note of the result first. That's done with the detection of the fluorescence photons.
PeterDonis said:
No, they're not, they're "reconstructed from experimental data", just like it says in the caption of the figure. The "experimental data" is just the record of fluorescence photon detections or non-detections for each run of the experiment. That is the "measurement results". What is shown in Fig. 2 is a model-dependent calculation, just like the figure's caption says. A model-dependent calculation is not the same as "real measurement results".
If you want to verify that you really have prepared a certain quantum state you have to measure something (here the fluorescence photons) and reconstruct the state from the (statistical) data. What else should a "state-determination measurement" be?
 
  • #30
Demystifier said:
So the macroscopic pointer evolves with time smoothly but non-deterministically, would you agree?
If you mean by "macroscopic pointer" some macroscopic observable (like a pointer position), I agree, because indeed it's an average (over a macroscopically small but microscopically large space-time interval), and there are (quantum as well as thermal) fluctuations around the mean value.
 
  • #31
vanhees71 said:
If you mean by "macroscopic pointer" some macroscopic observable (like a pointer position), I agree, because indeed it's an average (over a macroscopically small but microscopically large space-time interval), and there are (quantum as well as thermal) fluctuations around the mean value.
But the average evolves deterministically, due to the Ehrenfest theorem. And thermal fluctuations can in principle be eliminated, by performing the experiment at zero temperature. So basically, the randomness of measurement outcomes arises from quantum fluctuations, I think you would agree with that. My problem is this: How do we know that the effect of these quantum fluctuations is smooth?
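(For reference, the Ehrenfest relations I have in mind are
$$\frac{\mathrm{d}}{\mathrm{d} t}\langle \vec{x} \rangle = \frac{\langle \vec{p} \rangle}{m}, \qquad \frac{\mathrm{d}}{\mathrm{d} t}\langle \vec{p} \rangle = -\langle \nabla V(\vec{x}) \rangle ,$$
which are deterministic equations for the expectation values; they reduce to the classical equations of motion to the extent that ##\langle \nabla V(\vec{x}) \rangle \approx \nabla V(\langle \vec{x} \rangle)##.)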
 
  • Like
Likes vanhees71
  • #32
I agree. What's smooth are the probability distributions, whose time evolution on the fundamental level is defined by differential equations (von Neumann equation for the stat. op.).

Of course, if you go to effective descriptions of fluctuations a la Langevin you describe them by stochastic differential equations.
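As a toy illustration of such a stochastic description (a classical Ornstein-Uhlenbeck/Langevin process integrated with the Euler-Maruyama scheme, with made-up parameters, not any quantum model), each realization is random but the sample path is continuous, with no finite jumps:

```python
import numpy as np

rng = np.random.default_rng(1)

# Classical Langevin (Ornstein-Uhlenbeck) toy model:  dx = -gamma*x*dt + sigma*dW
gamma, sigma = 1.0, 0.3        # made-up damping and noise strengths
dt, n_steps = 1e-3, 5000

x = np.empty(n_steps)
x[0] = 1.0
for i in range(1, n_steps):
    dW = np.sqrt(dt) * rng.standard_normal()   # Wiener increment
    x[i] = x[i - 1] - gamma * x[i - 1] * dt + sigma * dW

# Successive increments are of order sqrt(dt): the sample path is continuous
# (no finite jumps), even though every realization is random.
print("max |x_{i+1} - x_i| =", np.abs(np.diff(x)).max())
```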
 
  • #33
vanhees71 said:
What's smooth are the probability distributions, whose time evolution on the fundamental level is defined by differential equations (von Neumann equation for the stat. op.).
Yes, but this evolution is also deterministic. I am interested in the random non-deterministic evolution.

vanhees71 said:
Of course, if you go to effective descriptions of fluctuations a la Langevin you describe them by stochastic differential equations.
Fine, this effective description suggests that the random time evolution is smooth. But another effective description based on quantum jumps suggests that it isn't smooth. So it seems that effective descriptions alone cannot decide. Is there a more fundamental argument for the thesis that the random time evolution is smooth?
 
  • #34
vanhees71 said:
The state is given by the wave function, and this is evolving smoothly with time. What's "random" are the outcome of measurements, not the states!
Do those outcomes arise smoothly or instantaneously? How would they arise smoothly if measuring the position of a particle means that the particle is found at a classically impossible location, e.g. on the other side of a barrier (quantum tunneling)?
Does the term "smoothly", as you are using it, include instantaneous jumps?
 
  • #35
Demystifier said:
But the average evolves deterministically, due to the Ehrenfest theorem. And thermal fluctuations can in principle be eliminated, by performing the experiment at zero temperature. So basically, the randomness of measurement outcomes arises from quantum fluctuations, I think you would agree with that. My problem is this: How do we know that the effect of these quantum fluctuations is smooth?
It seems to me that by definition a macroscopic pointer obeys classical physics and hence would have to respond smoothly to any fluctuations.

Regards Andrew
 
  • #36
andrew s 1905 said:
It seems to me that by definition a macroscopic pointer obeys classical physics and hence would have to respond smoothly to any fluctuations.
What do you mean by "classical"? If it responds to quantum (that is, non-classical) fluctuations, then I wouldn't call it classical.
 
  • #37
Demystifier said:
What do you mean by "classical"? If it responds to quantum (that is, non-classical) fluctuations, then I wouldn't call it classical.
I mean classical in the sense of "the measuring apparatus is treated classically", i.e. it obeys classical mechanics. I thought that was the whole QM measurement issue. By some means the quantum state being measured is converted to a state in a classical macroscopic object, the pointer. Is that not the only way a "permanent" record can be made?

Regards Andrew
PS My working model is a photomultiplier tube, where the photons are detected statistically but this induces a macroscopic current detected by a galvanometer whose pointer moves smoothly.

I apologise if this is all a red herring, but I am trying to follow the discussion as best I can.
 
  • #38
andrew s 1905 said:
I mean classical in the sense of "the measuring apparatus is treated classically", i.e. it obeys classical mechanics. I thought that was the whole QM measurement issue.
But if its behavior is random, then it doesn't obey classical mechanics. I know that books often carelessly say "the measuring apparatus is treated classically", but strictly speaking it can't be true. If the measurement outcomes are random, then the behavior of the apparatus must be at least partially non-classical.

andrew s 1905 said:
By some means the quantum state being measured is converted to a state in a classical macroscopic object, the pointer. Is that not the only way a "permanent" record can be made?
The italicized phrase, "by some means", is the crucial issue. By what means precisely? In particular, is it by smooth or non-smooth means?
 
  • Like
Likes andrew s 1905
  • #39
Demystifier said:
The italicized phrase, "by some means", is the crucial issue. By what means precisely? In particular, is it by smooth or non-smooth means?
Thanks, that takes me back to my original question in post #5.

Given the above discussions, and assuming the paper did actually describe a measurement and indeed showed it was not instantaneous, does that at least allow the possibility of some dynamics beyond those modeled by a projection operator?

Clearly, there are differing views so I will study more to better understand the various points of view.

Thanks all for taking my questions seriously and giving considered responses.

Regards Andrew
 
  • #40
Demystifier said:
Yes, but this evolution is also deterministic. I am interested in the random non-deterministic evolution. Fine, this effective description suggests that the random time evolution is smooth. But another effective description based on quantum jumps suggests that it isn't smooth. So it seems that effective descriptions alone cannot decide. Is there a more fundamental argument for the thesis that the random time evolution is smooth?
There are no quantum jumps, at least not in standard quantum mechanics. It is an old ad-hoc description of the emission and absorption of photons in the Bohr atom model by electrons jumping between different allowed orbits. Today we describe it dynamically, often sufficiently in first-order perturbation theory with an external classical em. field (semi-classical approximation), which of course describes only induced emission and absorption, or with the quantum em. field, which includes spontaneous emission too.
 
  • #41
EPR said:
Do those outcomes arise smoothly or instantaneously? How would they arise smoothly if measuring the position of a particle means that the particle is found at a classically impossible location, e.g. on the other side of a barrier (quantum tunneling)?
Does the term "smoothly", as you are using it, include instantaneous jumps?
Also tunneling takes of course some time. It's not easy to define the "tunneling time" in a consistent way, and I'm not sure, whether there's a definition to which all quantum physicists agree.

An instantaneous jump is of course the opposite of smooth transitions, and according to standard quantum (field) theory there are no instantaneous jumps.
 
  • #42
vanhees71 said:
Also tunneling takes of course some time. It's not easy to define the "tunneling time" in a consistent way, and I'm not sure, whether there's a definition to which all quantum physicists agree.

An instantaneous jump is of course the opposite of smooth transitions, and according to standard quantum (field) theory there are no instantaneous jumps.
Even on the double slit detection screen? The collapse happens smoothly and takes some time?
I haven't read about that. Has this time been measured?

Edit: there was this Bulgarian guy who worked in this field. It seems you are right. https://www.quantamagazine.org/quantum-leaps-long-assumed-to-be-instantaneous-take-time-20190605/
 
  • Like
Likes vanhees71
  • #43
vanhees71 said:
It's not easy to define the "tunneling time" in a consistent way, and I'm not sure, whether there's a definition to which all quantum physicists agree.
There isn't. Anyway, my definition (which I think is consistent) is in https://arxiv.org/abs/2010.07575
 
  • Like
Likes vanhees71
  • #44
andrew s 1905 said:
Given the above discussions, and assuming the paper did actually describe a measurement and indeed showed it was not instantaneous, does that at least allow the possibility of some dynamics beyond those modeled by a projection operator?
The paper analyses how the probability distribution changes with time. The probability distribution is measured on a large ensemble of events. It says nothing about the time evolution of individual events. Theoretically, the time evolution of the probability distribution is described by a deterministic equation, so there is nothing surprising in the fact that it is smooth. But randomness is a property of single events. The experiment says nothing about the question whether the individual random events are smooth or involve instantaneous jumps.
 
  • Like
Likes andrew s 1905 and vanhees71
  • #45
vanhees71 said:
The 422 nm laser pulse is a measurement.

No, it isn't. As I've already said, the actual measurement is the detection of the fluorescence photons. The paper even says so. I know the paper calls the 422 nm laser pulse a "measurement", but that makes no sense because it goes on to say that this "measurement" happens by measuring the fluorescence photons.

You even acknowledge that yourself:

vanhees71 said:
If you want to verify that you really have prepared a certain quantum state you have to measure something (here the fluorescence photons)

Yes, exactly: the measurement is of the fluorescence photons.
 
  • #46
This is again a semantics discussion. You can also call the 422 nm laser pulse a preparation procedure and only the detection of the fluorescence photons a measurement. It doesn't change the fact that the laser pulse realizes (with larger intensities of the laser) a projection, which is often called a von Neumann filter measurement in the literature.
 
  • #47
andrew s 1905 said:
assuming the paper did actually describe a measurement

The question, as I have been saying, is what measurement the paper describes. The actual measurement in the experiment is the detection of fluorescence photons, and the paper does not describe that at all--it just says it happens, statistics on the detections are collected, and then math is done to reconstruct the state that was prepared by the experimental process. Nowhere does the paper describe or explain the detection of fluorescence photons in terms of any continuous, non-random process.
 
  • #48
vanhees71 said:
the laser pulse realizes (with larger intensities of the laser) a projection

What projection? Please point out the specific equation in the paper that you are referring to.
 
  • #49
PeterDonis said:
The question, as I have been saying, is what measurement the paper describes. The actual measurement in the experiment is the detection of fluorescence photons, and the paper does not describe that at all--it just says it happens, statistics on the detections are collected, and then math is done to reconstruct the state that was prepared by the experimental process. Nowhere does the paper describe or explain the detection of fluorescence photons in terms of any continuous, non-random process.
I realize and appreciate what you are saying. You have a definite use of the term "measurement", but in the paper they define what they mean by an "ideal quantum measurement", which seems acceptable to the referees and Physical Review Letters but is not to you.

I am in no position to judge the issue; I was just reflecting the disagreement and trying to be fair to you in acknowledging your doubt.

Regards Andrew
 
  • #50
Well, do you describe mathematically how you read a pointer position with your eyes?
PeterDonis said:
What projection? Please point out the specific equation in the paper that you are referring to.
In the paragraph right at the page break between pp. 2 and 3. A bit later in the paper, Eq. (7) shows that for ##t \rightarrow \infty## you realize the projection (in the usual literature it's called a measurement; I'm also not happy with that formulation, and I'd rather call it a preparation procedure).
 