Ontology of spin measurement

Moreover, isn't it true that the electron of the Dirac equation has no charge or spin unless it interacts with an EM field? Why is that?
Do you have a source for that? In either case, in Bohmian mechanics there is a clear answer: spin is not a property of the particle but of the guiding wave, which is a spinor; the guiding wave imparts spin onto the particle.

In SED (a semi-Bohmian semiclassical competitor to QED) particles are fundamentally spinless as well; spin is imparted onto particles by the circularly polarized modes of the ground state of the background field via the Lorentz force.
 

vanhees71

I'm not sure which part of his textbook you looked at, but see this paper:
Yes, this is a generalization of idealized measurements to the imprecise measurements of real-world detectors, formalized in terms of the POVM formalism. Idealized, i.e., precise, measurements are a special case where the ##\hat{E}_m## are projectors ##|m \rangle \langle m|##, with ##m## labelling a complete orthonormal set of eigenvectors of the measured observable. I've no clue how else to interpret the example used by Peres, if the ##\hat{J}_k## are not the self-adjoint (I don't think Hermitian is sufficient, though Peres claims this) operators representing spin.
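As a concrete illustration, here is a minimal numpy sketch of my own (not taken from Peres; the efficiency parameter eta is a purely hypothetical noise model) showing the projective special case and an imprecise variant of it:

```python
import numpy as np

# Pauli matrices for a spin-1/2 system
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Idealized (projective) measurement of s_z: the POVM elements E_m are
# the projectors |m><m| onto a complete orthonormal set of eigenvectors.
evals, evecs = np.linalg.eigh(sz)
projectors = [np.outer(evecs[:, k], evecs[:, k].conj()) for k in range(2)]
assert np.allclose(sum(projectors), np.eye(2))  # sum_m E_m = 1

# An imprecise detector: mix each projector with a little white noise
# (eta is a hypothetical efficiency parameter, purely for illustration).
eta = 0.9
noisy = [eta * P + (1 - eta) * np.eye(2) / 2 for P in projectors]
assert np.allclose(sum(noisy), np.eye(2))       # still a valid POVM

# Born-rule probabilities p_m = Tr(rho E_m) for the pure state |+x>
rho = (np.eye(2) + sx) / 2
print([np.trace(rho @ E).real for E in noisy])  # [0.5, 0.5]
```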
 

DarMM

I've no clue how else to interpret the example used by Peres, if the ##\hat{J}_k## are not the self-adjoint (I don't think Hermitian is sufficient, though Peres claims this) operators representing spin.
He's not saying the ##\hat{J}_k## don't represent spin. He's saying that a typical POVM cannot be understood as a measurement of ##\hat{J}\cdot n## for some direction ##n##, i.e., a typical POVM cannot be associated with spin in a given direction, or in fact with any classical quantity. It seems to be simply an abstract representation of the responses of a given device.
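For instance, the standard "trine" POVM for a qubit makes this concrete (a numpy sketch of my own, not Peres's specific example):

```python
import numpy as np

# Trine POVM: three elements E_k = (2/3)|psi_k><psi_k|, with the |psi_k>
# spaced 120 degrees apart on a great circle of the Bloch sphere.
def trine_element(k):
    theta = 2 * np.pi * k / 3
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    return (2 / 3) * np.outer(psi, psi.conj())

E = [trine_element(k) for k in range(3)]
assert np.allclose(sum(E), np.eye(2))  # resolves the identity

# Each element has eigenvalues {0, 2/3} rather than {0, 1}: it is not a
# projector, so no outcome can be read as "J.n had a definite value" for
# any direction n -- the three outcomes just encode device responses.
for Ek in E:
    print(np.round(np.linalg.eigvalsh(Ek), 3))  # [0.  0.667]
```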
 

vanhees71

Yes, in Quantum Information theory the Shannon entropy is the entropy of the classical model induced by a context, so it naturally depends on the context. I don't see why this is a problem; it's a property of a context. There are many information-theoretic properties that are context dependent in Quantum Information.

Von Neumann entropy is a separate quantity and is a property of the state; it is sometimes called the quantum entropy, and it is equal to the minimum Shannon entropy taken over all contexts.

I don't see that what @vanhees71 and I are saying is all that different. He's just saying that the von Neumann entropy is the quantum generalization of Shannon entropy. That's correct: Shannon entropy is generalized to the von Neumann entropy, but classical Shannon entropy remains as the entropy of a context.

It's only Peres's use, referring to the entropy of the distribution over densities, that seems nonstandard to me.
Of course, the entropy measure depends on the context. That's its strength! It's completely legitimate to define an entropy ##H## as Peres does, and it's obviously useful in some investigations in quantum informatics. To avoid confusion, I'd not call it Shannon entropy.

Let me try again to make the definition clear (hoping to have understood Peres correctly).

Peres describes the classic gedanken experiment to introduce mixed states, and thus the general notion of a quantum state in terms of a statistical operator (which imho should be a self-adjoint positive semidefinite operator with trace 1): Alice (A) prepares particles in pure states ##\hat{P}_n=|u_n \rangle \langle u_n|##, each with probability ##p_n##. The ##|u_n \rangle## are normalized but not necessarily orthogonal to each other. The statistical operator associated with this situation is
$$\hat{\rho}=\sum_n p_n \hat{P}_n.$$
Now Peres defines an entropy by
$$H=-\sum_n p_n \ln p_n.$$
This can be analyzed using Shannon's general scheme. Entropy in Shannon's sense is a measure of the missing information, given a probability distribution, relative to what's considered complete information.

Obviously Peres takes the ##p_n## as the probability distribution. This distribution describes precisely the situation of A's preparation process: it gives the probability that A prepares the state ##\hat{P}_n##. That is, an observer Bob (B) uses ##H## as the entropy measure if he knows that A prepares the specific states ##\hat{P}_n##, each with probability ##p_n##. Now A sends him such a state. For B, complete information would be to know which ##\hat{P}_n## this is, but he knows only the probabilities ##p_n##. That's why B uses ##H## as the measure for the missing information.

Now the mixed state ##\hat{\rho}## defined above describes something different. It provides the probability distribution for any possible measurement on the system. Complete information in QT means that we measure precisely (in the old von Neumann sense) a complete set of compatible observables ##O_k##, represented by self-adjoint operators ##\hat{O}_k## with orthonormalized eigenvectors ##|\{o_k \} \rangle##. If we are even able to filter the systems according to this measurement, we have prepared the system as completely as one can according to QT, namely in the pure state ##\hat{\rho}(\{o_k \})=|\{o_k \} \rangle \langle \{o_k \}|##.

The probabilities for the outcome of such a complete measurement are
$$p(\{o_k \})=\langle \{o_k \} |\hat{\rho}|\{o_k \} \rangle.$$
Relative to this definition of "complete knowledge", given A's state preparation described by ##\hat{\rho}##, B associates with this situation the entropy
$$S=-\sum_{\{o_k \}} p(\{o_k \}) \ln p(\{o_k \}).$$
Now it is clear that this entropy is independent of which complete set of compatible observables B chooses to define complete knowledge in this quantum-theoretical sense, since obviously this entropy is given by
$$S=-\mathrm{Tr} (\hat{\rho} \ln \hat{\rho}).$$
This is the usual definition of the Shannon-Jaynes entropy in quantum theory, and it's identical with von Neumann's definition by this trace. There's no contradiction between ##H## and ##S## of any kind; they are just entropies in Shannon's information-theoretical sense, referring to different information about the same preparation procedure.
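A quick numerical illustration of the difference between ##H## and ##S## (a numpy sketch with a two-state preparation of my own choosing):

```python
import numpy as np

# Alice prepares |0> or |+> (non-orthogonal!), each with probability 1/2
p = np.array([0.5, 0.5])
u0 = np.array([1, 0], dtype=complex)
u1 = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = sum(pn * np.outer(u, u.conj()) for pn, u in zip(p, (u0, u1)))

# Peres's H: entropy of the preparation distribution {p_n}
H = -np.sum(p * np.log(p))

# S = -Tr(rho ln rho), evaluated via the eigenvalues of rho
lam = np.linalg.eigvalsh(rho)
S = -np.sum(lam * np.log(lam))

print(H, S)  # 0.693..., 0.416...: S < H because the |u_n> overlap
```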

One has to keep in mind which "sense of knowledge" the entropy refers to, and then no confusion can occur. As I said before, I'd not call ##H## the Shannon entropy, to avoid confusion, but it's fine as a short name for what Peres clearly defines.
 

A. Neumaier

(I don't think Hermitian is sufficient, though Peres claims this) operators representing spin.
Spin operators are Hermitian and bounded. This already implies that they are self-adjoint.
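In more detail (a standard functional-analysis sketch, in notation of my own): let ##\hat{A}## be Hermitian and bounded; a bounded operator extends to the whole Hilbert space, so ##D(\hat{A})=\mathcal{H}## and ##\langle \hat{A}\varphi|\psi\rangle=\langle \varphi|\hat{A}\psi\rangle## for all ##\varphi,\psi##. Then
$$D(\hat{A}^\dagger)=\{\psi\in\mathcal{H}\,:\,\varphi\mapsto\langle\psi|\hat{A}\varphi\rangle \text{ is a bounded functional}\}=\mathcal{H},\qquad \hat{A}^\dagger\psi=\hat{A}\psi \quad\text{for all }\psi,$$
so ##\hat{A}^\dagger=\hat{A}##. For unbounded operators, hermiticity (symmetry) leaves ##D(\hat{A}^\dagger)## possibly strictly larger than ##D(\hat{A})##, which is exactly why Hermitian and self-adjoint genuinely differ there.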
 

vanhees71

You are using Born's rule, claiming in (2.1.3) of your lecture notes that exact eigenvalues are measured - although these are never measured exactly - to derive on p. 21 the standard formula for the q-expectation (what you there call the mean value) of known observables (e.g., the mean energy ##\langle H\rangle## in equilibrium statistical mechanics) with unknown (most likely irrational) spectra. But you claim that the resulting q-expectation is not a theoretical construct but is ''in agreement with the fundamental definition of the expectation value of a stochastic variable in dependence of the given probabilities for the outcome of a measurement of this variable.'' This would hold only if your outcomes matched the eigenvalues exactly - ''accurately'' is not enough.
We have discussed this a zillion times. This is the standard treatment in introductory texts, and rightfully so, because you first have to define the idealized case of precise measurements. Then you can generalize it to more realistic descriptions of imprecise measurements.

Peres is a bit contradictory when claiming everything is defined by defining some POVM. There are no POVMs in the lab but only real-world preparation and measurement devices.

If, as you claim, precise measurements were not what Peres calls a "quantum test", what sense would this projection procedure then make? Still, I don't think that it's good language to mix the kets representing pure states with eigenstates of the observable operators. At the latest when you bring in dynamics using different pictures of time evolution, this leads to confusion. I understood this only quite a while after having learnt QT for the first time, by reading the book by Fick, which is among the best books on QT I know. The only point which I think is wrong is to invoke the collapse postulate. Obviously one cannot have a QT textbook that gets it completely right :-((.
 

vanhees71

But your usage makes the value of the Shannon entropy dependent on a context (the choice of an orthonormal basis), hence is also not the same as the one vanhees71 would like to use:

Thus we now have three different definitions, and it is far from clear which one is standard.

On the other hand, why should one give two different names to the same concept?
No. As I tried to explain in #154, there's only one definition of Shannon entropy, which is very general but a very clear concept. It is context dependent on purpose, i.e., that's not a bug but a feature of the whole concept.

The only confusion arises due to the unconventional use of the word Shannon entropy in the context of the probabilities described by quantum states.
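To illustrate this context dependence numerically (a numpy sketch of my own with an arbitrarily chosen qubit state; it also shows DarMM's point from above that the minimum over all contexts gives the von Neumann entropy):

```python
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

rho = np.diag([0.8, 0.2])  # a mixed state, diagonal in the z-basis

# Shannon entropy of the outcomes in a basis ("context") rotated by
# an angle t away from the eigenbasis of rho
def context_entropy(t):
    v0 = np.array([np.cos(t / 2), np.sin(t / 2)])
    v1 = np.array([-np.sin(t / 2), np.cos(t / 2)])
    return shannon(np.array([v @ rho @ v for v in (v0, v1)]))

ts = np.linspace(0, np.pi, 201)
print(min(context_entropy(t) for t in ts))  # ~0.5004, attained at t = 0

# ... which coincides with S = -Tr(rho ln rho)
print(shannon(np.linalg.eigvalsh(rho)))     # 0.5004
```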
 

A. Neumaier

No. As I tried to explain in #154, there's only one definition of Shannon entropy, which is very general but a very clear concept. It is context dependent on purpose, i.e., that's not a bug but a feature of the whole concept.

The only confusion arises due to the unconventional use of the word Shannon entropy in the context of the probabilities described by quantum states.
Shannon entropy is a classical concept that applies whenever one has discrete probabilities. Peres, DarMM, and you apply it consistently to three different situations, hence all of you are fully entitled to call it Shannon entropy. You cannot hijack the name for your case alone.
 

vanhees71

Sigh. Shannon entropy is not restricted to classical physics. It's applicable to any situation described with probabilities. I also do not see what's wrong with the usual extension of the Shannon entropy to continuous situations. After all, entropy in the context of statistical physics has been defined using continuous phase-space variables.
 

A. Neumaier

Sigh. Shannon entropy is not restricted to classical physics. It's applicable to any situation described with probabilities.
This is just what I had asserted. Instead of sighing, it might be better to pay attention to what was actually said: you, DarMM and Peres consider three different kinds of quantum situations described with probabilities and hence get three different Shannon entropies. They all fully deserve this name.
I also do not see what's wrong with the usual extension of the Shannon entropy to continuous situations.
The Shannon entropy of a source is defined as the minimal expected number of questions that need to be asked to pin down the classical state of the source (i.e., the exact knowledge of what was transmitted), given the probability distribution for the possibilities. It applies by its nature only to discrete probabilities, since for continuous events no finite amount of questions pins down the state exactly.
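A small numerical illustration of the questions picture (a sketch of my own using a Huffman code over a four-outcome distribution; entropies in bits):

```python
import heapq
import numpy as np

# Shannon entropy (in bits) vs. the expected number of yes/no questions
# in an optimal (Huffman) questioning strategy.
p = [0.5, 0.25, 0.15, 0.10]
H = -sum(q * np.log2(q) for q in p)

# Build a Huffman tree over the four outcomes; depth[i] counts the
# questions needed to pin down outcome i.
heap = [(q, [i]) for i, q in enumerate(p)]
depth = [0] * len(p)
heapq.heapify(heap)
while len(heap) > 1:
    qa, a = heapq.heappop(heap)
    qb, b = heapq.heappop(heap)
    for i in a + b:
        depth[i] += 1            # one more question for these outcomes
    heapq.heappush(heap, (qa + qb, a + b))

L = sum(q * d for q, d in zip(p, depth))
print(H, L)   # H ~ 1.742 bits <= L = 1.75 <= H + 1
```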
After all, entropy in the context of statistical physics has been defined using continuous phase-space variables.
In the statistical physics of equilibrium, the Hamiltonian must have a discrete spectrum; otherwise the canonical density operator is not defined: indeed, if the spectrum is not discrete, ##e^{-\beta H}## is not trace class, and the partition function diverges.
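For a discrete spectrum the trace is a convergent sum; a quick check for the harmonic oscillator (a sketch of my own, units ##\hbar = k_B = 1##):

```python
import numpy as np

# Canonical partition function for a discrete spectrum: harmonic
# oscillator, E_n = (n + 1/2) * omega.
beta, omega = 1.0, 1.0
n = np.arange(2000)
Z_numeric = np.sum(np.exp(-beta * omega * (n + 0.5)))
Z_exact = 1.0 / (2.0 * np.sinh(beta * omega / 2.0))
print(Z_numeric, Z_exact)   # both ~0.9595: the trace converges

# By contrast, for a free particle on the infinite line the "trace"
# Tr e^{-beta p^2/2m} ~ V * integral dp/(2 pi) e^{-beta p^2/2m}
# carries a diverging volume factor V; only finite volume (hence
# discrete momenta) gives a well-defined partition function.
```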

On the other hand, Boltzmann's H is, by the above argument, not a Shannon entropy.
 
So basically, the criterion being given is that for something to be "real" it must be capable of being acted on by other "real" things, whereas the wave function is not acted on by anything. But this doesn't seem right, because the wave function is determined by Schrodinger's Equation, which includes the potential energy, and the potential energy is a function of the particle configuration. So I don't think it's correct to say that the wave function is not acted on by anything.
This is a misunderstanding, but a very intriguing one, namely a category error. The wave function is a solution of the Schrödinger equation, which is specifically determined by positions (among other things, which are not relevant for the rest of the argument).

Being a solution to an equation is definitely not the same kind of logical or mathematical relationship as, e.g., the relationship between two dynamical objects such as two masses mutually acting upon each other; the former is a relationship between input and output, while in the latter the relationship is between two inputs that together determine an output.
 
This is a misunderstanding, but a very intriguing one, namely a category error.
In the context of, say, the Copenhagen interpretation of QM, it is, yes. But not in the context of Bohmian mechanics, which was the interpretation under discussion in the post of mine you quoted and the subthread it is part of. In Bohmian mechanics the wave function is a real thing; the Schrodinger Equation is simply an equation that governs the dynamics of this real thing.
 
In Bohmian mechanics the wave function is a real thing
Hmm, I'm not sure that's necessarily the case. BM can frame the wavefunction as nomological rather than ontological, i.e., instead of being a thing that exists, it is a representation of the behaviour of things that exist.

From "Quantum Physics Without Quantum Philosophy" by Goldstein et al

"It should be clear by now what, from a universal viewpoint, the answer to these objections must be: the wave function of the universe should be regarded as a representation, not of substantial physical reality, but of physical law."

"As such, the wave function plays a role analogous to that of the Hamiltonian function H = H (Q, P ) ≡ H (ξ ) in classical mechanics [...] And few would be tempted to regard the Hamiltonian function H as a real physical field, or expect any back-action of particle configurations on this Hamiltonian function."
 
In the context of, say, the Copenhagen interpretation of QM, it is, yes. But not in the context of Bohmian mechanics, which was the interpretation under discussion in the post of mine you quoted and the subthread it is part of. In Bohmian mechanics the wave function is a real thing; the Schrodinger Equation is simply an equation that governs the dynamics of this real thing.
Actually, that isn't exactly true: in BM the wavefunction isn't a field in physical space(time) but a field on configuration space, which determines a velocity (vector) field there (NB: the fact that the wavefunction lives in configuration space while acting upon particles in space(time) is also why BM is an explicitly nonlocal theory).

This is quite similar to the Hamiltonian vector field, which exists in phase space; the difference is that the Hamiltonian vector field is static, while the wavefunction, as a solution of the SE, is dynamic. I suspect, however, that this difference might be a red herring, because we actually know of a static wavefunction, namely the solution of the Wheeler-DeWitt equation.
 
The reference you give, as far as I can tell, isn't talking about BM.
Bohmian Mechanics is the main subject of the book. The quotes are from chapter 11.5: "A Universal Bohmian Theory". Specifically, the wavefunction is described as nomological in response to the objection that the wavefunction in BM doesn't experience any back-action from the existing configuration.
 
the wavefunction is described as nomological in response to the objection that the wavefunction in BM doesn't experience any back-action from the existing configuration
I don't have the book, but the position you describe seems similar to that described in the paper @DarMM linked to in post #50. We discussed that earlier in the thread.
 
I don't have the book, but the position you describe seems similar to that described in the paper @DarMM linked to in post #50. We discussed that earlier in the thread.
Ah ok, I missed the earlier context of the discussion.
 

vanhees71

This is just what I had asserted. Instead of sighing, it might be better to pay attention to what was actually said: you, DarMM and Peres consider three different kinds of quantum situations described with probabilities and hence get three different Shannon entropies. They all fully deserve this name.

The Shannon entropy of a source is defined as the minimal expected number of questions that need to be asked to pin down the classical state of the source (i.e., the exact knowledge of what was transmitted), given the probability distribution for the possibilities. It applies by its nature only to discrete probabilities, since for continuous events no finite amount of questions pins down the state exactly.

In the statistical physics of equilibrium, the Hamiltonian must have a discrete spectrum; otherwise the canonical density operator is not defined: indeed, if the spectrum is not discrete, ##e^{-\beta H}## is not trace class, and the partition function diverges.

On the other hand, Boltzmann's H is, by the above argument, not a Shannon entropy.
So you are saying that the classical examples for the application of equilibrium statistics are flawed, i.e., no ideal gases, no Planck black-body radiation, no specific heat of solids, and all that? How can it then be that it works so well in physics? That you have to take the "thermodynamic limit" very carefully is clear.
 

A. Neumaier

Science Advisor
Insights Author
6,716
2,674
So you are saying that the classical examples for the application of equilibrium statistics are flawed, i.e., no ideal gases, no Planck black-body radiation, no specific heat of solids, and all that? How can it then be that it works so well in physics? That you have to take the "thermodynamic limit" very carefully is clear.
The derivation of thermodynamics from statistical mechanics is sound. I give such a derivation in Part II of my online book in terms of the grand canonical density operator, independent of any interpretation in terms of probabilities and of any thermodynamical limit.

I am only claiming that the thermodynamic concept of entropy in general (e.g., in Boltzmann's H-theorem) has nothing to do with Shannon entropy. It is not amenable to an information theoretic analysis. The latter is limited to discrete probabilities.
 

vanhees71

The derivation of thermodynamics from statistical mechanics is sound. I give such a derivation in Part II of my online book in terms of the grand canonical density operator, independent of any interpretation in terms of probabilities and of any thermodynamical limit.

I am only claiming that the thermodynamic concept of entropy in general (e.g., in Boltzmann's H-theorem) has nothing to do with Shannon entropy. It is not amenable to an information theoretic analysis. The latter is limited to discrete probabilities.
Well, there are two camps in the physics community: one camp likes the information-theoretical approach to statistical physics, the other hates it. I belong to the first camp, because for me the information-theoretical approach provides the best understanding of what entropy is from a microscopic point of view.

What I mean by the "thermodynamical limit" is the limit of taking the volume to infinity, keeping densities constant. This is the non-trivial limit you seem to refer to when saying you need discrete probability distributions. Indeed, at finite volume with the appropriate spatial boundary conditions (periodic, in case you want proper momentum representations), momentum becomes discrete and you can do all calculations in a (pretty) well-defined way.
 

A. Neumaier

Well, there are two camps in the physics community: one camp likes the information-theoretical approach to statistical physics, the other hates it.
My arguments are of a logical nature. They do not depend on emotional feelings associated with an approach. The information theoretical approach is limited by the nature of the questions it asks.
What I mean by the "thermodynamical limit" is the limit of taking the volume to infinity, keeping densities constant.
In this thermodynamical limit, all statistical uncertainties reduce to zero, and no trace of the statistics remains.

In particular, it is not the limit in which a discrete probability distribution becomes a continuous one. Thus it does not justify applying information-theoretical reasoning to continuous probability distributions.

Neither is Boltzmann's H-theorem phrased in terms of a thermodynamic limit. No information theory is needed to motivate and understand his results. Indeed, they were obtained many decades before Jaynes introduced the information-theoretical interpretation.
 

A. Neumaier

Do you mean a discrete sample space?
I mean a discrete measure on the sigma-algebra with respect to which the random variables associated with the probabilities are defined. This is needed to make sense of the Shannon entropy as a measure of lack of information. For example, Wikipedia says:
The Shannon entropy is restricted to random variables taking discrete values. The corresponding formula for a continuous random variable [...] is usually referred to as the continuous entropy, or differential entropy. A precursor of the continuous entropy h[f] is the expression for the functional Η in the H-theorem of Boltzmann. [...] Differential entropy lacks a number of properties that the Shannon discrete entropy has – it can even be negative. [...] The differential entropy is not a limit of the Shannon entropy for n → ∞. Rather, it differs from the limit of the Shannon entropy by an infinite offset. [...] It turns out as a result that, unlike the Shannon entropy, the differential entropy is not in general a good measure of uncertainty or information.
for me the information-theoretical approach provides the best understanding of what entropy is from a microscopic point of view.
What should a negative amount of missing information mean??? Which improved understanding does it provide???
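To make the point concrete (a numpy sketch with a uniform density of my own choosing):

```python
import numpy as np

# Differential entropy of a uniform density on [0, a] is ln(a):
# negative whenever a < 1, so it cannot count missing bits.
a = 0.1
print(np.log(a))                           # -2.303

# Discretize [0, a] into bins of width delta: the Shannon entropy of
# the discretized distribution is ln(a) - ln(delta), which diverges
# as delta -> 0 -- the "infinite offset" of the Wikipedia quote.
for delta in [1e-2, 1e-4, 1e-6]:
    n_bins = round(a / delta)
    p = np.full(n_bins, 1 / n_bins)
    print(delta, -np.sum(p * np.log(p)))   # ln(a) - ln(delta)
```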
 
