Wave Function Collapse and Entropy

stevendaryl
I don't want to argue about whether the notion of "wave function collapse" is a good way of understanding quantum mechanics, or not. For the purposes of this discussion, let's just adopt uncritically the naive approach to quantum mechanics, that:

  1. Between measurements, the system evolves according to Schrodinger's equation.
  2. A measurement always produces an eigenvalue of the operator corresponding to the quantity being measured.
  3. Immediately after a measurement, the wavefunction "collapses" to an eigenstate of that operator corresponding to the eigenvalue that you measured.

My question is: how is the entropy of a system affected by measurement? There is a sense in which it acts like a random perturbation, and so I would think that it would increase the entropy, but on the other hand, the state becomes more definite after a measurement, which would make me think that the entropy has been lowered.

Does my question make any sense, and if so, does it have a standard answer?
 
http://en.wikipedia.org/wiki/Density_matrix#Entropy
"This entropy can increase but never decrease with a projective measurement, however generalised measurements can decrease entropy.[6][7] The entropy of a pure state is zero, while that of a proper mixture always greater than zero. Therefore a pure state may be converted into a mixture by a measurement, but a proper mixture can never be converted into a pure state. Thus the act of measurement induces a fundamental irreversible change on the density matrix; this is analogous to the "collapse" of the state vector, or wavefunction collapse."

Reference 6 of the Wikipedia quote is Nielsen and Chuang.
Theorem 11.9 (p515): Projective measurements increase entropy.
Exercise 11.15 (p515): Generalised measurements can decrease entropy.
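
As a one-qubit illustration of Theorem 11.9 (a minimal NumPy sketch with an arbitrarily chosen state and measurement basis, not taken from Nielsen and Chuang): a non-selective projective measurement turns a pure state into a diagonal mixture, and the von Neumann entropy goes up.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(evals * np.log2(evals)))

# Pure qubit state |psi> = cos(t)|0> + sin(t)|1>  (arbitrary illustrative choice)
t = 0.3
psi = np.array([np.cos(t), np.sin(t)])
rho_pure = np.outer(psi, psi.conj())

# Non-selective projective measurement in the {|0>, |1>} basis:
# rho -> sum_k P_k rho P_k  (here this just removes the off-diagonal terms)
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])
rho_measured = P0 @ rho_pure @ P0 + P1 @ rho_pure @ P1

print(von_neumann_entropy(rho_pure))      # ~0.0  (pure state)
print(von_neumann_entropy(rho_measured))  # > 0   (proper mixture)
```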
 
It may help you to propose some example. What system? How exactly does the collapse happen? Which kind of entropy do you have in mind? Without that, I am afraid your question is too general.
 
Entropy is a statistical concept: S = k ln W (to quote Boltzmann), wherein W represents the number of possible microstates corresponding to the macroscopic state of a system. For a single particle, W = 1, so S = 0.
I have always found it useful to recall that mathematically a single 'particle' is best represented by a tensor field in Minkowski space. That representation simultaneously satisfies both QM and GR.
When a 'measurement' is performed on a 'particle' or a 'wave', a tensor operator is applied to the tensor field to extract the desired value from the set of variables participating in the tensor field. Whether the value is appropriate for a 'particle' or a 'wave' depends on the effect the operator has on the tensor.
In any case, the Dirac mathematics are used: the tensor product of the conjugate tensor with the result of the application of the operator to the original tensor, which is then fully integrated over Minkowski space to produce the value sought. Whichever operator or field is used, the operation irreversibly alters the original tensor (in 3-D this is called 'wave collapse') into something that resembles either a standing wave or a moving particle.
 
I discuss the issue of 'collapse' in detail in my new book in terms of the Transactional Interpretation,

www.cambridge.org/9780521764155

In my proposal, wf 'collapse' is a form of spontaneous symmetry breaking. This would decrease the entropy relative to the perfectly symmetric state if you consider it analogous to the case, say, of a crystal forming in an amorphous solid.
 
rkastner said:
I discuss the issue of 'collapse' in detail in my new book in terms of the Transactional Interpretation,

www.cambridge.org/9780521764155

In my proposal, wf 'collapse' is a form of spontaneous symmetry breaking. This would decrease the entropy relative to the perfectly symmetric state if you consider it analogous to the case, say, of a crystal forming in an amorphous solid.

I read about the Transactional Interpretation decades ago in Analog magazine. Its inventor, John Cramer, was also a columnist in that magazine. I thought it sounded fascinating, but I haven't heard anything much about it since, so I assumed that it wasn't taken seriously by most physicists.
 
Hi Steven,

It is beginning to get more attention. I have been publishing papers on this; many of them are available on arxiv.org. Re the book, I have some introductory and preview material at my website, http://transactionalinterpretation.org/

Best regards
Ruth Kastner
 
Ruth,

Thanks, I will take a look.
 
atyy said:
http://en.wikipedia.org/wiki/Density_matrix#Entropy
"This entropy can increase but never decrease with a projective measurement, however generalised measurements can decrease entropy.[6][7] The entropy of a pure state is zero, while that of a proper mixture always greater than zero. Therefore a pure state may be converted into a mixture by a measurement, but a proper mixture can never be converted into a pure state. Thus the act of measurement induces a fundamental irreversible change on the density matrix; this is analogous to the "collapse" of the state vector, or wavefunction collapse."

Reference 6 of the Wikipedia quote is Nielsen and Chuang.
Theorem 11.9 (p515): Projective measurements increase entropy.
Exercise 11.15 (p515): Generalised measurements can decrease entropy.
This increase of entropy by projective measurement assumes that a true collapse does not exist.
But if it does, then it actually DECREASES entropy. See e.g.
http://arxiv.org/abs/1108.3080
TABLE 1
 
  • #10
This thread is interesting, because opposite answers are given. I think given that stevendaryl explicitly asks about collapse, the answer is that entropy is the same after a measurement. When we perform a projective measurement on a pure state, the outcome is a pure state again and the (von Neumann) entropy of all pure states is zero.

Such a measurement can be divided into two parts: decoherence and collapse. The process of decoherence increases the entropy of the system and collapse reduces it. Nielsen and Chuang (cited by atyy) only consider the first part, so there, entropy increases. But this doesn't produce a single outcome. Tegmark's "observation" in table 1 (cited by Demystifier) starts with the decohered state, so there, entropy decreases. But this doesn't cover the complete measurement process, where we want to start with a pure state. If you combine the processes shown in table 1, my point that entropy is the same after the measurement is nicely illustrated.

stevendaryl said:
[...] but on the other hand, the state becomes more definite after a measurement, which would make me think that the entropy has been lowered.
It does become more definite, but only with respect to one basis; it becomes less definite with respect to other bases (consider spin measurements, for example). So I don't think we should use this kind of reasoning.
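
To make the two-step picture above concrete, here is a minimal single-qubit sketch (the state and basis are arbitrary illustrative choices): dephasing in the measurement basis raises the von Neumann entropy, and then conditioning on one outcome (the collapse) brings it back to zero.

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy in bits."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Step 0: a pure superposition, e.g. |psi> = (|0> + |1>)/sqrt(2)  (arbitrary choice)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
print(entropy(rho))            # 0.0

# Step 1: decoherence in the {|0>,|1>} basis (off-diagonal terms vanish)
rho_dec = np.diag(np.diag(rho))
print(entropy(rho_dec))        # 1.0 bit -- entropy has increased

# Step 2: collapse -- we read off outcome "0" and update the state accordingly
P0 = np.diag([1.0, 0.0])
p0 = np.trace(P0 @ rho_dec).real
rho_collapsed = P0 @ rho_dec @ P0 / p0
print(entropy(rho_collapsed))  # 0.0 -- pure again, entropy back to zero
```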
 
  • #11
Demystifier said:
This increase of entropy by projective measurement assumes that a true collapse does not exist.
But if it does, then it actually DECREASES entropy. See e.g.
http://arxiv.org/abs/1108.3080
TABLE 1

That's the basis for my original question. It seemed that if a particle was in a mixture of a spin-up state and a spin-down state, which is a high-entropy state, and then I measure the spin and find that it is spin-down, then afterward it's in a low-entropy state (pure spin-down).
 
  • #12
kith said:
This thread is interesting, because opposite answers are given. I think given that stevendaryl explicitly asks about collapse, the answer is that entropy is the same after a measurement. When we perform a projective measurement on a pure state, the outcome is a pure state again and the (von Neumann) entropy of all pure states is zero.

Okay, but in classical thermodynamics, the entropy of a system is roughly the log of the volume in phase space of the set of points consistent with macroscopic observables. Can this classical notion of entropy be affected by measurements?
 
  • #13
stevendaryl said:
It seemed that if a particle was in a mixture of a spin-up state and a spin-down state, which is a high-entropy state, and then I measure the spin and find that it is spin-down, then afterward it's in a low-entropy state (pure spin-down).
That's right if you talk about an incoherent mixture of states (which corresponds to a diagonal density matrix). There, the situation is analogous to classical statistical mechanics.

But in this situation, there's no collapse. Collapse occurs only for superpositions, which are pure states and have zero entropy (analogous to points in phase space). If your initial state is a superposition of a spin-up state and a spin-down state with respect to the z-axis, you can always find a measurement direction such that your state is an eigenstate of the corresponding spin operator. A measurement in this direction always yields the same result, hence your initial state already is a low-entropy state. It's not meaningful to assign different entropies to superpositions and eigenstates, because the property of being an eigenstate depends on the basis. Every state is a superposition with respect to some basis.
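
As a concrete version of this point, using standard spin-1/2 algebra: the state that looks like an equal superposition in the ##S_z## basis is simply an eigenstate of ##S_x##,

$$\vert {+x}\rangle = \tfrac{1}{\sqrt{2}}\bigl(\vert {+z}\rangle + \vert {-z}\rangle\bigr), \qquad S_x \vert {+x}\rangle = +\tfrac{\hbar}{2}\,\vert {+x}\rangle ,$$

so measuring ##S_x## on it gives ##+\hbar/2## with certainty, while measuring ##S_z## gives ##\pm\hbar/2## with probability 1/2 each.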
 
  • #14
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.
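
A toy numerical illustration of that distinction (the Hamiltonian is a random choice, and simply dropping coherences in a fixed basis stands in here for a proper coarse graining): under unitary evolution the von Neumann entropy of a pure state stays exactly zero, while the coarse-grained entropy grows from zero.

```python
import numpy as np

rng = np.random.default_rng(0)

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Toy system: 3 qubits, random Hermitian Hamiltonian, initial pure state |000>
dim = 2**3
H = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (H + H.conj().T) / 2                       # make it Hermitian
evals, evecs = np.linalg.eigh(H)

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0

for t in [0.0, 0.5, 1.0, 2.0]:
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    psi_t = U @ psi0
    rho_t = np.outer(psi_t, psi_t.conj())
    rho_cg = np.diag(np.diag(rho_t))           # "coarse graining": drop coherences
    print(t, round(entropy(rho_t), 3), round(entropy(rho_cg), 3))
# The exact (von Neumann) entropy stays 0; the coarse-grained one grows from 0
# (not necessarily monotonically, in such a small toy model).
```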
 
  • #15
Demystifier said:
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.

My original post about entropy was not about von Neumann entropy, but about thermodynamic entropy.
 
  • #16
Demystifier said:
All the discussion above seems to be about von Neumann entropy. The von Neumann entropy is zero for any pure state. Hence, if the initial and final states are pure states (whatever the unobserved intermediate states are), then the entropy of the final state is the same as that of the initial state.

Then why does entropy in Nature increase with time? Because the entropy responsible for that is NOT the von Neumann entropy. To get an entropy which increases with time, one must introduce some COARSE GRAINING. It is the corresponding coarse grained entropy (and not the von Neumann entropy) which increases with time.

Would this be the same as considering that the density matrix is a reduced density matrix (ie. we can only access the state of a subsystem)? So for a sequence of pure states of the universe, the von Neumann entropy of the reduced density matrix would still increase?

(I guess I'm asking if integrating out is a good enough form of coarse graining.)
 
  • #17
atyy said:
Would this be the same as considering that the density matrix is a reduced density matrix (ie. we can only access the state of a subsystem)? So for a sequence of pure states of the universe, the von Neumann entropy of the reduced density matrix would still increase?
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence. But the fact is that something resembling the wave function collapse does exist (which decoherence by itself cannot explain, which is why decoherence does not completely solve the measurement problem). For that matter it is not important whether the collapse is related to consciousness (von Neumann), or happens spontaneously (GRW), or is only an illusion (many worlds, Bohmian, etc.), as long as it exists at least in the FAPP sense.
 
  • #18
atyy said:
(I guess I'm asking if integrating out is a good enough form of coarse graining.)

I think it's not.
 
  • #19
Demystifier said:
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence. But the fact is that something resembling the wave function collapse does exist (which decoherence by itself cannot explain, which is why decoherence does not completely solve the measurement problem). For that matter it is not important whether the collapse is related to consciousness (von Neumann), or happens spontaneously (GRW), or is only an illusion (many worlds, Bohmian, etc.), as long as it exists at least in the FAPP sense.

Demystifier said:
I think it's not.

I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
 
  • #20
Demystifier said:
If there was nothing resembling the wave function collapse, then one could say that entropy of the subsystem increases, due to decoherence.
But in most cases, there is nothing resembling wave function collapse. Decoherence occurs whenever the interaction between systems leads to entanglement and not only during measurements.

I'm still wondering how exactly this is related to the observable entropy increases. If we start with two pure states, I guess that any fundamental interaction that leads to maximal entanglement should begin to disentangle the systems afterwards. So I would expect an oscillating entropy for the systems (in classical mechanics, no entropy change arises from such a situation). Which of course would call for an explanation why our observations always take place in the rising entropy domain.
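
For what it's worth, a toy two-qubit sketch of exactly this kind of oscillation (the exchange-type coupling is an arbitrary illustrative choice): the entanglement entropy of one qubit rises and falls periodically.

```python
import numpy as np

def entropy(rho):
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Two qubits, initial product state |01>, coupled by an exchange-type interaction
# H = X(x)X + Y(x)Y  (arbitrary choice that entangles and then disentangles them)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
H = np.kron(X, X) + np.kron(Y, Y)

psi0 = np.zeros(4, dtype=complex)
psi0[1] = 1.0                                   # |01>

evals, evecs = np.linalg.eigh(H)
for t in np.linspace(0, np.pi, 9):
    U = evecs @ np.diag(np.exp(-1j * evals * t)) @ evecs.conj().T
    psi = U @ psi0
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
    rho_A = np.trace(rho, axis1=1, axis2=3)     # reduced state of the first qubit
    print(round(t, 2), round(entropy(rho_A), 3))
# The single-qubit entanglement entropy oscillates between 0 and 1 bit.
```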
 
  • #21
atyy said:
I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
If you do an experiment, you get one definite outcome for your observable. The density matrix contains the probabilities for all possible outcomes, so it isn't the final state you perceive.
 
  • #22
atyy said:
I guess it's not obvious to me that the reduced density matrix doesn't involve collapse, since I've seen it said that the Born rule is implicitly used in getting it.
Mathematically, the reduced density matrix is obtained by partial tracing, which technically does not depend on the Born rule. The Born rule only serves as a motivation for doing the partial trace, but formally you can do the partial trace even without such a motivation.

A Born-rule-independent motivation for doing the partial trace is the fact that the evolution of the resulting object (reduced density matrix) does not depend on the whole Hamiltonian, but only on the Hamiltonian for the subsystem.
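
For concreteness, this is all the partial trace does in practice (a generic two-qubit sketch; it is plain linear algebra, with no probabilities of particular outcomes appearing anywhere):

```python
import numpy as np

def partial_trace_env(rho, dim_sys, dim_env):
    """Trace out the environment from rho on H_sys (x) H_env.

    Pure linear algebra: reshape and sum over the environment indices.
    No Born-rule probabilities for specific outcomes are invoked.
    """
    rho = rho.reshape(dim_sys, dim_env, dim_sys, dim_env)
    return np.trace(rho, axis1=1, axis2=3)

# Example: an entangled system+environment pure state (|00> + |11>)/sqrt(2)
psi = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
rho_total = np.outer(psi, psi.conj())

rho_sys = partial_trace_env(rho_total, 2, 2)
print(rho_sys)        # [[0.5, 0], [0, 0.5]] -- a maximally mixed qubit
```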
 
  • #23
kith said:
So I would expect an oscillating entropy for the systems (in classical mechanics, no entropy change arises from such a situation).
In both quantum and classical mechanics (based on deterministic Schrodinger and Newton equations, respectively), it is true that entropy will start to decrease after a certain time, and return arbitrarily closely to the initial state. However, in both cases, the typical time needed for such a return is many orders of magnitude larger than the age of the Universe.

kith said:
Which of course would call for an explanation why our observations always take place in the rising entropy domain.
That's easy to explain. If we happen to live in an entropy-decreasing era, we will naturally redefine the sign of time accordingly, i.e., re-interpret it as an era in which entropy is increasing. That's because our brain, and ability to remember, is also determined by the direction in which entropy increases.

The good question is why the direction in which entropy increases is everywhere the same, i.e., why it is not the case that entropy increases in one subsystem and decreases in another? The answer is that it is interaction between the subsystems which causes them to have the same direction of the entropy increase:
http://arxiv.org/abs/1011.4173v5
 
  • #24
Demystifier said:
In both quantum and classical mechanics (based on deterministic Schrodinger and Newton equations, respectively), it is true that entropy will start to decrease after a certain time, and return arbitrarily closely to the initial state.
Are you talking about the von Neumann and Liouville entropies or about some coarse-grained entropy? I don't see how the Liouville entropy may increase for a subsystem if it remains constant for the whole system.

Demystifier said:
If we happen to live in an entropy-decreasing era, we will naturally redefine the sign of time accordingly, i.e., re-interpret it as an era in which entropy is increasing. That's because our brain, and ability to remember, is also determined by the direction in which entropy increases.
Interesting, I haven't thought along these lines before. Can you recommend a not too technical article which expands on this? Also thanks for the link to your article, it looks quite promising. /edit: I'm skimming it right now and it probably simply is this reference I asked about ;-).
 
  • #25
kith said:
Are you talking about the von Neumann and Liouville entropies or about some coarse-grained entropy? I don't see how the Liouville entropy may increase for a subsystem if it remains constant for the whole system.
I am talking about coarse-grained entropy, of course.

kith said:
Interesting, I haven't thought along these lines before. Can you recommend a not too technical article which expands on this?
For example, Hawking explains it in his "Brief History of Time". I guess I cannot recommend a less technical literature than that.

A much more refined analysis of that stuff, but still very non-technical, is the book:
H. Price, Time's Arrow and Archimedes' Point

Another good related non-technical book is:
D. Z. Albert, Time and Chance

There is also a good non-technical chapter on that in:
R. Penrose, The Emperor's New Mind
 
  • #26
Demystifier said:
Mathematically, the reduced density matrix is obtained by partial tracing, which technically does not depend on the Born rule. The Born rule only serves as a motivation for doing the partial trace, but formally you can do the partial trace even without such a motivation.

A Born-rule-independent motivation for doing the partial trace is the fact that the evolution of the resulting object (reduced density matrix) does not depend on the whole Hamiltonian, but only on the Hamiltonian for the subsystem.

If the Born rule isn't used, couldn't one just give an arbitrary reweighting of the sum over the environment and still get an object defined only on the subsystem (ie. is the averaging over the environment unique if the Born rule isn't used?)
 
  • #27
Demystifier said:
The good question is why the direction in which entropy increases is everywhere the same, i.e., why it is not the case that entropy increases in one subsystem and decreases in another? The answer is that it is interaction between the subsystems which causes them to have the same direction of the entropy increase:
http://arxiv.org/abs/1011.4173v5

To me, the mystery of entropy is not just that the arrows of time for all parts of the universe are the same, but that the thermodynamic arrow of time is aligned with the cosmological arrow of time. That is, for all parts of the universe, entropy decreases in the direction of the Big Bang.
 
  • #28
kith said:
If you do an experiment, you get one definite outcome for your observable. The density matrix contains the probabilities for all possible outcomes, so it isn't the final state you perceive.

Well, a nondeterministic theory cannot possibly describe the final outcome, it can only describe the set of possibilities and their associated probabilities.
 
  • #29
stevendaryl said:
Well, a nondeterministic theory cannot possibly describe the final outcome, it can only describe the set of possibilities and their associated probabilities.

The recipe of using the reduced matrix may not imply the collapse interpretation, but it seems that is as close as you can get.

The collapse interpretation says that initially the system is in some state ##\vert \Psi\rangle##. You perform an experiment to measure some observable with eigenvalues ##\lambda## and corresponding eigenstates ##\vert \Psi_\lambda\rangle## (for simplicity, assume non-degeneracy). Then the results are that afterward:

For every value of ##\lambda##, there is a probability of ##\vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2## that the system is in state ##\vert \Psi_\lambda\rangle##.

This is captured by the density matrix formalism as the transition

$$\vert \Psi \rangle \langle \Psi \vert \;\Rightarrow\; \sum_\lambda \vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2 \,\vert \Psi_\lambda \rangle \langle \Psi_\lambda \vert$$
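
As a worked one-qubit instance of that transition (just the standard arithmetic for an arbitrary state): take ##\vert \Psi\rangle = \alpha\vert 0\rangle + \beta\vert 1\rangle## with ##|\alpha|^2 + |\beta|^2 = 1## and measure in the ##\{\vert 0\rangle, \vert 1\rangle\}## basis. Then

$$\vert \Psi \rangle \langle \Psi \vert \;\Rightarrow\; |\alpha|^2 \,\vert 0 \rangle \langle 0 \vert + |\beta|^2 \,\vert 1 \rangle \langle 1 \vert ,$$

and the von Neumann entropy goes from ##0## (pure state) to ##-|\alpha|^2\log_2|\alpha|^2 - |\beta|^2\log_2|\beta|^2##, which reaches its maximum of one bit at ##|\alpha|^2 = |\beta|^2 = 1/2##.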
 
  • #30
I guess thinking about it classically, Demystifier's argument must be right. Measurement gives us more information, which is a reduction in entropy. Entropy increases when we forget, according to Landauer's exorcism of Maxwell's demon.

I guess what's not obvious to me is - how much coarse graining do we need, since the partial trace in getting the reduced density matrix is a form of coarse graining?
 
  • #31
The von Neumann entropy and the Shannon entropy are the same if we average over a time that is not small compared to the relaxation time of the subsystems, but small compared to the relaxation time of the complete system.

When the decoherence process has finished, the off-diagonal terms of the density operator are null, so the entropy has grown; but when the collapse matters, the entropy can decrease, in the extreme case back to a pure state, or to a mixture with no more entropy than before the decoherence process started. If the measurement is not ideal, the entropy can increase, though obviously not up to the entropy at the end of the decoherence process, because then no information would be gained and there would be no measurement at all.
 
  • #32
Of course, the wave function is different depending on whether we consider the collapse or not, and the subsequent evolution is too. But the situations are different: if you consider the measurement as having been performed, the posterior wavefunction gives the probabilities conditioned on the measurement result; if you do not consider the collapse and consider the total evolution, you only obtain the probabilities of measurements, not conditioned on the result of any measurement. All the problems are solved by considering that the wavefunction is only an instrument for calculating the probabilities of a measurement, in relation to the previous information, and not a real state of the system. The measurement problem doesn't exist with this consideration, and collapse is perfectly OK.
 
  • #33
StarsRuler said:
Of course, the wave function is different depending on whether we consider the collapse or not, and the subsequent evolution is too. But the situations are different: if you consider the measurement as having been performed, the posterior wavefunction gives the probabilities conditioned on the measurement result; if you do not consider the collapse and consider the total evolution, you only obtain the probabilities of measurements, not conditioned on the result of any measurement. All the problems are solved by considering that the wavefunction is only an instrument for calculating the probabilities of a measurement, in relation to the previous information, and not a real state of the system. The measurement problem doesn't exist with this consideration, and collapse is perfectly OK.

I don't think that viewing the wave function as merely a matter of information is very satisfying. For one thing, there is interference between alternatives. What does it mean for information to interfere with other information? For another, when people talk about information, it's usually the case that they make a distinction between the state of the world and our information about that state. Bell's theorem shows that there is no sensible notion of "state" that the wave function could be about.

Something that I've thought about that in some ways makes the information-theoretic view more palatable to me, personally, is the "consistent histories" approach. You give up on describing what the state of the world is and the dynamics of that world state, and instead view the object of interest to be histories of observations. Quantum mechanics tells us the relative probabilities of those histories.
 
  • #34
stevendaryl said:
The recipe of using the reduced matrix may not imply the collapse interpretation, but it seems that is as close as you can get.

The collapse interpretation says that initially the system is in some state ##\vert \Psi\rangle##. You perform an experiment to measure some observable with eigenvalues ##\lambda## and corresponding eigenstates ##\vert \Psi_\lambda\rangle## (for simplicity, assume non-degeneracy). Then the results are that afterward:

For every value of ##\lambda##, there is a probability of ##\vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2## that the system is in state ##\vert \Psi_\lambda\rangle##.

This is captured by the density matrix formalism as the transition

$$\vert \Psi \rangle \langle \Psi \vert \;\Rightarrow\; \sum_\lambda \vert \langle \Psi \vert \Psi_\lambda\rangle \vert^2 \,\vert \Psi_\lambda \rangle \langle \Psi_\lambda \vert$$

See the density matrix above: in the transactional interpretation, what you get due to responses from all available absorbers is precisely that density matrix. Each weighted projection operator is an incipient transaction, and its weight is given by the Born Rule. This is true collapse in that there is no causal account for why one of the projection operators becomes the actualized outcome. However, TI provides a clear physical account of measurement as formulated in von Neumann's 'process 1' (the transition from pure state to density matrix), where no other interpretation can give an account of this other than a decision-theoretic one. The Bohmian theory appears to offer a solution, but it is far from clear that having a 'particle' in one of the channels translates to actualization of that outcome (see, e.g., the critique by Wallace and Brown, 2005; ref on request).

What I can't understand is why there seems to be so much resistance to this obvious solution. Yes, it involves including advanced field solutions (with negative energies), but you can't get away from that in the relativistic domain anyway. So the basic message is that in order to solve the measurement problem, you need to take into account relativistic processes (emission and absorption) in a direct-action picture (basic field propagation is time/energy-symmetric). This is very natural, since it is the most general theoretical formulation. Note that you can regain the empirical asymmetry of radiative processes with appropriate boundary conditions. Then the existence of those boundary conditions becomes theoretically falsifiable, which makes it a stronger theory methodologically -- in contrast to the standard approach in which an asymmetric field theory is simply assumed ad hoc.

See my new book on TI, in particular Chapters 3 and 4, for how TI solves the measurement problem. Sorry the book is rather pricey, but you can get it at many libraries and on interlibrary loan if interested. Also I will provide specially discounted, autographed copies for students with documented financial hardship. Contact me through my website to apply for this discount.
http://transactionalinterpretation.org/2012/10/09/to-contact-me/
 
  • #35
I don't know the details of the Bohmian interpretation, but there is no QFT for this interpretation. I don't know whether that is because it is not possible or because it is not clear that it is possible, but it is an open question.
 
  • #36
Something that I've thought about that in some ways makes the information-theoretic view more palatable to me, personally, is the "consistent histories" approach

I read the Griffiths book about consistent histories and found it impenetrable. Do you know of a more readable source for this interpretation?

I don't understand the problem that Bell's theorem poses for the informational interpretation of the wavefunction.

The only thing it does not solve is that it imposes a minimum limit for the measurement, but not the real duration of the measurement (maybe that requires studying the measurement apparatus), and the possible degeneracy in the measurement, which can cause an increase of entropy relative to the pure state just before decoherence starts.
 
  • #37
The consistent histories approach boils down to the idea that one can formulate differing sequences (histories) of events (typically between a given first and last event), and the different possible sequences can form a classical probability space if the associated observables satisfy certain criteria. But that doesn't solve the measurement problem. Instead it invokes linguistic limitations on what properties you are allowed to consider as simultaneously determinate, based on which 'framework' (decomposition of the multiple-time Hilbert space) they belong to. Thus it is 'contextual' in that certain properties are only attributable to the system in the context of a given framework -- and the framework is an epistemic construct. In contrast I argue that you don't need an observer to solve the measurement problem, if you take emission and absorption into account. Things happen -- and what happens does not depend on which framework an observer wishes to consider.
 
  • #38
In contrast I argue that you don't need an observer to solve the measurement problem, if you take emission and absorption into account

But an observer is not a mental observer in the informational approximation to the wavefunction either; it is a classical apparatus, considered like any system, or better, the coordinates (in phase space) of any system with a precision lower than the uncertainty relations. Even an electron could be a classical apparatus, but the precision in position (if the momentum is reasonably determinate) would be ridiculous. This is the Landau vision, for example. The only problem I see is that we could consider the whole universe classical; maybe the only limitation is that the spatial uncertainty cannot exceed the length of the universe. Maybe that would force the particle to be ONLY quantum. This last is a conjecture of mine. Without such a limitation, the Landau vision would be problematic: the whole universe would be classical and there would be no wavefunction. Well, the Landau vision is probably more elaborate than this; I only know his book on non-relativistic quantum physics, completed with the decoherence process. This is the only vision I manage to understand. Many worlds too, but ...

I like many worlds too, but many worlds = many problems. For example: probability rules only work for an infinite repetition of an experiment, but the probability is near the relative frequency when the number of repetitions is reasonably large, with an error that goes like the square root of n.
 
  • #39
StarsRuler said:
I don't know the details of the Bohmian interpretation, but there is no QFT for this interpretation. I don't know whether that is because it is not possible or because it is not clear that it is possible, but it is an open question.
Or perhaps there is Bohmian interpretation of QFT, but you didn't know about it? See e.g.
http://arxiv.org/abs/0904.2287 [Int. J. Mod. Phys. A25:1477-1505, 2010]
 
  • #40
StarsRuler said:
But an observer is not a mental observer in the informational approximation to the wavefunction either; it is a classical apparatus, ...

"I like many worlds too , but many worlds=many problems. for example: Probability rules only works in an infinite repetition of an experiment, but probability is near relative frequency when repetitions are reasonable large, with an error sqare of n order.

Concerning your first statement, I was referring to the notorious 'Heisenberg cut' problem in which there is no criterion within nonrelativistic QM for saying where the 'quantum' realm ends and the 'classical apparatus' realm begins. This is the point of the Schrodinger's Cat paradox: Schrodinger could not find any way to say that the cat isn't in a superposition (or any of the other classical objects in the box, such as the geiger counter). In the TI approach as I have developed it (see my papers and book), there is a clear criterion. You need to include the relativistic domain, noting in particular that the coupling amplitude between fields is the amplitude to emit or absorb a field quantum, to get it.

Yes, many worlds interpretations have a big problem explaining the Born Rule. In TI it is completely evident. You can read it off the density matrix (which describes the set of weighted incipient transactions due to a particular set of absorbers), as I noted previously.
 
  • #41
StarsRuler said:
Yes, there is a criterion for separating quantum from classical. All apparatuses are classical with more or less precision, restricted of course by the uncertainty relations. When the apparatus imprecisions are larger than the measurement errors we get in our experiment, then we must describe it by a quantum wavefunction. But with less precision, even an electron becomes classical, for example.

What you have given is just a pragmatic criterion.
Do you understand the point of the 'Schrodinger's Cat' paradox?
Schrodinger recognized that when you must describe an object by a quantum state, all the interactions of that object with other objects must seemingly inherit the linear evolution of the quantum state of the original object, and there is no way to 'break the linearity', so it seems to infect all other objects with which it interacts, even if they are macroscopic (like a cat). The principled (as opposed to pragmatic) way to stop the 'infection' of macroscopic objects by the quantum linearity is to take into account absorption, which is a relativistic process. You can't successfully solve the measurement problem purely within the nonrelativistic theory.
 
  • #42
StarsRuler, the measurement problem is not the same as a shortcoming in a given theory such as Newtonian physics. It is an intractable problem that nobody has been able to solve adequately in the usual approaches, which treat nonrelativistic theory as the whole story in conceptual terms. Even when using the relativistic QM, people still think there is a measurement problem (e.g. Feynman was baffled by the measurement problem even though he invented a lot of the relativistic quantum theory). This is because he ultimately rejected the direct-action theory he pioneered with Wheeler, because he thought it did not allow self-action. Cramer developed the Wheeler-Feynman approach into TI and that is what offers the solution. Davies developed the W-F approach into a quantum relativistic version that successfully addresses the self-action issue that bothered Feynman.

The solution to the measurement problem consists in using a direct-action theory (Wheeler/Feynman/Davies) to describe field propagation, and taking into account the response of the absorber as the physical circumstance defining measurement. Taking into account that the coupling amplitudes between fields are amplitudes for emissions or absorption defines the micro/macro boundary. This is all explained in my book (specifically Chapters 3,4 and 6) and you can get an idea of the treatment of the micro/macro distinction in Part 5 of this paper published in Foundations of Physics: http://arxiv.org/abs/1204.5227
 
  • #43
Bell's theorem shows that there is no sensible notion of "state" that the wave function could be about.


Sorry, my English is not good, stevendaryl; I don't understand the sentence. Do you mean that the wavefunction represents a state, and information about a state is not strictly the state? Anyway, I don't get the relation with Bell's theorem.
 
  • #44
Since we're throwing around lots of interpretations of entropy, I can't resist adding my personal favorites as fuel to the fire:

E. T. Jaynes argued that "entropy of a system" is an ill-defined concept. Entropy should be defined for probability distributions, not for physical systems. In Jaynes' words (PDF link):

It is possible to maintain the view that the system is at all times in some definite but unknown pure state, which changes because of definite but unknown external forces; the probabilities represent only our ignorance as to the true state. With such an interpretation the expression “irreversible process” represents a semantic confusion; it is not the physical process that is irreversible, but rather our ability to follow it.

And here's Claude Shannon explaining why he chose the name entropy for his famous information-theoretical entropy:

My greatest concern was what to call it. I thought of calling it "information," but the word was overly used, so I decided to call it "uncertainty." When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage."
 
  • #45
StarsRuler said:
Sorry, my English is not good, stevendaryl; I don't understand the sentence. Do you mean that the wavefunction represents a state, and information about a state is not strictly the state? Anyway, I don't get the relation with Bell's theorem.

Well, in the usual way that people talk about "information", there is a distinction between what is true about the system we are interested in, and what we know about that system. For example, in communication theory, we have a message, which is a sequence of letters, and that sequence is altered by noise in the communication channel. Since the receiver doesn't know for certain whether any particular character is noise or not, the best he can do is come up with a probability distribution on possible messages.

In the case of classical statistical mechanics, the system is assumed to be in some unknown state, described by a point in phase space (phase space gives the locations and velocities of all the particles in the system). But we don't actually know this state, so we can quantify our lack of knowledge by using probability distributions on phase space.

In these classical examples, the word "state" is used to mean two different things: (1) what is true about the system of interest, and (2) what we know about the system.

If you want to interpret the wave function as information about the system, then that's a state in sense (2). But it's hard for me to see how it is meaningful to talk about "information about the system" unless there is also an unobserved REAL state in sense (1) that this information is about.
 
  • #46
OK, but it is not necessary to know about the real system; there is no loss of power if we concern ourselves only with what we know about the system. In relativity theory, it is information that cannot go faster than light, so there is no problem with entanglement.
 
  • #47
StarsRuler said:
OK, but it is not necessary to know about the real system; there is no loss of power if we concern ourselves only with what we know about the system. In relativity theory, it is information that cannot go faster than light, so there is no problem with entanglement.

As I said, it's hard for me to understand what the word "know" means, unless it means a correspondence between the system itself and our description of the system.
 
  • #48
it's hard for me to understand what the word "know" means, unless it means a correspondence between the system itself and our description of the system.

But this is because you wish QM to be a complete theory. If we do not impose that requirement, there is no problem with collapse. You are not the only one who has this wish, of course. I think it is good to work on this question (and many others; physics is far from being the final theory at the moment), but then the theory could be very different, and many postulates could change. I think the problem with going from the information sphere to the sphere of real states is that we are always in the information sphere. What conditions must a theory that we cannot study directly satisfy, in order to be the correct theory in the outer sphere corresponding to our (probably) known information sphere?
 
  • #49
StarsRuler said:
But this is because you wish QM to be a complete theory.

I don't think that's correct. It doesn't have anything to do with whether it is a "complete theory" or not. I'm talking about the meaning of the word "information". Or "knowledge". Those words don't require a complete theory, but they do require some notion of "fact" that is distinguished from "information".
 
  • #50
So I would expect an oscillating entropy for the systems (in classical mechanics, no entropy change arises from such a situation).

No, because the measurement time is always equal to or greater than the decoherence time, so you never observe the entropy increase associated with decoherence. You can observe an increase of entropy because the observation you make of the system is not as precise as the observable into which the system decoheres. You can repeat the measurement with more precision, but the system will have evolved within the new decoherence time before the new measurement.
This doesn't justify the permanent non-reduction of entropy in the "stroboscopic history" after every measurement, but it permits it. At present there is no derivation of the second principle of thermodynamics.

Well, there is one (I have only seen it mentioned by Schiller in his books, but he does not develop the demonstration of the second principle), supposing that there is a minimum value of entropy for any system, equal to the Boltzmann constant (in Planck units). If someone knows a reference for it...
 