The typical and the exceptional in physics

The discussion centers on the implications of quantum mechanics for macroscopic objects, particularly regarding their position and standard deviation. It argues that while quantum mechanics allows for superpositions, practical physics often focuses on typical behaviors rather than exceptional cases, as these are more relevant for applications. The conversation highlights that statistical mechanics successfully describes macroscopic properties using mixed states, which do not adhere to the superposition principle applicable to pure states. Additionally, it addresses the circular reasoning in assuming small standard deviations for macroscopic observables without substantial justification. Ultimately, the dialogue emphasizes the distinction between theoretical constructs and the practical realities of physical systems.
  • #31
In any case ##\hat{\rho}_{\text{Bob}}## just describes Bob's knowledge about the particle. That's a simpler interpretation without changing any observable content of the description!
 
  • #32
What I've never understood is what objection there is to saying that the "collapse" is a collapse of information. What has collapsed is the way you characterize the system, and the predictions you now make based on the new information. You're dealt a hand in cards, and it could be anything, but you pick the hand up and look at it, and now you have new information. We don't call that a "collapse" because there is no interference between different possible hands, but we know very well why: those different hands have already decohered and the interferences are destroyed. So what is the difference between saying "the hand was determined before I picked it up" versus "the possible hands that existed before I looked show no mutual coherences"?

I think those two statements are in every way precisely the same, as long as we understand that what we are doing is manipulating information. Can anyone tell me an operational difference between those two quoted statements?

It seems to me that what we have here is a single nonproblematic epistemology that can be connected nonuniquely with multiple ontologies, all of which are strange in some way. We should not be surprised that insisting on mapping epistemologies to ontologies produces strange results, as it has throughout the entire history of science. We are still going to do it, we like ontology, but why be surprised that the result is strange? That's what I don't get.
 
  • Like
Likes odietrich and vanhees71
  • #33
vanhees71 said:
In any case ##\hat{\rho}_{\text{Bob}}## just describes Bob's knowledge about the particle. That's a simpler interpretation without changing any observable content of the description!

But that's a weird perspective. When it comes to the two-particle composite system, Bob and Alice know everything there is to know about this system. It's described by a pure state, which is, for quantum mechanics, the maximum amount of information you can have about a system. To say that Bob's mixed state reflects his ignorance about his particle means that he knows less about a part of a system than he knows about the whole system.

Actually, I read a paper once that described entanglement in exactly these terms. For a classical composite system, the entropy of the complete system has to be greater than the entropy of any of the components. But for quantum mechanics, this isn't always the case. For a two-particle entangled system, the entropy of the composite system can be zero, because you know exactly what the state is. But the entropy of the components can be nonzero.
 
  • #34
Ken G said:
What I've never understood is what objection there is to saying that the "collapse" is a collapse of information. What has collapsed is the way you characterize the system, and the predictions you now make based on the new information. You're dealt a hand in cards, and it could be anything, but you pick the hand up and look at it, and now you have new information. We don't call that a "collapse" because there is no interference between different possible hands, but we know very well why: those different hands have already decohered and the interferences are destroyed. So what is the difference between saying "the hand was determined before I picked it up" versus "the possible hands that existed before I looked show no mutual coherences"?

Yeah, that's one of the many things that is strange about quantum mechanics. The collapse, or update, or whatever you want to call it, when you perform a measurement seems completely unremarkable if you think that there already was some definite value for whatever it is that you measured, you just didn't know what it was. And for any experiment you want to perform on a system, there is no detectable difference between the two interpretations: (1) It already had a value, I just didn't know what it was, and (2) It didn't have a value until I measured it.

In light of that, it's tempting to just treat quantum probabilities as classical probabilities, reflecting ignorance of the true state of things. But Bell's theorem (or you could say, the existence of noncommuting observables) shows that it's not consistent to believe that about all possible measurements.

Of course, you can only measure one thing at a time, but which thing you measure isn't determined ahead of time.
 
  • Like
Likes RockyMarciano
  • #35
stevendaryl said:
Actually, I read a paper once that described entanglement in exactly these terms. For a classical composite system, the entropy of the complete system has to be greater than the entropy of any of the components. But for quantum mechanics, this isn't always the case.

Yes - and that's because of the possibility of pure states in QM. The Araki-Lieb inequality states that for any two quantum systems A and B the entropies are bounded by
$$|S(A) - S(B)| \le S(A,B) \le S(A) + S(B)$$
The right-hand bound is of course just the classical entropy inequality, but the left-hand bound is purely quantum mechanical.

The corresponding inequality for classical entropies is
$$\sup[S(A), S(B)] \le S(A,B) \le S(A) + S(B)$$
where now the left-hand bound expresses the fact that for any 2-component classical system the total entropy must be at least as large as the entropy of either of the component parts.

Consider the information-theoretic measure of correlation ##I = S(A) + S(B) - S(A,B)##: this is just the mutual information, a measure of the difference in information between making measurements on A and B alone and making joint measurements on the [AB] system.

Then for quantum systems the correlation is bounded by
$$I \le 2 \inf[S(A), S(B)]$$
while the corresponding classical bound is
$$I \le \inf[S(A), S(B)]$$

So in these terms there is the potential for a quantum system to have twice as much information content in the correlation as the corresponding classical system.
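For anyone who wants to check these numbers, here is a minimal numerical sketch (assuming Python with numpy; the helper name is mine) using the two-qubit singlet state. It reproduces S(A,B) = 0, S(A) = S(B) = ln 2, and a mutual information I = 2 ln 2 that saturates the quantum bound:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return float(-np.sum(evals * np.log(evals)))

# Singlet state of two qubits in the basis |00>, |01>, |10>, |11>.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho_AB = np.outer(psi, psi.conj())        # pure-state density matrix

# Partial traces: reshape to (2,2,2,2) and sum over one subsystem's indices.
rho4 = rho_AB.reshape(2, 2, 2, 2)
rho_A = np.trace(rho4, axis1=1, axis2=3)  # trace out B
rho_B = np.trace(rho4, axis1=0, axis2=2)  # trace out A

S_AB, S_A, S_B = map(von_neumann_entropy, (rho_AB, rho_A, rho_B))
I = S_A + S_B - S_AB                      # mutual information

print(S_AB)   # 0.0             (pure composite state)
print(S_A)    # 0.693... = ln 2 (maximally mixed subsystem)
print(I)      # 1.386... = 2 ln 2, saturating I <= 2 inf[S(A), S(B)]
```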
 
  • Like
Likes vanhees71
  • #36
stevendaryl said:
I don't know in what sense you are rejecting collapse
This is my view of collapse:

The collapse is a sudden change of the model used by an observer to reinterpret the situation when new information comes in, hence depends on when and whether the observer cares to take notice of a physical fact. This clearly cannot affect other observers and their models of the physical situation. Hence there is no nonlocal effect. Nonlocal correlations appear only when a single observer compares records of other (distant) observers' measurements. At that time the past light cone of this observer contains all the previously nonlocal information, so that locality is again not violated.

As one can see from the wording in terms of subjective information, it applies to modeling a large sample of equally prepared systems when the model is loosely interpreted to apply to a single one of these. Although this interpretation is strictly speaking forbidden (in the sense that objective probabilities for single events do not exist), it is informally meaningful in a Bayesian subjective probability interpretation.
 
  • Like
Likes vanhees71
  • #37
Ken G said:
When we idealize our systems as closed, we are left with no way to explain how they would behave.
This is no different from the classical situation, where we idealize our die-casting system to a stochastic process, and are left with no way to explain how a single die would behave.

Idealizations always introduce uncertainty, and if this uncertainty is big because there is sensitive dependence on unmodeled degrees of freedom then only probabilities (i.e., relative frequencies for a large sample) can be predicted.
 
  • #38
A. Neumaier said:
This is my view of collapse:

The collapse is a sudden change of the model used by an observer to reinterpret the situation when new information comes in, hence depends on when and whether the observer cares to take notice of a physical fact. This clearly cannot affect other observers and their models of the physical situation. Hence there is no nonlocal effect.

It's unclear. For quantum systems, there are two aspects to measurements: (1) the choice of an observable (for spin-1/2, it's a direction to measure spin relative to), and (2) the value for that observable. Once the observable is chosen, then the subsequent update that comes from learning the value of that variable is exactly like an ordinary classical update that occurs when you learn the value of a pre-existing quantity. But the fact that in quantum mechanics, the probability only exists after the observable is chosen makes it difficult (for me) to view collapse as simply updating based on new information.
 
  • Like
Likes zonde
  • #39
A. Neumaier said:
Idealizations always introduce uncertainty, and if this uncertainty is big because there is sensitive dependence on unmodeled degrees of freedom then only probabilities (i.e., relative frequencies for a large sample) can be predicted.

I would say that while it is correct to use open systems, it's also easy to be misled. Yes, there is a sensitive dependence on unmodeled degrees of freedom. But that is NOT what is going on with mixed states due to entanglement. Or at least, that's not all that is going on. As I said, a PURE two-component state becomes a mixed state when you trace out the degrees of freedom of one of the components. But in that case, it's just factually incorrect to ascribe the probabilities in the resulting density matrix to sensitive dependence on unmodeled degrees of freedom. The probabilities in this case don't come from ignorance about the exact details, because we started with a pure state, where we know all there is to know.
 
  • #40
stevendaryl said:
because we started with a pure state, where we know all there is to know.
This is a severe problem for the knowledge interpretation of quantum mechanics, and only for that interpretation. This interpretation claims that a pure state gives the maximum knowledge one can have about a system, while a mixed state represents incomplete knowledge.

But this view is self-contradictory, as your tracing-out example shows. Knowing everything about the whole system would imply knowing very little about the subsystem, while if you know everything about a subsystem but nothing about the remainder of the system, this cannot even be described in this model of knowledge.

Thus I reject the whole basis of your reasoning as it is conceptually unsound.

A sensible interpretation of quantum mechanics must not only assign meaning to the whole system but to all subsystems. Indeed, of complex systems we usually know a lot about subsystems but not so much about the whole system. My interpretation of a density operator satisfies this requirement (and is completely different from your conceptually inconsistent view).
 
  • #41
A. Neumaier said:
This is a severe problem for the knowledge interpretation of quantum mechanics, and only for that interpretation. This interpretation claims that a pure state gives the maximum knowledge one can have about a system, while a mixed state represents incomplete knowledge.

But this view is self-contradictory, as your tracing-out example shows.

Well, I consider just about all interpretations of quantum mechanics, including yours, to be in the same boat.
 
  • #42
stevendaryl said:
But the fact that in quantum mechanics, the probability only exists after the observable is chosen makes it difficult (for me) to view collapse as simply updating based on new information.
I don't understand. One knows which information came in (the measurement of a particular spin direction) and one updates the model (not the probability!) according to that information. Probability doesn't even enter!

This is just the same as is done in real-time stochastic modeling of the stock market - whenever some new data come in (whatever these data are about) the model is updated.

The only difference is that the stochastic model is in the first case a quantum-classical process (a classical stochastic process conditioned on quantum states) while in the second case it is a purely classical process.
 
  • #43
A. Neumaier said:
I don't understand. One knows which information came in (the measurement of a particular spin direction) and one updates the model (not the probability!) according to that information. Probability doesn't even enter!

Probability enters in that a measurement of one component of an entangled system updates the probabilities associated with the other component.
 
  • #44
stevendaryl said:
Probability enters in that a measurement of one component of an entangled system updates the probabilities associated with the other component.
No. Measurement of one component of an entangled system updates the state of the whole system. That's the only thing going on. As a consequence, all predictions about the whole system change, of course. Probability plays no active role in this process.
stevendaryl said:
But the fact that in quantum mechanics, the probability only exists after the observable is chosen makes it difficult (for me) to view collapse as simply updating based on new information.
Probabilities refer to predictions about relative frequencies of certain events in case that they are observed. Thus they are purely theoretical entities which always exist. The probability of casting 1 with a particular die is 1/6 even if the experimenter cannot cast this particular die.
 
  • Like
Likes vanhees71
  • #45
A. Neumaier said:
This is no different from the classical situation, where we idealize our die-casting system to a stochastic process, and are left with no way to explain how a single die would behave.
I agree, collapse in experiments on classical systems works exactly the same as in the quantum case, and the epistemology of how we use probabilities is also exactly the same. So there is no epistemological problem at all-- epistemologically, "collapse" just means "connecting the predicted behavior of ensembles with the concept of individual outcomes."

The problem only appears when one attempts to build a quantum ontology, because if that ontology says "all is wavefunctions that evolve unitarily", then one cannot understand how a measurement occurs that is capable of obtaining a single outcome without taking the measuring device out of the system that is evolving unitarily.

So I agree that collapse is not a problem, but I don't agree that the reason it's not a problem is that systems are really open. I see it as not a problem because all formal physical theories describe the ontologies of closed systems, which we then use epistemologically by marrying them to how we process information. Thus, we should never be surprised when our formal structures fail to provide a complete ontology, because we always open systems to look at them. QT is merely the place where we realized this, something we should have known all along. So I have no problem with collapse-- I regard it as an epistemological device stemming from how we cull and correlate information.
A. Neumaier said:
Idealizations always introduce uncertainty, and if this uncertainty is big because there is sensitive dependence on unmodeled degrees of freedom then only probabilities (i.e., relative frequencies for a large sample) can be predicted.
I completely agree-- it's all about the degrees of freedom we choose to model. We create collapse; it is part of how we do science. All we should expect the equations of physics to do for us is to give us a diagonal density matrix. The rest comes from us. The Copenhagen view is that there is no quantum ontology, MWI says there is no classical ontology, Bohm says there is both a classical and a quantum ontology and they are both just the same, but I say all ontology is really epistemology in a convincing disguise.
 
  • #46
Ken G said:
The Copenhagen view is that there is no quantum ontology, MWI says there is no classical ontology, Bohm says there is both a classical and a quantum ontology and they are both just the same, but I say all ontology is really epistemology in a convincing disguise.
Whereas I assert an ontology that smoothly combines deterministic and stochastic, classical and quantum aspects without needing variables beyond orthodox quantum mechanics. This ontology is given by my thermal interpretation. The thermal interpretation simply spells out the hidden secret of all shut-up-and-calculate successes. It is consistent on every level and has all the properties one can reasonably ask for.

If the predicted uncertainty of the value of an observable given by quantum mechanics is small, it is a reliable prediction for a single system. The larger the uncertainty is the more independent repetitions one needs to reduce the uncertainty of the statistics to a desired level of accuracy, according to the ##N^{-1/2}## rule for the law of large numbers.
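As a concrete illustration of that ##N^{-1/2}## rule, here is a minimal simulation sketch (assuming Python with numpy): the standard error of the mean of N simulated ##\pm 1## spin outcomes is halved each time N is quadrupled.

```python
import numpy as np

rng = np.random.default_rng(0)
trials = 2000                        # independent repetitions of the whole run
for N in (100, 400, 1600):
    # Each row is one run of N simulated +/-1 measurement outcomes.
    outcomes = rng.choice([-1.0, 1.0], size=(trials, N))
    means = outcomes.mean(axis=1)
    print(N, means.std())            # ~ 0.100, 0.050, 0.025, i.e. N**-0.5
```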

There are no interpretation problems with experiments where the position outcome depends on how an electron spin turns out, since the predicted uncertainty is then large. Neither is there an interpretation problem for macroscopic observables, since under the usual classically predictable circumstances their quantum uncertainty is tiny.

Thus I am very satisfied with this interpretation. It gives me the feeling that I really understand quantum mechanics.
 
  • #47
Ken G said:
All we should expect the equations of physics to do for us is to give us a diagonal density matrix.
They do so only under very special circumstances (quantum measurement). More usually, the density operator remains non-diagonal in any reasonable basis.

But whether or not it is diagonal, the diagonal entries encode probabilities of interest, and the expectations computed from the full density operator encode approximate values of measurable observables.
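A minimal sketch (assuming Python with numpy) of this point for a single qubit: the diagonal entries of a non-diagonal ##\hat{\rho}## give the outcome probabilities in the chosen basis, while expectations ##\mathrm{Tr}(\hat{\rho} A)## also draw on the off-diagonal entries.

```python
import numpy as np

# A non-diagonal qubit state: mostly |+><+| with some maximally mixed admixture.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.8 * np.outer(plus, plus) + 0.2 * np.eye(2) / 2  # [[0.5, 0.4], [0.4, 0.5]]

sigma_z = np.diag([1.0, -1.0])
sigma_x = np.array([[0.0, 1.0], [1.0, 0.0]])

print(np.diag(rho))             # [0.5 0.5]: probabilities for sigma_z = +/-1
print(np.trace(rho @ sigma_z))  # 0.0: <sigma_z>, from the diagonal alone
print(np.trace(rho @ sigma_x))  # 0.8: <sigma_x>, carried by the off-diagonals
```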
 
  • #48
stevendaryl said:
But that's a weird perspective. When it comes to the two-particle composite system, Bob and Alice know everything there is to know about this system. It's described by a pure state, which is, for quantum mechanics, the maximum amount of information you can have about a system. To say that Bob's mixed state reflects his ignorance about his particle means that he knows less about a part of a system than he knows about the whole system.
That's just one more example of the fact that a completely determined state doesn't imply that all observables are determined. In this case the single-particle spins are even maximally uncertain (in the Shannon sense of information theory). Indeed, you know everything you can know about the total spin, namely that it's 0, but you lack as much information about the single-particle spins as is possible. That's QT in its purest form :-).
stevendaryl said:
Actually, I read a paper once that described entanglement in exactly these terms. For a classical composite system, the entropy of the complete system has to be greater than the entropy of any of the components. But for quantum mechanics, this isn't always the case. For a two-particle entangled system, the entropy of the composite system can be zero, because you know exactly what the state is. But the entropy of the components can be nonzero.
Yes, here ##S=-\mathrm{Tr}(\hat{\rho} \ln \hat{\rho})=-\mathrm{Tr}(|\psi \rangle \langle \psi| \ln |\psi \rangle \langle \psi|)=0## (as for any pure state). The knowledge is maximal concerning the total spin of the two-spin system. For Bob's particle it's ##S_B=-\mathrm{Tr}(\hat{\rho}_B \ln \hat{\rho}_B)=\ln 2##.
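Spelled out: tracing the singlet ##|\psi\rangle = \frac{1}{\sqrt{2}}(|{\uparrow\downarrow}\rangle - |{\downarrow\uparrow}\rangle)## over Alice's spin gives
$$\hat{\rho}_B = \mathrm{Tr}_A\,|\psi\rangle\langle\psi| = \frac{1}{2}\left(|{\uparrow}\rangle\langle{\uparrow}| + |{\downarrow}\rangle\langle{\downarrow}|\right) = \frac{\hat{1}}{2},$$
the maximally mixed single-particle state, whose two eigenvalues ##\tfrac{1}{2}## yield ##S_B = -2 \cdot \tfrac{1}{2} \ln\tfrac{1}{2} = \ln 2##.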
 
  • #49
A. Neumaier said:
This is a severe problem for the knowledge interpretation of quantum mechanics, and only for that interpretation. This interpretation claims that a pure state gives the maximum knowledge one can have about a system, while a mixed state represents incomplete knowledge.

But this view is self-contradictory, as your tracing-out example shows. Knowing everything about the whole system would imply knowing very little about the subsystem, while if you know everything about a subsystem but nothing about the remainder of the system, this cannot even be described in this model of knowledge.

Thus I reject the whole basis of your reasoning as it is conceptually unsound.

A sensible interpretation of quantum mechanics must not only assign meaning to the whole system but to all subsystems. Indeed, of complex systems we usually know a lot about subsystems but not so much about the whole system. My interpretation of a density operator satisfies this requirement (and is completely different from your conceptually inconsistent view).
The point is that quantum theory tells you that even if you have maximal possible knowledge about a system, you don't know the values of all possible observables. That's all the example shows.
 
  • #50
https://arxiv.org/abs/1405.3483
Steven Weinberg
Quantum Mechanics Without State Vectors
In this paper, SW proposes a formulation of QM based solely on density matrices.
Does this solve the problem? How is it similar or different to the AN formulation?
TIA Jim Graber
 
  • #51
stevendaryl said:
Probability enters in that a measurement of one component of an entangled system updates the probabilities associated with the other component.
Again, I have to ask: are you suggesting that probability is a dynamical variable in a physical process?

What you are describing as collapse is a change in the Hamiltonian. There is no physical wave function. It is a way of calculating probabilities that honours the physical symmetries and contains no dynamical information.
 
  • Like
Likes vanhees71
  • #52
Mentz114 said:
Again, I have to ask: are you suggesting that probability is a dynamical variable in a physical process?

What you are describing as collapse is a change in the Hamiltonian. There is no physical wave function. It is a way of calculating probabilities that honours the physical symmetries and contains no dynamical information.

I don't know what you're calling a change in the Hamiltonian. What Hamiltonian are you talking about? In an EPR-type experiment, I can imagine a number of Hamiltonians that might be relevant, but I don't see that any of them quite fit what you said above:
  1. The Hamiltonian describing the process for creating the twin pair.
  2. The Hamiltonian governing the pair as they travel from the source to the detector. (Usually, this is treated as free-particle propagation.)
  3. The Hamiltonian governing the interaction between the particles and the detectors.
 
  • #53
The way it seems to me is that you have two possibilities:
  1. Either a measurement reveals some pre-existing property of the system being measured, or
  2. The property doesn't exist before the measurement act, and the act of measurement causes the property to have a value. (This is the claim that microscopic systems don't have properties until they are measured.)
(I guess to be complete, I should include the Many-Worlds possibility, which is that systems can simultaneously have different values in different "possible worlds", and a measurement simply determines which branch you (or the measurement device) is in.)

Option #1 seems incompatible with Bell's theorem, and option #2 seems incompatible with locality, because Alice can remotely measure a property of Bob's particle. That's no problem, if measurement is just revealing a pre-existing property (#1), but seems like a nonlocal interaction if the measurement changes the system being measured (from an indefinite value to a definite value).
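To make the tension with option #1 concrete, here is a minimal sketch (assuming Python with numpy) of the CHSH form of Bell's theorem: the singlet correlation ##E(a,b) = -\cos(a-b)## gives a CHSH value of ##2\sqrt{2}##, while any model in which the measured values pre-exist (a local hidden-variable model) is bounded by 2.

```python
import numpy as np

def E(a, b):
    """Quantum prediction for the singlet spin correlation at analyzer angles a, b."""
    return -np.cos(a - b)

# Standard CHSH angle choices (radians) for Alice (a, a2) and Bob (b, b2).
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # 2.828... = 2*sqrt(2) > 2, the bound for pre-existing values
```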
 
  • #54
stevendaryl said:
The way it seems to me is that you have two possibilities:
  1. Either a measurement reveals some pre-existing property of the system being measured, or
  2. The property doesn't exist before the measurement act, and the act of measurement causes the property to have a value. (This is the claim that microscopic systems don't have properties until they are measured.)
(I guess to be complete, I should include the Many-Worlds possibility, which is that systems can simultaneously have different values in different "possible worlds", and a measurement simply determines which branch you (or the measurement device) is in.)

Option #1 seems incompatible with Bell's theorem, and option #2 seems incompatible with locality, because Alice can remotely measure a property of Bob's particle. That's no problem, if measurement is just revealing a pre-existing property (#1), but seems like a nonlocal interaction if the measurement changes the system being measured (from an indefinite value to a definite value).
I don't understand how any of this is relevant to my question - 'are you suggesting that probability is a dynamical variable in a physical process?'.
You also seem to think all physics is EPR and Bell.
You've lost me. I won't partake further in this discussion because I don't understand what you are saying. You are making too many wrong assumptions to make sense to me. :frown:
 
  • #55
stevendaryl said:
The way it seems to me is that you have two possibilities:
  1. Either a measurement reveals some pre-existing property of the system being measured, or
  2. The property doesn't exist before the measurement act, and the act of measurement causes the property to have a value. (This is the claim that microscopic systems don't have properties until they are measured.)
(I guess to be complete, I should include the Many-Worlds possibility, which is that systems can simultaneously have different values in different "possible worlds", and a measurement simply determines which branch you (or the measurement device) is in.)

Option #1 seems incompatible with Bell's theorem, and option #2 seems incompatible with locality, because Alice can remotely measure a property of Bob's particle. That's no problem, if measurement is just revealing a pre-existing property (#1), but seems like a nonlocal interaction if the measurement changes the system being measured (from an indefinite value to a definite value).
Isn't there a third alternative?
3. There is a pre-existing property of the system being measured that is altered by the act of measurement.
 
  • #56
jimgraber said:
https://arxiv.org/abs/1405.3483
Steven Weinberg
Quantum Mechanics Without State Vectors
In this paper, SW proposes a formulation of QM based solely on density matrices.
Does this solve the problem? How is it similar or different to the AN formulation?
See https://www.physicsforums.com/posts/5419800 and the subsequent discussion. The most interesting aspect is that in the ##C^*##-algebra setting for interacting quantum fields (featuring factors of type ##III_1##), pure states do not even exist! This is quite unlike the situation in quantum mechanics of finitely many degrees of freedom and for free quantum fields.
 
  • #57
stevendaryl said:
The way it seems to me is that you have two possibilities:
  1. Either a measurement reveals some pre-existing property of the system being measured, or
  2. The property doesn't exist before the measurement act, and the act of measurement causes the property to have a value. (This is the claim that microscopic systems don't have properties until they are measured.)
These might be the only possibilities if the system were isolated - but then it would be unmeasurable. In the real world, where systems are open, there is a third, and actually realized, possibility:

3. A measurement reveals some pre-existing property of the universe, but due to the approximations made in delineating a specific piece of the universe as the ''system'', the revealed property (a macroscopic pointer reading) can only be very imperfectly related to a property of the single system, resulting in a merely stochastic description.

If one sees how the approximations come about and the mathematics behind the approximation process (rather than only the idealized end result), this is indeed the way it happens both in classical and in quantum mechanics.
 
  • Like
Likes Mentz114
  • #58
A. Neumaier said:
They do so only under very special circumstances (quantum measurement). More usually, the density operator remains non-diagonal in any reasonable basis.
I wouldn't call it "very special circumstances" when those are the only circumstances we ever test! Everything else is demonstrably just a stepping stone to the laws of physics giving us something we can test, so that's what I mean when I say "all we can expect those laws to give us."
 
  • #59
A. Neumaier said:
Whereas I assert an ontology that smoothly combines deterministic and stochastic, classical and quantum aspects without needing variables beyond orthodox quantum mechanics. This ontology is given by my thermal interpretation.
But to me, it doesn't sound like an ontology at all-- it sounds like an epistemology only! It does sound like exactly the epistemology we actually use, so it's very much what I'm talking about-- it is not a law of physics in the conventional sense, because it does not describe an ontology, it describes what we will get if we analyze information in a given way, which is just the way we do it.
Thus I am very satisfied with this interpretation. It gives me the feeling that I really understand quantum mechanics.
I would say you understand how to use quantum mechanics to get it to do for you what you want it to do for you, which is to approximately predict observations. Whether you attribute the inherent uncertainty to the observation or to the system doesn't really matter, you are asserting a fundamental disconnect between the two that we could never test or pinpoint. So it sounds to me like your comfort with it comes from not attempting to create an ontology at all, it's ducking that need-- and I'm saying that's exactly the way to get comfortable with any theory. Ontologies always create discomfort unless one doesn't dig into them too deeply. But if you want to regard your epistemological formulation as an ontology instead, it seems to me it needs to address this question: why are the observations inherently approximate?
Indeed, I see that you have already answered that just above:
A. Neumaier said:
A measurement reveals some pre-existing property of the universe, but due to the approximations made in delineating a specific piece of the universe as the ''system'', the revealed property (a macroscopic pointer reading) can only be very imperfectly related to a property of the single system, resulting in a merely stochastic description.
I would claim that the epistemological foundations of that statement are clear, one merely cuts out the first phrase before the comma and the other parts that have no direct connection to what is actually being done by the physicist. I agree with the rest-- we choose how to correlate and bin the information at our disposal, and the way we do that generates concepts like "systems" and "properties", none of which need exist anywhere but in our heads. It is what we are doing with the information that creates the collapse, we can use the formalism to understand the generation of a diagonal density matrix in a straightforward way, and that's all it is needed for.
 
Last edited:
  • #60
Neumaier: does this old post of yours describe an aspect of your thermal interpretation, a consequence of it, or is it an addition?

A. Neumaier said:
To be able to discuss why I find the assumptions of Bell far too strong, let me distinguish two kinds of causality: extended causality and separable causality. Both kinds of causality are manifestly local Lorentz invariant and imply a signal speed bounded by the speed of light. Here a signal is defined as a dependence of measured results at one spacetime point caused by a preparation at another spacetime point.

Separable causality is what is assumed in Bell-type theorems, and is thereby excluded by the standard experiments (assuming that all other conditions used in the derivation of such theorems hold in Nature). On the other hand, extended causality is far less demanding, and therefore is not excluded by the standard arguments.

To define these two kinds of causality I use the following terminology. A point object has, at any given time in any observer's frame, properties only at a single point, namely the point in the intersection of its world line and the spacelike hyperplane orthogonal to the observer's 4-momentum at the time (in the observer's frame) under discussion. An extended object has properties that, in some observer frames at some times, depend on more than one space-time position. A joint property is a property that explicitly depends on more than one space-time location within the space-time region swept out by the extended object in the course of time.

Both kinds of causality agree on the causality properties of point objects (''point causality'') but differ on the causality properties of extended objects. Extended causality takes into account what was known almost from the outset of modern quantum mechanics - that quantum objects are intrinsically extended and must be treated as a whole. This is explicitly expressed in Bohr's writing (N. Bohr, On the notions of causality and complementarity, Dialectica 2 (1948), 312. Reprinted in Science, New Ser. 111 (1950), 51-54.):
(Thanks to Danu for locating this quote!)

Here are the definitions:
  • Point causality: Properties of a point object depend only on its closed past cones, and can influence only its closed future cones.
  • Extended causality: Joint properties of an extended object depend only on the union of the closed past cones of their constituent parts, and can influence only the union of the closed future cones of their constituent parts.
  • Separable causality: Joint properties of an extended object consist of the combination of properties of their constituent points.
I believe that only extended causality is realized in Nature. It can probably be derived from relativistic quantum field theory. If this is true, there is nothing acausal in Nature. In any case, causality in this weaker, much more natural form is not ruled out by current experiments.

Thanks.
 
