Does QM ever violate classical probability theory?

  • #1
BWV
reading this http://uk.arxiv.org/abs/1004.2529 about supposed parallels between the mathematical structure of probability in QM and some problems in economics

The question is: are there really any violations of classical probability theory, such as Pr(A) > Pr(A [itex]\cup[/itex] B), in QM? The supposed examples all seem to point to interference effects, which to my thinking are not violations of probability theory, since you could construct a similar situation with classical waves.
 
  • #2
BWV said:
reading ... about supposed parallels between the mathematical structure of probability in QM and some problems in economics
Reading that - haven't you got the impression that it would make one more great example for Sokal and Bricmont?
(http://en.wikipedia.org/wiki/Fashionable_Nonsense)

are there really any violations of classical probability theory, such as Pr(A) > Pr(A [itex]\cup[/itex] B), in QM?
I haven't spotted any.
 
  • #3
Thanks

although to be fair the linked article is only attempting to use the mathematics of QM for other applications, rather than postulating some link between the actual physics and economics.

But the whole formal structure of "disjunctions" in the paper does not make much sense to me - the whole phenomenon could just as easily be explained by cognitive errors of individuals responding to polls.
 
  • #4
I would not call it 'attempting to use mathematics... for...'
I would rather call it 'attempting to impress social-science readers with mathematical symbols they do not understand'.
 
  • #5
xts said:
I would not call it 'attempting to use mathematics... for...'
I would rather call it 'attempting to impress social-science readers with mathematical symbols they do not understand'.

true enough, but not as egregious as using curvature tensors to explain arbitrage relationships

http://arxiv.org/abs/hep-th/9710148

We give a brief introduction to the Gauge Theory of Arbitrage. Treating a calculation of Net Present Values (NPV) and currencies exchanges as a parallel transport in some fibre bundle, we give geometrical interpretation of the interest rate, exchange rates and prices of securities as a proper connection components. This allows us to map the theory of capital market onto the theory of quantized gauge field interacted with a money flow field. The gauge transformations of the matter field correspond to a dilatation of security units which effect is eliminated by a gauge transformation of the connection. The curvature tensor for the connection consists of the excess returns to the risk-free interest rate for the local arbitrage operation. Free quantum gauge theory is equivalent to the assumption about the log-normal walks of assets prices. In general case the consideration maps the capital market onto lattice QED.

Have had a lot of fun at the office giving this paper to newly minted MBAs as "required study material"
 
  • #6
BWV said:
The question is: are there really any violations of classical probability theory, such as Pr(A) > Pr(A [itex]\cup[/itex] B), in QM?
I haven't really thought about it, but the probability measures in QM are (generalized) probability measures on a non-distributive lattice, not probability measures on σ-algebras. You can probably derive some "violations" from the non-distributivity, but I'm not going to think about that tonight.

I think your ">" should be a "≤". [itex]P(A\cup B)=P(A\cup(B-A))=P(A)+P(B-A)\geq P(A)[/itex]
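A small numerical sketch of how such a "violation" shows up with non-commuting measurements (the state and observables below are my own illustrative choices, not anything from the linked paper): the two-path "total probability" sum over an intermediate measurement of A differs from the direct probability of B by an interference term.

```python
import numpy as np

# Qubit state |psi> = cos(t)|0> + sin(t)|1>, with t = pi/8 (illustrative choice)
t = np.pi / 8
psi = np.array([np.cos(t), np.sin(t)])

ket0 = np.array([1.0, 0.0])               # event A: outcome |0>
ket1 = np.array([0.0, 1.0])               # event not-A: outcome |1>
ketp = np.array([1.0, 1.0]) / np.sqrt(2)  # event B: outcome |+>

# Direct probability of B, no intermediate measurement
p_B_direct = abs(ketp @ psi) ** 2

# "Classical" two-path sum: measure A first, then B
p_A = abs(ket0 @ psi) ** 2
p_notA = abs(ket1 @ psi) ** 2
p_B_given_A = abs(ketp @ ket0) ** 2      # = 1/2
p_B_given_notA = abs(ketp @ ket1) ** 2   # = 1/2
p_B_twopath = p_A * p_B_given_A + p_notA * p_B_given_notA

print(p_B_direct)   # ~0.854: contains the interference cross term
print(p_B_twopath)  # exactly 0.5: the law-of-total-probability value
```

Note that each fixed measurement context still obeys Kolmogorov's axioms; what fails is the classical decomposition over an unmeasured intermediate event.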

xts said:
I would not call it 'attempting to use mathematics... for...'
I would rather call it 'attempting to impress social-science readers with mathematical symbols they do not understand'.
I haven't read the article, but the journal reference says "International Journal of Theoretical Physics", and one of the authors is Diederik Aerts, who I believe is a leading expert on quantum logic.
 
  • #7
Fredrik said:
I haven't read the article ... leading expert on quantum logic.
So try to read it!
I am looking forward to seeing your defense of this babble: please try to use non-adjectival logic.
 
  • #8
[itex]P(.): \mathcal{T} \rightarrow \mathbb{R}[/itex] is a mapping from the field of all events [itex]\mathcal{T}[/itex] to the set of real numbers with the following properties (due to Kolmogorov):
Non-negativity:

[tex]
\left(\forall A \in \mathcal{T}\right) \, P(A) \ge 0
[/tex]

Additivity:
[tex]
A \cap B = \emptyset \Rightarrow P(A \cup B) = P(A) + P(B)
[/tex]

Normalization:
[tex]
P(\Omega) = 1
[/tex]
where [itex]\Omega[/itex] is the certain event.

If you use the set relations:
[tex]
A \cup B = A \cup (B - A), \; A \cap (B - A) = \emptyset
[/tex]
and
[tex]
B = (A \cap B) \cup (B - A), \; (A \cap B) \cap (B - A) = \emptyset
[/tex]
and use the additivity of probability:
[tex]
P(A \cup B) = P(A) + P(B - A)
[/tex]
[tex]
P(B) = P(A \cap B) + P(B - A)
[/tex]
to eliminate [itex]P(B - A)[/itex], we get:
[tex]
P(A \cup B) = P(A) + P(B) - P(A \cap B)
[/tex]

Then, if what you claim is true:

BWV said:
such as Pr(A) > Pr(A [itex]\cup[/itex] B)

it would imply:
[tex]
P(A \cap B) > P(B), \; \forall A, B \in \mathcal{T}
[/tex]
Then, taking any [itex]B \subseteq A[/itex], we would have [itex]A \cap B = B[/itex] and we would arrive at a contradiction:
[tex]
P(B) > P(B) \ \bot
[/tex]
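The derivation above is easy to check by brute force on a small finite sample space (a sketch with an arbitrary, randomly weighted measure):

```python
import itertools
import random

random.seed(0)
omega = list(range(6))
weights = {w: random.random() for w in omega}
total = sum(weights.values())

def P(event):
    # Probability measure: normalized sum of elementary weights
    return sum(weights[w] for w in event) / total

# All 2^6 = 64 events over the sample space
subsets = [frozenset(s) for r in range(len(omega) + 1)
           for s in itertools.combinations(omega, r)]

for A in subsets:
    for B in subsets:
        # Inclusion-exclusion: P(A u B) = P(A) + P(B) - P(A n B)
        assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-9
        # Monotonicity: P(A u B) >= P(A), so Pr(A) > Pr(A u B) is impossible
        assert P(A | B) >= P(A) - 1e-9

print("checked", len(subsets) ** 2, "pairs of events")
```

So within Kolmogorov's axioms, Pr(A) > Pr(A [itex]\cup[/itex] B) can never occur, for any choice of measure.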
 
  • #9
Fredrik said:
I haven't really thought about it, but the probability measures in QM are (generalized) probability measures on a non-distributive lattice, not probability measures on σ-algebras. You can probably derive some "violations" from the non-distributivity, but I'm not going to think about that tonight.

I think your ">" should be a "≤". [itex]P(A\cup B)=P(A\cup(B-A))=P(A)+P(B-A)\geq P(A)[/itex]


I was posting a violation of classical probability listed in the link. The example given was polling data on categorizations - for example, respondents were asked something like whether x belongs in category a or in categories (a or b). There are common cognitive biases that violate probability laws - the author seems to be saying that you need a "non-classical" statistics to deal with them, which seems like a dubious proposition to me.
 
  • #10
How does the author measure [itex]P(A)[/itex] or [itex]P(A \cup B)[/itex]?
 
  • #11
I didn't read the paper, just skimmed the abstract and first page quickly last night. Just some thoughts on the topic, less specific to the paper.

1) Their conceptual analogy between QM and decision theory is very intuitive and interesting. I myself use decision theory analogies a lot, since for me they are highly natural. We all make decisions every day; one does not need to study it formally to understand it.

2) It's not right to say that they violate classical probability - of course they don't. That's just mathematics. What they do claim is this: the cognitive decisions made by test subjects FAIL to be MODELLED by classical probability (classical logic), and are better modeled by quantum logic. This is not surprising to me.

This is exactly on par with the fact that you can't explain quantum interference in terms of classical Bayesian probability and classical logic. But this we know already.

BWV said:
But the whole formal structure of "disjunctions" in the paper does not make much sense to me - the whole phenomenon could just as easily be explained by cognitive errors of individuals responding to polls.

3) I think the whole point is that whatever you call it, "cognitive errors" or something else, the task is to model it. To predict how a subject responds to input means to understand its decision process. The point is that whatever happens in the decision process is due to nature, so it does not make much sense to call it "errors" if "errors" are an integral part of the decision process.

The analogy with physics is:

Determine how a system (say a piece of matter) RESPONDS/ACTS to a perturbation/input; quantum mechanically we model this as the system "considering all paths etc.", which are then "added" under interference effects. This is very analogous to a decision process!

Determine how a subject responds/acts to input, which of course can be thought of as the subject choosing an action after some cognitive reflection, to make a decision on how to rationally respond to the input.

Here a big difference that leads to a different insight is that decisions do not have the purpose of being what might be thought of as "logically correct"; instead decisions are tactical decisions that are expected to yield maximum return. This is why decisions that turn out "wrong" in retrospect are sometimes still maximally rational. And it is why one should be careful making statements about what in a decision process counts as "errors".

/Fredrik
 
  • #12
Dickfore said:
How does the author measure [itex]P(A)[/itex] or [itex]P(A \cup B)[/itex]?

As I understand it, from trials of test subjects (humans) who get to answer verbal questions, replacing logical operations with "and" and "or". So the measurement = observing what human subjects "decide" to answer on given questions.

So what they try to model is the human decision process. All they found is that, contrary to what we think is "logical", the subjects fail to make decisions as per classical logic. That's their only point. If we call this an "error" or logical fallacy, that's partly right, but that's beside the point. The quest is still to model it, "fallacies" and all. I think the conjecture from cognitive psychology is that there exists a rational explanation for the fallacies, which involves how the decision process in the brain actually works - it apparently does not work with simple classical logic.

/Fredrik
 
  • #13
Fra said:
I didn't read the paper just skimmed the abstract and first page
So maybe you understand why Bell's theorem and the EPR paradox were mentioned in the introduction to the paper as important to it? Look a bit later - what an impressive mathematical formalism!

1) Their conceptual analogy between QM and decision theory is to ve very intuitive and interesting.
If you find it interesting - you should follow the references to check the methodology they used to obtain data. Actually, they didn't do any experiments - they reinterpret 20-year-old data collected by J. Hampton (http://www.staff.city.ac.uk/hampton/PDF files/Hampton Disjunction1988b.pdf) on an impressive sample of 40 students, answering conjunctive questions and disjunctive ones a week later. Hampton somehow forgot to test what the correlation between answers would be if exactly the same questions were asked twice. This whole "theory" is built upon inconsistency of answers given by two persons.
Other data comes from the experiment where two groups of 10 persons each were asked the disjunctive question, while another group of 20 was asked the conjunctive one. A quick exercise for you in "classical" probability theory: what is the probability that 25% or less of 20 people give one answer and 30% or more of 10 people give the same answer, if those are independent samples? That is the "evidence" (25% > 30%) upon which the whole "theory" is built.

The cognitive decisions made by test subjects FAIL to be MODELLED by classical probability (classical logic), and are better modeled by quantum logic. This is not surprising to me.
Again: a 40-person sample with no estimation of expected errors, and a comparison made on statistics of 5/20 vs. 3/10. I would be rather cautious using capitals here.

3) I think the whole point is that whatever you call it, "cognitive errors" or something else, the task is to model it. To predict how a subject responds to input means to understand its decision process.
Great! But - again:
- the data they use does not justify rejection of the simplest theory: "human cognition follows Boolean logic";
- even if cognition does not follow classical logic, there is no relation to EPR, Bell, von Neumann, or Quantum Mechanics.

But "Quantum" makes papers more sexy! Not only papers - Calgonit Quantum - the best dishwasher tabs!
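xts's exercise can be made concrete with a standard pooled two-proportion z-test on the counts quoted above (5 of 20 vs. 3 of 10). This is a sketch of the usual significance check under a normal approximation, not the exact probability he asks for and not a claim about the paper's own analysis:

```python
from math import sqrt, erf

# Counts as quoted in the thread: 5 of 20 vs. 3 of 10
x1, n1 = 5, 20
x2, n2 = 3, 10

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

# Two-sided p-value from the normal approximation
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

print(round(z, 3))        # ~ -0.292: well under one standard deviation
print(round(p_value, 3))  # ~ 0.77: nowhere near significance
```

The difference between 25% and 30% at these sample sizes is well inside one standard deviation, consistent with the complaint above.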
 
Last edited:
  • #14
I'm not particularly interested in, nor impressed by, that paper as such, but I interpreted the OP as questioning what possibly reasonable relation there exists between decision theory and physics, in their logic.

I think the connection is there, but this was known before; that paper doesn't seem to present anything new as far as I can see.

You are possibly right that "it's sexy"; that's possibly a factor :) But that doesn't make the connection wrong.

xts said:
So maybe you understand why Bell's theorem and EPR paradox were mentioned in introduction to the paper as important to it?

I think it was mentioned since, in physics, Bell's theorem and the inequalities more or less serve as the no-go theorem for locally realistic theories, which essentially means theories explained by classical logic.

Not that I care much, but they seem to have some idea about a similar no-go theorem for decision theory, from which you could tell whether it can be explained by classical logic or not.
xts said:
If you find it interesting

Not the paper itself, but the general connection. But I surely don't need that paper to know that. In particular, the connection between decision processes and physical interactions is profound IMO. This you can even tell from most of my posts on here. But that paper isn't a physics paper, so I don't care; I just wanted to stand up for the connection.

/Fredrik
 
  • #15
Fra said:
As I understand it, from trials of test subjects (humans) who get to answer verbal questions, replacing logical operations with "and" and "or". So the measurement = observing what human subjects "decide" to answer on given questions.

xts said:
Actually, they didn't do any experiments - they reinterpret 20-year-old data collected by J. Hampton (http://www.staff.city.ac.uk/hampton/PDF files/Hampton Disjunction1988b.pdf) on an impressive sample of 40 students, answering conjunctive questions and disjunctive ones a week later. Hampton somehow forgot to test what the correlation between answers would be if exactly the same questions were asked twice. This whole "theory" is built upon inconsistency of answers given by two persons.


The reason I was asking this is because, if they made a statistical estimate of the probabilities, then they can only give a confidence interval for the difference [itex]P(A) - P(A \cup B)[/itex]. If this interval happens to contain zero, then any statistical test of the hypothesis of a particular sign for this difference will fail at the given level of confidence. As the sample size (40 students) is pretty small, I would assume the 95% confidence interval (standard in natural sciences) to be pretty wide. I do not intend to perform any calculation for this, however.
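For what it's worth, a quick Wald interval on the 5/20 vs. 3/10 counts quoted earlier in the thread (illustrative stand-ins; the paper's own numbers may differ) shows just how wide such an interval is at these sample sizes:

```python
from math import sqrt

x1, n1 = 5, 20   # counts quoted earlier in the thread (illustrative)
x2, n2 = 3, 10

p1, p2 = x1 / n1, x2 / n2
diff = p1 - p2

# Unpooled standard error for the difference of two proportions
se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)

# Wald 95% interval: diff +/- 1.96 * se
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"{diff:+.2f}, 95% CI ({lo:+.2f}, {hi:+.2f})")  # interval straddles 0
```

The interval is roughly 0.7 wide and comfortably contains zero, so no sign of the difference can be established from such counts.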
 
  • #16
Dickfore said:
I would assume the 95% confidence interval (standard in natural sciences) to be pretty wide. I do not intend to perform any calculation for this, however.
You should! The data used to build the "theory" presented in this paper are not significant even at the one-standard-deviation level.
 
  • #17
Fra said:
I think the connection is there, but this was known before; that paper doesn't seem to present anything new as far as I can see.

You are possibly right that "it's sexy"; that's possibly a factor :) But that doesn't make the connection wrong.
Of course, I can't prove their idea is wrong. I may only say that the justification they provided is wrong.
I may also say that the physical theories they quote are not related by any means to the subject they touch.
Such 'parallels' are no more justified than a claim about the quantum nature of day and night (which is cyclic, but the light emitted by an excited atom is also cyclic, and is described by quantum mechanics).
There are millions of crap ideas you can't prove to be wrong.
What I (and every sceptical person; stick to Occam's rule) expect from a scientific paper is not to present "theories" which (however unfalsifiable) are not justified by any evidence, especially not by the evidence cited by the particular paper.
 
Last edited:
  • #18
xts said:
I may only say that the justification they provided is wrong.

Very possibly! I honestly didn't read it carefully, and I don't even care to.

I have other justifications that are more than good enough for me, and I don't need more. However, my thinking is different: rather than trying to apply the QM formalism to decision theory, I think that DEEPER insights into how decisions work will help us find an intrinsic measurement theory that's helpful for QG and unification.

This is why that paper itself doesn't interest me much. (Not enough to even read it properly.)

/Fredrik
 
  • #19
Fra said:
I honestly didn't read it carefully, and I don't even care to.
Honestly: me too. I just browsed it to point out several absurdities and followed the bib-link to see what data they are based on.

I think that DEEPER insights into how decisions work will help us find an intrinsic measurement theory that's helpful for QG and unification.
Wow! Would you say a few words more on how human decision theory is related to Quantum Gravity?
 
  • #20
In the theory of weak measurements there appear probabilities >1 and <0, so they violate classical probability explicitly.
 
  • #21
xts said:
Wow! Would you say few words more how the human decision theory is related to Quantum Gravity?

I'm not sure if you're being ironic :wink:

There are no one-liners to elaborate this; I have a series of conjectures and connections needed to explain it. But it seems a little bit off topic, and probably also not in line with the forum's conditions, if I were to post a list of conjectures here. When on topic, a lot of my ideas are visible in other discussions here, because then I believe it can be classified as regular, intellectually sound discussion. For me to post a list of conjectures of my pet idea seems inappropriate.

Also, I wouldn't say "IS related", because no one knows what full QG really is, and thus how anything is related to it. So while I am personally convinced, objectively it's just a conjecture of course.

Despite finding this extremely plausible, I've understood that it's hard to convey the point. The results of the reasoning, the reworked theory, should be much easier to understand, of course, because it would be a simple matter of whether it gives correct predictions and postdictions or not, but it's work in progress.

/Fredrik
 
  • #22
Fra said:
(decision theory vs Quantum Gravity) I'm not sure if you're being ironic :wink:
Not at all!
Alain Sokal has been ironic enough already. I am sure that after publication of his "Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity" http://physics.nyu.edu/sokal/transgress_v2/transgress_v2_singlefile.html, anyone (especially you) claiming today any parallels between social sciences and Quantum Gravity must have really strong and convincing argumentation to touch again that most-ridiculed parallelism used by pseudo-science.
I am really curious, what this argumentation is!
 
  • #23
This is no ambition of a complete definition of the ideas but at least a few more words:

The "decisions" I envision are made by "observers", and result in specific ways for the observers to place bets in a game, which is simply the interaction with other observers. The "placing of bets" is associated with computing the action.

observer ~ matter system

The soup of interacting observers causes an evolution in the population - this is how I picture the path from some big bang event until today. I do not entertain any multiverse ideas; there is only one universe, but many observers.

Now the task is: To try to predict the population of observers at the equilibrium points. And to predict the exact actions of these observers.

My conjecture is that ALL actions are, from the intrinsic perspective, nothing but random walks. Each observer, however, "sees" a different state space in which the random walk takes place. This is why a random walker, seen by an external observer, is interpreted as being subject to various forces.

My picture also contains a reconstruction of QM. I start from a pure inference picture, phrased in terms of combinatorics, and quantum logic will be shown to be simply forced by evolution. Quantum logic is necessary for survival. And it can be understood in terms of how information is encoded in the observer, as different structures related in such a way that transformations between them are lossy. There are analogies here to how the human brain seems to work. When you look at this in the interacting picture, it does explain the deviation from classical logic. (Don't ask for proof at this point; I'm just describing my reasoning here.)

In this picture there exists a universal attraction that I associate with the precursor of gravity. The idea is that any two observers that interact with each other will learn about each other, and their information divergence will decrease; this attraction depends also on their relative complexity. Relatively speaking, a low-complexity observer is more attracted to a large observer than the other way around. This is all due to inertia in the decision process.

Now the unification in this picture follows from scaling the observers. The complexity of the interactions necessarily drops to zero as the mass of the observers does. The other interactions are then branched off as complexity is increased. Moreover, this complexity scaling is a physical process, which is responsible for mass generation.

It's this process, plus evolutionary selection, that should provide predictive power and unification.

My work will take place in stages:

1. Characterize the mathematics of the decision process (this is mostly combinatorics combined with lossy transformations), starting from low complexity.

2. Characterize such interacting systems, starting from low complexity.

3. See how these systems interact and how the solution to problem 1 scales with the mass range.

4. Find the limits that recover known physics. For example, QM and QFT correspond to an infinitely massive observer observing smaller systems.

The problem stages 1-3 are coupled! I.e., I need to solve them all together. The origin of mass and the branching of the interactions are different aspects of the same process.

/Fredrik
 
  • #24
Just a few comments:
1. I can't judge if your view is valid or not as a model of decision strategies; frankly, I don't fully understand the idea, but I am far from judging it as wrong.
2. If you want to test it - make a computer (Monte Carlo) simulation of it and see if the results obtained are anyhow related to common-sense experience.
3. Your model is not related by any means to QM - it does not utilise any QM basic ideas (especially the probability of an experimental outcome being equal to the squared modulus of a deterministically evolving wavefunction).
4. The parallel to QM may be only metaphorical and is not valid in any formal sense (i.e. the QM mathematical formalism is not applicable to it by any means, as above). My advice is not to use it, if you don't wish to be ridiculed the same way Sokal & Bricmont ridiculed Lacan and Derrida.
5. Try to say the same as you just said, but without using QM or GR terms like 'state', 'state space', 'observer', 'gravity', 'coupling', etc. You gave those names to some items of your model, but then you overinterpret them, assigning to them meanings related to the QM use of the term, not related by any means to your model.
Then you may see either the weaknesses of your idea, or a way to transform it into a "classical" model - one which may be really modeled, not only metaphorically narrated.

No one in QM uses "quantum logic". All of QM is based on mainstream mathematics and mainstream logic. "Quantum logic" was an idea proposed by von Neumann, quickly abandoned by him as impractical. Some people try to exploit it further, but it has nothing to do with quantum mechanics and is rather an intellectual game on the peripheries of logic/mathematics.
The major argument of "quantum logic" - the opposition of (q = "the particle is in the interval [-1,1]" vs. r = "the particle is not in the interval [-1,1]") is not a tertium non datur, but a fallacy: r is later interpreted (and a paradox made out of it) as "the particle is outside the interval [-1,1]", which would be correct only if we assume that the position of the particle is well defined and meaningful even when it is not measured. The opposition is of the same kind as "p = you cheat your wife on weekends" and "~p = r = you cheat your wife on weekdays".
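As an aside on the lattice point: the non-distributivity Fredrik mentioned earlier is easy to exhibit numerically. A toy example of my own with one-dimensional qubit subspaces (meet = intersection of subspaces, join = closed span of the union), not anything from the paper:

```python
import numpy as np

def proj(v):
    """Orthogonal projector onto the span of vector v."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

def join_proj(P, Q):
    """Projector onto the span of the union of the two ranges (lattice join)."""
    U, s, _ = np.linalg.svd(np.hstack([P, Q]))
    r = int(np.sum(s > 1e-10))
    basis = U[:, :r]
    return basis @ basis.T

def meet_proj(P, Q):
    """Projector onto the intersection of the two ranges (lattice meet)."""
    # A vector lies in both ranges iff it is annihilated by (I-P) + (I-Q)
    M = (np.eye(2) - P) + (np.eye(2) - Q)
    w, V = np.linalg.eigh(M)
    basis = V[:, np.abs(w) < 1e-10]
    return basis @ basis.T

A = proj([1, 1])   # span of |+>
B = proj([1, 0])   # span of |0>
C = proj([0, 1])   # span of |1>

lhs = meet_proj(A, join_proj(B, C))                 # A ^ (B v C) = A
rhs = join_proj(meet_proj(A, B), meet_proj(A, C))   # (A ^ B) v (A ^ C) = {0}

print(np.linalg.matrix_rank(lhs))  # 1: the whole one-dimensional subspace A
print(np.linalg.matrix_rank(rhs))  # 0: the zero subspace
```

Here A ∧ (B ∨ C) = A while (A ∧ B) ∨ (A ∧ C) = {0}, so the distributive law fails; whatever one thinks of "quantum logic" as a program, this is why the probability measures involved do not live on a Boolean σ-algebra.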
 
  • #25
You are jumping too far in your assessments. Remember, you asked just for a "few more words". The conclusions you drew aren't possible from the short outline I gave. It was just meant to explain my reasoning and my intuition, nothing else.
xts said:
3. Your model is not related by any means to QM - it does not utilise any QM basic ideas (especially the probability of an experimental outcome being equal to the squared modulus of a deterministically evolving wavefunction).
Of course it relates to QM, but I'm reconstructing measurement theory, so it's different, and certainly the QM framework as it stands is completely incapable of addressing these issues - but that's the whole point, of course.

QM is, however, something that is explained as a "correspondence limit". But again, the whole justification for why a reconstruction of QM is necessary is the rest of it - unification and QG.

QM and GR are correspondence limits and serve as early reality checks. So reproducing QM and GR in the appropriate limits are just reality checks. But that's not the point; the point is whether I will succeed at the unification.

In my picture, the state space (the analog of Hilbert space) is defined by the information structures that encode the observer. I construct this space from combinatorics and from lossy data compressions. A Fourier transform with the phase information lost, so that you get a power spectrum, is just a simple example that often relates q and p.

From this state space one can define, at each state, a "differential space", which is defined by the possible ways the original information could be wrong, but which respects information capacity. The way this space is constructed implies that it has a built-in arrow of time. And the time evolution is like a random walk on an evolving map. But during the walk the map is distorted, so it's generally only the differential evolution that is unitary. The finite evolution contains feedback that causes the structure on which the random walk takes place to evolve.

This does not contradict QM, since unitarity is maintained as long as we consider equivalent external observers, such as we do when computing an S-matrix, for example.

What I'm talking about is like a measurement theory for open systems, but described from the inside.

Probabilities are defined by counting permutations of microstates. However, there is no Born rule, simply because I don't need it in the basic construction. Just for the sake of correspondence, though, I expect to be able to explain it in the appropriate limit.

Edit: I realize this is clearly drifting off topic, so I won't entertain this discussion more in this thread. Given time, I hope to be able to publish some papers. Until then, a few words is all there will be, I'm afraid. There are several reasons for that; it's not just about forum rules :)

/Fredrik
 
Last edited:
  • #26
Unsubscribed.
 

