Does observation really mean interaction?

In summary, the conversation discusses the concept of observation and its relationship to interaction in quantum mechanics. It explores various interpretations, including the idea that any physical interaction can be considered an observation, and the different perspectives on the role of consciousness in the collapse of a wavefunction. It also delves into the concept of entanglement and how it relates to the measurement of quantum states. Ultimately, the conversation highlights the importance of distinguishing between physical interactions and conscious observations in understanding quantum mechanics.
  • #1
kaos
Does observation really mean interaction??
For example, if something didn't interact with anything else (for example, proton A is far away from everything else), it would be in a superposition, but it would collapse into a definite state (decoherence) when it interacts (say proton A bumps into electron B).

What I am asking is: is any physical interaction actually an observation, so that no conscious observer is really required?
 
  • #2
Interesting question. I don't think there is a definitive answer to that yet.
My understanding is that when your particle interacts with a second particle, what you get is some degree of entanglement between both particles, but you still have a superposition. Eventually, this superposition will get entangled with objects far away, including yourself as an observer.
I think many people would consider the system to be in a state of superposition until the chain of entanglements reaches you. Others would deny that.
I don't know what the position of the "environment induced decoherence" school (Zurek et al) is on this.
I have also heard people distinguishing between the "many worlds" interpretation as separate from the "many minds" interpretation, but I fail to see the difference.
Another variant that I haven't had a chance to explore is the "consistent histories" or "decoherent histories" interpretation.
Maybe one of the more knowledgeable members of the forum can enlighten us on this.
 
  • #3
take this example:

Let's take 2 particles (call them p1 and p2) and an apparatus (A), all of them in a known initial state at t0 (suppose also, to keep it simple, that this state is an eigenstate of the "free hamiltonian" of the 2 particles and the apparatus, i.e. a stationary state):

|psi(0)> = |p1_0>|p2_0>|A_0>  =>  |psi(t)> = K |p1_0>|p2_0>|A_0>

Where K is a phase factor (from the unitary time evolution).

Suppose, now, that p1 and p2 interact between t0 and t1: you have a known interaction that acts only on the subspace of particles p1 and p2:
|psi(0)> ---> |psi(t)> = |p1_p2(t)>|A_0>

where |p1_p2(t)> is the entangled state of the 2 particles p1 and p2,
i.e. |p1_p2(t2)> = sum_n c_n |p1_n>|p2_n> (where we use the |p2_n> basis).

Now, we have a system of 2 entangled particles and a non entangled measurement apparatus.

We may measure the state of particle 2 without any interaction with particle 1 (at t2 > t1):

You have an interaction between p2 and A that can be described by this unitary evolution (when apparatus initial state is |A_0>):

U_T = sum_n |p2_n><p2_n| (x) |A_n><A_0|
(note that U_T acts as the identity on the particle 1 space)

(just before the measurement) |p2_state>|A_0> ---> sum_n c_n |p2_n>|A_n> (just after the measurement).

Where c_n = <p2_n|p2_state>

Where |A_n> is the pointer of the measurement apparatus and |p2_n> is the associated possible particle p2 states (i.e. one particle p2 state for one measurement pointer state).

Thus, when this measurement is applied to the state:
|psi(t2)> = |p1_p2(t2)>|A_0>

We have to detail the entangled state |p1_p2(t2)> = sum_n c_n |p1_n>|p2_n>
(we use the |p2_n> basis).

---> |psi_after_measure> = sum_n c_n |p1_n>|p2_n>|A_n>

Thus, to answer your question: if we know the result of the measurement (say |A_n>), we know the state of particle 2, and thus the state of particle 1, without any interaction between the measurement apparatus and particle 1. I.e. we can know the state of particle 1 without measuring it [directly]. (However, we need an interaction somewhere, at least to allow a state change of the apparatus, as in any physical theory.)
This is what we can simply call conditional probability (and we use it in experiments like EPR).
However, you are free to interpret the measurement result (the outcome) as the observation by the experimenter of the state of the apparatus, and to call it consciousness or whatever else (MWI, DBM, etc.): these are the many different interpretations of QM, and they have no impact on the statistical results given by QM theory.
In a formal measurement result, we just make a conditional statement: knowing an outcome state [of the measurement apparatus], we deduce the state of particle 2, and thus of particle 1, with 100% confidence. As with probabilities, QM does not describe the outcomes, just the statistics of outcomes. Mixing them usually leads to paradoxes and confusion.
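As a minimal numerical sketch of this conditional inference (the amplitudes c_n below are illustrative, not from any particular experiment): the Born rule governs the statistics of the apparatus outcomes, while within each trial the inference about particle 1 is deterministic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-term entangled pair: c_0 |p1_0>|p2_0> + c_1 |p1_1>|p2_1>
c = np.array([0.6, 0.8])              # normalised: |c_0|^2 + |c_1|^2 = 1

# The apparatus outcome |A_n> occurs with Born probability |c_n|^2 ...
probs = np.abs(c) ** 2
outcomes = rng.choice(len(c), size=100_000, p=probs)

# ... and conditioned on outcome n, particle 1 is in |p1_n> with certainty:
inferred_p1 = outcomes                # perfect correlation, no second draw

freqs = np.bincount(outcomes) / len(outcomes)
print(freqs)                          # close to [0.36, 0.64]
assert np.all(inferred_p1 == outcomes)
```

The empirical frequencies reproduce |c_n|^2, which is exactly the "conditional probability" reading above: QM fixes the statistics, while each single trial simply delivers one outcome.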

Seratend.
 
  • #4
Hmm, is a computer that is recording the result of the experiment considered an observer? For example, what if a QM experiment such as the double-slit experiment were done without an observer but recorded by a computer? Would the result of wavefunction collapse be the same?
 
  • #5
kaos said:
Hmm, is a computer that is recording the result of the experiment considered an observer? For example, what if a QM experiment such as the double-slit experiment were done without an observer but recorded by a computer? Would the result of wavefunction collapse be the same?

Here I am again with my pet interpretation :-)
You can consider all OTHER "observers" as just physical processes which entangle the "observer" with the state (so you apply linear quantum theory, and yes, Alice becomes in an entangled state). It is only when YOU PERSONALLY observe the result that you apply Born's rule (or, if you want to, the projection postulate). A remote apparatus or person cannot "collapse" the wavefunction for you: you have to do that yourself. The collapse is in fact not really necessary, but a way to simplify the treatment.

So let us consider that you start with an entangled state |a+>|b-> + |a->|b+>.

Imagine that Alice observes a and Bob observes b. After they've done their thing, the state will be:

|A+>|B->|a+>|b-> + |A->|B+>|a->|b+>

So yes, these people, their printed notes and everything remain in an entangled state (there are, if you want, two Alices and two Bobs now).

Alice comes to you, and tells you her result, and she tells you she saw "+". You could now entangle yourself with Alice:
|You+x>|A+>|B->|a+>|b-> + |You-x>|A->|B+>|a->|b+>
but because it is YOU who observed it, you can apply Born's rule (50% chance Alice tells you + and 50% chance Alice tells you -); given that in this particular instance she TOLD you +, it means that your conscious observation is now in the |You+x> branch, and whatever you observe later will be correlated with the result "and Alice told you +".
So you can just as well forget about the branch |You-x> because you never consciously observed that and never will. So you can "project out" the state and obtain:
|A+>|B->|a+>|b->
If Bob comes now to you and tells you his result, you know he's in the B- state, so there's no surprise anymore, but you "collapsed" him at a distance.
But you can also continue to work with the full state:
|You+x>|A+>|B->|a+>|b-> + |You-x>|A->|B+>|a->|b+>
and when Bob tells you his result, this becomes:
|You+->|A+>|B->|a+>|b-> + |You-+>|A->|B+>|a->|b+>
So if you now use Born's rule there are 2 possible states of your mind:
Alice told you + and Bob tells you -, or Alice told you - and Bob tells you +.
But since you remember that Alice told you +, you can only consider the first case. In this case, you see that there's no need for collapse, the world just continues to evolve in linear superposition, and your consciousness chooses branches according to the successive observations you make.
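A toy numpy simulation of this bookkeeping (it only reproduces the statistics; it does not decide between the interpretations) shows the two essential facts: Alice's report looks like a fair 50/50 draw because of the equal amplitudes in |a+>|b-> + |a->|b+>, while Bob's report is then fixed with no second random draw.

```python
import numpy as np

rng = np.random.default_rng(1)

# Equal-amplitude entangled state: (|a+>|b-> + |a->|b+>)/sqrt(2).
# Hearing Alice's result is where Born's rule is applied: a 50/50 draw.
trials = 100_000
alice = rng.choice(["+", "-"], size=trials)

# Bob's result is then fixed by the entanglement: no second random draw.
bob = np.where(alice == "+", "-", "+")

print((alice == "+").mean())      # close to 0.5: Alice alone looks random
assert np.all(alice != bob)       # branches are perfectly anti-correlated
```

Once Alice has told you "+", every later observation is conditioned on that memory, which is why Bob's report carries no surprise.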

cheers,
Patrick.
 
  • #6
kaos said:
Hmm is a computer that is recording a result of the experiment considered an observer??For example what if a qm experiment such as the double slit experiment was done without an observer but recorded by a computer? Would the result of wavefunction collapse be the same??

See vanesch's post for further details (up to you to change some of his words, and also some of my own, to fit your personal philosophy, as long as it does not impact the physical predictions, i.e. the Born rule, and is self-coherent).
vanesch describes what QM says: a QM state, together with an observable, is a statistical description of a particular system. [Classical] statistics does not describe how the outcomes of an experiment occur, just the statistics [of the outcomes]. The same holds for QM: never forget that a QM state is not an outcome. At most, a QM state will tell you that 100% of the outcomes (the measurement results) in an experiment are identical.

A computer may be considered an "observer" as long as you know the interaction with the part of the system you want to "measure":
You have the global state |psi(t)> = sum_a a(t)|computer_a(t)>|system_a(t)>
and you must know the interaction between the computer and the system, or part of the system (my previous post), so that you know |psi(t)> at another time through the unitary evolution (the Schroedinger equation). (I am assuming that the SE is still valid for huge systems.)
Therefore, knowing the computer is in the state |computer1> and knowing its record function (its measurement interaction with the system under test), you deduce the state of the system (see my previous post or vanesch's post):

At time t: |psi(t)> = sum_i ci(t)|computer_i>|system_i> in a given basis (what we like to call the basis of the observable of the measurement; it can be the computer basis in this case, and the same for the system).
(This decomposition is arbitrary: we choose the statistics we want to know. It has no impact on the system; as with classical statistics, we are free to choose the random variables we want to analyse.)
Thus, if the state of the computer is |computer1234>, you automatically deduce the state of the system under measurement: |system_1234>. The statistics of this result are given by |c1234(t)|^2, that's all.

Now we can detail |computer_i> and |system_i> in the case of the double-slit experiment (I will try to be as simple as possible):

Let's say that |system_i> is the state (we have light or no light, in a coarse approach) of the diffraction pattern at the position i ("x = i") (we have discretised space to keep this analysis as simple as possible):
|system> = sum_i |system_i>, where |<i|system_i>|^2 is the well-known diffraction pattern of the double slit at the position x = i.
Thus we can say that the state |computer_i> gives the diffraction amplitude at the position i of the double-slit experiment (i.e. we have constructed a computer detector).
We can complicate the system and say that the computer has a set of detectors at every position j (for example a CCD coupled to the computer); in that case, we have a |system_i> state coupled to the computer state |computer_i>, described by:
|system_i> = sum_j aij |j>, where |j> is the position in the interference pattern and aij is the light amplitude associated with this position.
In this case we may have for example the global state |psi> = |computer_i0>|system_i0>: we know with 100% confidence that we have a single interference pattern in a double-slit experiment => we also know that the computer is in the state |computer_i0> (the recorded interference pattern).
But you can also have this state:
|psi> = |computer_i0>|system_i0> + |computer_i1>|system_lightoff>
The |computer_i1> state is the recorded state of a double-slit experiment where we haven't switched on the light.
And so on.
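For concreteness, the recorded pattern |aij|^2 can be computed numerically. The sketch below (all parameters are illustrative, in arbitrary units) just evaluates the two-path amplitude sum at discretised screen positions, i.e. what the computer detector would record:

```python
import numpy as np

# Illustrative double-slit parameters (arbitrary units).
wavelength = 1.0
slit_sep = 5.0
screen_dist = 100.0
x = np.linspace(-30, 30, 601)        # discretised screen positions "x = i"

# Each slit contributes one amplitude; the phase comes from the path length.
k = 2 * np.pi / wavelength
r1 = np.hypot(screen_dist, x - slit_sep / 2)
r2 = np.hypot(screen_dist, x + slit_sep / 2)
amplitude = np.exp(1j * k * r1) + np.exp(1j * k * r2)

intensity = np.abs(amplitude) ** 2   # the |aij|^2 the computer records
assert abs(intensity.max() - 4.0) < 1e-9   # full constructive interference at x = 0
assert intensity.min() < 0.01              # near-zero minima between fringes
```

Whether a person looks at the array `intensity` or it just sits in the computer's memory changes nothing about these numbers, which is the point of the post above.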

Seratend.
 
  • #7
seratend said:
(I am assuming that the SE is still valid for huge systems).

I think that this has been the relatively recent (last tens of years) paradigm shift, namely that the unitary evolution (linear quantum mechanics) is to be taken much more seriously than previously thought.

In the first decades of QM, people made the distinction between the "microworld", which was ruled by QM, and the "macroworld", which was ruled by classical physics, observers, etc... Schroedinger's cat is the typical example: "how can a cat be in a superposition of dead and alive states ??"

Some people tried (still try) to introduce small modifications in the Schroedinger equation, which would change the unitary evolution into a projection. Small enough modifications so that we still keep the microscopic successes of QM (atomic and molecular physics etc...) and get rid of the "incomprehensible" parts you run into when you have to consider superpositions of people-states a la Schroedinger's cat. (I have to say that for a while I was also favorable to such an approach) The difficulty that remains to be solved is to introduce modifications at a mesoscopic scale, which make up the twilight zone between the QM and the classical world.

However, EPR-like results make this approach very difficult to follow: indeed, what's so "mesoscopic" about two entangled photons separated by 50 km ??
As I said earlier, it seems that we have to take the strict unitarity of QM more seriously than used to be done. And that's the slow paradigm shift that has been taking place during the last decades (although there were some illuminated minds, like von Neumann, who DID foresee this paradigm shift, and some people, like Hawking, apply the superposition principle even to stellar environments over time scales of the order of the age of the universe, to have superpositions between the states "gas contracted, made a star and collapsed into a black hole" and "the gas didn't contract enough to do so").

Taking the unitary evolution more seriously means, sooner or later, that you have to consider pieces of paper, computers and people in states which are superpositions of "classical" states. Everett was probably the first one to take this seriously, and he called these superpositions "many worlds" (well, he didn't, that's a name that came later in fact). He claimed that there is only the unitary part of quantum mechanics and that's all.

Before declaring him ready for the asylum, the question needs to be examined more thoroughly. In the same way as Special and General Relativity made us think again about concepts such as space and time, quantum theory makes us think again about "what is an observation". Even if people are in superposed states, WILL WE OBSERVE ANYTHING WEIRD ? The answer is an amazing: NO ! It is because each observer, in each term of the big quantum state of the universe, will be entangled with exactly those classical states of the others he has always been observing. So each term entangles a "me" state with a completely "classical" set of states of my neighbour, the moon, my wife, etc... The detailed explanation of this is the content of what's called "decoherence theory", but it is nothing else but strict application of linear quantum theory.

However, there IS a remaining problem in this view, and it is called the Born rule: the rule that gives you the PROBABILITY to observe one of the states in a superposition. Pushing it a bit, it gives the probability for you to BE in one of your different states (and from decoherence theory, it follows then that this rule "propagates downward"). Nobody has yet been able to deduce this rule from unitary quantum theory (and as I said, I think they will never be able to do so). So SOMETHING has to apply the Born rule, high on top of the "observer chain".

And there's a myriad of interpretations there, my pet interpretation being that your consciousness continuously uses the Born rule to choose into which of the "me" branches your "real me" will be going. It is then completely up to your personal taste to say that this "projected out" the state, or to continue to take into account all the other states in the superposition. In fact, you can just as well project it out, because there is no way in which these other terms in the superposition will ever affect any measurable result (by you).
The existence, or not, of these other terms is not open to experimental inquiry.
So in a way you can say that it is your mind that "collapses the wavefunction". Or you can prefer to say that it is your mind which chooses the term with which "you" will spend the rest of your life, but leave the other terms still there, with copies of you but which "you" will not experience. It is up to your personal taste. You will never decide by experiment.

How "crazy" is this ? At first sight, you say that the guy telling you this must have hit his head real hard against a heavy, blunt object. But the more you think about it, the less "crazy" it sounds, because it is in no conflict with any conscious observation you can ever make. And - that's the lesson - you don't know about any OTHER observations but YOUR conscious observations !
What I find interesting is that modern physics seems to oblige us each time to go back to some philosophical issues by defying the "intuitive" notions of concepts such as "time" and "observation".

Of course this consciousness thing can be difficult to swallow. But all OTHER views of QM need to introduce "unobserved physics", the most obvious one being "collapse at a distance". What is the physical process that decides that now, locally, we measure the state of A, collapsing it and at the same time collapsing the state of B at 50 km, while they are doing the same over there, with the time ordering of the two events depending on the observer's speed ?? There's too much unexplained physics in that (for the moment).

cheers,
Patrick.
 
  • #8
kaos said:
What I am asking is: is any physical interaction actually an observation, so that no conscious observer is really required?

These "consciousness" ideas are an ancient relic of the 1920s. This
was long before people could imagine that intelligence or consciousness
could be constructed by interconnecting large numbers of nerve cells or
transistors.

Intelligence or consciousness in those days was mostly viewed as
some kind of unknown "vapor" floating around in the brain and the
body. :bugeye: Sometimes interactions of this vapor with other
substances could be demonstrated, with predictable results (think
alcohol, drugs).

The presumed philosophy of interaction between the consciousness,
as some kind of mysterious vapor or spirit, and the micro world has
its roots in these dark-age ideas.

Regards, Hans
 
  • #9
Hans de Vries said:
These "consciousness" ideas are an ancient relic of the 1920s. This was long before people could imagine that intelligence or consciousness could be constructed by interconnecting large numbers of nerve cells or transistors.

As you can imagine, I absolutely do not agree with that point of view. First of all, you should distinguish between "intelligence" and "consciousness". Intelligence is an observable property which can be deduced from behaviour, and what you say applies to intelligence.

However, the philosophical problems related to "consciousness" are more subtle. In fact, what's called "the hard problem of consciousness" in philosophy comes down to the fact that you cannot deduce consciousness from behaviour. As such, it becomes quite difficult to say much about it.
The only consciousness of which you know it exists is your own. By induction, you assume that other people also have a consciousness and by extrapolation you could possibly guess that other physical constructions (such as animals) might have a consciousness. But there is absolutely no way to find out.

Imagine you make a computer which, for all practical purposes, behaves like a human being in conversation (in the style of the Turing test). What makes you think it is conscious ? You can say that it is intelligent. But when it weeps, does it "really suffer" ? How do you know ?

Imagine I freeze you and take your brain out, make an exact copy of it, put that copy back into your head and heat you up again. As long as a brain is a classical device, nothing stops me in principle from doing so. If you survived the operation, is that "you" or are "you" in fact dead, in the dustbin, and did I construct a twin brother with all your memories?

The issue of consciousness is much more involved than you imply in your post. However, these discussions belong more on the epistemology forum than on the QM forum...

cheers,
Patrick.
 
  • #10
Consciousness from a nerve-cell / transistor point of view

vanesch said:
The issue of consciousness is much more involved than you imply in your post.


Consciousness is the ability to obtain and maintain a (simplified) copy
of the surrounding world and simulate the progression independently
of what is happening in the real world. There is a continuous process
going on that compares the progression in the real world via the sensory
inputs (eyes, ears) and the simulated artificial progression. The better
the correspondence, the higher the consciousness.

There are two basic levels of consciousness:

Spatial, Temporal: The ant's brain predicts that the jaws of the predator
insect will catch him if he takes no action. His brain simulates various
scenarios to escape the jaws and chooses the most promising scenario.

Social Consciousness: This starts with the ability to simulate and predict
the behavior of other conscious "objects" in the surrounding
environment. The insect's brain may be conscious of the fact that the
jaws of the predator are not just moving "by the wind" but that there is
some intelligence behind them. This may then be used to select the
right scenario.



The social consciousness of mammal and primates (including the human
type) can simulate and predict the behavior of others with social
consciousness of others with social consciousness of others with...
et-cetera.

And then, to complete: we have the ability to simulate and predict
a surrounding world which includes ourselves as an object that has such
a social consciousness. We are self-conscious.



When dreaming, deprived of any sensory inputs, you have the ability to
visualize your own variant of the world, hear sounds, hear the voices of
others with a similar level of social consciousness, visualize simulated
predictions of their behavior, et cetera.

What makes the human brain better compared to other mammals is the
language system, which classifies our surroundings into thousands of
objects, compressed into words of just a few bytes. Such a data-compressed
representation allows for an inter-linking system that associates
thousands of different objects, their properties, their uses and their
behaviour with each other.


And yes, it is all machine implementable.

Regards, Hans
 
  • #11
Hans de Vries said:
Consciousness from a nerve-cell / transistor point of view:


Consciousness is the ability to obtain and maintain a (simplified) copy
of the surrounding world and simulate the progression independently
of what is happening in the real world. There is a continuous process
going on that compares the progression in the real world via the sensory
inputs (eyes, ears) and the simulated artificial progression. The better
the correspondence, the higher the consciousness.

Ok, then, that's not what a philosopher uses for the word. Probably artificial intelligence people do so, because, after all, they have to be able to produce "conscious" devices in their proposals, to get funding :smile:
What you describe is simply a kind of intelligence: a certain processing capacity coupled to sensory information. I'm of course not contesting the possibility to do that, but I claim that that has nothing to do with what a philosopher means with "consciousness".

But to come back to the physics, I only need an aspect of it, namely the "identity" part, the fact that a conscious being is conscious of its own identity (knows the difference between "I" and "the rest of the world").
Your definition denies this identity part completely. In order to illustrate what I mean, let's do some horrible things:
If you are a conscious machine, and I make a perfect copy of you, with identical memory content, then which of the two has the "I" sensation ? Wouldn't you mind me making a copy of you and afterwards, slowly burning you, while you are still a bit conscious ? Or wouldn't you mind me writing all the information I need to make a copy of you, write it on a disk, and then kill you ? If 20 years later, I use that information to reconstruct you, is that "you" or just a frankenstein monster I created which behaves like you ?
Should I go on trial or not, if, before I kill someone, I write this information on a disk ?

cheers,
Patrick.
 
  • #12
vanesch said:
Ok, then, that's not what a philosopher uses for the word. Probably artificial intelligence people do so, because, after all, they have to be able to produce "conscious" devices in their proposals, to get funding :smile:
What you describe is simply a kind of intelligence: a certain processing capacity coupled to sensory information. I'm of course not contesting the possibility to do that,

This is the basic mechanism of how consciousness works. It also explains
the various levels of consciousness, from insects to humans.

vanesch said:
but I claim that that has nothing to do with what a philosopher means with "consciousness".

I'm generally not too impressed by philosophers' talk. Not if it's about
physics, nor if it's about Artificial Intelligence, let alone if it is a
combination of the two...

vanesch said:
But to come back to the physics, I only need an aspect of it, namely the "identity" part, the fact that a conscious being is conscious of its own identity (knows the difference between "I" and "the rest of the world").
Your definition denies this identity part completely. In order to illustrate what I mean, let's do some horrible things:
If you are a conscious machine, and I make a perfect copy of you, with identical memory content, then which of the two has the "I" sensation ? Wouldn't you mind me making a copy of you and afterwards, slowly burning you, while you are still a bit conscious ? Or wouldn't you mind me writing all the information I need to make a copy of you, write it on a disk, and then kill you ? If 20 years later, I use that information to reconstruct you, is that "you" or just a frankenstein monster I created which behaves like you ?
Should I go on trial or not, if, before I kill someone, I write this information on a disk ?

If you were an identical copy of another physical you, then it would
just "feel" like you.

Such a copy-"kill"-copy-"kill" sequence may actually be ongoing on the
microscopic scale. How do we know that an elementary particle that
has moved from one place to another is still built out of the same
"material substance"? If there is anything wave-like in the transport,
then it may be that only the information has propagated, copied
from one location to the other. Your previous you is nothing more
than infinitesimally small ripples in the vacuum left behind.

And there is no way to tell the difference...

Regards, Hans
 
  • #13
Well, it seems this thread has shifted somewhat towards philosophy. I will try to focus on physics.
First, the separation of problems:
I hope that everyone agrees that a state and an observable in QM describe the outcome statistics of that observable in an experiment.
Classical probability does the same: we have a random variable and its probability law.
I think we must not confuse the outcomes and the statistics of outcomes. The statistics of outcomes are well explained by QM (the Born rule). What QM does not explain is the arrival of a given outcome in a given experimental trial. The same is true of classical statistical mechanics: it describes only the statistics of outcomes.
Now, we can ask why, in an experimental trial (classical or QM), we almost always assume "we see" a "single" outcome (and not an event).

(Note: outcome and event are used here in the language of mathematical probability.)

I think, if we can correctly (logically) answer this question, we can better explain why we almost always assume we see an "outcome world".
Seratend.
 
  • #14
From what I understand of string theory, "copying and killing" is going on all the time at the fundamental level, where matter is reconstituted by adjusting vibrational signatures.

Put your left hand in front of you and move it from left to right. What you see is actually not your hand moving but the strings changing shape to account for the perceived movement. We see the hand moving at a pathetic 25 frames a second, enough to trick our brain.

At light speed and Planck-size magnification, the end string where the air touches your thumb changed its collective vibrational signature from air to hand, and the bit at your little finger changed its signature to that of air.

In a fixed 11-dimensional background nothing moves; only the strings change shape.
 
  • #15
seratend said:
Well, it seems this thread has shifted somewhat towards philosophy. I will try to focus on physics.
First, the separation of problems:
I hope that everyone agrees that a state and an observable in QM describe the outcome statistics of that observable in an experiment.

The problem with that statement is that it is ambiguous: namely what constitutes an experiment (usually called a "measurement") ? When do you switch from the "hilbert state description" to the "statistics description" ? What is the difference between an "interaction" (described by unitary QM), and a "measurement" (giving you a statistical description through the Born rule) ? It is THE longstanding problem in QM, from the beginning, and all the discussion is about when (if) this transition from hilbert space description to statistical description takes place.

What I tried (maybe in a very clumsy way) to point out is that we should probably attach more "reality" to the hilbert space state description than "just a tool that gives us a statistical description" ; it is in any case the current trend, when you see people consider "the state function of the universe".

cheers,
Patrick.
 
  • #16
Hans de Vries said:
Such copy "kill" copy "kill" sequence may actually be ongoing on the
microscopic scale.

Well, that's exactly what unitary quantum theory says: the current "you" becomes in an entangled state with whatever "you" observe, each time that you observe something.
There's even no ambiguity to what are those "you" states: this is given by the Schmidt decomposition. This is essentially "many worlds". Each "you" state has then a different, but consistent experience.

Let us consider you, in a you0 state, and two other systems, S and T, each in their states:

|universe0> = |you0> (a|s1> + b|s2>) (c |t1> + d |t2> )|rest>

Imagine now that "you look at system S". It simply means that whatever makes you up physically has an interaction with the S system through an interaction hamiltonian.

So after that, you become entangled with system S:

|universe1> = (a |you-s-1> |s1x> + b |you-s-2> |s2x>) (c|t1> + d |t2>)|rest>

So now essentially, the "you0" state has disappeared, and there are now two copies of "you", one in the you-s-1 state and one in the you-s-2 state. The you-s-1 state is happy: he "measured" system S to be in the s1 state. The you-s-2 state is also happy: he "measured" system S to be in the s2 state.

Imagine now that "you-s-1" decides to do nothing, while "you-s-2" decides to measure system T. This means there will now be an interaction hamiltonian that entangles "you-s-2" with the T system, while leaving the "you-s-1" state alone.

This gives you:

|universe2> = a |you-s-1> (c |t1> + d |t2>)|rest> + b c |you-s-2-t-1> |t1>|rest> + b d |you-s-2-t-2> |t2>|rest>

Again, the "you-s-2" state has disappeared, and there are now 3 copies of you, with different histories: you-s-1, you-s-2-t-1 and you-s-2-t-2.

Each of these you's has a different history:

*)you-s-1 lives in a world where he measured S in state s1, and didn't touch upon system T.
*) you-s-2-t-1 lives in a world where he measured S and found it to be in state s2, and then decided to measure system T, and measured t1.
*) you-s-2-t-2 lives in a world where he measured S, found s2, decided to measure T and found t2.
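The bookkeeping above can be made concrete with a toy numerical sketch (the amplitude values here are illustrative, chosen only so that the states are normalised): each branch carries the product of the amplitudes picked up at each split, and the squared norms of the three terms in |universe2> still sum to 1.

```python
# Toy sketch of the three branches in |universe2> above.
# The amplitude values are illustrative; any normalised a,b and c,d work.
a, b = 0.6, 0.8        # |a|^2 + |b|^2 = 1 (system S)
c, d = 0.28, 0.96      # |c|^2 + |d|^2 = 1 (system T)

# you-s-1 never measured T, so T's full superposition stays inside
# that branch; you-s-2 split again into two T-branches.
branches = {
    "you-s-1":     a,
    "you-s-2-t-1": b * c,
    "you-s-2-t-2": b * d,
}

total = sum(amp ** 2 for amp in branches.values())
print(total)  # -> 1.0 (up to floating-point rounding)
```

Unitarity guarantees the total squared norm is conserved at every split; what it does not supply is any rule saying those squared norms are probabilities.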

The problem is that MOST of these "yous" will accumulate experiences which are NOT in agreement with the statistics of the Born rule. It is the problem MWI proponents never could solve (although they give a lot of "plausibility arguments"), and that's because they left out explicitly the Born rule.

So there are myriads of "yous" in the different terms. Now, as far as you know, you yourself accumulate experiences, as do all these "you" states, but which one is "the real you"?
You DO get the right statistical results out if you say that, each time such a split takes place, the "real" you chooses randomly to be in one of the different branches, with a probability equal to the relative squared norms of the terms into which you split. You can then forget about all the other "you" states: explicitly (by projecting them out), or implicitly (they exist, but you'll never hear of them again).
The first view is solipsist, in that you change the physical state of the universe by your pure observation; the second is not.

This is the only way to have, at the same time:
- a physical reality attached to "the quantum state",
- quantum theory applied to all physical interactions, without classifying some as "observations" and others as "interactions",
- the correct Born rule coming out.

In all other cases, one has to draw the line between which physical processes are "interactions" (described by an interaction hamiltonian) and which physical processes are "measurements" (to which statistical rules apply).

cheers,
Patrick.
 
  • #17
In order to illustrate better the emergence of the Born rule in the preceding discussion, I will consider the following.

Let us consider N independent but identical 2-state systems S_1, S_2 ... S_N which are prepared in identical quantum states: a |+> + b |->, with a = sqrt(p) and b = sqrt(1-p).
These systems are, for instance, N electrons. You will measure them successively:

|state-0> = |you0> |s1> |s2> ... |sN>

|s1> = a|1+> + b|1-> etc...

After you measure the first electron, you get:

|state-1> = (a|you+> + b |you->) |s2>...|sN>

(I drop the 1+ and 1- states, which are tied up with the "youx", as a shorthand; consider them now part of the yous.)

After each "you" measured also the second electron, you get:

|state-2> = (a^2 |you++> + a b |you+-> + b a |you-+> + b^2 |you-->)
|s3>...|sN>

you get the general idea:

|state-N> = sum over all 2^N possible outcomes of terms:
a^k b^(N-k) |you xxx>
where xxx stands for all possible combinations of N + or - ;
k is the number of + in the corresponding sequence.

You see that the "you" which is in the |you++...+++> state has a very peculiar experience: all his measurements gave a +. If p = 0.999, then this is maybe in agreement with the Born rule, but if p = 0.0001, it is highly unlikely. And there is no way to discriminate between all these "you's" if you do not explicitly say that somehow, "the real you" has to choose amongst all its possibilities according to the Born rule.

You may be tricked by an MWI plausibility argument, namely that there is now a "statistical lot" of "yous" and that "you" just happen to be one of them.
But that is in agreement with the Born rule ONLY if p = 0.5.
Indeed, then all individual sequences +--+++-++---... are equally probable, and if you "hit a you at random", you get the right statistics out. But it doesn't work anymore when p is different from 0.5. MWI proponents try to get out of this with other arguments, such as that a 0.25/0.75 split should in fact count 3 different yous for the second case and so on, but that is sneaking in the Born rule through the backdoor. There is indeed no formal reason to split this component into 3 equal-norm components.
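The counting problem can be seen numerically in a small sketch (illustrative only): the fraction of branches carrying a given record is binomial and completely independent of p, while the Born weight of that record is not, so naive branch-counting reproduces the Born statistics only at p = 0.5.

```python
from math import comb

N, p = 10, 0.9

# Fraction of the 2^N branches whose record contains exactly k pluses:
# C(N,k) / 2^N -- the same for every p.
def branch_fraction(k):
    return comb(N, k) / 2 ** N

# Born weight of all records with exactly k pluses: C(N,k) p^k (1-p)^(N-k).
def born_weight(k):
    return comb(N, k) * p ** k * (1 - p) ** (N - k)

print(branch_fraction(N))  # 1/1024, whatever p is
print(born_weight(N))      # ~0.349 for p = 0.9
```

For p = 0.9 the "all +" record is the single most Born-probable history, yet it is just one branch in 1024: counting branches and weighting them by norm give very different answers.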
They also use the argument that, at the end of times, all non-Born states become vanishingly small (in Hilbert norm) compared to the ones with the Born statistics (which also become vanishingly small, but less so), but that's a cheap argument: what will happen "at the end of times" doesn't concern us right now, and nobody said that the Hilbert norm plays any role (that is exactly the content of the Born rule).

Recently, Deutsch gave a sophisticated argument that, if each current "you" were a kind of rational player who has to bet on the future by risking capital, with different returns in his different futures, there is only one mathematically consistent way for him to do so, and that is by assigning probabilities to his own futures according to the Born rule; but others have pointed out that in doing so, he is in fact sneaking the Born rule into his hypotheses of what he defines a rational player to be.

cheers,
Patrick.
 
  • #18
vanesch said:
The problem with that statement is that it is ambiguous: namely what constitutes an experiment (usually called a "measurement") ? When do you switch from the "hilbert state description" to the "statistics description" ? What is the difference between an "interaction" (described by unitary QM), and a "measurement" (giving you a statistical description through the Born rule) ? It is THE longstanding problem in QM, from the beginning, and all the discussion is about when (if) this transition from hilbert space description to statistical description takes place.
This is the same ambiguity as for "an experiment" in classical probability (or statistical Newtonian mechanics). So my proposal to focus on it as a good explanation will work for QM or for classical statistics.
In my modest opinion, a state and an observable describe (formally) the statistics associated with the observable, as they follow from the postulates of QM. That said, I still have not logically associated the statistical results (the Born rule or classical probability) with outcomes in experiments (or measurements, or whatever we want to call them: it has to be defined). In my modest opinion, it is at this interface that the paradoxes are introduced, through the lack of a formal self-consistent explanation: I can understand the Born rule as long as we stay with statistics (there it is logically defined).

Probability theory describes, logically, the statistics [of outcomes], not how we can produce an outcome in an experiment (at least that is what I have understood :). As long as we stay with statistics (the Born rule or classical probability) we do not have to take care of the implementation of an experiment (its "realisation" in the real world). Therefore, experiment and/or measurement realisations lack this formal explanation (i.e. formally connecting the outcomes to the statistics, rather than appealing to the holy grail of the law of large numbers). It is exactly what you call the transition from the Hilbert space description to the statistical description [of outcomes] in an experiment, and I agree with that (at least with what I have understood), as well as that it is the longstanding problem of QM (i.e. the accepted QM formalism describes statistics, not outcomes).

However, I prefer to separate this problem (connecting the statistical data to the outcomes in an experiment) from the well-explained statistical description (i.e. the QM statistics or a classical statistical model). It is, if you prefer, an attempt to focus on mathematical analysis and words rather than on physical interpretations/words (i.e. like Hilbert with the formal axiomatisation of geometry: we separate the mathematical object "circle" from the "real" circle), just to avoid falling into metaphysics or philosophy.

vanesch said:
What I tried (maybe in a very clumsy way) to point out is that we should probably attach more "reality" to the hilbert space state description than "just a tool that gives us a statistical description" ; it is in any case the current trend, when you see people consider "the state function of the universe".
cheers,
Patrick.

Yes. But I prefer to say: I want to understand how I can connect the statistics given by a state and an observable of the QM formalism to the outcome results in an experiment (or measurement). I prefer to focus on outcomes (the mathematical definition) and their relation to the statistics (or the probability law if you prefer). However, I still need to define better what an experiment/measurement is, but I have removed the words "reality", "mind" and "consciousness" (and other near-philosophical terms) from the explanations.

Seratend.
 
  • #19
seratend said:
However, I prefer to separate this problem (connecting the statistical data to the outcomes in an experiment) from the well-explained statistical description (i.e. the QM statistics or a classical statistical model). It is, if you prefer, an attempt to focus on mathematical analysis and words rather than on physical interpretations/words (i.e. like Hilbert with the formal axiomatisation of geometry: we separate the mathematical object "circle" from the "real" circle), just to avoid falling into metaphysics or philosophy.

I think I understand what you are saying. And for all practical purposes, you are of course right, out of the formalism of QM, we get statistical predictions and that's what we compare to the outcomes of experiments.

But there is, as far as I understand, an extra issue in QM which has no counterpart in classical statistical physics, and it is the following: in classical statistical physics, the state IS a statistical description.
I wouldn't have any peculiar difficulties with accepting that there are, say, fundamental "random processes", such as when an electron collides into an atom, there is 20% chance that it undergoes an elastic collision (and with a certain probability distribution of the angle and so on), 70% chance that it ionises the atom, etc... which have no underlying "statistical mechanics" explanation. From these basic statistical building blocks, I could then understand how a bigger system works, and deduce its statistical description.

Imagine I have a microscopic process that, when measured, has chance a to give result + and chance 1-a to give result -. I could then also construct a kind of "many world" interpretation in which I say that both happen, but the real "me" then applies the statistics and will choose to be in the world with result + or result -. Although strictly possible, you understand that this is a bit ridiculous. The evident interpretation is that there is just one world, and that you have a chance to see + with probability a and a chance to see - with a probability 1-a. I don't know if this is the correct word, but we are here in a case where the microscopic probabilities are transitive, in that they induce probabilities at higher and higher levels of observation, up to the level of your conscious observation, in a smooth way. There's no need for a break between "observation" and "physical process".

But this is NOT the case in quantum theory. The hilbert state description is NOT a statistical description which is "transitive" (in the way I defined it above). This is best illustrated by the differences between statistical mixtures and pure states. They are NOT equivalent descriptions, and you can put yourself in situations where you DO see the difference. I'm not talking about the individual realisations here. I'm talking about the fact that if a system is in a pure state, it is physically in a different state than if it is in a mixture. In the first case I can obtain interference effects, and not in the second case. The statistical description that you obtain with the Born rule is when everything is reduced to a statistical mixture, and this can only happen, when you start with a pure state, through DIFFERENT processes than ALL the individual physical processes (which are all described by unitary evolution) we know of. Moreover, these processes make interference then impossible in principle. So, if you insist that quantum mechanics "gives us a statistical description of nature" it automatically includes that you define WHEN this transition takes place ; and it cannot take place through ANY of the microscopic physical phenomena we know of (because those do NOT apply, what I call, statistical transitivity but instead, apply unitary evolution).
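The pure-state/mixture distinction in the last paragraph is easy to exhibit in a two-state toy calculation (plain Python, no libraries; the basis choice is illustrative): both density matrices give 50/50 statistics in the original basis, but a measurement in the rotated |+> basis sees the off-diagonal interference terms of the pure state and not of the mixture.

```python
# |psi> = (|0> + |1>)/sqrt(2) as a density matrix, vs a 50/50 mixture.
amp = 1 / 2 ** 0.5
rho_pure  = [[amp * amp, amp * amp],   # |psi><psi| : off-diagonals present
             [amp * amp, amp * amp]]
rho_mixed = [[0.5, 0.0],               # classical coin flip: no coherences
             [0.0, 0.5]]

# Probability of outcome |+> = (|0>+|1>)/sqrt(2) is <+|rho|+>,
# which for this particular |+> equals (1/2) * sum_ij rho[i][j].
def p_plus(rho):
    return sum(rho[i][j] for i in range(2) for j in range(2)) / 2

p_plus_pure, p_plus_mixed = p_plus(rho_pure), p_plus(rho_mixed)
print(p_plus_pure, p_plus_mixed)  # ~1.0 vs 0.5 : interference vs none
```

The diagonals of the two matrices are identical, so no measurement in the original basis distinguishes them; only the coherences, which unitary evolution preserves and only a genuine "transition to a mixture" destroys, separate the two cases.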

It is THIS transition which somehow forces me to consider things like "consciousness" and so on, because I don't see where to put this transition if it cannot be induced by an electromagnetic, weak or strong interaction. It is not because I'm a new age guy :biggrin:

Happily for the working physicist, he doesn't have to think too much about it. Indeed, a physical process "behaves as if" it is a measurement, when the to-be-measured states entangle so much with something else that it will be difficult to trace back the correlations... on condition that the transition with the Born rule will be applied later. This sentence describes succinctly what decoherence theory tells us. (But you STILL need a process that will apply the Born rule later.)
So because of the huge difference between the entanglements you can control in microscopic systems, and the entanglements that occur when you use a measurement device (and tacitly assuming that something DOES force the transition from a pure state to a mixture), for all practical purposes, yes, quantum theory gives you a statistical description of the results you read off from measurement systems.

cheers,
Patrick.
 
  • #20
For some strange reason, this thread didn't get updated when I submitted my post. So I post another one to it. And just to say something, I puzzled together my view on QM in my journal...

cheers,
Patrick.
 
  • #21
In fact, I just found a paper, today, which describes quite closely what I've been arguing for all the time here. The author calls it MWI, but in fact, he also introduces the Born measure the way I did it for sentient beings. He even gives an evolutionary reason for it :-)

quant-ph/9609006

Lev Vaidman:

“On schizophrenic experiences of the neutron or why we should believe in the MW interpretation of quantum theory”

cheers,
Patrick.
 
  • #22
vanesch said:
In fact, I just found a paper, today, which describes quite closely what I've been arguing for all the time here. The author calls it MWI, but in fact, he also introduces the Born measure the way I did it for sentient beings. He even gives an evolutionary reason for it :-)

quant-ph/9609006

Lev Vaidman:

“On schizophrenic experiences of the neutron or why we should believe in the MW interpretation of quantum theory”

cheers,
Patrick.
I am still working on your post (it takes more time than expected!). I'll try to have a look at this paper.
Thanks.

Seratend.
 
  • #23
I have spent some time trying to understand the "state IS a statistical description" claim (its implicit content): I still have some problems connecting the outcome statistics to the statistics given by a [classical] probability law (the mathematical formulation). I think this may be the key point explaining why we have such problems with quantum states and the multiple interpretations we find around the world.
Therefore, I will try to elaborate for a while in order to be sure we understand each other (note that I am not a probability expert, so I may make some errors, and I will try to stay with standard probability :).

Frankly, I have a problem connecting the statistics of outcomes to probabilities. [Classical] probability gives us a law for a particular random variable on a probability space. From the probability law, we deduce [formally] the statistics of the events of the random variable (from the probability law P and an event e (a set), we get the number P(e)). Therefore, the probability statistics are given on events (sets of points, parts of the random variable's set of values). This is where the problems arise:
In the case of continuous [random variable] probabilities, the probability of a point (an event that is a set containing a single point) is null (e.g. the Lebesgue measure on |R). Thus, if we want to construct statistics of events based on "real" experiments we have a physical problem: it requires *uncountably* many trials to get statistical values different from 0 (in the case where the outcome of an experiment is a point).
However, in the discrete case, the probability of a point event may be non-null. Together with the weak law of large numbers, we can deduce the probability law of a point event from a countable number of outcomes (points) of an experiment, and therefore construct the statistics of events with minor restrictions (the countability problem).

Thus, we can say that we are only able to ["physically"] measure the statistics of discrete probability laws (for example, we measure the position of a point with a given precision => we are constructing a discrete random variable with countable values that covers the |R line: [itex](x_0+i\Delta x)_{i\in\mathbb{Z}}[/itex]; thus we measure the statistics of this discrete random variable and not the continuous one).
Therefore, the measured random variable (the "observable" in QM terminology) in [classical] statistics cannot be a continuous one, but a discrete one (it must have at most a countable set of values).
From this very basic introduction, we can see that even with classical probability, the statistics (outcomes) we can measure are not surely "what it is". We cannot [physically] measure with infinite [or known] precision the statistics of a continuous variable; we can only measure some sets of [huge, uncountable] outcomes of this continuous random variable (i.e. the outcomes of an implicitly associated discrete random variable) => the "measured statistical outcome" is not a point but rather a set of possible points [of the continuous random variable] (for example due to the uncertainty of the [classical] measurement).

In classical probability, the random variable together with the probability law gives the statistical description of the random variable. I can choose, as in QM, another random variable to describe other statistics (i.e. random variable ~ QM observable) – the statistics of this random variable. However, unlike in QM, I select [in classical probability] a set of commuting observables from which I pick my random variable/observable. To get the equivalent of the non-commuting observables of QM in classical probability, I must construct, for example (there are many other possibilities), new random variables that depend *explicitly* on the probability law of a given random variable. This is what is done, formally, in Bohmian mechanics, where classical probabilities are used to describe the QM state and observable (the random variable v(q,t) depends on the probability law rho(q,t) of the random variable q, i.e. (q,p) are non-commuting observables).
Formally, a probability law and the associated random variable on a classical probability space are always equivalent to a state and an observable in the Hilbert space of QM.

For example, we have the probability laws rho(p,q,t) in Newtonian mechanics and ~rho(E) in thermodynamics, etc … for the associated random variables.

There may be other reasons why we are not able to measure a continuous physical variable with infinite precision, even in the deterministic case. However, if you accept this restriction (the precision limitation of measurements), you may accept (with additional development ;) that what we "see" is always statistics (we never know the initial states, nor the final states) and therefore, even with classical mechanics, you can also have a many-worlds of possibilities (e.g. from chaotic classical systems and the precision limitation of measurements, or your "classical microscopic system" example).
Therefore, the problem is to understand what we "see" in a world of statistics: the outcome of a discrete random variable (or, if you prefer, the outcome of a given observable in QM language). How do we connect the outcome of an experiment to statistical results, when by definition we assume no underlying deterministic process (in formal mathematical logic, not an interpretation)? Is it coherent?

In addition, the statistical/deterministic boundary becomes even fuzzier (and may be nonsense) when we consider the result of the weak law of large numbers applied to a classical system of identical independent random variables. We get a global random variable with probability 100% of being at the mean value of the independent random variables (this also works for independent identical QM observables). In a world of statistics, you recover what seems to be a deterministic world (from a set of statistical states, we recover what seems to be a deterministic state).
For example, consider the classical electron case where, formally, we can think of it as made of a countably infinite set of independent identical random variables, each one with a random position => the position of the electron (the global random variable of this system) is known with 100% confidence. In such cases, we can see that statistical and deterministic descriptions are not so well separated, and one may be included in the other.
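A quick sketch of that weak-law point (illustrative numbers): each individual position is random, but the global variable, the mean of N of them, concentrates on a single value as N grows, so the aggregate looks deterministic.

```python
import random
random.seed(1)

def mean_position(n):
    # n iid "constituent" positions, uniform on [-1, 1], mean 0
    return sum(random.uniform(-1, 1) for _ in range(n)) / n

print(abs(mean_position(10)))         # still visibly fluctuating
print(abs(mean_position(1_000_000)))  # concentrated near 0: ~deterministic
```

The standard deviation of the mean shrinks like 1/sqrt(n), which is the quantitative content of "known with 100% confidence" in the limit of infinitely many constituents.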

Examples: we can connect, formally, the deterministic description of the classical Maxwell equations to independent random variables (e.g. using the electron model above for the charge distribution).
We can do even better: we can take the QM free Hamiltonian of the em field (photons). In addition, if we assume that we have a countably infinite number of photons, all in the individual eigenstate |e> (the state of the whole system being |E>=|e>|e>…|e>…), we have with 100% confidence a static energy density of this field of e = sum_n e/N (where N is the total number of photons of the field, N --> +oO).

I think that such an approach may lead to a better understanding of QM (or classical) statistics, and at the same time it raises more questions about "what we really see/measure" (a decidability problem), as well as about the decoherence of a quantum system.

How can you interpret your examples, if you accept the occurrence of such possibilities (100% confidence)? When you take the global system (the microscopic + macroscopic measurement system), you seem to get a deterministic value; however, if you are able to look at finite parts of this system you can only get random events. Things become even more complicated if you admit that you may also see a deterministic event arising from the uncertainty of the measurement (because we can only "see" the approximate random variable/observable, which may always give the same value).

Now, imagine for an instant that the measurement part of a quantum system in a state |psi> generates an infinite number of photons towards your classical human-eye detector. Suppose that this infinite number of photons is described by the free em Hamiltonian in the vicinity of the human eye.
Suppose, in addition, that the measurement apparatus generates a colour (energy) for each basis state vector of the quantum state under measurement (the selected basis of the measurement: |psi> = sum_i ci|i>; for each |i> => colour |ei> for the photons). We have in fact an infinite number of elementary *independent* measurement apparatuses (one for each state |i>, thanks to the orthogonality of the |i>), where measurement apparatus number "i" generates a burst of photons if the particular outcome of the state |psi> triggers it (note that the "outcome" is shifted towards the measurement apparatus: the photons).

The projector of such elementary apparatus is given by: P_|i>=|ei>|i><i|<0|
We have [P_|i>,P_|j>]= 0 (independent measures).
Where |ei> is the eigenstate vector of an infinite number of photons of energy ei, |0> is the vacuum (in fact we should use instead, the initial state of the em field).

=> P_|i>|psi> = ci|ei>|i> (the probability of getting this outcome, i.e. of the state |ei>|i> being "true", is given by |ci|² for a state |psi>; note also that the outcomes |ei>|i> are mutually exclusive). Because we are working with an orthogonal basis (|i>), we cannot have, during a trial with the state |psi>, more than one elementary measurement apparatus triggering an outcome.

Therefore, in the vicinity of your eye, you will always get an infinite number of free [independent] photon outcomes coming from just one of the elementary measurement apparatuses for each trial with the state |psi>. Now, thanks to the weak law of large numbers, if we have the outcome corresponding to |ei>, you see a deterministic colour, "the state |ei>" (and vice versa). From an experimental trial on a statistical state |psi>, we get a deterministic state: the colour of the photons (the outcome).

Note that *all* of the processes used in this measurement experiment are statistical. However, we get what seems to be a deterministic result. We have used no collapse, no minds, no worlds terminology, just the fact that the measurement apparatus is capable of controlling *statistically* a huge number of free independent variables, plus the weak law of large numbers. If we get the colour ei (100% statistics) then measurement apparatus "i" was triggered during one trial of the state |psi>. You can remove the human mind: you still keep the 100% statistics for the photon energy distribution (the colour i). This result is independent of the observer, but not of the measurement apparatus.
Now where is the deterministic/statistical boundary? Does such a question make sense?
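Here is a toy numerical version of that machine (all numbers and names illustrative): each trial triggers apparatus i with Born probability |c_i|², and the huge burst of identical-colour photons makes the per-trial average energy exactly e_i — random across trials, sharp within one.

```python
import random
random.seed(2)

c2 = [0.2, 0.8]   # |c_i|^2 for the basis states |1>, |2> of |psi>
e  = [1.0, 3.0]   # photon colour/energy e_i emitted by apparatus i

def trial(n_photons=10_000):
    i = random.choices([0, 1], weights=c2)[0]  # which apparatus fires
    photons = [e[i]] * n_photons               # every photon carries colour e_i
    return sum(photons) / n_photons            # sharp per trial

print(trial())  # exactly 1.0 or exactly 3.0, never in between
```

Which colour fires is still statistical (here put in by hand with `random.choices`); the toy only shows how a macroscopic burst turns a single branch into a sharp, observer-independent record.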


If you and others want to have a look at the "formal" comparison of classical probabilities (i.e. the modern Kolmogorov formulation) and QM probabilities, I recommend Richard D. Gill's texts on arXiv. I am not sure that I agree with everything he says (and it is not really important), but I appreciate his mathematical approach to the problem (he is a mathematician, and most of his texts require little background to be understood). For example, in his text "Critique of 'elements of quantum probability'" he uses the EPR experiment to show that we can use classical probabilities to get the same result as QM probabilities; he has also made some comments on Deutsch's papers.

The other point I want to underline, in order to improve our mutual understanding, is the separation of the time evolution of the statistics from the statistics themselves. Your example with the classical microscopic system states is perfect.

Both in QM and in classical probability, the "formal" statistics do not refer to time. If we use a time label, we are just giving a collection of different statistics: law(ti)=(ti,law_i), where law_i describes a time-independent statistics/probability law. Therefore, we can formally separate the time evolution problem from the statistical results (a bit more complicated in relativity, left for later ;). The main advantage of this approach is that we can say that, at a given time, a QM state and a given observable are formally equivalent to a random variable and a probability law. Therefore, we can see that the main difference between QM and CSM (classical statistical mechanics) comes from the time evolution rather than from the statistics (in CSM, a Dirac distribution law of the (q,p) random variables is conserved through time evolution, while in QM, a Dirac distribution of the observable Q is not conserved through time evolution).

You have argued that “I could also construct a kind of “many worlds” interpretation [in CSM] in which I say that both [mutually exclusive states] happen, ... Although strictly possible, you understand that this is a bit ridiculous”.
You are perfectly right.
However, I prefer to show that this property comes [formally] from the particular time evolution of the statistics in CSM rather than from an interpretation.

CSM may be reformulated in the Hilbert space formalism (for example, the Liouville equation on a |q>(x)|p> Hilbert space), and the restriction on the states comes from the time evolution of the states (a mathematical property). In CSM we will never have a *time evolution* from a state |+> towards the state |+>+|->. This is due only to the time evolution equation (conservation of statistical volume), or to the interactions if you prefer (and not to the statistics).
In the same way, if we take an initial state |+>+|->, it will evolve in time and remain in the same form.
However, because we do not have the possibility of an interaction that allows a time evolution from a state |+> to the state |+>+|-> (and vice versa), we cannot measure/compare the difference between these states as in QM (it is a restriction on the allowable measurement projectors).
Therefore, all the measured states remain in one basis (the basis of the commuting observables) that we may call the mutually exclusive basis (as all the outcomes are mutually exclusive in a given basis).
This is the logical reason why we cannot see non-mutually-exclusive states in CSM (i.e. we cannot build measurement apparatuses that show/compare these states, or interactions that allow evolution towards these states/statistics): we can formally build such projectors, but we cannot build approximations of them through CSM time evolution. Therefore, we can only say that the implementation of such projectors is ridiculous/impossible in CSM.

I will try, in a next post, to answer your last section "But this is NOT the case in QM theory ... but instead apply unitary evolution" (the problem of pure states, mixtures, decoherence and unitary time evolution, worlds, consciousness versus formal conditional probability/statistics).

In this post I have tried to define a quantum machine (it could also be a classical statistical machine) where, from an input quantum state and quantum interactions, we get a measurement quantum state that gives deterministic statistical results: it just requires a huge number of particles, as we can expect in macroscopic measurement apparatuses.
The important point of this toy system (which may be wrong, if I have made errors :) is the deterministic result it gives for each experimental trial without the use of decoherence, just the properties of statistics. From a statistical description (and evolution), we get what seems to be a deterministic state for each trial. Therefore what we see ("the outcome") in this case is not the state of the quantum system, but rather the state of the photons.

Seratend.
 
  • #24
Very interesting (long !) post. I will reread it several times. But some quick comments.

seratend said:
This is where the problems arise:
In the case of continuous [random variable] probabilities, the probability of a point (an event that is a set containing a single point) is null (e.g. the Lebesgue measure on |R).

Honestly, I don't think that's a problem. Think of the visible universe as lumped into cubes a hundredth of the Planck length in size. Then there is only a finite number of physically distinguishable position states.
I had the impression that modern theories (superstrings, loop quantum gravity) only take into account a finite number of degrees of freedom anyway.
I don't think it is an issue.


To get the equivalent of the non-commuting observables of QM in classical probability, I must construct, for example (there are many other possibilities), new random variables that depend *explicitly* on the probability law of a given random variable.

Yes, but you will have to do non-local things...

In addition, the statistical/deterministic boundary becomes even fuzzier (and may be nonsense) when we consider the result of the weak law of large numbers applied to a classical system of identical independent random variables. We get a global random variable with probability 100% of being at the mean value of the independent random variables (this also works for independent identical QM observables). In a world of statistics, you recover what seems to be a deterministic world (from a set of statistical states, we recover what seems to be a deterministic state).

One has to be careful with this hammer of the weak law of large numbers. IF you accept that you are dealing with probabilities, then, yes, the weak law of large numbers makes sense. However, it is by no means a way to deduce probabilities if you didn't define a probability law in the first place!
It is exactly the fallacy Everett made when he tried to prove that world states with non-Born statistics have Hilbert norms which become infinitesimally small "at the end of times". First of all, there was the "at the end of times" (infinite number of measurements), and second, there was a priori no reason to call the Hilbert norms "probabilities": that's exactly what he was trying to prove!
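To make the point concrete, here is a minimal numerical sketch (mine, not from the thread, with illustrative numbers) of what the weak law of large numbers does and does not do: ONCE a probability law is assumed, sample means of independent trials concentrate on the expectation value. The law says nothing about where that probability law comes from.

```python
import numpy as np

rng = np.random.default_rng(0)

# A two-outcome "measurement" with an ASSUMED probability law:
# outcome +1 with p = 0.3, outcome -1 with p = 0.7.
p_plus = 0.3
expectation = p_plus * (+1) + (1 - p_plus) * (-1)   # = -0.4

# The weak law: the sample mean of N independent trials concentrates
# on the expectation value as N grows.
for n in [10, 1_000, 100_000]:
    outcomes = rng.choice([+1, -1], size=n, p=[p_plus, 1 - p_plus])
    print(n, outcomes.mean())
```

The convergence you see here is entirely downstream of the `p=[0.3, 0.7]` we fed in; nothing in the theorem turns a Hilbert norm into that `p` for us.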

Examples: we can connect, formally, the deterministic description of the classical Maxwell equations to independent random variables (e.g. using for example the electron model above for the charge distribution).
We can do even better: we can take the QM free Hamiltonian of the em field (photons). In addition, if we assume that we have an infinite countable number of photons all in the individual eigenstate |e> (the state of the whole system |E>=|e>|e>…|e>…), we have with a 100% confidence a static density energy of this field of e= sum_n e/N (where N is the total number of photons of this field, N -->+oO).

If |e> is an energy eigenstate, then it is even true for a single state :-)

How can you interpret your examples, if you accept the occurrence of such possibilities (100% confidence)?

I don't accept that (for the same reason I don't accept Everett's reasoning). First of all, you don't have an INFINITY of systems, but always a huge but finite number (watch out for black holes !). Second, once you have these finite systems, the states which do NOT have the right statistics (in a very significant way) will exist (even if they have very small Hilbert norms).

When you take the global system (the microscopic + macroscopic measurement system), you seem to get a deterministic value; however, if you are able to look at finite parts of this system you can only get random events. Things become even more complicated if you admit that you can see a deterministic event arising from the uncertainty of the measurement (because we can only "see" an approximate random variable/observable that may always give the same value).

As I said, there exist states (if you allow for unitary evolution) which have a small but finite Hilbert norm, and which will give you significant deviations.
Remember my 2-state system. Even if the state of each individual 2-state system is 0.0001|+> + 0.9999999|->, there's a finite norm for the branch in which an observer saw |+++++++++++++++++++++++>.
Now, if you somehow could say that this norm WAS a probability, then we're in agreement that such a state is so improbable that we can forget about it. However, if the Hilbert norm is NOT a probability, it is simply a component of the final state.
And the whole question is: what local physical process makes us decide that we can now call this Hilbert norm a probability (and apply the Born rule)?
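For scale, here is a quick sketch (my own numbers, matching the 0.0001 amplitude in the post above) of the squared Hilbert norm of that all-|+> branch: absurdly small, but strictly nonzero, which is the whole point.

```python
# Squared Hilbert norm of the |+++...+> branch when N independent copies
# of a two-state system with amplitude c_plus = 0.0001 on |+> are measured.
# Illustrative sketch, not from the thread.
c_plus = 1e-4                 # amplitude on |+> for one system
n_copies = 23                 # length of the |+++++++++++++++++++++++> branch

# The branch amplitude is c_plus^N, so its squared norm is (|c_plus|^2)^N.
branch_norm_sq = (c_plus ** 2) ** n_copies
print(branch_norm_sq)         # around 1e-184: tiny, but NOT zero
```

If the squared norm were a probability, we could dismiss this branch; without the Born rule it remains a bona fide component of the final state.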

Now, imagine one instant that the measurement part of a quantum system in a state |psi> generates an infinite number of photons to your classical human eye detector.

See, the story is already over. You assume that the human eye is a classical system. OK, you have fixed the place where the Born rule is applied. In fact, it is not very far from where I fix it (only a few cm to the back :-)

But the problem is: if you are now going into detail, and you look at the physics of the human eye, you will see certain molecules interact with photons, through the EM interaction. Well, this interaction is described by UNITARY transformations.


=> P_i |psi> = c_i |e_i>|i> (the probability of getting this outcome, i.e. of the state |e_i>|i> being "true", is |c_i|² for a state |psi>; note also that the outcomes |e_i>|i> are mutually exclusive). Because we are working with an orthogonal basis (|i>), we can't have, during a trial with the state |psi>, more than one elementary measurement apparatus triggering an outcome.

Again, you have now applied the Born rule to your apparatuses.
But if you analyse them, they too consist of atoms and EM fields, and all of this evolves in a unitary way.

So what will come out of it, if you apply your quantum theory rigorously, is a big entanglement: you will get, AT THE SAME TIME, a superposition of an infinity of red photons, of blue photons, etc., in exactly the superposition which gave you the initial |psi>.
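This "measurement as entanglement" step can be sketched numerically (a minimal toy of mine, not from the thread): a unitary correlates the system with a pointer, but the total state stays in a superposition, and no outcome is ever picked.

```python
import numpy as np

# Von Neumann pre-measurement sketch: a CNOT plays the role of the
# measurement interaction, mapping |0>|ready> -> |0>|p0> and
# |1>|ready> -> |1>|p1>.  Amplitudes 0.6, 0.8 are illustrative.
c0, c1 = 0.6, 0.8
system = np.array([c0, c1])            # system state c0|0> + c1|1>
pointer = np.array([1.0, 0.0])         # pointer starts in |ready> = |0>

# Basis order: |00>, |01>, |10>, |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

joint = np.kron(system, pointer)       # product state before interaction
after = CNOT @ joint                   # entangled: c0|0,p0> + c1|1,p1>

print(after)                           # still a unit-norm superposition
```

The evolution is perfectly unitary throughout: the pointer is correlated with the system, but both branches are still there, which is exactly Patrick's point.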


Therefore, in the vicinity of your eye, you will always get an infinite number of free [independent] photons coming from just one of the elementary measurement apparatuses for each trial with the state |psi>.

If somehow, you could explain me WHY we have to do away with our matter-EM interaction hamiltonian and its associated unitary evolution, and apply the Born rule in this apparatus...

BTW, it escapes me why you send an infinity of photons. In principle, one would be enough, if it is in an energy eigenstate: I will measure its energy with certainty, no?

Note that *all* of the processes used in this measurement experiment are statistical. However, we get what seems to be a deterministic result. We have not used any collapse.

You did, when you said that only one apparatus could fire.


The other point I want to underline in order to improve our mutual understanding is the separation of the time evolution of statistics from the statistics themselves. Your example on the classical microscopic system states is perfect.

Both in QM and in classical probability, the "formal" statistics do not refer to time. If we use a time label, we are just giving a collection of different statistics: law(t_i) = (t_i, law_i), where law_i describes a time-independent probability law. Therefore we can formally separate the time-evolution problem from the statistical results (a bit more complicated in relativity, left for later ;). The main advantage of this approach is that we can say that, at a given time, a QM state together with a given observable is formally equivalent to a random variable with a probability law.

I would then like to point out that, of course, IF you apply the Born rule, then all the statistics that came out of the experiments you did in the past can of course be described by a formal statistical system. After all, that's how people use quantum theory in practice.

The point I was trying to make was something different (I think): it is that we don't know of any local physical process that makes us apply the Born rule! There must be something in nature that makes us apply it, and apparently it is not through EM, weak or strong interactions ... and if superstring theory is correct, not by gravity either.
What we know already is that if that rule IS applied somewhere, THEN we may apply it at the level of the macroscopic measurement apparatus... except in EPR kinds of cases, where the macroscopic entanglement CAN interfere at the moment of observation of the correlations.

But of course, you can always set up a formal probabilistic system that will spit out the correct statistics (call it a physicist, who knows quantum mechanics and applies the Born rule in his calculations when he writes up his paper!). The problem I have is not with the probabilistic nature, nor about the fact that this probabilistic nature gives rise to "deterministic" expectation values (statistics) if we observe an infinity of systems.
It is in the PHYSICS that I have a problem. I know all interactions. I know perfectly well how they give rise to unitary evolution. But "at the end of my paper, I have to apply the Born rule". Where is the physics of the Born rule ?


cheers,
Patrick.
 
  • #25
Your message got me thinking of the following issue. There is of course a formal way to associate probabilities to the quantum state, and that is the density matrix, rho. The formal "universe" is then made up of all possible density matrices rho.

But I can now reformulate the "when do we apply the Born rule" problem in this formal statistical way. The density matrix rho evolves according to:

rho(t) = U(t,t0) rho(t0) U-dagger(t,t0),

where U is the unitary time evolution matrix of the system (related to the hamiltonian of the total system).

There is a property that says that trace(rho^2) lies between 0 and 1, and equals 1 exactly when rho is a pure state.

You see the problem (it is always the same one, but expressed differently!):

trace(rho(t)^2) = trace(rho(t0)^2) (easy to work out, using the cyclic property of the trace and U-dagger U = 1...)

So once a pure state, always a pure state.

That by itself is not a problem of course. What IS the problem is what the Born rule does: in the basis of your choice (very important), you PUT ALL NON-DIAGONAL ELEMENTS TO 0.
Unless you happened to be measuring an eigenstate, this gives a sudden jump in trace(rho(t)^2), which is not compatible with any physical process described by a U.
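This contrast is easy to check numerically. The following is a sketch of mine (illustrative 2x2 numbers, not from the post): unitary evolution leaves trace(rho^2) untouched, while zeroing the off-diagonal elements, which is what the Born rule effectively does in the chosen basis, makes it jump for a non-eigenstate.

```python
import numpy as np

# Pure state c0|0> + c1|1> and its density matrix rho = |psi><psi|.
psi = np.array([0.6, 0.8])
rho = np.outer(psi, psi.conj())

# Any unitary will do; here a real rotation by an arbitrary angle.
theta = 0.37
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

rho_t = U @ rho @ U.conj().T                    # unitary evolution
purity_before = np.trace(rho @ rho).real        # = 1 (pure state)
purity_unitary = np.trace(rho_t @ rho_t).real   # still 1: purity preserved

rho_born = np.diag(np.diag(rho))                # kill off-diagonal terms
purity_born = np.trace(rho_born @ rho_born).real  # drops below 1

print(purity_before, purity_unitary, purity_born)
```

The diagonal truncation takes trace(rho^2) from 1 down to |c0|^4 + |c1|^4, a discontinuous jump that no U can produce, which is exactly the problem stated above.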

cheers,
Patrick.
 

1. What is the meaning of "observation" in scientific terms?

In science, observation refers to the process of gathering information about a phenomenon using our senses or scientific instruments. This can include visual observations, measurements, and data collection.

2. How does observation lead to interaction?

Observation can lead to interaction in various ways. For example, when we observe a phenomenon, we may ask questions, make predictions, or design experiments to test our hypotheses. These interactions with the phenomenon can lead to a deeper understanding and new discoveries.

3. Can observation change the outcome of an experiment?

Yes, observation can change the outcome of an experiment. This is known as the observer effect, where the act of observing a phenomenon can alter its behavior. To minimize this effect, scientists often use blind studies or double-blind studies where the observer is unaware of the experimental conditions.

4. Is there a difference between passive and active observation?

Yes, there is a difference between passive and active observation. Passive observation refers to simply observing a phenomenon without any deliberate effort to interact with it. Active observation, on the other hand, involves actively engaging with the phenomenon through asking questions, making predictions, or recording data.

5. How important is observation in the scientific method?

Observation is a crucial step in the scientific method. It allows scientists to gather data and evidence to support or reject their hypotheses. Without observation, the scientific method would not be able to function effectively, as it is the foundation of all scientific research and experimentation.
