What counts as observer effect?

In summary, the conversation discusses the concept of the observer effect and its role in the collapse of the wave function. The current view is that decoherence is responsible for the appearance of collapse, though it does not explain it completely. Decoherence occurs when a system interacts with its environment, entangling the two and making it impossible to assign a quantum state to the system alone. The coefficients of the interference terms then vanish, making the system appear to behave classically. Some information about the system also becomes encoded in the environment, and this plays a role in making measurements seem classical. The conversation also touches on the role of the observer, whose actions, such as choosing a preferred basis and deciding when the measurement result occurs, go beyond what decoherence alone supplies.
  • #1
Gaz
What can be counted as an observer effect that would cause collapse of the wave function?
E.g. is it only the which-path information, or is it purely the act of observing? If the observer effect causes the wave function to collapse, how can we see the interference pattern at all? Is it because we can't actually see the wave with our eyes, so it is not collapsed, or is it really the instruments we use that affect the outcome?

Is it true that if we look at, say, the electron after it has passed the slit, it will collapse and act as if it were a particle all along? And do photons also suffer wave function collapse, or do they always act as a wave, as they essentially are a wave?

No calculus please, as I will most likely not have a clue what it means anyway. Just a simple answer will do.
 
  • #2
The current view is that decoherence is responsible for the appearance of collapse, but it doesn't explain it completely. Decoherence means that the system interacts with its environment (thermal or optical radiation, air molecules, the CMB, etc.) and becomes entangled with its degrees of freedom.
If two systems (here, the system and its environment) become entangled, neither has an independent quantum state; you can only assign a quantum state to the combined system + environment.
But because you usually can't monitor the environmental degrees of freedom, you get rid of them in the joint state by a procedure usually called tracing out the environmental degrees of freedom. This leaves the inner products of the environmental basis states as the coefficients of the interference terms.
Because the environment is usually very complicated, those environmental basis states are so different that you can assume they are completely orthogonal, so their inner products vanish to a very good approximation. This means the coefficients of the interference terms vanish, which means you can't observe quantum interference effects at the level of the system.

Another important point is that, in the above process, some information about the state of the system becomes encoded in the environmental degrees of freedom, and because we almost always observe the system by interacting with its environment, this has a role in making things seem classical.
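
To make the tracing-out step concrete, here is a minimal numerical sketch (the two-level system, the environment states, and the overlap parameter are invented purely for illustration): an equal superposition becomes entangled with two environment states, and the interference terms of the reduced state shrink as the environment records become more distinguishable.

```python
import numpy as np

# Minimal decoherence sketch: a system qubit in an equal superposition
# becomes entangled with two environment states E0, E1. Tracing out the
# environment leaves <E0|E1> as the coefficient of the interference term.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)

def reduced_system_state(overlap):
    """Build a|0>|E0> + b|1>|E1> and trace out the environment.
    `overlap` = <E0|E1>, i.e. how similar the environment records are."""
    E0 = np.array([1.0, 0.0])
    E1 = np.array([overlap, np.sqrt(1.0 - overlap**2)])
    # Joint state as a (system index, environment index) amplitude matrix.
    psi = a * np.outer([1.0, 0.0], E0) + b * np.outer([0.0, 1.0], E1)
    # Partial trace over the environment: rho[i, j] = sum_k psi[i,k] * conj(psi[j,k])
    return psi @ psi.conj().T

for overlap in (1.0, 0.5, 0.0):
    rho = reduced_system_state(overlap)
    print(f"<E0|E1> = {overlap}: interference term = {rho[0, 1]:.3f}")
# Prints 0.500, 0.250, 0.000.
```

With overlap 1 the environment records nothing and interference is fully intact; with overlap 0 the records are perfectly distinguishable and the interference terms are gone.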
 
  • Like
Likes bhobba
  • #3
So it isn't just magically changed due to which path information then. Thanks.
 
  • #4
The observer effect is subjective, and it depends on the observer knowing when he "saw" the result.

A more "objective" way of putting it is that wave function collapse is only needed if a conditional probability is to be calculated, i.e., the probability of a future outcome conditioned on an already observed result.

But the "objective" way of putting it is still subjective in the sense that if you don't want to calculate the conditional probability, then you don't need collapse.
 
Last edited:
  • #5
atyy said:
The observer effect is subjective, and it depends on the observer knowing when he "saw" the result.

A more "objective" way of putting it is that the wave function collapse is only needed if a conditional probability is to be calculated, ie. the probability of a future outcome conditioned on an already observed result.

But the "objective" way of putting it is still subjective in the sense that if you don't want to calculate the conditional probability, then you don't need collapse.

It seems to me that you're ignoring decoherence altogether, because, as Schlosshauer explains in his book, you can divide the measurement problem into three parts; decoherence solves two of them, and the remaining part is the problem of outcomes. Also, the fact that the environment carries information about the state of the system long after the quantum effects are suppressed makes it really plausible why we observe macroscopic systems behaving classically. So it seems to me we shouldn't state the measurement problem as it was stated in the beginning; we should accept that, whatever it is, it is the result of the interaction with the environment. But how do we get an outcome at all? It seems that decoherence needs help on this, which may come from a proper interpretation. But it seems to me you're completely ignoring all this and linking the measurement problem to the role of the observer, as was done decades ago. Why is this? Or maybe I'm missing something!
 
  • Like
Likes bhobba
  • #6
Shyan said:
It seems to me that you're ignoring decoherence altogether, because, as Schlosshauer explains in his book, you can divide the measurement problem into three parts; decoherence solves two of them, and the remaining part is the problem of outcomes. Also, the fact that the environment carries information about the state of the system long after the quantum effects are suppressed makes it really plausible why we observe macroscopic systems behaving classically. So it seems to me we shouldn't state the measurement problem as it was stated in the beginning; we should accept that, whatever it is, it is the result of the interaction with the environment. But how do we get an outcome at all? It seems that decoherence needs help on this, which may come from a proper interpretation. But it seems to me you're completely ignoring all this and linking the measurement problem to the role of the observer, as was done decades ago. Why is this? Or maybe I'm missing something!

It's hard to understand what Schlosshauer is saying. The observer in Copenhagen does three things: place the cut, choose the preferred basis, and choose when the measurement result occurs. Decoherence does none of these unless additional assumptions are added which do the job of the observer objectively. However, these are beyond standard QM. For example, the objective choice of the preferred basis requires a postulate such as the predictability sieve, which is not a textbook criterion.

Perhaps Schlosshauer intends the additional assumptions to be something like the hidden variables of Bohmian Mechanics. Again, it is unclear if these work for all of quantum mechanics. MWI, again, is not a consensus solution to the measurement problem, even by proponents such as Carroll and Wallace, who both admit difficult points. So until there is at least one interpretation that is as successful as Copenhagen, Copenhagen should remain the standard interpretation.
 
  • #7
atyy said:
It's hard to understand what Schlosshauer is saying.

Having read Schlosshauer, that most definitely is not true. He explains it very carefully. You may not agree with the three parts he breaks the measurement problem into - but why he does it is quite clear. The first two are reasonably explained by decoherence - it's the third that's the issue.

To the OP - Shyan is correct. These days it is generally accepted that decoherence has explained quite a bit of the measurement problem - issues remain - but those issues have morphed the measurement problem into a slightly different one: the problem of why we get any outcomes at all.

Thanks
Bill
 
  • #8
atyy said:
It's hard to understand what Schlosshauer is saying. The observer in Copenhagen does three things: place the cut, choose the preferred basis, and choose when the measurement result occurs. Decoherence does none of these unless additional assumptions are added which do the job of the observer objectively. However, these are beyond standard QM. For example, the objective choice of the preferred basis requires a postulate such as the predictability sieve, which is not a textbook criterion.

Perhaps Schlosshauer intends the additional assumptions to be something like the hidden variables of Bohmian Mechanics. Again, it is unclear if these work for all of quantum mechanics. MWI, again, is not a consensus solution to the measurement problem, even by proponents such as Carroll and Wallace, who both admit difficult points. So until there is at least one interpretation that is as successful as Copenhagen, Copenhagen should remain the standard interpretation.

It's true that decoherence itself is not a finished and polished product, but even if you want to reject decoherence, you can't reject the role of interaction with the environment. So any other objection I can only interpret as saying that the interaction with the environment can be interpreted better than the decoherence program does.
 
  • Like
Likes bhobba
  • #9
Shyan said:
It's true that decoherence itself is not a finished and polished product.

As I continually point out, our modern understanding of QM lacks certain key mathematical theorems (one of them being the factorisation issue) - this is pointed out in the following book if anyone wants to see what they are:
https://www.amazon.com/Understanding-Quantum-Mechanics-Roland-Omnès/dp/0691004358

Most people don't know it, but statistical physics has the same problem - the ergodic theorem. It was famously proved by von Neumann and Birkhoff independently, but unfortunately it isn't quite general enough, so the foundations of statistical physics are in exactly the same state as QM. But for some reason I don't see people making a big issue out of that like I often see in QM.
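
As a toy illustration of what the ergodic theorem asserts (an invented example, not part of the original post), consider an ergodic map on [0, 1): the time average of an observable along a single orbit converges to its phase-space average.

```python
import numpy as np

# Toy version of what the ergodic theorem asserts: for an ergodic map,
# the time average of an observable along one orbit equals the
# phase-space (ensemble) average. Example: the irrational rotation
# x -> x + alpha (mod 1) on [0, 1), with observable f(x) = cos(2*pi*x).
alpha = (np.sqrt(5.0) - 1.0) / 2.0   # irrational, so the rotation is ergodic
x, total, N = 0.1, 0.0, 100_000
for _ in range(N):
    total += np.cos(2.0 * np.pi * x)
    x = (x + alpha) % 1.0

print("time average :", total / N)   # tends to 0 as N grows
print("space average:", 0.0)         # integral of cos(2*pi*x) over [0, 1)
```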

Thanks
Bill
 
Last edited:
  • #10
bhobba said:
As I continually point out, our modern understanding of QM lacks certain key mathematical theorems (one of them being the factorisation issue) - this is pointed out in the following book if anyone wants to see what they are:
https://www.amazon.com/Understanding-Quantum-Mechanics-Roland-Omnès/dp/0691004358

Can you point to a specific section?

bhobba said:
Most people don't know it, but statistical physics has the same problem - the ergodic theorem. It was famously proved by von Neumann and Birkhoff independently, but unfortunately it isn't quite general enough, so the foundations of statistical physics are in exactly the same state as QM. But for some reason I don't see people making a big issue out of that like I often see in QM.

I've never seen a statistical physics textbook discussing such issues. Do you know a book that discusses this at an introductory level?
 
  • #11
Shyan said:
It's true that decoherence itself is not a finished and polished product, but even if you want to reject decoherence, you can't reject the role of interaction with the environment. So any other objection I can only interpret as saying that the interaction with the environment can be interpreted better than the decoherence program does.

No, of course I don't reject decoherence.

In Copenhagen, decoherence is what allows the classical/quantum boundary to be shifted.

In Bohmian Mechanics, decoherence is essential for thinking that it is a solution to the measurement problem for non-relativistic quantum mechanics.

I don't accept that decoherence without explicit statement of additional assumptions explains how the classical world emerges from a more fundamental quantum or sub-quantum reality.
 
  • #12
bhobba said:
Most people don't know it, but statistical physics has the same problem - the ergodic theorem. It was famously proved by von Neumann and Birkhoff independently, but unfortunately it isn't quite general enough, so the foundations of statistical physics are in exactly the same state as QM. But for some reason I don't see people making a big issue out of that like I often see in QM.

That's wrong. In quantum mechanics the state space is not a simplex, whereas it is in classical statistical mechanics.
 
  • #13
bhobba said:
To the OP - Shyan is correct. These days it is generally accepted decoherence has explained quite a bit of the measurement problem - issues remain - but those issues have morphed the measurement problem into a slightly different issue - the problem of why we get any outcomes at all.

A simple way to see that this is not correct is that decoherence is part of Copenhagen.
 
  • #14
atyy said:
A simple way to see that this is not correct is that decoherence is part of Copenhagen.

Decoherence is part of any interpretation, since it follows from the formalism. However, it is not an explicit part of Copenhagen or of Ballentine's statistical interpretation. It is part of what many consider Copenhagen's successor - Decoherent Histories:
http://motls.blogspot.com.au/2011/05/copenhagen-interpretation-of-quantum.html

Thanks
Bill
 
Last edited:
  • Like
Likes ShayanJ
  • #15
Shyan said:
Can you point to a specific section?

Not off the top of my head.

Shyan said:
I've never seen a statistical physics textbook discussing such issues. Do you know a book that discusses this at an introductory level?

If you look around you can find a lot of stuff on it, e.g.
http://www.fisica.ufmg.br/~dickman/transfers/MecEst/Fundamentos/ergodicity.pdf

Here is another at a more abstract level:
http://www.sbfisica.org.br/rbef/pdf/060601.pdf

Thanks
Bill
 
Last edited by a moderator:
  • Like
Likes ShayanJ
  • #16
Shyan said:
Can you point to a specific section?

Can there be a mathematical theorem that any factorization is OK? Another way to think about this question, at least informally, is to see how Bohmian Mechanics addresses the factorization problem. Essentially, what the hidden variables do is provide a finest factorization, and then we only allow factorizations that are composed in reasonable ways from the finest factorization. So the question then becomes: can any observable be the hidden variable? Informally, the answer appears to be "no", because the success of decoherence depends on the locality of interactions when using position basis functions. This is not a theorem, of course, but it is an informal argument.

If you want to see this stated formally, a sketch of the route to formalization is that Bohmian Mechanics embeds quantum mechanics into something in which the state is ignorance-interpretable, i.e., the state space is a simplex, and we can use classical probability throughout. Then the finest factorization and the reasonable compositions are the sample space and the Borel sigma algebra, or whatever.
 
Last edited:
  • #17
bhobba said:
Decoherence is part of any interpretation, since it follows from the formalism. However, it is not an explicit part of Copenhagen or of Ballentine's statistical interpretation. It is part of what many consider Copenhagen's successor - Decoherent Histories:
http://motls.blogspot.com.au/2011/05/copenhagen-interpretation-of-quantum.html

Yes, but decoherence is essential for Copenhagen. In most versions of Copenhagen, the classical/quantum boundary can be shifted, so that the environment can also be considered quantum. If there were no decoherence, this aspect of Copenhagen would not work.

So it is only in a technical sense that decoherence helps to solve the measurement problem. Restricting to non-relativistic QM, a better way of saying things is that decoherence is required for the consistency of interpretations in which the measurement problem is not solved (e.g. Copenhagen) as well as interpretations in which the measurement problem is presumably solved (e.g. Bohmian Mechanics).
 
Last edited:
  • #18
atyy said:
Can there be a mathematical theorem that any factorization is OK?

Of course there can be.

What I think is more likely is that it will only be shown for systems of practical interest. As von Neumann said, his ergodic theorem is sufficient for the purposes of statistical physics. The arguments are about what constitutes sufficient.

Thanks
Bill
 
Last edited:
  • #19
atyy said:
Yes, but decoherence is essential for Copenhagen.

It isn't. As the link I gave explained, it's got nothing to do with it. See the list of the key parts of that interpretation:

1. A system is completely described by a wave function ψ, representing an observer's subjective knowledge of the system. (Heisenberg)
2. The description of nature is essentially probabilistic, with the probability of an event related to the square of the amplitude of the wave function related to it. (The Born rule, after Max Born; see the sketch after this list.)
3. It is not possible to know the value of all the properties of the system at the same time; those properties that are not known with precision must be described by probabilities. (Heisenberg's uncertainty principle)
4. Matter exhibits a wave–particle duality. An experiment can show the particle-like properties of matter, or the wave-like properties; in some experiments both of these complementary viewpoints must be invoked to explain the results, according to the complementarity principle of Niels Bohr.
5. Measuring devices are essentially classical devices, and measure only classical properties such as position and momentum.
6. The quantum mechanical description of large systems will closely approximate the classical description. (The correspondence principle of Bohr and Heisenberg)
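
As a minimal sketch of point 2, the Born rule (the two-component state here is invented purely for illustration):

```python
import numpy as np

# Born rule: outcome probabilities are the squared magnitudes
# of the wave function's amplitudes.
psi = np.array([0.6, 0.8j])        # normalized: |0.6|^2 + |0.8i|^2 = 1
probs = np.abs(psi) ** 2
print(probs, probs.sum())          # [0.36 0.64] 1.0
```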

Thanks
Bill
 
  • #20
bhobba said:
What I think is more likely is that it will only be shown for systems of practical interest. As von Neumann said, his ergodic theorem is sufficient for the purposes of statistical physics. The arguments are about what constitutes sufficient.

The ergodic theorems (at least the older ones, perhaps there are newer ones I'm not aware of) are not sufficient for statistical mechanics. They are essentially irrelevant.
 
  • #21
atyy said:
They are essentially irrelevant.

That's the point. There are those, like von Neumann, who disagree. It's essentially an argument about what is sufficient for practical purposes. The same goes for a lot of the decoherence stuff.

Thanks
Bill
 
  • #22
bhobba said:
That's the point. There are those, like von Neumann, who disagree. It's essentially an argument about what is sufficient for practical purposes. The same goes for a lot of the decoherence stuff.

No, you are wrong on both counts.

1) In the foundations of statistical mechanics, the (older) ergodic theorems are irrelevant, because they only guarantee thermalization on time scales far longer than actually seen in practice. There is general agreement that the old ergodic theorems are irrelevant.

2) There are some forms of quantum mechanics in which the measurement problem becomes analogous to the problems in the foundations of statistical mechanics, e.g. Bohmian mechanics. However, these are non-standard. In the standard interpretation, the state space of quantum mechanics is not a simplex, unlike that of classical statistical mechanics. So quantum mechanics has foundational issues that differ from those of classical statistical mechanics.
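
To illustrate the simplex point (the code is a minimal invented check of a standard textbook fact): a classical mixed state has a unique decomposition into extreme points, but a quantum density matrix does not.

```python
import numpy as np

# Why "not a simplex" matters: the maximally mixed qubit state arises
# from two completely different ensembles, yet the states are identical.
ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

rho_z = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)    # 50/50 of |0>, |1>
rho_x = 0.5 * np.outer(plus, plus) + 0.5 * np.outer(minus, minus)  # 50/50 of |+>, |->

print(np.allclose(rho_z, rho_x))   # True: same density matrix
```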
 
  • #23
The " Observer effect " has it's limitations when it comes to the size of the observer. Quantum mechanics is a good tool to work out space- time problems. The theory that it takes more energy to view the small is true, however that mechanism will have to be a complex one in order to prevent destroying what we are trying to observe.
 
  • #24
I have a question: if we consider a two-slit experiment with very narrow slits, does the slit not count as a position measurement, and should there then be no interference?
 
  • #25
No, because you don't know which slit the particle went through.
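
A toy calculation may help make the distinction concrete (the plane-wave amplitudes below are invented for illustration, not a model of a real experiment): without which-path information the amplitudes add before squaring, producing fringes; with which-path information the probabilities add and the fringes disappear.

```python
import numpy as np

# Toy two-slit model. psi1 and psi2 are invented plane-wave-like
# amplitudes for "went through slit 1" and "went through slit 2",
# evaluated at screen positions x.
x = np.linspace(-5.0, 5.0, 11)
psi1 = np.exp(1j * 2.0 * x)
psi2 = np.exp(-1j * 2.0 * x)

# No which-path information: amplitudes add first, then square.
I_interference = np.abs(psi1 + psi2) ** 2            # 2 + 2*cos(4x): fringes

# Which-path information available: probabilities add instead.
I_which_path = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # flat, no fringes

print(I_interference.round(2))
print(I_which_path.round(2))
```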
 

What counts as observer effect?

The observer effect refers to the phenomenon where the act of observing or measuring a system alters its behavior or properties. In the social sciences, related phenomena are known as the Hawthorne effect or the experimenter effect.

What are some examples of observer effect in scientific experiments?

Examples of the observer effect include a participant in a psychology study changing their behavior because they know they are being observed, a thermometer altering the temperature of the system it is measuring, and, in quantum mechanics, which-path detection destroying an interference pattern.

How does the observer effect affect the validity of scientific experiments?

The observer effect can potentially introduce bias and affect the reliability and validity of scientific experiments. This is because the presence of an observer can influence the behavior or response of the subject being studied, leading to inaccurate results.

Can the observer effect be avoided in scientific experiments?

The observer effect cannot be completely avoided, but it can be minimized by careful design and planning of experiments. This can include using double-blind studies, where neither the observers nor the participants know which condition each participant is in, or using indirect measures to collect data instead of direct observation.

How can the observer effect be accounted for in data analysis?

The observer effect can be accounted for in data analysis by comparing the results of an observed group to a control group that was not observed. Additionally, statistical techniques such as analysis of covariance (ANCOVA) can be used to control for the potential influence of the observer effect on the results.
