Quantum mechanics and the macroscopic universe

In summary, there is ongoing debate about whether quantum mechanics can fully explain phenomena at the macroscopic level. While classical mechanics cannot fully explain some macroscopic phenomena, such as ferromagnetism, quantum mechanics is able to explain properties of materials and even behaviors of large objects under certain conditions. However, there are interpretational issues surrounding the idea of superposition of states at the macroscopic level, and various solutions have been proposed, including the Copenhagen interpretation and other views that try to reconcile quantum mechanics with classical observations.
  • #1
Nick666
Don't know if this is the right place to post this...

Physicists often say classical mechanics can't explain things at subatomic levels.

So, can quantum mechanics ever explain things at the macroscopic level?
 
  • #2
I wonder the same thing:
Faster-than-light effects / quantum entanglement are nowhere to be found in classical objects.
People are not dead and alive at the same time.
None of this makes any sense in the macroscopic world.
 
  • #3
Nick666 said:
So, can quantum mechanics ever explain things at the macroscopic level?

Much of what we generally consider "macroscopic" physics can be derived from quantum mechanics. One obvious example would be properties of materials (electrical conductivity etc., and nowadays even mechanical properties). Most solid state physics is "quantum mechanical" to some extent (even if we often tend to use semi-classical approximations).

But, if you are referring to things like superposition of states etc., then it is true that this is rarely seen in the macroscopic world. However, it IS possible. Many types of solid-state qubits (quantum bits) are so big (in some cases tens of microns) that you can see them quite easily in an optical microscope.

There are also some even more "exotic" examples, such as certain types of detectors for gravitational waves; these can be HUGE (tens of tons!), but since they are cooled to very low temperatures it is still possible to observe "quantum mechanical" properties.
 
  • #4
Classical mechanics cannot explain (starting from first principles) macroscopic phenomena like ferromagnetism, superconductivity or superfluidity, and the same is valid for quantum mechanics.

In general there is no (quantum) way to obtain from first principles the behaviour of a piece of matter of even 1 mm cube.

Ciao

mgb2
 
  • #5
Nick666 said:
So, can quantum mechanics ever explain things at the macroscopic level?

As so often, this touches upon interpretational issues.

The "obvious" difficulty quantum mechanics has to describe "macroscopic" physics is what Schroedinger already saw, and illustrated it dramatically with his famous cat. The cornerstone of quantum theory is the superposition principle: that the quantum state of things is a superposition of observable "classical" states.

By its very definition, this would run into an obvious problem: how can something *that is macroscopically observed* ever be in "a superposition of observable states" ? How can a cat be in a superposition of "dead" and "alive" ?

There are some "solutions" to this dilemma which are often erroneously taken as possible explanations, but which run into trouble. The first "solution" is this:

1) quantum mechanical superpositions are just a fancy word for probabilities.
So if you say that the "cat is in a superposition of dead and alive", then this simply means that the cat has a certain probability to be dead, and a certain probability to be alive (we simply don't know which one). Of course, this would then solve the issue.

Unfortunately, this is a very common misunderstanding, often promoted by elementary treatments and popularisations of quantum theory. But it is not true that one can always equate a quantum-mechanical superposition with a statistical distribution. Everything that is "typically quantum-mechanical" exhibits exactly the difference between the two; it goes under the name of quantum-mechanical interference. One can show mathematically that no statistical distribution can describe all quantum predictions.
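
To make that concrete, here is a minimal numerical sketch (my own illustration, not from the thread; plain NumPy): the superposition (|0> + |1>)/sqrt(2) and a 50/50 statistical mixture of |0> and |1> give identical statistics in the {|0>, |1>} basis, but measuring in the {|+>, |->} basis exposes the interference term that only the superposition carries.

```python
import numpy as np

# Basis states and the |+> = (|0> + |1>)/sqrt(2) measurement vector.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

# Density matrices: coherent superposition vs. 50/50 statistical mixture.
psi = (ket0 + ket1) / np.sqrt(2)
rho_superposition = np.outer(psi, psi)
rho_mixture = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# Probability of outcome |+>: p = <+|rho|+>.
print(plus @ rho_superposition @ plus)  # 1.0 -- constructive interference
print(plus @ rho_mixture @ plus)        # 0.5 -- no interference term
```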

This issue is even more complicated by the fact that the superposition of *outcomes* IS to be considered as a statistical distribution. So people very often fall into the trap of assuming that *any* superposition represents a statistical distribution, but this can be shown to run into problems.

The second "solution" is:
2) maybe interactions on the macroscopic scale are always such that the superposition of macro-states just becomes one single observable state. After all, these interactions can be quite complicated, and we can't follow all the details. So, it would simply be a result of the complicated interactions that we never end up with crazy superpositions of "cat dead" and "cat alive".

This is also not possible, at least in the current version of quantum mechanics. The reason is the unitarity of the time evolution operator.
It comes down to this: if initial state |a> gives rise to "live cat", and initial state |b> gives rise to "dead cat", then it is unavoidable that the state |a> + |b> will give rise to a superposition of dead cat and live cat. No matter how complicated U is.
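
A toy numerical check of that linearity argument (my own sketch; the random unitary is just a stand-in for an arbitrarily complicated interaction):

```python
import numpy as np

rng = np.random.default_rng(0)

# An arbitrary "complicated" unitary U, from the QR decomposition of a random complex matrix.
dim = 4
M = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(M)

# Two initial states (think: "will lead to live cat" / "will lead to dead cat").
a = np.zeros(dim, dtype=complex)
a[0] = 1.0
b = np.zeros(dim, dtype=complex)
b[1] = 1.0

# Linearity: U(|a> + |b>) = U|a> + U|b>, so the superposition survives the evolution.
print(np.allclose(U @ (a + b), U @ a + U @ b))  # True, for any unitary U
```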

So these are two non-solutions to the problem.

The "solution" by the founders of quantum theory (Bohr in particular) was simply that there is some vague borderline between the "quantum world" and the "classical world". We, human beings, live in the "classical world", which is the only "real" world. But microscopic things can sometimes "quit the classical world", undergo quantum phenomena, and, at the moment of their observation, "re-emerge" in the classical world. We're not supposed to talk about any classical property (such as the particle's position or so) during its "quantum dive", but only during its "preparation", and at the moment of its "observation". The outcome of this observation is statistical, and "re-initialises" the classical evolution from that point on. Cats are also just living in the classical world.

This goes under the name of the Copenhagen interpretation.

Of course, the above position is - although practical of course - philosophically rather unsatisfying, for two reasons: first there is the ambiguity of what is physically happening between "preparation" and "measurement" ("solved" by "you shouldn't talk about it"), but more importantly, the ambiguity of what exactly is a "measurement".

But again, this is the way one does quantum mechanics in practice.

And then, there are other views on the issue, which try to give quantum theory the possibility of giving a coherent description of what is macroscopically "classically" observed. The two that come to mind are Bohmian mechanics and the Many Worlds Interpretation. I hesitate to mention the "transactional" interpretation, because I'm not sure it works out completely - but that is maybe just my problem.

Some people think that quantum mechanics needs a modification in order to allow the "non-solution" 2) to apply, namely that complicated interactions give rise to the emergence of a single "outcome state". This can only be achieved by dropping the unitarity condition.

In other words, quantum mechanics and classical mechanics would both be "tangent" theories to a more complete theory, which would have quantum mechanics as its asymptotic limit in the microscopic world and classical physics as its asymptotic limit in the macroscopic world. Attractive as this may seem at first sight, we already know of many mathematical difficulties that will arise that way, especially with respect to relativity. So if this is ever the way, it will be a *major* revision of most principles in physics.

There are also "philosophical" views on quantum mechanics, which go a bit in the direction of Copenhagen, but are more sophisticated, and which deny the existence of any objective ontology (not even a classical one). On such a view, quantum mechanics is just a description of what is subjectively experienced, and allows one to build a coherent view of that subjective experience. The relational interpretation goes in that direction.

And finally, there is the "shut up and calculate" attitude, which tells us that all this philosophy doesn't bring in much, that quantum mechanics is a good tool to calculate outcomes of experiments, and that that is good enough. In other words, quantum mechanics is just a mathematical model that seems to do a good job, as is all of physics in the end. One shouldn't give "interpretations" to what is calculated.

A bit in the last direction goes the idea of "emergent properties", which tells us that nature just consists of "Russian dolls" of models, each more or less appropriate for a certain level of phenomena, but that there is no coherent, all-encompassing model which can describe everything - even in principle.
So many phenomena have to be described by quantum mechanics, but at a higher level of "macroscopicity" classical physics emerges, without being *derivable* from the underlying quantum-mechanical model, and without there being a more complete theory which has both behaviours as limiting cases.
 
  • #6
vanesch said:
2) maybe interactions on the macroscopic scale are always such that the superposition of macro-states just becomes one single observable state. After all, these interactions can be quite complicated, and we can't follow all the details. So, it would simply be a result of the complicated interactions that we never end up with crazy superpositions of "cat dead" and "cat alive".

Are you talking about (or hinting at) decoherence here?

vanesch said:
This is also not possible, at least in the current version of quantum mechanics. The reason is the unitarity of the time evolution operator. It comes down to this: if initial state |a> gives rise to "live cat", and initial state |b> gives rise to "dead cat", then it is unavoidable that the state |a> + |b> will give rise to a superposition of dead cat and live cat. No matter how complicated U is.

Assuming the context is "decoherence", isn't this solved by focusing on a subsystem rather than the whole system (in which the evolution is necessarily unitary)? My understanding is that the evolution of the subsystem is not unitary, correct?
 
  • #7
As usual, vanesch, a very helpful and thoughtful answer.

One question. Isn't another view simply that for objects where the de Broglie wavelength is smaller than a Planck length, it is physically impossible to observe interference effects because it is impossible to distinguish between the two superimposed states?

In other words, take a bitmapped image that simulates gray by alternating between black and white pixels column by column (as opposed to in a checkerboard pattern). But say our resolution is such that we can only see every other column. (We have a really crappy down-rezzing algorithm :)). Then we either see solid black or solid white - apparently randomly. We never see gray because our resolution is simply not good enough and by definition never could be.
 
  • #8
peter0302 said:
One question. Isn't another view simply that for objects where the de Broglie wavelength is smaller than a Planck length, it is physically impossible to observe interference effects because it is impossible to distinguish between the two superimposed states?

No, remember that interference and related effects are -in general- NOT something that necessarily gives rise to real fringes or indeed anything "visible". Quantum mechanics is much more general than that, and quantum states do not need to refer to anything "physical"; in general they describe what I guess you could call properties of a given system. Hence, the "border" (if you can call it that) between quantum mechanics and the macroscopic world has little to do with the actual size of an object.

It is e.g. possible to perform interferometry in phase space in a way that is completely analogous to optical interferometry in real space.
See e.g.
http://arxiv.org/abs/cond-mat/0512691
 
  • #9
nanobug said:
Assuming the context is "decoherence", isn't this solved by focusing on a subsystem rather than the whole system (in which the evolution is necessarily unitary)? My understanding is that the evolution of the subsystem is not unitary, correct?

There is a misunderstanding about what decoherence achieves: it doesn't solve the "and/or" problem by itself. The "trick" with the reduced density matrix begs the question, because the statistical interpretation of the diagonal elements of the reduced density matrix ALREADY assumes that one goes from a superposition to a statistical mixture (otherwise, the partial trace over "the rest" wouldn't make any sense: it only makes sense as a "sum over mutually exclusive events").

So decoherence doesn't solve the issue by itself. It HELPS solve the issue if we propose a framework in which the and/or problem IS treated, such as the many worlds interpretation, and it indicates WHY we can take the "superposition -> statistical mixture" transition on "elementary" states without taking into account the complicated interaction with the environment, and moreover why this happens in the "pointer basis".
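
In symbols (my own paraphrase of the point, not notation from the thread): the reduced density matrix of a system A entangled with an environment B is defined by the partial trace

$$\rho_A = \operatorname{Tr}_B\,|\psi\rangle\langle\psi|, \qquad (\rho_A)_{ij} = \sum_k \langle a_i, b_k|\psi\rangle\langle\psi|a_j, b_k\rangle,$$

and reading the diagonal element $(\rho_A)_{ii}$ as "the probability of outcome $a_i$" already treats the sum over $k$ as a sum over mutually exclusive environment outcomes - which is exactly the probability interpretation said to be presupposed here.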
 
  • #10
vanesch said:
So decoherence doesn't solve the issue by itself.

I was specifically addressing your claim that the evolution of the system is always unitary and that this presents a problem for the measurement of "classical" outcomes. I agree with you. However, if one restricts oneself to a subsystem, the evolution is no longer unitary, correct? It is in this case that one can talk about a statistical mixture, which to me appears considerably more 'classical' than pure states.
 
  • #11
nanobug said:
I was specifically addressing your claim that the evolution of the system is always unitary and that this presents a problem for the measurement of "classical" outcomes. I agree with you. However, if one restricts oneself to a subsystem, the evolution is no longer unitary, correct? It is in this case that one can talk about a statistical mixture, which to me appears considerably more 'classical' than pure states.

This "statistical mixture" is obtained by taking the reduced density matrix, which is obtained by taking the density matrix of the overall pure state, and make a partial trace over the other degrees of freedom. But - that's what I said - in order even to give the meaning of a density matrix to this "reduced density matrix", with its probability interpretation, one has to give a meaning to this partial trace. Indeed, there's nothing in quantum mechanics by itself that tells us that by taking a partial trace, one obtains something like a density matrix! This only makes sense if we ALREADY interpret the pure state as a statistical mixture, and then make the sum over the mutually exclusive events that correspond to the same outcome of the subsystem, but different outcomes of the other degrees of freedom.
 
  • #12
You might try reading Feynman's "QED" - he does an excellent job of showing how the probabilities of the quantum world translate into the traditional world of classical physical interaction at the macro level.
 
  • #13
Measurements are almost always uncertain. So, for example, let's measure the width of a piece of paper. I get 7.8", then I get 7.87", and then 7.92", and... This is deterministic?

We don't know what the "real" width is; each measurement is different -- and hardly predictable. Enter the statisticians, who tell us that the best estimate of the width is the average of the measurements. There's at least one way to do better -- the more measurements done, the smaller the standard deviation of the mean becomes, and the more accurate the mean becomes. Still, you never know for sure. As long as those measurement errors are around, certainty is an illusion -- like in Plato's cave.
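
A quick numerical illustration of that statistical point (my own sketch; the "true" width and noise level are made up): the spread of individual measurements stays roughly constant, while the standard error of the mean shrinks like 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(1)
true_width = 7.87   # hypothetical "real" width, inches
sigma = 0.05        # hypothetical measurement noise, inches

for n in (10, 100, 10000):
    samples = true_width + sigma * rng.normal(size=n)
    spread = samples.std(ddof=1)   # stays near sigma
    sem = spread / np.sqrt(n)      # error of the mean, falls as 1/sqrt(n)
    print(n, samples.mean(), spread, sem)
```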

In fact, your visual perception is statistical in nature; a blend of quantum and classical probabilities. Your eyes are in constant motion, scanning the visual field, sometimes guided by other things seen. Sometimes the scanning is random. We see and hear averages, and not the pristine world of classical physics.

Regards,
Reilly Atkinson
 
  • #14
vanesch said:
This only makes sense if we ALREADY interpret the pure state as a statistical mixture

OK, this is the part I am having trouble with.

My understanding of a 'pure state' is that it shows interference (non-zero off-diagonal) terms in a density matrix. Not so with a 'mixed state', in which the off-diagonal terms are zero. As such, how can you interpret the 'pure state' as a 'mixed state'?
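
For concreteness, a minimal sketch of those definitions (my own example, a single two-level system in plain NumPy):

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
psi = (ket0 + ket1) / np.sqrt(2)

rho_pure = np.outer(psi, psi)                                        # [[0.5, 0.5], [0.5, 0.5]]
rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)  # [[0.5, 0.0], [0.0, 0.5]]

# The 0.5 off-diagonal entries of rho_pure are the interference terms; the mixture has none.
# The purity Tr(rho^2) also separates them: 1.0 for the pure state, 0.5 for this mixture.
print(np.trace(rho_pure @ rho_pure), np.trace(rho_mixed @ rho_mixed))
```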
 
  • #15
reilly said:
Measurements are almost always uncertain. So, for example, let's measure the width of a piece of paper. I get 7.8", then I get 7.87", and then 7.92", and... This is deterministic? We don't know what the "real" width is; each measurement is different -- and hardly predictable.
This explanation fits nicely with a possible definition of science, e.g.--science = uncertain knowledge of what is real.
 
  • #16
nanobug said:
OK, this is the part I am having trouble with.

My understanding of a 'pure state' is that it shows interference (non-zero off-diagonal) terms in a density matrix. Not so with a 'mixed state', in which the off-diagonal terms are zero. As such, how can you interpret the 'pure state' as a 'mixed state'?

Well, that's exactly the problem: you have to give a probability interpretation in one way or another to the diagonal elements of the density matrix (the overall density matrix) even before you can consider the reduced density matrix of the subsystem as "a density matrix".

In other words, taking the reduced density matrix by itself doesn't solve the and/or problem: you have to have solved it already before this matrix has a meaning!
 
  • #17
vanesch said:
Well, that's exactly the problem: you have to give a probability interpretation in one way or another to the diagonal elements of the density matrix (the overall density matrix) even before you can consider the reduced density matrix of the subsystem as "a density matrix". In other words, taking the reduced density matrix by itself doesn't solve the and/or problem: you have to have solved it already before this matrix has a meaning!

This is a good point but I am not sure it's the only possibility.

Interpreting the diagonal elements of the density matrix as probabilities relies, it would seem to me, on the idea of 'collapse' of the wave function. But this is exactly what decoherence is trying to explain, i.e., that (apparent) collapse. As such, it appears to me that the meaning of the density matrix could be obtained a posteriori from what we get from the reduced density matrix (which is the stuff that we actually measure in the real world). Wouldn't this way of looking at things solve the problem you presented?
 
  • #18
vanesch said:
Well, that's exactly the problem: you have to give a probability interpretation in one way or another to the diagonal elements of the density matrix (the overall density matrix) even before you can consider the reduced density matrix of the subsystem as "a density matrix".

In other words, taking the reduced density matrix by itself doesn't solve the and/or problem: you have to have solved it already before this matrix has a meaning!

Ok, but at least there is a consistency -- you give a prob. interpretation to the diagonal elements of the overall density matrix, and you end up interpreting the reduced density matrix as a 'density matrix'.
 
  • #19
gptejms said:
Ok, but at least there is a consistency -- you give a prob. interpretation to the diagonal elements of the overall density matrix, and you end up interpreting the reduced density matrix as a 'density matrix'.

Sure! Of course, *once* you give a prob. interpretation to the overall density matrix, then the reduced density matrix also has a sense as "density matrix" with a probability interpretation, and even helps you understand why there are no *visible* interference effects anymore.

But sometimes people claim that decoherence gives an explanation for the probability interpretation itself, and that's not true because you need it already before.
 
  • #20
vanesch said:
Sure! Of course, *once* you give a prob. interpretation to the overall density matrix, then the reduced density matrix also has a sense as "density matrix" with a probability interpretation, and even helps you understand why there are no *visible* interference effects anymore.

But this tells you that the prob. interpretation is a good one to start with--as a result of this the reduced density matrix can be interpreted as a 'density matrix' plus you knock off (visible) interference effects--not a bad bargain!
 
  • #21
gptejms said:
But this tells you that the prob. interpretation is a good one to start with--as a result of this the reduced density matrix can be interpreted as a 'density matrix' plus you knock off (visible) interference effects--not a bad bargain!

Of course! We all know that in one way or another, the probability interpretation is correct! It is the way one verifies quantum mechanics against experiments. The only question is how it comes about! Now if people say it comes about *because of decoherence*, then that's wrong. But if we have *another* way of explaining the occurrence of probabilities (as "experience of worlds" or "nonlinearities and projection" or even the "transit from quantum to classical"), then decoherence helps make the story "coherent", as you point out.
 
  • #22
vanesch said:
Of course! We all know that in one way or another, the probability interpretation is correct! It is the way one verifies quantum mechanics against experiments. The only question is how it comes about! Now if people say it comes about *because of decoherence*, then that's wrong. But if we have *another* way of explaining the occurrence of probabilities (as "experience of worlds" or "nonlinearities and projection" or even the "transit from quantum to classical"), then decoherence helps make the story "coherent", as you point out.

Agreed -- decoherence only helps make the story coherent. But I feel *another* way of really 'explaining' the occurrence of probabilities will remain a distant dream -- we have to contend with the fact that this is as far as physics gets you!
 
  • #23
vanesch said:
Of course, *once* you give a prob. interpretation to the overall density matrix, then the reduced density matrix also has a sense as "density matrix" with a probability interpretation

Once again, I disagree.

The probability interpretation is an operational interpretation, i.e., made to match measurements. However, measurements always depend on the process of decoherence. It is therefore the process of decoherence that forces one to operationally interpret the reduced density matrix as 'probabilities'. After all, the full density matrix is inaccessible to measurement, and only through the window of the reduced density matrix, via the process of decoherence, may one peek inside.

In other words, it seems to me that technically it is not necessary to postulate a probabilistic interpretation of the density matrix a priori. That insight can be gained a posteriori after one goes through decoherence via the reduced density matrix.
 
  • #24
peter0302 said:
In other words, take a bitmapped image that simulates gray by alternating between black and white pixels column by column (as opposed to in a checkerboard pattern). But say our resolution is such that we can only see every other column. (We have a really crappy down-rezzing algorithm :)). Then we either see solid black or solid white - apparently randomly. We never see gray because our resolution is simply not good enough and by definition never could be.

Haha, check this out. Look at it, then move back 5-6 meters and look at it :) . Weird.

http://minutillo.com/steve/weblog/images/hybrid-face.jpg

Thanks for all the answers, guys.

I know that one of the interpretations of Bell's theorem experiments is that logic isn't valid (correct me on this one).

Why would one say that? Why not just say: logic at the quantum level is different from the logic at the macroscopic level?
 
  • #25
Nick666 said:
Haha, check this out. Look at it, then move back 5-6 meters and look at it :) . Weird.

http://minutillo.com/steve/weblog/images/hybrid-face.jpg

Thanks for all the answers, guys.

I know that one of the interpretations of Bell's theorem experiments is that logic isn't valid (correct me on this one).

Why would one say that? Why not just say: logic at the quantum level is different from the logic at the macroscopic level?

Be careful not to totally mix things up: optical illusions have nothing to do with Bell's theorem or with logic.
 
  • #26
nanobug said:
Once again, I disagree.

The probability interpretation is an operational interpretation, i.e., made to match measurements. However, measurements always depend on the process of decoherence. It is therefore the process of decoherence that forces one to operationally interpret the reduced density matrix as 'probabilities'. After all, the full density matrix is inaccessible to measurement, and only through the window of the reduced density matrix, via the process of decoherence, may one peek inside.

Sure. But...

In other words, it seems to me that technically it is not necessary to postulate a probabilistic interpretation of the density matrix a priori. That insight can be gained a posteriori after one goes through decoherence via the reduced density matrix.

Yes, but the *very definition* of the reduced density matrix relies on the probabilistic interpretation of the overall wavefunction.

Imagine the system and the environment as systems A and B.

Now, at a certain point, we will have as an overall wavefunction:

|psi> = a |a1>|b1> + b |a2>|b2> + c |a3>|b3> + ...

Now, consider a "measurement basis" for the environment: {|c1>, |c2> ...}.

This means that we can express |b1> in this basis, |b2> in this basis etc...

|psi> = a |a1> {c11 |c1> + c12 |c2> + ... } + b |a2> { c21 |c1> + c22 |c2> + ... } + ...

Now, consider that we do a FULL measurement on this state, in the basis {|a1>|c1>, |a1>|c2>, ..., |a2>|c1>, ...}. That is, we consider that we do a measurement on the system as well as on the environment.

We now USE the probability interpretation, which tells us that we will find, with probability |a c11|^2, that the outcome is |a1>|c1>, with probability |a c12|^2 that the outcome is |a1> |c2> etc...

These are mutually exclusive results. But consider now that we only calculate the probability to have |a1>, no matter what the environment is in. In that case, we SUM the probabilities over the mutually exclusive events which satisfy the wanted outcome |a1>, so we sum |a c11|^2 + |a c12|^2 + ...

Well, this sum is nothing else but the (1,1) diagonal element of the reduced density matrix. In the same way, the probability for |a2> will be:
|b c21|^2 + |b c22|^2 + ...
which is nothing else but the (2,2) diagonal element of the reduced density matrix.

But in order to call this number "the probability to have outcome |a2>, no matter what state the environment is in", we already needed to interpret the expansion of |psi> in a probabilistic way, so that we could make the sum over probabilities of exclusive events.

Now, the off-diagonal element (1,2) of the reduced density matrix will be:
a b* (c11 c21* + c12 c22* + ...), which will reduce to 0 if |b1> and |b2> are orthogonal. THIS is what decoherence tells us: that the environment states that entangle with the system states in a measurement will essentially be orthogonal and remain so.

As such, we can interpret the reduced density matrix as the "density matrix" of a statistical mixture over the system alone, which has only diagonal elements, and these can be interpreted as probabilities (and hence this reduced density matrix behaves "well" as a density matrix).
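
Here is the same computation done numerically (my own sketch, shrunk to a single system qubit A and a one-qubit "environment" B with orthogonal |b1>, |b2>):

```python
import numpy as np

# |psi> = a|a1>|b1> + b|a2>|b2>, with orthogonal environment states |b1>, |b2>.
a, b = np.sqrt(0.3), np.sqrt(0.7)
a1, a2 = np.eye(2)   # system basis vectors
b1, b2 = np.eye(2)   # environment basis vectors

psi = a * np.kron(a1, b1) + b * np.kron(a2, b2)
rho = np.outer(psi, psi)

# Partial trace over the environment: rho_A = sum_k <b_k| rho |b_k>.
rho_A = np.zeros((2, 2))
for bk in (b1, b2):
    P = np.kron(np.eye(2), bk.reshape(2, 1))  # embeds an A-vector at environment state |b_k>
    rho_A += P.T @ rho @ P

print(rho_A)
# [[0.3 0. ]
#  [0.  0.7]] -- diagonal elements are the probabilities; off-diagonals vanish since <b1|b2> = 0
```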
 
  • #27
So far not.
 
  • #28
Nice picture, Nick. :)

Have to chime in again. The fact is that an interference pattern from an object with a sufficiently small de Broglie wavelength will be indistinguishable from a Gaussian pattern. If you did a double slit experiment with bullets, but assumed quantum effects applied, you could, in theory, map out an interference pattern. However, the fringes would be so close together that you could never actually observe them in practice.

Why do we assume the bullets are not exhibiting quantum effects merely because we cannot see them?
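
To put rough numbers on that (my own back-of-envelope sketch; the bullet and slit parameters are made up):

```python
# de Broglie wavelength of a bullet, and the double-slit fringe spacing it would produce.
h = 6.626e-34     # Planck constant, J*s
m = 0.010         # bullet mass, kg (hypothetical 10 g bullet)
v = 400.0         # bullet speed, m/s (hypothetical)

lam = h / (m * v)
print(lam)          # ~1.7e-34 m, about 24 orders of magnitude below an atomic diameter

L = 10.0            # slit-to-screen distance, m (hypothetical)
d = 1e-3            # slit separation, m (hypothetical)
print(lam * L / d)  # fringe spacing ~1.7e-30 m: hopelessly unresolvable
```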
 
  • #29
vanesch said:
Yes, but the *very definition* of the reduced density matrix relies on the probabilistic interpretation of the overall wavefunction.

I think we are essentially in agreement with the exception of the following "twist": what I am suggesting is that by interpreting the reduced density matrix in a probabilistic way (given that this is what we measure and what we measure "forces" the probabilistic interpretation on us) we may then work "backwards" and conclude that the overall wave function has to be also interpreted in a probabilistic way for things to be self-consistent.

In the form of a question: if the reduced density matrix is given the probabilistic interpretation is it possible for the full wave function to be interpreted in any way other than probabilistic and keep the formalism consistent?
 
  • #30
nanobug said:
I think we are essentially in agreement with the exception of the following "twist": what I am suggesting is that by interpreting the reduced density matrix in a probabilistic way (given that this is what we measure and what we measure "forces" the probabilistic interpretation on us) we may then work "backwards" and conclude that the overall wave function has to be also interpreted in a probabilistic way for things to be self-consistent.

Ah, yes, that's a way to see things.

In the form of a question: if the reduced density matrix is given the probabilistic interpretation is it possible for the full wave function to be interpreted in any way other than probabilistic and keep the formalism consistent?

Probably not. It rings a bell called Gleason's theorem, but I don't know to what extent this is air-tight.
 
  • #31
Is it incorrect to state that the Schroedinger's Cat thought experiment is a metaphor? My understanding is that "one can therefore assert that a quantum superposition of macroscopic states is never produced in reality" (Roland Omnes, Understanding Quantum Mechanics, discussing decoherence), and therefore a macroscopic description of a quantum superposition is most likely imperfect. While Schroedinger's Cat is useful, isn't it "just" a necessarily imperfect description of the micro realm, using a metaphor to describe a situation that our "classical" realm brains can attempt to appreciate? Is this an incorrect position?
 
  • #32
Sample1 said:
Is it incorrect to state that the Schroedinger's Cat thought experiment is a metaphor? My understanding is that "one can therefore assert that a quantum superposition of macroscopic states is never produced in reality" (Roland Omnes, Understanding Quantum Mechanics, discussing decoherence), and therefore a macroscopic description of a quantum superposition is most likely imperfect. While Schroedinger's Cat is useful, isn't it "just" a necessarily imperfect description of the micro realm, using a metaphor to describe a situation that our "classical" realm brains can attempt to appreciate? Is this an incorrect position?

Actually, someone once told me that it was more of a wry observation by Schrödinger, who had a hard time coming to terms with quantum indeterminacy; his demonstration was meant simultaneously to show what was happening in the quantum world and to poke a little fun at the seemingly absurd notion of superposition.

That said, yes, it is a thought experiment, and no, it shouldn't be applied to classical setups; but, as you say, it is a good analogy for what is going on at the quantum level, because we don't necessarily have the right language to explain it precisely in quantum terms. And not everyone is versed in the complexity of the mathematics, nor can they immediately grasp such mathematical concepts.
 
  • #33
Sample1 said:
One can therefore assert that a quantum superposition of macroscopic states is never produced in reality

That is not correct. One can formalise that statement into something called the "Leggett criteria", which can be used to "test" for macroscopic quantum coherence. Over the past 20 years or so many systems have been shown to fulfill them, i.e. quantum superpositions of macroscopic states DO exist.
The most obvious demonstration of this is solid-state qubits, which are obviously macroscopic and still exhibit quantum coherence (which is why they are called qubits).
However, the earliest example was -as far as I know- the demonstration of macroscopic quantum tunnelling in Josephson junctions in the early eighties (1983-1984?), which ALSO fulfills the Leggett criteria.

There is a long discussion about this in Takagi's book
"Macroscopic Quantum Tunneling ", £22 from Amazon.
 
  • #34
Over the past 20 years or so many systems have been shown to fulfill them, i.e. quantum superpositions of macroscopic states DO exist.

The full quote is: One can therefore assert that a quantum superposition of macroscopic states is never produced in reality. Decoherence is waiting to destroy them before they can occur. The same is true for Schroedinger's cat: the stakes are put down as soon as a decay has been detected by the devilish device, before the poison phial is broken. The cat is only a wretched spectator. (Roland Omnes, Understanding Quantum Mechanics)

f95toli: I was aware of solid-state qubits. I should have emphasized that I am referring to a complex macroscopic system, like a living cat.

Does that help? I will check out Takagi's book, thanks.

Schroedinger's Dog, thanks for your reply.
 
  • #35
The only place a cat can be alive and dead at the same time is in the human mind. We can imagine a world that does not and cannot exist in whatever passes for human reality. Further, the Schrodinger cat scenario has nothing to do with the details or nuances of quantum theory; instead it has to do with the basic nature of applied probability theory.

The state of the cat is conditional on the state of the killing device. This device could be triggered by some nuclear decay, or by whether the Red Sox won on a particular day by the score of 3 to 1. It's all about conditional probability, which, of course, is purely in your mind. You know that if the killing device is on, the cat is dead. If the device is off, the cat is alive. But common sense and experience say that in Nature we never observe a cat that is simultaneously alive and dead. Why invent such a mythical creature? It's simply: if A then B; if not A then not B. Why invent such a thing, one that is "not B and B" all at once? Occam says a cat is alive or dead, but not both.

QM is indeed odd, but not as odd as a simultaneously alive and dead creature.

Regards,
Reilly Atkinson
 
