Schrödinger's Cat Explained: Understanding the Famous Thought Experiment

  • Thread starter: jhe1984
Summary
Schrödinger's Cat is a thought experiment illustrating quantum superposition, where a cat in a box can be both alive and dead until observed. The confusion arises from the idea that, unlike classical objects, quantum entities can exist in multiple states simultaneously. Opening the box collapses this superposition into one observable state, either alive or dead, according to the Copenhagen Interpretation. This paradox highlights the difference between quantum mechanics and classical physics, as macroscopic objects like cats do not exhibit superposition in practice due to decoherence. The discussion emphasizes the implications of measurement and observation in quantum theory, raising questions about reality and existence.
  • #31
ZapperZ said:
I think your explanation here isn't correct.

The metaphor is that before you open the box, the state of the cat is in a superposition of both alive and dead. This metaphor is an illustration of the superposition of orthogonal states of a wavefunction, such as

\Psi = a|u> + b|v>

The difference between this and our classical world is that, in our classical universe, the cat is either dead or alive. It cannot exist in a superposition of two very distinct states.

The act of opening the box "collapses" (if you buy the Copenhagen Interpretation) the state (i.e. you now make a measurement) so that now, the cat is unambiguously determined to be dead or alive. You now have either a |u> or a |v> state and no longer the superposition of the two.

Zz.


In fact, in classical probability theory, it's quite possible, indeed common in many fields, to work with superimposed states. The standard definition of a random variable is virtually identical to the spectral representation of the average of operators in QM. That is, if P(x) is the probability for the random variable X to have the value x, then its average is given by:


\langle X \rangle = \sum_{x} x \, P(x)

Looks like superposition to me. (Further, off-diagonal elements come into play when dynamics is considered -- think density matrix.) It is strictly a matter of interpretation. The simplest interpretation, in my humble view, is that superposition is just a mathematical way to finesse some of the issues raised by probability -- such as the notion of having different values of the same variable at the same time. If you really believe that Schrödinger's cat is both alive and dead, then I've got a bridge, and a sky hook, for sale that you might want to examine.
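A minimal numerical sketch of the point, in case it helps. (The two-outcome "alive/dead" values and the amplitudes below are invented purely for illustration.) The classical mixture reproduces the weighted average above; the only place the quantum pure state differs is in the off-diagonal (coherence) terms of the density matrix:

Code:
import numpy as np

# Two basis states for the two outcomes.
alive = np.array([1.0, 0.0])
dead = np.array([0.0, 1.0])

# Classical description: a probability distribution over outcomes,
# with <X> = sum_x x * P(x)  (say X = +1 for alive, -1 for dead).
values = {"alive": +1.0, "dead": -1.0}
probs = {"alive": 0.5, "dead": 0.5}
classical_average = sum(values[s] * probs[s] for s in values)
print("classical average:", classical_average)  # 0.0

# Quantum description: a superposition a|alive> + b|dead>.
a, b = 1 / np.sqrt(2), 1 / np.sqrt(2)
psi = a * alive + b * dead

# Compare density matrices: the pure state has off-diagonal terms,
# the classical mixture does not -- that is where interference lives.
rho_pure = np.outer(psi, psi.conj())
rho_mixed = probs["alive"] * np.outer(alive, alive) + probs["dead"] * np.outer(dead, dead)
print(rho_pure)   # [[0.5, 0.5], [0.5, 0.5]]
print(rho_mixed)  # [[0.5, 0.0], [0.0, 0.5]]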

The superposition is strictly in your head; not necessarily actually in nature. Ditto for the collapse. Superposition is a mathematical concept, a highly important tool in many areas of science and mathematics. It's a concept that helps explain nature; it is a name, like electron, or Joe, or... The name and the bearer of the name are not the same. At some point in our education, we are taught that words are not exact; they always carry ambiguity, just like our metaphors of superposition, energy conservation, ... We use words to help us work with our math, the 'real' language of physics -- using the inexact to help explain the greatly-more-exact can be quite dicey.

So it goes,
Reilly Atkinson
 
  • #32
reilly said:
In fact, in classical probability theory, it's quite possible, indeed common in many fields, to work with superimposed states. The standard definition of a random variable is virtually identical to the spectral representation of the average of operators in QM. That is, if P(x) is the probability for the random variable X to have the value x, then its average is given by:


\langle X \rangle = \sum_{x} x \, P(x)

Looks like superposition to me. (Further, off-diagonal elements come into play when dynamics is considered -- think density matrix.) It is strictly a matter of interpretation. The simplest interpretation, in my humble view, is that superposition is just a mathematical way to finesse some of the issues raised by probability -- such as the notion of having different values of the same variable at the same time. If you really believe that Schrödinger's cat is both alive and dead, then I've got a bridge, and a sky hook, for sale that you might want to examine.

The superposition is strictly in your head; not necessarily actually in nature. Ditto for the collapse. Superposition is a mathematical concept, a highly important tool in many areas of science and mathematics. It's a concept that helps explain nature; it is a name, like electron, or Joe, or... The name and the bearer of the name are not the same. At some point in our education, we are taught that words are not exact; they always carry ambiguity, just like our metaphors of superposition, energy conservation, ... We use words to help us work with our math, the 'real' language of physics -- using the inexact to help explain the greatly-more-exact can be quite dicey.

So it goes,
Reilly Atkinson

You are forgetting one important fact here. If such "superposition" were that common in classical statistics and physics, Schrödinger would not have gone to such lengths to demonstrate its strangeness as implied in QM, and physicists would not have continued to study it and produce non-classical results (see the Delft/Stony Brook experiments).

Besides, are you claiming that the weighting function P(x) you have in your statistical sum has the same physical representation as the basis functions in QM?

I have a completely different take on this than the "standard" QM, but that isn't what I used in trying to explain this on here. If we want to hijack this thread and argue about what is meant by a "measurement", then fine, let's go at it. But when someone asks something like this, it would be irresponsible if I answered not according to the way the Cat problem is "conventionally" interpreted, but rather in the way *I* would prefer.

Zz.
 
  • #33
Forgetting? I think not. If I recall, Schrödinger, by his own account, was working through his confusion about QM re: Born, Heisenberg, Bohr, et al. What is important is that we know a lot more about everything than did our QM founding fathers -- and I'm thinking of statistics and probability and neural science in particular. Goodness, any part of physics, for example, that deals with vectors, finite or infinite in dimension, involves superposition -- so we started our physics life with superposition in high school or freshman physics: rowing boats across rivers with currents, circularly polarized light, ...

In regard to probability and statistics, prior to the advent of today's powerful computers, we were severely constrained in our ability to do any but the most modest-sized problems. Prior to 10-15 years ago, stat and probability were (necessarily) quite theoretical, and quite hard to understand. And, of course, that situation was all the more pronounced in the 1920s and 1930s -- so to speak, they didn't know from.

During the 1970s and 1980s, folks like me, Ph.D.s in quantitative fields, dominated the high end of business consulting -- we used old mainframes and, my favorite, VAXes, and so on. Now we have MBAs and BAs able to do much of what I and my colleagues used to do, simply as a result of amazing computing power and remarkable software, often now called Data Mining software, put out by outfits like COGNOS (now IBM), Microsoft, ... (I and my colleagues have been, to use a phrase I don't like much, outmoded.) Schwinger complained that Feynman had brought field theory to the masses. Us 65-and-older PhDs complain that computers and software have brought statistical analysis to the masses. (I'm getting there, and please excuse my sloppy grammar: "Us ...")

I find that many younger analysts, some trained in physics, have virtually no issues with probability or statistical interpretation; that's just stuff you do. Think, for example, about Newton's day -- calculus was new, controversial with some, and created a long examination of the idea of infinitesimals -- some still worry about these matters. But most of us just go ahead and do it. In regard to interpretive history, many notions have met strong resistance on the part of some: calculus, transfinite numbers, the standard working-physicist's interpretation of QM. The historical trend is very clear: resistance to successful new ideas diminishes slowly, but it does indeed diminish.

Those of us who have worked with probability and statistics on practical problems do not have any interpretive problems -- a factor analysis is a factor analysis, even, God forbid, if we don't have a normal distribution; regression is regression; Type I and Type II errors are just that. When you use this stuff (I'll find a better word someday) you get familiar, and, to put it bluntly, you get past the beginner's stage fairly quickly. And, with all due respect, many physicists are beginners -- they may know theory, measures, and convergence in the mean, ... -- but they typically do not have any experience with the great realm of statistical techniques.

As we know as teachers and students, homework is essential.
When you have done a hundred regression models, hundreds of survey analyses, a hundred sales forecasts, ... you pick up a very practical approach to what you are doing -- which includes writing and delivering reports which must be intelligible to managers who know little or nothing about statistics, but know a lot about their business. My claim simply is: if theoreticians spent a couple of years doing basic statistical work for businesses or other organizations, there would be virtually no controversy over the interpretation of QM. Again, with all due respect, the physics community is largely amateurish in its use of and thinking about probability and statistics -- I'm far from the only one who has said this.

You have to do it to understand it.

There's something us old guys call the "JCL trick." JCL, IBM's Job Control Language, was the way you got your programs running on mainframes. JCL was job protection for system programmers; JCL was tricky, went on forever, and was hard to learn. Thus the first thing to do at a new job was to ask, "Who's the JCL guy?" IBM and the programmers made a Faustian bargain to keep things purposefully complicated.

Sometimes I think the JCL trick is, in effect, a major reason why the QM interpretive controversy continues. That is, job protection. If the controversy is resolved, that's one less area in which to work. Plus, for some, the controversy is fun, great entertainment, and goes on forever. Otherwise, I cannot understand why the physics community, for the most part, makes the interpretation of QM so complicated. During my grad student days, my professors just hammered on simplicity: don't crack a hardboiled egg with a pile driver (Goldstein, Classical Mechanics). In my view the simple and correct way to interpret QM is as a theory about probability, and to deal with the interpretation as if you are, say, handicapping a horse race, choosing stocks, or wondering when your teenager will get home on Friday night.

A bet I'll not live to win is that in 20 or 30 years, the debate about QM and its interpretation will be a thing of the past, and my side will win. History bears me out: anybody know anyone who's into prime movers, or the divine right of kings, ...?

Throughout history, pragmatism ultimately prevails. (My reference here is virtually any history book, and particularly Daniel Boorstin's magnificent "The Discoverers", a history of science, technology, and exploration, covering pre-history to the present, beautifully written, nicely illustrated, and, in a subtle way, very profound. You can see some of why I think the way I do in this book.)

My response went out of control again -- but then, nobody really needs to read this.

One of my favorite examples of changing intellectual frameworks is just starting to happen. That is, physicists like the brilliant Roger Penrose just love to make the workings of the human mind's consciousness exceedingly complex, invoking quantum gravity, and ... (Job protection?)

Then there are physicists like Francis Crick of DNA fame who think quite the opposite. (See "The Astonishing Hypothesis".) He takes the view that consciousness is simply an aggregate of brain function. That is, as neuroscientists pursue the workings of the brain, we will understand consciousness as a consequence of the neural processing of perceptual signals from both outside and inside. (I like the idea of virtual homunculi as an aid, a metaphor for how we see or hear or think. I believe this is similar to work that Marvin Minsky published some years ago; his magnum opus on the mind -- "The Society of Mind" is the title.)
 
  • #34
Sorry about the duplication. ra
 
Last edited:
  • #35
Then I'd like you to write a rebuttal to the Stony Brook Nature paper using your interpretation of the result.

Zz.
 
  • #36
ZapperZ said:
I think your explanation here isn't correct.

The metaphor is that before you open the box, the state of the cat is in a superposition of both alive and dead. This metaphor is an illustration of the superposition of orthogonal states of a wavefunction, such as

\Psi = a|u> + b|v>

The difference between this and our classical world is that, in our classical universe, the cat is either dead or alive. It cannot exist in a superposition of two very distinct states.

The act of opening the box "collapses" (if you buy the Copenhagen Interpretation) the state (i.e. you now make a measurement) so that now, the cat is unambiguously determined to be dead or alive. You now have either a |u> or a |v> state and no longer the superposition of the two.

Zz.

I'm sorry to barge in, but can the whole premise here be explained by using the term "potential" rather than "superposition"? I've been trying to determine where potential fits into quantum theory.

The mechanical set-up of the box with a cat inside it, plus the two probabilities of dead or alive, are the components of the potential observed. A discontinued or continued life will be the emergent result of the potentials found in Schrödinger's metaphorical box.
 
  • #37
quantumcarl said:
I'm sorry to barge in, but can the whole premise here be explained by using the term "potential" rather than "superposition"? I've been trying to determine where potential fits into quantum theory.

The mechanical set-up of the box with a cat inside it, plus the two probabilities of dead or alive, are the components of the potential observed. A discontinued or continued life will be the emergent result of the potentials found in Schrödinger's metaphorical box.

Please note that the word "potential" in physics has specific meaning. Look up, for example, the meaning of an electrostatic potential. This is a clear example where words and phrases in physics have clear, underlying mathematical definitions. You simply cannot use a pedestrian definition of a word and apply it in physics.

Zz.
 
  • #38
ZapperZ said:
Please note that the word "potential" in physics has specific meaning. Look up, for example, the meaning of an electrostatic potential. This is a clear example where words and phrases in physics have clear, underlying mathematical definitions. You simply cannot use a pedestrian definition of a word and apply it in physics.

Zz.

Thanks dude!

I also question the terminology in the slit experiment, in an unauthorized way, because saying "a photon through a slit" would be like saying "a millimeter through a slit"... wouldn't it?

I'll have to look up "photon" too since I seem to think it is simply a term of measurement... in my draconian and pedestrian way!

EDIT: I had to go to the source of the word photon, Gilbert N. Lewis, 1926. The guy almost sounds as draconian as I do.

G.N.Lewis said:
I therefore take the liberty of proposing for this hypothetical new atom, which is not light but plays an essential part in every process of radiation, the name photon.

and

G.N.Lewis said:
...a new type of atom, an identifiable entity, uncreatable and indestructible, which acts as the carrier of radiant energy

Thanks!
 
Last edited:
  • #39
Let me bring another example for Reilly's POV: assume an airplane traveling from Boston to Los Angeles, spanning 10 states, has an equal 10% probability of crashing in each state, if it crashes at all. Now assume you hear on 9/11 that some such plane has crashed. You will form a model of the existence of the crashed plane in 10 different states, each with probability 10%. The crash could only have happened in one of these 10 states, and the probability distribution is uniform. The crash now exists (is classically superposed) in 10 localities. The next news flash is that it crashed in Pennsylvania. That immediately "collapses" the mental model, and the crash is now fully and instantaneously "realized" in one location. However, of course, quantum superposition does result in observed interference, unlike the classical case. But that does not stop one from interpreting and explaining classical probabilities, and the emergence of a definite probabilistic outcome, in terms of a "collapse", IMO.
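A tiny sketch of that classical "collapse" as nothing more than conditioning a probability distribution on news (the list of states is invented for the example; only the arithmetic matters):

Code:
# Classical "collapse": update a probability distribution on new information.
states = ["MA", "CT", "NY", "PA", "OH", "IN", "IL", "MO", "NV", "CA"]  # invented route
prior = {s: 0.10 for s in states}  # equal 10% per state, as in the example

def condition(dist, still_possible):
    """Zero out the states ruled out by the news, then renormalize."""
    posterior = {s: (p if s in still_possible else 0.0) for s, p in dist.items()}
    total = sum(posterior.values())
    return {s: p / total for s, p in posterior.items()}

# News flash: it crashed in Pennsylvania.
posterior = condition(prior, still_possible={"PA"})
print(posterior["PA"])  # 1.0 -- the distribution "collapses" to one locality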

reilly said:
There's something us old guys call the "JCL trick." JCL, IBM's Job Control Language, was the way you got your programs running on mainframes. JCL was job protection for system programmers; JCL was tricky, went on forever, and was hard to learn. Thus the first thing to do at a new job was to ask, "Who's the JCL guy?" IBM and the programmers made a Faustian bargain to keep things purposefully complicated.

Sounds like the attempts of the programming community to develop the C++ extensions to the C programming language. It started as a bona fide, praiseworthy, and necessary attempt to make C object-oriented. It resulted in several layers of irrelevant conventions dealing with code management, new sources of confusion and bureaucracy, ambiguity, and severe time and space inefficiency -- and created a cadre of mandarins who alone could decipher and interpret the new structures and conventions. What was gained was questionable, and could have been gained in another, more rational and meaningful manner. Then the suppliers (such as Microsoft) and the mandarins started pushing for it, and thus created layers of job security and supplier security for themselves.

reilly said:
Then there are physicists like Francis Crick of DNA fame who think quite the opposite. (See "The Astonishing Hypothesis".) He takes the view that consciousness is simply an aggregate of brain function. That is, as neuroscientists pursue the workings of the brain, we will understand consciousness as a consequence of the neural processing of perceptual signals from both outside and inside. (I like the idea of virtual homunculi as an aid, a metaphor for how we see or hear or think. I believe this is similar to work that Marvin Minsky published some years ago; his magnum opus on the mind -- "The Society of Mind" is the title.)

Penrose's attempt to mystify consciousness is not doing anybody a favor. Crick and Dennett have much more coherent, simple, and plausible explanations for it, true to Occam's razor, and there is no reason to make it inaccessible. Such attempts to obscure hard-to-understand issues are often politically or economically (job-related) driven, and have nothing to do with the need to enlighten and discover. Just look at the clerics of almost any religion, whose job is to obscure and mystify philosophic and metaphysical issues such as existence and the mind-body problem, in order to gain politically and economically, at the expense of enlightenment, reason, discovery, and progress. You see this drive to obscure in science and in engineering, by your local car mechanic or electronics salesman, and, most pronounced of all, by the practitioners of religion!
 
  • #40
zekise said:
Let me bring another example for Reilly's POV: assume an airplane traveling from Boston to Los Angeles, spanning 10 states, has an equal 10% probability of crashing in each state, if it crashes at all. Now assume you hear on 9/11 that some such plane has crashed. You will form a model of the existence of the crashed plane in 10 different states, each with probability 10%. The crash could only have happened in one of these 10 states, and the probability distribution is uniform. The crash now exists (is classically superposed) in 10 localities. The next news flash is that it crashed in Pennsylvania. That immediately "collapses" the mental model, and the crash is now fully and instantaneously "realized" in one location. However, of course, quantum superposition does result in observed interference, unlike the classical case. But that does not stop one from interpreting and explaining classical probabilities, and the emergence of a definite probabilistic outcome, in terms of a "collapse", IMO.

Which reminds me: when a potential produces a result, it collapses. The energy in the potential is transferred to your observation, much as is seen in an electrostatic potential, where energy can only be measured by its expenditure or transformation.
 
Last edited:
  • #41
quantumcarl said:
Which reminds me: when a potential produces a result, it collapses. The energy in the potential is transferred to your observation, much as is seen in an electrostatic potential, where energy can only be measured by its expenditure or transformation.

You just haven't learned, have you?

What is this "potential" that you keep referring to? It certainly isn't the same potential that we use in physics from the way you are using it. Describe to me explicitly this mechanism in which a potential produces a collapse. Then apply it to a specific example, such as the measurement of the polarization of a photon for instance.

Zz.
 
  • #42
zekise said:
Let me bring another example for Reilly's POV: assume an airplane traveling from Boston to Los Angeles, spanning 10 states, has an equal 10% probability of crashing in each state, if it crashes at all. Now assume you hear on 9/11 that some such plane has crashed. You will form a model of the existence of the crashed plane in 10 different states, each with probability 10%. The crash could only have happened in one of these 10 states, and the probability distribution is uniform. The crash now exists (is classically superposed) in 10 localities. The next news flash is that it crashed in Pennsylvania. That immediately "collapses" the mental model, and the crash is now fully and instantaneously "realized" in one location. However, of course, quantum superposition does result in observed interference, unlike the classical case. But that does not stop one from interpreting and explaining classical probabilities, and the emergence of a definite probabilistic outcome, in terms of a "collapse", IMO.




For the moment, I only have time to say: "Right on!" Great post. Thanks.
Regards,
Reilly

(More later.)
 
  • #43
ZapperZ said:
You just haven't learned, have you?

What is this "potential" that you keep referring to? It certainly isn't the same potential that we use in physics from the way you are using it. Describe to me explicitly this mechanism in which a potential produces a collapse. Then apply it to a specific example, such as the measurement of the polarization of a photon for instance.

Zz.

Personally I don't learn unless I make mistakes. I seem to be "learning" a lot here! Agh.

Anyway... I'm not saying a potential collapses; I'm saying the energy contained in a potential transfers to an end result upon observation... for instance, the observation of a dead or living cat. So, in this case, a "final" outcome can only be determined by observation, and this observation seems to take the blame for the collapse of the potential that existed before the observation.

With regard to "the measurement of the polarization of a photon" I think you're showing me another mistake I've made. Or just how much I'm learning! Thanks!

Now consider two entangled photons with random but identical polarization. Alice and Bob each have one half of the pair of entangled photons. When Alice measures her photon its original indefinite state vanishes, yielding a result of either ‘horizontal’ or ‘vertical’ polarization, each with a 50% possibility of occurring. It is now known that Bob’s photon will, upon measurement, have the same polarization as Alice’s, despite previously having the same probability of getting one or the other. It is as though Bob’s photon has been magically influenced by Alice’s. Most remarkably, the distance between Alice and Bob is irrelevant to the result. The entangled state specifies only that a measurement will find the two polarizations are equal. Entanglement is often known as EPR, and the particles EPR pairs, after Einstein, Boris Podolsky and Nathan Rosen, who first analyzed the effects of entanglement in 1935. As a basis for his theoretical prediction of quantum teleportation, Bennett uses EPR in a rather ingenious way.

From:
http://www.theowljournal.com/article.php?issue=4&number=5&type=print&comments=1
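To make the correlation described above concrete, here is a toy Monte Carlo sketch (ideal detectors, both parties measuring in the same horizontal/vertical basis; everything here is illustrative, not a model of a real experiment):

Code:
import random

def measure_entangled_pair():
    """Toy model of the passage above: each outcome is 50/50 and indefinite
    beforehand, but Alice's and Bob's results always agree."""
    shared = random.choice(["horizontal", "vertical"])
    return shared, shared  # (Alice's result, Bob's result)

trials = [measure_entangled_pair() for _ in range(100000)]
agreement = sum(a == b for a, b in trials) / len(trials)
alice_horizontal = sum(a == "horizontal" for a, _ in trials) / len(trials)
print(f"agreement: {agreement:.3f}")                  # 1.000
print(f"Alice 'horizontal': {alice_horizontal:.3f}")  # ~0.500

Worth noting: this shared-coin picture only reproduces the same-basis correlation described in the passage; it is when the two sides measure in rotated bases that the quantum predictions depart from any such classical model (Bell).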
 
Last edited by a moderator:
  • #44
ttn said:
This all highlights the "measurement problem" from which orthodox QM suffers. Is it the act of observation (literal conscious awareness) that constitutes the "measurements" which resolve superpositions into definite outcomes? That seems crazy, as illustrated by the cat. So then maybe the cat "measures" the state of the vial of poison? Or maybe the vial "measures" the state of the hammer? Or maybe the relay "measures" the output of the geiger counter? etc. The problem is: we start out with a superposition at the level of the alpha particle, and definitely end up with a state that is *not* a superposition; orthodox QM says the transition happens when a "measurement" occurs. But what the heck is a measurement exactly? Where along this continuous chain from micro to macro do the normal dynamics defer to the collapse postulate? Schroedinger was using this cat thought experiment in this way to argue against the completeness doctrine -- i.e., to argue for "hidden variables" at the original micro-level. Such variables would resolve the "fuzziness" at the very beginning, and the whole ambiguous chain would never get going. (For example, Bohm's theory does not suffer from the measurement problem because it attributes a definite position to the alpha particle from the very beginning, whether or not it is "measured.")

Let me repeat one last time what "the measurement problem" is, because I think it's not sufficiently well understood. Orthodox QM says that there are two different rules according to which wave functions evolve: Schroedinger's equation, and the collapse postulate. The first law applies when no measurement is being made; the second applies when a measurement is being made. But the theory never tells us what exact sort of physical process constitutes a "measurement." The theory is, to use Bell's apt description, "unprofessionally vague and ambiguous." *This* is what is supposed to be brought out by the infamous cat.

Very quickly: by your definition, I would argue that most working physicists would agree that the old 1920s version of QM interpretation, the so-called orthodox interpretation, is pushing on silly, to be a bit more direct than Prof. Bell. From personal experience, discussions, and observations of physicists over the years, I'd say that we go with Born and use the notions of classical probability to describe our experiments -- cross sections, decay rates, energies, ...


As the neuroscientists get smarter and smarter about how our brains, hence minds, work, the more sense it makes to view QM through a knowledge perspective -- that's what we do in the classical world. Why change if we do not need to? Collapse only makes good, solid physical sense if it is the quick change of a neural pattern from uncertainty to certainty. Your first paragraph illustrates the absurdity of ideas that never would have seen the light of day if the founders were inventing QM today, in view of greater practical familiarity with statistics and probability, and an astonishingly different view of mind and brain than was the case 80 years ago.

Regards,
Reilly Atkinson


PS: Does anybody think that, perhaps, "An experiment can only give one result at any time" is a law of Nature? Could it be a derived notion?
 
  • #45
A while back in this thread, Zapper Z mentioned that Bob Laughlin had solved the quantum measurement problem and that the answer was contained in his new book "A Different Universe." I asked Zapper to summarize the answer and he understandably suggested I read the book first:


ZapperZ said:
I suggest you read Laughlin's book first, and then, we'll have this conversation.

Well, I've just finished reading it and would be intrigued to pick up this thread if Zapper or anyone else is interested. I have to say I liked the book very much and agree strongly with the general thesis that hyper-reductionism is leading physics in the wrong direction (e.g., way too much emphasis on stuff like string theory and not nearly enough on things like condensed matter physics where surprising, interesting emergent phenomena are the norm).

But sadly (for me) there is no solution to the measurement problem here. Indeed, the author only mentions it rather briefly in an early chapter, never to return to it specifically, and all he really says about it is that what we call "measurement" is a high-level emergent phenomenon, so of course we shouldn't expect the underlying (quantum) rules to apply.

This ignores a basic fact contained in chapter one of every undergrad quantum text, however: the Schroedinger equation is linear. That is really the source of all the difficulties, because the linearity makes it *trivial* to solve the equation in exactly the kind of situations that give rise to all the trouble here. For example, if you have some system which amplifies a decayed atom and kills a cat -- and if that whole process is describable by the laws of quantum mechanics -- then you know exactly what will happen if you have an atom that is in a superposition of decayed and not decayed. The equations are linear, so A --> X and B --> Y entails that (A+B) --> (X+Y). There's just no getting around it. Schroedinger's equation *predicts* that a half-decayed atom should evolve deterministically into a half-dead cat, and that is just *not* what is observed.
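Here is a minimal numerical illustration of that linearity argument (the two-level atom / two-level cat model and the "amplification chain" unitary below are invented purely for illustration):

Code:
import numpy as np

# Two-level atom (not decayed / decayed) and two-level cat (alive / dead).
atom_not_decayed = np.array([1.0, 0.0])
atom_decayed = np.array([0.0, 1.0])
cat_alive = np.array([1.0, 0.0])

# Toy unitary for the whole amplification chain on atom (x) cat:
# if the atom has decayed, flip the cat from alive to dead (a CNOT).
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def evolve(atom_state):
    # Cat starts out alive; the whole chain acts unitarily (Schroedinger evolution).
    return U @ np.kron(atom_state, cat_alive)

# A --> X and B --> Y:
print(evolve(atom_not_decayed))  # |not decayed>|alive>
print(evolve(atom_decayed))      # |decayed>|dead>

# Linearity: (A + B) --> (X + Y).
a = b = 1 / np.sqrt(2)
lhs = evolve(a * atom_not_decayed + b * atom_decayed)
rhs = a * evolve(atom_not_decayed) + b * evolve(atom_decayed)
print(np.allclose(lhs, rhs))  # True: the superposed atom evolves into "alive + dead"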

That is the "measurement problem." The deterministic laws of quantum mechanics predict the existence of states which do not correspond to what is observed. There appear to be several options. First, the laws might only apply in some finite domain, i.e., they might not be universal. This is the strategy which results in the "collapse of the wave function" band-aid. But this just moves the question without really answering. Ok, so at some point along the chaing the wave function collapses (i.e., the deterministic Schroedinger dynamics holds up its hands and gives way to some new dynamical law, the Born rule collapse postulate). But then: exactly where in the chain is this? That is, exactly what is it that determines whether a wf evolves according to the first kind of dynamics, or the second? What exactly *is* a measurement? It is because these questions are never answered that Bell called orthodox QM "unprofessionally vague and ambiguous."

A second option is to say that the Schroedinger equation is just wrong, and to replace it with something that is not quite deterministic and not quite linear, so that for micro-systems the extra terms can be ignored and we get (basically) the usual kind of evolution, while for big macroscopic collections of particles, the extra non-linear terms result in what the orthodox view would describe as the wf having collapsed. But here there is just *one* kind of dynamics, so there is no ambiguity about when/where exactly the one kind takes a lunch break while the second kind takes over. This is of course the approach of the GRW type theories.

A final option is to reject the idea that the wave function provides a complete description of the micro states. So, for example, in Bohm's theory, even when the atom is described as being in a superposition of decayed and not decayed, there is actually a real fact about where the particle is (either still in the nucleus = not decayed, or outside the nucleus = decayed). So there is no problem of an ambiguity at the microlevel getting amplified up to the macrolevel where it conflicts with experiment. Rather, we get rid of ambiguities at the root, and the whole problem disappears. Particles are never in two places at once (even when their wave functions are spread out over large regions), and so large collections of particles (like cats) are never in the kind of configurations we don't observe them in.

I guess an even final-er option is the route taken by MWI: instead of fixing up the microphysics to bring it into agreement with what is observed, just reject the veracity of those alleged "observations" of definitely-living or definitely-dead cats. Accept that the quantum mechanical laws are always correct, even when they predict things like superpositions of live and dead cats, and then construct a big fairy tale according to which our observations to the contrary are delusional. I know some people think this is a reasonable option, but I don't. And I bet Bob Laughlin wouldn't, either.

So... what exactly is Laughlin's solution to this problem? How does his solution relate to the possibilities I just outlined? Is it one of these, or basically different in some way? Hopefully Zapper or someone else who has read (and claims to support) Laughlin's answer can clarify.
 
